Dec 02 07:23:07 crc systemd[1]: Starting Kubernetes Kubelet...
Dec 02 07:23:07 crc restorecon[4758]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Dec 02 07:23:07 crc restorecon[4758]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 02 07:23:07 crc restorecon[4758]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 02 07:23:07 crc restorecon[4758]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 02 07:23:07 crc restorecon[4758]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 02 07:23:07 crc restorecon[4758]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 02 07:23:07 crc restorecon[4758]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 02 07:23:07 crc restorecon[4758]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 02 07:23:07 crc restorecon[4758]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 02 07:23:07 crc restorecon[4758]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 02 07:23:07 crc restorecon[4758]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Dec 02 07:23:07 crc restorecon[4758]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 02 07:23:07 crc restorecon[4758]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 02 07:23:07 crc restorecon[4758]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 02 07:23:07 crc restorecon[4758]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 02 07:23:07 crc restorecon[4758]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 02 07:23:07 crc restorecon[4758]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Dec 02 07:23:07 crc restorecon[4758]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 02 07:23:07 crc restorecon[4758]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 02 07:23:07 crc restorecon[4758]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 02 07:23:07 crc restorecon[4758]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 02 07:23:07 crc restorecon[4758]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 02 07:23:07 crc restorecon[4758]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 02 07:23:07 crc restorecon[4758]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 02 07:23:07 crc restorecon[4758]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 02 07:23:07 crc restorecon[4758]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 02 07:23:07 crc restorecon[4758]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 02 07:23:07 crc restorecon[4758]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 02 07:23:07 crc restorecon[4758]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 02 07:23:07 crc restorecon[4758]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 02 07:23:07 crc restorecon[4758]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 02 07:23:07 crc restorecon[4758]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 02 07:23:07 crc restorecon[4758]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Dec 02 07:23:07 crc restorecon[4758]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 02 07:23:07 crc restorecon[4758]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 02 07:23:07 crc restorecon[4758]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 02 07:23:07 crc restorecon[4758]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Dec 02 07:23:07 crc restorecon[4758]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Dec 02 07:23:07 crc restorecon[4758]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 07:23:07 crc restorecon[4758]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 07:23:07 crc restorecon[4758]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 07:23:07 crc restorecon[4758]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 07:23:07 crc restorecon[4758]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 07:23:07 crc restorecon[4758]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 07:23:07 crc restorecon[4758]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 07:23:07 crc restorecon[4758]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 07:23:07 crc restorecon[4758]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 07:23:07 crc restorecon[4758]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 07:23:07 crc restorecon[4758]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 07:23:07 crc restorecon[4758]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 02 07:23:07 crc restorecon[4758]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 02 07:23:07 crc restorecon[4758]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 07:23:07 crc restorecon[4758]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 02 07:23:07 crc restorecon[4758]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 02 07:23:07 crc restorecon[4758]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 07:23:07 crc restorecon[4758]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 02 07:23:07 crc restorecon[4758]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 02 07:23:07 crc restorecon[4758]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 07:23:07 crc restorecon[4758]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 02 07:23:07 crc restorecon[4758]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 02 07:23:07 crc restorecon[4758]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 07:23:07 crc restorecon[4758]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 02 07:23:07 crc restorecon[4758]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 02 07:23:07 crc restorecon[4758]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 07:23:07 crc restorecon[4758]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 02 07:23:07 crc restorecon[4758]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 02 07:23:07 crc restorecon[4758]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 07:23:07 crc restorecon[4758]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 02 07:23:07 crc restorecon[4758]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 02 07:23:07 crc restorecon[4758]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 07:23:07 crc restorecon[4758]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 02 07:23:07 crc restorecon[4758]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Dec 02 07:23:07 crc restorecon[4758]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Dec 02 07:23:07 crc restorecon[4758]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 02 07:23:07 crc restorecon[4758]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Dec 02 07:23:07 crc restorecon[4758]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Dec 02 07:23:07 crc restorecon[4758]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 02 07:23:07 crc restorecon[4758]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 02 07:23:07 crc restorecon[4758]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 02 07:23:07 crc restorecon[4758]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 02 07:23:07 crc restorecon[4758]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 02 07:23:07 crc restorecon[4758]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 02 07:23:07 crc restorecon[4758]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 02 07:23:07 crc restorecon[4758]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 02 07:23:07 crc restorecon[4758]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 02 07:23:07 crc restorecon[4758]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 02 07:23:07 crc restorecon[4758]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 02 07:23:07 crc restorecon[4758]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Dec 02 07:23:07 crc restorecon[4758]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 02 07:23:08 crc restorecon[4758]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Dec 02 07:23:08 crc restorecon[4758]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc 
restorecon[4758]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 02 07:23:08 crc restorecon[4758]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 02 07:23:08 crc restorecon[4758]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 02 07:23:08 crc restorecon[4758]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 02 07:23:08 crc 
restorecon[4758]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 02 
07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin
to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 
crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Dec 02 07:23:08 crc restorecon[4758]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 07:23:08 crc restorecon[4758]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 07:23:08 crc restorecon[4758]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 07:23:08 crc restorecon[4758]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 
07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 07:23:08 crc restorecon[4758]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 07:23:08 crc restorecon[4758]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 02 07:23:08 crc 
restorecon[4758]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc 
restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc 
restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:23:08 crc restorecon[4758]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 02 07:23:08 crc restorecon[4758]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 
02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 
crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc 
restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]:
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc 
restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:23:08 crc restorecon[4758]:
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 07:23:08 crc restorecon[4758]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 07:23:08 crc 
restorecon[4758]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 07:23:08 crc restorecon[4758]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 02 07:23:08 crc restorecon[4758]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 02 07:23:08 crc restorecon[4758]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Dec 02 07:23:08 crc kubenswrapper[4895]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 02 07:23:08 crc kubenswrapper[4895]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Dec 02 07:23:08 crc kubenswrapper[4895]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 02 07:23:08 crc kubenswrapper[4895]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Dec 02 07:23:08 crc kubenswrapper[4895]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Dec 02 07:23:08 crc kubenswrapper[4895]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.941868 4895 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.945129 4895 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.945147 4895 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.945152 4895 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.945157 4895 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.945161 4895 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.945166 4895 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.945170 4895 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.945174 4895 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.945178 4895 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.945183 4895 
feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.945187 4895 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.945192 4895 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.945197 4895 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.945202 4895 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.945206 4895 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.945211 4895 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.945218 4895 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.945223 4895 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.945226 4895 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.945231 4895 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.945235 4895 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.945239 4895 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.945243 4895 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.945246 4895 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 02 
07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.945250 4895 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.945254 4895 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.945257 4895 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.945261 4895 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.945264 4895 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.945268 4895 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.945272 4895 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.945275 4895 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.945279 4895 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.945283 4895 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.945287 4895 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.945291 4895 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.945295 4895 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.945299 4895 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.945302 4895 
feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.945306 4895 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.945309 4895 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.945313 4895 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.945317 4895 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.945321 4895 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.945325 4895 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.945328 4895 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.945332 4895 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.945336 4895 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.945344 4895 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.945348 4895 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.945352 4895 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.945355 4895 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.945359 4895 feature_gate.go:330] unrecognized feature gate: Example 
Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.945362 4895 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.945366 4895 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.945370 4895 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.945374 4895 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.945378 4895 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.945383 4895 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.945388 4895 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.945394 4895 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.945398 4895 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.945402 4895 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.945406 4895 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.945410 4895 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.945415 4895 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.945418 4895 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.945423 4895 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.945433 4895 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.945439 4895 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.945444 4895 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.945528 4895 flags.go:64] FLAG: --address="0.0.0.0" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.945537 4895 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.945546 4895 flags.go:64] FLAG: --anonymous-auth="true" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.945551 4895 flags.go:64] FLAG: --application-metrics-count-limit="100" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.945557 4895 flags.go:64] FLAG: --authentication-token-webhook="false" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.945561 4895 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.945567 4895 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.945574 4895 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.945579 4895 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.945583 4895 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.945588 4895 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.945592 4895 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.945597 4895 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.945601 4895 flags.go:64] FLAG: --cgroup-root="" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.945605 4895 flags.go:64] FLAG: --cgroups-per-qos="true" Dec 02 07:23:08 crc 
kubenswrapper[4895]: I1202 07:23:08.945610 4895 flags.go:64] FLAG: --client-ca-file="" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.945614 4895 flags.go:64] FLAG: --cloud-config="" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.945618 4895 flags.go:64] FLAG: --cloud-provider="" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.945622 4895 flags.go:64] FLAG: --cluster-dns="[]" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.945630 4895 flags.go:64] FLAG: --cluster-domain="" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.945634 4895 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.945638 4895 flags.go:64] FLAG: --config-dir="" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.945642 4895 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.945647 4895 flags.go:64] FLAG: --container-log-max-files="5" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.945653 4895 flags.go:64] FLAG: --container-log-max-size="10Mi" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.945657 4895 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.945662 4895 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.945666 4895 flags.go:64] FLAG: --containerd-namespace="k8s.io" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.945670 4895 flags.go:64] FLAG: --contention-profiling="false" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.945674 4895 flags.go:64] FLAG: --cpu-cfs-quota="true" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.945679 4895 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.945683 4895 flags.go:64] FLAG: --cpu-manager-policy="none" Dec 02 07:23:08 crc 
kubenswrapper[4895]: I1202 07:23:08.945687 4895 flags.go:64] FLAG: --cpu-manager-policy-options="" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.945694 4895 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.945698 4895 flags.go:64] FLAG: --enable-controller-attach-detach="true" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.945702 4895 flags.go:64] FLAG: --enable-debugging-handlers="true" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.945706 4895 flags.go:64] FLAG: --enable-load-reader="false" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.945710 4895 flags.go:64] FLAG: --enable-server="true" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.945714 4895 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.945719 4895 flags.go:64] FLAG: --event-burst="100" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.945724 4895 flags.go:64] FLAG: --event-qps="50" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.945728 4895 flags.go:64] FLAG: --event-storage-age-limit="default=0" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.945731 4895 flags.go:64] FLAG: --event-storage-event-limit="default=0" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.945748 4895 flags.go:64] FLAG: --eviction-hard="" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.945754 4895 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.945758 4895 flags.go:64] FLAG: --eviction-minimum-reclaim="" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.945762 4895 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.945766 4895 flags.go:64] FLAG: --eviction-soft="" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.945770 4895 flags.go:64] FLAG: --eviction-soft-grace-period="" Dec 02 
07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.945774 4895 flags.go:64] FLAG: --exit-on-lock-contention="false" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.945778 4895 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.945782 4895 flags.go:64] FLAG: --experimental-mounter-path="" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.945786 4895 flags.go:64] FLAG: --fail-cgroupv1="false" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.945791 4895 flags.go:64] FLAG: --fail-swap-on="true" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.945795 4895 flags.go:64] FLAG: --feature-gates="" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.945800 4895 flags.go:64] FLAG: --file-check-frequency="20s" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.945806 4895 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.945810 4895 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.945816 4895 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.945822 4895 flags.go:64] FLAG: --healthz-port="10248" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.945827 4895 flags.go:64] FLAG: --help="false" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.945831 4895 flags.go:64] FLAG: --hostname-override="" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.945836 4895 flags.go:64] FLAG: --housekeeping-interval="10s" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.945841 4895 flags.go:64] FLAG: --http-check-frequency="20s" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.945846 4895 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.945851 4895 flags.go:64] FLAG: --image-credential-provider-config="" Dec 02 
07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.945856 4895 flags.go:64] FLAG: --image-gc-high-threshold="85" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.945860 4895 flags.go:64] FLAG: --image-gc-low-threshold="80" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.945864 4895 flags.go:64] FLAG: --image-service-endpoint="" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.945868 4895 flags.go:64] FLAG: --kernel-memcg-notification="false" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.945872 4895 flags.go:64] FLAG: --kube-api-burst="100" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.945876 4895 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.945880 4895 flags.go:64] FLAG: --kube-api-qps="50" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.945884 4895 flags.go:64] FLAG: --kube-reserved="" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.945889 4895 flags.go:64] FLAG: --kube-reserved-cgroup="" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.945893 4895 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.945897 4895 flags.go:64] FLAG: --kubelet-cgroups="" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.945901 4895 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.945905 4895 flags.go:64] FLAG: --lock-file="" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.945909 4895 flags.go:64] FLAG: --log-cadvisor-usage="false" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.945914 4895 flags.go:64] FLAG: --log-flush-frequency="5s" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.945918 4895 flags.go:64] FLAG: --log-json-info-buffer-size="0" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.945924 4895 flags.go:64] FLAG: --log-json-split-stream="false" Dec 02 
07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.945928 4895 flags.go:64] FLAG: --log-text-info-buffer-size="0" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.945933 4895 flags.go:64] FLAG: --log-text-split-stream="false" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.945938 4895 flags.go:64] FLAG: --logging-format="text" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.945942 4895 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.945946 4895 flags.go:64] FLAG: --make-iptables-util-chains="true" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.945950 4895 flags.go:64] FLAG: --manifest-url="" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.945960 4895 flags.go:64] FLAG: --manifest-url-header="" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.945966 4895 flags.go:64] FLAG: --max-housekeeping-interval="15s" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.945970 4895 flags.go:64] FLAG: --max-open-files="1000000" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.945975 4895 flags.go:64] FLAG: --max-pods="110" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.945979 4895 flags.go:64] FLAG: --maximum-dead-containers="-1" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.945983 4895 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.945987 4895 flags.go:64] FLAG: --memory-manager-policy="None" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.945992 4895 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.945996 4895 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.946000 4895 flags.go:64] FLAG: --node-ip="192.168.126.11" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.946004 4895 flags.go:64] FLAG: 
--node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.946013 4895 flags.go:64] FLAG: --node-status-max-images="50" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.946017 4895 flags.go:64] FLAG: --node-status-update-frequency="10s" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.946022 4895 flags.go:64] FLAG: --oom-score-adj="-999" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.946026 4895 flags.go:64] FLAG: --pod-cidr="" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.946030 4895 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.946038 4895 flags.go:64] FLAG: --pod-manifest-path="" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.946042 4895 flags.go:64] FLAG: --pod-max-pids="-1" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.946046 4895 flags.go:64] FLAG: --pods-per-core="0" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.946050 4895 flags.go:64] FLAG: --port="10250" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.946054 4895 flags.go:64] FLAG: --protect-kernel-defaults="false" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.946059 4895 flags.go:64] FLAG: --provider-id="" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.946063 4895 flags.go:64] FLAG: --qos-reserved="" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.946067 4895 flags.go:64] FLAG: --read-only-port="10255" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.946071 4895 flags.go:64] FLAG: --register-node="true" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.946075 4895 flags.go:64] FLAG: --register-schedulable="true" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.946079 4895 flags.go:64] FLAG: 
--register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.946091 4895 flags.go:64] FLAG: --registry-burst="10" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.946095 4895 flags.go:64] FLAG: --registry-qps="5" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.946099 4895 flags.go:64] FLAG: --reserved-cpus="" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.946103 4895 flags.go:64] FLAG: --reserved-memory="" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.946108 4895 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.946112 4895 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.946119 4895 flags.go:64] FLAG: --rotate-certificates="false" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.946123 4895 flags.go:64] FLAG: --rotate-server-certificates="false" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.946127 4895 flags.go:64] FLAG: --runonce="false" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.946131 4895 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.946135 4895 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.946139 4895 flags.go:64] FLAG: --seccomp-default="false" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.946143 4895 flags.go:64] FLAG: --serialize-image-pulls="true" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.946147 4895 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.946151 4895 flags.go:64] FLAG: --storage-driver-db="cadvisor" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.946155 4895 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.946160 
4895 flags.go:64] FLAG: --storage-driver-password="root" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.946164 4895 flags.go:64] FLAG: --storage-driver-secure="false" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.946168 4895 flags.go:64] FLAG: --storage-driver-table="stats" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.946172 4895 flags.go:64] FLAG: --storage-driver-user="root" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.946176 4895 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.946180 4895 flags.go:64] FLAG: --sync-frequency="1m0s" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.946185 4895 flags.go:64] FLAG: --system-cgroups="" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.946189 4895 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.946195 4895 flags.go:64] FLAG: --system-reserved-cgroup="" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.946199 4895 flags.go:64] FLAG: --tls-cert-file="" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.946204 4895 flags.go:64] FLAG: --tls-cipher-suites="[]" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.946208 4895 flags.go:64] FLAG: --tls-min-version="" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.946213 4895 flags.go:64] FLAG: --tls-private-key-file="" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.946217 4895 flags.go:64] FLAG: --topology-manager-policy="none" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.946221 4895 flags.go:64] FLAG: --topology-manager-policy-options="" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.946225 4895 flags.go:64] FLAG: --topology-manager-scope="container" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.946232 4895 flags.go:64] FLAG: --v="2" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.946238 4895 
flags.go:64] FLAG: --version="false" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.946244 4895 flags.go:64] FLAG: --vmodule="" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.946249 4895 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.946254 4895 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.946368 4895 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.946375 4895 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.946380 4895 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.946385 4895 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.946389 4895 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.946394 4895 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.946398 4895 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.946402 4895 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.946406 4895 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.946410 4895 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.946413 4895 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 02 
07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.946417 4895 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.946421 4895 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.946426 4895 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.946429 4895 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.946433 4895 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.946437 4895 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.946441 4895 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.946445 4895 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.946449 4895 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.946453 4895 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.946458 4895 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.946462 4895 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.946467 4895 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.946471 4895 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.946475 4895 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.946479 4895 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.946486 4895 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.946490 4895 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.946494 4895 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.946499 4895 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.946503 4895 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.946507 4895 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.946514 4895 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.946518 4895 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.946523 4895 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.946528 4895 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 02 07:23:08 crc 
kubenswrapper[4895]: W1202 07:23:08.946533 4895 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.946538 4895 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.946544 4895 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.946548 4895 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.946553 4895 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.946558 4895 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.946562 4895 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.946566 4895 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.946570 4895 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.946574 4895 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.946579 4895 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.946583 4895 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.946586 4895 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.946590 4895 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.946595 4895 feature_gate.go:330] unrecognized 
feature gate: MachineAPIProviderOpenStack Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.946599 4895 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.946604 4895 feature_gate.go:330] unrecognized feature gate: Example Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.946608 4895 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.946612 4895 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.946616 4895 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.946627 4895 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.946631 4895 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.946636 4895 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.946640 4895 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.946644 4895 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.946648 4895 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.946652 4895 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.946656 4895 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.946662 4895 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 
07:23:08.946667 4895 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.946672 4895 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.946677 4895 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.946681 4895 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.946685 4895 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.946699 4895 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.956687 4895 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.956766 4895 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.956914 4895 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.956938 4895 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.956949 4895 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.956957 4895 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.956965 4895 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.956974 4895 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.956980 4895 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.956985 4895 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.956992 4895 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.956998 4895 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.957003 4895 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.957009 4895 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.957016 4895 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.957022 4895 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.957028 4895 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.957033 4895 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.957039 4895 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.957044 4895 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.957050 4895 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.957056 4895 feature_gate.go:330] unrecognized feature gate: Example Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.957061 4895 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.957067 4895 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.957072 4895 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.957077 4895 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.957082 4895 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.957087 4895 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 
07:23:08.957092 4895 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.957099 4895 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.957104 4895 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.957110 4895 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.957115 4895 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.957121 4895 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.957127 4895 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.957132 4895 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.957139 4895 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.957144 4895 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.957148 4895 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.957155 4895 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.957160 4895 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.957165 4895 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.957170 4895 feature_gate.go:330] unrecognized feature gate: 
VSphereControlPlaneMachineSet Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.957174 4895 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.957179 4895 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.957185 4895 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.957191 4895 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.957196 4895 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.957202 4895 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.957208 4895 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.957214 4895 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.957219 4895 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.957225 4895 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.957230 4895 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.957235 4895 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.957240 4895 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.957245 4895 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 02 07:23:08 crc 
kubenswrapper[4895]: W1202 07:23:08.957250 4895 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.957254 4895 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.957259 4895 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.957264 4895 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.957268 4895 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.957273 4895 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.957278 4895 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.957283 4895 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.957288 4895 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.957293 4895 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.957299 4895 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.957304 4895 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.957308 4895 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.957313 4895 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.957319 4895 
feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.957325 4895 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.957334 4895 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.957537 4895 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.957546 4895 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.957551 4895 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.957556 4895 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.957562 4895 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.957569 4895 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.957576 4895 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.957582 4895 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.957588 4895 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.957594 4895 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.957600 4895 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.957605 4895 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.957611 4895 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.957616 4895 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.957621 4895 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.957625 4895 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.957630 4895 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.957635 4895 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.957640 4895 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.957645 4895 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.957650 4895 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.957656 4895 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.957661 4895 feature_gate.go:330] 
unrecognized feature gate: MultiArchInstallAWS Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.957666 4895 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.957672 4895 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.957677 4895 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.957681 4895 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.957686 4895 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.957691 4895 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.957697 4895 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.957703 4895 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.957707 4895 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.957712 4895 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.957717 4895 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.957723 4895 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.957729 4895 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.957736 4895 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.957761 4895 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.957767 4895 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.957773 4895 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.957780 4895 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.957786 4895 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.957792 4895 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.957800 4895 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.957806 4895 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.957812 4895 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.957819 4895 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.957825 4895 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.957830 4895 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.957835 4895 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.957840 4895 
feature_gate.go:330] unrecognized feature gate: Example Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.957845 4895 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.957849 4895 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.957854 4895 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.957859 4895 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.957864 4895 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.957869 4895 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.957874 4895 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.957879 4895 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.957885 4895 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.957891 4895 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.957897 4895 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.957905 4895 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.957911 4895 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.957916 4895 feature_gate.go:330] unrecognized 
feature gate: AWSClusterHostedDNS Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.957923 4895 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.957930 4895 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.957935 4895 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.957940 4895 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.957946 4895 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 02 07:23:08 crc kubenswrapper[4895]: W1202 07:23:08.957952 4895 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.957960 4895 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.958537 4895 server.go:940] "Client rotation is on, will bootstrap in background" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.962691 4895 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.962884 4895 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.963676 4895 server.go:997] "Starting client certificate rotation" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.963722 4895 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.965124 4895 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-12-26 18:45:47.031781219 +0000 UTC Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.965289 4895 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 587h22m38.066497164s for next certificate rotation Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.972829 4895 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.975569 4895 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 02 07:23:08 crc kubenswrapper[4895]: I1202 07:23:08.987009 4895 log.go:25] "Validated CRI v1 runtime API" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.005718 4895 log.go:25] "Validated CRI v1 image API" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.008169 4895 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.011155 4895 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-12-02-07-18-37-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.011210 4895 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 
blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.033496 4895 manager.go:217] Machine: {Timestamp:2025-12-02 07:23:09.031652472 +0000 UTC m=+0.202512125 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654124544 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:92c636b1-dcb0-457f-b098-73baeaac297e BootID:42683c5b-b2bf-439b-8ee4-25c8d72cfed1 Filesystems:[{Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365408768 Type:vfs Inodes:821633 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108169 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827060224 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:2d:35:aa Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:2d:35:aa Speed:-1 
Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:3d:e9:49 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:cc:cb:e0 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:a8:82:48 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:56:95:87 Speed:-1 Mtu:1496} {Name:ens7.23 MacAddress:52:54:00:52:05:39 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:5e:a8:20:32:26:f6 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:b6:09:f8:fc:48:6e Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654124544 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] 
SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.033936 4895 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.034234 4895 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.035380 4895 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.035832 4895 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.035917 4895 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSR
eserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.036289 4895 topology_manager.go:138] "Creating topology manager with none policy" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.036312 4895 container_manager_linux.go:303] "Creating device plugin manager" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.036636 4895 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.036705 4895 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.037089 4895 state_mem.go:36] "Initialized new in-memory state store" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.037234 4895 server.go:1245] "Using root directory" path="/var/lib/kubelet" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.038520 4895 kubelet.go:418] "Attempting to sync node with API server" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.038565 4895 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.038617 4895 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.038647 4895 kubelet.go:324] "Adding apiserver pod source" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.038675 4895 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 
07:23:09.041709 4895 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.042376 4895 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". Dec 02 07:23:09 crc kubenswrapper[4895]: W1202 07:23:09.043279 4895 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.13:6443: connect: connection refused Dec 02 07:23:09 crc kubenswrapper[4895]: W1202 07:23:09.043295 4895 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.13:6443: connect: connection refused Dec 02 07:23:09 crc kubenswrapper[4895]: E1202 07:23:09.043445 4895 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.13:6443: connect: connection refused" logger="UnhandledError" Dec 02 07:23:09 crc kubenswrapper[4895]: E1202 07:23:09.043450 4895 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.13:6443: connect: connection refused" logger="UnhandledError" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.044139 4895 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Dec 02 07:23:09 crc 
kubenswrapper[4895]: I1202 07:23:09.045033 4895 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.045174 4895 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.045367 4895 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.045445 4895 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.045520 4895 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.045623 4895 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.045768 4895 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.045836 4895 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.045865 4895 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.045886 4895 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.045917 4895 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.045939 4895 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.046622 4895 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.047716 4895 server.go:1280] "Started kubelet" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 
07:23:09.047950 4895 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.13:6443: connect: connection refused Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.048310 4895 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.048314 4895 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.050261 4895 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Dec 02 07:23:09 crc kubenswrapper[4895]: E1202 07:23:09.049936 4895 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.13:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.187d5517a57521f8 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-02 07:23:09.04762828 +0000 UTC m=+0.218487973,LastTimestamp:2025-12-02 07:23:09.04762828 +0000 UTC m=+0.218487973,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 02 07:23:09 crc systemd[1]: Started Kubernetes Kubelet. 
Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.052857 4895 server.go:460] "Adding debug handlers to kubelet server" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.053450 4895 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.054176 4895 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.054436 4895 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-09 06:46:42.291686771 +0000 UTC Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.054543 4895 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 911h23m33.237152108s for next certificate rotation Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.054876 4895 volume_manager.go:287] "The desired_state_of_world populator starts" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.054907 4895 volume_manager.go:289] "Starting Kubelet Volume Manager" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.055110 4895 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Dec 02 07:23:09 crc kubenswrapper[4895]: E1202 07:23:09.055468 4895 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.13:6443: connect: connection refused" interval="200ms" Dec 02 07:23:09 crc kubenswrapper[4895]: E1202 07:23:09.055697 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.056634 4895 factory.go:153] Registering CRI-O factory Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.056685 4895 factory.go:221] Registration of the crio container factory 
successfully Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.057213 4895 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.057238 4895 factory.go:55] Registering systemd factory Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.057252 4895 factory.go:221] Registration of the systemd container factory successfully Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.057284 4895 factory.go:103] Registering Raw factory Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.057308 4895 manager.go:1196] Started watching for new ooms in manager Dec 02 07:23:09 crc kubenswrapper[4895]: W1202 07:23:09.059285 4895 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.13:6443: connect: connection refused Dec 02 07:23:09 crc kubenswrapper[4895]: E1202 07:23:09.059425 4895 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.13:6443: connect: connection refused" logger="UnhandledError" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.060818 4895 manager.go:319] Starting recovery of all containers Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.068888 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Dec 02 07:23:09 crc 
kubenswrapper[4895]: I1202 07:23:09.068965 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.068995 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.069018 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.069038 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.069056 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.069072 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.069087 4895 reconstruct.go:130] "Volume is marked 
as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.069106 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.069123 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.069138 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.069154 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.069174 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.069198 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.069219 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.069241 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.069263 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.069280 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.069307 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.069322 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" 
volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.069336 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.069351 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.069367 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.069383 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.069399 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.069414 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" 
seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.069518 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.069540 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.069556 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.069572 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.069588 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.069605 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.069652 4895 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.069668 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.069683 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.069698 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.069715 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.069734 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.069795 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.069837 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.069855 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.069870 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.069886 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.069902 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.070101 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" 
volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.070118 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.070133 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.070147 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.070163 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.070180 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.070204 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" 
seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.070221 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.070250 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.070268 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.070282 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.070298 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.070315 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Dec 02 07:23:09 crc 
kubenswrapper[4895]: I1202 07:23:09.070330 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.070353 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.070368 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.070382 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.070398 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.070412 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.070425 4895 reconstruct.go:130] "Volume is marked 
as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.070441 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.070455 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.070468 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.070482 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.070495 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.070509 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.070528 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.070545 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.070567 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.070581 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.070595 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.070610 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.070625 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.070639 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.070655 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.070708 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.070723 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.070736 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" 
seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.070770 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.070784 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.070797 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.070835 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.070851 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.070864 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 
07:23:09.070878 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.070891 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.070904 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.070917 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.070930 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.070943 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.070958 4895 reconstruct.go:130] "Volume is marked as uncertain and added 
into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.070970 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.070983 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.070996 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.071012 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.071026 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.071038 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.071051 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.071067 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.071082 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.071106 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.071120 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.071135 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" 
volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.071149 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.071196 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.071218 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.071233 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.071249 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.071264 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" 
seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.071277 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.071291 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.071306 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.071318 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.071330 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.071343 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.071357 4895 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.071370 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.071385 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.071398 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.071414 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.071427 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.071442 4895 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.071456 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.071469 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.071483 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.071499 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.071513 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.071526 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" 
volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.071539 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.071553 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.071567 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.071579 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.071596 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.071610 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" 
seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.071622 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.071636 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.071650 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.071664 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.071678 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.071692 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.071704 
4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.071716 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.071730 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.071766 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.071779 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.071793 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.071806 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.071819 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.071832 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.071849 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.071861 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.071874 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.071887 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.071900 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.071920 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.071934 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.071948 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.071963 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.071976 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" 
volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.071992 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.072004 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.072021 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.072035 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.072048 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.072065 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" 
seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.072082 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.072098 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.072113 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.072129 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.072143 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.072156 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 
07:23:09.072171 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.072189 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.072207 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.072223 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.072235 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.072249 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.072262 4895 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.072275 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.072288 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.072303 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.072316 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.072329 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.072343 4895 reconstruct.go:130] "Volume is marked as uncertain and added into 
the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.072357 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.072371 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.073193 4895 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.073241 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.073290 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.073314 4895 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.073329 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.073342 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.073357 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.073392 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.073407 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.073419 4895 reconstruct.go:130] "Volume is marked as uncertain 
and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.073435 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.073450 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.073464 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.073479 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.073495 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.073508 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" 
volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.073522 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.073535 4895 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.073547 4895 reconstruct.go:97] "Volume reconstruction finished" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.073557 4895 reconciler.go:26] "Reconciler: start to sync state" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.124013 4895 manager.go:324] Recovery completed Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.135184 4895 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.136449 4895 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv4" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.137673 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.137720 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.137730 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.139141 4895 cpu_manager.go:225] "Starting CPU manager" policy="none" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.139174 4895 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.139204 4895 state_mem.go:36] "Initialized new in-memory state store" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.139667 4895 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.139759 4895 status_manager.go:217] "Starting to sync pod status with apiserver" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.139807 4895 kubelet.go:2335] "Starting kubelet main sync loop" Dec 02 07:23:09 crc kubenswrapper[4895]: E1202 07:23:09.139892 4895 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Dec 02 07:23:09 crc kubenswrapper[4895]: W1202 07:23:09.142867 4895 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.13:6443: connect: connection refused Dec 02 07:23:09 crc kubenswrapper[4895]: E1202 07:23:09.142958 4895 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.13:6443: connect: connection refused" logger="UnhandledError" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.151185 4895 policy_none.go:49] "None policy: Start" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.153703 4895 memory_manager.go:170] "Starting memorymanager" policy="None" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.153892 4895 state_mem.go:35] "Initializing new in-memory state store" Dec 02 07:23:09 crc kubenswrapper[4895]: E1202 07:23:09.156318 4895 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.208903 4895 manager.go:334] "Starting Device Plugin manager" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.209028 4895 manager.go:513] "Failed to read data from 
checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.209042 4895 server.go:79] "Starting device plugin registration server" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.209444 4895 eviction_manager.go:189] "Eviction manager: starting control loop" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.209458 4895 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.209775 4895 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.209858 4895 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.209872 4895 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Dec 02 07:23:09 crc kubenswrapper[4895]: E1202 07:23:09.220196 4895 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.240232 4895 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.240529 4895 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.241724 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.241820 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.241837 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.242028 4895 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.242287 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.242381 4895 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.243073 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.243099 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.243113 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.243385 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.243462 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.243527 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.243720 4895 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.243922 4895 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.243990 4895 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.244646 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.244675 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.244687 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.244847 4895 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.244980 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.245071 4895 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.245010 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.245238 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.245251 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.245499 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.245532 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.245543 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.245688 4895 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.245825 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.245864 4895 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.245918 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.245938 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.245948 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.246276 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.246314 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.246329 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.246535 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.246577 4895 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.246604 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.246657 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.246669 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.247349 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.247377 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.247388 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:09 crc kubenswrapper[4895]: E1202 07:23:09.256568 4895 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.13:6443: connect: connection refused" interval="400ms" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.275649 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 02 07:23:09 crc 
kubenswrapper[4895]: I1202 07:23:09.275799 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.275886 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.275975 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.276050 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.276123 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.276189 
4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.276259 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.276350 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.276484 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.276565 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.276755 4895 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.276832 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.276912 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.276993 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.309908 4895 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.311467 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.311541 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.311559 
4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.311617 4895 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 02 07:23:09 crc kubenswrapper[4895]: E1202 07:23:09.312470 4895 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.13:6443: connect: connection refused" node="crc" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.378538 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.378607 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.378637 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.378666 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.378697 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.378720 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.378766 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.378794 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.378821 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.378842 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.378866 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.378894 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.378923 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.378944 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.378969 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.379018 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.379032 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.379164 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.379268 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.379275 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.379351 4895 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.379436 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.379506 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.379551 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.379554 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.379582 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" 
Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.379614 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.379636 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.379707 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.379657 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.513226 4895 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.514857 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.514936 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.514956 
4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.514999 4895 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 02 07:23:09 crc kubenswrapper[4895]: E1202 07:23:09.515772 4895 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.13:6443: connect: connection refused" node="crc" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.570670 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.574082 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.594065 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 07:23:09 crc kubenswrapper[4895]: W1202 07:23:09.603367 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-f75752e48f4c0799d5076b3ba6f5b483736dcc0d5851d52332ed0fcbb0c1eb38 WatchSource:0}: Error finding container f75752e48f4c0799d5076b3ba6f5b483736dcc0d5851d52332ed0fcbb0c1eb38: Status 404 returned error can't find the container with id f75752e48f4c0799d5076b3ba6f5b483736dcc0d5851d52332ed0fcbb0c1eb38 Dec 02 07:23:09 crc kubenswrapper[4895]: W1202 07:23:09.605204 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-441505e1e7722da61b0b3ea63025f853fc039041deaf68e242e0cbcee7f8b85e WatchSource:0}: Error finding container 
441505e1e7722da61b0b3ea63025f853fc039041deaf68e242e0cbcee7f8b85e: Status 404 returned error can't find the container with id 441505e1e7722da61b0b3ea63025f853fc039041deaf68e242e0cbcee7f8b85e Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.609175 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 07:23:09 crc kubenswrapper[4895]: W1202 07:23:09.614460 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-b524008e645f32f68a7535f93da8ff0e51ac48a80e9647d1bd853c5035010070 WatchSource:0}: Error finding container b524008e645f32f68a7535f93da8ff0e51ac48a80e9647d1bd853c5035010070: Status 404 returned error can't find the container with id b524008e645f32f68a7535f93da8ff0e51ac48a80e9647d1bd853c5035010070 Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.615962 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 02 07:23:09 crc kubenswrapper[4895]: W1202 07:23:09.635294 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-c4949d845efd289d233f25dd4471a95b5fde1d3fb487cc54166f08ccd026ff66 WatchSource:0}: Error finding container c4949d845efd289d233f25dd4471a95b5fde1d3fb487cc54166f08ccd026ff66: Status 404 returned error can't find the container with id c4949d845efd289d233f25dd4471a95b5fde1d3fb487cc54166f08ccd026ff66 Dec 02 07:23:09 crc kubenswrapper[4895]: W1202 07:23:09.641855 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-1986762b40f64e42c336ca911f42d8ccb30aa1d7323c1f177d9746257dce366d WatchSource:0}: Error finding container 1986762b40f64e42c336ca911f42d8ccb30aa1d7323c1f177d9746257dce366d: Status 404 returned error can't find the container with id 1986762b40f64e42c336ca911f42d8ccb30aa1d7323c1f177d9746257dce366d Dec 02 07:23:09 crc kubenswrapper[4895]: E1202 07:23:09.657709 4895 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.13:6443: connect: connection refused" interval="800ms" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.916163 4895 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.917956 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.917993 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 
02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.918003 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:09 crc kubenswrapper[4895]: I1202 07:23:09.918038 4895 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 02 07:23:09 crc kubenswrapper[4895]: E1202 07:23:09.918542 4895 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.13:6443: connect: connection refused" node="crc" Dec 02 07:23:10 crc kubenswrapper[4895]: I1202 07:23:10.049326 4895 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.13:6443: connect: connection refused Dec 02 07:23:10 crc kubenswrapper[4895]: W1202 07:23:10.120305 4895 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.13:6443: connect: connection refused Dec 02 07:23:10 crc kubenswrapper[4895]: E1202 07:23:10.120430 4895 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.13:6443: connect: connection refused" logger="UnhandledError" Dec 02 07:23:10 crc kubenswrapper[4895]: W1202 07:23:10.143416 4895 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.13:6443: connect: connection refused Dec 02 07:23:10 crc kubenswrapper[4895]: E1202 07:23:10.143478 4895 
reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.13:6443: connect: connection refused" logger="UnhandledError" Dec 02 07:23:10 crc kubenswrapper[4895]: I1202 07:23:10.151559 4895 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="0faa0995131a1f9cbf2aa0cf7651d510fad1bdaf7e32de51f8a14b70f161e783" exitCode=0 Dec 02 07:23:10 crc kubenswrapper[4895]: I1202 07:23:10.151668 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"0faa0995131a1f9cbf2aa0cf7651d510fad1bdaf7e32de51f8a14b70f161e783"} Dec 02 07:23:10 crc kubenswrapper[4895]: I1202 07:23:10.151817 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"b524008e645f32f68a7535f93da8ff0e51ac48a80e9647d1bd853c5035010070"} Dec 02 07:23:10 crc kubenswrapper[4895]: I1202 07:23:10.151981 4895 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 07:23:10 crc kubenswrapper[4895]: I1202 07:23:10.153881 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:10 crc kubenswrapper[4895]: I1202 07:23:10.153939 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:10 crc kubenswrapper[4895]: I1202 07:23:10.153958 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:10 crc kubenswrapper[4895]: I1202 07:23:10.155153 4895 generic.go:334] "Generic (PLEG): 
container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="1f5656c9cbe214094b8aa976e6a6022ce87fb0640b5aa640a98d4271c070f3ef" exitCode=0 Dec 02 07:23:10 crc kubenswrapper[4895]: I1202 07:23:10.155273 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"1f5656c9cbe214094b8aa976e6a6022ce87fb0640b5aa640a98d4271c070f3ef"} Dec 02 07:23:10 crc kubenswrapper[4895]: I1202 07:23:10.155475 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"f75752e48f4c0799d5076b3ba6f5b483736dcc0d5851d52332ed0fcbb0c1eb38"} Dec 02 07:23:10 crc kubenswrapper[4895]: I1202 07:23:10.155620 4895 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 07:23:10 crc kubenswrapper[4895]: I1202 07:23:10.156853 4895 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 07:23:10 crc kubenswrapper[4895]: I1202 07:23:10.157375 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:10 crc kubenswrapper[4895]: I1202 07:23:10.157414 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:10 crc kubenswrapper[4895]: I1202 07:23:10.157428 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:10 crc kubenswrapper[4895]: I1202 07:23:10.157859 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:10 crc kubenswrapper[4895]: I1202 07:23:10.157890 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:10 crc kubenswrapper[4895]: I1202 07:23:10.157909 4895 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:10 crc kubenswrapper[4895]: I1202 07:23:10.158602 4895 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="73131074a12f26173aefdc0bde82a7446290cf2bec30f320cc37f8c4706ac50e" exitCode=0 Dec 02 07:23:10 crc kubenswrapper[4895]: I1202 07:23:10.158643 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"73131074a12f26173aefdc0bde82a7446290cf2bec30f320cc37f8c4706ac50e"} Dec 02 07:23:10 crc kubenswrapper[4895]: I1202 07:23:10.158722 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"441505e1e7722da61b0b3ea63025f853fc039041deaf68e242e0cbcee7f8b85e"} Dec 02 07:23:10 crc kubenswrapper[4895]: I1202 07:23:10.158870 4895 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 07:23:10 crc kubenswrapper[4895]: I1202 07:23:10.159990 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:10 crc kubenswrapper[4895]: I1202 07:23:10.160025 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:10 crc kubenswrapper[4895]: I1202 07:23:10.160039 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:10 crc kubenswrapper[4895]: I1202 07:23:10.161606 4895 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="c9d30ec3b0f73c029956f058f1d8f06cfffb5533dcd0c221ab6842d277f274a8" exitCode=0 Dec 02 07:23:10 crc kubenswrapper[4895]: I1202 
07:23:10.161712 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"c9d30ec3b0f73c029956f058f1d8f06cfffb5533dcd0c221ab6842d277f274a8"} Dec 02 07:23:10 crc kubenswrapper[4895]: I1202 07:23:10.161788 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"1986762b40f64e42c336ca911f42d8ccb30aa1d7323c1f177d9746257dce366d"} Dec 02 07:23:10 crc kubenswrapper[4895]: I1202 07:23:10.161913 4895 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 07:23:10 crc kubenswrapper[4895]: I1202 07:23:10.162989 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:10 crc kubenswrapper[4895]: I1202 07:23:10.163025 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:10 crc kubenswrapper[4895]: I1202 07:23:10.163038 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:10 crc kubenswrapper[4895]: I1202 07:23:10.164403 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"c6c3baa5b3772a8fc64774e28c3f389f114723e7efa62b934140a4fed8b6a40d"} Dec 02 07:23:10 crc kubenswrapper[4895]: I1202 07:23:10.164448 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"c4949d845efd289d233f25dd4471a95b5fde1d3fb487cc54166f08ccd026ff66"} Dec 02 07:23:10 crc kubenswrapper[4895]: W1202 
07:23:10.202553 4895 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.13:6443: connect: connection refused Dec 02 07:23:10 crc kubenswrapper[4895]: E1202 07:23:10.202719 4895 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.13:6443: connect: connection refused" logger="UnhandledError" Dec 02 07:23:10 crc kubenswrapper[4895]: E1202 07:23:10.459511 4895 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.13:6443: connect: connection refused" interval="1.6s" Dec 02 07:23:10 crc kubenswrapper[4895]: W1202 07:23:10.541031 4895 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.13:6443: connect: connection refused Dec 02 07:23:10 crc kubenswrapper[4895]: E1202 07:23:10.541119 4895 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.13:6443: connect: connection refused" logger="UnhandledError" Dec 02 07:23:10 crc kubenswrapper[4895]: I1202 07:23:10.719635 4895 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 07:23:10 crc kubenswrapper[4895]: I1202 07:23:10.723939 4895 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:10 crc kubenswrapper[4895]: I1202 07:23:10.723996 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:10 crc kubenswrapper[4895]: I1202 07:23:10.724007 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:10 crc kubenswrapper[4895]: I1202 07:23:10.724045 4895 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 02 07:23:11 crc kubenswrapper[4895]: I1202 07:23:11.169215 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"d318f71ad32dd4a410093b85741038e468d540edd30f112234a9d595bc6fd85f"} Dec 02 07:23:11 crc kubenswrapper[4895]: I1202 07:23:11.169260 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"33e300332628d3baf9041b52dc7b8cd9dab1e9c5da76572e86b787433fd5fb92"} Dec 02 07:23:11 crc kubenswrapper[4895]: I1202 07:23:11.169269 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"4455fec2a817e7266e152d3f4afbede483504d6066f1a0f7375bce282ae10f8b"} Dec 02 07:23:11 crc kubenswrapper[4895]: I1202 07:23:11.169278 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"40e020a2b4caf39d1661cee9c6b0055eca86e0ae9b81d2da7485ca9ec92dd0aa"} Dec 02 07:23:11 crc kubenswrapper[4895]: I1202 07:23:11.170373 4895 generic.go:334] "Generic (PLEG): container finished" 
podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="60184fc891c6407180abeded8004495660c5b52559c3132af2eef1c97e8f08ec" exitCode=0 Dec 02 07:23:11 crc kubenswrapper[4895]: I1202 07:23:11.170423 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"60184fc891c6407180abeded8004495660c5b52559c3132af2eef1c97e8f08ec"} Dec 02 07:23:11 crc kubenswrapper[4895]: I1202 07:23:11.170505 4895 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 07:23:11 crc kubenswrapper[4895]: I1202 07:23:11.171289 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:11 crc kubenswrapper[4895]: I1202 07:23:11.171314 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:11 crc kubenswrapper[4895]: I1202 07:23:11.171323 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:11 crc kubenswrapper[4895]: I1202 07:23:11.171713 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"8044b0d3dc7f5c301eeeaee66d8461268ad266d0561461cd0b30bdc650cbdd99"} Dec 02 07:23:11 crc kubenswrapper[4895]: I1202 07:23:11.171914 4895 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 07:23:11 crc kubenswrapper[4895]: I1202 07:23:11.173548 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:11 crc kubenswrapper[4895]: I1202 07:23:11.173573 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:11 crc kubenswrapper[4895]: I1202 
07:23:11.173582 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:11 crc kubenswrapper[4895]: I1202 07:23:11.174634 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"2b737e88ad2a8caf70b89f59db461d355846d5612a29a249107b49fbec176204"} Dec 02 07:23:11 crc kubenswrapper[4895]: I1202 07:23:11.174687 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"de2bad51e8d51f1243a2fe7799554c7f089425e3623a6bb62f7393610d70fc4e"} Dec 02 07:23:11 crc kubenswrapper[4895]: I1202 07:23:11.174710 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"6f076cfc3bd5e782332e827d56325b73cf2a9f35248f397338b5130793481af9"} Dec 02 07:23:11 crc kubenswrapper[4895]: I1202 07:23:11.174866 4895 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 07:23:11 crc kubenswrapper[4895]: I1202 07:23:11.176036 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:11 crc kubenswrapper[4895]: I1202 07:23:11.176100 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:11 crc kubenswrapper[4895]: I1202 07:23:11.176126 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:11 crc kubenswrapper[4895]: I1202 07:23:11.177094 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"2543d364f902a6c34a82dac25f389cd36afc2f717d772d0a487a640d44adf30f"} Dec 02 07:23:11 crc kubenswrapper[4895]: I1202 07:23:11.177151 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"49d7c12ea2cd199dcfa46a6f1a1a856f1285a709def9d82b6805ba256a79fd5f"} Dec 02 07:23:11 crc kubenswrapper[4895]: I1202 07:23:11.177181 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"0b3c366f0f27ff793a37b2944887ab1c1bc155c14d5c8d6c267442b0a8666ef8"} Dec 02 07:23:11 crc kubenswrapper[4895]: I1202 07:23:11.177242 4895 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 07:23:11 crc kubenswrapper[4895]: I1202 07:23:11.178224 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:11 crc kubenswrapper[4895]: I1202 07:23:11.178272 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:11 crc kubenswrapper[4895]: I1202 07:23:11.178285 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:12 crc kubenswrapper[4895]: I1202 07:23:12.190685 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"a9998dffc20c5b0579e6214d10541b9702e9c4a6563c930a41e49b6853b832ba"} Dec 02 07:23:12 crc kubenswrapper[4895]: I1202 07:23:12.190877 4895 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 07:23:12 crc 
kubenswrapper[4895]: I1202 07:23:12.193091 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:12 crc kubenswrapper[4895]: I1202 07:23:12.193238 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:12 crc kubenswrapper[4895]: I1202 07:23:12.193338 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:12 crc kubenswrapper[4895]: I1202 07:23:12.195029 4895 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="3787d284b2bfc6855be2fdcf47aa69cc845b2bf227c053fa482b169a9ff5d5cb" exitCode=0 Dec 02 07:23:12 crc kubenswrapper[4895]: I1202 07:23:12.195161 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"3787d284b2bfc6855be2fdcf47aa69cc845b2bf227c053fa482b169a9ff5d5cb"} Dec 02 07:23:12 crc kubenswrapper[4895]: I1202 07:23:12.195187 4895 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 07:23:12 crc kubenswrapper[4895]: I1202 07:23:12.195215 4895 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 07:23:12 crc kubenswrapper[4895]: I1202 07:23:12.197061 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:12 crc kubenswrapper[4895]: I1202 07:23:12.197160 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:12 crc kubenswrapper[4895]: I1202 07:23:12.197235 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:12 crc kubenswrapper[4895]: I1202 07:23:12.197921 4895 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:12 crc kubenswrapper[4895]: I1202 07:23:12.197989 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:12 crc kubenswrapper[4895]: I1202 07:23:12.198013 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:12 crc kubenswrapper[4895]: I1202 07:23:12.771076 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 07:23:13 crc kubenswrapper[4895]: I1202 07:23:13.202789 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"d46786f8b9f8e55028793082aeeac5223ae936eadec4f863779607746386dafb"} Dec 02 07:23:13 crc kubenswrapper[4895]: I1202 07:23:13.202856 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"443e5217ebe699e87855da6368830efb6c0cfd2269f7e478b0eba4228c7332d0"} Dec 02 07:23:13 crc kubenswrapper[4895]: I1202 07:23:13.202887 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"4aecebf2f414351a8a75de9a970c8cc3c71debd541c3a14b2c0bc47c1f1bd68d"} Dec 02 07:23:13 crc kubenswrapper[4895]: I1202 07:23:13.202905 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"f3094ac339ea29ea82679832060bdab28766b6776f271b79d7d187ccf42144e1"} Dec 02 07:23:13 crc kubenswrapper[4895]: I1202 07:23:13.202937 4895 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 07:23:13 crc kubenswrapper[4895]: I1202 
07:23:13.202987 4895 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 07:23:13 crc kubenswrapper[4895]: I1202 07:23:13.203051 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 07:23:13 crc kubenswrapper[4895]: I1202 07:23:13.204079 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:13 crc kubenswrapper[4895]: I1202 07:23:13.204079 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:13 crc kubenswrapper[4895]: I1202 07:23:13.204168 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:13 crc kubenswrapper[4895]: I1202 07:23:13.204182 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:13 crc kubenswrapper[4895]: I1202 07:23:13.204130 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:13 crc kubenswrapper[4895]: I1202 07:23:13.204214 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:13 crc kubenswrapper[4895]: I1202 07:23:13.488874 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 07:23:14 crc kubenswrapper[4895]: I1202 07:23:14.009659 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 07:23:14 crc kubenswrapper[4895]: I1202 07:23:14.210706 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"033075acf80c00c23e81564e5384b1179d58a79ed6786abc63b792cb07d7a7e2"} Dec 02 07:23:14 crc kubenswrapper[4895]: I1202 07:23:14.210841 4895 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 07:23:14 crc kubenswrapper[4895]: I1202 07:23:14.210861 4895 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 07:23:14 crc kubenswrapper[4895]: I1202 07:23:14.210961 4895 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 07:23:14 crc kubenswrapper[4895]: I1202 07:23:14.212185 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:14 crc kubenswrapper[4895]: I1202 07:23:14.212208 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:14 crc kubenswrapper[4895]: I1202 07:23:14.212226 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:14 crc kubenswrapper[4895]: I1202 07:23:14.212238 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:14 crc kubenswrapper[4895]: I1202 07:23:14.212226 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:14 crc kubenswrapper[4895]: I1202 07:23:14.212365 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:14 crc kubenswrapper[4895]: I1202 07:23:14.212525 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:14 crc kubenswrapper[4895]: I1202 07:23:14.212553 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 02 07:23:14 crc kubenswrapper[4895]: I1202 07:23:14.212565 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:15 crc kubenswrapper[4895]: I1202 07:23:15.035899 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Dec 02 07:23:15 crc kubenswrapper[4895]: I1202 07:23:15.213948 4895 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 07:23:15 crc kubenswrapper[4895]: I1202 07:23:15.215518 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:15 crc kubenswrapper[4895]: I1202 07:23:15.215602 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:15 crc kubenswrapper[4895]: I1202 07:23:15.215615 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:15 crc kubenswrapper[4895]: I1202 07:23:15.217983 4895 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 07:23:15 crc kubenswrapper[4895]: I1202 07:23:15.219162 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:15 crc kubenswrapper[4895]: I1202 07:23:15.219210 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:15 crc kubenswrapper[4895]: I1202 07:23:15.219225 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:15 crc kubenswrapper[4895]: I1202 07:23:15.436226 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Dec 02 07:23:16 crc kubenswrapper[4895]: I1202 07:23:16.148689 4895 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 07:23:16 crc kubenswrapper[4895]: I1202 07:23:16.217097 4895 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 07:23:16 crc kubenswrapper[4895]: I1202 07:23:16.217164 4895 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 07:23:16 crc kubenswrapper[4895]: I1202 07:23:16.218564 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:16 crc kubenswrapper[4895]: I1202 07:23:16.218626 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:16 crc kubenswrapper[4895]: I1202 07:23:16.218644 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:16 crc kubenswrapper[4895]: I1202 07:23:16.218938 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:16 crc kubenswrapper[4895]: I1202 07:23:16.219016 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:16 crc kubenswrapper[4895]: I1202 07:23:16.219045 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:16 crc kubenswrapper[4895]: I1202 07:23:16.696782 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 07:23:16 crc kubenswrapper[4895]: I1202 07:23:16.697042 4895 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 07:23:16 crc kubenswrapper[4895]: I1202 07:23:16.699218 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 
07:23:16 crc kubenswrapper[4895]: I1202 07:23:16.699262 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:16 crc kubenswrapper[4895]: I1202 07:23:16.699272 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:16 crc kubenswrapper[4895]: I1202 07:23:16.701878 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 07:23:17 crc kubenswrapper[4895]: I1202 07:23:17.010250 4895 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 02 07:23:17 crc kubenswrapper[4895]: I1202 07:23:17.010348 4895 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 02 07:23:17 crc kubenswrapper[4895]: I1202 07:23:17.219536 4895 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 07:23:17 crc kubenswrapper[4895]: I1202 07:23:17.219665 4895 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 07:23:17 crc kubenswrapper[4895]: I1202 07:23:17.220901 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:17 crc kubenswrapper[4895]: I1202 07:23:17.221075 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 02 07:23:17 crc kubenswrapper[4895]: I1202 07:23:17.221111 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:17 crc kubenswrapper[4895]: I1202 07:23:17.221226 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:17 crc kubenswrapper[4895]: I1202 07:23:17.221122 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:17 crc kubenswrapper[4895]: I1202 07:23:17.221296 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:17 crc kubenswrapper[4895]: I1202 07:23:17.994777 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 02 07:23:17 crc kubenswrapper[4895]: I1202 07:23:17.995013 4895 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 07:23:17 crc kubenswrapper[4895]: I1202 07:23:17.996437 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:17 crc kubenswrapper[4895]: I1202 07:23:17.996519 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:17 crc kubenswrapper[4895]: I1202 07:23:17.996542 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:18 crc kubenswrapper[4895]: I1202 07:23:18.794014 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 07:23:18 crc kubenswrapper[4895]: I1202 07:23:18.795071 4895 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 07:23:18 crc kubenswrapper[4895]: 
I1202 07:23:18.796488 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:18 crc kubenswrapper[4895]: I1202 07:23:18.796668 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:18 crc kubenswrapper[4895]: I1202 07:23:18.796834 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:19 crc kubenswrapper[4895]: E1202 07:23:19.220441 4895 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Dec 02 07:23:20 crc kubenswrapper[4895]: E1202 07:23:20.725537 4895 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": net/http: TLS handshake timeout" node="crc" Dec 02 07:23:21 crc kubenswrapper[4895]: I1202 07:23:21.049600 4895 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Dec 02 07:23:21 crc kubenswrapper[4895]: I1202 07:23:21.139492 4895 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Dec 02 07:23:21 crc kubenswrapper[4895]: I1202 07:23:21.140093 4895 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Dec 02 07:23:22 crc kubenswrapper[4895]: E1202 
07:23:22.061280 4895 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" interval="3.2s" Dec 02 07:23:22 crc kubenswrapper[4895]: I1202 07:23:22.325991 4895 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 07:23:22 crc kubenswrapper[4895]: I1202 07:23:22.327941 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:22 crc kubenswrapper[4895]: I1202 07:23:22.328013 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:22 crc kubenswrapper[4895]: I1202 07:23:22.328036 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:22 crc kubenswrapper[4895]: I1202 07:23:22.328076 4895 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 02 07:23:22 crc kubenswrapper[4895]: E1202 07:23:22.391499 4895 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": net/http: TLS handshake timeout" event="&Event{ObjectMeta:{crc.187d5517a57521f8 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-02 07:23:09.04762828 +0000 UTC m=+0.218487973,LastTimestamp:2025-12-02 07:23:09.04762828 +0000 UTC m=+0.218487973,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 02 
07:23:22 crc kubenswrapper[4895]: W1202 07:23:22.678897 4895 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout Dec 02 07:23:22 crc kubenswrapper[4895]: I1202 07:23:22.679048 4895 trace.go:236] Trace[688683793]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (02-Dec-2025 07:23:12.676) (total time: 10002ms): Dec 02 07:23:22 crc kubenswrapper[4895]: Trace[688683793]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10002ms (07:23:22.678) Dec 02 07:23:22 crc kubenswrapper[4895]: Trace[688683793]: [10.002130117s] [10.002130117s] END Dec 02 07:23:22 crc kubenswrapper[4895]: E1202 07:23:22.679088 4895 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Dec 02 07:23:22 crc kubenswrapper[4895]: W1202 07:23:22.690977 4895 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout Dec 02 07:23:22 crc kubenswrapper[4895]: I1202 07:23:22.691132 4895 trace.go:236] Trace[1325860691]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (02-Dec-2025 07:23:12.688) (total time: 10002ms): Dec 02 07:23:22 crc kubenswrapper[4895]: Trace[1325860691]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10002ms 
(07:23:22.690) Dec 02 07:23:22 crc kubenswrapper[4895]: Trace[1325860691]: [10.00229705s] [10.00229705s] END Dec 02 07:23:22 crc kubenswrapper[4895]: E1202 07:23:22.691170 4895 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Dec 02 07:23:23 crc kubenswrapper[4895]: W1202 07:23:23.072430 4895 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout Dec 02 07:23:23 crc kubenswrapper[4895]: I1202 07:23:23.072627 4895 trace.go:236] Trace[1784394694]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (02-Dec-2025 07:23:13.070) (total time: 10001ms): Dec 02 07:23:23 crc kubenswrapper[4895]: Trace[1784394694]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (07:23:23.072) Dec 02 07:23:23 crc kubenswrapper[4895]: Trace[1784394694]: [10.001797154s] [10.001797154s] END Dec 02 07:23:23 crc kubenswrapper[4895]: E1202 07:23:23.072673 4895 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Dec 02 07:23:23 crc kubenswrapper[4895]: W1202 07:23:23.279487 4895 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout Dec 02 07:23:23 crc kubenswrapper[4895]: I1202 07:23:23.279653 4895 trace.go:236] Trace[1884751726]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (02-Dec-2025 07:23:13.278) (total time: 10001ms): Dec 02 07:23:23 crc kubenswrapper[4895]: Trace[1884751726]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (07:23:23.279) Dec 02 07:23:23 crc kubenswrapper[4895]: Trace[1884751726]: [10.001242268s] [10.001242268s] END Dec 02 07:23:23 crc kubenswrapper[4895]: E1202 07:23:23.279691 4895 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Dec 02 07:23:23 crc kubenswrapper[4895]: I1202 07:23:23.489244 4895 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="Get \"https://192.168.126.11:6443/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 02 07:23:23 crc kubenswrapper[4895]: I1202 07:23:23.489378 4895 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="Get \"https://192.168.126.11:6443/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 02 07:23:25 crc kubenswrapper[4895]: I1202 07:23:25.475352 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Dec 02 07:23:25 crc 
kubenswrapper[4895]: I1202 07:23:25.475602 4895 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 07:23:25 crc kubenswrapper[4895]: I1202 07:23:25.477435 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:25 crc kubenswrapper[4895]: I1202 07:23:25.477486 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:25 crc kubenswrapper[4895]: I1202 07:23:25.477503 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:25 crc kubenswrapper[4895]: I1202 07:23:25.495575 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Dec 02 07:23:26 crc kubenswrapper[4895]: I1202 07:23:26.246043 4895 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 07:23:26 crc kubenswrapper[4895]: I1202 07:23:26.247403 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:26 crc kubenswrapper[4895]: I1202 07:23:26.247535 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:26 crc kubenswrapper[4895]: I1202 07:23:26.247585 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:27 crc kubenswrapper[4895]: I1202 07:23:27.010676 4895 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 02 07:23:27 crc kubenswrapper[4895]: I1202 07:23:27.010782 4895 
prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 02 07:23:27 crc kubenswrapper[4895]: I1202 07:23:27.846377 4895 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Dec 02 07:23:27 crc kubenswrapper[4895]: I1202 07:23:27.846451 4895 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Dec 02 07:23:28 crc kubenswrapper[4895]: I1202 07:23:28.495003 4895 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Dec 02 07:23:28 crc kubenswrapper[4895]: [+]log ok Dec 02 07:23:28 crc kubenswrapper[4895]: [+]etcd ok Dec 02 07:23:28 crc kubenswrapper[4895]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Dec 02 07:23:28 crc kubenswrapper[4895]: [+]poststarthook/openshift.io-api-request-count-filter ok Dec 02 07:23:28 crc kubenswrapper[4895]: [+]poststarthook/openshift.io-startkubeinformers ok Dec 02 07:23:28 crc kubenswrapper[4895]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Dec 02 07:23:28 crc kubenswrapper[4895]: 
[+]poststarthook/openshift.io-oauth-apiserver-reachable ok Dec 02 07:23:28 crc kubenswrapper[4895]: [+]poststarthook/start-apiserver-admission-initializer ok Dec 02 07:23:28 crc kubenswrapper[4895]: [+]poststarthook/generic-apiserver-start-informers ok Dec 02 07:23:28 crc kubenswrapper[4895]: [+]poststarthook/priority-and-fairness-config-consumer ok Dec 02 07:23:28 crc kubenswrapper[4895]: [+]poststarthook/priority-and-fairness-filter ok Dec 02 07:23:28 crc kubenswrapper[4895]: [+]poststarthook/storage-object-count-tracker-hook ok Dec 02 07:23:28 crc kubenswrapper[4895]: [+]poststarthook/start-apiextensions-informers ok Dec 02 07:23:28 crc kubenswrapper[4895]: [+]poststarthook/start-apiextensions-controllers ok Dec 02 07:23:28 crc kubenswrapper[4895]: [+]poststarthook/crd-informer-synced ok Dec 02 07:23:28 crc kubenswrapper[4895]: [+]poststarthook/start-system-namespaces-controller ok Dec 02 07:23:28 crc kubenswrapper[4895]: [+]poststarthook/start-cluster-authentication-info-controller ok Dec 02 07:23:28 crc kubenswrapper[4895]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Dec 02 07:23:28 crc kubenswrapper[4895]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Dec 02 07:23:28 crc kubenswrapper[4895]: [+]poststarthook/start-legacy-token-tracking-controller ok Dec 02 07:23:28 crc kubenswrapper[4895]: [+]poststarthook/start-service-ip-repair-controllers ok Dec 02 07:23:28 crc kubenswrapper[4895]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld Dec 02 07:23:28 crc kubenswrapper[4895]: [-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld Dec 02 07:23:28 crc kubenswrapper[4895]: [+]poststarthook/priority-and-fairness-config-producer ok Dec 02 07:23:28 crc kubenswrapper[4895]: [+]poststarthook/bootstrap-controller ok Dec 02 07:23:28 crc kubenswrapper[4895]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Dec 02 07:23:28 crc kubenswrapper[4895]: 
[+]poststarthook/start-kube-aggregator-informers ok Dec 02 07:23:28 crc kubenswrapper[4895]: [+]poststarthook/apiservice-status-local-available-controller ok Dec 02 07:23:28 crc kubenswrapper[4895]: [+]poststarthook/apiservice-status-remote-available-controller ok Dec 02 07:23:28 crc kubenswrapper[4895]: [+]poststarthook/apiservice-registration-controller ok Dec 02 07:23:28 crc kubenswrapper[4895]: [+]poststarthook/apiservice-wait-for-first-sync ok Dec 02 07:23:28 crc kubenswrapper[4895]: [+]poststarthook/apiservice-discovery-controller ok Dec 02 07:23:28 crc kubenswrapper[4895]: [+]poststarthook/kube-apiserver-autoregistration ok Dec 02 07:23:28 crc kubenswrapper[4895]: [+]autoregister-completion ok Dec 02 07:23:28 crc kubenswrapper[4895]: [+]poststarthook/apiservice-openapi-controller ok Dec 02 07:23:28 crc kubenswrapper[4895]: [+]poststarthook/apiservice-openapiv3-controller ok Dec 02 07:23:28 crc kubenswrapper[4895]: livez check failed Dec 02 07:23:28 crc kubenswrapper[4895]: I1202 07:23:28.495147 4895 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 02 07:23:28 crc kubenswrapper[4895]: I1202 07:23:28.803295 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 07:23:28 crc kubenswrapper[4895]: I1202 07:23:28.803519 4895 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 07:23:28 crc kubenswrapper[4895]: I1202 07:23:28.805261 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:28 crc kubenswrapper[4895]: I1202 07:23:28.805404 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:28 crc 
kubenswrapper[4895]: I1202 07:23:28.805432 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:29 crc kubenswrapper[4895]: E1202 07:23:29.220915 4895 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Dec 02 07:23:32 crc kubenswrapper[4895]: I1202 07:23:32.836525 4895 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 02 07:23:32 crc kubenswrapper[4895]: I1202 07:23:32.836606 4895 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 02 07:23:32 crc kubenswrapper[4895]: I1202 07:23:32.836866 4895 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 02 07:23:32 crc kubenswrapper[4895]: I1202 07:23:32.838856 4895 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Dec 02 07:23:32 crc kubenswrapper[4895]: I1202 07:23:32.839563 4895 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 02 07:23:32 crc kubenswrapper[4895]: E1202 07:23:32.846901 4895 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Dec 02 07:23:32 crc kubenswrapper[4895]: I1202 07:23:32.884530 4895 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:60734->192.168.126.11:17697: read: connection reset by peer" start-of-body= Dec 02 07:23:32 crc kubenswrapper[4895]: I1202 07:23:32.884640 4895 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:60734->192.168.126.11:17697: read: connection reset by peer" Dec 02 07:23:32 crc kubenswrapper[4895]: I1202 07:23:32.884523 4895 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:38896->192.168.126.11:17697: read: connection reset by peer" start-of-body= Dec 02 07:23:32 crc kubenswrapper[4895]: I1202 07:23:32.884793 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:38896->192.168.126.11:17697: read: connection reset by peer" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.056500 4895 apiserver.go:52] "Watching apiserver" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.059141 4895 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.059365 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf"] Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.059795 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.059858 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:23:33 crc kubenswrapper[4895]: E1202 07:23:33.059911 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.059925 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.059991 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.059818 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.060297 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:23:33 crc kubenswrapper[4895]: E1202 07:23:33.060340 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 07:23:33 crc kubenswrapper[4895]: E1202 07:23:33.060319 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.062066 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.063426 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.063864 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.064077 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.064240 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.064328 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.064769 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.064904 4895 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.065061 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.118964 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.135593 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.156141 4895 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.163569 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.176680 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.191797 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.204536 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.217995 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.226871 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.237275 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.244007 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.244047 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" 
(UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.244074 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.244098 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.244117 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.244135 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.244159 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 02 07:23:33 crc 
kubenswrapper[4895]: I1202 07:23:33.244178 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.244196 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.244214 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.244231 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.244253 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.244271 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.244286 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.244302 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.244320 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.244338 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.244357 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod 
\"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.244407 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.244424 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.244439 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.244456 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.244494 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.244511 4895 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.244531 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.244549 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.244567 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.244588 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.244603 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: 
\"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.244657 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.244674 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.244691 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.244708 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.244725 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.244762 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: 
\"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.244780 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.244815 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.244831 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.244850 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.244876 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 02 07:23:33 crc 
kubenswrapper[4895]: I1202 07:23:33.244897 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.244918 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.244941 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.244965 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.244982 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.245001 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.245019 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.245038 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.245058 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.245079 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.245098 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: 
\"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.245118 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.245261 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.245283 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.245301 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.245322 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.245344 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.245363 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.245381 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.245404 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.245422 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.245439 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: 
\"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.245458 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.245475 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.245491 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.245536 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.245556 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.245575 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.245602 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.245621 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.245638 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.245655 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.245672 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod 
\"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.245691 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.245802 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.245826 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.245845 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.245864 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.245881 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.245898 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.245915 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.245932 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.245950 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.245973 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: 
\"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") "
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.245992 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.246010 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.246027 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") "
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.246045 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.246064 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") "
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.246082 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.246100 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") "
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.246119 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.246135 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.246155 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.246173 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.246195 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") "
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.246212 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.246231 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.246251 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") "
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.246272 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.246293 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.246309 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") "
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.246329 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.246348 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") "
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.246367 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") "
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.246385 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") "
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.246404 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.246422 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.246439 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") "
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.246455 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.246472 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") "
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.246488 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.246525 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") "
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.246548 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") "
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.246565 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") "
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.246584 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") "
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.246602 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") "
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.246618 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.246637 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.246653 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.246671 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") "
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.246688 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") "
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.246714 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") "
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.250031 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.250145 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.250193 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.250226 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.250263 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.250300 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") "
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.250341 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") "
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.250378 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.250420 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.250462 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") "
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.250492 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") "
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.250568 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.250606 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.250892 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.250940 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") "
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.250969 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.251006 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.251041 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") "
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.251103 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.251134 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") "
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.251164 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.251194 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") "
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.251219 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.251247 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.251276 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.251304 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.251334 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.251368 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") "
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.251413 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.251509 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.251548 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") "
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.251578 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.251606 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.251632 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") "
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.251657 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") "
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.251681 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") "
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.251702 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.251723 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.251781 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.251808 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.251840 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.251877 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") "
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.251907 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.251936 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.251969 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.252004 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.252028 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.252050 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.252074 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") "
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.252094 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") "
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.252117 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") "
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.252141 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") "
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.252164 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.253029 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") "
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.253063 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.253090 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.253111 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.253137 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.253562 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") "
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.253607 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.253636 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.253665 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") "
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.253697 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") "
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.253726 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.253782 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.253808 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.253828 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") "
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.253855 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.244533 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.244606 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.244861 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.244874 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.244980 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.254276 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.254389 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.254442 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID:
\"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.254480 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.254596 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.254629 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.254660 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.254697 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.254724 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.254763 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.254800 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.254831 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.254861 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.254896 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.254923 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.254951 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.254981 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 
07:23:33.255006 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.255112 4895 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.255129 4895 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.255149 4895 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.255167 4895 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.255180 4895 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.244984 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: 
"image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.245089 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.255848 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.245315 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.245364 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.245490 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.245559 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.245566 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.245577 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.255941 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.245768 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.245872 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.245907 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.256087 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.245940 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.246007 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.246023 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.246099 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.246125 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.246204 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.246252 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.246312 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.246324 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.246343 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.246572 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.246656 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.247556 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.247545 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.248016 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.248047 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.248149 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.248347 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.248489 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.248546 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.248809 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.248824 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.248847 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.248960 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.249259 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.249299 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.249351 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.249342 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.249428 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.249590 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.249596 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.249710 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.249881 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.249953 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.249971 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.250215 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.250239 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.250381 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.250630 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.250873 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.251291 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.251346 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.251362 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.251447 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.252585 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.252946 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.253228 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.253322 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.254034 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.254306 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.254493 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.254659 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.254715 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.254776 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.254795 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.254809 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.254986 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.255053 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.255131 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.255124 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.255385 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.255591 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.255688 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.255800 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.256014 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.256190 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.256400 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.256429 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.256674 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.256971 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.256990 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.257117 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 07:23:33 crc kubenswrapper[4895]: E1202 07:23:33.257314 4895 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.257352 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.257860 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.257867 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.257086 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.258139 4895 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.258204 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.258206 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.258303 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.258294 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.258657 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.258873 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.258919 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.259010 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.259273 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 07:23:33 crc kubenswrapper[4895]: E1202 07:23:33.259456 4895 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.259468 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.259445 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.259347 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 07:23:33 crc kubenswrapper[4895]: E1202 07:23:33.260052 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-02 07:23:33.760018307 +0000 UTC m=+24.930877920 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Dec 02 07:23:33 crc kubenswrapper[4895]: E1202 07:23:33.260092 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-02 07:23:33.760084329 +0000 UTC m=+24.930943942 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.260410 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.260486 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.260869 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.260913 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.261014 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.260966 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.261229 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.261375 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.261427 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.261498 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.261710 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.261965 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.262119 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.262639 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.263504 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.263907 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.263926 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.263916 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.262130 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.264281 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.264375 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.264524 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities".
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.264709 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.264859 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.264977 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.265078 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.265105 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.265832 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.265876 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.265928 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:23:33 crc kubenswrapper[4895]: E1202 07:23:33.266024 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 07:23:33.765983991 +0000 UTC m=+24.936843614 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.266595 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.266038 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.266451 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.266631 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.264506 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.266661 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.266862 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.267238 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.267465 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.267565 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.267840 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.268371 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.268565 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.269368 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.272106 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.270414 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.272386 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.272625 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:23:33 crc kubenswrapper[4895]: E1202 07:23:33.272785 4895 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 02 07:23:33 crc kubenswrapper[4895]: E1202 07:23:33.272817 4895 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 02 07:23:33 crc kubenswrapper[4895]: E1202 07:23:33.272833 4895 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.272886 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:23:33 crc kubenswrapper[4895]: E1202 07:23:33.272901 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-02 07:23:33.772881142 +0000 UTC m=+24.943740955 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.273114 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.276463 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.276893 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.277181 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.277906 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.278292 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.279150 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.280174 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.280549 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.280619 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.280969 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.280607 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.281185 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.281409 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.281583 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.281929 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.282041 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.282088 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.282211 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.282210 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.282431 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.282572 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.282584 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.282765 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.282828 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.282894 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.282925 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.283164 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.283324 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.283381 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.283720 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.284701 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.285260 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.285559 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.286205 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.286404 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.287037 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.287064 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.287061 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.287731 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.288433 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.288516 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.293147 4895 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="a9998dffc20c5b0579e6214d10541b9702e9c4a6563c930a41e49b6853b832ba" exitCode=255 Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.293201 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"a9998dffc20c5b0579e6214d10541b9702e9c4a6563c930a41e49b6853b832ba"} Dec 02 07:23:33 crc kubenswrapper[4895]: E1202 07:23:33.294896 4895 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 02 07:23:33 crc kubenswrapper[4895]: E1202 07:23:33.294936 4895 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 02 07:23:33 crc kubenswrapper[4895]: E1202 07:23:33.294960 4895 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 07:23:33 crc kubenswrapper[4895]: E1202 07:23:33.295050 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-02 07:23:33.795021875 +0000 UTC m=+24.965881678 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.295549 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.301726 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.301768 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.305954 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.305969 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.306154 4895 scope.go:117] "RemoveContainer" containerID="a9998dffc20c5b0579e6214d10541b9702e9c4a6563c930a41e49b6853b832ba" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.315158 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.321175 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.321190 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.326808 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.335692 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.345511 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.356424 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.356722 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.356575 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.357014 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.357201 4895 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.357257 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.357288 4895 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.357323 4895 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.357335 4895 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.357348 4895 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.357359 4895 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.357465 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.357488 4895 reconciler_common.go:293] "Volume detached for volume \"client-ca\" 
(UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.357500 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.357560 4895 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.357572 4895 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.357629 4895 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.357641 4895 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.357650 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.357691 4895 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" 
Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.357702 4895 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.357731 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.357758 4895 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.357768 4895 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.357777 4895 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.357820 4895 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.357831 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.357842 4895 reconciler_common.go:293] "Volume detached 
for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.357856 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.357869 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.357881 4895 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.357890 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.357902 4895 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.357913 4895 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.357922 4895 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.357931 4895 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.357940 4895 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.357948 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.357958 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.357967 4895 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.357976 4895 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.357985 4895 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.357995 4895 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.358004 4895 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.358012 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.358021 4895 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.358031 4895 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.358042 4895 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.358054 4895 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc 
kubenswrapper[4895]: I1202 07:23:33.358064 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.358078 4895 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.358089 4895 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.358099 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.358107 4895 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.358116 4895 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.358145 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.358190 4895 reconciler_common.go:293] "Volume detached for 
volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.358201 4895 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.358211 4895 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.358221 4895 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.358231 4895 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.358241 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.358252 4895 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.358262 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" 
(UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.358306 4895 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.358315 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.358324 4895 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.358332 4895 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.358341 4895 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.358350 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.358360 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath 
\"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.358368 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.358417 4895 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.358429 4895 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.358439 4895 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.358449 4895 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.358495 4895 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.358506 4895 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.358515 4895 reconciler_common.go:293] "Volume 
detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.358560 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.358612 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.358621 4895 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.358682 4895 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.358696 4895 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.358825 4895 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.358843 4895 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.358856 4895 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.358959 4895 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.359078 4895 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.359088 4895 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.359097 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.359105 4895 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.359115 4895 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.359123 4895 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.359131 4895 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.359166 4895 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.359175 4895 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.359184 4895 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.359194 4895 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.359201 4895 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" 
DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.359212 4895 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.359221 4895 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.359231 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.359240 4895 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.359249 4895 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.359257 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.359265 4895 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.359275 4895 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.359284 4895 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.359293 4895 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.359302 4895 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.359311 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.359319 4895 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.359327 4895 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.359337 4895 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") 
on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.359348 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.359358 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.359367 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.359376 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.359385 4895 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.359394 4895 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.359402 4895 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.359411 4895 
reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.359419 4895 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.359427 4895 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.359436 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.359444 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.359453 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.359461 4895 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.359496 4895 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.359504 4895 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.359512 4895 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.359521 4895 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.359530 4895 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.359556 4895 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.359565 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.359665 4895 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 
07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.359682 4895 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.359729 4895 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.359760 4895 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.359773 4895 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.359782 4895 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.359832 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.359859 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.359872 4895 
reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.359885 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.359896 4895 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.359906 4895 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.359919 4895 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.359930 4895 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.359942 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.359954 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: 
\"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.359964 4895 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.359975 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.359988 4895 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.360001 4895 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.360011 4895 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.360054 4895 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.360066 4895 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.360075 4895 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.360087 4895 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.360100 4895 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.360112 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.360124 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.360136 4895 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.360147 4895 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on 
node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.360160 4895 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.360726 4895 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.360790 4895 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.360808 4895 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.360830 4895 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.360844 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.360899 4895 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node 
\"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.360912 4895 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.360926 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.360939 4895 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.360951 4895 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.360964 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.360976 4895 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.361021 4895 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.361033 4895 reconciler_common.go:293] 
"Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.361044 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.361056 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.361067 4895 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.361097 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.361108 4895 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.361121 4895 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.361211 4895 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.361224 4895 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.361236 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.361249 4895 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.361259 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.361270 4895 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.361282 4895 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.381011 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.387351 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.394893 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 02 07:23:33 crc kubenswrapper[4895]: W1202 07:23:33.426859 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-2d4c4d3fc0a244c1f38532788cb5e0c2fdf2af31b9a44ca9e6a08b21f11632f2 WatchSource:0}: Error finding container 2d4c4d3fc0a244c1f38532788cb5e0c2fdf2af31b9a44ca9e6a08b21f11632f2: Status 404 returned error can't find the container with id 2d4c4d3fc0a244c1f38532788cb5e0c2fdf2af31b9a44ca9e6a08b21f11632f2 Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.500508 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.514318 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.527258 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.536678 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b7fa1ad-7e38-4645-ab41-c7d395f5c10e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"message\\\":\\\"containers 
with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40e020a2b4caf39d1661cee9c6b0055eca86e0ae9b81d2da7485ca9ec92dd0aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33e300332628d3baf9041b52dc7b8cd9dab1e9c5da76572e86b787433fd5fb92\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-di
r\\\"}]},{\\\"containerID\\\":\\\"cri-o://4455fec2a817e7266e152d3f4afbede483504d6066f1a0f7375bce282ae10f8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9998dffc20c5b0579e6214d10541b9702e9c4a6563c930a41e49b6853b832ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9998dffc20c5b0579e6214d10541b9702e9c4a6563c930a41e49b6853b832ba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T07:23:32Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 07:23:21.537254 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 07:23:21.538901 1 
dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1012964219/tls.crt::/tmp/serving-cert-1012964219/tls.key\\\\\\\"\\\\nI1202 07:23:32.848375 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 07:23:32.853597 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 07:23:32.853643 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 07:23:32.853680 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 07:23:32.853691 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 07:23:32.869490 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 07:23:32.869536 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:23:32.869542 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:23:32.869555 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 07:23:32.869559 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 07:23:32.869564 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 07:23:32.869568 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 07:23:32.869624 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 07:23:32.874299 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d318f71ad32dd4a410093b85741038e468d540edd30f112234a9d595bc6fd85f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0faa0995131a1f9cbf2aa0cf7651d510fad1bdaf7e32de51f8a14b70f161e783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0faa0995131a1f9cbf2aa0cf7651d510fad1bdaf7e32de51f8a14b70f161e783\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.550425 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.561413 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.574953 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.591110 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.765310 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.765370 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:23:33 crc kubenswrapper[4895]: E1202 07:23:33.765492 4895 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 02 07:23:33 crc 
kubenswrapper[4895]: E1202 07:23:33.765551 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-02 07:23:34.765533315 +0000 UTC m=+25.936392928 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 02 07:23:33 crc kubenswrapper[4895]: E1202 07:23:33.765565 4895 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 02 07:23:33 crc kubenswrapper[4895]: E1202 07:23:33.765692 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-02 07:23:34.765664809 +0000 UTC m=+25.936524432 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.865641 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.865716 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:23:33 crc kubenswrapper[4895]: I1202 07:23:33.865778 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:23:33 crc kubenswrapper[4895]: E1202 07:23:33.865876 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 07:23:34.865834871 +0000 UTC m=+26.036694504 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:23:33 crc kubenswrapper[4895]: E1202 07:23:33.865899 4895 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 02 07:23:33 crc kubenswrapper[4895]: E1202 07:23:33.865920 4895 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 02 07:23:33 crc kubenswrapper[4895]: E1202 07:23:33.865932 4895 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 07:23:33 crc kubenswrapper[4895]: E1202 07:23:33.865988 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-02 07:23:34.865971805 +0000 UTC m=+26.036831418 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 07:23:33 crc kubenswrapper[4895]: E1202 07:23:33.866042 4895 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 02 07:23:33 crc kubenswrapper[4895]: E1202 07:23:33.866099 4895 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 02 07:23:33 crc kubenswrapper[4895]: E1202 07:23:33.866123 4895 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 07:23:33 crc kubenswrapper[4895]: E1202 07:23:33.866221 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-02 07:23:34.866190912 +0000 UTC m=+26.037050555 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 07:23:34 crc kubenswrapper[4895]: I1202 07:23:34.015645 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 07:23:34 crc kubenswrapper[4895]: I1202 07:23:34.023251 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 07:23:34 crc kubenswrapper[4895]: I1202 07:23:34.025074 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Dec 02 07:23:34 crc kubenswrapper[4895]: I1202 07:23:34.038139 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:34Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:34 crc kubenswrapper[4895]: I1202 07:23:34.052151 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:34Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:34 crc kubenswrapper[4895]: I1202 07:23:34.068313 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b7fa1ad-7e38-4645-ab41-c7d395f5c10e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver 
kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40e020a2b4caf39d1661cee9c6b0055eca86e0ae9b81d2da7485ca9ec92dd0aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33e300332628d3baf9041b52dc7b8cd9dab1e9c5da76572e86b787433fd5fb92\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4455fec2a817e7266e152d3f4afbede483504d6066f1a0f7375bce282ae10f8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9998dffc20c5b0579e6214d10541b9702e9c4a6563c930a41e49b6853b832ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9998dffc20c5b0579e6214d10541b9702e9c4a6563c930a41e49b6853b832ba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T07:23:32Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 07:23:21.537254 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 07:23:21.538901 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1012964219/tls.crt::/tmp/serving-cert-1012964219/tls.key\\\\\\\"\\\\nI1202 07:23:32.848375 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 07:23:32.853597 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 07:23:32.853643 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 07:23:32.853680 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 07:23:32.853691 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 07:23:32.869490 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 07:23:32.869536 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:23:32.869542 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:23:32.869555 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 07:23:32.869559 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 07:23:32.869564 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 07:23:32.869568 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 07:23:32.869624 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information 
is complete\\\\nF1202 07:23:32.874299 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d318f71ad32dd4a410093b85741038e468d540edd30f112234a9d595bc6fd85f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0faa0995131a1f9cbf2aa0cf7651d510fad1bdaf7e32de51f8a14b70f161e783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0faa0995131a1f9cbf2aa0cf7651d510fad1bdaf7e32de51f8a14b70f161e783\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube
-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:34Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:34 crc kubenswrapper[4895]: I1202 07:23:34.087151 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could 
not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:34Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:34 crc kubenswrapper[4895]: I1202 07:23:34.100560 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:34Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:34 crc kubenswrapper[4895]: I1202 07:23:34.118025 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:34Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:34 crc kubenswrapper[4895]: I1202 07:23:34.131594 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:34Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:34 crc kubenswrapper[4895]: I1202 07:23:34.140571 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:23:34 crc kubenswrapper[4895]: E1202 07:23:34.140770 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 07:23:34 crc kubenswrapper[4895]: I1202 07:23:34.148211 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:34Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:34 crc kubenswrapper[4895]: I1202 07:23:34.161821 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:34Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:34 crc kubenswrapper[4895]: I1202 07:23:34.183607 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b7fa1ad-7e38-4645-ab41-c7d395f5c10e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40e020a2b4caf39d1661cee9c6b0055eca86e0ae9b81d2da7485ca9ec92dd0aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\
\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33e300332628d3baf9041b52dc7b8cd9dab1e9c5da76572e86b787433fd5fb92\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4455fec2a817e7266e152d3f4afbede483504d6066f1a0f7375bce282ae10f8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9998dffc20c5b0579e6214d10541b9702e9c4a6563c930a41e49b6853b832ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserve
r-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9998dffc20c5b0579e6214d10541b9702e9c4a6563c930a41e49b6853b832ba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T07:23:32Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 07:23:21.537254 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 07:23:21.538901 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1012964219/tls.crt::/tmp/serving-cert-1012964219/tls.key\\\\\\\"\\\\nI1202 07:23:32.848375 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 07:23:32.853597 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 07:23:32.853643 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 07:23:32.853680 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 07:23:32.853691 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 07:23:32.869490 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 07:23:32.869536 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:23:32.869542 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:23:32.869555 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' 
detected.\\\\nW1202 07:23:32.869559 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 07:23:32.869564 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 07:23:32.869568 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 07:23:32.869624 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 07:23:32.874299 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d318f71ad32dd4a410093b85741038e468d540edd30f112234a9d595bc6fd85f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0faa0995131a1f9cbf2aa0cf7651d510fad1bdaf7e32de51f8a14b70f161e783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a
8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0faa0995131a1f9cbf2aa0cf7651d510fad1bdaf7e32de51f8a14b70f161e783\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:34Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:34 crc kubenswrapper[4895]: I1202 07:23:34.203297 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:34Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:34 crc kubenswrapper[4895]: I1202 07:23:34.226559 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:34Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:34 crc kubenswrapper[4895]: I1202 07:23:34.238865 4895 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 07:23:34 crc kubenswrapper[4895]: I1202 07:23:34.243385 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:34Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:34 crc kubenswrapper[4895]: I1202 07:23:34.261908 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4604b0d5-cf8f-404d-af67-65da4b1dc003\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b3c366f0f27ff793a37b2944887ab1c1bc155c14d5c8d6c267442b0a8666ef8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6c3baa5b3772a8fc64774e28c3f389f114723e7efa62b934140a4fed8b6a40d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49d7c12ea2cd199dcfa46a6f1a1a856f1285a709def9d82b6805ba256a79fd5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2543d364f902a6c34a82dac25f389cd36afc2f717d772d0a487a640d44adf30f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:34Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:34 crc kubenswrapper[4895]: I1202 07:23:34.292891 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:34Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:34 crc kubenswrapper[4895]: I1202 07:23:34.311645 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 02 07:23:34 crc kubenswrapper[4895]: I1202 07:23:34.314069 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"b674e20b2199fd0083aed9cf001d27a72d31ca6c62cbe376c9510cc94f37d312"} Dec 02 07:23:34 crc kubenswrapper[4895]: I1202 07:23:34.314466 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 07:23:34 crc kubenswrapper[4895]: I1202 07:23:34.315960 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"7e55354e8a19db748f231229f2cf524be7cbb6ff7c6dd87db5d7217bfddafc3c"} Dec 02 07:23:34 crc kubenswrapper[4895]: I1202 07:23:34.317950 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"1bb4bea68942dbface0009fbac76f7b1253ab282e8decb9081225de7d6537ec0"} Dec 02 07:23:34 crc kubenswrapper[4895]: I1202 07:23:34.318019 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"3115f2c50d05ee406f583937e39b1b55f364ce7cbeb0788b75941a22e23f57f9"} Dec 02 07:23:34 crc kubenswrapper[4895]: I1202 07:23:34.318042 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"cee4c167293ab3438e6d1d724930937b394abcc3aa02b24719967a030cbad852"} Dec 02 07:23:34 crc kubenswrapper[4895]: I1202 07:23:34.319387 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"d2db13abc9ba56d955814e75ceea54b92610a00daeefa29aa1aad584e3453728"} Dec 02 07:23:34 crc kubenswrapper[4895]: I1202 07:23:34.319426 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"2d4c4d3fc0a244c1f38532788cb5e0c2fdf2af31b9a44ca9e6a08b21f11632f2"} Dec 02 07:23:34 crc kubenswrapper[4895]: I1202 07:23:34.319890 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 07:23:34 crc kubenswrapper[4895]: I1202 07:23:34.349547 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:34Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:34 crc kubenswrapper[4895]: I1202 07:23:34.364403 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b7fa1ad-7e38-4645-ab41-c7d395f5c10e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40e020a2b4caf39d1661cee9c6b0055eca86e0ae9b81d2da7485ca9ec92dd0aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\
\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33e300332628d3baf9041b52dc7b8cd9dab1e9c5da76572e86b787433fd5fb92\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4455fec2a817e7266e152d3f4afbede483504d6066f1a0f7375bce282ae10f8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b674e20b2199fd0083aed9cf001d27a72d31ca6c62cbe376c9510cc94f37d312\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserve
r-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9998dffc20c5b0579e6214d10541b9702e9c4a6563c930a41e49b6853b832ba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T07:23:32Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 07:23:21.537254 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 07:23:21.538901 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1012964219/tls.crt::/tmp/serving-cert-1012964219/tls.key\\\\\\\"\\\\nI1202 07:23:32.848375 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 07:23:32.853597 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 07:23:32.853643 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 07:23:32.853680 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 07:23:32.853691 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 07:23:32.869490 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 07:23:32.869536 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:23:32.869542 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:23:32.869555 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 07:23:32.869559 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 
07:23:32.869564 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 07:23:32.869568 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 07:23:32.869624 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 07:23:32.874299 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d318f71ad32dd4a410093b85741038e468d540edd30f112234a9d595bc6fd85f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0faa0995131a1f9cbf2aa0cf7651d510fad1bdaf7e32de51f8a14b70f161e783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/o
cp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0faa0995131a1f9cbf2aa0cf7651d510fad1bdaf7e32de51f8a14b70f161e783\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:34Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:34 crc kubenswrapper[4895]: I1202 07:23:34.379376 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:34Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:34 crc kubenswrapper[4895]: I1202 07:23:34.396202 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:34Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:34 crc kubenswrapper[4895]: I1202 07:23:34.414566 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:34Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:34 crc kubenswrapper[4895]: I1202 07:23:34.428602 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4604b0d5-cf8f-404d-af67-65da4b1dc003\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b3c366f0f27ff793a37b2944887ab1c1bc155c14d5c8d6c267442b0a8666ef8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6c3baa5b3772a8fc64774e28c3f389f114723e7efa62b934140a4fed8b6a40d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49d7c12ea2cd199dcfa46a6f1a1a856f1285a709def9d82b6805ba256a79fd5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2543d364f902a6c34a82dac25f389cd36afc2f717d772d0a487a640d44adf30f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:34Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:34 crc kubenswrapper[4895]: I1202 07:23:34.450283 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:34Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:34 crc kubenswrapper[4895]: I1202 07:23:34.466119 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:34Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:34 crc kubenswrapper[4895]: I1202 07:23:34.483611 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b7fa1ad-7e38-4645-ab41-c7d395f5c10e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40e020a2b4caf39d1661cee9c6b0055eca86e0ae9b81d2da7485ca9ec92dd0aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33e300332628d3baf9041b52dc7b8cd9dab1e9c5da76572e86b787433fd5fb92\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4455fec2a817e7266e152d3f4afbede483504d6066f1a0f7375bce282ae10f8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b674e20b2199fd0083aed9cf001d27a72d31ca6c62cbe376c9510cc94f37d312\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9998dffc20c5b0579e6214d10541b9702e9c4a6563c930a41e49b6853b832ba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T07:23:32Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 07:23:21.537254 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 07:23:21.538901 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1012964219/tls.crt::/tmp/serving-cert-1012964219/tls.key\\\\\\\"\\\\nI1202 07:23:32.848375 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 07:23:32.853597 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 07:23:32.853643 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 07:23:32.853680 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 07:23:32.853691 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 07:23:32.869490 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 07:23:32.869536 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:23:32.869542 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:23:32.869555 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 07:23:32.869559 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 07:23:32.869564 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 07:23:32.869568 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 07:23:32.869624 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 07:23:32.874299 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d318f71ad32dd4a410093b85741038e468d540edd30f112234a9d595bc6fd85f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0faa0995131a1f9cbf2aa0cf7651d510fad1bdaf7e32de51f8a14b70f161e783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0faa0995131a1f9cbf2aa0cf7651d510fad1bdaf7e32de51f8a14b70f161e783\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:34Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:34 crc kubenswrapper[4895]: I1202 07:23:34.503069 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:34Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:34 crc kubenswrapper[4895]: I1202 07:23:34.520365 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:34Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:34 crc kubenswrapper[4895]: I1202 07:23:34.538022 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4604b0d5-cf8f-404d-af67-65da4b1dc003\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b3c366f0f27ff793a37b2944887ab1c1bc155c14d5c8d6c267442b0a8666ef8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6c3baa5b3772a8fc64774e28c3f389f114723e7efa62b934140a4fed8b6a40d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49d7c12ea2cd199dcfa46a6f1a1a856f1285a709def9d82b6805ba256a79fd5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2543d364f902a6c34a82dac25f389cd36afc2f717d772d0a487a640d44adf30f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:34Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:34 crc kubenswrapper[4895]: I1202 07:23:34.551726 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2db13abc9ba56d955814e75ceea54b92610a00daeefa29aa1aad584e3453728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:34Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:34 crc kubenswrapper[4895]: I1202 07:23:34.566286 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:34Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:34 crc kubenswrapper[4895]: I1202 07:23:34.580815 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bb4bea68942dbface0009fbac76f7b1253ab282e8decb9081225de7d6537ec0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3115f2c50d05ee406f583937e39b1b55f364ce7cbeb0788b75941a22e23f57f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:34Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:34 crc kubenswrapper[4895]: I1202 07:23:34.602884 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:34Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:34 crc kubenswrapper[4895]: I1202 07:23:34.775889 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:23:34 crc kubenswrapper[4895]: I1202 07:23:34.775944 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:23:34 crc kubenswrapper[4895]: E1202 07:23:34.776061 4895 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 02 07:23:34 crc kubenswrapper[4895]: E1202 07:23:34.776116 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-02 07:23:36.776099816 +0000 UTC m=+27.946959429 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 02 07:23:34 crc kubenswrapper[4895]: E1202 07:23:34.776276 4895 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 02 07:23:34 crc kubenswrapper[4895]: E1202 07:23:34.776418 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-02 07:23:36.776384985 +0000 UTC m=+27.947244768 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 02 07:23:34 crc kubenswrapper[4895]: I1202 07:23:34.876613 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 07:23:34 crc kubenswrapper[4895]: I1202 07:23:34.876775 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" 
(UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:23:34 crc kubenswrapper[4895]: E1202 07:23:34.876889 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 07:23:36.876843995 +0000 UTC m=+28.047703608 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:23:34 crc kubenswrapper[4895]: E1202 07:23:34.876999 4895 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 02 07:23:34 crc kubenswrapper[4895]: E1202 07:23:34.877037 4895 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 02 07:23:34 crc kubenswrapper[4895]: I1202 07:23:34.877041 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:23:34 crc kubenswrapper[4895]: E1202 07:23:34.877065 4895 projected.go:194] Error preparing data for projected volume 
kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 07:23:34 crc kubenswrapper[4895]: E1202 07:23:34.877170 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-02 07:23:36.877135394 +0000 UTC m=+28.047995167 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 07:23:34 crc kubenswrapper[4895]: E1202 07:23:34.877238 4895 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 02 07:23:34 crc kubenswrapper[4895]: E1202 07:23:34.877268 4895 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 02 07:23:34 crc kubenswrapper[4895]: E1202 07:23:34.877286 4895 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 07:23:34 crc kubenswrapper[4895]: E1202 07:23:34.877349 4895 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-02 07:23:36.8773385 +0000 UTC m=+28.048198113 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 07:23:35 crc kubenswrapper[4895]: I1202 07:23:35.141146 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:23:35 crc kubenswrapper[4895]: I1202 07:23:35.141185 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:23:35 crc kubenswrapper[4895]: E1202 07:23:35.141335 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 07:23:35 crc kubenswrapper[4895]: E1202 07:23:35.141594 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 07:23:35 crc kubenswrapper[4895]: I1202 07:23:35.148529 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Dec 02 07:23:35 crc kubenswrapper[4895]: I1202 07:23:35.149068 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Dec 02 07:23:35 crc kubenswrapper[4895]: I1202 07:23:35.149965 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Dec 02 07:23:35 crc kubenswrapper[4895]: I1202 07:23:35.150605 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Dec 02 07:23:35 crc kubenswrapper[4895]: I1202 07:23:35.151217 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Dec 02 07:23:35 crc kubenswrapper[4895]: I1202 07:23:35.151735 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Dec 02 07:23:35 crc kubenswrapper[4895]: I1202 07:23:35.152361 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Dec 02 07:23:35 crc kubenswrapper[4895]: I1202 07:23:35.152927 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Dec 02 07:23:35 crc kubenswrapper[4895]: I1202 07:23:35.153511 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Dec 02 07:23:35 crc kubenswrapper[4895]: I1202 07:23:35.154027 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Dec 02 07:23:35 crc kubenswrapper[4895]: I1202 07:23:35.157229 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Dec 02 07:23:35 crc kubenswrapper[4895]: I1202 07:23:35.160148 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Dec 02 07:23:35 crc kubenswrapper[4895]: I1202 07:23:35.161538 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Dec 02 07:23:35 crc kubenswrapper[4895]: I1202 07:23:35.163139 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Dec 02 07:23:35 crc kubenswrapper[4895]: I1202 07:23:35.164455 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Dec 02 07:23:35 crc kubenswrapper[4895]: I1202 07:23:35.165412 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Dec 02 07:23:35 crc kubenswrapper[4895]: I1202 07:23:35.166107 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Dec 02 07:23:35 crc kubenswrapper[4895]: I1202 07:23:35.167223 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Dec 02 07:23:35 crc kubenswrapper[4895]: I1202 07:23:35.167842 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Dec 02 07:23:35 crc kubenswrapper[4895]: I1202 07:23:35.168529 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Dec 02 07:23:35 crc kubenswrapper[4895]: I1202 07:23:35.169446 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Dec 02 07:23:35 crc kubenswrapper[4895]: I1202 07:23:35.170028 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Dec 02 07:23:35 crc kubenswrapper[4895]: I1202 07:23:35.170436 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Dec 02 07:23:35 crc kubenswrapper[4895]: I1202 07:23:35.171463 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Dec 02 07:23:35 crc kubenswrapper[4895]: I1202 07:23:35.171929 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Dec 02 07:23:35 crc kubenswrapper[4895]: I1202 07:23:35.172958 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Dec 02 07:23:35 crc kubenswrapper[4895]: I1202 07:23:35.173571 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Dec 02 07:23:35 crc kubenswrapper[4895]: I1202 07:23:35.174477 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Dec 02 07:23:35 crc kubenswrapper[4895]: I1202 07:23:35.175071 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Dec 02 07:23:35 crc kubenswrapper[4895]: I1202 07:23:35.176013 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Dec 02 07:23:35 crc kubenswrapper[4895]: I1202 07:23:35.176470 4895 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Dec 02 07:23:35 crc kubenswrapper[4895]: I1202 07:23:35.176569 4895 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Dec 02 07:23:35 crc kubenswrapper[4895]: I1202 07:23:35.179344 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Dec 02 07:23:35 crc kubenswrapper[4895]: I1202 07:23:35.181363 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Dec 02 07:23:35 crc kubenswrapper[4895]: I1202 07:23:35.182637 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Dec 02 07:23:35 crc kubenswrapper[4895]: I1202 07:23:35.185862 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Dec 02 07:23:35 crc kubenswrapper[4895]: I1202 07:23:35.187282 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Dec 02 07:23:35 crc kubenswrapper[4895]: I1202 07:23:35.188339 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Dec 02 07:23:35 crc kubenswrapper[4895]: I1202 07:23:35.189627 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Dec 02 07:23:35 crc kubenswrapper[4895]: I1202 07:23:35.191038 4895 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Dec 02 07:23:35 crc kubenswrapper[4895]: I1202 07:23:35.191998 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Dec 02 07:23:35 crc kubenswrapper[4895]: I1202 07:23:35.194356 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Dec 02 07:23:35 crc kubenswrapper[4895]: I1202 07:23:35.195905 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Dec 02 07:23:35 crc kubenswrapper[4895]: I1202 07:23:35.198028 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Dec 02 07:23:35 crc kubenswrapper[4895]: I1202 07:23:35.199894 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Dec 02 07:23:35 crc kubenswrapper[4895]: I1202 07:23:35.201442 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Dec 02 07:23:35 crc kubenswrapper[4895]: I1202 07:23:35.202069 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Dec 02 07:23:35 crc kubenswrapper[4895]: I1202 07:23:35.202817 4895 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Dec 02 07:23:35 crc kubenswrapper[4895]: I1202 07:23:35.203359 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Dec 02 07:23:35 crc kubenswrapper[4895]: I1202 07:23:35.203832 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Dec 02 07:23:35 crc kubenswrapper[4895]: I1202 07:23:35.204294 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Dec 02 07:23:35 crc kubenswrapper[4895]: I1202 07:23:35.206199 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Dec 02 07:23:35 crc kubenswrapper[4895]: I1202 07:23:35.207515 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Dec 02 07:23:35 crc kubenswrapper[4895]: I1202 07:23:35.208273 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Dec 02 07:23:36 crc kubenswrapper[4895]: I1202 07:23:36.048099 4895 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 07:23:36 crc kubenswrapper[4895]: I1202 07:23:36.050673 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 02 07:23:36 crc kubenswrapper[4895]: I1202 07:23:36.050931 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:36 crc kubenswrapper[4895]: I1202 07:23:36.051120 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:36 crc kubenswrapper[4895]: I1202 07:23:36.051458 4895 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 02 07:23:36 crc kubenswrapper[4895]: I1202 07:23:36.063431 4895 kubelet_node_status.go:115] "Node was previously registered" node="crc" Dec 02 07:23:36 crc kubenswrapper[4895]: I1202 07:23:36.063974 4895 kubelet_node_status.go:79] "Successfully registered node" node="crc" Dec 02 07:23:36 crc kubenswrapper[4895]: I1202 07:23:36.066052 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:36 crc kubenswrapper[4895]: I1202 07:23:36.066105 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:36 crc kubenswrapper[4895]: I1202 07:23:36.066122 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:36 crc kubenswrapper[4895]: I1202 07:23:36.066148 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:36 crc kubenswrapper[4895]: I1202 07:23:36.066169 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:36Z","lastTransitionTime":"2025-12-02T07:23:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:36 crc kubenswrapper[4895]: E1202 07:23:36.103806 4895 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:23:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:23:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:23:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:23:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"42683c5b-b2bf-439b-8ee4-25c8d72cfed1\\\",\\\"systemUUID\\\":\\\"92c636b1-dcb0-457f-b098-73baeaac297e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:36Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:36 crc kubenswrapper[4895]: I1202 07:23:36.111031 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:36 crc kubenswrapper[4895]: I1202 07:23:36.111331 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:36 crc kubenswrapper[4895]: I1202 07:23:36.111512 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:36 crc kubenswrapper[4895]: I1202 07:23:36.111672 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:36 crc kubenswrapper[4895]: I1202 07:23:36.111838 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:36Z","lastTransitionTime":"2025-12-02T07:23:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:36 crc kubenswrapper[4895]: E1202 07:23:36.135393 4895 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:23:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:23:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:23:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:23:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"42683c5b-b2bf-439b-8ee4-25c8d72cfed1\\\",\\\"systemUUID\\\":\\\"92c636b1-dcb0-457f-b098-73baeaac297e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:36Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:36 crc kubenswrapper[4895]: I1202 07:23:36.140589 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:23:36 crc kubenswrapper[4895]: E1202 07:23:36.140926 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 07:23:36 crc kubenswrapper[4895]: I1202 07:23:36.141549 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:36 crc kubenswrapper[4895]: I1202 07:23:36.141733 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:36 crc kubenswrapper[4895]: I1202 07:23:36.142219 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:36 crc kubenswrapper[4895]: I1202 07:23:36.142300 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:36 crc kubenswrapper[4895]: I1202 07:23:36.142326 4895 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:36Z","lastTransitionTime":"2025-12-02T07:23:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:23:36 crc kubenswrapper[4895]: E1202 07:23:36.161820 4895 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:23:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:23:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:23:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:23:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"42683c5b-b2bf-439b-8ee4-25c8d72cfed1\\\",\\\"systemUUID\\\":\\\"92c636b1-dcb0-457f-b098-73baeaac297e\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:36Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:36 crc kubenswrapper[4895]: I1202 07:23:36.168313 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:36 crc kubenswrapper[4895]: I1202 07:23:36.168574 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:36 crc kubenswrapper[4895]: I1202 07:23:36.168769 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:36 crc kubenswrapper[4895]: I1202 07:23:36.168999 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:36 crc kubenswrapper[4895]: I1202 07:23:36.169176 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:36Z","lastTransitionTime":"2025-12-02T07:23:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:36 crc kubenswrapper[4895]: E1202 07:23:36.195562 4895 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:23:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:23:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:23:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:23:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"42683c5b-b2bf-439b-8ee4-25c8d72cfed1\\\",\\\"systemUUID\\\":\\\"92c636b1-dcb0-457f-b098-73baeaac297e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:36Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:36 crc kubenswrapper[4895]: I1202 07:23:36.204154 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:36 crc kubenswrapper[4895]: I1202 07:23:36.204207 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:36 crc kubenswrapper[4895]: I1202 07:23:36.204226 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:36 crc kubenswrapper[4895]: I1202 07:23:36.204257 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:36 crc kubenswrapper[4895]: I1202 07:23:36.204278 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:36Z","lastTransitionTime":"2025-12-02T07:23:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:36 crc kubenswrapper[4895]: E1202 07:23:36.229331 4895 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:23:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:23:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:23:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:23:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"42683c5b-b2bf-439b-8ee4-25c8d72cfed1\\\",\\\"systemUUID\\\":\\\"92c636b1-dcb0-457f-b098-73baeaac297e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:36Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:36 crc kubenswrapper[4895]: E1202 07:23:36.229471 4895 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 02 07:23:36 crc kubenswrapper[4895]: I1202 07:23:36.231770 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:36 crc kubenswrapper[4895]: I1202 07:23:36.231805 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:36 crc kubenswrapper[4895]: I1202 07:23:36.231817 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:36 crc kubenswrapper[4895]: I1202 07:23:36.231836 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:36 crc kubenswrapper[4895]: I1202 07:23:36.231850 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:36Z","lastTransitionTime":"2025-12-02T07:23:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:36 crc kubenswrapper[4895]: I1202 07:23:36.334890 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:36 crc kubenswrapper[4895]: I1202 07:23:36.334947 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:36 crc kubenswrapper[4895]: I1202 07:23:36.334960 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:36 crc kubenswrapper[4895]: I1202 07:23:36.334982 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:36 crc kubenswrapper[4895]: I1202 07:23:36.335002 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:36Z","lastTransitionTime":"2025-12-02T07:23:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:36 crc kubenswrapper[4895]: I1202 07:23:36.438722 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:36 crc kubenswrapper[4895]: I1202 07:23:36.438829 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:36 crc kubenswrapper[4895]: I1202 07:23:36.438849 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:36 crc kubenswrapper[4895]: I1202 07:23:36.438879 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:36 crc kubenswrapper[4895]: I1202 07:23:36.438899 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:36Z","lastTransitionTime":"2025-12-02T07:23:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:36 crc kubenswrapper[4895]: I1202 07:23:36.541832 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:36 crc kubenswrapper[4895]: I1202 07:23:36.541930 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:36 crc kubenswrapper[4895]: I1202 07:23:36.541949 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:36 crc kubenswrapper[4895]: I1202 07:23:36.541975 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:36 crc kubenswrapper[4895]: I1202 07:23:36.541994 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:36Z","lastTransitionTime":"2025-12-02T07:23:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:36 crc kubenswrapper[4895]: I1202 07:23:36.645452 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:36 crc kubenswrapper[4895]: I1202 07:23:36.645522 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:36 crc kubenswrapper[4895]: I1202 07:23:36.645538 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:36 crc kubenswrapper[4895]: I1202 07:23:36.645564 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:36 crc kubenswrapper[4895]: I1202 07:23:36.645582 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:36Z","lastTransitionTime":"2025-12-02T07:23:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:36 crc kubenswrapper[4895]: I1202 07:23:36.749538 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:36 crc kubenswrapper[4895]: I1202 07:23:36.749983 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:36 crc kubenswrapper[4895]: I1202 07:23:36.750149 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:36 crc kubenswrapper[4895]: I1202 07:23:36.750316 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:36 crc kubenswrapper[4895]: I1202 07:23:36.750463 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:36Z","lastTransitionTime":"2025-12-02T07:23:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:36 crc kubenswrapper[4895]: I1202 07:23:36.795640 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:23:36 crc kubenswrapper[4895]: E1202 07:23:36.795979 4895 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 02 07:23:36 crc kubenswrapper[4895]: E1202 07:23:36.796143 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-02 07:23:40.796110526 +0000 UTC m=+31.966970169 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 02 07:23:36 crc kubenswrapper[4895]: I1202 07:23:36.796339 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:23:36 crc kubenswrapper[4895]: E1202 07:23:36.796620 4895 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 02 07:23:36 crc kubenswrapper[4895]: E1202 07:23:36.796812 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-02 07:23:40.796772195 +0000 UTC m=+31.967631848 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 02 07:23:36 crc kubenswrapper[4895]: I1202 07:23:36.854385 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:36 crc kubenswrapper[4895]: I1202 07:23:36.854470 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:36 crc kubenswrapper[4895]: I1202 07:23:36.854494 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:36 crc kubenswrapper[4895]: I1202 07:23:36.854528 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:36 crc kubenswrapper[4895]: I1202 07:23:36.854552 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:36Z","lastTransitionTime":"2025-12-02T07:23:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:36 crc kubenswrapper[4895]: I1202 07:23:36.896995 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 07:23:36 crc kubenswrapper[4895]: E1202 07:23:36.897234 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 07:23:40.897193715 +0000 UTC m=+32.068053358 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:23:36 crc kubenswrapper[4895]: I1202 07:23:36.897458 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:23:36 crc kubenswrapper[4895]: I1202 07:23:36.897567 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod 
\"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:23:36 crc kubenswrapper[4895]: E1202 07:23:36.897724 4895 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 02 07:23:36 crc kubenswrapper[4895]: E1202 07:23:36.897814 4895 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 02 07:23:36 crc kubenswrapper[4895]: E1202 07:23:36.897817 4895 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 02 07:23:36 crc kubenswrapper[4895]: E1202 07:23:36.897844 4895 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 07:23:36 crc kubenswrapper[4895]: E1202 07:23:36.897860 4895 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 02 07:23:36 crc kubenswrapper[4895]: E1202 07:23:36.897883 4895 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 07:23:36 crc kubenswrapper[4895]: E1202 07:23:36.897927 4895 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-02 07:23:40.897910256 +0000 UTC m=+32.068769909 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 07:23:36 crc kubenswrapper[4895]: E1202 07:23:36.897956 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-02 07:23:40.897944857 +0000 UTC m=+32.068804500 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 07:23:36 crc kubenswrapper[4895]: I1202 07:23:36.958735 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:36 crc kubenswrapper[4895]: I1202 07:23:36.959237 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:36 crc kubenswrapper[4895]: I1202 07:23:36.959323 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:36 crc kubenswrapper[4895]: I1202 07:23:36.959393 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:36 crc kubenswrapper[4895]: I1202 07:23:36.959452 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:36Z","lastTransitionTime":"2025-12-02T07:23:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:37 crc kubenswrapper[4895]: I1202 07:23:37.062987 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:37 crc kubenswrapper[4895]: I1202 07:23:37.063064 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:37 crc kubenswrapper[4895]: I1202 07:23:37.063086 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:37 crc kubenswrapper[4895]: I1202 07:23:37.063171 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:37 crc kubenswrapper[4895]: I1202 07:23:37.063190 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:37Z","lastTransitionTime":"2025-12-02T07:23:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:23:37 crc kubenswrapper[4895]: I1202 07:23:37.140953 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:23:37 crc kubenswrapper[4895]: I1202 07:23:37.141115 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:23:37 crc kubenswrapper[4895]: E1202 07:23:37.141453 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 07:23:37 crc kubenswrapper[4895]: E1202 07:23:37.141677 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 07:23:37 crc kubenswrapper[4895]: I1202 07:23:37.166398 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:37 crc kubenswrapper[4895]: I1202 07:23:37.166456 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:37 crc kubenswrapper[4895]: I1202 07:23:37.166468 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:37 crc kubenswrapper[4895]: I1202 07:23:37.166489 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:37 crc kubenswrapper[4895]: I1202 07:23:37.166502 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:37Z","lastTransitionTime":"2025-12-02T07:23:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:37 crc kubenswrapper[4895]: I1202 07:23:37.269373 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:37 crc kubenswrapper[4895]: I1202 07:23:37.269445 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:37 crc kubenswrapper[4895]: I1202 07:23:37.269467 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:37 crc kubenswrapper[4895]: I1202 07:23:37.269499 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:37 crc kubenswrapper[4895]: I1202 07:23:37.269520 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:37Z","lastTransitionTime":"2025-12-02T07:23:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:37 crc kubenswrapper[4895]: I1202 07:23:37.329973 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"444ad02c4afa919779de59fd8c81bdce5c3426dd6451b90505921a8d5fa0c96b"} Dec 02 07:23:37 crc kubenswrapper[4895]: I1202 07:23:37.351668 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b7fa1ad-7e38-4645-ab41-c7d395f5c10e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40e020a2b4caf39d1661cee9c6b0055eca86e0ae9b81d2da7485ca9ec92dd0aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33e300332628d3baf9041b52dc7b8cd9dab1e9c5da76572e86b787433fd5fb92\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://4455fec2a817e7266e152d3f4afbede483504d6066f1a0f7375bce282ae10f8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b674e20b2199fd0083aed9cf001d27a72d31ca6c62cbe376c9510cc94f37d312\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9998dffc20c5b0579e6214d10541b9702e9c4a6563c930a41e49b6853b832ba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T07:23:32Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 07:23:21.537254 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 07:23:21.538901 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1012964219/tls.crt::/tmp/serving-cert-1012964219/tls.key\\\\\\\"\\\\nI1202 07:23:32.848375 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 07:23:32.853597 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 07:23:32.853643 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 07:23:32.853680 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 07:23:32.853691 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 07:23:32.869490 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 07:23:32.869536 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:23:32.869542 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:23:32.869555 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 07:23:32.869559 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 07:23:32.869564 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 07:23:32.869568 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 07:23:32.869624 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 07:23:32.874299 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d318f71ad32dd4a410093b85741038e468d540edd30f112234a9d595bc6fd85f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0faa0995131a1f9cbf2aa0cf7651d510fad1bdaf7e32de51f8a14b70f161e783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0faa0995131a1f9cbf2aa0cf7651d510fad1bdaf7e32de51f8a14b70f161e783\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:37Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:37 crc kubenswrapper[4895]: I1202 07:23:37.370377 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:37Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:37 crc kubenswrapper[4895]: I1202 07:23:37.372433 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:37 crc kubenswrapper[4895]: I1202 07:23:37.372477 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:37 crc kubenswrapper[4895]: I1202 07:23:37.372488 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:37 crc 
kubenswrapper[4895]: I1202 07:23:37.372508 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:37 crc kubenswrapper[4895]: I1202 07:23:37.372521 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:37Z","lastTransitionTime":"2025-12-02T07:23:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:23:37 crc kubenswrapper[4895]: I1202 07:23:37.386408 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:37Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:37 crc kubenswrapper[4895]: I1202 07:23:37.404070 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4604b0d5-cf8f-404d-af67-65da4b1dc003\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b3c366f0f27ff793a37b2944887ab1c1bc155c14d5c8d6c267442b0a8666ef8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6c3baa5b3772a8fc64774e28c3f389f114723e7efa62b934140a4fed8b6a40d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49d7c12ea2cd199dcfa46a6f1a1a856f1285a709def9d82b6805ba256a79fd5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2543d364f902a6c34a82dac25f389cd36afc2f717d772d0a487a640d44adf30f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:37Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:37 crc kubenswrapper[4895]: I1202 07:23:37.430669 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2db13abc9ba56d955814e75ceea54b92610a00daeefa29aa1aad584e3453728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:37Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:37 crc kubenswrapper[4895]: I1202 07:23:37.451176 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://444ad02c4afa919779de59fd8c81bdce5c3426dd6451b90505921a8d5fa0c96b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:37Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:37 crc kubenswrapper[4895]: I1202 07:23:37.468653 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bb4bea68942dbface0009fbac76f7b1253ab282e8decb9081225de7d6537ec0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\
\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3115f2c50d05ee406f583937e39b1b55f364ce7cbeb0788b75941a22e23f57f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:37Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:37 crc kubenswrapper[4895]: I1202 07:23:37.475404 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:37 crc kubenswrapper[4895]: I1202 07:23:37.475472 4895 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:37 crc kubenswrapper[4895]: I1202 07:23:37.475490 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:37 crc kubenswrapper[4895]: I1202 07:23:37.475533 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:37 crc kubenswrapper[4895]: I1202 07:23:37.475545 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:37Z","lastTransitionTime":"2025-12-02T07:23:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:23:37 crc kubenswrapper[4895]: I1202 07:23:37.502778 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:37Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:37 crc kubenswrapper[4895]: I1202 07:23:37.578003 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:37 crc kubenswrapper[4895]: I1202 07:23:37.578048 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:37 crc kubenswrapper[4895]: I1202 07:23:37.578059 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:37 crc kubenswrapper[4895]: I1202 
07:23:37.578076 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:37 crc kubenswrapper[4895]: I1202 07:23:37.578086 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:37Z","lastTransitionTime":"2025-12-02T07:23:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:23:37 crc kubenswrapper[4895]: I1202 07:23:37.680695 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:37 crc kubenswrapper[4895]: I1202 07:23:37.680763 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:37 crc kubenswrapper[4895]: I1202 07:23:37.680778 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:37 crc kubenswrapper[4895]: I1202 07:23:37.680800 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:37 crc kubenswrapper[4895]: I1202 07:23:37.680817 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:37Z","lastTransitionTime":"2025-12-02T07:23:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:37 crc kubenswrapper[4895]: I1202 07:23:37.782818 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:37 crc kubenswrapper[4895]: I1202 07:23:37.782851 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:37 crc kubenswrapper[4895]: I1202 07:23:37.782861 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:37 crc kubenswrapper[4895]: I1202 07:23:37.782876 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:37 crc kubenswrapper[4895]: I1202 07:23:37.782885 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:37Z","lastTransitionTime":"2025-12-02T07:23:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:37 crc kubenswrapper[4895]: I1202 07:23:37.886082 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:37 crc kubenswrapper[4895]: I1202 07:23:37.886132 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:37 crc kubenswrapper[4895]: I1202 07:23:37.886146 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:37 crc kubenswrapper[4895]: I1202 07:23:37.886166 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:37 crc kubenswrapper[4895]: I1202 07:23:37.886183 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:37Z","lastTransitionTime":"2025-12-02T07:23:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:37 crc kubenswrapper[4895]: I1202 07:23:37.989193 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:37 crc kubenswrapper[4895]: I1202 07:23:37.989256 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:37 crc kubenswrapper[4895]: I1202 07:23:37.989270 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:37 crc kubenswrapper[4895]: I1202 07:23:37.989293 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:37 crc kubenswrapper[4895]: I1202 07:23:37.989308 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:37Z","lastTransitionTime":"2025-12-02T07:23:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.092712 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.092781 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.092792 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.092816 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.092829 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:38Z","lastTransitionTime":"2025-12-02T07:23:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.140216 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:23:38 crc kubenswrapper[4895]: E1202 07:23:38.140376 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.195249 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.195306 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.195328 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.195351 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.195367 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:38Z","lastTransitionTime":"2025-12-02T07:23:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.278480 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-74fkh"] Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.278824 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-74fkh" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.280878 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.280887 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.281331 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.294322 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4604b0d5-cf8f-404d-af67-65da4b1dc003\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b3c366f0f27ff793a37b2944887ab1c1bc155c14d5c8d6c267442b0a8666ef8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-
dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6c3baa5b3772a8fc64774e28c3f389f114723e7efa62b934140a4fed8b6a40d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49d7c12ea2cd199dcfa46a6f1a1a856f1285a709def9d82b6805ba256a79fd5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mou
ntPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2543d364f902a6c34a82dac25f389cd36afc2f717d772d0a487a640d44adf30f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:38Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.298139 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.298189 4895 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.298200 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.298218 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.298230 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:38Z","lastTransitionTime":"2025-12-02T07:23:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.308266 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2db13abc9ba56d955814e75ceea54b92610a00daeefa29aa1aad584e3453728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:38Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.312057 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhpwj\" (UniqueName: \"kubernetes.io/projected/8b212bd8-f1a7-4982-b2e6-499c13a34b0c-kube-api-access-hhpwj\") pod \"node-resolver-74fkh\" (UID: \"8b212bd8-f1a7-4982-b2e6-499c13a34b0c\") " pod="openshift-dns/node-resolver-74fkh" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.312113 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/8b212bd8-f1a7-4982-b2e6-499c13a34b0c-hosts-file\") pod \"node-resolver-74fkh\" (UID: \"8b212bd8-f1a7-4982-b2e6-499c13a34b0c\") " pod="openshift-dns/node-resolver-74fkh" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.321172 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://444ad02c4afa919779de59fd8c81bdce5c3426dd6451b90505921a8d5fa0c96b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-02T07:23:38Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.340154 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bb4bea68942dbface0009fbac76f7b1253ab282e8decb9081225de7d6537ec0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3115f2c50d05ee4
06f583937e39b1b55f364ce7cbeb0788b75941a22e23f57f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:38Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.360608 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:38Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.379329 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b7fa1ad-7e38-4645-ab41-c7d395f5c10e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40e020a2b4caf39d1661cee9c6b0055eca86e0ae9b81d2da7485ca9ec92dd0aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33e300332628d3baf9041b52dc7b8cd9dab1e9c5da76572e86b787433fd5fb92\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://4455fec2a817e7266e152d3f4afbede483504d6066f1a0f7375bce282ae10f8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b674e20b2199fd0083aed9cf001d27a72d31ca6c62cbe376c9510cc94f37d312\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9998dffc20c5b0579e6214d10541b9702e9c4a6563c930a41e49b6853b832ba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T07:23:32Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 07:23:21.537254 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 07:23:21.538901 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1012964219/tls.crt::/tmp/serving-cert-1012964219/tls.key\\\\\\\"\\\\nI1202 07:23:32.848375 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 07:23:32.853597 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 07:23:32.853643 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 07:23:32.853680 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 07:23:32.853691 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 07:23:32.869490 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 07:23:32.869536 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:23:32.869542 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:23:32.869555 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 07:23:32.869559 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 07:23:32.869564 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 07:23:32.869568 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 07:23:32.869624 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 07:23:32.874299 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d318f71ad32dd4a410093b85741038e468d540edd30f112234a9d595bc6fd85f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0faa0995131a1f9cbf2aa0cf7651d510fad1bdaf7e32de51f8a14b70f161e783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0faa0995131a1f9cbf2aa0cf7651d510fad1bdaf7e32de51f8a14b70f161e783\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:38Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.399433 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:38Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.401377 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.401412 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.401423 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:38 crc 
kubenswrapper[4895]: I1202 07:23:38.401442 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.401452 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:38Z","lastTransitionTime":"2025-12-02T07:23:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.412898 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hhpwj\" (UniqueName: \"kubernetes.io/projected/8b212bd8-f1a7-4982-b2e6-499c13a34b0c-kube-api-access-hhpwj\") pod \"node-resolver-74fkh\" (UID: \"8b212bd8-f1a7-4982-b2e6-499c13a34b0c\") " pod="openshift-dns/node-resolver-74fkh" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.412976 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/8b212bd8-f1a7-4982-b2e6-499c13a34b0c-hosts-file\") pod \"node-resolver-74fkh\" (UID: \"8b212bd8-f1a7-4982-b2e6-499c13a34b0c\") " pod="openshift-dns/node-resolver-74fkh" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.413140 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/8b212bd8-f1a7-4982-b2e6-499c13a34b0c-hosts-file\") pod \"node-resolver-74fkh\" (UID: \"8b212bd8-f1a7-4982-b2e6-499c13a34b0c\") " pod="openshift-dns/node-resolver-74fkh" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.416833 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:38Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.428950 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-74fkh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b212bd8-f1a7-4982-b2e6-499c13a34b0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhpwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-74fkh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:38Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.433519 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhpwj\" (UniqueName: \"kubernetes.io/projected/8b212bd8-f1a7-4982-b2e6-499c13a34b0c-kube-api-access-hhpwj\") pod \"node-resolver-74fkh\" (UID: \"8b212bd8-f1a7-4982-b2e6-499c13a34b0c\") " pod="openshift-dns/node-resolver-74fkh" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.505249 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:38 crc 
kubenswrapper[4895]: I1202 07:23:38.505304 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.505324 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.505344 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.505358 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:38Z","lastTransitionTime":"2025-12-02T07:23:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.590321 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-74fkh" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.608812 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.609467 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.609487 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.609514 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.609533 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:38Z","lastTransitionTime":"2025-12-02T07:23:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.662578 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-wfcg7"] Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.663427 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-n7xcr"] Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.664557 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.665068 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-hlxqt"] Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.665993 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-n7xcr" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.666159 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-hlxqt" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.667912 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.668378 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.669193 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.669221 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.670339 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.670542 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.670637 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.670662 4895 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.670559 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.670909 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.674837 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.675056 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.689052 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b7fa1ad-7e38-4645-ab41-c7d395f5c10e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40e020a2b4caf39d1661cee9c6b0055eca86e0ae9b81d2da7485ca9ec92dd0aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33e300332628d3baf9041b52dc7b8cd9dab1e9c5da76572e86b787433fd5fb92\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://4455fec2a817e7266e152d3f4afbede483504d6066f1a0f7375bce282ae10f8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b674e20b2199fd0083aed9cf001d27a72d31ca6c62cbe376c9510cc94f37d312\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9998dffc20c5b0579e6214d10541b9702e9c4a6563c930a41e49b6853b832ba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T07:23:32Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 07:23:21.537254 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 07:23:21.538901 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1012964219/tls.crt::/tmp/serving-cert-1012964219/tls.key\\\\\\\"\\\\nI1202 07:23:32.848375 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 07:23:32.853597 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 07:23:32.853643 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 07:23:32.853680 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 07:23:32.853691 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 07:23:32.869490 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 07:23:32.869536 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:23:32.869542 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:23:32.869555 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 07:23:32.869559 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 07:23:32.869564 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 07:23:32.869568 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 07:23:32.869624 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 07:23:32.874299 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d318f71ad32dd4a410093b85741038e468d540edd30f112234a9d595bc6fd85f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0faa0995131a1f9cbf2aa0cf7651d510fad1bdaf7e32de51f8a14b70f161e783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0faa0995131a1f9cbf2aa0cf7651d510fad1bdaf7e32de51f8a14b70f161e783\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:38Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.702969 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:38Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.713247 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.713283 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.713293 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:38 crc 
kubenswrapper[4895]: I1202 07:23:38.713316 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.713329 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:38Z","lastTransitionTime":"2025-12-02T07:23:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.714759 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3514f381-d0d1-4e00-931e-c5ca75202a1b-system-cni-dir\") pod \"multus-additional-cni-plugins-n7xcr\" (UID: \"3514f381-d0d1-4e00-931e-c5ca75202a1b\") " pod="openshift-multus/multus-additional-cni-plugins-n7xcr" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.714822 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3514f381-d0d1-4e00-931e-c5ca75202a1b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-n7xcr\" (UID: \"3514f381-d0d1-4e00-931e-c5ca75202a1b\") " pod="openshift-multus/multus-additional-cni-plugins-n7xcr" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.714843 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3514f381-d0d1-4e00-931e-c5ca75202a1b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-n7xcr\" (UID: \"3514f381-d0d1-4e00-931e-c5ca75202a1b\") " pod="openshift-multus/multus-additional-cni-plugins-n7xcr" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 
07:23:38.714859 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjq58\" (UniqueName: \"kubernetes.io/projected/3514f381-d0d1-4e00-931e-c5ca75202a1b-kube-api-access-tjq58\") pod \"multus-additional-cni-plugins-n7xcr\" (UID: \"3514f381-d0d1-4e00-931e-c5ca75202a1b\") " pod="openshift-multus/multus-additional-cni-plugins-n7xcr" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.714881 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/30911fe5-208f-44e8-a380-2a0093f24863-multus-socket-dir-parent\") pod \"multus-hlxqt\" (UID: \"30911fe5-208f-44e8-a380-2a0093f24863\") " pod="openshift-multus/multus-hlxqt" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.714898 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/30911fe5-208f-44e8-a380-2a0093f24863-host-var-lib-cni-multus\") pod \"multus-hlxqt\" (UID: \"30911fe5-208f-44e8-a380-2a0093f24863\") " pod="openshift-multus/multus-hlxqt" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.714926 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3514f381-d0d1-4e00-931e-c5ca75202a1b-os-release\") pod \"multus-additional-cni-plugins-n7xcr\" (UID: \"3514f381-d0d1-4e00-931e-c5ca75202a1b\") " pod="openshift-multus/multus-additional-cni-plugins-n7xcr" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.714942 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/30911fe5-208f-44e8-a380-2a0093f24863-multus-conf-dir\") pod \"multus-hlxqt\" (UID: \"30911fe5-208f-44e8-a380-2a0093f24863\") " 
pod="openshift-multus/multus-hlxqt" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.714970 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/30911fe5-208f-44e8-a380-2a0093f24863-system-cni-dir\") pod \"multus-hlxqt\" (UID: \"30911fe5-208f-44e8-a380-2a0093f24863\") " pod="openshift-multus/multus-hlxqt" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.714989 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfwjt\" (UniqueName: \"kubernetes.io/projected/0468d2d1-a975-45a6-af9f-47adc432fab0-kube-api-access-vfwjt\") pod \"machine-config-daemon-wfcg7\" (UID: \"0468d2d1-a975-45a6-af9f-47adc432fab0\") " pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.715008 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/30911fe5-208f-44e8-a380-2a0093f24863-multus-cni-dir\") pod \"multus-hlxqt\" (UID: \"30911fe5-208f-44e8-a380-2a0093f24863\") " pod="openshift-multus/multus-hlxqt" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.715027 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/30911fe5-208f-44e8-a380-2a0093f24863-cni-binary-copy\") pod \"multus-hlxqt\" (UID: \"30911fe5-208f-44e8-a380-2a0093f24863\") " pod="openshift-multus/multus-hlxqt" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.715047 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3514f381-d0d1-4e00-931e-c5ca75202a1b-cni-binary-copy\") pod \"multus-additional-cni-plugins-n7xcr\" (UID: 
\"3514f381-d0d1-4e00-931e-c5ca75202a1b\") " pod="openshift-multus/multus-additional-cni-plugins-n7xcr" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.715070 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/0468d2d1-a975-45a6-af9f-47adc432fab0-rootfs\") pod \"machine-config-daemon-wfcg7\" (UID: \"0468d2d1-a975-45a6-af9f-47adc432fab0\") " pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.715093 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/30911fe5-208f-44e8-a380-2a0093f24863-host-run-multus-certs\") pod \"multus-hlxqt\" (UID: \"30911fe5-208f-44e8-a380-2a0093f24863\") " pod="openshift-multus/multus-hlxqt" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.715110 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3514f381-d0d1-4e00-931e-c5ca75202a1b-cnibin\") pod \"multus-additional-cni-plugins-n7xcr\" (UID: \"3514f381-d0d1-4e00-931e-c5ca75202a1b\") " pod="openshift-multus/multus-additional-cni-plugins-n7xcr" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.715213 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/30911fe5-208f-44e8-a380-2a0093f24863-cnibin\") pod \"multus-hlxqt\" (UID: \"30911fe5-208f-44e8-a380-2a0093f24863\") " pod="openshift-multus/multus-hlxqt" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.715271 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/30911fe5-208f-44e8-a380-2a0093f24863-host-var-lib-cni-bin\") pod 
\"multus-hlxqt\" (UID: \"30911fe5-208f-44e8-a380-2a0093f24863\") " pod="openshift-multus/multus-hlxqt" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.715302 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/30911fe5-208f-44e8-a380-2a0093f24863-etc-kubernetes\") pod \"multus-hlxqt\" (UID: \"30911fe5-208f-44e8-a380-2a0093f24863\") " pod="openshift-multus/multus-hlxqt" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.715329 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6l8j\" (UniqueName: \"kubernetes.io/projected/30911fe5-208f-44e8-a380-2a0093f24863-kube-api-access-z6l8j\") pod \"multus-hlxqt\" (UID: \"30911fe5-208f-44e8-a380-2a0093f24863\") " pod="openshift-multus/multus-hlxqt" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.715348 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0468d2d1-a975-45a6-af9f-47adc432fab0-mcd-auth-proxy-config\") pod \"machine-config-daemon-wfcg7\" (UID: \"0468d2d1-a975-45a6-af9f-47adc432fab0\") " pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.715372 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/30911fe5-208f-44e8-a380-2a0093f24863-hostroot\") pod \"multus-hlxqt\" (UID: \"30911fe5-208f-44e8-a380-2a0093f24863\") " pod="openshift-multus/multus-hlxqt" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.715394 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0468d2d1-a975-45a6-af9f-47adc432fab0-proxy-tls\") pod \"machine-config-daemon-wfcg7\" 
(UID: \"0468d2d1-a975-45a6-af9f-47adc432fab0\") " pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.715410 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/30911fe5-208f-44e8-a380-2a0093f24863-os-release\") pod \"multus-hlxqt\" (UID: \"30911fe5-208f-44e8-a380-2a0093f24863\") " pod="openshift-multus/multus-hlxqt" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.715429 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/30911fe5-208f-44e8-a380-2a0093f24863-host-var-lib-kubelet\") pod \"multus-hlxqt\" (UID: \"30911fe5-208f-44e8-a380-2a0093f24863\") " pod="openshift-multus/multus-hlxqt" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.715446 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/30911fe5-208f-44e8-a380-2a0093f24863-host-run-netns\") pod \"multus-hlxqt\" (UID: \"30911fe5-208f-44e8-a380-2a0093f24863\") " pod="openshift-multus/multus-hlxqt" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.715465 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/30911fe5-208f-44e8-a380-2a0093f24863-multus-daemon-config\") pod \"multus-hlxqt\" (UID: \"30911fe5-208f-44e8-a380-2a0093f24863\") " pod="openshift-multus/multus-hlxqt" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.715555 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/30911fe5-208f-44e8-a380-2a0093f24863-host-run-k8s-cni-cncf-io\") pod \"multus-hlxqt\" (UID: 
\"30911fe5-208f-44e8-a380-2a0093f24863\") " pod="openshift-multus/multus-hlxqt" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.717013 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:38Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.727135 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-74fkh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b212bd8-f1a7-4982-b2e6-499c13a34b0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhpwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-74fkh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:38Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.745837 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0468d2d1-a975-45a6-af9f-47adc432fab0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vfwjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vfwjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wfcg7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:38Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.759037 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4604b0d5-cf8f-404d-af67-65da4b1dc003\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b3c366f0f27ff793a37b2944887ab1c1bc155c14d5c8d6c267442b0a8666ef8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\
\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6c3baa5b3772a8fc64774e28c3f389f114723e7efa62b934140a4fed8b6a40d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49d7c12ea2cd199dcfa46a6f1a1a856f1285a709def9d82b6805ba256a79fd5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2543d364f902a6c34a82dac25f389cd36afc2f717d772d0a487a640d44adf30f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578
bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:38Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.774073 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2db13abc9ba56d955814e75ceea54b92610a00daeefa29aa1aad584e3453728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:38Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.788555 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://444ad02c4afa919779de59fd8c81bdce5c3426dd6451b90505921a8d5fa0c96b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:38Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.802999 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bb4bea68942dbface0009fbac76f7b1253ab282e8decb9081225de7d6537ec0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\
\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3115f2c50d05ee406f583937e39b1b55f364ce7cbeb0788b75941a22e23f57f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:38Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.815657 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.815710 4895 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.815729 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.815774 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.815791 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:38Z","lastTransitionTime":"2025-12-02T07:23:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.816619 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:38Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.817127 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/30911fe5-208f-44e8-a380-2a0093f24863-host-run-k8s-cni-cncf-io\") pod \"multus-hlxqt\" (UID: \"30911fe5-208f-44e8-a380-2a0093f24863\") " pod="openshift-multus/multus-hlxqt" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.817167 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/3514f381-d0d1-4e00-931e-c5ca75202a1b-system-cni-dir\") pod \"multus-additional-cni-plugins-n7xcr\" (UID: \"3514f381-d0d1-4e00-931e-c5ca75202a1b\") " pod="openshift-multus/multus-additional-cni-plugins-n7xcr" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.817201 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjq58\" (UniqueName: \"kubernetes.io/projected/3514f381-d0d1-4e00-931e-c5ca75202a1b-kube-api-access-tjq58\") pod \"multus-additional-cni-plugins-n7xcr\" (UID: \"3514f381-d0d1-4e00-931e-c5ca75202a1b\") " pod="openshift-multus/multus-additional-cni-plugins-n7xcr" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.817220 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/30911fe5-208f-44e8-a380-2a0093f24863-multus-socket-dir-parent\") pod \"multus-hlxqt\" (UID: \"30911fe5-208f-44e8-a380-2a0093f24863\") " pod="openshift-multus/multus-hlxqt" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.817267 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/30911fe5-208f-44e8-a380-2a0093f24863-host-var-lib-cni-multus\") pod \"multus-hlxqt\" (UID: \"30911fe5-208f-44e8-a380-2a0093f24863\") " pod="openshift-multus/multus-hlxqt" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.817286 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3514f381-d0d1-4e00-931e-c5ca75202a1b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-n7xcr\" (UID: \"3514f381-d0d1-4e00-931e-c5ca75202a1b\") " pod="openshift-multus/multus-additional-cni-plugins-n7xcr" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.817304 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3514f381-d0d1-4e00-931e-c5ca75202a1b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-n7xcr\" (UID: \"3514f381-d0d1-4e00-931e-c5ca75202a1b\") " pod="openshift-multus/multus-additional-cni-plugins-n7xcr" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.817321 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/30911fe5-208f-44e8-a380-2a0093f24863-multus-conf-dir\") pod \"multus-hlxqt\" (UID: \"30911fe5-208f-44e8-a380-2a0093f24863\") " pod="openshift-multus/multus-hlxqt" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.817344 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3514f381-d0d1-4e00-931e-c5ca75202a1b-os-release\") pod \"multus-additional-cni-plugins-n7xcr\" (UID: \"3514f381-d0d1-4e00-931e-c5ca75202a1b\") " pod="openshift-multus/multus-additional-cni-plugins-n7xcr" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.817363 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/30911fe5-208f-44e8-a380-2a0093f24863-system-cni-dir\") pod \"multus-hlxqt\" (UID: \"30911fe5-208f-44e8-a380-2a0093f24863\") " pod="openshift-multus/multus-hlxqt" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.817389 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/30911fe5-208f-44e8-a380-2a0093f24863-cni-binary-copy\") pod \"multus-hlxqt\" (UID: \"30911fe5-208f-44e8-a380-2a0093f24863\") " pod="openshift-multus/multus-hlxqt" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.817409 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vfwjt\" (UniqueName: 
\"kubernetes.io/projected/0468d2d1-a975-45a6-af9f-47adc432fab0-kube-api-access-vfwjt\") pod \"machine-config-daemon-wfcg7\" (UID: \"0468d2d1-a975-45a6-af9f-47adc432fab0\") " pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.817426 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/30911fe5-208f-44e8-a380-2a0093f24863-multus-cni-dir\") pod \"multus-hlxqt\" (UID: \"30911fe5-208f-44e8-a380-2a0093f24863\") " pod="openshift-multus/multus-hlxqt" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.817444 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3514f381-d0d1-4e00-931e-c5ca75202a1b-cni-binary-copy\") pod \"multus-additional-cni-plugins-n7xcr\" (UID: \"3514f381-d0d1-4e00-931e-c5ca75202a1b\") " pod="openshift-multus/multus-additional-cni-plugins-n7xcr" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.817466 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/0468d2d1-a975-45a6-af9f-47adc432fab0-rootfs\") pod \"machine-config-daemon-wfcg7\" (UID: \"0468d2d1-a975-45a6-af9f-47adc432fab0\") " pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.817491 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/30911fe5-208f-44e8-a380-2a0093f24863-host-run-multus-certs\") pod \"multus-hlxqt\" (UID: \"30911fe5-208f-44e8-a380-2a0093f24863\") " pod="openshift-multus/multus-hlxqt" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.817512 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/30911fe5-208f-44e8-a380-2a0093f24863-cnibin\") pod \"multus-hlxqt\" (UID: \"30911fe5-208f-44e8-a380-2a0093f24863\") " pod="openshift-multus/multus-hlxqt" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.817528 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/30911fe5-208f-44e8-a380-2a0093f24863-host-var-lib-cni-bin\") pod \"multus-hlxqt\" (UID: \"30911fe5-208f-44e8-a380-2a0093f24863\") " pod="openshift-multus/multus-hlxqt" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.817545 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/30911fe5-208f-44e8-a380-2a0093f24863-etc-kubernetes\") pod \"multus-hlxqt\" (UID: \"30911fe5-208f-44e8-a380-2a0093f24863\") " pod="openshift-multus/multus-hlxqt" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.817567 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3514f381-d0d1-4e00-931e-c5ca75202a1b-cnibin\") pod \"multus-additional-cni-plugins-n7xcr\" (UID: \"3514f381-d0d1-4e00-931e-c5ca75202a1b\") " pod="openshift-multus/multus-additional-cni-plugins-n7xcr" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.817590 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6l8j\" (UniqueName: \"kubernetes.io/projected/30911fe5-208f-44e8-a380-2a0093f24863-kube-api-access-z6l8j\") pod \"multus-hlxqt\" (UID: \"30911fe5-208f-44e8-a380-2a0093f24863\") " pod="openshift-multus/multus-hlxqt" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.817613 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0468d2d1-a975-45a6-af9f-47adc432fab0-mcd-auth-proxy-config\") pod \"machine-config-daemon-wfcg7\" 
(UID: \"0468d2d1-a975-45a6-af9f-47adc432fab0\") " pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.817629 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/30911fe5-208f-44e8-a380-2a0093f24863-hostroot\") pod \"multus-hlxqt\" (UID: \"30911fe5-208f-44e8-a380-2a0093f24863\") " pod="openshift-multus/multus-hlxqt" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.817646 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0468d2d1-a975-45a6-af9f-47adc432fab0-proxy-tls\") pod \"machine-config-daemon-wfcg7\" (UID: \"0468d2d1-a975-45a6-af9f-47adc432fab0\") " pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.817671 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/30911fe5-208f-44e8-a380-2a0093f24863-os-release\") pod \"multus-hlxqt\" (UID: \"30911fe5-208f-44e8-a380-2a0093f24863\") " pod="openshift-multus/multus-hlxqt" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.817696 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/30911fe5-208f-44e8-a380-2a0093f24863-host-var-lib-kubelet\") pod \"multus-hlxqt\" (UID: \"30911fe5-208f-44e8-a380-2a0093f24863\") " pod="openshift-multus/multus-hlxqt" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.817726 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/30911fe5-208f-44e8-a380-2a0093f24863-host-run-netns\") pod \"multus-hlxqt\" (UID: \"30911fe5-208f-44e8-a380-2a0093f24863\") " pod="openshift-multus/multus-hlxqt" Dec 02 07:23:38 crc 
kubenswrapper[4895]: I1202 07:23:38.817771 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/30911fe5-208f-44e8-a380-2a0093f24863-multus-daemon-config\") pod \"multus-hlxqt\" (UID: \"30911fe5-208f-44e8-a380-2a0093f24863\") " pod="openshift-multus/multus-hlxqt" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.818343 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/30911fe5-208f-44e8-a380-2a0093f24863-cnibin\") pod \"multus-hlxqt\" (UID: \"30911fe5-208f-44e8-a380-2a0093f24863\") " pod="openshift-multus/multus-hlxqt" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.818364 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/30911fe5-208f-44e8-a380-2a0093f24863-host-var-lib-kubelet\") pod \"multus-hlxqt\" (UID: \"30911fe5-208f-44e8-a380-2a0093f24863\") " pod="openshift-multus/multus-hlxqt" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.818459 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/30911fe5-208f-44e8-a380-2a0093f24863-os-release\") pod \"multus-hlxqt\" (UID: \"30911fe5-208f-44e8-a380-2a0093f24863\") " pod="openshift-multus/multus-hlxqt" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.818491 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3514f381-d0d1-4e00-931e-c5ca75202a1b-cnibin\") pod \"multus-additional-cni-plugins-n7xcr\" (UID: \"3514f381-d0d1-4e00-931e-c5ca75202a1b\") " pod="openshift-multus/multus-additional-cni-plugins-n7xcr" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.818548 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/30911fe5-208f-44e8-a380-2a0093f24863-host-run-netns\") pod \"multus-hlxqt\" (UID: \"30911fe5-208f-44e8-a380-2a0093f24863\") " pod="openshift-multus/multus-hlxqt" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.818588 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/30911fe5-208f-44e8-a380-2a0093f24863-multus-conf-dir\") pod \"multus-hlxqt\" (UID: \"30911fe5-208f-44e8-a380-2a0093f24863\") " pod="openshift-multus/multus-hlxqt" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.818577 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/30911fe5-208f-44e8-a380-2a0093f24863-host-run-multus-certs\") pod \"multus-hlxqt\" (UID: \"30911fe5-208f-44e8-a380-2a0093f24863\") " pod="openshift-multus/multus-hlxqt" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.818688 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3514f381-d0d1-4e00-931e-c5ca75202a1b-os-release\") pod \"multus-additional-cni-plugins-n7xcr\" (UID: \"3514f381-d0d1-4e00-931e-c5ca75202a1b\") " pod="openshift-multus/multus-additional-cni-plugins-n7xcr" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.818730 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/30911fe5-208f-44e8-a380-2a0093f24863-host-run-k8s-cni-cncf-io\") pod \"multus-hlxqt\" (UID: \"30911fe5-208f-44e8-a380-2a0093f24863\") " pod="openshift-multus/multus-hlxqt" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.818708 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/30911fe5-208f-44e8-a380-2a0093f24863-system-cni-dir\") pod \"multus-hlxqt\" (UID: 
\"30911fe5-208f-44e8-a380-2a0093f24863\") " pod="openshift-multus/multus-hlxqt" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.818805 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3514f381-d0d1-4e00-931e-c5ca75202a1b-system-cni-dir\") pod \"multus-additional-cni-plugins-n7xcr\" (UID: \"3514f381-d0d1-4e00-931e-c5ca75202a1b\") " pod="openshift-multus/multus-additional-cni-plugins-n7xcr" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.818833 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/30911fe5-208f-44e8-a380-2a0093f24863-host-var-lib-cni-bin\") pod \"multus-hlxqt\" (UID: \"30911fe5-208f-44e8-a380-2a0093f24863\") " pod="openshift-multus/multus-hlxqt" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.818848 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/30911fe5-208f-44e8-a380-2a0093f24863-host-var-lib-cni-multus\") pod \"multus-hlxqt\" (UID: \"30911fe5-208f-44e8-a380-2a0093f24863\") " pod="openshift-multus/multus-hlxqt" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.818886 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/30911fe5-208f-44e8-a380-2a0093f24863-hostroot\") pod \"multus-hlxqt\" (UID: \"30911fe5-208f-44e8-a380-2a0093f24863\") " pod="openshift-multus/multus-hlxqt" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.818903 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/30911fe5-208f-44e8-a380-2a0093f24863-multus-socket-dir-parent\") pod \"multus-hlxqt\" (UID: \"30911fe5-208f-44e8-a380-2a0093f24863\") " pod="openshift-multus/multus-hlxqt" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 
07:23:38.818962 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/30911fe5-208f-44e8-a380-2a0093f24863-multus-cni-dir\") pod \"multus-hlxqt\" (UID: \"30911fe5-208f-44e8-a380-2a0093f24863\") " pod="openshift-multus/multus-hlxqt" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.818424 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/30911fe5-208f-44e8-a380-2a0093f24863-etc-kubernetes\") pod \"multus-hlxqt\" (UID: \"30911fe5-208f-44e8-a380-2a0093f24863\") " pod="openshift-multus/multus-hlxqt" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.819068 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/0468d2d1-a975-45a6-af9f-47adc432fab0-rootfs\") pod \"machine-config-daemon-wfcg7\" (UID: \"0468d2d1-a975-45a6-af9f-47adc432fab0\") " pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.819325 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/30911fe5-208f-44e8-a380-2a0093f24863-multus-daemon-config\") pod \"multus-hlxqt\" (UID: \"30911fe5-208f-44e8-a380-2a0093f24863\") " pod="openshift-multus/multus-hlxqt" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.819513 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0468d2d1-a975-45a6-af9f-47adc432fab0-mcd-auth-proxy-config\") pod \"machine-config-daemon-wfcg7\" (UID: \"0468d2d1-a975-45a6-af9f-47adc432fab0\") " pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.819846 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" 
(UniqueName: \"kubernetes.io/configmap/3514f381-d0d1-4e00-931e-c5ca75202a1b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-n7xcr\" (UID: \"3514f381-d0d1-4e00-931e-c5ca75202a1b\") " pod="openshift-multus/multus-additional-cni-plugins-n7xcr" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.820365 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3514f381-d0d1-4e00-931e-c5ca75202a1b-cni-binary-copy\") pod \"multus-additional-cni-plugins-n7xcr\" (UID: \"3514f381-d0d1-4e00-931e-c5ca75202a1b\") " pod="openshift-multus/multus-additional-cni-plugins-n7xcr" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.820831 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/30911fe5-208f-44e8-a380-2a0093f24863-cni-binary-copy\") pod \"multus-hlxqt\" (UID: \"30911fe5-208f-44e8-a380-2a0093f24863\") " pod="openshift-multus/multus-hlxqt" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.823905 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0468d2d1-a975-45a6-af9f-47adc432fab0-proxy-tls\") pod \"machine-config-daemon-wfcg7\" (UID: \"0468d2d1-a975-45a6-af9f-47adc432fab0\") " pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.828857 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4604b0d5-cf8f-404d-af67-65da4b1dc003\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b3c366f0f27ff793a37b2944887ab1c1bc155c14d5c8d6c267442b0a8666ef8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6c3baa5b3772a8fc64774e28c3f389f114723e7efa62b934140a4fed8b6a40d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49d7c12ea2cd199dcfa46a6f1a1a856f1285a709def9d82b6805ba256a79fd5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2543d364f902a6c34a82dac25f389cd36afc2f717d772d0a487a640d44adf30f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:38Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.834209 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjq58\" (UniqueName: \"kubernetes.io/projected/3514f381-d0d1-4e00-931e-c5ca75202a1b-kube-api-access-tjq58\") pod \"multus-additional-cni-plugins-n7xcr\" (UID: \"3514f381-d0d1-4e00-931e-c5ca75202a1b\") " pod="openshift-multus/multus-additional-cni-plugins-n7xcr" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.836409 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3514f381-d0d1-4e00-931e-c5ca75202a1b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-n7xcr\" (UID: \"3514f381-d0d1-4e00-931e-c5ca75202a1b\") " pod="openshift-multus/multus-additional-cni-plugins-n7xcr" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.838576 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6l8j\" (UniqueName: \"kubernetes.io/projected/30911fe5-208f-44e8-a380-2a0093f24863-kube-api-access-z6l8j\") pod 
\"multus-hlxqt\" (UID: \"30911fe5-208f-44e8-a380-2a0093f24863\") " pod="openshift-multus/multus-hlxqt" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.839405 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfwjt\" (UniqueName: \"kubernetes.io/projected/0468d2d1-a975-45a6-af9f-47adc432fab0-kube-api-access-vfwjt\") pod \"machine-config-daemon-wfcg7\" (UID: \"0468d2d1-a975-45a6-af9f-47adc432fab0\") " pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.841617 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://444ad02c4afa919779de59fd8c81bdce5c3426dd6451b90505921a8d5fa0c96b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:36Z\\\"}},
\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:38Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.855233 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:38Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.870126 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n7xcr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3514f381-d0d1-4e00-931e-c5ca75202a1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n7xcr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:38Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.885846 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hlxqt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"30911fe5-208f-44e8-a380-2a0093f24863\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6l8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hlxqt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:38Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.899134 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:38Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.913070 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2db13abc9ba56d955814e75ceea54b92610a00daeefa29aa1aad584e3453728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:38Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.919031 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.919074 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.919084 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.919101 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.919111 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:38Z","lastTransitionTime":"2025-12-02T07:23:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.929400 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bb4bea68942dbface0009fbac76f7b1253ab282e8decb9081225de7d6537ec0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3115f2c50d05ee406f583937e39b1b55f364ce7cbeb0788b75941a22e23f57f9\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:38Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.948473 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b7fa1ad-7e38-4645-ab41-c7d395f5c10e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40e020a2b4caf39d1661cee9c6b0055eca86e0ae9b81d2da7485ca9ec92dd0aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33e300332628d3baf9041b52dc7b8cd9dab1e9c5da76572e86b787433fd5fb92\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4455fec2a817e7266e152d3f4afbede483504d6066f1a0f7375bce282ae10f8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b674e20b2199fd0083aed9cf001d27a72d31ca6c62cbe376c9510cc94f37d312\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9998dffc20c5b0579e6214d10541b9702e9c4a6563c930a41e49b6853b832ba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T07:23:32Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 07:23:21.537254 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 07:23:21.538901 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1012964219/tls.crt::/tmp/serving-cert-1012964219/tls.key\\\\\\\"\\\\nI1202 07:23:32.848375 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 07:23:32.853597 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 07:23:32.853643 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 07:23:32.853680 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 07:23:32.853691 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 07:23:32.869490 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 07:23:32.869536 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:23:32.869542 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:23:32.869555 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 07:23:32.869559 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 07:23:32.869564 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 07:23:32.869568 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 07:23:32.869624 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 07:23:32.874299 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d318f71ad32dd4a410093b85741038e468d540edd30f112234a9d595bc6fd85f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0faa0995131a1f9cbf2aa0cf7651d510fad1bdaf7e32de51f8a14b70f161e783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0faa0995131a1f9cbf2aa0cf7651d510fad1bdaf7e32de51f8a14b70f161e783\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:38Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.963154 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:38Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.974194 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-74fkh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b212bd8-f1a7-4982-b2e6-499c13a34b0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhpwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-74fkh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:38Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.982219 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.990613 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0468d2d1-a975-45a6-af9f-47adc432fab0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vfwjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vfwjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wfcg7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:38Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:38 crc kubenswrapper[4895]: I1202 07:23:38.996401 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-n7xcr" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.007496 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-hlxqt" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.022051 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.022091 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.022105 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.022603 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.022628 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:39Z","lastTransitionTime":"2025-12-02T07:23:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.047156 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-w54m4"] Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.048824 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.056150 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.056176 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.056816 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.057596 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.057821 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.058101 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.058600 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.078315 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"afc3334a-0153-4dcc-9a56-92f6cae51c08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w54m4\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:39Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.100610 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2db13abc9ba56d955814e75ceea54b92610a00daeefa29aa1aad584e3453728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"m
etrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:39Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.117879 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bb4bea68942dbface0009fbac76f7b1253ab282e8decb9081225de7d6537ec0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,
\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3115f2c50d05ee406f583937e39b1b55f364ce7cbeb0788b75941a22e23f57f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:39Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.119926 4895 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/afc3334a-0153-4dcc-9a56-92f6cae51c08-host-slash\") pod \"ovnkube-node-w54m4\" (UID: \"afc3334a-0153-4dcc-9a56-92f6cae51c08\") " pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.119964 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/afc3334a-0153-4dcc-9a56-92f6cae51c08-env-overrides\") pod \"ovnkube-node-w54m4\" (UID: \"afc3334a-0153-4dcc-9a56-92f6cae51c08\") " pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.119986 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/afc3334a-0153-4dcc-9a56-92f6cae51c08-ovn-node-metrics-cert\") pod \"ovnkube-node-w54m4\" (UID: \"afc3334a-0153-4dcc-9a56-92f6cae51c08\") " pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.120013 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/afc3334a-0153-4dcc-9a56-92f6cae51c08-host-run-ovn-kubernetes\") pod \"ovnkube-node-w54m4\" (UID: \"afc3334a-0153-4dcc-9a56-92f6cae51c08\") " pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.120038 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/afc3334a-0153-4dcc-9a56-92f6cae51c08-host-cni-netd\") pod \"ovnkube-node-w54m4\" (UID: \"afc3334a-0153-4dcc-9a56-92f6cae51c08\") " pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 
07:23:39.120080 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/afc3334a-0153-4dcc-9a56-92f6cae51c08-ovnkube-script-lib\") pod \"ovnkube-node-w54m4\" (UID: \"afc3334a-0153-4dcc-9a56-92f6cae51c08\") " pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.120130 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/afc3334a-0153-4dcc-9a56-92f6cae51c08-etc-openvswitch\") pod \"ovnkube-node-w54m4\" (UID: \"afc3334a-0153-4dcc-9a56-92f6cae51c08\") " pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.120201 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/afc3334a-0153-4dcc-9a56-92f6cae51c08-run-ovn\") pod \"ovnkube-node-w54m4\" (UID: \"afc3334a-0153-4dcc-9a56-92f6cae51c08\") " pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.120610 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/afc3334a-0153-4dcc-9a56-92f6cae51c08-systemd-units\") pod \"ovnkube-node-w54m4\" (UID: \"afc3334a-0153-4dcc-9a56-92f6cae51c08\") " pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.120634 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/afc3334a-0153-4dcc-9a56-92f6cae51c08-host-run-netns\") pod \"ovnkube-node-w54m4\" (UID: \"afc3334a-0153-4dcc-9a56-92f6cae51c08\") " pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" Dec 02 07:23:39 crc 
kubenswrapper[4895]: I1202 07:23:39.120724 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/afc3334a-0153-4dcc-9a56-92f6cae51c08-var-lib-openvswitch\") pod \"ovnkube-node-w54m4\" (UID: \"afc3334a-0153-4dcc-9a56-92f6cae51c08\") " pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.120950 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/afc3334a-0153-4dcc-9a56-92f6cae51c08-run-systemd\") pod \"ovnkube-node-w54m4\" (UID: \"afc3334a-0153-4dcc-9a56-92f6cae51c08\") " pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.120992 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/afc3334a-0153-4dcc-9a56-92f6cae51c08-log-socket\") pod \"ovnkube-node-w54m4\" (UID: \"afc3334a-0153-4dcc-9a56-92f6cae51c08\") " pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.121017 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ldkg\" (UniqueName: \"kubernetes.io/projected/afc3334a-0153-4dcc-9a56-92f6cae51c08-kube-api-access-5ldkg\") pod \"ovnkube-node-w54m4\" (UID: \"afc3334a-0153-4dcc-9a56-92f6cae51c08\") " pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.121258 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/afc3334a-0153-4dcc-9a56-92f6cae51c08-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-w54m4\" (UID: 
\"afc3334a-0153-4dcc-9a56-92f6cae51c08\") " pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.121302 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/afc3334a-0153-4dcc-9a56-92f6cae51c08-run-openvswitch\") pod \"ovnkube-node-w54m4\" (UID: \"afc3334a-0153-4dcc-9a56-92f6cae51c08\") " pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.121324 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/afc3334a-0153-4dcc-9a56-92f6cae51c08-node-log\") pod \"ovnkube-node-w54m4\" (UID: \"afc3334a-0153-4dcc-9a56-92f6cae51c08\") " pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.121567 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/afc3334a-0153-4dcc-9a56-92f6cae51c08-host-kubelet\") pod \"ovnkube-node-w54m4\" (UID: \"afc3334a-0153-4dcc-9a56-92f6cae51c08\") " pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.121652 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/afc3334a-0153-4dcc-9a56-92f6cae51c08-host-cni-bin\") pod \"ovnkube-node-w54m4\" (UID: \"afc3334a-0153-4dcc-9a56-92f6cae51c08\") " pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.121688 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/afc3334a-0153-4dcc-9a56-92f6cae51c08-ovnkube-config\") pod \"ovnkube-node-w54m4\" (UID: 
\"afc3334a-0153-4dcc-9a56-92f6cae51c08\") " pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.125274 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.125309 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.125324 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.125343 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.125355 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:39Z","lastTransitionTime":"2025-12-02T07:23:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.129531 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-74fkh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b212bd8-f1a7-4982-b2e6-499c13a34b0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhpwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-74fkh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:39Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.140986 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:23:39 crc kubenswrapper[4895]: E1202 07:23:39.141254 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.141320 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:23:39 crc kubenswrapper[4895]: E1202 07:23:39.141484 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.143986 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0468d2d1-a975-45a6-af9f-47adc432fab0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vfwjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vfwjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wfcg7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:39Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.159115 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b7fa1ad-7e38-4645-ab41-c7d395f5c10e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40e020a2b4caf39d1661cee9c6b0055eca86e0ae9b81d2da7485ca9ec92dd0aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33e300332628d3baf9041b52dc7b8cd9dab1e9c5da76572e86b787433fd5fb92\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://4455fec2a817e7266e152d3f4afbede483504d6066f1a0f7375bce282ae10f8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b674e20b2199fd0083aed9cf001d27a72d31ca6c62cbe376c9510cc94f37d312\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9998dffc20c5b0579e6214d10541b9702e9c4a6563c930a41e49b6853b832ba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T07:23:32Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 07:23:21.537254 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 07:23:21.538901 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1012964219/tls.crt::/tmp/serving-cert-1012964219/tls.key\\\\\\\"\\\\nI1202 07:23:32.848375 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 07:23:32.853597 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 07:23:32.853643 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 07:23:32.853680 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 07:23:32.853691 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 07:23:32.869490 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 07:23:32.869536 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:23:32.869542 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:23:32.869555 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 07:23:32.869559 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 07:23:32.869564 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 07:23:32.869568 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 07:23:32.869624 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 07:23:32.874299 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d318f71ad32dd4a410093b85741038e468d540edd30f112234a9d595bc6fd85f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0faa0995131a1f9cbf2aa0cf7651d510fad1bdaf7e32de51f8a14b70f161e783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0faa0995131a1f9cbf2aa0cf7651d510fad1bdaf7e32de51f8a14b70f161e783\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:39Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.173606 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:39Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.200326 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n7xcr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3514f381-d0d1-4e00-931e-c5ca75202a1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n7xcr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:39Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.222357 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/afc3334a-0153-4dcc-9a56-92f6cae51c08-var-lib-openvswitch\") pod \"ovnkube-node-w54m4\" (UID: \"afc3334a-0153-4dcc-9a56-92f6cae51c08\") " pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.222406 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/afc3334a-0153-4dcc-9a56-92f6cae51c08-run-systemd\") pod \"ovnkube-node-w54m4\" (UID: \"afc3334a-0153-4dcc-9a56-92f6cae51c08\") " pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.222431 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/afc3334a-0153-4dcc-9a56-92f6cae51c08-log-socket\") pod \"ovnkube-node-w54m4\" (UID: \"afc3334a-0153-4dcc-9a56-92f6cae51c08\") " pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.222451 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5ldkg\" (UniqueName: 
\"kubernetes.io/projected/afc3334a-0153-4dcc-9a56-92f6cae51c08-kube-api-access-5ldkg\") pod \"ovnkube-node-w54m4\" (UID: \"afc3334a-0153-4dcc-9a56-92f6cae51c08\") " pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.222468 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/afc3334a-0153-4dcc-9a56-92f6cae51c08-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-w54m4\" (UID: \"afc3334a-0153-4dcc-9a56-92f6cae51c08\") " pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.222488 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/afc3334a-0153-4dcc-9a56-92f6cae51c08-run-openvswitch\") pod \"ovnkube-node-w54m4\" (UID: \"afc3334a-0153-4dcc-9a56-92f6cae51c08\") " pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.222506 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/afc3334a-0153-4dcc-9a56-92f6cae51c08-node-log\") pod \"ovnkube-node-w54m4\" (UID: \"afc3334a-0153-4dcc-9a56-92f6cae51c08\") " pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.222528 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/afc3334a-0153-4dcc-9a56-92f6cae51c08-host-kubelet\") pod \"ovnkube-node-w54m4\" (UID: \"afc3334a-0153-4dcc-9a56-92f6cae51c08\") " pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.222532 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/afc3334a-0153-4dcc-9a56-92f6cae51c08-run-systemd\") pod \"ovnkube-node-w54m4\" (UID: \"afc3334a-0153-4dcc-9a56-92f6cae51c08\") " pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.222595 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/afc3334a-0153-4dcc-9a56-92f6cae51c08-host-cni-bin\") pod \"ovnkube-node-w54m4\" (UID: \"afc3334a-0153-4dcc-9a56-92f6cae51c08\") " pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.222554 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/afc3334a-0153-4dcc-9a56-92f6cae51c08-host-cni-bin\") pod \"ovnkube-node-w54m4\" (UID: \"afc3334a-0153-4dcc-9a56-92f6cae51c08\") " pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.222623 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/afc3334a-0153-4dcc-9a56-92f6cae51c08-var-lib-openvswitch\") pod \"ovnkube-node-w54m4\" (UID: \"afc3334a-0153-4dcc-9a56-92f6cae51c08\") " pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.222640 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/afc3334a-0153-4dcc-9a56-92f6cae51c08-ovnkube-config\") pod \"ovnkube-node-w54m4\" (UID: \"afc3334a-0153-4dcc-9a56-92f6cae51c08\") " pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.222691 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/afc3334a-0153-4dcc-9a56-92f6cae51c08-log-socket\") pod \"ovnkube-node-w54m4\" 
(UID: \"afc3334a-0153-4dcc-9a56-92f6cae51c08\") " pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.222767 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/afc3334a-0153-4dcc-9a56-92f6cae51c08-host-slash\") pod \"ovnkube-node-w54m4\" (UID: \"afc3334a-0153-4dcc-9a56-92f6cae51c08\") " pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.222794 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/afc3334a-0153-4dcc-9a56-92f6cae51c08-env-overrides\") pod \"ovnkube-node-w54m4\" (UID: \"afc3334a-0153-4dcc-9a56-92f6cae51c08\") " pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.222818 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/afc3334a-0153-4dcc-9a56-92f6cae51c08-ovn-node-metrics-cert\") pod \"ovnkube-node-w54m4\" (UID: \"afc3334a-0153-4dcc-9a56-92f6cae51c08\") " pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.222848 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/afc3334a-0153-4dcc-9a56-92f6cae51c08-host-run-ovn-kubernetes\") pod \"ovnkube-node-w54m4\" (UID: \"afc3334a-0153-4dcc-9a56-92f6cae51c08\") " pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.222870 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/afc3334a-0153-4dcc-9a56-92f6cae51c08-host-cni-netd\") pod \"ovnkube-node-w54m4\" (UID: \"afc3334a-0153-4dcc-9a56-92f6cae51c08\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.222905 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/afc3334a-0153-4dcc-9a56-92f6cae51c08-ovnkube-script-lib\") pod \"ovnkube-node-w54m4\" (UID: \"afc3334a-0153-4dcc-9a56-92f6cae51c08\") " pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.222935 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/afc3334a-0153-4dcc-9a56-92f6cae51c08-etc-openvswitch\") pod \"ovnkube-node-w54m4\" (UID: \"afc3334a-0153-4dcc-9a56-92f6cae51c08\") " pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.222957 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/afc3334a-0153-4dcc-9a56-92f6cae51c08-run-ovn\") pod \"ovnkube-node-w54m4\" (UID: \"afc3334a-0153-4dcc-9a56-92f6cae51c08\") " pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.222986 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/afc3334a-0153-4dcc-9a56-92f6cae51c08-systemd-units\") pod \"ovnkube-node-w54m4\" (UID: \"afc3334a-0153-4dcc-9a56-92f6cae51c08\") " pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.223010 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/afc3334a-0153-4dcc-9a56-92f6cae51c08-host-run-netns\") pod \"ovnkube-node-w54m4\" (UID: \"afc3334a-0153-4dcc-9a56-92f6cae51c08\") " pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" Dec 02 07:23:39 crc 
kubenswrapper[4895]: I1202 07:23:39.223026 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/afc3334a-0153-4dcc-9a56-92f6cae51c08-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-w54m4\" (UID: \"afc3334a-0153-4dcc-9a56-92f6cae51c08\") " pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.223056 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/afc3334a-0153-4dcc-9a56-92f6cae51c08-run-openvswitch\") pod \"ovnkube-node-w54m4\" (UID: \"afc3334a-0153-4dcc-9a56-92f6cae51c08\") " pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.223151 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/afc3334a-0153-4dcc-9a56-92f6cae51c08-host-cni-netd\") pod \"ovnkube-node-w54m4\" (UID: \"afc3334a-0153-4dcc-9a56-92f6cae51c08\") " pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.223394 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/afc3334a-0153-4dcc-9a56-92f6cae51c08-node-log\") pod \"ovnkube-node-w54m4\" (UID: \"afc3334a-0153-4dcc-9a56-92f6cae51c08\") " pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.223447 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/afc3334a-0153-4dcc-9a56-92f6cae51c08-etc-openvswitch\") pod \"ovnkube-node-w54m4\" (UID: \"afc3334a-0153-4dcc-9a56-92f6cae51c08\") " pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.223483 4895 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/afc3334a-0153-4dcc-9a56-92f6cae51c08-run-ovn\") pod \"ovnkube-node-w54m4\" (UID: \"afc3334a-0153-4dcc-9a56-92f6cae51c08\") " pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.223515 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/afc3334a-0153-4dcc-9a56-92f6cae51c08-systemd-units\") pod \"ovnkube-node-w54m4\" (UID: \"afc3334a-0153-4dcc-9a56-92f6cae51c08\") " pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.223556 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/afc3334a-0153-4dcc-9a56-92f6cae51c08-host-run-netns\") pod \"ovnkube-node-w54m4\" (UID: \"afc3334a-0153-4dcc-9a56-92f6cae51c08\") " pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.224437 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/afc3334a-0153-4dcc-9a56-92f6cae51c08-ovnkube-script-lib\") pod \"ovnkube-node-w54m4\" (UID: \"afc3334a-0153-4dcc-9a56-92f6cae51c08\") " pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.224487 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/afc3334a-0153-4dcc-9a56-92f6cae51c08-host-slash\") pod \"ovnkube-node-w54m4\" (UID: \"afc3334a-0153-4dcc-9a56-92f6cae51c08\") " pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.224897 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/afc3334a-0153-4dcc-9a56-92f6cae51c08-env-overrides\") pod \"ovnkube-node-w54m4\" (UID: \"afc3334a-0153-4dcc-9a56-92f6cae51c08\") " pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.225262 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/afc3334a-0153-4dcc-9a56-92f6cae51c08-host-kubelet\") pod \"ovnkube-node-w54m4\" (UID: \"afc3334a-0153-4dcc-9a56-92f6cae51c08\") " pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.225322 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/afc3334a-0153-4dcc-9a56-92f6cae51c08-host-run-ovn-kubernetes\") pod \"ovnkube-node-w54m4\" (UID: \"afc3334a-0153-4dcc-9a56-92f6cae51c08\") " pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.226113 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/afc3334a-0153-4dcc-9a56-92f6cae51c08-ovnkube-config\") pod \"ovnkube-node-w54m4\" (UID: \"afc3334a-0153-4dcc-9a56-92f6cae51c08\") " pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.235207 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/afc3334a-0153-4dcc-9a56-92f6cae51c08-ovn-node-metrics-cert\") pod \"ovnkube-node-w54m4\" (UID: \"afc3334a-0153-4dcc-9a56-92f6cae51c08\") " pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.238709 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hlxqt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"30911fe5-208f-44e8-a380-2a0093f24863\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6l8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hlxqt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:39Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.244176 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.244204 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.244215 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.244232 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.244244 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:39Z","lastTransitionTime":"2025-12-02T07:23:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.258622 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ldkg\" (UniqueName: \"kubernetes.io/projected/afc3334a-0153-4dcc-9a56-92f6cae51c08-kube-api-access-5ldkg\") pod \"ovnkube-node-w54m4\" (UID: \"afc3334a-0153-4dcc-9a56-92f6cae51c08\") " pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.296140 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4604b0d5-cf8f-404d-af67-65da4b1dc003\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b3c366f0f27ff793a37b2944887ab1c1bc155c14d5c8d6c267442b0a8666ef8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\
":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6c3baa5b3772a8fc64774e28c3f389f114723e7efa62b934140a4fed8b6a40d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49d7c12ea2cd199dcfa46a6f1a1a856f1285a709def9d82b6805ba256a79fd5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\
\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2543d364f902a6c34a82dac25f389cd36afc2f717d772d0a487a640d44adf30f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:39Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.316043 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://444ad02c4afa919779de59fd8c81bdce5c3426dd6451b90505921a8d5fa0c96b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-02T07:23:39Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.332267 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:39Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.337126 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-74fkh" event={"ID":"8b212bd8-f1a7-4982-b2e6-499c13a34b0c","Type":"ContainerStarted","Data":"b4d58e6f37ee8f90af6aad88138c39e9a0753c16cceb763c2bd2ec9c30087d92"} Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.337178 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-74fkh" event={"ID":"8b212bd8-f1a7-4982-b2e6-499c13a34b0c","Type":"ContainerStarted","Data":"18cd9d1614053c5d4fe4e03dbc3c8d57b6dcd2ee79ee769176cdcb840d806ce7"} Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.340045 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-hlxqt" event={"ID":"30911fe5-208f-44e8-a380-2a0093f24863","Type":"ContainerStarted","Data":"87ebbf4f529d4e7b64c392912d06e57dc80a2237a9a4e31ab5b56ac556aa10b1"} Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.340074 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-hlxqt" event={"ID":"30911fe5-208f-44e8-a380-2a0093f24863","Type":"ContainerStarted","Data":"7e47fb60366c162a9600c816cae069db4687007c1993bfc933df8e356472ba96"} Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.348608 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:39Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.350449 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.350507 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.350521 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.350543 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.350559 4895 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:39Z","lastTransitionTime":"2025-12-02T07:23:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.351345 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" event={"ID":"0468d2d1-a975-45a6-af9f-47adc432fab0","Type":"ContainerStarted","Data":"ebf8583b59d7955530da7a83a29f9994ba139c076cf7b76dc94e88d14f39e1a2"} Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.351390 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" event={"ID":"0468d2d1-a975-45a6-af9f-47adc432fab0","Type":"ContainerStarted","Data":"1474e8f5989607f3b95fefe811e3d787320f04c223edb6ffe47d2b37863ea874"} Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.351401 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" event={"ID":"0468d2d1-a975-45a6-af9f-47adc432fab0","Type":"ContainerStarted","Data":"b449c17e38ac26e162a776ab565fe797f2f899ddd3f93935d03836c52037e431"} Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.353869 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-n7xcr" event={"ID":"3514f381-d0d1-4e00-931e-c5ca75202a1b","Type":"ContainerStarted","Data":"3deeb37702c8243a956f2be1bb61e2693bb9be5453d775bc3f3e85618d483015"} Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.353925 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-n7xcr" 
event={"ID":"3514f381-d0d1-4e00-931e-c5ca75202a1b","Type":"ContainerStarted","Data":"8002f66093a34e1df48af72aad4208146baec61601b2f92b88e63779154e17a7"} Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.362521 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:39Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.365754 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.374478 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-74fkh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b212bd8-f1a7-4982-b2e6-499c13a34b0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4d58e6f37ee8f90af6aad88138c39e9a0753c16cceb763c2bd2ec9c30087d92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\
":\\\"kube-api-access-hhpwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-74fkh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:39Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:39 crc kubenswrapper[4895]: W1202 07:23:39.379371 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podafc3334a_0153_4dcc_9a56_92f6cae51c08.slice/crio-49b65f34552e500042cf1b7a788b223005dea6e388c4dde91984023ebc9f0827 WatchSource:0}: Error finding container 49b65f34552e500042cf1b7a788b223005dea6e388c4dde91984023ebc9f0827: Status 404 returned error can't find the container with id 49b65f34552e500042cf1b7a788b223005dea6e388c4dde91984023ebc9f0827 Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.387580 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0468d2d1-a975-45a6-af9f-47adc432fab0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vfwjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vfwjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wfcg7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:39Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.404405 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b7fa1ad-7e38-4645-ab41-c7d395f5c10e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40e020a2b4caf39d1661cee9c6b0055eca86e0ae9b81d2da7485ca9ec92dd0aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33e300332628d3baf9041b52dc7b8cd9dab1e9c5da76572e86b787433fd5fb92\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://4455fec2a817e7266e152d3f4afbede483504d6066f1a0f7375bce282ae10f8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b674e20b2199fd0083aed9cf001d27a72d31ca6c62cbe376c9510cc94f37d312\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9998dffc20c5b0579e6214d10541b9702e9c4a6563c930a41e49b6853b832ba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T07:23:32Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 07:23:21.537254 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 07:23:21.538901 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1012964219/tls.crt::/tmp/serving-cert-1012964219/tls.key\\\\\\\"\\\\nI1202 07:23:32.848375 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 07:23:32.853597 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 07:23:32.853643 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 07:23:32.853680 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 07:23:32.853691 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 07:23:32.869490 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 07:23:32.869536 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:23:32.869542 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:23:32.869555 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 07:23:32.869559 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 07:23:32.869564 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 07:23:32.869568 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 07:23:32.869624 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 07:23:32.874299 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d318f71ad32dd4a410093b85741038e468d540edd30f112234a9d595bc6fd85f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0faa0995131a1f9cbf2aa0cf7651d510fad1bdaf7e32de51f8a14b70f161e783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0faa0995131a1f9cbf2aa0cf7651d510fad1bdaf7e32de51f8a14b70f161e783\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:39Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.420174 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:39Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.436843 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n7xcr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3514f381-d0d1-4e00-931e-c5ca75202a1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n7xcr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:39Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.452289 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hlxqt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"30911fe5-208f-44e8-a380-2a0093f24863\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87ebbf4f529d4e7b64c392912d06e57dc80a2237a9a4e31ab5b56ac556aa10b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b8
19eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6l8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hlxqt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:39Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.454148 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.454192 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.454204 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.454224 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.454234 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:39Z","lastTransitionTime":"2025-12-02T07:23:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.471725 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4604b0d5-cf8f-404d-af67-65da4b1dc003\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b3c366f0f27ff793a37b2944887ab1c1bc155c14d5c8d6c267442b0a8666ef8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6c3baa5b37
72a8fc64774e28c3f389f114723e7efa62b934140a4fed8b6a40d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49d7c12ea2cd199dcfa46a6f1a1a856f1285a709def9d82b6805ba256a79fd5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2543d364f902a6c34a82dac25f389cd36afc2f717d772d0a487a640d44adf30f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:39Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.484709 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://444ad02c4afa919779de59fd8c81bdce5c3426dd6451b90505921a8d5fa0c96b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-02T07:23:39Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.500121 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:39Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.514885 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bb4bea68942dbface0009fbac76f7b1253ab282e8decb9081225de7d6537ec0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3115f2c50d05ee406f583937e39b1b55f364ce7cbeb0788b75941a22e23f57f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:39Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.537762 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"afc3334a-0153-4dcc-9a56-92f6cae51c08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w54m4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:39Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.553818 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2db13abc9ba56d955814e75ceea54b92610a00daeefa29aa1aad584e3453728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:39Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.557665 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.557720 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.557731 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.557769 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.557783 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:39Z","lastTransitionTime":"2025-12-02T07:23:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.569323 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4604b0d5-cf8f-404d-af67-65da4b1dc003\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b3c366f0f27ff793a37b2944887ab1c1bc155c14d5c8d6c267442b0a8666ef8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6c3baa5b37
72a8fc64774e28c3f389f114723e7efa62b934140a4fed8b6a40d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49d7c12ea2cd199dcfa46a6f1a1a856f1285a709def9d82b6805ba256a79fd5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2543d364f902a6c34a82dac25f389cd36afc2f717d772d0a487a640d44adf30f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:39Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.579848 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://444ad02c4afa919779de59fd8c81bdce5c3426dd6451b90505921a8d5fa0c96b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-02T07:23:39Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.594575 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:39Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.615586 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n7xcr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3514f381-d0d1-4e00-931e-c5ca75202a1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3deeb37702c8243a956f2be1bb61e2693bb9be5453d775bc3f3e85618d483015\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\
":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\"
:\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPa
th\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n7xcr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:39Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.628955 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hlxqt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"30911fe5-208f-44e8-a380-2a0093f24863\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87ebbf4f529d4e7b64c392912d06e57dc80a2237a9a4e31ab5b56ac556aa10b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6l8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hlxqt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:39Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.642099 4895 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:39Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.657278 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2db13abc9ba56d955814e75ceea54b92610a00daeefa29aa1aad584e3453728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:39Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.660492 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.660538 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.660547 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.660568 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.660583 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:39Z","lastTransitionTime":"2025-12-02T07:23:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.672909 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bb4bea68942dbface0009fbac76f7b1253ab282e8decb9081225de7d6537ec0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3115f2c50d05ee406f583937e39b1b55f364ce7cbeb0788b75941a22e23f57f9\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:39Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.695618 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"afc3334a-0153-4dcc-9a56-92f6cae51c08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w54m4\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:39Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.712918 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b7fa1ad-7e38-4645-ab41-c7d395f5c10e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40e020a2b4caf39d1661cee9c6b0055eca86e0ae9b81d2da7485ca9ec92dd0aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33e300332628d3baf9041b52dc7b8cd9dab1e9c5da76572e86b787433fd5fb92\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://4455fec2a817e7266e152d3f4afbede483504d6066f1a0f7375bce282ae10f8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b674e20b2199fd0083aed9cf001d27a72d31ca6c62cbe376c9510cc94f37d312\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9998dffc20c5b0579e6214d10541b9702e9c4a6563c930a41e49b6853b832ba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T07:23:32Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 07:23:21.537254 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 07:23:21.538901 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1012964219/tls.crt::/tmp/serving-cert-1012964219/tls.key\\\\\\\"\\\\nI1202 07:23:32.848375 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 07:23:32.853597 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 07:23:32.853643 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 07:23:32.853680 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 07:23:32.853691 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 07:23:32.869490 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 07:23:32.869536 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:23:32.869542 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:23:32.869555 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 07:23:32.869559 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 07:23:32.869564 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 07:23:32.869568 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 07:23:32.869624 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 07:23:32.874299 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d318f71ad32dd4a410093b85741038e468d540edd30f112234a9d595bc6fd85f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0faa0995131a1f9cbf2aa0cf7651d510fad1bdaf7e32de51f8a14b70f161e783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0faa0995131a1f9cbf2aa0cf7651d510fad1bdaf7e32de51f8a14b70f161e783\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:39Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.726763 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:39Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.738040 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-74fkh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b212bd8-f1a7-4982-b2e6-499c13a34b0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4d58e6f37ee8f90af6aad88138c39e9a0753c16cceb763c2bd2ec9c30087d92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhpwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-74fkh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:39Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.750697 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0468d2d1-a975-45a6-af9f-47adc432fab0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebf8583b59d7955530da7a83a29f9994ba139c076cf7b76dc94e88d14f39e1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vfwjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1474e8f5989607f3b95fefe811e3d787320f04c223edb6ffe47d2b37863ea874\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vfwjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wfcg7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:39Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.763037 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.763075 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.763085 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.763105 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.763117 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:39Z","lastTransitionTime":"2025-12-02T07:23:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.766255 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2db13abc9ba56d955814e75ceea54b92610a00daeefa29aa1aad584e3453728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:39Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.779785 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bb4bea68942dbface0009fbac76f7b1253ab282e8decb9081225de7d6537ec0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-id
entity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3115f2c50d05ee406f583937e39b1b55f364ce7cbeb0788b75941a22e23f57f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:39Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.805252 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"afc3334a-0153-4dcc-9a56-92f6cae51c08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w54m4\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:39Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.817443 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0468d2d1-a975-45a6-af9f-47adc432fab0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebf8583b59d7955530da7a83a29f9994ba139c076cf7b76dc94e88d14f39e1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"sta
rtedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vfwjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1474e8f5989607f3b95fefe811e3d787320f04c223edb6ffe47d2b37863ea874\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vfwjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wfcg7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:39Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 
07:23:39.832698 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b7fa1ad-7e38-4645-ab41-c7d395f5c10e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40e020a2b4caf39d1661cee9c6b0055eca86e0ae9b81d2da7485ca9ec92dd0aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"reso
urce-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33e300332628d3baf9041b52dc7b8cd9dab1e9c5da76572e86b787433fd5fb92\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4455fec2a817e7266e152d3f4afbede483504d6066f1a0f7375bce282ae10f8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b674e20b2199fd0083aed9cf001d27a72d31ca6c62cbe376c9510cc94f37d312\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e2775
3fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9998dffc20c5b0579e6214d10541b9702e9c4a6563c930a41e49b6853b832ba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T07:23:32Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 07:23:21.537254 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 07:23:21.538901 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1012964219/tls.crt::/tmp/serving-cert-1012964219/tls.key\\\\\\\"\\\\nI1202 07:23:32.848375 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 07:23:32.853597 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 07:23:32.853643 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 07:23:32.853680 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 07:23:32.853691 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 07:23:32.869490 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 07:23:32.869536 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:23:32.869542 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:23:32.869555 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 
07:23:32.869559 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 07:23:32.869564 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 07:23:32.869568 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 07:23:32.869624 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 07:23:32.874299 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d318f71ad32dd4a410093b85741038e468d540edd30f112234a9d595bc6fd85f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0faa0995131a1f9cbf2aa0cf7651d510fad1bdaf7e32de51f8a14b70f161e783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06b
c35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0faa0995131a1f9cbf2aa0cf7651d510fad1bdaf7e32de51f8a14b70f161e783\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:39Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.865676 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.865716 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.865728 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.865760 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 
07:23:39.865773 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:39Z","lastTransitionTime":"2025-12-02T07:23:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.871220 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:39Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.907867 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-74fkh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b212bd8-f1a7-4982-b2e6-499c13a34b0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4d58e6f37ee8f90af6aad88138c39e9a0753c16cceb763c2bd2ec9c30087d92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhpwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-74fkh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:39Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.952934 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hlxqt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"30911fe5-208f-44e8-a380-2a0093f24863\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87ebbf4f529d4e7b64c392912d06e57dc80a2237a9a4e31ab5b56ac556aa10b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6l8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hlxqt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:39Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.969036 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.969100 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.969111 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.969131 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.969143 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:39Z","lastTransitionTime":"2025-12-02T07:23:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:39 crc kubenswrapper[4895]: I1202 07:23:39.991536 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4604b0d5-cf8f-404d-af67-65da4b1dc003\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b3c366f0f27ff793a37b2944887ab1c1bc155c14d5c8d6c267442b0a8666ef8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6c3baa5b37
72a8fc64774e28c3f389f114723e7efa62b934140a4fed8b6a40d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49d7c12ea2cd199dcfa46a6f1a1a856f1285a709def9d82b6805ba256a79fd5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2543d364f902a6c34a82dac25f389cd36afc2f717d772d0a487a640d44adf30f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:39Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:40 crc kubenswrapper[4895]: I1202 07:23:40.030296 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://444ad02c4afa919779de59fd8c81bdce5c3426dd6451b90505921a8d5fa0c96b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-02T07:23:40Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:40 crc kubenswrapper[4895]: I1202 07:23:40.069974 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:40Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:40 crc kubenswrapper[4895]: I1202 07:23:40.071561 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:40 crc kubenswrapper[4895]: I1202 07:23:40.071905 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:40 crc kubenswrapper[4895]: I1202 07:23:40.071915 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:40 crc kubenswrapper[4895]: I1202 07:23:40.071930 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:40 crc kubenswrapper[4895]: I1202 07:23:40.071938 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:40Z","lastTransitionTime":"2025-12-02T07:23:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:23:40 crc kubenswrapper[4895]: I1202 07:23:40.110382 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n7xcr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3514f381-d0d1-4e00-931e-c5ca75202a1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3deeb37702c8243a956f2be1bb61e2693bb9be5453d775bc3f3e85618d483015\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"ima
ge\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69
b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volum
eMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n7xcr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:40Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:40 crc kubenswrapper[4895]: I1202 07:23:40.139989 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:23:40 crc kubenswrapper[4895]: E1202 07:23:40.140367 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 07:23:40 crc kubenswrapper[4895]: I1202 07:23:40.149477 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:40Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:40 crc kubenswrapper[4895]: I1202 07:23:40.174488 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:40 crc kubenswrapper[4895]: I1202 07:23:40.174529 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:40 crc kubenswrapper[4895]: I1202 07:23:40.174538 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:40 crc kubenswrapper[4895]: I1202 07:23:40.174558 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:40 crc kubenswrapper[4895]: I1202 07:23:40.174569 4895 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:40Z","lastTransitionTime":"2025-12-02T07:23:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:23:40 crc kubenswrapper[4895]: I1202 07:23:40.276691 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:40 crc kubenswrapper[4895]: I1202 07:23:40.276722 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:40 crc kubenswrapper[4895]: I1202 07:23:40.276731 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:40 crc kubenswrapper[4895]: I1202 07:23:40.276768 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:40 crc kubenswrapper[4895]: I1202 07:23:40.276778 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:40Z","lastTransitionTime":"2025-12-02T07:23:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:40 crc kubenswrapper[4895]: I1202 07:23:40.358457 4895 generic.go:334] "Generic (PLEG): container finished" podID="3514f381-d0d1-4e00-931e-c5ca75202a1b" containerID="3deeb37702c8243a956f2be1bb61e2693bb9be5453d775bc3f3e85618d483015" exitCode=0 Dec 02 07:23:40 crc kubenswrapper[4895]: I1202 07:23:40.358547 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-n7xcr" event={"ID":"3514f381-d0d1-4e00-931e-c5ca75202a1b","Type":"ContainerDied","Data":"3deeb37702c8243a956f2be1bb61e2693bb9be5453d775bc3f3e85618d483015"} Dec 02 07:23:40 crc kubenswrapper[4895]: I1202 07:23:40.361629 4895 generic.go:334] "Generic (PLEG): container finished" podID="afc3334a-0153-4dcc-9a56-92f6cae51c08" containerID="328fa773c92ef0a965065412d4141c0b41ffe38d21ed169386a9d30d05f37f28" exitCode=0 Dec 02 07:23:40 crc kubenswrapper[4895]: I1202 07:23:40.361664 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" event={"ID":"afc3334a-0153-4dcc-9a56-92f6cae51c08","Type":"ContainerDied","Data":"328fa773c92ef0a965065412d4141c0b41ffe38d21ed169386a9d30d05f37f28"} Dec 02 07:23:40 crc kubenswrapper[4895]: I1202 07:23:40.361687 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" event={"ID":"afc3334a-0153-4dcc-9a56-92f6cae51c08","Type":"ContainerStarted","Data":"49b65f34552e500042cf1b7a788b223005dea6e388c4dde91984023ebc9f0827"} Dec 02 07:23:40 crc kubenswrapper[4895]: I1202 07:23:40.371623 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b7fa1ad-7e38-4645-ab41-c7d395f5c10e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40e020a2b4caf39d1661cee9c6b0055eca86e0ae9b81d2da7485ca9ec92dd0aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33e300332628d3baf9041b52dc7b8cd9dab1e9c5da76572e86b787433fd5fb92\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4455fec2a817e7266e152d3f4afbede483504d6066f1a0f7375bce282ae10f8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b674e20b2199fd0083aed9cf001d27a72d31ca6c62cbe376c9510cc94f37d312\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9998dffc20c5b0579e6214d10541b9702e9c4a6563c930a41e49b6853b832ba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T07:23:32Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 07:23:21.537254 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 07:23:21.538901 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1012964219/tls.crt::/tmp/serving-cert-1012964219/tls.key\\\\\\\"\\\\nI1202 07:23:32.848375 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 07:23:32.853597 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 07:23:32.853643 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 07:23:32.853680 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 07:23:32.853691 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 07:23:32.869490 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 07:23:32.869536 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:23:32.869542 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:23:32.869555 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 07:23:32.869559 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 07:23:32.869564 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 07:23:32.869568 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 07:23:32.869624 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 07:23:32.874299 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d318f71ad32dd4a410093b85741038e468d540edd30f112234a9d595bc6fd85f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0faa0995131a1f9cbf2aa0cf7651d510fad1bdaf7e32de51f8a14b70f161e783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0faa0995131a1f9cbf2aa0cf7651d510fad1bdaf7e32de51f8a14b70f161e783\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:40Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:40 crc kubenswrapper[4895]: I1202 07:23:40.385647 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:40 crc kubenswrapper[4895]: I1202 07:23:40.385688 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:40 crc kubenswrapper[4895]: I1202 07:23:40.385697 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:40 crc kubenswrapper[4895]: I1202 07:23:40.385716 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:40 crc kubenswrapper[4895]: I1202 07:23:40.385751 4895 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:40Z","lastTransitionTime":"2025-12-02T07:23:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:23:40 crc kubenswrapper[4895]: I1202 07:23:40.388849 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:40Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:40 crc kubenswrapper[4895]: I1202 07:23:40.401546 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-74fkh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b212bd8-f1a7-4982-b2e6-499c13a34b0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4d58e6f37ee8f90af6aad88138c39e9a0753c16cceb763c2bd2ec9c30087d92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhpwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-74fkh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:40Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:40 crc kubenswrapper[4895]: I1202 07:23:40.414076 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0468d2d1-a975-45a6-af9f-47adc432fab0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebf8583b59d7955530da7a83a29f9994ba139c076cf7b76dc94e88d14f39e1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vfwjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1474e8f5989607f3b95fefe811e3d787320f04c223edb6ffe47d2b37863ea874\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vfwjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wfcg7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:40Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:40 crc kubenswrapper[4895]: I1202 07:23:40.433365 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4604b0d5-cf8f-404d-af67-65da4b1dc003\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b3c366f0f27ff793a37b2944887ab1c1bc155c14d5c8d6c267442b0a8666ef8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes
/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6c3baa5b3772a8fc64774e28c3f389f114723e7efa62b934140a4fed8b6a40d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49d7c12ea2cd199dcfa46a6f1a1a856f1285a709def9d82b6805ba256a79fd5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2543d364f902a6c34a82dac25f389cd36afc2f717d772d0a487a640d44adf30f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\
\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:40Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:40 crc kubenswrapper[4895]: I1202 07:23:40.445915 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://444ad02c4afa919779de59fd8c81bdce5c3426dd6451b90505921a8d5fa0c96b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-02T07:23:40Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:40 crc kubenswrapper[4895]: I1202 07:23:40.458520 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:40Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:40 crc kubenswrapper[4895]: I1202 07:23:40.474335 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n7xcr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3514f381-d0d1-4e00-931e-c5ca75202a1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3deeb37702c8243a956f2be1bb61e2693bb9be5453d775bc3f3e85618d483015\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://3deeb37702c8243a956f2be1bb61e2693bb9be5453d775bc3f3e85618d483015\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"rea
dy\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host
/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n7xcr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:40Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:40 crc kubenswrapper[4895]: I1202 07:23:40.488080 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:40 crc kubenswrapper[4895]: I1202 07:23:40.488108 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:40 crc 
kubenswrapper[4895]: I1202 07:23:40.488116 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:40 crc kubenswrapper[4895]: I1202 07:23:40.488131 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:40 crc kubenswrapper[4895]: I1202 07:23:40.488142 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:40Z","lastTransitionTime":"2025-12-02T07:23:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:23:40 crc kubenswrapper[4895]: I1202 07:23:40.508904 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hlxqt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"30911fe5-208f-44e8-a380-2a0093f24863\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87ebbf4f529d4e7b64c392912d06e57dc80a2237a9a4e31ab5b56ac556aa10b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6l8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hlxqt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:40Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:40 crc kubenswrapper[4895]: I1202 07:23:40.549400 4895 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:40Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:40 crc kubenswrapper[4895]: I1202 07:23:40.590428 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2db13abc9ba56d955814e75ceea54b92610a00daeefa29aa1aad584e3453728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:40Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:40 crc kubenswrapper[4895]: I1202 07:23:40.590798 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:40 crc kubenswrapper[4895]: I1202 07:23:40.590840 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:40 crc kubenswrapper[4895]: I1202 07:23:40.590850 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:40 crc kubenswrapper[4895]: I1202 07:23:40.590867 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:40 crc kubenswrapper[4895]: I1202 07:23:40.590877 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:40Z","lastTransitionTime":"2025-12-02T07:23:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:40 crc kubenswrapper[4895]: I1202 07:23:40.631870 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bb4bea68942dbface0009fbac76f7b1253ab282e8decb9081225de7d6537ec0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3115f2c50d05ee406f583937e39b1b55f364ce7cbeb0788b75941a22e23f57f9\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:40Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:40 crc kubenswrapper[4895]: I1202 07:23:40.681830 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"afc3334a-0153-4dcc-9a56-92f6cae51c08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w54m4\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:40Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:40 crc kubenswrapper[4895]: I1202 07:23:40.694633 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:40 crc kubenswrapper[4895]: I1202 07:23:40.694709 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:40 crc kubenswrapper[4895]: I1202 07:23:40.694731 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:40 crc kubenswrapper[4895]: I1202 07:23:40.694782 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:40 crc kubenswrapper[4895]: I1202 07:23:40.694801 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:40Z","lastTransitionTime":"2025-12-02T07:23:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:40 crc kubenswrapper[4895]: I1202 07:23:40.709807 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4604b0d5-cf8f-404d-af67-65da4b1dc003\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b3c366f0f27ff793a37b2944887ab1c1bc155c14d5c8d6c267442b0a8666ef8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6c3baa5b37
72a8fc64774e28c3f389f114723e7efa62b934140a4fed8b6a40d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49d7c12ea2cd199dcfa46a6f1a1a856f1285a709def9d82b6805ba256a79fd5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2543d364f902a6c34a82dac25f389cd36afc2f717d772d0a487a640d44adf30f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:40Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:40 crc kubenswrapper[4895]: I1202 07:23:40.751271 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://444ad02c4afa919779de59fd8c81bdce5c3426dd6451b90505921a8d5fa0c96b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-02T07:23:40Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:40 crc kubenswrapper[4895]: I1202 07:23:40.797115 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:40 crc kubenswrapper[4895]: I1202 07:23:40.797178 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:40 crc kubenswrapper[4895]: I1202 07:23:40.797190 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:40 crc kubenswrapper[4895]: I1202 07:23:40.797248 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:40 crc kubenswrapper[4895]: I1202 07:23:40.797263 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:40Z","lastTransitionTime":"2025-12-02T07:23:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:40 crc kubenswrapper[4895]: I1202 07:23:40.799707 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:40Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:40 crc kubenswrapper[4895]: I1202 07:23:40.830891 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n7xcr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3514f381-d0d1-4e00-931e-c5ca75202a1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3deeb37702c8243a956f2be1bb61e2693bb9be5453d775bc3f3e85618d483015\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://3deeb37702c8243a956f2be1bb61e2693bb9be5453d775bc3f3e85618d483015\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"rea
dy\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host
/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n7xcr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:40Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:40 crc kubenswrapper[4895]: I1202 07:23:40.839342 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:23:40 crc kubenswrapper[4895]: I1202 07:23:40.839597 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:23:40 crc kubenswrapper[4895]: E1202 07:23:40.839642 4895 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 02 07:23:40 crc kubenswrapper[4895]: E1202 07:23:40.839891 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-02 07:23:48.839867494 +0000 UTC m=+40.010727107 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 02 07:23:40 crc kubenswrapper[4895]: E1202 07:23:40.839720 4895 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 02 07:23:40 crc kubenswrapper[4895]: E1202 07:23:40.840093 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-12-02 07:23:48.840083911 +0000 UTC m=+40.010943524 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 02 07:23:40 crc kubenswrapper[4895]: I1202 07:23:40.870979 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hlxqt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"30911fe5-208f-44e8-a380-2a0093f24863\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87ebbf4f529d4e7b64c392912d06e57dc80a2237a9a4e31ab5b56ac556aa10b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b
67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6l8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"
podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hlxqt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:40Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:40 crc kubenswrapper[4895]: I1202 07:23:40.906346 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:40 crc kubenswrapper[4895]: I1202 07:23:40.906390 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:40 crc kubenswrapper[4895]: I1202 07:23:40.906402 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:40 crc kubenswrapper[4895]: I1202 07:23:40.906421 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:40 crc kubenswrapper[4895]: I1202 07:23:40.906441 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:40Z","lastTransitionTime":"2025-12-02T07:23:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:40 crc kubenswrapper[4895]: I1202 07:23:40.922579 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:40Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:40 crc kubenswrapper[4895]: I1202 07:23:40.940081 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 07:23:40 crc kubenswrapper[4895]: I1202 07:23:40.940195 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:23:40 crc kubenswrapper[4895]: I1202 07:23:40.940251 4895 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:23:40 crc kubenswrapper[4895]: E1202 07:23:40.940415 4895 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 02 07:23:40 crc kubenswrapper[4895]: E1202 07:23:40.940433 4895 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 02 07:23:40 crc kubenswrapper[4895]: E1202 07:23:40.940445 4895 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 07:23:40 crc kubenswrapper[4895]: E1202 07:23:40.940490 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-02 07:23:48.940475909 +0000 UTC m=+40.111335512 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 07:23:40 crc kubenswrapper[4895]: E1202 07:23:40.940796 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 07:23:48.940789059 +0000 UTC m=+40.111648672 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:23:40 crc kubenswrapper[4895]: E1202 07:23:40.940906 4895 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 02 07:23:40 crc kubenswrapper[4895]: E1202 07:23:40.940950 4895 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 02 07:23:40 crc kubenswrapper[4895]: E1202 07:23:40.940972 4895 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 07:23:40 crc kubenswrapper[4895]: E1202 07:23:40.941059 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-02 07:23:48.941034166 +0000 UTC m=+40.111893779 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 07:23:40 crc kubenswrapper[4895]: I1202 07:23:40.956527 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2db13abc9ba56d955814e75ceea54b92610a00daeefa29aa1aad584e3453728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:40Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:40 crc kubenswrapper[4895]: I1202 07:23:40.992832 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bb4bea68942dbface0009fbac76f7b1253ab282e8decb9081225de7d6537ec0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://3115f2c50d05ee406f583937e39b1b55f364ce7cbeb0788b75941a22e23f57f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:40Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:41 crc kubenswrapper[4895]: I1202 07:23:41.008902 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:41 crc kubenswrapper[4895]: I1202 07:23:41.008937 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:41 crc kubenswrapper[4895]: I1202 07:23:41.008946 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:41 crc kubenswrapper[4895]: I1202 07:23:41.008962 4895 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:41 crc kubenswrapper[4895]: I1202 07:23:41.008974 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:41Z","lastTransitionTime":"2025-12-02T07:23:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:23:41 crc kubenswrapper[4895]: I1202 07:23:41.035940 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"afc3334a-0153-4dcc-9a56-92f6cae51c08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node 
kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID
\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wai
ting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":
0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://328fa773c92ef0a965065412d4141c0b41ffe38d21ed169386a9d30d05f37f28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://328fa773c92ef0a965065412d4141c0b41ffe38d21ed169386a9d30d05f37f28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\
\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w54m4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:41Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:41 crc kubenswrapper[4895]: I1202 07:23:41.070961 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b7fa1ad-7e38-4645-ab41-c7d395f5c10e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40e020a2b4caf39d1661cee9c6b0055eca86e0ae9b81d2da7485ca9ec92dd0aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33e300332628d3baf9041b52dc7b8cd9dab1e9c5da76572e86b787433fd5fb92\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://4455fec2a817e7266e152d3f4afbede483504d6066f1a0f7375bce282ae10f8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b674e20b2199fd0083aed9cf001d27a72d31ca6c62cbe376c9510cc94f37d312\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9998dffc20c5b0579e6214d10541b9702e9c4a6563c930a41e49b6853b832ba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T07:23:32Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 07:23:21.537254 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 07:23:21.538901 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1012964219/tls.crt::/tmp/serving-cert-1012964219/tls.key\\\\\\\"\\\\nI1202 07:23:32.848375 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 07:23:32.853597 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 07:23:32.853643 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 07:23:32.853680 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 07:23:32.853691 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 07:23:32.869490 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 07:23:32.869536 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:23:32.869542 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:23:32.869555 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 07:23:32.869559 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 07:23:32.869564 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 07:23:32.869568 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 07:23:32.869624 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 07:23:32.874299 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d318f71ad32dd4a410093b85741038e468d540edd30f112234a9d595bc6fd85f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0faa0995131a1f9cbf2aa0cf7651d510fad1bdaf7e32de51f8a14b70f161e783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0faa0995131a1f9cbf2aa0cf7651d510fad1bdaf7e32de51f8a14b70f161e783\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:41Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:41 crc kubenswrapper[4895]: I1202 07:23:41.111175 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:41Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:41 crc kubenswrapper[4895]: I1202 07:23:41.112391 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:41 crc kubenswrapper[4895]: I1202 07:23:41.112455 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:41 crc kubenswrapper[4895]: I1202 07:23:41.112474 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:41 crc 
kubenswrapper[4895]: I1202 07:23:41.112504 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:41 crc kubenswrapper[4895]: I1202 07:23:41.112522 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:41Z","lastTransitionTime":"2025-12-02T07:23:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:23:41 crc kubenswrapper[4895]: I1202 07:23:41.140062 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:23:41 crc kubenswrapper[4895]: I1202 07:23:41.140174 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:23:41 crc kubenswrapper[4895]: E1202 07:23:41.140225 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 07:23:41 crc kubenswrapper[4895]: E1202 07:23:41.140405 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 07:23:41 crc kubenswrapper[4895]: I1202 07:23:41.149926 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-74fkh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b212bd8-f1a7-4982-b2e6-499c13a34b0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4d58e6f37ee8f90af6aad88138c39e9a0753c16cceb763c2bd2ec9c30087d92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhpwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-74fkh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:41Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:41 crc kubenswrapper[4895]: I1202 07:23:41.191402 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0468d2d1-a975-45a6-af9f-47adc432fab0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebf8583b59d79555
30da7a83a29f9994ba139c076cf7b76dc94e88d14f39e1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vfwjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1474e8f5989607f3b95fefe811e3d787320f04c223edb6ffe47d2b37863ea874\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vfwjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wfcg7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:41Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:41 crc kubenswrapper[4895]: I1202 07:23:41.215276 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:41 crc kubenswrapper[4895]: I1202 07:23:41.215310 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:41 crc kubenswrapper[4895]: I1202 07:23:41.215321 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:41 crc kubenswrapper[4895]: I1202 07:23:41.215337 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:41 crc kubenswrapper[4895]: I1202 07:23:41.215346 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:41Z","lastTransitionTime":"2025-12-02T07:23:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:41 crc kubenswrapper[4895]: I1202 07:23:41.318559 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:41 crc kubenswrapper[4895]: I1202 07:23:41.318604 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:41 crc kubenswrapper[4895]: I1202 07:23:41.318620 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:41 crc kubenswrapper[4895]: I1202 07:23:41.318642 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:41 crc kubenswrapper[4895]: I1202 07:23:41.318657 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:41Z","lastTransitionTime":"2025-12-02T07:23:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:41 crc kubenswrapper[4895]: I1202 07:23:41.370076 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" event={"ID":"afc3334a-0153-4dcc-9a56-92f6cae51c08","Type":"ContainerStarted","Data":"80dca705ddb94ba799eb72bfc23b0c3878cceca48d6d10f1c7f8eb0e4beed40e"} Dec 02 07:23:41 crc kubenswrapper[4895]: I1202 07:23:41.370155 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" event={"ID":"afc3334a-0153-4dcc-9a56-92f6cae51c08","Type":"ContainerStarted","Data":"f54ae324450e73bcbf2c408794171a92bbb66c24d05f29584ac2c247090c1273"} Dec 02 07:23:41 crc kubenswrapper[4895]: I1202 07:23:41.370177 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" event={"ID":"afc3334a-0153-4dcc-9a56-92f6cae51c08","Type":"ContainerStarted","Data":"5463b077f0bc8b5d38e27854ece6a71f74f7769f4552c32eefe16854993f290f"} Dec 02 07:23:41 crc kubenswrapper[4895]: I1202 07:23:41.370198 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" event={"ID":"afc3334a-0153-4dcc-9a56-92f6cae51c08","Type":"ContainerStarted","Data":"91b0c6e50085a7be0924e5a54ef006110a7f13556fe47f7f55124fd6b2f0bc36"} Dec 02 07:23:41 crc kubenswrapper[4895]: I1202 07:23:41.370216 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" event={"ID":"afc3334a-0153-4dcc-9a56-92f6cae51c08","Type":"ContainerStarted","Data":"5f330d728eecd672ecb554744a51ebfb2c104441d3a85fe0c8a6fc3446c37e4a"} Dec 02 07:23:41 crc kubenswrapper[4895]: I1202 07:23:41.370236 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" event={"ID":"afc3334a-0153-4dcc-9a56-92f6cae51c08","Type":"ContainerStarted","Data":"ca732cac495c410d89da4ce1476acd20905716cbb2baeedac9c52b654ff58573"} Dec 02 07:23:41 crc kubenswrapper[4895]: 
I1202 07:23:41.371961 4895 generic.go:334] "Generic (PLEG): container finished" podID="3514f381-d0d1-4e00-931e-c5ca75202a1b" containerID="d2c60eddf285bbbf1a5957691f56d895e5bb7461fb18ab7c495879357cfae565" exitCode=0 Dec 02 07:23:41 crc kubenswrapper[4895]: I1202 07:23:41.372018 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-n7xcr" event={"ID":"3514f381-d0d1-4e00-931e-c5ca75202a1b","Type":"ContainerDied","Data":"d2c60eddf285bbbf1a5957691f56d895e5bb7461fb18ab7c495879357cfae565"} Dec 02 07:23:41 crc kubenswrapper[4895]: I1202 07:23:41.388070 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2db13abc9ba56d955814e75ceea54b92610a00daeefa29aa1aad584e3453728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:41Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:41 crc kubenswrapper[4895]: I1202 07:23:41.408405 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bb4bea68942dbface0009fbac76f7b1253ab282e8decb9081225de7d6537ec0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3115f2c50d05ee406f583937e39b1b55f364ce7cbeb0788b75941a22e23f57f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:41Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:41 crc kubenswrapper[4895]: I1202 07:23:41.421860 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:41 crc kubenswrapper[4895]: I1202 07:23:41.421910 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:41 crc kubenswrapper[4895]: I1202 07:23:41.421926 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:41 crc kubenswrapper[4895]: I1202 07:23:41.421947 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:41 crc kubenswrapper[4895]: I1202 07:23:41.421958 4895 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:41Z","lastTransitionTime":"2025-12-02T07:23:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:23:41 crc kubenswrapper[4895]: I1202 07:23:41.429112 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"afc3334a-0153-4dcc-9a56-92f6cae51c08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://328fa773c92ef0a965065412d4141c0b41ffe38d21ed169386a9d30d05f37f28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://328fa773c92ef0a965065412d4141c0b41ffe38d21ed169386a9d30d05f37f28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w54m4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:41Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:41 crc kubenswrapper[4895]: I1202 07:23:41.445693 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b7fa1ad-7e38-4645-ab41-c7d395f5c10e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40e020a2b4caf39d1661cee9c6b0055eca86e0ae9b81d2da7485ca9ec92dd0aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33e300332628d3baf9041b52dc7b8cd9dab1e9c5da76572e86b787433fd5fb92\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://4455fec2a817e7266e152d3f4afbede483504d6066f1a0f7375bce282ae10f8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b674e20b2199fd0083aed9cf001d27a72d31ca6c62cbe376c9510cc94f37d312\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9998dffc20c5b0579e6214d10541b9702e9c4a6563c930a41e49b6853b832ba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T07:23:32Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 07:23:21.537254 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 07:23:21.538901 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1012964219/tls.crt::/tmp/serving-cert-1012964219/tls.key\\\\\\\"\\\\nI1202 07:23:32.848375 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 07:23:32.853597 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 07:23:32.853643 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 07:23:32.853680 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 07:23:32.853691 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 07:23:32.869490 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 07:23:32.869536 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:23:32.869542 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:23:32.869555 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 07:23:32.869559 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 07:23:32.869564 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 07:23:32.869568 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 07:23:32.869624 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 07:23:32.874299 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d318f71ad32dd4a410093b85741038e468d540edd30f112234a9d595bc6fd85f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0faa0995131a1f9cbf2aa0cf7651d510fad1bdaf7e32de51f8a14b70f161e783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0faa0995131a1f9cbf2aa0cf7651d510fad1bdaf7e32de51f8a14b70f161e783\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:41Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:41 crc kubenswrapper[4895]: I1202 07:23:41.465674 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:41Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:41 crc kubenswrapper[4895]: I1202 07:23:41.480029 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-74fkh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b212bd8-f1a7-4982-b2e6-499c13a34b0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4d58e6f37ee8f90af6aad88138c39e9a0753c16cceb763c2bd2ec9c30087d92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhpwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-74fkh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:41Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:41 crc kubenswrapper[4895]: I1202 07:23:41.494858 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0468d2d1-a975-45a6-af9f-47adc432fab0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebf8583b59d7955530da7a83a29f9994ba139c076cf7b76dc94e88d14f39e1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vfwjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1474e8f5989607f3b95fefe811e3d787320f04c223edb6ffe47d2b37863ea874\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vfwjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wfcg7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:41Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:41 crc kubenswrapper[4895]: I1202 07:23:41.511878 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://444ad02c4afa919779de59fd8c81bdce5c3426dd6451b90505921a8d5fa0c96b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:41Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:41 crc kubenswrapper[4895]: I1202 07:23:41.524891 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:41 crc kubenswrapper[4895]: I1202 07:23:41.524957 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:41 crc kubenswrapper[4895]: I1202 07:23:41.524973 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:41 crc kubenswrapper[4895]: I1202 07:23:41.524998 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:41 crc kubenswrapper[4895]: I1202 07:23:41.525013 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:41Z","lastTransitionTime":"2025-12-02T07:23:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:41 crc kubenswrapper[4895]: I1202 07:23:41.553469 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:41Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:41 crc kubenswrapper[4895]: I1202 07:23:41.593628 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n7xcr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3514f381-d0d1-4e00-931e-c5ca75202a1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3deeb37702c8243a956f2be1bb61e2693bb9be5453d775bc3f3e85618d483015\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://3deeb37702c8243a956f2be1bb61e2693bb9be5453d775bc3f3e85618d483015\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2c60eddf285bbbf1a5957691f56d895e5bb7461fb18ab7c495879357cfae565\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2c60eddf285bbbf1a5957691f56d895e5bb7461fb18ab7c495879357cfae565\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n7xcr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2025-12-02T07:23:41Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:41 crc kubenswrapper[4895]: I1202 07:23:41.628512 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:41 crc kubenswrapper[4895]: I1202 07:23:41.628588 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:41 crc kubenswrapper[4895]: I1202 07:23:41.628612 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:41 crc kubenswrapper[4895]: I1202 07:23:41.628645 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:41 crc kubenswrapper[4895]: I1202 07:23:41.628669 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:41Z","lastTransitionTime":"2025-12-02T07:23:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:41 crc kubenswrapper[4895]: I1202 07:23:41.637080 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hlxqt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"30911fe5-208f-44e8-a380-2a0093f24863\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87ebbf4f529d4e7b64c392912d06e57dc80a2237a9a4e31ab5b56ac556aa10b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6l8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hlxqt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:41Z 
is after 2025-08-24T17:21:41Z" Dec 02 07:23:41 crc kubenswrapper[4895]: I1202 07:23:41.673098 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4604b0d5-cf8f-404d-af67-65da4b1dc003\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b3c366f0f27ff793a37b2944887ab1c1bc155c14d5c8d6c267442b0a8666ef8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6c3baa5b3772a8fc
64774e28c3f389f114723e7efa62b934140a4fed8b6a40d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49d7c12ea2cd199dcfa46a6f1a1a856f1285a709def9d82b6805ba256a79fd5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2543d364f902a6c34a82dac25f389cd36afc2f717d772d0a487a640d44adf30f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a5
78bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:41Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:41 crc kubenswrapper[4895]: I1202 07:23:41.712134 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:41Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:41 crc kubenswrapper[4895]: I1202 07:23:41.731537 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:41 crc kubenswrapper[4895]: I1202 
07:23:41.731587 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:41 crc kubenswrapper[4895]: I1202 07:23:41.731597 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:41 crc kubenswrapper[4895]: I1202 07:23:41.731613 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:41 crc kubenswrapper[4895]: I1202 07:23:41.731625 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:41Z","lastTransitionTime":"2025-12-02T07:23:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:23:41 crc kubenswrapper[4895]: I1202 07:23:41.835430 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:41 crc kubenswrapper[4895]: I1202 07:23:41.835500 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:41 crc kubenswrapper[4895]: I1202 07:23:41.835529 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:41 crc kubenswrapper[4895]: I1202 07:23:41.835564 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:41 crc kubenswrapper[4895]: I1202 07:23:41.835593 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:41Z","lastTransitionTime":"2025-12-02T07:23:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:23:41 crc kubenswrapper[4895]: I1202 07:23:41.939070 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:41 crc kubenswrapper[4895]: I1202 07:23:41.939137 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:41 crc kubenswrapper[4895]: I1202 07:23:41.939166 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:41 crc kubenswrapper[4895]: I1202 07:23:41.939192 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:41 crc kubenswrapper[4895]: I1202 07:23:41.939213 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:41Z","lastTransitionTime":"2025-12-02T07:23:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:42 crc kubenswrapper[4895]: I1202 07:23:42.042200 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:42 crc kubenswrapper[4895]: I1202 07:23:42.042860 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:42 crc kubenswrapper[4895]: I1202 07:23:42.042916 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:42 crc kubenswrapper[4895]: I1202 07:23:42.042938 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:42 crc kubenswrapper[4895]: I1202 07:23:42.042950 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:42Z","lastTransitionTime":"2025-12-02T07:23:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:23:42 crc kubenswrapper[4895]: I1202 07:23:42.140891 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:23:42 crc kubenswrapper[4895]: E1202 07:23:42.141123 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 07:23:42 crc kubenswrapper[4895]: I1202 07:23:42.146671 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:42 crc kubenswrapper[4895]: I1202 07:23:42.146724 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:42 crc kubenswrapper[4895]: I1202 07:23:42.146773 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:42 crc kubenswrapper[4895]: I1202 07:23:42.146798 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:42 crc kubenswrapper[4895]: I1202 07:23:42.146818 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:42Z","lastTransitionTime":"2025-12-02T07:23:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:42 crc kubenswrapper[4895]: I1202 07:23:42.250638 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:42 crc kubenswrapper[4895]: I1202 07:23:42.250713 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:42 crc kubenswrapper[4895]: I1202 07:23:42.250737 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:42 crc kubenswrapper[4895]: I1202 07:23:42.250800 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:42 crc kubenswrapper[4895]: I1202 07:23:42.250820 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:42Z","lastTransitionTime":"2025-12-02T07:23:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:42 crc kubenswrapper[4895]: I1202 07:23:42.354285 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:42 crc kubenswrapper[4895]: I1202 07:23:42.354351 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:42 crc kubenswrapper[4895]: I1202 07:23:42.354366 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:42 crc kubenswrapper[4895]: I1202 07:23:42.354395 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:42 crc kubenswrapper[4895]: I1202 07:23:42.354411 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:42Z","lastTransitionTime":"2025-12-02T07:23:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:42 crc kubenswrapper[4895]: I1202 07:23:42.380039 4895 generic.go:334] "Generic (PLEG): container finished" podID="3514f381-d0d1-4e00-931e-c5ca75202a1b" containerID="1e8b39d38597192393d90cc582b8ad109a1ed22c4c14f539e1a5445e6d817872" exitCode=0 Dec 02 07:23:42 crc kubenswrapper[4895]: I1202 07:23:42.380104 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-n7xcr" event={"ID":"3514f381-d0d1-4e00-931e-c5ca75202a1b","Type":"ContainerDied","Data":"1e8b39d38597192393d90cc582b8ad109a1ed22c4c14f539e1a5445e6d817872"} Dec 02 07:23:42 crc kubenswrapper[4895]: I1202 07:23:42.410587 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2db13abc9ba56d955814e75ceea54b92610a00daeefa29aa1aad584e3453728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCoun
t\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:42Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:42 crc kubenswrapper[4895]: I1202 07:23:42.440080 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bb4bea68942dbface0009fbac76f7b1253ab282e8decb9081225de7d6537ec0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3115f2c50d05ee406f583937e39b1b55f364ce7cbeb0788b75941a22e23f57f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:42Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:42 crc kubenswrapper[4895]: I1202 07:23:42.457837 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:42 crc kubenswrapper[4895]: I1202 07:23:42.457883 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:42 crc kubenswrapper[4895]: I1202 07:23:42.457904 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:42 crc kubenswrapper[4895]: I1202 07:23:42.457928 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:42 crc kubenswrapper[4895]: I1202 07:23:42.457944 4895 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:42Z","lastTransitionTime":"2025-12-02T07:23:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:23:42 crc kubenswrapper[4895]: I1202 07:23:42.473110 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"afc3334a-0153-4dcc-9a56-92f6cae51c08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://328fa773c92ef0a965065412d4141c0b41ffe38d21ed169386a9d30d05f37f28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://328fa773c92ef0a965065412d4141c0b41ffe38d21ed169386a9d30d05f37f28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w54m4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:42Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:42 crc kubenswrapper[4895]: I1202 07:23:42.495901 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b7fa1ad-7e38-4645-ab41-c7d395f5c10e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40e020a2b4caf39d1661cee9c6b0055eca86e0ae9b81d2da7485ca9ec92dd0aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33e300332628d3baf9041b52dc7b8cd9dab1e9c5da76572e86b787433fd5fb92\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://4455fec2a817e7266e152d3f4afbede483504d6066f1a0f7375bce282ae10f8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b674e20b2199fd0083aed9cf001d27a72d31ca6c62cbe376c9510cc94f37d312\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9998dffc20c5b0579e6214d10541b9702e9c4a6563c930a41e49b6853b832ba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T07:23:32Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 07:23:21.537254 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 07:23:21.538901 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1012964219/tls.crt::/tmp/serving-cert-1012964219/tls.key\\\\\\\"\\\\nI1202 07:23:32.848375 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 07:23:32.853597 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 07:23:32.853643 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 07:23:32.853680 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 07:23:32.853691 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 07:23:32.869490 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 07:23:32.869536 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:23:32.869542 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:23:32.869555 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 07:23:32.869559 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 07:23:32.869564 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 07:23:32.869568 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 07:23:32.869624 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 07:23:32.874299 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d318f71ad32dd4a410093b85741038e468d540edd30f112234a9d595bc6fd85f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0faa0995131a1f9cbf2aa0cf7651d510fad1bdaf7e32de51f8a14b70f161e783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0faa0995131a1f9cbf2aa0cf7651d510fad1bdaf7e32de51f8a14b70f161e783\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:42Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:42 crc kubenswrapper[4895]: I1202 07:23:42.513226 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:42Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:42 crc kubenswrapper[4895]: I1202 07:23:42.523988 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-74fkh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b212bd8-f1a7-4982-b2e6-499c13a34b0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4d58e6f37ee8f90af6aad88138c39e9a0753c16cceb763c2bd2ec9c30087d92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhpwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-74fkh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:42Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:42 crc kubenswrapper[4895]: I1202 07:23:42.537462 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0468d2d1-a975-45a6-af9f-47adc432fab0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebf8583b59d7955530da7a83a29f9994ba139c076cf7b76dc94e88d14f39e1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vfwjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1474e8f5989607f3b95fefe811e3d787320f04c223edb6ffe47d2b37863ea874\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vfwjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wfcg7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:42Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:42 crc kubenswrapper[4895]: I1202 07:23:42.553688 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4604b0d5-cf8f-404d-af67-65da4b1dc003\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b3c366f0f27ff793a37b2944887ab1c1bc155c14d5c8d6c267442b0a8666ef8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes
/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6c3baa5b3772a8fc64774e28c3f389f114723e7efa62b934140a4fed8b6a40d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49d7c12ea2cd199dcfa46a6f1a1a856f1285a709def9d82b6805ba256a79fd5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2543d364f902a6c34a82dac25f389cd36afc2f717d772d0a487a640d44adf30f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\
\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:42Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:42 crc kubenswrapper[4895]: I1202 07:23:42.559786 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:42 crc kubenswrapper[4895]: I1202 07:23:42.559840 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:42 crc kubenswrapper[4895]: I1202 07:23:42.559850 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:42 crc kubenswrapper[4895]: I1202 07:23:42.559866 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:42 crc kubenswrapper[4895]: I1202 
07:23:42.559876 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:42Z","lastTransitionTime":"2025-12-02T07:23:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:23:42 crc kubenswrapper[4895]: I1202 07:23:42.567936 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://444ad02c4afa919779de59fd8c81bdce5c3426dd6451b90505921a8d5fa0c96b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\
\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:42Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:42 crc kubenswrapper[4895]: I1202 07:23:42.585632 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:42Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:42 crc kubenswrapper[4895]: I1202 07:23:42.604184 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n7xcr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3514f381-d0d1-4e00-931e-c5ca75202a1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3deeb37702c8243a956f2be1bb61e2693bb9be5453d775bc3f3e85618d483015\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3deeb37702c8243a956f2be1bb61e2693bb9be5453d775bc3f3e85618d483015\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2c60eddf285bbbf1a5957691f56d895e5bb7461fb18ab7c495879357cfae565\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2c60eddf285bbbf1a5957691f56d895e5bb7461fb18ab7c495879357cfae565\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e8b39d38597192393d90cc582b8ad109a1ed22c4c14f539e1a5445e6d817872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e8b39d38597192393d90cc582b8ad109a1ed22c4c14f539e1a5445e6d817872\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n7xcr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:42Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:42 crc kubenswrapper[4895]: I1202 
07:23:42.619640 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hlxqt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"30911fe5-208f-44e8-a380-2a0093f24863\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87ebbf4f529d4e7b64c392912d06e57dc80a2237a9a4e31ab5b56ac556aa10b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/n
et.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6l8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hlxqt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:42Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:42 crc kubenswrapper[4895]: I1202 
07:23:42.638375 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:42Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:42 crc kubenswrapper[4895]: I1202 07:23:42.662369 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:42 crc kubenswrapper[4895]: I1202 07:23:42.662407 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:42 crc kubenswrapper[4895]: I1202 07:23:42.662418 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:42 crc kubenswrapper[4895]: I1202 07:23:42.662437 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:42 crc kubenswrapper[4895]: I1202 07:23:42.662458 4895 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:42Z","lastTransitionTime":"2025-12-02T07:23:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:23:42 crc kubenswrapper[4895]: I1202 07:23:42.765058 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:42 crc kubenswrapper[4895]: I1202 07:23:42.765091 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:42 crc kubenswrapper[4895]: I1202 07:23:42.765100 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:42 crc kubenswrapper[4895]: I1202 07:23:42.765114 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:42 crc kubenswrapper[4895]: I1202 07:23:42.765122 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:42Z","lastTransitionTime":"2025-12-02T07:23:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:42 crc kubenswrapper[4895]: I1202 07:23:42.867577 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:42 crc kubenswrapper[4895]: I1202 07:23:42.867615 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:42 crc kubenswrapper[4895]: I1202 07:23:42.867624 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:42 crc kubenswrapper[4895]: I1202 07:23:42.867638 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:42 crc kubenswrapper[4895]: I1202 07:23:42.867647 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:42Z","lastTransitionTime":"2025-12-02T07:23:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:42 crc kubenswrapper[4895]: I1202 07:23:42.970322 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:42 crc kubenswrapper[4895]: I1202 07:23:42.970355 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:42 crc kubenswrapper[4895]: I1202 07:23:42.970363 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:42 crc kubenswrapper[4895]: I1202 07:23:42.970378 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:42 crc kubenswrapper[4895]: I1202 07:23:42.970388 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:42Z","lastTransitionTime":"2025-12-02T07:23:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:43 crc kubenswrapper[4895]: I1202 07:23:43.075838 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:43 crc kubenswrapper[4895]: I1202 07:23:43.076359 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:43 crc kubenswrapper[4895]: I1202 07:23:43.076614 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:43 crc kubenswrapper[4895]: I1202 07:23:43.076913 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:43 crc kubenswrapper[4895]: I1202 07:23:43.077059 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:43Z","lastTransitionTime":"2025-12-02T07:23:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:23:43 crc kubenswrapper[4895]: I1202 07:23:43.140281 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:23:43 crc kubenswrapper[4895]: E1202 07:23:43.140591 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 07:23:43 crc kubenswrapper[4895]: I1202 07:23:43.140297 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:23:43 crc kubenswrapper[4895]: E1202 07:23:43.142003 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 07:23:43 crc kubenswrapper[4895]: I1202 07:23:43.189210 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:43 crc kubenswrapper[4895]: I1202 07:23:43.189302 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:43 crc kubenswrapper[4895]: I1202 07:23:43.189378 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:43 crc kubenswrapper[4895]: I1202 07:23:43.189430 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:43 crc kubenswrapper[4895]: I1202 07:23:43.189448 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:43Z","lastTransitionTime":"2025-12-02T07:23:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:43 crc kubenswrapper[4895]: I1202 07:23:43.295102 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:43 crc kubenswrapper[4895]: I1202 07:23:43.295159 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:43 crc kubenswrapper[4895]: I1202 07:23:43.295174 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:43 crc kubenswrapper[4895]: I1202 07:23:43.295197 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:43 crc kubenswrapper[4895]: I1202 07:23:43.295210 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:43Z","lastTransitionTime":"2025-12-02T07:23:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:43 crc kubenswrapper[4895]: I1202 07:23:43.394085 4895 generic.go:334] "Generic (PLEG): container finished" podID="3514f381-d0d1-4e00-931e-c5ca75202a1b" containerID="eb5abc8da49059e23c4013f3856d3f4c9078cd8871a79f8dae5ce4d8cb319ef9" exitCode=0 Dec 02 07:23:43 crc kubenswrapper[4895]: I1202 07:23:43.394221 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-n7xcr" event={"ID":"3514f381-d0d1-4e00-931e-c5ca75202a1b","Type":"ContainerDied","Data":"eb5abc8da49059e23c4013f3856d3f4c9078cd8871a79f8dae5ce4d8cb319ef9"} Dec 02 07:23:43 crc kubenswrapper[4895]: I1202 07:23:43.397833 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:43 crc kubenswrapper[4895]: I1202 07:23:43.397898 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:43 crc kubenswrapper[4895]: I1202 07:23:43.397920 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:43 crc kubenswrapper[4895]: I1202 07:23:43.397951 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:43 crc kubenswrapper[4895]: I1202 07:23:43.397970 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:43Z","lastTransitionTime":"2025-12-02T07:23:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:43 crc kubenswrapper[4895]: I1202 07:23:43.404720 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" event={"ID":"afc3334a-0153-4dcc-9a56-92f6cae51c08","Type":"ContainerStarted","Data":"d6447684fe6189e38c9caad6dfe1384b7af01d50e2ef5f889c7b2874ba092306"} Dec 02 07:23:43 crc kubenswrapper[4895]: I1202 07:23:43.421697 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"afc3334a-0153-4dcc-9a56-92f6cae51c08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://328fa773c92ef0a965065412d4141c0b41ffe38d21ed169386a9d30d05f37f28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://328fa773c92ef0a965065412d4141c0b41ffe38d21ed169386a9d30d05f37f28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w54m4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:43Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:43 crc kubenswrapper[4895]: I1202 07:23:43.437845 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2db13abc9ba56d955814e75ceea54b92610a00daeefa29aa1aad584e3453728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:43Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:43 crc kubenswrapper[4895]: I1202 07:23:43.451521 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bb4bea68942dbface0009fbac76f7b1253ab282e8decb9081225de7d6537ec0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3115f2c50d05ee406f583937e39b1b55f364ce7cbeb0788b75941a22e23f57f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:43Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:43 crc kubenswrapper[4895]: I1202 07:23:43.463857 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-74fkh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b212bd8-f1a7-4982-b2e6-499c13a34b0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4d58e6f37ee8f90af6aad88138c39e9a0753c16cceb763c2bd2ec9c30087d92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhpwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-74fkh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:43Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:43 crc kubenswrapper[4895]: I1202 07:23:43.475697 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0468d2d1-a975-45a6-af9f-47adc432fab0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebf8583b59d7955530da7a83a29f9994ba139c076cf7b76dc94e88d14f39e1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vfwjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1474e8f5989607f3b95fefe811e3d787320f04c223edb6ffe47d2b37863ea874\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vfwjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wfcg7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:43Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:43 crc kubenswrapper[4895]: I1202 07:23:43.493876 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b7fa1ad-7e38-4645-ab41-c7d395f5c10e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40e020a2b4caf39d1661cee9c6b0055eca86e0ae9b81d2da7485ca9ec92dd0aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33e300332628d3baf9041b52dc7b8cd9dab1e9c5da76572e86b787433fd5fb92\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://4455fec2a817e7266e152d3f4afbede483504d6066f1a0f7375bce282ae10f8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b674e20b2199fd0083aed9cf001d27a72d31ca6c62cbe376c9510cc94f37d312\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9998dffc20c5b0579e6214d10541b9702e9c4a6563c930a41e49b6853b832ba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T07:23:32Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 07:23:21.537254 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 07:23:21.538901 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1012964219/tls.crt::/tmp/serving-cert-1012964219/tls.key\\\\\\\"\\\\nI1202 07:23:32.848375 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 07:23:32.853597 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 07:23:32.853643 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 07:23:32.853680 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 07:23:32.853691 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 07:23:32.869490 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 07:23:32.869536 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:23:32.869542 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:23:32.869555 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 07:23:32.869559 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 07:23:32.869564 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 07:23:32.869568 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 07:23:32.869624 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 07:23:32.874299 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d318f71ad32dd4a410093b85741038e468d540edd30f112234a9d595bc6fd85f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0faa0995131a1f9cbf2aa0cf7651d510fad1bdaf7e32de51f8a14b70f161e783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0faa0995131a1f9cbf2aa0cf7651d510fad1bdaf7e32de51f8a14b70f161e783\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:43Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:43 crc kubenswrapper[4895]: I1202 07:23:43.503131 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:43 crc kubenswrapper[4895]: I1202 07:23:43.503188 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:43 crc kubenswrapper[4895]: I1202 07:23:43.503205 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:43 crc kubenswrapper[4895]: I1202 07:23:43.503233 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:43 crc kubenswrapper[4895]: I1202 07:23:43.503251 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:43Z","lastTransitionTime":"2025-12-02T07:23:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:43 crc kubenswrapper[4895]: I1202 07:23:43.508091 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:43Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:43 crc kubenswrapper[4895]: I1202 07:23:43.528450 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n7xcr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3514f381-d0d1-4e00-931e-c5ca75202a1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3deeb37702c8243a956f2be1bb61e2693bb9be5453d775bc3f3e85618d483015\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://3deeb37702c8243a956f2be1bb61e2693bb9be5453d775bc3f3e85618d483015\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2c60eddf285bbbf1a5957691f56d895e5bb7461fb18ab7c495879357cfae565\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2c60eddf285bbbf1a5957691f56d895e5bb7461fb18ab7c495879357cfae565\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e8b39d38597192393d90cc582b8ad109a1ed22c4c14f539e1a5445e6d817872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e8b39d38597192393d90cc582b8ad109a1ed22c4c14f539e1a5445e6d817872\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb5abc8da49059e23c4013f3856d3f4c9078cd8871a79f8dae5ce4d8cb319ef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb5abc8da49059e23c4013f3856d3f4c9078cd8871a79f8dae5ce4d8cb319ef9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\
\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n7xcr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:43Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:43 crc kubenswrapper[4895]: I1202 07:23:43.549425 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hlxqt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"30911fe5-208f-44e8-a380-2a0093f24863\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87ebbf4f529d4e7b64c392912d06e57dc80a2237a9a4e31ab5b56ac556aa10b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6l8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hlxqt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:43Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:43 crc kubenswrapper[4895]: I1202 07:23:43.567635 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4604b0d5-cf8f-404d-af67-65da4b1dc003\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b3c366f0f27ff793a37b2944887ab1c1bc155c14d5c8d6c267442b0a8666ef8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6c3baa5b3772a8fc64774e28c3f389f114723e7efa62b934140a4fed8b6a40d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49d7c12ea2cd199dcfa46a6f1a1a856f1285a709def9d82b6805ba256a79fd5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2543d364f902a6c34a82dac25f389cd36afc2f717d772d0a487a640d44adf30f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{
\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:43Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:43 crc kubenswrapper[4895]: I1202 07:23:43.586717 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://444ad02c4afa919779de59fd8c81bdce5c3426dd6451b90505921a8d5fa0c96b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-02T07:23:43Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:43 crc kubenswrapper[4895]: I1202 07:23:43.603844 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:43Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:43 crc kubenswrapper[4895]: I1202 07:23:43.606965 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:43 crc kubenswrapper[4895]: I1202 07:23:43.606988 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:43 crc kubenswrapper[4895]: I1202 07:23:43.606996 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:43 crc kubenswrapper[4895]: I1202 07:23:43.607016 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:43 crc kubenswrapper[4895]: I1202 07:23:43.607028 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:43Z","lastTransitionTime":"2025-12-02T07:23:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:23:43 crc kubenswrapper[4895]: I1202 07:23:43.619011 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:43Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:43 crc kubenswrapper[4895]: I1202 07:23:43.709830 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:43 crc kubenswrapper[4895]: I1202 07:23:43.709915 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:43 crc kubenswrapper[4895]: I1202 07:23:43.709942 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:43 crc kubenswrapper[4895]: I1202 07:23:43.709976 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:43 crc kubenswrapper[4895]: I1202 07:23:43.709998 4895 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:43Z","lastTransitionTime":"2025-12-02T07:23:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:23:43 crc kubenswrapper[4895]: I1202 07:23:43.813512 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:43 crc kubenswrapper[4895]: I1202 07:23:43.813569 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:43 crc kubenswrapper[4895]: I1202 07:23:43.813589 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:43 crc kubenswrapper[4895]: I1202 07:23:43.813619 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:43 crc kubenswrapper[4895]: I1202 07:23:43.813638 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:43Z","lastTransitionTime":"2025-12-02T07:23:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:43 crc kubenswrapper[4895]: I1202 07:23:43.921032 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:43 crc kubenswrapper[4895]: I1202 07:23:43.921088 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:43 crc kubenswrapper[4895]: I1202 07:23:43.921105 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:43 crc kubenswrapper[4895]: I1202 07:23:43.921127 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:43 crc kubenswrapper[4895]: I1202 07:23:43.921178 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:43Z","lastTransitionTime":"2025-12-02T07:23:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:23:43 crc kubenswrapper[4895]: I1202 07:23:43.937811 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-lhpbd"] Dec 02 07:23:43 crc kubenswrapper[4895]: I1202 07:23:43.938261 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-lhpbd" Dec 02 07:23:43 crc kubenswrapper[4895]: I1202 07:23:43.941123 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 02 07:23:43 crc kubenswrapper[4895]: I1202 07:23:43.941291 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 02 07:23:43 crc kubenswrapper[4895]: I1202 07:23:43.941556 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 02 07:23:43 crc kubenswrapper[4895]: I1202 07:23:43.941820 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 02 07:23:43 crc kubenswrapper[4895]: I1202 07:23:43.965717 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2db13abc9ba56d955814e75ceea54b92610a00daeefa29aa1aad584e3453728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:43Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:43 crc kubenswrapper[4895]: I1202 07:23:43.976233 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/81128292-8c02-45bd-9b25-e9457e989975-host\") pod \"node-ca-lhpbd\" (UID: \"81128292-8c02-45bd-9b25-e9457e989975\") " pod="openshift-image-registry/node-ca-lhpbd" Dec 02 07:23:43 crc kubenswrapper[4895]: I1202 07:23:43.976293 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2z5n2\" (UniqueName: \"kubernetes.io/projected/81128292-8c02-45bd-9b25-e9457e989975-kube-api-access-2z5n2\") pod \"node-ca-lhpbd\" (UID: \"81128292-8c02-45bd-9b25-e9457e989975\") " pod="openshift-image-registry/node-ca-lhpbd" Dec 02 07:23:43 crc kubenswrapper[4895]: I1202 07:23:43.976370 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/81128292-8c02-45bd-9b25-e9457e989975-serviceca\") pod \"node-ca-lhpbd\" (UID: \"81128292-8c02-45bd-9b25-e9457e989975\") " pod="openshift-image-registry/node-ca-lhpbd" Dec 02 07:23:43 crc kubenswrapper[4895]: I1202 07:23:43.981720 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bb4bea68942dbface0009fbac76f7b1253ab282e8decb9081225de7d6537ec0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3115f2c50d05ee406f583937e39b1b55f364ce7cbeb0788b75941a22e23f57f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:43Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:44 crc kubenswrapper[4895]: I1202 07:23:44.005563 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"afc3334a-0153-4dcc-9a56-92f6cae51c08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://328fa773c92ef0a965065412d4141c0b41ffe38d21ed169386a9d30d05f37f28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://328fa773c92ef0a965065412d4141c0b41ffe38d21ed169386a9d30d05f37f28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w54m4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:44Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:44 crc kubenswrapper[4895]: I1202 07:23:44.024416 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:44 crc kubenswrapper[4895]: I1202 07:23:44.024461 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:44 crc kubenswrapper[4895]: I1202 07:23:44.024478 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:44 crc kubenswrapper[4895]: I1202 07:23:44.024505 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:44 crc kubenswrapper[4895]: I1202 07:23:44.024519 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:44Z","lastTransitionTime":"2025-12-02T07:23:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:44 crc kubenswrapper[4895]: I1202 07:23:44.026110 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b7fa1ad-7e38-4645-ab41-c7d395f5c10e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40e020a2b4caf39d1661cee9c6b0055eca86e0ae9b81d2da7485ca9ec92dd0aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33e300332628d3baf9041b52dc7b8cd9dab1e9c5da76572e86b787433fd5fb92\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://4455fec2a817e7266e152d3f4afbede483504d6066f1a0f7375bce282ae10f8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b674e20b2199fd0083aed9cf001d27a72d31ca6c62cbe376c9510cc94f37d312\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9998dffc20c5b0579e6214d10541b9702e9c4a6563c930a41e49b6853b832ba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T07:23:32Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 07:23:21.537254 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 07:23:21.538901 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1012964219/tls.crt::/tmp/serving-cert-1012964219/tls.key\\\\\\\"\\\\nI1202 07:23:32.848375 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 07:23:32.853597 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 07:23:32.853643 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 07:23:32.853680 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 07:23:32.853691 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 07:23:32.869490 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 07:23:32.869536 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:23:32.869542 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:23:32.869555 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 07:23:32.869559 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 07:23:32.869564 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 07:23:32.869568 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 07:23:32.869624 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 07:23:32.874299 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d318f71ad32dd4a410093b85741038e468d540edd30f112234a9d595bc6fd85f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0faa0995131a1f9cbf2aa0cf7651d510fad1bdaf7e32de51f8a14b70f161e783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0faa0995131a1f9cbf2aa0cf7651d510fad1bdaf7e32de51f8a14b70f161e783\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:44Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:44 crc kubenswrapper[4895]: I1202 07:23:44.048250 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:44Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:44 crc kubenswrapper[4895]: I1202 07:23:44.065419 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-74fkh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b212bd8-f1a7-4982-b2e6-499c13a34b0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4d58e6f37ee8f90af6aad88138c39e9a0753c16cceb763c2bd2ec9c30087d92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhpwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-74fkh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:44Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:44 crc kubenswrapper[4895]: I1202 07:23:44.077888 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/81128292-8c02-45bd-9b25-e9457e989975-serviceca\") pod \"node-ca-lhpbd\" (UID: \"81128292-8c02-45bd-9b25-e9457e989975\") " pod="openshift-image-registry/node-ca-lhpbd" Dec 02 07:23:44 crc kubenswrapper[4895]: I1202 07:23:44.078503 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/81128292-8c02-45bd-9b25-e9457e989975-host\") pod \"node-ca-lhpbd\" (UID: \"81128292-8c02-45bd-9b25-e9457e989975\") " pod="openshift-image-registry/node-ca-lhpbd" Dec 02 07:23:44 crc kubenswrapper[4895]: I1202 07:23:44.078545 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2z5n2\" (UniqueName: \"kubernetes.io/projected/81128292-8c02-45bd-9b25-e9457e989975-kube-api-access-2z5n2\") pod \"node-ca-lhpbd\" (UID: \"81128292-8c02-45bd-9b25-e9457e989975\") " pod="openshift-image-registry/node-ca-lhpbd" Dec 02 07:23:44 crc kubenswrapper[4895]: I1202 07:23:44.078644 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/81128292-8c02-45bd-9b25-e9457e989975-host\") pod \"node-ca-lhpbd\" (UID: \"81128292-8c02-45bd-9b25-e9457e989975\") " pod="openshift-image-registry/node-ca-lhpbd" Dec 02 07:23:44 crc kubenswrapper[4895]: I1202 07:23:44.080320 4895 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/81128292-8c02-45bd-9b25-e9457e989975-serviceca\") pod \"node-ca-lhpbd\" (UID: \"81128292-8c02-45bd-9b25-e9457e989975\") " pod="openshift-image-registry/node-ca-lhpbd" Dec 02 07:23:44 crc kubenswrapper[4895]: I1202 07:23:44.081127 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0468d2d1-a975-45a6-af9f-47adc432fab0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebf8583b59d7955530da7a83a29f9994ba139c076cf7b76dc94e88d14f39e1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"
running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vfwjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1474e8f5989607f3b95fefe811e3d787320f04c223edb6ffe47d2b37863ea874\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vfwjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wfcg7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:44Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:44 crc kubenswrapper[4895]: 
I1202 07:23:44.108324 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2z5n2\" (UniqueName: \"kubernetes.io/projected/81128292-8c02-45bd-9b25-e9457e989975-kube-api-access-2z5n2\") pod \"node-ca-lhpbd\" (UID: \"81128292-8c02-45bd-9b25-e9457e989975\") " pod="openshift-image-registry/node-ca-lhpbd" Dec 02 07:23:44 crc kubenswrapper[4895]: I1202 07:23:44.118056 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4604b0d5-cf8f-404d-af67-65da4b1dc003\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b3c366f0f27ff793a37b2944887ab1c1bc155c14d5c8d6c267442b0a8666ef8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"s
tartedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6c3baa5b3772a8fc64774e28c3f389f114723e7efa62b934140a4fed8b6a40d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49d7c12ea2cd199dcfa46a6f1a1a856f1285a709def9d82b6805ba256a79fd5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2543d364f902a6c34a82dac25f389cd
36afc2f717d772d0a487a640d44adf30f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:44Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:44 crc kubenswrapper[4895]: I1202 07:23:44.128982 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:44 crc kubenswrapper[4895]: I1202 07:23:44.129067 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:44 crc kubenswrapper[4895]: I1202 07:23:44.129102 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 
07:23:44 crc kubenswrapper[4895]: I1202 07:23:44.129145 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:44 crc kubenswrapper[4895]: I1202 07:23:44.129176 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:44Z","lastTransitionTime":"2025-12-02T07:23:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:23:44 crc kubenswrapper[4895]: I1202 07:23:44.133551 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://444ad02c4afa919779de59fd8c81bdce5c3426dd6451b90505921a8d5fa0c96b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:44Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:44 crc kubenswrapper[4895]: I1202 07:23:44.140556 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:23:44 crc kubenswrapper[4895]: E1202 07:23:44.140659 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 07:23:44 crc kubenswrapper[4895]: I1202 07:23:44.154083 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:44Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:44 crc kubenswrapper[4895]: I1202 07:23:44.176693 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n7xcr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3514f381-d0d1-4e00-931e-c5ca75202a1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3deeb37702c8243a956f2be1bb61e2693bb9be5453d775bc3f3e85618d483015\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://3deeb37702c8243a956f2be1bb61e2693bb9be5453d775bc3f3e85618d483015\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2c60eddf285bbbf1a5957691f56d895e5bb7461fb18ab7c495879357cfae565\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2c60eddf285bbbf1a5957691f56d895e5bb7461fb18ab7c495879357cfae565\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e8b39d38597192393d90cc582b8ad109a1ed22c4c14f539e1a5445e6d817872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e8b39d38597192393d90cc582b8ad109a1ed22c4c14f539e1a5445e6d817872\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb5abc8da49059e23c4013f3856d3f4c9078cd8871a79f8dae5ce4d8cb319ef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb5abc8da49059e23c4013f3856d3f4c9078cd8871a79f8dae5ce4d8cb319ef9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\
\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n7xcr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:44Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:44 crc kubenswrapper[4895]: I1202 07:23:44.195305 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hlxqt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"30911fe5-208f-44e8-a380-2a0093f24863\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87ebbf4f529d4e7b64c392912d06e57dc80a2237a9a4e31ab5b56ac556aa10b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6l8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hlxqt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:44Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:44 crc kubenswrapper[4895]: I1202 07:23:44.248171 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-lhpbd" Dec 02 07:23:44 crc kubenswrapper[4895]: I1202 07:23:44.248933 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:44Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:44 crc kubenswrapper[4895]: I1202 07:23:44.251348 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:44 crc kubenswrapper[4895]: I1202 07:23:44.251468 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:44 crc kubenswrapper[4895]: I1202 07:23:44.251549 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:44 crc kubenswrapper[4895]: I1202 07:23:44.251677 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:44 crc kubenswrapper[4895]: I1202 07:23:44.251797 4895 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:44Z","lastTransitionTime":"2025-12-02T07:23:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:23:44 crc kubenswrapper[4895]: I1202 07:23:44.274188 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lhpbd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"81128292-8c02-45bd-9b25-e9457e989975\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:43Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2z5n2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lhpbd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:44Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:44 crc kubenswrapper[4895]: W1202 07:23:44.282050 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod81128292_8c02_45bd_9b25_e9457e989975.slice/crio-83602ceeb13395aa0ddd7522ce8bb2fbf50ba50a14e87ac2001c683139ef3a88 WatchSource:0}: Error finding container 83602ceeb13395aa0ddd7522ce8bb2fbf50ba50a14e87ac2001c683139ef3a88: Status 404 returned error can't find the container with id 
83602ceeb13395aa0ddd7522ce8bb2fbf50ba50a14e87ac2001c683139ef3a88 Dec 02 07:23:44 crc kubenswrapper[4895]: I1202 07:23:44.353984 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:44 crc kubenswrapper[4895]: I1202 07:23:44.354013 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:44 crc kubenswrapper[4895]: I1202 07:23:44.354021 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:44 crc kubenswrapper[4895]: I1202 07:23:44.354036 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:44 crc kubenswrapper[4895]: I1202 07:23:44.354045 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:44Z","lastTransitionTime":"2025-12-02T07:23:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:44 crc kubenswrapper[4895]: I1202 07:23:44.414538 4895 generic.go:334] "Generic (PLEG): container finished" podID="3514f381-d0d1-4e00-931e-c5ca75202a1b" containerID="cbdd3e8943b356a485d900737682f0903dc04aefa33498ee5cdf3c82b6570852" exitCode=0 Dec 02 07:23:44 crc kubenswrapper[4895]: I1202 07:23:44.414651 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-n7xcr" event={"ID":"3514f381-d0d1-4e00-931e-c5ca75202a1b","Type":"ContainerDied","Data":"cbdd3e8943b356a485d900737682f0903dc04aefa33498ee5cdf3c82b6570852"} Dec 02 07:23:44 crc kubenswrapper[4895]: I1202 07:23:44.417001 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-lhpbd" event={"ID":"81128292-8c02-45bd-9b25-e9457e989975","Type":"ContainerStarted","Data":"83602ceeb13395aa0ddd7522ce8bb2fbf50ba50a14e87ac2001c683139ef3a88"} Dec 02 07:23:44 crc kubenswrapper[4895]: I1202 07:23:44.437216 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n7xcr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3514f381-d0d1-4e00-931e-c5ca75202a1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3deeb37702c8243a956f2be1bb61e2693bb9be5453d775bc3f3e85618d483015\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containe
rID\\\":\\\"cri-o://3deeb37702c8243a956f2be1bb61e2693bb9be5453d775bc3f3e85618d483015\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2c60eddf285bbbf1a5957691f56d895e5bb7461fb18ab7c495879357cfae565\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2c60eddf285bbbf1a5957691f56d895e5bb7461fb18ab7c495879357cfae565\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-
allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e8b39d38597192393d90cc582b8ad109a1ed22c4c14f539e1a5445e6d817872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e8b39d38597192393d90cc582b8ad109a1ed22c4c14f539e1a5445e6d817872\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb5abc8da49059e23c4013f3856d3f4c9078cd8871a79f8dae5ce4d8cb319ef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":f
alse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb5abc8da49059e23c4013f3856d3f4c9078cd8871a79f8dae5ce4d8cb319ef9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbdd3e8943b356a485d900737682f0903dc04aefa33498ee5cdf3c82b6570852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbdd3e8943b356a485d900737682f0903dc04aefa33498ee5cdf3c82b6570852\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n7xcr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:44Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:44 crc kubenswrapper[4895]: I1202 07:23:44.451324 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hlxqt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"30911fe5-208f-44e8-a380-2a0093f24863\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87ebbf4f529d4e7b64c392912d06e57dc80a2237a9a4e31ab5b56ac556aa10b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6l8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hlxqt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:44Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:44 crc kubenswrapper[4895]: I1202 07:23:44.456403 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:44 crc 
kubenswrapper[4895]: I1202 07:23:44.456445 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:44 crc kubenswrapper[4895]: I1202 07:23:44.456459 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:44 crc kubenswrapper[4895]: I1202 07:23:44.456484 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:44 crc kubenswrapper[4895]: I1202 07:23:44.456500 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:44Z","lastTransitionTime":"2025-12-02T07:23:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:23:44 crc kubenswrapper[4895]: I1202 07:23:44.463552 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4604b0d5-cf8f-404d-af67-65da4b1dc003\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b3c366f0f27ff793a37b2944887ab1c1bc155c14d5c8d6c267442b0a8666ef8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6c3baa5b3772a8fc64774e28c3f389f114723e7efa62b934140a4fed8b6a40d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49d7c12ea2cd199dcfa46a6f1a1a856f1285a709def9d82b6805ba256a79fd5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2543d364f902a6c34a82dac25f389cd36afc2f717d772d0a487a640d44adf30f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:44Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:44 crc kubenswrapper[4895]: I1202 07:23:44.478059 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://444ad02c4afa919779de59fd8c81bdce5c3426dd6451b90505921a8d5fa0c96b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-02T07:23:44Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:44 crc kubenswrapper[4895]: I1202 07:23:44.495303 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:44Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:44 crc kubenswrapper[4895]: I1202 07:23:44.511841 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:44Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:44 crc kubenswrapper[4895]: I1202 07:23:44.524869 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lhpbd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"81128292-8c02-45bd-9b25-e9457e989975\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:43Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2z5n2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lhpbd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:44Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:44 crc kubenswrapper[4895]: I1202 07:23:44.554158 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"afc3334a-0153-4dcc-9a56-92f6cae51c08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://328fa773c92ef0a965065412d4141c0b41ffe38d21ed169386a9d30d05f37f28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://328fa773c92ef0a965065412d4141c0b41ffe38d21ed169386a9d30d05f37f28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w54m4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:44Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:44 crc kubenswrapper[4895]: I1202 07:23:44.559728 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:44 crc kubenswrapper[4895]: I1202 07:23:44.559796 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:44 crc kubenswrapper[4895]: I1202 07:23:44.559808 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:44 crc kubenswrapper[4895]: I1202 07:23:44.559829 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:44 crc kubenswrapper[4895]: I1202 07:23:44.559840 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:44Z","lastTransitionTime":"2025-12-02T07:23:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:44 crc kubenswrapper[4895]: I1202 07:23:44.573408 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2db13abc9ba56d955814e75ceea54b92610a00daeefa29aa1aad584e3453728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:44Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:44 crc kubenswrapper[4895]: I1202 07:23:44.589683 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bb4bea68942dbface0009fbac76f7b1253ab282e8decb9081225de7d6537ec0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-id
entity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3115f2c50d05ee406f583937e39b1b55f364ce7cbeb0788b75941a22e23f57f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:44Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:44 crc kubenswrapper[4895]: I1202 07:23:44.601013 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-74fkh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b212bd8-f1a7-4982-b2e6-499c13a34b0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4d58e6f37ee8f90af6aad88138c39e9a0753c16cceb763c2bd2ec9c30087d92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhpwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-74fkh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:44Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:44 crc kubenswrapper[4895]: I1202 07:23:44.616030 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0468d2d1-a975-45a6-af9f-47adc432fab0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebf8583b59d7955530da7a83a29f9994ba139c076cf7b76dc94e88d14f39e1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vfwjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1474e8f5989607f3b95fefe811e3d787320f04c223edb6ffe47d2b37863ea874\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vfwjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wfcg7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:44Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:44 crc kubenswrapper[4895]: I1202 07:23:44.632214 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b7fa1ad-7e38-4645-ab41-c7d395f5c10e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40e020a2b4caf39d1661cee9c6b0055eca86e0ae9b81d2da7485ca9ec92dd0aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33e300332628d3baf9041b52dc7b8cd9dab1e9c5da76572e86b787433fd5fb92\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://4455fec2a817e7266e152d3f4afbede483504d6066f1a0f7375bce282ae10f8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b674e20b2199fd0083aed9cf001d27a72d31ca6c62cbe376c9510cc94f37d312\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9998dffc20c5b0579e6214d10541b9702e9c4a6563c930a41e49b6853b832ba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T07:23:32Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 07:23:21.537254 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 07:23:21.538901 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1012964219/tls.crt::/tmp/serving-cert-1012964219/tls.key\\\\\\\"\\\\nI1202 07:23:32.848375 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 07:23:32.853597 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 07:23:32.853643 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 07:23:32.853680 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 07:23:32.853691 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 07:23:32.869490 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 07:23:32.869536 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:23:32.869542 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:23:32.869555 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 07:23:32.869559 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 07:23:32.869564 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 07:23:32.869568 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 07:23:32.869624 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 07:23:32.874299 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d318f71ad32dd4a410093b85741038e468d540edd30f112234a9d595bc6fd85f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0faa0995131a1f9cbf2aa0cf7651d510fad1bdaf7e32de51f8a14b70f161e783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0faa0995131a1f9cbf2aa0cf7651d510fad1bdaf7e32de51f8a14b70f161e783\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:44Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:44 crc kubenswrapper[4895]: I1202 07:23:44.645849 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:44Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:44 crc kubenswrapper[4895]: I1202 07:23:44.662806 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:44 crc kubenswrapper[4895]: I1202 07:23:44.662860 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:44 crc kubenswrapper[4895]: I1202 07:23:44.662871 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:44 crc 
kubenswrapper[4895]: I1202 07:23:44.662891 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:44 crc kubenswrapper[4895]: I1202 07:23:44.662904 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:44Z","lastTransitionTime":"2025-12-02T07:23:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:23:44 crc kubenswrapper[4895]: I1202 07:23:44.765430 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:44 crc kubenswrapper[4895]: I1202 07:23:44.765477 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:44 crc kubenswrapper[4895]: I1202 07:23:44.765488 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:44 crc kubenswrapper[4895]: I1202 07:23:44.765508 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:44 crc kubenswrapper[4895]: I1202 07:23:44.765521 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:44Z","lastTransitionTime":"2025-12-02T07:23:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:44 crc kubenswrapper[4895]: I1202 07:23:44.869279 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:44 crc kubenswrapper[4895]: I1202 07:23:44.869341 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:44 crc kubenswrapper[4895]: I1202 07:23:44.869353 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:44 crc kubenswrapper[4895]: I1202 07:23:44.869376 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:44 crc kubenswrapper[4895]: I1202 07:23:44.869386 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:44Z","lastTransitionTime":"2025-12-02T07:23:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:44 crc kubenswrapper[4895]: I1202 07:23:44.972493 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:44 crc kubenswrapper[4895]: I1202 07:23:44.972542 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:44 crc kubenswrapper[4895]: I1202 07:23:44.972552 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:44 crc kubenswrapper[4895]: I1202 07:23:44.972571 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:44 crc kubenswrapper[4895]: I1202 07:23:44.972582 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:44Z","lastTransitionTime":"2025-12-02T07:23:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:45 crc kubenswrapper[4895]: I1202 07:23:45.075814 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:45 crc kubenswrapper[4895]: I1202 07:23:45.075870 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:45 crc kubenswrapper[4895]: I1202 07:23:45.075881 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:45 crc kubenswrapper[4895]: I1202 07:23:45.075899 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:45 crc kubenswrapper[4895]: I1202 07:23:45.075913 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:45Z","lastTransitionTime":"2025-12-02T07:23:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:23:45 crc kubenswrapper[4895]: I1202 07:23:45.141129 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:23:45 crc kubenswrapper[4895]: I1202 07:23:45.141131 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:23:45 crc kubenswrapper[4895]: E1202 07:23:45.141291 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 07:23:45 crc kubenswrapper[4895]: E1202 07:23:45.141404 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 07:23:45 crc kubenswrapper[4895]: I1202 07:23:45.183640 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:45 crc kubenswrapper[4895]: I1202 07:23:45.183712 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:45 crc kubenswrapper[4895]: I1202 07:23:45.183725 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:45 crc kubenswrapper[4895]: I1202 07:23:45.183766 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:45 crc kubenswrapper[4895]: I1202 07:23:45.183779 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:45Z","lastTransitionTime":"2025-12-02T07:23:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:45 crc kubenswrapper[4895]: I1202 07:23:45.286162 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:45 crc kubenswrapper[4895]: I1202 07:23:45.286645 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:45 crc kubenswrapper[4895]: I1202 07:23:45.286659 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:45 crc kubenswrapper[4895]: I1202 07:23:45.286679 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:45 crc kubenswrapper[4895]: I1202 07:23:45.286691 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:45Z","lastTransitionTime":"2025-12-02T07:23:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:45 crc kubenswrapper[4895]: I1202 07:23:45.389308 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:45 crc kubenswrapper[4895]: I1202 07:23:45.389346 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:45 crc kubenswrapper[4895]: I1202 07:23:45.389356 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:45 crc kubenswrapper[4895]: I1202 07:23:45.389373 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:45 crc kubenswrapper[4895]: I1202 07:23:45.389383 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:45Z","lastTransitionTime":"2025-12-02T07:23:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:45 crc kubenswrapper[4895]: I1202 07:23:45.431375 4895 generic.go:334] "Generic (PLEG): container finished" podID="3514f381-d0d1-4e00-931e-c5ca75202a1b" containerID="3641d1f7fd2ea062c216ce260c35d903273d3039fe790815ebb697f8c11d15cc" exitCode=0 Dec 02 07:23:45 crc kubenswrapper[4895]: I1202 07:23:45.431438 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-n7xcr" event={"ID":"3514f381-d0d1-4e00-931e-c5ca75202a1b","Type":"ContainerDied","Data":"3641d1f7fd2ea062c216ce260c35d903273d3039fe790815ebb697f8c11d15cc"} Dec 02 07:23:45 crc kubenswrapper[4895]: I1202 07:23:45.439128 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-lhpbd" event={"ID":"81128292-8c02-45bd-9b25-e9457e989975","Type":"ContainerStarted","Data":"1cc63751b6428c88deed36b914b2a95a94a50b544c562126c2a9567c177b2570"} Dec 02 07:23:45 crc kubenswrapper[4895]: I1202 07:23:45.446166 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" event={"ID":"afc3334a-0153-4dcc-9a56-92f6cae51c08","Type":"ContainerStarted","Data":"062dc017d6ff8750938c69f755c83863dc927df0e6a1a42671ab0c04b5d70327"} Dec 02 07:23:45 crc kubenswrapper[4895]: I1202 07:23:45.447083 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" Dec 02 07:23:45 crc kubenswrapper[4895]: I1202 07:23:45.447147 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" Dec 02 07:23:45 crc kubenswrapper[4895]: I1202 07:23:45.447173 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" Dec 02 07:23:45 crc kubenswrapper[4895]: I1202 07:23:45.450364 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:45Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:45 crc kubenswrapper[4895]: I1202 07:23:45.466651 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lhpbd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"81128292-8c02-45bd-9b25-e9457e989975\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:43Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2z5n2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lhpbd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:45Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:45 crc kubenswrapper[4895]: I1202 07:23:45.473500 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" Dec 02 07:23:45 crc kubenswrapper[4895]: I1202 07:23:45.485877 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2db13abc9ba56d955814e75ceea54b92610a00daeefa29aa1aad584e3453728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:45Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:45 crc kubenswrapper[4895]: I1202 07:23:45.492712 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:45 crc kubenswrapper[4895]: I1202 07:23:45.492761 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:45 crc kubenswrapper[4895]: I1202 07:23:45.492773 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:45 crc kubenswrapper[4895]: I1202 07:23:45.492796 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:45 crc kubenswrapper[4895]: I1202 07:23:45.492811 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:45Z","lastTransitionTime":"2025-12-02T07:23:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:45 crc kubenswrapper[4895]: I1202 07:23:45.496244 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" Dec 02 07:23:45 crc kubenswrapper[4895]: I1202 07:23:45.499036 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bb4bea68942dbface0009fbac76f7b1253ab282e8decb9081225de7d6537ec0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acc
ess-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3115f2c50d05ee406f583937e39b1b55f364ce7cbeb0788b75941a22e23f57f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:45Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:45 crc kubenswrapper[4895]: I1202 07:23:45.518312 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"afc3334a-0153-4dcc-9a56-92f6cae51c08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://328fa773c92ef0a965065412d4141c0b41ffe38d21ed169386a9d30d05f37f28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://328fa773c92ef0a965065412d4141c0b41ffe38d21ed169386a9d30d05f37f28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w54m4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:45Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:45 crc kubenswrapper[4895]: I1202 07:23:45.531353 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b7fa1ad-7e38-4645-ab41-c7d395f5c10e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40e020a2b4caf39d1661cee9c6b0055eca86e0ae9b81d2da7485ca9ec92dd0aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33e300332628d3baf9041b52dc7b8cd9dab1e9c5da76572e86b787433fd5fb92\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://4455fec2a817e7266e152d3f4afbede483504d6066f1a0f7375bce282ae10f8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b674e20b2199fd0083aed9cf001d27a72d31ca6c62cbe376c9510cc94f37d312\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9998dffc20c5b0579e6214d10541b9702e9c4a6563c930a41e49b6853b832ba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T07:23:32Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 07:23:21.537254 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 07:23:21.538901 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1012964219/tls.crt::/tmp/serving-cert-1012964219/tls.key\\\\\\\"\\\\nI1202 07:23:32.848375 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 07:23:32.853597 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 07:23:32.853643 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 07:23:32.853680 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 07:23:32.853691 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 07:23:32.869490 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 07:23:32.869536 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:23:32.869542 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:23:32.869555 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 07:23:32.869559 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 07:23:32.869564 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 07:23:32.869568 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 07:23:32.869624 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 07:23:32.874299 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d318f71ad32dd4a410093b85741038e468d540edd30f112234a9d595bc6fd85f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0faa0995131a1f9cbf2aa0cf7651d510fad1bdaf7e32de51f8a14b70f161e783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0faa0995131a1f9cbf2aa0cf7651d510fad1bdaf7e32de51f8a14b70f161e783\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:45Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:45 crc kubenswrapper[4895]: I1202 07:23:45.542993 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:45Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:45 crc kubenswrapper[4895]: I1202 07:23:45.558927 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-74fkh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b212bd8-f1a7-4982-b2e6-499c13a34b0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4d58e6f37ee8f90af6aad88138c39e9a0753c16cceb763c2bd2ec9c30087d92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhpwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-74fkh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:45Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:45 crc kubenswrapper[4895]: I1202 07:23:45.571361 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0468d2d1-a975-45a6-af9f-47adc432fab0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebf8583b59d7955530da7a83a29f9994ba139c076cf7b76dc94e88d14f39e1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vfwjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1474e8f5989607f3b95fefe811e3d787320f04c223edb6ffe47d2b37863ea874\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vfwjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wfcg7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:45Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:45 crc kubenswrapper[4895]: I1202 07:23:45.583800 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4604b0d5-cf8f-404d-af67-65da4b1dc003\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b3c366f0f27ff793a37b2944887ab1c1bc155c14d5c8d6c267442b0a8666ef8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes
/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6c3baa5b3772a8fc64774e28c3f389f114723e7efa62b934140a4fed8b6a40d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49d7c12ea2cd199dcfa46a6f1a1a856f1285a709def9d82b6805ba256a79fd5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2543d364f902a6c34a82dac25f389cd36afc2f717d772d0a487a640d44adf30f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\
\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:45Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:45 crc kubenswrapper[4895]: I1202 07:23:45.595567 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:45 crc kubenswrapper[4895]: I1202 07:23:45.595614 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:45 crc kubenswrapper[4895]: I1202 07:23:45.595626 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:45 crc kubenswrapper[4895]: I1202 07:23:45.595647 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:45 crc kubenswrapper[4895]: I1202 
07:23:45.595658 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:45Z","lastTransitionTime":"2025-12-02T07:23:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:23:45 crc kubenswrapper[4895]: I1202 07:23:45.596499 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://444ad02c4afa919779de59fd8c81bdce5c3426dd6451b90505921a8d5fa0c96b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\
\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:45Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:45 crc kubenswrapper[4895]: I1202 07:23:45.607893 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:45Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:45 crc kubenswrapper[4895]: I1202 07:23:45.624995 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n7xcr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3514f381-d0d1-4e00-931e-c5ca75202a1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3deeb37702c8243a956f2be1bb61e2693bb9be5453d775bc3f3e85618d483015\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3deeb37702c8243a956f2be1bb61e2693bb9be5453d775bc3f3e85618d483015\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2c60eddf285bbbf1a5957691f56d895e5bb7461fb18ab7c495879357cfae565\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2c60eddf285bbbf1a5957691f56d895e5bb7461fb18ab7c495879357cfae565\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e8b39d38597192393d90cc582b8ad109a1ed22c4c14f539e1a5445e6d817872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e8b39d38597192393d90cc582b8ad109a1ed22c4c14f539e1a5445e6d817872\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb5abc8da49059e23c4013f3856d3f4c9078cd8871a79f8dae5ce4d8cb319ef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb5abc8da49059e23c4013f3856d3f4c9078cd8871a79f8dae5ce4d8cb319ef9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbdd3e8943b356a485d900737682f0903dc04aefa33498ee5cdf3c82b6570852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbdd3e8943b356a485d900737682f0903dc04aefa33498ee5cdf3c82b6570852\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3641d1f7fd2ea062c216ce260c35d903273d3039fe790815ebb697f8c11d15cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3641d1f7fd2ea062c216ce260c35d903273d3039fe790815ebb697f8c11d15cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n7xcr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:45Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:45 crc kubenswrapper[4895]: I1202 07:23:45.638608 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hlxqt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"30911fe5-208f-44e8-a380-2a0093f24863\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87ebbf4f529d4e7b64c392912d06e57dc80a2237a9a4e31ab5b56ac556aa10b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6l8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hlxqt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:45Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:45 crc kubenswrapper[4895]: I1202 07:23:45.651723 4895 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:45Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:45 crc kubenswrapper[4895]: I1202 07:23:45.664646 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lhpbd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"81128292-8c02-45bd-9b25-e9457e989975\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cc63751b6428c88deed36b914b2a95a94a50b544c562126c2a9567c177b2570\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2z5n2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lhpbd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:45Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:45 crc kubenswrapper[4895]: I1202 07:23:45.691660 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"afc3334a-0153-4dcc-9a56-92f6cae51c08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91b0c6e50085a7be0924e5a54ef006110a7f13556fe47f7f55124fd6b2f0bc36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5463b077f0bc8b5d38e27854ece6a71f74f7769f4552c32eefe16854993f290f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80dca705ddb94ba799eb72bfc23b0c3878cceca48d6d10f1c7f8eb0e4beed40e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f54ae324450e73bcbf2c408794171a92bbb66c24d05f29584ac2c247090c1273\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f330d728eecd672ecb554744a51ebfb2c104441d3a85fe0c8a6fc3446c37e4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca732cac495c410d89da4ce1476acd20905716cbb2baeedac9c52b654ff58573\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://062dc017d6ff8750938c69f755c83863dc927df0e6a1a42671ab0c04b5d70327\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6447684fe6189e38c9caad6dfe1384b7af01d50e2ef5f889c7b2874ba092306\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://328fa773c92ef0a965065412d4141c0b41ffe38d21ed169386a9d30d05f37f28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://328fa773c92ef0a965065412d4141c0b41ffe38d21ed169386a9d30d05f37f28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w54m4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:45Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:45 crc kubenswrapper[4895]: I1202 07:23:45.697799 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:45 crc kubenswrapper[4895]: I1202 07:23:45.697856 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:45 crc kubenswrapper[4895]: I1202 07:23:45.697867 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:45 crc kubenswrapper[4895]: I1202 07:23:45.697883 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:45 crc kubenswrapper[4895]: I1202 07:23:45.697896 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:45Z","lastTransitionTime":"2025-12-02T07:23:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:45 crc kubenswrapper[4895]: I1202 07:23:45.707106 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2db13abc9ba56d955814e75ceea54b92610a00daeefa29aa1aad584e3453728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:45Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:45 crc kubenswrapper[4895]: I1202 07:23:45.722772 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bb4bea68942dbface0009fbac76f7b1253ab282e8decb9081225de7d6537ec0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-id
entity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3115f2c50d05ee406f583937e39b1b55f364ce7cbeb0788b75941a22e23f57f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:45Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:45 crc kubenswrapper[4895]: I1202 07:23:45.734197 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-74fkh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b212bd8-f1a7-4982-b2e6-499c13a34b0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4d58e6f37ee8f90af6aad88138c39e9a0753c16cceb763c2bd2ec9c30087d92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhpwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-74fkh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:45Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:45 crc kubenswrapper[4895]: I1202 07:23:45.747808 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0468d2d1-a975-45a6-af9f-47adc432fab0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebf8583b59d7955530da7a83a29f9994ba139c076cf7b76dc94e88d14f39e1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vfwjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1474e8f5989607f3b95fefe811e3d787320f04c223edb6ffe47d2b37863ea874\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vfwjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wfcg7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:45Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:45 crc kubenswrapper[4895]: I1202 07:23:45.766703 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b7fa1ad-7e38-4645-ab41-c7d395f5c10e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40e020a2b4caf39d1661cee9c6b0055eca86e0ae9b81d2da7485ca9ec92dd0aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33e300332628d3baf9041b52dc7b8cd9dab1e9c5da76572e86b787433fd5fb92\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://4455fec2a817e7266e152d3f4afbede483504d6066f1a0f7375bce282ae10f8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b674e20b2199fd0083aed9cf001d27a72d31ca6c62cbe376c9510cc94f37d312\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9998dffc20c5b0579e6214d10541b9702e9c4a6563c930a41e49b6853b832ba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T07:23:32Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 07:23:21.537254 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 07:23:21.538901 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1012964219/tls.crt::/tmp/serving-cert-1012964219/tls.key\\\\\\\"\\\\nI1202 07:23:32.848375 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 07:23:32.853597 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 07:23:32.853643 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 07:23:32.853680 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 07:23:32.853691 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 07:23:32.869490 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 07:23:32.869536 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:23:32.869542 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:23:32.869555 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 07:23:32.869559 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 07:23:32.869564 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 07:23:32.869568 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 07:23:32.869624 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 07:23:32.874299 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d318f71ad32dd4a410093b85741038e468d540edd30f112234a9d595bc6fd85f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0faa0995131a1f9cbf2aa0cf7651d510fad1bdaf7e32de51f8a14b70f161e783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0faa0995131a1f9cbf2aa0cf7651d510fad1bdaf7e32de51f8a14b70f161e783\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:45Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:45 crc kubenswrapper[4895]: I1202 07:23:45.782996 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:45Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:45 crc kubenswrapper[4895]: I1202 07:23:45.800180 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:45 crc kubenswrapper[4895]: I1202 07:23:45.800245 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:45 crc kubenswrapper[4895]: I1202 07:23:45.800259 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:45 crc 
kubenswrapper[4895]: I1202 07:23:45.800280 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:45 crc kubenswrapper[4895]: I1202 07:23:45.800296 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:45Z","lastTransitionTime":"2025-12-02T07:23:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:23:45 crc kubenswrapper[4895]: I1202 07:23:45.803827 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n7xcr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3514f381-d0d1-4e00-931e-c5ca75202a1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3deeb37702c8243a956f2be1bb61e2693bb9be5453d775bc3f3e85618d483015\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3deeb37702c8243a956f2be1bb61e2693bb9be5453d775bc3f3e85618d483015\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2c60eddf285bbbf1a5957691f56d895e5bb7461fb18ab7c495879357cfae565\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2c60eddf285bbbf1a5957691f56d895e5bb7461fb18ab7c495879357cfae565\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e8b39d38597192393d90cc582b8ad109a1ed22c4c14f539e1a5445e6d817872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e8b39d38597192393d90cc582b8ad109a1ed22c4c14f539e1a5445e6d817872\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb5abc8da49059e23c4013f3856d3f4c9078cd8871a79f8dae5ce4d8cb319ef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb5abc8da49059e23c4013f3856d3f4c9078cd8871a79f8dae5ce4d8cb319ef9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbdd3e8943b356a485d900737682f0903dc04aefa33498ee5cdf3c82b6570852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbdd3e8943b356a485d900737682f0903dc04aefa33498ee5cdf3c82b6570852\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3641d1f7fd2ea062c216ce260c35d903273d3039fe790815ebb697f8c11d15cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3641d1f7fd2ea062c216ce260c35d903273d3039fe790815ebb697f8c11d15cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n7xcr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:45Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:45 crc kubenswrapper[4895]: I1202 07:23:45.822319 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hlxqt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"30911fe5-208f-44e8-a380-2a0093f24863\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87ebbf4f529d4e7b64c392912d06e57dc80a2237a9a4e31ab5b56ac556aa10b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6l8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hlxqt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:45Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:45 crc kubenswrapper[4895]: I1202 07:23:45.838636 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4604b0d5-cf8f-404d-af67-65da4b1dc003\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b3c366f0f27ff793a37b2944887ab1c1bc155c14d5c8d6c267442b0a8666ef8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6c3baa5b3772a8fc64774e28c3f389f114723e7efa62b934140a4fed8b6a40d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49d7c12ea2cd199dcfa46a6f1a1a856f1285a709def9d82b6805ba256a79fd5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2543d364f902a6c34a82dac25f389cd36afc2f717d772d0a487a640d44adf30f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{
\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:45Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:45 crc kubenswrapper[4895]: I1202 07:23:45.873577 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://444ad02c4afa919779de59fd8c81bdce5c3426dd6451b90505921a8d5fa0c96b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-02T07:23:45Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:45 crc kubenswrapper[4895]: I1202 07:23:45.903134 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:45 crc kubenswrapper[4895]: I1202 07:23:45.903180 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:45 crc kubenswrapper[4895]: I1202 07:23:45.903193 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:45 crc kubenswrapper[4895]: I1202 07:23:45.903216 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:45 crc kubenswrapper[4895]: I1202 07:23:45.903228 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:45Z","lastTransitionTime":"2025-12-02T07:23:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:45 crc kubenswrapper[4895]: I1202 07:23:45.911661 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:45Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:46 crc kubenswrapper[4895]: I1202 07:23:46.006912 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:46 crc kubenswrapper[4895]: I1202 07:23:46.006965 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:46 crc kubenswrapper[4895]: I1202 07:23:46.006975 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:46 crc kubenswrapper[4895]: I1202 07:23:46.007005 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:46 crc kubenswrapper[4895]: I1202 07:23:46.007015 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:46Z","lastTransitionTime":"2025-12-02T07:23:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:23:46 crc kubenswrapper[4895]: I1202 07:23:46.109150 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:46 crc kubenswrapper[4895]: I1202 07:23:46.109196 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:46 crc kubenswrapper[4895]: I1202 07:23:46.109207 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:46 crc kubenswrapper[4895]: I1202 07:23:46.109227 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:46 crc kubenswrapper[4895]: I1202 07:23:46.109238 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:46Z","lastTransitionTime":"2025-12-02T07:23:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:23:46 crc kubenswrapper[4895]: I1202 07:23:46.140401 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:23:46 crc kubenswrapper[4895]: E1202 07:23:46.140543 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 07:23:46 crc kubenswrapper[4895]: I1202 07:23:46.211371 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:46 crc kubenswrapper[4895]: I1202 07:23:46.211418 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:46 crc kubenswrapper[4895]: I1202 07:23:46.211426 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:46 crc kubenswrapper[4895]: I1202 07:23:46.211448 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:46 crc kubenswrapper[4895]: I1202 07:23:46.211463 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:46Z","lastTransitionTime":"2025-12-02T07:23:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:46 crc kubenswrapper[4895]: I1202 07:23:46.315141 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:46 crc kubenswrapper[4895]: I1202 07:23:46.315238 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:46 crc kubenswrapper[4895]: I1202 07:23:46.315266 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:46 crc kubenswrapper[4895]: I1202 07:23:46.315304 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:46 crc kubenswrapper[4895]: I1202 07:23:46.315332 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:46Z","lastTransitionTime":"2025-12-02T07:23:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:46 crc kubenswrapper[4895]: I1202 07:23:46.419064 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:46 crc kubenswrapper[4895]: I1202 07:23:46.419130 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:46 crc kubenswrapper[4895]: I1202 07:23:46.419146 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:46 crc kubenswrapper[4895]: I1202 07:23:46.419175 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:46 crc kubenswrapper[4895]: I1202 07:23:46.419191 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:46Z","lastTransitionTime":"2025-12-02T07:23:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:46 crc kubenswrapper[4895]: I1202 07:23:46.421250 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:46 crc kubenswrapper[4895]: I1202 07:23:46.421323 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:46 crc kubenswrapper[4895]: I1202 07:23:46.421347 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:46 crc kubenswrapper[4895]: I1202 07:23:46.421382 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:46 crc kubenswrapper[4895]: I1202 07:23:46.421409 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:46Z","lastTransitionTime":"2025-12-02T07:23:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:46 crc kubenswrapper[4895]: E1202 07:23:46.440232 4895 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:23:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:23:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:23:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:23:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"42683c5b-b2bf-439b-8ee4-25c8d72cfed1\\\",\\\"systemUUID\\\":\\\"92c636b1-dcb0-457f-b098-73baeaac297e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:46Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:46 crc kubenswrapper[4895]: I1202 07:23:46.445810 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:46 crc kubenswrapper[4895]: I1202 07:23:46.445866 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:46 crc kubenswrapper[4895]: I1202 07:23:46.445878 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:46 crc kubenswrapper[4895]: I1202 07:23:46.445902 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:46 crc kubenswrapper[4895]: I1202 07:23:46.445919 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:46Z","lastTransitionTime":"2025-12-02T07:23:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:46 crc kubenswrapper[4895]: I1202 07:23:46.457199 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-n7xcr" event={"ID":"3514f381-d0d1-4e00-931e-c5ca75202a1b","Type":"ContainerStarted","Data":"2670ef1ea22a6c1c5d9ec4ee4b0345c575b20224fddf8655b95eaa431573d071"} Dec 02 07:23:46 crc kubenswrapper[4895]: E1202 07:23:46.472207 4895 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:23:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:23:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:23:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:23:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:46Z\\\",\\\"message\\\":\\\"container runtime 
network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redha
t-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc
4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/
ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\
"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":4488870
27}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"42683c5b-b2bf-439b-8ee4-25c8d72cfed1\\\",\\\"systemUUID\\\":\\\"92c636b1-dcb0-457f-b098-73baeaac297e\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:46Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:46 crc kubenswrapper[4895]: I1202 07:23:46.478245 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:46 crc kubenswrapper[4895]: I1202 07:23:46.478340 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:46 crc kubenswrapper[4895]: I1202 07:23:46.478362 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:46 crc kubenswrapper[4895]: I1202 07:23:46.478391 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:46 crc kubenswrapper[4895]: I1202 07:23:46.478413 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:46Z","lastTransitionTime":"2025-12-02T07:23:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:46 crc kubenswrapper[4895]: I1202 07:23:46.480628 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b7fa1ad-7e38-4645-ab41-c7d395f5c10e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40e020a2b4caf39d1661cee9c6b0055eca86e0ae9b81d2da7485ca9ec92dd0aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33e300332628d3baf9041b52dc7b8cd9dab1e9c5da76572e86b787433fd5fb92\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://4455fec2a817e7266e152d3f4afbede483504d6066f1a0f7375bce282ae10f8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b674e20b2199fd0083aed9cf001d27a72d31ca6c62cbe376c9510cc94f37d312\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9998dffc20c5b0579e6214d10541b9702e9c4a6563c930a41e49b6853b832ba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T07:23:32Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 07:23:21.537254 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 07:23:21.538901 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1012964219/tls.crt::/tmp/serving-cert-1012964219/tls.key\\\\\\\"\\\\nI1202 07:23:32.848375 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 07:23:32.853597 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 07:23:32.853643 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 07:23:32.853680 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 07:23:32.853691 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 07:23:32.869490 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 07:23:32.869536 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:23:32.869542 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:23:32.869555 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 07:23:32.869559 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 07:23:32.869564 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 07:23:32.869568 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 07:23:32.869624 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 07:23:32.874299 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d318f71ad32dd4a410093b85741038e468d540edd30f112234a9d595bc6fd85f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0faa0995131a1f9cbf2aa0cf7651d510fad1bdaf7e32de51f8a14b70f161e783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0faa0995131a1f9cbf2aa0cf7651d510fad1bdaf7e32de51f8a14b70f161e783\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:46Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:46 crc kubenswrapper[4895]: E1202 07:23:46.500442 4895 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:23:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:23:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:46Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:23:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:23:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"42683c5b-b2bf-439b-8ee4-25c8d72cfed1\\\",\\\"systemUUID\\\":\\\"92c636b1-dcb0-457f-b098-73baeaac297e\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:46Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:46 crc kubenswrapper[4895]: I1202 07:23:46.501831 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready 
status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:46Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:46 crc kubenswrapper[4895]: I1202 07:23:46.505820 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:46 crc kubenswrapper[4895]: I1202 07:23:46.505883 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:46 crc kubenswrapper[4895]: I1202 07:23:46.505902 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:46 crc 
kubenswrapper[4895]: I1202 07:23:46.505923 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:46 crc kubenswrapper[4895]: I1202 07:23:46.505937 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:46Z","lastTransitionTime":"2025-12-02T07:23:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:23:46 crc kubenswrapper[4895]: E1202 07:23:46.521789 4895 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:23:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:23:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:23:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:46Z\\\",\\\"message\\\":\\\"kubelet has 
sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:23:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff54
7002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\
"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cd
fc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"nam
es\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"42683c5b-b2bf-439b-8ee4-25c8d72cfed1\\\",\\\"systemUUID\\\":\\\"92c636b1-dcb0-457f-b098-73baeaac297e\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:46Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:46 crc kubenswrapper[4895]: I1202 07:23:46.524818 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-74fkh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b212bd8-f1a7-4982-b2e6-499c13a34b0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4d58e6f37ee8f90af6aad88138c39e9
a0753c16cceb763c2bd2ec9c30087d92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhpwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-74fkh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:46Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:46 crc kubenswrapper[4895]: I1202 07:23:46.528177 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:46 crc kubenswrapper[4895]: I1202 07:23:46.528249 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:46 crc kubenswrapper[4895]: I1202 07:23:46.528268 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:46 crc kubenswrapper[4895]: I1202 
07:23:46.528296 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:46 crc kubenswrapper[4895]: I1202 07:23:46.528317 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:46Z","lastTransitionTime":"2025-12-02T07:23:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:23:46 crc kubenswrapper[4895]: I1202 07:23:46.544663 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0468d2d1-a975-45a6-af9f-47adc432fab0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebf8583b59d7955530da7a83a29f9994ba139c076cf7b76dc94e88d14f39e1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4e
f318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vfwjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1474e8f5989607f3b95fefe811e3d787320f04c223edb6ffe47d2b37863ea874\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vfwjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wfcg7\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:46Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:46 crc kubenswrapper[4895]: E1202 07:23:46.547007 4895 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:23:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:23:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:23:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:23:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb4
9c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\"
:[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d4
6c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\
\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"42683c5b-b2bf-439b-8ee4-2
5c8d72cfed1\\\",\\\"systemUUID\\\":\\\"92c636b1-dcb0-457f-b098-73baeaac297e\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:46Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:46 crc kubenswrapper[4895]: E1202 07:23:46.547240 4895 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 02 07:23:46 crc kubenswrapper[4895]: I1202 07:23:46.549822 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:46 crc kubenswrapper[4895]: I1202 07:23:46.549891 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:46 crc kubenswrapper[4895]: I1202 07:23:46.549926 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:46 crc kubenswrapper[4895]: I1202 07:23:46.549950 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:46 crc kubenswrapper[4895]: I1202 07:23:46.549965 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:46Z","lastTransitionTime":"2025-12-02T07:23:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:46 crc kubenswrapper[4895]: I1202 07:23:46.561591 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4604b0d5-cf8f-404d-af67-65da4b1dc003\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b3c366f0f27ff793a37b2944887ab1c1bc155c14d5c8d6c267442b0a8666ef8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6c3baa5b37
72a8fc64774e28c3f389f114723e7efa62b934140a4fed8b6a40d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49d7c12ea2cd199dcfa46a6f1a1a856f1285a709def9d82b6805ba256a79fd5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2543d364f902a6c34a82dac25f389cd36afc2f717d772d0a487a640d44adf30f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:46Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:46 crc kubenswrapper[4895]: I1202 07:23:46.575304 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://444ad02c4afa919779de59fd8c81bdce5c3426dd6451b90505921a8d5fa0c96b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-02T07:23:46Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:46 crc kubenswrapper[4895]: I1202 07:23:46.592144 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:46Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:46 crc kubenswrapper[4895]: I1202 07:23:46.622963 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n7xcr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3514f381-d0d1-4e00-931e-c5ca75202a1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2670ef1ea22a6c1c5d9ec4ee4b0345c575b20224fddf8655b95eaa431573d071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3deeb37702c8243a956f2be1bb61e2693bb9be5453d775bc3f3e85618d483015\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3deeb37702c8243a956f2be1bb61e2693bb9be5453d775bc3f3e85618d483015\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2c60eddf285bbbf1a5957691f56d895e5bb7461fb18ab7c495879357cfae565\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2c60eddf285bbbf1a5957691f56d895e5bb7461fb18ab7c495879357cfae565\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e8b39d38597192393d90cc582b8ad109a1ed22c4c14f539e1a5445e6d817872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e8b39d38597192393d90cc582b8ad109a1ed22c4c14f539e1a5445e6d817872\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb5ab
c8da49059e23c4013f3856d3f4c9078cd8871a79f8dae5ce4d8cb319ef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb5abc8da49059e23c4013f3856d3f4c9078cd8871a79f8dae5ce4d8cb319ef9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbdd3e8943b356a485d900737682f0903dc04aefa33498ee5cdf3c82b6570852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbdd3e8943b356a485d900737682f0903dc04aefa33498ee5cdf3c82b6570852\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:43Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3641d1f7fd2ea062c216ce260c35d903273d3039fe790815ebb697f8c11d15cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3641d1f7fd2ea062c216ce260c35d903273d3039fe790815ebb697f8c11d15cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n7xcr\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:46Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:46 crc kubenswrapper[4895]: I1202 07:23:46.644337 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hlxqt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"30911fe5-208f-44e8-a380-2a0093f24863\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87ebbf4f529d4e7b64c392912d06e57dc80a2237a9a4e31ab5b56ac556aa10b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-
12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6l8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hlxqt\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:46Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:46 crc kubenswrapper[4895]: I1202 07:23:46.653191 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:46 crc kubenswrapper[4895]: I1202 07:23:46.653241 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:46 crc kubenswrapper[4895]: I1202 07:23:46.653253 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:46 crc kubenswrapper[4895]: I1202 07:23:46.653274 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:46 crc kubenswrapper[4895]: I1202 07:23:46.653288 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:46Z","lastTransitionTime":"2025-12-02T07:23:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:46 crc kubenswrapper[4895]: I1202 07:23:46.664317 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:46Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:46 crc kubenswrapper[4895]: I1202 07:23:46.679880 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lhpbd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"81128292-8c02-45bd-9b25-e9457e989975\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cc63751b6428c88deed36b914b2a95a94a50b544c562126c2a9567c177b2570\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2z5n2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lhpbd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:46Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:46 crc kubenswrapper[4895]: I1202 07:23:46.696175 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2db13abc9ba56d955814e75ceea54b92610a00daeefa29aa1aad584e3453728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:46Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:46 crc kubenswrapper[4895]: I1202 07:23:46.717371 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bb4bea68942dbface0009fbac76f7b1253ab282e8decb9081225de7d6537ec0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3115f2c50d05ee406f583937e39b1b55f364ce7cbeb0788b75941a22e23f57f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:46Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:46 crc kubenswrapper[4895]: I1202 07:23:46.745588 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"afc3334a-0153-4dcc-9a56-92f6cae51c08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91b0c6e50085a7be0924e5a54ef006110a7f13556fe47f7f55124fd6b2f0bc36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5463b077f0bc8b5d38e27854ece6a71f74f7769f4552c32eefe16854993f290f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80dca705ddb94ba799eb72bfc23b0c3878cceca48d6d10f1c7f8eb0e4beed40e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f54ae324450e73bcbf2c408794171a92bbb66c24d05f29584ac2c247090c1273\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f330d728eecd672ecb554744a51ebfb2c104441d3a85fe0c8a6fc3446c37e4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca732cac495c410d89da4ce1476acd20905716cbb2baeedac9c52b654ff58573\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://062dc017d6ff8750938c69f755c83863dc927df0e6a1a42671ab0c04b5d70327\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6447684fe6189e38c9caad6dfe1384b7af01d50e2ef5f889c7b2874ba092306\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://328fa773c92ef0a965065412d4141c0b41ffe38d21ed169386a9d30d05f37f28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://328fa773c92ef0a965065412d4141c0b41ffe38d21ed169386a9d30d05f37f28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w54m4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:46Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:46 crc kubenswrapper[4895]: I1202 07:23:46.756914 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:46 crc kubenswrapper[4895]: I1202 07:23:46.756992 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:46 crc kubenswrapper[4895]: I1202 07:23:46.757018 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:46 crc kubenswrapper[4895]: I1202 07:23:46.757052 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:46 crc kubenswrapper[4895]: I1202 07:23:46.757076 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:46Z","lastTransitionTime":"2025-12-02T07:23:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:46 crc kubenswrapper[4895]: I1202 07:23:46.860236 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:46 crc kubenswrapper[4895]: I1202 07:23:46.860315 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:46 crc kubenswrapper[4895]: I1202 07:23:46.860335 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:46 crc kubenswrapper[4895]: I1202 07:23:46.860367 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:46 crc kubenswrapper[4895]: I1202 07:23:46.860388 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:46Z","lastTransitionTime":"2025-12-02T07:23:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:46 crc kubenswrapper[4895]: I1202 07:23:46.975158 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:46 crc kubenswrapper[4895]: I1202 07:23:46.975608 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:46 crc kubenswrapper[4895]: I1202 07:23:46.975649 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:46 crc kubenswrapper[4895]: I1202 07:23:46.975690 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:46 crc kubenswrapper[4895]: I1202 07:23:46.975719 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:46Z","lastTransitionTime":"2025-12-02T07:23:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:47 crc kubenswrapper[4895]: I1202 07:23:47.083022 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:47 crc kubenswrapper[4895]: I1202 07:23:47.083107 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:47 crc kubenswrapper[4895]: I1202 07:23:47.083130 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:47 crc kubenswrapper[4895]: I1202 07:23:47.083161 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:47 crc kubenswrapper[4895]: I1202 07:23:47.083185 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:47Z","lastTransitionTime":"2025-12-02T07:23:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:23:47 crc kubenswrapper[4895]: I1202 07:23:47.141184 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:23:47 crc kubenswrapper[4895]: I1202 07:23:47.141190 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:23:47 crc kubenswrapper[4895]: E1202 07:23:47.141396 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 07:23:47 crc kubenswrapper[4895]: E1202 07:23:47.141668 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 07:23:47 crc kubenswrapper[4895]: I1202 07:23:47.186236 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:47 crc kubenswrapper[4895]: I1202 07:23:47.186285 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:47 crc kubenswrapper[4895]: I1202 07:23:47.186300 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:47 crc kubenswrapper[4895]: I1202 07:23:47.186321 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:47 crc kubenswrapper[4895]: I1202 07:23:47.186337 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:47Z","lastTransitionTime":"2025-12-02T07:23:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:47 crc kubenswrapper[4895]: I1202 07:23:47.289509 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:47 crc kubenswrapper[4895]: I1202 07:23:47.289568 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:47 crc kubenswrapper[4895]: I1202 07:23:47.289579 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:47 crc kubenswrapper[4895]: I1202 07:23:47.289597 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:47 crc kubenswrapper[4895]: I1202 07:23:47.289612 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:47Z","lastTransitionTime":"2025-12-02T07:23:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:47 crc kubenswrapper[4895]: I1202 07:23:47.394137 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:47 crc kubenswrapper[4895]: I1202 07:23:47.394187 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:47 crc kubenswrapper[4895]: I1202 07:23:47.394200 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:47 crc kubenswrapper[4895]: I1202 07:23:47.394238 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:47 crc kubenswrapper[4895]: I1202 07:23:47.394255 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:47Z","lastTransitionTime":"2025-12-02T07:23:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:47 crc kubenswrapper[4895]: I1202 07:23:47.496623 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:47 crc kubenswrapper[4895]: I1202 07:23:47.496707 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:47 crc kubenswrapper[4895]: I1202 07:23:47.496729 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:47 crc kubenswrapper[4895]: I1202 07:23:47.496796 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:47 crc kubenswrapper[4895]: I1202 07:23:47.496825 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:47Z","lastTransitionTime":"2025-12-02T07:23:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:47 crc kubenswrapper[4895]: I1202 07:23:47.600061 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:47 crc kubenswrapper[4895]: I1202 07:23:47.600140 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:47 crc kubenswrapper[4895]: I1202 07:23:47.600159 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:47 crc kubenswrapper[4895]: I1202 07:23:47.600214 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:47 crc kubenswrapper[4895]: I1202 07:23:47.600235 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:47Z","lastTransitionTime":"2025-12-02T07:23:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:47 crc kubenswrapper[4895]: I1202 07:23:47.705166 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:47 crc kubenswrapper[4895]: I1202 07:23:47.705904 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:47 crc kubenswrapper[4895]: I1202 07:23:47.705932 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:47 crc kubenswrapper[4895]: I1202 07:23:47.705976 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:47 crc kubenswrapper[4895]: I1202 07:23:47.706003 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:47Z","lastTransitionTime":"2025-12-02T07:23:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:47 crc kubenswrapper[4895]: I1202 07:23:47.808895 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:47 crc kubenswrapper[4895]: I1202 07:23:47.808970 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:47 crc kubenswrapper[4895]: I1202 07:23:47.808981 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:47 crc kubenswrapper[4895]: I1202 07:23:47.808998 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:47 crc kubenswrapper[4895]: I1202 07:23:47.809007 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:47Z","lastTransitionTime":"2025-12-02T07:23:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:47 crc kubenswrapper[4895]: I1202 07:23:47.912354 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:47 crc kubenswrapper[4895]: I1202 07:23:47.912461 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:47 crc kubenswrapper[4895]: I1202 07:23:47.912488 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:47 crc kubenswrapper[4895]: I1202 07:23:47.912521 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:47 crc kubenswrapper[4895]: I1202 07:23:47.912540 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:47Z","lastTransitionTime":"2025-12-02T07:23:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:48 crc kubenswrapper[4895]: I1202 07:23:48.015505 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:48 crc kubenswrapper[4895]: I1202 07:23:48.015583 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:48 crc kubenswrapper[4895]: I1202 07:23:48.015608 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:48 crc kubenswrapper[4895]: I1202 07:23:48.015642 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:48 crc kubenswrapper[4895]: I1202 07:23:48.015664 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:48Z","lastTransitionTime":"2025-12-02T07:23:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:48 crc kubenswrapper[4895]: I1202 07:23:48.119330 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:48 crc kubenswrapper[4895]: I1202 07:23:48.119413 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:48 crc kubenswrapper[4895]: I1202 07:23:48.119430 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:48 crc kubenswrapper[4895]: I1202 07:23:48.119458 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:48 crc kubenswrapper[4895]: I1202 07:23:48.119477 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:48Z","lastTransitionTime":"2025-12-02T07:23:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:23:48 crc kubenswrapper[4895]: I1202 07:23:48.141104 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:23:48 crc kubenswrapper[4895]: E1202 07:23:48.141327 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 07:23:48 crc kubenswrapper[4895]: I1202 07:23:48.223012 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:48 crc kubenswrapper[4895]: I1202 07:23:48.223062 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:48 crc kubenswrapper[4895]: I1202 07:23:48.223074 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:48 crc kubenswrapper[4895]: I1202 07:23:48.223098 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:48 crc kubenswrapper[4895]: I1202 07:23:48.223113 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:48Z","lastTransitionTime":"2025-12-02T07:23:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:48 crc kubenswrapper[4895]: I1202 07:23:48.326510 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:48 crc kubenswrapper[4895]: I1202 07:23:48.326593 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:48 crc kubenswrapper[4895]: I1202 07:23:48.326611 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:48 crc kubenswrapper[4895]: I1202 07:23:48.326643 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:48 crc kubenswrapper[4895]: I1202 07:23:48.326667 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:48Z","lastTransitionTime":"2025-12-02T07:23:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:48 crc kubenswrapper[4895]: I1202 07:23:48.429967 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:48 crc kubenswrapper[4895]: I1202 07:23:48.430044 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:48 crc kubenswrapper[4895]: I1202 07:23:48.430068 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:48 crc kubenswrapper[4895]: I1202 07:23:48.430100 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:48 crc kubenswrapper[4895]: I1202 07:23:48.430121 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:48Z","lastTransitionTime":"2025-12-02T07:23:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:48 crc kubenswrapper[4895]: I1202 07:23:48.468575 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w54m4_afc3334a-0153-4dcc-9a56-92f6cae51c08/ovnkube-controller/0.log" Dec 02 07:23:48 crc kubenswrapper[4895]: I1202 07:23:48.473327 4895 generic.go:334] "Generic (PLEG): container finished" podID="afc3334a-0153-4dcc-9a56-92f6cae51c08" containerID="062dc017d6ff8750938c69f755c83863dc927df0e6a1a42671ab0c04b5d70327" exitCode=1 Dec 02 07:23:48 crc kubenswrapper[4895]: I1202 07:23:48.473425 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" event={"ID":"afc3334a-0153-4dcc-9a56-92f6cae51c08","Type":"ContainerDied","Data":"062dc017d6ff8750938c69f755c83863dc927df0e6a1a42671ab0c04b5d70327"} Dec 02 07:23:48 crc kubenswrapper[4895]: I1202 07:23:48.475025 4895 scope.go:117] "RemoveContainer" containerID="062dc017d6ff8750938c69f755c83863dc927df0e6a1a42671ab0c04b5d70327" Dec 02 07:23:48 crc kubenswrapper[4895]: I1202 07:23:48.497890 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2db13abc9ba56d955814e75ceea54b92610a00daeefa29aa1aad584e3453728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:48Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:48 crc kubenswrapper[4895]: I1202 07:23:48.519150 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bb4bea68942dbface0009fbac76f7b1253ab282e8decb9081225de7d6537ec0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://3115f2c50d05ee406f583937e39b1b55f364ce7cbeb0788b75941a22e23f57f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:48Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:48 crc kubenswrapper[4895]: I1202 07:23:48.533671 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:48 crc kubenswrapper[4895]: I1202 07:23:48.533721 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:48 crc kubenswrapper[4895]: I1202 07:23:48.533732 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:48 crc kubenswrapper[4895]: I1202 07:23:48.533778 4895 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:48 crc kubenswrapper[4895]: I1202 07:23:48.533792 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:48Z","lastTransitionTime":"2025-12-02T07:23:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:23:48 crc kubenswrapper[4895]: I1202 07:23:48.543788 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"afc3334a-0153-4dcc-9a56-92f6cae51c08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91b0c6e50085a7be0924e5a54ef006110a7f13556fe47f7f55124fd6b2f0bc36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5463b077f0bc8b5d38e27854ece6a71f74f7769f4552c32eefe16854993f290f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80dca705ddb94ba799eb72bfc23b0c3878cceca48d6d10f1c7f8eb0e4beed40e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f54ae324450e73bcbf2c408794171a92bbb66c24d05f29584ac2c247090c1273\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f330d728eecd672ecb554744a51ebfb2c104441d3a85fe0c8a6fc3446c37e4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca732cac495c410d89da4ce1476acd20905716cbb2baeedac9c52b654ff58573\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://062dc017d6ff8750938c69f755c83863dc927df0e6a1a42671ab0c04b5d70327\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://062dc017d6ff8750938c69f755c83863dc927df0e6a1a42671ab0c04b5d70327\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T07:23:47Z\\\",\\\"message\\\":\\\"ndler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1202 07:23:47.842861 6190 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from 
sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1202 07:23:47.842911 6190 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1202 07:23:47.842975 6190 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 07:23:47.843039 6190 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 07:23:47.843839 6190 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1202 07:23:47.843870 6190 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1202 07:23:47.843888 6190 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1202 07:23:47.843895 6190 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1202 07:23:47.843946 6190 factory.go:656] Stopping watch factory\\\\nI1202 07:23:47.843968 6190 ovnkube.go:599] Stopped ovnkube\\\\nI1202 07:23:47.843997 6190 handler.go:208] Removed *v1.Node event handler 7\\\\nI1202 07:23:47.843993 6190 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1202 07:23:47.844025 6190 handler.go:208] Removed *v1.Namespace event handler 
1\\\\nI12\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\
":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6447684fe6189e38c9caad6dfe1384b7af01d50e2ef5f889c7b2874ba092306\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://328fa773c92ef0a965065412d4141c0b41ffe38d21ed169386a9d30d05f37f28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://328fa773c92ef0a965065412d4141c0b41ffe38d21ed169386a9d30d05
f37f28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w54m4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:48Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:48 crc kubenswrapper[4895]: I1202 07:23:48.559240 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b7fa1ad-7e38-4645-ab41-c7d395f5c10e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40e020a2b4caf39d1661cee9c6b0055eca86e0ae9b81d2da7485ca9ec92dd0aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33e300332628d3baf9041b52dc7b8cd9dab1e9c5da76572e86b787433fd5fb92\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4455fec2a817e7266e152d3f4afbede483504d6066f1a0f7375bce282ae10f8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b674e20b2199fd0083aed9cf001d27a72d31ca6c62cbe376c9510cc94f37d312\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9998dffc20c5b0579e6214d10541b9702e9c4a6563c930a41e49b6853b832ba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T07:23:32Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 07:23:21.537254 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 07:23:21.538901 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1012964219/tls.crt::/tmp/serving-cert-1012964219/tls.key\\\\\\\"\\\\nI1202 07:23:32.848375 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 07:23:32.853597 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 07:23:32.853643 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 07:23:32.853680 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 07:23:32.853691 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 07:23:32.869490 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 07:23:32.869536 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:23:32.869542 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:23:32.869555 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 07:23:32.869559 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 07:23:32.869564 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 07:23:32.869568 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 07:23:32.869624 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 07:23:32.874299 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d318f71ad32dd4a410093b85741038e468d540edd30f112234a9d595bc6fd85f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0faa0995131a1f9cbf2aa0cf7651d510fad1bdaf7e32de51f8a14b70f161e783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0faa0995131a1f9cbf2aa0cf7651d510fad1bdaf7e32de51f8a14b70f161e783\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:48Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:48 crc kubenswrapper[4895]: I1202 07:23:48.578972 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:48Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:48 crc kubenswrapper[4895]: I1202 07:23:48.590627 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-74fkh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b212bd8-f1a7-4982-b2e6-499c13a34b0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4d58e6f37ee8f90af6aad88138c39e9a0753c16cceb763c2bd2ec9c30087d92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhpwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-74fkh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:48Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:48 crc kubenswrapper[4895]: I1202 07:23:48.606938 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0468d2d1-a975-45a6-af9f-47adc432fab0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebf8583b59d7955530da7a83a29f9994ba139c076cf7b76dc94e88d14f39e1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vfwjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1474e8f5989607f3b95fefe811e3d787320f04c223edb6ffe47d2b37863ea874\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vfwjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wfcg7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:48Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:48 crc kubenswrapper[4895]: I1202 07:23:48.627335 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4604b0d5-cf8f-404d-af67-65da4b1dc003\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b3c366f0f27ff793a37b2944887ab1c1bc155c14d5c8d6c267442b0a8666ef8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes
/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6c3baa5b3772a8fc64774e28c3f389f114723e7efa62b934140a4fed8b6a40d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49d7c12ea2cd199dcfa46a6f1a1a856f1285a709def9d82b6805ba256a79fd5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2543d364f902a6c34a82dac25f389cd36afc2f717d772d0a487a640d44adf30f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\
\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:48Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:48 crc kubenswrapper[4895]: I1202 07:23:48.636618 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:48 crc kubenswrapper[4895]: I1202 07:23:48.636656 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:48 crc kubenswrapper[4895]: I1202 07:23:48.636666 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:48 crc kubenswrapper[4895]: I1202 07:23:48.636685 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:48 crc kubenswrapper[4895]: I1202 
07:23:48.636695 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:48Z","lastTransitionTime":"2025-12-02T07:23:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:23:48 crc kubenswrapper[4895]: I1202 07:23:48.644537 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://444ad02c4afa919779de59fd8c81bdce5c3426dd6451b90505921a8d5fa0c96b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\
\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:48Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:48 crc kubenswrapper[4895]: I1202 07:23:48.664804 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:48Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:48 crc kubenswrapper[4895]: I1202 07:23:48.690623 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n7xcr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3514f381-d0d1-4e00-931e-c5ca75202a1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2670ef1ea22a6c1c5d9ec4ee4b0345c575b20224fddf8655b95eaa431573d071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3deeb37702c8243a956f2be1bb61e2693bb9be5453d775bc3f3e85618d483015\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3deeb37702c8243a956f2be1bb61e2693bb9be5453d775bc3f3e85618d483015\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2c60eddf285bbbf1a5957691f56d895e5bb7461fb18ab7c495879357cfae565\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2c60eddf285bbbf1a5957691f56d895e5bb7461fb18ab7c495879357cfae565\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e8b39d38597192393d90cc582b8ad109a1ed22c4c14f539e1a5445e6d817872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e8b39d38597192393d90cc582b8ad109a1ed22c4c14f539e1a5445e6d817872\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb5ab
c8da49059e23c4013f3856d3f4c9078cd8871a79f8dae5ce4d8cb319ef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb5abc8da49059e23c4013f3856d3f4c9078cd8871a79f8dae5ce4d8cb319ef9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbdd3e8943b356a485d900737682f0903dc04aefa33498ee5cdf3c82b6570852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbdd3e8943b356a485d900737682f0903dc04aefa33498ee5cdf3c82b6570852\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:43Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3641d1f7fd2ea062c216ce260c35d903273d3039fe790815ebb697f8c11d15cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3641d1f7fd2ea062c216ce260c35d903273d3039fe790815ebb697f8c11d15cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n7xcr\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:48Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:48 crc kubenswrapper[4895]: I1202 07:23:48.707996 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hlxqt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"30911fe5-208f-44e8-a380-2a0093f24863\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87ebbf4f529d4e7b64c392912d06e57dc80a2237a9a4e31ab5b56ac556aa10b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-
12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6l8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hlxqt\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:48Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:48 crc kubenswrapper[4895]: I1202 07:23:48.729832 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:48Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:48 crc kubenswrapper[4895]: I1202 07:23:48.738913 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:48 crc kubenswrapper[4895]: I1202 07:23:48.738967 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:48 crc kubenswrapper[4895]: I1202 07:23:48.738980 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:48 crc kubenswrapper[4895]: I1202 07:23:48.739001 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:48 crc kubenswrapper[4895]: I1202 07:23:48.739013 4895 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:48Z","lastTransitionTime":"2025-12-02T07:23:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:23:48 crc kubenswrapper[4895]: I1202 07:23:48.744595 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lhpbd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"81128292-8c02-45bd-9b25-e9457e989975\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cc63751b6428c88deed36b914b2a95a94a50b544c562126c2a9567c177b2570\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2z5n2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lhpbd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:48Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:48 crc kubenswrapper[4895]: I1202 07:23:48.841711 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:48 crc kubenswrapper[4895]: I1202 07:23:48.841802 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:48 crc kubenswrapper[4895]: I1202 07:23:48.841819 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:48 crc kubenswrapper[4895]: I1202 07:23:48.841843 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:48 crc kubenswrapper[4895]: I1202 07:23:48.841855 4895 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:48Z","lastTransitionTime":"2025-12-02T07:23:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:23:48 crc kubenswrapper[4895]: I1202 07:23:48.936643 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:23:48 crc kubenswrapper[4895]: I1202 07:23:48.936737 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:23:48 crc kubenswrapper[4895]: E1202 07:23:48.937047 4895 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 02 07:23:48 crc kubenswrapper[4895]: E1202 07:23:48.937183 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-02 07:24:04.937147745 +0000 UTC m=+56.108007408 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 02 07:23:48 crc kubenswrapper[4895]: E1202 07:23:48.937859 4895 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 02 07:23:48 crc kubenswrapper[4895]: E1202 07:23:48.937980 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-02 07:24:04.937946498 +0000 UTC m=+56.108806151 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 02 07:23:48 crc kubenswrapper[4895]: I1202 07:23:48.945036 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:48 crc kubenswrapper[4895]: I1202 07:23:48.945080 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:48 crc kubenswrapper[4895]: I1202 07:23:48.945100 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:48 crc kubenswrapper[4895]: I1202 07:23:48.945129 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:48 crc 
kubenswrapper[4895]: I1202 07:23:48.945147 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:48Z","lastTransitionTime":"2025-12-02T07:23:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:23:49 crc kubenswrapper[4895]: I1202 07:23:49.037864 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 07:23:49 crc kubenswrapper[4895]: E1202 07:23:49.038013 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 07:24:05.037991947 +0000 UTC m=+56.208851560 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:23:49 crc kubenswrapper[4895]: I1202 07:23:49.038178 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:23:49 crc kubenswrapper[4895]: I1202 07:23:49.038207 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:23:49 crc kubenswrapper[4895]: E1202 07:23:49.038355 4895 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 02 07:23:49 crc kubenswrapper[4895]: E1202 07:23:49.038371 4895 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 02 07:23:49 crc kubenswrapper[4895]: E1202 07:23:49.038383 4895 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod 
openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 07:23:49 crc kubenswrapper[4895]: E1202 07:23:49.038416 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-02 07:24:05.038409068 +0000 UTC m=+56.209268681 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 07:23:49 crc kubenswrapper[4895]: E1202 07:23:49.038355 4895 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 02 07:23:49 crc kubenswrapper[4895]: E1202 07:23:49.038462 4895 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 02 07:23:49 crc kubenswrapper[4895]: E1202 07:23:49.038475 4895 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 07:23:49 crc kubenswrapper[4895]: E1202 07:23:49.038524 4895 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-02 07:24:05.038511711 +0000 UTC m=+56.209371334 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 07:23:49 crc kubenswrapper[4895]: I1202 07:23:49.048049 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:49 crc kubenswrapper[4895]: I1202 07:23:49.048089 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:49 crc kubenswrapper[4895]: I1202 07:23:49.048102 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:49 crc kubenswrapper[4895]: I1202 07:23:49.048120 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:49 crc kubenswrapper[4895]: I1202 07:23:49.048136 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:49Z","lastTransitionTime":"2025-12-02T07:23:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:23:49 crc kubenswrapper[4895]: I1202 07:23:49.140778 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:23:49 crc kubenswrapper[4895]: E1202 07:23:49.140918 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 07:23:49 crc kubenswrapper[4895]: I1202 07:23:49.141247 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:23:49 crc kubenswrapper[4895]: E1202 07:23:49.141298 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 07:23:49 crc kubenswrapper[4895]: I1202 07:23:49.150343 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:49 crc kubenswrapper[4895]: I1202 07:23:49.150396 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:49 crc kubenswrapper[4895]: I1202 07:23:49.150410 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:49 crc kubenswrapper[4895]: I1202 07:23:49.150442 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:49 crc kubenswrapper[4895]: I1202 07:23:49.150458 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:49Z","lastTransitionTime":"2025-12-02T07:23:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:49 crc kubenswrapper[4895]: I1202 07:23:49.160820 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2db13abc9ba56d955814e75ceea54b92610a00daeefa29aa1aad584e3453728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:49Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:49 crc kubenswrapper[4895]: I1202 07:23:49.178267 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bb4bea68942dbface0009fbac76f7b1253ab282e8decb9081225de7d6537ec0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-id
entity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3115f2c50d05ee406f583937e39b1b55f364ce7cbeb0788b75941a22e23f57f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:49Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:49 crc kubenswrapper[4895]: I1202 07:23:49.200877 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"afc3334a-0153-4dcc-9a56-92f6cae51c08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91b0c6e50085a7be0924e5a54ef006110a7f13556fe47f7f55124fd6b2f0bc36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5463b077f0bc8b5d38e27854ece6a71f74f7769f4552c32eefe16854993f290f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80dca705ddb94ba799eb72bfc23b0c3878cceca48d6d10f1c7f8eb0e4beed40e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f54ae324450e73bcbf2c408794171a92bbb66c24d05f29584ac2c247090c1273\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f330d728eecd672ecb554744a51ebfb2c104441d3a85fe0c8a6fc3446c37e4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca732cac495c410d89da4ce1476acd20905716cbb2baeedac9c52b654ff58573\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://062dc017d6ff8750938c69f755c83863dc927df0e6a1a42671ab0c04b5d70327\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://062dc017d6ff8750938c69f755c83863dc927df0e6a1a42671ab0c04b5d70327\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T07:23:47Z\\\",\\\"message\\\":\\\"ndler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1202 07:23:47.842861 6190 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from 
sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1202 07:23:47.842911 6190 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1202 07:23:47.842975 6190 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 07:23:47.843039 6190 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 07:23:47.843839 6190 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1202 07:23:47.843870 6190 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1202 07:23:47.843888 6190 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1202 07:23:47.843895 6190 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1202 07:23:47.843946 6190 factory.go:656] Stopping watch factory\\\\nI1202 07:23:47.843968 6190 ovnkube.go:599] Stopped ovnkube\\\\nI1202 07:23:47.843997 6190 handler.go:208] Removed *v1.Node event handler 7\\\\nI1202 07:23:47.843993 6190 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1202 07:23:47.844025 6190 handler.go:208] Removed *v1.Namespace event handler 
1\\\\nI12\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\
":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6447684fe6189e38c9caad6dfe1384b7af01d50e2ef5f889c7b2874ba092306\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://328fa773c92ef0a965065412d4141c0b41ffe38d21ed169386a9d30d05f37f28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://328fa773c92ef0a965065412d4141c0b41ffe38d21ed169386a9d30d05
f37f28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w54m4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:49Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:49 crc kubenswrapper[4895]: I1202 07:23:49.222312 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b7fa1ad-7e38-4645-ab41-c7d395f5c10e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40e020a2b4caf39d1661cee9c6b0055eca86e0ae9b81d2da7485ca9ec92dd0aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33e300332628d3baf9041b52dc7b8cd9dab1e9c5da76572e86b787433fd5fb92\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4455fec2a817e7266e152d3f4afbede483504d6066f1a0f7375bce282ae10f8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b674e20b2199fd0083aed9cf001d27a72d31ca6c62cbe376c9510cc94f37d312\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9998dffc20c5b0579e6214d10541b9702e9c4a6563c930a41e49b6853b832ba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T07:23:32Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 07:23:21.537254 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 07:23:21.538901 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1012964219/tls.crt::/tmp/serving-cert-1012964219/tls.key\\\\\\\"\\\\nI1202 07:23:32.848375 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 07:23:32.853597 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 07:23:32.853643 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 07:23:32.853680 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 07:23:32.853691 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 07:23:32.869490 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 07:23:32.869536 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:23:32.869542 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:23:32.869555 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 07:23:32.869559 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 07:23:32.869564 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 07:23:32.869568 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 07:23:32.869624 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 07:23:32.874299 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d318f71ad32dd4a410093b85741038e468d540edd30f112234a9d595bc6fd85f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0faa0995131a1f9cbf2aa0cf7651d510fad1bdaf7e32de51f8a14b70f161e783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0faa0995131a1f9cbf2aa0cf7651d510fad1bdaf7e32de51f8a14b70f161e783\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:49Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:49 crc kubenswrapper[4895]: I1202 07:23:49.247238 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:49Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:49 crc kubenswrapper[4895]: I1202 07:23:49.252584 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:49 crc kubenswrapper[4895]: I1202 07:23:49.252614 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:49 crc kubenswrapper[4895]: I1202 07:23:49.252833 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:49 crc 
kubenswrapper[4895]: I1202 07:23:49.253672 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:49 crc kubenswrapper[4895]: I1202 07:23:49.253689 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:49Z","lastTransitionTime":"2025-12-02T07:23:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:23:49 crc kubenswrapper[4895]: I1202 07:23:49.270273 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-74fkh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b212bd8-f1a7-4982-b2e6-499c13a34b0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4d58e6f37ee8f90af6aad88138c39e9a0753c16cceb763c2bd2ec9c30087d92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7
f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhpwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-74fkh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:49Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:49 crc kubenswrapper[4895]: I1202 07:23:49.282455 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0468d2d1-a975-45a6-af9f-47adc432fab0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebf8583b59d7955530da7a83a29f9994ba139c076cf7b76dc94e88d14f39e1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vfwjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1474e8f5989607f3b95fefe811e3d787320f04c2
23edb6ffe47d2b37863ea874\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vfwjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wfcg7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:49Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:49 crc kubenswrapper[4895]: I1202 07:23:49.294321 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://444ad02c4afa919779de59fd8c81bdce5c3426dd6451b90505921a8d5fa0c96b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-02T07:23:49Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:49 crc kubenswrapper[4895]: I1202 07:23:49.305659 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:49Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:49 crc kubenswrapper[4895]: I1202 07:23:49.318280 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n7xcr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3514f381-d0d1-4e00-931e-c5ca75202a1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2670ef1ea22a6c1c5d9ec4ee4b0345c575b20224fddf8655b95eaa431573d071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3deeb37702c8243a956f2be1bb61e2693bb9be5453d775bc3f3e85618d483015\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3deeb37702c8243a956f2be1bb61e2693bb9be5453d775bc3f3e85618d483015\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2c60eddf285bbbf1a5957691f56d895e5bb7461fb18ab7c495879357cfae565\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2c60eddf285bbbf1a5957691f56d895e5bb7461fb18ab7c495879357cfae565\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e8b39d38597192393d90cc582b8ad109a1ed22c4c14f539e1a5445e6d817872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e8b39d38597192393d90cc582b8ad109a1ed22c4c14f539e1a5445e6d817872\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb5ab
c8da49059e23c4013f3856d3f4c9078cd8871a79f8dae5ce4d8cb319ef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb5abc8da49059e23c4013f3856d3f4c9078cd8871a79f8dae5ce4d8cb319ef9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbdd3e8943b356a485d900737682f0903dc04aefa33498ee5cdf3c82b6570852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbdd3e8943b356a485d900737682f0903dc04aefa33498ee5cdf3c82b6570852\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:43Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3641d1f7fd2ea062c216ce260c35d903273d3039fe790815ebb697f8c11d15cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3641d1f7fd2ea062c216ce260c35d903273d3039fe790815ebb697f8c11d15cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n7xcr\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:49Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:49 crc kubenswrapper[4895]: I1202 07:23:49.333444 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hlxqt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"30911fe5-208f-44e8-a380-2a0093f24863\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87ebbf4f529d4e7b64c392912d06e57dc80a2237a9a4e31ab5b56ac556aa10b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-
12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6l8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hlxqt\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:49Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:49 crc kubenswrapper[4895]: I1202 07:23:49.350799 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4604b0d5-cf8f-404d-af67-65da4b1dc003\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b3c366f0f27ff793a37b2944887ab1c1bc155c14d5c8d6c267442b0a8666ef8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6c3baa5b3772a8fc64774e28c3f389f114723e7efa62b934140a4fed8b6a40d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49d7c12ea2cd199dcfa46a6f1a1a856f1285a709def9d82b6805ba256a79fd5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2543d364f902a6c34a82dac25f389cd36afc2f717d772d0a487a640d44adf30f\\\",\\\"image\\\":\\\
"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:49Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:49 crc kubenswrapper[4895]: I1202 07:23:49.357274 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:49 crc kubenswrapper[4895]: I1202 07:23:49.357312 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:49 crc kubenswrapper[4895]: I1202 07:23:49.357323 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:49 crc kubenswrapper[4895]: I1202 07:23:49.357341 
4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:49 crc kubenswrapper[4895]: I1202 07:23:49.357353 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:49Z","lastTransitionTime":"2025-12-02T07:23:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:23:49 crc kubenswrapper[4895]: I1202 07:23:49.368771 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:49Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:49 crc kubenswrapper[4895]: I1202 07:23:49.386211 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lhpbd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"81128292-8c02-45bd-9b25-e9457e989975\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cc63751b6428c88deed36b914b2a95a94a50b544c562126c2a9567c177b2570\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2z5n2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lhpbd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:49Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:49 crc kubenswrapper[4895]: I1202 07:23:49.460477 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:49 crc kubenswrapper[4895]: I1202 07:23:49.460519 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:49 crc kubenswrapper[4895]: I1202 07:23:49.460530 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:49 crc kubenswrapper[4895]: I1202 07:23:49.460551 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:49 crc kubenswrapper[4895]: I1202 07:23:49.460561 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:49Z","lastTransitionTime":"2025-12-02T07:23:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:49 crc kubenswrapper[4895]: I1202 07:23:49.481209 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w54m4_afc3334a-0153-4dcc-9a56-92f6cae51c08/ovnkube-controller/0.log" Dec 02 07:23:49 crc kubenswrapper[4895]: I1202 07:23:49.484723 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" event={"ID":"afc3334a-0153-4dcc-9a56-92f6cae51c08","Type":"ContainerStarted","Data":"370f4392aff2eb7ec2d34d9535b978f2354692fcc0945eae167d222b1d97024d"} Dec 02 07:23:49 crc kubenswrapper[4895]: I1202 07:23:49.485241 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" Dec 02 07:23:49 crc kubenswrapper[4895]: I1202 07:23:49.500168 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2db13abc9ba56d955814e75ceea54b92610a00daeefa29aa1aad584e3453728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa388
11c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:49Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:49 crc kubenswrapper[4895]: I1202 07:23:49.515694 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bb4bea68942dbface0009fbac76f7b1253ab282e8decb9081225de7d6537ec0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3115f2c50d05ee406f583937e39b1b55f364ce7cbeb0788b75941a22e23f57f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:49Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:49 crc kubenswrapper[4895]: I1202 07:23:49.547803 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"afc3334a-0153-4dcc-9a56-92f6cae51c08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91b0c6e50085a7be0924e5a54ef006110a7f13556fe47f7f55124fd6b2f0bc36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5463b077f0bc8b5d38e27854ece6a71f74f7769f4552c32eefe16854993f290f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80dca705ddb94ba799eb72bfc23b0c3878cceca48d6d10f1c7f8eb0e4beed40e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f54ae324450e73bcbf2c408794171a92bbb66c24d05f29584ac2c247090c1273\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f330d728eecd672ecb554744a51ebfb2c104441d3a85fe0c8a6fc3446c37e4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca732cac495c410d89da4ce1476acd20905716cbb2baeedac9c52b654ff58573\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://370f4392aff2eb7ec2d34d9535b978f2354692fcc0945eae167d222b1d97024d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://062dc017d6ff8750938c69f755c83863dc927df0e6a1a42671ab0c04b5d70327\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T07:23:47Z\\\",\\\"message\\\":\\\"ndler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1202 07:23:47.842861 6190 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1202 07:23:47.842911 6190 handler.go:208] Removed *v1.NetworkPolicy event handler 
4\\\\nI1202 07:23:47.842975 6190 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 07:23:47.843039 6190 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 07:23:47.843839 6190 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1202 07:23:47.843870 6190 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1202 07:23:47.843888 6190 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1202 07:23:47.843895 6190 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1202 07:23:47.843946 6190 factory.go:656] Stopping watch factory\\\\nI1202 07:23:47.843968 6190 ovnkube.go:599] Stopped ovnkube\\\\nI1202 07:23:47.843997 6190 handler.go:208] Removed *v1.Node event handler 7\\\\nI1202 07:23:47.843993 6190 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1202 07:23:47.844025 6190 handler.go:208] Removed *v1.Namespace event handler 
1\\\\nI12\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"
name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6447684fe6189e38c9caad6dfe1384b7af01d50e2ef5f889c7b2874ba092306\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://328fa773c92ef0a965065412d4141c0b41ffe38d21ed169386a9d30d05f37f28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://328fa773c92ef0a965065412d4141c0b41ffe38d21ed169386a9d30d05f37f28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w54m4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:49Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:49 crc kubenswrapper[4895]: I1202 07:23:49.563232 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:49 crc kubenswrapper[4895]: I1202 07:23:49.563322 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:49 crc kubenswrapper[4895]: I1202 07:23:49.563346 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:49 crc kubenswrapper[4895]: I1202 07:23:49.563385 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:49 crc kubenswrapper[4895]: I1202 07:23:49.563409 4895 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:49Z","lastTransitionTime":"2025-12-02T07:23:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:23:49 crc kubenswrapper[4895]: I1202 07:23:49.566871 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b7fa1ad-7e38-4645-ab41-c7d395f5c10e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40e020a2b4caf39d1661cee9c6b0055eca86e0ae9b81d2da7485ca9ec92dd0aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33e300332628d3baf9041b52dc7b8cd9dab1e9c5da76572e86b787433fd5fb92\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://4455fec2a817e7266e152d3f4afbede483504d6066f1a0f7375bce282ae10f8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b674e20b2199fd0083aed9cf001d27a72d31ca6c62cbe376c9510cc94f37d312\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9998dffc20c5b0579e6214d10541b9702e9c4a6563c930a41e49b6853b832ba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T07:23:32Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 07:23:21.537254 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 07:23:21.538901 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1012964219/tls.crt::/tmp/serving-cert-1012964219/tls.key\\\\\\\"\\\\nI1202 07:23:32.848375 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 07:23:32.853597 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 07:23:32.853643 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 07:23:32.853680 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 07:23:32.853691 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 07:23:32.869490 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 07:23:32.869536 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:23:32.869542 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:23:32.869555 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 07:23:32.869559 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 07:23:32.869564 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 07:23:32.869568 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 07:23:32.869624 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 07:23:32.874299 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d318f71ad32dd4a410093b85741038e468d540edd30f112234a9d595bc6fd85f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0faa0995131a1f9cbf2aa0cf7651d510fad1bdaf7e32de51f8a14b70f161e783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0faa0995131a1f9cbf2aa0cf7651d510fad1bdaf7e32de51f8a14b70f161e783\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:49Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:49 crc kubenswrapper[4895]: I1202 07:23:49.584707 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:49Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:49 crc kubenswrapper[4895]: I1202 07:23:49.603082 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-74fkh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b212bd8-f1a7-4982-b2e6-499c13a34b0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4d58e6f37ee8f90af6aad88138c39e9a0753c16cceb763c2bd2ec9c30087d92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhpwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-74fkh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:49Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:49 crc kubenswrapper[4895]: I1202 07:23:49.629391 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0468d2d1-a975-45a6-af9f-47adc432fab0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebf8583b59d7955530da7a83a29f9994ba139c076cf7b76dc94e88d14f39e1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vfwjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1474e8f5989607f3b95fefe811e3d787320f04c223edb6ffe47d2b37863ea874\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vfwjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wfcg7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:49Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:49 crc kubenswrapper[4895]: I1202 07:23:49.648405 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4604b0d5-cf8f-404d-af67-65da4b1dc003\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b3c366f0f27ff793a37b2944887ab1c1bc155c14d5c8d6c267442b0a8666ef8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes
/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6c3baa5b3772a8fc64774e28c3f389f114723e7efa62b934140a4fed8b6a40d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49d7c12ea2cd199dcfa46a6f1a1a856f1285a709def9d82b6805ba256a79fd5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2543d364f902a6c34a82dac25f389cd36afc2f717d772d0a487a640d44adf30f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\
\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:49Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:49 crc kubenswrapper[4895]: I1202 07:23:49.666498 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:49 crc kubenswrapper[4895]: I1202 07:23:49.666557 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:49 crc kubenswrapper[4895]: I1202 07:23:49.666569 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:49 crc kubenswrapper[4895]: I1202 07:23:49.666587 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:49 crc kubenswrapper[4895]: I1202 
07:23:49.666600 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:49Z","lastTransitionTime":"2025-12-02T07:23:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:23:49 crc kubenswrapper[4895]: I1202 07:23:49.671695 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://444ad02c4afa919779de59fd8c81bdce5c3426dd6451b90505921a8d5fa0c96b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\
\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:49Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:49 crc kubenswrapper[4895]: I1202 07:23:49.690256 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:49Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:49 crc kubenswrapper[4895]: I1202 07:23:49.711816 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n7xcr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3514f381-d0d1-4e00-931e-c5ca75202a1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2670ef1ea22a6c1c5d9ec4ee4b0345c575b20224fddf8655b95eaa431573d071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3deeb37702c8243a956f2be1bb61e2693bb9be5453d775bc3f3e85618d483015\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3deeb37702c8243a956f2be1bb61e2693bb9be5453d775bc3f3e85618d483015\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2c60eddf285bbbf1a5957691f56d895e5bb7461fb18ab7c495879357cfae565\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2c60eddf285bbbf1a5957691f56d895e5bb7461fb18ab7c495879357cfae565\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e8b39d38597192393d90cc582b8ad109a1ed22c4c14f539e1a5445e6d817872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e8b39d38597192393d90cc582b8ad109a1ed22c4c14f539e1a5445e6d817872\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb5ab
c8da49059e23c4013f3856d3f4c9078cd8871a79f8dae5ce4d8cb319ef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb5abc8da49059e23c4013f3856d3f4c9078cd8871a79f8dae5ce4d8cb319ef9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbdd3e8943b356a485d900737682f0903dc04aefa33498ee5cdf3c82b6570852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbdd3e8943b356a485d900737682f0903dc04aefa33498ee5cdf3c82b6570852\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:43Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3641d1f7fd2ea062c216ce260c35d903273d3039fe790815ebb697f8c11d15cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3641d1f7fd2ea062c216ce260c35d903273d3039fe790815ebb697f8c11d15cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n7xcr\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:49Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:49 crc kubenswrapper[4895]: I1202 07:23:49.726344 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hlxqt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"30911fe5-208f-44e8-a380-2a0093f24863\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87ebbf4f529d4e7b64c392912d06e57dc80a2237a9a4e31ab5b56ac556aa10b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-
12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6l8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hlxqt\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:49Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:49 crc kubenswrapper[4895]: I1202 07:23:49.740708 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:49Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:49 crc kubenswrapper[4895]: I1202 07:23:49.752391 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lhpbd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"81128292-8c02-45bd-9b25-e9457e989975\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cc63751b6428c88deed36b914b2a95a94a50b544c562126c2a9567c177b2570\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2z5n2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lhpbd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:49Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:49 crc kubenswrapper[4895]: I1202 07:23:49.769571 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:49 crc kubenswrapper[4895]: I1202 07:23:49.769602 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:49 crc kubenswrapper[4895]: I1202 07:23:49.769613 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:49 crc kubenswrapper[4895]: I1202 07:23:49.769628 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:49 crc kubenswrapper[4895]: I1202 07:23:49.769640 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:49Z","lastTransitionTime":"2025-12-02T07:23:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:49 crc kubenswrapper[4895]: I1202 07:23:49.872894 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:49 crc kubenswrapper[4895]: I1202 07:23:49.872961 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:49 crc kubenswrapper[4895]: I1202 07:23:49.872980 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:49 crc kubenswrapper[4895]: I1202 07:23:49.873007 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:49 crc kubenswrapper[4895]: I1202 07:23:49.873025 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:49Z","lastTransitionTime":"2025-12-02T07:23:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:49 crc kubenswrapper[4895]: I1202 07:23:49.976310 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:49 crc kubenswrapper[4895]: I1202 07:23:49.976384 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:49 crc kubenswrapper[4895]: I1202 07:23:49.976404 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:49 crc kubenswrapper[4895]: I1202 07:23:49.976433 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:49 crc kubenswrapper[4895]: I1202 07:23:49.976452 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:49Z","lastTransitionTime":"2025-12-02T07:23:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:50 crc kubenswrapper[4895]: I1202 07:23:50.079866 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:50 crc kubenswrapper[4895]: I1202 07:23:50.079951 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:50 crc kubenswrapper[4895]: I1202 07:23:50.079977 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:50 crc kubenswrapper[4895]: I1202 07:23:50.080012 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:50 crc kubenswrapper[4895]: I1202 07:23:50.080040 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:50Z","lastTransitionTime":"2025-12-02T07:23:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:23:50 crc kubenswrapper[4895]: I1202 07:23:50.140509 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:23:50 crc kubenswrapper[4895]: E1202 07:23:50.140837 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 07:23:50 crc kubenswrapper[4895]: I1202 07:23:50.183115 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:50 crc kubenswrapper[4895]: I1202 07:23:50.183177 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:50 crc kubenswrapper[4895]: I1202 07:23:50.183196 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:50 crc kubenswrapper[4895]: I1202 07:23:50.183221 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:50 crc kubenswrapper[4895]: I1202 07:23:50.183240 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:50Z","lastTransitionTime":"2025-12-02T07:23:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:50 crc kubenswrapper[4895]: I1202 07:23:50.286295 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:50 crc kubenswrapper[4895]: I1202 07:23:50.286362 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:50 crc kubenswrapper[4895]: I1202 07:23:50.286378 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:50 crc kubenswrapper[4895]: I1202 07:23:50.286399 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:50 crc kubenswrapper[4895]: I1202 07:23:50.286412 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:50Z","lastTransitionTime":"2025-12-02T07:23:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:50 crc kubenswrapper[4895]: I1202 07:23:50.389932 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:50 crc kubenswrapper[4895]: I1202 07:23:50.390612 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:50 crc kubenswrapper[4895]: I1202 07:23:50.390649 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:50 crc kubenswrapper[4895]: I1202 07:23:50.390679 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:50 crc kubenswrapper[4895]: I1202 07:23:50.390705 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:50Z","lastTransitionTime":"2025-12-02T07:23:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:50 crc kubenswrapper[4895]: I1202 07:23:50.492844 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w54m4_afc3334a-0153-4dcc-9a56-92f6cae51c08/ovnkube-controller/1.log" Dec 02 07:23:50 crc kubenswrapper[4895]: I1202 07:23:50.494554 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:50 crc kubenswrapper[4895]: I1202 07:23:50.494624 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:50 crc kubenswrapper[4895]: I1202 07:23:50.494649 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:50 crc kubenswrapper[4895]: I1202 07:23:50.494681 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:50 crc kubenswrapper[4895]: I1202 07:23:50.494706 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:50Z","lastTransitionTime":"2025-12-02T07:23:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:50 crc kubenswrapper[4895]: I1202 07:23:50.495105 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w54m4_afc3334a-0153-4dcc-9a56-92f6cae51c08/ovnkube-controller/0.log" Dec 02 07:23:50 crc kubenswrapper[4895]: I1202 07:23:50.499421 4895 generic.go:334] "Generic (PLEG): container finished" podID="afc3334a-0153-4dcc-9a56-92f6cae51c08" containerID="370f4392aff2eb7ec2d34d9535b978f2354692fcc0945eae167d222b1d97024d" exitCode=1 Dec 02 07:23:50 crc kubenswrapper[4895]: I1202 07:23:50.499479 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" event={"ID":"afc3334a-0153-4dcc-9a56-92f6cae51c08","Type":"ContainerDied","Data":"370f4392aff2eb7ec2d34d9535b978f2354692fcc0945eae167d222b1d97024d"} Dec 02 07:23:50 crc kubenswrapper[4895]: I1202 07:23:50.499555 4895 scope.go:117] "RemoveContainer" containerID="062dc017d6ff8750938c69f755c83863dc927df0e6a1a42671ab0c04b5d70327" Dec 02 07:23:50 crc kubenswrapper[4895]: I1202 07:23:50.501202 4895 scope.go:117] "RemoveContainer" containerID="370f4392aff2eb7ec2d34d9535b978f2354692fcc0945eae167d222b1d97024d" Dec 02 07:23:50 crc kubenswrapper[4895]: E1202 07:23:50.501527 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-w54m4_openshift-ovn-kubernetes(afc3334a-0153-4dcc-9a56-92f6cae51c08)\"" pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" podUID="afc3334a-0153-4dcc-9a56-92f6cae51c08" Dec 02 07:23:50 crc kubenswrapper[4895]: I1202 07:23:50.524419 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:50Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:50 crc kubenswrapper[4895]: I1202 07:23:50.539618 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lhpbd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"81128292-8c02-45bd-9b25-e9457e989975\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cc63751b6428c88deed36b914b2a95a94a50b544c562126c2a9567c177b2570\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2z5n2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lhpbd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:50Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:50 crc kubenswrapper[4895]: I1202 07:23:50.559592 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2db13abc9ba56d955814e75ceea54b92610a00daeefa29aa1aad584e3453728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:50Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:50 crc kubenswrapper[4895]: I1202 07:23:50.582952 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bb4bea68942dbface0009fbac76f7b1253ab282e8decb9081225de7d6537ec0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3115f2c50d05ee406f583937e39b1b55f364ce7cbeb0788b75941a22e23f57f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:50Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:50 crc kubenswrapper[4895]: I1202 07:23:50.598184 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:50 crc kubenswrapper[4895]: I1202 07:23:50.598261 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:50 crc kubenswrapper[4895]: I1202 07:23:50.598280 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:50 crc kubenswrapper[4895]: I1202 07:23:50.598315 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:50 crc kubenswrapper[4895]: I1202 07:23:50.598335 4895 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:50Z","lastTransitionTime":"2025-12-02T07:23:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:23:50 crc kubenswrapper[4895]: I1202 07:23:50.606092 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"afc3334a-0153-4dcc-9a56-92f6cae51c08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91b0c6e50085a7be0924e5a54ef006110a7f13556fe47f7f55124fd6b2f0bc36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5463b077f0bc8b5d38e27854ece6a71f74f7769f4552c32eefe16854993f290f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80dca705ddb94ba799eb72bfc23b0c3878cceca48d6d10f1c7f8eb0e4beed40e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f54ae324450e73bcbf2c408794171a92bbb66c24d05f29584ac2c247090c1273\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f330d728eecd672ecb554744a51ebfb2c104441d3a85fe0c8a6fc3446c37e4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca732cac495c410d89da4ce1476acd20905716cbb2baeedac9c52b654ff58573\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://370f4392aff2eb7ec2d34d9535b978f2354692fcc0945eae167d222b1d97024d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://062dc017d6ff8750938c69f755c83863dc927df0e6a1a42671ab0c04b5d70327\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T07:23:47Z\\\",\\\"message\\\":\\\"ndler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1202 07:23:47.842861 6190 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1202 07:23:47.842911 6190 handler.go:208] Removed *v1.NetworkPolicy event handler 
4\\\\nI1202 07:23:47.842975 6190 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 07:23:47.843039 6190 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 07:23:47.843839 6190 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1202 07:23:47.843870 6190 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1202 07:23:47.843888 6190 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1202 07:23:47.843895 6190 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1202 07:23:47.843946 6190 factory.go:656] Stopping watch factory\\\\nI1202 07:23:47.843968 6190 ovnkube.go:599] Stopped ovnkube\\\\nI1202 07:23:47.843997 6190 handler.go:208] Removed *v1.Node event handler 7\\\\nI1202 07:23:47.843993 6190 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1202 07:23:47.844025 6190 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI12\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://370f4392aff2eb7ec2d34d9535b978f2354692fcc0945eae167d222b1d97024d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T07:23:49Z\\\",\\\"message\\\":\\\"Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1202 07:23:49.673830 6315 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-apiserver/check-endpoints]} name:Service_openshift-apiserver/check-endpoints_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} 
selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.139:17698:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {8efa4d1a-72f5-4dfa-9bc2-9d93ef11ecf2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1202 07:23:49.676921 6315 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-apiserver/api\\\\\\\"}\\\\nI1202 07:23:49.676925 6315 obj_retry.go:386] Retry successful for *v1.Pod openshift-machine-config-operator/machine-config-daemon-wfcg7 after 0 failed attempt(s)\\\\nI1202 07:23:49.676901 6315 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-apiserver/check-endpoints]} name:Service_openshift-apiserver/check-endpoints_TCP_cluster options:{GoMap:map\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":
\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6447684fe6189e38c9caad6dfe1384b7af01d50e2ef5f889c7b2874ba092306\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/e
nv\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://328fa773c92ef0a965065412d4141c0b41ffe38d21ed169386a9d30d05f37f28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://328fa773c92ef0a965065412d4141c0b41ffe38d21ed169386a9d30d05f37f28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w54m4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:50Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:50 crc 
kubenswrapper[4895]: I1202 07:23:50.630254 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b7fa1ad-7e38-4645-ab41-c7d395f5c10e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40e020a2b4caf39d1661cee9c6b0055eca86e0ae9b81d2da7485ca9ec92dd0aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33e300332628d3baf9041b52dc7b8cd9dab1e9c5da76572e86b787433fd5fb92\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://4455fec2a817e7266e152d3f4afbede483504d6066f1a0f7375bce282ae10f8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b674e20b2199fd0083aed9cf001d27a72d31ca6c62cbe376c9510cc94f37d312\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9998dffc20c5b0579e6214d10541b9702e9c4a6563c930a41e49b6853b832ba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T07:23:32Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 07:23:21.537254 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 07:23:21.538901 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1012964219/tls.crt::/tmp/serving-cert-1012964219/tls.key\\\\\\\"\\\\nI1202 07:23:32.848375 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 07:23:32.853597 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 07:23:32.853643 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 07:23:32.853680 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 07:23:32.853691 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 07:23:32.869490 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 07:23:32.869536 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:23:32.869542 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:23:32.869555 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 07:23:32.869559 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 07:23:32.869564 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 07:23:32.869568 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 07:23:32.869624 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 07:23:32.874299 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d318f71ad32dd4a410093b85741038e468d540edd30f112234a9d595bc6fd85f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0faa0995131a1f9cbf2aa0cf7651d510fad1bdaf7e32de51f8a14b70f161e783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0faa0995131a1f9cbf2aa0cf7651d510fad1bdaf7e32de51f8a14b70f161e783\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:50Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:50 crc kubenswrapper[4895]: I1202 07:23:50.652556 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:50Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:50 crc kubenswrapper[4895]: I1202 07:23:50.666937 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-74fkh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b212bd8-f1a7-4982-b2e6-499c13a34b0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4d58e6f37ee8f90af6aad88138c39e9a0753c16cceb763c2bd2ec9c30087d92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhpwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-74fkh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:50Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:50 crc kubenswrapper[4895]: I1202 07:23:50.686648 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0468d2d1-a975-45a6-af9f-47adc432fab0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebf8583b59d7955530da7a83a29f9994ba139c076cf7b76dc94e88d14f39e1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vfwjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1474e8f5989607f3b95fefe811e3d787320f04c223edb6ffe47d2b37863ea874\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vfwjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wfcg7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:50Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:50 crc kubenswrapper[4895]: I1202 07:23:50.701636 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:50 crc kubenswrapper[4895]: I1202 07:23:50.701964 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:50 crc kubenswrapper[4895]: I1202 07:23:50.701995 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:50 crc kubenswrapper[4895]: I1202 07:23:50.702028 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:50 crc kubenswrapper[4895]: I1202 07:23:50.702052 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:50Z","lastTransitionTime":"2025-12-02T07:23:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:50 crc kubenswrapper[4895]: I1202 07:23:50.704858 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4604b0d5-cf8f-404d-af67-65da4b1dc003\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b3c366f0f27ff793a37b2944887ab1c1bc155c14d5c8d6c267442b0a8666ef8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6c3baa5b37
72a8fc64774e28c3f389f114723e7efa62b934140a4fed8b6a40d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49d7c12ea2cd199dcfa46a6f1a1a856f1285a709def9d82b6805ba256a79fd5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2543d364f902a6c34a82dac25f389cd36afc2f717d772d0a487a640d44adf30f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:50Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:50 crc kubenswrapper[4895]: I1202 07:23:50.720310 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://444ad02c4afa919779de59fd8c81bdce5c3426dd6451b90505921a8d5fa0c96b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-02T07:23:50Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:50 crc kubenswrapper[4895]: I1202 07:23:50.737057 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:50Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:50 crc kubenswrapper[4895]: I1202 07:23:50.755451 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n7xcr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3514f381-d0d1-4e00-931e-c5ca75202a1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2670ef1ea22a6c1c5d9ec4ee4b0345c575b20224fddf8655b95eaa431573d071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3deeb37702c8243a956f2be1bb61e2693bb9be5453d775bc3f3e85618d483015\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3deeb37702c8243a956f2be1bb61e2693bb9be5453d775bc3f3e85618d483015\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2c60eddf285bbbf1a5957691f56d895e5bb7461fb18ab7c495879357cfae565\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2c60eddf285bbbf1a5957691f56d895e5bb7461fb18ab7c495879357cfae565\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e8b39d38597192393d90cc582b8ad109a1ed22c4c14f539e1a5445e6d817872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e8b39d38597192393d90cc582b8ad109a1ed22c4c14f539e1a5445e6d817872\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb5ab
c8da49059e23c4013f3856d3f4c9078cd8871a79f8dae5ce4d8cb319ef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb5abc8da49059e23c4013f3856d3f4c9078cd8871a79f8dae5ce4d8cb319ef9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbdd3e8943b356a485d900737682f0903dc04aefa33498ee5cdf3c82b6570852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbdd3e8943b356a485d900737682f0903dc04aefa33498ee5cdf3c82b6570852\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:43Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3641d1f7fd2ea062c216ce260c35d903273d3039fe790815ebb697f8c11d15cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3641d1f7fd2ea062c216ce260c35d903273d3039fe790815ebb697f8c11d15cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n7xcr\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:50Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:50 crc kubenswrapper[4895]: I1202 07:23:50.778608 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hlxqt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"30911fe5-208f-44e8-a380-2a0093f24863\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87ebbf4f529d4e7b64c392912d06e57dc80a2237a9a4e31ab5b56ac556aa10b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-
12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6l8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hlxqt\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:50Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:50 crc kubenswrapper[4895]: I1202 07:23:50.805104 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:50 crc kubenswrapper[4895]: I1202 07:23:50.805154 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:50 crc kubenswrapper[4895]: I1202 07:23:50.805168 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:50 crc kubenswrapper[4895]: I1202 07:23:50.805187 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:50 crc kubenswrapper[4895]: I1202 07:23:50.805201 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:50Z","lastTransitionTime":"2025-12-02T07:23:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:50 crc kubenswrapper[4895]: I1202 07:23:50.908822 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:50 crc kubenswrapper[4895]: I1202 07:23:50.908883 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:50 crc kubenswrapper[4895]: I1202 07:23:50.908906 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:50 crc kubenswrapper[4895]: I1202 07:23:50.908937 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:50 crc kubenswrapper[4895]: I1202 07:23:50.908957 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:50Z","lastTransitionTime":"2025-12-02T07:23:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:51 crc kubenswrapper[4895]: I1202 07:23:51.012848 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:51 crc kubenswrapper[4895]: I1202 07:23:51.012944 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:51 crc kubenswrapper[4895]: I1202 07:23:51.012968 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:51 crc kubenswrapper[4895]: I1202 07:23:51.013008 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:51 crc kubenswrapper[4895]: I1202 07:23:51.013032 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:51Z","lastTransitionTime":"2025-12-02T07:23:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:51 crc kubenswrapper[4895]: I1202 07:23:51.116238 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:51 crc kubenswrapper[4895]: I1202 07:23:51.116300 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:51 crc kubenswrapper[4895]: I1202 07:23:51.116324 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:51 crc kubenswrapper[4895]: I1202 07:23:51.116355 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:51 crc kubenswrapper[4895]: I1202 07:23:51.116382 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:51Z","lastTransitionTime":"2025-12-02T07:23:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:23:51 crc kubenswrapper[4895]: I1202 07:23:51.140797 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:23:51 crc kubenswrapper[4895]: E1202 07:23:51.141034 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 07:23:51 crc kubenswrapper[4895]: I1202 07:23:51.141837 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:23:51 crc kubenswrapper[4895]: E1202 07:23:51.141953 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 07:23:51 crc kubenswrapper[4895]: I1202 07:23:51.149274 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 07:23:51 crc kubenswrapper[4895]: I1202 07:23:51.169707 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:51Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:51 crc kubenswrapper[4895]: I1202 07:23:51.190157 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lhpbd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"81128292-8c02-45bd-9b25-e9457e989975\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cc63751b6428c88deed36b914b2a95a94a50b544c562126c2a9567c177b2570\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2z5n2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lhpbd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:51Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:51 crc kubenswrapper[4895]: I1202 07:23:51.191004 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wxp5p"] Dec 02 07:23:51 crc kubenswrapper[4895]: I1202 07:23:51.192012 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wxp5p" Dec 02 07:23:51 crc kubenswrapper[4895]: I1202 07:23:51.195732 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Dec 02 07:23:51 crc kubenswrapper[4895]: I1202 07:23:51.196976 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 02 07:23:51 crc kubenswrapper[4895]: I1202 07:23:51.215452 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2db13abc9ba56d955814e75ceea54b92610a00daeefa29aa1aad584e3453728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:51Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:51 crc kubenswrapper[4895]: I1202 07:23:51.219320 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:51 crc kubenswrapper[4895]: I1202 07:23:51.219370 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:51 crc kubenswrapper[4895]: I1202 07:23:51.219389 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:51 crc kubenswrapper[4895]: I1202 07:23:51.219416 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:51 crc kubenswrapper[4895]: I1202 07:23:51.219434 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:51Z","lastTransitionTime":"2025-12-02T07:23:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:51 crc kubenswrapper[4895]: I1202 07:23:51.241433 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bb4bea68942dbface0009fbac76f7b1253ab282e8decb9081225de7d6537ec0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3115f2c50d05ee406f583937e39b1b55f364ce7cbeb0788b75941a22e23f57f9\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:51Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:51 crc kubenswrapper[4895]: I1202 07:23:51.265985 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6a0b438f-70fd-4c3f-a170-2b3fe9797955-env-overrides\") pod \"ovnkube-control-plane-749d76644c-wxp5p\" (UID: \"6a0b438f-70fd-4c3f-a170-2b3fe9797955\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wxp5p" Dec 02 07:23:51 crc kubenswrapper[4895]: I1202 07:23:51.266074 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gj6l9\" (UniqueName: 
\"kubernetes.io/projected/6a0b438f-70fd-4c3f-a170-2b3fe9797955-kube-api-access-gj6l9\") pod \"ovnkube-control-plane-749d76644c-wxp5p\" (UID: \"6a0b438f-70fd-4c3f-a170-2b3fe9797955\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wxp5p" Dec 02 07:23:51 crc kubenswrapper[4895]: I1202 07:23:51.266163 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6a0b438f-70fd-4c3f-a170-2b3fe9797955-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-wxp5p\" (UID: \"6a0b438f-70fd-4c3f-a170-2b3fe9797955\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wxp5p" Dec 02 07:23:51 crc kubenswrapper[4895]: I1202 07:23:51.266218 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6a0b438f-70fd-4c3f-a170-2b3fe9797955-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-wxp5p\" (UID: \"6a0b438f-70fd-4c3f-a170-2b3fe9797955\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wxp5p" Dec 02 07:23:51 crc kubenswrapper[4895]: I1202 07:23:51.288360 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"afc3334a-0153-4dcc-9a56-92f6cae51c08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91b0c6e50085a7be0924e5a54ef006110a7f13556fe47f7f55124fd6b2f0bc36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5463b077f0bc8b5d38e27854ece6a71f74f7769f4552c32eefe16854993f290f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80dca705ddb94ba799eb72bfc23b0c3878cceca48d6d10f1c7f8eb0e4beed40e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f54ae324450e73bcbf2c408794171a92bbb66c24d05f29584ac2c247090c1273\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f330d728eecd672ecb554744a51ebfb2c104441d3a85fe0c8a6fc3446c37e4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca732cac495c410d89da4ce1476acd20905716cbb2baeedac9c52b654ff58573\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://370f4392aff2eb7ec2d34d9535b978f2354692fcc0945eae167d222b1d97024d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://062dc017d6ff8750938c69f755c83863dc927df0e6a1a42671ab0c04b5d70327\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T07:23:47Z\\\",\\\"message\\\":\\\"ndler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1202 07:23:47.842861 6190 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1202 07:23:47.842911 6190 handler.go:208] Removed *v1.NetworkPolicy event handler 
4\\\\nI1202 07:23:47.842975 6190 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 07:23:47.843039 6190 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 07:23:47.843839 6190 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1202 07:23:47.843870 6190 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1202 07:23:47.843888 6190 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1202 07:23:47.843895 6190 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1202 07:23:47.843946 6190 factory.go:656] Stopping watch factory\\\\nI1202 07:23:47.843968 6190 ovnkube.go:599] Stopped ovnkube\\\\nI1202 07:23:47.843997 6190 handler.go:208] Removed *v1.Node event handler 7\\\\nI1202 07:23:47.843993 6190 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1202 07:23:47.844025 6190 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI12\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://370f4392aff2eb7ec2d34d9535b978f2354692fcc0945eae167d222b1d97024d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T07:23:49Z\\\",\\\"message\\\":\\\"Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1202 07:23:49.673830 6315 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-apiserver/check-endpoints]} name:Service_openshift-apiserver/check-endpoints_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} 
selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.139:17698:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {8efa4d1a-72f5-4dfa-9bc2-9d93ef11ecf2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1202 07:23:49.676921 6315 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-apiserver/api\\\\\\\"}\\\\nI1202 07:23:49.676925 6315 obj_retry.go:386] Retry successful for *v1.Pod openshift-machine-config-operator/machine-config-daemon-wfcg7 after 0 failed attempt(s)\\\\nI1202 07:23:49.676901 6315 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-apiserver/check-endpoints]} name:Service_openshift-apiserver/check-endpoints_TCP_cluster options:{GoMap:map\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":
\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6447684fe6189e38c9caad6dfe1384b7af01d50e2ef5f889c7b2874ba092306\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/e
nv\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://328fa773c92ef0a965065412d4141c0b41ffe38d21ed169386a9d30d05f37f28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://328fa773c92ef0a965065412d4141c0b41ffe38d21ed169386a9d30d05f37f28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w54m4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:51Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:51 crc 
kubenswrapper[4895]: I1202 07:23:51.307780 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0468d2d1-a975-45a6-af9f-47adc432fab0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebf8583b59d7955530da7a83a29f9994ba139c076cf7b76dc94e88d14f39e1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vfwjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1474e8f5989607f3b95fefe811e3d787320f04c223edb6ffe47d2b37863ea874\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vfwjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wfcg7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:51Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:51 crc kubenswrapper[4895]: I1202 07:23:51.321881 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:51 crc kubenswrapper[4895]: I1202 07:23:51.321968 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" 
Dec 02 07:23:51 crc kubenswrapper[4895]: I1202 07:23:51.321991 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:51 crc kubenswrapper[4895]: I1202 07:23:51.322023 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:51 crc kubenswrapper[4895]: I1202 07:23:51.322044 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:51Z","lastTransitionTime":"2025-12-02T07:23:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:23:51 crc kubenswrapper[4895]: I1202 07:23:51.332330 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b7fa1ad-7e38-4645-ab41-c7d395f5c10e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40e020a2b4caf39d1661cee9c6b0055eca86e0ae9b81d2da7485ca9ec92dd0aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33e300332628d3baf9041b52dc7b8cd9dab1e9c5da76572e86b787433fd5fb92\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4455fec2a817e7266e152d3f4afbede483504d6066f1a0f7375bce282ae10f8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b674e20b2199fd0083aed9cf001d27a72d31ca6c62cbe376c9510cc94f37d312\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9998dffc20c5b0579e6214d10541b9702e9c4a6563c930a41e49b6853b832ba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T07:23:32Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 07:23:21.537254 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 07:23:21.538901 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1012964219/tls.crt::/tmp/serving-cert-1012964219/tls.key\\\\\\\"\\\\nI1202 07:23:32.848375 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 07:23:32.853597 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 07:23:32.853643 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 07:23:32.853680 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 07:23:32.853691 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 07:23:32.869490 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 07:23:32.869536 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:23:32.869542 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:23:32.869555 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 07:23:32.869559 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 07:23:32.869564 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 07:23:32.869568 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 07:23:32.869624 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1202 07:23:32.874299 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d318f71ad32dd4a410093b85741038e468d540edd30f112234a9d595bc6fd85f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0faa0995131a1f9cbf2aa0cf7651d510fad1bdaf7e32de51f8a14b70f161e783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0faa0995131a1f9cbf2aa0cf7651d510f
ad1bdaf7e32de51f8a14b70f161e783\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:51Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:51 crc kubenswrapper[4895]: I1202 07:23:51.352657 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:51Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:51 crc kubenswrapper[4895]: I1202 07:23:51.367911 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gj6l9\" (UniqueName: \"kubernetes.io/projected/6a0b438f-70fd-4c3f-a170-2b3fe9797955-kube-api-access-gj6l9\") pod \"ovnkube-control-plane-749d76644c-wxp5p\" (UID: \"6a0b438f-70fd-4c3f-a170-2b3fe9797955\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wxp5p" Dec 02 07:23:51 crc kubenswrapper[4895]: I1202 07:23:51.368158 4895 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6a0b438f-70fd-4c3f-a170-2b3fe9797955-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-wxp5p\" (UID: \"6a0b438f-70fd-4c3f-a170-2b3fe9797955\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wxp5p" Dec 02 07:23:51 crc kubenswrapper[4895]: I1202 07:23:51.368215 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6a0b438f-70fd-4c3f-a170-2b3fe9797955-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-wxp5p\" (UID: \"6a0b438f-70fd-4c3f-a170-2b3fe9797955\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wxp5p" Dec 02 07:23:51 crc kubenswrapper[4895]: I1202 07:23:51.368377 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6a0b438f-70fd-4c3f-a170-2b3fe9797955-env-overrides\") pod \"ovnkube-control-plane-749d76644c-wxp5p\" (UID: \"6a0b438f-70fd-4c3f-a170-2b3fe9797955\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wxp5p" Dec 02 07:23:51 crc kubenswrapper[4895]: I1202 07:23:51.369470 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6a0b438f-70fd-4c3f-a170-2b3fe9797955-env-overrides\") pod \"ovnkube-control-plane-749d76644c-wxp5p\" (UID: \"6a0b438f-70fd-4c3f-a170-2b3fe9797955\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wxp5p" Dec 02 07:23:51 crc kubenswrapper[4895]: I1202 07:23:51.370594 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-74fkh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b212bd8-f1a7-4982-b2e6-499c13a34b0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4d58e6f37ee8f90af6aad88138c39e9a0753c16cceb763c2bd2ec9c30087d92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhpwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-74fkh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:51Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:51 crc kubenswrapper[4895]: I1202 07:23:51.371121 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6a0b438f-70fd-4c3f-a170-2b3fe9797955-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-wxp5p\" (UID: \"6a0b438f-70fd-4c3f-a170-2b3fe9797955\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wxp5p" Dec 02 07:23:51 crc kubenswrapper[4895]: I1202 07:23:51.375016 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6a0b438f-70fd-4c3f-a170-2b3fe9797955-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-wxp5p\" (UID: \"6a0b438f-70fd-4c3f-a170-2b3fe9797955\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wxp5p" Dec 02 07:23:51 crc kubenswrapper[4895]: I1202 07:23:51.390492 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hlxqt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"30911fe5-208f-44e8-a380-2a0093f24863\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87ebbf4f529d4e7b64c392912d06e57dc80a2237a9a4e31ab5b56ac556aa10b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6l8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hlxqt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:51Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:51 crc kubenswrapper[4895]: I1202 07:23:51.396906 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gj6l9\" (UniqueName: 
\"kubernetes.io/projected/6a0b438f-70fd-4c3f-a170-2b3fe9797955-kube-api-access-gj6l9\") pod \"ovnkube-control-plane-749d76644c-wxp5p\" (UID: \"6a0b438f-70fd-4c3f-a170-2b3fe9797955\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wxp5p" Dec 02 07:23:51 crc kubenswrapper[4895]: I1202 07:23:51.410249 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4604b0d5-cf8f-404d-af67-65da4b1dc003\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b3c366f0f27ff793a37b2944887ab1c1bc155c14d5c8d6c267442b0a8666ef8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6c3baa5b3772a8fc64774e28c3f389f114723e7efa62b934140a4fed8b6a40d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49d7c12ea2cd199dcfa46a6f1a1a856f1285a709def9d82b6805ba256a79fd5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2543d364f902a6c34a82dac25f389cd36afc2f717d772d0a487a640d44adf30f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc
-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:51Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:51 crc kubenswrapper[4895]: I1202 07:23:51.425563 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:51 crc kubenswrapper[4895]: I1202 07:23:51.425638 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:51 crc kubenswrapper[4895]: I1202 07:23:51.425659 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:51 crc kubenswrapper[4895]: I1202 07:23:51.425694 4895 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:51 crc kubenswrapper[4895]: I1202 07:23:51.425720 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:51Z","lastTransitionTime":"2025-12-02T07:23:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:23:51 crc kubenswrapper[4895]: I1202 07:23:51.426607 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://444ad02c4afa919779de59fd8c81bdce5c3426dd6451b90505921a8d5fa0c96b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\"
:{\\\"startedAt\\\":\\\"2025-12-02T07:23:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:51Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:51 crc kubenswrapper[4895]: I1202 07:23:51.445149 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:51Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:51 crc kubenswrapper[4895]: I1202 07:23:51.460223 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n7xcr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3514f381-d0d1-4e00-931e-c5ca75202a1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2670ef1ea22a6c1c5d9ec4ee4b0345c575b20224fddf8655b95eaa431573d071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3deeb37702c8243a956f2be1bb61e2693bb9be5453d775bc3f3e85618d483015\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3deeb37702c8243a956f2be1bb61e2693bb9be5453d775bc3f3e85618d483015\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2c60eddf285bbbf1a5957691f56d895e5bb7461fb18ab7c495879357cfae565\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2c60eddf285bbbf1a5957691f56d895e5bb7461fb18ab7c495879357cfae565\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e8b39d38597192393d90cc582b8ad109a1ed22c4c14f539e1a5445e6d817872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e8b39d38597192393d90cc582b8ad109a1ed22c4c14f539e1a5445e6d817872\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb5ab
c8da49059e23c4013f3856d3f4c9078cd8871a79f8dae5ce4d8cb319ef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb5abc8da49059e23c4013f3856d3f4c9078cd8871a79f8dae5ce4d8cb319ef9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbdd3e8943b356a485d900737682f0903dc04aefa33498ee5cdf3c82b6570852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbdd3e8943b356a485d900737682f0903dc04aefa33498ee5cdf3c82b6570852\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:43Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3641d1f7fd2ea062c216ce260c35d903273d3039fe790815ebb697f8c11d15cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3641d1f7fd2ea062c216ce260c35d903273d3039fe790815ebb697f8c11d15cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n7xcr\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:51Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:51 crc kubenswrapper[4895]: I1202 07:23:51.475120 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hlxqt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"30911fe5-208f-44e8-a380-2a0093f24863\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87ebbf4f529d4e7b64c392912d06e57dc80a2237a9a4e31ab5b56ac556aa10b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-
12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6l8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hlxqt\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:51Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:51 crc kubenswrapper[4895]: I1202 07:23:51.489904 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4604b0d5-cf8f-404d-af67-65da4b1dc003\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b3c366f0f27ff793a37b2944887ab1c1bc155c14d5c8d6c267442b0a8666ef8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6c3baa5b3772a8fc64774e28c3f389f114723e7efa62b934140a4fed8b6a40d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49d7c12ea2cd199dcfa46a6f1a1a856f1285a709def9d82b6805ba256a79fd5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2543d364f902a6c34a82dac25f389cd36afc2f717d772d0a487a640d44adf30f\\\",\\\"image\\\":\\\
"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:51Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:51 crc kubenswrapper[4895]: I1202 07:23:51.504976 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w54m4_afc3334a-0153-4dcc-9a56-92f6cae51c08/ovnkube-controller/1.log" Dec 02 07:23:51 crc kubenswrapper[4895]: I1202 07:23:51.505667 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://444ad02c4afa919779de59fd8c81bdce5c3426dd6451b90505921a8d5fa0c96b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-02T07:23:51Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:51 crc kubenswrapper[4895]: I1202 07:23:51.508932 4895 scope.go:117] "RemoveContainer" containerID="370f4392aff2eb7ec2d34d9535b978f2354692fcc0945eae167d222b1d97024d" Dec 02 07:23:51 crc kubenswrapper[4895]: E1202 07:23:51.509094 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-w54m4_openshift-ovn-kubernetes(afc3334a-0153-4dcc-9a56-92f6cae51c08)\"" pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" podUID="afc3334a-0153-4dcc-9a56-92f6cae51c08" Dec 02 07:23:51 crc kubenswrapper[4895]: I1202 07:23:51.517944 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wxp5p" Dec 02 07:23:51 crc kubenswrapper[4895]: I1202 07:23:51.525307 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:51Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:51 crc kubenswrapper[4895]: I1202 07:23:51.528598 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:51 crc kubenswrapper[4895]: I1202 07:23:51.528643 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:51 crc kubenswrapper[4895]: I1202 07:23:51.528656 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:51 crc kubenswrapper[4895]: I1202 
07:23:51.528677 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:51 crc kubenswrapper[4895]: I1202 07:23:51.528690 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:51Z","lastTransitionTime":"2025-12-02T07:23:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:23:51 crc kubenswrapper[4895]: I1202 07:23:51.543719 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n7xcr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3514f381-d0d1-4e00-931e-c5ca75202a1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2670ef1ea22a6c1c5d9ec4ee4b0345c575b20224fddf8655b95eaa431573d071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcab
b9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3deeb37702c8243a956f2be1bb61e2693bb9be5453d775bc3f3e85618d483015\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3deeb37702c8243a956f2be1bb61e2693bb9be5453d775bc3f3e85618d483015\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2c60eddf285bbbf1a5957691f56d895e5bb7461fb18ab7c495879357cfae565\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2c60eddf285bbbf1a5957691f56d895e5bb7461fb18ab7c495879357cfae565\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e8b39d38597192393d90cc582b8ad109a1ed22c4c14f539e1a5445e6d817872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":
false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e8b39d38597192393d90cc582b8ad109a1ed22c4c14f539e1a5445e6d817872\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb5abc8da49059e23c4013f3856d3f4c9078cd8871a79f8dae5ce4d8cb319ef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb5abc8da49059e23c4013f3856d3f4c9078cd8871a79f8dae5ce4d8cb319ef9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"ku
be-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbdd3e8943b356a485d900737682f0903dc04aefa33498ee5cdf3c82b6570852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbdd3e8943b356a485d900737682f0903dc04aefa33498ee5cdf3c82b6570852\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3641d1f7fd2ea062c216ce260c35d903273d3039fe790815ebb697f8c11d15cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3641d1f7fd2ea062c216ce260c35d
903273d3039fe790815ebb697f8c11d15cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n7xcr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:51Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:51 crc kubenswrapper[4895]: I1202 07:23:51.556135 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:51Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:51 crc kubenswrapper[4895]: I1202 07:23:51.566119 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lhpbd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"81128292-8c02-45bd-9b25-e9457e989975\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cc63751b6428c88deed36b914b2a95a94a50b544c562126c2a9567c177b2570\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2z5n2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lhpbd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:51Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:51 crc kubenswrapper[4895]: I1202 07:23:51.580457 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wxp5p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a0b438f-70fd-4c3f-a170-2b3fe9797955\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj6l9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj6l9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wxp5p\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:51Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:51 crc kubenswrapper[4895]: I1202 07:23:51.596193 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2db13abc9ba56d955814e75ceea54b92610a00daeefa29aa1aad584e3453728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\
\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:51Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:51 crc kubenswrapper[4895]: I1202 07:23:51.609874 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bb4bea68942dbface0009fbac76f7b1253ab282e8decb9081225de7d6537ec0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\
\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3115f2c50d05ee406f583937e39b1b55f364ce7cbeb0788b75941a22e23f57f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:51Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:51 crc kubenswrapper[4895]: I1202 07:23:51.627780 4895 status_manager.go:875] "Failed 
to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"afc3334a-0153-4dcc-9a56-92f6cae51c08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91b0c6e50085a7be0924e5a54ef006110a7f13556fe47f7f55124fd6b2f0bc36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5463b077f0bc8b5d38e27854ece6a71f74f7769f4552c32eefe16854993f290f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80dca705ddb94ba799eb72bfc23b0c3878cceca48d6d10f1c7f8eb0e4beed40e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f54ae324450e73bcbf2c408794171a92bbb66c24d05f29584ac2c247090c1273\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f330d728eecd672ecb554744a51ebfb2c104441d3a85fe0c8a6fc3446c37e4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca732cac495c410d89da4ce1476acd20905716cbb2baeedac9c52b654ff58573\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://370f4392aff2eb7ec2d34d9535b978f2354692fcc0945eae167d222b1d97024d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://062dc017d6ff8750938c69f755c83863dc927df0e6a1a42671ab0c04b5d70327\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T07:23:47Z\\\",\\\"message\\\":\\\"ndler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1202 07:23:47.842861 6190 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1202 07:23:47.842911 6190 handler.go:208] Removed *v1.NetworkPolicy event handler 
4\\\\nI1202 07:23:47.842975 6190 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 07:23:47.843039 6190 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 07:23:47.843839 6190 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1202 07:23:47.843870 6190 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1202 07:23:47.843888 6190 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1202 07:23:47.843895 6190 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1202 07:23:47.843946 6190 factory.go:656] Stopping watch factory\\\\nI1202 07:23:47.843968 6190 ovnkube.go:599] Stopped ovnkube\\\\nI1202 07:23:47.843997 6190 handler.go:208] Removed *v1.Node event handler 7\\\\nI1202 07:23:47.843993 6190 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1202 07:23:47.844025 6190 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI12\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://370f4392aff2eb7ec2d34d9535b978f2354692fcc0945eae167d222b1d97024d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T07:23:49Z\\\",\\\"message\\\":\\\"Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1202 07:23:49.673830 6315 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-apiserver/check-endpoints]} name:Service_openshift-apiserver/check-endpoints_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} 
selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.139:17698:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {8efa4d1a-72f5-4dfa-9bc2-9d93ef11ecf2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1202 07:23:49.676921 6315 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-apiserver/api\\\\\\\"}\\\\nI1202 07:23:49.676925 6315 obj_retry.go:386] Retry successful for *v1.Pod openshift-machine-config-operator/machine-config-daemon-wfcg7 after 0 failed attempt(s)\\\\nI1202 07:23:49.676901 6315 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-apiserver/check-endpoints]} name:Service_openshift-apiserver/check-endpoints_TCP_cluster options:{GoMap:map\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":
\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6447684fe6189e38c9caad6dfe1384b7af01d50e2ef5f889c7b2874ba092306\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/e
nv\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://328fa773c92ef0a965065412d4141c0b41ffe38d21ed169386a9d30d05f37f28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://328fa773c92ef0a965065412d4141c0b41ffe38d21ed169386a9d30d05f37f28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w54m4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:51Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:51 crc 
kubenswrapper[4895]: I1202 07:23:51.631935 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:51 crc kubenswrapper[4895]: I1202 07:23:51.631964 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:51 crc kubenswrapper[4895]: I1202 07:23:51.631974 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:51 crc kubenswrapper[4895]: I1202 07:23:51.631990 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:51 crc kubenswrapper[4895]: I1202 07:23:51.632003 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:51Z","lastTransitionTime":"2025-12-02T07:23:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:51 crc kubenswrapper[4895]: I1202 07:23:51.640216 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0468d2d1-a975-45a6-af9f-47adc432fab0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebf8583b59d7955530da7a83a29f9994ba139c076cf7b76dc94e88d14f39e1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vfwjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1474e8f5989607f3b95fefe811e3d787320f04c223edb6ffe47d2b37863ea874\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vfwjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wfcg7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:51Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:51 crc kubenswrapper[4895]: I1202 07:23:51.654098 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b7fa1ad-7e38-4645-ab41-c7d395f5c10e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40e020a2b4caf39d1661cee9c6b0055eca86e0ae9b81d2da7485ca9ec92dd0aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33e300332628d3baf9041b52dc7b8cd9dab1e9c5da76572e86b787433fd5fb92\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4455fec2a817e7266e152d3f4afbede483504d6066f1a0f7375bce282ae10f8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b674e20b2199fd0083aed9cf001d27a72d31ca6c62cbe376c9510cc94f37d312\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9998dffc20c5b0579e6214d10541b9702e9c4a6563c930a41e49b6853b832ba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T07:23:32Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 07:23:21.537254 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 07:23:21.538901 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1012964219/tls.crt::/tmp/serving-cert-1012964219/tls.key\\\\\\\"\\\\nI1202 07:23:32.848375 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 07:23:32.853597 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 07:23:32.853643 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 07:23:32.853680 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 07:23:32.853691 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 07:23:32.869490 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 07:23:32.869536 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:23:32.869542 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:23:32.869555 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 07:23:32.869559 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 07:23:32.869564 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 07:23:32.869568 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 07:23:32.869624 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1202 07:23:32.874299 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d318f71ad32dd4a410093b85741038e468d540edd30f112234a9d595bc6fd85f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0faa0995131a1f9cbf2aa0cf7651d510fad1bdaf7e32de51f8a14b70f161e783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0faa0995131a1f9cbf2aa0cf7651d510f
ad1bdaf7e32de51f8a14b70f161e783\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:51Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:51 crc kubenswrapper[4895]: I1202 07:23:51.666629 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:51Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:51 crc kubenswrapper[4895]: I1202 07:23:51.678659 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-74fkh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b212bd8-f1a7-4982-b2e6-499c13a34b0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4d58e6f37ee8f90af6aad88138c39e9a0753c16cceb763c2bd2ec9c30087d92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhpwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-74fkh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:51Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:51 crc kubenswrapper[4895]: I1202 07:23:51.692650 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b7fa1ad-7e38-4645-ab41-c7d395f5c10e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40e020a2b4caf39d1661cee9c6b0055eca86e0ae9b81d2da7485ca9ec92dd0aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"r
eady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33e300332628d3baf9041b52dc7b8cd9dab1e9c5da76572e86b787433fd5fb92\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4455fec2a817e7266e152d3f4afbede483504d6066f1a0f7375bce282ae10f8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b674e20b2199fd0083aed9cf001d27a72d31ca6c62cbe376c9510cc94f37d312\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9998dffc20c5b0579e6214d10541b9702e9c4a6563c930a41e49b6853b832ba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T07:23:32Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 07:23:21.537254 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 07:23:21.538901 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1012964219/tls.crt::/tmp/serving-cert-1012964219/tls.key\\\\\\\"\\\\nI1202 07:23:32.848375 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 07:23:32.853597 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 07:23:32.853643 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 07:23:32.853680 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 07:23:32.853691 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 07:23:32.869490 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 07:23:32.869536 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' 
detected.\\\\nW1202 07:23:32.869542 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:23:32.869555 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 07:23:32.869559 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 07:23:32.869564 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 07:23:32.869568 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 07:23:32.869624 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 07:23:32.874299 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d318f71ad32dd4a410093b85741038e468d540edd30f112234a9d595bc6fd85f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\
"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0faa0995131a1f9cbf2aa0cf7651d510fad1bdaf7e32de51f8a14b70f161e783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0faa0995131a1f9cbf2aa0cf7651d510fad1bdaf7e32de51f8a14b70f161e783\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:51Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:51 crc kubenswrapper[4895]: I1202 07:23:51.704360 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:51Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:51 crc kubenswrapper[4895]: I1202 07:23:51.714764 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-74fkh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b212bd8-f1a7-4982-b2e6-499c13a34b0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4d58e6f37ee8f90af6aad88138c39e9a0753c16cceb763c2bd2ec9c30087d92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhpwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-74fkh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:51Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:51 crc kubenswrapper[4895]: I1202 07:23:51.731788 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0468d2d1-a975-45a6-af9f-47adc432fab0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebf8583b59d7955530da7a83a29f9994ba139c076cf7b76dc94e88d14f39e1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vfwjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1474e8f5989607f3b95fefe811e3d787320f04c223edb6ffe47d2b37863ea874\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vfwjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wfcg7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:51Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:51 crc kubenswrapper[4895]: I1202 07:23:51.733989 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:51 crc kubenswrapper[4895]: I1202 07:23:51.734008 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:51 crc kubenswrapper[4895]: I1202 07:23:51.734016 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:51 crc kubenswrapper[4895]: I1202 07:23:51.734031 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:51 crc kubenswrapper[4895]: I1202 07:23:51.734039 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:51Z","lastTransitionTime":"2025-12-02T07:23:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:51 crc kubenswrapper[4895]: I1202 07:23:51.751252 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://444ad02c4afa919779de59fd8c81bdce5c3426dd6451b90505921a8d5fa0c96b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:51Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:51 crc kubenswrapper[4895]: I1202 07:23:51.764015 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:51Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:51 crc kubenswrapper[4895]: I1202 07:23:51.779816 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n7xcr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3514f381-d0d1-4e00-931e-c5ca75202a1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2670ef1ea22a6c1c5d9ec4ee4b0345c575b20224fddf8655b95eaa431573d071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3deeb37702c8243a956f2be1bb61e2693bb9be5453d775bc3f3e85618d483015\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3deeb37702c8243a956f2be1bb61e2693bb9be5453d775bc3f3e85618d483015\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2c60eddf285bbbf1a5957691f56d895e5bb7461fb18ab7c495879357cfae565\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2c60eddf285bbbf1a5957691f56d895e5bb7461fb18ab7c495879357cfae565\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e8b39d38597192393d90cc582b8ad109a1ed22c4c14f539e1a5445e6d817872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e8b39d38597192393d90cc582b8ad109a1ed22c4c14f539e1a5445e6d817872\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb5ab
c8da49059e23c4013f3856d3f4c9078cd8871a79f8dae5ce4d8cb319ef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb5abc8da49059e23c4013f3856d3f4c9078cd8871a79f8dae5ce4d8cb319ef9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbdd3e8943b356a485d900737682f0903dc04aefa33498ee5cdf3c82b6570852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbdd3e8943b356a485d900737682f0903dc04aefa33498ee5cdf3c82b6570852\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:43Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3641d1f7fd2ea062c216ce260c35d903273d3039fe790815ebb697f8c11d15cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3641d1f7fd2ea062c216ce260c35d903273d3039fe790815ebb697f8c11d15cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n7xcr\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:51Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:51 crc kubenswrapper[4895]: I1202 07:23:51.797710 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hlxqt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"30911fe5-208f-44e8-a380-2a0093f24863\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87ebbf4f529d4e7b64c392912d06e57dc80a2237a9a4e31ab5b56ac556aa10b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-
12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6l8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hlxqt\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:51Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:51 crc kubenswrapper[4895]: I1202 07:23:51.810399 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4604b0d5-cf8f-404d-af67-65da4b1dc003\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b3c366f0f27ff793a37b2944887ab1c1bc155c14d5c8d6c267442b0a8666ef8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6c3baa5b3772a8fc64774e28c3f389f114723e7efa62b934140a4fed8b6a40d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49d7c12ea2cd199dcfa46a6f1a1a856f1285a709def9d82b6805ba256a79fd5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2543d364f902a6c34a82dac25f389cd36afc2f717d772d0a487a640d44adf30f\\\",\\\"image\\\":\\\
"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:51Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:51 crc kubenswrapper[4895]: I1202 07:23:51.823973 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:51Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:51 crc kubenswrapper[4895]: I1202 07:23:51.834167 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lhpbd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"81128292-8c02-45bd-9b25-e9457e989975\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cc63751b6428c88deed36b914b2a95a94a50b544c562126c2a9567c177b2570\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2z5n2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lhpbd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:51Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:51 crc kubenswrapper[4895]: I1202 07:23:51.842588 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:51 crc kubenswrapper[4895]: I1202 07:23:51.842766 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:51 crc kubenswrapper[4895]: I1202 07:23:51.842840 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:51 crc kubenswrapper[4895]: I1202 07:23:51.842916 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:51 crc kubenswrapper[4895]: I1202 07:23:51.842986 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:51Z","lastTransitionTime":"2025-12-02T07:23:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:51 crc kubenswrapper[4895]: I1202 07:23:51.847595 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wxp5p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a0b438f-70fd-4c3f-a170-2b3fe9797955\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj6l9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj6l9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wxp5p\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:51Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:51 crc kubenswrapper[4895]: I1202 07:23:51.864465 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2db13abc9ba56d955814e75ceea54b92610a00daeefa29aa1aad584e3453728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\
\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:51Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:51 crc kubenswrapper[4895]: I1202 07:23:51.909112 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bb4bea68942dbface0009fbac76f7b1253ab282e8decb9081225de7d6537ec0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\
\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3115f2c50d05ee406f583937e39b1b55f364ce7cbeb0788b75941a22e23f57f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:51Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:51 crc kubenswrapper[4895]: I1202 07:23:51.946131 4895 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:51 crc kubenswrapper[4895]: I1202 07:23:51.946181 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:51 crc kubenswrapper[4895]: I1202 07:23:51.946196 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:51 crc kubenswrapper[4895]: I1202 07:23:51.946226 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:51 crc kubenswrapper[4895]: I1202 07:23:51.946241 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:51Z","lastTransitionTime":"2025-12-02T07:23:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:51 crc kubenswrapper[4895]: I1202 07:23:51.970699 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"afc3334a-0153-4dcc-9a56-92f6cae51c08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91b0c6e50085a7be0924e5a54ef006110a7f13556fe47f7f55124fd6b2f0bc36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5463b077f0bc8b5d38e27854ece6a71f74f7769f4552c32eefe16854993f290f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80dca705ddb94ba799eb72bfc23b0c3878cceca48d6d10f1c7f8eb0e4beed40e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f54ae324450e73bcbf2c408794171a92bbb66c24d05f29584ac2c247090c1273\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f330d728eecd672ecb554744a51ebfb2c104441d3a85fe0c8a6fc3446c37e4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca732cac495c410d89da4ce1476acd20905716cbb2baeedac9c52b654ff58573\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://370f4392aff2eb7ec2d34d9535b978f2354692fcc0945eae167d222b1d97024d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://370f4392aff2eb7ec2d34d9535b978f2354692fcc0945eae167d222b1d97024d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T07:23:49Z\\\",\\\"message\\\":\\\"Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1202 07:23:49.673830 6315 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-apiserver/check-endpoints]} 
name:Service_openshift-apiserver/check-endpoints_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.139:17698:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {8efa4d1a-72f5-4dfa-9bc2-9d93ef11ecf2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1202 07:23:49.676921 6315 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-apiserver/api\\\\\\\"}\\\\nI1202 07:23:49.676925 6315 obj_retry.go:386] Retry successful for *v1.Pod openshift-machine-config-operator/machine-config-daemon-wfcg7 after 0 failed attempt(s)\\\\nI1202 07:23:49.676901 6315 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-apiserver/check-endpoints]} name:Service_openshift-apiserver/check-endpoints_TCP_cluster options:{GoMap:map\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-w54m4_openshift-ovn-kubernetes(afc3334a-0153-4dcc-9a56-92f6cae51c08)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6447684fe6189e38c9caad6dfe1384b7af01d50e2ef5f889c7b2874ba092306\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://328fa773c92ef0a965065412d4141c0b41ffe38d21ed169386a9d30d05f37f28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://328fa773c92ef0a965
065412d4141c0b41ffe38d21ed169386a9d30d05f37f28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w54m4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:51Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:52 crc kubenswrapper[4895]: I1202 07:23:52.049251 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:52 crc kubenswrapper[4895]: I1202 07:23:52.049651 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:52 crc kubenswrapper[4895]: I1202 07:23:52.049766 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:52 crc kubenswrapper[4895]: I1202 07:23:52.049869 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:52 crc kubenswrapper[4895]: I1202 07:23:52.049947 4895 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:52Z","lastTransitionTime":"2025-12-02T07:23:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:23:52 crc kubenswrapper[4895]: I1202 07:23:52.140253 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:23:52 crc kubenswrapper[4895]: E1202 07:23:52.140732 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 07:23:52 crc kubenswrapper[4895]: I1202 07:23:52.153097 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:52 crc kubenswrapper[4895]: I1202 07:23:52.153155 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:52 crc kubenswrapper[4895]: I1202 07:23:52.153166 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:52 crc kubenswrapper[4895]: I1202 07:23:52.153185 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:52 crc kubenswrapper[4895]: I1202 07:23:52.153199 4895 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:52Z","lastTransitionTime":"2025-12-02T07:23:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:23:52 crc kubenswrapper[4895]: I1202 07:23:52.256415 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:52 crc kubenswrapper[4895]: I1202 07:23:52.256462 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:52 crc kubenswrapper[4895]: I1202 07:23:52.256473 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:52 crc kubenswrapper[4895]: I1202 07:23:52.256490 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:52 crc kubenswrapper[4895]: I1202 07:23:52.256503 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:52Z","lastTransitionTime":"2025-12-02T07:23:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:23:52 crc kubenswrapper[4895]: I1202 07:23:52.322343 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-5f88v"] Dec 02 07:23:52 crc kubenswrapper[4895]: I1202 07:23:52.323296 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-5f88v" Dec 02 07:23:52 crc kubenswrapper[4895]: E1202 07:23:52.323469 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5f88v" podUID="5af25091-1401-45d4-ae53-d2b469c879da" Dec 02 07:23:52 crc kubenswrapper[4895]: I1202 07:23:52.342927 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:52Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:52 crc kubenswrapper[4895]: I1202 07:23:52.359327 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:52 crc kubenswrapper[4895]: I1202 07:23:52.359374 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:52 crc kubenswrapper[4895]: I1202 07:23:52.359388 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:52 crc kubenswrapper[4895]: I1202 07:23:52.359407 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:52 crc kubenswrapper[4895]: I1202 07:23:52.359418 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:52Z","lastTransitionTime":"2025-12-02T07:23:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:23:52 crc kubenswrapper[4895]: I1202 07:23:52.363850 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n7xcr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3514f381-d0d1-4e00-931e-c5ca75202a1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2670ef1ea22a6c1c5d9ec4ee4b0345c575b20224fddf8655b95eaa431573d071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servicea
ccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3deeb37702c8243a956f2be1bb61e2693bb9be5453d775bc3f3e85618d483015\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3deeb37702c8243a956f2be1bb61e2693bb9be5453d775bc3f3e85618d483015\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2c60eddf285bbbf1a5957691f56d895e5bb7461fb18ab7c495879357cfae565\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2c60eddf285bbbf1a5957691f56d895e5bb7461fb18ab7c495879357cfae565\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e8b39d38597192393d90cc582b8ad109a1ed22c4c14f539e1a5445e6d817872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e8b39d38597192393d90cc582b8ad109a1ed22c4c14f539e1a5445e6d817872\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/
host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb5abc8da49059e23c4013f3856d3f4c9078cd8871a79f8dae5ce4d8cb319ef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb5abc8da49059e23c4013f3856d3f4c9078cd8871a79f8dae5ce4d8cb319ef9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbdd3e8943b356a485d900737682f0903dc04aefa33498ee5cdf3c82b6570852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbdd3e8943b356a485d900737682f0903dc04aefa33498ee5cdf3c82b6570852\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3641d1f7fd2ea062c216ce260c35d903273d3039fe790815ebb697f8c11d15cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3641d1f7fd2ea062c216ce260c35d903273d3039fe790815ebb697f8c11d15cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n7xcr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:52Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:52 crc kubenswrapper[4895]: I1202 07:23:52.379630 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hlxqt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"30911fe5-208f-44e8-a380-2a0093f24863\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87ebbf4f529d4e7b64c392912d06e57dc80a2237a9a4e31ab5b56ac556aa10b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1
afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6l8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP
\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hlxqt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:52Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:52 crc kubenswrapper[4895]: I1202 07:23:52.381043 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5af25091-1401-45d4-ae53-d2b469c879da-metrics-certs\") pod \"network-metrics-daemon-5f88v\" (UID: \"5af25091-1401-45d4-ae53-d2b469c879da\") " pod="openshift-multus/network-metrics-daemon-5f88v" Dec 02 07:23:52 crc kubenswrapper[4895]: I1202 07:23:52.381147 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wfrv\" (UniqueName: \"kubernetes.io/projected/5af25091-1401-45d4-ae53-d2b469c879da-kube-api-access-6wfrv\") pod \"network-metrics-daemon-5f88v\" (UID: \"5af25091-1401-45d4-ae53-d2b469c879da\") " pod="openshift-multus/network-metrics-daemon-5f88v" Dec 02 07:23:52 crc kubenswrapper[4895]: I1202 07:23:52.395787 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5f88v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5af25091-1401-45d4-ae53-d2b469c879da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wfrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wfrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5f88v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:52Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:52 crc 
kubenswrapper[4895]: I1202 07:23:52.410396 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4604b0d5-cf8f-404d-af67-65da4b1dc003\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b3c366f0f27ff793a37b2944887ab1c1bc155c14d5c8d6c267442b0a8666ef8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6c3baa5b3772a8fc64774e28c3f389f114723e7efa62b934140a4fed8b6a40d\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49d7c12ea2cd199dcfa46a6f1a1a856f1285a709def9d82b6805ba256a79fd5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2543d364f902a6c34a82dac25f389cd36afc2f717d772d0a487a640d44adf30f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:52Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:52 crc kubenswrapper[4895]: I1202 07:23:52.428475 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://444ad02c4afa919779de59fd8c81bdce5c3426dd6451b90505921a8d5fa0c96b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-02T07:23:52Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:52 crc kubenswrapper[4895]: I1202 07:23:52.441638 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lhpbd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"81128292-8c02-45bd-9b25-e9457e989975\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cc63751b6428c88deed36b914b2a95a94a50b544c562126c2a9567c177b2570\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2z5n2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lhpbd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:52Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:52 crc kubenswrapper[4895]: I1202 07:23:52.456206 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wxp5p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a0b438f-70fd-4c3f-a170-2b3fe9797955\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj6l9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj6l9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168
.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wxp5p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:52Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:52 crc kubenswrapper[4895]: I1202 07:23:52.462449 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:52 crc kubenswrapper[4895]: I1202 07:23:52.462489 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:52 crc kubenswrapper[4895]: I1202 07:23:52.462501 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:52 crc kubenswrapper[4895]: I1202 07:23:52.462521 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:52 crc kubenswrapper[4895]: I1202 07:23:52.462535 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:52Z","lastTransitionTime":"2025-12-02T07:23:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:52 crc kubenswrapper[4895]: I1202 07:23:52.472610 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:52Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:52 crc kubenswrapper[4895]: I1202 07:23:52.482520 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6wfrv\" (UniqueName: \"kubernetes.io/projected/5af25091-1401-45d4-ae53-d2b469c879da-kube-api-access-6wfrv\") pod \"network-metrics-daemon-5f88v\" (UID: \"5af25091-1401-45d4-ae53-d2b469c879da\") " pod="openshift-multus/network-metrics-daemon-5f88v" Dec 02 07:23:52 crc kubenswrapper[4895]: I1202 07:23:52.482636 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5af25091-1401-45d4-ae53-d2b469c879da-metrics-certs\") pod \"network-metrics-daemon-5f88v\" (UID: \"5af25091-1401-45d4-ae53-d2b469c879da\") " pod="openshift-multus/network-metrics-daemon-5f88v" Dec 02 07:23:52 crc kubenswrapper[4895]: E1202 07:23:52.482880 4895 secret.go:188] Couldn't get secret 
openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 02 07:23:52 crc kubenswrapper[4895]: E1202 07:23:52.482993 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5af25091-1401-45d4-ae53-d2b469c879da-metrics-certs podName:5af25091-1401-45d4-ae53-d2b469c879da nodeName:}" failed. No retries permitted until 2025-12-02 07:23:52.982965376 +0000 UTC m=+44.153825019 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5af25091-1401-45d4-ae53-d2b469c879da-metrics-certs") pod "network-metrics-daemon-5f88v" (UID: "5af25091-1401-45d4-ae53-d2b469c879da") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 02 07:23:52 crc kubenswrapper[4895]: I1202 07:23:52.493442 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bb4bea68942dbface0009fbac76f7b1253ab282e8decb9081225de7d6537ec0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36
cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3115f2c50d05ee406f583937e39b1b55f364ce7cbeb0788b75941a22e23f57f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2025-12-02T07:23:52Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:52 crc kubenswrapper[4895]: I1202 07:23:52.507943 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wfrv\" (UniqueName: \"kubernetes.io/projected/5af25091-1401-45d4-ae53-d2b469c879da-kube-api-access-6wfrv\") pod \"network-metrics-daemon-5f88v\" (UID: \"5af25091-1401-45d4-ae53-d2b469c879da\") " pod="openshift-multus/network-metrics-daemon-5f88v" Dec 02 07:23:52 crc kubenswrapper[4895]: I1202 07:23:52.515248 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wxp5p" event={"ID":"6a0b438f-70fd-4c3f-a170-2b3fe9797955","Type":"ContainerStarted","Data":"8cbc34b4d78cc00ac79b9944d92c6ac61a9d5dd704f36a473950baf53dfcbf47"} Dec 02 07:23:52 crc kubenswrapper[4895]: I1202 07:23:52.515322 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wxp5p" event={"ID":"6a0b438f-70fd-4c3f-a170-2b3fe9797955","Type":"ContainerStarted","Data":"7329cf1034b2d51ebb7f978448558a8b6fa39d58823289c470fddcfed262e395"} Dec 02 07:23:52 crc kubenswrapper[4895]: I1202 07:23:52.515338 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wxp5p" event={"ID":"6a0b438f-70fd-4c3f-a170-2b3fe9797955","Type":"ContainerStarted","Data":"5e9ef4aa9afea7d30c6037512d1492e64c2534c3400bafb52d83d2fbbb2ba1b2"} Dec 02 07:23:52 crc kubenswrapper[4895]: I1202 07:23:52.515985 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"afc3334a-0153-4dcc-9a56-92f6cae51c08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91b0c6e50085a7be0924e5a54ef006110a7f13556fe47f7f55124fd6b2f0bc36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5463b077f0bc8b5d38e27854ece6a71f74f7769f4552c32eefe16854993f290f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80dca705ddb94ba799eb72bfc23b0c3878cceca48d6d10f1c7f8eb0e4beed40e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f54ae324450e73bcbf2c408794171a92bbb66c24d05f29584ac2c247090c1273\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f330d728eecd672ecb554744a51ebfb2c104441d3a85fe0c8a6fc3446c37e4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca732cac495c410d89da4ce1476acd20905716cbb2baeedac9c52b654ff58573\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://370f4392aff2eb7ec2d34d9535b978f2354692fcc0945eae167d222b1d97024d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://370f4392aff2eb7ec2d34d9535b978f2354692fcc0945eae167d222b1d97024d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T07:23:49Z\\\",\\\"message\\\":\\\"Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1202 07:23:49.673830 6315 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-apiserver/check-endpoints]} 
name:Service_openshift-apiserver/check-endpoints_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.139:17698:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {8efa4d1a-72f5-4dfa-9bc2-9d93ef11ecf2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1202 07:23:49.676921 6315 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-apiserver/api\\\\\\\"}\\\\nI1202 07:23:49.676925 6315 obj_retry.go:386] Retry successful for *v1.Pod openshift-machine-config-operator/machine-config-daemon-wfcg7 after 0 failed attempt(s)\\\\nI1202 07:23:49.676901 6315 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-apiserver/check-endpoints]} name:Service_openshift-apiserver/check-endpoints_TCP_cluster options:{GoMap:map\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-w54m4_openshift-ovn-kubernetes(afc3334a-0153-4dcc-9a56-92f6cae51c08)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6447684fe6189e38c9caad6dfe1384b7af01d50e2ef5f889c7b2874ba092306\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://328fa773c92ef0a965065412d4141c0b41ffe38d21ed169386a9d30d05f37f28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://328fa773c92ef0a965
065412d4141c0b41ffe38d21ed169386a9d30d05f37f28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w54m4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:52Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:52 crc kubenswrapper[4895]: I1202 07:23:52.536553 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2db13abc9ba56d955814e75ceea54b92610a00daeefa29aa1aad584e3453728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:52Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:52 crc kubenswrapper[4895]: I1202 07:23:52.551816 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:52Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:52 crc kubenswrapper[4895]: I1202 07:23:52.564166 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-74fkh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b212bd8-f1a7-4982-b2e6-499c13a34b0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4d58e6f37ee8f90af6aad88138c39e9a0753c16cceb763c2bd2ec9c30087d92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhpwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-74fkh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:52Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:52 crc kubenswrapper[4895]: I1202 07:23:52.565180 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:52 crc kubenswrapper[4895]: I1202 07:23:52.565221 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:52 crc kubenswrapper[4895]: I1202 07:23:52.565234 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:52 crc kubenswrapper[4895]: I1202 07:23:52.565252 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:52 crc kubenswrapper[4895]: I1202 07:23:52.565264 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:52Z","lastTransitionTime":"2025-12-02T07:23:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:52 crc kubenswrapper[4895]: I1202 07:23:52.576658 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0468d2d1-a975-45a6-af9f-47adc432fab0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebf8583b59d7955530da7a83a29f9994ba139c076cf7b76dc94e88d14f39e1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vfwjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1474e8f5989607f3b95fefe811e3d787320f04c223edb6ffe47d2b37863ea874\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vfwjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wfcg7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:52Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:52 crc kubenswrapper[4895]: I1202 07:23:52.591333 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b7fa1ad-7e38-4645-ab41-c7d395f5c10e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40e020a2b4caf39d1661cee9c6b0055eca86e0ae9b81d2da7485ca9ec92dd0aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33e300332628d3baf9041b52dc7b8cd9dab1e9c5da76572e86b787433fd5fb92\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4455fec2a817e7266e152d3f4afbede483504d6066f1a0f7375bce282ae10f8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b674e20b2199fd0083aed9cf001d27a72d31ca6c62cbe376c9510cc94f37d312\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9998dffc20c5b0579e6214d10541b9702e9c4a6563c930a41e49b6853b832ba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T07:23:32Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 07:23:21.537254 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 07:23:21.538901 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1012964219/tls.crt::/tmp/serving-cert-1012964219/tls.key\\\\\\\"\\\\nI1202 07:23:32.848375 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 07:23:32.853597 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 07:23:32.853643 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 07:23:32.853680 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 07:23:32.853691 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 07:23:32.869490 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 07:23:32.869536 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:23:32.869542 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:23:32.869555 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 07:23:32.869559 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 07:23:32.869564 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 07:23:32.869568 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 07:23:32.869624 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1202 07:23:32.874299 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d318f71ad32dd4a410093b85741038e468d540edd30f112234a9d595bc6fd85f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0faa0995131a1f9cbf2aa0cf7651d510fad1bdaf7e32de51f8a14b70f161e783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0faa0995131a1f9cbf2aa0cf7651d510f
ad1bdaf7e32de51f8a14b70f161e783\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:52Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:52 crc kubenswrapper[4895]: I1202 07:23:52.607536 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:52Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:52 crc kubenswrapper[4895]: I1202 07:23:52.621545 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lhpbd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"81128292-8c02-45bd-9b25-e9457e989975\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cc63751b6428c88deed36b914b2a95a94a50b544c562126c2a9567c177b2570\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2z5n2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lhpbd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:52Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:52 crc kubenswrapper[4895]: I1202 07:23:52.633146 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wxp5p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a0b438f-70fd-4c3f-a170-2b3fe9797955\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7329cf1034b2d51ebb7f978448558a8b6fa39d58823289c470fddcfed262e395\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj6l9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cbc34b4d78cc00ac79b9944d92c6ac61a9d5dd704f36a473950baf53dfcbf47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj6l9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wxp5p\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:52Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:52 crc kubenswrapper[4895]: I1202 07:23:52.647765 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2db13abc9ba56d955814e75ceea54b92610a00daeefa29aa1aad584e3453728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secr
ets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:52Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:52 crc kubenswrapper[4895]: I1202 07:23:52.663486 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bb4bea68942dbface0009fbac76f7b1253ab282e8decb9081225de7d6537ec0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"r
estartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3115f2c50d05ee406f583937e39b1b55f364ce7cbeb0788b75941a22e23f57f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:52Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:52 crc kubenswrapper[4895]: I1202 07:23:52.669189 4895 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:52 crc kubenswrapper[4895]: I1202 07:23:52.669260 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:52 crc kubenswrapper[4895]: I1202 07:23:52.669280 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:52 crc kubenswrapper[4895]: I1202 07:23:52.669309 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:52 crc kubenswrapper[4895]: I1202 07:23:52.669328 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:52Z","lastTransitionTime":"2025-12-02T07:23:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:52 crc kubenswrapper[4895]: I1202 07:23:52.694453 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"afc3334a-0153-4dcc-9a56-92f6cae51c08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91b0c6e50085a7be0924e5a54ef006110a7f13556fe47f7f55124fd6b2f0bc36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5463b077f0bc8b5d38e27854ece6a71f74f7769f4552c32eefe16854993f290f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80dca705ddb94ba799eb72bfc23b0c3878cceca48d6d10f1c7f8eb0e4beed40e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f54ae324450e73bcbf2c408794171a92bbb66c24d05f29584ac2c247090c1273\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f330d728eecd672ecb554744a51ebfb2c104441d3a85fe0c8a6fc3446c37e4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca732cac495c410d89da4ce1476acd20905716cbb2baeedac9c52b654ff58573\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://370f4392aff2eb7ec2d34d9535b978f2354692fcc0945eae167d222b1d97024d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://370f4392aff2eb7ec2d34d9535b978f2354692fcc0945eae167d222b1d97024d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T07:23:49Z\\\",\\\"message\\\":\\\"Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1202 07:23:49.673830 6315 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-apiserver/check-endpoints]} 
name:Service_openshift-apiserver/check-endpoints_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.139:17698:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {8efa4d1a-72f5-4dfa-9bc2-9d93ef11ecf2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1202 07:23:49.676921 6315 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-apiserver/api\\\\\\\"}\\\\nI1202 07:23:49.676925 6315 obj_retry.go:386] Retry successful for *v1.Pod openshift-machine-config-operator/machine-config-daemon-wfcg7 after 0 failed attempt(s)\\\\nI1202 07:23:49.676901 6315 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-apiserver/check-endpoints]} name:Service_openshift-apiserver/check-endpoints_TCP_cluster options:{GoMap:map\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-w54m4_openshift-ovn-kubernetes(afc3334a-0153-4dcc-9a56-92f6cae51c08)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6447684fe6189e38c9caad6dfe1384b7af01d50e2ef5f889c7b2874ba092306\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://328fa773c92ef0a965065412d4141c0b41ffe38d21ed169386a9d30d05f37f28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://328fa773c92ef0a965
065412d4141c0b41ffe38d21ed169386a9d30d05f37f28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w54m4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:52Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:52 crc kubenswrapper[4895]: I1202 07:23:52.714419 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b7fa1ad-7e38-4645-ab41-c7d395f5c10e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40e020a2b4caf39d1661cee9c6b0055eca86e0ae9b81d2da7485ca9ec92dd0aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33e300332628d3baf9041b52dc7b8cd9dab1e9c5da76572e86b787433fd5fb92\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4455fec2a817e7266e152d3f4afbede483504d6066f1a0f7375bce282ae10f8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b674e20b2199fd0083aed9cf001d27a72d31ca6c62cbe376c9510cc94f37d312\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9998dffc20c5b0579e6214d10541b9702e9c4a6563c930a41e49b6853b832ba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T07:23:32Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 07:23:21.537254 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 07:23:21.538901 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1012964219/tls.crt::/tmp/serving-cert-1012964219/tls.key\\\\\\\"\\\\nI1202 07:23:32.848375 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 07:23:32.853597 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 07:23:32.853643 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 07:23:32.853680 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 07:23:32.853691 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 07:23:32.869490 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 07:23:32.869536 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:23:32.869542 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:23:32.869555 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 07:23:32.869559 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 07:23:32.869564 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 07:23:32.869568 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 07:23:32.869624 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1202 07:23:32.874299 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d318f71ad32dd4a410093b85741038e468d540edd30f112234a9d595bc6fd85f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0faa0995131a1f9cbf2aa0cf7651d510fad1bdaf7e32de51f8a14b70f161e783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0faa0995131a1f9cbf2aa0cf7651d510f
ad1bdaf7e32de51f8a14b70f161e783\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:52Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:52 crc kubenswrapper[4895]: I1202 07:23:52.727980 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:52Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:52 crc kubenswrapper[4895]: I1202 07:23:52.747159 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-74fkh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b212bd8-f1a7-4982-b2e6-499c13a34b0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4d58e6f37ee8f90af6aad88138c39e9a0753c16cceb763c2bd2ec9c30087d92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhpwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-74fkh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:52Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:52 crc kubenswrapper[4895]: I1202 07:23:52.766856 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0468d2d1-a975-45a6-af9f-47adc432fab0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebf8583b59d7955530da7a83a29f9994ba139c076cf7b76dc94e88d14f39e1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vfwjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1474e8f5989607f3b95fefe811e3d787320f04c223edb6ffe47d2b37863ea874\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vfwjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wfcg7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:52Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:52 crc kubenswrapper[4895]: I1202 07:23:52.772693 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:52 crc kubenswrapper[4895]: I1202 07:23:52.772769 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:52 crc kubenswrapper[4895]: I1202 07:23:52.772789 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:52 crc kubenswrapper[4895]: I1202 07:23:52.772816 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:52 crc kubenswrapper[4895]: I1202 07:23:52.772834 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:52Z","lastTransitionTime":"2025-12-02T07:23:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:52 crc kubenswrapper[4895]: I1202 07:23:52.789179 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4604b0d5-cf8f-404d-af67-65da4b1dc003\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b3c366f0f27ff793a37b2944887ab1c1bc155c14d5c8d6c267442b0a8666ef8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6c3baa5b37
72a8fc64774e28c3f389f114723e7efa62b934140a4fed8b6a40d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49d7c12ea2cd199dcfa46a6f1a1a856f1285a709def9d82b6805ba256a79fd5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2543d364f902a6c34a82dac25f389cd36afc2f717d772d0a487a640d44adf30f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:52Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:52 crc kubenswrapper[4895]: I1202 07:23:52.805839 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://444ad02c4afa919779de59fd8c81bdce5c3426dd6451b90505921a8d5fa0c96b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-02T07:23:52Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:52 crc kubenswrapper[4895]: I1202 07:23:52.822860 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:52Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:52 crc kubenswrapper[4895]: I1202 07:23:52.838289 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n7xcr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3514f381-d0d1-4e00-931e-c5ca75202a1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2670ef1ea22a6c1c5d9ec4ee4b0345c575b20224fddf8655b95eaa431573d071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3deeb37702c8243a956f2be1bb61e2693bb9be5453d775bc3f3e85618d483015\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3deeb37702c8243a956f2be1bb61e2693bb9be5453d775bc3f3e85618d483015\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2c60eddf285bbbf1a5957691f56d895e5bb7461fb18ab7c495879357cfae565\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2c60eddf285bbbf1a5957691f56d895e5bb7461fb18ab7c495879357cfae565\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e8b39d38597192393d90cc582b8ad109a1ed22c4c14f539e1a5445e6d817872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e8b39d38597192393d90cc582b8ad109a1ed22c4c14f539e1a5445e6d817872\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb5ab
c8da49059e23c4013f3856d3f4c9078cd8871a79f8dae5ce4d8cb319ef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb5abc8da49059e23c4013f3856d3f4c9078cd8871a79f8dae5ce4d8cb319ef9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbdd3e8943b356a485d900737682f0903dc04aefa33498ee5cdf3c82b6570852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbdd3e8943b356a485d900737682f0903dc04aefa33498ee5cdf3c82b6570852\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:43Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3641d1f7fd2ea062c216ce260c35d903273d3039fe790815ebb697f8c11d15cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3641d1f7fd2ea062c216ce260c35d903273d3039fe790815ebb697f8c11d15cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n7xcr\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:52Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:52 crc kubenswrapper[4895]: I1202 07:23:52.855256 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hlxqt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"30911fe5-208f-44e8-a380-2a0093f24863\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87ebbf4f529d4e7b64c392912d06e57dc80a2237a9a4e31ab5b56ac556aa10b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-
12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6l8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hlxqt\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:52Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:52 crc kubenswrapper[4895]: I1202 07:23:52.868360 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5f88v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5af25091-1401-45d4-ae53-d2b469c879da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wfrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wfrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5f88v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:52Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:52 crc 
kubenswrapper[4895]: I1202 07:23:52.876215 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:52 crc kubenswrapper[4895]: I1202 07:23:52.876278 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:52 crc kubenswrapper[4895]: I1202 07:23:52.876299 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:52 crc kubenswrapper[4895]: I1202 07:23:52.876331 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:52 crc kubenswrapper[4895]: I1202 07:23:52.876352 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:52Z","lastTransitionTime":"2025-12-02T07:23:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:52 crc kubenswrapper[4895]: I1202 07:23:52.980040 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:52 crc kubenswrapper[4895]: I1202 07:23:52.980104 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:52 crc kubenswrapper[4895]: I1202 07:23:52.980115 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:52 crc kubenswrapper[4895]: I1202 07:23:52.980138 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:52 crc kubenswrapper[4895]: I1202 07:23:52.980152 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:52Z","lastTransitionTime":"2025-12-02T07:23:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:52 crc kubenswrapper[4895]: I1202 07:23:52.988701 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5af25091-1401-45d4-ae53-d2b469c879da-metrics-certs\") pod \"network-metrics-daemon-5f88v\" (UID: \"5af25091-1401-45d4-ae53-d2b469c879da\") " pod="openshift-multus/network-metrics-daemon-5f88v" Dec 02 07:23:52 crc kubenswrapper[4895]: E1202 07:23:52.988908 4895 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 02 07:23:52 crc kubenswrapper[4895]: E1202 07:23:52.988983 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5af25091-1401-45d4-ae53-d2b469c879da-metrics-certs podName:5af25091-1401-45d4-ae53-d2b469c879da nodeName:}" failed. No retries permitted until 2025-12-02 07:23:53.988961587 +0000 UTC m=+45.159821200 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5af25091-1401-45d4-ae53-d2b469c879da-metrics-certs") pod "network-metrics-daemon-5f88v" (UID: "5af25091-1401-45d4-ae53-d2b469c879da") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 02 07:23:53 crc kubenswrapper[4895]: I1202 07:23:53.083808 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:53 crc kubenswrapper[4895]: I1202 07:23:53.083865 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:53 crc kubenswrapper[4895]: I1202 07:23:53.083874 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:53 crc kubenswrapper[4895]: I1202 07:23:53.083891 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:53 crc kubenswrapper[4895]: I1202 07:23:53.083901 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:53Z","lastTransitionTime":"2025-12-02T07:23:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:23:53 crc kubenswrapper[4895]: I1202 07:23:53.140232 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:23:53 crc kubenswrapper[4895]: I1202 07:23:53.140230 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:23:53 crc kubenswrapper[4895]: E1202 07:23:53.140452 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 07:23:53 crc kubenswrapper[4895]: E1202 07:23:53.140819 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 07:23:53 crc kubenswrapper[4895]: I1202 07:23:53.187492 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:53 crc kubenswrapper[4895]: I1202 07:23:53.187635 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:53 crc kubenswrapper[4895]: I1202 07:23:53.187662 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:53 crc kubenswrapper[4895]: I1202 07:23:53.187688 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:53 crc kubenswrapper[4895]: I1202 07:23:53.187709 4895 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:53Z","lastTransitionTime":"2025-12-02T07:23:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:23:53 crc kubenswrapper[4895]: I1202 07:23:53.290441 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:53 crc kubenswrapper[4895]: I1202 07:23:53.290497 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:53 crc kubenswrapper[4895]: I1202 07:23:53.290514 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:53 crc kubenswrapper[4895]: I1202 07:23:53.290535 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:53 crc kubenswrapper[4895]: I1202 07:23:53.290550 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:53Z","lastTransitionTime":"2025-12-02T07:23:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:53 crc kubenswrapper[4895]: I1202 07:23:53.393405 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:53 crc kubenswrapper[4895]: I1202 07:23:53.393470 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:53 crc kubenswrapper[4895]: I1202 07:23:53.393490 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:53 crc kubenswrapper[4895]: I1202 07:23:53.393520 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:53 crc kubenswrapper[4895]: I1202 07:23:53.393540 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:53Z","lastTransitionTime":"2025-12-02T07:23:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:53 crc kubenswrapper[4895]: I1202 07:23:53.496711 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:53 crc kubenswrapper[4895]: I1202 07:23:53.496805 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:53 crc kubenswrapper[4895]: I1202 07:23:53.496824 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:53 crc kubenswrapper[4895]: I1202 07:23:53.496852 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:53 crc kubenswrapper[4895]: I1202 07:23:53.496871 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:53Z","lastTransitionTime":"2025-12-02T07:23:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:53 crc kubenswrapper[4895]: I1202 07:23:53.600143 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:53 crc kubenswrapper[4895]: I1202 07:23:53.600215 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:53 crc kubenswrapper[4895]: I1202 07:23:53.600233 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:53 crc kubenswrapper[4895]: I1202 07:23:53.600262 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:53 crc kubenswrapper[4895]: I1202 07:23:53.600281 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:53Z","lastTransitionTime":"2025-12-02T07:23:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:53 crc kubenswrapper[4895]: I1202 07:23:53.702855 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:53 crc kubenswrapper[4895]: I1202 07:23:53.702935 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:53 crc kubenswrapper[4895]: I1202 07:23:53.702960 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:53 crc kubenswrapper[4895]: I1202 07:23:53.702994 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:53 crc kubenswrapper[4895]: I1202 07:23:53.703022 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:53Z","lastTransitionTime":"2025-12-02T07:23:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:53 crc kubenswrapper[4895]: I1202 07:23:53.806696 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:53 crc kubenswrapper[4895]: I1202 07:23:53.806780 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:53 crc kubenswrapper[4895]: I1202 07:23:53.806799 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:53 crc kubenswrapper[4895]: I1202 07:23:53.806825 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:53 crc kubenswrapper[4895]: I1202 07:23:53.806839 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:53Z","lastTransitionTime":"2025-12-02T07:23:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:53 crc kubenswrapper[4895]: I1202 07:23:53.910831 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:53 crc kubenswrapper[4895]: I1202 07:23:53.910939 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:53 crc kubenswrapper[4895]: I1202 07:23:53.910966 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:53 crc kubenswrapper[4895]: I1202 07:23:53.911001 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:53 crc kubenswrapper[4895]: I1202 07:23:53.911021 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:53Z","lastTransitionTime":"2025-12-02T07:23:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:53 crc kubenswrapper[4895]: I1202 07:23:53.999712 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5af25091-1401-45d4-ae53-d2b469c879da-metrics-certs\") pod \"network-metrics-daemon-5f88v\" (UID: \"5af25091-1401-45d4-ae53-d2b469c879da\") " pod="openshift-multus/network-metrics-daemon-5f88v" Dec 02 07:23:54 crc kubenswrapper[4895]: E1202 07:23:54.000031 4895 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 02 07:23:54 crc kubenswrapper[4895]: E1202 07:23:54.000171 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5af25091-1401-45d4-ae53-d2b469c879da-metrics-certs podName:5af25091-1401-45d4-ae53-d2b469c879da nodeName:}" failed. No retries permitted until 2025-12-02 07:23:56.000137836 +0000 UTC m=+47.170997479 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5af25091-1401-45d4-ae53-d2b469c879da-metrics-certs") pod "network-metrics-daemon-5f88v" (UID: "5af25091-1401-45d4-ae53-d2b469c879da") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 02 07:23:54 crc kubenswrapper[4895]: I1202 07:23:54.014313 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:54 crc kubenswrapper[4895]: I1202 07:23:54.014370 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:54 crc kubenswrapper[4895]: I1202 07:23:54.014387 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:54 crc kubenswrapper[4895]: I1202 07:23:54.014413 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:54 crc kubenswrapper[4895]: I1202 07:23:54.014432 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:54Z","lastTransitionTime":"2025-12-02T07:23:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:54 crc kubenswrapper[4895]: I1202 07:23:54.117659 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:54 crc kubenswrapper[4895]: I1202 07:23:54.117735 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:54 crc kubenswrapper[4895]: I1202 07:23:54.117824 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:54 crc kubenswrapper[4895]: I1202 07:23:54.117860 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:54 crc kubenswrapper[4895]: I1202 07:23:54.117887 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:54Z","lastTransitionTime":"2025-12-02T07:23:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:23:54 crc kubenswrapper[4895]: I1202 07:23:54.140987 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5f88v" Dec 02 07:23:54 crc kubenswrapper[4895]: I1202 07:23:54.141036 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:23:54 crc kubenswrapper[4895]: E1202 07:23:54.141133 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5f88v" podUID="5af25091-1401-45d4-ae53-d2b469c879da" Dec 02 07:23:54 crc kubenswrapper[4895]: E1202 07:23:54.141273 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 07:23:54 crc kubenswrapper[4895]: I1202 07:23:54.221264 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:54 crc kubenswrapper[4895]: I1202 07:23:54.221322 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:54 crc kubenswrapper[4895]: I1202 07:23:54.221340 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:54 crc kubenswrapper[4895]: I1202 07:23:54.221366 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:54 crc kubenswrapper[4895]: I1202 07:23:54.221391 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:54Z","lastTransitionTime":"2025-12-02T07:23:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:54 crc kubenswrapper[4895]: I1202 07:23:54.325078 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:54 crc kubenswrapper[4895]: I1202 07:23:54.325159 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:54 crc kubenswrapper[4895]: I1202 07:23:54.325184 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:54 crc kubenswrapper[4895]: I1202 07:23:54.325214 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:54 crc kubenswrapper[4895]: I1202 07:23:54.325243 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:54Z","lastTransitionTime":"2025-12-02T07:23:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:54 crc kubenswrapper[4895]: I1202 07:23:54.428004 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:54 crc kubenswrapper[4895]: I1202 07:23:54.428127 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:54 crc kubenswrapper[4895]: I1202 07:23:54.428148 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:54 crc kubenswrapper[4895]: I1202 07:23:54.428178 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:54 crc kubenswrapper[4895]: I1202 07:23:54.428237 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:54Z","lastTransitionTime":"2025-12-02T07:23:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:54 crc kubenswrapper[4895]: I1202 07:23:54.531068 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:54 crc kubenswrapper[4895]: I1202 07:23:54.531145 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:54 crc kubenswrapper[4895]: I1202 07:23:54.531162 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:54 crc kubenswrapper[4895]: I1202 07:23:54.531187 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:54 crc kubenswrapper[4895]: I1202 07:23:54.531203 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:54Z","lastTransitionTime":"2025-12-02T07:23:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:54 crc kubenswrapper[4895]: I1202 07:23:54.635063 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:54 crc kubenswrapper[4895]: I1202 07:23:54.635128 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:54 crc kubenswrapper[4895]: I1202 07:23:54.635155 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:54 crc kubenswrapper[4895]: I1202 07:23:54.635188 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:54 crc kubenswrapper[4895]: I1202 07:23:54.635210 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:54Z","lastTransitionTime":"2025-12-02T07:23:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:54 crc kubenswrapper[4895]: I1202 07:23:54.739412 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:54 crc kubenswrapper[4895]: I1202 07:23:54.739481 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:54 crc kubenswrapper[4895]: I1202 07:23:54.739504 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:54 crc kubenswrapper[4895]: I1202 07:23:54.739538 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:54 crc kubenswrapper[4895]: I1202 07:23:54.739570 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:54Z","lastTransitionTime":"2025-12-02T07:23:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:54 crc kubenswrapper[4895]: I1202 07:23:54.843422 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:54 crc kubenswrapper[4895]: I1202 07:23:54.843522 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:54 crc kubenswrapper[4895]: I1202 07:23:54.843551 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:54 crc kubenswrapper[4895]: I1202 07:23:54.843587 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:54 crc kubenswrapper[4895]: I1202 07:23:54.843613 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:54Z","lastTransitionTime":"2025-12-02T07:23:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:54 crc kubenswrapper[4895]: I1202 07:23:54.946663 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:54 crc kubenswrapper[4895]: I1202 07:23:54.946789 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:54 crc kubenswrapper[4895]: I1202 07:23:54.946809 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:54 crc kubenswrapper[4895]: I1202 07:23:54.946837 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:54 crc kubenswrapper[4895]: I1202 07:23:54.946857 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:54Z","lastTransitionTime":"2025-12-02T07:23:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:55 crc kubenswrapper[4895]: I1202 07:23:55.050907 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:55 crc kubenswrapper[4895]: I1202 07:23:55.050997 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:55 crc kubenswrapper[4895]: I1202 07:23:55.051021 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:55 crc kubenswrapper[4895]: I1202 07:23:55.051060 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:55 crc kubenswrapper[4895]: I1202 07:23:55.051087 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:55Z","lastTransitionTime":"2025-12-02T07:23:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:23:55 crc kubenswrapper[4895]: I1202 07:23:55.140480 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:23:55 crc kubenswrapper[4895]: I1202 07:23:55.140651 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:23:55 crc kubenswrapper[4895]: E1202 07:23:55.140869 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 07:23:55 crc kubenswrapper[4895]: E1202 07:23:55.141109 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 07:23:55 crc kubenswrapper[4895]: I1202 07:23:55.153457 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:55 crc kubenswrapper[4895]: I1202 07:23:55.153516 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:55 crc kubenswrapper[4895]: I1202 07:23:55.153535 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:55 crc kubenswrapper[4895]: I1202 07:23:55.153562 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:55 crc kubenswrapper[4895]: I1202 07:23:55.153579 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:55Z","lastTransitionTime":"2025-12-02T07:23:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:55 crc kubenswrapper[4895]: I1202 07:23:55.258104 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:55 crc kubenswrapper[4895]: I1202 07:23:55.258187 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:55 crc kubenswrapper[4895]: I1202 07:23:55.258211 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:55 crc kubenswrapper[4895]: I1202 07:23:55.258241 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:55 crc kubenswrapper[4895]: I1202 07:23:55.258259 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:55Z","lastTransitionTime":"2025-12-02T07:23:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:55 crc kubenswrapper[4895]: I1202 07:23:55.361792 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:55 crc kubenswrapper[4895]: I1202 07:23:55.361886 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:55 crc kubenswrapper[4895]: I1202 07:23:55.361920 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:55 crc kubenswrapper[4895]: I1202 07:23:55.361954 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:55 crc kubenswrapper[4895]: I1202 07:23:55.361977 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:55Z","lastTransitionTime":"2025-12-02T07:23:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:55 crc kubenswrapper[4895]: I1202 07:23:55.466166 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:55 crc kubenswrapper[4895]: I1202 07:23:55.466244 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:55 crc kubenswrapper[4895]: I1202 07:23:55.466263 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:55 crc kubenswrapper[4895]: I1202 07:23:55.466292 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:55 crc kubenswrapper[4895]: I1202 07:23:55.466310 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:55Z","lastTransitionTime":"2025-12-02T07:23:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:55 crc kubenswrapper[4895]: I1202 07:23:55.569635 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:55 crc kubenswrapper[4895]: I1202 07:23:55.569712 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:55 crc kubenswrapper[4895]: I1202 07:23:55.569731 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:55 crc kubenswrapper[4895]: I1202 07:23:55.569796 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:55 crc kubenswrapper[4895]: I1202 07:23:55.569850 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:55Z","lastTransitionTime":"2025-12-02T07:23:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:55 crc kubenswrapper[4895]: I1202 07:23:55.673470 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:55 crc kubenswrapper[4895]: I1202 07:23:55.673554 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:55 crc kubenswrapper[4895]: I1202 07:23:55.673580 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:55 crc kubenswrapper[4895]: I1202 07:23:55.673616 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:55 crc kubenswrapper[4895]: I1202 07:23:55.673643 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:55Z","lastTransitionTime":"2025-12-02T07:23:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:55 crc kubenswrapper[4895]: I1202 07:23:55.777967 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:55 crc kubenswrapper[4895]: I1202 07:23:55.778055 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:55 crc kubenswrapper[4895]: I1202 07:23:55.778080 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:55 crc kubenswrapper[4895]: I1202 07:23:55.778113 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:55 crc kubenswrapper[4895]: I1202 07:23:55.778135 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:55Z","lastTransitionTime":"2025-12-02T07:23:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:55 crc kubenswrapper[4895]: I1202 07:23:55.882001 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:55 crc kubenswrapper[4895]: I1202 07:23:55.882085 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:55 crc kubenswrapper[4895]: I1202 07:23:55.882105 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:55 crc kubenswrapper[4895]: I1202 07:23:55.882139 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:55 crc kubenswrapper[4895]: I1202 07:23:55.882162 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:55Z","lastTransitionTime":"2025-12-02T07:23:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:55 crc kubenswrapper[4895]: I1202 07:23:55.985668 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:55 crc kubenswrapper[4895]: I1202 07:23:55.986046 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:55 crc kubenswrapper[4895]: I1202 07:23:55.986083 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:55 crc kubenswrapper[4895]: I1202 07:23:55.986115 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:55 crc kubenswrapper[4895]: I1202 07:23:55.986136 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:55Z","lastTransitionTime":"2025-12-02T07:23:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:56 crc kubenswrapper[4895]: I1202 07:23:56.024212 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5af25091-1401-45d4-ae53-d2b469c879da-metrics-certs\") pod \"network-metrics-daemon-5f88v\" (UID: \"5af25091-1401-45d4-ae53-d2b469c879da\") " pod="openshift-multus/network-metrics-daemon-5f88v" Dec 02 07:23:56 crc kubenswrapper[4895]: E1202 07:23:56.024397 4895 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 02 07:23:56 crc kubenswrapper[4895]: E1202 07:23:56.024509 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5af25091-1401-45d4-ae53-d2b469c879da-metrics-certs podName:5af25091-1401-45d4-ae53-d2b469c879da nodeName:}" failed. No retries permitted until 2025-12-02 07:24:00.024487471 +0000 UTC m=+51.195347084 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5af25091-1401-45d4-ae53-d2b469c879da-metrics-certs") pod "network-metrics-daemon-5f88v" (UID: "5af25091-1401-45d4-ae53-d2b469c879da") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 02 07:23:56 crc kubenswrapper[4895]: I1202 07:23:56.089123 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:56 crc kubenswrapper[4895]: I1202 07:23:56.089202 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:56 crc kubenswrapper[4895]: I1202 07:23:56.089223 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:56 crc kubenswrapper[4895]: I1202 07:23:56.089256 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:56 crc kubenswrapper[4895]: I1202 07:23:56.089274 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:56Z","lastTransitionTime":"2025-12-02T07:23:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:23:56 crc kubenswrapper[4895]: I1202 07:23:56.141040 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5f88v" Dec 02 07:23:56 crc kubenswrapper[4895]: I1202 07:23:56.141068 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:23:56 crc kubenswrapper[4895]: E1202 07:23:56.141333 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5f88v" podUID="5af25091-1401-45d4-ae53-d2b469c879da" Dec 02 07:23:56 crc kubenswrapper[4895]: E1202 07:23:56.141467 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 07:23:56 crc kubenswrapper[4895]: I1202 07:23:56.193535 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:56 crc kubenswrapper[4895]: I1202 07:23:56.193606 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:56 crc kubenswrapper[4895]: I1202 07:23:56.193624 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:56 crc kubenswrapper[4895]: I1202 07:23:56.193658 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:56 crc kubenswrapper[4895]: I1202 07:23:56.193680 4895 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:56Z","lastTransitionTime":"2025-12-02T07:23:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:23:56 crc kubenswrapper[4895]: I1202 07:23:56.297021 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:56 crc kubenswrapper[4895]: I1202 07:23:56.297107 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:56 crc kubenswrapper[4895]: I1202 07:23:56.297130 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:56 crc kubenswrapper[4895]: I1202 07:23:56.297162 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:56 crc kubenswrapper[4895]: I1202 07:23:56.297186 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:56Z","lastTransitionTime":"2025-12-02T07:23:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:56 crc kubenswrapper[4895]: I1202 07:23:56.400664 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:56 crc kubenswrapper[4895]: I1202 07:23:56.400735 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:56 crc kubenswrapper[4895]: I1202 07:23:56.400785 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:56 crc kubenswrapper[4895]: I1202 07:23:56.400820 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:56 crc kubenswrapper[4895]: I1202 07:23:56.400845 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:56Z","lastTransitionTime":"2025-12-02T07:23:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:56 crc kubenswrapper[4895]: I1202 07:23:56.503704 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:56 crc kubenswrapper[4895]: I1202 07:23:56.503873 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:56 crc kubenswrapper[4895]: I1202 07:23:56.503953 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:56 crc kubenswrapper[4895]: I1202 07:23:56.503988 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:56 crc kubenswrapper[4895]: I1202 07:23:56.504011 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:56Z","lastTransitionTime":"2025-12-02T07:23:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:56 crc kubenswrapper[4895]: I1202 07:23:56.607927 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:56 crc kubenswrapper[4895]: I1202 07:23:56.607997 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:56 crc kubenswrapper[4895]: I1202 07:23:56.608020 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:56 crc kubenswrapper[4895]: I1202 07:23:56.608046 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:56 crc kubenswrapper[4895]: I1202 07:23:56.608065 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:56Z","lastTransitionTime":"2025-12-02T07:23:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:56 crc kubenswrapper[4895]: I1202 07:23:56.712108 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:56 crc kubenswrapper[4895]: I1202 07:23:56.712182 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:56 crc kubenswrapper[4895]: I1202 07:23:56.712206 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:56 crc kubenswrapper[4895]: I1202 07:23:56.712238 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:56 crc kubenswrapper[4895]: I1202 07:23:56.712259 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:56Z","lastTransitionTime":"2025-12-02T07:23:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:56 crc kubenswrapper[4895]: I1202 07:23:56.819215 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:56 crc kubenswrapper[4895]: I1202 07:23:56.819296 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:56 crc kubenswrapper[4895]: I1202 07:23:56.819321 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:56 crc kubenswrapper[4895]: I1202 07:23:56.819357 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:56 crc kubenswrapper[4895]: I1202 07:23:56.819379 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:56Z","lastTransitionTime":"2025-12-02T07:23:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:56 crc kubenswrapper[4895]: E1202 07:23:56.842699 4895 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:23:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:23:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:23:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:23:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"42683c5b-b2bf-439b-8ee4-25c8d72cfed1\\\",\\\"systemUUID\\\":\\\"92c636b1-dcb0-457f-b098-73baeaac297e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:56Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:56 crc kubenswrapper[4895]: I1202 07:23:56.849131 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:56 crc kubenswrapper[4895]: I1202 07:23:56.849217 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:56 crc kubenswrapper[4895]: I1202 07:23:56.849240 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:56 crc kubenswrapper[4895]: I1202 07:23:56.849273 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:56 crc kubenswrapper[4895]: I1202 07:23:56.849299 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:56Z","lastTransitionTime":"2025-12-02T07:23:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:56 crc kubenswrapper[4895]: E1202 07:23:56.869131 4895 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:23:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:23:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:23:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:23:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"42683c5b-b2bf-439b-8ee4-25c8d72cfed1\\\",\\\"systemUUID\\\":\\\"92c636b1-dcb0-457f-b098-73baeaac297e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:56Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:56 crc kubenswrapper[4895]: I1202 07:23:56.875793 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:56 crc kubenswrapper[4895]: I1202 07:23:56.875849 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:56 crc kubenswrapper[4895]: I1202 07:23:56.875871 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:56 crc kubenswrapper[4895]: I1202 07:23:56.875902 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:56 crc kubenswrapper[4895]: I1202 07:23:56.875921 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:56Z","lastTransitionTime":"2025-12-02T07:23:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:56 crc kubenswrapper[4895]: E1202 07:23:56.895337 4895 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:23:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:23:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:23:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:23:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"42683c5b-b2bf-439b-8ee4-25c8d72cfed1\\\",\\\"systemUUID\\\":\\\"92c636b1-dcb0-457f-b098-73baeaac297e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:56Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:56 crc kubenswrapper[4895]: I1202 07:23:56.901668 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:56 crc kubenswrapper[4895]: I1202 07:23:56.901728 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:56 crc kubenswrapper[4895]: I1202 07:23:56.901784 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:56 crc kubenswrapper[4895]: I1202 07:23:56.901813 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:56 crc kubenswrapper[4895]: I1202 07:23:56.901837 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:56Z","lastTransitionTime":"2025-12-02T07:23:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:56 crc kubenswrapper[4895]: E1202 07:23:56.918596 4895 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:23:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:23:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:23:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:23:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"42683c5b-b2bf-439b-8ee4-25c8d72cfed1\\\",\\\"systemUUID\\\":\\\"92c636b1-dcb0-457f-b098-73baeaac297e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:56Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:56 crc kubenswrapper[4895]: I1202 07:23:56.925063 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:56 crc kubenswrapper[4895]: I1202 07:23:56.925128 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:56 crc kubenswrapper[4895]: I1202 07:23:56.925150 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:56 crc kubenswrapper[4895]: I1202 07:23:56.925178 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:56 crc kubenswrapper[4895]: I1202 07:23:56.925203 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:56Z","lastTransitionTime":"2025-12-02T07:23:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:56 crc kubenswrapper[4895]: E1202 07:23:56.955479 4895 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:23:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:23:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:23:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:23:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"42683c5b-b2bf-439b-8ee4-25c8d72cfed1\\\",\\\"systemUUID\\\":\\\"92c636b1-dcb0-457f-b098-73baeaac297e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:56Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:56 crc kubenswrapper[4895]: E1202 07:23:56.955730 4895 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 02 07:23:56 crc kubenswrapper[4895]: I1202 07:23:56.958109 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:56 crc kubenswrapper[4895]: I1202 07:23:56.958147 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:56 crc kubenswrapper[4895]: I1202 07:23:56.958161 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:56 crc kubenswrapper[4895]: I1202 07:23:56.958184 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:56 crc kubenswrapper[4895]: I1202 07:23:56.958199 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:56Z","lastTransitionTime":"2025-12-02T07:23:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:57 crc kubenswrapper[4895]: I1202 07:23:57.061945 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:57 crc kubenswrapper[4895]: I1202 07:23:57.062138 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:57 crc kubenswrapper[4895]: I1202 07:23:57.062169 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:57 crc kubenswrapper[4895]: I1202 07:23:57.062201 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:57 crc kubenswrapper[4895]: I1202 07:23:57.062227 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:57Z","lastTransitionTime":"2025-12-02T07:23:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:23:57 crc kubenswrapper[4895]: I1202 07:23:57.140579 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:23:57 crc kubenswrapper[4895]: I1202 07:23:57.140586 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:23:57 crc kubenswrapper[4895]: E1202 07:23:57.140887 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 07:23:57 crc kubenswrapper[4895]: E1202 07:23:57.140993 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 07:23:57 crc kubenswrapper[4895]: I1202 07:23:57.165775 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:57 crc kubenswrapper[4895]: I1202 07:23:57.165843 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:57 crc kubenswrapper[4895]: I1202 07:23:57.165857 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:57 crc kubenswrapper[4895]: I1202 07:23:57.165885 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:57 crc kubenswrapper[4895]: I1202 07:23:57.165901 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:57Z","lastTransitionTime":"2025-12-02T07:23:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:57 crc kubenswrapper[4895]: I1202 07:23:57.269940 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:57 crc kubenswrapper[4895]: I1202 07:23:57.270107 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:57 crc kubenswrapper[4895]: I1202 07:23:57.270133 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:57 crc kubenswrapper[4895]: I1202 07:23:57.270164 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:57 crc kubenswrapper[4895]: I1202 07:23:57.270186 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:57Z","lastTransitionTime":"2025-12-02T07:23:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:57 crc kubenswrapper[4895]: I1202 07:23:57.373639 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:57 crc kubenswrapper[4895]: I1202 07:23:57.373795 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:57 crc kubenswrapper[4895]: I1202 07:23:57.373822 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:57 crc kubenswrapper[4895]: I1202 07:23:57.373859 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:57 crc kubenswrapper[4895]: I1202 07:23:57.373882 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:57Z","lastTransitionTime":"2025-12-02T07:23:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:57 crc kubenswrapper[4895]: I1202 07:23:57.477019 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:57 crc kubenswrapper[4895]: I1202 07:23:57.477534 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:57 crc kubenswrapper[4895]: I1202 07:23:57.477732 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:57 crc kubenswrapper[4895]: I1202 07:23:57.477986 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:57 crc kubenswrapper[4895]: I1202 07:23:57.478168 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:57Z","lastTransitionTime":"2025-12-02T07:23:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:57 crc kubenswrapper[4895]: I1202 07:23:57.581620 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:57 crc kubenswrapper[4895]: I1202 07:23:57.581707 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:57 crc kubenswrapper[4895]: I1202 07:23:57.581736 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:57 crc kubenswrapper[4895]: I1202 07:23:57.581810 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:57 crc kubenswrapper[4895]: I1202 07:23:57.581838 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:57Z","lastTransitionTime":"2025-12-02T07:23:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:57 crc kubenswrapper[4895]: I1202 07:23:57.686140 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:57 crc kubenswrapper[4895]: I1202 07:23:57.686216 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:57 crc kubenswrapper[4895]: I1202 07:23:57.686237 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:57 crc kubenswrapper[4895]: I1202 07:23:57.686270 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:57 crc kubenswrapper[4895]: I1202 07:23:57.686296 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:57Z","lastTransitionTime":"2025-12-02T07:23:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:57 crc kubenswrapper[4895]: I1202 07:23:57.790468 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:57 crc kubenswrapper[4895]: I1202 07:23:57.790537 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:57 crc kubenswrapper[4895]: I1202 07:23:57.790560 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:57 crc kubenswrapper[4895]: I1202 07:23:57.790591 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:57 crc kubenswrapper[4895]: I1202 07:23:57.790611 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:57Z","lastTransitionTime":"2025-12-02T07:23:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:57 crc kubenswrapper[4895]: I1202 07:23:57.894278 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:57 crc kubenswrapper[4895]: I1202 07:23:57.894359 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:57 crc kubenswrapper[4895]: I1202 07:23:57.894383 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:57 crc kubenswrapper[4895]: I1202 07:23:57.894416 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:57 crc kubenswrapper[4895]: I1202 07:23:57.894441 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:57Z","lastTransitionTime":"2025-12-02T07:23:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:57 crc kubenswrapper[4895]: I1202 07:23:57.998775 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:57 crc kubenswrapper[4895]: I1202 07:23:57.998848 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:57 crc kubenswrapper[4895]: I1202 07:23:57.998872 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:57 crc kubenswrapper[4895]: I1202 07:23:57.998898 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:57 crc kubenswrapper[4895]: I1202 07:23:57.998917 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:57Z","lastTransitionTime":"2025-12-02T07:23:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:58 crc kubenswrapper[4895]: I1202 07:23:58.003078 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 02 07:23:58 crc kubenswrapper[4895]: I1202 07:23:58.022453 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Dec 02 07:23:58 crc kubenswrapper[4895]: I1202 07:23:58.028613 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2db13abc9ba56d955814e75ceea54b92610a00daeefa29aa1aad584e3453728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\"
,\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:58Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:58 crc kubenswrapper[4895]: I1202 07:23:58.052465 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bb4bea68942dbface0009fbac76f7b1253ab282e8decb9081225de7d6537ec0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a
2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3115f2c50d05ee406f583937e39b1b55f364ce7cbeb0788b75941a22e23f57f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-12-02T07:23:58Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:58 crc kubenswrapper[4895]: I1202 07:23:58.085405 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"afc3334a-0153-4dcc-9a56-92f6cae51c08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91b0c6e50085a7be0924e5a54ef006110a7f13556fe47f7f55124fd6b2f0bc36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5463b077f0bc8b5d38e27854ece6a71f74f7769f4552c32eefe16854993f290f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80dca705ddb94ba799eb72bfc23b0c3878cceca48d6d10f1c7f8eb0e4beed40e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f54ae324450e73bcbf2c408794171a92bbb66c24d05f29584ac2c247090c1273\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f330d728eecd672ecb554744a51ebfb2c104441d3a85fe0c8a6fc3446c37e4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca732cac495c410d89da4ce1476acd20905716cbb2baeedac9c52b654ff58573\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://370f4392aff2eb7ec2d34d9535b978f2354692fcc0945eae167d222b1d97024d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://370f4392aff2eb7ec2d34d9535b978f2354692fcc0945eae167d222b1d97024d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T07:23:49Z\\\",\\\"message\\\":\\\"Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1202 07:23:49.673830 6315 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-apiserver/check-endpoints]} 
name:Service_openshift-apiserver/check-endpoints_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.139:17698:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {8efa4d1a-72f5-4dfa-9bc2-9d93ef11ecf2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1202 07:23:49.676921 6315 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-apiserver/api\\\\\\\"}\\\\nI1202 07:23:49.676925 6315 obj_retry.go:386] Retry successful for *v1.Pod openshift-machine-config-operator/machine-config-daemon-wfcg7 after 0 failed attempt(s)\\\\nI1202 07:23:49.676901 6315 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-apiserver/check-endpoints]} name:Service_openshift-apiserver/check-endpoints_TCP_cluster options:{GoMap:map\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-w54m4_openshift-ovn-kubernetes(afc3334a-0153-4dcc-9a56-92f6cae51c08)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6447684fe6189e38c9caad6dfe1384b7af01d50e2ef5f889c7b2874ba092306\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://328fa773c92ef0a965065412d4141c0b41ffe38d21ed169386a9d30d05f37f28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://328fa773c92ef0a965
065412d4141c0b41ffe38d21ed169386a9d30d05f37f28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w54m4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:58Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:58 crc kubenswrapper[4895]: I1202 07:23:58.101874 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:58 crc kubenswrapper[4895]: I1202 07:23:58.101942 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:58 crc kubenswrapper[4895]: I1202 07:23:58.101961 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:58 crc kubenswrapper[4895]: I1202 07:23:58.101989 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:58 crc kubenswrapper[4895]: I1202 07:23:58.102009 4895 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:58Z","lastTransitionTime":"2025-12-02T07:23:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:23:58 crc kubenswrapper[4895]: I1202 07:23:58.108383 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b7fa1ad-7e38-4645-ab41-c7d395f5c10e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40e020a2b4caf39d1661cee9c6b0055eca86e0ae9b81d2da7485ca9ec92dd0aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"r
unning\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33e300332628d3baf9041b52dc7b8cd9dab1e9c5da76572e86b787433fd5fb92\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4455fec2a817e7266e152d3f4afbede483504d6066f1a0f7375bce282ae10f8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b
674e20b2199fd0083aed9cf001d27a72d31ca6c62cbe376c9510cc94f37d312\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9998dffc20c5b0579e6214d10541b9702e9c4a6563c930a41e49b6853b832ba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T07:23:32Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 07:23:21.537254 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 07:23:21.538901 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1012964219/tls.crt::/tmp/serving-cert-1012964219/tls.key\\\\\\\"\\\\nI1202 07:23:32.848375 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 07:23:32.853597 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 07:23:32.853643 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 07:23:32.853680 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 07:23:32.853691 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 07:23:32.869490 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 07:23:32.869536 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:23:32.869542 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:23:32.869555 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 07:23:32.869559 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 07:23:32.869564 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 07:23:32.869568 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 07:23:32.869624 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 07:23:32.874299 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d318f71ad32dd4a410093b85741038e468d540edd30f112234a9d595bc6fd85f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{
\\\"containerID\\\":\\\"cri-o://0faa0995131a1f9cbf2aa0cf7651d510fad1bdaf7e32de51f8a14b70f161e783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0faa0995131a1f9cbf2aa0cf7651d510fad1bdaf7e32de51f8a14b70f161e783\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:58Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:58 crc kubenswrapper[4895]: I1202 07:23:58.129843 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:58Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:58 crc kubenswrapper[4895]: I1202 07:23:58.140070 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:23:58 crc kubenswrapper[4895]: E1202 07:23:58.140311 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 07:23:58 crc kubenswrapper[4895]: I1202 07:23:58.140456 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-5f88v" Dec 02 07:23:58 crc kubenswrapper[4895]: E1202 07:23:58.140966 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5f88v" podUID="5af25091-1401-45d4-ae53-d2b469c879da" Dec 02 07:23:58 crc kubenswrapper[4895]: I1202 07:23:58.148299 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-74fkh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b212bd8-f1a7-4982-b2e6-499c13a34b0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4d58e6f37ee8f90af6aad88138c39e9a0753c16cceb763c2bd2ec9c30087d92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhpwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-74fkh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:58Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:58 crc kubenswrapper[4895]: I1202 07:23:58.168436 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0468d2d1-a975-45a6-af9f-47adc432fab0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebf8583b59d7955530da7a83a29f9994ba139c076cf7b76dc94e88d14f39e1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vfwjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1474e8f5989607f3b95fefe811e3d787320f04c2
23edb6ffe47d2b37863ea874\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vfwjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wfcg7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:58Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:58 crc kubenswrapper[4895]: I1202 07:23:58.189104 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://444ad02c4afa919779de59fd8c81bdce5c3426dd6451b90505921a8d5fa0c96b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-02T07:23:58Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:58 crc kubenswrapper[4895]: I1202 07:23:58.205327 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:58 crc kubenswrapper[4895]: I1202 07:23:58.205367 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:58 crc kubenswrapper[4895]: I1202 07:23:58.205378 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:58 crc kubenswrapper[4895]: I1202 07:23:58.205396 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:58 crc kubenswrapper[4895]: I1202 07:23:58.205409 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:58Z","lastTransitionTime":"2025-12-02T07:23:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:58 crc kubenswrapper[4895]: I1202 07:23:58.209314 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:58Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:58 crc kubenswrapper[4895]: I1202 07:23:58.228685 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n7xcr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3514f381-d0d1-4e00-931e-c5ca75202a1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2670ef1ea22a6c1c5d9ec4ee4b0345c575b20224fddf8655b95eaa431573d071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3deeb37702c8243a956f2be1bb61e2693bb9be5453d775bc3f3e85618d483015\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3deeb37702c8243a956f2be1bb61e2693bb9be5453d775bc3f3e85618d483015\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2c60eddf285bbbf1a5957691f56d895e5bb7461fb18ab7c495879357cfae565\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2c60eddf285bbbf1a5957691f56d895e5bb7461fb18ab7c495879357cfae565\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e8b39d38597192393d90cc582b8ad109a1ed22c4c14f539e1a5445e6d817872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e8b39d38597192393d90cc582b8ad109a1ed22c4c14f539e1a5445e6d817872\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb5ab
c8da49059e23c4013f3856d3f4c9078cd8871a79f8dae5ce4d8cb319ef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb5abc8da49059e23c4013f3856d3f4c9078cd8871a79f8dae5ce4d8cb319ef9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbdd3e8943b356a485d900737682f0903dc04aefa33498ee5cdf3c82b6570852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbdd3e8943b356a485d900737682f0903dc04aefa33498ee5cdf3c82b6570852\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:43Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3641d1f7fd2ea062c216ce260c35d903273d3039fe790815ebb697f8c11d15cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3641d1f7fd2ea062c216ce260c35d903273d3039fe790815ebb697f8c11d15cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n7xcr\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:58Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:58 crc kubenswrapper[4895]: I1202 07:23:58.248652 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hlxqt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"30911fe5-208f-44e8-a380-2a0093f24863\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87ebbf4f529d4e7b64c392912d06e57dc80a2237a9a4e31ab5b56ac556aa10b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-
12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6l8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hlxqt\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:58Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:58 crc kubenswrapper[4895]: I1202 07:23:58.266096 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5f88v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5af25091-1401-45d4-ae53-d2b469c879da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wfrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wfrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5f88v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:58Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:58 crc 
kubenswrapper[4895]: I1202 07:23:58.280860 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4604b0d5-cf8f-404d-af67-65da4b1dc003\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b3c366f0f27ff793a37b2944887ab1c1bc155c14d5c8d6c267442b0a8666ef8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6c3baa5b3772a8fc64774e28c3f389f114723e7efa62b934140a4fed8b6a40d\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49d7c12ea2cd199dcfa46a6f1a1a856f1285a709def9d82b6805ba256a79fd5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2543d364f902a6c34a82dac25f389cd36afc2f717d772d0a487a640d44adf30f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:58Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:58 crc kubenswrapper[4895]: I1202 07:23:58.292333 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:58Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:58 crc kubenswrapper[4895]: I1202 07:23:58.302010 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lhpbd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"81128292-8c02-45bd-9b25-e9457e989975\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cc63751b6428c88deed36b914b2a95a94a50b544c562126c2a9567c177b2570\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2z5n2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lhpbd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:58Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:58 crc kubenswrapper[4895]: I1202 07:23:58.308614 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:58 crc kubenswrapper[4895]: I1202 07:23:58.308693 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:58 crc kubenswrapper[4895]: I1202 07:23:58.308721 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:58 crc kubenswrapper[4895]: I1202 07:23:58.308789 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:58 crc kubenswrapper[4895]: I1202 07:23:58.308817 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:58Z","lastTransitionTime":"2025-12-02T07:23:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:58 crc kubenswrapper[4895]: I1202 07:23:58.316400 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wxp5p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a0b438f-70fd-4c3f-a170-2b3fe9797955\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7329cf1034b2d51ebb7f978448558a8b6fa39d58823289c470fddcfed262e395\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj6l9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cbc34b4d78cc00ac79b9944d92c6ac61a9d5dd704f36a473950baf53dfcbf47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj6l9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wxp5p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:58Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:58 crc kubenswrapper[4895]: I1202 07:23:58.412103 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:58 crc 
kubenswrapper[4895]: I1202 07:23:58.412195 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:58 crc kubenswrapper[4895]: I1202 07:23:58.412217 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:58 crc kubenswrapper[4895]: I1202 07:23:58.412249 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:58 crc kubenswrapper[4895]: I1202 07:23:58.412268 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:58Z","lastTransitionTime":"2025-12-02T07:23:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:23:58 crc kubenswrapper[4895]: I1202 07:23:58.516403 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:58 crc kubenswrapper[4895]: I1202 07:23:58.516459 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:58 crc kubenswrapper[4895]: I1202 07:23:58.516471 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:58 crc kubenswrapper[4895]: I1202 07:23:58.516493 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:58 crc kubenswrapper[4895]: I1202 07:23:58.516510 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:58Z","lastTransitionTime":"2025-12-02T07:23:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:23:58 crc kubenswrapper[4895]: I1202 07:23:58.619332 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:58 crc kubenswrapper[4895]: I1202 07:23:58.619381 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:58 crc kubenswrapper[4895]: I1202 07:23:58.619391 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:58 crc kubenswrapper[4895]: I1202 07:23:58.619412 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:58 crc kubenswrapper[4895]: I1202 07:23:58.619422 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:58Z","lastTransitionTime":"2025-12-02T07:23:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:58 crc kubenswrapper[4895]: I1202 07:23:58.722810 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:58 crc kubenswrapper[4895]: I1202 07:23:58.722888 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:58 crc kubenswrapper[4895]: I1202 07:23:58.722912 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:58 crc kubenswrapper[4895]: I1202 07:23:58.722945 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:58 crc kubenswrapper[4895]: I1202 07:23:58.722974 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:58Z","lastTransitionTime":"2025-12-02T07:23:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:58 crc kubenswrapper[4895]: I1202 07:23:58.826386 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:58 crc kubenswrapper[4895]: I1202 07:23:58.826432 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:58 crc kubenswrapper[4895]: I1202 07:23:58.826447 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:58 crc kubenswrapper[4895]: I1202 07:23:58.826465 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:58 crc kubenswrapper[4895]: I1202 07:23:58.826480 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:58Z","lastTransitionTime":"2025-12-02T07:23:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:58 crc kubenswrapper[4895]: I1202 07:23:58.929677 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:58 crc kubenswrapper[4895]: I1202 07:23:58.929730 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:58 crc kubenswrapper[4895]: I1202 07:23:58.929763 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:58 crc kubenswrapper[4895]: I1202 07:23:58.929784 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:58 crc kubenswrapper[4895]: I1202 07:23:58.929797 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:58Z","lastTransitionTime":"2025-12-02T07:23:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:59 crc kubenswrapper[4895]: I1202 07:23:59.032601 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:59 crc kubenswrapper[4895]: I1202 07:23:59.032662 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:59 crc kubenswrapper[4895]: I1202 07:23:59.032679 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:59 crc kubenswrapper[4895]: I1202 07:23:59.032707 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:59 crc kubenswrapper[4895]: I1202 07:23:59.032724 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:59Z","lastTransitionTime":"2025-12-02T07:23:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:59 crc kubenswrapper[4895]: I1202 07:23:59.136036 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:59 crc kubenswrapper[4895]: I1202 07:23:59.136087 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:59 crc kubenswrapper[4895]: I1202 07:23:59.136101 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:59 crc kubenswrapper[4895]: I1202 07:23:59.136122 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:59 crc kubenswrapper[4895]: I1202 07:23:59.136135 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:59Z","lastTransitionTime":"2025-12-02T07:23:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:23:59 crc kubenswrapper[4895]: I1202 07:23:59.140552 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:23:59 crc kubenswrapper[4895]: E1202 07:23:59.140663 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 07:23:59 crc kubenswrapper[4895]: I1202 07:23:59.140890 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:23:59 crc kubenswrapper[4895]: E1202 07:23:59.142256 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 07:23:59 crc kubenswrapper[4895]: I1202 07:23:59.157121 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hlxqt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"30911fe5-208f-44e8-a380-2a0093f24863\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87ebbf4f529d4e7b64c3
92912d06e57dc80a2237a9a4e31ab5b56ac556aa10b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6l8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hlxqt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:59Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:59 crc kubenswrapper[4895]: I1202 07:23:59.169662 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5f88v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5af25091-1401-45d4-ae53-d2b469c879da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wfrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wfrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5f88v\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:59Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:59 crc kubenswrapper[4895]: I1202 07:23:59.184969 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4604b0d5-cf8f-404d-af67-65da4b1dc003\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b3c366f0f27ff793a37b2944887ab1c1bc155c14d5c8d6c267442b0a8666ef8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"
mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6c3baa5b3772a8fc64774e28c3f389f114723e7efa62b934140a4fed8b6a40d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49d7c12ea2cd199dcfa46a6f1a1a856f1285a709def9d82b6805ba256a79fd5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2543d364f902a6c34a82dac25f389cd36afc2f717d772d0a487a640d44adf30f\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:59Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:59 crc kubenswrapper[4895]: I1202 07:23:59.196553 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://444ad02c4afa919779de59fd8c81bdce5c3426dd6451b90505921a8d5fa0c96b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-02T07:23:59Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:59 crc kubenswrapper[4895]: I1202 07:23:59.209002 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:59Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:59 crc kubenswrapper[4895]: I1202 07:23:59.226088 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n7xcr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3514f381-d0d1-4e00-931e-c5ca75202a1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2670ef1ea22a6c1c5d9ec4ee4b0345c575b20224fddf8655b95eaa431573d071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3deeb37702c8243a956f2be1bb61e2693bb9be5453d775bc3f3e85618d483015\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3deeb37702c8243a956f2be1bb61e2693bb9be5453d775bc3f3e85618d483015\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2c60eddf285bbbf1a5957691f56d895e5bb7461fb18ab7c495879357cfae565\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2c60eddf285bbbf1a5957691f56d895e5bb7461fb18ab7c495879357cfae565\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e8b39d38597192393d90cc582b8ad109a1ed22c4c14f539e1a5445e6d817872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e8b39d38597192393d90cc582b8ad109a1ed22c4c14f539e1a5445e6d817872\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb5ab
c8da49059e23c4013f3856d3f4c9078cd8871a79f8dae5ce4d8cb319ef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb5abc8da49059e23c4013f3856d3f4c9078cd8871a79f8dae5ce4d8cb319ef9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbdd3e8943b356a485d900737682f0903dc04aefa33498ee5cdf3c82b6570852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbdd3e8943b356a485d900737682f0903dc04aefa33498ee5cdf3c82b6570852\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:43Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3641d1f7fd2ea062c216ce260c35d903273d3039fe790815ebb697f8c11d15cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3641d1f7fd2ea062c216ce260c35d903273d3039fe790815ebb697f8c11d15cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n7xcr\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:59Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:59 crc kubenswrapper[4895]: I1202 07:23:59.238202 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:59 crc kubenswrapper[4895]: I1202 07:23:59.238260 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:59 crc kubenswrapper[4895]: I1202 07:23:59.238276 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:59 crc kubenswrapper[4895]: I1202 07:23:59.238297 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:59 crc kubenswrapper[4895]: I1202 07:23:59.238309 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:59Z","lastTransitionTime":"2025-12-02T07:23:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:59 crc kubenswrapper[4895]: I1202 07:23:59.247239 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2df36ca5-8a5e-4865-8728-b8567297546a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f076cfc3bd5e782332e827d56325b73cf2a9f35248f397338b5130793481af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de2bad51e8d51f1243a2fe7799554c
7f089425e3623a6bb62f7393610d70fc4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b737e88ad2a8caf70b89f59db461d355846d5612a29a249107b49fbec176204\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9d30ec3b0f73c029956f058f1d8f06cfffb5533dcd0c221ab6842d277f274a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9d30ec3b0f73c029956f058f1d8f06cfffb5533dcd0c221ab6842d277f274a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:09Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:09Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:59Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:59 crc kubenswrapper[4895]: I1202 07:23:59.260133 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers 
with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:59Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:59 crc kubenswrapper[4895]: I1202 07:23:59.273106 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lhpbd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"81128292-8c02-45bd-9b25-e9457e989975\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cc63751b6428c88deed36b914b2a95a94a50b544c562126c2a9567c177b2570\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2z5n2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lhpbd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:59Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:59 crc kubenswrapper[4895]: I1202 07:23:59.286899 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wxp5p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a0b438f-70fd-4c3f-a170-2b3fe9797955\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7329cf1034b2d51ebb7f978448558a8b6fa39d58823289c470fddcfed262e395\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj6l9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cbc34b4d78cc00ac79b9944d92c6ac61a9d5dd704f36a473950baf53dfcbf47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj6l9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wxp5p\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:59Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:59 crc kubenswrapper[4895]: I1202 07:23:59.302698 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2db13abc9ba56d955814e75ceea54b92610a00daeefa29aa1aad584e3453728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secr
ets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:59Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:59 crc kubenswrapper[4895]: I1202 07:23:59.314716 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bb4bea68942dbface0009fbac76f7b1253ab282e8decb9081225de7d6537ec0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"r
estartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3115f2c50d05ee406f583937e39b1b55f364ce7cbeb0788b75941a22e23f57f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:59Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:59 crc kubenswrapper[4895]: I1202 07:23:59.333116 4895 
status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"afc3334a-0153-4dcc-9a56-92f6cae51c08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91b0c6e50085a7be0924e5a54ef006110a7f13556fe47f7f55124fd6b2f0bc36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5463b077f0bc8b5d38e27854ece6a71f74f7769f4552c32eefe16854993f290f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80dca705ddb94ba799eb72bfc23b0c3878cceca48d6d10f1c7f8eb0e4beed40e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f54ae324450e73bcbf2c408794171a92bbb66c24d05f29584ac2c247090c1273\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f330d728eecd672ecb554744a51ebfb2c104441d3a85fe0c8a6fc3446c37e4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca732cac495c410d89da4ce1476acd20905716cbb2baeedac9c52b654ff58573\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://370f4392aff2eb7ec2d34d9535b978f2354692fcc0945eae167d222b1d97024d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://370f4392aff2eb7ec2d34d9535b978f2354692fcc0945eae167d222b1d97024d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T07:23:49Z\\\",\\\"message\\\":\\\"Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1202 07:23:49.673830 6315 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-apiserver/check-endpoints]} 
name:Service_openshift-apiserver/check-endpoints_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.139:17698:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {8efa4d1a-72f5-4dfa-9bc2-9d93ef11ecf2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1202 07:23:49.676921 6315 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-apiserver/api\\\\\\\"}\\\\nI1202 07:23:49.676925 6315 obj_retry.go:386] Retry successful for *v1.Pod openshift-machine-config-operator/machine-config-daemon-wfcg7 after 0 failed attempt(s)\\\\nI1202 07:23:49.676901 6315 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-apiserver/check-endpoints]} name:Service_openshift-apiserver/check-endpoints_TCP_cluster options:{GoMap:map\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-w54m4_openshift-ovn-kubernetes(afc3334a-0153-4dcc-9a56-92f6cae51c08)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6447684fe6189e38c9caad6dfe1384b7af01d50e2ef5f889c7b2874ba092306\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://328fa773c92ef0a965065412d4141c0b41ffe38d21ed169386a9d30d05f37f28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://328fa773c92ef0a965
065412d4141c0b41ffe38d21ed169386a9d30d05f37f28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w54m4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:59Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:59 crc kubenswrapper[4895]: I1202 07:23:59.340920 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:59 crc kubenswrapper[4895]: I1202 07:23:59.340957 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:59 crc kubenswrapper[4895]: I1202 07:23:59.340970 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:59 crc kubenswrapper[4895]: I1202 07:23:59.340987 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:59 crc kubenswrapper[4895]: I1202 07:23:59.340998 4895 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:59Z","lastTransitionTime":"2025-12-02T07:23:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:23:59 crc kubenswrapper[4895]: I1202 07:23:59.344949 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0468d2d1-a975-45a6-af9f-47adc432fab0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebf8583b59d7955530da7a83a29f9994ba139c076cf7b76dc94e88d14f39e1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vfwjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1474e8f5989607f3b95fefe811e3d787320f04c223edb6ffe47d2b37863ea874\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vfwjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wfcg7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-12-02T07:23:59Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:59 crc kubenswrapper[4895]: I1202 07:23:59.361369 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b7fa1ad-7e38-4645-ab41-c7d395f5c10e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40e020a2b4caf39d1661cee9c6b0055eca86e0ae9b81d2da7485ca9ec92dd0aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"
name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33e300332628d3baf9041b52dc7b8cd9dab1e9c5da76572e86b787433fd5fb92\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4455fec2a817e7266e152d3f4afbede483504d6066f1a0f7375bce282ae10f8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b674e20b2199fd0083aed9cf001d27a72d31ca6c62cbe376c9510cc94f37d312\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc
478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9998dffc20c5b0579e6214d10541b9702e9c4a6563c930a41e49b6853b832ba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T07:23:32Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 07:23:21.537254 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 07:23:21.538901 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1012964219/tls.crt::/tmp/serving-cert-1012964219/tls.key\\\\\\\"\\\\nI1202 07:23:32.848375 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 07:23:32.853597 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 07:23:32.853643 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 07:23:32.853680 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 07:23:32.853691 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 07:23:32.869490 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 07:23:32.869536 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:23:32.869542 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:23:32.869555 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 07:23:32.869559 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 07:23:32.869564 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 07:23:32.869568 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 07:23:32.869624 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 07:23:32.874299 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d318f71ad32dd4a410093b85741038e468d540edd30f112234a9d595bc6fd85f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0faa0995131a1f9cbf2aa0cf7651d510fad1bdaf7e32de51f8a14b70f161e783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee122
0d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0faa0995131a1f9cbf2aa0cf7651d510fad1bdaf7e32de51f8a14b70f161e783\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:59Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:59 crc kubenswrapper[4895]: I1202 07:23:59.377133 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:59Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:59 crc kubenswrapper[4895]: I1202 07:23:59.394355 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-74fkh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b212bd8-f1a7-4982-b2e6-499c13a34b0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4d58e6f37ee8f90af6aad88138c39e9a0753c16cceb763c2bd2ec9c30087d92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhpwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-74fkh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:23:59Z is after 2025-08-24T17:21:41Z" Dec 02 07:23:59 crc kubenswrapper[4895]: I1202 07:23:59.444145 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:59 crc kubenswrapper[4895]: I1202 07:23:59.444186 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:59 crc kubenswrapper[4895]: I1202 07:23:59.444195 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:59 crc kubenswrapper[4895]: I1202 07:23:59.444212 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:59 crc kubenswrapper[4895]: I1202 07:23:59.444224 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:59Z","lastTransitionTime":"2025-12-02T07:23:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:59 crc kubenswrapper[4895]: I1202 07:23:59.546928 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:59 crc kubenswrapper[4895]: I1202 07:23:59.546967 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:59 crc kubenswrapper[4895]: I1202 07:23:59.546976 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:59 crc kubenswrapper[4895]: I1202 07:23:59.546993 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:59 crc kubenswrapper[4895]: I1202 07:23:59.547003 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:59Z","lastTransitionTime":"2025-12-02T07:23:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:59 crc kubenswrapper[4895]: I1202 07:23:59.650649 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:59 crc kubenswrapper[4895]: I1202 07:23:59.650702 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:59 crc kubenswrapper[4895]: I1202 07:23:59.650719 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:59 crc kubenswrapper[4895]: I1202 07:23:59.650779 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:59 crc kubenswrapper[4895]: I1202 07:23:59.650797 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:59Z","lastTransitionTime":"2025-12-02T07:23:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:59 crc kubenswrapper[4895]: I1202 07:23:59.754650 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:59 crc kubenswrapper[4895]: I1202 07:23:59.754706 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:59 crc kubenswrapper[4895]: I1202 07:23:59.754718 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:59 crc kubenswrapper[4895]: I1202 07:23:59.754766 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:59 crc kubenswrapper[4895]: I1202 07:23:59.754782 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:59Z","lastTransitionTime":"2025-12-02T07:23:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:59 crc kubenswrapper[4895]: I1202 07:23:59.857900 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:59 crc kubenswrapper[4895]: I1202 07:23:59.857932 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:59 crc kubenswrapper[4895]: I1202 07:23:59.857943 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:59 crc kubenswrapper[4895]: I1202 07:23:59.857978 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:59 crc kubenswrapper[4895]: I1202 07:23:59.857990 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:59Z","lastTransitionTime":"2025-12-02T07:23:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:23:59 crc kubenswrapper[4895]: I1202 07:23:59.960795 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:23:59 crc kubenswrapper[4895]: I1202 07:23:59.960888 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:23:59 crc kubenswrapper[4895]: I1202 07:23:59.960907 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:23:59 crc kubenswrapper[4895]: I1202 07:23:59.960935 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:23:59 crc kubenswrapper[4895]: I1202 07:23:59.960955 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:23:59Z","lastTransitionTime":"2025-12-02T07:23:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:00 crc kubenswrapper[4895]: I1202 07:24:00.064599 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:00 crc kubenswrapper[4895]: I1202 07:24:00.064653 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:00 crc kubenswrapper[4895]: I1202 07:24:00.064669 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:00 crc kubenswrapper[4895]: I1202 07:24:00.064691 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:00 crc kubenswrapper[4895]: I1202 07:24:00.064706 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:00Z","lastTransitionTime":"2025-12-02T07:24:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:00 crc kubenswrapper[4895]: I1202 07:24:00.074272 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5af25091-1401-45d4-ae53-d2b469c879da-metrics-certs\") pod \"network-metrics-daemon-5f88v\" (UID: \"5af25091-1401-45d4-ae53-d2b469c879da\") " pod="openshift-multus/network-metrics-daemon-5f88v" Dec 02 07:24:00 crc kubenswrapper[4895]: E1202 07:24:00.074484 4895 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 02 07:24:00 crc kubenswrapper[4895]: E1202 07:24:00.074623 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5af25091-1401-45d4-ae53-d2b469c879da-metrics-certs podName:5af25091-1401-45d4-ae53-d2b469c879da nodeName:}" failed. No retries permitted until 2025-12-02 07:24:08.074595985 +0000 UTC m=+59.245455608 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5af25091-1401-45d4-ae53-d2b469c879da-metrics-certs") pod "network-metrics-daemon-5f88v" (UID: "5af25091-1401-45d4-ae53-d2b469c879da") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 02 07:24:00 crc kubenswrapper[4895]: I1202 07:24:00.140546 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:24:00 crc kubenswrapper[4895]: E1202 07:24:00.140722 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 07:24:00 crc kubenswrapper[4895]: I1202 07:24:00.140846 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5f88v" Dec 02 07:24:00 crc kubenswrapper[4895]: E1202 07:24:00.140948 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5f88v" podUID="5af25091-1401-45d4-ae53-d2b469c879da" Dec 02 07:24:00 crc kubenswrapper[4895]: I1202 07:24:00.168778 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:00 crc kubenswrapper[4895]: I1202 07:24:00.168832 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:00 crc kubenswrapper[4895]: I1202 07:24:00.168849 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:00 crc kubenswrapper[4895]: I1202 07:24:00.168873 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:00 crc kubenswrapper[4895]: I1202 07:24:00.168890 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:00Z","lastTransitionTime":"2025-12-02T07:24:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:00 crc kubenswrapper[4895]: I1202 07:24:00.279241 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:00 crc kubenswrapper[4895]: I1202 07:24:00.279314 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:00 crc kubenswrapper[4895]: I1202 07:24:00.279334 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:00 crc kubenswrapper[4895]: I1202 07:24:00.279363 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:00 crc kubenswrapper[4895]: I1202 07:24:00.279385 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:00Z","lastTransitionTime":"2025-12-02T07:24:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:00 crc kubenswrapper[4895]: I1202 07:24:00.383344 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:00 crc kubenswrapper[4895]: I1202 07:24:00.383416 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:00 crc kubenswrapper[4895]: I1202 07:24:00.383434 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:00 crc kubenswrapper[4895]: I1202 07:24:00.383465 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:00 crc kubenswrapper[4895]: I1202 07:24:00.383500 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:00Z","lastTransitionTime":"2025-12-02T07:24:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:00 crc kubenswrapper[4895]: I1202 07:24:00.487482 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:00 crc kubenswrapper[4895]: I1202 07:24:00.487557 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:00 crc kubenswrapper[4895]: I1202 07:24:00.487575 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:00 crc kubenswrapper[4895]: I1202 07:24:00.487603 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:00 crc kubenswrapper[4895]: I1202 07:24:00.487626 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:00Z","lastTransitionTime":"2025-12-02T07:24:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:00 crc kubenswrapper[4895]: I1202 07:24:00.590405 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:00 crc kubenswrapper[4895]: I1202 07:24:00.590518 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:00 crc kubenswrapper[4895]: I1202 07:24:00.590542 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:00 crc kubenswrapper[4895]: I1202 07:24:00.590573 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:00 crc kubenswrapper[4895]: I1202 07:24:00.590593 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:00Z","lastTransitionTime":"2025-12-02T07:24:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:00 crc kubenswrapper[4895]: I1202 07:24:00.693986 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:00 crc kubenswrapper[4895]: I1202 07:24:00.694059 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:00 crc kubenswrapper[4895]: I1202 07:24:00.694078 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:00 crc kubenswrapper[4895]: I1202 07:24:00.694107 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:00 crc kubenswrapper[4895]: I1202 07:24:00.694127 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:00Z","lastTransitionTime":"2025-12-02T07:24:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:00 crc kubenswrapper[4895]: I1202 07:24:00.797648 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:00 crc kubenswrapper[4895]: I1202 07:24:00.797808 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:00 crc kubenswrapper[4895]: I1202 07:24:00.797830 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:00 crc kubenswrapper[4895]: I1202 07:24:00.797858 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:00 crc kubenswrapper[4895]: I1202 07:24:00.797880 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:00Z","lastTransitionTime":"2025-12-02T07:24:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:00 crc kubenswrapper[4895]: I1202 07:24:00.901976 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:00 crc kubenswrapper[4895]: I1202 07:24:00.902108 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:00 crc kubenswrapper[4895]: I1202 07:24:00.902135 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:00 crc kubenswrapper[4895]: I1202 07:24:00.902169 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:00 crc kubenswrapper[4895]: I1202 07:24:00.902191 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:00Z","lastTransitionTime":"2025-12-02T07:24:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:01 crc kubenswrapper[4895]: I1202 07:24:01.006318 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:01 crc kubenswrapper[4895]: I1202 07:24:01.006401 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:01 crc kubenswrapper[4895]: I1202 07:24:01.006421 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:01 crc kubenswrapper[4895]: I1202 07:24:01.006456 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:01 crc kubenswrapper[4895]: I1202 07:24:01.006477 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:01Z","lastTransitionTime":"2025-12-02T07:24:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:01 crc kubenswrapper[4895]: I1202 07:24:01.109969 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:01 crc kubenswrapper[4895]: I1202 07:24:01.110058 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:01 crc kubenswrapper[4895]: I1202 07:24:01.110083 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:01 crc kubenswrapper[4895]: I1202 07:24:01.110118 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:01 crc kubenswrapper[4895]: I1202 07:24:01.110143 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:01Z","lastTransitionTime":"2025-12-02T07:24:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:24:01 crc kubenswrapper[4895]: I1202 07:24:01.140652 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:24:01 crc kubenswrapper[4895]: I1202 07:24:01.140728 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:24:01 crc kubenswrapper[4895]: E1202 07:24:01.140932 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 07:24:01 crc kubenswrapper[4895]: E1202 07:24:01.141207 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 07:24:01 crc kubenswrapper[4895]: I1202 07:24:01.213344 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:01 crc kubenswrapper[4895]: I1202 07:24:01.213413 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:01 crc kubenswrapper[4895]: I1202 07:24:01.213430 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:01 crc kubenswrapper[4895]: I1202 07:24:01.213454 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:01 crc kubenswrapper[4895]: I1202 07:24:01.213467 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:01Z","lastTransitionTime":"2025-12-02T07:24:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:01 crc kubenswrapper[4895]: I1202 07:24:01.317046 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:01 crc kubenswrapper[4895]: I1202 07:24:01.317113 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:01 crc kubenswrapper[4895]: I1202 07:24:01.317132 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:01 crc kubenswrapper[4895]: I1202 07:24:01.317161 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:01 crc kubenswrapper[4895]: I1202 07:24:01.317185 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:01Z","lastTransitionTime":"2025-12-02T07:24:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:01 crc kubenswrapper[4895]: I1202 07:24:01.420573 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:01 crc kubenswrapper[4895]: I1202 07:24:01.420640 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:01 crc kubenswrapper[4895]: I1202 07:24:01.420658 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:01 crc kubenswrapper[4895]: I1202 07:24:01.420686 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:01 crc kubenswrapper[4895]: I1202 07:24:01.420702 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:01Z","lastTransitionTime":"2025-12-02T07:24:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:01 crc kubenswrapper[4895]: I1202 07:24:01.524693 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:01 crc kubenswrapper[4895]: I1202 07:24:01.524792 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:01 crc kubenswrapper[4895]: I1202 07:24:01.524818 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:01 crc kubenswrapper[4895]: I1202 07:24:01.524850 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:01 crc kubenswrapper[4895]: I1202 07:24:01.524875 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:01Z","lastTransitionTime":"2025-12-02T07:24:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:01 crc kubenswrapper[4895]: I1202 07:24:01.627809 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:01 crc kubenswrapper[4895]: I1202 07:24:01.627883 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:01 crc kubenswrapper[4895]: I1202 07:24:01.627895 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:01 crc kubenswrapper[4895]: I1202 07:24:01.627914 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:01 crc kubenswrapper[4895]: I1202 07:24:01.627927 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:01Z","lastTransitionTime":"2025-12-02T07:24:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:01 crc kubenswrapper[4895]: I1202 07:24:01.732483 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:01 crc kubenswrapper[4895]: I1202 07:24:01.732554 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:01 crc kubenswrapper[4895]: I1202 07:24:01.732577 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:01 crc kubenswrapper[4895]: I1202 07:24:01.732605 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:01 crc kubenswrapper[4895]: I1202 07:24:01.732625 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:01Z","lastTransitionTime":"2025-12-02T07:24:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:01 crc kubenswrapper[4895]: I1202 07:24:01.836698 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:01 crc kubenswrapper[4895]: I1202 07:24:01.836777 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:01 crc kubenswrapper[4895]: I1202 07:24:01.836794 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:01 crc kubenswrapper[4895]: I1202 07:24:01.836824 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:01 crc kubenswrapper[4895]: I1202 07:24:01.836844 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:01Z","lastTransitionTime":"2025-12-02T07:24:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:01 crc kubenswrapper[4895]: I1202 07:24:01.940093 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:01 crc kubenswrapper[4895]: I1202 07:24:01.940169 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:01 crc kubenswrapper[4895]: I1202 07:24:01.940189 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:01 crc kubenswrapper[4895]: I1202 07:24:01.940219 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:01 crc kubenswrapper[4895]: I1202 07:24:01.940239 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:01Z","lastTransitionTime":"2025-12-02T07:24:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:02 crc kubenswrapper[4895]: I1202 07:24:02.045101 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:02 crc kubenswrapper[4895]: I1202 07:24:02.045177 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:02 crc kubenswrapper[4895]: I1202 07:24:02.045196 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:02 crc kubenswrapper[4895]: I1202 07:24:02.045228 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:02 crc kubenswrapper[4895]: I1202 07:24:02.045253 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:02Z","lastTransitionTime":"2025-12-02T07:24:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:24:02 crc kubenswrapper[4895]: I1202 07:24:02.140719 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:24:02 crc kubenswrapper[4895]: I1202 07:24:02.140719 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5f88v" Dec 02 07:24:02 crc kubenswrapper[4895]: E1202 07:24:02.141476 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 07:24:02 crc kubenswrapper[4895]: E1202 07:24:02.141622 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5f88v" podUID="5af25091-1401-45d4-ae53-d2b469c879da" Dec 02 07:24:02 crc kubenswrapper[4895]: I1202 07:24:02.141928 4895 scope.go:117] "RemoveContainer" containerID="370f4392aff2eb7ec2d34d9535b978f2354692fcc0945eae167d222b1d97024d" Dec 02 07:24:02 crc kubenswrapper[4895]: I1202 07:24:02.148689 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:02 crc kubenswrapper[4895]: I1202 07:24:02.148735 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:02 crc kubenswrapper[4895]: I1202 07:24:02.148781 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:02 crc kubenswrapper[4895]: I1202 07:24:02.148805 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:02 crc kubenswrapper[4895]: I1202 07:24:02.148823 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:02Z","lastTransitionTime":"2025-12-02T07:24:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:02 crc kubenswrapper[4895]: I1202 07:24:02.262578 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:02 crc kubenswrapper[4895]: I1202 07:24:02.263241 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:02 crc kubenswrapper[4895]: I1202 07:24:02.263268 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:02 crc kubenswrapper[4895]: I1202 07:24:02.263301 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:02 crc kubenswrapper[4895]: I1202 07:24:02.263321 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:02Z","lastTransitionTime":"2025-12-02T07:24:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:02 crc kubenswrapper[4895]: I1202 07:24:02.366778 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:02 crc kubenswrapper[4895]: I1202 07:24:02.366836 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:02 crc kubenswrapper[4895]: I1202 07:24:02.366855 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:02 crc kubenswrapper[4895]: I1202 07:24:02.366883 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:02 crc kubenswrapper[4895]: I1202 07:24:02.366904 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:02Z","lastTransitionTime":"2025-12-02T07:24:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:02 crc kubenswrapper[4895]: I1202 07:24:02.471023 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:02 crc kubenswrapper[4895]: I1202 07:24:02.471088 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:02 crc kubenswrapper[4895]: I1202 07:24:02.471114 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:02 crc kubenswrapper[4895]: I1202 07:24:02.471145 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:02 crc kubenswrapper[4895]: I1202 07:24:02.471173 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:02Z","lastTransitionTime":"2025-12-02T07:24:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:02 crc kubenswrapper[4895]: I1202 07:24:02.561390 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w54m4_afc3334a-0153-4dcc-9a56-92f6cae51c08/ovnkube-controller/1.log" Dec 02 07:24:02 crc kubenswrapper[4895]: I1202 07:24:02.566187 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" event={"ID":"afc3334a-0153-4dcc-9a56-92f6cae51c08","Type":"ContainerStarted","Data":"ff016284272b93427bb0487a4661973b731200b8b4d28ab1c6d2d512bf5742e9"} Dec 02 07:24:02 crc kubenswrapper[4895]: I1202 07:24:02.566963 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" Dec 02 07:24:02 crc kubenswrapper[4895]: I1202 07:24:02.573372 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:02 crc kubenswrapper[4895]: I1202 07:24:02.573417 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:02 crc kubenswrapper[4895]: I1202 07:24:02.573430 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:02 crc kubenswrapper[4895]: I1202 07:24:02.573448 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:02 crc kubenswrapper[4895]: I1202 07:24:02.573462 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:02Z","lastTransitionTime":"2025-12-02T07:24:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:02 crc kubenswrapper[4895]: I1202 07:24:02.593535 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b7fa1ad-7e38-4645-ab41-c7d395f5c10e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40e020a2b4caf39d1661cee9c6b0055eca86e0ae9b81d2da7485ca9ec92dd0aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33e300332628d3baf9041b52dc7b8cd9dab1e9c5da76572e86b787433fd5fb92\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4455fec2a817e7266e152d3f4afbede483504d6066f1a0f7375bce282ae10f8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b674e20b2199fd0083aed9cf001d27a72d31ca6c62cbe376c9510cc94f37d312\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9998dffc20c5b0579e6214d10541b9702e9c4a6563c930a41e49b6853b832ba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T07:23:32Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 07:23:21.537254 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 07:23:21.538901 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1012964219/tls.crt::/tmp/serving-cert-1012964219/tls.key\\\\\\\"\\\\nI1202 07:23:32.848375 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 07:23:32.853597 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 07:23:32.853643 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 07:23:32.853680 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 07:23:32.853691 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 07:23:32.869490 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 07:23:32.869536 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:23:32.869542 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:23:32.869555 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 07:23:32.869559 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 07:23:32.869564 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 07:23:32.869568 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 07:23:32.869624 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 07:23:32.874299 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d318f71ad32dd4a410093b85741038e468d540edd30f112234a9d595bc6fd85f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0faa0995131a1f9cbf2aa0cf7651d510fad1bdaf7e32de51f8a14b70f161e783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0faa0995131a1f9cbf2aa0cf7651d510fad1bdaf7e32de51f8a14b70f161e783\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:02Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:02 crc kubenswrapper[4895]: I1202 07:24:02.617313 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:02Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:02 crc kubenswrapper[4895]: I1202 07:24:02.638128 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-74fkh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b212bd8-f1a7-4982-b2e6-499c13a34b0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4d58e6f37ee8f90af6aad88138c39e9a0753c16cceb763c2bd2ec9c30087d92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhpwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-74fkh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:02Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:02 crc kubenswrapper[4895]: I1202 07:24:02.659066 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0468d2d1-a975-45a6-af9f-47adc432fab0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebf8583b59d7955530da7a83a29f9994ba139c076cf7b76dc94e88d14f39e1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vfwjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1474e8f5989607f3b95fefe811e3d787320f04c223edb6ffe47d2b37863ea874\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vfwjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wfcg7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:02Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:02 crc kubenswrapper[4895]: I1202 07:24:02.675888 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:02 crc kubenswrapper[4895]: I1202 07:24:02.675937 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:02 crc kubenswrapper[4895]: I1202 07:24:02.675950 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:02 crc kubenswrapper[4895]: I1202 07:24:02.675970 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:02 crc kubenswrapper[4895]: I1202 07:24:02.675983 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:02Z","lastTransitionTime":"2025-12-02T07:24:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:02 crc kubenswrapper[4895]: I1202 07:24:02.686060 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5f88v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5af25091-1401-45d4-ae53-d2b469c879da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wfrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wfrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5f88v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:02Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:02 crc 
kubenswrapper[4895]: I1202 07:24:02.707601 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4604b0d5-cf8f-404d-af67-65da4b1dc003\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b3c366f0f27ff793a37b2944887ab1c1bc155c14d5c8d6c267442b0a8666ef8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6c3baa5b3772a8fc64774e28c3f389f114723e7efa62b934140a4fed8b6a40d\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49d7c12ea2cd199dcfa46a6f1a1a856f1285a709def9d82b6805ba256a79fd5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2543d364f902a6c34a82dac25f389cd36afc2f717d772d0a487a640d44adf30f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:02Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:02 crc kubenswrapper[4895]: I1202 07:24:02.736106 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://444ad02c4afa919779de59fd8c81bdce5c3426dd6451b90505921a8d5fa0c96b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-02T07:24:02Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:02 crc kubenswrapper[4895]: I1202 07:24:02.757048 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:02Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:02 crc kubenswrapper[4895]: I1202 07:24:02.774348 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n7xcr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3514f381-d0d1-4e00-931e-c5ca75202a1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2670ef1ea22a6c1c5d9ec4ee4b0345c575b20224fddf8655b95eaa431573d071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3deeb37702c8243a956f2be1bb61e2693bb9be5453d775bc3f3e85618d483015\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3deeb37702c8243a956f2be1bb61e2693bb9be5453d775bc3f3e85618d483015\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2c60eddf285bbbf1a5957691f56d895e5bb7461fb18ab7c495879357cfae565\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2c60eddf285bbbf1a5957691f56d895e5bb7461fb18ab7c495879357cfae565\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e8b39d38597192393d90cc582b8ad109a1ed22c4c14f539e1a5445e6d817872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e8b39d38597192393d90cc582b8ad109a1ed22c4c14f539e1a5445e6d817872\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb5ab
c8da49059e23c4013f3856d3f4c9078cd8871a79f8dae5ce4d8cb319ef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb5abc8da49059e23c4013f3856d3f4c9078cd8871a79f8dae5ce4d8cb319ef9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbdd3e8943b356a485d900737682f0903dc04aefa33498ee5cdf3c82b6570852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbdd3e8943b356a485d900737682f0903dc04aefa33498ee5cdf3c82b6570852\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:43Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3641d1f7fd2ea062c216ce260c35d903273d3039fe790815ebb697f8c11d15cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3641d1f7fd2ea062c216ce260c35d903273d3039fe790815ebb697f8c11d15cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n7xcr\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:02Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:02 crc kubenswrapper[4895]: I1202 07:24:02.778501 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:02 crc kubenswrapper[4895]: I1202 07:24:02.778547 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:02 crc kubenswrapper[4895]: I1202 07:24:02.778558 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:02 crc kubenswrapper[4895]: I1202 07:24:02.778578 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:02 crc kubenswrapper[4895]: I1202 07:24:02.778592 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:02Z","lastTransitionTime":"2025-12-02T07:24:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:02 crc kubenswrapper[4895]: I1202 07:24:02.787070 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hlxqt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"30911fe5-208f-44e8-a380-2a0093f24863\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87ebbf4f529d4e7b64c392912d06e57dc80a2237a9a4e31ab5b56ac556aa10b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6l8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hlxqt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:02Z 
is after 2025-08-24T17:21:41Z" Dec 02 07:24:02 crc kubenswrapper[4895]: I1202 07:24:02.801694 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2df36ca5-8a5e-4865-8728-b8567297546a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f076cfc3bd5e782332e827d56325b73cf2a9f35248f397338b5130793481af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de2bad51e8d51f1243a2fe7799554c7f0894
25e3623a6bb62f7393610d70fc4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b737e88ad2a8caf70b89f59db461d355846d5612a29a249107b49fbec176204\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9d30ec3b0f73c029956f058f1d8f06cfffb5533dcd0c221ab6842d277f274a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9d30ec3b0f73c029956f058f1d8f06cfffb5533dcd0c221ab6842d277f274a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:09Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:09Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:02Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:02 crc kubenswrapper[4895]: I1202 07:24:02.816169 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with 
unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:02Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:02 crc kubenswrapper[4895]: I1202 07:24:02.831243 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lhpbd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"81128292-8c02-45bd-9b25-e9457e989975\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cc63751b6428c88deed36b914b2a95a94a50b544c562126c2a9567c177b2570\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2z5n2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lhpbd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:02Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:02 crc kubenswrapper[4895]: I1202 07:24:02.844732 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wxp5p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a0b438f-70fd-4c3f-a170-2b3fe9797955\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7329cf1034b2d51ebb7f978448558a8b6fa39d58823289c470fddcfed262e395\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj6l9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cbc34b4d78cc00ac79b9944d92c6ac61a9d5dd704f36a473950baf53dfcbf47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj6l9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wxp5p\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:02Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:02 crc kubenswrapper[4895]: I1202 07:24:02.862390 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2db13abc9ba56d955814e75ceea54b92610a00daeefa29aa1aad584e3453728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secr
ets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:02Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:02 crc kubenswrapper[4895]: I1202 07:24:02.878376 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bb4bea68942dbface0009fbac76f7b1253ab282e8decb9081225de7d6537ec0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"r
estartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3115f2c50d05ee406f583937e39b1b55f364ce7cbeb0788b75941a22e23f57f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:02Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:02 crc kubenswrapper[4895]: I1202 07:24:02.881454 4895 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:02 crc kubenswrapper[4895]: I1202 07:24:02.881513 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:02 crc kubenswrapper[4895]: I1202 07:24:02.881527 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:02 crc kubenswrapper[4895]: I1202 07:24:02.881553 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:02 crc kubenswrapper[4895]: I1202 07:24:02.881566 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:02Z","lastTransitionTime":"2025-12-02T07:24:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:02 crc kubenswrapper[4895]: I1202 07:24:02.904148 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"afc3334a-0153-4dcc-9a56-92f6cae51c08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91b0c6e50085a7be0924e5a54ef006110a7f13556fe47f7f55124fd6b2f0bc36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5463b077f0bc8b5d38e27854ece6a71f74f7769f4552c32eefe16854993f290f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80dca705ddb94ba799eb72bfc23b0c3878cceca48d6d10f1c7f8eb0e4beed40e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f54ae324450e73bcbf2c408794171a92bbb66c24d05f29584ac2c247090c1273\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f330d728eecd672ecb554744a51ebfb2c104441d3a85fe0c8a6fc3446c37e4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca732cac495c410d89da4ce1476acd20905716cbb2baeedac9c52b654ff58573\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff016284272b93427bb0487a4661973b731200b8b4d28ab1c6d2d512bf5742e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://370f4392aff2eb7ec2d34d9535b978f2354692fcc0945eae167d222b1d97024d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T07:23:49Z\\\",\\\"message\\\":\\\"Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1202 07:23:49.673830 6315 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-apiserver/check-endpoints]} 
name:Service_openshift-apiserver/check-endpoints_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.139:17698:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {8efa4d1a-72f5-4dfa-9bc2-9d93ef11ecf2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1202 07:23:49.676921 6315 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-apiserver/api\\\\\\\"}\\\\nI1202 07:23:49.676925 6315 obj_retry.go:386] Retry successful for *v1.Pod openshift-machine-config-operator/machine-config-daemon-wfcg7 after 0 failed attempt(s)\\\\nI1202 07:23:49.676901 6315 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-apiserver/check-endpoints]} name:Service_openshift-apiserver/check-endpoints_TCP_cluster 
options:{GoMap:map\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:24:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/
\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6447684fe6189e38c9caad6dfe1384b7af01d50e2ef5f889c7b2874ba092306\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://328fa773c92ef0a965065412d4141c0b41ffe38d21ed169386a9d30d05f37f28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup
\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://328fa773c92ef0a965065412d4141c0b41ffe38d21ed169386a9d30d05f37f28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w54m4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:02Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:02 crc kubenswrapper[4895]: I1202 07:24:02.985244 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:02 crc kubenswrapper[4895]: I1202 07:24:02.985344 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:02 crc kubenswrapper[4895]: I1202 07:24:02.985372 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:02 crc kubenswrapper[4895]: I1202 07:24:02.985410 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:02 crc kubenswrapper[4895]: I1202 07:24:02.985437 4895 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:02Z","lastTransitionTime":"2025-12-02T07:24:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:24:03 crc kubenswrapper[4895]: I1202 07:24:03.088549 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:03 crc kubenswrapper[4895]: I1202 07:24:03.088609 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:03 crc kubenswrapper[4895]: I1202 07:24:03.088622 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:03 crc kubenswrapper[4895]: I1202 07:24:03.088648 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:03 crc kubenswrapper[4895]: I1202 07:24:03.088663 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:03Z","lastTransitionTime":"2025-12-02T07:24:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:24:03 crc kubenswrapper[4895]: I1202 07:24:03.141054 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:24:03 crc kubenswrapper[4895]: I1202 07:24:03.141113 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:24:03 crc kubenswrapper[4895]: E1202 07:24:03.141285 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 07:24:03 crc kubenswrapper[4895]: E1202 07:24:03.141521 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 07:24:03 crc kubenswrapper[4895]: I1202 07:24:03.191013 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:03 crc kubenswrapper[4895]: I1202 07:24:03.191078 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:03 crc kubenswrapper[4895]: I1202 07:24:03.191093 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:03 crc kubenswrapper[4895]: I1202 07:24:03.191114 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:03 crc kubenswrapper[4895]: I1202 07:24:03.191129 4895 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:03Z","lastTransitionTime":"2025-12-02T07:24:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:24:03 crc kubenswrapper[4895]: I1202 07:24:03.294668 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:03 crc kubenswrapper[4895]: I1202 07:24:03.294783 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:03 crc kubenswrapper[4895]: I1202 07:24:03.294806 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:03 crc kubenswrapper[4895]: I1202 07:24:03.294836 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:03 crc kubenswrapper[4895]: I1202 07:24:03.294856 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:03Z","lastTransitionTime":"2025-12-02T07:24:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:03 crc kubenswrapper[4895]: I1202 07:24:03.399842 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:03 crc kubenswrapper[4895]: I1202 07:24:03.399926 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:03 crc kubenswrapper[4895]: I1202 07:24:03.399952 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:03 crc kubenswrapper[4895]: I1202 07:24:03.399986 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:03 crc kubenswrapper[4895]: I1202 07:24:03.400010 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:03Z","lastTransitionTime":"2025-12-02T07:24:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:03 crc kubenswrapper[4895]: I1202 07:24:03.502848 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:03 crc kubenswrapper[4895]: I1202 07:24:03.504021 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:03 crc kubenswrapper[4895]: I1202 07:24:03.504187 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:03 crc kubenswrapper[4895]: I1202 07:24:03.504314 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:03 crc kubenswrapper[4895]: I1202 07:24:03.504460 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:03Z","lastTransitionTime":"2025-12-02T07:24:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:03 crc kubenswrapper[4895]: I1202 07:24:03.572317 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w54m4_afc3334a-0153-4dcc-9a56-92f6cae51c08/ovnkube-controller/2.log" Dec 02 07:24:03 crc kubenswrapper[4895]: I1202 07:24:03.573899 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w54m4_afc3334a-0153-4dcc-9a56-92f6cae51c08/ovnkube-controller/1.log" Dec 02 07:24:03 crc kubenswrapper[4895]: I1202 07:24:03.577016 4895 generic.go:334] "Generic (PLEG): container finished" podID="afc3334a-0153-4dcc-9a56-92f6cae51c08" containerID="ff016284272b93427bb0487a4661973b731200b8b4d28ab1c6d2d512bf5742e9" exitCode=1 Dec 02 07:24:03 crc kubenswrapper[4895]: I1202 07:24:03.577083 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" event={"ID":"afc3334a-0153-4dcc-9a56-92f6cae51c08","Type":"ContainerDied","Data":"ff016284272b93427bb0487a4661973b731200b8b4d28ab1c6d2d512bf5742e9"} Dec 02 07:24:03 crc kubenswrapper[4895]: I1202 07:24:03.577202 4895 scope.go:117] "RemoveContainer" containerID="370f4392aff2eb7ec2d34d9535b978f2354692fcc0945eae167d222b1d97024d" Dec 02 07:24:03 crc kubenswrapper[4895]: I1202 07:24:03.579382 4895 scope.go:117] "RemoveContainer" containerID="ff016284272b93427bb0487a4661973b731200b8b4d28ab1c6d2d512bf5742e9" Dec 02 07:24:03 crc kubenswrapper[4895]: E1202 07:24:03.579867 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-w54m4_openshift-ovn-kubernetes(afc3334a-0153-4dcc-9a56-92f6cae51c08)\"" pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" podUID="afc3334a-0153-4dcc-9a56-92f6cae51c08" Dec 02 07:24:03 crc kubenswrapper[4895]: I1202 07:24:03.600629 4895 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0468d2d1-a975-45a6-af9f-47adc432fab0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebf8583b59d7955530da7a83a29f9994ba139c076cf7b76dc94e88d14f39e1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vfwjt\\\",\\\"readOnly\\\":true,\\\"rec
ursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1474e8f5989607f3b95fefe811e3d787320f04c223edb6ffe47d2b37863ea874\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vfwjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wfcg7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:03Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:03 crc kubenswrapper[4895]: I1202 07:24:03.609102 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:03 crc kubenswrapper[4895]: I1202 07:24:03.609351 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:03 crc kubenswrapper[4895]: I1202 07:24:03.609593 4895 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:03 crc kubenswrapper[4895]: I1202 07:24:03.609775 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:03 crc kubenswrapper[4895]: I1202 07:24:03.610000 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:03Z","lastTransitionTime":"2025-12-02T07:24:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:24:03 crc kubenswrapper[4895]: I1202 07:24:03.624682 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b7fa1ad-7e38-4645-ab41-c7d395f5c10e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40e020a2b4caf39d1661cee9c6b0055eca86e0ae9b81d2da7485ca9ec92dd0aa\\\",\\\"image\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33e300332628d3baf9041b52dc7b8cd9dab1e9c5da76572e86b787433fd5fb92\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4455fec2a817e7266e152d3f4afbede483504d6066f1a0f7375bce282ae10f8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b674e20b2199fd0083aed9cf001d27a72d31ca6c62cbe376c9510cc94f37d312\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9998dffc20c5b0579e6214d10541b9702e9c4a6563c930a41e49b6853b832ba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T07:23:32Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 07:23:21.537254 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 07:23:21.538901 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1012964219/tls.crt::/tmp/serving-cert-1012964219/tls.key\\\\\\\"\\\\nI1202 07:23:32.848375 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 07:23:32.853597 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 07:23:32.853643 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 07:23:32.853680 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 
07:23:32.853691 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 07:23:32.869490 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 07:23:32.869536 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:23:32.869542 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:23:32.869555 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 07:23:32.869559 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 07:23:32.869564 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 07:23:32.869568 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 07:23:32.869624 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 07:23:32.874299 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d318f71ad32dd4a410093b85741038e468d540edd30f112234a9d595bc6fd85f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0faa0995131a1f9cbf2aa0cf7651d510fad1bdaf7e32de51f8a14b70f161e783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0faa0995131a1f9cbf2aa0cf7651d510fad1bdaf7e32de51f8a14b70f161e783\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:03Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:03 crc kubenswrapper[4895]: I1202 07:24:03.639717 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:03Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:03 crc kubenswrapper[4895]: I1202 07:24:03.653715 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-74fkh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b212bd8-f1a7-4982-b2e6-499c13a34b0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4d58e6f37ee8f90af6aad88138c39e9a0753c16cceb763c2bd2ec9c30087d92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhpwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-74fkh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:03Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:03 crc kubenswrapper[4895]: I1202 07:24:03.675525 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hlxqt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"30911fe5-208f-44e8-a380-2a0093f24863\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87ebbf4f529d4e7b64c392912d06e57dc80a2237a9a4e31ab5b56ac556aa10b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6l8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hlxqt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:03Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:03 crc kubenswrapper[4895]: I1202 07:24:03.694045 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5f88v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5af25091-1401-45d4-ae53-d2b469c879da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wfrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wfrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5f88v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:03Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:03 crc 
kubenswrapper[4895]: I1202 07:24:03.723001 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4604b0d5-cf8f-404d-af67-65da4b1dc003\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b3c366f0f27ff793a37b2944887ab1c1bc155c14d5c8d6c267442b0a8666ef8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6c3baa5b3772a8fc64774e28c3f389f114723e7efa62b934140a4fed8b6a40d\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49d7c12ea2cd199dcfa46a6f1a1a856f1285a709def9d82b6805ba256a79fd5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2543d364f902a6c34a82dac25f389cd36afc2f717d772d0a487a640d44adf30f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:03Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:03 crc kubenswrapper[4895]: I1202 07:24:03.726203 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:03 crc kubenswrapper[4895]: I1202 07:24:03.726243 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:03 crc kubenswrapper[4895]: I1202 07:24:03.726260 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:03 crc kubenswrapper[4895]: I1202 07:24:03.726285 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:03 crc kubenswrapper[4895]: I1202 07:24:03.726303 4895 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:03Z","lastTransitionTime":"2025-12-02T07:24:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:24:03 crc kubenswrapper[4895]: I1202 07:24:03.750175 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://444ad02c4afa919779de59fd8c81bdce5c3426dd6451b90505921a8d5fa0c96b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:03Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:03 crc kubenswrapper[4895]: I1202 07:24:03.785545 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:03Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:03 crc kubenswrapper[4895]: I1202 07:24:03.802558 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n7xcr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3514f381-d0d1-4e00-931e-c5ca75202a1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2670ef1ea22a6c1c5d9ec4ee4b0345c575b20224fddf8655b95eaa431573d071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3deeb37702c8243a956f2be1bb61e2693bb9be5453d775bc3f3e85618d483015\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3deeb37702c8243a956f2be1bb61e2693bb9be5453d775bc3f3e85618d483015\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2c60eddf285bbbf1a5957691f56d895e5bb7461fb18ab7c495879357cfae565\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2c60eddf285bbbf1a5957691f56d895e5bb7461fb18ab7c495879357cfae565\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e8b39d38597192393d90cc582b8ad109a1ed22c4c14f539e1a5445e6d817872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e8b39d38597192393d90cc582b8ad109a1ed22c4c14f539e1a5445e6d817872\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb5ab
c8da49059e23c4013f3856d3f4c9078cd8871a79f8dae5ce4d8cb319ef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb5abc8da49059e23c4013f3856d3f4c9078cd8871a79f8dae5ce4d8cb319ef9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbdd3e8943b356a485d900737682f0903dc04aefa33498ee5cdf3c82b6570852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbdd3e8943b356a485d900737682f0903dc04aefa33498ee5cdf3c82b6570852\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:43Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3641d1f7fd2ea062c216ce260c35d903273d3039fe790815ebb697f8c11d15cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3641d1f7fd2ea062c216ce260c35d903273d3039fe790815ebb697f8c11d15cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n7xcr\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:03Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:03 crc kubenswrapper[4895]: I1202 07:24:03.815804 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2df36ca5-8a5e-4865-8728-b8567297546a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f076cfc3bd5e782332e827d56325b73cf2a9f35248f397338b5130793481af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mou
ntPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de2bad51e8d51f1243a2fe7799554c7f089425e3623a6bb62f7393610d70fc4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b737e88ad2a8caf70b89f59db461d355846d5612a29a249107b49fbec176204\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9d30ec3b0f73c029956f058f
1d8f06cfffb5533dcd0c221ab6842d277f274a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9d30ec3b0f73c029956f058f1d8f06cfffb5533dcd0c221ab6842d277f274a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:09Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:09Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:03Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:03 crc kubenswrapper[4895]: I1202 07:24:03.829688 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:03Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:03 crc kubenswrapper[4895]: I1202 07:24:03.829781 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:03 crc kubenswrapper[4895]: I1202 07:24:03.829839 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:03 crc kubenswrapper[4895]: I1202 07:24:03.829852 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:03 crc kubenswrapper[4895]: I1202 07:24:03.829874 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:03 crc kubenswrapper[4895]: I1202 07:24:03.829887 4895 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:03Z","lastTransitionTime":"2025-12-02T07:24:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:24:03 crc kubenswrapper[4895]: I1202 07:24:03.842214 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lhpbd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"81128292-8c02-45bd-9b25-e9457e989975\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cc63751b6428c88deed36b914b2a95a94a50b544c562126c2a9567c177b2570\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2z5n2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lhpbd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:03Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:03 crc kubenswrapper[4895]: I1202 07:24:03.854156 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wxp5p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a0b438f-70fd-4c3f-a170-2b3fe9797955\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7329cf1034b2d51ebb7f978448558a8b6fa39d58823289c470fddcfed262e395\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj6l9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cbc34b4d78cc00ac79b9944d92c6ac61a9d5
dd704f36a473950baf53dfcbf47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj6l9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wxp5p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:03Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:03 crc kubenswrapper[4895]: I1202 07:24:03.868100 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2db13abc9ba56d955814e75ceea54b92610a00daeefa29aa1aad584e3453728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:03Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:03 crc kubenswrapper[4895]: I1202 07:24:03.885571 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bb4bea68942dbface0009fbac76f7b1253ab282e8decb9081225de7d6537ec0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://3115f2c50d05ee406f583937e39b1b55f364ce7cbeb0788b75941a22e23f57f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:03Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:03 crc kubenswrapper[4895]: I1202 07:24:03.917778 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"afc3334a-0153-4dcc-9a56-92f6cae51c08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91b0c6e50085a7be0924e5a54ef006110a7f13556fe47f7f55124fd6b2f0bc36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5463b077f0bc8b5d38e27854ece6a71f74f7769f4552c32eefe16854993f290f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80dca705ddb94ba799eb72bfc23b0c3878cceca48d6d10f1c7f8eb0e4beed40e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f54ae324450e73bcbf2c408794171a92bbb66c24d05f29584ac2c247090c1273\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f330d728eecd672ecb554744a51ebfb2c104441d3a85fe0c8a6fc3446c37e4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca732cac495c410d89da4ce1476acd20905716cbb2baeedac9c52b654ff58573\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff016284272b93427bb0487a4661973b731200b8b4d28ab1c6d2d512bf5742e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://370f4392aff2eb7ec2d34d9535b978f2354692fcc0945eae167d222b1d97024d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T07:23:49Z\\\",\\\"message\\\":\\\"Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1202 07:23:49.673830 6315 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-apiserver/check-endpoints]} 
name:Service_openshift-apiserver/check-endpoints_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.139:17698:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {8efa4d1a-72f5-4dfa-9bc2-9d93ef11ecf2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1202 07:23:49.676921 6315 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-apiserver/api\\\\\\\"}\\\\nI1202 07:23:49.676925 6315 obj_retry.go:386] Retry successful for *v1.Pod openshift-machine-config-operator/machine-config-daemon-wfcg7 after 0 failed attempt(s)\\\\nI1202 07:23:49.676901 6315 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-apiserver/check-endpoints]} name:Service_openshift-apiserver/check-endpoints_TCP_cluster options:{GoMap:map\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff016284272b93427bb0487a4661973b731200b8b4d28ab1c6d2d512bf5742e9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T07:24:03Z\\\",\\\"message\\\":\\\"s.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1202 07:24:03.168657 6529 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1202 07:24:03.168679 6529 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1202 07:24:03.168706 6529 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1202 07:24:03.168812 6529 handler.go:208] 
Removed *v1.Pod event handler 3\\\\nI1202 07:24:03.168811 6529 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1202 07:24:03.168829 6529 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1202 07:24:03.168889 6529 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1202 07:24:03.168921 6529 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1202 07:24:03.168943 6529 factory.go:656] Stopping watch factory\\\\nI1202 07:24:03.168956 6529 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1202 07:24:03.168969 6529 handler.go:208] Removed *v1.Node event handler 2\\\\nI1202 07:24:03.168979 6529 handler.go:208] Removed *v1.Node event handler 7\\\\nI1202 07:24:03.168988 6529 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1202 07:24:03.169421 6529 ovnkube.go:599] Stopped ovnkube\\\\nI1202 07:24:03.169503 6529 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1202 
07:24:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:24:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6447684fe6189e38c9caad6dfe1384b7af01d50e2ef5f889c7b2874ba092306\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://328fa773c92ef0a965065412d4141c0b41ffe38d21ed169386a9d30d05f37f28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://328fa773c92ef0a965065412d4141c0b41ffe38d21ed169386a9d30d05f37
f28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w54m4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:03Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:03 crc kubenswrapper[4895]: I1202 07:24:03.932909 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:03 crc kubenswrapper[4895]: I1202 07:24:03.932967 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:03 crc kubenswrapper[4895]: I1202 07:24:03.932985 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:03 crc kubenswrapper[4895]: I1202 07:24:03.933011 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:03 crc kubenswrapper[4895]: I1202 07:24:03.933028 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:03Z","lastTransitionTime":"2025-12-02T07:24:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:24:04 crc kubenswrapper[4895]: I1202 07:24:04.036691 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:04 crc kubenswrapper[4895]: I1202 07:24:04.036780 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:04 crc kubenswrapper[4895]: I1202 07:24:04.036793 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:04 crc kubenswrapper[4895]: I1202 07:24:04.036814 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:04 crc kubenswrapper[4895]: I1202 07:24:04.036829 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:04Z","lastTransitionTime":"2025-12-02T07:24:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:24:04 crc kubenswrapper[4895]: I1202 07:24:04.140074 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5f88v" Dec 02 07:24:04 crc kubenswrapper[4895]: I1202 07:24:04.140105 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:24:04 crc kubenswrapper[4895]: I1202 07:24:04.140262 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:04 crc kubenswrapper[4895]: I1202 07:24:04.140298 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:04 crc kubenswrapper[4895]: I1202 07:24:04.140312 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:04 crc kubenswrapper[4895]: I1202 07:24:04.140332 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:04 crc kubenswrapper[4895]: E1202 07:24:04.140319 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5f88v" podUID="5af25091-1401-45d4-ae53-d2b469c879da" Dec 02 07:24:04 crc kubenswrapper[4895]: I1202 07:24:04.140351 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:04Z","lastTransitionTime":"2025-12-02T07:24:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:04 crc kubenswrapper[4895]: E1202 07:24:04.140478 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 07:24:04 crc kubenswrapper[4895]: I1202 07:24:04.243231 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:04 crc kubenswrapper[4895]: I1202 07:24:04.243286 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:04 crc kubenswrapper[4895]: I1202 07:24:04.243299 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:04 crc kubenswrapper[4895]: I1202 07:24:04.243317 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:04 crc kubenswrapper[4895]: I1202 07:24:04.243330 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:04Z","lastTransitionTime":"2025-12-02T07:24:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:04 crc kubenswrapper[4895]: I1202 07:24:04.346102 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:04 crc kubenswrapper[4895]: I1202 07:24:04.346194 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:04 crc kubenswrapper[4895]: I1202 07:24:04.346220 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:04 crc kubenswrapper[4895]: I1202 07:24:04.346251 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:04 crc kubenswrapper[4895]: I1202 07:24:04.346274 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:04Z","lastTransitionTime":"2025-12-02T07:24:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:04 crc kubenswrapper[4895]: I1202 07:24:04.449521 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:04 crc kubenswrapper[4895]: I1202 07:24:04.449597 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:04 crc kubenswrapper[4895]: I1202 07:24:04.449613 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:04 crc kubenswrapper[4895]: I1202 07:24:04.449638 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:04 crc kubenswrapper[4895]: I1202 07:24:04.449654 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:04Z","lastTransitionTime":"2025-12-02T07:24:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:04 crc kubenswrapper[4895]: I1202 07:24:04.552692 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:04 crc kubenswrapper[4895]: I1202 07:24:04.552758 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:04 crc kubenswrapper[4895]: I1202 07:24:04.552775 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:04 crc kubenswrapper[4895]: I1202 07:24:04.552795 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:04 crc kubenswrapper[4895]: I1202 07:24:04.552808 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:04Z","lastTransitionTime":"2025-12-02T07:24:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:04 crc kubenswrapper[4895]: I1202 07:24:04.591880 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w54m4_afc3334a-0153-4dcc-9a56-92f6cae51c08/ovnkube-controller/2.log" Dec 02 07:24:04 crc kubenswrapper[4895]: I1202 07:24:04.598608 4895 scope.go:117] "RemoveContainer" containerID="ff016284272b93427bb0487a4661973b731200b8b4d28ab1c6d2d512bf5742e9" Dec 02 07:24:04 crc kubenswrapper[4895]: E1202 07:24:04.598956 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-w54m4_openshift-ovn-kubernetes(afc3334a-0153-4dcc-9a56-92f6cae51c08)\"" pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" podUID="afc3334a-0153-4dcc-9a56-92f6cae51c08" Dec 02 07:24:04 crc kubenswrapper[4895]: I1202 07:24:04.617632 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-74fkh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b212bd8-f1a7-4982-b2e6-499c13a34b0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4d58e6f37ee8f90af6aad88138c39e9a0753c16cceb763c2bd2ec9c30087d92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhpwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-74fkh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:04Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:04 crc kubenswrapper[4895]: I1202 07:24:04.635955 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0468d2d1-a975-45a6-af9f-47adc432fab0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebf8583b59d7955530da7a83a29f9994ba139c076cf7b76dc94e88d14f39e1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vfwjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1474e8f5989607f3b95fefe811e3d787320f04c223edb6ffe47d2b37863ea874\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vfwjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wfcg7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:04Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:04 crc kubenswrapper[4895]: I1202 07:24:04.656411 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:04 crc kubenswrapper[4895]: I1202 07:24:04.656517 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:04 crc kubenswrapper[4895]: I1202 07:24:04.656580 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:04 crc kubenswrapper[4895]: I1202 07:24:04.656615 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:04 crc kubenswrapper[4895]: I1202 07:24:04.656674 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:04Z","lastTransitionTime":"2025-12-02T07:24:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:04 crc kubenswrapper[4895]: I1202 07:24:04.659945 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b7fa1ad-7e38-4645-ab41-c7d395f5c10e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40e020a2b4caf39d1661cee9c6b0055eca86e0ae9b81d2da7485ca9ec92dd0aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33e300332628d3baf9041b52dc7b8cd9dab1e9c5da76572e86b787433fd5fb92\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4455fec2a817e7266e152d3f4afbede483504d6066f1a0f7375bce282ae10f8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b674e20b2199fd0083aed9cf001d27a72d31ca6c62cbe376c9510cc94f37d312\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9998dffc20c5b0579e6214d10541b9702e9c4a6563c930a41e49b6853b832ba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T07:23:32Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 07:23:21.537254 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 07:23:21.538901 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1012964219/tls.crt::/tmp/serving-cert-1012964219/tls.key\\\\\\\"\\\\nI1202 07:23:32.848375 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 07:23:32.853597 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 07:23:32.853643 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 07:23:32.853680 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 07:23:32.853691 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 07:23:32.869490 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 07:23:32.869536 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:23:32.869542 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:23:32.869555 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 07:23:32.869559 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 07:23:32.869564 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 07:23:32.869568 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 07:23:32.869624 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 07:23:32.874299 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d318f71ad32dd4a410093b85741038e468d540edd30f112234a9d595bc6fd85f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0faa0995131a1f9cbf2aa0cf7651d510fad1bdaf7e32de51f8a14b70f161e783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0faa0995131a1f9cbf2aa0cf7651d510fad1bdaf7e32de51f8a14b70f161e783\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:04Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:04 crc kubenswrapper[4895]: I1202 07:24:04.680364 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:04Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:04 crc kubenswrapper[4895]: I1202 07:24:04.702206 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n7xcr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3514f381-d0d1-4e00-931e-c5ca75202a1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2670ef1ea22a6c1c5d9ec4ee4b0345c575b20224fddf8655b95eaa431573d071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3deeb37702c8243a956f2be1bb61e2693bb9be5453d775bc3f3e85618d483015\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3deeb37702c8243a956f2be1bb61e2693bb9be5453d775bc3f3e85618d483015\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2c60eddf285bbbf1a5957691f56d895e5bb7461fb18ab7c495879357cfae565\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2c60eddf285bbbf1a5957691f56d895e5bb7461fb18ab7c495879357cfae565\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e8b39d38597192393d90cc582b8ad109a1ed22c4c14f539e1a5445e6d817872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e8b39d38597192393d90cc582b8ad109a1ed22c4c14f539e1a5445e6d817872\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb5ab
c8da49059e23c4013f3856d3f4c9078cd8871a79f8dae5ce4d8cb319ef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb5abc8da49059e23c4013f3856d3f4c9078cd8871a79f8dae5ce4d8cb319ef9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbdd3e8943b356a485d900737682f0903dc04aefa33498ee5cdf3c82b6570852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbdd3e8943b356a485d900737682f0903dc04aefa33498ee5cdf3c82b6570852\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:43Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3641d1f7fd2ea062c216ce260c35d903273d3039fe790815ebb697f8c11d15cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3641d1f7fd2ea062c216ce260c35d903273d3039fe790815ebb697f8c11d15cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n7xcr\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:04Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:04 crc kubenswrapper[4895]: I1202 07:24:04.722380 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hlxqt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"30911fe5-208f-44e8-a380-2a0093f24863\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87ebbf4f529d4e7b64c392912d06e57dc80a2237a9a4e31ab5b56ac556aa10b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-
12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6l8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hlxqt\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:04Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:04 crc kubenswrapper[4895]: I1202 07:24:04.741611 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5f88v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5af25091-1401-45d4-ae53-d2b469c879da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wfrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wfrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5f88v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:04Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:04 crc 
kubenswrapper[4895]: I1202 07:24:04.761323 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:04 crc kubenswrapper[4895]: I1202 07:24:04.761368 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:04 crc kubenswrapper[4895]: I1202 07:24:04.761380 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:04 crc kubenswrapper[4895]: I1202 07:24:04.761401 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:04 crc kubenswrapper[4895]: I1202 07:24:04.761415 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:04Z","lastTransitionTime":"2025-12-02T07:24:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:04 crc kubenswrapper[4895]: I1202 07:24:04.769101 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4604b0d5-cf8f-404d-af67-65da4b1dc003\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b3c366f0f27ff793a37b2944887ab1c1bc155c14d5c8d6c267442b0a8666ef8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6c3baa5b37
72a8fc64774e28c3f389f114723e7efa62b934140a4fed8b6a40d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49d7c12ea2cd199dcfa46a6f1a1a856f1285a709def9d82b6805ba256a79fd5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2543d364f902a6c34a82dac25f389cd36afc2f717d772d0a487a640d44adf30f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:04Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:04 crc kubenswrapper[4895]: I1202 07:24:04.792137 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://444ad02c4afa919779de59fd8c81bdce5c3426dd6451b90505921a8d5fa0c96b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-02T07:24:04Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:04 crc kubenswrapper[4895]: I1202 07:24:04.815979 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:04Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:04 crc kubenswrapper[4895]: I1202 07:24:04.840079 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wxp5p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a0b438f-70fd-4c3f-a170-2b3fe9797955\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7329cf1034b2d51ebb7f978448558a8b6fa39d58823289c470fddcfed262e395\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj6l9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cbc34b4d78cc00ac79b9944d92c6ac61a9d5
dd704f36a473950baf53dfcbf47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj6l9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wxp5p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:04Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:04 crc kubenswrapper[4895]: I1202 07:24:04.861456 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2df36ca5-8a5e-4865-8728-b8567297546a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f076cfc3bd5e782332e827d56325b73cf2a9f35248f397338b5130793481af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de2bad51e8d51f1243a2fe7799554c7f089425e3623a6bb62f7393610d70fc4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b737e88ad2a8caf70b89f59db461d355846d5612a29a249107b49fbec176204\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9d30ec3b0f73c029956f058f1d8f06cfffb5533dcd0c221ab6842d277f274a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://c9d30ec3b0f73c029956f058f1d8f06cfffb5533dcd0c221ab6842d277f274a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:09Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:09Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:04Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:04 crc kubenswrapper[4895]: I1202 07:24:04.864042 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:04 crc kubenswrapper[4895]: I1202 07:24:04.864137 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:04 crc kubenswrapper[4895]: I1202 07:24:04.864164 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:04 crc kubenswrapper[4895]: I1202 07:24:04.864199 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:04 crc kubenswrapper[4895]: I1202 07:24:04.864228 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:04Z","lastTransitionTime":"2025-12-02T07:24:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:04 crc kubenswrapper[4895]: I1202 07:24:04.882588 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:04Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:04 crc kubenswrapper[4895]: I1202 07:24:04.901845 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lhpbd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"81128292-8c02-45bd-9b25-e9457e989975\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cc63751b6428c88deed36b914b2a95a94a50b544c562126c2a9567c177b2570\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2z5n2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lhpbd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:04Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:04 crc kubenswrapper[4895]: I1202 07:24:04.927337 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"afc3334a-0153-4dcc-9a56-92f6cae51c08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91b0c6e50085a7be0924e5a54ef006110a7f13556fe47f7f55124fd6b2f0bc36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5463b077f0bc8b5d38e27854ece6a71f74f7769f4552c32eefe16854993f290f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80dca705ddb94ba799eb72bfc23b0c3878cceca48d6d10f1c7f8eb0e4beed40e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f54ae324450e73bcbf2c408794171a92bbb66c24d05f29584ac2c247090c1273\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f330d728eecd672ecb554744a51ebfb2c104441d3a85fe0c8a6fc3446c37e4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca732cac495c410d89da4ce1476acd20905716cbb2baeedac9c52b654ff58573\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff016284272b93427bb0487a4661973b731200b8b4d28ab1c6d2d512bf5742e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff016284272b93427bb0487a4661973b731200b8b4d28ab1c6d2d512bf5742e9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T07:24:03Z\\\",\\\"message\\\":\\\"s.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1202 07:24:03.168657 6529 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1202 07:24:03.168679 6529 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1202 07:24:03.168706 6529 handler.go:208] Removed *v1.Pod event handler 
6\\\\nI1202 07:24:03.168812 6529 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1202 07:24:03.168811 6529 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1202 07:24:03.168829 6529 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1202 07:24:03.168889 6529 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1202 07:24:03.168921 6529 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1202 07:24:03.168943 6529 factory.go:656] Stopping watch factory\\\\nI1202 07:24:03.168956 6529 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1202 07:24:03.168969 6529 handler.go:208] Removed *v1.Node event handler 2\\\\nI1202 07:24:03.168979 6529 handler.go:208] Removed *v1.Node event handler 7\\\\nI1202 07:24:03.168988 6529 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1202 07:24:03.169421 6529 ovnkube.go:599] Stopped ovnkube\\\\nI1202 07:24:03.169503 6529 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1202 07:24:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:24:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-w54m4_openshift-ovn-kubernetes(afc3334a-0153-4dcc-9a56-92f6cae51c08)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6447684fe6189e38c9caad6dfe1384b7af01d50e2ef5f889c7b2874ba092306\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://328fa773c92ef0a965065412d4141c0b41ffe38d21ed169386a9d30d05f37f28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://328fa773c92ef0a965
065412d4141c0b41ffe38d21ed169386a9d30d05f37f28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w54m4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:04Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:04 crc kubenswrapper[4895]: I1202 07:24:04.951403 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2db13abc9ba56d955814e75ceea54b92610a00daeefa29aa1aad584e3453728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:04Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:04 crc kubenswrapper[4895]: I1202 07:24:04.961584 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:24:04 crc kubenswrapper[4895]: I1202 07:24:04.961662 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:24:04 crc kubenswrapper[4895]: E1202 07:24:04.961792 4895 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 02 07:24:04 crc kubenswrapper[4895]: E1202 07:24:04.961903 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-02 07:24:36.96187085 +0000 UTC m=+88.132730503 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 02 07:24:04 crc kubenswrapper[4895]: E1202 07:24:04.961938 4895 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 02 07:24:04 crc kubenswrapper[4895]: E1202 07:24:04.962025 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-02 07:24:36.961998894 +0000 UTC m=+88.132858537 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 02 07:24:04 crc kubenswrapper[4895]: I1202 07:24:04.967494 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:04 crc kubenswrapper[4895]: I1202 07:24:04.967562 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:04 crc kubenswrapper[4895]: I1202 07:24:04.967578 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:04 crc kubenswrapper[4895]: I1202 07:24:04.967604 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 
02 07:24:04 crc kubenswrapper[4895]: I1202 07:24:04.967620 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:04Z","lastTransitionTime":"2025-12-02T07:24:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:24:04 crc kubenswrapper[4895]: I1202 07:24:04.972835 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bb4bea68942dbface0009fbac76f7b1253ab282e8decb9081225de7d6537ec0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\
\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3115f2c50d05ee406f583937e39b1b55f364ce7cbeb0788b75941a22e23f57f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:04Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:05 crc kubenswrapper[4895]: I1202 07:24:05.062351 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 07:24:05 crc kubenswrapper[4895]: I1202 07:24:05.062517 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:24:05 crc kubenswrapper[4895]: E1202 07:24:05.062627 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 07:24:37.062593693 +0000 UTC m=+88.233453346 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:24:05 crc kubenswrapper[4895]: I1202 07:24:05.062688 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:24:05 crc kubenswrapper[4895]: E1202 07:24:05.062700 4895 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 02 07:24:05 crc kubenswrapper[4895]: E1202 07:24:05.062861 4895 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 02 07:24:05 crc kubenswrapper[4895]: E1202 07:24:05.062874 4895 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 07:24:05 crc kubenswrapper[4895]: E1202 07:24:05.062916 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 
nodeName:}" failed. No retries permitted until 2025-12-02 07:24:37.062905543 +0000 UTC m=+88.233765156 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 07:24:05 crc kubenswrapper[4895]: E1202 07:24:05.062764 4895 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 02 07:24:05 crc kubenswrapper[4895]: E1202 07:24:05.063045 4895 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 02 07:24:05 crc kubenswrapper[4895]: E1202 07:24:05.063089 4895 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 07:24:05 crc kubenswrapper[4895]: E1202 07:24:05.063226 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-02 07:24:37.06318137 +0000 UTC m=+88.234041033 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 07:24:05 crc kubenswrapper[4895]: I1202 07:24:05.070885 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:05 crc kubenswrapper[4895]: I1202 07:24:05.070958 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:05 crc kubenswrapper[4895]: I1202 07:24:05.070978 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:05 crc kubenswrapper[4895]: I1202 07:24:05.071007 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:05 crc kubenswrapper[4895]: I1202 07:24:05.071028 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:05Z","lastTransitionTime":"2025-12-02T07:24:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:24:05 crc kubenswrapper[4895]: I1202 07:24:05.140645 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:24:05 crc kubenswrapper[4895]: I1202 07:24:05.140732 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:24:05 crc kubenswrapper[4895]: E1202 07:24:05.140878 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 07:24:05 crc kubenswrapper[4895]: E1202 07:24:05.141006 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 07:24:05 crc kubenswrapper[4895]: I1202 07:24:05.173946 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:05 crc kubenswrapper[4895]: I1202 07:24:05.174027 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:05 crc kubenswrapper[4895]: I1202 07:24:05.174048 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:05 crc kubenswrapper[4895]: I1202 07:24:05.174084 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:05 crc kubenswrapper[4895]: I1202 07:24:05.174106 4895 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:05Z","lastTransitionTime":"2025-12-02T07:24:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:24:05 crc kubenswrapper[4895]: I1202 07:24:05.277838 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:05 crc kubenswrapper[4895]: I1202 07:24:05.277928 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:05 crc kubenswrapper[4895]: I1202 07:24:05.277957 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:05 crc kubenswrapper[4895]: I1202 07:24:05.277995 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:05 crc kubenswrapper[4895]: I1202 07:24:05.278018 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:05Z","lastTransitionTime":"2025-12-02T07:24:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:05 crc kubenswrapper[4895]: I1202 07:24:05.381497 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:05 crc kubenswrapper[4895]: I1202 07:24:05.381565 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:05 crc kubenswrapper[4895]: I1202 07:24:05.381586 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:05 crc kubenswrapper[4895]: I1202 07:24:05.381612 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:05 crc kubenswrapper[4895]: I1202 07:24:05.381633 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:05Z","lastTransitionTime":"2025-12-02T07:24:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:05 crc kubenswrapper[4895]: I1202 07:24:05.485443 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:05 crc kubenswrapper[4895]: I1202 07:24:05.485526 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:05 crc kubenswrapper[4895]: I1202 07:24:05.485545 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:05 crc kubenswrapper[4895]: I1202 07:24:05.485569 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:05 crc kubenswrapper[4895]: I1202 07:24:05.485587 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:05Z","lastTransitionTime":"2025-12-02T07:24:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:05 crc kubenswrapper[4895]: I1202 07:24:05.590166 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:05 crc kubenswrapper[4895]: I1202 07:24:05.590238 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:05 crc kubenswrapper[4895]: I1202 07:24:05.590256 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:05 crc kubenswrapper[4895]: I1202 07:24:05.590287 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:05 crc kubenswrapper[4895]: I1202 07:24:05.590310 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:05Z","lastTransitionTime":"2025-12-02T07:24:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:05 crc kubenswrapper[4895]: I1202 07:24:05.693466 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:05 crc kubenswrapper[4895]: I1202 07:24:05.693524 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:05 crc kubenswrapper[4895]: I1202 07:24:05.693541 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:05 crc kubenswrapper[4895]: I1202 07:24:05.693564 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:05 crc kubenswrapper[4895]: I1202 07:24:05.693578 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:05Z","lastTransitionTime":"2025-12-02T07:24:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:05 crc kubenswrapper[4895]: I1202 07:24:05.797394 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:05 crc kubenswrapper[4895]: I1202 07:24:05.797466 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:05 crc kubenswrapper[4895]: I1202 07:24:05.797490 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:05 crc kubenswrapper[4895]: I1202 07:24:05.797530 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:05 crc kubenswrapper[4895]: I1202 07:24:05.797550 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:05Z","lastTransitionTime":"2025-12-02T07:24:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:05 crc kubenswrapper[4895]: I1202 07:24:05.901306 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:05 crc kubenswrapper[4895]: I1202 07:24:05.901372 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:05 crc kubenswrapper[4895]: I1202 07:24:05.901401 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:05 crc kubenswrapper[4895]: I1202 07:24:05.901438 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:05 crc kubenswrapper[4895]: I1202 07:24:05.901466 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:05Z","lastTransitionTime":"2025-12-02T07:24:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:06 crc kubenswrapper[4895]: I1202 07:24:06.004359 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:06 crc kubenswrapper[4895]: I1202 07:24:06.004441 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:06 crc kubenswrapper[4895]: I1202 07:24:06.004458 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:06 crc kubenswrapper[4895]: I1202 07:24:06.004487 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:06 crc kubenswrapper[4895]: I1202 07:24:06.004508 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:06Z","lastTransitionTime":"2025-12-02T07:24:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:06 crc kubenswrapper[4895]: I1202 07:24:06.108314 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:06 crc kubenswrapper[4895]: I1202 07:24:06.108384 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:06 crc kubenswrapper[4895]: I1202 07:24:06.108402 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:06 crc kubenswrapper[4895]: I1202 07:24:06.108429 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:06 crc kubenswrapper[4895]: I1202 07:24:06.108450 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:06Z","lastTransitionTime":"2025-12-02T07:24:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:24:06 crc kubenswrapper[4895]: I1202 07:24:06.140885 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5f88v" Dec 02 07:24:06 crc kubenswrapper[4895]: I1202 07:24:06.140998 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:24:06 crc kubenswrapper[4895]: E1202 07:24:06.141148 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5f88v" podUID="5af25091-1401-45d4-ae53-d2b469c879da" Dec 02 07:24:06 crc kubenswrapper[4895]: E1202 07:24:06.141289 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 07:24:06 crc kubenswrapper[4895]: I1202 07:24:06.212175 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:06 crc kubenswrapper[4895]: I1202 07:24:06.212259 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:06 crc kubenswrapper[4895]: I1202 07:24:06.212276 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:06 crc kubenswrapper[4895]: I1202 07:24:06.212296 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:06 crc kubenswrapper[4895]: I1202 07:24:06.212310 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:06Z","lastTransitionTime":"2025-12-02T07:24:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:06 crc kubenswrapper[4895]: I1202 07:24:06.315289 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:06 crc kubenswrapper[4895]: I1202 07:24:06.315370 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:06 crc kubenswrapper[4895]: I1202 07:24:06.315399 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:06 crc kubenswrapper[4895]: I1202 07:24:06.315435 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:06 crc kubenswrapper[4895]: I1202 07:24:06.315455 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:06Z","lastTransitionTime":"2025-12-02T07:24:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:06 crc kubenswrapper[4895]: I1202 07:24:06.418830 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:06 crc kubenswrapper[4895]: I1202 07:24:06.418902 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:06 crc kubenswrapper[4895]: I1202 07:24:06.418921 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:06 crc kubenswrapper[4895]: I1202 07:24:06.418951 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:06 crc kubenswrapper[4895]: I1202 07:24:06.418972 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:06Z","lastTransitionTime":"2025-12-02T07:24:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:06 crc kubenswrapper[4895]: I1202 07:24:06.522898 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:06 crc kubenswrapper[4895]: I1202 07:24:06.522980 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:06 crc kubenswrapper[4895]: I1202 07:24:06.523001 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:06 crc kubenswrapper[4895]: I1202 07:24:06.523037 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:06 crc kubenswrapper[4895]: I1202 07:24:06.523056 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:06Z","lastTransitionTime":"2025-12-02T07:24:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:06 crc kubenswrapper[4895]: I1202 07:24:06.626587 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:06 crc kubenswrapper[4895]: I1202 07:24:06.626676 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:06 crc kubenswrapper[4895]: I1202 07:24:06.626697 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:06 crc kubenswrapper[4895]: I1202 07:24:06.626732 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:06 crc kubenswrapper[4895]: I1202 07:24:06.626804 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:06Z","lastTransitionTime":"2025-12-02T07:24:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:06 crc kubenswrapper[4895]: I1202 07:24:06.731463 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:06 crc kubenswrapper[4895]: I1202 07:24:06.731547 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:06 crc kubenswrapper[4895]: I1202 07:24:06.731575 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:06 crc kubenswrapper[4895]: I1202 07:24:06.731611 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:06 crc kubenswrapper[4895]: I1202 07:24:06.731634 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:06Z","lastTransitionTime":"2025-12-02T07:24:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:06 crc kubenswrapper[4895]: I1202 07:24:06.835030 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:06 crc kubenswrapper[4895]: I1202 07:24:06.835123 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:06 crc kubenswrapper[4895]: I1202 07:24:06.835143 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:06 crc kubenswrapper[4895]: I1202 07:24:06.835173 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:06 crc kubenswrapper[4895]: I1202 07:24:06.835192 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:06Z","lastTransitionTime":"2025-12-02T07:24:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:06 crc kubenswrapper[4895]: I1202 07:24:06.938640 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:06 crc kubenswrapper[4895]: I1202 07:24:06.938786 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:06 crc kubenswrapper[4895]: I1202 07:24:06.938810 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:06 crc kubenswrapper[4895]: I1202 07:24:06.938843 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:06 crc kubenswrapper[4895]: I1202 07:24:06.938865 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:06Z","lastTransitionTime":"2025-12-02T07:24:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:07 crc kubenswrapper[4895]: I1202 07:24:07.042948 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:07 crc kubenswrapper[4895]: I1202 07:24:07.043048 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:07 crc kubenswrapper[4895]: I1202 07:24:07.043069 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:07 crc kubenswrapper[4895]: I1202 07:24:07.043102 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:07 crc kubenswrapper[4895]: I1202 07:24:07.043126 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:07Z","lastTransitionTime":"2025-12-02T07:24:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:07 crc kubenswrapper[4895]: I1202 07:24:07.077485 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:07 crc kubenswrapper[4895]: I1202 07:24:07.077556 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:07 crc kubenswrapper[4895]: I1202 07:24:07.077573 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:07 crc kubenswrapper[4895]: I1202 07:24:07.077601 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:07 crc kubenswrapper[4895]: I1202 07:24:07.077620 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:07Z","lastTransitionTime":"2025-12-02T07:24:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:07 crc kubenswrapper[4895]: E1202 07:24:07.103295 4895 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:24:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:24:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:24:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:24:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:24:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:24:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:24:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:24:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"42683c5b-b2bf-439b-8ee4-25c8d72cfed1\\\",\\\"systemUUID\\\":\\\"92c636b1-dcb0-457f-b098-73baeaac297e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:07Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:07 crc kubenswrapper[4895]: I1202 07:24:07.109847 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:07 crc kubenswrapper[4895]: I1202 07:24:07.109908 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:07 crc kubenswrapper[4895]: I1202 07:24:07.109927 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:07 crc kubenswrapper[4895]: I1202 07:24:07.109955 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:07 crc kubenswrapper[4895]: I1202 07:24:07.109981 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:07Z","lastTransitionTime":"2025-12-02T07:24:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:07 crc kubenswrapper[4895]: E1202 07:24:07.132495 4895 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:24:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:24:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:24:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:24:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:24:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:24:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:24:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:24:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"42683c5b-b2bf-439b-8ee4-25c8d72cfed1\\\",\\\"systemUUID\\\":\\\"92c636b1-dcb0-457f-b098-73baeaac297e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:07Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:07 crc kubenswrapper[4895]: I1202 07:24:07.138232 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:07 crc kubenswrapper[4895]: I1202 07:24:07.138288 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:07 crc kubenswrapper[4895]: I1202 07:24:07.138306 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:07 crc kubenswrapper[4895]: I1202 07:24:07.138336 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:07 crc kubenswrapper[4895]: I1202 07:24:07.138357 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:07Z","lastTransitionTime":"2025-12-02T07:24:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:24:07 crc kubenswrapper[4895]: I1202 07:24:07.140598 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:24:07 crc kubenswrapper[4895]: I1202 07:24:07.140627 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:24:07 crc kubenswrapper[4895]: E1202 07:24:07.140908 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 07:24:07 crc kubenswrapper[4895]: E1202 07:24:07.141160 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 07:24:07 crc kubenswrapper[4895]: I1202 07:24:07.174898 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:07 crc kubenswrapper[4895]: I1202 07:24:07.174977 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:07 crc kubenswrapper[4895]: I1202 07:24:07.175005 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:07 crc kubenswrapper[4895]: I1202 07:24:07.175038 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:07 crc kubenswrapper[4895]: I1202 07:24:07.175061 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:07Z","lastTransitionTime":"2025-12-02T07:24:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:07 crc kubenswrapper[4895]: I1202 07:24:07.209095 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:07 crc kubenswrapper[4895]: I1202 07:24:07.209168 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:07 crc kubenswrapper[4895]: I1202 07:24:07.209189 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:07 crc kubenswrapper[4895]: I1202 07:24:07.209220 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:07 crc kubenswrapper[4895]: I1202 07:24:07.209244 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:07Z","lastTransitionTime":"2025-12-02T07:24:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:07 crc kubenswrapper[4895]: E1202 07:24:07.228992 4895 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:24:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:24:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:24:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:24:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:24:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:24:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:24:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:24:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"42683c5b-b2bf-439b-8ee4-25c8d72cfed1\\\",\\\"systemUUID\\\":\\\"92c636b1-dcb0-457f-b098-73baeaac297e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:07Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:07 crc kubenswrapper[4895]: E1202 07:24:07.229253 4895 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 02 07:24:07 crc kubenswrapper[4895]: I1202 07:24:07.232849 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:07 crc kubenswrapper[4895]: I1202 07:24:07.232930 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:07 crc kubenswrapper[4895]: I1202 07:24:07.232958 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:07 crc kubenswrapper[4895]: I1202 07:24:07.232989 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:07 crc kubenswrapper[4895]: I1202 07:24:07.233010 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:07Z","lastTransitionTime":"2025-12-02T07:24:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:07 crc kubenswrapper[4895]: I1202 07:24:07.336712 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:07 crc kubenswrapper[4895]: I1202 07:24:07.336856 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:07 crc kubenswrapper[4895]: I1202 07:24:07.336877 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:07 crc kubenswrapper[4895]: I1202 07:24:07.336906 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:07 crc kubenswrapper[4895]: I1202 07:24:07.336925 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:07Z","lastTransitionTime":"2025-12-02T07:24:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:07 crc kubenswrapper[4895]: I1202 07:24:07.440982 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:07 crc kubenswrapper[4895]: I1202 07:24:07.441043 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:07 crc kubenswrapper[4895]: I1202 07:24:07.441057 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:07 crc kubenswrapper[4895]: I1202 07:24:07.441078 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:07 crc kubenswrapper[4895]: I1202 07:24:07.441091 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:07Z","lastTransitionTime":"2025-12-02T07:24:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:07 crc kubenswrapper[4895]: I1202 07:24:07.545344 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:07 crc kubenswrapper[4895]: I1202 07:24:07.545426 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:07 crc kubenswrapper[4895]: I1202 07:24:07.545445 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:07 crc kubenswrapper[4895]: I1202 07:24:07.545477 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:07 crc kubenswrapper[4895]: I1202 07:24:07.545496 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:07Z","lastTransitionTime":"2025-12-02T07:24:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:07 crc kubenswrapper[4895]: I1202 07:24:07.650061 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:07 crc kubenswrapper[4895]: I1202 07:24:07.650165 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:07 crc kubenswrapper[4895]: I1202 07:24:07.650194 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:07 crc kubenswrapper[4895]: I1202 07:24:07.650233 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:07 crc kubenswrapper[4895]: I1202 07:24:07.650257 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:07Z","lastTransitionTime":"2025-12-02T07:24:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:07 crc kubenswrapper[4895]: I1202 07:24:07.754596 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:07 crc kubenswrapper[4895]: I1202 07:24:07.754675 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:07 crc kubenswrapper[4895]: I1202 07:24:07.754695 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:07 crc kubenswrapper[4895]: I1202 07:24:07.754725 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:07 crc kubenswrapper[4895]: I1202 07:24:07.754771 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:07Z","lastTransitionTime":"2025-12-02T07:24:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:07 crc kubenswrapper[4895]: I1202 07:24:07.858901 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:07 crc kubenswrapper[4895]: I1202 07:24:07.859026 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:07 crc kubenswrapper[4895]: I1202 07:24:07.859054 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:07 crc kubenswrapper[4895]: I1202 07:24:07.859091 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:07 crc kubenswrapper[4895]: I1202 07:24:07.859115 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:07Z","lastTransitionTime":"2025-12-02T07:24:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:07 crc kubenswrapper[4895]: I1202 07:24:07.963452 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:07 crc kubenswrapper[4895]: I1202 07:24:07.963553 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:07 crc kubenswrapper[4895]: I1202 07:24:07.963574 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:07 crc kubenswrapper[4895]: I1202 07:24:07.963606 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:07 crc kubenswrapper[4895]: I1202 07:24:07.963629 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:07Z","lastTransitionTime":"2025-12-02T07:24:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:08 crc kubenswrapper[4895]: I1202 07:24:08.067054 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:08 crc kubenswrapper[4895]: I1202 07:24:08.067139 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:08 crc kubenswrapper[4895]: I1202 07:24:08.067170 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:08 crc kubenswrapper[4895]: I1202 07:24:08.067202 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:08 crc kubenswrapper[4895]: I1202 07:24:08.067221 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:08Z","lastTransitionTime":"2025-12-02T07:24:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:08 crc kubenswrapper[4895]: I1202 07:24:08.096674 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5af25091-1401-45d4-ae53-d2b469c879da-metrics-certs\") pod \"network-metrics-daemon-5f88v\" (UID: \"5af25091-1401-45d4-ae53-d2b469c879da\") " pod="openshift-multus/network-metrics-daemon-5f88v" Dec 02 07:24:08 crc kubenswrapper[4895]: E1202 07:24:08.097072 4895 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 02 07:24:08 crc kubenswrapper[4895]: E1202 07:24:08.097166 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5af25091-1401-45d4-ae53-d2b469c879da-metrics-certs podName:5af25091-1401-45d4-ae53-d2b469c879da nodeName:}" failed. No retries permitted until 2025-12-02 07:24:24.09713868 +0000 UTC m=+75.267998333 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5af25091-1401-45d4-ae53-d2b469c879da-metrics-certs") pod "network-metrics-daemon-5f88v" (UID: "5af25091-1401-45d4-ae53-d2b469c879da") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 02 07:24:08 crc kubenswrapper[4895]: I1202 07:24:08.140555 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:24:08 crc kubenswrapper[4895]: I1202 07:24:08.140599 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-5f88v" Dec 02 07:24:08 crc kubenswrapper[4895]: E1202 07:24:08.140841 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 07:24:08 crc kubenswrapper[4895]: E1202 07:24:08.140982 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5f88v" podUID="5af25091-1401-45d4-ae53-d2b469c879da" Dec 02 07:24:08 crc kubenswrapper[4895]: I1202 07:24:08.171321 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:08 crc kubenswrapper[4895]: I1202 07:24:08.171376 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:08 crc kubenswrapper[4895]: I1202 07:24:08.171397 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:08 crc kubenswrapper[4895]: I1202 07:24:08.171427 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:08 crc kubenswrapper[4895]: I1202 07:24:08.171448 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:08Z","lastTransitionTime":"2025-12-02T07:24:08Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:24:08 crc kubenswrapper[4895]: I1202 07:24:08.274359 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:08 crc kubenswrapper[4895]: I1202 07:24:08.274403 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:08 crc kubenswrapper[4895]: I1202 07:24:08.274446 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:08 crc kubenswrapper[4895]: I1202 07:24:08.274466 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:08 crc kubenswrapper[4895]: I1202 07:24:08.274479 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:08Z","lastTransitionTime":"2025-12-02T07:24:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:08 crc kubenswrapper[4895]: I1202 07:24:08.378156 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:08 crc kubenswrapper[4895]: I1202 07:24:08.378222 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:08 crc kubenswrapper[4895]: I1202 07:24:08.378241 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:08 crc kubenswrapper[4895]: I1202 07:24:08.378272 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:08 crc kubenswrapper[4895]: I1202 07:24:08.378292 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:08Z","lastTransitionTime":"2025-12-02T07:24:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:08 crc kubenswrapper[4895]: I1202 07:24:08.481841 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:08 crc kubenswrapper[4895]: I1202 07:24:08.481996 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:08 crc kubenswrapper[4895]: I1202 07:24:08.482018 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:08 crc kubenswrapper[4895]: I1202 07:24:08.482118 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:08 crc kubenswrapper[4895]: I1202 07:24:08.482191 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:08Z","lastTransitionTime":"2025-12-02T07:24:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:08 crc kubenswrapper[4895]: I1202 07:24:08.586215 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:08 crc kubenswrapper[4895]: I1202 07:24:08.586272 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:08 crc kubenswrapper[4895]: I1202 07:24:08.586289 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:08 crc kubenswrapper[4895]: I1202 07:24:08.586311 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:08 crc kubenswrapper[4895]: I1202 07:24:08.586324 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:08Z","lastTransitionTime":"2025-12-02T07:24:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:08 crc kubenswrapper[4895]: I1202 07:24:08.690694 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:08 crc kubenswrapper[4895]: I1202 07:24:08.690785 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:08 crc kubenswrapper[4895]: I1202 07:24:08.690799 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:08 crc kubenswrapper[4895]: I1202 07:24:08.690821 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:08 crc kubenswrapper[4895]: I1202 07:24:08.690842 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:08Z","lastTransitionTime":"2025-12-02T07:24:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:08 crc kubenswrapper[4895]: I1202 07:24:08.795684 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:08 crc kubenswrapper[4895]: I1202 07:24:08.795814 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:08 crc kubenswrapper[4895]: I1202 07:24:08.795844 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:08 crc kubenswrapper[4895]: I1202 07:24:08.795873 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:08 crc kubenswrapper[4895]: I1202 07:24:08.795893 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:08Z","lastTransitionTime":"2025-12-02T07:24:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:08 crc kubenswrapper[4895]: I1202 07:24:08.898866 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:08 crc kubenswrapper[4895]: I1202 07:24:08.898956 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:08 crc kubenswrapper[4895]: I1202 07:24:08.898985 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:08 crc kubenswrapper[4895]: I1202 07:24:08.899024 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:08 crc kubenswrapper[4895]: I1202 07:24:08.899048 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:08Z","lastTransitionTime":"2025-12-02T07:24:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:09 crc kubenswrapper[4895]: I1202 07:24:09.001389 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:09 crc kubenswrapper[4895]: I1202 07:24:09.001493 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:09 crc kubenswrapper[4895]: I1202 07:24:09.001526 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:09 crc kubenswrapper[4895]: I1202 07:24:09.001562 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:09 crc kubenswrapper[4895]: I1202 07:24:09.001588 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:09Z","lastTransitionTime":"2025-12-02T07:24:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:09 crc kubenswrapper[4895]: I1202 07:24:09.106839 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:09 crc kubenswrapper[4895]: I1202 07:24:09.106951 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:09 crc kubenswrapper[4895]: I1202 07:24:09.106978 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:09 crc kubenswrapper[4895]: I1202 07:24:09.107059 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:09 crc kubenswrapper[4895]: I1202 07:24:09.107080 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:09Z","lastTransitionTime":"2025-12-02T07:24:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:24:09 crc kubenswrapper[4895]: I1202 07:24:09.141900 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:24:09 crc kubenswrapper[4895]: I1202 07:24:09.141949 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:24:09 crc kubenswrapper[4895]: E1202 07:24:09.142212 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 07:24:09 crc kubenswrapper[4895]: E1202 07:24:09.142462 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 07:24:09 crc kubenswrapper[4895]: I1202 07:24:09.168657 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2db13abc9ba56d955814e75ceea54b92610a00daeefa29aa1aad584e3453728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:09Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:09 crc kubenswrapper[4895]: I1202 07:24:09.185881 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bb4bea68942dbface0009fbac76f7b1253ab282e8decb9081225de7d6537ec0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3115f2c50d05ee406f583937e39b1b55f364ce7cbeb0788b75941a22e23f57f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:09Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:09 crc kubenswrapper[4895]: I1202 07:24:09.209334 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:09 crc kubenswrapper[4895]: I1202 07:24:09.209375 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:09 crc kubenswrapper[4895]: I1202 07:24:09.209387 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:09 crc kubenswrapper[4895]: I1202 07:24:09.209407 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:09 crc kubenswrapper[4895]: I1202 07:24:09.209420 4895 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:09Z","lastTransitionTime":"2025-12-02T07:24:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:24:09 crc kubenswrapper[4895]: I1202 07:24:09.220003 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"afc3334a-0153-4dcc-9a56-92f6cae51c08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91b0c6e50085a7be0924e5a54ef006110a7f13556fe47f7f55124fd6b2f0bc36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5463b077f0bc8b5d38e27854ece6a71f74f7769f4552c32eefe16854993f290f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80dca705ddb94ba799eb72bfc23b0c3878cceca48d6d10f1c7f8eb0e4beed40e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f54ae324450e73bcbf2c408794171a92bbb66c24d05f29584ac2c247090c1273\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f330d728eecd672ecb554744a51ebfb2c104441d3a85fe0c8a6fc3446c37e4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca732cac495c410d89da4ce1476acd20905716cbb2baeedac9c52b654ff58573\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff016284272b93427bb0487a4661973b731200b8b4d28ab1c6d2d512bf5742e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff016284272b93427bb0487a4661973b731200b8b4d28ab1c6d2d512bf5742e9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T07:24:03Z\\\",\\\"message\\\":\\\"s.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1202 07:24:03.168657 6529 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1202 07:24:03.168679 6529 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1202 07:24:03.168706 6529 handler.go:208] Removed *v1.Pod event handler 
6\\\\nI1202 07:24:03.168812 6529 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1202 07:24:03.168811 6529 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1202 07:24:03.168829 6529 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1202 07:24:03.168889 6529 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1202 07:24:03.168921 6529 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1202 07:24:03.168943 6529 factory.go:656] Stopping watch factory\\\\nI1202 07:24:03.168956 6529 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1202 07:24:03.168969 6529 handler.go:208] Removed *v1.Node event handler 2\\\\nI1202 07:24:03.168979 6529 handler.go:208] Removed *v1.Node event handler 7\\\\nI1202 07:24:03.168988 6529 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1202 07:24:03.169421 6529 ovnkube.go:599] Stopped ovnkube\\\\nI1202 07:24:03.169503 6529 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1202 07:24:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:24:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-w54m4_openshift-ovn-kubernetes(afc3334a-0153-4dcc-9a56-92f6cae51c08)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6447684fe6189e38c9caad6dfe1384b7af01d50e2ef5f889c7b2874ba092306\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://328fa773c92ef0a965065412d4141c0b41ffe38d21ed169386a9d30d05f37f28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://328fa773c92ef0a965
065412d4141c0b41ffe38d21ed169386a9d30d05f37f28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w54m4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:09Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:09 crc kubenswrapper[4895]: I1202 07:24:09.242665 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b7fa1ad-7e38-4645-ab41-c7d395f5c10e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40e020a2b4caf39d1661cee9c6b0055eca86e0ae9b81d2da7485ca9ec92dd0aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33e300332628d3baf9041b52dc7b8cd9dab1e9c5da76572e86b787433fd5fb92\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4455fec2a817e7266e152d3f4afbede483504d6066f1a0f7375bce282ae10f8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b674e20b2199fd0083aed9cf001d27a72d31ca6c62cbe376c9510cc94f37d312\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9998dffc20c5b0579e6214d10541b9702e9c4a6563c930a41e49b6853b832ba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T07:23:32Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 07:23:21.537254 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 07:23:21.538901 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1012964219/tls.crt::/tmp/serving-cert-1012964219/tls.key\\\\\\\"\\\\nI1202 07:23:32.848375 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 07:23:32.853597 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 07:23:32.853643 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 07:23:32.853680 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 07:23:32.853691 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 07:23:32.869490 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 07:23:32.869536 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:23:32.869542 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:23:32.869555 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 07:23:32.869559 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 07:23:32.869564 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 07:23:32.869568 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 07:23:32.869624 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1202 07:23:32.874299 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d318f71ad32dd4a410093b85741038e468d540edd30f112234a9d595bc6fd85f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0faa0995131a1f9cbf2aa0cf7651d510fad1bdaf7e32de51f8a14b70f161e783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0faa0995131a1f9cbf2aa0cf7651d510f
ad1bdaf7e32de51f8a14b70f161e783\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:09Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:09 crc kubenswrapper[4895]: I1202 07:24:09.260300 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:09Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:09 crc kubenswrapper[4895]: I1202 07:24:09.271907 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-74fkh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b212bd8-f1a7-4982-b2e6-499c13a34b0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4d58e6f37ee8f90af6aad88138c39e9a0753c16cceb763c2bd2ec9c30087d92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhpwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-74fkh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:09Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:09 crc kubenswrapper[4895]: I1202 07:24:09.287051 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0468d2d1-a975-45a6-af9f-47adc432fab0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebf8583b59d7955530da7a83a29f9994ba139c076cf7b76dc94e88d14f39e1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vfwjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1474e8f5989607f3b95fefe811e3d787320f04c223edb6ffe47d2b37863ea874\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vfwjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wfcg7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:09Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:09 crc kubenswrapper[4895]: I1202 07:24:09.300151 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4604b0d5-cf8f-404d-af67-65da4b1dc003\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b3c366f0f27ff793a37b2944887ab1c1bc155c14d5c8d6c267442b0a8666ef8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes
/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6c3baa5b3772a8fc64774e28c3f389f114723e7efa62b934140a4fed8b6a40d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49d7c12ea2cd199dcfa46a6f1a1a856f1285a709def9d82b6805ba256a79fd5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2543d364f902a6c34a82dac25f389cd36afc2f717d772d0a487a640d44adf30f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\
\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:09Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:09 crc kubenswrapper[4895]: I1202 07:24:09.312514 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:09 crc kubenswrapper[4895]: I1202 07:24:09.312554 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:09 crc kubenswrapper[4895]: I1202 07:24:09.312564 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:09 crc kubenswrapper[4895]: I1202 07:24:09.312585 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:09 crc kubenswrapper[4895]: I1202 
07:24:09.312598 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:09Z","lastTransitionTime":"2025-12-02T07:24:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:24:09 crc kubenswrapper[4895]: I1202 07:24:09.315561 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://444ad02c4afa919779de59fd8c81bdce5c3426dd6451b90505921a8d5fa0c96b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\
\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:09Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:09 crc kubenswrapper[4895]: I1202 07:24:09.330590 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:09Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:09 crc kubenswrapper[4895]: I1202 07:24:09.346171 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n7xcr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3514f381-d0d1-4e00-931e-c5ca75202a1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2670ef1ea22a6c1c5d9ec4ee4b0345c575b20224fddf8655b95eaa431573d071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3deeb37702c8243a956f2be1bb61e2693bb9be5453d775bc3f3e85618d483015\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3deeb37702c8243a956f2be1bb61e2693bb9be5453d775bc3f3e85618d483015\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2c60eddf285bbbf1a5957691f56d895e5bb7461fb18ab7c495879357cfae565\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2c60eddf285bbbf1a5957691f56d895e5bb7461fb18ab7c495879357cfae565\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e8b39d38597192393d90cc582b8ad109a1ed22c4c14f539e1a5445e6d817872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e8b39d38597192393d90cc582b8ad109a1ed22c4c14f539e1a5445e6d817872\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb5ab
c8da49059e23c4013f3856d3f4c9078cd8871a79f8dae5ce4d8cb319ef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb5abc8da49059e23c4013f3856d3f4c9078cd8871a79f8dae5ce4d8cb319ef9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbdd3e8943b356a485d900737682f0903dc04aefa33498ee5cdf3c82b6570852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbdd3e8943b356a485d900737682f0903dc04aefa33498ee5cdf3c82b6570852\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:43Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3641d1f7fd2ea062c216ce260c35d903273d3039fe790815ebb697f8c11d15cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3641d1f7fd2ea062c216ce260c35d903273d3039fe790815ebb697f8c11d15cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n7xcr\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:09Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:09 crc kubenswrapper[4895]: I1202 07:24:09.364003 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hlxqt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"30911fe5-208f-44e8-a380-2a0093f24863\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87ebbf4f529d4e7b64c392912d06e57dc80a2237a9a4e31ab5b56ac556aa10b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-
12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6l8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hlxqt\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:09Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:09 crc kubenswrapper[4895]: I1202 07:24:09.377986 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5f88v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5af25091-1401-45d4-ae53-d2b469c879da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wfrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wfrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5f88v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:09Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:09 crc 
kubenswrapper[4895]: I1202 07:24:09.394599 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2df36ca5-8a5e-4865-8728-b8567297546a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f076cfc3bd5e782332e827d56325b73cf2a9f35248f397338b5130793481af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de2bad51e8d51f1243a2fe7799554c7f089425e3623a6bb62f7393610d70fc4e\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b737e88ad2a8caf70b89f59db461d355846d5612a29a249107b49fbec176204\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9d30ec3b0f73c029956f058f1d8f06cfffb5533dcd0c221ab6842d277f274a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9d30ec3b0f73c029956f058f1d8f06cfffb5533dcd0c221ab6842d277f274a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:09Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:09Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:09Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:09 crc kubenswrapper[4895]: I1202 07:24:09.411610 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:09Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:09 crc kubenswrapper[4895]: I1202 07:24:09.415293 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:09 crc kubenswrapper[4895]: I1202 07:24:09.415341 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:09 crc kubenswrapper[4895]: I1202 07:24:09.415354 4895 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:09 crc kubenswrapper[4895]: I1202 07:24:09.415377 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:09 crc kubenswrapper[4895]: I1202 07:24:09.415391 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:09Z","lastTransitionTime":"2025-12-02T07:24:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:24:09 crc kubenswrapper[4895]: I1202 07:24:09.425447 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lhpbd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"81128292-8c02-45bd-9b25-e9457e989975\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cc63751b6428c88deed36b914b2a95a94
a50b544c562126c2a9567c177b2570\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2z5n2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lhpbd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:09Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:09 crc kubenswrapper[4895]: I1202 07:24:09.439361 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wxp5p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a0b438f-70fd-4c3f-a170-2b3fe9797955\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7329cf1034b2d51ebb7f978448558a8b6fa39d58823289c470fddcfed262e395\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj6l9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cbc34b4d78cc00ac79b9944d92c6ac61a9d5
dd704f36a473950baf53dfcbf47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj6l9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wxp5p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:09Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:09 crc kubenswrapper[4895]: I1202 07:24:09.519070 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:09 crc kubenswrapper[4895]: I1202 07:24:09.519145 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:09 crc kubenswrapper[4895]: I1202 07:24:09.519166 4895 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:09 crc kubenswrapper[4895]: I1202 07:24:09.519197 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:09 crc kubenswrapper[4895]: I1202 07:24:09.519218 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:09Z","lastTransitionTime":"2025-12-02T07:24:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:24:09 crc kubenswrapper[4895]: I1202 07:24:09.622542 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:09 crc kubenswrapper[4895]: I1202 07:24:09.622595 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:09 crc kubenswrapper[4895]: I1202 07:24:09.622609 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:09 crc kubenswrapper[4895]: I1202 07:24:09.622629 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:09 crc kubenswrapper[4895]: I1202 07:24:09.622643 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:09Z","lastTransitionTime":"2025-12-02T07:24:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:09 crc kubenswrapper[4895]: I1202 07:24:09.726413 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:09 crc kubenswrapper[4895]: I1202 07:24:09.726496 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:09 crc kubenswrapper[4895]: I1202 07:24:09.726521 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:09 crc kubenswrapper[4895]: I1202 07:24:09.726560 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:09 crc kubenswrapper[4895]: I1202 07:24:09.726583 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:09Z","lastTransitionTime":"2025-12-02T07:24:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:09 crc kubenswrapper[4895]: I1202 07:24:09.830269 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:09 crc kubenswrapper[4895]: I1202 07:24:09.830346 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:09 crc kubenswrapper[4895]: I1202 07:24:09.830366 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:09 crc kubenswrapper[4895]: I1202 07:24:09.830396 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:09 crc kubenswrapper[4895]: I1202 07:24:09.830419 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:09Z","lastTransitionTime":"2025-12-02T07:24:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:09 crc kubenswrapper[4895]: I1202 07:24:09.934229 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:09 crc kubenswrapper[4895]: I1202 07:24:09.934285 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:09 crc kubenswrapper[4895]: I1202 07:24:09.934297 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:09 crc kubenswrapper[4895]: I1202 07:24:09.934318 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:09 crc kubenswrapper[4895]: I1202 07:24:09.934333 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:09Z","lastTransitionTime":"2025-12-02T07:24:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:10 crc kubenswrapper[4895]: I1202 07:24:10.038119 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:10 crc kubenswrapper[4895]: I1202 07:24:10.038186 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:10 crc kubenswrapper[4895]: I1202 07:24:10.038204 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:10 crc kubenswrapper[4895]: I1202 07:24:10.038234 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:10 crc kubenswrapper[4895]: I1202 07:24:10.038262 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:10Z","lastTransitionTime":"2025-12-02T07:24:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:24:10 crc kubenswrapper[4895]: I1202 07:24:10.140180 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5f88v" Dec 02 07:24:10 crc kubenswrapper[4895]: I1202 07:24:10.140670 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:24:10 crc kubenswrapper[4895]: E1202 07:24:10.141300 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 07:24:10 crc kubenswrapper[4895]: E1202 07:24:10.140834 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5f88v" podUID="5af25091-1401-45d4-ae53-d2b469c879da" Dec 02 07:24:10 crc kubenswrapper[4895]: I1202 07:24:10.141912 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:10 crc kubenswrapper[4895]: I1202 07:24:10.141965 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:10 crc kubenswrapper[4895]: I1202 07:24:10.141982 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:10 crc kubenswrapper[4895]: I1202 07:24:10.142005 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:10 crc kubenswrapper[4895]: I1202 07:24:10.142026 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:10Z","lastTransitionTime":"2025-12-02T07:24:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:10 crc kubenswrapper[4895]: I1202 07:24:10.245541 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:10 crc kubenswrapper[4895]: I1202 07:24:10.245915 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:10 crc kubenswrapper[4895]: I1202 07:24:10.245983 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:10 crc kubenswrapper[4895]: I1202 07:24:10.246052 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:10 crc kubenswrapper[4895]: I1202 07:24:10.246207 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:10Z","lastTransitionTime":"2025-12-02T07:24:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:10 crc kubenswrapper[4895]: I1202 07:24:10.350180 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:10 crc kubenswrapper[4895]: I1202 07:24:10.350273 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:10 crc kubenswrapper[4895]: I1202 07:24:10.350291 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:10 crc kubenswrapper[4895]: I1202 07:24:10.350316 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:10 crc kubenswrapper[4895]: I1202 07:24:10.350336 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:10Z","lastTransitionTime":"2025-12-02T07:24:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:10 crc kubenswrapper[4895]: I1202 07:24:10.453284 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:10 crc kubenswrapper[4895]: I1202 07:24:10.453820 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:10 crc kubenswrapper[4895]: I1202 07:24:10.454082 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:10 crc kubenswrapper[4895]: I1202 07:24:10.454314 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:10 crc kubenswrapper[4895]: I1202 07:24:10.454526 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:10Z","lastTransitionTime":"2025-12-02T07:24:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:10 crc kubenswrapper[4895]: I1202 07:24:10.558349 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:10 crc kubenswrapper[4895]: I1202 07:24:10.558407 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:10 crc kubenswrapper[4895]: I1202 07:24:10.558426 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:10 crc kubenswrapper[4895]: I1202 07:24:10.558454 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:10 crc kubenswrapper[4895]: I1202 07:24:10.558473 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:10Z","lastTransitionTime":"2025-12-02T07:24:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:10 crc kubenswrapper[4895]: I1202 07:24:10.661967 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:10 crc kubenswrapper[4895]: I1202 07:24:10.662032 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:10 crc kubenswrapper[4895]: I1202 07:24:10.662051 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:10 crc kubenswrapper[4895]: I1202 07:24:10.662077 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:10 crc kubenswrapper[4895]: I1202 07:24:10.662095 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:10Z","lastTransitionTime":"2025-12-02T07:24:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:10 crc kubenswrapper[4895]: I1202 07:24:10.765142 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:10 crc kubenswrapper[4895]: I1202 07:24:10.765617 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:10 crc kubenswrapper[4895]: I1202 07:24:10.765726 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:10 crc kubenswrapper[4895]: I1202 07:24:10.765871 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:10 crc kubenswrapper[4895]: I1202 07:24:10.765957 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:10Z","lastTransitionTime":"2025-12-02T07:24:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:10 crc kubenswrapper[4895]: I1202 07:24:10.869799 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:10 crc kubenswrapper[4895]: I1202 07:24:10.869868 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:10 crc kubenswrapper[4895]: I1202 07:24:10.869879 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:10 crc kubenswrapper[4895]: I1202 07:24:10.869897 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:10 crc kubenswrapper[4895]: I1202 07:24:10.869933 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:10Z","lastTransitionTime":"2025-12-02T07:24:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:10 crc kubenswrapper[4895]: I1202 07:24:10.974099 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:10 crc kubenswrapper[4895]: I1202 07:24:10.974178 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:10 crc kubenswrapper[4895]: I1202 07:24:10.974200 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:10 crc kubenswrapper[4895]: I1202 07:24:10.974229 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:10 crc kubenswrapper[4895]: I1202 07:24:10.974248 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:10Z","lastTransitionTime":"2025-12-02T07:24:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:11 crc kubenswrapper[4895]: I1202 07:24:11.077589 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:11 crc kubenswrapper[4895]: I1202 07:24:11.077660 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:11 crc kubenswrapper[4895]: I1202 07:24:11.077678 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:11 crc kubenswrapper[4895]: I1202 07:24:11.077703 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:11 crc kubenswrapper[4895]: I1202 07:24:11.077720 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:11Z","lastTransitionTime":"2025-12-02T07:24:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:24:11 crc kubenswrapper[4895]: I1202 07:24:11.140942 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:24:11 crc kubenswrapper[4895]: E1202 07:24:11.141098 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 07:24:11 crc kubenswrapper[4895]: I1202 07:24:11.141388 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:24:11 crc kubenswrapper[4895]: E1202 07:24:11.141792 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 07:24:11 crc kubenswrapper[4895]: I1202 07:24:11.181234 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:11 crc kubenswrapper[4895]: I1202 07:24:11.181301 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:11 crc kubenswrapper[4895]: I1202 07:24:11.181319 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:11 crc kubenswrapper[4895]: I1202 07:24:11.181340 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:11 crc kubenswrapper[4895]: I1202 07:24:11.181352 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:11Z","lastTransitionTime":"2025-12-02T07:24:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:11 crc kubenswrapper[4895]: I1202 07:24:11.284395 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:11 crc kubenswrapper[4895]: I1202 07:24:11.284470 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:11 crc kubenswrapper[4895]: I1202 07:24:11.284488 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:11 crc kubenswrapper[4895]: I1202 07:24:11.284518 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:11 crc kubenswrapper[4895]: I1202 07:24:11.284535 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:11Z","lastTransitionTime":"2025-12-02T07:24:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:11 crc kubenswrapper[4895]: I1202 07:24:11.387956 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:11 crc kubenswrapper[4895]: I1202 07:24:11.388041 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:11 crc kubenswrapper[4895]: I1202 07:24:11.388061 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:11 crc kubenswrapper[4895]: I1202 07:24:11.388087 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:11 crc kubenswrapper[4895]: I1202 07:24:11.388102 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:11Z","lastTransitionTime":"2025-12-02T07:24:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:11 crc kubenswrapper[4895]: I1202 07:24:11.491685 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:11 crc kubenswrapper[4895]: I1202 07:24:11.491725 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:11 crc kubenswrapper[4895]: I1202 07:24:11.491737 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:11 crc kubenswrapper[4895]: I1202 07:24:11.491774 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:11 crc kubenswrapper[4895]: I1202 07:24:11.491784 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:11Z","lastTransitionTime":"2025-12-02T07:24:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:11 crc kubenswrapper[4895]: I1202 07:24:11.595471 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:11 crc kubenswrapper[4895]: I1202 07:24:11.595574 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:11 crc kubenswrapper[4895]: I1202 07:24:11.595603 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:11 crc kubenswrapper[4895]: I1202 07:24:11.595637 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:11 crc kubenswrapper[4895]: I1202 07:24:11.595663 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:11Z","lastTransitionTime":"2025-12-02T07:24:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:11 crc kubenswrapper[4895]: I1202 07:24:11.698472 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:11 crc kubenswrapper[4895]: I1202 07:24:11.698533 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:11 crc kubenswrapper[4895]: I1202 07:24:11.698551 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:11 crc kubenswrapper[4895]: I1202 07:24:11.698572 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:11 crc kubenswrapper[4895]: I1202 07:24:11.698588 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:11Z","lastTransitionTime":"2025-12-02T07:24:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:11 crc kubenswrapper[4895]: I1202 07:24:11.801457 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:11 crc kubenswrapper[4895]: I1202 07:24:11.801493 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:11 crc kubenswrapper[4895]: I1202 07:24:11.801504 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:11 crc kubenswrapper[4895]: I1202 07:24:11.801520 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:11 crc kubenswrapper[4895]: I1202 07:24:11.801530 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:11Z","lastTransitionTime":"2025-12-02T07:24:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:11 crc kubenswrapper[4895]: I1202 07:24:11.904636 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:11 crc kubenswrapper[4895]: I1202 07:24:11.904724 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:11 crc kubenswrapper[4895]: I1202 07:24:11.904850 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:11 crc kubenswrapper[4895]: I1202 07:24:11.904882 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:11 crc kubenswrapper[4895]: I1202 07:24:11.904900 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:11Z","lastTransitionTime":"2025-12-02T07:24:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:12 crc kubenswrapper[4895]: I1202 07:24:12.008770 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:12 crc kubenswrapper[4895]: I1202 07:24:12.008813 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:12 crc kubenswrapper[4895]: I1202 07:24:12.008821 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:12 crc kubenswrapper[4895]: I1202 07:24:12.008838 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:12 crc kubenswrapper[4895]: I1202 07:24:12.008847 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:12Z","lastTransitionTime":"2025-12-02T07:24:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:12 crc kubenswrapper[4895]: I1202 07:24:12.111502 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:12 crc kubenswrapper[4895]: I1202 07:24:12.111536 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:12 crc kubenswrapper[4895]: I1202 07:24:12.111546 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:12 crc kubenswrapper[4895]: I1202 07:24:12.111561 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:12 crc kubenswrapper[4895]: I1202 07:24:12.111569 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:12Z","lastTransitionTime":"2025-12-02T07:24:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:24:12 crc kubenswrapper[4895]: I1202 07:24:12.140288 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:24:12 crc kubenswrapper[4895]: I1202 07:24:12.140302 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5f88v" Dec 02 07:24:12 crc kubenswrapper[4895]: E1202 07:24:12.140412 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 07:24:12 crc kubenswrapper[4895]: E1202 07:24:12.140581 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5f88v" podUID="5af25091-1401-45d4-ae53-d2b469c879da" Dec 02 07:24:12 crc kubenswrapper[4895]: I1202 07:24:12.214710 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:12 crc kubenswrapper[4895]: I1202 07:24:12.214772 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:12 crc kubenswrapper[4895]: I1202 07:24:12.214793 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:12 crc kubenswrapper[4895]: I1202 07:24:12.214817 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:12 crc kubenswrapper[4895]: I1202 07:24:12.214830 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:12Z","lastTransitionTime":"2025-12-02T07:24:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:12 crc kubenswrapper[4895]: I1202 07:24:12.318174 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:12 crc kubenswrapper[4895]: I1202 07:24:12.318222 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:12 crc kubenswrapper[4895]: I1202 07:24:12.318233 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:12 crc kubenswrapper[4895]: I1202 07:24:12.318252 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:12 crc kubenswrapper[4895]: I1202 07:24:12.318263 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:12Z","lastTransitionTime":"2025-12-02T07:24:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:12 crc kubenswrapper[4895]: I1202 07:24:12.421560 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:12 crc kubenswrapper[4895]: I1202 07:24:12.421652 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:12 crc kubenswrapper[4895]: I1202 07:24:12.421673 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:12 crc kubenswrapper[4895]: I1202 07:24:12.421705 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:12 crc kubenswrapper[4895]: I1202 07:24:12.421728 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:12Z","lastTransitionTime":"2025-12-02T07:24:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:12 crc kubenswrapper[4895]: I1202 07:24:12.524998 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:12 crc kubenswrapper[4895]: I1202 07:24:12.525073 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:12 crc kubenswrapper[4895]: I1202 07:24:12.525097 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:12 crc kubenswrapper[4895]: I1202 07:24:12.525132 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:12 crc kubenswrapper[4895]: I1202 07:24:12.525176 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:12Z","lastTransitionTime":"2025-12-02T07:24:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:12 crc kubenswrapper[4895]: I1202 07:24:12.628537 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:12 crc kubenswrapper[4895]: I1202 07:24:12.628579 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:12 crc kubenswrapper[4895]: I1202 07:24:12.628589 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:12 crc kubenswrapper[4895]: I1202 07:24:12.628606 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:12 crc kubenswrapper[4895]: I1202 07:24:12.628619 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:12Z","lastTransitionTime":"2025-12-02T07:24:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:12 crc kubenswrapper[4895]: I1202 07:24:12.730877 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:12 crc kubenswrapper[4895]: I1202 07:24:12.730913 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:12 crc kubenswrapper[4895]: I1202 07:24:12.730924 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:12 crc kubenswrapper[4895]: I1202 07:24:12.730942 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:12 crc kubenswrapper[4895]: I1202 07:24:12.730951 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:12Z","lastTransitionTime":"2025-12-02T07:24:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:12 crc kubenswrapper[4895]: I1202 07:24:12.833801 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:12 crc kubenswrapper[4895]: I1202 07:24:12.833847 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:12 crc kubenswrapper[4895]: I1202 07:24:12.833858 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:12 crc kubenswrapper[4895]: I1202 07:24:12.833877 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:12 crc kubenswrapper[4895]: I1202 07:24:12.833891 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:12Z","lastTransitionTime":"2025-12-02T07:24:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:12 crc kubenswrapper[4895]: I1202 07:24:12.936630 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:12 crc kubenswrapper[4895]: I1202 07:24:12.936716 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:12 crc kubenswrapper[4895]: I1202 07:24:12.936737 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:12 crc kubenswrapper[4895]: I1202 07:24:12.936831 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:12 crc kubenswrapper[4895]: I1202 07:24:12.936861 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:12Z","lastTransitionTime":"2025-12-02T07:24:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:13 crc kubenswrapper[4895]: I1202 07:24:13.040219 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:13 crc kubenswrapper[4895]: I1202 07:24:13.040268 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:13 crc kubenswrapper[4895]: I1202 07:24:13.040279 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:13 crc kubenswrapper[4895]: I1202 07:24:13.040300 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:13 crc kubenswrapper[4895]: I1202 07:24:13.040311 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:13Z","lastTransitionTime":"2025-12-02T07:24:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:24:13 crc kubenswrapper[4895]: I1202 07:24:13.140535 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:24:13 crc kubenswrapper[4895]: I1202 07:24:13.140678 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:24:13 crc kubenswrapper[4895]: E1202 07:24:13.140693 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 07:24:13 crc kubenswrapper[4895]: E1202 07:24:13.140888 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 07:24:13 crc kubenswrapper[4895]: I1202 07:24:13.142936 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:13 crc kubenswrapper[4895]: I1202 07:24:13.142972 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:13 crc kubenswrapper[4895]: I1202 07:24:13.142982 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:13 crc kubenswrapper[4895]: I1202 07:24:13.143039 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:13 crc kubenswrapper[4895]: I1202 07:24:13.143052 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:13Z","lastTransitionTime":"2025-12-02T07:24:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:13 crc kubenswrapper[4895]: I1202 07:24:13.246798 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:13 crc kubenswrapper[4895]: I1202 07:24:13.246871 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:13 crc kubenswrapper[4895]: I1202 07:24:13.246894 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:13 crc kubenswrapper[4895]: I1202 07:24:13.246938 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:13 crc kubenswrapper[4895]: I1202 07:24:13.246967 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:13Z","lastTransitionTime":"2025-12-02T07:24:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:13 crc kubenswrapper[4895]: I1202 07:24:13.350723 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:13 crc kubenswrapper[4895]: I1202 07:24:13.350830 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:13 crc kubenswrapper[4895]: I1202 07:24:13.350849 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:13 crc kubenswrapper[4895]: I1202 07:24:13.350876 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:13 crc kubenswrapper[4895]: I1202 07:24:13.350894 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:13Z","lastTransitionTime":"2025-12-02T07:24:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:13 crc kubenswrapper[4895]: I1202 07:24:13.453230 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:13 crc kubenswrapper[4895]: I1202 07:24:13.453265 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:13 crc kubenswrapper[4895]: I1202 07:24:13.453275 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:13 crc kubenswrapper[4895]: I1202 07:24:13.453292 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:13 crc kubenswrapper[4895]: I1202 07:24:13.453302 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:13Z","lastTransitionTime":"2025-12-02T07:24:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:13 crc kubenswrapper[4895]: I1202 07:24:13.555390 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:13 crc kubenswrapper[4895]: I1202 07:24:13.555499 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:13 crc kubenswrapper[4895]: I1202 07:24:13.555514 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:13 crc kubenswrapper[4895]: I1202 07:24:13.555537 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:13 crc kubenswrapper[4895]: I1202 07:24:13.555549 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:13Z","lastTransitionTime":"2025-12-02T07:24:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:13 crc kubenswrapper[4895]: I1202 07:24:13.663506 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:13 crc kubenswrapper[4895]: I1202 07:24:13.663558 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:13 crc kubenswrapper[4895]: I1202 07:24:13.663568 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:13 crc kubenswrapper[4895]: I1202 07:24:13.663591 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:13 crc kubenswrapper[4895]: I1202 07:24:13.663603 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:13Z","lastTransitionTime":"2025-12-02T07:24:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:13 crc kubenswrapper[4895]: I1202 07:24:13.766858 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:13 crc kubenswrapper[4895]: I1202 07:24:13.766895 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:13 crc kubenswrapper[4895]: I1202 07:24:13.766906 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:13 crc kubenswrapper[4895]: I1202 07:24:13.766921 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:13 crc kubenswrapper[4895]: I1202 07:24:13.766931 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:13Z","lastTransitionTime":"2025-12-02T07:24:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:13 crc kubenswrapper[4895]: I1202 07:24:13.869584 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:13 crc kubenswrapper[4895]: I1202 07:24:13.869645 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:13 crc kubenswrapper[4895]: I1202 07:24:13.869660 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:13 crc kubenswrapper[4895]: I1202 07:24:13.869683 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:13 crc kubenswrapper[4895]: I1202 07:24:13.869697 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:13Z","lastTransitionTime":"2025-12-02T07:24:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:13 crc kubenswrapper[4895]: I1202 07:24:13.972918 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:13 crc kubenswrapper[4895]: I1202 07:24:13.972968 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:13 crc kubenswrapper[4895]: I1202 07:24:13.972980 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:13 crc kubenswrapper[4895]: I1202 07:24:13.973001 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:13 crc kubenswrapper[4895]: I1202 07:24:13.973015 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:13Z","lastTransitionTime":"2025-12-02T07:24:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:14 crc kubenswrapper[4895]: I1202 07:24:14.075378 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:14 crc kubenswrapper[4895]: I1202 07:24:14.075423 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:14 crc kubenswrapper[4895]: I1202 07:24:14.075433 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:14 crc kubenswrapper[4895]: I1202 07:24:14.075450 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:14 crc kubenswrapper[4895]: I1202 07:24:14.075461 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:14Z","lastTransitionTime":"2025-12-02T07:24:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:24:14 crc kubenswrapper[4895]: I1202 07:24:14.141126 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:24:14 crc kubenswrapper[4895]: I1202 07:24:14.141152 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5f88v" Dec 02 07:24:14 crc kubenswrapper[4895]: E1202 07:24:14.141348 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 07:24:14 crc kubenswrapper[4895]: E1202 07:24:14.141420 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5f88v" podUID="5af25091-1401-45d4-ae53-d2b469c879da" Dec 02 07:24:14 crc kubenswrapper[4895]: I1202 07:24:14.178996 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:14 crc kubenswrapper[4895]: I1202 07:24:14.179091 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:14 crc kubenswrapper[4895]: I1202 07:24:14.179118 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:14 crc kubenswrapper[4895]: I1202 07:24:14.179158 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:14 crc kubenswrapper[4895]: I1202 07:24:14.179191 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:14Z","lastTransitionTime":"2025-12-02T07:24:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:14 crc kubenswrapper[4895]: I1202 07:24:14.282066 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:14 crc kubenswrapper[4895]: I1202 07:24:14.282107 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:14 crc kubenswrapper[4895]: I1202 07:24:14.282119 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:14 crc kubenswrapper[4895]: I1202 07:24:14.282136 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:14 crc kubenswrapper[4895]: I1202 07:24:14.282148 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:14Z","lastTransitionTime":"2025-12-02T07:24:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:14 crc kubenswrapper[4895]: I1202 07:24:14.385616 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:14 crc kubenswrapper[4895]: I1202 07:24:14.385704 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:14 crc kubenswrapper[4895]: I1202 07:24:14.385725 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:14 crc kubenswrapper[4895]: I1202 07:24:14.385810 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:14 crc kubenswrapper[4895]: I1202 07:24:14.385835 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:14Z","lastTransitionTime":"2025-12-02T07:24:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:14 crc kubenswrapper[4895]: I1202 07:24:14.489175 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:14 crc kubenswrapper[4895]: I1202 07:24:14.489283 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:14 crc kubenswrapper[4895]: I1202 07:24:14.489300 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:14 crc kubenswrapper[4895]: I1202 07:24:14.489346 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:14 crc kubenswrapper[4895]: I1202 07:24:14.489359 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:14Z","lastTransitionTime":"2025-12-02T07:24:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:14 crc kubenswrapper[4895]: I1202 07:24:14.592057 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:14 crc kubenswrapper[4895]: I1202 07:24:14.592129 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:14 crc kubenswrapper[4895]: I1202 07:24:14.592140 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:14 crc kubenswrapper[4895]: I1202 07:24:14.592158 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:14 crc kubenswrapper[4895]: I1202 07:24:14.592170 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:14Z","lastTransitionTime":"2025-12-02T07:24:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:14 crc kubenswrapper[4895]: I1202 07:24:14.695182 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:14 crc kubenswrapper[4895]: I1202 07:24:14.695232 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:14 crc kubenswrapper[4895]: I1202 07:24:14.695242 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:14 crc kubenswrapper[4895]: I1202 07:24:14.695262 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:14 crc kubenswrapper[4895]: I1202 07:24:14.695273 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:14Z","lastTransitionTime":"2025-12-02T07:24:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:14 crc kubenswrapper[4895]: I1202 07:24:14.797926 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:14 crc kubenswrapper[4895]: I1202 07:24:14.797976 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:14 crc kubenswrapper[4895]: I1202 07:24:14.797988 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:14 crc kubenswrapper[4895]: I1202 07:24:14.798009 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:14 crc kubenswrapper[4895]: I1202 07:24:14.798022 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:14Z","lastTransitionTime":"2025-12-02T07:24:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:14 crc kubenswrapper[4895]: I1202 07:24:14.900015 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:14 crc kubenswrapper[4895]: I1202 07:24:14.900068 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:14 crc kubenswrapper[4895]: I1202 07:24:14.900078 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:14 crc kubenswrapper[4895]: I1202 07:24:14.900097 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:14 crc kubenswrapper[4895]: I1202 07:24:14.900108 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:14Z","lastTransitionTime":"2025-12-02T07:24:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:15 crc kubenswrapper[4895]: I1202 07:24:15.002999 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:15 crc kubenswrapper[4895]: I1202 07:24:15.003049 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:15 crc kubenswrapper[4895]: I1202 07:24:15.003099 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:15 crc kubenswrapper[4895]: I1202 07:24:15.003122 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:15 crc kubenswrapper[4895]: I1202 07:24:15.003134 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:15Z","lastTransitionTime":"2025-12-02T07:24:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:15 crc kubenswrapper[4895]: I1202 07:24:15.106127 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:15 crc kubenswrapper[4895]: I1202 07:24:15.106184 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:15 crc kubenswrapper[4895]: I1202 07:24:15.106201 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:15 crc kubenswrapper[4895]: I1202 07:24:15.106227 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:15 crc kubenswrapper[4895]: I1202 07:24:15.106244 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:15Z","lastTransitionTime":"2025-12-02T07:24:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:24:15 crc kubenswrapper[4895]: I1202 07:24:15.140503 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:24:15 crc kubenswrapper[4895]: I1202 07:24:15.140485 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:24:15 crc kubenswrapper[4895]: E1202 07:24:15.140659 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 07:24:15 crc kubenswrapper[4895]: E1202 07:24:15.140977 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 07:24:15 crc kubenswrapper[4895]: I1202 07:24:15.209961 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:15 crc kubenswrapper[4895]: I1202 07:24:15.210028 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:15 crc kubenswrapper[4895]: I1202 07:24:15.210051 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:15 crc kubenswrapper[4895]: I1202 07:24:15.210086 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:15 crc kubenswrapper[4895]: I1202 07:24:15.210109 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:15Z","lastTransitionTime":"2025-12-02T07:24:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:15 crc kubenswrapper[4895]: I1202 07:24:15.313092 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:15 crc kubenswrapper[4895]: I1202 07:24:15.313145 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:15 crc kubenswrapper[4895]: I1202 07:24:15.313159 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:15 crc kubenswrapper[4895]: I1202 07:24:15.313179 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:15 crc kubenswrapper[4895]: I1202 07:24:15.313192 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:15Z","lastTransitionTime":"2025-12-02T07:24:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:15 crc kubenswrapper[4895]: I1202 07:24:15.416560 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:15 crc kubenswrapper[4895]: I1202 07:24:15.416616 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:15 crc kubenswrapper[4895]: I1202 07:24:15.416634 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:15 crc kubenswrapper[4895]: I1202 07:24:15.416660 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:15 crc kubenswrapper[4895]: I1202 07:24:15.416678 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:15Z","lastTransitionTime":"2025-12-02T07:24:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:15 crc kubenswrapper[4895]: I1202 07:24:15.520282 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:15 crc kubenswrapper[4895]: I1202 07:24:15.520327 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:15 crc kubenswrapper[4895]: I1202 07:24:15.520337 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:15 crc kubenswrapper[4895]: I1202 07:24:15.520358 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:15 crc kubenswrapper[4895]: I1202 07:24:15.520370 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:15Z","lastTransitionTime":"2025-12-02T07:24:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:15 crc kubenswrapper[4895]: I1202 07:24:15.623046 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:15 crc kubenswrapper[4895]: I1202 07:24:15.623088 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:15 crc kubenswrapper[4895]: I1202 07:24:15.623100 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:15 crc kubenswrapper[4895]: I1202 07:24:15.623121 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:15 crc kubenswrapper[4895]: I1202 07:24:15.623133 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:15Z","lastTransitionTime":"2025-12-02T07:24:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:15 crc kubenswrapper[4895]: I1202 07:24:15.725679 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:15 crc kubenswrapper[4895]: I1202 07:24:15.725734 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:15 crc kubenswrapper[4895]: I1202 07:24:15.725786 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:15 crc kubenswrapper[4895]: I1202 07:24:15.725811 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:15 crc kubenswrapper[4895]: I1202 07:24:15.725827 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:15Z","lastTransitionTime":"2025-12-02T07:24:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:15 crc kubenswrapper[4895]: I1202 07:24:15.828590 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:15 crc kubenswrapper[4895]: I1202 07:24:15.828657 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:15 crc kubenswrapper[4895]: I1202 07:24:15.828676 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:15 crc kubenswrapper[4895]: I1202 07:24:15.828706 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:15 crc kubenswrapper[4895]: I1202 07:24:15.828727 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:15Z","lastTransitionTime":"2025-12-02T07:24:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:15 crc kubenswrapper[4895]: I1202 07:24:15.931164 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:15 crc kubenswrapper[4895]: I1202 07:24:15.931230 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:15 crc kubenswrapper[4895]: I1202 07:24:15.931249 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:15 crc kubenswrapper[4895]: I1202 07:24:15.931278 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:15 crc kubenswrapper[4895]: I1202 07:24:15.931296 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:15Z","lastTransitionTime":"2025-12-02T07:24:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:16 crc kubenswrapper[4895]: I1202 07:24:16.034759 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:16 crc kubenswrapper[4895]: I1202 07:24:16.034807 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:16 crc kubenswrapper[4895]: I1202 07:24:16.034820 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:16 crc kubenswrapper[4895]: I1202 07:24:16.034839 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:16 crc kubenswrapper[4895]: I1202 07:24:16.034851 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:16Z","lastTransitionTime":"2025-12-02T07:24:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:16 crc kubenswrapper[4895]: I1202 07:24:16.137477 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:16 crc kubenswrapper[4895]: I1202 07:24:16.137517 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:16 crc kubenswrapper[4895]: I1202 07:24:16.137529 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:16 crc kubenswrapper[4895]: I1202 07:24:16.137546 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:16 crc kubenswrapper[4895]: I1202 07:24:16.137559 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:16Z","lastTransitionTime":"2025-12-02T07:24:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:24:16 crc kubenswrapper[4895]: I1202 07:24:16.141102 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:24:16 crc kubenswrapper[4895]: I1202 07:24:16.141152 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5f88v" Dec 02 07:24:16 crc kubenswrapper[4895]: E1202 07:24:16.141318 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 07:24:16 crc kubenswrapper[4895]: E1202 07:24:16.141517 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5f88v" podUID="5af25091-1401-45d4-ae53-d2b469c879da" Dec 02 07:24:16 crc kubenswrapper[4895]: I1202 07:24:16.241163 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:16 crc kubenswrapper[4895]: I1202 07:24:16.241206 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:16 crc kubenswrapper[4895]: I1202 07:24:16.241217 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:16 crc kubenswrapper[4895]: I1202 07:24:16.241235 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:16 crc kubenswrapper[4895]: I1202 07:24:16.241246 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:16Z","lastTransitionTime":"2025-12-02T07:24:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:16 crc kubenswrapper[4895]: I1202 07:24:16.344359 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:16 crc kubenswrapper[4895]: I1202 07:24:16.344409 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:16 crc kubenswrapper[4895]: I1202 07:24:16.344420 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:16 crc kubenswrapper[4895]: I1202 07:24:16.344439 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:16 crc kubenswrapper[4895]: I1202 07:24:16.344450 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:16Z","lastTransitionTime":"2025-12-02T07:24:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:16 crc kubenswrapper[4895]: I1202 07:24:16.447030 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:16 crc kubenswrapper[4895]: I1202 07:24:16.447075 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:16 crc kubenswrapper[4895]: I1202 07:24:16.447086 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:16 crc kubenswrapper[4895]: I1202 07:24:16.447103 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:16 crc kubenswrapper[4895]: I1202 07:24:16.447114 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:16Z","lastTransitionTime":"2025-12-02T07:24:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:16 crc kubenswrapper[4895]: I1202 07:24:16.550618 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:16 crc kubenswrapper[4895]: I1202 07:24:16.550709 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:16 crc kubenswrapper[4895]: I1202 07:24:16.550735 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:16 crc kubenswrapper[4895]: I1202 07:24:16.550820 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:16 crc kubenswrapper[4895]: I1202 07:24:16.550859 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:16Z","lastTransitionTime":"2025-12-02T07:24:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:16 crc kubenswrapper[4895]: I1202 07:24:16.654397 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:16 crc kubenswrapper[4895]: I1202 07:24:16.654461 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:16 crc kubenswrapper[4895]: I1202 07:24:16.654478 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:16 crc kubenswrapper[4895]: I1202 07:24:16.654505 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:16 crc kubenswrapper[4895]: I1202 07:24:16.654523 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:16Z","lastTransitionTime":"2025-12-02T07:24:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:16 crc kubenswrapper[4895]: I1202 07:24:16.757710 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:16 crc kubenswrapper[4895]: I1202 07:24:16.757878 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:16 crc kubenswrapper[4895]: I1202 07:24:16.757902 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:16 crc kubenswrapper[4895]: I1202 07:24:16.757935 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:16 crc kubenswrapper[4895]: I1202 07:24:16.757955 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:16Z","lastTransitionTime":"2025-12-02T07:24:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:16 crc kubenswrapper[4895]: I1202 07:24:16.860220 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:16 crc kubenswrapper[4895]: I1202 07:24:16.860279 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:16 crc kubenswrapper[4895]: I1202 07:24:16.860294 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:16 crc kubenswrapper[4895]: I1202 07:24:16.860315 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:16 crc kubenswrapper[4895]: I1202 07:24:16.860330 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:16Z","lastTransitionTime":"2025-12-02T07:24:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:16 crc kubenswrapper[4895]: I1202 07:24:16.962860 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:16 crc kubenswrapper[4895]: I1202 07:24:16.962943 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:16 crc kubenswrapper[4895]: I1202 07:24:16.962967 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:16 crc kubenswrapper[4895]: I1202 07:24:16.962997 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:16 crc kubenswrapper[4895]: I1202 07:24:16.963016 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:16Z","lastTransitionTime":"2025-12-02T07:24:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:17 crc kubenswrapper[4895]: I1202 07:24:17.065778 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:17 crc kubenswrapper[4895]: I1202 07:24:17.065821 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:17 crc kubenswrapper[4895]: I1202 07:24:17.065832 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:17 crc kubenswrapper[4895]: I1202 07:24:17.065868 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:17 crc kubenswrapper[4895]: I1202 07:24:17.065880 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:17Z","lastTransitionTime":"2025-12-02T07:24:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:24:17 crc kubenswrapper[4895]: I1202 07:24:17.140391 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:24:17 crc kubenswrapper[4895]: I1202 07:24:17.140443 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:24:17 crc kubenswrapper[4895]: E1202 07:24:17.140543 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 07:24:17 crc kubenswrapper[4895]: E1202 07:24:17.140716 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 07:24:17 crc kubenswrapper[4895]: I1202 07:24:17.168986 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:17 crc kubenswrapper[4895]: I1202 07:24:17.169063 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:17 crc kubenswrapper[4895]: I1202 07:24:17.169082 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:17 crc kubenswrapper[4895]: I1202 07:24:17.169109 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:17 crc kubenswrapper[4895]: I1202 07:24:17.169129 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:17Z","lastTransitionTime":"2025-12-02T07:24:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:17 crc kubenswrapper[4895]: I1202 07:24:17.272279 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:17 crc kubenswrapper[4895]: I1202 07:24:17.272376 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:17 crc kubenswrapper[4895]: I1202 07:24:17.272400 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:17 crc kubenswrapper[4895]: I1202 07:24:17.272432 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:17 crc kubenswrapper[4895]: I1202 07:24:17.272457 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:17Z","lastTransitionTime":"2025-12-02T07:24:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:17 crc kubenswrapper[4895]: I1202 07:24:17.359679 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:17 crc kubenswrapper[4895]: I1202 07:24:17.359761 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:17 crc kubenswrapper[4895]: I1202 07:24:17.359780 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:17 crc kubenswrapper[4895]: I1202 07:24:17.359806 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:17 crc kubenswrapper[4895]: I1202 07:24:17.359826 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:17Z","lastTransitionTime":"2025-12-02T07:24:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:17 crc kubenswrapper[4895]: E1202 07:24:17.380411 4895 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:24:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:24:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:24:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:24:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:24:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:24:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:24:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:24:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"42683c5b-b2bf-439b-8ee4-25c8d72cfed1\\\",\\\"systemUUID\\\":\\\"92c636b1-dcb0-457f-b098-73baeaac297e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:17Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:17 crc kubenswrapper[4895]: I1202 07:24:17.384362 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:17 crc kubenswrapper[4895]: I1202 07:24:17.384403 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:17 crc kubenswrapper[4895]: I1202 07:24:17.384418 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:17 crc kubenswrapper[4895]: I1202 07:24:17.384438 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:17 crc kubenswrapper[4895]: I1202 07:24:17.384450 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:17Z","lastTransitionTime":"2025-12-02T07:24:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:17 crc kubenswrapper[4895]: E1202 07:24:17.395853 4895 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:24:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:24:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:24:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:24:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:24:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:24:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:24:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:24:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"42683c5b-b2bf-439b-8ee4-25c8d72cfed1\\\",\\\"systemUUID\\\":\\\"92c636b1-dcb0-457f-b098-73baeaac297e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:17Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:17 crc kubenswrapper[4895]: I1202 07:24:17.399864 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:17 crc kubenswrapper[4895]: I1202 07:24:17.399904 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:17 crc kubenswrapper[4895]: I1202 07:24:17.399921 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:17 crc kubenswrapper[4895]: I1202 07:24:17.399944 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:17 crc kubenswrapper[4895]: I1202 07:24:17.399958 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:17Z","lastTransitionTime":"2025-12-02T07:24:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:17 crc kubenswrapper[4895]: E1202 07:24:17.411757 4895 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:24:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:24:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:24:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:24:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:24:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:24:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:24:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:24:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"42683c5b-b2bf-439b-8ee4-25c8d72cfed1\\\",\\\"systemUUID\\\":\\\"92c636b1-dcb0-457f-b098-73baeaac297e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:17Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:17 crc kubenswrapper[4895]: I1202 07:24:17.414826 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:17 crc kubenswrapper[4895]: I1202 07:24:17.414870 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:17 crc kubenswrapper[4895]: I1202 07:24:17.414885 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:17 crc kubenswrapper[4895]: I1202 07:24:17.414909 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:17 crc kubenswrapper[4895]: I1202 07:24:17.414922 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:17Z","lastTransitionTime":"2025-12-02T07:24:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:17 crc kubenswrapper[4895]: E1202 07:24:17.426647 4895 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:24:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:24:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:24:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:24:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:24:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:24:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:24:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:24:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"42683c5b-b2bf-439b-8ee4-25c8d72cfed1\\\",\\\"systemUUID\\\":\\\"92c636b1-dcb0-457f-b098-73baeaac297e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:17Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:17 crc kubenswrapper[4895]: I1202 07:24:17.431513 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:17 crc kubenswrapper[4895]: I1202 07:24:17.431559 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:17 crc kubenswrapper[4895]: I1202 07:24:17.431569 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:17 crc kubenswrapper[4895]: I1202 07:24:17.431586 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:17 crc kubenswrapper[4895]: I1202 07:24:17.431597 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:17Z","lastTransitionTime":"2025-12-02T07:24:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:17 crc kubenswrapper[4895]: E1202 07:24:17.445762 4895 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:24:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:24:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:24:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:24:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:24:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:24:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:24:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:24:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"42683c5b-b2bf-439b-8ee4-25c8d72cfed1\\\",\\\"systemUUID\\\":\\\"92c636b1-dcb0-457f-b098-73baeaac297e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:17Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:17 crc kubenswrapper[4895]: E1202 07:24:17.445961 4895 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 02 07:24:17 crc kubenswrapper[4895]: I1202 07:24:17.447717 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:17 crc kubenswrapper[4895]: I1202 07:24:17.447772 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:17 crc kubenswrapper[4895]: I1202 07:24:17.447788 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:17 crc kubenswrapper[4895]: I1202 07:24:17.447806 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:17 crc kubenswrapper[4895]: I1202 07:24:17.447818 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:17Z","lastTransitionTime":"2025-12-02T07:24:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:17 crc kubenswrapper[4895]: I1202 07:24:17.550698 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:17 crc kubenswrapper[4895]: I1202 07:24:17.550762 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:17 crc kubenswrapper[4895]: I1202 07:24:17.550775 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:17 crc kubenswrapper[4895]: I1202 07:24:17.550792 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:17 crc kubenswrapper[4895]: I1202 07:24:17.550802 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:17Z","lastTransitionTime":"2025-12-02T07:24:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:17 crc kubenswrapper[4895]: I1202 07:24:17.653553 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:17 crc kubenswrapper[4895]: I1202 07:24:17.653599 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:17 crc kubenswrapper[4895]: I1202 07:24:17.653609 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:17 crc kubenswrapper[4895]: I1202 07:24:17.653626 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:17 crc kubenswrapper[4895]: I1202 07:24:17.653638 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:17Z","lastTransitionTime":"2025-12-02T07:24:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:17 crc kubenswrapper[4895]: I1202 07:24:17.757084 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:17 crc kubenswrapper[4895]: I1202 07:24:17.757154 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:17 crc kubenswrapper[4895]: I1202 07:24:17.757165 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:17 crc kubenswrapper[4895]: I1202 07:24:17.757182 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:17 crc kubenswrapper[4895]: I1202 07:24:17.757191 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:17Z","lastTransitionTime":"2025-12-02T07:24:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:17 crc kubenswrapper[4895]: I1202 07:24:17.859816 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:17 crc kubenswrapper[4895]: I1202 07:24:17.859944 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:17 crc kubenswrapper[4895]: I1202 07:24:17.859968 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:17 crc kubenswrapper[4895]: I1202 07:24:17.860000 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:17 crc kubenswrapper[4895]: I1202 07:24:17.860022 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:17Z","lastTransitionTime":"2025-12-02T07:24:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:17 crc kubenswrapper[4895]: I1202 07:24:17.962243 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:17 crc kubenswrapper[4895]: I1202 07:24:17.962289 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:17 crc kubenswrapper[4895]: I1202 07:24:17.962300 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:17 crc kubenswrapper[4895]: I1202 07:24:17.962318 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:17 crc kubenswrapper[4895]: I1202 07:24:17.962330 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:17Z","lastTransitionTime":"2025-12-02T07:24:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:18 crc kubenswrapper[4895]: I1202 07:24:18.065382 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:18 crc kubenswrapper[4895]: I1202 07:24:18.065424 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:18 crc kubenswrapper[4895]: I1202 07:24:18.065432 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:18 crc kubenswrapper[4895]: I1202 07:24:18.065451 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:18 crc kubenswrapper[4895]: I1202 07:24:18.065461 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:18Z","lastTransitionTime":"2025-12-02T07:24:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:24:18 crc kubenswrapper[4895]: I1202 07:24:18.140460 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5f88v" Dec 02 07:24:18 crc kubenswrapper[4895]: I1202 07:24:18.140458 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:24:18 crc kubenswrapper[4895]: E1202 07:24:18.140619 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5f88v" podUID="5af25091-1401-45d4-ae53-d2b469c879da" Dec 02 07:24:18 crc kubenswrapper[4895]: E1202 07:24:18.140794 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 07:24:18 crc kubenswrapper[4895]: I1202 07:24:18.167835 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:18 crc kubenswrapper[4895]: I1202 07:24:18.167870 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:18 crc kubenswrapper[4895]: I1202 07:24:18.167882 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:18 crc kubenswrapper[4895]: I1202 07:24:18.167900 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:18 crc kubenswrapper[4895]: I1202 07:24:18.167911 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:18Z","lastTransitionTime":"2025-12-02T07:24:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:18 crc kubenswrapper[4895]: I1202 07:24:18.270483 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:18 crc kubenswrapper[4895]: I1202 07:24:18.270527 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:18 crc kubenswrapper[4895]: I1202 07:24:18.270538 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:18 crc kubenswrapper[4895]: I1202 07:24:18.270554 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:18 crc kubenswrapper[4895]: I1202 07:24:18.270566 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:18Z","lastTransitionTime":"2025-12-02T07:24:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:18 crc kubenswrapper[4895]: I1202 07:24:18.373303 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:18 crc kubenswrapper[4895]: I1202 07:24:18.373349 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:18 crc kubenswrapper[4895]: I1202 07:24:18.373359 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:18 crc kubenswrapper[4895]: I1202 07:24:18.373376 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:18 crc kubenswrapper[4895]: I1202 07:24:18.373388 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:18Z","lastTransitionTime":"2025-12-02T07:24:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:18 crc kubenswrapper[4895]: I1202 07:24:18.474909 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:18 crc kubenswrapper[4895]: I1202 07:24:18.474946 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:18 crc kubenswrapper[4895]: I1202 07:24:18.474956 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:18 crc kubenswrapper[4895]: I1202 07:24:18.474972 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:18 crc kubenswrapper[4895]: I1202 07:24:18.474984 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:18Z","lastTransitionTime":"2025-12-02T07:24:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:18 crc kubenswrapper[4895]: I1202 07:24:18.577798 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:18 crc kubenswrapper[4895]: I1202 07:24:18.577846 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:18 crc kubenswrapper[4895]: I1202 07:24:18.577855 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:18 crc kubenswrapper[4895]: I1202 07:24:18.577873 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:18 crc kubenswrapper[4895]: I1202 07:24:18.577885 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:18Z","lastTransitionTime":"2025-12-02T07:24:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:18 crc kubenswrapper[4895]: I1202 07:24:18.680207 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:18 crc kubenswrapper[4895]: I1202 07:24:18.680261 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:18 crc kubenswrapper[4895]: I1202 07:24:18.680271 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:18 crc kubenswrapper[4895]: I1202 07:24:18.680290 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:18 crc kubenswrapper[4895]: I1202 07:24:18.680302 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:18Z","lastTransitionTime":"2025-12-02T07:24:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:18 crc kubenswrapper[4895]: I1202 07:24:18.783060 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:18 crc kubenswrapper[4895]: I1202 07:24:18.783131 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:18 crc kubenswrapper[4895]: I1202 07:24:18.783147 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:18 crc kubenswrapper[4895]: I1202 07:24:18.783169 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:18 crc kubenswrapper[4895]: I1202 07:24:18.783182 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:18Z","lastTransitionTime":"2025-12-02T07:24:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:18 crc kubenswrapper[4895]: I1202 07:24:18.886466 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:18 crc kubenswrapper[4895]: I1202 07:24:18.886532 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:18 crc kubenswrapper[4895]: I1202 07:24:18.886551 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:18 crc kubenswrapper[4895]: I1202 07:24:18.886580 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:18 crc kubenswrapper[4895]: I1202 07:24:18.886602 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:18Z","lastTransitionTime":"2025-12-02T07:24:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:18 crc kubenswrapper[4895]: I1202 07:24:18.989347 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:18 crc kubenswrapper[4895]: I1202 07:24:18.989380 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:18 crc kubenswrapper[4895]: I1202 07:24:18.989387 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:18 crc kubenswrapper[4895]: I1202 07:24:18.989401 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:18 crc kubenswrapper[4895]: I1202 07:24:18.989410 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:18Z","lastTransitionTime":"2025-12-02T07:24:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:19 crc kubenswrapper[4895]: I1202 07:24:19.092487 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:19 crc kubenswrapper[4895]: I1202 07:24:19.092532 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:19 crc kubenswrapper[4895]: I1202 07:24:19.092543 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:19 crc kubenswrapper[4895]: I1202 07:24:19.092559 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:19 crc kubenswrapper[4895]: I1202 07:24:19.092571 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:19Z","lastTransitionTime":"2025-12-02T07:24:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:24:19 crc kubenswrapper[4895]: I1202 07:24:19.140767 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:24:19 crc kubenswrapper[4895]: E1202 07:24:19.140884 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 07:24:19 crc kubenswrapper[4895]: I1202 07:24:19.141119 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:24:19 crc kubenswrapper[4895]: E1202 07:24:19.141183 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 07:24:19 crc kubenswrapper[4895]: I1202 07:24:19.157855 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n7xcr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3514f381-d0d1-4e00-931e-c5ca75202a1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2670ef1ea22a6c1c5d9ec4ee4b0345c575b20224fddf8655b95eaa431573d071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3deeb37702c8243a956f2be1bb61e2693bb9be5453d775bc3f3e85618d483015\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3deeb37702c8243a956f2be1bb61e2693bb9be5453d775bc3f3e85618d483015\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2c60eddf285bbbf1a5957691f56d895e5bb7461fb18ab7c495879357cfae565\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2c60eddf285bbbf1a5957691f56d895e5bb7461fb18ab7c495879357cfae565\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e8b39d38597192393d90cc582b8ad109a1ed22c4c14f539e1a5445e6d817872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e8b39d38597192393d90cc582b8ad109a1ed22c4c14f539e1a5445e6d817872\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb5ab
c8da49059e23c4013f3856d3f4c9078cd8871a79f8dae5ce4d8cb319ef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb5abc8da49059e23c4013f3856d3f4c9078cd8871a79f8dae5ce4d8cb319ef9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbdd3e8943b356a485d900737682f0903dc04aefa33498ee5cdf3c82b6570852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbdd3e8943b356a485d900737682f0903dc04aefa33498ee5cdf3c82b6570852\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:43Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3641d1f7fd2ea062c216ce260c35d903273d3039fe790815ebb697f8c11d15cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3641d1f7fd2ea062c216ce260c35d903273d3039fe790815ebb697f8c11d15cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n7xcr\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:19Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:19 crc kubenswrapper[4895]: I1202 07:24:19.184151 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hlxqt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"30911fe5-208f-44e8-a380-2a0093f24863\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87ebbf4f529d4e7b64c392912d06e57dc80a2237a9a4e31ab5b56ac556aa10b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-
12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6l8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hlxqt\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:19Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:19 crc kubenswrapper[4895]: I1202 07:24:19.195323 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:19 crc kubenswrapper[4895]: I1202 07:24:19.195382 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:19 crc kubenswrapper[4895]: I1202 07:24:19.195396 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:19 crc kubenswrapper[4895]: I1202 07:24:19.195422 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:19 crc kubenswrapper[4895]: I1202 07:24:19.195435 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:19Z","lastTransitionTime":"2025-12-02T07:24:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:19 crc kubenswrapper[4895]: I1202 07:24:19.196838 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5f88v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5af25091-1401-45d4-ae53-d2b469c879da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wfrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wfrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5f88v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:19Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:19 crc 
kubenswrapper[4895]: I1202 07:24:19.216796 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4604b0d5-cf8f-404d-af67-65da4b1dc003\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b3c366f0f27ff793a37b2944887ab1c1bc155c14d5c8d6c267442b0a8666ef8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6c3baa5b3772a8fc64774e28c3f389f114723e7efa62b934140a4fed8b6a40d\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49d7c12ea2cd199dcfa46a6f1a1a856f1285a709def9d82b6805ba256a79fd5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2543d364f902a6c34a82dac25f389cd36afc2f717d772d0a487a640d44adf30f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:19Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:19 crc kubenswrapper[4895]: I1202 07:24:19.229841 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://444ad02c4afa919779de59fd8c81bdce5c3426dd6451b90505921a8d5fa0c96b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-02T07:24:19Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:19 crc kubenswrapper[4895]: I1202 07:24:19.248511 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:19Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:19 crc kubenswrapper[4895]: I1202 07:24:19.264362 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wxp5p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a0b438f-70fd-4c3f-a170-2b3fe9797955\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7329cf1034b2d51ebb7f978448558a8b6fa39d58823289c470fddcfed262e395\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj6l9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cbc34b4d78cc00ac79b9944d92c6ac61a9d5
dd704f36a473950baf53dfcbf47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj6l9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wxp5p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:19Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:19 crc kubenswrapper[4895]: I1202 07:24:19.276093 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2df36ca5-8a5e-4865-8728-b8567297546a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f076cfc3bd5e782332e827d56325b73cf2a9f35248f397338b5130793481af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de2bad51e8d51f1243a2fe7799554c7f089425e3623a6bb62f7393610d70fc4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b737e88ad2a8caf70b89f59db461d355846d5612a29a249107b49fbec176204\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9d30ec3b0f73c029956f058f1d8f06cfffb5533dcd0c221ab6842d277f274a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://c9d30ec3b0f73c029956f058f1d8f06cfffb5533dcd0c221ab6842d277f274a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:09Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:09Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:19Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:19 crc kubenswrapper[4895]: I1202 07:24:19.292977 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:19Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:19 crc kubenswrapper[4895]: I1202 07:24:19.297751 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:19 crc kubenswrapper[4895]: I1202 07:24:19.297777 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:19 crc kubenswrapper[4895]: I1202 07:24:19.297787 4895 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:19 crc kubenswrapper[4895]: I1202 07:24:19.297810 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:19 crc kubenswrapper[4895]: I1202 07:24:19.297821 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:19Z","lastTransitionTime":"2025-12-02T07:24:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:24:19 crc kubenswrapper[4895]: I1202 07:24:19.305182 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lhpbd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"81128292-8c02-45bd-9b25-e9457e989975\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cc63751b6428c88deed36b914b2a95a94
a50b544c562126c2a9567c177b2570\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2z5n2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lhpbd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:19Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:19 crc kubenswrapper[4895]: I1202 07:24:19.332513 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"afc3334a-0153-4dcc-9a56-92f6cae51c08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91b0c6e50085a7be0924e5a54ef006110a7f13556fe47f7f55124fd6b2f0bc36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5463b077f0bc8b5d38e27854ece6a71f74f7769f4552c32eefe16854993f290f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80dca705ddb94ba799eb72bfc23b0c3878cceca48d6d10f1c7f8eb0e4beed40e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f54ae324450e73bcbf2c408794171a92bbb66c24d05f29584ac2c247090c1273\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f330d728eecd672ecb554744a51ebfb2c104441d3a85fe0c8a6fc3446c37e4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca732cac495c410d89da4ce1476acd20905716cbb2baeedac9c52b654ff58573\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff016284272b93427bb0487a4661973b731200b8b4d28ab1c6d2d512bf5742e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff016284272b93427bb0487a4661973b731200b8b4d28ab1c6d2d512bf5742e9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T07:24:03Z\\\",\\\"message\\\":\\\"s.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1202 07:24:03.168657 6529 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1202 07:24:03.168679 6529 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1202 07:24:03.168706 6529 handler.go:208] Removed *v1.Pod event handler 
6\\\\nI1202 07:24:03.168812 6529 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1202 07:24:03.168811 6529 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1202 07:24:03.168829 6529 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1202 07:24:03.168889 6529 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1202 07:24:03.168921 6529 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1202 07:24:03.168943 6529 factory.go:656] Stopping watch factory\\\\nI1202 07:24:03.168956 6529 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1202 07:24:03.168969 6529 handler.go:208] Removed *v1.Node event handler 2\\\\nI1202 07:24:03.168979 6529 handler.go:208] Removed *v1.Node event handler 7\\\\nI1202 07:24:03.168988 6529 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1202 07:24:03.169421 6529 ovnkube.go:599] Stopped ovnkube\\\\nI1202 07:24:03.169503 6529 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1202 07:24:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:24:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-w54m4_openshift-ovn-kubernetes(afc3334a-0153-4dcc-9a56-92f6cae51c08)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6447684fe6189e38c9caad6dfe1384b7af01d50e2ef5f889c7b2874ba092306\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://328fa773c92ef0a965065412d4141c0b41ffe38d21ed169386a9d30d05f37f28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://328fa773c92ef0a965
065412d4141c0b41ffe38d21ed169386a9d30d05f37f28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w54m4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:19Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:19 crc kubenswrapper[4895]: I1202 07:24:19.355803 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2db13abc9ba56d955814e75ceea54b92610a00daeefa29aa1aad584e3453728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:19Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:19 crc kubenswrapper[4895]: I1202 07:24:19.374622 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bb4bea68942dbface0009fbac76f7b1253ab282e8decb9081225de7d6537ec0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://3115f2c50d05ee406f583937e39b1b55f364ce7cbeb0788b75941a22e23f57f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:19Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:19 crc kubenswrapper[4895]: I1202 07:24:19.389780 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-74fkh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b212bd8-f1a7-4982-b2e6-499c13a34b0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4d58e6f37ee8f90af6aad88138c39e9a0753c16cceb763c2bd2ec9c30087d92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhpwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-74fkh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:19Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:19 crc kubenswrapper[4895]: I1202 07:24:19.400634 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:19 crc kubenswrapper[4895]: I1202 07:24:19.400690 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:19 crc kubenswrapper[4895]: I1202 07:24:19.400704 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:19 crc kubenswrapper[4895]: I1202 07:24:19.400727 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:19 crc kubenswrapper[4895]: I1202 07:24:19.400765 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:19Z","lastTransitionTime":"2025-12-02T07:24:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:19 crc kubenswrapper[4895]: I1202 07:24:19.408161 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0468d2d1-a975-45a6-af9f-47adc432fab0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebf8583b59d7955530da7a83a29f9994ba139c076cf7b76dc94e88d14f39e1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vfwjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1474e8f5989607f3b95fefe811e3d787320f04c223edb6ffe47d2b37863ea874\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vfwjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wfcg7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:19Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:19 crc kubenswrapper[4895]: I1202 07:24:19.426054 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b7fa1ad-7e38-4645-ab41-c7d395f5c10e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40e020a2b4caf39d1661cee9c6b0055eca86e0ae9b81d2da7485ca9ec92dd0aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33e300332628d3baf9041b52dc7b8cd9dab1e9c5da76572e86b787433fd5fb92\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4455fec2a817e7266e152d3f4afbede483504d6066f1a0f7375bce282ae10f8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b674e20b2199fd0083aed9cf001d27a72d31ca6c62cbe376c9510cc94f37d312\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9998dffc20c5b0579e6214d10541b9702e9c4a6563c930a41e49b6853b832ba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T07:23:32Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 07:23:21.537254 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 07:23:21.538901 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1012964219/tls.crt::/tmp/serving-cert-1012964219/tls.key\\\\\\\"\\\\nI1202 07:23:32.848375 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 07:23:32.853597 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 07:23:32.853643 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 07:23:32.853680 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 07:23:32.853691 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 07:23:32.869490 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 07:23:32.869536 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:23:32.869542 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:23:32.869555 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 07:23:32.869559 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 07:23:32.869564 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 07:23:32.869568 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 07:23:32.869624 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1202 07:23:32.874299 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d318f71ad32dd4a410093b85741038e468d540edd30f112234a9d595bc6fd85f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0faa0995131a1f9cbf2aa0cf7651d510fad1bdaf7e32de51f8a14b70f161e783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0faa0995131a1f9cbf2aa0cf7651d510f
ad1bdaf7e32de51f8a14b70f161e783\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:19Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:19 crc kubenswrapper[4895]: I1202 07:24:19.440770 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:19Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:19 crc kubenswrapper[4895]: I1202 07:24:19.504215 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:19 crc kubenswrapper[4895]: I1202 07:24:19.504281 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:19 crc kubenswrapper[4895]: I1202 07:24:19.504295 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:19 crc 
kubenswrapper[4895]: I1202 07:24:19.504315 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:19 crc kubenswrapper[4895]: I1202 07:24:19.504326 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:19Z","lastTransitionTime":"2025-12-02T07:24:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:24:19 crc kubenswrapper[4895]: I1202 07:24:19.607002 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:19 crc kubenswrapper[4895]: I1202 07:24:19.607060 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:19 crc kubenswrapper[4895]: I1202 07:24:19.607075 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:19 crc kubenswrapper[4895]: I1202 07:24:19.607098 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:19 crc kubenswrapper[4895]: I1202 07:24:19.607111 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:19Z","lastTransitionTime":"2025-12-02T07:24:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:19 crc kubenswrapper[4895]: I1202 07:24:19.711274 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:19 crc kubenswrapper[4895]: I1202 07:24:19.711349 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:19 crc kubenswrapper[4895]: I1202 07:24:19.711371 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:19 crc kubenswrapper[4895]: I1202 07:24:19.711402 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:19 crc kubenswrapper[4895]: I1202 07:24:19.711425 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:19Z","lastTransitionTime":"2025-12-02T07:24:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:19 crc kubenswrapper[4895]: I1202 07:24:19.814567 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:19 crc kubenswrapper[4895]: I1202 07:24:19.814617 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:19 crc kubenswrapper[4895]: I1202 07:24:19.814628 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:19 crc kubenswrapper[4895]: I1202 07:24:19.814650 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:19 crc kubenswrapper[4895]: I1202 07:24:19.814665 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:19Z","lastTransitionTime":"2025-12-02T07:24:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:19 crc kubenswrapper[4895]: I1202 07:24:19.917259 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:19 crc kubenswrapper[4895]: I1202 07:24:19.917292 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:19 crc kubenswrapper[4895]: I1202 07:24:19.917302 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:19 crc kubenswrapper[4895]: I1202 07:24:19.917319 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:19 crc kubenswrapper[4895]: I1202 07:24:19.917331 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:19Z","lastTransitionTime":"2025-12-02T07:24:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:20 crc kubenswrapper[4895]: I1202 07:24:20.020674 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:20 crc kubenswrapper[4895]: I1202 07:24:20.020777 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:20 crc kubenswrapper[4895]: I1202 07:24:20.020804 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:20 crc kubenswrapper[4895]: I1202 07:24:20.020836 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:20 crc kubenswrapper[4895]: I1202 07:24:20.020856 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:20Z","lastTransitionTime":"2025-12-02T07:24:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:20 crc kubenswrapper[4895]: I1202 07:24:20.123589 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:20 crc kubenswrapper[4895]: I1202 07:24:20.123868 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:20 crc kubenswrapper[4895]: I1202 07:24:20.123956 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:20 crc kubenswrapper[4895]: I1202 07:24:20.124035 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:20 crc kubenswrapper[4895]: I1202 07:24:20.124106 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:20Z","lastTransitionTime":"2025-12-02T07:24:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:24:20 crc kubenswrapper[4895]: I1202 07:24:20.141174 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:24:20 crc kubenswrapper[4895]: I1202 07:24:20.141198 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5f88v" Dec 02 07:24:20 crc kubenswrapper[4895]: E1202 07:24:20.141347 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 07:24:20 crc kubenswrapper[4895]: E1202 07:24:20.142200 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5f88v" podUID="5af25091-1401-45d4-ae53-d2b469c879da" Dec 02 07:24:20 crc kubenswrapper[4895]: I1202 07:24:20.143182 4895 scope.go:117] "RemoveContainer" containerID="ff016284272b93427bb0487a4661973b731200b8b4d28ab1c6d2d512bf5742e9" Dec 02 07:24:20 crc kubenswrapper[4895]: E1202 07:24:20.143492 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-w54m4_openshift-ovn-kubernetes(afc3334a-0153-4dcc-9a56-92f6cae51c08)\"" pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" podUID="afc3334a-0153-4dcc-9a56-92f6cae51c08" Dec 02 07:24:20 crc kubenswrapper[4895]: I1202 07:24:20.227066 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:20 crc kubenswrapper[4895]: I1202 07:24:20.227142 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:20 crc kubenswrapper[4895]: I1202 07:24:20.227167 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:20 crc kubenswrapper[4895]: I1202 07:24:20.227191 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:20 crc kubenswrapper[4895]: I1202 07:24:20.227209 4895 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:20Z","lastTransitionTime":"2025-12-02T07:24:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:24:20 crc kubenswrapper[4895]: I1202 07:24:20.329947 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:20 crc kubenswrapper[4895]: I1202 07:24:20.330322 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:20 crc kubenswrapper[4895]: I1202 07:24:20.330468 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:20 crc kubenswrapper[4895]: I1202 07:24:20.330894 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:20 crc kubenswrapper[4895]: I1202 07:24:20.331045 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:20Z","lastTransitionTime":"2025-12-02T07:24:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:20 crc kubenswrapper[4895]: I1202 07:24:20.433809 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:20 crc kubenswrapper[4895]: I1202 07:24:20.433867 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:20 crc kubenswrapper[4895]: I1202 07:24:20.433883 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:20 crc kubenswrapper[4895]: I1202 07:24:20.433907 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:20 crc kubenswrapper[4895]: I1202 07:24:20.433922 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:20Z","lastTransitionTime":"2025-12-02T07:24:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:20 crc kubenswrapper[4895]: I1202 07:24:20.537022 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:20 crc kubenswrapper[4895]: I1202 07:24:20.537434 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:20 crc kubenswrapper[4895]: I1202 07:24:20.537613 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:20 crc kubenswrapper[4895]: I1202 07:24:20.537861 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:20 crc kubenswrapper[4895]: I1202 07:24:20.538044 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:20Z","lastTransitionTime":"2025-12-02T07:24:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:20 crc kubenswrapper[4895]: I1202 07:24:20.641869 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:20 crc kubenswrapper[4895]: I1202 07:24:20.641943 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:20 crc kubenswrapper[4895]: I1202 07:24:20.641966 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:20 crc kubenswrapper[4895]: I1202 07:24:20.641989 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:20 crc kubenswrapper[4895]: I1202 07:24:20.642002 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:20Z","lastTransitionTime":"2025-12-02T07:24:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:20 crc kubenswrapper[4895]: I1202 07:24:20.744531 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:20 crc kubenswrapper[4895]: I1202 07:24:20.744584 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:20 crc kubenswrapper[4895]: I1202 07:24:20.744600 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:20 crc kubenswrapper[4895]: I1202 07:24:20.744623 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:20 crc kubenswrapper[4895]: I1202 07:24:20.744639 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:20Z","lastTransitionTime":"2025-12-02T07:24:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:20 crc kubenswrapper[4895]: I1202 07:24:20.846629 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:20 crc kubenswrapper[4895]: I1202 07:24:20.846667 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:20 crc kubenswrapper[4895]: I1202 07:24:20.846680 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:20 crc kubenswrapper[4895]: I1202 07:24:20.846700 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:20 crc kubenswrapper[4895]: I1202 07:24:20.846713 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:20Z","lastTransitionTime":"2025-12-02T07:24:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:20 crc kubenswrapper[4895]: I1202 07:24:20.949900 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:20 crc kubenswrapper[4895]: I1202 07:24:20.949986 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:20 crc kubenswrapper[4895]: I1202 07:24:20.950010 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:20 crc kubenswrapper[4895]: I1202 07:24:20.950047 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:20 crc kubenswrapper[4895]: I1202 07:24:20.950069 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:20Z","lastTransitionTime":"2025-12-02T07:24:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:21 crc kubenswrapper[4895]: I1202 07:24:21.052896 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:21 crc kubenswrapper[4895]: I1202 07:24:21.052951 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:21 crc kubenswrapper[4895]: I1202 07:24:21.052969 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:21 crc kubenswrapper[4895]: I1202 07:24:21.052998 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:21 crc kubenswrapper[4895]: I1202 07:24:21.053019 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:21Z","lastTransitionTime":"2025-12-02T07:24:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:24:21 crc kubenswrapper[4895]: I1202 07:24:21.141105 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:24:21 crc kubenswrapper[4895]: I1202 07:24:21.141145 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:24:21 crc kubenswrapper[4895]: E1202 07:24:21.141870 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 07:24:21 crc kubenswrapper[4895]: E1202 07:24:21.142005 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 07:24:21 crc kubenswrapper[4895]: I1202 07:24:21.155197 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:21 crc kubenswrapper[4895]: I1202 07:24:21.155264 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:21 crc kubenswrapper[4895]: I1202 07:24:21.155287 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:21 crc kubenswrapper[4895]: I1202 07:24:21.155315 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:21 crc kubenswrapper[4895]: I1202 07:24:21.155337 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:21Z","lastTransitionTime":"2025-12-02T07:24:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:21 crc kubenswrapper[4895]: I1202 07:24:21.258221 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:21 crc kubenswrapper[4895]: I1202 07:24:21.258526 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:21 crc kubenswrapper[4895]: I1202 07:24:21.258672 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:21 crc kubenswrapper[4895]: I1202 07:24:21.258909 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:21 crc kubenswrapper[4895]: I1202 07:24:21.259061 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:21Z","lastTransitionTime":"2025-12-02T07:24:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:21 crc kubenswrapper[4895]: I1202 07:24:21.361002 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:21 crc kubenswrapper[4895]: I1202 07:24:21.361041 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:21 crc kubenswrapper[4895]: I1202 07:24:21.361050 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:21 crc kubenswrapper[4895]: I1202 07:24:21.361070 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:21 crc kubenswrapper[4895]: I1202 07:24:21.361081 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:21Z","lastTransitionTime":"2025-12-02T07:24:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:21 crc kubenswrapper[4895]: I1202 07:24:21.463151 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:21 crc kubenswrapper[4895]: I1202 07:24:21.463208 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:21 crc kubenswrapper[4895]: I1202 07:24:21.463221 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:21 crc kubenswrapper[4895]: I1202 07:24:21.463243 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:21 crc kubenswrapper[4895]: I1202 07:24:21.463259 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:21Z","lastTransitionTime":"2025-12-02T07:24:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:21 crc kubenswrapper[4895]: I1202 07:24:21.567140 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:21 crc kubenswrapper[4895]: I1202 07:24:21.567194 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:21 crc kubenswrapper[4895]: I1202 07:24:21.567220 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:21 crc kubenswrapper[4895]: I1202 07:24:21.567241 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:21 crc kubenswrapper[4895]: I1202 07:24:21.567253 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:21Z","lastTransitionTime":"2025-12-02T07:24:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:21 crc kubenswrapper[4895]: I1202 07:24:21.669496 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:21 crc kubenswrapper[4895]: I1202 07:24:21.669543 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:21 crc kubenswrapper[4895]: I1202 07:24:21.669552 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:21 crc kubenswrapper[4895]: I1202 07:24:21.669570 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:21 crc kubenswrapper[4895]: I1202 07:24:21.669579 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:21Z","lastTransitionTime":"2025-12-02T07:24:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:21 crc kubenswrapper[4895]: I1202 07:24:21.773002 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:21 crc kubenswrapper[4895]: I1202 07:24:21.773075 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:21 crc kubenswrapper[4895]: I1202 07:24:21.773098 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:21 crc kubenswrapper[4895]: I1202 07:24:21.773128 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:21 crc kubenswrapper[4895]: I1202 07:24:21.773148 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:21Z","lastTransitionTime":"2025-12-02T07:24:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:21 crc kubenswrapper[4895]: I1202 07:24:21.875975 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:21 crc kubenswrapper[4895]: I1202 07:24:21.876046 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:21 crc kubenswrapper[4895]: I1202 07:24:21.876059 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:21 crc kubenswrapper[4895]: I1202 07:24:21.876087 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:21 crc kubenswrapper[4895]: I1202 07:24:21.876105 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:21Z","lastTransitionTime":"2025-12-02T07:24:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:21 crc kubenswrapper[4895]: I1202 07:24:21.979196 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:21 crc kubenswrapper[4895]: I1202 07:24:21.979245 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:21 crc kubenswrapper[4895]: I1202 07:24:21.979255 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:21 crc kubenswrapper[4895]: I1202 07:24:21.979272 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:21 crc kubenswrapper[4895]: I1202 07:24:21.979283 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:21Z","lastTransitionTime":"2025-12-02T07:24:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:22 crc kubenswrapper[4895]: I1202 07:24:22.082078 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:22 crc kubenswrapper[4895]: I1202 07:24:22.082119 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:22 crc kubenswrapper[4895]: I1202 07:24:22.082128 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:22 crc kubenswrapper[4895]: I1202 07:24:22.082148 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:22 crc kubenswrapper[4895]: I1202 07:24:22.082160 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:22Z","lastTransitionTime":"2025-12-02T07:24:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:24:22 crc kubenswrapper[4895]: I1202 07:24:22.140844 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:24:22 crc kubenswrapper[4895]: I1202 07:24:22.140844 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5f88v" Dec 02 07:24:22 crc kubenswrapper[4895]: E1202 07:24:22.141075 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 07:24:22 crc kubenswrapper[4895]: E1202 07:24:22.141229 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5f88v" podUID="5af25091-1401-45d4-ae53-d2b469c879da" Dec 02 07:24:22 crc kubenswrapper[4895]: I1202 07:24:22.185167 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:22 crc kubenswrapper[4895]: I1202 07:24:22.185263 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:22 crc kubenswrapper[4895]: I1202 07:24:22.185277 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:22 crc kubenswrapper[4895]: I1202 07:24:22.185319 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:22 crc kubenswrapper[4895]: I1202 07:24:22.185333 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:22Z","lastTransitionTime":"2025-12-02T07:24:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:22 crc kubenswrapper[4895]: I1202 07:24:22.287113 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:22 crc kubenswrapper[4895]: I1202 07:24:22.287159 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:22 crc kubenswrapper[4895]: I1202 07:24:22.287168 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:22 crc kubenswrapper[4895]: I1202 07:24:22.287185 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:22 crc kubenswrapper[4895]: I1202 07:24:22.287196 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:22Z","lastTransitionTime":"2025-12-02T07:24:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:22 crc kubenswrapper[4895]: I1202 07:24:22.389312 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:22 crc kubenswrapper[4895]: I1202 07:24:22.389413 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:22 crc kubenswrapper[4895]: I1202 07:24:22.389433 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:22 crc kubenswrapper[4895]: I1202 07:24:22.389493 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:22 crc kubenswrapper[4895]: I1202 07:24:22.389515 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:22Z","lastTransitionTime":"2025-12-02T07:24:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:22 crc kubenswrapper[4895]: I1202 07:24:22.492052 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:22 crc kubenswrapper[4895]: I1202 07:24:22.492125 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:22 crc kubenswrapper[4895]: I1202 07:24:22.492148 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:22 crc kubenswrapper[4895]: I1202 07:24:22.492174 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:22 crc kubenswrapper[4895]: I1202 07:24:22.492188 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:22Z","lastTransitionTime":"2025-12-02T07:24:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:22 crc kubenswrapper[4895]: I1202 07:24:22.594939 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:22 crc kubenswrapper[4895]: I1202 07:24:22.594995 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:22 crc kubenswrapper[4895]: I1202 07:24:22.595014 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:22 crc kubenswrapper[4895]: I1202 07:24:22.595035 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:22 crc kubenswrapper[4895]: I1202 07:24:22.595051 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:22Z","lastTransitionTime":"2025-12-02T07:24:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:22 crc kubenswrapper[4895]: I1202 07:24:22.697670 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:22 crc kubenswrapper[4895]: I1202 07:24:22.697754 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:22 crc kubenswrapper[4895]: I1202 07:24:22.697768 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:22 crc kubenswrapper[4895]: I1202 07:24:22.697791 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:22 crc kubenswrapper[4895]: I1202 07:24:22.697803 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:22Z","lastTransitionTime":"2025-12-02T07:24:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:22 crc kubenswrapper[4895]: I1202 07:24:22.800092 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:22 crc kubenswrapper[4895]: I1202 07:24:22.800144 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:22 crc kubenswrapper[4895]: I1202 07:24:22.800157 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:22 crc kubenswrapper[4895]: I1202 07:24:22.800177 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:22 crc kubenswrapper[4895]: I1202 07:24:22.800189 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:22Z","lastTransitionTime":"2025-12-02T07:24:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:22 crc kubenswrapper[4895]: I1202 07:24:22.902463 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:22 crc kubenswrapper[4895]: I1202 07:24:22.902511 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:22 crc kubenswrapper[4895]: I1202 07:24:22.902562 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:22 crc kubenswrapper[4895]: I1202 07:24:22.902588 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:22 crc kubenswrapper[4895]: I1202 07:24:22.902601 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:22Z","lastTransitionTime":"2025-12-02T07:24:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:23 crc kubenswrapper[4895]: I1202 07:24:23.005561 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:23 crc kubenswrapper[4895]: I1202 07:24:23.005812 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:23 crc kubenswrapper[4895]: I1202 07:24:23.005874 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:23 crc kubenswrapper[4895]: I1202 07:24:23.005897 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:23 crc kubenswrapper[4895]: I1202 07:24:23.005907 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:23Z","lastTransitionTime":"2025-12-02T07:24:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:23 crc kubenswrapper[4895]: I1202 07:24:23.109724 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:23 crc kubenswrapper[4895]: I1202 07:24:23.109814 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:23 crc kubenswrapper[4895]: I1202 07:24:23.109832 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:23 crc kubenswrapper[4895]: I1202 07:24:23.109859 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:23 crc kubenswrapper[4895]: I1202 07:24:23.109941 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:23Z","lastTransitionTime":"2025-12-02T07:24:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:24:23 crc kubenswrapper[4895]: I1202 07:24:23.140959 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:24:23 crc kubenswrapper[4895]: I1202 07:24:23.141018 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:24:23 crc kubenswrapper[4895]: E1202 07:24:23.141180 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 07:24:23 crc kubenswrapper[4895]: E1202 07:24:23.141446 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 07:24:23 crc kubenswrapper[4895]: I1202 07:24:23.212784 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:23 crc kubenswrapper[4895]: I1202 07:24:23.212835 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:23 crc kubenswrapper[4895]: I1202 07:24:23.212848 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:23 crc kubenswrapper[4895]: I1202 07:24:23.212867 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:23 crc kubenswrapper[4895]: I1202 07:24:23.212879 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:23Z","lastTransitionTime":"2025-12-02T07:24:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:23 crc kubenswrapper[4895]: I1202 07:24:23.316043 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:23 crc kubenswrapper[4895]: I1202 07:24:23.316119 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:23 crc kubenswrapper[4895]: I1202 07:24:23.316144 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:23 crc kubenswrapper[4895]: I1202 07:24:23.316177 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:23 crc kubenswrapper[4895]: I1202 07:24:23.316201 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:23Z","lastTransitionTime":"2025-12-02T07:24:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:23 crc kubenswrapper[4895]: I1202 07:24:23.419341 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:23 crc kubenswrapper[4895]: I1202 07:24:23.419403 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:23 crc kubenswrapper[4895]: I1202 07:24:23.419420 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:23 crc kubenswrapper[4895]: I1202 07:24:23.419450 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:23 crc kubenswrapper[4895]: I1202 07:24:23.419468 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:23Z","lastTransitionTime":"2025-12-02T07:24:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:23 crc kubenswrapper[4895]: I1202 07:24:23.522415 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:23 crc kubenswrapper[4895]: I1202 07:24:23.522454 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:23 crc kubenswrapper[4895]: I1202 07:24:23.522466 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:23 crc kubenswrapper[4895]: I1202 07:24:23.522483 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:23 crc kubenswrapper[4895]: I1202 07:24:23.522495 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:23Z","lastTransitionTime":"2025-12-02T07:24:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:23 crc kubenswrapper[4895]: I1202 07:24:23.624848 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:23 crc kubenswrapper[4895]: I1202 07:24:23.624891 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:23 crc kubenswrapper[4895]: I1202 07:24:23.624900 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:23 crc kubenswrapper[4895]: I1202 07:24:23.624914 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:23 crc kubenswrapper[4895]: I1202 07:24:23.624923 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:23Z","lastTransitionTime":"2025-12-02T07:24:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:23 crc kubenswrapper[4895]: I1202 07:24:23.727865 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:23 crc kubenswrapper[4895]: I1202 07:24:23.727908 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:23 crc kubenswrapper[4895]: I1202 07:24:23.727919 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:23 crc kubenswrapper[4895]: I1202 07:24:23.727937 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:23 crc kubenswrapper[4895]: I1202 07:24:23.727948 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:23Z","lastTransitionTime":"2025-12-02T07:24:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:23 crc kubenswrapper[4895]: I1202 07:24:23.830530 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:23 crc kubenswrapper[4895]: I1202 07:24:23.830572 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:23 crc kubenswrapper[4895]: I1202 07:24:23.830582 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:23 crc kubenswrapper[4895]: I1202 07:24:23.830598 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:23 crc kubenswrapper[4895]: I1202 07:24:23.830608 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:23Z","lastTransitionTime":"2025-12-02T07:24:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:23 crc kubenswrapper[4895]: I1202 07:24:23.933183 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:23 crc kubenswrapper[4895]: I1202 07:24:23.933220 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:23 crc kubenswrapper[4895]: I1202 07:24:23.933230 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:23 crc kubenswrapper[4895]: I1202 07:24:23.933246 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:23 crc kubenswrapper[4895]: I1202 07:24:23.933256 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:23Z","lastTransitionTime":"2025-12-02T07:24:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:24 crc kubenswrapper[4895]: I1202 07:24:24.035461 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:24 crc kubenswrapper[4895]: I1202 07:24:24.035498 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:24 crc kubenswrapper[4895]: I1202 07:24:24.035506 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:24 crc kubenswrapper[4895]: I1202 07:24:24.035522 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:24 crc kubenswrapper[4895]: I1202 07:24:24.035534 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:24Z","lastTransitionTime":"2025-12-02T07:24:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:24 crc kubenswrapper[4895]: I1202 07:24:24.098517 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5af25091-1401-45d4-ae53-d2b469c879da-metrics-certs\") pod \"network-metrics-daemon-5f88v\" (UID: \"5af25091-1401-45d4-ae53-d2b469c879da\") " pod="openshift-multus/network-metrics-daemon-5f88v" Dec 02 07:24:24 crc kubenswrapper[4895]: E1202 07:24:24.098716 4895 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 02 07:24:24 crc kubenswrapper[4895]: E1202 07:24:24.098818 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5af25091-1401-45d4-ae53-d2b469c879da-metrics-certs podName:5af25091-1401-45d4-ae53-d2b469c879da nodeName:}" failed. No retries permitted until 2025-12-02 07:24:56.098796832 +0000 UTC m=+107.269656445 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5af25091-1401-45d4-ae53-d2b469c879da-metrics-certs") pod "network-metrics-daemon-5f88v" (UID: "5af25091-1401-45d4-ae53-d2b469c879da") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 02 07:24:24 crc kubenswrapper[4895]: I1202 07:24:24.137727 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:24 crc kubenswrapper[4895]: I1202 07:24:24.137800 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:24 crc kubenswrapper[4895]: I1202 07:24:24.137810 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:24 crc kubenswrapper[4895]: I1202 07:24:24.137825 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:24 crc kubenswrapper[4895]: I1202 07:24:24.137836 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:24Z","lastTransitionTime":"2025-12-02T07:24:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:24:24 crc kubenswrapper[4895]: I1202 07:24:24.141216 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5f88v" Dec 02 07:24:24 crc kubenswrapper[4895]: I1202 07:24:24.141293 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:24:24 crc kubenswrapper[4895]: E1202 07:24:24.141386 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5f88v" podUID="5af25091-1401-45d4-ae53-d2b469c879da" Dec 02 07:24:24 crc kubenswrapper[4895]: E1202 07:24:24.141494 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 07:24:24 crc kubenswrapper[4895]: I1202 07:24:24.243627 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:24 crc kubenswrapper[4895]: I1202 07:24:24.243672 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:24 crc kubenswrapper[4895]: I1202 07:24:24.243683 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:24 crc kubenswrapper[4895]: I1202 07:24:24.243702 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:24 crc kubenswrapper[4895]: I1202 07:24:24.243713 4895 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:24Z","lastTransitionTime":"2025-12-02T07:24:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:24:24 crc kubenswrapper[4895]: I1202 07:24:24.346370 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:24 crc kubenswrapper[4895]: I1202 07:24:24.346427 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:24 crc kubenswrapper[4895]: I1202 07:24:24.346448 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:24 crc kubenswrapper[4895]: I1202 07:24:24.346474 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:24 crc kubenswrapper[4895]: I1202 07:24:24.346543 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:24Z","lastTransitionTime":"2025-12-02T07:24:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:24 crc kubenswrapper[4895]: I1202 07:24:24.448667 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:24 crc kubenswrapper[4895]: I1202 07:24:24.448709 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:24 crc kubenswrapper[4895]: I1202 07:24:24.448726 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:24 crc kubenswrapper[4895]: I1202 07:24:24.448773 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:24 crc kubenswrapper[4895]: I1202 07:24:24.448789 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:24Z","lastTransitionTime":"2025-12-02T07:24:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:24 crc kubenswrapper[4895]: I1202 07:24:24.551964 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:24 crc kubenswrapper[4895]: I1202 07:24:24.552032 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:24 crc kubenswrapper[4895]: I1202 07:24:24.552056 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:24 crc kubenswrapper[4895]: I1202 07:24:24.552087 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:24 crc kubenswrapper[4895]: I1202 07:24:24.552108 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:24Z","lastTransitionTime":"2025-12-02T07:24:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:24 crc kubenswrapper[4895]: I1202 07:24:24.655180 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:24 crc kubenswrapper[4895]: I1202 07:24:24.655219 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:24 crc kubenswrapper[4895]: I1202 07:24:24.655232 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:24 crc kubenswrapper[4895]: I1202 07:24:24.655251 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:24 crc kubenswrapper[4895]: I1202 07:24:24.655263 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:24Z","lastTransitionTime":"2025-12-02T07:24:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:24 crc kubenswrapper[4895]: I1202 07:24:24.758140 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:24 crc kubenswrapper[4895]: I1202 07:24:24.758220 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:24 crc kubenswrapper[4895]: I1202 07:24:24.758249 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:24 crc kubenswrapper[4895]: I1202 07:24:24.758284 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:24 crc kubenswrapper[4895]: I1202 07:24:24.758310 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:24Z","lastTransitionTime":"2025-12-02T07:24:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:24 crc kubenswrapper[4895]: I1202 07:24:24.862010 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:24 crc kubenswrapper[4895]: I1202 07:24:24.862119 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:24 crc kubenswrapper[4895]: I1202 07:24:24.862145 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:24 crc kubenswrapper[4895]: I1202 07:24:24.862180 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:24 crc kubenswrapper[4895]: I1202 07:24:24.862210 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:24Z","lastTransitionTime":"2025-12-02T07:24:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:24 crc kubenswrapper[4895]: I1202 07:24:24.966998 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:24 crc kubenswrapper[4895]: I1202 07:24:24.967072 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:24 crc kubenswrapper[4895]: I1202 07:24:24.967094 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:24 crc kubenswrapper[4895]: I1202 07:24:24.967159 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:24 crc kubenswrapper[4895]: I1202 07:24:24.967402 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:24Z","lastTransitionTime":"2025-12-02T07:24:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:25 crc kubenswrapper[4895]: I1202 07:24:25.071114 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:25 crc kubenswrapper[4895]: I1202 07:24:25.071164 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:25 crc kubenswrapper[4895]: I1202 07:24:25.071173 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:25 crc kubenswrapper[4895]: I1202 07:24:25.071193 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:25 crc kubenswrapper[4895]: I1202 07:24:25.071204 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:25Z","lastTransitionTime":"2025-12-02T07:24:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:24:25 crc kubenswrapper[4895]: I1202 07:24:25.140145 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:24:25 crc kubenswrapper[4895]: E1202 07:24:25.140299 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 07:24:25 crc kubenswrapper[4895]: I1202 07:24:25.140435 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:24:25 crc kubenswrapper[4895]: E1202 07:24:25.140701 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 07:24:25 crc kubenswrapper[4895]: I1202 07:24:25.175290 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:25 crc kubenswrapper[4895]: I1202 07:24:25.175342 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:25 crc kubenswrapper[4895]: I1202 07:24:25.175352 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:25 crc kubenswrapper[4895]: I1202 07:24:25.175397 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:25 crc kubenswrapper[4895]: I1202 07:24:25.175408 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:25Z","lastTransitionTime":"2025-12-02T07:24:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:25 crc kubenswrapper[4895]: I1202 07:24:25.278836 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:25 crc kubenswrapper[4895]: I1202 07:24:25.278911 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:25 crc kubenswrapper[4895]: I1202 07:24:25.278933 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:25 crc kubenswrapper[4895]: I1202 07:24:25.278965 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:25 crc kubenswrapper[4895]: I1202 07:24:25.278983 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:25Z","lastTransitionTime":"2025-12-02T07:24:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:25 crc kubenswrapper[4895]: I1202 07:24:25.382872 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:25 crc kubenswrapper[4895]: I1202 07:24:25.382926 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:25 crc kubenswrapper[4895]: I1202 07:24:25.382936 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:25 crc kubenswrapper[4895]: I1202 07:24:25.382955 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:25 crc kubenswrapper[4895]: I1202 07:24:25.382967 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:25Z","lastTransitionTime":"2025-12-02T07:24:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:25 crc kubenswrapper[4895]: I1202 07:24:25.486543 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:25 crc kubenswrapper[4895]: I1202 07:24:25.486627 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:25 crc kubenswrapper[4895]: I1202 07:24:25.486653 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:25 crc kubenswrapper[4895]: I1202 07:24:25.486683 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:25 crc kubenswrapper[4895]: I1202 07:24:25.486703 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:25Z","lastTransitionTime":"2025-12-02T07:24:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:25 crc kubenswrapper[4895]: I1202 07:24:25.597649 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:25 crc kubenswrapper[4895]: I1202 07:24:25.597725 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:25 crc kubenswrapper[4895]: I1202 07:24:25.597827 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:25 crc kubenswrapper[4895]: I1202 07:24:25.598346 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:25 crc kubenswrapper[4895]: I1202 07:24:25.598397 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:25Z","lastTransitionTime":"2025-12-02T07:24:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:25 crc kubenswrapper[4895]: I1202 07:24:25.683560 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-hlxqt_30911fe5-208f-44e8-a380-2a0093f24863/kube-multus/0.log" Dec 02 07:24:25 crc kubenswrapper[4895]: I1202 07:24:25.683649 4895 generic.go:334] "Generic (PLEG): container finished" podID="30911fe5-208f-44e8-a380-2a0093f24863" containerID="87ebbf4f529d4e7b64c392912d06e57dc80a2237a9a4e31ab5b56ac556aa10b1" exitCode=1 Dec 02 07:24:25 crc kubenswrapper[4895]: I1202 07:24:25.683732 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-hlxqt" event={"ID":"30911fe5-208f-44e8-a380-2a0093f24863","Type":"ContainerDied","Data":"87ebbf4f529d4e7b64c392912d06e57dc80a2237a9a4e31ab5b56ac556aa10b1"} Dec 02 07:24:25 crc kubenswrapper[4895]: I1202 07:24:25.686040 4895 scope.go:117] "RemoveContainer" containerID="87ebbf4f529d4e7b64c392912d06e57dc80a2237a9a4e31ab5b56ac556aa10b1" Dec 02 07:24:25 crc kubenswrapper[4895]: I1202 07:24:25.701011 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:25 crc kubenswrapper[4895]: I1202 07:24:25.701067 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:25 crc kubenswrapper[4895]: I1202 07:24:25.701082 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:25 crc kubenswrapper[4895]: I1202 07:24:25.701104 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:25 crc kubenswrapper[4895]: I1202 07:24:25.701120 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:25Z","lastTransitionTime":"2025-12-02T07:24:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:24:25 crc kubenswrapper[4895]: I1202 07:24:25.712690 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4604b0d5-cf8f-404d-af67-65da4b1dc003\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b3c366f0f27ff793a37b2944887ab1c1bc155c14d5c8d6c267442b0a8666ef8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir
\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6c3baa5b3772a8fc64774e28c3f389f114723e7efa62b934140a4fed8b6a40d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49d7c12ea2cd199dcfa46a6f1a1a856f1285a709def9d82b6805ba256a79fd5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2543d364f902a6c34a82dac25f389cd36afc2f717d772d0a487a640d44adf30f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b8
2799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:25Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:25 crc kubenswrapper[4895]: I1202 07:24:25.735113 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://444ad02c4afa919779de59fd8c81bdce5c3426dd6451b90505921a8d5fa0c96b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-02T07:24:25Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:25 crc kubenswrapper[4895]: I1202 07:24:25.758686 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:25Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:25 crc kubenswrapper[4895]: I1202 07:24:25.785952 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n7xcr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3514f381-d0d1-4e00-931e-c5ca75202a1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2670ef1ea22a6c1c5d9ec4ee4b0345c575b20224fddf8655b95eaa431573d071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3deeb37702c8243a956f2be1bb61e2693bb9be5453d775bc3f3e85618d483015\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3deeb37702c8243a956f2be1bb61e2693bb9be5453d775bc3f3e85618d483015\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2c60eddf285bbbf1a5957691f56d895e5bb7461fb18ab7c495879357cfae565\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2c60eddf285bbbf1a5957691f56d895e5bb7461fb18ab7c495879357cfae565\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e8b39d38597192393d90cc582b8ad109a1ed22c4c14f539e1a5445e6d817872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e8b39d38597192393d90cc582b8ad109a1ed22c4c14f539e1a5445e6d817872\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb5ab
c8da49059e23c4013f3856d3f4c9078cd8871a79f8dae5ce4d8cb319ef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb5abc8da49059e23c4013f3856d3f4c9078cd8871a79f8dae5ce4d8cb319ef9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbdd3e8943b356a485d900737682f0903dc04aefa33498ee5cdf3c82b6570852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbdd3e8943b356a485d900737682f0903dc04aefa33498ee5cdf3c82b6570852\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:43Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3641d1f7fd2ea062c216ce260c35d903273d3039fe790815ebb697f8c11d15cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3641d1f7fd2ea062c216ce260c35d903273d3039fe790815ebb697f8c11d15cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n7xcr\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:25Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:25 crc kubenswrapper[4895]: I1202 07:24:25.805409 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:25 crc kubenswrapper[4895]: I1202 07:24:25.805479 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:25 crc kubenswrapper[4895]: I1202 07:24:25.805501 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:25 crc kubenswrapper[4895]: I1202 07:24:25.805532 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:25 crc kubenswrapper[4895]: I1202 07:24:25.805558 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:25Z","lastTransitionTime":"2025-12-02T07:24:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:25 crc kubenswrapper[4895]: I1202 07:24:25.809249 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hlxqt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"30911fe5-208f-44e8-a380-2a0093f24863\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:24:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:24:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87ebbf4f529d4e7b64c392912d06e57dc80a2237a9a4e31ab5b56ac556aa10b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87ebbf4f529d4e7b64c392912d06e57dc80a2237a9a4e31ab5b56ac556aa10b1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T07:24:24Z\\\",\\\"message\\\":\\\"2025-12-02T07:23:39+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_8d90423c-940a-42e9-9258-17f96c4385a6\\\\n2025-12-02T07:23:39+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8d90423c-940a-42e9-9258-17f96c4385a6 to /host/opt/cni/bin/\\\\n2025-12-02T07:23:39Z [verbose] multus-daemon started\\\\n2025-12-02T07:23:39Z [verbose] Readiness Indicator file check\\\\n2025-12-02T07:24:24Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6l8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hlxqt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:25Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:25 crc kubenswrapper[4895]: I1202 07:24:25.828258 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5f88v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5af25091-1401-45d4-ae53-d2b469c879da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wfrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wfrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5f88v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:25Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:25 crc 
kubenswrapper[4895]: I1202 07:24:25.848176 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2df36ca5-8a5e-4865-8728-b8567297546a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f076cfc3bd5e782332e827d56325b73cf2a9f35248f397338b5130793481af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de2bad51e8d51f1243a2fe7799554c7f089425e3623a6bb62f7393610d70fc4e\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b737e88ad2a8caf70b89f59db461d355846d5612a29a249107b49fbec176204\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9d30ec3b0f73c029956f058f1d8f06cfffb5533dcd0c221ab6842d277f274a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9d30ec3b0f73c029956f058f1d8f06cfffb5533dcd0c221ab6842d277f274a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:09Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:09Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:25Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:25 crc kubenswrapper[4895]: I1202 07:24:25.874341 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:25Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:25 crc kubenswrapper[4895]: I1202 07:24:25.894150 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lhpbd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"81128292-8c02-45bd-9b25-e9457e989975\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cc63751b6428c88deed36b914b2a95a94a50b544c562126c2a9567c177b2570\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2z5n2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lhpbd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:25Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:25 crc kubenswrapper[4895]: I1202 07:24:25.909211 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:25 crc kubenswrapper[4895]: I1202 07:24:25.909264 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:25 crc kubenswrapper[4895]: I1202 07:24:25.909282 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:25 crc kubenswrapper[4895]: I1202 07:24:25.909310 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:25 crc kubenswrapper[4895]: I1202 07:24:25.909330 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:25Z","lastTransitionTime":"2025-12-02T07:24:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:25 crc kubenswrapper[4895]: I1202 07:24:25.914013 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wxp5p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a0b438f-70fd-4c3f-a170-2b3fe9797955\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7329cf1034b2d51ebb7f978448558a8b6fa39d58823289c470fddcfed262e395\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj6l9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cbc34b4d78cc00ac79b9944d92c6ac61a9d5dd704f36a473950baf53dfcbf47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj6l9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wxp5p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:25Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:25 crc kubenswrapper[4895]: I1202 07:24:25.937481 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2db13abc9ba56d955814e75ceea54b92610a00daeefa29aa1aad584e3453728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:25Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:25 crc kubenswrapper[4895]: I1202 07:24:25.958771 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bb4bea68942dbface0009fbac76f7b1253ab282e8decb9081225de7d6537ec0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\
\"containerID\\\":\\\"cri-o://3115f2c50d05ee406f583937e39b1b55f364ce7cbeb0788b75941a22e23f57f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:25Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:25 crc kubenswrapper[4895]: I1202 07:24:25.994942 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"afc3334a-0153-4dcc-9a56-92f6cae51c08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91b0c6e50085a7be0924e5a54ef006110a7f13556fe47f7f55124fd6b2f0bc36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5463b077f0bc8b5d38e27854ece6a71f74f7769f4552c32eefe16854993f290f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80dca705ddb94ba799eb72bfc23b0c3878cceca48d6d10f1c7f8eb0e4beed40e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f54ae324450e73bcbf2c408794171a92bbb66c24d05f29584ac2c247090c1273\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f330d728eecd672ecb554744a51ebfb2c104441d3a85fe0c8a6fc3446c37e4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca732cac495c410d89da4ce1476acd20905716cbb2baeedac9c52b654ff58573\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff016284272b93427bb0487a4661973b731200b8b4d28ab1c6d2d512bf5742e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff016284272b93427bb0487a4661973b731200b8b4d28ab1c6d2d512bf5742e9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T07:24:03Z\\\",\\\"message\\\":\\\"s.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1202 07:24:03.168657 6529 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1202 07:24:03.168679 6529 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1202 07:24:03.168706 6529 handler.go:208] Removed *v1.Pod event handler 
6\\\\nI1202 07:24:03.168812 6529 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1202 07:24:03.168811 6529 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1202 07:24:03.168829 6529 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1202 07:24:03.168889 6529 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1202 07:24:03.168921 6529 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1202 07:24:03.168943 6529 factory.go:656] Stopping watch factory\\\\nI1202 07:24:03.168956 6529 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1202 07:24:03.168969 6529 handler.go:208] Removed *v1.Node event handler 2\\\\nI1202 07:24:03.168979 6529 handler.go:208] Removed *v1.Node event handler 7\\\\nI1202 07:24:03.168988 6529 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1202 07:24:03.169421 6529 ovnkube.go:599] Stopped ovnkube\\\\nI1202 07:24:03.169503 6529 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1202 07:24:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:24:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-w54m4_openshift-ovn-kubernetes(afc3334a-0153-4dcc-9a56-92f6cae51c08)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6447684fe6189e38c9caad6dfe1384b7af01d50e2ef5f889c7b2874ba092306\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://328fa773c92ef0a965065412d4141c0b41ffe38d21ed169386a9d30d05f37f28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://328fa773c92ef0a965
065412d4141c0b41ffe38d21ed169386a9d30d05f37f28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w54m4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:25Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:26 crc kubenswrapper[4895]: I1202 07:24:26.012962 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:26 crc kubenswrapper[4895]: I1202 07:24:26.013035 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:26 crc kubenswrapper[4895]: I1202 07:24:26.013056 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:26 crc kubenswrapper[4895]: I1202 07:24:26.013082 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:26 crc kubenswrapper[4895]: I1202 07:24:26.013105 4895 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:26Z","lastTransitionTime":"2025-12-02T07:24:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:24:26 crc kubenswrapper[4895]: I1202 07:24:26.019437 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b7fa1ad-7e38-4645-ab41-c7d395f5c10e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40e020a2b4caf39d1661cee9c6b0055eca86e0ae9b81d2da7485ca9ec92dd0aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"r
unning\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33e300332628d3baf9041b52dc7b8cd9dab1e9c5da76572e86b787433fd5fb92\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4455fec2a817e7266e152d3f4afbede483504d6066f1a0f7375bce282ae10f8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b
674e20b2199fd0083aed9cf001d27a72d31ca6c62cbe376c9510cc94f37d312\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9998dffc20c5b0579e6214d10541b9702e9c4a6563c930a41e49b6853b832ba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T07:23:32Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 07:23:21.537254 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 07:23:21.538901 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1012964219/tls.crt::/tmp/serving-cert-1012964219/tls.key\\\\\\\"\\\\nI1202 07:23:32.848375 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 07:23:32.853597 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 07:23:32.853643 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 07:23:32.853680 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 07:23:32.853691 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 07:23:32.869490 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 07:23:32.869536 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:23:32.869542 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:23:32.869555 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 07:23:32.869559 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 07:23:32.869564 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 07:23:32.869568 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 07:23:32.869624 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 07:23:32.874299 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d318f71ad32dd4a410093b85741038e468d540edd30f112234a9d595bc6fd85f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{
\\\"containerID\\\":\\\"cri-o://0faa0995131a1f9cbf2aa0cf7651d510fad1bdaf7e32de51f8a14b70f161e783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0faa0995131a1f9cbf2aa0cf7651d510fad1bdaf7e32de51f8a14b70f161e783\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:26Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:26 crc kubenswrapper[4895]: I1202 07:24:26.038267 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:26Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:26 crc kubenswrapper[4895]: I1202 07:24:26.054649 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-74fkh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b212bd8-f1a7-4982-b2e6-499c13a34b0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4d58e6f37ee8f90af6aad88138c39e9a0753c16cceb763c2bd2ec9c30087d92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhpwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-74fkh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:26Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:26 crc kubenswrapper[4895]: I1202 07:24:26.071985 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0468d2d1-a975-45a6-af9f-47adc432fab0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebf8583b59d7955530da7a83a29f9994ba139c076cf7b76dc94e88d14f39e1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vfwjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1474e8f5989607f3b95fefe811e3d787320f04c223edb6ffe47d2b37863ea874\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vfwjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wfcg7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:26Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:26 crc kubenswrapper[4895]: I1202 07:24:26.116920 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:26 crc kubenswrapper[4895]: I1202 07:24:26.117239 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:26 crc kubenswrapper[4895]: I1202 07:24:26.117366 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:26 crc kubenswrapper[4895]: I1202 07:24:26.117518 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:26 crc kubenswrapper[4895]: I1202 07:24:26.117664 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:26Z","lastTransitionTime":"2025-12-02T07:24:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:24:26 crc kubenswrapper[4895]: I1202 07:24:26.140590 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5f88v" Dec 02 07:24:26 crc kubenswrapper[4895]: I1202 07:24:26.140818 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:24:26 crc kubenswrapper[4895]: E1202 07:24:26.140983 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5f88v" podUID="5af25091-1401-45d4-ae53-d2b469c879da" Dec 02 07:24:26 crc kubenswrapper[4895]: E1202 07:24:26.141036 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 07:24:26 crc kubenswrapper[4895]: I1202 07:24:26.221505 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:26 crc kubenswrapper[4895]: I1202 07:24:26.221564 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:26 crc kubenswrapper[4895]: I1202 07:24:26.221581 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:26 crc kubenswrapper[4895]: I1202 07:24:26.221608 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:26 crc kubenswrapper[4895]: I1202 07:24:26.221624 4895 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:26Z","lastTransitionTime":"2025-12-02T07:24:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:24:26 crc kubenswrapper[4895]: I1202 07:24:26.325582 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:26 crc kubenswrapper[4895]: I1202 07:24:26.325661 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:26 crc kubenswrapper[4895]: I1202 07:24:26.325682 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:26 crc kubenswrapper[4895]: I1202 07:24:26.325710 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:26 crc kubenswrapper[4895]: I1202 07:24:26.325729 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:26Z","lastTransitionTime":"2025-12-02T07:24:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:26 crc kubenswrapper[4895]: I1202 07:24:26.429460 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:26 crc kubenswrapper[4895]: I1202 07:24:26.429523 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:26 crc kubenswrapper[4895]: I1202 07:24:26.429544 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:26 crc kubenswrapper[4895]: I1202 07:24:26.429570 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:26 crc kubenswrapper[4895]: I1202 07:24:26.429591 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:26Z","lastTransitionTime":"2025-12-02T07:24:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:26 crc kubenswrapper[4895]: I1202 07:24:26.533427 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:26 crc kubenswrapper[4895]: I1202 07:24:26.533497 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:26 crc kubenswrapper[4895]: I1202 07:24:26.533520 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:26 crc kubenswrapper[4895]: I1202 07:24:26.533552 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:26 crc kubenswrapper[4895]: I1202 07:24:26.533576 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:26Z","lastTransitionTime":"2025-12-02T07:24:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:26 crc kubenswrapper[4895]: I1202 07:24:26.636487 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:26 crc kubenswrapper[4895]: I1202 07:24:26.636584 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:26 crc kubenswrapper[4895]: I1202 07:24:26.636602 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:26 crc kubenswrapper[4895]: I1202 07:24:26.636632 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:26 crc kubenswrapper[4895]: I1202 07:24:26.636649 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:26Z","lastTransitionTime":"2025-12-02T07:24:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:26 crc kubenswrapper[4895]: I1202 07:24:26.690887 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-hlxqt_30911fe5-208f-44e8-a380-2a0093f24863/kube-multus/0.log" Dec 02 07:24:26 crc kubenswrapper[4895]: I1202 07:24:26.690952 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-hlxqt" event={"ID":"30911fe5-208f-44e8-a380-2a0093f24863","Type":"ContainerStarted","Data":"a569570ff32547c25fcdced649773ea0ab6d3aeccdaa5c26aa5a86d2c745535f"} Dec 02 07:24:26 crc kubenswrapper[4895]: I1202 07:24:26.715885 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2db13abc9ba56d955814e75ceea54b92610a00daeefa29aa1aad584e3453728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\
\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:26Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:26 crc kubenswrapper[4895]: I1202 07:24:26.738597 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bb4bea68942dbface0009fbac76f7b1253ab282e8decb9081225de7d6537ec0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3115f2c50d05ee406f583937e39b1b55f364ce7cbeb0788b75941a22e23f57f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:26Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:26 crc kubenswrapper[4895]: I1202 07:24:26.740567 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:26 crc kubenswrapper[4895]: I1202 07:24:26.740637 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:26 crc kubenswrapper[4895]: I1202 07:24:26.740657 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:26 crc kubenswrapper[4895]: I1202 07:24:26.740685 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:26 crc kubenswrapper[4895]: I1202 07:24:26.740705 4895 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:26Z","lastTransitionTime":"2025-12-02T07:24:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:24:26 crc kubenswrapper[4895]: I1202 07:24:26.774117 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"afc3334a-0153-4dcc-9a56-92f6cae51c08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91b0c6e50085a7be0924e5a54ef006110a7f13556fe47f7f55124fd6b2f0bc36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5463b077f0bc8b5d38e27854ece6a71f74f7769f4552c32eefe16854993f290f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80dca705ddb94ba799eb72bfc23b0c3878cceca48d6d10f1c7f8eb0e4beed40e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f54ae324450e73bcbf2c408794171a92bbb66c24d05f29584ac2c247090c1273\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f330d728eecd672ecb554744a51ebfb2c104441d3a85fe0c8a6fc3446c37e4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca732cac495c410d89da4ce1476acd20905716cbb2baeedac9c52b654ff58573\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff016284272b93427bb0487a4661973b731200b8b4d28ab1c6d2d512bf5742e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff016284272b93427bb0487a4661973b731200b8b4d28ab1c6d2d512bf5742e9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T07:24:03Z\\\",\\\"message\\\":\\\"s.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1202 07:24:03.168657 6529 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1202 07:24:03.168679 6529 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1202 07:24:03.168706 6529 handler.go:208] Removed *v1.Pod event handler 
6\\\\nI1202 07:24:03.168812 6529 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1202 07:24:03.168811 6529 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1202 07:24:03.168829 6529 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1202 07:24:03.168889 6529 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1202 07:24:03.168921 6529 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1202 07:24:03.168943 6529 factory.go:656] Stopping watch factory\\\\nI1202 07:24:03.168956 6529 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1202 07:24:03.168969 6529 handler.go:208] Removed *v1.Node event handler 2\\\\nI1202 07:24:03.168979 6529 handler.go:208] Removed *v1.Node event handler 7\\\\nI1202 07:24:03.168988 6529 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1202 07:24:03.169421 6529 ovnkube.go:599] Stopped ovnkube\\\\nI1202 07:24:03.169503 6529 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1202 07:24:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:24:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-w54m4_openshift-ovn-kubernetes(afc3334a-0153-4dcc-9a56-92f6cae51c08)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6447684fe6189e38c9caad6dfe1384b7af01d50e2ef5f889c7b2874ba092306\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://328fa773c92ef0a965065412d4141c0b41ffe38d21ed169386a9d30d05f37f28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://328fa773c92ef0a965
065412d4141c0b41ffe38d21ed169386a9d30d05f37f28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w54m4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:26Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:26 crc kubenswrapper[4895]: I1202 07:24:26.797572 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b7fa1ad-7e38-4645-ab41-c7d395f5c10e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40e020a2b4caf39d1661cee9c6b0055eca86e0ae9b81d2da7485ca9ec92dd0aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33e300332628d3baf9041b52dc7b8cd9dab1e9c5da76572e86b787433fd5fb92\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4455fec2a817e7266e152d3f4afbede483504d6066f1a0f7375bce282ae10f8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b674e20b2199fd0083aed9cf001d27a72d31ca6c62cbe376c9510cc94f37d312\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9998dffc20c5b0579e6214d10541b9702e9c4a6563c930a41e49b6853b832ba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T07:23:32Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 07:23:21.537254 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 07:23:21.538901 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1012964219/tls.crt::/tmp/serving-cert-1012964219/tls.key\\\\\\\"\\\\nI1202 07:23:32.848375 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 07:23:32.853597 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 07:23:32.853643 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 07:23:32.853680 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 07:23:32.853691 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 07:23:32.869490 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 07:23:32.869536 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:23:32.869542 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:23:32.869555 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 07:23:32.869559 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 07:23:32.869564 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 07:23:32.869568 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 07:23:32.869624 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1202 07:23:32.874299 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d318f71ad32dd4a410093b85741038e468d540edd30f112234a9d595bc6fd85f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0faa0995131a1f9cbf2aa0cf7651d510fad1bdaf7e32de51f8a14b70f161e783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0faa0995131a1f9cbf2aa0cf7651d510f
ad1bdaf7e32de51f8a14b70f161e783\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:26Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:26 crc kubenswrapper[4895]: I1202 07:24:26.815570 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:26Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:26 crc kubenswrapper[4895]: I1202 07:24:26.831651 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-74fkh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b212bd8-f1a7-4982-b2e6-499c13a34b0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4d58e6f37ee8f90af6aad88138c39e9a0753c16cceb763c2bd2ec9c30087d92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhpwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-74fkh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:26Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:26 crc kubenswrapper[4895]: I1202 07:24:26.844607 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:26 crc kubenswrapper[4895]: I1202 07:24:26.844687 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:26 crc kubenswrapper[4895]: I1202 07:24:26.844711 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:26 crc kubenswrapper[4895]: I1202 07:24:26.844800 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:26 crc kubenswrapper[4895]: I1202 07:24:26.844825 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:26Z","lastTransitionTime":"2025-12-02T07:24:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:26 crc kubenswrapper[4895]: I1202 07:24:26.849713 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0468d2d1-a975-45a6-af9f-47adc432fab0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebf8583b59d7955530da7a83a29f9994ba139c076cf7b76dc94e88d14f39e1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vfwjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1474e8f5989607f3b95fefe811e3d787320f04c223edb6ffe47d2b37863ea874\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vfwjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wfcg7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:26Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:26 crc kubenswrapper[4895]: I1202 07:24:26.866800 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5f88v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5af25091-1401-45d4-ae53-d2b469c879da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wfrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wfrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5f88v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:26Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:26 crc 
kubenswrapper[4895]: I1202 07:24:26.885273 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4604b0d5-cf8f-404d-af67-65da4b1dc003\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b3c366f0f27ff793a37b2944887ab1c1bc155c14d5c8d6c267442b0a8666ef8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6c3baa5b3772a8fc64774e28c3f389f114723e7efa62b934140a4fed8b6a40d\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49d7c12ea2cd199dcfa46a6f1a1a856f1285a709def9d82b6805ba256a79fd5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2543d364f902a6c34a82dac25f389cd36afc2f717d772d0a487a640d44adf30f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:26Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:26 crc kubenswrapper[4895]: I1202 07:24:26.902270 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://444ad02c4afa919779de59fd8c81bdce5c3426dd6451b90505921a8d5fa0c96b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-02T07:24:26Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:26 crc kubenswrapper[4895]: I1202 07:24:26.921650 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:26Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:26 crc kubenswrapper[4895]: I1202 07:24:26.939991 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n7xcr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3514f381-d0d1-4e00-931e-c5ca75202a1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2670ef1ea22a6c1c5d9ec4ee4b0345c575b20224fddf8655b95eaa431573d071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3deeb37702c8243a956f2be1bb61e2693bb9be5453d775bc3f3e85618d483015\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3deeb37702c8243a956f2be1bb61e2693bb9be5453d775bc3f3e85618d483015\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2c60eddf285bbbf1a5957691f56d895e5bb7461fb18ab7c495879357cfae565\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2c60eddf285bbbf1a5957691f56d895e5bb7461fb18ab7c495879357cfae565\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e8b39d38597192393d90cc582b8ad109a1ed22c4c14f539e1a5445e6d817872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e8b39d38597192393d90cc582b8ad109a1ed22c4c14f539e1a5445e6d817872\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb5ab
c8da49059e23c4013f3856d3f4c9078cd8871a79f8dae5ce4d8cb319ef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb5abc8da49059e23c4013f3856d3f4c9078cd8871a79f8dae5ce4d8cb319ef9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbdd3e8943b356a485d900737682f0903dc04aefa33498ee5cdf3c82b6570852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbdd3e8943b356a485d900737682f0903dc04aefa33498ee5cdf3c82b6570852\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:43Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3641d1f7fd2ea062c216ce260c35d903273d3039fe790815ebb697f8c11d15cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3641d1f7fd2ea062c216ce260c35d903273d3039fe790815ebb697f8c11d15cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n7xcr\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:26Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:26 crc kubenswrapper[4895]: I1202 07:24:26.947637 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:26 crc kubenswrapper[4895]: I1202 07:24:26.947675 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:26 crc kubenswrapper[4895]: I1202 07:24:26.947693 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:26 crc kubenswrapper[4895]: I1202 07:24:26.947719 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:26 crc kubenswrapper[4895]: I1202 07:24:26.947737 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:26Z","lastTransitionTime":"2025-12-02T07:24:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:26 crc kubenswrapper[4895]: I1202 07:24:26.962402 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hlxqt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"30911fe5-208f-44e8-a380-2a0093f24863\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a569570ff32547c25fcdced649773ea0ab6d3aeccdaa5c26aa5a86d2c745535f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87ebbf4f529d4e7b64c392912d06e57dc80a2237a9a4e31ab5b56ac556aa10b1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T07:24:24Z\\\",\\\"message\\\":\\\"2025-12-02T07:23:39+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_8d90423c-940a-42e9-9258-17f96c4385a6\\\\n2025-12-02T07:23:39+00:00 
[cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8d90423c-940a-42e9-9258-17f96c4385a6 to /host/opt/cni/bin/\\\\n2025-12-02T07:23:39Z [verbose] multus-daemon started\\\\n2025-12-02T07:23:39Z [verbose] Readiness Indicator file check\\\\n2025-12-02T07:24:24Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:24:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6l8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hlxqt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:26Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:26 crc kubenswrapper[4895]: I1202 07:24:26.976836 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2df36ca5-8a5e-4865-8728-b8567297546a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f076cfc3bd5e782332e827d56325b73cf2a9f35248f397338b5130793481af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de2bad51e8d51f1243a2fe7799554c7f089425e3623a6bb62f7393610d70fc4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b737e88ad2a8caf70b89f59db461d355846d5612a29a249107b49fbec176204\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9d30ec3b0f73c029956f058f1d8f06cfffb5533dcd0c221ab6842d277f274a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://c9d30ec3b0f73c029956f058f1d8f06cfffb5533dcd0c221ab6842d277f274a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:09Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:09Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:26Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:26 crc kubenswrapper[4895]: I1202 07:24:26.991688 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:26Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:27 crc kubenswrapper[4895]: I1202 07:24:27.004378 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lhpbd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"81128292-8c02-45bd-9b25-e9457e989975\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cc63751b6428c88deed36b914b2a95a94a50b544c562126c2a9567c177b2570\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2z5n2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lhpbd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:27Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:27 crc kubenswrapper[4895]: I1202 07:24:27.016348 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wxp5p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a0b438f-70fd-4c3f-a170-2b3fe9797955\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7329cf1034b2d51ebb7f978448558a8b6fa39d58823289c470fddcfed262e395\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj6l9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cbc34b4d78cc00ac79b9944d92c6ac61a9d5dd704f36a473950baf53dfcbf47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj6l9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wxp5p\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:27Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:27 crc kubenswrapper[4895]: I1202 07:24:27.050849 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:27 crc kubenswrapper[4895]: I1202 07:24:27.050901 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:27 crc kubenswrapper[4895]: I1202 07:24:27.050913 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:27 crc kubenswrapper[4895]: I1202 07:24:27.050935 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:27 crc kubenswrapper[4895]: I1202 07:24:27.050949 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:27Z","lastTransitionTime":"2025-12-02T07:24:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:24:27 crc kubenswrapper[4895]: I1202 07:24:27.140525 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:24:27 crc kubenswrapper[4895]: E1202 07:24:27.140970 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 07:24:27 crc kubenswrapper[4895]: I1202 07:24:27.141088 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:24:27 crc kubenswrapper[4895]: E1202 07:24:27.141290 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 07:24:27 crc kubenswrapper[4895]: I1202 07:24:27.153832 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:27 crc kubenswrapper[4895]: I1202 07:24:27.153914 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:27 crc kubenswrapper[4895]: I1202 07:24:27.153934 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:27 crc kubenswrapper[4895]: I1202 07:24:27.154391 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:27 crc kubenswrapper[4895]: I1202 07:24:27.154456 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:27Z","lastTransitionTime":"2025-12-02T07:24:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:27 crc kubenswrapper[4895]: I1202 07:24:27.256869 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:27 crc kubenswrapper[4895]: I1202 07:24:27.256921 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:27 crc kubenswrapper[4895]: I1202 07:24:27.256932 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:27 crc kubenswrapper[4895]: I1202 07:24:27.256951 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:27 crc kubenswrapper[4895]: I1202 07:24:27.256961 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:27Z","lastTransitionTime":"2025-12-02T07:24:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:27 crc kubenswrapper[4895]: I1202 07:24:27.359438 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:27 crc kubenswrapper[4895]: I1202 07:24:27.359507 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:27 crc kubenswrapper[4895]: I1202 07:24:27.359526 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:27 crc kubenswrapper[4895]: I1202 07:24:27.359556 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:27 crc kubenswrapper[4895]: I1202 07:24:27.359576 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:27Z","lastTransitionTime":"2025-12-02T07:24:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:27 crc kubenswrapper[4895]: I1202 07:24:27.463689 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:27 crc kubenswrapper[4895]: I1202 07:24:27.463813 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:27 crc kubenswrapper[4895]: I1202 07:24:27.463829 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:27 crc kubenswrapper[4895]: I1202 07:24:27.463848 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:27 crc kubenswrapper[4895]: I1202 07:24:27.463857 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:27Z","lastTransitionTime":"2025-12-02T07:24:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:27 crc kubenswrapper[4895]: I1202 07:24:27.567888 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:27 crc kubenswrapper[4895]: I1202 07:24:27.567970 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:27 crc kubenswrapper[4895]: I1202 07:24:27.567999 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:27 crc kubenswrapper[4895]: I1202 07:24:27.568040 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:27 crc kubenswrapper[4895]: I1202 07:24:27.568062 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:27Z","lastTransitionTime":"2025-12-02T07:24:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:27 crc kubenswrapper[4895]: I1202 07:24:27.672137 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:27 crc kubenswrapper[4895]: I1202 07:24:27.672244 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:27 crc kubenswrapper[4895]: I1202 07:24:27.672265 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:27 crc kubenswrapper[4895]: I1202 07:24:27.672289 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:27 crc kubenswrapper[4895]: I1202 07:24:27.672303 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:27Z","lastTransitionTime":"2025-12-02T07:24:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:27 crc kubenswrapper[4895]: I1202 07:24:27.759114 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:27 crc kubenswrapper[4895]: I1202 07:24:27.759204 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:27 crc kubenswrapper[4895]: I1202 07:24:27.759226 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:27 crc kubenswrapper[4895]: I1202 07:24:27.759255 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:27 crc kubenswrapper[4895]: I1202 07:24:27.759274 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:27Z","lastTransitionTime":"2025-12-02T07:24:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:27 crc kubenswrapper[4895]: E1202 07:24:27.779016 4895 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:24:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:24:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:24:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:24:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:24:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:24:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:24:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:24:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"42683c5b-b2bf-439b-8ee4-25c8d72cfed1\\\",\\\"systemUUID\\\":\\\"92c636b1-dcb0-457f-b098-73baeaac297e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:27Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:27 crc kubenswrapper[4895]: I1202 07:24:27.785301 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:27 crc kubenswrapper[4895]: I1202 07:24:27.785376 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:27 crc kubenswrapper[4895]: I1202 07:24:27.785395 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:27 crc kubenswrapper[4895]: I1202 07:24:27.785423 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:27 crc kubenswrapper[4895]: I1202 07:24:27.785456 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:27Z","lastTransitionTime":"2025-12-02T07:24:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:27 crc kubenswrapper[4895]: E1202 07:24:27.809425 4895 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:24:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:24:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:24:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:24:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:24:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:24:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:24:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:24:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"42683c5b-b2bf-439b-8ee4-25c8d72cfed1\\\",\\\"systemUUID\\\":\\\"92c636b1-dcb0-457f-b098-73baeaac297e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:27Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:27 crc kubenswrapper[4895]: I1202 07:24:27.815427 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:27 crc kubenswrapper[4895]: I1202 07:24:27.815497 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:27 crc kubenswrapper[4895]: I1202 07:24:27.815519 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:27 crc kubenswrapper[4895]: I1202 07:24:27.815550 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:27 crc kubenswrapper[4895]: I1202 07:24:27.815568 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:27Z","lastTransitionTime":"2025-12-02T07:24:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:27 crc kubenswrapper[4895]: E1202 07:24:27.839489 4895 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:24:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:24:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:24:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:24:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:24:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:24:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:24:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:24:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"42683c5b-b2bf-439b-8ee4-25c8d72cfed1\\\",\\\"systemUUID\\\":\\\"92c636b1-dcb0-457f-b098-73baeaac297e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:27Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:27 crc kubenswrapper[4895]: I1202 07:24:27.845859 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:27 crc kubenswrapper[4895]: I1202 07:24:27.845913 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:27 crc kubenswrapper[4895]: I1202 07:24:27.845927 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:27 crc kubenswrapper[4895]: I1202 07:24:27.845949 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:27 crc kubenswrapper[4895]: I1202 07:24:27.845963 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:27Z","lastTransitionTime":"2025-12-02T07:24:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:27 crc kubenswrapper[4895]: E1202 07:24:27.868195 4895 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:24:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:24:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:24:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:24:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:24:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:24:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:24:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:24:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"42683c5b-b2bf-439b-8ee4-25c8d72cfed1\\\",\\\"systemUUID\\\":\\\"92c636b1-dcb0-457f-b098-73baeaac297e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:27Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:27 crc kubenswrapper[4895]: I1202 07:24:27.874864 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:27 crc kubenswrapper[4895]: I1202 07:24:27.874929 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:27 crc kubenswrapper[4895]: I1202 07:24:27.874949 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:27 crc kubenswrapper[4895]: I1202 07:24:27.874979 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:27 crc kubenswrapper[4895]: I1202 07:24:27.874999 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:27Z","lastTransitionTime":"2025-12-02T07:24:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:27 crc kubenswrapper[4895]: E1202 07:24:27.897026 4895 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:24:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:24:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:24:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:24:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:24:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:24:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:24:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:24:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"42683c5b-b2bf-439b-8ee4-25c8d72cfed1\\\",\\\"systemUUID\\\":\\\"92c636b1-dcb0-457f-b098-73baeaac297e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:27Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:27 crc kubenswrapper[4895]: E1202 07:24:27.897251 4895 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 02 07:24:27 crc kubenswrapper[4895]: I1202 07:24:27.899525 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:27 crc kubenswrapper[4895]: I1202 07:24:27.899573 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:27 crc kubenswrapper[4895]: I1202 07:24:27.899589 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:27 crc kubenswrapper[4895]: I1202 07:24:27.899612 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:27 crc kubenswrapper[4895]: I1202 07:24:27.899628 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:27Z","lastTransitionTime":"2025-12-02T07:24:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:28 crc kubenswrapper[4895]: I1202 07:24:28.003926 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:28 crc kubenswrapper[4895]: I1202 07:24:28.004002 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:28 crc kubenswrapper[4895]: I1202 07:24:28.004017 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:28 crc kubenswrapper[4895]: I1202 07:24:28.004043 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:28 crc kubenswrapper[4895]: I1202 07:24:28.004057 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:28Z","lastTransitionTime":"2025-12-02T07:24:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:28 crc kubenswrapper[4895]: I1202 07:24:28.107938 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:28 crc kubenswrapper[4895]: I1202 07:24:28.108009 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:28 crc kubenswrapper[4895]: I1202 07:24:28.108026 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:28 crc kubenswrapper[4895]: I1202 07:24:28.108051 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:28 crc kubenswrapper[4895]: I1202 07:24:28.108072 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:28Z","lastTransitionTime":"2025-12-02T07:24:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:24:28 crc kubenswrapper[4895]: I1202 07:24:28.140258 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5f88v" Dec 02 07:24:28 crc kubenswrapper[4895]: E1202 07:24:28.140498 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5f88v" podUID="5af25091-1401-45d4-ae53-d2b469c879da" Dec 02 07:24:28 crc kubenswrapper[4895]: I1202 07:24:28.141020 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:24:28 crc kubenswrapper[4895]: E1202 07:24:28.141139 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 07:24:28 crc kubenswrapper[4895]: I1202 07:24:28.212299 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:28 crc kubenswrapper[4895]: I1202 07:24:28.212365 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:28 crc kubenswrapper[4895]: I1202 07:24:28.212390 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:28 crc kubenswrapper[4895]: I1202 07:24:28.212416 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:28 crc kubenswrapper[4895]: I1202 07:24:28.212434 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:28Z","lastTransitionTime":"2025-12-02T07:24:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:28 crc kubenswrapper[4895]: I1202 07:24:28.315357 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:28 crc kubenswrapper[4895]: I1202 07:24:28.315416 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:28 crc kubenswrapper[4895]: I1202 07:24:28.315435 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:28 crc kubenswrapper[4895]: I1202 07:24:28.315467 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:28 crc kubenswrapper[4895]: I1202 07:24:28.315487 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:28Z","lastTransitionTime":"2025-12-02T07:24:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:28 crc kubenswrapper[4895]: I1202 07:24:28.419088 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:28 crc kubenswrapper[4895]: I1202 07:24:28.419153 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:28 crc kubenswrapper[4895]: I1202 07:24:28.419173 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:28 crc kubenswrapper[4895]: I1202 07:24:28.419200 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:28 crc kubenswrapper[4895]: I1202 07:24:28.419219 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:28Z","lastTransitionTime":"2025-12-02T07:24:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:28 crc kubenswrapper[4895]: I1202 07:24:28.522876 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:28 crc kubenswrapper[4895]: I1202 07:24:28.522979 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:28 crc kubenswrapper[4895]: I1202 07:24:28.523015 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:28 crc kubenswrapper[4895]: I1202 07:24:28.523055 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:28 crc kubenswrapper[4895]: I1202 07:24:28.523081 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:28Z","lastTransitionTime":"2025-12-02T07:24:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:28 crc kubenswrapper[4895]: I1202 07:24:28.627118 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:28 crc kubenswrapper[4895]: I1202 07:24:28.627191 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:28 crc kubenswrapper[4895]: I1202 07:24:28.627209 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:28 crc kubenswrapper[4895]: I1202 07:24:28.627236 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:28 crc kubenswrapper[4895]: I1202 07:24:28.627255 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:28Z","lastTransitionTime":"2025-12-02T07:24:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:28 crc kubenswrapper[4895]: I1202 07:24:28.732158 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:28 crc kubenswrapper[4895]: I1202 07:24:28.732234 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:28 crc kubenswrapper[4895]: I1202 07:24:28.732259 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:28 crc kubenswrapper[4895]: I1202 07:24:28.732298 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:28 crc kubenswrapper[4895]: I1202 07:24:28.732327 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:28Z","lastTransitionTime":"2025-12-02T07:24:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:28 crc kubenswrapper[4895]: I1202 07:24:28.836776 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:28 crc kubenswrapper[4895]: I1202 07:24:28.836879 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:28 crc kubenswrapper[4895]: I1202 07:24:28.836906 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:28 crc kubenswrapper[4895]: I1202 07:24:28.836950 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:28 crc kubenswrapper[4895]: I1202 07:24:28.836979 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:28Z","lastTransitionTime":"2025-12-02T07:24:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:28 crc kubenswrapper[4895]: I1202 07:24:28.941210 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:28 crc kubenswrapper[4895]: I1202 07:24:28.941321 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:28 crc kubenswrapper[4895]: I1202 07:24:28.941347 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:28 crc kubenswrapper[4895]: I1202 07:24:28.941387 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:28 crc kubenswrapper[4895]: I1202 07:24:28.941415 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:28Z","lastTransitionTime":"2025-12-02T07:24:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:29 crc kubenswrapper[4895]: I1202 07:24:29.044626 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:29 crc kubenswrapper[4895]: I1202 07:24:29.044721 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:29 crc kubenswrapper[4895]: I1202 07:24:29.044806 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:29 crc kubenswrapper[4895]: I1202 07:24:29.044844 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:29 crc kubenswrapper[4895]: I1202 07:24:29.044864 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:29Z","lastTransitionTime":"2025-12-02T07:24:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:24:29 crc kubenswrapper[4895]: I1202 07:24:29.140448 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:24:29 crc kubenswrapper[4895]: I1202 07:24:29.140530 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:24:29 crc kubenswrapper[4895]: E1202 07:24:29.140733 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 07:24:29 crc kubenswrapper[4895]: E1202 07:24:29.140962 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 07:24:29 crc kubenswrapper[4895]: I1202 07:24:29.148440 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:29 crc kubenswrapper[4895]: I1202 07:24:29.148497 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:29 crc kubenswrapper[4895]: I1202 07:24:29.148516 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:29 crc kubenswrapper[4895]: I1202 07:24:29.148543 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:29 crc kubenswrapper[4895]: I1202 07:24:29.148562 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:29Z","lastTransitionTime":"2025-12-02T07:24:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:29 crc kubenswrapper[4895]: I1202 07:24:29.163642 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wxp5p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a0b438f-70fd-4c3f-a170-2b3fe9797955\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7329cf1034b2d51ebb7f978448558a8b6fa39d58823289c470fddcfed262e395\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj6l9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cbc34b4d78cc00ac79b9944d92c6ac61a9d5dd704f36a473950baf53dfcbf47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj6l9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wxp5p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:29Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:29 crc kubenswrapper[4895]: I1202 07:24:29.185843 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2df36ca5-8a5e-4865-8728-b8567297546a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f076cfc3bd5e782332e827d56325b73cf2a9f35248f397338b5130793481af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de2bad51e8d51f1243a2fe7799554c7f089425e3623a6bb62f7393610d70fc4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b737e88ad2a8caf70b89f59db461d355846d5612a29a249107b49fbec176204\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9d30ec3b0f73c029956f058f1d8f06cfffb5533dcd0c221ab6842d277f274a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\
\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9d30ec3b0f73c029956f058f1d8f06cfffb5533dcd0c221ab6842d277f274a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:09Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:09Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:29Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:29 crc kubenswrapper[4895]: I1202 07:24:29.207562 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:29Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:29 crc kubenswrapper[4895]: I1202 07:24:29.226237 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lhpbd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"81128292-8c02-45bd-9b25-e9457e989975\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cc63751b6428c88deed36b914b2a95a94a50b544c562126c2a9567c177b2570\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2z5n2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lhpbd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:29Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:29 crc kubenswrapper[4895]: I1202 07:24:29.251489 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:29 crc kubenswrapper[4895]: I1202 07:24:29.251875 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:29 crc kubenswrapper[4895]: I1202 07:24:29.252014 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:29 crc kubenswrapper[4895]: I1202 07:24:29.252144 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:29 crc kubenswrapper[4895]: I1202 07:24:29.252247 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:29Z","lastTransitionTime":"2025-12-02T07:24:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:29 crc kubenswrapper[4895]: I1202 07:24:29.264489 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"afc3334a-0153-4dcc-9a56-92f6cae51c08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91b0c6e50085a7be0924e5a54ef006110a7f13556fe47f7f55124fd6b2f0bc36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5463b077f0bc8b5d38e27854ece6a71f74f7769f4552c32eefe16854993f290f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80dca705ddb94ba799eb72bfc23b0c3878cceca48d6d10f1c7f8eb0e4beed40e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f54ae324450e73bcbf2c408794171a92bbb66c24d05f29584ac2c247090c1273\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f330d728eecd672ecb554744a51ebfb2c104441d3a85fe0c8a6fc3446c37e4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca732cac495c410d89da4ce1476acd20905716cbb2baeedac9c52b654ff58573\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff016284272b93427bb0487a4661973b731200b8b4d28ab1c6d2d512bf5742e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff016284272b93427bb0487a4661973b731200b8b4d28ab1c6d2d512bf5742e9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T07:24:03Z\\\",\\\"message\\\":\\\"s.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1202 07:24:03.168657 6529 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1202 07:24:03.168679 6529 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1202 07:24:03.168706 6529 handler.go:208] Removed *v1.Pod event handler 
6\\\\nI1202 07:24:03.168812 6529 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1202 07:24:03.168811 6529 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1202 07:24:03.168829 6529 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1202 07:24:03.168889 6529 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1202 07:24:03.168921 6529 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1202 07:24:03.168943 6529 factory.go:656] Stopping watch factory\\\\nI1202 07:24:03.168956 6529 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1202 07:24:03.168969 6529 handler.go:208] Removed *v1.Node event handler 2\\\\nI1202 07:24:03.168979 6529 handler.go:208] Removed *v1.Node event handler 7\\\\nI1202 07:24:03.168988 6529 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1202 07:24:03.169421 6529 ovnkube.go:599] Stopped ovnkube\\\\nI1202 07:24:03.169503 6529 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1202 07:24:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:24:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-w54m4_openshift-ovn-kubernetes(afc3334a-0153-4dcc-9a56-92f6cae51c08)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6447684fe6189e38c9caad6dfe1384b7af01d50e2ef5f889c7b2874ba092306\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://328fa773c92ef0a965065412d4141c0b41ffe38d21ed169386a9d30d05f37f28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://328fa773c92ef0a965
065412d4141c0b41ffe38d21ed169386a9d30d05f37f28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w54m4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:29Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:29 crc kubenswrapper[4895]: I1202 07:24:29.285622 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2db13abc9ba56d955814e75ceea54b92610a00daeefa29aa1aad584e3453728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:29Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:29 crc kubenswrapper[4895]: I1202 07:24:29.311293 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bb4bea68942dbface0009fbac76f7b1253ab282e8decb9081225de7d6537ec0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://3115f2c50d05ee406f583937e39b1b55f364ce7cbeb0788b75941a22e23f57f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:29Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:29 crc kubenswrapper[4895]: I1202 07:24:29.332768 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-74fkh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b212bd8-f1a7-4982-b2e6-499c13a34b0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4d58e6f37ee8f90af6aad88138c39e9a0753c16cceb763c2bd2ec9c30087d92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhpwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-74fkh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:29Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:29 crc kubenswrapper[4895]: I1202 07:24:29.353154 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0468d2d1-a975-45a6-af9f-47adc432fab0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebf8583b59d7955530da7a83a29f9994ba139c076cf7b76dc94e88d14f39e1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vfwjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1474e8f5989607f3b95fefe811e3d787320f04c223edb6ffe47d2b37863ea874\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vfwjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wfcg7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:29Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:29 crc kubenswrapper[4895]: I1202 07:24:29.355264 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:29 crc kubenswrapper[4895]: I1202 07:24:29.355295 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:29 crc kubenswrapper[4895]: I1202 07:24:29.355307 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:29 crc kubenswrapper[4895]: I1202 07:24:29.355325 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:29 crc kubenswrapper[4895]: I1202 07:24:29.355338 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:29Z","lastTransitionTime":"2025-12-02T07:24:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:29 crc kubenswrapper[4895]: I1202 07:24:29.374346 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b7fa1ad-7e38-4645-ab41-c7d395f5c10e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40e020a2b4caf39d1661cee9c6b0055eca86e0ae9b81d2da7485ca9ec92dd0aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33e300332628d3baf9041b52dc7b8cd9dab1e9c5da76572e86b787433fd5fb92\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4455fec2a817e7266e152d3f4afbede483504d6066f1a0f7375bce282ae10f8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b674e20b2199fd0083aed9cf001d27a72d31ca6c62cbe376c9510cc94f37d312\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9998dffc20c5b0579e6214d10541b9702e9c4a6563c930a41e49b6853b832ba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T07:23:32Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 07:23:21.537254 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 07:23:21.538901 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1012964219/tls.crt::/tmp/serving-cert-1012964219/tls.key\\\\\\\"\\\\nI1202 07:23:32.848375 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 07:23:32.853597 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 07:23:32.853643 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 07:23:32.853680 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 07:23:32.853691 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 07:23:32.869490 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 07:23:32.869536 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:23:32.869542 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:23:32.869555 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 07:23:32.869559 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 07:23:32.869564 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 07:23:32.869568 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 07:23:32.869624 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 07:23:32.874299 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d318f71ad32dd4a410093b85741038e468d540edd30f112234a9d595bc6fd85f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0faa0995131a1f9cbf2aa0cf7651d510fad1bdaf7e32de51f8a14b70f161e783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0faa0995131a1f9cbf2aa0cf7651d510fad1bdaf7e32de51f8a14b70f161e783\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:29Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:29 crc kubenswrapper[4895]: I1202 07:24:29.394265 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:29Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:29 crc kubenswrapper[4895]: I1202 07:24:29.413355 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n7xcr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3514f381-d0d1-4e00-931e-c5ca75202a1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2670ef1ea22a6c1c5d9ec4ee4b0345c575b20224fddf8655b95eaa431573d071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3deeb37702c8243a956f2be1bb61e2693bb9be5453d775bc3f3e85618d483015\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3deeb37702c8243a956f2be1bb61e2693bb9be5453d775bc3f3e85618d483015\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2c60eddf285bbbf1a5957691f56d895e5bb7461fb18ab7c495879357cfae565\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2c60eddf285bbbf1a5957691f56d895e5bb7461fb18ab7c495879357cfae565\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e8b39d38597192393d90cc582b8ad109a1ed22c4c14f539e1a5445e6d817872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e8b39d38597192393d90cc582b8ad109a1ed22c4c14f539e1a5445e6d817872\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb5ab
c8da49059e23c4013f3856d3f4c9078cd8871a79f8dae5ce4d8cb319ef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb5abc8da49059e23c4013f3856d3f4c9078cd8871a79f8dae5ce4d8cb319ef9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbdd3e8943b356a485d900737682f0903dc04aefa33498ee5cdf3c82b6570852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbdd3e8943b356a485d900737682f0903dc04aefa33498ee5cdf3c82b6570852\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:43Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3641d1f7fd2ea062c216ce260c35d903273d3039fe790815ebb697f8c11d15cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3641d1f7fd2ea062c216ce260c35d903273d3039fe790815ebb697f8c11d15cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n7xcr\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:29Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:29 crc kubenswrapper[4895]: I1202 07:24:29.431360 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hlxqt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"30911fe5-208f-44e8-a380-2a0093f24863\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a569570ff32547c25fcdced649773ea0ab6d3aeccdaa5c26aa5a86d2c745535f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87ebbf4f529d4e7b64c392912d06e57dc80a2237a9a4e31ab5b56ac556aa10b1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\
":\\\"2025-12-02T07:24:24Z\\\",\\\"message\\\":\\\"2025-12-02T07:23:39+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_8d90423c-940a-42e9-9258-17f96c4385a6\\\\n2025-12-02T07:23:39+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8d90423c-940a-42e9-9258-17f96c4385a6 to /host/opt/cni/bin/\\\\n2025-12-02T07:23:39Z [verbose] multus-daemon started\\\\n2025-12-02T07:23:39Z [verbose] Readiness Indicator file check\\\\n2025-12-02T07:24:24Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:24:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\
\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6l8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hlxqt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:29Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:29 crc kubenswrapper[4895]: I1202 07:24:29.445758 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5f88v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5af25091-1401-45d4-ae53-d2b469c879da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wfrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wfrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5f88v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:29Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:29 crc 
kubenswrapper[4895]: I1202 07:24:29.458972 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:29 crc kubenswrapper[4895]: I1202 07:24:29.459030 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:29 crc kubenswrapper[4895]: I1202 07:24:29.459068 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:29 crc kubenswrapper[4895]: I1202 07:24:29.459098 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:29 crc kubenswrapper[4895]: I1202 07:24:29.459115 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:29Z","lastTransitionTime":"2025-12-02T07:24:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:29 crc kubenswrapper[4895]: I1202 07:24:29.464965 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4604b0d5-cf8f-404d-af67-65da4b1dc003\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b3c366f0f27ff793a37b2944887ab1c1bc155c14d5c8d6c267442b0a8666ef8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6c3baa5b37
72a8fc64774e28c3f389f114723e7efa62b934140a4fed8b6a40d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49d7c12ea2cd199dcfa46a6f1a1a856f1285a709def9d82b6805ba256a79fd5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2543d364f902a6c34a82dac25f389cd36afc2f717d772d0a487a640d44adf30f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:29Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:29 crc kubenswrapper[4895]: I1202 07:24:29.481925 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://444ad02c4afa919779de59fd8c81bdce5c3426dd6451b90505921a8d5fa0c96b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-02T07:24:29Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:29 crc kubenswrapper[4895]: I1202 07:24:29.496025 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:29Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:29 crc kubenswrapper[4895]: I1202 07:24:29.561698 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:29 crc kubenswrapper[4895]: I1202 07:24:29.561828 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:29 crc kubenswrapper[4895]: I1202 07:24:29.561855 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:29 crc kubenswrapper[4895]: I1202 07:24:29.561888 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:29 crc kubenswrapper[4895]: I1202 07:24:29.561913 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:29Z","lastTransitionTime":"2025-12-02T07:24:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:24:29 crc kubenswrapper[4895]: I1202 07:24:29.665827 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:29 crc kubenswrapper[4895]: I1202 07:24:29.665903 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:29 crc kubenswrapper[4895]: I1202 07:24:29.665920 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:29 crc kubenswrapper[4895]: I1202 07:24:29.665946 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:29 crc kubenswrapper[4895]: I1202 07:24:29.665963 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:29Z","lastTransitionTime":"2025-12-02T07:24:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:29 crc kubenswrapper[4895]: I1202 07:24:29.769695 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:29 crc kubenswrapper[4895]: I1202 07:24:29.769787 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:29 crc kubenswrapper[4895]: I1202 07:24:29.769804 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:29 crc kubenswrapper[4895]: I1202 07:24:29.769825 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:29 crc kubenswrapper[4895]: I1202 07:24:29.769843 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:29Z","lastTransitionTime":"2025-12-02T07:24:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:29 crc kubenswrapper[4895]: I1202 07:24:29.873058 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:29 crc kubenswrapper[4895]: I1202 07:24:29.873119 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:29 crc kubenswrapper[4895]: I1202 07:24:29.873140 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:29 crc kubenswrapper[4895]: I1202 07:24:29.873166 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:29 crc kubenswrapper[4895]: I1202 07:24:29.873185 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:29Z","lastTransitionTime":"2025-12-02T07:24:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:29 crc kubenswrapper[4895]: I1202 07:24:29.976882 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:29 crc kubenswrapper[4895]: I1202 07:24:29.976941 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:29 crc kubenswrapper[4895]: I1202 07:24:29.976954 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:29 crc kubenswrapper[4895]: I1202 07:24:29.976976 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:29 crc kubenswrapper[4895]: I1202 07:24:29.976990 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:29Z","lastTransitionTime":"2025-12-02T07:24:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:30 crc kubenswrapper[4895]: I1202 07:24:30.081290 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:30 crc kubenswrapper[4895]: I1202 07:24:30.081883 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:30 crc kubenswrapper[4895]: I1202 07:24:30.082130 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:30 crc kubenswrapper[4895]: I1202 07:24:30.082372 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:30 crc kubenswrapper[4895]: I1202 07:24:30.082603 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:30Z","lastTransitionTime":"2025-12-02T07:24:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:24:30 crc kubenswrapper[4895]: I1202 07:24:30.140113 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5f88v" Dec 02 07:24:30 crc kubenswrapper[4895]: I1202 07:24:30.140132 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:24:30 crc kubenswrapper[4895]: E1202 07:24:30.140675 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 07:24:30 crc kubenswrapper[4895]: E1202 07:24:30.140929 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5f88v" podUID="5af25091-1401-45d4-ae53-d2b469c879da" Dec 02 07:24:30 crc kubenswrapper[4895]: I1202 07:24:30.186414 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:30 crc kubenswrapper[4895]: I1202 07:24:30.186486 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:30 crc kubenswrapper[4895]: I1202 07:24:30.186507 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:30 crc kubenswrapper[4895]: I1202 07:24:30.186537 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:30 crc kubenswrapper[4895]: I1202 07:24:30.186573 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:30Z","lastTransitionTime":"2025-12-02T07:24:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:30 crc kubenswrapper[4895]: I1202 07:24:30.290607 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:30 crc kubenswrapper[4895]: I1202 07:24:30.290678 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:30 crc kubenswrapper[4895]: I1202 07:24:30.290698 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:30 crc kubenswrapper[4895]: I1202 07:24:30.290730 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:30 crc kubenswrapper[4895]: I1202 07:24:30.290779 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:30Z","lastTransitionTime":"2025-12-02T07:24:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:30 crc kubenswrapper[4895]: I1202 07:24:30.395945 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:30 crc kubenswrapper[4895]: I1202 07:24:30.396434 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:30 crc kubenswrapper[4895]: I1202 07:24:30.396627 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:30 crc kubenswrapper[4895]: I1202 07:24:30.396899 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:30 crc kubenswrapper[4895]: I1202 07:24:30.397132 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:30Z","lastTransitionTime":"2025-12-02T07:24:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:30 crc kubenswrapper[4895]: I1202 07:24:30.500883 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:30 crc kubenswrapper[4895]: I1202 07:24:30.500947 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:30 crc kubenswrapper[4895]: I1202 07:24:30.500966 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:30 crc kubenswrapper[4895]: I1202 07:24:30.500994 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:30 crc kubenswrapper[4895]: I1202 07:24:30.501014 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:30Z","lastTransitionTime":"2025-12-02T07:24:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:30 crc kubenswrapper[4895]: I1202 07:24:30.604971 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:30 crc kubenswrapper[4895]: I1202 07:24:30.605063 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:30 crc kubenswrapper[4895]: I1202 07:24:30.605093 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:30 crc kubenswrapper[4895]: I1202 07:24:30.605128 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:30 crc kubenswrapper[4895]: I1202 07:24:30.605158 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:30Z","lastTransitionTime":"2025-12-02T07:24:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:30 crc kubenswrapper[4895]: I1202 07:24:30.707806 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:30 crc kubenswrapper[4895]: I1202 07:24:30.707895 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:30 crc kubenswrapper[4895]: I1202 07:24:30.707921 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:30 crc kubenswrapper[4895]: I1202 07:24:30.707981 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:30 crc kubenswrapper[4895]: I1202 07:24:30.708010 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:30Z","lastTransitionTime":"2025-12-02T07:24:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:30 crc kubenswrapper[4895]: I1202 07:24:30.811801 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:30 crc kubenswrapper[4895]: I1202 07:24:30.811882 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:30 crc kubenswrapper[4895]: I1202 07:24:30.811908 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:30 crc kubenswrapper[4895]: I1202 07:24:30.811939 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:30 crc kubenswrapper[4895]: I1202 07:24:30.811966 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:30Z","lastTransitionTime":"2025-12-02T07:24:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:30 crc kubenswrapper[4895]: I1202 07:24:30.915515 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:30 crc kubenswrapper[4895]: I1202 07:24:30.915567 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:30 crc kubenswrapper[4895]: I1202 07:24:30.915576 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:30 crc kubenswrapper[4895]: I1202 07:24:30.915600 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:30 crc kubenswrapper[4895]: I1202 07:24:30.915611 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:30Z","lastTransitionTime":"2025-12-02T07:24:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:31 crc kubenswrapper[4895]: I1202 07:24:31.019260 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:31 crc kubenswrapper[4895]: I1202 07:24:31.019364 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:31 crc kubenswrapper[4895]: I1202 07:24:31.019376 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:31 crc kubenswrapper[4895]: I1202 07:24:31.019398 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:31 crc kubenswrapper[4895]: I1202 07:24:31.019409 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:31Z","lastTransitionTime":"2025-12-02T07:24:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:31 crc kubenswrapper[4895]: I1202 07:24:31.122881 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:31 crc kubenswrapper[4895]: I1202 07:24:31.122969 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:31 crc kubenswrapper[4895]: I1202 07:24:31.123029 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:31 crc kubenswrapper[4895]: I1202 07:24:31.123061 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:31 crc kubenswrapper[4895]: I1202 07:24:31.123116 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:31Z","lastTransitionTime":"2025-12-02T07:24:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:24:31 crc kubenswrapper[4895]: I1202 07:24:31.144695 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:24:31 crc kubenswrapper[4895]: I1202 07:24:31.144712 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:24:31 crc kubenswrapper[4895]: E1202 07:24:31.145011 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 07:24:31 crc kubenswrapper[4895]: E1202 07:24:31.145194 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 07:24:31 crc kubenswrapper[4895]: I1202 07:24:31.227545 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:31 crc kubenswrapper[4895]: I1202 07:24:31.227609 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:31 crc kubenswrapper[4895]: I1202 07:24:31.227628 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:31 crc kubenswrapper[4895]: I1202 07:24:31.227657 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:31 crc kubenswrapper[4895]: I1202 07:24:31.227674 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:31Z","lastTransitionTime":"2025-12-02T07:24:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:31 crc kubenswrapper[4895]: I1202 07:24:31.330763 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:31 crc kubenswrapper[4895]: I1202 07:24:31.330817 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:31 crc kubenswrapper[4895]: I1202 07:24:31.330828 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:31 crc kubenswrapper[4895]: I1202 07:24:31.330847 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:31 crc kubenswrapper[4895]: I1202 07:24:31.330858 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:31Z","lastTransitionTime":"2025-12-02T07:24:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:31 crc kubenswrapper[4895]: I1202 07:24:31.434470 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:31 crc kubenswrapper[4895]: I1202 07:24:31.434529 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:31 crc kubenswrapper[4895]: I1202 07:24:31.434545 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:31 crc kubenswrapper[4895]: I1202 07:24:31.434570 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:31 crc kubenswrapper[4895]: I1202 07:24:31.434590 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:31Z","lastTransitionTime":"2025-12-02T07:24:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:31 crc kubenswrapper[4895]: I1202 07:24:31.537471 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:31 crc kubenswrapper[4895]: I1202 07:24:31.537801 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:31 crc kubenswrapper[4895]: I1202 07:24:31.537893 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:31 crc kubenswrapper[4895]: I1202 07:24:31.538001 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:31 crc kubenswrapper[4895]: I1202 07:24:31.538094 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:31Z","lastTransitionTime":"2025-12-02T07:24:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:31 crc kubenswrapper[4895]: I1202 07:24:31.641700 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:31 crc kubenswrapper[4895]: I1202 07:24:31.641809 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:31 crc kubenswrapper[4895]: I1202 07:24:31.641828 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:31 crc kubenswrapper[4895]: I1202 07:24:31.641854 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:31 crc kubenswrapper[4895]: I1202 07:24:31.641873 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:31Z","lastTransitionTime":"2025-12-02T07:24:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:31 crc kubenswrapper[4895]: I1202 07:24:31.744520 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:31 crc kubenswrapper[4895]: I1202 07:24:31.744821 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:31 crc kubenswrapper[4895]: I1202 07:24:31.744925 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:31 crc kubenswrapper[4895]: I1202 07:24:31.745047 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:31 crc kubenswrapper[4895]: I1202 07:24:31.745218 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:31Z","lastTransitionTime":"2025-12-02T07:24:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:31 crc kubenswrapper[4895]: I1202 07:24:31.848050 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:31 crc kubenswrapper[4895]: I1202 07:24:31.848121 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:31 crc kubenswrapper[4895]: I1202 07:24:31.848133 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:31 crc kubenswrapper[4895]: I1202 07:24:31.848180 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:31 crc kubenswrapper[4895]: I1202 07:24:31.848192 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:31Z","lastTransitionTime":"2025-12-02T07:24:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:31 crc kubenswrapper[4895]: I1202 07:24:31.951852 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:31 crc kubenswrapper[4895]: I1202 07:24:31.951917 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:31 crc kubenswrapper[4895]: I1202 07:24:31.951939 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:31 crc kubenswrapper[4895]: I1202 07:24:31.951967 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:31 crc kubenswrapper[4895]: I1202 07:24:31.951986 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:31Z","lastTransitionTime":"2025-12-02T07:24:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:32 crc kubenswrapper[4895]: I1202 07:24:32.055951 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:32 crc kubenswrapper[4895]: I1202 07:24:32.056343 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:32 crc kubenswrapper[4895]: I1202 07:24:32.056511 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:32 crc kubenswrapper[4895]: I1202 07:24:32.056660 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:32 crc kubenswrapper[4895]: I1202 07:24:32.056848 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:32Z","lastTransitionTime":"2025-12-02T07:24:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:24:32 crc kubenswrapper[4895]: I1202 07:24:32.140444 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:24:32 crc kubenswrapper[4895]: E1202 07:24:32.140640 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 07:24:32 crc kubenswrapper[4895]: I1202 07:24:32.140480 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5f88v" Dec 02 07:24:32 crc kubenswrapper[4895]: E1202 07:24:32.141667 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5f88v" podUID="5af25091-1401-45d4-ae53-d2b469c879da" Dec 02 07:24:32 crc kubenswrapper[4895]: I1202 07:24:32.160114 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:32 crc kubenswrapper[4895]: I1202 07:24:32.160435 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:32 crc kubenswrapper[4895]: I1202 07:24:32.160567 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:32 crc kubenswrapper[4895]: I1202 07:24:32.160701 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:32 crc kubenswrapper[4895]: I1202 07:24:32.160876 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:32Z","lastTransitionTime":"2025-12-02T07:24:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:32 crc kubenswrapper[4895]: I1202 07:24:32.263593 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:32 crc kubenswrapper[4895]: I1202 07:24:32.264240 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:32 crc kubenswrapper[4895]: I1202 07:24:32.264420 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:32 crc kubenswrapper[4895]: I1202 07:24:32.264531 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:32 crc kubenswrapper[4895]: I1202 07:24:32.264625 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:32Z","lastTransitionTime":"2025-12-02T07:24:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:32 crc kubenswrapper[4895]: I1202 07:24:32.368455 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:32 crc kubenswrapper[4895]: I1202 07:24:32.368785 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:32 crc kubenswrapper[4895]: I1202 07:24:32.368938 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:32 crc kubenswrapper[4895]: I1202 07:24:32.369033 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:32 crc kubenswrapper[4895]: I1202 07:24:32.369125 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:32Z","lastTransitionTime":"2025-12-02T07:24:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:32 crc kubenswrapper[4895]: I1202 07:24:32.473082 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:32 crc kubenswrapper[4895]: I1202 07:24:32.473396 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:32 crc kubenswrapper[4895]: I1202 07:24:32.473495 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:32 crc kubenswrapper[4895]: I1202 07:24:32.473626 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:32 crc kubenswrapper[4895]: I1202 07:24:32.473777 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:32Z","lastTransitionTime":"2025-12-02T07:24:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:32 crc kubenswrapper[4895]: I1202 07:24:32.576700 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:32 crc kubenswrapper[4895]: I1202 07:24:32.576783 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:32 crc kubenswrapper[4895]: I1202 07:24:32.576800 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:32 crc kubenswrapper[4895]: I1202 07:24:32.576823 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:32 crc kubenswrapper[4895]: I1202 07:24:32.576837 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:32Z","lastTransitionTime":"2025-12-02T07:24:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:32 crc kubenswrapper[4895]: I1202 07:24:32.680291 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:32 crc kubenswrapper[4895]: I1202 07:24:32.680345 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:32 crc kubenswrapper[4895]: I1202 07:24:32.680368 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:32 crc kubenswrapper[4895]: I1202 07:24:32.680392 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:32 crc kubenswrapper[4895]: I1202 07:24:32.680410 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:32Z","lastTransitionTime":"2025-12-02T07:24:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:32 crc kubenswrapper[4895]: I1202 07:24:32.783476 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:32 crc kubenswrapper[4895]: I1202 07:24:32.783514 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:32 crc kubenswrapper[4895]: I1202 07:24:32.783527 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:32 crc kubenswrapper[4895]: I1202 07:24:32.783546 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:32 crc kubenswrapper[4895]: I1202 07:24:32.783560 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:32Z","lastTransitionTime":"2025-12-02T07:24:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:32 crc kubenswrapper[4895]: I1202 07:24:32.887346 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:32 crc kubenswrapper[4895]: I1202 07:24:32.887435 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:32 crc kubenswrapper[4895]: I1202 07:24:32.887455 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:32 crc kubenswrapper[4895]: I1202 07:24:32.887486 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:32 crc kubenswrapper[4895]: I1202 07:24:32.887506 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:32Z","lastTransitionTime":"2025-12-02T07:24:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:32 crc kubenswrapper[4895]: I1202 07:24:32.990365 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:32 crc kubenswrapper[4895]: I1202 07:24:32.990446 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:32 crc kubenswrapper[4895]: I1202 07:24:32.990467 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:32 crc kubenswrapper[4895]: I1202 07:24:32.990486 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:32 crc kubenswrapper[4895]: I1202 07:24:32.990531 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:32Z","lastTransitionTime":"2025-12-02T07:24:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:33 crc kubenswrapper[4895]: I1202 07:24:33.093906 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:33 crc kubenswrapper[4895]: I1202 07:24:33.093958 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:33 crc kubenswrapper[4895]: I1202 07:24:33.093969 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:33 crc kubenswrapper[4895]: I1202 07:24:33.093986 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:33 crc kubenswrapper[4895]: I1202 07:24:33.093998 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:33Z","lastTransitionTime":"2025-12-02T07:24:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:24:33 crc kubenswrapper[4895]: I1202 07:24:33.140262 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:24:33 crc kubenswrapper[4895]: I1202 07:24:33.140799 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:24:33 crc kubenswrapper[4895]: E1202 07:24:33.140999 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 07:24:33 crc kubenswrapper[4895]: E1202 07:24:33.141188 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 07:24:33 crc kubenswrapper[4895]: I1202 07:24:33.156026 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Dec 02 07:24:33 crc kubenswrapper[4895]: I1202 07:24:33.197171 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:33 crc kubenswrapper[4895]: I1202 07:24:33.197240 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:33 crc kubenswrapper[4895]: I1202 07:24:33.197255 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:33 crc kubenswrapper[4895]: I1202 07:24:33.197277 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:33 crc kubenswrapper[4895]: I1202 07:24:33.197292 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:33Z","lastTransitionTime":"2025-12-02T07:24:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:33 crc kubenswrapper[4895]: I1202 07:24:33.300986 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:33 crc kubenswrapper[4895]: I1202 07:24:33.301057 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:33 crc kubenswrapper[4895]: I1202 07:24:33.301069 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:33 crc kubenswrapper[4895]: I1202 07:24:33.301088 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:33 crc kubenswrapper[4895]: I1202 07:24:33.301100 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:33Z","lastTransitionTime":"2025-12-02T07:24:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:33 crc kubenswrapper[4895]: I1202 07:24:33.404079 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:33 crc kubenswrapper[4895]: I1202 07:24:33.404132 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:33 crc kubenswrapper[4895]: I1202 07:24:33.404142 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:33 crc kubenswrapper[4895]: I1202 07:24:33.404165 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:33 crc kubenswrapper[4895]: I1202 07:24:33.404176 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:33Z","lastTransitionTime":"2025-12-02T07:24:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:33 crc kubenswrapper[4895]: I1202 07:24:33.507022 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:33 crc kubenswrapper[4895]: I1202 07:24:33.507095 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:33 crc kubenswrapper[4895]: I1202 07:24:33.507115 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:33 crc kubenswrapper[4895]: I1202 07:24:33.507148 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:33 crc kubenswrapper[4895]: I1202 07:24:33.507169 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:33Z","lastTransitionTime":"2025-12-02T07:24:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:33 crc kubenswrapper[4895]: I1202 07:24:33.610468 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:33 crc kubenswrapper[4895]: I1202 07:24:33.610523 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:33 crc kubenswrapper[4895]: I1202 07:24:33.610537 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:33 crc kubenswrapper[4895]: I1202 07:24:33.610566 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:33 crc kubenswrapper[4895]: I1202 07:24:33.610579 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:33Z","lastTransitionTime":"2025-12-02T07:24:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:33 crc kubenswrapper[4895]: I1202 07:24:33.713498 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:33 crc kubenswrapper[4895]: I1202 07:24:33.713554 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:33 crc kubenswrapper[4895]: I1202 07:24:33.713566 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:33 crc kubenswrapper[4895]: I1202 07:24:33.713587 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:33 crc kubenswrapper[4895]: I1202 07:24:33.713599 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:33Z","lastTransitionTime":"2025-12-02T07:24:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:33 crc kubenswrapper[4895]: I1202 07:24:33.821153 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:33 crc kubenswrapper[4895]: I1202 07:24:33.821243 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:33 crc kubenswrapper[4895]: I1202 07:24:33.821261 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:33 crc kubenswrapper[4895]: I1202 07:24:33.821284 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:33 crc kubenswrapper[4895]: I1202 07:24:33.821303 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:33Z","lastTransitionTime":"2025-12-02T07:24:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:33 crc kubenswrapper[4895]: I1202 07:24:33.925119 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:33 crc kubenswrapper[4895]: I1202 07:24:33.925166 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:33 crc kubenswrapper[4895]: I1202 07:24:33.925181 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:33 crc kubenswrapper[4895]: I1202 07:24:33.925419 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:33 crc kubenswrapper[4895]: I1202 07:24:33.925455 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:33Z","lastTransitionTime":"2025-12-02T07:24:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:34 crc kubenswrapper[4895]: I1202 07:24:34.047169 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:34 crc kubenswrapper[4895]: I1202 07:24:34.047236 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:34 crc kubenswrapper[4895]: I1202 07:24:34.047258 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:34 crc kubenswrapper[4895]: I1202 07:24:34.047280 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:34 crc kubenswrapper[4895]: I1202 07:24:34.047294 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:34Z","lastTransitionTime":"2025-12-02T07:24:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:24:34 crc kubenswrapper[4895]: I1202 07:24:34.142001 4895 scope.go:117] "RemoveContainer" containerID="ff016284272b93427bb0487a4661973b731200b8b4d28ab1c6d2d512bf5742e9" Dec 02 07:24:34 crc kubenswrapper[4895]: I1202 07:24:34.142826 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5f88v" Dec 02 07:24:34 crc kubenswrapper[4895]: E1202 07:24:34.142971 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5f88v" podUID="5af25091-1401-45d4-ae53-d2b469c879da" Dec 02 07:24:34 crc kubenswrapper[4895]: I1202 07:24:34.143081 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:24:34 crc kubenswrapper[4895]: E1202 07:24:34.143185 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 07:24:34 crc kubenswrapper[4895]: I1202 07:24:34.260934 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:34 crc kubenswrapper[4895]: I1202 07:24:34.261117 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:34 crc kubenswrapper[4895]: I1202 07:24:34.261139 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:34 crc kubenswrapper[4895]: I1202 07:24:34.261230 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:34 crc kubenswrapper[4895]: I1202 07:24:34.261281 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:34Z","lastTransitionTime":"2025-12-02T07:24:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:34 crc kubenswrapper[4895]: I1202 07:24:34.365387 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:34 crc kubenswrapper[4895]: I1202 07:24:34.365468 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:34 crc kubenswrapper[4895]: I1202 07:24:34.365488 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:34 crc kubenswrapper[4895]: I1202 07:24:34.365519 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:34 crc kubenswrapper[4895]: I1202 07:24:34.365544 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:34Z","lastTransitionTime":"2025-12-02T07:24:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:34 crc kubenswrapper[4895]: I1202 07:24:34.468421 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:34 crc kubenswrapper[4895]: I1202 07:24:34.468468 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:34 crc kubenswrapper[4895]: I1202 07:24:34.468478 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:34 crc kubenswrapper[4895]: I1202 07:24:34.468496 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:34 crc kubenswrapper[4895]: I1202 07:24:34.468511 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:34Z","lastTransitionTime":"2025-12-02T07:24:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:34 crc kubenswrapper[4895]: I1202 07:24:34.572350 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:34 crc kubenswrapper[4895]: I1202 07:24:34.572426 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:34 crc kubenswrapper[4895]: I1202 07:24:34.572442 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:34 crc kubenswrapper[4895]: I1202 07:24:34.572469 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:34 crc kubenswrapper[4895]: I1202 07:24:34.572485 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:34Z","lastTransitionTime":"2025-12-02T07:24:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:34 crc kubenswrapper[4895]: I1202 07:24:34.679044 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:34 crc kubenswrapper[4895]: I1202 07:24:34.679084 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:34 crc kubenswrapper[4895]: I1202 07:24:34.679096 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:34 crc kubenswrapper[4895]: I1202 07:24:34.679118 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:34 crc kubenswrapper[4895]: I1202 07:24:34.679131 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:34Z","lastTransitionTime":"2025-12-02T07:24:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:34 crc kubenswrapper[4895]: I1202 07:24:34.725232 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w54m4_afc3334a-0153-4dcc-9a56-92f6cae51c08/ovnkube-controller/2.log" Dec 02 07:24:34 crc kubenswrapper[4895]: I1202 07:24:34.727717 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" event={"ID":"afc3334a-0153-4dcc-9a56-92f6cae51c08","Type":"ContainerStarted","Data":"52bad85cc5d71d65071ede7a4c939c8f773468925d9bb31b09a37641d870c228"} Dec 02 07:24:34 crc kubenswrapper[4895]: I1202 07:24:34.728303 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" Dec 02 07:24:34 crc kubenswrapper[4895]: I1202 07:24:34.758373 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b7fa1ad-7e38-4645-ab41-c7d395f5c10e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40e020a2b4caf39d1661cee9c6b0055eca86e0ae9b81d2da7485ca9ec92dd0aa\\\",\\\"image\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33e300332628d3baf9041b52dc7b8cd9dab1e9c5da76572e86b787433fd5fb92\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4455fec2a817e7266e152d3f4afbede483504d6066f1a0f7375bce282ae10f8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b674e20b2199fd0083aed9cf001d27a72d31ca6c62cbe376c9510cc94f37d312\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9998dffc20c5b0579e6214d10541b9702e9c4a6563c930a41e49b6853b832ba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T07:23:32Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 07:23:21.537254 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 07:23:21.538901 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1012964219/tls.crt::/tmp/serving-cert-1012964219/tls.key\\\\\\\"\\\\nI1202 07:23:32.848375 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 07:23:32.853597 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 07:23:32.853643 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 07:23:32.853680 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 
07:23:32.853691 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 07:23:32.869490 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 07:23:32.869536 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:23:32.869542 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:23:32.869555 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 07:23:32.869559 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 07:23:32.869564 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 07:23:32.869568 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 07:23:32.869624 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 07:23:32.874299 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d318f71ad32dd4a410093b85741038e468d540edd30f112234a9d595bc6fd85f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0faa0995131a1f9cbf2aa0cf7651d510fad1bdaf7e32de51f8a14b70f161e783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0faa0995131a1f9cbf2aa0cf7651d510fad1bdaf7e32de51f8a14b70f161e783\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:34Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:34 crc kubenswrapper[4895]: I1202 07:24:34.781101 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:34Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:34 crc kubenswrapper[4895]: I1202 07:24:34.781507 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:34 crc kubenswrapper[4895]: I1202 07:24:34.781525 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:34 crc kubenswrapper[4895]: I1202 07:24:34.781534 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:34 crc 
kubenswrapper[4895]: I1202 07:24:34.781551 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:34 crc kubenswrapper[4895]: I1202 07:24:34.781562 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:34Z","lastTransitionTime":"2025-12-02T07:24:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:24:34 crc kubenswrapper[4895]: I1202 07:24:34.794776 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-74fkh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b212bd8-f1a7-4982-b2e6-499c13a34b0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4d58e6f37ee8f90af6aad88138c39e9a0753c16cceb763c2bd2ec9c30087d92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7
f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhpwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-74fkh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:34Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:34 crc kubenswrapper[4895]: I1202 07:24:34.808543 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0468d2d1-a975-45a6-af9f-47adc432fab0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebf8583b59d7955530da7a83a29f9994ba139c076cf7b76dc94e88d14f39e1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vfwjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1474e8f5989607f3b95fefe811e3d787320f04c2
23edb6ffe47d2b37863ea874\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vfwjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wfcg7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:34Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:34 crc kubenswrapper[4895]: I1202 07:24:34.821588 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a06fda3-5f3f-4169-ae3c-f12424af7b8a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8044b0d3dc7f5c301eeeaee66d8461268ad266d0561461cd0b30bdc650cbdd99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73131074a12f26173aefdc0bde82a7446290cf2bec30f320cc37f8c4706ac50e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73131074a12f26173aefdc0bde82a7446290cf2bec30f320cc37f8c4706ac50e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:34Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:34 crc kubenswrapper[4895]: I1202 07:24:34.833259 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://444ad02c4afa919779de59fd8c81bdce5c3426dd6451b90505921a8d5fa0c96b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-02T07:24:34Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:34 crc kubenswrapper[4895]: I1202 07:24:34.848525 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:34Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:34 crc kubenswrapper[4895]: I1202 07:24:34.863798 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n7xcr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3514f381-d0d1-4e00-931e-c5ca75202a1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2670ef1ea22a6c1c5d9ec4ee4b0345c575b20224fddf8655b95eaa431573d071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3deeb37702c8243a956f2be1bb61e2693bb9be5453d775bc3f3e85618d483015\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3deeb37702c8243a956f2be1bb61e2693bb9be5453d775bc3f3e85618d483015\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2c60eddf285bbbf1a5957691f56d895e5bb7461fb18ab7c495879357cfae565\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2c60eddf285bbbf1a5957691f56d895e5bb7461fb18ab7c495879357cfae565\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e8b39d38597192393d90cc582b8ad109a1ed22c4c14f539e1a5445e6d817872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e8b39d38597192393d90cc582b8ad109a1ed22c4c14f539e1a5445e6d817872\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb5ab
c8da49059e23c4013f3856d3f4c9078cd8871a79f8dae5ce4d8cb319ef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb5abc8da49059e23c4013f3856d3f4c9078cd8871a79f8dae5ce4d8cb319ef9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbdd3e8943b356a485d900737682f0903dc04aefa33498ee5cdf3c82b6570852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbdd3e8943b356a485d900737682f0903dc04aefa33498ee5cdf3c82b6570852\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:43Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3641d1f7fd2ea062c216ce260c35d903273d3039fe790815ebb697f8c11d15cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3641d1f7fd2ea062c216ce260c35d903273d3039fe790815ebb697f8c11d15cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n7xcr\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:34Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:34 crc kubenswrapper[4895]: I1202 07:24:34.878931 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hlxqt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"30911fe5-208f-44e8-a380-2a0093f24863\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a569570ff32547c25fcdced649773ea0ab6d3aeccdaa5c26aa5a86d2c745535f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87ebbf4f529d4e7b64c392912d06e57dc80a2237a9a4e31ab5b56ac556aa10b1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\
":\\\"2025-12-02T07:24:24Z\\\",\\\"message\\\":\\\"2025-12-02T07:23:39+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_8d90423c-940a-42e9-9258-17f96c4385a6\\\\n2025-12-02T07:23:39+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8d90423c-940a-42e9-9258-17f96c4385a6 to /host/opt/cni/bin/\\\\n2025-12-02T07:23:39Z [verbose] multus-daemon started\\\\n2025-12-02T07:23:39Z [verbose] Readiness Indicator file check\\\\n2025-12-02T07:24:24Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:24:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\
\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6l8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hlxqt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:34Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:34 crc kubenswrapper[4895]: I1202 07:24:34.884571 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:34 crc kubenswrapper[4895]: I1202 07:24:34.884603 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:34 crc kubenswrapper[4895]: I1202 07:24:34.884611 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:34 crc kubenswrapper[4895]: I1202 07:24:34.884625 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:34 crc kubenswrapper[4895]: I1202 07:24:34.884635 4895 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:34Z","lastTransitionTime":"2025-12-02T07:24:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:24:34 crc kubenswrapper[4895]: I1202 07:24:34.890350 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5f88v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5af25091-1401-45d4-ae53-d2b469c879da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wfrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wfrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5f88v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:34Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:34 crc 
kubenswrapper[4895]: I1202 07:24:34.902923 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4604b0d5-cf8f-404d-af67-65da4b1dc003\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b3c366f0f27ff793a37b2944887ab1c1bc155c14d5c8d6c267442b0a8666ef8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6c3baa5b3772a8fc64774e28c3f389f114723e7efa62b934140a4fed8b6a40d\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49d7c12ea2cd199dcfa46a6f1a1a856f1285a709def9d82b6805ba256a79fd5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2543d364f902a6c34a82dac25f389cd36afc2f717d772d0a487a640d44adf30f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:34Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:34 crc kubenswrapper[4895]: I1202 07:24:34.917679 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:34Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:34 crc kubenswrapper[4895]: I1202 07:24:34.930874 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lhpbd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"81128292-8c02-45bd-9b25-e9457e989975\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cc63751b6428c88deed36b914b2a95a94a50b544c562126c2a9567c177b2570\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2z5n2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lhpbd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:34Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:34 crc kubenswrapper[4895]: I1202 07:24:34.942342 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wxp5p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a0b438f-70fd-4c3f-a170-2b3fe9797955\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7329cf1034b2d51ebb7f978448558a8b6fa39d58823289c470fddcfed262e395\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj6l9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cbc34b4d78cc00ac79b9944d92c6ac61a9d5dd704f36a473950baf53dfcbf47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj6l9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wxp5p\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:34Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:34 crc kubenswrapper[4895]: I1202 07:24:34.958111 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2df36ca5-8a5e-4865-8728-b8567297546a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f076cfc3bd5e782332e827d56325b73cf2a9f35248f397338b5130793481af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de2bad51e8d51f1243a2fe7799554c7f089425e3623a6bb62f7393610d70fc4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b737e88ad2a8caf70b89f59db461d355846d5612a29a249107b49fbec176204\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9
d30ec3b0f73c029956f058f1d8f06cfffb5533dcd0c221ab6842d277f274a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9d30ec3b0f73c029956f058f1d8f06cfffb5533dcd0c221ab6842d277f274a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:09Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:09Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:34Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:34 crc kubenswrapper[4895]: I1202 07:24:34.979444 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2db13abc9ba56d955814e75ceea54b92610a00daeefa29aa1aad584e3453728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:34Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:34 crc kubenswrapper[4895]: I1202 07:24:34.986427 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:34 crc kubenswrapper[4895]: I1202 07:24:34.986469 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:34 crc kubenswrapper[4895]: I1202 07:24:34.986480 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:34 crc kubenswrapper[4895]: I1202 07:24:34.986497 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:34 crc kubenswrapper[4895]: I1202 07:24:34.986509 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:34Z","lastTransitionTime":"2025-12-02T07:24:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:35 crc kubenswrapper[4895]: I1202 07:24:35.006183 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bb4bea68942dbface0009fbac76f7b1253ab282e8decb9081225de7d6537ec0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3115f2c50d05ee406f583937e39b1b55f364ce7cbeb0788b75941a22e23f57f9\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:35Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:35 crc kubenswrapper[4895]: I1202 07:24:35.027550 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"afc3334a-0153-4dcc-9a56-92f6cae51c08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91b0c6e50085a7be0924e5a54ef006110a7f13556fe47f7f55124fd6b2f0bc36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5463b077f0bc8b5d38e27854ece6a71f74f7769f4552c32eefe16854993f290f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80dca705ddb94ba799eb72bfc23b0c3878cceca48d6d10f1c7f8eb0e4beed40e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f54ae324450e73bcbf2c408794171a92bbb66c24d05f29584ac2c247090c1273\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f330d728eecd672ecb554744a51ebfb2c104441d3a85fe0c8a6fc3446c37e4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca732cac495c410d89da4ce1476acd20905716cbb2baeedac9c52b654ff58573\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52bad85cc5d71d65071ede7a4c939c8f773468925d9bb31b09a37641d870c228\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff016284272b93427bb0487a4661973b731200b8b4d28ab1c6d2d512bf5742e9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T07:24:03Z\\\",\\\"message\\\":\\\"s.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1202 07:24:03.168657 6529 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1202 07:24:03.168679 6529 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1202 07:24:03.168706 6529 handler.go:208] Removed *v1.Pod event handler 
6\\\\nI1202 07:24:03.168812 6529 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1202 07:24:03.168811 6529 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1202 07:24:03.168829 6529 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1202 07:24:03.168889 6529 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1202 07:24:03.168921 6529 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1202 07:24:03.168943 6529 factory.go:656] Stopping watch factory\\\\nI1202 07:24:03.168956 6529 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1202 07:24:03.168969 6529 handler.go:208] Removed *v1.Node event handler 2\\\\nI1202 07:24:03.168979 6529 handler.go:208] Removed *v1.Node event handler 7\\\\nI1202 07:24:03.168988 6529 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1202 07:24:03.169421 6529 ovnkube.go:599] Stopped ovnkube\\\\nI1202 07:24:03.169503 6529 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1202 
07:24:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:24:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"nam
e\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6447684fe6189e38c9caad6dfe1384b7af01d50e2ef5f889c7b2874ba092306\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://328fa773c92ef0a965065412d4141c0b41ffe38d21ed169386a9d30d05f37f28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://328fa773c92ef0a965065412d4141c0b41ffe38d21ed169386a9d30d05f37f28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w54m4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:35Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:35 crc kubenswrapper[4895]: I1202 07:24:35.089247 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:35 crc kubenswrapper[4895]: I1202 07:24:35.089290 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:35 crc kubenswrapper[4895]: I1202 07:24:35.089299 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:35 crc kubenswrapper[4895]: I1202 07:24:35.089315 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:35 crc kubenswrapper[4895]: I1202 07:24:35.089326 4895 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:35Z","lastTransitionTime":"2025-12-02T07:24:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:24:35 crc kubenswrapper[4895]: I1202 07:24:35.140634 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:24:35 crc kubenswrapper[4895]: I1202 07:24:35.140771 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:24:35 crc kubenswrapper[4895]: E1202 07:24:35.140844 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 07:24:35 crc kubenswrapper[4895]: E1202 07:24:35.140977 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 07:24:35 crc kubenswrapper[4895]: I1202 07:24:35.192065 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:35 crc kubenswrapper[4895]: I1202 07:24:35.192117 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:35 crc kubenswrapper[4895]: I1202 07:24:35.192128 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:35 crc kubenswrapper[4895]: I1202 07:24:35.192143 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:35 crc kubenswrapper[4895]: I1202 07:24:35.192152 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:35Z","lastTransitionTime":"2025-12-02T07:24:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:35 crc kubenswrapper[4895]: I1202 07:24:35.294861 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:35 crc kubenswrapper[4895]: I1202 07:24:35.294907 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:35 crc kubenswrapper[4895]: I1202 07:24:35.294925 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:35 crc kubenswrapper[4895]: I1202 07:24:35.294944 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:35 crc kubenswrapper[4895]: I1202 07:24:35.294960 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:35Z","lastTransitionTime":"2025-12-02T07:24:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:35 crc kubenswrapper[4895]: I1202 07:24:35.397346 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:35 crc kubenswrapper[4895]: I1202 07:24:35.397382 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:35 crc kubenswrapper[4895]: I1202 07:24:35.397393 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:35 crc kubenswrapper[4895]: I1202 07:24:35.397409 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:35 crc kubenswrapper[4895]: I1202 07:24:35.397423 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:35Z","lastTransitionTime":"2025-12-02T07:24:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:35 crc kubenswrapper[4895]: I1202 07:24:35.500654 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:35 crc kubenswrapper[4895]: I1202 07:24:35.500717 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:35 crc kubenswrapper[4895]: I1202 07:24:35.500731 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:35 crc kubenswrapper[4895]: I1202 07:24:35.500772 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:35 crc kubenswrapper[4895]: I1202 07:24:35.500790 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:35Z","lastTransitionTime":"2025-12-02T07:24:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:35 crc kubenswrapper[4895]: I1202 07:24:35.605094 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:35 crc kubenswrapper[4895]: I1202 07:24:35.605178 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:35 crc kubenswrapper[4895]: I1202 07:24:35.605203 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:35 crc kubenswrapper[4895]: I1202 07:24:35.605236 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:35 crc kubenswrapper[4895]: I1202 07:24:35.605255 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:35Z","lastTransitionTime":"2025-12-02T07:24:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:35 crc kubenswrapper[4895]: I1202 07:24:35.713598 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:35 crc kubenswrapper[4895]: I1202 07:24:35.713684 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:35 crc kubenswrapper[4895]: I1202 07:24:35.713707 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:35 crc kubenswrapper[4895]: I1202 07:24:35.713787 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:35 crc kubenswrapper[4895]: I1202 07:24:35.713818 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:35Z","lastTransitionTime":"2025-12-02T07:24:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:35 crc kubenswrapper[4895]: I1202 07:24:35.735097 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w54m4_afc3334a-0153-4dcc-9a56-92f6cae51c08/ovnkube-controller/3.log" Dec 02 07:24:35 crc kubenswrapper[4895]: I1202 07:24:35.736309 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w54m4_afc3334a-0153-4dcc-9a56-92f6cae51c08/ovnkube-controller/2.log" Dec 02 07:24:35 crc kubenswrapper[4895]: I1202 07:24:35.741799 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" event={"ID":"afc3334a-0153-4dcc-9a56-92f6cae51c08","Type":"ContainerDied","Data":"52bad85cc5d71d65071ede7a4c939c8f773468925d9bb31b09a37641d870c228"} Dec 02 07:24:35 crc kubenswrapper[4895]: I1202 07:24:35.741813 4895 generic.go:334] "Generic (PLEG): container finished" podID="afc3334a-0153-4dcc-9a56-92f6cae51c08" containerID="52bad85cc5d71d65071ede7a4c939c8f773468925d9bb31b09a37641d870c228" exitCode=1 Dec 02 07:24:35 crc kubenswrapper[4895]: I1202 07:24:35.741886 4895 scope.go:117] "RemoveContainer" containerID="ff016284272b93427bb0487a4661973b731200b8b4d28ab1c6d2d512bf5742e9" Dec 02 07:24:35 crc kubenswrapper[4895]: I1202 07:24:35.743556 4895 scope.go:117] "RemoveContainer" containerID="52bad85cc5d71d65071ede7a4c939c8f773468925d9bb31b09a37641d870c228" Dec 02 07:24:35 crc kubenswrapper[4895]: E1202 07:24:35.744023 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-w54m4_openshift-ovn-kubernetes(afc3334a-0153-4dcc-9a56-92f6cae51c08)\"" pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" podUID="afc3334a-0153-4dcc-9a56-92f6cae51c08" Dec 02 07:24:35 crc kubenswrapper[4895]: I1202 07:24:35.766520 4895 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0468d2d1-a975-45a6-af9f-47adc432fab0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebf8583b59d7955530da7a83a29f9994ba139c076cf7b76dc94e88d14f39e1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vfwjt\\\",\\\"readOnly\\\":true,\\\"rec
ursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1474e8f5989607f3b95fefe811e3d787320f04c223edb6ffe47d2b37863ea874\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vfwjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wfcg7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:35Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:35 crc kubenswrapper[4895]: I1202 07:24:35.782371 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a06fda3-5f3f-4169-ae3c-f12424af7b8a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8044b0d3dc7f5c301eeeaee66d8461268ad266d0561461cd0b30bdc650cbdd99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73131074a12f26173aefdc0bde82a7446290cf2bec30f320cc37f8c4706ac50e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73131074a12f26173aefdc0bde82a7446290cf2bec30f320cc37f8c4706ac50e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:35Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:35 crc kubenswrapper[4895]: I1202 07:24:35.804672 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b7fa1ad-7e38-4645-ab41-c7d395f5c10e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40e020a2b4caf39d1661cee9c6b0055eca86e0ae9b81d2da7485ca9ec92dd0aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33e300332628d3baf9041b52dc7b8cd9dab1e9c5da76572e86b787433fd5fb92\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4455fec2a817e7266e152d3f4afbede483504d6066f1a0f7375bce282ae10f8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b674e20b2199fd0083aed9cf001d27a72d31ca6c62cbe376c9510cc94f37d312\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9998dffc20c5b0579e6214d10541b9702e9c4a6563c930a41e49b6853b832ba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T07:23:32Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 07:23:21.537254 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 07:23:21.538901 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1012964219/tls.crt::/tmp/serving-cert-1012964219/tls.key\\\\\\\"\\\\nI1202 07:23:32.848375 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 07:23:32.853597 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 07:23:32.853643 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 07:23:32.853680 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 07:23:32.853691 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 07:23:32.869490 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 07:23:32.869536 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:23:32.869542 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:23:32.869555 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 07:23:32.869559 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 07:23:32.869564 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 07:23:32.869568 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 07:23:32.869624 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1202 07:23:32.874299 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d318f71ad32dd4a410093b85741038e468d540edd30f112234a9d595bc6fd85f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0faa0995131a1f9cbf2aa0cf7651d510fad1bdaf7e32de51f8a14b70f161e783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0faa0995131a1f9cbf2aa0cf7651d510f
ad1bdaf7e32de51f8a14b70f161e783\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:35Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:35 crc kubenswrapper[4895]: I1202 07:24:35.818443 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:35 crc kubenswrapper[4895]: I1202 07:24:35.818524 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:35 crc kubenswrapper[4895]: I1202 07:24:35.818544 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:35 crc kubenswrapper[4895]: I1202 07:24:35.818569 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:35 crc kubenswrapper[4895]: I1202 07:24:35.818589 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:35Z","lastTransitionTime":"2025-12-02T07:24:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:35 crc kubenswrapper[4895]: I1202 07:24:35.825821 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:35Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:35 crc kubenswrapper[4895]: I1202 07:24:35.839594 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-74fkh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b212bd8-f1a7-4982-b2e6-499c13a34b0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4d58e6f37ee8f90af6aad88138c39e9a0753c16cceb763c2bd2ec9c30087d92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhpwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-74fkh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:35Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:35 crc kubenswrapper[4895]: I1202 07:24:35.863438 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hlxqt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"30911fe5-208f-44e8-a380-2a0093f24863\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a569570ff32547c25fcdced649773ea0ab6d3aeccdaa5c26aa5a86d2c745535f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87ebbf4f529d4e7b64c392912d06e57dc80a2237a9a4e31ab5b56ac556aa10b1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T07:24:24Z\\\",\\\"message\\\":\\\"2025-12-02T07:23:39+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_8d90423c-940a-42e9-9258-17f96c4385a6\\\\n2025-12-02T07:23:39+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8d90423c-940a-42e9-9258-17f96c4385a6 to /host/opt/cni/bin/\\\\n2025-12-02T07:23:39Z [verbose] multus-daemon started\\\\n2025-12-02T07:23:39Z [verbose] Readiness Indicator file check\\\\n2025-12-02T07:24:24Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:24:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\
\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6l8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hlxqt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:35Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:35 crc kubenswrapper[4895]: I1202 07:24:35.880425 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5f88v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5af25091-1401-45d4-ae53-d2b469c879da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wfrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wfrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5f88v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:35Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:35 crc 
kubenswrapper[4895]: I1202 07:24:35.896167 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4604b0d5-cf8f-404d-af67-65da4b1dc003\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b3c366f0f27ff793a37b2944887ab1c1bc155c14d5c8d6c267442b0a8666ef8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6c3baa5b3772a8fc64774e28c3f389f114723e7efa62b934140a4fed8b6a40d\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49d7c12ea2cd199dcfa46a6f1a1a856f1285a709def9d82b6805ba256a79fd5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2543d364f902a6c34a82dac25f389cd36afc2f717d772d0a487a640d44adf30f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:35Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:35 crc kubenswrapper[4895]: I1202 07:24:35.911524 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://444ad02c4afa919779de59fd8c81bdce5c3426dd6451b90505921a8d5fa0c96b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-02T07:24:35Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:35 crc kubenswrapper[4895]: I1202 07:24:35.921805 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:35 crc kubenswrapper[4895]: I1202 07:24:35.921848 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:35 crc kubenswrapper[4895]: I1202 07:24:35.921896 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:35 crc kubenswrapper[4895]: I1202 07:24:35.921916 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:35 crc kubenswrapper[4895]: I1202 07:24:35.921929 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:35Z","lastTransitionTime":"2025-12-02T07:24:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:35 crc kubenswrapper[4895]: I1202 07:24:35.929773 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:35Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:35 crc kubenswrapper[4895]: I1202 07:24:35.956153 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n7xcr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3514f381-d0d1-4e00-931e-c5ca75202a1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2670ef1ea22a6c1c5d9ec4ee4b0345c575b20224fddf8655b95eaa431573d071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3deeb37702c8243a956f2be1bb61e2693bb9be5453d775bc3f3e85618d483015\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3deeb37702c8243a956f2be1bb61e2693bb9be5453d775bc3f3e85618d483015\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2c60eddf285bbbf1a5957691f56d895e5bb7461fb18ab7c495879357cfae565\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2c60eddf285bbbf1a5957691f56d895e5bb7461fb18ab7c495879357cfae565\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e8b39d38597192393d90cc582b8ad109a1ed22c4c14f539e1a5445e6d817872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e8b39d38597192393d90cc582b8ad109a1ed22c4c14f539e1a5445e6d817872\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb5ab
c8da49059e23c4013f3856d3f4c9078cd8871a79f8dae5ce4d8cb319ef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb5abc8da49059e23c4013f3856d3f4c9078cd8871a79f8dae5ce4d8cb319ef9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbdd3e8943b356a485d900737682f0903dc04aefa33498ee5cdf3c82b6570852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbdd3e8943b356a485d900737682f0903dc04aefa33498ee5cdf3c82b6570852\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:43Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3641d1f7fd2ea062c216ce260c35d903273d3039fe790815ebb697f8c11d15cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3641d1f7fd2ea062c216ce260c35d903273d3039fe790815ebb697f8c11d15cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n7xcr\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:35Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:35 crc kubenswrapper[4895]: I1202 07:24:35.976378 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2df36ca5-8a5e-4865-8728-b8567297546a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f076cfc3bd5e782332e827d56325b73cf2a9f35248f397338b5130793481af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mou
ntPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de2bad51e8d51f1243a2fe7799554c7f089425e3623a6bb62f7393610d70fc4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b737e88ad2a8caf70b89f59db461d355846d5612a29a249107b49fbec176204\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9d30ec3b0f73c029956f058f
1d8f06cfffb5533dcd0c221ab6842d277f274a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9d30ec3b0f73c029956f058f1d8f06cfffb5533dcd0c221ab6842d277f274a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:09Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:09Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:35Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:35 crc kubenswrapper[4895]: I1202 07:24:35.994605 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:35Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:36 crc kubenswrapper[4895]: I1202 07:24:36.013513 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lhpbd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"81128292-8c02-45bd-9b25-e9457e989975\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cc63751b6428c88deed36b914b2a95a94a50b544c562126c2a9567c177b2570\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2z5n2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lhpbd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:36Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:36 crc kubenswrapper[4895]: I1202 07:24:36.025128 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:36 crc kubenswrapper[4895]: I1202 07:24:36.025160 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:36 crc kubenswrapper[4895]: I1202 07:24:36.025175 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:36 crc kubenswrapper[4895]: I1202 07:24:36.025196 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:36 crc kubenswrapper[4895]: I1202 07:24:36.025211 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:36Z","lastTransitionTime":"2025-12-02T07:24:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:36 crc kubenswrapper[4895]: I1202 07:24:36.032116 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wxp5p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a0b438f-70fd-4c3f-a170-2b3fe9797955\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7329cf1034b2d51ebb7f978448558a8b6fa39d58823289c470fddcfed262e395\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj6l9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cbc34b4d78cc00ac79b9944d92c6ac61a9d5dd704f36a473950baf53dfcbf47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj6l9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wxp5p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:36Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:36 crc kubenswrapper[4895]: I1202 07:24:36.055959 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2db13abc9ba56d955814e75ceea54b92610a00daeefa29aa1aad584e3453728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:36Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:36 crc kubenswrapper[4895]: I1202 07:24:36.073413 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bb4bea68942dbface0009fbac76f7b1253ab282e8decb9081225de7d6537ec0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\
\"containerID\\\":\\\"cri-o://3115f2c50d05ee406f583937e39b1b55f364ce7cbeb0788b75941a22e23f57f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:36Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:36 crc kubenswrapper[4895]: I1202 07:24:36.103467 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"afc3334a-0153-4dcc-9a56-92f6cae51c08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91b0c6e50085a7be0924e5a54ef006110a7f13556fe47f7f55124fd6b2f0bc36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5463b077f0bc8b5d38e27854ece6a71f74f7769f4552c32eefe16854993f290f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80dca705ddb94ba799eb72bfc23b0c3878cceca48d6d10f1c7f8eb0e4beed40e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f54ae324450e73bcbf2c408794171a92bbb66c24d05f29584ac2c247090c1273\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f330d728eecd672ecb554744a51ebfb2c104441d3a85fe0c8a6fc3446c37e4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca732cac495c410d89da4ce1476acd20905716cbb2baeedac9c52b654ff58573\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52bad85cc5d71d65071ede7a4c939c8f773468925d9bb31b09a37641d870c228\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff016284272b93427bb0487a4661973b731200b8b4d28ab1c6d2d512bf5742e9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T07:24:03Z\\\",\\\"message\\\":\\\"s.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1202 07:24:03.168657 6529 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1202 07:24:03.168679 6529 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1202 07:24:03.168706 6529 handler.go:208] Removed *v1.Pod event handler 
6\\\\nI1202 07:24:03.168812 6529 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1202 07:24:03.168811 6529 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1202 07:24:03.168829 6529 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1202 07:24:03.168889 6529 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1202 07:24:03.168921 6529 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1202 07:24:03.168943 6529 factory.go:656] Stopping watch factory\\\\nI1202 07:24:03.168956 6529 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1202 07:24:03.168969 6529 handler.go:208] Removed *v1.Node event handler 2\\\\nI1202 07:24:03.168979 6529 handler.go:208] Removed *v1.Node event handler 7\\\\nI1202 07:24:03.168988 6529 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1202 07:24:03.169421 6529 ovnkube.go:599] Stopped ovnkube\\\\nI1202 07:24:03.169503 6529 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1202 07:24:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:24:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52bad85cc5d71d65071ede7a4c939c8f773468925d9bb31b09a37641d870c228\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T07:24:35Z\\\",\\\"message\\\":\\\"oing to retry *v1.Pod resource setup for 18 objects: [openshift-multus/multus-hlxqt openshift-multus/network-metrics-daemon-5f88v openshift-network-console/networking-console-plugin-85b44fc459-gdk6g openshift-network-operator/iptables-alerter-4ln5h openshift-ovn-kubernetes/ovnkube-node-w54m4 openshift-machine-config-operator/kube-rbac-proxy-crio-crc openshift-kube-scheduler/openshift-kube-scheduler-crc openshift-machine-config-operator/machine-config-daemon-wfcg7 openshift-multus/multus-additional-cni-plugins-n7xcr 
openshift-network-diagnostics/network-check-target-xd92c openshift-kube-controller-manager/kube-controller-manager-crc openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wxp5p openshift-image-registry/node-ca-lhpbd openshift-kube-apiserver/kube-apiserver-crc openshift-network-diagnostics/network-check-source-55646444c4-trplf openshift-network-node-identity/network-node-identity-vrzqb openshift-network-operator/network-operator-58b4c7f79c-55gtf openshift-dns/node-resolver-74fkh]\\\\nF1202 07:24:35.228140 6919 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not ad\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPa
th\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6447684fe6189e38c9caad6dfe1384b7af01d50e2ef5f889c7b2874ba092306\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acce
ss-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://328fa773c92ef0a965065412d4141c0b41ffe38d21ed169386a9d30d05f37f28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://328fa773c92ef0a965065412d4141c0b41ffe38d21ed169386a9d30d05f37f28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w54m4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:36Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:36 crc kubenswrapper[4895]: I1202 07:24:36.129348 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" 
Dec 02 07:24:36 crc kubenswrapper[4895]: I1202 07:24:36.129642 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:36 crc kubenswrapper[4895]: I1202 07:24:36.129853 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:36 crc kubenswrapper[4895]: I1202 07:24:36.129994 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:36 crc kubenswrapper[4895]: I1202 07:24:36.130152 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:36Z","lastTransitionTime":"2025-12-02T07:24:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:24:36 crc kubenswrapper[4895]: I1202 07:24:36.141134 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5f88v" Dec 02 07:24:36 crc kubenswrapper[4895]: I1202 07:24:36.141251 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:24:36 crc kubenswrapper[4895]: E1202 07:24:36.141392 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5f88v" podUID="5af25091-1401-45d4-ae53-d2b469c879da" Dec 02 07:24:36 crc kubenswrapper[4895]: E1202 07:24:36.142007 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 07:24:36 crc kubenswrapper[4895]: I1202 07:24:36.166551 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Dec 02 07:24:36 crc kubenswrapper[4895]: I1202 07:24:36.233666 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:36 crc kubenswrapper[4895]: I1202 07:24:36.233784 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:36 crc kubenswrapper[4895]: I1202 07:24:36.233816 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:36 crc kubenswrapper[4895]: I1202 07:24:36.233846 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:36 crc kubenswrapper[4895]: I1202 07:24:36.233871 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:36Z","lastTransitionTime":"2025-12-02T07:24:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:36 crc kubenswrapper[4895]: I1202 07:24:36.337969 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:36 crc kubenswrapper[4895]: I1202 07:24:36.338037 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:36 crc kubenswrapper[4895]: I1202 07:24:36.338138 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:36 crc kubenswrapper[4895]: I1202 07:24:36.338170 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:36 crc kubenswrapper[4895]: I1202 07:24:36.338195 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:36Z","lastTransitionTime":"2025-12-02T07:24:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:36 crc kubenswrapper[4895]: I1202 07:24:36.441408 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:36 crc kubenswrapper[4895]: I1202 07:24:36.441460 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:36 crc kubenswrapper[4895]: I1202 07:24:36.441481 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:36 crc kubenswrapper[4895]: I1202 07:24:36.441506 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:36 crc kubenswrapper[4895]: I1202 07:24:36.441526 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:36Z","lastTransitionTime":"2025-12-02T07:24:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:36 crc kubenswrapper[4895]: I1202 07:24:36.545051 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:36 crc kubenswrapper[4895]: I1202 07:24:36.545128 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:36 crc kubenswrapper[4895]: I1202 07:24:36.545154 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:36 crc kubenswrapper[4895]: I1202 07:24:36.545182 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:36 crc kubenswrapper[4895]: I1202 07:24:36.545200 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:36Z","lastTransitionTime":"2025-12-02T07:24:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:36 crc kubenswrapper[4895]: I1202 07:24:36.648726 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:36 crc kubenswrapper[4895]: I1202 07:24:36.648844 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:36 crc kubenswrapper[4895]: I1202 07:24:36.648869 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:36 crc kubenswrapper[4895]: I1202 07:24:36.648902 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:36 crc kubenswrapper[4895]: I1202 07:24:36.648921 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:36Z","lastTransitionTime":"2025-12-02T07:24:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:36 crc kubenswrapper[4895]: I1202 07:24:36.748497 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w54m4_afc3334a-0153-4dcc-9a56-92f6cae51c08/ovnkube-controller/3.log" Dec 02 07:24:36 crc kubenswrapper[4895]: I1202 07:24:36.751124 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:36 crc kubenswrapper[4895]: I1202 07:24:36.751215 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:36 crc kubenswrapper[4895]: I1202 07:24:36.751241 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:36 crc kubenswrapper[4895]: I1202 07:24:36.751279 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:36 crc kubenswrapper[4895]: I1202 07:24:36.751300 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:36Z","lastTransitionTime":"2025-12-02T07:24:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:36 crc kubenswrapper[4895]: I1202 07:24:36.754685 4895 scope.go:117] "RemoveContainer" containerID="52bad85cc5d71d65071ede7a4c939c8f773468925d9bb31b09a37641d870c228" Dec 02 07:24:36 crc kubenswrapper[4895]: E1202 07:24:36.755023 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-w54m4_openshift-ovn-kubernetes(afc3334a-0153-4dcc-9a56-92f6cae51c08)\"" pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" podUID="afc3334a-0153-4dcc-9a56-92f6cae51c08" Dec 02 07:24:36 crc kubenswrapper[4895]: I1202 07:24:36.779537 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:36Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:36 crc kubenswrapper[4895]: I1202 07:24:36.799948 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-74fkh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b212bd8-f1a7-4982-b2e6-499c13a34b0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4d58e6f37ee8f90af6aad88138c39e9a0753c16cceb763c2bd2ec9c30087d92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhpwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-74fkh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:36Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:36 crc kubenswrapper[4895]: I1202 07:24:36.828374 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0468d2d1-a975-45a6-af9f-47adc432fab0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebf8583b59d7955530da7a83a29f9994ba139c076cf7b76dc94e88d14f39e1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vfwjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1474e8f5989607f3b95fefe811e3d787320f04c223edb6ffe47d2b37863ea874\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vfwjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wfcg7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:36Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:36 crc kubenswrapper[4895]: I1202 07:24:36.847971 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a06fda3-5f3f-4169-ae3c-f12424af7b8a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8044b0d3dc7f5c301eeeaee66d8461268ad266d0561461cd0b30bdc650cbdd99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-ku
belet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73131074a12f26173aefdc0bde82a7446290cf2bec30f320cc37f8c4706ac50e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73131074a12f26173aefdc0bde82a7446290cf2bec30f320cc37f8c4706ac50e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:36Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:36 crc kubenswrapper[4895]: I1202 07:24:36.854587 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:36 crc kubenswrapper[4895]: I1202 07:24:36.854641 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:36 crc kubenswrapper[4895]: I1202 07:24:36.854658 4895 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:36 crc kubenswrapper[4895]: I1202 07:24:36.854682 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:36 crc kubenswrapper[4895]: I1202 07:24:36.854696 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:36Z","lastTransitionTime":"2025-12-02T07:24:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:24:36 crc kubenswrapper[4895]: I1202 07:24:36.868033 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b7fa1ad-7e38-4645-ab41-c7d395f5c10e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40e020a2b4caf39d1661cee9c6b0055eca86e0ae9b81d2da7485ca9ec92dd0aa\\\",\\\"image\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33e300332628d3baf9041b52dc7b8cd9dab1e9c5da76572e86b787433fd5fb92\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4455fec2a817e7266e152d3f4afbede483504d6066f1a0f7375bce282ae10f8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b674e20b2199fd0083aed9cf001d27a72d31ca6c62cbe376c9510cc94f37d312\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9998dffc20c5b0579e6214d10541b9702e9c4a6563c930a41e49b6853b832ba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T07:23:32Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 07:23:21.537254 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 07:23:21.538901 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1012964219/tls.crt::/tmp/serving-cert-1012964219/tls.key\\\\\\\"\\\\nI1202 07:23:32.848375 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 07:23:32.853597 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 07:23:32.853643 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 07:23:32.853680 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 
07:23:32.853691 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 07:23:32.869490 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 07:23:32.869536 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:23:32.869542 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:23:32.869555 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 07:23:32.869559 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 07:23:32.869564 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 07:23:32.869568 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 07:23:32.869624 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 07:23:32.874299 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d318f71ad32dd4a410093b85741038e468d540edd30f112234a9d595bc6fd85f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0faa0995131a1f9cbf2aa0cf7651d510fad1bdaf7e32de51f8a14b70f161e783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0faa0995131a1f9cbf2aa0cf7651d510fad1bdaf7e32de51f8a14b70f161e783\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:36Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:36 crc kubenswrapper[4895]: I1202 07:24:36.889781 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:36Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:36 crc kubenswrapper[4895]: I1202 07:24:36.913077 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n7xcr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3514f381-d0d1-4e00-931e-c5ca75202a1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2670ef1ea22a6c1c5d9ec4ee4b0345c575b20224fddf8655b95eaa431573d071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3deeb37702c8243a956f2be1bb61e2693bb9be5453d775bc3f3e85618d483015\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3deeb37702c8243a956f2be1bb61e2693bb9be5453d775bc3f3e85618d483015\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2c60eddf285bbbf1a5957691f56d895e5bb7461fb18ab7c495879357cfae565\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2c60eddf285bbbf1a5957691f56d895e5bb7461fb18ab7c495879357cfae565\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e8b39d38597192393d90cc582b8ad109a1ed22c4c14f539e1a5445e6d817872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e8b39d38597192393d90cc582b8ad109a1ed22c4c14f539e1a5445e6d817872\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb5ab
c8da49059e23c4013f3856d3f4c9078cd8871a79f8dae5ce4d8cb319ef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb5abc8da49059e23c4013f3856d3f4c9078cd8871a79f8dae5ce4d8cb319ef9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbdd3e8943b356a485d900737682f0903dc04aefa33498ee5cdf3c82b6570852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbdd3e8943b356a485d900737682f0903dc04aefa33498ee5cdf3c82b6570852\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:43Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3641d1f7fd2ea062c216ce260c35d903273d3039fe790815ebb697f8c11d15cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3641d1f7fd2ea062c216ce260c35d903273d3039fe790815ebb697f8c11d15cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n7xcr\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:36Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:36 crc kubenswrapper[4895]: I1202 07:24:36.934084 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hlxqt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"30911fe5-208f-44e8-a380-2a0093f24863\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a569570ff32547c25fcdced649773ea0ab6d3aeccdaa5c26aa5a86d2c745535f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87ebbf4f529d4e7b64c392912d06e57dc80a2237a9a4e31ab5b56ac556aa10b1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\
":\\\"2025-12-02T07:24:24Z\\\",\\\"message\\\":\\\"2025-12-02T07:23:39+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_8d90423c-940a-42e9-9258-17f96c4385a6\\\\n2025-12-02T07:23:39+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8d90423c-940a-42e9-9258-17f96c4385a6 to /host/opt/cni/bin/\\\\n2025-12-02T07:23:39Z [verbose] multus-daemon started\\\\n2025-12-02T07:23:39Z [verbose] Readiness Indicator file check\\\\n2025-12-02T07:24:24Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:24:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\
\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6l8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hlxqt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:36Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:36 crc kubenswrapper[4895]: I1202 07:24:36.949850 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5f88v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5af25091-1401-45d4-ae53-d2b469c879da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wfrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wfrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5f88v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:36Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:36 crc 
kubenswrapper[4895]: I1202 07:24:36.958318 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:36 crc kubenswrapper[4895]: I1202 07:24:36.959376 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:36 crc kubenswrapper[4895]: I1202 07:24:36.959637 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:36 crc kubenswrapper[4895]: I1202 07:24:36.959809 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:36 crc kubenswrapper[4895]: I1202 07:24:36.959934 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:36Z","lastTransitionTime":"2025-12-02T07:24:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:36 crc kubenswrapper[4895]: I1202 07:24:36.968955 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4604b0d5-cf8f-404d-af67-65da4b1dc003\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b3c366f0f27ff793a37b2944887ab1c1bc155c14d5c8d6c267442b0a8666ef8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6c3baa5b37
72a8fc64774e28c3f389f114723e7efa62b934140a4fed8b6a40d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49d7c12ea2cd199dcfa46a6f1a1a856f1285a709def9d82b6805ba256a79fd5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2543d364f902a6c34a82dac25f389cd36afc2f717d772d0a487a640d44adf30f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:36Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:36 crc kubenswrapper[4895]: I1202 07:24:36.979724 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:24:36 crc kubenswrapper[4895]: I1202 07:24:36.979828 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:24:36 crc kubenswrapper[4895]: E1202 07:24:36.979897 4895 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 02 07:24:36 crc kubenswrapper[4895]: E1202 07:24:36.980008 4895 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 02 07:24:36 crc kubenswrapper[4895]: E1202 07:24:36.980030 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-02 07:25:40.979998455 +0000 UTC m=+152.150858068 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 02 07:24:36 crc kubenswrapper[4895]: E1202 07:24:36.980082 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-02 07:25:40.980059237 +0000 UTC m=+152.150918880 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 02 07:24:36 crc kubenswrapper[4895]: I1202 07:24:36.988393 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://444ad02c4afa919779de59fd8c81bdce5c3426dd6451b90505921a8d5fa0c96b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":
\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:36Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:37 crc kubenswrapper[4895]: I1202 07:24:37.002178 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lhpbd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"81128292-8c02-45bd-9b25-e9457e989975\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cc63751b6428c88deed36b914b2a95a94a50b544c562126c2a9567c177b2570\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d
188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2z5n2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lhpbd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:36Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:37 crc kubenswrapper[4895]: I1202 07:24:37.021348 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wxp5p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a0b438f-70fd-4c3f-a170-2b3fe9797955\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7329cf1034b2d51ebb7f978448558a8b6fa39d58823289c470fddcfed262e395\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj6l9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cbc34b4d78cc00ac79b9944d92c6ac61a9d5
dd704f36a473950baf53dfcbf47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj6l9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wxp5p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:37Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:37 crc kubenswrapper[4895]: I1202 07:24:37.044078 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2df36ca5-8a5e-4865-8728-b8567297546a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f076cfc3bd5e782332e827d56325b73cf2a9f35248f397338b5130793481af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de2bad51e8d51f1243a2fe7799554c7f089425e3623a6bb62f7393610d70fc4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b737e88ad2a8caf70b89f59db461d355846d5612a29a249107b49fbec176204\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9d30ec3b0f73c029956f058f1d8f06cfffb5533dcd0c221ab6842d277f274a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://c9d30ec3b0f73c029956f058f1d8f06cfffb5533dcd0c221ab6842d277f274a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:09Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:09Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:37Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:37 crc kubenswrapper[4895]: I1202 07:24:37.063849 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:37 crc kubenswrapper[4895]: I1202 07:24:37.063915 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:37 crc kubenswrapper[4895]: I1202 07:24:37.063937 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:37 crc kubenswrapper[4895]: I1202 07:24:37.064054 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:37 crc kubenswrapper[4895]: I1202 07:24:37.064134 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:37Z","lastTransitionTime":"2025-12-02T07:24:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:37 crc kubenswrapper[4895]: I1202 07:24:37.067530 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:37Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:37 crc kubenswrapper[4895]: I1202 07:24:37.081127 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 07:24:37 crc kubenswrapper[4895]: E1202 07:24:37.081373 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 07:25:41.081332651 +0000 UTC m=+152.252192304 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:24:37 crc kubenswrapper[4895]: I1202 07:24:37.081464 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:24:37 crc kubenswrapper[4895]: I1202 07:24:37.081680 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:24:37 crc kubenswrapper[4895]: E1202 07:24:37.081734 4895 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 02 07:24:37 crc kubenswrapper[4895]: E1202 07:24:37.081807 4895 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 02 07:24:37 crc kubenswrapper[4895]: E1202 07:24:37.081832 4895 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod 
openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 07:24:37 crc kubenswrapper[4895]: E1202 07:24:37.081974 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-02 07:25:41.081940299 +0000 UTC m=+152.252800082 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 07:24:37 crc kubenswrapper[4895]: E1202 07:24:37.081979 4895 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 02 07:24:37 crc kubenswrapper[4895]: E1202 07:24:37.082019 4895 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 02 07:24:37 crc kubenswrapper[4895]: E1202 07:24:37.082039 4895 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 07:24:37 crc kubenswrapper[4895]: E1202 07:24:37.082100 4895 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-02 07:25:41.082084733 +0000 UTC m=+152.252944386 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 07:24:37 crc kubenswrapper[4895]: I1202 07:24:37.092191 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bb4bea68942dbface0009fbac76f7b1253ab282e8decb9081225de7d6537ec0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\
\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3115f2c50d05ee406f583937e39b1b55f364ce7cbeb0788b75941a22e23f57f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:37Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:37 crc 
kubenswrapper[4895]: I1202 07:24:37.120075 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"afc3334a-0153-4dcc-9a56-92f6cae51c08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91b0c6e50085a7be0924e5a54ef006110a7f13556fe47f7f55124fd6b2f0bc36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5463b077f0bc8b5d38e27854ece6a71f74f7769f4552c32eefe16854993f290f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80dca705ddb94ba799eb72bfc23b0c3878cceca48d6d10f1c7f8eb0e4beed40e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f54ae324450e73bcbf2c408794171a92bbb66c24d05f29584ac2c247090c1273\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f330d728eecd672ecb554744a51ebfb2c104441d3a85fe0c8a6fc3446c37e4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca732cac495c410d89da4ce1476acd20905716cbb2baeedac9c52b654ff58573\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52bad85cc5d71d65071ede7a4c939c8f773468925d9bb31b09a37641d870c228\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52bad85cc5d71d65071ede7a4c939c8f773468925d9bb31b09a37641d870c228\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T07:24:35Z\\\",\\\"message\\\":\\\"oing to retry *v1.Pod resource setup for 18 objects: [openshift-multus/multus-hlxqt openshift-multus/network-metrics-daemon-5f88v openshift-network-console/networking-console-plugin-85b44fc459-gdk6g openshift-network-operator/iptables-alerter-4ln5h openshift-ovn-kubernetes/ovnkube-node-w54m4 
openshift-machine-config-operator/kube-rbac-proxy-crio-crc openshift-kube-scheduler/openshift-kube-scheduler-crc openshift-machine-config-operator/machine-config-daemon-wfcg7 openshift-multus/multus-additional-cni-plugins-n7xcr openshift-network-diagnostics/network-check-target-xd92c openshift-kube-controller-manager/kube-controller-manager-crc openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wxp5p openshift-image-registry/node-ca-lhpbd openshift-kube-apiserver/kube-apiserver-crc openshift-network-diagnostics/network-check-source-55646444c4-trplf openshift-network-node-identity/network-node-identity-vrzqb openshift-network-operator/network-operator-58b4c7f79c-55gtf openshift-dns/node-resolver-74fkh]\\\\nF1202 07:24:35.228140 6919 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not ad\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:24:34Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-w54m4_openshift-ovn-kubernetes(afc3334a-0153-4dcc-9a56-92f6cae51c08)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6447684fe6189e38c9caad6dfe1384b7af01d50e2ef5f889c7b2874ba092306\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://328fa773c92ef0a965065412d4141c0b41ffe38d21ed169386a9d30d05f37f28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://328fa773c92ef0a965
065412d4141c0b41ffe38d21ed169386a9d30d05f37f28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w54m4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:37Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:37 crc kubenswrapper[4895]: I1202 07:24:37.140789 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:24:37 crc kubenswrapper[4895]: E1202 07:24:37.140997 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 07:24:37 crc kubenswrapper[4895]: I1202 07:24:37.141488 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:24:37 crc kubenswrapper[4895]: E1202 07:24:37.141611 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 07:24:37 crc kubenswrapper[4895]: I1202 07:24:37.169020 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:37 crc kubenswrapper[4895]: I1202 07:24:37.169091 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:37 crc kubenswrapper[4895]: I1202 07:24:37.169114 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:37 crc kubenswrapper[4895]: I1202 07:24:37.169146 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:37 crc kubenswrapper[4895]: I1202 07:24:37.169170 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:37Z","lastTransitionTime":"2025-12-02T07:24:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:37 crc kubenswrapper[4895]: I1202 07:24:37.169451 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de3d6380-3929-4a07-8c35-cc726a255414\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4aecebf2f414351a8a75de9a970c8cc3c71debd541c3a14b2c0bc47c1f1bd68d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://443e5217ebe699e87855da6368830efb6c0cfd2269f7e478b0eba4228c7332d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d46786f8b9f8e55028793082aeeac5223ae936eadec4f863779607746386dafb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://033075acf80c00c23e81564e5384b1179d58a79ed6786abc63b792cb07d7a7e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3094ac339ea29ea82679832060bdab28766b6776f271b79d7d187ccf42144e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f5656c9cbe214094b8aa976e6a6022ce87fb0640b5aa640a98d4271c070f3ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f5656c9cbe214094b8aa976e6a6022ce87fb0640b5aa640a98d4271c070f3ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60184fc891c6407180abeded8004495660c5b52559c3132af2eef1c97e8f08ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60184fc891c6407180abeded8004495660c5b52559c3132af2eef1c97e8f08ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3787d284b2bfc6855be2fdcf47aa69cc845b2bf227c053fa482b169a9ff5d5cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3787d284b2bfc6855be2fdcf47aa69cc845b2bf227c053fa482b169a9ff5d5cb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-12-02T07:23:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:37Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:37 crc kubenswrapper[4895]: I1202 07:24:37.209821 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2db13abc9ba56d955814e75ceea54b92610a00daeefa29aa1aad584e3453728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:37Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:37 crc kubenswrapper[4895]: I1202 07:24:37.271994 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:37 crc kubenswrapper[4895]: I1202 07:24:37.272040 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:37 crc kubenswrapper[4895]: I1202 07:24:37.272052 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:37 crc kubenswrapper[4895]: I1202 07:24:37.272070 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:37 crc kubenswrapper[4895]: I1202 07:24:37.272083 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:37Z","lastTransitionTime":"2025-12-02T07:24:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:37 crc kubenswrapper[4895]: I1202 07:24:37.376340 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:37 crc kubenswrapper[4895]: I1202 07:24:37.376430 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:37 crc kubenswrapper[4895]: I1202 07:24:37.376450 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:37 crc kubenswrapper[4895]: I1202 07:24:37.376480 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:37 crc kubenswrapper[4895]: I1202 07:24:37.376501 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:37Z","lastTransitionTime":"2025-12-02T07:24:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:37 crc kubenswrapper[4895]: I1202 07:24:37.479812 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:37 crc kubenswrapper[4895]: I1202 07:24:37.479891 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:37 crc kubenswrapper[4895]: I1202 07:24:37.479910 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:37 crc kubenswrapper[4895]: I1202 07:24:37.479939 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:37 crc kubenswrapper[4895]: I1202 07:24:37.479958 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:37Z","lastTransitionTime":"2025-12-02T07:24:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:37 crc kubenswrapper[4895]: I1202 07:24:37.582840 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:37 crc kubenswrapper[4895]: I1202 07:24:37.582889 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:37 crc kubenswrapper[4895]: I1202 07:24:37.582902 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:37 crc kubenswrapper[4895]: I1202 07:24:37.582921 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:37 crc kubenswrapper[4895]: I1202 07:24:37.582934 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:37Z","lastTransitionTime":"2025-12-02T07:24:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:37 crc kubenswrapper[4895]: I1202 07:24:37.686440 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:37 crc kubenswrapper[4895]: I1202 07:24:37.686503 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:37 crc kubenswrapper[4895]: I1202 07:24:37.686518 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:37 crc kubenswrapper[4895]: I1202 07:24:37.686540 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:37 crc kubenswrapper[4895]: I1202 07:24:37.686559 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:37Z","lastTransitionTime":"2025-12-02T07:24:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:37 crc kubenswrapper[4895]: I1202 07:24:37.789834 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:37 crc kubenswrapper[4895]: I1202 07:24:37.789882 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:37 crc kubenswrapper[4895]: I1202 07:24:37.789893 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:37 crc kubenswrapper[4895]: I1202 07:24:37.789909 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:37 crc kubenswrapper[4895]: I1202 07:24:37.789920 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:37Z","lastTransitionTime":"2025-12-02T07:24:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:37 crc kubenswrapper[4895]: I1202 07:24:37.892722 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:37 crc kubenswrapper[4895]: I1202 07:24:37.892838 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:37 crc kubenswrapper[4895]: I1202 07:24:37.892871 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:37 crc kubenswrapper[4895]: I1202 07:24:37.892908 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:37 crc kubenswrapper[4895]: I1202 07:24:37.892931 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:37Z","lastTransitionTime":"2025-12-02T07:24:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:37 crc kubenswrapper[4895]: I1202 07:24:37.996410 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:37 crc kubenswrapper[4895]: I1202 07:24:37.996468 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:37 crc kubenswrapper[4895]: I1202 07:24:37.996485 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:37 crc kubenswrapper[4895]: I1202 07:24:37.996509 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:37 crc kubenswrapper[4895]: I1202 07:24:37.996525 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:37Z","lastTransitionTime":"2025-12-02T07:24:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:38 crc kubenswrapper[4895]: I1202 07:24:38.064132 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:38 crc kubenswrapper[4895]: I1202 07:24:38.064188 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:38 crc kubenswrapper[4895]: I1202 07:24:38.064200 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:38 crc kubenswrapper[4895]: I1202 07:24:38.064222 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:38 crc kubenswrapper[4895]: I1202 07:24:38.064235 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:38Z","lastTransitionTime":"2025-12-02T07:24:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:38 crc kubenswrapper[4895]: E1202 07:24:38.089243 4895 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:24:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:24:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:24:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:24:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:24:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:24:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:24:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:24:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"42683c5b-b2bf-439b-8ee4-25c8d72cfed1\\\",\\\"systemUUID\\\":\\\"92c636b1-dcb0-457f-b098-73baeaac297e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:38Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:38 crc kubenswrapper[4895]: I1202 07:24:38.095913 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:38 crc kubenswrapper[4895]: I1202 07:24:38.095978 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:38 crc kubenswrapper[4895]: I1202 07:24:38.096002 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:38 crc kubenswrapper[4895]: I1202 07:24:38.096032 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:38 crc kubenswrapper[4895]: I1202 07:24:38.096055 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:38Z","lastTransitionTime":"2025-12-02T07:24:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:38 crc kubenswrapper[4895]: E1202 07:24:38.126463 4895 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:24:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:24:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:24:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:24:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:24:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:24:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:24:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:24:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"42683c5b-b2bf-439b-8ee4-25c8d72cfed1\\\",\\\"systemUUID\\\":\\\"92c636b1-dcb0-457f-b098-73baeaac297e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:38Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:38 crc kubenswrapper[4895]: I1202 07:24:38.134507 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:38 crc kubenswrapper[4895]: I1202 07:24:38.134572 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:38 crc kubenswrapper[4895]: I1202 07:24:38.134598 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:38 crc kubenswrapper[4895]: I1202 07:24:38.134631 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:38 crc kubenswrapper[4895]: I1202 07:24:38.134653 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:38Z","lastTransitionTime":"2025-12-02T07:24:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:24:38 crc kubenswrapper[4895]: I1202 07:24:38.140146 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-5f88v" Dec 02 07:24:38 crc kubenswrapper[4895]: E1202 07:24:38.140308 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5f88v" podUID="5af25091-1401-45d4-ae53-d2b469c879da" Dec 02 07:24:38 crc kubenswrapper[4895]: I1202 07:24:38.140146 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:24:38 crc kubenswrapper[4895]: E1202 07:24:38.140658 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 07:24:38 crc kubenswrapper[4895]: E1202 07:24:38.156426 4895 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:24:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:24:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:24:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:24:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:24:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:24:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:24:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:24:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"42683c5b-b2bf-439b-8ee4-25c8d72cfed1\\\",\\\"systemUUID\\\":\\\"92c636b1-dcb0-457f-b098-73baeaac297e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:38Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:38 crc kubenswrapper[4895]: I1202 07:24:38.162990 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:38 crc kubenswrapper[4895]: I1202 07:24:38.163076 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:38 crc kubenswrapper[4895]: I1202 07:24:38.163103 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:38 crc kubenswrapper[4895]: I1202 07:24:38.163561 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:38 crc kubenswrapper[4895]: I1202 07:24:38.163864 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:38Z","lastTransitionTime":"2025-12-02T07:24:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:38 crc kubenswrapper[4895]: E1202 07:24:38.188122 4895 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:24:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:24:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:24:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:24:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:24:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:24:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:24:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:24:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"42683c5b-b2bf-439b-8ee4-25c8d72cfed1\\\",\\\"systemUUID\\\":\\\"92c636b1-dcb0-457f-b098-73baeaac297e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:38Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:38 crc kubenswrapper[4895]: I1202 07:24:38.193925 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:38 crc kubenswrapper[4895]: I1202 07:24:38.194006 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:38 crc kubenswrapper[4895]: I1202 07:24:38.194031 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:38 crc kubenswrapper[4895]: I1202 07:24:38.194059 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:38 crc kubenswrapper[4895]: I1202 07:24:38.194086 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:38Z","lastTransitionTime":"2025-12-02T07:24:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:38 crc kubenswrapper[4895]: E1202 07:24:38.217294 4895 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:24:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:24:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:24:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:24:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:24:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:24:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:24:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:24:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"42683c5b-b2bf-439b-8ee4-25c8d72cfed1\\\",\\\"systemUUID\\\":\\\"92c636b1-dcb0-457f-b098-73baeaac297e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:38Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:38 crc kubenswrapper[4895]: E1202 07:24:38.217679 4895 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 02 07:24:38 crc kubenswrapper[4895]: I1202 07:24:38.220083 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:38 crc kubenswrapper[4895]: I1202 07:24:38.220135 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:38 crc kubenswrapper[4895]: I1202 07:24:38.220150 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:38 crc kubenswrapper[4895]: I1202 07:24:38.220172 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:38 crc kubenswrapper[4895]: I1202 07:24:38.220191 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:38Z","lastTransitionTime":"2025-12-02T07:24:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:38 crc kubenswrapper[4895]: I1202 07:24:38.322875 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:38 crc kubenswrapper[4895]: I1202 07:24:38.322921 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:38 crc kubenswrapper[4895]: I1202 07:24:38.322939 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:38 crc kubenswrapper[4895]: I1202 07:24:38.322965 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:38 crc kubenswrapper[4895]: I1202 07:24:38.322983 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:38Z","lastTransitionTime":"2025-12-02T07:24:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:38 crc kubenswrapper[4895]: I1202 07:24:38.426665 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:38 crc kubenswrapper[4895]: I1202 07:24:38.426780 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:38 crc kubenswrapper[4895]: I1202 07:24:38.426804 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:38 crc kubenswrapper[4895]: I1202 07:24:38.426835 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:38 crc kubenswrapper[4895]: I1202 07:24:38.426862 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:38Z","lastTransitionTime":"2025-12-02T07:24:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:38 crc kubenswrapper[4895]: I1202 07:24:38.530654 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:38 crc kubenswrapper[4895]: I1202 07:24:38.530737 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:38 crc kubenswrapper[4895]: I1202 07:24:38.530783 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:38 crc kubenswrapper[4895]: I1202 07:24:38.530839 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:38 crc kubenswrapper[4895]: I1202 07:24:38.530860 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:38Z","lastTransitionTime":"2025-12-02T07:24:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:38 crc kubenswrapper[4895]: I1202 07:24:38.634602 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:38 crc kubenswrapper[4895]: I1202 07:24:38.634729 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:38 crc kubenswrapper[4895]: I1202 07:24:38.634788 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:38 crc kubenswrapper[4895]: I1202 07:24:38.634820 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:38 crc kubenswrapper[4895]: I1202 07:24:38.634845 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:38Z","lastTransitionTime":"2025-12-02T07:24:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:38 crc kubenswrapper[4895]: I1202 07:24:38.738436 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:38 crc kubenswrapper[4895]: I1202 07:24:38.738522 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:38 crc kubenswrapper[4895]: I1202 07:24:38.738548 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:38 crc kubenswrapper[4895]: I1202 07:24:38.738583 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:38 crc kubenswrapper[4895]: I1202 07:24:38.738606 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:38Z","lastTransitionTime":"2025-12-02T07:24:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:38 crc kubenswrapper[4895]: I1202 07:24:38.842331 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:38 crc kubenswrapper[4895]: I1202 07:24:38.842383 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:38 crc kubenswrapper[4895]: I1202 07:24:38.842396 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:38 crc kubenswrapper[4895]: I1202 07:24:38.842420 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:38 crc kubenswrapper[4895]: I1202 07:24:38.842435 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:38Z","lastTransitionTime":"2025-12-02T07:24:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:38 crc kubenswrapper[4895]: I1202 07:24:38.945853 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:38 crc kubenswrapper[4895]: I1202 07:24:38.945937 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:38 crc kubenswrapper[4895]: I1202 07:24:38.945959 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:38 crc kubenswrapper[4895]: I1202 07:24:38.945988 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:38 crc kubenswrapper[4895]: I1202 07:24:38.946014 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:38Z","lastTransitionTime":"2025-12-02T07:24:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:39 crc kubenswrapper[4895]: I1202 07:24:39.049459 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:39 crc kubenswrapper[4895]: I1202 07:24:39.049538 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:39 crc kubenswrapper[4895]: I1202 07:24:39.049550 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:39 crc kubenswrapper[4895]: I1202 07:24:39.049572 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:39 crc kubenswrapper[4895]: I1202 07:24:39.049590 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:39Z","lastTransitionTime":"2025-12-02T07:24:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:24:39 crc kubenswrapper[4895]: I1202 07:24:39.140361 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:24:39 crc kubenswrapper[4895]: I1202 07:24:39.140449 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:24:39 crc kubenswrapper[4895]: E1202 07:24:39.140571 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 07:24:39 crc kubenswrapper[4895]: E1202 07:24:39.140800 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 07:24:39 crc kubenswrapper[4895]: I1202 07:24:39.152373 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:39 crc kubenswrapper[4895]: I1202 07:24:39.152436 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:39 crc kubenswrapper[4895]: I1202 07:24:39.152455 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:39 crc kubenswrapper[4895]: I1202 07:24:39.152487 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:39 crc kubenswrapper[4895]: I1202 07:24:39.152506 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:39Z","lastTransitionTime":"2025-12-02T07:24:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:39 crc kubenswrapper[4895]: I1202 07:24:39.162039 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a06fda3-5f3f-4169-ae3c-f12424af7b8a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8044b0d3dc7f5c301eeeaee66d8461268ad266d0561461cd0b30bdc650cbdd99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73131074a12f26173aefdc0bde82a7446290cf2bec30f320cc37f8c4706ac50e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73131074a12f26173aefdc0bde82a7446290cf2bec30f320cc37f8c4706ac50e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:39Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:39 crc kubenswrapper[4895]: I1202 07:24:39.186614 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b7fa1ad-7e38-4645-ab41-c7d395f5c10e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40e020a2b4caf39d1661cee9c6b0055eca86e0ae9b81d2da7485ca9ec92dd0aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33e300332628d3baf9041b52dc7b8cd9dab1e9c5da76572e86b787433fd5fb92\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4455fec2a817e7266e152d3f4afbede483504d6066f1a0f7375bce282ae10f8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b674e20b2199fd0083aed9cf001d27a72d31ca6c62cbe376c9510cc94f37d312\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9998dffc20c5b0579e6214d10541b9702e9c4a6563c930a41e49b6853b832ba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T07:23:32Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 07:23:21.537254 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 07:23:21.538901 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1012964219/tls.crt::/tmp/serving-cert-1012964219/tls.key\\\\\\\"\\\\nI1202 07:23:32.848375 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 07:23:32.853597 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 07:23:32.853643 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 07:23:32.853680 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 07:23:32.853691 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 07:23:32.869490 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 07:23:32.869536 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:23:32.869542 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:23:32.869555 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 07:23:32.869559 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 07:23:32.869564 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 07:23:32.869568 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 07:23:32.869624 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1202 07:23:32.874299 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d318f71ad32dd4a410093b85741038e468d540edd30f112234a9d595bc6fd85f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0faa0995131a1f9cbf2aa0cf7651d510fad1bdaf7e32de51f8a14b70f161e783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0faa0995131a1f9cbf2aa0cf7651d510f
ad1bdaf7e32de51f8a14b70f161e783\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:39Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:39 crc kubenswrapper[4895]: I1202 07:24:39.207022 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:39Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:39 crc kubenswrapper[4895]: I1202 07:24:39.223428 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-74fkh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b212bd8-f1a7-4982-b2e6-499c13a34b0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4d58e6f37ee8f90af6aad88138c39e9a0753c16cceb763c2bd2ec9c30087d92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhpwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-74fkh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:39Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:39 crc kubenswrapper[4895]: I1202 07:24:39.245379 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0468d2d1-a975-45a6-af9f-47adc432fab0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebf8583b59d7955530da7a83a29f9994ba139c076cf7b76dc94e88d14f39e1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vfwjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1474e8f5989607f3b95fefe811e3d787320f04c223edb6ffe47d2b37863ea874\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vfwjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wfcg7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:39Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:39 crc kubenswrapper[4895]: I1202 07:24:39.255533 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:39 crc kubenswrapper[4895]: I1202 07:24:39.255614 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:39 crc kubenswrapper[4895]: I1202 07:24:39.255640 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:39 crc kubenswrapper[4895]: I1202 07:24:39.255678 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:39 crc kubenswrapper[4895]: I1202 07:24:39.255701 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:39Z","lastTransitionTime":"2025-12-02T07:24:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:39 crc kubenswrapper[4895]: I1202 07:24:39.271177 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4604b0d5-cf8f-404d-af67-65da4b1dc003\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b3c366f0f27ff793a37b2944887ab1c1bc155c14d5c8d6c267442b0a8666ef8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6c3baa5b37
72a8fc64774e28c3f389f114723e7efa62b934140a4fed8b6a40d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49d7c12ea2cd199dcfa46a6f1a1a856f1285a709def9d82b6805ba256a79fd5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2543d364f902a6c34a82dac25f389cd36afc2f717d772d0a487a640d44adf30f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:39Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:39 crc kubenswrapper[4895]: I1202 07:24:39.292731 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://444ad02c4afa919779de59fd8c81bdce5c3426dd6451b90505921a8d5fa0c96b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-02T07:24:39Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:39 crc kubenswrapper[4895]: I1202 07:24:39.310690 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:39Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:39 crc kubenswrapper[4895]: I1202 07:24:39.334269 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n7xcr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3514f381-d0d1-4e00-931e-c5ca75202a1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2670ef1ea22a6c1c5d9ec4ee4b0345c575b20224fddf8655b95eaa431573d071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3deeb37702c8243a956f2be1bb61e2693bb9be5453d775bc3f3e85618d483015\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3deeb37702c8243a956f2be1bb61e2693bb9be5453d775bc3f3e85618d483015\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2c60eddf285bbbf1a5957691f56d895e5bb7461fb18ab7c495879357cfae565\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2c60eddf285bbbf1a5957691f56d895e5bb7461fb18ab7c495879357cfae565\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e8b39d38597192393d90cc582b8ad109a1ed22c4c14f539e1a5445e6d817872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e8b39d38597192393d90cc582b8ad109a1ed22c4c14f539e1a5445e6d817872\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb5ab
c8da49059e23c4013f3856d3f4c9078cd8871a79f8dae5ce4d8cb319ef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb5abc8da49059e23c4013f3856d3f4c9078cd8871a79f8dae5ce4d8cb319ef9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbdd3e8943b356a485d900737682f0903dc04aefa33498ee5cdf3c82b6570852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbdd3e8943b356a485d900737682f0903dc04aefa33498ee5cdf3c82b6570852\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:43Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3641d1f7fd2ea062c216ce260c35d903273d3039fe790815ebb697f8c11d15cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3641d1f7fd2ea062c216ce260c35d903273d3039fe790815ebb697f8c11d15cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n7xcr\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:39Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:39 crc kubenswrapper[4895]: I1202 07:24:39.355371 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hlxqt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"30911fe5-208f-44e8-a380-2a0093f24863\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a569570ff32547c25fcdced649773ea0ab6d3aeccdaa5c26aa5a86d2c745535f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87ebbf4f529d4e7b64c392912d06e57dc80a2237a9a4e31ab5b56ac556aa10b1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\
":\\\"2025-12-02T07:24:24Z\\\",\\\"message\\\":\\\"2025-12-02T07:23:39+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_8d90423c-940a-42e9-9258-17f96c4385a6\\\\n2025-12-02T07:23:39+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8d90423c-940a-42e9-9258-17f96c4385a6 to /host/opt/cni/bin/\\\\n2025-12-02T07:23:39Z [verbose] multus-daemon started\\\\n2025-12-02T07:23:39Z [verbose] Readiness Indicator file check\\\\n2025-12-02T07:24:24Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:24:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\
\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6l8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hlxqt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:39Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:39 crc kubenswrapper[4895]: I1202 07:24:39.359803 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:39 crc kubenswrapper[4895]: I1202 07:24:39.359879 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:39 crc kubenswrapper[4895]: I1202 07:24:39.359907 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:39 crc kubenswrapper[4895]: I1202 07:24:39.359941 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:39 crc kubenswrapper[4895]: I1202 07:24:39.359969 4895 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:39Z","lastTransitionTime":"2025-12-02T07:24:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:24:39 crc kubenswrapper[4895]: I1202 07:24:39.378624 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5f88v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5af25091-1401-45d4-ae53-d2b469c879da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wfrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wfrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5f88v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:39Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:39 crc 
kubenswrapper[4895]: I1202 07:24:39.402647 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2df36ca5-8a5e-4865-8728-b8567297546a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f076cfc3bd5e782332e827d56325b73cf2a9f35248f397338b5130793481af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de2bad51e8d51f1243a2fe7799554c7f089425e3623a6bb62f7393610d70fc4e\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b737e88ad2a8caf70b89f59db461d355846d5612a29a249107b49fbec176204\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9d30ec3b0f73c029956f058f1d8f06cfffb5533dcd0c221ab6842d277f274a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9d30ec3b0f73c029956f058f1d8f06cfffb5533dcd0c221ab6842d277f274a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:09Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:09Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:39Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:39 crc kubenswrapper[4895]: I1202 07:24:39.429595 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:39Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:39 crc kubenswrapper[4895]: I1202 07:24:39.447941 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lhpbd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"81128292-8c02-45bd-9b25-e9457e989975\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cc63751b6428c88deed36b914b2a95a94a50b544c562126c2a9567c177b2570\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2z5n2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lhpbd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:39Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:39 crc kubenswrapper[4895]: I1202 07:24:39.463420 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:39 crc kubenswrapper[4895]: I1202 07:24:39.463481 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:39 crc kubenswrapper[4895]: I1202 07:24:39.463497 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:39 crc kubenswrapper[4895]: I1202 07:24:39.463547 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:39 crc kubenswrapper[4895]: I1202 07:24:39.463562 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:39Z","lastTransitionTime":"2025-12-02T07:24:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:39 crc kubenswrapper[4895]: I1202 07:24:39.464921 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wxp5p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a0b438f-70fd-4c3f-a170-2b3fe9797955\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7329cf1034b2d51ebb7f978448558a8b6fa39d58823289c470fddcfed262e395\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj6l9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cbc34b4d78cc00ac79b9944d92c6ac61a9d5dd704f36a473950baf53dfcbf47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj6l9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wxp5p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:39Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:39 crc kubenswrapper[4895]: I1202 07:24:39.491932 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"de3d6380-3929-4a07-8c35-cc726a255414\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4aecebf2f414351a8a75de9a970c8cc3c71debd541c3a14b2c0bc47c1f1bd68d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://443e5217ebe699e87855da6368830efb6c0cfd2269f7e478b0eba4228c7332d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d46786f8b9f8e55028793082aeeac5223ae936eadec4f863779607746386dafb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://033075acf80c00c23e81564e5384b1179d58a79ed6786abc63b792cb07d7a7e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3094ac339ea29ea82679832060bdab28766b6776f271b79d7d187ccf42144e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f5656c9cbe214094b8aa976e6a6022ce87fb0640b5aa640a98d4271c070f3ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f5656c9cbe214094b8aa976e6a6022ce87fb0640b5aa640a98d4271c070f3ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-02T07:23:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60184fc891c6407180abeded8004495660c5b52559c3132af2eef1c97e8f08ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60184fc891c6407180abeded8004495660c5b52559c3132af2eef1c97e8f08ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3787d284b2bfc6855be2fdcf47aa69cc845b2bf227c053fa482b169a9ff5d5cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3787d284b2bfc6855be2fdcf47aa69cc845b2bf227c053fa482b169a9ff5d5cb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:39Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:39 crc kubenswrapper[4895]: I1202 07:24:39.516030 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2db13abc9ba56d955814e75ceea54b92610a00daeefa29aa1aad584e3453728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:39Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:39 crc kubenswrapper[4895]: I1202 07:24:39.535881 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bb4bea68942dbface0009fbac76f7b1253ab282e8decb9081225de7d6537ec0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3115f2c50d05ee406f583937e39b1b55f364ce7cbeb0788b75941a22e23f57f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:39Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:39 crc kubenswrapper[4895]: I1202 07:24:39.557872 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"afc3334a-0153-4dcc-9a56-92f6cae51c08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91b0c6e50085a7be0924e5a54ef006110a7f13556fe47f7f55124fd6b2f0bc36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5463b077f0bc8b5d38e27854ece6a71f74f7769f4552c32eefe16854993f290f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80dca705ddb94ba799eb72bfc23b0c3878cceca48d6d10f1c7f8eb0e4beed40e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f54ae324450e73bcbf2c408794171a92bbb66c24d05f29584ac2c247090c1273\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f330d728eecd672ecb554744a51ebfb2c104441d3a85fe0c8a6fc3446c37e4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca732cac495c410d89da4ce1476acd20905716cbb2baeedac9c52b654ff58573\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52bad85cc5d71d65071ede7a4c939c8f773468925d9bb31b09a37641d870c228\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52bad85cc5d71d65071ede7a4c939c8f773468925d9bb31b09a37641d870c228\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T07:24:35Z\\\",\\\"message\\\":\\\"oing to retry *v1.Pod resource setup for 18 objects: [openshift-multus/multus-hlxqt openshift-multus/network-metrics-daemon-5f88v openshift-network-console/networking-console-plugin-85b44fc459-gdk6g openshift-network-operator/iptables-alerter-4ln5h openshift-ovn-kubernetes/ovnkube-node-w54m4 
openshift-machine-config-operator/kube-rbac-proxy-crio-crc openshift-kube-scheduler/openshift-kube-scheduler-crc openshift-machine-config-operator/machine-config-daemon-wfcg7 openshift-multus/multus-additional-cni-plugins-n7xcr openshift-network-diagnostics/network-check-target-xd92c openshift-kube-controller-manager/kube-controller-manager-crc openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wxp5p openshift-image-registry/node-ca-lhpbd openshift-kube-apiserver/kube-apiserver-crc openshift-network-diagnostics/network-check-source-55646444c4-trplf openshift-network-node-identity/network-node-identity-vrzqb openshift-network-operator/network-operator-58b4c7f79c-55gtf openshift-dns/node-resolver-74fkh]\\\\nF1202 07:24:35.228140 6919 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not ad\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:24:34Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-w54m4_openshift-ovn-kubernetes(afc3334a-0153-4dcc-9a56-92f6cae51c08)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6447684fe6189e38c9caad6dfe1384b7af01d50e2ef5f889c7b2874ba092306\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://328fa773c92ef0a965065412d4141c0b41ffe38d21ed169386a9d30d05f37f28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://328fa773c92ef0a965
065412d4141c0b41ffe38d21ed169386a9d30d05f37f28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w54m4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:39Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:39 crc kubenswrapper[4895]: I1202 07:24:39.569235 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:39 crc kubenswrapper[4895]: I1202 07:24:39.569320 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:39 crc kubenswrapper[4895]: I1202 07:24:39.569332 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:39 crc kubenswrapper[4895]: I1202 07:24:39.569355 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:39 crc kubenswrapper[4895]: I1202 07:24:39.569404 4895 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:39Z","lastTransitionTime":"2025-12-02T07:24:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:24:39 crc kubenswrapper[4895]: I1202 07:24:39.672594 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:39 crc kubenswrapper[4895]: I1202 07:24:39.672664 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:39 crc kubenswrapper[4895]: I1202 07:24:39.672681 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:39 crc kubenswrapper[4895]: I1202 07:24:39.672707 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:39 crc kubenswrapper[4895]: I1202 07:24:39.672722 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:39Z","lastTransitionTime":"2025-12-02T07:24:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:39 crc kubenswrapper[4895]: I1202 07:24:39.775715 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:39 crc kubenswrapper[4895]: I1202 07:24:39.775814 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:39 crc kubenswrapper[4895]: I1202 07:24:39.775834 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:39 crc kubenswrapper[4895]: I1202 07:24:39.775859 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:39 crc kubenswrapper[4895]: I1202 07:24:39.775878 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:39Z","lastTransitionTime":"2025-12-02T07:24:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:39 crc kubenswrapper[4895]: I1202 07:24:39.879250 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:39 crc kubenswrapper[4895]: I1202 07:24:39.879353 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:39 crc kubenswrapper[4895]: I1202 07:24:39.879372 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:39 crc kubenswrapper[4895]: I1202 07:24:39.879400 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:39 crc kubenswrapper[4895]: I1202 07:24:39.879422 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:39Z","lastTransitionTime":"2025-12-02T07:24:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:39 crc kubenswrapper[4895]: I1202 07:24:39.984129 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:39 crc kubenswrapper[4895]: I1202 07:24:39.984195 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:39 crc kubenswrapper[4895]: I1202 07:24:39.984212 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:39 crc kubenswrapper[4895]: I1202 07:24:39.984241 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:39 crc kubenswrapper[4895]: I1202 07:24:39.984259 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:39Z","lastTransitionTime":"2025-12-02T07:24:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:40 crc kubenswrapper[4895]: I1202 07:24:40.088611 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:40 crc kubenswrapper[4895]: I1202 07:24:40.088681 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:40 crc kubenswrapper[4895]: I1202 07:24:40.088701 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:40 crc kubenswrapper[4895]: I1202 07:24:40.088733 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:40 crc kubenswrapper[4895]: I1202 07:24:40.088807 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:40Z","lastTransitionTime":"2025-12-02T07:24:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:24:40 crc kubenswrapper[4895]: I1202 07:24:40.140326 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5f88v" Dec 02 07:24:40 crc kubenswrapper[4895]: I1202 07:24:40.140457 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:24:40 crc kubenswrapper[4895]: E1202 07:24:40.140573 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5f88v" podUID="5af25091-1401-45d4-ae53-d2b469c879da" Dec 02 07:24:40 crc kubenswrapper[4895]: E1202 07:24:40.140681 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 07:24:40 crc kubenswrapper[4895]: I1202 07:24:40.192666 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:40 crc kubenswrapper[4895]: I1202 07:24:40.192795 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:40 crc kubenswrapper[4895]: I1202 07:24:40.192824 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:40 crc kubenswrapper[4895]: I1202 07:24:40.192858 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:40 crc kubenswrapper[4895]: I1202 07:24:40.192881 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:40Z","lastTransitionTime":"2025-12-02T07:24:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:40 crc kubenswrapper[4895]: I1202 07:24:40.296188 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:40 crc kubenswrapper[4895]: I1202 07:24:40.296239 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:40 crc kubenswrapper[4895]: I1202 07:24:40.296249 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:40 crc kubenswrapper[4895]: I1202 07:24:40.296268 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:40 crc kubenswrapper[4895]: I1202 07:24:40.296280 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:40Z","lastTransitionTime":"2025-12-02T07:24:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:40 crc kubenswrapper[4895]: I1202 07:24:40.400458 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:40 crc kubenswrapper[4895]: I1202 07:24:40.400528 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:40 crc kubenswrapper[4895]: I1202 07:24:40.400550 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:40 crc kubenswrapper[4895]: I1202 07:24:40.400578 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:40 crc kubenswrapper[4895]: I1202 07:24:40.400597 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:40Z","lastTransitionTime":"2025-12-02T07:24:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:40 crc kubenswrapper[4895]: I1202 07:24:40.504271 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:40 crc kubenswrapper[4895]: I1202 07:24:40.504353 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:40 crc kubenswrapper[4895]: I1202 07:24:40.504376 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:40 crc kubenswrapper[4895]: I1202 07:24:40.504407 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:40 crc kubenswrapper[4895]: I1202 07:24:40.504428 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:40Z","lastTransitionTime":"2025-12-02T07:24:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:40 crc kubenswrapper[4895]: I1202 07:24:40.608553 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:40 crc kubenswrapper[4895]: I1202 07:24:40.608652 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:40 crc kubenswrapper[4895]: I1202 07:24:40.608673 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:40 crc kubenswrapper[4895]: I1202 07:24:40.608706 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:40 crc kubenswrapper[4895]: I1202 07:24:40.608730 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:40Z","lastTransitionTime":"2025-12-02T07:24:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:40 crc kubenswrapper[4895]: I1202 07:24:40.712146 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:40 crc kubenswrapper[4895]: I1202 07:24:40.712199 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:40 crc kubenswrapper[4895]: I1202 07:24:40.712213 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:40 crc kubenswrapper[4895]: I1202 07:24:40.712234 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:40 crc kubenswrapper[4895]: I1202 07:24:40.712250 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:40Z","lastTransitionTime":"2025-12-02T07:24:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:40 crc kubenswrapper[4895]: I1202 07:24:40.815573 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:40 crc kubenswrapper[4895]: I1202 07:24:40.815629 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:40 crc kubenswrapper[4895]: I1202 07:24:40.815646 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:40 crc kubenswrapper[4895]: I1202 07:24:40.815669 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:40 crc kubenswrapper[4895]: I1202 07:24:40.815685 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:40Z","lastTransitionTime":"2025-12-02T07:24:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:40 crc kubenswrapper[4895]: I1202 07:24:40.918845 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:40 crc kubenswrapper[4895]: I1202 07:24:40.918891 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:40 crc kubenswrapper[4895]: I1202 07:24:40.918901 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:40 crc kubenswrapper[4895]: I1202 07:24:40.918921 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:40 crc kubenswrapper[4895]: I1202 07:24:40.918934 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:40Z","lastTransitionTime":"2025-12-02T07:24:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:41 crc kubenswrapper[4895]: I1202 07:24:41.022477 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:41 crc kubenswrapper[4895]: I1202 07:24:41.022548 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:41 crc kubenswrapper[4895]: I1202 07:24:41.022566 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:41 crc kubenswrapper[4895]: I1202 07:24:41.022592 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:41 crc kubenswrapper[4895]: I1202 07:24:41.022611 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:41Z","lastTransitionTime":"2025-12-02T07:24:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:41 crc kubenswrapper[4895]: I1202 07:24:41.126294 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:41 crc kubenswrapper[4895]: I1202 07:24:41.126350 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:41 crc kubenswrapper[4895]: I1202 07:24:41.126365 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:41 crc kubenswrapper[4895]: I1202 07:24:41.126387 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:41 crc kubenswrapper[4895]: I1202 07:24:41.126403 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:41Z","lastTransitionTime":"2025-12-02T07:24:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:24:41 crc kubenswrapper[4895]: I1202 07:24:41.140179 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:24:41 crc kubenswrapper[4895]: I1202 07:24:41.140179 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:24:41 crc kubenswrapper[4895]: E1202 07:24:41.140399 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 07:24:41 crc kubenswrapper[4895]: E1202 07:24:41.140505 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 07:24:41 crc kubenswrapper[4895]: I1202 07:24:41.241306 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:41 crc kubenswrapper[4895]: I1202 07:24:41.241388 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:41 crc kubenswrapper[4895]: I1202 07:24:41.241406 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:41 crc kubenswrapper[4895]: I1202 07:24:41.241433 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:41 crc kubenswrapper[4895]: I1202 07:24:41.241452 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:41Z","lastTransitionTime":"2025-12-02T07:24:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:41 crc kubenswrapper[4895]: I1202 07:24:41.345627 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:41 crc kubenswrapper[4895]: I1202 07:24:41.345703 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:41 crc kubenswrapper[4895]: I1202 07:24:41.345720 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:41 crc kubenswrapper[4895]: I1202 07:24:41.345774 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:41 crc kubenswrapper[4895]: I1202 07:24:41.345798 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:41Z","lastTransitionTime":"2025-12-02T07:24:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:41 crc kubenswrapper[4895]: I1202 07:24:41.450416 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:41 crc kubenswrapper[4895]: I1202 07:24:41.450513 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:41 crc kubenswrapper[4895]: I1202 07:24:41.450540 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:41 crc kubenswrapper[4895]: I1202 07:24:41.450590 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:41 crc kubenswrapper[4895]: I1202 07:24:41.450618 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:41Z","lastTransitionTime":"2025-12-02T07:24:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:41 crc kubenswrapper[4895]: I1202 07:24:41.554585 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:41 crc kubenswrapper[4895]: I1202 07:24:41.554674 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:41 crc kubenswrapper[4895]: I1202 07:24:41.554697 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:41 crc kubenswrapper[4895]: I1202 07:24:41.554732 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:41 crc kubenswrapper[4895]: I1202 07:24:41.554802 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:41Z","lastTransitionTime":"2025-12-02T07:24:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:41 crc kubenswrapper[4895]: I1202 07:24:41.658020 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:41 crc kubenswrapper[4895]: I1202 07:24:41.658112 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:41 crc kubenswrapper[4895]: I1202 07:24:41.658130 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:41 crc kubenswrapper[4895]: I1202 07:24:41.658160 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:41 crc kubenswrapper[4895]: I1202 07:24:41.658184 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:41Z","lastTransitionTime":"2025-12-02T07:24:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:41 crc kubenswrapper[4895]: I1202 07:24:41.761446 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:41 crc kubenswrapper[4895]: I1202 07:24:41.761547 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:41 crc kubenswrapper[4895]: I1202 07:24:41.761569 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:41 crc kubenswrapper[4895]: I1202 07:24:41.761600 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:41 crc kubenswrapper[4895]: I1202 07:24:41.761621 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:41Z","lastTransitionTime":"2025-12-02T07:24:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:41 crc kubenswrapper[4895]: I1202 07:24:41.865426 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:41 crc kubenswrapper[4895]: I1202 07:24:41.865490 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:41 crc kubenswrapper[4895]: I1202 07:24:41.865505 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:41 crc kubenswrapper[4895]: I1202 07:24:41.865526 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:41 crc kubenswrapper[4895]: I1202 07:24:41.865540 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:41Z","lastTransitionTime":"2025-12-02T07:24:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:41 crc kubenswrapper[4895]: I1202 07:24:41.968464 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:41 crc kubenswrapper[4895]: I1202 07:24:41.968524 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:41 crc kubenswrapper[4895]: I1202 07:24:41.968533 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:41 crc kubenswrapper[4895]: I1202 07:24:41.968549 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:41 crc kubenswrapper[4895]: I1202 07:24:41.968559 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:41Z","lastTransitionTime":"2025-12-02T07:24:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:42 crc kubenswrapper[4895]: I1202 07:24:42.071286 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:42 crc kubenswrapper[4895]: I1202 07:24:42.071379 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:42 crc kubenswrapper[4895]: I1202 07:24:42.071402 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:42 crc kubenswrapper[4895]: I1202 07:24:42.071436 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:42 crc kubenswrapper[4895]: I1202 07:24:42.071460 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:42Z","lastTransitionTime":"2025-12-02T07:24:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:24:42 crc kubenswrapper[4895]: I1202 07:24:42.140652 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5f88v" Dec 02 07:24:42 crc kubenswrapper[4895]: I1202 07:24:42.140726 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:24:42 crc kubenswrapper[4895]: E1202 07:24:42.140883 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5f88v" podUID="5af25091-1401-45d4-ae53-d2b469c879da" Dec 02 07:24:42 crc kubenswrapper[4895]: E1202 07:24:42.141092 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 07:24:42 crc kubenswrapper[4895]: I1202 07:24:42.175218 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:42 crc kubenswrapper[4895]: I1202 07:24:42.175296 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:42 crc kubenswrapper[4895]: I1202 07:24:42.175315 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:42 crc kubenswrapper[4895]: I1202 07:24:42.175350 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:42 crc kubenswrapper[4895]: I1202 07:24:42.175374 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:42Z","lastTransitionTime":"2025-12-02T07:24:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:42 crc kubenswrapper[4895]: I1202 07:24:42.279095 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:42 crc kubenswrapper[4895]: I1202 07:24:42.279189 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:42 crc kubenswrapper[4895]: I1202 07:24:42.279215 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:42 crc kubenswrapper[4895]: I1202 07:24:42.279253 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:42 crc kubenswrapper[4895]: I1202 07:24:42.279279 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:42Z","lastTransitionTime":"2025-12-02T07:24:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:42 crc kubenswrapper[4895]: I1202 07:24:42.383194 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:42 crc kubenswrapper[4895]: I1202 07:24:42.383257 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:42 crc kubenswrapper[4895]: I1202 07:24:42.383275 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:42 crc kubenswrapper[4895]: I1202 07:24:42.383304 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:42 crc kubenswrapper[4895]: I1202 07:24:42.383321 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:42Z","lastTransitionTime":"2025-12-02T07:24:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:42 crc kubenswrapper[4895]: I1202 07:24:42.487693 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:42 crc kubenswrapper[4895]: I1202 07:24:42.487802 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:42 crc kubenswrapper[4895]: I1202 07:24:42.487822 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:42 crc kubenswrapper[4895]: I1202 07:24:42.487849 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:42 crc kubenswrapper[4895]: I1202 07:24:42.487870 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:42Z","lastTransitionTime":"2025-12-02T07:24:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:42 crc kubenswrapper[4895]: I1202 07:24:42.590971 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:42 crc kubenswrapper[4895]: I1202 07:24:42.591038 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:42 crc kubenswrapper[4895]: I1202 07:24:42.591054 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:42 crc kubenswrapper[4895]: I1202 07:24:42.591081 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:42 crc kubenswrapper[4895]: I1202 07:24:42.591096 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:42Z","lastTransitionTime":"2025-12-02T07:24:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:42 crc kubenswrapper[4895]: I1202 07:24:42.694422 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:42 crc kubenswrapper[4895]: I1202 07:24:42.694509 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:42 crc kubenswrapper[4895]: I1202 07:24:42.694529 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:42 crc kubenswrapper[4895]: I1202 07:24:42.694562 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:42 crc kubenswrapper[4895]: I1202 07:24:42.694584 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:42Z","lastTransitionTime":"2025-12-02T07:24:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:42 crc kubenswrapper[4895]: I1202 07:24:42.798066 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:42 crc kubenswrapper[4895]: I1202 07:24:42.798144 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:42 crc kubenswrapper[4895]: I1202 07:24:42.798162 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:42 crc kubenswrapper[4895]: I1202 07:24:42.798190 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:42 crc kubenswrapper[4895]: I1202 07:24:42.798209 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:42Z","lastTransitionTime":"2025-12-02T07:24:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:42 crc kubenswrapper[4895]: I1202 07:24:42.901856 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:42 crc kubenswrapper[4895]: I1202 07:24:42.901916 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:42 crc kubenswrapper[4895]: I1202 07:24:42.901946 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:42 crc kubenswrapper[4895]: I1202 07:24:42.901972 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:42 crc kubenswrapper[4895]: I1202 07:24:42.901992 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:42Z","lastTransitionTime":"2025-12-02T07:24:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:43 crc kubenswrapper[4895]: I1202 07:24:43.005209 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:43 crc kubenswrapper[4895]: I1202 07:24:43.005261 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:43 crc kubenswrapper[4895]: I1202 07:24:43.005277 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:43 crc kubenswrapper[4895]: I1202 07:24:43.005301 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:43 crc kubenswrapper[4895]: I1202 07:24:43.005317 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:43Z","lastTransitionTime":"2025-12-02T07:24:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:43 crc kubenswrapper[4895]: I1202 07:24:43.108627 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:43 crc kubenswrapper[4895]: I1202 07:24:43.108768 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:43 crc kubenswrapper[4895]: I1202 07:24:43.108796 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:43 crc kubenswrapper[4895]: I1202 07:24:43.108828 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:43 crc kubenswrapper[4895]: I1202 07:24:43.108855 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:43Z","lastTransitionTime":"2025-12-02T07:24:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:24:43 crc kubenswrapper[4895]: I1202 07:24:43.140957 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:24:43 crc kubenswrapper[4895]: I1202 07:24:43.141003 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:24:43 crc kubenswrapper[4895]: E1202 07:24:43.141134 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 07:24:43 crc kubenswrapper[4895]: E1202 07:24:43.141243 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 07:24:43 crc kubenswrapper[4895]: I1202 07:24:43.212729 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:43 crc kubenswrapper[4895]: I1202 07:24:43.212790 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:43 crc kubenswrapper[4895]: I1202 07:24:43.212804 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:43 crc kubenswrapper[4895]: I1202 07:24:43.212826 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:43 crc kubenswrapper[4895]: I1202 07:24:43.212841 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:43Z","lastTransitionTime":"2025-12-02T07:24:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:43 crc kubenswrapper[4895]: I1202 07:24:43.315567 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:43 crc kubenswrapper[4895]: I1202 07:24:43.315617 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:43 crc kubenswrapper[4895]: I1202 07:24:43.315626 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:43 crc kubenswrapper[4895]: I1202 07:24:43.315641 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:43 crc kubenswrapper[4895]: I1202 07:24:43.315651 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:43Z","lastTransitionTime":"2025-12-02T07:24:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:43 crc kubenswrapper[4895]: I1202 07:24:43.418036 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:43 crc kubenswrapper[4895]: I1202 07:24:43.418076 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:43 crc kubenswrapper[4895]: I1202 07:24:43.418085 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:43 crc kubenswrapper[4895]: I1202 07:24:43.418102 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:43 crc kubenswrapper[4895]: I1202 07:24:43.418113 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:43Z","lastTransitionTime":"2025-12-02T07:24:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:43 crc kubenswrapper[4895]: I1202 07:24:43.522361 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:43 crc kubenswrapper[4895]: I1202 07:24:43.522450 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:43 crc kubenswrapper[4895]: I1202 07:24:43.522469 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:43 crc kubenswrapper[4895]: I1202 07:24:43.522501 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:43 crc kubenswrapper[4895]: I1202 07:24:43.522519 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:43Z","lastTransitionTime":"2025-12-02T07:24:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:43 crc kubenswrapper[4895]: I1202 07:24:43.625918 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:43 crc kubenswrapper[4895]: I1202 07:24:43.625980 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:43 crc kubenswrapper[4895]: I1202 07:24:43.625998 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:43 crc kubenswrapper[4895]: I1202 07:24:43.626026 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:43 crc kubenswrapper[4895]: I1202 07:24:43.626044 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:43Z","lastTransitionTime":"2025-12-02T07:24:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:43 crc kubenswrapper[4895]: I1202 07:24:43.729400 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:43 crc kubenswrapper[4895]: I1202 07:24:43.729455 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:43 crc kubenswrapper[4895]: I1202 07:24:43.729474 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:43 crc kubenswrapper[4895]: I1202 07:24:43.729501 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:43 crc kubenswrapper[4895]: I1202 07:24:43.729521 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:43Z","lastTransitionTime":"2025-12-02T07:24:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:43 crc kubenswrapper[4895]: I1202 07:24:43.832823 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:43 crc kubenswrapper[4895]: I1202 07:24:43.832893 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:43 crc kubenswrapper[4895]: I1202 07:24:43.832919 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:43 crc kubenswrapper[4895]: I1202 07:24:43.832951 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:43 crc kubenswrapper[4895]: I1202 07:24:43.832976 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:43Z","lastTransitionTime":"2025-12-02T07:24:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:43 crc kubenswrapper[4895]: I1202 07:24:43.935489 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:43 crc kubenswrapper[4895]: I1202 07:24:43.935575 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:43 crc kubenswrapper[4895]: I1202 07:24:43.935611 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:43 crc kubenswrapper[4895]: I1202 07:24:43.935646 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:43 crc kubenswrapper[4895]: I1202 07:24:43.935669 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:43Z","lastTransitionTime":"2025-12-02T07:24:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:44 crc kubenswrapper[4895]: I1202 07:24:44.039131 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:44 crc kubenswrapper[4895]: I1202 07:24:44.039173 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:44 crc kubenswrapper[4895]: I1202 07:24:44.039183 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:44 crc kubenswrapper[4895]: I1202 07:24:44.039202 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:44 crc kubenswrapper[4895]: I1202 07:24:44.039216 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:44Z","lastTransitionTime":"2025-12-02T07:24:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:24:44 crc kubenswrapper[4895]: I1202 07:24:44.140432 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5f88v" Dec 02 07:24:44 crc kubenswrapper[4895]: I1202 07:24:44.140782 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:24:44 crc kubenswrapper[4895]: E1202 07:24:44.140930 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5f88v" podUID="5af25091-1401-45d4-ae53-d2b469c879da" Dec 02 07:24:44 crc kubenswrapper[4895]: E1202 07:24:44.141073 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 07:24:44 crc kubenswrapper[4895]: I1202 07:24:44.142343 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:44 crc kubenswrapper[4895]: I1202 07:24:44.142368 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:44 crc kubenswrapper[4895]: I1202 07:24:44.142377 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:44 crc kubenswrapper[4895]: I1202 07:24:44.142394 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:44 crc kubenswrapper[4895]: I1202 07:24:44.142406 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:44Z","lastTransitionTime":"2025-12-02T07:24:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:44 crc kubenswrapper[4895]: I1202 07:24:44.245810 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:44 crc kubenswrapper[4895]: I1202 07:24:44.245893 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:44 crc kubenswrapper[4895]: I1202 07:24:44.245918 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:44 crc kubenswrapper[4895]: I1202 07:24:44.245947 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:44 crc kubenswrapper[4895]: I1202 07:24:44.245968 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:44Z","lastTransitionTime":"2025-12-02T07:24:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:44 crc kubenswrapper[4895]: I1202 07:24:44.349452 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:44 crc kubenswrapper[4895]: I1202 07:24:44.349535 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:44 crc kubenswrapper[4895]: I1202 07:24:44.349558 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:44 crc kubenswrapper[4895]: I1202 07:24:44.349588 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:44 crc kubenswrapper[4895]: I1202 07:24:44.349611 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:44Z","lastTransitionTime":"2025-12-02T07:24:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:44 crc kubenswrapper[4895]: I1202 07:24:44.452624 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:44 crc kubenswrapper[4895]: I1202 07:24:44.452684 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:44 crc kubenswrapper[4895]: I1202 07:24:44.452709 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:44 crc kubenswrapper[4895]: I1202 07:24:44.452773 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:44 crc kubenswrapper[4895]: I1202 07:24:44.452800 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:44Z","lastTransitionTime":"2025-12-02T07:24:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:44 crc kubenswrapper[4895]: I1202 07:24:44.556132 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:44 crc kubenswrapper[4895]: I1202 07:24:44.556196 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:44 crc kubenswrapper[4895]: I1202 07:24:44.556222 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:44 crc kubenswrapper[4895]: I1202 07:24:44.556257 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:44 crc kubenswrapper[4895]: I1202 07:24:44.556279 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:44Z","lastTransitionTime":"2025-12-02T07:24:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:44 crc kubenswrapper[4895]: I1202 07:24:44.659233 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:44 crc kubenswrapper[4895]: I1202 07:24:44.659323 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:44 crc kubenswrapper[4895]: I1202 07:24:44.659346 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:44 crc kubenswrapper[4895]: I1202 07:24:44.659377 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:44 crc kubenswrapper[4895]: I1202 07:24:44.659403 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:44Z","lastTransitionTime":"2025-12-02T07:24:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:44 crc kubenswrapper[4895]: I1202 07:24:44.763399 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:44 crc kubenswrapper[4895]: I1202 07:24:44.763451 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:44 crc kubenswrapper[4895]: I1202 07:24:44.763468 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:44 crc kubenswrapper[4895]: I1202 07:24:44.763488 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:44 crc kubenswrapper[4895]: I1202 07:24:44.763504 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:44Z","lastTransitionTime":"2025-12-02T07:24:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:44 crc kubenswrapper[4895]: I1202 07:24:44.867228 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:44 crc kubenswrapper[4895]: I1202 07:24:44.867291 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:44 crc kubenswrapper[4895]: I1202 07:24:44.867312 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:44 crc kubenswrapper[4895]: I1202 07:24:44.867341 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:44 crc kubenswrapper[4895]: I1202 07:24:44.867361 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:44Z","lastTransitionTime":"2025-12-02T07:24:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:44 crc kubenswrapper[4895]: I1202 07:24:44.971357 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:44 crc kubenswrapper[4895]: I1202 07:24:44.971442 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:44 crc kubenswrapper[4895]: I1202 07:24:44.971466 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:44 crc kubenswrapper[4895]: I1202 07:24:44.971499 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:44 crc kubenswrapper[4895]: I1202 07:24:44.971524 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:44Z","lastTransitionTime":"2025-12-02T07:24:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:45 crc kubenswrapper[4895]: I1202 07:24:45.075342 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:45 crc kubenswrapper[4895]: I1202 07:24:45.075398 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:45 crc kubenswrapper[4895]: I1202 07:24:45.075414 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:45 crc kubenswrapper[4895]: I1202 07:24:45.075434 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:45 crc kubenswrapper[4895]: I1202 07:24:45.075448 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:45Z","lastTransitionTime":"2025-12-02T07:24:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:24:45 crc kubenswrapper[4895]: I1202 07:24:45.140465 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:24:45 crc kubenswrapper[4895]: I1202 07:24:45.140466 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:24:45 crc kubenswrapper[4895]: E1202 07:24:45.140831 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 07:24:45 crc kubenswrapper[4895]: E1202 07:24:45.140911 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 07:24:45 crc kubenswrapper[4895]: I1202 07:24:45.178279 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:45 crc kubenswrapper[4895]: I1202 07:24:45.178377 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:45 crc kubenswrapper[4895]: I1202 07:24:45.178398 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:45 crc kubenswrapper[4895]: I1202 07:24:45.178455 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:45 crc kubenswrapper[4895]: I1202 07:24:45.178479 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:45Z","lastTransitionTime":"2025-12-02T07:24:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:45 crc kubenswrapper[4895]: I1202 07:24:45.281789 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:45 crc kubenswrapper[4895]: I1202 07:24:45.281886 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:45 crc kubenswrapper[4895]: I1202 07:24:45.281913 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:45 crc kubenswrapper[4895]: I1202 07:24:45.281951 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:45 crc kubenswrapper[4895]: I1202 07:24:45.281978 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:45Z","lastTransitionTime":"2025-12-02T07:24:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:45 crc kubenswrapper[4895]: I1202 07:24:45.385477 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:45 crc kubenswrapper[4895]: I1202 07:24:45.385582 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:45 crc kubenswrapper[4895]: I1202 07:24:45.385606 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:45 crc kubenswrapper[4895]: I1202 07:24:45.385681 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:45 crc kubenswrapper[4895]: I1202 07:24:45.385707 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:45Z","lastTransitionTime":"2025-12-02T07:24:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:45 crc kubenswrapper[4895]: I1202 07:24:45.489725 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:45 crc kubenswrapper[4895]: I1202 07:24:45.489847 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:45 crc kubenswrapper[4895]: I1202 07:24:45.489891 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:45 crc kubenswrapper[4895]: I1202 07:24:45.489925 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:45 crc kubenswrapper[4895]: I1202 07:24:45.489947 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:45Z","lastTransitionTime":"2025-12-02T07:24:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:45 crc kubenswrapper[4895]: I1202 07:24:45.592490 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:45 crc kubenswrapper[4895]: I1202 07:24:45.592567 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:45 crc kubenswrapper[4895]: I1202 07:24:45.592581 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:45 crc kubenswrapper[4895]: I1202 07:24:45.592601 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:45 crc kubenswrapper[4895]: I1202 07:24:45.592613 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:45Z","lastTransitionTime":"2025-12-02T07:24:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:45 crc kubenswrapper[4895]: I1202 07:24:45.695845 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:45 crc kubenswrapper[4895]: I1202 07:24:45.695904 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:45 crc kubenswrapper[4895]: I1202 07:24:45.695918 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:45 crc kubenswrapper[4895]: I1202 07:24:45.695940 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:45 crc kubenswrapper[4895]: I1202 07:24:45.695954 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:45Z","lastTransitionTime":"2025-12-02T07:24:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:45 crc kubenswrapper[4895]: I1202 07:24:45.799192 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:45 crc kubenswrapper[4895]: I1202 07:24:45.799244 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:45 crc kubenswrapper[4895]: I1202 07:24:45.799259 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:45 crc kubenswrapper[4895]: I1202 07:24:45.799278 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:45 crc kubenswrapper[4895]: I1202 07:24:45.799289 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:45Z","lastTransitionTime":"2025-12-02T07:24:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:45 crc kubenswrapper[4895]: I1202 07:24:45.901954 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:45 crc kubenswrapper[4895]: I1202 07:24:45.901994 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:45 crc kubenswrapper[4895]: I1202 07:24:45.902004 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:45 crc kubenswrapper[4895]: I1202 07:24:45.902019 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:45 crc kubenswrapper[4895]: I1202 07:24:45.902029 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:45Z","lastTransitionTime":"2025-12-02T07:24:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:46 crc kubenswrapper[4895]: I1202 07:24:46.005793 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:46 crc kubenswrapper[4895]: I1202 07:24:46.005896 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:46 crc kubenswrapper[4895]: I1202 07:24:46.006145 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:46 crc kubenswrapper[4895]: I1202 07:24:46.006179 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:46 crc kubenswrapper[4895]: I1202 07:24:46.006197 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:46Z","lastTransitionTime":"2025-12-02T07:24:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:46 crc kubenswrapper[4895]: I1202 07:24:46.110964 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:46 crc kubenswrapper[4895]: I1202 07:24:46.111032 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:46 crc kubenswrapper[4895]: I1202 07:24:46.111054 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:46 crc kubenswrapper[4895]: I1202 07:24:46.111084 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:46 crc kubenswrapper[4895]: I1202 07:24:46.111102 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:46Z","lastTransitionTime":"2025-12-02T07:24:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:24:46 crc kubenswrapper[4895]: I1202 07:24:46.140441 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:24:46 crc kubenswrapper[4895]: I1202 07:24:46.140449 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5f88v" Dec 02 07:24:46 crc kubenswrapper[4895]: E1202 07:24:46.140779 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5f88v" podUID="5af25091-1401-45d4-ae53-d2b469c879da" Dec 02 07:24:46 crc kubenswrapper[4895]: E1202 07:24:46.140602 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 07:24:46 crc kubenswrapper[4895]: I1202 07:24:46.220858 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:46 crc kubenswrapper[4895]: I1202 07:24:46.221306 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:46 crc kubenswrapper[4895]: I1202 07:24:46.221670 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:46 crc kubenswrapper[4895]: I1202 07:24:46.221721 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:46 crc kubenswrapper[4895]: I1202 07:24:46.222185 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:46Z","lastTransitionTime":"2025-12-02T07:24:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:46 crc kubenswrapper[4895]: I1202 07:24:46.325051 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:46 crc kubenswrapper[4895]: I1202 07:24:46.325085 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:46 crc kubenswrapper[4895]: I1202 07:24:46.325095 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:46 crc kubenswrapper[4895]: I1202 07:24:46.325110 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:46 crc kubenswrapper[4895]: I1202 07:24:46.325120 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:46Z","lastTransitionTime":"2025-12-02T07:24:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:46 crc kubenswrapper[4895]: I1202 07:24:46.428039 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:46 crc kubenswrapper[4895]: I1202 07:24:46.428095 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:46 crc kubenswrapper[4895]: I1202 07:24:46.428115 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:46 crc kubenswrapper[4895]: I1202 07:24:46.428175 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:46 crc kubenswrapper[4895]: I1202 07:24:46.428193 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:46Z","lastTransitionTime":"2025-12-02T07:24:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:46 crc kubenswrapper[4895]: I1202 07:24:46.532597 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:46 crc kubenswrapper[4895]: I1202 07:24:46.532683 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:46 crc kubenswrapper[4895]: I1202 07:24:46.532703 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:46 crc kubenswrapper[4895]: I1202 07:24:46.532732 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:46 crc kubenswrapper[4895]: I1202 07:24:46.532782 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:46Z","lastTransitionTime":"2025-12-02T07:24:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:46 crc kubenswrapper[4895]: I1202 07:24:46.636618 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:46 crc kubenswrapper[4895]: I1202 07:24:46.636704 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:46 crc kubenswrapper[4895]: I1202 07:24:46.636731 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:46 crc kubenswrapper[4895]: I1202 07:24:46.636802 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:46 crc kubenswrapper[4895]: I1202 07:24:46.636829 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:46Z","lastTransitionTime":"2025-12-02T07:24:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:46 crc kubenswrapper[4895]: I1202 07:24:46.740284 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:46 crc kubenswrapper[4895]: I1202 07:24:46.740361 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:46 crc kubenswrapper[4895]: I1202 07:24:46.740378 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:46 crc kubenswrapper[4895]: I1202 07:24:46.740405 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:46 crc kubenswrapper[4895]: I1202 07:24:46.740427 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:46Z","lastTransitionTime":"2025-12-02T07:24:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:46 crc kubenswrapper[4895]: I1202 07:24:46.843511 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:46 crc kubenswrapper[4895]: I1202 07:24:46.843621 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:46 crc kubenswrapper[4895]: I1202 07:24:46.843649 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:46 crc kubenswrapper[4895]: I1202 07:24:46.843685 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:46 crc kubenswrapper[4895]: I1202 07:24:46.843718 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:46Z","lastTransitionTime":"2025-12-02T07:24:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:46 crc kubenswrapper[4895]: I1202 07:24:46.946697 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:46 crc kubenswrapper[4895]: I1202 07:24:46.946836 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:46 crc kubenswrapper[4895]: I1202 07:24:46.946861 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:46 crc kubenswrapper[4895]: I1202 07:24:46.946888 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:46 crc kubenswrapper[4895]: I1202 07:24:46.946906 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:46Z","lastTransitionTime":"2025-12-02T07:24:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:47 crc kubenswrapper[4895]: I1202 07:24:47.050447 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:47 crc kubenswrapper[4895]: I1202 07:24:47.050796 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:47 crc kubenswrapper[4895]: I1202 07:24:47.050887 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:47 crc kubenswrapper[4895]: I1202 07:24:47.051008 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:47 crc kubenswrapper[4895]: I1202 07:24:47.051108 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:47Z","lastTransitionTime":"2025-12-02T07:24:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:24:47 crc kubenswrapper[4895]: I1202 07:24:47.141091 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:24:47 crc kubenswrapper[4895]: I1202 07:24:47.141161 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:24:47 crc kubenswrapper[4895]: E1202 07:24:47.141915 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 07:24:47 crc kubenswrapper[4895]: E1202 07:24:47.142075 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 07:24:47 crc kubenswrapper[4895]: I1202 07:24:47.154469 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:47 crc kubenswrapper[4895]: I1202 07:24:47.154530 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:47 crc kubenswrapper[4895]: I1202 07:24:47.154545 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:47 crc kubenswrapper[4895]: I1202 07:24:47.154570 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:47 crc kubenswrapper[4895]: I1202 07:24:47.154584 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:47Z","lastTransitionTime":"2025-12-02T07:24:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:47 crc kubenswrapper[4895]: I1202 07:24:47.257996 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:47 crc kubenswrapper[4895]: I1202 07:24:47.258067 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:47 crc kubenswrapper[4895]: I1202 07:24:47.258086 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:47 crc kubenswrapper[4895]: I1202 07:24:47.258118 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:47 crc kubenswrapper[4895]: I1202 07:24:47.258137 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:47Z","lastTransitionTime":"2025-12-02T07:24:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:47 crc kubenswrapper[4895]: I1202 07:24:47.361852 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:47 crc kubenswrapper[4895]: I1202 07:24:47.361915 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:47 crc kubenswrapper[4895]: I1202 07:24:47.361929 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:47 crc kubenswrapper[4895]: I1202 07:24:47.361947 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:47 crc kubenswrapper[4895]: I1202 07:24:47.361961 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:47Z","lastTransitionTime":"2025-12-02T07:24:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:47 crc kubenswrapper[4895]: I1202 07:24:47.465278 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:47 crc kubenswrapper[4895]: I1202 07:24:47.465367 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:47 crc kubenswrapper[4895]: I1202 07:24:47.465382 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:47 crc kubenswrapper[4895]: I1202 07:24:47.465405 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:47 crc kubenswrapper[4895]: I1202 07:24:47.465418 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:47Z","lastTransitionTime":"2025-12-02T07:24:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:47 crc kubenswrapper[4895]: I1202 07:24:47.568270 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:47 crc kubenswrapper[4895]: I1202 07:24:47.568665 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:47 crc kubenswrapper[4895]: I1202 07:24:47.568735 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:47 crc kubenswrapper[4895]: I1202 07:24:47.568846 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:47 crc kubenswrapper[4895]: I1202 07:24:47.568913 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:47Z","lastTransitionTime":"2025-12-02T07:24:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:47 crc kubenswrapper[4895]: I1202 07:24:47.672073 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:47 crc kubenswrapper[4895]: I1202 07:24:47.672659 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:47 crc kubenswrapper[4895]: I1202 07:24:47.672958 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:47 crc kubenswrapper[4895]: I1202 07:24:47.673283 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:47 crc kubenswrapper[4895]: I1202 07:24:47.673467 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:47Z","lastTransitionTime":"2025-12-02T07:24:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:47 crc kubenswrapper[4895]: I1202 07:24:47.778340 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:47 crc kubenswrapper[4895]: I1202 07:24:47.778872 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:47 crc kubenswrapper[4895]: I1202 07:24:47.779100 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:47 crc kubenswrapper[4895]: I1202 07:24:47.779277 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:47 crc kubenswrapper[4895]: I1202 07:24:47.779428 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:47Z","lastTransitionTime":"2025-12-02T07:24:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:47 crc kubenswrapper[4895]: I1202 07:24:47.882890 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:47 crc kubenswrapper[4895]: I1202 07:24:47.882969 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:47 crc kubenswrapper[4895]: I1202 07:24:47.882986 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:47 crc kubenswrapper[4895]: I1202 07:24:47.883015 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:47 crc kubenswrapper[4895]: I1202 07:24:47.883042 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:47Z","lastTransitionTime":"2025-12-02T07:24:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:47 crc kubenswrapper[4895]: I1202 07:24:47.986161 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:47 crc kubenswrapper[4895]: I1202 07:24:47.986228 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:47 crc kubenswrapper[4895]: I1202 07:24:47.986253 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:47 crc kubenswrapper[4895]: I1202 07:24:47.986283 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:47 crc kubenswrapper[4895]: I1202 07:24:47.986305 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:47Z","lastTransitionTime":"2025-12-02T07:24:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:48 crc kubenswrapper[4895]: I1202 07:24:48.089381 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:48 crc kubenswrapper[4895]: I1202 07:24:48.089454 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:48 crc kubenswrapper[4895]: I1202 07:24:48.089492 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:48 crc kubenswrapper[4895]: I1202 07:24:48.089532 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:48 crc kubenswrapper[4895]: I1202 07:24:48.089561 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:48Z","lastTransitionTime":"2025-12-02T07:24:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:24:48 crc kubenswrapper[4895]: I1202 07:24:48.140544 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:24:48 crc kubenswrapper[4895]: I1202 07:24:48.140701 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5f88v" Dec 02 07:24:48 crc kubenswrapper[4895]: E1202 07:24:48.140868 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 07:24:48 crc kubenswrapper[4895]: E1202 07:24:48.141029 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5f88v" podUID="5af25091-1401-45d4-ae53-d2b469c879da" Dec 02 07:24:48 crc kubenswrapper[4895]: I1202 07:24:48.193374 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:48 crc kubenswrapper[4895]: I1202 07:24:48.193457 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:48 crc kubenswrapper[4895]: I1202 07:24:48.193487 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:48 crc kubenswrapper[4895]: I1202 07:24:48.193522 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:48 crc kubenswrapper[4895]: I1202 07:24:48.193547 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:48Z","lastTransitionTime":"2025-12-02T07:24:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:48 crc kubenswrapper[4895]: I1202 07:24:48.297678 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:48 crc kubenswrapper[4895]: I1202 07:24:48.297812 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:48 crc kubenswrapper[4895]: I1202 07:24:48.297836 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:48 crc kubenswrapper[4895]: I1202 07:24:48.297869 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:48 crc kubenswrapper[4895]: I1202 07:24:48.297892 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:48Z","lastTransitionTime":"2025-12-02T07:24:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:48 crc kubenswrapper[4895]: I1202 07:24:48.401127 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:48 crc kubenswrapper[4895]: I1202 07:24:48.401190 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:48 crc kubenswrapper[4895]: I1202 07:24:48.401211 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:48 crc kubenswrapper[4895]: I1202 07:24:48.401241 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:48 crc kubenswrapper[4895]: I1202 07:24:48.401258 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:48Z","lastTransitionTime":"2025-12-02T07:24:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:48 crc kubenswrapper[4895]: I1202 07:24:48.495862 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:48 crc kubenswrapper[4895]: I1202 07:24:48.495923 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:48 crc kubenswrapper[4895]: I1202 07:24:48.495945 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:48 crc kubenswrapper[4895]: I1202 07:24:48.495972 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:48 crc kubenswrapper[4895]: I1202 07:24:48.495991 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:48Z","lastTransitionTime":"2025-12-02T07:24:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:48 crc kubenswrapper[4895]: E1202 07:24:48.519087 4895 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:24:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:24:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:24:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:24:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:24:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:24:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:24:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:24:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"42683c5b-b2bf-439b-8ee4-25c8d72cfed1\\\",\\\"systemUUID\\\":\\\"92c636b1-dcb0-457f-b098-73baeaac297e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:48Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:48 crc kubenswrapper[4895]: I1202 07:24:48.525005 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:48 crc kubenswrapper[4895]: I1202 07:24:48.525070 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:48 crc kubenswrapper[4895]: I1202 07:24:48.525082 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:48 crc kubenswrapper[4895]: I1202 07:24:48.525103 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:48 crc kubenswrapper[4895]: I1202 07:24:48.525118 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:48Z","lastTransitionTime":"2025-12-02T07:24:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:48 crc kubenswrapper[4895]: E1202 07:24:48.543152 4895 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:24:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:24:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:24:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:24:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:24:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:24:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:24:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:24:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"42683c5b-b2bf-439b-8ee4-25c8d72cfed1\\\",\\\"systemUUID\\\":\\\"92c636b1-dcb0-457f-b098-73baeaac297e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:48Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:48 crc kubenswrapper[4895]: I1202 07:24:48.556023 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:48 crc kubenswrapper[4895]: I1202 07:24:48.556092 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:48 crc kubenswrapper[4895]: I1202 07:24:48.556110 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:48 crc kubenswrapper[4895]: I1202 07:24:48.556134 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:48 crc kubenswrapper[4895]: I1202 07:24:48.556153 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:48Z","lastTransitionTime":"2025-12-02T07:24:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:48 crc kubenswrapper[4895]: E1202 07:24:48.578130 4895 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:24:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:24:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:24:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:24:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:24:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:24:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:24:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:24:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"42683c5b-b2bf-439b-8ee4-25c8d72cfed1\\\",\\\"systemUUID\\\":\\\"92c636b1-dcb0-457f-b098-73baeaac297e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:48Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:48 crc kubenswrapper[4895]: I1202 07:24:48.583244 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:48 crc kubenswrapper[4895]: I1202 07:24:48.583327 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:48 crc kubenswrapper[4895]: I1202 07:24:48.583344 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:48 crc kubenswrapper[4895]: I1202 07:24:48.583370 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:48 crc kubenswrapper[4895]: I1202 07:24:48.583387 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:48Z","lastTransitionTime":"2025-12-02T07:24:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:48 crc kubenswrapper[4895]: I1202 07:24:48.610283 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:48 crc kubenswrapper[4895]: I1202 07:24:48.610327 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:48 crc kubenswrapper[4895]: I1202 07:24:48.610343 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:48 crc kubenswrapper[4895]: I1202 07:24:48.610365 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:48 crc kubenswrapper[4895]: I1202 07:24:48.610381 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:48Z","lastTransitionTime":"2025-12-02T07:24:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:48 crc kubenswrapper[4895]: E1202 07:24:48.629562 4895 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:24:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:24:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:24:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:24:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:24:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:24:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:24:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:24:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"42683c5b-b2bf-439b-8ee4-25c8d72cfed1\\\",\\\"systemUUID\\\":\\\"92c636b1-dcb0-457f-b098-73baeaac297e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:48Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:48 crc kubenswrapper[4895]: E1202 07:24:48.629723 4895 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 02 07:24:48 crc kubenswrapper[4895]: I1202 07:24:48.632047 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:48 crc kubenswrapper[4895]: I1202 07:24:48.632101 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:48 crc kubenswrapper[4895]: I1202 07:24:48.632115 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:48 crc kubenswrapper[4895]: I1202 07:24:48.632140 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:48 crc kubenswrapper[4895]: I1202 07:24:48.632153 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:48Z","lastTransitionTime":"2025-12-02T07:24:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:48 crc kubenswrapper[4895]: I1202 07:24:48.735642 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:48 crc kubenswrapper[4895]: I1202 07:24:48.735725 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:48 crc kubenswrapper[4895]: I1202 07:24:48.735826 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:48 crc kubenswrapper[4895]: I1202 07:24:48.735866 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:48 crc kubenswrapper[4895]: I1202 07:24:48.735891 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:48Z","lastTransitionTime":"2025-12-02T07:24:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:48 crc kubenswrapper[4895]: I1202 07:24:48.838878 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:48 crc kubenswrapper[4895]: I1202 07:24:48.838921 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:48 crc kubenswrapper[4895]: I1202 07:24:48.838932 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:48 crc kubenswrapper[4895]: I1202 07:24:48.838949 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:48 crc kubenswrapper[4895]: I1202 07:24:48.838961 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:48Z","lastTransitionTime":"2025-12-02T07:24:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:48 crc kubenswrapper[4895]: I1202 07:24:48.942914 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:48 crc kubenswrapper[4895]: I1202 07:24:48.942969 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:48 crc kubenswrapper[4895]: I1202 07:24:48.942982 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:48 crc kubenswrapper[4895]: I1202 07:24:48.943006 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:48 crc kubenswrapper[4895]: I1202 07:24:48.943020 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:48Z","lastTransitionTime":"2025-12-02T07:24:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:49 crc kubenswrapper[4895]: I1202 07:24:49.045779 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:49 crc kubenswrapper[4895]: I1202 07:24:49.045858 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:49 crc kubenswrapper[4895]: I1202 07:24:49.045876 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:49 crc kubenswrapper[4895]: I1202 07:24:49.045905 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:49 crc kubenswrapper[4895]: I1202 07:24:49.045925 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:49Z","lastTransitionTime":"2025-12-02T07:24:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:24:49 crc kubenswrapper[4895]: I1202 07:24:49.140436 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:24:49 crc kubenswrapper[4895]: E1202 07:24:49.140698 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 07:24:49 crc kubenswrapper[4895]: I1202 07:24:49.140875 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:24:49 crc kubenswrapper[4895]: E1202 07:24:49.141568 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 07:24:49 crc kubenswrapper[4895]: I1202 07:24:49.142136 4895 scope.go:117] "RemoveContainer" containerID="52bad85cc5d71d65071ede7a4c939c8f773468925d9bb31b09a37641d870c228" Dec 02 07:24:49 crc kubenswrapper[4895]: E1202 07:24:49.142427 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-w54m4_openshift-ovn-kubernetes(afc3334a-0153-4dcc-9a56-92f6cae51c08)\"" pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" podUID="afc3334a-0153-4dcc-9a56-92f6cae51c08" Dec 02 07:24:49 crc kubenswrapper[4895]: I1202 07:24:49.150882 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:49 crc kubenswrapper[4895]: I1202 07:24:49.151137 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:49 crc kubenswrapper[4895]: I1202 07:24:49.151304 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:49 crc kubenswrapper[4895]: I1202 
07:24:49.151469 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:49 crc kubenswrapper[4895]: I1202 07:24:49.151614 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:49Z","lastTransitionTime":"2025-12-02T07:24:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:24:49 crc kubenswrapper[4895]: I1202 07:24:49.164228 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a06fda3-5f3f-4169-ae3c-f12424af7b8a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8044b0d3dc7f5c301eeeaee66d8461268ad266d0561461cd0b30bdc650cbdd99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opens
hift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73131074a12f26173aefdc0bde82a7446290cf2bec30f320cc37f8c4706ac50e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73131074a12f26173aefdc0bde82a7446290cf2bec30f320cc37f8c4706ac50e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2025-12-02T07:24:49Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:49 crc kubenswrapper[4895]: I1202 07:24:49.189940 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b7fa1ad-7e38-4645-ab41-c7d395f5c10e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40e020a2b4caf39d1661cee9c6b0055eca86e0ae9b81d2da7485ca9ec92dd0aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\
\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33e300332628d3baf9041b52dc7b8cd9dab1e9c5da76572e86b787433fd5fb92\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4455fec2a817e7266e152d3f4afbede483504d6066f1a0f7375bce282ae10f8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b674e20b2199fd0083aed9cf001d27a72d31ca6c62cbe376c9510cc94f37d312\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276
e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9998dffc20c5b0579e6214d10541b9702e9c4a6563c930a41e49b6853b832ba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T07:23:32Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 07:23:21.537254 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 07:23:21.538901 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1012964219/tls.crt::/tmp/serving-cert-1012964219/tls.key\\\\\\\"\\\\nI1202 07:23:32.848375 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 07:23:32.853597 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 07:23:32.853643 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 07:23:32.853680 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 07:23:32.853691 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 07:23:32.869490 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 07:23:32.869536 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:23:32.869542 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:23:32.869555 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 07:23:32.869559 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 07:23:32.869564 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 07:23:32.869568 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 07:23:32.869624 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 07:23:32.874299 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d318f71ad32dd4a410093b85741038e468d540edd30f112234a9d595bc6fd85f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0faa0995131a1f9cbf2aa0cf7651d510fad1bdaf7e32de51f8a14b70f161e783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0faa0995131a1f9cbf2aa0cf7651d510fad1bdaf7e32de51f8a14b70f161e783\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:49Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:49 crc kubenswrapper[4895]: I1202 07:24:49.212580 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:49Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:49 crc kubenswrapper[4895]: I1202 07:24:49.230457 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-74fkh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b212bd8-f1a7-4982-b2e6-499c13a34b0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4d58e6f37ee8f90af6aad88138c39e9a0753c16cceb763c2bd2ec9c30087d92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhpwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-74fkh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:49Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:49 crc kubenswrapper[4895]: I1202 07:24:49.249891 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0468d2d1-a975-45a6-af9f-47adc432fab0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebf8583b59d7955530da7a83a29f9994ba139c076cf7b76dc94e88d14f39e1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vfwjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1474e8f5989607f3b95fefe811e3d787320f04c223edb6ffe47d2b37863ea874\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vfwjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wfcg7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:49Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:49 crc kubenswrapper[4895]: I1202 07:24:49.256015 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:49 crc kubenswrapper[4895]: I1202 07:24:49.256095 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:49 crc kubenswrapper[4895]: I1202 07:24:49.256115 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:49 crc kubenswrapper[4895]: I1202 07:24:49.256154 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:49 crc kubenswrapper[4895]: I1202 07:24:49.256180 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:49Z","lastTransitionTime":"2025-12-02T07:24:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:49 crc kubenswrapper[4895]: I1202 07:24:49.268452 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5f88v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5af25091-1401-45d4-ae53-d2b469c879da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wfrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wfrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5f88v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:49Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:49 crc 
kubenswrapper[4895]: I1202 07:24:49.290436 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4604b0d5-cf8f-404d-af67-65da4b1dc003\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b3c366f0f27ff793a37b2944887ab1c1bc155c14d5c8d6c267442b0a8666ef8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6c3baa5b3772a8fc64774e28c3f389f114723e7efa62b934140a4fed8b6a40d\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49d7c12ea2cd199dcfa46a6f1a1a856f1285a709def9d82b6805ba256a79fd5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2543d364f902a6c34a82dac25f389cd36afc2f717d772d0a487a640d44adf30f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:49Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:49 crc kubenswrapper[4895]: I1202 07:24:49.306010 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://444ad02c4afa919779de59fd8c81bdce5c3426dd6451b90505921a8d5fa0c96b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-02T07:24:49Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:49 crc kubenswrapper[4895]: I1202 07:24:49.324209 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:49Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:49 crc kubenswrapper[4895]: I1202 07:24:49.349400 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n7xcr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3514f381-d0d1-4e00-931e-c5ca75202a1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2670ef1ea22a6c1c5d9ec4ee4b0345c575b20224fddf8655b95eaa431573d071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3deeb37702c8243a956f2be1bb61e2693bb9be5453d775bc3f3e85618d483015\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3deeb37702c8243a956f2be1bb61e2693bb9be5453d775bc3f3e85618d483015\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2c60eddf285bbbf1a5957691f56d895e5bb7461fb18ab7c495879357cfae565\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2c60eddf285bbbf1a5957691f56d895e5bb7461fb18ab7c495879357cfae565\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e8b39d38597192393d90cc582b8ad109a1ed22c4c14f539e1a5445e6d817872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e8b39d38597192393d90cc582b8ad109a1ed22c4c14f539e1a5445e6d817872\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb5ab
c8da49059e23c4013f3856d3f4c9078cd8871a79f8dae5ce4d8cb319ef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb5abc8da49059e23c4013f3856d3f4c9078cd8871a79f8dae5ce4d8cb319ef9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbdd3e8943b356a485d900737682f0903dc04aefa33498ee5cdf3c82b6570852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbdd3e8943b356a485d900737682f0903dc04aefa33498ee5cdf3c82b6570852\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:43Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3641d1f7fd2ea062c216ce260c35d903273d3039fe790815ebb697f8c11d15cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3641d1f7fd2ea062c216ce260c35d903273d3039fe790815ebb697f8c11d15cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n7xcr\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:49Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:49 crc kubenswrapper[4895]: I1202 07:24:49.359834 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:49 crc kubenswrapper[4895]: I1202 07:24:49.359911 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:49 crc kubenswrapper[4895]: I1202 07:24:49.359930 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:49 crc kubenswrapper[4895]: I1202 07:24:49.359951 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:49 crc kubenswrapper[4895]: I1202 07:24:49.359985 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:49Z","lastTransitionTime":"2025-12-02T07:24:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:49 crc kubenswrapper[4895]: I1202 07:24:49.371779 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hlxqt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"30911fe5-208f-44e8-a380-2a0093f24863\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a569570ff32547c25fcdced649773ea0ab6d3aeccdaa5c26aa5a86d2c745535f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87ebbf4f529d4e7b64c392912d06e57dc80a2237a9a4e31ab5b56ac556aa10b1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T07:24:24Z\\\",\\\"message\\\":\\\"2025-12-02T07:23:39+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_8d90423c-940a-42e9-9258-17f96c4385a6\\\\n2025-12-02T07:23:39+00:00 
[cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8d90423c-940a-42e9-9258-17f96c4385a6 to /host/opt/cni/bin/\\\\n2025-12-02T07:23:39Z [verbose] multus-daemon started\\\\n2025-12-02T07:23:39Z [verbose] Readiness Indicator file check\\\\n2025-12-02T07:24:24Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:24:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6l8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hlxqt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:49Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:49 crc kubenswrapper[4895]: I1202 07:24:49.395474 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2df36ca5-8a5e-4865-8728-b8567297546a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f076cfc3bd5e782332e827d56325b73cf2a9f35248f397338b5130793481af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de2bad51e8d51f1243a2fe7799554c7f089425e3623a6bb62f7393610d70fc4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b737e88ad2a8caf70b89f59db461d355846d5612a29a249107b49fbec176204\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9d30ec3b0f73c029956f058f1d8f06cfffb5533dcd0c221ab6842d277f274a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://c9d30ec3b0f73c029956f058f1d8f06cfffb5533dcd0c221ab6842d277f274a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:09Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:09Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:49Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:49 crc kubenswrapper[4895]: I1202 07:24:49.426671 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:49Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:49 crc kubenswrapper[4895]: I1202 07:24:49.443686 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lhpbd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"81128292-8c02-45bd-9b25-e9457e989975\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cc63751b6428c88deed36b914b2a95a94a50b544c562126c2a9567c177b2570\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2z5n2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lhpbd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:49Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:49 crc kubenswrapper[4895]: I1202 07:24:49.462551 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wxp5p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a0b438f-70fd-4c3f-a170-2b3fe9797955\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7329cf1034b2d51ebb7f978448558a8b6fa39d58823289c470fddcfed262e395\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj6l9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cbc34b4d78cc00ac79b9944d92c6ac61a9d5dd704f36a473950baf53dfcbf47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj6l9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wxp5p\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:49Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:49 crc kubenswrapper[4895]: I1202 07:24:49.464120 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:49 crc kubenswrapper[4895]: I1202 07:24:49.464208 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:49 crc kubenswrapper[4895]: I1202 07:24:49.464229 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:49 crc kubenswrapper[4895]: I1202 07:24:49.464260 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:49 crc kubenswrapper[4895]: I1202 07:24:49.464280 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:49Z","lastTransitionTime":"2025-12-02T07:24:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:49 crc kubenswrapper[4895]: I1202 07:24:49.495164 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de3d6380-3929-4a07-8c35-cc726a255414\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4aecebf2f414351a8a75de9a970c8cc3c71debd541c3a14b2c0bc47c1f1bd68d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://443e5217ebe699e87855da6368830efb6c0cfd2269f7e478b0eba4228c7332d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d46786f8b9f8e55028793082aeeac5223ae936eadec4f863779607746386dafb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://033075acf80c00c23e81564e5384b1179d58a79ed6786abc63b792cb07d7a7e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3094ac339ea29ea82679832060bdab28766b6776f271b79d7d187ccf42144e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f5656c9cbe214094b8aa976e6a6022ce87fb0640b5aa640a98d4271c070f3ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f5656c9cbe214094b8aa976e6a6022ce87fb0640b5aa640a98d4271c070f3ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60184fc891c6407180abeded8004495660c5b52559c3132af2eef1c97e8f08ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60184fc891c6407180abeded8004495660c5b52559c3132af2eef1c97e8f08ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3787d284b2bfc6855be2fdcf47aa69cc845b2bf227c053fa482b169a9ff5d5cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3787d284b2bfc6855be2fdcf47aa69cc845b2bf227c053fa482b169a9ff5d5cb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-12-02T07:23:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:49Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:49 crc kubenswrapper[4895]: I1202 07:24:49.516712 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2db13abc9ba56d955814e75ceea54b92610a00daeefa29aa1aad584e3453728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:49Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:49 crc kubenswrapper[4895]: I1202 07:24:49.538896 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bb4bea68942dbface0009fbac76f7b1253ab282e8decb9081225de7d6537ec0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://3115f2c50d05ee406f583937e39b1b55f364ce7cbeb0788b75941a22e23f57f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:49Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:49 crc kubenswrapper[4895]: I1202 07:24:49.566414 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"afc3334a-0153-4dcc-9a56-92f6cae51c08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91b0c6e50085a7be0924e5a54ef006110a7f13556fe47f7f55124fd6b2f0bc36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5463b077f0bc8b5d38e27854ece6a71f74f7769f4552c32eefe16854993f290f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80dca705ddb94ba799eb72bfc23b0c3878cceca48d6d10f1c7f8eb0e4beed40e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f54ae324450e73bcbf2c408794171a92bbb66c24d05f29584ac2c247090c1273\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f330d728eecd672ecb554744a51ebfb2c104441d3a85fe0c8a6fc3446c37e4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca732cac495c410d89da4ce1476acd20905716cbb2baeedac9c52b654ff58573\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52bad85cc5d71d65071ede7a4c939c8f773468925d9bb31b09a37641d870c228\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52bad85cc5d71d65071ede7a4c939c8f773468925d9bb31b09a37641d870c228\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T07:24:35Z\\\",\\\"message\\\":\\\"oing to retry *v1.Pod resource setup for 18 objects: [openshift-multus/multus-hlxqt openshift-multus/network-metrics-daemon-5f88v openshift-network-console/networking-console-plugin-85b44fc459-gdk6g openshift-network-operator/iptables-alerter-4ln5h openshift-ovn-kubernetes/ovnkube-node-w54m4 
openshift-machine-config-operator/kube-rbac-proxy-crio-crc openshift-kube-scheduler/openshift-kube-scheduler-crc openshift-machine-config-operator/machine-config-daemon-wfcg7 openshift-multus/multus-additional-cni-plugins-n7xcr openshift-network-diagnostics/network-check-target-xd92c openshift-kube-controller-manager/kube-controller-manager-crc openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wxp5p openshift-image-registry/node-ca-lhpbd openshift-kube-apiserver/kube-apiserver-crc openshift-network-diagnostics/network-check-source-55646444c4-trplf openshift-network-node-identity/network-node-identity-vrzqb openshift-network-operator/network-operator-58b4c7f79c-55gtf openshift-dns/node-resolver-74fkh]\\\\nF1202 07:24:35.228140 6919 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not ad\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:24:34Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-w54m4_openshift-ovn-kubernetes(afc3334a-0153-4dcc-9a56-92f6cae51c08)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6447684fe6189e38c9caad6dfe1384b7af01d50e2ef5f889c7b2874ba092306\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://328fa773c92ef0a965065412d4141c0b41ffe38d21ed169386a9d30d05f37f28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://328fa773c92ef0a965
065412d4141c0b41ffe38d21ed169386a9d30d05f37f28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w54m4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:49Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:49 crc kubenswrapper[4895]: I1202 07:24:49.568403 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:49 crc kubenswrapper[4895]: I1202 07:24:49.568568 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:49 crc kubenswrapper[4895]: I1202 07:24:49.568589 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:49 crc kubenswrapper[4895]: I1202 07:24:49.568609 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:49 crc kubenswrapper[4895]: I1202 07:24:49.568623 4895 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:49Z","lastTransitionTime":"2025-12-02T07:24:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:24:49 crc kubenswrapper[4895]: I1202 07:24:49.672460 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:49 crc kubenswrapper[4895]: I1202 07:24:49.672533 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:49 crc kubenswrapper[4895]: I1202 07:24:49.672553 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:49 crc kubenswrapper[4895]: I1202 07:24:49.672577 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:49 crc kubenswrapper[4895]: I1202 07:24:49.672599 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:49Z","lastTransitionTime":"2025-12-02T07:24:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:49 crc kubenswrapper[4895]: I1202 07:24:49.775656 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:49 crc kubenswrapper[4895]: I1202 07:24:49.775722 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:49 crc kubenswrapper[4895]: I1202 07:24:49.775765 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:49 crc kubenswrapper[4895]: I1202 07:24:49.775793 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:49 crc kubenswrapper[4895]: I1202 07:24:49.775812 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:49Z","lastTransitionTime":"2025-12-02T07:24:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:49 crc kubenswrapper[4895]: I1202 07:24:49.878484 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:49 crc kubenswrapper[4895]: I1202 07:24:49.878570 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:49 crc kubenswrapper[4895]: I1202 07:24:49.878594 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:49 crc kubenswrapper[4895]: I1202 07:24:49.878624 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:49 crc kubenswrapper[4895]: I1202 07:24:49.878645 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:49Z","lastTransitionTime":"2025-12-02T07:24:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:49 crc kubenswrapper[4895]: I1202 07:24:49.982278 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:49 crc kubenswrapper[4895]: I1202 07:24:49.982345 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:49 crc kubenswrapper[4895]: I1202 07:24:49.982358 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:49 crc kubenswrapper[4895]: I1202 07:24:49.982379 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:49 crc kubenswrapper[4895]: I1202 07:24:49.982397 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:49Z","lastTransitionTime":"2025-12-02T07:24:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:50 crc kubenswrapper[4895]: I1202 07:24:50.085017 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:50 crc kubenswrapper[4895]: I1202 07:24:50.085052 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:50 crc kubenswrapper[4895]: I1202 07:24:50.085061 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:50 crc kubenswrapper[4895]: I1202 07:24:50.085076 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:50 crc kubenswrapper[4895]: I1202 07:24:50.085085 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:50Z","lastTransitionTime":"2025-12-02T07:24:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:24:50 crc kubenswrapper[4895]: I1202 07:24:50.140635 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5f88v" Dec 02 07:24:50 crc kubenswrapper[4895]: I1202 07:24:50.140654 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:24:50 crc kubenswrapper[4895]: E1202 07:24:50.141144 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5f88v" podUID="5af25091-1401-45d4-ae53-d2b469c879da" Dec 02 07:24:50 crc kubenswrapper[4895]: E1202 07:24:50.141310 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 07:24:50 crc kubenswrapper[4895]: I1202 07:24:50.188786 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:50 crc kubenswrapper[4895]: I1202 07:24:50.189141 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:50 crc kubenswrapper[4895]: I1202 07:24:50.189229 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:50 crc kubenswrapper[4895]: I1202 07:24:50.189372 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:50 crc kubenswrapper[4895]: I1202 07:24:50.189490 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:50Z","lastTransitionTime":"2025-12-02T07:24:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:50 crc kubenswrapper[4895]: I1202 07:24:50.295022 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:50 crc kubenswrapper[4895]: I1202 07:24:50.295073 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:50 crc kubenswrapper[4895]: I1202 07:24:50.295089 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:50 crc kubenswrapper[4895]: I1202 07:24:50.295109 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:50 crc kubenswrapper[4895]: I1202 07:24:50.295128 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:50Z","lastTransitionTime":"2025-12-02T07:24:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:50 crc kubenswrapper[4895]: I1202 07:24:50.398111 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:50 crc kubenswrapper[4895]: I1202 07:24:50.398166 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:50 crc kubenswrapper[4895]: I1202 07:24:50.398184 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:50 crc kubenswrapper[4895]: I1202 07:24:50.398207 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:50 crc kubenswrapper[4895]: I1202 07:24:50.398218 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:50Z","lastTransitionTime":"2025-12-02T07:24:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:50 crc kubenswrapper[4895]: I1202 07:24:50.501583 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:50 crc kubenswrapper[4895]: I1202 07:24:50.501638 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:50 crc kubenswrapper[4895]: I1202 07:24:50.501651 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:50 crc kubenswrapper[4895]: I1202 07:24:50.501671 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:50 crc kubenswrapper[4895]: I1202 07:24:50.501684 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:50Z","lastTransitionTime":"2025-12-02T07:24:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:50 crc kubenswrapper[4895]: I1202 07:24:50.605387 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:50 crc kubenswrapper[4895]: I1202 07:24:50.605464 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:50 crc kubenswrapper[4895]: I1202 07:24:50.605484 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:50 crc kubenswrapper[4895]: I1202 07:24:50.605516 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:50 crc kubenswrapper[4895]: I1202 07:24:50.605536 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:50Z","lastTransitionTime":"2025-12-02T07:24:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:50 crc kubenswrapper[4895]: I1202 07:24:50.708956 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:50 crc kubenswrapper[4895]: I1202 07:24:50.709067 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:50 crc kubenswrapper[4895]: I1202 07:24:50.709094 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:50 crc kubenswrapper[4895]: I1202 07:24:50.709135 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:50 crc kubenswrapper[4895]: I1202 07:24:50.709166 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:50Z","lastTransitionTime":"2025-12-02T07:24:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:50 crc kubenswrapper[4895]: I1202 07:24:50.812937 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:50 crc kubenswrapper[4895]: I1202 07:24:50.812994 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:50 crc kubenswrapper[4895]: I1202 07:24:50.813011 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:50 crc kubenswrapper[4895]: I1202 07:24:50.813037 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:50 crc kubenswrapper[4895]: I1202 07:24:50.813055 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:50Z","lastTransitionTime":"2025-12-02T07:24:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:50 crc kubenswrapper[4895]: I1202 07:24:50.916163 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:50 crc kubenswrapper[4895]: I1202 07:24:50.916233 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:50 crc kubenswrapper[4895]: I1202 07:24:50.916250 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:50 crc kubenswrapper[4895]: I1202 07:24:50.916276 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:50 crc kubenswrapper[4895]: I1202 07:24:50.916299 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:50Z","lastTransitionTime":"2025-12-02T07:24:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:51 crc kubenswrapper[4895]: I1202 07:24:51.018991 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:51 crc kubenswrapper[4895]: I1202 07:24:51.019089 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:51 crc kubenswrapper[4895]: I1202 07:24:51.019112 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:51 crc kubenswrapper[4895]: I1202 07:24:51.019140 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:51 crc kubenswrapper[4895]: I1202 07:24:51.019159 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:51Z","lastTransitionTime":"2025-12-02T07:24:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:51 crc kubenswrapper[4895]: I1202 07:24:51.122484 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:51 crc kubenswrapper[4895]: I1202 07:24:51.122580 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:51 crc kubenswrapper[4895]: I1202 07:24:51.122606 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:51 crc kubenswrapper[4895]: I1202 07:24:51.122643 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:51 crc kubenswrapper[4895]: I1202 07:24:51.122673 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:51Z","lastTransitionTime":"2025-12-02T07:24:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:24:51 crc kubenswrapper[4895]: I1202 07:24:51.141138 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:24:51 crc kubenswrapper[4895]: I1202 07:24:51.141174 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:24:51 crc kubenswrapper[4895]: E1202 07:24:51.141563 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 07:24:51 crc kubenswrapper[4895]: E1202 07:24:51.141691 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 07:24:51 crc kubenswrapper[4895]: I1202 07:24:51.227316 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:51 crc kubenswrapper[4895]: I1202 07:24:51.227475 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:51 crc kubenswrapper[4895]: I1202 07:24:51.227507 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:51 crc kubenswrapper[4895]: I1202 07:24:51.227535 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:51 crc kubenswrapper[4895]: I1202 07:24:51.227557 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:51Z","lastTransitionTime":"2025-12-02T07:24:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:51 crc kubenswrapper[4895]: I1202 07:24:51.330910 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:51 crc kubenswrapper[4895]: I1202 07:24:51.330974 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:51 crc kubenswrapper[4895]: I1202 07:24:51.330989 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:51 crc kubenswrapper[4895]: I1202 07:24:51.331010 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:51 crc kubenswrapper[4895]: I1202 07:24:51.331025 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:51Z","lastTransitionTime":"2025-12-02T07:24:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:51 crc kubenswrapper[4895]: I1202 07:24:51.434882 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:51 crc kubenswrapper[4895]: I1202 07:24:51.434954 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:51 crc kubenswrapper[4895]: I1202 07:24:51.434973 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:51 crc kubenswrapper[4895]: I1202 07:24:51.435001 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:51 crc kubenswrapper[4895]: I1202 07:24:51.435026 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:51Z","lastTransitionTime":"2025-12-02T07:24:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:51 crc kubenswrapper[4895]: I1202 07:24:51.538077 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:51 crc kubenswrapper[4895]: I1202 07:24:51.538164 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:51 crc kubenswrapper[4895]: I1202 07:24:51.538182 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:51 crc kubenswrapper[4895]: I1202 07:24:51.538211 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:51 crc kubenswrapper[4895]: I1202 07:24:51.538232 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:51Z","lastTransitionTime":"2025-12-02T07:24:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:51 crc kubenswrapper[4895]: I1202 07:24:51.641059 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:51 crc kubenswrapper[4895]: I1202 07:24:51.641121 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:51 crc kubenswrapper[4895]: I1202 07:24:51.641131 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:51 crc kubenswrapper[4895]: I1202 07:24:51.641149 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:51 crc kubenswrapper[4895]: I1202 07:24:51.641159 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:51Z","lastTransitionTime":"2025-12-02T07:24:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:51 crc kubenswrapper[4895]: I1202 07:24:51.745056 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:51 crc kubenswrapper[4895]: I1202 07:24:51.745128 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:51 crc kubenswrapper[4895]: I1202 07:24:51.745147 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:51 crc kubenswrapper[4895]: I1202 07:24:51.745178 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:51 crc kubenswrapper[4895]: I1202 07:24:51.745196 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:51Z","lastTransitionTime":"2025-12-02T07:24:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:51 crc kubenswrapper[4895]: I1202 07:24:51.850299 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:51 crc kubenswrapper[4895]: I1202 07:24:51.850374 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:51 crc kubenswrapper[4895]: I1202 07:24:51.850389 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:51 crc kubenswrapper[4895]: I1202 07:24:51.850409 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:51 crc kubenswrapper[4895]: I1202 07:24:51.850420 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:51Z","lastTransitionTime":"2025-12-02T07:24:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:51 crc kubenswrapper[4895]: I1202 07:24:51.953981 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:51 crc kubenswrapper[4895]: I1202 07:24:51.954056 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:51 crc kubenswrapper[4895]: I1202 07:24:51.954082 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:51 crc kubenswrapper[4895]: I1202 07:24:51.954115 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:51 crc kubenswrapper[4895]: I1202 07:24:51.954140 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:51Z","lastTransitionTime":"2025-12-02T07:24:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:52 crc kubenswrapper[4895]: I1202 07:24:52.057095 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:52 crc kubenswrapper[4895]: I1202 07:24:52.057174 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:52 crc kubenswrapper[4895]: I1202 07:24:52.057195 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:52 crc kubenswrapper[4895]: I1202 07:24:52.057221 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:52 crc kubenswrapper[4895]: I1202 07:24:52.057241 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:52Z","lastTransitionTime":"2025-12-02T07:24:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:24:52 crc kubenswrapper[4895]: I1202 07:24:52.140228 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:24:52 crc kubenswrapper[4895]: I1202 07:24:52.140278 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5f88v" Dec 02 07:24:52 crc kubenswrapper[4895]: E1202 07:24:52.140453 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 07:24:52 crc kubenswrapper[4895]: E1202 07:24:52.140553 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5f88v" podUID="5af25091-1401-45d4-ae53-d2b469c879da" Dec 02 07:24:52 crc kubenswrapper[4895]: I1202 07:24:52.160585 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:52 crc kubenswrapper[4895]: I1202 07:24:52.160661 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:52 crc kubenswrapper[4895]: I1202 07:24:52.160682 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:52 crc kubenswrapper[4895]: I1202 07:24:52.160708 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:52 crc kubenswrapper[4895]: I1202 07:24:52.160727 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:52Z","lastTransitionTime":"2025-12-02T07:24:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:52 crc kubenswrapper[4895]: I1202 07:24:52.263701 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:52 crc kubenswrapper[4895]: I1202 07:24:52.263822 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:52 crc kubenswrapper[4895]: I1202 07:24:52.263852 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:52 crc kubenswrapper[4895]: I1202 07:24:52.263884 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:52 crc kubenswrapper[4895]: I1202 07:24:52.263914 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:52Z","lastTransitionTime":"2025-12-02T07:24:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:52 crc kubenswrapper[4895]: I1202 07:24:52.366812 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:52 crc kubenswrapper[4895]: I1202 07:24:52.366878 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:52 crc kubenswrapper[4895]: I1202 07:24:52.366895 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:52 crc kubenswrapper[4895]: I1202 07:24:52.366921 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:52 crc kubenswrapper[4895]: I1202 07:24:52.366940 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:52Z","lastTransitionTime":"2025-12-02T07:24:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:52 crc kubenswrapper[4895]: I1202 07:24:52.470254 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:52 crc kubenswrapper[4895]: I1202 07:24:52.470329 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:52 crc kubenswrapper[4895]: I1202 07:24:52.470348 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:52 crc kubenswrapper[4895]: I1202 07:24:52.470374 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:52 crc kubenswrapper[4895]: I1202 07:24:52.470394 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:52Z","lastTransitionTime":"2025-12-02T07:24:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:52 crc kubenswrapper[4895]: I1202 07:24:52.573543 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:52 crc kubenswrapper[4895]: I1202 07:24:52.573611 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:52 crc kubenswrapper[4895]: I1202 07:24:52.573630 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:52 crc kubenswrapper[4895]: I1202 07:24:52.573656 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:52 crc kubenswrapper[4895]: I1202 07:24:52.573675 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:52Z","lastTransitionTime":"2025-12-02T07:24:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:52 crc kubenswrapper[4895]: I1202 07:24:52.676821 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:52 crc kubenswrapper[4895]: I1202 07:24:52.676930 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:52 crc kubenswrapper[4895]: I1202 07:24:52.676952 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:52 crc kubenswrapper[4895]: I1202 07:24:52.676980 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:52 crc kubenswrapper[4895]: I1202 07:24:52.677000 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:52Z","lastTransitionTime":"2025-12-02T07:24:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:52 crc kubenswrapper[4895]: I1202 07:24:52.780516 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:52 crc kubenswrapper[4895]: I1202 07:24:52.780592 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:52 crc kubenswrapper[4895]: I1202 07:24:52.780610 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:52 crc kubenswrapper[4895]: I1202 07:24:52.780636 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:52 crc kubenswrapper[4895]: I1202 07:24:52.780655 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:52Z","lastTransitionTime":"2025-12-02T07:24:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:52 crc kubenswrapper[4895]: I1202 07:24:52.883864 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:52 crc kubenswrapper[4895]: I1202 07:24:52.883921 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:52 crc kubenswrapper[4895]: I1202 07:24:52.883933 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:52 crc kubenswrapper[4895]: I1202 07:24:52.883954 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:52 crc kubenswrapper[4895]: I1202 07:24:52.883970 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:52Z","lastTransitionTime":"2025-12-02T07:24:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:52 crc kubenswrapper[4895]: I1202 07:24:52.987012 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:52 crc kubenswrapper[4895]: I1202 07:24:52.987071 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:52 crc kubenswrapper[4895]: I1202 07:24:52.987083 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:52 crc kubenswrapper[4895]: I1202 07:24:52.987104 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:52 crc kubenswrapper[4895]: I1202 07:24:52.987115 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:52Z","lastTransitionTime":"2025-12-02T07:24:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:53 crc kubenswrapper[4895]: I1202 07:24:53.090235 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:53 crc kubenswrapper[4895]: I1202 07:24:53.090281 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:53 crc kubenswrapper[4895]: I1202 07:24:53.090294 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:53 crc kubenswrapper[4895]: I1202 07:24:53.090313 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:53 crc kubenswrapper[4895]: I1202 07:24:53.090325 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:53Z","lastTransitionTime":"2025-12-02T07:24:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:24:53 crc kubenswrapper[4895]: I1202 07:24:53.141132 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:24:53 crc kubenswrapper[4895]: I1202 07:24:53.141329 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:24:53 crc kubenswrapper[4895]: E1202 07:24:53.141591 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 07:24:53 crc kubenswrapper[4895]: E1202 07:24:53.141808 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 07:24:53 crc kubenswrapper[4895]: I1202 07:24:53.194038 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:53 crc kubenswrapper[4895]: I1202 07:24:53.194121 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:53 crc kubenswrapper[4895]: I1202 07:24:53.194144 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:53 crc kubenswrapper[4895]: I1202 07:24:53.194358 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:53 crc kubenswrapper[4895]: I1202 07:24:53.194380 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:53Z","lastTransitionTime":"2025-12-02T07:24:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:53 crc kubenswrapper[4895]: I1202 07:24:53.297847 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:53 crc kubenswrapper[4895]: I1202 07:24:53.297888 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:53 crc kubenswrapper[4895]: I1202 07:24:53.297900 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:53 crc kubenswrapper[4895]: I1202 07:24:53.297916 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:53 crc kubenswrapper[4895]: I1202 07:24:53.297927 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:53Z","lastTransitionTime":"2025-12-02T07:24:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:53 crc kubenswrapper[4895]: I1202 07:24:53.401489 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:53 crc kubenswrapper[4895]: I1202 07:24:53.401535 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:53 crc kubenswrapper[4895]: I1202 07:24:53.401550 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:53 crc kubenswrapper[4895]: I1202 07:24:53.401570 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:53 crc kubenswrapper[4895]: I1202 07:24:53.401582 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:53Z","lastTransitionTime":"2025-12-02T07:24:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:53 crc kubenswrapper[4895]: I1202 07:24:53.504403 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:53 crc kubenswrapper[4895]: I1202 07:24:53.504440 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:53 crc kubenswrapper[4895]: I1202 07:24:53.504451 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:53 crc kubenswrapper[4895]: I1202 07:24:53.504470 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:53 crc kubenswrapper[4895]: I1202 07:24:53.504479 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:53Z","lastTransitionTime":"2025-12-02T07:24:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:53 crc kubenswrapper[4895]: I1202 07:24:53.608274 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:53 crc kubenswrapper[4895]: I1202 07:24:53.608329 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:53 crc kubenswrapper[4895]: I1202 07:24:53.608342 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:53 crc kubenswrapper[4895]: I1202 07:24:53.608361 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:53 crc kubenswrapper[4895]: I1202 07:24:53.608374 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:53Z","lastTransitionTime":"2025-12-02T07:24:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:53 crc kubenswrapper[4895]: I1202 07:24:53.712581 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:53 crc kubenswrapper[4895]: I1202 07:24:53.712661 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:53 crc kubenswrapper[4895]: I1202 07:24:53.712681 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:53 crc kubenswrapper[4895]: I1202 07:24:53.712713 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:53 crc kubenswrapper[4895]: I1202 07:24:53.712734 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:53Z","lastTransitionTime":"2025-12-02T07:24:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:53 crc kubenswrapper[4895]: I1202 07:24:53.816081 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:53 crc kubenswrapper[4895]: I1202 07:24:53.816160 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:53 crc kubenswrapper[4895]: I1202 07:24:53.816181 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:53 crc kubenswrapper[4895]: I1202 07:24:53.816213 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:53 crc kubenswrapper[4895]: I1202 07:24:53.816232 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:53Z","lastTransitionTime":"2025-12-02T07:24:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:53 crc kubenswrapper[4895]: I1202 07:24:53.919956 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:53 crc kubenswrapper[4895]: I1202 07:24:53.920020 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:53 crc kubenswrapper[4895]: I1202 07:24:53.920031 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:53 crc kubenswrapper[4895]: I1202 07:24:53.920051 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:53 crc kubenswrapper[4895]: I1202 07:24:53.920065 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:53Z","lastTransitionTime":"2025-12-02T07:24:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:54 crc kubenswrapper[4895]: I1202 07:24:54.023101 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:54 crc kubenswrapper[4895]: I1202 07:24:54.023156 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:54 crc kubenswrapper[4895]: I1202 07:24:54.023169 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:54 crc kubenswrapper[4895]: I1202 07:24:54.023192 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:54 crc kubenswrapper[4895]: I1202 07:24:54.023207 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:54Z","lastTransitionTime":"2025-12-02T07:24:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:54 crc kubenswrapper[4895]: I1202 07:24:54.126290 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:54 crc kubenswrapper[4895]: I1202 07:24:54.126346 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:54 crc kubenswrapper[4895]: I1202 07:24:54.126362 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:54 crc kubenswrapper[4895]: I1202 07:24:54.126388 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:54 crc kubenswrapper[4895]: I1202 07:24:54.126405 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:54Z","lastTransitionTime":"2025-12-02T07:24:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:24:54 crc kubenswrapper[4895]: I1202 07:24:54.140811 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5f88v" Dec 02 07:24:54 crc kubenswrapper[4895]: I1202 07:24:54.140925 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:24:54 crc kubenswrapper[4895]: E1202 07:24:54.140966 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5f88v" podUID="5af25091-1401-45d4-ae53-d2b469c879da" Dec 02 07:24:54 crc kubenswrapper[4895]: E1202 07:24:54.141081 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 07:24:54 crc kubenswrapper[4895]: I1202 07:24:54.229612 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:54 crc kubenswrapper[4895]: I1202 07:24:54.229688 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:54 crc kubenswrapper[4895]: I1202 07:24:54.229713 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:54 crc kubenswrapper[4895]: I1202 07:24:54.229778 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:54 crc kubenswrapper[4895]: I1202 07:24:54.229797 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:54Z","lastTransitionTime":"2025-12-02T07:24:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:54 crc kubenswrapper[4895]: I1202 07:24:54.333248 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:54 crc kubenswrapper[4895]: I1202 07:24:54.333341 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:54 crc kubenswrapper[4895]: I1202 07:24:54.333369 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:54 crc kubenswrapper[4895]: I1202 07:24:54.333406 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:54 crc kubenswrapper[4895]: I1202 07:24:54.333437 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:54Z","lastTransitionTime":"2025-12-02T07:24:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:54 crc kubenswrapper[4895]: I1202 07:24:54.436849 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:54 crc kubenswrapper[4895]: I1202 07:24:54.436942 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:54 crc kubenswrapper[4895]: I1202 07:24:54.436969 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:54 crc kubenswrapper[4895]: I1202 07:24:54.437004 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:54 crc kubenswrapper[4895]: I1202 07:24:54.437034 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:54Z","lastTransitionTime":"2025-12-02T07:24:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:54 crc kubenswrapper[4895]: I1202 07:24:54.540185 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:54 crc kubenswrapper[4895]: I1202 07:24:54.540231 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:54 crc kubenswrapper[4895]: I1202 07:24:54.540243 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:54 crc kubenswrapper[4895]: I1202 07:24:54.540261 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:54 crc kubenswrapper[4895]: I1202 07:24:54.540271 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:54Z","lastTransitionTime":"2025-12-02T07:24:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:54 crc kubenswrapper[4895]: I1202 07:24:54.644315 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:54 crc kubenswrapper[4895]: I1202 07:24:54.644381 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:54 crc kubenswrapper[4895]: I1202 07:24:54.644401 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:54 crc kubenswrapper[4895]: I1202 07:24:54.644426 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:54 crc kubenswrapper[4895]: I1202 07:24:54.644445 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:54Z","lastTransitionTime":"2025-12-02T07:24:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:54 crc kubenswrapper[4895]: I1202 07:24:54.748141 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:54 crc kubenswrapper[4895]: I1202 07:24:54.748195 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:54 crc kubenswrapper[4895]: I1202 07:24:54.748210 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:54 crc kubenswrapper[4895]: I1202 07:24:54.748234 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:54 crc kubenswrapper[4895]: I1202 07:24:54.748247 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:54Z","lastTransitionTime":"2025-12-02T07:24:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:54 crc kubenswrapper[4895]: I1202 07:24:54.851324 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:54 crc kubenswrapper[4895]: I1202 07:24:54.851376 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:54 crc kubenswrapper[4895]: I1202 07:24:54.851387 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:54 crc kubenswrapper[4895]: I1202 07:24:54.851406 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:54 crc kubenswrapper[4895]: I1202 07:24:54.851418 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:54Z","lastTransitionTime":"2025-12-02T07:24:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:54 crc kubenswrapper[4895]: I1202 07:24:54.955203 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:54 crc kubenswrapper[4895]: I1202 07:24:54.955298 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:54 crc kubenswrapper[4895]: I1202 07:24:54.955319 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:54 crc kubenswrapper[4895]: I1202 07:24:54.955356 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:54 crc kubenswrapper[4895]: I1202 07:24:54.955399 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:54Z","lastTransitionTime":"2025-12-02T07:24:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:55 crc kubenswrapper[4895]: I1202 07:24:55.058411 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:55 crc kubenswrapper[4895]: I1202 07:24:55.058465 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:55 crc kubenswrapper[4895]: I1202 07:24:55.058477 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:55 crc kubenswrapper[4895]: I1202 07:24:55.058496 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:55 crc kubenswrapper[4895]: I1202 07:24:55.058507 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:55Z","lastTransitionTime":"2025-12-02T07:24:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:24:55 crc kubenswrapper[4895]: I1202 07:24:55.140481 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:24:55 crc kubenswrapper[4895]: I1202 07:24:55.140614 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:24:55 crc kubenswrapper[4895]: E1202 07:24:55.140683 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 07:24:55 crc kubenswrapper[4895]: E1202 07:24:55.140840 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 07:24:55 crc kubenswrapper[4895]: I1202 07:24:55.161455 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:55 crc kubenswrapper[4895]: I1202 07:24:55.161492 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:55 crc kubenswrapper[4895]: I1202 07:24:55.161500 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:55 crc kubenswrapper[4895]: I1202 07:24:55.161518 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:55 crc kubenswrapper[4895]: I1202 07:24:55.161528 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:55Z","lastTransitionTime":"2025-12-02T07:24:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:55 crc kubenswrapper[4895]: I1202 07:24:55.264684 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:55 crc kubenswrapper[4895]: I1202 07:24:55.264825 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:55 crc kubenswrapper[4895]: I1202 07:24:55.264851 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:55 crc kubenswrapper[4895]: I1202 07:24:55.264880 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:55 crc kubenswrapper[4895]: I1202 07:24:55.264898 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:55Z","lastTransitionTime":"2025-12-02T07:24:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:55 crc kubenswrapper[4895]: I1202 07:24:55.368447 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:55 crc kubenswrapper[4895]: I1202 07:24:55.368538 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:55 crc kubenswrapper[4895]: I1202 07:24:55.368557 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:55 crc kubenswrapper[4895]: I1202 07:24:55.368583 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:55 crc kubenswrapper[4895]: I1202 07:24:55.368607 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:55Z","lastTransitionTime":"2025-12-02T07:24:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:55 crc kubenswrapper[4895]: I1202 07:24:55.471796 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:55 crc kubenswrapper[4895]: I1202 07:24:55.471903 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:55 crc kubenswrapper[4895]: I1202 07:24:55.471927 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:55 crc kubenswrapper[4895]: I1202 07:24:55.471962 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:55 crc kubenswrapper[4895]: I1202 07:24:55.471987 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:55Z","lastTransitionTime":"2025-12-02T07:24:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:55 crc kubenswrapper[4895]: I1202 07:24:55.575387 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:55 crc kubenswrapper[4895]: I1202 07:24:55.575445 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:55 crc kubenswrapper[4895]: I1202 07:24:55.575460 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:55 crc kubenswrapper[4895]: I1202 07:24:55.575482 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:55 crc kubenswrapper[4895]: I1202 07:24:55.575528 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:55Z","lastTransitionTime":"2025-12-02T07:24:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:55 crc kubenswrapper[4895]: I1202 07:24:55.678640 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:55 crc kubenswrapper[4895]: I1202 07:24:55.678718 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:55 crc kubenswrapper[4895]: I1202 07:24:55.678769 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:55 crc kubenswrapper[4895]: I1202 07:24:55.678808 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:55 crc kubenswrapper[4895]: I1202 07:24:55.678834 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:55Z","lastTransitionTime":"2025-12-02T07:24:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:55 crc kubenswrapper[4895]: I1202 07:24:55.782530 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:55 crc kubenswrapper[4895]: I1202 07:24:55.782628 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:55 crc kubenswrapper[4895]: I1202 07:24:55.782666 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:55 crc kubenswrapper[4895]: I1202 07:24:55.782701 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:55 crc kubenswrapper[4895]: I1202 07:24:55.782725 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:55Z","lastTransitionTime":"2025-12-02T07:24:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:55 crc kubenswrapper[4895]: I1202 07:24:55.885919 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:55 crc kubenswrapper[4895]: I1202 07:24:55.885982 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:55 crc kubenswrapper[4895]: I1202 07:24:55.885999 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:55 crc kubenswrapper[4895]: I1202 07:24:55.886025 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:55 crc kubenswrapper[4895]: I1202 07:24:55.886045 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:55Z","lastTransitionTime":"2025-12-02T07:24:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:55 crc kubenswrapper[4895]: I1202 07:24:55.989207 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:55 crc kubenswrapper[4895]: I1202 07:24:55.989276 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:55 crc kubenswrapper[4895]: I1202 07:24:55.989294 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:55 crc kubenswrapper[4895]: I1202 07:24:55.989327 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:55 crc kubenswrapper[4895]: I1202 07:24:55.989347 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:55Z","lastTransitionTime":"2025-12-02T07:24:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:56 crc kubenswrapper[4895]: I1202 07:24:56.091818 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:56 crc kubenswrapper[4895]: I1202 07:24:56.091896 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:56 crc kubenswrapper[4895]: I1202 07:24:56.091938 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:56 crc kubenswrapper[4895]: I1202 07:24:56.091975 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:56 crc kubenswrapper[4895]: I1202 07:24:56.091999 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:56Z","lastTransitionTime":"2025-12-02T07:24:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:24:56 crc kubenswrapper[4895]: I1202 07:24:56.140006 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5f88v" Dec 02 07:24:56 crc kubenswrapper[4895]: I1202 07:24:56.140064 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:24:56 crc kubenswrapper[4895]: E1202 07:24:56.140157 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5f88v" podUID="5af25091-1401-45d4-ae53-d2b469c879da" Dec 02 07:24:56 crc kubenswrapper[4895]: E1202 07:24:56.140316 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 07:24:56 crc kubenswrapper[4895]: I1202 07:24:56.182029 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5af25091-1401-45d4-ae53-d2b469c879da-metrics-certs\") pod \"network-metrics-daemon-5f88v\" (UID: \"5af25091-1401-45d4-ae53-d2b469c879da\") " pod="openshift-multus/network-metrics-daemon-5f88v" Dec 02 07:24:56 crc kubenswrapper[4895]: E1202 07:24:56.182231 4895 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 02 07:24:56 crc kubenswrapper[4895]: E1202 07:24:56.182314 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5af25091-1401-45d4-ae53-d2b469c879da-metrics-certs podName:5af25091-1401-45d4-ae53-d2b469c879da nodeName:}" failed. No retries permitted until 2025-12-02 07:26:00.182291664 +0000 UTC m=+171.353151277 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5af25091-1401-45d4-ae53-d2b469c879da-metrics-certs") pod "network-metrics-daemon-5f88v" (UID: "5af25091-1401-45d4-ae53-d2b469c879da") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 02 07:24:56 crc kubenswrapper[4895]: I1202 07:24:56.195310 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:56 crc kubenswrapper[4895]: I1202 07:24:56.195345 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:56 crc kubenswrapper[4895]: I1202 07:24:56.195357 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:56 crc kubenswrapper[4895]: I1202 07:24:56.195374 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:56 crc kubenswrapper[4895]: I1202 07:24:56.195386 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:56Z","lastTransitionTime":"2025-12-02T07:24:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:56 crc kubenswrapper[4895]: I1202 07:24:56.298287 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:56 crc kubenswrapper[4895]: I1202 07:24:56.298354 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:56 crc kubenswrapper[4895]: I1202 07:24:56.298379 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:56 crc kubenswrapper[4895]: I1202 07:24:56.298412 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:56 crc kubenswrapper[4895]: I1202 07:24:56.298440 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:56Z","lastTransitionTime":"2025-12-02T07:24:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:56 crc kubenswrapper[4895]: I1202 07:24:56.400287 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:56 crc kubenswrapper[4895]: I1202 07:24:56.400342 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:56 crc kubenswrapper[4895]: I1202 07:24:56.400356 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:56 crc kubenswrapper[4895]: I1202 07:24:56.400375 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:56 crc kubenswrapper[4895]: I1202 07:24:56.400392 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:56Z","lastTransitionTime":"2025-12-02T07:24:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:56 crc kubenswrapper[4895]: I1202 07:24:56.509830 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:56 crc kubenswrapper[4895]: I1202 07:24:56.509891 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:56 crc kubenswrapper[4895]: I1202 07:24:56.509909 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:56 crc kubenswrapper[4895]: I1202 07:24:56.509935 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:56 crc kubenswrapper[4895]: I1202 07:24:56.509954 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:56Z","lastTransitionTime":"2025-12-02T07:24:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:56 crc kubenswrapper[4895]: I1202 07:24:56.613637 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:56 crc kubenswrapper[4895]: I1202 07:24:56.613706 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:56 crc kubenswrapper[4895]: I1202 07:24:56.613729 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:56 crc kubenswrapper[4895]: I1202 07:24:56.613803 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:56 crc kubenswrapper[4895]: I1202 07:24:56.613829 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:56Z","lastTransitionTime":"2025-12-02T07:24:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:56 crc kubenswrapper[4895]: I1202 07:24:56.716895 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:56 crc kubenswrapper[4895]: I1202 07:24:56.716969 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:56 crc kubenswrapper[4895]: I1202 07:24:56.716987 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:56 crc kubenswrapper[4895]: I1202 07:24:56.717017 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:56 crc kubenswrapper[4895]: I1202 07:24:56.717036 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:56Z","lastTransitionTime":"2025-12-02T07:24:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:56 crc kubenswrapper[4895]: I1202 07:24:56.821225 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:56 crc kubenswrapper[4895]: I1202 07:24:56.821294 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:56 crc kubenswrapper[4895]: I1202 07:24:56.821311 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:56 crc kubenswrapper[4895]: I1202 07:24:56.821341 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:56 crc kubenswrapper[4895]: I1202 07:24:56.821362 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:56Z","lastTransitionTime":"2025-12-02T07:24:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:56 crc kubenswrapper[4895]: I1202 07:24:56.924450 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:56 crc kubenswrapper[4895]: I1202 07:24:56.924502 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:56 crc kubenswrapper[4895]: I1202 07:24:56.924518 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:56 crc kubenswrapper[4895]: I1202 07:24:56.924542 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:56 crc kubenswrapper[4895]: I1202 07:24:56.924556 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:56Z","lastTransitionTime":"2025-12-02T07:24:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:57 crc kubenswrapper[4895]: I1202 07:24:57.027558 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:57 crc kubenswrapper[4895]: I1202 07:24:57.027619 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:57 crc kubenswrapper[4895]: I1202 07:24:57.027632 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:57 crc kubenswrapper[4895]: I1202 07:24:57.027658 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:57 crc kubenswrapper[4895]: I1202 07:24:57.027675 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:57Z","lastTransitionTime":"2025-12-02T07:24:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:57 crc kubenswrapper[4895]: I1202 07:24:57.130823 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:57 crc kubenswrapper[4895]: I1202 07:24:57.130860 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:57 crc kubenswrapper[4895]: I1202 07:24:57.130871 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:57 crc kubenswrapper[4895]: I1202 07:24:57.130888 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:57 crc kubenswrapper[4895]: I1202 07:24:57.130898 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:57Z","lastTransitionTime":"2025-12-02T07:24:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:24:57 crc kubenswrapper[4895]: I1202 07:24:57.140201 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:24:57 crc kubenswrapper[4895]: I1202 07:24:57.140289 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:24:57 crc kubenswrapper[4895]: E1202 07:24:57.140586 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 07:24:57 crc kubenswrapper[4895]: E1202 07:24:57.140698 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 07:24:57 crc kubenswrapper[4895]: I1202 07:24:57.233550 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:57 crc kubenswrapper[4895]: I1202 07:24:57.233639 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:57 crc kubenswrapper[4895]: I1202 07:24:57.233668 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:57 crc kubenswrapper[4895]: I1202 07:24:57.233701 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:57 crc kubenswrapper[4895]: I1202 07:24:57.233724 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:57Z","lastTransitionTime":"2025-12-02T07:24:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:57 crc kubenswrapper[4895]: I1202 07:24:57.336810 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:57 crc kubenswrapper[4895]: I1202 07:24:57.336882 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:57 crc kubenswrapper[4895]: I1202 07:24:57.336902 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:57 crc kubenswrapper[4895]: I1202 07:24:57.336930 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:57 crc kubenswrapper[4895]: I1202 07:24:57.336949 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:57Z","lastTransitionTime":"2025-12-02T07:24:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:57 crc kubenswrapper[4895]: I1202 07:24:57.441209 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:57 crc kubenswrapper[4895]: I1202 07:24:57.441280 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:57 crc kubenswrapper[4895]: I1202 07:24:57.441290 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:57 crc kubenswrapper[4895]: I1202 07:24:57.441307 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:57 crc kubenswrapper[4895]: I1202 07:24:57.441318 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:57Z","lastTransitionTime":"2025-12-02T07:24:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:57 crc kubenswrapper[4895]: I1202 07:24:57.545162 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:57 crc kubenswrapper[4895]: I1202 07:24:57.545255 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:57 crc kubenswrapper[4895]: I1202 07:24:57.545279 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:57 crc kubenswrapper[4895]: I1202 07:24:57.545313 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:57 crc kubenswrapper[4895]: I1202 07:24:57.545341 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:57Z","lastTransitionTime":"2025-12-02T07:24:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:57 crc kubenswrapper[4895]: I1202 07:24:57.648759 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:57 crc kubenswrapper[4895]: I1202 07:24:57.648896 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:57 crc kubenswrapper[4895]: I1202 07:24:57.648918 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:57 crc kubenswrapper[4895]: I1202 07:24:57.648944 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:57 crc kubenswrapper[4895]: I1202 07:24:57.648960 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:57Z","lastTransitionTime":"2025-12-02T07:24:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:57 crc kubenswrapper[4895]: I1202 07:24:57.753844 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:57 crc kubenswrapper[4895]: I1202 07:24:57.753922 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:57 crc kubenswrapper[4895]: I1202 07:24:57.753940 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:57 crc kubenswrapper[4895]: I1202 07:24:57.753973 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:57 crc kubenswrapper[4895]: I1202 07:24:57.753993 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:57Z","lastTransitionTime":"2025-12-02T07:24:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:57 crc kubenswrapper[4895]: I1202 07:24:57.857207 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:57 crc kubenswrapper[4895]: I1202 07:24:57.857259 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:57 crc kubenswrapper[4895]: I1202 07:24:57.857271 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:57 crc kubenswrapper[4895]: I1202 07:24:57.857292 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:57 crc kubenswrapper[4895]: I1202 07:24:57.857307 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:57Z","lastTransitionTime":"2025-12-02T07:24:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:57 crc kubenswrapper[4895]: I1202 07:24:57.960104 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:57 crc kubenswrapper[4895]: I1202 07:24:57.960163 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:57 crc kubenswrapper[4895]: I1202 07:24:57.960172 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:57 crc kubenswrapper[4895]: I1202 07:24:57.960187 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:57 crc kubenswrapper[4895]: I1202 07:24:57.960196 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:57Z","lastTransitionTime":"2025-12-02T07:24:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:58 crc kubenswrapper[4895]: I1202 07:24:58.064411 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:58 crc kubenswrapper[4895]: I1202 07:24:58.064495 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:58 crc kubenswrapper[4895]: I1202 07:24:58.064520 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:58 crc kubenswrapper[4895]: I1202 07:24:58.064555 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:58 crc kubenswrapper[4895]: I1202 07:24:58.064581 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:58Z","lastTransitionTime":"2025-12-02T07:24:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:24:58 crc kubenswrapper[4895]: I1202 07:24:58.140718 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:24:58 crc kubenswrapper[4895]: I1202 07:24:58.140719 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5f88v" Dec 02 07:24:58 crc kubenswrapper[4895]: E1202 07:24:58.141003 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 07:24:58 crc kubenswrapper[4895]: E1202 07:24:58.141300 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5f88v" podUID="5af25091-1401-45d4-ae53-d2b469c879da" Dec 02 07:24:58 crc kubenswrapper[4895]: I1202 07:24:58.167660 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:58 crc kubenswrapper[4895]: I1202 07:24:58.167776 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:58 crc kubenswrapper[4895]: I1202 07:24:58.167798 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:58 crc kubenswrapper[4895]: I1202 07:24:58.167831 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:58 crc kubenswrapper[4895]: I1202 07:24:58.167854 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:58Z","lastTransitionTime":"2025-12-02T07:24:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:58 crc kubenswrapper[4895]: I1202 07:24:58.272297 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:58 crc kubenswrapper[4895]: I1202 07:24:58.272385 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:58 crc kubenswrapper[4895]: I1202 07:24:58.272407 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:58 crc kubenswrapper[4895]: I1202 07:24:58.272438 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:58 crc kubenswrapper[4895]: I1202 07:24:58.272467 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:58Z","lastTransitionTime":"2025-12-02T07:24:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:58 crc kubenswrapper[4895]: I1202 07:24:58.375425 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:58 crc kubenswrapper[4895]: I1202 07:24:58.375517 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:58 crc kubenswrapper[4895]: I1202 07:24:58.375544 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:58 crc kubenswrapper[4895]: I1202 07:24:58.375583 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:58 crc kubenswrapper[4895]: I1202 07:24:58.375612 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:58Z","lastTransitionTime":"2025-12-02T07:24:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:58 crc kubenswrapper[4895]: I1202 07:24:58.484781 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:58 crc kubenswrapper[4895]: I1202 07:24:58.484856 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:58 crc kubenswrapper[4895]: I1202 07:24:58.484883 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:58 crc kubenswrapper[4895]: I1202 07:24:58.484918 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:58 crc kubenswrapper[4895]: I1202 07:24:58.484946 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:58Z","lastTransitionTime":"2025-12-02T07:24:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:58 crc kubenswrapper[4895]: I1202 07:24:58.589152 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:58 crc kubenswrapper[4895]: I1202 07:24:58.589213 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:58 crc kubenswrapper[4895]: I1202 07:24:58.589231 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:58 crc kubenswrapper[4895]: I1202 07:24:58.589260 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:58 crc kubenswrapper[4895]: I1202 07:24:58.589279 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:58Z","lastTransitionTime":"2025-12-02T07:24:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:58 crc kubenswrapper[4895]: I1202 07:24:58.692675 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:58 crc kubenswrapper[4895]: I1202 07:24:58.692784 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:58 crc kubenswrapper[4895]: I1202 07:24:58.692803 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:58 crc kubenswrapper[4895]: I1202 07:24:58.692838 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:58 crc kubenswrapper[4895]: I1202 07:24:58.692861 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:58Z","lastTransitionTime":"2025-12-02T07:24:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:58 crc kubenswrapper[4895]: I1202 07:24:58.796295 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:58 crc kubenswrapper[4895]: I1202 07:24:58.796369 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:58 crc kubenswrapper[4895]: I1202 07:24:58.796384 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:58 crc kubenswrapper[4895]: I1202 07:24:58.796406 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:58 crc kubenswrapper[4895]: I1202 07:24:58.796428 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:58Z","lastTransitionTime":"2025-12-02T07:24:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:58 crc kubenswrapper[4895]: I1202 07:24:58.833526 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:58 crc kubenswrapper[4895]: I1202 07:24:58.833621 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:58 crc kubenswrapper[4895]: I1202 07:24:58.833652 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:58 crc kubenswrapper[4895]: I1202 07:24:58.833688 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:58 crc kubenswrapper[4895]: I1202 07:24:58.833714 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:58Z","lastTransitionTime":"2025-12-02T07:24:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:58 crc kubenswrapper[4895]: E1202 07:24:58.859590 4895 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:24:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:24:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:24:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:24:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:24:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:24:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:24:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:24:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"42683c5b-b2bf-439b-8ee4-25c8d72cfed1\\\",\\\"systemUUID\\\":\\\"92c636b1-dcb0-457f-b098-73baeaac297e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:58Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:58 crc kubenswrapper[4895]: I1202 07:24:58.865204 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:58 crc kubenswrapper[4895]: I1202 07:24:58.865273 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:58 crc kubenswrapper[4895]: I1202 07:24:58.865352 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:58 crc kubenswrapper[4895]: I1202 07:24:58.865396 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:58 crc kubenswrapper[4895]: I1202 07:24:58.865424 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:58Z","lastTransitionTime":"2025-12-02T07:24:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:58 crc kubenswrapper[4895]: E1202 07:24:58.890267 4895 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:24:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:24:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:24:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:24:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:24:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:24:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:24:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:24:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"42683c5b-b2bf-439b-8ee4-25c8d72cfed1\\\",\\\"systemUUID\\\":\\\"92c636b1-dcb0-457f-b098-73baeaac297e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:58Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:58 crc kubenswrapper[4895]: I1202 07:24:58.897630 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:58 crc kubenswrapper[4895]: I1202 07:24:58.897769 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:58 crc kubenswrapper[4895]: I1202 07:24:58.897847 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:58 crc kubenswrapper[4895]: I1202 07:24:58.897929 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:58 crc kubenswrapper[4895]: I1202 07:24:58.897998 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:58Z","lastTransitionTime":"2025-12-02T07:24:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:58 crc kubenswrapper[4895]: E1202 07:24:58.919537 4895 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:24:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:24:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:24:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:24:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:24:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:24:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:24:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:24:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"42683c5b-b2bf-439b-8ee4-25c8d72cfed1\\\",\\\"systemUUID\\\":\\\"92c636b1-dcb0-457f-b098-73baeaac297e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:58Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:58 crc kubenswrapper[4895]: I1202 07:24:58.925324 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:58 crc kubenswrapper[4895]: I1202 07:24:58.925501 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:58 crc kubenswrapper[4895]: I1202 07:24:58.925641 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:58 crc kubenswrapper[4895]: I1202 07:24:58.925829 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:58 crc kubenswrapper[4895]: I1202 07:24:58.925984 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:58Z","lastTransitionTime":"2025-12-02T07:24:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:58 crc kubenswrapper[4895]: E1202 07:24:58.946420 4895 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:24:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:24:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:24:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:24:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:24:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:24:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:24:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:24:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"42683c5b-b2bf-439b-8ee4-25c8d72cfed1\\\",\\\"systemUUID\\\":\\\"92c636b1-dcb0-457f-b098-73baeaac297e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:58Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:58 crc kubenswrapper[4895]: I1202 07:24:58.951735 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:58 crc kubenswrapper[4895]: I1202 07:24:58.951969 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:58 crc kubenswrapper[4895]: I1202 07:24:58.952114 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:58 crc kubenswrapper[4895]: I1202 07:24:58.952253 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:58 crc kubenswrapper[4895]: I1202 07:24:58.952384 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:58Z","lastTransitionTime":"2025-12-02T07:24:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:58 crc kubenswrapper[4895]: E1202 07:24:58.967171 4895 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:24:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:24:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:24:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:24:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:24:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:24:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:24:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:24:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"42683c5b-b2bf-439b-8ee4-25c8d72cfed1\\\",\\\"systemUUID\\\":\\\"92c636b1-dcb0-457f-b098-73baeaac297e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:58Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:58 crc kubenswrapper[4895]: E1202 07:24:58.967672 4895 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 02 07:24:58 crc kubenswrapper[4895]: I1202 07:24:58.970585 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:58 crc kubenswrapper[4895]: I1202 07:24:58.970810 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:58 crc kubenswrapper[4895]: I1202 07:24:58.970966 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:58 crc kubenswrapper[4895]: I1202 07:24:58.971120 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:58 crc kubenswrapper[4895]: I1202 07:24:58.971261 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:58Z","lastTransitionTime":"2025-12-02T07:24:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:59 crc kubenswrapper[4895]: I1202 07:24:59.074826 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:59 crc kubenswrapper[4895]: I1202 07:24:59.074900 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:59 crc kubenswrapper[4895]: I1202 07:24:59.074919 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:59 crc kubenswrapper[4895]: I1202 07:24:59.074950 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:59 crc kubenswrapper[4895]: I1202 07:24:59.074971 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:59Z","lastTransitionTime":"2025-12-02T07:24:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:24:59 crc kubenswrapper[4895]: I1202 07:24:59.140884 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:24:59 crc kubenswrapper[4895]: I1202 07:24:59.141079 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:24:59 crc kubenswrapper[4895]: E1202 07:24:59.141387 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 07:24:59 crc kubenswrapper[4895]: E1202 07:24:59.141534 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 07:24:59 crc kubenswrapper[4895]: I1202 07:24:59.158537 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4604b0d5-cf8f-404d-af67-65da4b1dc003\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b3c366f0f27ff793a37b2944887ab1c1bc155c14d5c8d6c267442b0a8666ef8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6c3baa5b3772a8fc64774e28c3f389f114723e7efa62b934140a4fed8b6a40d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49d7c12ea2cd199dcfa46a6f1a1a856f1285a709def9d82b6805ba256a79fd5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\
"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2543d364f902a6c34a82dac25f389cd36afc2f717d772d0a487a640d44adf30f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:59Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:59 crc kubenswrapper[4895]: I1202 07:24:59.176117 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://444ad02c4afa919779de59fd8c81bdce5c3426dd6451b90505921a8d5fa0c96b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-02T07:24:59Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:59 crc kubenswrapper[4895]: I1202 07:24:59.177877 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:59 crc kubenswrapper[4895]: I1202 07:24:59.178102 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:59 crc kubenswrapper[4895]: I1202 07:24:59.178264 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:59 crc kubenswrapper[4895]: I1202 07:24:59.178418 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:59 crc kubenswrapper[4895]: I1202 07:24:59.178662 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:59Z","lastTransitionTime":"2025-12-02T07:24:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:59 crc kubenswrapper[4895]: I1202 07:24:59.195914 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:59Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:59 crc kubenswrapper[4895]: I1202 07:24:59.216588 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n7xcr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3514f381-d0d1-4e00-931e-c5ca75202a1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2670ef1ea22a6c1c5d9ec4ee4b0345c575b20224fddf8655b95eaa431573d071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3deeb37702c8243a956f2be1bb61e2693bb9be5453d775bc3f3e85618d483015\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3deeb37702c8243a956f2be1bb61e2693bb9be5453d775bc3f3e85618d483015\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2c60eddf285bbbf1a5957691f56d895e5bb7461fb18ab7c495879357cfae565\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2c60eddf285bbbf1a5957691f56d895e5bb7461fb18ab7c495879357cfae565\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e8b39d38597192393d90cc582b8ad109a1ed22c4c14f539e1a5445e6d817872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e8b39d38597192393d90cc582b8ad109a1ed22c4c14f539e1a5445e6d817872\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb5ab
c8da49059e23c4013f3856d3f4c9078cd8871a79f8dae5ce4d8cb319ef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb5abc8da49059e23c4013f3856d3f4c9078cd8871a79f8dae5ce4d8cb319ef9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbdd3e8943b356a485d900737682f0903dc04aefa33498ee5cdf3c82b6570852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbdd3e8943b356a485d900737682f0903dc04aefa33498ee5cdf3c82b6570852\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:43Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3641d1f7fd2ea062c216ce260c35d903273d3039fe790815ebb697f8c11d15cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3641d1f7fd2ea062c216ce260c35d903273d3039fe790815ebb697f8c11d15cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n7xcr\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:59Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:59 crc kubenswrapper[4895]: I1202 07:24:59.237704 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hlxqt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"30911fe5-208f-44e8-a380-2a0093f24863\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:24:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a569570ff32547c25fcdced649773ea0ab6d3aeccdaa5c26aa5a86d2c745535f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87ebbf4f529d4e7b64c392912d06e57dc80a2237a9a4e31ab5b56ac556aa10b1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\
":\\\"2025-12-02T07:24:24Z\\\",\\\"message\\\":\\\"2025-12-02T07:23:39+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_8d90423c-940a-42e9-9258-17f96c4385a6\\\\n2025-12-02T07:23:39+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8d90423c-940a-42e9-9258-17f96c4385a6 to /host/opt/cni/bin/\\\\n2025-12-02T07:23:39Z [verbose] multus-daemon started\\\\n2025-12-02T07:23:39Z [verbose] Readiness Indicator file check\\\\n2025-12-02T07:24:24Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:24:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\
\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6l8j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hlxqt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:59Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:59 crc kubenswrapper[4895]: I1202 07:24:59.254443 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5f88v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5af25091-1401-45d4-ae53-d2b469c879da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wfrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wfrv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5f88v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:59Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:59 crc 
kubenswrapper[4895]: I1202 07:24:59.267305 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2df36ca5-8a5e-4865-8728-b8567297546a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f076cfc3bd5e782332e827d56325b73cf2a9f35248f397338b5130793481af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de2bad51e8d51f1243a2fe7799554c7f089425e3623a6bb62f7393610d70fc4e\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b737e88ad2a8caf70b89f59db461d355846d5612a29a249107b49fbec176204\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9d30ec3b0f73c029956f058f1d8f06cfffb5533dcd0c221ab6842d277f274a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9d30ec3b0f73c029956f058f1d8f06cfffb5533dcd0c221ab6842d277f274a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:09Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:09Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:59Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:59 crc kubenswrapper[4895]: I1202 07:24:59.282948 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:59 crc kubenswrapper[4895]: I1202 07:24:59.283001 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:59 crc kubenswrapper[4895]: I1202 07:24:59.283015 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:59 crc kubenswrapper[4895]: I1202 07:24:59.283036 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:59 crc kubenswrapper[4895]: I1202 07:24:59.283054 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:59Z","lastTransitionTime":"2025-12-02T07:24:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:24:59 crc kubenswrapper[4895]: I1202 07:24:59.284105 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:59Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:59 crc kubenswrapper[4895]: I1202 07:24:59.300829 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lhpbd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"81128292-8c02-45bd-9b25-e9457e989975\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cc63751b6428c88deed36b914b2a95a94a50b544c562126c2a9567c177b2570\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2z5n2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lhpbd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:59Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:59 crc kubenswrapper[4895]: I1202 07:24:59.314723 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wxp5p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a0b438f-70fd-4c3f-a170-2b3fe9797955\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7329cf1034b2d51ebb7f978448558a8b6fa39d58823289c470fddcfed262e395\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj6l9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cbc34b4d78cc00ac79b9944d92c6ac61a9d5dd704f36a473950baf53dfcbf47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj6l9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wxp5p\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:59Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:59 crc kubenswrapper[4895]: I1202 07:24:59.341251 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de3d6380-3929-4a07-8c35-cc726a255414\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4aecebf2f414351a8a75de9a970c8cc3c71debd541c3a14b2c0bc47c1f1bd68d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/e
tc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://443e5217ebe699e87855da6368830efb6c0cfd2269f7e478b0eba4228c7332d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d46786f8b9f8e55028793082aeeac5223ae936eadec4f863779607746386dafb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cr
i-o://033075acf80c00c23e81564e5384b1179d58a79ed6786abc63b792cb07d7a7e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3094ac339ea29ea82679832060bdab28766b6776f271b79d7d187ccf42144e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f5656c9cbe214094b8aa976e6a6022ce87fb0640b5aa640a98d4271c070f3ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7
c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f5656c9cbe214094b8aa976e6a6022ce87fb0640b5aa640a98d4271c070f3ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60184fc891c6407180abeded8004495660c5b52559c3132af2eef1c97e8f08ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://60184fc891c6407180abeded8004495660c5b52559c3132af2eef1c97e8f08ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3787d284b2bfc6855be2fdcf47aa69cc845b2bf227c053fa482b169a9ff5d5cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started
\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3787d284b2bfc6855be2fdcf47aa69cc845b2bf227c053fa482b169a9ff5d5cb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:59Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:59 crc kubenswrapper[4895]: I1202 07:24:59.363970 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2db13abc9ba56d955814e75ceea54b92610a00daeefa29aa1aad584e3453728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:59Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:59 crc kubenswrapper[4895]: I1202 07:24:59.384108 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bb4bea68942dbface0009fbac76f7b1253ab282e8decb9081225de7d6537ec0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://3115f2c50d05ee406f583937e39b1b55f364ce7cbeb0788b75941a22e23f57f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:59Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:59 crc kubenswrapper[4895]: I1202 07:24:59.386438 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:59 crc kubenswrapper[4895]: I1202 07:24:59.386501 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:59 crc kubenswrapper[4895]: I1202 07:24:59.386517 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:59 crc kubenswrapper[4895]: I1202 07:24:59.386536 4895 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:59 crc kubenswrapper[4895]: I1202 07:24:59.386550 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:59Z","lastTransitionTime":"2025-12-02T07:24:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:24:59 crc kubenswrapper[4895]: I1202 07:24:59.413630 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"afc3334a-0153-4dcc-9a56-92f6cae51c08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91b0c6e50085a7be0924e5a54ef006110a7f13556fe47f7f55124fd6b2f0bc36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5463b077f0bc8b5d38e27854ece6a71f74f7769f4552c32eefe16854993f290f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80dca705ddb94ba799eb72bfc23b0c3878cceca48d6d10f1c7f8eb0e4beed40e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f54ae324450e73bcbf2c408794171a92bbb66c24d05f29584ac2c247090c1273\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f330d728eecd672ecb554744a51ebfb2c104441d3a85fe0c8a6fc3446c37e4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca732cac495c410d89da4ce1476acd20905716cbb2baeedac9c52b654ff58573\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52bad85cc5d71d65071ede7a4c939c8f773468925d9bb31b09a37641d870c228\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52bad85cc5d71d65071ede7a4c939c8f773468925d9bb31b09a37641d870c228\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T07:24:35Z\\\",\\\"message\\\":\\\"oing to retry *v1.Pod resource setup for 18 objects: [openshift-multus/multus-hlxqt openshift-multus/network-metrics-daemon-5f88v openshift-network-console/networking-console-plugin-85b44fc459-gdk6g openshift-network-operator/iptables-alerter-4ln5h openshift-ovn-kubernetes/ovnkube-node-w54m4 
openshift-machine-config-operator/kube-rbac-proxy-crio-crc openshift-kube-scheduler/openshift-kube-scheduler-crc openshift-machine-config-operator/machine-config-daemon-wfcg7 openshift-multus/multus-additional-cni-plugins-n7xcr openshift-network-diagnostics/network-check-target-xd92c openshift-kube-controller-manager/kube-controller-manager-crc openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wxp5p openshift-image-registry/node-ca-lhpbd openshift-kube-apiserver/kube-apiserver-crc openshift-network-diagnostics/network-check-source-55646444c4-trplf openshift-network-node-identity/network-node-identity-vrzqb openshift-network-operator/network-operator-58b4c7f79c-55gtf openshift-dns/node-resolver-74fkh]\\\\nF1202 07:24:35.228140 6919 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not ad\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:24:34Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-w54m4_openshift-ovn-kubernetes(afc3334a-0153-4dcc-9a56-92f6cae51c08)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6447684fe6189e38c9caad6dfe1384b7af01d50e2ef5f889c7b2874ba092306\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://328fa773c92ef0a965065412d4141c0b41ffe38d21ed169386a9d30d05f37f28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://328fa773c92ef0a965
065412d4141c0b41ffe38d21ed169386a9d30d05f37f28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ldkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w54m4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:59Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:59 crc kubenswrapper[4895]: I1202 07:24:59.432925 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a06fda3-5f3f-4169-ae3c-f12424af7b8a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8044b0d3dc7f5c301eeeaee66d8461268ad266d0561461cd0b30bdc650cbdd99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73131074a12f26173aefdc0bde82a7446290cf2bec30f320cc37f8c4706ac50e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73131074a12f26173aefdc0bde82a7446290cf2bec30f320cc37f8c4706ac50e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:59Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:59 crc kubenswrapper[4895]: I1202 07:24:59.447579 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b7fa1ad-7e38-4645-ab41-c7d395f5c10e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40e020a2b4caf39d1661cee9c6b0055eca86e0ae9b81d2da7485ca9ec92dd0aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33e300332628d3baf9041b52dc7b8cd9dab1e9c5da76572e86b787433fd5fb92\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4455fec2a817e7266e152d3f4afbede483504d6066f1a0f7375bce282ae10f8b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b674e20b2199fd0083aed9cf001d27a72d31ca6c62cbe376c9510cc94f37d312\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9998dffc20c5b0579e6214d10541b9702e9c4a6563c930a41e49b6853b832ba\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T07:23:32Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1202 07:23:21.537254 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 07:23:21.538901 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1012964219/tls.crt::/tmp/serving-cert-1012964219/tls.key\\\\\\\"\\\\nI1202 07:23:32.848375 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 07:23:32.853597 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 07:23:32.853643 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 07:23:32.853680 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 07:23:32.853691 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 07:23:32.869490 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 07:23:32.869536 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:23:32.869542 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:23:32.869555 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 07:23:32.869559 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 07:23:32.869564 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 07:23:32.869568 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 07:23:32.869624 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1202 07:23:32.874299 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d318f71ad32dd4a410093b85741038e468d540edd30f112234a9d595bc6fd85f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:11Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0faa0995131a1f9cbf2aa0cf7651d510fad1bdaf7e32de51f8a14b70f161e783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0faa0995131a1f9cbf2aa0cf7651d510f
ad1bdaf7e32de51f8a14b70f161e783\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:23:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:59Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:59 crc kubenswrapper[4895]: I1202 07:24:59.463921 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:59Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:59 crc kubenswrapper[4895]: I1202 07:24:59.475283 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-74fkh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b212bd8-f1a7-4982-b2e6-499c13a34b0c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4d58e6f37ee8f90af6aad88138c39e9a0753c16cceb763c2bd2ec9c30087d92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhpwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-74fkh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:59Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:59 crc kubenswrapper[4895]: I1202 07:24:59.489072 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:59 crc kubenswrapper[4895]: I1202 07:24:59.489119 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:59 crc kubenswrapper[4895]: I1202 07:24:59.489132 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:59 crc kubenswrapper[4895]: I1202 07:24:59.489151 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:59 crc kubenswrapper[4895]: I1202 07:24:59.489166 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:59Z","lastTransitionTime":"2025-12-02T07:24:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:59 crc kubenswrapper[4895]: I1202 07:24:59.489988 4895 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0468d2d1-a975-45a6-af9f-47adc432fab0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebf8583b59d7955530da7a83a29f9994ba139c076cf7b76dc94e88d14f39e1a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vfwjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1474e8f5989607f3b95fefe811e3d787320f04c223edb6ffe47d2b37863ea874\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vfwjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:23:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wfcg7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:24:59Z is after 2025-08-24T17:21:41Z" Dec 02 07:24:59 crc kubenswrapper[4895]: I1202 07:24:59.592342 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:59 crc kubenswrapper[4895]: I1202 07:24:59.592416 4895 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:59 crc kubenswrapper[4895]: I1202 07:24:59.592434 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:59 crc kubenswrapper[4895]: I1202 07:24:59.592458 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:59 crc kubenswrapper[4895]: I1202 07:24:59.592477 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:59Z","lastTransitionTime":"2025-12-02T07:24:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:24:59 crc kubenswrapper[4895]: I1202 07:24:59.696372 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:59 crc kubenswrapper[4895]: I1202 07:24:59.696431 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:59 crc kubenswrapper[4895]: I1202 07:24:59.696441 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:59 crc kubenswrapper[4895]: I1202 07:24:59.696459 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:59 crc kubenswrapper[4895]: I1202 07:24:59.696471 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:59Z","lastTransitionTime":"2025-12-02T07:24:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:24:59 crc kubenswrapper[4895]: I1202 07:24:59.799320 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:59 crc kubenswrapper[4895]: I1202 07:24:59.799396 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:59 crc kubenswrapper[4895]: I1202 07:24:59.799413 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:59 crc kubenswrapper[4895]: I1202 07:24:59.799440 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:59 crc kubenswrapper[4895]: I1202 07:24:59.799458 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:59Z","lastTransitionTime":"2025-12-02T07:24:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:24:59 crc kubenswrapper[4895]: I1202 07:24:59.902955 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:24:59 crc kubenswrapper[4895]: I1202 07:24:59.903026 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:24:59 crc kubenswrapper[4895]: I1202 07:24:59.903043 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:24:59 crc kubenswrapper[4895]: I1202 07:24:59.903066 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:24:59 crc kubenswrapper[4895]: I1202 07:24:59.903085 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:24:59Z","lastTransitionTime":"2025-12-02T07:24:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:25:00 crc kubenswrapper[4895]: I1202 07:25:00.005951 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:25:00 crc kubenswrapper[4895]: I1202 07:25:00.006005 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:25:00 crc kubenswrapper[4895]: I1202 07:25:00.006030 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:25:00 crc kubenswrapper[4895]: I1202 07:25:00.006064 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:25:00 crc kubenswrapper[4895]: I1202 07:25:00.006077 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:25:00Z","lastTransitionTime":"2025-12-02T07:25:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:25:00 crc kubenswrapper[4895]: I1202 07:25:00.109179 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:25:00 crc kubenswrapper[4895]: I1202 07:25:00.109247 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:25:00 crc kubenswrapper[4895]: I1202 07:25:00.109271 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:25:00 crc kubenswrapper[4895]: I1202 07:25:00.109317 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:25:00 crc kubenswrapper[4895]: I1202 07:25:00.109343 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:25:00Z","lastTransitionTime":"2025-12-02T07:25:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:25:00 crc kubenswrapper[4895]: I1202 07:25:00.140951 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5f88v" Dec 02 07:25:00 crc kubenswrapper[4895]: I1202 07:25:00.141261 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:25:00 crc kubenswrapper[4895]: E1202 07:25:00.141443 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5f88v" podUID="5af25091-1401-45d4-ae53-d2b469c879da" Dec 02 07:25:00 crc kubenswrapper[4895]: E1202 07:25:00.141618 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 07:25:00 crc kubenswrapper[4895]: I1202 07:25:00.213569 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:25:00 crc kubenswrapper[4895]: I1202 07:25:00.213633 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:25:00 crc kubenswrapper[4895]: I1202 07:25:00.213646 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:25:00 crc kubenswrapper[4895]: I1202 07:25:00.213672 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:25:00 crc kubenswrapper[4895]: I1202 07:25:00.213687 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:25:00Z","lastTransitionTime":"2025-12-02T07:25:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:25:00 crc kubenswrapper[4895]: I1202 07:25:00.317187 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:25:00 crc kubenswrapper[4895]: I1202 07:25:00.317257 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:25:00 crc kubenswrapper[4895]: I1202 07:25:00.317275 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:25:00 crc kubenswrapper[4895]: I1202 07:25:00.317304 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:25:00 crc kubenswrapper[4895]: I1202 07:25:00.317326 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:25:00Z","lastTransitionTime":"2025-12-02T07:25:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:25:00 crc kubenswrapper[4895]: I1202 07:25:00.421349 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:25:00 crc kubenswrapper[4895]: I1202 07:25:00.421434 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:25:00 crc kubenswrapper[4895]: I1202 07:25:00.421458 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:25:00 crc kubenswrapper[4895]: I1202 07:25:00.421489 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:25:00 crc kubenswrapper[4895]: I1202 07:25:00.421510 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:25:00Z","lastTransitionTime":"2025-12-02T07:25:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:25:00 crc kubenswrapper[4895]: I1202 07:25:00.526409 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:25:00 crc kubenswrapper[4895]: I1202 07:25:00.526466 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:25:00 crc kubenswrapper[4895]: I1202 07:25:00.526484 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:25:00 crc kubenswrapper[4895]: I1202 07:25:00.526510 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:25:00 crc kubenswrapper[4895]: I1202 07:25:00.526529 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:25:00Z","lastTransitionTime":"2025-12-02T07:25:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:25:00 crc kubenswrapper[4895]: I1202 07:25:00.630982 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:25:00 crc kubenswrapper[4895]: I1202 07:25:00.631084 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:25:00 crc kubenswrapper[4895]: I1202 07:25:00.631116 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:25:00 crc kubenswrapper[4895]: I1202 07:25:00.631154 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:25:00 crc kubenswrapper[4895]: I1202 07:25:00.631183 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:25:00Z","lastTransitionTime":"2025-12-02T07:25:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:25:00 crc kubenswrapper[4895]: I1202 07:25:00.734931 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:25:00 crc kubenswrapper[4895]: I1202 07:25:00.735020 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:25:00 crc kubenswrapper[4895]: I1202 07:25:00.735042 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:25:00 crc kubenswrapper[4895]: I1202 07:25:00.735073 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:25:00 crc kubenswrapper[4895]: I1202 07:25:00.735092 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:25:00Z","lastTransitionTime":"2025-12-02T07:25:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:25:00 crc kubenswrapper[4895]: I1202 07:25:00.839218 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:25:00 crc kubenswrapper[4895]: I1202 07:25:00.839301 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:25:00 crc kubenswrapper[4895]: I1202 07:25:00.839329 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:25:00 crc kubenswrapper[4895]: I1202 07:25:00.839364 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:25:00 crc kubenswrapper[4895]: I1202 07:25:00.839392 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:25:00Z","lastTransitionTime":"2025-12-02T07:25:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:25:00 crc kubenswrapper[4895]: I1202 07:25:00.943044 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:25:00 crc kubenswrapper[4895]: I1202 07:25:00.943103 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:25:00 crc kubenswrapper[4895]: I1202 07:25:00.943120 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:25:00 crc kubenswrapper[4895]: I1202 07:25:00.943144 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:25:00 crc kubenswrapper[4895]: I1202 07:25:00.943165 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:25:00Z","lastTransitionTime":"2025-12-02T07:25:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:25:01 crc kubenswrapper[4895]: I1202 07:25:01.045904 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:25:01 crc kubenswrapper[4895]: I1202 07:25:01.045980 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:25:01 crc kubenswrapper[4895]: I1202 07:25:01.045998 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:25:01 crc kubenswrapper[4895]: I1202 07:25:01.046026 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:25:01 crc kubenswrapper[4895]: I1202 07:25:01.046044 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:25:01Z","lastTransitionTime":"2025-12-02T07:25:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:25:01 crc kubenswrapper[4895]: I1202 07:25:01.140478 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:25:01 crc kubenswrapper[4895]: I1202 07:25:01.140488 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:25:01 crc kubenswrapper[4895]: E1202 07:25:01.140978 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 07:25:01 crc kubenswrapper[4895]: E1202 07:25:01.141162 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 07:25:01 crc kubenswrapper[4895]: I1202 07:25:01.141908 4895 scope.go:117] "RemoveContainer" containerID="52bad85cc5d71d65071ede7a4c939c8f773468925d9bb31b09a37641d870c228" Dec 02 07:25:01 crc kubenswrapper[4895]: E1202 07:25:01.142092 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-w54m4_openshift-ovn-kubernetes(afc3334a-0153-4dcc-9a56-92f6cae51c08)\"" pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" podUID="afc3334a-0153-4dcc-9a56-92f6cae51c08" Dec 02 07:25:01 crc kubenswrapper[4895]: I1202 07:25:01.149527 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:25:01 crc kubenswrapper[4895]: I1202 07:25:01.149560 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:25:01 crc kubenswrapper[4895]: I1202 07:25:01.149573 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:25:01 crc kubenswrapper[4895]: I1202 07:25:01.149591 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:25:01 crc kubenswrapper[4895]: I1202 07:25:01.149603 4895 setters.go:603] "Node became 
not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:25:01Z","lastTransitionTime":"2025-12-02T07:25:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:25:01 crc kubenswrapper[4895]: I1202 07:25:01.253660 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:25:01 crc kubenswrapper[4895]: I1202 07:25:01.253718 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:25:01 crc kubenswrapper[4895]: I1202 07:25:01.253755 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:25:01 crc kubenswrapper[4895]: I1202 07:25:01.253780 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:25:01 crc kubenswrapper[4895]: I1202 07:25:01.253798 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:25:01Z","lastTransitionTime":"2025-12-02T07:25:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:25:01 crc kubenswrapper[4895]: I1202 07:25:01.357130 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:25:01 crc kubenswrapper[4895]: I1202 07:25:01.357207 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:25:01 crc kubenswrapper[4895]: I1202 07:25:01.357233 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:25:01 crc kubenswrapper[4895]: I1202 07:25:01.357262 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:25:01 crc kubenswrapper[4895]: I1202 07:25:01.357285 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:25:01Z","lastTransitionTime":"2025-12-02T07:25:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:25:01 crc kubenswrapper[4895]: I1202 07:25:01.460141 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:25:01 crc kubenswrapper[4895]: I1202 07:25:01.460191 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:25:01 crc kubenswrapper[4895]: I1202 07:25:01.460209 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:25:01 crc kubenswrapper[4895]: I1202 07:25:01.460234 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:25:01 crc kubenswrapper[4895]: I1202 07:25:01.460252 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:25:01Z","lastTransitionTime":"2025-12-02T07:25:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:25:01 crc kubenswrapper[4895]: I1202 07:25:01.569309 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:25:01 crc kubenswrapper[4895]: I1202 07:25:01.569375 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:25:01 crc kubenswrapper[4895]: I1202 07:25:01.569386 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:25:01 crc kubenswrapper[4895]: I1202 07:25:01.569405 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:25:01 crc kubenswrapper[4895]: I1202 07:25:01.569417 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:25:01Z","lastTransitionTime":"2025-12-02T07:25:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:25:01 crc kubenswrapper[4895]: I1202 07:25:01.672182 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:25:01 crc kubenswrapper[4895]: I1202 07:25:01.672242 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:25:01 crc kubenswrapper[4895]: I1202 07:25:01.672258 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:25:01 crc kubenswrapper[4895]: I1202 07:25:01.672282 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:25:01 crc kubenswrapper[4895]: I1202 07:25:01.672297 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:25:01Z","lastTransitionTime":"2025-12-02T07:25:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:25:01 crc kubenswrapper[4895]: I1202 07:25:01.776194 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:25:01 crc kubenswrapper[4895]: I1202 07:25:01.776271 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:25:01 crc kubenswrapper[4895]: I1202 07:25:01.776296 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:25:01 crc kubenswrapper[4895]: I1202 07:25:01.776334 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:25:01 crc kubenswrapper[4895]: I1202 07:25:01.776358 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:25:01Z","lastTransitionTime":"2025-12-02T07:25:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:25:01 crc kubenswrapper[4895]: I1202 07:25:01.878795 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:25:01 crc kubenswrapper[4895]: I1202 07:25:01.878865 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:25:01 crc kubenswrapper[4895]: I1202 07:25:01.878884 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:25:01 crc kubenswrapper[4895]: I1202 07:25:01.878914 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:25:01 crc kubenswrapper[4895]: I1202 07:25:01.878934 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:25:01Z","lastTransitionTime":"2025-12-02T07:25:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:25:01 crc kubenswrapper[4895]: I1202 07:25:01.982345 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:25:01 crc kubenswrapper[4895]: I1202 07:25:01.982397 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:25:01 crc kubenswrapper[4895]: I1202 07:25:01.982407 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:25:01 crc kubenswrapper[4895]: I1202 07:25:01.982427 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:25:01 crc kubenswrapper[4895]: I1202 07:25:01.982439 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:25:01Z","lastTransitionTime":"2025-12-02T07:25:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:25:02 crc kubenswrapper[4895]: I1202 07:25:02.086374 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:25:02 crc kubenswrapper[4895]: I1202 07:25:02.086458 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:25:02 crc kubenswrapper[4895]: I1202 07:25:02.086480 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:25:02 crc kubenswrapper[4895]: I1202 07:25:02.086508 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:25:02 crc kubenswrapper[4895]: I1202 07:25:02.086526 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:25:02Z","lastTransitionTime":"2025-12-02T07:25:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:25:02 crc kubenswrapper[4895]: I1202 07:25:02.140288 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5f88v" Dec 02 07:25:02 crc kubenswrapper[4895]: I1202 07:25:02.140325 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:25:02 crc kubenswrapper[4895]: E1202 07:25:02.140477 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5f88v" podUID="5af25091-1401-45d4-ae53-d2b469c879da" Dec 02 07:25:02 crc kubenswrapper[4895]: E1202 07:25:02.140642 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 07:25:02 crc kubenswrapper[4895]: I1202 07:25:02.190040 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:25:02 crc kubenswrapper[4895]: I1202 07:25:02.190095 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:25:02 crc kubenswrapper[4895]: I1202 07:25:02.190131 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:25:02 crc kubenswrapper[4895]: I1202 07:25:02.190167 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:25:02 crc kubenswrapper[4895]: I1202 07:25:02.190224 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:25:02Z","lastTransitionTime":"2025-12-02T07:25:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:25:02 crc kubenswrapper[4895]: I1202 07:25:02.294138 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:25:02 crc kubenswrapper[4895]: I1202 07:25:02.294213 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:25:02 crc kubenswrapper[4895]: I1202 07:25:02.294225 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:25:02 crc kubenswrapper[4895]: I1202 07:25:02.294242 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:25:02 crc kubenswrapper[4895]: I1202 07:25:02.294251 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:25:02Z","lastTransitionTime":"2025-12-02T07:25:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:25:02 crc kubenswrapper[4895]: I1202 07:25:02.397502 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:25:02 crc kubenswrapper[4895]: I1202 07:25:02.397575 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:25:02 crc kubenswrapper[4895]: I1202 07:25:02.397592 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:25:02 crc kubenswrapper[4895]: I1202 07:25:02.397619 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:25:02 crc kubenswrapper[4895]: I1202 07:25:02.397638 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:25:02Z","lastTransitionTime":"2025-12-02T07:25:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:25:02 crc kubenswrapper[4895]: I1202 07:25:02.500593 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:25:02 crc kubenswrapper[4895]: I1202 07:25:02.500721 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:25:02 crc kubenswrapper[4895]: I1202 07:25:02.500802 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:25:02 crc kubenswrapper[4895]: I1202 07:25:02.500840 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:25:02 crc kubenswrapper[4895]: I1202 07:25:02.500861 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:25:02Z","lastTransitionTime":"2025-12-02T07:25:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:25:02 crc kubenswrapper[4895]: I1202 07:25:02.604370 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:25:02 crc kubenswrapper[4895]: I1202 07:25:02.604426 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:25:02 crc kubenswrapper[4895]: I1202 07:25:02.604440 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:25:02 crc kubenswrapper[4895]: I1202 07:25:02.604457 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:25:02 crc kubenswrapper[4895]: I1202 07:25:02.604470 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:25:02Z","lastTransitionTime":"2025-12-02T07:25:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:25:02 crc kubenswrapper[4895]: I1202 07:25:02.708524 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:25:02 crc kubenswrapper[4895]: I1202 07:25:02.708595 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:25:02 crc kubenswrapper[4895]: I1202 07:25:02.708616 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:25:02 crc kubenswrapper[4895]: I1202 07:25:02.708647 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:25:02 crc kubenswrapper[4895]: I1202 07:25:02.708812 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:25:02Z","lastTransitionTime":"2025-12-02T07:25:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:25:02 crc kubenswrapper[4895]: I1202 07:25:02.812807 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:25:02 crc kubenswrapper[4895]: I1202 07:25:02.812941 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:25:02 crc kubenswrapper[4895]: I1202 07:25:02.813068 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:25:02 crc kubenswrapper[4895]: I1202 07:25:02.813106 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:25:02 crc kubenswrapper[4895]: I1202 07:25:02.813126 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:25:02Z","lastTransitionTime":"2025-12-02T07:25:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:25:02 crc kubenswrapper[4895]: I1202 07:25:02.915638 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:25:02 crc kubenswrapper[4895]: I1202 07:25:02.915686 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:25:02 crc kubenswrapper[4895]: I1202 07:25:02.915697 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:25:02 crc kubenswrapper[4895]: I1202 07:25:02.915716 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:25:02 crc kubenswrapper[4895]: I1202 07:25:02.915730 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:25:02Z","lastTransitionTime":"2025-12-02T07:25:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:25:03 crc kubenswrapper[4895]: I1202 07:25:03.018581 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:25:03 crc kubenswrapper[4895]: I1202 07:25:03.018633 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:25:03 crc kubenswrapper[4895]: I1202 07:25:03.018644 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:25:03 crc kubenswrapper[4895]: I1202 07:25:03.018661 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:25:03 crc kubenswrapper[4895]: I1202 07:25:03.018671 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:25:03Z","lastTransitionTime":"2025-12-02T07:25:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:25:03 crc kubenswrapper[4895]: I1202 07:25:03.122672 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:25:03 crc kubenswrapper[4895]: I1202 07:25:03.122775 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:25:03 crc kubenswrapper[4895]: I1202 07:25:03.122788 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:25:03 crc kubenswrapper[4895]: I1202 07:25:03.122814 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:25:03 crc kubenswrapper[4895]: I1202 07:25:03.122828 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:25:03Z","lastTransitionTime":"2025-12-02T07:25:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:25:03 crc kubenswrapper[4895]: I1202 07:25:03.141164 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:25:03 crc kubenswrapper[4895]: I1202 07:25:03.141255 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:25:03 crc kubenswrapper[4895]: E1202 07:25:03.141347 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 07:25:03 crc kubenswrapper[4895]: E1202 07:25:03.141471 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 07:25:03 crc kubenswrapper[4895]: I1202 07:25:03.225963 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:25:03 crc kubenswrapper[4895]: I1202 07:25:03.226035 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:25:03 crc kubenswrapper[4895]: I1202 07:25:03.226057 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:25:03 crc kubenswrapper[4895]: I1202 07:25:03.226088 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:25:03 crc kubenswrapper[4895]: I1202 07:25:03.226154 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:25:03Z","lastTransitionTime":"2025-12-02T07:25:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:25:03 crc kubenswrapper[4895]: I1202 07:25:03.329211 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:25:03 crc kubenswrapper[4895]: I1202 07:25:03.329269 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:25:03 crc kubenswrapper[4895]: I1202 07:25:03.329279 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:25:03 crc kubenswrapper[4895]: I1202 07:25:03.329297 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:25:03 crc kubenswrapper[4895]: I1202 07:25:03.329310 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:25:03Z","lastTransitionTime":"2025-12-02T07:25:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:25:03 crc kubenswrapper[4895]: I1202 07:25:03.431959 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:25:03 crc kubenswrapper[4895]: I1202 07:25:03.432010 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:25:03 crc kubenswrapper[4895]: I1202 07:25:03.432022 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:25:03 crc kubenswrapper[4895]: I1202 07:25:03.432043 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:25:03 crc kubenswrapper[4895]: I1202 07:25:03.432054 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:25:03Z","lastTransitionTime":"2025-12-02T07:25:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:25:03 crc kubenswrapper[4895]: I1202 07:25:03.535546 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:25:03 crc kubenswrapper[4895]: I1202 07:25:03.535589 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:25:03 crc kubenswrapper[4895]: I1202 07:25:03.535600 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:25:03 crc kubenswrapper[4895]: I1202 07:25:03.535620 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:25:03 crc kubenswrapper[4895]: I1202 07:25:03.535631 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:25:03Z","lastTransitionTime":"2025-12-02T07:25:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:25:03 crc kubenswrapper[4895]: I1202 07:25:03.639606 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:25:03 crc kubenswrapper[4895]: I1202 07:25:03.639681 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:25:03 crc kubenswrapper[4895]: I1202 07:25:03.639701 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:25:03 crc kubenswrapper[4895]: I1202 07:25:03.639730 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:25:03 crc kubenswrapper[4895]: I1202 07:25:03.639785 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:25:03Z","lastTransitionTime":"2025-12-02T07:25:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:25:03 crc kubenswrapper[4895]: I1202 07:25:03.743317 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:25:03 crc kubenswrapper[4895]: I1202 07:25:03.743386 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:25:03 crc kubenswrapper[4895]: I1202 07:25:03.743405 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:25:03 crc kubenswrapper[4895]: I1202 07:25:03.743440 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:25:03 crc kubenswrapper[4895]: I1202 07:25:03.743464 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:25:03Z","lastTransitionTime":"2025-12-02T07:25:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:25:03 crc kubenswrapper[4895]: I1202 07:25:03.847075 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:25:03 crc kubenswrapper[4895]: I1202 07:25:03.847161 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:25:03 crc kubenswrapper[4895]: I1202 07:25:03.847190 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:25:03 crc kubenswrapper[4895]: I1202 07:25:03.847224 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:25:03 crc kubenswrapper[4895]: I1202 07:25:03.847243 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:25:03Z","lastTransitionTime":"2025-12-02T07:25:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:25:03 crc kubenswrapper[4895]: I1202 07:25:03.952004 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:25:03 crc kubenswrapper[4895]: I1202 07:25:03.952060 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:25:03 crc kubenswrapper[4895]: I1202 07:25:03.952078 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:25:03 crc kubenswrapper[4895]: I1202 07:25:03.952106 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:25:03 crc kubenswrapper[4895]: I1202 07:25:03.952125 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:25:03Z","lastTransitionTime":"2025-12-02T07:25:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:25:04 crc kubenswrapper[4895]: I1202 07:25:04.054916 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:25:04 crc kubenswrapper[4895]: I1202 07:25:04.054977 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:25:04 crc kubenswrapper[4895]: I1202 07:25:04.054989 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:25:04 crc kubenswrapper[4895]: I1202 07:25:04.055012 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:25:04 crc kubenswrapper[4895]: I1202 07:25:04.055025 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:25:04Z","lastTransitionTime":"2025-12-02T07:25:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:25:04 crc kubenswrapper[4895]: I1202 07:25:04.141053 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:25:04 crc kubenswrapper[4895]: I1202 07:25:04.141135 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5f88v" Dec 02 07:25:04 crc kubenswrapper[4895]: E1202 07:25:04.141338 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 07:25:04 crc kubenswrapper[4895]: E1202 07:25:04.141794 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5f88v" podUID="5af25091-1401-45d4-ae53-d2b469c879da" Dec 02 07:25:04 crc kubenswrapper[4895]: I1202 07:25:04.159223 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:25:04 crc kubenswrapper[4895]: I1202 07:25:04.159264 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:25:04 crc kubenswrapper[4895]: I1202 07:25:04.159272 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:25:04 crc kubenswrapper[4895]: I1202 07:25:04.159289 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:25:04 crc kubenswrapper[4895]: I1202 07:25:04.159300 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:25:04Z","lastTransitionTime":"2025-12-02T07:25:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:25:04 crc kubenswrapper[4895]: I1202 07:25:04.262585 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:25:04 crc kubenswrapper[4895]: I1202 07:25:04.262642 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:25:04 crc kubenswrapper[4895]: I1202 07:25:04.262660 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:25:04 crc kubenswrapper[4895]: I1202 07:25:04.262685 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:25:04 crc kubenswrapper[4895]: I1202 07:25:04.262700 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:25:04Z","lastTransitionTime":"2025-12-02T07:25:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:25:04 crc kubenswrapper[4895]: I1202 07:25:04.365874 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:25:04 crc kubenswrapper[4895]: I1202 07:25:04.365957 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:25:04 crc kubenswrapper[4895]: I1202 07:25:04.365984 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:25:04 crc kubenswrapper[4895]: I1202 07:25:04.366057 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:25:04 crc kubenswrapper[4895]: I1202 07:25:04.366085 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:25:04Z","lastTransitionTime":"2025-12-02T07:25:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:25:04 crc kubenswrapper[4895]: I1202 07:25:04.469363 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:25:04 crc kubenswrapper[4895]: I1202 07:25:04.469454 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:25:04 crc kubenswrapper[4895]: I1202 07:25:04.469480 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:25:04 crc kubenswrapper[4895]: I1202 07:25:04.469509 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:25:04 crc kubenswrapper[4895]: I1202 07:25:04.469532 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:25:04Z","lastTransitionTime":"2025-12-02T07:25:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:25:04 crc kubenswrapper[4895]: I1202 07:25:04.572997 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:25:04 crc kubenswrapper[4895]: I1202 07:25:04.573106 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:25:04 crc kubenswrapper[4895]: I1202 07:25:04.573131 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:25:04 crc kubenswrapper[4895]: I1202 07:25:04.573164 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:25:04 crc kubenswrapper[4895]: I1202 07:25:04.573183 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:25:04Z","lastTransitionTime":"2025-12-02T07:25:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:25:04 crc kubenswrapper[4895]: I1202 07:25:04.677406 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:25:04 crc kubenswrapper[4895]: I1202 07:25:04.677455 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:25:04 crc kubenswrapper[4895]: I1202 07:25:04.677465 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:25:04 crc kubenswrapper[4895]: I1202 07:25:04.677482 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:25:04 crc kubenswrapper[4895]: I1202 07:25:04.677491 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:25:04Z","lastTransitionTime":"2025-12-02T07:25:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:25:04 crc kubenswrapper[4895]: I1202 07:25:04.781005 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:25:04 crc kubenswrapper[4895]: I1202 07:25:04.781056 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:25:04 crc kubenswrapper[4895]: I1202 07:25:04.781073 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:25:04 crc kubenswrapper[4895]: I1202 07:25:04.781097 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:25:04 crc kubenswrapper[4895]: I1202 07:25:04.781120 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:25:04Z","lastTransitionTime":"2025-12-02T07:25:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:25:04 crc kubenswrapper[4895]: I1202 07:25:04.883693 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:25:04 crc kubenswrapper[4895]: I1202 07:25:04.883730 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:25:04 crc kubenswrapper[4895]: I1202 07:25:04.883771 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:25:04 crc kubenswrapper[4895]: I1202 07:25:04.883789 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:25:04 crc kubenswrapper[4895]: I1202 07:25:04.883801 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:25:04Z","lastTransitionTime":"2025-12-02T07:25:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:25:04 crc kubenswrapper[4895]: I1202 07:25:04.986457 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:25:04 crc kubenswrapper[4895]: I1202 07:25:04.986511 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:25:04 crc kubenswrapper[4895]: I1202 07:25:04.986530 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:25:04 crc kubenswrapper[4895]: I1202 07:25:04.986569 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:25:04 crc kubenswrapper[4895]: I1202 07:25:04.986607 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:25:04Z","lastTransitionTime":"2025-12-02T07:25:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:25:05 crc kubenswrapper[4895]: I1202 07:25:05.089148 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:25:05 crc kubenswrapper[4895]: I1202 07:25:05.089181 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:25:05 crc kubenswrapper[4895]: I1202 07:25:05.089189 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:25:05 crc kubenswrapper[4895]: I1202 07:25:05.089204 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:25:05 crc kubenswrapper[4895]: I1202 07:25:05.089214 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:25:05Z","lastTransitionTime":"2025-12-02T07:25:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:25:05 crc kubenswrapper[4895]: I1202 07:25:05.141222 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:25:05 crc kubenswrapper[4895]: I1202 07:25:05.141225 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:25:05 crc kubenswrapper[4895]: E1202 07:25:05.141416 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 07:25:05 crc kubenswrapper[4895]: E1202 07:25:05.141493 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 07:25:05 crc kubenswrapper[4895]: I1202 07:25:05.193923 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:25:05 crc kubenswrapper[4895]: I1202 07:25:05.194059 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:25:05 crc kubenswrapper[4895]: I1202 07:25:05.194087 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:25:05 crc kubenswrapper[4895]: I1202 07:25:05.194122 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:25:05 crc kubenswrapper[4895]: I1202 07:25:05.194147 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:25:05Z","lastTransitionTime":"2025-12-02T07:25:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:25:05 crc kubenswrapper[4895]: I1202 07:25:05.298194 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:25:05 crc kubenswrapper[4895]: I1202 07:25:05.298263 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:25:05 crc kubenswrapper[4895]: I1202 07:25:05.298280 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:25:05 crc kubenswrapper[4895]: I1202 07:25:05.298308 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:25:05 crc kubenswrapper[4895]: I1202 07:25:05.298326 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:25:05Z","lastTransitionTime":"2025-12-02T07:25:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:25:05 crc kubenswrapper[4895]: I1202 07:25:05.401578 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:25:05 crc kubenswrapper[4895]: I1202 07:25:05.401638 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:25:05 crc kubenswrapper[4895]: I1202 07:25:05.401656 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:25:05 crc kubenswrapper[4895]: I1202 07:25:05.401681 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:25:05 crc kubenswrapper[4895]: I1202 07:25:05.401698 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:25:05Z","lastTransitionTime":"2025-12-02T07:25:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:25:05 crc kubenswrapper[4895]: I1202 07:25:05.505090 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:25:05 crc kubenswrapper[4895]: I1202 07:25:05.505171 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:25:05 crc kubenswrapper[4895]: I1202 07:25:05.505193 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:25:05 crc kubenswrapper[4895]: I1202 07:25:05.505225 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:25:05 crc kubenswrapper[4895]: I1202 07:25:05.505249 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:25:05Z","lastTransitionTime":"2025-12-02T07:25:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:25:05 crc kubenswrapper[4895]: I1202 07:25:05.607702 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:25:05 crc kubenswrapper[4895]: I1202 07:25:05.607836 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:25:05 crc kubenswrapper[4895]: I1202 07:25:05.607859 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:25:05 crc kubenswrapper[4895]: I1202 07:25:05.607893 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:25:05 crc kubenswrapper[4895]: I1202 07:25:05.607929 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:25:05Z","lastTransitionTime":"2025-12-02T07:25:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:25:05 crc kubenswrapper[4895]: I1202 07:25:05.711896 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:25:05 crc kubenswrapper[4895]: I1202 07:25:05.711970 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:25:05 crc kubenswrapper[4895]: I1202 07:25:05.711987 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:25:05 crc kubenswrapper[4895]: I1202 07:25:05.712016 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:25:05 crc kubenswrapper[4895]: I1202 07:25:05.712041 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:25:05Z","lastTransitionTime":"2025-12-02T07:25:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:25:05 crc kubenswrapper[4895]: I1202 07:25:05.815021 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:25:05 crc kubenswrapper[4895]: I1202 07:25:05.815129 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:25:05 crc kubenswrapper[4895]: I1202 07:25:05.815142 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:25:05 crc kubenswrapper[4895]: I1202 07:25:05.815164 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:25:05 crc kubenswrapper[4895]: I1202 07:25:05.815177 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:25:05Z","lastTransitionTime":"2025-12-02T07:25:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:25:05 crc kubenswrapper[4895]: I1202 07:25:05.918534 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:25:05 crc kubenswrapper[4895]: I1202 07:25:05.918623 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:25:05 crc kubenswrapper[4895]: I1202 07:25:05.918648 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:25:05 crc kubenswrapper[4895]: I1202 07:25:05.918681 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:25:05 crc kubenswrapper[4895]: I1202 07:25:05.918705 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:25:05Z","lastTransitionTime":"2025-12-02T07:25:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:25:06 crc kubenswrapper[4895]: I1202 07:25:06.021860 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:25:06 crc kubenswrapper[4895]: I1202 07:25:06.022476 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:25:06 crc kubenswrapper[4895]: I1202 07:25:06.022506 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:25:06 crc kubenswrapper[4895]: I1202 07:25:06.022541 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:25:06 crc kubenswrapper[4895]: I1202 07:25:06.022563 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:25:06Z","lastTransitionTime":"2025-12-02T07:25:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:25:06 crc kubenswrapper[4895]: I1202 07:25:06.126158 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:25:06 crc kubenswrapper[4895]: I1202 07:25:06.126270 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:25:06 crc kubenswrapper[4895]: I1202 07:25:06.126326 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:25:06 crc kubenswrapper[4895]: I1202 07:25:06.126354 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:25:06 crc kubenswrapper[4895]: I1202 07:25:06.126413 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:25:06Z","lastTransitionTime":"2025-12-02T07:25:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:25:06 crc kubenswrapper[4895]: I1202 07:25:06.141029 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:25:06 crc kubenswrapper[4895]: I1202 07:25:06.141048 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5f88v" Dec 02 07:25:06 crc kubenswrapper[4895]: E1202 07:25:06.141204 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 07:25:06 crc kubenswrapper[4895]: E1202 07:25:06.141319 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5f88v" podUID="5af25091-1401-45d4-ae53-d2b469c879da" Dec 02 07:25:06 crc kubenswrapper[4895]: I1202 07:25:06.229538 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:25:06 crc kubenswrapper[4895]: I1202 07:25:06.229596 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:25:06 crc kubenswrapper[4895]: I1202 07:25:06.229613 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:25:06 crc kubenswrapper[4895]: I1202 07:25:06.229642 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:25:06 crc kubenswrapper[4895]: I1202 07:25:06.229660 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:25:06Z","lastTransitionTime":"2025-12-02T07:25:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:25:06 crc kubenswrapper[4895]: I1202 07:25:06.332885 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:25:06 crc kubenswrapper[4895]: I1202 07:25:06.332929 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:25:06 crc kubenswrapper[4895]: I1202 07:25:06.332947 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:25:06 crc kubenswrapper[4895]: I1202 07:25:06.332971 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:25:06 crc kubenswrapper[4895]: I1202 07:25:06.332986 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:25:06Z","lastTransitionTime":"2025-12-02T07:25:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:25:06 crc kubenswrapper[4895]: I1202 07:25:06.435875 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:25:06 crc kubenswrapper[4895]: I1202 07:25:06.436010 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:25:06 crc kubenswrapper[4895]: I1202 07:25:06.436037 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:25:06 crc kubenswrapper[4895]: I1202 07:25:06.436065 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:25:06 crc kubenswrapper[4895]: I1202 07:25:06.436111 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:25:06Z","lastTransitionTime":"2025-12-02T07:25:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:25:06 crc kubenswrapper[4895]: I1202 07:25:06.539503 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:25:06 crc kubenswrapper[4895]: I1202 07:25:06.539573 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:25:06 crc kubenswrapper[4895]: I1202 07:25:06.539606 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:25:06 crc kubenswrapper[4895]: I1202 07:25:06.539724 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:25:06 crc kubenswrapper[4895]: I1202 07:25:06.539782 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:25:06Z","lastTransitionTime":"2025-12-02T07:25:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:25:06 crc kubenswrapper[4895]: I1202 07:25:06.642340 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:25:06 crc kubenswrapper[4895]: I1202 07:25:06.642417 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:25:06 crc kubenswrapper[4895]: I1202 07:25:06.642436 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:25:06 crc kubenswrapper[4895]: I1202 07:25:06.642467 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:25:06 crc kubenswrapper[4895]: I1202 07:25:06.642489 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:25:06Z","lastTransitionTime":"2025-12-02T07:25:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:25:06 crc kubenswrapper[4895]: I1202 07:25:06.746272 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:25:06 crc kubenswrapper[4895]: I1202 07:25:06.746345 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:25:06 crc kubenswrapper[4895]: I1202 07:25:06.746363 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:25:06 crc kubenswrapper[4895]: I1202 07:25:06.746390 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:25:06 crc kubenswrapper[4895]: I1202 07:25:06.746410 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:25:06Z","lastTransitionTime":"2025-12-02T07:25:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:25:06 crc kubenswrapper[4895]: I1202 07:25:06.849409 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:25:06 crc kubenswrapper[4895]: I1202 07:25:06.849473 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:25:06 crc kubenswrapper[4895]: I1202 07:25:06.849494 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:25:06 crc kubenswrapper[4895]: I1202 07:25:06.849522 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:25:06 crc kubenswrapper[4895]: I1202 07:25:06.849543 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:25:06Z","lastTransitionTime":"2025-12-02T07:25:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:25:06 crc kubenswrapper[4895]: I1202 07:25:06.953205 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:25:06 crc kubenswrapper[4895]: I1202 07:25:06.953267 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:25:06 crc kubenswrapper[4895]: I1202 07:25:06.953288 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:25:06 crc kubenswrapper[4895]: I1202 07:25:06.953315 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:25:06 crc kubenswrapper[4895]: I1202 07:25:06.953332 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:25:06Z","lastTransitionTime":"2025-12-02T07:25:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:25:07 crc kubenswrapper[4895]: I1202 07:25:07.056188 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:25:07 crc kubenswrapper[4895]: I1202 07:25:07.056248 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:25:07 crc kubenswrapper[4895]: I1202 07:25:07.056267 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:25:07 crc kubenswrapper[4895]: I1202 07:25:07.056297 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:25:07 crc kubenswrapper[4895]: I1202 07:25:07.056315 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:25:07Z","lastTransitionTime":"2025-12-02T07:25:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:25:07 crc kubenswrapper[4895]: I1202 07:25:07.140377 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:25:07 crc kubenswrapper[4895]: I1202 07:25:07.140392 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:25:07 crc kubenswrapper[4895]: E1202 07:25:07.140621 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 07:25:07 crc kubenswrapper[4895]: E1202 07:25:07.140691 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 07:25:07 crc kubenswrapper[4895]: I1202 07:25:07.160013 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:25:07 crc kubenswrapper[4895]: I1202 07:25:07.160073 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:25:07 crc kubenswrapper[4895]: I1202 07:25:07.160161 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:25:07 crc kubenswrapper[4895]: I1202 07:25:07.160869 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:25:07 crc kubenswrapper[4895]: I1202 07:25:07.160918 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:25:07Z","lastTransitionTime":"2025-12-02T07:25:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:25:07 crc kubenswrapper[4895]: I1202 07:25:07.264572 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:25:07 crc kubenswrapper[4895]: I1202 07:25:07.264631 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:25:07 crc kubenswrapper[4895]: I1202 07:25:07.264648 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:25:07 crc kubenswrapper[4895]: I1202 07:25:07.264671 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:25:07 crc kubenswrapper[4895]: I1202 07:25:07.264688 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:25:07Z","lastTransitionTime":"2025-12-02T07:25:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:25:07 crc kubenswrapper[4895]: I1202 07:25:07.367549 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:25:07 crc kubenswrapper[4895]: I1202 07:25:07.367620 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:25:07 crc kubenswrapper[4895]: I1202 07:25:07.367640 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:25:07 crc kubenswrapper[4895]: I1202 07:25:07.367671 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:25:07 crc kubenswrapper[4895]: I1202 07:25:07.367690 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:25:07Z","lastTransitionTime":"2025-12-02T07:25:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:25:07 crc kubenswrapper[4895]: I1202 07:25:07.471179 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:25:07 crc kubenswrapper[4895]: I1202 07:25:07.471242 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:25:07 crc kubenswrapper[4895]: I1202 07:25:07.471260 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:25:07 crc kubenswrapper[4895]: I1202 07:25:07.471288 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:25:07 crc kubenswrapper[4895]: I1202 07:25:07.471308 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:25:07Z","lastTransitionTime":"2025-12-02T07:25:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:25:07 crc kubenswrapper[4895]: I1202 07:25:07.574007 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:25:07 crc kubenswrapper[4895]: I1202 07:25:07.574077 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:25:07 crc kubenswrapper[4895]: I1202 07:25:07.574094 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:25:07 crc kubenswrapper[4895]: I1202 07:25:07.574122 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:25:07 crc kubenswrapper[4895]: I1202 07:25:07.574141 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:25:07Z","lastTransitionTime":"2025-12-02T07:25:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:25:07 crc kubenswrapper[4895]: I1202 07:25:07.677521 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:25:07 crc kubenswrapper[4895]: I1202 07:25:07.677594 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:25:07 crc kubenswrapper[4895]: I1202 07:25:07.677617 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:25:07 crc kubenswrapper[4895]: I1202 07:25:07.677648 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:25:07 crc kubenswrapper[4895]: I1202 07:25:07.677670 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:25:07Z","lastTransitionTime":"2025-12-02T07:25:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:25:07 crc kubenswrapper[4895]: I1202 07:25:07.781326 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:25:07 crc kubenswrapper[4895]: I1202 07:25:07.781390 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:25:07 crc kubenswrapper[4895]: I1202 07:25:07.781413 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:25:07 crc kubenswrapper[4895]: I1202 07:25:07.781448 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:25:07 crc kubenswrapper[4895]: I1202 07:25:07.781475 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:25:07Z","lastTransitionTime":"2025-12-02T07:25:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:25:07 crc kubenswrapper[4895]: I1202 07:25:07.885172 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:25:07 crc kubenswrapper[4895]: I1202 07:25:07.885244 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:25:07 crc kubenswrapper[4895]: I1202 07:25:07.885264 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:25:07 crc kubenswrapper[4895]: I1202 07:25:07.885296 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:25:07 crc kubenswrapper[4895]: I1202 07:25:07.885316 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:25:07Z","lastTransitionTime":"2025-12-02T07:25:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:25:07 crc kubenswrapper[4895]: I1202 07:25:07.989051 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:25:07 crc kubenswrapper[4895]: I1202 07:25:07.989102 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:25:07 crc kubenswrapper[4895]: I1202 07:25:07.989120 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:25:07 crc kubenswrapper[4895]: I1202 07:25:07.989143 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:25:07 crc kubenswrapper[4895]: I1202 07:25:07.989158 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:25:07Z","lastTransitionTime":"2025-12-02T07:25:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:25:08 crc kubenswrapper[4895]: I1202 07:25:08.093469 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:25:08 crc kubenswrapper[4895]: I1202 07:25:08.093524 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:25:08 crc kubenswrapper[4895]: I1202 07:25:08.093543 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:25:08 crc kubenswrapper[4895]: I1202 07:25:08.093570 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:25:08 crc kubenswrapper[4895]: I1202 07:25:08.093589 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:25:08Z","lastTransitionTime":"2025-12-02T07:25:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:25:08 crc kubenswrapper[4895]: I1202 07:25:08.140365 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:25:08 crc kubenswrapper[4895]: I1202 07:25:08.140484 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5f88v" Dec 02 07:25:08 crc kubenswrapper[4895]: E1202 07:25:08.140668 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 07:25:08 crc kubenswrapper[4895]: E1202 07:25:08.141202 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5f88v" podUID="5af25091-1401-45d4-ae53-d2b469c879da" Dec 02 07:25:08 crc kubenswrapper[4895]: I1202 07:25:08.209269 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:25:08 crc kubenswrapper[4895]: I1202 07:25:08.209322 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:25:08 crc kubenswrapper[4895]: I1202 07:25:08.209334 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:25:08 crc kubenswrapper[4895]: I1202 07:25:08.209354 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:25:08 crc kubenswrapper[4895]: I1202 07:25:08.209368 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:25:08Z","lastTransitionTime":"2025-12-02T07:25:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:25:08 crc kubenswrapper[4895]: I1202 07:25:08.313237 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:25:08 crc kubenswrapper[4895]: I1202 07:25:08.313300 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:25:08 crc kubenswrapper[4895]: I1202 07:25:08.313316 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:25:08 crc kubenswrapper[4895]: I1202 07:25:08.313338 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:25:08 crc kubenswrapper[4895]: I1202 07:25:08.313357 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:25:08Z","lastTransitionTime":"2025-12-02T07:25:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:25:08 crc kubenswrapper[4895]: I1202 07:25:08.417705 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:25:08 crc kubenswrapper[4895]: I1202 07:25:08.417861 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:25:08 crc kubenswrapper[4895]: I1202 07:25:08.417881 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:25:08 crc kubenswrapper[4895]: I1202 07:25:08.417910 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:25:08 crc kubenswrapper[4895]: I1202 07:25:08.417930 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:25:08Z","lastTransitionTime":"2025-12-02T07:25:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:25:08 crc kubenswrapper[4895]: I1202 07:25:08.521579 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:25:08 crc kubenswrapper[4895]: I1202 07:25:08.521688 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:25:08 crc kubenswrapper[4895]: I1202 07:25:08.521725 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:25:08 crc kubenswrapper[4895]: I1202 07:25:08.521799 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:25:08 crc kubenswrapper[4895]: I1202 07:25:08.521819 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:25:08Z","lastTransitionTime":"2025-12-02T07:25:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:25:08 crc kubenswrapper[4895]: I1202 07:25:08.625837 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:25:08 crc kubenswrapper[4895]: I1202 07:25:08.625911 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:25:08 crc kubenswrapper[4895]: I1202 07:25:08.625928 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:25:08 crc kubenswrapper[4895]: I1202 07:25:08.625952 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:25:08 crc kubenswrapper[4895]: I1202 07:25:08.625972 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:25:08Z","lastTransitionTime":"2025-12-02T07:25:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:25:08 crc kubenswrapper[4895]: I1202 07:25:08.729501 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:25:08 crc kubenswrapper[4895]: I1202 07:25:08.729568 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:25:08 crc kubenswrapper[4895]: I1202 07:25:08.729590 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:25:08 crc kubenswrapper[4895]: I1202 07:25:08.729622 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:25:08 crc kubenswrapper[4895]: I1202 07:25:08.729644 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:25:08Z","lastTransitionTime":"2025-12-02T07:25:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:25:08 crc kubenswrapper[4895]: I1202 07:25:08.833237 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:25:08 crc kubenswrapper[4895]: I1202 07:25:08.833331 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:25:08 crc kubenswrapper[4895]: I1202 07:25:08.833358 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:25:08 crc kubenswrapper[4895]: I1202 07:25:08.833392 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:25:08 crc kubenswrapper[4895]: I1202 07:25:08.833417 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:25:08Z","lastTransitionTime":"2025-12-02T07:25:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:25:08 crc kubenswrapper[4895]: I1202 07:25:08.936384 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:25:08 crc kubenswrapper[4895]: I1202 07:25:08.936456 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:25:08 crc kubenswrapper[4895]: I1202 07:25:08.936480 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:25:08 crc kubenswrapper[4895]: I1202 07:25:08.936510 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:25:08 crc kubenswrapper[4895]: I1202 07:25:08.936534 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:25:08Z","lastTransitionTime":"2025-12-02T07:25:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:25:09 crc kubenswrapper[4895]: I1202 07:25:09.040071 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:25:09 crc kubenswrapper[4895]: I1202 07:25:09.040129 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:25:09 crc kubenswrapper[4895]: I1202 07:25:09.040140 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:25:09 crc kubenswrapper[4895]: I1202 07:25:09.040162 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:25:09 crc kubenswrapper[4895]: I1202 07:25:09.040173 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:25:09Z","lastTransitionTime":"2025-12-02T07:25:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:25:09 crc kubenswrapper[4895]: I1202 07:25:09.066680 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:25:09 crc kubenswrapper[4895]: I1202 07:25:09.066778 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:25:09 crc kubenswrapper[4895]: I1202 07:25:09.066791 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:25:09 crc kubenswrapper[4895]: I1202 07:25:09.066813 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:25:09 crc kubenswrapper[4895]: I1202 07:25:09.066828 4895 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:25:09Z","lastTransitionTime":"2025-12-02T07:25:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:25:09 crc kubenswrapper[4895]: I1202 07:25:09.139054 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-cc7gj"] Dec 02 07:25:09 crc kubenswrapper[4895]: I1202 07:25:09.139644 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cc7gj" Dec 02 07:25:09 crc kubenswrapper[4895]: E1202 07:25:09.140366 4895 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Dec 02 07:25:09 crc kubenswrapper[4895]: I1202 07:25:09.143913 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:25:09 crc kubenswrapper[4895]: E1202 07:25:09.144019 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 07:25:09 crc kubenswrapper[4895]: I1202 07:25:09.144232 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:25:09 crc kubenswrapper[4895]: E1202 07:25:09.144292 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 07:25:09 crc kubenswrapper[4895]: I1202 07:25:09.145299 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 02 07:25:09 crc kubenswrapper[4895]: I1202 07:25:09.145576 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 02 07:25:09 crc kubenswrapper[4895]: I1202 07:25:09.150632 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Dec 02 07:25:09 crc kubenswrapper[4895]: I1202 07:25:09.156210 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 02 07:25:09 crc kubenswrapper[4895]: I1202 07:25:09.169461 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=36.169430094 podStartE2EDuration="36.169430094s" podCreationTimestamp="2025-12-02 07:24:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:25:09.169333461 +0000 UTC m=+120.340193124" watchObservedRunningTime="2025-12-02 07:25:09.169430094 +0000 UTC m=+120.340289747" Dec 02 07:25:09 crc kubenswrapper[4895]: I1202 07:25:09.189090 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=96.189070986 podStartE2EDuration="1m36.189070986s" podCreationTimestamp="2025-12-02 07:23:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:25:09.188189929 +0000 UTC m=+120.359049562" watchObservedRunningTime="2025-12-02 07:25:09.189070986 
+0000 UTC m=+120.359930629" Dec 02 07:25:09 crc kubenswrapper[4895]: I1202 07:25:09.236255 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/17ca2586-a0ef-41f0-b8cc-83b778697ad2-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-cc7gj\" (UID: \"17ca2586-a0ef-41f0-b8cc-83b778697ad2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cc7gj" Dec 02 07:25:09 crc kubenswrapper[4895]: I1202 07:25:09.236304 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/17ca2586-a0ef-41f0-b8cc-83b778697ad2-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-cc7gj\" (UID: \"17ca2586-a0ef-41f0-b8cc-83b778697ad2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cc7gj" Dec 02 07:25:09 crc kubenswrapper[4895]: I1202 07:25:09.236350 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/17ca2586-a0ef-41f0-b8cc-83b778697ad2-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-cc7gj\" (UID: \"17ca2586-a0ef-41f0-b8cc-83b778697ad2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cc7gj" Dec 02 07:25:09 crc kubenswrapper[4895]: I1202 07:25:09.236370 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/17ca2586-a0ef-41f0-b8cc-83b778697ad2-service-ca\") pod \"cluster-version-operator-5c965bbfc6-cc7gj\" (UID: \"17ca2586-a0ef-41f0-b8cc-83b778697ad2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cc7gj" Dec 02 07:25:09 crc kubenswrapper[4895]: I1202 07:25:09.236402 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/17ca2586-a0ef-41f0-b8cc-83b778697ad2-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-cc7gj\" (UID: \"17ca2586-a0ef-41f0-b8cc-83b778697ad2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cc7gj" Dec 02 07:25:09 crc kubenswrapper[4895]: I1202 07:25:09.242627 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-74fkh" podStartSLOduration=92.242609217 podStartE2EDuration="1m32.242609217s" podCreationTimestamp="2025-12-02 07:23:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:25:09.226793392 +0000 UTC m=+120.397653065" watchObservedRunningTime="2025-12-02 07:25:09.242609217 +0000 UTC m=+120.413468830" Dec 02 07:25:09 crc kubenswrapper[4895]: I1202 07:25:09.259830 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podStartSLOduration=92.259801014 podStartE2EDuration="1m32.259801014s" podCreationTimestamp="2025-12-02 07:23:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:25:09.243222235 +0000 UTC m=+120.414081848" watchObservedRunningTime="2025-12-02 07:25:09.259801014 +0000 UTC m=+120.430660627" Dec 02 07:25:09 crc kubenswrapper[4895]: E1202 07:25:09.276223 4895 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Dec 02 07:25:09 crc kubenswrapper[4895]: I1202 07:25:09.281090 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=95.281064175 podStartE2EDuration="1m35.281064175s" podCreationTimestamp="2025-12-02 07:23:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:25:09.280151607 +0000 UTC m=+120.451011260" watchObservedRunningTime="2025-12-02 07:25:09.281064175 +0000 UTC m=+120.451923868" Dec 02 07:25:09 crc kubenswrapper[4895]: I1202 07:25:09.332164 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-n7xcr" podStartSLOduration=92.33212955 podStartE2EDuration="1m32.33212955s" podCreationTimestamp="2025-12-02 07:23:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:25:09.331226282 +0000 UTC m=+120.502085905" watchObservedRunningTime="2025-12-02 07:25:09.33212955 +0000 UTC m=+120.502989203" Dec 02 07:25:09 crc kubenswrapper[4895]: I1202 07:25:09.338057 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/17ca2586-a0ef-41f0-b8cc-83b778697ad2-service-ca\") pod \"cluster-version-operator-5c965bbfc6-cc7gj\" (UID: \"17ca2586-a0ef-41f0-b8cc-83b778697ad2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cc7gj" Dec 02 07:25:09 crc kubenswrapper[4895]: I1202 07:25:09.338147 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/17ca2586-a0ef-41f0-b8cc-83b778697ad2-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-cc7gj\" (UID: \"17ca2586-a0ef-41f0-b8cc-83b778697ad2\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cc7gj" Dec 02 07:25:09 crc kubenswrapper[4895]: I1202 07:25:09.338199 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/17ca2586-a0ef-41f0-b8cc-83b778697ad2-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-cc7gj\" (UID: \"17ca2586-a0ef-41f0-b8cc-83b778697ad2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cc7gj" Dec 02 07:25:09 crc kubenswrapper[4895]: I1202 07:25:09.338230 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/17ca2586-a0ef-41f0-b8cc-83b778697ad2-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-cc7gj\" (UID: \"17ca2586-a0ef-41f0-b8cc-83b778697ad2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cc7gj" Dec 02 07:25:09 crc kubenswrapper[4895]: I1202 07:25:09.338286 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/17ca2586-a0ef-41f0-b8cc-83b778697ad2-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-cc7gj\" (UID: \"17ca2586-a0ef-41f0-b8cc-83b778697ad2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cc7gj" Dec 02 07:25:09 crc kubenswrapper[4895]: I1202 07:25:09.338340 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/17ca2586-a0ef-41f0-b8cc-83b778697ad2-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-cc7gj\" (UID: \"17ca2586-a0ef-41f0-b8cc-83b778697ad2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cc7gj" Dec 02 07:25:09 crc kubenswrapper[4895]: I1202 07:25:09.338367 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: 
\"kubernetes.io/host-path/17ca2586-a0ef-41f0-b8cc-83b778697ad2-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-cc7gj\" (UID: \"17ca2586-a0ef-41f0-b8cc-83b778697ad2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cc7gj" Dec 02 07:25:09 crc kubenswrapper[4895]: I1202 07:25:09.340492 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/17ca2586-a0ef-41f0-b8cc-83b778697ad2-service-ca\") pod \"cluster-version-operator-5c965bbfc6-cc7gj\" (UID: \"17ca2586-a0ef-41f0-b8cc-83b778697ad2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cc7gj" Dec 02 07:25:09 crc kubenswrapper[4895]: I1202 07:25:09.352655 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/17ca2586-a0ef-41f0-b8cc-83b778697ad2-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-cc7gj\" (UID: \"17ca2586-a0ef-41f0-b8cc-83b778697ad2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cc7gj" Dec 02 07:25:09 crc kubenswrapper[4895]: I1202 07:25:09.353600 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-hlxqt" podStartSLOduration=92.353577937 podStartE2EDuration="1m32.353577937s" podCreationTimestamp="2025-12-02 07:23:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:25:09.352507954 +0000 UTC m=+120.523367597" watchObservedRunningTime="2025-12-02 07:25:09.353577937 +0000 UTC m=+120.524437580" Dec 02 07:25:09 crc kubenswrapper[4895]: I1202 07:25:09.372413 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/17ca2586-a0ef-41f0-b8cc-83b778697ad2-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-cc7gj\" (UID: \"17ca2586-a0ef-41f0-b8cc-83b778697ad2\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cc7gj" Dec 02 07:25:09 crc kubenswrapper[4895]: I1202 07:25:09.387443 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=71.387405604 podStartE2EDuration="1m11.387405604s" podCreationTimestamp="2025-12-02 07:23:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:25:09.369286678 +0000 UTC m=+120.540146291" watchObservedRunningTime="2025-12-02 07:25:09.387405604 +0000 UTC m=+120.558265257" Dec 02 07:25:09 crc kubenswrapper[4895]: I1202 07:25:09.399322 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-lhpbd" podStartSLOduration=92.399294668 podStartE2EDuration="1m32.399294668s" podCreationTimestamp="2025-12-02 07:23:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:25:09.398950928 +0000 UTC m=+120.569810551" watchObservedRunningTime="2025-12-02 07:25:09.399294668 +0000 UTC m=+120.570154271" Dec 02 07:25:09 crc kubenswrapper[4895]: I1202 07:25:09.412280 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wxp5p" podStartSLOduration=92.412260425 podStartE2EDuration="1m32.412260425s" podCreationTimestamp="2025-12-02 07:23:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:25:09.411641166 +0000 UTC m=+120.582500789" watchObservedRunningTime="2025-12-02 07:25:09.412260425 +0000 UTC m=+120.583120028" Dec 02 07:25:09 crc kubenswrapper[4895]: I1202 07:25:09.465003 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" 
podStartSLOduration=33.46497259 podStartE2EDuration="33.46497259s" podCreationTimestamp="2025-12-02 07:24:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:25:09.462309689 +0000 UTC m=+120.633169312" watchObservedRunningTime="2025-12-02 07:25:09.46497259 +0000 UTC m=+120.635832243" Dec 02 07:25:09 crc kubenswrapper[4895]: I1202 07:25:09.469606 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cc7gj" Dec 02 07:25:09 crc kubenswrapper[4895]: I1202 07:25:09.885823 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cc7gj" event={"ID":"17ca2586-a0ef-41f0-b8cc-83b778697ad2","Type":"ContainerStarted","Data":"df746389e591d15e970eaf6fe10f545eb6091dd2ed24275ac52e7dc14bc89534"} Dec 02 07:25:09 crc kubenswrapper[4895]: I1202 07:25:09.885892 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cc7gj" event={"ID":"17ca2586-a0ef-41f0-b8cc-83b778697ad2","Type":"ContainerStarted","Data":"98c7b21c49b2178f78c392aa673043b2ff966241be4cd9d6220c963260d3f70d"} Dec 02 07:25:09 crc kubenswrapper[4895]: I1202 07:25:09.903647 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cc7gj" podStartSLOduration=92.903616183 podStartE2EDuration="1m32.903616183s" podCreationTimestamp="2025-12-02 07:23:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:25:09.902620992 +0000 UTC m=+121.073480645" watchObservedRunningTime="2025-12-02 07:25:09.903616183 +0000 UTC m=+121.074475836" Dec 02 07:25:10 crc kubenswrapper[4895]: I1202 07:25:10.141032 4895 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5f88v" Dec 02 07:25:10 crc kubenswrapper[4895]: I1202 07:25:10.141098 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:25:10 crc kubenswrapper[4895]: E1202 07:25:10.141445 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5f88v" podUID="5af25091-1401-45d4-ae53-d2b469c879da" Dec 02 07:25:10 crc kubenswrapper[4895]: E1202 07:25:10.141555 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 07:25:11 crc kubenswrapper[4895]: I1202 07:25:11.141090 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:25:11 crc kubenswrapper[4895]: E1202 07:25:11.141326 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 07:25:11 crc kubenswrapper[4895]: I1202 07:25:11.141956 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:25:11 crc kubenswrapper[4895]: E1202 07:25:11.142096 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 07:25:11 crc kubenswrapper[4895]: I1202 07:25:11.894013 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-hlxqt_30911fe5-208f-44e8-a380-2a0093f24863/kube-multus/1.log" Dec 02 07:25:11 crc kubenswrapper[4895]: I1202 07:25:11.894650 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-hlxqt_30911fe5-208f-44e8-a380-2a0093f24863/kube-multus/0.log" Dec 02 07:25:11 crc kubenswrapper[4895]: I1202 07:25:11.894723 4895 generic.go:334] "Generic (PLEG): container finished" podID="30911fe5-208f-44e8-a380-2a0093f24863" containerID="a569570ff32547c25fcdced649773ea0ab6d3aeccdaa5c26aa5a86d2c745535f" exitCode=1 Dec 02 07:25:11 crc kubenswrapper[4895]: I1202 07:25:11.894799 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-hlxqt" event={"ID":"30911fe5-208f-44e8-a380-2a0093f24863","Type":"ContainerDied","Data":"a569570ff32547c25fcdced649773ea0ab6d3aeccdaa5c26aa5a86d2c745535f"} Dec 02 07:25:11 crc kubenswrapper[4895]: I1202 07:25:11.894845 4895 scope.go:117] "RemoveContainer" containerID="87ebbf4f529d4e7b64c392912d06e57dc80a2237a9a4e31ab5b56ac556aa10b1" Dec 02 07:25:11 crc 
kubenswrapper[4895]: I1202 07:25:11.895536 4895 scope.go:117] "RemoveContainer" containerID="a569570ff32547c25fcdced649773ea0ab6d3aeccdaa5c26aa5a86d2c745535f" Dec 02 07:25:11 crc kubenswrapper[4895]: E1202 07:25:11.895843 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-hlxqt_openshift-multus(30911fe5-208f-44e8-a380-2a0093f24863)\"" pod="openshift-multus/multus-hlxqt" podUID="30911fe5-208f-44e8-a380-2a0093f24863" Dec 02 07:25:12 crc kubenswrapper[4895]: I1202 07:25:12.140656 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:25:12 crc kubenswrapper[4895]: I1202 07:25:12.140700 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5f88v" Dec 02 07:25:12 crc kubenswrapper[4895]: E1202 07:25:12.140833 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 07:25:12 crc kubenswrapper[4895]: E1202 07:25:12.140949 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5f88v" podUID="5af25091-1401-45d4-ae53-d2b469c879da" Dec 02 07:25:12 crc kubenswrapper[4895]: I1202 07:25:12.899810 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-hlxqt_30911fe5-208f-44e8-a380-2a0093f24863/kube-multus/1.log" Dec 02 07:25:13 crc kubenswrapper[4895]: I1202 07:25:13.141249 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:25:13 crc kubenswrapper[4895]: E1202 07:25:13.141464 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 07:25:13 crc kubenswrapper[4895]: I1202 07:25:13.141540 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:25:13 crc kubenswrapper[4895]: E1202 07:25:13.141705 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 07:25:14 crc kubenswrapper[4895]: I1202 07:25:14.140257 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5f88v" Dec 02 07:25:14 crc kubenswrapper[4895]: I1202 07:25:14.140283 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:25:14 crc kubenswrapper[4895]: E1202 07:25:14.140477 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5f88v" podUID="5af25091-1401-45d4-ae53-d2b469c879da" Dec 02 07:25:14 crc kubenswrapper[4895]: E1202 07:25:14.141168 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 07:25:14 crc kubenswrapper[4895]: I1202 07:25:14.141867 4895 scope.go:117] "RemoveContainer" containerID="52bad85cc5d71d65071ede7a4c939c8f773468925d9bb31b09a37641d870c228" Dec 02 07:25:14 crc kubenswrapper[4895]: E1202 07:25:14.142193 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-w54m4_openshift-ovn-kubernetes(afc3334a-0153-4dcc-9a56-92f6cae51c08)\"" pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" podUID="afc3334a-0153-4dcc-9a56-92f6cae51c08" Dec 02 07:25:14 crc kubenswrapper[4895]: E1202 07:25:14.278131 4895 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Dec 02 07:25:15 crc kubenswrapper[4895]: I1202 07:25:15.140416 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:25:15 crc kubenswrapper[4895]: E1202 07:25:15.140643 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 07:25:15 crc kubenswrapper[4895]: I1202 07:25:15.140723 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:25:15 crc kubenswrapper[4895]: E1202 07:25:15.141208 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 07:25:16 crc kubenswrapper[4895]: I1202 07:25:16.140950 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5f88v" Dec 02 07:25:16 crc kubenswrapper[4895]: E1202 07:25:16.141156 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5f88v" podUID="5af25091-1401-45d4-ae53-d2b469c879da" Dec 02 07:25:16 crc kubenswrapper[4895]: I1202 07:25:16.141478 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:25:16 crc kubenswrapper[4895]: E1202 07:25:16.141572 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 07:25:17 crc kubenswrapper[4895]: I1202 07:25:17.140393 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:25:17 crc kubenswrapper[4895]: I1202 07:25:17.140396 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:25:17 crc kubenswrapper[4895]: E1202 07:25:17.140662 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 07:25:17 crc kubenswrapper[4895]: E1202 07:25:17.140860 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 07:25:18 crc kubenswrapper[4895]: I1202 07:25:18.140911 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5f88v" Dec 02 07:25:18 crc kubenswrapper[4895]: I1202 07:25:18.140943 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:25:18 crc kubenswrapper[4895]: E1202 07:25:18.141157 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5f88v" podUID="5af25091-1401-45d4-ae53-d2b469c879da" Dec 02 07:25:18 crc kubenswrapper[4895]: E1202 07:25:18.141426 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 07:25:19 crc kubenswrapper[4895]: I1202 07:25:19.141079 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:25:19 crc kubenswrapper[4895]: I1202 07:25:19.141270 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:25:19 crc kubenswrapper[4895]: E1202 07:25:19.143064 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 07:25:19 crc kubenswrapper[4895]: E1202 07:25:19.143254 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 07:25:19 crc kubenswrapper[4895]: E1202 07:25:19.279291 4895 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 02 07:25:20 crc kubenswrapper[4895]: I1202 07:25:20.140293 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:25:20 crc kubenswrapper[4895]: I1202 07:25:20.140409 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5f88v" Dec 02 07:25:20 crc kubenswrapper[4895]: E1202 07:25:20.141012 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 07:25:20 crc kubenswrapper[4895]: E1202 07:25:20.141126 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5f88v" podUID="5af25091-1401-45d4-ae53-d2b469c879da" Dec 02 07:25:21 crc kubenswrapper[4895]: I1202 07:25:21.140723 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:25:21 crc kubenswrapper[4895]: I1202 07:25:21.140723 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:25:21 crc kubenswrapper[4895]: E1202 07:25:21.142126 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 07:25:21 crc kubenswrapper[4895]: E1202 07:25:21.142404 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 07:25:22 crc kubenswrapper[4895]: I1202 07:25:22.140835 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5f88v" Dec 02 07:25:22 crc kubenswrapper[4895]: E1202 07:25:22.140994 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5f88v" podUID="5af25091-1401-45d4-ae53-d2b469c879da" Dec 02 07:25:22 crc kubenswrapper[4895]: I1202 07:25:22.140836 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:25:22 crc kubenswrapper[4895]: E1202 07:25:22.141070 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 07:25:23 crc kubenswrapper[4895]: I1202 07:25:23.141306 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:25:23 crc kubenswrapper[4895]: I1202 07:25:23.141306 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:25:23 crc kubenswrapper[4895]: E1202 07:25:23.141580 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 07:25:23 crc kubenswrapper[4895]: E1202 07:25:23.141732 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 07:25:24 crc kubenswrapper[4895]: I1202 07:25:24.141176 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5f88v" Dec 02 07:25:24 crc kubenswrapper[4895]: I1202 07:25:24.141526 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:25:24 crc kubenswrapper[4895]: I1202 07:25:24.141801 4895 scope.go:117] "RemoveContainer" containerID="a569570ff32547c25fcdced649773ea0ab6d3aeccdaa5c26aa5a86d2c745535f" Dec 02 07:25:24 crc kubenswrapper[4895]: E1202 07:25:24.141856 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5f88v" podUID="5af25091-1401-45d4-ae53-d2b469c879da" Dec 02 07:25:24 crc kubenswrapper[4895]: E1202 07:25:24.142224 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 07:25:24 crc kubenswrapper[4895]: E1202 07:25:24.279963 4895 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Dec 02 07:25:24 crc kubenswrapper[4895]: I1202 07:25:24.945963 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-hlxqt_30911fe5-208f-44e8-a380-2a0093f24863/kube-multus/1.log" Dec 02 07:25:24 crc kubenswrapper[4895]: I1202 07:25:24.946253 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-hlxqt" event={"ID":"30911fe5-208f-44e8-a380-2a0093f24863","Type":"ContainerStarted","Data":"da2cfa8cea74106ff83eeb39671986f152230f16024c55af625bfdc4dca6a73d"} Dec 02 07:25:25 crc kubenswrapper[4895]: I1202 07:25:25.140654 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:25:25 crc kubenswrapper[4895]: I1202 07:25:25.141076 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:25:25 crc kubenswrapper[4895]: E1202 07:25:25.141070 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 07:25:25 crc kubenswrapper[4895]: E1202 07:25:25.141838 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 07:25:25 crc kubenswrapper[4895]: I1202 07:25:25.142164 4895 scope.go:117] "RemoveContainer" containerID="52bad85cc5d71d65071ede7a4c939c8f773468925d9bb31b09a37641d870c228" Dec 02 07:25:25 crc kubenswrapper[4895]: I1202 07:25:25.949187 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-5f88v"] Dec 02 07:25:25 crc kubenswrapper[4895]: I1202 07:25:25.949382 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5f88v" Dec 02 07:25:25 crc kubenswrapper[4895]: E1202 07:25:25.949536 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5f88v" podUID="5af25091-1401-45d4-ae53-d2b469c879da" Dec 02 07:25:25 crc kubenswrapper[4895]: I1202 07:25:25.959725 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w54m4_afc3334a-0153-4dcc-9a56-92f6cae51c08/ovnkube-controller/3.log" Dec 02 07:25:25 crc kubenswrapper[4895]: I1202 07:25:25.967666 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" event={"ID":"afc3334a-0153-4dcc-9a56-92f6cae51c08","Type":"ContainerStarted","Data":"cd338136e3f94adb272a6ccce660c3f7a51db40e100162b0af875d6814db30d1"} Dec 02 07:25:25 crc kubenswrapper[4895]: I1202 07:25:25.968572 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" Dec 02 07:25:26 crc kubenswrapper[4895]: I1202 07:25:26.007642 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" podStartSLOduration=109.007622044 podStartE2EDuration="1m49.007622044s" podCreationTimestamp="2025-12-02 07:23:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:25:26.005495364 +0000 UTC m=+137.176354987" watchObservedRunningTime="2025-12-02 07:25:26.007622044 +0000 UTC m=+137.178481657" Dec 02 07:25:26 crc kubenswrapper[4895]: I1202 07:25:26.140132 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:25:26 crc kubenswrapper[4895]: E1202 07:25:26.140833 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 07:25:27 crc kubenswrapper[4895]: I1202 07:25:27.140512 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:25:27 crc kubenswrapper[4895]: I1202 07:25:27.140636 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:25:27 crc kubenswrapper[4895]: E1202 07:25:27.140716 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 07:25:27 crc kubenswrapper[4895]: E1202 07:25:27.140908 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 07:25:28 crc kubenswrapper[4895]: I1202 07:25:28.140400 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5f88v" Dec 02 07:25:28 crc kubenswrapper[4895]: I1202 07:25:28.140499 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:25:28 crc kubenswrapper[4895]: E1202 07:25:28.140712 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5f88v" podUID="5af25091-1401-45d4-ae53-d2b469c879da" Dec 02 07:25:28 crc kubenswrapper[4895]: E1202 07:25:28.140918 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.140441 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:25:29 crc kubenswrapper[4895]: E1202 07:25:29.142542 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.142793 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:25:29 crc kubenswrapper[4895]: E1202 07:25:29.143083 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.456899 4895 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.507889 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-dz88z"] Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.508231 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-q5k65"] Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.508456 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-s9km8"] Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.513164 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-dz88z" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.513323 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-s9km8" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.513202 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-q5k65" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.516115 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-jfs6x"] Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.524185 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-q7mhm"] Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.524460 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-jfs6x" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.525080 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.525166 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.525308 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.525488 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.525760 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.525860 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.525956 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 
07:25:29.526036 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.526072 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.526111 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.526357 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.526915 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-482tt"] Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.527075 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-q7mhm" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.527971 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-9mn97"] Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.528582 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-482tt" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.529169 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.529402 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.529622 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.529887 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kpqhf"] Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.529971 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.530878 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.530029 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-9mn97" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.530589 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.530641 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.531396 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-tjqt9"] Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.531682 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kpqhf" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.531576 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.532618 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-d7hb4"] Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.537589 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-rd9pn"] Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.532815 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-tjqt9" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.538465 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-jd7nh"] Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.533892 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.534219 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.534256 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.539061 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.534296 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.534335 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.534546 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.534614 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.534636 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.534794 4895 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-apiserver"/"image-import-ca" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.539369 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-rd9pn" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.539938 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-d7hb4" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.544726 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-nv29v"] Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.544834 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jd7nh" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.545632 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-7d79x"] Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.546164 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-b7rlb"] Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.546558 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-rcln9"] Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.546763 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-nv29v" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.546937 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7d79x" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.547990 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-lppp8"] Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.548052 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-b7rlb" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.548143 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-rcln9" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.549028 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-6g8m8"] Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.549568 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tcbw8"] Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.550776 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-lf9fx"] Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.551158 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qp9gz"] Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.551524 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bvgr6"] Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.551607 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-6g8m8" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.552152 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bvgr6" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.552205 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qp9gz" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.552228 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tcbw8" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.552252 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-lf9fx" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.552649 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-lppp8" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.553528 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.553735 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.555879 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.556357 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.556558 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.556696 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.556785 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.556881 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.556905 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.557726 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 
02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.557876 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xhq8l"] Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.582173 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-2tvrt"] Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.585090 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-2tvrt" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.585637 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xhq8l" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.585776 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bfe88c24-f4ac-410d-8692-81fe612083e7-config\") pod \"authentication-operator-69f744f599-dz88z\" (UID: \"bfe88c24-f4ac-410d-8692-81fe612083e7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dz88z" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.585968 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9s7q\" (UniqueName: \"kubernetes.io/projected/bfe88c24-f4ac-410d-8692-81fe612083e7-kube-api-access-c9s7q\") pod \"authentication-operator-69f744f599-dz88z\" (UID: \"bfe88c24-f4ac-410d-8692-81fe612083e7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dz88z" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.586029 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c48181d0-7322-418a-8f38-7e3450675f0e-etcd-serving-ca\") pod 
\"apiserver-76f77b778f-jfs6x\" (UID: \"c48181d0-7322-418a-8f38-7e3450675f0e\") " pod="openshift-apiserver/apiserver-76f77b778f-jfs6x" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.586068 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ccx5f\" (UniqueName: \"kubernetes.io/projected/46f6dbb0-1eb8-4e4c-bf14-977fb7e4fd97-kube-api-access-ccx5f\") pod \"machine-approver-56656f9798-q5k65\" (UID: \"46f6dbb0-1eb8-4e4c-bf14-977fb7e4fd97\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-q5k65" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.586102 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rq6cb\" (UniqueName: \"kubernetes.io/projected/2a937aec-9d85-4924-b88f-69200cab4ee5-kube-api-access-rq6cb\") pod \"openshift-config-operator-7777fb866f-s9km8\" (UID: \"2a937aec-9d85-4924-b88f-69200cab4ee5\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-s9km8" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.586130 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/46f6dbb0-1eb8-4e4c-bf14-977fb7e4fd97-auth-proxy-config\") pod \"machine-approver-56656f9798-q5k65\" (UID: \"46f6dbb0-1eb8-4e4c-bf14-977fb7e4fd97\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-q5k65" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.586155 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/c48181d0-7322-418a-8f38-7e3450675f0e-audit\") pod \"apiserver-76f77b778f-jfs6x\" (UID: \"c48181d0-7322-418a-8f38-7e3450675f0e\") " pod="openshift-apiserver/apiserver-76f77b778f-jfs6x" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.586194 4895 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8zx4\" (UniqueName: \"kubernetes.io/projected/c48181d0-7322-418a-8f38-7e3450675f0e-kube-api-access-w8zx4\") pod \"apiserver-76f77b778f-jfs6x\" (UID: \"c48181d0-7322-418a-8f38-7e3450675f0e\") " pod="openshift-apiserver/apiserver-76f77b778f-jfs6x" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.586249 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bfe88c24-f4ac-410d-8692-81fe612083e7-serving-cert\") pod \"authentication-operator-69f744f599-dz88z\" (UID: \"bfe88c24-f4ac-410d-8692-81fe612083e7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dz88z" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.586279 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bfe88c24-f4ac-410d-8692-81fe612083e7-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-dz88z\" (UID: \"bfe88c24-f4ac-410d-8692-81fe612083e7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dz88z" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.586306 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c48181d0-7322-418a-8f38-7e3450675f0e-trusted-ca-bundle\") pod \"apiserver-76f77b778f-jfs6x\" (UID: \"c48181d0-7322-418a-8f38-7e3450675f0e\") " pod="openshift-apiserver/apiserver-76f77b778f-jfs6x" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.586332 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/46f6dbb0-1eb8-4e4c-bf14-977fb7e4fd97-machine-approver-tls\") pod \"machine-approver-56656f9798-q5k65\" (UID: 
\"46f6dbb0-1eb8-4e4c-bf14-977fb7e4fd97\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-q5k65" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.586404 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46f6dbb0-1eb8-4e4c-bf14-977fb7e4fd97-config\") pod \"machine-approver-56656f9798-q5k65\" (UID: \"46f6dbb0-1eb8-4e4c-bf14-977fb7e4fd97\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-q5k65" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.586442 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c48181d0-7322-418a-8f38-7e3450675f0e-encryption-config\") pod \"apiserver-76f77b778f-jfs6x\" (UID: \"c48181d0-7322-418a-8f38-7e3450675f0e\") " pod="openshift-apiserver/apiserver-76f77b778f-jfs6x" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.586493 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/c48181d0-7322-418a-8f38-7e3450675f0e-image-import-ca\") pod \"apiserver-76f77b778f-jfs6x\" (UID: \"c48181d0-7322-418a-8f38-7e3450675f0e\") " pod="openshift-apiserver/apiserver-76f77b778f-jfs6x" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.586511 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c48181d0-7322-418a-8f38-7e3450675f0e-serving-cert\") pod \"apiserver-76f77b778f-jfs6x\" (UID: \"c48181d0-7322-418a-8f38-7e3450675f0e\") " pod="openshift-apiserver/apiserver-76f77b778f-jfs6x" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.586535 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/bfe88c24-f4ac-410d-8692-81fe612083e7-service-ca-bundle\") pod \"authentication-operator-69f744f599-dz88z\" (UID: \"bfe88c24-f4ac-410d-8692-81fe612083e7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dz88z" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.586672 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c48181d0-7322-418a-8f38-7e3450675f0e-node-pullsecrets\") pod \"apiserver-76f77b778f-jfs6x\" (UID: \"c48181d0-7322-418a-8f38-7e3450675f0e\") " pod="openshift-apiserver/apiserver-76f77b778f-jfs6x" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.586701 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/2a937aec-9d85-4924-b88f-69200cab4ee5-available-featuregates\") pod \"openshift-config-operator-7777fb866f-s9km8\" (UID: \"2a937aec-9d85-4924-b88f-69200cab4ee5\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-s9km8" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.586787 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c48181d0-7322-418a-8f38-7e3450675f0e-audit-dir\") pod \"apiserver-76f77b778f-jfs6x\" (UID: \"c48181d0-7322-418a-8f38-7e3450675f0e\") " pod="openshift-apiserver/apiserver-76f77b778f-jfs6x" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.586833 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c48181d0-7322-418a-8f38-7e3450675f0e-etcd-client\") pod \"apiserver-76f77b778f-jfs6x\" (UID: \"c48181d0-7322-418a-8f38-7e3450675f0e\") " pod="openshift-apiserver/apiserver-76f77b778f-jfs6x" Dec 02 07:25:29 crc kubenswrapper[4895]: 
I1202 07:25:29.586860 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2a937aec-9d85-4924-b88f-69200cab4ee5-serving-cert\") pod \"openshift-config-operator-7777fb866f-s9km8\" (UID: \"2a937aec-9d85-4924-b88f-69200cab4ee5\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-s9km8" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.586890 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c48181d0-7322-418a-8f38-7e3450675f0e-config\") pod \"apiserver-76f77b778f-jfs6x\" (UID: \"c48181d0-7322-418a-8f38-7e3450675f0e\") " pod="openshift-apiserver/apiserver-76f77b778f-jfs6x" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.586989 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-49t7q"] Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.589354 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.589424 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-49t7q" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.590117 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.594298 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.594433 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.594614 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.594757 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.595554 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.595553 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.595594 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.596470 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.596855 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 02 07:25:29 
crc kubenswrapper[4895]: I1202 07:25:29.598888 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.599036 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.615528 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.615757 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.615877 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.615888 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.616032 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.616314 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.616400 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.616500 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.616579 4895 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.616627 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.616730 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.617001 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.618396 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6zlkc"] Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.619010 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6zlkc" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.620357 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.622003 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.623138 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.623815 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.624082 4895 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-route-controller-manager"/"serving-cert" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.626861 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.627161 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.627376 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.627476 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.627482 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.627583 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.627985 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.628108 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.628283 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.628458 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Dec 02 07:25:29 crc kubenswrapper[4895]: 
I1202 07:25:29.629316 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.629410 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.629519 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.631859 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.632006 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.632294 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.632442 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.632528 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.632942 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.635266 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.637130 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-authentication-operator/authentication-operator-69f744f599-dz88z"] Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.638024 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.638224 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.639315 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-fz6sl"] Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.639836 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.640027 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-zj6vq"] Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.640604 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-zj6vq" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.640611 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.640709 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fz6sl" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.641539 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.641864 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-q7mhm"] Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.642697 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-4n5nx"] Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.643541 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4n5nx" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.644607 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.646024 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.649067 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.655619 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.656567 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ld28g"] Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.657576 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" 
Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.658787 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fsr4s"] Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.661022 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ld28g" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.661156 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.664411 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-s9km8"] Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.664454 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vs2rc"] Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.665482 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fsr4s" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.666270 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vs2rc" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.678341 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.678689 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.679026 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.679104 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.680826 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.683493 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.684380 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-dvkjf"] Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.686121 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-dvkjf" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.688069 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c48181d0-7322-418a-8f38-7e3450675f0e-etcd-serving-ca\") pod \"apiserver-76f77b778f-jfs6x\" (UID: \"c48181d0-7322-418a-8f38-7e3450675f0e\") " pod="openshift-apiserver/apiserver-76f77b778f-jfs6x" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.688106 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ccx5f\" (UniqueName: \"kubernetes.io/projected/46f6dbb0-1eb8-4e4c-bf14-977fb7e4fd97-kube-api-access-ccx5f\") pod \"machine-approver-56656f9798-q5k65\" (UID: \"46f6dbb0-1eb8-4e4c-bf14-977fb7e4fd97\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-q5k65" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.688135 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rq6cb\" (UniqueName: \"kubernetes.io/projected/2a937aec-9d85-4924-b88f-69200cab4ee5-kube-api-access-rq6cb\") pod \"openshift-config-operator-7777fb866f-s9km8\" (UID: \"2a937aec-9d85-4924-b88f-69200cab4ee5\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-s9km8" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.688153 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/46f6dbb0-1eb8-4e4c-bf14-977fb7e4fd97-auth-proxy-config\") pod \"machine-approver-56656f9798-q5k65\" (UID: \"46f6dbb0-1eb8-4e4c-bf14-977fb7e4fd97\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-q5k65" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.688169 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: 
\"kubernetes.io/configmap/c48181d0-7322-418a-8f38-7e3450675f0e-audit\") pod \"apiserver-76f77b778f-jfs6x\" (UID: \"c48181d0-7322-418a-8f38-7e3450675f0e\") " pod="openshift-apiserver/apiserver-76f77b778f-jfs6x" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.688190 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8zx4\" (UniqueName: \"kubernetes.io/projected/c48181d0-7322-418a-8f38-7e3450675f0e-kube-api-access-w8zx4\") pod \"apiserver-76f77b778f-jfs6x\" (UID: \"c48181d0-7322-418a-8f38-7e3450675f0e\") " pod="openshift-apiserver/apiserver-76f77b778f-jfs6x" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.688220 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bfe88c24-f4ac-410d-8692-81fe612083e7-serving-cert\") pod \"authentication-operator-69f744f599-dz88z\" (UID: \"bfe88c24-f4ac-410d-8692-81fe612083e7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dz88z" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.688239 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bfe88c24-f4ac-410d-8692-81fe612083e7-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-dz88z\" (UID: \"bfe88c24-f4ac-410d-8692-81fe612083e7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dz88z" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.688263 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e3b447ef-4ff4-4f34-8b4d-144f6022029e-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-482tt\" (UID: \"e3b447ef-4ff4-4f34-8b4d-144f6022029e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-482tt" Dec 02 07:25:29 crc 
kubenswrapper[4895]: I1202 07:25:29.688286 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3b447ef-4ff4-4f34-8b4d-144f6022029e-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-482tt\" (UID: \"e3b447ef-4ff4-4f34-8b4d-144f6022029e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-482tt" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.688310 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c48181d0-7322-418a-8f38-7e3450675f0e-trusted-ca-bundle\") pod \"apiserver-76f77b778f-jfs6x\" (UID: \"c48181d0-7322-418a-8f38-7e3450675f0e\") " pod="openshift-apiserver/apiserver-76f77b778f-jfs6x" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.688336 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46f6dbb0-1eb8-4e4c-bf14-977fb7e4fd97-config\") pod \"machine-approver-56656f9798-q5k65\" (UID: \"46f6dbb0-1eb8-4e4c-bf14-977fb7e4fd97\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-q5k65" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.688357 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c48181d0-7322-418a-8f38-7e3450675f0e-encryption-config\") pod \"apiserver-76f77b778f-jfs6x\" (UID: \"c48181d0-7322-418a-8f38-7e3450675f0e\") " pod="openshift-apiserver/apiserver-76f77b778f-jfs6x" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.688375 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/46f6dbb0-1eb8-4e4c-bf14-977fb7e4fd97-machine-approver-tls\") pod \"machine-approver-56656f9798-q5k65\" (UID: 
\"46f6dbb0-1eb8-4e4c-bf14-977fb7e4fd97\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-q5k65" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.688400 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/c48181d0-7322-418a-8f38-7e3450675f0e-image-import-ca\") pod \"apiserver-76f77b778f-jfs6x\" (UID: \"c48181d0-7322-418a-8f38-7e3450675f0e\") " pod="openshift-apiserver/apiserver-76f77b778f-jfs6x" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.688413 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c48181d0-7322-418a-8f38-7e3450675f0e-serving-cert\") pod \"apiserver-76f77b778f-jfs6x\" (UID: \"c48181d0-7322-418a-8f38-7e3450675f0e\") " pod="openshift-apiserver/apiserver-76f77b778f-jfs6x" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.688438 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bfe88c24-f4ac-410d-8692-81fe612083e7-service-ca-bundle\") pod \"authentication-operator-69f744f599-dz88z\" (UID: \"bfe88c24-f4ac-410d-8692-81fe612083e7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dz88z" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.688461 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c48181d0-7322-418a-8f38-7e3450675f0e-node-pullsecrets\") pod \"apiserver-76f77b778f-jfs6x\" (UID: \"c48181d0-7322-418a-8f38-7e3450675f0e\") " pod="openshift-apiserver/apiserver-76f77b778f-jfs6x" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.688496 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: 
\"kubernetes.io/empty-dir/2a937aec-9d85-4924-b88f-69200cab4ee5-available-featuregates\") pod \"openshift-config-operator-7777fb866f-s9km8\" (UID: \"2a937aec-9d85-4924-b88f-69200cab4ee5\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-s9km8" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.688524 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c48181d0-7322-418a-8f38-7e3450675f0e-audit-dir\") pod \"apiserver-76f77b778f-jfs6x\" (UID: \"c48181d0-7322-418a-8f38-7e3450675f0e\") " pod="openshift-apiserver/apiserver-76f77b778f-jfs6x" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.688541 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c48181d0-7322-418a-8f38-7e3450675f0e-etcd-client\") pod \"apiserver-76f77b778f-jfs6x\" (UID: \"c48181d0-7322-418a-8f38-7e3450675f0e\") " pod="openshift-apiserver/apiserver-76f77b778f-jfs6x" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.688560 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2a937aec-9d85-4924-b88f-69200cab4ee5-serving-cert\") pod \"openshift-config-operator-7777fb866f-s9km8\" (UID: \"2a937aec-9d85-4924-b88f-69200cab4ee5\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-s9km8" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.688576 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c48181d0-7322-418a-8f38-7e3450675f0e-config\") pod \"apiserver-76f77b778f-jfs6x\" (UID: \"c48181d0-7322-418a-8f38-7e3450675f0e\") " pod="openshift-apiserver/apiserver-76f77b778f-jfs6x" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.688599 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-5w864\" (UniqueName: \"kubernetes.io/projected/e3b447ef-4ff4-4f34-8b4d-144f6022029e-kube-api-access-5w864\") pod \"openshift-controller-manager-operator-756b6f6bc6-482tt\" (UID: \"e3b447ef-4ff4-4f34-8b4d-144f6022029e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-482tt" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.688629 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bfe88c24-f4ac-410d-8692-81fe612083e7-config\") pod \"authentication-operator-69f744f599-dz88z\" (UID: \"bfe88c24-f4ac-410d-8692-81fe612083e7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dz88z" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.688661 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9s7q\" (UniqueName: \"kubernetes.io/projected/bfe88c24-f4ac-410d-8692-81fe612083e7-kube-api-access-c9s7q\") pod \"authentication-operator-69f744f599-dz88z\" (UID: \"bfe88c24-f4ac-410d-8692-81fe612083e7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dz88z" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.689727 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c48181d0-7322-418a-8f38-7e3450675f0e-etcd-serving-ca\") pod \"apiserver-76f77b778f-jfs6x\" (UID: \"c48181d0-7322-418a-8f38-7e3450675f0e\") " pod="openshift-apiserver/apiserver-76f77b778f-jfs6x" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.690463 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/46f6dbb0-1eb8-4e4c-bf14-977fb7e4fd97-auth-proxy-config\") pod \"machine-approver-56656f9798-q5k65\" (UID: \"46f6dbb0-1eb8-4e4c-bf14-977fb7e4fd97\") " 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-q5k65" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.690474 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-wdmt6"] Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.690913 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/c48181d0-7322-418a-8f38-7e3450675f0e-audit\") pod \"apiserver-76f77b778f-jfs6x\" (UID: \"c48181d0-7322-418a-8f38-7e3450675f0e\") " pod="openshift-apiserver/apiserver-76f77b778f-jfs6x" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.691227 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-qrngv"] Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.691758 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-svxr4"] Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.692357 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c48181d0-7322-418a-8f38-7e3450675f0e-node-pullsecrets\") pod \"apiserver-76f77b778f-jfs6x\" (UID: \"c48181d0-7322-418a-8f38-7e3450675f0e\") " pod="openshift-apiserver/apiserver-76f77b778f-jfs6x" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.693079 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bfe88c24-f4ac-410d-8692-81fe612083e7-service-ca-bundle\") pod \"authentication-operator-69f744f599-dz88z\" (UID: \"bfe88c24-f4ac-410d-8692-81fe612083e7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dz88z" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.693693 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/bfe88c24-f4ac-410d-8692-81fe612083e7-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-dz88z\" (UID: \"bfe88c24-f4ac-410d-8692-81fe612083e7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dz88z" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.694046 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46f6dbb0-1eb8-4e4c-bf14-977fb7e4fd97-config\") pod \"machine-approver-56656f9798-q5k65\" (UID: \"46f6dbb0-1eb8-4e4c-bf14-977fb7e4fd97\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-q5k65" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.694298 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/2a937aec-9d85-4924-b88f-69200cab4ee5-available-featuregates\") pod \"openshift-config-operator-7777fb866f-s9km8\" (UID: \"2a937aec-9d85-4924-b88f-69200cab4ee5\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-s9km8" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.695330 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c48181d0-7322-418a-8f38-7e3450675f0e-config\") pod \"apiserver-76f77b778f-jfs6x\" (UID: \"c48181d0-7322-418a-8f38-7e3450675f0e\") " pod="openshift-apiserver/apiserver-76f77b778f-jfs6x" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.696044 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c48181d0-7322-418a-8f38-7e3450675f0e-audit-dir\") pod \"apiserver-76f77b778f-jfs6x\" (UID: \"c48181d0-7322-418a-8f38-7e3450675f0e\") " pod="openshift-apiserver/apiserver-76f77b778f-jfs6x" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.696086 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-qrngv" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.696113 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wdmt6" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.696377 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-svxr4" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.696783 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bfe88c24-f4ac-410d-8692-81fe612083e7-config\") pod \"authentication-operator-69f744f599-dz88z\" (UID: \"bfe88c24-f4ac-410d-8692-81fe612083e7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dz88z" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.698019 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/c48181d0-7322-418a-8f38-7e3450675f0e-image-import-ca\") pod \"apiserver-76f77b778f-jfs6x\" (UID: \"c48181d0-7322-418a-8f38-7e3450675f0e\") " pod="openshift-apiserver/apiserver-76f77b778f-jfs6x" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.698345 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29410995-cbx2h"] Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.699333 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zkq6j"] Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.699518 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29410995-cbx2h" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.700139 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-zkq6j" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.700217 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-482tt"] Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.700532 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.701274 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-d7hb4"] Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.701919 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bfe88c24-f4ac-410d-8692-81fe612083e7-serving-cert\") pod \"authentication-operator-69f744f599-dz88z\" (UID: \"bfe88c24-f4ac-410d-8692-81fe612083e7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dz88z" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.702324 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c48181d0-7322-418a-8f38-7e3450675f0e-encryption-config\") pod \"apiserver-76f77b778f-jfs6x\" (UID: \"c48181d0-7322-418a-8f38-7e3450675f0e\") " pod="openshift-apiserver/apiserver-76f77b778f-jfs6x" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.702398 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-jfs6x"] Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.703461 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-dns-operator/dns-operator-744455d44c-9mn97"] Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.704495 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-rd9pn"] Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.707898 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2a937aec-9d85-4924-b88f-69200cab4ee5-serving-cert\") pod \"openshift-config-operator-7777fb866f-s9km8\" (UID: \"2a937aec-9d85-4924-b88f-69200cab4ee5\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-s9km8" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.708610 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c48181d0-7322-418a-8f38-7e3450675f0e-serving-cert\") pod \"apiserver-76f77b778f-jfs6x\" (UID: \"c48181d0-7322-418a-8f38-7e3450675f0e\") " pod="openshift-apiserver/apiserver-76f77b778f-jfs6x" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.710678 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-nv29v"] Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.712810 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kpqhf"] Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.712880 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-jd7nh"] Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.714126 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-l82ft"] Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.716155 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: 
\"kubernetes.io/secret/46f6dbb0-1eb8-4e4c-bf14-977fb7e4fd97-machine-approver-tls\") pod \"machine-approver-56656f9798-q5k65\" (UID: \"46f6dbb0-1eb8-4e4c-bf14-977fb7e4fd97\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-q5k65" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.716831 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.717404 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-mzn92"] Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.717999 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-rcln9"] Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.718101 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-mzn92" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.718238 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-l82ft" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.719635 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-6g8m8"] Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.720535 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-tjqt9"] Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.721203 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c48181d0-7322-418a-8f38-7e3450675f0e-trusted-ca-bundle\") pod \"apiserver-76f77b778f-jfs6x\" (UID: \"c48181d0-7322-418a-8f38-7e3450675f0e\") " pod="openshift-apiserver/apiserver-76f77b778f-jfs6x" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.722418 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-lppp8"] Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.723667 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-2tvrt"] Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.724779 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-b7rlb"] Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.726898 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bvgr6"] Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.727601 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c48181d0-7322-418a-8f38-7e3450675f0e-etcd-client\") pod \"apiserver-76f77b778f-jfs6x\" (UID: \"c48181d0-7322-418a-8f38-7e3450675f0e\") " 
pod="openshift-apiserver/apiserver-76f77b778f-jfs6x" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.729723 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xhq8l"] Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.733110 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tcbw8"] Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.736275 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-l82ft"] Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.736504 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.738732 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-4n5nx"] Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.740109 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-bpn7q"] Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.740860 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-bpn7q" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.741639 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zkq6j"] Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.746100 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-49t7q"] Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.756454 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.759191 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-dvkjf"] Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.759440 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fsr4s"] Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.759562 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6zlkc"] Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.761685 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29410995-cbx2h"] Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.764782 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qp9gz"] Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.766811 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-7d79x"] Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.767622 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-qrngv"] Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.768918 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ld28g"] Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.770310 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-fz6sl"] Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.771449 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vs2rc"] Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.772849 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-mzn92"] Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.773961 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-svxr4"] Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.775355 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-zj6vq"] Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.776150 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.779470 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-wdmt6"] Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.781480 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-5kk5x"] Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.782978 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-5kk5x"] Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.783089 4895 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-5kk5x" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.790781 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e3b447ef-4ff4-4f34-8b4d-144f6022029e-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-482tt\" (UID: \"e3b447ef-4ff4-4f34-8b4d-144f6022029e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-482tt" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.790829 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3b447ef-4ff4-4f34-8b4d-144f6022029e-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-482tt\" (UID: \"e3b447ef-4ff4-4f34-8b4d-144f6022029e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-482tt" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.790896 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5w864\" (UniqueName: \"kubernetes.io/projected/e3b447ef-4ff4-4f34-8b4d-144f6022029e-kube-api-access-5w864\") pod \"openshift-controller-manager-operator-756b6f6bc6-482tt\" (UID: \"e3b447ef-4ff4-4f34-8b4d-144f6022029e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-482tt" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.792232 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3b447ef-4ff4-4f34-8b4d-144f6022029e-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-482tt\" (UID: \"e3b447ef-4ff4-4f34-8b4d-144f6022029e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-482tt" Dec 02 07:25:29 crc 
kubenswrapper[4895]: I1202 07:25:29.793713 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e3b447ef-4ff4-4f34-8b4d-144f6022029e-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-482tt\" (UID: \"e3b447ef-4ff4-4f34-8b4d-144f6022029e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-482tt" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.795074 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.815506 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.835589 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.856999 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.875858 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.896077 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.915795 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.935339 4895 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ingress"/"router-stats-default" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.956002 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.975995 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Dec 02 07:25:29 crc kubenswrapper[4895]: I1202 07:25:29.996463 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 02 07:25:30 crc kubenswrapper[4895]: I1202 07:25:30.016347 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 02 07:25:30 crc kubenswrapper[4895]: I1202 07:25:30.036057 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 02 07:25:30 crc kubenswrapper[4895]: I1202 07:25:30.060701 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 02 07:25:30 crc kubenswrapper[4895]: I1202 07:25:30.076722 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 02 07:25:30 crc kubenswrapper[4895]: I1202 07:25:30.095989 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 02 07:25:30 crc kubenswrapper[4895]: I1202 07:25:30.115371 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Dec 02 07:25:30 crc kubenswrapper[4895]: I1202 07:25:30.135009 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 02 07:25:30 crc kubenswrapper[4895]: I1202 07:25:30.140756 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-5f88v" Dec 02 07:25:30 crc kubenswrapper[4895]: I1202 07:25:30.140756 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:25:30 crc kubenswrapper[4895]: I1202 07:25:30.155497 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 02 07:25:30 crc kubenswrapper[4895]: I1202 07:25:30.176395 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 02 07:25:30 crc kubenswrapper[4895]: I1202 07:25:30.197319 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 02 07:25:30 crc kubenswrapper[4895]: I1202 07:25:30.217346 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Dec 02 07:25:30 crc kubenswrapper[4895]: I1202 07:25:30.237140 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 02 07:25:30 crc kubenswrapper[4895]: I1202 07:25:30.257113 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 02 07:25:30 crc kubenswrapper[4895]: I1202 07:25:30.276949 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Dec 02 07:25:30 crc kubenswrapper[4895]: I1202 07:25:30.296818 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Dec 02 07:25:30 crc kubenswrapper[4895]: I1202 07:25:30.316828 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Dec 02 
07:25:30 crc kubenswrapper[4895]: I1202 07:25:30.336303 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Dec 02 07:25:30 crc kubenswrapper[4895]: I1202 07:25:30.357460 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 02 07:25:30 crc kubenswrapper[4895]: I1202 07:25:30.377849 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Dec 02 07:25:30 crc kubenswrapper[4895]: I1202 07:25:30.396577 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 02 07:25:30 crc kubenswrapper[4895]: I1202 07:25:30.415759 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 02 07:25:30 crc kubenswrapper[4895]: I1202 07:25:30.436581 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 02 07:25:30 crc kubenswrapper[4895]: I1202 07:25:30.496598 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 02 07:25:30 crc kubenswrapper[4895]: I1202 07:25:30.516850 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Dec 02 07:25:30 crc kubenswrapper[4895]: I1202 07:25:30.536678 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Dec 02 07:25:30 crc kubenswrapper[4895]: I1202 07:25:30.556519 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 02 07:25:30 crc kubenswrapper[4895]: I1202 07:25:30.575539 4895 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 02 07:25:30 crc kubenswrapper[4895]: I1202 07:25:30.595947 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 02 07:25:30 crc kubenswrapper[4895]: I1202 07:25:30.617453 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Dec 02 07:25:30 crc kubenswrapper[4895]: I1202 07:25:30.636357 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 02 07:25:30 crc kubenswrapper[4895]: I1202 07:25:30.656069 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 02 07:25:30 crc kubenswrapper[4895]: I1202 07:25:30.674121 4895 request.go:700] Waited for 1.012174502s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-operator-lifecycle-manager/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&limit=500&resourceVersion=0 Dec 02 07:25:30 crc kubenswrapper[4895]: I1202 07:25:30.676214 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 02 07:25:30 crc kubenswrapper[4895]: I1202 07:25:30.697266 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Dec 02 07:25:30 crc kubenswrapper[4895]: I1202 07:25:30.715936 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 02 07:25:30 crc kubenswrapper[4895]: I1202 07:25:30.736913 4895 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 02 07:25:30 crc kubenswrapper[4895]: I1202 07:25:30.756702 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 02 07:25:30 crc kubenswrapper[4895]: I1202 07:25:30.776454 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 02 07:25:30 crc kubenswrapper[4895]: I1202 07:25:30.796219 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 02 07:25:30 crc kubenswrapper[4895]: I1202 07:25:30.816629 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 02 07:25:30 crc kubenswrapper[4895]: I1202 07:25:30.835678 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 02 07:25:30 crc kubenswrapper[4895]: I1202 07:25:30.856890 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Dec 02 07:25:30 crc kubenswrapper[4895]: I1202 07:25:30.906300 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9s7q\" (UniqueName: \"kubernetes.io/projected/bfe88c24-f4ac-410d-8692-81fe612083e7-kube-api-access-c9s7q\") pod \"authentication-operator-69f744f599-dz88z\" (UID: \"bfe88c24-f4ac-410d-8692-81fe612083e7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dz88z" Dec 02 07:25:30 crc kubenswrapper[4895]: I1202 07:25:30.927306 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ccx5f\" (UniqueName: \"kubernetes.io/projected/46f6dbb0-1eb8-4e4c-bf14-977fb7e4fd97-kube-api-access-ccx5f\") pod \"machine-approver-56656f9798-q5k65\" (UID: 
\"46f6dbb0-1eb8-4e4c-bf14-977fb7e4fd97\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-q5k65" Dec 02 07:25:30 crc kubenswrapper[4895]: I1202 07:25:30.949471 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rq6cb\" (UniqueName: \"kubernetes.io/projected/2a937aec-9d85-4924-b88f-69200cab4ee5-kube-api-access-rq6cb\") pod \"openshift-config-operator-7777fb866f-s9km8\" (UID: \"2a937aec-9d85-4924-b88f-69200cab4ee5\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-s9km8" Dec 02 07:25:30 crc kubenswrapper[4895]: I1202 07:25:30.956918 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 02 07:25:30 crc kubenswrapper[4895]: I1202 07:25:30.964845 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8zx4\" (UniqueName: \"kubernetes.io/projected/c48181d0-7322-418a-8f38-7e3450675f0e-kube-api-access-w8zx4\") pod \"apiserver-76f77b778f-jfs6x\" (UID: \"c48181d0-7322-418a-8f38-7e3450675f0e\") " pod="openshift-apiserver/apiserver-76f77b778f-jfs6x" Dec 02 07:25:30 crc kubenswrapper[4895]: I1202 07:25:30.976241 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 02 07:25:30 crc kubenswrapper[4895]: I1202 07:25:30.997681 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.016850 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.037007 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.056010 4895 reflector.go:368] Caches populated for *v1.Secret 
from object-"openshift-oauth-apiserver"/"serving-cert" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.076660 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.094443 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-dz88z" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.101460 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.117206 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.135422 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-s9km8" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.138109 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.140534 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.140636 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.149136 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-q5k65" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.156442 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Dec 02 07:25:31 crc kubenswrapper[4895]: W1202 07:25:31.175128 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod46f6dbb0_1eb8_4e4c_bf14_977fb7e4fd97.slice/crio-652b6e0cb7fee28cacbae1d83492a85205b2a8ef9caa6b463bcd49028aa2db96 WatchSource:0}: Error finding container 652b6e0cb7fee28cacbae1d83492a85205b2a8ef9caa6b463bcd49028aa2db96: Status 404 returned error can't find the container with id 652b6e0cb7fee28cacbae1d83492a85205b2a8ef9caa6b463bcd49028aa2db96 Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.176947 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.183287 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-jfs6x" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.196276 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.218013 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.237850 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.256659 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.277604 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.296929 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.319061 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.337502 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.356644 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.384172 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-dz88z"] Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 
07:25:31.386109 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.396889 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Dec 02 07:25:31 crc kubenswrapper[4895]: W1202 07:25:31.415856 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbfe88c24_f4ac_410d_8692_81fe612083e7.slice/crio-df43ebfc5b10ee5f0c8db7ed88bb4642034393ac15bdec6281ac79606246dc09 WatchSource:0}: Error finding container df43ebfc5b10ee5f0c8db7ed88bb4642034393ac15bdec6281ac79606246dc09: Status 404 returned error can't find the container with id df43ebfc5b10ee5f0c8db7ed88bb4642034393ac15bdec6281ac79606246dc09 Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.416824 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.418114 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-s9km8"] Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.440682 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-jfs6x"] Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.441313 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 02 07:25:31 crc kubenswrapper[4895]: W1202 07:25:31.447100 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2a937aec_9d85_4924_b88f_69200cab4ee5.slice/crio-9abd92a7ae4f1695f05b9e6d940c52ff9307d618446989f574308bdc147feada WatchSource:0}: Error finding container 9abd92a7ae4f1695f05b9e6d940c52ff9307d618446989f574308bdc147feada: Status 404 returned error can't find the 
container with id 9abd92a7ae4f1695f05b9e6d940c52ff9307d618446989f574308bdc147feada Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.455402 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.475635 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.496454 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.515540 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.536320 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.555972 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.576927 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.597092 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.615652 4895 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.637434 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Dec 02 07:25:31 crc 
kubenswrapper[4895]: I1202 07:25:31.675829 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.678376 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5w864\" (UniqueName: \"kubernetes.io/projected/e3b447ef-4ff4-4f34-8b4d-144f6022029e-kube-api-access-5w864\") pod \"openshift-controller-manager-operator-756b6f6bc6-482tt\" (UID: \"e3b447ef-4ff4-4f34-8b4d-144f6022029e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-482tt" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.694470 4895 request.go:700] Waited for 1.553213115s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/configmaps?fieldSelector=metadata.name%3Dnetworking-console-plugin&limit=500&resourceVersion=0 Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.696925 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.717610 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.737532 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.795943 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.814624 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlvth\" (UniqueName: 
\"kubernetes.io/projected/00f072cc-9501-499e-82b4-027e2d267930-kube-api-access-xlvth\") pod \"console-operator-58897d9998-6g8m8\" (UID: \"00f072cc-9501-499e-82b4-027e2d267930\") " pod="openshift-console-operator/console-operator-58897d9998-6g8m8" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.814675 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5k4cz\" (UniqueName: \"kubernetes.io/projected/eee3485d-8623-414b-8466-8da5e97c08b7-kube-api-access-5k4cz\") pod \"migrator-59844c95c7-lppp8\" (UID: \"eee3485d-8623-414b-8466-8da5e97c08b7\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-lppp8" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.814764 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4fa0f113-d1df-4bc3-8f5a-b764f523272b-config\") pod \"openshift-apiserver-operator-796bbdcf4f-6zlkc\" (UID: \"4fa0f113-d1df-4bc3-8f5a-b764f523272b\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6zlkc" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.814784 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/1adeefcd-490c-4913-8315-baa7dbc1e7a9-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-nv29v\" (UID: \"1adeefcd-490c-4913-8315-baa7dbc1e7a9\") " pod="openshift-authentication/oauth-openshift-558db77b4-nv29v" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.814806 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/02423153-ef82-4284-b703-cf006e0b8b70-stats-auth\") pod \"router-default-5444994796-lf9fx\" (UID: \"02423153-ef82-4284-b703-cf006e0b8b70\") " 
pod="openshift-ingress/router-default-5444994796-lf9fx" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.814843 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00f072cc-9501-499e-82b4-027e2d267930-config\") pod \"console-operator-58897d9998-6g8m8\" (UID: \"00f072cc-9501-499e-82b4-027e2d267930\") " pod="openshift-console-operator/console-operator-58897d9998-6g8m8" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.814874 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4fa0f113-d1df-4bc3-8f5a-b764f523272b-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-6zlkc\" (UID: \"4fa0f113-d1df-4bc3-8f5a-b764f523272b\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6zlkc" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.814891 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cd87ba31-a340-47a1-a2db-3019015a2c24-serving-cert\") pod \"etcd-operator-b45778765-2tvrt\" (UID: \"cd87ba31-a340-47a1-a2db-3019015a2c24\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2tvrt" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.814928 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1adeefcd-490c-4913-8315-baa7dbc1e7a9-audit-policies\") pod \"oauth-openshift-558db77b4-nv29v\" (UID: \"1adeefcd-490c-4913-8315-baa7dbc1e7a9\") " pod="openshift-authentication/oauth-openshift-558db77b4-nv29v" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.814946 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/cd87ba31-a340-47a1-a2db-3019015a2c24-etcd-client\") pod \"etcd-operator-b45778765-2tvrt\" (UID: \"cd87ba31-a340-47a1-a2db-3019015a2c24\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2tvrt" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.814964 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4f1d0fc5-528f-4529-938f-7041be573fa7-installation-pull-secrets\") pod \"image-registry-697d97f7c8-tjqt9\" (UID: \"4f1d0fc5-528f-4529-938f-7041be573fa7\") " pod="openshift-image-registry/image-registry-697d97f7c8-tjqt9" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.814997 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/1adeefcd-490c-4913-8315-baa7dbc1e7a9-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-nv29v\" (UID: \"1adeefcd-490c-4913-8315-baa7dbc1e7a9\") " pod="openshift-authentication/oauth-openshift-558db77b4-nv29v" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.815018 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ddcbf4b8-5804-4136-8554-6a307825a6ed-console-oauth-config\") pod \"console-f9d7485db-q7mhm\" (UID: \"ddcbf4b8-5804-4136-8554-6a307825a6ed\") " pod="openshift-console/console-f9d7485db-q7mhm" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.815037 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ddcbf4b8-5804-4136-8554-6a307825a6ed-oauth-serving-cert\") pod \"console-f9d7485db-q7mhm\" (UID: \"ddcbf4b8-5804-4136-8554-6a307825a6ed\") " pod="openshift-console/console-f9d7485db-q7mhm" Dec 02 07:25:31 crc kubenswrapper[4895]: 
I1202 07:25:31.815091 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tjqt9\" (UID: \"4f1d0fc5-528f-4529-938f-7041be573fa7\") " pod="openshift-image-registry/image-registry-697d97f7c8-tjqt9" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.815112 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/301d951f-f6cc-4833-9e1c-0cf7f82424d3-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-kpqhf\" (UID: \"301d951f-f6cc-4833-9e1c-0cf7f82424d3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kpqhf" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.815144 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/85410206-3fcb-46c1-ac5d-bc3100b30544-metrics-tls\") pod \"dns-operator-744455d44c-9mn97\" (UID: \"85410206-3fcb-46c1-ac5d-bc3100b30544\") " pod="openshift-dns-operator/dns-operator-744455d44c-9mn97" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.815162 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/1adeefcd-490c-4913-8315-baa7dbc1e7a9-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-nv29v\" (UID: \"1adeefcd-490c-4913-8315-baa7dbc1e7a9\") " pod="openshift-authentication/oauth-openshift-558db77b4-nv29v" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.815221 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/4f1d0fc5-528f-4529-938f-7041be573fa7-registry-tls\") pod \"image-registry-697d97f7c8-tjqt9\" (UID: \"4f1d0fc5-528f-4529-938f-7041be573fa7\") " pod="openshift-image-registry/image-registry-697d97f7c8-tjqt9" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.815239 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4f1d0fc5-528f-4529-938f-7041be573fa7-bound-sa-token\") pod \"image-registry-697d97f7c8-tjqt9\" (UID: \"4f1d0fc5-528f-4529-938f-7041be573fa7\") " pod="openshift-image-registry/image-registry-697d97f7c8-tjqt9" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.815257 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b61a073f-daaf-4b24-8e0d-2d4937aaa601-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-qp9gz\" (UID: \"b61a073f-daaf-4b24-8e0d-2d4937aaa601\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qp9gz" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.815277 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d87e36ab-5b0c-4726-9539-e3bf256e63bc-metrics-tls\") pod \"ingress-operator-5b745b69d9-7d79x\" (UID: \"d87e36ab-5b0c-4726-9539-e3bf256e63bc\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7d79x" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.815310 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/1569ea0e-ca30-4212-95e4-11dde6bca970-images\") pod \"machine-api-operator-5694c8668f-49t7q\" (UID: \"1569ea0e-ca30-4212-95e4-11dde6bca970\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-49t7q" Dec 02 
07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.815326 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/02423153-ef82-4284-b703-cf006e0b8b70-metrics-certs\") pod \"router-default-5444994796-lf9fx\" (UID: \"02423153-ef82-4284-b703-cf006e0b8b70\") " pod="openshift-ingress/router-default-5444994796-lf9fx" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.815345 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvwvx\" (UniqueName: \"kubernetes.io/projected/1569ea0e-ca30-4212-95e4-11dde6bca970-kube-api-access-dvwvx\") pod \"machine-api-operator-5694c8668f-49t7q\" (UID: \"1569ea0e-ca30-4212-95e4-11dde6bca970\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-49t7q" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.815961 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 02 07:25:31 crc kubenswrapper[4895]: E1202 07:25:31.816160 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 07:25:32.316146659 +0000 UTC m=+143.487006262 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tjqt9" (UID: "4f1d0fc5-528f-4529-938f-7041be573fa7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.816894 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/1adeefcd-490c-4913-8315-baa7dbc1e7a9-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-nv29v\" (UID: \"1adeefcd-490c-4913-8315-baa7dbc1e7a9\") " pod="openshift-authentication/oauth-openshift-558db77b4-nv29v" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.816927 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/1adeefcd-490c-4913-8315-baa7dbc1e7a9-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-nv29v\" (UID: \"1adeefcd-490c-4913-8315-baa7dbc1e7a9\") " pod="openshift-authentication/oauth-openshift-558db77b4-nv29v" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.816959 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/902734d2-7cd4-4f9b-ab9e-72cc5d8f3b73-config\") pod \"controller-manager-879f6c89f-rd9pn\" (UID: \"902734d2-7cd4-4f9b-ab9e-72cc5d8f3b73\") " pod="openshift-controller-manager/controller-manager-879f6c89f-rd9pn" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.816980 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-66c26\" (UniqueName: \"kubernetes.io/projected/ddcbf4b8-5804-4136-8554-6a307825a6ed-kube-api-access-66c26\") pod \"console-f9d7485db-q7mhm\" (UID: \"ddcbf4b8-5804-4136-8554-6a307825a6ed\") " pod="openshift-console/console-f9d7485db-q7mhm" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.817033 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tklzt\" (UniqueName: \"kubernetes.io/projected/4f1d0fc5-528f-4529-938f-7041be573fa7-kube-api-access-tklzt\") pod \"image-registry-697d97f7c8-tjqt9\" (UID: \"4f1d0fc5-528f-4529-938f-7041be573fa7\") " pod="openshift-image-registry/image-registry-697d97f7c8-tjqt9" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.817053 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d210c48b-56d9-4385-86dc-da6b7a2cfbee-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-xhq8l\" (UID: \"d210c48b-56d9-4385-86dc-da6b7a2cfbee\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xhq8l" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.817086 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/cd87ba31-a340-47a1-a2db-3019015a2c24-etcd-ca\") pod \"etcd-operator-b45778765-2tvrt\" (UID: \"cd87ba31-a340-47a1-a2db-3019015a2c24\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2tvrt" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.817108 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ddcbf4b8-5804-4136-8554-6a307825a6ed-trusted-ca-bundle\") pod \"console-f9d7485db-q7mhm\" (UID: \"ddcbf4b8-5804-4136-8554-6a307825a6ed\") " pod="openshift-console/console-f9d7485db-q7mhm" Dec 02 
07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.817126 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdc42\" (UniqueName: \"kubernetes.io/projected/02423153-ef82-4284-b703-cf006e0b8b70-kube-api-access-qdc42\") pod \"router-default-5444994796-lf9fx\" (UID: \"02423153-ef82-4284-b703-cf006e0b8b70\") " pod="openshift-ingress/router-default-5444994796-lf9fx" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.817146 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4f1d0fc5-528f-4529-938f-7041be573fa7-registry-certificates\") pod \"image-registry-697d97f7c8-tjqt9\" (UID: \"4f1d0fc5-528f-4529-938f-7041be573fa7\") " pod="openshift-image-registry/image-registry-697d97f7c8-tjqt9" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.817163 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/1569ea0e-ca30-4212-95e4-11dde6bca970-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-49t7q\" (UID: \"1569ea0e-ca30-4212-95e4-11dde6bca970\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-49t7q" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.817181 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/1adeefcd-490c-4913-8315-baa7dbc1e7a9-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-nv29v\" (UID: \"1adeefcd-490c-4913-8315-baa7dbc1e7a9\") " pod="openshift-authentication/oauth-openshift-558db77b4-nv29v" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.817202 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/1adeefcd-490c-4913-8315-baa7dbc1e7a9-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-nv29v\" (UID: \"1adeefcd-490c-4913-8315-baa7dbc1e7a9\") " pod="openshift-authentication/oauth-openshift-558db77b4-nv29v" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.817225 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28k8q\" (UniqueName: \"kubernetes.io/projected/cd87ba31-a340-47a1-a2db-3019015a2c24-kube-api-access-28k8q\") pod \"etcd-operator-b45778765-2tvrt\" (UID: \"cd87ba31-a340-47a1-a2db-3019015a2c24\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2tvrt" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.817269 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/c27769e4-b2f8-4947-96c9-b90bfce6ff0d-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-bvgr6\" (UID: \"c27769e4-b2f8-4947-96c9-b90bfce6ff0d\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bvgr6" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.817304 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1569ea0e-ca30-4212-95e4-11dde6bca970-config\") pod \"machine-api-operator-5694c8668f-49t7q\" (UID: \"1569ea0e-ca30-4212-95e4-11dde6bca970\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-49t7q" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.817326 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q545t\" (UniqueName: \"kubernetes.io/projected/85410206-3fcb-46c1-ac5d-bc3100b30544-kube-api-access-q545t\") pod \"dns-operator-744455d44c-9mn97\" (UID: \"85410206-3fcb-46c1-ac5d-bc3100b30544\") " 
pod="openshift-dns-operator/dns-operator-744455d44c-9mn97" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.819015 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5s67\" (UniqueName: \"kubernetes.io/projected/c27769e4-b2f8-4947-96c9-b90bfce6ff0d-kube-api-access-z5s67\") pod \"control-plane-machine-set-operator-78cbb6b69f-bvgr6\" (UID: \"c27769e4-b2f8-4947-96c9-b90bfce6ff0d\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bvgr6" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.819053 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtl54\" (UniqueName: \"kubernetes.io/projected/0e8ba2f7-f07b-4532-9620-00662d37f5b9-kube-api-access-gtl54\") pod \"downloads-7954f5f757-d7hb4\" (UID: \"0e8ba2f7-f07b-4532-9620-00662d37f5b9\") " pod="openshift-console/downloads-7954f5f757-d7hb4" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.819074 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/301d951f-f6cc-4833-9e1c-0cf7f82424d3-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-kpqhf\" (UID: \"301d951f-f6cc-4833-9e1c-0cf7f82424d3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kpqhf" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.819091 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ddcbf4b8-5804-4136-8554-6a307825a6ed-service-ca\") pod \"console-f9d7485db-q7mhm\" (UID: \"ddcbf4b8-5804-4136-8554-6a307825a6ed\") " pod="openshift-console/console-f9d7485db-q7mhm" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.819111 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/0554f1d4-d22d-47f4-9a38-5c2985fb0cc3-config\") pod \"kube-controller-manager-operator-78b949d7b-tcbw8\" (UID: \"0554f1d4-d22d-47f4-9a38-5c2985fb0cc3\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tcbw8" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.819132 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d87e36ab-5b0c-4726-9539-e3bf256e63bc-bound-sa-token\") pod \"ingress-operator-5b745b69d9-7d79x\" (UID: \"d87e36ab-5b0c-4726-9539-e3bf256e63bc\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7d79x" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.819160 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4f1d0fc5-528f-4529-938f-7041be573fa7-trusted-ca\") pod \"image-registry-697d97f7c8-tjqt9\" (UID: \"4f1d0fc5-528f-4529-938f-7041be573fa7\") " pod="openshift-image-registry/image-registry-697d97f7c8-tjqt9" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.819182 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/adee7e4a-d71b-4efc-b3fa-6e3ece833722-config\") pod \"route-controller-manager-6576b87f9c-jd7nh\" (UID: \"adee7e4a-d71b-4efc-b3fa-6e3ece833722\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jd7nh" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.819200 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zt24s\" (UniqueName: \"kubernetes.io/projected/1adeefcd-490c-4913-8315-baa7dbc1e7a9-kube-api-access-zt24s\") pod \"oauth-openshift-558db77b4-nv29v\" (UID: \"1adeefcd-490c-4913-8315-baa7dbc1e7a9\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-nv29v" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.819217 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/911ed87d-b3fd-4002-b7b0-b720d7066459-config\") pod \"kube-apiserver-operator-766d6c64bb-rcln9\" (UID: \"911ed87d-b3fd-4002-b7b0-b720d7066459\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-rcln9" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.819237 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/911ed87d-b3fd-4002-b7b0-b720d7066459-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-rcln9\" (UID: \"911ed87d-b3fd-4002-b7b0-b720d7066459\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-rcln9" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.819259 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/911ed87d-b3fd-4002-b7b0-b720d7066459-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-rcln9\" (UID: \"911ed87d-b3fd-4002-b7b0-b720d7066459\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-rcln9" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.819278 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0554f1d4-d22d-47f4-9a38-5c2985fb0cc3-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-tcbw8\" (UID: \"0554f1d4-d22d-47f4-9a38-5c2985fb0cc3\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tcbw8" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.819313 4895 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/1adeefcd-490c-4913-8315-baa7dbc1e7a9-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-nv29v\" (UID: \"1adeefcd-490c-4913-8315-baa7dbc1e7a9\") " pod="openshift-authentication/oauth-openshift-558db77b4-nv29v" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.819343 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/02423153-ef82-4284-b703-cf006e0b8b70-service-ca-bundle\") pod \"router-default-5444994796-lf9fx\" (UID: \"02423153-ef82-4284-b703-cf006e0b8b70\") " pod="openshift-ingress/router-default-5444994796-lf9fx" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.819374 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5338ef9d-1a37-4d19-8481-0e0a1de24df4-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-b7rlb\" (UID: \"5338ef9d-1a37-4d19-8481-0e0a1de24df4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-b7rlb" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.819394 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/adee7e4a-d71b-4efc-b3fa-6e3ece833722-client-ca\") pod \"route-controller-manager-6576b87f9c-jd7nh\" (UID: \"adee7e4a-d71b-4efc-b3fa-6e3ece833722\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jd7nh" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.819416 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/1adeefcd-490c-4913-8315-baa7dbc1e7a9-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-nv29v\" (UID: \"1adeefcd-490c-4913-8315-baa7dbc1e7a9\") " pod="openshift-authentication/oauth-openshift-558db77b4-nv29v" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.819435 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9fd9\" (UniqueName: \"kubernetes.io/projected/301d951f-f6cc-4833-9e1c-0cf7f82424d3-kube-api-access-d9fd9\") pod \"cluster-image-registry-operator-dc59b4c8b-kpqhf\" (UID: \"301d951f-f6cc-4833-9e1c-0cf7f82424d3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kpqhf" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.819456 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ddcbf4b8-5804-4136-8554-6a307825a6ed-console-config\") pod \"console-f9d7485db-q7mhm\" (UID: \"ddcbf4b8-5804-4136-8554-6a307825a6ed\") " pod="openshift-console/console-f9d7485db-q7mhm" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.819476 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2msbx\" (UniqueName: \"kubernetes.io/projected/d210c48b-56d9-4385-86dc-da6b7a2cfbee-kube-api-access-2msbx\") pod \"cluster-samples-operator-665b6dd947-xhq8l\" (UID: \"d210c48b-56d9-4385-86dc-da6b7a2cfbee\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xhq8l" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.819494 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/1adeefcd-490c-4913-8315-baa7dbc1e7a9-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-nv29v\" (UID: \"1adeefcd-490c-4913-8315-baa7dbc1e7a9\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-nv29v" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.819516 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/902734d2-7cd4-4f9b-ab9e-72cc5d8f3b73-client-ca\") pod \"controller-manager-879f6c89f-rd9pn\" (UID: \"902734d2-7cd4-4f9b-ab9e-72cc5d8f3b73\") " pod="openshift-controller-manager/controller-manager-879f6c89f-rd9pn" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.819553 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5338ef9d-1a37-4d19-8481-0e0a1de24df4-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-b7rlb\" (UID: \"5338ef9d-1a37-4d19-8481-0e0a1de24df4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-b7rlb" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.819571 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxkff\" (UniqueName: \"kubernetes.io/projected/adee7e4a-d71b-4efc-b3fa-6e3ece833722-kube-api-access-pxkff\") pod \"route-controller-manager-6576b87f9c-jd7nh\" (UID: \"adee7e4a-d71b-4efc-b3fa-6e3ece833722\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jd7nh" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.819626 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4f1d0fc5-528f-4529-938f-7041be573fa7-ca-trust-extracted\") pod \"image-registry-697d97f7c8-tjqt9\" (UID: \"4f1d0fc5-528f-4529-938f-7041be573fa7\") " pod="openshift-image-registry/image-registry-697d97f7c8-tjqt9" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.819643 4895 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1adeefcd-490c-4913-8315-baa7dbc1e7a9-audit-dir\") pod \"oauth-openshift-558db77b4-nv29v\" (UID: \"1adeefcd-490c-4913-8315-baa7dbc1e7a9\") " pod="openshift-authentication/oauth-openshift-558db77b4-nv29v" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.819659 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/00f072cc-9501-499e-82b4-027e2d267930-serving-cert\") pod \"console-operator-58897d9998-6g8m8\" (UID: \"00f072cc-9501-499e-82b4-027e2d267930\") " pod="openshift-console-operator/console-operator-58897d9998-6g8m8" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.819675 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd87ba31-a340-47a1-a2db-3019015a2c24-config\") pod \"etcd-operator-b45778765-2tvrt\" (UID: \"cd87ba31-a340-47a1-a2db-3019015a2c24\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2tvrt" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.819713 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/adee7e4a-d71b-4efc-b3fa-6e3ece833722-serving-cert\") pod \"route-controller-manager-6576b87f9c-jd7nh\" (UID: \"adee7e4a-d71b-4efc-b3fa-6e3ece833722\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jd7nh" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.819735 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/00f072cc-9501-499e-82b4-027e2d267930-trusted-ca\") pod \"console-operator-58897d9998-6g8m8\" (UID: \"00f072cc-9501-499e-82b4-027e2d267930\") " 
pod="openshift-console-operator/console-operator-58897d9998-6g8m8" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.819771 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/cd87ba31-a340-47a1-a2db-3019015a2c24-etcd-service-ca\") pod \"etcd-operator-b45778765-2tvrt\" (UID: \"cd87ba31-a340-47a1-a2db-3019015a2c24\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2tvrt" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.819790 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d87e36ab-5b0c-4726-9539-e3bf256e63bc-trusted-ca\") pod \"ingress-operator-5b745b69d9-7d79x\" (UID: \"d87e36ab-5b0c-4726-9539-e3bf256e63bc\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7d79x" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.819813 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chzt6\" (UniqueName: \"kubernetes.io/projected/5338ef9d-1a37-4d19-8481-0e0a1de24df4-kube-api-access-chzt6\") pod \"kube-storage-version-migrator-operator-b67b599dd-b7rlb\" (UID: \"5338ef9d-1a37-4d19-8481-0e0a1de24df4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-b7rlb" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.819830 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/02423153-ef82-4284-b703-cf006e0b8b70-default-certificate\") pod \"router-default-5444994796-lf9fx\" (UID: \"02423153-ef82-4284-b703-cf006e0b8b70\") " pod="openshift-ingress/router-default-5444994796-lf9fx" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.819850 4895 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/301d951f-f6cc-4833-9e1c-0cf7f82424d3-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-kpqhf\" (UID: \"301d951f-f6cc-4833-9e1c-0cf7f82424d3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kpqhf" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.819865 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ddcbf4b8-5804-4136-8554-6a307825a6ed-console-serving-cert\") pod \"console-f9d7485db-q7mhm\" (UID: \"ddcbf4b8-5804-4136-8554-6a307825a6ed\") " pod="openshift-console/console-f9d7485db-q7mhm" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.819884 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0554f1d4-d22d-47f4-9a38-5c2985fb0cc3-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-tcbw8\" (UID: \"0554f1d4-d22d-47f4-9a38-5c2985fb0cc3\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tcbw8" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.819903 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1adeefcd-490c-4913-8315-baa7dbc1e7a9-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-nv29v\" (UID: \"1adeefcd-490c-4913-8315-baa7dbc1e7a9\") " pod="openshift-authentication/oauth-openshift-558db77b4-nv29v" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.819919 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/902734d2-7cd4-4f9b-ab9e-72cc5d8f3b73-serving-cert\") pod \"controller-manager-879f6c89f-rd9pn\" (UID: \"902734d2-7cd4-4f9b-ab9e-72cc5d8f3b73\") " pod="openshift-controller-manager/controller-manager-879f6c89f-rd9pn" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.819935 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/902734d2-7cd4-4f9b-ab9e-72cc5d8f3b73-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-rd9pn\" (UID: \"902734d2-7cd4-4f9b-ab9e-72cc5d8f3b73\") " pod="openshift-controller-manager/controller-manager-879f6c89f-rd9pn" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.819966 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mk2xd\" (UniqueName: \"kubernetes.io/projected/4fa0f113-d1df-4bc3-8f5a-b764f523272b-kube-api-access-mk2xd\") pod \"openshift-apiserver-operator-796bbdcf4f-6zlkc\" (UID: \"4fa0f113-d1df-4bc3-8f5a-b764f523272b\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6zlkc" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.819984 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqmv5\" (UniqueName: \"kubernetes.io/projected/d87e36ab-5b0c-4726-9539-e3bf256e63bc-kube-api-access-dqmv5\") pod \"ingress-operator-5b745b69d9-7d79x\" (UID: \"d87e36ab-5b0c-4726-9539-e3bf256e63bc\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7d79x" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.820010 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b61a073f-daaf-4b24-8e0d-2d4937aaa601-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-qp9gz\" (UID: \"b61a073f-daaf-4b24-8e0d-2d4937aaa601\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qp9gz" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.820028 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b61a073f-daaf-4b24-8e0d-2d4937aaa601-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-qp9gz\" (UID: \"b61a073f-daaf-4b24-8e0d-2d4937aaa601\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qp9gz" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.820044 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtd8j\" (UniqueName: \"kubernetes.io/projected/902734d2-7cd4-4f9b-ab9e-72cc5d8f3b73-kube-api-access-wtd8j\") pod \"controller-manager-879f6c89f-rd9pn\" (UID: \"902734d2-7cd4-4f9b-ab9e-72cc5d8f3b73\") " pod="openshift-controller-manager/controller-manager-879f6c89f-rd9pn" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.832921 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-482tt" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.921378 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.921585 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f7d85ea2-5fcb-4744-bd7b-fa309a774ab4-srv-cert\") pod \"olm-operator-6b444d44fb-ld28g\" (UID: \"f7d85ea2-5fcb-4744-bd7b-fa309a774ab4\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ld28g" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.921617 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1adeefcd-490c-4913-8315-baa7dbc1e7a9-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-nv29v\" (UID: \"1adeefcd-490c-4913-8315-baa7dbc1e7a9\") " pod="openshift-authentication/oauth-openshift-558db77b4-nv29v" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.921639 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/902734d2-7cd4-4f9b-ab9e-72cc5d8f3b73-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-rd9pn\" (UID: \"902734d2-7cd4-4f9b-ab9e-72cc5d8f3b73\") " pod="openshift-controller-manager/controller-manager-879f6c89f-rd9pn" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.921659 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-hhhgv\" (UniqueName: \"kubernetes.io/projected/acfc02d2-70d4-4e57-a457-509dc0c91437-kube-api-access-hhhgv\") pod \"machine-config-server-bpn7q\" (UID: \"acfc02d2-70d4-4e57-a457-509dc0c91437\") " pod="openshift-machine-config-operator/machine-config-server-bpn7q" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.921677 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lt6hv\" (UniqueName: \"kubernetes.io/projected/f500c69a-f58d-4229-bcb5-6f6e5fbdeb3c-kube-api-access-lt6hv\") pod \"service-ca-operator-777779d784-dvkjf\" (UID: \"f500c69a-f58d-4229-bcb5-6f6e5fbdeb3c\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-dvkjf" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.921695 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dqmv5\" (UniqueName: \"kubernetes.io/projected/d87e36ab-5b0c-4726-9539-e3bf256e63bc-kube-api-access-dqmv5\") pod \"ingress-operator-5b745b69d9-7d79x\" (UID: \"d87e36ab-5b0c-4726-9539-e3bf256e63bc\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7d79x" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.921710 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f500c69a-f58d-4229-bcb5-6f6e5fbdeb3c-serving-cert\") pod \"service-ca-operator-777779d784-dvkjf\" (UID: \"f500c69a-f58d-4229-bcb5-6f6e5fbdeb3c\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-dvkjf" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.921727 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mk2xd\" (UniqueName: \"kubernetes.io/projected/4fa0f113-d1df-4bc3-8f5a-b764f523272b-kube-api-access-mk2xd\") pod \"openshift-apiserver-operator-796bbdcf4f-6zlkc\" (UID: \"4fa0f113-d1df-4bc3-8f5a-b764f523272b\") " 
pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6zlkc" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.921805 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b61a073f-daaf-4b24-8e0d-2d4937aaa601-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-qp9gz\" (UID: \"b61a073f-daaf-4b24-8e0d-2d4937aaa601\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qp9gz" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.921826 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wtd8j\" (UniqueName: \"kubernetes.io/projected/902734d2-7cd4-4f9b-ab9e-72cc5d8f3b73-kube-api-access-wtd8j\") pod \"controller-manager-879f6c89f-rd9pn\" (UID: \"902734d2-7cd4-4f9b-ab9e-72cc5d8f3b73\") " pod="openshift-controller-manager/controller-manager-879f6c89f-rd9pn" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.921852 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xlvth\" (UniqueName: \"kubernetes.io/projected/00f072cc-9501-499e-82b4-027e2d267930-kube-api-access-xlvth\") pod \"console-operator-58897d9998-6g8m8\" (UID: \"00f072cc-9501-499e-82b4-027e2d267930\") " pod="openshift-console-operator/console-operator-58897d9998-6g8m8" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.921874 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5k4cz\" (UniqueName: \"kubernetes.io/projected/eee3485d-8623-414b-8466-8da5e97c08b7-kube-api-access-5k4cz\") pod \"migrator-59844c95c7-lppp8\" (UID: \"eee3485d-8623-414b-8466-8da5e97c08b7\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-lppp8" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.921895 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" 
(UniqueName: \"kubernetes.io/configmap/30d19465-967f-42ad-af2e-983465c989e1-auth-proxy-config\") pod \"machine-config-operator-74547568cd-fz6sl\" (UID: \"30d19465-967f-42ad-af2e-983465c989e1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fz6sl" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.921916 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4fa0f113-d1df-4bc3-8f5a-b764f523272b-config\") pod \"openshift-apiserver-operator-796bbdcf4f-6zlkc\" (UID: \"4fa0f113-d1df-4bc3-8f5a-b764f523272b\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6zlkc" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.921933 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00f072cc-9501-499e-82b4-027e2d267930-config\") pod \"console-operator-58897d9998-6g8m8\" (UID: \"00f072cc-9501-499e-82b4-027e2d267930\") " pod="openshift-console-operator/console-operator-58897d9998-6g8m8" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.921960 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4fa0f113-d1df-4bc3-8f5a-b764f523272b-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-6zlkc\" (UID: \"4fa0f113-d1df-4bc3-8f5a-b764f523272b\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6zlkc" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.921981 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/19dd2c2d-fd12-4884-96d3-50ef117553c7-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-qrngv\" (UID: \"19dd2c2d-fd12-4884-96d3-50ef117553c7\") " 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-qrngv" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.921997 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/74128b07-b5b9-4646-8780-5d28b3a715ae-signing-cabundle\") pod \"service-ca-9c57cc56f-svxr4\" (UID: \"74128b07-b5b9-4646-8780-5d28b3a715ae\") " pod="openshift-service-ca/service-ca-9c57cc56f-svxr4" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.922014 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f500c69a-f58d-4229-bcb5-6f6e5fbdeb3c-config\") pod \"service-ca-operator-777779d784-dvkjf\" (UID: \"f500c69a-f58d-4229-bcb5-6f6e5fbdeb3c\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-dvkjf" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.922035 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ddcbf4b8-5804-4136-8554-6a307825a6ed-oauth-serving-cert\") pod \"console-f9d7485db-q7mhm\" (UID: \"ddcbf4b8-5804-4136-8554-6a307825a6ed\") " pod="openshift-console/console-f9d7485db-q7mhm" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.922056 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llw7n\" (UniqueName: \"kubernetes.io/projected/3fe4d45d-dc04-4b2b-ad77-e3f4dad62c22-kube-api-access-llw7n\") pod \"catalog-operator-68c6474976-fsr4s\" (UID: \"3fe4d45d-dc04-4b2b-ad77-e3f4dad62c22\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fsr4s" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.922073 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2tdl4\" (UniqueName: 
\"kubernetes.io/projected/30d19465-967f-42ad-af2e-983465c989e1-kube-api-access-2tdl4\") pod \"machine-config-operator-74547568cd-fz6sl\" (UID: \"30d19465-967f-42ad-af2e-983465c989e1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fz6sl" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.922094 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ddcbf4b8-5804-4136-8554-6a307825a6ed-console-oauth-config\") pod \"console-f9d7485db-q7mhm\" (UID: \"ddcbf4b8-5804-4136-8554-6a307825a6ed\") " pod="openshift-console/console-f9d7485db-q7mhm" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.922113 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f7d85ea2-5fcb-4744-bd7b-fa309a774ab4-profile-collector-cert\") pod \"olm-operator-6b444d44fb-ld28g\" (UID: \"f7d85ea2-5fcb-4744-bd7b-fa309a774ab4\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ld28g" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.922132 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/1adeefcd-490c-4913-8315-baa7dbc1e7a9-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-nv29v\" (UID: \"1adeefcd-490c-4913-8315-baa7dbc1e7a9\") " pod="openshift-authentication/oauth-openshift-558db77b4-nv29v" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.922161 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b61a073f-daaf-4b24-8e0d-2d4937aaa601-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-qp9gz\" (UID: \"b61a073f-daaf-4b24-8e0d-2d4937aaa601\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qp9gz" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.922177 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d87e36ab-5b0c-4726-9539-e3bf256e63bc-metrics-tls\") pod \"ingress-operator-5b745b69d9-7d79x\" (UID: \"d87e36ab-5b0c-4726-9539-e3bf256e63bc\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7d79x" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.922194 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/30d19465-967f-42ad-af2e-983465c989e1-images\") pod \"machine-config-operator-74547568cd-fz6sl\" (UID: \"30d19465-967f-42ad-af2e-983465c989e1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fz6sl" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.922212 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/0752165f-320d-4555-8ac0-5cf99cd6194e-registration-dir\") pod \"csi-hostpathplugin-5kk5x\" (UID: \"0752165f-320d-4555-8ac0-5cf99cd6194e\") " pod="hostpath-provisioner/csi-hostpathplugin-5kk5x" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.922231 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/1adeefcd-490c-4913-8315-baa7dbc1e7a9-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-nv29v\" (UID: \"1adeefcd-490c-4913-8315-baa7dbc1e7a9\") " pod="openshift-authentication/oauth-openshift-558db77b4-nv29v" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.922253 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-volume\" (UniqueName: \"kubernetes.io/configmap/7e8f451a-ac50-4e0a-bf8e-e6d505305177-config-volume\") pod \"collect-profiles-29410995-cbx2h\" (UID: \"7e8f451a-ac50-4e0a-bf8e-e6d505305177\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410995-cbx2h" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.922271 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/1adeefcd-490c-4913-8315-baa7dbc1e7a9-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-nv29v\" (UID: \"1adeefcd-490c-4913-8315-baa7dbc1e7a9\") " pod="openshift-authentication/oauth-openshift-558db77b4-nv29v" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.922291 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tklzt\" (UniqueName: \"kubernetes.io/projected/4f1d0fc5-528f-4529-938f-7041be573fa7-kube-api-access-tklzt\") pod \"image-registry-697d97f7c8-tjqt9\" (UID: \"4f1d0fc5-528f-4529-938f-7041be573fa7\") " pod="openshift-image-registry/image-registry-697d97f7c8-tjqt9" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.922307 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ddcbf4b8-5804-4136-8554-6a307825a6ed-trusted-ca-bundle\") pod \"console-f9d7485db-q7mhm\" (UID: \"ddcbf4b8-5804-4136-8554-6a307825a6ed\") " pod="openshift-console/console-f9d7485db-q7mhm" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.922326 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/4a9d5b86-ddba-433a-91c3-efe2043f66e3-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-zkq6j\" (UID: \"4a9d5b86-ddba-433a-91c3-efe2043f66e3\") " pod="openshift-marketplace/marketplace-operator-79b997595-zkq6j" Dec 
02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.922345 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4f1d0fc5-528f-4529-938f-7041be573fa7-registry-certificates\") pod \"image-registry-697d97f7c8-tjqt9\" (UID: \"4f1d0fc5-528f-4529-938f-7041be573fa7\") " pod="openshift-image-registry/image-registry-697d97f7c8-tjqt9" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.922364 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/1569ea0e-ca30-4212-95e4-11dde6bca970-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-49t7q\" (UID: \"1569ea0e-ca30-4212-95e4-11dde6bca970\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-49t7q" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.922382 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/1adeefcd-490c-4913-8315-baa7dbc1e7a9-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-nv29v\" (UID: \"1adeefcd-490c-4913-8315-baa7dbc1e7a9\") " pod="openshift-authentication/oauth-openshift-558db77b4-nv29v" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.922401 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/c27769e4-b2f8-4947-96c9-b90bfce6ff0d-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-bvgr6\" (UID: \"c27769e4-b2f8-4947-96c9-b90bfce6ff0d\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bvgr6" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.922418 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: 
\"kubernetes.io/host-path/0752165f-320d-4555-8ac0-5cf99cd6194e-mountpoint-dir\") pod \"csi-hostpathplugin-5kk5x\" (UID: \"0752165f-320d-4555-8ac0-5cf99cd6194e\") " pod="hostpath-provisioner/csi-hostpathplugin-5kk5x" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.922433 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/30d19465-967f-42ad-af2e-983465c989e1-proxy-tls\") pod \"machine-config-operator-74547568cd-fz6sl\" (UID: \"30d19465-967f-42ad-af2e-983465c989e1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fz6sl" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.922455 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q545t\" (UniqueName: \"kubernetes.io/projected/85410206-3fcb-46c1-ac5d-bc3100b30544-kube-api-access-q545t\") pod \"dns-operator-744455d44c-9mn97\" (UID: \"85410206-3fcb-46c1-ac5d-bc3100b30544\") " pod="openshift-dns-operator/dns-operator-744455d44c-9mn97" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.922474 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d87e36ab-5b0c-4726-9539-e3bf256e63bc-bound-sa-token\") pod \"ingress-operator-5b745b69d9-7d79x\" (UID: \"d87e36ab-5b0c-4726-9539-e3bf256e63bc\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7d79x" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.922494 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0554f1d4-d22d-47f4-9a38-5c2985fb0cc3-config\") pod \"kube-controller-manager-operator-78b949d7b-tcbw8\" (UID: \"0554f1d4-d22d-47f4-9a38-5c2985fb0cc3\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tcbw8" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 
07:25:31.922512 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gtl54\" (UniqueName: \"kubernetes.io/projected/0e8ba2f7-f07b-4532-9620-00662d37f5b9-kube-api-access-gtl54\") pod \"downloads-7954f5f757-d7hb4\" (UID: \"0e8ba2f7-f07b-4532-9620-00662d37f5b9\") " pod="openshift-console/downloads-7954f5f757-d7hb4" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.922544 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4f1d0fc5-528f-4529-938f-7041be573fa7-trusted-ca\") pod \"image-registry-697d97f7c8-tjqt9\" (UID: \"4f1d0fc5-528f-4529-938f-7041be573fa7\") " pod="openshift-image-registry/image-registry-697d97f7c8-tjqt9" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.922569 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/adee7e4a-d71b-4efc-b3fa-6e3ece833722-config\") pod \"route-controller-manager-6576b87f9c-jd7nh\" (UID: \"adee7e4a-d71b-4efc-b3fa-6e3ece833722\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jd7nh" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.922609 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/3fe4d45d-dc04-4b2b-ad77-e3f4dad62c22-profile-collector-cert\") pod \"catalog-operator-68c6474976-fsr4s\" (UID: \"3fe4d45d-dc04-4b2b-ad77-e3f4dad62c22\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fsr4s" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.922629 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/02423153-ef82-4284-b703-cf006e0b8b70-service-ca-bundle\") pod \"router-default-5444994796-lf9fx\" (UID: \"02423153-ef82-4284-b703-cf006e0b8b70\") " 
pod="openshift-ingress/router-default-5444994796-lf9fx" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.922646 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d7b05beb-8c1c-4f69-bf13-199dbf869413-serving-cert\") pod \"apiserver-7bbb656c7d-wdmt6\" (UID: \"d7b05beb-8c1c-4f69-bf13-199dbf869413\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wdmt6" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.922663 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5338ef9d-1a37-4d19-8481-0e0a1de24df4-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-b7rlb\" (UID: \"5338ef9d-1a37-4d19-8481-0e0a1de24df4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-b7rlb" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.922680 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/adee7e4a-d71b-4efc-b3fa-6e3ece833722-client-ca\") pod \"route-controller-manager-6576b87f9c-jd7nh\" (UID: \"adee7e4a-d71b-4efc-b3fa-6e3ece833722\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jd7nh" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.922698 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9fd9\" (UniqueName: \"kubernetes.io/projected/301d951f-f6cc-4833-9e1c-0cf7f82424d3-kube-api-access-d9fd9\") pod \"cluster-image-registry-operator-dc59b4c8b-kpqhf\" (UID: \"301d951f-f6cc-4833-9e1c-0cf7f82424d3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kpqhf" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.922715 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-kc644\" (UniqueName: \"kubernetes.io/projected/74128b07-b5b9-4646-8780-5d28b3a715ae-kube-api-access-kc644\") pod \"service-ca-9c57cc56f-svxr4\" (UID: \"74128b07-b5b9-4646-8780-5d28b3a715ae\") " pod="openshift-service-ca/service-ca-9c57cc56f-svxr4" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.922731 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmc86\" (UniqueName: \"kubernetes.io/projected/8eae663a-aaa8-488b-b46c-3ce28f7e0bb0-kube-api-access-mmc86\") pod \"dns-default-mzn92\" (UID: \"8eae663a-aaa8-488b-b46c-3ce28f7e0bb0\") " pod="openshift-dns/dns-default-mzn92" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.922765 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgr4r\" (UniqueName: \"kubernetes.io/projected/7e8f451a-ac50-4e0a-bf8e-e6d505305177-kube-api-access-tgr4r\") pod \"collect-profiles-29410995-cbx2h\" (UID: \"7e8f451a-ac50-4e0a-bf8e-e6d505305177\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410995-cbx2h" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.922783 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/1adeefcd-490c-4913-8315-baa7dbc1e7a9-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-nv29v\" (UID: \"1adeefcd-490c-4913-8315-baa7dbc1e7a9\") " pod="openshift-authentication/oauth-openshift-558db77b4-nv29v" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.922802 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/349b3a38-fe58-4c38-8008-ac5ba643ddef-tmpfs\") pod \"packageserver-d55dfcdfc-vs2rc\" (UID: \"349b3a38-fe58-4c38-8008-ac5ba643ddef\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vs2rc" Dec 02 07:25:31 
crc kubenswrapper[4895]: I1202 07:25:31.922821 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1adeefcd-490c-4913-8315-baa7dbc1e7a9-audit-dir\") pod \"oauth-openshift-558db77b4-nv29v\" (UID: \"1adeefcd-490c-4913-8315-baa7dbc1e7a9\") " pod="openshift-authentication/oauth-openshift-558db77b4-nv29v" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.922838 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/adee7e4a-d71b-4efc-b3fa-6e3ece833722-serving-cert\") pod \"route-controller-manager-6576b87f9c-jd7nh\" (UID: \"adee7e4a-d71b-4efc-b3fa-6e3ece833722\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jd7nh" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.922857 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/00f072cc-9501-499e-82b4-027e2d267930-trusted-ca\") pod \"console-operator-58897d9998-6g8m8\" (UID: \"00f072cc-9501-499e-82b4-027e2d267930\") " pod="openshift-console-operator/console-operator-58897d9998-6g8m8" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.922875 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/cd87ba31-a340-47a1-a2db-3019015a2c24-etcd-service-ca\") pod \"etcd-operator-b45778765-2tvrt\" (UID: \"cd87ba31-a340-47a1-a2db-3019015a2c24\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2tvrt" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.922892 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-chzt6\" (UniqueName: \"kubernetes.io/projected/5338ef9d-1a37-4d19-8481-0e0a1de24df4-kube-api-access-chzt6\") pod \"kube-storage-version-migrator-operator-b67b599dd-b7rlb\" (UID: 
\"5338ef9d-1a37-4d19-8481-0e0a1de24df4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-b7rlb" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.922910 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/02423153-ef82-4284-b703-cf006e0b8b70-default-certificate\") pod \"router-default-5444994796-lf9fx\" (UID: \"02423153-ef82-4284-b703-cf006e0b8b70\") " pod="openshift-ingress/router-default-5444994796-lf9fx" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.922927 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/3fe4d45d-dc04-4b2b-ad77-e3f4dad62c22-srv-cert\") pod \"catalog-operator-68c6474976-fsr4s\" (UID: \"3fe4d45d-dc04-4b2b-ad77-e3f4dad62c22\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fsr4s" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.922952 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/301d951f-f6cc-4833-9e1c-0cf7f82424d3-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-kpqhf\" (UID: \"301d951f-f6cc-4833-9e1c-0cf7f82424d3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kpqhf" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.922969 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ddcbf4b8-5804-4136-8554-6a307825a6ed-console-serving-cert\") pod \"console-f9d7485db-q7mhm\" (UID: \"ddcbf4b8-5804-4136-8554-6a307825a6ed\") " pod="openshift-console/console-f9d7485db-q7mhm" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.922987 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d7b05beb-8c1c-4f69-bf13-199dbf869413-audit-dir\") pod \"apiserver-7bbb656c7d-wdmt6\" (UID: \"d7b05beb-8c1c-4f69-bf13-199dbf869413\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wdmt6" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.923018 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0554f1d4-d22d-47f4-9a38-5c2985fb0cc3-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-tcbw8\" (UID: \"0554f1d4-d22d-47f4-9a38-5c2985fb0cc3\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tcbw8" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.923036 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/349b3a38-fe58-4c38-8008-ac5ba643ddef-apiservice-cert\") pod \"packageserver-d55dfcdfc-vs2rc\" (UID: \"349b3a38-fe58-4c38-8008-ac5ba643ddef\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vs2rc" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.923056 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/902734d2-7cd4-4f9b-ab9e-72cc5d8f3b73-serving-cert\") pod \"controller-manager-879f6c89f-rd9pn\" (UID: \"902734d2-7cd4-4f9b-ab9e-72cc5d8f3b73\") " pod="openshift-controller-manager/controller-manager-879f6c89f-rd9pn" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.923072 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d7b05beb-8c1c-4f69-bf13-199dbf869413-etcd-client\") pod \"apiserver-7bbb656c7d-wdmt6\" (UID: \"d7b05beb-8c1c-4f69-bf13-199dbf869413\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wdmt6" Dec 02 
07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.923086 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7e8f451a-ac50-4e0a-bf8e-e6d505305177-secret-volume\") pod \"collect-profiles-29410995-cbx2h\" (UID: \"7e8f451a-ac50-4e0a-bf8e-e6d505305177\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410995-cbx2h" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.923105 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b61a073f-daaf-4b24-8e0d-2d4937aaa601-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-qp9gz\" (UID: \"b61a073f-daaf-4b24-8e0d-2d4937aaa601\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qp9gz" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.923133 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/1adeefcd-490c-4913-8315-baa7dbc1e7a9-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-nv29v\" (UID: \"1adeefcd-490c-4913-8315-baa7dbc1e7a9\") " pod="openshift-authentication/oauth-openshift-558db77b4-nv29v" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.923150 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/02423153-ef82-4284-b703-cf006e0b8b70-stats-auth\") pod \"router-default-5444994796-lf9fx\" (UID: \"02423153-ef82-4284-b703-cf006e0b8b70\") " pod="openshift-ingress/router-default-5444994796-lf9fx" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.923171 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cd87ba31-a340-47a1-a2db-3019015a2c24-serving-cert\") pod 
\"etcd-operator-b45778765-2tvrt\" (UID: \"cd87ba31-a340-47a1-a2db-3019015a2c24\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2tvrt" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.923187 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1adeefcd-490c-4913-8315-baa7dbc1e7a9-audit-policies\") pod \"oauth-openshift-558db77b4-nv29v\" (UID: \"1adeefcd-490c-4913-8315-baa7dbc1e7a9\") " pod="openshift-authentication/oauth-openshift-558db77b4-nv29v" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.923203 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/cd87ba31-a340-47a1-a2db-3019015a2c24-etcd-client\") pod \"etcd-operator-b45778765-2tvrt\" (UID: \"cd87ba31-a340-47a1-a2db-3019015a2c24\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2tvrt" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.923219 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4f1d0fc5-528f-4529-938f-7041be573fa7-installation-pull-secrets\") pod \"image-registry-697d97f7c8-tjqt9\" (UID: \"4f1d0fc5-528f-4529-938f-7041be573fa7\") " pod="openshift-image-registry/image-registry-697d97f7c8-tjqt9" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.923236 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/1adeefcd-490c-4913-8315-baa7dbc1e7a9-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-nv29v\" (UID: \"1adeefcd-490c-4913-8315-baa7dbc1e7a9\") " pod="openshift-authentication/oauth-openshift-558db77b4-nv29v" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.923262 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" 
(UniqueName: \"kubernetes.io/host-path/0752165f-320d-4555-8ac0-5cf99cd6194e-csi-data-dir\") pod \"csi-hostpathplugin-5kk5x\" (UID: \"0752165f-320d-4555-8ac0-5cf99cd6194e\") " pod="hostpath-provisioner/csi-hostpathplugin-5kk5x" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.923281 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/85410206-3fcb-46c1-ac5d-bc3100b30544-metrics-tls\") pod \"dns-operator-744455d44c-9mn97\" (UID: \"85410206-3fcb-46c1-ac5d-bc3100b30544\") " pod="openshift-dns-operator/dns-operator-744455d44c-9mn97" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.923300 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k668w\" (UniqueName: \"kubernetes.io/projected/19dd2c2d-fd12-4884-96d3-50ef117553c7-kube-api-access-k668w\") pod \"package-server-manager-789f6589d5-qrngv\" (UID: \"19dd2c2d-fd12-4884-96d3-50ef117553c7\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-qrngv" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.923321 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22h7q\" (UniqueName: \"kubernetes.io/projected/d7b05beb-8c1c-4f69-bf13-199dbf869413-kube-api-access-22h7q\") pod \"apiserver-7bbb656c7d-wdmt6\" (UID: \"d7b05beb-8c1c-4f69-bf13-199dbf869413\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wdmt6" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.923342 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/301d951f-f6cc-4833-9e1c-0cf7f82424d3-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-kpqhf\" (UID: \"301d951f-f6cc-4833-9e1c-0cf7f82424d3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kpqhf" Dec 02 07:25:31 crc 
kubenswrapper[4895]: I1202 07:25:31.923367 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4f1d0fc5-528f-4529-938f-7041be573fa7-registry-tls\") pod \"image-registry-697d97f7c8-tjqt9\" (UID: \"4f1d0fc5-528f-4529-938f-7041be573fa7\") " pod="openshift-image-registry/image-registry-697d97f7c8-tjqt9" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.923384 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4f1d0fc5-528f-4529-938f-7041be573fa7-bound-sa-token\") pod \"image-registry-697d97f7c8-tjqt9\" (UID: \"4f1d0fc5-528f-4529-938f-7041be573fa7\") " pod="openshift-image-registry/image-registry-697d97f7c8-tjqt9" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.923403 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spcdb\" (UniqueName: \"kubernetes.io/projected/168c3e01-42d8-4684-b160-23c5eb559a98-kube-api-access-spcdb\") pod \"multus-admission-controller-857f4d67dd-zj6vq\" (UID: \"168c3e01-42d8-4684-b160-23c5eb559a98\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-zj6vq" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.923422 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2v96\" (UniqueName: \"kubernetes.io/projected/4a9d5b86-ddba-433a-91c3-efe2043f66e3-kube-api-access-r2v96\") pod \"marketplace-operator-79b997595-zkq6j\" (UID: \"4a9d5b86-ddba-433a-91c3-efe2043f66e3\") " pod="openshift-marketplace/marketplace-operator-79b997595-zkq6j" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.923442 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/1569ea0e-ca30-4212-95e4-11dde6bca970-images\") pod \"machine-api-operator-5694c8668f-49t7q\" (UID: 
\"1569ea0e-ca30-4212-95e4-11dde6bca970\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-49t7q" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.923460 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/02423153-ef82-4284-b703-cf006e0b8b70-metrics-certs\") pod \"router-default-5444994796-lf9fx\" (UID: \"02423153-ef82-4284-b703-cf006e0b8b70\") " pod="openshift-ingress/router-default-5444994796-lf9fx" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.923480 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/168c3e01-42d8-4684-b160-23c5eb559a98-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-zj6vq\" (UID: \"168c3e01-42d8-4684-b160-23c5eb559a98\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-zj6vq" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.923500 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/acfc02d2-70d4-4e57-a457-509dc0c91437-node-bootstrap-token\") pod \"machine-config-server-bpn7q\" (UID: \"acfc02d2-70d4-4e57-a457-509dc0c91437\") " pod="openshift-machine-config-operator/machine-config-server-bpn7q" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.923517 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8eae663a-aaa8-488b-b46c-3ce28f7e0bb0-config-volume\") pod \"dns-default-mzn92\" (UID: \"8eae663a-aaa8-488b-b46c-3ce28f7e0bb0\") " pod="openshift-dns/dns-default-mzn92" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.923533 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8gcd7\" (UniqueName: 
\"kubernetes.io/projected/349b3a38-fe58-4c38-8008-ac5ba643ddef-kube-api-access-8gcd7\") pod \"packageserver-d55dfcdfc-vs2rc\" (UID: \"349b3a38-fe58-4c38-8008-ac5ba643ddef\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vs2rc" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.923552 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvwvx\" (UniqueName: \"kubernetes.io/projected/1569ea0e-ca30-4212-95e4-11dde6bca970-kube-api-access-dvwvx\") pod \"machine-api-operator-5694c8668f-49t7q\" (UID: \"1569ea0e-ca30-4212-95e4-11dde6bca970\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-49t7q" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.923572 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/902734d2-7cd4-4f9b-ab9e-72cc5d8f3b73-config\") pod \"controller-manager-879f6c89f-rd9pn\" (UID: \"902734d2-7cd4-4f9b-ab9e-72cc5d8f3b73\") " pod="openshift-controller-manager/controller-manager-879f6c89f-rd9pn" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.923595 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66c26\" (UniqueName: \"kubernetes.io/projected/ddcbf4b8-5804-4136-8554-6a307825a6ed-kube-api-access-66c26\") pod \"console-f9d7485db-q7mhm\" (UID: \"ddcbf4b8-5804-4136-8554-6a307825a6ed\") " pod="openshift-console/console-f9d7485db-q7mhm" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.923613 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/0752165f-320d-4555-8ac0-5cf99cd6194e-plugins-dir\") pod \"csi-hostpathplugin-5kk5x\" (UID: \"0752165f-320d-4555-8ac0-5cf99cd6194e\") " pod="hostpath-provisioner/csi-hostpathplugin-5kk5x" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.923642 4895 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d210c48b-56d9-4385-86dc-da6b7a2cfbee-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-xhq8l\" (UID: \"d210c48b-56d9-4385-86dc-da6b7a2cfbee\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xhq8l" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.923661 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d7b05beb-8c1c-4f69-bf13-199dbf869413-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-wdmt6\" (UID: \"d7b05beb-8c1c-4f69-bf13-199dbf869413\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wdmt6" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.923686 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/cd87ba31-a340-47a1-a2db-3019015a2c24-etcd-ca\") pod \"etcd-operator-b45778765-2tvrt\" (UID: \"cd87ba31-a340-47a1-a2db-3019015a2c24\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2tvrt" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.923704 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdc42\" (UniqueName: \"kubernetes.io/projected/02423153-ef82-4284-b703-cf006e0b8b70-kube-api-access-qdc42\") pod \"router-default-5444994796-lf9fx\" (UID: \"02423153-ef82-4284-b703-cf006e0b8b70\") " pod="openshift-ingress/router-default-5444994796-lf9fx" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.923723 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/1adeefcd-490c-4913-8315-baa7dbc1e7a9-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-nv29v\" (UID: \"1adeefcd-490c-4913-8315-baa7dbc1e7a9\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-nv29v" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.923755 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28k8q\" (UniqueName: \"kubernetes.io/projected/cd87ba31-a340-47a1-a2db-3019015a2c24-kube-api-access-28k8q\") pod \"etcd-operator-b45778765-2tvrt\" (UID: \"cd87ba31-a340-47a1-a2db-3019015a2c24\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2tvrt" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.923773 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1569ea0e-ca30-4212-95e4-11dde6bca970-config\") pod \"machine-api-operator-5694c8668f-49t7q\" (UID: \"1569ea0e-ca30-4212-95e4-11dde6bca970\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-49t7q" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.923790 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z5s67\" (UniqueName: \"kubernetes.io/projected/c27769e4-b2f8-4947-96c9-b90bfce6ff0d-kube-api-access-z5s67\") pod \"control-plane-machine-set-operator-78cbb6b69f-bvgr6\" (UID: \"c27769e4-b2f8-4947-96c9-b90bfce6ff0d\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bvgr6" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.923809 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/301d951f-f6cc-4833-9e1c-0cf7f82424d3-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-kpqhf\" (UID: \"301d951f-f6cc-4833-9e1c-0cf7f82424d3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kpqhf" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.923828 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/ddcbf4b8-5804-4136-8554-6a307825a6ed-service-ca\") pod \"console-f9d7485db-q7mhm\" (UID: \"ddcbf4b8-5804-4136-8554-6a307825a6ed\") " pod="openshift-console/console-f9d7485db-q7mhm" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.923848 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/80353fb9-e36f-4a78-a2e5-8c11d25a94f2-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-4n5nx\" (UID: \"80353fb9-e36f-4a78-a2e5-8c11d25a94f2\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4n5nx" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.923865 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4a9d5b86-ddba-433a-91c3-efe2043f66e3-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-zkq6j\" (UID: \"4a9d5b86-ddba-433a-91c3-efe2043f66e3\") " pod="openshift-marketplace/marketplace-operator-79b997595-zkq6j" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.923885 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zt24s\" (UniqueName: \"kubernetes.io/projected/1adeefcd-490c-4913-8315-baa7dbc1e7a9-kube-api-access-zt24s\") pod \"oauth-openshift-558db77b4-nv29v\" (UID: \"1adeefcd-490c-4913-8315-baa7dbc1e7a9\") " pod="openshift-authentication/oauth-openshift-558db77b4-nv29v" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.923904 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/911ed87d-b3fd-4002-b7b0-b720d7066459-config\") pod \"kube-apiserver-operator-766d6c64bb-rcln9\" (UID: \"911ed87d-b3fd-4002-b7b0-b720d7066459\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-rcln9" Dec 02 07:25:31 crc 
kubenswrapper[4895]: I1202 07:25:31.923922 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/25eaa26b-3997-47d7-9932-8eff551bc799-cert\") pod \"ingress-canary-l82ft\" (UID: \"25eaa26b-3997-47d7-9932-8eff551bc799\") " pod="openshift-ingress-canary/ingress-canary-l82ft" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.923942 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/80353fb9-e36f-4a78-a2e5-8c11d25a94f2-proxy-tls\") pod \"machine-config-controller-84d6567774-4n5nx\" (UID: \"80353fb9-e36f-4a78-a2e5-8c11d25a94f2\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4n5nx" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.923961 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/911ed87d-b3fd-4002-b7b0-b720d7066459-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-rcln9\" (UID: \"911ed87d-b3fd-4002-b7b0-b720d7066459\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-rcln9" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.923977 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/349b3a38-fe58-4c38-8008-ac5ba643ddef-webhook-cert\") pod \"packageserver-d55dfcdfc-vs2rc\" (UID: \"349b3a38-fe58-4c38-8008-ac5ba643ddef\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vs2rc" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.923992 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/74128b07-b5b9-4646-8780-5d28b3a715ae-signing-key\") pod \"service-ca-9c57cc56f-svxr4\" (UID: 
\"74128b07-b5b9-4646-8780-5d28b3a715ae\") " pod="openshift-service-ca/service-ca-9c57cc56f-svxr4" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.924012 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/911ed87d-b3fd-4002-b7b0-b720d7066459-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-rcln9\" (UID: \"911ed87d-b3fd-4002-b7b0-b720d7066459\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-rcln9" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.924030 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0554f1d4-d22d-47f4-9a38-5c2985fb0cc3-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-tcbw8\" (UID: \"0554f1d4-d22d-47f4-9a38-5c2985fb0cc3\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tcbw8" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.924047 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/d7b05beb-8c1c-4f69-bf13-199dbf869413-encryption-config\") pod \"apiserver-7bbb656c7d-wdmt6\" (UID: \"d7b05beb-8c1c-4f69-bf13-199dbf869413\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wdmt6" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.924065 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/1adeefcd-490c-4913-8315-baa7dbc1e7a9-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-nv29v\" (UID: \"1adeefcd-490c-4913-8315-baa7dbc1e7a9\") " pod="openshift-authentication/oauth-openshift-558db77b4-nv29v" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.924084 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/1adeefcd-490c-4913-8315-baa7dbc1e7a9-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-nv29v\" (UID: \"1adeefcd-490c-4913-8315-baa7dbc1e7a9\") " pod="openshift-authentication/oauth-openshift-558db77b4-nv29v" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.924102 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2msbx\" (UniqueName: \"kubernetes.io/projected/d210c48b-56d9-4385-86dc-da6b7a2cfbee-kube-api-access-2msbx\") pod \"cluster-samples-operator-665b6dd947-xhq8l\" (UID: \"d210c48b-56d9-4385-86dc-da6b7a2cfbee\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xhq8l" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.924124 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ddcbf4b8-5804-4136-8554-6a307825a6ed-console-config\") pod \"console-f9d7485db-q7mhm\" (UID: \"ddcbf4b8-5804-4136-8554-6a307825a6ed\") " pod="openshift-console/console-f9d7485db-q7mhm" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.924142 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/902734d2-7cd4-4f9b-ab9e-72cc5d8f3b73-client-ca\") pod \"controller-manager-879f6c89f-rd9pn\" (UID: \"902734d2-7cd4-4f9b-ab9e-72cc5d8f3b73\") " pod="openshift-controller-manager/controller-manager-879f6c89f-rd9pn" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.924157 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/0752165f-320d-4555-8ac0-5cf99cd6194e-socket-dir\") pod \"csi-hostpathplugin-5kk5x\" (UID: \"0752165f-320d-4555-8ac0-5cf99cd6194e\") " pod="hostpath-provisioner/csi-hostpathplugin-5kk5x" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 
07:25:31.924183 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5338ef9d-1a37-4d19-8481-0e0a1de24df4-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-b7rlb\" (UID: \"5338ef9d-1a37-4d19-8481-0e0a1de24df4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-b7rlb" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.924202 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pxkff\" (UniqueName: \"kubernetes.io/projected/adee7e4a-d71b-4efc-b3fa-6e3ece833722-kube-api-access-pxkff\") pod \"route-controller-manager-6576b87f9c-jd7nh\" (UID: \"adee7e4a-d71b-4efc-b3fa-6e3ece833722\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jd7nh" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.924218 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8eae663a-aaa8-488b-b46c-3ce28f7e0bb0-metrics-tls\") pod \"dns-default-mzn92\" (UID: \"8eae663a-aaa8-488b-b46c-3ce28f7e0bb0\") " pod="openshift-dns/dns-default-mzn92" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.924237 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4f1d0fc5-528f-4529-938f-7041be573fa7-ca-trust-extracted\") pod \"image-registry-697d97f7c8-tjqt9\" (UID: \"4f1d0fc5-528f-4529-938f-7041be573fa7\") " pod="openshift-image-registry/image-registry-697d97f7c8-tjqt9" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.924253 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/00f072cc-9501-499e-82b4-027e2d267930-serving-cert\") pod \"console-operator-58897d9998-6g8m8\" (UID: 
\"00f072cc-9501-499e-82b4-027e2d267930\") " pod="openshift-console-operator/console-operator-58897d9998-6g8m8" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.924273 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd87ba31-a340-47a1-a2db-3019015a2c24-config\") pod \"etcd-operator-b45778765-2tvrt\" (UID: \"cd87ba31-a340-47a1-a2db-3019015a2c24\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2tvrt" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.924290 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgvph\" (UniqueName: \"kubernetes.io/projected/25eaa26b-3997-47d7-9932-8eff551bc799-kube-api-access-zgvph\") pod \"ingress-canary-l82ft\" (UID: \"25eaa26b-3997-47d7-9932-8eff551bc799\") " pod="openshift-ingress-canary/ingress-canary-l82ft" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.924306 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/acfc02d2-70d4-4e57-a457-509dc0c91437-certs\") pod \"machine-config-server-bpn7q\" (UID: \"acfc02d2-70d4-4e57-a457-509dc0c91437\") " pod="openshift-machine-config-operator/machine-config-server-bpn7q" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.924322 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-626jq\" (UniqueName: \"kubernetes.io/projected/80353fb9-e36f-4a78-a2e5-8c11d25a94f2-kube-api-access-626jq\") pod \"machine-config-controller-84d6567774-4n5nx\" (UID: \"80353fb9-e36f-4a78-a2e5-8c11d25a94f2\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4n5nx" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.924343 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/d87e36ab-5b0c-4726-9539-e3bf256e63bc-trusted-ca\") pod \"ingress-operator-5b745b69d9-7d79x\" (UID: \"d87e36ab-5b0c-4726-9539-e3bf256e63bc\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7d79x" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.924359 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sq2d8\" (UniqueName: \"kubernetes.io/projected/0752165f-320d-4555-8ac0-5cf99cd6194e-kube-api-access-sq2d8\") pod \"csi-hostpathplugin-5kk5x\" (UID: \"0752165f-320d-4555-8ac0-5cf99cd6194e\") " pod="hostpath-provisioner/csi-hostpathplugin-5kk5x" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.924377 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qsmcx\" (UniqueName: \"kubernetes.io/projected/f7d85ea2-5fcb-4744-bd7b-fa309a774ab4-kube-api-access-qsmcx\") pod \"olm-operator-6b444d44fb-ld28g\" (UID: \"f7d85ea2-5fcb-4744-bd7b-fa309a774ab4\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ld28g" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.924406 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d7b05beb-8c1c-4f69-bf13-199dbf869413-audit-policies\") pod \"apiserver-7bbb656c7d-wdmt6\" (UID: \"d7b05beb-8c1c-4f69-bf13-199dbf869413\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wdmt6" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.924421 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/d7b05beb-8c1c-4f69-bf13-199dbf869413-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-wdmt6\" (UID: \"d7b05beb-8c1c-4f69-bf13-199dbf869413\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wdmt6" Dec 02 07:25:31 crc kubenswrapper[4895]: 
I1202 07:25:31.925710 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1adeefcd-490c-4913-8315-baa7dbc1e7a9-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-nv29v\" (UID: \"1adeefcd-490c-4913-8315-baa7dbc1e7a9\") " pod="openshift-authentication/oauth-openshift-558db77b4-nv29v" Dec 02 07:25:31 crc kubenswrapper[4895]: E1202 07:25:31.925959 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 07:25:32.425934092 +0000 UTC m=+143.596793705 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.926452 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4f1d0fc5-528f-4529-938f-7041be573fa7-registry-certificates\") pod \"image-registry-697d97f7c8-tjqt9\" (UID: \"4f1d0fc5-528f-4529-938f-7041be573fa7\") " pod="openshift-image-registry/image-registry-697d97f7c8-tjqt9" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.926876 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ddcbf4b8-5804-4136-8554-6a307825a6ed-console-config\") pod \"console-f9d7485db-q7mhm\" (UID: \"ddcbf4b8-5804-4136-8554-6a307825a6ed\") " 
pod="openshift-console/console-f9d7485db-q7mhm" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.927427 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/902734d2-7cd4-4f9b-ab9e-72cc5d8f3b73-client-ca\") pod \"controller-manager-879f6c89f-rd9pn\" (UID: \"902734d2-7cd4-4f9b-ab9e-72cc5d8f3b73\") " pod="openshift-controller-manager/controller-manager-879f6c89f-rd9pn" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.927629 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/902734d2-7cd4-4f9b-ab9e-72cc5d8f3b73-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-rd9pn\" (UID: \"902734d2-7cd4-4f9b-ab9e-72cc5d8f3b73\") " pod="openshift-controller-manager/controller-manager-879f6c89f-rd9pn" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.930368 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1adeefcd-490c-4913-8315-baa7dbc1e7a9-audit-dir\") pod \"oauth-openshift-558db77b4-nv29v\" (UID: \"1adeefcd-490c-4913-8315-baa7dbc1e7a9\") " pod="openshift-authentication/oauth-openshift-558db77b4-nv29v" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.931115 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/cd87ba31-a340-47a1-a2db-3019015a2c24-etcd-service-ca\") pod \"etcd-operator-b45778765-2tvrt\" (UID: \"cd87ba31-a340-47a1-a2db-3019015a2c24\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2tvrt" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.932001 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0554f1d4-d22d-47f4-9a38-5c2985fb0cc3-config\") pod \"kube-controller-manager-operator-78b949d7b-tcbw8\" (UID: \"0554f1d4-d22d-47f4-9a38-5c2985fb0cc3\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tcbw8" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.932062 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/02423153-ef82-4284-b703-cf006e0b8b70-service-ca-bundle\") pod \"router-default-5444994796-lf9fx\" (UID: \"02423153-ef82-4284-b703-cf006e0b8b70\") " pod="openshift-ingress/router-default-5444994796-lf9fx" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.933105 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/00f072cc-9501-499e-82b4-027e2d267930-trusted-ca\") pod \"console-operator-58897d9998-6g8m8\" (UID: \"00f072cc-9501-499e-82b4-027e2d267930\") " pod="openshift-console-operator/console-operator-58897d9998-6g8m8" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.933786 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/902734d2-7cd4-4f9b-ab9e-72cc5d8f3b73-serving-cert\") pod \"controller-manager-879f6c89f-rd9pn\" (UID: \"902734d2-7cd4-4f9b-ab9e-72cc5d8f3b73\") " pod="openshift-controller-manager/controller-manager-879f6c89f-rd9pn" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.934861 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/cd87ba31-a340-47a1-a2db-3019015a2c24-etcd-ca\") pod \"etcd-operator-b45778765-2tvrt\" (UID: \"cd87ba31-a340-47a1-a2db-3019015a2c24\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2tvrt" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.935053 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4f1d0fc5-528f-4529-938f-7041be573fa7-trusted-ca\") pod \"image-registry-697d97f7c8-tjqt9\" (UID: 
\"4f1d0fc5-528f-4529-938f-7041be573fa7\") " pod="openshift-image-registry/image-registry-697d97f7c8-tjqt9" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.939969 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4fa0f113-d1df-4bc3-8f5a-b764f523272b-config\") pod \"openshift-apiserver-operator-796bbdcf4f-6zlkc\" (UID: \"4fa0f113-d1df-4bc3-8f5a-b764f523272b\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6zlkc" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.940003 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/00f072cc-9501-499e-82b4-027e2d267930-serving-cert\") pod \"console-operator-58897d9998-6g8m8\" (UID: \"00f072cc-9501-499e-82b4-027e2d267930\") " pod="openshift-console-operator/console-operator-58897d9998-6g8m8" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.940626 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00f072cc-9501-499e-82b4-027e2d267930-config\") pod \"console-operator-58897d9998-6g8m8\" (UID: \"00f072cc-9501-499e-82b4-027e2d267930\") " pod="openshift-console-operator/console-operator-58897d9998-6g8m8" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.940872 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd87ba31-a340-47a1-a2db-3019015a2c24-config\") pod \"etcd-operator-b45778765-2tvrt\" (UID: \"cd87ba31-a340-47a1-a2db-3019015a2c24\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2tvrt" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.940953 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cd87ba31-a340-47a1-a2db-3019015a2c24-serving-cert\") pod \"etcd-operator-b45778765-2tvrt\" (UID: 
\"cd87ba31-a340-47a1-a2db-3019015a2c24\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2tvrt" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.941296 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0554f1d4-d22d-47f4-9a38-5c2985fb0cc3-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-tcbw8\" (UID: \"0554f1d4-d22d-47f4-9a38-5c2985fb0cc3\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tcbw8" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.941975 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/adee7e4a-d71b-4efc-b3fa-6e3ece833722-serving-cert\") pod \"route-controller-manager-6576b87f9c-jd7nh\" (UID: \"adee7e4a-d71b-4efc-b3fa-6e3ece833722\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jd7nh" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.942469 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b61a073f-daaf-4b24-8e0d-2d4937aaa601-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-qp9gz\" (UID: \"b61a073f-daaf-4b24-8e0d-2d4937aaa601\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qp9gz" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.941990 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/1569ea0e-ca30-4212-95e4-11dde6bca970-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-49t7q\" (UID: \"1569ea0e-ca30-4212-95e4-11dde6bca970\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-49t7q" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.943469 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" 
(UniqueName: \"kubernetes.io/configmap/d87e36ab-5b0c-4726-9539-e3bf256e63bc-trusted-ca\") pod \"ingress-operator-5b745b69d9-7d79x\" (UID: \"d87e36ab-5b0c-4726-9539-e3bf256e63bc\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7d79x" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.945264 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b61a073f-daaf-4b24-8e0d-2d4937aaa601-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-qp9gz\" (UID: \"b61a073f-daaf-4b24-8e0d-2d4937aaa601\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qp9gz" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.945574 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1569ea0e-ca30-4212-95e4-11dde6bca970-config\") pod \"machine-api-operator-5694c8668f-49t7q\" (UID: \"1569ea0e-ca30-4212-95e4-11dde6bca970\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-49t7q" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.946020 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/1569ea0e-ca30-4212-95e4-11dde6bca970-images\") pod \"machine-api-operator-5694c8668f-49t7q\" (UID: \"1569ea0e-ca30-4212-95e4-11dde6bca970\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-49t7q" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.946248 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/911ed87d-b3fd-4002-b7b0-b720d7066459-config\") pod \"kube-apiserver-operator-766d6c64bb-rcln9\" (UID: \"911ed87d-b3fd-4002-b7b0-b720d7066459\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-rcln9" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.946378 4895 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/1adeefcd-490c-4913-8315-baa7dbc1e7a9-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-nv29v\" (UID: \"1adeefcd-490c-4913-8315-baa7dbc1e7a9\") " pod="openshift-authentication/oauth-openshift-558db77b4-nv29v" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.946722 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/911ed87d-b3fd-4002-b7b0-b720d7066459-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-rcln9\" (UID: \"911ed87d-b3fd-4002-b7b0-b720d7066459\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-rcln9" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.950730 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/1adeefcd-490c-4913-8315-baa7dbc1e7a9-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-nv29v\" (UID: \"1adeefcd-490c-4913-8315-baa7dbc1e7a9\") " pod="openshift-authentication/oauth-openshift-558db77b4-nv29v" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.951935 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5338ef9d-1a37-4d19-8481-0e0a1de24df4-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-b7rlb\" (UID: \"5338ef9d-1a37-4d19-8481-0e0a1de24df4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-b7rlb" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.952043 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/301d951f-f6cc-4833-9e1c-0cf7f82424d3-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-kpqhf\" (UID: \"301d951f-f6cc-4833-9e1c-0cf7f82424d3\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kpqhf" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.953203 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/c27769e4-b2f8-4947-96c9-b90bfce6ff0d-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-bvgr6\" (UID: \"c27769e4-b2f8-4947-96c9-b90bfce6ff0d\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bvgr6" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.954482 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/1adeefcd-490c-4913-8315-baa7dbc1e7a9-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-nv29v\" (UID: \"1adeefcd-490c-4913-8315-baa7dbc1e7a9\") " pod="openshift-authentication/oauth-openshift-558db77b4-nv29v" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.954782 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/301d951f-f6cc-4833-9e1c-0cf7f82424d3-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-kpqhf\" (UID: \"301d951f-f6cc-4833-9e1c-0cf7f82424d3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kpqhf" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.956373 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/cd87ba31-a340-47a1-a2db-3019015a2c24-etcd-client\") pod \"etcd-operator-b45778765-2tvrt\" (UID: \"cd87ba31-a340-47a1-a2db-3019015a2c24\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2tvrt" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.956841 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/1adeefcd-490c-4913-8315-baa7dbc1e7a9-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-nv29v\" (UID: \"1adeefcd-490c-4913-8315-baa7dbc1e7a9\") " pod="openshift-authentication/oauth-openshift-558db77b4-nv29v" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.957230 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/02423153-ef82-4284-b703-cf006e0b8b70-default-certificate\") pod \"router-default-5444994796-lf9fx\" (UID: \"02423153-ef82-4284-b703-cf006e0b8b70\") " pod="openshift-ingress/router-default-5444994796-lf9fx" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.962650 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ddcbf4b8-5804-4136-8554-6a307825a6ed-console-serving-cert\") pod \"console-f9d7485db-q7mhm\" (UID: \"ddcbf4b8-5804-4136-8554-6a307825a6ed\") " pod="openshift-console/console-f9d7485db-q7mhm" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.963336 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/1adeefcd-490c-4913-8315-baa7dbc1e7a9-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-nv29v\" (UID: \"1adeefcd-490c-4913-8315-baa7dbc1e7a9\") " pod="openshift-authentication/oauth-openshift-558db77b4-nv29v" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.964241 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/1adeefcd-490c-4913-8315-baa7dbc1e7a9-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-nv29v\" (UID: \"1adeefcd-490c-4913-8315-baa7dbc1e7a9\") " pod="openshift-authentication/oauth-openshift-558db77b4-nv29v" Dec 02 07:25:31 crc 
kubenswrapper[4895]: I1202 07:25:31.964814 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d87e36ab-5b0c-4726-9539-e3bf256e63bc-metrics-tls\") pod \"ingress-operator-5b745b69d9-7d79x\" (UID: \"d87e36ab-5b0c-4726-9539-e3bf256e63bc\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7d79x" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.970878 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/adee7e4a-d71b-4efc-b3fa-6e3ece833722-config\") pod \"route-controller-manager-6576b87f9c-jd7nh\" (UID: \"adee7e4a-d71b-4efc-b3fa-6e3ece833722\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jd7nh" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.971372 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/1adeefcd-490c-4913-8315-baa7dbc1e7a9-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-nv29v\" (UID: \"1adeefcd-490c-4913-8315-baa7dbc1e7a9\") " pod="openshift-authentication/oauth-openshift-558db77b4-nv29v" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.972926 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4f1d0fc5-528f-4529-938f-7041be573fa7-ca-trust-extracted\") pod \"image-registry-697d97f7c8-tjqt9\" (UID: \"4f1d0fc5-528f-4529-938f-7041be573fa7\") " pod="openshift-image-registry/image-registry-697d97f7c8-tjqt9" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.973433 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/02423153-ef82-4284-b703-cf006e0b8b70-stats-auth\") pod \"router-default-5444994796-lf9fx\" (UID: \"02423153-ef82-4284-b703-cf006e0b8b70\") " 
pod="openshift-ingress/router-default-5444994796-lf9fx" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.973730 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/1adeefcd-490c-4913-8315-baa7dbc1e7a9-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-nv29v\" (UID: \"1adeefcd-490c-4913-8315-baa7dbc1e7a9\") " pod="openshift-authentication/oauth-openshift-558db77b4-nv29v" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.975019 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/85410206-3fcb-46c1-ac5d-bc3100b30544-metrics-tls\") pod \"dns-operator-744455d44c-9mn97\" (UID: \"85410206-3fcb-46c1-ac5d-bc3100b30544\") " pod="openshift-dns-operator/dns-operator-744455d44c-9mn97" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.982356 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4f1d0fc5-528f-4529-938f-7041be573fa7-installation-pull-secrets\") pod \"image-registry-697d97f7c8-tjqt9\" (UID: \"4f1d0fc5-528f-4529-938f-7041be573fa7\") " pod="openshift-image-registry/image-registry-697d97f7c8-tjqt9" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.990479 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1adeefcd-490c-4913-8315-baa7dbc1e7a9-audit-policies\") pod \"oauth-openshift-558db77b4-nv29v\" (UID: \"1adeefcd-490c-4913-8315-baa7dbc1e7a9\") " pod="openshift-authentication/oauth-openshift-558db77b4-nv29v" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.990627 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/902734d2-7cd4-4f9b-ab9e-72cc5d8f3b73-config\") pod \"controller-manager-879f6c89f-rd9pn\" (UID: 
\"902734d2-7cd4-4f9b-ab9e-72cc5d8f3b73\") " pod="openshift-controller-manager/controller-manager-879f6c89f-rd9pn" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.991205 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/1adeefcd-490c-4913-8315-baa7dbc1e7a9-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-nv29v\" (UID: \"1adeefcd-490c-4913-8315-baa7dbc1e7a9\") " pod="openshift-authentication/oauth-openshift-558db77b4-nv29v" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.995454 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ddcbf4b8-5804-4136-8554-6a307825a6ed-oauth-serving-cert\") pod \"console-f9d7485db-q7mhm\" (UID: \"ddcbf4b8-5804-4136-8554-6a307825a6ed\") " pod="openshift-console/console-f9d7485db-q7mhm" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.996442 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/adee7e4a-d71b-4efc-b3fa-6e3ece833722-client-ca\") pod \"route-controller-manager-6576b87f9c-jd7nh\" (UID: \"adee7e4a-d71b-4efc-b3fa-6e3ece833722\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jd7nh" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.999466 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/1adeefcd-490c-4913-8315-baa7dbc1e7a9-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-nv29v\" (UID: \"1adeefcd-490c-4913-8315-baa7dbc1e7a9\") " pod="openshift-authentication/oauth-openshift-558db77b4-nv29v" Dec 02 07:25:31 crc kubenswrapper[4895]: I1202 07:25:31.999480 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/ddcbf4b8-5804-4136-8554-6a307825a6ed-service-ca\") pod \"console-f9d7485db-q7mhm\" (UID: \"ddcbf4b8-5804-4136-8554-6a307825a6ed\") " pod="openshift-console/console-f9d7485db-q7mhm" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.000155 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ddcbf4b8-5804-4136-8554-6a307825a6ed-trusted-ca-bundle\") pod \"console-f9d7485db-q7mhm\" (UID: \"ddcbf4b8-5804-4136-8554-6a307825a6ed\") " pod="openshift-console/console-f9d7485db-q7mhm" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.000344 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ddcbf4b8-5804-4136-8554-6a307825a6ed-console-oauth-config\") pod \"console-f9d7485db-q7mhm\" (UID: \"ddcbf4b8-5804-4136-8554-6a307825a6ed\") " pod="openshift-console/console-f9d7485db-q7mhm" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.000379 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4f1d0fc5-528f-4529-938f-7041be573fa7-registry-tls\") pod \"image-registry-697d97f7c8-tjqt9\" (UID: \"4f1d0fc5-528f-4529-938f-7041be573fa7\") " pod="openshift-image-registry/image-registry-697d97f7c8-tjqt9" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.000677 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5338ef9d-1a37-4d19-8481-0e0a1de24df4-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-b7rlb\" (UID: \"5338ef9d-1a37-4d19-8481-0e0a1de24df4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-b7rlb" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.000997 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" 
(UniqueName: \"kubernetes.io/secret/02423153-ef82-4284-b703-cf006e0b8b70-metrics-certs\") pod \"router-default-5444994796-lf9fx\" (UID: \"02423153-ef82-4284-b703-cf006e0b8b70\") " pod="openshift-ingress/router-default-5444994796-lf9fx" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.001202 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d210c48b-56d9-4385-86dc-da6b7a2cfbee-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-xhq8l\" (UID: \"d210c48b-56d9-4385-86dc-da6b7a2cfbee\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xhq8l" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.005772 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-jfs6x" event={"ID":"c48181d0-7322-418a-8f38-7e3450675f0e","Type":"ContainerStarted","Data":"e51e5ac65f7ade7ce8e9099e0e75a0be1cd2cfed1c4ea53a0f731fdbb7765dd3"} Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.007394 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4fa0f113-d1df-4bc3-8f5a-b764f523272b-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-6zlkc\" (UID: \"4fa0f113-d1df-4bc3-8f5a-b764f523272b\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6zlkc" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.011930 4895 generic.go:334] "Generic (PLEG): container finished" podID="2a937aec-9d85-4924-b88f-69200cab4ee5" containerID="ec68b78cffa5438fc915e8afb5db676b2c84ecf9f2d30ee0020ee40f61d3dd70" exitCode=0 Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.012142 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-s9km8" 
event={"ID":"2a937aec-9d85-4924-b88f-69200cab4ee5","Type":"ContainerDied","Data":"ec68b78cffa5438fc915e8afb5db676b2c84ecf9f2d30ee0020ee40f61d3dd70"} Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.012191 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-s9km8" event={"ID":"2a937aec-9d85-4924-b88f-69200cab4ee5","Type":"ContainerStarted","Data":"9abd92a7ae4f1695f05b9e6d940c52ff9307d618446989f574308bdc147feada"} Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.012860 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2msbx\" (UniqueName: \"kubernetes.io/projected/d210c48b-56d9-4385-86dc-da6b7a2cfbee-kube-api-access-2msbx\") pod \"cluster-samples-operator-665b6dd947-xhq8l\" (UID: \"d210c48b-56d9-4385-86dc-da6b7a2cfbee\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xhq8l" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.014944 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-q5k65" event={"ID":"46f6dbb0-1eb8-4e4c-bf14-977fb7e4fd97","Type":"ContainerStarted","Data":"9af15f589eebb77f6b3d5d236a4ef17feda4ebe4e94f5e9f432272a5bbb440ea"} Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.014970 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-q5k65" event={"ID":"46f6dbb0-1eb8-4e4c-bf14-977fb7e4fd97","Type":"ContainerStarted","Data":"652b6e0cb7fee28cacbae1d83492a85205b2a8ef9caa6b463bcd49028aa2db96"} Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.017367 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9fd9\" (UniqueName: \"kubernetes.io/projected/301d951f-f6cc-4833-9e1c-0cf7f82424d3-kube-api-access-d9fd9\") pod \"cluster-image-registry-operator-dc59b4c8b-kpqhf\" (UID: \"301d951f-f6cc-4833-9e1c-0cf7f82424d3\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kpqhf" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.018437 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-dz88z" event={"ID":"bfe88c24-f4ac-410d-8692-81fe612083e7","Type":"ContainerStarted","Data":"384392b66c6d2303ffa9ee134ecf7dc3799ce4216ef719891ef0104b222432dc"} Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.018527 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-dz88z" event={"ID":"bfe88c24-f4ac-410d-8692-81fe612083e7","Type":"ContainerStarted","Data":"df43ebfc5b10ee5f0c8db7ed88bb4642034393ac15bdec6281ac79606246dc09"} Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.026863 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sq2d8\" (UniqueName: \"kubernetes.io/projected/0752165f-320d-4555-8ac0-5cf99cd6194e-kube-api-access-sq2d8\") pod \"csi-hostpathplugin-5kk5x\" (UID: \"0752165f-320d-4555-8ac0-5cf99cd6194e\") " pod="hostpath-provisioner/csi-hostpathplugin-5kk5x" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.026916 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qsmcx\" (UniqueName: \"kubernetes.io/projected/f7d85ea2-5fcb-4744-bd7b-fa309a774ab4-kube-api-access-qsmcx\") pod \"olm-operator-6b444d44fb-ld28g\" (UID: \"f7d85ea2-5fcb-4744-bd7b-fa309a774ab4\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ld28g" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.026946 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d7b05beb-8c1c-4f69-bf13-199dbf869413-audit-policies\") pod \"apiserver-7bbb656c7d-wdmt6\" (UID: \"d7b05beb-8c1c-4f69-bf13-199dbf869413\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wdmt6" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.026973 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/d7b05beb-8c1c-4f69-bf13-199dbf869413-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-wdmt6\" (UID: \"d7b05beb-8c1c-4f69-bf13-199dbf869413\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wdmt6" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.026997 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f7d85ea2-5fcb-4744-bd7b-fa309a774ab4-srv-cert\") pod \"olm-operator-6b444d44fb-ld28g\" (UID: \"f7d85ea2-5fcb-4744-bd7b-fa309a774ab4\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ld28g" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.027020 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lt6hv\" (UniqueName: \"kubernetes.io/projected/f500c69a-f58d-4229-bcb5-6f6e5fbdeb3c-kube-api-access-lt6hv\") pod \"service-ca-operator-777779d784-dvkjf\" (UID: \"f500c69a-f58d-4229-bcb5-6f6e5fbdeb3c\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-dvkjf" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.027046 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hhhgv\" (UniqueName: \"kubernetes.io/projected/acfc02d2-70d4-4e57-a457-509dc0c91437-kube-api-access-hhhgv\") pod \"machine-config-server-bpn7q\" (UID: \"acfc02d2-70d4-4e57-a457-509dc0c91437\") " pod="openshift-machine-config-operator/machine-config-server-bpn7q" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.027089 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f500c69a-f58d-4229-bcb5-6f6e5fbdeb3c-serving-cert\") pod 
\"service-ca-operator-777779d784-dvkjf\" (UID: \"f500c69a-f58d-4229-bcb5-6f6e5fbdeb3c\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-dvkjf" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.027122 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/30d19465-967f-42ad-af2e-983465c989e1-auth-proxy-config\") pod \"machine-config-operator-74547568cd-fz6sl\" (UID: \"30d19465-967f-42ad-af2e-983465c989e1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fz6sl" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.027193 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/74128b07-b5b9-4646-8780-5d28b3a715ae-signing-cabundle\") pod \"service-ca-9c57cc56f-svxr4\" (UID: \"74128b07-b5b9-4646-8780-5d28b3a715ae\") " pod="openshift-service-ca/service-ca-9c57cc56f-svxr4" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.027221 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/19dd2c2d-fd12-4884-96d3-50ef117553c7-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-qrngv\" (UID: \"19dd2c2d-fd12-4884-96d3-50ef117553c7\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-qrngv" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.027245 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f500c69a-f58d-4229-bcb5-6f6e5fbdeb3c-config\") pod \"service-ca-operator-777779d784-dvkjf\" (UID: \"f500c69a-f58d-4229-bcb5-6f6e5fbdeb3c\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-dvkjf" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.027270 4895 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-llw7n\" (UniqueName: \"kubernetes.io/projected/3fe4d45d-dc04-4b2b-ad77-e3f4dad62c22-kube-api-access-llw7n\") pod \"catalog-operator-68c6474976-fsr4s\" (UID: \"3fe4d45d-dc04-4b2b-ad77-e3f4dad62c22\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fsr4s" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.027300 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2tdl4\" (UniqueName: \"kubernetes.io/projected/30d19465-967f-42ad-af2e-983465c989e1-kube-api-access-2tdl4\") pod \"machine-config-operator-74547568cd-fz6sl\" (UID: \"30d19465-967f-42ad-af2e-983465c989e1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fz6sl" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.027329 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f7d85ea2-5fcb-4744-bd7b-fa309a774ab4-profile-collector-cert\") pod \"olm-operator-6b444d44fb-ld28g\" (UID: \"f7d85ea2-5fcb-4744-bd7b-fa309a774ab4\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ld28g" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.027353 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/30d19465-967f-42ad-af2e-983465c989e1-images\") pod \"machine-config-operator-74547568cd-fz6sl\" (UID: \"30d19465-967f-42ad-af2e-983465c989e1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fz6sl" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.027387 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/0752165f-320d-4555-8ac0-5cf99cd6194e-registration-dir\") pod \"csi-hostpathplugin-5kk5x\" (UID: \"0752165f-320d-4555-8ac0-5cf99cd6194e\") " 
pod="hostpath-provisioner/csi-hostpathplugin-5kk5x" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.027413 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7e8f451a-ac50-4e0a-bf8e-e6d505305177-config-volume\") pod \"collect-profiles-29410995-cbx2h\" (UID: \"7e8f451a-ac50-4e0a-bf8e-e6d505305177\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410995-cbx2h" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.027447 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/4a9d5b86-ddba-433a-91c3-efe2043f66e3-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-zkq6j\" (UID: \"4a9d5b86-ddba-433a-91c3-efe2043f66e3\") " pod="openshift-marketplace/marketplace-operator-79b997595-zkq6j" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.027471 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/0752165f-320d-4555-8ac0-5cf99cd6194e-mountpoint-dir\") pod \"csi-hostpathplugin-5kk5x\" (UID: \"0752165f-320d-4555-8ac0-5cf99cd6194e\") " pod="hostpath-provisioner/csi-hostpathplugin-5kk5x" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.027491 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/30d19465-967f-42ad-af2e-983465c989e1-proxy-tls\") pod \"machine-config-operator-74547568cd-fz6sl\" (UID: \"30d19465-967f-42ad-af2e-983465c989e1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fz6sl" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.027560 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/3fe4d45d-dc04-4b2b-ad77-e3f4dad62c22-profile-collector-cert\") 
pod \"catalog-operator-68c6474976-fsr4s\" (UID: \"3fe4d45d-dc04-4b2b-ad77-e3f4dad62c22\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fsr4s" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.027582 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d7b05beb-8c1c-4f69-bf13-199dbf869413-serving-cert\") pod \"apiserver-7bbb656c7d-wdmt6\" (UID: \"d7b05beb-8c1c-4f69-bf13-199dbf869413\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wdmt6" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.027607 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kc644\" (UniqueName: \"kubernetes.io/projected/74128b07-b5b9-4646-8780-5d28b3a715ae-kube-api-access-kc644\") pod \"service-ca-9c57cc56f-svxr4\" (UID: \"74128b07-b5b9-4646-8780-5d28b3a715ae\") " pod="openshift-service-ca/service-ca-9c57cc56f-svxr4" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.027637 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mmc86\" (UniqueName: \"kubernetes.io/projected/8eae663a-aaa8-488b-b46c-3ce28f7e0bb0-kube-api-access-mmc86\") pod \"dns-default-mzn92\" (UID: \"8eae663a-aaa8-488b-b46c-3ce28f7e0bb0\") " pod="openshift-dns/dns-default-mzn92" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.027661 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgr4r\" (UniqueName: \"kubernetes.io/projected/7e8f451a-ac50-4e0a-bf8e-e6d505305177-kube-api-access-tgr4r\") pod \"collect-profiles-29410995-cbx2h\" (UID: \"7e8f451a-ac50-4e0a-bf8e-e6d505305177\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410995-cbx2h" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.027689 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: 
\"kubernetes.io/empty-dir/349b3a38-fe58-4c38-8008-ac5ba643ddef-tmpfs\") pod \"packageserver-d55dfcdfc-vs2rc\" (UID: \"349b3a38-fe58-4c38-8008-ac5ba643ddef\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vs2rc" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.027720 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/3fe4d45d-dc04-4b2b-ad77-e3f4dad62c22-srv-cert\") pod \"catalog-operator-68c6474976-fsr4s\" (UID: \"3fe4d45d-dc04-4b2b-ad77-e3f4dad62c22\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fsr4s" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.027773 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d7b05beb-8c1c-4f69-bf13-199dbf869413-audit-dir\") pod \"apiserver-7bbb656c7d-wdmt6\" (UID: \"d7b05beb-8c1c-4f69-bf13-199dbf869413\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wdmt6" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.027812 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/349b3a38-fe58-4c38-8008-ac5ba643ddef-apiservice-cert\") pod \"packageserver-d55dfcdfc-vs2rc\" (UID: \"349b3a38-fe58-4c38-8008-ac5ba643ddef\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vs2rc" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.027834 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7e8f451a-ac50-4e0a-bf8e-e6d505305177-secret-volume\") pod \"collect-profiles-29410995-cbx2h\" (UID: \"7e8f451a-ac50-4e0a-bf8e-e6d505305177\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410995-cbx2h" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.027859 4895 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d7b05beb-8c1c-4f69-bf13-199dbf869413-etcd-client\") pod \"apiserver-7bbb656c7d-wdmt6\" (UID: \"d7b05beb-8c1c-4f69-bf13-199dbf869413\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wdmt6" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.027899 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tjqt9\" (UID: \"4f1d0fc5-528f-4529-938f-7041be573fa7\") " pod="openshift-image-registry/image-registry-697d97f7c8-tjqt9" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.027923 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/0752165f-320d-4555-8ac0-5cf99cd6194e-csi-data-dir\") pod \"csi-hostpathplugin-5kk5x\" (UID: \"0752165f-320d-4555-8ac0-5cf99cd6194e\") " pod="hostpath-provisioner/csi-hostpathplugin-5kk5x" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.027954 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k668w\" (UniqueName: \"kubernetes.io/projected/19dd2c2d-fd12-4884-96d3-50ef117553c7-kube-api-access-k668w\") pod \"package-server-manager-789f6589d5-qrngv\" (UID: \"19dd2c2d-fd12-4884-96d3-50ef117553c7\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-qrngv" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.027981 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-22h7q\" (UniqueName: \"kubernetes.io/projected/d7b05beb-8c1c-4f69-bf13-199dbf869413-kube-api-access-22h7q\") pod \"apiserver-7bbb656c7d-wdmt6\" (UID: \"d7b05beb-8c1c-4f69-bf13-199dbf869413\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wdmt6" Dec 02 07:25:32 crc 
kubenswrapper[4895]: I1202 07:25:32.028007 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-spcdb\" (UniqueName: \"kubernetes.io/projected/168c3e01-42d8-4684-b160-23c5eb559a98-kube-api-access-spcdb\") pod \"multus-admission-controller-857f4d67dd-zj6vq\" (UID: \"168c3e01-42d8-4684-b160-23c5eb559a98\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-zj6vq" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.028030 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r2v96\" (UniqueName: \"kubernetes.io/projected/4a9d5b86-ddba-433a-91c3-efe2043f66e3-kube-api-access-r2v96\") pod \"marketplace-operator-79b997595-zkq6j\" (UID: \"4a9d5b86-ddba-433a-91c3-efe2043f66e3\") " pod="openshift-marketplace/marketplace-operator-79b997595-zkq6j" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.028058 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/168c3e01-42d8-4684-b160-23c5eb559a98-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-zj6vq\" (UID: \"168c3e01-42d8-4684-b160-23c5eb559a98\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-zj6vq" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.028085 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/acfc02d2-70d4-4e57-a457-509dc0c91437-node-bootstrap-token\") pod \"machine-config-server-bpn7q\" (UID: \"acfc02d2-70d4-4e57-a457-509dc0c91437\") " pod="openshift-machine-config-operator/machine-config-server-bpn7q" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.028116 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8gcd7\" (UniqueName: \"kubernetes.io/projected/349b3a38-fe58-4c38-8008-ac5ba643ddef-kube-api-access-8gcd7\") pod 
\"packageserver-d55dfcdfc-vs2rc\" (UID: \"349b3a38-fe58-4c38-8008-ac5ba643ddef\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vs2rc" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.028140 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8eae663a-aaa8-488b-b46c-3ce28f7e0bb0-config-volume\") pod \"dns-default-mzn92\" (UID: \"8eae663a-aaa8-488b-b46c-3ce28f7e0bb0\") " pod="openshift-dns/dns-default-mzn92" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.028168 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/0752165f-320d-4555-8ac0-5cf99cd6194e-plugins-dir\") pod \"csi-hostpathplugin-5kk5x\" (UID: \"0752165f-320d-4555-8ac0-5cf99cd6194e\") " pod="hostpath-provisioner/csi-hostpathplugin-5kk5x" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.028212 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d7b05beb-8c1c-4f69-bf13-199dbf869413-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-wdmt6\" (UID: \"d7b05beb-8c1c-4f69-bf13-199dbf869413\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wdmt6" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.028299 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4a9d5b86-ddba-433a-91c3-efe2043f66e3-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-zkq6j\" (UID: \"4a9d5b86-ddba-433a-91c3-efe2043f66e3\") " pod="openshift-marketplace/marketplace-operator-79b997595-zkq6j" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.028324 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/80353fb9-e36f-4a78-a2e5-8c11d25a94f2-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-4n5nx\" (UID: \"80353fb9-e36f-4a78-a2e5-8c11d25a94f2\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4n5nx" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.028342 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/80353fb9-e36f-4a78-a2e5-8c11d25a94f2-proxy-tls\") pod \"machine-config-controller-84d6567774-4n5nx\" (UID: \"80353fb9-e36f-4a78-a2e5-8c11d25a94f2\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4n5nx" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.028376 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/25eaa26b-3997-47d7-9932-8eff551bc799-cert\") pod \"ingress-canary-l82ft\" (UID: \"25eaa26b-3997-47d7-9932-8eff551bc799\") " pod="openshift-ingress-canary/ingress-canary-l82ft" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.028401 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/74128b07-b5b9-4646-8780-5d28b3a715ae-signing-key\") pod \"service-ca-9c57cc56f-svxr4\" (UID: \"74128b07-b5b9-4646-8780-5d28b3a715ae\") " pod="openshift-service-ca/service-ca-9c57cc56f-svxr4" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.028421 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/349b3a38-fe58-4c38-8008-ac5ba643ddef-webhook-cert\") pod \"packageserver-d55dfcdfc-vs2rc\" (UID: \"349b3a38-fe58-4c38-8008-ac5ba643ddef\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vs2rc" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.028462 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/d7b05beb-8c1c-4f69-bf13-199dbf869413-encryption-config\") pod \"apiserver-7bbb656c7d-wdmt6\" (UID: \"d7b05beb-8c1c-4f69-bf13-199dbf869413\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wdmt6" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.028485 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/0752165f-320d-4555-8ac0-5cf99cd6194e-socket-dir\") pod \"csi-hostpathplugin-5kk5x\" (UID: \"0752165f-320d-4555-8ac0-5cf99cd6194e\") " pod="hostpath-provisioner/csi-hostpathplugin-5kk5x" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.028538 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8eae663a-aaa8-488b-b46c-3ce28f7e0bb0-metrics-tls\") pod \"dns-default-mzn92\" (UID: \"8eae663a-aaa8-488b-b46c-3ce28f7e0bb0\") " pod="openshift-dns/dns-default-mzn92" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.028568 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zgvph\" (UniqueName: \"kubernetes.io/projected/25eaa26b-3997-47d7-9932-8eff551bc799-kube-api-access-zgvph\") pod \"ingress-canary-l82ft\" (UID: \"25eaa26b-3997-47d7-9932-8eff551bc799\") " pod="openshift-ingress-canary/ingress-canary-l82ft" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.028592 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/acfc02d2-70d4-4e57-a457-509dc0c91437-certs\") pod \"machine-config-server-bpn7q\" (UID: \"acfc02d2-70d4-4e57-a457-509dc0c91437\") " pod="openshift-machine-config-operator/machine-config-server-bpn7q" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.028618 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-626jq\" (UniqueName: 
\"kubernetes.io/projected/80353fb9-e36f-4a78-a2e5-8c11d25a94f2-kube-api-access-626jq\") pod \"machine-config-controller-84d6567774-4n5nx\" (UID: \"80353fb9-e36f-4a78-a2e5-8c11d25a94f2\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4n5nx" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.029484 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d7b05beb-8c1c-4f69-bf13-199dbf869413-audit-policies\") pod \"apiserver-7bbb656c7d-wdmt6\" (UID: \"d7b05beb-8c1c-4f69-bf13-199dbf869413\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wdmt6" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.030375 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/0752165f-320d-4555-8ac0-5cf99cd6194e-mountpoint-dir\") pod \"csi-hostpathplugin-5kk5x\" (UID: \"0752165f-320d-4555-8ac0-5cf99cd6194e\") " pod="hostpath-provisioner/csi-hostpathplugin-5kk5x" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.030607 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/349b3a38-fe58-4c38-8008-ac5ba643ddef-tmpfs\") pod \"packageserver-d55dfcdfc-vs2rc\" (UID: \"349b3a38-fe58-4c38-8008-ac5ba643ddef\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vs2rc" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.031146 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/d7b05beb-8c1c-4f69-bf13-199dbf869413-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-wdmt6\" (UID: \"d7b05beb-8c1c-4f69-bf13-199dbf869413\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wdmt6" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.031548 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/30d19465-967f-42ad-af2e-983465c989e1-images\") pod \"machine-config-operator-74547568cd-fz6sl\" (UID: \"30d19465-967f-42ad-af2e-983465c989e1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fz6sl" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.031880 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/0752165f-320d-4555-8ac0-5cf99cd6194e-registration-dir\") pod \"csi-hostpathplugin-5kk5x\" (UID: \"0752165f-320d-4555-8ac0-5cf99cd6194e\") " pod="hostpath-provisioner/csi-hostpathplugin-5kk5x" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.032485 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/74128b07-b5b9-4646-8780-5d28b3a715ae-signing-cabundle\") pod \"service-ca-9c57cc56f-svxr4\" (UID: \"74128b07-b5b9-4646-8780-5d28b3a715ae\") " pod="openshift-service-ca/service-ca-9c57cc56f-svxr4" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.032756 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28k8q\" (UniqueName: \"kubernetes.io/projected/cd87ba31-a340-47a1-a2db-3019015a2c24-kube-api-access-28k8q\") pod \"etcd-operator-b45778765-2tvrt\" (UID: \"cd87ba31-a340-47a1-a2db-3019015a2c24\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2tvrt" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.032803 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7e8f451a-ac50-4e0a-bf8e-e6d505305177-config-volume\") pod \"collect-profiles-29410995-cbx2h\" (UID: \"7e8f451a-ac50-4e0a-bf8e-e6d505305177\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410995-cbx2h" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.033492 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d7b05beb-8c1c-4f69-bf13-199dbf869413-audit-dir\") pod \"apiserver-7bbb656c7d-wdmt6\" (UID: \"d7b05beb-8c1c-4f69-bf13-199dbf869413\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wdmt6" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.033515 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8eae663a-aaa8-488b-b46c-3ce28f7e0bb0-config-volume\") pod \"dns-default-mzn92\" (UID: \"8eae663a-aaa8-488b-b46c-3ce28f7e0bb0\") " pod="openshift-dns/dns-default-mzn92" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.033908 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f7d85ea2-5fcb-4744-bd7b-fa309a774ab4-profile-collector-cert\") pod \"olm-operator-6b444d44fb-ld28g\" (UID: \"f7d85ea2-5fcb-4744-bd7b-fa309a774ab4\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ld28g" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.034463 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f500c69a-f58d-4229-bcb5-6f6e5fbdeb3c-config\") pod \"service-ca-operator-777779d784-dvkjf\" (UID: \"f500c69a-f58d-4229-bcb5-6f6e5fbdeb3c\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-dvkjf" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.035161 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/30d19465-967f-42ad-af2e-983465c989e1-proxy-tls\") pod \"machine-config-operator-74547568cd-fz6sl\" (UID: \"30d19465-967f-42ad-af2e-983465c989e1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fz6sl" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.036148 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" 
(UniqueName: \"kubernetes.io/secret/f7d85ea2-5fcb-4744-bd7b-fa309a774ab4-srv-cert\") pod \"olm-operator-6b444d44fb-ld28g\" (UID: \"f7d85ea2-5fcb-4744-bd7b-fa309a774ab4\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ld28g" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.036238 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/0752165f-320d-4555-8ac0-5cf99cd6194e-socket-dir\") pod \"csi-hostpathplugin-5kk5x\" (UID: \"0752165f-320d-4555-8ac0-5cf99cd6194e\") " pod="hostpath-provisioner/csi-hostpathplugin-5kk5x" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.039150 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/30d19465-967f-42ad-af2e-983465c989e1-auth-proxy-config\") pod \"machine-config-operator-74547568cd-fz6sl\" (UID: \"30d19465-967f-42ad-af2e-983465c989e1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fz6sl" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.039928 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d7b05beb-8c1c-4f69-bf13-199dbf869413-serving-cert\") pod \"apiserver-7bbb656c7d-wdmt6\" (UID: \"d7b05beb-8c1c-4f69-bf13-199dbf869413\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wdmt6" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.040051 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f500c69a-f58d-4229-bcb5-6f6e5fbdeb3c-serving-cert\") pod \"service-ca-operator-777779d784-dvkjf\" (UID: \"f500c69a-f58d-4229-bcb5-6f6e5fbdeb3c\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-dvkjf" Dec 02 07:25:32 crc kubenswrapper[4895]: E1202 07:25:32.040238 4895 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 07:25:32.54017333 +0000 UTC m=+143.711032943 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tjqt9" (UID: "4f1d0fc5-528f-4529-938f-7041be573fa7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.040285 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/0752165f-320d-4555-8ac0-5cf99cd6194e-csi-data-dir\") pod \"csi-hostpathplugin-5kk5x\" (UID: \"0752165f-320d-4555-8ac0-5cf99cd6194e\") " pod="hostpath-provisioner/csi-hostpathplugin-5kk5x" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.040320 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/3fe4d45d-dc04-4b2b-ad77-e3f4dad62c22-profile-collector-cert\") pod \"catalog-operator-68c6474976-fsr4s\" (UID: \"3fe4d45d-dc04-4b2b-ad77-e3f4dad62c22\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fsr4s" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.040524 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/0752165f-320d-4555-8ac0-5cf99cd6194e-plugins-dir\") pod \"csi-hostpathplugin-5kk5x\" (UID: \"0752165f-320d-4555-8ac0-5cf99cd6194e\") " pod="hostpath-provisioner/csi-hostpathplugin-5kk5x" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.041494 4895 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/80353fb9-e36f-4a78-a2e5-8c11d25a94f2-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-4n5nx\" (UID: \"80353fb9-e36f-4a78-a2e5-8c11d25a94f2\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4n5nx" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.041784 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4a9d5b86-ddba-433a-91c3-efe2043f66e3-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-zkq6j\" (UID: \"4a9d5b86-ddba-433a-91c3-efe2043f66e3\") " pod="openshift-marketplace/marketplace-operator-79b997595-zkq6j" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.041843 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d7b05beb-8c1c-4f69-bf13-199dbf869413-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-wdmt6\" (UID: \"d7b05beb-8c1c-4f69-bf13-199dbf869413\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wdmt6" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.045637 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/349b3a38-fe58-4c38-8008-ac5ba643ddef-webhook-cert\") pod \"packageserver-d55dfcdfc-vs2rc\" (UID: \"349b3a38-fe58-4c38-8008-ac5ba643ddef\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vs2rc" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.045921 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/80353fb9-e36f-4a78-a2e5-8c11d25a94f2-proxy-tls\") pod \"machine-config-controller-84d6567774-4n5nx\" (UID: \"80353fb9-e36f-4a78-a2e5-8c11d25a94f2\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4n5nx" 
Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.046141 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/d7b05beb-8c1c-4f69-bf13-199dbf869413-encryption-config\") pod \"apiserver-7bbb656c7d-wdmt6\" (UID: \"d7b05beb-8c1c-4f69-bf13-199dbf869413\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wdmt6" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.046877 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4f1d0fc5-528f-4529-938f-7041be573fa7-bound-sa-token\") pod \"image-registry-697d97f7c8-tjqt9\" (UID: \"4f1d0fc5-528f-4529-938f-7041be573fa7\") " pod="openshift-image-registry/image-registry-697d97f7c8-tjqt9" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.046888 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d7b05beb-8c1c-4f69-bf13-199dbf869413-etcd-client\") pod \"apiserver-7bbb656c7d-wdmt6\" (UID: \"d7b05beb-8c1c-4f69-bf13-199dbf869413\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wdmt6" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.047353 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/3fe4d45d-dc04-4b2b-ad77-e3f4dad62c22-srv-cert\") pod \"catalog-operator-68c6474976-fsr4s\" (UID: \"3fe4d45d-dc04-4b2b-ad77-e3f4dad62c22\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fsr4s" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.047463 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/acfc02d2-70d4-4e57-a457-509dc0c91437-node-bootstrap-token\") pod \"machine-config-server-bpn7q\" (UID: \"acfc02d2-70d4-4e57-a457-509dc0c91437\") " pod="openshift-machine-config-operator/machine-config-server-bpn7q" 
Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.048361 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7e8f451a-ac50-4e0a-bf8e-e6d505305177-secret-volume\") pod \"collect-profiles-29410995-cbx2h\" (UID: \"7e8f451a-ac50-4e0a-bf8e-e6d505305177\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410995-cbx2h" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.049044 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/acfc02d2-70d4-4e57-a457-509dc0c91437-certs\") pod \"machine-config-server-bpn7q\" (UID: \"acfc02d2-70d4-4e57-a457-509dc0c91437\") " pod="openshift-machine-config-operator/machine-config-server-bpn7q" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.049495 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/168c3e01-42d8-4684-b160-23c5eb559a98-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-zj6vq\" (UID: \"168c3e01-42d8-4684-b160-23c5eb559a98\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-zj6vq" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.049526 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/349b3a38-fe58-4c38-8008-ac5ba643ddef-apiservice-cert\") pod \"packageserver-d55dfcdfc-vs2rc\" (UID: \"349b3a38-fe58-4c38-8008-ac5ba643ddef\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vs2rc" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.050820 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/4a9d5b86-ddba-433a-91c3-efe2043f66e3-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-zkq6j\" (UID: \"4a9d5b86-ddba-433a-91c3-efe2043f66e3\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-zkq6j" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.053857 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqmv5\" (UniqueName: \"kubernetes.io/projected/d87e36ab-5b0c-4726-9539-e3bf256e63bc-kube-api-access-dqmv5\") pod \"ingress-operator-5b745b69d9-7d79x\" (UID: \"d87e36ab-5b0c-4726-9539-e3bf256e63bc\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7d79x" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.056613 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/19dd2c2d-fd12-4884-96d3-50ef117553c7-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-qrngv\" (UID: \"19dd2c2d-fd12-4884-96d3-50ef117553c7\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-qrngv" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.057133 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/74128b07-b5b9-4646-8780-5d28b3a715ae-signing-key\") pod \"service-ca-9c57cc56f-svxr4\" (UID: \"74128b07-b5b9-4646-8780-5d28b3a715ae\") " pod="openshift-service-ca/service-ca-9c57cc56f-svxr4" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.058270 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/25eaa26b-3997-47d7-9932-8eff551bc799-cert\") pod \"ingress-canary-l82ft\" (UID: \"25eaa26b-3997-47d7-9932-8eff551bc799\") " pod="openshift-ingress-canary/ingress-canary-l82ft" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.059606 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8eae663a-aaa8-488b-b46c-3ce28f7e0bb0-metrics-tls\") pod \"dns-default-mzn92\" (UID: 
\"8eae663a-aaa8-488b-b46c-3ce28f7e0bb0\") " pod="openshift-dns/dns-default-mzn92" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.073073 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mk2xd\" (UniqueName: \"kubernetes.io/projected/4fa0f113-d1df-4bc3-8f5a-b764f523272b-kube-api-access-mk2xd\") pod \"openshift-apiserver-operator-796bbdcf4f-6zlkc\" (UID: \"4fa0f113-d1df-4bc3-8f5a-b764f523272b\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6zlkc" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.086544 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-482tt"] Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.095504 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvwvx\" (UniqueName: \"kubernetes.io/projected/1569ea0e-ca30-4212-95e4-11dde6bca970-kube-api-access-dvwvx\") pod \"machine-api-operator-5694c8668f-49t7q\" (UID: \"1569ea0e-ca30-4212-95e4-11dde6bca970\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-49t7q" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.110329 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-chzt6\" (UniqueName: \"kubernetes.io/projected/5338ef9d-1a37-4d19-8481-0e0a1de24df4-kube-api-access-chzt6\") pod \"kube-storage-version-migrator-operator-b67b599dd-b7rlb\" (UID: \"5338ef9d-1a37-4d19-8481-0e0a1de24df4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-b7rlb" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.129925 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.129978 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5s67\" (UniqueName: \"kubernetes.io/projected/c27769e4-b2f8-4947-96c9-b90bfce6ff0d-kube-api-access-z5s67\") pod \"control-plane-machine-set-operator-78cbb6b69f-bvgr6\" (UID: \"c27769e4-b2f8-4947-96c9-b90bfce6ff0d\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bvgr6" Dec 02 07:25:32 crc kubenswrapper[4895]: E1202 07:25:32.130055 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 07:25:32.630040593 +0000 UTC m=+143.800900206 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.130716 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tjqt9\" (UID: \"4f1d0fc5-528f-4529-938f-7041be573fa7\") " pod="openshift-image-registry/image-registry-697d97f7c8-tjqt9" Dec 02 07:25:32 crc kubenswrapper[4895]: E1202 07:25:32.132062 4895 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 07:25:32.632052211 +0000 UTC m=+143.802911814 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tjqt9" (UID: "4f1d0fc5-528f-4529-938f-7041be573fa7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.149089 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/301d951f-f6cc-4833-9e1c-0cf7f82424d3-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-kpqhf\" (UID: \"301d951f-f6cc-4833-9e1c-0cf7f82424d3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kpqhf" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.154824 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kpqhf" Dec 02 07:25:32 crc kubenswrapper[4895]: W1202 07:25:32.167966 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode3b447ef_4ff4_4f34_8b4d_144f6022029e.slice/crio-40aec1d3d32faa3ddbf3f1eba9aaa3c9b810a0eee2c9378e79f2f0a7848c6a82 WatchSource:0}: Error finding container 40aec1d3d32faa3ddbf3f1eba9aaa3c9b810a0eee2c9378e79f2f0a7848c6a82: Status 404 returned error can't find the container with id 40aec1d3d32faa3ddbf3f1eba9aaa3c9b810a0eee2c9378e79f2f0a7848c6a82 Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.173894 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxkff\" (UniqueName: \"kubernetes.io/projected/adee7e4a-d71b-4efc-b3fa-6e3ece833722-kube-api-access-pxkff\") pod \"route-controller-manager-6576b87f9c-jd7nh\" (UID: \"adee7e4a-d71b-4efc-b3fa-6e3ece833722\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jd7nh" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.188680 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jd7nh" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.194289 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0554f1d4-d22d-47f4-9a38-5c2985fb0cc3-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-tcbw8\" (UID: \"0554f1d4-d22d-47f4-9a38-5c2985fb0cc3\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tcbw8" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.213649 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q545t\" (UniqueName: \"kubernetes.io/projected/85410206-3fcb-46c1-ac5d-bc3100b30544-kube-api-access-q545t\") pod \"dns-operator-744455d44c-9mn97\" (UID: \"85410206-3fcb-46c1-ac5d-bc3100b30544\") " pod="openshift-dns-operator/dns-operator-744455d44c-9mn97" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.218233 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-b7rlb" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.231434 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d87e36ab-5b0c-4726-9539-e3bf256e63bc-bound-sa-token\") pod \"ingress-operator-5b745b69d9-7d79x\" (UID: \"d87e36ab-5b0c-4726-9539-e3bf256e63bc\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7d79x" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.232114 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 07:25:32 crc kubenswrapper[4895]: E1202 07:25:32.232232 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 07:25:32.732209013 +0000 UTC m=+143.903068626 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.232370 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bvgr6" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.233197 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tjqt9\" (UID: \"4f1d0fc5-528f-4529-938f-7041be573fa7\") " pod="openshift-image-registry/image-registry-697d97f7c8-tjqt9" Dec 02 07:25:32 crc kubenswrapper[4895]: E1202 07:25:32.233648 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 07:25:32.733633262 +0000 UTC m=+143.904492875 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tjqt9" (UID: "4f1d0fc5-528f-4529-938f-7041be573fa7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.253958 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtl54\" (UniqueName: \"kubernetes.io/projected/0e8ba2f7-f07b-4532-9620-00662d37f5b9-kube-api-access-gtl54\") pod \"downloads-7954f5f757-d7hb4\" (UID: \"0e8ba2f7-f07b-4532-9620-00662d37f5b9\") " pod="openshift-console/downloads-7954f5f757-d7hb4" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.258462 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tcbw8" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.277154 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/911ed87d-b3fd-4002-b7b0-b720d7066459-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-rcln9\" (UID: \"911ed87d-b3fd-4002-b7b0-b720d7066459\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-rcln9" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.281032 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-2tvrt" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.287682 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xhq8l" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.292364 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66c26\" (UniqueName: \"kubernetes.io/projected/ddcbf4b8-5804-4136-8554-6a307825a6ed-kube-api-access-66c26\") pod \"console-f9d7485db-q7mhm\" (UID: \"ddcbf4b8-5804-4136-8554-6a307825a6ed\") " pod="openshift-console/console-f9d7485db-q7mhm" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.299824 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-49t7q" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.303191 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6zlkc" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.323702 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdc42\" (UniqueName: \"kubernetes.io/projected/02423153-ef82-4284-b703-cf006e0b8b70-kube-api-access-qdc42\") pod \"router-default-5444994796-lf9fx\" (UID: \"02423153-ef82-4284-b703-cf006e0b8b70\") " pod="openshift-ingress/router-default-5444994796-lf9fx" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.334771 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.334942 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtd8j\" (UniqueName: \"kubernetes.io/projected/902734d2-7cd4-4f9b-ab9e-72cc5d8f3b73-kube-api-access-wtd8j\") pod \"controller-manager-879f6c89f-rd9pn\" (UID: \"902734d2-7cd4-4f9b-ab9e-72cc5d8f3b73\") " pod="openshift-controller-manager/controller-manager-879f6c89f-rd9pn" Dec 02 07:25:32 crc kubenswrapper[4895]: E1202 07:25:32.335399 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 07:25:32.8353722 +0000 UTC m=+144.006231993 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.355621 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlvth\" (UniqueName: \"kubernetes.io/projected/00f072cc-9501-499e-82b4-027e2d267930-kube-api-access-xlvth\") pod \"console-operator-58897d9998-6g8m8\" (UID: \"00f072cc-9501-499e-82b4-027e2d267930\") " pod="openshift-console-operator/console-operator-58897d9998-6g8m8" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.369609 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kpqhf"] Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.374840 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5k4cz\" (UniqueName: \"kubernetes.io/projected/eee3485d-8623-414b-8466-8da5e97c08b7-kube-api-access-5k4cz\") pod \"migrator-59844c95c7-lppp8\" (UID: \"eee3485d-8623-414b-8466-8da5e97c08b7\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-lppp8" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.393263 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zt24s\" (UniqueName: \"kubernetes.io/projected/1adeefcd-490c-4913-8315-baa7dbc1e7a9-kube-api-access-zt24s\") pod \"oauth-openshift-558db77b4-nv29v\" (UID: \"1adeefcd-490c-4913-8315-baa7dbc1e7a9\") " pod="openshift-authentication/oauth-openshift-558db77b4-nv29v" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.400733 4895 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-q7mhm" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.425906 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-jd7nh"] Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.435997 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tjqt9\" (UID: \"4f1d0fc5-528f-4529-938f-7041be573fa7\") " pod="openshift-image-registry/image-registry-697d97f7c8-tjqt9" Dec 02 07:25:32 crc kubenswrapper[4895]: E1202 07:25:32.436552 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 07:25:32.93653552 +0000 UTC m=+144.107395133 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tjqt9" (UID: "4f1d0fc5-528f-4529-938f-7041be573fa7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.440507 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tklzt\" (UniqueName: \"kubernetes.io/projected/4f1d0fc5-528f-4529-938f-7041be573fa7-kube-api-access-tklzt\") pod \"image-registry-697d97f7c8-tjqt9\" (UID: \"4f1d0fc5-528f-4529-938f-7041be573fa7\") " pod="openshift-image-registry/image-registry-697d97f7c8-tjqt9" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.441289 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-9mn97" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.460711 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b61a073f-daaf-4b24-8e0d-2d4937aaa601-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-qp9gz\" (UID: \"b61a073f-daaf-4b24-8e0d-2d4937aaa601\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qp9gz" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.475301 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-rd9pn" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.478245 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgr4r\" (UniqueName: \"kubernetes.io/projected/7e8f451a-ac50-4e0a-bf8e-e6d505305177-kube-api-access-tgr4r\") pod \"collect-profiles-29410995-cbx2h\" (UID: \"7e8f451a-ac50-4e0a-bf8e-e6d505305177\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410995-cbx2h" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.483253 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-d7hb4" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.495453 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-nv29v" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.503270 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sq2d8\" (UniqueName: \"kubernetes.io/projected/0752165f-320d-4555-8ac0-5cf99cd6194e-kube-api-access-sq2d8\") pod \"csi-hostpathplugin-5kk5x\" (UID: \"0752165f-320d-4555-8ac0-5cf99cd6194e\") " pod="hostpath-provisioner/csi-hostpathplugin-5kk5x" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.512556 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7d79x" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.525093 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-rcln9" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.526001 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qsmcx\" (UniqueName: \"kubernetes.io/projected/f7d85ea2-5fcb-4744-bd7b-fa309a774ab4-kube-api-access-qsmcx\") pod \"olm-operator-6b444d44fb-ld28g\" (UID: \"f7d85ea2-5fcb-4744-bd7b-fa309a774ab4\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ld28g" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.538410 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 07:25:32 crc kubenswrapper[4895]: E1202 07:25:32.539101 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 07:25:33.039074769 +0000 UTC m=+144.209934392 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.540734 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-lppp8" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.547354 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-6g8m8" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.547696 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-626jq\" (UniqueName: \"kubernetes.io/projected/80353fb9-e36f-4a78-a2e5-8c11d25a94f2-kube-api-access-626jq\") pod \"machine-config-controller-84d6567774-4n5nx\" (UID: \"80353fb9-e36f-4a78-a2e5-8c11d25a94f2\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4n5nx" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.570385 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qp9gz" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.571080 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-lf9fx" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.578965 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2tdl4\" (UniqueName: \"kubernetes.io/projected/30d19465-967f-42ad-af2e-983465c989e1-kube-api-access-2tdl4\") pod \"machine-config-operator-74547568cd-fz6sl\" (UID: \"30d19465-967f-42ad-af2e-983465c989e1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fz6sl" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.580529 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lt6hv\" (UniqueName: \"kubernetes.io/projected/f500c69a-f58d-4229-bcb5-6f6e5fbdeb3c-kube-api-access-lt6hv\") pod \"service-ca-operator-777779d784-dvkjf\" (UID: \"f500c69a-f58d-4229-bcb5-6f6e5fbdeb3c\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-dvkjf" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.594248 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhhgv\" (UniqueName: \"kubernetes.io/projected/acfc02d2-70d4-4e57-a457-509dc0c91437-kube-api-access-hhhgv\") pod \"machine-config-server-bpn7q\" (UID: \"acfc02d2-70d4-4e57-a457-509dc0c91437\") " pod="openshift-machine-config-operator/machine-config-server-bpn7q" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.621563 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fz6sl" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.624606 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-llw7n\" (UniqueName: \"kubernetes.io/projected/3fe4d45d-dc04-4b2b-ad77-e3f4dad62c22-kube-api-access-llw7n\") pod \"catalog-operator-68c6474976-fsr4s\" (UID: \"3fe4d45d-dc04-4b2b-ad77-e3f4dad62c22\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fsr4s" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.628684 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4n5nx" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.634445 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kc644\" (UniqueName: \"kubernetes.io/projected/74128b07-b5b9-4646-8780-5d28b3a715ae-kube-api-access-kc644\") pod \"service-ca-9c57cc56f-svxr4\" (UID: \"74128b07-b5b9-4646-8780-5d28b3a715ae\") " pod="openshift-service-ca/service-ca-9c57cc56f-svxr4" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.640658 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tjqt9\" (UID: \"4f1d0fc5-528f-4529-938f-7041be573fa7\") " pod="openshift-image-registry/image-registry-697d97f7c8-tjqt9" Dec 02 07:25:32 crc kubenswrapper[4895]: E1202 07:25:32.641259 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 07:25:33.141227678 +0000 UTC m=+144.312087291 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tjqt9" (UID: "4f1d0fc5-528f-4529-938f-7041be573fa7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.643944 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fsr4s" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.652221 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ld28g" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.660991 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-dvkjf" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.671265 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmc86\" (UniqueName: \"kubernetes.io/projected/8eae663a-aaa8-488b-b46c-3ce28f7e0bb0-kube-api-access-mmc86\") pod \"dns-default-mzn92\" (UID: \"8eae663a-aaa8-488b-b46c-3ce28f7e0bb0\") " pod="openshift-dns/dns-default-mzn92" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.678617 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgvph\" (UniqueName: \"kubernetes.io/projected/25eaa26b-3997-47d7-9932-8eff551bc799-kube-api-access-zgvph\") pod \"ingress-canary-l82ft\" (UID: \"25eaa26b-3997-47d7-9932-8eff551bc799\") " pod="openshift-ingress-canary/ingress-canary-l82ft" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.693044 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-spcdb\" (UniqueName: \"kubernetes.io/projected/168c3e01-42d8-4684-b160-23c5eb559a98-kube-api-access-spcdb\") pod \"multus-admission-controller-857f4d67dd-zj6vq\" (UID: \"168c3e01-42d8-4684-b160-23c5eb559a98\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-zj6vq" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.693275 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29410995-cbx2h" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.693782 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-svxr4" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.713088 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-mzn92" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.714175 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-22h7q\" (UniqueName: \"kubernetes.io/projected/d7b05beb-8c1c-4f69-bf13-199dbf869413-kube-api-access-22h7q\") pod \"apiserver-7bbb656c7d-wdmt6\" (UID: \"d7b05beb-8c1c-4f69-bf13-199dbf869413\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wdmt6" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.723073 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-l82ft" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.734356 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-bpn7q" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.742053 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 07:25:32 crc kubenswrapper[4895]: E1202 07:25:32.742470 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 07:25:33.242452361 +0000 UTC m=+144.413311974 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.750307 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2v96\" (UniqueName: \"kubernetes.io/projected/4a9d5b86-ddba-433a-91c3-efe2043f66e3-kube-api-access-r2v96\") pod \"marketplace-operator-79b997595-zkq6j\" (UID: \"4a9d5b86-ddba-433a-91c3-efe2043f66e3\") " pod="openshift-marketplace/marketplace-operator-79b997595-zkq6j" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.754489 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8gcd7\" (UniqueName: 
\"kubernetes.io/projected/349b3a38-fe58-4c38-8008-ac5ba643ddef-kube-api-access-8gcd7\") pod \"packageserver-d55dfcdfc-vs2rc\" (UID: \"349b3a38-fe58-4c38-8008-ac5ba643ddef\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vs2rc" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.763456 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-5kk5x" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.777437 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k668w\" (UniqueName: \"kubernetes.io/projected/19dd2c2d-fd12-4884-96d3-50ef117553c7-kube-api-access-k668w\") pod \"package-server-manager-789f6589d5-qrngv\" (UID: \"19dd2c2d-fd12-4884-96d3-50ef117553c7\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-qrngv" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.824436 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-b7rlb"] Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.850465 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tjqt9\" (UID: \"4f1d0fc5-528f-4529-938f-7041be573fa7\") " pod="openshift-image-registry/image-registry-697d97f7c8-tjqt9" Dec 02 07:25:32 crc kubenswrapper[4895]: E1202 07:25:32.850897 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 07:25:33.350880355 +0000 UTC m=+144.521739958 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tjqt9" (UID: "4f1d0fc5-528f-4529-938f-7041be573fa7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.859491 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bvgr6"] Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.862778 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-49t7q"] Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.927760 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-zj6vq" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.935516 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vs2rc" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.952520 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 07:25:32 crc kubenswrapper[4895]: E1202 07:25:32.953031 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-02 07:25:33.453006353 +0000 UTC m=+144.623865966 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.990259 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wdmt6" Dec 02 07:25:32 crc kubenswrapper[4895]: I1202 07:25:32.991221 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-qrngv" Dec 02 07:25:33 crc kubenswrapper[4895]: I1202 07:25:33.026684 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-q5k65" event={"ID":"46f6dbb0-1eb8-4e4c-bf14-977fb7e4fd97","Type":"ContainerStarted","Data":"7be65d4a8ec29c1cf1ad5139d051f2a62e7e606a4a889814fb530b4e99d65d90"} Dec 02 07:25:33 crc kubenswrapper[4895]: I1202 07:25:33.028792 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-zkq6j" Dec 02 07:25:33 crc kubenswrapper[4895]: I1202 07:25:33.038575 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-b7rlb" event={"ID":"5338ef9d-1a37-4d19-8481-0e0a1de24df4","Type":"ContainerStarted","Data":"934d8ddb9e79f4fc2c974a034011deb825926aa6356592c9edd461e1160b7d0c"} Dec 02 07:25:33 crc kubenswrapper[4895]: I1202 07:25:33.058690 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tjqt9\" (UID: \"4f1d0fc5-528f-4529-938f-7041be573fa7\") " pod="openshift-image-registry/image-registry-697d97f7c8-tjqt9" Dec 02 07:25:33 crc kubenswrapper[4895]: E1202 07:25:33.059143 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 07:25:33.559126362 +0000 UTC m=+144.729985975 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tjqt9" (UID: "4f1d0fc5-528f-4529-938f-7041be573fa7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:25:33 crc kubenswrapper[4895]: I1202 07:25:33.064810 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jd7nh" event={"ID":"adee7e4a-d71b-4efc-b3fa-6e3ece833722","Type":"ContainerStarted","Data":"dc019b35034da15881d99ce2faf0303536724aa744716b9ddee14e32c2dca168"} Dec 02 07:25:33 crc kubenswrapper[4895]: I1202 07:25:33.064905 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jd7nh" event={"ID":"adee7e4a-d71b-4efc-b3fa-6e3ece833722","Type":"ContainerStarted","Data":"97c318a4a8350ed22c9957c16973494812c6b50b742ea1eaad55560480ff2b12"} Dec 02 07:25:33 crc kubenswrapper[4895]: I1202 07:25:33.065447 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jd7nh" Dec 02 07:25:33 crc kubenswrapper[4895]: I1202 07:25:33.072999 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kpqhf" event={"ID":"301d951f-f6cc-4833-9e1c-0cf7f82424d3","Type":"ContainerStarted","Data":"30545bd2f074d67b70ed363db82459a238aacac0e61fa2f22fd153dfddc23f17"} Dec 02 07:25:33 crc kubenswrapper[4895]: I1202 07:25:33.073080 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kpqhf" 
event={"ID":"301d951f-f6cc-4833-9e1c-0cf7f82424d3","Type":"ContainerStarted","Data":"818e8e74df0d27c4c362bfe6fcb363b04c71051b60f1942d5bb52d8daced9b34"} Dec 02 07:25:33 crc kubenswrapper[4895]: I1202 07:25:33.076440 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-lf9fx" event={"ID":"02423153-ef82-4284-b703-cf006e0b8b70","Type":"ContainerStarted","Data":"4a2eb6cfb407a7c822b0046a2a03517b22f92cd9a35d2f541d6fc51b00e18596"} Dec 02 07:25:33 crc kubenswrapper[4895]: I1202 07:25:33.090998 4895 generic.go:334] "Generic (PLEG): container finished" podID="c48181d0-7322-418a-8f38-7e3450675f0e" containerID="fb8f2a81f54e5c65d2cdd93b5e2c19883b68718b6396f9807dc5ca8fda589c08" exitCode=0 Dec 02 07:25:33 crc kubenswrapper[4895]: I1202 07:25:33.091573 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-jfs6x" event={"ID":"c48181d0-7322-418a-8f38-7e3450675f0e","Type":"ContainerDied","Data":"fb8f2a81f54e5c65d2cdd93b5e2c19883b68718b6396f9807dc5ca8fda589c08"} Dec 02 07:25:33 crc kubenswrapper[4895]: I1202 07:25:33.101087 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-s9km8" event={"ID":"2a937aec-9d85-4924-b88f-69200cab4ee5","Type":"ContainerStarted","Data":"403edc3d8e32d25dfebc314647ab062d11f35df81cf50a5a85a062fe76c7faad"} Dec 02 07:25:33 crc kubenswrapper[4895]: I1202 07:25:33.101192 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-s9km8" Dec 02 07:25:33 crc kubenswrapper[4895]: I1202 07:25:33.108267 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-482tt" event={"ID":"e3b447ef-4ff4-4f34-8b4d-144f6022029e","Type":"ContainerStarted","Data":"faca82cca9e37ddb9931a308e2555c1688da95b50d3cb37fe64649e2098d4333"} Dec 02 07:25:33 crc 
kubenswrapper[4895]: I1202 07:25:33.108356 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-482tt" event={"ID":"e3b447ef-4ff4-4f34-8b4d-144f6022029e","Type":"ContainerStarted","Data":"40aec1d3d32faa3ddbf3f1eba9aaa3c9b810a0eee2c9378e79f2f0a7848c6a82"} Dec 02 07:25:33 crc kubenswrapper[4895]: W1202 07:25:33.113954 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podacfc02d2_70d4_4e57_a457_509dc0c91437.slice/crio-f9338b4018997264fb2c4ada7a4ce4745f9fbcd32029c12cbdc5a4057c49902e WatchSource:0}: Error finding container f9338b4018997264fb2c4ada7a4ce4745f9fbcd32029c12cbdc5a4057c49902e: Status 404 returned error can't find the container with id f9338b4018997264fb2c4ada7a4ce4745f9fbcd32029c12cbdc5a4057c49902e Dec 02 07:25:33 crc kubenswrapper[4895]: I1202 07:25:33.160140 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 07:25:33 crc kubenswrapper[4895]: E1202 07:25:33.160663 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 07:25:33.660641503 +0000 UTC m=+144.831501126 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:25:33 crc kubenswrapper[4895]: I1202 07:25:33.161086 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tjqt9\" (UID: \"4f1d0fc5-528f-4529-938f-7041be573fa7\") " pod="openshift-image-registry/image-registry-697d97f7c8-tjqt9" Dec 02 07:25:33 crc kubenswrapper[4895]: E1202 07:25:33.167636 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 07:25:33.66761617 +0000 UTC m=+144.838475783 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tjqt9" (UID: "4f1d0fc5-528f-4529-938f-7041be573fa7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:25:33 crc kubenswrapper[4895]: I1202 07:25:33.189211 4895 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-jd7nh container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.15:8443/healthz\": dial tcp 10.217.0.15:8443: connect: connection refused" start-of-body= Dec 02 07:25:33 crc kubenswrapper[4895]: I1202 07:25:33.189290 4895 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jd7nh" podUID="adee7e4a-d71b-4efc-b3fa-6e3ece833722" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.15:8443/healthz\": dial tcp 10.217.0.15:8443: connect: connection refused" Dec 02 07:25:33 crc kubenswrapper[4895]: I1202 07:25:33.251375 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-2tvrt"] Dec 02 07:25:33 crc kubenswrapper[4895]: I1202 07:25:33.262356 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tcbw8"] Dec 02 07:25:33 crc kubenswrapper[4895]: I1202 07:25:33.274307 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" 
(UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 07:25:33 crc kubenswrapper[4895]: E1202 07:25:33.276035 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 07:25:33.776018474 +0000 UTC m=+144.946878087 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:25:33 crc kubenswrapper[4895]: I1202 07:25:33.276319 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6zlkc"] Dec 02 07:25:33 crc kubenswrapper[4895]: I1202 07:25:33.299874 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xhq8l"] Dec 02 07:25:33 crc kubenswrapper[4895]: I1202 07:25:33.378590 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tjqt9\" (UID: \"4f1d0fc5-528f-4529-938f-7041be573fa7\") " pod="openshift-image-registry/image-registry-697d97f7c8-tjqt9" Dec 02 07:25:33 crc kubenswrapper[4895]: E1202 07:25:33.379095 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-02 07:25:33.879078177 +0000 UTC m=+145.049937790 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tjqt9" (UID: "4f1d0fc5-528f-4529-938f-7041be573fa7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:25:33 crc kubenswrapper[4895]: I1202 07:25:33.398520 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-9mn97"] Dec 02 07:25:33 crc kubenswrapper[4895]: I1202 07:25:33.446866 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-q7mhm"] Dec 02 07:25:33 crc kubenswrapper[4895]: W1202 07:25:33.450078 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0554f1d4_d22d_47f4_9a38_5c2985fb0cc3.slice/crio-c61491546a5593923a741f2862aa1001bcb29bc8c1e2e9621ac89ebcd65b6abe WatchSource:0}: Error finding container c61491546a5593923a741f2862aa1001bcb29bc8c1e2e9621ac89ebcd65b6abe: Status 404 returned error can't find the container with id c61491546a5593923a741f2862aa1001bcb29bc8c1e2e9621ac89ebcd65b6abe Dec 02 07:25:33 crc kubenswrapper[4895]: I1202 07:25:33.480029 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 07:25:33 crc kubenswrapper[4895]: E1202 07:25:33.480733 4895 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 07:25:33.980711101 +0000 UTC m=+145.151570714 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:25:33 crc kubenswrapper[4895]: I1202 07:25:33.583169 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tjqt9\" (UID: \"4f1d0fc5-528f-4529-938f-7041be573fa7\") " pod="openshift-image-registry/image-registry-697d97f7c8-tjqt9" Dec 02 07:25:33 crc kubenswrapper[4895]: E1202 07:25:33.583683 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 07:25:34.083669392 +0000 UTC m=+145.254529005 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tjqt9" (UID: "4f1d0fc5-528f-4529-938f-7041be573fa7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:25:33 crc kubenswrapper[4895]: I1202 07:25:33.652606 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qp9gz"] Dec 02 07:25:33 crc kubenswrapper[4895]: I1202 07:25:33.662867 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-rd9pn"] Dec 02 07:25:33 crc kubenswrapper[4895]: I1202 07:25:33.677785 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-d7hb4"] Dec 02 07:25:33 crc kubenswrapper[4895]: I1202 07:25:33.684170 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 07:25:33 crc kubenswrapper[4895]: E1202 07:25:33.684386 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 07:25:34.18436155 +0000 UTC m=+145.355221153 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:25:33 crc kubenswrapper[4895]: I1202 07:25:33.685383 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-lppp8"] Dec 02 07:25:33 crc kubenswrapper[4895]: I1202 07:25:33.685642 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tjqt9\" (UID: \"4f1d0fc5-528f-4529-938f-7041be573fa7\") " pod="openshift-image-registry/image-registry-697d97f7c8-tjqt9" Dec 02 07:25:33 crc kubenswrapper[4895]: E1202 07:25:33.686054 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 07:25:34.186046318 +0000 UTC m=+145.356905931 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tjqt9" (UID: "4f1d0fc5-528f-4529-938f-7041be573fa7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:25:34 crc kubenswrapper[4895]: I1202 07:25:33.787522 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 07:25:34 crc kubenswrapper[4895]: E1202 07:25:33.787677 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 07:25:34.287653 +0000 UTC m=+145.458512623 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:25:34 crc kubenswrapper[4895]: I1202 07:25:33.787722 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tjqt9\" (UID: \"4f1d0fc5-528f-4529-938f-7041be573fa7\") " pod="openshift-image-registry/image-registry-697d97f7c8-tjqt9" Dec 02 07:25:34 crc kubenswrapper[4895]: E1202 07:25:33.787985 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 07:25:34.287978879 +0000 UTC m=+145.458838492 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tjqt9" (UID: "4f1d0fc5-528f-4529-938f-7041be573fa7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:25:34 crc kubenswrapper[4895]: I1202 07:25:33.889352 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 07:25:34 crc kubenswrapper[4895]: E1202 07:25:33.889567 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 07:25:34.389533422 +0000 UTC m=+145.560393035 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:25:34 crc kubenswrapper[4895]: I1202 07:25:33.889634 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tjqt9\" (UID: \"4f1d0fc5-528f-4529-938f-7041be573fa7\") " pod="openshift-image-registry/image-registry-697d97f7c8-tjqt9" Dec 02 07:25:34 crc kubenswrapper[4895]: E1202 07:25:33.890014 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 07:25:34.390006695 +0000 UTC m=+145.560866308 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tjqt9" (UID: "4f1d0fc5-528f-4529-938f-7041be573fa7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:25:34 crc kubenswrapper[4895]: I1202 07:25:33.991212 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 07:25:34 crc kubenswrapper[4895]: E1202 07:25:33.991643 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 07:25:34.491626998 +0000 UTC m=+145.662486611 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:25:34 crc kubenswrapper[4895]: I1202 07:25:34.094509 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tjqt9\" (UID: \"4f1d0fc5-528f-4529-938f-7041be573fa7\") " pod="openshift-image-registry/image-registry-697d97f7c8-tjqt9" Dec 02 07:25:34 crc kubenswrapper[4895]: E1202 07:25:34.095164 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 07:25:34.595138664 +0000 UTC m=+145.765998317 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tjqt9" (UID: "4f1d0fc5-528f-4529-938f-7041be573fa7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:25:34 crc kubenswrapper[4895]: I1202 07:25:34.196379 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 07:25:34 crc kubenswrapper[4895]: E1202 07:25:34.196780 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 07:25:34.696757178 +0000 UTC m=+145.867616801 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:25:34 crc kubenswrapper[4895]: I1202 07:25:34.235783 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tcbw8" event={"ID":"0554f1d4-d22d-47f4-9a38-5c2985fb0cc3","Type":"ContainerStarted","Data":"c61491546a5593923a741f2862aa1001bcb29bc8c1e2e9621ac89ebcd65b6abe"} Dec 02 07:25:34 crc kubenswrapper[4895]: I1202 07:25:34.238616 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-q7mhm" event={"ID":"ddcbf4b8-5804-4136-8554-6a307825a6ed","Type":"ContainerStarted","Data":"90a8378c6c7b223b8b65bf85ecfd66e0fd00d4110b86a8ed69270d49e396428e"} Dec 02 07:25:34 crc kubenswrapper[4895]: I1202 07:25:34.242067 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-bpn7q" event={"ID":"acfc02d2-70d4-4e57-a457-509dc0c91437","Type":"ContainerStarted","Data":"f9338b4018997264fb2c4ada7a4ce4745f9fbcd32029c12cbdc5a4057c49902e"} Dec 02 07:25:34 crc kubenswrapper[4895]: I1202 07:25:34.249170 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-2tvrt" event={"ID":"cd87ba31-a340-47a1-a2db-3019015a2c24","Type":"ContainerStarted","Data":"2155474871fe44c9e61122f937dedb105baa79c46aa1dd9d0ff9e0cd2f93f6a3"} Dec 02 07:25:34 crc kubenswrapper[4895]: I1202 07:25:34.251036 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-9mn97" 
event={"ID":"85410206-3fcb-46c1-ac5d-bc3100b30544","Type":"ContainerStarted","Data":"4562a46a04cd0f0a6eb3919f674ea84d2978ef84ca8494e327d1bf0688954e4b"} Dec 02 07:25:34 crc kubenswrapper[4895]: I1202 07:25:34.253645 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-49t7q" event={"ID":"1569ea0e-ca30-4212-95e4-11dde6bca970","Type":"ContainerStarted","Data":"86a33f2d7eec60b7b548b8d7613e042faeda80859f831a63912997d93a88058c"} Dec 02 07:25:34 crc kubenswrapper[4895]: I1202 07:25:34.261952 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bvgr6" event={"ID":"c27769e4-b2f8-4947-96c9-b90bfce6ff0d","Type":"ContainerStarted","Data":"9fa021e06f661cb801bf5ba25e8ce8eb902451cdf894f9d610bff9af448c309e"} Dec 02 07:25:34 crc kubenswrapper[4895]: I1202 07:25:34.267215 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-q5k65" podStartSLOduration=117.267195366 podStartE2EDuration="1m57.267195366s" podCreationTimestamp="2025-12-02 07:23:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:25:34.264495971 +0000 UTC m=+145.435355604" watchObservedRunningTime="2025-12-02 07:25:34.267195366 +0000 UTC m=+145.438054999" Dec 02 07:25:34 crc kubenswrapper[4895]: I1202 07:25:34.277089 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6zlkc" event={"ID":"4fa0f113-d1df-4bc3-8f5a-b764f523272b","Type":"ContainerStarted","Data":"7e489ec99aeb670f3a10cf97984a4f7ef05acc06a610d54da78285e5e1e1e610"} Dec 02 07:25:34 crc kubenswrapper[4895]: I1202 07:25:34.283435 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-lf9fx" 
event={"ID":"02423153-ef82-4284-b703-cf006e0b8b70","Type":"ContainerStarted","Data":"0cda26523ad08878dd5172ec640f37c65217b937867ae74a521fc864cc6527ad"} Dec 02 07:25:34 crc kubenswrapper[4895]: I1202 07:25:34.301282 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tjqt9\" (UID: \"4f1d0fc5-528f-4529-938f-7041be573fa7\") " pod="openshift-image-registry/image-registry-697d97f7c8-tjqt9" Dec 02 07:25:34 crc kubenswrapper[4895]: E1202 07:25:34.301798 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 07:25:34.801773208 +0000 UTC m=+145.972632891 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tjqt9" (UID: "4f1d0fc5-528f-4529-938f-7041be573fa7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:25:34 crc kubenswrapper[4895]: I1202 07:25:34.311164 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-s9km8" podStartSLOduration=117.31114 podStartE2EDuration="1m57.31114s" podCreationTimestamp="2025-12-02 07:23:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:25:34.308029923 +0000 UTC m=+145.478889546" watchObservedRunningTime="2025-12-02 07:25:34.31114 +0000 UTC 
m=+145.481999633" Dec 02 07:25:34 crc kubenswrapper[4895]: I1202 07:25:34.360054 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jd7nh" podStartSLOduration=116.360037254 podStartE2EDuration="1m56.360037254s" podCreationTimestamp="2025-12-02 07:23:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:25:34.35816791 +0000 UTC m=+145.529027523" watchObservedRunningTime="2025-12-02 07:25:34.360037254 +0000 UTC m=+145.530896867" Dec 02 07:25:34 crc kubenswrapper[4895]: I1202 07:25:34.419967 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 07:25:34 crc kubenswrapper[4895]: E1202 07:25:34.428155 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 07:25:34.927971391 +0000 UTC m=+146.098831004 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:25:34 crc kubenswrapper[4895]: I1202 07:25:34.430597 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tjqt9\" (UID: \"4f1d0fc5-528f-4529-938f-7041be573fa7\") " pod="openshift-image-registry/image-registry-697d97f7c8-tjqt9" Dec 02 07:25:34 crc kubenswrapper[4895]: E1202 07:25:34.442047 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 07:25:34.942022165 +0000 UTC m=+146.112881968 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tjqt9" (UID: "4f1d0fc5-528f-4529-938f-7041be573fa7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:25:34 crc kubenswrapper[4895]: I1202 07:25:34.496615 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-dz88z" podStartSLOduration=117.496589458 podStartE2EDuration="1m57.496589458s" podCreationTimestamp="2025-12-02 07:23:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:25:34.426776777 +0000 UTC m=+145.597636400" watchObservedRunningTime="2025-12-02 07:25:34.496589458 +0000 UTC m=+145.667449071" Dec 02 07:25:34 crc kubenswrapper[4895]: I1202 07:25:34.541682 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 07:25:34 crc kubenswrapper[4895]: E1202 07:25:34.542188 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 07:25:35.042167458 +0000 UTC m=+146.213027071 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:25:34 crc kubenswrapper[4895]: I1202 07:25:34.548436 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-482tt" podStartSLOduration=117.548400152 podStartE2EDuration="1m57.548400152s" podCreationTimestamp="2025-12-02 07:23:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:25:34.497874544 +0000 UTC m=+145.668734157" watchObservedRunningTime="2025-12-02 07:25:34.548400152 +0000 UTC m=+145.719259775" Dec 02 07:25:34 crc kubenswrapper[4895]: I1202 07:25:34.577450 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kpqhf" podStartSLOduration=117.577430228 podStartE2EDuration="1m57.577430228s" podCreationTimestamp="2025-12-02 07:23:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:25:34.574497395 +0000 UTC m=+145.745357018" watchObservedRunningTime="2025-12-02 07:25:34.577430228 +0000 UTC m=+145.748289861" Dec 02 07:25:34 crc kubenswrapper[4895]: I1202 07:25:34.581908 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-lf9fx" Dec 02 07:25:34 crc kubenswrapper[4895]: I1202 07:25:34.601320 4895 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-ingress/router-default-5444994796-lf9fx" podStartSLOduration=117.601260897 podStartE2EDuration="1m57.601260897s" podCreationTimestamp="2025-12-02 07:23:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:25:34.598116299 +0000 UTC m=+145.768975932" watchObservedRunningTime="2025-12-02 07:25:34.601260897 +0000 UTC m=+145.772120510" Dec 02 07:25:34 crc kubenswrapper[4895]: I1202 07:25:34.642987 4895 patch_prober.go:28] interesting pod/router-default-5444994796-lf9fx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 02 07:25:34 crc kubenswrapper[4895]: [-]has-synced failed: reason withheld Dec 02 07:25:34 crc kubenswrapper[4895]: [+]process-running ok Dec 02 07:25:34 crc kubenswrapper[4895]: healthz check failed Dec 02 07:25:34 crc kubenswrapper[4895]: I1202 07:25:34.643101 4895 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lf9fx" podUID="02423153-ef82-4284-b703-cf006e0b8b70" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 02 07:25:34 crc kubenswrapper[4895]: I1202 07:25:34.647922 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tjqt9\" (UID: \"4f1d0fc5-528f-4529-938f-7041be573fa7\") " pod="openshift-image-registry/image-registry-697d97f7c8-tjqt9" Dec 02 07:25:34 crc kubenswrapper[4895]: E1202 07:25:34.648403 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-02 07:25:35.1483778 +0000 UTC m=+146.319237423 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tjqt9" (UID: "4f1d0fc5-528f-4529-938f-7041be573fa7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:25:34 crc kubenswrapper[4895]: I1202 07:25:34.738972 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fsr4s"] Dec 02 07:25:34 crc kubenswrapper[4895]: I1202 07:25:34.748641 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-6g8m8"] Dec 02 07:25:34 crc kubenswrapper[4895]: I1202 07:25:34.749357 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 07:25:34 crc kubenswrapper[4895]: E1202 07:25:34.749560 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 07:25:35.249536091 +0000 UTC m=+146.420395704 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:25:34 crc kubenswrapper[4895]: I1202 07:25:34.750302 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tjqt9\" (UID: \"4f1d0fc5-528f-4529-938f-7041be573fa7\") " pod="openshift-image-registry/image-registry-697d97f7c8-tjqt9" Dec 02 07:25:34 crc kubenswrapper[4895]: E1202 07:25:34.750791 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 07:25:35.250769985 +0000 UTC m=+146.421629808 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tjqt9" (UID: "4f1d0fc5-528f-4529-938f-7041be573fa7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:25:34 crc kubenswrapper[4895]: I1202 07:25:34.778427 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-7d79x"] Dec 02 07:25:34 crc kubenswrapper[4895]: W1202 07:25:34.778559 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3fe4d45d_dc04_4b2b_ad77_e3f4dad62c22.slice/crio-2d6edd7b5b965a7c79007bcc2f28cfd20001a39cfccdfc74b64c38644fcb62cc WatchSource:0}: Error finding container 2d6edd7b5b965a7c79007bcc2f28cfd20001a39cfccdfc74b64c38644fcb62cc: Status 404 returned error can't find the container with id 2d6edd7b5b965a7c79007bcc2f28cfd20001a39cfccdfc74b64c38644fcb62cc Dec 02 07:25:34 crc kubenswrapper[4895]: I1202 07:25:34.783853 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-nv29v"] Dec 02 07:25:34 crc kubenswrapper[4895]: W1202 07:25:34.796223 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod00f072cc_9501_499e_82b4_027e2d267930.slice/crio-b99304f0d77de2753ee8376537960b6d127fa001c9025a9859cb22039ae8f185 WatchSource:0}: Error finding container b99304f0d77de2753ee8376537960b6d127fa001c9025a9859cb22039ae8f185: Status 404 returned error can't find the container with id b99304f0d77de2753ee8376537960b6d127fa001c9025a9859cb22039ae8f185 Dec 02 07:25:34 crc kubenswrapper[4895]: W1202 07:25:34.800811 4895 manager.go:1169] Failed to 
process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1adeefcd_490c_4913_8315_baa7dbc1e7a9.slice/crio-afa32e77707ff8dbc0a3a6ca876066c0dac9e5d1e8c58df2650ed95b67c222f0 WatchSource:0}: Error finding container afa32e77707ff8dbc0a3a6ca876066c0dac9e5d1e8c58df2650ed95b67c222f0: Status 404 returned error can't find the container with id afa32e77707ff8dbc0a3a6ca876066c0dac9e5d1e8c58df2650ed95b67c222f0 Dec 02 07:25:34 crc kubenswrapper[4895]: I1202 07:25:34.821094 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jd7nh" Dec 02 07:25:34 crc kubenswrapper[4895]: I1202 07:25:34.853441 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 07:25:34 crc kubenswrapper[4895]: E1202 07:25:34.853968 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 07:25:35.353948062 +0000 UTC m=+146.524807675 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:25:34 crc kubenswrapper[4895]: I1202 07:25:34.911658 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ld28g"] Dec 02 07:25:34 crc kubenswrapper[4895]: I1202 07:25:34.957527 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tjqt9\" (UID: \"4f1d0fc5-528f-4529-938f-7041be573fa7\") " pod="openshift-image-registry/image-registry-697d97f7c8-tjqt9" Dec 02 07:25:34 crc kubenswrapper[4895]: E1202 07:25:34.958015 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 07:25:35.457992974 +0000 UTC m=+146.628852587 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tjqt9" (UID: "4f1d0fc5-528f-4529-938f-7041be573fa7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:25:34 crc kubenswrapper[4895]: I1202 07:25:34.972824 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-5kk5x"] Dec 02 07:25:34 crc kubenswrapper[4895]: I1202 07:25:34.984016 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-dvkjf"] Dec 02 07:25:34 crc kubenswrapper[4895]: I1202 07:25:34.985494 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-rcln9"] Dec 02 07:25:35 crc kubenswrapper[4895]: I1202 07:25:35.014690 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-fz6sl"] Dec 02 07:25:35 crc kubenswrapper[4895]: I1202 07:25:35.018176 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-svxr4"] Dec 02 07:25:35 crc kubenswrapper[4895]: W1202 07:25:35.022528 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod911ed87d_b3fd_4002_b7b0_b720d7066459.slice/crio-c25abb12e7a9cd47705162230abc009014599619886064c3296b8d1efcd6d3df WatchSource:0}: Error finding container c25abb12e7a9cd47705162230abc009014599619886064c3296b8d1efcd6d3df: Status 404 returned error can't find the container with id c25abb12e7a9cd47705162230abc009014599619886064c3296b8d1efcd6d3df Dec 02 07:25:35 crc kubenswrapper[4895]: I1202 07:25:35.059381 4895 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 07:25:35 crc kubenswrapper[4895]: E1202 07:25:35.059761 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 07:25:35.559725941 +0000 UTC m=+146.730585554 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:25:35 crc kubenswrapper[4895]: I1202 07:25:35.121320 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-qrngv"] Dec 02 07:25:35 crc kubenswrapper[4895]: I1202 07:25:35.123475 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29410995-cbx2h"] Dec 02 07:25:35 crc kubenswrapper[4895]: I1202 07:25:35.124559 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-mzn92"] Dec 02 07:25:35 crc kubenswrapper[4895]: I1202 07:25:35.161820 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tjqt9\" (UID: \"4f1d0fc5-528f-4529-938f-7041be573fa7\") " pod="openshift-image-registry/image-registry-697d97f7c8-tjqt9" Dec 02 07:25:35 crc kubenswrapper[4895]: E1202 07:25:35.162299 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 07:25:35.66227615 +0000 UTC m=+146.833135763 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tjqt9" (UID: "4f1d0fc5-528f-4529-938f-7041be573fa7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:25:35 crc kubenswrapper[4895]: W1202 07:25:35.213446 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf500c69a_f58d_4229_bcb5_6f6e5fbdeb3c.slice/crio-6321657b1014613bb1246936daf83d5c51f7bac16b4249fcf5327095068bad3f WatchSource:0}: Error finding container 6321657b1014613bb1246936daf83d5c51f7bac16b4249fcf5327095068bad3f: Status 404 returned error can't find the container with id 6321657b1014613bb1246936daf83d5c51f7bac16b4249fcf5327095068bad3f Dec 02 07:25:35 crc kubenswrapper[4895]: I1202 07:25:35.225526 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vs2rc"] Dec 02 07:25:35 crc kubenswrapper[4895]: I1202 07:25:35.267110 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-l82ft"] Dec 02 07:25:35 crc 
kubenswrapper[4895]: I1202 07:25:35.271310 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 07:25:35 crc kubenswrapper[4895]: I1202 07:25:35.280621 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-4n5nx"] Dec 02 07:25:35 crc kubenswrapper[4895]: E1202 07:25:35.284317 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 07:25:35.784266687 +0000 UTC m=+146.955126300 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:25:35 crc kubenswrapper[4895]: I1202 07:25:35.284377 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-wdmt6"] Dec 02 07:25:35 crc kubenswrapper[4895]: I1202 07:25:35.284851 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tjqt9\" (UID: \"4f1d0fc5-528f-4529-938f-7041be573fa7\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-tjqt9" Dec 02 07:25:35 crc kubenswrapper[4895]: E1202 07:25:35.285434 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 07:25:35.785424949 +0000 UTC m=+146.956284562 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tjqt9" (UID: "4f1d0fc5-528f-4529-938f-7041be573fa7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:25:35 crc kubenswrapper[4895]: I1202 07:25:35.303524 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zkq6j"] Dec 02 07:25:35 crc kubenswrapper[4895]: I1202 07:25:35.306332 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-zj6vq"] Dec 02 07:25:35 crc kubenswrapper[4895]: W1202 07:25:35.320967 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod25eaa26b_3997_47d7_9932_8eff551bc799.slice/crio-7e6fc79df959b1c20afb50c493e1d2ef5cc6b509930284297145e9c119485dc1 WatchSource:0}: Error finding container 7e6fc79df959b1c20afb50c493e1d2ef5cc6b509930284297145e9c119485dc1: Status 404 returned error can't find the container with id 7e6fc79df959b1c20afb50c493e1d2ef5cc6b509930284297145e9c119485dc1 Dec 02 07:25:35 crc kubenswrapper[4895]: I1202 07:25:35.328611 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-6g8m8" 
event={"ID":"00f072cc-9501-499e-82b4-027e2d267930","Type":"ContainerStarted","Data":"b99304f0d77de2753ee8376537960b6d127fa001c9025a9859cb22039ae8f185"} Dec 02 07:25:35 crc kubenswrapper[4895]: I1202 07:25:35.331176 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-rcln9" event={"ID":"911ed87d-b3fd-4002-b7b0-b720d7066459","Type":"ContainerStarted","Data":"c25abb12e7a9cd47705162230abc009014599619886064c3296b8d1efcd6d3df"} Dec 02 07:25:35 crc kubenswrapper[4895]: I1202 07:25:35.334427 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6zlkc" event={"ID":"4fa0f113-d1df-4bc3-8f5a-b764f523272b","Type":"ContainerStarted","Data":"27ccace30515041b14067aa474f5c9a58d931d6324cd5f75772b56c41c62a7bc"} Dec 02 07:25:35 crc kubenswrapper[4895]: I1202 07:25:35.364164 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6zlkc" podStartSLOduration=118.364142559 podStartE2EDuration="1m58.364142559s" podCreationTimestamp="2025-12-02 07:23:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:25:35.358223813 +0000 UTC m=+146.529083426" watchObservedRunningTime="2025-12-02 07:25:35.364142559 +0000 UTC m=+146.535002172" Dec 02 07:25:35 crc kubenswrapper[4895]: I1202 07:25:35.367831 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ld28g" event={"ID":"f7d85ea2-5fcb-4744-bd7b-fa309a774ab4","Type":"ContainerStarted","Data":"c9c268f1dff1abe800d5521825d0263d7311b8979cf7cfd82e8da95fad04a5bb"} Dec 02 07:25:35 crc kubenswrapper[4895]: I1202 07:25:35.381725 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-rd9pn" 
event={"ID":"902734d2-7cd4-4f9b-ab9e-72cc5d8f3b73","Type":"ContainerStarted","Data":"e8c99f357d871d6a97e40e0af79a55dd840c49b505c152139af531c3676efa68"} Dec 02 07:25:35 crc kubenswrapper[4895]: I1202 07:25:35.381799 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-rd9pn" event={"ID":"902734d2-7cd4-4f9b-ab9e-72cc5d8f3b73","Type":"ContainerStarted","Data":"857cf1f2927f712ff263c79fc39238cee0fbc4256252c1abdfb94488b1ae30de"} Dec 02 07:25:35 crc kubenswrapper[4895]: I1202 07:25:35.383165 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-rd9pn" Dec 02 07:25:35 crc kubenswrapper[4895]: I1202 07:25:35.386319 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 07:25:35 crc kubenswrapper[4895]: I1202 07:25:35.386356 4895 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-rd9pn container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.12:8443/healthz\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body= Dec 02 07:25:35 crc kubenswrapper[4895]: I1202 07:25:35.386400 4895 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-rd9pn" podUID="902734d2-7cd4-4f9b-ab9e-72cc5d8f3b73" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.12:8443/healthz\": dial tcp 10.217.0.12:8443: connect: connection refused" Dec 02 07:25:35 crc kubenswrapper[4895]: E1202 07:25:35.386681 4895 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 07:25:35.886641811 +0000 UTC m=+147.057501424 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:25:35 crc kubenswrapper[4895]: I1202 07:25:35.386798 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tjqt9\" (UID: \"4f1d0fc5-528f-4529-938f-7041be573fa7\") " pod="openshift-image-registry/image-registry-697d97f7c8-tjqt9" Dec 02 07:25:35 crc kubenswrapper[4895]: E1202 07:25:35.387278 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 07:25:35.887263899 +0000 UTC m=+147.058123512 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tjqt9" (UID: "4f1d0fc5-528f-4529-938f-7041be573fa7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:25:35 crc kubenswrapper[4895]: I1202 07:25:35.401778 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bvgr6" event={"ID":"c27769e4-b2f8-4947-96c9-b90bfce6ff0d","Type":"ContainerStarted","Data":"7a6f633113e9b5f8abc20c7c0f99453ea779c3d06a7879381db6d367dc58a7a1"} Dec 02 07:25:35 crc kubenswrapper[4895]: I1202 07:25:35.407354 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fsr4s" event={"ID":"3fe4d45d-dc04-4b2b-ad77-e3f4dad62c22","Type":"ContainerStarted","Data":"2d6edd7b5b965a7c79007bcc2f28cfd20001a39cfccdfc74b64c38644fcb62cc"} Dec 02 07:25:35 crc kubenswrapper[4895]: I1202 07:25:35.407780 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-rd9pn" podStartSLOduration=118.407686282 podStartE2EDuration="1m58.407686282s" podCreationTimestamp="2025-12-02 07:23:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:25:35.402462965 +0000 UTC m=+146.573322578" watchObservedRunningTime="2025-12-02 07:25:35.407686282 +0000 UTC m=+146.578545915" Dec 02 07:25:35 crc kubenswrapper[4895]: I1202 07:25:35.465559 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xhq8l" 
event={"ID":"d210c48b-56d9-4385-86dc-da6b7a2cfbee","Type":"ContainerStarted","Data":"07dcd0e2801a115286800999edf7c23b95f128691b11d8c129815c68b248eb15"} Dec 02 07:25:35 crc kubenswrapper[4895]: I1202 07:25:35.466040 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xhq8l" event={"ID":"d210c48b-56d9-4385-86dc-da6b7a2cfbee","Type":"ContainerStarted","Data":"e8397964db1ef4acbbe3adcabc0318bb92a6b4d6f2837fb6dbc53fd8c1b003b3"} Dec 02 07:25:35 crc kubenswrapper[4895]: I1202 07:25:35.482980 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-d7hb4" event={"ID":"0e8ba2f7-f07b-4532-9620-00662d37f5b9","Type":"ContainerStarted","Data":"e8ee7291fb471d2a215d76ca8e6e46fc357b3f0ab560d6d524026b7f960d0ce5"} Dec 02 07:25:35 crc kubenswrapper[4895]: I1202 07:25:35.483943 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-bvgr6" podStartSLOduration=118.483913082 podStartE2EDuration="1m58.483913082s" podCreationTimestamp="2025-12-02 07:23:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:25:35.434615628 +0000 UTC m=+146.605475241" watchObservedRunningTime="2025-12-02 07:25:35.483913082 +0000 UTC m=+146.654772685" Dec 02 07:25:35 crc kubenswrapper[4895]: I1202 07:25:35.484582 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-d7hb4" Dec 02 07:25:35 crc kubenswrapper[4895]: I1202 07:25:35.487973 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 07:25:35 crc kubenswrapper[4895]: E1202 07:25:35.489508 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 07:25:35.989473218 +0000 UTC m=+147.160332831 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:25:35 crc kubenswrapper[4895]: I1202 07:25:35.496213 4895 patch_prober.go:28] interesting pod/downloads-7954f5f757-d7hb4 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Dec 02 07:25:35 crc kubenswrapper[4895]: I1202 07:25:35.496287 4895 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-d7hb4" podUID="0e8ba2f7-f07b-4532-9620-00662d37f5b9" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Dec 02 07:25:35 crc kubenswrapper[4895]: I1202 07:25:35.500406 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fz6sl" event={"ID":"30d19465-967f-42ad-af2e-983465c989e1","Type":"ContainerStarted","Data":"2c658640b2d251195c7ce819058ea784964645f5f135cfb4b6dc7e707eb8620a"} Dec 02 07:25:35 crc kubenswrapper[4895]: I1202 
07:25:35.519398 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-d7hb4" podStartSLOduration=118.519379699 podStartE2EDuration="1m58.519379699s" podCreationTimestamp="2025-12-02 07:23:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:25:35.51871227 +0000 UTC m=+146.689571883" watchObservedRunningTime="2025-12-02 07:25:35.519379699 +0000 UTC m=+146.690239312" Dec 02 07:25:35 crc kubenswrapper[4895]: I1202 07:25:35.524482 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-q7mhm" podStartSLOduration=118.524463581 podStartE2EDuration="1m58.524463581s" podCreationTimestamp="2025-12-02 07:23:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:25:35.481050122 +0000 UTC m=+146.651909735" watchObservedRunningTime="2025-12-02 07:25:35.524463581 +0000 UTC m=+146.695323194" Dec 02 07:25:35 crc kubenswrapper[4895]: I1202 07:25:35.525121 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-b7rlb" event={"ID":"5338ef9d-1a37-4d19-8481-0e0a1de24df4","Type":"ContainerStarted","Data":"8377410f397e27161b9dfb6d2318fb79751c1ec7a47eca95efc721881aa84833"} Dec 02 07:25:35 crc kubenswrapper[4895]: I1202 07:25:35.531351 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7d79x" event={"ID":"d87e36ab-5b0c-4726-9539-e3bf256e63bc","Type":"ContainerStarted","Data":"944f6481fc826d75c153d0458795eb003068c951c30573377f36734a22d455e2"} Dec 02 07:25:35 crc kubenswrapper[4895]: I1202 07:25:35.553598 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-b7rlb" podStartSLOduration=118.553577728 podStartE2EDuration="1m58.553577728s" podCreationTimestamp="2025-12-02 07:23:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:25:35.553033584 +0000 UTC m=+146.723893187" watchObservedRunningTime="2025-12-02 07:25:35.553577728 +0000 UTC m=+146.724437341" Dec 02 07:25:35 crc kubenswrapper[4895]: I1202 07:25:35.576787 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-dvkjf" event={"ID":"f500c69a-f58d-4229-bcb5-6f6e5fbdeb3c","Type":"ContainerStarted","Data":"6321657b1014613bb1246936daf83d5c51f7bac16b4249fcf5327095068bad3f"} Dec 02 07:25:35 crc kubenswrapper[4895]: I1202 07:25:35.578145 4895 patch_prober.go:28] interesting pod/router-default-5444994796-lf9fx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 02 07:25:35 crc kubenswrapper[4895]: [-]has-synced failed: reason withheld Dec 02 07:25:35 crc kubenswrapper[4895]: [+]process-running ok Dec 02 07:25:35 crc kubenswrapper[4895]: healthz check failed Dec 02 07:25:35 crc kubenswrapper[4895]: I1202 07:25:35.578633 4895 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lf9fx" podUID="02423153-ef82-4284-b703-cf006e0b8b70" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 02 07:25:35 crc kubenswrapper[4895]: I1202 07:25:35.582232 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-qrngv" 
event={"ID":"19dd2c2d-fd12-4884-96d3-50ef117553c7","Type":"ContainerStarted","Data":"4a3d7681ff6bcca4469ae155c0b30158bc5aaae978d348ea952d396a8e2b9b4b"} Dec 02 07:25:35 crc kubenswrapper[4895]: I1202 07:25:35.587512 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-svxr4" event={"ID":"74128b07-b5b9-4646-8780-5d28b3a715ae","Type":"ContainerStarted","Data":"af67fa6f1c8e179533e70a329e70c5b1fa512fbe8c85941f23ee13d22af64e90"} Dec 02 07:25:35 crc kubenswrapper[4895]: I1202 07:25:35.591298 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tjqt9\" (UID: \"4f1d0fc5-528f-4529-938f-7041be573fa7\") " pod="openshift-image-registry/image-registry-697d97f7c8-tjqt9" Dec 02 07:25:35 crc kubenswrapper[4895]: E1202 07:25:35.593465 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 07:25:36.093444018 +0000 UTC m=+147.264303631 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tjqt9" (UID: "4f1d0fc5-528f-4529-938f-7041be573fa7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:25:35 crc kubenswrapper[4895]: I1202 07:25:35.594950 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-bpn7q" event={"ID":"acfc02d2-70d4-4e57-a457-509dc0c91437","Type":"ContainerStarted","Data":"dbb23b39389400cd5357401f9886c6bce3a27d3e128340f75898c0a4ecd84395"} Dec 02 07:25:35 crc kubenswrapper[4895]: I1202 07:25:35.602619 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-lppp8" event={"ID":"eee3485d-8623-414b-8466-8da5e97c08b7","Type":"ContainerStarted","Data":"fd5a9052f1d657f68093a15d4a633dfad7e86ac4f771d91369d10fb9d27dee3c"} Dec 02 07:25:35 crc kubenswrapper[4895]: I1202 07:25:35.610214 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tcbw8" event={"ID":"0554f1d4-d22d-47f4-9a38-5c2985fb0cc3","Type":"ContainerStarted","Data":"64096fdf0cb159032cca0309364970bda35e0cb4ecec7d3af5e31bfe7fac3abf"} Dec 02 07:25:35 crc kubenswrapper[4895]: I1202 07:25:35.630973 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-5kk5x" event={"ID":"0752165f-320d-4555-8ac0-5cf99cd6194e","Type":"ContainerStarted","Data":"dce52e87e59278a541ef5530f24bed5f9d6f9afa0238cc4011a92628fb5aad70"} Dec 02 07:25:35 crc kubenswrapper[4895]: I1202 07:25:35.646990 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-machine-config-operator/machine-config-server-bpn7q" podStartSLOduration=6.646963141 podStartE2EDuration="6.646963141s" podCreationTimestamp="2025-12-02 07:25:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:25:35.620117697 +0000 UTC m=+146.790977320" watchObservedRunningTime="2025-12-02 07:25:35.646963141 +0000 UTC m=+146.817822754" Dec 02 07:25:35 crc kubenswrapper[4895]: I1202 07:25:35.648217 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tcbw8" podStartSLOduration=118.648209936 podStartE2EDuration="1m58.648209936s" podCreationTimestamp="2025-12-02 07:23:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:25:35.647486846 +0000 UTC m=+146.818346469" watchObservedRunningTime="2025-12-02 07:25:35.648209936 +0000 UTC m=+146.819069549" Dec 02 07:25:35 crc kubenswrapper[4895]: I1202 07:25:35.670690 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qp9gz" event={"ID":"b61a073f-daaf-4b24-8e0d-2d4937aaa601","Type":"ContainerStarted","Data":"b12fa5d441ae2aa8ea7529fe9040ad54e9a220932772e5757f754704a4f47db3"} Dec 02 07:25:35 crc kubenswrapper[4895]: I1202 07:25:35.670760 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qp9gz" event={"ID":"b61a073f-daaf-4b24-8e0d-2d4937aaa601","Type":"ContainerStarted","Data":"ba9da5ec6213e14c4462641b4b324ee7f46ee323e1b3897f4b79e00358ca06a3"} Dec 02 07:25:35 crc kubenswrapper[4895]: I1202 07:25:35.692582 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 07:25:35 crc kubenswrapper[4895]: E1202 07:25:35.693877 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 07:25:36.193858197 +0000 UTC m=+147.364717810 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:25:35 crc kubenswrapper[4895]: I1202 07:25:35.704654 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-nv29v" event={"ID":"1adeefcd-490c-4913-8315-baa7dbc1e7a9","Type":"ContainerStarted","Data":"afa32e77707ff8dbc0a3a6ca876066c0dac9e5d1e8c58df2650ed95b67c222f0"} Dec 02 07:25:35 crc kubenswrapper[4895]: I1202 07:25:35.710553 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qp9gz" podStartSLOduration=118.710537206 podStartE2EDuration="1m58.710537206s" podCreationTimestamp="2025-12-02 07:23:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:25:35.708503699 +0000 UTC m=+146.879363312" watchObservedRunningTime="2025-12-02 07:25:35.710537206 +0000 UTC 
m=+146.881396819" Dec 02 07:25:35 crc kubenswrapper[4895]: I1202 07:25:35.749623 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-jfs6x" event={"ID":"c48181d0-7322-418a-8f38-7e3450675f0e","Type":"ContainerStarted","Data":"8f5586b41430c167dc1a990b02c28029f1402871cca6002575519f8bec8c061c"} Dec 02 07:25:35 crc kubenswrapper[4895]: I1202 07:25:35.797512 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tjqt9\" (UID: \"4f1d0fc5-528f-4529-938f-7041be573fa7\") " pod="openshift-image-registry/image-registry-697d97f7c8-tjqt9" Dec 02 07:25:35 crc kubenswrapper[4895]: E1202 07:25:35.798067 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 07:25:36.298031903 +0000 UTC m=+147.468891526 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tjqt9" (UID: "4f1d0fc5-528f-4529-938f-7041be573fa7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:25:35 crc kubenswrapper[4895]: I1202 07:25:35.844489 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-49t7q" event={"ID":"1569ea0e-ca30-4212-95e4-11dde6bca970","Type":"ContainerStarted","Data":"39be8150baf3a4c7c1b68cec169fcca646a187079fd8062def76a42b615b14e8"} Dec 02 07:25:35 crc kubenswrapper[4895]: I1202 07:25:35.901324 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 07:25:35 crc kubenswrapper[4895]: E1202 07:25:35.901827 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 07:25:36.401807207 +0000 UTC m=+147.572666820 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:25:36 crc kubenswrapper[4895]: I1202 07:25:36.003684 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tjqt9\" (UID: \"4f1d0fc5-528f-4529-938f-7041be573fa7\") " pod="openshift-image-registry/image-registry-697d97f7c8-tjqt9" Dec 02 07:25:36 crc kubenswrapper[4895]: E1202 07:25:36.012877 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 07:25:36.512852095 +0000 UTC m=+147.683711768 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tjqt9" (UID: "4f1d0fc5-528f-4529-938f-7041be573fa7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:25:36 crc kubenswrapper[4895]: I1202 07:25:36.106416 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 07:25:36 crc kubenswrapper[4895]: E1202 07:25:36.106970 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 07:25:36.606929467 +0000 UTC m=+147.777789080 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:25:36 crc kubenswrapper[4895]: I1202 07:25:36.215796 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tjqt9\" (UID: \"4f1d0fc5-528f-4529-938f-7041be573fa7\") " pod="openshift-image-registry/image-registry-697d97f7c8-tjqt9" Dec 02 07:25:36 crc kubenswrapper[4895]: E1202 07:25:36.216376 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 07:25:36.71635472 +0000 UTC m=+147.887214333 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tjqt9" (UID: "4f1d0fc5-528f-4529-938f-7041be573fa7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:25:36 crc kubenswrapper[4895]: I1202 07:25:36.319150 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 07:25:36 crc kubenswrapper[4895]: E1202 07:25:36.319645 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 07:25:36.819609599 +0000 UTC m=+147.990469212 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:25:36 crc kubenswrapper[4895]: I1202 07:25:36.421155 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tjqt9\" (UID: \"4f1d0fc5-528f-4529-938f-7041be573fa7\") " pod="openshift-image-registry/image-registry-697d97f7c8-tjqt9" Dec 02 07:25:36 crc kubenswrapper[4895]: E1202 07:25:36.422127 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 07:25:36.922108527 +0000 UTC m=+148.092968130 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tjqt9" (UID: "4f1d0fc5-528f-4529-938f-7041be573fa7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:25:36 crc kubenswrapper[4895]: I1202 07:25:36.525400 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 07:25:36 crc kubenswrapper[4895]: E1202 07:25:36.525864 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 07:25:37.02584191 +0000 UTC m=+148.196701523 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:25:36 crc kubenswrapper[4895]: I1202 07:25:36.576810 4895 patch_prober.go:28] interesting pod/router-default-5444994796-lf9fx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 02 07:25:36 crc kubenswrapper[4895]: [-]has-synced failed: reason withheld Dec 02 07:25:36 crc kubenswrapper[4895]: [+]process-running ok Dec 02 07:25:36 crc kubenswrapper[4895]: healthz check failed Dec 02 07:25:36 crc kubenswrapper[4895]: I1202 07:25:36.577239 4895 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lf9fx" podUID="02423153-ef82-4284-b703-cf006e0b8b70" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 02 07:25:36 crc kubenswrapper[4895]: I1202 07:25:36.627537 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tjqt9\" (UID: \"4f1d0fc5-528f-4529-938f-7041be573fa7\") " pod="openshift-image-registry/image-registry-697d97f7c8-tjqt9" Dec 02 07:25:36 crc kubenswrapper[4895]: E1202 07:25:36.627956 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-02 07:25:37.127939908 +0000 UTC m=+148.298799521 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tjqt9" (UID: "4f1d0fc5-528f-4529-938f-7041be573fa7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:25:36 crc kubenswrapper[4895]: I1202 07:25:36.728979 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 07:25:36 crc kubenswrapper[4895]: E1202 07:25:36.729405 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 07:25:37.229387296 +0000 UTC m=+148.400246909 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:25:36 crc kubenswrapper[4895]: I1202 07:25:36.831132 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tjqt9\" (UID: \"4f1d0fc5-528f-4529-938f-7041be573fa7\") " pod="openshift-image-registry/image-registry-697d97f7c8-tjqt9" Dec 02 07:25:36 crc kubenswrapper[4895]: E1202 07:25:36.831573 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 07:25:37.331558835 +0000 UTC m=+148.502418448 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tjqt9" (UID: "4f1d0fc5-528f-4529-938f-7041be573fa7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:25:36 crc kubenswrapper[4895]: I1202 07:25:36.876796 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fsr4s" event={"ID":"3fe4d45d-dc04-4b2b-ad77-e3f4dad62c22","Type":"ContainerStarted","Data":"4f61898aead111f27a257b6496a97392295a7b95c5f74814b1e6593268e9a377"} Dec 02 07:25:36 crc kubenswrapper[4895]: I1202 07:25:36.878265 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fsr4s" Dec 02 07:25:36 crc kubenswrapper[4895]: I1202 07:25:36.897754 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fsr4s" Dec 02 07:25:36 crc kubenswrapper[4895]: I1202 07:25:36.933036 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 07:25:36 crc kubenswrapper[4895]: E1202 07:25:36.933450 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-02 07:25:37.433425916 +0000 UTC m=+148.604285529 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:25:36 crc kubenswrapper[4895]: I1202 07:25:36.945633 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-nv29v" event={"ID":"1adeefcd-490c-4913-8315-baa7dbc1e7a9","Type":"ContainerStarted","Data":"74f5fc16818348c26930385a8db0d71ea56ec440354160d2f376b748aa55b78e"} Dec 02 07:25:36 crc kubenswrapper[4895]: I1202 07:25:36.946219 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-nv29v" Dec 02 07:25:36 crc kubenswrapper[4895]: I1202 07:25:36.950862 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-49t7q" event={"ID":"1569ea0e-ca30-4212-95e4-11dde6bca970","Type":"ContainerStarted","Data":"d15450e14da46103e27c3d79a8b8ccd14599a88c652bc39509f9ea9c2b413802"} Dec 02 07:25:36 crc kubenswrapper[4895]: I1202 07:25:36.964084 4895 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-nv29v container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.13:6443/healthz\": dial tcp 10.217.0.13:6443: connect: connection refused" start-of-body= Dec 02 07:25:36 crc kubenswrapper[4895]: I1202 07:25:36.964156 4895 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-nv29v" 
podUID="1adeefcd-490c-4913-8315-baa7dbc1e7a9" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.13:6443/healthz\": dial tcp 10.217.0.13:6443: connect: connection refused" Dec 02 07:25:36 crc kubenswrapper[4895]: I1202 07:25:36.967104 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fsr4s" podStartSLOduration=119.967092211 podStartE2EDuration="1m59.967092211s" podCreationTimestamp="2025-12-02 07:23:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:25:36.96529565 +0000 UTC m=+148.136155263" watchObservedRunningTime="2025-12-02 07:25:36.967092211 +0000 UTC m=+148.137951824" Dec 02 07:25:36 crc kubenswrapper[4895]: I1202 07:25:36.967688 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-49t7q" podStartSLOduration=119.967683227 podStartE2EDuration="1m59.967683227s" podCreationTimestamp="2025-12-02 07:23:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:25:35.887168436 +0000 UTC m=+147.058028049" watchObservedRunningTime="2025-12-02 07:25:36.967683227 +0000 UTC m=+148.138542840" Dec 02 07:25:36 crc kubenswrapper[4895]: I1202 07:25:36.977954 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-rcln9" event={"ID":"911ed87d-b3fd-4002-b7b0-b720d7066459","Type":"ContainerStarted","Data":"0c72f8c0541c72a76ad92f86be9d67c63be11fc0fb3f928de10404b20a55b02e"} Dec 02 07:25:37 crc kubenswrapper[4895]: I1202 07:25:37.009620 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-dvkjf" 
event={"ID":"f500c69a-f58d-4229-bcb5-6f6e5fbdeb3c","Type":"ContainerStarted","Data":"ae59230731d1a3b6cfa4d13243ff5c1f391ddc7001a89bc30bd395ec585d0aaf"} Dec 02 07:25:37 crc kubenswrapper[4895]: I1202 07:25:37.038091 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tjqt9\" (UID: \"4f1d0fc5-528f-4529-938f-7041be573fa7\") " pod="openshift-image-registry/image-registry-697d97f7c8-tjqt9" Dec 02 07:25:37 crc kubenswrapper[4895]: E1202 07:25:37.038522 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 07:25:37.538505296 +0000 UTC m=+148.709364909 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tjqt9" (UID: "4f1d0fc5-528f-4529-938f-7041be573fa7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:25:37 crc kubenswrapper[4895]: I1202 07:25:37.053583 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-nv29v" podStartSLOduration=120.053559179 podStartE2EDuration="2m0.053559179s" podCreationTimestamp="2025-12-02 07:23:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:25:37.050651158 +0000 UTC m=+148.221510771" watchObservedRunningTime="2025-12-02 07:25:37.053559179 +0000 UTC 
m=+148.224418792" Dec 02 07:25:37 crc kubenswrapper[4895]: I1202 07:25:37.068943 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-lppp8" event={"ID":"eee3485d-8623-414b-8466-8da5e97c08b7","Type":"ContainerStarted","Data":"12da696c3b3af13679d3d0d65ae21bffb5c85c71aa08d39c45c577d8c9a828a0"} Dec 02 07:25:37 crc kubenswrapper[4895]: I1202 07:25:37.069094 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-lppp8" event={"ID":"eee3485d-8623-414b-8466-8da5e97c08b7","Type":"ContainerStarted","Data":"59837a33d0a40d112007743d8ea8164a6fccd278bae561808071280f39fb2622"} Dec 02 07:25:37 crc kubenswrapper[4895]: I1202 07:25:37.090546 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-6g8m8" event={"ID":"00f072cc-9501-499e-82b4-027e2d267930","Type":"ContainerStarted","Data":"6e8fae784c10683dd18ce44a2c09932fda5ea3c82460f89669d99d207d53df89"} Dec 02 07:25:37 crc kubenswrapper[4895]: I1202 07:25:37.090997 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-6g8m8" Dec 02 07:25:37 crc kubenswrapper[4895]: I1202 07:25:37.092963 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wdmt6" event={"ID":"d7b05beb-8c1c-4f69-bf13-199dbf869413","Type":"ContainerStarted","Data":"3b248be9313132311bf4b3ce3ccf569e16227f621c3f41a01560150e1559ce22"} Dec 02 07:25:37 crc kubenswrapper[4895]: I1202 07:25:37.124946 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-q7mhm" event={"ID":"ddcbf4b8-5804-4136-8554-6a307825a6ed","Type":"ContainerStarted","Data":"98a239b08ccb243a25b395e480fcdad1c50fd6ccca28a9661e3ba9f2b3c65d42"} Dec 02 07:25:37 crc kubenswrapper[4895]: I1202 07:25:37.128280 4895 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-lppp8" podStartSLOduration=120.128258537 podStartE2EDuration="2m0.128258537s" podCreationTimestamp="2025-12-02 07:23:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:25:37.11553044 +0000 UTC m=+148.286390063" watchObservedRunningTime="2025-12-02 07:25:37.128258537 +0000 UTC m=+148.299118150" Dec 02 07:25:37 crc kubenswrapper[4895]: I1202 07:25:37.134068 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-svxr4" event={"ID":"74128b07-b5b9-4646-8780-5d28b3a715ae","Type":"ContainerStarted","Data":"9a555ab49bc28774a65ccba87799c22c5058c2aebdfb08f2cda162c364775809"} Dec 02 07:25:37 crc kubenswrapper[4895]: I1202 07:25:37.140314 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 07:25:37 crc kubenswrapper[4895]: E1202 07:25:37.142914 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 07:25:37.642890838 +0000 UTC m=+148.813750451 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:25:37 crc kubenswrapper[4895]: I1202 07:25:37.153228 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29410995-cbx2h" event={"ID":"7e8f451a-ac50-4e0a-bf8e-e6d505305177","Type":"ContainerStarted","Data":"9c4150dad8ac9bf277c6659f2996173dcea7cf6a3077852fb2e4dea67dec1703"} Dec 02 07:25:37 crc kubenswrapper[4895]: I1202 07:25:37.153272 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29410995-cbx2h" event={"ID":"7e8f451a-ac50-4e0a-bf8e-e6d505305177","Type":"ContainerStarted","Data":"b75ca3f65e1f93151bf9f1190e69a8983ea1616225912abfc7a8367ab5913126"} Dec 02 07:25:37 crc kubenswrapper[4895]: I1202 07:25:37.158235 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-s9km8" Dec 02 07:25:37 crc kubenswrapper[4895]: I1202 07:25:37.158684 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-d7hb4" event={"ID":"0e8ba2f7-f07b-4532-9620-00662d37f5b9","Type":"ContainerStarted","Data":"83e8732e70d08767d98e872e260b39e9e72e1aec596a75266a859dd4fb9c04f4"} Dec 02 07:25:37 crc kubenswrapper[4895]: I1202 07:25:37.161210 4895 patch_prober.go:28] interesting pod/downloads-7954f5f757-d7hb4 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" 
start-of-body= Dec 02 07:25:37 crc kubenswrapper[4895]: I1202 07:25:37.161246 4895 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-d7hb4" podUID="0e8ba2f7-f07b-4532-9620-00662d37f5b9" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Dec 02 07:25:37 crc kubenswrapper[4895]: I1202 07:25:37.187372 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fz6sl" event={"ID":"30d19465-967f-42ad-af2e-983465c989e1","Type":"ContainerStarted","Data":"2a306f64e786fd6ed5263ac161523f8308c0ddcee056409f139d443e639e19e0"} Dec 02 07:25:37 crc kubenswrapper[4895]: I1202 07:25:37.228998 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-dvkjf" podStartSLOduration=120.228980435 podStartE2EDuration="2m0.228980435s" podCreationTimestamp="2025-12-02 07:23:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:25:37.227478153 +0000 UTC m=+148.398337766" watchObservedRunningTime="2025-12-02 07:25:37.228980435 +0000 UTC m=+148.399840048" Dec 02 07:25:37 crc kubenswrapper[4895]: I1202 07:25:37.229213 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-rcln9" podStartSLOduration=120.229208461 podStartE2EDuration="2m0.229208461s" podCreationTimestamp="2025-12-02 07:23:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:25:37.182786407 +0000 UTC m=+148.353646020" watchObservedRunningTime="2025-12-02 07:25:37.229208461 +0000 UTC m=+148.400068074" Dec 02 07:25:37 crc kubenswrapper[4895]: I1202 
07:25:37.240947 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7d79x" event={"ID":"d87e36ab-5b0c-4726-9539-e3bf256e63bc","Type":"ContainerStarted","Data":"68e958f9a20fb2c492b48534a72c0b1c5899d5b2560ecc6f197ce26dc095b48c"} Dec 02 07:25:37 crc kubenswrapper[4895]: I1202 07:25:37.241016 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7d79x" event={"ID":"d87e36ab-5b0c-4726-9539-e3bf256e63bc","Type":"ContainerStarted","Data":"4d0a22fd3f84124b971196aa3fcd39160f89d9d837f1c1578dda344e02e1d7c7"} Dec 02 07:25:37 crc kubenswrapper[4895]: I1202 07:25:37.250376 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tjqt9\" (UID: \"4f1d0fc5-528f-4529-938f-7041be573fa7\") " pod="openshift-image-registry/image-registry-697d97f7c8-tjqt9" Dec 02 07:25:37 crc kubenswrapper[4895]: E1202 07:25:37.274664 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 07:25:37.774629717 +0000 UTC m=+148.945489330 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tjqt9" (UID: "4f1d0fc5-528f-4529-938f-7041be573fa7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:25:37 crc kubenswrapper[4895]: I1202 07:25:37.318200 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-6g8m8" podStartSLOduration=120.318170969 podStartE2EDuration="2m0.318170969s" podCreationTimestamp="2025-12-02 07:23:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:25:37.315852594 +0000 UTC m=+148.486712207" watchObservedRunningTime="2025-12-02 07:25:37.318170969 +0000 UTC m=+148.489030582" Dec 02 07:25:37 crc kubenswrapper[4895]: I1202 07:25:37.328890 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-l82ft" event={"ID":"25eaa26b-3997-47d7-9932-8eff551bc799","Type":"ContainerStarted","Data":"7e6fc79df959b1c20afb50c493e1d2ef5cc6b509930284297145e9c119485dc1"} Dec 02 07:25:37 crc kubenswrapper[4895]: I1202 07:25:37.345295 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-mzn92" event={"ID":"8eae663a-aaa8-488b-b46c-3ce28f7e0bb0","Type":"ContainerStarted","Data":"bb4a5e27135ad3ed3e90ba61c00b647b42f85ae2abaf6b890a68f34176d21394"} Dec 02 07:25:37 crc kubenswrapper[4895]: I1202 07:25:37.345392 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-mzn92" event={"ID":"8eae663a-aaa8-488b-b46c-3ce28f7e0bb0","Type":"ContainerStarted","Data":"dc6a40840e0c4dc70b0cfbdfeca0855e6aa19329aff6f58955f8c786cef23774"} Dec 02 
07:25:37 crc kubenswrapper[4895]: I1202 07:25:37.363983 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-svxr4" podStartSLOduration=119.363960025 podStartE2EDuration="1m59.363960025s" podCreationTimestamp="2025-12-02 07:23:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:25:37.360884789 +0000 UTC m=+148.531744402" watchObservedRunningTime="2025-12-02 07:25:37.363960025 +0000 UTC m=+148.534819638" Dec 02 07:25:37 crc kubenswrapper[4895]: I1202 07:25:37.365676 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 07:25:37 crc kubenswrapper[4895]: E1202 07:25:37.366174 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 07:25:37.866147477 +0000 UTC m=+149.037007100 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:25:37 crc kubenswrapper[4895]: I1202 07:25:37.394255 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4n5nx" event={"ID":"80353fb9-e36f-4a78-a2e5-8c11d25a94f2","Type":"ContainerStarted","Data":"9bec77d44e44fcee7314d4c526bb73c3b14f5706a5af60e72f6c410ce4cade68"} Dec 02 07:25:37 crc kubenswrapper[4895]: I1202 07:25:37.406202 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-5q5hv"] Dec 02 07:25:37 crc kubenswrapper[4895]: I1202 07:25:37.419330 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5q5hv" Dec 02 07:25:37 crc kubenswrapper[4895]: I1202 07:25:37.425295 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fz6sl" podStartSLOduration=120.425271737 podStartE2EDuration="2m0.425271737s" podCreationTimestamp="2025-12-02 07:23:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:25:37.424636689 +0000 UTC m=+148.595496302" watchObservedRunningTime="2025-12-02 07:25:37.425271737 +0000 UTC m=+148.596131350" Dec 02 07:25:37 crc kubenswrapper[4895]: I1202 07:25:37.425379 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 02 07:25:37 crc kubenswrapper[4895]: I1202 07:25:37.433589 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-qrngv" event={"ID":"19dd2c2d-fd12-4884-96d3-50ef117553c7","Type":"ContainerStarted","Data":"03b05fcaa5befd622fff7a068cbb179eb6c1aca4fd38500d3c1fc8f908a9d7f9"} Dec 02 07:25:37 crc kubenswrapper[4895]: I1202 07:25:37.433860 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-qrngv" event={"ID":"19dd2c2d-fd12-4884-96d3-50ef117553c7","Type":"ContainerStarted","Data":"38e35761bc652dcc53c17eafb17baa3e9587c5565d9d11ae59ddd47594e390c6"} Dec 02 07:25:37 crc kubenswrapper[4895]: I1202 07:25:37.433883 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-qrngv" Dec 02 07:25:37 crc kubenswrapper[4895]: I1202 07:25:37.435349 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5q5hv"] Dec 02 07:25:37 crc 
kubenswrapper[4895]: I1202 07:25:37.458596 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-9mn97" event={"ID":"85410206-3fcb-46c1-ac5d-bc3100b30544","Type":"ContainerStarted","Data":"b8548102435620be49ebc3c813247d6b54762d44776eed4723a520a3bbe65811"} Dec 02 07:25:37 crc kubenswrapper[4895]: I1202 07:25:37.469720 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0fded14-dfbe-41aa-af93-f68c62a1aca1-utilities\") pod \"community-operators-5q5hv\" (UID: \"d0fded14-dfbe-41aa-af93-f68c62a1aca1\") " pod="openshift-marketplace/community-operators-5q5hv" Dec 02 07:25:37 crc kubenswrapper[4895]: I1202 07:25:37.469798 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0fded14-dfbe-41aa-af93-f68c62a1aca1-catalog-content\") pod \"community-operators-5q5hv\" (UID: \"d0fded14-dfbe-41aa-af93-f68c62a1aca1\") " pod="openshift-marketplace/community-operators-5q5hv" Dec 02 07:25:37 crc kubenswrapper[4895]: I1202 07:25:37.469828 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kglmv\" (UniqueName: \"kubernetes.io/projected/d0fded14-dfbe-41aa-af93-f68c62a1aca1-kube-api-access-kglmv\") pod \"community-operators-5q5hv\" (UID: \"d0fded14-dfbe-41aa-af93-f68c62a1aca1\") " pod="openshift-marketplace/community-operators-5q5hv" Dec 02 07:25:37 crc kubenswrapper[4895]: E1202 07:25:37.470172 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 07:25:37.970156537 +0000 UTC m=+149.141016150 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tjqt9" (UID: "4f1d0fc5-528f-4529-938f-7041be573fa7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:25:37 crc kubenswrapper[4895]: I1202 07:25:37.470276 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tjqt9\" (UID: \"4f1d0fc5-528f-4529-938f-7041be573fa7\") " pod="openshift-image-registry/image-registry-697d97f7c8-tjqt9" Dec 02 07:25:37 crc kubenswrapper[4895]: I1202 07:25:37.507050 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vs2rc" event={"ID":"349b3a38-fe58-4c38-8008-ac5ba643ddef","Type":"ContainerStarted","Data":"303164eba9bfc6f614cdb8a7d35dc6b6c1fa6a8e0ef3db8d471347e545765755"} Dec 02 07:25:37 crc kubenswrapper[4895]: I1202 07:25:37.507098 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vs2rc" event={"ID":"349b3a38-fe58-4c38-8008-ac5ba643ddef","Type":"ContainerStarted","Data":"a0ad818ed411f8f590a5690af431528931c155299fa1b18f8063307e005d4d91"} Dec 02 07:25:37 crc kubenswrapper[4895]: I1202 07:25:37.507112 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vs2rc" Dec 02 07:25:37 crc kubenswrapper[4895]: I1202 07:25:37.535114 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-2tvrt" 
event={"ID":"cd87ba31-a340-47a1-a2db-3019015a2c24","Type":"ContainerStarted","Data":"2ef662fadfd5530ef6b78fecbb3d870bec74bcf25aa4f5e4da0837105666390f"} Dec 02 07:25:37 crc kubenswrapper[4895]: I1202 07:25:37.545692 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-6g8m8" Dec 02 07:25:37 crc kubenswrapper[4895]: I1202 07:25:37.547824 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29410995-cbx2h" podStartSLOduration=120.547809488 podStartE2EDuration="2m0.547809488s" podCreationTimestamp="2025-12-02 07:23:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:25:37.498106152 +0000 UTC m=+148.668965795" watchObservedRunningTime="2025-12-02 07:25:37.547809488 +0000 UTC m=+148.718669111" Dec 02 07:25:37 crc kubenswrapper[4895]: I1202 07:25:37.565919 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-jfs6x" event={"ID":"c48181d0-7322-418a-8f38-7e3450675f0e","Type":"ContainerStarted","Data":"1f364cc7ac17288b8c12409a257fa73f11a9612bca352fb0ba07ee3ef67fdef1"} Dec 02 07:25:37 crc kubenswrapper[4895]: I1202 07:25:37.574887 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 07:25:37 crc kubenswrapper[4895]: I1202 07:25:37.575120 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0fded14-dfbe-41aa-af93-f68c62a1aca1-catalog-content\") pod \"community-operators-5q5hv\" (UID: 
\"d0fded14-dfbe-41aa-af93-f68c62a1aca1\") " pod="openshift-marketplace/community-operators-5q5hv" Dec 02 07:25:37 crc kubenswrapper[4895]: I1202 07:25:37.575159 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kglmv\" (UniqueName: \"kubernetes.io/projected/d0fded14-dfbe-41aa-af93-f68c62a1aca1-kube-api-access-kglmv\") pod \"community-operators-5q5hv\" (UID: \"d0fded14-dfbe-41aa-af93-f68c62a1aca1\") " pod="openshift-marketplace/community-operators-5q5hv" Dec 02 07:25:37 crc kubenswrapper[4895]: I1202 07:25:37.575349 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0fded14-dfbe-41aa-af93-f68c62a1aca1-utilities\") pod \"community-operators-5q5hv\" (UID: \"d0fded14-dfbe-41aa-af93-f68c62a1aca1\") " pod="openshift-marketplace/community-operators-5q5hv" Dec 02 07:25:37 crc kubenswrapper[4895]: E1202 07:25:37.576265 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 07:25:38.076246156 +0000 UTC m=+149.247105769 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:25:37 crc kubenswrapper[4895]: I1202 07:25:37.576896 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0fded14-dfbe-41aa-af93-f68c62a1aca1-catalog-content\") pod \"community-operators-5q5hv\" (UID: \"d0fded14-dfbe-41aa-af93-f68c62a1aca1\") " pod="openshift-marketplace/community-operators-5q5hv" Dec 02 07:25:37 crc kubenswrapper[4895]: I1202 07:25:37.580589 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0fded14-dfbe-41aa-af93-f68c62a1aca1-utilities\") pod \"community-operators-5q5hv\" (UID: \"d0fded14-dfbe-41aa-af93-f68c62a1aca1\") " pod="openshift-marketplace/community-operators-5q5hv" Dec 02 07:25:37 crc kubenswrapper[4895]: I1202 07:25:37.588400 4895 patch_prober.go:28] interesting pod/router-default-5444994796-lf9fx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 02 07:25:37 crc kubenswrapper[4895]: [-]has-synced failed: reason withheld Dec 02 07:25:37 crc kubenswrapper[4895]: [+]process-running ok Dec 02 07:25:37 crc kubenswrapper[4895]: healthz check failed Dec 02 07:25:37 crc kubenswrapper[4895]: I1202 07:25:37.588457 4895 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lf9fx" podUID="02423153-ef82-4284-b703-cf006e0b8b70" containerName="router" probeResult="failure" 
output="HTTP probe failed with statuscode: 500" Dec 02 07:25:37 crc kubenswrapper[4895]: I1202 07:25:37.595837 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-zj6vq" event={"ID":"168c3e01-42d8-4684-b160-23c5eb559a98","Type":"ContainerStarted","Data":"edaec2c3cbdd2de4140584dbdf7ded280c2442c975d5567b8353f5612c29bdef"} Dec 02 07:25:37 crc kubenswrapper[4895]: I1202 07:25:37.604110 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ld28g" event={"ID":"f7d85ea2-5fcb-4744-bd7b-fa309a774ab4","Type":"ContainerStarted","Data":"cbc401e26ceb4e4aca1af2e6523c87b953bf7ec25cab8f070f9104c2a5660609"} Dec 02 07:25:37 crc kubenswrapper[4895]: I1202 07:25:37.605435 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ld28g" Dec 02 07:25:37 crc kubenswrapper[4895]: I1202 07:25:37.620841 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ld28g" Dec 02 07:25:37 crc kubenswrapper[4895]: I1202 07:25:37.637444 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-zkq6j" event={"ID":"4a9d5b86-ddba-433a-91c3-efe2043f66e3","Type":"ContainerStarted","Data":"17cec86d4a6d9e0a4d2baa9caf84abb93e703183176413e7ec47931e8c2dcff2"} Dec 02 07:25:37 crc kubenswrapper[4895]: I1202 07:25:37.637510 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-zkq6j" event={"ID":"4a9d5b86-ddba-433a-91c3-efe2043f66e3","Type":"ContainerStarted","Data":"0f02bb18c7138d49b9d51d5ef29c16ef0ce4011c6071536ad55b8702620860ff"} Dec 02 07:25:37 crc kubenswrapper[4895]: I1202 07:25:37.638730 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/marketplace-operator-79b997595-zkq6j" Dec 02 07:25:37 crc kubenswrapper[4895]: I1202 07:25:37.645931 4895 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-zkq6j container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.32:8080/healthz\": dial tcp 10.217.0.32:8080: connect: connection refused" start-of-body= Dec 02 07:25:37 crc kubenswrapper[4895]: I1202 07:25:37.645995 4895 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-zkq6j" podUID="4a9d5b86-ddba-433a-91c3-efe2043f66e3" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.32:8080/healthz\": dial tcp 10.217.0.32:8080: connect: connection refused" Dec 02 07:25:37 crc kubenswrapper[4895]: I1202 07:25:37.670572 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xhq8l" event={"ID":"d210c48b-56d9-4385-86dc-da6b7a2cfbee","Type":"ContainerStarted","Data":"50f5793cdfd498282d921a26f90fc06592acf350b31e369a2027011228280281"} Dec 02 07:25:37 crc kubenswrapper[4895]: I1202 07:25:37.678267 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tjqt9\" (UID: \"4f1d0fc5-528f-4529-938f-7041be573fa7\") " pod="openshift-image-registry/image-registry-697d97f7c8-tjqt9" Dec 02 07:25:37 crc kubenswrapper[4895]: I1202 07:25:37.682250 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7d79x" podStartSLOduration=120.682225562 podStartE2EDuration="2m0.682225562s" podCreationTimestamp="2025-12-02 07:23:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:25:37.658428544 +0000 UTC m=+148.829288177" watchObservedRunningTime="2025-12-02 07:25:37.682225562 +0000 UTC m=+148.853085175" Dec 02 07:25:37 crc kubenswrapper[4895]: I1202 07:25:37.684137 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-zbpzj"] Dec 02 07:25:37 crc kubenswrapper[4895]: E1202 07:25:37.687192 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 07:25:38.187171741 +0000 UTC m=+149.358031364 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tjqt9" (UID: "4f1d0fc5-528f-4529-938f-7041be573fa7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:25:37 crc kubenswrapper[4895]: I1202 07:25:37.687465 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-rd9pn" Dec 02 07:25:37 crc kubenswrapper[4895]: I1202 07:25:37.691816 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zbpzj" Dec 02 07:25:37 crc kubenswrapper[4895]: I1202 07:25:37.703478 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kglmv\" (UniqueName: \"kubernetes.io/projected/d0fded14-dfbe-41aa-af93-f68c62a1aca1-kube-api-access-kglmv\") pod \"community-operators-5q5hv\" (UID: \"d0fded14-dfbe-41aa-af93-f68c62a1aca1\") " pod="openshift-marketplace/community-operators-5q5hv" Dec 02 07:25:37 crc kubenswrapper[4895]: I1202 07:25:37.721450 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 02 07:25:37 crc kubenswrapper[4895]: I1202 07:25:37.733045 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zbpzj"] Dec 02 07:25:37 crc kubenswrapper[4895]: I1202 07:25:37.785457 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 07:25:37 crc kubenswrapper[4895]: I1202 07:25:37.786036 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7md7f\" (UniqueName: \"kubernetes.io/projected/2ad8c7c3-c1ac-498b-b1b5-76f1db5c1c19-kube-api-access-7md7f\") pod \"certified-operators-zbpzj\" (UID: \"2ad8c7c3-c1ac-498b-b1b5-76f1db5c1c19\") " pod="openshift-marketplace/certified-operators-zbpzj" Dec 02 07:25:37 crc kubenswrapper[4895]: I1202 07:25:37.786166 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ad8c7c3-c1ac-498b-b1b5-76f1db5c1c19-catalog-content\") pod \"certified-operators-zbpzj\" (UID: 
\"2ad8c7c3-c1ac-498b-b1b5-76f1db5c1c19\") " pod="openshift-marketplace/certified-operators-zbpzj" Dec 02 07:25:37 crc kubenswrapper[4895]: I1202 07:25:37.786228 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ad8c7c3-c1ac-498b-b1b5-76f1db5c1c19-utilities\") pod \"certified-operators-zbpzj\" (UID: \"2ad8c7c3-c1ac-498b-b1b5-76f1db5c1c19\") " pod="openshift-marketplace/certified-operators-zbpzj" Dec 02 07:25:37 crc kubenswrapper[4895]: E1202 07:25:37.787338 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 07:25:38.287319324 +0000 UTC m=+149.458178937 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:25:37 crc kubenswrapper[4895]: I1202 07:25:37.807113 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5q5hv" Dec 02 07:25:37 crc kubenswrapper[4895]: I1202 07:25:37.819440 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-899nx"] Dec 02 07:25:37 crc kubenswrapper[4895]: I1202 07:25:37.821087 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-899nx" Dec 02 07:25:37 crc kubenswrapper[4895]: I1202 07:25:37.866806 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-899nx"] Dec 02 07:25:37 crc kubenswrapper[4895]: I1202 07:25:37.887582 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ad8c7c3-c1ac-498b-b1b5-76f1db5c1c19-catalog-content\") pod \"certified-operators-zbpzj\" (UID: \"2ad8c7c3-c1ac-498b-b1b5-76f1db5c1c19\") " pod="openshift-marketplace/certified-operators-zbpzj" Dec 02 07:25:37 crc kubenswrapper[4895]: I1202 07:25:37.887636 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ad8c7c3-c1ac-498b-b1b5-76f1db5c1c19-utilities\") pod \"certified-operators-zbpzj\" (UID: \"2ad8c7c3-c1ac-498b-b1b5-76f1db5c1c19\") " pod="openshift-marketplace/certified-operators-zbpzj" Dec 02 07:25:37 crc kubenswrapper[4895]: I1202 07:25:37.887677 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tjqt9\" (UID: \"4f1d0fc5-528f-4529-938f-7041be573fa7\") " pod="openshift-image-registry/image-registry-697d97f7c8-tjqt9" Dec 02 07:25:37 crc kubenswrapper[4895]: I1202 07:25:37.887789 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6040fed-14e6-49a5-802b-e49bfeba7aa5-utilities\") pod \"community-operators-899nx\" (UID: \"e6040fed-14e6-49a5-802b-e49bfeba7aa5\") " pod="openshift-marketplace/community-operators-899nx" Dec 02 07:25:37 crc kubenswrapper[4895]: I1202 07:25:37.887851 4895 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6040fed-14e6-49a5-802b-e49bfeba7aa5-catalog-content\") pod \"community-operators-899nx\" (UID: \"e6040fed-14e6-49a5-802b-e49bfeba7aa5\") " pod="openshift-marketplace/community-operators-899nx" Dec 02 07:25:37 crc kubenswrapper[4895]: I1202 07:25:37.887885 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7md7f\" (UniqueName: \"kubernetes.io/projected/2ad8c7c3-c1ac-498b-b1b5-76f1db5c1c19-kube-api-access-7md7f\") pod \"certified-operators-zbpzj\" (UID: \"2ad8c7c3-c1ac-498b-b1b5-76f1db5c1c19\") " pod="openshift-marketplace/certified-operators-zbpzj" Dec 02 07:25:37 crc kubenswrapper[4895]: I1202 07:25:37.887923 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdc82\" (UniqueName: \"kubernetes.io/projected/e6040fed-14e6-49a5-802b-e49bfeba7aa5-kube-api-access-bdc82\") pod \"community-operators-899nx\" (UID: \"e6040fed-14e6-49a5-802b-e49bfeba7aa5\") " pod="openshift-marketplace/community-operators-899nx" Dec 02 07:25:37 crc kubenswrapper[4895]: I1202 07:25:37.888476 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ad8c7c3-c1ac-498b-b1b5-76f1db5c1c19-catalog-content\") pod \"certified-operators-zbpzj\" (UID: \"2ad8c7c3-c1ac-498b-b1b5-76f1db5c1c19\") " pod="openshift-marketplace/certified-operators-zbpzj" Dec 02 07:25:37 crc kubenswrapper[4895]: I1202 07:25:37.888766 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ad8c7c3-c1ac-498b-b1b5-76f1db5c1c19-utilities\") pod \"certified-operators-zbpzj\" (UID: \"2ad8c7c3-c1ac-498b-b1b5-76f1db5c1c19\") " pod="openshift-marketplace/certified-operators-zbpzj" Dec 02 07:25:37 crc kubenswrapper[4895]: E1202 07:25:37.889139 4895 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 07:25:38.389122052 +0000 UTC m=+149.559981665 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tjqt9" (UID: "4f1d0fc5-528f-4529-938f-7041be573fa7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:25:37 crc kubenswrapper[4895]: I1202 07:25:37.966859 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7md7f\" (UniqueName: \"kubernetes.io/projected/2ad8c7c3-c1ac-498b-b1b5-76f1db5c1c19-kube-api-access-7md7f\") pod \"certified-operators-zbpzj\" (UID: \"2ad8c7c3-c1ac-498b-b1b5-76f1db5c1c19\") " pod="openshift-marketplace/certified-operators-zbpzj" Dec 02 07:25:37 crc kubenswrapper[4895]: I1202 07:25:37.994685 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 07:25:37 crc kubenswrapper[4895]: I1202 07:25:37.995052 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6040fed-14e6-49a5-802b-e49bfeba7aa5-utilities\") pod \"community-operators-899nx\" (UID: \"e6040fed-14e6-49a5-802b-e49bfeba7aa5\") " pod="openshift-marketplace/community-operators-899nx" Dec 02 07:25:37 crc kubenswrapper[4895]: I1202 07:25:37.995107 4895 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6040fed-14e6-49a5-802b-e49bfeba7aa5-catalog-content\") pod \"community-operators-899nx\" (UID: \"e6040fed-14e6-49a5-802b-e49bfeba7aa5\") " pod="openshift-marketplace/community-operators-899nx" Dec 02 07:25:37 crc kubenswrapper[4895]: I1202 07:25:37.995136 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bdc82\" (UniqueName: \"kubernetes.io/projected/e6040fed-14e6-49a5-802b-e49bfeba7aa5-kube-api-access-bdc82\") pod \"community-operators-899nx\" (UID: \"e6040fed-14e6-49a5-802b-e49bfeba7aa5\") " pod="openshift-marketplace/community-operators-899nx" Dec 02 07:25:37 crc kubenswrapper[4895]: E1202 07:25:37.995536 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 07:25:38.495518399 +0000 UTC m=+149.666378012 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:25:37 crc kubenswrapper[4895]: I1202 07:25:37.995915 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6040fed-14e6-49a5-802b-e49bfeba7aa5-utilities\") pod \"community-operators-899nx\" (UID: \"e6040fed-14e6-49a5-802b-e49bfeba7aa5\") " pod="openshift-marketplace/community-operators-899nx" Dec 02 07:25:37 crc kubenswrapper[4895]: I1202 07:25:37.996148 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6040fed-14e6-49a5-802b-e49bfeba7aa5-catalog-content\") pod \"community-operators-899nx\" (UID: \"e6040fed-14e6-49a5-802b-e49bfeba7aa5\") " pod="openshift-marketplace/community-operators-899nx" Dec 02 07:25:37 crc kubenswrapper[4895]: I1202 07:25:37.997904 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-9mn97" podStartSLOduration=120.997885956 podStartE2EDuration="2m0.997885956s" podCreationTimestamp="2025-12-02 07:23:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:25:37.977118723 +0000 UTC m=+149.147978336" watchObservedRunningTime="2025-12-02 07:25:37.997885956 +0000 UTC m=+149.168745559" Dec 02 07:25:37 crc kubenswrapper[4895]: I1202 07:25:37.998510 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-qsldg"] Dec 02 07:25:38 crc 
kubenswrapper[4895]: I1202 07:25:38.007374 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qsldg" Dec 02 07:25:38 crc kubenswrapper[4895]: I1202 07:25:38.047278 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdc82\" (UniqueName: \"kubernetes.io/projected/e6040fed-14e6-49a5-802b-e49bfeba7aa5-kube-api-access-bdc82\") pod \"community-operators-899nx\" (UID: \"e6040fed-14e6-49a5-802b-e49bfeba7aa5\") " pod="openshift-marketplace/community-operators-899nx" Dec 02 07:25:38 crc kubenswrapper[4895]: I1202 07:25:38.059026 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zbpzj" Dec 02 07:25:38 crc kubenswrapper[4895]: I1202 07:25:38.076645 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-l82ft" podStartSLOduration=9.076618397 podStartE2EDuration="9.076618397s" podCreationTimestamp="2025-12-02 07:25:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:25:38.040074881 +0000 UTC m=+149.210934504" watchObservedRunningTime="2025-12-02 07:25:38.076618397 +0000 UTC m=+149.247478010" Dec 02 07:25:38 crc kubenswrapper[4895]: I1202 07:25:38.083387 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qsldg"] Dec 02 07:25:38 crc kubenswrapper[4895]: I1202 07:25:38.100584 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnvj2\" (UniqueName: \"kubernetes.io/projected/c04158de-6693-44c6-82d3-198e545eccfb-kube-api-access-rnvj2\") pod \"certified-operators-qsldg\" (UID: \"c04158de-6693-44c6-82d3-198e545eccfb\") " pod="openshift-marketplace/certified-operators-qsldg" Dec 02 07:25:38 crc kubenswrapper[4895]: I1202 
07:25:38.100970 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c04158de-6693-44c6-82d3-198e545eccfb-utilities\") pod \"certified-operators-qsldg\" (UID: \"c04158de-6693-44c6-82d3-198e545eccfb\") " pod="openshift-marketplace/certified-operators-qsldg" Dec 02 07:25:38 crc kubenswrapper[4895]: I1202 07:25:38.100996 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c04158de-6693-44c6-82d3-198e545eccfb-catalog-content\") pod \"certified-operators-qsldg\" (UID: \"c04158de-6693-44c6-82d3-198e545eccfb\") " pod="openshift-marketplace/certified-operators-qsldg" Dec 02 07:25:38 crc kubenswrapper[4895]: I1202 07:25:38.101038 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tjqt9\" (UID: \"4f1d0fc5-528f-4529-938f-7041be573fa7\") " pod="openshift-image-registry/image-registry-697d97f7c8-tjqt9" Dec 02 07:25:38 crc kubenswrapper[4895]: E1202 07:25:38.123975 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 07:25:38.623952366 +0000 UTC m=+149.794811979 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tjqt9" (UID: "4f1d0fc5-528f-4529-938f-7041be573fa7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:25:38 crc kubenswrapper[4895]: I1202 07:25:38.125107 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-2tvrt" podStartSLOduration=121.125084378 podStartE2EDuration="2m1.125084378s" podCreationTimestamp="2025-12-02 07:23:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:25:38.12443478 +0000 UTC m=+149.295294393" watchObservedRunningTime="2025-12-02 07:25:38.125084378 +0000 UTC m=+149.295943991" Dec 02 07:25:38 crc kubenswrapper[4895]: I1202 07:25:38.200955 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-899nx" Dec 02 07:25:38 crc kubenswrapper[4895]: I1202 07:25:38.202011 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 07:25:38 crc kubenswrapper[4895]: I1202 07:25:38.202215 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rnvj2\" (UniqueName: \"kubernetes.io/projected/c04158de-6693-44c6-82d3-198e545eccfb-kube-api-access-rnvj2\") pod \"certified-operators-qsldg\" (UID: \"c04158de-6693-44c6-82d3-198e545eccfb\") " pod="openshift-marketplace/certified-operators-qsldg" Dec 02 07:25:38 crc kubenswrapper[4895]: I1202 07:25:38.202256 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c04158de-6693-44c6-82d3-198e545eccfb-utilities\") pod \"certified-operators-qsldg\" (UID: \"c04158de-6693-44c6-82d3-198e545eccfb\") " pod="openshift-marketplace/certified-operators-qsldg" Dec 02 07:25:38 crc kubenswrapper[4895]: E1202 07:25:38.202344 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 07:25:38.702309656 +0000 UTC m=+149.873169269 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:25:38 crc kubenswrapper[4895]: I1202 07:25:38.202427 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c04158de-6693-44c6-82d3-198e545eccfb-catalog-content\") pod \"certified-operators-qsldg\" (UID: \"c04158de-6693-44c6-82d3-198e545eccfb\") " pod="openshift-marketplace/certified-operators-qsldg" Dec 02 07:25:38 crc kubenswrapper[4895]: I1202 07:25:38.202477 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tjqt9\" (UID: \"4f1d0fc5-528f-4529-938f-7041be573fa7\") " pod="openshift-image-registry/image-registry-697d97f7c8-tjqt9" Dec 02 07:25:38 crc kubenswrapper[4895]: I1202 07:25:38.202600 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c04158de-6693-44c6-82d3-198e545eccfb-utilities\") pod \"certified-operators-qsldg\" (UID: \"c04158de-6693-44c6-82d3-198e545eccfb\") " pod="openshift-marketplace/certified-operators-qsldg" Dec 02 07:25:38 crc kubenswrapper[4895]: I1202 07:25:38.203068 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c04158de-6693-44c6-82d3-198e545eccfb-catalog-content\") pod \"certified-operators-qsldg\" (UID: 
\"c04158de-6693-44c6-82d3-198e545eccfb\") " pod="openshift-marketplace/certified-operators-qsldg" Dec 02 07:25:38 crc kubenswrapper[4895]: E1202 07:25:38.203458 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 07:25:38.703449469 +0000 UTC m=+149.874309082 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tjqt9" (UID: "4f1d0fc5-528f-4529-938f-7041be573fa7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:25:38 crc kubenswrapper[4895]: I1202 07:25:38.220007 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-qrngv" podStartSLOduration=121.219980883 podStartE2EDuration="2m1.219980883s" podCreationTimestamp="2025-12-02 07:23:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:25:38.201912605 +0000 UTC m=+149.372772218" watchObservedRunningTime="2025-12-02 07:25:38.219980883 +0000 UTC m=+149.390840496" Dec 02 07:25:38 crc kubenswrapper[4895]: I1202 07:25:38.262797 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnvj2\" (UniqueName: \"kubernetes.io/projected/c04158de-6693-44c6-82d3-198e545eccfb-kube-api-access-rnvj2\") pod \"certified-operators-qsldg\" (UID: \"c04158de-6693-44c6-82d3-198e545eccfb\") " pod="openshift-marketplace/certified-operators-qsldg" Dec 02 07:25:38 crc kubenswrapper[4895]: I1202 07:25:38.270165 4895 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vs2rc" podStartSLOduration=121.270139091 podStartE2EDuration="2m1.270139091s" podCreationTimestamp="2025-12-02 07:23:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:25:38.26297442 +0000 UTC m=+149.433834033" watchObservedRunningTime="2025-12-02 07:25:38.270139091 +0000 UTC m=+149.440998704" Dec 02 07:25:38 crc kubenswrapper[4895]: I1202 07:25:38.311878 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-zkq6j" podStartSLOduration=121.311860463 podStartE2EDuration="2m1.311860463s" podCreationTimestamp="2025-12-02 07:23:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:25:38.311193734 +0000 UTC m=+149.482053337" watchObservedRunningTime="2025-12-02 07:25:38.311860463 +0000 UTC m=+149.482720076" Dec 02 07:25:38 crc kubenswrapper[4895]: I1202 07:25:38.313822 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 07:25:38 crc kubenswrapper[4895]: E1202 07:25:38.314015 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 07:25:38.813993572 +0000 UTC m=+149.984853195 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:25:38 crc kubenswrapper[4895]: I1202 07:25:38.314044 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tjqt9\" (UID: \"4f1d0fc5-528f-4529-938f-7041be573fa7\") " pod="openshift-image-registry/image-registry-697d97f7c8-tjqt9" Dec 02 07:25:38 crc kubenswrapper[4895]: E1202 07:25:38.324407 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 07:25:38.824366244 +0000 UTC m=+149.995225857 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tjqt9" (UID: "4f1d0fc5-528f-4529-938f-7041be573fa7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:25:38 crc kubenswrapper[4895]: I1202 07:25:38.344272 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qsldg" Dec 02 07:25:38 crc kubenswrapper[4895]: I1202 07:25:38.365814 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xhq8l" podStartSLOduration=121.365797037 podStartE2EDuration="2m1.365797037s" podCreationTimestamp="2025-12-02 07:23:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:25:38.365285213 +0000 UTC m=+149.536144846" watchObservedRunningTime="2025-12-02 07:25:38.365797037 +0000 UTC m=+149.536656660" Dec 02 07:25:38 crc kubenswrapper[4895]: I1202 07:25:38.416604 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 07:25:38 crc kubenswrapper[4895]: E1202 07:25:38.417097 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 07:25:38.917074778 +0000 UTC m=+150.087934401 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:25:38 crc kubenswrapper[4895]: I1202 07:25:38.495218 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-jfs6x" podStartSLOduration=121.495194081 podStartE2EDuration="2m1.495194081s" podCreationTimestamp="2025-12-02 07:23:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:25:38.494338187 +0000 UTC m=+149.665197810" watchObservedRunningTime="2025-12-02 07:25:38.495194081 +0000 UTC m=+149.666053694" Dec 02 07:25:38 crc kubenswrapper[4895]: I1202 07:25:38.507179 4895 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-vs2rc container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.38:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 02 07:25:38 crc kubenswrapper[4895]: I1202 07:25:38.507253 4895 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vs2rc" podUID="349b3a38-fe58-4c38-8008-ac5ba643ddef" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.38:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 02 07:25:38 crc kubenswrapper[4895]: I1202 07:25:38.518841 4895 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tjqt9\" (UID: \"4f1d0fc5-528f-4529-938f-7041be573fa7\") " pod="openshift-image-registry/image-registry-697d97f7c8-tjqt9" Dec 02 07:25:38 crc kubenswrapper[4895]: E1202 07:25:38.519328 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 07:25:39.019312828 +0000 UTC m=+150.190172441 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tjqt9" (UID: "4f1d0fc5-528f-4529-938f-7041be573fa7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:25:38 crc kubenswrapper[4895]: I1202 07:25:38.593152 4895 patch_prober.go:28] interesting pod/router-default-5444994796-lf9fx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 02 07:25:38 crc kubenswrapper[4895]: [-]has-synced failed: reason withheld Dec 02 07:25:38 crc kubenswrapper[4895]: [+]process-running ok Dec 02 07:25:38 crc kubenswrapper[4895]: healthz check failed Dec 02 07:25:38 crc kubenswrapper[4895]: I1202 07:25:38.593210 4895 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lf9fx" podUID="02423153-ef82-4284-b703-cf006e0b8b70" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 
500" Dec 02 07:25:38 crc kubenswrapper[4895]: I1202 07:25:38.609441 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ld28g" podStartSLOduration=121.609422029 podStartE2EDuration="2m1.609422029s" podCreationTimestamp="2025-12-02 07:23:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:25:38.559880318 +0000 UTC m=+149.730739951" watchObservedRunningTime="2025-12-02 07:25:38.609422029 +0000 UTC m=+149.780281652" Dec 02 07:25:38 crc kubenswrapper[4895]: I1202 07:25:38.628414 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 07:25:38 crc kubenswrapper[4895]: E1202 07:25:38.628683 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 07:25:39.128646078 +0000 UTC m=+150.299505691 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:25:38 crc kubenswrapper[4895]: I1202 07:25:38.628875 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tjqt9\" (UID: \"4f1d0fc5-528f-4529-938f-7041be573fa7\") " pod="openshift-image-registry/image-registry-697d97f7c8-tjqt9" Dec 02 07:25:38 crc kubenswrapper[4895]: E1202 07:25:38.629879 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 07:25:39.129857303 +0000 UTC m=+150.300716916 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tjqt9" (UID: "4f1d0fc5-528f-4529-938f-7041be573fa7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:25:38 crc kubenswrapper[4895]: I1202 07:25:38.703170 4895 generic.go:334] "Generic (PLEG): container finished" podID="d7b05beb-8c1c-4f69-bf13-199dbf869413" containerID="c85cdfc9b5e08ff322e825f663b168d9b69c4e1c2d142e698c50a52aa78e2e4f" exitCode=0 Dec 02 07:25:38 crc kubenswrapper[4895]: I1202 07:25:38.703528 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wdmt6" event={"ID":"d7b05beb-8c1c-4f69-bf13-199dbf869413","Type":"ContainerDied","Data":"c85cdfc9b5e08ff322e825f663b168d9b69c4e1c2d142e698c50a52aa78e2e4f"} Dec 02 07:25:38 crc kubenswrapper[4895]: I1202 07:25:38.715390 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-5kk5x" event={"ID":"0752165f-320d-4555-8ac0-5cf99cd6194e","Type":"ContainerStarted","Data":"410277d5bbcd1f6ce3477f55b36d4c61d662aea9b3357ec2ef284d8f019380d9"} Dec 02 07:25:38 crc kubenswrapper[4895]: I1202 07:25:38.731513 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 07:25:38 crc kubenswrapper[4895]: E1202 07:25:38.731984 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 07:25:39.231965339 +0000 UTC m=+150.402824952 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:25:38 crc kubenswrapper[4895]: I1202 07:25:38.750847 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4n5nx" event={"ID":"80353fb9-e36f-4a78-a2e5-8c11d25a94f2","Type":"ContainerStarted","Data":"6bffd3e256fbdddf453eadf488767d442cb53973799697b680b735e9702613cf"} Dec 02 07:25:38 crc kubenswrapper[4895]: I1202 07:25:38.750922 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4n5nx" event={"ID":"80353fb9-e36f-4a78-a2e5-8c11d25a94f2","Type":"ContainerStarted","Data":"904120324f67fbaec54911691bc892fa3a5ef2a42472404e1dc3bbe991be594b"} Dec 02 07:25:38 crc kubenswrapper[4895]: I1202 07:25:38.783246 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5q5hv"] Dec 02 07:25:38 crc kubenswrapper[4895]: I1202 07:25:38.792166 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-zj6vq" event={"ID":"168c3e01-42d8-4684-b160-23c5eb559a98","Type":"ContainerStarted","Data":"b9f6945c25180708f2c185c9185ed7280ce0d2c54b1ad181ba6647d3b3590ec8"} Dec 02 07:25:38 crc kubenswrapper[4895]: I1202 07:25:38.792231 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-admission-controller-857f4d67dd-zj6vq" event={"ID":"168c3e01-42d8-4684-b160-23c5eb559a98","Type":"ContainerStarted","Data":"e332eecfdc168487584e28ad3e175e7c1273060317702da77565f1f543910a25"} Dec 02 07:25:38 crc kubenswrapper[4895]: I1202 07:25:38.803732 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4n5nx" podStartSLOduration=121.803714564 podStartE2EDuration="2m1.803714564s" podCreationTimestamp="2025-12-02 07:23:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:25:38.80354231 +0000 UTC m=+149.974401923" watchObservedRunningTime="2025-12-02 07:25:38.803714564 +0000 UTC m=+149.974574177" Dec 02 07:25:38 crc kubenswrapper[4895]: I1202 07:25:38.814623 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-9mn97" event={"ID":"85410206-3fcb-46c1-ac5d-bc3100b30544","Type":"ContainerStarted","Data":"d0ec2c75a681665c8bf1808a4e54f68278506e8e579bc44d917d2259ac7d8a2b"} Dec 02 07:25:38 crc kubenswrapper[4895]: I1202 07:25:38.835894 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tjqt9\" (UID: \"4f1d0fc5-528f-4529-938f-7041be573fa7\") " pod="openshift-image-registry/image-registry-697d97f7c8-tjqt9" Dec 02 07:25:38 crc kubenswrapper[4895]: I1202 07:25:38.843650 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-zj6vq" podStartSLOduration=121.843630735 podStartE2EDuration="2m1.843630735s" podCreationTimestamp="2025-12-02 07:23:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:25:38.841226348 +0000 UTC m=+150.012085961" watchObservedRunningTime="2025-12-02 07:25:38.843630735 +0000 UTC m=+150.014490348" Dec 02 07:25:38 crc kubenswrapper[4895]: E1202 07:25:38.845066 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 07:25:39.345051045 +0000 UTC m=+150.515910658 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tjqt9" (UID: "4f1d0fc5-528f-4529-938f-7041be573fa7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:25:38 crc kubenswrapper[4895]: I1202 07:25:38.850897 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fz6sl" event={"ID":"30d19465-967f-42ad-af2e-983465c989e1","Type":"ContainerStarted","Data":"6b076da48e0e1ab0e3966c42a30b46425b4933485399ba5dda906c1f808e41f5"} Dec 02 07:25:38 crc kubenswrapper[4895]: I1202 07:25:38.940831 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 07:25:38 crc kubenswrapper[4895]: E1202 07:25:38.942907 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 07:25:39.442877292 +0000 UTC m=+150.613736905 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:25:38 crc kubenswrapper[4895]: I1202 07:25:38.961158 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-l82ft" event={"ID":"25eaa26b-3997-47d7-9932-8eff551bc799","Type":"ContainerStarted","Data":"897054b8738aefed8632f5d3d4a697a94e79387dcc85b0a495c46c8682d9d79f"} Dec 02 07:25:38 crc kubenswrapper[4895]: I1202 07:25:38.980832 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-mzn92" event={"ID":"8eae663a-aaa8-488b-b46c-3ce28f7e0bb0","Type":"ContainerStarted","Data":"3ac5682eb1e4deec869367fb68cceaa3f8b826ed5a4ce921f6615b8c01601220"} Dec 02 07:25:38 crc kubenswrapper[4895]: I1202 07:25:38.992676 4895 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-zkq6j container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.32:8080/healthz\": dial tcp 10.217.0.32:8080: connect: connection refused" start-of-body= Dec 02 07:25:38 crc kubenswrapper[4895]: I1202 07:25:38.992757 4895 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-zkq6j" podUID="4a9d5b86-ddba-433a-91c3-efe2043f66e3" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.32:8080/healthz\": dial tcp 10.217.0.32:8080: connect: connection 
refused" Dec 02 07:25:38 crc kubenswrapper[4895]: I1202 07:25:38.993122 4895 patch_prober.go:28] interesting pod/downloads-7954f5f757-d7hb4 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Dec 02 07:25:38 crc kubenswrapper[4895]: I1202 07:25:38.993142 4895 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-d7hb4" podUID="0e8ba2f7-f07b-4532-9620-00662d37f5b9" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Dec 02 07:25:38 crc kubenswrapper[4895]: I1202 07:25:38.994350 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-mzn92" Dec 02 07:25:39 crc kubenswrapper[4895]: I1202 07:25:39.022421 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-nv29v" Dec 02 07:25:39 crc kubenswrapper[4895]: I1202 07:25:39.040505 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vs2rc" Dec 02 07:25:39 crc kubenswrapper[4895]: I1202 07:25:39.046275 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tjqt9\" (UID: \"4f1d0fc5-528f-4529-938f-7041be573fa7\") " pod="openshift-image-registry/image-registry-697d97f7c8-tjqt9" Dec 02 07:25:39 crc kubenswrapper[4895]: E1202 07:25:39.064342 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-02 07:25:39.564313922 +0000 UTC m=+150.735173535 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tjqt9" (UID: "4f1d0fc5-528f-4529-938f-7041be573fa7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:25:39 crc kubenswrapper[4895]: I1202 07:25:39.096466 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-mzn92" podStartSLOduration=10.096442234 podStartE2EDuration="10.096442234s" podCreationTimestamp="2025-12-02 07:25:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:25:39.022687343 +0000 UTC m=+150.193546966" watchObservedRunningTime="2025-12-02 07:25:39.096442234 +0000 UTC m=+150.267301847" Dec 02 07:25:39 crc kubenswrapper[4895]: I1202 07:25:39.164416 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 07:25:39 crc kubenswrapper[4895]: E1202 07:25:39.164921 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 07:25:39.664897286 +0000 UTC m=+150.835756899 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:25:39 crc kubenswrapper[4895]: I1202 07:25:39.173669 4895 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Dec 02 07:25:39 crc kubenswrapper[4895]: I1202 07:25:39.260529 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zbpzj"] Dec 02 07:25:39 crc kubenswrapper[4895]: I1202 07:25:39.268653 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tjqt9\" (UID: \"4f1d0fc5-528f-4529-938f-7041be573fa7\") " pod="openshift-image-registry/image-registry-697d97f7c8-tjqt9" Dec 02 07:25:39 crc kubenswrapper[4895]: E1202 07:25:39.277386 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 07:25:39.777360454 +0000 UTC m=+150.948220067 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tjqt9" (UID: "4f1d0fc5-528f-4529-938f-7041be573fa7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:25:39 crc kubenswrapper[4895]: W1202 07:25:39.335054 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2ad8c7c3_c1ac_498b_b1b5_76f1db5c1c19.slice/crio-df811010396c482c331a4259a468db1921ebdbaebe8d78610bc1439a28276495 WatchSource:0}: Error finding container df811010396c482c331a4259a468db1921ebdbaebe8d78610bc1439a28276495: Status 404 returned error can't find the container with id df811010396c482c331a4259a468db1921ebdbaebe8d78610bc1439a28276495 Dec 02 07:25:39 crc kubenswrapper[4895]: I1202 07:25:39.366200 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-899nx"] Dec 02 07:25:39 crc kubenswrapper[4895]: I1202 07:25:39.372830 4895 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-12-02T07:25:39.1739349Z","Handler":null,"Name":""} Dec 02 07:25:39 crc kubenswrapper[4895]: I1202 07:25:39.390365 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 07:25:39 crc kubenswrapper[4895]: E1202 07:25:39.390909 4895 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 07:25:39.890892263 +0000 UTC m=+151.061751876 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:25:39 crc kubenswrapper[4895]: I1202 07:25:39.403226 4895 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Dec 02 07:25:39 crc kubenswrapper[4895]: I1202 07:25:39.403254 4895 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Dec 02 07:25:39 crc kubenswrapper[4895]: I1202 07:25:39.452595 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qsldg"] Dec 02 07:25:39 crc kubenswrapper[4895]: I1202 07:25:39.475682 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" Dec 02 07:25:39 crc kubenswrapper[4895]: I1202 07:25:39.492526 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tjqt9\" (UID: \"4f1d0fc5-528f-4529-938f-7041be573fa7\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-tjqt9" Dec 02 07:25:39 crc kubenswrapper[4895]: I1202 07:25:39.528185 4895 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 02 07:25:39 crc kubenswrapper[4895]: I1202 07:25:39.528228 4895 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tjqt9\" (UID: \"4f1d0fc5-528f-4529-938f-7041be573fa7\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-tjqt9" Dec 02 07:25:39 crc kubenswrapper[4895]: I1202 07:25:39.583444 4895 patch_prober.go:28] interesting pod/router-default-5444994796-lf9fx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 02 07:25:39 crc kubenswrapper[4895]: [-]has-synced failed: reason withheld Dec 02 07:25:39 crc kubenswrapper[4895]: [+]process-running ok Dec 02 07:25:39 crc kubenswrapper[4895]: healthz check failed Dec 02 07:25:39 crc kubenswrapper[4895]: I1202 07:25:39.583512 4895 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lf9fx" podUID="02423153-ef82-4284-b703-cf006e0b8b70" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 02 07:25:39 crc kubenswrapper[4895]: I1202 07:25:39.636550 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-ptmfz"] Dec 02 07:25:39 crc kubenswrapper[4895]: I1202 07:25:39.645330 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ptmfz" Dec 02 07:25:39 crc kubenswrapper[4895]: I1202 07:25:39.656191 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 02 07:25:39 crc kubenswrapper[4895]: I1202 07:25:39.671640 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ptmfz"] Dec 02 07:25:39 crc kubenswrapper[4895]: I1202 07:25:39.712568 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5wcp\" (UniqueName: \"kubernetes.io/projected/24eca501-8830-4bc6-8a5e-e00d227e841c-kube-api-access-k5wcp\") pod \"redhat-marketplace-ptmfz\" (UID: \"24eca501-8830-4bc6-8a5e-e00d227e841c\") " pod="openshift-marketplace/redhat-marketplace-ptmfz" Dec 02 07:25:39 crc kubenswrapper[4895]: I1202 07:25:39.712624 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24eca501-8830-4bc6-8a5e-e00d227e841c-catalog-content\") pod \"redhat-marketplace-ptmfz\" (UID: \"24eca501-8830-4bc6-8a5e-e00d227e841c\") " pod="openshift-marketplace/redhat-marketplace-ptmfz" Dec 02 07:25:39 crc kubenswrapper[4895]: I1202 07:25:39.712694 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24eca501-8830-4bc6-8a5e-e00d227e841c-utilities\") pod \"redhat-marketplace-ptmfz\" (UID: \"24eca501-8830-4bc6-8a5e-e00d227e841c\") " pod="openshift-marketplace/redhat-marketplace-ptmfz" Dec 02 07:25:39 crc kubenswrapper[4895]: I1202 07:25:39.814692 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24eca501-8830-4bc6-8a5e-e00d227e841c-utilities\") pod \"redhat-marketplace-ptmfz\" (UID: 
\"24eca501-8830-4bc6-8a5e-e00d227e841c\") " pod="openshift-marketplace/redhat-marketplace-ptmfz" Dec 02 07:25:39 crc kubenswrapper[4895]: I1202 07:25:39.825121 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k5wcp\" (UniqueName: \"kubernetes.io/projected/24eca501-8830-4bc6-8a5e-e00d227e841c-kube-api-access-k5wcp\") pod \"redhat-marketplace-ptmfz\" (UID: \"24eca501-8830-4bc6-8a5e-e00d227e841c\") " pod="openshift-marketplace/redhat-marketplace-ptmfz" Dec 02 07:25:39 crc kubenswrapper[4895]: I1202 07:25:39.825212 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24eca501-8830-4bc6-8a5e-e00d227e841c-catalog-content\") pod \"redhat-marketplace-ptmfz\" (UID: \"24eca501-8830-4bc6-8a5e-e00d227e841c\") " pod="openshift-marketplace/redhat-marketplace-ptmfz" Dec 02 07:25:39 crc kubenswrapper[4895]: I1202 07:25:39.825720 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24eca501-8830-4bc6-8a5e-e00d227e841c-catalog-content\") pod \"redhat-marketplace-ptmfz\" (UID: \"24eca501-8830-4bc6-8a5e-e00d227e841c\") " pod="openshift-marketplace/redhat-marketplace-ptmfz" Dec 02 07:25:39 crc kubenswrapper[4895]: I1202 07:25:39.815135 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24eca501-8830-4bc6-8a5e-e00d227e841c-utilities\") pod \"redhat-marketplace-ptmfz\" (UID: \"24eca501-8830-4bc6-8a5e-e00d227e841c\") " pod="openshift-marketplace/redhat-marketplace-ptmfz" Dec 02 07:25:39 crc kubenswrapper[4895]: I1202 07:25:39.857687 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5wcp\" (UniqueName: \"kubernetes.io/projected/24eca501-8830-4bc6-8a5e-e00d227e841c-kube-api-access-k5wcp\") pod \"redhat-marketplace-ptmfz\" (UID: 
\"24eca501-8830-4bc6-8a5e-e00d227e841c\") " pod="openshift-marketplace/redhat-marketplace-ptmfz" Dec 02 07:25:39 crc kubenswrapper[4895]: I1202 07:25:39.968144 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tjqt9\" (UID: \"4f1d0fc5-528f-4529-938f-7041be573fa7\") " pod="openshift-image-registry/image-registry-697d97f7c8-tjqt9" Dec 02 07:25:39 crc kubenswrapper[4895]: I1202 07:25:39.979827 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-l7hjt"] Dec 02 07:25:39 crc kubenswrapper[4895]: I1202 07:25:39.980912 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l7hjt" Dec 02 07:25:40 crc kubenswrapper[4895]: I1202 07:25:40.001207 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ptmfz" Dec 02 07:25:40 crc kubenswrapper[4895]: I1202 07:25:40.011906 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-l7hjt"] Dec 02 07:25:40 crc kubenswrapper[4895]: I1202 07:25:40.019923 4895 generic.go:334] "Generic (PLEG): container finished" podID="2ad8c7c3-c1ac-498b-b1b5-76f1db5c1c19" containerID="19109259329f04661bf71f534f1b21f8929c6d5bd7773dab3629044eda9e7836" exitCode=0 Dec 02 07:25:40 crc kubenswrapper[4895]: I1202 07:25:40.020005 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zbpzj" event={"ID":"2ad8c7c3-c1ac-498b-b1b5-76f1db5c1c19","Type":"ContainerDied","Data":"19109259329f04661bf71f534f1b21f8929c6d5bd7773dab3629044eda9e7836"} Dec 02 07:25:40 crc kubenswrapper[4895]: I1202 07:25:40.020041 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zbpzj" event={"ID":"2ad8c7c3-c1ac-498b-b1b5-76f1db5c1c19","Type":"ContainerStarted","Data":"df811010396c482c331a4259a468db1921ebdbaebe8d78610bc1439a28276495"} Dec 02 07:25:40 crc kubenswrapper[4895]: I1202 07:25:40.028566 4895 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 02 07:25:40 crc kubenswrapper[4895]: I1202 07:25:40.030394 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 07:25:40 crc kubenswrapper[4895]: I1202 07:25:40.030722 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4gwr\" (UniqueName: 
\"kubernetes.io/projected/873d08e2-d2e6-4785-b807-3b50d758d136-kube-api-access-x4gwr\") pod \"redhat-marketplace-l7hjt\" (UID: \"873d08e2-d2e6-4785-b807-3b50d758d136\") " pod="openshift-marketplace/redhat-marketplace-l7hjt" Dec 02 07:25:40 crc kubenswrapper[4895]: I1202 07:25:40.030773 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/873d08e2-d2e6-4785-b807-3b50d758d136-catalog-content\") pod \"redhat-marketplace-l7hjt\" (UID: \"873d08e2-d2e6-4785-b807-3b50d758d136\") " pod="openshift-marketplace/redhat-marketplace-l7hjt" Dec 02 07:25:40 crc kubenswrapper[4895]: I1202 07:25:40.030798 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/873d08e2-d2e6-4785-b807-3b50d758d136-utilities\") pod \"redhat-marketplace-l7hjt\" (UID: \"873d08e2-d2e6-4785-b807-3b50d758d136\") " pod="openshift-marketplace/redhat-marketplace-l7hjt" Dec 02 07:25:40 crc kubenswrapper[4895]: I1202 07:25:40.056110 4895 generic.go:334] "Generic (PLEG): container finished" podID="d0fded14-dfbe-41aa-af93-f68c62a1aca1" containerID="6e2f6e38ca8373d214cb0494d9abf782e17f98428bc2189208b7bfdba85ad718" exitCode=0 Dec 02 07:25:40 crc kubenswrapper[4895]: I1202 07:25:40.056222 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5q5hv" event={"ID":"d0fded14-dfbe-41aa-af93-f68c62a1aca1","Type":"ContainerDied","Data":"6e2f6e38ca8373d214cb0494d9abf782e17f98428bc2189208b7bfdba85ad718"} Dec 02 07:25:40 crc kubenswrapper[4895]: I1202 07:25:40.056245 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5q5hv" event={"ID":"d0fded14-dfbe-41aa-af93-f68c62a1aca1","Type":"ContainerStarted","Data":"198c250052edd6dd7f2c8faa5b781b549698d366753f6e900814a63f59d71f20"} Dec 02 07:25:40 crc kubenswrapper[4895]: I1202 
07:25:40.114349 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wdmt6" event={"ID":"d7b05beb-8c1c-4f69-bf13-199dbf869413","Type":"ContainerStarted","Data":"318fc22eba0049baeb3773bc2d246a4dee8e00060153afd9f3349e3ff84dbe3a"} Dec 02 07:25:40 crc kubenswrapper[4895]: I1202 07:25:40.134020 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4gwr\" (UniqueName: \"kubernetes.io/projected/873d08e2-d2e6-4785-b807-3b50d758d136-kube-api-access-x4gwr\") pod \"redhat-marketplace-l7hjt\" (UID: \"873d08e2-d2e6-4785-b807-3b50d758d136\") " pod="openshift-marketplace/redhat-marketplace-l7hjt" Dec 02 07:25:40 crc kubenswrapper[4895]: I1202 07:25:40.134069 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/873d08e2-d2e6-4785-b807-3b50d758d136-catalog-content\") pod \"redhat-marketplace-l7hjt\" (UID: \"873d08e2-d2e6-4785-b807-3b50d758d136\") " pod="openshift-marketplace/redhat-marketplace-l7hjt" Dec 02 07:25:40 crc kubenswrapper[4895]: I1202 07:25:40.134108 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/873d08e2-d2e6-4785-b807-3b50d758d136-utilities\") pod \"redhat-marketplace-l7hjt\" (UID: \"873d08e2-d2e6-4785-b807-3b50d758d136\") " pod="openshift-marketplace/redhat-marketplace-l7hjt" Dec 02 07:25:40 crc kubenswrapper[4895]: I1202 07:25:40.136258 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/873d08e2-d2e6-4785-b807-3b50d758d136-catalog-content\") pod \"redhat-marketplace-l7hjt\" (UID: \"873d08e2-d2e6-4785-b807-3b50d758d136\") " pod="openshift-marketplace/redhat-marketplace-l7hjt" Dec 02 07:25:40 crc kubenswrapper[4895]: I1202 07:25:40.137178 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/873d08e2-d2e6-4785-b807-3b50d758d136-utilities\") pod \"redhat-marketplace-l7hjt\" (UID: \"873d08e2-d2e6-4785-b807-3b50d758d136\") " pod="openshift-marketplace/redhat-marketplace-l7hjt" Dec 02 07:25:40 crc kubenswrapper[4895]: I1202 07:25:40.158124 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 02 07:25:40 crc kubenswrapper[4895]: E1202 07:25:40.187603 4895 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc04158de_6693_44c6_82d3_198e545eccfb.slice/crio-conmon-3909dd207d8ca37be457232d1d2d5b4fddb299544647d1c6aad8d25122d04b29.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6040fed_14e6_49a5_802b_e49bfeba7aa5.slice/crio-e87e9632a2248f19a185548fc87db3689bafb81a24a8fec00ed8bac605a3958c.scope\": RecentStats: unable to find data in memory cache]" Dec 02 07:25:40 crc kubenswrapper[4895]: I1202 07:25:40.211123 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-5kk5x" event={"ID":"0752165f-320d-4555-8ac0-5cf99cd6194e","Type":"ContainerStarted","Data":"f8df12f1f94803c387bbdcd6cb65dfa087be0af12912136f09b7f81aa026dab6"} Dec 02 07:25:40 crc kubenswrapper[4895]: I1202 07:25:40.211182 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-5kk5x" 
event={"ID":"0752165f-320d-4555-8ac0-5cf99cd6194e","Type":"ContainerStarted","Data":"0538315ebf475b78566d4d3316eb10c0924468378583799c4ff06b86a52962de"} Dec 02 07:25:40 crc kubenswrapper[4895]: I1202 07:25:40.220182 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4gwr\" (UniqueName: \"kubernetes.io/projected/873d08e2-d2e6-4785-b807-3b50d758d136-kube-api-access-x4gwr\") pod \"redhat-marketplace-l7hjt\" (UID: \"873d08e2-d2e6-4785-b807-3b50d758d136\") " pod="openshift-marketplace/redhat-marketplace-l7hjt" Dec 02 07:25:40 crc kubenswrapper[4895]: I1202 07:25:40.228791 4895 generic.go:334] "Generic (PLEG): container finished" podID="e6040fed-14e6-49a5-802b-e49bfeba7aa5" containerID="e87e9632a2248f19a185548fc87db3689bafb81a24a8fec00ed8bac605a3958c" exitCode=0 Dec 02 07:25:40 crc kubenswrapper[4895]: I1202 07:25:40.228920 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-899nx" event={"ID":"e6040fed-14e6-49a5-802b-e49bfeba7aa5","Type":"ContainerDied","Data":"e87e9632a2248f19a185548fc87db3689bafb81a24a8fec00ed8bac605a3958c"} Dec 02 07:25:40 crc kubenswrapper[4895]: I1202 07:25:40.228952 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-899nx" event={"ID":"e6040fed-14e6-49a5-802b-e49bfeba7aa5","Type":"ContainerStarted","Data":"088385ce913bf789e7bb655af7c4729ddc13523d74df1332a7e7007a588752bd"} Dec 02 07:25:40 crc kubenswrapper[4895]: I1202 07:25:40.241287 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qsldg" event={"ID":"c04158de-6693-44c6-82d3-198e545eccfb","Type":"ContainerStarted","Data":"0a3cd8cd6db0b54a499191102c15267ae4ad43caec27eae97d24967d297ad308"} Dec 02 07:25:40 crc kubenswrapper[4895]: I1202 07:25:40.252022 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-zkq6j" Dec 02 07:25:40 crc 
kubenswrapper[4895]: I1202 07:25:40.262208 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-tjqt9" Dec 02 07:25:40 crc kubenswrapper[4895]: I1202 07:25:40.270264 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wdmt6" podStartSLOduration=123.270240766 podStartE2EDuration="2m3.270240766s" podCreationTimestamp="2025-12-02 07:23:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:25:40.161467931 +0000 UTC m=+151.332327544" watchObservedRunningTime="2025-12-02 07:25:40.270240766 +0000 UTC m=+151.441100389" Dec 02 07:25:40 crc kubenswrapper[4895]: I1202 07:25:40.393020 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 02 07:25:40 crc kubenswrapper[4895]: I1202 07:25:40.394829 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 02 07:25:40 crc kubenswrapper[4895]: I1202 07:25:40.397133 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Dec 02 07:25:40 crc kubenswrapper[4895]: I1202 07:25:40.398515 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Dec 02 07:25:40 crc kubenswrapper[4895]: I1202 07:25:40.403807 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l7hjt" Dec 02 07:25:40 crc kubenswrapper[4895]: I1202 07:25:40.413640 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 02 07:25:40 crc kubenswrapper[4895]: I1202 07:25:40.447235 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/08c2b56f-dd15-4d5f-844f-fd6571a23f7f-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"08c2b56f-dd15-4d5f-844f-fd6571a23f7f\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 02 07:25:40 crc kubenswrapper[4895]: I1202 07:25:40.449346 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/08c2b56f-dd15-4d5f-844f-fd6571a23f7f-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"08c2b56f-dd15-4d5f-844f-fd6571a23f7f\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 02 07:25:40 crc kubenswrapper[4895]: I1202 07:25:40.551095 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/08c2b56f-dd15-4d5f-844f-fd6571a23f7f-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"08c2b56f-dd15-4d5f-844f-fd6571a23f7f\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 02 07:25:40 crc kubenswrapper[4895]: I1202 07:25:40.551212 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/08c2b56f-dd15-4d5f-844f-fd6571a23f7f-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"08c2b56f-dd15-4d5f-844f-fd6571a23f7f\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 02 07:25:40 crc kubenswrapper[4895]: I1202 07:25:40.551725 4895 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/08c2b56f-dd15-4d5f-844f-fd6571a23f7f-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"08c2b56f-dd15-4d5f-844f-fd6571a23f7f\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 02 07:25:40 crc kubenswrapper[4895]: I1202 07:25:40.570403 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-wjh4k"] Dec 02 07:25:40 crc kubenswrapper[4895]: I1202 07:25:40.571855 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wjh4k" Dec 02 07:25:40 crc kubenswrapper[4895]: I1202 07:25:40.577034 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 02 07:25:40 crc kubenswrapper[4895]: I1202 07:25:40.590312 4895 patch_prober.go:28] interesting pod/router-default-5444994796-lf9fx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 02 07:25:40 crc kubenswrapper[4895]: [-]has-synced failed: reason withheld Dec 02 07:25:40 crc kubenswrapper[4895]: [+]process-running ok Dec 02 07:25:40 crc kubenswrapper[4895]: healthz check failed Dec 02 07:25:40 crc kubenswrapper[4895]: I1202 07:25:40.596315 4895 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lf9fx" podUID="02423153-ef82-4284-b703-cf006e0b8b70" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 02 07:25:40 crc kubenswrapper[4895]: I1202 07:25:40.599386 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wjh4k"] Dec 02 07:25:40 crc kubenswrapper[4895]: I1202 07:25:40.625141 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/08c2b56f-dd15-4d5f-844f-fd6571a23f7f-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"08c2b56f-dd15-4d5f-844f-fd6571a23f7f\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 02 07:25:40 crc kubenswrapper[4895]: I1202 07:25:40.652584 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4ffa89c-1b2e-4dd4-afa5-f34cd8260364-catalog-content\") pod \"redhat-operators-wjh4k\" (UID: \"f4ffa89c-1b2e-4dd4-afa5-f34cd8260364\") " pod="openshift-marketplace/redhat-operators-wjh4k" Dec 02 07:25:40 crc kubenswrapper[4895]: I1202 07:25:40.652665 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfjgv\" (UniqueName: \"kubernetes.io/projected/f4ffa89c-1b2e-4dd4-afa5-f34cd8260364-kube-api-access-lfjgv\") pod \"redhat-operators-wjh4k\" (UID: \"f4ffa89c-1b2e-4dd4-afa5-f34cd8260364\") " pod="openshift-marketplace/redhat-operators-wjh4k" Dec 02 07:25:40 crc kubenswrapper[4895]: I1202 07:25:40.652712 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4ffa89c-1b2e-4dd4-afa5-f34cd8260364-utilities\") pod \"redhat-operators-wjh4k\" (UID: \"f4ffa89c-1b2e-4dd4-afa5-f34cd8260364\") " pod="openshift-marketplace/redhat-operators-wjh4k" Dec 02 07:25:40 crc kubenswrapper[4895]: I1202 07:25:40.654927 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ptmfz"] Dec 02 07:25:40 crc kubenswrapper[4895]: I1202 07:25:40.756238 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4ffa89c-1b2e-4dd4-afa5-f34cd8260364-utilities\") pod \"redhat-operators-wjh4k\" (UID: \"f4ffa89c-1b2e-4dd4-afa5-f34cd8260364\") " 
pod="openshift-marketplace/redhat-operators-wjh4k" Dec 02 07:25:40 crc kubenswrapper[4895]: I1202 07:25:40.756520 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4ffa89c-1b2e-4dd4-afa5-f34cd8260364-catalog-content\") pod \"redhat-operators-wjh4k\" (UID: \"f4ffa89c-1b2e-4dd4-afa5-f34cd8260364\") " pod="openshift-marketplace/redhat-operators-wjh4k" Dec 02 07:25:40 crc kubenswrapper[4895]: I1202 07:25:40.756594 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lfjgv\" (UniqueName: \"kubernetes.io/projected/f4ffa89c-1b2e-4dd4-afa5-f34cd8260364-kube-api-access-lfjgv\") pod \"redhat-operators-wjh4k\" (UID: \"f4ffa89c-1b2e-4dd4-afa5-f34cd8260364\") " pod="openshift-marketplace/redhat-operators-wjh4k" Dec 02 07:25:40 crc kubenswrapper[4895]: I1202 07:25:40.758224 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4ffa89c-1b2e-4dd4-afa5-f34cd8260364-utilities\") pod \"redhat-operators-wjh4k\" (UID: \"f4ffa89c-1b2e-4dd4-afa5-f34cd8260364\") " pod="openshift-marketplace/redhat-operators-wjh4k" Dec 02 07:25:40 crc kubenswrapper[4895]: I1202 07:25:40.760448 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4ffa89c-1b2e-4dd4-afa5-f34cd8260364-catalog-content\") pod \"redhat-operators-wjh4k\" (UID: \"f4ffa89c-1b2e-4dd4-afa5-f34cd8260364\") " pod="openshift-marketplace/redhat-operators-wjh4k" Dec 02 07:25:40 crc kubenswrapper[4895]: I1202 07:25:40.776962 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 02 07:25:40 crc kubenswrapper[4895]: I1202 07:25:40.779512 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfjgv\" (UniqueName: \"kubernetes.io/projected/f4ffa89c-1b2e-4dd4-afa5-f34cd8260364-kube-api-access-lfjgv\") pod \"redhat-operators-wjh4k\" (UID: \"f4ffa89c-1b2e-4dd4-afa5-f34cd8260364\") " pod="openshift-marketplace/redhat-operators-wjh4k" Dec 02 07:25:40 crc kubenswrapper[4895]: I1202 07:25:40.813920 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-l7hjt"] Dec 02 07:25:40 crc kubenswrapper[4895]: W1202 07:25:40.870943 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod873d08e2_d2e6_4785_b807_3b50d758d136.slice/crio-7344ada898fe7c120a55256d02448c442c26f0d825b6ce7007290f712abe6389 WatchSource:0}: Error finding container 7344ada898fe7c120a55256d02448c442c26f0d825b6ce7007290f712abe6389: Status 404 returned error can't find the container with id 7344ada898fe7c120a55256d02448c442c26f0d825b6ce7007290f712abe6389 Dec 02 07:25:40 crc kubenswrapper[4895]: I1202 07:25:40.932621 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wjh4k" Dec 02 07:25:40 crc kubenswrapper[4895]: I1202 07:25:40.939362 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-tjqt9"] Dec 02 07:25:40 crc kubenswrapper[4895]: I1202 07:25:40.955574 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-5c5n7"] Dec 02 07:25:40 crc kubenswrapper[4895]: I1202 07:25:40.957049 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5c5n7" Dec 02 07:25:40 crc kubenswrapper[4895]: I1202 07:25:40.976034 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5c5n7"] Dec 02 07:25:41 crc kubenswrapper[4895]: I1202 07:25:41.070623 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/459e9e10-320f-47aa-901c-cfd9ec241a67-catalog-content\") pod \"redhat-operators-5c5n7\" (UID: \"459e9e10-320f-47aa-901c-cfd9ec241a67\") " pod="openshift-marketplace/redhat-operators-5c5n7" Dec 02 07:25:41 crc kubenswrapper[4895]: I1202 07:25:41.071159 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/459e9e10-320f-47aa-901c-cfd9ec241a67-utilities\") pod \"redhat-operators-5c5n7\" (UID: \"459e9e10-320f-47aa-901c-cfd9ec241a67\") " pod="openshift-marketplace/redhat-operators-5c5n7" Dec 02 07:25:41 crc kubenswrapper[4895]: I1202 07:25:41.071204 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:25:41 crc kubenswrapper[4895]: I1202 07:25:41.071231 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:25:41 crc kubenswrapper[4895]: I1202 
07:25:41.071298 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5pp7s\" (UniqueName: \"kubernetes.io/projected/459e9e10-320f-47aa-901c-cfd9ec241a67-kube-api-access-5pp7s\") pod \"redhat-operators-5c5n7\" (UID: \"459e9e10-320f-47aa-901c-cfd9ec241a67\") " pod="openshift-marketplace/redhat-operators-5c5n7" Dec 02 07:25:41 crc kubenswrapper[4895]: I1202 07:25:41.073598 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:25:41 crc kubenswrapper[4895]: I1202 07:25:41.081721 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:25:41 crc kubenswrapper[4895]: I1202 07:25:41.100051 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 02 07:25:41 crc kubenswrapper[4895]: I1202 07:25:41.160491 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Dec 02 07:25:41 crc kubenswrapper[4895]: I1202 07:25:41.172864 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/459e9e10-320f-47aa-901c-cfd9ec241a67-utilities\") pod \"redhat-operators-5c5n7\" (UID: 
\"459e9e10-320f-47aa-901c-cfd9ec241a67\") " pod="openshift-marketplace/redhat-operators-5c5n7" Dec 02 07:25:41 crc kubenswrapper[4895]: I1202 07:25:41.172941 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:25:41 crc kubenswrapper[4895]: I1202 07:25:41.172973 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5pp7s\" (UniqueName: \"kubernetes.io/projected/459e9e10-320f-47aa-901c-cfd9ec241a67-kube-api-access-5pp7s\") pod \"redhat-operators-5c5n7\" (UID: \"459e9e10-320f-47aa-901c-cfd9ec241a67\") " pod="openshift-marketplace/redhat-operators-5c5n7" Dec 02 07:25:41 crc kubenswrapper[4895]: I1202 07:25:41.173004 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:25:41 crc kubenswrapper[4895]: I1202 07:25:41.173049 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/459e9e10-320f-47aa-901c-cfd9ec241a67-catalog-content\") pod \"redhat-operators-5c5n7\" (UID: \"459e9e10-320f-47aa-901c-cfd9ec241a67\") " pod="openshift-marketplace/redhat-operators-5c5n7" Dec 02 07:25:41 crc kubenswrapper[4895]: I1202 07:25:41.173589 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/459e9e10-320f-47aa-901c-cfd9ec241a67-catalog-content\") 
pod \"redhat-operators-5c5n7\" (UID: \"459e9e10-320f-47aa-901c-cfd9ec241a67\") " pod="openshift-marketplace/redhat-operators-5c5n7" Dec 02 07:25:41 crc kubenswrapper[4895]: I1202 07:25:41.174176 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/459e9e10-320f-47aa-901c-cfd9ec241a67-utilities\") pod \"redhat-operators-5c5n7\" (UID: \"459e9e10-320f-47aa-901c-cfd9ec241a67\") " pod="openshift-marketplace/redhat-operators-5c5n7" Dec 02 07:25:41 crc kubenswrapper[4895]: I1202 07:25:41.188369 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-jfs6x" Dec 02 07:25:41 crc kubenswrapper[4895]: I1202 07:25:41.188407 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-jfs6x" Dec 02 07:25:41 crc kubenswrapper[4895]: I1202 07:25:41.188858 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:25:41 crc kubenswrapper[4895]: I1202 07:25:41.193781 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:25:41 crc kubenswrapper[4895]: I1202 07:25:41.193780 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5pp7s\" (UniqueName: \"kubernetes.io/projected/459e9e10-320f-47aa-901c-cfd9ec241a67-kube-api-access-5pp7s\") pod 
\"redhat-operators-5c5n7\" (UID: \"459e9e10-320f-47aa-901c-cfd9ec241a67\") " pod="openshift-marketplace/redhat-operators-5c5n7" Dec 02 07:25:41 crc kubenswrapper[4895]: I1202 07:25:41.202953 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-jfs6x" Dec 02 07:25:41 crc kubenswrapper[4895]: I1202 07:25:41.260649 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:25:41 crc kubenswrapper[4895]: I1202 07:25:41.266107 4895 generic.go:334] "Generic (PLEG): container finished" podID="c04158de-6693-44c6-82d3-198e545eccfb" containerID="3909dd207d8ca37be457232d1d2d5b4fddb299544647d1c6aad8d25122d04b29" exitCode=0 Dec 02 07:25:41 crc kubenswrapper[4895]: I1202 07:25:41.266204 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qsldg" event={"ID":"c04158de-6693-44c6-82d3-198e545eccfb","Type":"ContainerDied","Data":"3909dd207d8ca37be457232d1d2d5b4fddb299544647d1c6aad8d25122d04b29"} Dec 02 07:25:41 crc kubenswrapper[4895]: I1202 07:25:41.267571 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-tjqt9" event={"ID":"4f1d0fc5-528f-4529-938f-7041be573fa7","Type":"ContainerStarted","Data":"7e04707a516deb55f156bc370efa1fb13eb20cfba589947b56e8b4d9038c22e1"} Dec 02 07:25:41 crc kubenswrapper[4895]: I1202 07:25:41.267638 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-tjqt9" event={"ID":"4f1d0fc5-528f-4529-938f-7041be573fa7","Type":"ContainerStarted","Data":"14e3a3c4df1f81bc8db888569aed2e7b6ff37792719ec6f25fdbd9597470336f"} Dec 02 07:25:41 crc kubenswrapper[4895]: I1202 07:25:41.267860 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-tjqt9" Dec 02 07:25:41 crc 
kubenswrapper[4895]: I1202 07:25:41.277525 4895 generic.go:334] "Generic (PLEG): container finished" podID="7e8f451a-ac50-4e0a-bf8e-e6d505305177" containerID="9c4150dad8ac9bf277c6659f2996173dcea7cf6a3077852fb2e4dea67dec1703" exitCode=0 Dec 02 07:25:41 crc kubenswrapper[4895]: I1202 07:25:41.277615 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29410995-cbx2h" event={"ID":"7e8f451a-ac50-4e0a-bf8e-e6d505305177","Type":"ContainerDied","Data":"9c4150dad8ac9bf277c6659f2996173dcea7cf6a3077852fb2e4dea67dec1703"} Dec 02 07:25:41 crc kubenswrapper[4895]: I1202 07:25:41.279487 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"08c2b56f-dd15-4d5f-844f-fd6571a23f7f","Type":"ContainerStarted","Data":"93ea1a5842a89b7f6f675e65b1e8dbd2812bf7607a6a52aff8fa238245297270"} Dec 02 07:25:41 crc kubenswrapper[4895]: I1202 07:25:41.282467 4895 generic.go:334] "Generic (PLEG): container finished" podID="873d08e2-d2e6-4785-b807-3b50d758d136" containerID="2dc57c9ac27f3fadacbb0fd0e7ca45a937372ef220adc6d40703b2fe9b58b2f4" exitCode=0 Dec 02 07:25:41 crc kubenswrapper[4895]: I1202 07:25:41.282551 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l7hjt" event={"ID":"873d08e2-d2e6-4785-b807-3b50d758d136","Type":"ContainerDied","Data":"2dc57c9ac27f3fadacbb0fd0e7ca45a937372ef220adc6d40703b2fe9b58b2f4"} Dec 02 07:25:41 crc kubenswrapper[4895]: I1202 07:25:41.282585 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l7hjt" event={"ID":"873d08e2-d2e6-4785-b807-3b50d758d136","Type":"ContainerStarted","Data":"7344ada898fe7c120a55256d02448c442c26f0d825b6ce7007290f712abe6389"} Dec 02 07:25:41 crc kubenswrapper[4895]: I1202 07:25:41.291055 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-image-registry/image-registry-697d97f7c8-tjqt9" podStartSLOduration=124.291015029 podStartE2EDuration="2m4.291015029s" podCreationTimestamp="2025-12-02 07:23:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:25:41.290770342 +0000 UTC m=+152.461629975" watchObservedRunningTime="2025-12-02 07:25:41.291015029 +0000 UTC m=+152.461874642" Dec 02 07:25:41 crc kubenswrapper[4895]: I1202 07:25:41.302768 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5c5n7" Dec 02 07:25:41 crc kubenswrapper[4895]: I1202 07:25:41.315039 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-5kk5x" event={"ID":"0752165f-320d-4555-8ac0-5cf99cd6194e","Type":"ContainerStarted","Data":"6f27077bd964edbe64308d5a8a3ec490b2cfdd70acc83265d0e90e1ffe7d683e"} Dec 02 07:25:41 crc kubenswrapper[4895]: I1202 07:25:41.318087 4895 generic.go:334] "Generic (PLEG): container finished" podID="24eca501-8830-4bc6-8a5e-e00d227e841c" containerID="97a525d686bb084acb0f3699a2c52005b37b7854236377ccf6903454f9ca0171" exitCode=0 Dec 02 07:25:41 crc kubenswrapper[4895]: I1202 07:25:41.318230 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ptmfz" event={"ID":"24eca501-8830-4bc6-8a5e-e00d227e841c","Type":"ContainerDied","Data":"97a525d686bb084acb0f3699a2c52005b37b7854236377ccf6903454f9ca0171"} Dec 02 07:25:41 crc kubenswrapper[4895]: I1202 07:25:41.318249 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ptmfz" event={"ID":"24eca501-8830-4bc6-8a5e-e00d227e841c","Type":"ContainerStarted","Data":"894dd4fb1df051d5da966aa1c44069c1e2e167bb80e259dd72f67ea0c45472a0"} Dec 02 07:25:41 crc kubenswrapper[4895]: I1202 07:25:41.322622 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-apiserver/apiserver-76f77b778f-jfs6x" Dec 02 07:25:41 crc kubenswrapper[4895]: I1202 07:25:41.370150 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-5kk5x" podStartSLOduration=12.370128421 podStartE2EDuration="12.370128421s" podCreationTimestamp="2025-12-02 07:25:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:25:41.367853947 +0000 UTC m=+152.538713580" watchObservedRunningTime="2025-12-02 07:25:41.370128421 +0000 UTC m=+152.540988034" Dec 02 07:25:41 crc kubenswrapper[4895]: I1202 07:25:41.371531 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:25:41 crc kubenswrapper[4895]: I1202 07:25:41.396375 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:25:41 crc kubenswrapper[4895]: I1202 07:25:41.579635 4895 patch_prober.go:28] interesting pod/router-default-5444994796-lf9fx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 02 07:25:41 crc kubenswrapper[4895]: [-]has-synced failed: reason withheld Dec 02 07:25:41 crc kubenswrapper[4895]: [+]process-running ok Dec 02 07:25:41 crc kubenswrapper[4895]: healthz check failed Dec 02 07:25:41 crc kubenswrapper[4895]: I1202 07:25:41.579944 4895 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lf9fx" podUID="02423153-ef82-4284-b703-cf006e0b8b70" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 02 07:25:41 crc kubenswrapper[4895]: I1202 07:25:41.587756 4895 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-marketplace/redhat-operators-wjh4k"] Dec 02 07:25:41 crc kubenswrapper[4895]: I1202 07:25:41.880968 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5c5n7"] Dec 02 07:25:42 crc kubenswrapper[4895]: I1202 07:25:42.396879 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"08c2b56f-dd15-4d5f-844f-fd6571a23f7f","Type":"ContainerStarted","Data":"e20ae7297ab44905464dcc760f79ad1a131f5b4f8e0a816af9387bd200cad29b"} Dec 02 07:25:42 crc kubenswrapper[4895]: I1202 07:25:42.402071 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-q7mhm" Dec 02 07:25:42 crc kubenswrapper[4895]: I1202 07:25:42.402151 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-q7mhm" Dec 02 07:25:42 crc kubenswrapper[4895]: I1202 07:25:42.405027 4895 patch_prober.go:28] interesting pod/console-f9d7485db-q7mhm container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.8:8443/health\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body= Dec 02 07:25:42 crc kubenswrapper[4895]: I1202 07:25:42.405193 4895 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-q7mhm" podUID="ddcbf4b8-5804-4136-8554-6a307825a6ed" containerName="console" probeResult="failure" output="Get \"https://10.217.0.8:8443/health\": dial tcp 10.217.0.8:8443: connect: connection refused" Dec 02 07:25:42 crc kubenswrapper[4895]: I1202 07:25:42.413699 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"1b045dd16ca164b7d6025127768cdc21bd5c1fc2e75f5d1910d60b8a290665d7"} Dec 02 07:25:42 crc kubenswrapper[4895]: I1202 
07:25:42.418200 4895 generic.go:334] "Generic (PLEG): container finished" podID="f4ffa89c-1b2e-4dd4-afa5-f34cd8260364" containerID="cef28fb9c9eb8d7f1912ec40b25023e2a8baeec3dca0bc44c899626bfb86a1a9" exitCode=0 Dec 02 07:25:42 crc kubenswrapper[4895]: I1202 07:25:42.418387 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wjh4k" event={"ID":"f4ffa89c-1b2e-4dd4-afa5-f34cd8260364","Type":"ContainerDied","Data":"cef28fb9c9eb8d7f1912ec40b25023e2a8baeec3dca0bc44c899626bfb86a1a9"} Dec 02 07:25:42 crc kubenswrapper[4895]: I1202 07:25:42.418415 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wjh4k" event={"ID":"f4ffa89c-1b2e-4dd4-afa5-f34cd8260364","Type":"ContainerStarted","Data":"9b66a747cc6eed02c70aae485088aad215a0743f0a1e9d014c2b7880dba8944b"} Dec 02 07:25:42 crc kubenswrapper[4895]: I1202 07:25:42.453133 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"7771b97a0d10f40e308fa5785be014b0d42c49461b32873d502a8f112811653e"} Dec 02 07:25:42 crc kubenswrapper[4895]: I1202 07:25:42.461954 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"0c68b6db0f7b13bf013b10e967ed73dfdd247c90e2a927429c1933225e7e6d9d"} Dec 02 07:25:42 crc kubenswrapper[4895]: I1202 07:25:42.468557 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5c5n7" event={"ID":"459e9e10-320f-47aa-901c-cfd9ec241a67","Type":"ContainerStarted","Data":"46abadcc64129015fd5c7420ee2ec9a3f9250289950d5b5c2c2d6bdc5539373f"} Dec 02 07:25:42 crc kubenswrapper[4895]: I1202 07:25:42.490510 4895 patch_prober.go:28] interesting pod/downloads-7954f5f757-d7hb4 
container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Dec 02 07:25:42 crc kubenswrapper[4895]: I1202 07:25:42.490575 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-d7hb4" podUID="0e8ba2f7-f07b-4532-9620-00662d37f5b9" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Dec 02 07:25:42 crc kubenswrapper[4895]: I1202 07:25:42.490596 4895 patch_prober.go:28] interesting pod/downloads-7954f5f757-d7hb4 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Dec 02 07:25:42 crc kubenswrapper[4895]: I1202 07:25:42.490659 4895 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-d7hb4" podUID="0e8ba2f7-f07b-4532-9620-00662d37f5b9" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Dec 02 07:25:42 crc kubenswrapper[4895]: I1202 07:25:42.575426 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-lf9fx" Dec 02 07:25:42 crc kubenswrapper[4895]: I1202 07:25:42.581016 4895 patch_prober.go:28] interesting pod/router-default-5444994796-lf9fx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 02 07:25:42 crc kubenswrapper[4895]: [-]has-synced failed: reason withheld Dec 02 07:25:42 crc kubenswrapper[4895]: [+]process-running ok Dec 02 07:25:42 crc kubenswrapper[4895]: healthz check failed Dec 02 07:25:42 crc 
kubenswrapper[4895]: I1202 07:25:42.581097 4895 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lf9fx" podUID="02423153-ef82-4284-b703-cf006e0b8b70" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 02 07:25:42 crc kubenswrapper[4895]: I1202 07:25:42.813452 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29410995-cbx2h" Dec 02 07:25:42 crc kubenswrapper[4895]: I1202 07:25:42.918682 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7e8f451a-ac50-4e0a-bf8e-e6d505305177-secret-volume\") pod \"7e8f451a-ac50-4e0a-bf8e-e6d505305177\" (UID: \"7e8f451a-ac50-4e0a-bf8e-e6d505305177\") " Dec 02 07:25:42 crc kubenswrapper[4895]: I1202 07:25:42.918764 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tgr4r\" (UniqueName: \"kubernetes.io/projected/7e8f451a-ac50-4e0a-bf8e-e6d505305177-kube-api-access-tgr4r\") pod \"7e8f451a-ac50-4e0a-bf8e-e6d505305177\" (UID: \"7e8f451a-ac50-4e0a-bf8e-e6d505305177\") " Dec 02 07:25:42 crc kubenswrapper[4895]: I1202 07:25:42.918795 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7e8f451a-ac50-4e0a-bf8e-e6d505305177-config-volume\") pod \"7e8f451a-ac50-4e0a-bf8e-e6d505305177\" (UID: \"7e8f451a-ac50-4e0a-bf8e-e6d505305177\") " Dec 02 07:25:42 crc kubenswrapper[4895]: I1202 07:25:42.920385 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e8f451a-ac50-4e0a-bf8e-e6d505305177-config-volume" (OuterVolumeSpecName: "config-volume") pod "7e8f451a-ac50-4e0a-bf8e-e6d505305177" (UID: "7e8f451a-ac50-4e0a-bf8e-e6d505305177"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:25:42 crc kubenswrapper[4895]: I1202 07:25:42.927214 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e8f451a-ac50-4e0a-bf8e-e6d505305177-kube-api-access-tgr4r" (OuterVolumeSpecName: "kube-api-access-tgr4r") pod "7e8f451a-ac50-4e0a-bf8e-e6d505305177" (UID: "7e8f451a-ac50-4e0a-bf8e-e6d505305177"). InnerVolumeSpecName "kube-api-access-tgr4r". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:25:42 crc kubenswrapper[4895]: I1202 07:25:42.927765 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e8f451a-ac50-4e0a-bf8e-e6d505305177-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "7e8f451a-ac50-4e0a-bf8e-e6d505305177" (UID: "7e8f451a-ac50-4e0a-bf8e-e6d505305177"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:25:42 crc kubenswrapper[4895]: I1202 07:25:42.992073 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wdmt6" Dec 02 07:25:42 crc kubenswrapper[4895]: I1202 07:25:42.992128 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wdmt6" Dec 02 07:25:43 crc kubenswrapper[4895]: I1202 07:25:43.001221 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wdmt6" Dec 02 07:25:43 crc kubenswrapper[4895]: I1202 07:25:43.020436 4895 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7e8f451a-ac50-4e0a-bf8e-e6d505305177-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 02 07:25:43 crc kubenswrapper[4895]: I1202 07:25:43.020529 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tgr4r\" (UniqueName: 
\"kubernetes.io/projected/7e8f451a-ac50-4e0a-bf8e-e6d505305177-kube-api-access-tgr4r\") on node \"crc\" DevicePath \"\"" Dec 02 07:25:43 crc kubenswrapper[4895]: I1202 07:25:43.020540 4895 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7e8f451a-ac50-4e0a-bf8e-e6d505305177-config-volume\") on node \"crc\" DevicePath \"\"" Dec 02 07:25:43 crc kubenswrapper[4895]: I1202 07:25:43.480307 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"b816373f3c208c458c5c6a98550655db4ac2e2a640d2b0abc0f83a0e41c37882"} Dec 02 07:25:43 crc kubenswrapper[4895]: I1202 07:25:43.481631 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:25:43 crc kubenswrapper[4895]: I1202 07:25:43.491642 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29410995-cbx2h" event={"ID":"7e8f451a-ac50-4e0a-bf8e-e6d505305177","Type":"ContainerDied","Data":"b75ca3f65e1f93151bf9f1190e69a8983ea1616225912abfc7a8367ab5913126"} Dec 02 07:25:43 crc kubenswrapper[4895]: I1202 07:25:43.491698 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b75ca3f65e1f93151bf9f1190e69a8983ea1616225912abfc7a8367ab5913126" Dec 02 07:25:43 crc kubenswrapper[4895]: I1202 07:25:43.492483 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29410995-cbx2h" Dec 02 07:25:43 crc kubenswrapper[4895]: I1202 07:25:43.514954 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"388c4c1e644a8cb6189d78a3cdf0b75b94c8d13dd2bfcb4c205f70fd9af4a03b"} Dec 02 07:25:43 crc kubenswrapper[4895]: I1202 07:25:43.549689 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"427afba3a34d4cadfb9af6d7449f93ba4ac61bb8c07929f22330e949c1c34430"} Dec 02 07:25:43 crc kubenswrapper[4895]: I1202 07:25:43.554560 4895 generic.go:334] "Generic (PLEG): container finished" podID="459e9e10-320f-47aa-901c-cfd9ec241a67" containerID="26d9c0eb7234231a93f0a3f41e16f51691fa3ee433a4f6491e0a58ca39f8f39e" exitCode=0 Dec 02 07:25:43 crc kubenswrapper[4895]: I1202 07:25:43.554657 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5c5n7" event={"ID":"459e9e10-320f-47aa-901c-cfd9ec241a67","Type":"ContainerDied","Data":"26d9c0eb7234231a93f0a3f41e16f51691fa3ee433a4f6491e0a58ca39f8f39e"} Dec 02 07:25:43 crc kubenswrapper[4895]: I1202 07:25:43.570674 4895 generic.go:334] "Generic (PLEG): container finished" podID="08c2b56f-dd15-4d5f-844f-fd6571a23f7f" containerID="e20ae7297ab44905464dcc760f79ad1a131f5b4f8e0a816af9387bd200cad29b" exitCode=0 Dec 02 07:25:43 crc kubenswrapper[4895]: I1202 07:25:43.572317 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"08c2b56f-dd15-4d5f-844f-fd6571a23f7f","Type":"ContainerDied","Data":"e20ae7297ab44905464dcc760f79ad1a131f5b4f8e0a816af9387bd200cad29b"} Dec 02 07:25:43 crc kubenswrapper[4895]: I1202 
07:25:43.577976 4895 patch_prober.go:28] interesting pod/router-default-5444994796-lf9fx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 02 07:25:43 crc kubenswrapper[4895]: [-]has-synced failed: reason withheld Dec 02 07:25:43 crc kubenswrapper[4895]: [+]process-running ok Dec 02 07:25:43 crc kubenswrapper[4895]: healthz check failed Dec 02 07:25:43 crc kubenswrapper[4895]: I1202 07:25:43.578455 4895 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lf9fx" podUID="02423153-ef82-4284-b703-cf006e0b8b70" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 02 07:25:43 crc kubenswrapper[4895]: I1202 07:25:43.580227 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wdmt6" Dec 02 07:25:44 crc kubenswrapper[4895]: I1202 07:25:44.097358 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 02 07:25:44 crc kubenswrapper[4895]: I1202 07:25:44.188320 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/08c2b56f-dd15-4d5f-844f-fd6571a23f7f-kube-api-access\") pod \"08c2b56f-dd15-4d5f-844f-fd6571a23f7f\" (UID: \"08c2b56f-dd15-4d5f-844f-fd6571a23f7f\") " Dec 02 07:25:44 crc kubenswrapper[4895]: I1202 07:25:44.188383 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/08c2b56f-dd15-4d5f-844f-fd6571a23f7f-kubelet-dir\") pod \"08c2b56f-dd15-4d5f-844f-fd6571a23f7f\" (UID: \"08c2b56f-dd15-4d5f-844f-fd6571a23f7f\") " Dec 02 07:25:44 crc kubenswrapper[4895]: I1202 07:25:44.188578 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/08c2b56f-dd15-4d5f-844f-fd6571a23f7f-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "08c2b56f-dd15-4d5f-844f-fd6571a23f7f" (UID: "08c2b56f-dd15-4d5f-844f-fd6571a23f7f"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 07:25:44 crc kubenswrapper[4895]: I1202 07:25:44.188857 4895 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/08c2b56f-dd15-4d5f-844f-fd6571a23f7f-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 02 07:25:44 crc kubenswrapper[4895]: I1202 07:25:44.198961 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08c2b56f-dd15-4d5f-844f-fd6571a23f7f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "08c2b56f-dd15-4d5f-844f-fd6571a23f7f" (UID: "08c2b56f-dd15-4d5f-844f-fd6571a23f7f"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:25:44 crc kubenswrapper[4895]: I1202 07:25:44.291465 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/08c2b56f-dd15-4d5f-844f-fd6571a23f7f-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 02 07:25:44 crc kubenswrapper[4895]: I1202 07:25:44.587330 4895 patch_prober.go:28] interesting pod/router-default-5444994796-lf9fx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 02 07:25:44 crc kubenswrapper[4895]: [-]has-synced failed: reason withheld Dec 02 07:25:44 crc kubenswrapper[4895]: [+]process-running ok Dec 02 07:25:44 crc kubenswrapper[4895]: healthz check failed Dec 02 07:25:44 crc kubenswrapper[4895]: I1202 07:25:44.587547 4895 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lf9fx" podUID="02423153-ef82-4284-b703-cf006e0b8b70" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 02 07:25:44 crc kubenswrapper[4895]: I1202 07:25:44.617912 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 02 07:25:44 crc kubenswrapper[4895]: I1202 07:25:44.617967 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"08c2b56f-dd15-4d5f-844f-fd6571a23f7f","Type":"ContainerDied","Data":"93ea1a5842a89b7f6f675e65b1e8dbd2812bf7607a6a52aff8fa238245297270"} Dec 02 07:25:44 crc kubenswrapper[4895]: I1202 07:25:44.617996 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="93ea1a5842a89b7f6f675e65b1e8dbd2812bf7607a6a52aff8fa238245297270" Dec 02 07:25:45 crc kubenswrapper[4895]: I1202 07:25:45.576999 4895 patch_prober.go:28] interesting pod/router-default-5444994796-lf9fx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 02 07:25:45 crc kubenswrapper[4895]: [-]has-synced failed: reason withheld Dec 02 07:25:45 crc kubenswrapper[4895]: [+]process-running ok Dec 02 07:25:45 crc kubenswrapper[4895]: healthz check failed Dec 02 07:25:45 crc kubenswrapper[4895]: I1202 07:25:45.577683 4895 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lf9fx" podUID="02423153-ef82-4284-b703-cf006e0b8b70" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 02 07:25:45 crc kubenswrapper[4895]: I1202 07:25:45.979810 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 02 07:25:45 crc kubenswrapper[4895]: E1202 07:25:45.980079 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e8f451a-ac50-4e0a-bf8e-e6d505305177" containerName="collect-profiles" Dec 02 07:25:45 crc kubenswrapper[4895]: I1202 07:25:45.980146 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e8f451a-ac50-4e0a-bf8e-e6d505305177" 
containerName="collect-profiles" Dec 02 07:25:45 crc kubenswrapper[4895]: E1202 07:25:45.980158 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08c2b56f-dd15-4d5f-844f-fd6571a23f7f" containerName="pruner" Dec 02 07:25:45 crc kubenswrapper[4895]: I1202 07:25:45.980164 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="08c2b56f-dd15-4d5f-844f-fd6571a23f7f" containerName="pruner" Dec 02 07:25:45 crc kubenswrapper[4895]: I1202 07:25:45.980549 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e8f451a-ac50-4e0a-bf8e-e6d505305177" containerName="collect-profiles" Dec 02 07:25:45 crc kubenswrapper[4895]: I1202 07:25:45.980596 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="08c2b56f-dd15-4d5f-844f-fd6571a23f7f" containerName="pruner" Dec 02 07:25:45 crc kubenswrapper[4895]: I1202 07:25:45.981978 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 02 07:25:45 crc kubenswrapper[4895]: I1202 07:25:45.988191 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 02 07:25:45 crc kubenswrapper[4895]: I1202 07:25:45.988625 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 02 07:25:45 crc kubenswrapper[4895]: I1202 07:25:45.989477 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 02 07:25:46 crc kubenswrapper[4895]: I1202 07:25:46.027188 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/18283c7d-73dc-4e88-99ba-af7557b0d317-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"18283c7d-73dc-4e88-99ba-af7557b0d317\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 02 07:25:46 crc kubenswrapper[4895]: I1202 
07:25:46.027372 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/18283c7d-73dc-4e88-99ba-af7557b0d317-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"18283c7d-73dc-4e88-99ba-af7557b0d317\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 02 07:25:46 crc kubenswrapper[4895]: I1202 07:25:46.129191 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/18283c7d-73dc-4e88-99ba-af7557b0d317-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"18283c7d-73dc-4e88-99ba-af7557b0d317\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 02 07:25:46 crc kubenswrapper[4895]: I1202 07:25:46.129388 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/18283c7d-73dc-4e88-99ba-af7557b0d317-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"18283c7d-73dc-4e88-99ba-af7557b0d317\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 02 07:25:46 crc kubenswrapper[4895]: I1202 07:25:46.129538 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/18283c7d-73dc-4e88-99ba-af7557b0d317-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"18283c7d-73dc-4e88-99ba-af7557b0d317\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 02 07:25:46 crc kubenswrapper[4895]: I1202 07:25:46.160347 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/18283c7d-73dc-4e88-99ba-af7557b0d317-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"18283c7d-73dc-4e88-99ba-af7557b0d317\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 02 07:25:46 crc kubenswrapper[4895]: I1202 07:25:46.311578 4895 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 02 07:25:46 crc kubenswrapper[4895]: I1202 07:25:46.583398 4895 patch_prober.go:28] interesting pod/router-default-5444994796-lf9fx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 02 07:25:46 crc kubenswrapper[4895]: [-]has-synced failed: reason withheld Dec 02 07:25:46 crc kubenswrapper[4895]: [+]process-running ok Dec 02 07:25:46 crc kubenswrapper[4895]: healthz check failed Dec 02 07:25:46 crc kubenswrapper[4895]: I1202 07:25:46.583900 4895 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lf9fx" podUID="02423153-ef82-4284-b703-cf006e0b8b70" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 02 07:25:46 crc kubenswrapper[4895]: I1202 07:25:46.959540 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 02 07:25:46 crc kubenswrapper[4895]: W1202 07:25:46.978432 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod18283c7d_73dc_4e88_99ba_af7557b0d317.slice/crio-327ac5d927cd58b2488f244b073db678cb0460bd249b08212b77976f2789ff84 WatchSource:0}: Error finding container 327ac5d927cd58b2488f244b073db678cb0460bd249b08212b77976f2789ff84: Status 404 returned error can't find the container with id 327ac5d927cd58b2488f244b073db678cb0460bd249b08212b77976f2789ff84 Dec 02 07:25:47 crc kubenswrapper[4895]: I1202 07:25:47.575794 4895 patch_prober.go:28] interesting pod/router-default-5444994796-lf9fx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 02 07:25:47 crc kubenswrapper[4895]: [-]has-synced failed: reason withheld Dec 02 
07:25:47 crc kubenswrapper[4895]: [+]process-running ok Dec 02 07:25:47 crc kubenswrapper[4895]: healthz check failed Dec 02 07:25:47 crc kubenswrapper[4895]: I1202 07:25:47.576187 4895 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lf9fx" podUID="02423153-ef82-4284-b703-cf006e0b8b70" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 02 07:25:47 crc kubenswrapper[4895]: I1202 07:25:47.677206 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"18283c7d-73dc-4e88-99ba-af7557b0d317","Type":"ContainerStarted","Data":"327ac5d927cd58b2488f244b073db678cb0460bd249b08212b77976f2789ff84"} Dec 02 07:25:47 crc kubenswrapper[4895]: I1202 07:25:47.718259 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-mzn92" Dec 02 07:25:48 crc kubenswrapper[4895]: I1202 07:25:48.575810 4895 patch_prober.go:28] interesting pod/router-default-5444994796-lf9fx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 02 07:25:48 crc kubenswrapper[4895]: [-]has-synced failed: reason withheld Dec 02 07:25:48 crc kubenswrapper[4895]: [+]process-running ok Dec 02 07:25:48 crc kubenswrapper[4895]: healthz check failed Dec 02 07:25:48 crc kubenswrapper[4895]: I1202 07:25:48.575889 4895 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lf9fx" podUID="02423153-ef82-4284-b703-cf006e0b8b70" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 02 07:25:48 crc kubenswrapper[4895]: I1202 07:25:48.686529 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" 
event={"ID":"18283c7d-73dc-4e88-99ba-af7557b0d317","Type":"ContainerStarted","Data":"d251bd59746e6604c97fb5412c8cf10776c4133e390b6534400e2268290f355a"} Dec 02 07:25:49 crc kubenswrapper[4895]: I1202 07:25:49.575161 4895 patch_prober.go:28] interesting pod/router-default-5444994796-lf9fx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 02 07:25:49 crc kubenswrapper[4895]: [-]has-synced failed: reason withheld Dec 02 07:25:49 crc kubenswrapper[4895]: [+]process-running ok Dec 02 07:25:49 crc kubenswrapper[4895]: healthz check failed Dec 02 07:25:49 crc kubenswrapper[4895]: I1202 07:25:49.575544 4895 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lf9fx" podUID="02423153-ef82-4284-b703-cf006e0b8b70" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 02 07:25:49 crc kubenswrapper[4895]: I1202 07:25:49.703378 4895 generic.go:334] "Generic (PLEG): container finished" podID="18283c7d-73dc-4e88-99ba-af7557b0d317" containerID="d251bd59746e6604c97fb5412c8cf10776c4133e390b6534400e2268290f355a" exitCode=0 Dec 02 07:25:49 crc kubenswrapper[4895]: I1202 07:25:49.703438 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"18283c7d-73dc-4e88-99ba-af7557b0d317","Type":"ContainerDied","Data":"d251bd59746e6604c97fb5412c8cf10776c4133e390b6534400e2268290f355a"} Dec 02 07:25:50 crc kubenswrapper[4895]: I1202 07:25:50.576402 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-lf9fx" Dec 02 07:25:50 crc kubenswrapper[4895]: I1202 07:25:50.580635 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-lf9fx" Dec 02 07:25:52 crc kubenswrapper[4895]: I1202 
07:25:52.492973 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-d7hb4" Dec 02 07:25:52 crc kubenswrapper[4895]: I1202 07:25:52.562221 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-q7mhm" Dec 02 07:25:52 crc kubenswrapper[4895]: I1202 07:25:52.566891 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-q7mhm" Dec 02 07:25:56 crc kubenswrapper[4895]: I1202 07:25:56.312347 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 02 07:25:56 crc kubenswrapper[4895]: I1202 07:25:56.419156 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/18283c7d-73dc-4e88-99ba-af7557b0d317-kubelet-dir\") pod \"18283c7d-73dc-4e88-99ba-af7557b0d317\" (UID: \"18283c7d-73dc-4e88-99ba-af7557b0d317\") " Dec 02 07:25:56 crc kubenswrapper[4895]: I1202 07:25:56.419369 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/18283c7d-73dc-4e88-99ba-af7557b0d317-kube-api-access\") pod \"18283c7d-73dc-4e88-99ba-af7557b0d317\" (UID: \"18283c7d-73dc-4e88-99ba-af7557b0d317\") " Dec 02 07:25:56 crc kubenswrapper[4895]: I1202 07:25:56.419428 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/18283c7d-73dc-4e88-99ba-af7557b0d317-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "18283c7d-73dc-4e88-99ba-af7557b0d317" (UID: "18283c7d-73dc-4e88-99ba-af7557b0d317"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 07:25:56 crc kubenswrapper[4895]: I1202 07:25:56.419977 4895 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/18283c7d-73dc-4e88-99ba-af7557b0d317-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 02 07:25:56 crc kubenswrapper[4895]: I1202 07:25:56.428924 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18283c7d-73dc-4e88-99ba-af7557b0d317-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "18283c7d-73dc-4e88-99ba-af7557b0d317" (UID: "18283c7d-73dc-4e88-99ba-af7557b0d317"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:25:56 crc kubenswrapper[4895]: I1202 07:25:56.521673 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/18283c7d-73dc-4e88-99ba-af7557b0d317-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 02 07:25:56 crc kubenswrapper[4895]: I1202 07:25:56.799582 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"18283c7d-73dc-4e88-99ba-af7557b0d317","Type":"ContainerDied","Data":"327ac5d927cd58b2488f244b073db678cb0460bd249b08212b77976f2789ff84"} Dec 02 07:25:56 crc kubenswrapper[4895]: I1202 07:25:56.799646 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="327ac5d927cd58b2488f244b073db678cb0460bd249b08212b77976f2789ff84" Dec 02 07:25:56 crc kubenswrapper[4895]: I1202 07:25:56.799673 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 02 07:26:00 crc kubenswrapper[4895]: I1202 07:26:00.269010 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-tjqt9" Dec 02 07:26:00 crc kubenswrapper[4895]: I1202 07:26:00.269346 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5af25091-1401-45d4-ae53-d2b469c879da-metrics-certs\") pod \"network-metrics-daemon-5f88v\" (UID: \"5af25091-1401-45d4-ae53-d2b469c879da\") " pod="openshift-multus/network-metrics-daemon-5f88v" Dec 02 07:26:00 crc kubenswrapper[4895]: I1202 07:26:00.289364 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5af25091-1401-45d4-ae53-d2b469c879da-metrics-certs\") pod \"network-metrics-daemon-5f88v\" (UID: \"5af25091-1401-45d4-ae53-d2b469c879da\") " pod="openshift-multus/network-metrics-daemon-5f88v" Dec 02 07:26:00 crc kubenswrapper[4895]: I1202 07:26:00.453645 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-5f88v" Dec 02 07:26:05 crc kubenswrapper[4895]: I1202 07:26:05.473594 4895 patch_prober.go:28] interesting pod/machine-config-daemon-wfcg7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 07:26:05 crc kubenswrapper[4895]: I1202 07:26:05.474183 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 07:26:11 crc kubenswrapper[4895]: I1202 07:26:11.852493 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-5f88v"] Dec 02 07:26:11 crc kubenswrapper[4895]: I1202 07:26:11.894886 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zbpzj" event={"ID":"2ad8c7c3-c1ac-498b-b1b5-76f1db5c1c19","Type":"ContainerStarted","Data":"cf96ed9212414dc047f571e7be05ddafc5457efe1626035683b55e4d140bf970"} Dec 02 07:26:11 crc kubenswrapper[4895]: I1202 07:26:11.897992 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5q5hv" event={"ID":"d0fded14-dfbe-41aa-af93-f68c62a1aca1","Type":"ContainerStarted","Data":"7f393639c2e2db9c602decafa991581e44b8ff62dd2f69ead54029cff296574a"} Dec 02 07:26:11 crc kubenswrapper[4895]: I1202 07:26:11.901399 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5c5n7" event={"ID":"459e9e10-320f-47aa-901c-cfd9ec241a67","Type":"ContainerStarted","Data":"865e21c7f8914590c1a5da8974ff773a758097cd1943c7c238c4618674c218c7"} Dec 02 07:26:11 crc 
kubenswrapper[4895]: I1202 07:26:11.903860 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-899nx" event={"ID":"e6040fed-14e6-49a5-802b-e49bfeba7aa5","Type":"ContainerStarted","Data":"53a7c2e197dcf72ea309318d293d8c2cc646941989a1f0a1d435fb3d45b51133"} Dec 02 07:26:11 crc kubenswrapper[4895]: I1202 07:26:11.906203 4895 generic.go:334] "Generic (PLEG): container finished" podID="873d08e2-d2e6-4785-b807-3b50d758d136" containerID="a1e884e7e9e24f4add4302b865dce946dfece9ae3069db5c85b8eb1c2261fb68" exitCode=0 Dec 02 07:26:11 crc kubenswrapper[4895]: I1202 07:26:11.906392 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l7hjt" event={"ID":"873d08e2-d2e6-4785-b807-3b50d758d136","Type":"ContainerDied","Data":"a1e884e7e9e24f4add4302b865dce946dfece9ae3069db5c85b8eb1c2261fb68"} Dec 02 07:26:11 crc kubenswrapper[4895]: I1202 07:26:11.908791 4895 generic.go:334] "Generic (PLEG): container finished" podID="24eca501-8830-4bc6-8a5e-e00d227e841c" containerID="915f910514528d8352b5c58355a27a4acffed076f57f0009970e551977520168" exitCode=0 Dec 02 07:26:11 crc kubenswrapper[4895]: I1202 07:26:11.908874 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ptmfz" event={"ID":"24eca501-8830-4bc6-8a5e-e00d227e841c","Type":"ContainerDied","Data":"915f910514528d8352b5c58355a27a4acffed076f57f0009970e551977520168"} Dec 02 07:26:11 crc kubenswrapper[4895]: I1202 07:26:11.913054 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wjh4k" event={"ID":"f4ffa89c-1b2e-4dd4-afa5-f34cd8260364","Type":"ContainerStarted","Data":"eeee1a60813e592a855df4543ff5bc9c06f8654ab5bbc449702a7a6a46bc63aa"} Dec 02 07:26:11 crc kubenswrapper[4895]: I1202 07:26:11.917017 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qsldg" 
event={"ID":"c04158de-6693-44c6-82d3-198e545eccfb","Type":"ContainerStarted","Data":"0d070e440f060c89ce0b25a1b510e18419c14f0fc77669a518fbb61c934ca72b"} Dec 02 07:26:11 crc kubenswrapper[4895]: W1202 07:26:11.978533 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5af25091_1401_45d4_ae53_d2b469c879da.slice/crio-5ab0eb87a2b1fb11397684604fb7101fcb17aa839edac88f29b8a89eb65f0aef WatchSource:0}: Error finding container 5ab0eb87a2b1fb11397684604fb7101fcb17aa839edac88f29b8a89eb65f0aef: Status 404 returned error can't find the container with id 5ab0eb87a2b1fb11397684604fb7101fcb17aa839edac88f29b8a89eb65f0aef Dec 02 07:26:12 crc kubenswrapper[4895]: I1202 07:26:12.927549 4895 generic.go:334] "Generic (PLEG): container finished" podID="f4ffa89c-1b2e-4dd4-afa5-f34cd8260364" containerID="eeee1a60813e592a855df4543ff5bc9c06f8654ab5bbc449702a7a6a46bc63aa" exitCode=0 Dec 02 07:26:12 crc kubenswrapper[4895]: I1202 07:26:12.928007 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wjh4k" event={"ID":"f4ffa89c-1b2e-4dd4-afa5-f34cd8260364","Type":"ContainerDied","Data":"eeee1a60813e592a855df4543ff5bc9c06f8654ab5bbc449702a7a6a46bc63aa"} Dec 02 07:26:12 crc kubenswrapper[4895]: I1202 07:26:12.931101 4895 generic.go:334] "Generic (PLEG): container finished" podID="c04158de-6693-44c6-82d3-198e545eccfb" containerID="0d070e440f060c89ce0b25a1b510e18419c14f0fc77669a518fbb61c934ca72b" exitCode=0 Dec 02 07:26:12 crc kubenswrapper[4895]: I1202 07:26:12.931169 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qsldg" event={"ID":"c04158de-6693-44c6-82d3-198e545eccfb","Type":"ContainerDied","Data":"0d070e440f060c89ce0b25a1b510e18419c14f0fc77669a518fbb61c934ca72b"} Dec 02 07:26:12 crc kubenswrapper[4895]: I1202 07:26:12.935116 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/network-metrics-daemon-5f88v" event={"ID":"5af25091-1401-45d4-ae53-d2b469c879da","Type":"ContainerStarted","Data":"fbd55f2288d385d22badcde3b8c78aaf134b3f11e189fa94b39593372591bfd8"} Dec 02 07:26:12 crc kubenswrapper[4895]: I1202 07:26:12.935172 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-5f88v" event={"ID":"5af25091-1401-45d4-ae53-d2b469c879da","Type":"ContainerStarted","Data":"5ab0eb87a2b1fb11397684604fb7101fcb17aa839edac88f29b8a89eb65f0aef"} Dec 02 07:26:12 crc kubenswrapper[4895]: I1202 07:26:12.939302 4895 generic.go:334] "Generic (PLEG): container finished" podID="2ad8c7c3-c1ac-498b-b1b5-76f1db5c1c19" containerID="cf96ed9212414dc047f571e7be05ddafc5457efe1626035683b55e4d140bf970" exitCode=0 Dec 02 07:26:12 crc kubenswrapper[4895]: I1202 07:26:12.939375 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zbpzj" event={"ID":"2ad8c7c3-c1ac-498b-b1b5-76f1db5c1c19","Type":"ContainerDied","Data":"cf96ed9212414dc047f571e7be05ddafc5457efe1626035683b55e4d140bf970"} Dec 02 07:26:12 crc kubenswrapper[4895]: I1202 07:26:12.947089 4895 generic.go:334] "Generic (PLEG): container finished" podID="d0fded14-dfbe-41aa-af93-f68c62a1aca1" containerID="7f393639c2e2db9c602decafa991581e44b8ff62dd2f69ead54029cff296574a" exitCode=0 Dec 02 07:26:12 crc kubenswrapper[4895]: I1202 07:26:12.954818 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5q5hv" event={"ID":"d0fded14-dfbe-41aa-af93-f68c62a1aca1","Type":"ContainerDied","Data":"7f393639c2e2db9c602decafa991581e44b8ff62dd2f69ead54029cff296574a"} Dec 02 07:26:12 crc kubenswrapper[4895]: I1202 07:26:12.957798 4895 generic.go:334] "Generic (PLEG): container finished" podID="459e9e10-320f-47aa-901c-cfd9ec241a67" containerID="865e21c7f8914590c1a5da8974ff773a758097cd1943c7c238c4618674c218c7" exitCode=0 Dec 02 07:26:12 crc kubenswrapper[4895]: I1202 07:26:12.957904 
4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5c5n7" event={"ID":"459e9e10-320f-47aa-901c-cfd9ec241a67","Type":"ContainerDied","Data":"865e21c7f8914590c1a5da8974ff773a758097cd1943c7c238c4618674c218c7"} Dec 02 07:26:12 crc kubenswrapper[4895]: I1202 07:26:12.979007 4895 generic.go:334] "Generic (PLEG): container finished" podID="e6040fed-14e6-49a5-802b-e49bfeba7aa5" containerID="53a7c2e197dcf72ea309318d293d8c2cc646941989a1f0a1d435fb3d45b51133" exitCode=0 Dec 02 07:26:12 crc kubenswrapper[4895]: I1202 07:26:12.979087 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-899nx" event={"ID":"e6040fed-14e6-49a5-802b-e49bfeba7aa5","Type":"ContainerDied","Data":"53a7c2e197dcf72ea309318d293d8c2cc646941989a1f0a1d435fb3d45b51133"} Dec 02 07:26:13 crc kubenswrapper[4895]: I1202 07:26:13.004817 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-qrngv" Dec 02 07:26:13 crc kubenswrapper[4895]: I1202 07:26:13.986768 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-5f88v" event={"ID":"5af25091-1401-45d4-ae53-d2b469c879da","Type":"ContainerStarted","Data":"acc37751e6ef31a03cbd33a041db57c516126980d090d8f24b19a06166761bdf"} Dec 02 07:26:14 crc kubenswrapper[4895]: I1202 07:26:14.008548 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-5f88v" podStartSLOduration=157.008466814 podStartE2EDuration="2m37.008466814s" podCreationTimestamp="2025-12-02 07:23:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:26:14.00300178 +0000 UTC m=+185.173861393" watchObservedRunningTime="2025-12-02 07:26:14.008466814 +0000 UTC m=+185.179326427" Dec 02 07:26:15 crc kubenswrapper[4895]: I1202 
07:26:15.999457 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ptmfz" event={"ID":"24eca501-8830-4bc6-8a5e-e00d227e841c","Type":"ContainerStarted","Data":"628851b9b305622aac4e980541e5a271ee009414ae0227ef0a5ecc9cc3b60eb1"} Dec 02 07:26:16 crc kubenswrapper[4895]: I1202 07:26:16.019650 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-ptmfz" podStartSLOduration=3.368804265 podStartE2EDuration="37.019628119s" podCreationTimestamp="2025-12-02 07:25:39 +0000 UTC" firstStartedPulling="2025-12-02 07:25:41.330482278 +0000 UTC m=+152.501341891" lastFinishedPulling="2025-12-02 07:26:14.981306132 +0000 UTC m=+186.152165745" observedRunningTime="2025-12-02 07:26:16.015687487 +0000 UTC m=+187.186547100" watchObservedRunningTime="2025-12-02 07:26:16.019628119 +0000 UTC m=+187.190487732" Dec 02 07:26:17 crc kubenswrapper[4895]: I1202 07:26:17.010274 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l7hjt" event={"ID":"873d08e2-d2e6-4785-b807-3b50d758d136","Type":"ContainerStarted","Data":"e9c63391b740e861176938282076694c6522d3dd8db1a91cc36181a04a114a9a"} Dec 02 07:26:18 crc kubenswrapper[4895]: I1202 07:26:18.018016 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5q5hv" event={"ID":"d0fded14-dfbe-41aa-af93-f68c62a1aca1","Type":"ContainerStarted","Data":"60787ac5efa236959c3cc0c8a4a1fa0d521e29fe1c5864aed373319ee7584094"} Dec 02 07:26:18 crc kubenswrapper[4895]: I1202 07:26:18.023793 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wjh4k" event={"ID":"f4ffa89c-1b2e-4dd4-afa5-f34cd8260364","Type":"ContainerStarted","Data":"15907fae904024bdd546782dd70a88d8ecd4cc0236cd2411fabeff8b259cba68"} Dec 02 07:26:18 crc kubenswrapper[4895]: I1202 07:26:18.026635 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-zbpzj" event={"ID":"2ad8c7c3-c1ac-498b-b1b5-76f1db5c1c19","Type":"ContainerStarted","Data":"13b77a9b2065f78c86563dc137d56ac1d758ce7264367e8bfbeec2badeb6cf2a"} Dec 02 07:26:18 crc kubenswrapper[4895]: I1202 07:26:18.040860 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-l7hjt" podStartSLOduration=4.716157798 podStartE2EDuration="39.040837405s" podCreationTimestamp="2025-12-02 07:25:39 +0000 UTC" firstStartedPulling="2025-12-02 07:25:41.299036404 +0000 UTC m=+152.469896017" lastFinishedPulling="2025-12-02 07:26:15.623716021 +0000 UTC m=+186.794575624" observedRunningTime="2025-12-02 07:26:17.034592579 +0000 UTC m=+188.205452192" watchObservedRunningTime="2025-12-02 07:26:18.040837405 +0000 UTC m=+189.211697018" Dec 02 07:26:18 crc kubenswrapper[4895]: I1202 07:26:18.042835 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-5q5hv" podStartSLOduration=3.936421322 podStartE2EDuration="41.042825081s" podCreationTimestamp="2025-12-02 07:25:37 +0000 UTC" firstStartedPulling="2025-12-02 07:25:40.072665577 +0000 UTC m=+151.243525190" lastFinishedPulling="2025-12-02 07:26:17.179069336 +0000 UTC m=+188.349928949" observedRunningTime="2025-12-02 07:26:18.03961006 +0000 UTC m=+189.210469713" watchObservedRunningTime="2025-12-02 07:26:18.042825081 +0000 UTC m=+189.213684694" Dec 02 07:26:18 crc kubenswrapper[4895]: I1202 07:26:18.060673 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-zbpzj" Dec 02 07:26:18 crc kubenswrapper[4895]: I1202 07:26:18.061129 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-zbpzj" Dec 02 07:26:18 crc kubenswrapper[4895]: I1202 07:26:18.066392 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/redhat-operators-wjh4k" podStartSLOduration=3.446171317 podStartE2EDuration="38.066368092s" podCreationTimestamp="2025-12-02 07:25:40 +0000 UTC" firstStartedPulling="2025-12-02 07:25:42.474831581 +0000 UTC m=+153.645691194" lastFinishedPulling="2025-12-02 07:26:17.095028356 +0000 UTC m=+188.265887969" observedRunningTime="2025-12-02 07:26:18.062498313 +0000 UTC m=+189.233357936" watchObservedRunningTime="2025-12-02 07:26:18.066368092 +0000 UTC m=+189.237227705" Dec 02 07:26:18 crc kubenswrapper[4895]: I1202 07:26:18.093595 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-zbpzj" podStartSLOduration=4.040088354 podStartE2EDuration="41.093567766s" podCreationTimestamp="2025-12-02 07:25:37 +0000 UTC" firstStartedPulling="2025-12-02 07:25:40.025866393 +0000 UTC m=+151.196726006" lastFinishedPulling="2025-12-02 07:26:17.079345805 +0000 UTC m=+188.250205418" observedRunningTime="2025-12-02 07:26:18.090032606 +0000 UTC m=+189.260892219" watchObservedRunningTime="2025-12-02 07:26:18.093567766 +0000 UTC m=+189.264427389" Dec 02 07:26:18 crc kubenswrapper[4895]: I1202 07:26:18.375772 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 02 07:26:18 crc kubenswrapper[4895]: E1202 07:26:18.376044 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18283c7d-73dc-4e88-99ba-af7557b0d317" containerName="pruner" Dec 02 07:26:18 crc kubenswrapper[4895]: I1202 07:26:18.376059 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="18283c7d-73dc-4e88-99ba-af7557b0d317" containerName="pruner" Dec 02 07:26:18 crc kubenswrapper[4895]: I1202 07:26:18.376205 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="18283c7d-73dc-4e88-99ba-af7557b0d317" containerName="pruner" Dec 02 07:26:18 crc kubenswrapper[4895]: I1202 07:26:18.376613 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 02 07:26:18 crc kubenswrapper[4895]: I1202 07:26:18.378474 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 02 07:26:18 crc kubenswrapper[4895]: I1202 07:26:18.379513 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 02 07:26:18 crc kubenswrapper[4895]: I1202 07:26:18.385776 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 02 07:26:18 crc kubenswrapper[4895]: I1202 07:26:18.458980 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d7771c5f-b225-4f42-91c1-f99e07aac262-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"d7771c5f-b225-4f42-91c1-f99e07aac262\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 02 07:26:18 crc kubenswrapper[4895]: I1202 07:26:18.459052 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d7771c5f-b225-4f42-91c1-f99e07aac262-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"d7771c5f-b225-4f42-91c1-f99e07aac262\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 02 07:26:18 crc kubenswrapper[4895]: I1202 07:26:18.560058 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d7771c5f-b225-4f42-91c1-f99e07aac262-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"d7771c5f-b225-4f42-91c1-f99e07aac262\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 02 07:26:18 crc kubenswrapper[4895]: I1202 07:26:18.560142 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/d7771c5f-b225-4f42-91c1-f99e07aac262-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"d7771c5f-b225-4f42-91c1-f99e07aac262\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 02 07:26:18 crc kubenswrapper[4895]: I1202 07:26:18.560307 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d7771c5f-b225-4f42-91c1-f99e07aac262-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"d7771c5f-b225-4f42-91c1-f99e07aac262\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 02 07:26:18 crc kubenswrapper[4895]: I1202 07:26:18.722079 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d7771c5f-b225-4f42-91c1-f99e07aac262-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"d7771c5f-b225-4f42-91c1-f99e07aac262\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 02 07:26:18 crc kubenswrapper[4895]: I1202 07:26:18.994913 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 02 07:26:19 crc kubenswrapper[4895]: I1202 07:26:19.106645 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-899nx" podStartSLOduration=3.456175976 podStartE2EDuration="42.106623363s" podCreationTimestamp="2025-12-02 07:25:37 +0000 UTC" firstStartedPulling="2025-12-02 07:25:40.23407475 +0000 UTC m=+151.404934363" lastFinishedPulling="2025-12-02 07:26:18.884522137 +0000 UTC m=+190.055381750" observedRunningTime="2025-12-02 07:26:19.103174877 +0000 UTC m=+190.274034490" watchObservedRunningTime="2025-12-02 07:26:19.106623363 +0000 UTC m=+190.277482976" Dec 02 07:26:19 crc kubenswrapper[4895]: I1202 07:26:19.129418 4895 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-zbpzj" podUID="2ad8c7c3-c1ac-498b-b1b5-76f1db5c1c19" containerName="registry-server" probeResult="failure" output=< Dec 02 07:26:19 crc kubenswrapper[4895]: timeout: failed to connect service ":50051" within 1s Dec 02 07:26:19 crc kubenswrapper[4895]: > Dec 02 07:26:19 crc kubenswrapper[4895]: I1202 07:26:19.529589 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 02 07:26:20 crc kubenswrapper[4895]: I1202 07:26:20.002649 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-ptmfz" Dec 02 07:26:20 crc kubenswrapper[4895]: I1202 07:26:20.003065 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-ptmfz" Dec 02 07:26:20 crc kubenswrapper[4895]: I1202 07:26:20.061119 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-ptmfz" Dec 02 07:26:20 crc kubenswrapper[4895]: I1202 07:26:20.090087 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"d7771c5f-b225-4f42-91c1-f99e07aac262","Type":"ContainerStarted","Data":"506c6940840905742c9a0dd61d131760e2d3f9162aa85a56049f99fa8457c638"} Dec 02 07:26:20 crc kubenswrapper[4895]: I1202 07:26:20.090145 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"d7771c5f-b225-4f42-91c1-f99e07aac262","Type":"ContainerStarted","Data":"04340c62efa29d6beab41e39409df9844e7a1d1e2d54835a38902a11a6146adc"} Dec 02 07:26:20 crc kubenswrapper[4895]: I1202 07:26:20.098174 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5c5n7" event={"ID":"459e9e10-320f-47aa-901c-cfd9ec241a67","Type":"ContainerStarted","Data":"cc63c619fcd7e26d5fcbe45974ac2bf10caa99e494be1c18dd34368cab1318ff"} Dec 02 07:26:20 crc kubenswrapper[4895]: I1202 07:26:20.104208 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-899nx" event={"ID":"e6040fed-14e6-49a5-802b-e49bfeba7aa5","Type":"ContainerStarted","Data":"55f37d0d9fb5125f472da6794ee4a354bd39cd98bc6a207f1d0c2e5f46e8a927"} Dec 02 07:26:20 crc kubenswrapper[4895]: I1202 07:26:20.107929 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qsldg" event={"ID":"c04158de-6693-44c6-82d3-198e545eccfb","Type":"ContainerStarted","Data":"50994102f6100ea2c54c8fa39da0b73b0724686c597db3228c753dfaf01062f0"} Dec 02 07:26:20 crc kubenswrapper[4895]: I1202 07:26:20.121041 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=2.121018138 podStartE2EDuration="2.121018138s" podCreationTimestamp="2025-12-02 07:26:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:26:20.119282909 +0000 UTC m=+191.290142522" 
watchObservedRunningTime="2025-12-02 07:26:20.121018138 +0000 UTC m=+191.291877751" Dec 02 07:26:20 crc kubenswrapper[4895]: I1202 07:26:20.146688 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-5c5n7" podStartSLOduration=3.729162344 podStartE2EDuration="40.146663288s" podCreationTimestamp="2025-12-02 07:25:40 +0000 UTC" firstStartedPulling="2025-12-02 07:25:42.475057988 +0000 UTC m=+153.645917601" lastFinishedPulling="2025-12-02 07:26:18.892558932 +0000 UTC m=+190.063418545" observedRunningTime="2025-12-02 07:26:20.141846963 +0000 UTC m=+191.312706596" watchObservedRunningTime="2025-12-02 07:26:20.146663288 +0000 UTC m=+191.317522901" Dec 02 07:26:20 crc kubenswrapper[4895]: I1202 07:26:20.154607 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-ptmfz" Dec 02 07:26:20 crc kubenswrapper[4895]: I1202 07:26:20.164795 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-qsldg" podStartSLOduration=4.453705477 podStartE2EDuration="43.164778806s" podCreationTimestamp="2025-12-02 07:25:37 +0000 UTC" firstStartedPulling="2025-12-02 07:25:40.255773079 +0000 UTC m=+151.426632692" lastFinishedPulling="2025-12-02 07:26:18.966846408 +0000 UTC m=+190.137706021" observedRunningTime="2025-12-02 07:26:20.159816487 +0000 UTC m=+191.330676110" watchObservedRunningTime="2025-12-02 07:26:20.164778806 +0000 UTC m=+191.335638419" Dec 02 07:26:20 crc kubenswrapper[4895]: I1202 07:26:20.405121 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-l7hjt" Dec 02 07:26:20 crc kubenswrapper[4895]: I1202 07:26:20.405191 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-l7hjt" Dec 02 07:26:20 crc kubenswrapper[4895]: I1202 07:26:20.457759 4895 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-l7hjt" Dec 02 07:26:20 crc kubenswrapper[4895]: I1202 07:26:20.857546 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-nv29v"] Dec 02 07:26:20 crc kubenswrapper[4895]: I1202 07:26:20.933924 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-wjh4k" Dec 02 07:26:20 crc kubenswrapper[4895]: I1202 07:26:20.933993 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-wjh4k" Dec 02 07:26:21 crc kubenswrapper[4895]: I1202 07:26:21.115563 4895 generic.go:334] "Generic (PLEG): container finished" podID="d7771c5f-b225-4f42-91c1-f99e07aac262" containerID="506c6940840905742c9a0dd61d131760e2d3f9162aa85a56049f99fa8457c638" exitCode=0 Dec 02 07:26:21 crc kubenswrapper[4895]: I1202 07:26:21.115657 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"d7771c5f-b225-4f42-91c1-f99e07aac262","Type":"ContainerDied","Data":"506c6940840905742c9a0dd61d131760e2d3f9162aa85a56049f99fa8457c638"} Dec 02 07:26:21 crc kubenswrapper[4895]: I1202 07:26:21.181361 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-l7hjt" Dec 02 07:26:21 crc kubenswrapper[4895]: I1202 07:26:21.304452 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-5c5n7" Dec 02 07:26:21 crc kubenswrapper[4895]: I1202 07:26:21.304532 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-5c5n7" Dec 02 07:26:21 crc kubenswrapper[4895]: I1202 07:26:21.376370 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" 
Dec 02 07:26:22 crc kubenswrapper[4895]: I1202 07:26:22.010791 4895 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-wjh4k" podUID="f4ffa89c-1b2e-4dd4-afa5-f34cd8260364" containerName="registry-server" probeResult="failure" output=< Dec 02 07:26:22 crc kubenswrapper[4895]: timeout: failed to connect service ":50051" within 1s Dec 02 07:26:22 crc kubenswrapper[4895]: > Dec 02 07:26:22 crc kubenswrapper[4895]: I1202 07:26:22.342132 4895 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-5c5n7" podUID="459e9e10-320f-47aa-901c-cfd9ec241a67" containerName="registry-server" probeResult="failure" output=< Dec 02 07:26:22 crc kubenswrapper[4895]: timeout: failed to connect service ":50051" within 1s Dec 02 07:26:22 crc kubenswrapper[4895]: > Dec 02 07:26:22 crc kubenswrapper[4895]: I1202 07:26:22.376427 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 02 07:26:22 crc kubenswrapper[4895]: I1202 07:26:22.523655 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d7771c5f-b225-4f42-91c1-f99e07aac262-kube-api-access\") pod \"d7771c5f-b225-4f42-91c1-f99e07aac262\" (UID: \"d7771c5f-b225-4f42-91c1-f99e07aac262\") " Dec 02 07:26:22 crc kubenswrapper[4895]: I1202 07:26:22.523706 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d7771c5f-b225-4f42-91c1-f99e07aac262-kubelet-dir\") pod \"d7771c5f-b225-4f42-91c1-f99e07aac262\" (UID: \"d7771c5f-b225-4f42-91c1-f99e07aac262\") " Dec 02 07:26:22 crc kubenswrapper[4895]: I1202 07:26:22.523846 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d7771c5f-b225-4f42-91c1-f99e07aac262-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod 
"d7771c5f-b225-4f42-91c1-f99e07aac262" (UID: "d7771c5f-b225-4f42-91c1-f99e07aac262"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 07:26:22 crc kubenswrapper[4895]: I1202 07:26:22.524035 4895 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d7771c5f-b225-4f42-91c1-f99e07aac262-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 02 07:26:22 crc kubenswrapper[4895]: I1202 07:26:22.530533 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7771c5f-b225-4f42-91c1-f99e07aac262-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "d7771c5f-b225-4f42-91c1-f99e07aac262" (UID: "d7771c5f-b225-4f42-91c1-f99e07aac262"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:26:22 crc kubenswrapper[4895]: I1202 07:26:22.625692 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d7771c5f-b225-4f42-91c1-f99e07aac262-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 02 07:26:23 crc kubenswrapper[4895]: I1202 07:26:23.130715 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"d7771c5f-b225-4f42-91c1-f99e07aac262","Type":"ContainerDied","Data":"04340c62efa29d6beab41e39409df9844e7a1d1e2d54835a38902a11a6146adc"} Dec 02 07:26:23 crc kubenswrapper[4895]: I1202 07:26:23.130757 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 02 07:26:23 crc kubenswrapper[4895]: I1202 07:26:23.130780 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="04340c62efa29d6beab41e39409df9844e7a1d1e2d54835a38902a11a6146adc" Dec 02 07:26:24 crc kubenswrapper[4895]: I1202 07:26:24.376640 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-l7hjt"] Dec 02 07:26:24 crc kubenswrapper[4895]: I1202 07:26:24.377491 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-l7hjt" podUID="873d08e2-d2e6-4785-b807-3b50d758d136" containerName="registry-server" containerID="cri-o://e9c63391b740e861176938282076694c6522d3dd8db1a91cc36181a04a114a9a" gracePeriod=2 Dec 02 07:26:25 crc kubenswrapper[4895]: I1202 07:26:25.370844 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 02 07:26:25 crc kubenswrapper[4895]: E1202 07:26:25.371191 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7771c5f-b225-4f42-91c1-f99e07aac262" containerName="pruner" Dec 02 07:26:25 crc kubenswrapper[4895]: I1202 07:26:25.371208 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7771c5f-b225-4f42-91c1-f99e07aac262" containerName="pruner" Dec 02 07:26:25 crc kubenswrapper[4895]: I1202 07:26:25.371336 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7771c5f-b225-4f42-91c1-f99e07aac262" containerName="pruner" Dec 02 07:26:25 crc kubenswrapper[4895]: I1202 07:26:25.371916 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 02 07:26:25 crc kubenswrapper[4895]: I1202 07:26:25.374275 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 02 07:26:25 crc kubenswrapper[4895]: I1202 07:26:25.377603 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 02 07:26:25 crc kubenswrapper[4895]: I1202 07:26:25.382989 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 02 07:26:25 crc kubenswrapper[4895]: I1202 07:26:25.570992 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8b0aeead-bbd7-4ba2-901f-2aa5be9899b3-kube-api-access\") pod \"installer-9-crc\" (UID: \"8b0aeead-bbd7-4ba2-901f-2aa5be9899b3\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 02 07:26:25 crc kubenswrapper[4895]: I1202 07:26:25.571071 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8b0aeead-bbd7-4ba2-901f-2aa5be9899b3-kubelet-dir\") pod \"installer-9-crc\" (UID: \"8b0aeead-bbd7-4ba2-901f-2aa5be9899b3\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 02 07:26:25 crc kubenswrapper[4895]: I1202 07:26:25.571847 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8b0aeead-bbd7-4ba2-901f-2aa5be9899b3-var-lock\") pod \"installer-9-crc\" (UID: \"8b0aeead-bbd7-4ba2-901f-2aa5be9899b3\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 02 07:26:25 crc kubenswrapper[4895]: I1202 07:26:25.673364 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/8b0aeead-bbd7-4ba2-901f-2aa5be9899b3-kube-api-access\") pod \"installer-9-crc\" (UID: \"8b0aeead-bbd7-4ba2-901f-2aa5be9899b3\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 02 07:26:25 crc kubenswrapper[4895]: I1202 07:26:25.673437 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8b0aeead-bbd7-4ba2-901f-2aa5be9899b3-kubelet-dir\") pod \"installer-9-crc\" (UID: \"8b0aeead-bbd7-4ba2-901f-2aa5be9899b3\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 02 07:26:25 crc kubenswrapper[4895]: I1202 07:26:25.673503 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8b0aeead-bbd7-4ba2-901f-2aa5be9899b3-var-lock\") pod \"installer-9-crc\" (UID: \"8b0aeead-bbd7-4ba2-901f-2aa5be9899b3\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 02 07:26:25 crc kubenswrapper[4895]: I1202 07:26:25.673612 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8b0aeead-bbd7-4ba2-901f-2aa5be9899b3-var-lock\") pod \"installer-9-crc\" (UID: \"8b0aeead-bbd7-4ba2-901f-2aa5be9899b3\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 02 07:26:25 crc kubenswrapper[4895]: I1202 07:26:25.673624 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8b0aeead-bbd7-4ba2-901f-2aa5be9899b3-kubelet-dir\") pod \"installer-9-crc\" (UID: \"8b0aeead-bbd7-4ba2-901f-2aa5be9899b3\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 02 07:26:25 crc kubenswrapper[4895]: I1202 07:26:25.699725 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8b0aeead-bbd7-4ba2-901f-2aa5be9899b3-kube-api-access\") pod \"installer-9-crc\" (UID: \"8b0aeead-bbd7-4ba2-901f-2aa5be9899b3\") " 
pod="openshift-kube-apiserver/installer-9-crc" Dec 02 07:26:26 crc kubenswrapper[4895]: I1202 07:26:26.000240 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 02 07:26:26 crc kubenswrapper[4895]: I1202 07:26:26.264895 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 02 07:26:26 crc kubenswrapper[4895]: W1202 07:26:26.271939 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod8b0aeead_bbd7_4ba2_901f_2aa5be9899b3.slice/crio-ff7ec8280457ee73ed6aa7947c5ed0e9bdf337c0d48c2da5a3ef08b6abe2c8bc WatchSource:0}: Error finding container ff7ec8280457ee73ed6aa7947c5ed0e9bdf337c0d48c2da5a3ef08b6abe2c8bc: Status 404 returned error can't find the container with id ff7ec8280457ee73ed6aa7947c5ed0e9bdf337c0d48c2da5a3ef08b6abe2c8bc Dec 02 07:26:27 crc kubenswrapper[4895]: I1202 07:26:27.158922 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"8b0aeead-bbd7-4ba2-901f-2aa5be9899b3","Type":"ContainerStarted","Data":"ff7ec8280457ee73ed6aa7947c5ed0e9bdf337c0d48c2da5a3ef08b6abe2c8bc"} Dec 02 07:26:27 crc kubenswrapper[4895]: I1202 07:26:27.808199 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-5q5hv" Dec 02 07:26:27 crc kubenswrapper[4895]: I1202 07:26:27.808608 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-5q5hv" Dec 02 07:26:27 crc kubenswrapper[4895]: I1202 07:26:27.872162 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-5q5hv" Dec 02 07:26:28 crc kubenswrapper[4895]: I1202 07:26:28.122370 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-zbpzj" Dec 02 07:26:28 
crc kubenswrapper[4895]: I1202 07:26:28.164477 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-zbpzj" Dec 02 07:26:28 crc kubenswrapper[4895]: I1202 07:26:28.166860 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"8b0aeead-bbd7-4ba2-901f-2aa5be9899b3","Type":"ContainerStarted","Data":"7ee6d11c2a73825aad67b7df1ba6718de0696089e8f3fe76d1f59bc77586cae8"} Dec 02 07:26:28 crc kubenswrapper[4895]: I1202 07:26:28.169879 4895 generic.go:334] "Generic (PLEG): container finished" podID="873d08e2-d2e6-4785-b807-3b50d758d136" containerID="e9c63391b740e861176938282076694c6522d3dd8db1a91cc36181a04a114a9a" exitCode=0 Dec 02 07:26:28 crc kubenswrapper[4895]: I1202 07:26:28.170037 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l7hjt" event={"ID":"873d08e2-d2e6-4785-b807-3b50d758d136","Type":"ContainerDied","Data":"e9c63391b740e861176938282076694c6522d3dd8db1a91cc36181a04a114a9a"} Dec 02 07:26:28 crc kubenswrapper[4895]: I1202 07:26:28.202826 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-899nx" Dec 02 07:26:28 crc kubenswrapper[4895]: I1202 07:26:28.202928 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-899nx" Dec 02 07:26:28 crc kubenswrapper[4895]: I1202 07:26:28.220807 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=3.220774782 podStartE2EDuration="3.220774782s" podCreationTimestamp="2025-12-02 07:26:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:26:28.213940218 +0000 UTC m=+199.384799911" watchObservedRunningTime="2025-12-02 07:26:28.220774782 +0000 UTC 
m=+199.391634455" Dec 02 07:26:28 crc kubenswrapper[4895]: I1202 07:26:28.224082 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-5q5hv" Dec 02 07:26:28 crc kubenswrapper[4895]: I1202 07:26:28.264778 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-899nx" Dec 02 07:26:28 crc kubenswrapper[4895]: I1202 07:26:28.345880 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-qsldg" Dec 02 07:26:28 crc kubenswrapper[4895]: I1202 07:26:28.346185 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-qsldg" Dec 02 07:26:28 crc kubenswrapper[4895]: I1202 07:26:28.395614 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-qsldg" Dec 02 07:26:28 crc kubenswrapper[4895]: I1202 07:26:28.573407 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l7hjt" Dec 02 07:26:28 crc kubenswrapper[4895]: I1202 07:26:28.725081 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/873d08e2-d2e6-4785-b807-3b50d758d136-utilities\") pod \"873d08e2-d2e6-4785-b807-3b50d758d136\" (UID: \"873d08e2-d2e6-4785-b807-3b50d758d136\") " Dec 02 07:26:28 crc kubenswrapper[4895]: I1202 07:26:28.725387 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/873d08e2-d2e6-4785-b807-3b50d758d136-catalog-content\") pod \"873d08e2-d2e6-4785-b807-3b50d758d136\" (UID: \"873d08e2-d2e6-4785-b807-3b50d758d136\") " Dec 02 07:26:28 crc kubenswrapper[4895]: I1202 07:26:28.726090 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/873d08e2-d2e6-4785-b807-3b50d758d136-utilities" (OuterVolumeSpecName: "utilities") pod "873d08e2-d2e6-4785-b807-3b50d758d136" (UID: "873d08e2-d2e6-4785-b807-3b50d758d136"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:26:28 crc kubenswrapper[4895]: I1202 07:26:28.728954 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4gwr\" (UniqueName: \"kubernetes.io/projected/873d08e2-d2e6-4785-b807-3b50d758d136-kube-api-access-x4gwr\") pod \"873d08e2-d2e6-4785-b807-3b50d758d136\" (UID: \"873d08e2-d2e6-4785-b807-3b50d758d136\") " Dec 02 07:26:28 crc kubenswrapper[4895]: I1202 07:26:28.729657 4895 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/873d08e2-d2e6-4785-b807-3b50d758d136-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 07:26:28 crc kubenswrapper[4895]: I1202 07:26:28.738101 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/873d08e2-d2e6-4785-b807-3b50d758d136-kube-api-access-x4gwr" (OuterVolumeSpecName: "kube-api-access-x4gwr") pod "873d08e2-d2e6-4785-b807-3b50d758d136" (UID: "873d08e2-d2e6-4785-b807-3b50d758d136"). InnerVolumeSpecName "kube-api-access-x4gwr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:26:28 crc kubenswrapper[4895]: I1202 07:26:28.750644 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/873d08e2-d2e6-4785-b807-3b50d758d136-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "873d08e2-d2e6-4785-b807-3b50d758d136" (UID: "873d08e2-d2e6-4785-b807-3b50d758d136"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:26:28 crc kubenswrapper[4895]: I1202 07:26:28.831494 4895 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/873d08e2-d2e6-4785-b807-3b50d758d136-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 07:26:28 crc kubenswrapper[4895]: I1202 07:26:28.831550 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4gwr\" (UniqueName: \"kubernetes.io/projected/873d08e2-d2e6-4785-b807-3b50d758d136-kube-api-access-x4gwr\") on node \"crc\" DevicePath \"\"" Dec 02 07:26:29 crc kubenswrapper[4895]: I1202 07:26:29.183999 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l7hjt" event={"ID":"873d08e2-d2e6-4785-b807-3b50d758d136","Type":"ContainerDied","Data":"7344ada898fe7c120a55256d02448c442c26f0d825b6ce7007290f712abe6389"} Dec 02 07:26:29 crc kubenswrapper[4895]: I1202 07:26:29.184566 4895 scope.go:117] "RemoveContainer" containerID="e9c63391b740e861176938282076694c6522d3dd8db1a91cc36181a04a114a9a" Dec 02 07:26:29 crc kubenswrapper[4895]: I1202 07:26:29.185296 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l7hjt" Dec 02 07:26:29 crc kubenswrapper[4895]: I1202 07:26:29.209627 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-l7hjt"] Dec 02 07:26:29 crc kubenswrapper[4895]: I1202 07:26:29.213473 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-l7hjt"] Dec 02 07:26:29 crc kubenswrapper[4895]: I1202 07:26:29.215103 4895 scope.go:117] "RemoveContainer" containerID="a1e884e7e9e24f4add4302b865dce946dfece9ae3069db5c85b8eb1c2261fb68" Dec 02 07:26:29 crc kubenswrapper[4895]: I1202 07:26:29.233653 4895 scope.go:117] "RemoveContainer" containerID="2dc57c9ac27f3fadacbb0fd0e7ca45a937372ef220adc6d40703b2fe9b58b2f4" Dec 02 07:26:29 crc kubenswrapper[4895]: I1202 07:26:29.236460 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-899nx" Dec 02 07:26:29 crc kubenswrapper[4895]: I1202 07:26:29.273039 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-qsldg" Dec 02 07:26:30 crc kubenswrapper[4895]: I1202 07:26:30.381623 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qsldg"] Dec 02 07:26:30 crc kubenswrapper[4895]: I1202 07:26:30.989125 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-wjh4k" Dec 02 07:26:31 crc kubenswrapper[4895]: I1202 07:26:31.050693 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-wjh4k" Dec 02 07:26:31 crc kubenswrapper[4895]: I1202 07:26:31.165353 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="873d08e2-d2e6-4785-b807-3b50d758d136" path="/var/lib/kubelet/pods/873d08e2-d2e6-4785-b807-3b50d758d136/volumes" Dec 02 07:26:31 crc 
kubenswrapper[4895]: I1202 07:26:31.353350 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-5c5n7" Dec 02 07:26:31 crc kubenswrapper[4895]: I1202 07:26:31.407265 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-5c5n7" Dec 02 07:26:32 crc kubenswrapper[4895]: I1202 07:26:32.207096 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-qsldg" podUID="c04158de-6693-44c6-82d3-198e545eccfb" containerName="registry-server" containerID="cri-o://50994102f6100ea2c54c8fa39da0b73b0724686c597db3228c753dfaf01062f0" gracePeriod=2 Dec 02 07:26:32 crc kubenswrapper[4895]: I1202 07:26:32.577368 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qsldg" Dec 02 07:26:32 crc kubenswrapper[4895]: I1202 07:26:32.698328 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c04158de-6693-44c6-82d3-198e545eccfb-utilities\") pod \"c04158de-6693-44c6-82d3-198e545eccfb\" (UID: \"c04158de-6693-44c6-82d3-198e545eccfb\") " Dec 02 07:26:32 crc kubenswrapper[4895]: I1202 07:26:32.698446 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c04158de-6693-44c6-82d3-198e545eccfb-catalog-content\") pod \"c04158de-6693-44c6-82d3-198e545eccfb\" (UID: \"c04158de-6693-44c6-82d3-198e545eccfb\") " Dec 02 07:26:32 crc kubenswrapper[4895]: I1202 07:26:32.698584 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnvj2\" (UniqueName: \"kubernetes.io/projected/c04158de-6693-44c6-82d3-198e545eccfb-kube-api-access-rnvj2\") pod \"c04158de-6693-44c6-82d3-198e545eccfb\" (UID: \"c04158de-6693-44c6-82d3-198e545eccfb\") " Dec 
02 07:26:32 crc kubenswrapper[4895]: I1202 07:26:32.704612 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c04158de-6693-44c6-82d3-198e545eccfb-utilities" (OuterVolumeSpecName: "utilities") pod "c04158de-6693-44c6-82d3-198e545eccfb" (UID: "c04158de-6693-44c6-82d3-198e545eccfb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:26:32 crc kubenswrapper[4895]: I1202 07:26:32.706444 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c04158de-6693-44c6-82d3-198e545eccfb-kube-api-access-rnvj2" (OuterVolumeSpecName: "kube-api-access-rnvj2") pod "c04158de-6693-44c6-82d3-198e545eccfb" (UID: "c04158de-6693-44c6-82d3-198e545eccfb"). InnerVolumeSpecName "kube-api-access-rnvj2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:26:32 crc kubenswrapper[4895]: I1202 07:26:32.754852 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c04158de-6693-44c6-82d3-198e545eccfb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c04158de-6693-44c6-82d3-198e545eccfb" (UID: "c04158de-6693-44c6-82d3-198e545eccfb"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:26:32 crc kubenswrapper[4895]: I1202 07:26:32.782667 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-899nx"] Dec 02 07:26:32 crc kubenswrapper[4895]: I1202 07:26:32.783456 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-899nx" podUID="e6040fed-14e6-49a5-802b-e49bfeba7aa5" containerName="registry-server" containerID="cri-o://55f37d0d9fb5125f472da6794ee4a354bd39cd98bc6a207f1d0c2e5f46e8a927" gracePeriod=2 Dec 02 07:26:32 crc kubenswrapper[4895]: I1202 07:26:32.800753 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnvj2\" (UniqueName: \"kubernetes.io/projected/c04158de-6693-44c6-82d3-198e545eccfb-kube-api-access-rnvj2\") on node \"crc\" DevicePath \"\"" Dec 02 07:26:32 crc kubenswrapper[4895]: I1202 07:26:32.800810 4895 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c04158de-6693-44c6-82d3-198e545eccfb-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 07:26:32 crc kubenswrapper[4895]: I1202 07:26:32.800827 4895 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c04158de-6693-44c6-82d3-198e545eccfb-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 07:26:33 crc kubenswrapper[4895]: I1202 07:26:33.148254 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-899nx" Dec 02 07:26:33 crc kubenswrapper[4895]: I1202 07:26:33.215101 4895 generic.go:334] "Generic (PLEG): container finished" podID="e6040fed-14e6-49a5-802b-e49bfeba7aa5" containerID="55f37d0d9fb5125f472da6794ee4a354bd39cd98bc6a207f1d0c2e5f46e8a927" exitCode=0 Dec 02 07:26:33 crc kubenswrapper[4895]: I1202 07:26:33.215205 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-899nx" event={"ID":"e6040fed-14e6-49a5-802b-e49bfeba7aa5","Type":"ContainerDied","Data":"55f37d0d9fb5125f472da6794ee4a354bd39cd98bc6a207f1d0c2e5f46e8a927"} Dec 02 07:26:33 crc kubenswrapper[4895]: I1202 07:26:33.215210 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-899nx" Dec 02 07:26:33 crc kubenswrapper[4895]: I1202 07:26:33.215249 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-899nx" event={"ID":"e6040fed-14e6-49a5-802b-e49bfeba7aa5","Type":"ContainerDied","Data":"088385ce913bf789e7bb655af7c4729ddc13523d74df1332a7e7007a588752bd"} Dec 02 07:26:33 crc kubenswrapper[4895]: I1202 07:26:33.215273 4895 scope.go:117] "RemoveContainer" containerID="55f37d0d9fb5125f472da6794ee4a354bd39cd98bc6a207f1d0c2e5f46e8a927" Dec 02 07:26:33 crc kubenswrapper[4895]: I1202 07:26:33.217922 4895 generic.go:334] "Generic (PLEG): container finished" podID="c04158de-6693-44c6-82d3-198e545eccfb" containerID="50994102f6100ea2c54c8fa39da0b73b0724686c597db3228c753dfaf01062f0" exitCode=0 Dec 02 07:26:33 crc kubenswrapper[4895]: I1202 07:26:33.217966 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qsldg" event={"ID":"c04158de-6693-44c6-82d3-198e545eccfb","Type":"ContainerDied","Data":"50994102f6100ea2c54c8fa39da0b73b0724686c597db3228c753dfaf01062f0"} Dec 02 07:26:33 crc kubenswrapper[4895]: I1202 07:26:33.218030 4895 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qsldg" event={"ID":"c04158de-6693-44c6-82d3-198e545eccfb","Type":"ContainerDied","Data":"0a3cd8cd6db0b54a499191102c15267ae4ad43caec27eae97d24967d297ad308"} Dec 02 07:26:33 crc kubenswrapper[4895]: I1202 07:26:33.218045 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qsldg" Dec 02 07:26:33 crc kubenswrapper[4895]: I1202 07:26:33.229616 4895 scope.go:117] "RemoveContainer" containerID="53a7c2e197dcf72ea309318d293d8c2cc646941989a1f0a1d435fb3d45b51133" Dec 02 07:26:33 crc kubenswrapper[4895]: I1202 07:26:33.240146 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qsldg"] Dec 02 07:26:33 crc kubenswrapper[4895]: I1202 07:26:33.242571 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-qsldg"] Dec 02 07:26:33 crc kubenswrapper[4895]: I1202 07:26:33.249538 4895 scope.go:117] "RemoveContainer" containerID="e87e9632a2248f19a185548fc87db3689bafb81a24a8fec00ed8bac605a3958c" Dec 02 07:26:33 crc kubenswrapper[4895]: I1202 07:26:33.261324 4895 scope.go:117] "RemoveContainer" containerID="55f37d0d9fb5125f472da6794ee4a354bd39cd98bc6a207f1d0c2e5f46e8a927" Dec 02 07:26:33 crc kubenswrapper[4895]: E1202 07:26:33.261691 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55f37d0d9fb5125f472da6794ee4a354bd39cd98bc6a207f1d0c2e5f46e8a927\": container with ID starting with 55f37d0d9fb5125f472da6794ee4a354bd39cd98bc6a207f1d0c2e5f46e8a927 not found: ID does not exist" containerID="55f37d0d9fb5125f472da6794ee4a354bd39cd98bc6a207f1d0c2e5f46e8a927" Dec 02 07:26:33 crc kubenswrapper[4895]: I1202 07:26:33.261771 4895 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"55f37d0d9fb5125f472da6794ee4a354bd39cd98bc6a207f1d0c2e5f46e8a927"} err="failed to get container status \"55f37d0d9fb5125f472da6794ee4a354bd39cd98bc6a207f1d0c2e5f46e8a927\": rpc error: code = NotFound desc = could not find container \"55f37d0d9fb5125f472da6794ee4a354bd39cd98bc6a207f1d0c2e5f46e8a927\": container with ID starting with 55f37d0d9fb5125f472da6794ee4a354bd39cd98bc6a207f1d0c2e5f46e8a927 not found: ID does not exist" Dec 02 07:26:33 crc kubenswrapper[4895]: I1202 07:26:33.261827 4895 scope.go:117] "RemoveContainer" containerID="53a7c2e197dcf72ea309318d293d8c2cc646941989a1f0a1d435fb3d45b51133" Dec 02 07:26:33 crc kubenswrapper[4895]: E1202 07:26:33.262086 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"53a7c2e197dcf72ea309318d293d8c2cc646941989a1f0a1d435fb3d45b51133\": container with ID starting with 53a7c2e197dcf72ea309318d293d8c2cc646941989a1f0a1d435fb3d45b51133 not found: ID does not exist" containerID="53a7c2e197dcf72ea309318d293d8c2cc646941989a1f0a1d435fb3d45b51133" Dec 02 07:26:33 crc kubenswrapper[4895]: I1202 07:26:33.262116 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53a7c2e197dcf72ea309318d293d8c2cc646941989a1f0a1d435fb3d45b51133"} err="failed to get container status \"53a7c2e197dcf72ea309318d293d8c2cc646941989a1f0a1d435fb3d45b51133\": rpc error: code = NotFound desc = could not find container \"53a7c2e197dcf72ea309318d293d8c2cc646941989a1f0a1d435fb3d45b51133\": container with ID starting with 53a7c2e197dcf72ea309318d293d8c2cc646941989a1f0a1d435fb3d45b51133 not found: ID does not exist" Dec 02 07:26:33 crc kubenswrapper[4895]: I1202 07:26:33.262132 4895 scope.go:117] "RemoveContainer" containerID="e87e9632a2248f19a185548fc87db3689bafb81a24a8fec00ed8bac605a3958c" Dec 02 07:26:33 crc kubenswrapper[4895]: E1202 07:26:33.262551 4895 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"e87e9632a2248f19a185548fc87db3689bafb81a24a8fec00ed8bac605a3958c\": container with ID starting with e87e9632a2248f19a185548fc87db3689bafb81a24a8fec00ed8bac605a3958c not found: ID does not exist" containerID="e87e9632a2248f19a185548fc87db3689bafb81a24a8fec00ed8bac605a3958c" Dec 02 07:26:33 crc kubenswrapper[4895]: I1202 07:26:33.262581 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e87e9632a2248f19a185548fc87db3689bafb81a24a8fec00ed8bac605a3958c"} err="failed to get container status \"e87e9632a2248f19a185548fc87db3689bafb81a24a8fec00ed8bac605a3958c\": rpc error: code = NotFound desc = could not find container \"e87e9632a2248f19a185548fc87db3689bafb81a24a8fec00ed8bac605a3958c\": container with ID starting with e87e9632a2248f19a185548fc87db3689bafb81a24a8fec00ed8bac605a3958c not found: ID does not exist" Dec 02 07:26:33 crc kubenswrapper[4895]: I1202 07:26:33.262606 4895 scope.go:117] "RemoveContainer" containerID="50994102f6100ea2c54c8fa39da0b73b0724686c597db3228c753dfaf01062f0" Dec 02 07:26:33 crc kubenswrapper[4895]: I1202 07:26:33.274363 4895 scope.go:117] "RemoveContainer" containerID="0d070e440f060c89ce0b25a1b510e18419c14f0fc77669a518fbb61c934ca72b" Dec 02 07:26:33 crc kubenswrapper[4895]: I1202 07:26:33.286605 4895 scope.go:117] "RemoveContainer" containerID="3909dd207d8ca37be457232d1d2d5b4fddb299544647d1c6aad8d25122d04b29" Dec 02 07:26:33 crc kubenswrapper[4895]: I1202 07:26:33.298760 4895 scope.go:117] "RemoveContainer" containerID="50994102f6100ea2c54c8fa39da0b73b0724686c597db3228c753dfaf01062f0" Dec 02 07:26:33 crc kubenswrapper[4895]: E1202 07:26:33.299213 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50994102f6100ea2c54c8fa39da0b73b0724686c597db3228c753dfaf01062f0\": container with ID starting with 
50994102f6100ea2c54c8fa39da0b73b0724686c597db3228c753dfaf01062f0 not found: ID does not exist" containerID="50994102f6100ea2c54c8fa39da0b73b0724686c597db3228c753dfaf01062f0" Dec 02 07:26:33 crc kubenswrapper[4895]: I1202 07:26:33.299287 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50994102f6100ea2c54c8fa39da0b73b0724686c597db3228c753dfaf01062f0"} err="failed to get container status \"50994102f6100ea2c54c8fa39da0b73b0724686c597db3228c753dfaf01062f0\": rpc error: code = NotFound desc = could not find container \"50994102f6100ea2c54c8fa39da0b73b0724686c597db3228c753dfaf01062f0\": container with ID starting with 50994102f6100ea2c54c8fa39da0b73b0724686c597db3228c753dfaf01062f0 not found: ID does not exist" Dec 02 07:26:33 crc kubenswrapper[4895]: I1202 07:26:33.299347 4895 scope.go:117] "RemoveContainer" containerID="0d070e440f060c89ce0b25a1b510e18419c14f0fc77669a518fbb61c934ca72b" Dec 02 07:26:33 crc kubenswrapper[4895]: E1202 07:26:33.299673 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d070e440f060c89ce0b25a1b510e18419c14f0fc77669a518fbb61c934ca72b\": container with ID starting with 0d070e440f060c89ce0b25a1b510e18419c14f0fc77669a518fbb61c934ca72b not found: ID does not exist" containerID="0d070e440f060c89ce0b25a1b510e18419c14f0fc77669a518fbb61c934ca72b" Dec 02 07:26:33 crc kubenswrapper[4895]: I1202 07:26:33.299704 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d070e440f060c89ce0b25a1b510e18419c14f0fc77669a518fbb61c934ca72b"} err="failed to get container status \"0d070e440f060c89ce0b25a1b510e18419c14f0fc77669a518fbb61c934ca72b\": rpc error: code = NotFound desc = could not find container \"0d070e440f060c89ce0b25a1b510e18419c14f0fc77669a518fbb61c934ca72b\": container with ID starting with 0d070e440f060c89ce0b25a1b510e18419c14f0fc77669a518fbb61c934ca72b not found: ID does not 
exist" Dec 02 07:26:33 crc kubenswrapper[4895]: I1202 07:26:33.299725 4895 scope.go:117] "RemoveContainer" containerID="3909dd207d8ca37be457232d1d2d5b4fddb299544647d1c6aad8d25122d04b29" Dec 02 07:26:33 crc kubenswrapper[4895]: E1202 07:26:33.300057 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3909dd207d8ca37be457232d1d2d5b4fddb299544647d1c6aad8d25122d04b29\": container with ID starting with 3909dd207d8ca37be457232d1d2d5b4fddb299544647d1c6aad8d25122d04b29 not found: ID does not exist" containerID="3909dd207d8ca37be457232d1d2d5b4fddb299544647d1c6aad8d25122d04b29" Dec 02 07:26:33 crc kubenswrapper[4895]: I1202 07:26:33.300115 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3909dd207d8ca37be457232d1d2d5b4fddb299544647d1c6aad8d25122d04b29"} err="failed to get container status \"3909dd207d8ca37be457232d1d2d5b4fddb299544647d1c6aad8d25122d04b29\": rpc error: code = NotFound desc = could not find container \"3909dd207d8ca37be457232d1d2d5b4fddb299544647d1c6aad8d25122d04b29\": container with ID starting with 3909dd207d8ca37be457232d1d2d5b4fddb299544647d1c6aad8d25122d04b29 not found: ID does not exist" Dec 02 07:26:33 crc kubenswrapper[4895]: I1202 07:26:33.307860 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bdc82\" (UniqueName: \"kubernetes.io/projected/e6040fed-14e6-49a5-802b-e49bfeba7aa5-kube-api-access-bdc82\") pod \"e6040fed-14e6-49a5-802b-e49bfeba7aa5\" (UID: \"e6040fed-14e6-49a5-802b-e49bfeba7aa5\") " Dec 02 07:26:33 crc kubenswrapper[4895]: I1202 07:26:33.308019 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6040fed-14e6-49a5-802b-e49bfeba7aa5-utilities\") pod \"e6040fed-14e6-49a5-802b-e49bfeba7aa5\" (UID: \"e6040fed-14e6-49a5-802b-e49bfeba7aa5\") " Dec 02 07:26:33 crc kubenswrapper[4895]: 
I1202 07:26:33.308040 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6040fed-14e6-49a5-802b-e49bfeba7aa5-catalog-content\") pod \"e6040fed-14e6-49a5-802b-e49bfeba7aa5\" (UID: \"e6040fed-14e6-49a5-802b-e49bfeba7aa5\") " Dec 02 07:26:33 crc kubenswrapper[4895]: I1202 07:26:33.309057 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6040fed-14e6-49a5-802b-e49bfeba7aa5-utilities" (OuterVolumeSpecName: "utilities") pod "e6040fed-14e6-49a5-802b-e49bfeba7aa5" (UID: "e6040fed-14e6-49a5-802b-e49bfeba7aa5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:26:33 crc kubenswrapper[4895]: I1202 07:26:33.311965 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6040fed-14e6-49a5-802b-e49bfeba7aa5-kube-api-access-bdc82" (OuterVolumeSpecName: "kube-api-access-bdc82") pod "e6040fed-14e6-49a5-802b-e49bfeba7aa5" (UID: "e6040fed-14e6-49a5-802b-e49bfeba7aa5"). InnerVolumeSpecName "kube-api-access-bdc82". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:26:33 crc kubenswrapper[4895]: I1202 07:26:33.351876 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6040fed-14e6-49a5-802b-e49bfeba7aa5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e6040fed-14e6-49a5-802b-e49bfeba7aa5" (UID: "e6040fed-14e6-49a5-802b-e49bfeba7aa5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:26:33 crc kubenswrapper[4895]: I1202 07:26:33.409607 4895 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6040fed-14e6-49a5-802b-e49bfeba7aa5-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 07:26:33 crc kubenswrapper[4895]: I1202 07:26:33.409647 4895 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6040fed-14e6-49a5-802b-e49bfeba7aa5-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 07:26:33 crc kubenswrapper[4895]: I1202 07:26:33.409660 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bdc82\" (UniqueName: \"kubernetes.io/projected/e6040fed-14e6-49a5-802b-e49bfeba7aa5-kube-api-access-bdc82\") on node \"crc\" DevicePath \"\"" Dec 02 07:26:33 crc kubenswrapper[4895]: I1202 07:26:33.540443 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-899nx"] Dec 02 07:26:33 crc kubenswrapper[4895]: I1202 07:26:33.544150 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-899nx"] Dec 02 07:26:34 crc kubenswrapper[4895]: I1202 07:26:34.631703 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-rd9pn"] Dec 02 07:26:34 crc kubenswrapper[4895]: I1202 07:26:34.633234 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-rd9pn" podUID="902734d2-7cd4-4f9b-ab9e-72cc5d8f3b73" containerName="controller-manager" containerID="cri-o://e8c99f357d871d6a97e40e0af79a55dd840c49b505c152139af531c3676efa68" gracePeriod=30 Dec 02 07:26:34 crc kubenswrapper[4895]: I1202 07:26:34.721319 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-jd7nh"] Dec 
02 07:26:34 crc kubenswrapper[4895]: I1202 07:26:34.721615 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jd7nh" podUID="adee7e4a-d71b-4efc-b3fa-6e3ece833722" containerName="route-controller-manager" containerID="cri-o://dc019b35034da15881d99ce2faf0303536724aa744716b9ddee14e32c2dca168" gracePeriod=30 Dec 02 07:26:35 crc kubenswrapper[4895]: I1202 07:26:35.022406 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-rd9pn" Dec 02 07:26:35 crc kubenswrapper[4895]: I1202 07:26:35.068655 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jd7nh" Dec 02 07:26:35 crc kubenswrapper[4895]: I1202 07:26:35.140194 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wtd8j\" (UniqueName: \"kubernetes.io/projected/902734d2-7cd4-4f9b-ab9e-72cc5d8f3b73-kube-api-access-wtd8j\") pod \"902734d2-7cd4-4f9b-ab9e-72cc5d8f3b73\" (UID: \"902734d2-7cd4-4f9b-ab9e-72cc5d8f3b73\") " Dec 02 07:26:35 crc kubenswrapper[4895]: I1202 07:26:35.140312 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/902734d2-7cd4-4f9b-ab9e-72cc5d8f3b73-config\") pod \"902734d2-7cd4-4f9b-ab9e-72cc5d8f3b73\" (UID: \"902734d2-7cd4-4f9b-ab9e-72cc5d8f3b73\") " Dec 02 07:26:35 crc kubenswrapper[4895]: I1202 07:26:35.140373 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/902734d2-7cd4-4f9b-ab9e-72cc5d8f3b73-serving-cert\") pod \"902734d2-7cd4-4f9b-ab9e-72cc5d8f3b73\" (UID: \"902734d2-7cd4-4f9b-ab9e-72cc5d8f3b73\") " Dec 02 07:26:35 crc kubenswrapper[4895]: I1202 07:26:35.140402 4895 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/902734d2-7cd4-4f9b-ab9e-72cc5d8f3b73-proxy-ca-bundles\") pod \"902734d2-7cd4-4f9b-ab9e-72cc5d8f3b73\" (UID: \"902734d2-7cd4-4f9b-ab9e-72cc5d8f3b73\") " Dec 02 07:26:35 crc kubenswrapper[4895]: I1202 07:26:35.140458 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/902734d2-7cd4-4f9b-ab9e-72cc5d8f3b73-client-ca\") pod \"902734d2-7cd4-4f9b-ab9e-72cc5d8f3b73\" (UID: \"902734d2-7cd4-4f9b-ab9e-72cc5d8f3b73\") " Dec 02 07:26:35 crc kubenswrapper[4895]: I1202 07:26:35.141215 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/902734d2-7cd4-4f9b-ab9e-72cc5d8f3b73-client-ca" (OuterVolumeSpecName: "client-ca") pod "902734d2-7cd4-4f9b-ab9e-72cc5d8f3b73" (UID: "902734d2-7cd4-4f9b-ab9e-72cc5d8f3b73"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:26:35 crc kubenswrapper[4895]: I1202 07:26:35.141225 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/902734d2-7cd4-4f9b-ab9e-72cc5d8f3b73-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "902734d2-7cd4-4f9b-ab9e-72cc5d8f3b73" (UID: "902734d2-7cd4-4f9b-ab9e-72cc5d8f3b73"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:26:35 crc kubenswrapper[4895]: I1202 07:26:35.141388 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/902734d2-7cd4-4f9b-ab9e-72cc5d8f3b73-config" (OuterVolumeSpecName: "config") pod "902734d2-7cd4-4f9b-ab9e-72cc5d8f3b73" (UID: "902734d2-7cd4-4f9b-ab9e-72cc5d8f3b73"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:26:35 crc kubenswrapper[4895]: I1202 07:26:35.146104 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/902734d2-7cd4-4f9b-ab9e-72cc5d8f3b73-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "902734d2-7cd4-4f9b-ab9e-72cc5d8f3b73" (UID: "902734d2-7cd4-4f9b-ab9e-72cc5d8f3b73"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:26:35 crc kubenswrapper[4895]: I1202 07:26:35.146223 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/902734d2-7cd4-4f9b-ab9e-72cc5d8f3b73-kube-api-access-wtd8j" (OuterVolumeSpecName: "kube-api-access-wtd8j") pod "902734d2-7cd4-4f9b-ab9e-72cc5d8f3b73" (UID: "902734d2-7cd4-4f9b-ab9e-72cc5d8f3b73"). InnerVolumeSpecName "kube-api-access-wtd8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:26:35 crc kubenswrapper[4895]: I1202 07:26:35.154006 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c04158de-6693-44c6-82d3-198e545eccfb" path="/var/lib/kubelet/pods/c04158de-6693-44c6-82d3-198e545eccfb/volumes" Dec 02 07:26:35 crc kubenswrapper[4895]: I1202 07:26:35.155276 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6040fed-14e6-49a5-802b-e49bfeba7aa5" path="/var/lib/kubelet/pods/e6040fed-14e6-49a5-802b-e49bfeba7aa5/volumes" Dec 02 07:26:35 crc kubenswrapper[4895]: I1202 07:26:35.232652 4895 generic.go:334] "Generic (PLEG): container finished" podID="902734d2-7cd4-4f9b-ab9e-72cc5d8f3b73" containerID="e8c99f357d871d6a97e40e0af79a55dd840c49b505c152139af531c3676efa68" exitCode=0 Dec 02 07:26:35 crc kubenswrapper[4895]: I1202 07:26:35.232771 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-rd9pn" 
event={"ID":"902734d2-7cd4-4f9b-ab9e-72cc5d8f3b73","Type":"ContainerDied","Data":"e8c99f357d871d6a97e40e0af79a55dd840c49b505c152139af531c3676efa68"} Dec 02 07:26:35 crc kubenswrapper[4895]: I1202 07:26:35.232823 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-rd9pn" event={"ID":"902734d2-7cd4-4f9b-ab9e-72cc5d8f3b73","Type":"ContainerDied","Data":"857cf1f2927f712ff263c79fc39238cee0fbc4256252c1abdfb94488b1ae30de"} Dec 02 07:26:35 crc kubenswrapper[4895]: I1202 07:26:35.232850 4895 scope.go:117] "RemoveContainer" containerID="e8c99f357d871d6a97e40e0af79a55dd840c49b505c152139af531c3676efa68" Dec 02 07:26:35 crc kubenswrapper[4895]: I1202 07:26:35.233012 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-rd9pn" Dec 02 07:26:35 crc kubenswrapper[4895]: I1202 07:26:35.239404 4895 generic.go:334] "Generic (PLEG): container finished" podID="adee7e4a-d71b-4efc-b3fa-6e3ece833722" containerID="dc019b35034da15881d99ce2faf0303536724aa744716b9ddee14e32c2dca168" exitCode=0 Dec 02 07:26:35 crc kubenswrapper[4895]: I1202 07:26:35.239471 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jd7nh" event={"ID":"adee7e4a-d71b-4efc-b3fa-6e3ece833722","Type":"ContainerDied","Data":"dc019b35034da15881d99ce2faf0303536724aa744716b9ddee14e32c2dca168"} Dec 02 07:26:35 crc kubenswrapper[4895]: I1202 07:26:35.239527 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jd7nh" event={"ID":"adee7e4a-d71b-4efc-b3fa-6e3ece833722","Type":"ContainerDied","Data":"97c318a4a8350ed22c9957c16973494812c6b50b742ea1eaad55560480ff2b12"} Dec 02 07:26:35 crc kubenswrapper[4895]: I1202 07:26:35.239445 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jd7nh" Dec 02 07:26:35 crc kubenswrapper[4895]: I1202 07:26:35.241257 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/adee7e4a-d71b-4efc-b3fa-6e3ece833722-config\") pod \"adee7e4a-d71b-4efc-b3fa-6e3ece833722\" (UID: \"adee7e4a-d71b-4efc-b3fa-6e3ece833722\") " Dec 02 07:26:35 crc kubenswrapper[4895]: I1202 07:26:35.241297 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/adee7e4a-d71b-4efc-b3fa-6e3ece833722-serving-cert\") pod \"adee7e4a-d71b-4efc-b3fa-6e3ece833722\" (UID: \"adee7e4a-d71b-4efc-b3fa-6e3ece833722\") " Dec 02 07:26:35 crc kubenswrapper[4895]: I1202 07:26:35.241442 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pxkff\" (UniqueName: \"kubernetes.io/projected/adee7e4a-d71b-4efc-b3fa-6e3ece833722-kube-api-access-pxkff\") pod \"adee7e4a-d71b-4efc-b3fa-6e3ece833722\" (UID: \"adee7e4a-d71b-4efc-b3fa-6e3ece833722\") " Dec 02 07:26:35 crc kubenswrapper[4895]: I1202 07:26:35.241466 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/adee7e4a-d71b-4efc-b3fa-6e3ece833722-client-ca\") pod \"adee7e4a-d71b-4efc-b3fa-6e3ece833722\" (UID: \"adee7e4a-d71b-4efc-b3fa-6e3ece833722\") " Dec 02 07:26:35 crc kubenswrapper[4895]: I1202 07:26:35.241701 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wtd8j\" (UniqueName: \"kubernetes.io/projected/902734d2-7cd4-4f9b-ab9e-72cc5d8f3b73-kube-api-access-wtd8j\") on node \"crc\" DevicePath \"\"" Dec 02 07:26:35 crc kubenswrapper[4895]: I1202 07:26:35.241718 4895 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/902734d2-7cd4-4f9b-ab9e-72cc5d8f3b73-config\") on node \"crc\" DevicePath \"\"" Dec 02 07:26:35 crc kubenswrapper[4895]: I1202 07:26:35.241729 4895 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/902734d2-7cd4-4f9b-ab9e-72cc5d8f3b73-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 07:26:35 crc kubenswrapper[4895]: I1202 07:26:35.241754 4895 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/902734d2-7cd4-4f9b-ab9e-72cc5d8f3b73-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 02 07:26:35 crc kubenswrapper[4895]: I1202 07:26:35.241763 4895 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/902734d2-7cd4-4f9b-ab9e-72cc5d8f3b73-client-ca\") on node \"crc\" DevicePath \"\"" Dec 02 07:26:35 crc kubenswrapper[4895]: I1202 07:26:35.242193 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/adee7e4a-d71b-4efc-b3fa-6e3ece833722-client-ca" (OuterVolumeSpecName: "client-ca") pod "adee7e4a-d71b-4efc-b3fa-6e3ece833722" (UID: "adee7e4a-d71b-4efc-b3fa-6e3ece833722"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:26:35 crc kubenswrapper[4895]: I1202 07:26:35.242218 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/adee7e4a-d71b-4efc-b3fa-6e3ece833722-config" (OuterVolumeSpecName: "config") pod "adee7e4a-d71b-4efc-b3fa-6e3ece833722" (UID: "adee7e4a-d71b-4efc-b3fa-6e3ece833722"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:26:35 crc kubenswrapper[4895]: I1202 07:26:35.246836 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/adee7e4a-d71b-4efc-b3fa-6e3ece833722-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "adee7e4a-d71b-4efc-b3fa-6e3ece833722" (UID: "adee7e4a-d71b-4efc-b3fa-6e3ece833722"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:26:35 crc kubenswrapper[4895]: I1202 07:26:35.247458 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/adee7e4a-d71b-4efc-b3fa-6e3ece833722-kube-api-access-pxkff" (OuterVolumeSpecName: "kube-api-access-pxkff") pod "adee7e4a-d71b-4efc-b3fa-6e3ece833722" (UID: "adee7e4a-d71b-4efc-b3fa-6e3ece833722"). InnerVolumeSpecName "kube-api-access-pxkff". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:26:35 crc kubenswrapper[4895]: I1202 07:26:35.263299 4895 scope.go:117] "RemoveContainer" containerID="e8c99f357d871d6a97e40e0af79a55dd840c49b505c152139af531c3676efa68" Dec 02 07:26:35 crc kubenswrapper[4895]: E1202 07:26:35.264460 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e8c99f357d871d6a97e40e0af79a55dd840c49b505c152139af531c3676efa68\": container with ID starting with e8c99f357d871d6a97e40e0af79a55dd840c49b505c152139af531c3676efa68 not found: ID does not exist" containerID="e8c99f357d871d6a97e40e0af79a55dd840c49b505c152139af531c3676efa68" Dec 02 07:26:35 crc kubenswrapper[4895]: I1202 07:26:35.264506 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8c99f357d871d6a97e40e0af79a55dd840c49b505c152139af531c3676efa68"} err="failed to get container status \"e8c99f357d871d6a97e40e0af79a55dd840c49b505c152139af531c3676efa68\": rpc error: code = NotFound desc = could not find container 
\"e8c99f357d871d6a97e40e0af79a55dd840c49b505c152139af531c3676efa68\": container with ID starting with e8c99f357d871d6a97e40e0af79a55dd840c49b505c152139af531c3676efa68 not found: ID does not exist" Dec 02 07:26:35 crc kubenswrapper[4895]: I1202 07:26:35.264538 4895 scope.go:117] "RemoveContainer" containerID="dc019b35034da15881d99ce2faf0303536724aa744716b9ddee14e32c2dca168" Dec 02 07:26:35 crc kubenswrapper[4895]: I1202 07:26:35.267531 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-rd9pn"] Dec 02 07:26:35 crc kubenswrapper[4895]: I1202 07:26:35.272502 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-rd9pn"] Dec 02 07:26:35 crc kubenswrapper[4895]: I1202 07:26:35.280772 4895 scope.go:117] "RemoveContainer" containerID="dc019b35034da15881d99ce2faf0303536724aa744716b9ddee14e32c2dca168" Dec 02 07:26:35 crc kubenswrapper[4895]: E1202 07:26:35.281578 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc019b35034da15881d99ce2faf0303536724aa744716b9ddee14e32c2dca168\": container with ID starting with dc019b35034da15881d99ce2faf0303536724aa744716b9ddee14e32c2dca168 not found: ID does not exist" containerID="dc019b35034da15881d99ce2faf0303536724aa744716b9ddee14e32c2dca168" Dec 02 07:26:35 crc kubenswrapper[4895]: I1202 07:26:35.281642 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc019b35034da15881d99ce2faf0303536724aa744716b9ddee14e32c2dca168"} err="failed to get container status \"dc019b35034da15881d99ce2faf0303536724aa744716b9ddee14e32c2dca168\": rpc error: code = NotFound desc = could not find container \"dc019b35034da15881d99ce2faf0303536724aa744716b9ddee14e32c2dca168\": container with ID starting with dc019b35034da15881d99ce2faf0303536724aa744716b9ddee14e32c2dca168 not found: ID does not exist" Dec 02 
07:26:35 crc kubenswrapper[4895]: I1202 07:26:35.342946 4895 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/adee7e4a-d71b-4efc-b3fa-6e3ece833722-config\") on node \"crc\" DevicePath \"\"" Dec 02 07:26:35 crc kubenswrapper[4895]: I1202 07:26:35.343009 4895 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/adee7e4a-d71b-4efc-b3fa-6e3ece833722-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 07:26:35 crc kubenswrapper[4895]: I1202 07:26:35.343035 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pxkff\" (UniqueName: \"kubernetes.io/projected/adee7e4a-d71b-4efc-b3fa-6e3ece833722-kube-api-access-pxkff\") on node \"crc\" DevicePath \"\"" Dec 02 07:26:35 crc kubenswrapper[4895]: I1202 07:26:35.343060 4895 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/adee7e4a-d71b-4efc-b3fa-6e3ece833722-client-ca\") on node \"crc\" DevicePath \"\"" Dec 02 07:26:35 crc kubenswrapper[4895]: I1202 07:26:35.380966 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5c5n7"] Dec 02 07:26:35 crc kubenswrapper[4895]: I1202 07:26:35.381850 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-5c5n7" podUID="459e9e10-320f-47aa-901c-cfd9ec241a67" containerName="registry-server" containerID="cri-o://cc63c619fcd7e26d5fcbe45974ac2bf10caa99e494be1c18dd34368cab1318ff" gracePeriod=2 Dec 02 07:26:35 crc kubenswrapper[4895]: I1202 07:26:35.473130 4895 patch_prober.go:28] interesting pod/machine-config-daemon-wfcg7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 07:26:35 crc kubenswrapper[4895]: I1202 
07:26:35.473192 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 07:26:35 crc kubenswrapper[4895]: I1202 07:26:35.617761 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-jd7nh"] Dec 02 07:26:35 crc kubenswrapper[4895]: I1202 07:26:35.621863 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-jd7nh"] Dec 02 07:26:35 crc kubenswrapper[4895]: I1202 07:26:35.739941 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5c5n7" Dec 02 07:26:35 crc kubenswrapper[4895]: I1202 07:26:35.849153 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/459e9e10-320f-47aa-901c-cfd9ec241a67-utilities\") pod \"459e9e10-320f-47aa-901c-cfd9ec241a67\" (UID: \"459e9e10-320f-47aa-901c-cfd9ec241a67\") " Dec 02 07:26:35 crc kubenswrapper[4895]: I1202 07:26:35.849310 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/459e9e10-320f-47aa-901c-cfd9ec241a67-catalog-content\") pod \"459e9e10-320f-47aa-901c-cfd9ec241a67\" (UID: \"459e9e10-320f-47aa-901c-cfd9ec241a67\") " Dec 02 07:26:35 crc kubenswrapper[4895]: I1202 07:26:35.849462 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5pp7s\" (UniqueName: \"kubernetes.io/projected/459e9e10-320f-47aa-901c-cfd9ec241a67-kube-api-access-5pp7s\") pod \"459e9e10-320f-47aa-901c-cfd9ec241a67\" (UID: 
\"459e9e10-320f-47aa-901c-cfd9ec241a67\") " Dec 02 07:26:35 crc kubenswrapper[4895]: I1202 07:26:35.850591 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/459e9e10-320f-47aa-901c-cfd9ec241a67-utilities" (OuterVolumeSpecName: "utilities") pod "459e9e10-320f-47aa-901c-cfd9ec241a67" (UID: "459e9e10-320f-47aa-901c-cfd9ec241a67"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:26:35 crc kubenswrapper[4895]: I1202 07:26:35.859002 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/459e9e10-320f-47aa-901c-cfd9ec241a67-kube-api-access-5pp7s" (OuterVolumeSpecName: "kube-api-access-5pp7s") pod "459e9e10-320f-47aa-901c-cfd9ec241a67" (UID: "459e9e10-320f-47aa-901c-cfd9ec241a67"). InnerVolumeSpecName "kube-api-access-5pp7s". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:26:35 crc kubenswrapper[4895]: I1202 07:26:35.951332 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5pp7s\" (UniqueName: \"kubernetes.io/projected/459e9e10-320f-47aa-901c-cfd9ec241a67-kube-api-access-5pp7s\") on node \"crc\" DevicePath \"\"" Dec 02 07:26:35 crc kubenswrapper[4895]: I1202 07:26:35.951500 4895 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/459e9e10-320f-47aa-901c-cfd9ec241a67-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 07:26:35 crc kubenswrapper[4895]: I1202 07:26:35.964043 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/459e9e10-320f-47aa-901c-cfd9ec241a67-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "459e9e10-320f-47aa-901c-cfd9ec241a67" (UID: "459e9e10-320f-47aa-901c-cfd9ec241a67"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:26:36 crc kubenswrapper[4895]: I1202 07:26:36.053540 4895 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/459e9e10-320f-47aa-901c-cfd9ec241a67-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 07:26:36 crc kubenswrapper[4895]: I1202 07:26:36.251281 4895 generic.go:334] "Generic (PLEG): container finished" podID="459e9e10-320f-47aa-901c-cfd9ec241a67" containerID="cc63c619fcd7e26d5fcbe45974ac2bf10caa99e494be1c18dd34368cab1318ff" exitCode=0 Dec 02 07:26:36 crc kubenswrapper[4895]: I1202 07:26:36.251376 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5c5n7" Dec 02 07:26:36 crc kubenswrapper[4895]: I1202 07:26:36.251375 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5c5n7" event={"ID":"459e9e10-320f-47aa-901c-cfd9ec241a67","Type":"ContainerDied","Data":"cc63c619fcd7e26d5fcbe45974ac2bf10caa99e494be1c18dd34368cab1318ff"} Dec 02 07:26:36 crc kubenswrapper[4895]: I1202 07:26:36.251456 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5c5n7" event={"ID":"459e9e10-320f-47aa-901c-cfd9ec241a67","Type":"ContainerDied","Data":"46abadcc64129015fd5c7420ee2ec9a3f9250289950d5b5c2c2d6bdc5539373f"} Dec 02 07:26:36 crc kubenswrapper[4895]: I1202 07:26:36.251485 4895 scope.go:117] "RemoveContainer" containerID="cc63c619fcd7e26d5fcbe45974ac2bf10caa99e494be1c18dd34368cab1318ff" Dec 02 07:26:36 crc kubenswrapper[4895]: I1202 07:26:36.272888 4895 scope.go:117] "RemoveContainer" containerID="865e21c7f8914590c1a5da8974ff773a758097cd1943c7c238c4618674c218c7" Dec 02 07:26:36 crc kubenswrapper[4895]: I1202 07:26:36.304321 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5c5n7"] Dec 02 07:26:36 crc kubenswrapper[4895]: I1202 
07:26:36.308593 4895 scope.go:117] "RemoveContainer" containerID="26d9c0eb7234231a93f0a3f41e16f51691fa3ee433a4f6491e0a58ca39f8f39e" Dec 02 07:26:36 crc kubenswrapper[4895]: I1202 07:26:36.308808 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-5c5n7"] Dec 02 07:26:36 crc kubenswrapper[4895]: I1202 07:26:36.329008 4895 scope.go:117] "RemoveContainer" containerID="cc63c619fcd7e26d5fcbe45974ac2bf10caa99e494be1c18dd34368cab1318ff" Dec 02 07:26:36 crc kubenswrapper[4895]: E1202 07:26:36.329917 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc63c619fcd7e26d5fcbe45974ac2bf10caa99e494be1c18dd34368cab1318ff\": container with ID starting with cc63c619fcd7e26d5fcbe45974ac2bf10caa99e494be1c18dd34368cab1318ff not found: ID does not exist" containerID="cc63c619fcd7e26d5fcbe45974ac2bf10caa99e494be1c18dd34368cab1318ff" Dec 02 07:26:36 crc kubenswrapper[4895]: I1202 07:26:36.329975 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc63c619fcd7e26d5fcbe45974ac2bf10caa99e494be1c18dd34368cab1318ff"} err="failed to get container status \"cc63c619fcd7e26d5fcbe45974ac2bf10caa99e494be1c18dd34368cab1318ff\": rpc error: code = NotFound desc = could not find container \"cc63c619fcd7e26d5fcbe45974ac2bf10caa99e494be1c18dd34368cab1318ff\": container with ID starting with cc63c619fcd7e26d5fcbe45974ac2bf10caa99e494be1c18dd34368cab1318ff not found: ID does not exist" Dec 02 07:26:36 crc kubenswrapper[4895]: I1202 07:26:36.330021 4895 scope.go:117] "RemoveContainer" containerID="865e21c7f8914590c1a5da8974ff773a758097cd1943c7c238c4618674c218c7" Dec 02 07:26:36 crc kubenswrapper[4895]: E1202 07:26:36.330616 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"865e21c7f8914590c1a5da8974ff773a758097cd1943c7c238c4618674c218c7\": container with ID 
starting with 865e21c7f8914590c1a5da8974ff773a758097cd1943c7c238c4618674c218c7 not found: ID does not exist" containerID="865e21c7f8914590c1a5da8974ff773a758097cd1943c7c238c4618674c218c7" Dec 02 07:26:36 crc kubenswrapper[4895]: I1202 07:26:36.330655 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"865e21c7f8914590c1a5da8974ff773a758097cd1943c7c238c4618674c218c7"} err="failed to get container status \"865e21c7f8914590c1a5da8974ff773a758097cd1943c7c238c4618674c218c7\": rpc error: code = NotFound desc = could not find container \"865e21c7f8914590c1a5da8974ff773a758097cd1943c7c238c4618674c218c7\": container with ID starting with 865e21c7f8914590c1a5da8974ff773a758097cd1943c7c238c4618674c218c7 not found: ID does not exist" Dec 02 07:26:36 crc kubenswrapper[4895]: I1202 07:26:36.330685 4895 scope.go:117] "RemoveContainer" containerID="26d9c0eb7234231a93f0a3f41e16f51691fa3ee433a4f6491e0a58ca39f8f39e" Dec 02 07:26:36 crc kubenswrapper[4895]: E1202 07:26:36.331607 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26d9c0eb7234231a93f0a3f41e16f51691fa3ee433a4f6491e0a58ca39f8f39e\": container with ID starting with 26d9c0eb7234231a93f0a3f41e16f51691fa3ee433a4f6491e0a58ca39f8f39e not found: ID does not exist" containerID="26d9c0eb7234231a93f0a3f41e16f51691fa3ee433a4f6491e0a58ca39f8f39e" Dec 02 07:26:36 crc kubenswrapper[4895]: I1202 07:26:36.331683 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26d9c0eb7234231a93f0a3f41e16f51691fa3ee433a4f6491e0a58ca39f8f39e"} err="failed to get container status \"26d9c0eb7234231a93f0a3f41e16f51691fa3ee433a4f6491e0a58ca39f8f39e\": rpc error: code = NotFound desc = could not find container \"26d9c0eb7234231a93f0a3f41e16f51691fa3ee433a4f6491e0a58ca39f8f39e\": container with ID starting with 26d9c0eb7234231a93f0a3f41e16f51691fa3ee433a4f6491e0a58ca39f8f39e not found: 
ID does not exist" Dec 02 07:26:36 crc kubenswrapper[4895]: I1202 07:26:36.347827 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67485fb985-x7mxq"] Dec 02 07:26:36 crc kubenswrapper[4895]: E1202 07:26:36.348542 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="873d08e2-d2e6-4785-b807-3b50d758d136" containerName="extract-utilities" Dec 02 07:26:36 crc kubenswrapper[4895]: I1202 07:26:36.348604 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="873d08e2-d2e6-4785-b807-3b50d758d136" containerName="extract-utilities" Dec 02 07:26:36 crc kubenswrapper[4895]: E1202 07:26:36.348626 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="459e9e10-320f-47aa-901c-cfd9ec241a67" containerName="extract-content" Dec 02 07:26:36 crc kubenswrapper[4895]: I1202 07:26:36.348672 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="459e9e10-320f-47aa-901c-cfd9ec241a67" containerName="extract-content" Dec 02 07:26:36 crc kubenswrapper[4895]: E1202 07:26:36.348687 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c04158de-6693-44c6-82d3-198e545eccfb" containerName="registry-server" Dec 02 07:26:36 crc kubenswrapper[4895]: I1202 07:26:36.348698 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="c04158de-6693-44c6-82d3-198e545eccfb" containerName="registry-server" Dec 02 07:26:36 crc kubenswrapper[4895]: E1202 07:26:36.348712 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c04158de-6693-44c6-82d3-198e545eccfb" containerName="extract-utilities" Dec 02 07:26:36 crc kubenswrapper[4895]: I1202 07:26:36.348773 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="c04158de-6693-44c6-82d3-198e545eccfb" containerName="extract-utilities" Dec 02 07:26:36 crc kubenswrapper[4895]: E1202 07:26:36.348788 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="873d08e2-d2e6-4785-b807-3b50d758d136" 
containerName="extract-content" Dec 02 07:26:36 crc kubenswrapper[4895]: I1202 07:26:36.348798 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="873d08e2-d2e6-4785-b807-3b50d758d136" containerName="extract-content" Dec 02 07:26:36 crc kubenswrapper[4895]: E1202 07:26:36.348848 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6040fed-14e6-49a5-802b-e49bfeba7aa5" containerName="extract-content" Dec 02 07:26:36 crc kubenswrapper[4895]: I1202 07:26:36.348859 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6040fed-14e6-49a5-802b-e49bfeba7aa5" containerName="extract-content" Dec 02 07:26:36 crc kubenswrapper[4895]: E1202 07:26:36.348872 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="adee7e4a-d71b-4efc-b3fa-6e3ece833722" containerName="route-controller-manager" Dec 02 07:26:36 crc kubenswrapper[4895]: I1202 07:26:36.348882 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="adee7e4a-d71b-4efc-b3fa-6e3ece833722" containerName="route-controller-manager" Dec 02 07:26:36 crc kubenswrapper[4895]: E1202 07:26:36.348926 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="459e9e10-320f-47aa-901c-cfd9ec241a67" containerName="extract-utilities" Dec 02 07:26:36 crc kubenswrapper[4895]: I1202 07:26:36.348937 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="459e9e10-320f-47aa-901c-cfd9ec241a67" containerName="extract-utilities" Dec 02 07:26:36 crc kubenswrapper[4895]: E1202 07:26:36.348958 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="902734d2-7cd4-4f9b-ab9e-72cc5d8f3b73" containerName="controller-manager" Dec 02 07:26:36 crc kubenswrapper[4895]: I1202 07:26:36.349002 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="902734d2-7cd4-4f9b-ab9e-72cc5d8f3b73" containerName="controller-manager" Dec 02 07:26:36 crc kubenswrapper[4895]: E1202 07:26:36.349027 4895 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="e6040fed-14e6-49a5-802b-e49bfeba7aa5" containerName="extract-utilities" Dec 02 07:26:36 crc kubenswrapper[4895]: I1202 07:26:36.349039 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6040fed-14e6-49a5-802b-e49bfeba7aa5" containerName="extract-utilities" Dec 02 07:26:36 crc kubenswrapper[4895]: E1202 07:26:36.349079 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="873d08e2-d2e6-4785-b807-3b50d758d136" containerName="registry-server" Dec 02 07:26:36 crc kubenswrapper[4895]: I1202 07:26:36.349093 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="873d08e2-d2e6-4785-b807-3b50d758d136" containerName="registry-server" Dec 02 07:26:36 crc kubenswrapper[4895]: E1202 07:26:36.349111 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c04158de-6693-44c6-82d3-198e545eccfb" containerName="extract-content" Dec 02 07:26:36 crc kubenswrapper[4895]: I1202 07:26:36.349123 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="c04158de-6693-44c6-82d3-198e545eccfb" containerName="extract-content" Dec 02 07:26:36 crc kubenswrapper[4895]: E1202 07:26:36.349176 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6040fed-14e6-49a5-802b-e49bfeba7aa5" containerName="registry-server" Dec 02 07:26:36 crc kubenswrapper[4895]: I1202 07:26:36.349189 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6040fed-14e6-49a5-802b-e49bfeba7aa5" containerName="registry-server" Dec 02 07:26:36 crc kubenswrapper[4895]: E1202 07:26:36.349200 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="459e9e10-320f-47aa-901c-cfd9ec241a67" containerName="registry-server" Dec 02 07:26:36 crc kubenswrapper[4895]: I1202 07:26:36.349210 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="459e9e10-320f-47aa-901c-cfd9ec241a67" containerName="registry-server" Dec 02 07:26:36 crc kubenswrapper[4895]: I1202 07:26:36.349479 4895 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="873d08e2-d2e6-4785-b807-3b50d758d136" containerName="registry-server" Dec 02 07:26:36 crc kubenswrapper[4895]: I1202 07:26:36.349503 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6040fed-14e6-49a5-802b-e49bfeba7aa5" containerName="registry-server" Dec 02 07:26:36 crc kubenswrapper[4895]: I1202 07:26:36.349519 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="902734d2-7cd4-4f9b-ab9e-72cc5d8f3b73" containerName="controller-manager" Dec 02 07:26:36 crc kubenswrapper[4895]: I1202 07:26:36.349534 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="459e9e10-320f-47aa-901c-cfd9ec241a67" containerName="registry-server" Dec 02 07:26:36 crc kubenswrapper[4895]: I1202 07:26:36.349553 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="c04158de-6693-44c6-82d3-198e545eccfb" containerName="registry-server" Dec 02 07:26:36 crc kubenswrapper[4895]: I1202 07:26:36.349565 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="adee7e4a-d71b-4efc-b3fa-6e3ece833722" containerName="route-controller-manager" Dec 02 07:26:36 crc kubenswrapper[4895]: I1202 07:26:36.350420 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-67485fb985-x7mxq" Dec 02 07:26:36 crc kubenswrapper[4895]: I1202 07:26:36.355106 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7f7bf5bf79-2xct5"] Dec 02 07:26:36 crc kubenswrapper[4895]: I1202 07:26:36.356698 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7f7bf5bf79-2xct5" Dec 02 07:26:36 crc kubenswrapper[4895]: I1202 07:26:36.359342 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 02 07:26:36 crc kubenswrapper[4895]: I1202 07:26:36.359818 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 02 07:26:36 crc kubenswrapper[4895]: I1202 07:26:36.360617 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 02 07:26:36 crc kubenswrapper[4895]: I1202 07:26:36.360775 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 02 07:26:36 crc kubenswrapper[4895]: I1202 07:26:36.361019 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 02 07:26:36 crc kubenswrapper[4895]: I1202 07:26:36.363703 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 02 07:26:36 crc kubenswrapper[4895]: I1202 07:26:36.365017 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 02 07:26:36 crc kubenswrapper[4895]: I1202 07:26:36.365199 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 02 07:26:36 crc kubenswrapper[4895]: I1202 07:26:36.365343 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 02 07:26:36 crc kubenswrapper[4895]: I1202 07:26:36.365579 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 
02 07:26:36 crc kubenswrapper[4895]: I1202 07:26:36.366314 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 02 07:26:36 crc kubenswrapper[4895]: I1202 07:26:36.366392 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 02 07:26:36 crc kubenswrapper[4895]: I1202 07:26:36.394713 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67485fb985-x7mxq"] Dec 02 07:26:36 crc kubenswrapper[4895]: I1202 07:26:36.395139 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 02 07:26:36 crc kubenswrapper[4895]: I1202 07:26:36.404848 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7f7bf5bf79-2xct5"] Dec 02 07:26:36 crc kubenswrapper[4895]: I1202 07:26:36.470285 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-446vz\" (UniqueName: \"kubernetes.io/projected/2d8db6b7-b02b-49af-90df-46c045e6664e-kube-api-access-446vz\") pod \"route-controller-manager-67485fb985-x7mxq\" (UID: \"2d8db6b7-b02b-49af-90df-46c045e6664e\") " pod="openshift-route-controller-manager/route-controller-manager-67485fb985-x7mxq" Dec 02 07:26:36 crc kubenswrapper[4895]: I1202 07:26:36.470367 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ba69ea60-ce66-4a59-8d92-a869ffdc3bf8-serving-cert\") pod \"controller-manager-7f7bf5bf79-2xct5\" (UID: \"ba69ea60-ce66-4a59-8d92-a869ffdc3bf8\") " pod="openshift-controller-manager/controller-manager-7f7bf5bf79-2xct5" Dec 02 07:26:36 crc kubenswrapper[4895]: I1202 07:26:36.470429 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ba69ea60-ce66-4a59-8d92-a869ffdc3bf8-proxy-ca-bundles\") pod \"controller-manager-7f7bf5bf79-2xct5\" (UID: \"ba69ea60-ce66-4a59-8d92-a869ffdc3bf8\") " pod="openshift-controller-manager/controller-manager-7f7bf5bf79-2xct5" Dec 02 07:26:36 crc kubenswrapper[4895]: I1202 07:26:36.470456 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2d8db6b7-b02b-49af-90df-46c045e6664e-client-ca\") pod \"route-controller-manager-67485fb985-x7mxq\" (UID: \"2d8db6b7-b02b-49af-90df-46c045e6664e\") " pod="openshift-route-controller-manager/route-controller-manager-67485fb985-x7mxq" Dec 02 07:26:36 crc kubenswrapper[4895]: I1202 07:26:36.470516 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba69ea60-ce66-4a59-8d92-a869ffdc3bf8-config\") pod \"controller-manager-7f7bf5bf79-2xct5\" (UID: \"ba69ea60-ce66-4a59-8d92-a869ffdc3bf8\") " pod="openshift-controller-manager/controller-manager-7f7bf5bf79-2xct5" Dec 02 07:26:36 crc kubenswrapper[4895]: I1202 07:26:36.470536 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d8db6b7-b02b-49af-90df-46c045e6664e-config\") pod \"route-controller-manager-67485fb985-x7mxq\" (UID: \"2d8db6b7-b02b-49af-90df-46c045e6664e\") " pod="openshift-route-controller-manager/route-controller-manager-67485fb985-x7mxq" Dec 02 07:26:36 crc kubenswrapper[4895]: I1202 07:26:36.470566 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ba69ea60-ce66-4a59-8d92-a869ffdc3bf8-client-ca\") pod \"controller-manager-7f7bf5bf79-2xct5\" (UID: \"ba69ea60-ce66-4a59-8d92-a869ffdc3bf8\") " 
pod="openshift-controller-manager/controller-manager-7f7bf5bf79-2xct5" Dec 02 07:26:36 crc kubenswrapper[4895]: I1202 07:26:36.470662 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhg5m\" (UniqueName: \"kubernetes.io/projected/ba69ea60-ce66-4a59-8d92-a869ffdc3bf8-kube-api-access-hhg5m\") pod \"controller-manager-7f7bf5bf79-2xct5\" (UID: \"ba69ea60-ce66-4a59-8d92-a869ffdc3bf8\") " pod="openshift-controller-manager/controller-manager-7f7bf5bf79-2xct5" Dec 02 07:26:36 crc kubenswrapper[4895]: I1202 07:26:36.470717 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2d8db6b7-b02b-49af-90df-46c045e6664e-serving-cert\") pod \"route-controller-manager-67485fb985-x7mxq\" (UID: \"2d8db6b7-b02b-49af-90df-46c045e6664e\") " pod="openshift-route-controller-manager/route-controller-manager-67485fb985-x7mxq" Dec 02 07:26:36 crc kubenswrapper[4895]: I1202 07:26:36.572556 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2d8db6b7-b02b-49af-90df-46c045e6664e-serving-cert\") pod \"route-controller-manager-67485fb985-x7mxq\" (UID: \"2d8db6b7-b02b-49af-90df-46c045e6664e\") " pod="openshift-route-controller-manager/route-controller-manager-67485fb985-x7mxq" Dec 02 07:26:36 crc kubenswrapper[4895]: I1202 07:26:36.573090 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-446vz\" (UniqueName: \"kubernetes.io/projected/2d8db6b7-b02b-49af-90df-46c045e6664e-kube-api-access-446vz\") pod \"route-controller-manager-67485fb985-x7mxq\" (UID: \"2d8db6b7-b02b-49af-90df-46c045e6664e\") " pod="openshift-route-controller-manager/route-controller-manager-67485fb985-x7mxq" Dec 02 07:26:36 crc kubenswrapper[4895]: I1202 07:26:36.573125 4895 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ba69ea60-ce66-4a59-8d92-a869ffdc3bf8-serving-cert\") pod \"controller-manager-7f7bf5bf79-2xct5\" (UID: \"ba69ea60-ce66-4a59-8d92-a869ffdc3bf8\") " pod="openshift-controller-manager/controller-manager-7f7bf5bf79-2xct5" Dec 02 07:26:36 crc kubenswrapper[4895]: I1202 07:26:36.573154 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ba69ea60-ce66-4a59-8d92-a869ffdc3bf8-proxy-ca-bundles\") pod \"controller-manager-7f7bf5bf79-2xct5\" (UID: \"ba69ea60-ce66-4a59-8d92-a869ffdc3bf8\") " pod="openshift-controller-manager/controller-manager-7f7bf5bf79-2xct5" Dec 02 07:26:36 crc kubenswrapper[4895]: I1202 07:26:36.573186 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2d8db6b7-b02b-49af-90df-46c045e6664e-client-ca\") pod \"route-controller-manager-67485fb985-x7mxq\" (UID: \"2d8db6b7-b02b-49af-90df-46c045e6664e\") " pod="openshift-route-controller-manager/route-controller-manager-67485fb985-x7mxq" Dec 02 07:26:36 crc kubenswrapper[4895]: I1202 07:26:36.573628 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba69ea60-ce66-4a59-8d92-a869ffdc3bf8-config\") pod \"controller-manager-7f7bf5bf79-2xct5\" (UID: \"ba69ea60-ce66-4a59-8d92-a869ffdc3bf8\") " pod="openshift-controller-manager/controller-manager-7f7bf5bf79-2xct5" Dec 02 07:26:36 crc kubenswrapper[4895]: I1202 07:26:36.573652 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d8db6b7-b02b-49af-90df-46c045e6664e-config\") pod \"route-controller-manager-67485fb985-x7mxq\" (UID: \"2d8db6b7-b02b-49af-90df-46c045e6664e\") " pod="openshift-route-controller-manager/route-controller-manager-67485fb985-x7mxq" Dec 02 07:26:36 crc 
kubenswrapper[4895]: I1202 07:26:36.573721 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ba69ea60-ce66-4a59-8d92-a869ffdc3bf8-client-ca\") pod \"controller-manager-7f7bf5bf79-2xct5\" (UID: \"ba69ea60-ce66-4a59-8d92-a869ffdc3bf8\") " pod="openshift-controller-manager/controller-manager-7f7bf5bf79-2xct5" Dec 02 07:26:36 crc kubenswrapper[4895]: I1202 07:26:36.573805 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hhg5m\" (UniqueName: \"kubernetes.io/projected/ba69ea60-ce66-4a59-8d92-a869ffdc3bf8-kube-api-access-hhg5m\") pod \"controller-manager-7f7bf5bf79-2xct5\" (UID: \"ba69ea60-ce66-4a59-8d92-a869ffdc3bf8\") " pod="openshift-controller-manager/controller-manager-7f7bf5bf79-2xct5" Dec 02 07:26:36 crc kubenswrapper[4895]: I1202 07:26:36.575171 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ba69ea60-ce66-4a59-8d92-a869ffdc3bf8-client-ca\") pod \"controller-manager-7f7bf5bf79-2xct5\" (UID: \"ba69ea60-ce66-4a59-8d92-a869ffdc3bf8\") " pod="openshift-controller-manager/controller-manager-7f7bf5bf79-2xct5" Dec 02 07:26:36 crc kubenswrapper[4895]: I1202 07:26:36.575207 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ba69ea60-ce66-4a59-8d92-a869ffdc3bf8-proxy-ca-bundles\") pod \"controller-manager-7f7bf5bf79-2xct5\" (UID: \"ba69ea60-ce66-4a59-8d92-a869ffdc3bf8\") " pod="openshift-controller-manager/controller-manager-7f7bf5bf79-2xct5" Dec 02 07:26:36 crc kubenswrapper[4895]: I1202 07:26:36.582802 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d8db6b7-b02b-49af-90df-46c045e6664e-config\") pod \"route-controller-manager-67485fb985-x7mxq\" (UID: \"2d8db6b7-b02b-49af-90df-46c045e6664e\") " 
pod="openshift-route-controller-manager/route-controller-manager-67485fb985-x7mxq" Dec 02 07:26:36 crc kubenswrapper[4895]: I1202 07:26:36.584960 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba69ea60-ce66-4a59-8d92-a869ffdc3bf8-config\") pod \"controller-manager-7f7bf5bf79-2xct5\" (UID: \"ba69ea60-ce66-4a59-8d92-a869ffdc3bf8\") " pod="openshift-controller-manager/controller-manager-7f7bf5bf79-2xct5" Dec 02 07:26:36 crc kubenswrapper[4895]: I1202 07:26:36.586029 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2d8db6b7-b02b-49af-90df-46c045e6664e-client-ca\") pod \"route-controller-manager-67485fb985-x7mxq\" (UID: \"2d8db6b7-b02b-49af-90df-46c045e6664e\") " pod="openshift-route-controller-manager/route-controller-manager-67485fb985-x7mxq" Dec 02 07:26:36 crc kubenswrapper[4895]: I1202 07:26:36.587930 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ba69ea60-ce66-4a59-8d92-a869ffdc3bf8-serving-cert\") pod \"controller-manager-7f7bf5bf79-2xct5\" (UID: \"ba69ea60-ce66-4a59-8d92-a869ffdc3bf8\") " pod="openshift-controller-manager/controller-manager-7f7bf5bf79-2xct5" Dec 02 07:26:36 crc kubenswrapper[4895]: I1202 07:26:36.588278 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2d8db6b7-b02b-49af-90df-46c045e6664e-serving-cert\") pod \"route-controller-manager-67485fb985-x7mxq\" (UID: \"2d8db6b7-b02b-49af-90df-46c045e6664e\") " pod="openshift-route-controller-manager/route-controller-manager-67485fb985-x7mxq" Dec 02 07:26:36 crc kubenswrapper[4895]: I1202 07:26:36.593696 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhg5m\" (UniqueName: \"kubernetes.io/projected/ba69ea60-ce66-4a59-8d92-a869ffdc3bf8-kube-api-access-hhg5m\") 
pod \"controller-manager-7f7bf5bf79-2xct5\" (UID: \"ba69ea60-ce66-4a59-8d92-a869ffdc3bf8\") " pod="openshift-controller-manager/controller-manager-7f7bf5bf79-2xct5" Dec 02 07:26:36 crc kubenswrapper[4895]: I1202 07:26:36.598198 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-446vz\" (UniqueName: \"kubernetes.io/projected/2d8db6b7-b02b-49af-90df-46c045e6664e-kube-api-access-446vz\") pod \"route-controller-manager-67485fb985-x7mxq\" (UID: \"2d8db6b7-b02b-49af-90df-46c045e6664e\") " pod="openshift-route-controller-manager/route-controller-manager-67485fb985-x7mxq" Dec 02 07:26:36 crc kubenswrapper[4895]: I1202 07:26:36.688801 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-67485fb985-x7mxq" Dec 02 07:26:36 crc kubenswrapper[4895]: I1202 07:26:36.701828 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7f7bf5bf79-2xct5" Dec 02 07:26:36 crc kubenswrapper[4895]: I1202 07:26:36.964463 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7f7bf5bf79-2xct5"] Dec 02 07:26:37 crc kubenswrapper[4895]: I1202 07:26:37.004869 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67485fb985-x7mxq"] Dec 02 07:26:37 crc kubenswrapper[4895]: I1202 07:26:37.149819 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="459e9e10-320f-47aa-901c-cfd9ec241a67" path="/var/lib/kubelet/pods/459e9e10-320f-47aa-901c-cfd9ec241a67/volumes" Dec 02 07:26:37 crc kubenswrapper[4895]: I1202 07:26:37.150988 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="902734d2-7cd4-4f9b-ab9e-72cc5d8f3b73" path="/var/lib/kubelet/pods/902734d2-7cd4-4f9b-ab9e-72cc5d8f3b73/volumes" Dec 02 07:26:37 crc kubenswrapper[4895]: I1202 07:26:37.151580 
4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="adee7e4a-d71b-4efc-b3fa-6e3ece833722" path="/var/lib/kubelet/pods/adee7e4a-d71b-4efc-b3fa-6e3ece833722/volumes" Dec 02 07:26:37 crc kubenswrapper[4895]: I1202 07:26:37.260019 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7f7bf5bf79-2xct5" event={"ID":"ba69ea60-ce66-4a59-8d92-a869ffdc3bf8","Type":"ContainerStarted","Data":"9d2edb809dc84d73735ad47ca8e65e7a58caacf5283a1fd3411f34c3d78955bb"} Dec 02 07:26:37 crc kubenswrapper[4895]: I1202 07:26:37.260071 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7f7bf5bf79-2xct5" event={"ID":"ba69ea60-ce66-4a59-8d92-a869ffdc3bf8","Type":"ContainerStarted","Data":"a32ed8e04ce3bab36ce56026a911cc3a40d338ab3ae9607b06e46008ecd251dc"} Dec 02 07:26:37 crc kubenswrapper[4895]: I1202 07:26:37.260911 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7f7bf5bf79-2xct5" Dec 02 07:26:37 crc kubenswrapper[4895]: I1202 07:26:37.262289 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-67485fb985-x7mxq" event={"ID":"2d8db6b7-b02b-49af-90df-46c045e6664e","Type":"ContainerStarted","Data":"614027919b01bfb6878f55f3f43287491f1c942103505d3c909172a2e34c9abb"} Dec 02 07:26:37 crc kubenswrapper[4895]: I1202 07:26:37.262322 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-67485fb985-x7mxq" event={"ID":"2d8db6b7-b02b-49af-90df-46c045e6664e","Type":"ContainerStarted","Data":"6a38869b457fef8bdb9952b530f496a26a0202edfde6739d248f662d65a7b213"} Dec 02 07:26:37 crc kubenswrapper[4895]: I1202 07:26:37.263186 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-67485fb985-x7mxq" Dec 02 
07:26:37 crc kubenswrapper[4895]: I1202 07:26:37.272156 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7f7bf5bf79-2xct5" Dec 02 07:26:37 crc kubenswrapper[4895]: I1202 07:26:37.282552 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7f7bf5bf79-2xct5" podStartSLOduration=3.282528722 podStartE2EDuration="3.282528722s" podCreationTimestamp="2025-12-02 07:26:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:26:37.281413819 +0000 UTC m=+208.452273482" watchObservedRunningTime="2025-12-02 07:26:37.282528722 +0000 UTC m=+208.453388335" Dec 02 07:26:37 crc kubenswrapper[4895]: I1202 07:26:37.302496 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-67485fb985-x7mxq" podStartSLOduration=3.302460543 podStartE2EDuration="3.302460543s" podCreationTimestamp="2025-12-02 07:26:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:26:37.301118333 +0000 UTC m=+208.471977966" watchObservedRunningTime="2025-12-02 07:26:37.302460543 +0000 UTC m=+208.473320166" Dec 02 07:26:37 crc kubenswrapper[4895]: I1202 07:26:37.443614 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-67485fb985-x7mxq" Dec 02 07:26:45 crc kubenswrapper[4895]: I1202 07:26:45.909449 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-nv29v" podUID="1adeefcd-490c-4913-8315-baa7dbc1e7a9" containerName="oauth-openshift" containerID="cri-o://74f5fc16818348c26930385a8db0d71ea56ec440354160d2f376b748aa55b78e" 
gracePeriod=15 Dec 02 07:26:46 crc kubenswrapper[4895]: I1202 07:26:46.326559 4895 generic.go:334] "Generic (PLEG): container finished" podID="1adeefcd-490c-4913-8315-baa7dbc1e7a9" containerID="74f5fc16818348c26930385a8db0d71ea56ec440354160d2f376b748aa55b78e" exitCode=0 Dec 02 07:26:46 crc kubenswrapper[4895]: I1202 07:26:46.326894 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-nv29v" event={"ID":"1adeefcd-490c-4913-8315-baa7dbc1e7a9","Type":"ContainerDied","Data":"74f5fc16818348c26930385a8db0d71ea56ec440354160d2f376b748aa55b78e"} Dec 02 07:26:46 crc kubenswrapper[4895]: I1202 07:26:46.406063 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-nv29v" Dec 02 07:26:46 crc kubenswrapper[4895]: I1202 07:26:46.444984 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1adeefcd-490c-4913-8315-baa7dbc1e7a9-v4-0-config-system-trusted-ca-bundle\") pod \"1adeefcd-490c-4913-8315-baa7dbc1e7a9\" (UID: \"1adeefcd-490c-4913-8315-baa7dbc1e7a9\") " Dec 02 07:26:46 crc kubenswrapper[4895]: I1202 07:26:46.445044 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1adeefcd-490c-4913-8315-baa7dbc1e7a9-audit-policies\") pod \"1adeefcd-490c-4913-8315-baa7dbc1e7a9\" (UID: \"1adeefcd-490c-4913-8315-baa7dbc1e7a9\") " Dec 02 07:26:46 crc kubenswrapper[4895]: I1202 07:26:46.445080 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/1adeefcd-490c-4913-8315-baa7dbc1e7a9-v4-0-config-system-serving-cert\") pod \"1adeefcd-490c-4913-8315-baa7dbc1e7a9\" (UID: \"1adeefcd-490c-4913-8315-baa7dbc1e7a9\") " Dec 02 07:26:46 crc kubenswrapper[4895]: I1202 
07:26:46.445113 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1adeefcd-490c-4913-8315-baa7dbc1e7a9-audit-dir\") pod \"1adeefcd-490c-4913-8315-baa7dbc1e7a9\" (UID: \"1adeefcd-490c-4913-8315-baa7dbc1e7a9\") " Dec 02 07:26:46 crc kubenswrapper[4895]: I1202 07:26:46.445150 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/1adeefcd-490c-4913-8315-baa7dbc1e7a9-v4-0-config-system-router-certs\") pod \"1adeefcd-490c-4913-8315-baa7dbc1e7a9\" (UID: \"1adeefcd-490c-4913-8315-baa7dbc1e7a9\") " Dec 02 07:26:46 crc kubenswrapper[4895]: I1202 07:26:46.445179 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/1adeefcd-490c-4913-8315-baa7dbc1e7a9-v4-0-config-user-template-login\") pod \"1adeefcd-490c-4913-8315-baa7dbc1e7a9\" (UID: \"1adeefcd-490c-4913-8315-baa7dbc1e7a9\") " Dec 02 07:26:46 crc kubenswrapper[4895]: I1202 07:26:46.445222 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/1adeefcd-490c-4913-8315-baa7dbc1e7a9-v4-0-config-system-session\") pod \"1adeefcd-490c-4913-8315-baa7dbc1e7a9\" (UID: \"1adeefcd-490c-4913-8315-baa7dbc1e7a9\") " Dec 02 07:26:46 crc kubenswrapper[4895]: I1202 07:26:46.445404 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zt24s\" (UniqueName: \"kubernetes.io/projected/1adeefcd-490c-4913-8315-baa7dbc1e7a9-kube-api-access-zt24s\") pod \"1adeefcd-490c-4913-8315-baa7dbc1e7a9\" (UID: \"1adeefcd-490c-4913-8315-baa7dbc1e7a9\") " Dec 02 07:26:46 crc kubenswrapper[4895]: I1202 07:26:46.446481 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/1adeefcd-490c-4913-8315-baa7dbc1e7a9-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "1adeefcd-490c-4913-8315-baa7dbc1e7a9" (UID: "1adeefcd-490c-4913-8315-baa7dbc1e7a9"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:26:46 crc kubenswrapper[4895]: I1202 07:26:46.446532 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/1adeefcd-490c-4913-8315-baa7dbc1e7a9-v4-0-config-system-service-ca\") pod \"1adeefcd-490c-4913-8315-baa7dbc1e7a9\" (UID: \"1adeefcd-490c-4913-8315-baa7dbc1e7a9\") " Dec 02 07:26:46 crc kubenswrapper[4895]: I1202 07:26:46.446595 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/1adeefcd-490c-4913-8315-baa7dbc1e7a9-v4-0-config-system-cliconfig\") pod \"1adeefcd-490c-4913-8315-baa7dbc1e7a9\" (UID: \"1adeefcd-490c-4913-8315-baa7dbc1e7a9\") " Dec 02 07:26:46 crc kubenswrapper[4895]: I1202 07:26:46.446629 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/1adeefcd-490c-4913-8315-baa7dbc1e7a9-v4-0-config-user-template-provider-selection\") pod \"1adeefcd-490c-4913-8315-baa7dbc1e7a9\" (UID: \"1adeefcd-490c-4913-8315-baa7dbc1e7a9\") " Dec 02 07:26:46 crc kubenswrapper[4895]: I1202 07:26:46.446659 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/1adeefcd-490c-4913-8315-baa7dbc1e7a9-v4-0-config-user-template-error\") pod \"1adeefcd-490c-4913-8315-baa7dbc1e7a9\" (UID: \"1adeefcd-490c-4913-8315-baa7dbc1e7a9\") " Dec 02 07:26:46 crc kubenswrapper[4895]: I1202 07:26:46.446687 4895 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/1adeefcd-490c-4913-8315-baa7dbc1e7a9-v4-0-config-system-ocp-branding-template\") pod \"1adeefcd-490c-4913-8315-baa7dbc1e7a9\" (UID: \"1adeefcd-490c-4913-8315-baa7dbc1e7a9\") " Dec 02 07:26:46 crc kubenswrapper[4895]: I1202 07:26:46.446711 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/1adeefcd-490c-4913-8315-baa7dbc1e7a9-v4-0-config-user-idp-0-file-data\") pod \"1adeefcd-490c-4913-8315-baa7dbc1e7a9\" (UID: \"1adeefcd-490c-4913-8315-baa7dbc1e7a9\") " Dec 02 07:26:46 crc kubenswrapper[4895]: I1202 07:26:46.447313 4895 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1adeefcd-490c-4913-8315-baa7dbc1e7a9-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 07:26:46 crc kubenswrapper[4895]: I1202 07:26:46.446540 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1adeefcd-490c-4913-8315-baa7dbc1e7a9-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "1adeefcd-490c-4913-8315-baa7dbc1e7a9" (UID: "1adeefcd-490c-4913-8315-baa7dbc1e7a9"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 07:26:46 crc kubenswrapper[4895]: I1202 07:26:46.446717 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1adeefcd-490c-4913-8315-baa7dbc1e7a9-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "1adeefcd-490c-4913-8315-baa7dbc1e7a9" (UID: "1adeefcd-490c-4913-8315-baa7dbc1e7a9"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:26:46 crc kubenswrapper[4895]: I1202 07:26:46.447060 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1adeefcd-490c-4913-8315-baa7dbc1e7a9-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "1adeefcd-490c-4913-8315-baa7dbc1e7a9" (UID: "1adeefcd-490c-4913-8315-baa7dbc1e7a9"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:26:46 crc kubenswrapper[4895]: I1202 07:26:46.447073 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1adeefcd-490c-4913-8315-baa7dbc1e7a9-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "1adeefcd-490c-4913-8315-baa7dbc1e7a9" (UID: "1adeefcd-490c-4913-8315-baa7dbc1e7a9"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:26:46 crc kubenswrapper[4895]: I1202 07:26:46.452542 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1adeefcd-490c-4913-8315-baa7dbc1e7a9-kube-api-access-zt24s" (OuterVolumeSpecName: "kube-api-access-zt24s") pod "1adeefcd-490c-4913-8315-baa7dbc1e7a9" (UID: "1adeefcd-490c-4913-8315-baa7dbc1e7a9"). InnerVolumeSpecName "kube-api-access-zt24s". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:26:46 crc kubenswrapper[4895]: I1202 07:26:46.457002 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1adeefcd-490c-4913-8315-baa7dbc1e7a9-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "1adeefcd-490c-4913-8315-baa7dbc1e7a9" (UID: "1adeefcd-490c-4913-8315-baa7dbc1e7a9"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:26:46 crc kubenswrapper[4895]: I1202 07:26:46.457127 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1adeefcd-490c-4913-8315-baa7dbc1e7a9-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "1adeefcd-490c-4913-8315-baa7dbc1e7a9" (UID: "1adeefcd-490c-4913-8315-baa7dbc1e7a9"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:26:46 crc kubenswrapper[4895]: I1202 07:26:46.457401 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1adeefcd-490c-4913-8315-baa7dbc1e7a9-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "1adeefcd-490c-4913-8315-baa7dbc1e7a9" (UID: "1adeefcd-490c-4913-8315-baa7dbc1e7a9"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:26:46 crc kubenswrapper[4895]: I1202 07:26:46.457624 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1adeefcd-490c-4913-8315-baa7dbc1e7a9-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "1adeefcd-490c-4913-8315-baa7dbc1e7a9" (UID: "1adeefcd-490c-4913-8315-baa7dbc1e7a9"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:26:46 crc kubenswrapper[4895]: I1202 07:26:46.457947 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1adeefcd-490c-4913-8315-baa7dbc1e7a9-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "1adeefcd-490c-4913-8315-baa7dbc1e7a9" (UID: "1adeefcd-490c-4913-8315-baa7dbc1e7a9"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:26:46 crc kubenswrapper[4895]: I1202 07:26:46.458225 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1adeefcd-490c-4913-8315-baa7dbc1e7a9-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "1adeefcd-490c-4913-8315-baa7dbc1e7a9" (UID: "1adeefcd-490c-4913-8315-baa7dbc1e7a9"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:26:46 crc kubenswrapper[4895]: I1202 07:26:46.459450 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1adeefcd-490c-4913-8315-baa7dbc1e7a9-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "1adeefcd-490c-4913-8315-baa7dbc1e7a9" (UID: "1adeefcd-490c-4913-8315-baa7dbc1e7a9"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:26:46 crc kubenswrapper[4895]: I1202 07:26:46.464945 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1adeefcd-490c-4913-8315-baa7dbc1e7a9-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "1adeefcd-490c-4913-8315-baa7dbc1e7a9" (UID: "1adeefcd-490c-4913-8315-baa7dbc1e7a9"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:26:46 crc kubenswrapper[4895]: I1202 07:26:46.549413 4895 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/1adeefcd-490c-4913-8315-baa7dbc1e7a9-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Dec 02 07:26:46 crc kubenswrapper[4895]: I1202 07:26:46.549453 4895 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/1adeefcd-490c-4913-8315-baa7dbc1e7a9-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 02 07:26:46 crc kubenswrapper[4895]: I1202 07:26:46.549466 4895 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/1adeefcd-490c-4913-8315-baa7dbc1e7a9-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 02 07:26:46 crc kubenswrapper[4895]: I1202 07:26:46.549481 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zt24s\" (UniqueName: \"kubernetes.io/projected/1adeefcd-490c-4913-8315-baa7dbc1e7a9-kube-api-access-zt24s\") on node \"crc\" DevicePath \"\"" Dec 02 07:26:46 crc kubenswrapper[4895]: I1202 07:26:46.551722 4895 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/1adeefcd-490c-4913-8315-baa7dbc1e7a9-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 02 07:26:46 crc kubenswrapper[4895]: I1202 07:26:46.551820 4895 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/1adeefcd-490c-4913-8315-baa7dbc1e7a9-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 02 07:26:46 crc kubenswrapper[4895]: I1202 07:26:46.551842 4895 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/1adeefcd-490c-4913-8315-baa7dbc1e7a9-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 02 07:26:46 crc kubenswrapper[4895]: I1202 07:26:46.551862 4895 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/1adeefcd-490c-4913-8315-baa7dbc1e7a9-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 02 07:26:46 crc kubenswrapper[4895]: I1202 07:26:46.551890 4895 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/1adeefcd-490c-4913-8315-baa7dbc1e7a9-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Dec 02 07:26:46 crc kubenswrapper[4895]: I1202 07:26:46.551906 4895 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/1adeefcd-490c-4913-8315-baa7dbc1e7a9-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 02 07:26:46 crc kubenswrapper[4895]: I1202 07:26:46.551921 4895 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1adeefcd-490c-4913-8315-baa7dbc1e7a9-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 02 07:26:46 crc kubenswrapper[4895]: I1202 07:26:46.551938 4895 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/1adeefcd-490c-4913-8315-baa7dbc1e7a9-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 07:26:46 crc kubenswrapper[4895]: I1202 07:26:46.551962 4895 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1adeefcd-490c-4913-8315-baa7dbc1e7a9-audit-dir\") on node \"crc\" DevicePath \"\"" Dec 02 07:26:47 crc kubenswrapper[4895]: I1202 07:26:47.338091 4895 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-authentication/oauth-openshift-558db77b4-nv29v" event={"ID":"1adeefcd-490c-4913-8315-baa7dbc1e7a9","Type":"ContainerDied","Data":"afa32e77707ff8dbc0a3a6ca876066c0dac9e5d1e8c58df2650ed95b67c222f0"} Dec 02 07:26:47 crc kubenswrapper[4895]: I1202 07:26:47.338206 4895 scope.go:117] "RemoveContainer" containerID="74f5fc16818348c26930385a8db0d71ea56ec440354160d2f376b748aa55b78e" Dec 02 07:26:47 crc kubenswrapper[4895]: I1202 07:26:47.338235 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-nv29v" Dec 02 07:26:47 crc kubenswrapper[4895]: I1202 07:26:47.374488 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-nv29v"] Dec 02 07:26:47 crc kubenswrapper[4895]: I1202 07:26:47.379494 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-nv29v"] Dec 02 07:26:49 crc kubenswrapper[4895]: I1202 07:26:49.147350 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1adeefcd-490c-4913-8315-baa7dbc1e7a9" path="/var/lib/kubelet/pods/1adeefcd-490c-4913-8315-baa7dbc1e7a9/volumes" Dec 02 07:26:51 crc kubenswrapper[4895]: I1202 07:26:51.351479 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-7f8f9bcd8d-xp9f9"] Dec 02 07:26:51 crc kubenswrapper[4895]: E1202 07:26:51.353170 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1adeefcd-490c-4913-8315-baa7dbc1e7a9" containerName="oauth-openshift" Dec 02 07:26:51 crc kubenswrapper[4895]: I1202 07:26:51.353288 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="1adeefcd-490c-4913-8315-baa7dbc1e7a9" containerName="oauth-openshift" Dec 02 07:26:51 crc kubenswrapper[4895]: I1202 07:26:51.353516 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="1adeefcd-490c-4913-8315-baa7dbc1e7a9" containerName="oauth-openshift" 
Dec 02 07:26:51 crc kubenswrapper[4895]: I1202 07:26:51.354217 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-7f8f9bcd8d-xp9f9" Dec 02 07:26:51 crc kubenswrapper[4895]: I1202 07:26:51.357803 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 02 07:26:51 crc kubenswrapper[4895]: I1202 07:26:51.358299 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 02 07:26:51 crc kubenswrapper[4895]: I1202 07:26:51.358305 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 02 07:26:51 crc kubenswrapper[4895]: I1202 07:26:51.358310 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 02 07:26:51 crc kubenswrapper[4895]: I1202 07:26:51.358682 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 02 07:26:51 crc kubenswrapper[4895]: I1202 07:26:51.358927 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 02 07:26:51 crc kubenswrapper[4895]: I1202 07:26:51.359672 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 02 07:26:51 crc kubenswrapper[4895]: I1202 07:26:51.360713 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 02 07:26:51 crc kubenswrapper[4895]: I1202 07:26:51.360877 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 02 07:26:51 crc kubenswrapper[4895]: I1202 07:26:51.361293 4895 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Dec 02 07:26:51 crc kubenswrapper[4895]: I1202 07:26:51.361694 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 02 07:26:51 crc kubenswrapper[4895]: I1202 07:26:51.366600 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 02 07:26:51 crc kubenswrapper[4895]: I1202 07:26:51.370091 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 02 07:26:51 crc kubenswrapper[4895]: I1202 07:26:51.371100 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 02 07:26:51 crc kubenswrapper[4895]: I1202 07:26:51.381946 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 02 07:26:51 crc kubenswrapper[4895]: I1202 07:26:51.388816 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-7f8f9bcd8d-xp9f9"] Dec 02 07:26:51 crc kubenswrapper[4895]: I1202 07:26:51.419631 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/1f730f49-ea4f-48a2-9849-660bf2583047-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7f8f9bcd8d-xp9f9\" (UID: \"1f730f49-ea4f-48a2-9849-660bf2583047\") " pod="openshift-authentication/oauth-openshift-7f8f9bcd8d-xp9f9" Dec 02 07:26:51 crc kubenswrapper[4895]: I1202 07:26:51.419681 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/1f730f49-ea4f-48a2-9849-660bf2583047-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7f8f9bcd8d-xp9f9\" (UID: \"1f730f49-ea4f-48a2-9849-660bf2583047\") " pod="openshift-authentication/oauth-openshift-7f8f9bcd8d-xp9f9" Dec 02 07:26:51 crc kubenswrapper[4895]: I1202 07:26:51.419706 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/1f730f49-ea4f-48a2-9849-660bf2583047-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7f8f9bcd8d-xp9f9\" (UID: \"1f730f49-ea4f-48a2-9849-660bf2583047\") " pod="openshift-authentication/oauth-openshift-7f8f9bcd8d-xp9f9" Dec 02 07:26:51 crc kubenswrapper[4895]: I1202 07:26:51.419751 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/1f730f49-ea4f-48a2-9849-660bf2583047-v4-0-config-user-template-login\") pod \"oauth-openshift-7f8f9bcd8d-xp9f9\" (UID: \"1f730f49-ea4f-48a2-9849-660bf2583047\") " pod="openshift-authentication/oauth-openshift-7f8f9bcd8d-xp9f9" Dec 02 07:26:51 crc kubenswrapper[4895]: I1202 07:26:51.419933 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/1f730f49-ea4f-48a2-9849-660bf2583047-v4-0-config-system-session\") pod \"oauth-openshift-7f8f9bcd8d-xp9f9\" (UID: \"1f730f49-ea4f-48a2-9849-660bf2583047\") " pod="openshift-authentication/oauth-openshift-7f8f9bcd8d-xp9f9" Dec 02 07:26:51 crc kubenswrapper[4895]: I1202 07:26:51.420052 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/1f730f49-ea4f-48a2-9849-660bf2583047-v4-0-config-system-router-certs\") pod \"oauth-openshift-7f8f9bcd8d-xp9f9\" (UID: 
\"1f730f49-ea4f-48a2-9849-660bf2583047\") " pod="openshift-authentication/oauth-openshift-7f8f9bcd8d-xp9f9" Dec 02 07:26:51 crc kubenswrapper[4895]: I1202 07:26:51.420098 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/1f730f49-ea4f-48a2-9849-660bf2583047-v4-0-config-system-service-ca\") pod \"oauth-openshift-7f8f9bcd8d-xp9f9\" (UID: \"1f730f49-ea4f-48a2-9849-660bf2583047\") " pod="openshift-authentication/oauth-openshift-7f8f9bcd8d-xp9f9" Dec 02 07:26:51 crc kubenswrapper[4895]: I1202 07:26:51.420130 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1f730f49-ea4f-48a2-9849-660bf2583047-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7f8f9bcd8d-xp9f9\" (UID: \"1f730f49-ea4f-48a2-9849-660bf2583047\") " pod="openshift-authentication/oauth-openshift-7f8f9bcd8d-xp9f9" Dec 02 07:26:51 crc kubenswrapper[4895]: I1202 07:26:51.420226 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1f730f49-ea4f-48a2-9849-660bf2583047-audit-dir\") pod \"oauth-openshift-7f8f9bcd8d-xp9f9\" (UID: \"1f730f49-ea4f-48a2-9849-660bf2583047\") " pod="openshift-authentication/oauth-openshift-7f8f9bcd8d-xp9f9" Dec 02 07:26:51 crc kubenswrapper[4895]: I1202 07:26:51.420286 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1f730f49-ea4f-48a2-9849-660bf2583047-audit-policies\") pod \"oauth-openshift-7f8f9bcd8d-xp9f9\" (UID: \"1f730f49-ea4f-48a2-9849-660bf2583047\") " pod="openshift-authentication/oauth-openshift-7f8f9bcd8d-xp9f9" Dec 02 07:26:51 crc kubenswrapper[4895]: I1202 07:26:51.420318 4895 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/1f730f49-ea4f-48a2-9849-660bf2583047-v4-0-config-user-template-error\") pod \"oauth-openshift-7f8f9bcd8d-xp9f9\" (UID: \"1f730f49-ea4f-48a2-9849-660bf2583047\") " pod="openshift-authentication/oauth-openshift-7f8f9bcd8d-xp9f9" Dec 02 07:26:51 crc kubenswrapper[4895]: I1202 07:26:51.420364 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/1f730f49-ea4f-48a2-9849-660bf2583047-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7f8f9bcd8d-xp9f9\" (UID: \"1f730f49-ea4f-48a2-9849-660bf2583047\") " pod="openshift-authentication/oauth-openshift-7f8f9bcd8d-xp9f9" Dec 02 07:26:51 crc kubenswrapper[4895]: I1202 07:26:51.420404 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/1f730f49-ea4f-48a2-9849-660bf2583047-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7f8f9bcd8d-xp9f9\" (UID: \"1f730f49-ea4f-48a2-9849-660bf2583047\") " pod="openshift-authentication/oauth-openshift-7f8f9bcd8d-xp9f9" Dec 02 07:26:51 crc kubenswrapper[4895]: I1202 07:26:51.420435 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vm5rb\" (UniqueName: \"kubernetes.io/projected/1f730f49-ea4f-48a2-9849-660bf2583047-kube-api-access-vm5rb\") pod \"oauth-openshift-7f8f9bcd8d-xp9f9\" (UID: \"1f730f49-ea4f-48a2-9849-660bf2583047\") " pod="openshift-authentication/oauth-openshift-7f8f9bcd8d-xp9f9" Dec 02 07:26:51 crc kubenswrapper[4895]: I1202 07:26:51.521881 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/1f730f49-ea4f-48a2-9849-660bf2583047-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7f8f9bcd8d-xp9f9\" (UID: \"1f730f49-ea4f-48a2-9849-660bf2583047\") " pod="openshift-authentication/oauth-openshift-7f8f9bcd8d-xp9f9" Dec 02 07:26:51 crc kubenswrapper[4895]: I1202 07:26:51.522181 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vm5rb\" (UniqueName: \"kubernetes.io/projected/1f730f49-ea4f-48a2-9849-660bf2583047-kube-api-access-vm5rb\") pod \"oauth-openshift-7f8f9bcd8d-xp9f9\" (UID: \"1f730f49-ea4f-48a2-9849-660bf2583047\") " pod="openshift-authentication/oauth-openshift-7f8f9bcd8d-xp9f9" Dec 02 07:26:51 crc kubenswrapper[4895]: I1202 07:26:51.522282 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/1f730f49-ea4f-48a2-9849-660bf2583047-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7f8f9bcd8d-xp9f9\" (UID: \"1f730f49-ea4f-48a2-9849-660bf2583047\") " pod="openshift-authentication/oauth-openshift-7f8f9bcd8d-xp9f9" Dec 02 07:26:51 crc kubenswrapper[4895]: I1202 07:26:51.522374 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/1f730f49-ea4f-48a2-9849-660bf2583047-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7f8f9bcd8d-xp9f9\" (UID: \"1f730f49-ea4f-48a2-9849-660bf2583047\") " pod="openshift-authentication/oauth-openshift-7f8f9bcd8d-xp9f9" Dec 02 07:26:51 crc kubenswrapper[4895]: I1202 07:26:51.522455 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/1f730f49-ea4f-48a2-9849-660bf2583047-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7f8f9bcd8d-xp9f9\" (UID: \"1f730f49-ea4f-48a2-9849-660bf2583047\") " 
pod="openshift-authentication/oauth-openshift-7f8f9bcd8d-xp9f9" Dec 02 07:26:51 crc kubenswrapper[4895]: I1202 07:26:51.522544 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/1f730f49-ea4f-48a2-9849-660bf2583047-v4-0-config-user-template-login\") pod \"oauth-openshift-7f8f9bcd8d-xp9f9\" (UID: \"1f730f49-ea4f-48a2-9849-660bf2583047\") " pod="openshift-authentication/oauth-openshift-7f8f9bcd8d-xp9f9" Dec 02 07:26:51 crc kubenswrapper[4895]: I1202 07:26:51.522626 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/1f730f49-ea4f-48a2-9849-660bf2583047-v4-0-config-system-session\") pod \"oauth-openshift-7f8f9bcd8d-xp9f9\" (UID: \"1f730f49-ea4f-48a2-9849-660bf2583047\") " pod="openshift-authentication/oauth-openshift-7f8f9bcd8d-xp9f9" Dec 02 07:26:51 crc kubenswrapper[4895]: I1202 07:26:51.522816 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/1f730f49-ea4f-48a2-9849-660bf2583047-v4-0-config-system-router-certs\") pod \"oauth-openshift-7f8f9bcd8d-xp9f9\" (UID: \"1f730f49-ea4f-48a2-9849-660bf2583047\") " pod="openshift-authentication/oauth-openshift-7f8f9bcd8d-xp9f9" Dec 02 07:26:51 crc kubenswrapper[4895]: I1202 07:26:51.523155 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/1f730f49-ea4f-48a2-9849-660bf2583047-v4-0-config-system-service-ca\") pod \"oauth-openshift-7f8f9bcd8d-xp9f9\" (UID: \"1f730f49-ea4f-48a2-9849-660bf2583047\") " pod="openshift-authentication/oauth-openshift-7f8f9bcd8d-xp9f9" Dec 02 07:26:51 crc kubenswrapper[4895]: I1202 07:26:51.523261 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1f730f49-ea4f-48a2-9849-660bf2583047-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7f8f9bcd8d-xp9f9\" (UID: \"1f730f49-ea4f-48a2-9849-660bf2583047\") " pod="openshift-authentication/oauth-openshift-7f8f9bcd8d-xp9f9" Dec 02 07:26:51 crc kubenswrapper[4895]: I1202 07:26:51.523363 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1f730f49-ea4f-48a2-9849-660bf2583047-audit-dir\") pod \"oauth-openshift-7f8f9bcd8d-xp9f9\" (UID: \"1f730f49-ea4f-48a2-9849-660bf2583047\") " pod="openshift-authentication/oauth-openshift-7f8f9bcd8d-xp9f9" Dec 02 07:26:51 crc kubenswrapper[4895]: I1202 07:26:51.523452 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1f730f49-ea4f-48a2-9849-660bf2583047-audit-policies\") pod \"oauth-openshift-7f8f9bcd8d-xp9f9\" (UID: \"1f730f49-ea4f-48a2-9849-660bf2583047\") " pod="openshift-authentication/oauth-openshift-7f8f9bcd8d-xp9f9" Dec 02 07:26:51 crc kubenswrapper[4895]: I1202 07:26:51.523535 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/1f730f49-ea4f-48a2-9849-660bf2583047-v4-0-config-user-template-error\") pod \"oauth-openshift-7f8f9bcd8d-xp9f9\" (UID: \"1f730f49-ea4f-48a2-9849-660bf2583047\") " pod="openshift-authentication/oauth-openshift-7f8f9bcd8d-xp9f9" Dec 02 07:26:51 crc kubenswrapper[4895]: I1202 07:26:51.523624 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/1f730f49-ea4f-48a2-9849-660bf2583047-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7f8f9bcd8d-xp9f9\" (UID: \"1f730f49-ea4f-48a2-9849-660bf2583047\") " 
pod="openshift-authentication/oauth-openshift-7f8f9bcd8d-xp9f9" Dec 02 07:26:51 crc kubenswrapper[4895]: I1202 07:26:51.523641 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1f730f49-ea4f-48a2-9849-660bf2583047-audit-dir\") pod \"oauth-openshift-7f8f9bcd8d-xp9f9\" (UID: \"1f730f49-ea4f-48a2-9849-660bf2583047\") " pod="openshift-authentication/oauth-openshift-7f8f9bcd8d-xp9f9" Dec 02 07:26:51 crc kubenswrapper[4895]: I1202 07:26:51.524017 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/1f730f49-ea4f-48a2-9849-660bf2583047-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7f8f9bcd8d-xp9f9\" (UID: \"1f730f49-ea4f-48a2-9849-660bf2583047\") " pod="openshift-authentication/oauth-openshift-7f8f9bcd8d-xp9f9" Dec 02 07:26:51 crc kubenswrapper[4895]: I1202 07:26:51.524174 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/1f730f49-ea4f-48a2-9849-660bf2583047-v4-0-config-system-service-ca\") pod \"oauth-openshift-7f8f9bcd8d-xp9f9\" (UID: \"1f730f49-ea4f-48a2-9849-660bf2583047\") " pod="openshift-authentication/oauth-openshift-7f8f9bcd8d-xp9f9" Dec 02 07:26:51 crc kubenswrapper[4895]: I1202 07:26:51.524708 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1f730f49-ea4f-48a2-9849-660bf2583047-audit-policies\") pod \"oauth-openshift-7f8f9bcd8d-xp9f9\" (UID: \"1f730f49-ea4f-48a2-9849-660bf2583047\") " pod="openshift-authentication/oauth-openshift-7f8f9bcd8d-xp9f9" Dec 02 07:26:51 crc kubenswrapper[4895]: I1202 07:26:51.525200 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/1f730f49-ea4f-48a2-9849-660bf2583047-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7f8f9bcd8d-xp9f9\" (UID: \"1f730f49-ea4f-48a2-9849-660bf2583047\") " pod="openshift-authentication/oauth-openshift-7f8f9bcd8d-xp9f9" Dec 02 07:26:51 crc kubenswrapper[4895]: I1202 07:26:51.527478 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/1f730f49-ea4f-48a2-9849-660bf2583047-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7f8f9bcd8d-xp9f9\" (UID: \"1f730f49-ea4f-48a2-9849-660bf2583047\") " pod="openshift-authentication/oauth-openshift-7f8f9bcd8d-xp9f9" Dec 02 07:26:51 crc kubenswrapper[4895]: I1202 07:26:51.527680 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/1f730f49-ea4f-48a2-9849-660bf2583047-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7f8f9bcd8d-xp9f9\" (UID: \"1f730f49-ea4f-48a2-9849-660bf2583047\") " pod="openshift-authentication/oauth-openshift-7f8f9bcd8d-xp9f9" Dec 02 07:26:51 crc kubenswrapper[4895]: I1202 07:26:51.527830 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/1f730f49-ea4f-48a2-9849-660bf2583047-v4-0-config-user-template-login\") pod \"oauth-openshift-7f8f9bcd8d-xp9f9\" (UID: \"1f730f49-ea4f-48a2-9849-660bf2583047\") " pod="openshift-authentication/oauth-openshift-7f8f9bcd8d-xp9f9" Dec 02 07:26:51 crc kubenswrapper[4895]: I1202 07:26:51.527970 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/1f730f49-ea4f-48a2-9849-660bf2583047-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7f8f9bcd8d-xp9f9\" (UID: \"1f730f49-ea4f-48a2-9849-660bf2583047\") " 
pod="openshift-authentication/oauth-openshift-7f8f9bcd8d-xp9f9" Dec 02 07:26:51 crc kubenswrapper[4895]: I1202 07:26:51.532040 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/1f730f49-ea4f-48a2-9849-660bf2583047-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7f8f9bcd8d-xp9f9\" (UID: \"1f730f49-ea4f-48a2-9849-660bf2583047\") " pod="openshift-authentication/oauth-openshift-7f8f9bcd8d-xp9f9" Dec 02 07:26:51 crc kubenswrapper[4895]: I1202 07:26:51.532642 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/1f730f49-ea4f-48a2-9849-660bf2583047-v4-0-config-system-router-certs\") pod \"oauth-openshift-7f8f9bcd8d-xp9f9\" (UID: \"1f730f49-ea4f-48a2-9849-660bf2583047\") " pod="openshift-authentication/oauth-openshift-7f8f9bcd8d-xp9f9" Dec 02 07:26:51 crc kubenswrapper[4895]: I1202 07:26:51.533448 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/1f730f49-ea4f-48a2-9849-660bf2583047-v4-0-config-user-template-error\") pod \"oauth-openshift-7f8f9bcd8d-xp9f9\" (UID: \"1f730f49-ea4f-48a2-9849-660bf2583047\") " pod="openshift-authentication/oauth-openshift-7f8f9bcd8d-xp9f9" Dec 02 07:26:51 crc kubenswrapper[4895]: I1202 07:26:51.536714 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/1f730f49-ea4f-48a2-9849-660bf2583047-v4-0-config-system-session\") pod \"oauth-openshift-7f8f9bcd8d-xp9f9\" (UID: \"1f730f49-ea4f-48a2-9849-660bf2583047\") " pod="openshift-authentication/oauth-openshift-7f8f9bcd8d-xp9f9" Dec 02 07:26:51 crc kubenswrapper[4895]: I1202 07:26:51.541931 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vm5rb\" (UniqueName: 
\"kubernetes.io/projected/1f730f49-ea4f-48a2-9849-660bf2583047-kube-api-access-vm5rb\") pod \"oauth-openshift-7f8f9bcd8d-xp9f9\" (UID: \"1f730f49-ea4f-48a2-9849-660bf2583047\") " pod="openshift-authentication/oauth-openshift-7f8f9bcd8d-xp9f9" Dec 02 07:26:51 crc kubenswrapper[4895]: I1202 07:26:51.678762 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-7f8f9bcd8d-xp9f9" Dec 02 07:26:52 crc kubenswrapper[4895]: I1202 07:26:52.090581 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-7f8f9bcd8d-xp9f9"] Dec 02 07:26:52 crc kubenswrapper[4895]: W1202 07:26:52.098383 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f730f49_ea4f_48a2_9849_660bf2583047.slice/crio-9c9c5b1ec90ef95c1b84b6b3673cc8b5565dad69d25f98cc31debe23bd0b9816 WatchSource:0}: Error finding container 9c9c5b1ec90ef95c1b84b6b3673cc8b5565dad69d25f98cc31debe23bd0b9816: Status 404 returned error can't find the container with id 9c9c5b1ec90ef95c1b84b6b3673cc8b5565dad69d25f98cc31debe23bd0b9816 Dec 02 07:26:52 crc kubenswrapper[4895]: I1202 07:26:52.369329 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-7f8f9bcd8d-xp9f9" event={"ID":"1f730f49-ea4f-48a2-9849-660bf2583047","Type":"ContainerStarted","Data":"9c9c5b1ec90ef95c1b84b6b3673cc8b5565dad69d25f98cc31debe23bd0b9816"} Dec 02 07:26:53 crc kubenswrapper[4895]: I1202 07:26:53.381324 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-7f8f9bcd8d-xp9f9" event={"ID":"1f730f49-ea4f-48a2-9849-660bf2583047","Type":"ContainerStarted","Data":"4ab6126c4e80767eaf219a7bdf6d11b9b88fd45a3c63e6975bae500f99d5db61"} Dec 02 07:26:53 crc kubenswrapper[4895]: I1202 07:26:53.381906 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-authentication/oauth-openshift-7f8f9bcd8d-xp9f9" Dec 02 07:26:53 crc kubenswrapper[4895]: I1202 07:26:53.391380 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-7f8f9bcd8d-xp9f9" Dec 02 07:26:53 crc kubenswrapper[4895]: I1202 07:26:53.419454 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-7f8f9bcd8d-xp9f9" podStartSLOduration=33.419425141 podStartE2EDuration="33.419425141s" podCreationTimestamp="2025-12-02 07:26:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:26:53.413941928 +0000 UTC m=+224.584801561" watchObservedRunningTime="2025-12-02 07:26:53.419425141 +0000 UTC m=+224.590284794" Dec 02 07:26:54 crc kubenswrapper[4895]: I1202 07:26:54.604273 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7f7bf5bf79-2xct5"] Dec 02 07:26:54 crc kubenswrapper[4895]: I1202 07:26:54.604616 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-7f7bf5bf79-2xct5" podUID="ba69ea60-ce66-4a59-8d92-a869ffdc3bf8" containerName="controller-manager" containerID="cri-o://9d2edb809dc84d73735ad47ca8e65e7a58caacf5283a1fd3411f34c3d78955bb" gracePeriod=30 Dec 02 07:26:54 crc kubenswrapper[4895]: I1202 07:26:54.607292 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67485fb985-x7mxq"] Dec 02 07:26:54 crc kubenswrapper[4895]: I1202 07:26:54.607524 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-67485fb985-x7mxq" podUID="2d8db6b7-b02b-49af-90df-46c045e6664e" containerName="route-controller-manager" 
containerID="cri-o://614027919b01bfb6878f55f3f43287491f1c942103505d3c909172a2e34c9abb" gracePeriod=30 Dec 02 07:26:55 crc kubenswrapper[4895]: I1202 07:26:55.174530 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-67485fb985-x7mxq" Dec 02 07:26:55 crc kubenswrapper[4895]: I1202 07:26:55.374134 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7f7bf5bf79-2xct5" Dec 02 07:26:55 crc kubenswrapper[4895]: I1202 07:26:55.391939 4895 generic.go:334] "Generic (PLEG): container finished" podID="ba69ea60-ce66-4a59-8d92-a869ffdc3bf8" containerID="9d2edb809dc84d73735ad47ca8e65e7a58caacf5283a1fd3411f34c3d78955bb" exitCode=0 Dec 02 07:26:55 crc kubenswrapper[4895]: I1202 07:26:55.392017 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7f7bf5bf79-2xct5" event={"ID":"ba69ea60-ce66-4a59-8d92-a869ffdc3bf8","Type":"ContainerDied","Data":"9d2edb809dc84d73735ad47ca8e65e7a58caacf5283a1fd3411f34c3d78955bb"} Dec 02 07:26:55 crc kubenswrapper[4895]: I1202 07:26:55.392052 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7f7bf5bf79-2xct5" event={"ID":"ba69ea60-ce66-4a59-8d92-a869ffdc3bf8","Type":"ContainerDied","Data":"a32ed8e04ce3bab36ce56026a911cc3a40d338ab3ae9607b06e46008ecd251dc"} Dec 02 07:26:55 crc kubenswrapper[4895]: I1202 07:26:55.392074 4895 scope.go:117] "RemoveContainer" containerID="9d2edb809dc84d73735ad47ca8e65e7a58caacf5283a1fd3411f34c3d78955bb" Dec 02 07:26:55 crc kubenswrapper[4895]: I1202 07:26:55.392233 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7f7bf5bf79-2xct5" Dec 02 07:26:55 crc kubenswrapper[4895]: I1202 07:26:55.408657 4895 generic.go:334] "Generic (PLEG): container finished" podID="2d8db6b7-b02b-49af-90df-46c045e6664e" containerID="614027919b01bfb6878f55f3f43287491f1c942103505d3c909172a2e34c9abb" exitCode=0 Dec 02 07:26:55 crc kubenswrapper[4895]: I1202 07:26:55.410865 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-67485fb985-x7mxq" Dec 02 07:26:55 crc kubenswrapper[4895]: I1202 07:26:55.410862 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-67485fb985-x7mxq" event={"ID":"2d8db6b7-b02b-49af-90df-46c045e6664e","Type":"ContainerDied","Data":"614027919b01bfb6878f55f3f43287491f1c942103505d3c909172a2e34c9abb"} Dec 02 07:26:55 crc kubenswrapper[4895]: I1202 07:26:55.410916 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-67485fb985-x7mxq" event={"ID":"2d8db6b7-b02b-49af-90df-46c045e6664e","Type":"ContainerDied","Data":"6a38869b457fef8bdb9952b530f496a26a0202edfde6739d248f662d65a7b213"} Dec 02 07:26:55 crc kubenswrapper[4895]: I1202 07:26:55.430505 4895 scope.go:117] "RemoveContainer" containerID="9d2edb809dc84d73735ad47ca8e65e7a58caacf5283a1fd3411f34c3d78955bb" Dec 02 07:26:55 crc kubenswrapper[4895]: E1202 07:26:55.431408 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d2edb809dc84d73735ad47ca8e65e7a58caacf5283a1fd3411f34c3d78955bb\": container with ID starting with 9d2edb809dc84d73735ad47ca8e65e7a58caacf5283a1fd3411f34c3d78955bb not found: ID does not exist" containerID="9d2edb809dc84d73735ad47ca8e65e7a58caacf5283a1fd3411f34c3d78955bb" Dec 02 07:26:55 crc kubenswrapper[4895]: I1202 07:26:55.431460 4895 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d2edb809dc84d73735ad47ca8e65e7a58caacf5283a1fd3411f34c3d78955bb"} err="failed to get container status \"9d2edb809dc84d73735ad47ca8e65e7a58caacf5283a1fd3411f34c3d78955bb\": rpc error: code = NotFound desc = could not find container \"9d2edb809dc84d73735ad47ca8e65e7a58caacf5283a1fd3411f34c3d78955bb\": container with ID starting with 9d2edb809dc84d73735ad47ca8e65e7a58caacf5283a1fd3411f34c3d78955bb not found: ID does not exist" Dec 02 07:26:55 crc kubenswrapper[4895]: I1202 07:26:55.431512 4895 scope.go:117] "RemoveContainer" containerID="614027919b01bfb6878f55f3f43287491f1c942103505d3c909172a2e34c9abb" Dec 02 07:26:55 crc kubenswrapper[4895]: I1202 07:26:55.448267 4895 scope.go:117] "RemoveContainer" containerID="614027919b01bfb6878f55f3f43287491f1c942103505d3c909172a2e34c9abb" Dec 02 07:26:55 crc kubenswrapper[4895]: E1202 07:26:55.453933 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"614027919b01bfb6878f55f3f43287491f1c942103505d3c909172a2e34c9abb\": container with ID starting with 614027919b01bfb6878f55f3f43287491f1c942103505d3c909172a2e34c9abb not found: ID does not exist" containerID="614027919b01bfb6878f55f3f43287491f1c942103505d3c909172a2e34c9abb" Dec 02 07:26:55 crc kubenswrapper[4895]: I1202 07:26:55.453997 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"614027919b01bfb6878f55f3f43287491f1c942103505d3c909172a2e34c9abb"} err="failed to get container status \"614027919b01bfb6878f55f3f43287491f1c942103505d3c909172a2e34c9abb\": rpc error: code = NotFound desc = could not find container \"614027919b01bfb6878f55f3f43287491f1c942103505d3c909172a2e34c9abb\": container with ID starting with 614027919b01bfb6878f55f3f43287491f1c942103505d3c909172a2e34c9abb not found: ID does not exist" Dec 02 07:26:55 crc kubenswrapper[4895]: I1202 
07:26:55.463402 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-446vz\" (UniqueName: \"kubernetes.io/projected/2d8db6b7-b02b-49af-90df-46c045e6664e-kube-api-access-446vz\") pod \"2d8db6b7-b02b-49af-90df-46c045e6664e\" (UID: \"2d8db6b7-b02b-49af-90df-46c045e6664e\") " Dec 02 07:26:55 crc kubenswrapper[4895]: I1202 07:26:55.463584 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d8db6b7-b02b-49af-90df-46c045e6664e-config\") pod \"2d8db6b7-b02b-49af-90df-46c045e6664e\" (UID: \"2d8db6b7-b02b-49af-90df-46c045e6664e\") " Dec 02 07:26:55 crc kubenswrapper[4895]: I1202 07:26:55.464727 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d8db6b7-b02b-49af-90df-46c045e6664e-config" (OuterVolumeSpecName: "config") pod "2d8db6b7-b02b-49af-90df-46c045e6664e" (UID: "2d8db6b7-b02b-49af-90df-46c045e6664e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:26:55 crc kubenswrapper[4895]: I1202 07:26:55.464762 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ba69ea60-ce66-4a59-8d92-a869ffdc3bf8-serving-cert\") pod \"ba69ea60-ce66-4a59-8d92-a869ffdc3bf8\" (UID: \"ba69ea60-ce66-4a59-8d92-a869ffdc3bf8\") " Dec 02 07:26:55 crc kubenswrapper[4895]: I1202 07:26:55.464817 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2d8db6b7-b02b-49af-90df-46c045e6664e-serving-cert\") pod \"2d8db6b7-b02b-49af-90df-46c045e6664e\" (UID: \"2d8db6b7-b02b-49af-90df-46c045e6664e\") " Dec 02 07:26:55 crc kubenswrapper[4895]: I1202 07:26:55.464844 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ba69ea60-ce66-4a59-8d92-a869ffdc3bf8-client-ca\") pod \"ba69ea60-ce66-4a59-8d92-a869ffdc3bf8\" (UID: \"ba69ea60-ce66-4a59-8d92-a869ffdc3bf8\") " Dec 02 07:26:55 crc kubenswrapper[4895]: I1202 07:26:55.464895 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2d8db6b7-b02b-49af-90df-46c045e6664e-client-ca\") pod \"2d8db6b7-b02b-49af-90df-46c045e6664e\" (UID: \"2d8db6b7-b02b-49af-90df-46c045e6664e\") " Dec 02 07:26:55 crc kubenswrapper[4895]: I1202 07:26:55.464953 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ba69ea60-ce66-4a59-8d92-a869ffdc3bf8-proxy-ca-bundles\") pod \"ba69ea60-ce66-4a59-8d92-a869ffdc3bf8\" (UID: \"ba69ea60-ce66-4a59-8d92-a869ffdc3bf8\") " Dec 02 07:26:55 crc kubenswrapper[4895]: I1202 07:26:55.464986 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/ba69ea60-ce66-4a59-8d92-a869ffdc3bf8-config\") pod \"ba69ea60-ce66-4a59-8d92-a869ffdc3bf8\" (UID: \"ba69ea60-ce66-4a59-8d92-a869ffdc3bf8\") " Dec 02 07:26:55 crc kubenswrapper[4895]: I1202 07:26:55.465017 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hhg5m\" (UniqueName: \"kubernetes.io/projected/ba69ea60-ce66-4a59-8d92-a869ffdc3bf8-kube-api-access-hhg5m\") pod \"ba69ea60-ce66-4a59-8d92-a869ffdc3bf8\" (UID: \"ba69ea60-ce66-4a59-8d92-a869ffdc3bf8\") " Dec 02 07:26:55 crc kubenswrapper[4895]: I1202 07:26:55.465545 4895 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d8db6b7-b02b-49af-90df-46c045e6664e-config\") on node \"crc\" DevicePath \"\"" Dec 02 07:26:55 crc kubenswrapper[4895]: I1202 07:26:55.465572 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d8db6b7-b02b-49af-90df-46c045e6664e-client-ca" (OuterVolumeSpecName: "client-ca") pod "2d8db6b7-b02b-49af-90df-46c045e6664e" (UID: "2d8db6b7-b02b-49af-90df-46c045e6664e"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:26:55 crc kubenswrapper[4895]: I1202 07:26:55.467499 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba69ea60-ce66-4a59-8d92-a869ffdc3bf8-client-ca" (OuterVolumeSpecName: "client-ca") pod "ba69ea60-ce66-4a59-8d92-a869ffdc3bf8" (UID: "ba69ea60-ce66-4a59-8d92-a869ffdc3bf8"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:26:55 crc kubenswrapper[4895]: I1202 07:26:55.467579 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba69ea60-ce66-4a59-8d92-a869ffdc3bf8-config" (OuterVolumeSpecName: "config") pod "ba69ea60-ce66-4a59-8d92-a869ffdc3bf8" (UID: "ba69ea60-ce66-4a59-8d92-a869ffdc3bf8"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:26:55 crc kubenswrapper[4895]: I1202 07:26:55.468481 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba69ea60-ce66-4a59-8d92-a869ffdc3bf8-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "ba69ea60-ce66-4a59-8d92-a869ffdc3bf8" (UID: "ba69ea60-ce66-4a59-8d92-a869ffdc3bf8"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:26:55 crc kubenswrapper[4895]: I1202 07:26:55.471324 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d8db6b7-b02b-49af-90df-46c045e6664e-kube-api-access-446vz" (OuterVolumeSpecName: "kube-api-access-446vz") pod "2d8db6b7-b02b-49af-90df-46c045e6664e" (UID: "2d8db6b7-b02b-49af-90df-46c045e6664e"). InnerVolumeSpecName "kube-api-access-446vz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:26:55 crc kubenswrapper[4895]: I1202 07:26:55.471481 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d8db6b7-b02b-49af-90df-46c045e6664e-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "2d8db6b7-b02b-49af-90df-46c045e6664e" (UID: "2d8db6b7-b02b-49af-90df-46c045e6664e"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:26:55 crc kubenswrapper[4895]: I1202 07:26:55.471556 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba69ea60-ce66-4a59-8d92-a869ffdc3bf8-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "ba69ea60-ce66-4a59-8d92-a869ffdc3bf8" (UID: "ba69ea60-ce66-4a59-8d92-a869ffdc3bf8"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:26:55 crc kubenswrapper[4895]: I1202 07:26:55.471844 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba69ea60-ce66-4a59-8d92-a869ffdc3bf8-kube-api-access-hhg5m" (OuterVolumeSpecName: "kube-api-access-hhg5m") pod "ba69ea60-ce66-4a59-8d92-a869ffdc3bf8" (UID: "ba69ea60-ce66-4a59-8d92-a869ffdc3bf8"). InnerVolumeSpecName "kube-api-access-hhg5m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:26:55 crc kubenswrapper[4895]: I1202 07:26:55.565957 4895 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba69ea60-ce66-4a59-8d92-a869ffdc3bf8-config\") on node \"crc\" DevicePath \"\"" Dec 02 07:26:55 crc kubenswrapper[4895]: I1202 07:26:55.566005 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hhg5m\" (UniqueName: \"kubernetes.io/projected/ba69ea60-ce66-4a59-8d92-a869ffdc3bf8-kube-api-access-hhg5m\") on node \"crc\" DevicePath \"\"" Dec 02 07:26:55 crc kubenswrapper[4895]: I1202 07:26:55.566019 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-446vz\" (UniqueName: \"kubernetes.io/projected/2d8db6b7-b02b-49af-90df-46c045e6664e-kube-api-access-446vz\") on node \"crc\" DevicePath \"\"" Dec 02 07:26:55 crc kubenswrapper[4895]: I1202 07:26:55.566031 4895 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ba69ea60-ce66-4a59-8d92-a869ffdc3bf8-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 07:26:55 crc kubenswrapper[4895]: I1202 07:26:55.566046 4895 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2d8db6b7-b02b-49af-90df-46c045e6664e-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 07:26:55 crc kubenswrapper[4895]: I1202 07:26:55.566058 4895 reconciler_common.go:293] "Volume detached for volume \"client-ca\" 
(UniqueName: \"kubernetes.io/configmap/ba69ea60-ce66-4a59-8d92-a869ffdc3bf8-client-ca\") on node \"crc\" DevicePath \"\"" Dec 02 07:26:55 crc kubenswrapper[4895]: I1202 07:26:55.566069 4895 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2d8db6b7-b02b-49af-90df-46c045e6664e-client-ca\") on node \"crc\" DevicePath \"\"" Dec 02 07:26:55 crc kubenswrapper[4895]: I1202 07:26:55.566080 4895 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ba69ea60-ce66-4a59-8d92-a869ffdc3bf8-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 02 07:26:55 crc kubenswrapper[4895]: I1202 07:26:55.719535 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7f7bf5bf79-2xct5"] Dec 02 07:26:55 crc kubenswrapper[4895]: I1202 07:26:55.724205 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-7f7bf5bf79-2xct5"] Dec 02 07:26:55 crc kubenswrapper[4895]: I1202 07:26:55.743661 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67485fb985-x7mxq"] Dec 02 07:26:55 crc kubenswrapper[4895]: I1202 07:26:55.750662 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67485fb985-x7mxq"] Dec 02 07:26:56 crc kubenswrapper[4895]: I1202 07:26:56.370836 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5b5c48f497-fzhgm"] Dec 02 07:26:56 crc kubenswrapper[4895]: E1202 07:26:56.371625 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba69ea60-ce66-4a59-8d92-a869ffdc3bf8" containerName="controller-manager" Dec 02 07:26:56 crc kubenswrapper[4895]: I1202 07:26:56.371657 4895 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="ba69ea60-ce66-4a59-8d92-a869ffdc3bf8" containerName="controller-manager" Dec 02 07:26:56 crc kubenswrapper[4895]: E1202 07:26:56.371685 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d8db6b7-b02b-49af-90df-46c045e6664e" containerName="route-controller-manager" Dec 02 07:26:56 crc kubenswrapper[4895]: I1202 07:26:56.371699 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d8db6b7-b02b-49af-90df-46c045e6664e" containerName="route-controller-manager" Dec 02 07:26:56 crc kubenswrapper[4895]: I1202 07:26:56.371906 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d8db6b7-b02b-49af-90df-46c045e6664e" containerName="route-controller-manager" Dec 02 07:26:56 crc kubenswrapper[4895]: I1202 07:26:56.371939 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba69ea60-ce66-4a59-8d92-a869ffdc3bf8" containerName="controller-manager" Dec 02 07:26:56 crc kubenswrapper[4895]: I1202 07:26:56.372619 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5b5c48f497-fzhgm" Dec 02 07:26:56 crc kubenswrapper[4895]: I1202 07:26:56.374265 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 02 07:26:56 crc kubenswrapper[4895]: I1202 07:26:56.374962 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 02 07:26:56 crc kubenswrapper[4895]: I1202 07:26:56.375200 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 02 07:26:56 crc kubenswrapper[4895]: I1202 07:26:56.375267 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 02 07:26:56 crc kubenswrapper[4895]: I1202 07:26:56.375209 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 02 07:26:56 crc kubenswrapper[4895]: I1202 07:26:56.375347 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 02 07:26:56 crc kubenswrapper[4895]: I1202 07:26:56.377431 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7fdcf999c5-tmrvm"] Dec 02 07:26:56 crc kubenswrapper[4895]: I1202 07:26:56.378278 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7fdcf999c5-tmrvm" Dec 02 07:26:56 crc kubenswrapper[4895]: I1202 07:26:56.379436 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 02 07:26:56 crc kubenswrapper[4895]: I1202 07:26:56.379923 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 02 07:26:56 crc kubenswrapper[4895]: I1202 07:26:56.379965 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 02 07:26:56 crc kubenswrapper[4895]: I1202 07:26:56.380655 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 02 07:26:56 crc kubenswrapper[4895]: I1202 07:26:56.381217 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 02 07:26:56 crc kubenswrapper[4895]: I1202 07:26:56.381908 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 02 07:26:56 crc kubenswrapper[4895]: I1202 07:26:56.382887 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5b5c48f497-fzhgm"] Dec 02 07:26:56 crc kubenswrapper[4895]: I1202 07:26:56.388112 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 02 07:26:56 crc kubenswrapper[4895]: I1202 07:26:56.388231 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7fdcf999c5-tmrvm"] Dec 02 07:26:56 crc kubenswrapper[4895]: I1202 07:26:56.477317 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/bfdabd6b-9b00-461a-957b-5b22e6118c2c-client-ca\") pod \"route-controller-manager-5b5c48f497-fzhgm\" (UID: \"bfdabd6b-9b00-461a-957b-5b22e6118c2c\") " pod="openshift-route-controller-manager/route-controller-manager-5b5c48f497-fzhgm" Dec 02 07:26:56 crc kubenswrapper[4895]: I1202 07:26:56.477564 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8x8g\" (UniqueName: \"kubernetes.io/projected/bfdabd6b-9b00-461a-957b-5b22e6118c2c-kube-api-access-q8x8g\") pod \"route-controller-manager-5b5c48f497-fzhgm\" (UID: \"bfdabd6b-9b00-461a-957b-5b22e6118c2c\") " pod="openshift-route-controller-manager/route-controller-manager-5b5c48f497-fzhgm" Dec 02 07:26:56 crc kubenswrapper[4895]: I1202 07:26:56.477842 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bfdabd6b-9b00-461a-957b-5b22e6118c2c-config\") pod \"route-controller-manager-5b5c48f497-fzhgm\" (UID: \"bfdabd6b-9b00-461a-957b-5b22e6118c2c\") " pod="openshift-route-controller-manager/route-controller-manager-5b5c48f497-fzhgm" Dec 02 07:26:56 crc kubenswrapper[4895]: I1202 07:26:56.477991 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bfdabd6b-9b00-461a-957b-5b22e6118c2c-serving-cert\") pod \"route-controller-manager-5b5c48f497-fzhgm\" (UID: \"bfdabd6b-9b00-461a-957b-5b22e6118c2c\") " pod="openshift-route-controller-manager/route-controller-manager-5b5c48f497-fzhgm" Dec 02 07:26:56 crc kubenswrapper[4895]: I1202 07:26:56.578879 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q8x8g\" (UniqueName: \"kubernetes.io/projected/bfdabd6b-9b00-461a-957b-5b22e6118c2c-kube-api-access-q8x8g\") pod \"route-controller-manager-5b5c48f497-fzhgm\" (UID: \"bfdabd6b-9b00-461a-957b-5b22e6118c2c\") " 
pod="openshift-route-controller-manager/route-controller-manager-5b5c48f497-fzhgm" Dec 02 07:26:56 crc kubenswrapper[4895]: I1202 07:26:56.579040 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlzz9\" (UniqueName: \"kubernetes.io/projected/844cd926-0e3a-481c-9686-d887653e4bf4-kube-api-access-nlzz9\") pod \"controller-manager-7fdcf999c5-tmrvm\" (UID: \"844cd926-0e3a-481c-9686-d887653e4bf4\") " pod="openshift-controller-manager/controller-manager-7fdcf999c5-tmrvm" Dec 02 07:26:56 crc kubenswrapper[4895]: I1202 07:26:56.579105 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/844cd926-0e3a-481c-9686-d887653e4bf4-proxy-ca-bundles\") pod \"controller-manager-7fdcf999c5-tmrvm\" (UID: \"844cd926-0e3a-481c-9686-d887653e4bf4\") " pod="openshift-controller-manager/controller-manager-7fdcf999c5-tmrvm" Dec 02 07:26:56 crc kubenswrapper[4895]: I1202 07:26:56.579129 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bfdabd6b-9b00-461a-957b-5b22e6118c2c-config\") pod \"route-controller-manager-5b5c48f497-fzhgm\" (UID: \"bfdabd6b-9b00-461a-957b-5b22e6118c2c\") " pod="openshift-route-controller-manager/route-controller-manager-5b5c48f497-fzhgm" Dec 02 07:26:56 crc kubenswrapper[4895]: I1202 07:26:56.579157 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/844cd926-0e3a-481c-9686-d887653e4bf4-config\") pod \"controller-manager-7fdcf999c5-tmrvm\" (UID: \"844cd926-0e3a-481c-9686-d887653e4bf4\") " pod="openshift-controller-manager/controller-manager-7fdcf999c5-tmrvm" Dec 02 07:26:56 crc kubenswrapper[4895]: I1202 07:26:56.579185 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"client-ca\" (UniqueName: \"kubernetes.io/configmap/844cd926-0e3a-481c-9686-d887653e4bf4-client-ca\") pod \"controller-manager-7fdcf999c5-tmrvm\" (UID: \"844cd926-0e3a-481c-9686-d887653e4bf4\") " pod="openshift-controller-manager/controller-manager-7fdcf999c5-tmrvm" Dec 02 07:26:56 crc kubenswrapper[4895]: I1202 07:26:56.579213 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/844cd926-0e3a-481c-9686-d887653e4bf4-serving-cert\") pod \"controller-manager-7fdcf999c5-tmrvm\" (UID: \"844cd926-0e3a-481c-9686-d887653e4bf4\") " pod="openshift-controller-manager/controller-manager-7fdcf999c5-tmrvm" Dec 02 07:26:56 crc kubenswrapper[4895]: I1202 07:26:56.579314 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bfdabd6b-9b00-461a-957b-5b22e6118c2c-serving-cert\") pod \"route-controller-manager-5b5c48f497-fzhgm\" (UID: \"bfdabd6b-9b00-461a-957b-5b22e6118c2c\") " pod="openshift-route-controller-manager/route-controller-manager-5b5c48f497-fzhgm" Dec 02 07:26:56 crc kubenswrapper[4895]: I1202 07:26:56.579381 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bfdabd6b-9b00-461a-957b-5b22e6118c2c-client-ca\") pod \"route-controller-manager-5b5c48f497-fzhgm\" (UID: \"bfdabd6b-9b00-461a-957b-5b22e6118c2c\") " pod="openshift-route-controller-manager/route-controller-manager-5b5c48f497-fzhgm" Dec 02 07:26:56 crc kubenswrapper[4895]: I1202 07:26:56.580656 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bfdabd6b-9b00-461a-957b-5b22e6118c2c-client-ca\") pod \"route-controller-manager-5b5c48f497-fzhgm\" (UID: \"bfdabd6b-9b00-461a-957b-5b22e6118c2c\") " pod="openshift-route-controller-manager/route-controller-manager-5b5c48f497-fzhgm" Dec 02 
07:26:56 crc kubenswrapper[4895]: I1202 07:26:56.581938 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bfdabd6b-9b00-461a-957b-5b22e6118c2c-config\") pod \"route-controller-manager-5b5c48f497-fzhgm\" (UID: \"bfdabd6b-9b00-461a-957b-5b22e6118c2c\") " pod="openshift-route-controller-manager/route-controller-manager-5b5c48f497-fzhgm" Dec 02 07:26:56 crc kubenswrapper[4895]: I1202 07:26:56.591935 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bfdabd6b-9b00-461a-957b-5b22e6118c2c-serving-cert\") pod \"route-controller-manager-5b5c48f497-fzhgm\" (UID: \"bfdabd6b-9b00-461a-957b-5b22e6118c2c\") " pod="openshift-route-controller-manager/route-controller-manager-5b5c48f497-fzhgm" Dec 02 07:26:56 crc kubenswrapper[4895]: I1202 07:26:56.602430 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8x8g\" (UniqueName: \"kubernetes.io/projected/bfdabd6b-9b00-461a-957b-5b22e6118c2c-kube-api-access-q8x8g\") pod \"route-controller-manager-5b5c48f497-fzhgm\" (UID: \"bfdabd6b-9b00-461a-957b-5b22e6118c2c\") " pod="openshift-route-controller-manager/route-controller-manager-5b5c48f497-fzhgm" Dec 02 07:26:56 crc kubenswrapper[4895]: I1202 07:26:56.681239 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nlzz9\" (UniqueName: \"kubernetes.io/projected/844cd926-0e3a-481c-9686-d887653e4bf4-kube-api-access-nlzz9\") pod \"controller-manager-7fdcf999c5-tmrvm\" (UID: \"844cd926-0e3a-481c-9686-d887653e4bf4\") " pod="openshift-controller-manager/controller-manager-7fdcf999c5-tmrvm" Dec 02 07:26:56 crc kubenswrapper[4895]: I1202 07:26:56.681311 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/844cd926-0e3a-481c-9686-d887653e4bf4-proxy-ca-bundles\") pod 
\"controller-manager-7fdcf999c5-tmrvm\" (UID: \"844cd926-0e3a-481c-9686-d887653e4bf4\") " pod="openshift-controller-manager/controller-manager-7fdcf999c5-tmrvm" Dec 02 07:26:56 crc kubenswrapper[4895]: I1202 07:26:56.681334 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/844cd926-0e3a-481c-9686-d887653e4bf4-config\") pod \"controller-manager-7fdcf999c5-tmrvm\" (UID: \"844cd926-0e3a-481c-9686-d887653e4bf4\") " pod="openshift-controller-manager/controller-manager-7fdcf999c5-tmrvm" Dec 02 07:26:56 crc kubenswrapper[4895]: I1202 07:26:56.681351 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/844cd926-0e3a-481c-9686-d887653e4bf4-client-ca\") pod \"controller-manager-7fdcf999c5-tmrvm\" (UID: \"844cd926-0e3a-481c-9686-d887653e4bf4\") " pod="openshift-controller-manager/controller-manager-7fdcf999c5-tmrvm" Dec 02 07:26:56 crc kubenswrapper[4895]: I1202 07:26:56.681375 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/844cd926-0e3a-481c-9686-d887653e4bf4-serving-cert\") pod \"controller-manager-7fdcf999c5-tmrvm\" (UID: \"844cd926-0e3a-481c-9686-d887653e4bf4\") " pod="openshift-controller-manager/controller-manager-7fdcf999c5-tmrvm" Dec 02 07:26:56 crc kubenswrapper[4895]: I1202 07:26:56.682407 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/844cd926-0e3a-481c-9686-d887653e4bf4-client-ca\") pod \"controller-manager-7fdcf999c5-tmrvm\" (UID: \"844cd926-0e3a-481c-9686-d887653e4bf4\") " pod="openshift-controller-manager/controller-manager-7fdcf999c5-tmrvm" Dec 02 07:26:56 crc kubenswrapper[4895]: I1202 07:26:56.682790 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/844cd926-0e3a-481c-9686-d887653e4bf4-config\") pod \"controller-manager-7fdcf999c5-tmrvm\" (UID: \"844cd926-0e3a-481c-9686-d887653e4bf4\") " pod="openshift-controller-manager/controller-manager-7fdcf999c5-tmrvm" Dec 02 07:26:56 crc kubenswrapper[4895]: I1202 07:26:56.683632 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/844cd926-0e3a-481c-9686-d887653e4bf4-proxy-ca-bundles\") pod \"controller-manager-7fdcf999c5-tmrvm\" (UID: \"844cd926-0e3a-481c-9686-d887653e4bf4\") " pod="openshift-controller-manager/controller-manager-7fdcf999c5-tmrvm" Dec 02 07:26:56 crc kubenswrapper[4895]: I1202 07:26:56.686814 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/844cd926-0e3a-481c-9686-d887653e4bf4-serving-cert\") pod \"controller-manager-7fdcf999c5-tmrvm\" (UID: \"844cd926-0e3a-481c-9686-d887653e4bf4\") " pod="openshift-controller-manager/controller-manager-7fdcf999c5-tmrvm" Dec 02 07:26:56 crc kubenswrapper[4895]: I1202 07:26:56.699309 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlzz9\" (UniqueName: \"kubernetes.io/projected/844cd926-0e3a-481c-9686-d887653e4bf4-kube-api-access-nlzz9\") pod \"controller-manager-7fdcf999c5-tmrvm\" (UID: \"844cd926-0e3a-481c-9686-d887653e4bf4\") " pod="openshift-controller-manager/controller-manager-7fdcf999c5-tmrvm" Dec 02 07:26:56 crc kubenswrapper[4895]: I1202 07:26:56.737435 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5b5c48f497-fzhgm" Dec 02 07:26:56 crc kubenswrapper[4895]: I1202 07:26:56.743487 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7fdcf999c5-tmrvm" Dec 02 07:26:56 crc kubenswrapper[4895]: I1202 07:26:56.974570 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7fdcf999c5-tmrvm"] Dec 02 07:26:57 crc kubenswrapper[4895]: I1202 07:26:57.020934 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5b5c48f497-fzhgm"] Dec 02 07:26:57 crc kubenswrapper[4895]: I1202 07:26:57.150264 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d8db6b7-b02b-49af-90df-46c045e6664e" path="/var/lib/kubelet/pods/2d8db6b7-b02b-49af-90df-46c045e6664e/volumes" Dec 02 07:26:57 crc kubenswrapper[4895]: I1202 07:26:57.151677 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba69ea60-ce66-4a59-8d92-a869ffdc3bf8" path="/var/lib/kubelet/pods/ba69ea60-ce66-4a59-8d92-a869ffdc3bf8/volumes" Dec 02 07:26:57 crc kubenswrapper[4895]: I1202 07:26:57.444915 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5b5c48f497-fzhgm" event={"ID":"bfdabd6b-9b00-461a-957b-5b22e6118c2c","Type":"ContainerStarted","Data":"54671e6ee69c6f2c8f81230b3afd729f954ddb62abf6538ddeceadf5fed626cb"} Dec 02 07:26:57 crc kubenswrapper[4895]: I1202 07:26:57.444976 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5b5c48f497-fzhgm" event={"ID":"bfdabd6b-9b00-461a-957b-5b22e6118c2c","Type":"ContainerStarted","Data":"8911b94419b85cbe0e3b338abc4b506548b08040b328f9be9061fa11a53a7bfd"} Dec 02 07:26:57 crc kubenswrapper[4895]: I1202 07:26:57.444997 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5b5c48f497-fzhgm" Dec 02 07:26:57 crc kubenswrapper[4895]: I1202 07:26:57.446886 4895 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7fdcf999c5-tmrvm" event={"ID":"844cd926-0e3a-481c-9686-d887653e4bf4","Type":"ContainerStarted","Data":"879e2534fadff4d988305dcb5b1409c2321564dd10e03e67b6b6498787b41654"} Dec 02 07:26:57 crc kubenswrapper[4895]: I1202 07:26:57.446929 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7fdcf999c5-tmrvm" event={"ID":"844cd926-0e3a-481c-9686-d887653e4bf4","Type":"ContainerStarted","Data":"4f609452266c2a0eb281b815666ab31cdf2c0214b5c4bc4f995854597fcbe227"} Dec 02 07:26:57 crc kubenswrapper[4895]: I1202 07:26:57.447180 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7fdcf999c5-tmrvm" Dec 02 07:26:57 crc kubenswrapper[4895]: I1202 07:26:57.452200 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7fdcf999c5-tmrvm" Dec 02 07:26:57 crc kubenswrapper[4895]: I1202 07:26:57.465699 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5b5c48f497-fzhgm" podStartSLOduration=3.465679953 podStartE2EDuration="3.465679953s" podCreationTimestamp="2025-12-02 07:26:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:26:57.463729364 +0000 UTC m=+228.634588997" watchObservedRunningTime="2025-12-02 07:26:57.465679953 +0000 UTC m=+228.636539566" Dec 02 07:26:57 crc kubenswrapper[4895]: I1202 07:26:57.484228 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7fdcf999c5-tmrvm" podStartSLOduration=3.484209182 podStartE2EDuration="3.484209182s" podCreationTimestamp="2025-12-02 07:26:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:26:57.482082159 +0000 UTC m=+228.652941772" watchObservedRunningTime="2025-12-02 07:26:57.484209182 +0000 UTC m=+228.655068785" Dec 02 07:26:58 crc kubenswrapper[4895]: I1202 07:26:58.022366 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5b5c48f497-fzhgm" Dec 02 07:27:05 crc kubenswrapper[4895]: I1202 07:27:05.167105 4895 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 02 07:27:05 crc kubenswrapper[4895]: I1202 07:27:05.168425 4895 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 02 07:27:05 crc kubenswrapper[4895]: I1202 07:27:05.168569 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 07:27:05 crc kubenswrapper[4895]: I1202 07:27:05.169446 4895 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 02 07:27:05 crc kubenswrapper[4895]: E1202 07:27:05.169566 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 02 07:27:05 crc kubenswrapper[4895]: I1202 07:27:05.169584 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 02 07:27:05 crc kubenswrapper[4895]: E1202 07:27:05.169595 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Dec 02 07:27:05 crc kubenswrapper[4895]: I1202 07:27:05.169603 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Dec 02 07:27:05 crc kubenswrapper[4895]: E1202 
07:27:05.169611 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 02 07:27:05 crc kubenswrapper[4895]: I1202 07:27:05.169618 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 02 07:27:05 crc kubenswrapper[4895]: E1202 07:27:05.169628 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 02 07:27:05 crc kubenswrapper[4895]: I1202 07:27:05.169634 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 02 07:27:05 crc kubenswrapper[4895]: E1202 07:27:05.169643 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 02 07:27:05 crc kubenswrapper[4895]: I1202 07:27:05.169649 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 02 07:27:05 crc kubenswrapper[4895]: E1202 07:27:05.169656 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 02 07:27:05 crc kubenswrapper[4895]: I1202 07:27:05.169663 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 02 07:27:05 crc kubenswrapper[4895]: I1202 07:27:05.169785 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 02 07:27:05 crc kubenswrapper[4895]: I1202 07:27:05.169797 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver" Dec 02 07:27:05 crc kubenswrapper[4895]: I1202 07:27:05.169804 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 02 07:27:05 crc kubenswrapper[4895]: I1202 07:27:05.169814 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 02 07:27:05 crc kubenswrapper[4895]: I1202 07:27:05.169821 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 02 07:27:05 crc kubenswrapper[4895]: I1202 07:27:05.169828 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 02 07:27:05 crc kubenswrapper[4895]: E1202 07:27:05.169917 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 02 07:27:05 crc kubenswrapper[4895]: I1202 07:27:05.169924 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 02 07:27:05 crc kubenswrapper[4895]: I1202 07:27:05.171083 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://d318f71ad32dd4a410093b85741038e468d540edd30f112234a9d595bc6fd85f" gracePeriod=15 Dec 02 07:27:05 crc kubenswrapper[4895]: I1202 07:27:05.171095 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" 
containerID="cri-o://4455fec2a817e7266e152d3f4afbede483504d6066f1a0f7375bce282ae10f8b" gracePeriod=15 Dec 02 07:27:05 crc kubenswrapper[4895]: I1202 07:27:05.171179 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://b674e20b2199fd0083aed9cf001d27a72d31ca6c62cbe376c9510cc94f37d312" gracePeriod=15 Dec 02 07:27:05 crc kubenswrapper[4895]: I1202 07:27:05.171263 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://40e020a2b4caf39d1661cee9c6b0055eca86e0ae9b81d2da7485ca9ec92dd0aa" gracePeriod=15 Dec 02 07:27:05 crc kubenswrapper[4895]: I1202 07:27:05.171425 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://33e300332628d3baf9041b52dc7b8cd9dab1e9c5da76572e86b787433fd5fb92" gracePeriod=15 Dec 02 07:27:05 crc kubenswrapper[4895]: I1202 07:27:05.223808 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 02 07:27:05 crc kubenswrapper[4895]: I1202 07:27:05.308421 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 07:27:05 crc kubenswrapper[4895]: I1202 07:27:05.308981 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 07:27:05 crc kubenswrapper[4895]: I1202 07:27:05.309241 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 07:27:05 crc kubenswrapper[4895]: I1202 07:27:05.309286 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 07:27:05 crc kubenswrapper[4895]: I1202 07:27:05.309326 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 07:27:05 crc kubenswrapper[4895]: I1202 07:27:05.309401 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 07:27:05 crc kubenswrapper[4895]: I1202 07:27:05.309465 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 07:27:05 crc kubenswrapper[4895]: I1202 07:27:05.309489 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 07:27:05 crc kubenswrapper[4895]: I1202 07:27:05.411140 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 07:27:05 crc kubenswrapper[4895]: I1202 07:27:05.411204 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 07:27:05 crc kubenswrapper[4895]: I1202 07:27:05.411231 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 07:27:05 crc kubenswrapper[4895]: I1202 07:27:05.411256 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 07:27:05 crc kubenswrapper[4895]: I1202 07:27:05.411279 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 07:27:05 crc kubenswrapper[4895]: I1202 07:27:05.411257 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 07:27:05 crc kubenswrapper[4895]: I1202 07:27:05.411305 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 07:27:05 crc kubenswrapper[4895]: I1202 07:27:05.411230 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 07:27:05 crc kubenswrapper[4895]: I1202 07:27:05.411312 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: 
\"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 07:27:05 crc kubenswrapper[4895]: I1202 07:27:05.411330 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 07:27:05 crc kubenswrapper[4895]: I1202 07:27:05.411355 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 07:27:05 crc kubenswrapper[4895]: I1202 07:27:05.411384 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 07:27:05 crc kubenswrapper[4895]: I1202 07:27:05.411400 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 07:27:05 crc kubenswrapper[4895]: I1202 07:27:05.411411 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: 
\"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 07:27:05 crc kubenswrapper[4895]: I1202 07:27:05.411440 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 07:27:05 crc kubenswrapper[4895]: I1202 07:27:05.411443 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 07:27:05 crc kubenswrapper[4895]: I1202 07:27:05.473815 4895 patch_prober.go:28] interesting pod/machine-config-daemon-wfcg7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 07:27:05 crc kubenswrapper[4895]: I1202 07:27:05.473897 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 07:27:05 crc kubenswrapper[4895]: I1202 07:27:05.473958 4895 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" Dec 02 07:27:05 crc kubenswrapper[4895]: I1202 07:27:05.474993 4895 kuberuntime_manager.go:1027] "Message for Container of pod" 
containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1474e8f5989607f3b95fefe811e3d787320f04c223edb6ffe47d2b37863ea874"} pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 07:27:05 crc kubenswrapper[4895]: I1202 07:27:05.475052 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerName="machine-config-daemon" containerID="cri-o://1474e8f5989607f3b95fefe811e3d787320f04c223edb6ffe47d2b37863ea874" gracePeriod=600 Dec 02 07:27:05 crc kubenswrapper[4895]: E1202 07:27:05.475852 4895 event.go:368] "Unable to write event (may retry after sleeping)" err="Patch \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/events/machine-config-daemon-wfcg7.187d5540b94c576e\": dial tcp 38.102.83.13:6443: connect: connection refused" event=< Dec 02 07:27:05 crc kubenswrapper[4895]: &Event{ObjectMeta:{machine-config-daemon-wfcg7.187d5540b94c576e openshift-machine-config-operator 29237 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:machine-config-daemon-wfcg7,UID:0468d2d1-a975-45a6-af9f-47adc432fab0,APIVersion:v1,ResourceVersion:26745,FieldPath:spec.containers{machine-config-daemon},},Reason:ProbeError,Message:Liveness probe error: Get "http://127.0.0.1:8798/health": dial tcp 127.0.0.1:8798: connect: connection refused Dec 02 07:27:05 crc kubenswrapper[4895]: body: Dec 02 07:27:05 crc kubenswrapper[4895]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-02 07:26:05 +0000 UTC,LastTimestamp:2025-12-02 07:27:05.473875974 +0000 UTC m=+236.644735587,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Dec 02 07:27:05 crc kubenswrapper[4895]: > Dec 02 07:27:05 crc kubenswrapper[4895]: I1202 07:27:05.495519 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 02 07:27:05 crc kubenswrapper[4895]: I1202 07:27:05.499507 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 02 07:27:05 crc kubenswrapper[4895]: I1202 07:27:05.500274 4895 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="b674e20b2199fd0083aed9cf001d27a72d31ca6c62cbe376c9510cc94f37d312" exitCode=0 Dec 02 07:27:05 crc kubenswrapper[4895]: I1202 07:27:05.500302 4895 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="d318f71ad32dd4a410093b85741038e468d540edd30f112234a9d595bc6fd85f" exitCode=0 Dec 02 07:27:05 crc kubenswrapper[4895]: I1202 07:27:05.500311 4895 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="33e300332628d3baf9041b52dc7b8cd9dab1e9c5da76572e86b787433fd5fb92" exitCode=0 Dec 02 07:27:05 crc kubenswrapper[4895]: I1202 07:27:05.500319 4895 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="4455fec2a817e7266e152d3f4afbede483504d6066f1a0f7375bce282ae10f8b" exitCode=2 Dec 02 07:27:05 crc kubenswrapper[4895]: I1202 07:27:05.500383 4895 scope.go:117] "RemoveContainer" containerID="a9998dffc20c5b0579e6214d10541b9702e9c4a6563c930a41e49b6853b832ba" Dec 02 07:27:05 crc kubenswrapper[4895]: I1202 07:27:05.504555 4895 generic.go:334] "Generic (PLEG): container finished" podID="8b0aeead-bbd7-4ba2-901f-2aa5be9899b3" 
containerID="7ee6d11c2a73825aad67b7df1ba6718de0696089e8f3fe76d1f59bc77586cae8" exitCode=0 Dec 02 07:27:05 crc kubenswrapper[4895]: I1202 07:27:05.504599 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"8b0aeead-bbd7-4ba2-901f-2aa5be9899b3","Type":"ContainerDied","Data":"7ee6d11c2a73825aad67b7df1ba6718de0696089e8f3fe76d1f59bc77586cae8"} Dec 02 07:27:05 crc kubenswrapper[4895]: I1202 07:27:05.505256 4895 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.13:6443: connect: connection refused" Dec 02 07:27:05 crc kubenswrapper[4895]: I1202 07:27:05.505431 4895 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.13:6443: connect: connection refused" Dec 02 07:27:05 crc kubenswrapper[4895]: I1202 07:27:05.505600 4895 status_manager.go:851] "Failed to get status for pod" podUID="8b0aeead-bbd7-4ba2-901f-2aa5be9899b3" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.13:6443: connect: connection refused" Dec 02 07:27:05 crc kubenswrapper[4895]: I1202 07:27:05.521047 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 07:27:05 crc kubenswrapper[4895]: W1202 07:27:05.543753 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-807c066730770e82264d099e01d355751e17b81dcd20ccbb48f3f1927c24d418 WatchSource:0}: Error finding container 807c066730770e82264d099e01d355751e17b81dcd20ccbb48f3f1927c24d418: Status 404 returned error can't find the container with id 807c066730770e82264d099e01d355751e17b81dcd20ccbb48f3f1927c24d418 Dec 02 07:27:06 crc kubenswrapper[4895]: I1202 07:27:06.149285 4895 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:6443/readyz\": dial tcp 192.168.126.11:6443: connect: connection refused" start-of-body= Dec 02 07:27:06 crc kubenswrapper[4895]: I1202 07:27:06.150002 4895 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="Get \"https://192.168.126.11:6443/readyz\": dial tcp 192.168.126.11:6443: connect: connection refused" Dec 02 07:27:06 crc kubenswrapper[4895]: I1202 07:27:06.511318 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"148e2d9c65c4b9dfc5a1f01512204300666a49a5f533f72050314b72ac26a8b7"} Dec 02 07:27:06 crc kubenswrapper[4895]: I1202 07:27:06.511378 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" 
event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"807c066730770e82264d099e01d355751e17b81dcd20ccbb48f3f1927c24d418"} Dec 02 07:27:06 crc kubenswrapper[4895]: I1202 07:27:06.512142 4895 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.13:6443: connect: connection refused" Dec 02 07:27:06 crc kubenswrapper[4895]: I1202 07:27:06.512629 4895 status_manager.go:851] "Failed to get status for pod" podUID="8b0aeead-bbd7-4ba2-901f-2aa5be9899b3" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.13:6443: connect: connection refused" Dec 02 07:27:06 crc kubenswrapper[4895]: I1202 07:27:06.512937 4895 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.13:6443: connect: connection refused" Dec 02 07:27:06 crc kubenswrapper[4895]: I1202 07:27:06.524965 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 02 07:27:06 crc kubenswrapper[4895]: I1202 07:27:06.528118 4895 generic.go:334] "Generic (PLEG): container finished" podID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerID="1474e8f5989607f3b95fefe811e3d787320f04c223edb6ffe47d2b37863ea874" exitCode=0 Dec 02 07:27:06 crc kubenswrapper[4895]: I1202 07:27:06.528250 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" event={"ID":"0468d2d1-a975-45a6-af9f-47adc432fab0","Type":"ContainerDied","Data":"1474e8f5989607f3b95fefe811e3d787320f04c223edb6ffe47d2b37863ea874"} Dec 02 07:27:06 crc kubenswrapper[4895]: I1202 07:27:06.528425 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" event={"ID":"0468d2d1-a975-45a6-af9f-47adc432fab0","Type":"ContainerStarted","Data":"33bcab746983782363230aa67a92ba155150d831e649f607a859abb514a5bffb"} Dec 02 07:27:06 crc kubenswrapper[4895]: I1202 07:27:06.529212 4895 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.13:6443: connect: connection refused" Dec 02 07:27:06 crc kubenswrapper[4895]: I1202 07:27:06.529366 4895 status_manager.go:851] "Failed to get status for pod" podUID="8b0aeead-bbd7-4ba2-901f-2aa5be9899b3" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.13:6443: connect: connection refused" Dec 02 07:27:06 crc kubenswrapper[4895]: I1202 07:27:06.529864 4895 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.13:6443: connect: connection refused" Dec 02 07:27:06 crc kubenswrapper[4895]: I1202 07:27:06.530266 4895 status_manager.go:851] "Failed to get status for pod" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/pods/machine-config-daemon-wfcg7\": dial tcp 38.102.83.13:6443: connect: connection refused" Dec 02 07:27:06 crc kubenswrapper[4895]: I1202 07:27:06.909312 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 02 07:27:06 crc kubenswrapper[4895]: I1202 07:27:06.909995 4895 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.13:6443: connect: connection refused" Dec 02 07:27:06 crc kubenswrapper[4895]: I1202 07:27:06.910523 4895 status_manager.go:851] "Failed to get status for pod" podUID="8b0aeead-bbd7-4ba2-901f-2aa5be9899b3" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.13:6443: connect: connection refused" Dec 02 07:27:06 crc kubenswrapper[4895]: I1202 07:27:06.910881 4895 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.13:6443: connect: connection refused" Dec 02 07:27:06 crc kubenswrapper[4895]: I1202 07:27:06.911384 4895 status_manager.go:851] "Failed to get status for pod" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/pods/machine-config-daemon-wfcg7\": dial tcp 38.102.83.13:6443: connect: connection refused" Dec 02 07:27:07 
crc kubenswrapper[4895]: I1202 07:27:07.037230 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8b0aeead-bbd7-4ba2-901f-2aa5be9899b3-kubelet-dir\") pod \"8b0aeead-bbd7-4ba2-901f-2aa5be9899b3\" (UID: \"8b0aeead-bbd7-4ba2-901f-2aa5be9899b3\") " Dec 02 07:27:07 crc kubenswrapper[4895]: I1202 07:27:07.037314 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8b0aeead-bbd7-4ba2-901f-2aa5be9899b3-var-lock\") pod \"8b0aeead-bbd7-4ba2-901f-2aa5be9899b3\" (UID: \"8b0aeead-bbd7-4ba2-901f-2aa5be9899b3\") " Dec 02 07:27:07 crc kubenswrapper[4895]: I1202 07:27:07.037345 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8b0aeead-bbd7-4ba2-901f-2aa5be9899b3-kube-api-access\") pod \"8b0aeead-bbd7-4ba2-901f-2aa5be9899b3\" (UID: \"8b0aeead-bbd7-4ba2-901f-2aa5be9899b3\") " Dec 02 07:27:07 crc kubenswrapper[4895]: I1202 07:27:07.037356 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8b0aeead-bbd7-4ba2-901f-2aa5be9899b3-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "8b0aeead-bbd7-4ba2-901f-2aa5be9899b3" (UID: "8b0aeead-bbd7-4ba2-901f-2aa5be9899b3"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 07:27:07 crc kubenswrapper[4895]: I1202 07:27:07.037366 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8b0aeead-bbd7-4ba2-901f-2aa5be9899b3-var-lock" (OuterVolumeSpecName: "var-lock") pod "8b0aeead-bbd7-4ba2-901f-2aa5be9899b3" (UID: "8b0aeead-bbd7-4ba2-901f-2aa5be9899b3"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 07:27:07 crc kubenswrapper[4895]: I1202 07:27:07.037641 4895 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8b0aeead-bbd7-4ba2-901f-2aa5be9899b3-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 02 07:27:07 crc kubenswrapper[4895]: I1202 07:27:07.037654 4895 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8b0aeead-bbd7-4ba2-901f-2aa5be9899b3-var-lock\") on node \"crc\" DevicePath \"\"" Dec 02 07:27:07 crc kubenswrapper[4895]: I1202 07:27:07.043083 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b0aeead-bbd7-4ba2-901f-2aa5be9899b3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "8b0aeead-bbd7-4ba2-901f-2aa5be9899b3" (UID: "8b0aeead-bbd7-4ba2-901f-2aa5be9899b3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:27:07 crc kubenswrapper[4895]: I1202 07:27:07.139093 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8b0aeead-bbd7-4ba2-901f-2aa5be9899b3-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 02 07:27:07 crc kubenswrapper[4895]: I1202 07:27:07.530524 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 02 07:27:07 crc kubenswrapper[4895]: I1202 07:27:07.532009 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 07:27:07 crc kubenswrapper[4895]: I1202 07:27:07.532807 4895 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.13:6443: connect: connection refused" Dec 02 07:27:07 crc kubenswrapper[4895]: I1202 07:27:07.533149 4895 status_manager.go:851] "Failed to get status for pod" podUID="8b0aeead-bbd7-4ba2-901f-2aa5be9899b3" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.13:6443: connect: connection refused" Dec 02 07:27:07 crc kubenswrapper[4895]: I1202 07:27:07.533687 4895 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.13:6443: connect: connection refused" Dec 02 07:27:07 crc kubenswrapper[4895]: I1202 07:27:07.534068 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"8b0aeead-bbd7-4ba2-901f-2aa5be9899b3","Type":"ContainerDied","Data":"ff7ec8280457ee73ed6aa7947c5ed0e9bdf337c0d48c2da5a3ef08b6abe2c8bc"} Dec 02 07:27:07 crc kubenswrapper[4895]: I1202 07:27:07.534110 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ff7ec8280457ee73ed6aa7947c5ed0e9bdf337c0d48c2da5a3ef08b6abe2c8bc" Dec 02 07:27:07 crc kubenswrapper[4895]: I1202 07:27:07.534115 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 02 07:27:07 crc kubenswrapper[4895]: I1202 07:27:07.534264 4895 status_manager.go:851] "Failed to get status for pod" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/pods/machine-config-daemon-wfcg7\": dial tcp 38.102.83.13:6443: connect: connection refused" Dec 02 07:27:07 crc kubenswrapper[4895]: I1202 07:27:07.537732 4895 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.13:6443: connect: connection refused" Dec 02 07:27:07 crc kubenswrapper[4895]: I1202 07:27:07.537860 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 02 07:27:07 crc kubenswrapper[4895]: I1202 07:27:07.538147 4895 status_manager.go:851] "Failed to get status for pod" podUID="8b0aeead-bbd7-4ba2-901f-2aa5be9899b3" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.13:6443: connect: connection refused" Dec 02 07:27:07 crc kubenswrapper[4895]: I1202 07:27:07.538525 4895 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="40e020a2b4caf39d1661cee9c6b0055eca86e0ae9b81d2da7485ca9ec92dd0aa" exitCode=0 Dec 02 07:27:07 crc kubenswrapper[4895]: I1202 07:27:07.538620 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 07:27:07 crc kubenswrapper[4895]: I1202 07:27:07.538648 4895 scope.go:117] "RemoveContainer" containerID="b674e20b2199fd0083aed9cf001d27a72d31ca6c62cbe376c9510cc94f37d312" Dec 02 07:27:07 crc kubenswrapper[4895]: I1202 07:27:07.538702 4895 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.13:6443: connect: connection refused" Dec 02 07:27:07 crc kubenswrapper[4895]: I1202 07:27:07.539028 4895 status_manager.go:851] "Failed to get status for pod" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/pods/machine-config-daemon-wfcg7\": dial tcp 38.102.83.13:6443: connect: connection refused" Dec 02 07:27:07 crc kubenswrapper[4895]: I1202 07:27:07.551675 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 02 07:27:07 crc kubenswrapper[4895]: I1202 07:27:07.551820 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 02 07:27:07 crc kubenswrapper[4895]: I1202 07:27:07.551885 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod 
\"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 02 07:27:07 crc kubenswrapper[4895]: I1202 07:27:07.551996 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 07:27:07 crc kubenswrapper[4895]: I1202 07:27:07.552081 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 07:27:07 crc kubenswrapper[4895]: I1202 07:27:07.552103 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 07:27:07 crc kubenswrapper[4895]: I1202 07:27:07.552191 4895 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Dec 02 07:27:07 crc kubenswrapper[4895]: I1202 07:27:07.552204 4895 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Dec 02 07:27:07 crc kubenswrapper[4895]: I1202 07:27:07.552216 4895 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 02 07:27:07 crc kubenswrapper[4895]: I1202 07:27:07.553189 4895 scope.go:117] "RemoveContainer" containerID="d318f71ad32dd4a410093b85741038e468d540edd30f112234a9d595bc6fd85f" Dec 02 07:27:07 crc kubenswrapper[4895]: I1202 07:27:07.566567 4895 scope.go:117] "RemoveContainer" containerID="33e300332628d3baf9041b52dc7b8cd9dab1e9c5da76572e86b787433fd5fb92" Dec 02 07:27:07 crc kubenswrapper[4895]: I1202 07:27:07.579103 4895 scope.go:117] "RemoveContainer" containerID="4455fec2a817e7266e152d3f4afbede483504d6066f1a0f7375bce282ae10f8b" Dec 02 07:27:07 crc kubenswrapper[4895]: I1202 07:27:07.590836 4895 scope.go:117] "RemoveContainer" containerID="40e020a2b4caf39d1661cee9c6b0055eca86e0ae9b81d2da7485ca9ec92dd0aa" Dec 02 07:27:07 crc kubenswrapper[4895]: I1202 07:27:07.605213 4895 scope.go:117] "RemoveContainer" containerID="0faa0995131a1f9cbf2aa0cf7651d510fad1bdaf7e32de51f8a14b70f161e783" Dec 02 07:27:07 crc kubenswrapper[4895]: I1202 07:27:07.621471 4895 scope.go:117] "RemoveContainer" containerID="b674e20b2199fd0083aed9cf001d27a72d31ca6c62cbe376c9510cc94f37d312" Dec 02 07:27:07 crc kubenswrapper[4895]: E1202 07:27:07.622118 4895 log.go:32] "ContainerStatus from 
runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b674e20b2199fd0083aed9cf001d27a72d31ca6c62cbe376c9510cc94f37d312\": container with ID starting with b674e20b2199fd0083aed9cf001d27a72d31ca6c62cbe376c9510cc94f37d312 not found: ID does not exist" containerID="b674e20b2199fd0083aed9cf001d27a72d31ca6c62cbe376c9510cc94f37d312" Dec 02 07:27:07 crc kubenswrapper[4895]: I1202 07:27:07.622209 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b674e20b2199fd0083aed9cf001d27a72d31ca6c62cbe376c9510cc94f37d312"} err="failed to get container status \"b674e20b2199fd0083aed9cf001d27a72d31ca6c62cbe376c9510cc94f37d312\": rpc error: code = NotFound desc = could not find container \"b674e20b2199fd0083aed9cf001d27a72d31ca6c62cbe376c9510cc94f37d312\": container with ID starting with b674e20b2199fd0083aed9cf001d27a72d31ca6c62cbe376c9510cc94f37d312 not found: ID does not exist" Dec 02 07:27:07 crc kubenswrapper[4895]: I1202 07:27:07.622265 4895 scope.go:117] "RemoveContainer" containerID="d318f71ad32dd4a410093b85741038e468d540edd30f112234a9d595bc6fd85f" Dec 02 07:27:07 crc kubenswrapper[4895]: E1202 07:27:07.622821 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d318f71ad32dd4a410093b85741038e468d540edd30f112234a9d595bc6fd85f\": container with ID starting with d318f71ad32dd4a410093b85741038e468d540edd30f112234a9d595bc6fd85f not found: ID does not exist" containerID="d318f71ad32dd4a410093b85741038e468d540edd30f112234a9d595bc6fd85f" Dec 02 07:27:07 crc kubenswrapper[4895]: I1202 07:27:07.622978 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d318f71ad32dd4a410093b85741038e468d540edd30f112234a9d595bc6fd85f"} err="failed to get container status \"d318f71ad32dd4a410093b85741038e468d540edd30f112234a9d595bc6fd85f\": rpc error: code = NotFound desc = could not find container 
\"d318f71ad32dd4a410093b85741038e468d540edd30f112234a9d595bc6fd85f\": container with ID starting with d318f71ad32dd4a410093b85741038e468d540edd30f112234a9d595bc6fd85f not found: ID does not exist" Dec 02 07:27:07 crc kubenswrapper[4895]: I1202 07:27:07.623063 4895 scope.go:117] "RemoveContainer" containerID="33e300332628d3baf9041b52dc7b8cd9dab1e9c5da76572e86b787433fd5fb92" Dec 02 07:27:07 crc kubenswrapper[4895]: E1202 07:27:07.623445 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33e300332628d3baf9041b52dc7b8cd9dab1e9c5da76572e86b787433fd5fb92\": container with ID starting with 33e300332628d3baf9041b52dc7b8cd9dab1e9c5da76572e86b787433fd5fb92 not found: ID does not exist" containerID="33e300332628d3baf9041b52dc7b8cd9dab1e9c5da76572e86b787433fd5fb92" Dec 02 07:27:07 crc kubenswrapper[4895]: I1202 07:27:07.623540 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33e300332628d3baf9041b52dc7b8cd9dab1e9c5da76572e86b787433fd5fb92"} err="failed to get container status \"33e300332628d3baf9041b52dc7b8cd9dab1e9c5da76572e86b787433fd5fb92\": rpc error: code = NotFound desc = could not find container \"33e300332628d3baf9041b52dc7b8cd9dab1e9c5da76572e86b787433fd5fb92\": container with ID starting with 33e300332628d3baf9041b52dc7b8cd9dab1e9c5da76572e86b787433fd5fb92 not found: ID does not exist" Dec 02 07:27:07 crc kubenswrapper[4895]: I1202 07:27:07.623591 4895 scope.go:117] "RemoveContainer" containerID="4455fec2a817e7266e152d3f4afbede483504d6066f1a0f7375bce282ae10f8b" Dec 02 07:27:07 crc kubenswrapper[4895]: E1202 07:27:07.624227 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4455fec2a817e7266e152d3f4afbede483504d6066f1a0f7375bce282ae10f8b\": container with ID starting with 4455fec2a817e7266e152d3f4afbede483504d6066f1a0f7375bce282ae10f8b not found: ID does not exist" 
containerID="4455fec2a817e7266e152d3f4afbede483504d6066f1a0f7375bce282ae10f8b" Dec 02 07:27:07 crc kubenswrapper[4895]: I1202 07:27:07.624277 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4455fec2a817e7266e152d3f4afbede483504d6066f1a0f7375bce282ae10f8b"} err="failed to get container status \"4455fec2a817e7266e152d3f4afbede483504d6066f1a0f7375bce282ae10f8b\": rpc error: code = NotFound desc = could not find container \"4455fec2a817e7266e152d3f4afbede483504d6066f1a0f7375bce282ae10f8b\": container with ID starting with 4455fec2a817e7266e152d3f4afbede483504d6066f1a0f7375bce282ae10f8b not found: ID does not exist" Dec 02 07:27:07 crc kubenswrapper[4895]: I1202 07:27:07.624309 4895 scope.go:117] "RemoveContainer" containerID="40e020a2b4caf39d1661cee9c6b0055eca86e0ae9b81d2da7485ca9ec92dd0aa" Dec 02 07:27:07 crc kubenswrapper[4895]: E1202 07:27:07.624618 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"40e020a2b4caf39d1661cee9c6b0055eca86e0ae9b81d2da7485ca9ec92dd0aa\": container with ID starting with 40e020a2b4caf39d1661cee9c6b0055eca86e0ae9b81d2da7485ca9ec92dd0aa not found: ID does not exist" containerID="40e020a2b4caf39d1661cee9c6b0055eca86e0ae9b81d2da7485ca9ec92dd0aa" Dec 02 07:27:07 crc kubenswrapper[4895]: I1202 07:27:07.624730 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40e020a2b4caf39d1661cee9c6b0055eca86e0ae9b81d2da7485ca9ec92dd0aa"} err="failed to get container status \"40e020a2b4caf39d1661cee9c6b0055eca86e0ae9b81d2da7485ca9ec92dd0aa\": rpc error: code = NotFound desc = could not find container \"40e020a2b4caf39d1661cee9c6b0055eca86e0ae9b81d2da7485ca9ec92dd0aa\": container with ID starting with 40e020a2b4caf39d1661cee9c6b0055eca86e0ae9b81d2da7485ca9ec92dd0aa not found: ID does not exist" Dec 02 07:27:07 crc kubenswrapper[4895]: I1202 07:27:07.624834 4895 scope.go:117] 
"RemoveContainer" containerID="0faa0995131a1f9cbf2aa0cf7651d510fad1bdaf7e32de51f8a14b70f161e783" Dec 02 07:27:07 crc kubenswrapper[4895]: E1202 07:27:07.625228 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0faa0995131a1f9cbf2aa0cf7651d510fad1bdaf7e32de51f8a14b70f161e783\": container with ID starting with 0faa0995131a1f9cbf2aa0cf7651d510fad1bdaf7e32de51f8a14b70f161e783 not found: ID does not exist" containerID="0faa0995131a1f9cbf2aa0cf7651d510fad1bdaf7e32de51f8a14b70f161e783" Dec 02 07:27:07 crc kubenswrapper[4895]: I1202 07:27:07.625306 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0faa0995131a1f9cbf2aa0cf7651d510fad1bdaf7e32de51f8a14b70f161e783"} err="failed to get container status \"0faa0995131a1f9cbf2aa0cf7651d510fad1bdaf7e32de51f8a14b70f161e783\": rpc error: code = NotFound desc = could not find container \"0faa0995131a1f9cbf2aa0cf7651d510fad1bdaf7e32de51f8a14b70f161e783\": container with ID starting with 0faa0995131a1f9cbf2aa0cf7651d510fad1bdaf7e32de51f8a14b70f161e783 not found: ID does not exist" Dec 02 07:27:07 crc kubenswrapper[4895]: I1202 07:27:07.865251 4895 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.13:6443: connect: connection refused" Dec 02 07:27:07 crc kubenswrapper[4895]: I1202 07:27:07.865996 4895 status_manager.go:851] "Failed to get status for pod" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/pods/machine-config-daemon-wfcg7\": dial tcp 38.102.83.13:6443: connect: connection refused" Dec 02 07:27:07 crc 
kubenswrapper[4895]: I1202 07:27:07.867026 4895 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.13:6443: connect: connection refused" Dec 02 07:27:07 crc kubenswrapper[4895]: I1202 07:27:07.868610 4895 status_manager.go:851] "Failed to get status for pod" podUID="8b0aeead-bbd7-4ba2-901f-2aa5be9899b3" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.13:6443: connect: connection refused" Dec 02 07:27:08 crc kubenswrapper[4895]: E1202 07:27:08.053133 4895 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.13:6443: connect: connection refused" Dec 02 07:27:08 crc kubenswrapper[4895]: E1202 07:27:08.054933 4895 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.13:6443: connect: connection refused" Dec 02 07:27:08 crc kubenswrapper[4895]: E1202 07:27:08.055405 4895 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.13:6443: connect: connection refused" Dec 02 07:27:08 crc kubenswrapper[4895]: E1202 07:27:08.055867 4895 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.13:6443: connect: connection refused" Dec 02 07:27:08 crc kubenswrapper[4895]: 
E1202 07:27:08.056505 4895 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.13:6443: connect: connection refused" Dec 02 07:27:08 crc kubenswrapper[4895]: I1202 07:27:08.056534 4895 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Dec 02 07:27:08 crc kubenswrapper[4895]: E1202 07:27:08.056837 4895 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.13:6443: connect: connection refused" interval="200ms" Dec 02 07:27:08 crc kubenswrapper[4895]: E1202 07:27:08.257613 4895 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.13:6443: connect: connection refused" interval="400ms" Dec 02 07:27:08 crc kubenswrapper[4895]: E1202 07:27:08.658941 4895 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.13:6443: connect: connection refused" interval="800ms" Dec 02 07:27:09 crc kubenswrapper[4895]: I1202 07:27:09.144506 4895 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.13:6443: connect: connection refused" Dec 02 07:27:09 crc kubenswrapper[4895]: I1202 07:27:09.144840 4895 status_manager.go:851] "Failed to get status for pod" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" 
pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/pods/machine-config-daemon-wfcg7\": dial tcp 38.102.83.13:6443: connect: connection refused" Dec 02 07:27:09 crc kubenswrapper[4895]: I1202 07:27:09.145123 4895 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.13:6443: connect: connection refused" Dec 02 07:27:09 crc kubenswrapper[4895]: I1202 07:27:09.149172 4895 status_manager.go:851] "Failed to get status for pod" podUID="8b0aeead-bbd7-4ba2-901f-2aa5be9899b3" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.13:6443: connect: connection refused" Dec 02 07:27:09 crc kubenswrapper[4895]: I1202 07:27:09.153103 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Dec 02 07:27:09 crc kubenswrapper[4895]: E1202 07:27:09.459925 4895 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.13:6443: connect: connection refused" interval="1.6s" Dec 02 07:27:11 crc kubenswrapper[4895]: E1202 07:27:11.060902 4895 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.13:6443: connect: connection refused" interval="3.2s" Dec 02 07:27:14 crc kubenswrapper[4895]: 
E1202 07:27:14.262733 4895 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.13:6443: connect: connection refused" interval="6.4s" Dec 02 07:27:14 crc kubenswrapper[4895]: E1202 07:27:14.958074 4895 event.go:368] "Unable to write event (may retry after sleeping)" err="Patch \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/events/machine-config-daemon-wfcg7.187d5540b94c576e\": dial tcp 38.102.83.13:6443: connect: connection refused" event=< Dec 02 07:27:14 crc kubenswrapper[4895]: &Event{ObjectMeta:{machine-config-daemon-wfcg7.187d5540b94c576e openshift-machine-config-operator 29237 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:machine-config-daemon-wfcg7,UID:0468d2d1-a975-45a6-af9f-47adc432fab0,APIVersion:v1,ResourceVersion:26745,FieldPath:spec.containers{machine-config-daemon},},Reason:ProbeError,Message:Liveness probe error: Get "http://127.0.0.1:8798/health": dial tcp 127.0.0.1:8798: connect: connection refused Dec 02 07:27:14 crc kubenswrapper[4895]: body: Dec 02 07:27:14 crc kubenswrapper[4895]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-02 07:26:05 +0000 UTC,LastTimestamp:2025-12-02 07:27:05.473875974 +0000 UTC m=+236.644735587,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Dec 02 07:27:14 crc kubenswrapper[4895]: > Dec 02 07:27:16 crc kubenswrapper[4895]: I1202 07:27:16.140786 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 07:27:16 crc kubenswrapper[4895]: I1202 07:27:16.141618 4895 status_manager.go:851] "Failed to get status for pod" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/pods/machine-config-daemon-wfcg7\": dial tcp 38.102.83.13:6443: connect: connection refused" Dec 02 07:27:16 crc kubenswrapper[4895]: I1202 07:27:16.142241 4895 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.13:6443: connect: connection refused" Dec 02 07:27:16 crc kubenswrapper[4895]: I1202 07:27:16.144390 4895 status_manager.go:851] "Failed to get status for pod" podUID="8b0aeead-bbd7-4ba2-901f-2aa5be9899b3" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.13:6443: connect: connection refused" Dec 02 07:27:16 crc kubenswrapper[4895]: I1202 07:27:16.158488 4895 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="7b7fa1ad-7e38-4645-ab41-c7d395f5c10e" Dec 02 07:27:16 crc kubenswrapper[4895]: I1202 07:27:16.158515 4895 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="7b7fa1ad-7e38-4645-ab41-c7d395f5c10e" Dec 02 07:27:16 crc kubenswrapper[4895]: E1202 07:27:16.158960 4895 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.13:6443: connect: connection 
refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 07:27:16 crc kubenswrapper[4895]: I1202 07:27:16.159693 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 07:27:16 crc kubenswrapper[4895]: W1202 07:27:16.188232 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-e072b4a25f9c1b55ed4ab65af7da571e3ecc6bac7901ca4cc7887ea04bc8a9fd WatchSource:0}: Error finding container e072b4a25f9c1b55ed4ab65af7da571e3ecc6bac7901ca4cc7887ea04bc8a9fd: Status 404 returned error can't find the container with id e072b4a25f9c1b55ed4ab65af7da571e3ecc6bac7901ca4cc7887ea04bc8a9fd Dec 02 07:27:16 crc kubenswrapper[4895]: I1202 07:27:16.597390 4895 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="18ff057cea668b338da79151f8755a81dcd60e45db3409c3c08bc7cf9b448d2a" exitCode=0 Dec 02 07:27:16 crc kubenswrapper[4895]: I1202 07:27:16.597549 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"18ff057cea668b338da79151f8755a81dcd60e45db3409c3c08bc7cf9b448d2a"} Dec 02 07:27:16 crc kubenswrapper[4895]: I1202 07:27:16.597767 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"e072b4a25f9c1b55ed4ab65af7da571e3ecc6bac7901ca4cc7887ea04bc8a9fd"} Dec 02 07:27:16 crc kubenswrapper[4895]: I1202 07:27:16.598091 4895 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="7b7fa1ad-7e38-4645-ab41-c7d395f5c10e" Dec 02 07:27:16 crc kubenswrapper[4895]: I1202 07:27:16.598110 4895 mirror_client.go:130] "Deleting a mirror pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="7b7fa1ad-7e38-4645-ab41-c7d395f5c10e" Dec 02 07:27:16 crc kubenswrapper[4895]: E1202 07:27:16.598507 4895 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.13:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 07:27:16 crc kubenswrapper[4895]: I1202 07:27:16.598654 4895 status_manager.go:851] "Failed to get status for pod" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/pods/machine-config-daemon-wfcg7\": dial tcp 38.102.83.13:6443: connect: connection refused" Dec 02 07:27:16 crc kubenswrapper[4895]: I1202 07:27:16.599119 4895 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.13:6443: connect: connection refused" Dec 02 07:27:16 crc kubenswrapper[4895]: I1202 07:27:16.599349 4895 status_manager.go:851] "Failed to get status for pod" podUID="8b0aeead-bbd7-4ba2-901f-2aa5be9899b3" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.13:6443: connect: connection refused" Dec 02 07:27:17 crc kubenswrapper[4895]: I1202 07:27:17.620119 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"5942a2850731186a2cead0ea4d86c9455219498342bad3260497959ace0e992a"} Dec 02 07:27:17 
crc kubenswrapper[4895]: I1202 07:27:17.620482 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"92d1425fe94af944eaeca427e6732ace6484f5430fe5eaf1f7eb2c7548d219d6"} Dec 02 07:27:17 crc kubenswrapper[4895]: I1202 07:27:17.620494 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"597808c2bc019c4ff0dffac317cd421b636fe869a59f1bd5b31d847cef00c089"} Dec 02 07:27:17 crc kubenswrapper[4895]: I1202 07:27:17.620509 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"1047df0579985f44680b37a452b8bd05bbe6926016bb3db67ac167d56d4f4869"} Dec 02 07:27:18 crc kubenswrapper[4895]: I1202 07:27:18.629445 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"6472c1bbe910da6bf683b50509d855a05b8cec57ef53f211ef913fc31571bef4"} Dec 02 07:27:18 crc kubenswrapper[4895]: I1202 07:27:18.629810 4895 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="7b7fa1ad-7e38-4645-ab41-c7d395f5c10e" Dec 02 07:27:18 crc kubenswrapper[4895]: I1202 07:27:18.629827 4895 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="7b7fa1ad-7e38-4645-ab41-c7d395f5c10e" Dec 02 07:27:18 crc kubenswrapper[4895]: I1202 07:27:18.629902 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 07:27:19 crc kubenswrapper[4895]: I1202 07:27:19.574656 4895 patch_prober.go:28] interesting pod/kube-controller-manager-crc 
container/kube-controller-manager namespace/openshift-kube-controller-manager: Liveness probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": EOF" start-of-body= Dec 02 07:27:19 crc kubenswrapper[4895]: I1202 07:27:19.575109 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": EOF" Dec 02 07:27:19 crc kubenswrapper[4895]: I1202 07:27:19.638349 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 02 07:27:19 crc kubenswrapper[4895]: I1202 07:27:19.638410 4895 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="c6c3baa5b3772a8fc64774e28c3f389f114723e7efa62b934140a4fed8b6a40d" exitCode=1 Dec 02 07:27:19 crc kubenswrapper[4895]: I1202 07:27:19.638449 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"c6c3baa5b3772a8fc64774e28c3f389f114723e7efa62b934140a4fed8b6a40d"} Dec 02 07:27:19 crc kubenswrapper[4895]: I1202 07:27:19.638960 4895 scope.go:117] "RemoveContainer" containerID="c6c3baa5b3772a8fc64774e28c3f389f114723e7efa62b934140a4fed8b6a40d" Dec 02 07:27:20 crc kubenswrapper[4895]: I1202 07:27:20.652348 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 02 07:27:20 crc kubenswrapper[4895]: I1202 07:27:20.652791 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"f385715c8f9dcf5dcb847ddc305c75e19f61c324951fdf708efd4d091fcd6443"} Dec 02 07:27:21 crc kubenswrapper[4895]: I1202 07:27:21.160007 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 07:27:21 crc kubenswrapper[4895]: I1202 07:27:21.160112 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 07:27:21 crc kubenswrapper[4895]: I1202 07:27:21.169663 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 07:27:23 crc kubenswrapper[4895]: I1202 07:27:23.643162 4895 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 07:27:23 crc kubenswrapper[4895]: I1202 07:27:23.734533 4895 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="568cb3df-2942-4518-8613-e14c45a355ac" Dec 02 07:27:24 crc kubenswrapper[4895]: I1202 07:27:24.674611 4895 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="7b7fa1ad-7e38-4645-ab41-c7d395f5c10e" Dec 02 07:27:24 crc kubenswrapper[4895]: I1202 07:27:24.674650 4895 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="7b7fa1ad-7e38-4645-ab41-c7d395f5c10e" Dec 02 07:27:24 crc kubenswrapper[4895]: I1202 07:27:24.677958 4895 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="568cb3df-2942-4518-8613-e14c45a355ac" Dec 02 07:27:26 crc kubenswrapper[4895]: I1202 07:27:26.697104 4895 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 07:27:26 crc kubenswrapper[4895]: I1202 07:27:26.697725 4895 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Dec 02 07:27:26 crc kubenswrapper[4895]: I1202 07:27:26.697811 4895 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Dec 02 07:27:28 crc kubenswrapper[4895]: I1202 07:27:28.794105 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 07:27:29 crc kubenswrapper[4895]: I1202 07:27:29.953260 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 02 07:27:30 crc kubenswrapper[4895]: I1202 07:27:30.383344 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 02 07:27:30 crc kubenswrapper[4895]: I1202 07:27:30.450115 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 02 07:27:30 crc kubenswrapper[4895]: I1202 07:27:30.983534 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Dec 02 07:27:31 crc kubenswrapper[4895]: I1202 07:27:31.096875 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Dec 02 
07:27:31 crc kubenswrapper[4895]: I1202 07:27:31.186436 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 02 07:27:31 crc kubenswrapper[4895]: I1202 07:27:31.210102 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Dec 02 07:27:32 crc kubenswrapper[4895]: I1202 07:27:32.304300 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 02 07:27:32 crc kubenswrapper[4895]: I1202 07:27:32.565467 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 02 07:27:32 crc kubenswrapper[4895]: I1202 07:27:32.739392 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Dec 02 07:27:32 crc kubenswrapper[4895]: I1202 07:27:32.792935 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Dec 02 07:27:32 crc kubenswrapper[4895]: I1202 07:27:32.860190 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Dec 02 07:27:33 crc kubenswrapper[4895]: I1202 07:27:33.388868 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 02 07:27:33 crc kubenswrapper[4895]: I1202 07:27:33.416376 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 02 07:27:33 crc kubenswrapper[4895]: I1202 07:27:33.509937 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Dec 02 07:27:33 crc kubenswrapper[4895]: I1202 07:27:33.549248 4895 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 02 07:27:34 crc kubenswrapper[4895]: I1202 07:27:34.079458 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 02 07:27:34 crc kubenswrapper[4895]: I1202 07:27:34.079769 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 02 07:27:34 crc kubenswrapper[4895]: I1202 07:27:34.082069 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Dec 02 07:27:34 crc kubenswrapper[4895]: I1202 07:27:34.082112 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 02 07:27:34 crc kubenswrapper[4895]: I1202 07:27:34.082129 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 02 07:27:34 crc kubenswrapper[4895]: I1202 07:27:34.082270 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Dec 02 07:27:34 crc kubenswrapper[4895]: I1202 07:27:34.115250 4895 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 02 07:27:34 crc kubenswrapper[4895]: I1202 07:27:34.117684 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=29.117646237 podStartE2EDuration="29.117646237s" podCreationTimestamp="2025-12-02 07:27:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:27:23.669374003 +0000 UTC m=+254.840233626" watchObservedRunningTime="2025-12-02 07:27:34.117646237 +0000 UTC m=+265.288505850" Dec 02 07:27:34 crc kubenswrapper[4895]: I1202 07:27:34.125025 
4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 02 07:27:34 crc kubenswrapper[4895]: I1202 07:27:34.125103 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 02 07:27:34 crc kubenswrapper[4895]: I1202 07:27:34.130486 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 07:27:34 crc kubenswrapper[4895]: I1202 07:27:34.132312 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 07:27:34 crc kubenswrapper[4895]: I1202 07:27:34.151366 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=11.151336268 podStartE2EDuration="11.151336268s" podCreationTimestamp="2025-12-02 07:27:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:27:34.145670634 +0000 UTC m=+265.316530277" watchObservedRunningTime="2025-12-02 07:27:34.151336268 +0000 UTC m=+265.322195891" Dec 02 07:27:34 crc kubenswrapper[4895]: I1202 07:27:34.253829 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Dec 02 07:27:34 crc kubenswrapper[4895]: I1202 07:27:34.693817 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 02 07:27:34 crc kubenswrapper[4895]: I1202 07:27:34.745883 4895 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 02 07:27:34 crc kubenswrapper[4895]: I1202 07:27:34.746206 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" 
podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://148e2d9c65c4b9dfc5a1f01512204300666a49a5f533f72050314b72ac26a8b7" gracePeriod=5 Dec 02 07:27:34 crc kubenswrapper[4895]: I1202 07:27:34.801014 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 02 07:27:34 crc kubenswrapper[4895]: I1202 07:27:34.876976 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 02 07:27:34 crc kubenswrapper[4895]: I1202 07:27:34.912844 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Dec 02 07:27:35 crc kubenswrapper[4895]: I1202 07:27:35.032237 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 02 07:27:35 crc kubenswrapper[4895]: I1202 07:27:35.181798 4895 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 02 07:27:35 crc kubenswrapper[4895]: I1202 07:27:35.557568 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 02 07:27:36 crc kubenswrapper[4895]: I1202 07:27:36.209706 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Dec 02 07:27:36 crc kubenswrapper[4895]: I1202 07:27:36.212380 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 02 07:27:36 crc kubenswrapper[4895]: I1202 07:27:36.264307 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 02 07:27:36 crc kubenswrapper[4895]: I1202 07:27:36.697696 4895 patch_prober.go:28] interesting 
pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Dec 02 07:27:36 crc kubenswrapper[4895]: I1202 07:27:36.697891 4895 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Dec 02 07:27:36 crc kubenswrapper[4895]: I1202 07:27:36.707888 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Dec 02 07:27:37 crc kubenswrapper[4895]: I1202 07:27:37.670206 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 02 07:27:37 crc kubenswrapper[4895]: I1202 07:27:37.725780 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Dec 02 07:27:37 crc kubenswrapper[4895]: I1202 07:27:37.740905 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Dec 02 07:27:37 crc kubenswrapper[4895]: I1202 07:27:37.832491 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Dec 02 07:27:37 crc kubenswrapper[4895]: I1202 07:27:37.838137 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 02 07:27:37 crc kubenswrapper[4895]: I1202 07:27:37.964067 4895 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 02 07:27:38 crc kubenswrapper[4895]: I1202 07:27:38.058386 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Dec 02 07:27:38 crc kubenswrapper[4895]: I1202 07:27:38.137566 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 02 07:27:38 crc kubenswrapper[4895]: I1202 07:27:38.328446 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Dec 02 07:27:38 crc kubenswrapper[4895]: I1202 07:27:38.414701 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 02 07:27:38 crc kubenswrapper[4895]: I1202 07:27:38.499833 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 02 07:27:38 crc kubenswrapper[4895]: I1202 07:27:38.763630 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Dec 02 07:27:38 crc kubenswrapper[4895]: I1202 07:27:38.946483 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 02 07:27:39 crc kubenswrapper[4895]: I1202 07:27:39.033705 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Dec 02 07:27:39 crc kubenswrapper[4895]: I1202 07:27:39.112669 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Dec 02 07:27:39 crc kubenswrapper[4895]: I1202 07:27:39.127711 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Dec 02 07:27:39 
crc kubenswrapper[4895]: I1202 07:27:39.132272 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 02 07:27:39 crc kubenswrapper[4895]: I1202 07:27:39.141016 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 02 07:27:39 crc kubenswrapper[4895]: I1202 07:27:39.180995 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 02 07:27:39 crc kubenswrapper[4895]: I1202 07:27:39.376076 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 02 07:27:39 crc kubenswrapper[4895]: I1202 07:27:39.394446 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 02 07:27:39 crc kubenswrapper[4895]: I1202 07:27:39.520955 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Dec 02 07:27:39 crc kubenswrapper[4895]: I1202 07:27:39.545320 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 02 07:27:39 crc kubenswrapper[4895]: I1202 07:27:39.636192 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Dec 02 07:27:39 crc kubenswrapper[4895]: I1202 07:27:39.701022 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Dec 02 07:27:39 crc kubenswrapper[4895]: I1202 07:27:39.762837 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Dec 02 07:27:39 crc kubenswrapper[4895]: I1202 07:27:39.825921 4895 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ingress-canary"/"openshift-service-ca.crt" Dec 02 07:27:39 crc kubenswrapper[4895]: I1202 07:27:39.838148 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 02 07:27:39 crc kubenswrapper[4895]: I1202 07:27:39.858206 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 02 07:27:39 crc kubenswrapper[4895]: I1202 07:27:39.860558 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 02 07:27:39 crc kubenswrapper[4895]: I1202 07:27:39.887528 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 02 07:27:39 crc kubenswrapper[4895]: I1202 07:27:39.911178 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 02 07:27:39 crc kubenswrapper[4895]: I1202 07:27:39.973169 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Dec 02 07:27:40 crc kubenswrapper[4895]: I1202 07:27:40.135558 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Dec 02 07:27:40 crc kubenswrapper[4895]: I1202 07:27:40.137217 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 02 07:27:40 crc kubenswrapper[4895]: I1202 07:27:40.137280 4895 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="148e2d9c65c4b9dfc5a1f01512204300666a49a5f533f72050314b72ac26a8b7" exitCode=137 Dec 02 07:27:40 crc kubenswrapper[4895]: I1202 07:27:40.226639 4895 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-dns"/"dns-default" Dec 02 07:27:40 crc kubenswrapper[4895]: I1202 07:27:40.338980 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 02 07:27:40 crc kubenswrapper[4895]: I1202 07:27:40.339081 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 07:27:40 crc kubenswrapper[4895]: I1202 07:27:40.383019 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 02 07:27:40 crc kubenswrapper[4895]: I1202 07:27:40.383142 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 02 07:27:40 crc kubenswrapper[4895]: I1202 07:27:40.383189 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 02 07:27:40 crc kubenswrapper[4895]: I1202 07:27:40.383173 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 07:27:40 crc kubenswrapper[4895]: I1202 07:27:40.383333 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 02 07:27:40 crc kubenswrapper[4895]: I1202 07:27:40.383388 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 02 07:27:40 crc kubenswrapper[4895]: I1202 07:27:40.383444 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 07:27:40 crc kubenswrapper[4895]: I1202 07:27:40.383493 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 07:27:40 crc kubenswrapper[4895]: I1202 07:27:40.383580 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 07:27:40 crc kubenswrapper[4895]: I1202 07:27:40.383998 4895 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Dec 02 07:27:40 crc kubenswrapper[4895]: I1202 07:27:40.384041 4895 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Dec 02 07:27:40 crc kubenswrapper[4895]: I1202 07:27:40.384062 4895 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Dec 02 07:27:40 crc kubenswrapper[4895]: I1202 07:27:40.384084 4895 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 02 07:27:40 crc kubenswrapper[4895]: I1202 07:27:40.396280 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 07:27:40 crc kubenswrapper[4895]: I1202 07:27:40.485440 4895 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 02 07:27:40 crc kubenswrapper[4895]: I1202 07:27:40.601043 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Dec 02 07:27:40 crc kubenswrapper[4895]: I1202 07:27:40.784241 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 02 07:27:40 crc kubenswrapper[4895]: I1202 07:27:40.793134 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Dec 02 07:27:40 crc kubenswrapper[4895]: I1202 07:27:40.830399 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 02 07:27:40 crc kubenswrapper[4895]: I1202 07:27:40.865725 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Dec 02 07:27:40 crc kubenswrapper[4895]: I1202 07:27:40.920763 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 02 07:27:40 crc kubenswrapper[4895]: I1202 07:27:40.982242 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Dec 02 07:27:41 crc kubenswrapper[4895]: I1202 07:27:41.025719 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Dec 02 07:27:41 crc kubenswrapper[4895]: I1202 07:27:41.040725 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 02 
07:27:41 crc kubenswrapper[4895]: I1202 07:27:41.144946 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 02 07:27:41 crc kubenswrapper[4895]: I1202 07:27:41.145101 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 07:27:41 crc kubenswrapper[4895]: I1202 07:27:41.149315 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Dec 02 07:27:41 crc kubenswrapper[4895]: I1202 07:27:41.149576 4895 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Dec 02 07:27:41 crc kubenswrapper[4895]: I1202 07:27:41.156610 4895 scope.go:117] "RemoveContainer" containerID="148e2d9c65c4b9dfc5a1f01512204300666a49a5f533f72050314b72ac26a8b7" Dec 02 07:27:41 crc kubenswrapper[4895]: I1202 07:27:41.159919 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 02 07:27:41 crc kubenswrapper[4895]: I1202 07:27:41.159944 4895 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="1f148dc8-a15c-469d-b40e-db07929af88a" Dec 02 07:27:41 crc kubenswrapper[4895]: I1202 07:27:41.163514 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 02 07:27:41 crc kubenswrapper[4895]: I1202 07:27:41.163550 4895 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="1f148dc8-a15c-469d-b40e-db07929af88a" Dec 02 07:27:41 crc kubenswrapper[4895]: I1202 
07:27:41.227208 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 02 07:27:41 crc kubenswrapper[4895]: I1202 07:27:41.277608 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 02 07:27:41 crc kubenswrapper[4895]: I1202 07:27:41.298122 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Dec 02 07:27:41 crc kubenswrapper[4895]: I1202 07:27:41.300298 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 02 07:27:41 crc kubenswrapper[4895]: I1202 07:27:41.317175 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 02 07:27:41 crc kubenswrapper[4895]: I1202 07:27:41.430848 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Dec 02 07:27:41 crc kubenswrapper[4895]: I1202 07:27:41.544720 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 02 07:27:41 crc kubenswrapper[4895]: I1202 07:27:41.545723 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 02 07:27:41 crc kubenswrapper[4895]: I1202 07:27:41.639337 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Dec 02 07:27:41 crc kubenswrapper[4895]: I1202 07:27:41.867006 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 02 
07:27:41 crc kubenswrapper[4895]: I1202 07:27:41.868338 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 02 07:27:42 crc kubenswrapper[4895]: I1202 07:27:42.064631 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 02 07:27:42 crc kubenswrapper[4895]: I1202 07:27:42.108893 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Dec 02 07:27:42 crc kubenswrapper[4895]: I1202 07:27:42.217245 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Dec 02 07:27:42 crc kubenswrapper[4895]: I1202 07:27:42.246443 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 02 07:27:42 crc kubenswrapper[4895]: I1202 07:27:42.263562 4895 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 02 07:27:42 crc kubenswrapper[4895]: I1202 07:27:42.289610 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Dec 02 07:27:42 crc kubenswrapper[4895]: I1202 07:27:42.381035 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 02 07:27:42 crc kubenswrapper[4895]: I1202 07:27:42.424601 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 02 07:27:42 crc kubenswrapper[4895]: I1202 07:27:42.444185 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 02 07:27:42 crc kubenswrapper[4895]: I1202 07:27:42.484161 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 02 07:27:42 crc kubenswrapper[4895]: 
I1202 07:27:42.485199 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Dec 02 07:27:42 crc kubenswrapper[4895]: I1202 07:27:42.601224 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 02 07:27:42 crc kubenswrapper[4895]: I1202 07:27:42.662265 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Dec 02 07:27:42 crc kubenswrapper[4895]: I1202 07:27:42.699376 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Dec 02 07:27:42 crc kubenswrapper[4895]: I1202 07:27:42.761396 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 02 07:27:42 crc kubenswrapper[4895]: I1202 07:27:42.815729 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Dec 02 07:27:42 crc kubenswrapper[4895]: I1202 07:27:42.869147 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Dec 02 07:27:42 crc kubenswrapper[4895]: I1202 07:27:42.897689 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 02 07:27:43 crc kubenswrapper[4895]: I1202 07:27:43.023549 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Dec 02 07:27:43 crc kubenswrapper[4895]: I1202 07:27:43.080210 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 02 07:27:43 crc kubenswrapper[4895]: I1202 07:27:43.161952 4895 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 02 07:27:43 crc kubenswrapper[4895]: I1202 07:27:43.197270 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 02 07:27:43 crc kubenswrapper[4895]: I1202 07:27:43.248719 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 02 07:27:43 crc kubenswrapper[4895]: I1202 07:27:43.259662 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Dec 02 07:27:43 crc kubenswrapper[4895]: I1202 07:27:43.288027 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 02 07:27:43 crc kubenswrapper[4895]: I1202 07:27:43.378483 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 02 07:27:43 crc kubenswrapper[4895]: I1202 07:27:43.384890 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Dec 02 07:27:43 crc kubenswrapper[4895]: I1202 07:27:43.532529 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 02 07:27:43 crc kubenswrapper[4895]: I1202 07:27:43.598730 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Dec 02 07:27:43 crc kubenswrapper[4895]: I1202 07:27:43.638135 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 02 07:27:43 crc kubenswrapper[4895]: I1202 07:27:43.703706 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Dec 02 07:27:43 crc kubenswrapper[4895]: I1202 07:27:43.796902 4895 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 02 07:27:43 crc kubenswrapper[4895]: I1202 07:27:43.896821 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 02 07:27:44 crc kubenswrapper[4895]: I1202 07:27:44.050072 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 02 07:27:44 crc kubenswrapper[4895]: I1202 07:27:44.054517 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 02 07:27:44 crc kubenswrapper[4895]: I1202 07:27:44.065592 4895 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 02 07:27:44 crc kubenswrapper[4895]: I1202 07:27:44.108252 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zbpzj"] Dec 02 07:27:44 crc kubenswrapper[4895]: I1202 07:27:44.108734 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-zbpzj" podUID="2ad8c7c3-c1ac-498b-b1b5-76f1db5c1c19" containerName="registry-server" containerID="cri-o://13b77a9b2065f78c86563dc137d56ac1d758ce7264367e8bfbeec2badeb6cf2a" gracePeriod=30 Dec 02 07:27:44 crc kubenswrapper[4895]: I1202 07:27:44.118145 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5q5hv"] Dec 02 07:27:44 crc kubenswrapper[4895]: I1202 07:27:44.118503 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-5q5hv" podUID="d0fded14-dfbe-41aa-af93-f68c62a1aca1" containerName="registry-server" containerID="cri-o://60787ac5efa236959c3cc0c8a4a1fa0d521e29fe1c5864aed373319ee7584094" gracePeriod=30 Dec 02 07:27:44 crc kubenswrapper[4895]: I1202 07:27:44.122305 4895 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Dec 02 07:27:44 crc kubenswrapper[4895]: I1202 07:27:44.134392 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 02 07:27:44 crc kubenswrapper[4895]: I1202 07:27:44.138032 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zkq6j"] Dec 02 07:27:44 crc kubenswrapper[4895]: I1202 07:27:44.138569 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-zkq6j" podUID="4a9d5b86-ddba-433a-91c3-efe2043f66e3" containerName="marketplace-operator" containerID="cri-o://17cec86d4a6d9e0a4d2baa9caf84abb93e703183176413e7ec47931e8c2dcff2" gracePeriod=30 Dec 02 07:27:44 crc kubenswrapper[4895]: I1202 07:27:44.160438 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ptmfz"] Dec 02 07:27:44 crc kubenswrapper[4895]: I1202 07:27:44.160864 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-ptmfz" podUID="24eca501-8830-4bc6-8a5e-e00d227e841c" containerName="registry-server" containerID="cri-o://628851b9b305622aac4e980541e5a271ee009414ae0227ef0a5ecc9cc3b60eb1" gracePeriod=30 Dec 02 07:27:44 crc kubenswrapper[4895]: I1202 07:27:44.173708 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 02 07:27:44 crc kubenswrapper[4895]: I1202 07:27:44.176157 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wjh4k"] Dec 02 07:27:44 crc kubenswrapper[4895]: I1202 07:27:44.176567 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-wjh4k" podUID="f4ffa89c-1b2e-4dd4-afa5-f34cd8260364" 
containerName="registry-server" containerID="cri-o://15907fae904024bdd546782dd70a88d8ecd4cc0236cd2411fabeff8b259cba68" gracePeriod=30 Dec 02 07:27:44 crc kubenswrapper[4895]: I1202 07:27:44.204339 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 02 07:27:44 crc kubenswrapper[4895]: I1202 07:27:44.227216 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Dec 02 07:27:44 crc kubenswrapper[4895]: I1202 07:27:44.230191 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Dec 02 07:27:44 crc kubenswrapper[4895]: I1202 07:27:44.259808 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 02 07:27:44 crc kubenswrapper[4895]: I1202 07:27:44.260077 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 02 07:27:44 crc kubenswrapper[4895]: I1202 07:27:44.293974 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 02 07:27:44 crc kubenswrapper[4895]: I1202 07:27:44.295963 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 02 07:27:44 crc kubenswrapper[4895]: I1202 07:27:44.432958 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 02 07:27:44 crc kubenswrapper[4895]: I1202 07:27:44.437664 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 02 07:27:44 crc kubenswrapper[4895]: I1202 07:27:44.450983 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Dec 02 07:27:44 crc 
kubenswrapper[4895]: I1202 07:27:44.490101 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Dec 02 07:27:44 crc kubenswrapper[4895]: I1202 07:27:44.498019 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Dec 02 07:27:44 crc kubenswrapper[4895]: I1202 07:27:44.506718 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 02 07:27:44 crc kubenswrapper[4895]: I1202 07:27:44.563920 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Dec 02 07:27:44 crc kubenswrapper[4895]: I1202 07:27:44.580233 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5q5hv" Dec 02 07:27:44 crc kubenswrapper[4895]: I1202 07:27:44.647023 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Dec 02 07:27:44 crc kubenswrapper[4895]: I1202 07:27:44.652911 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zbpzj" Dec 02 07:27:44 crc kubenswrapper[4895]: I1202 07:27:44.656205 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-zkq6j" Dec 02 07:27:44 crc kubenswrapper[4895]: I1202 07:27:44.660503 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Dec 02 07:27:44 crc kubenswrapper[4895]: I1202 07:27:44.666532 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wjh4k" Dec 02 07:27:44 crc kubenswrapper[4895]: I1202 07:27:44.673372 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 02 07:27:44 crc kubenswrapper[4895]: I1202 07:27:44.676468 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ptmfz" Dec 02 07:27:44 crc kubenswrapper[4895]: I1202 07:27:44.684843 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Dec 02 07:27:44 crc kubenswrapper[4895]: I1202 07:27:44.726921 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 02 07:27:44 crc kubenswrapper[4895]: I1202 07:27:44.736572 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Dec 02 07:27:44 crc kubenswrapper[4895]: I1202 07:27:44.743621 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7md7f\" (UniqueName: \"kubernetes.io/projected/2ad8c7c3-c1ac-498b-b1b5-76f1db5c1c19-kube-api-access-7md7f\") pod \"2ad8c7c3-c1ac-498b-b1b5-76f1db5c1c19\" (UID: \"2ad8c7c3-c1ac-498b-b1b5-76f1db5c1c19\") " Dec 02 07:27:44 crc kubenswrapper[4895]: I1202 07:27:44.743853 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/4a9d5b86-ddba-433a-91c3-efe2043f66e3-marketplace-operator-metrics\") pod \"4a9d5b86-ddba-433a-91c3-efe2043f66e3\" (UID: \"4a9d5b86-ddba-433a-91c3-efe2043f66e3\") " Dec 02 07:27:44 crc kubenswrapper[4895]: I1202 07:27:44.743995 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-r2v96\" (UniqueName: \"kubernetes.io/projected/4a9d5b86-ddba-433a-91c3-efe2043f66e3-kube-api-access-r2v96\") pod \"4a9d5b86-ddba-433a-91c3-efe2043f66e3\" (UID: \"4a9d5b86-ddba-433a-91c3-efe2043f66e3\") " Dec 02 07:27:44 crc kubenswrapper[4895]: I1202 07:27:44.744159 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0fded14-dfbe-41aa-af93-f68c62a1aca1-utilities\") pod \"d0fded14-dfbe-41aa-af93-f68c62a1aca1\" (UID: \"d0fded14-dfbe-41aa-af93-f68c62a1aca1\") " Dec 02 07:27:44 crc kubenswrapper[4895]: I1202 07:27:44.744299 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ad8c7c3-c1ac-498b-b1b5-76f1db5c1c19-utilities\") pod \"2ad8c7c3-c1ac-498b-b1b5-76f1db5c1c19\" (UID: \"2ad8c7c3-c1ac-498b-b1b5-76f1db5c1c19\") " Dec 02 07:27:44 crc kubenswrapper[4895]: I1202 07:27:44.744447 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4a9d5b86-ddba-433a-91c3-efe2043f66e3-marketplace-trusted-ca\") pod \"4a9d5b86-ddba-433a-91c3-efe2043f66e3\" (UID: \"4a9d5b86-ddba-433a-91c3-efe2043f66e3\") " Dec 02 07:27:44 crc kubenswrapper[4895]: I1202 07:27:44.745093 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ad8c7c3-c1ac-498b-b1b5-76f1db5c1c19-utilities" (OuterVolumeSpecName: "utilities") pod "2ad8c7c3-c1ac-498b-b1b5-76f1db5c1c19" (UID: "2ad8c7c3-c1ac-498b-b1b5-76f1db5c1c19"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:27:44 crc kubenswrapper[4895]: I1202 07:27:44.745113 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d0fded14-dfbe-41aa-af93-f68c62a1aca1-utilities" (OuterVolumeSpecName: "utilities") pod "d0fded14-dfbe-41aa-af93-f68c62a1aca1" (UID: "d0fded14-dfbe-41aa-af93-f68c62a1aca1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:27:44 crc kubenswrapper[4895]: I1202 07:27:44.745227 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a9d5b86-ddba-433a-91c3-efe2043f66e3-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "4a9d5b86-ddba-433a-91c3-efe2043f66e3" (UID: "4a9d5b86-ddba-433a-91c3-efe2043f66e3"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:27:44 crc kubenswrapper[4895]: I1202 07:27:44.745309 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0fded14-dfbe-41aa-af93-f68c62a1aca1-catalog-content\") pod \"d0fded14-dfbe-41aa-af93-f68c62a1aca1\" (UID: \"d0fded14-dfbe-41aa-af93-f68c62a1aca1\") " Dec 02 07:27:44 crc kubenswrapper[4895]: I1202 07:27:44.745450 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kglmv\" (UniqueName: \"kubernetes.io/projected/d0fded14-dfbe-41aa-af93-f68c62a1aca1-kube-api-access-kglmv\") pod \"d0fded14-dfbe-41aa-af93-f68c62a1aca1\" (UID: \"d0fded14-dfbe-41aa-af93-f68c62a1aca1\") " Dec 02 07:27:44 crc kubenswrapper[4895]: I1202 07:27:44.745532 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ad8c7c3-c1ac-498b-b1b5-76f1db5c1c19-catalog-content\") pod \"2ad8c7c3-c1ac-498b-b1b5-76f1db5c1c19\" (UID: 
\"2ad8c7c3-c1ac-498b-b1b5-76f1db5c1c19\") " Dec 02 07:27:44 crc kubenswrapper[4895]: I1202 07:27:44.746182 4895 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0fded14-dfbe-41aa-af93-f68c62a1aca1-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 07:27:44 crc kubenswrapper[4895]: I1202 07:27:44.746203 4895 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ad8c7c3-c1ac-498b-b1b5-76f1db5c1c19-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 07:27:44 crc kubenswrapper[4895]: I1202 07:27:44.746214 4895 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4a9d5b86-ddba-433a-91c3-efe2043f66e3-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 02 07:27:44 crc kubenswrapper[4895]: I1202 07:27:44.751379 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0fded14-dfbe-41aa-af93-f68c62a1aca1-kube-api-access-kglmv" (OuterVolumeSpecName: "kube-api-access-kglmv") pod "d0fded14-dfbe-41aa-af93-f68c62a1aca1" (UID: "d0fded14-dfbe-41aa-af93-f68c62a1aca1"). InnerVolumeSpecName "kube-api-access-kglmv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:27:44 crc kubenswrapper[4895]: I1202 07:27:44.751906 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a9d5b86-ddba-433a-91c3-efe2043f66e3-kube-api-access-r2v96" (OuterVolumeSpecName: "kube-api-access-r2v96") pod "4a9d5b86-ddba-433a-91c3-efe2043f66e3" (UID: "4a9d5b86-ddba-433a-91c3-efe2043f66e3"). InnerVolumeSpecName "kube-api-access-r2v96". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:27:44 crc kubenswrapper[4895]: I1202 07:27:44.752817 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ad8c7c3-c1ac-498b-b1b5-76f1db5c1c19-kube-api-access-7md7f" (OuterVolumeSpecName: "kube-api-access-7md7f") pod "2ad8c7c3-c1ac-498b-b1b5-76f1db5c1c19" (UID: "2ad8c7c3-c1ac-498b-b1b5-76f1db5c1c19"). InnerVolumeSpecName "kube-api-access-7md7f". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:27:44 crc kubenswrapper[4895]: I1202 07:27:44.757136 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a9d5b86-ddba-433a-91c3-efe2043f66e3-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "4a9d5b86-ddba-433a-91c3-efe2043f66e3" (UID: "4a9d5b86-ddba-433a-91c3-efe2043f66e3"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:27:44 crc kubenswrapper[4895]: I1202 07:27:44.806028 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ad8c7c3-c1ac-498b-b1b5-76f1db5c1c19-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2ad8c7c3-c1ac-498b-b1b5-76f1db5c1c19" (UID: "2ad8c7c3-c1ac-498b-b1b5-76f1db5c1c19"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:27:44 crc kubenswrapper[4895]: I1202 07:27:44.807185 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d0fded14-dfbe-41aa-af93-f68c62a1aca1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d0fded14-dfbe-41aa-af93-f68c62a1aca1" (UID: "d0fded14-dfbe-41aa-af93-f68c62a1aca1"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:27:44 crc kubenswrapper[4895]: I1202 07:27:44.847149 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k5wcp\" (UniqueName: \"kubernetes.io/projected/24eca501-8830-4bc6-8a5e-e00d227e841c-kube-api-access-k5wcp\") pod \"24eca501-8830-4bc6-8a5e-e00d227e841c\" (UID: \"24eca501-8830-4bc6-8a5e-e00d227e841c\") " Dec 02 07:27:44 crc kubenswrapper[4895]: I1202 07:27:44.847293 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24eca501-8830-4bc6-8a5e-e00d227e841c-catalog-content\") pod \"24eca501-8830-4bc6-8a5e-e00d227e841c\" (UID: \"24eca501-8830-4bc6-8a5e-e00d227e841c\") " Dec 02 07:27:44 crc kubenswrapper[4895]: I1202 07:27:44.847365 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4ffa89c-1b2e-4dd4-afa5-f34cd8260364-utilities\") pod \"f4ffa89c-1b2e-4dd4-afa5-f34cd8260364\" (UID: \"f4ffa89c-1b2e-4dd4-afa5-f34cd8260364\") " Dec 02 07:27:44 crc kubenswrapper[4895]: I1202 07:27:44.847406 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4ffa89c-1b2e-4dd4-afa5-f34cd8260364-catalog-content\") pod \"f4ffa89c-1b2e-4dd4-afa5-f34cd8260364\" (UID: \"f4ffa89c-1b2e-4dd4-afa5-f34cd8260364\") " Dec 02 07:27:44 crc kubenswrapper[4895]: I1202 07:27:44.847494 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lfjgv\" (UniqueName: \"kubernetes.io/projected/f4ffa89c-1b2e-4dd4-afa5-f34cd8260364-kube-api-access-lfjgv\") pod \"f4ffa89c-1b2e-4dd4-afa5-f34cd8260364\" (UID: \"f4ffa89c-1b2e-4dd4-afa5-f34cd8260364\") " Dec 02 07:27:44 crc kubenswrapper[4895]: I1202 07:27:44.847528 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24eca501-8830-4bc6-8a5e-e00d227e841c-utilities\") pod \"24eca501-8830-4bc6-8a5e-e00d227e841c\" (UID: \"24eca501-8830-4bc6-8a5e-e00d227e841c\") " Dec 02 07:27:44 crc kubenswrapper[4895]: I1202 07:27:44.847864 4895 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ad8c7c3-c1ac-498b-b1b5-76f1db5c1c19-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 07:27:44 crc kubenswrapper[4895]: I1202 07:27:44.847937 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7md7f\" (UniqueName: \"kubernetes.io/projected/2ad8c7c3-c1ac-498b-b1b5-76f1db5c1c19-kube-api-access-7md7f\") on node \"crc\" DevicePath \"\"" Dec 02 07:27:44 crc kubenswrapper[4895]: I1202 07:27:44.847962 4895 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/4a9d5b86-ddba-433a-91c3-efe2043f66e3-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Dec 02 07:27:44 crc kubenswrapper[4895]: I1202 07:27:44.847981 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r2v96\" (UniqueName: \"kubernetes.io/projected/4a9d5b86-ddba-433a-91c3-efe2043f66e3-kube-api-access-r2v96\") on node \"crc\" DevicePath \"\"" Dec 02 07:27:44 crc kubenswrapper[4895]: I1202 07:27:44.848000 4895 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0fded14-dfbe-41aa-af93-f68c62a1aca1-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 07:27:44 crc kubenswrapper[4895]: I1202 07:27:44.848018 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kglmv\" (UniqueName: \"kubernetes.io/projected/d0fded14-dfbe-41aa-af93-f68c62a1aca1-kube-api-access-kglmv\") on node \"crc\" DevicePath \"\"" Dec 02 07:27:44 crc kubenswrapper[4895]: I1202 07:27:44.848183 4895 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/empty-dir/f4ffa89c-1b2e-4dd4-afa5-f34cd8260364-utilities" (OuterVolumeSpecName: "utilities") pod "f4ffa89c-1b2e-4dd4-afa5-f34cd8260364" (UID: "f4ffa89c-1b2e-4dd4-afa5-f34cd8260364"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:27:44 crc kubenswrapper[4895]: I1202 07:27:44.848810 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24eca501-8830-4bc6-8a5e-e00d227e841c-utilities" (OuterVolumeSpecName: "utilities") pod "24eca501-8830-4bc6-8a5e-e00d227e841c" (UID: "24eca501-8830-4bc6-8a5e-e00d227e841c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:27:44 crc kubenswrapper[4895]: I1202 07:27:44.850539 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24eca501-8830-4bc6-8a5e-e00d227e841c-kube-api-access-k5wcp" (OuterVolumeSpecName: "kube-api-access-k5wcp") pod "24eca501-8830-4bc6-8a5e-e00d227e841c" (UID: "24eca501-8830-4bc6-8a5e-e00d227e841c"). InnerVolumeSpecName "kube-api-access-k5wcp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:27:44 crc kubenswrapper[4895]: I1202 07:27:44.852421 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4ffa89c-1b2e-4dd4-afa5-f34cd8260364-kube-api-access-lfjgv" (OuterVolumeSpecName: "kube-api-access-lfjgv") pod "f4ffa89c-1b2e-4dd4-afa5-f34cd8260364" (UID: "f4ffa89c-1b2e-4dd4-afa5-f34cd8260364"). InnerVolumeSpecName "kube-api-access-lfjgv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:27:44 crc kubenswrapper[4895]: I1202 07:27:44.866087 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24eca501-8830-4bc6-8a5e-e00d227e841c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "24eca501-8830-4bc6-8a5e-e00d227e841c" (UID: "24eca501-8830-4bc6-8a5e-e00d227e841c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:27:44 crc kubenswrapper[4895]: I1202 07:27:44.902497 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 02 07:27:44 crc kubenswrapper[4895]: I1202 07:27:44.944625 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f4ffa89c-1b2e-4dd4-afa5-f34cd8260364-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f4ffa89c-1b2e-4dd4-afa5-f34cd8260364" (UID: "f4ffa89c-1b2e-4dd4-afa5-f34cd8260364"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:27:44 crc kubenswrapper[4895]: I1202 07:27:44.949000 4895 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4ffa89c-1b2e-4dd4-afa5-f34cd8260364-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 07:27:44 crc kubenswrapper[4895]: I1202 07:27:44.949027 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lfjgv\" (UniqueName: \"kubernetes.io/projected/f4ffa89c-1b2e-4dd4-afa5-f34cd8260364-kube-api-access-lfjgv\") on node \"crc\" DevicePath \"\"" Dec 02 07:27:44 crc kubenswrapper[4895]: I1202 07:27:44.949042 4895 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24eca501-8830-4bc6-8a5e-e00d227e841c-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 07:27:44 crc kubenswrapper[4895]: I1202 07:27:44.949051 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k5wcp\" (UniqueName: \"kubernetes.io/projected/24eca501-8830-4bc6-8a5e-e00d227e841c-kube-api-access-k5wcp\") on node \"crc\" DevicePath \"\"" Dec 02 07:27:44 crc kubenswrapper[4895]: I1202 07:27:44.949059 4895 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24eca501-8830-4bc6-8a5e-e00d227e841c-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 07:27:44 crc kubenswrapper[4895]: I1202 07:27:44.949067 4895 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4ffa89c-1b2e-4dd4-afa5-f34cd8260364-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 07:27:44 crc kubenswrapper[4895]: I1202 07:27:44.988140 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Dec 02 07:27:45 crc kubenswrapper[4895]: I1202 07:27:45.134320 4895 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ingress-operator"/"metrics-tls" Dec 02 07:27:45 crc kubenswrapper[4895]: I1202 07:27:45.134692 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 02 07:27:45 crc kubenswrapper[4895]: I1202 07:27:45.176027 4895 generic.go:334] "Generic (PLEG): container finished" podID="4a9d5b86-ddba-433a-91c3-efe2043f66e3" containerID="17cec86d4a6d9e0a4d2baa9caf84abb93e703183176413e7ec47931e8c2dcff2" exitCode=0 Dec 02 07:27:45 crc kubenswrapper[4895]: I1202 07:27:45.176084 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-zkq6j" event={"ID":"4a9d5b86-ddba-433a-91c3-efe2043f66e3","Type":"ContainerDied","Data":"17cec86d4a6d9e0a4d2baa9caf84abb93e703183176413e7ec47931e8c2dcff2"} Dec 02 07:27:45 crc kubenswrapper[4895]: I1202 07:27:45.176154 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-zkq6j" event={"ID":"4a9d5b86-ddba-433a-91c3-efe2043f66e3","Type":"ContainerDied","Data":"0f02bb18c7138d49b9d51d5ef29c16ef0ce4011c6071536ad55b8702620860ff"} Dec 02 07:27:45 crc kubenswrapper[4895]: I1202 07:27:45.176179 4895 scope.go:117] "RemoveContainer" containerID="17cec86d4a6d9e0a4d2baa9caf84abb93e703183176413e7ec47931e8c2dcff2" Dec 02 07:27:45 crc kubenswrapper[4895]: I1202 07:27:45.176349 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-zkq6j" Dec 02 07:27:45 crc kubenswrapper[4895]: I1202 07:27:45.181491 4895 generic.go:334] "Generic (PLEG): container finished" podID="2ad8c7c3-c1ac-498b-b1b5-76f1db5c1c19" containerID="13b77a9b2065f78c86563dc137d56ac1d758ce7264367e8bfbeec2badeb6cf2a" exitCode=0 Dec 02 07:27:45 crc kubenswrapper[4895]: I1202 07:27:45.181559 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zbpzj" event={"ID":"2ad8c7c3-c1ac-498b-b1b5-76f1db5c1c19","Type":"ContainerDied","Data":"13b77a9b2065f78c86563dc137d56ac1d758ce7264367e8bfbeec2badeb6cf2a"} Dec 02 07:27:45 crc kubenswrapper[4895]: I1202 07:27:45.181589 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zbpzj" event={"ID":"2ad8c7c3-c1ac-498b-b1b5-76f1db5c1c19","Type":"ContainerDied","Data":"df811010396c482c331a4259a468db1921ebdbaebe8d78610bc1439a28276495"} Dec 02 07:27:45 crc kubenswrapper[4895]: I1202 07:27:45.181672 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zbpzj" Dec 02 07:27:45 crc kubenswrapper[4895]: I1202 07:27:45.182563 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Dec 02 07:27:45 crc kubenswrapper[4895]: I1202 07:27:45.189751 4895 generic.go:334] "Generic (PLEG): container finished" podID="d0fded14-dfbe-41aa-af93-f68c62a1aca1" containerID="60787ac5efa236959c3cc0c8a4a1fa0d521e29fe1c5864aed373319ee7584094" exitCode=0 Dec 02 07:27:45 crc kubenswrapper[4895]: I1202 07:27:45.189823 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5q5hv" Dec 02 07:27:45 crc kubenswrapper[4895]: I1202 07:27:45.189900 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5q5hv" event={"ID":"d0fded14-dfbe-41aa-af93-f68c62a1aca1","Type":"ContainerDied","Data":"60787ac5efa236959c3cc0c8a4a1fa0d521e29fe1c5864aed373319ee7584094"} Dec 02 07:27:45 crc kubenswrapper[4895]: I1202 07:27:45.189950 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5q5hv" event={"ID":"d0fded14-dfbe-41aa-af93-f68c62a1aca1","Type":"ContainerDied","Data":"198c250052edd6dd7f2c8faa5b781b549698d366753f6e900814a63f59d71f20"} Dec 02 07:27:45 crc kubenswrapper[4895]: I1202 07:27:45.195124 4895 generic.go:334] "Generic (PLEG): container finished" podID="24eca501-8830-4bc6-8a5e-e00d227e841c" containerID="628851b9b305622aac4e980541e5a271ee009414ae0227ef0a5ecc9cc3b60eb1" exitCode=0 Dec 02 07:27:45 crc kubenswrapper[4895]: I1202 07:27:45.195317 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ptmfz" Dec 02 07:27:45 crc kubenswrapper[4895]: I1202 07:27:45.197199 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ptmfz" event={"ID":"24eca501-8830-4bc6-8a5e-e00d227e841c","Type":"ContainerDied","Data":"628851b9b305622aac4e980541e5a271ee009414ae0227ef0a5ecc9cc3b60eb1"} Dec 02 07:27:45 crc kubenswrapper[4895]: I1202 07:27:45.198776 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ptmfz" event={"ID":"24eca501-8830-4bc6-8a5e-e00d227e841c","Type":"ContainerDied","Data":"894dd4fb1df051d5da966aa1c44069c1e2e167bb80e259dd72f67ea0c45472a0"} Dec 02 07:27:45 crc kubenswrapper[4895]: I1202 07:27:45.202695 4895 generic.go:334] "Generic (PLEG): container finished" podID="f4ffa89c-1b2e-4dd4-afa5-f34cd8260364" containerID="15907fae904024bdd546782dd70a88d8ecd4cc0236cd2411fabeff8b259cba68" exitCode=0 Dec 02 07:27:45 crc kubenswrapper[4895]: I1202 07:27:45.202778 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wjh4k" Dec 02 07:27:45 crc kubenswrapper[4895]: I1202 07:27:45.202861 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wjh4k" event={"ID":"f4ffa89c-1b2e-4dd4-afa5-f34cd8260364","Type":"ContainerDied","Data":"15907fae904024bdd546782dd70a88d8ecd4cc0236cd2411fabeff8b259cba68"} Dec 02 07:27:45 crc kubenswrapper[4895]: I1202 07:27:45.202919 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wjh4k" event={"ID":"f4ffa89c-1b2e-4dd4-afa5-f34cd8260364","Type":"ContainerDied","Data":"9b66a747cc6eed02c70aae485088aad215a0743f0a1e9d014c2b7880dba8944b"} Dec 02 07:27:45 crc kubenswrapper[4895]: I1202 07:27:45.212839 4895 scope.go:117] "RemoveContainer" containerID="17cec86d4a6d9e0a4d2baa9caf84abb93e703183176413e7ec47931e8c2dcff2" Dec 02 07:27:45 crc kubenswrapper[4895]: E1202 07:27:45.213450 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17cec86d4a6d9e0a4d2baa9caf84abb93e703183176413e7ec47931e8c2dcff2\": container with ID starting with 17cec86d4a6d9e0a4d2baa9caf84abb93e703183176413e7ec47931e8c2dcff2 not found: ID does not exist" containerID="17cec86d4a6d9e0a4d2baa9caf84abb93e703183176413e7ec47931e8c2dcff2" Dec 02 07:27:45 crc kubenswrapper[4895]: I1202 07:27:45.213602 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17cec86d4a6d9e0a4d2baa9caf84abb93e703183176413e7ec47931e8c2dcff2"} err="failed to get container status \"17cec86d4a6d9e0a4d2baa9caf84abb93e703183176413e7ec47931e8c2dcff2\": rpc error: code = NotFound desc = could not find container \"17cec86d4a6d9e0a4d2baa9caf84abb93e703183176413e7ec47931e8c2dcff2\": container with ID starting with 17cec86d4a6d9e0a4d2baa9caf84abb93e703183176413e7ec47931e8c2dcff2 not found: ID does not exist" Dec 02 07:27:45 crc kubenswrapper[4895]: I1202 
07:27:45.213727 4895 scope.go:117] "RemoveContainer" containerID="13b77a9b2065f78c86563dc137d56ac1d758ce7264367e8bfbeec2badeb6cf2a" Dec 02 07:27:45 crc kubenswrapper[4895]: I1202 07:27:45.229096 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zbpzj"] Dec 02 07:27:45 crc kubenswrapper[4895]: I1202 07:27:45.231430 4895 scope.go:117] "RemoveContainer" containerID="cf96ed9212414dc047f571e7be05ddafc5457efe1626035683b55e4d140bf970" Dec 02 07:27:45 crc kubenswrapper[4895]: I1202 07:27:45.249480 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-zbpzj"] Dec 02 07:27:45 crc kubenswrapper[4895]: I1202 07:27:45.258900 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5q5hv"] Dec 02 07:27:45 crc kubenswrapper[4895]: I1202 07:27:45.269179 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-5q5hv"] Dec 02 07:27:45 crc kubenswrapper[4895]: I1202 07:27:45.275941 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zkq6j"] Dec 02 07:27:45 crc kubenswrapper[4895]: I1202 07:27:45.279242 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zkq6j"] Dec 02 07:27:45 crc kubenswrapper[4895]: I1202 07:27:45.282578 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wjh4k"] Dec 02 07:27:45 crc kubenswrapper[4895]: I1202 07:27:45.285313 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-wjh4k"] Dec 02 07:27:45 crc kubenswrapper[4895]: I1202 07:27:45.288039 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ptmfz"] Dec 02 07:27:45 crc kubenswrapper[4895]: I1202 07:27:45.289768 4895 scope.go:117] "RemoveContainer" 
containerID="19109259329f04661bf71f534f1b21f8929c6d5bd7773dab3629044eda9e7836" Dec 02 07:27:45 crc kubenswrapper[4895]: I1202 07:27:45.290893 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-ptmfz"] Dec 02 07:27:45 crc kubenswrapper[4895]: I1202 07:27:45.310262 4895 scope.go:117] "RemoveContainer" containerID="13b77a9b2065f78c86563dc137d56ac1d758ce7264367e8bfbeec2badeb6cf2a" Dec 02 07:27:45 crc kubenswrapper[4895]: E1202 07:27:45.313109 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13b77a9b2065f78c86563dc137d56ac1d758ce7264367e8bfbeec2badeb6cf2a\": container with ID starting with 13b77a9b2065f78c86563dc137d56ac1d758ce7264367e8bfbeec2badeb6cf2a not found: ID does not exist" containerID="13b77a9b2065f78c86563dc137d56ac1d758ce7264367e8bfbeec2badeb6cf2a" Dec 02 07:27:45 crc kubenswrapper[4895]: I1202 07:27:45.313174 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13b77a9b2065f78c86563dc137d56ac1d758ce7264367e8bfbeec2badeb6cf2a"} err="failed to get container status \"13b77a9b2065f78c86563dc137d56ac1d758ce7264367e8bfbeec2badeb6cf2a\": rpc error: code = NotFound desc = could not find container \"13b77a9b2065f78c86563dc137d56ac1d758ce7264367e8bfbeec2badeb6cf2a\": container with ID starting with 13b77a9b2065f78c86563dc137d56ac1d758ce7264367e8bfbeec2badeb6cf2a not found: ID does not exist" Dec 02 07:27:45 crc kubenswrapper[4895]: I1202 07:27:45.313214 4895 scope.go:117] "RemoveContainer" containerID="cf96ed9212414dc047f571e7be05ddafc5457efe1626035683b55e4d140bf970" Dec 02 07:27:45 crc kubenswrapper[4895]: E1202 07:27:45.313900 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf96ed9212414dc047f571e7be05ddafc5457efe1626035683b55e4d140bf970\": container with ID starting with 
cf96ed9212414dc047f571e7be05ddafc5457efe1626035683b55e4d140bf970 not found: ID does not exist" containerID="cf96ed9212414dc047f571e7be05ddafc5457efe1626035683b55e4d140bf970" Dec 02 07:27:45 crc kubenswrapper[4895]: I1202 07:27:45.314036 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf96ed9212414dc047f571e7be05ddafc5457efe1626035683b55e4d140bf970"} err="failed to get container status \"cf96ed9212414dc047f571e7be05ddafc5457efe1626035683b55e4d140bf970\": rpc error: code = NotFound desc = could not find container \"cf96ed9212414dc047f571e7be05ddafc5457efe1626035683b55e4d140bf970\": container with ID starting with cf96ed9212414dc047f571e7be05ddafc5457efe1626035683b55e4d140bf970 not found: ID does not exist" Dec 02 07:27:45 crc kubenswrapper[4895]: I1202 07:27:45.314272 4895 scope.go:117] "RemoveContainer" containerID="19109259329f04661bf71f534f1b21f8929c6d5bd7773dab3629044eda9e7836" Dec 02 07:27:45 crc kubenswrapper[4895]: E1202 07:27:45.314928 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19109259329f04661bf71f534f1b21f8929c6d5bd7773dab3629044eda9e7836\": container with ID starting with 19109259329f04661bf71f534f1b21f8929c6d5bd7773dab3629044eda9e7836 not found: ID does not exist" containerID="19109259329f04661bf71f534f1b21f8929c6d5bd7773dab3629044eda9e7836" Dec 02 07:27:45 crc kubenswrapper[4895]: I1202 07:27:45.314972 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19109259329f04661bf71f534f1b21f8929c6d5bd7773dab3629044eda9e7836"} err="failed to get container status \"19109259329f04661bf71f534f1b21f8929c6d5bd7773dab3629044eda9e7836\": rpc error: code = NotFound desc = could not find container \"19109259329f04661bf71f534f1b21f8929c6d5bd7773dab3629044eda9e7836\": container with ID starting with 19109259329f04661bf71f534f1b21f8929c6d5bd7773dab3629044eda9e7836 not found: ID does not 
exist" Dec 02 07:27:45 crc kubenswrapper[4895]: I1202 07:27:45.314990 4895 scope.go:117] "RemoveContainer" containerID="60787ac5efa236959c3cc0c8a4a1fa0d521e29fe1c5864aed373319ee7584094" Dec 02 07:27:45 crc kubenswrapper[4895]: I1202 07:27:45.322587 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 02 07:27:45 crc kubenswrapper[4895]: I1202 07:27:45.340226 4895 scope.go:117] "RemoveContainer" containerID="7f393639c2e2db9c602decafa991581e44b8ff62dd2f69ead54029cff296574a" Dec 02 07:27:45 crc kubenswrapper[4895]: I1202 07:27:45.364807 4895 scope.go:117] "RemoveContainer" containerID="6e2f6e38ca8373d214cb0494d9abf782e17f98428bc2189208b7bfdba85ad718" Dec 02 07:27:45 crc kubenswrapper[4895]: I1202 07:27:45.379766 4895 scope.go:117] "RemoveContainer" containerID="60787ac5efa236959c3cc0c8a4a1fa0d521e29fe1c5864aed373319ee7584094" Dec 02 07:27:45 crc kubenswrapper[4895]: E1202 07:27:45.380443 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60787ac5efa236959c3cc0c8a4a1fa0d521e29fe1c5864aed373319ee7584094\": container with ID starting with 60787ac5efa236959c3cc0c8a4a1fa0d521e29fe1c5864aed373319ee7584094 not found: ID does not exist" containerID="60787ac5efa236959c3cc0c8a4a1fa0d521e29fe1c5864aed373319ee7584094" Dec 02 07:27:45 crc kubenswrapper[4895]: I1202 07:27:45.380490 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60787ac5efa236959c3cc0c8a4a1fa0d521e29fe1c5864aed373319ee7584094"} err="failed to get container status \"60787ac5efa236959c3cc0c8a4a1fa0d521e29fe1c5864aed373319ee7584094\": rpc error: code = NotFound desc = could not find container \"60787ac5efa236959c3cc0c8a4a1fa0d521e29fe1c5864aed373319ee7584094\": container with ID starting with 60787ac5efa236959c3cc0c8a4a1fa0d521e29fe1c5864aed373319ee7584094 not found: ID does not exist" Dec 02 07:27:45 crc 
kubenswrapper[4895]: I1202 07:27:45.380526 4895 scope.go:117] "RemoveContainer" containerID="7f393639c2e2db9c602decafa991581e44b8ff62dd2f69ead54029cff296574a" Dec 02 07:27:45 crc kubenswrapper[4895]: E1202 07:27:45.380984 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f393639c2e2db9c602decafa991581e44b8ff62dd2f69ead54029cff296574a\": container with ID starting with 7f393639c2e2db9c602decafa991581e44b8ff62dd2f69ead54029cff296574a not found: ID does not exist" containerID="7f393639c2e2db9c602decafa991581e44b8ff62dd2f69ead54029cff296574a" Dec 02 07:27:45 crc kubenswrapper[4895]: I1202 07:27:45.381024 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f393639c2e2db9c602decafa991581e44b8ff62dd2f69ead54029cff296574a"} err="failed to get container status \"7f393639c2e2db9c602decafa991581e44b8ff62dd2f69ead54029cff296574a\": rpc error: code = NotFound desc = could not find container \"7f393639c2e2db9c602decafa991581e44b8ff62dd2f69ead54029cff296574a\": container with ID starting with 7f393639c2e2db9c602decafa991581e44b8ff62dd2f69ead54029cff296574a not found: ID does not exist" Dec 02 07:27:45 crc kubenswrapper[4895]: I1202 07:27:45.381058 4895 scope.go:117] "RemoveContainer" containerID="6e2f6e38ca8373d214cb0494d9abf782e17f98428bc2189208b7bfdba85ad718" Dec 02 07:27:45 crc kubenswrapper[4895]: E1202 07:27:45.381542 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e2f6e38ca8373d214cb0494d9abf782e17f98428bc2189208b7bfdba85ad718\": container with ID starting with 6e2f6e38ca8373d214cb0494d9abf782e17f98428bc2189208b7bfdba85ad718 not found: ID does not exist" containerID="6e2f6e38ca8373d214cb0494d9abf782e17f98428bc2189208b7bfdba85ad718" Dec 02 07:27:45 crc kubenswrapper[4895]: I1202 07:27:45.381610 4895 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"6e2f6e38ca8373d214cb0494d9abf782e17f98428bc2189208b7bfdba85ad718"} err="failed to get container status \"6e2f6e38ca8373d214cb0494d9abf782e17f98428bc2189208b7bfdba85ad718\": rpc error: code = NotFound desc = could not find container \"6e2f6e38ca8373d214cb0494d9abf782e17f98428bc2189208b7bfdba85ad718\": container with ID starting with 6e2f6e38ca8373d214cb0494d9abf782e17f98428bc2189208b7bfdba85ad718 not found: ID does not exist" Dec 02 07:27:45 crc kubenswrapper[4895]: I1202 07:27:45.381659 4895 scope.go:117] "RemoveContainer" containerID="628851b9b305622aac4e980541e5a271ee009414ae0227ef0a5ecc9cc3b60eb1" Dec 02 07:27:45 crc kubenswrapper[4895]: I1202 07:27:45.398044 4895 scope.go:117] "RemoveContainer" containerID="915f910514528d8352b5c58355a27a4acffed076f57f0009970e551977520168" Dec 02 07:27:45 crc kubenswrapper[4895]: I1202 07:27:45.413540 4895 scope.go:117] "RemoveContainer" containerID="97a525d686bb084acb0f3699a2c52005b37b7854236377ccf6903454f9ca0171" Dec 02 07:27:45 crc kubenswrapper[4895]: I1202 07:27:45.422339 4895 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Dec 02 07:27:45 crc kubenswrapper[4895]: I1202 07:27:45.426714 4895 scope.go:117] "RemoveContainer" containerID="628851b9b305622aac4e980541e5a271ee009414ae0227ef0a5ecc9cc3b60eb1" Dec 02 07:27:45 crc kubenswrapper[4895]: E1202 07:27:45.427232 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"628851b9b305622aac4e980541e5a271ee009414ae0227ef0a5ecc9cc3b60eb1\": container with ID starting with 628851b9b305622aac4e980541e5a271ee009414ae0227ef0a5ecc9cc3b60eb1 not found: ID does not exist" containerID="628851b9b305622aac4e980541e5a271ee009414ae0227ef0a5ecc9cc3b60eb1" Dec 02 07:27:45 crc kubenswrapper[4895]: I1202 07:27:45.427275 4895 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"628851b9b305622aac4e980541e5a271ee009414ae0227ef0a5ecc9cc3b60eb1"} err="failed to get container status \"628851b9b305622aac4e980541e5a271ee009414ae0227ef0a5ecc9cc3b60eb1\": rpc error: code = NotFound desc = could not find container \"628851b9b305622aac4e980541e5a271ee009414ae0227ef0a5ecc9cc3b60eb1\": container with ID starting with 628851b9b305622aac4e980541e5a271ee009414ae0227ef0a5ecc9cc3b60eb1 not found: ID does not exist" Dec 02 07:27:45 crc kubenswrapper[4895]: I1202 07:27:45.427308 4895 scope.go:117] "RemoveContainer" containerID="915f910514528d8352b5c58355a27a4acffed076f57f0009970e551977520168" Dec 02 07:27:45 crc kubenswrapper[4895]: E1202 07:27:45.427685 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"915f910514528d8352b5c58355a27a4acffed076f57f0009970e551977520168\": container with ID starting with 915f910514528d8352b5c58355a27a4acffed076f57f0009970e551977520168 not found: ID does not exist" containerID="915f910514528d8352b5c58355a27a4acffed076f57f0009970e551977520168" Dec 02 07:27:45 crc kubenswrapper[4895]: I1202 07:27:45.427721 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"915f910514528d8352b5c58355a27a4acffed076f57f0009970e551977520168"} err="failed to get container status \"915f910514528d8352b5c58355a27a4acffed076f57f0009970e551977520168\": rpc error: code = NotFound desc = could not find container \"915f910514528d8352b5c58355a27a4acffed076f57f0009970e551977520168\": container with ID starting with 915f910514528d8352b5c58355a27a4acffed076f57f0009970e551977520168 not found: ID does not exist" Dec 02 07:27:45 crc kubenswrapper[4895]: I1202 07:27:45.427786 4895 scope.go:117] "RemoveContainer" containerID="97a525d686bb084acb0f3699a2c52005b37b7854236377ccf6903454f9ca0171" Dec 02 07:27:45 crc kubenswrapper[4895]: E1202 07:27:45.428109 4895 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"97a525d686bb084acb0f3699a2c52005b37b7854236377ccf6903454f9ca0171\": container with ID starting with 97a525d686bb084acb0f3699a2c52005b37b7854236377ccf6903454f9ca0171 not found: ID does not exist" containerID="97a525d686bb084acb0f3699a2c52005b37b7854236377ccf6903454f9ca0171" Dec 02 07:27:45 crc kubenswrapper[4895]: I1202 07:27:45.428161 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97a525d686bb084acb0f3699a2c52005b37b7854236377ccf6903454f9ca0171"} err="failed to get container status \"97a525d686bb084acb0f3699a2c52005b37b7854236377ccf6903454f9ca0171\": rpc error: code = NotFound desc = could not find container \"97a525d686bb084acb0f3699a2c52005b37b7854236377ccf6903454f9ca0171\": container with ID starting with 97a525d686bb084acb0f3699a2c52005b37b7854236377ccf6903454f9ca0171 not found: ID does not exist" Dec 02 07:27:45 crc kubenswrapper[4895]: I1202 07:27:45.428213 4895 scope.go:117] "RemoveContainer" containerID="15907fae904024bdd546782dd70a88d8ecd4cc0236cd2411fabeff8b259cba68" Dec 02 07:27:45 crc kubenswrapper[4895]: I1202 07:27:45.430959 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 02 07:27:45 crc kubenswrapper[4895]: I1202 07:27:45.440644 4895 scope.go:117] "RemoveContainer" containerID="eeee1a60813e592a855df4543ff5bc9c06f8654ab5bbc449702a7a6a46bc63aa" Dec 02 07:27:45 crc kubenswrapper[4895]: I1202 07:27:45.455911 4895 scope.go:117] "RemoveContainer" containerID="cef28fb9c9eb8d7f1912ec40b25023e2a8baeec3dca0bc44c899626bfb86a1a9" Dec 02 07:27:45 crc kubenswrapper[4895]: I1202 07:27:45.469003 4895 scope.go:117] "RemoveContainer" containerID="15907fae904024bdd546782dd70a88d8ecd4cc0236cd2411fabeff8b259cba68" Dec 02 07:27:45 crc kubenswrapper[4895]: E1202 07:27:45.469759 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"15907fae904024bdd546782dd70a88d8ecd4cc0236cd2411fabeff8b259cba68\": container with ID starting with 15907fae904024bdd546782dd70a88d8ecd4cc0236cd2411fabeff8b259cba68 not found: ID does not exist" containerID="15907fae904024bdd546782dd70a88d8ecd4cc0236cd2411fabeff8b259cba68" Dec 02 07:27:45 crc kubenswrapper[4895]: I1202 07:27:45.469812 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15907fae904024bdd546782dd70a88d8ecd4cc0236cd2411fabeff8b259cba68"} err="failed to get container status \"15907fae904024bdd546782dd70a88d8ecd4cc0236cd2411fabeff8b259cba68\": rpc error: code = NotFound desc = could not find container \"15907fae904024bdd546782dd70a88d8ecd4cc0236cd2411fabeff8b259cba68\": container with ID starting with 15907fae904024bdd546782dd70a88d8ecd4cc0236cd2411fabeff8b259cba68 not found: ID does not exist" Dec 02 07:27:45 crc kubenswrapper[4895]: I1202 07:27:45.469848 4895 scope.go:117] "RemoveContainer" containerID="eeee1a60813e592a855df4543ff5bc9c06f8654ab5bbc449702a7a6a46bc63aa" Dec 02 07:27:45 crc kubenswrapper[4895]: E1202 07:27:45.470308 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eeee1a60813e592a855df4543ff5bc9c06f8654ab5bbc449702a7a6a46bc63aa\": container with ID starting with eeee1a60813e592a855df4543ff5bc9c06f8654ab5bbc449702a7a6a46bc63aa not found: ID does not exist" containerID="eeee1a60813e592a855df4543ff5bc9c06f8654ab5bbc449702a7a6a46bc63aa" Dec 02 07:27:45 crc kubenswrapper[4895]: I1202 07:27:45.470381 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eeee1a60813e592a855df4543ff5bc9c06f8654ab5bbc449702a7a6a46bc63aa"} err="failed to get container status \"eeee1a60813e592a855df4543ff5bc9c06f8654ab5bbc449702a7a6a46bc63aa\": rpc error: code = NotFound desc = could not find container 
\"eeee1a60813e592a855df4543ff5bc9c06f8654ab5bbc449702a7a6a46bc63aa\": container with ID starting with eeee1a60813e592a855df4543ff5bc9c06f8654ab5bbc449702a7a6a46bc63aa not found: ID does not exist" Dec 02 07:27:45 crc kubenswrapper[4895]: I1202 07:27:45.470428 4895 scope.go:117] "RemoveContainer" containerID="cef28fb9c9eb8d7f1912ec40b25023e2a8baeec3dca0bc44c899626bfb86a1a9" Dec 02 07:27:45 crc kubenswrapper[4895]: E1202 07:27:45.470958 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cef28fb9c9eb8d7f1912ec40b25023e2a8baeec3dca0bc44c899626bfb86a1a9\": container with ID starting with cef28fb9c9eb8d7f1912ec40b25023e2a8baeec3dca0bc44c899626bfb86a1a9 not found: ID does not exist" containerID="cef28fb9c9eb8d7f1912ec40b25023e2a8baeec3dca0bc44c899626bfb86a1a9" Dec 02 07:27:45 crc kubenswrapper[4895]: I1202 07:27:45.470998 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cef28fb9c9eb8d7f1912ec40b25023e2a8baeec3dca0bc44c899626bfb86a1a9"} err="failed to get container status \"cef28fb9c9eb8d7f1912ec40b25023e2a8baeec3dca0bc44c899626bfb86a1a9\": rpc error: code = NotFound desc = could not find container \"cef28fb9c9eb8d7f1912ec40b25023e2a8baeec3dca0bc44c899626bfb86a1a9\": container with ID starting with cef28fb9c9eb8d7f1912ec40b25023e2a8baeec3dca0bc44c899626bfb86a1a9 not found: ID does not exist" Dec 02 07:27:45 crc kubenswrapper[4895]: I1202 07:27:45.510685 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 02 07:27:45 crc kubenswrapper[4895]: I1202 07:27:45.583583 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 02 07:27:45 crc kubenswrapper[4895]: I1202 07:27:45.596709 4895 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-console-operator"/"openshift-service-ca.crt" Dec 02 07:27:45 crc kubenswrapper[4895]: I1202 07:27:45.675870 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Dec 02 07:27:45 crc kubenswrapper[4895]: I1202 07:27:45.688856 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Dec 02 07:27:45 crc kubenswrapper[4895]: I1202 07:27:45.955501 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 02 07:27:46 crc kubenswrapper[4895]: I1202 07:27:46.062559 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 02 07:27:46 crc kubenswrapper[4895]: I1202 07:27:46.129360 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Dec 02 07:27:46 crc kubenswrapper[4895]: I1202 07:27:46.205821 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 02 07:27:46 crc kubenswrapper[4895]: I1202 07:27:46.209672 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 02 07:27:46 crc kubenswrapper[4895]: I1202 07:27:46.310388 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Dec 02 07:27:46 crc kubenswrapper[4895]: I1202 07:27:46.353362 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 02 07:27:46 crc kubenswrapper[4895]: I1202 07:27:46.395880 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Dec 
02 07:27:46 crc kubenswrapper[4895]: I1202 07:27:46.485116 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 02 07:27:46 crc kubenswrapper[4895]: I1202 07:27:46.504081 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 02 07:27:46 crc kubenswrapper[4895]: I1202 07:27:46.698864 4895 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Dec 02 07:27:46 crc kubenswrapper[4895]: I1202 07:27:46.698955 4895 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Dec 02 07:27:46 crc kubenswrapper[4895]: I1202 07:27:46.699030 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 07:27:46 crc kubenswrapper[4895]: I1202 07:27:46.700378 4895 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="kube-controller-manager" containerStatusID={"Type":"cri-o","ID":"f385715c8f9dcf5dcb847ddc305c75e19f61c324951fdf708efd4d091fcd6443"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container kube-controller-manager failed startup probe, will be restarted" Dec 02 07:27:46 crc kubenswrapper[4895]: I1202 07:27:46.700618 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" containerID="cri-o://f385715c8f9dcf5dcb847ddc305c75e19f61c324951fdf708efd4d091fcd6443" gracePeriod=30 Dec 02 07:27:46 crc kubenswrapper[4895]: I1202 07:27:46.716583 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Dec 02 07:27:46 crc kubenswrapper[4895]: I1202 07:27:46.747651 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 02 07:27:46 crc kubenswrapper[4895]: I1202 07:27:46.920010 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 02 07:27:46 crc kubenswrapper[4895]: I1202 07:27:46.949678 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Dec 02 07:27:46 crc kubenswrapper[4895]: I1202 07:27:46.975624 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Dec 02 07:27:47 crc kubenswrapper[4895]: I1202 07:27:47.021493 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Dec 02 07:27:47 crc kubenswrapper[4895]: I1202 07:27:47.054329 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Dec 02 07:27:47 crc kubenswrapper[4895]: I1202 07:27:47.104637 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Dec 02 07:27:47 crc kubenswrapper[4895]: I1202 07:27:47.111617 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 02 07:27:47 crc kubenswrapper[4895]: I1202 07:27:47.117315 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" 
Dec 02 07:27:47 crc kubenswrapper[4895]: I1202 07:27:47.164414 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24eca501-8830-4bc6-8a5e-e00d227e841c" path="/var/lib/kubelet/pods/24eca501-8830-4bc6-8a5e-e00d227e841c/volumes" Dec 02 07:27:47 crc kubenswrapper[4895]: I1202 07:27:47.166264 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ad8c7c3-c1ac-498b-b1b5-76f1db5c1c19" path="/var/lib/kubelet/pods/2ad8c7c3-c1ac-498b-b1b5-76f1db5c1c19/volumes" Dec 02 07:27:47 crc kubenswrapper[4895]: I1202 07:27:47.167837 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a9d5b86-ddba-433a-91c3-efe2043f66e3" path="/var/lib/kubelet/pods/4a9d5b86-ddba-433a-91c3-efe2043f66e3/volumes" Dec 02 07:27:47 crc kubenswrapper[4895]: I1202 07:27:47.169965 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0fded14-dfbe-41aa-af93-f68c62a1aca1" path="/var/lib/kubelet/pods/d0fded14-dfbe-41aa-af93-f68c62a1aca1/volumes" Dec 02 07:27:47 crc kubenswrapper[4895]: I1202 07:27:47.170195 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 02 07:27:47 crc kubenswrapper[4895]: I1202 07:27:47.171152 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4ffa89c-1b2e-4dd4-afa5-f34cd8260364" path="/var/lib/kubelet/pods/f4ffa89c-1b2e-4dd4-afa5-f34cd8260364/volumes" Dec 02 07:27:47 crc kubenswrapper[4895]: I1202 07:27:47.278250 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 02 07:27:47 crc kubenswrapper[4895]: I1202 07:27:47.297349 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 02 07:27:47 crc kubenswrapper[4895]: I1202 07:27:47.357876 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Dec 02 
07:27:47 crc kubenswrapper[4895]: I1202 07:27:47.359834 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 02 07:27:47 crc kubenswrapper[4895]: I1202 07:27:47.437069 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 02 07:27:47 crc kubenswrapper[4895]: I1202 07:27:47.532878 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 02 07:27:47 crc kubenswrapper[4895]: I1202 07:27:47.609002 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 02 07:27:47 crc kubenswrapper[4895]: I1202 07:27:47.854663 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 02 07:27:47 crc kubenswrapper[4895]: I1202 07:27:47.897570 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Dec 02 07:27:47 crc kubenswrapper[4895]: I1202 07:27:47.997191 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 02 07:27:48 crc kubenswrapper[4895]: I1202 07:27:48.034373 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 02 07:27:48 crc kubenswrapper[4895]: I1202 07:27:48.043136 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Dec 02 07:27:48 crc kubenswrapper[4895]: I1202 07:27:48.058552 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Dec 02 07:27:48 crc kubenswrapper[4895]: I1202 07:27:48.150632 4895 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Dec 02 07:27:48 crc kubenswrapper[4895]: I1202 07:27:48.189734 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Dec 02 07:27:48 crc kubenswrapper[4895]: I1202 07:27:48.209528 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 02 07:27:48 crc kubenswrapper[4895]: I1202 07:27:48.223059 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 02 07:27:48 crc kubenswrapper[4895]: I1202 07:27:48.240880 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 02 07:27:48 crc kubenswrapper[4895]: I1202 07:27:48.274187 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Dec 02 07:27:48 crc kubenswrapper[4895]: I1202 07:27:48.356837 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 02 07:27:48 crc kubenswrapper[4895]: I1202 07:27:48.427812 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 02 07:27:48 crc kubenswrapper[4895]: I1202 07:27:48.701277 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Dec 02 07:27:48 crc kubenswrapper[4895]: I1202 07:27:48.834386 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 02 07:27:48 crc kubenswrapper[4895]: I1202 07:27:48.954510 4895 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 02 07:27:48 crc kubenswrapper[4895]: I1202 07:27:48.990698 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 02 07:27:49 crc kubenswrapper[4895]: I1202 07:27:49.004633 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Dec 02 07:27:49 crc kubenswrapper[4895]: I1202 07:27:49.181864 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Dec 02 07:27:49 crc kubenswrapper[4895]: I1202 07:27:49.368899 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 02 07:27:49 crc kubenswrapper[4895]: I1202 07:27:49.390069 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 02 07:27:49 crc kubenswrapper[4895]: I1202 07:27:49.406604 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 02 07:27:49 crc kubenswrapper[4895]: I1202 07:27:49.473399 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 02 07:27:49 crc kubenswrapper[4895]: I1202 07:27:49.522129 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 02 07:27:49 crc kubenswrapper[4895]: I1202 07:27:49.744258 4895 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 02 07:27:49 crc kubenswrapper[4895]: I1202 07:27:49.785001 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Dec 02 07:27:49 crc kubenswrapper[4895]: I1202 07:27:49.994021 4895 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 02 07:27:50 crc kubenswrapper[4895]: I1202 07:27:50.079614 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 02 07:27:50 crc kubenswrapper[4895]: I1202 07:27:50.080390 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 02 07:27:50 crc kubenswrapper[4895]: I1202 07:27:50.250351 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 02 07:27:50 crc kubenswrapper[4895]: I1202 07:27:50.358682 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 02 07:27:50 crc kubenswrapper[4895]: I1202 07:27:50.361324 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Dec 02 07:27:50 crc kubenswrapper[4895]: I1202 07:27:50.401317 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 02 07:27:50 crc kubenswrapper[4895]: I1202 07:27:50.461048 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Dec 02 07:27:50 crc kubenswrapper[4895]: I1202 07:27:50.565065 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 02 07:27:50 crc kubenswrapper[4895]: I1202 07:27:50.653451 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 02 07:27:50 crc kubenswrapper[4895]: I1202 07:27:50.773800 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 02 07:27:50 crc kubenswrapper[4895]: I1202 07:27:50.809449 4895 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Dec 02 07:27:50 crc kubenswrapper[4895]: I1202 07:27:50.814654 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Dec 02 07:27:50 crc kubenswrapper[4895]: I1202 07:27:50.831855 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 02 07:27:50 crc kubenswrapper[4895]: I1202 07:27:50.849634 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Dec 02 07:27:50 crc kubenswrapper[4895]: I1202 07:27:50.889792 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 02 07:27:51 crc kubenswrapper[4895]: I1202 07:27:51.218951 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Dec 02 07:27:51 crc kubenswrapper[4895]: I1202 07:27:51.220865 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 02 07:27:51 crc kubenswrapper[4895]: I1202 07:27:51.310493 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 02 07:27:51 crc kubenswrapper[4895]: I1202 07:27:51.372123 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 02 07:27:51 crc kubenswrapper[4895]: I1202 07:27:51.388671 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 02 07:27:51 crc kubenswrapper[4895]: I1202 07:27:51.594667 4895 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 02 07:27:51 crc kubenswrapper[4895]: I1202 07:27:51.614471 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 02 07:27:51 crc kubenswrapper[4895]: I1202 07:27:51.858050 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Dec 02 07:27:51 crc kubenswrapper[4895]: I1202 07:27:51.912078 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 02 07:27:52 crc kubenswrapper[4895]: I1202 07:27:52.092282 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 02 07:27:52 crc kubenswrapper[4895]: I1202 07:27:52.258994 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Dec 02 07:27:53 crc kubenswrapper[4895]: I1202 07:27:53.388100 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Dec 02 07:28:17 crc kubenswrapper[4895]: I1202 07:28:17.425688 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Dec 02 07:28:17 crc kubenswrapper[4895]: I1202 07:28:17.428997 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 02 07:28:17 crc kubenswrapper[4895]: I1202 07:28:17.429072 4895 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="f385715c8f9dcf5dcb847ddc305c75e19f61c324951fdf708efd4d091fcd6443" exitCode=137 Dec 02 07:28:17 crc kubenswrapper[4895]: I1202 07:28:17.429128 4895 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"f385715c8f9dcf5dcb847ddc305c75e19f61c324951fdf708efd4d091fcd6443"} Dec 02 07:28:17 crc kubenswrapper[4895]: I1202 07:28:17.429182 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"456319f7ee489c411dbbe0f252f3be05d5ef72e2f8443036ca1f09dc5509cda9"} Dec 02 07:28:17 crc kubenswrapper[4895]: I1202 07:28:17.429219 4895 scope.go:117] "RemoveContainer" containerID="c6c3baa5b3772a8fc64774e28c3f389f114723e7efa62b934140a4fed8b6a40d" Dec 02 07:28:18 crc kubenswrapper[4895]: I1202 07:28:18.438511 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Dec 02 07:28:18 crc kubenswrapper[4895]: I1202 07:28:18.795098 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 07:28:26 crc kubenswrapper[4895]: I1202 07:28:26.697464 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 07:28:26 crc kubenswrapper[4895]: I1202 07:28:26.705621 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 07:28:27 crc kubenswrapper[4895]: I1202 07:28:27.502953 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 07:28:33 crc kubenswrapper[4895]: I1202 07:28:33.107720 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-xzs6v"] Dec 02 07:28:33 crc 
kubenswrapper[4895]: E1202 07:28:33.109026 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24eca501-8830-4bc6-8a5e-e00d227e841c" containerName="registry-server" Dec 02 07:28:33 crc kubenswrapper[4895]: I1202 07:28:33.109052 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="24eca501-8830-4bc6-8a5e-e00d227e841c" containerName="registry-server" Dec 02 07:28:33 crc kubenswrapper[4895]: E1202 07:28:33.109074 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ad8c7c3-c1ac-498b-b1b5-76f1db5c1c19" containerName="registry-server" Dec 02 07:28:33 crc kubenswrapper[4895]: I1202 07:28:33.109088 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ad8c7c3-c1ac-498b-b1b5-76f1db5c1c19" containerName="registry-server" Dec 02 07:28:33 crc kubenswrapper[4895]: E1202 07:28:33.109106 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24eca501-8830-4bc6-8a5e-e00d227e841c" containerName="extract-utilities" Dec 02 07:28:33 crc kubenswrapper[4895]: I1202 07:28:33.109119 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="24eca501-8830-4bc6-8a5e-e00d227e841c" containerName="extract-utilities" Dec 02 07:28:33 crc kubenswrapper[4895]: E1202 07:28:33.109137 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ad8c7c3-c1ac-498b-b1b5-76f1db5c1c19" containerName="extract-content" Dec 02 07:28:33 crc kubenswrapper[4895]: I1202 07:28:33.109151 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ad8c7c3-c1ac-498b-b1b5-76f1db5c1c19" containerName="extract-content" Dec 02 07:28:33 crc kubenswrapper[4895]: E1202 07:28:33.109163 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b0aeead-bbd7-4ba2-901f-2aa5be9899b3" containerName="installer" Dec 02 07:28:33 crc kubenswrapper[4895]: I1202 07:28:33.109176 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b0aeead-bbd7-4ba2-901f-2aa5be9899b3" containerName="installer" Dec 02 07:28:33 crc kubenswrapper[4895]: 
E1202 07:28:33.109194 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24eca501-8830-4bc6-8a5e-e00d227e841c" containerName="extract-content" Dec 02 07:28:33 crc kubenswrapper[4895]: I1202 07:28:33.109207 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="24eca501-8830-4bc6-8a5e-e00d227e841c" containerName="extract-content" Dec 02 07:28:33 crc kubenswrapper[4895]: E1202 07:28:33.109223 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0fded14-dfbe-41aa-af93-f68c62a1aca1" containerName="extract-content" Dec 02 07:28:33 crc kubenswrapper[4895]: I1202 07:28:33.109236 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0fded14-dfbe-41aa-af93-f68c62a1aca1" containerName="extract-content" Dec 02 07:28:33 crc kubenswrapper[4895]: E1202 07:28:33.109255 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0fded14-dfbe-41aa-af93-f68c62a1aca1" containerName="extract-utilities" Dec 02 07:28:33 crc kubenswrapper[4895]: I1202 07:28:33.109267 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0fded14-dfbe-41aa-af93-f68c62a1aca1" containerName="extract-utilities" Dec 02 07:28:33 crc kubenswrapper[4895]: E1202 07:28:33.109293 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 02 07:28:33 crc kubenswrapper[4895]: I1202 07:28:33.109310 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 02 07:28:33 crc kubenswrapper[4895]: E1202 07:28:33.109330 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4ffa89c-1b2e-4dd4-afa5-f34cd8260364" containerName="extract-utilities" Dec 02 07:28:33 crc kubenswrapper[4895]: I1202 07:28:33.109343 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4ffa89c-1b2e-4dd4-afa5-f34cd8260364" containerName="extract-utilities" Dec 02 07:28:33 crc kubenswrapper[4895]: E1202 
07:28:33.109358 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ad8c7c3-c1ac-498b-b1b5-76f1db5c1c19" containerName="extract-utilities" Dec 02 07:28:33 crc kubenswrapper[4895]: I1202 07:28:33.109374 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ad8c7c3-c1ac-498b-b1b5-76f1db5c1c19" containerName="extract-utilities" Dec 02 07:28:33 crc kubenswrapper[4895]: E1202 07:28:33.109394 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a9d5b86-ddba-433a-91c3-efe2043f66e3" containerName="marketplace-operator" Dec 02 07:28:33 crc kubenswrapper[4895]: I1202 07:28:33.109408 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a9d5b86-ddba-433a-91c3-efe2043f66e3" containerName="marketplace-operator" Dec 02 07:28:33 crc kubenswrapper[4895]: E1202 07:28:33.109423 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4ffa89c-1b2e-4dd4-afa5-f34cd8260364" containerName="registry-server" Dec 02 07:28:33 crc kubenswrapper[4895]: I1202 07:28:33.109436 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4ffa89c-1b2e-4dd4-afa5-f34cd8260364" containerName="registry-server" Dec 02 07:28:33 crc kubenswrapper[4895]: E1202 07:28:33.109454 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0fded14-dfbe-41aa-af93-f68c62a1aca1" containerName="registry-server" Dec 02 07:28:33 crc kubenswrapper[4895]: I1202 07:28:33.109467 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0fded14-dfbe-41aa-af93-f68c62a1aca1" containerName="registry-server" Dec 02 07:28:33 crc kubenswrapper[4895]: E1202 07:28:33.109487 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4ffa89c-1b2e-4dd4-afa5-f34cd8260364" containerName="extract-content" Dec 02 07:28:33 crc kubenswrapper[4895]: I1202 07:28:33.109500 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4ffa89c-1b2e-4dd4-afa5-f34cd8260364" containerName="extract-content" Dec 02 07:28:33 crc kubenswrapper[4895]: I1202 
07:28:33.109661 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a9d5b86-ddba-433a-91c3-efe2043f66e3" containerName="marketplace-operator" Dec 02 07:28:33 crc kubenswrapper[4895]: I1202 07:28:33.109678 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="24eca501-8830-4bc6-8a5e-e00d227e841c" containerName="registry-server" Dec 02 07:28:33 crc kubenswrapper[4895]: I1202 07:28:33.109695 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b0aeead-bbd7-4ba2-901f-2aa5be9899b3" containerName="installer" Dec 02 07:28:33 crc kubenswrapper[4895]: I1202 07:28:33.109713 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ad8c7c3-c1ac-498b-b1b5-76f1db5c1c19" containerName="registry-server" Dec 02 07:28:33 crc kubenswrapper[4895]: I1202 07:28:33.109767 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 02 07:28:33 crc kubenswrapper[4895]: I1202 07:28:33.109782 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4ffa89c-1b2e-4dd4-afa5-f34cd8260364" containerName="registry-server" Dec 02 07:28:33 crc kubenswrapper[4895]: I1202 07:28:33.109801 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0fded14-dfbe-41aa-af93-f68c62a1aca1" containerName="registry-server" Dec 02 07:28:33 crc kubenswrapper[4895]: I1202 07:28:33.111173 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xzs6v" Dec 02 07:28:33 crc kubenswrapper[4895]: I1202 07:28:33.113808 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 02 07:28:33 crc kubenswrapper[4895]: I1202 07:28:33.114294 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 02 07:28:33 crc kubenswrapper[4895]: I1202 07:28:33.116161 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 02 07:28:33 crc kubenswrapper[4895]: I1202 07:28:33.125411 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xzs6v"] Dec 02 07:28:33 crc kubenswrapper[4895]: I1202 07:28:33.297962 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-c6tgt"] Dec 02 07:28:33 crc kubenswrapper[4895]: I1202 07:28:33.299378 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-c6tgt" Dec 02 07:28:33 crc kubenswrapper[4895]: I1202 07:28:33.303117 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 02 07:28:33 crc kubenswrapper[4895]: I1202 07:28:33.303583 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b60cfc9e-fdbe-4373-9c07-9db6a265b945-catalog-content\") pod \"redhat-marketplace-xzs6v\" (UID: \"b60cfc9e-fdbe-4373-9c07-9db6a265b945\") " pod="openshift-marketplace/redhat-marketplace-xzs6v" Dec 02 07:28:33 crc kubenswrapper[4895]: I1202 07:28:33.303829 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b60cfc9e-fdbe-4373-9c07-9db6a265b945-utilities\") pod \"redhat-marketplace-xzs6v\" (UID: \"b60cfc9e-fdbe-4373-9c07-9db6a265b945\") " pod="openshift-marketplace/redhat-marketplace-xzs6v" Dec 02 07:28:33 crc kubenswrapper[4895]: I1202 07:28:33.303970 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a20409dc-2f9e-4c3b-b83e-11d31404503a-catalog-content\") pod \"redhat-operators-c6tgt\" (UID: \"a20409dc-2f9e-4c3b-b83e-11d31404503a\") " pod="openshift-marketplace/redhat-operators-c6tgt" Dec 02 07:28:33 crc kubenswrapper[4895]: I1202 07:28:33.304482 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a20409dc-2f9e-4c3b-b83e-11d31404503a-utilities\") pod \"redhat-operators-c6tgt\" (UID: \"a20409dc-2f9e-4c3b-b83e-11d31404503a\") " pod="openshift-marketplace/redhat-operators-c6tgt" Dec 02 07:28:33 crc kubenswrapper[4895]: I1202 07:28:33.304554 4895 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwwr9\" (UniqueName: \"kubernetes.io/projected/a20409dc-2f9e-4c3b-b83e-11d31404503a-kube-api-access-dwwr9\") pod \"redhat-operators-c6tgt\" (UID: \"a20409dc-2f9e-4c3b-b83e-11d31404503a\") " pod="openshift-marketplace/redhat-operators-c6tgt" Dec 02 07:28:33 crc kubenswrapper[4895]: I1202 07:28:33.304823 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6m2t\" (UniqueName: \"kubernetes.io/projected/b60cfc9e-fdbe-4373-9c07-9db6a265b945-kube-api-access-h6m2t\") pod \"redhat-marketplace-xzs6v\" (UID: \"b60cfc9e-fdbe-4373-9c07-9db6a265b945\") " pod="openshift-marketplace/redhat-marketplace-xzs6v" Dec 02 07:28:33 crc kubenswrapper[4895]: I1202 07:28:33.307890 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-c6tgt"] Dec 02 07:28:33 crc kubenswrapper[4895]: I1202 07:28:33.407088 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a20409dc-2f9e-4c3b-b83e-11d31404503a-utilities\") pod \"redhat-operators-c6tgt\" (UID: \"a20409dc-2f9e-4c3b-b83e-11d31404503a\") " pod="openshift-marketplace/redhat-operators-c6tgt" Dec 02 07:28:33 crc kubenswrapper[4895]: I1202 07:28:33.407161 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwwr9\" (UniqueName: \"kubernetes.io/projected/a20409dc-2f9e-4c3b-b83e-11d31404503a-kube-api-access-dwwr9\") pod \"redhat-operators-c6tgt\" (UID: \"a20409dc-2f9e-4c3b-b83e-11d31404503a\") " pod="openshift-marketplace/redhat-operators-c6tgt" Dec 02 07:28:33 crc kubenswrapper[4895]: I1202 07:28:33.407211 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6m2t\" (UniqueName: \"kubernetes.io/projected/b60cfc9e-fdbe-4373-9c07-9db6a265b945-kube-api-access-h6m2t\") pod 
\"redhat-marketplace-xzs6v\" (UID: \"b60cfc9e-fdbe-4373-9c07-9db6a265b945\") " pod="openshift-marketplace/redhat-marketplace-xzs6v" Dec 02 07:28:33 crc kubenswrapper[4895]: I1202 07:28:33.407250 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b60cfc9e-fdbe-4373-9c07-9db6a265b945-catalog-content\") pod \"redhat-marketplace-xzs6v\" (UID: \"b60cfc9e-fdbe-4373-9c07-9db6a265b945\") " pod="openshift-marketplace/redhat-marketplace-xzs6v" Dec 02 07:28:33 crc kubenswrapper[4895]: I1202 07:28:33.407303 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b60cfc9e-fdbe-4373-9c07-9db6a265b945-utilities\") pod \"redhat-marketplace-xzs6v\" (UID: \"b60cfc9e-fdbe-4373-9c07-9db6a265b945\") " pod="openshift-marketplace/redhat-marketplace-xzs6v" Dec 02 07:28:33 crc kubenswrapper[4895]: I1202 07:28:33.407327 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a20409dc-2f9e-4c3b-b83e-11d31404503a-catalog-content\") pod \"redhat-operators-c6tgt\" (UID: \"a20409dc-2f9e-4c3b-b83e-11d31404503a\") " pod="openshift-marketplace/redhat-operators-c6tgt" Dec 02 07:28:33 crc kubenswrapper[4895]: I1202 07:28:33.407836 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a20409dc-2f9e-4c3b-b83e-11d31404503a-utilities\") pod \"redhat-operators-c6tgt\" (UID: \"a20409dc-2f9e-4c3b-b83e-11d31404503a\") " pod="openshift-marketplace/redhat-operators-c6tgt" Dec 02 07:28:33 crc kubenswrapper[4895]: I1202 07:28:33.407873 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a20409dc-2f9e-4c3b-b83e-11d31404503a-catalog-content\") pod \"redhat-operators-c6tgt\" (UID: 
\"a20409dc-2f9e-4c3b-b83e-11d31404503a\") " pod="openshift-marketplace/redhat-operators-c6tgt" Dec 02 07:28:33 crc kubenswrapper[4895]: I1202 07:28:33.408272 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b60cfc9e-fdbe-4373-9c07-9db6a265b945-utilities\") pod \"redhat-marketplace-xzs6v\" (UID: \"b60cfc9e-fdbe-4373-9c07-9db6a265b945\") " pod="openshift-marketplace/redhat-marketplace-xzs6v" Dec 02 07:28:33 crc kubenswrapper[4895]: I1202 07:28:33.408264 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b60cfc9e-fdbe-4373-9c07-9db6a265b945-catalog-content\") pod \"redhat-marketplace-xzs6v\" (UID: \"b60cfc9e-fdbe-4373-9c07-9db6a265b945\") " pod="openshift-marketplace/redhat-marketplace-xzs6v" Dec 02 07:28:33 crc kubenswrapper[4895]: I1202 07:28:33.440764 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6m2t\" (UniqueName: \"kubernetes.io/projected/b60cfc9e-fdbe-4373-9c07-9db6a265b945-kube-api-access-h6m2t\") pod \"redhat-marketplace-xzs6v\" (UID: \"b60cfc9e-fdbe-4373-9c07-9db6a265b945\") " pod="openshift-marketplace/redhat-marketplace-xzs6v" Dec 02 07:28:33 crc kubenswrapper[4895]: I1202 07:28:33.446641 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwwr9\" (UniqueName: \"kubernetes.io/projected/a20409dc-2f9e-4c3b-b83e-11d31404503a-kube-api-access-dwwr9\") pod \"redhat-operators-c6tgt\" (UID: \"a20409dc-2f9e-4c3b-b83e-11d31404503a\") " pod="openshift-marketplace/redhat-operators-c6tgt" Dec 02 07:28:33 crc kubenswrapper[4895]: I1202 07:28:33.694147 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-c6tgt" Dec 02 07:28:33 crc kubenswrapper[4895]: I1202 07:28:33.740964 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xzs6v" Dec 02 07:28:34 crc kubenswrapper[4895]: I1202 07:28:34.018472 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xzs6v"] Dec 02 07:28:34 crc kubenswrapper[4895]: W1202 07:28:34.028342 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb60cfc9e_fdbe_4373_9c07_9db6a265b945.slice/crio-f6fdd98e6bdc8304ee47ec3e87618ba09d01f5c45dfb218d8f09fcf594b3a1b8 WatchSource:0}: Error finding container f6fdd98e6bdc8304ee47ec3e87618ba09d01f5c45dfb218d8f09fcf594b3a1b8: Status 404 returned error can't find the container with id f6fdd98e6bdc8304ee47ec3e87618ba09d01f5c45dfb218d8f09fcf594b3a1b8 Dec 02 07:28:34 crc kubenswrapper[4895]: I1202 07:28:34.161853 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-c6tgt"] Dec 02 07:28:34 crc kubenswrapper[4895]: I1202 07:28:34.558001 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xzs6v" event={"ID":"b60cfc9e-fdbe-4373-9c07-9db6a265b945","Type":"ContainerStarted","Data":"f6fdd98e6bdc8304ee47ec3e87618ba09d01f5c45dfb218d8f09fcf594b3a1b8"} Dec 02 07:28:34 crc kubenswrapper[4895]: I1202 07:28:34.559148 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c6tgt" event={"ID":"a20409dc-2f9e-4c3b-b83e-11d31404503a","Type":"ContainerStarted","Data":"c6017d64104c769f9f67b38987b66c130cd5037386dff03ce30d7e0db987e7ec"} Dec 02 07:28:34 crc kubenswrapper[4895]: I1202 07:28:34.904542 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-nd26q"] Dec 02 07:28:34 crc kubenswrapper[4895]: I1202 07:28:34.906606 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nd26q" Dec 02 07:28:34 crc kubenswrapper[4895]: I1202 07:28:34.913213 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 02 07:28:34 crc kubenswrapper[4895]: I1202 07:28:34.920332 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nd26q"] Dec 02 07:28:34 crc kubenswrapper[4895]: I1202 07:28:34.935274 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15e19f4e-9f62-4b48-be1e-6aaab358c5d4-utilities\") pod \"community-operators-nd26q\" (UID: \"15e19f4e-9f62-4b48-be1e-6aaab358c5d4\") " pod="openshift-marketplace/community-operators-nd26q" Dec 02 07:28:34 crc kubenswrapper[4895]: I1202 07:28:34.935332 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6l57k\" (UniqueName: \"kubernetes.io/projected/15e19f4e-9f62-4b48-be1e-6aaab358c5d4-kube-api-access-6l57k\") pod \"community-operators-nd26q\" (UID: \"15e19f4e-9f62-4b48-be1e-6aaab358c5d4\") " pod="openshift-marketplace/community-operators-nd26q" Dec 02 07:28:34 crc kubenswrapper[4895]: I1202 07:28:34.935353 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15e19f4e-9f62-4b48-be1e-6aaab358c5d4-catalog-content\") pod \"community-operators-nd26q\" (UID: \"15e19f4e-9f62-4b48-be1e-6aaab358c5d4\") " pod="openshift-marketplace/community-operators-nd26q" Dec 02 07:28:35 crc kubenswrapper[4895]: I1202 07:28:35.037127 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15e19f4e-9f62-4b48-be1e-6aaab358c5d4-utilities\") pod \"community-operators-nd26q\" (UID: 
\"15e19f4e-9f62-4b48-be1e-6aaab358c5d4\") " pod="openshift-marketplace/community-operators-nd26q" Dec 02 07:28:35 crc kubenswrapper[4895]: I1202 07:28:35.037425 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6l57k\" (UniqueName: \"kubernetes.io/projected/15e19f4e-9f62-4b48-be1e-6aaab358c5d4-kube-api-access-6l57k\") pod \"community-operators-nd26q\" (UID: \"15e19f4e-9f62-4b48-be1e-6aaab358c5d4\") " pod="openshift-marketplace/community-operators-nd26q" Dec 02 07:28:35 crc kubenswrapper[4895]: I1202 07:28:35.037538 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15e19f4e-9f62-4b48-be1e-6aaab358c5d4-catalog-content\") pod \"community-operators-nd26q\" (UID: \"15e19f4e-9f62-4b48-be1e-6aaab358c5d4\") " pod="openshift-marketplace/community-operators-nd26q" Dec 02 07:28:35 crc kubenswrapper[4895]: I1202 07:28:35.037975 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15e19f4e-9f62-4b48-be1e-6aaab358c5d4-utilities\") pod \"community-operators-nd26q\" (UID: \"15e19f4e-9f62-4b48-be1e-6aaab358c5d4\") " pod="openshift-marketplace/community-operators-nd26q" Dec 02 07:28:35 crc kubenswrapper[4895]: I1202 07:28:35.038103 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15e19f4e-9f62-4b48-be1e-6aaab358c5d4-catalog-content\") pod \"community-operators-nd26q\" (UID: \"15e19f4e-9f62-4b48-be1e-6aaab358c5d4\") " pod="openshift-marketplace/community-operators-nd26q" Dec 02 07:28:35 crc kubenswrapper[4895]: I1202 07:28:35.067400 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6l57k\" (UniqueName: \"kubernetes.io/projected/15e19f4e-9f62-4b48-be1e-6aaab358c5d4-kube-api-access-6l57k\") pod \"community-operators-nd26q\" (UID: 
\"15e19f4e-9f62-4b48-be1e-6aaab358c5d4\") " pod="openshift-marketplace/community-operators-nd26q" Dec 02 07:28:35 crc kubenswrapper[4895]: I1202 07:28:35.291244 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nd26q" Dec 02 07:28:35 crc kubenswrapper[4895]: I1202 07:28:35.567105 4895 generic.go:334] "Generic (PLEG): container finished" podID="a20409dc-2f9e-4c3b-b83e-11d31404503a" containerID="38ed9087119ecfa2354ff2621bfed0898a9c156ea8b4446e7e1a980f4b06c5af" exitCode=0 Dec 02 07:28:35 crc kubenswrapper[4895]: I1202 07:28:35.567240 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c6tgt" event={"ID":"a20409dc-2f9e-4c3b-b83e-11d31404503a","Type":"ContainerDied","Data":"38ed9087119ecfa2354ff2621bfed0898a9c156ea8b4446e7e1a980f4b06c5af"} Dec 02 07:28:35 crc kubenswrapper[4895]: I1202 07:28:35.572983 4895 generic.go:334] "Generic (PLEG): container finished" podID="b60cfc9e-fdbe-4373-9c07-9db6a265b945" containerID="bd86ce10d71d1109f4ed9cdcd3899438a58b41c62fb2ad393a607c9fc87c9581" exitCode=0 Dec 02 07:28:35 crc kubenswrapper[4895]: I1202 07:28:35.573062 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xzs6v" event={"ID":"b60cfc9e-fdbe-4373-9c07-9db6a265b945","Type":"ContainerDied","Data":"bd86ce10d71d1109f4ed9cdcd3899438a58b41c62fb2ad393a607c9fc87c9581"} Dec 02 07:28:35 crc kubenswrapper[4895]: I1202 07:28:35.718409 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nd26q"] Dec 02 07:28:36 crc kubenswrapper[4895]: I1202 07:28:36.306498 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-jcdj8"] Dec 02 07:28:36 crc kubenswrapper[4895]: I1202 07:28:36.308399 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jcdj8" Dec 02 07:28:36 crc kubenswrapper[4895]: I1202 07:28:36.311357 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 02 07:28:36 crc kubenswrapper[4895]: I1202 07:28:36.321664 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jcdj8"] Dec 02 07:28:36 crc kubenswrapper[4895]: I1202 07:28:36.356285 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3f4e1c5-a3e3-4391-be72-d2f2b908da65-utilities\") pod \"certified-operators-jcdj8\" (UID: \"e3f4e1c5-a3e3-4391-be72-d2f2b908da65\") " pod="openshift-marketplace/certified-operators-jcdj8" Dec 02 07:28:36 crc kubenswrapper[4895]: I1202 07:28:36.356446 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3f4e1c5-a3e3-4391-be72-d2f2b908da65-catalog-content\") pod \"certified-operators-jcdj8\" (UID: \"e3f4e1c5-a3e3-4391-be72-d2f2b908da65\") " pod="openshift-marketplace/certified-operators-jcdj8" Dec 02 07:28:36 crc kubenswrapper[4895]: I1202 07:28:36.356503 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85h5p\" (UniqueName: \"kubernetes.io/projected/e3f4e1c5-a3e3-4391-be72-d2f2b908da65-kube-api-access-85h5p\") pod \"certified-operators-jcdj8\" (UID: \"e3f4e1c5-a3e3-4391-be72-d2f2b908da65\") " pod="openshift-marketplace/certified-operators-jcdj8" Dec 02 07:28:36 crc kubenswrapper[4895]: I1202 07:28:36.458043 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3f4e1c5-a3e3-4391-be72-d2f2b908da65-catalog-content\") pod \"certified-operators-jcdj8\" (UID: 
\"e3f4e1c5-a3e3-4391-be72-d2f2b908da65\") " pod="openshift-marketplace/certified-operators-jcdj8" Dec 02 07:28:36 crc kubenswrapper[4895]: I1202 07:28:36.458119 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85h5p\" (UniqueName: \"kubernetes.io/projected/e3f4e1c5-a3e3-4391-be72-d2f2b908da65-kube-api-access-85h5p\") pod \"certified-operators-jcdj8\" (UID: \"e3f4e1c5-a3e3-4391-be72-d2f2b908da65\") " pod="openshift-marketplace/certified-operators-jcdj8" Dec 02 07:28:36 crc kubenswrapper[4895]: I1202 07:28:36.458155 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3f4e1c5-a3e3-4391-be72-d2f2b908da65-utilities\") pod \"certified-operators-jcdj8\" (UID: \"e3f4e1c5-a3e3-4391-be72-d2f2b908da65\") " pod="openshift-marketplace/certified-operators-jcdj8" Dec 02 07:28:36 crc kubenswrapper[4895]: I1202 07:28:36.458554 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3f4e1c5-a3e3-4391-be72-d2f2b908da65-catalog-content\") pod \"certified-operators-jcdj8\" (UID: \"e3f4e1c5-a3e3-4391-be72-d2f2b908da65\") " pod="openshift-marketplace/certified-operators-jcdj8" Dec 02 07:28:36 crc kubenswrapper[4895]: I1202 07:28:36.458564 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3f4e1c5-a3e3-4391-be72-d2f2b908da65-utilities\") pod \"certified-operators-jcdj8\" (UID: \"e3f4e1c5-a3e3-4391-be72-d2f2b908da65\") " pod="openshift-marketplace/certified-operators-jcdj8" Dec 02 07:28:36 crc kubenswrapper[4895]: I1202 07:28:36.489455 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85h5p\" (UniqueName: \"kubernetes.io/projected/e3f4e1c5-a3e3-4391-be72-d2f2b908da65-kube-api-access-85h5p\") pod \"certified-operators-jcdj8\" (UID: 
\"e3f4e1c5-a3e3-4391-be72-d2f2b908da65\") " pod="openshift-marketplace/certified-operators-jcdj8" Dec 02 07:28:36 crc kubenswrapper[4895]: I1202 07:28:36.583774 4895 generic.go:334] "Generic (PLEG): container finished" podID="b60cfc9e-fdbe-4373-9c07-9db6a265b945" containerID="d45689d91717bbab7e3b44fdef642a68ce0ad4f648629f368dcf201b99766165" exitCode=0 Dec 02 07:28:36 crc kubenswrapper[4895]: I1202 07:28:36.583845 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xzs6v" event={"ID":"b60cfc9e-fdbe-4373-9c07-9db6a265b945","Type":"ContainerDied","Data":"d45689d91717bbab7e3b44fdef642a68ce0ad4f648629f368dcf201b99766165"} Dec 02 07:28:36 crc kubenswrapper[4895]: I1202 07:28:36.587695 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c6tgt" event={"ID":"a20409dc-2f9e-4c3b-b83e-11d31404503a","Type":"ContainerStarted","Data":"5caa2d4632bb39d29fd0eaf5b3c86ad4876d590aaf7124d94526766aa2092575"} Dec 02 07:28:36 crc kubenswrapper[4895]: I1202 07:28:36.590186 4895 generic.go:334] "Generic (PLEG): container finished" podID="15e19f4e-9f62-4b48-be1e-6aaab358c5d4" containerID="21a9cbd4f72db81b60a7d8a329b43e15d14442c4e102fbff1243abbaa330b9d2" exitCode=0 Dec 02 07:28:36 crc kubenswrapper[4895]: I1202 07:28:36.590244 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nd26q" event={"ID":"15e19f4e-9f62-4b48-be1e-6aaab358c5d4","Type":"ContainerDied","Data":"21a9cbd4f72db81b60a7d8a329b43e15d14442c4e102fbff1243abbaa330b9d2"} Dec 02 07:28:36 crc kubenswrapper[4895]: I1202 07:28:36.590276 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nd26q" event={"ID":"15e19f4e-9f62-4b48-be1e-6aaab358c5d4","Type":"ContainerStarted","Data":"b1c43e63f27f333480d2d928e12a2e20da8e2892c48a1079b04facebc4bc352c"} Dec 02 07:28:36 crc kubenswrapper[4895]: I1202 07:28:36.671129 4895 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jcdj8" Dec 02 07:28:37 crc kubenswrapper[4895]: I1202 07:28:37.157619 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jcdj8"] Dec 02 07:28:37 crc kubenswrapper[4895]: I1202 07:28:37.504774 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-qq7dd"] Dec 02 07:28:37 crc kubenswrapper[4895]: I1202 07:28:37.507288 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-qq7dd" Dec 02 07:28:37 crc kubenswrapper[4895]: I1202 07:28:37.517942 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Dec 02 07:28:37 crc kubenswrapper[4895]: I1202 07:28:37.521612 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 02 07:28:37 crc kubenswrapper[4895]: I1202 07:28:37.527698 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 02 07:28:37 crc kubenswrapper[4895]: I1202 07:28:37.535151 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-qq7dd"] Dec 02 07:28:37 crc kubenswrapper[4895]: I1202 07:28:37.586105 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-cm47r"] Dec 02 07:28:37 crc kubenswrapper[4895]: I1202 07:28:37.587583 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-cm47r" Dec 02 07:28:37 crc kubenswrapper[4895]: I1202 07:28:37.599926 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-cm47r"] Dec 02 07:28:37 crc kubenswrapper[4895]: I1202 07:28:37.615434 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xzs6v" event={"ID":"b60cfc9e-fdbe-4373-9c07-9db6a265b945","Type":"ContainerStarted","Data":"7afac7efd142d7584458cfa4454afc589acab9d78a0aca07cf4a91e52903ea4b"} Dec 02 07:28:37 crc kubenswrapper[4895]: I1202 07:28:37.618021 4895 generic.go:334] "Generic (PLEG): container finished" podID="e3f4e1c5-a3e3-4391-be72-d2f2b908da65" containerID="9f45aa1d67b4452b880db8b0f42a0c39c5451c04383dc5c4c6c746602c1175d4" exitCode=0 Dec 02 07:28:37 crc kubenswrapper[4895]: I1202 07:28:37.618095 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jcdj8" event={"ID":"e3f4e1c5-a3e3-4391-be72-d2f2b908da65","Type":"ContainerDied","Data":"9f45aa1d67b4452b880db8b0f42a0c39c5451c04383dc5c4c6c746602c1175d4"} Dec 02 07:28:37 crc kubenswrapper[4895]: I1202 07:28:37.618113 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jcdj8" event={"ID":"e3f4e1c5-a3e3-4391-be72-d2f2b908da65","Type":"ContainerStarted","Data":"c1c5a83db58e2b70841386dc7c60f39463b82ae8808d54e46d5d36f29093f12a"} Dec 02 07:28:37 crc kubenswrapper[4895]: I1202 07:28:37.622275 4895 generic.go:334] "Generic (PLEG): container finished" podID="a20409dc-2f9e-4c3b-b83e-11d31404503a" containerID="5caa2d4632bb39d29fd0eaf5b3c86ad4876d590aaf7124d94526766aa2092575" exitCode=0 Dec 02 07:28:37 crc kubenswrapper[4895]: I1202 07:28:37.622378 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c6tgt" 
event={"ID":"a20409dc-2f9e-4c3b-b83e-11d31404503a","Type":"ContainerDied","Data":"5caa2d4632bb39d29fd0eaf5b3c86ad4876d590aaf7124d94526766aa2092575"} Dec 02 07:28:37 crc kubenswrapper[4895]: I1202 07:28:37.630177 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nd26q" event={"ID":"15e19f4e-9f62-4b48-be1e-6aaab358c5d4","Type":"ContainerStarted","Data":"530922f89c90faccae2745204ad33c4cb8c17d8da1a820532bfc63d8ad5e7fc7"} Dec 02 07:28:37 crc kubenswrapper[4895]: I1202 07:28:37.679355 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/a5fe9776-f107-4203-9298-a7a94665cdb4-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-qq7dd\" (UID: \"a5fe9776-f107-4203-9298-a7a94665cdb4\") " pod="openshift-marketplace/marketplace-operator-79b997595-qq7dd" Dec 02 07:28:37 crc kubenswrapper[4895]: I1202 07:28:37.679438 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2xc7\" (UniqueName: \"kubernetes.io/projected/a5fe9776-f107-4203-9298-a7a94665cdb4-kube-api-access-k2xc7\") pod \"marketplace-operator-79b997595-qq7dd\" (UID: \"a5fe9776-f107-4203-9298-a7a94665cdb4\") " pod="openshift-marketplace/marketplace-operator-79b997595-qq7dd" Dec 02 07:28:37 crc kubenswrapper[4895]: I1202 07:28:37.679502 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a5fe9776-f107-4203-9298-a7a94665cdb4-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-qq7dd\" (UID: \"a5fe9776-f107-4203-9298-a7a94665cdb4\") " pod="openshift-marketplace/marketplace-operator-79b997595-qq7dd" Dec 02 07:28:37 crc kubenswrapper[4895]: I1202 07:28:37.702786 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/redhat-marketplace-xzs6v" podStartSLOduration=3.285762191 podStartE2EDuration="4.70276317s" podCreationTimestamp="2025-12-02 07:28:33 +0000 UTC" firstStartedPulling="2025-12-02 07:28:35.575651002 +0000 UTC m=+326.746510615" lastFinishedPulling="2025-12-02 07:28:36.992651971 +0000 UTC m=+328.163511594" observedRunningTime="2025-12-02 07:28:37.700168318 +0000 UTC m=+328.871027961" watchObservedRunningTime="2025-12-02 07:28:37.70276317 +0000 UTC m=+328.873622783" Dec 02 07:28:37 crc kubenswrapper[4895]: I1202 07:28:37.781145 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/a5fe9776-f107-4203-9298-a7a94665cdb4-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-qq7dd\" (UID: \"a5fe9776-f107-4203-9298-a7a94665cdb4\") " pod="openshift-marketplace/marketplace-operator-79b997595-qq7dd" Dec 02 07:28:37 crc kubenswrapper[4895]: I1202 07:28:37.781253 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzbp2\" (UniqueName: \"kubernetes.io/projected/1212316e-b8c7-4ced-b492-9a5bf84ee942-kube-api-access-bzbp2\") pod \"image-registry-66df7c8f76-cm47r\" (UID: \"1212316e-b8c7-4ced-b492-9a5bf84ee942\") " pod="openshift-image-registry/image-registry-66df7c8f76-cm47r" Dec 02 07:28:37 crc kubenswrapper[4895]: I1202 07:28:37.781312 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k2xc7\" (UniqueName: \"kubernetes.io/projected/a5fe9776-f107-4203-9298-a7a94665cdb4-kube-api-access-k2xc7\") pod \"marketplace-operator-79b997595-qq7dd\" (UID: \"a5fe9776-f107-4203-9298-a7a94665cdb4\") " pod="openshift-marketplace/marketplace-operator-79b997595-qq7dd" Dec 02 07:28:37 crc kubenswrapper[4895]: I1202 07:28:37.781336 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" 
(UniqueName: \"kubernetes.io/configmap/1212316e-b8c7-4ced-b492-9a5bf84ee942-registry-certificates\") pod \"image-registry-66df7c8f76-cm47r\" (UID: \"1212316e-b8c7-4ced-b492-9a5bf84ee942\") " pod="openshift-image-registry/image-registry-66df7c8f76-cm47r" Dec 02 07:28:37 crc kubenswrapper[4895]: I1202 07:28:37.781369 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1212316e-b8c7-4ced-b492-9a5bf84ee942-installation-pull-secrets\") pod \"image-registry-66df7c8f76-cm47r\" (UID: \"1212316e-b8c7-4ced-b492-9a5bf84ee942\") " pod="openshift-image-registry/image-registry-66df7c8f76-cm47r" Dec 02 07:28:37 crc kubenswrapper[4895]: I1202 07:28:37.781410 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-cm47r\" (UID: \"1212316e-b8c7-4ced-b492-9a5bf84ee942\") " pod="openshift-image-registry/image-registry-66df7c8f76-cm47r" Dec 02 07:28:37 crc kubenswrapper[4895]: I1202 07:28:37.781455 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1212316e-b8c7-4ced-b492-9a5bf84ee942-trusted-ca\") pod \"image-registry-66df7c8f76-cm47r\" (UID: \"1212316e-b8c7-4ced-b492-9a5bf84ee942\") " pod="openshift-image-registry/image-registry-66df7c8f76-cm47r" Dec 02 07:28:37 crc kubenswrapper[4895]: I1202 07:28:37.781483 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a5fe9776-f107-4203-9298-a7a94665cdb4-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-qq7dd\" (UID: \"a5fe9776-f107-4203-9298-a7a94665cdb4\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-qq7dd" Dec 02 07:28:37 crc kubenswrapper[4895]: I1202 07:28:37.781512 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1212316e-b8c7-4ced-b492-9a5bf84ee942-ca-trust-extracted\") pod \"image-registry-66df7c8f76-cm47r\" (UID: \"1212316e-b8c7-4ced-b492-9a5bf84ee942\") " pod="openshift-image-registry/image-registry-66df7c8f76-cm47r" Dec 02 07:28:37 crc kubenswrapper[4895]: I1202 07:28:37.781538 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1212316e-b8c7-4ced-b492-9a5bf84ee942-bound-sa-token\") pod \"image-registry-66df7c8f76-cm47r\" (UID: \"1212316e-b8c7-4ced-b492-9a5bf84ee942\") " pod="openshift-image-registry/image-registry-66df7c8f76-cm47r" Dec 02 07:28:37 crc kubenswrapper[4895]: I1202 07:28:37.781582 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1212316e-b8c7-4ced-b492-9a5bf84ee942-registry-tls\") pod \"image-registry-66df7c8f76-cm47r\" (UID: \"1212316e-b8c7-4ced-b492-9a5bf84ee942\") " pod="openshift-image-registry/image-registry-66df7c8f76-cm47r" Dec 02 07:28:37 crc kubenswrapper[4895]: I1202 07:28:37.783401 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a5fe9776-f107-4203-9298-a7a94665cdb4-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-qq7dd\" (UID: \"a5fe9776-f107-4203-9298-a7a94665cdb4\") " pod="openshift-marketplace/marketplace-operator-79b997595-qq7dd" Dec 02 07:28:37 crc kubenswrapper[4895]: I1202 07:28:37.791940 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/a5fe9776-f107-4203-9298-a7a94665cdb4-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-qq7dd\" (UID: \"a5fe9776-f107-4203-9298-a7a94665cdb4\") " pod="openshift-marketplace/marketplace-operator-79b997595-qq7dd" Dec 02 07:28:37 crc kubenswrapper[4895]: I1202 07:28:37.799081 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2xc7\" (UniqueName: \"kubernetes.io/projected/a5fe9776-f107-4203-9298-a7a94665cdb4-kube-api-access-k2xc7\") pod \"marketplace-operator-79b997595-qq7dd\" (UID: \"a5fe9776-f107-4203-9298-a7a94665cdb4\") " pod="openshift-marketplace/marketplace-operator-79b997595-qq7dd" Dec 02 07:28:37 crc kubenswrapper[4895]: I1202 07:28:37.813414 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-cm47r\" (UID: \"1212316e-b8c7-4ced-b492-9a5bf84ee942\") " pod="openshift-image-registry/image-registry-66df7c8f76-cm47r" Dec 02 07:28:37 crc kubenswrapper[4895]: I1202 07:28:37.840533 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-qq7dd" Dec 02 07:28:37 crc kubenswrapper[4895]: I1202 07:28:37.883426 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1212316e-b8c7-4ced-b492-9a5bf84ee942-registry-tls\") pod \"image-registry-66df7c8f76-cm47r\" (UID: \"1212316e-b8c7-4ced-b492-9a5bf84ee942\") " pod="openshift-image-registry/image-registry-66df7c8f76-cm47r" Dec 02 07:28:37 crc kubenswrapper[4895]: I1202 07:28:37.883938 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bzbp2\" (UniqueName: \"kubernetes.io/projected/1212316e-b8c7-4ced-b492-9a5bf84ee942-kube-api-access-bzbp2\") pod \"image-registry-66df7c8f76-cm47r\" (UID: \"1212316e-b8c7-4ced-b492-9a5bf84ee942\") " pod="openshift-image-registry/image-registry-66df7c8f76-cm47r" Dec 02 07:28:37 crc kubenswrapper[4895]: I1202 07:28:37.883979 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1212316e-b8c7-4ced-b492-9a5bf84ee942-registry-certificates\") pod \"image-registry-66df7c8f76-cm47r\" (UID: \"1212316e-b8c7-4ced-b492-9a5bf84ee942\") " pod="openshift-image-registry/image-registry-66df7c8f76-cm47r" Dec 02 07:28:37 crc kubenswrapper[4895]: I1202 07:28:37.884009 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1212316e-b8c7-4ced-b492-9a5bf84ee942-installation-pull-secrets\") pod \"image-registry-66df7c8f76-cm47r\" (UID: \"1212316e-b8c7-4ced-b492-9a5bf84ee942\") " pod="openshift-image-registry/image-registry-66df7c8f76-cm47r" Dec 02 07:28:37 crc kubenswrapper[4895]: I1202 07:28:37.884050 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/1212316e-b8c7-4ced-b492-9a5bf84ee942-trusted-ca\") pod \"image-registry-66df7c8f76-cm47r\" (UID: \"1212316e-b8c7-4ced-b492-9a5bf84ee942\") " pod="openshift-image-registry/image-registry-66df7c8f76-cm47r" Dec 02 07:28:37 crc kubenswrapper[4895]: I1202 07:28:37.884073 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1212316e-b8c7-4ced-b492-9a5bf84ee942-ca-trust-extracted\") pod \"image-registry-66df7c8f76-cm47r\" (UID: \"1212316e-b8c7-4ced-b492-9a5bf84ee942\") " pod="openshift-image-registry/image-registry-66df7c8f76-cm47r" Dec 02 07:28:37 crc kubenswrapper[4895]: I1202 07:28:37.884096 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1212316e-b8c7-4ced-b492-9a5bf84ee942-bound-sa-token\") pod \"image-registry-66df7c8f76-cm47r\" (UID: \"1212316e-b8c7-4ced-b492-9a5bf84ee942\") " pod="openshift-image-registry/image-registry-66df7c8f76-cm47r" Dec 02 07:28:37 crc kubenswrapper[4895]: I1202 07:28:37.885800 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1212316e-b8c7-4ced-b492-9a5bf84ee942-registry-certificates\") pod \"image-registry-66df7c8f76-cm47r\" (UID: \"1212316e-b8c7-4ced-b492-9a5bf84ee942\") " pod="openshift-image-registry/image-registry-66df7c8f76-cm47r" Dec 02 07:28:37 crc kubenswrapper[4895]: I1202 07:28:37.886094 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1212316e-b8c7-4ced-b492-9a5bf84ee942-ca-trust-extracted\") pod \"image-registry-66df7c8f76-cm47r\" (UID: \"1212316e-b8c7-4ced-b492-9a5bf84ee942\") " pod="openshift-image-registry/image-registry-66df7c8f76-cm47r" Dec 02 07:28:37 crc kubenswrapper[4895]: I1202 07:28:37.887474 4895 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1212316e-b8c7-4ced-b492-9a5bf84ee942-trusted-ca\") pod \"image-registry-66df7c8f76-cm47r\" (UID: \"1212316e-b8c7-4ced-b492-9a5bf84ee942\") " pod="openshift-image-registry/image-registry-66df7c8f76-cm47r" Dec 02 07:28:37 crc kubenswrapper[4895]: I1202 07:28:37.890124 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1212316e-b8c7-4ced-b492-9a5bf84ee942-registry-tls\") pod \"image-registry-66df7c8f76-cm47r\" (UID: \"1212316e-b8c7-4ced-b492-9a5bf84ee942\") " pod="openshift-image-registry/image-registry-66df7c8f76-cm47r" Dec 02 07:28:37 crc kubenswrapper[4895]: I1202 07:28:37.890275 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1212316e-b8c7-4ced-b492-9a5bf84ee942-installation-pull-secrets\") pod \"image-registry-66df7c8f76-cm47r\" (UID: \"1212316e-b8c7-4ced-b492-9a5bf84ee942\") " pod="openshift-image-registry/image-registry-66df7c8f76-cm47r" Dec 02 07:28:37 crc kubenswrapper[4895]: I1202 07:28:37.902489 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzbp2\" (UniqueName: \"kubernetes.io/projected/1212316e-b8c7-4ced-b492-9a5bf84ee942-kube-api-access-bzbp2\") pod \"image-registry-66df7c8f76-cm47r\" (UID: \"1212316e-b8c7-4ced-b492-9a5bf84ee942\") " pod="openshift-image-registry/image-registry-66df7c8f76-cm47r" Dec 02 07:28:37 crc kubenswrapper[4895]: I1202 07:28:37.902501 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1212316e-b8c7-4ced-b492-9a5bf84ee942-bound-sa-token\") pod \"image-registry-66df7c8f76-cm47r\" (UID: \"1212316e-b8c7-4ced-b492-9a5bf84ee942\") " pod="openshift-image-registry/image-registry-66df7c8f76-cm47r" Dec 02 07:28:37 crc kubenswrapper[4895]: I1202 07:28:37.917047 4895 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-cm47r" Dec 02 07:28:38 crc kubenswrapper[4895]: I1202 07:28:38.079681 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-qq7dd"] Dec 02 07:28:38 crc kubenswrapper[4895]: W1202 07:28:38.090326 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda5fe9776_f107_4203_9298_a7a94665cdb4.slice/crio-e560325c0151d4d677fb2cc4e18b1182b3c0886788430765091a1692cf170b3c WatchSource:0}: Error finding container e560325c0151d4d677fb2cc4e18b1182b3c0886788430765091a1692cf170b3c: Status 404 returned error can't find the container with id e560325c0151d4d677fb2cc4e18b1182b3c0886788430765091a1692cf170b3c Dec 02 07:28:38 crc kubenswrapper[4895]: I1202 07:28:38.379545 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-cm47r"] Dec 02 07:28:38 crc kubenswrapper[4895]: W1202 07:28:38.386018 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1212316e_b8c7_4ced_b492_9a5bf84ee942.slice/crio-f4c43a60108ecb0c62e486efdc7352f2113ca83850401af2a836e57e8c0ae759 WatchSource:0}: Error finding container f4c43a60108ecb0c62e486efdc7352f2113ca83850401af2a836e57e8c0ae759: Status 404 returned error can't find the container with id f4c43a60108ecb0c62e486efdc7352f2113ca83850401af2a836e57e8c0ae759 Dec 02 07:28:38 crc kubenswrapper[4895]: I1202 07:28:38.637125 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jcdj8" event={"ID":"e3f4e1c5-a3e3-4391-be72-d2f2b908da65","Type":"ContainerStarted","Data":"1b060752968384cf6255b356cbdd446b2ab2551d42b0ba68c2077ef36d86e936"} Dec 02 07:28:38 crc kubenswrapper[4895]: I1202 07:28:38.639913 4895 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-image-registry/image-registry-66df7c8f76-cm47r" event={"ID":"1212316e-b8c7-4ced-b492-9a5bf84ee942","Type":"ContainerStarted","Data":"1b2dac43b878fb54e31b11f5e7ec04ceb0c323b36e2c01a402aa4ceaa7bf84e3"} Dec 02 07:28:38 crc kubenswrapper[4895]: I1202 07:28:38.639965 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-cm47r" event={"ID":"1212316e-b8c7-4ced-b492-9a5bf84ee942","Type":"ContainerStarted","Data":"f4c43a60108ecb0c62e486efdc7352f2113ca83850401af2a836e57e8c0ae759"} Dec 02 07:28:38 crc kubenswrapper[4895]: I1202 07:28:38.640075 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-cm47r" Dec 02 07:28:38 crc kubenswrapper[4895]: I1202 07:28:38.641671 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-qq7dd" event={"ID":"a5fe9776-f107-4203-9298-a7a94665cdb4","Type":"ContainerStarted","Data":"51c55881a6375d988e0b7367503c750da6ecd2b8ba46962f6be5c4c3858d907a"} Dec 02 07:28:38 crc kubenswrapper[4895]: I1202 07:28:38.641727 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-qq7dd" event={"ID":"a5fe9776-f107-4203-9298-a7a94665cdb4","Type":"ContainerStarted","Data":"e560325c0151d4d677fb2cc4e18b1182b3c0886788430765091a1692cf170b3c"} Dec 02 07:28:38 crc kubenswrapper[4895]: I1202 07:28:38.641762 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-qq7dd" Dec 02 07:28:38 crc kubenswrapper[4895]: I1202 07:28:38.643291 4895 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-qq7dd container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.66:8080/healthz\": dial tcp 10.217.0.66:8080: connect: connection refused" start-of-body= Dec 02 07:28:38 crc 
kubenswrapper[4895]: I1202 07:28:38.643347 4895 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-qq7dd" podUID="a5fe9776-f107-4203-9298-a7a94665cdb4" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.66:8080/healthz\": dial tcp 10.217.0.66:8080: connect: connection refused" Dec 02 07:28:38 crc kubenswrapper[4895]: I1202 07:28:38.645229 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c6tgt" event={"ID":"a20409dc-2f9e-4c3b-b83e-11d31404503a","Type":"ContainerStarted","Data":"523da89c596a086b26ac579bc133880adcabf19c085c905bbe9a857044de46d3"} Dec 02 07:28:38 crc kubenswrapper[4895]: I1202 07:28:38.646716 4895 generic.go:334] "Generic (PLEG): container finished" podID="15e19f4e-9f62-4b48-be1e-6aaab358c5d4" containerID="530922f89c90faccae2745204ad33c4cb8c17d8da1a820532bfc63d8ad5e7fc7" exitCode=0 Dec 02 07:28:38 crc kubenswrapper[4895]: I1202 07:28:38.646763 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nd26q" event={"ID":"15e19f4e-9f62-4b48-be1e-6aaab358c5d4","Type":"ContainerDied","Data":"530922f89c90faccae2745204ad33c4cb8c17d8da1a820532bfc63d8ad5e7fc7"} Dec 02 07:28:38 crc kubenswrapper[4895]: I1202 07:28:38.695127 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-cm47r" podStartSLOduration=1.695095611 podStartE2EDuration="1.695095611s" podCreationTimestamp="2025-12-02 07:28:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:28:38.693138909 +0000 UTC m=+329.863998542" watchObservedRunningTime="2025-12-02 07:28:38.695095611 +0000 UTC m=+329.865955224" Dec 02 07:28:38 crc kubenswrapper[4895]: I1202 07:28:38.740899 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/redhat-operators-c6tgt" podStartSLOduration=2.920014976 podStartE2EDuration="5.740872844s" podCreationTimestamp="2025-12-02 07:28:33 +0000 UTC" firstStartedPulling="2025-12-02 07:28:35.570057631 +0000 UTC m=+326.740917254" lastFinishedPulling="2025-12-02 07:28:38.390915509 +0000 UTC m=+329.561775122" observedRunningTime="2025-12-02 07:28:38.7366037 +0000 UTC m=+329.907463323" watchObservedRunningTime="2025-12-02 07:28:38.740872844 +0000 UTC m=+329.911732457" Dec 02 07:28:38 crc kubenswrapper[4895]: I1202 07:28:38.755175 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-qq7dd" podStartSLOduration=1.755150654 podStartE2EDuration="1.755150654s" podCreationTimestamp="2025-12-02 07:28:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:28:38.7511898 +0000 UTC m=+329.922049413" watchObservedRunningTime="2025-12-02 07:28:38.755150654 +0000 UTC m=+329.926010267" Dec 02 07:28:39 crc kubenswrapper[4895]: I1202 07:28:39.654843 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nd26q" event={"ID":"15e19f4e-9f62-4b48-be1e-6aaab358c5d4","Type":"ContainerStarted","Data":"5dafaa11de45e41d6ed7382cf66d08eef5bd6b19209beb2a56dc4e1c1844ec50"} Dec 02 07:28:39 crc kubenswrapper[4895]: I1202 07:28:39.657428 4895 generic.go:334] "Generic (PLEG): container finished" podID="e3f4e1c5-a3e3-4391-be72-d2f2b908da65" containerID="1b060752968384cf6255b356cbdd446b2ab2551d42b0ba68c2077ef36d86e936" exitCode=0 Dec 02 07:28:39 crc kubenswrapper[4895]: I1202 07:28:39.657520 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jcdj8" event={"ID":"e3f4e1c5-a3e3-4391-be72-d2f2b908da65","Type":"ContainerDied","Data":"1b060752968384cf6255b356cbdd446b2ab2551d42b0ba68c2077ef36d86e936"} Dec 02 07:28:39 crc 
kubenswrapper[4895]: I1202 07:28:39.661248 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-qq7dd" Dec 02 07:28:39 crc kubenswrapper[4895]: I1202 07:28:39.692661 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-nd26q" podStartSLOduration=3.248410215 podStartE2EDuration="5.692636385s" podCreationTimestamp="2025-12-02 07:28:34 +0000 UTC" firstStartedPulling="2025-12-02 07:28:36.596683375 +0000 UTC m=+327.767542988" lastFinishedPulling="2025-12-02 07:28:39.040909545 +0000 UTC m=+330.211769158" observedRunningTime="2025-12-02 07:28:39.688704472 +0000 UTC m=+330.859564105" watchObservedRunningTime="2025-12-02 07:28:39.692636385 +0000 UTC m=+330.863496008" Dec 02 07:28:40 crc kubenswrapper[4895]: I1202 07:28:40.676503 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jcdj8" event={"ID":"e3f4e1c5-a3e3-4391-be72-d2f2b908da65","Type":"ContainerStarted","Data":"11aa2ac12747d944fd956e9be2daf5fcddbb8ec86d165a31d87106882d869cf8"} Dec 02 07:28:40 crc kubenswrapper[4895]: I1202 07:28:40.693990 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-jcdj8" podStartSLOduration=2.220481206 podStartE2EDuration="4.69396703s" podCreationTimestamp="2025-12-02 07:28:36 +0000 UTC" firstStartedPulling="2025-12-02 07:28:37.619372091 +0000 UTC m=+328.790231704" lastFinishedPulling="2025-12-02 07:28:40.092857915 +0000 UTC m=+331.263717528" observedRunningTime="2025-12-02 07:28:40.69179175 +0000 UTC m=+331.862651363" watchObservedRunningTime="2025-12-02 07:28:40.69396703 +0000 UTC m=+331.864826643" Dec 02 07:28:43 crc kubenswrapper[4895]: I1202 07:28:43.694953 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-c6tgt" Dec 02 07:28:43 crc kubenswrapper[4895]: I1202 
07:28:43.695302 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-c6tgt" Dec 02 07:28:43 crc kubenswrapper[4895]: I1202 07:28:43.735649 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-c6tgt" Dec 02 07:28:43 crc kubenswrapper[4895]: I1202 07:28:43.741598 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-xzs6v" Dec 02 07:28:43 crc kubenswrapper[4895]: I1202 07:28:43.741659 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-xzs6v" Dec 02 07:28:43 crc kubenswrapper[4895]: I1202 07:28:43.783884 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-xzs6v" Dec 02 07:28:44 crc kubenswrapper[4895]: I1202 07:28:44.747332 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-c6tgt" Dec 02 07:28:44 crc kubenswrapper[4895]: I1202 07:28:44.754966 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-xzs6v" Dec 02 07:28:45 crc kubenswrapper[4895]: I1202 07:28:45.291914 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-nd26q" Dec 02 07:28:45 crc kubenswrapper[4895]: I1202 07:28:45.291996 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-nd26q" Dec 02 07:28:45 crc kubenswrapper[4895]: I1202 07:28:45.329581 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-nd26q" Dec 02 07:28:45 crc kubenswrapper[4895]: I1202 07:28:45.744160 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/community-operators-nd26q" Dec 02 07:28:46 crc kubenswrapper[4895]: I1202 07:28:46.671519 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-jcdj8" Dec 02 07:28:46 crc kubenswrapper[4895]: I1202 07:28:46.671604 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-jcdj8" Dec 02 07:28:46 crc kubenswrapper[4895]: I1202 07:28:46.715286 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-jcdj8" Dec 02 07:28:46 crc kubenswrapper[4895]: I1202 07:28:46.768153 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-jcdj8" Dec 02 07:28:57 crc kubenswrapper[4895]: I1202 07:28:57.931588 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-cm47r" Dec 02 07:28:58 crc kubenswrapper[4895]: I1202 07:28:58.000804 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-tjqt9"] Dec 02 07:29:05 crc kubenswrapper[4895]: I1202 07:29:05.474521 4895 patch_prober.go:28] interesting pod/machine-config-daemon-wfcg7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 07:29:05 crc kubenswrapper[4895]: I1202 07:29:05.475365 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 07:29:23 crc kubenswrapper[4895]: I1202 
07:29:23.036262 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-tjqt9" podUID="4f1d0fc5-528f-4529-938f-7041be573fa7" containerName="registry" containerID="cri-o://7e04707a516deb55f156bc370efa1fb13eb20cfba589947b56e8b4d9038c22e1" gracePeriod=30 Dec 02 07:29:23 crc kubenswrapper[4895]: I1202 07:29:23.505157 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-tjqt9" Dec 02 07:29:23 crc kubenswrapper[4895]: I1202 07:29:23.674692 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4f1d0fc5-528f-4529-938f-7041be573fa7-bound-sa-token\") pod \"4f1d0fc5-528f-4529-938f-7041be573fa7\" (UID: \"4f1d0fc5-528f-4529-938f-7041be573fa7\") " Dec 02 07:29:23 crc kubenswrapper[4895]: I1202 07:29:23.674842 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4f1d0fc5-528f-4529-938f-7041be573fa7-registry-tls\") pod \"4f1d0fc5-528f-4529-938f-7041be573fa7\" (UID: \"4f1d0fc5-528f-4529-938f-7041be573fa7\") " Dec 02 07:29:23 crc kubenswrapper[4895]: I1202 07:29:23.674890 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4f1d0fc5-528f-4529-938f-7041be573fa7-registry-certificates\") pod \"4f1d0fc5-528f-4529-938f-7041be573fa7\" (UID: \"4f1d0fc5-528f-4529-938f-7041be573fa7\") " Dec 02 07:29:23 crc kubenswrapper[4895]: I1202 07:29:23.675214 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"4f1d0fc5-528f-4529-938f-7041be573fa7\" (UID: \"4f1d0fc5-528f-4529-938f-7041be573fa7\") " Dec 02 07:29:23 crc 
kubenswrapper[4895]: I1202 07:29:23.675255 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tklzt\" (UniqueName: \"kubernetes.io/projected/4f1d0fc5-528f-4529-938f-7041be573fa7-kube-api-access-tklzt\") pod \"4f1d0fc5-528f-4529-938f-7041be573fa7\" (UID: \"4f1d0fc5-528f-4529-938f-7041be573fa7\") " Dec 02 07:29:23 crc kubenswrapper[4895]: I1202 07:29:23.675296 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4f1d0fc5-528f-4529-938f-7041be573fa7-trusted-ca\") pod \"4f1d0fc5-528f-4529-938f-7041be573fa7\" (UID: \"4f1d0fc5-528f-4529-938f-7041be573fa7\") " Dec 02 07:29:23 crc kubenswrapper[4895]: I1202 07:29:23.675555 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4f1d0fc5-528f-4529-938f-7041be573fa7-installation-pull-secrets\") pod \"4f1d0fc5-528f-4529-938f-7041be573fa7\" (UID: \"4f1d0fc5-528f-4529-938f-7041be573fa7\") " Dec 02 07:29:23 crc kubenswrapper[4895]: I1202 07:29:23.675630 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4f1d0fc5-528f-4529-938f-7041be573fa7-ca-trust-extracted\") pod \"4f1d0fc5-528f-4529-938f-7041be573fa7\" (UID: \"4f1d0fc5-528f-4529-938f-7041be573fa7\") " Dec 02 07:29:23 crc kubenswrapper[4895]: I1202 07:29:23.676959 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f1d0fc5-528f-4529-938f-7041be573fa7-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "4f1d0fc5-528f-4529-938f-7041be573fa7" (UID: "4f1d0fc5-528f-4529-938f-7041be573fa7"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:29:23 crc kubenswrapper[4895]: I1202 07:29:23.677775 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f1d0fc5-528f-4529-938f-7041be573fa7-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "4f1d0fc5-528f-4529-938f-7041be573fa7" (UID: "4f1d0fc5-528f-4529-938f-7041be573fa7"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:29:23 crc kubenswrapper[4895]: I1202 07:29:23.684026 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f1d0fc5-528f-4529-938f-7041be573fa7-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "4f1d0fc5-528f-4529-938f-7041be573fa7" (UID: "4f1d0fc5-528f-4529-938f-7041be573fa7"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:29:23 crc kubenswrapper[4895]: I1202 07:29:23.684357 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f1d0fc5-528f-4529-938f-7041be573fa7-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "4f1d0fc5-528f-4529-938f-7041be573fa7" (UID: "4f1d0fc5-528f-4529-938f-7041be573fa7"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:29:23 crc kubenswrapper[4895]: I1202 07:29:23.685477 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f1d0fc5-528f-4529-938f-7041be573fa7-kube-api-access-tklzt" (OuterVolumeSpecName: "kube-api-access-tklzt") pod "4f1d0fc5-528f-4529-938f-7041be573fa7" (UID: "4f1d0fc5-528f-4529-938f-7041be573fa7"). InnerVolumeSpecName "kube-api-access-tklzt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:29:23 crc kubenswrapper[4895]: I1202 07:29:23.685787 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f1d0fc5-528f-4529-938f-7041be573fa7-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "4f1d0fc5-528f-4529-938f-7041be573fa7" (UID: "4f1d0fc5-528f-4529-938f-7041be573fa7"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:29:23 crc kubenswrapper[4895]: I1202 07:29:23.702491 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "4f1d0fc5-528f-4529-938f-7041be573fa7" (UID: "4f1d0fc5-528f-4529-938f-7041be573fa7"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 02 07:29:23 crc kubenswrapper[4895]: I1202 07:29:23.704008 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f1d0fc5-528f-4529-938f-7041be573fa7-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "4f1d0fc5-528f-4529-938f-7041be573fa7" (UID: "4f1d0fc5-528f-4529-938f-7041be573fa7"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:29:23 crc kubenswrapper[4895]: I1202 07:29:23.776842 4895 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4f1d0fc5-528f-4529-938f-7041be573fa7-registry-tls\") on node \"crc\" DevicePath \"\"" Dec 02 07:29:23 crc kubenswrapper[4895]: I1202 07:29:23.776876 4895 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4f1d0fc5-528f-4529-938f-7041be573fa7-registry-certificates\") on node \"crc\" DevicePath \"\"" Dec 02 07:29:23 crc kubenswrapper[4895]: I1202 07:29:23.776889 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tklzt\" (UniqueName: \"kubernetes.io/projected/4f1d0fc5-528f-4529-938f-7041be573fa7-kube-api-access-tklzt\") on node \"crc\" DevicePath \"\"" Dec 02 07:29:23 crc kubenswrapper[4895]: I1202 07:29:23.776898 4895 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4f1d0fc5-528f-4529-938f-7041be573fa7-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 02 07:29:23 crc kubenswrapper[4895]: I1202 07:29:23.776906 4895 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4f1d0fc5-528f-4529-938f-7041be573fa7-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Dec 02 07:29:23 crc kubenswrapper[4895]: I1202 07:29:23.776914 4895 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4f1d0fc5-528f-4529-938f-7041be573fa7-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Dec 02 07:29:23 crc kubenswrapper[4895]: I1202 07:29:23.776922 4895 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4f1d0fc5-528f-4529-938f-7041be573fa7-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 02 07:29:23 crc 
kubenswrapper[4895]: I1202 07:29:23.966291 4895 generic.go:334] "Generic (PLEG): container finished" podID="4f1d0fc5-528f-4529-938f-7041be573fa7" containerID="7e04707a516deb55f156bc370efa1fb13eb20cfba589947b56e8b4d9038c22e1" exitCode=0 Dec 02 07:29:23 crc kubenswrapper[4895]: I1202 07:29:23.966348 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-tjqt9" Dec 02 07:29:23 crc kubenswrapper[4895]: I1202 07:29:23.966369 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-tjqt9" event={"ID":"4f1d0fc5-528f-4529-938f-7041be573fa7","Type":"ContainerDied","Data":"7e04707a516deb55f156bc370efa1fb13eb20cfba589947b56e8b4d9038c22e1"} Dec 02 07:29:23 crc kubenswrapper[4895]: I1202 07:29:23.966928 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-tjqt9" event={"ID":"4f1d0fc5-528f-4529-938f-7041be573fa7","Type":"ContainerDied","Data":"14e3a3c4df1f81bc8db888569aed2e7b6ff37792719ec6f25fdbd9597470336f"} Dec 02 07:29:23 crc kubenswrapper[4895]: I1202 07:29:23.967019 4895 scope.go:117] "RemoveContainer" containerID="7e04707a516deb55f156bc370efa1fb13eb20cfba589947b56e8b4d9038c22e1" Dec 02 07:29:23 crc kubenswrapper[4895]: I1202 07:29:23.987831 4895 scope.go:117] "RemoveContainer" containerID="7e04707a516deb55f156bc370efa1fb13eb20cfba589947b56e8b4d9038c22e1" Dec 02 07:29:23 crc kubenswrapper[4895]: E1202 07:29:23.988596 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e04707a516deb55f156bc370efa1fb13eb20cfba589947b56e8b4d9038c22e1\": container with ID starting with 7e04707a516deb55f156bc370efa1fb13eb20cfba589947b56e8b4d9038c22e1 not found: ID does not exist" containerID="7e04707a516deb55f156bc370efa1fb13eb20cfba589947b56e8b4d9038c22e1" Dec 02 07:29:23 crc kubenswrapper[4895]: I1202 07:29:23.988653 4895 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e04707a516deb55f156bc370efa1fb13eb20cfba589947b56e8b4d9038c22e1"} err="failed to get container status \"7e04707a516deb55f156bc370efa1fb13eb20cfba589947b56e8b4d9038c22e1\": rpc error: code = NotFound desc = could not find container \"7e04707a516deb55f156bc370efa1fb13eb20cfba589947b56e8b4d9038c22e1\": container with ID starting with 7e04707a516deb55f156bc370efa1fb13eb20cfba589947b56e8b4d9038c22e1 not found: ID does not exist" Dec 02 07:29:24 crc kubenswrapper[4895]: I1202 07:29:24.012572 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-tjqt9"] Dec 02 07:29:24 crc kubenswrapper[4895]: I1202 07:29:24.017599 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-tjqt9"] Dec 02 07:29:25 crc kubenswrapper[4895]: I1202 07:29:25.149175 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f1d0fc5-528f-4529-938f-7041be573fa7" path="/var/lib/kubelet/pods/4f1d0fc5-528f-4529-938f-7041be573fa7/volumes" Dec 02 07:29:35 crc kubenswrapper[4895]: I1202 07:29:35.473829 4895 patch_prober.go:28] interesting pod/machine-config-daemon-wfcg7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 07:29:35 crc kubenswrapper[4895]: I1202 07:29:35.474669 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 07:30:00 crc kubenswrapper[4895]: I1202 07:30:00.195122 4895 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-operator-lifecycle-manager/collect-profiles-29411010-cjlm7"] Dec 02 07:30:00 crc kubenswrapper[4895]: E1202 07:30:00.196385 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f1d0fc5-528f-4529-938f-7041be573fa7" containerName="registry" Dec 02 07:30:00 crc kubenswrapper[4895]: I1202 07:30:00.196429 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f1d0fc5-528f-4529-938f-7041be573fa7" containerName="registry" Dec 02 07:30:00 crc kubenswrapper[4895]: I1202 07:30:00.196580 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f1d0fc5-528f-4529-938f-7041be573fa7" containerName="registry" Dec 02 07:30:00 crc kubenswrapper[4895]: I1202 07:30:00.197170 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411010-cjlm7" Dec 02 07:30:00 crc kubenswrapper[4895]: I1202 07:30:00.199659 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 02 07:30:00 crc kubenswrapper[4895]: I1202 07:30:00.200288 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 02 07:30:00 crc kubenswrapper[4895]: I1202 07:30:00.220010 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411010-cjlm7"] Dec 02 07:30:00 crc kubenswrapper[4895]: I1202 07:30:00.228681 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1a15c2bd-78d9-4178-95b8-170663887f4b-secret-volume\") pod \"collect-profiles-29411010-cjlm7\" (UID: \"1a15c2bd-78d9-4178-95b8-170663887f4b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411010-cjlm7" Dec 02 07:30:00 crc kubenswrapper[4895]: I1202 07:30:00.228734 4895 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtxwd\" (UniqueName: \"kubernetes.io/projected/1a15c2bd-78d9-4178-95b8-170663887f4b-kube-api-access-mtxwd\") pod \"collect-profiles-29411010-cjlm7\" (UID: \"1a15c2bd-78d9-4178-95b8-170663887f4b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411010-cjlm7" Dec 02 07:30:00 crc kubenswrapper[4895]: I1202 07:30:00.228792 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1a15c2bd-78d9-4178-95b8-170663887f4b-config-volume\") pod \"collect-profiles-29411010-cjlm7\" (UID: \"1a15c2bd-78d9-4178-95b8-170663887f4b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411010-cjlm7" Dec 02 07:30:00 crc kubenswrapper[4895]: I1202 07:30:00.331065 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1a15c2bd-78d9-4178-95b8-170663887f4b-config-volume\") pod \"collect-profiles-29411010-cjlm7\" (UID: \"1a15c2bd-78d9-4178-95b8-170663887f4b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411010-cjlm7" Dec 02 07:30:00 crc kubenswrapper[4895]: I1202 07:30:00.331259 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1a15c2bd-78d9-4178-95b8-170663887f4b-secret-volume\") pod \"collect-profiles-29411010-cjlm7\" (UID: \"1a15c2bd-78d9-4178-95b8-170663887f4b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411010-cjlm7" Dec 02 07:30:00 crc kubenswrapper[4895]: I1202 07:30:00.331308 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtxwd\" (UniqueName: \"kubernetes.io/projected/1a15c2bd-78d9-4178-95b8-170663887f4b-kube-api-access-mtxwd\") pod \"collect-profiles-29411010-cjlm7\" (UID: \"1a15c2bd-78d9-4178-95b8-170663887f4b\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29411010-cjlm7" Dec 02 07:30:00 crc kubenswrapper[4895]: I1202 07:30:00.332235 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1a15c2bd-78d9-4178-95b8-170663887f4b-config-volume\") pod \"collect-profiles-29411010-cjlm7\" (UID: \"1a15c2bd-78d9-4178-95b8-170663887f4b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411010-cjlm7" Dec 02 07:30:00 crc kubenswrapper[4895]: I1202 07:30:00.339661 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1a15c2bd-78d9-4178-95b8-170663887f4b-secret-volume\") pod \"collect-profiles-29411010-cjlm7\" (UID: \"1a15c2bd-78d9-4178-95b8-170663887f4b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411010-cjlm7" Dec 02 07:30:00 crc kubenswrapper[4895]: I1202 07:30:00.352075 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtxwd\" (UniqueName: \"kubernetes.io/projected/1a15c2bd-78d9-4178-95b8-170663887f4b-kube-api-access-mtxwd\") pod \"collect-profiles-29411010-cjlm7\" (UID: \"1a15c2bd-78d9-4178-95b8-170663887f4b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411010-cjlm7" Dec 02 07:30:00 crc kubenswrapper[4895]: I1202 07:30:00.531102 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411010-cjlm7" Dec 02 07:30:00 crc kubenswrapper[4895]: I1202 07:30:00.826529 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411010-cjlm7"] Dec 02 07:30:01 crc kubenswrapper[4895]: I1202 07:30:01.256288 4895 generic.go:334] "Generic (PLEG): container finished" podID="1a15c2bd-78d9-4178-95b8-170663887f4b" containerID="1a8b2be9a073f50336930908d5803e5e4de3461802484b0226855dd3218dec08" exitCode=0 Dec 02 07:30:01 crc kubenswrapper[4895]: I1202 07:30:01.256355 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411010-cjlm7" event={"ID":"1a15c2bd-78d9-4178-95b8-170663887f4b","Type":"ContainerDied","Data":"1a8b2be9a073f50336930908d5803e5e4de3461802484b0226855dd3218dec08"} Dec 02 07:30:01 crc kubenswrapper[4895]: I1202 07:30:01.256850 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411010-cjlm7" event={"ID":"1a15c2bd-78d9-4178-95b8-170663887f4b","Type":"ContainerStarted","Data":"46fd02104422d5ee797ef9c93bf11ebb4737fcb3699f2c05f96cf0e3d3aa322a"} Dec 02 07:30:02 crc kubenswrapper[4895]: I1202 07:30:02.501851 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411010-cjlm7" Dec 02 07:30:02 crc kubenswrapper[4895]: I1202 07:30:02.563864 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1a15c2bd-78d9-4178-95b8-170663887f4b-secret-volume\") pod \"1a15c2bd-78d9-4178-95b8-170663887f4b\" (UID: \"1a15c2bd-78d9-4178-95b8-170663887f4b\") " Dec 02 07:30:02 crc kubenswrapper[4895]: I1202 07:30:02.564010 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mtxwd\" (UniqueName: \"kubernetes.io/projected/1a15c2bd-78d9-4178-95b8-170663887f4b-kube-api-access-mtxwd\") pod \"1a15c2bd-78d9-4178-95b8-170663887f4b\" (UID: \"1a15c2bd-78d9-4178-95b8-170663887f4b\") " Dec 02 07:30:02 crc kubenswrapper[4895]: I1202 07:30:02.564176 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1a15c2bd-78d9-4178-95b8-170663887f4b-config-volume\") pod \"1a15c2bd-78d9-4178-95b8-170663887f4b\" (UID: \"1a15c2bd-78d9-4178-95b8-170663887f4b\") " Dec 02 07:30:02 crc kubenswrapper[4895]: I1202 07:30:02.566064 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a15c2bd-78d9-4178-95b8-170663887f4b-config-volume" (OuterVolumeSpecName: "config-volume") pod "1a15c2bd-78d9-4178-95b8-170663887f4b" (UID: "1a15c2bd-78d9-4178-95b8-170663887f4b"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:30:02 crc kubenswrapper[4895]: I1202 07:30:02.572515 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a15c2bd-78d9-4178-95b8-170663887f4b-kube-api-access-mtxwd" (OuterVolumeSpecName: "kube-api-access-mtxwd") pod "1a15c2bd-78d9-4178-95b8-170663887f4b" (UID: "1a15c2bd-78d9-4178-95b8-170663887f4b"). 
InnerVolumeSpecName "kube-api-access-mtxwd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:30:02 crc kubenswrapper[4895]: I1202 07:30:02.575020 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a15c2bd-78d9-4178-95b8-170663887f4b-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "1a15c2bd-78d9-4178-95b8-170663887f4b" (UID: "1a15c2bd-78d9-4178-95b8-170663887f4b"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:30:02 crc kubenswrapper[4895]: I1202 07:30:02.667430 4895 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1a15c2bd-78d9-4178-95b8-170663887f4b-config-volume\") on node \"crc\" DevicePath \"\"" Dec 02 07:30:02 crc kubenswrapper[4895]: I1202 07:30:02.667488 4895 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1a15c2bd-78d9-4178-95b8-170663887f4b-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 02 07:30:02 crc kubenswrapper[4895]: I1202 07:30:02.667510 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mtxwd\" (UniqueName: \"kubernetes.io/projected/1a15c2bd-78d9-4178-95b8-170663887f4b-kube-api-access-mtxwd\") on node \"crc\" DevicePath \"\"" Dec 02 07:30:03 crc kubenswrapper[4895]: I1202 07:30:03.270576 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411010-cjlm7" event={"ID":"1a15c2bd-78d9-4178-95b8-170663887f4b","Type":"ContainerDied","Data":"46fd02104422d5ee797ef9c93bf11ebb4737fcb3699f2c05f96cf0e3d3aa322a"} Dec 02 07:30:03 crc kubenswrapper[4895]: I1202 07:30:03.270895 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="46fd02104422d5ee797ef9c93bf11ebb4737fcb3699f2c05f96cf0e3d3aa322a" Dec 02 07:30:03 crc kubenswrapper[4895]: I1202 07:30:03.270673 4895 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411010-cjlm7" Dec 02 07:30:05 crc kubenswrapper[4895]: I1202 07:30:05.473550 4895 patch_prober.go:28] interesting pod/machine-config-daemon-wfcg7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 07:30:05 crc kubenswrapper[4895]: I1202 07:30:05.473630 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 07:30:05 crc kubenswrapper[4895]: I1202 07:30:05.473688 4895 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" Dec 02 07:30:05 crc kubenswrapper[4895]: I1202 07:30:05.474550 4895 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"33bcab746983782363230aa67a92ba155150d831e649f607a859abb514a5bffb"} pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 07:30:05 crc kubenswrapper[4895]: I1202 07:30:05.474647 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerName="machine-config-daemon" containerID="cri-o://33bcab746983782363230aa67a92ba155150d831e649f607a859abb514a5bffb" gracePeriod=600 Dec 02 07:30:06 crc kubenswrapper[4895]: I1202 07:30:06.296045 4895 generic.go:334] 
"Generic (PLEG): container finished" podID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerID="33bcab746983782363230aa67a92ba155150d831e649f607a859abb514a5bffb" exitCode=0 Dec 02 07:30:06 crc kubenswrapper[4895]: I1202 07:30:06.296127 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" event={"ID":"0468d2d1-a975-45a6-af9f-47adc432fab0","Type":"ContainerDied","Data":"33bcab746983782363230aa67a92ba155150d831e649f607a859abb514a5bffb"} Dec 02 07:30:06 crc kubenswrapper[4895]: I1202 07:30:06.296494 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" event={"ID":"0468d2d1-a975-45a6-af9f-47adc432fab0","Type":"ContainerStarted","Data":"f630f4a762a626bd9db4cd84a33eafc7b3562342891e999006625e4aa5d1c1a8"} Dec 02 07:30:06 crc kubenswrapper[4895]: I1202 07:30:06.296529 4895 scope.go:117] "RemoveContainer" containerID="1474e8f5989607f3b95fefe811e3d787320f04c223edb6ffe47d2b37863ea874" Dec 02 07:32:05 crc kubenswrapper[4895]: I1202 07:32:05.474117 4895 patch_prober.go:28] interesting pod/machine-config-daemon-wfcg7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 07:32:05 crc kubenswrapper[4895]: I1202 07:32:05.475051 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 07:32:35 crc kubenswrapper[4895]: I1202 07:32:35.473332 4895 patch_prober.go:28] interesting pod/machine-config-daemon-wfcg7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe 
status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 07:32:35 crc kubenswrapper[4895]: I1202 07:32:35.474550 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 07:33:05 crc kubenswrapper[4895]: I1202 07:33:05.473687 4895 patch_prober.go:28] interesting pod/machine-config-daemon-wfcg7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 07:33:05 crc kubenswrapper[4895]: I1202 07:33:05.474364 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 07:33:05 crc kubenswrapper[4895]: I1202 07:33:05.474422 4895 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" Dec 02 07:33:05 crc kubenswrapper[4895]: I1202 07:33:05.475955 4895 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f630f4a762a626bd9db4cd84a33eafc7b3562342891e999006625e4aa5d1c1a8"} pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 07:33:05 crc kubenswrapper[4895]: I1202 07:33:05.476217 4895 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerName="machine-config-daemon" containerID="cri-o://f630f4a762a626bd9db4cd84a33eafc7b3562342891e999006625e4aa5d1c1a8" gracePeriod=600 Dec 02 07:33:06 crc kubenswrapper[4895]: I1202 07:33:06.474042 4895 generic.go:334] "Generic (PLEG): container finished" podID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerID="f630f4a762a626bd9db4cd84a33eafc7b3562342891e999006625e4aa5d1c1a8" exitCode=0 Dec 02 07:33:06 crc kubenswrapper[4895]: I1202 07:33:06.474112 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" event={"ID":"0468d2d1-a975-45a6-af9f-47adc432fab0","Type":"ContainerDied","Data":"f630f4a762a626bd9db4cd84a33eafc7b3562342891e999006625e4aa5d1c1a8"} Dec 02 07:33:06 crc kubenswrapper[4895]: I1202 07:33:06.474851 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" event={"ID":"0468d2d1-a975-45a6-af9f-47adc432fab0","Type":"ContainerStarted","Data":"167d292d0f8d5573649be7da9822e91144b98a316f5ca7c4838bd376ddaed336"} Dec 02 07:33:06 crc kubenswrapper[4895]: I1202 07:33:06.474884 4895 scope.go:117] "RemoveContainer" containerID="33bcab746983782363230aa67a92ba155150d831e649f607a859abb514a5bffb" Dec 02 07:35:05 crc kubenswrapper[4895]: I1202 07:35:05.473446 4895 patch_prober.go:28] interesting pod/machine-config-daemon-wfcg7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 07:35:05 crc kubenswrapper[4895]: I1202 07:35:05.474101 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" 
podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 07:35:35 crc kubenswrapper[4895]: I1202 07:35:35.473558 4895 patch_prober.go:28] interesting pod/machine-config-daemon-wfcg7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 07:35:35 crc kubenswrapper[4895]: I1202 07:35:35.474305 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 07:35:45 crc kubenswrapper[4895]: I1202 07:35:45.665950 4895 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 02 07:36:05 crc kubenswrapper[4895]: I1202 07:36:05.473551 4895 patch_prober.go:28] interesting pod/machine-config-daemon-wfcg7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 07:36:05 crc kubenswrapper[4895]: I1202 07:36:05.475108 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 07:36:05 crc kubenswrapper[4895]: I1202 07:36:05.475175 4895 kubelet.go:2542] "SyncLoop (probe)" 
probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" Dec 02 07:36:05 crc kubenswrapper[4895]: I1202 07:36:05.475704 4895 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"167d292d0f8d5573649be7da9822e91144b98a316f5ca7c4838bd376ddaed336"} pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 07:36:05 crc kubenswrapper[4895]: I1202 07:36:05.475792 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerName="machine-config-daemon" containerID="cri-o://167d292d0f8d5573649be7da9822e91144b98a316f5ca7c4838bd376ddaed336" gracePeriod=600 Dec 02 07:36:05 crc kubenswrapper[4895]: I1202 07:36:05.931586 4895 generic.go:334] "Generic (PLEG): container finished" podID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerID="167d292d0f8d5573649be7da9822e91144b98a316f5ca7c4838bd376ddaed336" exitCode=0 Dec 02 07:36:05 crc kubenswrapper[4895]: I1202 07:36:05.931662 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" event={"ID":"0468d2d1-a975-45a6-af9f-47adc432fab0","Type":"ContainerDied","Data":"167d292d0f8d5573649be7da9822e91144b98a316f5ca7c4838bd376ddaed336"} Dec 02 07:36:05 crc kubenswrapper[4895]: I1202 07:36:05.931917 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" event={"ID":"0468d2d1-a975-45a6-af9f-47adc432fab0","Type":"ContainerStarted","Data":"12a9227e27ad8d7bc29431661ef9209e2bb61dd12d583d4b2e7609ed8ada972b"} Dec 02 07:36:05 crc kubenswrapper[4895]: I1202 07:36:05.931968 4895 scope.go:117] "RemoveContainer" 
containerID="f630f4a762a626bd9db4cd84a33eafc7b3562342891e999006625e4aa5d1c1a8" Dec 02 07:36:13 crc kubenswrapper[4895]: I1202 07:36:13.850363 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-w54m4"] Dec 02 07:36:13 crc kubenswrapper[4895]: I1202 07:36:13.852387 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" podUID="afc3334a-0153-4dcc-9a56-92f6cae51c08" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://5463b077f0bc8b5d38e27854ece6a71f74f7769f4552c32eefe16854993f290f" gracePeriod=30 Dec 02 07:36:13 crc kubenswrapper[4895]: I1202 07:36:13.852421 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" podUID="afc3334a-0153-4dcc-9a56-92f6cae51c08" containerName="sbdb" containerID="cri-o://d6447684fe6189e38c9caad6dfe1384b7af01d50e2ef5f889c7b2874ba092306" gracePeriod=30 Dec 02 07:36:13 crc kubenswrapper[4895]: I1202 07:36:13.852465 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" podUID="afc3334a-0153-4dcc-9a56-92f6cae51c08" containerName="northd" containerID="cri-o://f54ae324450e73bcbf2c408794171a92bbb66c24d05f29584ac2c247090c1273" gracePeriod=30 Dec 02 07:36:13 crc kubenswrapper[4895]: I1202 07:36:13.852374 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" podUID="afc3334a-0153-4dcc-9a56-92f6cae51c08" containerName="nbdb" containerID="cri-o://80dca705ddb94ba799eb72bfc23b0c3878cceca48d6d10f1c7f8eb0e4beed40e" gracePeriod=30 Dec 02 07:36:13 crc kubenswrapper[4895]: I1202 07:36:13.852501 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" podUID="afc3334a-0153-4dcc-9a56-92f6cae51c08" containerName="kube-rbac-proxy-node" 
containerID="cri-o://91b0c6e50085a7be0924e5a54ef006110a7f13556fe47f7f55124fd6b2f0bc36" gracePeriod=30 Dec 02 07:36:13 crc kubenswrapper[4895]: I1202 07:36:13.852588 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" podUID="afc3334a-0153-4dcc-9a56-92f6cae51c08" containerName="ovn-acl-logging" containerID="cri-o://5f330d728eecd672ecb554744a51ebfb2c104441d3a85fe0c8a6fc3446c37e4a" gracePeriod=30 Dec 02 07:36:13 crc kubenswrapper[4895]: I1202 07:36:13.852296 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" podUID="afc3334a-0153-4dcc-9a56-92f6cae51c08" containerName="ovn-controller" containerID="cri-o://ca732cac495c410d89da4ce1476acd20905716cbb2baeedac9c52b654ff58573" gracePeriod=30 Dec 02 07:36:13 crc kubenswrapper[4895]: I1202 07:36:13.887390 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" podUID="afc3334a-0153-4dcc-9a56-92f6cae51c08" containerName="ovnkube-controller" containerID="cri-o://cd338136e3f94adb272a6ccce660c3f7a51db40e100162b0af875d6814db30d1" gracePeriod=30 Dec 02 07:36:13 crc kubenswrapper[4895]: I1202 07:36:13.993789 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w54m4_afc3334a-0153-4dcc-9a56-92f6cae51c08/ovnkube-controller/3.log" Dec 02 07:36:13 crc kubenswrapper[4895]: I1202 07:36:13.997325 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w54m4_afc3334a-0153-4dcc-9a56-92f6cae51c08/ovn-acl-logging/0.log" Dec 02 07:36:13 crc kubenswrapper[4895]: I1202 07:36:13.997928 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w54m4_afc3334a-0153-4dcc-9a56-92f6cae51c08/ovn-controller/0.log" Dec 02 07:36:13 crc kubenswrapper[4895]: I1202 07:36:13.998428 4895 generic.go:334] "Generic 
(PLEG): container finished" podID="afc3334a-0153-4dcc-9a56-92f6cae51c08" containerID="91b0c6e50085a7be0924e5a54ef006110a7f13556fe47f7f55124fd6b2f0bc36" exitCode=0 Dec 02 07:36:13 crc kubenswrapper[4895]: I1202 07:36:13.998513 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" event={"ID":"afc3334a-0153-4dcc-9a56-92f6cae51c08","Type":"ContainerDied","Data":"91b0c6e50085a7be0924e5a54ef006110a7f13556fe47f7f55124fd6b2f0bc36"} Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.007777 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-hlxqt_30911fe5-208f-44e8-a380-2a0093f24863/kube-multus/2.log" Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.008362 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-hlxqt_30911fe5-208f-44e8-a380-2a0093f24863/kube-multus/1.log" Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.008415 4895 generic.go:334] "Generic (PLEG): container finished" podID="30911fe5-208f-44e8-a380-2a0093f24863" containerID="da2cfa8cea74106ff83eeb39671986f152230f16024c55af625bfdc4dca6a73d" exitCode=2 Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.008518 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-hlxqt" event={"ID":"30911fe5-208f-44e8-a380-2a0093f24863","Type":"ContainerDied","Data":"da2cfa8cea74106ff83eeb39671986f152230f16024c55af625bfdc4dca6a73d"} Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.008602 4895 scope.go:117] "RemoveContainer" containerID="a569570ff32547c25fcdced649773ea0ab6d3aeccdaa5c26aa5a86d2c745535f" Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.009970 4895 scope.go:117] "RemoveContainer" containerID="da2cfa8cea74106ff83eeb39671986f152230f16024c55af625bfdc4dca6a73d" Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.203136 4895 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w54m4_afc3334a-0153-4dcc-9a56-92f6cae51c08/ovnkube-controller/3.log" Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.205624 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w54m4_afc3334a-0153-4dcc-9a56-92f6cae51c08/ovn-acl-logging/0.log" Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.206136 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w54m4_afc3334a-0153-4dcc-9a56-92f6cae51c08/ovn-controller/0.log" Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.206570 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.272471 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-9t5xx"] Dec 02 07:36:14 crc kubenswrapper[4895]: E1202 07:36:14.272861 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afc3334a-0153-4dcc-9a56-92f6cae51c08" containerName="kubecfg-setup" Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.272935 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="afc3334a-0153-4dcc-9a56-92f6cae51c08" containerName="kubecfg-setup" Dec 02 07:36:14 crc kubenswrapper[4895]: E1202 07:36:14.272995 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afc3334a-0153-4dcc-9a56-92f6cae51c08" containerName="nbdb" Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.273045 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="afc3334a-0153-4dcc-9a56-92f6cae51c08" containerName="nbdb" Dec 02 07:36:14 crc kubenswrapper[4895]: E1202 07:36:14.273101 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afc3334a-0153-4dcc-9a56-92f6cae51c08" containerName="ovn-acl-logging" Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.273151 4895 
state_mem.go:107] "Deleted CPUSet assignment" podUID="afc3334a-0153-4dcc-9a56-92f6cae51c08" containerName="ovn-acl-logging" Dec 02 07:36:14 crc kubenswrapper[4895]: E1202 07:36:14.273211 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afc3334a-0153-4dcc-9a56-92f6cae51c08" containerName="kube-rbac-proxy-ovn-metrics" Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.273260 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="afc3334a-0153-4dcc-9a56-92f6cae51c08" containerName="kube-rbac-proxy-ovn-metrics" Dec 02 07:36:14 crc kubenswrapper[4895]: E1202 07:36:14.273312 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afc3334a-0153-4dcc-9a56-92f6cae51c08" containerName="ovn-controller" Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.273362 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="afc3334a-0153-4dcc-9a56-92f6cae51c08" containerName="ovn-controller" Dec 02 07:36:14 crc kubenswrapper[4895]: E1202 07:36:14.273419 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a15c2bd-78d9-4178-95b8-170663887f4b" containerName="collect-profiles" Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.273467 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a15c2bd-78d9-4178-95b8-170663887f4b" containerName="collect-profiles" Dec 02 07:36:14 crc kubenswrapper[4895]: E1202 07:36:14.273524 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afc3334a-0153-4dcc-9a56-92f6cae51c08" containerName="ovnkube-controller" Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.273576 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="afc3334a-0153-4dcc-9a56-92f6cae51c08" containerName="ovnkube-controller" Dec 02 07:36:14 crc kubenswrapper[4895]: E1202 07:36:14.273625 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afc3334a-0153-4dcc-9a56-92f6cae51c08" containerName="northd" Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 
07:36:14.273671 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="afc3334a-0153-4dcc-9a56-92f6cae51c08" containerName="northd" Dec 02 07:36:14 crc kubenswrapper[4895]: E1202 07:36:14.273731 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afc3334a-0153-4dcc-9a56-92f6cae51c08" containerName="ovnkube-controller" Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.273866 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="afc3334a-0153-4dcc-9a56-92f6cae51c08" containerName="ovnkube-controller" Dec 02 07:36:14 crc kubenswrapper[4895]: E1202 07:36:14.273923 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afc3334a-0153-4dcc-9a56-92f6cae51c08" containerName="ovnkube-controller" Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.273968 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="afc3334a-0153-4dcc-9a56-92f6cae51c08" containerName="ovnkube-controller" Dec 02 07:36:14 crc kubenswrapper[4895]: E1202 07:36:14.274018 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afc3334a-0153-4dcc-9a56-92f6cae51c08" containerName="ovnkube-controller" Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.274067 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="afc3334a-0153-4dcc-9a56-92f6cae51c08" containerName="ovnkube-controller" Dec 02 07:36:14 crc kubenswrapper[4895]: E1202 07:36:14.274190 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afc3334a-0153-4dcc-9a56-92f6cae51c08" containerName="ovnkube-controller" Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.274246 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="afc3334a-0153-4dcc-9a56-92f6cae51c08" containerName="ovnkube-controller" Dec 02 07:36:14 crc kubenswrapper[4895]: E1202 07:36:14.274295 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afc3334a-0153-4dcc-9a56-92f6cae51c08" containerName="kube-rbac-proxy-node" Dec 02 07:36:14 crc kubenswrapper[4895]: 
I1202 07:36:14.274343 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="afc3334a-0153-4dcc-9a56-92f6cae51c08" containerName="kube-rbac-proxy-node" Dec 02 07:36:14 crc kubenswrapper[4895]: E1202 07:36:14.274397 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afc3334a-0153-4dcc-9a56-92f6cae51c08" containerName="sbdb" Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.274445 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="afc3334a-0153-4dcc-9a56-92f6cae51c08" containerName="sbdb" Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.274600 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="afc3334a-0153-4dcc-9a56-92f6cae51c08" containerName="ovnkube-controller" Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.274658 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a15c2bd-78d9-4178-95b8-170663887f4b" containerName="collect-profiles" Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.274713 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="afc3334a-0153-4dcc-9a56-92f6cae51c08" containerName="ovnkube-controller" Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.274822 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="afc3334a-0153-4dcc-9a56-92f6cae51c08" containerName="kube-rbac-proxy-ovn-metrics" Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.274898 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="afc3334a-0153-4dcc-9a56-92f6cae51c08" containerName="northd" Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.274966 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="afc3334a-0153-4dcc-9a56-92f6cae51c08" containerName="ovnkube-controller" Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.275027 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="afc3334a-0153-4dcc-9a56-92f6cae51c08" containerName="nbdb" Dec 02 07:36:14 crc kubenswrapper[4895]: 
I1202 07:36:14.275094 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="afc3334a-0153-4dcc-9a56-92f6cae51c08" containerName="ovn-controller" Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.275160 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="afc3334a-0153-4dcc-9a56-92f6cae51c08" containerName="ovn-acl-logging" Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.275216 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="afc3334a-0153-4dcc-9a56-92f6cae51c08" containerName="sbdb" Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.275272 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="afc3334a-0153-4dcc-9a56-92f6cae51c08" containerName="kube-rbac-proxy-node" Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.275523 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="afc3334a-0153-4dcc-9a56-92f6cae51c08" containerName="ovnkube-controller" Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.275813 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="afc3334a-0153-4dcc-9a56-92f6cae51c08" containerName="ovnkube-controller" Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.277576 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-9t5xx" Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.340557 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/afc3334a-0153-4dcc-9a56-92f6cae51c08-ovnkube-config\") pod \"afc3334a-0153-4dcc-9a56-92f6cae51c08\" (UID: \"afc3334a-0153-4dcc-9a56-92f6cae51c08\") " Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.340624 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/afc3334a-0153-4dcc-9a56-92f6cae51c08-log-socket\") pod \"afc3334a-0153-4dcc-9a56-92f6cae51c08\" (UID: \"afc3334a-0153-4dcc-9a56-92f6cae51c08\") " Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.340652 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/afc3334a-0153-4dcc-9a56-92f6cae51c08-var-lib-openvswitch\") pod \"afc3334a-0153-4dcc-9a56-92f6cae51c08\" (UID: \"afc3334a-0153-4dcc-9a56-92f6cae51c08\") " Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.340675 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/afc3334a-0153-4dcc-9a56-92f6cae51c08-run-systemd\") pod \"afc3334a-0153-4dcc-9a56-92f6cae51c08\" (UID: \"afc3334a-0153-4dcc-9a56-92f6cae51c08\") " Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.340708 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/afc3334a-0153-4dcc-9a56-92f6cae51c08-host-cni-bin\") pod \"afc3334a-0153-4dcc-9a56-92f6cae51c08\" (UID: \"afc3334a-0153-4dcc-9a56-92f6cae51c08\") " Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.340761 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-5ldkg\" (UniqueName: \"kubernetes.io/projected/afc3334a-0153-4dcc-9a56-92f6cae51c08-kube-api-access-5ldkg\") pod \"afc3334a-0153-4dcc-9a56-92f6cae51c08\" (UID: \"afc3334a-0153-4dcc-9a56-92f6cae51c08\") " Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.340755 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/afc3334a-0153-4dcc-9a56-92f6cae51c08-log-socket" (OuterVolumeSpecName: "log-socket") pod "afc3334a-0153-4dcc-9a56-92f6cae51c08" (UID: "afc3334a-0153-4dcc-9a56-92f6cae51c08"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.340776 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/afc3334a-0153-4dcc-9a56-92f6cae51c08-host-kubelet\") pod \"afc3334a-0153-4dcc-9a56-92f6cae51c08\" (UID: \"afc3334a-0153-4dcc-9a56-92f6cae51c08\") " Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.340802 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/afc3334a-0153-4dcc-9a56-92f6cae51c08-run-ovn\") pod \"afc3334a-0153-4dcc-9a56-92f6cae51c08\" (UID: \"afc3334a-0153-4dcc-9a56-92f6cae51c08\") " Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.340818 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/afc3334a-0153-4dcc-9a56-92f6cae51c08-host-cni-netd\") pod \"afc3334a-0153-4dcc-9a56-92f6cae51c08\" (UID: \"afc3334a-0153-4dcc-9a56-92f6cae51c08\") " Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.340840 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/afc3334a-0153-4dcc-9a56-92f6cae51c08-host-var-lib-cni-networks-ovn-kubernetes\") pod \"afc3334a-0153-4dcc-9a56-92f6cae51c08\" (UID: \"afc3334a-0153-4dcc-9a56-92f6cae51c08\") " Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.340859 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/afc3334a-0153-4dcc-9a56-92f6cae51c08-ovn-node-metrics-cert\") pod \"afc3334a-0153-4dcc-9a56-92f6cae51c08\" (UID: \"afc3334a-0153-4dcc-9a56-92f6cae51c08\") " Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.340877 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/afc3334a-0153-4dcc-9a56-92f6cae51c08-etc-openvswitch\") pod \"afc3334a-0153-4dcc-9a56-92f6cae51c08\" (UID: \"afc3334a-0153-4dcc-9a56-92f6cae51c08\") " Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.340904 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/afc3334a-0153-4dcc-9a56-92f6cae51c08-systemd-units\") pod \"afc3334a-0153-4dcc-9a56-92f6cae51c08\" (UID: \"afc3334a-0153-4dcc-9a56-92f6cae51c08\") " Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.340917 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/afc3334a-0153-4dcc-9a56-92f6cae51c08-run-openvswitch\") pod \"afc3334a-0153-4dcc-9a56-92f6cae51c08\" (UID: \"afc3334a-0153-4dcc-9a56-92f6cae51c08\") " Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.340950 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/afc3334a-0153-4dcc-9a56-92f6cae51c08-node-log\") pod \"afc3334a-0153-4dcc-9a56-92f6cae51c08\" (UID: \"afc3334a-0153-4dcc-9a56-92f6cae51c08\") " Dec 02 07:36:14 crc 
kubenswrapper[4895]: I1202 07:36:14.340966 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/afc3334a-0153-4dcc-9a56-92f6cae51c08-ovnkube-script-lib\") pod \"afc3334a-0153-4dcc-9a56-92f6cae51c08\" (UID: \"afc3334a-0153-4dcc-9a56-92f6cae51c08\") " Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.340987 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/afc3334a-0153-4dcc-9a56-92f6cae51c08-host-slash\") pod \"afc3334a-0153-4dcc-9a56-92f6cae51c08\" (UID: \"afc3334a-0153-4dcc-9a56-92f6cae51c08\") " Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.341003 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/afc3334a-0153-4dcc-9a56-92f6cae51c08-env-overrides\") pod \"afc3334a-0153-4dcc-9a56-92f6cae51c08\" (UID: \"afc3334a-0153-4dcc-9a56-92f6cae51c08\") " Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.341034 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/afc3334a-0153-4dcc-9a56-92f6cae51c08-host-run-ovn-kubernetes\") pod \"afc3334a-0153-4dcc-9a56-92f6cae51c08\" (UID: \"afc3334a-0153-4dcc-9a56-92f6cae51c08\") " Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.341047 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/afc3334a-0153-4dcc-9a56-92f6cae51c08-host-run-netns\") pod \"afc3334a-0153-4dcc-9a56-92f6cae51c08\" (UID: \"afc3334a-0153-4dcc-9a56-92f6cae51c08\") " Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.341117 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/afc3334a-0153-4dcc-9a56-92f6cae51c08-ovnkube-config" 
(OuterVolumeSpecName: "ovnkube-config") pod "afc3334a-0153-4dcc-9a56-92f6cae51c08" (UID: "afc3334a-0153-4dcc-9a56-92f6cae51c08"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.341198 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/afc3334a-0153-4dcc-9a56-92f6cae51c08-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "afc3334a-0153-4dcc-9a56-92f6cae51c08" (UID: "afc3334a-0153-4dcc-9a56-92f6cae51c08"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.341403 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/afc3334a-0153-4dcc-9a56-92f6cae51c08-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "afc3334a-0153-4dcc-9a56-92f6cae51c08" (UID: "afc3334a-0153-4dcc-9a56-92f6cae51c08"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.341539 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/afc3334a-0153-4dcc-9a56-92f6cae51c08-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "afc3334a-0153-4dcc-9a56-92f6cae51c08" (UID: "afc3334a-0153-4dcc-9a56-92f6cae51c08"). InnerVolumeSpecName "run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.341586 4895 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/afc3334a-0153-4dcc-9a56-92f6cae51c08-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.341877 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/afc3334a-0153-4dcc-9a56-92f6cae51c08-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "afc3334a-0153-4dcc-9a56-92f6cae51c08" (UID: "afc3334a-0153-4dcc-9a56-92f6cae51c08"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.341939 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/afc3334a-0153-4dcc-9a56-92f6cae51c08-host-slash" (OuterVolumeSpecName: "host-slash") pod "afc3334a-0153-4dcc-9a56-92f6cae51c08" (UID: "afc3334a-0153-4dcc-9a56-92f6cae51c08"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.341964 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/afc3334a-0153-4dcc-9a56-92f6cae51c08-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "afc3334a-0153-4dcc-9a56-92f6cae51c08" (UID: "afc3334a-0153-4dcc-9a56-92f6cae51c08"). InnerVolumeSpecName "host-cni-bin". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.342110 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/afc3334a-0153-4dcc-9a56-92f6cae51c08-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "afc3334a-0153-4dcc-9a56-92f6cae51c08" (UID: "afc3334a-0153-4dcc-9a56-92f6cae51c08"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.342255 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/afc3334a-0153-4dcc-9a56-92f6cae51c08-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "afc3334a-0153-4dcc-9a56-92f6cae51c08" (UID: "afc3334a-0153-4dcc-9a56-92f6cae51c08"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.342272 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/afc3334a-0153-4dcc-9a56-92f6cae51c08-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "afc3334a-0153-4dcc-9a56-92f6cae51c08" (UID: "afc3334a-0153-4dcc-9a56-92f6cae51c08"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.342292 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/afc3334a-0153-4dcc-9a56-92f6cae51c08-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "afc3334a-0153-4dcc-9a56-92f6cae51c08" (UID: "afc3334a-0153-4dcc-9a56-92f6cae51c08"). InnerVolumeSpecName "host-run-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.342313 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/afc3334a-0153-4dcc-9a56-92f6cae51c08-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "afc3334a-0153-4dcc-9a56-92f6cae51c08" (UID: "afc3334a-0153-4dcc-9a56-92f6cae51c08"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.342320 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/afc3334a-0153-4dcc-9a56-92f6cae51c08-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "afc3334a-0153-4dcc-9a56-92f6cae51c08" (UID: "afc3334a-0153-4dcc-9a56-92f6cae51c08"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.342336 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/afc3334a-0153-4dcc-9a56-92f6cae51c08-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "afc3334a-0153-4dcc-9a56-92f6cae51c08" (UID: "afc3334a-0153-4dcc-9a56-92f6cae51c08"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.342357 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/afc3334a-0153-4dcc-9a56-92f6cae51c08-node-log" (OuterVolumeSpecName: "node-log") pod "afc3334a-0153-4dcc-9a56-92f6cae51c08" (UID: "afc3334a-0153-4dcc-9a56-92f6cae51c08"). InnerVolumeSpecName "node-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.342566 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/afc3334a-0153-4dcc-9a56-92f6cae51c08-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "afc3334a-0153-4dcc-9a56-92f6cae51c08" (UID: "afc3334a-0153-4dcc-9a56-92f6cae51c08"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.341931 4895 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/afc3334a-0153-4dcc-9a56-92f6cae51c08-log-socket\") on node \"crc\" DevicePath \"\"" Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.342758 4895 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/afc3334a-0153-4dcc-9a56-92f6cae51c08-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.342771 4895 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/afc3334a-0153-4dcc-9a56-92f6cae51c08-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.347923 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/afc3334a-0153-4dcc-9a56-92f6cae51c08-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "afc3334a-0153-4dcc-9a56-92f6cae51c08" (UID: "afc3334a-0153-4dcc-9a56-92f6cae51c08"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.348505 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/afc3334a-0153-4dcc-9a56-92f6cae51c08-kube-api-access-5ldkg" (OuterVolumeSpecName: "kube-api-access-5ldkg") pod "afc3334a-0153-4dcc-9a56-92f6cae51c08" (UID: "afc3334a-0153-4dcc-9a56-92f6cae51c08"). InnerVolumeSpecName "kube-api-access-5ldkg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.358406 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/afc3334a-0153-4dcc-9a56-92f6cae51c08-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "afc3334a-0153-4dcc-9a56-92f6cae51c08" (UID: "afc3334a-0153-4dcc-9a56-92f6cae51c08"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.444140 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2d53304e-5b74-4f0a-b9fb-e0bb2c61eee3-ovn-node-metrics-cert\") pod \"ovnkube-node-9t5xx\" (UID: \"2d53304e-5b74-4f0a-b9fb-e0bb2c61eee3\") " pod="openshift-ovn-kubernetes/ovnkube-node-9t5xx" Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.444194 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2d53304e-5b74-4f0a-b9fb-e0bb2c61eee3-host-run-netns\") pod \"ovnkube-node-9t5xx\" (UID: \"2d53304e-5b74-4f0a-b9fb-e0bb2c61eee3\") " pod="openshift-ovn-kubernetes/ovnkube-node-9t5xx" Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.444233 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/2d53304e-5b74-4f0a-b9fb-e0bb2c61eee3-ovnkube-script-lib\") pod \"ovnkube-node-9t5xx\" (UID: \"2d53304e-5b74-4f0a-b9fb-e0bb2c61eee3\") " pod="openshift-ovn-kubernetes/ovnkube-node-9t5xx" Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.444261 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2d53304e-5b74-4f0a-b9fb-e0bb2c61eee3-run-systemd\") pod \"ovnkube-node-9t5xx\" (UID: \"2d53304e-5b74-4f0a-b9fb-e0bb2c61eee3\") " pod="openshift-ovn-kubernetes/ovnkube-node-9t5xx" Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.444284 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2d53304e-5b74-4f0a-b9fb-e0bb2c61eee3-run-openvswitch\") pod \"ovnkube-node-9t5xx\" (UID: \"2d53304e-5b74-4f0a-b9fb-e0bb2c61eee3\") " pod="openshift-ovn-kubernetes/ovnkube-node-9t5xx" Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.444471 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2d53304e-5b74-4f0a-b9fb-e0bb2c61eee3-log-socket\") pod \"ovnkube-node-9t5xx\" (UID: \"2d53304e-5b74-4f0a-b9fb-e0bb2c61eee3\") " pod="openshift-ovn-kubernetes/ovnkube-node-9t5xx" Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.444650 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2d53304e-5b74-4f0a-b9fb-e0bb2c61eee3-host-run-ovn-kubernetes\") pod \"ovnkube-node-9t5xx\" (UID: \"2d53304e-5b74-4f0a-b9fb-e0bb2c61eee3\") " pod="openshift-ovn-kubernetes/ovnkube-node-9t5xx" Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.444718 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"node-log\" (UniqueName: \"kubernetes.io/host-path/2d53304e-5b74-4f0a-b9fb-e0bb2c61eee3-node-log\") pod \"ovnkube-node-9t5xx\" (UID: \"2d53304e-5b74-4f0a-b9fb-e0bb2c61eee3\") " pod="openshift-ovn-kubernetes/ovnkube-node-9t5xx" Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.444859 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2d53304e-5b74-4f0a-b9fb-e0bb2c61eee3-ovnkube-config\") pod \"ovnkube-node-9t5xx\" (UID: \"2d53304e-5b74-4f0a-b9fb-e0bb2c61eee3\") " pod="openshift-ovn-kubernetes/ovnkube-node-9t5xx" Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.444915 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2d53304e-5b74-4f0a-b9fb-e0bb2c61eee3-host-cni-netd\") pod \"ovnkube-node-9t5xx\" (UID: \"2d53304e-5b74-4f0a-b9fb-e0bb2c61eee3\") " pod="openshift-ovn-kubernetes/ovnkube-node-9t5xx" Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.445054 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2d53304e-5b74-4f0a-b9fb-e0bb2c61eee3-env-overrides\") pod \"ovnkube-node-9t5xx\" (UID: \"2d53304e-5b74-4f0a-b9fb-e0bb2c61eee3\") " pod="openshift-ovn-kubernetes/ovnkube-node-9t5xx" Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.445110 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2d53304e-5b74-4f0a-b9fb-e0bb2c61eee3-run-ovn\") pod \"ovnkube-node-9t5xx\" (UID: \"2d53304e-5b74-4f0a-b9fb-e0bb2c61eee3\") " pod="openshift-ovn-kubernetes/ovnkube-node-9t5xx" Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.445153 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2d53304e-5b74-4f0a-b9fb-e0bb2c61eee3-var-lib-openvswitch\") pod \"ovnkube-node-9t5xx\" (UID: \"2d53304e-5b74-4f0a-b9fb-e0bb2c61eee3\") " pod="openshift-ovn-kubernetes/ovnkube-node-9t5xx" Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.445191 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2d53304e-5b74-4f0a-b9fb-e0bb2c61eee3-host-cni-bin\") pod \"ovnkube-node-9t5xx\" (UID: \"2d53304e-5b74-4f0a-b9fb-e0bb2c61eee3\") " pod="openshift-ovn-kubernetes/ovnkube-node-9t5xx" Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.445228 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2d53304e-5b74-4f0a-b9fb-e0bb2c61eee3-systemd-units\") pod \"ovnkube-node-9t5xx\" (UID: \"2d53304e-5b74-4f0a-b9fb-e0bb2c61eee3\") " pod="openshift-ovn-kubernetes/ovnkube-node-9t5xx" Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.445276 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mn2bg\" (UniqueName: \"kubernetes.io/projected/2d53304e-5b74-4f0a-b9fb-e0bb2c61eee3-kube-api-access-mn2bg\") pod \"ovnkube-node-9t5xx\" (UID: \"2d53304e-5b74-4f0a-b9fb-e0bb2c61eee3\") " pod="openshift-ovn-kubernetes/ovnkube-node-9t5xx" Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.445314 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2d53304e-5b74-4f0a-b9fb-e0bb2c61eee3-host-kubelet\") pod \"ovnkube-node-9t5xx\" (UID: \"2d53304e-5b74-4f0a-b9fb-e0bb2c61eee3\") " pod="openshift-ovn-kubernetes/ovnkube-node-9t5xx" Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.445361 4895 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2d53304e-5b74-4f0a-b9fb-e0bb2c61eee3-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-9t5xx\" (UID: \"2d53304e-5b74-4f0a-b9fb-e0bb2c61eee3\") " pod="openshift-ovn-kubernetes/ovnkube-node-9t5xx" Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.445418 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2d53304e-5b74-4f0a-b9fb-e0bb2c61eee3-host-slash\") pod \"ovnkube-node-9t5xx\" (UID: \"2d53304e-5b74-4f0a-b9fb-e0bb2c61eee3\") " pod="openshift-ovn-kubernetes/ovnkube-node-9t5xx" Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.445509 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2d53304e-5b74-4f0a-b9fb-e0bb2c61eee3-etc-openvswitch\") pod \"ovnkube-node-9t5xx\" (UID: \"2d53304e-5b74-4f0a-b9fb-e0bb2c61eee3\") " pod="openshift-ovn-kubernetes/ovnkube-node-9t5xx" Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.445653 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5ldkg\" (UniqueName: \"kubernetes.io/projected/afc3334a-0153-4dcc-9a56-92f6cae51c08-kube-api-access-5ldkg\") on node \"crc\" DevicePath \"\"" Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.445679 4895 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/afc3334a-0153-4dcc-9a56-92f6cae51c08-host-kubelet\") on node \"crc\" DevicePath \"\"" Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.445700 4895 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/afc3334a-0153-4dcc-9a56-92f6cae51c08-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 02 07:36:14 crc 
kubenswrapper[4895]: I1202 07:36:14.445719 4895 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/afc3334a-0153-4dcc-9a56-92f6cae51c08-host-cni-netd\") on node \"crc\" DevicePath \"\"" Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.445772 4895 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/afc3334a-0153-4dcc-9a56-92f6cae51c08-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.445799 4895 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/afc3334a-0153-4dcc-9a56-92f6cae51c08-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.445819 4895 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/afc3334a-0153-4dcc-9a56-92f6cae51c08-systemd-units\") on node \"crc\" DevicePath \"\"" Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.445837 4895 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/afc3334a-0153-4dcc-9a56-92f6cae51c08-run-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.445856 4895 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/afc3334a-0153-4dcc-9a56-92f6cae51c08-node-log\") on node \"crc\" DevicePath \"\"" Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.445876 4895 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/afc3334a-0153-4dcc-9a56-92f6cae51c08-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.445894 4895 
reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/afc3334a-0153-4dcc-9a56-92f6cae51c08-host-slash\") on node \"crc\" DevicePath \"\"" Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.445913 4895 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/afc3334a-0153-4dcc-9a56-92f6cae51c08-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.445931 4895 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/afc3334a-0153-4dcc-9a56-92f6cae51c08-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.445948 4895 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/afc3334a-0153-4dcc-9a56-92f6cae51c08-host-run-netns\") on node \"crc\" DevicePath \"\"" Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.445968 4895 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/afc3334a-0153-4dcc-9a56-92f6cae51c08-run-systemd\") on node \"crc\" DevicePath \"\"" Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.445985 4895 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/afc3334a-0153-4dcc-9a56-92f6cae51c08-host-cni-bin\") on node \"crc\" DevicePath \"\"" Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.547415 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2d53304e-5b74-4f0a-b9fb-e0bb2c61eee3-etc-openvswitch\") pod \"ovnkube-node-9t5xx\" (UID: \"2d53304e-5b74-4f0a-b9fb-e0bb2c61eee3\") " pod="openshift-ovn-kubernetes/ovnkube-node-9t5xx" Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.547473 4895 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2d53304e-5b74-4f0a-b9fb-e0bb2c61eee3-ovn-node-metrics-cert\") pod \"ovnkube-node-9t5xx\" (UID: \"2d53304e-5b74-4f0a-b9fb-e0bb2c61eee3\") " pod="openshift-ovn-kubernetes/ovnkube-node-9t5xx" Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.547496 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2d53304e-5b74-4f0a-b9fb-e0bb2c61eee3-host-run-netns\") pod \"ovnkube-node-9t5xx\" (UID: \"2d53304e-5b74-4f0a-b9fb-e0bb2c61eee3\") " pod="openshift-ovn-kubernetes/ovnkube-node-9t5xx" Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.547523 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2d53304e-5b74-4f0a-b9fb-e0bb2c61eee3-ovnkube-script-lib\") pod \"ovnkube-node-9t5xx\" (UID: \"2d53304e-5b74-4f0a-b9fb-e0bb2c61eee3\") " pod="openshift-ovn-kubernetes/ovnkube-node-9t5xx" Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.547544 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2d53304e-5b74-4f0a-b9fb-e0bb2c61eee3-run-systemd\") pod \"ovnkube-node-9t5xx\" (UID: \"2d53304e-5b74-4f0a-b9fb-e0bb2c61eee3\") " pod="openshift-ovn-kubernetes/ovnkube-node-9t5xx" Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.547565 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2d53304e-5b74-4f0a-b9fb-e0bb2c61eee3-run-openvswitch\") pod \"ovnkube-node-9t5xx\" (UID: \"2d53304e-5b74-4f0a-b9fb-e0bb2c61eee3\") " pod="openshift-ovn-kubernetes/ovnkube-node-9t5xx" Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.547589 4895 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2d53304e-5b74-4f0a-b9fb-e0bb2c61eee3-log-socket\") pod \"ovnkube-node-9t5xx\" (UID: \"2d53304e-5b74-4f0a-b9fb-e0bb2c61eee3\") " pod="openshift-ovn-kubernetes/ovnkube-node-9t5xx" Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.547623 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2d53304e-5b74-4f0a-b9fb-e0bb2c61eee3-host-run-ovn-kubernetes\") pod \"ovnkube-node-9t5xx\" (UID: \"2d53304e-5b74-4f0a-b9fb-e0bb2c61eee3\") " pod="openshift-ovn-kubernetes/ovnkube-node-9t5xx" Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.547648 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2d53304e-5b74-4f0a-b9fb-e0bb2c61eee3-node-log\") pod \"ovnkube-node-9t5xx\" (UID: \"2d53304e-5b74-4f0a-b9fb-e0bb2c61eee3\") " pod="openshift-ovn-kubernetes/ovnkube-node-9t5xx" Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.547666 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2d53304e-5b74-4f0a-b9fb-e0bb2c61eee3-ovnkube-config\") pod \"ovnkube-node-9t5xx\" (UID: \"2d53304e-5b74-4f0a-b9fb-e0bb2c61eee3\") " pod="openshift-ovn-kubernetes/ovnkube-node-9t5xx" Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.547685 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2d53304e-5b74-4f0a-b9fb-e0bb2c61eee3-host-cni-netd\") pod \"ovnkube-node-9t5xx\" (UID: \"2d53304e-5b74-4f0a-b9fb-e0bb2c61eee3\") " pod="openshift-ovn-kubernetes/ovnkube-node-9t5xx" Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.547715 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/2d53304e-5b74-4f0a-b9fb-e0bb2c61eee3-env-overrides\") pod \"ovnkube-node-9t5xx\" (UID: \"2d53304e-5b74-4f0a-b9fb-e0bb2c61eee3\") " pod="openshift-ovn-kubernetes/ovnkube-node-9t5xx" Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.547769 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2d53304e-5b74-4f0a-b9fb-e0bb2c61eee3-run-ovn\") pod \"ovnkube-node-9t5xx\" (UID: \"2d53304e-5b74-4f0a-b9fb-e0bb2c61eee3\") " pod="openshift-ovn-kubernetes/ovnkube-node-9t5xx" Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.547794 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2d53304e-5b74-4f0a-b9fb-e0bb2c61eee3-var-lib-openvswitch\") pod \"ovnkube-node-9t5xx\" (UID: \"2d53304e-5b74-4f0a-b9fb-e0bb2c61eee3\") " pod="openshift-ovn-kubernetes/ovnkube-node-9t5xx" Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.547814 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2d53304e-5b74-4f0a-b9fb-e0bb2c61eee3-host-cni-bin\") pod \"ovnkube-node-9t5xx\" (UID: \"2d53304e-5b74-4f0a-b9fb-e0bb2c61eee3\") " pod="openshift-ovn-kubernetes/ovnkube-node-9t5xx" Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.547836 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2d53304e-5b74-4f0a-b9fb-e0bb2c61eee3-systemd-units\") pod \"ovnkube-node-9t5xx\" (UID: \"2d53304e-5b74-4f0a-b9fb-e0bb2c61eee3\") " pod="openshift-ovn-kubernetes/ovnkube-node-9t5xx" Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.547856 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mn2bg\" (UniqueName: 
\"kubernetes.io/projected/2d53304e-5b74-4f0a-b9fb-e0bb2c61eee3-kube-api-access-mn2bg\") pod \"ovnkube-node-9t5xx\" (UID: \"2d53304e-5b74-4f0a-b9fb-e0bb2c61eee3\") " pod="openshift-ovn-kubernetes/ovnkube-node-9t5xx" Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.547875 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2d53304e-5b74-4f0a-b9fb-e0bb2c61eee3-host-kubelet\") pod \"ovnkube-node-9t5xx\" (UID: \"2d53304e-5b74-4f0a-b9fb-e0bb2c61eee3\") " pod="openshift-ovn-kubernetes/ovnkube-node-9t5xx" Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.547894 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2d53304e-5b74-4f0a-b9fb-e0bb2c61eee3-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-9t5xx\" (UID: \"2d53304e-5b74-4f0a-b9fb-e0bb2c61eee3\") " pod="openshift-ovn-kubernetes/ovnkube-node-9t5xx" Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.547914 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2d53304e-5b74-4f0a-b9fb-e0bb2c61eee3-host-slash\") pod \"ovnkube-node-9t5xx\" (UID: \"2d53304e-5b74-4f0a-b9fb-e0bb2c61eee3\") " pod="openshift-ovn-kubernetes/ovnkube-node-9t5xx" Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.547936 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2d53304e-5b74-4f0a-b9fb-e0bb2c61eee3-log-socket\") pod \"ovnkube-node-9t5xx\" (UID: \"2d53304e-5b74-4f0a-b9fb-e0bb2c61eee3\") " pod="openshift-ovn-kubernetes/ovnkube-node-9t5xx" Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.547982 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/2d53304e-5b74-4f0a-b9fb-e0bb2c61eee3-host-slash\") pod \"ovnkube-node-9t5xx\" (UID: \"2d53304e-5b74-4f0a-b9fb-e0bb2c61eee3\") " pod="openshift-ovn-kubernetes/ovnkube-node-9t5xx" Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.548035 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2d53304e-5b74-4f0a-b9fb-e0bb2c61eee3-host-run-ovn-kubernetes\") pod \"ovnkube-node-9t5xx\" (UID: \"2d53304e-5b74-4f0a-b9fb-e0bb2c61eee3\") " pod="openshift-ovn-kubernetes/ovnkube-node-9t5xx" Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.548029 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2d53304e-5b74-4f0a-b9fb-e0bb2c61eee3-etc-openvswitch\") pod \"ovnkube-node-9t5xx\" (UID: \"2d53304e-5b74-4f0a-b9fb-e0bb2c61eee3\") " pod="openshift-ovn-kubernetes/ovnkube-node-9t5xx" Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.548074 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2d53304e-5b74-4f0a-b9fb-e0bb2c61eee3-node-log\") pod \"ovnkube-node-9t5xx\" (UID: \"2d53304e-5b74-4f0a-b9fb-e0bb2c61eee3\") " pod="openshift-ovn-kubernetes/ovnkube-node-9t5xx" Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.548402 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2d53304e-5b74-4f0a-b9fb-e0bb2c61eee3-host-cni-bin\") pod \"ovnkube-node-9t5xx\" (UID: \"2d53304e-5b74-4f0a-b9fb-e0bb2c61eee3\") " pod="openshift-ovn-kubernetes/ovnkube-node-9t5xx" Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.548817 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2d53304e-5b74-4f0a-b9fb-e0bb2c61eee3-host-cni-netd\") pod \"ovnkube-node-9t5xx\" (UID: 
\"2d53304e-5b74-4f0a-b9fb-e0bb2c61eee3\") " pod="openshift-ovn-kubernetes/ovnkube-node-9t5xx" Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.549013 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2d53304e-5b74-4f0a-b9fb-e0bb2c61eee3-ovnkube-config\") pod \"ovnkube-node-9t5xx\" (UID: \"2d53304e-5b74-4f0a-b9fb-e0bb2c61eee3\") " pod="openshift-ovn-kubernetes/ovnkube-node-9t5xx" Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.549200 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2d53304e-5b74-4f0a-b9fb-e0bb2c61eee3-run-systemd\") pod \"ovnkube-node-9t5xx\" (UID: \"2d53304e-5b74-4f0a-b9fb-e0bb2c61eee3\") " pod="openshift-ovn-kubernetes/ovnkube-node-9t5xx" Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.549207 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2d53304e-5b74-4f0a-b9fb-e0bb2c61eee3-systemd-units\") pod \"ovnkube-node-9t5xx\" (UID: \"2d53304e-5b74-4f0a-b9fb-e0bb2c61eee3\") " pod="openshift-ovn-kubernetes/ovnkube-node-9t5xx" Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.549258 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2d53304e-5b74-4f0a-b9fb-e0bb2c61eee3-var-lib-openvswitch\") pod \"ovnkube-node-9t5xx\" (UID: \"2d53304e-5b74-4f0a-b9fb-e0bb2c61eee3\") " pod="openshift-ovn-kubernetes/ovnkube-node-9t5xx" Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.549273 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2d53304e-5b74-4f0a-b9fb-e0bb2c61eee3-env-overrides\") pod \"ovnkube-node-9t5xx\" (UID: \"2d53304e-5b74-4f0a-b9fb-e0bb2c61eee3\") " pod="openshift-ovn-kubernetes/ovnkube-node-9t5xx" Dec 02 07:36:14 crc 
kubenswrapper[4895]: I1202 07:36:14.549281 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2d53304e-5b74-4f0a-b9fb-e0bb2c61eee3-run-ovn\") pod \"ovnkube-node-9t5xx\" (UID: \"2d53304e-5b74-4f0a-b9fb-e0bb2c61eee3\") " pod="openshift-ovn-kubernetes/ovnkube-node-9t5xx" Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.549311 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2d53304e-5b74-4f0a-b9fb-e0bb2c61eee3-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-9t5xx\" (UID: \"2d53304e-5b74-4f0a-b9fb-e0bb2c61eee3\") " pod="openshift-ovn-kubernetes/ovnkube-node-9t5xx" Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.549338 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2d53304e-5b74-4f0a-b9fb-e0bb2c61eee3-run-openvswitch\") pod \"ovnkube-node-9t5xx\" (UID: \"2d53304e-5b74-4f0a-b9fb-e0bb2c61eee3\") " pod="openshift-ovn-kubernetes/ovnkube-node-9t5xx" Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.549384 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2d53304e-5b74-4f0a-b9fb-e0bb2c61eee3-host-run-netns\") pod \"ovnkube-node-9t5xx\" (UID: \"2d53304e-5b74-4f0a-b9fb-e0bb2c61eee3\") " pod="openshift-ovn-kubernetes/ovnkube-node-9t5xx" Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.549220 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2d53304e-5b74-4f0a-b9fb-e0bb2c61eee3-host-kubelet\") pod \"ovnkube-node-9t5xx\" (UID: \"2d53304e-5b74-4f0a-b9fb-e0bb2c61eee3\") " pod="openshift-ovn-kubernetes/ovnkube-node-9t5xx" Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.549758 4895 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2d53304e-5b74-4f0a-b9fb-e0bb2c61eee3-ovnkube-script-lib\") pod \"ovnkube-node-9t5xx\" (UID: \"2d53304e-5b74-4f0a-b9fb-e0bb2c61eee3\") " pod="openshift-ovn-kubernetes/ovnkube-node-9t5xx" Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.551655 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2d53304e-5b74-4f0a-b9fb-e0bb2c61eee3-ovn-node-metrics-cert\") pod \"ovnkube-node-9t5xx\" (UID: \"2d53304e-5b74-4f0a-b9fb-e0bb2c61eee3\") " pod="openshift-ovn-kubernetes/ovnkube-node-9t5xx" Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.567838 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mn2bg\" (UniqueName: \"kubernetes.io/projected/2d53304e-5b74-4f0a-b9fb-e0bb2c61eee3-kube-api-access-mn2bg\") pod \"ovnkube-node-9t5xx\" (UID: \"2d53304e-5b74-4f0a-b9fb-e0bb2c61eee3\") " pod="openshift-ovn-kubernetes/ovnkube-node-9t5xx" Dec 02 07:36:14 crc kubenswrapper[4895]: I1202 07:36:14.600523 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-9t5xx" Dec 02 07:36:14 crc kubenswrapper[4895]: W1202 07:36:14.619463 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d53304e_5b74_4f0a_b9fb_e0bb2c61eee3.slice/crio-c3df5238bc27b35b27cf430e50d6ba7474691c9212abb3f5eeb51e89b2c49b04 WatchSource:0}: Error finding container c3df5238bc27b35b27cf430e50d6ba7474691c9212abb3f5eeb51e89b2c49b04: Status 404 returned error can't find the container with id c3df5238bc27b35b27cf430e50d6ba7474691c9212abb3f5eeb51e89b2c49b04 Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.015666 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-hlxqt_30911fe5-208f-44e8-a380-2a0093f24863/kube-multus/2.log" Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.016229 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-hlxqt" event={"ID":"30911fe5-208f-44e8-a380-2a0093f24863","Type":"ContainerStarted","Data":"20aca201e9012edca0988a7bd621e40805ffd557746760b1f6f16c72e30ca143"} Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.019804 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w54m4_afc3334a-0153-4dcc-9a56-92f6cae51c08/ovnkube-controller/3.log" Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.022219 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w54m4_afc3334a-0153-4dcc-9a56-92f6cae51c08/ovn-acl-logging/0.log" Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.022710 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w54m4_afc3334a-0153-4dcc-9a56-92f6cae51c08/ovn-controller/0.log" Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.023045 4895 generic.go:334] "Generic (PLEG): container finished" podID="afc3334a-0153-4dcc-9a56-92f6cae51c08" 
containerID="cd338136e3f94adb272a6ccce660c3f7a51db40e100162b0af875d6814db30d1" exitCode=0 Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.023070 4895 generic.go:334] "Generic (PLEG): container finished" podID="afc3334a-0153-4dcc-9a56-92f6cae51c08" containerID="d6447684fe6189e38c9caad6dfe1384b7af01d50e2ef5f889c7b2874ba092306" exitCode=0 Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.023080 4895 generic.go:334] "Generic (PLEG): container finished" podID="afc3334a-0153-4dcc-9a56-92f6cae51c08" containerID="80dca705ddb94ba799eb72bfc23b0c3878cceca48d6d10f1c7f8eb0e4beed40e" exitCode=0 Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.023088 4895 generic.go:334] "Generic (PLEG): container finished" podID="afc3334a-0153-4dcc-9a56-92f6cae51c08" containerID="f54ae324450e73bcbf2c408794171a92bbb66c24d05f29584ac2c247090c1273" exitCode=0 Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.023095 4895 generic.go:334] "Generic (PLEG): container finished" podID="afc3334a-0153-4dcc-9a56-92f6cae51c08" containerID="5463b077f0bc8b5d38e27854ece6a71f74f7769f4552c32eefe16854993f290f" exitCode=0 Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.023102 4895 generic.go:334] "Generic (PLEG): container finished" podID="afc3334a-0153-4dcc-9a56-92f6cae51c08" containerID="5f330d728eecd672ecb554744a51ebfb2c104441d3a85fe0c8a6fc3446c37e4a" exitCode=143 Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.023109 4895 generic.go:334] "Generic (PLEG): container finished" podID="afc3334a-0153-4dcc-9a56-92f6cae51c08" containerID="ca732cac495c410d89da4ce1476acd20905716cbb2baeedac9c52b654ff58573" exitCode=143 Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.023183 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.023191 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" event={"ID":"afc3334a-0153-4dcc-9a56-92f6cae51c08","Type":"ContainerDied","Data":"cd338136e3f94adb272a6ccce660c3f7a51db40e100162b0af875d6814db30d1"} Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.023256 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" event={"ID":"afc3334a-0153-4dcc-9a56-92f6cae51c08","Type":"ContainerDied","Data":"d6447684fe6189e38c9caad6dfe1384b7af01d50e2ef5f889c7b2874ba092306"} Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.023271 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" event={"ID":"afc3334a-0153-4dcc-9a56-92f6cae51c08","Type":"ContainerDied","Data":"80dca705ddb94ba799eb72bfc23b0c3878cceca48d6d10f1c7f8eb0e4beed40e"} Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.023288 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" event={"ID":"afc3334a-0153-4dcc-9a56-92f6cae51c08","Type":"ContainerDied","Data":"f54ae324450e73bcbf2c408794171a92bbb66c24d05f29584ac2c247090c1273"} Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.023298 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" event={"ID":"afc3334a-0153-4dcc-9a56-92f6cae51c08","Type":"ContainerDied","Data":"5463b077f0bc8b5d38e27854ece6a71f74f7769f4552c32eefe16854993f290f"} Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.023307 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" event={"ID":"afc3334a-0153-4dcc-9a56-92f6cae51c08","Type":"ContainerDied","Data":"5f330d728eecd672ecb554744a51ebfb2c104441d3a85fe0c8a6fc3446c37e4a"} Dec 02 07:36:15 crc 
kubenswrapper[4895]: I1202 07:36:15.023319 4895 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"52bad85cc5d71d65071ede7a4c939c8f773468925d9bb31b09a37641d870c228"} Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.023332 4895 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d6447684fe6189e38c9caad6dfe1384b7af01d50e2ef5f889c7b2874ba092306"} Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.023335 4895 scope.go:117] "RemoveContainer" containerID="cd338136e3f94adb272a6ccce660c3f7a51db40e100162b0af875d6814db30d1" Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.023337 4895 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"80dca705ddb94ba799eb72bfc23b0c3878cceca48d6d10f1c7f8eb0e4beed40e"} Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.023419 4895 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f54ae324450e73bcbf2c408794171a92bbb66c24d05f29584ac2c247090c1273"} Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.023431 4895 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5463b077f0bc8b5d38e27854ece6a71f74f7769f4552c32eefe16854993f290f"} Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.023437 4895 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"91b0c6e50085a7be0924e5a54ef006110a7f13556fe47f7f55124fd6b2f0bc36"} Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.023444 4895 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5f330d728eecd672ecb554744a51ebfb2c104441d3a85fe0c8a6fc3446c37e4a"} Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.023449 4895 pod_container_deletor.go:114] 
"Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ca732cac495c410d89da4ce1476acd20905716cbb2baeedac9c52b654ff58573"} Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.023456 4895 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"328fa773c92ef0a965065412d4141c0b41ffe38d21ed169386a9d30d05f37f28"} Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.023474 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" event={"ID":"afc3334a-0153-4dcc-9a56-92f6cae51c08","Type":"ContainerDied","Data":"ca732cac495c410d89da4ce1476acd20905716cbb2baeedac9c52b654ff58573"} Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.023492 4895 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cd338136e3f94adb272a6ccce660c3f7a51db40e100162b0af875d6814db30d1"} Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.023499 4895 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"52bad85cc5d71d65071ede7a4c939c8f773468925d9bb31b09a37641d870c228"} Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.023504 4895 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d6447684fe6189e38c9caad6dfe1384b7af01d50e2ef5f889c7b2874ba092306"} Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.023509 4895 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"80dca705ddb94ba799eb72bfc23b0c3878cceca48d6d10f1c7f8eb0e4beed40e"} Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.023514 4895 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f54ae324450e73bcbf2c408794171a92bbb66c24d05f29584ac2c247090c1273"} Dec 02 07:36:15 crc 
kubenswrapper[4895]: I1202 07:36:15.023520 4895 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5463b077f0bc8b5d38e27854ece6a71f74f7769f4552c32eefe16854993f290f"} Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.023525 4895 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"91b0c6e50085a7be0924e5a54ef006110a7f13556fe47f7f55124fd6b2f0bc36"} Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.023530 4895 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5f330d728eecd672ecb554744a51ebfb2c104441d3a85fe0c8a6fc3446c37e4a"} Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.023536 4895 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ca732cac495c410d89da4ce1476acd20905716cbb2baeedac9c52b654ff58573"} Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.023541 4895 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"328fa773c92ef0a965065412d4141c0b41ffe38d21ed169386a9d30d05f37f28"} Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.023548 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w54m4" event={"ID":"afc3334a-0153-4dcc-9a56-92f6cae51c08","Type":"ContainerDied","Data":"49b65f34552e500042cf1b7a788b223005dea6e388c4dde91984023ebc9f0827"} Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.023555 4895 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cd338136e3f94adb272a6ccce660c3f7a51db40e100162b0af875d6814db30d1"} Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.023565 4895 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"52bad85cc5d71d65071ede7a4c939c8f773468925d9bb31b09a37641d870c228"} Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.023572 4895 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d6447684fe6189e38c9caad6dfe1384b7af01d50e2ef5f889c7b2874ba092306"} Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.023579 4895 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"80dca705ddb94ba799eb72bfc23b0c3878cceca48d6d10f1c7f8eb0e4beed40e"} Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.023586 4895 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f54ae324450e73bcbf2c408794171a92bbb66c24d05f29584ac2c247090c1273"} Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.023593 4895 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5463b077f0bc8b5d38e27854ece6a71f74f7769f4552c32eefe16854993f290f"} Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.023600 4895 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"91b0c6e50085a7be0924e5a54ef006110a7f13556fe47f7f55124fd6b2f0bc36"} Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.023607 4895 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5f330d728eecd672ecb554744a51ebfb2c104441d3a85fe0c8a6fc3446c37e4a"} Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.023613 4895 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ca732cac495c410d89da4ce1476acd20905716cbb2baeedac9c52b654ff58573"} Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.023620 4895 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"328fa773c92ef0a965065412d4141c0b41ffe38d21ed169386a9d30d05f37f28"} Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.026317 4895 generic.go:334] "Generic (PLEG): container finished" podID="2d53304e-5b74-4f0a-b9fb-e0bb2c61eee3" containerID="860797ea3f03e333e4cde004e9dd60ba803f841b247de2c5d5f86ec5c2f1579c" exitCode=0 Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.026362 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9t5xx" event={"ID":"2d53304e-5b74-4f0a-b9fb-e0bb2c61eee3","Type":"ContainerDied","Data":"860797ea3f03e333e4cde004e9dd60ba803f841b247de2c5d5f86ec5c2f1579c"} Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.026392 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9t5xx" event={"ID":"2d53304e-5b74-4f0a-b9fb-e0bb2c61eee3","Type":"ContainerStarted","Data":"c3df5238bc27b35b27cf430e50d6ba7474691c9212abb3f5eeb51e89b2c49b04"} Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.074975 4895 scope.go:117] "RemoveContainer" containerID="52bad85cc5d71d65071ede7a4c939c8f773468925d9bb31b09a37641d870c228" Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.110360 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-w54m4"] Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.114409 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-w54m4"] Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.120034 4895 scope.go:117] "RemoveContainer" containerID="d6447684fe6189e38c9caad6dfe1384b7af01d50e2ef5f889c7b2874ba092306" Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.149475 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="afc3334a-0153-4dcc-9a56-92f6cae51c08" path="/var/lib/kubelet/pods/afc3334a-0153-4dcc-9a56-92f6cae51c08/volumes" Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 
07:36:15.150891 4895 scope.go:117] "RemoveContainer" containerID="80dca705ddb94ba799eb72bfc23b0c3878cceca48d6d10f1c7f8eb0e4beed40e" Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.177464 4895 scope.go:117] "RemoveContainer" containerID="f54ae324450e73bcbf2c408794171a92bbb66c24d05f29584ac2c247090c1273" Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.193247 4895 scope.go:117] "RemoveContainer" containerID="5463b077f0bc8b5d38e27854ece6a71f74f7769f4552c32eefe16854993f290f" Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.212031 4895 scope.go:117] "RemoveContainer" containerID="91b0c6e50085a7be0924e5a54ef006110a7f13556fe47f7f55124fd6b2f0bc36" Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.245195 4895 scope.go:117] "RemoveContainer" containerID="5f330d728eecd672ecb554744a51ebfb2c104441d3a85fe0c8a6fc3446c37e4a" Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.279122 4895 scope.go:117] "RemoveContainer" containerID="ca732cac495c410d89da4ce1476acd20905716cbb2baeedac9c52b654ff58573" Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.303812 4895 scope.go:117] "RemoveContainer" containerID="328fa773c92ef0a965065412d4141c0b41ffe38d21ed169386a9d30d05f37f28" Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.331396 4895 scope.go:117] "RemoveContainer" containerID="cd338136e3f94adb272a6ccce660c3f7a51db40e100162b0af875d6814db30d1" Dec 02 07:36:15 crc kubenswrapper[4895]: E1202 07:36:15.334324 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd338136e3f94adb272a6ccce660c3f7a51db40e100162b0af875d6814db30d1\": container with ID starting with cd338136e3f94adb272a6ccce660c3f7a51db40e100162b0af875d6814db30d1 not found: ID does not exist" containerID="cd338136e3f94adb272a6ccce660c3f7a51db40e100162b0af875d6814db30d1" Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.334383 4895 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"cd338136e3f94adb272a6ccce660c3f7a51db40e100162b0af875d6814db30d1"} err="failed to get container status \"cd338136e3f94adb272a6ccce660c3f7a51db40e100162b0af875d6814db30d1\": rpc error: code = NotFound desc = could not find container \"cd338136e3f94adb272a6ccce660c3f7a51db40e100162b0af875d6814db30d1\": container with ID starting with cd338136e3f94adb272a6ccce660c3f7a51db40e100162b0af875d6814db30d1 not found: ID does not exist" Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.334421 4895 scope.go:117] "RemoveContainer" containerID="52bad85cc5d71d65071ede7a4c939c8f773468925d9bb31b09a37641d870c228" Dec 02 07:36:15 crc kubenswrapper[4895]: E1202 07:36:15.334651 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52bad85cc5d71d65071ede7a4c939c8f773468925d9bb31b09a37641d870c228\": container with ID starting with 52bad85cc5d71d65071ede7a4c939c8f773468925d9bb31b09a37641d870c228 not found: ID does not exist" containerID="52bad85cc5d71d65071ede7a4c939c8f773468925d9bb31b09a37641d870c228" Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.334675 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52bad85cc5d71d65071ede7a4c939c8f773468925d9bb31b09a37641d870c228"} err="failed to get container status \"52bad85cc5d71d65071ede7a4c939c8f773468925d9bb31b09a37641d870c228\": rpc error: code = NotFound desc = could not find container \"52bad85cc5d71d65071ede7a4c939c8f773468925d9bb31b09a37641d870c228\": container with ID starting with 52bad85cc5d71d65071ede7a4c939c8f773468925d9bb31b09a37641d870c228 not found: ID does not exist" Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.334692 4895 scope.go:117] "RemoveContainer" containerID="d6447684fe6189e38c9caad6dfe1384b7af01d50e2ef5f889c7b2874ba092306" Dec 02 07:36:15 crc kubenswrapper[4895]: E1202 07:36:15.335254 4895 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"d6447684fe6189e38c9caad6dfe1384b7af01d50e2ef5f889c7b2874ba092306\": container with ID starting with d6447684fe6189e38c9caad6dfe1384b7af01d50e2ef5f889c7b2874ba092306 not found: ID does not exist" containerID="d6447684fe6189e38c9caad6dfe1384b7af01d50e2ef5f889c7b2874ba092306" Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.335320 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6447684fe6189e38c9caad6dfe1384b7af01d50e2ef5f889c7b2874ba092306"} err="failed to get container status \"d6447684fe6189e38c9caad6dfe1384b7af01d50e2ef5f889c7b2874ba092306\": rpc error: code = NotFound desc = could not find container \"d6447684fe6189e38c9caad6dfe1384b7af01d50e2ef5f889c7b2874ba092306\": container with ID starting with d6447684fe6189e38c9caad6dfe1384b7af01d50e2ef5f889c7b2874ba092306 not found: ID does not exist" Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.335355 4895 scope.go:117] "RemoveContainer" containerID="80dca705ddb94ba799eb72bfc23b0c3878cceca48d6d10f1c7f8eb0e4beed40e" Dec 02 07:36:15 crc kubenswrapper[4895]: E1202 07:36:15.336017 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80dca705ddb94ba799eb72bfc23b0c3878cceca48d6d10f1c7f8eb0e4beed40e\": container with ID starting with 80dca705ddb94ba799eb72bfc23b0c3878cceca48d6d10f1c7f8eb0e4beed40e not found: ID does not exist" containerID="80dca705ddb94ba799eb72bfc23b0c3878cceca48d6d10f1c7f8eb0e4beed40e" Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.336058 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80dca705ddb94ba799eb72bfc23b0c3878cceca48d6d10f1c7f8eb0e4beed40e"} err="failed to get container status \"80dca705ddb94ba799eb72bfc23b0c3878cceca48d6d10f1c7f8eb0e4beed40e\": rpc error: code = NotFound desc = could not find container 
\"80dca705ddb94ba799eb72bfc23b0c3878cceca48d6d10f1c7f8eb0e4beed40e\": container with ID starting with 80dca705ddb94ba799eb72bfc23b0c3878cceca48d6d10f1c7f8eb0e4beed40e not found: ID does not exist" Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.336079 4895 scope.go:117] "RemoveContainer" containerID="f54ae324450e73bcbf2c408794171a92bbb66c24d05f29584ac2c247090c1273" Dec 02 07:36:15 crc kubenswrapper[4895]: E1202 07:36:15.336444 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f54ae324450e73bcbf2c408794171a92bbb66c24d05f29584ac2c247090c1273\": container with ID starting with f54ae324450e73bcbf2c408794171a92bbb66c24d05f29584ac2c247090c1273 not found: ID does not exist" containerID="f54ae324450e73bcbf2c408794171a92bbb66c24d05f29584ac2c247090c1273" Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.336474 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f54ae324450e73bcbf2c408794171a92bbb66c24d05f29584ac2c247090c1273"} err="failed to get container status \"f54ae324450e73bcbf2c408794171a92bbb66c24d05f29584ac2c247090c1273\": rpc error: code = NotFound desc = could not find container \"f54ae324450e73bcbf2c408794171a92bbb66c24d05f29584ac2c247090c1273\": container with ID starting with f54ae324450e73bcbf2c408794171a92bbb66c24d05f29584ac2c247090c1273 not found: ID does not exist" Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.336495 4895 scope.go:117] "RemoveContainer" containerID="5463b077f0bc8b5d38e27854ece6a71f74f7769f4552c32eefe16854993f290f" Dec 02 07:36:15 crc kubenswrapper[4895]: E1202 07:36:15.336783 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5463b077f0bc8b5d38e27854ece6a71f74f7769f4552c32eefe16854993f290f\": container with ID starting with 5463b077f0bc8b5d38e27854ece6a71f74f7769f4552c32eefe16854993f290f not found: ID does not exist" 
containerID="5463b077f0bc8b5d38e27854ece6a71f74f7769f4552c32eefe16854993f290f" Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.336810 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5463b077f0bc8b5d38e27854ece6a71f74f7769f4552c32eefe16854993f290f"} err="failed to get container status \"5463b077f0bc8b5d38e27854ece6a71f74f7769f4552c32eefe16854993f290f\": rpc error: code = NotFound desc = could not find container \"5463b077f0bc8b5d38e27854ece6a71f74f7769f4552c32eefe16854993f290f\": container with ID starting with 5463b077f0bc8b5d38e27854ece6a71f74f7769f4552c32eefe16854993f290f not found: ID does not exist" Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.336829 4895 scope.go:117] "RemoveContainer" containerID="91b0c6e50085a7be0924e5a54ef006110a7f13556fe47f7f55124fd6b2f0bc36" Dec 02 07:36:15 crc kubenswrapper[4895]: E1202 07:36:15.337060 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91b0c6e50085a7be0924e5a54ef006110a7f13556fe47f7f55124fd6b2f0bc36\": container with ID starting with 91b0c6e50085a7be0924e5a54ef006110a7f13556fe47f7f55124fd6b2f0bc36 not found: ID does not exist" containerID="91b0c6e50085a7be0924e5a54ef006110a7f13556fe47f7f55124fd6b2f0bc36" Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.337086 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91b0c6e50085a7be0924e5a54ef006110a7f13556fe47f7f55124fd6b2f0bc36"} err="failed to get container status \"91b0c6e50085a7be0924e5a54ef006110a7f13556fe47f7f55124fd6b2f0bc36\": rpc error: code = NotFound desc = could not find container \"91b0c6e50085a7be0924e5a54ef006110a7f13556fe47f7f55124fd6b2f0bc36\": container with ID starting with 91b0c6e50085a7be0924e5a54ef006110a7f13556fe47f7f55124fd6b2f0bc36 not found: ID does not exist" Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.337102 4895 scope.go:117] 
"RemoveContainer" containerID="5f330d728eecd672ecb554744a51ebfb2c104441d3a85fe0c8a6fc3446c37e4a" Dec 02 07:36:15 crc kubenswrapper[4895]: E1202 07:36:15.337309 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f330d728eecd672ecb554744a51ebfb2c104441d3a85fe0c8a6fc3446c37e4a\": container with ID starting with 5f330d728eecd672ecb554744a51ebfb2c104441d3a85fe0c8a6fc3446c37e4a not found: ID does not exist" containerID="5f330d728eecd672ecb554744a51ebfb2c104441d3a85fe0c8a6fc3446c37e4a" Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.337327 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f330d728eecd672ecb554744a51ebfb2c104441d3a85fe0c8a6fc3446c37e4a"} err="failed to get container status \"5f330d728eecd672ecb554744a51ebfb2c104441d3a85fe0c8a6fc3446c37e4a\": rpc error: code = NotFound desc = could not find container \"5f330d728eecd672ecb554744a51ebfb2c104441d3a85fe0c8a6fc3446c37e4a\": container with ID starting with 5f330d728eecd672ecb554744a51ebfb2c104441d3a85fe0c8a6fc3446c37e4a not found: ID does not exist" Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.337341 4895 scope.go:117] "RemoveContainer" containerID="ca732cac495c410d89da4ce1476acd20905716cbb2baeedac9c52b654ff58573" Dec 02 07:36:15 crc kubenswrapper[4895]: E1202 07:36:15.337535 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca732cac495c410d89da4ce1476acd20905716cbb2baeedac9c52b654ff58573\": container with ID starting with ca732cac495c410d89da4ce1476acd20905716cbb2baeedac9c52b654ff58573 not found: ID does not exist" containerID="ca732cac495c410d89da4ce1476acd20905716cbb2baeedac9c52b654ff58573" Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.337554 4895 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ca732cac495c410d89da4ce1476acd20905716cbb2baeedac9c52b654ff58573"} err="failed to get container status \"ca732cac495c410d89da4ce1476acd20905716cbb2baeedac9c52b654ff58573\": rpc error: code = NotFound desc = could not find container \"ca732cac495c410d89da4ce1476acd20905716cbb2baeedac9c52b654ff58573\": container with ID starting with ca732cac495c410d89da4ce1476acd20905716cbb2baeedac9c52b654ff58573 not found: ID does not exist" Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.337566 4895 scope.go:117] "RemoveContainer" containerID="328fa773c92ef0a965065412d4141c0b41ffe38d21ed169386a9d30d05f37f28" Dec 02 07:36:15 crc kubenswrapper[4895]: E1202 07:36:15.337760 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"328fa773c92ef0a965065412d4141c0b41ffe38d21ed169386a9d30d05f37f28\": container with ID starting with 328fa773c92ef0a965065412d4141c0b41ffe38d21ed169386a9d30d05f37f28 not found: ID does not exist" containerID="328fa773c92ef0a965065412d4141c0b41ffe38d21ed169386a9d30d05f37f28" Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.337784 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"328fa773c92ef0a965065412d4141c0b41ffe38d21ed169386a9d30d05f37f28"} err="failed to get container status \"328fa773c92ef0a965065412d4141c0b41ffe38d21ed169386a9d30d05f37f28\": rpc error: code = NotFound desc = could not find container \"328fa773c92ef0a965065412d4141c0b41ffe38d21ed169386a9d30d05f37f28\": container with ID starting with 328fa773c92ef0a965065412d4141c0b41ffe38d21ed169386a9d30d05f37f28 not found: ID does not exist" Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.337799 4895 scope.go:117] "RemoveContainer" containerID="cd338136e3f94adb272a6ccce660c3f7a51db40e100162b0af875d6814db30d1" Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.337968 4895 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"cd338136e3f94adb272a6ccce660c3f7a51db40e100162b0af875d6814db30d1"} err="failed to get container status \"cd338136e3f94adb272a6ccce660c3f7a51db40e100162b0af875d6814db30d1\": rpc error: code = NotFound desc = could not find container \"cd338136e3f94adb272a6ccce660c3f7a51db40e100162b0af875d6814db30d1\": container with ID starting with cd338136e3f94adb272a6ccce660c3f7a51db40e100162b0af875d6814db30d1 not found: ID does not exist" Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.337984 4895 scope.go:117] "RemoveContainer" containerID="52bad85cc5d71d65071ede7a4c939c8f773468925d9bb31b09a37641d870c228" Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.338228 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52bad85cc5d71d65071ede7a4c939c8f773468925d9bb31b09a37641d870c228"} err="failed to get container status \"52bad85cc5d71d65071ede7a4c939c8f773468925d9bb31b09a37641d870c228\": rpc error: code = NotFound desc = could not find container \"52bad85cc5d71d65071ede7a4c939c8f773468925d9bb31b09a37641d870c228\": container with ID starting with 52bad85cc5d71d65071ede7a4c939c8f773468925d9bb31b09a37641d870c228 not found: ID does not exist" Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.338243 4895 scope.go:117] "RemoveContainer" containerID="d6447684fe6189e38c9caad6dfe1384b7af01d50e2ef5f889c7b2874ba092306" Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.338459 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6447684fe6189e38c9caad6dfe1384b7af01d50e2ef5f889c7b2874ba092306"} err="failed to get container status \"d6447684fe6189e38c9caad6dfe1384b7af01d50e2ef5f889c7b2874ba092306\": rpc error: code = NotFound desc = could not find container \"d6447684fe6189e38c9caad6dfe1384b7af01d50e2ef5f889c7b2874ba092306\": container with ID starting with d6447684fe6189e38c9caad6dfe1384b7af01d50e2ef5f889c7b2874ba092306 not 
found: ID does not exist" Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.338475 4895 scope.go:117] "RemoveContainer" containerID="80dca705ddb94ba799eb72bfc23b0c3878cceca48d6d10f1c7f8eb0e4beed40e" Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.338684 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80dca705ddb94ba799eb72bfc23b0c3878cceca48d6d10f1c7f8eb0e4beed40e"} err="failed to get container status \"80dca705ddb94ba799eb72bfc23b0c3878cceca48d6d10f1c7f8eb0e4beed40e\": rpc error: code = NotFound desc = could not find container \"80dca705ddb94ba799eb72bfc23b0c3878cceca48d6d10f1c7f8eb0e4beed40e\": container with ID starting with 80dca705ddb94ba799eb72bfc23b0c3878cceca48d6d10f1c7f8eb0e4beed40e not found: ID does not exist" Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.338701 4895 scope.go:117] "RemoveContainer" containerID="f54ae324450e73bcbf2c408794171a92bbb66c24d05f29584ac2c247090c1273" Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.338938 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f54ae324450e73bcbf2c408794171a92bbb66c24d05f29584ac2c247090c1273"} err="failed to get container status \"f54ae324450e73bcbf2c408794171a92bbb66c24d05f29584ac2c247090c1273\": rpc error: code = NotFound desc = could not find container \"f54ae324450e73bcbf2c408794171a92bbb66c24d05f29584ac2c247090c1273\": container with ID starting with f54ae324450e73bcbf2c408794171a92bbb66c24d05f29584ac2c247090c1273 not found: ID does not exist" Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.338954 4895 scope.go:117] "RemoveContainer" containerID="5463b077f0bc8b5d38e27854ece6a71f74f7769f4552c32eefe16854993f290f" Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.339129 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5463b077f0bc8b5d38e27854ece6a71f74f7769f4552c32eefe16854993f290f"} err="failed to get 
container status \"5463b077f0bc8b5d38e27854ece6a71f74f7769f4552c32eefe16854993f290f\": rpc error: code = NotFound desc = could not find container \"5463b077f0bc8b5d38e27854ece6a71f74f7769f4552c32eefe16854993f290f\": container with ID starting with 5463b077f0bc8b5d38e27854ece6a71f74f7769f4552c32eefe16854993f290f not found: ID does not exist" Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.339145 4895 scope.go:117] "RemoveContainer" containerID="91b0c6e50085a7be0924e5a54ef006110a7f13556fe47f7f55124fd6b2f0bc36" Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.339343 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91b0c6e50085a7be0924e5a54ef006110a7f13556fe47f7f55124fd6b2f0bc36"} err="failed to get container status \"91b0c6e50085a7be0924e5a54ef006110a7f13556fe47f7f55124fd6b2f0bc36\": rpc error: code = NotFound desc = could not find container \"91b0c6e50085a7be0924e5a54ef006110a7f13556fe47f7f55124fd6b2f0bc36\": container with ID starting with 91b0c6e50085a7be0924e5a54ef006110a7f13556fe47f7f55124fd6b2f0bc36 not found: ID does not exist" Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.339361 4895 scope.go:117] "RemoveContainer" containerID="5f330d728eecd672ecb554744a51ebfb2c104441d3a85fe0c8a6fc3446c37e4a" Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.339574 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f330d728eecd672ecb554744a51ebfb2c104441d3a85fe0c8a6fc3446c37e4a"} err="failed to get container status \"5f330d728eecd672ecb554744a51ebfb2c104441d3a85fe0c8a6fc3446c37e4a\": rpc error: code = NotFound desc = could not find container \"5f330d728eecd672ecb554744a51ebfb2c104441d3a85fe0c8a6fc3446c37e4a\": container with ID starting with 5f330d728eecd672ecb554744a51ebfb2c104441d3a85fe0c8a6fc3446c37e4a not found: ID does not exist" Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.339590 4895 scope.go:117] "RemoveContainer" 
containerID="ca732cac495c410d89da4ce1476acd20905716cbb2baeedac9c52b654ff58573" Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.339811 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca732cac495c410d89da4ce1476acd20905716cbb2baeedac9c52b654ff58573"} err="failed to get container status \"ca732cac495c410d89da4ce1476acd20905716cbb2baeedac9c52b654ff58573\": rpc error: code = NotFound desc = could not find container \"ca732cac495c410d89da4ce1476acd20905716cbb2baeedac9c52b654ff58573\": container with ID starting with ca732cac495c410d89da4ce1476acd20905716cbb2baeedac9c52b654ff58573 not found: ID does not exist" Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.339828 4895 scope.go:117] "RemoveContainer" containerID="328fa773c92ef0a965065412d4141c0b41ffe38d21ed169386a9d30d05f37f28" Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.340103 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"328fa773c92ef0a965065412d4141c0b41ffe38d21ed169386a9d30d05f37f28"} err="failed to get container status \"328fa773c92ef0a965065412d4141c0b41ffe38d21ed169386a9d30d05f37f28\": rpc error: code = NotFound desc = could not find container \"328fa773c92ef0a965065412d4141c0b41ffe38d21ed169386a9d30d05f37f28\": container with ID starting with 328fa773c92ef0a965065412d4141c0b41ffe38d21ed169386a9d30d05f37f28 not found: ID does not exist" Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.340122 4895 scope.go:117] "RemoveContainer" containerID="cd338136e3f94adb272a6ccce660c3f7a51db40e100162b0af875d6814db30d1" Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.340437 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd338136e3f94adb272a6ccce660c3f7a51db40e100162b0af875d6814db30d1"} err="failed to get container status \"cd338136e3f94adb272a6ccce660c3f7a51db40e100162b0af875d6814db30d1\": rpc error: code = NotFound desc = could 
not find container \"cd338136e3f94adb272a6ccce660c3f7a51db40e100162b0af875d6814db30d1\": container with ID starting with cd338136e3f94adb272a6ccce660c3f7a51db40e100162b0af875d6814db30d1 not found: ID does not exist" Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.340452 4895 scope.go:117] "RemoveContainer" containerID="52bad85cc5d71d65071ede7a4c939c8f773468925d9bb31b09a37641d870c228" Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.340667 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52bad85cc5d71d65071ede7a4c939c8f773468925d9bb31b09a37641d870c228"} err="failed to get container status \"52bad85cc5d71d65071ede7a4c939c8f773468925d9bb31b09a37641d870c228\": rpc error: code = NotFound desc = could not find container \"52bad85cc5d71d65071ede7a4c939c8f773468925d9bb31b09a37641d870c228\": container with ID starting with 52bad85cc5d71d65071ede7a4c939c8f773468925d9bb31b09a37641d870c228 not found: ID does not exist" Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.340682 4895 scope.go:117] "RemoveContainer" containerID="d6447684fe6189e38c9caad6dfe1384b7af01d50e2ef5f889c7b2874ba092306" Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.340905 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6447684fe6189e38c9caad6dfe1384b7af01d50e2ef5f889c7b2874ba092306"} err="failed to get container status \"d6447684fe6189e38c9caad6dfe1384b7af01d50e2ef5f889c7b2874ba092306\": rpc error: code = NotFound desc = could not find container \"d6447684fe6189e38c9caad6dfe1384b7af01d50e2ef5f889c7b2874ba092306\": container with ID starting with d6447684fe6189e38c9caad6dfe1384b7af01d50e2ef5f889c7b2874ba092306 not found: ID does not exist" Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.340923 4895 scope.go:117] "RemoveContainer" containerID="80dca705ddb94ba799eb72bfc23b0c3878cceca48d6d10f1c7f8eb0e4beed40e" Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 
07:36:15.341810 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80dca705ddb94ba799eb72bfc23b0c3878cceca48d6d10f1c7f8eb0e4beed40e"} err="failed to get container status \"80dca705ddb94ba799eb72bfc23b0c3878cceca48d6d10f1c7f8eb0e4beed40e\": rpc error: code = NotFound desc = could not find container \"80dca705ddb94ba799eb72bfc23b0c3878cceca48d6d10f1c7f8eb0e4beed40e\": container with ID starting with 80dca705ddb94ba799eb72bfc23b0c3878cceca48d6d10f1c7f8eb0e4beed40e not found: ID does not exist" Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.341829 4895 scope.go:117] "RemoveContainer" containerID="f54ae324450e73bcbf2c408794171a92bbb66c24d05f29584ac2c247090c1273" Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.342045 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f54ae324450e73bcbf2c408794171a92bbb66c24d05f29584ac2c247090c1273"} err="failed to get container status \"f54ae324450e73bcbf2c408794171a92bbb66c24d05f29584ac2c247090c1273\": rpc error: code = NotFound desc = could not find container \"f54ae324450e73bcbf2c408794171a92bbb66c24d05f29584ac2c247090c1273\": container with ID starting with f54ae324450e73bcbf2c408794171a92bbb66c24d05f29584ac2c247090c1273 not found: ID does not exist" Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.342063 4895 scope.go:117] "RemoveContainer" containerID="5463b077f0bc8b5d38e27854ece6a71f74f7769f4552c32eefe16854993f290f" Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.342303 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5463b077f0bc8b5d38e27854ece6a71f74f7769f4552c32eefe16854993f290f"} err="failed to get container status \"5463b077f0bc8b5d38e27854ece6a71f74f7769f4552c32eefe16854993f290f\": rpc error: code = NotFound desc = could not find container \"5463b077f0bc8b5d38e27854ece6a71f74f7769f4552c32eefe16854993f290f\": container with ID starting with 
5463b077f0bc8b5d38e27854ece6a71f74f7769f4552c32eefe16854993f290f not found: ID does not exist" Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.342329 4895 scope.go:117] "RemoveContainer" containerID="91b0c6e50085a7be0924e5a54ef006110a7f13556fe47f7f55124fd6b2f0bc36" Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.342570 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91b0c6e50085a7be0924e5a54ef006110a7f13556fe47f7f55124fd6b2f0bc36"} err="failed to get container status \"91b0c6e50085a7be0924e5a54ef006110a7f13556fe47f7f55124fd6b2f0bc36\": rpc error: code = NotFound desc = could not find container \"91b0c6e50085a7be0924e5a54ef006110a7f13556fe47f7f55124fd6b2f0bc36\": container with ID starting with 91b0c6e50085a7be0924e5a54ef006110a7f13556fe47f7f55124fd6b2f0bc36 not found: ID does not exist" Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.342599 4895 scope.go:117] "RemoveContainer" containerID="5f330d728eecd672ecb554744a51ebfb2c104441d3a85fe0c8a6fc3446c37e4a" Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.342810 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f330d728eecd672ecb554744a51ebfb2c104441d3a85fe0c8a6fc3446c37e4a"} err="failed to get container status \"5f330d728eecd672ecb554744a51ebfb2c104441d3a85fe0c8a6fc3446c37e4a\": rpc error: code = NotFound desc = could not find container \"5f330d728eecd672ecb554744a51ebfb2c104441d3a85fe0c8a6fc3446c37e4a\": container with ID starting with 5f330d728eecd672ecb554744a51ebfb2c104441d3a85fe0c8a6fc3446c37e4a not found: ID does not exist" Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.342828 4895 scope.go:117] "RemoveContainer" containerID="ca732cac495c410d89da4ce1476acd20905716cbb2baeedac9c52b654ff58573" Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.343113 4895 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ca732cac495c410d89da4ce1476acd20905716cbb2baeedac9c52b654ff58573"} err="failed to get container status \"ca732cac495c410d89da4ce1476acd20905716cbb2baeedac9c52b654ff58573\": rpc error: code = NotFound desc = could not find container \"ca732cac495c410d89da4ce1476acd20905716cbb2baeedac9c52b654ff58573\": container with ID starting with ca732cac495c410d89da4ce1476acd20905716cbb2baeedac9c52b654ff58573 not found: ID does not exist" Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.343138 4895 scope.go:117] "RemoveContainer" containerID="328fa773c92ef0a965065412d4141c0b41ffe38d21ed169386a9d30d05f37f28" Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.343345 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"328fa773c92ef0a965065412d4141c0b41ffe38d21ed169386a9d30d05f37f28"} err="failed to get container status \"328fa773c92ef0a965065412d4141c0b41ffe38d21ed169386a9d30d05f37f28\": rpc error: code = NotFound desc = could not find container \"328fa773c92ef0a965065412d4141c0b41ffe38d21ed169386a9d30d05f37f28\": container with ID starting with 328fa773c92ef0a965065412d4141c0b41ffe38d21ed169386a9d30d05f37f28 not found: ID does not exist" Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.343369 4895 scope.go:117] "RemoveContainer" containerID="cd338136e3f94adb272a6ccce660c3f7a51db40e100162b0af875d6814db30d1" Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.343676 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd338136e3f94adb272a6ccce660c3f7a51db40e100162b0af875d6814db30d1"} err="failed to get container status \"cd338136e3f94adb272a6ccce660c3f7a51db40e100162b0af875d6814db30d1\": rpc error: code = NotFound desc = could not find container \"cd338136e3f94adb272a6ccce660c3f7a51db40e100162b0af875d6814db30d1\": container with ID starting with cd338136e3f94adb272a6ccce660c3f7a51db40e100162b0af875d6814db30d1 not found: ID does not 
exist" Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.343704 4895 scope.go:117] "RemoveContainer" containerID="52bad85cc5d71d65071ede7a4c939c8f773468925d9bb31b09a37641d870c228" Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.343998 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52bad85cc5d71d65071ede7a4c939c8f773468925d9bb31b09a37641d870c228"} err="failed to get container status \"52bad85cc5d71d65071ede7a4c939c8f773468925d9bb31b09a37641d870c228\": rpc error: code = NotFound desc = could not find container \"52bad85cc5d71d65071ede7a4c939c8f773468925d9bb31b09a37641d870c228\": container with ID starting with 52bad85cc5d71d65071ede7a4c939c8f773468925d9bb31b09a37641d870c228 not found: ID does not exist" Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.344023 4895 scope.go:117] "RemoveContainer" containerID="d6447684fe6189e38c9caad6dfe1384b7af01d50e2ef5f889c7b2874ba092306" Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.344307 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6447684fe6189e38c9caad6dfe1384b7af01d50e2ef5f889c7b2874ba092306"} err="failed to get container status \"d6447684fe6189e38c9caad6dfe1384b7af01d50e2ef5f889c7b2874ba092306\": rpc error: code = NotFound desc = could not find container \"d6447684fe6189e38c9caad6dfe1384b7af01d50e2ef5f889c7b2874ba092306\": container with ID starting with d6447684fe6189e38c9caad6dfe1384b7af01d50e2ef5f889c7b2874ba092306 not found: ID does not exist" Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.344331 4895 scope.go:117] "RemoveContainer" containerID="80dca705ddb94ba799eb72bfc23b0c3878cceca48d6d10f1c7f8eb0e4beed40e" Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.344642 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80dca705ddb94ba799eb72bfc23b0c3878cceca48d6d10f1c7f8eb0e4beed40e"} err="failed to get container status 
\"80dca705ddb94ba799eb72bfc23b0c3878cceca48d6d10f1c7f8eb0e4beed40e\": rpc error: code = NotFound desc = could not find container \"80dca705ddb94ba799eb72bfc23b0c3878cceca48d6d10f1c7f8eb0e4beed40e\": container with ID starting with 80dca705ddb94ba799eb72bfc23b0c3878cceca48d6d10f1c7f8eb0e4beed40e not found: ID does not exist" Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.344667 4895 scope.go:117] "RemoveContainer" containerID="f54ae324450e73bcbf2c408794171a92bbb66c24d05f29584ac2c247090c1273" Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.345046 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f54ae324450e73bcbf2c408794171a92bbb66c24d05f29584ac2c247090c1273"} err="failed to get container status \"f54ae324450e73bcbf2c408794171a92bbb66c24d05f29584ac2c247090c1273\": rpc error: code = NotFound desc = could not find container \"f54ae324450e73bcbf2c408794171a92bbb66c24d05f29584ac2c247090c1273\": container with ID starting with f54ae324450e73bcbf2c408794171a92bbb66c24d05f29584ac2c247090c1273 not found: ID does not exist" Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.345098 4895 scope.go:117] "RemoveContainer" containerID="5463b077f0bc8b5d38e27854ece6a71f74f7769f4552c32eefe16854993f290f" Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.345415 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5463b077f0bc8b5d38e27854ece6a71f74f7769f4552c32eefe16854993f290f"} err="failed to get container status \"5463b077f0bc8b5d38e27854ece6a71f74f7769f4552c32eefe16854993f290f\": rpc error: code = NotFound desc = could not find container \"5463b077f0bc8b5d38e27854ece6a71f74f7769f4552c32eefe16854993f290f\": container with ID starting with 5463b077f0bc8b5d38e27854ece6a71f74f7769f4552c32eefe16854993f290f not found: ID does not exist" Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.345436 4895 scope.go:117] "RemoveContainer" 
containerID="91b0c6e50085a7be0924e5a54ef006110a7f13556fe47f7f55124fd6b2f0bc36" Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.345848 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91b0c6e50085a7be0924e5a54ef006110a7f13556fe47f7f55124fd6b2f0bc36"} err="failed to get container status \"91b0c6e50085a7be0924e5a54ef006110a7f13556fe47f7f55124fd6b2f0bc36\": rpc error: code = NotFound desc = could not find container \"91b0c6e50085a7be0924e5a54ef006110a7f13556fe47f7f55124fd6b2f0bc36\": container with ID starting with 91b0c6e50085a7be0924e5a54ef006110a7f13556fe47f7f55124fd6b2f0bc36 not found: ID does not exist" Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.345901 4895 scope.go:117] "RemoveContainer" containerID="5f330d728eecd672ecb554744a51ebfb2c104441d3a85fe0c8a6fc3446c37e4a" Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.346323 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f330d728eecd672ecb554744a51ebfb2c104441d3a85fe0c8a6fc3446c37e4a"} err="failed to get container status \"5f330d728eecd672ecb554744a51ebfb2c104441d3a85fe0c8a6fc3446c37e4a\": rpc error: code = NotFound desc = could not find container \"5f330d728eecd672ecb554744a51ebfb2c104441d3a85fe0c8a6fc3446c37e4a\": container with ID starting with 5f330d728eecd672ecb554744a51ebfb2c104441d3a85fe0c8a6fc3446c37e4a not found: ID does not exist" Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.346341 4895 scope.go:117] "RemoveContainer" containerID="ca732cac495c410d89da4ce1476acd20905716cbb2baeedac9c52b654ff58573" Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.347360 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca732cac495c410d89da4ce1476acd20905716cbb2baeedac9c52b654ff58573"} err="failed to get container status \"ca732cac495c410d89da4ce1476acd20905716cbb2baeedac9c52b654ff58573\": rpc error: code = NotFound desc = could 
not find container \"ca732cac495c410d89da4ce1476acd20905716cbb2baeedac9c52b654ff58573\": container with ID starting with ca732cac495c410d89da4ce1476acd20905716cbb2baeedac9c52b654ff58573 not found: ID does not exist" Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.347387 4895 scope.go:117] "RemoveContainer" containerID="328fa773c92ef0a965065412d4141c0b41ffe38d21ed169386a9d30d05f37f28" Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.347634 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"328fa773c92ef0a965065412d4141c0b41ffe38d21ed169386a9d30d05f37f28"} err="failed to get container status \"328fa773c92ef0a965065412d4141c0b41ffe38d21ed169386a9d30d05f37f28\": rpc error: code = NotFound desc = could not find container \"328fa773c92ef0a965065412d4141c0b41ffe38d21ed169386a9d30d05f37f28\": container with ID starting with 328fa773c92ef0a965065412d4141c0b41ffe38d21ed169386a9d30d05f37f28 not found: ID does not exist" Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.347659 4895 scope.go:117] "RemoveContainer" containerID="cd338136e3f94adb272a6ccce660c3f7a51db40e100162b0af875d6814db30d1" Dec 02 07:36:15 crc kubenswrapper[4895]: I1202 07:36:15.347981 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd338136e3f94adb272a6ccce660c3f7a51db40e100162b0af875d6814db30d1"} err="failed to get container status \"cd338136e3f94adb272a6ccce660c3f7a51db40e100162b0af875d6814db30d1\": rpc error: code = NotFound desc = could not find container \"cd338136e3f94adb272a6ccce660c3f7a51db40e100162b0af875d6814db30d1\": container with ID starting with cd338136e3f94adb272a6ccce660c3f7a51db40e100162b0af875d6814db30d1 not found: ID does not exist" Dec 02 07:36:16 crc kubenswrapper[4895]: I1202 07:36:16.044208 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9t5xx" 
event={"ID":"2d53304e-5b74-4f0a-b9fb-e0bb2c61eee3","Type":"ContainerStarted","Data":"4962da384f395c9b97a29dd81ccbb4bc63bd535ff70f0b0e9b028294a4f49125"} Dec 02 07:36:16 crc kubenswrapper[4895]: I1202 07:36:16.044766 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9t5xx" event={"ID":"2d53304e-5b74-4f0a-b9fb-e0bb2c61eee3","Type":"ContainerStarted","Data":"901820e78c08b4f3dc1cd0adfe757f51fbaedd3c2e8002a216bfe5568c8e0a63"} Dec 02 07:36:16 crc kubenswrapper[4895]: I1202 07:36:16.044786 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9t5xx" event={"ID":"2d53304e-5b74-4f0a-b9fb-e0bb2c61eee3","Type":"ContainerStarted","Data":"6ce30c717e3dcede09b908802dd3bf4598ce332f5305dc5c1a980d29fa2de11d"} Dec 02 07:36:16 crc kubenswrapper[4895]: I1202 07:36:16.044799 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9t5xx" event={"ID":"2d53304e-5b74-4f0a-b9fb-e0bb2c61eee3","Type":"ContainerStarted","Data":"603039cb6a7719440162801dfba3a4d0ca7497ab50b7ef8da31da8264ee42e65"} Dec 02 07:36:16 crc kubenswrapper[4895]: I1202 07:36:16.044811 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9t5xx" event={"ID":"2d53304e-5b74-4f0a-b9fb-e0bb2c61eee3","Type":"ContainerStarted","Data":"0e9e6fcd96bf79c9fa897d249d0cac4510660bbe85f09d0d9a7b7ed9977234c0"} Dec 02 07:36:16 crc kubenswrapper[4895]: I1202 07:36:16.044822 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9t5xx" event={"ID":"2d53304e-5b74-4f0a-b9fb-e0bb2c61eee3","Type":"ContainerStarted","Data":"d3ec99cfa4e61abacc522c8a8e681ff2bf0a9eeaaa2e56eed363a14edd8a9812"} Dec 02 07:36:18 crc kubenswrapper[4895]: I1202 07:36:18.058515 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9t5xx" 
event={"ID":"2d53304e-5b74-4f0a-b9fb-e0bb2c61eee3","Type":"ContainerStarted","Data":"53cde06afe8cc8d6afe3934d074811b2ea5c103fa78a8263a212a794a74bfc0c"} Dec 02 07:36:21 crc kubenswrapper[4895]: I1202 07:36:21.079825 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9t5xx" event={"ID":"2d53304e-5b74-4f0a-b9fb-e0bb2c61eee3","Type":"ContainerStarted","Data":"562b10179bdd788353af7561f0cfda2360ffaee2e6781a62543f448c3927798c"} Dec 02 07:36:21 crc kubenswrapper[4895]: I1202 07:36:21.080409 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-9t5xx" Dec 02 07:36:21 crc kubenswrapper[4895]: I1202 07:36:21.080425 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-9t5xx" Dec 02 07:36:21 crc kubenswrapper[4895]: I1202 07:36:21.080435 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-9t5xx" Dec 02 07:36:21 crc kubenswrapper[4895]: I1202 07:36:21.107332 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-9t5xx" Dec 02 07:36:21 crc kubenswrapper[4895]: I1202 07:36:21.112723 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-9t5xx" podStartSLOduration=7.112699411 podStartE2EDuration="7.112699411s" podCreationTimestamp="2025-12-02 07:36:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:36:21.107844741 +0000 UTC m=+792.278704364" watchObservedRunningTime="2025-12-02 07:36:21.112699411 +0000 UTC m=+792.283559034" Dec 02 07:36:21 crc kubenswrapper[4895]: I1202 07:36:21.113615 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-9t5xx" Dec 02 07:36:21 crc 
kubenswrapper[4895]: I1202 07:36:21.962709 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-xbsj8"] Dec 02 07:36:21 crc kubenswrapper[4895]: I1202 07:36:21.964228 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-xbsj8" Dec 02 07:36:21 crc kubenswrapper[4895]: I1202 07:36:21.966327 4895 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-pcv42" Dec 02 07:36:21 crc kubenswrapper[4895]: I1202 07:36:21.966478 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Dec 02 07:36:21 crc kubenswrapper[4895]: I1202 07:36:21.966686 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Dec 02 07:36:21 crc kubenswrapper[4895]: I1202 07:36:21.967230 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Dec 02 07:36:21 crc kubenswrapper[4895]: I1202 07:36:21.978272 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-xbsj8"] Dec 02 07:36:22 crc kubenswrapper[4895]: I1202 07:36:22.054108 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/1e33d18d-577d-4610-9653-d3acf0bd9578-crc-storage\") pod \"crc-storage-crc-xbsj8\" (UID: \"1e33d18d-577d-4610-9653-d3acf0bd9578\") " pod="crc-storage/crc-storage-crc-xbsj8" Dec 02 07:36:22 crc kubenswrapper[4895]: I1202 07:36:22.054173 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/1e33d18d-577d-4610-9653-d3acf0bd9578-node-mnt\") pod \"crc-storage-crc-xbsj8\" (UID: \"1e33d18d-577d-4610-9653-d3acf0bd9578\") " pod="crc-storage/crc-storage-crc-xbsj8" Dec 02 07:36:22 crc kubenswrapper[4895]: I1202 07:36:22.054246 4895 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5mml\" (UniqueName: \"kubernetes.io/projected/1e33d18d-577d-4610-9653-d3acf0bd9578-kube-api-access-v5mml\") pod \"crc-storage-crc-xbsj8\" (UID: \"1e33d18d-577d-4610-9653-d3acf0bd9578\") " pod="crc-storage/crc-storage-crc-xbsj8" Dec 02 07:36:22 crc kubenswrapper[4895]: I1202 07:36:22.155084 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v5mml\" (UniqueName: \"kubernetes.io/projected/1e33d18d-577d-4610-9653-d3acf0bd9578-kube-api-access-v5mml\") pod \"crc-storage-crc-xbsj8\" (UID: \"1e33d18d-577d-4610-9653-d3acf0bd9578\") " pod="crc-storage/crc-storage-crc-xbsj8" Dec 02 07:36:22 crc kubenswrapper[4895]: I1202 07:36:22.156086 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/1e33d18d-577d-4610-9653-d3acf0bd9578-crc-storage\") pod \"crc-storage-crc-xbsj8\" (UID: \"1e33d18d-577d-4610-9653-d3acf0bd9578\") " pod="crc-storage/crc-storage-crc-xbsj8" Dec 02 07:36:22 crc kubenswrapper[4895]: I1202 07:36:22.156224 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/1e33d18d-577d-4610-9653-d3acf0bd9578-node-mnt\") pod \"crc-storage-crc-xbsj8\" (UID: \"1e33d18d-577d-4610-9653-d3acf0bd9578\") " pod="crc-storage/crc-storage-crc-xbsj8" Dec 02 07:36:22 crc kubenswrapper[4895]: I1202 07:36:22.156530 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/1e33d18d-577d-4610-9653-d3acf0bd9578-node-mnt\") pod \"crc-storage-crc-xbsj8\" (UID: \"1e33d18d-577d-4610-9653-d3acf0bd9578\") " pod="crc-storage/crc-storage-crc-xbsj8" Dec 02 07:36:22 crc kubenswrapper[4895]: I1202 07:36:22.156947 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: 
\"kubernetes.io/configmap/1e33d18d-577d-4610-9653-d3acf0bd9578-crc-storage\") pod \"crc-storage-crc-xbsj8\" (UID: \"1e33d18d-577d-4610-9653-d3acf0bd9578\") " pod="crc-storage/crc-storage-crc-xbsj8" Dec 02 07:36:22 crc kubenswrapper[4895]: I1202 07:36:22.176672 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5mml\" (UniqueName: \"kubernetes.io/projected/1e33d18d-577d-4610-9653-d3acf0bd9578-kube-api-access-v5mml\") pod \"crc-storage-crc-xbsj8\" (UID: \"1e33d18d-577d-4610-9653-d3acf0bd9578\") " pod="crc-storage/crc-storage-crc-xbsj8" Dec 02 07:36:22 crc kubenswrapper[4895]: I1202 07:36:22.278469 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-xbsj8" Dec 02 07:36:22 crc kubenswrapper[4895]: E1202 07:36:22.304256 4895 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-xbsj8_crc-storage_1e33d18d-577d-4610-9653-d3acf0bd9578_0(9678c27b6758dd530df7797e3b15dff36180b08b02f1f2cc9d06ed7ccf11511e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 02 07:36:22 crc kubenswrapper[4895]: E1202 07:36:22.304337 4895 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-xbsj8_crc-storage_1e33d18d-577d-4610-9653-d3acf0bd9578_0(9678c27b6758dd530df7797e3b15dff36180b08b02f1f2cc9d06ed7ccf11511e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="crc-storage/crc-storage-crc-xbsj8" Dec 02 07:36:22 crc kubenswrapper[4895]: E1202 07:36:22.304360 4895 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-xbsj8_crc-storage_1e33d18d-577d-4610-9653-d3acf0bd9578_0(9678c27b6758dd530df7797e3b15dff36180b08b02f1f2cc9d06ed7ccf11511e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-xbsj8" Dec 02 07:36:22 crc kubenswrapper[4895]: E1202 07:36:22.304412 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-xbsj8_crc-storage(1e33d18d-577d-4610-9653-d3acf0bd9578)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-xbsj8_crc-storage(1e33d18d-577d-4610-9653-d3acf0bd9578)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-xbsj8_crc-storage_1e33d18d-577d-4610-9653-d3acf0bd9578_0(9678c27b6758dd530df7797e3b15dff36180b08b02f1f2cc9d06ed7ccf11511e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="crc-storage/crc-storage-crc-xbsj8" podUID="1e33d18d-577d-4610-9653-d3acf0bd9578" Dec 02 07:36:23 crc kubenswrapper[4895]: I1202 07:36:23.092198 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-xbsj8" Dec 02 07:36:23 crc kubenswrapper[4895]: I1202 07:36:23.093209 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-xbsj8" Dec 02 07:36:23 crc kubenswrapper[4895]: E1202 07:36:23.123684 4895 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-xbsj8_crc-storage_1e33d18d-577d-4610-9653-d3acf0bd9578_0(c8edf5240072a2796695d0667412031aba9c6b4e76f6e8b80044306fbea29e19): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 02 07:36:23 crc kubenswrapper[4895]: E1202 07:36:23.123772 4895 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-xbsj8_crc-storage_1e33d18d-577d-4610-9653-d3acf0bd9578_0(c8edf5240072a2796695d0667412031aba9c6b4e76f6e8b80044306fbea29e19): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-xbsj8" Dec 02 07:36:23 crc kubenswrapper[4895]: E1202 07:36:23.123800 4895 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-xbsj8_crc-storage_1e33d18d-577d-4610-9653-d3acf0bd9578_0(c8edf5240072a2796695d0667412031aba9c6b4e76f6e8b80044306fbea29e19): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="crc-storage/crc-storage-crc-xbsj8" Dec 02 07:36:23 crc kubenswrapper[4895]: E1202 07:36:23.123850 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-xbsj8_crc-storage(1e33d18d-577d-4610-9653-d3acf0bd9578)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-xbsj8_crc-storage(1e33d18d-577d-4610-9653-d3acf0bd9578)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-xbsj8_crc-storage_1e33d18d-577d-4610-9653-d3acf0bd9578_0(c8edf5240072a2796695d0667412031aba9c6b4e76f6e8b80044306fbea29e19): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="crc-storage/crc-storage-crc-xbsj8" podUID="1e33d18d-577d-4610-9653-d3acf0bd9578" Dec 02 07:36:34 crc kubenswrapper[4895]: I1202 07:36:34.141222 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-xbsj8" Dec 02 07:36:34 crc kubenswrapper[4895]: I1202 07:36:34.143824 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-xbsj8" Dec 02 07:36:34 crc kubenswrapper[4895]: I1202 07:36:34.395535 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-xbsj8"] Dec 02 07:36:34 crc kubenswrapper[4895]: W1202 07:36:34.400288 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1e33d18d_577d_4610_9653_d3acf0bd9578.slice/crio-77cbac7cc10741cf2b2c875cef7703aa0ebe682bc76c0ca65c69ae7cdd14a9b2 WatchSource:0}: Error finding container 77cbac7cc10741cf2b2c875cef7703aa0ebe682bc76c0ca65c69ae7cdd14a9b2: Status 404 returned error can't find the container with id 77cbac7cc10741cf2b2c875cef7703aa0ebe682bc76c0ca65c69ae7cdd14a9b2 Dec 02 07:36:34 crc kubenswrapper[4895]: I1202 07:36:34.402944 4895 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 02 07:36:35 crc kubenswrapper[4895]: I1202 07:36:35.158627 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-xbsj8" event={"ID":"1e33d18d-577d-4610-9653-d3acf0bd9578","Type":"ContainerStarted","Data":"77cbac7cc10741cf2b2c875cef7703aa0ebe682bc76c0ca65c69ae7cdd14a9b2"} Dec 02 07:36:36 crc kubenswrapper[4895]: I1202 07:36:36.168363 4895 generic.go:334] "Generic (PLEG): container finished" podID="1e33d18d-577d-4610-9653-d3acf0bd9578" containerID="0744137f31f20b6d47db9ff1933beb8d7bdd7b25221e6e5d3687702c5553e529" exitCode=0 Dec 02 07:36:36 crc kubenswrapper[4895]: I1202 07:36:36.168821 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-xbsj8" event={"ID":"1e33d18d-577d-4610-9653-d3acf0bd9578","Type":"ContainerDied","Data":"0744137f31f20b6d47db9ff1933beb8d7bdd7b25221e6e5d3687702c5553e529"} Dec 02 07:36:37 crc kubenswrapper[4895]: I1202 07:36:37.437780 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-xbsj8" Dec 02 07:36:37 crc kubenswrapper[4895]: I1202 07:36:37.443309 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v5mml\" (UniqueName: \"kubernetes.io/projected/1e33d18d-577d-4610-9653-d3acf0bd9578-kube-api-access-v5mml\") pod \"1e33d18d-577d-4610-9653-d3acf0bd9578\" (UID: \"1e33d18d-577d-4610-9653-d3acf0bd9578\") " Dec 02 07:36:37 crc kubenswrapper[4895]: I1202 07:36:37.443369 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/1e33d18d-577d-4610-9653-d3acf0bd9578-crc-storage\") pod \"1e33d18d-577d-4610-9653-d3acf0bd9578\" (UID: \"1e33d18d-577d-4610-9653-d3acf0bd9578\") " Dec 02 07:36:37 crc kubenswrapper[4895]: I1202 07:36:37.443417 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/1e33d18d-577d-4610-9653-d3acf0bd9578-node-mnt\") pod \"1e33d18d-577d-4610-9653-d3acf0bd9578\" (UID: \"1e33d18d-577d-4610-9653-d3acf0bd9578\") " Dec 02 07:36:37 crc kubenswrapper[4895]: I1202 07:36:37.443793 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1e33d18d-577d-4610-9653-d3acf0bd9578-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "1e33d18d-577d-4610-9653-d3acf0bd9578" (UID: "1e33d18d-577d-4610-9653-d3acf0bd9578"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 07:36:37 crc kubenswrapper[4895]: I1202 07:36:37.448737 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e33d18d-577d-4610-9653-d3acf0bd9578-kube-api-access-v5mml" (OuterVolumeSpecName: "kube-api-access-v5mml") pod "1e33d18d-577d-4610-9653-d3acf0bd9578" (UID: "1e33d18d-577d-4610-9653-d3acf0bd9578"). InnerVolumeSpecName "kube-api-access-v5mml". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:36:37 crc kubenswrapper[4895]: I1202 07:36:37.460611 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e33d18d-577d-4610-9653-d3acf0bd9578-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "1e33d18d-577d-4610-9653-d3acf0bd9578" (UID: "1e33d18d-577d-4610-9653-d3acf0bd9578"). InnerVolumeSpecName "crc-storage". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:36:37 crc kubenswrapper[4895]: I1202 07:36:37.547416 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v5mml\" (UniqueName: \"kubernetes.io/projected/1e33d18d-577d-4610-9653-d3acf0bd9578-kube-api-access-v5mml\") on node \"crc\" DevicePath \"\"" Dec 02 07:36:37 crc kubenswrapper[4895]: I1202 07:36:37.547451 4895 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/1e33d18d-577d-4610-9653-d3acf0bd9578-crc-storage\") on node \"crc\" DevicePath \"\"" Dec 02 07:36:37 crc kubenswrapper[4895]: I1202 07:36:37.547462 4895 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/1e33d18d-577d-4610-9653-d3acf0bd9578-node-mnt\") on node \"crc\" DevicePath \"\"" Dec 02 07:36:38 crc kubenswrapper[4895]: I1202 07:36:38.182467 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-xbsj8" event={"ID":"1e33d18d-577d-4610-9653-d3acf0bd9578","Type":"ContainerDied","Data":"77cbac7cc10741cf2b2c875cef7703aa0ebe682bc76c0ca65c69ae7cdd14a9b2"} Dec 02 07:36:38 crc kubenswrapper[4895]: I1202 07:36:38.182514 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="77cbac7cc10741cf2b2c875cef7703aa0ebe682bc76c0ca65c69ae7cdd14a9b2" Dec 02 07:36:38 crc kubenswrapper[4895]: I1202 07:36:38.183072 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-xbsj8" Dec 02 07:36:44 crc kubenswrapper[4895]: I1202 07:36:44.626424 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-9t5xx" Dec 02 07:36:45 crc kubenswrapper[4895]: I1202 07:36:45.459523 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fr8ch7"] Dec 02 07:36:45 crc kubenswrapper[4895]: E1202 07:36:45.460103 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e33d18d-577d-4610-9653-d3acf0bd9578" containerName="storage" Dec 02 07:36:45 crc kubenswrapper[4895]: I1202 07:36:45.460117 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e33d18d-577d-4610-9653-d3acf0bd9578" containerName="storage" Dec 02 07:36:45 crc kubenswrapper[4895]: I1202 07:36:45.460213 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e33d18d-577d-4610-9653-d3acf0bd9578" containerName="storage" Dec 02 07:36:45 crc kubenswrapper[4895]: I1202 07:36:45.460951 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fr8ch7" Dec 02 07:36:45 crc kubenswrapper[4895]: I1202 07:36:45.463048 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 02 07:36:45 crc kubenswrapper[4895]: I1202 07:36:45.473627 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fr8ch7"] Dec 02 07:36:45 crc kubenswrapper[4895]: I1202 07:36:45.548231 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3a467b75-3cba-434a-aa67-c823cb289396-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fr8ch7\" (UID: \"3a467b75-3cba-434a-aa67-c823cb289396\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fr8ch7" Dec 02 07:36:45 crc kubenswrapper[4895]: I1202 07:36:45.548290 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3a467b75-3cba-434a-aa67-c823cb289396-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fr8ch7\" (UID: \"3a467b75-3cba-434a-aa67-c823cb289396\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fr8ch7" Dec 02 07:36:45 crc kubenswrapper[4895]: I1202 07:36:45.548540 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tdhd\" (UniqueName: \"kubernetes.io/projected/3a467b75-3cba-434a-aa67-c823cb289396-kube-api-access-7tdhd\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fr8ch7\" (UID: \"3a467b75-3cba-434a-aa67-c823cb289396\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fr8ch7" Dec 02 07:36:45 crc kubenswrapper[4895]: 
I1202 07:36:45.649456 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7tdhd\" (UniqueName: \"kubernetes.io/projected/3a467b75-3cba-434a-aa67-c823cb289396-kube-api-access-7tdhd\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fr8ch7\" (UID: \"3a467b75-3cba-434a-aa67-c823cb289396\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fr8ch7" Dec 02 07:36:45 crc kubenswrapper[4895]: I1202 07:36:45.649520 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3a467b75-3cba-434a-aa67-c823cb289396-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fr8ch7\" (UID: \"3a467b75-3cba-434a-aa67-c823cb289396\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fr8ch7" Dec 02 07:36:45 crc kubenswrapper[4895]: I1202 07:36:45.649543 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3a467b75-3cba-434a-aa67-c823cb289396-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fr8ch7\" (UID: \"3a467b75-3cba-434a-aa67-c823cb289396\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fr8ch7" Dec 02 07:36:45 crc kubenswrapper[4895]: I1202 07:36:45.650118 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3a467b75-3cba-434a-aa67-c823cb289396-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fr8ch7\" (UID: \"3a467b75-3cba-434a-aa67-c823cb289396\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fr8ch7" Dec 02 07:36:45 crc kubenswrapper[4895]: I1202 07:36:45.650184 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/3a467b75-3cba-434a-aa67-c823cb289396-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fr8ch7\" (UID: \"3a467b75-3cba-434a-aa67-c823cb289396\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fr8ch7" Dec 02 07:36:45 crc kubenswrapper[4895]: I1202 07:36:45.669289 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tdhd\" (UniqueName: \"kubernetes.io/projected/3a467b75-3cba-434a-aa67-c823cb289396-kube-api-access-7tdhd\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fr8ch7\" (UID: \"3a467b75-3cba-434a-aa67-c823cb289396\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fr8ch7" Dec 02 07:36:45 crc kubenswrapper[4895]: I1202 07:36:45.790132 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fr8ch7" Dec 02 07:36:46 crc kubenswrapper[4895]: I1202 07:36:46.029843 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fr8ch7"] Dec 02 07:36:46 crc kubenswrapper[4895]: I1202 07:36:46.233138 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fr8ch7" event={"ID":"3a467b75-3cba-434a-aa67-c823cb289396","Type":"ContainerStarted","Data":"08ad1cf8ae29bd82abfeb1eeaf60c6e5c3c8fa8948f475b742a067e2d7220bf9"} Dec 02 07:36:46 crc kubenswrapper[4895]: I1202 07:36:46.233189 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fr8ch7" event={"ID":"3a467b75-3cba-434a-aa67-c823cb289396","Type":"ContainerStarted","Data":"b2ba21afd02fa4b016e4e7410a9e785fc7faf8b5f36ee713b779c0970f24320a"} Dec 02 07:36:47 crc kubenswrapper[4895]: I1202 07:36:47.241109 4895 
generic.go:334] "Generic (PLEG): container finished" podID="3a467b75-3cba-434a-aa67-c823cb289396" containerID="08ad1cf8ae29bd82abfeb1eeaf60c6e5c3c8fa8948f475b742a067e2d7220bf9" exitCode=0 Dec 02 07:36:47 crc kubenswrapper[4895]: I1202 07:36:47.241201 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fr8ch7" event={"ID":"3a467b75-3cba-434a-aa67-c823cb289396","Type":"ContainerDied","Data":"08ad1cf8ae29bd82abfeb1eeaf60c6e5c3c8fa8948f475b742a067e2d7220bf9"} Dec 02 07:36:47 crc kubenswrapper[4895]: I1202 07:36:47.821500 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-xtwld"] Dec 02 07:36:47 crc kubenswrapper[4895]: I1202 07:36:47.823003 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xtwld" Dec 02 07:36:47 crc kubenswrapper[4895]: I1202 07:36:47.842594 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xtwld"] Dec 02 07:36:47 crc kubenswrapper[4895]: I1202 07:36:47.879996 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23940dc6-2ca2-4c07-8a99-eeb2fcb48345-catalog-content\") pod \"redhat-operators-xtwld\" (UID: \"23940dc6-2ca2-4c07-8a99-eeb2fcb48345\") " pod="openshift-marketplace/redhat-operators-xtwld" Dec 02 07:36:47 crc kubenswrapper[4895]: I1202 07:36:47.880125 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cg676\" (UniqueName: \"kubernetes.io/projected/23940dc6-2ca2-4c07-8a99-eeb2fcb48345-kube-api-access-cg676\") pod \"redhat-operators-xtwld\" (UID: \"23940dc6-2ca2-4c07-8a99-eeb2fcb48345\") " pod="openshift-marketplace/redhat-operators-xtwld" Dec 02 07:36:47 crc kubenswrapper[4895]: I1202 07:36:47.880172 4895 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23940dc6-2ca2-4c07-8a99-eeb2fcb48345-utilities\") pod \"redhat-operators-xtwld\" (UID: \"23940dc6-2ca2-4c07-8a99-eeb2fcb48345\") " pod="openshift-marketplace/redhat-operators-xtwld" Dec 02 07:36:47 crc kubenswrapper[4895]: I1202 07:36:47.981101 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23940dc6-2ca2-4c07-8a99-eeb2fcb48345-utilities\") pod \"redhat-operators-xtwld\" (UID: \"23940dc6-2ca2-4c07-8a99-eeb2fcb48345\") " pod="openshift-marketplace/redhat-operators-xtwld" Dec 02 07:36:47 crc kubenswrapper[4895]: I1202 07:36:47.981196 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23940dc6-2ca2-4c07-8a99-eeb2fcb48345-catalog-content\") pod \"redhat-operators-xtwld\" (UID: \"23940dc6-2ca2-4c07-8a99-eeb2fcb48345\") " pod="openshift-marketplace/redhat-operators-xtwld" Dec 02 07:36:47 crc kubenswrapper[4895]: I1202 07:36:47.981338 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cg676\" (UniqueName: \"kubernetes.io/projected/23940dc6-2ca2-4c07-8a99-eeb2fcb48345-kube-api-access-cg676\") pod \"redhat-operators-xtwld\" (UID: \"23940dc6-2ca2-4c07-8a99-eeb2fcb48345\") " pod="openshift-marketplace/redhat-operators-xtwld" Dec 02 07:36:47 crc kubenswrapper[4895]: I1202 07:36:47.981860 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23940dc6-2ca2-4c07-8a99-eeb2fcb48345-utilities\") pod \"redhat-operators-xtwld\" (UID: \"23940dc6-2ca2-4c07-8a99-eeb2fcb48345\") " pod="openshift-marketplace/redhat-operators-xtwld" Dec 02 07:36:47 crc kubenswrapper[4895]: I1202 07:36:47.981950 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23940dc6-2ca2-4c07-8a99-eeb2fcb48345-catalog-content\") pod \"redhat-operators-xtwld\" (UID: \"23940dc6-2ca2-4c07-8a99-eeb2fcb48345\") " pod="openshift-marketplace/redhat-operators-xtwld" Dec 02 07:36:48 crc kubenswrapper[4895]: I1202 07:36:48.010410 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cg676\" (UniqueName: \"kubernetes.io/projected/23940dc6-2ca2-4c07-8a99-eeb2fcb48345-kube-api-access-cg676\") pod \"redhat-operators-xtwld\" (UID: \"23940dc6-2ca2-4c07-8a99-eeb2fcb48345\") " pod="openshift-marketplace/redhat-operators-xtwld" Dec 02 07:36:48 crc kubenswrapper[4895]: I1202 07:36:48.177322 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xtwld" Dec 02 07:36:48 crc kubenswrapper[4895]: I1202 07:36:48.406100 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xtwld"] Dec 02 07:36:49 crc kubenswrapper[4895]: I1202 07:36:49.254868 4895 generic.go:334] "Generic (PLEG): container finished" podID="3a467b75-3cba-434a-aa67-c823cb289396" containerID="3162c118a83d6ba135248e1d4613bfb56646c6c76f5a4da6790dc336e59bccb8" exitCode=0 Dec 02 07:36:49 crc kubenswrapper[4895]: I1202 07:36:49.254993 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fr8ch7" event={"ID":"3a467b75-3cba-434a-aa67-c823cb289396","Type":"ContainerDied","Data":"3162c118a83d6ba135248e1d4613bfb56646c6c76f5a4da6790dc336e59bccb8"} Dec 02 07:36:49 crc kubenswrapper[4895]: I1202 07:36:49.256977 4895 generic.go:334] "Generic (PLEG): container finished" podID="23940dc6-2ca2-4c07-8a99-eeb2fcb48345" containerID="612cd29006af7148b7fa3419037e784fd55f54eb081717d5ecee8066aa9ceb35" exitCode=0 Dec 02 07:36:49 crc kubenswrapper[4895]: I1202 07:36:49.257053 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-xtwld" event={"ID":"23940dc6-2ca2-4c07-8a99-eeb2fcb48345","Type":"ContainerDied","Data":"612cd29006af7148b7fa3419037e784fd55f54eb081717d5ecee8066aa9ceb35"} Dec 02 07:36:49 crc kubenswrapper[4895]: I1202 07:36:49.257106 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xtwld" event={"ID":"23940dc6-2ca2-4c07-8a99-eeb2fcb48345","Type":"ContainerStarted","Data":"29c5d34d66e5e083749be51df43e382aea7a0d2455b9539442d64904a33bed85"} Dec 02 07:36:50 crc kubenswrapper[4895]: I1202 07:36:50.278915 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fr8ch7" event={"ID":"3a467b75-3cba-434a-aa67-c823cb289396","Type":"ContainerDied","Data":"8744e7dea4c5694a34d9c4095fe98e2bff9934974d40ecd9ae46d63492b450b0"} Dec 02 07:36:50 crc kubenswrapper[4895]: I1202 07:36:50.278865 4895 generic.go:334] "Generic (PLEG): container finished" podID="3a467b75-3cba-434a-aa67-c823cb289396" containerID="8744e7dea4c5694a34d9c4095fe98e2bff9934974d40ecd9ae46d63492b450b0" exitCode=0 Dec 02 07:36:51 crc kubenswrapper[4895]: I1202 07:36:51.289814 4895 generic.go:334] "Generic (PLEG): container finished" podID="23940dc6-2ca2-4c07-8a99-eeb2fcb48345" containerID="a8b2c0c09bb68ace894abb2680feee050c5fdbd0631e24db69fb6880c6c70c8b" exitCode=0 Dec 02 07:36:51 crc kubenswrapper[4895]: I1202 07:36:51.289934 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xtwld" event={"ID":"23940dc6-2ca2-4c07-8a99-eeb2fcb48345","Type":"ContainerDied","Data":"a8b2c0c09bb68ace894abb2680feee050c5fdbd0631e24db69fb6880c6c70c8b"} Dec 02 07:36:51 crc kubenswrapper[4895]: I1202 07:36:51.584884 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fr8ch7" Dec 02 07:36:51 crc kubenswrapper[4895]: I1202 07:36:51.634395 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7tdhd\" (UniqueName: \"kubernetes.io/projected/3a467b75-3cba-434a-aa67-c823cb289396-kube-api-access-7tdhd\") pod \"3a467b75-3cba-434a-aa67-c823cb289396\" (UID: \"3a467b75-3cba-434a-aa67-c823cb289396\") " Dec 02 07:36:51 crc kubenswrapper[4895]: I1202 07:36:51.634615 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3a467b75-3cba-434a-aa67-c823cb289396-bundle\") pod \"3a467b75-3cba-434a-aa67-c823cb289396\" (UID: \"3a467b75-3cba-434a-aa67-c823cb289396\") " Dec 02 07:36:51 crc kubenswrapper[4895]: I1202 07:36:51.634663 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3a467b75-3cba-434a-aa67-c823cb289396-util\") pod \"3a467b75-3cba-434a-aa67-c823cb289396\" (UID: \"3a467b75-3cba-434a-aa67-c823cb289396\") " Dec 02 07:36:51 crc kubenswrapper[4895]: I1202 07:36:51.636315 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a467b75-3cba-434a-aa67-c823cb289396-bundle" (OuterVolumeSpecName: "bundle") pod "3a467b75-3cba-434a-aa67-c823cb289396" (UID: "3a467b75-3cba-434a-aa67-c823cb289396"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:36:51 crc kubenswrapper[4895]: I1202 07:36:51.643030 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a467b75-3cba-434a-aa67-c823cb289396-kube-api-access-7tdhd" (OuterVolumeSpecName: "kube-api-access-7tdhd") pod "3a467b75-3cba-434a-aa67-c823cb289396" (UID: "3a467b75-3cba-434a-aa67-c823cb289396"). InnerVolumeSpecName "kube-api-access-7tdhd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:36:51 crc kubenswrapper[4895]: I1202 07:36:51.736648 4895 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3a467b75-3cba-434a-aa67-c823cb289396-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 07:36:51 crc kubenswrapper[4895]: I1202 07:36:51.736704 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7tdhd\" (UniqueName: \"kubernetes.io/projected/3a467b75-3cba-434a-aa67-c823cb289396-kube-api-access-7tdhd\") on node \"crc\" DevicePath \"\"" Dec 02 07:36:52 crc kubenswrapper[4895]: I1202 07:36:52.115622 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a467b75-3cba-434a-aa67-c823cb289396-util" (OuterVolumeSpecName: "util") pod "3a467b75-3cba-434a-aa67-c823cb289396" (UID: "3a467b75-3cba-434a-aa67-c823cb289396"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:36:52 crc kubenswrapper[4895]: I1202 07:36:52.143628 4895 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3a467b75-3cba-434a-aa67-c823cb289396-util\") on node \"crc\" DevicePath \"\"" Dec 02 07:36:52 crc kubenswrapper[4895]: I1202 07:36:52.300153 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xtwld" event={"ID":"23940dc6-2ca2-4c07-8a99-eeb2fcb48345","Type":"ContainerStarted","Data":"bda4e043ee7a0d2c86422593da96c67789d4f56d61eb78f4508e1e5d18171b0f"} Dec 02 07:36:52 crc kubenswrapper[4895]: I1202 07:36:52.302424 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fr8ch7" event={"ID":"3a467b75-3cba-434a-aa67-c823cb289396","Type":"ContainerDied","Data":"b2ba21afd02fa4b016e4e7410a9e785fc7faf8b5f36ee713b779c0970f24320a"} Dec 02 07:36:52 crc kubenswrapper[4895]: I1202 07:36:52.302467 4895 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b2ba21afd02fa4b016e4e7410a9e785fc7faf8b5f36ee713b779c0970f24320a" Dec 02 07:36:52 crc kubenswrapper[4895]: I1202 07:36:52.302474 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fr8ch7" Dec 02 07:36:52 crc kubenswrapper[4895]: I1202 07:36:52.323916 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-xtwld" podStartSLOduration=2.442958088 podStartE2EDuration="5.323894071s" podCreationTimestamp="2025-12-02 07:36:47 +0000 UTC" firstStartedPulling="2025-12-02 07:36:49.258535298 +0000 UTC m=+820.429394941" lastFinishedPulling="2025-12-02 07:36:52.139471311 +0000 UTC m=+823.310330924" observedRunningTime="2025-12-02 07:36:52.322098055 +0000 UTC m=+823.492957688" watchObservedRunningTime="2025-12-02 07:36:52.323894071 +0000 UTC m=+823.494753694" Dec 02 07:36:55 crc kubenswrapper[4895]: I1202 07:36:55.734445 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-67fqr"] Dec 02 07:36:55 crc kubenswrapper[4895]: E1202 07:36:55.735161 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a467b75-3cba-434a-aa67-c823cb289396" containerName="extract" Dec 02 07:36:55 crc kubenswrapper[4895]: I1202 07:36:55.735177 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a467b75-3cba-434a-aa67-c823cb289396" containerName="extract" Dec 02 07:36:55 crc kubenswrapper[4895]: E1202 07:36:55.735187 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a467b75-3cba-434a-aa67-c823cb289396" containerName="pull" Dec 02 07:36:55 crc kubenswrapper[4895]: I1202 07:36:55.735192 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a467b75-3cba-434a-aa67-c823cb289396" containerName="pull" Dec 02 07:36:55 crc kubenswrapper[4895]: E1202 
07:36:55.735205 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a467b75-3cba-434a-aa67-c823cb289396" containerName="util" Dec 02 07:36:55 crc kubenswrapper[4895]: I1202 07:36:55.735211 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a467b75-3cba-434a-aa67-c823cb289396" containerName="util" Dec 02 07:36:55 crc kubenswrapper[4895]: I1202 07:36:55.735304 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a467b75-3cba-434a-aa67-c823cb289396" containerName="extract" Dec 02 07:36:55 crc kubenswrapper[4895]: I1202 07:36:55.735841 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-67fqr" Dec 02 07:36:55 crc kubenswrapper[4895]: I1202 07:36:55.738969 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Dec 02 07:36:55 crc kubenswrapper[4895]: I1202 07:36:55.738994 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-vtmbh" Dec 02 07:36:55 crc kubenswrapper[4895]: I1202 07:36:55.739130 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Dec 02 07:36:55 crc kubenswrapper[4895]: I1202 07:36:55.756126 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-67fqr"] Dec 02 07:36:55 crc kubenswrapper[4895]: I1202 07:36:55.796975 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfg7g\" (UniqueName: \"kubernetes.io/projected/2586c411-cce4-4ade-af50-5d2b0c5ee2b6-kube-api-access-bfg7g\") pod \"nmstate-operator-5b5b58f5c8-67fqr\" (UID: \"2586c411-cce4-4ade-af50-5d2b0c5ee2b6\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-67fqr" Dec 02 07:36:55 crc kubenswrapper[4895]: I1202 07:36:55.898240 4895 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-bfg7g\" (UniqueName: \"kubernetes.io/projected/2586c411-cce4-4ade-af50-5d2b0c5ee2b6-kube-api-access-bfg7g\") pod \"nmstate-operator-5b5b58f5c8-67fqr\" (UID: \"2586c411-cce4-4ade-af50-5d2b0c5ee2b6\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-67fqr" Dec 02 07:36:55 crc kubenswrapper[4895]: I1202 07:36:55.922443 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfg7g\" (UniqueName: \"kubernetes.io/projected/2586c411-cce4-4ade-af50-5d2b0c5ee2b6-kube-api-access-bfg7g\") pod \"nmstate-operator-5b5b58f5c8-67fqr\" (UID: \"2586c411-cce4-4ade-af50-5d2b0c5ee2b6\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-67fqr" Dec 02 07:36:56 crc kubenswrapper[4895]: I1202 07:36:56.056839 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-67fqr" Dec 02 07:36:56 crc kubenswrapper[4895]: I1202 07:36:56.288836 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-67fqr"] Dec 02 07:36:56 crc kubenswrapper[4895]: I1202 07:36:56.338951 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-67fqr" event={"ID":"2586c411-cce4-4ade-af50-5d2b0c5ee2b6","Type":"ContainerStarted","Data":"d7c66ab74a0a707a482258f75d67659bb3bbd4769b8399c999e6f621b8927f01"} Dec 02 07:36:58 crc kubenswrapper[4895]: I1202 07:36:58.178199 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-xtwld" Dec 02 07:36:58 crc kubenswrapper[4895]: I1202 07:36:58.178590 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-xtwld" Dec 02 07:36:58 crc kubenswrapper[4895]: I1202 07:36:58.223180 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-xtwld" Dec 02 07:36:58 crc 
kubenswrapper[4895]: I1202 07:36:58.395638 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-xtwld" Dec 02 07:36:59 crc kubenswrapper[4895]: I1202 07:36:59.356932 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-67fqr" event={"ID":"2586c411-cce4-4ade-af50-5d2b0c5ee2b6","Type":"ContainerStarted","Data":"9bccd5005eb8368f12d77ba6334bfbd7dc063c3ecd211fd2c314bc84e9751b84"} Dec 02 07:36:59 crc kubenswrapper[4895]: I1202 07:36:59.374871 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-67fqr" podStartSLOduration=1.8505706 podStartE2EDuration="4.374845202s" podCreationTimestamp="2025-12-02 07:36:55 +0000 UTC" firstStartedPulling="2025-12-02 07:36:56.31595936 +0000 UTC m=+827.486818963" lastFinishedPulling="2025-12-02 07:36:58.840233952 +0000 UTC m=+830.011093565" observedRunningTime="2025-12-02 07:36:59.371194539 +0000 UTC m=+830.542054172" watchObservedRunningTime="2025-12-02 07:36:59.374845202 +0000 UTC m=+830.545704825" Dec 02 07:37:00 crc kubenswrapper[4895]: I1202 07:37:00.610058 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xtwld"] Dec 02 07:37:00 crc kubenswrapper[4895]: I1202 07:37:00.610348 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-xtwld" podUID="23940dc6-2ca2-4c07-8a99-eeb2fcb48345" containerName="registry-server" containerID="cri-o://bda4e043ee7a0d2c86422593da96c67789d4f56d61eb78f4508e1e5d18171b0f" gracePeriod=2 Dec 02 07:37:00 crc kubenswrapper[4895]: I1202 07:37:00.953222 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xtwld" Dec 02 07:37:01 crc kubenswrapper[4895]: I1202 07:37:01.069128 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23940dc6-2ca2-4c07-8a99-eeb2fcb48345-utilities\") pod \"23940dc6-2ca2-4c07-8a99-eeb2fcb48345\" (UID: \"23940dc6-2ca2-4c07-8a99-eeb2fcb48345\") " Dec 02 07:37:01 crc kubenswrapper[4895]: I1202 07:37:01.069207 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cg676\" (UniqueName: \"kubernetes.io/projected/23940dc6-2ca2-4c07-8a99-eeb2fcb48345-kube-api-access-cg676\") pod \"23940dc6-2ca2-4c07-8a99-eeb2fcb48345\" (UID: \"23940dc6-2ca2-4c07-8a99-eeb2fcb48345\") " Dec 02 07:37:01 crc kubenswrapper[4895]: I1202 07:37:01.069350 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23940dc6-2ca2-4c07-8a99-eeb2fcb48345-catalog-content\") pod \"23940dc6-2ca2-4c07-8a99-eeb2fcb48345\" (UID: \"23940dc6-2ca2-4c07-8a99-eeb2fcb48345\") " Dec 02 07:37:01 crc kubenswrapper[4895]: I1202 07:37:01.070624 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/23940dc6-2ca2-4c07-8a99-eeb2fcb48345-utilities" (OuterVolumeSpecName: "utilities") pod "23940dc6-2ca2-4c07-8a99-eeb2fcb48345" (UID: "23940dc6-2ca2-4c07-8a99-eeb2fcb48345"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:37:01 crc kubenswrapper[4895]: I1202 07:37:01.070955 4895 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23940dc6-2ca2-4c07-8a99-eeb2fcb48345-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 07:37:01 crc kubenswrapper[4895]: I1202 07:37:01.077818 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23940dc6-2ca2-4c07-8a99-eeb2fcb48345-kube-api-access-cg676" (OuterVolumeSpecName: "kube-api-access-cg676") pod "23940dc6-2ca2-4c07-8a99-eeb2fcb48345" (UID: "23940dc6-2ca2-4c07-8a99-eeb2fcb48345"). InnerVolumeSpecName "kube-api-access-cg676". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:37:01 crc kubenswrapper[4895]: I1202 07:37:01.171834 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cg676\" (UniqueName: \"kubernetes.io/projected/23940dc6-2ca2-4c07-8a99-eeb2fcb48345-kube-api-access-cg676\") on node \"crc\" DevicePath \"\"" Dec 02 07:37:01 crc kubenswrapper[4895]: I1202 07:37:01.370908 4895 generic.go:334] "Generic (PLEG): container finished" podID="23940dc6-2ca2-4c07-8a99-eeb2fcb48345" containerID="bda4e043ee7a0d2c86422593da96c67789d4f56d61eb78f4508e1e5d18171b0f" exitCode=0 Dec 02 07:37:01 crc kubenswrapper[4895]: I1202 07:37:01.370960 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xtwld" event={"ID":"23940dc6-2ca2-4c07-8a99-eeb2fcb48345","Type":"ContainerDied","Data":"bda4e043ee7a0d2c86422593da96c67789d4f56d61eb78f4508e1e5d18171b0f"} Dec 02 07:37:01 crc kubenswrapper[4895]: I1202 07:37:01.370990 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xtwld" event={"ID":"23940dc6-2ca2-4c07-8a99-eeb2fcb48345","Type":"ContainerDied","Data":"29c5d34d66e5e083749be51df43e382aea7a0d2455b9539442d64904a33bed85"} Dec 02 07:37:01 crc kubenswrapper[4895]: I1202 
07:37:01.370964 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xtwld" Dec 02 07:37:01 crc kubenswrapper[4895]: I1202 07:37:01.371017 4895 scope.go:117] "RemoveContainer" containerID="bda4e043ee7a0d2c86422593da96c67789d4f56d61eb78f4508e1e5d18171b0f" Dec 02 07:37:01 crc kubenswrapper[4895]: I1202 07:37:01.386914 4895 scope.go:117] "RemoveContainer" containerID="a8b2c0c09bb68ace894abb2680feee050c5fdbd0631e24db69fb6880c6c70c8b" Dec 02 07:37:01 crc kubenswrapper[4895]: I1202 07:37:01.411704 4895 scope.go:117] "RemoveContainer" containerID="612cd29006af7148b7fa3419037e784fd55f54eb081717d5ecee8066aa9ceb35" Dec 02 07:37:01 crc kubenswrapper[4895]: I1202 07:37:01.425061 4895 scope.go:117] "RemoveContainer" containerID="bda4e043ee7a0d2c86422593da96c67789d4f56d61eb78f4508e1e5d18171b0f" Dec 02 07:37:01 crc kubenswrapper[4895]: E1202 07:37:01.425587 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bda4e043ee7a0d2c86422593da96c67789d4f56d61eb78f4508e1e5d18171b0f\": container with ID starting with bda4e043ee7a0d2c86422593da96c67789d4f56d61eb78f4508e1e5d18171b0f not found: ID does not exist" containerID="bda4e043ee7a0d2c86422593da96c67789d4f56d61eb78f4508e1e5d18171b0f" Dec 02 07:37:01 crc kubenswrapper[4895]: I1202 07:37:01.425648 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bda4e043ee7a0d2c86422593da96c67789d4f56d61eb78f4508e1e5d18171b0f"} err="failed to get container status \"bda4e043ee7a0d2c86422593da96c67789d4f56d61eb78f4508e1e5d18171b0f\": rpc error: code = NotFound desc = could not find container \"bda4e043ee7a0d2c86422593da96c67789d4f56d61eb78f4508e1e5d18171b0f\": container with ID starting with bda4e043ee7a0d2c86422593da96c67789d4f56d61eb78f4508e1e5d18171b0f not found: ID does not exist" Dec 02 07:37:01 crc kubenswrapper[4895]: I1202 07:37:01.425688 4895 
scope.go:117] "RemoveContainer" containerID="a8b2c0c09bb68ace894abb2680feee050c5fdbd0631e24db69fb6880c6c70c8b" Dec 02 07:37:01 crc kubenswrapper[4895]: E1202 07:37:01.426363 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8b2c0c09bb68ace894abb2680feee050c5fdbd0631e24db69fb6880c6c70c8b\": container with ID starting with a8b2c0c09bb68ace894abb2680feee050c5fdbd0631e24db69fb6880c6c70c8b not found: ID does not exist" containerID="a8b2c0c09bb68ace894abb2680feee050c5fdbd0631e24db69fb6880c6c70c8b" Dec 02 07:37:01 crc kubenswrapper[4895]: I1202 07:37:01.426418 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8b2c0c09bb68ace894abb2680feee050c5fdbd0631e24db69fb6880c6c70c8b"} err="failed to get container status \"a8b2c0c09bb68ace894abb2680feee050c5fdbd0631e24db69fb6880c6c70c8b\": rpc error: code = NotFound desc = could not find container \"a8b2c0c09bb68ace894abb2680feee050c5fdbd0631e24db69fb6880c6c70c8b\": container with ID starting with a8b2c0c09bb68ace894abb2680feee050c5fdbd0631e24db69fb6880c6c70c8b not found: ID does not exist" Dec 02 07:37:01 crc kubenswrapper[4895]: I1202 07:37:01.426450 4895 scope.go:117] "RemoveContainer" containerID="612cd29006af7148b7fa3419037e784fd55f54eb081717d5ecee8066aa9ceb35" Dec 02 07:37:01 crc kubenswrapper[4895]: E1202 07:37:01.426839 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"612cd29006af7148b7fa3419037e784fd55f54eb081717d5ecee8066aa9ceb35\": container with ID starting with 612cd29006af7148b7fa3419037e784fd55f54eb081717d5ecee8066aa9ceb35 not found: ID does not exist" containerID="612cd29006af7148b7fa3419037e784fd55f54eb081717d5ecee8066aa9ceb35" Dec 02 07:37:01 crc kubenswrapper[4895]: I1202 07:37:01.426873 4895 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"612cd29006af7148b7fa3419037e784fd55f54eb081717d5ecee8066aa9ceb35"} err="failed to get container status \"612cd29006af7148b7fa3419037e784fd55f54eb081717d5ecee8066aa9ceb35\": rpc error: code = NotFound desc = could not find container \"612cd29006af7148b7fa3419037e784fd55f54eb081717d5ecee8066aa9ceb35\": container with ID starting with 612cd29006af7148b7fa3419037e784fd55f54eb081717d5ecee8066aa9ceb35 not found: ID does not exist" Dec 02 07:37:02 crc kubenswrapper[4895]: I1202 07:37:02.174061 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/23940dc6-2ca2-4c07-8a99-eeb2fcb48345-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "23940dc6-2ca2-4c07-8a99-eeb2fcb48345" (UID: "23940dc6-2ca2-4c07-8a99-eeb2fcb48345"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:37:02 crc kubenswrapper[4895]: I1202 07:37:02.184845 4895 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23940dc6-2ca2-4c07-8a99-eeb2fcb48345-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 07:37:02 crc kubenswrapper[4895]: I1202 07:37:02.309120 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xtwld"] Dec 02 07:37:02 crc kubenswrapper[4895]: I1202 07:37:02.314016 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-xtwld"] Dec 02 07:37:03 crc kubenswrapper[4895]: I1202 07:37:03.152954 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23940dc6-2ca2-4c07-8a99-eeb2fcb48345" path="/var/lib/kubelet/pods/23940dc6-2ca2-4c07-8a99-eeb2fcb48345/volumes" Dec 02 07:37:05 crc kubenswrapper[4895]: I1202 07:37:05.558709 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-55nh2"] Dec 02 07:37:05 crc kubenswrapper[4895]: E1202 
07:37:05.559032 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23940dc6-2ca2-4c07-8a99-eeb2fcb48345" containerName="extract-content" Dec 02 07:37:05 crc kubenswrapper[4895]: I1202 07:37:05.559052 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="23940dc6-2ca2-4c07-8a99-eeb2fcb48345" containerName="extract-content" Dec 02 07:37:05 crc kubenswrapper[4895]: E1202 07:37:05.559073 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23940dc6-2ca2-4c07-8a99-eeb2fcb48345" containerName="registry-server" Dec 02 07:37:05 crc kubenswrapper[4895]: I1202 07:37:05.559082 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="23940dc6-2ca2-4c07-8a99-eeb2fcb48345" containerName="registry-server" Dec 02 07:37:05 crc kubenswrapper[4895]: E1202 07:37:05.559091 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23940dc6-2ca2-4c07-8a99-eeb2fcb48345" containerName="extract-utilities" Dec 02 07:37:05 crc kubenswrapper[4895]: I1202 07:37:05.559101 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="23940dc6-2ca2-4c07-8a99-eeb2fcb48345" containerName="extract-utilities" Dec 02 07:37:05 crc kubenswrapper[4895]: I1202 07:37:05.559212 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="23940dc6-2ca2-4c07-8a99-eeb2fcb48345" containerName="registry-server" Dec 02 07:37:05 crc kubenswrapper[4895]: I1202 07:37:05.559848 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-55nh2" Dec 02 07:37:05 crc kubenswrapper[4895]: I1202 07:37:05.563159 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-ccqlg" Dec 02 07:37:05 crc kubenswrapper[4895]: I1202 07:37:05.568327 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-rfdrr"] Dec 02 07:37:05 crc kubenswrapper[4895]: I1202 07:37:05.569333 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-rfdrr" Dec 02 07:37:05 crc kubenswrapper[4895]: I1202 07:37:05.572495 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Dec 02 07:37:05 crc kubenswrapper[4895]: I1202 07:37:05.589280 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-rfdrr"] Dec 02 07:37:05 crc kubenswrapper[4895]: I1202 07:37:05.597791 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-55nh2"] Dec 02 07:37:05 crc kubenswrapper[4895]: I1202 07:37:05.614660 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-lmvxq"] Dec 02 07:37:05 crc kubenswrapper[4895]: I1202 07:37:05.615623 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-lmvxq" Dec 02 07:37:05 crc kubenswrapper[4895]: I1202 07:37:05.698548 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-t5dk9"] Dec 02 07:37:05 crc kubenswrapper[4895]: I1202 07:37:05.699495 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-t5dk9" Dec 02 07:37:05 crc kubenswrapper[4895]: I1202 07:37:05.702153 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Dec 02 07:37:05 crc kubenswrapper[4895]: I1202 07:37:05.702353 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Dec 02 07:37:05 crc kubenswrapper[4895]: I1202 07:37:05.702808 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-f7dv2" Dec 02 07:37:05 crc kubenswrapper[4895]: I1202 07:37:05.706859 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-t5dk9"] Dec 02 07:37:05 crc kubenswrapper[4895]: I1202 07:37:05.732884 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7t7mb\" (UniqueName: \"kubernetes.io/projected/b59c4d17-fb33-40a9-b00f-fc89b30d9c6a-kube-api-access-7t7mb\") pod \"nmstate-metrics-7f946cbc9-55nh2\" (UID: \"b59c4d17-fb33-40a9-b00f-fc89b30d9c6a\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-55nh2" Dec 02 07:37:05 crc kubenswrapper[4895]: I1202 07:37:05.732927 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/b66d1e69-d965-457a-8a57-b5b721bc3cd9-nmstate-lock\") pod \"nmstate-handler-lmvxq\" (UID: \"b66d1e69-d965-457a-8a57-b5b721bc3cd9\") " pod="openshift-nmstate/nmstate-handler-lmvxq" Dec 02 07:37:05 crc kubenswrapper[4895]: I1202 07:37:05.733005 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/b66d1e69-d965-457a-8a57-b5b721bc3cd9-dbus-socket\") pod \"nmstate-handler-lmvxq\" (UID: \"b66d1e69-d965-457a-8a57-b5b721bc3cd9\") " 
pod="openshift-nmstate/nmstate-handler-lmvxq" Dec 02 07:37:05 crc kubenswrapper[4895]: I1202 07:37:05.733043 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/19c56e99-f344-469b-9940-3e8ebe40c721-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-rfdrr\" (UID: \"19c56e99-f344-469b-9940-3e8ebe40c721\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-rfdrr" Dec 02 07:37:05 crc kubenswrapper[4895]: I1202 07:37:05.733078 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zg68s\" (UniqueName: \"kubernetes.io/projected/19c56e99-f344-469b-9940-3e8ebe40c721-kube-api-access-zg68s\") pod \"nmstate-webhook-5f6d4c5ccb-rfdrr\" (UID: \"19c56e99-f344-469b-9940-3e8ebe40c721\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-rfdrr" Dec 02 07:37:05 crc kubenswrapper[4895]: I1202 07:37:05.733105 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/b66d1e69-d965-457a-8a57-b5b721bc3cd9-ovs-socket\") pod \"nmstate-handler-lmvxq\" (UID: \"b66d1e69-d965-457a-8a57-b5b721bc3cd9\") " pod="openshift-nmstate/nmstate-handler-lmvxq" Dec 02 07:37:05 crc kubenswrapper[4895]: I1202 07:37:05.733697 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7hrc\" (UniqueName: \"kubernetes.io/projected/b66d1e69-d965-457a-8a57-b5b721bc3cd9-kube-api-access-z7hrc\") pod \"nmstate-handler-lmvxq\" (UID: \"b66d1e69-d965-457a-8a57-b5b721bc3cd9\") " pod="openshift-nmstate/nmstate-handler-lmvxq" Dec 02 07:37:05 crc kubenswrapper[4895]: I1202 07:37:05.835522 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/5f9c69d9-5599-4fc7-bebf-44167ee2cccf-plugin-serving-cert\") pod 
\"nmstate-console-plugin-7fbb5f6569-t5dk9\" (UID: \"5f9c69d9-5599-4fc7-bebf-44167ee2cccf\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-t5dk9" Dec 02 07:37:05 crc kubenswrapper[4895]: I1202 07:37:05.835934 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/b66d1e69-d965-457a-8a57-b5b721bc3cd9-dbus-socket\") pod \"nmstate-handler-lmvxq\" (UID: \"b66d1e69-d965-457a-8a57-b5b721bc3cd9\") " pod="openshift-nmstate/nmstate-handler-lmvxq" Dec 02 07:37:05 crc kubenswrapper[4895]: I1202 07:37:05.835972 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/19c56e99-f344-469b-9940-3e8ebe40c721-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-rfdrr\" (UID: \"19c56e99-f344-469b-9940-3e8ebe40c721\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-rfdrr" Dec 02 07:37:05 crc kubenswrapper[4895]: I1202 07:37:05.836001 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zg68s\" (UniqueName: \"kubernetes.io/projected/19c56e99-f344-469b-9940-3e8ebe40c721-kube-api-access-zg68s\") pod \"nmstate-webhook-5f6d4c5ccb-rfdrr\" (UID: \"19c56e99-f344-469b-9940-3e8ebe40c721\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-rfdrr" Dec 02 07:37:05 crc kubenswrapper[4895]: I1202 07:37:05.836021 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/b66d1e69-d965-457a-8a57-b5b721bc3cd9-ovs-socket\") pod \"nmstate-handler-lmvxq\" (UID: \"b66d1e69-d965-457a-8a57-b5b721bc3cd9\") " pod="openshift-nmstate/nmstate-handler-lmvxq" Dec 02 07:37:05 crc kubenswrapper[4895]: I1202 07:37:05.836057 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7hrc\" (UniqueName: \"kubernetes.io/projected/b66d1e69-d965-457a-8a57-b5b721bc3cd9-kube-api-access-z7hrc\") 
pod \"nmstate-handler-lmvxq\" (UID: \"b66d1e69-d965-457a-8a57-b5b721bc3cd9\") " pod="openshift-nmstate/nmstate-handler-lmvxq" Dec 02 07:37:05 crc kubenswrapper[4895]: I1202 07:37:05.836085 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5f9c69d9-5599-4fc7-bebf-44167ee2cccf-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-t5dk9\" (UID: \"5f9c69d9-5599-4fc7-bebf-44167ee2cccf\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-t5dk9" Dec 02 07:37:05 crc kubenswrapper[4895]: I1202 07:37:05.836121 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7t7mb\" (UniqueName: \"kubernetes.io/projected/b59c4d17-fb33-40a9-b00f-fc89b30d9c6a-kube-api-access-7t7mb\") pod \"nmstate-metrics-7f946cbc9-55nh2\" (UID: \"b59c4d17-fb33-40a9-b00f-fc89b30d9c6a\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-55nh2" Dec 02 07:37:05 crc kubenswrapper[4895]: I1202 07:37:05.836141 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/b66d1e69-d965-457a-8a57-b5b721bc3cd9-nmstate-lock\") pod \"nmstate-handler-lmvxq\" (UID: \"b66d1e69-d965-457a-8a57-b5b721bc3cd9\") " pod="openshift-nmstate/nmstate-handler-lmvxq" Dec 02 07:37:05 crc kubenswrapper[4895]: I1202 07:37:05.836161 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qb6fm\" (UniqueName: \"kubernetes.io/projected/5f9c69d9-5599-4fc7-bebf-44167ee2cccf-kube-api-access-qb6fm\") pod \"nmstate-console-plugin-7fbb5f6569-t5dk9\" (UID: \"5f9c69d9-5599-4fc7-bebf-44167ee2cccf\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-t5dk9" Dec 02 07:37:05 crc kubenswrapper[4895]: E1202 07:37:05.836251 4895 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found 
Dec 02 07:37:05 crc kubenswrapper[4895]: I1202 07:37:05.836273 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/b66d1e69-d965-457a-8a57-b5b721bc3cd9-dbus-socket\") pod \"nmstate-handler-lmvxq\" (UID: \"b66d1e69-d965-457a-8a57-b5b721bc3cd9\") " pod="openshift-nmstate/nmstate-handler-lmvxq" Dec 02 07:37:05 crc kubenswrapper[4895]: I1202 07:37:05.836341 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/b66d1e69-d965-457a-8a57-b5b721bc3cd9-ovs-socket\") pod \"nmstate-handler-lmvxq\" (UID: \"b66d1e69-d965-457a-8a57-b5b721bc3cd9\") " pod="openshift-nmstate/nmstate-handler-lmvxq" Dec 02 07:37:05 crc kubenswrapper[4895]: E1202 07:37:05.836374 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/19c56e99-f344-469b-9940-3e8ebe40c721-tls-key-pair podName:19c56e99-f344-469b-9940-3e8ebe40c721 nodeName:}" failed. No retries permitted until 2025-12-02 07:37:06.33634567 +0000 UTC m=+837.507205343 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/19c56e99-f344-469b-9940-3e8ebe40c721-tls-key-pair") pod "nmstate-webhook-5f6d4c5ccb-rfdrr" (UID: "19c56e99-f344-469b-9940-3e8ebe40c721") : secret "openshift-nmstate-webhook" not found Dec 02 07:37:05 crc kubenswrapper[4895]: I1202 07:37:05.836585 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/b66d1e69-d965-457a-8a57-b5b721bc3cd9-nmstate-lock\") pod \"nmstate-handler-lmvxq\" (UID: \"b66d1e69-d965-457a-8a57-b5b721bc3cd9\") " pod="openshift-nmstate/nmstate-handler-lmvxq" Dec 02 07:37:05 crc kubenswrapper[4895]: I1202 07:37:05.857116 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7hrc\" (UniqueName: \"kubernetes.io/projected/b66d1e69-d965-457a-8a57-b5b721bc3cd9-kube-api-access-z7hrc\") pod \"nmstate-handler-lmvxq\" (UID: \"b66d1e69-d965-457a-8a57-b5b721bc3cd9\") " pod="openshift-nmstate/nmstate-handler-lmvxq" Dec 02 07:37:05 crc kubenswrapper[4895]: I1202 07:37:05.859874 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7t7mb\" (UniqueName: \"kubernetes.io/projected/b59c4d17-fb33-40a9-b00f-fc89b30d9c6a-kube-api-access-7t7mb\") pod \"nmstate-metrics-7f946cbc9-55nh2\" (UID: \"b59c4d17-fb33-40a9-b00f-fc89b30d9c6a\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-55nh2" Dec 02 07:37:05 crc kubenswrapper[4895]: I1202 07:37:05.861042 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zg68s\" (UniqueName: \"kubernetes.io/projected/19c56e99-f344-469b-9940-3e8ebe40c721-kube-api-access-zg68s\") pod \"nmstate-webhook-5f6d4c5ccb-rfdrr\" (UID: \"19c56e99-f344-469b-9940-3e8ebe40c721\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-rfdrr" Dec 02 07:37:05 crc kubenswrapper[4895]: I1202 07:37:05.874811 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-55nh2" Dec 02 07:37:05 crc kubenswrapper[4895]: I1202 07:37:05.930331 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-c54f8bf99-xcws6"] Dec 02 07:37:05 crc kubenswrapper[4895]: I1202 07:37:05.931223 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-c54f8bf99-xcws6" Dec 02 07:37:05 crc kubenswrapper[4895]: I1202 07:37:05.937316 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8c190d66-b263-4c02-b342-ec0cbeb93433-console-serving-cert\") pod \"console-c54f8bf99-xcws6\" (UID: \"8c190d66-b263-4c02-b342-ec0cbeb93433\") " pod="openshift-console/console-c54f8bf99-xcws6" Dec 02 07:37:05 crc kubenswrapper[4895]: I1202 07:37:05.937387 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8x97x\" (UniqueName: \"kubernetes.io/projected/8c190d66-b263-4c02-b342-ec0cbeb93433-kube-api-access-8x97x\") pod \"console-c54f8bf99-xcws6\" (UID: \"8c190d66-b263-4c02-b342-ec0cbeb93433\") " pod="openshift-console/console-c54f8bf99-xcws6" Dec 02 07:37:05 crc kubenswrapper[4895]: I1202 07:37:05.937432 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8c190d66-b263-4c02-b342-ec0cbeb93433-console-oauth-config\") pod \"console-c54f8bf99-xcws6\" (UID: \"8c190d66-b263-4c02-b342-ec0cbeb93433\") " pod="openshift-console/console-c54f8bf99-xcws6" Dec 02 07:37:05 crc kubenswrapper[4895]: I1202 07:37:05.937456 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8c190d66-b263-4c02-b342-ec0cbeb93433-trusted-ca-bundle\") pod \"console-c54f8bf99-xcws6\" 
(UID: \"8c190d66-b263-4c02-b342-ec0cbeb93433\") " pod="openshift-console/console-c54f8bf99-xcws6" Dec 02 07:37:05 crc kubenswrapper[4895]: I1202 07:37:05.937486 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8c190d66-b263-4c02-b342-ec0cbeb93433-console-config\") pod \"console-c54f8bf99-xcws6\" (UID: \"8c190d66-b263-4c02-b342-ec0cbeb93433\") " pod="openshift-console/console-c54f8bf99-xcws6" Dec 02 07:37:05 crc kubenswrapper[4895]: I1202 07:37:05.939762 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-lmvxq" Dec 02 07:37:05 crc kubenswrapper[4895]: I1202 07:37:05.940539 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5f9c69d9-5599-4fc7-bebf-44167ee2cccf-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-t5dk9\" (UID: \"5f9c69d9-5599-4fc7-bebf-44167ee2cccf\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-t5dk9" Dec 02 07:37:05 crc kubenswrapper[4895]: I1202 07:37:05.940586 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8c190d66-b263-4c02-b342-ec0cbeb93433-service-ca\") pod \"console-c54f8bf99-xcws6\" (UID: \"8c190d66-b263-4c02-b342-ec0cbeb93433\") " pod="openshift-console/console-c54f8bf99-xcws6" Dec 02 07:37:05 crc kubenswrapper[4895]: I1202 07:37:05.940627 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qb6fm\" (UniqueName: \"kubernetes.io/projected/5f9c69d9-5599-4fc7-bebf-44167ee2cccf-kube-api-access-qb6fm\") pod \"nmstate-console-plugin-7fbb5f6569-t5dk9\" (UID: \"5f9c69d9-5599-4fc7-bebf-44167ee2cccf\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-t5dk9" Dec 02 07:37:05 crc kubenswrapper[4895]: I1202 07:37:05.940665 4895 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/5f9c69d9-5599-4fc7-bebf-44167ee2cccf-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-t5dk9\" (UID: \"5f9c69d9-5599-4fc7-bebf-44167ee2cccf\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-t5dk9" Dec 02 07:37:05 crc kubenswrapper[4895]: I1202 07:37:05.940700 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8c190d66-b263-4c02-b342-ec0cbeb93433-oauth-serving-cert\") pod \"console-c54f8bf99-xcws6\" (UID: \"8c190d66-b263-4c02-b342-ec0cbeb93433\") " pod="openshift-console/console-c54f8bf99-xcws6" Dec 02 07:37:05 crc kubenswrapper[4895]: I1202 07:37:05.941652 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5f9c69d9-5599-4fc7-bebf-44167ee2cccf-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-t5dk9\" (UID: \"5f9c69d9-5599-4fc7-bebf-44167ee2cccf\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-t5dk9" Dec 02 07:37:05 crc kubenswrapper[4895]: I1202 07:37:05.955415 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-c54f8bf99-xcws6"] Dec 02 07:37:05 crc kubenswrapper[4895]: I1202 07:37:05.965810 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/5f9c69d9-5599-4fc7-bebf-44167ee2cccf-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-t5dk9\" (UID: \"5f9c69d9-5599-4fc7-bebf-44167ee2cccf\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-t5dk9" Dec 02 07:37:05 crc kubenswrapper[4895]: W1202 07:37:05.974833 4895 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb66d1e69_d965_457a_8a57_b5b721bc3cd9.slice/crio-89b02a3252a4acb6b0e5eee34cf8d80d68bcc1d0465f32f2a34fc3ec011f2369 WatchSource:0}: Error finding container 89b02a3252a4acb6b0e5eee34cf8d80d68bcc1d0465f32f2a34fc3ec011f2369: Status 404 returned error can't find the container with id 89b02a3252a4acb6b0e5eee34cf8d80d68bcc1d0465f32f2a34fc3ec011f2369 Dec 02 07:37:05 crc kubenswrapper[4895]: I1202 07:37:05.988049 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qb6fm\" (UniqueName: \"kubernetes.io/projected/5f9c69d9-5599-4fc7-bebf-44167ee2cccf-kube-api-access-qb6fm\") pod \"nmstate-console-plugin-7fbb5f6569-t5dk9\" (UID: \"5f9c69d9-5599-4fc7-bebf-44167ee2cccf\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-t5dk9" Dec 02 07:37:06 crc kubenswrapper[4895]: I1202 07:37:06.036851 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-t5dk9" Dec 02 07:37:06 crc kubenswrapper[4895]: I1202 07:37:06.041948 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8c190d66-b263-4c02-b342-ec0cbeb93433-service-ca\") pod \"console-c54f8bf99-xcws6\" (UID: \"8c190d66-b263-4c02-b342-ec0cbeb93433\") " pod="openshift-console/console-c54f8bf99-xcws6" Dec 02 07:37:06 crc kubenswrapper[4895]: I1202 07:37:06.042025 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8c190d66-b263-4c02-b342-ec0cbeb93433-oauth-serving-cert\") pod \"console-c54f8bf99-xcws6\" (UID: \"8c190d66-b263-4c02-b342-ec0cbeb93433\") " pod="openshift-console/console-c54f8bf99-xcws6" Dec 02 07:37:06 crc kubenswrapper[4895]: I1202 07:37:06.042073 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/8c190d66-b263-4c02-b342-ec0cbeb93433-console-serving-cert\") pod \"console-c54f8bf99-xcws6\" (UID: \"8c190d66-b263-4c02-b342-ec0cbeb93433\") " pod="openshift-console/console-c54f8bf99-xcws6" Dec 02 07:37:06 crc kubenswrapper[4895]: I1202 07:37:06.042132 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8x97x\" (UniqueName: \"kubernetes.io/projected/8c190d66-b263-4c02-b342-ec0cbeb93433-kube-api-access-8x97x\") pod \"console-c54f8bf99-xcws6\" (UID: \"8c190d66-b263-4c02-b342-ec0cbeb93433\") " pod="openshift-console/console-c54f8bf99-xcws6" Dec 02 07:37:06 crc kubenswrapper[4895]: I1202 07:37:06.042175 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8c190d66-b263-4c02-b342-ec0cbeb93433-console-oauth-config\") pod \"console-c54f8bf99-xcws6\" (UID: \"8c190d66-b263-4c02-b342-ec0cbeb93433\") " pod="openshift-console/console-c54f8bf99-xcws6" Dec 02 07:37:06 crc kubenswrapper[4895]: I1202 07:37:06.042201 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8c190d66-b263-4c02-b342-ec0cbeb93433-trusted-ca-bundle\") pod \"console-c54f8bf99-xcws6\" (UID: \"8c190d66-b263-4c02-b342-ec0cbeb93433\") " pod="openshift-console/console-c54f8bf99-xcws6" Dec 02 07:37:06 crc kubenswrapper[4895]: I1202 07:37:06.042247 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8c190d66-b263-4c02-b342-ec0cbeb93433-console-config\") pod \"console-c54f8bf99-xcws6\" (UID: \"8c190d66-b263-4c02-b342-ec0cbeb93433\") " pod="openshift-console/console-c54f8bf99-xcws6" Dec 02 07:37:06 crc kubenswrapper[4895]: I1202 07:37:06.043301 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/8c190d66-b263-4c02-b342-ec0cbeb93433-console-config\") pod \"console-c54f8bf99-xcws6\" (UID: \"8c190d66-b263-4c02-b342-ec0cbeb93433\") " pod="openshift-console/console-c54f8bf99-xcws6" Dec 02 07:37:06 crc kubenswrapper[4895]: I1202 07:37:06.044110 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8c190d66-b263-4c02-b342-ec0cbeb93433-service-ca\") pod \"console-c54f8bf99-xcws6\" (UID: \"8c190d66-b263-4c02-b342-ec0cbeb93433\") " pod="openshift-console/console-c54f8bf99-xcws6" Dec 02 07:37:06 crc kubenswrapper[4895]: I1202 07:37:06.044768 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8c190d66-b263-4c02-b342-ec0cbeb93433-oauth-serving-cert\") pod \"console-c54f8bf99-xcws6\" (UID: \"8c190d66-b263-4c02-b342-ec0cbeb93433\") " pod="openshift-console/console-c54f8bf99-xcws6" Dec 02 07:37:06 crc kubenswrapper[4895]: I1202 07:37:06.048172 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8c190d66-b263-4c02-b342-ec0cbeb93433-console-serving-cert\") pod \"console-c54f8bf99-xcws6\" (UID: \"8c190d66-b263-4c02-b342-ec0cbeb93433\") " pod="openshift-console/console-c54f8bf99-xcws6" Dec 02 07:37:06 crc kubenswrapper[4895]: I1202 07:37:06.051286 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8c190d66-b263-4c02-b342-ec0cbeb93433-console-oauth-config\") pod \"console-c54f8bf99-xcws6\" (UID: \"8c190d66-b263-4c02-b342-ec0cbeb93433\") " pod="openshift-console/console-c54f8bf99-xcws6" Dec 02 07:37:06 crc kubenswrapper[4895]: I1202 07:37:06.052280 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8c190d66-b263-4c02-b342-ec0cbeb93433-trusted-ca-bundle\") 
pod \"console-c54f8bf99-xcws6\" (UID: \"8c190d66-b263-4c02-b342-ec0cbeb93433\") " pod="openshift-console/console-c54f8bf99-xcws6" Dec 02 07:37:06 crc kubenswrapper[4895]: I1202 07:37:06.063566 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8x97x\" (UniqueName: \"kubernetes.io/projected/8c190d66-b263-4c02-b342-ec0cbeb93433-kube-api-access-8x97x\") pod \"console-c54f8bf99-xcws6\" (UID: \"8c190d66-b263-4c02-b342-ec0cbeb93433\") " pod="openshift-console/console-c54f8bf99-xcws6" Dec 02 07:37:06 crc kubenswrapper[4895]: I1202 07:37:06.121160 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-55nh2"] Dec 02 07:37:06 crc kubenswrapper[4895]: W1202 07:37:06.130364 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb59c4d17_fb33_40a9_b00f_fc89b30d9c6a.slice/crio-039a3947aa82676e2f3beb313b434795bca8cc3359ebba6f2c3d305962bd3277 WatchSource:0}: Error finding container 039a3947aa82676e2f3beb313b434795bca8cc3359ebba6f2c3d305962bd3277: Status 404 returned error can't find the container with id 039a3947aa82676e2f3beb313b434795bca8cc3359ebba6f2c3d305962bd3277 Dec 02 07:37:06 crc kubenswrapper[4895]: I1202 07:37:06.246637 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-t5dk9"] Dec 02 07:37:06 crc kubenswrapper[4895]: W1202 07:37:06.250191 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f9c69d9_5599_4fc7_bebf_44167ee2cccf.slice/crio-b60833cf2c9af5e5f416f12a4916329980fbe43242e1e3399540124befaff10d WatchSource:0}: Error finding container b60833cf2c9af5e5f416f12a4916329980fbe43242e1e3399540124befaff10d: Status 404 returned error can't find the container with id b60833cf2c9af5e5f416f12a4916329980fbe43242e1e3399540124befaff10d Dec 02 07:37:06 crc kubenswrapper[4895]: 
I1202 07:37:06.257476 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-c54f8bf99-xcws6" Dec 02 07:37:06 crc kubenswrapper[4895]: I1202 07:37:06.346023 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/19c56e99-f344-469b-9940-3e8ebe40c721-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-rfdrr\" (UID: \"19c56e99-f344-469b-9940-3e8ebe40c721\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-rfdrr" Dec 02 07:37:06 crc kubenswrapper[4895]: I1202 07:37:06.352136 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/19c56e99-f344-469b-9940-3e8ebe40c721-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-rfdrr\" (UID: \"19c56e99-f344-469b-9940-3e8ebe40c721\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-rfdrr" Dec 02 07:37:06 crc kubenswrapper[4895]: I1202 07:37:06.402090 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-t5dk9" event={"ID":"5f9c69d9-5599-4fc7-bebf-44167ee2cccf","Type":"ContainerStarted","Data":"b60833cf2c9af5e5f416f12a4916329980fbe43242e1e3399540124befaff10d"} Dec 02 07:37:06 crc kubenswrapper[4895]: I1202 07:37:06.403342 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-lmvxq" event={"ID":"b66d1e69-d965-457a-8a57-b5b721bc3cd9","Type":"ContainerStarted","Data":"89b02a3252a4acb6b0e5eee34cf8d80d68bcc1d0465f32f2a34fc3ec011f2369"} Dec 02 07:37:06 crc kubenswrapper[4895]: I1202 07:37:06.404794 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-55nh2" event={"ID":"b59c4d17-fb33-40a9-b00f-fc89b30d9c6a","Type":"ContainerStarted","Data":"039a3947aa82676e2f3beb313b434795bca8cc3359ebba6f2c3d305962bd3277"} Dec 02 07:37:06 crc kubenswrapper[4895]: I1202 07:37:06.453149 4895 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openshift-console/console-c54f8bf99-xcws6"] Dec 02 07:37:06 crc kubenswrapper[4895]: W1202 07:37:06.459555 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8c190d66_b263_4c02_b342_ec0cbeb93433.slice/crio-4f0408a0091d25aa4f02c59d350179026b3e04057520fc626fcf1676799ea4a7 WatchSource:0}: Error finding container 4f0408a0091d25aa4f02c59d350179026b3e04057520fc626fcf1676799ea4a7: Status 404 returned error can't find the container with id 4f0408a0091d25aa4f02c59d350179026b3e04057520fc626fcf1676799ea4a7 Dec 02 07:37:06 crc kubenswrapper[4895]: I1202 07:37:06.485251 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-rfdrr" Dec 02 07:37:06 crc kubenswrapper[4895]: I1202 07:37:06.669499 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-rfdrr"] Dec 02 07:37:06 crc kubenswrapper[4895]: W1202 07:37:06.675928 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod19c56e99_f344_469b_9940_3e8ebe40c721.slice/crio-9d5b00b0e2515fefa5039a6db10ada7ff551de64112b0fcb8b647298349e19fb WatchSource:0}: Error finding container 9d5b00b0e2515fefa5039a6db10ada7ff551de64112b0fcb8b647298349e19fb: Status 404 returned error can't find the container with id 9d5b00b0e2515fefa5039a6db10ada7ff551de64112b0fcb8b647298349e19fb Dec 02 07:37:07 crc kubenswrapper[4895]: I1202 07:37:07.411772 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-rfdrr" event={"ID":"19c56e99-f344-469b-9940-3e8ebe40c721","Type":"ContainerStarted","Data":"9d5b00b0e2515fefa5039a6db10ada7ff551de64112b0fcb8b647298349e19fb"} Dec 02 07:37:07 crc kubenswrapper[4895]: I1202 07:37:07.412797 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-console/console-c54f8bf99-xcws6" event={"ID":"8c190d66-b263-4c02-b342-ec0cbeb93433","Type":"ContainerStarted","Data":"4f0408a0091d25aa4f02c59d350179026b3e04057520fc626fcf1676799ea4a7"} Dec 02 07:37:08 crc kubenswrapper[4895]: I1202 07:37:08.419493 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-c54f8bf99-xcws6" event={"ID":"8c190d66-b263-4c02-b342-ec0cbeb93433","Type":"ContainerStarted","Data":"9a8ae8c8516bccdf2474704ab47c25dc8494dddaf0aac01b36800608821379d0"} Dec 02 07:37:08 crc kubenswrapper[4895]: I1202 07:37:08.435965 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-c54f8bf99-xcws6" podStartSLOduration=3.4359419 podStartE2EDuration="3.4359419s" podCreationTimestamp="2025-12-02 07:37:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:37:08.435878778 +0000 UTC m=+839.606738391" watchObservedRunningTime="2025-12-02 07:37:08.4359419 +0000 UTC m=+839.606801523" Dec 02 07:37:09 crc kubenswrapper[4895]: I1202 07:37:09.427272 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-55nh2" event={"ID":"b59c4d17-fb33-40a9-b00f-fc89b30d9c6a","Type":"ContainerStarted","Data":"4e7a7cd4952fc5b860632df5f378227b8707122afda14e291f9510fc156ebe5d"} Dec 02 07:37:09 crc kubenswrapper[4895]: I1202 07:37:09.430154 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-t5dk9" event={"ID":"5f9c69d9-5599-4fc7-bebf-44167ee2cccf","Type":"ContainerStarted","Data":"f6bf6560d4a1a9225b734ffac3b8cc71c60e3735a99db94e6a08e203b9ba899a"} Dec 02 07:37:09 crc kubenswrapper[4895]: I1202 07:37:09.433066 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-rfdrr" 
event={"ID":"19c56e99-f344-469b-9940-3e8ebe40c721","Type":"ContainerStarted","Data":"bf0040197613767bbde70898fae95fcee871d7c3809c5de5e5baf7baeb40e140"} Dec 02 07:37:09 crc kubenswrapper[4895]: I1202 07:37:09.433170 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-rfdrr" Dec 02 07:37:09 crc kubenswrapper[4895]: I1202 07:37:09.446985 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-lmvxq" event={"ID":"b66d1e69-d965-457a-8a57-b5b721bc3cd9","Type":"ContainerStarted","Data":"cbd38f1ba8f4afbfbac3d16b7be83908d1af73d733695698eabfb8986097c540"} Dec 02 07:37:09 crc kubenswrapper[4895]: I1202 07:37:09.447074 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-lmvxq" Dec 02 07:37:09 crc kubenswrapper[4895]: I1202 07:37:09.453444 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-t5dk9" podStartSLOduration=1.59717652 podStartE2EDuration="4.453420281s" podCreationTimestamp="2025-12-02 07:37:05 +0000 UTC" firstStartedPulling="2025-12-02 07:37:06.252576692 +0000 UTC m=+837.423436305" lastFinishedPulling="2025-12-02 07:37:09.108820453 +0000 UTC m=+840.279680066" observedRunningTime="2025-12-02 07:37:09.447986493 +0000 UTC m=+840.618846106" watchObservedRunningTime="2025-12-02 07:37:09.453420281 +0000 UTC m=+840.624279894" Dec 02 07:37:09 crc kubenswrapper[4895]: I1202 07:37:09.509664 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-rfdrr" podStartSLOduration=2.073671745 podStartE2EDuration="4.509645559s" podCreationTimestamp="2025-12-02 07:37:05 +0000 UTC" firstStartedPulling="2025-12-02 07:37:06.679300508 +0000 UTC m=+837.850160121" lastFinishedPulling="2025-12-02 07:37:09.115274322 +0000 UTC m=+840.286133935" observedRunningTime="2025-12-02 07:37:09.488288828 
+0000 UTC m=+840.659148441" watchObservedRunningTime="2025-12-02 07:37:09.509645559 +0000 UTC m=+840.680505172" Dec 02 07:37:09 crc kubenswrapper[4895]: I1202 07:37:09.510022 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-lmvxq" podStartSLOduration=1.374452098 podStartE2EDuration="4.51001762s" podCreationTimestamp="2025-12-02 07:37:05 +0000 UTC" firstStartedPulling="2025-12-02 07:37:05.977854332 +0000 UTC m=+837.148713945" lastFinishedPulling="2025-12-02 07:37:09.113419854 +0000 UTC m=+840.284279467" observedRunningTime="2025-12-02 07:37:09.506485251 +0000 UTC m=+840.677344864" watchObservedRunningTime="2025-12-02 07:37:09.51001762 +0000 UTC m=+840.680877233" Dec 02 07:37:12 crc kubenswrapper[4895]: I1202 07:37:12.465940 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-55nh2" event={"ID":"b59c4d17-fb33-40a9-b00f-fc89b30d9c6a","Type":"ContainerStarted","Data":"5550aa52f4dae98f33b1a5a4f5c69846414a7c71aab2db166f98b127899821f0"} Dec 02 07:37:12 crc kubenswrapper[4895]: I1202 07:37:12.491572 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-55nh2" podStartSLOduration=2.29650809 podStartE2EDuration="7.491542932s" podCreationTimestamp="2025-12-02 07:37:05 +0000 UTC" firstStartedPulling="2025-12-02 07:37:06.132267524 +0000 UTC m=+837.303127137" lastFinishedPulling="2025-12-02 07:37:11.327302326 +0000 UTC m=+842.498161979" observedRunningTime="2025-12-02 07:37:12.489064016 +0000 UTC m=+843.659923639" watchObservedRunningTime="2025-12-02 07:37:12.491542932 +0000 UTC m=+843.662402555" Dec 02 07:37:15 crc kubenswrapper[4895]: I1202 07:37:15.202912 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-dllp7"] Dec 02 07:37:15 crc kubenswrapper[4895]: I1202 07:37:15.204727 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-dllp7" Dec 02 07:37:15 crc kubenswrapper[4895]: I1202 07:37:15.219380 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dllp7"] Dec 02 07:37:15 crc kubenswrapper[4895]: I1202 07:37:15.402252 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/880f2543-f0c5-4665-b094-13baea7fbf31-utilities\") pod \"community-operators-dllp7\" (UID: \"880f2543-f0c5-4665-b094-13baea7fbf31\") " pod="openshift-marketplace/community-operators-dllp7" Dec 02 07:37:15 crc kubenswrapper[4895]: I1202 07:37:15.402458 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6t6z\" (UniqueName: \"kubernetes.io/projected/880f2543-f0c5-4665-b094-13baea7fbf31-kube-api-access-q6t6z\") pod \"community-operators-dllp7\" (UID: \"880f2543-f0c5-4665-b094-13baea7fbf31\") " pod="openshift-marketplace/community-operators-dllp7" Dec 02 07:37:15 crc kubenswrapper[4895]: I1202 07:37:15.402528 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/880f2543-f0c5-4665-b094-13baea7fbf31-catalog-content\") pod \"community-operators-dllp7\" (UID: \"880f2543-f0c5-4665-b094-13baea7fbf31\") " pod="openshift-marketplace/community-operators-dllp7" Dec 02 07:37:15 crc kubenswrapper[4895]: I1202 07:37:15.503312 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/880f2543-f0c5-4665-b094-13baea7fbf31-utilities\") pod \"community-operators-dllp7\" (UID: \"880f2543-f0c5-4665-b094-13baea7fbf31\") " pod="openshift-marketplace/community-operators-dllp7" Dec 02 07:37:15 crc kubenswrapper[4895]: I1202 07:37:15.503439 4895 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-q6t6z\" (UniqueName: \"kubernetes.io/projected/880f2543-f0c5-4665-b094-13baea7fbf31-kube-api-access-q6t6z\") pod \"community-operators-dllp7\" (UID: \"880f2543-f0c5-4665-b094-13baea7fbf31\") " pod="openshift-marketplace/community-operators-dllp7" Dec 02 07:37:15 crc kubenswrapper[4895]: I1202 07:37:15.503466 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/880f2543-f0c5-4665-b094-13baea7fbf31-catalog-content\") pod \"community-operators-dllp7\" (UID: \"880f2543-f0c5-4665-b094-13baea7fbf31\") " pod="openshift-marketplace/community-operators-dllp7" Dec 02 07:37:15 crc kubenswrapper[4895]: I1202 07:37:15.503964 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/880f2543-f0c5-4665-b094-13baea7fbf31-utilities\") pod \"community-operators-dllp7\" (UID: \"880f2543-f0c5-4665-b094-13baea7fbf31\") " pod="openshift-marketplace/community-operators-dllp7" Dec 02 07:37:15 crc kubenswrapper[4895]: I1202 07:37:15.504041 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/880f2543-f0c5-4665-b094-13baea7fbf31-catalog-content\") pod \"community-operators-dllp7\" (UID: \"880f2543-f0c5-4665-b094-13baea7fbf31\") " pod="openshift-marketplace/community-operators-dllp7" Dec 02 07:37:15 crc kubenswrapper[4895]: I1202 07:37:15.525112 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6t6z\" (UniqueName: \"kubernetes.io/projected/880f2543-f0c5-4665-b094-13baea7fbf31-kube-api-access-q6t6z\") pod \"community-operators-dllp7\" (UID: \"880f2543-f0c5-4665-b094-13baea7fbf31\") " pod="openshift-marketplace/community-operators-dllp7" Dec 02 07:37:15 crc kubenswrapper[4895]: I1202 07:37:15.537862 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-dllp7" Dec 02 07:37:15 crc kubenswrapper[4895]: I1202 07:37:15.825867 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dllp7"] Dec 02 07:37:15 crc kubenswrapper[4895]: W1202 07:37:15.829876 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod880f2543_f0c5_4665_b094_13baea7fbf31.slice/crio-d8050c9161990da5fc241c1326b0e8c2101502c29f60891c952b7ceeb1bd37bf WatchSource:0}: Error finding container d8050c9161990da5fc241c1326b0e8c2101502c29f60891c952b7ceeb1bd37bf: Status 404 returned error can't find the container with id d8050c9161990da5fc241c1326b0e8c2101502c29f60891c952b7ceeb1bd37bf Dec 02 07:37:15 crc kubenswrapper[4895]: I1202 07:37:15.973297 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-lmvxq" Dec 02 07:37:16 crc kubenswrapper[4895]: I1202 07:37:16.258660 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-c54f8bf99-xcws6" Dec 02 07:37:16 crc kubenswrapper[4895]: I1202 07:37:16.258731 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-c54f8bf99-xcws6" Dec 02 07:37:16 crc kubenswrapper[4895]: I1202 07:37:16.263692 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-c54f8bf99-xcws6" Dec 02 07:37:16 crc kubenswrapper[4895]: I1202 07:37:16.493181 4895 generic.go:334] "Generic (PLEG): container finished" podID="880f2543-f0c5-4665-b094-13baea7fbf31" containerID="3c19419cbcfc0954e6661d9f29aa9e855dbe34a664bb95767679a8cbd8f65ebb" exitCode=0 Dec 02 07:37:16 crc kubenswrapper[4895]: I1202 07:37:16.493281 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dllp7" 
event={"ID":"880f2543-f0c5-4665-b094-13baea7fbf31","Type":"ContainerDied","Data":"3c19419cbcfc0954e6661d9f29aa9e855dbe34a664bb95767679a8cbd8f65ebb"} Dec 02 07:37:16 crc kubenswrapper[4895]: I1202 07:37:16.493577 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dllp7" event={"ID":"880f2543-f0c5-4665-b094-13baea7fbf31","Type":"ContainerStarted","Data":"d8050c9161990da5fc241c1326b0e8c2101502c29f60891c952b7ceeb1bd37bf"} Dec 02 07:37:16 crc kubenswrapper[4895]: I1202 07:37:16.501626 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-c54f8bf99-xcws6" Dec 02 07:37:16 crc kubenswrapper[4895]: I1202 07:37:16.576960 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-q7mhm"] Dec 02 07:37:20 crc kubenswrapper[4895]: I1202 07:37:20.519836 4895 generic.go:334] "Generic (PLEG): container finished" podID="880f2543-f0c5-4665-b094-13baea7fbf31" containerID="9780216855c7f3d26d315f9e9902505932d2d17691005c0a1b7e68ea287121f5" exitCode=0 Dec 02 07:37:20 crc kubenswrapper[4895]: I1202 07:37:20.519898 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dllp7" event={"ID":"880f2543-f0c5-4665-b094-13baea7fbf31","Type":"ContainerDied","Data":"9780216855c7f3d26d315f9e9902505932d2d17691005c0a1b7e68ea287121f5"} Dec 02 07:37:21 crc kubenswrapper[4895]: I1202 07:37:21.528716 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dllp7" event={"ID":"880f2543-f0c5-4665-b094-13baea7fbf31","Type":"ContainerStarted","Data":"c411a513681f3a6ae8540e0230e3d2c813dabd28c4e1ad583010175266e4f485"} Dec 02 07:37:25 crc kubenswrapper[4895]: I1202 07:37:25.538325 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-dllp7" Dec 02 07:37:25 crc kubenswrapper[4895]: I1202 07:37:25.538997 4895 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-dllp7" Dec 02 07:37:25 crc kubenswrapper[4895]: I1202 07:37:25.606500 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-dllp7" Dec 02 07:37:25 crc kubenswrapper[4895]: I1202 07:37:25.631690 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-dllp7" podStartSLOduration=6.176234709 podStartE2EDuration="10.631666357s" podCreationTimestamp="2025-12-02 07:37:15 +0000 UTC" firstStartedPulling="2025-12-02 07:37:16.495663068 +0000 UTC m=+847.666522681" lastFinishedPulling="2025-12-02 07:37:20.951094716 +0000 UTC m=+852.121954329" observedRunningTime="2025-12-02 07:37:21.553726023 +0000 UTC m=+852.724585676" watchObservedRunningTime="2025-12-02 07:37:25.631666357 +0000 UTC m=+856.802525980" Dec 02 07:37:25 crc kubenswrapper[4895]: I1202 07:37:25.857029 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-9vcfn"] Dec 02 07:37:25 crc kubenswrapper[4895]: I1202 07:37:25.858965 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9vcfn" Dec 02 07:37:25 crc kubenswrapper[4895]: I1202 07:37:25.865823 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9vcfn"] Dec 02 07:37:25 crc kubenswrapper[4895]: I1202 07:37:25.867733 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f3af6b9c-e28a-4619-a26d-5c44adfbe129-catalog-content\") pod \"certified-operators-9vcfn\" (UID: \"f3af6b9c-e28a-4619-a26d-5c44adfbe129\") " pod="openshift-marketplace/certified-operators-9vcfn" Dec 02 07:37:25 crc kubenswrapper[4895]: I1202 07:37:25.867802 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f3af6b9c-e28a-4619-a26d-5c44adfbe129-utilities\") pod \"certified-operators-9vcfn\" (UID: \"f3af6b9c-e28a-4619-a26d-5c44adfbe129\") " pod="openshift-marketplace/certified-operators-9vcfn" Dec 02 07:37:25 crc kubenswrapper[4895]: I1202 07:37:25.867839 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mld2s\" (UniqueName: \"kubernetes.io/projected/f3af6b9c-e28a-4619-a26d-5c44adfbe129-kube-api-access-mld2s\") pod \"certified-operators-9vcfn\" (UID: \"f3af6b9c-e28a-4619-a26d-5c44adfbe129\") " pod="openshift-marketplace/certified-operators-9vcfn" Dec 02 07:37:25 crc kubenswrapper[4895]: I1202 07:37:25.969180 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f3af6b9c-e28a-4619-a26d-5c44adfbe129-utilities\") pod \"certified-operators-9vcfn\" (UID: \"f3af6b9c-e28a-4619-a26d-5c44adfbe129\") " pod="openshift-marketplace/certified-operators-9vcfn" Dec 02 07:37:25 crc kubenswrapper[4895]: I1202 07:37:25.969269 4895 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-mld2s\" (UniqueName: \"kubernetes.io/projected/f3af6b9c-e28a-4619-a26d-5c44adfbe129-kube-api-access-mld2s\") pod \"certified-operators-9vcfn\" (UID: \"f3af6b9c-e28a-4619-a26d-5c44adfbe129\") " pod="openshift-marketplace/certified-operators-9vcfn" Dec 02 07:37:25 crc kubenswrapper[4895]: I1202 07:37:25.969823 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f3af6b9c-e28a-4619-a26d-5c44adfbe129-catalog-content\") pod \"certified-operators-9vcfn\" (UID: \"f3af6b9c-e28a-4619-a26d-5c44adfbe129\") " pod="openshift-marketplace/certified-operators-9vcfn" Dec 02 07:37:25 crc kubenswrapper[4895]: I1202 07:37:25.970087 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f3af6b9c-e28a-4619-a26d-5c44adfbe129-utilities\") pod \"certified-operators-9vcfn\" (UID: \"f3af6b9c-e28a-4619-a26d-5c44adfbe129\") " pod="openshift-marketplace/certified-operators-9vcfn" Dec 02 07:37:25 crc kubenswrapper[4895]: I1202 07:37:25.970324 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f3af6b9c-e28a-4619-a26d-5c44adfbe129-catalog-content\") pod \"certified-operators-9vcfn\" (UID: \"f3af6b9c-e28a-4619-a26d-5c44adfbe129\") " pod="openshift-marketplace/certified-operators-9vcfn" Dec 02 07:37:25 crc kubenswrapper[4895]: I1202 07:37:25.994865 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mld2s\" (UniqueName: \"kubernetes.io/projected/f3af6b9c-e28a-4619-a26d-5c44adfbe129-kube-api-access-mld2s\") pod \"certified-operators-9vcfn\" (UID: \"f3af6b9c-e28a-4619-a26d-5c44adfbe129\") " pod="openshift-marketplace/certified-operators-9vcfn" Dec 02 07:37:26 crc kubenswrapper[4895]: I1202 07:37:26.194800 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9vcfn" Dec 02 07:37:26 crc kubenswrapper[4895]: I1202 07:37:26.494204 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-rfdrr" Dec 02 07:37:26 crc kubenswrapper[4895]: I1202 07:37:26.841005 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9vcfn"] Dec 02 07:37:26 crc kubenswrapper[4895]: W1202 07:37:26.853900 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf3af6b9c_e28a_4619_a26d_5c44adfbe129.slice/crio-07c8d56092d4cb28bc1575b9aef37b621209df8ac6c576a468c38733039d38cb WatchSource:0}: Error finding container 07c8d56092d4cb28bc1575b9aef37b621209df8ac6c576a468c38733039d38cb: Status 404 returned error can't find the container with id 07c8d56092d4cb28bc1575b9aef37b621209df8ac6c576a468c38733039d38cb Dec 02 07:37:27 crc kubenswrapper[4895]: I1202 07:37:27.591544 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9vcfn" event={"ID":"f3af6b9c-e28a-4619-a26d-5c44adfbe129","Type":"ContainerStarted","Data":"07c8d56092d4cb28bc1575b9aef37b621209df8ac6c576a468c38733039d38cb"} Dec 02 07:37:28 crc kubenswrapper[4895]: I1202 07:37:28.602926 4895 generic.go:334] "Generic (PLEG): container finished" podID="f3af6b9c-e28a-4619-a26d-5c44adfbe129" containerID="6535b5a264f625d8925a6341592c8c201a1169b4c4119ab509a16adaf70cadcf" exitCode=0 Dec 02 07:37:28 crc kubenswrapper[4895]: I1202 07:37:28.603014 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9vcfn" event={"ID":"f3af6b9c-e28a-4619-a26d-5c44adfbe129","Type":"ContainerDied","Data":"6535b5a264f625d8925a6341592c8c201a1169b4c4119ab509a16adaf70cadcf"} Dec 02 07:37:29 crc kubenswrapper[4895]: I1202 07:37:29.617235 4895 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/certified-operators-9vcfn" event={"ID":"f3af6b9c-e28a-4619-a26d-5c44adfbe129","Type":"ContainerStarted","Data":"540942d7def0a20b1b3a54f29e6921de54aa634533281e325c8cb560460879c6"} Dec 02 07:37:30 crc kubenswrapper[4895]: I1202 07:37:30.627777 4895 generic.go:334] "Generic (PLEG): container finished" podID="f3af6b9c-e28a-4619-a26d-5c44adfbe129" containerID="540942d7def0a20b1b3a54f29e6921de54aa634533281e325c8cb560460879c6" exitCode=0 Dec 02 07:37:30 crc kubenswrapper[4895]: I1202 07:37:30.627867 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9vcfn" event={"ID":"f3af6b9c-e28a-4619-a26d-5c44adfbe129","Type":"ContainerDied","Data":"540942d7def0a20b1b3a54f29e6921de54aa634533281e325c8cb560460879c6"} Dec 02 07:37:33 crc kubenswrapper[4895]: I1202 07:37:33.661434 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9vcfn" event={"ID":"f3af6b9c-e28a-4619-a26d-5c44adfbe129","Type":"ContainerStarted","Data":"c5932931272af7e62a7932037155c7aa7aded4a94b97ed205a0b455b3e45bcaa"} Dec 02 07:37:33 crc kubenswrapper[4895]: I1202 07:37:33.738966 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-9vcfn" podStartSLOduration=4.154306865 podStartE2EDuration="8.73893584s" podCreationTimestamp="2025-12-02 07:37:25 +0000 UTC" firstStartedPulling="2025-12-02 07:37:28.605397693 +0000 UTC m=+859.776257306" lastFinishedPulling="2025-12-02 07:37:33.190026668 +0000 UTC m=+864.360886281" observedRunningTime="2025-12-02 07:37:33.732116068 +0000 UTC m=+864.902975681" watchObservedRunningTime="2025-12-02 07:37:33.73893584 +0000 UTC m=+864.909795453" Dec 02 07:37:35 crc kubenswrapper[4895]: I1202 07:37:35.579652 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-dllp7" Dec 02 07:37:35 crc kubenswrapper[4895]: I1202 07:37:35.672411 4895 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dllp7"] Dec 02 07:37:35 crc kubenswrapper[4895]: I1202 07:37:35.706158 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nd26q"] Dec 02 07:37:35 crc kubenswrapper[4895]: I1202 07:37:35.706451 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-nd26q" podUID="15e19f4e-9f62-4b48-be1e-6aaab358c5d4" containerName="registry-server" containerID="cri-o://5dafaa11de45e41d6ed7382cf66d08eef5bd6b19209beb2a56dc4e1c1844ec50" gracePeriod=2 Dec 02 07:37:36 crc kubenswrapper[4895]: I1202 07:37:36.195716 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-9vcfn" Dec 02 07:37:36 crc kubenswrapper[4895]: I1202 07:37:36.196117 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-9vcfn" Dec 02 07:37:36 crc kubenswrapper[4895]: I1202 07:37:36.257445 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-9vcfn" Dec 02 07:37:36 crc kubenswrapper[4895]: I1202 07:37:36.685779 4895 generic.go:334] "Generic (PLEG): container finished" podID="15e19f4e-9f62-4b48-be1e-6aaab358c5d4" containerID="5dafaa11de45e41d6ed7382cf66d08eef5bd6b19209beb2a56dc4e1c1844ec50" exitCode=0 Dec 02 07:37:36 crc kubenswrapper[4895]: I1202 07:37:36.686757 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nd26q" event={"ID":"15e19f4e-9f62-4b48-be1e-6aaab358c5d4","Type":"ContainerDied","Data":"5dafaa11de45e41d6ed7382cf66d08eef5bd6b19209beb2a56dc4e1c1844ec50"} Dec 02 07:37:36 crc kubenswrapper[4895]: I1202 07:37:36.776104 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nd26q" Dec 02 07:37:36 crc kubenswrapper[4895]: I1202 07:37:36.883734 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15e19f4e-9f62-4b48-be1e-6aaab358c5d4-utilities\") pod \"15e19f4e-9f62-4b48-be1e-6aaab358c5d4\" (UID: \"15e19f4e-9f62-4b48-be1e-6aaab358c5d4\") " Dec 02 07:37:36 crc kubenswrapper[4895]: I1202 07:37:36.883848 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15e19f4e-9f62-4b48-be1e-6aaab358c5d4-catalog-content\") pod \"15e19f4e-9f62-4b48-be1e-6aaab358c5d4\" (UID: \"15e19f4e-9f62-4b48-be1e-6aaab358c5d4\") " Dec 02 07:37:36 crc kubenswrapper[4895]: I1202 07:37:36.884018 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6l57k\" (UniqueName: \"kubernetes.io/projected/15e19f4e-9f62-4b48-be1e-6aaab358c5d4-kube-api-access-6l57k\") pod \"15e19f4e-9f62-4b48-be1e-6aaab358c5d4\" (UID: \"15e19f4e-9f62-4b48-be1e-6aaab358c5d4\") " Dec 02 07:37:36 crc kubenswrapper[4895]: I1202 07:37:36.884621 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15e19f4e-9f62-4b48-be1e-6aaab358c5d4-utilities" (OuterVolumeSpecName: "utilities") pod "15e19f4e-9f62-4b48-be1e-6aaab358c5d4" (UID: "15e19f4e-9f62-4b48-be1e-6aaab358c5d4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:37:36 crc kubenswrapper[4895]: I1202 07:37:36.891132 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15e19f4e-9f62-4b48-be1e-6aaab358c5d4-kube-api-access-6l57k" (OuterVolumeSpecName: "kube-api-access-6l57k") pod "15e19f4e-9f62-4b48-be1e-6aaab358c5d4" (UID: "15e19f4e-9f62-4b48-be1e-6aaab358c5d4"). InnerVolumeSpecName "kube-api-access-6l57k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:37:36 crc kubenswrapper[4895]: I1202 07:37:36.935140 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15e19f4e-9f62-4b48-be1e-6aaab358c5d4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "15e19f4e-9f62-4b48-be1e-6aaab358c5d4" (UID: "15e19f4e-9f62-4b48-be1e-6aaab358c5d4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:37:36 crc kubenswrapper[4895]: I1202 07:37:36.985508 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6l57k\" (UniqueName: \"kubernetes.io/projected/15e19f4e-9f62-4b48-be1e-6aaab358c5d4-kube-api-access-6l57k\") on node \"crc\" DevicePath \"\"" Dec 02 07:37:36 crc kubenswrapper[4895]: I1202 07:37:36.985544 4895 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15e19f4e-9f62-4b48-be1e-6aaab358c5d4-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 07:37:36 crc kubenswrapper[4895]: I1202 07:37:36.985557 4895 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15e19f4e-9f62-4b48-be1e-6aaab358c5d4-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 07:37:37 crc kubenswrapper[4895]: I1202 07:37:37.697186 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nd26q" Dec 02 07:37:37 crc kubenswrapper[4895]: I1202 07:37:37.697134 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nd26q" event={"ID":"15e19f4e-9f62-4b48-be1e-6aaab358c5d4","Type":"ContainerDied","Data":"b1c43e63f27f333480d2d928e12a2e20da8e2892c48a1079b04facebc4bc352c"} Dec 02 07:37:37 crc kubenswrapper[4895]: I1202 07:37:37.697559 4895 scope.go:117] "RemoveContainer" containerID="5dafaa11de45e41d6ed7382cf66d08eef5bd6b19209beb2a56dc4e1c1844ec50" Dec 02 07:37:37 crc kubenswrapper[4895]: I1202 07:37:37.722505 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nd26q"] Dec 02 07:37:37 crc kubenswrapper[4895]: I1202 07:37:37.744953 4895 scope.go:117] "RemoveContainer" containerID="530922f89c90faccae2745204ad33c4cb8c17d8da1a820532bfc63d8ad5e7fc7" Dec 02 07:37:37 crc kubenswrapper[4895]: I1202 07:37:37.749559 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-nd26q"] Dec 02 07:37:37 crc kubenswrapper[4895]: I1202 07:37:37.766111 4895 scope.go:117] "RemoveContainer" containerID="21a9cbd4f72db81b60a7d8a329b43e15d14442c4e102fbff1243abbaa330b9d2" Dec 02 07:37:39 crc kubenswrapper[4895]: I1202 07:37:39.147712 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15e19f4e-9f62-4b48-be1e-6aaab358c5d4" path="/var/lib/kubelet/pods/15e19f4e-9f62-4b48-be1e-6aaab358c5d4/volumes" Dec 02 07:37:41 crc kubenswrapper[4895]: I1202 07:37:41.467051 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ncvhw"] Dec 02 07:37:41 crc kubenswrapper[4895]: E1202 07:37:41.467608 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15e19f4e-9f62-4b48-be1e-6aaab358c5d4" containerName="registry-server" Dec 02 07:37:41 crc kubenswrapper[4895]: I1202 
07:37:41.467622 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="15e19f4e-9f62-4b48-be1e-6aaab358c5d4" containerName="registry-server" Dec 02 07:37:41 crc kubenswrapper[4895]: E1202 07:37:41.467644 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15e19f4e-9f62-4b48-be1e-6aaab358c5d4" containerName="extract-utilities" Dec 02 07:37:41 crc kubenswrapper[4895]: I1202 07:37:41.467650 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="15e19f4e-9f62-4b48-be1e-6aaab358c5d4" containerName="extract-utilities" Dec 02 07:37:41 crc kubenswrapper[4895]: E1202 07:37:41.467659 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15e19f4e-9f62-4b48-be1e-6aaab358c5d4" containerName="extract-content" Dec 02 07:37:41 crc kubenswrapper[4895]: I1202 07:37:41.467666 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="15e19f4e-9f62-4b48-be1e-6aaab358c5d4" containerName="extract-content" Dec 02 07:37:41 crc kubenswrapper[4895]: I1202 07:37:41.467792 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="15e19f4e-9f62-4b48-be1e-6aaab358c5d4" containerName="registry-server" Dec 02 07:37:41 crc kubenswrapper[4895]: I1202 07:37:41.468561 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ncvhw" Dec 02 07:37:41 crc kubenswrapper[4895]: I1202 07:37:41.470328 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 02 07:37:41 crc kubenswrapper[4895]: I1202 07:37:41.478512 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ncvhw"] Dec 02 07:37:41 crc kubenswrapper[4895]: I1202 07:37:41.553933 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jgmh\" (UniqueName: \"kubernetes.io/projected/4ffd4159-d58c-4f5b-aa31-bb4e81790c51-kube-api-access-9jgmh\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ncvhw\" (UID: \"4ffd4159-d58c-4f5b-aa31-bb4e81790c51\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ncvhw" Dec 02 07:37:41 crc kubenswrapper[4895]: I1202 07:37:41.554012 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4ffd4159-d58c-4f5b-aa31-bb4e81790c51-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ncvhw\" (UID: \"4ffd4159-d58c-4f5b-aa31-bb4e81790c51\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ncvhw" Dec 02 07:37:41 crc kubenswrapper[4895]: I1202 07:37:41.554042 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4ffd4159-d58c-4f5b-aa31-bb4e81790c51-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ncvhw\" (UID: \"4ffd4159-d58c-4f5b-aa31-bb4e81790c51\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ncvhw" Dec 02 07:37:41 crc kubenswrapper[4895]: 
I1202 07:37:41.625466 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-q7mhm" podUID="ddcbf4b8-5804-4136-8554-6a307825a6ed" containerName="console" containerID="cri-o://98a239b08ccb243a25b395e480fcdad1c50fd6ccca28a9661e3ba9f2b3c65d42" gracePeriod=15 Dec 02 07:37:41 crc kubenswrapper[4895]: I1202 07:37:41.655913 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jgmh\" (UniqueName: \"kubernetes.io/projected/4ffd4159-d58c-4f5b-aa31-bb4e81790c51-kube-api-access-9jgmh\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ncvhw\" (UID: \"4ffd4159-d58c-4f5b-aa31-bb4e81790c51\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ncvhw" Dec 02 07:37:41 crc kubenswrapper[4895]: I1202 07:37:41.656034 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4ffd4159-d58c-4f5b-aa31-bb4e81790c51-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ncvhw\" (UID: \"4ffd4159-d58c-4f5b-aa31-bb4e81790c51\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ncvhw" Dec 02 07:37:41 crc kubenswrapper[4895]: I1202 07:37:41.656692 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4ffd4159-d58c-4f5b-aa31-bb4e81790c51-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ncvhw\" (UID: \"4ffd4159-d58c-4f5b-aa31-bb4e81790c51\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ncvhw" Dec 02 07:37:41 crc kubenswrapper[4895]: I1202 07:37:41.656766 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4ffd4159-d58c-4f5b-aa31-bb4e81790c51-bundle\") pod 
\"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ncvhw\" (UID: \"4ffd4159-d58c-4f5b-aa31-bb4e81790c51\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ncvhw" Dec 02 07:37:41 crc kubenswrapper[4895]: I1202 07:37:41.657015 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4ffd4159-d58c-4f5b-aa31-bb4e81790c51-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ncvhw\" (UID: \"4ffd4159-d58c-4f5b-aa31-bb4e81790c51\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ncvhw" Dec 02 07:37:41 crc kubenswrapper[4895]: I1202 07:37:41.679050 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jgmh\" (UniqueName: \"kubernetes.io/projected/4ffd4159-d58c-4f5b-aa31-bb4e81790c51-kube-api-access-9jgmh\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ncvhw\" (UID: \"4ffd4159-d58c-4f5b-aa31-bb4e81790c51\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ncvhw" Dec 02 07:37:41 crc kubenswrapper[4895]: I1202 07:37:41.813272 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ncvhw" Dec 02 07:37:42 crc kubenswrapper[4895]: I1202 07:37:42.104870 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ncvhw"] Dec 02 07:37:42 crc kubenswrapper[4895]: I1202 07:37:42.473093 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-q7mhm_ddcbf4b8-5804-4136-8554-6a307825a6ed/console/0.log" Dec 02 07:37:42 crc kubenswrapper[4895]: I1202 07:37:42.473426 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-q7mhm" Dec 02 07:37:42 crc kubenswrapper[4895]: I1202 07:37:42.571200 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-66c26\" (UniqueName: \"kubernetes.io/projected/ddcbf4b8-5804-4136-8554-6a307825a6ed-kube-api-access-66c26\") pod \"ddcbf4b8-5804-4136-8554-6a307825a6ed\" (UID: \"ddcbf4b8-5804-4136-8554-6a307825a6ed\") " Dec 02 07:37:42 crc kubenswrapper[4895]: I1202 07:37:42.571250 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ddcbf4b8-5804-4136-8554-6a307825a6ed-console-serving-cert\") pod \"ddcbf4b8-5804-4136-8554-6a307825a6ed\" (UID: \"ddcbf4b8-5804-4136-8554-6a307825a6ed\") " Dec 02 07:37:42 crc kubenswrapper[4895]: I1202 07:37:42.571329 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ddcbf4b8-5804-4136-8554-6a307825a6ed-service-ca\") pod \"ddcbf4b8-5804-4136-8554-6a307825a6ed\" (UID: \"ddcbf4b8-5804-4136-8554-6a307825a6ed\") " Dec 02 07:37:42 crc kubenswrapper[4895]: I1202 07:37:42.571367 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ddcbf4b8-5804-4136-8554-6a307825a6ed-console-oauth-config\") pod \"ddcbf4b8-5804-4136-8554-6a307825a6ed\" (UID: \"ddcbf4b8-5804-4136-8554-6a307825a6ed\") " Dec 02 07:37:42 crc kubenswrapper[4895]: I1202 07:37:42.571417 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ddcbf4b8-5804-4136-8554-6a307825a6ed-oauth-serving-cert\") pod \"ddcbf4b8-5804-4136-8554-6a307825a6ed\" (UID: \"ddcbf4b8-5804-4136-8554-6a307825a6ed\") " Dec 02 07:37:42 crc kubenswrapper[4895]: I1202 07:37:42.571455 4895 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ddcbf4b8-5804-4136-8554-6a307825a6ed-console-config\") pod \"ddcbf4b8-5804-4136-8554-6a307825a6ed\" (UID: \"ddcbf4b8-5804-4136-8554-6a307825a6ed\") " Dec 02 07:37:42 crc kubenswrapper[4895]: I1202 07:37:42.571483 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ddcbf4b8-5804-4136-8554-6a307825a6ed-trusted-ca-bundle\") pod \"ddcbf4b8-5804-4136-8554-6a307825a6ed\" (UID: \"ddcbf4b8-5804-4136-8554-6a307825a6ed\") " Dec 02 07:37:42 crc kubenswrapper[4895]: I1202 07:37:42.572471 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ddcbf4b8-5804-4136-8554-6a307825a6ed-service-ca" (OuterVolumeSpecName: "service-ca") pod "ddcbf4b8-5804-4136-8554-6a307825a6ed" (UID: "ddcbf4b8-5804-4136-8554-6a307825a6ed"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:37:42 crc kubenswrapper[4895]: I1202 07:37:42.572491 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ddcbf4b8-5804-4136-8554-6a307825a6ed-console-config" (OuterVolumeSpecName: "console-config") pod "ddcbf4b8-5804-4136-8554-6a307825a6ed" (UID: "ddcbf4b8-5804-4136-8554-6a307825a6ed"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:37:42 crc kubenswrapper[4895]: I1202 07:37:42.572517 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ddcbf4b8-5804-4136-8554-6a307825a6ed-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "ddcbf4b8-5804-4136-8554-6a307825a6ed" (UID: "ddcbf4b8-5804-4136-8554-6a307825a6ed"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:37:42 crc kubenswrapper[4895]: I1202 07:37:42.573012 4895 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ddcbf4b8-5804-4136-8554-6a307825a6ed-console-config\") on node \"crc\" DevicePath \"\"" Dec 02 07:37:42 crc kubenswrapper[4895]: I1202 07:37:42.573031 4895 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ddcbf4b8-5804-4136-8554-6a307825a6ed-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 07:37:42 crc kubenswrapper[4895]: I1202 07:37:42.573044 4895 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ddcbf4b8-5804-4136-8554-6a307825a6ed-service-ca\") on node \"crc\" DevicePath \"\"" Dec 02 07:37:42 crc kubenswrapper[4895]: I1202 07:37:42.573107 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ddcbf4b8-5804-4136-8554-6a307825a6ed-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "ddcbf4b8-5804-4136-8554-6a307825a6ed" (UID: "ddcbf4b8-5804-4136-8554-6a307825a6ed"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:37:42 crc kubenswrapper[4895]: I1202 07:37:42.576636 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddcbf4b8-5804-4136-8554-6a307825a6ed-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "ddcbf4b8-5804-4136-8554-6a307825a6ed" (UID: "ddcbf4b8-5804-4136-8554-6a307825a6ed"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:37:42 crc kubenswrapper[4895]: I1202 07:37:42.576901 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ddcbf4b8-5804-4136-8554-6a307825a6ed-kube-api-access-66c26" (OuterVolumeSpecName: "kube-api-access-66c26") pod "ddcbf4b8-5804-4136-8554-6a307825a6ed" (UID: "ddcbf4b8-5804-4136-8554-6a307825a6ed"). InnerVolumeSpecName "kube-api-access-66c26". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:37:42 crc kubenswrapper[4895]: I1202 07:37:42.578930 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddcbf4b8-5804-4136-8554-6a307825a6ed-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "ddcbf4b8-5804-4136-8554-6a307825a6ed" (UID: "ddcbf4b8-5804-4136-8554-6a307825a6ed"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:37:42 crc kubenswrapper[4895]: I1202 07:37:42.674353 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-66c26\" (UniqueName: \"kubernetes.io/projected/ddcbf4b8-5804-4136-8554-6a307825a6ed-kube-api-access-66c26\") on node \"crc\" DevicePath \"\"" Dec 02 07:37:42 crc kubenswrapper[4895]: I1202 07:37:42.674381 4895 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ddcbf4b8-5804-4136-8554-6a307825a6ed-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 07:37:42 crc kubenswrapper[4895]: I1202 07:37:42.674390 4895 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ddcbf4b8-5804-4136-8554-6a307825a6ed-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 02 07:37:42 crc kubenswrapper[4895]: I1202 07:37:42.674434 4895 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/ddcbf4b8-5804-4136-8554-6a307825a6ed-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 07:37:42 crc kubenswrapper[4895]: I1202 07:37:42.729810 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-q7mhm_ddcbf4b8-5804-4136-8554-6a307825a6ed/console/0.log" Dec 02 07:37:42 crc kubenswrapper[4895]: I1202 07:37:42.729863 4895 generic.go:334] "Generic (PLEG): container finished" podID="ddcbf4b8-5804-4136-8554-6a307825a6ed" containerID="98a239b08ccb243a25b395e480fcdad1c50fd6ccca28a9661e3ba9f2b3c65d42" exitCode=2 Dec 02 07:37:42 crc kubenswrapper[4895]: I1202 07:37:42.729932 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-q7mhm" Dec 02 07:37:42 crc kubenswrapper[4895]: I1202 07:37:42.729936 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-q7mhm" event={"ID":"ddcbf4b8-5804-4136-8554-6a307825a6ed","Type":"ContainerDied","Data":"98a239b08ccb243a25b395e480fcdad1c50fd6ccca28a9661e3ba9f2b3c65d42"} Dec 02 07:37:42 crc kubenswrapper[4895]: I1202 07:37:42.729969 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-q7mhm" event={"ID":"ddcbf4b8-5804-4136-8554-6a307825a6ed","Type":"ContainerDied","Data":"90a8378c6c7b223b8b65bf85ecfd66e0fd00d4110b86a8ed69270d49e396428e"} Dec 02 07:37:42 crc kubenswrapper[4895]: I1202 07:37:42.729987 4895 scope.go:117] "RemoveContainer" containerID="98a239b08ccb243a25b395e480fcdad1c50fd6ccca28a9661e3ba9f2b3c65d42" Dec 02 07:37:42 crc kubenswrapper[4895]: I1202 07:37:42.732937 4895 generic.go:334] "Generic (PLEG): container finished" podID="4ffd4159-d58c-4f5b-aa31-bb4e81790c51" containerID="8c1955c803927efc410967dfade9c213c71848b24fc1a073775e087f88755c0d" exitCode=0 Dec 02 07:37:42 crc kubenswrapper[4895]: I1202 07:37:42.732985 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ncvhw" event={"ID":"4ffd4159-d58c-4f5b-aa31-bb4e81790c51","Type":"ContainerDied","Data":"8c1955c803927efc410967dfade9c213c71848b24fc1a073775e087f88755c0d"} Dec 02 07:37:42 crc kubenswrapper[4895]: I1202 07:37:42.733015 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ncvhw" event={"ID":"4ffd4159-d58c-4f5b-aa31-bb4e81790c51","Type":"ContainerStarted","Data":"f548cf56214114940283f30036cbfcd792cc4e0c7be13c6a46b6b8639c4e4198"} Dec 02 07:37:42 crc kubenswrapper[4895]: I1202 07:37:42.767802 4895 scope.go:117] "RemoveContainer" containerID="98a239b08ccb243a25b395e480fcdad1c50fd6ccca28a9661e3ba9f2b3c65d42" Dec 02 07:37:42 crc kubenswrapper[4895]: E1202 07:37:42.768319 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"98a239b08ccb243a25b395e480fcdad1c50fd6ccca28a9661e3ba9f2b3c65d42\": container with ID starting with 98a239b08ccb243a25b395e480fcdad1c50fd6ccca28a9661e3ba9f2b3c65d42 not found: ID does not exist" containerID="98a239b08ccb243a25b395e480fcdad1c50fd6ccca28a9661e3ba9f2b3c65d42" Dec 02 07:37:42 crc kubenswrapper[4895]: I1202 07:37:42.768373 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98a239b08ccb243a25b395e480fcdad1c50fd6ccca28a9661e3ba9f2b3c65d42"} err="failed to get container status \"98a239b08ccb243a25b395e480fcdad1c50fd6ccca28a9661e3ba9f2b3c65d42\": rpc error: code = NotFound desc = could not find container \"98a239b08ccb243a25b395e480fcdad1c50fd6ccca28a9661e3ba9f2b3c65d42\": container with ID starting with 98a239b08ccb243a25b395e480fcdad1c50fd6ccca28a9661e3ba9f2b3c65d42 not found: ID does not exist" Dec 02 07:37:42 crc kubenswrapper[4895]: I1202 07:37:42.770920 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-console/console-f9d7485db-q7mhm"] Dec 02 07:37:42 crc kubenswrapper[4895]: I1202 07:37:42.780116 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-q7mhm"] Dec 02 07:37:43 crc kubenswrapper[4895]: I1202 07:37:43.155583 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ddcbf4b8-5804-4136-8554-6a307825a6ed" path="/var/lib/kubelet/pods/ddcbf4b8-5804-4136-8554-6a307825a6ed/volumes" Dec 02 07:37:43 crc kubenswrapper[4895]: I1202 07:37:43.402595 4895 patch_prober.go:28] interesting pod/console-f9d7485db-q7mhm container/console namespace/openshift-console: Readiness probe status=failure output="Get \"https://10.217.0.8:8443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 02 07:37:43 crc kubenswrapper[4895]: I1202 07:37:43.402680 4895 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/console-f9d7485db-q7mhm" podUID="ddcbf4b8-5804-4136-8554-6a307825a6ed" containerName="console" probeResult="failure" output="Get \"https://10.217.0.8:8443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 02 07:37:44 crc kubenswrapper[4895]: I1202 07:37:44.752267 4895 generic.go:334] "Generic (PLEG): container finished" podID="4ffd4159-d58c-4f5b-aa31-bb4e81790c51" containerID="41f77c3e9840472d019526e72113615068d6210a25c714c69becddbd1082e8f6" exitCode=0 Dec 02 07:37:44 crc kubenswrapper[4895]: I1202 07:37:44.752406 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ncvhw" event={"ID":"4ffd4159-d58c-4f5b-aa31-bb4e81790c51","Type":"ContainerDied","Data":"41f77c3e9840472d019526e72113615068d6210a25c714c69becddbd1082e8f6"} Dec 02 07:37:45 crc kubenswrapper[4895]: I1202 07:37:45.760649 4895 generic.go:334] "Generic (PLEG): container finished" 
podID="4ffd4159-d58c-4f5b-aa31-bb4e81790c51" containerID="2c28328f57ae9e24944155f1159f30f6778cd1f0509e847d1466326b001792af" exitCode=0 Dec 02 07:37:45 crc kubenswrapper[4895]: I1202 07:37:45.760773 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ncvhw" event={"ID":"4ffd4159-d58c-4f5b-aa31-bb4e81790c51","Type":"ContainerDied","Data":"2c28328f57ae9e24944155f1159f30f6778cd1f0509e847d1466326b001792af"} Dec 02 07:37:46 crc kubenswrapper[4895]: I1202 07:37:46.251999 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-9vcfn" Dec 02 07:37:47 crc kubenswrapper[4895]: I1202 07:37:47.060696 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ncvhw" Dec 02 07:37:47 crc kubenswrapper[4895]: I1202 07:37:47.143849 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4ffd4159-d58c-4f5b-aa31-bb4e81790c51-bundle\") pod \"4ffd4159-d58c-4f5b-aa31-bb4e81790c51\" (UID: \"4ffd4159-d58c-4f5b-aa31-bb4e81790c51\") " Dec 02 07:37:47 crc kubenswrapper[4895]: I1202 07:37:47.143995 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9jgmh\" (UniqueName: \"kubernetes.io/projected/4ffd4159-d58c-4f5b-aa31-bb4e81790c51-kube-api-access-9jgmh\") pod \"4ffd4159-d58c-4f5b-aa31-bb4e81790c51\" (UID: \"4ffd4159-d58c-4f5b-aa31-bb4e81790c51\") " Dec 02 07:37:47 crc kubenswrapper[4895]: I1202 07:37:47.144115 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4ffd4159-d58c-4f5b-aa31-bb4e81790c51-util\") pod \"4ffd4159-d58c-4f5b-aa31-bb4e81790c51\" (UID: \"4ffd4159-d58c-4f5b-aa31-bb4e81790c51\") " Dec 02 07:37:47 crc 
kubenswrapper[4895]: I1202 07:37:47.145072 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ffd4159-d58c-4f5b-aa31-bb4e81790c51-bundle" (OuterVolumeSpecName: "bundle") pod "4ffd4159-d58c-4f5b-aa31-bb4e81790c51" (UID: "4ffd4159-d58c-4f5b-aa31-bb4e81790c51"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:37:47 crc kubenswrapper[4895]: I1202 07:37:47.151956 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ffd4159-d58c-4f5b-aa31-bb4e81790c51-kube-api-access-9jgmh" (OuterVolumeSpecName: "kube-api-access-9jgmh") pod "4ffd4159-d58c-4f5b-aa31-bb4e81790c51" (UID: "4ffd4159-d58c-4f5b-aa31-bb4e81790c51"). InnerVolumeSpecName "kube-api-access-9jgmh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:37:47 crc kubenswrapper[4895]: I1202 07:37:47.167500 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ffd4159-d58c-4f5b-aa31-bb4e81790c51-util" (OuterVolumeSpecName: "util") pod "4ffd4159-d58c-4f5b-aa31-bb4e81790c51" (UID: "4ffd4159-d58c-4f5b-aa31-bb4e81790c51"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:37:47 crc kubenswrapper[4895]: I1202 07:37:47.245888 4895 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4ffd4159-d58c-4f5b-aa31-bb4e81790c51-util\") on node \"crc\" DevicePath \"\"" Dec 02 07:37:47 crc kubenswrapper[4895]: I1202 07:37:47.245917 4895 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4ffd4159-d58c-4f5b-aa31-bb4e81790c51-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 07:37:47 crc kubenswrapper[4895]: I1202 07:37:47.245926 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9jgmh\" (UniqueName: \"kubernetes.io/projected/4ffd4159-d58c-4f5b-aa31-bb4e81790c51-kube-api-access-9jgmh\") on node \"crc\" DevicePath \"\"" Dec 02 07:37:47 crc kubenswrapper[4895]: I1202 07:37:47.775565 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ncvhw" event={"ID":"4ffd4159-d58c-4f5b-aa31-bb4e81790c51","Type":"ContainerDied","Data":"f548cf56214114940283f30036cbfcd792cc4e0c7be13c6a46b6b8639c4e4198"} Dec 02 07:37:47 crc kubenswrapper[4895]: I1202 07:37:47.775616 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f548cf56214114940283f30036cbfcd792cc4e0c7be13c6a46b6b8639c4e4198" Dec 02 07:37:47 crc kubenswrapper[4895]: I1202 07:37:47.775647 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ncvhw" Dec 02 07:37:50 crc kubenswrapper[4895]: I1202 07:37:50.624297 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9vcfn"] Dec 02 07:37:50 crc kubenswrapper[4895]: I1202 07:37:50.625566 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-9vcfn" podUID="f3af6b9c-e28a-4619-a26d-5c44adfbe129" containerName="registry-server" containerID="cri-o://c5932931272af7e62a7932037155c7aa7aded4a94b97ed205a0b455b3e45bcaa" gracePeriod=2 Dec 02 07:37:50 crc kubenswrapper[4895]: I1202 07:37:50.798496 4895 generic.go:334] "Generic (PLEG): container finished" podID="f3af6b9c-e28a-4619-a26d-5c44adfbe129" containerID="c5932931272af7e62a7932037155c7aa7aded4a94b97ed205a0b455b3e45bcaa" exitCode=0 Dec 02 07:37:50 crc kubenswrapper[4895]: I1202 07:37:50.798545 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9vcfn" event={"ID":"f3af6b9c-e28a-4619-a26d-5c44adfbe129","Type":"ContainerDied","Data":"c5932931272af7e62a7932037155c7aa7aded4a94b97ed205a0b455b3e45bcaa"} Dec 02 07:37:51 crc kubenswrapper[4895]: I1202 07:37:51.233351 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9vcfn" Dec 02 07:37:51 crc kubenswrapper[4895]: I1202 07:37:51.302308 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mld2s\" (UniqueName: \"kubernetes.io/projected/f3af6b9c-e28a-4619-a26d-5c44adfbe129-kube-api-access-mld2s\") pod \"f3af6b9c-e28a-4619-a26d-5c44adfbe129\" (UID: \"f3af6b9c-e28a-4619-a26d-5c44adfbe129\") " Dec 02 07:37:51 crc kubenswrapper[4895]: I1202 07:37:51.302440 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f3af6b9c-e28a-4619-a26d-5c44adfbe129-catalog-content\") pod \"f3af6b9c-e28a-4619-a26d-5c44adfbe129\" (UID: \"f3af6b9c-e28a-4619-a26d-5c44adfbe129\") " Dec 02 07:37:51 crc kubenswrapper[4895]: I1202 07:37:51.302468 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f3af6b9c-e28a-4619-a26d-5c44adfbe129-utilities\") pod \"f3af6b9c-e28a-4619-a26d-5c44adfbe129\" (UID: \"f3af6b9c-e28a-4619-a26d-5c44adfbe129\") " Dec 02 07:37:51 crc kubenswrapper[4895]: I1202 07:37:51.303581 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f3af6b9c-e28a-4619-a26d-5c44adfbe129-utilities" (OuterVolumeSpecName: "utilities") pod "f3af6b9c-e28a-4619-a26d-5c44adfbe129" (UID: "f3af6b9c-e28a-4619-a26d-5c44adfbe129"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:37:51 crc kubenswrapper[4895]: I1202 07:37:51.308466 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3af6b9c-e28a-4619-a26d-5c44adfbe129-kube-api-access-mld2s" (OuterVolumeSpecName: "kube-api-access-mld2s") pod "f3af6b9c-e28a-4619-a26d-5c44adfbe129" (UID: "f3af6b9c-e28a-4619-a26d-5c44adfbe129"). InnerVolumeSpecName "kube-api-access-mld2s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:37:51 crc kubenswrapper[4895]: I1202 07:37:51.350669 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f3af6b9c-e28a-4619-a26d-5c44adfbe129-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f3af6b9c-e28a-4619-a26d-5c44adfbe129" (UID: "f3af6b9c-e28a-4619-a26d-5c44adfbe129"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:37:51 crc kubenswrapper[4895]: I1202 07:37:51.404081 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mld2s\" (UniqueName: \"kubernetes.io/projected/f3af6b9c-e28a-4619-a26d-5c44adfbe129-kube-api-access-mld2s\") on node \"crc\" DevicePath \"\"" Dec 02 07:37:51 crc kubenswrapper[4895]: I1202 07:37:51.404137 4895 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f3af6b9c-e28a-4619-a26d-5c44adfbe129-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 07:37:51 crc kubenswrapper[4895]: I1202 07:37:51.404150 4895 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f3af6b9c-e28a-4619-a26d-5c44adfbe129-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 07:37:51 crc kubenswrapper[4895]: I1202 07:37:51.807366 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9vcfn" event={"ID":"f3af6b9c-e28a-4619-a26d-5c44adfbe129","Type":"ContainerDied","Data":"07c8d56092d4cb28bc1575b9aef37b621209df8ac6c576a468c38733039d38cb"} Dec 02 07:37:51 crc kubenswrapper[4895]: I1202 07:37:51.807441 4895 scope.go:117] "RemoveContainer" containerID="c5932931272af7e62a7932037155c7aa7aded4a94b97ed205a0b455b3e45bcaa" Dec 02 07:37:51 crc kubenswrapper[4895]: I1202 07:37:51.808722 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9vcfn" Dec 02 07:37:51 crc kubenswrapper[4895]: I1202 07:37:51.833763 4895 scope.go:117] "RemoveContainer" containerID="540942d7def0a20b1b3a54f29e6921de54aa634533281e325c8cb560460879c6" Dec 02 07:37:51 crc kubenswrapper[4895]: I1202 07:37:51.846102 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9vcfn"] Dec 02 07:37:51 crc kubenswrapper[4895]: I1202 07:37:51.849779 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-9vcfn"] Dec 02 07:37:51 crc kubenswrapper[4895]: I1202 07:37:51.856445 4895 scope.go:117] "RemoveContainer" containerID="6535b5a264f625d8925a6341592c8c201a1169b4c4119ab509a16adaf70cadcf" Dec 02 07:37:53 crc kubenswrapper[4895]: I1202 07:37:53.149878 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3af6b9c-e28a-4619-a26d-5c44adfbe129" path="/var/lib/kubelet/pods/f3af6b9c-e28a-4619-a26d-5c44adfbe129/volumes" Dec 02 07:37:55 crc kubenswrapper[4895]: I1202 07:37:55.946380 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-l4x5r"] Dec 02 07:37:55 crc kubenswrapper[4895]: E1202 07:37:55.947077 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3af6b9c-e28a-4619-a26d-5c44adfbe129" containerName="registry-server" Dec 02 07:37:55 crc kubenswrapper[4895]: I1202 07:37:55.947094 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3af6b9c-e28a-4619-a26d-5c44adfbe129" containerName="registry-server" Dec 02 07:37:55 crc kubenswrapper[4895]: E1202 07:37:55.947108 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ffd4159-d58c-4f5b-aa31-bb4e81790c51" containerName="util" Dec 02 07:37:55 crc kubenswrapper[4895]: I1202 07:37:55.947115 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ffd4159-d58c-4f5b-aa31-bb4e81790c51" containerName="util" Dec 02 07:37:55 crc 
kubenswrapper[4895]: E1202 07:37:55.947123 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ffd4159-d58c-4f5b-aa31-bb4e81790c51" containerName="pull" Dec 02 07:37:55 crc kubenswrapper[4895]: I1202 07:37:55.947130 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ffd4159-d58c-4f5b-aa31-bb4e81790c51" containerName="pull" Dec 02 07:37:55 crc kubenswrapper[4895]: E1202 07:37:55.947138 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ffd4159-d58c-4f5b-aa31-bb4e81790c51" containerName="extract" Dec 02 07:37:55 crc kubenswrapper[4895]: I1202 07:37:55.947146 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ffd4159-d58c-4f5b-aa31-bb4e81790c51" containerName="extract" Dec 02 07:37:55 crc kubenswrapper[4895]: E1202 07:37:55.947156 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3af6b9c-e28a-4619-a26d-5c44adfbe129" containerName="extract-content" Dec 02 07:37:55 crc kubenswrapper[4895]: I1202 07:37:55.947165 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3af6b9c-e28a-4619-a26d-5c44adfbe129" containerName="extract-content" Dec 02 07:37:55 crc kubenswrapper[4895]: E1202 07:37:55.947175 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddcbf4b8-5804-4136-8554-6a307825a6ed" containerName="console" Dec 02 07:37:55 crc kubenswrapper[4895]: I1202 07:37:55.947182 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddcbf4b8-5804-4136-8554-6a307825a6ed" containerName="console" Dec 02 07:37:55 crc kubenswrapper[4895]: E1202 07:37:55.947196 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3af6b9c-e28a-4619-a26d-5c44adfbe129" containerName="extract-utilities" Dec 02 07:37:55 crc kubenswrapper[4895]: I1202 07:37:55.947204 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3af6b9c-e28a-4619-a26d-5c44adfbe129" containerName="extract-utilities" Dec 02 07:37:55 crc kubenswrapper[4895]: I1202 07:37:55.947330 4895 
memory_manager.go:354] "RemoveStaleState removing state" podUID="f3af6b9c-e28a-4619-a26d-5c44adfbe129" containerName="registry-server" Dec 02 07:37:55 crc kubenswrapper[4895]: I1202 07:37:55.947350 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ffd4159-d58c-4f5b-aa31-bb4e81790c51" containerName="extract" Dec 02 07:37:55 crc kubenswrapper[4895]: I1202 07:37:55.947364 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="ddcbf4b8-5804-4136-8554-6a307825a6ed" containerName="console" Dec 02 07:37:55 crc kubenswrapper[4895]: I1202 07:37:55.948401 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l4x5r" Dec 02 07:37:55 crc kubenswrapper[4895]: I1202 07:37:55.963712 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-l4x5r"] Dec 02 07:37:56 crc kubenswrapper[4895]: I1202 07:37:56.071502 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e9bd6f8-45cf-4f28-88b6-4eb2a4c80408-utilities\") pod \"redhat-marketplace-l4x5r\" (UID: \"8e9bd6f8-45cf-4f28-88b6-4eb2a4c80408\") " pod="openshift-marketplace/redhat-marketplace-l4x5r" Dec 02 07:37:56 crc kubenswrapper[4895]: I1202 07:37:56.071580 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e9bd6f8-45cf-4f28-88b6-4eb2a4c80408-catalog-content\") pod \"redhat-marketplace-l4x5r\" (UID: \"8e9bd6f8-45cf-4f28-88b6-4eb2a4c80408\") " pod="openshift-marketplace/redhat-marketplace-l4x5r" Dec 02 07:37:56 crc kubenswrapper[4895]: I1202 07:37:56.071644 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5tkb\" (UniqueName: \"kubernetes.io/projected/8e9bd6f8-45cf-4f28-88b6-4eb2a4c80408-kube-api-access-j5tkb\") pod 
\"redhat-marketplace-l4x5r\" (UID: \"8e9bd6f8-45cf-4f28-88b6-4eb2a4c80408\") " pod="openshift-marketplace/redhat-marketplace-l4x5r" Dec 02 07:37:56 crc kubenswrapper[4895]: I1202 07:37:56.172467 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j5tkb\" (UniqueName: \"kubernetes.io/projected/8e9bd6f8-45cf-4f28-88b6-4eb2a4c80408-kube-api-access-j5tkb\") pod \"redhat-marketplace-l4x5r\" (UID: \"8e9bd6f8-45cf-4f28-88b6-4eb2a4c80408\") " pod="openshift-marketplace/redhat-marketplace-l4x5r" Dec 02 07:37:56 crc kubenswrapper[4895]: I1202 07:37:56.172566 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e9bd6f8-45cf-4f28-88b6-4eb2a4c80408-utilities\") pod \"redhat-marketplace-l4x5r\" (UID: \"8e9bd6f8-45cf-4f28-88b6-4eb2a4c80408\") " pod="openshift-marketplace/redhat-marketplace-l4x5r" Dec 02 07:37:56 crc kubenswrapper[4895]: I1202 07:37:56.172619 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e9bd6f8-45cf-4f28-88b6-4eb2a4c80408-catalog-content\") pod \"redhat-marketplace-l4x5r\" (UID: \"8e9bd6f8-45cf-4f28-88b6-4eb2a4c80408\") " pod="openshift-marketplace/redhat-marketplace-l4x5r" Dec 02 07:37:56 crc kubenswrapper[4895]: I1202 07:37:56.173090 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e9bd6f8-45cf-4f28-88b6-4eb2a4c80408-catalog-content\") pod \"redhat-marketplace-l4x5r\" (UID: \"8e9bd6f8-45cf-4f28-88b6-4eb2a4c80408\") " pod="openshift-marketplace/redhat-marketplace-l4x5r" Dec 02 07:37:56 crc kubenswrapper[4895]: I1202 07:37:56.173097 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e9bd6f8-45cf-4f28-88b6-4eb2a4c80408-utilities\") pod \"redhat-marketplace-l4x5r\" (UID: 
\"8e9bd6f8-45cf-4f28-88b6-4eb2a4c80408\") " pod="openshift-marketplace/redhat-marketplace-l4x5r" Dec 02 07:37:56 crc kubenswrapper[4895]: I1202 07:37:56.208801 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5tkb\" (UniqueName: \"kubernetes.io/projected/8e9bd6f8-45cf-4f28-88b6-4eb2a4c80408-kube-api-access-j5tkb\") pod \"redhat-marketplace-l4x5r\" (UID: \"8e9bd6f8-45cf-4f28-88b6-4eb2a4c80408\") " pod="openshift-marketplace/redhat-marketplace-l4x5r" Dec 02 07:37:56 crc kubenswrapper[4895]: I1202 07:37:56.265243 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l4x5r" Dec 02 07:37:56 crc kubenswrapper[4895]: I1202 07:37:56.407439 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-5649cff7-xm5vt"] Dec 02 07:37:56 crc kubenswrapper[4895]: I1202 07:37:56.408486 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-5649cff7-xm5vt" Dec 02 07:37:56 crc kubenswrapper[4895]: I1202 07:37:56.418135 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Dec 02 07:37:56 crc kubenswrapper[4895]: I1202 07:37:56.418409 4895 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Dec 02 07:37:56 crc kubenswrapper[4895]: I1202 07:37:56.418529 4895 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Dec 02 07:37:56 crc kubenswrapper[4895]: I1202 07:37:56.419021 4895 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-9knp4" Dec 02 07:37:56 crc kubenswrapper[4895]: I1202 07:37:56.419187 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" 
Dec 02 07:37:56 crc kubenswrapper[4895]: I1202 07:37:56.430091 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-5649cff7-xm5vt"] Dec 02 07:37:56 crc kubenswrapper[4895]: I1202 07:37:56.476613 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vc9qp\" (UniqueName: \"kubernetes.io/projected/50763e18-0c0b-4aff-97bc-0fb2fdce0b0b-kube-api-access-vc9qp\") pod \"metallb-operator-controller-manager-5649cff7-xm5vt\" (UID: \"50763e18-0c0b-4aff-97bc-0fb2fdce0b0b\") " pod="metallb-system/metallb-operator-controller-manager-5649cff7-xm5vt" Dec 02 07:37:56 crc kubenswrapper[4895]: I1202 07:37:56.476700 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/50763e18-0c0b-4aff-97bc-0fb2fdce0b0b-apiservice-cert\") pod \"metallb-operator-controller-manager-5649cff7-xm5vt\" (UID: \"50763e18-0c0b-4aff-97bc-0fb2fdce0b0b\") " pod="metallb-system/metallb-operator-controller-manager-5649cff7-xm5vt" Dec 02 07:37:56 crc kubenswrapper[4895]: I1202 07:37:56.476735 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/50763e18-0c0b-4aff-97bc-0fb2fdce0b0b-webhook-cert\") pod \"metallb-operator-controller-manager-5649cff7-xm5vt\" (UID: \"50763e18-0c0b-4aff-97bc-0fb2fdce0b0b\") " pod="metallb-system/metallb-operator-controller-manager-5649cff7-xm5vt" Dec 02 07:37:56 crc kubenswrapper[4895]: I1202 07:37:56.577775 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vc9qp\" (UniqueName: \"kubernetes.io/projected/50763e18-0c0b-4aff-97bc-0fb2fdce0b0b-kube-api-access-vc9qp\") pod \"metallb-operator-controller-manager-5649cff7-xm5vt\" (UID: \"50763e18-0c0b-4aff-97bc-0fb2fdce0b0b\") " 
pod="metallb-system/metallb-operator-controller-manager-5649cff7-xm5vt" Dec 02 07:37:56 crc kubenswrapper[4895]: I1202 07:37:56.577862 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/50763e18-0c0b-4aff-97bc-0fb2fdce0b0b-apiservice-cert\") pod \"metallb-operator-controller-manager-5649cff7-xm5vt\" (UID: \"50763e18-0c0b-4aff-97bc-0fb2fdce0b0b\") " pod="metallb-system/metallb-operator-controller-manager-5649cff7-xm5vt" Dec 02 07:37:56 crc kubenswrapper[4895]: I1202 07:37:56.577900 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/50763e18-0c0b-4aff-97bc-0fb2fdce0b0b-webhook-cert\") pod \"metallb-operator-controller-manager-5649cff7-xm5vt\" (UID: \"50763e18-0c0b-4aff-97bc-0fb2fdce0b0b\") " pod="metallb-system/metallb-operator-controller-manager-5649cff7-xm5vt" Dec 02 07:37:56 crc kubenswrapper[4895]: I1202 07:37:56.583836 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/50763e18-0c0b-4aff-97bc-0fb2fdce0b0b-apiservice-cert\") pod \"metallb-operator-controller-manager-5649cff7-xm5vt\" (UID: \"50763e18-0c0b-4aff-97bc-0fb2fdce0b0b\") " pod="metallb-system/metallb-operator-controller-manager-5649cff7-xm5vt" Dec 02 07:37:56 crc kubenswrapper[4895]: I1202 07:37:56.584068 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-l4x5r"] Dec 02 07:37:56 crc kubenswrapper[4895]: I1202 07:37:56.594136 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/50763e18-0c0b-4aff-97bc-0fb2fdce0b0b-webhook-cert\") pod \"metallb-operator-controller-manager-5649cff7-xm5vt\" (UID: \"50763e18-0c0b-4aff-97bc-0fb2fdce0b0b\") " pod="metallb-system/metallb-operator-controller-manager-5649cff7-xm5vt" Dec 02 07:37:56 crc 
kubenswrapper[4895]: I1202 07:37:56.606241 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vc9qp\" (UniqueName: \"kubernetes.io/projected/50763e18-0c0b-4aff-97bc-0fb2fdce0b0b-kube-api-access-vc9qp\") pod \"metallb-operator-controller-manager-5649cff7-xm5vt\" (UID: \"50763e18-0c0b-4aff-97bc-0fb2fdce0b0b\") " pod="metallb-system/metallb-operator-controller-manager-5649cff7-xm5vt" Dec 02 07:37:56 crc kubenswrapper[4895]: I1202 07:37:56.743783 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-5649cff7-xm5vt" Dec 02 07:37:56 crc kubenswrapper[4895]: I1202 07:37:56.760343 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-659846876d-j9nsg"] Dec 02 07:37:56 crc kubenswrapper[4895]: I1202 07:37:56.761779 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-659846876d-j9nsg" Dec 02 07:37:56 crc kubenswrapper[4895]: I1202 07:37:56.765524 4895 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Dec 02 07:37:56 crc kubenswrapper[4895]: I1202 07:37:56.765755 4895 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Dec 02 07:37:56 crc kubenswrapper[4895]: I1202 07:37:56.766432 4895 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-hbl5f" Dec 02 07:37:56 crc kubenswrapper[4895]: I1202 07:37:56.787453 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-659846876d-j9nsg"] Dec 02 07:37:56 crc kubenswrapper[4895]: I1202 07:37:56.840775 4895 generic.go:334] "Generic (PLEG): container finished" podID="8e9bd6f8-45cf-4f28-88b6-4eb2a4c80408" 
containerID="cd6ad219dde46dc3f9e5256ed51ce12678e0b678f5d479d6bfb31d885afd12ed" exitCode=0 Dec 02 07:37:56 crc kubenswrapper[4895]: I1202 07:37:56.840824 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l4x5r" event={"ID":"8e9bd6f8-45cf-4f28-88b6-4eb2a4c80408","Type":"ContainerDied","Data":"cd6ad219dde46dc3f9e5256ed51ce12678e0b678f5d479d6bfb31d885afd12ed"} Dec 02 07:37:56 crc kubenswrapper[4895]: I1202 07:37:56.840854 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l4x5r" event={"ID":"8e9bd6f8-45cf-4f28-88b6-4eb2a4c80408","Type":"ContainerStarted","Data":"c08b5573c2939eebfbaa944413ebacef7b7e8313174425269dc066d8c16fda08"} Dec 02 07:37:56 crc kubenswrapper[4895]: I1202 07:37:56.885465 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/bbabc224-1401-4730-a4d7-92caa322c81b-apiservice-cert\") pod \"metallb-operator-webhook-server-659846876d-j9nsg\" (UID: \"bbabc224-1401-4730-a4d7-92caa322c81b\") " pod="metallb-system/metallb-operator-webhook-server-659846876d-j9nsg" Dec 02 07:37:56 crc kubenswrapper[4895]: I1202 07:37:56.885583 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvgvm\" (UniqueName: \"kubernetes.io/projected/bbabc224-1401-4730-a4d7-92caa322c81b-kube-api-access-rvgvm\") pod \"metallb-operator-webhook-server-659846876d-j9nsg\" (UID: \"bbabc224-1401-4730-a4d7-92caa322c81b\") " pod="metallb-system/metallb-operator-webhook-server-659846876d-j9nsg" Dec 02 07:37:56 crc kubenswrapper[4895]: I1202 07:37:56.885622 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/bbabc224-1401-4730-a4d7-92caa322c81b-webhook-cert\") pod \"metallb-operator-webhook-server-659846876d-j9nsg\" (UID: 
\"bbabc224-1401-4730-a4d7-92caa322c81b\") " pod="metallb-system/metallb-operator-webhook-server-659846876d-j9nsg" Dec 02 07:37:56 crc kubenswrapper[4895]: I1202 07:37:56.986991 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/bbabc224-1401-4730-a4d7-92caa322c81b-webhook-cert\") pod \"metallb-operator-webhook-server-659846876d-j9nsg\" (UID: \"bbabc224-1401-4730-a4d7-92caa322c81b\") " pod="metallb-system/metallb-operator-webhook-server-659846876d-j9nsg" Dec 02 07:37:56 crc kubenswrapper[4895]: I1202 07:37:56.987052 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/bbabc224-1401-4730-a4d7-92caa322c81b-apiservice-cert\") pod \"metallb-operator-webhook-server-659846876d-j9nsg\" (UID: \"bbabc224-1401-4730-a4d7-92caa322c81b\") " pod="metallb-system/metallb-operator-webhook-server-659846876d-j9nsg" Dec 02 07:37:56 crc kubenswrapper[4895]: I1202 07:37:56.987195 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvgvm\" (UniqueName: \"kubernetes.io/projected/bbabc224-1401-4730-a4d7-92caa322c81b-kube-api-access-rvgvm\") pod \"metallb-operator-webhook-server-659846876d-j9nsg\" (UID: \"bbabc224-1401-4730-a4d7-92caa322c81b\") " pod="metallb-system/metallb-operator-webhook-server-659846876d-j9nsg" Dec 02 07:37:56 crc kubenswrapper[4895]: I1202 07:37:56.992970 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/bbabc224-1401-4730-a4d7-92caa322c81b-apiservice-cert\") pod \"metallb-operator-webhook-server-659846876d-j9nsg\" (UID: \"bbabc224-1401-4730-a4d7-92caa322c81b\") " pod="metallb-system/metallb-operator-webhook-server-659846876d-j9nsg" Dec 02 07:37:56 crc kubenswrapper[4895]: I1202 07:37:56.993339 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" 
(UniqueName: \"kubernetes.io/secret/bbabc224-1401-4730-a4d7-92caa322c81b-webhook-cert\") pod \"metallb-operator-webhook-server-659846876d-j9nsg\" (UID: \"bbabc224-1401-4730-a4d7-92caa322c81b\") " pod="metallb-system/metallb-operator-webhook-server-659846876d-j9nsg" Dec 02 07:37:57 crc kubenswrapper[4895]: I1202 07:37:57.008467 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvgvm\" (UniqueName: \"kubernetes.io/projected/bbabc224-1401-4730-a4d7-92caa322c81b-kube-api-access-rvgvm\") pod \"metallb-operator-webhook-server-659846876d-j9nsg\" (UID: \"bbabc224-1401-4730-a4d7-92caa322c81b\") " pod="metallb-system/metallb-operator-webhook-server-659846876d-j9nsg" Dec 02 07:37:57 crc kubenswrapper[4895]: I1202 07:37:57.069604 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-5649cff7-xm5vt"] Dec 02 07:37:57 crc kubenswrapper[4895]: W1202 07:37:57.073280 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod50763e18_0c0b_4aff_97bc_0fb2fdce0b0b.slice/crio-454f0298214a551377178c7965d0e5b4a8fc0018aeb2196faf7053f64838fde1 WatchSource:0}: Error finding container 454f0298214a551377178c7965d0e5b4a8fc0018aeb2196faf7053f64838fde1: Status 404 returned error can't find the container with id 454f0298214a551377178c7965d0e5b4a8fc0018aeb2196faf7053f64838fde1 Dec 02 07:37:57 crc kubenswrapper[4895]: I1202 07:37:57.074797 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-659846876d-j9nsg" Dec 02 07:37:57 crc kubenswrapper[4895]: W1202 07:37:57.309730 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbbabc224_1401_4730_a4d7_92caa322c81b.slice/crio-168b42c45f311b34a710532fd0df9ac8a72584adaa3d081b8aa19b482533c98b WatchSource:0}: Error finding container 168b42c45f311b34a710532fd0df9ac8a72584adaa3d081b8aa19b482533c98b: Status 404 returned error can't find the container with id 168b42c45f311b34a710532fd0df9ac8a72584adaa3d081b8aa19b482533c98b Dec 02 07:37:57 crc kubenswrapper[4895]: I1202 07:37:57.318048 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-659846876d-j9nsg"] Dec 02 07:37:57 crc kubenswrapper[4895]: I1202 07:37:57.849822 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-5649cff7-xm5vt" event={"ID":"50763e18-0c0b-4aff-97bc-0fb2fdce0b0b","Type":"ContainerStarted","Data":"454f0298214a551377178c7965d0e5b4a8fc0018aeb2196faf7053f64838fde1"} Dec 02 07:37:57 crc kubenswrapper[4895]: I1202 07:37:57.851098 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-659846876d-j9nsg" event={"ID":"bbabc224-1401-4730-a4d7-92caa322c81b","Type":"ContainerStarted","Data":"168b42c45f311b34a710532fd0df9ac8a72584adaa3d081b8aa19b482533c98b"} Dec 02 07:37:57 crc kubenswrapper[4895]: I1202 07:37:57.856330 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l4x5r" event={"ID":"8e9bd6f8-45cf-4f28-88b6-4eb2a4c80408","Type":"ContainerStarted","Data":"0922c9a7aff6c9d18c8f4c4702035652b61582d5eeee1261612db4edf322d3fc"} Dec 02 07:37:58 crc kubenswrapper[4895]: I1202 07:37:58.878014 4895 generic.go:334] "Generic (PLEG): container finished" podID="8e9bd6f8-45cf-4f28-88b6-4eb2a4c80408" 
containerID="0922c9a7aff6c9d18c8f4c4702035652b61582d5eeee1261612db4edf322d3fc" exitCode=0 Dec 02 07:37:58 crc kubenswrapper[4895]: I1202 07:37:58.878069 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l4x5r" event={"ID":"8e9bd6f8-45cf-4f28-88b6-4eb2a4c80408","Type":"ContainerDied","Data":"0922c9a7aff6c9d18c8f4c4702035652b61582d5eeee1261612db4edf322d3fc"} Dec 02 07:38:05 crc kubenswrapper[4895]: I1202 07:38:05.199470 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-5649cff7-xm5vt" event={"ID":"50763e18-0c0b-4aff-97bc-0fb2fdce0b0b","Type":"ContainerStarted","Data":"f5367ccb58b1fc778cd2312ad71626df09a890dcfa1e6b7cf49853d852ef5c21"} Dec 02 07:38:05 crc kubenswrapper[4895]: I1202 07:38:05.200000 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-5649cff7-xm5vt" Dec 02 07:38:05 crc kubenswrapper[4895]: I1202 07:38:05.201231 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-659846876d-j9nsg" event={"ID":"bbabc224-1401-4730-a4d7-92caa322c81b","Type":"ContainerStarted","Data":"493d4cd438ac2b86d7ab21afa123a261bf36531fab8aec9257caacdd64cef80e"} Dec 02 07:38:05 crc kubenswrapper[4895]: I1202 07:38:05.201417 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-659846876d-j9nsg" Dec 02 07:38:05 crc kubenswrapper[4895]: I1202 07:38:05.203353 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l4x5r" event={"ID":"8e9bd6f8-45cf-4f28-88b6-4eb2a4c80408","Type":"ContainerStarted","Data":"8255188eca0fe8f8221b8e04def0b311a4110b363fb6c54dd879de484e75776b"} Dec 02 07:38:05 crc kubenswrapper[4895]: I1202 07:38:05.222980 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="metallb-system/metallb-operator-controller-manager-5649cff7-xm5vt" podStartSLOduration=1.876983385 podStartE2EDuration="9.222957227s" podCreationTimestamp="2025-12-02 07:37:56 +0000 UTC" firstStartedPulling="2025-12-02 07:37:57.076639263 +0000 UTC m=+888.247498876" lastFinishedPulling="2025-12-02 07:38:04.422613065 +0000 UTC m=+895.593472718" observedRunningTime="2025-12-02 07:38:05.221028927 +0000 UTC m=+896.391888540" watchObservedRunningTime="2025-12-02 07:38:05.222957227 +0000 UTC m=+896.393816840" Dec 02 07:38:05 crc kubenswrapper[4895]: I1202 07:38:05.248396 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-l4x5r" podStartSLOduration=2.67083308 podStartE2EDuration="10.248358825s" podCreationTimestamp="2025-12-02 07:37:55 +0000 UTC" firstStartedPulling="2025-12-02 07:37:56.842821169 +0000 UTC m=+888.013680782" lastFinishedPulling="2025-12-02 07:38:04.420346914 +0000 UTC m=+895.591206527" observedRunningTime="2025-12-02 07:38:05.2450062 +0000 UTC m=+896.415865813" watchObservedRunningTime="2025-12-02 07:38:05.248358825 +0000 UTC m=+896.419218438" Dec 02 07:38:05 crc kubenswrapper[4895]: I1202 07:38:05.267147 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-659846876d-j9nsg" podStartSLOduration=2.146869588 podStartE2EDuration="9.267128357s" podCreationTimestamp="2025-12-02 07:37:56 +0000 UTC" firstStartedPulling="2025-12-02 07:37:57.313324606 +0000 UTC m=+888.484184219" lastFinishedPulling="2025-12-02 07:38:04.433583375 +0000 UTC m=+895.604442988" observedRunningTime="2025-12-02 07:38:05.262353699 +0000 UTC m=+896.433213332" watchObservedRunningTime="2025-12-02 07:38:05.267128357 +0000 UTC m=+896.437987970" Dec 02 07:38:05 crc kubenswrapper[4895]: I1202 07:38:05.473611 4895 patch_prober.go:28] interesting pod/machine-config-daemon-wfcg7 container/machine-config-daemon namespace/openshift-machine-config-operator: 
Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 07:38:05 crc kubenswrapper[4895]: I1202 07:38:05.473682 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 07:38:06 crc kubenswrapper[4895]: I1202 07:38:06.266142 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-l4x5r" Dec 02 07:38:06 crc kubenswrapper[4895]: I1202 07:38:06.266211 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-l4x5r" Dec 02 07:38:06 crc kubenswrapper[4895]: I1202 07:38:06.403297 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-l4x5r" Dec 02 07:38:16 crc kubenswrapper[4895]: I1202 07:38:16.314323 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-l4x5r" Dec 02 07:38:16 crc kubenswrapper[4895]: I1202 07:38:16.711319 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-l4x5r"] Dec 02 07:38:17 crc kubenswrapper[4895]: I1202 07:38:17.084489 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-659846876d-j9nsg" Dec 02 07:38:17 crc kubenswrapper[4895]: I1202 07:38:17.286445 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-l4x5r" podUID="8e9bd6f8-45cf-4f28-88b6-4eb2a4c80408" containerName="registry-server" 
containerID="cri-o://8255188eca0fe8f8221b8e04def0b311a4110b363fb6c54dd879de484e75776b" gracePeriod=2 Dec 02 07:38:18 crc kubenswrapper[4895]: I1202 07:38:18.220613 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l4x5r" Dec 02 07:38:18 crc kubenswrapper[4895]: I1202 07:38:18.299489 4895 generic.go:334] "Generic (PLEG): container finished" podID="8e9bd6f8-45cf-4f28-88b6-4eb2a4c80408" containerID="8255188eca0fe8f8221b8e04def0b311a4110b363fb6c54dd879de484e75776b" exitCode=0 Dec 02 07:38:18 crc kubenswrapper[4895]: I1202 07:38:18.299534 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l4x5r" event={"ID":"8e9bd6f8-45cf-4f28-88b6-4eb2a4c80408","Type":"ContainerDied","Data":"8255188eca0fe8f8221b8e04def0b311a4110b363fb6c54dd879de484e75776b"} Dec 02 07:38:18 crc kubenswrapper[4895]: I1202 07:38:18.299572 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l4x5r" event={"ID":"8e9bd6f8-45cf-4f28-88b6-4eb2a4c80408","Type":"ContainerDied","Data":"c08b5573c2939eebfbaa944413ebacef7b7e8313174425269dc066d8c16fda08"} Dec 02 07:38:18 crc kubenswrapper[4895]: I1202 07:38:18.299593 4895 scope.go:117] "RemoveContainer" containerID="8255188eca0fe8f8221b8e04def0b311a4110b363fb6c54dd879de484e75776b" Dec 02 07:38:18 crc kubenswrapper[4895]: I1202 07:38:18.299632 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l4x5r" Dec 02 07:38:18 crc kubenswrapper[4895]: I1202 07:38:18.325200 4895 scope.go:117] "RemoveContainer" containerID="0922c9a7aff6c9d18c8f4c4702035652b61582d5eeee1261612db4edf322d3fc" Dec 02 07:38:18 crc kubenswrapper[4895]: I1202 07:38:18.339627 4895 scope.go:117] "RemoveContainer" containerID="cd6ad219dde46dc3f9e5256ed51ce12678e0b678f5d479d6bfb31d885afd12ed" Dec 02 07:38:18 crc kubenswrapper[4895]: I1202 07:38:18.358989 4895 scope.go:117] "RemoveContainer" containerID="8255188eca0fe8f8221b8e04def0b311a4110b363fb6c54dd879de484e75776b" Dec 02 07:38:18 crc kubenswrapper[4895]: E1202 07:38:18.359513 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8255188eca0fe8f8221b8e04def0b311a4110b363fb6c54dd879de484e75776b\": container with ID starting with 8255188eca0fe8f8221b8e04def0b311a4110b363fb6c54dd879de484e75776b not found: ID does not exist" containerID="8255188eca0fe8f8221b8e04def0b311a4110b363fb6c54dd879de484e75776b" Dec 02 07:38:18 crc kubenswrapper[4895]: I1202 07:38:18.359579 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8255188eca0fe8f8221b8e04def0b311a4110b363fb6c54dd879de484e75776b"} err="failed to get container status \"8255188eca0fe8f8221b8e04def0b311a4110b363fb6c54dd879de484e75776b\": rpc error: code = NotFound desc = could not find container \"8255188eca0fe8f8221b8e04def0b311a4110b363fb6c54dd879de484e75776b\": container with ID starting with 8255188eca0fe8f8221b8e04def0b311a4110b363fb6c54dd879de484e75776b not found: ID does not exist" Dec 02 07:38:18 crc kubenswrapper[4895]: I1202 07:38:18.359615 4895 scope.go:117] "RemoveContainer" containerID="0922c9a7aff6c9d18c8f4c4702035652b61582d5eeee1261612db4edf322d3fc" Dec 02 07:38:18 crc kubenswrapper[4895]: E1202 07:38:18.359948 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"0922c9a7aff6c9d18c8f4c4702035652b61582d5eeee1261612db4edf322d3fc\": container with ID starting with 0922c9a7aff6c9d18c8f4c4702035652b61582d5eeee1261612db4edf322d3fc not found: ID does not exist" containerID="0922c9a7aff6c9d18c8f4c4702035652b61582d5eeee1261612db4edf322d3fc" Dec 02 07:38:18 crc kubenswrapper[4895]: I1202 07:38:18.359979 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0922c9a7aff6c9d18c8f4c4702035652b61582d5eeee1261612db4edf322d3fc"} err="failed to get container status \"0922c9a7aff6c9d18c8f4c4702035652b61582d5eeee1261612db4edf322d3fc\": rpc error: code = NotFound desc = could not find container \"0922c9a7aff6c9d18c8f4c4702035652b61582d5eeee1261612db4edf322d3fc\": container with ID starting with 0922c9a7aff6c9d18c8f4c4702035652b61582d5eeee1261612db4edf322d3fc not found: ID does not exist" Dec 02 07:38:18 crc kubenswrapper[4895]: I1202 07:38:18.360004 4895 scope.go:117] "RemoveContainer" containerID="cd6ad219dde46dc3f9e5256ed51ce12678e0b678f5d479d6bfb31d885afd12ed" Dec 02 07:38:18 crc kubenswrapper[4895]: E1202 07:38:18.360223 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd6ad219dde46dc3f9e5256ed51ce12678e0b678f5d479d6bfb31d885afd12ed\": container with ID starting with cd6ad219dde46dc3f9e5256ed51ce12678e0b678f5d479d6bfb31d885afd12ed not found: ID does not exist" containerID="cd6ad219dde46dc3f9e5256ed51ce12678e0b678f5d479d6bfb31d885afd12ed" Dec 02 07:38:18 crc kubenswrapper[4895]: I1202 07:38:18.360246 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd6ad219dde46dc3f9e5256ed51ce12678e0b678f5d479d6bfb31d885afd12ed"} err="failed to get container status \"cd6ad219dde46dc3f9e5256ed51ce12678e0b678f5d479d6bfb31d885afd12ed\": rpc error: code = NotFound desc = could not find container 
\"cd6ad219dde46dc3f9e5256ed51ce12678e0b678f5d479d6bfb31d885afd12ed\": container with ID starting with cd6ad219dde46dc3f9e5256ed51ce12678e0b678f5d479d6bfb31d885afd12ed not found: ID does not exist" Dec 02 07:38:18 crc kubenswrapper[4895]: I1202 07:38:18.375969 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e9bd6f8-45cf-4f28-88b6-4eb2a4c80408-catalog-content\") pod \"8e9bd6f8-45cf-4f28-88b6-4eb2a4c80408\" (UID: \"8e9bd6f8-45cf-4f28-88b6-4eb2a4c80408\") " Dec 02 07:38:18 crc kubenswrapper[4895]: I1202 07:38:18.376052 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j5tkb\" (UniqueName: \"kubernetes.io/projected/8e9bd6f8-45cf-4f28-88b6-4eb2a4c80408-kube-api-access-j5tkb\") pod \"8e9bd6f8-45cf-4f28-88b6-4eb2a4c80408\" (UID: \"8e9bd6f8-45cf-4f28-88b6-4eb2a4c80408\") " Dec 02 07:38:18 crc kubenswrapper[4895]: I1202 07:38:18.376130 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e9bd6f8-45cf-4f28-88b6-4eb2a4c80408-utilities\") pod \"8e9bd6f8-45cf-4f28-88b6-4eb2a4c80408\" (UID: \"8e9bd6f8-45cf-4f28-88b6-4eb2a4c80408\") " Dec 02 07:38:18 crc kubenswrapper[4895]: I1202 07:38:18.377213 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e9bd6f8-45cf-4f28-88b6-4eb2a4c80408-utilities" (OuterVolumeSpecName: "utilities") pod "8e9bd6f8-45cf-4f28-88b6-4eb2a4c80408" (UID: "8e9bd6f8-45cf-4f28-88b6-4eb2a4c80408"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:38:18 crc kubenswrapper[4895]: I1202 07:38:18.381727 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e9bd6f8-45cf-4f28-88b6-4eb2a4c80408-kube-api-access-j5tkb" (OuterVolumeSpecName: "kube-api-access-j5tkb") pod "8e9bd6f8-45cf-4f28-88b6-4eb2a4c80408" (UID: "8e9bd6f8-45cf-4f28-88b6-4eb2a4c80408"). InnerVolumeSpecName "kube-api-access-j5tkb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:38:18 crc kubenswrapper[4895]: I1202 07:38:18.393865 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e9bd6f8-45cf-4f28-88b6-4eb2a4c80408-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8e9bd6f8-45cf-4f28-88b6-4eb2a4c80408" (UID: "8e9bd6f8-45cf-4f28-88b6-4eb2a4c80408"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:38:18 crc kubenswrapper[4895]: I1202 07:38:18.477887 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j5tkb\" (UniqueName: \"kubernetes.io/projected/8e9bd6f8-45cf-4f28-88b6-4eb2a4c80408-kube-api-access-j5tkb\") on node \"crc\" DevicePath \"\"" Dec 02 07:38:18 crc kubenswrapper[4895]: I1202 07:38:18.477932 4895 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e9bd6f8-45cf-4f28-88b6-4eb2a4c80408-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 07:38:18 crc kubenswrapper[4895]: I1202 07:38:18.477944 4895 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e9bd6f8-45cf-4f28-88b6-4eb2a4c80408-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 07:38:18 crc kubenswrapper[4895]: I1202 07:38:18.636762 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-l4x5r"] Dec 02 07:38:18 crc kubenswrapper[4895]: I1202 
07:38:18.642856 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-l4x5r"] Dec 02 07:38:19 crc kubenswrapper[4895]: I1202 07:38:19.148878 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e9bd6f8-45cf-4f28-88b6-4eb2a4c80408" path="/var/lib/kubelet/pods/8e9bd6f8-45cf-4f28-88b6-4eb2a4c80408/volumes" Dec 02 07:38:35 crc kubenswrapper[4895]: I1202 07:38:35.473531 4895 patch_prober.go:28] interesting pod/machine-config-daemon-wfcg7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 07:38:35 crc kubenswrapper[4895]: I1202 07:38:35.474097 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 07:38:36 crc kubenswrapper[4895]: I1202 07:38:36.746883 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-5649cff7-xm5vt" Dec 02 07:38:37 crc kubenswrapper[4895]: I1202 07:38:37.434902 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-hjl4p"] Dec 02 07:38:37 crc kubenswrapper[4895]: E1202 07:38:37.435548 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e9bd6f8-45cf-4f28-88b6-4eb2a4c80408" containerName="registry-server" Dec 02 07:38:37 crc kubenswrapper[4895]: I1202 07:38:37.435573 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e9bd6f8-45cf-4f28-88b6-4eb2a4c80408" containerName="registry-server" Dec 02 07:38:37 crc kubenswrapper[4895]: E1202 07:38:37.435592 4895 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="8e9bd6f8-45cf-4f28-88b6-4eb2a4c80408" containerName="extract-content" Dec 02 07:38:37 crc kubenswrapper[4895]: I1202 07:38:37.435600 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e9bd6f8-45cf-4f28-88b6-4eb2a4c80408" containerName="extract-content" Dec 02 07:38:37 crc kubenswrapper[4895]: E1202 07:38:37.435612 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e9bd6f8-45cf-4f28-88b6-4eb2a4c80408" containerName="extract-utilities" Dec 02 07:38:37 crc kubenswrapper[4895]: I1202 07:38:37.435620 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e9bd6f8-45cf-4f28-88b6-4eb2a4c80408" containerName="extract-utilities" Dec 02 07:38:37 crc kubenswrapper[4895]: I1202 07:38:37.435794 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e9bd6f8-45cf-4f28-88b6-4eb2a4c80408" containerName="registry-server" Dec 02 07:38:37 crc kubenswrapper[4895]: I1202 07:38:37.441579 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-h27ch"] Dec 02 07:38:37 crc kubenswrapper[4895]: I1202 07:38:37.442294 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-h27ch" Dec 02 07:38:37 crc kubenswrapper[4895]: I1202 07:38:37.442942 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-hjl4p" Dec 02 07:38:37 crc kubenswrapper[4895]: I1202 07:38:37.448093 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Dec 02 07:38:37 crc kubenswrapper[4895]: I1202 07:38:37.448654 4895 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Dec 02 07:38:37 crc kubenswrapper[4895]: I1202 07:38:37.448694 4895 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Dec 02 07:38:37 crc kubenswrapper[4895]: I1202 07:38:37.448930 4895 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-hxhf9" Dec 02 07:38:37 crc kubenswrapper[4895]: I1202 07:38:37.456510 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-h27ch"] Dec 02 07:38:37 crc kubenswrapper[4895]: I1202 07:38:37.519121 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-tgv9h"] Dec 02 07:38:37 crc kubenswrapper[4895]: I1202 07:38:37.520131 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-tgv9h" Dec 02 07:38:37 crc kubenswrapper[4895]: I1202 07:38:37.522831 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Dec 02 07:38:37 crc kubenswrapper[4895]: I1202 07:38:37.522894 4895 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-mf2km" Dec 02 07:38:37 crc kubenswrapper[4895]: I1202 07:38:37.523515 4895 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Dec 02 07:38:37 crc kubenswrapper[4895]: I1202 07:38:37.525157 4895 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Dec 02 07:38:37 crc kubenswrapper[4895]: I1202 07:38:37.576279 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-f8648f98b-mpkw4"] Dec 02 07:38:37 crc kubenswrapper[4895]: I1202 07:38:37.580553 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-f8648f98b-mpkw4" Dec 02 07:38:37 crc kubenswrapper[4895]: I1202 07:38:37.585369 4895 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Dec 02 07:38:37 crc kubenswrapper[4895]: I1202 07:38:37.587545 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-mpkw4"] Dec 02 07:38:37 crc kubenswrapper[4895]: I1202 07:38:37.610454 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bef27c67-d150-4004-bfbf-285c544f72f7-metrics-certs\") pod \"frr-k8s-hjl4p\" (UID: \"bef27c67-d150-4004-bfbf-285c544f72f7\") " pod="metallb-system/frr-k8s-hjl4p" Dec 02 07:38:37 crc kubenswrapper[4895]: I1202 07:38:37.610531 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/bef27c67-d150-4004-bfbf-285c544f72f7-frr-sockets\") pod \"frr-k8s-hjl4p\" (UID: \"bef27c67-d150-4004-bfbf-285c544f72f7\") " pod="metallb-system/frr-k8s-hjl4p" Dec 02 07:38:37 crc kubenswrapper[4895]: I1202 07:38:37.610561 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/bef27c67-d150-4004-bfbf-285c544f72f7-reloader\") pod \"frr-k8s-hjl4p\" (UID: \"bef27c67-d150-4004-bfbf-285c544f72f7\") " pod="metallb-system/frr-k8s-hjl4p" Dec 02 07:38:37 crc kubenswrapper[4895]: I1202 07:38:37.610605 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hh6dg\" (UniqueName: \"kubernetes.io/projected/bef27c67-d150-4004-bfbf-285c544f72f7-kube-api-access-hh6dg\") pod \"frr-k8s-hjl4p\" (UID: \"bef27c67-d150-4004-bfbf-285c544f72f7\") " pod="metallb-system/frr-k8s-hjl4p" Dec 02 07:38:37 crc kubenswrapper[4895]: I1202 07:38:37.611647 
4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/bef27c67-d150-4004-bfbf-285c544f72f7-metrics\") pod \"frr-k8s-hjl4p\" (UID: \"bef27c67-d150-4004-bfbf-285c544f72f7\") " pod="metallb-system/frr-k8s-hjl4p" Dec 02 07:38:37 crc kubenswrapper[4895]: I1202 07:38:37.611713 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfdtq\" (UniqueName: \"kubernetes.io/projected/d6b6bc7e-1d2f-4575-b0fc-a605dcfff0af-kube-api-access-gfdtq\") pod \"frr-k8s-webhook-server-7fcb986d4-h27ch\" (UID: \"d6b6bc7e-1d2f-4575-b0fc-a605dcfff0af\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-h27ch" Dec 02 07:38:37 crc kubenswrapper[4895]: I1202 07:38:37.611817 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/bef27c67-d150-4004-bfbf-285c544f72f7-frr-startup\") pod \"frr-k8s-hjl4p\" (UID: \"bef27c67-d150-4004-bfbf-285c544f72f7\") " pod="metallb-system/frr-k8s-hjl4p" Dec 02 07:38:37 crc kubenswrapper[4895]: I1202 07:38:37.611850 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/bef27c67-d150-4004-bfbf-285c544f72f7-frr-conf\") pod \"frr-k8s-hjl4p\" (UID: \"bef27c67-d150-4004-bfbf-285c544f72f7\") " pod="metallb-system/frr-k8s-hjl4p" Dec 02 07:38:37 crc kubenswrapper[4895]: I1202 07:38:37.611883 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d6b6bc7e-1d2f-4575-b0fc-a605dcfff0af-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-h27ch\" (UID: \"d6b6bc7e-1d2f-4575-b0fc-a605dcfff0af\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-h27ch" Dec 02 07:38:37 crc kubenswrapper[4895]: I1202 07:38:37.713019 4895 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/bef27c67-d150-4004-bfbf-285c544f72f7-frr-startup\") pod \"frr-k8s-hjl4p\" (UID: \"bef27c67-d150-4004-bfbf-285c544f72f7\") " pod="metallb-system/frr-k8s-hjl4p" Dec 02 07:38:37 crc kubenswrapper[4895]: I1202 07:38:37.713078 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/bef27c67-d150-4004-bfbf-285c544f72f7-frr-conf\") pod \"frr-k8s-hjl4p\" (UID: \"bef27c67-d150-4004-bfbf-285c544f72f7\") " pod="metallb-system/frr-k8s-hjl4p" Dec 02 07:38:37 crc kubenswrapper[4895]: I1202 07:38:37.713113 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d6b6bc7e-1d2f-4575-b0fc-a605dcfff0af-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-h27ch\" (UID: \"d6b6bc7e-1d2f-4575-b0fc-a605dcfff0af\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-h27ch" Dec 02 07:38:37 crc kubenswrapper[4895]: I1202 07:38:37.713151 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bef27c67-d150-4004-bfbf-285c544f72f7-metrics-certs\") pod \"frr-k8s-hjl4p\" (UID: \"bef27c67-d150-4004-bfbf-285c544f72f7\") " pod="metallb-system/frr-k8s-hjl4p" Dec 02 07:38:37 crc kubenswrapper[4895]: I1202 07:38:37.713190 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/1a93aadf-71d0-4a54-8eb1-fd710b164b07-metallb-excludel2\") pod \"speaker-tgv9h\" (UID: \"1a93aadf-71d0-4a54-8eb1-fd710b164b07\") " pod="metallb-system/speaker-tgv9h" Dec 02 07:38:37 crc kubenswrapper[4895]: I1202 07:38:37.713222 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: 
\"kubernetes.io/empty-dir/bef27c67-d150-4004-bfbf-285c544f72f7-frr-sockets\") pod \"frr-k8s-hjl4p\" (UID: \"bef27c67-d150-4004-bfbf-285c544f72f7\") " pod="metallb-system/frr-k8s-hjl4p" Dec 02 07:38:37 crc kubenswrapper[4895]: I1202 07:38:37.713250 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/bef27c67-d150-4004-bfbf-285c544f72f7-reloader\") pod \"frr-k8s-hjl4p\" (UID: \"bef27c67-d150-4004-bfbf-285c544f72f7\") " pod="metallb-system/frr-k8s-hjl4p" Dec 02 07:38:37 crc kubenswrapper[4895]: I1202 07:38:37.713281 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b2bd91a3-f8a0-4abe-9598-4977cc56daa1-metrics-certs\") pod \"controller-f8648f98b-mpkw4\" (UID: \"b2bd91a3-f8a0-4abe-9598-4977cc56daa1\") " pod="metallb-system/controller-f8648f98b-mpkw4" Dec 02 07:38:37 crc kubenswrapper[4895]: I1202 07:38:37.713307 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1a93aadf-71d0-4a54-8eb1-fd710b164b07-metrics-certs\") pod \"speaker-tgv9h\" (UID: \"1a93aadf-71d0-4a54-8eb1-fd710b164b07\") " pod="metallb-system/speaker-tgv9h" Dec 02 07:38:37 crc kubenswrapper[4895]: I1202 07:38:37.713334 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hh6dg\" (UniqueName: \"kubernetes.io/projected/bef27c67-d150-4004-bfbf-285c544f72f7-kube-api-access-hh6dg\") pod \"frr-k8s-hjl4p\" (UID: \"bef27c67-d150-4004-bfbf-285c544f72f7\") " pod="metallb-system/frr-k8s-hjl4p" Dec 02 07:38:37 crc kubenswrapper[4895]: I1202 07:38:37.713362 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/1a93aadf-71d0-4a54-8eb1-fd710b164b07-memberlist\") pod \"speaker-tgv9h\" (UID: 
\"1a93aadf-71d0-4a54-8eb1-fd710b164b07\") " pod="metallb-system/speaker-tgv9h" Dec 02 07:38:37 crc kubenswrapper[4895]: I1202 07:38:37.713386 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b2bd91a3-f8a0-4abe-9598-4977cc56daa1-cert\") pod \"controller-f8648f98b-mpkw4\" (UID: \"b2bd91a3-f8a0-4abe-9598-4977cc56daa1\") " pod="metallb-system/controller-f8648f98b-mpkw4" Dec 02 07:38:37 crc kubenswrapper[4895]: I1202 07:38:37.713412 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-msrg8\" (UniqueName: \"kubernetes.io/projected/1a93aadf-71d0-4a54-8eb1-fd710b164b07-kube-api-access-msrg8\") pod \"speaker-tgv9h\" (UID: \"1a93aadf-71d0-4a54-8eb1-fd710b164b07\") " pod="metallb-system/speaker-tgv9h" Dec 02 07:38:37 crc kubenswrapper[4895]: I1202 07:38:37.713434 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/bef27c67-d150-4004-bfbf-285c544f72f7-metrics\") pod \"frr-k8s-hjl4p\" (UID: \"bef27c67-d150-4004-bfbf-285c544f72f7\") " pod="metallb-system/frr-k8s-hjl4p" Dec 02 07:38:37 crc kubenswrapper[4895]: I1202 07:38:37.713456 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c59fd\" (UniqueName: \"kubernetes.io/projected/b2bd91a3-f8a0-4abe-9598-4977cc56daa1-kube-api-access-c59fd\") pod \"controller-f8648f98b-mpkw4\" (UID: \"b2bd91a3-f8a0-4abe-9598-4977cc56daa1\") " pod="metallb-system/controller-f8648f98b-mpkw4" Dec 02 07:38:37 crc kubenswrapper[4895]: I1202 07:38:37.713483 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gfdtq\" (UniqueName: \"kubernetes.io/projected/d6b6bc7e-1d2f-4575-b0fc-a605dcfff0af-kube-api-access-gfdtq\") pod \"frr-k8s-webhook-server-7fcb986d4-h27ch\" (UID: 
\"d6b6bc7e-1d2f-4575-b0fc-a605dcfff0af\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-h27ch" Dec 02 07:38:37 crc kubenswrapper[4895]: E1202 07:38:37.713649 4895 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found Dec 02 07:38:37 crc kubenswrapper[4895]: I1202 07:38:37.713757 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/bef27c67-d150-4004-bfbf-285c544f72f7-frr-conf\") pod \"frr-k8s-hjl4p\" (UID: \"bef27c67-d150-4004-bfbf-285c544f72f7\") " pod="metallb-system/frr-k8s-hjl4p" Dec 02 07:38:37 crc kubenswrapper[4895]: E1202 07:38:37.713778 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d6b6bc7e-1d2f-4575-b0fc-a605dcfff0af-cert podName:d6b6bc7e-1d2f-4575-b0fc-a605dcfff0af nodeName:}" failed. No retries permitted until 2025-12-02 07:38:38.213735047 +0000 UTC m=+929.384594660 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d6b6bc7e-1d2f-4575-b0fc-a605dcfff0af-cert") pod "frr-k8s-webhook-server-7fcb986d4-h27ch" (UID: "d6b6bc7e-1d2f-4575-b0fc-a605dcfff0af") : secret "frr-k8s-webhook-server-cert" not found Dec 02 07:38:37 crc kubenswrapper[4895]: I1202 07:38:37.714137 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/bef27c67-d150-4004-bfbf-285c544f72f7-metrics\") pod \"frr-k8s-hjl4p\" (UID: \"bef27c67-d150-4004-bfbf-285c544f72f7\") " pod="metallb-system/frr-k8s-hjl4p" Dec 02 07:38:37 crc kubenswrapper[4895]: I1202 07:38:37.714229 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/bef27c67-d150-4004-bfbf-285c544f72f7-reloader\") pod \"frr-k8s-hjl4p\" (UID: \"bef27c67-d150-4004-bfbf-285c544f72f7\") " pod="metallb-system/frr-k8s-hjl4p" Dec 02 07:38:37 crc kubenswrapper[4895]: I1202 07:38:37.714469 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/bef27c67-d150-4004-bfbf-285c544f72f7-frr-sockets\") pod \"frr-k8s-hjl4p\" (UID: \"bef27c67-d150-4004-bfbf-285c544f72f7\") " pod="metallb-system/frr-k8s-hjl4p" Dec 02 07:38:37 crc kubenswrapper[4895]: I1202 07:38:37.715453 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/bef27c67-d150-4004-bfbf-285c544f72f7-frr-startup\") pod \"frr-k8s-hjl4p\" (UID: \"bef27c67-d150-4004-bfbf-285c544f72f7\") " pod="metallb-system/frr-k8s-hjl4p" Dec 02 07:38:37 crc kubenswrapper[4895]: I1202 07:38:37.728512 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bef27c67-d150-4004-bfbf-285c544f72f7-metrics-certs\") pod \"frr-k8s-hjl4p\" (UID: \"bef27c67-d150-4004-bfbf-285c544f72f7\") " 
pod="metallb-system/frr-k8s-hjl4p" Dec 02 07:38:37 crc kubenswrapper[4895]: I1202 07:38:37.737247 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hh6dg\" (UniqueName: \"kubernetes.io/projected/bef27c67-d150-4004-bfbf-285c544f72f7-kube-api-access-hh6dg\") pod \"frr-k8s-hjl4p\" (UID: \"bef27c67-d150-4004-bfbf-285c544f72f7\") " pod="metallb-system/frr-k8s-hjl4p" Dec 02 07:38:37 crc kubenswrapper[4895]: I1202 07:38:37.738895 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfdtq\" (UniqueName: \"kubernetes.io/projected/d6b6bc7e-1d2f-4575-b0fc-a605dcfff0af-kube-api-access-gfdtq\") pod \"frr-k8s-webhook-server-7fcb986d4-h27ch\" (UID: \"d6b6bc7e-1d2f-4575-b0fc-a605dcfff0af\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-h27ch" Dec 02 07:38:37 crc kubenswrapper[4895]: I1202 07:38:37.772114 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-hjl4p" Dec 02 07:38:37 crc kubenswrapper[4895]: I1202 07:38:37.816873 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b2bd91a3-f8a0-4abe-9598-4977cc56daa1-metrics-certs\") pod \"controller-f8648f98b-mpkw4\" (UID: \"b2bd91a3-f8a0-4abe-9598-4977cc56daa1\") " pod="metallb-system/controller-f8648f98b-mpkw4" Dec 02 07:38:37 crc kubenswrapper[4895]: I1202 07:38:37.816921 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1a93aadf-71d0-4a54-8eb1-fd710b164b07-metrics-certs\") pod \"speaker-tgv9h\" (UID: \"1a93aadf-71d0-4a54-8eb1-fd710b164b07\") " pod="metallb-system/speaker-tgv9h" Dec 02 07:38:37 crc kubenswrapper[4895]: I1202 07:38:37.816946 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/1a93aadf-71d0-4a54-8eb1-fd710b164b07-memberlist\") pod 
\"speaker-tgv9h\" (UID: \"1a93aadf-71d0-4a54-8eb1-fd710b164b07\") " pod="metallb-system/speaker-tgv9h" Dec 02 07:38:37 crc kubenswrapper[4895]: I1202 07:38:37.816966 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b2bd91a3-f8a0-4abe-9598-4977cc56daa1-cert\") pod \"controller-f8648f98b-mpkw4\" (UID: \"b2bd91a3-f8a0-4abe-9598-4977cc56daa1\") " pod="metallb-system/controller-f8648f98b-mpkw4" Dec 02 07:38:37 crc kubenswrapper[4895]: I1202 07:38:37.816989 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-msrg8\" (UniqueName: \"kubernetes.io/projected/1a93aadf-71d0-4a54-8eb1-fd710b164b07-kube-api-access-msrg8\") pod \"speaker-tgv9h\" (UID: \"1a93aadf-71d0-4a54-8eb1-fd710b164b07\") " pod="metallb-system/speaker-tgv9h" Dec 02 07:38:37 crc kubenswrapper[4895]: I1202 07:38:37.817010 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c59fd\" (UniqueName: \"kubernetes.io/projected/b2bd91a3-f8a0-4abe-9598-4977cc56daa1-kube-api-access-c59fd\") pod \"controller-f8648f98b-mpkw4\" (UID: \"b2bd91a3-f8a0-4abe-9598-4977cc56daa1\") " pod="metallb-system/controller-f8648f98b-mpkw4" Dec 02 07:38:37 crc kubenswrapper[4895]: I1202 07:38:37.817072 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/1a93aadf-71d0-4a54-8eb1-fd710b164b07-metallb-excludel2\") pod \"speaker-tgv9h\" (UID: \"1a93aadf-71d0-4a54-8eb1-fd710b164b07\") " pod="metallb-system/speaker-tgv9h" Dec 02 07:38:37 crc kubenswrapper[4895]: I1202 07:38:37.817783 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/1a93aadf-71d0-4a54-8eb1-fd710b164b07-metallb-excludel2\") pod \"speaker-tgv9h\" (UID: \"1a93aadf-71d0-4a54-8eb1-fd710b164b07\") " pod="metallb-system/speaker-tgv9h" Dec 02 07:38:37 
crc kubenswrapper[4895]: E1202 07:38:37.818967 4895 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 02 07:38:37 crc kubenswrapper[4895]: E1202 07:38:37.819049 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1a93aadf-71d0-4a54-8eb1-fd710b164b07-memberlist podName:1a93aadf-71d0-4a54-8eb1-fd710b164b07 nodeName:}" failed. No retries permitted until 2025-12-02 07:38:38.319029284 +0000 UTC m=+929.489888897 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/1a93aadf-71d0-4a54-8eb1-fd710b164b07-memberlist") pod "speaker-tgv9h" (UID: "1a93aadf-71d0-4a54-8eb1-fd710b164b07") : secret "metallb-memberlist" not found Dec 02 07:38:37 crc kubenswrapper[4895]: I1202 07:38:37.821174 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b2bd91a3-f8a0-4abe-9598-4977cc56daa1-metrics-certs\") pod \"controller-f8648f98b-mpkw4\" (UID: \"b2bd91a3-f8a0-4abe-9598-4977cc56daa1\") " pod="metallb-system/controller-f8648f98b-mpkw4" Dec 02 07:38:37 crc kubenswrapper[4895]: I1202 07:38:37.821675 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1a93aadf-71d0-4a54-8eb1-fd710b164b07-metrics-certs\") pod \"speaker-tgv9h\" (UID: \"1a93aadf-71d0-4a54-8eb1-fd710b164b07\") " pod="metallb-system/speaker-tgv9h" Dec 02 07:38:37 crc kubenswrapper[4895]: I1202 07:38:37.824163 4895 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Dec 02 07:38:37 crc kubenswrapper[4895]: I1202 07:38:37.833320 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b2bd91a3-f8a0-4abe-9598-4977cc56daa1-cert\") pod \"controller-f8648f98b-mpkw4\" (UID: \"b2bd91a3-f8a0-4abe-9598-4977cc56daa1\") " 
pod="metallb-system/controller-f8648f98b-mpkw4" Dec 02 07:38:37 crc kubenswrapper[4895]: I1202 07:38:37.838672 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c59fd\" (UniqueName: \"kubernetes.io/projected/b2bd91a3-f8a0-4abe-9598-4977cc56daa1-kube-api-access-c59fd\") pod \"controller-f8648f98b-mpkw4\" (UID: \"b2bd91a3-f8a0-4abe-9598-4977cc56daa1\") " pod="metallb-system/controller-f8648f98b-mpkw4" Dec 02 07:38:37 crc kubenswrapper[4895]: I1202 07:38:37.841044 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-msrg8\" (UniqueName: \"kubernetes.io/projected/1a93aadf-71d0-4a54-8eb1-fd710b164b07-kube-api-access-msrg8\") pod \"speaker-tgv9h\" (UID: \"1a93aadf-71d0-4a54-8eb1-fd710b164b07\") " pod="metallb-system/speaker-tgv9h" Dec 02 07:38:37 crc kubenswrapper[4895]: I1202 07:38:37.898127 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-f8648f98b-mpkw4" Dec 02 07:38:38 crc kubenswrapper[4895]: I1202 07:38:38.224427 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d6b6bc7e-1d2f-4575-b0fc-a605dcfff0af-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-h27ch\" (UID: \"d6b6bc7e-1d2f-4575-b0fc-a605dcfff0af\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-h27ch" Dec 02 07:38:38 crc kubenswrapper[4895]: I1202 07:38:38.230784 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d6b6bc7e-1d2f-4575-b0fc-a605dcfff0af-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-h27ch\" (UID: \"d6b6bc7e-1d2f-4575-b0fc-a605dcfff0af\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-h27ch" Dec 02 07:38:38 crc kubenswrapper[4895]: I1202 07:38:38.320804 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-mpkw4"] Dec 02 07:38:38 crc kubenswrapper[4895]: W1202 
07:38:38.324890 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb2bd91a3_f8a0_4abe_9598_4977cc56daa1.slice/crio-0e7bb2dd23b887ed56a9faad399db4c7051695d29d58bba1ffaf1b8d10d57b40 WatchSource:0}: Error finding container 0e7bb2dd23b887ed56a9faad399db4c7051695d29d58bba1ffaf1b8d10d57b40: Status 404 returned error can't find the container with id 0e7bb2dd23b887ed56a9faad399db4c7051695d29d58bba1ffaf1b8d10d57b40 Dec 02 07:38:38 crc kubenswrapper[4895]: I1202 07:38:38.325323 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/1a93aadf-71d0-4a54-8eb1-fd710b164b07-memberlist\") pod \"speaker-tgv9h\" (UID: \"1a93aadf-71d0-4a54-8eb1-fd710b164b07\") " pod="metallb-system/speaker-tgv9h" Dec 02 07:38:38 crc kubenswrapper[4895]: E1202 07:38:38.325500 4895 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 02 07:38:38 crc kubenswrapper[4895]: E1202 07:38:38.325564 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1a93aadf-71d0-4a54-8eb1-fd710b164b07-memberlist podName:1a93aadf-71d0-4a54-8eb1-fd710b164b07 nodeName:}" failed. No retries permitted until 2025-12-02 07:38:39.325545029 +0000 UTC m=+930.496404642 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/1a93aadf-71d0-4a54-8eb1-fd710b164b07-memberlist") pod "speaker-tgv9h" (UID: "1a93aadf-71d0-4a54-8eb1-fd710b164b07") : secret "metallb-memberlist" not found Dec 02 07:38:38 crc kubenswrapper[4895]: I1202 07:38:38.361780 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-h27ch" Dec 02 07:38:38 crc kubenswrapper[4895]: I1202 07:38:38.422236 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hjl4p" event={"ID":"bef27c67-d150-4004-bfbf-285c544f72f7","Type":"ContainerStarted","Data":"f4246d8e7cb7a322551034eb1b0ce165eda70e4e9aa1b2a7834aab9c88de9779"} Dec 02 07:38:38 crc kubenswrapper[4895]: I1202 07:38:38.423217 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-mpkw4" event={"ID":"b2bd91a3-f8a0-4abe-9598-4977cc56daa1","Type":"ContainerStarted","Data":"0e7bb2dd23b887ed56a9faad399db4c7051695d29d58bba1ffaf1b8d10d57b40"} Dec 02 07:38:38 crc kubenswrapper[4895]: I1202 07:38:38.829486 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-h27ch"] Dec 02 07:38:39 crc kubenswrapper[4895]: I1202 07:38:39.350871 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/1a93aadf-71d0-4a54-8eb1-fd710b164b07-memberlist\") pod \"speaker-tgv9h\" (UID: \"1a93aadf-71d0-4a54-8eb1-fd710b164b07\") " pod="metallb-system/speaker-tgv9h" Dec 02 07:38:39 crc kubenswrapper[4895]: I1202 07:38:39.357584 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/1a93aadf-71d0-4a54-8eb1-fd710b164b07-memberlist\") pod \"speaker-tgv9h\" (UID: \"1a93aadf-71d0-4a54-8eb1-fd710b164b07\") " pod="metallb-system/speaker-tgv9h" Dec 02 07:38:39 crc kubenswrapper[4895]: I1202 07:38:39.431574 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-mpkw4" event={"ID":"b2bd91a3-f8a0-4abe-9598-4977cc56daa1","Type":"ContainerStarted","Data":"a5ffac6ae1fdbdf5b23e6e09c82bab0c38578e6c5aae3d84d1dea46c2eddac81"} Dec 02 07:38:39 crc kubenswrapper[4895]: I1202 07:38:39.431634 4895 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="metallb-system/controller-f8648f98b-mpkw4" event={"ID":"b2bd91a3-f8a0-4abe-9598-4977cc56daa1","Type":"ContainerStarted","Data":"db42f05f68602f324360508a5b581d5d5561e4653ebf5bde9df0d3f5c813e991"} Dec 02 07:38:39 crc kubenswrapper[4895]: I1202 07:38:39.431671 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-f8648f98b-mpkw4" Dec 02 07:38:39 crc kubenswrapper[4895]: I1202 07:38:39.432858 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-h27ch" event={"ID":"d6b6bc7e-1d2f-4575-b0fc-a605dcfff0af","Type":"ContainerStarted","Data":"b81aeda7aa58fd0c88b13a8a1225d6b6c12b88c4d4c6f3e30ea609686dd63a08"} Dec 02 07:38:39 crc kubenswrapper[4895]: I1202 07:38:39.639511 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-tgv9h" Dec 02 07:38:39 crc kubenswrapper[4895]: W1202 07:38:39.670074 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1a93aadf_71d0_4a54_8eb1_fd710b164b07.slice/crio-852187fe7ac2b44e424e995673b4906ff5796e9c54e7c343545e1a2176f3fe08 WatchSource:0}: Error finding container 852187fe7ac2b44e424e995673b4906ff5796e9c54e7c343545e1a2176f3fe08: Status 404 returned error can't find the container with id 852187fe7ac2b44e424e995673b4906ff5796e9c54e7c343545e1a2176f3fe08 Dec 02 07:38:40 crc kubenswrapper[4895]: I1202 07:38:40.476119 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-tgv9h" event={"ID":"1a93aadf-71d0-4a54-8eb1-fd710b164b07","Type":"ContainerStarted","Data":"fe164ea3de1aca6f08f9dcd3375e86b77479ed2e23546c9e983b46c3334fe7a3"} Dec 02 07:38:40 crc kubenswrapper[4895]: I1202 07:38:40.476190 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-tgv9h" 
event={"ID":"1a93aadf-71d0-4a54-8eb1-fd710b164b07","Type":"ContainerStarted","Data":"852187fe7ac2b44e424e995673b4906ff5796e9c54e7c343545e1a2176f3fe08"} Dec 02 07:38:41 crc kubenswrapper[4895]: I1202 07:38:41.490674 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-tgv9h" event={"ID":"1a93aadf-71d0-4a54-8eb1-fd710b164b07","Type":"ContainerStarted","Data":"05548b82f4c8287239cae428a397d002d4b40240aa4f87cf7d403da21dbb4598"} Dec 02 07:38:41 crc kubenswrapper[4895]: I1202 07:38:41.536560 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-f8648f98b-mpkw4" podStartSLOduration=4.536540456 podStartE2EDuration="4.536540456s" podCreationTimestamp="2025-12-02 07:38:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:38:39.461185924 +0000 UTC m=+930.632045537" watchObservedRunningTime="2025-12-02 07:38:41.536540456 +0000 UTC m=+932.707400069" Dec 02 07:38:42 crc kubenswrapper[4895]: I1202 07:38:42.496893 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-tgv9h" Dec 02 07:38:49 crc kubenswrapper[4895]: I1202 07:38:49.644110 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-tgv9h" Dec 02 07:38:49 crc kubenswrapper[4895]: I1202 07:38:49.677465 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-tgv9h" podStartSLOduration=12.677440362 podStartE2EDuration="12.677440362s" podCreationTimestamp="2025-12-02 07:38:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:38:41.537601009 +0000 UTC m=+932.708460642" watchObservedRunningTime="2025-12-02 07:38:49.677440362 +0000 UTC m=+940.848299975" Dec 02 07:38:51 crc kubenswrapper[4895]: I1202 07:38:51.293334 4895 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931acblbt"] Dec 02 07:38:51 crc kubenswrapper[4895]: I1202 07:38:51.294595 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931acblbt" Dec 02 07:38:51 crc kubenswrapper[4895]: I1202 07:38:51.302294 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 02 07:38:51 crc kubenswrapper[4895]: I1202 07:38:51.309465 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931acblbt"] Dec 02 07:38:51 crc kubenswrapper[4895]: I1202 07:38:51.441423 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c94xm\" (UniqueName: \"kubernetes.io/projected/28ac482a-ea9e-4c36-93b7-89580756a458-kube-api-access-c94xm\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931acblbt\" (UID: \"28ac482a-ea9e-4c36-93b7-89580756a458\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931acblbt" Dec 02 07:38:51 crc kubenswrapper[4895]: I1202 07:38:51.441523 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/28ac482a-ea9e-4c36-93b7-89580756a458-util\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931acblbt\" (UID: \"28ac482a-ea9e-4c36-93b7-89580756a458\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931acblbt" Dec 02 07:38:51 crc kubenswrapper[4895]: I1202 07:38:51.441611 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/28ac482a-ea9e-4c36-93b7-89580756a458-bundle\") pod 
\"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931acblbt\" (UID: \"28ac482a-ea9e-4c36-93b7-89580756a458\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931acblbt" Dec 02 07:38:51 crc kubenswrapper[4895]: I1202 07:38:51.542663 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/28ac482a-ea9e-4c36-93b7-89580756a458-bundle\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931acblbt\" (UID: \"28ac482a-ea9e-4c36-93b7-89580756a458\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931acblbt" Dec 02 07:38:51 crc kubenswrapper[4895]: I1202 07:38:51.542784 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c94xm\" (UniqueName: \"kubernetes.io/projected/28ac482a-ea9e-4c36-93b7-89580756a458-kube-api-access-c94xm\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931acblbt\" (UID: \"28ac482a-ea9e-4c36-93b7-89580756a458\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931acblbt" Dec 02 07:38:51 crc kubenswrapper[4895]: I1202 07:38:51.542822 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/28ac482a-ea9e-4c36-93b7-89580756a458-util\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931acblbt\" (UID: \"28ac482a-ea9e-4c36-93b7-89580756a458\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931acblbt" Dec 02 07:38:51 crc kubenswrapper[4895]: I1202 07:38:51.543432 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/28ac482a-ea9e-4c36-93b7-89580756a458-util\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931acblbt\" (UID: \"28ac482a-ea9e-4c36-93b7-89580756a458\") " 
pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931acblbt" Dec 02 07:38:51 crc kubenswrapper[4895]: I1202 07:38:51.543777 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/28ac482a-ea9e-4c36-93b7-89580756a458-bundle\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931acblbt\" (UID: \"28ac482a-ea9e-4c36-93b7-89580756a458\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931acblbt" Dec 02 07:38:51 crc kubenswrapper[4895]: I1202 07:38:51.564456 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c94xm\" (UniqueName: \"kubernetes.io/projected/28ac482a-ea9e-4c36-93b7-89580756a458-kube-api-access-c94xm\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931acblbt\" (UID: \"28ac482a-ea9e-4c36-93b7-89580756a458\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931acblbt" Dec 02 07:38:51 crc kubenswrapper[4895]: I1202 07:38:51.618650 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931acblbt" Dec 02 07:38:52 crc kubenswrapper[4895]: I1202 07:38:52.206389 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931acblbt"] Dec 02 07:38:52 crc kubenswrapper[4895]: I1202 07:38:52.904018 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-h27ch" event={"ID":"d6b6bc7e-1d2f-4575-b0fc-a605dcfff0af","Type":"ContainerStarted","Data":"f64cfc95aab14906ae5f18ae311d7548a90cb6a38823325ac8b673125b962b2e"} Dec 02 07:38:52 crc kubenswrapper[4895]: I1202 07:38:52.904912 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-h27ch" Dec 02 07:38:52 crc kubenswrapper[4895]: I1202 07:38:52.906091 4895 generic.go:334] "Generic (PLEG): container finished" podID="28ac482a-ea9e-4c36-93b7-89580756a458" containerID="2c69f1e9a5f2b234257a0a39cd98d38d61830d6716674cad97630e22b1b5cf59" exitCode=0 Dec 02 07:38:52 crc kubenswrapper[4895]: I1202 07:38:52.906159 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931acblbt" event={"ID":"28ac482a-ea9e-4c36-93b7-89580756a458","Type":"ContainerDied","Data":"2c69f1e9a5f2b234257a0a39cd98d38d61830d6716674cad97630e22b1b5cf59"} Dec 02 07:38:52 crc kubenswrapper[4895]: I1202 07:38:52.906186 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931acblbt" event={"ID":"28ac482a-ea9e-4c36-93b7-89580756a458","Type":"ContainerStarted","Data":"cce173899cf0d861d4f9e71b3cf1fb35b7b173b85a88af91bd4d1bc383756f4f"} Dec 02 07:38:52 crc kubenswrapper[4895]: I1202 07:38:52.908966 4895 generic.go:334] "Generic (PLEG): container finished" podID="bef27c67-d150-4004-bfbf-285c544f72f7" 
containerID="021e994e7d77bbf6bb3ae1e4288efd9007475303042c28afa5ad53f567f79459" exitCode=0 Dec 02 07:38:52 crc kubenswrapper[4895]: I1202 07:38:52.909024 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hjl4p" event={"ID":"bef27c67-d150-4004-bfbf-285c544f72f7","Type":"ContainerDied","Data":"021e994e7d77bbf6bb3ae1e4288efd9007475303042c28afa5ad53f567f79459"} Dec 02 07:38:52 crc kubenswrapper[4895]: I1202 07:38:52.928377 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-h27ch" podStartSLOduration=2.743527395 podStartE2EDuration="15.928356137s" podCreationTimestamp="2025-12-02 07:38:37 +0000 UTC" firstStartedPulling="2025-12-02 07:38:38.840505637 +0000 UTC m=+930.011365250" lastFinishedPulling="2025-12-02 07:38:52.025334339 +0000 UTC m=+943.196193992" observedRunningTime="2025-12-02 07:38:52.924525158 +0000 UTC m=+944.095384801" watchObservedRunningTime="2025-12-02 07:38:52.928356137 +0000 UTC m=+944.099215750" Dec 02 07:38:53 crc kubenswrapper[4895]: I1202 07:38:53.918909 4895 generic.go:334] "Generic (PLEG): container finished" podID="bef27c67-d150-4004-bfbf-285c544f72f7" containerID="fbcc17675693029c14f1f7cd019993432d6ec8f257a6be60187805044fb9e3bd" exitCode=0 Dec 02 07:38:53 crc kubenswrapper[4895]: I1202 07:38:53.919058 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hjl4p" event={"ID":"bef27c67-d150-4004-bfbf-285c544f72f7","Type":"ContainerDied","Data":"fbcc17675693029c14f1f7cd019993432d6ec8f257a6be60187805044fb9e3bd"} Dec 02 07:38:54 crc kubenswrapper[4895]: I1202 07:38:54.931289 4895 generic.go:334] "Generic (PLEG): container finished" podID="bef27c67-d150-4004-bfbf-285c544f72f7" containerID="b2e26944c9fb45023b5381f7f8a1fe741d28da1870712917db3a7931aa254f10" exitCode=0 Dec 02 07:38:54 crc kubenswrapper[4895]: I1202 07:38:54.931392 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hjl4p" 
event={"ID":"bef27c67-d150-4004-bfbf-285c544f72f7","Type":"ContainerDied","Data":"b2e26944c9fb45023b5381f7f8a1fe741d28da1870712917db3a7931aa254f10"} Dec 02 07:38:57 crc kubenswrapper[4895]: I1202 07:38:57.903116 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-f8648f98b-mpkw4" Dec 02 07:38:58 crc kubenswrapper[4895]: I1202 07:38:58.222214 4895 generic.go:334] "Generic (PLEG): container finished" podID="28ac482a-ea9e-4c36-93b7-89580756a458" containerID="5bdb17ee3d180cabc949b73725f21e80320a371c243ae018d0f22abd71ae1c17" exitCode=0 Dec 02 07:38:58 crc kubenswrapper[4895]: I1202 07:38:58.222333 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931acblbt" event={"ID":"28ac482a-ea9e-4c36-93b7-89580756a458","Type":"ContainerDied","Data":"5bdb17ee3d180cabc949b73725f21e80320a371c243ae018d0f22abd71ae1c17"} Dec 02 07:38:58 crc kubenswrapper[4895]: I1202 07:38:58.229516 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hjl4p" event={"ID":"bef27c67-d150-4004-bfbf-285c544f72f7","Type":"ContainerStarted","Data":"2be4c88d478e22f4be78a8634747156b213d590a9e8c3ce82e6c12cbd80eba29"} Dec 02 07:38:58 crc kubenswrapper[4895]: I1202 07:38:58.229577 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hjl4p" event={"ID":"bef27c67-d150-4004-bfbf-285c544f72f7","Type":"ContainerStarted","Data":"72351d8581fa372f95eae7de7bcaac1918f8ce2c421b557691c5a06ff9de30ed"} Dec 02 07:38:58 crc kubenswrapper[4895]: I1202 07:38:58.229590 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hjl4p" event={"ID":"bef27c67-d150-4004-bfbf-285c544f72f7","Type":"ContainerStarted","Data":"2957ef839c2bc625c27cb7d9bf48ade1053eae69f490b8021c28f795f6a94236"} Dec 02 07:38:59 crc kubenswrapper[4895]: I1202 07:38:59.239982 4895 generic.go:334] "Generic (PLEG): container finished" 
podID="28ac482a-ea9e-4c36-93b7-89580756a458" containerID="19ad44ef64b9c01c359c4a2150b89154930f307033f2cbffcd2d2c6b62ba28c1" exitCode=0 Dec 02 07:38:59 crc kubenswrapper[4895]: I1202 07:38:59.241941 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931acblbt" event={"ID":"28ac482a-ea9e-4c36-93b7-89580756a458","Type":"ContainerDied","Data":"19ad44ef64b9c01c359c4a2150b89154930f307033f2cbffcd2d2c6b62ba28c1"} Dec 02 07:38:59 crc kubenswrapper[4895]: I1202 07:38:59.249883 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hjl4p" event={"ID":"bef27c67-d150-4004-bfbf-285c544f72f7","Type":"ContainerStarted","Data":"a239d5649461366165ea39159e95aa7ca155a6b2abd282bc2d8db05694613a19"} Dec 02 07:38:59 crc kubenswrapper[4895]: I1202 07:38:59.249949 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hjl4p" event={"ID":"bef27c67-d150-4004-bfbf-285c544f72f7","Type":"ContainerStarted","Data":"b58d05956e78cf439ab2ca628db96066b6d0278befced96a05c6b15fb66f3b1c"} Dec 02 07:38:59 crc kubenswrapper[4895]: I1202 07:38:59.249974 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hjl4p" event={"ID":"bef27c67-d150-4004-bfbf-285c544f72f7","Type":"ContainerStarted","Data":"6ebb21eb0682dac0de5f916177ebb661e10508b861ba18ec0dc1a4c66e3af69e"} Dec 02 07:38:59 crc kubenswrapper[4895]: I1202 07:38:59.250185 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-hjl4p" Dec 02 07:38:59 crc kubenswrapper[4895]: I1202 07:38:59.315483 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-hjl4p" podStartSLOduration=8.339077786 podStartE2EDuration="22.315453077s" podCreationTimestamp="2025-12-02 07:38:37 +0000 UTC" firstStartedPulling="2025-12-02 07:38:38.021681472 +0000 UTC m=+929.192541085" lastFinishedPulling="2025-12-02 07:38:51.998056763 
+0000 UTC m=+943.168916376" observedRunningTime="2025-12-02 07:38:59.308705869 +0000 UTC m=+950.479565562" watchObservedRunningTime="2025-12-02 07:38:59.315453077 +0000 UTC m=+950.486312720" Dec 02 07:39:00 crc kubenswrapper[4895]: I1202 07:39:00.628763 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931acblbt" Dec 02 07:39:00 crc kubenswrapper[4895]: I1202 07:39:00.760308 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c94xm\" (UniqueName: \"kubernetes.io/projected/28ac482a-ea9e-4c36-93b7-89580756a458-kube-api-access-c94xm\") pod \"28ac482a-ea9e-4c36-93b7-89580756a458\" (UID: \"28ac482a-ea9e-4c36-93b7-89580756a458\") " Dec 02 07:39:00 crc kubenswrapper[4895]: I1202 07:39:00.760417 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/28ac482a-ea9e-4c36-93b7-89580756a458-util\") pod \"28ac482a-ea9e-4c36-93b7-89580756a458\" (UID: \"28ac482a-ea9e-4c36-93b7-89580756a458\") " Dec 02 07:39:00 crc kubenswrapper[4895]: I1202 07:39:00.760513 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/28ac482a-ea9e-4c36-93b7-89580756a458-bundle\") pod \"28ac482a-ea9e-4c36-93b7-89580756a458\" (UID: \"28ac482a-ea9e-4c36-93b7-89580756a458\") " Dec 02 07:39:00 crc kubenswrapper[4895]: I1202 07:39:00.761907 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28ac482a-ea9e-4c36-93b7-89580756a458-bundle" (OuterVolumeSpecName: "bundle") pod "28ac482a-ea9e-4c36-93b7-89580756a458" (UID: "28ac482a-ea9e-4c36-93b7-89580756a458"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:39:00 crc kubenswrapper[4895]: I1202 07:39:00.769903 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28ac482a-ea9e-4c36-93b7-89580756a458-kube-api-access-c94xm" (OuterVolumeSpecName: "kube-api-access-c94xm") pod "28ac482a-ea9e-4c36-93b7-89580756a458" (UID: "28ac482a-ea9e-4c36-93b7-89580756a458"). InnerVolumeSpecName "kube-api-access-c94xm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:39:00 crc kubenswrapper[4895]: I1202 07:39:00.774059 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28ac482a-ea9e-4c36-93b7-89580756a458-util" (OuterVolumeSpecName: "util") pod "28ac482a-ea9e-4c36-93b7-89580756a458" (UID: "28ac482a-ea9e-4c36-93b7-89580756a458"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:39:00 crc kubenswrapper[4895]: I1202 07:39:00.862401 4895 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/28ac482a-ea9e-4c36-93b7-89580756a458-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 07:39:00 crc kubenswrapper[4895]: I1202 07:39:00.862662 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c94xm\" (UniqueName: \"kubernetes.io/projected/28ac482a-ea9e-4c36-93b7-89580756a458-kube-api-access-c94xm\") on node \"crc\" DevicePath \"\"" Dec 02 07:39:00 crc kubenswrapper[4895]: I1202 07:39:00.862724 4895 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/28ac482a-ea9e-4c36-93b7-89580756a458-util\") on node \"crc\" DevicePath \"\"" Dec 02 07:39:01 crc kubenswrapper[4895]: I1202 07:39:01.268192 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931acblbt" 
event={"ID":"28ac482a-ea9e-4c36-93b7-89580756a458","Type":"ContainerDied","Data":"cce173899cf0d861d4f9e71b3cf1fb35b7b173b85a88af91bd4d1bc383756f4f"} Dec 02 07:39:01 crc kubenswrapper[4895]: I1202 07:39:01.268241 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cce173899cf0d861d4f9e71b3cf1fb35b7b173b85a88af91bd4d1bc383756f4f" Dec 02 07:39:01 crc kubenswrapper[4895]: I1202 07:39:01.268315 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931acblbt" Dec 02 07:39:02 crc kubenswrapper[4895]: I1202 07:39:02.772791 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-hjl4p" Dec 02 07:39:02 crc kubenswrapper[4895]: I1202 07:39:02.822582 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-hjl4p" Dec 02 07:39:05 crc kubenswrapper[4895]: I1202 07:39:05.473755 4895 patch_prober.go:28] interesting pod/machine-config-daemon-wfcg7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 07:39:05 crc kubenswrapper[4895]: I1202 07:39:05.474045 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 07:39:05 crc kubenswrapper[4895]: I1202 07:39:05.474103 4895 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" Dec 02 07:39:05 crc kubenswrapper[4895]: I1202 07:39:05.474809 4895 
kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"12a9227e27ad8d7bc29431661ef9209e2bb61dd12d583d4b2e7609ed8ada972b"} pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 07:39:05 crc kubenswrapper[4895]: I1202 07:39:05.474878 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerName="machine-config-daemon" containerID="cri-o://12a9227e27ad8d7bc29431661ef9209e2bb61dd12d583d4b2e7609ed8ada972b" gracePeriod=600 Dec 02 07:39:06 crc kubenswrapper[4895]: I1202 07:39:06.302611 4895 generic.go:334] "Generic (PLEG): container finished" podID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerID="12a9227e27ad8d7bc29431661ef9209e2bb61dd12d583d4b2e7609ed8ada972b" exitCode=0 Dec 02 07:39:06 crc kubenswrapper[4895]: I1202 07:39:06.302677 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" event={"ID":"0468d2d1-a975-45a6-af9f-47adc432fab0","Type":"ContainerDied","Data":"12a9227e27ad8d7bc29431661ef9209e2bb61dd12d583d4b2e7609ed8ada972b"} Dec 02 07:39:06 crc kubenswrapper[4895]: I1202 07:39:06.303554 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" event={"ID":"0468d2d1-a975-45a6-af9f-47adc432fab0","Type":"ContainerStarted","Data":"2f198fe0feb728e97ed5c4b77927f34e37b1755009f4a942cf361750a2e15740"} Dec 02 07:39:06 crc kubenswrapper[4895]: I1202 07:39:06.303579 4895 scope.go:117] "RemoveContainer" containerID="167d292d0f8d5573649be7da9822e91144b98a316f5ca7c4838bd376ddaed336" Dec 02 07:39:07 crc kubenswrapper[4895]: I1202 07:39:07.795711 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="metallb-system/frr-k8s-hjl4p" Dec 02 07:39:08 crc kubenswrapper[4895]: I1202 07:39:08.372654 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-h27ch" Dec 02 07:39:09 crc kubenswrapper[4895]: I1202 07:39:09.314465 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-bdltj"] Dec 02 07:39:09 crc kubenswrapper[4895]: E1202 07:39:09.315097 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28ac482a-ea9e-4c36-93b7-89580756a458" containerName="util" Dec 02 07:39:09 crc kubenswrapper[4895]: I1202 07:39:09.315112 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="28ac482a-ea9e-4c36-93b7-89580756a458" containerName="util" Dec 02 07:39:09 crc kubenswrapper[4895]: E1202 07:39:09.315129 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28ac482a-ea9e-4c36-93b7-89580756a458" containerName="pull" Dec 02 07:39:09 crc kubenswrapper[4895]: I1202 07:39:09.315135 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="28ac482a-ea9e-4c36-93b7-89580756a458" containerName="pull" Dec 02 07:39:09 crc kubenswrapper[4895]: E1202 07:39:09.315150 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28ac482a-ea9e-4c36-93b7-89580756a458" containerName="extract" Dec 02 07:39:09 crc kubenswrapper[4895]: I1202 07:39:09.315157 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="28ac482a-ea9e-4c36-93b7-89580756a458" containerName="extract" Dec 02 07:39:09 crc kubenswrapper[4895]: I1202 07:39:09.315259 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="28ac482a-ea9e-4c36-93b7-89580756a458" containerName="extract" Dec 02 07:39:09 crc kubenswrapper[4895]: I1202 07:39:09.315713 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-bdltj" Dec 02 07:39:09 crc kubenswrapper[4895]: I1202 07:39:09.318662 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"openshift-service-ca.crt" Dec 02 07:39:09 crc kubenswrapper[4895]: I1202 07:39:09.320499 4895 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager-operator"/"cert-manager-operator-controller-manager-dockercfg-knqz9" Dec 02 07:39:09 crc kubenswrapper[4895]: I1202 07:39:09.321302 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"kube-root-ca.crt" Dec 02 07:39:09 crc kubenswrapper[4895]: I1202 07:39:09.326572 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-bdltj"] Dec 02 07:39:09 crc kubenswrapper[4895]: I1202 07:39:09.493422 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9b6b9142-40d1-455f-a9da-ac88f728dbe1-tmp\") pod \"cert-manager-operator-controller-manager-64cf6dff88-bdltj\" (UID: \"9b6b9142-40d1-455f-a9da-ac88f728dbe1\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-bdltj" Dec 02 07:39:09 crc kubenswrapper[4895]: I1202 07:39:09.493515 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mcb9h\" (UniqueName: \"kubernetes.io/projected/9b6b9142-40d1-455f-a9da-ac88f728dbe1-kube-api-access-mcb9h\") pod \"cert-manager-operator-controller-manager-64cf6dff88-bdltj\" (UID: \"9b6b9142-40d1-455f-a9da-ac88f728dbe1\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-bdltj" Dec 02 07:39:09 crc kubenswrapper[4895]: I1202 07:39:09.594129 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-mcb9h\" (UniqueName: \"kubernetes.io/projected/9b6b9142-40d1-455f-a9da-ac88f728dbe1-kube-api-access-mcb9h\") pod \"cert-manager-operator-controller-manager-64cf6dff88-bdltj\" (UID: \"9b6b9142-40d1-455f-a9da-ac88f728dbe1\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-bdltj" Dec 02 07:39:09 crc kubenswrapper[4895]: I1202 07:39:09.594219 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9b6b9142-40d1-455f-a9da-ac88f728dbe1-tmp\") pod \"cert-manager-operator-controller-manager-64cf6dff88-bdltj\" (UID: \"9b6b9142-40d1-455f-a9da-ac88f728dbe1\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-bdltj" Dec 02 07:39:09 crc kubenswrapper[4895]: I1202 07:39:09.594648 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9b6b9142-40d1-455f-a9da-ac88f728dbe1-tmp\") pod \"cert-manager-operator-controller-manager-64cf6dff88-bdltj\" (UID: \"9b6b9142-40d1-455f-a9da-ac88f728dbe1\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-bdltj" Dec 02 07:39:09 crc kubenswrapper[4895]: I1202 07:39:09.627666 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mcb9h\" (UniqueName: \"kubernetes.io/projected/9b6b9142-40d1-455f-a9da-ac88f728dbe1-kube-api-access-mcb9h\") pod \"cert-manager-operator-controller-manager-64cf6dff88-bdltj\" (UID: \"9b6b9142-40d1-455f-a9da-ac88f728dbe1\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-bdltj" Dec 02 07:39:09 crc kubenswrapper[4895]: I1202 07:39:09.666943 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-bdltj" Dec 02 07:39:10 crc kubenswrapper[4895]: I1202 07:39:10.020699 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-bdltj"] Dec 02 07:39:10 crc kubenswrapper[4895]: W1202 07:39:10.026915 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b6b9142_40d1_455f_a9da_ac88f728dbe1.slice/crio-ecf6dfe628e72f741275c57561bae0074887c4e73576f46c8c4dd0ea5242625e WatchSource:0}: Error finding container ecf6dfe628e72f741275c57561bae0074887c4e73576f46c8c4dd0ea5242625e: Status 404 returned error can't find the container with id ecf6dfe628e72f741275c57561bae0074887c4e73576f46c8c4dd0ea5242625e Dec 02 07:39:10 crc kubenswrapper[4895]: I1202 07:39:10.330012 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-bdltj" event={"ID":"9b6b9142-40d1-455f-a9da-ac88f728dbe1","Type":"ContainerStarted","Data":"ecf6dfe628e72f741275c57561bae0074887c4e73576f46c8c4dd0ea5242625e"} Dec 02 07:39:20 crc kubenswrapper[4895]: I1202 07:39:20.442583 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-bdltj" event={"ID":"9b6b9142-40d1-455f-a9da-ac88f728dbe1","Type":"ContainerStarted","Data":"6c21e39055f5041c40b7b53e677edbccce4a5c910d87da8b419e4d6529bbc02a"} Dec 02 07:39:20 crc kubenswrapper[4895]: I1202 07:39:20.472536 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-bdltj" podStartSLOduration=1.44752354 podStartE2EDuration="11.472512293s" podCreationTimestamp="2025-12-02 07:39:09 +0000 UTC" firstStartedPulling="2025-12-02 07:39:10.028948193 +0000 UTC m=+961.199807806" 
lastFinishedPulling="2025-12-02 07:39:20.053936946 +0000 UTC m=+971.224796559" observedRunningTime="2025-12-02 07:39:20.469035995 +0000 UTC m=+971.639895608" watchObservedRunningTime="2025-12-02 07:39:20.472512293 +0000 UTC m=+971.643371916" Dec 02 07:39:22 crc kubenswrapper[4895]: I1202 07:39:22.419615 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-f4fb5df64-p2gqj"] Dec 02 07:39:22 crc kubenswrapper[4895]: I1202 07:39:22.421269 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-f4fb5df64-p2gqj" Dec 02 07:39:22 crc kubenswrapper[4895]: I1202 07:39:22.430303 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Dec 02 07:39:22 crc kubenswrapper[4895]: I1202 07:39:22.430464 4895 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-7zx9r" Dec 02 07:39:22 crc kubenswrapper[4895]: I1202 07:39:22.430517 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Dec 02 07:39:22 crc kubenswrapper[4895]: I1202 07:39:22.432001 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-f4fb5df64-p2gqj"] Dec 02 07:39:22 crc kubenswrapper[4895]: I1202 07:39:22.618324 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkcg6\" (UniqueName: \"kubernetes.io/projected/6f60e46b-cecc-41fc-992a-0b4ae09082fd-kube-api-access-jkcg6\") pod \"cert-manager-webhook-f4fb5df64-p2gqj\" (UID: \"6f60e46b-cecc-41fc-992a-0b4ae09082fd\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-p2gqj" Dec 02 07:39:22 crc kubenswrapper[4895]: I1202 07:39:22.618404 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/6f60e46b-cecc-41fc-992a-0b4ae09082fd-bound-sa-token\") pod \"cert-manager-webhook-f4fb5df64-p2gqj\" (UID: \"6f60e46b-cecc-41fc-992a-0b4ae09082fd\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-p2gqj" Dec 02 07:39:22 crc kubenswrapper[4895]: I1202 07:39:22.721023 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jkcg6\" (UniqueName: \"kubernetes.io/projected/6f60e46b-cecc-41fc-992a-0b4ae09082fd-kube-api-access-jkcg6\") pod \"cert-manager-webhook-f4fb5df64-p2gqj\" (UID: \"6f60e46b-cecc-41fc-992a-0b4ae09082fd\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-p2gqj" Dec 02 07:39:22 crc kubenswrapper[4895]: I1202 07:39:22.721153 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6f60e46b-cecc-41fc-992a-0b4ae09082fd-bound-sa-token\") pod \"cert-manager-webhook-f4fb5df64-p2gqj\" (UID: \"6f60e46b-cecc-41fc-992a-0b4ae09082fd\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-p2gqj" Dec 02 07:39:22 crc kubenswrapper[4895]: I1202 07:39:22.747353 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkcg6\" (UniqueName: \"kubernetes.io/projected/6f60e46b-cecc-41fc-992a-0b4ae09082fd-kube-api-access-jkcg6\") pod \"cert-manager-webhook-f4fb5df64-p2gqj\" (UID: \"6f60e46b-cecc-41fc-992a-0b4ae09082fd\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-p2gqj" Dec 02 07:39:22 crc kubenswrapper[4895]: I1202 07:39:22.759020 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6f60e46b-cecc-41fc-992a-0b4ae09082fd-bound-sa-token\") pod \"cert-manager-webhook-f4fb5df64-p2gqj\" (UID: \"6f60e46b-cecc-41fc-992a-0b4ae09082fd\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-p2gqj" Dec 02 07:39:23 crc kubenswrapper[4895]: I1202 07:39:23.050386 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-f4fb5df64-p2gqj" Dec 02 07:39:23 crc kubenswrapper[4895]: I1202 07:39:23.698811 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-f4fb5df64-p2gqj"] Dec 02 07:39:24 crc kubenswrapper[4895]: I1202 07:39:24.500728 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-f4fb5df64-p2gqj" event={"ID":"6f60e46b-cecc-41fc-992a-0b4ae09082fd","Type":"ContainerStarted","Data":"168449b61be86f9797c1d1be0b6203ae76601e51cad4a016b8d70a966ba03867"} Dec 02 07:39:27 crc kubenswrapper[4895]: I1202 07:39:27.230150 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-855d9ccff4-mqb8r"] Dec 02 07:39:27 crc kubenswrapper[4895]: I1202 07:39:27.231456 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-855d9ccff4-mqb8r" Dec 02 07:39:27 crc kubenswrapper[4895]: I1202 07:39:27.233754 4895 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-njk4r" Dec 02 07:39:27 crc kubenswrapper[4895]: I1202 07:39:27.244428 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-855d9ccff4-mqb8r"] Dec 02 07:39:27 crc kubenswrapper[4895]: I1202 07:39:27.320874 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vs29n\" (UniqueName: \"kubernetes.io/projected/8e031490-b855-4e4c-8159-14cf9a710e98-kube-api-access-vs29n\") pod \"cert-manager-cainjector-855d9ccff4-mqb8r\" (UID: \"8e031490-b855-4e4c-8159-14cf9a710e98\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-mqb8r" Dec 02 07:39:27 crc kubenswrapper[4895]: I1202 07:39:27.320977 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/8e031490-b855-4e4c-8159-14cf9a710e98-bound-sa-token\") pod \"cert-manager-cainjector-855d9ccff4-mqb8r\" (UID: \"8e031490-b855-4e4c-8159-14cf9a710e98\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-mqb8r" Dec 02 07:39:27 crc kubenswrapper[4895]: I1202 07:39:27.422597 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vs29n\" (UniqueName: \"kubernetes.io/projected/8e031490-b855-4e4c-8159-14cf9a710e98-kube-api-access-vs29n\") pod \"cert-manager-cainjector-855d9ccff4-mqb8r\" (UID: \"8e031490-b855-4e4c-8159-14cf9a710e98\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-mqb8r" Dec 02 07:39:27 crc kubenswrapper[4895]: I1202 07:39:27.422677 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8e031490-b855-4e4c-8159-14cf9a710e98-bound-sa-token\") pod \"cert-manager-cainjector-855d9ccff4-mqb8r\" (UID: \"8e031490-b855-4e4c-8159-14cf9a710e98\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-mqb8r" Dec 02 07:39:27 crc kubenswrapper[4895]: I1202 07:39:27.446012 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vs29n\" (UniqueName: \"kubernetes.io/projected/8e031490-b855-4e4c-8159-14cf9a710e98-kube-api-access-vs29n\") pod \"cert-manager-cainjector-855d9ccff4-mqb8r\" (UID: \"8e031490-b855-4e4c-8159-14cf9a710e98\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-mqb8r" Dec 02 07:39:27 crc kubenswrapper[4895]: I1202 07:39:27.446006 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8e031490-b855-4e4c-8159-14cf9a710e98-bound-sa-token\") pod \"cert-manager-cainjector-855d9ccff4-mqb8r\" (UID: \"8e031490-b855-4e4c-8159-14cf9a710e98\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-mqb8r" Dec 02 07:39:27 crc kubenswrapper[4895]: I1202 07:39:27.569816 4895 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-855d9ccff4-mqb8r" Dec 02 07:39:34 crc kubenswrapper[4895]: I1202 07:39:34.292295 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-855d9ccff4-mqb8r"] Dec 02 07:39:34 crc kubenswrapper[4895]: I1202 07:39:34.596692 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-f4fb5df64-p2gqj" event={"ID":"6f60e46b-cecc-41fc-992a-0b4ae09082fd","Type":"ContainerStarted","Data":"13eaa9d3dbc3cfbe8af9db626ec10c2765884a521b96e29d7e9d46e9423c0f20"} Dec 02 07:39:34 crc kubenswrapper[4895]: I1202 07:39:34.596870 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-f4fb5df64-p2gqj" Dec 02 07:39:34 crc kubenswrapper[4895]: I1202 07:39:34.603605 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-855d9ccff4-mqb8r" event={"ID":"8e031490-b855-4e4c-8159-14cf9a710e98","Type":"ContainerStarted","Data":"39bd6f14a0524ec892bb13bec544e89a52eef14e6101c5bcf5e22dfe04a96099"} Dec 02 07:39:34 crc kubenswrapper[4895]: I1202 07:39:34.603667 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-855d9ccff4-mqb8r" event={"ID":"8e031490-b855-4e4c-8159-14cf9a710e98","Type":"ContainerStarted","Data":"174ce26376c47e485f3c6c58d9feb63e301afa902348392a782d3a2d6d05d3b4"} Dec 02 07:39:34 crc kubenswrapper[4895]: I1202 07:39:34.626463 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-f4fb5df64-p2gqj" podStartSLOduration=2.116942839 podStartE2EDuration="12.62642791s" podCreationTimestamp="2025-12-02 07:39:22 +0000 UTC" firstStartedPulling="2025-12-02 07:39:23.723414327 +0000 UTC m=+974.894273940" lastFinishedPulling="2025-12-02 07:39:34.232899398 +0000 UTC m=+985.403759011" observedRunningTime="2025-12-02 07:39:34.62252651 +0000 UTC 
m=+985.793386113" watchObservedRunningTime="2025-12-02 07:39:34.62642791 +0000 UTC m=+985.797287523" Dec 02 07:39:34 crc kubenswrapper[4895]: I1202 07:39:34.639753 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-855d9ccff4-mqb8r" podStartSLOduration=7.639713662 podStartE2EDuration="7.639713662s" podCreationTimestamp="2025-12-02 07:39:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:39:34.636833693 +0000 UTC m=+985.807693326" watchObservedRunningTime="2025-12-02 07:39:34.639713662 +0000 UTC m=+985.810573285" Dec 02 07:39:42 crc kubenswrapper[4895]: I1202 07:39:42.693955 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-86cb77c54b-t679q"] Dec 02 07:39:42 crc kubenswrapper[4895]: I1202 07:39:42.696917 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-86cb77c54b-t679q" Dec 02 07:39:42 crc kubenswrapper[4895]: I1202 07:39:42.701451 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-86cb77c54b-t679q"] Dec 02 07:39:42 crc kubenswrapper[4895]: I1202 07:39:42.708040 4895 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-8ph9d" Dec 02 07:39:42 crc kubenswrapper[4895]: I1202 07:39:42.777320 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ltfx\" (UniqueName: \"kubernetes.io/projected/0dc59886-5d5a-4d16-a083-8a14503368fc-kube-api-access-5ltfx\") pod \"cert-manager-86cb77c54b-t679q\" (UID: \"0dc59886-5d5a-4d16-a083-8a14503368fc\") " pod="cert-manager/cert-manager-86cb77c54b-t679q" Dec 02 07:39:42 crc kubenswrapper[4895]: I1202 07:39:42.777720 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" 
(UniqueName: \"kubernetes.io/projected/0dc59886-5d5a-4d16-a083-8a14503368fc-bound-sa-token\") pod \"cert-manager-86cb77c54b-t679q\" (UID: \"0dc59886-5d5a-4d16-a083-8a14503368fc\") " pod="cert-manager/cert-manager-86cb77c54b-t679q" Dec 02 07:39:42 crc kubenswrapper[4895]: I1202 07:39:42.878962 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5ltfx\" (UniqueName: \"kubernetes.io/projected/0dc59886-5d5a-4d16-a083-8a14503368fc-kube-api-access-5ltfx\") pod \"cert-manager-86cb77c54b-t679q\" (UID: \"0dc59886-5d5a-4d16-a083-8a14503368fc\") " pod="cert-manager/cert-manager-86cb77c54b-t679q" Dec 02 07:39:42 crc kubenswrapper[4895]: I1202 07:39:42.879126 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0dc59886-5d5a-4d16-a083-8a14503368fc-bound-sa-token\") pod \"cert-manager-86cb77c54b-t679q\" (UID: \"0dc59886-5d5a-4d16-a083-8a14503368fc\") " pod="cert-manager/cert-manager-86cb77c54b-t679q" Dec 02 07:39:42 crc kubenswrapper[4895]: I1202 07:39:42.911438 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0dc59886-5d5a-4d16-a083-8a14503368fc-bound-sa-token\") pod \"cert-manager-86cb77c54b-t679q\" (UID: \"0dc59886-5d5a-4d16-a083-8a14503368fc\") " pod="cert-manager/cert-manager-86cb77c54b-t679q" Dec 02 07:39:42 crc kubenswrapper[4895]: I1202 07:39:42.916543 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ltfx\" (UniqueName: \"kubernetes.io/projected/0dc59886-5d5a-4d16-a083-8a14503368fc-kube-api-access-5ltfx\") pod \"cert-manager-86cb77c54b-t679q\" (UID: \"0dc59886-5d5a-4d16-a083-8a14503368fc\") " pod="cert-manager/cert-manager-86cb77c54b-t679q" Dec 02 07:39:43 crc kubenswrapper[4895]: I1202 07:39:43.038328 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-86cb77c54b-t679q" Dec 02 07:39:43 crc kubenswrapper[4895]: I1202 07:39:43.056199 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-f4fb5df64-p2gqj" Dec 02 07:39:43 crc kubenswrapper[4895]: I1202 07:39:43.489240 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-86cb77c54b-t679q"] Dec 02 07:39:43 crc kubenswrapper[4895]: I1202 07:39:43.668146 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-86cb77c54b-t679q" event={"ID":"0dc59886-5d5a-4d16-a083-8a14503368fc","Type":"ContainerStarted","Data":"6eb9117e97c1ac0e6279557f9d01af32cb567d948262cec4a9864acfc6e6f96c"} Dec 02 07:39:43 crc kubenswrapper[4895]: I1202 07:39:43.668895 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-86cb77c54b-t679q" event={"ID":"0dc59886-5d5a-4d16-a083-8a14503368fc","Type":"ContainerStarted","Data":"a1e3b11fdfccb92567352fbce8dc976f62d6475e63b288b63c23ea5114206eae"} Dec 02 07:39:43 crc kubenswrapper[4895]: I1202 07:39:43.694032 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-86cb77c54b-t679q" podStartSLOduration=1.693998483 podStartE2EDuration="1.693998483s" podCreationTimestamp="2025-12-02 07:39:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:39:43.688862923 +0000 UTC m=+994.859722556" watchObservedRunningTime="2025-12-02 07:39:43.693998483 +0000 UTC m=+994.864858116" Dec 02 07:39:46 crc kubenswrapper[4895]: I1202 07:39:46.648216 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-nj66m"] Dec 02 07:39:46 crc kubenswrapper[4895]: I1202 07:39:46.649429 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-nj66m" Dec 02 07:39:46 crc kubenswrapper[4895]: I1202 07:39:46.651249 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-gtqk9" Dec 02 07:39:46 crc kubenswrapper[4895]: I1202 07:39:46.651827 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Dec 02 07:39:46 crc kubenswrapper[4895]: I1202 07:39:46.652237 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Dec 02 07:39:46 crc kubenswrapper[4895]: I1202 07:39:46.664218 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-nj66m"] Dec 02 07:39:46 crc kubenswrapper[4895]: I1202 07:39:46.751109 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dptd\" (UniqueName: \"kubernetes.io/projected/14b0d096-58e1-4eae-a783-6d86b64ffead-kube-api-access-2dptd\") pod \"openstack-operator-index-nj66m\" (UID: \"14b0d096-58e1-4eae-a783-6d86b64ffead\") " pod="openstack-operators/openstack-operator-index-nj66m" Dec 02 07:39:46 crc kubenswrapper[4895]: I1202 07:39:46.852329 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2dptd\" (UniqueName: \"kubernetes.io/projected/14b0d096-58e1-4eae-a783-6d86b64ffead-kube-api-access-2dptd\") pod \"openstack-operator-index-nj66m\" (UID: \"14b0d096-58e1-4eae-a783-6d86b64ffead\") " pod="openstack-operators/openstack-operator-index-nj66m" Dec 02 07:39:46 crc kubenswrapper[4895]: I1202 07:39:46.872770 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dptd\" (UniqueName: \"kubernetes.io/projected/14b0d096-58e1-4eae-a783-6d86b64ffead-kube-api-access-2dptd\") pod \"openstack-operator-index-nj66m\" (UID: 
\"14b0d096-58e1-4eae-a783-6d86b64ffead\") " pod="openstack-operators/openstack-operator-index-nj66m" Dec 02 07:39:46 crc kubenswrapper[4895]: I1202 07:39:46.983615 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-nj66m" Dec 02 07:39:47 crc kubenswrapper[4895]: I1202 07:39:47.241832 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-nj66m"] Dec 02 07:39:47 crc kubenswrapper[4895]: I1202 07:39:47.713751 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-nj66m" event={"ID":"14b0d096-58e1-4eae-a783-6d86b64ffead","Type":"ContainerStarted","Data":"ebf3de5eb7dc10b8ec533765566525567aeafcf955831ae5fc54e80e138d8cd8"} Dec 02 07:39:50 crc kubenswrapper[4895]: I1202 07:39:50.817609 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-nj66m"] Dec 02 07:39:51 crc kubenswrapper[4895]: I1202 07:39:51.442907 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-wg797"] Dec 02 07:39:51 crc kubenswrapper[4895]: I1202 07:39:51.444673 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-wg797" Dec 02 07:39:51 crc kubenswrapper[4895]: I1202 07:39:51.451128 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-wg797"] Dec 02 07:39:51 crc kubenswrapper[4895]: I1202 07:39:51.533425 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2tssv\" (UniqueName: \"kubernetes.io/projected/87c87c33-2ac6-4223-870c-aa91961f9952-kube-api-access-2tssv\") pod \"openstack-operator-index-wg797\" (UID: \"87c87c33-2ac6-4223-870c-aa91961f9952\") " pod="openstack-operators/openstack-operator-index-wg797" Dec 02 07:39:51 crc kubenswrapper[4895]: I1202 07:39:51.635028 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2tssv\" (UniqueName: \"kubernetes.io/projected/87c87c33-2ac6-4223-870c-aa91961f9952-kube-api-access-2tssv\") pod \"openstack-operator-index-wg797\" (UID: \"87c87c33-2ac6-4223-870c-aa91961f9952\") " pod="openstack-operators/openstack-operator-index-wg797" Dec 02 07:39:51 crc kubenswrapper[4895]: I1202 07:39:51.670371 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2tssv\" (UniqueName: \"kubernetes.io/projected/87c87c33-2ac6-4223-870c-aa91961f9952-kube-api-access-2tssv\") pod \"openstack-operator-index-wg797\" (UID: \"87c87c33-2ac6-4223-870c-aa91961f9952\") " pod="openstack-operators/openstack-operator-index-wg797" Dec 02 07:39:51 crc kubenswrapper[4895]: I1202 07:39:51.750883 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-nj66m" event={"ID":"14b0d096-58e1-4eae-a783-6d86b64ffead","Type":"ContainerStarted","Data":"10ccedcb2d0369662c1706bc8fa67563f1be7a5c9a48ded0cf6bec1491d2ed4d"} Dec 02 07:39:51 crc kubenswrapper[4895]: I1202 07:39:51.751145 4895 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack-operators/openstack-operator-index-nj66m" podUID="14b0d096-58e1-4eae-a783-6d86b64ffead" containerName="registry-server" containerID="cri-o://10ccedcb2d0369662c1706bc8fa67563f1be7a5c9a48ded0cf6bec1491d2ed4d" gracePeriod=2 Dec 02 07:39:51 crc kubenswrapper[4895]: I1202 07:39:51.775100 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-wg797" Dec 02 07:39:51 crc kubenswrapper[4895]: I1202 07:39:51.776715 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-nj66m" podStartSLOduration=2.182075124 podStartE2EDuration="5.776666539s" podCreationTimestamp="2025-12-02 07:39:46 +0000 UTC" firstStartedPulling="2025-12-02 07:39:47.247392043 +0000 UTC m=+998.418251656" lastFinishedPulling="2025-12-02 07:39:50.841983458 +0000 UTC m=+1002.012843071" observedRunningTime="2025-12-02 07:39:51.774508532 +0000 UTC m=+1002.945368145" watchObservedRunningTime="2025-12-02 07:39:51.776666539 +0000 UTC m=+1002.947526162" Dec 02 07:39:52 crc kubenswrapper[4895]: I1202 07:39:52.131424 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-wg797"] Dec 02 07:39:52 crc kubenswrapper[4895]: I1202 07:39:52.374435 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-nj66m" Dec 02 07:39:52 crc kubenswrapper[4895]: I1202 07:39:52.510242 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2dptd\" (UniqueName: \"kubernetes.io/projected/14b0d096-58e1-4eae-a783-6d86b64ffead-kube-api-access-2dptd\") pod \"14b0d096-58e1-4eae-a783-6d86b64ffead\" (UID: \"14b0d096-58e1-4eae-a783-6d86b64ffead\") " Dec 02 07:39:52 crc kubenswrapper[4895]: I1202 07:39:52.518187 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14b0d096-58e1-4eae-a783-6d86b64ffead-kube-api-access-2dptd" (OuterVolumeSpecName: "kube-api-access-2dptd") pod "14b0d096-58e1-4eae-a783-6d86b64ffead" (UID: "14b0d096-58e1-4eae-a783-6d86b64ffead"). InnerVolumeSpecName "kube-api-access-2dptd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:39:52 crc kubenswrapper[4895]: I1202 07:39:52.611962 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2dptd\" (UniqueName: \"kubernetes.io/projected/14b0d096-58e1-4eae-a783-6d86b64ffead-kube-api-access-2dptd\") on node \"crc\" DevicePath \"\"" Dec 02 07:39:52 crc kubenswrapper[4895]: I1202 07:39:52.759640 4895 generic.go:334] "Generic (PLEG): container finished" podID="14b0d096-58e1-4eae-a783-6d86b64ffead" containerID="10ccedcb2d0369662c1706bc8fa67563f1be7a5c9a48ded0cf6bec1491d2ed4d" exitCode=0 Dec 02 07:39:52 crc kubenswrapper[4895]: I1202 07:39:52.759728 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-nj66m" event={"ID":"14b0d096-58e1-4eae-a783-6d86b64ffead","Type":"ContainerDied","Data":"10ccedcb2d0369662c1706bc8fa67563f1be7a5c9a48ded0cf6bec1491d2ed4d"} Dec 02 07:39:52 crc kubenswrapper[4895]: I1202 07:39:52.759728 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-nj66m" Dec 02 07:39:52 crc kubenswrapper[4895]: I1202 07:39:52.759788 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-nj66m" event={"ID":"14b0d096-58e1-4eae-a783-6d86b64ffead","Type":"ContainerDied","Data":"ebf3de5eb7dc10b8ec533765566525567aeafcf955831ae5fc54e80e138d8cd8"} Dec 02 07:39:52 crc kubenswrapper[4895]: I1202 07:39:52.759818 4895 scope.go:117] "RemoveContainer" containerID="10ccedcb2d0369662c1706bc8fa67563f1be7a5c9a48ded0cf6bec1491d2ed4d" Dec 02 07:39:52 crc kubenswrapper[4895]: I1202 07:39:52.761228 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-wg797" event={"ID":"87c87c33-2ac6-4223-870c-aa91961f9952","Type":"ContainerStarted","Data":"e982534584009fb8cdd973ea51f3d80591ecd2529d4ab9b68da68d5c1324349f"} Dec 02 07:39:52 crc kubenswrapper[4895]: I1202 07:39:52.761295 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-wg797" event={"ID":"87c87c33-2ac6-4223-870c-aa91961f9952","Type":"ContainerStarted","Data":"d00c4d2252e6fe7d179b023900cfd699225bae0549c43f6d88c31b5ca12422ce"} Dec 02 07:39:52 crc kubenswrapper[4895]: I1202 07:39:52.774833 4895 scope.go:117] "RemoveContainer" containerID="10ccedcb2d0369662c1706bc8fa67563f1be7a5c9a48ded0cf6bec1491d2ed4d" Dec 02 07:39:52 crc kubenswrapper[4895]: E1202 07:39:52.776422 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10ccedcb2d0369662c1706bc8fa67563f1be7a5c9a48ded0cf6bec1491d2ed4d\": container with ID starting with 10ccedcb2d0369662c1706bc8fa67563f1be7a5c9a48ded0cf6bec1491d2ed4d not found: ID does not exist" containerID="10ccedcb2d0369662c1706bc8fa67563f1be7a5c9a48ded0cf6bec1491d2ed4d" Dec 02 07:39:52 crc kubenswrapper[4895]: I1202 07:39:52.776493 4895 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"10ccedcb2d0369662c1706bc8fa67563f1be7a5c9a48ded0cf6bec1491d2ed4d"} err="failed to get container status \"10ccedcb2d0369662c1706bc8fa67563f1be7a5c9a48ded0cf6bec1491d2ed4d\": rpc error: code = NotFound desc = could not find container \"10ccedcb2d0369662c1706bc8fa67563f1be7a5c9a48ded0cf6bec1491d2ed4d\": container with ID starting with 10ccedcb2d0369662c1706bc8fa67563f1be7a5c9a48ded0cf6bec1491d2ed4d not found: ID does not exist" Dec 02 07:39:52 crc kubenswrapper[4895]: I1202 07:39:52.784948 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-wg797" podStartSLOduration=1.723994159 podStartE2EDuration="1.784923875s" podCreationTimestamp="2025-12-02 07:39:51 +0000 UTC" firstStartedPulling="2025-12-02 07:39:52.148894262 +0000 UTC m=+1003.319753875" lastFinishedPulling="2025-12-02 07:39:52.209823978 +0000 UTC m=+1003.380683591" observedRunningTime="2025-12-02 07:39:52.777153815 +0000 UTC m=+1003.948013438" watchObservedRunningTime="2025-12-02 07:39:52.784923875 +0000 UTC m=+1003.955783488" Dec 02 07:39:52 crc kubenswrapper[4895]: I1202 07:39:52.794708 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-nj66m"] Dec 02 07:39:52 crc kubenswrapper[4895]: I1202 07:39:52.798799 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-nj66m"] Dec 02 07:39:53 crc kubenswrapper[4895]: I1202 07:39:53.149106 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14b0d096-58e1-4eae-a783-6d86b64ffead" path="/var/lib/kubelet/pods/14b0d096-58e1-4eae-a783-6d86b64ffead/volumes" Dec 02 07:40:01 crc kubenswrapper[4895]: I1202 07:40:01.775453 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-wg797" Dec 02 07:40:01 crc kubenswrapper[4895]: I1202 07:40:01.775877 4895 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-wg797" Dec 02 07:40:01 crc kubenswrapper[4895]: I1202 07:40:01.820174 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-wg797" Dec 02 07:40:01 crc kubenswrapper[4895]: I1202 07:40:01.866941 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-wg797" Dec 02 07:40:02 crc kubenswrapper[4895]: I1202 07:40:02.302988 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/eea05993eef6bd02b339f405fe470a4999c2b40f7a894619b42f64e0e1t7x28"] Dec 02 07:40:02 crc kubenswrapper[4895]: E1202 07:40:02.303308 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14b0d096-58e1-4eae-a783-6d86b64ffead" containerName="registry-server" Dec 02 07:40:02 crc kubenswrapper[4895]: I1202 07:40:02.303331 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="14b0d096-58e1-4eae-a783-6d86b64ffead" containerName="registry-server" Dec 02 07:40:02 crc kubenswrapper[4895]: I1202 07:40:02.303507 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="14b0d096-58e1-4eae-a783-6d86b64ffead" containerName="registry-server" Dec 02 07:40:02 crc kubenswrapper[4895]: I1202 07:40:02.304475 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/eea05993eef6bd02b339f405fe470a4999c2b40f7a894619b42f64e0e1t7x28" Dec 02 07:40:02 crc kubenswrapper[4895]: I1202 07:40:02.307700 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-6lvqd" Dec 02 07:40:02 crc kubenswrapper[4895]: I1202 07:40:02.320165 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/eea05993eef6bd02b339f405fe470a4999c2b40f7a894619b42f64e0e1t7x28"] Dec 02 07:40:02 crc kubenswrapper[4895]: I1202 07:40:02.391672 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cf843b0e-9464-4fc2-9121-d1b1128c439f-util\") pod \"eea05993eef6bd02b339f405fe470a4999c2b40f7a894619b42f64e0e1t7x28\" (UID: \"cf843b0e-9464-4fc2-9121-d1b1128c439f\") " pod="openstack-operators/eea05993eef6bd02b339f405fe470a4999c2b40f7a894619b42f64e0e1t7x28" Dec 02 07:40:02 crc kubenswrapper[4895]: I1202 07:40:02.391750 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g24lm\" (UniqueName: \"kubernetes.io/projected/cf843b0e-9464-4fc2-9121-d1b1128c439f-kube-api-access-g24lm\") pod \"eea05993eef6bd02b339f405fe470a4999c2b40f7a894619b42f64e0e1t7x28\" (UID: \"cf843b0e-9464-4fc2-9121-d1b1128c439f\") " pod="openstack-operators/eea05993eef6bd02b339f405fe470a4999c2b40f7a894619b42f64e0e1t7x28" Dec 02 07:40:02 crc kubenswrapper[4895]: I1202 07:40:02.391851 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cf843b0e-9464-4fc2-9121-d1b1128c439f-bundle\") pod \"eea05993eef6bd02b339f405fe470a4999c2b40f7a894619b42f64e0e1t7x28\" (UID: \"cf843b0e-9464-4fc2-9121-d1b1128c439f\") " pod="openstack-operators/eea05993eef6bd02b339f405fe470a4999c2b40f7a894619b42f64e0e1t7x28" Dec 02 07:40:02 crc kubenswrapper[4895]: I1202 
07:40:02.493309 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g24lm\" (UniqueName: \"kubernetes.io/projected/cf843b0e-9464-4fc2-9121-d1b1128c439f-kube-api-access-g24lm\") pod \"eea05993eef6bd02b339f405fe470a4999c2b40f7a894619b42f64e0e1t7x28\" (UID: \"cf843b0e-9464-4fc2-9121-d1b1128c439f\") " pod="openstack-operators/eea05993eef6bd02b339f405fe470a4999c2b40f7a894619b42f64e0e1t7x28" Dec 02 07:40:02 crc kubenswrapper[4895]: I1202 07:40:02.493848 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cf843b0e-9464-4fc2-9121-d1b1128c439f-bundle\") pod \"eea05993eef6bd02b339f405fe470a4999c2b40f7a894619b42f64e0e1t7x28\" (UID: \"cf843b0e-9464-4fc2-9121-d1b1128c439f\") " pod="openstack-operators/eea05993eef6bd02b339f405fe470a4999c2b40f7a894619b42f64e0e1t7x28" Dec 02 07:40:02 crc kubenswrapper[4895]: I1202 07:40:02.494044 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cf843b0e-9464-4fc2-9121-d1b1128c439f-util\") pod \"eea05993eef6bd02b339f405fe470a4999c2b40f7a894619b42f64e0e1t7x28\" (UID: \"cf843b0e-9464-4fc2-9121-d1b1128c439f\") " pod="openstack-operators/eea05993eef6bd02b339f405fe470a4999c2b40f7a894619b42f64e0e1t7x28" Dec 02 07:40:02 crc kubenswrapper[4895]: I1202 07:40:02.494416 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cf843b0e-9464-4fc2-9121-d1b1128c439f-bundle\") pod \"eea05993eef6bd02b339f405fe470a4999c2b40f7a894619b42f64e0e1t7x28\" (UID: \"cf843b0e-9464-4fc2-9121-d1b1128c439f\") " pod="openstack-operators/eea05993eef6bd02b339f405fe470a4999c2b40f7a894619b42f64e0e1t7x28" Dec 02 07:40:02 crc kubenswrapper[4895]: I1202 07:40:02.494694 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/cf843b0e-9464-4fc2-9121-d1b1128c439f-util\") pod \"eea05993eef6bd02b339f405fe470a4999c2b40f7a894619b42f64e0e1t7x28\" (UID: \"cf843b0e-9464-4fc2-9121-d1b1128c439f\") " pod="openstack-operators/eea05993eef6bd02b339f405fe470a4999c2b40f7a894619b42f64e0e1t7x28" Dec 02 07:40:02 crc kubenswrapper[4895]: I1202 07:40:02.513516 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g24lm\" (UniqueName: \"kubernetes.io/projected/cf843b0e-9464-4fc2-9121-d1b1128c439f-kube-api-access-g24lm\") pod \"eea05993eef6bd02b339f405fe470a4999c2b40f7a894619b42f64e0e1t7x28\" (UID: \"cf843b0e-9464-4fc2-9121-d1b1128c439f\") " pod="openstack-operators/eea05993eef6bd02b339f405fe470a4999c2b40f7a894619b42f64e0e1t7x28" Dec 02 07:40:02 crc kubenswrapper[4895]: I1202 07:40:02.638913 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/eea05993eef6bd02b339f405fe470a4999c2b40f7a894619b42f64e0e1t7x28" Dec 02 07:40:02 crc kubenswrapper[4895]: I1202 07:40:02.866714 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/eea05993eef6bd02b339f405fe470a4999c2b40f7a894619b42f64e0e1t7x28"] Dec 02 07:40:03 crc kubenswrapper[4895]: I1202 07:40:03.897187 4895 generic.go:334] "Generic (PLEG): container finished" podID="cf843b0e-9464-4fc2-9121-d1b1128c439f" containerID="96380d269418e37c5a82b37451f3a9150a34154d013a6f456ea8b06a30f85503" exitCode=0 Dec 02 07:40:03 crc kubenswrapper[4895]: I1202 07:40:03.897266 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/eea05993eef6bd02b339f405fe470a4999c2b40f7a894619b42f64e0e1t7x28" event={"ID":"cf843b0e-9464-4fc2-9121-d1b1128c439f","Type":"ContainerDied","Data":"96380d269418e37c5a82b37451f3a9150a34154d013a6f456ea8b06a30f85503"} Dec 02 07:40:03 crc kubenswrapper[4895]: I1202 07:40:03.897353 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/eea05993eef6bd02b339f405fe470a4999c2b40f7a894619b42f64e0e1t7x28" event={"ID":"cf843b0e-9464-4fc2-9121-d1b1128c439f","Type":"ContainerStarted","Data":"a1f5ea205013503c26ee9e217a5bba0d8e73644d39e29ad17197449f92ddb085"} Dec 02 07:40:04 crc kubenswrapper[4895]: I1202 07:40:04.905127 4895 generic.go:334] "Generic (PLEG): container finished" podID="cf843b0e-9464-4fc2-9121-d1b1128c439f" containerID="16bbf06a866e2204895fbf08e12d822d1c257347df1ee77216cfdb595bc40d86" exitCode=0 Dec 02 07:40:04 crc kubenswrapper[4895]: I1202 07:40:04.905250 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/eea05993eef6bd02b339f405fe470a4999c2b40f7a894619b42f64e0e1t7x28" event={"ID":"cf843b0e-9464-4fc2-9121-d1b1128c439f","Type":"ContainerDied","Data":"16bbf06a866e2204895fbf08e12d822d1c257347df1ee77216cfdb595bc40d86"} Dec 02 07:40:05 crc kubenswrapper[4895]: I1202 07:40:05.915052 4895 generic.go:334] "Generic (PLEG): container finished" podID="cf843b0e-9464-4fc2-9121-d1b1128c439f" containerID="3050c4952381bba6f84393d83ef15d0c6755532825ffe2b9bbe3c307a6950ad3" exitCode=0 Dec 02 07:40:05 crc kubenswrapper[4895]: I1202 07:40:05.915085 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/eea05993eef6bd02b339f405fe470a4999c2b40f7a894619b42f64e0e1t7x28" event={"ID":"cf843b0e-9464-4fc2-9121-d1b1128c439f","Type":"ContainerDied","Data":"3050c4952381bba6f84393d83ef15d0c6755532825ffe2b9bbe3c307a6950ad3"} Dec 02 07:40:07 crc kubenswrapper[4895]: I1202 07:40:07.356577 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/eea05993eef6bd02b339f405fe470a4999c2b40f7a894619b42f64e0e1t7x28" Dec 02 07:40:07 crc kubenswrapper[4895]: I1202 07:40:07.380450 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g24lm\" (UniqueName: \"kubernetes.io/projected/cf843b0e-9464-4fc2-9121-d1b1128c439f-kube-api-access-g24lm\") pod \"cf843b0e-9464-4fc2-9121-d1b1128c439f\" (UID: \"cf843b0e-9464-4fc2-9121-d1b1128c439f\") " Dec 02 07:40:07 crc kubenswrapper[4895]: I1202 07:40:07.380589 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cf843b0e-9464-4fc2-9121-d1b1128c439f-bundle\") pod \"cf843b0e-9464-4fc2-9121-d1b1128c439f\" (UID: \"cf843b0e-9464-4fc2-9121-d1b1128c439f\") " Dec 02 07:40:07 crc kubenswrapper[4895]: I1202 07:40:07.380673 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cf843b0e-9464-4fc2-9121-d1b1128c439f-util\") pod \"cf843b0e-9464-4fc2-9121-d1b1128c439f\" (UID: \"cf843b0e-9464-4fc2-9121-d1b1128c439f\") " Dec 02 07:40:07 crc kubenswrapper[4895]: I1202 07:40:07.381807 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf843b0e-9464-4fc2-9121-d1b1128c439f-bundle" (OuterVolumeSpecName: "bundle") pod "cf843b0e-9464-4fc2-9121-d1b1128c439f" (UID: "cf843b0e-9464-4fc2-9121-d1b1128c439f"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:40:07 crc kubenswrapper[4895]: I1202 07:40:07.386634 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf843b0e-9464-4fc2-9121-d1b1128c439f-kube-api-access-g24lm" (OuterVolumeSpecName: "kube-api-access-g24lm") pod "cf843b0e-9464-4fc2-9121-d1b1128c439f" (UID: "cf843b0e-9464-4fc2-9121-d1b1128c439f"). InnerVolumeSpecName "kube-api-access-g24lm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:40:07 crc kubenswrapper[4895]: I1202 07:40:07.395438 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf843b0e-9464-4fc2-9121-d1b1128c439f-util" (OuterVolumeSpecName: "util") pod "cf843b0e-9464-4fc2-9121-d1b1128c439f" (UID: "cf843b0e-9464-4fc2-9121-d1b1128c439f"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:40:07 crc kubenswrapper[4895]: I1202 07:40:07.481806 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g24lm\" (UniqueName: \"kubernetes.io/projected/cf843b0e-9464-4fc2-9121-d1b1128c439f-kube-api-access-g24lm\") on node \"crc\" DevicePath \"\"" Dec 02 07:40:07 crc kubenswrapper[4895]: I1202 07:40:07.481839 4895 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cf843b0e-9464-4fc2-9121-d1b1128c439f-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 07:40:07 crc kubenswrapper[4895]: I1202 07:40:07.481849 4895 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cf843b0e-9464-4fc2-9121-d1b1128c439f-util\") on node \"crc\" DevicePath \"\"" Dec 02 07:40:07 crc kubenswrapper[4895]: I1202 07:40:07.931366 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/eea05993eef6bd02b339f405fe470a4999c2b40f7a894619b42f64e0e1t7x28" event={"ID":"cf843b0e-9464-4fc2-9121-d1b1128c439f","Type":"ContainerDied","Data":"a1f5ea205013503c26ee9e217a5bba0d8e73644d39e29ad17197449f92ddb085"} Dec 02 07:40:07 crc kubenswrapper[4895]: I1202 07:40:07.931409 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a1f5ea205013503c26ee9e217a5bba0d8e73644d39e29ad17197449f92ddb085" Dec 02 07:40:07 crc kubenswrapper[4895]: I1202 07:40:07.931428 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/eea05993eef6bd02b339f405fe470a4999c2b40f7a894619b42f64e0e1t7x28" Dec 02 07:40:15 crc kubenswrapper[4895]: I1202 07:40:15.527206 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-6d58ccd9c-c8j68"] Dec 02 07:40:15 crc kubenswrapper[4895]: E1202 07:40:15.528222 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf843b0e-9464-4fc2-9121-d1b1128c439f" containerName="extract" Dec 02 07:40:15 crc kubenswrapper[4895]: I1202 07:40:15.528238 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf843b0e-9464-4fc2-9121-d1b1128c439f" containerName="extract" Dec 02 07:40:15 crc kubenswrapper[4895]: E1202 07:40:15.528261 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf843b0e-9464-4fc2-9121-d1b1128c439f" containerName="pull" Dec 02 07:40:15 crc kubenswrapper[4895]: I1202 07:40:15.528268 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf843b0e-9464-4fc2-9121-d1b1128c439f" containerName="pull" Dec 02 07:40:15 crc kubenswrapper[4895]: E1202 07:40:15.528278 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf843b0e-9464-4fc2-9121-d1b1128c439f" containerName="util" Dec 02 07:40:15 crc kubenswrapper[4895]: I1202 07:40:15.528285 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf843b0e-9464-4fc2-9121-d1b1128c439f" containerName="util" Dec 02 07:40:15 crc kubenswrapper[4895]: I1202 07:40:15.528407 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf843b0e-9464-4fc2-9121-d1b1128c439f" containerName="extract" Dec 02 07:40:15 crc kubenswrapper[4895]: I1202 07:40:15.528869 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-6d58ccd9c-c8j68" Dec 02 07:40:15 crc kubenswrapper[4895]: I1202 07:40:15.539169 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-cd9gk" Dec 02 07:40:15 crc kubenswrapper[4895]: I1202 07:40:15.565978 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-6d58ccd9c-c8j68"] Dec 02 07:40:15 crc kubenswrapper[4895]: I1202 07:40:15.625406 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2k8cw\" (UniqueName: \"kubernetes.io/projected/63cf2176-3acd-461d-9fda-3f2337a37452-kube-api-access-2k8cw\") pod \"openstack-operator-controller-operator-6d58ccd9c-c8j68\" (UID: \"63cf2176-3acd-461d-9fda-3f2337a37452\") " pod="openstack-operators/openstack-operator-controller-operator-6d58ccd9c-c8j68" Dec 02 07:40:15 crc kubenswrapper[4895]: I1202 07:40:15.726702 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2k8cw\" (UniqueName: \"kubernetes.io/projected/63cf2176-3acd-461d-9fda-3f2337a37452-kube-api-access-2k8cw\") pod \"openstack-operator-controller-operator-6d58ccd9c-c8j68\" (UID: \"63cf2176-3acd-461d-9fda-3f2337a37452\") " pod="openstack-operators/openstack-operator-controller-operator-6d58ccd9c-c8j68" Dec 02 07:40:15 crc kubenswrapper[4895]: I1202 07:40:15.747188 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2k8cw\" (UniqueName: \"kubernetes.io/projected/63cf2176-3acd-461d-9fda-3f2337a37452-kube-api-access-2k8cw\") pod \"openstack-operator-controller-operator-6d58ccd9c-c8j68\" (UID: \"63cf2176-3acd-461d-9fda-3f2337a37452\") " pod="openstack-operators/openstack-operator-controller-operator-6d58ccd9c-c8j68" Dec 02 07:40:15 crc kubenswrapper[4895]: I1202 07:40:15.859343 4895 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-6d58ccd9c-c8j68" Dec 02 07:40:16 crc kubenswrapper[4895]: I1202 07:40:16.348267 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-6d58ccd9c-c8j68"] Dec 02 07:40:17 crc kubenswrapper[4895]: I1202 07:40:17.013624 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-6d58ccd9c-c8j68" event={"ID":"63cf2176-3acd-461d-9fda-3f2337a37452","Type":"ContainerStarted","Data":"2a230444b8c3016edd9434849ce7ec19e5f5f960854073e2474995245a99354c"} Dec 02 07:40:23 crc kubenswrapper[4895]: I1202 07:40:23.099688 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-6d58ccd9c-c8j68" event={"ID":"63cf2176-3acd-461d-9fda-3f2337a37452","Type":"ContainerStarted","Data":"ba82d86f0297c9a43ea2fba1dc0bdf7cb0304a92d1ed0299925c4f8b1abca79c"} Dec 02 07:40:23 crc kubenswrapper[4895]: I1202 07:40:23.100401 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-6d58ccd9c-c8j68" Dec 02 07:40:23 crc kubenswrapper[4895]: I1202 07:40:23.137402 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-6d58ccd9c-c8j68" podStartSLOduration=1.8625816469999998 podStartE2EDuration="8.137380375s" podCreationTimestamp="2025-12-02 07:40:15 +0000 UTC" firstStartedPulling="2025-12-02 07:40:16.362916523 +0000 UTC m=+1027.533776136" lastFinishedPulling="2025-12-02 07:40:22.637715241 +0000 UTC m=+1033.808574864" observedRunningTime="2025-12-02 07:40:23.132214586 +0000 UTC m=+1034.303074199" watchObservedRunningTime="2025-12-02 07:40:23.137380375 +0000 UTC m=+1034.308239988" Dec 02 07:40:35 crc kubenswrapper[4895]: I1202 07:40:35.862837 
4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-6d58ccd9c-c8j68" Dec 02 07:40:55 crc kubenswrapper[4895]: I1202 07:40:55.457913 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-w52df"] Dec 02 07:40:55 crc kubenswrapper[4895]: I1202 07:40:55.471816 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-w52df" Dec 02 07:40:55 crc kubenswrapper[4895]: I1202 07:40:55.473326 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-w52df"] Dec 02 07:40:55 crc kubenswrapper[4895]: I1202 07:40:55.483426 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-d7zx8" Dec 02 07:40:55 crc kubenswrapper[4895]: I1202 07:40:55.503353 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-74cpb"] Dec 02 07:40:55 crc kubenswrapper[4895]: I1202 07:40:55.505100 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-74cpb" Dec 02 07:40:55 crc kubenswrapper[4895]: I1202 07:40:55.512389 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-74cpb"] Dec 02 07:40:55 crc kubenswrapper[4895]: I1202 07:40:55.517364 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-m8cp2"] Dec 02 07:40:55 crc kubenswrapper[4895]: I1202 07:40:55.540450 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-m8cp2"] Dec 02 07:40:55 crc kubenswrapper[4895]: I1202 07:40:55.540495 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-mgtj6"] Dec 02 07:40:55 crc kubenswrapper[4895]: I1202 07:40:55.541253 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-668d9c48b9-g8gjq"] Dec 02 07:40:55 crc kubenswrapper[4895]: I1202 07:40:55.541628 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-m8cp2" Dec 02 07:40:55 crc kubenswrapper[4895]: I1202 07:40:55.541952 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-mgtj6" Dec 02 07:40:55 crc kubenswrapper[4895]: I1202 07:40:55.558011 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-hjmll" Dec 02 07:40:55 crc kubenswrapper[4895]: I1202 07:40:55.558554 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-8hqwr" Dec 02 07:40:55 crc kubenswrapper[4895]: I1202 07:40:55.558618 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-t6cr5" Dec 02 07:40:55 crc kubenswrapper[4895]: I1202 07:40:55.559733 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-mgtj6"] Dec 02 07:40:55 crc kubenswrapper[4895]: I1202 07:40:55.559788 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-668d9c48b9-g8gjq"] Dec 02 07:40:55 crc kubenswrapper[4895]: I1202 07:40:55.559807 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-sbrp2"] Dec 02 07:40:55 crc kubenswrapper[4895]: I1202 07:40:55.559962 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-668d9c48b9-g8gjq" Dec 02 07:40:55 crc kubenswrapper[4895]: I1202 07:40:55.563949 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-wmsb6"] Dec 02 07:40:55 crc kubenswrapper[4895]: I1202 07:40:55.565017 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-57548d458d-wmsb6" Dec 02 07:40:55 crc kubenswrapper[4895]: I1202 07:40:55.566444 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-sbrp2" Dec 02 07:40:55 crc kubenswrapper[4895]: I1202 07:40:55.573877 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Dec 02 07:40:55 crc kubenswrapper[4895]: I1202 07:40:55.575263 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-gbckh" Dec 02 07:40:55 crc kubenswrapper[4895]: I1202 07:40:55.575596 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-hpv5h" Dec 02 07:40:55 crc kubenswrapper[4895]: I1202 07:40:55.575773 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-4hwws" Dec 02 07:40:55 crc kubenswrapper[4895]: I1202 07:40:55.584193 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7585h\" (UniqueName: \"kubernetes.io/projected/73f50459-103c-461c-a71a-95e93d66c4c2-kube-api-access-7585h\") pod \"barbican-operator-controller-manager-7d9dfd778-w52df\" (UID: \"73f50459-103c-461c-a71a-95e93d66c4c2\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-w52df" Dec 02 07:40:55 crc kubenswrapper[4895]: I1202 07:40:55.584244 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwvvh\" (UniqueName: \"kubernetes.io/projected/f6776d5f-3c3e-48b5-a6fd-30ff153345c2-kube-api-access-bwvvh\") pod \"designate-operator-controller-manager-78b4bc895b-m8cp2\" (UID: 
\"f6776d5f-3c3e-48b5-a6fd-30ff153345c2\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-m8cp2" Dec 02 07:40:55 crc kubenswrapper[4895]: I1202 07:40:55.584308 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-msm2f\" (UniqueName: \"kubernetes.io/projected/582c057b-7217-47bf-b2d7-f691861668c3-kube-api-access-msm2f\") pod \"cinder-operator-controller-manager-859b6ccc6-74cpb\" (UID: \"582c057b-7217-47bf-b2d7-f691861668c3\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-74cpb" Dec 02 07:40:55 crc kubenswrapper[4895]: I1202 07:40:55.584334 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jds6l\" (UniqueName: \"kubernetes.io/projected/442a4a4d-98fb-4869-9418-7f8f3ff4644b-kube-api-access-jds6l\") pod \"heat-operator-controller-manager-5f64f6f8bb-mgtj6\" (UID: \"442a4a4d-98fb-4869-9418-7f8f3ff4644b\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-mgtj6" Dec 02 07:40:55 crc kubenswrapper[4895]: I1202 07:40:55.589329 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-sbrp2"] Dec 02 07:40:55 crc kubenswrapper[4895]: I1202 07:40:55.597730 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-wmsb6"] Dec 02 07:40:55 crc kubenswrapper[4895]: I1202 07:40:55.612824 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-cp9zj"] Dec 02 07:40:55 crc kubenswrapper[4895]: I1202 07:40:55.614216 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-cp9zj" Dec 02 07:40:55 crc kubenswrapper[4895]: I1202 07:40:55.616696 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-h4cpt" Dec 02 07:40:55 crc kubenswrapper[4895]: I1202 07:40:55.645773 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-cp9zj"] Dec 02 07:40:55 crc kubenswrapper[4895]: I1202 07:40:55.659305 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-546d4bdf48-z6fb4"] Dec 02 07:40:55 crc kubenswrapper[4895]: I1202 07:40:55.662378 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-z6fb4" Dec 02 07:40:55 crc kubenswrapper[4895]: I1202 07:40:55.670950 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-546d4bdf48-z6fb4"] Dec 02 07:40:55 crc kubenswrapper[4895]: I1202 07:40:55.685328 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkgk4\" (UniqueName: \"kubernetes.io/projected/4c5e704b-8d64-4341-abfe-da2df788ba5c-kube-api-access-fkgk4\") pod \"ironic-operator-controller-manager-6c548fd776-cp9zj\" (UID: \"4c5e704b-8d64-4341-abfe-da2df788ba5c\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-cp9zj" Dec 02 07:40:55 crc kubenswrapper[4895]: I1202 07:40:55.685340 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-dvg4w" Dec 02 07:40:55 crc kubenswrapper[4895]: I1202 07:40:55.685381 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/86fe6ea0-2ba9-46f8-9a71-1b990d841e31-cert\") pod \"infra-operator-controller-manager-57548d458d-wmsb6\" (UID: \"86fe6ea0-2ba9-46f8-9a71-1b990d841e31\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-wmsb6" Dec 02 07:40:55 crc kubenswrapper[4895]: I1202 07:40:55.685439 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jjjh\" (UniqueName: \"kubernetes.io/projected/362234fb-b096-48e0-9be1-bed6b3e1dcf6-kube-api-access-7jjjh\") pod \"glance-operator-controller-manager-668d9c48b9-g8gjq\" (UID: \"362234fb-b096-48e0-9be1-bed6b3e1dcf6\") " pod="openstack-operators/glance-operator-controller-manager-668d9c48b9-g8gjq" Dec 02 07:40:55 crc kubenswrapper[4895]: I1202 07:40:55.685467 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrprl\" (UniqueName: \"kubernetes.io/projected/86fe6ea0-2ba9-46f8-9a71-1b990d841e31-kube-api-access-lrprl\") pod \"infra-operator-controller-manager-57548d458d-wmsb6\" (UID: \"86fe6ea0-2ba9-46f8-9a71-1b990d841e31\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-wmsb6" Dec 02 07:40:55 crc kubenswrapper[4895]: I1202 07:40:55.685490 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-msm2f\" (UniqueName: \"kubernetes.io/projected/582c057b-7217-47bf-b2d7-f691861668c3-kube-api-access-msm2f\") pod \"cinder-operator-controller-manager-859b6ccc6-74cpb\" (UID: \"582c057b-7217-47bf-b2d7-f691861668c3\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-74cpb" Dec 02 07:40:55 crc kubenswrapper[4895]: I1202 07:40:55.685517 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jds6l\" (UniqueName: \"kubernetes.io/projected/442a4a4d-98fb-4869-9418-7f8f3ff4644b-kube-api-access-jds6l\") pod \"heat-operator-controller-manager-5f64f6f8bb-mgtj6\" (UID: 
\"442a4a4d-98fb-4869-9418-7f8f3ff4644b\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-mgtj6" Dec 02 07:40:55 crc kubenswrapper[4895]: I1202 07:40:55.685599 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfxpf\" (UniqueName: \"kubernetes.io/projected/806c9a5b-16ad-499d-8625-ec9124baca56-kube-api-access-vfxpf\") pod \"horizon-operator-controller-manager-68c6d99b8f-sbrp2\" (UID: \"806c9a5b-16ad-499d-8625-ec9124baca56\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-sbrp2" Dec 02 07:40:55 crc kubenswrapper[4895]: I1202 07:40:55.685680 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7585h\" (UniqueName: \"kubernetes.io/projected/73f50459-103c-461c-a71a-95e93d66c4c2-kube-api-access-7585h\") pod \"barbican-operator-controller-manager-7d9dfd778-w52df\" (UID: \"73f50459-103c-461c-a71a-95e93d66c4c2\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-w52df" Dec 02 07:40:55 crc kubenswrapper[4895]: I1202 07:40:55.685794 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bwvvh\" (UniqueName: \"kubernetes.io/projected/f6776d5f-3c3e-48b5-a6fd-30ff153345c2-kube-api-access-bwvvh\") pod \"designate-operator-controller-manager-78b4bc895b-m8cp2\" (UID: \"f6776d5f-3c3e-48b5-a6fd-30ff153345c2\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-m8cp2" Dec 02 07:40:55 crc kubenswrapper[4895]: I1202 07:40:55.715239 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-qxbgv"] Dec 02 07:40:55 crc kubenswrapper[4895]: I1202 07:40:55.716918 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-qxbgv" Dec 02 07:40:55 crc kubenswrapper[4895]: I1202 07:40:55.718938 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-b7mfn" Dec 02 07:40:55 crc kubenswrapper[4895]: I1202 07:40:55.726385 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jds6l\" (UniqueName: \"kubernetes.io/projected/442a4a4d-98fb-4869-9418-7f8f3ff4644b-kube-api-access-jds6l\") pod \"heat-operator-controller-manager-5f64f6f8bb-mgtj6\" (UID: \"442a4a4d-98fb-4869-9418-7f8f3ff4644b\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-mgtj6" Dec 02 07:40:55 crc kubenswrapper[4895]: I1202 07:40:55.727187 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7585h\" (UniqueName: \"kubernetes.io/projected/73f50459-103c-461c-a71a-95e93d66c4c2-kube-api-access-7585h\") pod \"barbican-operator-controller-manager-7d9dfd778-w52df\" (UID: \"73f50459-103c-461c-a71a-95e93d66c4c2\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-w52df" Dec 02 07:40:55 crc kubenswrapper[4895]: I1202 07:40:55.727384 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-msm2f\" (UniqueName: \"kubernetes.io/projected/582c057b-7217-47bf-b2d7-f691861668c3-kube-api-access-msm2f\") pod \"cinder-operator-controller-manager-859b6ccc6-74cpb\" (UID: \"582c057b-7217-47bf-b2d7-f691861668c3\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-74cpb" Dec 02 07:40:55 crc kubenswrapper[4895]: I1202 07:40:55.730924 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-6546668bfd-5xsjx"] Dec 02 07:40:55 crc kubenswrapper[4895]: I1202 07:40:55.732021 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-6546668bfd-5xsjx" Dec 02 07:40:55 crc kubenswrapper[4895]: I1202 07:40:55.739930 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-nhrj7" Dec 02 07:40:55 crc kubenswrapper[4895]: I1202 07:40:55.741113 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bwvvh\" (UniqueName: \"kubernetes.io/projected/f6776d5f-3c3e-48b5-a6fd-30ff153345c2-kube-api-access-bwvvh\") pod \"designate-operator-controller-manager-78b4bc895b-m8cp2\" (UID: \"f6776d5f-3c3e-48b5-a6fd-30ff153345c2\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-m8cp2" Dec 02 07:40:55 crc kubenswrapper[4895]: I1202 07:40:55.763040 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-qxbgv"] Dec 02 07:40:55 crc kubenswrapper[4895]: I1202 07:40:55.787681 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fkgk4\" (UniqueName: \"kubernetes.io/projected/4c5e704b-8d64-4341-abfe-da2df788ba5c-kube-api-access-fkgk4\") pod \"ironic-operator-controller-manager-6c548fd776-cp9zj\" (UID: \"4c5e704b-8d64-4341-abfe-da2df788ba5c\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-cp9zj" Dec 02 07:40:55 crc kubenswrapper[4895]: I1202 07:40:55.787785 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/86fe6ea0-2ba9-46f8-9a71-1b990d841e31-cert\") pod \"infra-operator-controller-manager-57548d458d-wmsb6\" (UID: \"86fe6ea0-2ba9-46f8-9a71-1b990d841e31\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-wmsb6" Dec 02 07:40:55 crc kubenswrapper[4895]: I1202 07:40:55.787817 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-7jjjh\" (UniqueName: \"kubernetes.io/projected/362234fb-b096-48e0-9be1-bed6b3e1dcf6-kube-api-access-7jjjh\") pod \"glance-operator-controller-manager-668d9c48b9-g8gjq\" (UID: \"362234fb-b096-48e0-9be1-bed6b3e1dcf6\") " pod="openstack-operators/glance-operator-controller-manager-668d9c48b9-g8gjq" Dec 02 07:40:55 crc kubenswrapper[4895]: I1202 07:40:55.787845 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrprl\" (UniqueName: \"kubernetes.io/projected/86fe6ea0-2ba9-46f8-9a71-1b990d841e31-kube-api-access-lrprl\") pod \"infra-operator-controller-manager-57548d458d-wmsb6\" (UID: \"86fe6ea0-2ba9-46f8-9a71-1b990d841e31\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-wmsb6" Dec 02 07:40:55 crc kubenswrapper[4895]: I1202 07:40:55.787873 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltmr2\" (UniqueName: \"kubernetes.io/projected/392adf20-0169-4258-9f4f-bb293bd5f8e8-kube-api-access-ltmr2\") pod \"mariadb-operator-controller-manager-56bbcc9d85-qxbgv\" (UID: \"392adf20-0169-4258-9f4f-bb293bd5f8e8\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-qxbgv" Dec 02 07:40:55 crc kubenswrapper[4895]: I1202 07:40:55.787907 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vfxpf\" (UniqueName: \"kubernetes.io/projected/806c9a5b-16ad-499d-8625-ec9124baca56-kube-api-access-vfxpf\") pod \"horizon-operator-controller-manager-68c6d99b8f-sbrp2\" (UID: \"806c9a5b-16ad-499d-8625-ec9124baca56\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-sbrp2" Dec 02 07:40:55 crc kubenswrapper[4895]: I1202 07:40:55.787951 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4cwg\" (UniqueName: 
\"kubernetes.io/projected/170932c6-4350-4209-ba99-ff53eecd81ee-kube-api-access-p4cwg\") pod \"manila-operator-controller-manager-6546668bfd-5xsjx\" (UID: \"170932c6-4350-4209-ba99-ff53eecd81ee\") " pod="openstack-operators/manila-operator-controller-manager-6546668bfd-5xsjx" Dec 02 07:40:55 crc kubenswrapper[4895]: I1202 07:40:55.787968 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52pfp\" (UniqueName: \"kubernetes.io/projected/cb79c25e-42b0-4c89-b756-89d97afeea8a-kube-api-access-52pfp\") pod \"keystone-operator-controller-manager-546d4bdf48-z6fb4\" (UID: \"cb79c25e-42b0-4c89-b756-89d97afeea8a\") " pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-z6fb4" Dec 02 07:40:55 crc kubenswrapper[4895]: E1202 07:40:55.788138 4895 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 02 07:40:55 crc kubenswrapper[4895]: E1202 07:40:55.788199 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86fe6ea0-2ba9-46f8-9a71-1b990d841e31-cert podName:86fe6ea0-2ba9-46f8-9a71-1b990d841e31 nodeName:}" failed. No retries permitted until 2025-12-02 07:40:56.288174134 +0000 UTC m=+1067.459033747 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/86fe6ea0-2ba9-46f8-9a71-1b990d841e31-cert") pod "infra-operator-controller-manager-57548d458d-wmsb6" (UID: "86fe6ea0-2ba9-46f8-9a71-1b990d841e31") : secret "infra-operator-webhook-server-cert" not found Dec 02 07:40:55 crc kubenswrapper[4895]: I1202 07:40:55.811438 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-9zczx"] Dec 02 07:40:55 crc kubenswrapper[4895]: I1202 07:40:55.812403 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrprl\" (UniqueName: \"kubernetes.io/projected/86fe6ea0-2ba9-46f8-9a71-1b990d841e31-kube-api-access-lrprl\") pod \"infra-operator-controller-manager-57548d458d-wmsb6\" (UID: \"86fe6ea0-2ba9-46f8-9a71-1b990d841e31\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-wmsb6" Dec 02 07:40:55 crc kubenswrapper[4895]: I1202 07:40:55.813039 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-9zczx" Dec 02 07:40:55 crc kubenswrapper[4895]: I1202 07:40:55.815354 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-6546668bfd-5xsjx"] Dec 02 07:40:55 crc kubenswrapper[4895]: I1202 07:40:55.819356 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jjjh\" (UniqueName: \"kubernetes.io/projected/362234fb-b096-48e0-9be1-bed6b3e1dcf6-kube-api-access-7jjjh\") pod \"glance-operator-controller-manager-668d9c48b9-g8gjq\" (UID: \"362234fb-b096-48e0-9be1-bed6b3e1dcf6\") " pod="openstack-operators/glance-operator-controller-manager-668d9c48b9-g8gjq" Dec 02 07:40:55 crc kubenswrapper[4895]: I1202 07:40:55.820239 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-pkvw6" Dec 02 07:40:55 crc kubenswrapper[4895]: I1202 07:40:55.821122 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkgk4\" (UniqueName: \"kubernetes.io/projected/4c5e704b-8d64-4341-abfe-da2df788ba5c-kube-api-access-fkgk4\") pod \"ironic-operator-controller-manager-6c548fd776-cp9zj\" (UID: \"4c5e704b-8d64-4341-abfe-da2df788ba5c\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-cp9zj" Dec 02 07:40:55 crc kubenswrapper[4895]: I1202 07:40:55.839691 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-w52df" Dec 02 07:40:55 crc kubenswrapper[4895]: I1202 07:40:55.841049 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfxpf\" (UniqueName: \"kubernetes.io/projected/806c9a5b-16ad-499d-8625-ec9124baca56-kube-api-access-vfxpf\") pod \"horizon-operator-controller-manager-68c6d99b8f-sbrp2\" (UID: \"806c9a5b-16ad-499d-8625-ec9124baca56\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-sbrp2" Dec 02 07:40:55 crc kubenswrapper[4895]: I1202 07:40:55.853397 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-9zczx"] Dec 02 07:40:55 crc kubenswrapper[4895]: I1202 07:40:55.877839 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-4hpbt"] Dec 02 07:40:55 crc kubenswrapper[4895]: I1202 07:40:55.879397 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-4hpbt" Dec 02 07:40:55 crc kubenswrapper[4895]: I1202 07:40:55.883936 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-bvrp5" Dec 02 07:40:55 crc kubenswrapper[4895]: I1202 07:40:55.892388 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ltmr2\" (UniqueName: \"kubernetes.io/projected/392adf20-0169-4258-9f4f-bb293bd5f8e8-kube-api-access-ltmr2\") pod \"mariadb-operator-controller-manager-56bbcc9d85-qxbgv\" (UID: \"392adf20-0169-4258-9f4f-bb293bd5f8e8\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-qxbgv" Dec 02 07:40:55 crc kubenswrapper[4895]: I1202 07:40:55.892449 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gw4j\" (UniqueName: \"kubernetes.io/projected/ca1b5423-1f2c-4b12-9ae9-f65bb5301c51-kube-api-access-2gw4j\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-9zczx\" (UID: \"ca1b5423-1f2c-4b12-9ae9-f65bb5301c51\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-9zczx" Dec 02 07:40:55 crc kubenswrapper[4895]: I1202 07:40:55.892513 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4cwg\" (UniqueName: \"kubernetes.io/projected/170932c6-4350-4209-ba99-ff53eecd81ee-kube-api-access-p4cwg\") pod \"manila-operator-controller-manager-6546668bfd-5xsjx\" (UID: \"170932c6-4350-4209-ba99-ff53eecd81ee\") " pod="openstack-operators/manila-operator-controller-manager-6546668bfd-5xsjx" Dec 02 07:40:55 crc kubenswrapper[4895]: I1202 07:40:55.892536 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-52pfp\" (UniqueName: \"kubernetes.io/projected/cb79c25e-42b0-4c89-b756-89d97afeea8a-kube-api-access-52pfp\") pod 
\"keystone-operator-controller-manager-546d4bdf48-z6fb4\" (UID: \"cb79c25e-42b0-4c89-b756-89d97afeea8a\") " pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-z6fb4" Dec 02 07:40:55 crc kubenswrapper[4895]: I1202 07:40:55.897078 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-74cpb" Dec 02 07:40:55 crc kubenswrapper[4895]: I1202 07:40:55.914406 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-m8cp2" Dec 02 07:40:55 crc kubenswrapper[4895]: I1202 07:40:55.917613 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4cwg\" (UniqueName: \"kubernetes.io/projected/170932c6-4350-4209-ba99-ff53eecd81ee-kube-api-access-p4cwg\") pod \"manila-operator-controller-manager-6546668bfd-5xsjx\" (UID: \"170932c6-4350-4209-ba99-ff53eecd81ee\") " pod="openstack-operators/manila-operator-controller-manager-6546668bfd-5xsjx" Dec 02 07:40:55 crc kubenswrapper[4895]: I1202 07:40:55.918963 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-52pfp\" (UniqueName: \"kubernetes.io/projected/cb79c25e-42b0-4c89-b756-89d97afeea8a-kube-api-access-52pfp\") pod \"keystone-operator-controller-manager-546d4bdf48-z6fb4\" (UID: \"cb79c25e-42b0-4c89-b756-89d97afeea8a\") " pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-z6fb4" Dec 02 07:40:55 crc kubenswrapper[4895]: I1202 07:40:55.926712 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-mgtj6" Dec 02 07:40:55 crc kubenswrapper[4895]: I1202 07:40:55.929872 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-6546668bfd-5xsjx" Dec 02 07:40:55 crc kubenswrapper[4895]: I1202 07:40:55.932602 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltmr2\" (UniqueName: \"kubernetes.io/projected/392adf20-0169-4258-9f4f-bb293bd5f8e8-kube-api-access-ltmr2\") pod \"mariadb-operator-controller-manager-56bbcc9d85-qxbgv\" (UID: \"392adf20-0169-4258-9f4f-bb293bd5f8e8\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-qxbgv" Dec 02 07:40:55 crc kubenswrapper[4895]: I1202 07:40:55.952410 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-668d9c48b9-g8gjq" Dec 02 07:40:56 crc kubenswrapper[4895]: I1202 07:40:55.998334 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-4hpbt"] Dec 02 07:40:56 crc kubenswrapper[4895]: I1202 07:40:56.012903 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-cp9zj" Dec 02 07:40:56 crc kubenswrapper[4895]: I1202 07:40:56.017356 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7cswj\" (UniqueName: \"kubernetes.io/projected/2a05da0d-5cc8-4656-8cf4-96b96077d708-kube-api-access-7cswj\") pod \"nova-operator-controller-manager-697bc559fc-4hpbt\" (UID: \"2a05da0d-5cc8-4656-8cf4-96b96077d708\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-4hpbt" Dec 02 07:40:56 crc kubenswrapper[4895]: I1202 07:40:56.017526 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2gw4j\" (UniqueName: \"kubernetes.io/projected/ca1b5423-1f2c-4b12-9ae9-f65bb5301c51-kube-api-access-2gw4j\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-9zczx\" (UID: \"ca1b5423-1f2c-4b12-9ae9-f65bb5301c51\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-9zczx" Dec 02 07:40:56 crc kubenswrapper[4895]: I1202 07:40:56.024371 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-sbrp2" Dec 02 07:40:56 crc kubenswrapper[4895]: I1202 07:40:56.029294 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-z6fb4" Dec 02 07:40:56 crc kubenswrapper[4895]: I1202 07:40:56.049357 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gw4j\" (UniqueName: \"kubernetes.io/projected/ca1b5423-1f2c-4b12-9ae9-f65bb5301c51-kube-api-access-2gw4j\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-9zczx\" (UID: \"ca1b5423-1f2c-4b12-9ae9-f65bb5301c51\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-9zczx" Dec 02 07:40:56 crc kubenswrapper[4895]: I1202 07:40:56.075332 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-2mrjr"] Dec 02 07:40:56 crc kubenswrapper[4895]: I1202 07:40:56.076550 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-998648c74-2mrjr" Dec 02 07:40:56 crc kubenswrapper[4895]: I1202 07:40:56.081577 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-2mrjr"] Dec 02 07:40:56 crc kubenswrapper[4895]: I1202 07:40:56.113546 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-4642q"] Dec 02 07:40:56 crc kubenswrapper[4895]: I1202 07:40:56.114593 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-4642q" Dec 02 07:40:56 crc kubenswrapper[4895]: I1202 07:40:56.119064 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7cswj\" (UniqueName: \"kubernetes.io/projected/2a05da0d-5cc8-4656-8cf4-96b96077d708-kube-api-access-7cswj\") pod \"nova-operator-controller-manager-697bc559fc-4hpbt\" (UID: \"2a05da0d-5cc8-4656-8cf4-96b96077d708\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-4hpbt" Dec 02 07:40:56 crc kubenswrapper[4895]: I1202 07:40:56.119121 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdj77\" (UniqueName: \"kubernetes.io/projected/98c4ec72-ccff-439b-af96-53775411d965-kube-api-access-bdj77\") pod \"octavia-operator-controller-manager-998648c74-2mrjr\" (UID: \"98c4ec72-ccff-439b-af96-53775411d965\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-2mrjr" Dec 02 07:40:56 crc kubenswrapper[4895]: I1202 07:40:56.121587 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-qxbgv" Dec 02 07:40:56 crc kubenswrapper[4895]: I1202 07:40:56.136891 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4c299z"] Dec 02 07:40:56 crc kubenswrapper[4895]: I1202 07:40:56.138142 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4c299z" Dec 02 07:40:56 crc kubenswrapper[4895]: I1202 07:40:56.139101 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-z7c6c" Dec 02 07:40:56 crc kubenswrapper[4895]: I1202 07:40:56.139280 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-zl6rk" Dec 02 07:40:56 crc kubenswrapper[4895]: I1202 07:40:56.145665 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Dec 02 07:40:56 crc kubenswrapper[4895]: I1202 07:40:56.149697 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-ktxfs" Dec 02 07:40:56 crc kubenswrapper[4895]: I1202 07:40:56.196811 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-4642q"] Dec 02 07:40:56 crc kubenswrapper[4895]: I1202 07:40:56.196891 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-q6n97"] Dec 02 07:40:56 crc kubenswrapper[4895]: I1202 07:40:56.198318 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-78f8948974-q6n97" Dec 02 07:40:56 crc kubenswrapper[4895]: I1202 07:40:56.202034 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-95w9m" Dec 02 07:40:56 crc kubenswrapper[4895]: I1202 07:40:56.208972 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7cswj\" (UniqueName: \"kubernetes.io/projected/2a05da0d-5cc8-4656-8cf4-96b96077d708-kube-api-access-7cswj\") pod \"nova-operator-controller-manager-697bc559fc-4hpbt\" (UID: \"2a05da0d-5cc8-4656-8cf4-96b96077d708\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-4hpbt" Dec 02 07:40:56 crc kubenswrapper[4895]: I1202 07:40:56.220595 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bdj77\" (UniqueName: \"kubernetes.io/projected/98c4ec72-ccff-439b-af96-53775411d965-kube-api-access-bdj77\") pod \"octavia-operator-controller-manager-998648c74-2mrjr\" (UID: \"98c4ec72-ccff-439b-af96-53775411d965\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-2mrjr" Dec 02 07:40:56 crc kubenswrapper[4895]: I1202 07:40:56.220775 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbwz2\" (UniqueName: \"kubernetes.io/projected/f4639bb3-56b6-498a-bb7d-ab26b46fe806-kube-api-access-lbwz2\") pod \"ovn-operator-controller-manager-b6456fdb6-4642q\" (UID: \"f4639bb3-56b6-498a-bb7d-ab26b46fe806\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-4642q" Dec 02 07:40:56 crc kubenswrapper[4895]: I1202 07:40:56.220859 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b8dc2edd-3bab-4a5d-a994-ba2212e85045-cert\") pod 
\"openstack-baremetal-operator-controller-manager-64bc77cfd4c299z\" (UID: \"b8dc2edd-3bab-4a5d-a994-ba2212e85045\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4c299z" Dec 02 07:40:56 crc kubenswrapper[4895]: I1202 07:40:56.220952 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbm4x\" (UniqueName: \"kubernetes.io/projected/b8dc2edd-3bab-4a5d-a994-ba2212e85045-kube-api-access-rbm4x\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4c299z\" (UID: \"b8dc2edd-3bab-4a5d-a994-ba2212e85045\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4c299z" Dec 02 07:40:56 crc kubenswrapper[4895]: I1202 07:40:56.221034 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fx68\" (UniqueName: \"kubernetes.io/projected/991002a6-abdd-43e4-b22d-1d95383d3b96-kube-api-access-2fx68\") pod \"placement-operator-controller-manager-78f8948974-q6n97\" (UID: \"991002a6-abdd-43e4-b22d-1d95383d3b96\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-q6n97" Dec 02 07:40:56 crc kubenswrapper[4895]: I1202 07:40:56.238896 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4c299z"] Dec 02 07:40:56 crc kubenswrapper[4895]: I1202 07:40:56.246797 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-q6n97"] Dec 02 07:40:56 crc kubenswrapper[4895]: I1202 07:40:56.270188 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-rqm56"] Dec 02 07:40:56 crc kubenswrapper[4895]: I1202 07:40:56.277887 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-rqm56" Dec 02 07:40:56 crc kubenswrapper[4895]: I1202 07:40:56.279931 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-9zczx" Dec 02 07:40:56 crc kubenswrapper[4895]: I1202 07:40:56.285375 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-95xx7" Dec 02 07:40:56 crc kubenswrapper[4895]: I1202 07:40:56.288577 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdj77\" (UniqueName: \"kubernetes.io/projected/98c4ec72-ccff-439b-af96-53775411d965-kube-api-access-bdj77\") pod \"octavia-operator-controller-manager-998648c74-2mrjr\" (UID: \"98c4ec72-ccff-439b-af96-53775411d965\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-2mrjr" Dec 02 07:40:56 crc kubenswrapper[4895]: I1202 07:40:56.296654 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-hcnp5"] Dec 02 07:40:56 crc kubenswrapper[4895]: I1202 07:40:56.298153 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-hcnp5" Dec 02 07:40:56 crc kubenswrapper[4895]: I1202 07:40:56.314131 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-hznqs" Dec 02 07:40:56 crc kubenswrapper[4895]: I1202 07:40:56.321429 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-rqm56"] Dec 02 07:40:56 crc kubenswrapper[4895]: I1202 07:40:56.322097 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/86fe6ea0-2ba9-46f8-9a71-1b990d841e31-cert\") pod \"infra-operator-controller-manager-57548d458d-wmsb6\" (UID: \"86fe6ea0-2ba9-46f8-9a71-1b990d841e31\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-wmsb6" Dec 02 07:40:56 crc kubenswrapper[4895]: I1202 07:40:56.322137 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lbwz2\" (UniqueName: \"kubernetes.io/projected/f4639bb3-56b6-498a-bb7d-ab26b46fe806-kube-api-access-lbwz2\") pod \"ovn-operator-controller-manager-b6456fdb6-4642q\" (UID: \"f4639bb3-56b6-498a-bb7d-ab26b46fe806\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-4642q" Dec 02 07:40:56 crc kubenswrapper[4895]: I1202 07:40:56.322297 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b8dc2edd-3bab-4a5d-a994-ba2212e85045-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4c299z\" (UID: \"b8dc2edd-3bab-4a5d-a994-ba2212e85045\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4c299z" Dec 02 07:40:56 crc kubenswrapper[4895]: I1202 07:40:56.322350 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-5925r\" (UniqueName: \"kubernetes.io/projected/ad99804b-869f-4b42-89ab-d29341434b61-kube-api-access-5925r\") pod \"telemetry-operator-controller-manager-76cc84c6bb-hcnp5\" (UID: \"ad99804b-869f-4b42-89ab-d29341434b61\") " pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-hcnp5" Dec 02 07:40:56 crc kubenswrapper[4895]: I1202 07:40:56.322386 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4kc2c\" (UniqueName: \"kubernetes.io/projected/1df4be5d-9e12-4799-aaa8-1ec5bfa11a2c-kube-api-access-4kc2c\") pod \"swift-operator-controller-manager-5f8c65bbfc-rqm56\" (UID: \"1df4be5d-9e12-4799-aaa8-1ec5bfa11a2c\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-rqm56" Dec 02 07:40:56 crc kubenswrapper[4895]: I1202 07:40:56.322414 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbm4x\" (UniqueName: \"kubernetes.io/projected/b8dc2edd-3bab-4a5d-a994-ba2212e85045-kube-api-access-rbm4x\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4c299z\" (UID: \"b8dc2edd-3bab-4a5d-a994-ba2212e85045\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4c299z" Dec 02 07:40:56 crc kubenswrapper[4895]: I1202 07:40:56.322432 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2fx68\" (UniqueName: \"kubernetes.io/projected/991002a6-abdd-43e4-b22d-1d95383d3b96-kube-api-access-2fx68\") pod \"placement-operator-controller-manager-78f8948974-q6n97\" (UID: \"991002a6-abdd-43e4-b22d-1d95383d3b96\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-q6n97" Dec 02 07:40:56 crc kubenswrapper[4895]: E1202 07:40:56.323551 4895 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 02 07:40:56 crc kubenswrapper[4895]: 
E1202 07:40:56.323604 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86fe6ea0-2ba9-46f8-9a71-1b990d841e31-cert podName:86fe6ea0-2ba9-46f8-9a71-1b990d841e31 nodeName:}" failed. No retries permitted until 2025-12-02 07:40:57.323586844 +0000 UTC m=+1068.494446457 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/86fe6ea0-2ba9-46f8-9a71-1b990d841e31-cert") pod "infra-operator-controller-manager-57548d458d-wmsb6" (UID: "86fe6ea0-2ba9-46f8-9a71-1b990d841e31") : secret "infra-operator-webhook-server-cert" not found Dec 02 07:40:56 crc kubenswrapper[4895]: E1202 07:40:56.323909 4895 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 02 07:40:56 crc kubenswrapper[4895]: E1202 07:40:56.323939 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b8dc2edd-3bab-4a5d-a994-ba2212e85045-cert podName:b8dc2edd-3bab-4a5d-a994-ba2212e85045 nodeName:}" failed. No retries permitted until 2025-12-02 07:40:56.823931465 +0000 UTC m=+1067.994791078 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b8dc2edd-3bab-4a5d-a994-ba2212e85045-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4c299z" (UID: "b8dc2edd-3bab-4a5d-a994-ba2212e85045") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 02 07:40:56 crc kubenswrapper[4895]: I1202 07:40:56.335812 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-6x68d"] Dec 02 07:40:56 crc kubenswrapper[4895]: I1202 07:40:56.337032 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5854674fcc-6x68d" Dec 02 07:40:56 crc kubenswrapper[4895]: I1202 07:40:56.337813 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-4hpbt" Dec 02 07:40:56 crc kubenswrapper[4895]: I1202 07:40:56.345124 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-lqdpl" Dec 02 07:40:56 crc kubenswrapper[4895]: I1202 07:40:56.360681 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-hcnp5"] Dec 02 07:40:56 crc kubenswrapper[4895]: I1202 07:40:56.373843 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbwz2\" (UniqueName: \"kubernetes.io/projected/f4639bb3-56b6-498a-bb7d-ab26b46fe806-kube-api-access-lbwz2\") pod \"ovn-operator-controller-manager-b6456fdb6-4642q\" (UID: \"f4639bb3-56b6-498a-bb7d-ab26b46fe806\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-4642q" Dec 02 07:40:56 crc kubenswrapper[4895]: I1202 07:40:56.384599 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-6x68d"] Dec 02 07:40:56 crc kubenswrapper[4895]: I1202 07:40:56.403019 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-998648c74-2mrjr" Dec 02 07:40:56 crc kubenswrapper[4895]: I1202 07:40:56.417228 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbm4x\" (UniqueName: \"kubernetes.io/projected/b8dc2edd-3bab-4a5d-a994-ba2212e85045-kube-api-access-rbm4x\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4c299z\" (UID: \"b8dc2edd-3bab-4a5d-a994-ba2212e85045\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4c299z" Dec 02 07:40:56 crc kubenswrapper[4895]: I1202 07:40:56.418242 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-769dc69bc-d5lnh"] Dec 02 07:40:56 crc kubenswrapper[4895]: I1202 07:40:56.421104 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2fx68\" (UniqueName: \"kubernetes.io/projected/991002a6-abdd-43e4-b22d-1d95383d3b96-kube-api-access-2fx68\") pod \"placement-operator-controller-manager-78f8948974-q6n97\" (UID: \"991002a6-abdd-43e4-b22d-1d95383d3b96\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-q6n97" Dec 02 07:40:56 crc kubenswrapper[4895]: I1202 07:40:56.425220 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5925r\" (UniqueName: \"kubernetes.io/projected/ad99804b-869f-4b42-89ab-d29341434b61-kube-api-access-5925r\") pod \"telemetry-operator-controller-manager-76cc84c6bb-hcnp5\" (UID: \"ad99804b-869f-4b42-89ab-d29341434b61\") " pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-hcnp5" Dec 02 07:40:56 crc kubenswrapper[4895]: I1202 07:40:56.425269 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfbf4\" (UniqueName: \"kubernetes.io/projected/a5a01f83-cddf-479d-b6d0-7944d70c0bdd-kube-api-access-pfbf4\") pod 
\"test-operator-controller-manager-5854674fcc-6x68d\" (UID: \"a5a01f83-cddf-479d-b6d0-7944d70c0bdd\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-6x68d" Dec 02 07:40:56 crc kubenswrapper[4895]: I1202 07:40:56.425295 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4kc2c\" (UniqueName: \"kubernetes.io/projected/1df4be5d-9e12-4799-aaa8-1ec5bfa11a2c-kube-api-access-4kc2c\") pod \"swift-operator-controller-manager-5f8c65bbfc-rqm56\" (UID: \"1df4be5d-9e12-4799-aaa8-1ec5bfa11a2c\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-rqm56" Dec 02 07:40:56 crc kubenswrapper[4895]: I1202 07:40:56.471984 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-769dc69bc-d5lnh"] Dec 02 07:40:56 crc kubenswrapper[4895]: I1202 07:40:56.472107 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-d5lnh" Dec 02 07:40:56 crc kubenswrapper[4895]: I1202 07:40:56.478697 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-vq85j" Dec 02 07:40:56 crc kubenswrapper[4895]: I1202 07:40:56.510416 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-4642q" Dec 02 07:40:56 crc kubenswrapper[4895]: I1202 07:40:56.585410 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-78f8948974-q6n97" Dec 02 07:40:56 crc kubenswrapper[4895]: I1202 07:40:56.586517 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfbf4\" (UniqueName: \"kubernetes.io/projected/a5a01f83-cddf-479d-b6d0-7944d70c0bdd-kube-api-access-pfbf4\") pod \"test-operator-controller-manager-5854674fcc-6x68d\" (UID: \"a5a01f83-cddf-479d-b6d0-7944d70c0bdd\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-6x68d" Dec 02 07:40:56 crc kubenswrapper[4895]: I1202 07:40:56.616985 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6979866f9f-j56w9"] Dec 02 07:40:56 crc kubenswrapper[4895]: I1202 07:40:56.618274 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-6979866f9f-j56w9" Dec 02 07:40:56 crc kubenswrapper[4895]: I1202 07:40:56.642776 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6979866f9f-j56w9"] Dec 02 07:40:56 crc kubenswrapper[4895]: I1202 07:40:56.667483 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-bhzzl"] Dec 02 07:40:56 crc kubenswrapper[4895]: I1202 07:40:56.669443 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-bhzzl" Dec 02 07:40:56 crc kubenswrapper[4895]: I1202 07:40:56.681913 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4kc2c\" (UniqueName: \"kubernetes.io/projected/1df4be5d-9e12-4799-aaa8-1ec5bfa11a2c-kube-api-access-4kc2c\") pod \"swift-operator-controller-manager-5f8c65bbfc-rqm56\" (UID: \"1df4be5d-9e12-4799-aaa8-1ec5bfa11a2c\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-rqm56" Dec 02 07:40:56 crc kubenswrapper[4895]: I1202 07:40:56.684100 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-72czc" Dec 02 07:40:56 crc kubenswrapper[4895]: I1202 07:40:56.684316 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-nsrcp" Dec 02 07:40:56 crc kubenswrapper[4895]: I1202 07:40:56.684431 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Dec 02 07:40:56 crc kubenswrapper[4895]: I1202 07:40:56.688094 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Dec 02 07:40:56 crc kubenswrapper[4895]: I1202 07:40:56.702085 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hljdc\" (UniqueName: \"kubernetes.io/projected/a2cb057c-0f4a-4220-8666-3ccab3458be2-kube-api-access-hljdc\") pod \"watcher-operator-controller-manager-769dc69bc-d5lnh\" (UID: \"a2cb057c-0f4a-4220-8666-3ccab3458be2\") " pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-d5lnh" Dec 02 07:40:56 crc kubenswrapper[4895]: I1202 07:40:56.706798 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-bhzzl"] Dec 
02 07:40:56 crc kubenswrapper[4895]: I1202 07:40:56.706870 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5925r\" (UniqueName: \"kubernetes.io/projected/ad99804b-869f-4b42-89ab-d29341434b61-kube-api-access-5925r\") pod \"telemetry-operator-controller-manager-76cc84c6bb-hcnp5\" (UID: \"ad99804b-869f-4b42-89ab-d29341434b61\") " pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-hcnp5" Dec 02 07:40:56 crc kubenswrapper[4895]: I1202 07:40:56.756848 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfbf4\" (UniqueName: \"kubernetes.io/projected/a5a01f83-cddf-479d-b6d0-7944d70c0bdd-kube-api-access-pfbf4\") pod \"test-operator-controller-manager-5854674fcc-6x68d\" (UID: \"a5a01f83-cddf-479d-b6d0-7944d70c0bdd\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-6x68d" Dec 02 07:40:56 crc kubenswrapper[4895]: I1202 07:40:56.804095 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/932b16cb-babd-4cd7-902c-03cd223e98bc-metrics-certs\") pod \"openstack-operator-controller-manager-6979866f9f-j56w9\" (UID: \"932b16cb-babd-4cd7-902c-03cd223e98bc\") " pod="openstack-operators/openstack-operator-controller-manager-6979866f9f-j56w9" Dec 02 07:40:56 crc kubenswrapper[4895]: I1202 07:40:56.804184 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hljdc\" (UniqueName: \"kubernetes.io/projected/a2cb057c-0f4a-4220-8666-3ccab3458be2-kube-api-access-hljdc\") pod \"watcher-operator-controller-manager-769dc69bc-d5lnh\" (UID: \"a2cb057c-0f4a-4220-8666-3ccab3458be2\") " pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-d5lnh" Dec 02 07:40:56 crc kubenswrapper[4895]: I1202 07:40:56.804248 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-l28hv\" (UniqueName: \"kubernetes.io/projected/82a2bf22-3682-4982-b4fc-87ac78873cce-kube-api-access-l28hv\") pod \"rabbitmq-cluster-operator-manager-668c99d594-bhzzl\" (UID: \"82a2bf22-3682-4982-b4fc-87ac78873cce\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-bhzzl" Dec 02 07:40:56 crc kubenswrapper[4895]: I1202 07:40:56.804287 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjp26\" (UniqueName: \"kubernetes.io/projected/932b16cb-babd-4cd7-902c-03cd223e98bc-kube-api-access-bjp26\") pod \"openstack-operator-controller-manager-6979866f9f-j56w9\" (UID: \"932b16cb-babd-4cd7-902c-03cd223e98bc\") " pod="openstack-operators/openstack-operator-controller-manager-6979866f9f-j56w9" Dec 02 07:40:56 crc kubenswrapper[4895]: I1202 07:40:56.804310 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/932b16cb-babd-4cd7-902c-03cd223e98bc-webhook-certs\") pod \"openstack-operator-controller-manager-6979866f9f-j56w9\" (UID: \"932b16cb-babd-4cd7-902c-03cd223e98bc\") " pod="openstack-operators/openstack-operator-controller-manager-6979866f9f-j56w9" Dec 02 07:40:56 crc kubenswrapper[4895]: I1202 07:40:56.822806 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hljdc\" (UniqueName: \"kubernetes.io/projected/a2cb057c-0f4a-4220-8666-3ccab3458be2-kube-api-access-hljdc\") pod \"watcher-operator-controller-manager-769dc69bc-d5lnh\" (UID: \"a2cb057c-0f4a-4220-8666-3ccab3458be2\") " pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-d5lnh" Dec 02 07:40:56 crc kubenswrapper[4895]: I1202 07:40:56.905757 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b8dc2edd-3bab-4a5d-a994-ba2212e85045-cert\") pod 
\"openstack-baremetal-operator-controller-manager-64bc77cfd4c299z\" (UID: \"b8dc2edd-3bab-4a5d-a994-ba2212e85045\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4c299z" Dec 02 07:40:56 crc kubenswrapper[4895]: I1202 07:40:56.905830 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/932b16cb-babd-4cd7-902c-03cd223e98bc-metrics-certs\") pod \"openstack-operator-controller-manager-6979866f9f-j56w9\" (UID: \"932b16cb-babd-4cd7-902c-03cd223e98bc\") " pod="openstack-operators/openstack-operator-controller-manager-6979866f9f-j56w9" Dec 02 07:40:56 crc kubenswrapper[4895]: I1202 07:40:56.905903 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l28hv\" (UniqueName: \"kubernetes.io/projected/82a2bf22-3682-4982-b4fc-87ac78873cce-kube-api-access-l28hv\") pod \"rabbitmq-cluster-operator-manager-668c99d594-bhzzl\" (UID: \"82a2bf22-3682-4982-b4fc-87ac78873cce\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-bhzzl" Dec 02 07:40:56 crc kubenswrapper[4895]: I1202 07:40:56.905950 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bjp26\" (UniqueName: \"kubernetes.io/projected/932b16cb-babd-4cd7-902c-03cd223e98bc-kube-api-access-bjp26\") pod \"openstack-operator-controller-manager-6979866f9f-j56w9\" (UID: \"932b16cb-babd-4cd7-902c-03cd223e98bc\") " pod="openstack-operators/openstack-operator-controller-manager-6979866f9f-j56w9" Dec 02 07:40:56 crc kubenswrapper[4895]: I1202 07:40:56.905981 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/932b16cb-babd-4cd7-902c-03cd223e98bc-webhook-certs\") pod \"openstack-operator-controller-manager-6979866f9f-j56w9\" (UID: \"932b16cb-babd-4cd7-902c-03cd223e98bc\") " 
pod="openstack-operators/openstack-operator-controller-manager-6979866f9f-j56w9" Dec 02 07:40:56 crc kubenswrapper[4895]: E1202 07:40:56.906130 4895 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 02 07:40:56 crc kubenswrapper[4895]: E1202 07:40:56.906210 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/932b16cb-babd-4cd7-902c-03cd223e98bc-webhook-certs podName:932b16cb-babd-4cd7-902c-03cd223e98bc nodeName:}" failed. No retries permitted until 2025-12-02 07:40:57.406183915 +0000 UTC m=+1068.577043528 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/932b16cb-babd-4cd7-902c-03cd223e98bc-webhook-certs") pod "openstack-operator-controller-manager-6979866f9f-j56w9" (UID: "932b16cb-babd-4cd7-902c-03cd223e98bc") : secret "webhook-server-cert" not found Dec 02 07:40:56 crc kubenswrapper[4895]: E1202 07:40:56.907038 4895 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 02 07:40:56 crc kubenswrapper[4895]: E1202 07:40:56.907069 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b8dc2edd-3bab-4a5d-a994-ba2212e85045-cert podName:b8dc2edd-3bab-4a5d-a994-ba2212e85045 nodeName:}" failed. No retries permitted until 2025-12-02 07:40:57.907059842 +0000 UTC m=+1069.077919455 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b8dc2edd-3bab-4a5d-a994-ba2212e85045-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4c299z" (UID: "b8dc2edd-3bab-4a5d-a994-ba2212e85045") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 02 07:40:56 crc kubenswrapper[4895]: E1202 07:40:56.907126 4895 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 02 07:40:56 crc kubenswrapper[4895]: E1202 07:40:56.907161 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/932b16cb-babd-4cd7-902c-03cd223e98bc-metrics-certs podName:932b16cb-babd-4cd7-902c-03cd223e98bc nodeName:}" failed. No retries permitted until 2025-12-02 07:40:57.407150565 +0000 UTC m=+1068.578010178 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/932b16cb-babd-4cd7-902c-03cd223e98bc-metrics-certs") pod "openstack-operator-controller-manager-6979866f9f-j56w9" (UID: "932b16cb-babd-4cd7-902c-03cd223e98bc") : secret "metrics-server-cert" not found Dec 02 07:40:56 crc kubenswrapper[4895]: I1202 07:40:56.927479 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-d5lnh" Dec 02 07:40:56 crc kubenswrapper[4895]: I1202 07:40:56.939953 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-rqm56" Dec 02 07:40:56 crc kubenswrapper[4895]: I1202 07:40:56.973636 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l28hv\" (UniqueName: \"kubernetes.io/projected/82a2bf22-3682-4982-b4fc-87ac78873cce-kube-api-access-l28hv\") pod \"rabbitmq-cluster-operator-manager-668c99d594-bhzzl\" (UID: \"82a2bf22-3682-4982-b4fc-87ac78873cce\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-bhzzl" Dec 02 07:40:56 crc kubenswrapper[4895]: I1202 07:40:56.974034 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-hcnp5" Dec 02 07:40:56 crc kubenswrapper[4895]: I1202 07:40:56.978264 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjp26\" (UniqueName: \"kubernetes.io/projected/932b16cb-babd-4cd7-902c-03cd223e98bc-kube-api-access-bjp26\") pod \"openstack-operator-controller-manager-6979866f9f-j56w9\" (UID: \"932b16cb-babd-4cd7-902c-03cd223e98bc\") " pod="openstack-operators/openstack-operator-controller-manager-6979866f9f-j56w9" Dec 02 07:40:57 crc kubenswrapper[4895]: I1202 07:40:57.029293 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5854674fcc-6x68d" Dec 02 07:40:57 crc kubenswrapper[4895]: I1202 07:40:57.275635 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-bhzzl" Dec 02 07:40:57 crc kubenswrapper[4895]: I1202 07:40:57.340521 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/86fe6ea0-2ba9-46f8-9a71-1b990d841e31-cert\") pod \"infra-operator-controller-manager-57548d458d-wmsb6\" (UID: \"86fe6ea0-2ba9-46f8-9a71-1b990d841e31\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-wmsb6" Dec 02 07:40:57 crc kubenswrapper[4895]: E1202 07:40:57.340835 4895 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 02 07:40:57 crc kubenswrapper[4895]: E1202 07:40:57.340898 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86fe6ea0-2ba9-46f8-9a71-1b990d841e31-cert podName:86fe6ea0-2ba9-46f8-9a71-1b990d841e31 nodeName:}" failed. No retries permitted until 2025-12-02 07:40:59.340880821 +0000 UTC m=+1070.511740434 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/86fe6ea0-2ba9-46f8-9a71-1b990d841e31-cert") pod "infra-operator-controller-manager-57548d458d-wmsb6" (UID: "86fe6ea0-2ba9-46f8-9a71-1b990d841e31") : secret "infra-operator-webhook-server-cert" not found Dec 02 07:40:57 crc kubenswrapper[4895]: I1202 07:40:57.444279 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/932b16cb-babd-4cd7-902c-03cd223e98bc-webhook-certs\") pod \"openstack-operator-controller-manager-6979866f9f-j56w9\" (UID: \"932b16cb-babd-4cd7-902c-03cd223e98bc\") " pod="openstack-operators/openstack-operator-controller-manager-6979866f9f-j56w9" Dec 02 07:40:57 crc kubenswrapper[4895]: I1202 07:40:57.444387 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/932b16cb-babd-4cd7-902c-03cd223e98bc-metrics-certs\") pod \"openstack-operator-controller-manager-6979866f9f-j56w9\" (UID: \"932b16cb-babd-4cd7-902c-03cd223e98bc\") " pod="openstack-operators/openstack-operator-controller-manager-6979866f9f-j56w9" Dec 02 07:40:57 crc kubenswrapper[4895]: E1202 07:40:57.444631 4895 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 02 07:40:57 crc kubenswrapper[4895]: E1202 07:40:57.444710 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/932b16cb-babd-4cd7-902c-03cd223e98bc-metrics-certs podName:932b16cb-babd-4cd7-902c-03cd223e98bc nodeName:}" failed. No retries permitted until 2025-12-02 07:40:58.444690662 +0000 UTC m=+1069.615550285 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/932b16cb-babd-4cd7-902c-03cd223e98bc-metrics-certs") pod "openstack-operator-controller-manager-6979866f9f-j56w9" (UID: "932b16cb-babd-4cd7-902c-03cd223e98bc") : secret "metrics-server-cert" not found Dec 02 07:40:57 crc kubenswrapper[4895]: E1202 07:40:57.445240 4895 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 02 07:40:57 crc kubenswrapper[4895]: E1202 07:40:57.445286 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/932b16cb-babd-4cd7-902c-03cd223e98bc-webhook-certs podName:932b16cb-babd-4cd7-902c-03cd223e98bc nodeName:}" failed. No retries permitted until 2025-12-02 07:40:58.44527537 +0000 UTC m=+1069.616134993 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/932b16cb-babd-4cd7-902c-03cd223e98bc-webhook-certs") pod "openstack-operator-controller-manager-6979866f9f-j56w9" (UID: "932b16cb-babd-4cd7-902c-03cd223e98bc") : secret "webhook-server-cert" not found Dec 02 07:40:57 crc kubenswrapper[4895]: I1202 07:40:57.960943 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b8dc2edd-3bab-4a5d-a994-ba2212e85045-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4c299z\" (UID: \"b8dc2edd-3bab-4a5d-a994-ba2212e85045\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4c299z" Dec 02 07:40:57 crc kubenswrapper[4895]: E1202 07:40:57.961294 4895 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 02 07:40:57 crc kubenswrapper[4895]: E1202 07:40:57.961359 4895 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/b8dc2edd-3bab-4a5d-a994-ba2212e85045-cert podName:b8dc2edd-3bab-4a5d-a994-ba2212e85045 nodeName:}" failed. No retries permitted until 2025-12-02 07:40:59.961338802 +0000 UTC m=+1071.132198415 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b8dc2edd-3bab-4a5d-a994-ba2212e85045-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4c299z" (UID: "b8dc2edd-3bab-4a5d-a994-ba2212e85045") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 02 07:40:58 crc kubenswrapper[4895]: I1202 07:40:58.474646 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/932b16cb-babd-4cd7-902c-03cd223e98bc-webhook-certs\") pod \"openstack-operator-controller-manager-6979866f9f-j56w9\" (UID: \"932b16cb-babd-4cd7-902c-03cd223e98bc\") " pod="openstack-operators/openstack-operator-controller-manager-6979866f9f-j56w9" Dec 02 07:40:58 crc kubenswrapper[4895]: I1202 07:40:58.475030 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/932b16cb-babd-4cd7-902c-03cd223e98bc-metrics-certs\") pod \"openstack-operator-controller-manager-6979866f9f-j56w9\" (UID: \"932b16cb-babd-4cd7-902c-03cd223e98bc\") " pod="openstack-operators/openstack-operator-controller-manager-6979866f9f-j56w9" Dec 02 07:40:58 crc kubenswrapper[4895]: E1202 07:40:58.475641 4895 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 02 07:40:58 crc kubenswrapper[4895]: E1202 07:40:58.475701 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/932b16cb-babd-4cd7-902c-03cd223e98bc-webhook-certs podName:932b16cb-babd-4cd7-902c-03cd223e98bc nodeName:}" failed. 
No retries permitted until 2025-12-02 07:41:00.475680652 +0000 UTC m=+1071.646540265 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/932b16cb-babd-4cd7-902c-03cd223e98bc-webhook-certs") pod "openstack-operator-controller-manager-6979866f9f-j56w9" (UID: "932b16cb-babd-4cd7-902c-03cd223e98bc") : secret "webhook-server-cert" not found Dec 02 07:40:58 crc kubenswrapper[4895]: E1202 07:40:58.476055 4895 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 02 07:40:58 crc kubenswrapper[4895]: E1202 07:40:58.476088 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/932b16cb-babd-4cd7-902c-03cd223e98bc-metrics-certs podName:932b16cb-babd-4cd7-902c-03cd223e98bc nodeName:}" failed. No retries permitted until 2025-12-02 07:41:00.476080884 +0000 UTC m=+1071.646940497 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/932b16cb-babd-4cd7-902c-03cd223e98bc-metrics-certs") pod "openstack-operator-controller-manager-6979866f9f-j56w9" (UID: "932b16cb-babd-4cd7-902c-03cd223e98bc") : secret "metrics-server-cert" not found Dec 02 07:40:58 crc kubenswrapper[4895]: I1202 07:40:58.583125 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-mgtj6"] Dec 02 07:40:58 crc kubenswrapper[4895]: I1202 07:40:58.600146 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-546d4bdf48-z6fb4"] Dec 02 07:40:58 crc kubenswrapper[4895]: I1202 07:40:58.635996 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-w52df"] Dec 02 07:40:58 crc kubenswrapper[4895]: W1202 07:40:58.645050 4895 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod73f50459_103c_461c_a71a_95e93d66c4c2.slice/crio-1389c2d9e76f49e583d190f1550a63d5a3c1cd8206ea18d8c64ce29cf87bcedd WatchSource:0}: Error finding container 1389c2d9e76f49e583d190f1550a63d5a3c1cd8206ea18d8c64ce29cf87bcedd: Status 404 returned error can't find the container with id 1389c2d9e76f49e583d190f1550a63d5a3c1cd8206ea18d8c64ce29cf87bcedd Dec 02 07:40:58 crc kubenswrapper[4895]: I1202 07:40:58.661482 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-sbrp2"] Dec 02 07:40:58 crc kubenswrapper[4895]: I1202 07:40:58.696633 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-6546668bfd-5xsjx"] Dec 02 07:40:59 crc kubenswrapper[4895]: I1202 07:40:59.011376 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-cp9zj"] Dec 02 07:40:59 crc kubenswrapper[4895]: I1202 07:40:59.020276 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-m8cp2"] Dec 02 07:40:59 crc kubenswrapper[4895]: I1202 07:40:59.027231 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-4hpbt"] Dec 02 07:40:59 crc kubenswrapper[4895]: I1202 07:40:59.033042 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-668d9c48b9-g8gjq"] Dec 02 07:40:59 crc kubenswrapper[4895]: I1202 07:40:59.068175 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-2mrjr"] Dec 02 07:40:59 crc kubenswrapper[4895]: I1202 07:40:59.074461 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-9zczx"] Dec 02 
07:40:59 crc kubenswrapper[4895]: I1202 07:40:59.094861 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-qxbgv"] Dec 02 07:40:59 crc kubenswrapper[4895]: E1202 07:40:59.162952 4895 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:9aa8c03633e4b934c57868c1660acf47e7d386ac86bcb344df262c9ad76b8621,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hljdc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-769dc69bc-d5lnh_openstack-operators(a2cb057c-0f4a-4220-8666-3ccab3458be2): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 02 07:40:59 crc kubenswrapper[4895]: E1202 07:40:59.165685 4895 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hljdc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-769dc69bc-d5lnh_openstack-operators(a2cb057c-0f4a-4220-8666-3ccab3458be2): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 02 07:40:59 crc kubenswrapper[4895]: E1202 07:40:59.166876 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-d5lnh" podUID="a2cb057c-0f4a-4220-8666-3ccab3458be2" Dec 02 07:40:59 crc kubenswrapper[4895]: W1202 07:40:59.167103 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4639bb3_56b6_498a_bb7d_ab26b46fe806.slice/crio-6e5b7a61e8a0585022b9a74f3faa2fe459e34543e3d32849618402cd5ce48680 WatchSource:0}: Error finding container 6e5b7a61e8a0585022b9a74f3faa2fe459e34543e3d32849618402cd5ce48680: Status 404 returned error can't find the container with id 
6e5b7a61e8a0585022b9a74f3faa2fe459e34543e3d32849618402cd5ce48680 Dec 02 07:40:59 crc kubenswrapper[4895]: W1202 07:40:59.170881 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podad99804b_869f_4b42_89ab_d29341434b61.slice/crio-86a6484b936c0c7f3e1080433eca6cc378228cb79e5a31815b12da1aa52a38c9 WatchSource:0}: Error finding container 86a6484b936c0c7f3e1080433eca6cc378228cb79e5a31815b12da1aa52a38c9: Status 404 returned error can't find the container with id 86a6484b936c0c7f3e1080433eca6cc378228cb79e5a31815b12da1aa52a38c9 Dec 02 07:40:59 crc kubenswrapper[4895]: E1202 07:40:59.179416 4895 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-pfbf4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5854674fcc-6x68d_openstack-operators(a5a01f83-cddf-479d-b6d0-7944d70c0bdd): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 02 07:40:59 crc kubenswrapper[4895]: E1202 07:40:59.185165 4895 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-lbwz2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-b6456fdb6-4642q_openstack-operators(f4639bb3-56b6-498a-bb7d-ab26b46fe806): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 02 07:40:59 crc kubenswrapper[4895]: E1202 07:40:59.185710 4895 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-pfbf4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5854674fcc-6x68d_openstack-operators(a5a01f83-cddf-479d-b6d0-7944d70c0bdd): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 02 07:40:59 crc kubenswrapper[4895]: E1202 07:40:59.186845 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/test-operator-controller-manager-5854674fcc-6x68d" podUID="a5a01f83-cddf-479d-b6d0-7944d70c0bdd" Dec 02 07:40:59 crc kubenswrapper[4895]: E1202 07:40:59.187924 4895 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:2a3d21728a8bfb4e64617e63e61e2d1cb70a383ea3e8f846e0c3c3c02d2b0a9d,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4kc2c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-5f8c65bbfc-rqm56_openstack-operators(1df4be5d-9e12-4799-aaa8-1ec5bfa11a2c): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 02 07:40:59 crc kubenswrapper[4895]: E1202 07:40:59.188573 4895 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-lbwz2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-b6456fdb6-4642q_openstack-operators(f4639bb3-56b6-498a-bb7d-ab26b46fe806): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 02 07:40:59 crc kubenswrapper[4895]: E1202 07:40:59.188696 4895 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-l28hv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-bhzzl_openstack-operators(82a2bf22-3682-4982-b4fc-87ac78873cce): 
ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 02 07:40:59 crc kubenswrapper[4895]: I1202 07:40:59.189662 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-74cpb"] Dec 02 07:40:59 crc kubenswrapper[4895]: I1202 07:40:59.189699 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-4642q"] Dec 02 07:40:59 crc kubenswrapper[4895]: I1202 07:40:59.189713 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-769dc69bc-d5lnh"] Dec 02 07:40:59 crc kubenswrapper[4895]: E1202 07:40:59.189780 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-bhzzl" podUID="82a2bf22-3682-4982-b4fc-87ac78873cce" Dec 02 07:40:59 crc kubenswrapper[4895]: E1202 07:40:59.189824 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-4642q" podUID="f4639bb3-56b6-498a-bb7d-ab26b46fe806" Dec 02 07:40:59 crc kubenswrapper[4895]: E1202 07:40:59.190660 4895 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} 
BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4kc2c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-5f8c65bbfc-rqm56_openstack-operators(1df4be5d-9e12-4799-aaa8-1ec5bfa11a2c): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 02 07:40:59 crc kubenswrapper[4895]: E1202 07:40:59.192160 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-rqm56" podUID="1df4be5d-9e12-4799-aaa8-1ec5bfa11a2c" Dec 02 07:40:59 crc kubenswrapper[4895]: W1202 07:40:59.195657 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod991002a6_abdd_43e4_b22d_1d95383d3b96.slice/crio-c05f5e7c6018e9998c2e6af27745ff4541ffb9597564c659f1524dc89a82fe49 WatchSource:0}: Error finding container c05f5e7c6018e9998c2e6af27745ff4541ffb9597564c659f1524dc89a82fe49: Status 
404 returned error can't find the container with id c05f5e7c6018e9998c2e6af27745ff4541ffb9597564c659f1524dc89a82fe49 Dec 02 07:40:59 crc kubenswrapper[4895]: I1202 07:40:59.197280 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-6x68d"] Dec 02 07:40:59 crc kubenswrapper[4895]: E1202 07:40:59.205385 4895 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:7d66757c0af67104f0389e851a7cc0daa44443ad202d157417bd86bbb57cc385,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-5925r,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-76cc84c6bb-hcnp5_openstack-operators(ad99804b-869f-4b42-89ab-d29341434b61): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 02 07:40:59 crc kubenswrapper[4895]: E1202 07:40:59.209277 4895 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:d29650b006da97eb9178fcc58f2eb9fead8c2b414fac18f86a3c3a1507488c4f,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2fx68,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-78f8948974-q6n97_openstack-operators(991002a6-abdd-43e4-b22d-1d95383d3b96): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 02 07:40:59 crc kubenswrapper[4895]: E1202 07:40:59.209373 4895 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true 
--v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-5925r,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-76cc84c6bb-hcnp5_openstack-operators(ad99804b-869f-4b42-89ab-d29341434b61): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 02 07:40:59 crc kubenswrapper[4895]: E1202 07:40:59.211262 4895 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} 
BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2fx68,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-78f8948974-q6n97_openstack-operators(991002a6-abdd-43e4-b22d-1d95383d3b96): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 02 07:40:59 crc kubenswrapper[4895]: E1202 07:40:59.211675 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-hcnp5" podUID="ad99804b-869f-4b42-89ab-d29341434b61" Dec 02 07:40:59 crc kubenswrapper[4895]: E1202 07:40:59.212384 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/placement-operator-controller-manager-78f8948974-q6n97" 
podUID="991002a6-abdd-43e4-b22d-1d95383d3b96" Dec 02 07:40:59 crc kubenswrapper[4895]: I1202 07:40:59.220211 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-hcnp5"] Dec 02 07:40:59 crc kubenswrapper[4895]: I1202 07:40:59.226888 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-q6n97"] Dec 02 07:40:59 crc kubenswrapper[4895]: I1202 07:40:59.243057 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-rqm56"] Dec 02 07:40:59 crc kubenswrapper[4895]: I1202 07:40:59.250089 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-bhzzl"] Dec 02 07:40:59 crc kubenswrapper[4895]: I1202 07:40:59.403092 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/86fe6ea0-2ba9-46f8-9a71-1b990d841e31-cert\") pod \"infra-operator-controller-manager-57548d458d-wmsb6\" (UID: \"86fe6ea0-2ba9-46f8-9a71-1b990d841e31\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-wmsb6" Dec 02 07:40:59 crc kubenswrapper[4895]: E1202 07:40:59.403372 4895 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 02 07:40:59 crc kubenswrapper[4895]: E1202 07:40:59.403459 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86fe6ea0-2ba9-46f8-9a71-1b990d841e31-cert podName:86fe6ea0-2ba9-46f8-9a71-1b990d841e31 nodeName:}" failed. No retries permitted until 2025-12-02 07:41:03.403437038 +0000 UTC m=+1074.574296651 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/86fe6ea0-2ba9-46f8-9a71-1b990d841e31-cert") pod "infra-operator-controller-manager-57548d458d-wmsb6" (UID: "86fe6ea0-2ba9-46f8-9a71-1b990d841e31") : secret "infra-operator-webhook-server-cert" not found Dec 02 07:40:59 crc kubenswrapper[4895]: I1202 07:40:59.452963 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-bhzzl" event={"ID":"82a2bf22-3682-4982-b4fc-87ac78873cce","Type":"ContainerStarted","Data":"6cefc90893b1fd4b28e187f410e2eaae29c316f699ced53ab59c1645430c7c4c"} Dec 02 07:40:59 crc kubenswrapper[4895]: E1202 07:40:59.455880 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-bhzzl" podUID="82a2bf22-3682-4982-b4fc-87ac78873cce" Dec 02 07:40:59 crc kubenswrapper[4895]: I1202 07:40:59.458129 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-74cpb" event={"ID":"582c057b-7217-47bf-b2d7-f691861668c3","Type":"ContainerStarted","Data":"02b1e2e9ddee749f27c68e792081bd08f23779c798b0ab568c2a37ec8d0fa6b0"} Dec 02 07:40:59 crc kubenswrapper[4895]: I1202 07:40:59.462116 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-qxbgv" event={"ID":"392adf20-0169-4258-9f4f-bb293bd5f8e8","Type":"ContainerStarted","Data":"734eb10c586f48bacb2d55c6d62588a6061c07b3e42df21a6b566c4ff5440855"} Dec 02 07:40:59 crc kubenswrapper[4895]: I1202 07:40:59.467677 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-4642q" event={"ID":"f4639bb3-56b6-498a-bb7d-ab26b46fe806","Type":"ContainerStarted","Data":"6e5b7a61e8a0585022b9a74f3faa2fe459e34543e3d32849618402cd5ce48680"} Dec 02 07:40:59 crc kubenswrapper[4895]: I1202 07:40:59.469843 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-9zczx" event={"ID":"ca1b5423-1f2c-4b12-9ae9-f65bb5301c51","Type":"ContainerStarted","Data":"090d11b0eaec080bf161c16c57ef1bb253b05e1674726e9fffa4d2398fb1df72"} Dec 02 07:40:59 crc kubenswrapper[4895]: E1202 07:40:59.471401 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-4642q" podUID="f4639bb3-56b6-498a-bb7d-ab26b46fe806" Dec 02 07:40:59 crc kubenswrapper[4895]: I1202 07:40:59.472450 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-hcnp5" event={"ID":"ad99804b-869f-4b42-89ab-d29341434b61","Type":"ContainerStarted","Data":"86a6484b936c0c7f3e1080433eca6cc378228cb79e5a31815b12da1aa52a38c9"} Dec 02 07:40:59 crc kubenswrapper[4895]: E1202 07:40:59.477244 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:7d66757c0af67104f0389e851a7cc0daa44443ad202d157417bd86bbb57cc385\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off 
pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-hcnp5" podUID="ad99804b-869f-4b42-89ab-d29341434b61" Dec 02 07:40:59 crc kubenswrapper[4895]: I1202 07:40:59.477645 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-mgtj6" event={"ID":"442a4a4d-98fb-4869-9418-7f8f3ff4644b","Type":"ContainerStarted","Data":"c90a97764c5f35894d4141dda970cb795397568b8c70007fb63ccb8b96d0701b"} Dec 02 07:40:59 crc kubenswrapper[4895]: I1202 07:40:59.479439 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-6x68d" event={"ID":"a5a01f83-cddf-479d-b6d0-7944d70c0bdd","Type":"ContainerStarted","Data":"25e57952b0161016e01839daba3377fc0d79a2591423e848fb96b97bee6d6a66"} Dec 02 07:40:59 crc kubenswrapper[4895]: I1202 07:40:59.482548 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-m8cp2" event={"ID":"f6776d5f-3c3e-48b5-a6fd-30ff153345c2","Type":"ContainerStarted","Data":"aed2c26192fc00eeb095dbd075dfe3544a4acabe61402eea800fb38be4e7ad8a"} Dec 02 07:40:59 crc kubenswrapper[4895]: E1202 07:40:59.484336 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/test-operator-controller-manager-5854674fcc-6x68d" podUID="a5a01f83-cddf-479d-b6d0-7944d70c0bdd" Dec 02 07:40:59 crc kubenswrapper[4895]: I1202 07:40:59.486237 4895 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-sbrp2" event={"ID":"806c9a5b-16ad-499d-8625-ec9124baca56","Type":"ContainerStarted","Data":"24493a070298b7062c59f964266bcdd6c1bc331d88a554aa4b02d2e2caaec4eb"} Dec 02 07:40:59 crc kubenswrapper[4895]: I1202 07:40:59.488828 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-q6n97" event={"ID":"991002a6-abdd-43e4-b22d-1d95383d3b96","Type":"ContainerStarted","Data":"c05f5e7c6018e9998c2e6af27745ff4541ffb9597564c659f1524dc89a82fe49"} Dec 02 07:40:59 crc kubenswrapper[4895]: I1202 07:40:59.491268 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-d5lnh" event={"ID":"a2cb057c-0f4a-4220-8666-3ccab3458be2","Type":"ContainerStarted","Data":"16c057c328841368cd2f87a13fe4a7a68101e5b2be52304a2866a6696f73f3ee"} Dec 02 07:40:59 crc kubenswrapper[4895]: E1202 07:40:59.491489 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:d29650b006da97eb9178fcc58f2eb9fead8c2b414fac18f86a3c3a1507488c4f\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/placement-operator-controller-manager-78f8948974-q6n97" podUID="991002a6-abdd-43e4-b22d-1d95383d3b96" Dec 02 07:40:59 crc kubenswrapper[4895]: I1202 07:40:59.494609 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-rqm56" event={"ID":"1df4be5d-9e12-4799-aaa8-1ec5bfa11a2c","Type":"ContainerStarted","Data":"e67efa14c2f03d223b2f0a68891840ba07dbb138eeba75063eae472e396bfa24"} Dec 02 07:40:59 crc kubenswrapper[4895]: E1202 07:40:59.499525 
4895 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:9aa8c03633e4b934c57868c1660acf47e7d386ac86bcb344df262c9ad76b8621\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-d5lnh" podUID="a2cb057c-0f4a-4220-8666-3ccab3458be2" Dec 02 07:40:59 crc kubenswrapper[4895]: E1202 07:40:59.500464 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:2a3d21728a8bfb4e64617e63e61e2d1cb70a383ea3e8f846e0c3c3c02d2b0a9d\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-rqm56" podUID="1df4be5d-9e12-4799-aaa8-1ec5bfa11a2c" Dec 02 07:40:59 crc kubenswrapper[4895]: I1202 07:40:59.502589 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-2mrjr" event={"ID":"98c4ec72-ccff-439b-af96-53775411d965","Type":"ContainerStarted","Data":"bc9c31646d0ca496b7d5e7681cdfcec3ad495b7affc2d55675f730b923754e7f"} Dec 02 07:40:59 crc kubenswrapper[4895]: I1202 07:40:59.506789 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-z6fb4" event={"ID":"cb79c25e-42b0-4c89-b756-89d97afeea8a","Type":"ContainerStarted","Data":"4ceacab1dcc1cc536d4fafd3df6e8f8b08e4c920cbb6b5567226478457b5ad56"} Dec 02 07:40:59 crc kubenswrapper[4895]: I1202 07:40:59.509122 4895 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-w52df" event={"ID":"73f50459-103c-461c-a71a-95e93d66c4c2","Type":"ContainerStarted","Data":"1389c2d9e76f49e583d190f1550a63d5a3c1cd8206ea18d8c64ce29cf87bcedd"} Dec 02 07:40:59 crc kubenswrapper[4895]: I1202 07:40:59.515322 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-4hpbt" event={"ID":"2a05da0d-5cc8-4656-8cf4-96b96077d708","Type":"ContainerStarted","Data":"e751146c8a67c94c118944928c6ecb05c144a921e224ade0f5c9217842d3cbb8"} Dec 02 07:40:59 crc kubenswrapper[4895]: I1202 07:40:59.518598 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-6546668bfd-5xsjx" event={"ID":"170932c6-4350-4209-ba99-ff53eecd81ee","Type":"ContainerStarted","Data":"63c70065274330250855dc3b4366d0559eeef0b12359cbd0cc83e85e2061030b"} Dec 02 07:40:59 crc kubenswrapper[4895]: I1202 07:40:59.520503 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-668d9c48b9-g8gjq" event={"ID":"362234fb-b096-48e0-9be1-bed6b3e1dcf6","Type":"ContainerStarted","Data":"a451ecdb751801fa31aedad56aac958d7d7713713a3776e67f598bb7c97b4956"} Dec 02 07:40:59 crc kubenswrapper[4895]: I1202 07:40:59.523461 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-cp9zj" event={"ID":"4c5e704b-8d64-4341-abfe-da2df788ba5c","Type":"ContainerStarted","Data":"3615e604d7deb1de483f8112ed6125c4b6b5b221cc6dd6af62f6ccbb360c2f0f"} Dec 02 07:41:00 crc kubenswrapper[4895]: I1202 07:41:00.023392 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b8dc2edd-3bab-4a5d-a994-ba2212e85045-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4c299z\" (UID: 
\"b8dc2edd-3bab-4a5d-a994-ba2212e85045\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4c299z" Dec 02 07:41:00 crc kubenswrapper[4895]: E1202 07:41:00.023694 4895 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 02 07:41:00 crc kubenswrapper[4895]: E1202 07:41:00.023874 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b8dc2edd-3bab-4a5d-a994-ba2212e85045-cert podName:b8dc2edd-3bab-4a5d-a994-ba2212e85045 nodeName:}" failed. No retries permitted until 2025-12-02 07:41:04.023847269 +0000 UTC m=+1075.194706882 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b8dc2edd-3bab-4a5d-a994-ba2212e85045-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4c299z" (UID: "b8dc2edd-3bab-4a5d-a994-ba2212e85045") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 02 07:41:00 crc kubenswrapper[4895]: I1202 07:41:00.536223 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/932b16cb-babd-4cd7-902c-03cd223e98bc-webhook-certs\") pod \"openstack-operator-controller-manager-6979866f9f-j56w9\" (UID: \"932b16cb-babd-4cd7-902c-03cd223e98bc\") " pod="openstack-operators/openstack-operator-controller-manager-6979866f9f-j56w9" Dec 02 07:41:00 crc kubenswrapper[4895]: I1202 07:41:00.536352 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/932b16cb-babd-4cd7-902c-03cd223e98bc-metrics-certs\") pod \"openstack-operator-controller-manager-6979866f9f-j56w9\" (UID: \"932b16cb-babd-4cd7-902c-03cd223e98bc\") " pod="openstack-operators/openstack-operator-controller-manager-6979866f9f-j56w9" Dec 02 07:41:00 crc kubenswrapper[4895]: E1202 
07:41:00.536520 4895 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 02 07:41:00 crc kubenswrapper[4895]: E1202 07:41:00.536588 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/932b16cb-babd-4cd7-902c-03cd223e98bc-metrics-certs podName:932b16cb-babd-4cd7-902c-03cd223e98bc nodeName:}" failed. No retries permitted until 2025-12-02 07:41:04.536571068 +0000 UTC m=+1075.707430681 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/932b16cb-babd-4cd7-902c-03cd223e98bc-metrics-certs") pod "openstack-operator-controller-manager-6979866f9f-j56w9" (UID: "932b16cb-babd-4cd7-902c-03cd223e98bc") : secret "metrics-server-cert" not found Dec 02 07:41:00 crc kubenswrapper[4895]: E1202 07:41:00.536705 4895 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 02 07:41:00 crc kubenswrapper[4895]: E1202 07:41:00.536834 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/932b16cb-babd-4cd7-902c-03cd223e98bc-webhook-certs podName:932b16cb-babd-4cd7-902c-03cd223e98bc nodeName:}" failed. No retries permitted until 2025-12-02 07:41:04.536807304 +0000 UTC m=+1075.707666917 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/932b16cb-babd-4cd7-902c-03cd223e98bc-webhook-certs") pod "openstack-operator-controller-manager-6979866f9f-j56w9" (UID: "932b16cb-babd-4cd7-902c-03cd223e98bc") : secret "webhook-server-cert" not found Dec 02 07:41:00 crc kubenswrapper[4895]: E1202 07:41:00.551230 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-bhzzl" podUID="82a2bf22-3682-4982-b4fc-87ac78873cce" Dec 02 07:41:00 crc kubenswrapper[4895]: E1202 07:41:00.551781 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:7d66757c0af67104f0389e851a7cc0daa44443ad202d157417bd86bbb57cc385\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-hcnp5" podUID="ad99804b-869f-4b42-89ab-d29341434b61" Dec 02 07:41:00 crc kubenswrapper[4895]: E1202 07:41:00.553365 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:d29650b006da97eb9178fcc58f2eb9fead8c2b414fac18f86a3c3a1507488c4f\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" 
pod="openstack-operators/placement-operator-controller-manager-78f8948974-q6n97" podUID="991002a6-abdd-43e4-b22d-1d95383d3b96" Dec 02 07:41:00 crc kubenswrapper[4895]: E1202 07:41:00.553856 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-4642q" podUID="f4639bb3-56b6-498a-bb7d-ab26b46fe806" Dec 02 07:41:00 crc kubenswrapper[4895]: E1202 07:41:00.555256 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:2a3d21728a8bfb4e64617e63e61e2d1cb70a383ea3e8f846e0c3c3c02d2b0a9d\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-rqm56" podUID="1df4be5d-9e12-4799-aaa8-1ec5bfa11a2c" Dec 02 07:41:00 crc kubenswrapper[4895]: E1202 07:41:00.557122 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/test-operator-controller-manager-5854674fcc-6x68d" 
podUID="a5a01f83-cddf-479d-b6d0-7944d70c0bdd" Dec 02 07:41:00 crc kubenswrapper[4895]: E1202 07:41:00.561888 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:9aa8c03633e4b934c57868c1660acf47e7d386ac86bcb344df262c9ad76b8621\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-d5lnh" podUID="a2cb057c-0f4a-4220-8666-3ccab3458be2" Dec 02 07:41:03 crc kubenswrapper[4895]: I1202 07:41:03.425904 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/86fe6ea0-2ba9-46f8-9a71-1b990d841e31-cert\") pod \"infra-operator-controller-manager-57548d458d-wmsb6\" (UID: \"86fe6ea0-2ba9-46f8-9a71-1b990d841e31\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-wmsb6" Dec 02 07:41:03 crc kubenswrapper[4895]: E1202 07:41:03.426147 4895 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 02 07:41:03 crc kubenswrapper[4895]: E1202 07:41:03.426436 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86fe6ea0-2ba9-46f8-9a71-1b990d841e31-cert podName:86fe6ea0-2ba9-46f8-9a71-1b990d841e31 nodeName:}" failed. No retries permitted until 2025-12-02 07:41:11.426412183 +0000 UTC m=+1082.597271796 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/86fe6ea0-2ba9-46f8-9a71-1b990d841e31-cert") pod "infra-operator-controller-manager-57548d458d-wmsb6" (UID: "86fe6ea0-2ba9-46f8-9a71-1b990d841e31") : secret "infra-operator-webhook-server-cert" not found Dec 02 07:41:04 crc kubenswrapper[4895]: I1202 07:41:04.042844 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b8dc2edd-3bab-4a5d-a994-ba2212e85045-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4c299z\" (UID: \"b8dc2edd-3bab-4a5d-a994-ba2212e85045\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4c299z" Dec 02 07:41:04 crc kubenswrapper[4895]: E1202 07:41:04.043115 4895 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 02 07:41:04 crc kubenswrapper[4895]: E1202 07:41:04.043217 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b8dc2edd-3bab-4a5d-a994-ba2212e85045-cert podName:b8dc2edd-3bab-4a5d-a994-ba2212e85045 nodeName:}" failed. No retries permitted until 2025-12-02 07:41:12.043191991 +0000 UTC m=+1083.214051604 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b8dc2edd-3bab-4a5d-a994-ba2212e85045-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4c299z" (UID: "b8dc2edd-3bab-4a5d-a994-ba2212e85045") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 02 07:41:04 crc kubenswrapper[4895]: I1202 07:41:04.552601 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/932b16cb-babd-4cd7-902c-03cd223e98bc-webhook-certs\") pod \"openstack-operator-controller-manager-6979866f9f-j56w9\" (UID: \"932b16cb-babd-4cd7-902c-03cd223e98bc\") " pod="openstack-operators/openstack-operator-controller-manager-6979866f9f-j56w9" Dec 02 07:41:04 crc kubenswrapper[4895]: I1202 07:41:04.552679 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/932b16cb-babd-4cd7-902c-03cd223e98bc-metrics-certs\") pod \"openstack-operator-controller-manager-6979866f9f-j56w9\" (UID: \"932b16cb-babd-4cd7-902c-03cd223e98bc\") " pod="openstack-operators/openstack-operator-controller-manager-6979866f9f-j56w9" Dec 02 07:41:04 crc kubenswrapper[4895]: E1202 07:41:04.552832 4895 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 02 07:41:04 crc kubenswrapper[4895]: E1202 07:41:04.552850 4895 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 02 07:41:04 crc kubenswrapper[4895]: E1202 07:41:04.552925 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/932b16cb-babd-4cd7-902c-03cd223e98bc-metrics-certs podName:932b16cb-babd-4cd7-902c-03cd223e98bc nodeName:}" failed. No retries permitted until 2025-12-02 07:41:12.552902327 +0000 UTC m=+1083.723761940 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/932b16cb-babd-4cd7-902c-03cd223e98bc-metrics-certs") pod "openstack-operator-controller-manager-6979866f9f-j56w9" (UID: "932b16cb-babd-4cd7-902c-03cd223e98bc") : secret "metrics-server-cert" not found Dec 02 07:41:04 crc kubenswrapper[4895]: E1202 07:41:04.552944 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/932b16cb-babd-4cd7-902c-03cd223e98bc-webhook-certs podName:932b16cb-babd-4cd7-902c-03cd223e98bc nodeName:}" failed. No retries permitted until 2025-12-02 07:41:12.552937368 +0000 UTC m=+1083.723796981 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/932b16cb-babd-4cd7-902c-03cd223e98bc-webhook-certs") pod "openstack-operator-controller-manager-6979866f9f-j56w9" (UID: "932b16cb-babd-4cd7-902c-03cd223e98bc") : secret "webhook-server-cert" not found Dec 02 07:41:11 crc kubenswrapper[4895]: I1202 07:41:11.442286 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/86fe6ea0-2ba9-46f8-9a71-1b990d841e31-cert\") pod \"infra-operator-controller-manager-57548d458d-wmsb6\" (UID: \"86fe6ea0-2ba9-46f8-9a71-1b990d841e31\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-wmsb6" Dec 02 07:41:11 crc kubenswrapper[4895]: I1202 07:41:11.683839 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/86fe6ea0-2ba9-46f8-9a71-1b990d841e31-cert\") pod \"infra-operator-controller-manager-57548d458d-wmsb6\" (UID: \"86fe6ea0-2ba9-46f8-9a71-1b990d841e31\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-wmsb6" Dec 02 07:41:11 crc kubenswrapper[4895]: I1202 07:41:11.873992 4895 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-gbckh" Dec 02 07:41:11 crc kubenswrapper[4895]: I1202 07:41:11.882036 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-57548d458d-wmsb6" Dec 02 07:41:12 crc kubenswrapper[4895]: I1202 07:41:12.051097 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b8dc2edd-3bab-4a5d-a994-ba2212e85045-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4c299z\" (UID: \"b8dc2edd-3bab-4a5d-a994-ba2212e85045\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4c299z" Dec 02 07:41:12 crc kubenswrapper[4895]: I1202 07:41:12.057087 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b8dc2edd-3bab-4a5d-a994-ba2212e85045-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4c299z\" (UID: \"b8dc2edd-3bab-4a5d-a994-ba2212e85045\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4c299z" Dec 02 07:41:12 crc kubenswrapper[4895]: E1202 07:41:12.109524 4895 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/manila-operator@sha256:ecf7be921850bdc04697ed1b332bab39ad2a64e4e45c2a445c04f9bae6ac61b5" Dec 02 07:41:12 crc kubenswrapper[4895]: E1202 07:41:12.109826 4895 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/manila-operator@sha256:ecf7be921850bdc04697ed1b332bab39ad2a64e4e45c2a445c04f9bae6ac61b5,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-p4cwg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-6546668bfd-5xsjx_openstack-operators(170932c6-4350-4209-ba99-ff53eecd81ee): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 07:41:12 crc kubenswrapper[4895]: I1202 07:41:12.145542 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-ktxfs" Dec 02 07:41:12 crc kubenswrapper[4895]: I1202 07:41:12.156235 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4c299z" Dec 02 07:41:12 crc kubenswrapper[4895]: I1202 07:41:12.563173 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/932b16cb-babd-4cd7-902c-03cd223e98bc-webhook-certs\") pod \"openstack-operator-controller-manager-6979866f9f-j56w9\" (UID: \"932b16cb-babd-4cd7-902c-03cd223e98bc\") " pod="openstack-operators/openstack-operator-controller-manager-6979866f9f-j56w9" Dec 02 07:41:12 crc kubenswrapper[4895]: I1202 07:41:12.563261 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/932b16cb-babd-4cd7-902c-03cd223e98bc-metrics-certs\") pod \"openstack-operator-controller-manager-6979866f9f-j56w9\" (UID: \"932b16cb-babd-4cd7-902c-03cd223e98bc\") " pod="openstack-operators/openstack-operator-controller-manager-6979866f9f-j56w9" Dec 02 07:41:12 crc kubenswrapper[4895]: I1202 07:41:12.567149 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/932b16cb-babd-4cd7-902c-03cd223e98bc-webhook-certs\") pod \"openstack-operator-controller-manager-6979866f9f-j56w9\" (UID: \"932b16cb-babd-4cd7-902c-03cd223e98bc\") " pod="openstack-operators/openstack-operator-controller-manager-6979866f9f-j56w9" Dec 02 07:41:12 crc kubenswrapper[4895]: I1202 07:41:12.568654 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/932b16cb-babd-4cd7-902c-03cd223e98bc-metrics-certs\") pod \"openstack-operator-controller-manager-6979866f9f-j56w9\" (UID: \"932b16cb-babd-4cd7-902c-03cd223e98bc\") " pod="openstack-operators/openstack-operator-controller-manager-6979866f9f-j56w9" Dec 02 07:41:12 crc kubenswrapper[4895]: I1202 07:41:12.843633 4895 reflector.go:368] Caches populated for 
*v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-nsrcp" Dec 02 07:41:12 crc kubenswrapper[4895]: I1202 07:41:12.851348 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-6979866f9f-j56w9" Dec 02 07:41:15 crc kubenswrapper[4895]: E1202 07:41:15.349206 4895 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/glance-operator@sha256:440cde33d3a2a0c545cd1c110a3634eb85544370f448865b97a13c38034b0172" Dec 02 07:41:15 crc kubenswrapper[4895]: E1202 07:41:15.349908 4895 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/glance-operator@sha256:440cde33d3a2a0c545cd1c110a3634eb85544370f448865b97a13c38034b0172,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7jjjh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-operator-controller-manager-668d9c48b9-g8gjq_openstack-operators(362234fb-b096-48e0-9be1-bed6b3e1dcf6): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 07:41:17 crc kubenswrapper[4895]: E1202 07:41:17.243674 4895 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/cinder-operator@sha256:1d60701214b39cdb0fa70bbe5710f9b131139a9f4b482c2db4058a04daefb801" Dec 02 07:41:17 crc kubenswrapper[4895]: E1202 07:41:17.244500 4895 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/cinder-operator@sha256:1d60701214b39cdb0fa70bbe5710f9b131139a9f4b482c2db4058a04daefb801,Command:[/manager],Args:[--leader-elect 
--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-msm2f,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-operator-controller-manager-859b6ccc6-74cpb_openstack-operators(582c057b-7217-47bf-b2d7-f691861668c3): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 07:41:19 crc kubenswrapper[4895]: E1202 07:41:19.292082 4895 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/mariadb-operator@sha256:600ca007e493d3af0fcc2ebac92e8da5efd2afe812b62d7d3d4dd0115bdf05d7" Dec 02 07:41:19 crc kubenswrapper[4895]: E1202 07:41:19.292640 4895 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:600ca007e493d3af0fcc2ebac92e8da5efd2afe812b62d7d3d4dd0115bdf05d7,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-ltmr2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-56bbcc9d85-qxbgv_openstack-operators(392adf20-0169-4258-9f4f-bb293bd5f8e8): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 07:41:21 crc kubenswrapper[4895]: E1202 07:41:21.751858 4895 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/octavia-operator@sha256:d9a3694865a7d54ee96397add18c3898886e98d079aa20876a0f4de1fa7a7168" Dec 02 07:41:21 crc kubenswrapper[4895]: E1202 07:41:21.752095 4895 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:d9a3694865a7d54ee96397add18c3898886e98d079aa20876a0f4de1fa7a7168,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-bdj77,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-998648c74-2mrjr_openstack-operators(98c4ec72-ccff-439b-af96-53775411d965): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 07:41:23 crc kubenswrapper[4895]: E1202 07:41:23.415221 4895 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ironic-operator@sha256:0f523b7e2fa9e86fef986acf07d0c42d5658c475d565f11eaea926ebffcb6530" Dec 02 07:41:23 crc kubenswrapper[4895]: E1202 07:41:23.415437 4895 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ironic-operator@sha256:0f523b7e2fa9e86fef986acf07d0c42d5658c475d565f11eaea926ebffcb6530,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-fkgk4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ironic-operator-controller-manager-6c548fd776-cp9zj_openstack-operators(4c5e704b-8d64-4341-abfe-da2df788ba5c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 07:41:24 crc kubenswrapper[4895]: E1202 07:41:24.431204 4895 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/designate-operator@sha256:9f68d7bc8c6bce38f46dee8a8272d5365c49fe7b32b2af52e8ac884e212f3a85" Dec 02 07:41:24 crc kubenswrapper[4895]: E1202 07:41:24.431952 4895 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/designate-operator@sha256:9f68d7bc8c6bce38f46dee8a8272d5365c49fe7b32b2af52e8ac884e212f3a85,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-bwvvh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod designate-operator-controller-manager-78b4bc895b-m8cp2_openstack-operators(f6776d5f-3c3e-48b5-a6fd-30ff153345c2): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 07:41:25 crc kubenswrapper[4895]: E1202 07:41:25.148376 4895 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:986861e5a0a9954f63581d9d55a30f8057883cefea489415d76257774526eea3" Dec 02 07:41:25 crc kubenswrapper[4895]: E1202 07:41:25.148698 4895 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:986861e5a0a9954f63581d9d55a30f8057883cefea489415d76257774526eea3,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-52pfp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-546d4bdf48-z6fb4_openstack-operators(cb79c25e-42b0-4c89-b756-89d97afeea8a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 07:41:29 crc kubenswrapper[4895]: E1202 07:41:29.902741 4895 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670" Dec 02 07:41:29 crc kubenswrapper[4895]: E1202 07:41:29.903851 4895 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7cswj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-697bc559fc-4hpbt_openstack-operators(2a05da0d-5cc8-4656-8cf4-96b96077d708): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 07:41:35 crc kubenswrapper[4895]: I1202 07:41:35.473635 4895 patch_prober.go:28] interesting pod/machine-config-daemon-wfcg7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 07:41:35 crc kubenswrapper[4895]: I1202 07:41:35.474115 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 07:41:42 crc kubenswrapper[4895]: E1202 07:41:42.666057 4895 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/openstack-k8s-operators/swift-operator@sha256:2a3d21728a8bfb4e64617e63e61e2d1cb70a383ea3e8f846e0c3c3c02d2b0a9d" Dec 02 07:41:42 crc kubenswrapper[4895]: E1202 07:41:42.666928 4895 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:2a3d21728a8bfb4e64617e63e61e2d1cb70a383ea3e8f846e0c3c3c02d2b0a9d,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4kc2c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-5f8c65bbfc-rqm56_openstack-operators(1df4be5d-9e12-4799-aaa8-1ec5bfa11a2c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 07:41:43 crc kubenswrapper[4895]: I1202 07:41:43.492910 4895 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 02 07:41:45 crc kubenswrapper[4895]: I1202 07:41:45.137408 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4c299z"] Dec 02 07:41:45 crc kubenswrapper[4895]: E1202 07:41:45.380258 4895 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2" Dec 02 07:41:45 crc kubenswrapper[4895]: E1202 07:41:45.380502 4895 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-l28hv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-bhzzl_openstack-operators(82a2bf22-3682-4982-b4fc-87ac78873cce): 
ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 07:41:45 crc kubenswrapper[4895]: E1202 07:41:45.381827 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-bhzzl" podUID="82a2bf22-3682-4982-b4fc-87ac78873cce" Dec 02 07:41:45 crc kubenswrapper[4895]: W1202 07:41:45.402372 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb8dc2edd_3bab_4a5d_a994_ba2212e85045.slice/crio-6563c08ae21d9ccbf05c8d65063adcf1459dcd1b1fd937468bdf38aceb0e3812 WatchSource:0}: Error finding container 6563c08ae21d9ccbf05c8d65063adcf1459dcd1b1fd937468bdf38aceb0e3812: Status 404 returned error can't find the container with id 6563c08ae21d9ccbf05c8d65063adcf1459dcd1b1fd937468bdf38aceb0e3812 Dec 02 07:41:45 crc kubenswrapper[4895]: I1202 07:41:45.815712 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-wmsb6"] Dec 02 07:41:45 crc kubenswrapper[4895]: I1202 07:41:45.895676 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6979866f9f-j56w9"] Dec 02 07:41:46 crc kubenswrapper[4895]: E1202 07:41:46.104872 4895 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying layer: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Dec 02 07:41:46 crc kubenswrapper[4895]: E1202 07:41:46.105076 4895 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ 
--logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-bdj77,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-998648c74-2mrjr_openstack-operators(98c4ec72-ccff-439b-af96-53775411d965): ErrImagePull: rpc error: code = Canceled desc = copying layer: context canceled" logger="UnhandledError" Dec 02 07:41:46 crc kubenswrapper[4895]: E1202 07:41:46.106815 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying layer: context canceled\"]" pod="openstack-operators/octavia-operator-controller-manager-998648c74-2mrjr" podUID="98c4ec72-ccff-439b-af96-53775411d965" Dec 02 07:41:46 
crc kubenswrapper[4895]: W1202 07:41:46.120674 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86fe6ea0_2ba9_46f8_9a71_1b990d841e31.slice/crio-b9faf8159668a65d27b2585d0e32532631c0669ac20f4b05f8a8cd74aceaaa31 WatchSource:0}: Error finding container b9faf8159668a65d27b2585d0e32532631c0669ac20f4b05f8a8cd74aceaaa31: Status 404 returned error can't find the container with id b9faf8159668a65d27b2585d0e32532631c0669ac20f4b05f8a8cd74aceaaa31 Dec 02 07:41:46 crc kubenswrapper[4895]: W1202 07:41:46.122568 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod932b16cb_babd_4cd7_902c_03cd223e98bc.slice/crio-c51618fa7083569b030f6f403a68c2546c91dacaa4eda6e789afd997718f60c1 WatchSource:0}: Error finding container c51618fa7083569b030f6f403a68c2546c91dacaa4eda6e789afd997718f60c1: Status 404 returned error can't find the container with id c51618fa7083569b030f6f403a68c2546c91dacaa4eda6e789afd997718f60c1 Dec 02 07:41:46 crc kubenswrapper[4895]: E1202 07:41:46.221565 4895 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying layer: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Dec 02 07:41:46 crc kubenswrapper[4895]: E1202 07:41:46.221877 4895 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-bwvvh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod designate-operator-controller-manager-78b4bc895b-m8cp2_openstack-operators(f6776d5f-3c3e-48b5-a6fd-30ff153345c2): ErrImagePull: rpc error: code = Canceled desc = copying layer: context canceled" logger="UnhandledError" Dec 02 07:41:46 crc kubenswrapper[4895]: E1202 07:41:46.223681 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying layer: context canceled\"]" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-m8cp2" podUID="f6776d5f-3c3e-48b5-a6fd-30ff153345c2" Dec 02 07:41:46 crc kubenswrapper[4895]: E1202 07:41:46.308090 4895 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying layer: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Dec 02 07:41:46 crc kubenswrapper[4895]: E1202 07:41:46.308325 4895 kuberuntime_manager.go:1274] "Unhandled 
Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-52pfp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-546d4bdf48-z6fb4_openstack-operators(cb79c25e-42b0-4c89-b756-89d97afeea8a): ErrImagePull: rpc error: code = Canceled desc = copying layer: context canceled" logger="UnhandledError" Dec 02 07:41:46 crc kubenswrapper[4895]: E1202 07:41:46.309509 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc 
error: code = Canceled desc = copying layer: context canceled\"]" pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-z6fb4" podUID="cb79c25e-42b0-4c89-b756-89d97afeea8a" Dec 02 07:41:46 crc kubenswrapper[4895]: I1202 07:41:46.340025 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-sbrp2" event={"ID":"806c9a5b-16ad-499d-8625-ec9124baca56","Type":"ContainerStarted","Data":"d6c07f6111f14b2bc74b0c0d20123a2604d7d3496b4ee5954d9b9cceba94addc"} Dec 02 07:41:46 crc kubenswrapper[4895]: I1202 07:41:46.342062 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4c299z" event={"ID":"b8dc2edd-3bab-4a5d-a994-ba2212e85045","Type":"ContainerStarted","Data":"6563c08ae21d9ccbf05c8d65063adcf1459dcd1b1fd937468bdf38aceb0e3812"} Dec 02 07:41:46 crc kubenswrapper[4895]: I1202 07:41:46.343521 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-wmsb6" event={"ID":"86fe6ea0-2ba9-46f8-9a71-1b990d841e31","Type":"ContainerStarted","Data":"b9faf8159668a65d27b2585d0e32532631c0669ac20f4b05f8a8cd74aceaaa31"} Dec 02 07:41:46 crc kubenswrapper[4895]: I1202 07:41:46.344819 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6979866f9f-j56w9" event={"ID":"932b16cb-babd-4cd7-902c-03cd223e98bc","Type":"ContainerStarted","Data":"c51618fa7083569b030f6f403a68c2546c91dacaa4eda6e789afd997718f60c1"} Dec 02 07:41:46 crc kubenswrapper[4895]: I1202 07:41:46.346337 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-mgtj6" event={"ID":"442a4a4d-98fb-4869-9418-7f8f3ff4644b","Type":"ContainerStarted","Data":"a9b6c81e6aedb8c5fe8ab0449d61663a0d1ecc445f482ff20cb99fbd1e115bdc"} Dec 02 07:41:46 crc kubenswrapper[4895]: I1202 
07:41:46.353667 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-9zczx" event={"ID":"ca1b5423-1f2c-4b12-9ae9-f65bb5301c51","Type":"ContainerStarted","Data":"9fb32daa228537473feb815e19c32588395eebaab91a467ecc0e111d9ac139c0"} Dec 02 07:41:46 crc kubenswrapper[4895]: I1202 07:41:46.358296 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-w52df" event={"ID":"73f50459-103c-461c-a71a-95e93d66c4c2","Type":"ContainerStarted","Data":"53639c2dc4ca1244d6350d5bee3c0ed7ec3af1b120373c4eb3db7c2038208768"} Dec 02 07:41:46 crc kubenswrapper[4895]: E1202 07:41:46.434676 4895 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying layer: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Dec 02 07:41:46 crc kubenswrapper[4895]: E1202 07:41:46.434858 4895 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-fkgk4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ironic-operator-controller-manager-6c548fd776-cp9zj_openstack-operators(4c5e704b-8d64-4341-abfe-da2df788ba5c): ErrImagePull: rpc error: code = Canceled desc = copying layer: context canceled" logger="UnhandledError" Dec 02 07:41:46 crc kubenswrapper[4895]: E1202 07:41:46.436108 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying layer: context canceled\"]" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-cp9zj" podUID="4c5e704b-8d64-4341-abfe-da2df788ba5c" Dec 02 07:41:46 crc kubenswrapper[4895]: E1202 07:41:46.610660 4895 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = reading blob sha256:46ba3f23f1d3fb1440deeb279716e4377e79e61736ec2227270349b9618a0fdd: Get 
\"https://quay.io/v2/openstack-k8s-operators/kube-rbac-proxy/blobs/sha256:46ba3f23f1d3fb1440deeb279716e4377e79e61736ec2227270349b9618a0fdd\": context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Dec 02 07:41:46 crc kubenswrapper[4895]: E1202 07:41:46.610978 4895 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7jjjh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-operator-controller-manager-668d9c48b9-g8gjq_openstack-operators(362234fb-b096-48e0-9be1-bed6b3e1dcf6): ErrImagePull: rpc error: code = Canceled desc = reading blob 
sha256:46ba3f23f1d3fb1440deeb279716e4377e79e61736ec2227270349b9618a0fdd: Get \"https://quay.io/v2/openstack-k8s-operators/kube-rbac-proxy/blobs/sha256:46ba3f23f1d3fb1440deeb279716e4377e79e61736ec2227270349b9618a0fdd\": context canceled" logger="UnhandledError" Dec 02 07:41:46 crc kubenswrapper[4895]: E1202 07:41:46.612392 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = reading blob sha256:46ba3f23f1d3fb1440deeb279716e4377e79e61736ec2227270349b9618a0fdd: Get \\\"https://quay.io/v2/openstack-k8s-operators/kube-rbac-proxy/blobs/sha256:46ba3f23f1d3fb1440deeb279716e4377e79e61736ec2227270349b9618a0fdd\\\": context canceled\"]" pod="openstack-operators/glance-operator-controller-manager-668d9c48b9-g8gjq" podUID="362234fb-b096-48e0-9be1-bed6b3e1dcf6" Dec 02 07:41:46 crc kubenswrapper[4895]: E1202 07:41:46.615316 4895 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying layer: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Dec 02 07:41:46 crc kubenswrapper[4895]: E1202 07:41:46.615469 4895 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-msm2f,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-operator-controller-manager-859b6ccc6-74cpb_openstack-operators(582c057b-7217-47bf-b2d7-f691861668c3): ErrImagePull: rpc error: code = Canceled desc = copying layer: context canceled" logger="UnhandledError" Dec 02 07:41:46 crc kubenswrapper[4895]: E1202 07:41:46.616649 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying layer: context canceled\"]" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-74cpb" podUID="582c057b-7217-47bf-b2d7-f691861668c3" Dec 02 07:41:46 crc kubenswrapper[4895]: E1202 07:41:46.633078 4895 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying layer: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Dec 02 07:41:46 crc kubenswrapper[4895]: E1202 07:41:46.633365 4895 kuberuntime_manager.go:1274] "Unhandled Error" 
err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-ltmr2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-56bbcc9d85-qxbgv_openstack-operators(392adf20-0169-4258-9f4f-bb293bd5f8e8): ErrImagePull: rpc error: code = Canceled desc = copying layer: context canceled" logger="UnhandledError" Dec 02 07:41:46 crc kubenswrapper[4895]: E1202 07:41:46.634598 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: 
code = Canceled desc = copying layer: context canceled\"]" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-qxbgv" podUID="392adf20-0169-4258-9f4f-bb293bd5f8e8" Dec 02 07:41:46 crc kubenswrapper[4895]: E1202 07:41:46.914260 4895 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = reading blob sha256:723607448b0abc536cd883abffcf6942c1c562a48117db73f6fe693d99395ee2: Get \"https://quay.io/v2/openstack-k8s-operators/kube-rbac-proxy/blobs/sha256:723607448b0abc536cd883abffcf6942c1c562a48117db73f6fe693d99395ee2\": context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Dec 02 07:41:46 crc kubenswrapper[4895]: E1202 07:41:46.914449 4895 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-p4cwg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-6546668bfd-5xsjx_openstack-operators(170932c6-4350-4209-ba99-ff53eecd81ee): ErrImagePull: rpc error: code = Canceled desc = reading blob sha256:723607448b0abc536cd883abffcf6942c1c562a48117db73f6fe693d99395ee2: Get \"https://quay.io/v2/openstack-k8s-operators/kube-rbac-proxy/blobs/sha256:723607448b0abc536cd883abffcf6942c1c562a48117db73f6fe693d99395ee2\": context canceled" logger="UnhandledError" Dec 02 07:41:46 crc kubenswrapper[4895]: E1202 07:41:46.915702 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = reading blob sha256:723607448b0abc536cd883abffcf6942c1c562a48117db73f6fe693d99395ee2: Get \\\"https://quay.io/v2/openstack-k8s-operators/kube-rbac-proxy/blobs/sha256:723607448b0abc536cd883abffcf6942c1c562a48117db73f6fe693d99395ee2\\\": context canceled\"]" 
pod="openstack-operators/manila-operator-controller-manager-6546668bfd-5xsjx" podUID="170932c6-4350-4209-ba99-ff53eecd81ee" Dec 02 07:41:47 crc kubenswrapper[4895]: I1202 07:41:47.509405 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-hcnp5" event={"ID":"ad99804b-869f-4b42-89ab-d29341434b61","Type":"ContainerStarted","Data":"7ed2d21f00b0d426baa597b55345844dadb33744053a1e857915db75bd8f454f"} Dec 02 07:41:47 crc kubenswrapper[4895]: I1202 07:41:47.513565 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-q6n97" event={"ID":"991002a6-abdd-43e4-b22d-1d95383d3b96","Type":"ContainerStarted","Data":"f1b89f006a9c063e63d6a377171d77e869512f6d454f4dcef7c597b6ec7f2bba"} Dec 02 07:41:47 crc kubenswrapper[4895]: I1202 07:41:47.516500 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-6x68d" event={"ID":"a5a01f83-cddf-479d-b6d0-7944d70c0bdd","Type":"ContainerStarted","Data":"83a39668876e1975d3608f04034caa2465f83cf0185448731ca54daa46f6c07b"} Dec 02 07:41:47 crc kubenswrapper[4895]: I1202 07:41:47.524561 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-d5lnh" event={"ID":"a2cb057c-0f4a-4220-8666-3ccab3458be2","Type":"ContainerStarted","Data":"b3a56ffeccabd7baaad00513c54b2eac4552d20390e67c8f0b5b1645e9106eec"} Dec 02 07:41:47 crc kubenswrapper[4895]: I1202 07:41:47.527334 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6979866f9f-j56w9" event={"ID":"932b16cb-babd-4cd7-902c-03cd223e98bc","Type":"ContainerStarted","Data":"8b94dcbc6139f4dd9fa96f2c0bfebce75238992b73870313272d84c70ade0093"} Dec 02 07:41:47 crc kubenswrapper[4895]: I1202 07:41:47.527493 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack-operators/openstack-operator-controller-manager-6979866f9f-j56w9" Dec 02 07:41:47 crc kubenswrapper[4895]: I1202 07:41:47.534333 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-4642q" event={"ID":"f4639bb3-56b6-498a-bb7d-ab26b46fe806","Type":"ContainerStarted","Data":"92a7078eaf6f50c203a72d79fa753e788082567d4e53e61cd44071fa65441710"} Dec 02 07:41:47 crc kubenswrapper[4895]: E1202 07:41:47.537881 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-998648c74-2mrjr" podUID="98c4ec72-ccff-439b-af96-53775411d965" Dec 02 07:41:47 crc kubenswrapper[4895]: E1202 07:41:47.549318 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-z6fb4" podUID="cb79c25e-42b0-4c89-b756-89d97afeea8a" Dec 02 07:41:47 crc kubenswrapper[4895]: E1202 07:41:47.549380 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-m8cp2" podUID="f6776d5f-3c3e-48b5-a6fd-30ff153345c2" Dec 02 07:41:47 crc kubenswrapper[4895]: I1202 07:41:47.562474 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-6979866f9f-j56w9" podStartSLOduration=51.562454652 podStartE2EDuration="51.562454652s" 
podCreationTimestamp="2025-12-02 07:40:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:41:47.558139899 +0000 UTC m=+1118.728999502" watchObservedRunningTime="2025-12-02 07:41:47.562454652 +0000 UTC m=+1118.733314255" Dec 02 07:41:48 crc kubenswrapper[4895]: I1202 07:41:48.593310 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-2mrjr" event={"ID":"98c4ec72-ccff-439b-af96-53775411d965","Type":"ContainerStarted","Data":"15010a9e8ff70eb7ae85faf67574b917eede850e8032444c18c9555dd6698d17"} Dec 02 07:41:48 crc kubenswrapper[4895]: I1202 07:41:48.594376 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-998648c74-2mrjr" Dec 02 07:41:48 crc kubenswrapper[4895]: I1202 07:41:48.596166 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-z6fb4" event={"ID":"cb79c25e-42b0-4c89-b756-89d97afeea8a","Type":"ContainerStarted","Data":"ff1cd9aae91cfdb7be0d3a436937c4f5a1c53471eb9b30fb8aff267082389c82"} Dec 02 07:41:48 crc kubenswrapper[4895]: I1202 07:41:48.596505 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-z6fb4" Dec 02 07:41:48 crc kubenswrapper[4895]: I1202 07:41:48.597585 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-m8cp2" event={"ID":"f6776d5f-3c3e-48b5-a6fd-30ff153345c2","Type":"ContainerStarted","Data":"710bc0db3cbdcef1ae7542fb5b51899de9b1e4bdb0b8c7f24db3cd35d8767388"} Dec 02 07:41:48 crc kubenswrapper[4895]: I1202 07:41:48.597777 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-m8cp2" Dec 02 07:41:48 crc kubenswrapper[4895]: E1202 07:41:48.600079 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-z6fb4" podUID="cb79c25e-42b0-4c89-b756-89d97afeea8a" Dec 02 07:41:48 crc kubenswrapper[4895]: E1202 07:41:48.615935 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-m8cp2" podUID="f6776d5f-3c3e-48b5-a6fd-30ff153345c2" Dec 02 07:41:48 crc kubenswrapper[4895]: E1202 07:41:48.727152 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-998648c74-2mrjr" podUID="98c4ec72-ccff-439b-af96-53775411d965" Dec 02 07:41:49 crc kubenswrapper[4895]: E1202 07:41:49.696385 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-74cpb" podUID="582c057b-7217-47bf-b2d7-f691861668c3" Dec 02 07:41:49 crc kubenswrapper[4895]: E1202 07:41:49.702250 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-m8cp2" podUID="f6776d5f-3c3e-48b5-a6fd-30ff153345c2" Dec 02 07:41:49 crc kubenswrapper[4895]: E1202 07:41:49.702383 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-z6fb4" podUID="cb79c25e-42b0-4c89-b756-89d97afeea8a" Dec 02 07:41:49 crc kubenswrapper[4895]: E1202 07:41:49.702480 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-998648c74-2mrjr" podUID="98c4ec72-ccff-439b-af96-53775411d965" Dec 02 07:41:49 crc kubenswrapper[4895]: E1202 07:41:49.866335 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-cp9zj" podUID="4c5e704b-8d64-4341-abfe-da2df788ba5c" Dec 02 07:41:49 crc kubenswrapper[4895]: E1202 07:41:49.894227 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-qxbgv" podUID="392adf20-0169-4258-9f4f-bb293bd5f8e8" Dec 02 07:41:50 crc kubenswrapper[4895]: E1202 07:41:50.046122 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/glance-operator-controller-manager-668d9c48b9-g8gjq" podUID="362234fb-b096-48e0-9be1-bed6b3e1dcf6" Dec 02 07:41:50 crc kubenswrapper[4895]: E1202 07:41:50.062167 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/manila-operator-controller-manager-6546668bfd-5xsjx" podUID="170932c6-4350-4209-ba99-ff53eecd81ee" Dec 02 07:41:50 crc kubenswrapper[4895]: I1202 07:41:50.725959 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-668d9c48b9-g8gjq" event={"ID":"362234fb-b096-48e0-9be1-bed6b3e1dcf6","Type":"ContainerStarted","Data":"adb9f252842e5eb9e2f3a56c5af7ed794b91f7d26ae4f552fb8d5826292c06c4"} Dec 02 07:41:50 crc kubenswrapper[4895]: I1202 07:41:50.726866 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-668d9c48b9-g8gjq" Dec 02 07:41:50 crc kubenswrapper[4895]: E1202 07:41:50.727835 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/glance-operator-controller-manager-668d9c48b9-g8gjq" podUID="362234fb-b096-48e0-9be1-bed6b3e1dcf6" Dec 02 07:41:50 crc kubenswrapper[4895]: I1202 07:41:50.737918 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-qxbgv" 
event={"ID":"392adf20-0169-4258-9f4f-bb293bd5f8e8","Type":"ContainerStarted","Data":"e5786a7985306740e1230351e1daf3e8118165d939a5e97b77a87d07209bc91e"} Dec 02 07:41:50 crc kubenswrapper[4895]: I1202 07:41:50.739060 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-qxbgv" Dec 02 07:41:50 crc kubenswrapper[4895]: E1202 07:41:50.740066 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-qxbgv" podUID="392adf20-0169-4258-9f4f-bb293bd5f8e8" Dec 02 07:41:50 crc kubenswrapper[4895]: I1202 07:41:50.746154 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-cp9zj" event={"ID":"4c5e704b-8d64-4341-abfe-da2df788ba5c","Type":"ContainerStarted","Data":"0b6ec021c3f813436ec6326d63ea6156d984b2d9bc3d1ae944280f9d12aedba9"} Dec 02 07:41:50 crc kubenswrapper[4895]: I1202 07:41:50.746904 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-cp9zj" Dec 02 07:41:50 crc kubenswrapper[4895]: E1202 07:41:50.795783 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-cp9zj" podUID="4c5e704b-8d64-4341-abfe-da2df788ba5c" Dec 02 07:41:50 crc kubenswrapper[4895]: I1202 07:41:50.804708 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-74cpb" 
event={"ID":"582c057b-7217-47bf-b2d7-f691861668c3","Type":"ContainerStarted","Data":"a9234cb94bbc0130e2e7936f9edde96002ec3d6256c6ea4ca66c75a794a315f0"} Dec 02 07:41:50 crc kubenswrapper[4895]: I1202 07:41:50.805137 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-74cpb" Dec 02 07:41:50 crc kubenswrapper[4895]: E1202 07:41:50.809818 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-74cpb" podUID="582c057b-7217-47bf-b2d7-f691861668c3" Dec 02 07:41:50 crc kubenswrapper[4895]: I1202 07:41:50.817424 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-6546668bfd-5xsjx" event={"ID":"170932c6-4350-4209-ba99-ff53eecd81ee","Type":"ContainerStarted","Data":"caeedf7af4dbe0c0e45124b10ec16d9f06beb16b491409f06ae872640fe4098f"} Dec 02 07:41:50 crc kubenswrapper[4895]: I1202 07:41:50.817786 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-6546668bfd-5xsjx" Dec 02 07:41:50 crc kubenswrapper[4895]: E1202 07:41:50.819204 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/manila-operator-controller-manager-6546668bfd-5xsjx" podUID="170932c6-4350-4209-ba99-ff53eecd81ee" Dec 02 07:41:52 crc kubenswrapper[4895]: E1202 07:41:52.026610 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/glance-operator-controller-manager-668d9c48b9-g8gjq" podUID="362234fb-b096-48e0-9be1-bed6b3e1dcf6" Dec 02 07:41:52 crc kubenswrapper[4895]: E1202 07:41:52.029116 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/manila-operator-controller-manager-6546668bfd-5xsjx" podUID="170932c6-4350-4209-ba99-ff53eecd81ee" Dec 02 07:41:52 crc kubenswrapper[4895]: E1202 07:41:52.029279 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-74cpb" podUID="582c057b-7217-47bf-b2d7-f691861668c3" Dec 02 07:41:52 crc kubenswrapper[4895]: E1202 07:41:52.029409 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-qxbgv" podUID="392adf20-0169-4258-9f4f-bb293bd5f8e8" Dec 02 07:41:52 crc kubenswrapper[4895]: E1202 07:41:52.029574 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-cp9zj" podUID="4c5e704b-8d64-4341-abfe-da2df788ba5c" Dec 02 07:41:54 crc kubenswrapper[4895]: I1202 07:41:53.947987 4895 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openstack-operators/openstack-operator-controller-manager-6979866f9f-j56w9" podUID="932b16cb-babd-4cd7-902c-03cd223e98bc" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.94:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 02 07:41:55 crc kubenswrapper[4895]: I1202 07:41:55.939697 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-m8cp2" Dec 02 07:41:55 crc kubenswrapper[4895]: E1202 07:41:55.942351 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-m8cp2" podUID="f6776d5f-3c3e-48b5-a6fd-30ff153345c2" Dec 02 07:41:55 crc kubenswrapper[4895]: I1202 07:41:55.953205 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-74cpb" Dec 02 07:41:55 crc kubenswrapper[4895]: I1202 07:41:55.953371 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-6546668bfd-5xsjx" Dec 02 07:41:55 crc kubenswrapper[4895]: E1202 07:41:55.955486 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-74cpb" podUID="582c057b-7217-47bf-b2d7-f691861668c3" Dec 02 07:41:55 crc kubenswrapper[4895]: E1202 07:41:55.956311 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling 
image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/manila-operator-controller-manager-6546668bfd-5xsjx" podUID="170932c6-4350-4209-ba99-ff53eecd81ee" Dec 02 07:41:55 crc kubenswrapper[4895]: I1202 07:41:55.973640 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-668d9c48b9-g8gjq" Dec 02 07:41:55 crc kubenswrapper[4895]: E1202 07:41:55.975840 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/glance-operator-controller-manager-668d9c48b9-g8gjq" podUID="362234fb-b096-48e0-9be1-bed6b3e1dcf6" Dec 02 07:41:56 crc kubenswrapper[4895]: I1202 07:41:56.186702 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-qxbgv" Dec 02 07:41:56 crc kubenswrapper[4895]: I1202 07:41:56.192587 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-z6fb4" Dec 02 07:41:56 crc kubenswrapper[4895]: E1202 07:41:56.226091 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-qxbgv" podUID="392adf20-0169-4258-9f4f-bb293bd5f8e8" Dec 02 07:41:56 crc kubenswrapper[4895]: E1202 07:41:56.226288 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" 
pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-z6fb4" podUID="cb79c25e-42b0-4c89-b756-89d97afeea8a" Dec 02 07:41:56 crc kubenswrapper[4895]: I1202 07:41:56.408027 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-998648c74-2mrjr" Dec 02 07:41:56 crc kubenswrapper[4895]: I1202 07:41:56.571792 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-cp9zj" Dec 02 07:41:59 crc kubenswrapper[4895]: E1202 07:41:59.153308 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-bhzzl" podUID="82a2bf22-3682-4982-b4fc-87ac78873cce" Dec 02 07:42:01 crc kubenswrapper[4895]: E1202 07:42:01.328653 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-4hpbt" podUID="2a05da0d-5cc8-4656-8cf4-96b96077d708" Dec 02 07:42:01 crc kubenswrapper[4895]: E1202 07:42:01.417302 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-rqm56" podUID="1df4be5d-9e12-4799-aaa8-1ec5bfa11a2c" Dec 02 07:42:01 crc kubenswrapper[4895]: I1202 07:42:01.700852 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-hcnp5" event={"ID":"ad99804b-869f-4b42-89ab-d29341434b61","Type":"ContainerStarted","Data":"5190d2fd89bcf00027c80bc2935e3f1d066895ee23916fcc8bc2f0458c371e2f"} Dec 02 07:42:01 crc kubenswrapper[4895]: I1202 07:42:01.702247 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-hcnp5" Dec 02 07:42:01 crc kubenswrapper[4895]: I1202 07:42:01.706888 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-hcnp5" Dec 02 07:42:01 crc kubenswrapper[4895]: I1202 07:42:01.709147 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-q6n97" event={"ID":"991002a6-abdd-43e4-b22d-1d95383d3b96","Type":"ContainerStarted","Data":"41e59df1518f79e7e9ff65f7a101059c03df23b997d488f576c1f07cb6f2b8a6"} Dec 02 07:42:01 crc kubenswrapper[4895]: I1202 07:42:01.709413 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-78f8948974-q6n97" Dec 02 07:42:01 crc kubenswrapper[4895]: I1202 07:42:01.715123 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-78f8948974-q6n97" Dec 02 07:42:01 crc kubenswrapper[4895]: I1202 07:42:01.718186 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-6x68d" event={"ID":"a5a01f83-cddf-479d-b6d0-7944d70c0bdd","Type":"ContainerStarted","Data":"81b0475fce6cb7db42a9a2aeac618d55753671c71bbf8e72c7e965986c12897f"} Dec 02 07:42:01 crc kubenswrapper[4895]: I1202 07:42:01.718347 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5854674fcc-6x68d" 
Dec 02 07:42:01 crc kubenswrapper[4895]: I1202 07:42:01.721465 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5854674fcc-6x68d" Dec 02 07:42:01 crc kubenswrapper[4895]: I1202 07:42:01.721934 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-4hpbt" event={"ID":"2a05da0d-5cc8-4656-8cf4-96b96077d708","Type":"ContainerStarted","Data":"330a2081c04ebbe8d8a876004de450f828a97bbdb4651a34f676011febfd7bac"} Dec 02 07:42:01 crc kubenswrapper[4895]: I1202 07:42:01.727059 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-rqm56" event={"ID":"1df4be5d-9e12-4799-aaa8-1ec5bfa11a2c","Type":"ContainerStarted","Data":"71d4bb901a6baa5e9e8b51f1af6bbaf92f4b6be703a353a331896aa79babcbc6"} Dec 02 07:42:01 crc kubenswrapper[4895]: E1202 07:42:01.729290 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:2a3d21728a8bfb4e64617e63e61e2d1cb70a383ea3e8f846e0c3c3c02d2b0a9d\\\"\"" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-rqm56" podUID="1df4be5d-9e12-4799-aaa8-1ec5bfa11a2c" Dec 02 07:42:01 crc kubenswrapper[4895]: I1202 07:42:01.729975 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-hcnp5" podStartSLOduration=5.197973882 podStartE2EDuration="1m6.729963944s" podCreationTimestamp="2025-12-02 07:40:55 +0000 UTC" firstStartedPulling="2025-12-02 07:40:59.204408192 +0000 UTC m=+1070.375267805" lastFinishedPulling="2025-12-02 07:42:00.736398254 +0000 UTC m=+1131.907257867" observedRunningTime="2025-12-02 07:42:01.727819168 +0000 UTC m=+1132.898678781" watchObservedRunningTime="2025-12-02 
07:42:01.729963944 +0000 UTC m=+1132.900823557" Dec 02 07:42:01 crc kubenswrapper[4895]: I1202 07:42:01.733519 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-wmsb6" event={"ID":"86fe6ea0-2ba9-46f8-9a71-1b990d841e31","Type":"ContainerStarted","Data":"c4cad083deebf382af5a9c9f5d2151ca9914d94f276e502618d86dc89dcdc994"} Dec 02 07:42:01 crc kubenswrapper[4895]: I1202 07:42:01.733653 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-57548d458d-wmsb6" Dec 02 07:42:01 crc kubenswrapper[4895]: I1202 07:42:01.741994 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-9zczx" event={"ID":"ca1b5423-1f2c-4b12-9ae9-f65bb5301c51","Type":"ContainerStarted","Data":"07894a3b4fd3027b25ec3559ba1ef816b96b2b69b12566d06ffa49a4438166f7"} Dec 02 07:42:01 crc kubenswrapper[4895]: I1202 07:42:01.742258 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-9zczx" Dec 02 07:42:01 crc kubenswrapper[4895]: I1202 07:42:01.745176 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-9zczx" Dec 02 07:42:01 crc kubenswrapper[4895]: I1202 07:42:01.746226 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-cp9zj" event={"ID":"4c5e704b-8d64-4341-abfe-da2df788ba5c","Type":"ContainerStarted","Data":"892bd304308573015cc3cb73904066b4457dce4a1fbc0720e74d07f54f966355"} Dec 02 07:42:01 crc kubenswrapper[4895]: I1202 07:42:01.752798 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-d5lnh" 
event={"ID":"a2cb057c-0f4a-4220-8666-3ccab3458be2","Type":"ContainerStarted","Data":"b32c72f7d429caa87dc6e9374d9da557e9c319a2226d23af514b2bec6d8c656d"} Dec 02 07:42:01 crc kubenswrapper[4895]: I1202 07:42:01.753467 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-d5lnh" Dec 02 07:42:01 crc kubenswrapper[4895]: I1202 07:42:01.755441 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-d5lnh" Dec 02 07:42:01 crc kubenswrapper[4895]: I1202 07:42:01.756791 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4c299z" event={"ID":"b8dc2edd-3bab-4a5d-a994-ba2212e85045","Type":"ContainerStarted","Data":"6e4272d057ed8cb4e7be61ae7a2546cb026778748e819ddbd06e6b6c58b14a49"} Dec 02 07:42:01 crc kubenswrapper[4895]: I1202 07:42:01.756861 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4c299z" Dec 02 07:42:01 crc kubenswrapper[4895]: I1202 07:42:01.773094 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-2mrjr" event={"ID":"98c4ec72-ccff-439b-af96-53775411d965","Type":"ContainerStarted","Data":"48d2ee6d726d7cbcf56d32d6ecf40c38741ffd35fefbcb8a0693ec77add3ba8e"} Dec 02 07:42:01 crc kubenswrapper[4895]: I1202 07:42:01.779498 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-4642q" event={"ID":"f4639bb3-56b6-498a-bb7d-ab26b46fe806","Type":"ContainerStarted","Data":"059c7e2422be1932bf4725c3e6f5697ba2bcf3ac2384cea3930aff86ba3b0c28"} Dec 02 07:42:01 crc kubenswrapper[4895]: I1202 07:42:01.781019 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-4642q" Dec 02 07:42:01 crc kubenswrapper[4895]: I1202 07:42:01.786953 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-4642q" Dec 02 07:42:01 crc kubenswrapper[4895]: I1202 07:42:01.802069 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-w52df" event={"ID":"73f50459-103c-461c-a71a-95e93d66c4c2","Type":"ContainerStarted","Data":"40d7e6d7aae3404bdd6c30a76ce214f1a9851f8739c57a76868548d6868949ce"} Dec 02 07:42:01 crc kubenswrapper[4895]: I1202 07:42:01.803121 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-w52df" Dec 02 07:42:01 crc kubenswrapper[4895]: I1202 07:42:01.806516 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-w52df" Dec 02 07:42:01 crc kubenswrapper[4895]: I1202 07:42:01.818289 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-sbrp2" event={"ID":"806c9a5b-16ad-499d-8625-ec9124baca56","Type":"ContainerStarted","Data":"e21b9542a697667573923bc9bb0035d413b1de2a1183658bf0b9464832dede9f"} Dec 02 07:42:01 crc kubenswrapper[4895]: I1202 07:42:01.819642 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-sbrp2" Dec 02 07:42:01 crc kubenswrapper[4895]: I1202 07:42:01.827400 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-sbrp2" Dec 02 07:42:01 crc kubenswrapper[4895]: I1202 07:42:01.829495 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-mgtj6" event={"ID":"442a4a4d-98fb-4869-9418-7f8f3ff4644b","Type":"ContainerStarted","Data":"26d1bd606464bc870815d6f5fe20554ff555fdc8fb003eedd425c55849605745"} Dec 02 07:42:01 crc kubenswrapper[4895]: I1202 07:42:01.830080 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-mgtj6" Dec 02 07:42:01 crc kubenswrapper[4895]: I1202 07:42:01.860475 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-mgtj6" Dec 02 07:42:01 crc kubenswrapper[4895]: I1202 07:42:01.887633 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5854674fcc-6x68d" podStartSLOduration=5.239302876 podStartE2EDuration="1m6.887611361s" podCreationTimestamp="2025-12-02 07:40:55 +0000 UTC" firstStartedPulling="2025-12-02 07:40:59.178999376 +0000 UTC m=+1070.349858979" lastFinishedPulling="2025-12-02 07:42:00.827307861 +0000 UTC m=+1131.998167464" observedRunningTime="2025-12-02 07:42:01.860860656 +0000 UTC m=+1133.031720289" watchObservedRunningTime="2025-12-02 07:42:01.887611361 +0000 UTC m=+1133.058470974" Dec 02 07:42:01 crc kubenswrapper[4895]: I1202 07:42:01.888582 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-78f8948974-q6n97" podStartSLOduration=5.368330562 podStartE2EDuration="1m6.888576651s" podCreationTimestamp="2025-12-02 07:40:55 +0000 UTC" firstStartedPulling="2025-12-02 07:40:59.209107928 +0000 UTC m=+1070.379967541" lastFinishedPulling="2025-12-02 07:42:00.729354017 +0000 UTC m=+1131.900213630" observedRunningTime="2025-12-02 07:42:01.887082394 +0000 UTC m=+1133.057942027" watchObservedRunningTime="2025-12-02 07:42:01.888576651 +0000 UTC m=+1133.059436264" Dec 02 07:42:01 crc 
kubenswrapper[4895]: I1202 07:42:01.924379 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-998648c74-2mrjr" podStartSLOduration=5.286940015 podStartE2EDuration="1m6.924354575s" podCreationTimestamp="2025-12-02 07:40:55 +0000 UTC" firstStartedPulling="2025-12-02 07:40:59.160235585 +0000 UTC m=+1070.331095188" lastFinishedPulling="2025-12-02 07:42:00.797650135 +0000 UTC m=+1131.968509748" observedRunningTime="2025-12-02 07:42:01.917305497 +0000 UTC m=+1133.088165110" watchObservedRunningTime="2025-12-02 07:42:01.924354575 +0000 UTC m=+1133.095214188" Dec 02 07:42:01 crc kubenswrapper[4895]: I1202 07:42:01.951307 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-d5lnh" podStartSLOduration=5.111406076 podStartE2EDuration="1m6.951285616s" podCreationTimestamp="2025-12-02 07:40:55 +0000 UTC" firstStartedPulling="2025-12-02 07:40:59.16266677 +0000 UTC m=+1070.333526383" lastFinishedPulling="2025-12-02 07:42:01.00254631 +0000 UTC m=+1132.173405923" observedRunningTime="2025-12-02 07:42:01.946199729 +0000 UTC m=+1133.117059352" watchObservedRunningTime="2025-12-02 07:42:01.951285616 +0000 UTC m=+1133.122145229" Dec 02 07:42:02 crc kubenswrapper[4895]: I1202 07:42:02.016574 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-sbrp2" podStartSLOduration=4.968387702 podStartE2EDuration="1m7.016549291s" podCreationTimestamp="2025-12-02 07:40:55 +0000 UTC" firstStartedPulling="2025-12-02 07:40:58.673194551 +0000 UTC m=+1069.844054164" lastFinishedPulling="2025-12-02 07:42:00.7213561 +0000 UTC m=+1131.892215753" observedRunningTime="2025-12-02 07:42:02.015151298 +0000 UTC m=+1133.186010901" watchObservedRunningTime="2025-12-02 07:42:02.016549291 +0000 UTC m=+1133.187408904" Dec 02 07:42:02 crc 
kubenswrapper[4895]: I1202 07:42:02.030226 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-w52df" podStartSLOduration=4.968006888 podStartE2EDuration="1m7.030204212s" podCreationTimestamp="2025-12-02 07:40:55 +0000 UTC" firstStartedPulling="2025-12-02 07:40:58.647847807 +0000 UTC m=+1069.818707420" lastFinishedPulling="2025-12-02 07:42:00.710045121 +0000 UTC m=+1131.880904744" observedRunningTime="2025-12-02 07:42:01.981038845 +0000 UTC m=+1133.151898448" watchObservedRunningTime="2025-12-02 07:42:02.030204212 +0000 UTC m=+1133.201063825" Dec 02 07:42:02 crc kubenswrapper[4895]: I1202 07:42:02.067378 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4c299z" podStartSLOduration=51.778731139 podStartE2EDuration="1m7.067352879s" podCreationTimestamp="2025-12-02 07:40:55 +0000 UTC" firstStartedPulling="2025-12-02 07:41:45.405669974 +0000 UTC m=+1116.576529587" lastFinishedPulling="2025-12-02 07:42:00.694291674 +0000 UTC m=+1131.865151327" observedRunningTime="2025-12-02 07:42:02.0592589 +0000 UTC m=+1133.230118533" watchObservedRunningTime="2025-12-02 07:42:02.067352879 +0000 UTC m=+1133.238212492" Dec 02 07:42:02 crc kubenswrapper[4895]: I1202 07:42:02.106431 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-mgtj6" podStartSLOduration=4.926048891 podStartE2EDuration="1m7.106408105s" podCreationTimestamp="2025-12-02 07:40:55 +0000 UTC" firstStartedPulling="2025-12-02 07:40:58.593163876 +0000 UTC m=+1069.764023489" lastFinishedPulling="2025-12-02 07:42:00.7735231 +0000 UTC m=+1131.944382703" observedRunningTime="2025-12-02 07:42:02.103354041 +0000 UTC m=+1133.274213654" watchObservedRunningTime="2025-12-02 07:42:02.106408105 +0000 UTC m=+1133.277267718" Dec 02 07:42:02 crc 
kubenswrapper[4895]: I1202 07:42:02.134634 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-9zczx" podStartSLOduration=5.300138612 podStartE2EDuration="1m7.134608376s" podCreationTimestamp="2025-12-02 07:40:55 +0000 UTC" firstStartedPulling="2025-12-02 07:40:59.130848527 +0000 UTC m=+1070.301708140" lastFinishedPulling="2025-12-02 07:42:00.965318291 +0000 UTC m=+1132.136177904" observedRunningTime="2025-12-02 07:42:02.13183457 +0000 UTC m=+1133.302694183" watchObservedRunningTime="2025-12-02 07:42:02.134608376 +0000 UTC m=+1133.305467989" Dec 02 07:42:02 crc kubenswrapper[4895]: I1202 07:42:02.160196 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-4642q" podStartSLOduration=5.636914991 podStartE2EDuration="1m7.160168185s" podCreationTimestamp="2025-12-02 07:40:55 +0000 UTC" firstStartedPulling="2025-12-02 07:40:59.184990711 +0000 UTC m=+1070.355850314" lastFinishedPulling="2025-12-02 07:42:00.708243855 +0000 UTC m=+1131.879103508" observedRunningTime="2025-12-02 07:42:02.155367136 +0000 UTC m=+1133.326226749" watchObservedRunningTime="2025-12-02 07:42:02.160168185 +0000 UTC m=+1133.331027798" Dec 02 07:42:02 crc kubenswrapper[4895]: I1202 07:42:02.232490 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-cp9zj" podStartSLOduration=5.461511385 podStartE2EDuration="1m7.232451906s" podCreationTimestamp="2025-12-02 07:40:55 +0000 UTC" firstStartedPulling="2025-12-02 07:40:59.025971372 +0000 UTC m=+1070.196830985" lastFinishedPulling="2025-12-02 07:42:00.796911893 +0000 UTC m=+1131.967771506" observedRunningTime="2025-12-02 07:42:02.188153549 +0000 UTC m=+1133.359013162" watchObservedRunningTime="2025-12-02 07:42:02.232451906 +0000 UTC m=+1133.403311519" Dec 02 07:42:02 crc kubenswrapper[4895]: 
I1202 07:42:02.292065 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-57548d458d-wmsb6" podStartSLOduration=52.722813403 podStartE2EDuration="1m7.292037095s" podCreationTimestamp="2025-12-02 07:40:55 +0000 UTC" firstStartedPulling="2025-12-02 07:41:46.124816274 +0000 UTC m=+1117.295675887" lastFinishedPulling="2025-12-02 07:42:00.694039916 +0000 UTC m=+1131.864899579" observedRunningTime="2025-12-02 07:42:02.215720719 +0000 UTC m=+1133.386580352" watchObservedRunningTime="2025-12-02 07:42:02.292037095 +0000 UTC m=+1133.462896708" Dec 02 07:42:02 crc kubenswrapper[4895]: I1202 07:42:02.843134 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4c299z" event={"ID":"b8dc2edd-3bab-4a5d-a994-ba2212e85045","Type":"ContainerStarted","Data":"250e096821e323de771c381e4d7173e8b79d749f1887dba186b0aaa0b0fdb681"} Dec 02 07:42:02 crc kubenswrapper[4895]: I1202 07:42:02.847520 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-4hpbt" event={"ID":"2a05da0d-5cc8-4656-8cf4-96b96077d708","Type":"ContainerStarted","Data":"0908bcfcf9abffcfa5e1f62e5dc7f9929a2d698c71370663924c2600b2b89fdb"} Dec 02 07:42:02 crc kubenswrapper[4895]: I1202 07:42:02.849268 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-4hpbt" Dec 02 07:42:02 crc kubenswrapper[4895]: I1202 07:42:02.853258 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-wmsb6" event={"ID":"86fe6ea0-2ba9-46f8-9a71-1b990d841e31","Type":"ContainerStarted","Data":"cff92c4d637ee7332c9a06108446203ca1ca2aad404ad33cd714a0f2da9c25e9"} Dec 02 07:42:02 crc kubenswrapper[4895]: I1202 07:42:02.882153 4895 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-6979866f9f-j56w9" Dec 02 07:42:02 crc kubenswrapper[4895]: I1202 07:42:02.884956 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-4hpbt" podStartSLOduration=4.450310253 podStartE2EDuration="1m7.884936387s" podCreationTimestamp="2025-12-02 07:40:55 +0000 UTC" firstStartedPulling="2025-12-02 07:40:59.054959989 +0000 UTC m=+1070.225819602" lastFinishedPulling="2025-12-02 07:42:02.489586123 +0000 UTC m=+1133.660445736" observedRunningTime="2025-12-02 07:42:02.876324631 +0000 UTC m=+1134.047184244" watchObservedRunningTime="2025-12-02 07:42:02.884936387 +0000 UTC m=+1134.055796000" Dec 02 07:42:05 crc kubenswrapper[4895]: I1202 07:42:05.473388 4895 patch_prober.go:28] interesting pod/machine-config-daemon-wfcg7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 07:42:05 crc kubenswrapper[4895]: I1202 07:42:05.473983 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 07:42:08 crc kubenswrapper[4895]: I1202 07:42:08.907302 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-qxbgv" event={"ID":"392adf20-0169-4258-9f4f-bb293bd5f8e8","Type":"ContainerStarted","Data":"86603252ca2bbbbf759554d0ec8d2d62ec0c6ae79150eb4fa31f61950b3a4edf"} Dec 02 07:42:08 crc kubenswrapper[4895]: I1202 07:42:08.908679 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-m8cp2" event={"ID":"f6776d5f-3c3e-48b5-a6fd-30ff153345c2","Type":"ContainerStarted","Data":"ccd6640f3e53467e12fdc3ec3824832914f354ce4b7e5af02f8d2ccbfd4906fd"} Dec 02 07:42:08 crc kubenswrapper[4895]: I1202 07:42:08.910424 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-6546668bfd-5xsjx" event={"ID":"170932c6-4350-4209-ba99-ff53eecd81ee","Type":"ContainerStarted","Data":"c130b1ac3b2b58a919d370345920a448027e5fed1e2814a52ee97e5ab44eefb4"} Dec 02 07:42:08 crc kubenswrapper[4895]: I1202 07:42:08.912124 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-668d9c48b9-g8gjq" event={"ID":"362234fb-b096-48e0-9be1-bed6b3e1dcf6","Type":"ContainerStarted","Data":"e16f1a5769e1e883001da68787aa1c69db3f5f2dddbb553a6b9532613e2d4f55"} Dec 02 07:42:08 crc kubenswrapper[4895]: I1202 07:42:08.929642 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-qxbgv" podStartSLOduration=24.396995712 podStartE2EDuration="1m13.929615499s" podCreationTimestamp="2025-12-02 07:40:55 +0000 UTC" firstStartedPulling="2025-12-02 07:40:59.194441924 +0000 UTC m=+1070.365301537" lastFinishedPulling="2025-12-02 07:41:48.727061701 +0000 UTC m=+1119.897921324" observedRunningTime="2025-12-02 07:42:08.923999406 +0000 UTC m=+1140.094859029" watchObservedRunningTime="2025-12-02 07:42:08.929615499 +0000 UTC m=+1140.100475132" Dec 02 07:42:08 crc kubenswrapper[4895]: I1202 07:42:08.954881 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-6546668bfd-5xsjx" podStartSLOduration=23.743891089999998 podStartE2EDuration="1m13.954855958s" podCreationTimestamp="2025-12-02 07:40:55 +0000 UTC" firstStartedPulling="2025-12-02 07:40:58.714309242 +0000 UTC 
m=+1069.885168855" lastFinishedPulling="2025-12-02 07:41:48.92527411 +0000 UTC m=+1120.096133723" observedRunningTime="2025-12-02 07:42:08.948905294 +0000 UTC m=+1140.119764927" watchObservedRunningTime="2025-12-02 07:42:08.954855958 +0000 UTC m=+1140.125715581" Dec 02 07:42:08 crc kubenswrapper[4895]: I1202 07:42:08.979076 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-m8cp2" podStartSLOduration=25.874888793 podStartE2EDuration="1m13.979049195s" podCreationTimestamp="2025-12-02 07:40:55 +0000 UTC" firstStartedPulling="2025-12-02 07:40:59.04107019 +0000 UTC m=+1070.211929803" lastFinishedPulling="2025-12-02 07:41:47.145230592 +0000 UTC m=+1118.316090205" observedRunningTime="2025-12-02 07:42:08.975090692 +0000 UTC m=+1140.145950315" watchObservedRunningTime="2025-12-02 07:42:08.979049195 +0000 UTC m=+1140.149908808" Dec 02 07:42:09 crc kubenswrapper[4895]: I1202 07:42:09.014686 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-668d9c48b9-g8gjq" podStartSLOduration=24.283034234 podStartE2EDuration="1m14.014666794s" podCreationTimestamp="2025-12-02 07:40:55 +0000 UTC" firstStartedPulling="2025-12-02 07:40:59.04138178 +0000 UTC m=+1070.212241393" lastFinishedPulling="2025-12-02 07:41:48.77301434 +0000 UTC m=+1119.943873953" observedRunningTime="2025-12-02 07:42:09.012300651 +0000 UTC m=+1140.183160264" watchObservedRunningTime="2025-12-02 07:42:09.014666794 +0000 UTC m=+1140.185526407" Dec 02 07:42:10 crc kubenswrapper[4895]: I1202 07:42:10.933771 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-74cpb" event={"ID":"582c057b-7217-47bf-b2d7-f691861668c3","Type":"ContainerStarted","Data":"ae5a8eadb3f70f05f07d9a714082f79e08d4ab0150b55d379034b67f63605109"} Dec 02 07:42:10 crc kubenswrapper[4895]: I1202 07:42:10.964098 
4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-74cpb" podStartSLOduration=26.3068063 podStartE2EDuration="1m15.96406716s" podCreationTimestamp="2025-12-02 07:40:55 +0000 UTC" firstStartedPulling="2025-12-02 07:40:59.115366678 +0000 UTC m=+1070.286226291" lastFinishedPulling="2025-12-02 07:41:48.772627528 +0000 UTC m=+1119.943487151" observedRunningTime="2025-12-02 07:42:10.95985497 +0000 UTC m=+1142.130714643" watchObservedRunningTime="2025-12-02 07:42:10.96406716 +0000 UTC m=+1142.134926813" Dec 02 07:42:11 crc kubenswrapper[4895]: I1202 07:42:11.889512 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-57548d458d-wmsb6" Dec 02 07:42:12 crc kubenswrapper[4895]: I1202 07:42:12.162700 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4c299z" Dec 02 07:42:12 crc kubenswrapper[4895]: I1202 07:42:12.980303 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-z6fb4" event={"ID":"cb79c25e-42b0-4c89-b756-89d97afeea8a","Type":"ContainerStarted","Data":"2184e13f8beb834f4d58cc3201677aa06c0e78394a1924be680cbb0cbfbb4ca0"} Dec 02 07:42:13 crc kubenswrapper[4895]: I1202 07:42:13.012491 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-z6fb4" podStartSLOduration=29.566585514 podStartE2EDuration="1m18.012469401s" podCreationTimestamp="2025-12-02 07:40:55 +0000 UTC" firstStartedPulling="2025-12-02 07:40:58.627308042 +0000 UTC m=+1069.798167655" lastFinishedPulling="2025-12-02 07:41:47.073191929 +0000 UTC m=+1118.244051542" observedRunningTime="2025-12-02 07:42:13.004185805 +0000 UTC m=+1144.175045418" 
watchObservedRunningTime="2025-12-02 07:42:13.012469401 +0000 UTC m=+1144.183329014" Dec 02 07:42:13 crc kubenswrapper[4895]: I1202 07:42:13.992706 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-bhzzl" event={"ID":"82a2bf22-3682-4982-b4fc-87ac78873cce","Type":"ContainerStarted","Data":"15ddea1576cdb3159271ebcc045b2ff35ca311505070af82f1831b57a053bb85"} Dec 02 07:42:14 crc kubenswrapper[4895]: I1202 07:42:14.010941 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-bhzzl" podStartSLOduration=3.582138522 podStartE2EDuration="1m18.010915143s" podCreationTimestamp="2025-12-02 07:40:56 +0000 UTC" firstStartedPulling="2025-12-02 07:40:59.188615993 +0000 UTC m=+1070.359475606" lastFinishedPulling="2025-12-02 07:42:13.617392614 +0000 UTC m=+1144.788252227" observedRunningTime="2025-12-02 07:42:14.009568911 +0000 UTC m=+1145.180428554" watchObservedRunningTime="2025-12-02 07:42:14.010915143 +0000 UTC m=+1145.181774796" Dec 02 07:42:15 crc kubenswrapper[4895]: I1202 07:42:15.002443 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-rqm56" event={"ID":"1df4be5d-9e12-4799-aaa8-1ec5bfa11a2c","Type":"ContainerStarted","Data":"de06215f5c7eb9edd448ab5e0a5e3ad16acc87de2816edcd2200694d7331eb63"} Dec 02 07:42:15 crc kubenswrapper[4895]: I1202 07:42:15.003374 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-rqm56" Dec 02 07:42:15 crc kubenswrapper[4895]: I1202 07:42:15.027806 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-rqm56" podStartSLOduration=4.489186751 podStartE2EDuration="1m20.027778161s" podCreationTimestamp="2025-12-02 07:40:55 +0000 UTC" 
firstStartedPulling="2025-12-02 07:40:59.186705375 +0000 UTC m=+1070.357564988" lastFinishedPulling="2025-12-02 07:42:14.725296785 +0000 UTC m=+1145.896156398" observedRunningTime="2025-12-02 07:42:15.021678564 +0000 UTC m=+1146.192538197" watchObservedRunningTime="2025-12-02 07:42:15.027778161 +0000 UTC m=+1146.198637784" Dec 02 07:42:16 crc kubenswrapper[4895]: I1202 07:42:16.341858 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-4hpbt" Dec 02 07:42:26 crc kubenswrapper[4895]: I1202 07:42:26.944864 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-rqm56" Dec 02 07:42:35 crc kubenswrapper[4895]: I1202 07:42:35.473885 4895 patch_prober.go:28] interesting pod/machine-config-daemon-wfcg7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 07:42:35 crc kubenswrapper[4895]: I1202 07:42:35.474570 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 07:42:35 crc kubenswrapper[4895]: I1202 07:42:35.474628 4895 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" Dec 02 07:42:35 crc kubenswrapper[4895]: I1202 07:42:35.475406 4895 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2f198fe0feb728e97ed5c4b77927f34e37b1755009f4a942cf361750a2e15740"} 
pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 07:42:35 crc kubenswrapper[4895]: I1202 07:42:35.475466 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerName="machine-config-daemon" containerID="cri-o://2f198fe0feb728e97ed5c4b77927f34e37b1755009f4a942cf361750a2e15740" gracePeriod=600 Dec 02 07:42:37 crc kubenswrapper[4895]: I1202 07:42:37.232455 4895 generic.go:334] "Generic (PLEG): container finished" podID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerID="2f198fe0feb728e97ed5c4b77927f34e37b1755009f4a942cf361750a2e15740" exitCode=0 Dec 02 07:42:37 crc kubenswrapper[4895]: I1202 07:42:37.232546 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" event={"ID":"0468d2d1-a975-45a6-af9f-47adc432fab0","Type":"ContainerDied","Data":"2f198fe0feb728e97ed5c4b77927f34e37b1755009f4a942cf361750a2e15740"} Dec 02 07:42:37 crc kubenswrapper[4895]: I1202 07:42:37.233058 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" event={"ID":"0468d2d1-a975-45a6-af9f-47adc432fab0","Type":"ContainerStarted","Data":"a143326e40e351d8dd85edf0fa1f56c57dc56e760d18e0c6ec782a546a0196af"} Dec 02 07:42:37 crc kubenswrapper[4895]: I1202 07:42:37.233086 4895 scope.go:117] "RemoveContainer" containerID="12a9227e27ad8d7bc29431661ef9209e2bb61dd12d583d4b2e7609ed8ada972b" Dec 02 07:42:41 crc kubenswrapper[4895]: I1202 07:42:41.753001 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-4l528"] Dec 02 07:42:41 crc kubenswrapper[4895]: I1202 07:42:41.754840 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-4l528" Dec 02 07:42:41 crc kubenswrapper[4895]: I1202 07:42:41.758072 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Dec 02 07:42:41 crc kubenswrapper[4895]: I1202 07:42:41.758897 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-ks72r" Dec 02 07:42:41 crc kubenswrapper[4895]: I1202 07:42:41.759000 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Dec 02 07:42:41 crc kubenswrapper[4895]: I1202 07:42:41.759332 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Dec 02 07:42:41 crc kubenswrapper[4895]: I1202 07:42:41.769187 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-4l528"] Dec 02 07:42:41 crc kubenswrapper[4895]: I1202 07:42:41.845425 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-rn4nh"] Dec 02 07:42:41 crc kubenswrapper[4895]: I1202 07:42:41.847963 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-rn4nh" Dec 02 07:42:41 crc kubenswrapper[4895]: I1202 07:42:41.849824 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Dec 02 07:42:41 crc kubenswrapper[4895]: I1202 07:42:41.865888 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-rn4nh"] Dec 02 07:42:41 crc kubenswrapper[4895]: I1202 07:42:41.898054 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2d3a26e-3783-4887-bd28-cd19c12a3111-config\") pod \"dnsmasq-dns-78dd6ddcc-rn4nh\" (UID: \"f2d3a26e-3783-4887-bd28-cd19c12a3111\") " pod="openstack/dnsmasq-dns-78dd6ddcc-rn4nh" Dec 02 07:42:41 crc kubenswrapper[4895]: I1202 07:42:41.898121 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6eb81c1-35cc-4330-9889-e37a864f1217-config\") pod \"dnsmasq-dns-675f4bcbfc-4l528\" (UID: \"a6eb81c1-35cc-4330-9889-e37a864f1217\") " pod="openstack/dnsmasq-dns-675f4bcbfc-4l528" Dec 02 07:42:41 crc kubenswrapper[4895]: I1202 07:42:41.898155 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f2d3a26e-3783-4887-bd28-cd19c12a3111-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-rn4nh\" (UID: \"f2d3a26e-3783-4887-bd28-cd19c12a3111\") " pod="openstack/dnsmasq-dns-78dd6ddcc-rn4nh" Dec 02 07:42:41 crc kubenswrapper[4895]: I1202 07:42:41.898230 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xhxj\" (UniqueName: \"kubernetes.io/projected/f2d3a26e-3783-4887-bd28-cd19c12a3111-kube-api-access-7xhxj\") pod \"dnsmasq-dns-78dd6ddcc-rn4nh\" (UID: \"f2d3a26e-3783-4887-bd28-cd19c12a3111\") " pod="openstack/dnsmasq-dns-78dd6ddcc-rn4nh" Dec 02 07:42:41 
crc kubenswrapper[4895]: I1202 07:42:41.898295 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72shz\" (UniqueName: \"kubernetes.io/projected/a6eb81c1-35cc-4330-9889-e37a864f1217-kube-api-access-72shz\") pod \"dnsmasq-dns-675f4bcbfc-4l528\" (UID: \"a6eb81c1-35cc-4330-9889-e37a864f1217\") " pod="openstack/dnsmasq-dns-675f4bcbfc-4l528" Dec 02 07:42:42 crc kubenswrapper[4895]: I1202 07:42:41.999999 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7xhxj\" (UniqueName: \"kubernetes.io/projected/f2d3a26e-3783-4887-bd28-cd19c12a3111-kube-api-access-7xhxj\") pod \"dnsmasq-dns-78dd6ddcc-rn4nh\" (UID: \"f2d3a26e-3783-4887-bd28-cd19c12a3111\") " pod="openstack/dnsmasq-dns-78dd6ddcc-rn4nh" Dec 02 07:42:42 crc kubenswrapper[4895]: I1202 07:42:42.000084 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72shz\" (UniqueName: \"kubernetes.io/projected/a6eb81c1-35cc-4330-9889-e37a864f1217-kube-api-access-72shz\") pod \"dnsmasq-dns-675f4bcbfc-4l528\" (UID: \"a6eb81c1-35cc-4330-9889-e37a864f1217\") " pod="openstack/dnsmasq-dns-675f4bcbfc-4l528" Dec 02 07:42:42 crc kubenswrapper[4895]: I1202 07:42:42.000217 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2d3a26e-3783-4887-bd28-cd19c12a3111-config\") pod \"dnsmasq-dns-78dd6ddcc-rn4nh\" (UID: \"f2d3a26e-3783-4887-bd28-cd19c12a3111\") " pod="openstack/dnsmasq-dns-78dd6ddcc-rn4nh" Dec 02 07:42:42 crc kubenswrapper[4895]: I1202 07:42:42.000289 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6eb81c1-35cc-4330-9889-e37a864f1217-config\") pod \"dnsmasq-dns-675f4bcbfc-4l528\" (UID: \"a6eb81c1-35cc-4330-9889-e37a864f1217\") " pod="openstack/dnsmasq-dns-675f4bcbfc-4l528" Dec 02 07:42:42 crc kubenswrapper[4895]: 
I1202 07:42:42.000316 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f2d3a26e-3783-4887-bd28-cd19c12a3111-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-rn4nh\" (UID: \"f2d3a26e-3783-4887-bd28-cd19c12a3111\") " pod="openstack/dnsmasq-dns-78dd6ddcc-rn4nh" Dec 02 07:42:42 crc kubenswrapper[4895]: I1202 07:42:42.001235 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f2d3a26e-3783-4887-bd28-cd19c12a3111-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-rn4nh\" (UID: \"f2d3a26e-3783-4887-bd28-cd19c12a3111\") " pod="openstack/dnsmasq-dns-78dd6ddcc-rn4nh" Dec 02 07:42:42 crc kubenswrapper[4895]: I1202 07:42:42.001270 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6eb81c1-35cc-4330-9889-e37a864f1217-config\") pod \"dnsmasq-dns-675f4bcbfc-4l528\" (UID: \"a6eb81c1-35cc-4330-9889-e37a864f1217\") " pod="openstack/dnsmasq-dns-675f4bcbfc-4l528" Dec 02 07:42:42 crc kubenswrapper[4895]: I1202 07:42:42.001829 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2d3a26e-3783-4887-bd28-cd19c12a3111-config\") pod \"dnsmasq-dns-78dd6ddcc-rn4nh\" (UID: \"f2d3a26e-3783-4887-bd28-cd19c12a3111\") " pod="openstack/dnsmasq-dns-78dd6ddcc-rn4nh" Dec 02 07:42:42 crc kubenswrapper[4895]: I1202 07:42:42.025250 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xhxj\" (UniqueName: \"kubernetes.io/projected/f2d3a26e-3783-4887-bd28-cd19c12a3111-kube-api-access-7xhxj\") pod \"dnsmasq-dns-78dd6ddcc-rn4nh\" (UID: \"f2d3a26e-3783-4887-bd28-cd19c12a3111\") " pod="openstack/dnsmasq-dns-78dd6ddcc-rn4nh" Dec 02 07:42:42 crc kubenswrapper[4895]: I1202 07:42:42.027499 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-72shz\" (UniqueName: 
\"kubernetes.io/projected/a6eb81c1-35cc-4330-9889-e37a864f1217-kube-api-access-72shz\") pod \"dnsmasq-dns-675f4bcbfc-4l528\" (UID: \"a6eb81c1-35cc-4330-9889-e37a864f1217\") " pod="openstack/dnsmasq-dns-675f4bcbfc-4l528" Dec 02 07:42:42 crc kubenswrapper[4895]: I1202 07:42:42.070850 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-4l528" Dec 02 07:42:42 crc kubenswrapper[4895]: I1202 07:42:42.169445 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-rn4nh" Dec 02 07:42:42 crc kubenswrapper[4895]: I1202 07:42:42.545114 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-4l528"] Dec 02 07:42:42 crc kubenswrapper[4895]: W1202 07:42:42.710620 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf2d3a26e_3783_4887_bd28_cd19c12a3111.slice/crio-20c8184c2aeff6fd413ee05613d69720967a50d26ecf971b003f988f02b73088 WatchSource:0}: Error finding container 20c8184c2aeff6fd413ee05613d69720967a50d26ecf971b003f988f02b73088: Status 404 returned error can't find the container with id 20c8184c2aeff6fd413ee05613d69720967a50d26ecf971b003f988f02b73088 Dec 02 07:42:42 crc kubenswrapper[4895]: I1202 07:42:42.711561 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-rn4nh"] Dec 02 07:42:43 crc kubenswrapper[4895]: I1202 07:42:43.301539 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-rn4nh" event={"ID":"f2d3a26e-3783-4887-bd28-cd19c12a3111","Type":"ContainerStarted","Data":"20c8184c2aeff6fd413ee05613d69720967a50d26ecf971b003f988f02b73088"} Dec 02 07:42:43 crc kubenswrapper[4895]: I1202 07:42:43.304492 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-4l528" 
event={"ID":"a6eb81c1-35cc-4330-9889-e37a864f1217","Type":"ContainerStarted","Data":"94bec48759c51a9c5112d163ba3e367424c7c2c0e75ee8850fd4143e756e7f7e"} Dec 02 07:42:43 crc kubenswrapper[4895]: I1202 07:42:43.881547 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-4l528"] Dec 02 07:42:43 crc kubenswrapper[4895]: I1202 07:42:43.932157 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-lvv8z"] Dec 02 07:42:43 crc kubenswrapper[4895]: I1202 07:42:43.934462 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-lvv8z" Dec 02 07:42:43 crc kubenswrapper[4895]: I1202 07:42:43.947062 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-lvv8z"] Dec 02 07:42:44 crc kubenswrapper[4895]: I1202 07:42:44.042824 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3b019ba5-e2aa-4ac7-a3a6-02e619195935-dns-svc\") pod \"dnsmasq-dns-666b6646f7-lvv8z\" (UID: \"3b019ba5-e2aa-4ac7-a3a6-02e619195935\") " pod="openstack/dnsmasq-dns-666b6646f7-lvv8z" Dec 02 07:42:44 crc kubenswrapper[4895]: I1202 07:42:44.043062 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wklbl\" (UniqueName: \"kubernetes.io/projected/3b019ba5-e2aa-4ac7-a3a6-02e619195935-kube-api-access-wklbl\") pod \"dnsmasq-dns-666b6646f7-lvv8z\" (UID: \"3b019ba5-e2aa-4ac7-a3a6-02e619195935\") " pod="openstack/dnsmasq-dns-666b6646f7-lvv8z" Dec 02 07:42:44 crc kubenswrapper[4895]: I1202 07:42:44.043135 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b019ba5-e2aa-4ac7-a3a6-02e619195935-config\") pod \"dnsmasq-dns-666b6646f7-lvv8z\" (UID: \"3b019ba5-e2aa-4ac7-a3a6-02e619195935\") " 
pod="openstack/dnsmasq-dns-666b6646f7-lvv8z" Dec 02 07:42:44 crc kubenswrapper[4895]: I1202 07:42:44.145312 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3b019ba5-e2aa-4ac7-a3a6-02e619195935-dns-svc\") pod \"dnsmasq-dns-666b6646f7-lvv8z\" (UID: \"3b019ba5-e2aa-4ac7-a3a6-02e619195935\") " pod="openstack/dnsmasq-dns-666b6646f7-lvv8z" Dec 02 07:42:44 crc kubenswrapper[4895]: I1202 07:42:44.145455 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wklbl\" (UniqueName: \"kubernetes.io/projected/3b019ba5-e2aa-4ac7-a3a6-02e619195935-kube-api-access-wklbl\") pod \"dnsmasq-dns-666b6646f7-lvv8z\" (UID: \"3b019ba5-e2aa-4ac7-a3a6-02e619195935\") " pod="openstack/dnsmasq-dns-666b6646f7-lvv8z" Dec 02 07:42:44 crc kubenswrapper[4895]: I1202 07:42:44.145482 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b019ba5-e2aa-4ac7-a3a6-02e619195935-config\") pod \"dnsmasq-dns-666b6646f7-lvv8z\" (UID: \"3b019ba5-e2aa-4ac7-a3a6-02e619195935\") " pod="openstack/dnsmasq-dns-666b6646f7-lvv8z" Dec 02 07:42:44 crc kubenswrapper[4895]: I1202 07:42:44.150006 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b019ba5-e2aa-4ac7-a3a6-02e619195935-config\") pod \"dnsmasq-dns-666b6646f7-lvv8z\" (UID: \"3b019ba5-e2aa-4ac7-a3a6-02e619195935\") " pod="openstack/dnsmasq-dns-666b6646f7-lvv8z" Dec 02 07:42:44 crc kubenswrapper[4895]: I1202 07:42:44.151321 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3b019ba5-e2aa-4ac7-a3a6-02e619195935-dns-svc\") pod \"dnsmasq-dns-666b6646f7-lvv8z\" (UID: \"3b019ba5-e2aa-4ac7-a3a6-02e619195935\") " pod="openstack/dnsmasq-dns-666b6646f7-lvv8z" Dec 02 07:42:44 crc kubenswrapper[4895]: I1202 07:42:44.180541 4895 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wklbl\" (UniqueName: \"kubernetes.io/projected/3b019ba5-e2aa-4ac7-a3a6-02e619195935-kube-api-access-wklbl\") pod \"dnsmasq-dns-666b6646f7-lvv8z\" (UID: \"3b019ba5-e2aa-4ac7-a3a6-02e619195935\") " pod="openstack/dnsmasq-dns-666b6646f7-lvv8z" Dec 02 07:42:44 crc kubenswrapper[4895]: I1202 07:42:44.224862 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-rn4nh"] Dec 02 07:42:44 crc kubenswrapper[4895]: I1202 07:42:44.262149 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-l7vgw"] Dec 02 07:42:44 crc kubenswrapper[4895]: I1202 07:42:44.263770 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-l7vgw" Dec 02 07:42:44 crc kubenswrapper[4895]: I1202 07:42:44.280564 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-l7vgw"] Dec 02 07:42:44 crc kubenswrapper[4895]: I1202 07:42:44.286255 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-lvv8z" Dec 02 07:42:44 crc kubenswrapper[4895]: I1202 07:42:44.451906 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kcj6w\" (UniqueName: \"kubernetes.io/projected/56de784f-037a-486b-8e7b-51d62e0dd0ad-kube-api-access-kcj6w\") pod \"dnsmasq-dns-57d769cc4f-l7vgw\" (UID: \"56de784f-037a-486b-8e7b-51d62e0dd0ad\") " pod="openstack/dnsmasq-dns-57d769cc4f-l7vgw" Dec 02 07:42:44 crc kubenswrapper[4895]: I1202 07:42:44.452029 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56de784f-037a-486b-8e7b-51d62e0dd0ad-config\") pod \"dnsmasq-dns-57d769cc4f-l7vgw\" (UID: \"56de784f-037a-486b-8e7b-51d62e0dd0ad\") " pod="openstack/dnsmasq-dns-57d769cc4f-l7vgw" Dec 02 07:42:44 crc kubenswrapper[4895]: I1202 07:42:44.452093 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/56de784f-037a-486b-8e7b-51d62e0dd0ad-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-l7vgw\" (UID: \"56de784f-037a-486b-8e7b-51d62e0dd0ad\") " pod="openstack/dnsmasq-dns-57d769cc4f-l7vgw" Dec 02 07:42:44 crc kubenswrapper[4895]: I1202 07:42:44.558015 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kcj6w\" (UniqueName: \"kubernetes.io/projected/56de784f-037a-486b-8e7b-51d62e0dd0ad-kube-api-access-kcj6w\") pod \"dnsmasq-dns-57d769cc4f-l7vgw\" (UID: \"56de784f-037a-486b-8e7b-51d62e0dd0ad\") " pod="openstack/dnsmasq-dns-57d769cc4f-l7vgw" Dec 02 07:42:44 crc kubenswrapper[4895]: I1202 07:42:44.560890 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56de784f-037a-486b-8e7b-51d62e0dd0ad-config\") pod \"dnsmasq-dns-57d769cc4f-l7vgw\" (UID: 
\"56de784f-037a-486b-8e7b-51d62e0dd0ad\") " pod="openstack/dnsmasq-dns-57d769cc4f-l7vgw" Dec 02 07:42:44 crc kubenswrapper[4895]: I1202 07:42:44.561063 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/56de784f-037a-486b-8e7b-51d62e0dd0ad-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-l7vgw\" (UID: \"56de784f-037a-486b-8e7b-51d62e0dd0ad\") " pod="openstack/dnsmasq-dns-57d769cc4f-l7vgw" Dec 02 07:42:44 crc kubenswrapper[4895]: I1202 07:42:44.562494 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/56de784f-037a-486b-8e7b-51d62e0dd0ad-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-l7vgw\" (UID: \"56de784f-037a-486b-8e7b-51d62e0dd0ad\") " pod="openstack/dnsmasq-dns-57d769cc4f-l7vgw" Dec 02 07:42:44 crc kubenswrapper[4895]: I1202 07:42:44.564272 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56de784f-037a-486b-8e7b-51d62e0dd0ad-config\") pod \"dnsmasq-dns-57d769cc4f-l7vgw\" (UID: \"56de784f-037a-486b-8e7b-51d62e0dd0ad\") " pod="openstack/dnsmasq-dns-57d769cc4f-l7vgw" Dec 02 07:42:44 crc kubenswrapper[4895]: I1202 07:42:44.580038 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kcj6w\" (UniqueName: \"kubernetes.io/projected/56de784f-037a-486b-8e7b-51d62e0dd0ad-kube-api-access-kcj6w\") pod \"dnsmasq-dns-57d769cc4f-l7vgw\" (UID: \"56de784f-037a-486b-8e7b-51d62e0dd0ad\") " pod="openstack/dnsmasq-dns-57d769cc4f-l7vgw" Dec 02 07:42:44 crc kubenswrapper[4895]: I1202 07:42:44.586512 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-l7vgw" Dec 02 07:42:44 crc kubenswrapper[4895]: I1202 07:42:44.978049 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-lvv8z"] Dec 02 07:42:44 crc kubenswrapper[4895]: W1202 07:42:44.988410 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3b019ba5_e2aa_4ac7_a3a6_02e619195935.slice/crio-d67570f5f95063dfacda7f2e25feeef193c2f24ad14f5df53e751ec87d66a9e2 WatchSource:0}: Error finding container d67570f5f95063dfacda7f2e25feeef193c2f24ad14f5df53e751ec87d66a9e2: Status 404 returned error can't find the container with id d67570f5f95063dfacda7f2e25feeef193c2f24ad14f5df53e751ec87d66a9e2 Dec 02 07:42:45 crc kubenswrapper[4895]: I1202 07:42:45.061345 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Dec 02 07:42:45 crc kubenswrapper[4895]: I1202 07:42:45.063529 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 02 07:42:45 crc kubenswrapper[4895]: I1202 07:42:45.066468 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Dec 02 07:42:45 crc kubenswrapper[4895]: I1202 07:42:45.066522 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-bsdnm" Dec 02 07:42:45 crc kubenswrapper[4895]: I1202 07:42:45.067147 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Dec 02 07:42:45 crc kubenswrapper[4895]: I1202 07:42:45.067261 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Dec 02 07:42:45 crc kubenswrapper[4895]: I1202 07:42:45.067349 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Dec 02 07:42:45 crc kubenswrapper[4895]: I1202 07:42:45.068053 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Dec 02 07:42:45 crc kubenswrapper[4895]: I1202 07:42:45.069421 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Dec 02 07:42:45 crc kubenswrapper[4895]: I1202 07:42:45.079286 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 02 07:42:45 crc kubenswrapper[4895]: I1202 07:42:45.173982 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-l7vgw"] Dec 02 07:42:45 crc kubenswrapper[4895]: I1202 07:42:45.185560 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ca98cba7-4127-4d25-a139-1a42224331f2-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"ca98cba7-4127-4d25-a139-1a42224331f2\") " pod="openstack/rabbitmq-server-0" Dec 02 07:42:45 crc kubenswrapper[4895]: I1202 
07:42:45.185610 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ca98cba7-4127-4d25-a139-1a42224331f2-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"ca98cba7-4127-4d25-a139-1a42224331f2\") " pod="openstack/rabbitmq-server-0" Dec 02 07:42:45 crc kubenswrapper[4895]: I1202 07:42:45.185637 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ca98cba7-4127-4d25-a139-1a42224331f2-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"ca98cba7-4127-4d25-a139-1a42224331f2\") " pod="openstack/rabbitmq-server-0" Dec 02 07:42:45 crc kubenswrapper[4895]: I1202 07:42:45.185663 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ca98cba7-4127-4d25-a139-1a42224331f2-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"ca98cba7-4127-4d25-a139-1a42224331f2\") " pod="openstack/rabbitmq-server-0" Dec 02 07:42:45 crc kubenswrapper[4895]: I1202 07:42:45.185684 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ca98cba7-4127-4d25-a139-1a42224331f2-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"ca98cba7-4127-4d25-a139-1a42224331f2\") " pod="openstack/rabbitmq-server-0" Dec 02 07:42:45 crc kubenswrapper[4895]: I1202 07:42:45.185702 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"ca98cba7-4127-4d25-a139-1a42224331f2\") " pod="openstack/rabbitmq-server-0" Dec 02 07:42:45 crc kubenswrapper[4895]: I1202 07:42:45.185721 4895 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ca98cba7-4127-4d25-a139-1a42224331f2-config-data\") pod \"rabbitmq-server-0\" (UID: \"ca98cba7-4127-4d25-a139-1a42224331f2\") " pod="openstack/rabbitmq-server-0" Dec 02 07:42:45 crc kubenswrapper[4895]: I1202 07:42:45.185795 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9r2n\" (UniqueName: \"kubernetes.io/projected/ca98cba7-4127-4d25-a139-1a42224331f2-kube-api-access-m9r2n\") pod \"rabbitmq-server-0\" (UID: \"ca98cba7-4127-4d25-a139-1a42224331f2\") " pod="openstack/rabbitmq-server-0" Dec 02 07:42:45 crc kubenswrapper[4895]: I1202 07:42:45.185819 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ca98cba7-4127-4d25-a139-1a42224331f2-server-conf\") pod \"rabbitmq-server-0\" (UID: \"ca98cba7-4127-4d25-a139-1a42224331f2\") " pod="openstack/rabbitmq-server-0" Dec 02 07:42:45 crc kubenswrapper[4895]: I1202 07:42:45.185848 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ca98cba7-4127-4d25-a139-1a42224331f2-pod-info\") pod \"rabbitmq-server-0\" (UID: \"ca98cba7-4127-4d25-a139-1a42224331f2\") " pod="openstack/rabbitmq-server-0" Dec 02 07:42:45 crc kubenswrapper[4895]: I1202 07:42:45.185871 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ca98cba7-4127-4d25-a139-1a42224331f2-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"ca98cba7-4127-4d25-a139-1a42224331f2\") " pod="openstack/rabbitmq-server-0" Dec 02 07:42:45 crc kubenswrapper[4895]: W1202 07:42:45.208398 4895 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod56de784f_037a_486b_8e7b_51d62e0dd0ad.slice/crio-c845940080c320a0eb0dfbc9a27798ca2ffcd7bdc550cdbbfff25f82359a77ee WatchSource:0}: Error finding container c845940080c320a0eb0dfbc9a27798ca2ffcd7bdc550cdbbfff25f82359a77ee: Status 404 returned error can't find the container with id c845940080c320a0eb0dfbc9a27798ca2ffcd7bdc550cdbbfff25f82359a77ee Dec 02 07:42:45 crc kubenswrapper[4895]: I1202 07:42:45.287239 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ca98cba7-4127-4d25-a139-1a42224331f2-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"ca98cba7-4127-4d25-a139-1a42224331f2\") " pod="openstack/rabbitmq-server-0" Dec 02 07:42:45 crc kubenswrapper[4895]: I1202 07:42:45.287396 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ca98cba7-4127-4d25-a139-1a42224331f2-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"ca98cba7-4127-4d25-a139-1a42224331f2\") " pod="openstack/rabbitmq-server-0" Dec 02 07:42:45 crc kubenswrapper[4895]: I1202 07:42:45.287432 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ca98cba7-4127-4d25-a139-1a42224331f2-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"ca98cba7-4127-4d25-a139-1a42224331f2\") " pod="openstack/rabbitmq-server-0" Dec 02 07:42:45 crc kubenswrapper[4895]: I1202 07:42:45.287464 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ca98cba7-4127-4d25-a139-1a42224331f2-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"ca98cba7-4127-4d25-a139-1a42224331f2\") " pod="openstack/rabbitmq-server-0" Dec 02 07:42:45 crc kubenswrapper[4895]: I1202 07:42:45.287490 4895 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ca98cba7-4127-4d25-a139-1a42224331f2-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"ca98cba7-4127-4d25-a139-1a42224331f2\") " pod="openstack/rabbitmq-server-0" Dec 02 07:42:45 crc kubenswrapper[4895]: I1202 07:42:45.287518 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ca98cba7-4127-4d25-a139-1a42224331f2-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"ca98cba7-4127-4d25-a139-1a42224331f2\") " pod="openstack/rabbitmq-server-0" Dec 02 07:42:45 crc kubenswrapper[4895]: I1202 07:42:45.287549 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"ca98cba7-4127-4d25-a139-1a42224331f2\") " pod="openstack/rabbitmq-server-0" Dec 02 07:42:45 crc kubenswrapper[4895]: I1202 07:42:45.287577 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ca98cba7-4127-4d25-a139-1a42224331f2-config-data\") pod \"rabbitmq-server-0\" (UID: \"ca98cba7-4127-4d25-a139-1a42224331f2\") " pod="openstack/rabbitmq-server-0" Dec 02 07:42:45 crc kubenswrapper[4895]: I1202 07:42:45.287602 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m9r2n\" (UniqueName: \"kubernetes.io/projected/ca98cba7-4127-4d25-a139-1a42224331f2-kube-api-access-m9r2n\") pod \"rabbitmq-server-0\" (UID: \"ca98cba7-4127-4d25-a139-1a42224331f2\") " pod="openstack/rabbitmq-server-0" Dec 02 07:42:45 crc kubenswrapper[4895]: I1202 07:42:45.287635 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ca98cba7-4127-4d25-a139-1a42224331f2-server-conf\") pod 
\"rabbitmq-server-0\" (UID: \"ca98cba7-4127-4d25-a139-1a42224331f2\") " pod="openstack/rabbitmq-server-0" Dec 02 07:42:45 crc kubenswrapper[4895]: I1202 07:42:45.287660 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ca98cba7-4127-4d25-a139-1a42224331f2-pod-info\") pod \"rabbitmq-server-0\" (UID: \"ca98cba7-4127-4d25-a139-1a42224331f2\") " pod="openstack/rabbitmq-server-0" Dec 02 07:42:45 crc kubenswrapper[4895]: I1202 07:42:45.288659 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ca98cba7-4127-4d25-a139-1a42224331f2-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"ca98cba7-4127-4d25-a139-1a42224331f2\") " pod="openstack/rabbitmq-server-0" Dec 02 07:42:45 crc kubenswrapper[4895]: I1202 07:42:45.288733 4895 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"ca98cba7-4127-4d25-a139-1a42224331f2\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-server-0" Dec 02 07:42:45 crc kubenswrapper[4895]: I1202 07:42:45.289150 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ca98cba7-4127-4d25-a139-1a42224331f2-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"ca98cba7-4127-4d25-a139-1a42224331f2\") " pod="openstack/rabbitmq-server-0" Dec 02 07:42:45 crc kubenswrapper[4895]: I1202 07:42:45.289808 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ca98cba7-4127-4d25-a139-1a42224331f2-config-data\") pod \"rabbitmq-server-0\" (UID: \"ca98cba7-4127-4d25-a139-1a42224331f2\") " pod="openstack/rabbitmq-server-0" Dec 02 07:42:45 crc kubenswrapper[4895]: I1202 07:42:45.290776 
4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ca98cba7-4127-4d25-a139-1a42224331f2-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"ca98cba7-4127-4d25-a139-1a42224331f2\") " pod="openstack/rabbitmq-server-0" Dec 02 07:42:45 crc kubenswrapper[4895]: I1202 07:42:45.292582 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ca98cba7-4127-4d25-a139-1a42224331f2-server-conf\") pod \"rabbitmq-server-0\" (UID: \"ca98cba7-4127-4d25-a139-1a42224331f2\") " pod="openstack/rabbitmq-server-0" Dec 02 07:42:45 crc kubenswrapper[4895]: I1202 07:42:45.299589 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ca98cba7-4127-4d25-a139-1a42224331f2-pod-info\") pod \"rabbitmq-server-0\" (UID: \"ca98cba7-4127-4d25-a139-1a42224331f2\") " pod="openstack/rabbitmq-server-0" Dec 02 07:42:45 crc kubenswrapper[4895]: I1202 07:42:45.300290 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ca98cba7-4127-4d25-a139-1a42224331f2-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"ca98cba7-4127-4d25-a139-1a42224331f2\") " pod="openstack/rabbitmq-server-0" Dec 02 07:42:45 crc kubenswrapper[4895]: I1202 07:42:45.307254 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ca98cba7-4127-4d25-a139-1a42224331f2-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"ca98cba7-4127-4d25-a139-1a42224331f2\") " pod="openstack/rabbitmq-server-0" Dec 02 07:42:45 crc kubenswrapper[4895]: I1202 07:42:45.310765 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ca98cba7-4127-4d25-a139-1a42224331f2-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: 
\"ca98cba7-4127-4d25-a139-1a42224331f2\") " pod="openstack/rabbitmq-server-0" Dec 02 07:42:45 crc kubenswrapper[4895]: I1202 07:42:45.311795 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9r2n\" (UniqueName: \"kubernetes.io/projected/ca98cba7-4127-4d25-a139-1a42224331f2-kube-api-access-m9r2n\") pod \"rabbitmq-server-0\" (UID: \"ca98cba7-4127-4d25-a139-1a42224331f2\") " pod="openstack/rabbitmq-server-0" Dec 02 07:42:45 crc kubenswrapper[4895]: I1202 07:42:45.314944 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"ca98cba7-4127-4d25-a139-1a42224331f2\") " pod="openstack/rabbitmq-server-0" Dec 02 07:42:45 crc kubenswrapper[4895]: I1202 07:42:45.331956 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-lvv8z" event={"ID":"3b019ba5-e2aa-4ac7-a3a6-02e619195935","Type":"ContainerStarted","Data":"d67570f5f95063dfacda7f2e25feeef193c2f24ad14f5df53e751ec87d66a9e2"} Dec 02 07:42:45 crc kubenswrapper[4895]: I1202 07:42:45.334827 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-l7vgw" event={"ID":"56de784f-037a-486b-8e7b-51d62e0dd0ad","Type":"ContainerStarted","Data":"c845940080c320a0eb0dfbc9a27798ca2ffcd7bdc550cdbbfff25f82359a77ee"} Dec 02 07:42:45 crc kubenswrapper[4895]: I1202 07:42:45.400517 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 02 07:42:45 crc kubenswrapper[4895]: I1202 07:42:45.404060 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 02 07:42:45 crc kubenswrapper[4895]: I1202 07:42:45.407702 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-w9t4w" Dec 02 07:42:45 crc kubenswrapper[4895]: I1202 07:42:45.408084 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Dec 02 07:42:45 crc kubenswrapper[4895]: I1202 07:42:45.408329 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Dec 02 07:42:45 crc kubenswrapper[4895]: I1202 07:42:45.408491 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Dec 02 07:42:45 crc kubenswrapper[4895]: I1202 07:42:45.408644 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Dec 02 07:42:45 crc kubenswrapper[4895]: I1202 07:42:45.410455 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Dec 02 07:42:45 crc kubenswrapper[4895]: I1202 07:42:45.422635 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 02 07:42:45 crc kubenswrapper[4895]: I1202 07:42:45.428241 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 02 07:42:45 crc kubenswrapper[4895]: I1202 07:42:45.436734 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Dec 02 07:42:45 crc kubenswrapper[4895]: I1202 07:42:45.512201 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0d1cb194-5325-40c2-bbd4-0a48821e12aa-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0d1cb194-5325-40c2-bbd4-0a48821e12aa\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 07:42:45 crc kubenswrapper[4895]: I1202 07:42:45.512253 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0d1cb194-5325-40c2-bbd4-0a48821e12aa-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"0d1cb194-5325-40c2-bbd4-0a48821e12aa\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 07:42:45 crc kubenswrapper[4895]: I1202 07:42:45.512277 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0d1cb194-5325-40c2-bbd4-0a48821e12aa-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"0d1cb194-5325-40c2-bbd4-0a48821e12aa\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 07:42:45 crc kubenswrapper[4895]: I1202 07:42:45.512309 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0d1cb194-5325-40c2-bbd4-0a48821e12aa-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"0d1cb194-5325-40c2-bbd4-0a48821e12aa\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 07:42:45 crc kubenswrapper[4895]: I1202 07:42:45.512333 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0d1cb194-5325-40c2-bbd4-0a48821e12aa-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"0d1cb194-5325-40c2-bbd4-0a48821e12aa\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 07:42:45 crc kubenswrapper[4895]: I1202 07:42:45.512363 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0d1cb194-5325-40c2-bbd4-0a48821e12aa-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0d1cb194-5325-40c2-bbd4-0a48821e12aa\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 07:42:45 crc kubenswrapper[4895]: I1202 07:42:45.512400 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0d1cb194-5325-40c2-bbd4-0a48821e12aa-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"0d1cb194-5325-40c2-bbd4-0a48821e12aa\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 07:42:45 crc kubenswrapper[4895]: I1202 07:42:45.512419 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0d1cb194-5325-40c2-bbd4-0a48821e12aa-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"0d1cb194-5325-40c2-bbd4-0a48821e12aa\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 07:42:45 crc kubenswrapper[4895]: I1202 07:42:45.512437 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjhg9\" (UniqueName: \"kubernetes.io/projected/0d1cb194-5325-40c2-bbd4-0a48821e12aa-kube-api-access-mjhg9\") pod \"rabbitmq-cell1-server-0\" (UID: \"0d1cb194-5325-40c2-bbd4-0a48821e12aa\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 07:42:45 crc kubenswrapper[4895]: I1202 07:42:45.512470 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"0d1cb194-5325-40c2-bbd4-0a48821e12aa\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 07:42:45 crc kubenswrapper[4895]: I1202 07:42:45.512488 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0d1cb194-5325-40c2-bbd4-0a48821e12aa-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"0d1cb194-5325-40c2-bbd4-0a48821e12aa\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 07:42:45 crc kubenswrapper[4895]: I1202 07:42:45.614202 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0d1cb194-5325-40c2-bbd4-0a48821e12aa-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"0d1cb194-5325-40c2-bbd4-0a48821e12aa\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 07:42:45 crc kubenswrapper[4895]: I1202 07:42:45.614600 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0d1cb194-5325-40c2-bbd4-0a48821e12aa-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"0d1cb194-5325-40c2-bbd4-0a48821e12aa\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 07:42:45 crc kubenswrapper[4895]: I1202 07:42:45.614630 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjhg9\" (UniqueName: \"kubernetes.io/projected/0d1cb194-5325-40c2-bbd4-0a48821e12aa-kube-api-access-mjhg9\") pod \"rabbitmq-cell1-server-0\" (UID: \"0d1cb194-5325-40c2-bbd4-0a48821e12aa\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 07:42:45 crc kubenswrapper[4895]: I1202 07:42:45.614667 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"0d1cb194-5325-40c2-bbd4-0a48821e12aa\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 07:42:45 crc kubenswrapper[4895]: I1202 07:42:45.614687 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0d1cb194-5325-40c2-bbd4-0a48821e12aa-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"0d1cb194-5325-40c2-bbd4-0a48821e12aa\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 07:42:45 crc kubenswrapper[4895]: I1202 07:42:45.614712 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0d1cb194-5325-40c2-bbd4-0a48821e12aa-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0d1cb194-5325-40c2-bbd4-0a48821e12aa\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 07:42:45 crc kubenswrapper[4895]: I1202 07:42:45.614732 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0d1cb194-5325-40c2-bbd4-0a48821e12aa-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"0d1cb194-5325-40c2-bbd4-0a48821e12aa\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 07:42:45 crc kubenswrapper[4895]: I1202 07:42:45.614780 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0d1cb194-5325-40c2-bbd4-0a48821e12aa-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"0d1cb194-5325-40c2-bbd4-0a48821e12aa\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 07:42:45 crc kubenswrapper[4895]: I1202 07:42:45.614816 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0d1cb194-5325-40c2-bbd4-0a48821e12aa-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"0d1cb194-5325-40c2-bbd4-0a48821e12aa\") " 
pod="openstack/rabbitmq-cell1-server-0" Dec 02 07:42:45 crc kubenswrapper[4895]: I1202 07:42:45.614843 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0d1cb194-5325-40c2-bbd4-0a48821e12aa-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"0d1cb194-5325-40c2-bbd4-0a48821e12aa\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 07:42:45 crc kubenswrapper[4895]: I1202 07:42:45.614874 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0d1cb194-5325-40c2-bbd4-0a48821e12aa-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0d1cb194-5325-40c2-bbd4-0a48821e12aa\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 07:42:45 crc kubenswrapper[4895]: I1202 07:42:45.615832 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0d1cb194-5325-40c2-bbd4-0a48821e12aa-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0d1cb194-5325-40c2-bbd4-0a48821e12aa\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 07:42:45 crc kubenswrapper[4895]: I1202 07:42:45.616407 4895 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"0d1cb194-5325-40c2-bbd4-0a48821e12aa\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-cell1-server-0" Dec 02 07:42:45 crc kubenswrapper[4895]: I1202 07:42:45.617133 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0d1cb194-5325-40c2-bbd4-0a48821e12aa-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0d1cb194-5325-40c2-bbd4-0a48821e12aa\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 07:42:45 crc kubenswrapper[4895]: I1202 07:42:45.617934 4895 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0d1cb194-5325-40c2-bbd4-0a48821e12aa-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"0d1cb194-5325-40c2-bbd4-0a48821e12aa\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 07:42:45 crc kubenswrapper[4895]: I1202 07:42:45.617925 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0d1cb194-5325-40c2-bbd4-0a48821e12aa-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"0d1cb194-5325-40c2-bbd4-0a48821e12aa\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 07:42:45 crc kubenswrapper[4895]: I1202 07:42:45.618580 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0d1cb194-5325-40c2-bbd4-0a48821e12aa-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"0d1cb194-5325-40c2-bbd4-0a48821e12aa\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 07:42:45 crc kubenswrapper[4895]: I1202 07:42:45.630794 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0d1cb194-5325-40c2-bbd4-0a48821e12aa-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"0d1cb194-5325-40c2-bbd4-0a48821e12aa\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 07:42:45 crc kubenswrapper[4895]: I1202 07:42:45.631817 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0d1cb194-5325-40c2-bbd4-0a48821e12aa-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"0d1cb194-5325-40c2-bbd4-0a48821e12aa\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 07:42:45 crc kubenswrapper[4895]: I1202 07:42:45.634190 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/0d1cb194-5325-40c2-bbd4-0a48821e12aa-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"0d1cb194-5325-40c2-bbd4-0a48821e12aa\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 07:42:45 crc kubenswrapper[4895]: I1202 07:42:45.645520 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjhg9\" (UniqueName: \"kubernetes.io/projected/0d1cb194-5325-40c2-bbd4-0a48821e12aa-kube-api-access-mjhg9\") pod \"rabbitmq-cell1-server-0\" (UID: \"0d1cb194-5325-40c2-bbd4-0a48821e12aa\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 07:42:45 crc kubenswrapper[4895]: I1202 07:42:45.648994 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0d1cb194-5325-40c2-bbd4-0a48821e12aa-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"0d1cb194-5325-40c2-bbd4-0a48821e12aa\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 07:42:45 crc kubenswrapper[4895]: I1202 07:42:45.698857 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"0d1cb194-5325-40c2-bbd4-0a48821e12aa\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 07:42:45 crc kubenswrapper[4895]: I1202 07:42:45.747879 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 02 07:42:46 crc kubenswrapper[4895]: I1202 07:42:46.081208 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 02 07:42:46 crc kubenswrapper[4895]: W1202 07:42:46.099830 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca98cba7_4127_4d25_a139_1a42224331f2.slice/crio-15a095a70eb867e75188aa85a8bc8725e7974c8213e3fee9e754f1ad56e47533 WatchSource:0}: Error finding container 15a095a70eb867e75188aa85a8bc8725e7974c8213e3fee9e754f1ad56e47533: Status 404 returned error can't find the container with id 15a095a70eb867e75188aa85a8bc8725e7974c8213e3fee9e754f1ad56e47533 Dec 02 07:42:46 crc kubenswrapper[4895]: I1202 07:42:46.363048 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"ca98cba7-4127-4d25-a139-1a42224331f2","Type":"ContainerStarted","Data":"15a095a70eb867e75188aa85a8bc8725e7974c8213e3fee9e754f1ad56e47533"} Dec 02 07:42:46 crc kubenswrapper[4895]: I1202 07:42:46.369091 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 02 07:42:46 crc kubenswrapper[4895]: W1202 07:42:46.408728 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0d1cb194_5325_40c2_bbd4_0a48821e12aa.slice/crio-698ab9cd60a9ca0d4905b9578ec18327a647140523bec61f8d2e460409a34dcb WatchSource:0}: Error finding container 698ab9cd60a9ca0d4905b9578ec18327a647140523bec61f8d2e460409a34dcb: Status 404 returned error can't find the container with id 698ab9cd60a9ca0d4905b9578ec18327a647140523bec61f8d2e460409a34dcb Dec 02 07:42:46 crc kubenswrapper[4895]: I1202 07:42:46.768654 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Dec 02 07:42:46 crc kubenswrapper[4895]: I1202 07:42:46.770697 4895 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Dec 02 07:42:46 crc kubenswrapper[4895]: I1202 07:42:46.774888 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-jskc8" Dec 02 07:42:46 crc kubenswrapper[4895]: I1202 07:42:46.775067 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Dec 02 07:42:46 crc kubenswrapper[4895]: I1202 07:42:46.791729 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Dec 02 07:42:46 crc kubenswrapper[4895]: I1202 07:42:46.791726 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Dec 02 07:42:46 crc kubenswrapper[4895]: I1202 07:42:46.793465 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Dec 02 07:42:46 crc kubenswrapper[4895]: I1202 07:42:46.798661 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Dec 02 07:42:46 crc kubenswrapper[4895]: I1202 07:42:46.903391 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/38385316-fca8-41b0-b0ff-570a9cd71e8a-kolla-config\") pod \"openstack-galera-0\" (UID: \"38385316-fca8-41b0-b0ff-570a9cd71e8a\") " pod="openstack/openstack-galera-0" Dec 02 07:42:46 crc kubenswrapper[4895]: I1202 07:42:46.903461 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5z9q\" (UniqueName: \"kubernetes.io/projected/38385316-fca8-41b0-b0ff-570a9cd71e8a-kube-api-access-w5z9q\") pod \"openstack-galera-0\" (UID: \"38385316-fca8-41b0-b0ff-570a9cd71e8a\") " pod="openstack/openstack-galera-0" Dec 02 07:42:46 crc kubenswrapper[4895]: I1202 07:42:46.903494 4895 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/38385316-fca8-41b0-b0ff-570a9cd71e8a-config-data-generated\") pod \"openstack-galera-0\" (UID: \"38385316-fca8-41b0-b0ff-570a9cd71e8a\") " pod="openstack/openstack-galera-0" Dec 02 07:42:46 crc kubenswrapper[4895]: I1202 07:42:46.903522 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38385316-fca8-41b0-b0ff-570a9cd71e8a-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"38385316-fca8-41b0-b0ff-570a9cd71e8a\") " pod="openstack/openstack-galera-0" Dec 02 07:42:46 crc kubenswrapper[4895]: I1202 07:42:46.903543 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/38385316-fca8-41b0-b0ff-570a9cd71e8a-operator-scripts\") pod \"openstack-galera-0\" (UID: \"38385316-fca8-41b0-b0ff-570a9cd71e8a\") " pod="openstack/openstack-galera-0" Dec 02 07:42:46 crc kubenswrapper[4895]: I1202 07:42:46.904571 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"38385316-fca8-41b0-b0ff-570a9cd71e8a\") " pod="openstack/openstack-galera-0" Dec 02 07:42:46 crc kubenswrapper[4895]: I1202 07:42:46.904610 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/38385316-fca8-41b0-b0ff-570a9cd71e8a-config-data-default\") pod \"openstack-galera-0\" (UID: \"38385316-fca8-41b0-b0ff-570a9cd71e8a\") " pod="openstack/openstack-galera-0" Dec 02 07:42:46 crc kubenswrapper[4895]: I1202 07:42:46.904641 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/38385316-fca8-41b0-b0ff-570a9cd71e8a-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"38385316-fca8-41b0-b0ff-570a9cd71e8a\") " pod="openstack/openstack-galera-0" Dec 02 07:42:47 crc kubenswrapper[4895]: I1202 07:42:47.006105 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5z9q\" (UniqueName: \"kubernetes.io/projected/38385316-fca8-41b0-b0ff-570a9cd71e8a-kube-api-access-w5z9q\") pod \"openstack-galera-0\" (UID: \"38385316-fca8-41b0-b0ff-570a9cd71e8a\") " pod="openstack/openstack-galera-0" Dec 02 07:42:47 crc kubenswrapper[4895]: I1202 07:42:47.006166 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/38385316-fca8-41b0-b0ff-570a9cd71e8a-config-data-generated\") pod \"openstack-galera-0\" (UID: \"38385316-fca8-41b0-b0ff-570a9cd71e8a\") " pod="openstack/openstack-galera-0" Dec 02 07:42:47 crc kubenswrapper[4895]: I1202 07:42:47.006197 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38385316-fca8-41b0-b0ff-570a9cd71e8a-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"38385316-fca8-41b0-b0ff-570a9cd71e8a\") " pod="openstack/openstack-galera-0" Dec 02 07:42:47 crc kubenswrapper[4895]: I1202 07:42:47.006216 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/38385316-fca8-41b0-b0ff-570a9cd71e8a-operator-scripts\") pod \"openstack-galera-0\" (UID: \"38385316-fca8-41b0-b0ff-570a9cd71e8a\") " pod="openstack/openstack-galera-0" Dec 02 07:42:47 crc kubenswrapper[4895]: I1202 07:42:47.006246 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod 
\"openstack-galera-0\" (UID: \"38385316-fca8-41b0-b0ff-570a9cd71e8a\") " pod="openstack/openstack-galera-0" Dec 02 07:42:47 crc kubenswrapper[4895]: I1202 07:42:47.006276 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/38385316-fca8-41b0-b0ff-570a9cd71e8a-config-data-default\") pod \"openstack-galera-0\" (UID: \"38385316-fca8-41b0-b0ff-570a9cd71e8a\") " pod="openstack/openstack-galera-0" Dec 02 07:42:47 crc kubenswrapper[4895]: I1202 07:42:47.006300 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/38385316-fca8-41b0-b0ff-570a9cd71e8a-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"38385316-fca8-41b0-b0ff-570a9cd71e8a\") " pod="openstack/openstack-galera-0" Dec 02 07:42:47 crc kubenswrapper[4895]: I1202 07:42:47.006338 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/38385316-fca8-41b0-b0ff-570a9cd71e8a-kolla-config\") pod \"openstack-galera-0\" (UID: \"38385316-fca8-41b0-b0ff-570a9cd71e8a\") " pod="openstack/openstack-galera-0" Dec 02 07:42:47 crc kubenswrapper[4895]: I1202 07:42:47.007074 4895 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"38385316-fca8-41b0-b0ff-570a9cd71e8a\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/openstack-galera-0" Dec 02 07:42:47 crc kubenswrapper[4895]: I1202 07:42:47.007376 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/38385316-fca8-41b0-b0ff-570a9cd71e8a-kolla-config\") pod \"openstack-galera-0\" (UID: \"38385316-fca8-41b0-b0ff-570a9cd71e8a\") " pod="openstack/openstack-galera-0" Dec 02 07:42:47 crc 
kubenswrapper[4895]: I1202 07:42:47.007452 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/38385316-fca8-41b0-b0ff-570a9cd71e8a-config-data-generated\") pod \"openstack-galera-0\" (UID: \"38385316-fca8-41b0-b0ff-570a9cd71e8a\") " pod="openstack/openstack-galera-0" Dec 02 07:42:47 crc kubenswrapper[4895]: I1202 07:42:47.008191 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/38385316-fca8-41b0-b0ff-570a9cd71e8a-config-data-default\") pod \"openstack-galera-0\" (UID: \"38385316-fca8-41b0-b0ff-570a9cd71e8a\") " pod="openstack/openstack-galera-0" Dec 02 07:42:47 crc kubenswrapper[4895]: I1202 07:42:47.008808 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/38385316-fca8-41b0-b0ff-570a9cd71e8a-operator-scripts\") pod \"openstack-galera-0\" (UID: \"38385316-fca8-41b0-b0ff-570a9cd71e8a\") " pod="openstack/openstack-galera-0" Dec 02 07:42:47 crc kubenswrapper[4895]: I1202 07:42:47.017914 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38385316-fca8-41b0-b0ff-570a9cd71e8a-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"38385316-fca8-41b0-b0ff-570a9cd71e8a\") " pod="openstack/openstack-galera-0" Dec 02 07:42:47 crc kubenswrapper[4895]: I1202 07:42:47.033286 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/38385316-fca8-41b0-b0ff-570a9cd71e8a-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"38385316-fca8-41b0-b0ff-570a9cd71e8a\") " pod="openstack/openstack-galera-0" Dec 02 07:42:47 crc kubenswrapper[4895]: I1202 07:42:47.039345 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"38385316-fca8-41b0-b0ff-570a9cd71e8a\") " pod="openstack/openstack-galera-0" Dec 02 07:42:47 crc kubenswrapper[4895]: I1202 07:42:47.129923 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5z9q\" (UniqueName: \"kubernetes.io/projected/38385316-fca8-41b0-b0ff-570a9cd71e8a-kube-api-access-w5z9q\") pod \"openstack-galera-0\" (UID: \"38385316-fca8-41b0-b0ff-570a9cd71e8a\") " pod="openstack/openstack-galera-0" Dec 02 07:42:47 crc kubenswrapper[4895]: I1202 07:42:47.263840 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Dec 02 07:42:47 crc kubenswrapper[4895]: I1202 07:42:47.385543 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"0d1cb194-5325-40c2-bbd4-0a48821e12aa","Type":"ContainerStarted","Data":"698ab9cd60a9ca0d4905b9578ec18327a647140523bec61f8d2e460409a34dcb"} Dec 02 07:42:48 crc kubenswrapper[4895]: I1202 07:42:48.368513 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 02 07:42:48 crc kubenswrapper[4895]: I1202 07:42:48.370381 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 02 07:42:48 crc kubenswrapper[4895]: I1202 07:42:48.382021 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-c4ffk" Dec 02 07:42:48 crc kubenswrapper[4895]: I1202 07:42:48.396874 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 02 07:42:48 crc kubenswrapper[4895]: I1202 07:42:48.441676 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Dec 02 07:42:48 crc kubenswrapper[4895]: I1202 07:42:48.442728 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Dec 02 07:42:48 crc kubenswrapper[4895]: I1202 07:42:48.445814 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Dec 02 07:42:48 crc kubenswrapper[4895]: I1202 07:42:48.550440 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Dec 02 07:42:48 crc kubenswrapper[4895]: I1202 07:42:48.550940 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ace60b46-ed73-43ba-8d95-b81b03a6bd0a-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"ace60b46-ed73-43ba-8d95-b81b03a6bd0a\") " pod="openstack/openstack-cell1-galera-0" Dec 02 07:42:48 crc kubenswrapper[4895]: I1202 07:42:48.551036 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/ace60b46-ed73-43ba-8d95-b81b03a6bd0a-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"ace60b46-ed73-43ba-8d95-b81b03a6bd0a\") " pod="openstack/openstack-cell1-galera-0" Dec 02 07:42:48 crc kubenswrapper[4895]: I1202 07:42:48.551057 4895 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ace60b46-ed73-43ba-8d95-b81b03a6bd0a-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"ace60b46-ed73-43ba-8d95-b81b03a6bd0a\") " pod="openstack/openstack-cell1-galera-0" Dec 02 07:42:48 crc kubenswrapper[4895]: I1202 07:42:48.551098 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/ace60b46-ed73-43ba-8d95-b81b03a6bd0a-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"ace60b46-ed73-43ba-8d95-b81b03a6bd0a\") " pod="openstack/openstack-cell1-galera-0" Dec 02 07:42:48 crc kubenswrapper[4895]: I1202 07:42:48.551131 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/ace60b46-ed73-43ba-8d95-b81b03a6bd0a-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"ace60b46-ed73-43ba-8d95-b81b03a6bd0a\") " pod="openstack/openstack-cell1-galera-0" Dec 02 07:42:48 crc kubenswrapper[4895]: I1202 07:42:48.551171 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"ace60b46-ed73-43ba-8d95-b81b03a6bd0a\") " pod="openstack/openstack-cell1-galera-0" Dec 02 07:42:48 crc kubenswrapper[4895]: I1202 07:42:48.551202 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6q684\" (UniqueName: \"kubernetes.io/projected/ace60b46-ed73-43ba-8d95-b81b03a6bd0a-kube-api-access-6q684\") pod \"openstack-cell1-galera-0\" (UID: \"ace60b46-ed73-43ba-8d95-b81b03a6bd0a\") " pod="openstack/openstack-cell1-galera-0" Dec 02 07:42:48 crc kubenswrapper[4895]: I1202 07:42:48.551233 4895 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ace60b46-ed73-43ba-8d95-b81b03a6bd0a-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"ace60b46-ed73-43ba-8d95-b81b03a6bd0a\") " pod="openstack/openstack-cell1-galera-0" Dec 02 07:42:48 crc kubenswrapper[4895]: I1202 07:42:48.552889 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Dec 02 07:42:48 crc kubenswrapper[4895]: I1202 07:42:48.558463 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Dec 02 07:42:48 crc kubenswrapper[4895]: I1202 07:42:48.559173 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-xggvv" Dec 02 07:42:48 crc kubenswrapper[4895]: I1202 07:42:48.559591 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Dec 02 07:42:48 crc kubenswrapper[4895]: I1202 07:42:48.573108 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 02 07:42:48 crc kubenswrapper[4895]: I1202 07:42:48.655660 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zcmp2\" (UniqueName: \"kubernetes.io/projected/b15097a8-ac9a-4886-a839-272b662561c5-kube-api-access-zcmp2\") pod \"memcached-0\" (UID: \"b15097a8-ac9a-4886-a839-272b662561c5\") " pod="openstack/memcached-0" Dec 02 07:42:48 crc kubenswrapper[4895]: I1202 07:42:48.655715 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"ace60b46-ed73-43ba-8d95-b81b03a6bd0a\") " pod="openstack/openstack-cell1-galera-0" Dec 02 07:42:48 crc kubenswrapper[4895]: I1202 07:42:48.655752 4895 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b15097a8-ac9a-4886-a839-272b662561c5-combined-ca-bundle\") pod \"memcached-0\" (UID: \"b15097a8-ac9a-4886-a839-272b662561c5\") " pod="openstack/memcached-0" Dec 02 07:42:48 crc kubenswrapper[4895]: I1202 07:42:48.655786 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6q684\" (UniqueName: \"kubernetes.io/projected/ace60b46-ed73-43ba-8d95-b81b03a6bd0a-kube-api-access-6q684\") pod \"openstack-cell1-galera-0\" (UID: \"ace60b46-ed73-43ba-8d95-b81b03a6bd0a\") " pod="openstack/openstack-cell1-galera-0" Dec 02 07:42:48 crc kubenswrapper[4895]: I1202 07:42:48.655816 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ace60b46-ed73-43ba-8d95-b81b03a6bd0a-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"ace60b46-ed73-43ba-8d95-b81b03a6bd0a\") " pod="openstack/openstack-cell1-galera-0" Dec 02 07:42:48 crc kubenswrapper[4895]: I1202 07:42:48.655835 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ace60b46-ed73-43ba-8d95-b81b03a6bd0a-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"ace60b46-ed73-43ba-8d95-b81b03a6bd0a\") " pod="openstack/openstack-cell1-galera-0" Dec 02 07:42:48 crc kubenswrapper[4895]: I1202 07:42:48.655853 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b15097a8-ac9a-4886-a839-272b662561c5-kolla-config\") pod \"memcached-0\" (UID: \"b15097a8-ac9a-4886-a839-272b662561c5\") " pod="openstack/memcached-0" Dec 02 07:42:48 crc kubenswrapper[4895]: I1202 07:42:48.655887 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/ace60b46-ed73-43ba-8d95-b81b03a6bd0a-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"ace60b46-ed73-43ba-8d95-b81b03a6bd0a\") " pod="openstack/openstack-cell1-galera-0" Dec 02 07:42:48 crc kubenswrapper[4895]: I1202 07:42:48.655904 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ace60b46-ed73-43ba-8d95-b81b03a6bd0a-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"ace60b46-ed73-43ba-8d95-b81b03a6bd0a\") " pod="openstack/openstack-cell1-galera-0" Dec 02 07:42:48 crc kubenswrapper[4895]: I1202 07:42:48.655928 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b15097a8-ac9a-4886-a839-272b662561c5-config-data\") pod \"memcached-0\" (UID: \"b15097a8-ac9a-4886-a839-272b662561c5\") " pod="openstack/memcached-0" Dec 02 07:42:48 crc kubenswrapper[4895]: I1202 07:42:48.655960 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/ace60b46-ed73-43ba-8d95-b81b03a6bd0a-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"ace60b46-ed73-43ba-8d95-b81b03a6bd0a\") " pod="openstack/openstack-cell1-galera-0" Dec 02 07:42:48 crc kubenswrapper[4895]: I1202 07:42:48.655989 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/ace60b46-ed73-43ba-8d95-b81b03a6bd0a-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"ace60b46-ed73-43ba-8d95-b81b03a6bd0a\") " pod="openstack/openstack-cell1-galera-0" Dec 02 07:42:48 crc kubenswrapper[4895]: I1202 07:42:48.656010 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/b15097a8-ac9a-4886-a839-272b662561c5-memcached-tls-certs\") pod \"memcached-0\" (UID: \"b15097a8-ac9a-4886-a839-272b662561c5\") " pod="openstack/memcached-0" Dec 02 07:42:48 crc kubenswrapper[4895]: I1202 07:42:48.656429 4895 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"ace60b46-ed73-43ba-8d95-b81b03a6bd0a\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/openstack-cell1-galera-0" Dec 02 07:42:48 crc kubenswrapper[4895]: I1202 07:42:48.662299 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ace60b46-ed73-43ba-8d95-b81b03a6bd0a-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"ace60b46-ed73-43ba-8d95-b81b03a6bd0a\") " pod="openstack/openstack-cell1-galera-0" Dec 02 07:42:48 crc kubenswrapper[4895]: I1202 07:42:48.668412 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ace60b46-ed73-43ba-8d95-b81b03a6bd0a-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"ace60b46-ed73-43ba-8d95-b81b03a6bd0a\") " pod="openstack/openstack-cell1-galera-0" Dec 02 07:42:48 crc kubenswrapper[4895]: I1202 07:42:48.679600 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/ace60b46-ed73-43ba-8d95-b81b03a6bd0a-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"ace60b46-ed73-43ba-8d95-b81b03a6bd0a\") " pod="openstack/openstack-cell1-galera-0" Dec 02 07:42:48 crc kubenswrapper[4895]: I1202 07:42:48.687873 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/ace60b46-ed73-43ba-8d95-b81b03a6bd0a-config-data-default\") pod 
\"openstack-cell1-galera-0\" (UID: \"ace60b46-ed73-43ba-8d95-b81b03a6bd0a\") " pod="openstack/openstack-cell1-galera-0" Dec 02 07:42:48 crc kubenswrapper[4895]: I1202 07:42:48.739511 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/ace60b46-ed73-43ba-8d95-b81b03a6bd0a-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"ace60b46-ed73-43ba-8d95-b81b03a6bd0a\") " pod="openstack/openstack-cell1-galera-0" Dec 02 07:42:48 crc kubenswrapper[4895]: I1202 07:42:48.741242 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ace60b46-ed73-43ba-8d95-b81b03a6bd0a-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"ace60b46-ed73-43ba-8d95-b81b03a6bd0a\") " pod="openstack/openstack-cell1-galera-0" Dec 02 07:42:48 crc kubenswrapper[4895]: I1202 07:42:48.742016 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6q684\" (UniqueName: \"kubernetes.io/projected/ace60b46-ed73-43ba-8d95-b81b03a6bd0a-kube-api-access-6q684\") pod \"openstack-cell1-galera-0\" (UID: \"ace60b46-ed73-43ba-8d95-b81b03a6bd0a\") " pod="openstack/openstack-cell1-galera-0" Dec 02 07:42:48 crc kubenswrapper[4895]: I1202 07:42:48.757406 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/b15097a8-ac9a-4886-a839-272b662561c5-memcached-tls-certs\") pod \"memcached-0\" (UID: \"b15097a8-ac9a-4886-a839-272b662561c5\") " pod="openstack/memcached-0" Dec 02 07:42:48 crc kubenswrapper[4895]: I1202 07:42:48.757474 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zcmp2\" (UniqueName: \"kubernetes.io/projected/b15097a8-ac9a-4886-a839-272b662561c5-kube-api-access-zcmp2\") pod \"memcached-0\" (UID: \"b15097a8-ac9a-4886-a839-272b662561c5\") " pod="openstack/memcached-0" Dec 02 
07:42:48 crc kubenswrapper[4895]: I1202 07:42:48.757543 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b15097a8-ac9a-4886-a839-272b662561c5-combined-ca-bundle\") pod \"memcached-0\" (UID: \"b15097a8-ac9a-4886-a839-272b662561c5\") " pod="openstack/memcached-0" Dec 02 07:42:48 crc kubenswrapper[4895]: I1202 07:42:48.757596 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b15097a8-ac9a-4886-a839-272b662561c5-kolla-config\") pod \"memcached-0\" (UID: \"b15097a8-ac9a-4886-a839-272b662561c5\") " pod="openstack/memcached-0" Dec 02 07:42:48 crc kubenswrapper[4895]: I1202 07:42:48.757644 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b15097a8-ac9a-4886-a839-272b662561c5-config-data\") pod \"memcached-0\" (UID: \"b15097a8-ac9a-4886-a839-272b662561c5\") " pod="openstack/memcached-0" Dec 02 07:42:48 crc kubenswrapper[4895]: I1202 07:42:48.758521 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b15097a8-ac9a-4886-a839-272b662561c5-config-data\") pod \"memcached-0\" (UID: \"b15097a8-ac9a-4886-a839-272b662561c5\") " pod="openstack/memcached-0" Dec 02 07:42:48 crc kubenswrapper[4895]: I1202 07:42:48.759729 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b15097a8-ac9a-4886-a839-272b662561c5-kolla-config\") pod \"memcached-0\" (UID: \"b15097a8-ac9a-4886-a839-272b662561c5\") " pod="openstack/memcached-0" Dec 02 07:42:48 crc kubenswrapper[4895]: I1202 07:42:48.772624 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/b15097a8-ac9a-4886-a839-272b662561c5-memcached-tls-certs\") pod 
\"memcached-0\" (UID: \"b15097a8-ac9a-4886-a839-272b662561c5\") " pod="openstack/memcached-0" Dec 02 07:42:48 crc kubenswrapper[4895]: I1202 07:42:48.789095 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b15097a8-ac9a-4886-a839-272b662561c5-combined-ca-bundle\") pod \"memcached-0\" (UID: \"b15097a8-ac9a-4886-a839-272b662561c5\") " pod="openstack/memcached-0" Dec 02 07:42:48 crc kubenswrapper[4895]: I1202 07:42:48.799508 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"ace60b46-ed73-43ba-8d95-b81b03a6bd0a\") " pod="openstack/openstack-cell1-galera-0" Dec 02 07:42:48 crc kubenswrapper[4895]: I1202 07:42:48.830553 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zcmp2\" (UniqueName: \"kubernetes.io/projected/b15097a8-ac9a-4886-a839-272b662561c5-kube-api-access-zcmp2\") pod \"memcached-0\" (UID: \"b15097a8-ac9a-4886-a839-272b662561c5\") " pod="openstack/memcached-0" Dec 02 07:42:48 crc kubenswrapper[4895]: I1202 07:42:48.931244 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Dec 02 07:42:49 crc kubenswrapper[4895]: I1202 07:42:49.087382 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Dec 02 07:42:49 crc kubenswrapper[4895]: I1202 07:42:49.088308 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 02 07:42:49 crc kubenswrapper[4895]: I1202 07:42:49.471371 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"38385316-fca8-41b0-b0ff-570a9cd71e8a","Type":"ContainerStarted","Data":"0be7fbe507321827b307a582000ca34981fa4418347f3ceed7cd877618c413d3"} Dec 02 07:42:49 crc kubenswrapper[4895]: I1202 07:42:49.865347 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 02 07:42:50 crc kubenswrapper[4895]: I1202 07:42:50.194122 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 02 07:42:50 crc kubenswrapper[4895]: I1202 07:42:50.552398 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Dec 02 07:42:50 crc kubenswrapper[4895]: I1202 07:42:50.553542 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 02 07:42:50 crc kubenswrapper[4895]: I1202 07:42:50.558833 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-6zjbc" Dec 02 07:42:50 crc kubenswrapper[4895]: I1202 07:42:50.560015 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 02 07:42:50 crc kubenswrapper[4895]: I1202 07:42:50.583250 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4zs8\" (UniqueName: \"kubernetes.io/projected/a6606596-020b-4584-b9d2-8606a794a726-kube-api-access-b4zs8\") pod \"kube-state-metrics-0\" (UID: \"a6606596-020b-4584-b9d2-8606a794a726\") " pod="openstack/kube-state-metrics-0" Dec 02 07:42:50 crc kubenswrapper[4895]: I1202 07:42:50.685633 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4zs8\" (UniqueName: 
\"kubernetes.io/projected/a6606596-020b-4584-b9d2-8606a794a726-kube-api-access-b4zs8\") pod \"kube-state-metrics-0\" (UID: \"a6606596-020b-4584-b9d2-8606a794a726\") " pod="openstack/kube-state-metrics-0" Dec 02 07:42:50 crc kubenswrapper[4895]: I1202 07:42:50.749331 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4zs8\" (UniqueName: \"kubernetes.io/projected/a6606596-020b-4584-b9d2-8606a794a726-kube-api-access-b4zs8\") pod \"kube-state-metrics-0\" (UID: \"a6606596-020b-4584-b9d2-8606a794a726\") " pod="openstack/kube-state-metrics-0" Dec 02 07:42:50 crc kubenswrapper[4895]: I1202 07:42:50.915717 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 02 07:42:54 crc kubenswrapper[4895]: I1202 07:42:54.136219 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 02 07:42:54 crc kubenswrapper[4895]: I1202 07:42:54.241764 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 02 07:42:54 crc kubenswrapper[4895]: I1202 07:42:54.245890 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Dec 02 07:42:54 crc kubenswrapper[4895]: I1202 07:42:54.246060 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Dec 02 07:42:54 crc kubenswrapper[4895]: I1202 07:42:54.246129 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Dec 02 07:42:54 crc kubenswrapper[4895]: I1202 07:42:54.250298 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Dec 02 07:42:54 crc kubenswrapper[4895]: I1202 07:42:54.250667 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-qrhzh" Dec 02 07:42:54 crc kubenswrapper[4895]: I1202 07:42:54.267912 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 02 07:42:54 crc kubenswrapper[4895]: I1202 07:42:54.352077 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/93ac7640-b11c-48f4-b537-45bebe4af01b-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"93ac7640-b11c-48f4-b537-45bebe4af01b\") " pod="openstack/ovsdbserver-nb-0" Dec 02 07:42:54 crc kubenswrapper[4895]: I1202 07:42:54.352138 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93ac7640-b11c-48f4-b537-45bebe4af01b-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"93ac7640-b11c-48f4-b537-45bebe4af01b\") " pod="openstack/ovsdbserver-nb-0" Dec 02 07:42:54 crc kubenswrapper[4895]: I1202 07:42:54.352178 4895 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/93ac7640-b11c-48f4-b537-45bebe4af01b-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"93ac7640-b11c-48f4-b537-45bebe4af01b\") " pod="openstack/ovsdbserver-nb-0" Dec 02 07:42:54 crc kubenswrapper[4895]: I1202 07:42:54.352208 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93ac7640-b11c-48f4-b537-45bebe4af01b-config\") pod \"ovsdbserver-nb-0\" (UID: \"93ac7640-b11c-48f4-b537-45bebe4af01b\") " pod="openstack/ovsdbserver-nb-0" Dec 02 07:42:54 crc kubenswrapper[4895]: I1202 07:42:54.352237 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9v7k\" (UniqueName: \"kubernetes.io/projected/93ac7640-b11c-48f4-b537-45bebe4af01b-kube-api-access-k9v7k\") pod \"ovsdbserver-nb-0\" (UID: \"93ac7640-b11c-48f4-b537-45bebe4af01b\") " pod="openstack/ovsdbserver-nb-0" Dec 02 07:42:54 crc kubenswrapper[4895]: I1202 07:42:54.352261 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/93ac7640-b11c-48f4-b537-45bebe4af01b-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"93ac7640-b11c-48f4-b537-45bebe4af01b\") " pod="openstack/ovsdbserver-nb-0" Dec 02 07:42:54 crc kubenswrapper[4895]: I1202 07:42:54.352371 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-nb-0\" (UID: \"93ac7640-b11c-48f4-b537-45bebe4af01b\") " pod="openstack/ovsdbserver-nb-0" Dec 02 07:42:54 crc kubenswrapper[4895]: I1202 07:42:54.352796 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/93ac7640-b11c-48f4-b537-45bebe4af01b-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"93ac7640-b11c-48f4-b537-45bebe4af01b\") " pod="openstack/ovsdbserver-nb-0" Dec 02 07:42:54 crc kubenswrapper[4895]: I1202 07:42:54.449818 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ftfwq"] Dec 02 07:42:54 crc kubenswrapper[4895]: I1202 07:42:54.451866 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ftfwq" Dec 02 07:42:54 crc kubenswrapper[4895]: I1202 07:42:54.453693 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-f7d2d" Dec 02 07:42:54 crc kubenswrapper[4895]: I1202 07:42:54.457032 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Dec 02 07:42:54 crc kubenswrapper[4895]: I1202 07:42:54.457325 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Dec 02 07:42:54 crc kubenswrapper[4895]: I1202 07:42:54.457323 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-nb-0\" (UID: \"93ac7640-b11c-48f4-b537-45bebe4af01b\") " pod="openstack/ovsdbserver-nb-0" Dec 02 07:42:54 crc kubenswrapper[4895]: I1202 07:42:54.457477 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/93ac7640-b11c-48f4-b537-45bebe4af01b-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"93ac7640-b11c-48f4-b537-45bebe4af01b\") " pod="openstack/ovsdbserver-nb-0" Dec 02 07:42:54 crc kubenswrapper[4895]: I1202 07:42:54.457531 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/93ac7640-b11c-48f4-b537-45bebe4af01b-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"93ac7640-b11c-48f4-b537-45bebe4af01b\") " pod="openstack/ovsdbserver-nb-0" Dec 02 07:42:54 crc kubenswrapper[4895]: I1202 07:42:54.457557 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93ac7640-b11c-48f4-b537-45bebe4af01b-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"93ac7640-b11c-48f4-b537-45bebe4af01b\") " pod="openstack/ovsdbserver-nb-0" Dec 02 07:42:54 crc kubenswrapper[4895]: I1202 07:42:54.457599 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/93ac7640-b11c-48f4-b537-45bebe4af01b-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"93ac7640-b11c-48f4-b537-45bebe4af01b\") " pod="openstack/ovsdbserver-nb-0" Dec 02 07:42:54 crc kubenswrapper[4895]: I1202 07:42:54.457631 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93ac7640-b11c-48f4-b537-45bebe4af01b-config\") pod \"ovsdbserver-nb-0\" (UID: \"93ac7640-b11c-48f4-b537-45bebe4af01b\") " pod="openstack/ovsdbserver-nb-0" Dec 02 07:42:54 crc kubenswrapper[4895]: I1202 07:42:54.457634 4895 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-nb-0\" (UID: \"93ac7640-b11c-48f4-b537-45bebe4af01b\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/ovsdbserver-nb-0" Dec 02 07:42:54 crc kubenswrapper[4895]: I1202 07:42:54.458980 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/93ac7640-b11c-48f4-b537-45bebe4af01b-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"93ac7640-b11c-48f4-b537-45bebe4af01b\") " pod="openstack/ovsdbserver-nb-0" Dec 02 
07:42:54 crc kubenswrapper[4895]: I1202 07:42:54.459034 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93ac7640-b11c-48f4-b537-45bebe4af01b-config\") pod \"ovsdbserver-nb-0\" (UID: \"93ac7640-b11c-48f4-b537-45bebe4af01b\") " pod="openstack/ovsdbserver-nb-0" Dec 02 07:42:54 crc kubenswrapper[4895]: I1202 07:42:54.459152 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9v7k\" (UniqueName: \"kubernetes.io/projected/93ac7640-b11c-48f4-b537-45bebe4af01b-kube-api-access-k9v7k\") pod \"ovsdbserver-nb-0\" (UID: \"93ac7640-b11c-48f4-b537-45bebe4af01b\") " pod="openstack/ovsdbserver-nb-0" Dec 02 07:42:54 crc kubenswrapper[4895]: I1202 07:42:54.459195 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/93ac7640-b11c-48f4-b537-45bebe4af01b-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"93ac7640-b11c-48f4-b537-45bebe4af01b\") " pod="openstack/ovsdbserver-nb-0" Dec 02 07:42:54 crc kubenswrapper[4895]: I1202 07:42:54.459631 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/93ac7640-b11c-48f4-b537-45bebe4af01b-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"93ac7640-b11c-48f4-b537-45bebe4af01b\") " pod="openstack/ovsdbserver-nb-0" Dec 02 07:42:54 crc kubenswrapper[4895]: I1202 07:42:54.462313 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-9vczq"] Dec 02 07:42:54 crc kubenswrapper[4895]: I1202 07:42:54.464627 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-9vczq" Dec 02 07:42:54 crc kubenswrapper[4895]: I1202 07:42:54.472213 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ftfwq"] Dec 02 07:42:54 crc kubenswrapper[4895]: I1202 07:42:54.476347 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/93ac7640-b11c-48f4-b537-45bebe4af01b-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"93ac7640-b11c-48f4-b537-45bebe4af01b\") " pod="openstack/ovsdbserver-nb-0" Dec 02 07:42:54 crc kubenswrapper[4895]: I1202 07:42:54.476627 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93ac7640-b11c-48f4-b537-45bebe4af01b-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"93ac7640-b11c-48f4-b537-45bebe4af01b\") " pod="openstack/ovsdbserver-nb-0" Dec 02 07:42:54 crc kubenswrapper[4895]: I1202 07:42:54.487760 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/93ac7640-b11c-48f4-b537-45bebe4af01b-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"93ac7640-b11c-48f4-b537-45bebe4af01b\") " pod="openstack/ovsdbserver-nb-0" Dec 02 07:42:54 crc kubenswrapper[4895]: I1202 07:42:54.499782 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9v7k\" (UniqueName: \"kubernetes.io/projected/93ac7640-b11c-48f4-b537-45bebe4af01b-kube-api-access-k9v7k\") pod \"ovsdbserver-nb-0\" (UID: \"93ac7640-b11c-48f4-b537-45bebe4af01b\") " pod="openstack/ovsdbserver-nb-0" Dec 02 07:42:54 crc kubenswrapper[4895]: I1202 07:42:54.560909 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/84116ead-6214-4d5f-98a3-c89b08cf1306-var-log-ovn\") pod \"ovn-controller-ftfwq\" 
(UID: \"84116ead-6214-4d5f-98a3-c89b08cf1306\") " pod="openstack/ovn-controller-ftfwq" Dec 02 07:42:54 crc kubenswrapper[4895]: I1202 07:42:54.560945 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8fl7\" (UniqueName: \"kubernetes.io/projected/84116ead-6214-4d5f-98a3-c89b08cf1306-kube-api-access-c8fl7\") pod \"ovn-controller-ftfwq\" (UID: \"84116ead-6214-4d5f-98a3-c89b08cf1306\") " pod="openstack/ovn-controller-ftfwq" Dec 02 07:42:54 crc kubenswrapper[4895]: I1202 07:42:54.560968 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6b463255-a237-46b0-826d-1e6fc849f0aa-scripts\") pod \"ovn-controller-ovs-9vczq\" (UID: \"6b463255-a237-46b0-826d-1e6fc849f0aa\") " pod="openstack/ovn-controller-ovs-9vczq" Dec 02 07:42:54 crc kubenswrapper[4895]: I1202 07:42:54.560986 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/84116ead-6214-4d5f-98a3-c89b08cf1306-scripts\") pod \"ovn-controller-ftfwq\" (UID: \"84116ead-6214-4d5f-98a3-c89b08cf1306\") " pod="openstack/ovn-controller-ftfwq" Dec 02 07:42:54 crc kubenswrapper[4895]: I1202 07:42:54.561031 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vwv6\" (UniqueName: \"kubernetes.io/projected/6b463255-a237-46b0-826d-1e6fc849f0aa-kube-api-access-8vwv6\") pod \"ovn-controller-ovs-9vczq\" (UID: \"6b463255-a237-46b0-826d-1e6fc849f0aa\") " pod="openstack/ovn-controller-ovs-9vczq" Dec 02 07:42:54 crc kubenswrapper[4895]: I1202 07:42:54.561052 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/84116ead-6214-4d5f-98a3-c89b08cf1306-var-run-ovn\") pod \"ovn-controller-ftfwq\" (UID: 
\"84116ead-6214-4d5f-98a3-c89b08cf1306\") " pod="openstack/ovn-controller-ftfwq" Dec 02 07:42:54 crc kubenswrapper[4895]: I1202 07:42:54.561069 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/84116ead-6214-4d5f-98a3-c89b08cf1306-var-run\") pod \"ovn-controller-ftfwq\" (UID: \"84116ead-6214-4d5f-98a3-c89b08cf1306\") " pod="openstack/ovn-controller-ftfwq" Dec 02 07:42:54 crc kubenswrapper[4895]: I1202 07:42:54.561098 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/6b463255-a237-46b0-826d-1e6fc849f0aa-var-log\") pod \"ovn-controller-ovs-9vczq\" (UID: \"6b463255-a237-46b0-826d-1e6fc849f0aa\") " pod="openstack/ovn-controller-ovs-9vczq" Dec 02 07:42:54 crc kubenswrapper[4895]: I1202 07:42:54.561117 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/6b463255-a237-46b0-826d-1e6fc849f0aa-var-lib\") pod \"ovn-controller-ovs-9vczq\" (UID: \"6b463255-a237-46b0-826d-1e6fc849f0aa\") " pod="openstack/ovn-controller-ovs-9vczq" Dec 02 07:42:54 crc kubenswrapper[4895]: I1202 07:42:54.561132 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/84116ead-6214-4d5f-98a3-c89b08cf1306-ovn-controller-tls-certs\") pod \"ovn-controller-ftfwq\" (UID: \"84116ead-6214-4d5f-98a3-c89b08cf1306\") " pod="openstack/ovn-controller-ftfwq" Dec 02 07:42:54 crc kubenswrapper[4895]: I1202 07:42:54.561150 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/6b463255-a237-46b0-826d-1e6fc849f0aa-etc-ovs\") pod \"ovn-controller-ovs-9vczq\" (UID: \"6b463255-a237-46b0-826d-1e6fc849f0aa\") " 
pod="openstack/ovn-controller-ovs-9vczq" Dec 02 07:42:54 crc kubenswrapper[4895]: I1202 07:42:54.561173 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6b463255-a237-46b0-826d-1e6fc849f0aa-var-run\") pod \"ovn-controller-ovs-9vczq\" (UID: \"6b463255-a237-46b0-826d-1e6fc849f0aa\") " pod="openstack/ovn-controller-ovs-9vczq" Dec 02 07:42:54 crc kubenswrapper[4895]: I1202 07:42:54.561209 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84116ead-6214-4d5f-98a3-c89b08cf1306-combined-ca-bundle\") pod \"ovn-controller-ftfwq\" (UID: \"84116ead-6214-4d5f-98a3-c89b08cf1306\") " pod="openstack/ovn-controller-ftfwq" Dec 02 07:42:54 crc kubenswrapper[4895]: I1202 07:42:54.566904 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-9vczq"] Dec 02 07:42:54 crc kubenswrapper[4895]: I1202 07:42:54.633430 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-nb-0\" (UID: \"93ac7640-b11c-48f4-b537-45bebe4af01b\") " pod="openstack/ovsdbserver-nb-0" Dec 02 07:42:54 crc kubenswrapper[4895]: I1202 07:42:54.662504 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6b463255-a237-46b0-826d-1e6fc849f0aa-var-run\") pod \"ovn-controller-ovs-9vczq\" (UID: \"6b463255-a237-46b0-826d-1e6fc849f0aa\") " pod="openstack/ovn-controller-ovs-9vczq" Dec 02 07:42:54 crc kubenswrapper[4895]: I1202 07:42:54.662631 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84116ead-6214-4d5f-98a3-c89b08cf1306-combined-ca-bundle\") pod \"ovn-controller-ftfwq\" (UID: 
\"84116ead-6214-4d5f-98a3-c89b08cf1306\") " pod="openstack/ovn-controller-ftfwq" Dec 02 07:42:54 crc kubenswrapper[4895]: I1202 07:42:54.662701 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/84116ead-6214-4d5f-98a3-c89b08cf1306-var-log-ovn\") pod \"ovn-controller-ftfwq\" (UID: \"84116ead-6214-4d5f-98a3-c89b08cf1306\") " pod="openstack/ovn-controller-ftfwq" Dec 02 07:42:54 crc kubenswrapper[4895]: I1202 07:42:54.662721 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8fl7\" (UniqueName: \"kubernetes.io/projected/84116ead-6214-4d5f-98a3-c89b08cf1306-kube-api-access-c8fl7\") pod \"ovn-controller-ftfwq\" (UID: \"84116ead-6214-4d5f-98a3-c89b08cf1306\") " pod="openstack/ovn-controller-ftfwq" Dec 02 07:42:54 crc kubenswrapper[4895]: I1202 07:42:54.662762 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6b463255-a237-46b0-826d-1e6fc849f0aa-scripts\") pod \"ovn-controller-ovs-9vczq\" (UID: \"6b463255-a237-46b0-826d-1e6fc849f0aa\") " pod="openstack/ovn-controller-ovs-9vczq" Dec 02 07:42:54 crc kubenswrapper[4895]: I1202 07:42:54.662780 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/84116ead-6214-4d5f-98a3-c89b08cf1306-scripts\") pod \"ovn-controller-ftfwq\" (UID: \"84116ead-6214-4d5f-98a3-c89b08cf1306\") " pod="openstack/ovn-controller-ftfwq" Dec 02 07:42:54 crc kubenswrapper[4895]: I1202 07:42:54.662834 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vwv6\" (UniqueName: \"kubernetes.io/projected/6b463255-a237-46b0-826d-1e6fc849f0aa-kube-api-access-8vwv6\") pod \"ovn-controller-ovs-9vczq\" (UID: \"6b463255-a237-46b0-826d-1e6fc849f0aa\") " pod="openstack/ovn-controller-ovs-9vczq" Dec 02 07:42:54 crc kubenswrapper[4895]: 
I1202 07:42:54.662854 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/84116ead-6214-4d5f-98a3-c89b08cf1306-var-run-ovn\") pod \"ovn-controller-ftfwq\" (UID: \"84116ead-6214-4d5f-98a3-c89b08cf1306\") " pod="openstack/ovn-controller-ftfwq" Dec 02 07:42:54 crc kubenswrapper[4895]: I1202 07:42:54.662875 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/84116ead-6214-4d5f-98a3-c89b08cf1306-var-run\") pod \"ovn-controller-ftfwq\" (UID: \"84116ead-6214-4d5f-98a3-c89b08cf1306\") " pod="openstack/ovn-controller-ftfwq" Dec 02 07:42:54 crc kubenswrapper[4895]: I1202 07:42:54.662924 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/6b463255-a237-46b0-826d-1e6fc849f0aa-var-log\") pod \"ovn-controller-ovs-9vczq\" (UID: \"6b463255-a237-46b0-826d-1e6fc849f0aa\") " pod="openstack/ovn-controller-ovs-9vczq" Dec 02 07:42:54 crc kubenswrapper[4895]: I1202 07:42:54.662941 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/6b463255-a237-46b0-826d-1e6fc849f0aa-var-lib\") pod \"ovn-controller-ovs-9vczq\" (UID: \"6b463255-a237-46b0-826d-1e6fc849f0aa\") " pod="openstack/ovn-controller-ovs-9vczq" Dec 02 07:42:54 crc kubenswrapper[4895]: I1202 07:42:54.662956 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/84116ead-6214-4d5f-98a3-c89b08cf1306-ovn-controller-tls-certs\") pod \"ovn-controller-ftfwq\" (UID: \"84116ead-6214-4d5f-98a3-c89b08cf1306\") " pod="openstack/ovn-controller-ftfwq" Dec 02 07:42:54 crc kubenswrapper[4895]: I1202 07:42:54.662994 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: 
\"kubernetes.io/host-path/6b463255-a237-46b0-826d-1e6fc849f0aa-etc-ovs\") pod \"ovn-controller-ovs-9vczq\" (UID: \"6b463255-a237-46b0-826d-1e6fc849f0aa\") " pod="openstack/ovn-controller-ovs-9vczq" Dec 02 07:42:54 crc kubenswrapper[4895]: I1202 07:42:54.663156 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6b463255-a237-46b0-826d-1e6fc849f0aa-var-run\") pod \"ovn-controller-ovs-9vczq\" (UID: \"6b463255-a237-46b0-826d-1e6fc849f0aa\") " pod="openstack/ovn-controller-ovs-9vczq" Dec 02 07:42:54 crc kubenswrapper[4895]: I1202 07:42:54.663326 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/6b463255-a237-46b0-826d-1e6fc849f0aa-etc-ovs\") pod \"ovn-controller-ovs-9vczq\" (UID: \"6b463255-a237-46b0-826d-1e6fc849f0aa\") " pod="openstack/ovn-controller-ovs-9vczq" Dec 02 07:42:54 crc kubenswrapper[4895]: I1202 07:42:54.663464 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/84116ead-6214-4d5f-98a3-c89b08cf1306-var-run-ovn\") pod \"ovn-controller-ftfwq\" (UID: \"84116ead-6214-4d5f-98a3-c89b08cf1306\") " pod="openstack/ovn-controller-ftfwq" Dec 02 07:42:54 crc kubenswrapper[4895]: I1202 07:42:54.663508 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/84116ead-6214-4d5f-98a3-c89b08cf1306-var-run\") pod \"ovn-controller-ftfwq\" (UID: \"84116ead-6214-4d5f-98a3-c89b08cf1306\") " pod="openstack/ovn-controller-ftfwq" Dec 02 07:42:54 crc kubenswrapper[4895]: I1202 07:42:54.663665 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/6b463255-a237-46b0-826d-1e6fc849f0aa-var-log\") pod \"ovn-controller-ovs-9vczq\" (UID: \"6b463255-a237-46b0-826d-1e6fc849f0aa\") " pod="openstack/ovn-controller-ovs-9vczq" Dec 02 
07:42:54 crc kubenswrapper[4895]: I1202 07:42:54.663837 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/6b463255-a237-46b0-826d-1e6fc849f0aa-var-lib\") pod \"ovn-controller-ovs-9vczq\" (UID: \"6b463255-a237-46b0-826d-1e6fc849f0aa\") " pod="openstack/ovn-controller-ovs-9vczq" Dec 02 07:42:54 crc kubenswrapper[4895]: I1202 07:42:54.668620 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/84116ead-6214-4d5f-98a3-c89b08cf1306-var-log-ovn\") pod \"ovn-controller-ftfwq\" (UID: \"84116ead-6214-4d5f-98a3-c89b08cf1306\") " pod="openstack/ovn-controller-ftfwq" Dec 02 07:42:54 crc kubenswrapper[4895]: I1202 07:42:54.669833 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/84116ead-6214-4d5f-98a3-c89b08cf1306-scripts\") pod \"ovn-controller-ftfwq\" (UID: \"84116ead-6214-4d5f-98a3-c89b08cf1306\") " pod="openstack/ovn-controller-ftfwq" Dec 02 07:42:54 crc kubenswrapper[4895]: I1202 07:42:54.671274 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84116ead-6214-4d5f-98a3-c89b08cf1306-combined-ca-bundle\") pod \"ovn-controller-ftfwq\" (UID: \"84116ead-6214-4d5f-98a3-c89b08cf1306\") " pod="openstack/ovn-controller-ftfwq" Dec 02 07:42:54 crc kubenswrapper[4895]: I1202 07:42:54.672817 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6b463255-a237-46b0-826d-1e6fc849f0aa-scripts\") pod \"ovn-controller-ovs-9vczq\" (UID: \"6b463255-a237-46b0-826d-1e6fc849f0aa\") " pod="openstack/ovn-controller-ovs-9vczq" Dec 02 07:42:54 crc kubenswrapper[4895]: I1202 07:42:54.684802 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/84116ead-6214-4d5f-98a3-c89b08cf1306-ovn-controller-tls-certs\") pod \"ovn-controller-ftfwq\" (UID: \"84116ead-6214-4d5f-98a3-c89b08cf1306\") " pod="openstack/ovn-controller-ftfwq" Dec 02 07:42:54 crc kubenswrapper[4895]: I1202 07:42:54.734954 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8fl7\" (UniqueName: \"kubernetes.io/projected/84116ead-6214-4d5f-98a3-c89b08cf1306-kube-api-access-c8fl7\") pod \"ovn-controller-ftfwq\" (UID: \"84116ead-6214-4d5f-98a3-c89b08cf1306\") " pod="openstack/ovn-controller-ftfwq" Dec 02 07:42:54 crc kubenswrapper[4895]: I1202 07:42:54.737106 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vwv6\" (UniqueName: \"kubernetes.io/projected/6b463255-a237-46b0-826d-1e6fc849f0aa-kube-api-access-8vwv6\") pod \"ovn-controller-ovs-9vczq\" (UID: \"6b463255-a237-46b0-826d-1e6fc849f0aa\") " pod="openstack/ovn-controller-ovs-9vczq" Dec 02 07:42:54 crc kubenswrapper[4895]: I1202 07:42:54.771029 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ftfwq" Dec 02 07:42:54 crc kubenswrapper[4895]: I1202 07:42:54.868758 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-9vczq" Dec 02 07:42:54 crc kubenswrapper[4895]: I1202 07:42:54.889086 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 02 07:42:57 crc kubenswrapper[4895]: W1202 07:42:57.334422 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb15097a8_ac9a_4886_a839_272b662561c5.slice/crio-d34b4b01f58d4efb52d58e25e1b6d67170cdffcd12cc39dacc5cc3b16536ff67 WatchSource:0}: Error finding container d34b4b01f58d4efb52d58e25e1b6d67170cdffcd12cc39dacc5cc3b16536ff67: Status 404 returned error can't find the container with id d34b4b01f58d4efb52d58e25e1b6d67170cdffcd12cc39dacc5cc3b16536ff67 Dec 02 07:42:57 crc kubenswrapper[4895]: W1202 07:42:57.338527 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podace60b46_ed73_43ba_8d95_b81b03a6bd0a.slice/crio-ffc2b962d7c7f4ed2514cc2330646801262c0beffbad1e391ff22f88fe90cf93 WatchSource:0}: Error finding container ffc2b962d7c7f4ed2514cc2330646801262c0beffbad1e391ff22f88fe90cf93: Status 404 returned error can't find the container with id ffc2b962d7c7f4ed2514cc2330646801262c0beffbad1e391ff22f88fe90cf93 Dec 02 07:42:57 crc kubenswrapper[4895]: I1202 07:42:57.681030 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"b15097a8-ac9a-4886-a839-272b662561c5","Type":"ContainerStarted","Data":"d34b4b01f58d4efb52d58e25e1b6d67170cdffcd12cc39dacc5cc3b16536ff67"} Dec 02 07:42:57 crc kubenswrapper[4895]: I1202 07:42:57.683483 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"ace60b46-ed73-43ba-8d95-b81b03a6bd0a","Type":"ContainerStarted","Data":"ffc2b962d7c7f4ed2514cc2330646801262c0beffbad1e391ff22f88fe90cf93"} Dec 02 07:42:57 crc kubenswrapper[4895]: I1202 07:42:57.987825 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 02 07:42:57 crc kubenswrapper[4895]: I1202 07:42:57.989437 4895 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 02 07:42:57 crc kubenswrapper[4895]: I1202 07:42:57.991713 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-m6nxj" Dec 02 07:42:57 crc kubenswrapper[4895]: I1202 07:42:57.991761 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Dec 02 07:42:57 crc kubenswrapper[4895]: I1202 07:42:57.992109 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Dec 02 07:42:57 crc kubenswrapper[4895]: I1202 07:42:57.992426 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Dec 02 07:42:58 crc kubenswrapper[4895]: I1202 07:42:58.013398 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 02 07:42:58 crc kubenswrapper[4895]: I1202 07:42:58.084267 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a3bcb64-db25-4f04-8624-af10542e9f10-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"9a3bcb64-db25-4f04-8624-af10542e9f10\") " pod="openstack/ovsdbserver-sb-0" Dec 02 07:42:58 crc kubenswrapper[4895]: I1202 07:42:58.084326 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-sb-0\" (UID: \"9a3bcb64-db25-4f04-8624-af10542e9f10\") " pod="openstack/ovsdbserver-sb-0" Dec 02 07:42:58 crc kubenswrapper[4895]: I1202 07:42:58.084482 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9a3bcb64-db25-4f04-8624-af10542e9f10-scripts\") pod \"ovsdbserver-sb-0\" (UID: 
\"9a3bcb64-db25-4f04-8624-af10542e9f10\") " pod="openstack/ovsdbserver-sb-0" Dec 02 07:42:58 crc kubenswrapper[4895]: I1202 07:42:58.084550 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a3bcb64-db25-4f04-8624-af10542e9f10-config\") pod \"ovsdbserver-sb-0\" (UID: \"9a3bcb64-db25-4f04-8624-af10542e9f10\") " pod="openstack/ovsdbserver-sb-0" Dec 02 07:42:58 crc kubenswrapper[4895]: I1202 07:42:58.084645 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9a3bcb64-db25-4f04-8624-af10542e9f10-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"9a3bcb64-db25-4f04-8624-af10542e9f10\") " pod="openstack/ovsdbserver-sb-0" Dec 02 07:42:58 crc kubenswrapper[4895]: I1202 07:42:58.084711 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a3bcb64-db25-4f04-8624-af10542e9f10-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"9a3bcb64-db25-4f04-8624-af10542e9f10\") " pod="openstack/ovsdbserver-sb-0" Dec 02 07:42:58 crc kubenswrapper[4895]: I1202 07:42:58.084817 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79dp7\" (UniqueName: \"kubernetes.io/projected/9a3bcb64-db25-4f04-8624-af10542e9f10-kube-api-access-79dp7\") pod \"ovsdbserver-sb-0\" (UID: \"9a3bcb64-db25-4f04-8624-af10542e9f10\") " pod="openstack/ovsdbserver-sb-0" Dec 02 07:42:58 crc kubenswrapper[4895]: I1202 07:42:58.084866 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a3bcb64-db25-4f04-8624-af10542e9f10-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"9a3bcb64-db25-4f04-8624-af10542e9f10\") " 
pod="openstack/ovsdbserver-sb-0" Dec 02 07:42:58 crc kubenswrapper[4895]: I1202 07:42:58.186813 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79dp7\" (UniqueName: \"kubernetes.io/projected/9a3bcb64-db25-4f04-8624-af10542e9f10-kube-api-access-79dp7\") pod \"ovsdbserver-sb-0\" (UID: \"9a3bcb64-db25-4f04-8624-af10542e9f10\") " pod="openstack/ovsdbserver-sb-0" Dec 02 07:42:58 crc kubenswrapper[4895]: I1202 07:42:58.186866 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a3bcb64-db25-4f04-8624-af10542e9f10-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"9a3bcb64-db25-4f04-8624-af10542e9f10\") " pod="openstack/ovsdbserver-sb-0" Dec 02 07:42:58 crc kubenswrapper[4895]: I1202 07:42:58.186910 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a3bcb64-db25-4f04-8624-af10542e9f10-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"9a3bcb64-db25-4f04-8624-af10542e9f10\") " pod="openstack/ovsdbserver-sb-0" Dec 02 07:42:58 crc kubenswrapper[4895]: I1202 07:42:58.186940 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-sb-0\" (UID: \"9a3bcb64-db25-4f04-8624-af10542e9f10\") " pod="openstack/ovsdbserver-sb-0" Dec 02 07:42:58 crc kubenswrapper[4895]: I1202 07:42:58.186990 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9a3bcb64-db25-4f04-8624-af10542e9f10-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"9a3bcb64-db25-4f04-8624-af10542e9f10\") " pod="openstack/ovsdbserver-sb-0" Dec 02 07:42:58 crc kubenswrapper[4895]: I1202 07:42:58.187019 4895 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a3bcb64-db25-4f04-8624-af10542e9f10-config\") pod \"ovsdbserver-sb-0\" (UID: \"9a3bcb64-db25-4f04-8624-af10542e9f10\") " pod="openstack/ovsdbserver-sb-0" Dec 02 07:42:58 crc kubenswrapper[4895]: I1202 07:42:58.187048 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9a3bcb64-db25-4f04-8624-af10542e9f10-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"9a3bcb64-db25-4f04-8624-af10542e9f10\") " pod="openstack/ovsdbserver-sb-0" Dec 02 07:42:58 crc kubenswrapper[4895]: I1202 07:42:58.187084 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a3bcb64-db25-4f04-8624-af10542e9f10-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"9a3bcb64-db25-4f04-8624-af10542e9f10\") " pod="openstack/ovsdbserver-sb-0" Dec 02 07:42:58 crc kubenswrapper[4895]: I1202 07:42:58.187593 4895 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-sb-0\" (UID: \"9a3bcb64-db25-4f04-8624-af10542e9f10\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/ovsdbserver-sb-0" Dec 02 07:42:58 crc kubenswrapper[4895]: I1202 07:42:58.189264 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9a3bcb64-db25-4f04-8624-af10542e9f10-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"9a3bcb64-db25-4f04-8624-af10542e9f10\") " pod="openstack/ovsdbserver-sb-0" Dec 02 07:42:58 crc kubenswrapper[4895]: I1202 07:42:58.189776 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a3bcb64-db25-4f04-8624-af10542e9f10-config\") pod \"ovsdbserver-sb-0\" (UID: \"9a3bcb64-db25-4f04-8624-af10542e9f10\") " 
pod="openstack/ovsdbserver-sb-0" Dec 02 07:42:58 crc kubenswrapper[4895]: I1202 07:42:58.190022 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9a3bcb64-db25-4f04-8624-af10542e9f10-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"9a3bcb64-db25-4f04-8624-af10542e9f10\") " pod="openstack/ovsdbserver-sb-0" Dec 02 07:42:58 crc kubenswrapper[4895]: I1202 07:42:58.194664 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a3bcb64-db25-4f04-8624-af10542e9f10-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"9a3bcb64-db25-4f04-8624-af10542e9f10\") " pod="openstack/ovsdbserver-sb-0" Dec 02 07:42:58 crc kubenswrapper[4895]: I1202 07:42:58.196981 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a3bcb64-db25-4f04-8624-af10542e9f10-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"9a3bcb64-db25-4f04-8624-af10542e9f10\") " pod="openstack/ovsdbserver-sb-0" Dec 02 07:42:58 crc kubenswrapper[4895]: I1202 07:42:58.200335 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a3bcb64-db25-4f04-8624-af10542e9f10-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"9a3bcb64-db25-4f04-8624-af10542e9f10\") " pod="openstack/ovsdbserver-sb-0" Dec 02 07:42:58 crc kubenswrapper[4895]: I1202 07:42:58.205824 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79dp7\" (UniqueName: \"kubernetes.io/projected/9a3bcb64-db25-4f04-8624-af10542e9f10-kube-api-access-79dp7\") pod \"ovsdbserver-sb-0\" (UID: \"9a3bcb64-db25-4f04-8624-af10542e9f10\") " pod="openstack/ovsdbserver-sb-0" Dec 02 07:42:58 crc kubenswrapper[4895]: I1202 07:42:58.210527 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-sb-0\" (UID: \"9a3bcb64-db25-4f04-8624-af10542e9f10\") " pod="openstack/ovsdbserver-sb-0" Dec 02 07:42:58 crc kubenswrapper[4895]: I1202 07:42:58.330692 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 02 07:43:17 crc kubenswrapper[4895]: E1202 07:43:17.730474 4895 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-mariadb:current-podified" Dec 02 07:43:17 crc kubenswrapper[4895]: E1202 07:43:17.731343 4895 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:mysql-bootstrap,Image:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,Command:[bash /var/lib/operator-scripts/mysql_bootstrap.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:True,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:mysql-db,ReadOnly:false,MountPath:/var/lib/mysql,SubPath:mysql,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-default,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-generated,ReadOnly:false,MountPath:/var/lib/config-data/generated,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:operator-scripts,ReadOnly:true,MountPath:/var/lib/operator-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnl
y:nil,},VolumeMount{Name:kube-api-access-w5z9q,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-galera-0_openstack(38385316-fca8-41b0-b0ff-570a9cd71e8a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 07:43:17 crc kubenswrapper[4895]: E1202 07:43:17.733961 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstack-galera-0" podUID="38385316-fca8-41b0-b0ff-570a9cd71e8a" Dec 02 07:43:18 crc kubenswrapper[4895]: E1202 07:43:18.041184 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-mariadb:current-podified\\\"\"" pod="openstack/openstack-galera-0" podUID="38385316-fca8-41b0-b0ff-570a9cd71e8a" Dec 02 07:43:20 crc kubenswrapper[4895]: E1202 07:43:20.479699 4895 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified" 
Dec 02 07:43:20 crc kubenswrapper[4895]: E1202 07:43:20.480009 4895 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified" Dec 02 07:43:20 crc kubenswrapper[4895]: E1202 07:43:20.480370 4895 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mjhg9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cell1-server-0_openstack(0d1cb194-5325-40c2-bbd4-0a48821e12aa): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 07:43:20 crc 
kubenswrapper[4895]: E1202 07:43:20.481390 4895 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-m9r2n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io
/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-server-0_openstack(ca98cba7-4127-4d25-a139-1a42224331f2): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 07:43:20 crc kubenswrapper[4895]: E1202 07:43:20.481476 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-cell1-server-0" podUID="0d1cb194-5325-40c2-bbd4-0a48821e12aa" Dec 02 07:43:20 crc kubenswrapper[4895]: E1202 07:43:20.482813 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-server-0" podUID="ca98cba7-4127-4d25-a139-1a42224331f2" Dec 02 07:43:21 crc kubenswrapper[4895]: E1202 07:43:21.179674 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified\\\"\"" pod="openstack/rabbitmq-cell1-server-0" podUID="0d1cb194-5325-40c2-bbd4-0a48821e12aa" 
Dec 02 07:43:21 crc kubenswrapper[4895]: E1202 07:43:21.179987 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified\\\"\"" pod="openstack/rabbitmq-server-0" podUID="ca98cba7-4127-4d25-a139-1a42224331f2" Dec 02 07:43:21 crc kubenswrapper[4895]: E1202 07:43:21.675109 4895 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-memcached:current-podified" Dec 02 07:43:21 crc kubenswrapper[4895]: E1202 07:43:21.675319 4895 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:memcached,Image:quay.io/podified-antelope-centos9/openstack-memcached:current-podified,Command:[/usr/bin/dumb-init -- /usr/local/bin/kolla_start],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:memcached,HostPort:0,ContainerPort:11211,Protocol:TCP,HostIP:,},ContainerPort{Name:memcached-tls,HostPort:0,ContainerPort:11212,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:POD_IPS,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIPs,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:CONFIG_HASH,Value:n65ch5b8hddh586hdfhfch66fh7bh9bh579h76hb7h5f5h687h57bh65bhfbh98hf6h69h666h558h67dh688h5c6h544h64dhb7h5ddh5fdh658h4q,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/src,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,Rec
ursiveReadOnly:nil,},VolumeMount{Name:memcached-tls-certs,ReadOnly:true,MountPath:/var/lib/config-data/tls/certs/memcached.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:memcached-tls-certs,ReadOnly:true,MountPath:/var/lib/config-data/tls/private/memcached.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zcmp2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 11211 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 11211 },Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42457,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42457,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod memcached-0_openstack(b15097a8-ac9a-4886-a839-272b662561c5): ErrImagePull: rpc error: code = Canceled desc = copying 
config: context canceled" logger="UnhandledError" Dec 02 07:43:21 crc kubenswrapper[4895]: E1202 07:43:21.676680 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"memcached\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/memcached-0" podUID="b15097a8-ac9a-4886-a839-272b662561c5" Dec 02 07:43:22 crc kubenswrapper[4895]: E1202 07:43:22.162273 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"memcached\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-memcached:current-podified\\\"\"" pod="openstack/memcached-0" podUID="b15097a8-ac9a-4886-a839-272b662561c5" Dec 02 07:43:22 crc kubenswrapper[4895]: E1202 07:43:22.439270 4895 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 02 07:43:22 crc kubenswrapper[4895]: E1202 07:43:22.440261 4895 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wklbl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-666b6646f7-lvv8z_openstack(3b019ba5-e2aa-4ac7-a3a6-02e619195935): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 07:43:22 crc kubenswrapper[4895]: E1202 07:43:22.442069 4895 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-666b6646f7-lvv8z" podUID="3b019ba5-e2aa-4ac7-a3a6-02e619195935" Dec 02 07:43:22 crc kubenswrapper[4895]: E1202 07:43:22.459258 4895 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 02 07:43:22 crc kubenswrapper[4895]: E1202 07:43:22.459452 4895 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kcj6w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-57d769cc4f-l7vgw_openstack(56de784f-037a-486b-8e7b-51d62e0dd0ad): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 07:43:22 crc kubenswrapper[4895]: E1202 07:43:22.461072 4895 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-57d769cc4f-l7vgw" podUID="56de784f-037a-486b-8e7b-51d62e0dd0ad" Dec 02 07:43:22 crc kubenswrapper[4895]: E1202 07:43:22.469088 4895 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 02 07:43:22 crc kubenswrapper[4895]: E1202 07:43:22.469326 4895 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-72shz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-4l528_openstack(a6eb81c1-35cc-4330-9889-e37a864f1217): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 07:43:22 crc kubenswrapper[4895]: E1202 07:43:22.470925 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-675f4bcbfc-4l528" podUID="a6eb81c1-35cc-4330-9889-e37a864f1217" Dec 02 07:43:22 crc kubenswrapper[4895]: E1202 07:43:22.495435 4895 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 02 07:43:22 crc kubenswrapper[4895]: E1202 07:43:22.495682 4895 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7xhxj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePul
lPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-rn4nh_openstack(f2d3a26e-3783-4887-bd28-cd19c12a3111): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 07:43:22 crc kubenswrapper[4895]: E1202 07:43:22.497047 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-rn4nh" podUID="f2d3a26e-3783-4887-bd28-cd19c12a3111" Dec 02 07:43:23 crc kubenswrapper[4895]: I1202 07:43:23.175251 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"ace60b46-ed73-43ba-8d95-b81b03a6bd0a","Type":"ContainerStarted","Data":"7e1fc19a4fb8bbfeda2cc4b937706c5f0d0cf2fabcee3636fcd1a49acddeb02d"} Dec 02 07:43:23 crc kubenswrapper[4895]: E1202 07:43:23.177592 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-666b6646f7-lvv8z" podUID="3b019ba5-e2aa-4ac7-a3a6-02e619195935" Dec 02 07:43:23 crc kubenswrapper[4895]: E1202 07:43:23.177641 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" 
with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-57d769cc4f-l7vgw" podUID="56de784f-037a-486b-8e7b-51d62e0dd0ad" Dec 02 07:43:23 crc kubenswrapper[4895]: I1202 07:43:23.381051 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ftfwq"] Dec 02 07:43:23 crc kubenswrapper[4895]: I1202 07:43:23.574155 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 02 07:43:23 crc kubenswrapper[4895]: I1202 07:43:23.583078 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 02 07:43:23 crc kubenswrapper[4895]: I1202 07:43:23.601239 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-9vczq"] Dec 02 07:43:23 crc kubenswrapper[4895]: W1202 07:43:23.604416 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod93ac7640_b11c_48f4_b537_45bebe4af01b.slice/crio-9898ced809f4f02ded24f28135de90d0170c28170f7395759aab814117ee8368 WatchSource:0}: Error finding container 9898ced809f4f02ded24f28135de90d0170c28170f7395759aab814117ee8368: Status 404 returned error can't find the container with id 9898ced809f4f02ded24f28135de90d0170c28170f7395759aab814117ee8368 Dec 02 07:43:23 crc kubenswrapper[4895]: I1202 07:43:23.723462 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-rn4nh" Dec 02 07:43:23 crc kubenswrapper[4895]: I1202 07:43:23.762202 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-4l528" Dec 02 07:43:23 crc kubenswrapper[4895]: I1202 07:43:23.812371 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f2d3a26e-3783-4887-bd28-cd19c12a3111-dns-svc\") pod \"f2d3a26e-3783-4887-bd28-cd19c12a3111\" (UID: \"f2d3a26e-3783-4887-bd28-cd19c12a3111\") " Dec 02 07:43:23 crc kubenswrapper[4895]: I1202 07:43:23.812792 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2d3a26e-3783-4887-bd28-cd19c12a3111-config\") pod \"f2d3a26e-3783-4887-bd28-cd19c12a3111\" (UID: \"f2d3a26e-3783-4887-bd28-cd19c12a3111\") " Dec 02 07:43:23 crc kubenswrapper[4895]: I1202 07:43:23.812907 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7xhxj\" (UniqueName: \"kubernetes.io/projected/f2d3a26e-3783-4887-bd28-cd19c12a3111-kube-api-access-7xhxj\") pod \"f2d3a26e-3783-4887-bd28-cd19c12a3111\" (UID: \"f2d3a26e-3783-4887-bd28-cd19c12a3111\") " Dec 02 07:43:23 crc kubenswrapper[4895]: I1202 07:43:23.814160 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2d3a26e-3783-4887-bd28-cd19c12a3111-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f2d3a26e-3783-4887-bd28-cd19c12a3111" (UID: "f2d3a26e-3783-4887-bd28-cd19c12a3111"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:43:23 crc kubenswrapper[4895]: I1202 07:43:23.814594 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2d3a26e-3783-4887-bd28-cd19c12a3111-config" (OuterVolumeSpecName: "config") pod "f2d3a26e-3783-4887-bd28-cd19c12a3111" (UID: "f2d3a26e-3783-4887-bd28-cd19c12a3111"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:43:23 crc kubenswrapper[4895]: I1202 07:43:23.819279 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2d3a26e-3783-4887-bd28-cd19c12a3111-kube-api-access-7xhxj" (OuterVolumeSpecName: "kube-api-access-7xhxj") pod "f2d3a26e-3783-4887-bd28-cd19c12a3111" (UID: "f2d3a26e-3783-4887-bd28-cd19c12a3111"). InnerVolumeSpecName "kube-api-access-7xhxj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:43:23 crc kubenswrapper[4895]: I1202 07:43:23.914579 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-72shz\" (UniqueName: \"kubernetes.io/projected/a6eb81c1-35cc-4330-9889-e37a864f1217-kube-api-access-72shz\") pod \"a6eb81c1-35cc-4330-9889-e37a864f1217\" (UID: \"a6eb81c1-35cc-4330-9889-e37a864f1217\") " Dec 02 07:43:23 crc kubenswrapper[4895]: I1202 07:43:23.914947 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6eb81c1-35cc-4330-9889-e37a864f1217-config\") pod \"a6eb81c1-35cc-4330-9889-e37a864f1217\" (UID: \"a6eb81c1-35cc-4330-9889-e37a864f1217\") " Dec 02 07:43:23 crc kubenswrapper[4895]: I1202 07:43:23.915538 4895 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f2d3a26e-3783-4887-bd28-cd19c12a3111-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 02 07:43:23 crc kubenswrapper[4895]: I1202 07:43:23.915628 4895 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2d3a26e-3783-4887-bd28-cd19c12a3111-config\") on node \"crc\" DevicePath \"\"" Dec 02 07:43:23 crc kubenswrapper[4895]: I1202 07:43:23.915733 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7xhxj\" (UniqueName: \"kubernetes.io/projected/f2d3a26e-3783-4887-bd28-cd19c12a3111-kube-api-access-7xhxj\") on node \"crc\" 
DevicePath \"\"" Dec 02 07:43:23 crc kubenswrapper[4895]: I1202 07:43:23.915803 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6eb81c1-35cc-4330-9889-e37a864f1217-config" (OuterVolumeSpecName: "config") pod "a6eb81c1-35cc-4330-9889-e37a864f1217" (UID: "a6eb81c1-35cc-4330-9889-e37a864f1217"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:43:23 crc kubenswrapper[4895]: I1202 07:43:23.919216 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6eb81c1-35cc-4330-9889-e37a864f1217-kube-api-access-72shz" (OuterVolumeSpecName: "kube-api-access-72shz") pod "a6eb81c1-35cc-4330-9889-e37a864f1217" (UID: "a6eb81c1-35cc-4330-9889-e37a864f1217"). InnerVolumeSpecName "kube-api-access-72shz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:43:24 crc kubenswrapper[4895]: I1202 07:43:24.017658 4895 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6eb81c1-35cc-4330-9889-e37a864f1217-config\") on node \"crc\" DevicePath \"\"" Dec 02 07:43:24 crc kubenswrapper[4895]: I1202 07:43:24.017698 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-72shz\" (UniqueName: \"kubernetes.io/projected/a6eb81c1-35cc-4330-9889-e37a864f1217-kube-api-access-72shz\") on node \"crc\" DevicePath \"\"" Dec 02 07:43:24 crc kubenswrapper[4895]: I1202 07:43:24.197449 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-4l528" Dec 02 07:43:24 crc kubenswrapper[4895]: I1202 07:43:24.197452 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-4l528" event={"ID":"a6eb81c1-35cc-4330-9889-e37a864f1217","Type":"ContainerDied","Data":"94bec48759c51a9c5112d163ba3e367424c7c2c0e75ee8850fd4143e756e7f7e"} Dec 02 07:43:24 crc kubenswrapper[4895]: I1202 07:43:24.206011 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-rn4nh" event={"ID":"f2d3a26e-3783-4887-bd28-cd19c12a3111","Type":"ContainerDied","Data":"20c8184c2aeff6fd413ee05613d69720967a50d26ecf971b003f988f02b73088"} Dec 02 07:43:24 crc kubenswrapper[4895]: I1202 07:43:24.206093 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-rn4nh" Dec 02 07:43:24 crc kubenswrapper[4895]: I1202 07:43:24.210051 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-9vczq" event={"ID":"6b463255-a237-46b0-826d-1e6fc849f0aa","Type":"ContainerStarted","Data":"25ce4a1dff9b5389e15afb76acc5a9e737daad17a6b331ac30f0a60b0dd0a16e"} Dec 02 07:43:24 crc kubenswrapper[4895]: I1202 07:43:24.213577 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"93ac7640-b11c-48f4-b537-45bebe4af01b","Type":"ContainerStarted","Data":"9898ced809f4f02ded24f28135de90d0170c28170f7395759aab814117ee8368"} Dec 02 07:43:24 crc kubenswrapper[4895]: I1202 07:43:24.219690 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ftfwq" event={"ID":"84116ead-6214-4d5f-98a3-c89b08cf1306","Type":"ContainerStarted","Data":"a9282f39595827e587c997c71e195d6fcced31850b4abd4a89e96be2134beb38"} Dec 02 07:43:24 crc kubenswrapper[4895]: I1202 07:43:24.231670 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" 
event={"ID":"a6606596-020b-4584-b9d2-8606a794a726","Type":"ContainerStarted","Data":"514d12a47b7555c807c0fc7cc849d58d67bebe8585b02344ceacecd024355d43"} Dec 02 07:43:24 crc kubenswrapper[4895]: I1202 07:43:24.251054 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 02 07:43:24 crc kubenswrapper[4895]: I1202 07:43:24.285860 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-4l528"] Dec 02 07:43:24 crc kubenswrapper[4895]: I1202 07:43:24.296960 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-4l528"] Dec 02 07:43:24 crc kubenswrapper[4895]: I1202 07:43:24.314908 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-rn4nh"] Dec 02 07:43:24 crc kubenswrapper[4895]: I1202 07:43:24.321381 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-rn4nh"] Dec 02 07:43:24 crc kubenswrapper[4895]: W1202 07:43:24.337167 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9a3bcb64_db25_4f04_8624_af10542e9f10.slice/crio-bff0285bdb4d1c719faa987576935a0217bbc0dd92601edd9e2e954e3b65b35b WatchSource:0}: Error finding container bff0285bdb4d1c719faa987576935a0217bbc0dd92601edd9e2e954e3b65b35b: Status 404 returned error can't find the container with id bff0285bdb4d1c719faa987576935a0217bbc0dd92601edd9e2e954e3b65b35b Dec 02 07:43:25 crc kubenswrapper[4895]: I1202 07:43:25.153003 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6eb81c1-35cc-4330-9889-e37a864f1217" path="/var/lib/kubelet/pods/a6eb81c1-35cc-4330-9889-e37a864f1217/volumes" Dec 02 07:43:25 crc kubenswrapper[4895]: I1202 07:43:25.153781 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2d3a26e-3783-4887-bd28-cd19c12a3111" path="/var/lib/kubelet/pods/f2d3a26e-3783-4887-bd28-cd19c12a3111/volumes" 
Dec 02 07:43:25 crc kubenswrapper[4895]: I1202 07:43:25.241229 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"9a3bcb64-db25-4f04-8624-af10542e9f10","Type":"ContainerStarted","Data":"bff0285bdb4d1c719faa987576935a0217bbc0dd92601edd9e2e954e3b65b35b"} Dec 02 07:43:27 crc kubenswrapper[4895]: I1202 07:43:27.263183 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"ace60b46-ed73-43ba-8d95-b81b03a6bd0a","Type":"ContainerDied","Data":"7e1fc19a4fb8bbfeda2cc4b937706c5f0d0cf2fabcee3636fcd1a49acddeb02d"} Dec 02 07:43:27 crc kubenswrapper[4895]: I1202 07:43:27.263068 4895 generic.go:334] "Generic (PLEG): container finished" podID="ace60b46-ed73-43ba-8d95-b81b03a6bd0a" containerID="7e1fc19a4fb8bbfeda2cc4b937706c5f0d0cf2fabcee3636fcd1a49acddeb02d" exitCode=0 Dec 02 07:43:28 crc kubenswrapper[4895]: I1202 07:43:28.272620 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"93ac7640-b11c-48f4-b537-45bebe4af01b","Type":"ContainerStarted","Data":"37e31157e4862cb11e5acde395d6bd0df5dd7a9d1818a4a02968a675039e325e"} Dec 02 07:43:28 crc kubenswrapper[4895]: I1202 07:43:28.274932 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"9a3bcb64-db25-4f04-8624-af10542e9f10","Type":"ContainerStarted","Data":"00f29c5ae0bb6e7bc18499f6d66bee4cc18c2d48981f4cf8697279c90c4396ff"} Dec 02 07:43:28 crc kubenswrapper[4895]: I1202 07:43:28.276584 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ftfwq" event={"ID":"84116ead-6214-4d5f-98a3-c89b08cf1306","Type":"ContainerStarted","Data":"71bd075d30ee48222b192e19ea3e173bfae0e94488a7e8ecbc6fd0d9989b9830"} Dec 02 07:43:28 crc kubenswrapper[4895]: I1202 07:43:28.276790 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ftfwq" Dec 02 07:43:28 crc kubenswrapper[4895]: I1202 
07:43:28.278676 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"a6606596-020b-4584-b9d2-8606a794a726","Type":"ContainerStarted","Data":"96c28dfc42cb25542267c38bbb8908b3a9df5fe26c5c721fc8d65994c2f6e015"} Dec 02 07:43:28 crc kubenswrapper[4895]: I1202 07:43:28.278876 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Dec 02 07:43:28 crc kubenswrapper[4895]: I1202 07:43:28.281988 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"ace60b46-ed73-43ba-8d95-b81b03a6bd0a","Type":"ContainerStarted","Data":"8a2f68e796838e87e45698b40183c455794d730caf5af19a07c35fd150b09fe1"} Dec 02 07:43:28 crc kubenswrapper[4895]: I1202 07:43:28.284281 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-9vczq" event={"ID":"6b463255-a237-46b0-826d-1e6fc849f0aa","Type":"ContainerStarted","Data":"0ede9d97beff32f5baa392f8045826873652cfccb4629442677b4573fc94434a"} Dec 02 07:43:28 crc kubenswrapper[4895]: I1202 07:43:28.298802 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ftfwq" podStartSLOduration=29.865479525 podStartE2EDuration="34.298727783s" podCreationTimestamp="2025-12-02 07:42:54 +0000 UTC" firstStartedPulling="2025-12-02 07:43:23.397560071 +0000 UTC m=+1214.568419684" lastFinishedPulling="2025-12-02 07:43:27.830808329 +0000 UTC m=+1219.001667942" observedRunningTime="2025-12-02 07:43:28.292865362 +0000 UTC m=+1219.463724975" watchObservedRunningTime="2025-12-02 07:43:28.298727783 +0000 UTC m=+1219.469587396" Dec 02 07:43:28 crc kubenswrapper[4895]: I1202 07:43:28.320953 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=16.229589329 podStartE2EDuration="41.320922878s" podCreationTimestamp="2025-12-02 07:42:47 +0000 UTC" 
firstStartedPulling="2025-12-02 07:42:57.344059592 +0000 UTC m=+1188.514919205" lastFinishedPulling="2025-12-02 07:43:22.435393141 +0000 UTC m=+1213.606252754" observedRunningTime="2025-12-02 07:43:28.313593253 +0000 UTC m=+1219.484452886" watchObservedRunningTime="2025-12-02 07:43:28.320922878 +0000 UTC m=+1219.491782511" Dec 02 07:43:28 crc kubenswrapper[4895]: I1202 07:43:28.352997 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=34.052058246 podStartE2EDuration="38.352977818s" podCreationTimestamp="2025-12-02 07:42:50 +0000 UTC" firstStartedPulling="2025-12-02 07:43:23.602369474 +0000 UTC m=+1214.773229087" lastFinishedPulling="2025-12-02 07:43:27.903289046 +0000 UTC m=+1219.074148659" observedRunningTime="2025-12-02 07:43:28.349836981 +0000 UTC m=+1219.520696604" watchObservedRunningTime="2025-12-02 07:43:28.352977818 +0000 UTC m=+1219.523837431" Dec 02 07:43:29 crc kubenswrapper[4895]: I1202 07:43:29.089653 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Dec 02 07:43:29 crc kubenswrapper[4895]: I1202 07:43:29.089725 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Dec 02 07:43:29 crc kubenswrapper[4895]: I1202 07:43:29.297407 4895 generic.go:334] "Generic (PLEG): container finished" podID="6b463255-a237-46b0-826d-1e6fc849f0aa" containerID="0ede9d97beff32f5baa392f8045826873652cfccb4629442677b4573fc94434a" exitCode=0 Dec 02 07:43:29 crc kubenswrapper[4895]: I1202 07:43:29.297603 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-9vczq" event={"ID":"6b463255-a237-46b0-826d-1e6fc849f0aa","Type":"ContainerDied","Data":"0ede9d97beff32f5baa392f8045826873652cfccb4629442677b4573fc94434a"} Dec 02 07:43:30 crc kubenswrapper[4895]: I1202 07:43:30.311352 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-controller-ovs-9vczq" event={"ID":"6b463255-a237-46b0-826d-1e6fc849f0aa","Type":"ContainerStarted","Data":"7cc0795476568fadf42dedac218a2fe6065e25675de99e96dafc929b4ec7b9ff"} Dec 02 07:43:30 crc kubenswrapper[4895]: I1202 07:43:30.311808 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-9vczq" event={"ID":"6b463255-a237-46b0-826d-1e6fc849f0aa","Type":"ContainerStarted","Data":"6aa3fa36a02a2278bdcd14541013304e85cab914b53ba0de74342a5fb50d00cd"} Dec 02 07:43:30 crc kubenswrapper[4895]: I1202 07:43:30.311836 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-9vczq" Dec 02 07:43:30 crc kubenswrapper[4895]: I1202 07:43:30.340752 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-9vczq" podStartSLOduration=32.122462875 podStartE2EDuration="36.340713327s" podCreationTimestamp="2025-12-02 07:42:54 +0000 UTC" firstStartedPulling="2025-12-02 07:43:23.611695611 +0000 UTC m=+1214.782555224" lastFinishedPulling="2025-12-02 07:43:27.829946073 +0000 UTC m=+1219.000805676" observedRunningTime="2025-12-02 07:43:30.333470543 +0000 UTC m=+1221.504330196" watchObservedRunningTime="2025-12-02 07:43:30.340713327 +0000 UTC m=+1221.511572940" Dec 02 07:43:31 crc kubenswrapper[4895]: I1202 07:43:31.320120 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-9vczq" Dec 02 07:43:33 crc kubenswrapper[4895]: I1202 07:43:33.339425 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"38385316-fca8-41b0-b0ff-570a9cd71e8a","Type":"ContainerStarted","Data":"70b9062cebf51e2cb33bacac0c59956df85ea562e2a492f0a1c5926a2d8af62e"} Dec 02 07:43:33 crc kubenswrapper[4895]: I1202 07:43:33.342938 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" 
event={"ID":"93ac7640-b11c-48f4-b537-45bebe4af01b","Type":"ContainerStarted","Data":"702d499c8eb77b3784f109189f3605813a2864341fb22907bb8c80622cf297f0"} Dec 02 07:43:33 crc kubenswrapper[4895]: I1202 07:43:33.345954 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"9a3bcb64-db25-4f04-8624-af10542e9f10","Type":"ContainerStarted","Data":"f92e86e4ef56e11c6550ddfd03d9e3a46bb2f030d0256069562686b8ad550a7f"} Dec 02 07:43:33 crc kubenswrapper[4895]: I1202 07:43:33.409590 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=31.112647876 podStartE2EDuration="40.409551789s" podCreationTimestamp="2025-12-02 07:42:53 +0000 UTC" firstStartedPulling="2025-12-02 07:43:23.611145545 +0000 UTC m=+1214.782005158" lastFinishedPulling="2025-12-02 07:43:32.908049458 +0000 UTC m=+1224.078909071" observedRunningTime="2025-12-02 07:43:33.400460938 +0000 UTC m=+1224.571320561" watchObservedRunningTime="2025-12-02 07:43:33.409551789 +0000 UTC m=+1224.580411442" Dec 02 07:43:33 crc kubenswrapper[4895]: I1202 07:43:33.430188 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=28.875502873 podStartE2EDuration="37.430163364s" podCreationTimestamp="2025-12-02 07:42:56 +0000 UTC" firstStartedPulling="2025-12-02 07:43:24.339967302 +0000 UTC m=+1215.510826915" lastFinishedPulling="2025-12-02 07:43:32.894627793 +0000 UTC m=+1224.065487406" observedRunningTime="2025-12-02 07:43:33.421486966 +0000 UTC m=+1224.592346589" watchObservedRunningTime="2025-12-02 07:43:33.430163364 +0000 UTC m=+1224.601022987" Dec 02 07:43:33 crc kubenswrapper[4895]: I1202 07:43:33.890057 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Dec 02 07:43:33 crc kubenswrapper[4895]: I1202 07:43:33.941445 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/ovsdbserver-nb-0" Dec 02 07:43:34 crc kubenswrapper[4895]: I1202 07:43:34.331466 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Dec 02 07:43:34 crc kubenswrapper[4895]: I1202 07:43:34.359851 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"0d1cb194-5325-40c2-bbd4-0a48821e12aa","Type":"ContainerStarted","Data":"39340c4fd973c571bd458064ab8a8ad372022cf6e584357ba7e6b31eaf6221a0"} Dec 02 07:43:34 crc kubenswrapper[4895]: I1202 07:43:34.360275 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Dec 02 07:43:34 crc kubenswrapper[4895]: I1202 07:43:34.387536 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Dec 02 07:43:34 crc kubenswrapper[4895]: I1202 07:43:34.421265 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Dec 02 07:43:34 crc kubenswrapper[4895]: I1202 07:43:34.700865 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-lvv8z"] Dec 02 07:43:34 crc kubenswrapper[4895]: I1202 07:43:34.738666 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-qvrkm"] Dec 02 07:43:34 crc kubenswrapper[4895]: I1202 07:43:34.740297 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-qvrkm" Dec 02 07:43:34 crc kubenswrapper[4895]: I1202 07:43:34.743292 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Dec 02 07:43:34 crc kubenswrapper[4895]: I1202 07:43:34.751142 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-5shlf"] Dec 02 07:43:34 crc kubenswrapper[4895]: I1202 07:43:34.753128 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-5shlf" Dec 02 07:43:34 crc kubenswrapper[4895]: I1202 07:43:34.755146 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Dec 02 07:43:34 crc kubenswrapper[4895]: I1202 07:43:34.760315 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-qvrkm"] Dec 02 07:43:34 crc kubenswrapper[4895]: I1202 07:43:34.796362 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-5shlf"] Dec 02 07:43:34 crc kubenswrapper[4895]: I1202 07:43:34.835987 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dee0d598-f287-4a9d-975a-2dba3c9eb93e-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-5shlf\" (UID: \"dee0d598-f287-4a9d-975a-2dba3c9eb93e\") " pod="openstack/dnsmasq-dns-7fd796d7df-5shlf" Dec 02 07:43:34 crc kubenswrapper[4895]: I1202 07:43:34.836464 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8206622d-b224-4744-9358-ad7c10d98ca1-config\") pod \"ovn-controller-metrics-qvrkm\" (UID: \"8206622d-b224-4744-9358-ad7c10d98ca1\") " pod="openstack/ovn-controller-metrics-qvrkm" Dec 02 07:43:34 crc kubenswrapper[4895]: I1202 07:43:34.836505 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dcxlq\" (UniqueName: \"kubernetes.io/projected/8206622d-b224-4744-9358-ad7c10d98ca1-kube-api-access-dcxlq\") pod \"ovn-controller-metrics-qvrkm\" (UID: \"8206622d-b224-4744-9358-ad7c10d98ca1\") " pod="openstack/ovn-controller-metrics-qvrkm" Dec 02 07:43:34 crc kubenswrapper[4895]: I1202 07:43:34.836539 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/dee0d598-f287-4a9d-975a-2dba3c9eb93e-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-5shlf\" (UID: \"dee0d598-f287-4a9d-975a-2dba3c9eb93e\") " pod="openstack/dnsmasq-dns-7fd796d7df-5shlf" Dec 02 07:43:34 crc kubenswrapper[4895]: I1202 07:43:34.836590 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/8206622d-b224-4744-9358-ad7c10d98ca1-ovn-rundir\") pod \"ovn-controller-metrics-qvrkm\" (UID: \"8206622d-b224-4744-9358-ad7c10d98ca1\") " pod="openstack/ovn-controller-metrics-qvrkm" Dec 02 07:43:34 crc kubenswrapper[4895]: I1202 07:43:34.836632 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/8206622d-b224-4744-9358-ad7c10d98ca1-ovs-rundir\") pod \"ovn-controller-metrics-qvrkm\" (UID: \"8206622d-b224-4744-9358-ad7c10d98ca1\") " pod="openstack/ovn-controller-metrics-qvrkm" Dec 02 07:43:34 crc kubenswrapper[4895]: I1202 07:43:34.836668 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dee0d598-f287-4a9d-975a-2dba3c9eb93e-config\") pod \"dnsmasq-dns-7fd796d7df-5shlf\" (UID: \"dee0d598-f287-4a9d-975a-2dba3c9eb93e\") " pod="openstack/dnsmasq-dns-7fd796d7df-5shlf" Dec 02 07:43:34 crc kubenswrapper[4895]: I1202 07:43:34.836712 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kh892\" (UniqueName: \"kubernetes.io/projected/dee0d598-f287-4a9d-975a-2dba3c9eb93e-kube-api-access-kh892\") pod \"dnsmasq-dns-7fd796d7df-5shlf\" (UID: \"dee0d598-f287-4a9d-975a-2dba3c9eb93e\") " pod="openstack/dnsmasq-dns-7fd796d7df-5shlf" Dec 02 07:43:34 crc kubenswrapper[4895]: I1202 07:43:34.836820 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8206622d-b224-4744-9358-ad7c10d98ca1-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-qvrkm\" (UID: \"8206622d-b224-4744-9358-ad7c10d98ca1\") " pod="openstack/ovn-controller-metrics-qvrkm" Dec 02 07:43:34 crc kubenswrapper[4895]: I1202 07:43:34.836858 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8206622d-b224-4744-9358-ad7c10d98ca1-combined-ca-bundle\") pod \"ovn-controller-metrics-qvrkm\" (UID: \"8206622d-b224-4744-9358-ad7c10d98ca1\") " pod="openstack/ovn-controller-metrics-qvrkm" Dec 02 07:43:34 crc kubenswrapper[4895]: I1202 07:43:34.939971 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/8206622d-b224-4744-9358-ad7c10d98ca1-ovn-rundir\") pod \"ovn-controller-metrics-qvrkm\" (UID: \"8206622d-b224-4744-9358-ad7c10d98ca1\") " pod="openstack/ovn-controller-metrics-qvrkm" Dec 02 07:43:34 crc kubenswrapper[4895]: I1202 07:43:34.940043 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/8206622d-b224-4744-9358-ad7c10d98ca1-ovs-rundir\") pod \"ovn-controller-metrics-qvrkm\" (UID: \"8206622d-b224-4744-9358-ad7c10d98ca1\") " pod="openstack/ovn-controller-metrics-qvrkm" Dec 02 07:43:34 crc kubenswrapper[4895]: I1202 07:43:34.940086 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dee0d598-f287-4a9d-975a-2dba3c9eb93e-config\") pod \"dnsmasq-dns-7fd796d7df-5shlf\" (UID: \"dee0d598-f287-4a9d-975a-2dba3c9eb93e\") " pod="openstack/dnsmasq-dns-7fd796d7df-5shlf" Dec 02 07:43:34 crc kubenswrapper[4895]: I1202 07:43:34.940124 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kh892\" (UniqueName: 
\"kubernetes.io/projected/dee0d598-f287-4a9d-975a-2dba3c9eb93e-kube-api-access-kh892\") pod \"dnsmasq-dns-7fd796d7df-5shlf\" (UID: \"dee0d598-f287-4a9d-975a-2dba3c9eb93e\") " pod="openstack/dnsmasq-dns-7fd796d7df-5shlf" Dec 02 07:43:34 crc kubenswrapper[4895]: I1202 07:43:34.940184 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8206622d-b224-4744-9358-ad7c10d98ca1-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-qvrkm\" (UID: \"8206622d-b224-4744-9358-ad7c10d98ca1\") " pod="openstack/ovn-controller-metrics-qvrkm" Dec 02 07:43:34 crc kubenswrapper[4895]: I1202 07:43:34.940210 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8206622d-b224-4744-9358-ad7c10d98ca1-combined-ca-bundle\") pod \"ovn-controller-metrics-qvrkm\" (UID: \"8206622d-b224-4744-9358-ad7c10d98ca1\") " pod="openstack/ovn-controller-metrics-qvrkm" Dec 02 07:43:34 crc kubenswrapper[4895]: I1202 07:43:34.940236 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dee0d598-f287-4a9d-975a-2dba3c9eb93e-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-5shlf\" (UID: \"dee0d598-f287-4a9d-975a-2dba3c9eb93e\") " pod="openstack/dnsmasq-dns-7fd796d7df-5shlf" Dec 02 07:43:34 crc kubenswrapper[4895]: I1202 07:43:34.940267 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8206622d-b224-4744-9358-ad7c10d98ca1-config\") pod \"ovn-controller-metrics-qvrkm\" (UID: \"8206622d-b224-4744-9358-ad7c10d98ca1\") " pod="openstack/ovn-controller-metrics-qvrkm" Dec 02 07:43:34 crc kubenswrapper[4895]: I1202 07:43:34.940289 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dcxlq\" (UniqueName: 
\"kubernetes.io/projected/8206622d-b224-4744-9358-ad7c10d98ca1-kube-api-access-dcxlq\") pod \"ovn-controller-metrics-qvrkm\" (UID: \"8206622d-b224-4744-9358-ad7c10d98ca1\") " pod="openstack/ovn-controller-metrics-qvrkm" Dec 02 07:43:34 crc kubenswrapper[4895]: I1202 07:43:34.940317 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dee0d598-f287-4a9d-975a-2dba3c9eb93e-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-5shlf\" (UID: \"dee0d598-f287-4a9d-975a-2dba3c9eb93e\") " pod="openstack/dnsmasq-dns-7fd796d7df-5shlf" Dec 02 07:43:34 crc kubenswrapper[4895]: I1202 07:43:34.941395 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dee0d598-f287-4a9d-975a-2dba3c9eb93e-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-5shlf\" (UID: \"dee0d598-f287-4a9d-975a-2dba3c9eb93e\") " pod="openstack/dnsmasq-dns-7fd796d7df-5shlf" Dec 02 07:43:34 crc kubenswrapper[4895]: I1202 07:43:34.941937 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/8206622d-b224-4744-9358-ad7c10d98ca1-ovs-rundir\") pod \"ovn-controller-metrics-qvrkm\" (UID: \"8206622d-b224-4744-9358-ad7c10d98ca1\") " pod="openstack/ovn-controller-metrics-qvrkm" Dec 02 07:43:34 crc kubenswrapper[4895]: I1202 07:43:34.943296 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/8206622d-b224-4744-9358-ad7c10d98ca1-ovn-rundir\") pod \"ovn-controller-metrics-qvrkm\" (UID: \"8206622d-b224-4744-9358-ad7c10d98ca1\") " pod="openstack/ovn-controller-metrics-qvrkm" Dec 02 07:43:34 crc kubenswrapper[4895]: I1202 07:43:34.943330 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dee0d598-f287-4a9d-975a-2dba3c9eb93e-config\") pod 
\"dnsmasq-dns-7fd796d7df-5shlf\" (UID: \"dee0d598-f287-4a9d-975a-2dba3c9eb93e\") " pod="openstack/dnsmasq-dns-7fd796d7df-5shlf" Dec 02 07:43:34 crc kubenswrapper[4895]: I1202 07:43:34.943395 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8206622d-b224-4744-9358-ad7c10d98ca1-config\") pod \"ovn-controller-metrics-qvrkm\" (UID: \"8206622d-b224-4744-9358-ad7c10d98ca1\") " pod="openstack/ovn-controller-metrics-qvrkm" Dec 02 07:43:34 crc kubenswrapper[4895]: I1202 07:43:34.946726 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8206622d-b224-4744-9358-ad7c10d98ca1-combined-ca-bundle\") pod \"ovn-controller-metrics-qvrkm\" (UID: \"8206622d-b224-4744-9358-ad7c10d98ca1\") " pod="openstack/ovn-controller-metrics-qvrkm" Dec 02 07:43:34 crc kubenswrapper[4895]: I1202 07:43:34.947605 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dee0d598-f287-4a9d-975a-2dba3c9eb93e-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-5shlf\" (UID: \"dee0d598-f287-4a9d-975a-2dba3c9eb93e\") " pod="openstack/dnsmasq-dns-7fd796d7df-5shlf" Dec 02 07:43:34 crc kubenswrapper[4895]: I1202 07:43:34.948110 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8206622d-b224-4744-9358-ad7c10d98ca1-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-qvrkm\" (UID: \"8206622d-b224-4744-9358-ad7c10d98ca1\") " pod="openstack/ovn-controller-metrics-qvrkm" Dec 02 07:43:35 crc kubenswrapper[4895]: I1202 07:43:35.034442 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-l7vgw"] Dec 02 07:43:35 crc kubenswrapper[4895]: I1202 07:43:35.052687 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kh892\" (UniqueName: 
\"kubernetes.io/projected/dee0d598-f287-4a9d-975a-2dba3c9eb93e-kube-api-access-kh892\") pod \"dnsmasq-dns-7fd796d7df-5shlf\" (UID: \"dee0d598-f287-4a9d-975a-2dba3c9eb93e\") " pod="openstack/dnsmasq-dns-7fd796d7df-5shlf" Dec 02 07:43:35 crc kubenswrapper[4895]: I1202 07:43:35.060214 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dcxlq\" (UniqueName: \"kubernetes.io/projected/8206622d-b224-4744-9358-ad7c10d98ca1-kube-api-access-dcxlq\") pod \"ovn-controller-metrics-qvrkm\" (UID: \"8206622d-b224-4744-9358-ad7c10d98ca1\") " pod="openstack/ovn-controller-metrics-qvrkm" Dec 02 07:43:35 crc kubenswrapper[4895]: I1202 07:43:35.098076 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-wlqps"] Dec 02 07:43:35 crc kubenswrapper[4895]: I1202 07:43:35.099507 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-wlqps" Dec 02 07:43:35 crc kubenswrapper[4895]: I1202 07:43:35.106256 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Dec 02 07:43:35 crc kubenswrapper[4895]: I1202 07:43:35.119552 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-wlqps"] Dec 02 07:43:35 crc kubenswrapper[4895]: I1202 07:43:35.189261 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Dec 02 07:43:35 crc kubenswrapper[4895]: I1202 07:43:35.226262 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-qvrkm" Dec 02 07:43:35 crc kubenswrapper[4895]: I1202 07:43:35.230883 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-5shlf" Dec 02 07:43:35 crc kubenswrapper[4895]: I1202 07:43:35.246770 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b37be9e7-783d-4bd7-a9fc-41049311cab8-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-wlqps\" (UID: \"b37be9e7-783d-4bd7-a9fc-41049311cab8\") " pod="openstack/dnsmasq-dns-86db49b7ff-wlqps" Dec 02 07:43:35 crc kubenswrapper[4895]: I1202 07:43:35.247177 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b37be9e7-783d-4bd7-a9fc-41049311cab8-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-wlqps\" (UID: \"b37be9e7-783d-4bd7-a9fc-41049311cab8\") " pod="openstack/dnsmasq-dns-86db49b7ff-wlqps" Dec 02 07:43:35 crc kubenswrapper[4895]: I1202 07:43:35.250036 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b37be9e7-783d-4bd7-a9fc-41049311cab8-config\") pod \"dnsmasq-dns-86db49b7ff-wlqps\" (UID: \"b37be9e7-783d-4bd7-a9fc-41049311cab8\") " pod="openstack/dnsmasq-dns-86db49b7ff-wlqps" Dec 02 07:43:35 crc kubenswrapper[4895]: I1202 07:43:35.250187 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b37be9e7-783d-4bd7-a9fc-41049311cab8-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-wlqps\" (UID: \"b37be9e7-783d-4bd7-a9fc-41049311cab8\") " pod="openstack/dnsmasq-dns-86db49b7ff-wlqps" Dec 02 07:43:35 crc kubenswrapper[4895]: I1202 07:43:35.251111 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fb8zf\" (UniqueName: \"kubernetes.io/projected/b37be9e7-783d-4bd7-a9fc-41049311cab8-kube-api-access-fb8zf\") pod \"dnsmasq-dns-86db49b7ff-wlqps\" 
(UID: \"b37be9e7-783d-4bd7-a9fc-41049311cab8\") " pod="openstack/dnsmasq-dns-86db49b7ff-wlqps" Dec 02 07:43:35 crc kubenswrapper[4895]: I1202 07:43:35.319942 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Dec 02 07:43:35 crc kubenswrapper[4895]: I1202 07:43:35.356823 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b37be9e7-783d-4bd7-a9fc-41049311cab8-config\") pod \"dnsmasq-dns-86db49b7ff-wlqps\" (UID: \"b37be9e7-783d-4bd7-a9fc-41049311cab8\") " pod="openstack/dnsmasq-dns-86db49b7ff-wlqps" Dec 02 07:43:35 crc kubenswrapper[4895]: I1202 07:43:35.356893 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b37be9e7-783d-4bd7-a9fc-41049311cab8-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-wlqps\" (UID: \"b37be9e7-783d-4bd7-a9fc-41049311cab8\") " pod="openstack/dnsmasq-dns-86db49b7ff-wlqps" Dec 02 07:43:35 crc kubenswrapper[4895]: I1202 07:43:35.356998 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fb8zf\" (UniqueName: \"kubernetes.io/projected/b37be9e7-783d-4bd7-a9fc-41049311cab8-kube-api-access-fb8zf\") pod \"dnsmasq-dns-86db49b7ff-wlqps\" (UID: \"b37be9e7-783d-4bd7-a9fc-41049311cab8\") " pod="openstack/dnsmasq-dns-86db49b7ff-wlqps" Dec 02 07:43:35 crc kubenswrapper[4895]: I1202 07:43:35.357059 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b37be9e7-783d-4bd7-a9fc-41049311cab8-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-wlqps\" (UID: \"b37be9e7-783d-4bd7-a9fc-41049311cab8\") " pod="openstack/dnsmasq-dns-86db49b7ff-wlqps" Dec 02 07:43:35 crc kubenswrapper[4895]: I1202 07:43:35.357087 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/b37be9e7-783d-4bd7-a9fc-41049311cab8-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-wlqps\" (UID: \"b37be9e7-783d-4bd7-a9fc-41049311cab8\") " pod="openstack/dnsmasq-dns-86db49b7ff-wlqps" Dec 02 07:43:35 crc kubenswrapper[4895]: I1202 07:43:35.358302 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b37be9e7-783d-4bd7-a9fc-41049311cab8-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-wlqps\" (UID: \"b37be9e7-783d-4bd7-a9fc-41049311cab8\") " pod="openstack/dnsmasq-dns-86db49b7ff-wlqps" Dec 02 07:43:35 crc kubenswrapper[4895]: I1202 07:43:35.358298 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b37be9e7-783d-4bd7-a9fc-41049311cab8-config\") pod \"dnsmasq-dns-86db49b7ff-wlqps\" (UID: \"b37be9e7-783d-4bd7-a9fc-41049311cab8\") " pod="openstack/dnsmasq-dns-86db49b7ff-wlqps" Dec 02 07:43:35 crc kubenswrapper[4895]: I1202 07:43:35.358637 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b37be9e7-783d-4bd7-a9fc-41049311cab8-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-wlqps\" (UID: \"b37be9e7-783d-4bd7-a9fc-41049311cab8\") " pod="openstack/dnsmasq-dns-86db49b7ff-wlqps" Dec 02 07:43:35 crc kubenswrapper[4895]: I1202 07:43:35.358655 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b37be9e7-783d-4bd7-a9fc-41049311cab8-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-wlqps\" (UID: \"b37be9e7-783d-4bd7-a9fc-41049311cab8\") " pod="openstack/dnsmasq-dns-86db49b7ff-wlqps" Dec 02 07:43:35 crc kubenswrapper[4895]: I1202 07:43:35.388750 4895 generic.go:334] "Generic (PLEG): container finished" podID="3b019ba5-e2aa-4ac7-a3a6-02e619195935" containerID="6c4a648fff1b29e5e4451f6d2341ba8e5b83a1dbc238ebcd6dd0e9263ea4f751" exitCode=0 Dec 02 07:43:35 crc kubenswrapper[4895]: 
I1202 07:43:35.389989 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-lvv8z" event={"ID":"3b019ba5-e2aa-4ac7-a3a6-02e619195935","Type":"ContainerDied","Data":"6c4a648fff1b29e5e4451f6d2341ba8e5b83a1dbc238ebcd6dd0e9263ea4f751"} Dec 02 07:43:35 crc kubenswrapper[4895]: I1202 07:43:35.390426 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fb8zf\" (UniqueName: \"kubernetes.io/projected/b37be9e7-783d-4bd7-a9fc-41049311cab8-kube-api-access-fb8zf\") pod \"dnsmasq-dns-86db49b7ff-wlqps\" (UID: \"b37be9e7-783d-4bd7-a9fc-41049311cab8\") " pod="openstack/dnsmasq-dns-86db49b7ff-wlqps" Dec 02 07:43:35 crc kubenswrapper[4895]: I1202 07:43:35.391589 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Dec 02 07:43:35 crc kubenswrapper[4895]: I1202 07:43:35.480489 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-wlqps" Dec 02 07:43:35 crc kubenswrapper[4895]: I1202 07:43:35.492255 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-l7vgw" Dec 02 07:43:35 crc kubenswrapper[4895]: I1202 07:43:35.495402 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Dec 02 07:43:35 crc kubenswrapper[4895]: I1202 07:43:35.615485 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56de784f-037a-486b-8e7b-51d62e0dd0ad-config\") pod \"56de784f-037a-486b-8e7b-51d62e0dd0ad\" (UID: \"56de784f-037a-486b-8e7b-51d62e0dd0ad\") " Dec 02 07:43:35 crc kubenswrapper[4895]: I1202 07:43:35.615531 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/56de784f-037a-486b-8e7b-51d62e0dd0ad-dns-svc\") pod \"56de784f-037a-486b-8e7b-51d62e0dd0ad\" (UID: \"56de784f-037a-486b-8e7b-51d62e0dd0ad\") " Dec 02 07:43:35 crc kubenswrapper[4895]: I1202 07:43:35.615561 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kcj6w\" (UniqueName: \"kubernetes.io/projected/56de784f-037a-486b-8e7b-51d62e0dd0ad-kube-api-access-kcj6w\") pod \"56de784f-037a-486b-8e7b-51d62e0dd0ad\" (UID: \"56de784f-037a-486b-8e7b-51d62e0dd0ad\") " Dec 02 07:43:35 crc kubenswrapper[4895]: I1202 07:43:35.616722 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56de784f-037a-486b-8e7b-51d62e0dd0ad-config" (OuterVolumeSpecName: "config") pod "56de784f-037a-486b-8e7b-51d62e0dd0ad" (UID: "56de784f-037a-486b-8e7b-51d62e0dd0ad"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:43:35 crc kubenswrapper[4895]: I1202 07:43:35.617013 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56de784f-037a-486b-8e7b-51d62e0dd0ad-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "56de784f-037a-486b-8e7b-51d62e0dd0ad" (UID: "56de784f-037a-486b-8e7b-51d62e0dd0ad"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:43:35 crc kubenswrapper[4895]: I1202 07:43:35.717230 4895 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56de784f-037a-486b-8e7b-51d62e0dd0ad-config\") on node \"crc\" DevicePath \"\"" Dec 02 07:43:35 crc kubenswrapper[4895]: I1202 07:43:35.717305 4895 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/56de784f-037a-486b-8e7b-51d62e0dd0ad-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 02 07:43:35 crc kubenswrapper[4895]: I1202 07:43:35.782431 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56de784f-037a-486b-8e7b-51d62e0dd0ad-kube-api-access-kcj6w" (OuterVolumeSpecName: "kube-api-access-kcj6w") pod "56de784f-037a-486b-8e7b-51d62e0dd0ad" (UID: "56de784f-037a-486b-8e7b-51d62e0dd0ad"). InnerVolumeSpecName "kube-api-access-kcj6w". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:43:35 crc kubenswrapper[4895]: I1202 07:43:35.790292 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Dec 02 07:43:35 crc kubenswrapper[4895]: I1202 07:43:35.806152 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Dec 02 07:43:35 crc kubenswrapper[4895]: I1202 07:43:35.814800 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Dec 02 07:43:35 crc kubenswrapper[4895]: I1202 07:43:35.815048 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Dec 02 07:43:35 crc kubenswrapper[4895]: I1202 07:43:35.815286 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-h6mt5" Dec 02 07:43:35 crc kubenswrapper[4895]: I1202 07:43:35.818898 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/446b5a26-8e57-4765-bb7d-275cf05996dd-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"446b5a26-8e57-4765-bb7d-275cf05996dd\") " pod="openstack/ovn-northd-0" Dec 02 07:43:35 crc kubenswrapper[4895]: I1202 07:43:35.818966 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/446b5a26-8e57-4765-bb7d-275cf05996dd-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"446b5a26-8e57-4765-bb7d-275cf05996dd\") " pod="openstack/ovn-northd-0" Dec 02 07:43:35 crc kubenswrapper[4895]: I1202 07:43:35.818997 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/446b5a26-8e57-4765-bb7d-275cf05996dd-scripts\") pod \"ovn-northd-0\" (UID: \"446b5a26-8e57-4765-bb7d-275cf05996dd\") " pod="openstack/ovn-northd-0" Dec 02 07:43:35 crc kubenswrapper[4895]: I1202 07:43:35.819021 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/446b5a26-8e57-4765-bb7d-275cf05996dd-config\") pod \"ovn-northd-0\" (UID: 
\"446b5a26-8e57-4765-bb7d-275cf05996dd\") " pod="openstack/ovn-northd-0" Dec 02 07:43:35 crc kubenswrapper[4895]: I1202 07:43:35.819051 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/446b5a26-8e57-4765-bb7d-275cf05996dd-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"446b5a26-8e57-4765-bb7d-275cf05996dd\") " pod="openstack/ovn-northd-0" Dec 02 07:43:35 crc kubenswrapper[4895]: I1202 07:43:35.819074 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/446b5a26-8e57-4765-bb7d-275cf05996dd-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"446b5a26-8e57-4765-bb7d-275cf05996dd\") " pod="openstack/ovn-northd-0" Dec 02 07:43:35 crc kubenswrapper[4895]: I1202 07:43:35.819095 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ns8ww\" (UniqueName: \"kubernetes.io/projected/446b5a26-8e57-4765-bb7d-275cf05996dd-kube-api-access-ns8ww\") pod \"ovn-northd-0\" (UID: \"446b5a26-8e57-4765-bb7d-275cf05996dd\") " pod="openstack/ovn-northd-0" Dec 02 07:43:35 crc kubenswrapper[4895]: I1202 07:43:35.819151 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kcj6w\" (UniqueName: \"kubernetes.io/projected/56de784f-037a-486b-8e7b-51d62e0dd0ad-kube-api-access-kcj6w\") on node \"crc\" DevicePath \"\"" Dec 02 07:43:35 crc kubenswrapper[4895]: I1202 07:43:35.834920 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Dec 02 07:43:35 crc kubenswrapper[4895]: I1202 07:43:35.858842 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 02 07:43:35 crc kubenswrapper[4895]: I1202 07:43:35.921076 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/446b5a26-8e57-4765-bb7d-275cf05996dd-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"446b5a26-8e57-4765-bb7d-275cf05996dd\") " pod="openstack/ovn-northd-0" Dec 02 07:43:35 crc kubenswrapper[4895]: I1202 07:43:35.921159 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/446b5a26-8e57-4765-bb7d-275cf05996dd-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"446b5a26-8e57-4765-bb7d-275cf05996dd\") " pod="openstack/ovn-northd-0" Dec 02 07:43:35 crc kubenswrapper[4895]: I1202 07:43:35.921196 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/446b5a26-8e57-4765-bb7d-275cf05996dd-scripts\") pod \"ovn-northd-0\" (UID: \"446b5a26-8e57-4765-bb7d-275cf05996dd\") " pod="openstack/ovn-northd-0" Dec 02 07:43:35 crc kubenswrapper[4895]: I1202 07:43:35.921224 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/446b5a26-8e57-4765-bb7d-275cf05996dd-config\") pod \"ovn-northd-0\" (UID: \"446b5a26-8e57-4765-bb7d-275cf05996dd\") " pod="openstack/ovn-northd-0" Dec 02 07:43:35 crc kubenswrapper[4895]: I1202 07:43:35.921261 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/446b5a26-8e57-4765-bb7d-275cf05996dd-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"446b5a26-8e57-4765-bb7d-275cf05996dd\") " pod="openstack/ovn-northd-0" Dec 02 07:43:35 crc kubenswrapper[4895]: I1202 07:43:35.921287 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/446b5a26-8e57-4765-bb7d-275cf05996dd-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"446b5a26-8e57-4765-bb7d-275cf05996dd\") " pod="openstack/ovn-northd-0" Dec 02 07:43:35 crc kubenswrapper[4895]: I1202 
07:43:35.921311 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ns8ww\" (UniqueName: \"kubernetes.io/projected/446b5a26-8e57-4765-bb7d-275cf05996dd-kube-api-access-ns8ww\") pod \"ovn-northd-0\" (UID: \"446b5a26-8e57-4765-bb7d-275cf05996dd\") " pod="openstack/ovn-northd-0" Dec 02 07:43:35 crc kubenswrapper[4895]: I1202 07:43:35.929401 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/446b5a26-8e57-4765-bb7d-275cf05996dd-config\") pod \"ovn-northd-0\" (UID: \"446b5a26-8e57-4765-bb7d-275cf05996dd\") " pod="openstack/ovn-northd-0" Dec 02 07:43:35 crc kubenswrapper[4895]: I1202 07:43:35.929755 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/446b5a26-8e57-4765-bb7d-275cf05996dd-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"446b5a26-8e57-4765-bb7d-275cf05996dd\") " pod="openstack/ovn-northd-0" Dec 02 07:43:35 crc kubenswrapper[4895]: I1202 07:43:35.930621 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/446b5a26-8e57-4765-bb7d-275cf05996dd-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"446b5a26-8e57-4765-bb7d-275cf05996dd\") " pod="openstack/ovn-northd-0" Dec 02 07:43:35 crc kubenswrapper[4895]: I1202 07:43:35.931145 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/446b5a26-8e57-4765-bb7d-275cf05996dd-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"446b5a26-8e57-4765-bb7d-275cf05996dd\") " pod="openstack/ovn-northd-0" Dec 02 07:43:35 crc kubenswrapper[4895]: I1202 07:43:35.932252 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/446b5a26-8e57-4765-bb7d-275cf05996dd-scripts\") pod \"ovn-northd-0\" (UID: 
\"446b5a26-8e57-4765-bb7d-275cf05996dd\") " pod="openstack/ovn-northd-0" Dec 02 07:43:35 crc kubenswrapper[4895]: I1202 07:43:35.932942 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/446b5a26-8e57-4765-bb7d-275cf05996dd-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"446b5a26-8e57-4765-bb7d-275cf05996dd\") " pod="openstack/ovn-northd-0" Dec 02 07:43:35 crc kubenswrapper[4895]: I1202 07:43:35.935720 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-5shlf"] Dec 02 07:43:35 crc kubenswrapper[4895]: I1202 07:43:35.952182 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ns8ww\" (UniqueName: \"kubernetes.io/projected/446b5a26-8e57-4765-bb7d-275cf05996dd-kube-api-access-ns8ww\") pod \"ovn-northd-0\" (UID: \"446b5a26-8e57-4765-bb7d-275cf05996dd\") " pod="openstack/ovn-northd-0" Dec 02 07:43:35 crc kubenswrapper[4895]: I1202 07:43:35.985594 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-qvrkm"] Dec 02 07:43:36 crc kubenswrapper[4895]: I1202 07:43:36.118776 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-lvv8z" Dec 02 07:43:36 crc kubenswrapper[4895]: I1202 07:43:36.197179 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Dec 02 07:43:36 crc kubenswrapper[4895]: I1202 07:43:36.226912 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wklbl\" (UniqueName: \"kubernetes.io/projected/3b019ba5-e2aa-4ac7-a3a6-02e619195935-kube-api-access-wklbl\") pod \"3b019ba5-e2aa-4ac7-a3a6-02e619195935\" (UID: \"3b019ba5-e2aa-4ac7-a3a6-02e619195935\") " Dec 02 07:43:36 crc kubenswrapper[4895]: I1202 07:43:36.227015 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b019ba5-e2aa-4ac7-a3a6-02e619195935-config\") pod \"3b019ba5-e2aa-4ac7-a3a6-02e619195935\" (UID: \"3b019ba5-e2aa-4ac7-a3a6-02e619195935\") " Dec 02 07:43:36 crc kubenswrapper[4895]: I1202 07:43:36.227211 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3b019ba5-e2aa-4ac7-a3a6-02e619195935-dns-svc\") pod \"3b019ba5-e2aa-4ac7-a3a6-02e619195935\" (UID: \"3b019ba5-e2aa-4ac7-a3a6-02e619195935\") " Dec 02 07:43:36 crc kubenswrapper[4895]: I1202 07:43:36.254318 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b019ba5-e2aa-4ac7-a3a6-02e619195935-kube-api-access-wklbl" (OuterVolumeSpecName: "kube-api-access-wklbl") pod "3b019ba5-e2aa-4ac7-a3a6-02e619195935" (UID: "3b019ba5-e2aa-4ac7-a3a6-02e619195935"). InnerVolumeSpecName "kube-api-access-wklbl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:43:36 crc kubenswrapper[4895]: I1202 07:43:36.321602 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-wlqps"] Dec 02 07:43:36 crc kubenswrapper[4895]: W1202 07:43:36.322551 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb37be9e7_783d_4bd7_a9fc_41049311cab8.slice/crio-4093ffd564c25627cdd93db6f07a5a23db54120b07ad150be9d272b8596f3dad WatchSource:0}: Error finding container 4093ffd564c25627cdd93db6f07a5a23db54120b07ad150be9d272b8596f3dad: Status 404 returned error can't find the container with id 4093ffd564c25627cdd93db6f07a5a23db54120b07ad150be9d272b8596f3dad Dec 02 07:43:36 crc kubenswrapper[4895]: I1202 07:43:36.330762 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wklbl\" (UniqueName: \"kubernetes.io/projected/3b019ba5-e2aa-4ac7-a3a6-02e619195935-kube-api-access-wklbl\") on node \"crc\" DevicePath \"\"" Dec 02 07:43:36 crc kubenswrapper[4895]: I1202 07:43:36.397324 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-5shlf" event={"ID":"dee0d598-f287-4a9d-975a-2dba3c9eb93e","Type":"ContainerStarted","Data":"62f3391fba86350386c08298839fd092a4931f8782f164ee8285be0a7dd38e00"} Dec 02 07:43:36 crc kubenswrapper[4895]: I1202 07:43:36.398242 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-l7vgw" Dec 02 07:43:36 crc kubenswrapper[4895]: I1202 07:43:36.398232 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-l7vgw" event={"ID":"56de784f-037a-486b-8e7b-51d62e0dd0ad","Type":"ContainerDied","Data":"c845940080c320a0eb0dfbc9a27798ca2ffcd7bdc550cdbbfff25f82359a77ee"} Dec 02 07:43:36 crc kubenswrapper[4895]: I1202 07:43:36.398897 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-wlqps" event={"ID":"b37be9e7-783d-4bd7-a9fc-41049311cab8","Type":"ContainerStarted","Data":"4093ffd564c25627cdd93db6f07a5a23db54120b07ad150be9d272b8596f3dad"} Dec 02 07:43:36 crc kubenswrapper[4895]: I1202 07:43:36.399821 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-qvrkm" event={"ID":"8206622d-b224-4744-9358-ad7c10d98ca1","Type":"ContainerStarted","Data":"06ff8339033df96a83077a62e929bc5d1df2839a17c03cd0e1008d9758abd8ec"} Dec 02 07:43:36 crc kubenswrapper[4895]: I1202 07:43:36.399859 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-qvrkm" event={"ID":"8206622d-b224-4744-9358-ad7c10d98ca1","Type":"ContainerStarted","Data":"fa59c78f12ab59582ba18ac98344d7edfd8ba52bfd4fc14f8ee735e6d81c9907"} Dec 02 07:43:36 crc kubenswrapper[4895]: I1202 07:43:36.403038 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-lvv8z" Dec 02 07:43:36 crc kubenswrapper[4895]: I1202 07:43:36.403005 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-lvv8z" event={"ID":"3b019ba5-e2aa-4ac7-a3a6-02e619195935","Type":"ContainerDied","Data":"d67570f5f95063dfacda7f2e25feeef193c2f24ad14f5df53e751ec87d66a9e2"} Dec 02 07:43:36 crc kubenswrapper[4895]: I1202 07:43:36.403140 4895 scope.go:117] "RemoveContainer" containerID="6c4a648fff1b29e5e4451f6d2341ba8e5b83a1dbc238ebcd6dd0e9263ea4f751" Dec 02 07:43:36 crc kubenswrapper[4895]: I1202 07:43:36.454225 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-qvrkm" podStartSLOduration=2.454194142 podStartE2EDuration="2.454194142s" podCreationTimestamp="2025-12-02 07:43:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:43:36.438647093 +0000 UTC m=+1227.609506746" watchObservedRunningTime="2025-12-02 07:43:36.454194142 +0000 UTC m=+1227.625053755" Dec 02 07:43:36 crc kubenswrapper[4895]: I1202 07:43:36.591209 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-l7vgw"] Dec 02 07:43:36 crc kubenswrapper[4895]: I1202 07:43:36.598249 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-l7vgw"] Dec 02 07:43:36 crc kubenswrapper[4895]: I1202 07:43:36.647364 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b019ba5-e2aa-4ac7-a3a6-02e619195935-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3b019ba5-e2aa-4ac7-a3a6-02e619195935" (UID: "3b019ba5-e2aa-4ac7-a3a6-02e619195935"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:43:36 crc kubenswrapper[4895]: I1202 07:43:36.648473 4895 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3b019ba5-e2aa-4ac7-a3a6-02e619195935-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 02 07:43:36 crc kubenswrapper[4895]: I1202 07:43:36.652410 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b019ba5-e2aa-4ac7-a3a6-02e619195935-config" (OuterVolumeSpecName: "config") pod "3b019ba5-e2aa-4ac7-a3a6-02e619195935" (UID: "3b019ba5-e2aa-4ac7-a3a6-02e619195935"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:43:36 crc kubenswrapper[4895]: I1202 07:43:36.750078 4895 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b019ba5-e2aa-4ac7-a3a6-02e619195935-config\") on node \"crc\" DevicePath \"\"" Dec 02 07:43:36 crc kubenswrapper[4895]: I1202 07:43:36.802836 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-lvv8z"] Dec 02 07:43:36 crc kubenswrapper[4895]: I1202 07:43:36.810648 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-lvv8z"] Dec 02 07:43:36 crc kubenswrapper[4895]: I1202 07:43:36.866750 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 02 07:43:37 crc kubenswrapper[4895]: I1202 07:43:37.156209 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b019ba5-e2aa-4ac7-a3a6-02e619195935" path="/var/lib/kubelet/pods/3b019ba5-e2aa-4ac7-a3a6-02e619195935/volumes" Dec 02 07:43:37 crc kubenswrapper[4895]: I1202 07:43:37.157074 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56de784f-037a-486b-8e7b-51d62e0dd0ad" path="/var/lib/kubelet/pods/56de784f-037a-486b-8e7b-51d62e0dd0ad/volumes" Dec 02 07:43:37 crc kubenswrapper[4895]: 
I1202 07:43:37.412684 4895 generic.go:334] "Generic (PLEG): container finished" podID="b37be9e7-783d-4bd7-a9fc-41049311cab8" containerID="9e17a81ab311b0c42a6251dd35556fa133111b3838b913a3426967b030929e27" exitCode=0 Dec 02 07:43:37 crc kubenswrapper[4895]: I1202 07:43:37.412828 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-wlqps" event={"ID":"b37be9e7-783d-4bd7-a9fc-41049311cab8","Type":"ContainerDied","Data":"9e17a81ab311b0c42a6251dd35556fa133111b3838b913a3426967b030929e27"} Dec 02 07:43:37 crc kubenswrapper[4895]: I1202 07:43:37.415227 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"ca98cba7-4127-4d25-a139-1a42224331f2","Type":"ContainerStarted","Data":"dae6ee95ef6df69cc075b37be3c7109ee4cab3f60c4bdf61b7793f530ffc9ab5"} Dec 02 07:43:37 crc kubenswrapper[4895]: I1202 07:43:37.418245 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"446b5a26-8e57-4765-bb7d-275cf05996dd","Type":"ContainerStarted","Data":"fcab07b3a6e3e24e623ff43fafd5d6c39c2f1c8eea31976b8d55fe2707cec11a"} Dec 02 07:43:37 crc kubenswrapper[4895]: I1202 07:43:37.420072 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"b15097a8-ac9a-4886-a839-272b662561c5","Type":"ContainerStarted","Data":"749c0f6ea01ac411d0209d4472b7bd79cfc38bd8f584ebdd6968b35f5d12cdc7"} Dec 02 07:43:37 crc kubenswrapper[4895]: I1202 07:43:37.420466 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Dec 02 07:43:37 crc kubenswrapper[4895]: I1202 07:43:37.423021 4895 generic.go:334] "Generic (PLEG): container finished" podID="dee0d598-f287-4a9d-975a-2dba3c9eb93e" containerID="36b12cd5a2bc9c1ba5e1a0c823f9ed877bb778a55aac15de791fc4fa102713d6" exitCode=0 Dec 02 07:43:37 crc kubenswrapper[4895]: I1202 07:43:37.423127 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-7fd796d7df-5shlf" event={"ID":"dee0d598-f287-4a9d-975a-2dba3c9eb93e","Type":"ContainerDied","Data":"36b12cd5a2bc9c1ba5e1a0c823f9ed877bb778a55aac15de791fc4fa102713d6"} Dec 02 07:43:37 crc kubenswrapper[4895]: I1202 07:43:37.427884 4895 generic.go:334] "Generic (PLEG): container finished" podID="38385316-fca8-41b0-b0ff-570a9cd71e8a" containerID="70b9062cebf51e2cb33bacac0c59956df85ea562e2a492f0a1c5926a2d8af62e" exitCode=0 Dec 02 07:43:37 crc kubenswrapper[4895]: I1202 07:43:37.428967 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"38385316-fca8-41b0-b0ff-570a9cd71e8a","Type":"ContainerDied","Data":"70b9062cebf51e2cb33bacac0c59956df85ea562e2a492f0a1c5926a2d8af62e"} Dec 02 07:43:37 crc kubenswrapper[4895]: I1202 07:43:37.494796 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=10.072470917 podStartE2EDuration="49.494771853s" podCreationTimestamp="2025-12-02 07:42:48 +0000 UTC" firstStartedPulling="2025-12-02 07:42:57.337303604 +0000 UTC m=+1188.508163217" lastFinishedPulling="2025-12-02 07:43:36.75960454 +0000 UTC m=+1227.930464153" observedRunningTime="2025-12-02 07:43:37.493863016 +0000 UTC m=+1228.664722649" watchObservedRunningTime="2025-12-02 07:43:37.494771853 +0000 UTC m=+1228.665631466" Dec 02 07:43:38 crc kubenswrapper[4895]: I1202 07:43:38.446754 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-wlqps" event={"ID":"b37be9e7-783d-4bd7-a9fc-41049311cab8","Type":"ContainerStarted","Data":"5ef88bfacba00b39e539f67d36ac63bdb6060f919e164fdac86607561083daf4"} Dec 02 07:43:38 crc kubenswrapper[4895]: I1202 07:43:38.448603 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86db49b7ff-wlqps" Dec 02 07:43:38 crc kubenswrapper[4895]: I1202 07:43:38.449683 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-northd-0" event={"ID":"446b5a26-8e57-4765-bb7d-275cf05996dd","Type":"ContainerStarted","Data":"9c18aeb9311a3ffa5790c2f236b884d856db73bab542194f9a4509de984dba58"} Dec 02 07:43:38 crc kubenswrapper[4895]: I1202 07:43:38.452425 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-5shlf" event={"ID":"dee0d598-f287-4a9d-975a-2dba3c9eb93e","Type":"ContainerStarted","Data":"03a184a39ee7dbdf13bd5a5d26a13d43fbbd3bbcf451b7a6eb6d49273096afa0"} Dec 02 07:43:38 crc kubenswrapper[4895]: I1202 07:43:38.453322 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7fd796d7df-5shlf" Dec 02 07:43:38 crc kubenswrapper[4895]: I1202 07:43:38.455896 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"38385316-fca8-41b0-b0ff-570a9cd71e8a","Type":"ContainerStarted","Data":"6aae636a2f05d49cb09841be65cf88064cfd592fbc7ebcfc8e0589c5f285e704"} Dec 02 07:43:38 crc kubenswrapper[4895]: I1202 07:43:38.472095 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86db49b7ff-wlqps" podStartSLOduration=3.47207067 podStartE2EDuration="3.47207067s" podCreationTimestamp="2025-12-02 07:43:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:43:38.464169117 +0000 UTC m=+1229.635028750" watchObservedRunningTime="2025-12-02 07:43:38.47207067 +0000 UTC m=+1229.642930283" Dec 02 07:43:38 crc kubenswrapper[4895]: I1202 07:43:38.482833 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=-9223371983.371967 podStartE2EDuration="53.482807842s" podCreationTimestamp="2025-12-02 07:42:45 +0000 UTC" firstStartedPulling="2025-12-02 07:42:49.207236328 +0000 UTC m=+1180.378095941" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-12-02 07:43:38.47980807 +0000 UTC m=+1229.650667713" watchObservedRunningTime="2025-12-02 07:43:38.482807842 +0000 UTC m=+1229.653667485" Dec 02 07:43:38 crc kubenswrapper[4895]: I1202 07:43:38.503830 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7fd796d7df-5shlf" podStartSLOduration=4.503808859 podStartE2EDuration="4.503808859s" podCreationTimestamp="2025-12-02 07:43:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:43:38.497846695 +0000 UTC m=+1229.668706308" watchObservedRunningTime="2025-12-02 07:43:38.503808859 +0000 UTC m=+1229.674668492" Dec 02 07:43:39 crc kubenswrapper[4895]: I1202 07:43:39.466095 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"446b5a26-8e57-4765-bb7d-275cf05996dd","Type":"ContainerStarted","Data":"8f7f80f7975fea79b1c3bcefa5a8a41052d690e193ab88673538d60ad2720b9a"} Dec 02 07:43:39 crc kubenswrapper[4895]: I1202 07:43:39.488837 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=3.315117153 podStartE2EDuration="4.488811645s" podCreationTimestamp="2025-12-02 07:43:35 +0000 UTC" firstStartedPulling="2025-12-02 07:43:36.876999774 +0000 UTC m=+1228.047859387" lastFinishedPulling="2025-12-02 07:43:38.050694266 +0000 UTC m=+1229.221553879" observedRunningTime="2025-12-02 07:43:39.487425853 +0000 UTC m=+1230.658285476" watchObservedRunningTime="2025-12-02 07:43:39.488811645 +0000 UTC m=+1230.659671268" Dec 02 07:43:40 crc kubenswrapper[4895]: I1202 07:43:40.476954 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Dec 02 07:43:40 crc kubenswrapper[4895]: I1202 07:43:40.921037 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 02 07:43:43 crc 
kubenswrapper[4895]: I1202 07:43:43.934065 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Dec 02 07:43:45 crc kubenswrapper[4895]: I1202 07:43:45.232886 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7fd796d7df-5shlf" Dec 02 07:43:45 crc kubenswrapper[4895]: I1202 07:43:45.481930 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-86db49b7ff-wlqps" Dec 02 07:43:45 crc kubenswrapper[4895]: I1202 07:43:45.548790 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-5shlf"] Dec 02 07:43:45 crc kubenswrapper[4895]: I1202 07:43:45.549338 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7fd796d7df-5shlf" podUID="dee0d598-f287-4a9d-975a-2dba3c9eb93e" containerName="dnsmasq-dns" containerID="cri-o://03a184a39ee7dbdf13bd5a5d26a13d43fbbd3bbcf451b7a6eb6d49273096afa0" gracePeriod=10 Dec 02 07:43:46 crc kubenswrapper[4895]: I1202 07:43:46.030491 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-5shlf" Dec 02 07:43:46 crc kubenswrapper[4895]: I1202 07:43:46.114843 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dee0d598-f287-4a9d-975a-2dba3c9eb93e-dns-svc\") pod \"dee0d598-f287-4a9d-975a-2dba3c9eb93e\" (UID: \"dee0d598-f287-4a9d-975a-2dba3c9eb93e\") " Dec 02 07:43:46 crc kubenswrapper[4895]: I1202 07:43:46.114899 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dee0d598-f287-4a9d-975a-2dba3c9eb93e-ovsdbserver-nb\") pod \"dee0d598-f287-4a9d-975a-2dba3c9eb93e\" (UID: \"dee0d598-f287-4a9d-975a-2dba3c9eb93e\") " Dec 02 07:43:46 crc kubenswrapper[4895]: I1202 07:43:46.114930 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dee0d598-f287-4a9d-975a-2dba3c9eb93e-config\") pod \"dee0d598-f287-4a9d-975a-2dba3c9eb93e\" (UID: \"dee0d598-f287-4a9d-975a-2dba3c9eb93e\") " Dec 02 07:43:46 crc kubenswrapper[4895]: I1202 07:43:46.115026 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kh892\" (UniqueName: \"kubernetes.io/projected/dee0d598-f287-4a9d-975a-2dba3c9eb93e-kube-api-access-kh892\") pod \"dee0d598-f287-4a9d-975a-2dba3c9eb93e\" (UID: \"dee0d598-f287-4a9d-975a-2dba3c9eb93e\") " Dec 02 07:43:46 crc kubenswrapper[4895]: I1202 07:43:46.122041 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dee0d598-f287-4a9d-975a-2dba3c9eb93e-kube-api-access-kh892" (OuterVolumeSpecName: "kube-api-access-kh892") pod "dee0d598-f287-4a9d-975a-2dba3c9eb93e" (UID: "dee0d598-f287-4a9d-975a-2dba3c9eb93e"). InnerVolumeSpecName "kube-api-access-kh892". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:43:46 crc kubenswrapper[4895]: I1202 07:43:46.155247 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dee0d598-f287-4a9d-975a-2dba3c9eb93e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "dee0d598-f287-4a9d-975a-2dba3c9eb93e" (UID: "dee0d598-f287-4a9d-975a-2dba3c9eb93e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:43:46 crc kubenswrapper[4895]: I1202 07:43:46.158437 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dee0d598-f287-4a9d-975a-2dba3c9eb93e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "dee0d598-f287-4a9d-975a-2dba3c9eb93e" (UID: "dee0d598-f287-4a9d-975a-2dba3c9eb93e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:43:46 crc kubenswrapper[4895]: I1202 07:43:46.159850 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dee0d598-f287-4a9d-975a-2dba3c9eb93e-config" (OuterVolumeSpecName: "config") pod "dee0d598-f287-4a9d-975a-2dba3c9eb93e" (UID: "dee0d598-f287-4a9d-975a-2dba3c9eb93e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:43:46 crc kubenswrapper[4895]: I1202 07:43:46.217043 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kh892\" (UniqueName: \"kubernetes.io/projected/dee0d598-f287-4a9d-975a-2dba3c9eb93e-kube-api-access-kh892\") on node \"crc\" DevicePath \"\"" Dec 02 07:43:46 crc kubenswrapper[4895]: I1202 07:43:46.217080 4895 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dee0d598-f287-4a9d-975a-2dba3c9eb93e-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 02 07:43:46 crc kubenswrapper[4895]: I1202 07:43:46.217092 4895 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dee0d598-f287-4a9d-975a-2dba3c9eb93e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 02 07:43:46 crc kubenswrapper[4895]: I1202 07:43:46.217102 4895 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dee0d598-f287-4a9d-975a-2dba3c9eb93e-config\") on node \"crc\" DevicePath \"\"" Dec 02 07:43:46 crc kubenswrapper[4895]: I1202 07:43:46.530725 4895 generic.go:334] "Generic (PLEG): container finished" podID="dee0d598-f287-4a9d-975a-2dba3c9eb93e" containerID="03a184a39ee7dbdf13bd5a5d26a13d43fbbd3bbcf451b7a6eb6d49273096afa0" exitCode=0 Dec 02 07:43:46 crc kubenswrapper[4895]: I1202 07:43:46.530805 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-5shlf" event={"ID":"dee0d598-f287-4a9d-975a-2dba3c9eb93e","Type":"ContainerDied","Data":"03a184a39ee7dbdf13bd5a5d26a13d43fbbd3bbcf451b7a6eb6d49273096afa0"} Dec 02 07:43:46 crc kubenswrapper[4895]: I1202 07:43:46.530844 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-5shlf" event={"ID":"dee0d598-f287-4a9d-975a-2dba3c9eb93e","Type":"ContainerDied","Data":"62f3391fba86350386c08298839fd092a4931f8782f164ee8285be0a7dd38e00"} Dec 02 
07:43:46 crc kubenswrapper[4895]: I1202 07:43:46.530869 4895 scope.go:117] "RemoveContainer" containerID="03a184a39ee7dbdf13bd5a5d26a13d43fbbd3bbcf451b7a6eb6d49273096afa0" Dec 02 07:43:46 crc kubenswrapper[4895]: I1202 07:43:46.531044 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-5shlf" Dec 02 07:43:46 crc kubenswrapper[4895]: I1202 07:43:46.569200 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-5shlf"] Dec 02 07:43:46 crc kubenswrapper[4895]: I1202 07:43:46.572864 4895 scope.go:117] "RemoveContainer" containerID="36b12cd5a2bc9c1ba5e1a0c823f9ed877bb778a55aac15de791fc4fa102713d6" Dec 02 07:43:46 crc kubenswrapper[4895]: I1202 07:43:46.575508 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-5shlf"] Dec 02 07:43:46 crc kubenswrapper[4895]: I1202 07:43:46.603101 4895 scope.go:117] "RemoveContainer" containerID="03a184a39ee7dbdf13bd5a5d26a13d43fbbd3bbcf451b7a6eb6d49273096afa0" Dec 02 07:43:46 crc kubenswrapper[4895]: E1202 07:43:46.603787 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03a184a39ee7dbdf13bd5a5d26a13d43fbbd3bbcf451b7a6eb6d49273096afa0\": container with ID starting with 03a184a39ee7dbdf13bd5a5d26a13d43fbbd3bbcf451b7a6eb6d49273096afa0 not found: ID does not exist" containerID="03a184a39ee7dbdf13bd5a5d26a13d43fbbd3bbcf451b7a6eb6d49273096afa0" Dec 02 07:43:46 crc kubenswrapper[4895]: I1202 07:43:46.603863 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03a184a39ee7dbdf13bd5a5d26a13d43fbbd3bbcf451b7a6eb6d49273096afa0"} err="failed to get container status \"03a184a39ee7dbdf13bd5a5d26a13d43fbbd3bbcf451b7a6eb6d49273096afa0\": rpc error: code = NotFound desc = could not find container \"03a184a39ee7dbdf13bd5a5d26a13d43fbbd3bbcf451b7a6eb6d49273096afa0\": container with ID 
starting with 03a184a39ee7dbdf13bd5a5d26a13d43fbbd3bbcf451b7a6eb6d49273096afa0 not found: ID does not exist" Dec 02 07:43:46 crc kubenswrapper[4895]: I1202 07:43:46.603912 4895 scope.go:117] "RemoveContainer" containerID="36b12cd5a2bc9c1ba5e1a0c823f9ed877bb778a55aac15de791fc4fa102713d6" Dec 02 07:43:46 crc kubenswrapper[4895]: E1202 07:43:46.604452 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36b12cd5a2bc9c1ba5e1a0c823f9ed877bb778a55aac15de791fc4fa102713d6\": container with ID starting with 36b12cd5a2bc9c1ba5e1a0c823f9ed877bb778a55aac15de791fc4fa102713d6 not found: ID does not exist" containerID="36b12cd5a2bc9c1ba5e1a0c823f9ed877bb778a55aac15de791fc4fa102713d6" Dec 02 07:43:46 crc kubenswrapper[4895]: I1202 07:43:46.604506 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36b12cd5a2bc9c1ba5e1a0c823f9ed877bb778a55aac15de791fc4fa102713d6"} err="failed to get container status \"36b12cd5a2bc9c1ba5e1a0c823f9ed877bb778a55aac15de791fc4fa102713d6\": rpc error: code = NotFound desc = could not find container \"36b12cd5a2bc9c1ba5e1a0c823f9ed877bb778a55aac15de791fc4fa102713d6\": container with ID starting with 36b12cd5a2bc9c1ba5e1a0c823f9ed877bb778a55aac15de791fc4fa102713d6 not found: ID does not exist" Dec 02 07:43:47 crc kubenswrapper[4895]: I1202 07:43:47.153427 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dee0d598-f287-4a9d-975a-2dba3c9eb93e" path="/var/lib/kubelet/pods/dee0d598-f287-4a9d-975a-2dba3c9eb93e/volumes" Dec 02 07:43:47 crc kubenswrapper[4895]: I1202 07:43:47.264495 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Dec 02 07:43:47 crc kubenswrapper[4895]: I1202 07:43:47.264622 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Dec 02 07:43:47 crc kubenswrapper[4895]: I1202 07:43:47.365931 
4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Dec 02 07:43:47 crc kubenswrapper[4895]: I1202 07:43:47.614100 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Dec 02 07:43:48 crc kubenswrapper[4895]: I1202 07:43:48.576055 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-hzt8n"] Dec 02 07:43:48 crc kubenswrapper[4895]: E1202 07:43:48.576874 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b019ba5-e2aa-4ac7-a3a6-02e619195935" containerName="init" Dec 02 07:43:48 crc kubenswrapper[4895]: I1202 07:43:48.576896 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b019ba5-e2aa-4ac7-a3a6-02e619195935" containerName="init" Dec 02 07:43:48 crc kubenswrapper[4895]: E1202 07:43:48.576913 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dee0d598-f287-4a9d-975a-2dba3c9eb93e" containerName="dnsmasq-dns" Dec 02 07:43:48 crc kubenswrapper[4895]: I1202 07:43:48.576922 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="dee0d598-f287-4a9d-975a-2dba3c9eb93e" containerName="dnsmasq-dns" Dec 02 07:43:48 crc kubenswrapper[4895]: E1202 07:43:48.576935 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dee0d598-f287-4a9d-975a-2dba3c9eb93e" containerName="init" Dec 02 07:43:48 crc kubenswrapper[4895]: I1202 07:43:48.576944 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="dee0d598-f287-4a9d-975a-2dba3c9eb93e" containerName="init" Dec 02 07:43:48 crc kubenswrapper[4895]: I1202 07:43:48.577159 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="dee0d598-f287-4a9d-975a-2dba3c9eb93e" containerName="dnsmasq-dns" Dec 02 07:43:48 crc kubenswrapper[4895]: I1202 07:43:48.577191 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b019ba5-e2aa-4ac7-a3a6-02e619195935" containerName="init" Dec 02 07:43:48 crc 
kubenswrapper[4895]: I1202 07:43:48.577943 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-hzt8n" Dec 02 07:43:48 crc kubenswrapper[4895]: I1202 07:43:48.586384 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-hzt8n"] Dec 02 07:43:48 crc kubenswrapper[4895]: I1202 07:43:48.707765 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-a3d7-account-create-update-dmv94"] Dec 02 07:43:48 crc kubenswrapper[4895]: I1202 07:43:48.708973 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-a3d7-account-create-update-dmv94" Dec 02 07:43:48 crc kubenswrapper[4895]: I1202 07:43:48.711803 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Dec 02 07:43:48 crc kubenswrapper[4895]: I1202 07:43:48.720035 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-a3d7-account-create-update-dmv94"] Dec 02 07:43:48 crc kubenswrapper[4895]: I1202 07:43:48.761720 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eec7076a-ba39-484b-9c7f-4eb78d449de2-operator-scripts\") pod \"keystone-db-create-hzt8n\" (UID: \"eec7076a-ba39-484b-9c7f-4eb78d449de2\") " pod="openstack/keystone-db-create-hzt8n" Dec 02 07:43:48 crc kubenswrapper[4895]: I1202 07:43:48.761823 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8clg6\" (UniqueName: \"kubernetes.io/projected/eec7076a-ba39-484b-9c7f-4eb78d449de2-kube-api-access-8clg6\") pod \"keystone-db-create-hzt8n\" (UID: \"eec7076a-ba39-484b-9c7f-4eb78d449de2\") " pod="openstack/keystone-db-create-hzt8n" Dec 02 07:43:48 crc kubenswrapper[4895]: I1202 07:43:48.807295 4895 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/placement-db-create-mvm85"] Dec 02 07:43:48 crc kubenswrapper[4895]: I1202 07:43:48.808971 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-mvm85" Dec 02 07:43:48 crc kubenswrapper[4895]: I1202 07:43:48.827761 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-mvm85"] Dec 02 07:43:48 crc kubenswrapper[4895]: I1202 07:43:48.864260 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/936e9805-90f9-43dd-ad0c-f248ea86a3c5-operator-scripts\") pod \"keystone-a3d7-account-create-update-dmv94\" (UID: \"936e9805-90f9-43dd-ad0c-f248ea86a3c5\") " pod="openstack/keystone-a3d7-account-create-update-dmv94" Dec 02 07:43:48 crc kubenswrapper[4895]: I1202 07:43:48.864323 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eec7076a-ba39-484b-9c7f-4eb78d449de2-operator-scripts\") pod \"keystone-db-create-hzt8n\" (UID: \"eec7076a-ba39-484b-9c7f-4eb78d449de2\") " pod="openstack/keystone-db-create-hzt8n" Dec 02 07:43:48 crc kubenswrapper[4895]: I1202 07:43:48.864375 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8clg6\" (UniqueName: \"kubernetes.io/projected/eec7076a-ba39-484b-9c7f-4eb78d449de2-kube-api-access-8clg6\") pod \"keystone-db-create-hzt8n\" (UID: \"eec7076a-ba39-484b-9c7f-4eb78d449de2\") " pod="openstack/keystone-db-create-hzt8n" Dec 02 07:43:48 crc kubenswrapper[4895]: I1202 07:43:48.864438 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhpmd\" (UniqueName: \"kubernetes.io/projected/936e9805-90f9-43dd-ad0c-f248ea86a3c5-kube-api-access-lhpmd\") pod \"keystone-a3d7-account-create-update-dmv94\" (UID: \"936e9805-90f9-43dd-ad0c-f248ea86a3c5\") " 
pod="openstack/keystone-a3d7-account-create-update-dmv94" Dec 02 07:43:48 crc kubenswrapper[4895]: I1202 07:43:48.865452 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eec7076a-ba39-484b-9c7f-4eb78d449de2-operator-scripts\") pod \"keystone-db-create-hzt8n\" (UID: \"eec7076a-ba39-484b-9c7f-4eb78d449de2\") " pod="openstack/keystone-db-create-hzt8n" Dec 02 07:43:48 crc kubenswrapper[4895]: I1202 07:43:48.888845 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8clg6\" (UniqueName: \"kubernetes.io/projected/eec7076a-ba39-484b-9c7f-4eb78d449de2-kube-api-access-8clg6\") pod \"keystone-db-create-hzt8n\" (UID: \"eec7076a-ba39-484b-9c7f-4eb78d449de2\") " pod="openstack/keystone-db-create-hzt8n" Dec 02 07:43:48 crc kubenswrapper[4895]: I1202 07:43:48.903824 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-hzt8n" Dec 02 07:43:48 crc kubenswrapper[4895]: I1202 07:43:48.966525 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgln6\" (UniqueName: \"kubernetes.io/projected/9a98f117-14fc-47c5-9106-c9a3daf161f8-kube-api-access-qgln6\") pod \"placement-db-create-mvm85\" (UID: \"9a98f117-14fc-47c5-9106-c9a3daf161f8\") " pod="openstack/placement-db-create-mvm85" Dec 02 07:43:48 crc kubenswrapper[4895]: I1202 07:43:48.966632 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/936e9805-90f9-43dd-ad0c-f248ea86a3c5-operator-scripts\") pod \"keystone-a3d7-account-create-update-dmv94\" (UID: \"936e9805-90f9-43dd-ad0c-f248ea86a3c5\") " pod="openstack/keystone-a3d7-account-create-update-dmv94" Dec 02 07:43:48 crc kubenswrapper[4895]: I1202 07:43:48.966675 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9a98f117-14fc-47c5-9106-c9a3daf161f8-operator-scripts\") pod \"placement-db-create-mvm85\" (UID: \"9a98f117-14fc-47c5-9106-c9a3daf161f8\") " pod="openstack/placement-db-create-mvm85" Dec 02 07:43:48 crc kubenswrapper[4895]: I1202 07:43:48.966728 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhpmd\" (UniqueName: \"kubernetes.io/projected/936e9805-90f9-43dd-ad0c-f248ea86a3c5-kube-api-access-lhpmd\") pod \"keystone-a3d7-account-create-update-dmv94\" (UID: \"936e9805-90f9-43dd-ad0c-f248ea86a3c5\") " pod="openstack/keystone-a3d7-account-create-update-dmv94" Dec 02 07:43:48 crc kubenswrapper[4895]: I1202 07:43:48.967760 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/936e9805-90f9-43dd-ad0c-f248ea86a3c5-operator-scripts\") pod \"keystone-a3d7-account-create-update-dmv94\" (UID: \"936e9805-90f9-43dd-ad0c-f248ea86a3c5\") " pod="openstack/keystone-a3d7-account-create-update-dmv94" Dec 02 07:43:48 crc kubenswrapper[4895]: I1202 07:43:48.994165 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhpmd\" (UniqueName: \"kubernetes.io/projected/936e9805-90f9-43dd-ad0c-f248ea86a3c5-kube-api-access-lhpmd\") pod \"keystone-a3d7-account-create-update-dmv94\" (UID: \"936e9805-90f9-43dd-ad0c-f248ea86a3c5\") " pod="openstack/keystone-a3d7-account-create-update-dmv94" Dec 02 07:43:49 crc kubenswrapper[4895]: I1202 07:43:49.007411 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-a8bc-account-create-update-8tvzs"] Dec 02 07:43:49 crc kubenswrapper[4895]: I1202 07:43:49.008539 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-a8bc-account-create-update-8tvzs" Dec 02 07:43:49 crc kubenswrapper[4895]: I1202 07:43:49.011155 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Dec 02 07:43:49 crc kubenswrapper[4895]: I1202 07:43:49.023163 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-a3d7-account-create-update-dmv94" Dec 02 07:43:49 crc kubenswrapper[4895]: I1202 07:43:49.037232 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-a8bc-account-create-update-8tvzs"] Dec 02 07:43:49 crc kubenswrapper[4895]: I1202 07:43:49.068055 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qgln6\" (UniqueName: \"kubernetes.io/projected/9a98f117-14fc-47c5-9106-c9a3daf161f8-kube-api-access-qgln6\") pod \"placement-db-create-mvm85\" (UID: \"9a98f117-14fc-47c5-9106-c9a3daf161f8\") " pod="openstack/placement-db-create-mvm85" Dec 02 07:43:49 crc kubenswrapper[4895]: I1202 07:43:49.068143 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9a98f117-14fc-47c5-9106-c9a3daf161f8-operator-scripts\") pod \"placement-db-create-mvm85\" (UID: \"9a98f117-14fc-47c5-9106-c9a3daf161f8\") " pod="openstack/placement-db-create-mvm85" Dec 02 07:43:49 crc kubenswrapper[4895]: I1202 07:43:49.069039 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9a98f117-14fc-47c5-9106-c9a3daf161f8-operator-scripts\") pod \"placement-db-create-mvm85\" (UID: \"9a98f117-14fc-47c5-9106-c9a3daf161f8\") " pod="openstack/placement-db-create-mvm85" Dec 02 07:43:49 crc kubenswrapper[4895]: I1202 07:43:49.096348 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgln6\" (UniqueName: 
\"kubernetes.io/projected/9a98f117-14fc-47c5-9106-c9a3daf161f8-kube-api-access-qgln6\") pod \"placement-db-create-mvm85\" (UID: \"9a98f117-14fc-47c5-9106-c9a3daf161f8\") " pod="openstack/placement-db-create-mvm85" Dec 02 07:43:49 crc kubenswrapper[4895]: I1202 07:43:49.130403 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-mvm85" Dec 02 07:43:49 crc kubenswrapper[4895]: I1202 07:43:49.174091 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ddddc35-bd7e-4d40-804f-aa2193b6cd16-operator-scripts\") pod \"placement-a8bc-account-create-update-8tvzs\" (UID: \"7ddddc35-bd7e-4d40-804f-aa2193b6cd16\") " pod="openstack/placement-a8bc-account-create-update-8tvzs" Dec 02 07:43:49 crc kubenswrapper[4895]: I1202 07:43:49.174211 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxz8t\" (UniqueName: \"kubernetes.io/projected/7ddddc35-bd7e-4d40-804f-aa2193b6cd16-kube-api-access-qxz8t\") pod \"placement-a8bc-account-create-update-8tvzs\" (UID: \"7ddddc35-bd7e-4d40-804f-aa2193b6cd16\") " pod="openstack/placement-a8bc-account-create-update-8tvzs" Dec 02 07:43:49 crc kubenswrapper[4895]: I1202 07:43:49.276057 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ddddc35-bd7e-4d40-804f-aa2193b6cd16-operator-scripts\") pod \"placement-a8bc-account-create-update-8tvzs\" (UID: \"7ddddc35-bd7e-4d40-804f-aa2193b6cd16\") " pod="openstack/placement-a8bc-account-create-update-8tvzs" Dec 02 07:43:49 crc kubenswrapper[4895]: I1202 07:43:49.276128 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qxz8t\" (UniqueName: \"kubernetes.io/projected/7ddddc35-bd7e-4d40-804f-aa2193b6cd16-kube-api-access-qxz8t\") pod 
\"placement-a8bc-account-create-update-8tvzs\" (UID: \"7ddddc35-bd7e-4d40-804f-aa2193b6cd16\") " pod="openstack/placement-a8bc-account-create-update-8tvzs" Dec 02 07:43:49 crc kubenswrapper[4895]: I1202 07:43:49.277358 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ddddc35-bd7e-4d40-804f-aa2193b6cd16-operator-scripts\") pod \"placement-a8bc-account-create-update-8tvzs\" (UID: \"7ddddc35-bd7e-4d40-804f-aa2193b6cd16\") " pod="openstack/placement-a8bc-account-create-update-8tvzs" Dec 02 07:43:49 crc kubenswrapper[4895]: I1202 07:43:49.298197 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxz8t\" (UniqueName: \"kubernetes.io/projected/7ddddc35-bd7e-4d40-804f-aa2193b6cd16-kube-api-access-qxz8t\") pod \"placement-a8bc-account-create-update-8tvzs\" (UID: \"7ddddc35-bd7e-4d40-804f-aa2193b6cd16\") " pod="openstack/placement-a8bc-account-create-update-8tvzs" Dec 02 07:43:49 crc kubenswrapper[4895]: I1202 07:43:49.410309 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-hzt8n"] Dec 02 07:43:49 crc kubenswrapper[4895]: I1202 07:43:49.417531 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-a8bc-account-create-update-8tvzs" Dec 02 07:43:49 crc kubenswrapper[4895]: W1202 07:43:49.421353 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeec7076a_ba39_484b_9c7f_4eb78d449de2.slice/crio-cc9c910f050c9aac6dc9dcee11cdb301ef7f94e1bd74c3bbd1cb923ebfb292eb WatchSource:0}: Error finding container cc9c910f050c9aac6dc9dcee11cdb301ef7f94e1bd74c3bbd1cb923ebfb292eb: Status 404 returned error can't find the container with id cc9c910f050c9aac6dc9dcee11cdb301ef7f94e1bd74c3bbd1cb923ebfb292eb Dec 02 07:43:49 crc kubenswrapper[4895]: I1202 07:43:49.568310 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-hzt8n" event={"ID":"eec7076a-ba39-484b-9c7f-4eb78d449de2","Type":"ContainerStarted","Data":"cc9c910f050c9aac6dc9dcee11cdb301ef7f94e1bd74c3bbd1cb923ebfb292eb"} Dec 02 07:43:49 crc kubenswrapper[4895]: I1202 07:43:49.621956 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-a3d7-account-create-update-dmv94"] Dec 02 07:43:49 crc kubenswrapper[4895]: W1202 07:43:49.630958 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod936e9805_90f9_43dd_ad0c_f248ea86a3c5.slice/crio-49d1b8d442ef0fb22ca482cce50b8e6ede9733d5f0bf39f6a0ef6e7ec320ebe3 WatchSource:0}: Error finding container 49d1b8d442ef0fb22ca482cce50b8e6ede9733d5f0bf39f6a0ef6e7ec320ebe3: Status 404 returned error can't find the container with id 49d1b8d442ef0fb22ca482cce50b8e6ede9733d5f0bf39f6a0ef6e7ec320ebe3 Dec 02 07:43:49 crc kubenswrapper[4895]: I1202 07:43:49.679098 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-mvm85"] Dec 02 07:43:49 crc kubenswrapper[4895]: I1202 07:43:49.856280 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-a8bc-account-create-update-8tvzs"] Dec 02 
07:43:49 crc kubenswrapper[4895]: W1202 07:43:49.857525 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7ddddc35_bd7e_4d40_804f_aa2193b6cd16.slice/crio-a4b29e8ae0b4a9febf840721cbcc7ed548d57966539b6354c9702b941c2f32e7 WatchSource:0}: Error finding container a4b29e8ae0b4a9febf840721cbcc7ed548d57966539b6354c9702b941c2f32e7: Status 404 returned error can't find the container with id a4b29e8ae0b4a9febf840721cbcc7ed548d57966539b6354c9702b941c2f32e7 Dec 02 07:43:50 crc kubenswrapper[4895]: I1202 07:43:50.579121 4895 generic.go:334] "Generic (PLEG): container finished" podID="936e9805-90f9-43dd-ad0c-f248ea86a3c5" containerID="28f9cb6d02e60c3a6d26b50a6fa46604e2e69011e552700cbb792dbf252b2632" exitCode=0 Dec 02 07:43:50 crc kubenswrapper[4895]: I1202 07:43:50.579223 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-a3d7-account-create-update-dmv94" event={"ID":"936e9805-90f9-43dd-ad0c-f248ea86a3c5","Type":"ContainerDied","Data":"28f9cb6d02e60c3a6d26b50a6fa46604e2e69011e552700cbb792dbf252b2632"} Dec 02 07:43:50 crc kubenswrapper[4895]: I1202 07:43:50.580218 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-a3d7-account-create-update-dmv94" event={"ID":"936e9805-90f9-43dd-ad0c-f248ea86a3c5","Type":"ContainerStarted","Data":"49d1b8d442ef0fb22ca482cce50b8e6ede9733d5f0bf39f6a0ef6e7ec320ebe3"} Dec 02 07:43:50 crc kubenswrapper[4895]: I1202 07:43:50.581888 4895 generic.go:334] "Generic (PLEG): container finished" podID="7ddddc35-bd7e-4d40-804f-aa2193b6cd16" containerID="4fdd958fc1822c12a2d4aca9d8bd5fd877dcad2ca93c61ee85e2640247da17f0" exitCode=0 Dec 02 07:43:50 crc kubenswrapper[4895]: I1202 07:43:50.581944 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-a8bc-account-create-update-8tvzs" 
event={"ID":"7ddddc35-bd7e-4d40-804f-aa2193b6cd16","Type":"ContainerDied","Data":"4fdd958fc1822c12a2d4aca9d8bd5fd877dcad2ca93c61ee85e2640247da17f0"} Dec 02 07:43:50 crc kubenswrapper[4895]: I1202 07:43:50.582139 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-a8bc-account-create-update-8tvzs" event={"ID":"7ddddc35-bd7e-4d40-804f-aa2193b6cd16","Type":"ContainerStarted","Data":"a4b29e8ae0b4a9febf840721cbcc7ed548d57966539b6354c9702b941c2f32e7"} Dec 02 07:43:50 crc kubenswrapper[4895]: I1202 07:43:50.584944 4895 generic.go:334] "Generic (PLEG): container finished" podID="eec7076a-ba39-484b-9c7f-4eb78d449de2" containerID="d8cc3e500cc7cf167ba6655926e2bd0f9e523259b1d217e4a231d4180d10b525" exitCode=0 Dec 02 07:43:50 crc kubenswrapper[4895]: I1202 07:43:50.585077 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-hzt8n" event={"ID":"eec7076a-ba39-484b-9c7f-4eb78d449de2","Type":"ContainerDied","Data":"d8cc3e500cc7cf167ba6655926e2bd0f9e523259b1d217e4a231d4180d10b525"} Dec 02 07:43:50 crc kubenswrapper[4895]: I1202 07:43:50.587023 4895 generic.go:334] "Generic (PLEG): container finished" podID="9a98f117-14fc-47c5-9106-c9a3daf161f8" containerID="9e4586f8b3fb6ca58d5504dd173c8353883757566be74d1cb6c65e2158e6f973" exitCode=0 Dec 02 07:43:50 crc kubenswrapper[4895]: I1202 07:43:50.587074 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-mvm85" event={"ID":"9a98f117-14fc-47c5-9106-c9a3daf161f8","Type":"ContainerDied","Data":"9e4586f8b3fb6ca58d5504dd173c8353883757566be74d1cb6c65e2158e6f973"} Dec 02 07:43:50 crc kubenswrapper[4895]: I1202 07:43:50.587104 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-mvm85" event={"ID":"9a98f117-14fc-47c5-9106-c9a3daf161f8","Type":"ContainerStarted","Data":"2791b769907b605fc8b38e7a713f1aec5513929fdeebb580056dc1b362703f9d"} Dec 02 07:43:51 crc kubenswrapper[4895]: I1202 07:43:51.153860 4895 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-698758b865-lsqg6"] Dec 02 07:43:51 crc kubenswrapper[4895]: I1202 07:43:51.156010 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-lsqg6" Dec 02 07:43:51 crc kubenswrapper[4895]: I1202 07:43:51.164695 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-lsqg6"] Dec 02 07:43:51 crc kubenswrapper[4895]: I1202 07:43:51.300351 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Dec 02 07:43:51 crc kubenswrapper[4895]: I1202 07:43:51.319867 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b50f18a4-6f7c-4d40-ad68-e6e55c5edd50-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-lsqg6\" (UID: \"b50f18a4-6f7c-4d40-ad68-e6e55c5edd50\") " pod="openstack/dnsmasq-dns-698758b865-lsqg6" Dec 02 07:43:51 crc kubenswrapper[4895]: I1202 07:43:51.319942 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b50f18a4-6f7c-4d40-ad68-e6e55c5edd50-dns-svc\") pod \"dnsmasq-dns-698758b865-lsqg6\" (UID: \"b50f18a4-6f7c-4d40-ad68-e6e55c5edd50\") " pod="openstack/dnsmasq-dns-698758b865-lsqg6" Dec 02 07:43:51 crc kubenswrapper[4895]: I1202 07:43:51.320025 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b50f18a4-6f7c-4d40-ad68-e6e55c5edd50-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-lsqg6\" (UID: \"b50f18a4-6f7c-4d40-ad68-e6e55c5edd50\") " pod="openstack/dnsmasq-dns-698758b865-lsqg6" Dec 02 07:43:51 crc kubenswrapper[4895]: I1202 07:43:51.320079 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/b50f18a4-6f7c-4d40-ad68-e6e55c5edd50-config\") pod \"dnsmasq-dns-698758b865-lsqg6\" (UID: \"b50f18a4-6f7c-4d40-ad68-e6e55c5edd50\") " pod="openstack/dnsmasq-dns-698758b865-lsqg6" Dec 02 07:43:51 crc kubenswrapper[4895]: I1202 07:43:51.320111 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cz56z\" (UniqueName: \"kubernetes.io/projected/b50f18a4-6f7c-4d40-ad68-e6e55c5edd50-kube-api-access-cz56z\") pod \"dnsmasq-dns-698758b865-lsqg6\" (UID: \"b50f18a4-6f7c-4d40-ad68-e6e55c5edd50\") " pod="openstack/dnsmasq-dns-698758b865-lsqg6" Dec 02 07:43:51 crc kubenswrapper[4895]: I1202 07:43:51.421985 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b50f18a4-6f7c-4d40-ad68-e6e55c5edd50-dns-svc\") pod \"dnsmasq-dns-698758b865-lsqg6\" (UID: \"b50f18a4-6f7c-4d40-ad68-e6e55c5edd50\") " pod="openstack/dnsmasq-dns-698758b865-lsqg6" Dec 02 07:43:51 crc kubenswrapper[4895]: I1202 07:43:51.422067 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b50f18a4-6f7c-4d40-ad68-e6e55c5edd50-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-lsqg6\" (UID: \"b50f18a4-6f7c-4d40-ad68-e6e55c5edd50\") " pod="openstack/dnsmasq-dns-698758b865-lsqg6" Dec 02 07:43:51 crc kubenswrapper[4895]: I1202 07:43:51.422176 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b50f18a4-6f7c-4d40-ad68-e6e55c5edd50-config\") pod \"dnsmasq-dns-698758b865-lsqg6\" (UID: \"b50f18a4-6f7c-4d40-ad68-e6e55c5edd50\") " pod="openstack/dnsmasq-dns-698758b865-lsqg6" Dec 02 07:43:51 crc kubenswrapper[4895]: I1202 07:43:51.422208 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cz56z\" (UniqueName: 
\"kubernetes.io/projected/b50f18a4-6f7c-4d40-ad68-e6e55c5edd50-kube-api-access-cz56z\") pod \"dnsmasq-dns-698758b865-lsqg6\" (UID: \"b50f18a4-6f7c-4d40-ad68-e6e55c5edd50\") " pod="openstack/dnsmasq-dns-698758b865-lsqg6" Dec 02 07:43:51 crc kubenswrapper[4895]: I1202 07:43:51.422301 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b50f18a4-6f7c-4d40-ad68-e6e55c5edd50-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-lsqg6\" (UID: \"b50f18a4-6f7c-4d40-ad68-e6e55c5edd50\") " pod="openstack/dnsmasq-dns-698758b865-lsqg6" Dec 02 07:43:51 crc kubenswrapper[4895]: I1202 07:43:51.423659 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b50f18a4-6f7c-4d40-ad68-e6e55c5edd50-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-lsqg6\" (UID: \"b50f18a4-6f7c-4d40-ad68-e6e55c5edd50\") " pod="openstack/dnsmasq-dns-698758b865-lsqg6" Dec 02 07:43:51 crc kubenswrapper[4895]: I1202 07:43:51.423944 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b50f18a4-6f7c-4d40-ad68-e6e55c5edd50-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-lsqg6\" (UID: \"b50f18a4-6f7c-4d40-ad68-e6e55c5edd50\") " pod="openstack/dnsmasq-dns-698758b865-lsqg6" Dec 02 07:43:51 crc kubenswrapper[4895]: I1202 07:43:51.423990 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b50f18a4-6f7c-4d40-ad68-e6e55c5edd50-dns-svc\") pod \"dnsmasq-dns-698758b865-lsqg6\" (UID: \"b50f18a4-6f7c-4d40-ad68-e6e55c5edd50\") " pod="openstack/dnsmasq-dns-698758b865-lsqg6" Dec 02 07:43:51 crc kubenswrapper[4895]: I1202 07:43:51.424275 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b50f18a4-6f7c-4d40-ad68-e6e55c5edd50-config\") pod 
\"dnsmasq-dns-698758b865-lsqg6\" (UID: \"b50f18a4-6f7c-4d40-ad68-e6e55c5edd50\") " pod="openstack/dnsmasq-dns-698758b865-lsqg6" Dec 02 07:43:51 crc kubenswrapper[4895]: I1202 07:43:51.449795 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cz56z\" (UniqueName: \"kubernetes.io/projected/b50f18a4-6f7c-4d40-ad68-e6e55c5edd50-kube-api-access-cz56z\") pod \"dnsmasq-dns-698758b865-lsqg6\" (UID: \"b50f18a4-6f7c-4d40-ad68-e6e55c5edd50\") " pod="openstack/dnsmasq-dns-698758b865-lsqg6" Dec 02 07:43:51 crc kubenswrapper[4895]: I1202 07:43:51.480111 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-lsqg6" Dec 02 07:43:51 crc kubenswrapper[4895]: I1202 07:43:51.932108 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-lsqg6"] Dec 02 07:43:52 crc kubenswrapper[4895]: I1202 07:43:52.108020 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-hzt8n" Dec 02 07:43:52 crc kubenswrapper[4895]: I1202 07:43:52.173224 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-mvm85" Dec 02 07:43:52 crc kubenswrapper[4895]: I1202 07:43:52.184408 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-a3d7-account-create-update-dmv94" Dec 02 07:43:52 crc kubenswrapper[4895]: I1202 07:43:52.187786 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-a8bc-account-create-update-8tvzs" Dec 02 07:43:52 crc kubenswrapper[4895]: I1202 07:43:52.242567 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8clg6\" (UniqueName: \"kubernetes.io/projected/eec7076a-ba39-484b-9c7f-4eb78d449de2-kube-api-access-8clg6\") pod \"eec7076a-ba39-484b-9c7f-4eb78d449de2\" (UID: \"eec7076a-ba39-484b-9c7f-4eb78d449de2\") " Dec 02 07:43:52 crc kubenswrapper[4895]: I1202 07:43:52.242713 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/936e9805-90f9-43dd-ad0c-f248ea86a3c5-operator-scripts\") pod \"936e9805-90f9-43dd-ad0c-f248ea86a3c5\" (UID: \"936e9805-90f9-43dd-ad0c-f248ea86a3c5\") " Dec 02 07:43:52 crc kubenswrapper[4895]: I1202 07:43:52.242884 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lhpmd\" (UniqueName: \"kubernetes.io/projected/936e9805-90f9-43dd-ad0c-f248ea86a3c5-kube-api-access-lhpmd\") pod \"936e9805-90f9-43dd-ad0c-f248ea86a3c5\" (UID: \"936e9805-90f9-43dd-ad0c-f248ea86a3c5\") " Dec 02 07:43:52 crc kubenswrapper[4895]: I1202 07:43:52.242918 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ddddc35-bd7e-4d40-804f-aa2193b6cd16-operator-scripts\") pod \"7ddddc35-bd7e-4d40-804f-aa2193b6cd16\" (UID: \"7ddddc35-bd7e-4d40-804f-aa2193b6cd16\") " Dec 02 07:43:52 crc kubenswrapper[4895]: I1202 07:43:52.242958 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qxz8t\" (UniqueName: \"kubernetes.io/projected/7ddddc35-bd7e-4d40-804f-aa2193b6cd16-kube-api-access-qxz8t\") pod \"7ddddc35-bd7e-4d40-804f-aa2193b6cd16\" (UID: \"7ddddc35-bd7e-4d40-804f-aa2193b6cd16\") " Dec 02 07:43:52 crc kubenswrapper[4895]: I1202 07:43:52.243013 4895 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-qgln6\" (UniqueName: \"kubernetes.io/projected/9a98f117-14fc-47c5-9106-c9a3daf161f8-kube-api-access-qgln6\") pod \"9a98f117-14fc-47c5-9106-c9a3daf161f8\" (UID: \"9a98f117-14fc-47c5-9106-c9a3daf161f8\") " Dec 02 07:43:52 crc kubenswrapper[4895]: I1202 07:43:52.243086 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eec7076a-ba39-484b-9c7f-4eb78d449de2-operator-scripts\") pod \"eec7076a-ba39-484b-9c7f-4eb78d449de2\" (UID: \"eec7076a-ba39-484b-9c7f-4eb78d449de2\") " Dec 02 07:43:52 crc kubenswrapper[4895]: I1202 07:43:52.243133 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9a98f117-14fc-47c5-9106-c9a3daf161f8-operator-scripts\") pod \"9a98f117-14fc-47c5-9106-c9a3daf161f8\" (UID: \"9a98f117-14fc-47c5-9106-c9a3daf161f8\") " Dec 02 07:43:52 crc kubenswrapper[4895]: I1202 07:43:52.245200 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ddddc35-bd7e-4d40-804f-aa2193b6cd16-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7ddddc35-bd7e-4d40-804f-aa2193b6cd16" (UID: "7ddddc35-bd7e-4d40-804f-aa2193b6cd16"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:43:52 crc kubenswrapper[4895]: I1202 07:43:52.245633 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a98f117-14fc-47c5-9106-c9a3daf161f8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9a98f117-14fc-47c5-9106-c9a3daf161f8" (UID: "9a98f117-14fc-47c5-9106-c9a3daf161f8"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:43:52 crc kubenswrapper[4895]: I1202 07:43:52.246978 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eec7076a-ba39-484b-9c7f-4eb78d449de2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "eec7076a-ba39-484b-9c7f-4eb78d449de2" (UID: "eec7076a-ba39-484b-9c7f-4eb78d449de2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:43:52 crc kubenswrapper[4895]: I1202 07:43:52.247523 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/936e9805-90f9-43dd-ad0c-f248ea86a3c5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "936e9805-90f9-43dd-ad0c-f248ea86a3c5" (UID: "936e9805-90f9-43dd-ad0c-f248ea86a3c5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:43:52 crc kubenswrapper[4895]: I1202 07:43:52.250199 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eec7076a-ba39-484b-9c7f-4eb78d449de2-kube-api-access-8clg6" (OuterVolumeSpecName: "kube-api-access-8clg6") pod "eec7076a-ba39-484b-9c7f-4eb78d449de2" (UID: "eec7076a-ba39-484b-9c7f-4eb78d449de2"). InnerVolumeSpecName "kube-api-access-8clg6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:43:52 crc kubenswrapper[4895]: I1202 07:43:52.252517 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ddddc35-bd7e-4d40-804f-aa2193b6cd16-kube-api-access-qxz8t" (OuterVolumeSpecName: "kube-api-access-qxz8t") pod "7ddddc35-bd7e-4d40-804f-aa2193b6cd16" (UID: "7ddddc35-bd7e-4d40-804f-aa2193b6cd16"). InnerVolumeSpecName "kube-api-access-qxz8t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:43:52 crc kubenswrapper[4895]: I1202 07:43:52.254152 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/936e9805-90f9-43dd-ad0c-f248ea86a3c5-kube-api-access-lhpmd" (OuterVolumeSpecName: "kube-api-access-lhpmd") pod "936e9805-90f9-43dd-ad0c-f248ea86a3c5" (UID: "936e9805-90f9-43dd-ad0c-f248ea86a3c5"). InnerVolumeSpecName "kube-api-access-lhpmd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:43:52 crc kubenswrapper[4895]: I1202 07:43:52.273278 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a98f117-14fc-47c5-9106-c9a3daf161f8-kube-api-access-qgln6" (OuterVolumeSpecName: "kube-api-access-qgln6") pod "9a98f117-14fc-47c5-9106-c9a3daf161f8" (UID: "9a98f117-14fc-47c5-9106-c9a3daf161f8"). InnerVolumeSpecName "kube-api-access-qgln6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:43:52 crc kubenswrapper[4895]: I1202 07:43:52.345648 4895 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ddddc35-bd7e-4d40-804f-aa2193b6cd16-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 07:43:52 crc kubenswrapper[4895]: I1202 07:43:52.345695 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qxz8t\" (UniqueName: \"kubernetes.io/projected/7ddddc35-bd7e-4d40-804f-aa2193b6cd16-kube-api-access-qxz8t\") on node \"crc\" DevicePath \"\"" Dec 02 07:43:52 crc kubenswrapper[4895]: I1202 07:43:52.345713 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qgln6\" (UniqueName: \"kubernetes.io/projected/9a98f117-14fc-47c5-9106-c9a3daf161f8-kube-api-access-qgln6\") on node \"crc\" DevicePath \"\"" Dec 02 07:43:52 crc kubenswrapper[4895]: I1202 07:43:52.345728 4895 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/eec7076a-ba39-484b-9c7f-4eb78d449de2-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 07:43:52 crc kubenswrapper[4895]: I1202 07:43:52.345800 4895 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9a98f117-14fc-47c5-9106-c9a3daf161f8-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 07:43:52 crc kubenswrapper[4895]: I1202 07:43:52.345812 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8clg6\" (UniqueName: \"kubernetes.io/projected/eec7076a-ba39-484b-9c7f-4eb78d449de2-kube-api-access-8clg6\") on node \"crc\" DevicePath \"\"" Dec 02 07:43:52 crc kubenswrapper[4895]: I1202 07:43:52.345821 4895 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/936e9805-90f9-43dd-ad0c-f248ea86a3c5-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 07:43:52 crc kubenswrapper[4895]: I1202 07:43:52.345830 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lhpmd\" (UniqueName: \"kubernetes.io/projected/936e9805-90f9-43dd-ad0c-f248ea86a3c5-kube-api-access-lhpmd\") on node \"crc\" DevicePath \"\"" Dec 02 07:43:52 crc kubenswrapper[4895]: I1202 07:43:52.421455 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Dec 02 07:43:52 crc kubenswrapper[4895]: E1202 07:43:52.421968 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ddddc35-bd7e-4d40-804f-aa2193b6cd16" containerName="mariadb-account-create-update" Dec 02 07:43:52 crc kubenswrapper[4895]: I1202 07:43:52.421999 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ddddc35-bd7e-4d40-804f-aa2193b6cd16" containerName="mariadb-account-create-update" Dec 02 07:43:52 crc kubenswrapper[4895]: E1202 07:43:52.422030 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eec7076a-ba39-484b-9c7f-4eb78d449de2" 
containerName="mariadb-database-create" Dec 02 07:43:52 crc kubenswrapper[4895]: I1202 07:43:52.422040 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="eec7076a-ba39-484b-9c7f-4eb78d449de2" containerName="mariadb-database-create" Dec 02 07:43:52 crc kubenswrapper[4895]: E1202 07:43:52.422061 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a98f117-14fc-47c5-9106-c9a3daf161f8" containerName="mariadb-database-create" Dec 02 07:43:52 crc kubenswrapper[4895]: I1202 07:43:52.422067 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a98f117-14fc-47c5-9106-c9a3daf161f8" containerName="mariadb-database-create" Dec 02 07:43:52 crc kubenswrapper[4895]: E1202 07:43:52.422080 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="936e9805-90f9-43dd-ad0c-f248ea86a3c5" containerName="mariadb-account-create-update" Dec 02 07:43:52 crc kubenswrapper[4895]: I1202 07:43:52.422086 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="936e9805-90f9-43dd-ad0c-f248ea86a3c5" containerName="mariadb-account-create-update" Dec 02 07:43:52 crc kubenswrapper[4895]: I1202 07:43:52.422242 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="eec7076a-ba39-484b-9c7f-4eb78d449de2" containerName="mariadb-database-create" Dec 02 07:43:52 crc kubenswrapper[4895]: I1202 07:43:52.422255 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a98f117-14fc-47c5-9106-c9a3daf161f8" containerName="mariadb-database-create" Dec 02 07:43:52 crc kubenswrapper[4895]: I1202 07:43:52.422261 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="936e9805-90f9-43dd-ad0c-f248ea86a3c5" containerName="mariadb-account-create-update" Dec 02 07:43:52 crc kubenswrapper[4895]: I1202 07:43:52.422272 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ddddc35-bd7e-4d40-804f-aa2193b6cd16" containerName="mariadb-account-create-update" Dec 02 07:43:52 crc kubenswrapper[4895]: I1202 07:43:52.428584 
4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Dec 02 07:43:52 crc kubenswrapper[4895]: I1202 07:43:52.441249 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Dec 02 07:43:52 crc kubenswrapper[4895]: I1202 07:43:52.477673 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-wzz46" Dec 02 07:43:52 crc kubenswrapper[4895]: I1202 07:43:52.478366 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Dec 02 07:43:52 crc kubenswrapper[4895]: I1202 07:43:52.478308 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Dec 02 07:43:52 crc kubenswrapper[4895]: I1202 07:43:52.478856 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Dec 02 07:43:52 crc kubenswrapper[4895]: I1202 07:43:52.581787 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"swift-storage-0\" (UID: \"11b8ece5-4192-4e13-a1c7-86ed3c627ddf\") " pod="openstack/swift-storage-0" Dec 02 07:43:52 crc kubenswrapper[4895]: I1202 07:43:52.581879 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/11b8ece5-4192-4e13-a1c7-86ed3c627ddf-lock\") pod \"swift-storage-0\" (UID: \"11b8ece5-4192-4e13-a1c7-86ed3c627ddf\") " pod="openstack/swift-storage-0" Dec 02 07:43:52 crc kubenswrapper[4895]: I1202 07:43:52.581919 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/11b8ece5-4192-4e13-a1c7-86ed3c627ddf-etc-swift\") pod \"swift-storage-0\" (UID: \"11b8ece5-4192-4e13-a1c7-86ed3c627ddf\") " 
pod="openstack/swift-storage-0" Dec 02 07:43:52 crc kubenswrapper[4895]: I1202 07:43:52.581978 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pz7qf\" (UniqueName: \"kubernetes.io/projected/11b8ece5-4192-4e13-a1c7-86ed3c627ddf-kube-api-access-pz7qf\") pod \"swift-storage-0\" (UID: \"11b8ece5-4192-4e13-a1c7-86ed3c627ddf\") " pod="openstack/swift-storage-0" Dec 02 07:43:52 crc kubenswrapper[4895]: I1202 07:43:52.582005 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/11b8ece5-4192-4e13-a1c7-86ed3c627ddf-cache\") pod \"swift-storage-0\" (UID: \"11b8ece5-4192-4e13-a1c7-86ed3c627ddf\") " pod="openstack/swift-storage-0" Dec 02 07:43:52 crc kubenswrapper[4895]: I1202 07:43:52.609062 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-a8bc-account-create-update-8tvzs" event={"ID":"7ddddc35-bd7e-4d40-804f-aa2193b6cd16","Type":"ContainerDied","Data":"a4b29e8ae0b4a9febf840721cbcc7ed548d57966539b6354c9702b941c2f32e7"} Dec 02 07:43:52 crc kubenswrapper[4895]: I1202 07:43:52.609169 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a4b29e8ae0b4a9febf840721cbcc7ed548d57966539b6354c9702b941c2f32e7" Dec 02 07:43:52 crc kubenswrapper[4895]: I1202 07:43:52.609088 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-a8bc-account-create-update-8tvzs" Dec 02 07:43:52 crc kubenswrapper[4895]: I1202 07:43:52.610953 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-hzt8n" Dec 02 07:43:52 crc kubenswrapper[4895]: I1202 07:43:52.611005 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-hzt8n" event={"ID":"eec7076a-ba39-484b-9c7f-4eb78d449de2","Type":"ContainerDied","Data":"cc9c910f050c9aac6dc9dcee11cdb301ef7f94e1bd74c3bbd1cb923ebfb292eb"} Dec 02 07:43:52 crc kubenswrapper[4895]: I1202 07:43:52.611380 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cc9c910f050c9aac6dc9dcee11cdb301ef7f94e1bd74c3bbd1cb923ebfb292eb" Dec 02 07:43:52 crc kubenswrapper[4895]: I1202 07:43:52.612894 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-mvm85" Dec 02 07:43:52 crc kubenswrapper[4895]: I1202 07:43:52.612889 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-mvm85" event={"ID":"9a98f117-14fc-47c5-9106-c9a3daf161f8","Type":"ContainerDied","Data":"2791b769907b605fc8b38e7a713f1aec5513929fdeebb580056dc1b362703f9d"} Dec 02 07:43:52 crc kubenswrapper[4895]: I1202 07:43:52.613063 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2791b769907b605fc8b38e7a713f1aec5513929fdeebb580056dc1b362703f9d" Dec 02 07:43:52 crc kubenswrapper[4895]: I1202 07:43:52.614507 4895 generic.go:334] "Generic (PLEG): container finished" podID="b50f18a4-6f7c-4d40-ad68-e6e55c5edd50" containerID="3d5c5568581556187481cff77ef82cfda6eb170ba06edbc7ef83e95471ce7d63" exitCode=0 Dec 02 07:43:52 crc kubenswrapper[4895]: I1202 07:43:52.614768 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-lsqg6" event={"ID":"b50f18a4-6f7c-4d40-ad68-e6e55c5edd50","Type":"ContainerDied","Data":"3d5c5568581556187481cff77ef82cfda6eb170ba06edbc7ef83e95471ce7d63"} Dec 02 07:43:52 crc kubenswrapper[4895]: I1202 07:43:52.614824 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-698758b865-lsqg6" event={"ID":"b50f18a4-6f7c-4d40-ad68-e6e55c5edd50","Type":"ContainerStarted","Data":"29e0b10a742c61c1b44cf3e5b72aa94dd4c52460f24c3e4d2e74597698eead5c"} Dec 02 07:43:52 crc kubenswrapper[4895]: I1202 07:43:52.616281 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-a3d7-account-create-update-dmv94" event={"ID":"936e9805-90f9-43dd-ad0c-f248ea86a3c5","Type":"ContainerDied","Data":"49d1b8d442ef0fb22ca482cce50b8e6ede9733d5f0bf39f6a0ef6e7ec320ebe3"} Dec 02 07:43:52 crc kubenswrapper[4895]: I1202 07:43:52.616399 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="49d1b8d442ef0fb22ca482cce50b8e6ede9733d5f0bf39f6a0ef6e7ec320ebe3" Dec 02 07:43:52 crc kubenswrapper[4895]: I1202 07:43:52.616353 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-a3d7-account-create-update-dmv94" Dec 02 07:43:52 crc kubenswrapper[4895]: I1202 07:43:52.684496 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pz7qf\" (UniqueName: \"kubernetes.io/projected/11b8ece5-4192-4e13-a1c7-86ed3c627ddf-kube-api-access-pz7qf\") pod \"swift-storage-0\" (UID: \"11b8ece5-4192-4e13-a1c7-86ed3c627ddf\") " pod="openstack/swift-storage-0" Dec 02 07:43:52 crc kubenswrapper[4895]: I1202 07:43:52.684975 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/11b8ece5-4192-4e13-a1c7-86ed3c627ddf-cache\") pod \"swift-storage-0\" (UID: \"11b8ece5-4192-4e13-a1c7-86ed3c627ddf\") " pod="openstack/swift-storage-0" Dec 02 07:43:52 crc kubenswrapper[4895]: I1202 07:43:52.685027 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"swift-storage-0\" (UID: \"11b8ece5-4192-4e13-a1c7-86ed3c627ddf\") " 
pod="openstack/swift-storage-0" Dec 02 07:43:52 crc kubenswrapper[4895]: I1202 07:43:52.685096 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/11b8ece5-4192-4e13-a1c7-86ed3c627ddf-lock\") pod \"swift-storage-0\" (UID: \"11b8ece5-4192-4e13-a1c7-86ed3c627ddf\") " pod="openstack/swift-storage-0" Dec 02 07:43:52 crc kubenswrapper[4895]: I1202 07:43:52.685151 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/11b8ece5-4192-4e13-a1c7-86ed3c627ddf-etc-swift\") pod \"swift-storage-0\" (UID: \"11b8ece5-4192-4e13-a1c7-86ed3c627ddf\") " pod="openstack/swift-storage-0" Dec 02 07:43:52 crc kubenswrapper[4895]: I1202 07:43:52.686174 4895 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"swift-storage-0\" (UID: \"11b8ece5-4192-4e13-a1c7-86ed3c627ddf\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/swift-storage-0" Dec 02 07:43:52 crc kubenswrapper[4895]: E1202 07:43:52.686814 4895 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 02 07:43:52 crc kubenswrapper[4895]: E1202 07:43:52.686844 4895 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 02 07:43:52 crc kubenswrapper[4895]: E1202 07:43:52.686912 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/11b8ece5-4192-4e13-a1c7-86ed3c627ddf-etc-swift podName:11b8ece5-4192-4e13-a1c7-86ed3c627ddf nodeName:}" failed. No retries permitted until 2025-12-02 07:43:53.186890107 +0000 UTC m=+1244.357749720 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/11b8ece5-4192-4e13-a1c7-86ed3c627ddf-etc-swift") pod "swift-storage-0" (UID: "11b8ece5-4192-4e13-a1c7-86ed3c627ddf") : configmap "swift-ring-files" not found Dec 02 07:43:52 crc kubenswrapper[4895]: I1202 07:43:52.688011 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/11b8ece5-4192-4e13-a1c7-86ed3c627ddf-lock\") pod \"swift-storage-0\" (UID: \"11b8ece5-4192-4e13-a1c7-86ed3c627ddf\") " pod="openstack/swift-storage-0" Dec 02 07:43:52 crc kubenswrapper[4895]: I1202 07:43:52.688447 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/11b8ece5-4192-4e13-a1c7-86ed3c627ddf-cache\") pod \"swift-storage-0\" (UID: \"11b8ece5-4192-4e13-a1c7-86ed3c627ddf\") " pod="openstack/swift-storage-0" Dec 02 07:43:52 crc kubenswrapper[4895]: I1202 07:43:52.703360 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pz7qf\" (UniqueName: \"kubernetes.io/projected/11b8ece5-4192-4e13-a1c7-86ed3c627ddf-kube-api-access-pz7qf\") pod \"swift-storage-0\" (UID: \"11b8ece5-4192-4e13-a1c7-86ed3c627ddf\") " pod="openstack/swift-storage-0" Dec 02 07:43:52 crc kubenswrapper[4895]: I1202 07:43:52.719108 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"swift-storage-0\" (UID: \"11b8ece5-4192-4e13-a1c7-86ed3c627ddf\") " pod="openstack/swift-storage-0" Dec 02 07:43:53 crc kubenswrapper[4895]: I1202 07:43:53.201825 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/11b8ece5-4192-4e13-a1c7-86ed3c627ddf-etc-swift\") pod \"swift-storage-0\" (UID: \"11b8ece5-4192-4e13-a1c7-86ed3c627ddf\") " pod="openstack/swift-storage-0" Dec 02 07:43:53 crc 
kubenswrapper[4895]: E1202 07:43:53.202088 4895 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 02 07:43:53 crc kubenswrapper[4895]: E1202 07:43:53.202122 4895 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 02 07:43:53 crc kubenswrapper[4895]: E1202 07:43:53.202206 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/11b8ece5-4192-4e13-a1c7-86ed3c627ddf-etc-swift podName:11b8ece5-4192-4e13-a1c7-86ed3c627ddf nodeName:}" failed. No retries permitted until 2025-12-02 07:43:54.202177568 +0000 UTC m=+1245.373037181 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/11b8ece5-4192-4e13-a1c7-86ed3c627ddf-etc-swift") pod "swift-storage-0" (UID: "11b8ece5-4192-4e13-a1c7-86ed3c627ddf") : configmap "swift-ring-files" not found Dec 02 07:43:53 crc kubenswrapper[4895]: I1202 07:43:53.629857 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-lsqg6" event={"ID":"b50f18a4-6f7c-4d40-ad68-e6e55c5edd50","Type":"ContainerStarted","Data":"bfa758772dddbb174854624875a511c8200e31a9842ba6666340092013875a32"} Dec 02 07:43:53 crc kubenswrapper[4895]: I1202 07:43:53.630047 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-lsqg6" Dec 02 07:43:53 crc kubenswrapper[4895]: I1202 07:43:53.657204 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-698758b865-lsqg6" podStartSLOduration=2.657179449 podStartE2EDuration="2.657179449s" podCreationTimestamp="2025-12-02 07:43:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:43:53.650108901 +0000 UTC m=+1244.820968534" 
watchObservedRunningTime="2025-12-02 07:43:53.657179449 +0000 UTC m=+1244.828039062" Dec 02 07:43:54 crc kubenswrapper[4895]: I1202 07:43:54.145997 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-nbxdg"] Dec 02 07:43:54 crc kubenswrapper[4895]: I1202 07:43:54.147638 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-nbxdg" Dec 02 07:43:54 crc kubenswrapper[4895]: I1202 07:43:54.156364 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-nbxdg"] Dec 02 07:43:54 crc kubenswrapper[4895]: I1202 07:43:54.222691 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/11b8ece5-4192-4e13-a1c7-86ed3c627ddf-etc-swift\") pod \"swift-storage-0\" (UID: \"11b8ece5-4192-4e13-a1c7-86ed3c627ddf\") " pod="openstack/swift-storage-0" Dec 02 07:43:54 crc kubenswrapper[4895]: I1202 07:43:54.222863 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwqj7\" (UniqueName: \"kubernetes.io/projected/ad4346a5-f2f0-4809-b1d3-0c9a70b51cbd-kube-api-access-bwqj7\") pod \"glance-db-create-nbxdg\" (UID: \"ad4346a5-f2f0-4809-b1d3-0c9a70b51cbd\") " pod="openstack/glance-db-create-nbxdg" Dec 02 07:43:54 crc kubenswrapper[4895]: I1202 07:43:54.223011 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ad4346a5-f2f0-4809-b1d3-0c9a70b51cbd-operator-scripts\") pod \"glance-db-create-nbxdg\" (UID: \"ad4346a5-f2f0-4809-b1d3-0c9a70b51cbd\") " pod="openstack/glance-db-create-nbxdg" Dec 02 07:43:54 crc kubenswrapper[4895]: E1202 07:43:54.223092 4895 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 02 07:43:54 crc kubenswrapper[4895]: E1202 07:43:54.223152 4895 
projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 02 07:43:54 crc kubenswrapper[4895]: E1202 07:43:54.223239 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/11b8ece5-4192-4e13-a1c7-86ed3c627ddf-etc-swift podName:11b8ece5-4192-4e13-a1c7-86ed3c627ddf nodeName:}" failed. No retries permitted until 2025-12-02 07:43:56.223207115 +0000 UTC m=+1247.394066728 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/11b8ece5-4192-4e13-a1c7-86ed3c627ddf-etc-swift") pod "swift-storage-0" (UID: "11b8ece5-4192-4e13-a1c7-86ed3c627ddf") : configmap "swift-ring-files" not found Dec 02 07:43:54 crc kubenswrapper[4895]: I1202 07:43:54.254000 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-5583-account-create-update-vwhn6"] Dec 02 07:43:54 crc kubenswrapper[4895]: I1202 07:43:54.255363 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-5583-account-create-update-vwhn6" Dec 02 07:43:54 crc kubenswrapper[4895]: I1202 07:43:54.266117 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Dec 02 07:43:54 crc kubenswrapper[4895]: I1202 07:43:54.273433 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-5583-account-create-update-vwhn6"] Dec 02 07:43:54 crc kubenswrapper[4895]: I1202 07:43:54.325070 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9f38520b-350c-4c3c-9bd2-48bf3c492299-operator-scripts\") pod \"glance-5583-account-create-update-vwhn6\" (UID: \"9f38520b-350c-4c3c-9bd2-48bf3c492299\") " pod="openstack/glance-5583-account-create-update-vwhn6" Dec 02 07:43:54 crc kubenswrapper[4895]: I1202 07:43:54.325152 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9xjj\" (UniqueName: \"kubernetes.io/projected/9f38520b-350c-4c3c-9bd2-48bf3c492299-kube-api-access-k9xjj\") pod \"glance-5583-account-create-update-vwhn6\" (UID: \"9f38520b-350c-4c3c-9bd2-48bf3c492299\") " pod="openstack/glance-5583-account-create-update-vwhn6" Dec 02 07:43:54 crc kubenswrapper[4895]: I1202 07:43:54.325205 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bwqj7\" (UniqueName: \"kubernetes.io/projected/ad4346a5-f2f0-4809-b1d3-0c9a70b51cbd-kube-api-access-bwqj7\") pod \"glance-db-create-nbxdg\" (UID: \"ad4346a5-f2f0-4809-b1d3-0c9a70b51cbd\") " pod="openstack/glance-db-create-nbxdg" Dec 02 07:43:54 crc kubenswrapper[4895]: I1202 07:43:54.325430 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ad4346a5-f2f0-4809-b1d3-0c9a70b51cbd-operator-scripts\") pod \"glance-db-create-nbxdg\" (UID: 
\"ad4346a5-f2f0-4809-b1d3-0c9a70b51cbd\") " pod="openstack/glance-db-create-nbxdg" Dec 02 07:43:54 crc kubenswrapper[4895]: I1202 07:43:54.326734 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ad4346a5-f2f0-4809-b1d3-0c9a70b51cbd-operator-scripts\") pod \"glance-db-create-nbxdg\" (UID: \"ad4346a5-f2f0-4809-b1d3-0c9a70b51cbd\") " pod="openstack/glance-db-create-nbxdg" Dec 02 07:43:54 crc kubenswrapper[4895]: I1202 07:43:54.349025 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bwqj7\" (UniqueName: \"kubernetes.io/projected/ad4346a5-f2f0-4809-b1d3-0c9a70b51cbd-kube-api-access-bwqj7\") pod \"glance-db-create-nbxdg\" (UID: \"ad4346a5-f2f0-4809-b1d3-0c9a70b51cbd\") " pod="openstack/glance-db-create-nbxdg" Dec 02 07:43:54 crc kubenswrapper[4895]: I1202 07:43:54.427539 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9f38520b-350c-4c3c-9bd2-48bf3c492299-operator-scripts\") pod \"glance-5583-account-create-update-vwhn6\" (UID: \"9f38520b-350c-4c3c-9bd2-48bf3c492299\") " pod="openstack/glance-5583-account-create-update-vwhn6" Dec 02 07:43:54 crc kubenswrapper[4895]: I1202 07:43:54.427630 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9xjj\" (UniqueName: \"kubernetes.io/projected/9f38520b-350c-4c3c-9bd2-48bf3c492299-kube-api-access-k9xjj\") pod \"glance-5583-account-create-update-vwhn6\" (UID: \"9f38520b-350c-4c3c-9bd2-48bf3c492299\") " pod="openstack/glance-5583-account-create-update-vwhn6" Dec 02 07:43:54 crc kubenswrapper[4895]: I1202 07:43:54.428400 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9f38520b-350c-4c3c-9bd2-48bf3c492299-operator-scripts\") pod \"glance-5583-account-create-update-vwhn6\" (UID: 
\"9f38520b-350c-4c3c-9bd2-48bf3c492299\") " pod="openstack/glance-5583-account-create-update-vwhn6" Dec 02 07:43:54 crc kubenswrapper[4895]: I1202 07:43:54.452760 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9xjj\" (UniqueName: \"kubernetes.io/projected/9f38520b-350c-4c3c-9bd2-48bf3c492299-kube-api-access-k9xjj\") pod \"glance-5583-account-create-update-vwhn6\" (UID: \"9f38520b-350c-4c3c-9bd2-48bf3c492299\") " pod="openstack/glance-5583-account-create-update-vwhn6" Dec 02 07:43:54 crc kubenswrapper[4895]: I1202 07:43:54.494365 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-nbxdg" Dec 02 07:43:54 crc kubenswrapper[4895]: I1202 07:43:54.578410 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-5583-account-create-update-vwhn6" Dec 02 07:43:54 crc kubenswrapper[4895]: I1202 07:43:54.999168 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-nbxdg"] Dec 02 07:43:55 crc kubenswrapper[4895]: W1202 07:43:55.003323 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podad4346a5_f2f0_4809_b1d3_0c9a70b51cbd.slice/crio-930a34b0a7a6cb92ef83118765bd9350738b9ad8ab74be03686c5356f812e528 WatchSource:0}: Error finding container 930a34b0a7a6cb92ef83118765bd9350738b9ad8ab74be03686c5356f812e528: Status 404 returned error can't find the container with id 930a34b0a7a6cb92ef83118765bd9350738b9ad8ab74be03686c5356f812e528 Dec 02 07:43:55 crc kubenswrapper[4895]: I1202 07:43:55.106660 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-5583-account-create-update-vwhn6"] Dec 02 07:43:55 crc kubenswrapper[4895]: W1202 07:43:55.109722 4895 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9f38520b_350c_4c3c_9bd2_48bf3c492299.slice/crio-d6dc95012490c1debaccfe270a3867b6f5e767deb04bb1628c04d94817ab1ecc WatchSource:0}: Error finding container d6dc95012490c1debaccfe270a3867b6f5e767deb04bb1628c04d94817ab1ecc: Status 404 returned error can't find the container with id d6dc95012490c1debaccfe270a3867b6f5e767deb04bb1628c04d94817ab1ecc Dec 02 07:43:55 crc kubenswrapper[4895]: I1202 07:43:55.660455 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-nbxdg" event={"ID":"ad4346a5-f2f0-4809-b1d3-0c9a70b51cbd","Type":"ContainerStarted","Data":"15c0a8a60d6e51b79c5a48224057195f23217f2a902e4569436fc6187c88a4ee"} Dec 02 07:43:55 crc kubenswrapper[4895]: I1202 07:43:55.660511 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-nbxdg" event={"ID":"ad4346a5-f2f0-4809-b1d3-0c9a70b51cbd","Type":"ContainerStarted","Data":"930a34b0a7a6cb92ef83118765bd9350738b9ad8ab74be03686c5356f812e528"} Dec 02 07:43:55 crc kubenswrapper[4895]: I1202 07:43:55.667043 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-5583-account-create-update-vwhn6" event={"ID":"9f38520b-350c-4c3c-9bd2-48bf3c492299","Type":"ContainerStarted","Data":"7ae3e5d5ec8de27bf0dd2f2d640b40e84c4d95a0e1317e0fee0317f8b1e9f187"} Dec 02 07:43:55 crc kubenswrapper[4895]: I1202 07:43:55.667096 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-5583-account-create-update-vwhn6" event={"ID":"9f38520b-350c-4c3c-9bd2-48bf3c492299","Type":"ContainerStarted","Data":"d6dc95012490c1debaccfe270a3867b6f5e767deb04bb1628c04d94817ab1ecc"} Dec 02 07:43:55 crc kubenswrapper[4895]: I1202 07:43:55.677758 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-create-nbxdg" podStartSLOduration=1.677720171 podStartE2EDuration="1.677720171s" podCreationTimestamp="2025-12-02 07:43:54 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:43:55.676359968 +0000 UTC m=+1246.847219601" watchObservedRunningTime="2025-12-02 07:43:55.677720171 +0000 UTC m=+1246.848579784" Dec 02 07:43:55 crc kubenswrapper[4895]: I1202 07:43:55.702822 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-5583-account-create-update-vwhn6" podStartSLOduration=1.702796974 podStartE2EDuration="1.702796974s" podCreationTimestamp="2025-12-02 07:43:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:43:55.694415775 +0000 UTC m=+1246.865275398" watchObservedRunningTime="2025-12-02 07:43:55.702796974 +0000 UTC m=+1246.873656607" Dec 02 07:43:56 crc kubenswrapper[4895]: I1202 07:43:56.272609 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/11b8ece5-4192-4e13-a1c7-86ed3c627ddf-etc-swift\") pod \"swift-storage-0\" (UID: \"11b8ece5-4192-4e13-a1c7-86ed3c627ddf\") " pod="openstack/swift-storage-0" Dec 02 07:43:56 crc kubenswrapper[4895]: E1202 07:43:56.273452 4895 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 02 07:43:56 crc kubenswrapper[4895]: E1202 07:43:56.273624 4895 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 02 07:43:56 crc kubenswrapper[4895]: E1202 07:43:56.273909 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/11b8ece5-4192-4e13-a1c7-86ed3c627ddf-etc-swift podName:11b8ece5-4192-4e13-a1c7-86ed3c627ddf nodeName:}" failed. No retries permitted until 2025-12-02 07:44:00.273876035 +0000 UTC m=+1251.444735658 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/11b8ece5-4192-4e13-a1c7-86ed3c627ddf-etc-swift") pod "swift-storage-0" (UID: "11b8ece5-4192-4e13-a1c7-86ed3c627ddf") : configmap "swift-ring-files" not found Dec 02 07:43:56 crc kubenswrapper[4895]: I1202 07:43:56.305538 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-74rn2"] Dec 02 07:43:56 crc kubenswrapper[4895]: I1202 07:43:56.307831 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-74rn2" Dec 02 07:43:56 crc kubenswrapper[4895]: I1202 07:43:56.311565 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Dec 02 07:43:56 crc kubenswrapper[4895]: I1202 07:43:56.311856 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Dec 02 07:43:56 crc kubenswrapper[4895]: I1202 07:43:56.317724 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Dec 02 07:43:56 crc kubenswrapper[4895]: I1202 07:43:56.320193 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-74rn2"] Dec 02 07:43:56 crc kubenswrapper[4895]: I1202 07:43:56.475590 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/caa608fc-52f9-426b-aca3-610fe5e245e0-combined-ca-bundle\") pod \"swift-ring-rebalance-74rn2\" (UID: \"caa608fc-52f9-426b-aca3-610fe5e245e0\") " pod="openstack/swift-ring-rebalance-74rn2" Dec 02 07:43:56 crc kubenswrapper[4895]: I1202 07:43:56.475648 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rt66t\" (UniqueName: \"kubernetes.io/projected/caa608fc-52f9-426b-aca3-610fe5e245e0-kube-api-access-rt66t\") pod \"swift-ring-rebalance-74rn2\" (UID: 
\"caa608fc-52f9-426b-aca3-610fe5e245e0\") " pod="openstack/swift-ring-rebalance-74rn2" Dec 02 07:43:56 crc kubenswrapper[4895]: I1202 07:43:56.475670 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/caa608fc-52f9-426b-aca3-610fe5e245e0-etc-swift\") pod \"swift-ring-rebalance-74rn2\" (UID: \"caa608fc-52f9-426b-aca3-610fe5e245e0\") " pod="openstack/swift-ring-rebalance-74rn2" Dec 02 07:43:56 crc kubenswrapper[4895]: I1202 07:43:56.475710 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/caa608fc-52f9-426b-aca3-610fe5e245e0-dispersionconf\") pod \"swift-ring-rebalance-74rn2\" (UID: \"caa608fc-52f9-426b-aca3-610fe5e245e0\") " pod="openstack/swift-ring-rebalance-74rn2" Dec 02 07:43:56 crc kubenswrapper[4895]: I1202 07:43:56.475728 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/caa608fc-52f9-426b-aca3-610fe5e245e0-ring-data-devices\") pod \"swift-ring-rebalance-74rn2\" (UID: \"caa608fc-52f9-426b-aca3-610fe5e245e0\") " pod="openstack/swift-ring-rebalance-74rn2" Dec 02 07:43:56 crc kubenswrapper[4895]: I1202 07:43:56.475798 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/caa608fc-52f9-426b-aca3-610fe5e245e0-swiftconf\") pod \"swift-ring-rebalance-74rn2\" (UID: \"caa608fc-52f9-426b-aca3-610fe5e245e0\") " pod="openstack/swift-ring-rebalance-74rn2" Dec 02 07:43:56 crc kubenswrapper[4895]: I1202 07:43:56.475842 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/caa608fc-52f9-426b-aca3-610fe5e245e0-scripts\") pod \"swift-ring-rebalance-74rn2\" (UID: 
\"caa608fc-52f9-426b-aca3-610fe5e245e0\") " pod="openstack/swift-ring-rebalance-74rn2" Dec 02 07:43:56 crc kubenswrapper[4895]: I1202 07:43:56.578434 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/caa608fc-52f9-426b-aca3-610fe5e245e0-scripts\") pod \"swift-ring-rebalance-74rn2\" (UID: \"caa608fc-52f9-426b-aca3-610fe5e245e0\") " pod="openstack/swift-ring-rebalance-74rn2" Dec 02 07:43:56 crc kubenswrapper[4895]: I1202 07:43:56.578586 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/caa608fc-52f9-426b-aca3-610fe5e245e0-combined-ca-bundle\") pod \"swift-ring-rebalance-74rn2\" (UID: \"caa608fc-52f9-426b-aca3-610fe5e245e0\") " pod="openstack/swift-ring-rebalance-74rn2" Dec 02 07:43:56 crc kubenswrapper[4895]: I1202 07:43:56.578641 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rt66t\" (UniqueName: \"kubernetes.io/projected/caa608fc-52f9-426b-aca3-610fe5e245e0-kube-api-access-rt66t\") pod \"swift-ring-rebalance-74rn2\" (UID: \"caa608fc-52f9-426b-aca3-610fe5e245e0\") " pod="openstack/swift-ring-rebalance-74rn2" Dec 02 07:43:56 crc kubenswrapper[4895]: I1202 07:43:56.578669 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/caa608fc-52f9-426b-aca3-610fe5e245e0-etc-swift\") pod \"swift-ring-rebalance-74rn2\" (UID: \"caa608fc-52f9-426b-aca3-610fe5e245e0\") " pod="openstack/swift-ring-rebalance-74rn2" Dec 02 07:43:56 crc kubenswrapper[4895]: I1202 07:43:56.578720 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/caa608fc-52f9-426b-aca3-610fe5e245e0-dispersionconf\") pod \"swift-ring-rebalance-74rn2\" (UID: \"caa608fc-52f9-426b-aca3-610fe5e245e0\") " 
pod="openstack/swift-ring-rebalance-74rn2" Dec 02 07:43:56 crc kubenswrapper[4895]: I1202 07:43:56.578767 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/caa608fc-52f9-426b-aca3-610fe5e245e0-ring-data-devices\") pod \"swift-ring-rebalance-74rn2\" (UID: \"caa608fc-52f9-426b-aca3-610fe5e245e0\") " pod="openstack/swift-ring-rebalance-74rn2" Dec 02 07:43:56 crc kubenswrapper[4895]: I1202 07:43:56.578830 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/caa608fc-52f9-426b-aca3-610fe5e245e0-swiftconf\") pod \"swift-ring-rebalance-74rn2\" (UID: \"caa608fc-52f9-426b-aca3-610fe5e245e0\") " pod="openstack/swift-ring-rebalance-74rn2" Dec 02 07:43:56 crc kubenswrapper[4895]: I1202 07:43:56.579385 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/caa608fc-52f9-426b-aca3-610fe5e245e0-etc-swift\") pod \"swift-ring-rebalance-74rn2\" (UID: \"caa608fc-52f9-426b-aca3-610fe5e245e0\") " pod="openstack/swift-ring-rebalance-74rn2" Dec 02 07:43:56 crc kubenswrapper[4895]: I1202 07:43:56.579807 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/caa608fc-52f9-426b-aca3-610fe5e245e0-scripts\") pod \"swift-ring-rebalance-74rn2\" (UID: \"caa608fc-52f9-426b-aca3-610fe5e245e0\") " pod="openstack/swift-ring-rebalance-74rn2" Dec 02 07:43:56 crc kubenswrapper[4895]: I1202 07:43:56.580842 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/caa608fc-52f9-426b-aca3-610fe5e245e0-ring-data-devices\") pod \"swift-ring-rebalance-74rn2\" (UID: \"caa608fc-52f9-426b-aca3-610fe5e245e0\") " pod="openstack/swift-ring-rebalance-74rn2" Dec 02 07:43:56 crc kubenswrapper[4895]: I1202 07:43:56.584863 4895 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/caa608fc-52f9-426b-aca3-610fe5e245e0-combined-ca-bundle\") pod \"swift-ring-rebalance-74rn2\" (UID: \"caa608fc-52f9-426b-aca3-610fe5e245e0\") " pod="openstack/swift-ring-rebalance-74rn2" Dec 02 07:43:56 crc kubenswrapper[4895]: I1202 07:43:56.585127 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/caa608fc-52f9-426b-aca3-610fe5e245e0-dispersionconf\") pod \"swift-ring-rebalance-74rn2\" (UID: \"caa608fc-52f9-426b-aca3-610fe5e245e0\") " pod="openstack/swift-ring-rebalance-74rn2" Dec 02 07:43:56 crc kubenswrapper[4895]: I1202 07:43:56.585168 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/caa608fc-52f9-426b-aca3-610fe5e245e0-swiftconf\") pod \"swift-ring-rebalance-74rn2\" (UID: \"caa608fc-52f9-426b-aca3-610fe5e245e0\") " pod="openstack/swift-ring-rebalance-74rn2" Dec 02 07:43:56 crc kubenswrapper[4895]: I1202 07:43:56.594802 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rt66t\" (UniqueName: \"kubernetes.io/projected/caa608fc-52f9-426b-aca3-610fe5e245e0-kube-api-access-rt66t\") pod \"swift-ring-rebalance-74rn2\" (UID: \"caa608fc-52f9-426b-aca3-610fe5e245e0\") " pod="openstack/swift-ring-rebalance-74rn2" Dec 02 07:43:56 crc kubenswrapper[4895]: I1202 07:43:56.628454 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-74rn2" Dec 02 07:43:56 crc kubenswrapper[4895]: I1202 07:43:56.696327 4895 generic.go:334] "Generic (PLEG): container finished" podID="ad4346a5-f2f0-4809-b1d3-0c9a70b51cbd" containerID="15c0a8a60d6e51b79c5a48224057195f23217f2a902e4569436fc6187c88a4ee" exitCode=0 Dec 02 07:43:56 crc kubenswrapper[4895]: I1202 07:43:56.696422 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-nbxdg" event={"ID":"ad4346a5-f2f0-4809-b1d3-0c9a70b51cbd","Type":"ContainerDied","Data":"15c0a8a60d6e51b79c5a48224057195f23217f2a902e4569436fc6187c88a4ee"} Dec 02 07:43:56 crc kubenswrapper[4895]: I1202 07:43:56.709337 4895 generic.go:334] "Generic (PLEG): container finished" podID="9f38520b-350c-4c3c-9bd2-48bf3c492299" containerID="7ae3e5d5ec8de27bf0dd2f2d640b40e84c4d95a0e1317e0fee0317f8b1e9f187" exitCode=0 Dec 02 07:43:56 crc kubenswrapper[4895]: I1202 07:43:56.709392 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-5583-account-create-update-vwhn6" event={"ID":"9f38520b-350c-4c3c-9bd2-48bf3c492299","Type":"ContainerDied","Data":"7ae3e5d5ec8de27bf0dd2f2d640b40e84c4d95a0e1317e0fee0317f8b1e9f187"} Dec 02 07:43:57 crc kubenswrapper[4895]: I1202 07:43:57.189123 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-74rn2"] Dec 02 07:43:57 crc kubenswrapper[4895]: W1202 07:43:57.210776 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcaa608fc_52f9_426b_aca3_610fe5e245e0.slice/crio-b822290523b8198b00dfd70c4ef4bb7bf59eb4da444e58132c0bc284514729e3 WatchSource:0}: Error finding container b822290523b8198b00dfd70c4ef4bb7bf59eb4da444e58132c0bc284514729e3: Status 404 returned error can't find the container with id b822290523b8198b00dfd70c4ef4bb7bf59eb4da444e58132c0bc284514729e3 Dec 02 07:43:57 crc kubenswrapper[4895]: I1202 07:43:57.719624 4895 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-74rn2" event={"ID":"caa608fc-52f9-426b-aca3-610fe5e245e0","Type":"ContainerStarted","Data":"b822290523b8198b00dfd70c4ef4bb7bf59eb4da444e58132c0bc284514729e3"} Dec 02 07:43:58 crc kubenswrapper[4895]: I1202 07:43:58.138006 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-5583-account-create-update-vwhn6" Dec 02 07:43:58 crc kubenswrapper[4895]: I1202 07:43:58.228129 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-nbxdg" Dec 02 07:43:58 crc kubenswrapper[4895]: I1202 07:43:58.231582 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k9xjj\" (UniqueName: \"kubernetes.io/projected/9f38520b-350c-4c3c-9bd2-48bf3c492299-kube-api-access-k9xjj\") pod \"9f38520b-350c-4c3c-9bd2-48bf3c492299\" (UID: \"9f38520b-350c-4c3c-9bd2-48bf3c492299\") " Dec 02 07:43:58 crc kubenswrapper[4895]: I1202 07:43:58.231875 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9f38520b-350c-4c3c-9bd2-48bf3c492299-operator-scripts\") pod \"9f38520b-350c-4c3c-9bd2-48bf3c492299\" (UID: \"9f38520b-350c-4c3c-9bd2-48bf3c492299\") " Dec 02 07:43:58 crc kubenswrapper[4895]: I1202 07:43:58.233101 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f38520b-350c-4c3c-9bd2-48bf3c492299-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9f38520b-350c-4c3c-9bd2-48bf3c492299" (UID: "9f38520b-350c-4c3c-9bd2-48bf3c492299"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:43:58 crc kubenswrapper[4895]: I1202 07:43:58.240090 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f38520b-350c-4c3c-9bd2-48bf3c492299-kube-api-access-k9xjj" (OuterVolumeSpecName: "kube-api-access-k9xjj") pod "9f38520b-350c-4c3c-9bd2-48bf3c492299" (UID: "9f38520b-350c-4c3c-9bd2-48bf3c492299"). InnerVolumeSpecName "kube-api-access-k9xjj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:43:58 crc kubenswrapper[4895]: I1202 07:43:58.332963 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ad4346a5-f2f0-4809-b1d3-0c9a70b51cbd-operator-scripts\") pod \"ad4346a5-f2f0-4809-b1d3-0c9a70b51cbd\" (UID: \"ad4346a5-f2f0-4809-b1d3-0c9a70b51cbd\") " Dec 02 07:43:58 crc kubenswrapper[4895]: I1202 07:43:58.333157 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bwqj7\" (UniqueName: \"kubernetes.io/projected/ad4346a5-f2f0-4809-b1d3-0c9a70b51cbd-kube-api-access-bwqj7\") pod \"ad4346a5-f2f0-4809-b1d3-0c9a70b51cbd\" (UID: \"ad4346a5-f2f0-4809-b1d3-0c9a70b51cbd\") " Dec 02 07:43:58 crc kubenswrapper[4895]: I1202 07:43:58.333653 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k9xjj\" (UniqueName: \"kubernetes.io/projected/9f38520b-350c-4c3c-9bd2-48bf3c492299-kube-api-access-k9xjj\") on node \"crc\" DevicePath \"\"" Dec 02 07:43:58 crc kubenswrapper[4895]: I1202 07:43:58.333676 4895 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9f38520b-350c-4c3c-9bd2-48bf3c492299-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 07:43:58 crc kubenswrapper[4895]: I1202 07:43:58.333860 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/ad4346a5-f2f0-4809-b1d3-0c9a70b51cbd-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ad4346a5-f2f0-4809-b1d3-0c9a70b51cbd" (UID: "ad4346a5-f2f0-4809-b1d3-0c9a70b51cbd"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:43:58 crc kubenswrapper[4895]: I1202 07:43:58.337480 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad4346a5-f2f0-4809-b1d3-0c9a70b51cbd-kube-api-access-bwqj7" (OuterVolumeSpecName: "kube-api-access-bwqj7") pod "ad4346a5-f2f0-4809-b1d3-0c9a70b51cbd" (UID: "ad4346a5-f2f0-4809-b1d3-0c9a70b51cbd"). InnerVolumeSpecName "kube-api-access-bwqj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:43:58 crc kubenswrapper[4895]: I1202 07:43:58.437038 4895 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ad4346a5-f2f0-4809-b1d3-0c9a70b51cbd-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 07:43:58 crc kubenswrapper[4895]: I1202 07:43:58.437116 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bwqj7\" (UniqueName: \"kubernetes.io/projected/ad4346a5-f2f0-4809-b1d3-0c9a70b51cbd-kube-api-access-bwqj7\") on node \"crc\" DevicePath \"\"" Dec 02 07:43:58 crc kubenswrapper[4895]: I1202 07:43:58.744937 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-5583-account-create-update-vwhn6" Dec 02 07:43:58 crc kubenswrapper[4895]: I1202 07:43:58.747792 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-5583-account-create-update-vwhn6" event={"ID":"9f38520b-350c-4c3c-9bd2-48bf3c492299","Type":"ContainerDied","Data":"d6dc95012490c1debaccfe270a3867b6f5e767deb04bb1628c04d94817ab1ecc"} Dec 02 07:43:58 crc kubenswrapper[4895]: I1202 07:43:58.747912 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d6dc95012490c1debaccfe270a3867b6f5e767deb04bb1628c04d94817ab1ecc" Dec 02 07:43:58 crc kubenswrapper[4895]: I1202 07:43:58.750295 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-nbxdg" event={"ID":"ad4346a5-f2f0-4809-b1d3-0c9a70b51cbd","Type":"ContainerDied","Data":"930a34b0a7a6cb92ef83118765bd9350738b9ad8ab74be03686c5356f812e528"} Dec 02 07:43:58 crc kubenswrapper[4895]: I1202 07:43:58.750319 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="930a34b0a7a6cb92ef83118765bd9350738b9ad8ab74be03686c5356f812e528" Dec 02 07:43:58 crc kubenswrapper[4895]: I1202 07:43:58.750384 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-nbxdg" Dec 02 07:43:59 crc kubenswrapper[4895]: I1202 07:43:59.480897 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-vjgr8"] Dec 02 07:43:59 crc kubenswrapper[4895]: E1202 07:43:59.481683 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f38520b-350c-4c3c-9bd2-48bf3c492299" containerName="mariadb-account-create-update" Dec 02 07:43:59 crc kubenswrapper[4895]: I1202 07:43:59.481703 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f38520b-350c-4c3c-9bd2-48bf3c492299" containerName="mariadb-account-create-update" Dec 02 07:43:59 crc kubenswrapper[4895]: E1202 07:43:59.481731 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad4346a5-f2f0-4809-b1d3-0c9a70b51cbd" containerName="mariadb-database-create" Dec 02 07:43:59 crc kubenswrapper[4895]: I1202 07:43:59.481754 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad4346a5-f2f0-4809-b1d3-0c9a70b51cbd" containerName="mariadb-database-create" Dec 02 07:43:59 crc kubenswrapper[4895]: I1202 07:43:59.481955 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f38520b-350c-4c3c-9bd2-48bf3c492299" containerName="mariadb-account-create-update" Dec 02 07:43:59 crc kubenswrapper[4895]: I1202 07:43:59.481985 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad4346a5-f2f0-4809-b1d3-0c9a70b51cbd" containerName="mariadb-database-create" Dec 02 07:43:59 crc kubenswrapper[4895]: I1202 07:43:59.482798 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-vjgr8" Dec 02 07:43:59 crc kubenswrapper[4895]: I1202 07:43:59.486027 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-hwlmx" Dec 02 07:43:59 crc kubenswrapper[4895]: I1202 07:43:59.486150 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Dec 02 07:43:59 crc kubenswrapper[4895]: I1202 07:43:59.506230 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-vjgr8"] Dec 02 07:43:59 crc kubenswrapper[4895]: I1202 07:43:59.662528 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a756fe09-2c73-430d-be27-34caa885311c-db-sync-config-data\") pod \"glance-db-sync-vjgr8\" (UID: \"a756fe09-2c73-430d-be27-34caa885311c\") " pod="openstack/glance-db-sync-vjgr8" Dec 02 07:43:59 crc kubenswrapper[4895]: I1202 07:43:59.662590 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a756fe09-2c73-430d-be27-34caa885311c-combined-ca-bundle\") pod \"glance-db-sync-vjgr8\" (UID: \"a756fe09-2c73-430d-be27-34caa885311c\") " pod="openstack/glance-db-sync-vjgr8" Dec 02 07:43:59 crc kubenswrapper[4895]: I1202 07:43:59.662662 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgqxt\" (UniqueName: \"kubernetes.io/projected/a756fe09-2c73-430d-be27-34caa885311c-kube-api-access-zgqxt\") pod \"glance-db-sync-vjgr8\" (UID: \"a756fe09-2c73-430d-be27-34caa885311c\") " pod="openstack/glance-db-sync-vjgr8" Dec 02 07:43:59 crc kubenswrapper[4895]: I1202 07:43:59.662707 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/a756fe09-2c73-430d-be27-34caa885311c-config-data\") pod \"glance-db-sync-vjgr8\" (UID: \"a756fe09-2c73-430d-be27-34caa885311c\") " pod="openstack/glance-db-sync-vjgr8" Dec 02 07:43:59 crc kubenswrapper[4895]: I1202 07:43:59.764797 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zgqxt\" (UniqueName: \"kubernetes.io/projected/a756fe09-2c73-430d-be27-34caa885311c-kube-api-access-zgqxt\") pod \"glance-db-sync-vjgr8\" (UID: \"a756fe09-2c73-430d-be27-34caa885311c\") " pod="openstack/glance-db-sync-vjgr8" Dec 02 07:43:59 crc kubenswrapper[4895]: I1202 07:43:59.764867 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a756fe09-2c73-430d-be27-34caa885311c-config-data\") pod \"glance-db-sync-vjgr8\" (UID: \"a756fe09-2c73-430d-be27-34caa885311c\") " pod="openstack/glance-db-sync-vjgr8" Dec 02 07:43:59 crc kubenswrapper[4895]: I1202 07:43:59.764943 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a756fe09-2c73-430d-be27-34caa885311c-db-sync-config-data\") pod \"glance-db-sync-vjgr8\" (UID: \"a756fe09-2c73-430d-be27-34caa885311c\") " pod="openstack/glance-db-sync-vjgr8" Dec 02 07:43:59 crc kubenswrapper[4895]: I1202 07:43:59.764978 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a756fe09-2c73-430d-be27-34caa885311c-combined-ca-bundle\") pod \"glance-db-sync-vjgr8\" (UID: \"a756fe09-2c73-430d-be27-34caa885311c\") " pod="openstack/glance-db-sync-vjgr8" Dec 02 07:43:59 crc kubenswrapper[4895]: I1202 07:43:59.770878 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a756fe09-2c73-430d-be27-34caa885311c-db-sync-config-data\") pod \"glance-db-sync-vjgr8\" (UID: 
\"a756fe09-2c73-430d-be27-34caa885311c\") " pod="openstack/glance-db-sync-vjgr8" Dec 02 07:43:59 crc kubenswrapper[4895]: I1202 07:43:59.774222 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a756fe09-2c73-430d-be27-34caa885311c-config-data\") pod \"glance-db-sync-vjgr8\" (UID: \"a756fe09-2c73-430d-be27-34caa885311c\") " pod="openstack/glance-db-sync-vjgr8" Dec 02 07:43:59 crc kubenswrapper[4895]: I1202 07:43:59.774442 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a756fe09-2c73-430d-be27-34caa885311c-combined-ca-bundle\") pod \"glance-db-sync-vjgr8\" (UID: \"a756fe09-2c73-430d-be27-34caa885311c\") " pod="openstack/glance-db-sync-vjgr8" Dec 02 07:43:59 crc kubenswrapper[4895]: I1202 07:43:59.785410 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgqxt\" (UniqueName: \"kubernetes.io/projected/a756fe09-2c73-430d-be27-34caa885311c-kube-api-access-zgqxt\") pod \"glance-db-sync-vjgr8\" (UID: \"a756fe09-2c73-430d-be27-34caa885311c\") " pod="openstack/glance-db-sync-vjgr8" Dec 02 07:43:59 crc kubenswrapper[4895]: I1202 07:43:59.806545 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-vjgr8" Dec 02 07:43:59 crc kubenswrapper[4895]: I1202 07:43:59.823160 4895 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-ftfwq" podUID="84116ead-6214-4d5f-98a3-c89b08cf1306" containerName="ovn-controller" probeResult="failure" output=< Dec 02 07:43:59 crc kubenswrapper[4895]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Dec 02 07:43:59 crc kubenswrapper[4895]: > Dec 02 07:43:59 crc kubenswrapper[4895]: I1202 07:43:59.916804 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-9vczq" Dec 02 07:43:59 crc kubenswrapper[4895]: I1202 07:43:59.931708 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-9vczq" Dec 02 07:44:00 crc kubenswrapper[4895]: I1202 07:44:00.172880 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ftfwq-config-brsls"] Dec 02 07:44:00 crc kubenswrapper[4895]: I1202 07:44:00.174292 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ftfwq-config-brsls" Dec 02 07:44:00 crc kubenswrapper[4895]: I1202 07:44:00.177933 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Dec 02 07:44:00 crc kubenswrapper[4895]: I1202 07:44:00.183496 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ftfwq-config-brsls"] Dec 02 07:44:00 crc kubenswrapper[4895]: I1202 07:44:00.278485 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/95e1ff1e-d711-41e1-bff9-a78f36d115c4-var-log-ovn\") pod \"ovn-controller-ftfwq-config-brsls\" (UID: \"95e1ff1e-d711-41e1-bff9-a78f36d115c4\") " pod="openstack/ovn-controller-ftfwq-config-brsls" Dec 02 07:44:00 crc kubenswrapper[4895]: I1202 07:44:00.278558 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/95e1ff1e-d711-41e1-bff9-a78f36d115c4-var-run\") pod \"ovn-controller-ftfwq-config-brsls\" (UID: \"95e1ff1e-d711-41e1-bff9-a78f36d115c4\") " pod="openstack/ovn-controller-ftfwq-config-brsls" Dec 02 07:44:00 crc kubenswrapper[4895]: I1202 07:44:00.278614 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/95e1ff1e-d711-41e1-bff9-a78f36d115c4-scripts\") pod \"ovn-controller-ftfwq-config-brsls\" (UID: \"95e1ff1e-d711-41e1-bff9-a78f36d115c4\") " pod="openstack/ovn-controller-ftfwq-config-brsls" Dec 02 07:44:00 crc kubenswrapper[4895]: I1202 07:44:00.278656 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/11b8ece5-4192-4e13-a1c7-86ed3c627ddf-etc-swift\") pod \"swift-storage-0\" (UID: \"11b8ece5-4192-4e13-a1c7-86ed3c627ddf\") " pod="openstack/swift-storage-0" Dec 02 
07:44:00 crc kubenswrapper[4895]: I1202 07:44:00.278700 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/95e1ff1e-d711-41e1-bff9-a78f36d115c4-additional-scripts\") pod \"ovn-controller-ftfwq-config-brsls\" (UID: \"95e1ff1e-d711-41e1-bff9-a78f36d115c4\") " pod="openstack/ovn-controller-ftfwq-config-brsls" Dec 02 07:44:00 crc kubenswrapper[4895]: I1202 07:44:00.279812 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/95e1ff1e-d711-41e1-bff9-a78f36d115c4-var-run-ovn\") pod \"ovn-controller-ftfwq-config-brsls\" (UID: \"95e1ff1e-d711-41e1-bff9-a78f36d115c4\") " pod="openstack/ovn-controller-ftfwq-config-brsls" Dec 02 07:44:00 crc kubenswrapper[4895]: E1202 07:44:00.279836 4895 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 02 07:44:00 crc kubenswrapper[4895]: E1202 07:44:00.279869 4895 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 02 07:44:00 crc kubenswrapper[4895]: E1202 07:44:00.279931 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/11b8ece5-4192-4e13-a1c7-86ed3c627ddf-etc-swift podName:11b8ece5-4192-4e13-a1c7-86ed3c627ddf nodeName:}" failed. No retries permitted until 2025-12-02 07:44:08.279905675 +0000 UTC m=+1259.450765358 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/11b8ece5-4192-4e13-a1c7-86ed3c627ddf-etc-swift") pod "swift-storage-0" (UID: "11b8ece5-4192-4e13-a1c7-86ed3c627ddf") : configmap "swift-ring-files" not found Dec 02 07:44:00 crc kubenswrapper[4895]: I1202 07:44:00.279966 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhhkn\" (UniqueName: \"kubernetes.io/projected/95e1ff1e-d711-41e1-bff9-a78f36d115c4-kube-api-access-mhhkn\") pod \"ovn-controller-ftfwq-config-brsls\" (UID: \"95e1ff1e-d711-41e1-bff9-a78f36d115c4\") " pod="openstack/ovn-controller-ftfwq-config-brsls" Dec 02 07:44:00 crc kubenswrapper[4895]: I1202 07:44:00.348867 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-vjgr8"] Dec 02 07:44:00 crc kubenswrapper[4895]: I1202 07:44:00.382238 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/95e1ff1e-d711-41e1-bff9-a78f36d115c4-var-run-ovn\") pod \"ovn-controller-ftfwq-config-brsls\" (UID: \"95e1ff1e-d711-41e1-bff9-a78f36d115c4\") " pod="openstack/ovn-controller-ftfwq-config-brsls" Dec 02 07:44:00 crc kubenswrapper[4895]: I1202 07:44:00.382366 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhhkn\" (UniqueName: \"kubernetes.io/projected/95e1ff1e-d711-41e1-bff9-a78f36d115c4-kube-api-access-mhhkn\") pod \"ovn-controller-ftfwq-config-brsls\" (UID: \"95e1ff1e-d711-41e1-bff9-a78f36d115c4\") " pod="openstack/ovn-controller-ftfwq-config-brsls" Dec 02 07:44:00 crc kubenswrapper[4895]: I1202 07:44:00.382464 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/95e1ff1e-d711-41e1-bff9-a78f36d115c4-var-log-ovn\") pod \"ovn-controller-ftfwq-config-brsls\" (UID: \"95e1ff1e-d711-41e1-bff9-a78f36d115c4\") " 
pod="openstack/ovn-controller-ftfwq-config-brsls" Dec 02 07:44:00 crc kubenswrapper[4895]: I1202 07:44:00.382500 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/95e1ff1e-d711-41e1-bff9-a78f36d115c4-var-run\") pod \"ovn-controller-ftfwq-config-brsls\" (UID: \"95e1ff1e-d711-41e1-bff9-a78f36d115c4\") " pod="openstack/ovn-controller-ftfwq-config-brsls" Dec 02 07:44:00 crc kubenswrapper[4895]: I1202 07:44:00.382563 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/95e1ff1e-d711-41e1-bff9-a78f36d115c4-scripts\") pod \"ovn-controller-ftfwq-config-brsls\" (UID: \"95e1ff1e-d711-41e1-bff9-a78f36d115c4\") " pod="openstack/ovn-controller-ftfwq-config-brsls" Dec 02 07:44:00 crc kubenswrapper[4895]: I1202 07:44:00.382608 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/95e1ff1e-d711-41e1-bff9-a78f36d115c4-var-run-ovn\") pod \"ovn-controller-ftfwq-config-brsls\" (UID: \"95e1ff1e-d711-41e1-bff9-a78f36d115c4\") " pod="openstack/ovn-controller-ftfwq-config-brsls" Dec 02 07:44:00 crc kubenswrapper[4895]: I1202 07:44:00.382649 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/95e1ff1e-d711-41e1-bff9-a78f36d115c4-var-log-ovn\") pod \"ovn-controller-ftfwq-config-brsls\" (UID: \"95e1ff1e-d711-41e1-bff9-a78f36d115c4\") " pod="openstack/ovn-controller-ftfwq-config-brsls" Dec 02 07:44:00 crc kubenswrapper[4895]: I1202 07:44:00.382691 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/95e1ff1e-d711-41e1-bff9-a78f36d115c4-additional-scripts\") pod \"ovn-controller-ftfwq-config-brsls\" (UID: \"95e1ff1e-d711-41e1-bff9-a78f36d115c4\") " pod="openstack/ovn-controller-ftfwq-config-brsls" 
Dec 02 07:44:00 crc kubenswrapper[4895]: I1202 07:44:00.382673 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/95e1ff1e-d711-41e1-bff9-a78f36d115c4-var-run\") pod \"ovn-controller-ftfwq-config-brsls\" (UID: \"95e1ff1e-d711-41e1-bff9-a78f36d115c4\") " pod="openstack/ovn-controller-ftfwq-config-brsls" Dec 02 07:44:00 crc kubenswrapper[4895]: I1202 07:44:00.384138 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/95e1ff1e-d711-41e1-bff9-a78f36d115c4-additional-scripts\") pod \"ovn-controller-ftfwq-config-brsls\" (UID: \"95e1ff1e-d711-41e1-bff9-a78f36d115c4\") " pod="openstack/ovn-controller-ftfwq-config-brsls" Dec 02 07:44:00 crc kubenswrapper[4895]: I1202 07:44:00.386179 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/95e1ff1e-d711-41e1-bff9-a78f36d115c4-scripts\") pod \"ovn-controller-ftfwq-config-brsls\" (UID: \"95e1ff1e-d711-41e1-bff9-a78f36d115c4\") " pod="openstack/ovn-controller-ftfwq-config-brsls" Dec 02 07:44:00 crc kubenswrapper[4895]: I1202 07:44:00.419837 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhhkn\" (UniqueName: \"kubernetes.io/projected/95e1ff1e-d711-41e1-bff9-a78f36d115c4-kube-api-access-mhhkn\") pod \"ovn-controller-ftfwq-config-brsls\" (UID: \"95e1ff1e-d711-41e1-bff9-a78f36d115c4\") " pod="openstack/ovn-controller-ftfwq-config-brsls" Dec 02 07:44:00 crc kubenswrapper[4895]: I1202 07:44:00.505704 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ftfwq-config-brsls" Dec 02 07:44:01 crc kubenswrapper[4895]: I1202 07:44:01.481979 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-698758b865-lsqg6" Dec 02 07:44:01 crc kubenswrapper[4895]: I1202 07:44:01.538626 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-wlqps"] Dec 02 07:44:01 crc kubenswrapper[4895]: I1202 07:44:01.539561 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86db49b7ff-wlqps" podUID="b37be9e7-783d-4bd7-a9fc-41049311cab8" containerName="dnsmasq-dns" containerID="cri-o://5ef88bfacba00b39e539f67d36ac63bdb6060f919e164fdac86607561083daf4" gracePeriod=10 Dec 02 07:44:01 crc kubenswrapper[4895]: I1202 07:44:01.825439 4895 generic.go:334] "Generic (PLEG): container finished" podID="b37be9e7-783d-4bd7-a9fc-41049311cab8" containerID="5ef88bfacba00b39e539f67d36ac63bdb6060f919e164fdac86607561083daf4" exitCode=0 Dec 02 07:44:01 crc kubenswrapper[4895]: I1202 07:44:01.825531 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-wlqps" event={"ID":"b37be9e7-783d-4bd7-a9fc-41049311cab8","Type":"ContainerDied","Data":"5ef88bfacba00b39e539f67d36ac63bdb6060f919e164fdac86607561083daf4"} Dec 02 07:44:02 crc kubenswrapper[4895]: I1202 07:44:02.715061 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-wlqps" Dec 02 07:44:02 crc kubenswrapper[4895]: I1202 07:44:02.835131 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-74rn2" event={"ID":"caa608fc-52f9-426b-aca3-610fe5e245e0","Type":"ContainerStarted","Data":"7403a5b9ce852233942b59a898f050425071a9defb4aef479e474d871e9de273"} Dec 02 07:44:02 crc kubenswrapper[4895]: I1202 07:44:02.837551 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-wlqps" Dec 02 07:44:02 crc kubenswrapper[4895]: I1202 07:44:02.837968 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-wlqps" event={"ID":"b37be9e7-783d-4bd7-a9fc-41049311cab8","Type":"ContainerDied","Data":"4093ffd564c25627cdd93db6f07a5a23db54120b07ad150be9d272b8596f3dad"} Dec 02 07:44:02 crc kubenswrapper[4895]: I1202 07:44:02.838022 4895 scope.go:117] "RemoveContainer" containerID="5ef88bfacba00b39e539f67d36ac63bdb6060f919e164fdac86607561083daf4" Dec 02 07:44:02 crc kubenswrapper[4895]: I1202 07:44:02.838934 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b37be9e7-783d-4bd7-a9fc-41049311cab8-config\") pod \"b37be9e7-783d-4bd7-a9fc-41049311cab8\" (UID: \"b37be9e7-783d-4bd7-a9fc-41049311cab8\") " Dec 02 07:44:02 crc kubenswrapper[4895]: I1202 07:44:02.839180 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b37be9e7-783d-4bd7-a9fc-41049311cab8-ovsdbserver-sb\") pod \"b37be9e7-783d-4bd7-a9fc-41049311cab8\" (UID: \"b37be9e7-783d-4bd7-a9fc-41049311cab8\") " Dec 02 07:44:02 crc kubenswrapper[4895]: I1202 07:44:02.839340 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b37be9e7-783d-4bd7-a9fc-41049311cab8-ovsdbserver-nb\") pod \"b37be9e7-783d-4bd7-a9fc-41049311cab8\" (UID: \"b37be9e7-783d-4bd7-a9fc-41049311cab8\") " Dec 02 07:44:02 crc kubenswrapper[4895]: I1202 07:44:02.839444 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fb8zf\" (UniqueName: \"kubernetes.io/projected/b37be9e7-783d-4bd7-a9fc-41049311cab8-kube-api-access-fb8zf\") pod \"b37be9e7-783d-4bd7-a9fc-41049311cab8\" (UID: \"b37be9e7-783d-4bd7-a9fc-41049311cab8\") " Dec 02 07:44:02 crc 
kubenswrapper[4895]: I1202 07:44:02.839493 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b37be9e7-783d-4bd7-a9fc-41049311cab8-dns-svc\") pod \"b37be9e7-783d-4bd7-a9fc-41049311cab8\" (UID: \"b37be9e7-783d-4bd7-a9fc-41049311cab8\") " Dec 02 07:44:02 crc kubenswrapper[4895]: I1202 07:44:02.842400 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-vjgr8" event={"ID":"a756fe09-2c73-430d-be27-34caa885311c","Type":"ContainerStarted","Data":"efe2dec243970af3fd6db4dc8eb44ed2f10c21509a6e77e373baa3a86b918f46"} Dec 02 07:44:02 crc kubenswrapper[4895]: I1202 07:44:02.844544 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b37be9e7-783d-4bd7-a9fc-41049311cab8-kube-api-access-fb8zf" (OuterVolumeSpecName: "kube-api-access-fb8zf") pod "b37be9e7-783d-4bd7-a9fc-41049311cab8" (UID: "b37be9e7-783d-4bd7-a9fc-41049311cab8"). InnerVolumeSpecName "kube-api-access-fb8zf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:44:02 crc kubenswrapper[4895]: I1202 07:44:02.858953 4895 scope.go:117] "RemoveContainer" containerID="9e17a81ab311b0c42a6251dd35556fa133111b3838b913a3426967b030929e27" Dec 02 07:44:02 crc kubenswrapper[4895]: I1202 07:44:02.866800 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-74rn2" podStartSLOduration=1.5287310729999999 podStartE2EDuration="6.866777641s" podCreationTimestamp="2025-12-02 07:43:56 +0000 UTC" firstStartedPulling="2025-12-02 07:43:57.214852373 +0000 UTC m=+1248.385711986" lastFinishedPulling="2025-12-02 07:44:02.552898941 +0000 UTC m=+1253.723758554" observedRunningTime="2025-12-02 07:44:02.85607151 +0000 UTC m=+1254.026931153" watchObservedRunningTime="2025-12-02 07:44:02.866777641 +0000 UTC m=+1254.037637264" Dec 02 07:44:02 crc kubenswrapper[4895]: I1202 07:44:02.884109 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b37be9e7-783d-4bd7-a9fc-41049311cab8-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b37be9e7-783d-4bd7-a9fc-41049311cab8" (UID: "b37be9e7-783d-4bd7-a9fc-41049311cab8"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:44:02 crc kubenswrapper[4895]: I1202 07:44:02.885102 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b37be9e7-783d-4bd7-a9fc-41049311cab8-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b37be9e7-783d-4bd7-a9fc-41049311cab8" (UID: "b37be9e7-783d-4bd7-a9fc-41049311cab8"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:44:02 crc kubenswrapper[4895]: I1202 07:44:02.886633 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b37be9e7-783d-4bd7-a9fc-41049311cab8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b37be9e7-783d-4bd7-a9fc-41049311cab8" (UID: "b37be9e7-783d-4bd7-a9fc-41049311cab8"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:44:02 crc kubenswrapper[4895]: I1202 07:44:02.891329 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b37be9e7-783d-4bd7-a9fc-41049311cab8-config" (OuterVolumeSpecName: "config") pod "b37be9e7-783d-4bd7-a9fc-41049311cab8" (UID: "b37be9e7-783d-4bd7-a9fc-41049311cab8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:44:02 crc kubenswrapper[4895]: I1202 07:44:02.920950 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ftfwq-config-brsls"] Dec 02 07:44:02 crc kubenswrapper[4895]: I1202 07:44:02.942097 4895 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b37be9e7-783d-4bd7-a9fc-41049311cab8-config\") on node \"crc\" DevicePath \"\"" Dec 02 07:44:02 crc kubenswrapper[4895]: I1202 07:44:02.942134 4895 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b37be9e7-783d-4bd7-a9fc-41049311cab8-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 02 07:44:02 crc kubenswrapper[4895]: I1202 07:44:02.942145 4895 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b37be9e7-783d-4bd7-a9fc-41049311cab8-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 02 07:44:02 crc kubenswrapper[4895]: I1202 07:44:02.942179 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fb8zf\" 
(UniqueName: \"kubernetes.io/projected/b37be9e7-783d-4bd7-a9fc-41049311cab8-kube-api-access-fb8zf\") on node \"crc\" DevicePath \"\"" Dec 02 07:44:02 crc kubenswrapper[4895]: I1202 07:44:02.942202 4895 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b37be9e7-783d-4bd7-a9fc-41049311cab8-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 02 07:44:03 crc kubenswrapper[4895]: I1202 07:44:03.178361 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-wlqps"] Dec 02 07:44:03 crc kubenswrapper[4895]: I1202 07:44:03.186176 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-wlqps"] Dec 02 07:44:03 crc kubenswrapper[4895]: I1202 07:44:03.872403 4895 generic.go:334] "Generic (PLEG): container finished" podID="95e1ff1e-d711-41e1-bff9-a78f36d115c4" containerID="699083b59cc3e89c8cdcea80a7f38a966d522c9c52625be50b3fa816e59f7830" exitCode=0 Dec 02 07:44:03 crc kubenswrapper[4895]: I1202 07:44:03.872591 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ftfwq-config-brsls" event={"ID":"95e1ff1e-d711-41e1-bff9-a78f36d115c4","Type":"ContainerDied","Data":"699083b59cc3e89c8cdcea80a7f38a966d522c9c52625be50b3fa816e59f7830"} Dec 02 07:44:03 crc kubenswrapper[4895]: I1202 07:44:03.872803 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ftfwq-config-brsls" event={"ID":"95e1ff1e-d711-41e1-bff9-a78f36d115c4","Type":"ContainerStarted","Data":"459c84e5d3946ad90f2d6f6d2bc2ca45161ffe14fc24ebe465ddac525e0ef921"} Dec 02 07:44:04 crc kubenswrapper[4895]: I1202 07:44:04.806913 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ftfwq" Dec 02 07:44:05 crc kubenswrapper[4895]: I1202 07:44:05.174860 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b37be9e7-783d-4bd7-a9fc-41049311cab8" 
path="/var/lib/kubelet/pods/b37be9e7-783d-4bd7-a9fc-41049311cab8/volumes" Dec 02 07:44:05 crc kubenswrapper[4895]: I1202 07:44:05.303392 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ftfwq-config-brsls" Dec 02 07:44:05 crc kubenswrapper[4895]: I1202 07:44:05.400406 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/95e1ff1e-d711-41e1-bff9-a78f36d115c4-var-run-ovn\") pod \"95e1ff1e-d711-41e1-bff9-a78f36d115c4\" (UID: \"95e1ff1e-d711-41e1-bff9-a78f36d115c4\") " Dec 02 07:44:05 crc kubenswrapper[4895]: I1202 07:44:05.400504 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/95e1ff1e-d711-41e1-bff9-a78f36d115c4-scripts\") pod \"95e1ff1e-d711-41e1-bff9-a78f36d115c4\" (UID: \"95e1ff1e-d711-41e1-bff9-a78f36d115c4\") " Dec 02 07:44:05 crc kubenswrapper[4895]: I1202 07:44:05.400560 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mhhkn\" (UniqueName: \"kubernetes.io/projected/95e1ff1e-d711-41e1-bff9-a78f36d115c4-kube-api-access-mhhkn\") pod \"95e1ff1e-d711-41e1-bff9-a78f36d115c4\" (UID: \"95e1ff1e-d711-41e1-bff9-a78f36d115c4\") " Dec 02 07:44:05 crc kubenswrapper[4895]: I1202 07:44:05.400552 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/95e1ff1e-d711-41e1-bff9-a78f36d115c4-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "95e1ff1e-d711-41e1-bff9-a78f36d115c4" (UID: "95e1ff1e-d711-41e1-bff9-a78f36d115c4"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 07:44:05 crc kubenswrapper[4895]: I1202 07:44:05.400606 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/95e1ff1e-d711-41e1-bff9-a78f36d115c4-additional-scripts\") pod \"95e1ff1e-d711-41e1-bff9-a78f36d115c4\" (UID: \"95e1ff1e-d711-41e1-bff9-a78f36d115c4\") " Dec 02 07:44:05 crc kubenswrapper[4895]: I1202 07:44:05.400696 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/95e1ff1e-d711-41e1-bff9-a78f36d115c4-var-run\") pod \"95e1ff1e-d711-41e1-bff9-a78f36d115c4\" (UID: \"95e1ff1e-d711-41e1-bff9-a78f36d115c4\") " Dec 02 07:44:05 crc kubenswrapper[4895]: I1202 07:44:05.400727 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/95e1ff1e-d711-41e1-bff9-a78f36d115c4-var-log-ovn\") pod \"95e1ff1e-d711-41e1-bff9-a78f36d115c4\" (UID: \"95e1ff1e-d711-41e1-bff9-a78f36d115c4\") " Dec 02 07:44:05 crc kubenswrapper[4895]: I1202 07:44:05.401048 4895 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/95e1ff1e-d711-41e1-bff9-a78f36d115c4-var-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 02 07:44:05 crc kubenswrapper[4895]: I1202 07:44:05.401088 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/95e1ff1e-d711-41e1-bff9-a78f36d115c4-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "95e1ff1e-d711-41e1-bff9-a78f36d115c4" (UID: "95e1ff1e-d711-41e1-bff9-a78f36d115c4"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 07:44:05 crc kubenswrapper[4895]: I1202 07:44:05.401123 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/95e1ff1e-d711-41e1-bff9-a78f36d115c4-var-run" (OuterVolumeSpecName: "var-run") pod "95e1ff1e-d711-41e1-bff9-a78f36d115c4" (UID: "95e1ff1e-d711-41e1-bff9-a78f36d115c4"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 07:44:05 crc kubenswrapper[4895]: I1202 07:44:05.401952 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/95e1ff1e-d711-41e1-bff9-a78f36d115c4-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "95e1ff1e-d711-41e1-bff9-a78f36d115c4" (UID: "95e1ff1e-d711-41e1-bff9-a78f36d115c4"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:44:05 crc kubenswrapper[4895]: I1202 07:44:05.402048 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/95e1ff1e-d711-41e1-bff9-a78f36d115c4-scripts" (OuterVolumeSpecName: "scripts") pod "95e1ff1e-d711-41e1-bff9-a78f36d115c4" (UID: "95e1ff1e-d711-41e1-bff9-a78f36d115c4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:44:05 crc kubenswrapper[4895]: I1202 07:44:05.416061 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95e1ff1e-d711-41e1-bff9-a78f36d115c4-kube-api-access-mhhkn" (OuterVolumeSpecName: "kube-api-access-mhhkn") pod "95e1ff1e-d711-41e1-bff9-a78f36d115c4" (UID: "95e1ff1e-d711-41e1-bff9-a78f36d115c4"). InnerVolumeSpecName "kube-api-access-mhhkn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:44:05 crc kubenswrapper[4895]: I1202 07:44:05.503468 4895 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/95e1ff1e-d711-41e1-bff9-a78f36d115c4-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 07:44:05 crc kubenswrapper[4895]: I1202 07:44:05.503504 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mhhkn\" (UniqueName: \"kubernetes.io/projected/95e1ff1e-d711-41e1-bff9-a78f36d115c4-kube-api-access-mhhkn\") on node \"crc\" DevicePath \"\"" Dec 02 07:44:05 crc kubenswrapper[4895]: I1202 07:44:05.503520 4895 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/95e1ff1e-d711-41e1-bff9-a78f36d115c4-additional-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 07:44:05 crc kubenswrapper[4895]: I1202 07:44:05.503530 4895 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/95e1ff1e-d711-41e1-bff9-a78f36d115c4-var-run\") on node \"crc\" DevicePath \"\"" Dec 02 07:44:05 crc kubenswrapper[4895]: I1202 07:44:05.503541 4895 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/95e1ff1e-d711-41e1-bff9-a78f36d115c4-var-log-ovn\") on node \"crc\" DevicePath \"\"" Dec 02 07:44:05 crc kubenswrapper[4895]: I1202 07:44:05.896867 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ftfwq-config-brsls" event={"ID":"95e1ff1e-d711-41e1-bff9-a78f36d115c4","Type":"ContainerDied","Data":"459c84e5d3946ad90f2d6f6d2bc2ca45161ffe14fc24ebe465ddac525e0ef921"} Dec 02 07:44:05 crc kubenswrapper[4895]: I1202 07:44:05.896931 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="459c84e5d3946ad90f2d6f6d2bc2ca45161ffe14fc24ebe465ddac525e0ef921" Dec 02 07:44:05 crc kubenswrapper[4895]: I1202 07:44:05.896949 4895 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ftfwq-config-brsls" Dec 02 07:44:06 crc kubenswrapper[4895]: I1202 07:44:06.412872 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ftfwq-config-brsls"] Dec 02 07:44:06 crc kubenswrapper[4895]: I1202 07:44:06.422171 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-ftfwq-config-brsls"] Dec 02 07:44:06 crc kubenswrapper[4895]: I1202 07:44:06.910017 4895 generic.go:334] "Generic (PLEG): container finished" podID="0d1cb194-5325-40c2-bbd4-0a48821e12aa" containerID="39340c4fd973c571bd458064ab8a8ad372022cf6e584357ba7e6b31eaf6221a0" exitCode=0 Dec 02 07:44:06 crc kubenswrapper[4895]: I1202 07:44:06.910065 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"0d1cb194-5325-40c2-bbd4-0a48821e12aa","Type":"ContainerDied","Data":"39340c4fd973c571bd458064ab8a8ad372022cf6e584357ba7e6b31eaf6221a0"} Dec 02 07:44:07 crc kubenswrapper[4895]: I1202 07:44:07.155005 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95e1ff1e-d711-41e1-bff9-a78f36d115c4" path="/var/lib/kubelet/pods/95e1ff1e-d711-41e1-bff9-a78f36d115c4/volumes" Dec 02 07:44:07 crc kubenswrapper[4895]: I1202 07:44:07.986469 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"0d1cb194-5325-40c2-bbd4-0a48821e12aa","Type":"ContainerStarted","Data":"825f000e90e467b37377e382a45ce9ec58ad6ced7e5a761f9a5ac0cc1b0ded3d"} Dec 02 07:44:07 crc kubenswrapper[4895]: I1202 07:44:07.986819 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Dec 02 07:44:08 crc kubenswrapper[4895]: I1202 07:44:08.021993 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.540559278 podStartE2EDuration="1m24.02196968s" 
podCreationTimestamp="2025-12-02 07:42:44 +0000 UTC" firstStartedPulling="2025-12-02 07:42:46.412923952 +0000 UTC m=+1177.583783565" lastFinishedPulling="2025-12-02 07:43:32.894334354 +0000 UTC m=+1224.065193967" observedRunningTime="2025-12-02 07:44:08.016926234 +0000 UTC m=+1259.187785857" watchObservedRunningTime="2025-12-02 07:44:08.02196968 +0000 UTC m=+1259.192829293" Dec 02 07:44:08 crc kubenswrapper[4895]: I1202 07:44:08.379398 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/11b8ece5-4192-4e13-a1c7-86ed3c627ddf-etc-swift\") pod \"swift-storage-0\" (UID: \"11b8ece5-4192-4e13-a1c7-86ed3c627ddf\") " pod="openstack/swift-storage-0" Dec 02 07:44:08 crc kubenswrapper[4895]: E1202 07:44:08.379632 4895 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 02 07:44:08 crc kubenswrapper[4895]: E1202 07:44:08.379896 4895 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 02 07:44:08 crc kubenswrapper[4895]: E1202 07:44:08.379972 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/11b8ece5-4192-4e13-a1c7-86ed3c627ddf-etc-swift podName:11b8ece5-4192-4e13-a1c7-86ed3c627ddf nodeName:}" failed. No retries permitted until 2025-12-02 07:44:24.379949899 +0000 UTC m=+1275.550809512 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/11b8ece5-4192-4e13-a1c7-86ed3c627ddf-etc-swift") pod "swift-storage-0" (UID: "11b8ece5-4192-4e13-a1c7-86ed3c627ddf") : configmap "swift-ring-files" not found Dec 02 07:44:10 crc kubenswrapper[4895]: I1202 07:44:10.012640 4895 generic.go:334] "Generic (PLEG): container finished" podID="ca98cba7-4127-4d25-a139-1a42224331f2" containerID="dae6ee95ef6df69cc075b37be3c7109ee4cab3f60c4bdf61b7793f530ffc9ab5" exitCode=0 Dec 02 07:44:10 crc kubenswrapper[4895]: I1202 07:44:10.012700 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"ca98cba7-4127-4d25-a139-1a42224331f2","Type":"ContainerDied","Data":"dae6ee95ef6df69cc075b37be3c7109ee4cab3f60c4bdf61b7793f530ffc9ab5"} Dec 02 07:44:13 crc kubenswrapper[4895]: I1202 07:44:13.420621 4895 generic.go:334] "Generic (PLEG): container finished" podID="caa608fc-52f9-426b-aca3-610fe5e245e0" containerID="7403a5b9ce852233942b59a898f050425071a9defb4aef479e474d871e9de273" exitCode=0 Dec 02 07:44:13 crc kubenswrapper[4895]: I1202 07:44:13.421400 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-74rn2" event={"ID":"caa608fc-52f9-426b-aca3-610fe5e245e0","Type":"ContainerDied","Data":"7403a5b9ce852233942b59a898f050425071a9defb4aef479e474d871e9de273"} Dec 02 07:44:22 crc kubenswrapper[4895]: E1202 07:44:22.344576 4895 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-glance-api:current-podified" Dec 02 07:44:22 crc kubenswrapper[4895]: E1202 07:44:22.345586 4895 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:glance-db-sync,Image:quay.io/podified-antelope-centos9/openstack-glance-api:current-podified,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/glance/glance.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zgqxt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42415,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42415,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-db-sync-vjgr8_openstack(a756fe09-2c73-430d-be27-34caa885311c): ErrImagePull: rpc error: code = Canceled desc = 
copying config: context canceled" logger="UnhandledError" Dec 02 07:44:22 crc kubenswrapper[4895]: E1202 07:44:22.347391 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/glance-db-sync-vjgr8" podUID="a756fe09-2c73-430d-be27-34caa885311c" Dec 02 07:44:22 crc kubenswrapper[4895]: I1202 07:44:22.425318 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-74rn2" Dec 02 07:44:22 crc kubenswrapper[4895]: I1202 07:44:22.539898 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rt66t\" (UniqueName: \"kubernetes.io/projected/caa608fc-52f9-426b-aca3-610fe5e245e0-kube-api-access-rt66t\") pod \"caa608fc-52f9-426b-aca3-610fe5e245e0\" (UID: \"caa608fc-52f9-426b-aca3-610fe5e245e0\") " Dec 02 07:44:22 crc kubenswrapper[4895]: I1202 07:44:22.539954 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/caa608fc-52f9-426b-aca3-610fe5e245e0-ring-data-devices\") pod \"caa608fc-52f9-426b-aca3-610fe5e245e0\" (UID: \"caa608fc-52f9-426b-aca3-610fe5e245e0\") " Dec 02 07:44:22 crc kubenswrapper[4895]: I1202 07:44:22.540135 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/caa608fc-52f9-426b-aca3-610fe5e245e0-scripts\") pod \"caa608fc-52f9-426b-aca3-610fe5e245e0\" (UID: \"caa608fc-52f9-426b-aca3-610fe5e245e0\") " Dec 02 07:44:22 crc kubenswrapper[4895]: I1202 07:44:22.540254 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/caa608fc-52f9-426b-aca3-610fe5e245e0-combined-ca-bundle\") pod \"caa608fc-52f9-426b-aca3-610fe5e245e0\" (UID: 
\"caa608fc-52f9-426b-aca3-610fe5e245e0\") " Dec 02 07:44:22 crc kubenswrapper[4895]: I1202 07:44:22.540317 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/caa608fc-52f9-426b-aca3-610fe5e245e0-dispersionconf\") pod \"caa608fc-52f9-426b-aca3-610fe5e245e0\" (UID: \"caa608fc-52f9-426b-aca3-610fe5e245e0\") " Dec 02 07:44:22 crc kubenswrapper[4895]: I1202 07:44:22.540373 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/caa608fc-52f9-426b-aca3-610fe5e245e0-etc-swift\") pod \"caa608fc-52f9-426b-aca3-610fe5e245e0\" (UID: \"caa608fc-52f9-426b-aca3-610fe5e245e0\") " Dec 02 07:44:22 crc kubenswrapper[4895]: I1202 07:44:22.540433 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/caa608fc-52f9-426b-aca3-610fe5e245e0-swiftconf\") pod \"caa608fc-52f9-426b-aca3-610fe5e245e0\" (UID: \"caa608fc-52f9-426b-aca3-610fe5e245e0\") " Dec 02 07:44:22 crc kubenswrapper[4895]: I1202 07:44:22.541194 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/caa608fc-52f9-426b-aca3-610fe5e245e0-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "caa608fc-52f9-426b-aca3-610fe5e245e0" (UID: "caa608fc-52f9-426b-aca3-610fe5e245e0"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:44:22 crc kubenswrapper[4895]: I1202 07:44:22.541837 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/caa608fc-52f9-426b-aca3-610fe5e245e0-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "caa608fc-52f9-426b-aca3-610fe5e245e0" (UID: "caa608fc-52f9-426b-aca3-610fe5e245e0"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:44:22 crc kubenswrapper[4895]: I1202 07:44:22.549614 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-74rn2" event={"ID":"caa608fc-52f9-426b-aca3-610fe5e245e0","Type":"ContainerDied","Data":"b822290523b8198b00dfd70c4ef4bb7bf59eb4da444e58132c0bc284514729e3"} Dec 02 07:44:22 crc kubenswrapper[4895]: I1202 07:44:22.549684 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b822290523b8198b00dfd70c4ef4bb7bf59eb4da444e58132c0bc284514729e3" Dec 02 07:44:22 crc kubenswrapper[4895]: I1202 07:44:22.549702 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-74rn2" Dec 02 07:44:22 crc kubenswrapper[4895]: I1202 07:44:22.556729 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/caa608fc-52f9-426b-aca3-610fe5e245e0-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "caa608fc-52f9-426b-aca3-610fe5e245e0" (UID: "caa608fc-52f9-426b-aca3-610fe5e245e0"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:44:22 crc kubenswrapper[4895]: E1202 07:44:22.561714 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-glance-api:current-podified\\\"\"" pod="openstack/glance-db-sync-vjgr8" podUID="a756fe09-2c73-430d-be27-34caa885311c" Dec 02 07:44:22 crc kubenswrapper[4895]: I1202 07:44:22.564280 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/caa608fc-52f9-426b-aca3-610fe5e245e0-kube-api-access-rt66t" (OuterVolumeSpecName: "kube-api-access-rt66t") pod "caa608fc-52f9-426b-aca3-610fe5e245e0" (UID: "caa608fc-52f9-426b-aca3-610fe5e245e0"). 
InnerVolumeSpecName "kube-api-access-rt66t". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:44:22 crc kubenswrapper[4895]: I1202 07:44:22.572119 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/caa608fc-52f9-426b-aca3-610fe5e245e0-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "caa608fc-52f9-426b-aca3-610fe5e245e0" (UID: "caa608fc-52f9-426b-aca3-610fe5e245e0"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:44:22 crc kubenswrapper[4895]: I1202 07:44:22.584127 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/caa608fc-52f9-426b-aca3-610fe5e245e0-scripts" (OuterVolumeSpecName: "scripts") pod "caa608fc-52f9-426b-aca3-610fe5e245e0" (UID: "caa608fc-52f9-426b-aca3-610fe5e245e0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:44:22 crc kubenswrapper[4895]: I1202 07:44:22.584588 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/caa608fc-52f9-426b-aca3-610fe5e245e0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "caa608fc-52f9-426b-aca3-610fe5e245e0" (UID: "caa608fc-52f9-426b-aca3-610fe5e245e0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:44:22 crc kubenswrapper[4895]: I1202 07:44:22.643528 4895 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/caa608fc-52f9-426b-aca3-610fe5e245e0-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 07:44:22 crc kubenswrapper[4895]: I1202 07:44:22.643578 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/caa608fc-52f9-426b-aca3-610fe5e245e0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 07:44:22 crc kubenswrapper[4895]: I1202 07:44:22.643596 4895 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/caa608fc-52f9-426b-aca3-610fe5e245e0-dispersionconf\") on node \"crc\" DevicePath \"\"" Dec 02 07:44:22 crc kubenswrapper[4895]: I1202 07:44:22.643610 4895 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/caa608fc-52f9-426b-aca3-610fe5e245e0-etc-swift\") on node \"crc\" DevicePath \"\"" Dec 02 07:44:22 crc kubenswrapper[4895]: I1202 07:44:22.643622 4895 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/caa608fc-52f9-426b-aca3-610fe5e245e0-swiftconf\") on node \"crc\" DevicePath \"\"" Dec 02 07:44:22 crc kubenswrapper[4895]: I1202 07:44:22.643638 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rt66t\" (UniqueName: \"kubernetes.io/projected/caa608fc-52f9-426b-aca3-610fe5e245e0-kube-api-access-rt66t\") on node \"crc\" DevicePath \"\"" Dec 02 07:44:22 crc kubenswrapper[4895]: I1202 07:44:22.643650 4895 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/caa608fc-52f9-426b-aca3-610fe5e245e0-ring-data-devices\") on node \"crc\" DevicePath \"\"" Dec 02 07:44:23 crc kubenswrapper[4895]: I1202 07:44:23.564993 4895 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"ca98cba7-4127-4d25-a139-1a42224331f2","Type":"ContainerStarted","Data":"5d044ff799057808b8d67f79590923f9bd83b515bcd050be4b95fa7aeeb31f38"} Dec 02 07:44:23 crc kubenswrapper[4895]: I1202 07:44:23.565641 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Dec 02 07:44:23 crc kubenswrapper[4895]: I1202 07:44:23.625993 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=-9223371937.228828 podStartE2EDuration="1m39.625946705s" podCreationTimestamp="2025-12-02 07:42:44 +0000 UTC" firstStartedPulling="2025-12-02 07:42:46.103541761 +0000 UTC m=+1177.274401374" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:44:23.593266227 +0000 UTC m=+1274.764125850" watchObservedRunningTime="2025-12-02 07:44:23.625946705 +0000 UTC m=+1274.796806358" Dec 02 07:44:24 crc kubenswrapper[4895]: I1202 07:44:24.481072 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/11b8ece5-4192-4e13-a1c7-86ed3c627ddf-etc-swift\") pod \"swift-storage-0\" (UID: \"11b8ece5-4192-4e13-a1c7-86ed3c627ddf\") " pod="openstack/swift-storage-0" Dec 02 07:44:24 crc kubenswrapper[4895]: I1202 07:44:24.488310 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/11b8ece5-4192-4e13-a1c7-86ed3c627ddf-etc-swift\") pod \"swift-storage-0\" (UID: \"11b8ece5-4192-4e13-a1c7-86ed3c627ddf\") " pod="openstack/swift-storage-0" Dec 02 07:44:24 crc kubenswrapper[4895]: I1202 07:44:24.660727 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Dec 02 07:44:25 crc kubenswrapper[4895]: I1202 07:44:25.277648 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Dec 02 07:44:25 crc kubenswrapper[4895]: I1202 07:44:25.588592 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"11b8ece5-4192-4e13-a1c7-86ed3c627ddf","Type":"ContainerStarted","Data":"cfa04424dbc0599f02e0955508bdf471dbc21a51487954c16d2f02e8491eeb11"} Dec 02 07:44:25 crc kubenswrapper[4895]: I1202 07:44:25.753043 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Dec 02 07:44:27 crc kubenswrapper[4895]: I1202 07:44:27.607293 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"11b8ece5-4192-4e13-a1c7-86ed3c627ddf","Type":"ContainerStarted","Data":"8d7d30533f5cf82d2d0c96a4a07759e65bd3a99d6c9ea5aff2ebef3f2b8c14c4"} Dec 02 07:44:28 crc kubenswrapper[4895]: I1202 07:44:28.619622 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"11b8ece5-4192-4e13-a1c7-86ed3c627ddf","Type":"ContainerStarted","Data":"f6567d126c5ab9260bebb4b4a822d71e559ec543cfdf3fa7202150e3115569cc"} Dec 02 07:44:28 crc kubenswrapper[4895]: I1202 07:44:28.620006 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"11b8ece5-4192-4e13-a1c7-86ed3c627ddf","Type":"ContainerStarted","Data":"0b8c5691b63ae4c345789ff614121edd0fbac8d28ec4dd714cbad15af4ead78a"} Dec 02 07:44:28 crc kubenswrapper[4895]: I1202 07:44:28.620019 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"11b8ece5-4192-4e13-a1c7-86ed3c627ddf","Type":"ContainerStarted","Data":"0b782ab48a476bfcb22366e4a8e52dc20222254cc4ec4ca87a05e85213f1e6e8"} Dec 02 07:44:30 crc kubenswrapper[4895]: I1202 07:44:30.672772 4895 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/swift-storage-0" event={"ID":"11b8ece5-4192-4e13-a1c7-86ed3c627ddf","Type":"ContainerStarted","Data":"089c0b3d1c4ddc2fa892f504974b05872110b8d4c58cb70cafbaa74e93b8f452"} Dec 02 07:44:30 crc kubenswrapper[4895]: I1202 07:44:30.675166 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"11b8ece5-4192-4e13-a1c7-86ed3c627ddf","Type":"ContainerStarted","Data":"fc2eeaa58100482b1a1ad56b93b5adeb32cce704c0f70987468501c900cb3962"} Dec 02 07:44:31 crc kubenswrapper[4895]: I1202 07:44:31.691203 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"11b8ece5-4192-4e13-a1c7-86ed3c627ddf","Type":"ContainerStarted","Data":"888d3356ae1d79bdd97a607512a80b88b92e4f4d410a00d50371b35e64a5142d"} Dec 02 07:44:31 crc kubenswrapper[4895]: I1202 07:44:31.691722 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"11b8ece5-4192-4e13-a1c7-86ed3c627ddf","Type":"ContainerStarted","Data":"e3b88de13161d2c3c54de60370d9bf827fce15fa277067e0d26ffdbf3decddf8"} Dec 02 07:44:32 crc kubenswrapper[4895]: I1202 07:44:32.706925 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"11b8ece5-4192-4e13-a1c7-86ed3c627ddf","Type":"ContainerStarted","Data":"d16cb117b475cbe7eca7173bb117167934dc524dc42dabe3df6e81fc2b6e379b"} Dec 02 07:44:33 crc kubenswrapper[4895]: I1202 07:44:33.729513 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"11b8ece5-4192-4e13-a1c7-86ed3c627ddf","Type":"ContainerStarted","Data":"81270f78df9b0d8cee1ae380d9bd934ce978faa7f9860ae475f4316e91185bda"} Dec 02 07:44:33 crc kubenswrapper[4895]: I1202 07:44:33.730127 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"11b8ece5-4192-4e13-a1c7-86ed3c627ddf","Type":"ContainerStarted","Data":"79658c290b2b8a920d6b1879c4cfd278d8b997d5f9b72c39a6d6c7310f20f615"} Dec 02 07:44:33 crc kubenswrapper[4895]: I1202 07:44:33.730145 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"11b8ece5-4192-4e13-a1c7-86ed3c627ddf","Type":"ContainerStarted","Data":"d0ab0e1bcdef49a9178a125186928df5512dbbec58106b2c12e7d8acd9b931e4"} Dec 02 07:44:34 crc kubenswrapper[4895]: I1202 07:44:34.748886 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"11b8ece5-4192-4e13-a1c7-86ed3c627ddf","Type":"ContainerStarted","Data":"a772c3088bf7934e4656b200f802e6851dd25d0e8355f14cbbc3a035463513c0"} Dec 02 07:44:35 crc kubenswrapper[4895]: I1202 07:44:35.434134 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Dec 02 07:44:35 crc kubenswrapper[4895]: I1202 07:44:35.763122 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"11b8ece5-4192-4e13-a1c7-86ed3c627ddf","Type":"ContainerStarted","Data":"7c1e2c74168dac752cdee201c2e0c1b2faf7132d8e553780fa6dba40aeeeaa1e"} Dec 02 07:44:35 crc kubenswrapper[4895]: I1202 07:44:35.763628 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"11b8ece5-4192-4e13-a1c7-86ed3c627ddf","Type":"ContainerStarted","Data":"5ced108d1ab8442c1fac1fe0fc3c7939f98a90c737db7a6f1aced0c0edb070a4"} Dec 02 07:44:35 crc kubenswrapper[4895]: I1202 07:44:35.790850 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-sfwtc"] Dec 02 07:44:35 crc kubenswrapper[4895]: E1202 07:44:35.791260 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95e1ff1e-d711-41e1-bff9-a78f36d115c4" containerName="ovn-config" Dec 02 07:44:35 crc kubenswrapper[4895]: I1202 07:44:35.791280 4895 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="95e1ff1e-d711-41e1-bff9-a78f36d115c4" containerName="ovn-config" Dec 02 07:44:35 crc kubenswrapper[4895]: E1202 07:44:35.791306 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b37be9e7-783d-4bd7-a9fc-41049311cab8" containerName="init" Dec 02 07:44:35 crc kubenswrapper[4895]: I1202 07:44:35.791313 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="b37be9e7-783d-4bd7-a9fc-41049311cab8" containerName="init" Dec 02 07:44:35 crc kubenswrapper[4895]: E1202 07:44:35.791327 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="caa608fc-52f9-426b-aca3-610fe5e245e0" containerName="swift-ring-rebalance" Dec 02 07:44:35 crc kubenswrapper[4895]: I1202 07:44:35.791333 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="caa608fc-52f9-426b-aca3-610fe5e245e0" containerName="swift-ring-rebalance" Dec 02 07:44:35 crc kubenswrapper[4895]: E1202 07:44:35.791357 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b37be9e7-783d-4bd7-a9fc-41049311cab8" containerName="dnsmasq-dns" Dec 02 07:44:35 crc kubenswrapper[4895]: I1202 07:44:35.791363 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="b37be9e7-783d-4bd7-a9fc-41049311cab8" containerName="dnsmasq-dns" Dec 02 07:44:35 crc kubenswrapper[4895]: I1202 07:44:35.791539 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="caa608fc-52f9-426b-aca3-610fe5e245e0" containerName="swift-ring-rebalance" Dec 02 07:44:35 crc kubenswrapper[4895]: I1202 07:44:35.791551 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="95e1ff1e-d711-41e1-bff9-a78f36d115c4" containerName="ovn-config" Dec 02 07:44:35 crc kubenswrapper[4895]: I1202 07:44:35.791577 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="b37be9e7-783d-4bd7-a9fc-41049311cab8" containerName="dnsmasq-dns" Dec 02 07:44:35 crc kubenswrapper[4895]: I1202 07:44:35.792274 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-sfwtc" Dec 02 07:44:35 crc kubenswrapper[4895]: I1202 07:44:35.816167 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-sfwtc"] Dec 02 07:44:35 crc kubenswrapper[4895]: I1202 07:44:35.857342 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8zgns\" (UniqueName: \"kubernetes.io/projected/66d2c26b-ed57-435f-845e-e4d51a4d9aa3-kube-api-access-8zgns\") pod \"barbican-db-create-sfwtc\" (UID: \"66d2c26b-ed57-435f-845e-e4d51a4d9aa3\") " pod="openstack/barbican-db-create-sfwtc" Dec 02 07:44:35 crc kubenswrapper[4895]: I1202 07:44:35.857497 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/66d2c26b-ed57-435f-845e-e4d51a4d9aa3-operator-scripts\") pod \"barbican-db-create-sfwtc\" (UID: \"66d2c26b-ed57-435f-845e-e4d51a4d9aa3\") " pod="openstack/barbican-db-create-sfwtc" Dec 02 07:44:35 crc kubenswrapper[4895]: I1202 07:44:35.873905 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=37.670379029 podStartE2EDuration="44.873881854s" podCreationTimestamp="2025-12-02 07:43:51 +0000 UTC" firstStartedPulling="2025-12-02 07:44:25.273848734 +0000 UTC m=+1276.444708347" lastFinishedPulling="2025-12-02 07:44:32.477351559 +0000 UTC m=+1283.648211172" observedRunningTime="2025-12-02 07:44:35.867763025 +0000 UTC m=+1287.038622648" watchObservedRunningTime="2025-12-02 07:44:35.873881854 +0000 UTC m=+1287.044741467" Dec 02 07:44:35 crc kubenswrapper[4895]: I1202 07:44:35.910018 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-4aa4-account-create-update-t4vvh"] Dec 02 07:44:35 crc kubenswrapper[4895]: I1202 07:44:35.911349 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-4aa4-account-create-update-t4vvh" Dec 02 07:44:35 crc kubenswrapper[4895]: I1202 07:44:35.916488 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Dec 02 07:44:35 crc kubenswrapper[4895]: I1202 07:44:35.923815 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-4aa4-account-create-update-t4vvh"] Dec 02 07:44:35 crc kubenswrapper[4895]: I1202 07:44:35.969658 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8zgns\" (UniqueName: \"kubernetes.io/projected/66d2c26b-ed57-435f-845e-e4d51a4d9aa3-kube-api-access-8zgns\") pod \"barbican-db-create-sfwtc\" (UID: \"66d2c26b-ed57-435f-845e-e4d51a4d9aa3\") " pod="openstack/barbican-db-create-sfwtc" Dec 02 07:44:35 crc kubenswrapper[4895]: I1202 07:44:35.969768 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/66d2c26b-ed57-435f-845e-e4d51a4d9aa3-operator-scripts\") pod \"barbican-db-create-sfwtc\" (UID: \"66d2c26b-ed57-435f-845e-e4d51a4d9aa3\") " pod="openstack/barbican-db-create-sfwtc" Dec 02 07:44:35 crc kubenswrapper[4895]: I1202 07:44:35.969799 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/137d4f28-0e97-4154-90f1-22426094ef5e-operator-scripts\") pod \"barbican-4aa4-account-create-update-t4vvh\" (UID: \"137d4f28-0e97-4154-90f1-22426094ef5e\") " pod="openstack/barbican-4aa4-account-create-update-t4vvh" Dec 02 07:44:35 crc kubenswrapper[4895]: I1202 07:44:35.969826 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kv5mg\" (UniqueName: \"kubernetes.io/projected/137d4f28-0e97-4154-90f1-22426094ef5e-kube-api-access-kv5mg\") pod \"barbican-4aa4-account-create-update-t4vvh\" (UID: 
\"137d4f28-0e97-4154-90f1-22426094ef5e\") " pod="openstack/barbican-4aa4-account-create-update-t4vvh" Dec 02 07:44:35 crc kubenswrapper[4895]: I1202 07:44:35.971036 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/66d2c26b-ed57-435f-845e-e4d51a4d9aa3-operator-scripts\") pod \"barbican-db-create-sfwtc\" (UID: \"66d2c26b-ed57-435f-845e-e4d51a4d9aa3\") " pod="openstack/barbican-db-create-sfwtc" Dec 02 07:44:35 crc kubenswrapper[4895]: I1202 07:44:35.987701 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-lgt57"] Dec 02 07:44:35 crc kubenswrapper[4895]: I1202 07:44:35.988930 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-lgt57" Dec 02 07:44:36 crc kubenswrapper[4895]: I1202 07:44:36.011063 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-lgt57"] Dec 02 07:44:36 crc kubenswrapper[4895]: I1202 07:44:36.016787 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8zgns\" (UniqueName: \"kubernetes.io/projected/66d2c26b-ed57-435f-845e-e4d51a4d9aa3-kube-api-access-8zgns\") pod \"barbican-db-create-sfwtc\" (UID: \"66d2c26b-ed57-435f-845e-e4d51a4d9aa3\") " pod="openstack/barbican-db-create-sfwtc" Dec 02 07:44:36 crc kubenswrapper[4895]: I1202 07:44:36.072214 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/137d4f28-0e97-4154-90f1-22426094ef5e-operator-scripts\") pod \"barbican-4aa4-account-create-update-t4vvh\" (UID: \"137d4f28-0e97-4154-90f1-22426094ef5e\") " pod="openstack/barbican-4aa4-account-create-update-t4vvh" Dec 02 07:44:36 crc kubenswrapper[4895]: I1202 07:44:36.072308 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kv5mg\" (UniqueName: 
\"kubernetes.io/projected/137d4f28-0e97-4154-90f1-22426094ef5e-kube-api-access-kv5mg\") pod \"barbican-4aa4-account-create-update-t4vvh\" (UID: \"137d4f28-0e97-4154-90f1-22426094ef5e\") " pod="openstack/barbican-4aa4-account-create-update-t4vvh" Dec 02 07:44:36 crc kubenswrapper[4895]: I1202 07:44:36.072530 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/915ca98c-6878-4b7a-ba75-75ab97ce5900-operator-scripts\") pod \"cinder-db-create-lgt57\" (UID: \"915ca98c-6878-4b7a-ba75-75ab97ce5900\") " pod="openstack/cinder-db-create-lgt57" Dec 02 07:44:36 crc kubenswrapper[4895]: I1202 07:44:36.072585 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hp29h\" (UniqueName: \"kubernetes.io/projected/915ca98c-6878-4b7a-ba75-75ab97ce5900-kube-api-access-hp29h\") pod \"cinder-db-create-lgt57\" (UID: \"915ca98c-6878-4b7a-ba75-75ab97ce5900\") " pod="openstack/cinder-db-create-lgt57" Dec 02 07:44:36 crc kubenswrapper[4895]: I1202 07:44:36.073109 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/137d4f28-0e97-4154-90f1-22426094ef5e-operator-scripts\") pod \"barbican-4aa4-account-create-update-t4vvh\" (UID: \"137d4f28-0e97-4154-90f1-22426094ef5e\") " pod="openstack/barbican-4aa4-account-create-update-t4vvh" Dec 02 07:44:36 crc kubenswrapper[4895]: I1202 07:44:36.091438 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kv5mg\" (UniqueName: \"kubernetes.io/projected/137d4f28-0e97-4154-90f1-22426094ef5e-kube-api-access-kv5mg\") pod \"barbican-4aa4-account-create-update-t4vvh\" (UID: \"137d4f28-0e97-4154-90f1-22426094ef5e\") " pod="openstack/barbican-4aa4-account-create-update-t4vvh" Dec 02 07:44:36 crc kubenswrapper[4895]: I1202 07:44:36.109983 4895 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/barbican-db-create-sfwtc" Dec 02 07:44:36 crc kubenswrapper[4895]: I1202 07:44:36.235545 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-4aa4-account-create-update-t4vvh" Dec 02 07:44:36 crc kubenswrapper[4895]: I1202 07:44:36.236668 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/915ca98c-6878-4b7a-ba75-75ab97ce5900-operator-scripts\") pod \"cinder-db-create-lgt57\" (UID: \"915ca98c-6878-4b7a-ba75-75ab97ce5900\") " pod="openstack/cinder-db-create-lgt57" Dec 02 07:44:36 crc kubenswrapper[4895]: I1202 07:44:36.236771 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hp29h\" (UniqueName: \"kubernetes.io/projected/915ca98c-6878-4b7a-ba75-75ab97ce5900-kube-api-access-hp29h\") pod \"cinder-db-create-lgt57\" (UID: \"915ca98c-6878-4b7a-ba75-75ab97ce5900\") " pod="openstack/cinder-db-create-lgt57" Dec 02 07:44:36 crc kubenswrapper[4895]: I1202 07:44:36.237419 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/915ca98c-6878-4b7a-ba75-75ab97ce5900-operator-scripts\") pod \"cinder-db-create-lgt57\" (UID: \"915ca98c-6878-4b7a-ba75-75ab97ce5900\") " pod="openstack/cinder-db-create-lgt57" Dec 02 07:44:36 crc kubenswrapper[4895]: I1202 07:44:36.292388 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hp29h\" (UniqueName: \"kubernetes.io/projected/915ca98c-6878-4b7a-ba75-75ab97ce5900-kube-api-access-hp29h\") pod \"cinder-db-create-lgt57\" (UID: \"915ca98c-6878-4b7a-ba75-75ab97ce5900\") " pod="openstack/cinder-db-create-lgt57" Dec 02 07:44:36 crc kubenswrapper[4895]: I1202 07:44:36.296436 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-skg8w"] Dec 02 07:44:36 crc kubenswrapper[4895]: 
I1202 07:44:36.306187 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-lgt57"
Dec 02 07:44:36 crc kubenswrapper[4895]: I1202 07:44:36.307103 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-skg8w"
Dec 02 07:44:36 crc kubenswrapper[4895]: I1202 07:44:36.315367 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-skg8w"]
Dec 02 07:44:36 crc kubenswrapper[4895]: I1202 07:44:36.339315 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-7d85-account-create-update-bdsdf"]
Dec 02 07:44:36 crc kubenswrapper[4895]: I1202 07:44:36.339513 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ce68a83d-fdca-4ae6-8e1b-ab7dffb77bb8-operator-scripts\") pod \"neutron-db-create-skg8w\" (UID: \"ce68a83d-fdca-4ae6-8e1b-ab7dffb77bb8\") " pod="openstack/neutron-db-create-skg8w"
Dec 02 07:44:36 crc kubenswrapper[4895]: I1202 07:44:36.339615 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w24jj\" (UniqueName: \"kubernetes.io/projected/ce68a83d-fdca-4ae6-8e1b-ab7dffb77bb8-kube-api-access-w24jj\") pod \"neutron-db-create-skg8w\" (UID: \"ce68a83d-fdca-4ae6-8e1b-ab7dffb77bb8\") " pod="openstack/neutron-db-create-skg8w"
Dec 02 07:44:36 crc kubenswrapper[4895]: I1202 07:44:36.340966 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-7d85-account-create-update-bdsdf"
Dec 02 07:44:36 crc kubenswrapper[4895]: I1202 07:44:36.350219 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret"
Dec 02 07:44:36 crc kubenswrapper[4895]: I1202 07:44:36.418827 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-7d85-account-create-update-bdsdf"]
Dec 02 07:44:36 crc kubenswrapper[4895]: I1202 07:44:36.443730 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ce68a83d-fdca-4ae6-8e1b-ab7dffb77bb8-operator-scripts\") pod \"neutron-db-create-skg8w\" (UID: \"ce68a83d-fdca-4ae6-8e1b-ab7dffb77bb8\") " pod="openstack/neutron-db-create-skg8w"
Dec 02 07:44:36 crc kubenswrapper[4895]: I1202 07:44:36.443850 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w24jj\" (UniqueName: \"kubernetes.io/projected/ce68a83d-fdca-4ae6-8e1b-ab7dffb77bb8-kube-api-access-w24jj\") pod \"neutron-db-create-skg8w\" (UID: \"ce68a83d-fdca-4ae6-8e1b-ab7dffb77bb8\") " pod="openstack/neutron-db-create-skg8w"
Dec 02 07:44:36 crc kubenswrapper[4895]: I1202 07:44:36.445289 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ce68a83d-fdca-4ae6-8e1b-ab7dffb77bb8-operator-scripts\") pod \"neutron-db-create-skg8w\" (UID: \"ce68a83d-fdca-4ae6-8e1b-ab7dffb77bb8\") " pod="openstack/neutron-db-create-skg8w"
Dec 02 07:44:36 crc kubenswrapper[4895]: I1202 07:44:36.491353 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-tx72w"]
Dec 02 07:44:36 crc kubenswrapper[4895]: I1202 07:44:36.495669 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w24jj\" (UniqueName: \"kubernetes.io/projected/ce68a83d-fdca-4ae6-8e1b-ab7dffb77bb8-kube-api-access-w24jj\") pod \"neutron-db-create-skg8w\" (UID: \"ce68a83d-fdca-4ae6-8e1b-ab7dffb77bb8\") " pod="openstack/neutron-db-create-skg8w"
Dec 02 07:44:36 crc kubenswrapper[4895]: I1202 07:44:36.496437 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-tx72w"
Dec 02 07:44:36 crc kubenswrapper[4895]: I1202 07:44:36.509750 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-tx72w"]
Dec 02 07:44:36 crc kubenswrapper[4895]: I1202 07:44:36.510248 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Dec 02 07:44:36 crc kubenswrapper[4895]: I1202 07:44:36.510469 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Dec 02 07:44:36 crc kubenswrapper[4895]: I1202 07:44:36.510657 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-qm9nx"
Dec 02 07:44:36 crc kubenswrapper[4895]: I1202 07:44:36.511009 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Dec 02 07:44:36 crc kubenswrapper[4895]: I1202 07:44:36.518464 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-hdjr4"]
Dec 02 07:44:36 crc kubenswrapper[4895]: I1202 07:44:36.525940 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-hdjr4"
Dec 02 07:44:36 crc kubenswrapper[4895]: I1202 07:44:36.529595 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5a3b-account-create-update-ztphx"]
Dec 02 07:44:36 crc kubenswrapper[4895]: I1202 07:44:36.530958 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5a3b-account-create-update-ztphx"
Dec 02 07:44:36 crc kubenswrapper[4895]: I1202 07:44:36.538272 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0"
Dec 02 07:44:36 crc kubenswrapper[4895]: I1202 07:44:36.549235 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret"
Dec 02 07:44:36 crc kubenswrapper[4895]: I1202 07:44:36.550179 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/41f5b5b1-5555-42cd-b212-71f5e6c5d0c3-operator-scripts\") pod \"cinder-7d85-account-create-update-bdsdf\" (UID: \"41f5b5b1-5555-42cd-b212-71f5e6c5d0c3\") " pod="openstack/cinder-7d85-account-create-update-bdsdf"
Dec 02 07:44:36 crc kubenswrapper[4895]: I1202 07:44:36.550273 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5hkq\" (UniqueName: \"kubernetes.io/projected/41f5b5b1-5555-42cd-b212-71f5e6c5d0c3-kube-api-access-w5hkq\") pod \"cinder-7d85-account-create-update-bdsdf\" (UID: \"41f5b5b1-5555-42cd-b212-71f5e6c5d0c3\") " pod="openstack/cinder-7d85-account-create-update-bdsdf"
Dec 02 07:44:36 crc kubenswrapper[4895]: I1202 07:44:36.571153 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-hdjr4"]
Dec 02 07:44:36 crc kubenswrapper[4895]: I1202 07:44:36.585902 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5a3b-account-create-update-ztphx"]
Dec 02 07:44:36 crc kubenswrapper[4895]: I1202 07:44:36.654631 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ng4rw\" (UniqueName: \"kubernetes.io/projected/ecfcde89-8c82-43c8-b59b-4145640a2737-kube-api-access-ng4rw\") pod \"keystone-db-sync-tx72w\" (UID: \"ecfcde89-8c82-43c8-b59b-4145640a2737\") " pod="openstack/keystone-db-sync-tx72w"
Dec 02 07:44:36 crc kubenswrapper[4895]: I1202 07:44:36.654706 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ecfcde89-8c82-43c8-b59b-4145640a2737-config-data\") pod \"keystone-db-sync-tx72w\" (UID: \"ecfcde89-8c82-43c8-b59b-4145640a2737\") " pod="openstack/keystone-db-sync-tx72w"
Dec 02 07:44:36 crc kubenswrapper[4895]: I1202 07:44:36.654779 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5977660a-cdf5-4a65-8b8a-bbd944ec5736-dns-swift-storage-0\") pod \"dnsmasq-dns-764c5664d7-hdjr4\" (UID: \"5977660a-cdf5-4a65-8b8a-bbd944ec5736\") " pod="openstack/dnsmasq-dns-764c5664d7-hdjr4"
Dec 02 07:44:36 crc kubenswrapper[4895]: I1202 07:44:36.654809 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xjmd\" (UniqueName: \"kubernetes.io/projected/2e30fe62-fac0-425f-ba6f-277033d652d1-kube-api-access-7xjmd\") pod \"neutron-5a3b-account-create-update-ztphx\" (UID: \"2e30fe62-fac0-425f-ba6f-277033d652d1\") " pod="openstack/neutron-5a3b-account-create-update-ztphx"
Dec 02 07:44:36 crc kubenswrapper[4895]: I1202 07:44:36.654853 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2e30fe62-fac0-425f-ba6f-277033d652d1-operator-scripts\") pod \"neutron-5a3b-account-create-update-ztphx\" (UID: \"2e30fe62-fac0-425f-ba6f-277033d652d1\") " pod="openstack/neutron-5a3b-account-create-update-ztphx"
Dec 02 07:44:36 crc kubenswrapper[4895]: I1202 07:44:36.654993 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5977660a-cdf5-4a65-8b8a-bbd944ec5736-config\") pod \"dnsmasq-dns-764c5664d7-hdjr4\" (UID: \"5977660a-cdf5-4a65-8b8a-bbd944ec5736\") " pod="openstack/dnsmasq-dns-764c5664d7-hdjr4"
Dec 02 07:44:36 crc kubenswrapper[4895]: I1202 07:44:36.655142 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5977660a-cdf5-4a65-8b8a-bbd944ec5736-dns-svc\") pod \"dnsmasq-dns-764c5664d7-hdjr4\" (UID: \"5977660a-cdf5-4a65-8b8a-bbd944ec5736\") " pod="openstack/dnsmasq-dns-764c5664d7-hdjr4"
Dec 02 07:44:36 crc kubenswrapper[4895]: I1202 07:44:36.656321 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/41f5b5b1-5555-42cd-b212-71f5e6c5d0c3-operator-scripts\") pod \"cinder-7d85-account-create-update-bdsdf\" (UID: \"41f5b5b1-5555-42cd-b212-71f5e6c5d0c3\") " pod="openstack/cinder-7d85-account-create-update-bdsdf"
Dec 02 07:44:36 crc kubenswrapper[4895]: I1202 07:44:36.656650 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5977660a-cdf5-4a65-8b8a-bbd944ec5736-ovsdbserver-nb\") pod \"dnsmasq-dns-764c5664d7-hdjr4\" (UID: \"5977660a-cdf5-4a65-8b8a-bbd944ec5736\") " pod="openstack/dnsmasq-dns-764c5664d7-hdjr4"
Dec 02 07:44:36 crc kubenswrapper[4895]: I1202 07:44:36.656706 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecfcde89-8c82-43c8-b59b-4145640a2737-combined-ca-bundle\") pod \"keystone-db-sync-tx72w\" (UID: \"ecfcde89-8c82-43c8-b59b-4145640a2737\") " pod="openstack/keystone-db-sync-tx72w"
Dec 02 07:44:36 crc kubenswrapper[4895]: I1202 07:44:36.656751 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5hkq\" (UniqueName: \"kubernetes.io/projected/41f5b5b1-5555-42cd-b212-71f5e6c5d0c3-kube-api-access-w5hkq\") pod \"cinder-7d85-account-create-update-bdsdf\" (UID: \"41f5b5b1-5555-42cd-b212-71f5e6c5d0c3\") " pod="openstack/cinder-7d85-account-create-update-bdsdf"
Dec 02 07:44:36 crc kubenswrapper[4895]: I1202 07:44:36.656851 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d927z\" (UniqueName: \"kubernetes.io/projected/5977660a-cdf5-4a65-8b8a-bbd944ec5736-kube-api-access-d927z\") pod \"dnsmasq-dns-764c5664d7-hdjr4\" (UID: \"5977660a-cdf5-4a65-8b8a-bbd944ec5736\") " pod="openstack/dnsmasq-dns-764c5664d7-hdjr4"
Dec 02 07:44:36 crc kubenswrapper[4895]: I1202 07:44:36.656887 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5977660a-cdf5-4a65-8b8a-bbd944ec5736-ovsdbserver-sb\") pod \"dnsmasq-dns-764c5664d7-hdjr4\" (UID: \"5977660a-cdf5-4a65-8b8a-bbd944ec5736\") " pod="openstack/dnsmasq-dns-764c5664d7-hdjr4"
Dec 02 07:44:36 crc kubenswrapper[4895]: I1202 07:44:36.657971 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/41f5b5b1-5555-42cd-b212-71f5e6c5d0c3-operator-scripts\") pod \"cinder-7d85-account-create-update-bdsdf\" (UID: \"41f5b5b1-5555-42cd-b212-71f5e6c5d0c3\") " pod="openstack/cinder-7d85-account-create-update-bdsdf"
Dec 02 07:44:36 crc kubenswrapper[4895]: I1202 07:44:36.672254 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-skg8w"
Dec 02 07:44:36 crc kubenswrapper[4895]: I1202 07:44:36.687801 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5hkq\" (UniqueName: \"kubernetes.io/projected/41f5b5b1-5555-42cd-b212-71f5e6c5d0c3-kube-api-access-w5hkq\") pod \"cinder-7d85-account-create-update-bdsdf\" (UID: \"41f5b5b1-5555-42cd-b212-71f5e6c5d0c3\") " pod="openstack/cinder-7d85-account-create-update-bdsdf"
Dec 02 07:44:36 crc kubenswrapper[4895]: I1202 07:44:36.759136 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5977660a-cdf5-4a65-8b8a-bbd944ec5736-ovsdbserver-nb\") pod \"dnsmasq-dns-764c5664d7-hdjr4\" (UID: \"5977660a-cdf5-4a65-8b8a-bbd944ec5736\") " pod="openstack/dnsmasq-dns-764c5664d7-hdjr4"
Dec 02 07:44:36 crc kubenswrapper[4895]: I1202 07:44:36.759190 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecfcde89-8c82-43c8-b59b-4145640a2737-combined-ca-bundle\") pod \"keystone-db-sync-tx72w\" (UID: \"ecfcde89-8c82-43c8-b59b-4145640a2737\") " pod="openstack/keystone-db-sync-tx72w"
Dec 02 07:44:36 crc kubenswrapper[4895]: I1202 07:44:36.759235 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d927z\" (UniqueName: \"kubernetes.io/projected/5977660a-cdf5-4a65-8b8a-bbd944ec5736-kube-api-access-d927z\") pod \"dnsmasq-dns-764c5664d7-hdjr4\" (UID: \"5977660a-cdf5-4a65-8b8a-bbd944ec5736\") " pod="openstack/dnsmasq-dns-764c5664d7-hdjr4"
Dec 02 07:44:36 crc kubenswrapper[4895]: I1202 07:44:36.759257 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5977660a-cdf5-4a65-8b8a-bbd944ec5736-ovsdbserver-sb\") pod \"dnsmasq-dns-764c5664d7-hdjr4\" (UID: \"5977660a-cdf5-4a65-8b8a-bbd944ec5736\") " pod="openstack/dnsmasq-dns-764c5664d7-hdjr4"
Dec 02 07:44:36 crc kubenswrapper[4895]: I1202 07:44:36.759310 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ng4rw\" (UniqueName: \"kubernetes.io/projected/ecfcde89-8c82-43c8-b59b-4145640a2737-kube-api-access-ng4rw\") pod \"keystone-db-sync-tx72w\" (UID: \"ecfcde89-8c82-43c8-b59b-4145640a2737\") " pod="openstack/keystone-db-sync-tx72w"
Dec 02 07:44:36 crc kubenswrapper[4895]: I1202 07:44:36.759332 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ecfcde89-8c82-43c8-b59b-4145640a2737-config-data\") pod \"keystone-db-sync-tx72w\" (UID: \"ecfcde89-8c82-43c8-b59b-4145640a2737\") " pod="openstack/keystone-db-sync-tx72w"
Dec 02 07:44:36 crc kubenswrapper[4895]: I1202 07:44:36.759359 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5977660a-cdf5-4a65-8b8a-bbd944ec5736-dns-swift-storage-0\") pod \"dnsmasq-dns-764c5664d7-hdjr4\" (UID: \"5977660a-cdf5-4a65-8b8a-bbd944ec5736\") " pod="openstack/dnsmasq-dns-764c5664d7-hdjr4"
Dec 02 07:44:36 crc kubenswrapper[4895]: I1202 07:44:36.759404 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7xjmd\" (UniqueName: \"kubernetes.io/projected/2e30fe62-fac0-425f-ba6f-277033d652d1-kube-api-access-7xjmd\") pod \"neutron-5a3b-account-create-update-ztphx\" (UID: \"2e30fe62-fac0-425f-ba6f-277033d652d1\") " pod="openstack/neutron-5a3b-account-create-update-ztphx"
Dec 02 07:44:36 crc kubenswrapper[4895]: I1202 07:44:36.759436 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2e30fe62-fac0-425f-ba6f-277033d652d1-operator-scripts\") pod \"neutron-5a3b-account-create-update-ztphx\" (UID: \"2e30fe62-fac0-425f-ba6f-277033d652d1\") " pod="openstack/neutron-5a3b-account-create-update-ztphx"
Dec 02 07:44:36 crc kubenswrapper[4895]: I1202 07:44:36.759461 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5977660a-cdf5-4a65-8b8a-bbd944ec5736-config\") pod \"dnsmasq-dns-764c5664d7-hdjr4\" (UID: \"5977660a-cdf5-4a65-8b8a-bbd944ec5736\") " pod="openstack/dnsmasq-dns-764c5664d7-hdjr4"
Dec 02 07:44:36 crc kubenswrapper[4895]: I1202 07:44:36.759492 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5977660a-cdf5-4a65-8b8a-bbd944ec5736-dns-svc\") pod \"dnsmasq-dns-764c5664d7-hdjr4\" (UID: \"5977660a-cdf5-4a65-8b8a-bbd944ec5736\") " pod="openstack/dnsmasq-dns-764c5664d7-hdjr4"
Dec 02 07:44:36 crc kubenswrapper[4895]: I1202 07:44:36.760300 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5977660a-cdf5-4a65-8b8a-bbd944ec5736-dns-svc\") pod \"dnsmasq-dns-764c5664d7-hdjr4\" (UID: \"5977660a-cdf5-4a65-8b8a-bbd944ec5736\") " pod="openstack/dnsmasq-dns-764c5664d7-hdjr4"
Dec 02 07:44:36 crc kubenswrapper[4895]: I1202 07:44:36.761189 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5977660a-cdf5-4a65-8b8a-bbd944ec5736-ovsdbserver-nb\") pod \"dnsmasq-dns-764c5664d7-hdjr4\" (UID: \"5977660a-cdf5-4a65-8b8a-bbd944ec5736\") " pod="openstack/dnsmasq-dns-764c5664d7-hdjr4"
Dec 02 07:44:36 crc kubenswrapper[4895]: I1202 07:44:36.762886 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5977660a-cdf5-4a65-8b8a-bbd944ec5736-dns-swift-storage-0\") pod \"dnsmasq-dns-764c5664d7-hdjr4\" (UID: \"5977660a-cdf5-4a65-8b8a-bbd944ec5736\") " pod="openstack/dnsmasq-dns-764c5664d7-hdjr4"
Dec 02 07:44:36 crc kubenswrapper[4895]: I1202 07:44:36.763058 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5977660a-cdf5-4a65-8b8a-bbd944ec5736-ovsdbserver-sb\") pod \"dnsmasq-dns-764c5664d7-hdjr4\" (UID: \"5977660a-cdf5-4a65-8b8a-bbd944ec5736\") " pod="openstack/dnsmasq-dns-764c5664d7-hdjr4"
Dec 02 07:44:36 crc kubenswrapper[4895]: I1202 07:44:36.764045 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2e30fe62-fac0-425f-ba6f-277033d652d1-operator-scripts\") pod \"neutron-5a3b-account-create-update-ztphx\" (UID: \"2e30fe62-fac0-425f-ba6f-277033d652d1\") " pod="openstack/neutron-5a3b-account-create-update-ztphx"
Dec 02 07:44:36 crc kubenswrapper[4895]: I1202 07:44:36.764200 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5977660a-cdf5-4a65-8b8a-bbd944ec5736-config\") pod \"dnsmasq-dns-764c5664d7-hdjr4\" (UID: \"5977660a-cdf5-4a65-8b8a-bbd944ec5736\") " pod="openstack/dnsmasq-dns-764c5664d7-hdjr4"
Dec 02 07:44:36 crc kubenswrapper[4895]: I1202 07:44:36.783905 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecfcde89-8c82-43c8-b59b-4145640a2737-combined-ca-bundle\") pod \"keystone-db-sync-tx72w\" (UID: \"ecfcde89-8c82-43c8-b59b-4145640a2737\") " pod="openstack/keystone-db-sync-tx72w"
Dec 02 07:44:36 crc kubenswrapper[4895]: I1202 07:44:36.788342 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ng4rw\" (UniqueName: \"kubernetes.io/projected/ecfcde89-8c82-43c8-b59b-4145640a2737-kube-api-access-ng4rw\") pod \"keystone-db-sync-tx72w\" (UID: \"ecfcde89-8c82-43c8-b59b-4145640a2737\") " pod="openstack/keystone-db-sync-tx72w"
Dec 02 07:44:36 crc kubenswrapper[4895]: I1202 07:44:36.791844 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xjmd\" (UniqueName: \"kubernetes.io/projected/2e30fe62-fac0-425f-ba6f-277033d652d1-kube-api-access-7xjmd\") pod \"neutron-5a3b-account-create-update-ztphx\" (UID: \"2e30fe62-fac0-425f-ba6f-277033d652d1\") " pod="openstack/neutron-5a3b-account-create-update-ztphx"
Dec 02 07:44:36 crc kubenswrapper[4895]: I1202 07:44:36.795998 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ecfcde89-8c82-43c8-b59b-4145640a2737-config-data\") pod \"keystone-db-sync-tx72w\" (UID: \"ecfcde89-8c82-43c8-b59b-4145640a2737\") " pod="openstack/keystone-db-sync-tx72w"
Dec 02 07:44:36 crc kubenswrapper[4895]: I1202 07:44:36.819381 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d927z\" (UniqueName: \"kubernetes.io/projected/5977660a-cdf5-4a65-8b8a-bbd944ec5736-kube-api-access-d927z\") pod \"dnsmasq-dns-764c5664d7-hdjr4\" (UID: \"5977660a-cdf5-4a65-8b8a-bbd944ec5736\") " pod="openstack/dnsmasq-dns-764c5664d7-hdjr4"
Dec 02 07:44:36 crc kubenswrapper[4895]: I1202 07:44:36.926240 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-tx72w"
Dec 02 07:44:36 crc kubenswrapper[4895]: I1202 07:44:36.952124 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5a3b-account-create-update-ztphx"
Dec 02 07:44:36 crc kubenswrapper[4895]: I1202 07:44:36.967101 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-hdjr4"
Dec 02 07:44:37 crc kubenswrapper[4895]: I1202 07:44:37.003306 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-7d85-account-create-update-bdsdf"
Dec 02 07:44:37 crc kubenswrapper[4895]: I1202 07:44:37.028191 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-4aa4-account-create-update-t4vvh"]
Dec 02 07:44:37 crc kubenswrapper[4895]: I1202 07:44:37.195410 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-sfwtc"]
Dec 02 07:44:37 crc kubenswrapper[4895]: I1202 07:44:37.195454 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-skg8w"]
Dec 02 07:44:37 crc kubenswrapper[4895]: I1202 07:44:37.195467 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-lgt57"]
Dec 02 07:44:37 crc kubenswrapper[4895]: I1202 07:44:37.636517 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-7d85-account-create-update-bdsdf"]
Dec 02 07:44:37 crc kubenswrapper[4895]: I1202 07:44:37.755003 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-tx72w"]
Dec 02 07:44:37 crc kubenswrapper[4895]: I1202 07:44:37.784669 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-hdjr4"]
Dec 02 07:44:37 crc kubenswrapper[4895]: I1202 07:44:37.910076 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-hdjr4" event={"ID":"5977660a-cdf5-4a65-8b8a-bbd944ec5736","Type":"ContainerStarted","Data":"c43e3762f2da55210e8057d2a79ba5a0c4dfeb49809b48005388ce6e3c733014"}
Dec 02 07:44:37 crc kubenswrapper[4895]: I1202 07:44:37.926011 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-4aa4-account-create-update-t4vvh" event={"ID":"137d4f28-0e97-4154-90f1-22426094ef5e","Type":"ContainerStarted","Data":"29cac239a36ec47979b458bd7cbe4f5af5cb9aa6e860de93677ffafa0b13e0a8"}
Dec 02 07:44:37 crc kubenswrapper[4895]: I1202 07:44:37.926089 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-4aa4-account-create-update-t4vvh" event={"ID":"137d4f28-0e97-4154-90f1-22426094ef5e","Type":"ContainerStarted","Data":"b467d2ea9e92c4d103303fc85ca8eef2e0eee04d16f4f8ab425308b29f0f03b6"}
Dec 02 07:44:37 crc kubenswrapper[4895]: I1202 07:44:37.955682 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-skg8w" event={"ID":"ce68a83d-fdca-4ae6-8e1b-ab7dffb77bb8","Type":"ContainerStarted","Data":"0cea6d2353c72398e48c1c3d3ded7a0154a5a87ae593e426abedaa9a74830e8a"}
Dec 02 07:44:37 crc kubenswrapper[4895]: I1202 07:44:37.956083 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-skg8w" event={"ID":"ce68a83d-fdca-4ae6-8e1b-ab7dffb77bb8","Type":"ContainerStarted","Data":"204ad33bc58727a14c4e85b947b8fdccf25debb7de3065b53d0dab1300daa59c"}
Dec 02 07:44:37 crc kubenswrapper[4895]: I1202 07:44:37.965762 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5a3b-account-create-update-ztphx"]
Dec 02 07:44:37 crc kubenswrapper[4895]: I1202 07:44:37.988205 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-4aa4-account-create-update-t4vvh" podStartSLOduration=2.988172805 podStartE2EDuration="2.988172805s" podCreationTimestamp="2025-12-02 07:44:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:44:37.974349229 +0000 UTC m=+1289.145208842" watchObservedRunningTime="2025-12-02 07:44:37.988172805 +0000 UTC m=+1289.159032408"
Dec 02 07:44:38 crc kubenswrapper[4895]: I1202 07:44:38.001279 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-7d85-account-create-update-bdsdf" event={"ID":"41f5b5b1-5555-42cd-b212-71f5e6c5d0c3","Type":"ContainerStarted","Data":"bd9cfa6a280ac2d470cecf161e6334adbad95ae2edf82ec76c843ff58a9eb20e"}
Dec 02 07:44:38 crc kubenswrapper[4895]: I1202 07:44:38.019376 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-lgt57" event={"ID":"915ca98c-6878-4b7a-ba75-75ab97ce5900","Type":"ContainerStarted","Data":"ab55d7c053fe9195f13a2d5dd467990069644dc38f18867902edf1e3259825f1"}
Dec 02 07:44:38 crc kubenswrapper[4895]: I1202 07:44:38.020515 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-lgt57" event={"ID":"915ca98c-6878-4b7a-ba75-75ab97ce5900","Type":"ContainerStarted","Data":"4a2a86ab55bc85840944e1b376c8085d9d5d2732b18925f04861fe72184ecec0"}
Dec 02 07:44:38 crc kubenswrapper[4895]: I1202 07:44:38.041095 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-sfwtc" event={"ID":"66d2c26b-ed57-435f-845e-e4d51a4d9aa3","Type":"ContainerStarted","Data":"07083e71540680643e55a6b2c8400f1fab96294f90701438ad40cbeb3539c27f"}
Dec 02 07:44:38 crc kubenswrapper[4895]: I1202 07:44:38.041152 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-sfwtc" event={"ID":"66d2c26b-ed57-435f-845e-e4d51a4d9aa3","Type":"ContainerStarted","Data":"c98ac889f08e8980537c847e65adb202153bbc00a4c9c8bfa5c410ea14a44ec4"}
Dec 02 07:44:38 crc kubenswrapper[4895]: I1202 07:44:38.052231 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-create-skg8w" podStartSLOduration=2.05220766 podStartE2EDuration="2.05220766s" podCreationTimestamp="2025-12-02 07:44:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:44:38.040483959 +0000 UTC m=+1289.211343562" watchObservedRunningTime="2025-12-02 07:44:38.05220766 +0000 UTC m=+1289.223067263"
Dec 02 07:44:38 crc kubenswrapper[4895]: I1202 07:44:38.117388 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-create-lgt57" podStartSLOduration=3.117370139 podStartE2EDuration="3.117370139s" podCreationTimestamp="2025-12-02 07:44:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:44:38.107360271 +0000 UTC m=+1289.278219884" watchObservedRunningTime="2025-12-02 07:44:38.117370139 +0000 UTC m=+1289.288229752"
Dec 02 07:44:38 crc kubenswrapper[4895]: I1202 07:44:38.169723 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-create-sfwtc" podStartSLOduration=3.169700063 podStartE2EDuration="3.169700063s" podCreationTimestamp="2025-12-02 07:44:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:44:38.166256798 +0000 UTC m=+1289.337116411" watchObservedRunningTime="2025-12-02 07:44:38.169700063 +0000 UTC m=+1289.340559686"
Dec 02 07:44:38 crc kubenswrapper[4895]: E1202 07:44:38.515791 4895 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod915ca98c_6878_4b7a_ba75_75ab97ce5900.slice/crio-ab55d7c053fe9195f13a2d5dd467990069644dc38f18867902edf1e3259825f1.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod915ca98c_6878_4b7a_ba75_75ab97ce5900.slice/crio-conmon-ab55d7c053fe9195f13a2d5dd467990069644dc38f18867902edf1e3259825f1.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod137d4f28_0e97_4154_90f1_22426094ef5e.slice/crio-conmon-29cac239a36ec47979b458bd7cbe4f5af5cb9aa6e860de93677ffafa0b13e0a8.scope\": RecentStats: unable to find data in memory cache]"
Dec 02 07:44:39 crc kubenswrapper[4895]: I1202 07:44:39.053342 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-tx72w" event={"ID":"ecfcde89-8c82-43c8-b59b-4145640a2737","Type":"ContainerStarted","Data":"beb620f214ce8361718139c39cd662f28d3075378bff36006ef68b2088b167d4"}
Dec 02 07:44:39 crc kubenswrapper[4895]: I1202 07:44:39.058943 4895 generic.go:334] "Generic (PLEG): container finished" podID="2e30fe62-fac0-425f-ba6f-277033d652d1" containerID="c280f2831c6a36b2ad18f9301b8a0472b08702b0278b211544534ea69aa1a406" exitCode=0
Dec 02 07:44:39 crc kubenswrapper[4895]: I1202 07:44:39.059082 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5a3b-account-create-update-ztphx" event={"ID":"2e30fe62-fac0-425f-ba6f-277033d652d1","Type":"ContainerDied","Data":"c280f2831c6a36b2ad18f9301b8a0472b08702b0278b211544534ea69aa1a406"}
Dec 02 07:44:39 crc kubenswrapper[4895]: I1202 07:44:39.059126 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5a3b-account-create-update-ztphx" event={"ID":"2e30fe62-fac0-425f-ba6f-277033d652d1","Type":"ContainerStarted","Data":"3ee2dbdb015f2f4aed494dbf143a8c9b0846051c174e1b3394453c77036fd637"}
Dec 02 07:44:39 crc kubenswrapper[4895]: I1202 07:44:39.062085 4895 generic.go:334] "Generic (PLEG): container finished" podID="41f5b5b1-5555-42cd-b212-71f5e6c5d0c3" containerID="7502623bd7676b5c87fc1f5d8cba25df3796cbe037e5edeae59bd59e11e37241" exitCode=0
Dec 02 07:44:39 crc kubenswrapper[4895]: I1202 07:44:39.062204 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-7d85-account-create-update-bdsdf" event={"ID":"41f5b5b1-5555-42cd-b212-71f5e6c5d0c3","Type":"ContainerDied","Data":"7502623bd7676b5c87fc1f5d8cba25df3796cbe037e5edeae59bd59e11e37241"}
Dec 02 07:44:39 crc kubenswrapper[4895]: I1202 07:44:39.067260 4895 generic.go:334] "Generic (PLEG): container finished" podID="915ca98c-6878-4b7a-ba75-75ab97ce5900" containerID="ab55d7c053fe9195f13a2d5dd467990069644dc38f18867902edf1e3259825f1" exitCode=0
Dec 02 07:44:39 crc kubenswrapper[4895]: I1202 07:44:39.067384 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-lgt57" event={"ID":"915ca98c-6878-4b7a-ba75-75ab97ce5900","Type":"ContainerDied","Data":"ab55d7c053fe9195f13a2d5dd467990069644dc38f18867902edf1e3259825f1"}
Dec 02 07:44:39 crc kubenswrapper[4895]: I1202 07:44:39.115316 4895 generic.go:334] "Generic (PLEG): container finished" podID="66d2c26b-ed57-435f-845e-e4d51a4d9aa3" containerID="07083e71540680643e55a6b2c8400f1fab96294f90701438ad40cbeb3539c27f" exitCode=0
Dec 02 07:44:39 crc kubenswrapper[4895]: I1202 07:44:39.115467 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-sfwtc" event={"ID":"66d2c26b-ed57-435f-845e-e4d51a4d9aa3","Type":"ContainerDied","Data":"07083e71540680643e55a6b2c8400f1fab96294f90701438ad40cbeb3539c27f"}
Dec 02 07:44:39 crc kubenswrapper[4895]: I1202 07:44:39.135958 4895 generic.go:334] "Generic (PLEG): container finished" podID="5977660a-cdf5-4a65-8b8a-bbd944ec5736" containerID="6b964e030987aeb72ab4f97c379e8c654950a5733d9ac70a637495e2e188905b" exitCode=0
Dec 02 07:44:39 crc kubenswrapper[4895]: I1202 07:44:39.136494 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-hdjr4" event={"ID":"5977660a-cdf5-4a65-8b8a-bbd944ec5736","Type":"ContainerDied","Data":"6b964e030987aeb72ab4f97c379e8c654950a5733d9ac70a637495e2e188905b"}
Dec 02 07:44:39 crc kubenswrapper[4895]: I1202 07:44:39.145859 4895 generic.go:334] "Generic (PLEG): container finished" podID="137d4f28-0e97-4154-90f1-22426094ef5e" containerID="29cac239a36ec47979b458bd7cbe4f5af5cb9aa6e860de93677ffafa0b13e0a8" exitCode=0
Dec 02 07:44:39 crc kubenswrapper[4895]: I1202 07:44:39.152147 4895 generic.go:334] "Generic (PLEG): container finished" podID="ce68a83d-fdca-4ae6-8e1b-ab7dffb77bb8" containerID="0cea6d2353c72398e48c1c3d3ded7a0154a5a87ae593e426abedaa9a74830e8a" exitCode=0
Dec 02 07:44:39 crc kubenswrapper[4895]: I1202 07:44:39.172220 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-4aa4-account-create-update-t4vvh" event={"ID":"137d4f28-0e97-4154-90f1-22426094ef5e","Type":"ContainerDied","Data":"29cac239a36ec47979b458bd7cbe4f5af5cb9aa6e860de93677ffafa0b13e0a8"}
Dec 02 07:44:39 crc kubenswrapper[4895]: I1202 07:44:39.172303 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-skg8w" event={"ID":"ce68a83d-fdca-4ae6-8e1b-ab7dffb77bb8","Type":"ContainerDied","Data":"0cea6d2353c72398e48c1c3d3ded7a0154a5a87ae593e426abedaa9a74830e8a"}
Dec 02 07:44:40 crc kubenswrapper[4895]: I1202 07:44:40.179525 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-vjgr8" event={"ID":"a756fe09-2c73-430d-be27-34caa885311c","Type":"ContainerStarted","Data":"25d16ba51aa46d2a8ba22560c34b650e3b46eec242185ecc21922126defa6823"}
Dec 02 07:44:40 crc kubenswrapper[4895]: I1202 07:44:40.182231 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-hdjr4" event={"ID":"5977660a-cdf5-4a65-8b8a-bbd944ec5736","Type":"ContainerStarted","Data":"9a905c0785603a08aafb95096d190ca7e7d02066b5f4d4bcda08e56d643dd1a7"}
Dec 02 07:44:40 crc kubenswrapper[4895]: I1202 07:44:40.205379 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-vjgr8" podStartSLOduration=4.891903074 podStartE2EDuration="41.20534947s" podCreationTimestamp="2025-12-02 07:43:59 +0000 UTC" firstStartedPulling="2025-12-02 07:44:02.350550711 +0000 UTC m=+1253.521410324" lastFinishedPulling="2025-12-02 07:44:38.663997107 +0000 UTC m=+1289.834856720" observedRunningTime="2025-12-02 07:44:40.204451283 +0000 UTC m=+1291.375310926" watchObservedRunningTime="2025-12-02 07:44:40.20534947 +0000 UTC m=+1291.376209083"
Dec 02 07:44:41 crc kubenswrapper[4895]: I1202 07:44:41.195561 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-764c5664d7-hdjr4"
Dec 02 07:44:43 crc kubenswrapper[4895]: I1202 07:44:43.959835 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-skg8w"
Dec 02 07:44:43 crc kubenswrapper[4895]: I1202 07:44:43.972622 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-lgt57"
Dec 02 07:44:43 crc kubenswrapper[4895]: I1202 07:44:43.987861 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-764c5664d7-hdjr4" podStartSLOduration=7.987824706 podStartE2EDuration="7.987824706s" podCreationTimestamp="2025-12-02 07:44:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:44:40.238496013 +0000 UTC m=+1291.409355646" watchObservedRunningTime="2025-12-02 07:44:43.987824706 +0000 UTC m=+1295.158684329"
Dec 02 07:44:43 crc kubenswrapper[4895]: I1202 07:44:43.990543 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5a3b-account-create-update-ztphx"
Dec 02 07:44:44 crc kubenswrapper[4895]: I1202 07:44:44.001047 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-4aa4-account-create-update-t4vvh"
Dec 02 07:44:44 crc kubenswrapper[4895]: I1202 07:44:44.014844 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-sfwtc"
Dec 02 07:44:44 crc kubenswrapper[4895]: I1202 07:44:44.023885 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-7d85-account-create-update-bdsdf"
Dec 02 07:44:44 crc kubenswrapper[4895]: I1202 07:44:44.027512 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2e30fe62-fac0-425f-ba6f-277033d652d1-operator-scripts\") pod \"2e30fe62-fac0-425f-ba6f-277033d652d1\" (UID: \"2e30fe62-fac0-425f-ba6f-277033d652d1\") "
Dec 02 07:44:44 crc kubenswrapper[4895]: I1202 07:44:44.027705 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kv5mg\" (UniqueName: \"kubernetes.io/projected/137d4f28-0e97-4154-90f1-22426094ef5e-kube-api-access-kv5mg\") pod \"137d4f28-0e97-4154-90f1-22426094ef5e\" (UID: \"137d4f28-0e97-4154-90f1-22426094ef5e\") "
Dec 02 07:44:44 crc kubenswrapper[4895]: I1202 07:44:44.027751 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8zgns\" (UniqueName: \"kubernetes.io/projected/66d2c26b-ed57-435f-845e-e4d51a4d9aa3-kube-api-access-8zgns\") pod \"66d2c26b-ed57-435f-845e-e4d51a4d9aa3\" (UID: \"66d2c26b-ed57-435f-845e-e4d51a4d9aa3\") "
Dec 02 07:44:44 crc kubenswrapper[4895]: I1202 07:44:44.027805 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/66d2c26b-ed57-435f-845e-e4d51a4d9aa3-operator-scripts\") pod \"66d2c26b-ed57-435f-845e-e4d51a4d9aa3\" (UID: \"66d2c26b-ed57-435f-845e-e4d51a4d9aa3\") "
Dec 02 07:44:44 crc kubenswrapper[4895]: I1202 07:44:44.027869 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7xjmd\" (UniqueName: \"kubernetes.io/projected/2e30fe62-fac0-425f-ba6f-277033d652d1-kube-api-access-7xjmd\") pod \"2e30fe62-fac0-425f-ba6f-277033d652d1\" (UID: \"2e30fe62-fac0-425f-ba6f-277033d652d1\") "
Dec 02 07:44:44 crc kubenswrapper[4895]: I1202 07:44:44.028314 4895 operation_generator.go:803]
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e30fe62-fac0-425f-ba6f-277033d652d1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2e30fe62-fac0-425f-ba6f-277033d652d1" (UID: "2e30fe62-fac0-425f-ba6f-277033d652d1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:44:44 crc kubenswrapper[4895]: I1202 07:44:44.028592 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66d2c26b-ed57-435f-845e-e4d51a4d9aa3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "66d2c26b-ed57-435f-845e-e4d51a4d9aa3" (UID: "66d2c26b-ed57-435f-845e-e4d51a4d9aa3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:44:44 crc kubenswrapper[4895]: I1202 07:44:44.028613 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w24jj\" (UniqueName: \"kubernetes.io/projected/ce68a83d-fdca-4ae6-8e1b-ab7dffb77bb8-kube-api-access-w24jj\") pod \"ce68a83d-fdca-4ae6-8e1b-ab7dffb77bb8\" (UID: \"ce68a83d-fdca-4ae6-8e1b-ab7dffb77bb8\") " Dec 02 07:44:44 crc kubenswrapper[4895]: I1202 07:44:44.028711 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/915ca98c-6878-4b7a-ba75-75ab97ce5900-operator-scripts\") pod \"915ca98c-6878-4b7a-ba75-75ab97ce5900\" (UID: \"915ca98c-6878-4b7a-ba75-75ab97ce5900\") " Dec 02 07:44:44 crc kubenswrapper[4895]: I1202 07:44:44.028816 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hp29h\" (UniqueName: \"kubernetes.io/projected/915ca98c-6878-4b7a-ba75-75ab97ce5900-kube-api-access-hp29h\") pod \"915ca98c-6878-4b7a-ba75-75ab97ce5900\" (UID: \"915ca98c-6878-4b7a-ba75-75ab97ce5900\") " Dec 02 07:44:44 crc kubenswrapper[4895]: I1202 07:44:44.028860 4895 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ce68a83d-fdca-4ae6-8e1b-ab7dffb77bb8-operator-scripts\") pod \"ce68a83d-fdca-4ae6-8e1b-ab7dffb77bb8\" (UID: \"ce68a83d-fdca-4ae6-8e1b-ab7dffb77bb8\") " Dec 02 07:44:44 crc kubenswrapper[4895]: I1202 07:44:44.028960 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/137d4f28-0e97-4154-90f1-22426094ef5e-operator-scripts\") pod \"137d4f28-0e97-4154-90f1-22426094ef5e\" (UID: \"137d4f28-0e97-4154-90f1-22426094ef5e\") " Dec 02 07:44:44 crc kubenswrapper[4895]: I1202 07:44:44.029199 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/915ca98c-6878-4b7a-ba75-75ab97ce5900-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "915ca98c-6878-4b7a-ba75-75ab97ce5900" (UID: "915ca98c-6878-4b7a-ba75-75ab97ce5900"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:44:44 crc kubenswrapper[4895]: I1202 07:44:44.029896 4895 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2e30fe62-fac0-425f-ba6f-277033d652d1-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 07:44:44 crc kubenswrapper[4895]: I1202 07:44:44.029916 4895 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/66d2c26b-ed57-435f-845e-e4d51a4d9aa3-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 07:44:44 crc kubenswrapper[4895]: I1202 07:44:44.029926 4895 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/915ca98c-6878-4b7a-ba75-75ab97ce5900-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 07:44:44 crc kubenswrapper[4895]: I1202 07:44:44.030287 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/137d4f28-0e97-4154-90f1-22426094ef5e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "137d4f28-0e97-4154-90f1-22426094ef5e" (UID: "137d4f28-0e97-4154-90f1-22426094ef5e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:44:44 crc kubenswrapper[4895]: I1202 07:44:44.036959 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/915ca98c-6878-4b7a-ba75-75ab97ce5900-kube-api-access-hp29h" (OuterVolumeSpecName: "kube-api-access-hp29h") pod "915ca98c-6878-4b7a-ba75-75ab97ce5900" (UID: "915ca98c-6878-4b7a-ba75-75ab97ce5900"). InnerVolumeSpecName "kube-api-access-hp29h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:44:44 crc kubenswrapper[4895]: I1202 07:44:44.050170 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce68a83d-fdca-4ae6-8e1b-ab7dffb77bb8-kube-api-access-w24jj" (OuterVolumeSpecName: "kube-api-access-w24jj") pod "ce68a83d-fdca-4ae6-8e1b-ab7dffb77bb8" (UID: "ce68a83d-fdca-4ae6-8e1b-ab7dffb77bb8"). InnerVolumeSpecName "kube-api-access-w24jj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:44:44 crc kubenswrapper[4895]: I1202 07:44:44.050355 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e30fe62-fac0-425f-ba6f-277033d652d1-kube-api-access-7xjmd" (OuterVolumeSpecName: "kube-api-access-7xjmd") pod "2e30fe62-fac0-425f-ba6f-277033d652d1" (UID: "2e30fe62-fac0-425f-ba6f-277033d652d1"). InnerVolumeSpecName "kube-api-access-7xjmd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:44:44 crc kubenswrapper[4895]: I1202 07:44:44.050441 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66d2c26b-ed57-435f-845e-e4d51a4d9aa3-kube-api-access-8zgns" (OuterVolumeSpecName: "kube-api-access-8zgns") pod "66d2c26b-ed57-435f-845e-e4d51a4d9aa3" (UID: "66d2c26b-ed57-435f-845e-e4d51a4d9aa3"). InnerVolumeSpecName "kube-api-access-8zgns". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:44:44 crc kubenswrapper[4895]: I1202 07:44:44.052022 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce68a83d-fdca-4ae6-8e1b-ab7dffb77bb8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ce68a83d-fdca-4ae6-8e1b-ab7dffb77bb8" (UID: "ce68a83d-fdca-4ae6-8e1b-ab7dffb77bb8"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:44:44 crc kubenswrapper[4895]: I1202 07:44:44.053423 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/137d4f28-0e97-4154-90f1-22426094ef5e-kube-api-access-kv5mg" (OuterVolumeSpecName: "kube-api-access-kv5mg") pod "137d4f28-0e97-4154-90f1-22426094ef5e" (UID: "137d4f28-0e97-4154-90f1-22426094ef5e"). InnerVolumeSpecName "kube-api-access-kv5mg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:44:44 crc kubenswrapper[4895]: I1202 07:44:44.131300 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w5hkq\" (UniqueName: \"kubernetes.io/projected/41f5b5b1-5555-42cd-b212-71f5e6c5d0c3-kube-api-access-w5hkq\") pod \"41f5b5b1-5555-42cd-b212-71f5e6c5d0c3\" (UID: \"41f5b5b1-5555-42cd-b212-71f5e6c5d0c3\") " Dec 02 07:44:44 crc kubenswrapper[4895]: I1202 07:44:44.131405 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/41f5b5b1-5555-42cd-b212-71f5e6c5d0c3-operator-scripts\") pod \"41f5b5b1-5555-42cd-b212-71f5e6c5d0c3\" (UID: \"41f5b5b1-5555-42cd-b212-71f5e6c5d0c3\") " Dec 02 07:44:44 crc kubenswrapper[4895]: I1202 07:44:44.132007 4895 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/137d4f28-0e97-4154-90f1-22426094ef5e-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 07:44:44 crc kubenswrapper[4895]: I1202 07:44:44.132035 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kv5mg\" (UniqueName: \"kubernetes.io/projected/137d4f28-0e97-4154-90f1-22426094ef5e-kube-api-access-kv5mg\") on node \"crc\" DevicePath \"\"" Dec 02 07:44:44 crc kubenswrapper[4895]: I1202 07:44:44.132051 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8zgns\" (UniqueName: 
\"kubernetes.io/projected/66d2c26b-ed57-435f-845e-e4d51a4d9aa3-kube-api-access-8zgns\") on node \"crc\" DevicePath \"\"" Dec 02 07:44:44 crc kubenswrapper[4895]: I1202 07:44:44.132069 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7xjmd\" (UniqueName: \"kubernetes.io/projected/2e30fe62-fac0-425f-ba6f-277033d652d1-kube-api-access-7xjmd\") on node \"crc\" DevicePath \"\"" Dec 02 07:44:44 crc kubenswrapper[4895]: I1202 07:44:44.132082 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w24jj\" (UniqueName: \"kubernetes.io/projected/ce68a83d-fdca-4ae6-8e1b-ab7dffb77bb8-kube-api-access-w24jj\") on node \"crc\" DevicePath \"\"" Dec 02 07:44:44 crc kubenswrapper[4895]: I1202 07:44:44.132096 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hp29h\" (UniqueName: \"kubernetes.io/projected/915ca98c-6878-4b7a-ba75-75ab97ce5900-kube-api-access-hp29h\") on node \"crc\" DevicePath \"\"" Dec 02 07:44:44 crc kubenswrapper[4895]: I1202 07:44:44.132109 4895 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ce68a83d-fdca-4ae6-8e1b-ab7dffb77bb8-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 07:44:44 crc kubenswrapper[4895]: I1202 07:44:44.132856 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/41f5b5b1-5555-42cd-b212-71f5e6c5d0c3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "41f5b5b1-5555-42cd-b212-71f5e6c5d0c3" (UID: "41f5b5b1-5555-42cd-b212-71f5e6c5d0c3"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:44:44 crc kubenswrapper[4895]: I1202 07:44:44.136582 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41f5b5b1-5555-42cd-b212-71f5e6c5d0c3-kube-api-access-w5hkq" (OuterVolumeSpecName: "kube-api-access-w5hkq") pod "41f5b5b1-5555-42cd-b212-71f5e6c5d0c3" (UID: "41f5b5b1-5555-42cd-b212-71f5e6c5d0c3"). InnerVolumeSpecName "kube-api-access-w5hkq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:44:44 crc kubenswrapper[4895]: I1202 07:44:44.277033 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w5hkq\" (UniqueName: \"kubernetes.io/projected/41f5b5b1-5555-42cd-b212-71f5e6c5d0c3-kube-api-access-w5hkq\") on node \"crc\" DevicePath \"\"" Dec 02 07:44:44 crc kubenswrapper[4895]: I1202 07:44:44.277078 4895 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/41f5b5b1-5555-42cd-b212-71f5e6c5d0c3-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 07:44:44 crc kubenswrapper[4895]: I1202 07:44:44.285732 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-tx72w" event={"ID":"ecfcde89-8c82-43c8-b59b-4145640a2737","Type":"ContainerStarted","Data":"34ab968b34804011274e618923c761b412f63061861d97d3ae783f4629e6063e"} Dec 02 07:44:44 crc kubenswrapper[4895]: I1202 07:44:44.296026 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5a3b-account-create-update-ztphx" event={"ID":"2e30fe62-fac0-425f-ba6f-277033d652d1","Type":"ContainerDied","Data":"3ee2dbdb015f2f4aed494dbf143a8c9b0846051c174e1b3394453c77036fd637"} Dec 02 07:44:44 crc kubenswrapper[4895]: I1202 07:44:44.296102 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3ee2dbdb015f2f4aed494dbf143a8c9b0846051c174e1b3394453c77036fd637" Dec 02 07:44:44 crc kubenswrapper[4895]: I1202 07:44:44.296141 4895 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5a3b-account-create-update-ztphx" Dec 02 07:44:44 crc kubenswrapper[4895]: I1202 07:44:44.302036 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-7d85-account-create-update-bdsdf" event={"ID":"41f5b5b1-5555-42cd-b212-71f5e6c5d0c3","Type":"ContainerDied","Data":"bd9cfa6a280ac2d470cecf161e6334adbad95ae2edf82ec76c843ff58a9eb20e"} Dec 02 07:44:44 crc kubenswrapper[4895]: I1202 07:44:44.302084 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bd9cfa6a280ac2d470cecf161e6334adbad95ae2edf82ec76c843ff58a9eb20e" Dec 02 07:44:44 crc kubenswrapper[4895]: I1202 07:44:44.302163 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-7d85-account-create-update-bdsdf" Dec 02 07:44:44 crc kubenswrapper[4895]: I1202 07:44:44.309953 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-lgt57" event={"ID":"915ca98c-6878-4b7a-ba75-75ab97ce5900","Type":"ContainerDied","Data":"4a2a86ab55bc85840944e1b376c8085d9d5d2732b18925f04861fe72184ecec0"} Dec 02 07:44:44 crc kubenswrapper[4895]: I1202 07:44:44.310017 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4a2a86ab55bc85840944e1b376c8085d9d5d2732b18925f04861fe72184ecec0" Dec 02 07:44:44 crc kubenswrapper[4895]: I1202 07:44:44.310116 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-lgt57" Dec 02 07:44:44 crc kubenswrapper[4895]: I1202 07:44:44.319029 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-sfwtc" event={"ID":"66d2c26b-ed57-435f-845e-e4d51a4d9aa3","Type":"ContainerDied","Data":"c98ac889f08e8980537c847e65adb202153bbc00a4c9c8bfa5c410ea14a44ec4"} Dec 02 07:44:44 crc kubenswrapper[4895]: I1202 07:44:44.319079 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c98ac889f08e8980537c847e65adb202153bbc00a4c9c8bfa5c410ea14a44ec4" Dec 02 07:44:44 crc kubenswrapper[4895]: I1202 07:44:44.319144 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-sfwtc" Dec 02 07:44:44 crc kubenswrapper[4895]: I1202 07:44:44.321042 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-tx72w" podStartSLOduration=2.434145009 podStartE2EDuration="8.321017872s" podCreationTimestamp="2025-12-02 07:44:36 +0000 UTC" firstStartedPulling="2025-12-02 07:44:37.895803307 +0000 UTC m=+1289.066662920" lastFinishedPulling="2025-12-02 07:44:43.78267616 +0000 UTC m=+1294.953535783" observedRunningTime="2025-12-02 07:44:44.315376748 +0000 UTC m=+1295.486236371" watchObservedRunningTime="2025-12-02 07:44:44.321017872 +0000 UTC m=+1295.491877485" Dec 02 07:44:44 crc kubenswrapper[4895]: I1202 07:44:44.323784 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-4aa4-account-create-update-t4vvh" Dec 02 07:44:44 crc kubenswrapper[4895]: I1202 07:44:44.323772 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-4aa4-account-create-update-t4vvh" event={"ID":"137d4f28-0e97-4154-90f1-22426094ef5e","Type":"ContainerDied","Data":"b467d2ea9e92c4d103303fc85ca8eef2e0eee04d16f4f8ab425308b29f0f03b6"} Dec 02 07:44:44 crc kubenswrapper[4895]: I1202 07:44:44.324023 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b467d2ea9e92c4d103303fc85ca8eef2e0eee04d16f4f8ab425308b29f0f03b6" Dec 02 07:44:44 crc kubenswrapper[4895]: I1202 07:44:44.325053 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-skg8w" event={"ID":"ce68a83d-fdca-4ae6-8e1b-ab7dffb77bb8","Type":"ContainerDied","Data":"204ad33bc58727a14c4e85b947b8fdccf25debb7de3065b53d0dab1300daa59c"} Dec 02 07:44:44 crc kubenswrapper[4895]: I1202 07:44:44.325077 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="204ad33bc58727a14c4e85b947b8fdccf25debb7de3065b53d0dab1300daa59c" Dec 02 07:44:44 crc kubenswrapper[4895]: I1202 07:44:44.325123 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-skg8w" Dec 02 07:44:46 crc kubenswrapper[4895]: I1202 07:44:46.970001 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-764c5664d7-hdjr4" Dec 02 07:44:47 crc kubenswrapper[4895]: I1202 07:44:47.029927 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-lsqg6"] Dec 02 07:44:47 crc kubenswrapper[4895]: I1202 07:44:47.030573 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-698758b865-lsqg6" podUID="b50f18a4-6f7c-4d40-ad68-e6e55c5edd50" containerName="dnsmasq-dns" containerID="cri-o://bfa758772dddbb174854624875a511c8200e31a9842ba6666340092013875a32" gracePeriod=10 Dec 02 07:44:48 crc kubenswrapper[4895]: I1202 07:44:47.438481 4895 generic.go:334] "Generic (PLEG): container finished" podID="ecfcde89-8c82-43c8-b59b-4145640a2737" containerID="34ab968b34804011274e618923c761b412f63061861d97d3ae783f4629e6063e" exitCode=0 Dec 02 07:44:48 crc kubenswrapper[4895]: I1202 07:44:47.438972 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-tx72w" event={"ID":"ecfcde89-8c82-43c8-b59b-4145640a2737","Type":"ContainerDied","Data":"34ab968b34804011274e618923c761b412f63061861d97d3ae783f4629e6063e"} Dec 02 07:44:48 crc kubenswrapper[4895]: I1202 07:44:47.442458 4895 generic.go:334] "Generic (PLEG): container finished" podID="b50f18a4-6f7c-4d40-ad68-e6e55c5edd50" containerID="bfa758772dddbb174854624875a511c8200e31a9842ba6666340092013875a32" exitCode=0 Dec 02 07:44:48 crc kubenswrapper[4895]: I1202 07:44:47.442478 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-lsqg6" event={"ID":"b50f18a4-6f7c-4d40-ad68-e6e55c5edd50","Type":"ContainerDied","Data":"bfa758772dddbb174854624875a511c8200e31a9842ba6666340092013875a32"} Dec 02 07:44:48 crc kubenswrapper[4895]: I1202 07:44:47.594000 4895 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-lsqg6" Dec 02 07:44:48 crc kubenswrapper[4895]: I1202 07:44:47.756570 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b50f18a4-6f7c-4d40-ad68-e6e55c5edd50-ovsdbserver-sb\") pod \"b50f18a4-6f7c-4d40-ad68-e6e55c5edd50\" (UID: \"b50f18a4-6f7c-4d40-ad68-e6e55c5edd50\") " Dec 02 07:44:48 crc kubenswrapper[4895]: I1202 07:44:47.756674 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b50f18a4-6f7c-4d40-ad68-e6e55c5edd50-ovsdbserver-nb\") pod \"b50f18a4-6f7c-4d40-ad68-e6e55c5edd50\" (UID: \"b50f18a4-6f7c-4d40-ad68-e6e55c5edd50\") " Dec 02 07:44:48 crc kubenswrapper[4895]: I1202 07:44:47.756731 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cz56z\" (UniqueName: \"kubernetes.io/projected/b50f18a4-6f7c-4d40-ad68-e6e55c5edd50-kube-api-access-cz56z\") pod \"b50f18a4-6f7c-4d40-ad68-e6e55c5edd50\" (UID: \"b50f18a4-6f7c-4d40-ad68-e6e55c5edd50\") " Dec 02 07:44:48 crc kubenswrapper[4895]: I1202 07:44:47.756873 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b50f18a4-6f7c-4d40-ad68-e6e55c5edd50-dns-svc\") pod \"b50f18a4-6f7c-4d40-ad68-e6e55c5edd50\" (UID: \"b50f18a4-6f7c-4d40-ad68-e6e55c5edd50\") " Dec 02 07:44:48 crc kubenswrapper[4895]: I1202 07:44:47.756900 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b50f18a4-6f7c-4d40-ad68-e6e55c5edd50-config\") pod \"b50f18a4-6f7c-4d40-ad68-e6e55c5edd50\" (UID: \"b50f18a4-6f7c-4d40-ad68-e6e55c5edd50\") " Dec 02 07:44:48 crc kubenswrapper[4895]: I1202 07:44:47.775358 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/b50f18a4-6f7c-4d40-ad68-e6e55c5edd50-kube-api-access-cz56z" (OuterVolumeSpecName: "kube-api-access-cz56z") pod "b50f18a4-6f7c-4d40-ad68-e6e55c5edd50" (UID: "b50f18a4-6f7c-4d40-ad68-e6e55c5edd50"). InnerVolumeSpecName "kube-api-access-cz56z". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:44:48 crc kubenswrapper[4895]: I1202 07:44:47.805712 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b50f18a4-6f7c-4d40-ad68-e6e55c5edd50-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b50f18a4-6f7c-4d40-ad68-e6e55c5edd50" (UID: "b50f18a4-6f7c-4d40-ad68-e6e55c5edd50"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:44:48 crc kubenswrapper[4895]: I1202 07:44:47.805879 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b50f18a4-6f7c-4d40-ad68-e6e55c5edd50-config" (OuterVolumeSpecName: "config") pod "b50f18a4-6f7c-4d40-ad68-e6e55c5edd50" (UID: "b50f18a4-6f7c-4d40-ad68-e6e55c5edd50"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:44:48 crc kubenswrapper[4895]: I1202 07:44:47.825302 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b50f18a4-6f7c-4d40-ad68-e6e55c5edd50-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b50f18a4-6f7c-4d40-ad68-e6e55c5edd50" (UID: "b50f18a4-6f7c-4d40-ad68-e6e55c5edd50"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:44:48 crc kubenswrapper[4895]: I1202 07:44:47.834225 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b50f18a4-6f7c-4d40-ad68-e6e55c5edd50-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b50f18a4-6f7c-4d40-ad68-e6e55c5edd50" (UID: "b50f18a4-6f7c-4d40-ad68-e6e55c5edd50"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:44:48 crc kubenswrapper[4895]: I1202 07:44:47.863523 4895 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b50f18a4-6f7c-4d40-ad68-e6e55c5edd50-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 02 07:44:48 crc kubenswrapper[4895]: I1202 07:44:47.863652 4895 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b50f18a4-6f7c-4d40-ad68-e6e55c5edd50-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 02 07:44:48 crc kubenswrapper[4895]: I1202 07:44:47.863683 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cz56z\" (UniqueName: \"kubernetes.io/projected/b50f18a4-6f7c-4d40-ad68-e6e55c5edd50-kube-api-access-cz56z\") on node \"crc\" DevicePath \"\"" Dec 02 07:44:48 crc kubenswrapper[4895]: I1202 07:44:47.863700 4895 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b50f18a4-6f7c-4d40-ad68-e6e55c5edd50-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 02 07:44:48 crc kubenswrapper[4895]: I1202 07:44:47.863716 4895 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b50f18a4-6f7c-4d40-ad68-e6e55c5edd50-config\") on node \"crc\" DevicePath \"\"" Dec 02 07:44:48 crc kubenswrapper[4895]: I1202 07:44:48.453032 4895 generic.go:334] "Generic (PLEG): container finished" podID="a756fe09-2c73-430d-be27-34caa885311c" containerID="25d16ba51aa46d2a8ba22560c34b650e3b46eec242185ecc21922126defa6823" exitCode=0 Dec 02 07:44:48 crc kubenswrapper[4895]: I1202 07:44:48.453103 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-vjgr8" event={"ID":"a756fe09-2c73-430d-be27-34caa885311c","Type":"ContainerDied","Data":"25d16ba51aa46d2a8ba22560c34b650e3b46eec242185ecc21922126defa6823"} Dec 02 07:44:48 crc kubenswrapper[4895]: I1202 
07:44:48.457640 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-lsqg6" Dec 02 07:44:48 crc kubenswrapper[4895]: I1202 07:44:48.457834 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-lsqg6" event={"ID":"b50f18a4-6f7c-4d40-ad68-e6e55c5edd50","Type":"ContainerDied","Data":"29e0b10a742c61c1b44cf3e5b72aa94dd4c52460f24c3e4d2e74597698eead5c"} Dec 02 07:44:48 crc kubenswrapper[4895]: I1202 07:44:48.457919 4895 scope.go:117] "RemoveContainer" containerID="bfa758772dddbb174854624875a511c8200e31a9842ba6666340092013875a32" Dec 02 07:44:48 crc kubenswrapper[4895]: I1202 07:44:48.506958 4895 scope.go:117] "RemoveContainer" containerID="3d5c5568581556187481cff77ef82cfda6eb170ba06edbc7ef83e95471ce7d63" Dec 02 07:44:48 crc kubenswrapper[4895]: I1202 07:44:48.510298 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-lsqg6"] Dec 02 07:44:48 crc kubenswrapper[4895]: I1202 07:44:48.547406 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-698758b865-lsqg6"] Dec 02 07:44:48 crc kubenswrapper[4895]: I1202 07:44:48.880123 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-tx72w" Dec 02 07:44:48 crc kubenswrapper[4895]: I1202 07:44:48.986513 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ecfcde89-8c82-43c8-b59b-4145640a2737-config-data\") pod \"ecfcde89-8c82-43c8-b59b-4145640a2737\" (UID: \"ecfcde89-8c82-43c8-b59b-4145640a2737\") " Dec 02 07:44:48 crc kubenswrapper[4895]: I1202 07:44:48.986645 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecfcde89-8c82-43c8-b59b-4145640a2737-combined-ca-bundle\") pod \"ecfcde89-8c82-43c8-b59b-4145640a2737\" (UID: \"ecfcde89-8c82-43c8-b59b-4145640a2737\") " Dec 02 07:44:48 crc kubenswrapper[4895]: I1202 07:44:48.986698 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ng4rw\" (UniqueName: \"kubernetes.io/projected/ecfcde89-8c82-43c8-b59b-4145640a2737-kube-api-access-ng4rw\") pod \"ecfcde89-8c82-43c8-b59b-4145640a2737\" (UID: \"ecfcde89-8c82-43c8-b59b-4145640a2737\") " Dec 02 07:44:48 crc kubenswrapper[4895]: I1202 07:44:48.993646 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ecfcde89-8c82-43c8-b59b-4145640a2737-kube-api-access-ng4rw" (OuterVolumeSpecName: "kube-api-access-ng4rw") pod "ecfcde89-8c82-43c8-b59b-4145640a2737" (UID: "ecfcde89-8c82-43c8-b59b-4145640a2737"). InnerVolumeSpecName "kube-api-access-ng4rw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:44:49 crc kubenswrapper[4895]: I1202 07:44:49.019219 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecfcde89-8c82-43c8-b59b-4145640a2737-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ecfcde89-8c82-43c8-b59b-4145640a2737" (UID: "ecfcde89-8c82-43c8-b59b-4145640a2737"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:44:49 crc kubenswrapper[4895]: I1202 07:44:49.038187 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecfcde89-8c82-43c8-b59b-4145640a2737-config-data" (OuterVolumeSpecName: "config-data") pod "ecfcde89-8c82-43c8-b59b-4145640a2737" (UID: "ecfcde89-8c82-43c8-b59b-4145640a2737"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:44:49 crc kubenswrapper[4895]: I1202 07:44:49.089506 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecfcde89-8c82-43c8-b59b-4145640a2737-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 07:44:49 crc kubenswrapper[4895]: I1202 07:44:49.091151 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ng4rw\" (UniqueName: \"kubernetes.io/projected/ecfcde89-8c82-43c8-b59b-4145640a2737-kube-api-access-ng4rw\") on node \"crc\" DevicePath \"\"" Dec 02 07:44:49 crc kubenswrapper[4895]: I1202 07:44:49.091209 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ecfcde89-8c82-43c8-b59b-4145640a2737-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 07:44:49 crc kubenswrapper[4895]: I1202 07:44:49.156702 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b50f18a4-6f7c-4d40-ad68-e6e55c5edd50" path="/var/lib/kubelet/pods/b50f18a4-6f7c-4d40-ad68-e6e55c5edd50/volumes" Dec 02 07:44:49 crc kubenswrapper[4895]: I1202 07:44:49.499152 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-tx72w" event={"ID":"ecfcde89-8c82-43c8-b59b-4145640a2737","Type":"ContainerDied","Data":"beb620f214ce8361718139c39cd662f28d3075378bff36006ef68b2088b167d4"} Dec 02 07:44:49 crc kubenswrapper[4895]: I1202 07:44:49.499660 4895 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="beb620f214ce8361718139c39cd662f28d3075378bff36006ef68b2088b167d4" Dec 02 07:44:49 crc kubenswrapper[4895]: I1202 07:44:49.499212 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-tx72w" Dec 02 07:44:49 crc kubenswrapper[4895]: I1202 07:44:49.708952 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5959f8865f-h77pb"] Dec 02 07:44:49 crc kubenswrapper[4895]: E1202 07:44:49.709409 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="915ca98c-6878-4b7a-ba75-75ab97ce5900" containerName="mariadb-database-create" Dec 02 07:44:49 crc kubenswrapper[4895]: I1202 07:44:49.709426 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="915ca98c-6878-4b7a-ba75-75ab97ce5900" containerName="mariadb-database-create" Dec 02 07:44:49 crc kubenswrapper[4895]: E1202 07:44:49.709443 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e30fe62-fac0-425f-ba6f-277033d652d1" containerName="mariadb-account-create-update" Dec 02 07:44:49 crc kubenswrapper[4895]: I1202 07:44:49.709450 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e30fe62-fac0-425f-ba6f-277033d652d1" containerName="mariadb-account-create-update" Dec 02 07:44:49 crc kubenswrapper[4895]: E1202 07:44:49.709466 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b50f18a4-6f7c-4d40-ad68-e6e55c5edd50" containerName="dnsmasq-dns" Dec 02 07:44:49 crc kubenswrapper[4895]: I1202 07:44:49.709471 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="b50f18a4-6f7c-4d40-ad68-e6e55c5edd50" containerName="dnsmasq-dns" Dec 02 07:44:49 crc kubenswrapper[4895]: E1202 07:44:49.709483 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b50f18a4-6f7c-4d40-ad68-e6e55c5edd50" containerName="init" Dec 02 07:44:49 crc kubenswrapper[4895]: I1202 07:44:49.709490 4895 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="b50f18a4-6f7c-4d40-ad68-e6e55c5edd50" containerName="init" Dec 02 07:44:49 crc kubenswrapper[4895]: E1202 07:44:49.709504 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="137d4f28-0e97-4154-90f1-22426094ef5e" containerName="mariadb-account-create-update" Dec 02 07:44:49 crc kubenswrapper[4895]: I1202 07:44:49.709511 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="137d4f28-0e97-4154-90f1-22426094ef5e" containerName="mariadb-account-create-update" Dec 02 07:44:49 crc kubenswrapper[4895]: E1202 07:44:49.709525 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66d2c26b-ed57-435f-845e-e4d51a4d9aa3" containerName="mariadb-database-create" Dec 02 07:44:49 crc kubenswrapper[4895]: I1202 07:44:49.709531 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="66d2c26b-ed57-435f-845e-e4d51a4d9aa3" containerName="mariadb-database-create" Dec 02 07:44:49 crc kubenswrapper[4895]: E1202 07:44:49.709544 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41f5b5b1-5555-42cd-b212-71f5e6c5d0c3" containerName="mariadb-account-create-update" Dec 02 07:44:49 crc kubenswrapper[4895]: I1202 07:44:49.709552 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="41f5b5b1-5555-42cd-b212-71f5e6c5d0c3" containerName="mariadb-account-create-update" Dec 02 07:44:49 crc kubenswrapper[4895]: E1202 07:44:49.709559 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecfcde89-8c82-43c8-b59b-4145640a2737" containerName="keystone-db-sync" Dec 02 07:44:49 crc kubenswrapper[4895]: I1202 07:44:49.709564 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecfcde89-8c82-43c8-b59b-4145640a2737" containerName="keystone-db-sync" Dec 02 07:44:49 crc kubenswrapper[4895]: E1202 07:44:49.709585 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce68a83d-fdca-4ae6-8e1b-ab7dffb77bb8" containerName="mariadb-database-create" Dec 02 07:44:49 crc kubenswrapper[4895]: I1202 
07:44:49.709591 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce68a83d-fdca-4ae6-8e1b-ab7dffb77bb8" containerName="mariadb-database-create" Dec 02 07:44:49 crc kubenswrapper[4895]: I1202 07:44:49.709786 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="66d2c26b-ed57-435f-845e-e4d51a4d9aa3" containerName="mariadb-database-create" Dec 02 07:44:49 crc kubenswrapper[4895]: I1202 07:44:49.709808 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e30fe62-fac0-425f-ba6f-277033d652d1" containerName="mariadb-account-create-update" Dec 02 07:44:49 crc kubenswrapper[4895]: I1202 07:44:49.709815 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="915ca98c-6878-4b7a-ba75-75ab97ce5900" containerName="mariadb-database-create" Dec 02 07:44:49 crc kubenswrapper[4895]: I1202 07:44:49.709828 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="137d4f28-0e97-4154-90f1-22426094ef5e" containerName="mariadb-account-create-update" Dec 02 07:44:49 crc kubenswrapper[4895]: I1202 07:44:49.709837 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce68a83d-fdca-4ae6-8e1b-ab7dffb77bb8" containerName="mariadb-database-create" Dec 02 07:44:49 crc kubenswrapper[4895]: I1202 07:44:49.709849 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="ecfcde89-8c82-43c8-b59b-4145640a2737" containerName="keystone-db-sync" Dec 02 07:44:49 crc kubenswrapper[4895]: I1202 07:44:49.709858 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="41f5b5b1-5555-42cd-b212-71f5e6c5d0c3" containerName="mariadb-account-create-update" Dec 02 07:44:49 crc kubenswrapper[4895]: I1202 07:44:49.709869 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="b50f18a4-6f7c-4d40-ad68-e6e55c5edd50" containerName="dnsmasq-dns" Dec 02 07:44:49 crc kubenswrapper[4895]: I1202 07:44:49.710974 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5959f8865f-h77pb" Dec 02 07:44:49 crc kubenswrapper[4895]: I1202 07:44:49.725927 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5959f8865f-h77pb"] Dec 02 07:44:49 crc kubenswrapper[4895]: I1202 07:44:49.740779 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-884pr"] Dec 02 07:44:49 crc kubenswrapper[4895]: I1202 07:44:49.742687 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-884pr" Dec 02 07:44:49 crc kubenswrapper[4895]: I1202 07:44:49.745360 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-qm9nx" Dec 02 07:44:49 crc kubenswrapper[4895]: I1202 07:44:49.745664 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 02 07:44:49 crc kubenswrapper[4895]: I1202 07:44:49.745826 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 02 07:44:49 crc kubenswrapper[4895]: I1202 07:44:49.745675 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 02 07:44:49 crc kubenswrapper[4895]: I1202 07:44:49.747976 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-884pr"] Dec 02 07:44:49 crc kubenswrapper[4895]: I1202 07:44:49.759330 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 02 07:44:49 crc kubenswrapper[4895]: I1202 07:44:49.825775 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f8b510ff-ce03-4faf-8c18-7cb50debede2-dns-svc\") pod \"dnsmasq-dns-5959f8865f-h77pb\" (UID: \"f8b510ff-ce03-4faf-8c18-7cb50debede2\") " pod="openstack/dnsmasq-dns-5959f8865f-h77pb" Dec 02 07:44:49 crc kubenswrapper[4895]: I1202 07:44:49.825860 
4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6eb223e5-3856-4849-881f-86683f0e8bc9-config-data\") pod \"keystone-bootstrap-884pr\" (UID: \"6eb223e5-3856-4849-881f-86683f0e8bc9\") " pod="openstack/keystone-bootstrap-884pr" Dec 02 07:44:49 crc kubenswrapper[4895]: I1202 07:44:49.825882 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmkkp\" (UniqueName: \"kubernetes.io/projected/6eb223e5-3856-4849-881f-86683f0e8bc9-kube-api-access-bmkkp\") pod \"keystone-bootstrap-884pr\" (UID: \"6eb223e5-3856-4849-881f-86683f0e8bc9\") " pod="openstack/keystone-bootstrap-884pr" Dec 02 07:44:49 crc kubenswrapper[4895]: I1202 07:44:49.825914 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6eb223e5-3856-4849-881f-86683f0e8bc9-fernet-keys\") pod \"keystone-bootstrap-884pr\" (UID: \"6eb223e5-3856-4849-881f-86683f0e8bc9\") " pod="openstack/keystone-bootstrap-884pr" Dec 02 07:44:49 crc kubenswrapper[4895]: I1202 07:44:49.825940 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6eb223e5-3856-4849-881f-86683f0e8bc9-scripts\") pod \"keystone-bootstrap-884pr\" (UID: \"6eb223e5-3856-4849-881f-86683f0e8bc9\") " pod="openstack/keystone-bootstrap-884pr" Dec 02 07:44:49 crc kubenswrapper[4895]: I1202 07:44:49.826027 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f8b510ff-ce03-4faf-8c18-7cb50debede2-config\") pod \"dnsmasq-dns-5959f8865f-h77pb\" (UID: \"f8b510ff-ce03-4faf-8c18-7cb50debede2\") " pod="openstack/dnsmasq-dns-5959f8865f-h77pb" Dec 02 07:44:49 crc kubenswrapper[4895]: I1202 07:44:49.826043 4895 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f8b510ff-ce03-4faf-8c18-7cb50debede2-dns-swift-storage-0\") pod \"dnsmasq-dns-5959f8865f-h77pb\" (UID: \"f8b510ff-ce03-4faf-8c18-7cb50debede2\") " pod="openstack/dnsmasq-dns-5959f8865f-h77pb" Dec 02 07:44:49 crc kubenswrapper[4895]: I1202 07:44:49.826068 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6eb223e5-3856-4849-881f-86683f0e8bc9-credential-keys\") pod \"keystone-bootstrap-884pr\" (UID: \"6eb223e5-3856-4849-881f-86683f0e8bc9\") " pod="openstack/keystone-bootstrap-884pr" Dec 02 07:44:49 crc kubenswrapper[4895]: I1202 07:44:49.826095 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6eb223e5-3856-4849-881f-86683f0e8bc9-combined-ca-bundle\") pod \"keystone-bootstrap-884pr\" (UID: \"6eb223e5-3856-4849-881f-86683f0e8bc9\") " pod="openstack/keystone-bootstrap-884pr" Dec 02 07:44:49 crc kubenswrapper[4895]: I1202 07:44:49.826112 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbznt\" (UniqueName: \"kubernetes.io/projected/f8b510ff-ce03-4faf-8c18-7cb50debede2-kube-api-access-xbznt\") pod \"dnsmasq-dns-5959f8865f-h77pb\" (UID: \"f8b510ff-ce03-4faf-8c18-7cb50debede2\") " pod="openstack/dnsmasq-dns-5959f8865f-h77pb" Dec 02 07:44:49 crc kubenswrapper[4895]: I1202 07:44:49.826133 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f8b510ff-ce03-4faf-8c18-7cb50debede2-ovsdbserver-nb\") pod \"dnsmasq-dns-5959f8865f-h77pb\" (UID: \"f8b510ff-ce03-4faf-8c18-7cb50debede2\") " pod="openstack/dnsmasq-dns-5959f8865f-h77pb" Dec 02 07:44:49 
crc kubenswrapper[4895]: I1202 07:44:49.826151 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f8b510ff-ce03-4faf-8c18-7cb50debede2-ovsdbserver-sb\") pod \"dnsmasq-dns-5959f8865f-h77pb\" (UID: \"f8b510ff-ce03-4faf-8c18-7cb50debede2\") " pod="openstack/dnsmasq-dns-5959f8865f-h77pb" Dec 02 07:44:49 crc kubenswrapper[4895]: I1202 07:44:49.909681 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-9lrrh"] Dec 02 07:44:49 crc kubenswrapper[4895]: I1202 07:44:49.911151 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-9lrrh" Dec 02 07:44:49 crc kubenswrapper[4895]: I1202 07:44:49.926048 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 02 07:44:49 crc kubenswrapper[4895]: I1202 07:44:49.927429 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f8b510ff-ce03-4faf-8c18-7cb50debede2-dns-svc\") pod \"dnsmasq-dns-5959f8865f-h77pb\" (UID: \"f8b510ff-ce03-4faf-8c18-7cb50debede2\") " pod="openstack/dnsmasq-dns-5959f8865f-h77pb" Dec 02 07:44:49 crc kubenswrapper[4895]: I1202 07:44:49.927485 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6eb223e5-3856-4849-881f-86683f0e8bc9-config-data\") pod \"keystone-bootstrap-884pr\" (UID: \"6eb223e5-3856-4849-881f-86683f0e8bc9\") " pod="openstack/keystone-bootstrap-884pr" Dec 02 07:44:49 crc kubenswrapper[4895]: I1202 07:44:49.927505 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bmkkp\" (UniqueName: \"kubernetes.io/projected/6eb223e5-3856-4849-881f-86683f0e8bc9-kube-api-access-bmkkp\") pod \"keystone-bootstrap-884pr\" (UID: \"6eb223e5-3856-4849-881f-86683f0e8bc9\") " 
pod="openstack/keystone-bootstrap-884pr" Dec 02 07:44:49 crc kubenswrapper[4895]: I1202 07:44:49.927535 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6eb223e5-3856-4849-881f-86683f0e8bc9-fernet-keys\") pod \"keystone-bootstrap-884pr\" (UID: \"6eb223e5-3856-4849-881f-86683f0e8bc9\") " pod="openstack/keystone-bootstrap-884pr" Dec 02 07:44:49 crc kubenswrapper[4895]: I1202 07:44:49.927558 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6eb223e5-3856-4849-881f-86683f0e8bc9-scripts\") pod \"keystone-bootstrap-884pr\" (UID: \"6eb223e5-3856-4849-881f-86683f0e8bc9\") " pod="openstack/keystone-bootstrap-884pr" Dec 02 07:44:49 crc kubenswrapper[4895]: I1202 07:44:49.927625 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f8b510ff-ce03-4faf-8c18-7cb50debede2-config\") pod \"dnsmasq-dns-5959f8865f-h77pb\" (UID: \"f8b510ff-ce03-4faf-8c18-7cb50debede2\") " pod="openstack/dnsmasq-dns-5959f8865f-h77pb" Dec 02 07:44:49 crc kubenswrapper[4895]: I1202 07:44:49.927648 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f8b510ff-ce03-4faf-8c18-7cb50debede2-dns-swift-storage-0\") pod \"dnsmasq-dns-5959f8865f-h77pb\" (UID: \"f8b510ff-ce03-4faf-8c18-7cb50debede2\") " pod="openstack/dnsmasq-dns-5959f8865f-h77pb" Dec 02 07:44:49 crc kubenswrapper[4895]: I1202 07:44:49.927667 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6eb223e5-3856-4849-881f-86683f0e8bc9-credential-keys\") pod \"keystone-bootstrap-884pr\" (UID: \"6eb223e5-3856-4849-881f-86683f0e8bc9\") " pod="openstack/keystone-bootstrap-884pr" Dec 02 07:44:49 crc kubenswrapper[4895]: I1202 07:44:49.927695 4895 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6eb223e5-3856-4849-881f-86683f0e8bc9-combined-ca-bundle\") pod \"keystone-bootstrap-884pr\" (UID: \"6eb223e5-3856-4849-881f-86683f0e8bc9\") " pod="openstack/keystone-bootstrap-884pr" Dec 02 07:44:49 crc kubenswrapper[4895]: I1202 07:44:49.927714 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbznt\" (UniqueName: \"kubernetes.io/projected/f8b510ff-ce03-4faf-8c18-7cb50debede2-kube-api-access-xbznt\") pod \"dnsmasq-dns-5959f8865f-h77pb\" (UID: \"f8b510ff-ce03-4faf-8c18-7cb50debede2\") " pod="openstack/dnsmasq-dns-5959f8865f-h77pb" Dec 02 07:44:49 crc kubenswrapper[4895]: I1202 07:44:49.927757 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f8b510ff-ce03-4faf-8c18-7cb50debede2-ovsdbserver-nb\") pod \"dnsmasq-dns-5959f8865f-h77pb\" (UID: \"f8b510ff-ce03-4faf-8c18-7cb50debede2\") " pod="openstack/dnsmasq-dns-5959f8865f-h77pb" Dec 02 07:44:49 crc kubenswrapper[4895]: I1202 07:44:49.927779 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f8b510ff-ce03-4faf-8c18-7cb50debede2-ovsdbserver-sb\") pod \"dnsmasq-dns-5959f8865f-h77pb\" (UID: \"f8b510ff-ce03-4faf-8c18-7cb50debede2\") " pod="openstack/dnsmasq-dns-5959f8865f-h77pb" Dec 02 07:44:49 crc kubenswrapper[4895]: I1202 07:44:49.928298 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-xr2dd" Dec 02 07:44:49 crc kubenswrapper[4895]: I1202 07:44:49.928591 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 02 07:44:49 crc kubenswrapper[4895]: I1202 07:44:49.929026 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/f8b510ff-ce03-4faf-8c18-7cb50debede2-dns-svc\") pod \"dnsmasq-dns-5959f8865f-h77pb\" (UID: \"f8b510ff-ce03-4faf-8c18-7cb50debede2\") " pod="openstack/dnsmasq-dns-5959f8865f-h77pb" Dec 02 07:44:49 crc kubenswrapper[4895]: I1202 07:44:49.929646 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f8b510ff-ce03-4faf-8c18-7cb50debede2-config\") pod \"dnsmasq-dns-5959f8865f-h77pb\" (UID: \"f8b510ff-ce03-4faf-8c18-7cb50debede2\") " pod="openstack/dnsmasq-dns-5959f8865f-h77pb" Dec 02 07:44:49 crc kubenswrapper[4895]: I1202 07:44:49.929700 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f8b510ff-ce03-4faf-8c18-7cb50debede2-dns-swift-storage-0\") pod \"dnsmasq-dns-5959f8865f-h77pb\" (UID: \"f8b510ff-ce03-4faf-8c18-7cb50debede2\") " pod="openstack/dnsmasq-dns-5959f8865f-h77pb" Dec 02 07:44:49 crc kubenswrapper[4895]: I1202 07:44:49.929867 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f8b510ff-ce03-4faf-8c18-7cb50debede2-ovsdbserver-nb\") pod \"dnsmasq-dns-5959f8865f-h77pb\" (UID: \"f8b510ff-ce03-4faf-8c18-7cb50debede2\") " pod="openstack/dnsmasq-dns-5959f8865f-h77pb" Dec 02 07:44:49 crc kubenswrapper[4895]: I1202 07:44:49.942209 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f8b510ff-ce03-4faf-8c18-7cb50debede2-ovsdbserver-sb\") pod \"dnsmasq-dns-5959f8865f-h77pb\" (UID: \"f8b510ff-ce03-4faf-8c18-7cb50debede2\") " pod="openstack/dnsmasq-dns-5959f8865f-h77pb" Dec 02 07:44:49 crc kubenswrapper[4895]: I1202 07:44:49.944075 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6eb223e5-3856-4849-881f-86683f0e8bc9-credential-keys\") pod 
\"keystone-bootstrap-884pr\" (UID: \"6eb223e5-3856-4849-881f-86683f0e8bc9\") " pod="openstack/keystone-bootstrap-884pr" Dec 02 07:44:49 crc kubenswrapper[4895]: I1202 07:44:49.947614 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6eb223e5-3856-4849-881f-86683f0e8bc9-config-data\") pod \"keystone-bootstrap-884pr\" (UID: \"6eb223e5-3856-4849-881f-86683f0e8bc9\") " pod="openstack/keystone-bootstrap-884pr" Dec 02 07:44:49 crc kubenswrapper[4895]: I1202 07:44:49.961254 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6eb223e5-3856-4849-881f-86683f0e8bc9-fernet-keys\") pod \"keystone-bootstrap-884pr\" (UID: \"6eb223e5-3856-4849-881f-86683f0e8bc9\") " pod="openstack/keystone-bootstrap-884pr" Dec 02 07:44:49 crc kubenswrapper[4895]: I1202 07:44:49.967908 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbznt\" (UniqueName: \"kubernetes.io/projected/f8b510ff-ce03-4faf-8c18-7cb50debede2-kube-api-access-xbznt\") pod \"dnsmasq-dns-5959f8865f-h77pb\" (UID: \"f8b510ff-ce03-4faf-8c18-7cb50debede2\") " pod="openstack/dnsmasq-dns-5959f8865f-h77pb" Dec 02 07:44:49 crc kubenswrapper[4895]: I1202 07:44:49.972068 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6eb223e5-3856-4849-881f-86683f0e8bc9-combined-ca-bundle\") pod \"keystone-bootstrap-884pr\" (UID: \"6eb223e5-3856-4849-881f-86683f0e8bc9\") " pod="openstack/keystone-bootstrap-884pr" Dec 02 07:44:49 crc kubenswrapper[4895]: I1202 07:44:49.974255 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6eb223e5-3856-4849-881f-86683f0e8bc9-scripts\") pod \"keystone-bootstrap-884pr\" (UID: \"6eb223e5-3856-4849-881f-86683f0e8bc9\") " pod="openstack/keystone-bootstrap-884pr" Dec 02 07:44:49 crc 
kubenswrapper[4895]: I1202 07:44:49.982292 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-9lrrh"] Dec 02 07:44:49 crc kubenswrapper[4895]: I1202 07:44:49.982954 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmkkp\" (UniqueName: \"kubernetes.io/projected/6eb223e5-3856-4849-881f-86683f0e8bc9-kube-api-access-bmkkp\") pod \"keystone-bootstrap-884pr\" (UID: \"6eb223e5-3856-4849-881f-86683f0e8bc9\") " pod="openstack/keystone-bootstrap-884pr" Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.031220 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6d3d0dfa-b0dd-4b27-8751-3483a85dc490-config\") pod \"neutron-db-sync-9lrrh\" (UID: \"6d3d0dfa-b0dd-4b27-8751-3483a85dc490\") " pod="openstack/neutron-db-sync-9lrrh" Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.031684 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdk67\" (UniqueName: \"kubernetes.io/projected/6d3d0dfa-b0dd-4b27-8751-3483a85dc490-kube-api-access-bdk67\") pod \"neutron-db-sync-9lrrh\" (UID: \"6d3d0dfa-b0dd-4b27-8751-3483a85dc490\") " pod="openstack/neutron-db-sync-9lrrh" Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.031712 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d3d0dfa-b0dd-4b27-8751-3483a85dc490-combined-ca-bundle\") pod \"neutron-db-sync-9lrrh\" (UID: \"6d3d0dfa-b0dd-4b27-8751-3483a85dc490\") " pod="openstack/neutron-db-sync-9lrrh" Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.037143 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5959f8865f-h77pb" Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.056846 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-44vd8"] Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.058195 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-44vd8" Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.065815 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-vbt2f" Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.066086 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.066250 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.075230 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-884pr" Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.086822 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-44vd8"] Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.135949 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/56723c9c-15bf-4eaa-896c-ea5d07066b27-db-sync-config-data\") pod \"cinder-db-sync-44vd8\" (UID: \"56723c9c-15bf-4eaa-896c-ea5d07066b27\") " pod="openstack/cinder-db-sync-44vd8" Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.136018 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56723c9c-15bf-4eaa-896c-ea5d07066b27-combined-ca-bundle\") pod \"cinder-db-sync-44vd8\" (UID: \"56723c9c-15bf-4eaa-896c-ea5d07066b27\") " pod="openstack/cinder-db-sync-44vd8" Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.136043 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/56723c9c-15bf-4eaa-896c-ea5d07066b27-scripts\") pod \"cinder-db-sync-44vd8\" (UID: \"56723c9c-15bf-4eaa-896c-ea5d07066b27\") " pod="openstack/cinder-db-sync-44vd8" Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.136080 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/56723c9c-15bf-4eaa-896c-ea5d07066b27-etc-machine-id\") pod \"cinder-db-sync-44vd8\" (UID: \"56723c9c-15bf-4eaa-896c-ea5d07066b27\") " pod="openstack/cinder-db-sync-44vd8" Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.136098 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24hg5\" (UniqueName: 
\"kubernetes.io/projected/56723c9c-15bf-4eaa-896c-ea5d07066b27-kube-api-access-24hg5\") pod \"cinder-db-sync-44vd8\" (UID: \"56723c9c-15bf-4eaa-896c-ea5d07066b27\") " pod="openstack/cinder-db-sync-44vd8" Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.136125 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6d3d0dfa-b0dd-4b27-8751-3483a85dc490-config\") pod \"neutron-db-sync-9lrrh\" (UID: \"6d3d0dfa-b0dd-4b27-8751-3483a85dc490\") " pod="openstack/neutron-db-sync-9lrrh" Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.136484 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bdk67\" (UniqueName: \"kubernetes.io/projected/6d3d0dfa-b0dd-4b27-8751-3483a85dc490-kube-api-access-bdk67\") pod \"neutron-db-sync-9lrrh\" (UID: \"6d3d0dfa-b0dd-4b27-8751-3483a85dc490\") " pod="openstack/neutron-db-sync-9lrrh" Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.136560 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d3d0dfa-b0dd-4b27-8751-3483a85dc490-combined-ca-bundle\") pod \"neutron-db-sync-9lrrh\" (UID: \"6d3d0dfa-b0dd-4b27-8751-3483a85dc490\") " pod="openstack/neutron-db-sync-9lrrh" Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.139706 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56723c9c-15bf-4eaa-896c-ea5d07066b27-config-data\") pod \"cinder-db-sync-44vd8\" (UID: \"56723c9c-15bf-4eaa-896c-ea5d07066b27\") " pod="openstack/cinder-db-sync-44vd8" Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.161332 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/6d3d0dfa-b0dd-4b27-8751-3483a85dc490-config\") pod \"neutron-db-sync-9lrrh\" (UID: 
\"6d3d0dfa-b0dd-4b27-8751-3483a85dc490\") " pod="openstack/neutron-db-sync-9lrrh" Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.170110 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d3d0dfa-b0dd-4b27-8751-3483a85dc490-combined-ca-bundle\") pod \"neutron-db-sync-9lrrh\" (UID: \"6d3d0dfa-b0dd-4b27-8751-3483a85dc490\") " pod="openstack/neutron-db-sync-9lrrh" Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.204103 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5959f8865f-h77pb"] Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.211494 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdk67\" (UniqueName: \"kubernetes.io/projected/6d3d0dfa-b0dd-4b27-8751-3483a85dc490-kube-api-access-bdk67\") pod \"neutron-db-sync-9lrrh\" (UID: \"6d3d0dfa-b0dd-4b27-8751-3483a85dc490\") " pod="openstack/neutron-db-sync-9lrrh" Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.225556 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-58dd9ff6bc-ljwg7"] Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.227232 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-58dd9ff6bc-ljwg7" Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.242056 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56723c9c-15bf-4eaa-896c-ea5d07066b27-config-data\") pod \"cinder-db-sync-44vd8\" (UID: \"56723c9c-15bf-4eaa-896c-ea5d07066b27\") " pod="openstack/cinder-db-sync-44vd8" Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.242141 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1073f594-fe1e-47df-8db9-ad04fc701143-ovsdbserver-sb\") pod \"dnsmasq-dns-58dd9ff6bc-ljwg7\" (UID: \"1073f594-fe1e-47df-8db9-ad04fc701143\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-ljwg7" Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.242188 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/56723c9c-15bf-4eaa-896c-ea5d07066b27-db-sync-config-data\") pod \"cinder-db-sync-44vd8\" (UID: \"56723c9c-15bf-4eaa-896c-ea5d07066b27\") " pod="openstack/cinder-db-sync-44vd8" Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.242227 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1073f594-fe1e-47df-8db9-ad04fc701143-dns-svc\") pod \"dnsmasq-dns-58dd9ff6bc-ljwg7\" (UID: \"1073f594-fe1e-47df-8db9-ad04fc701143\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-ljwg7" Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.242250 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56723c9c-15bf-4eaa-896c-ea5d07066b27-combined-ca-bundle\") pod \"cinder-db-sync-44vd8\" (UID: \"56723c9c-15bf-4eaa-896c-ea5d07066b27\") " pod="openstack/cinder-db-sync-44vd8" Dec 
02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.242272 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/56723c9c-15bf-4eaa-896c-ea5d07066b27-scripts\") pod \"cinder-db-sync-44vd8\" (UID: \"56723c9c-15bf-4eaa-896c-ea5d07066b27\") " pod="openstack/cinder-db-sync-44vd8" Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.242307 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/56723c9c-15bf-4eaa-896c-ea5d07066b27-etc-machine-id\") pod \"cinder-db-sync-44vd8\" (UID: \"56723c9c-15bf-4eaa-896c-ea5d07066b27\") " pod="openstack/cinder-db-sync-44vd8" Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.242333 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24hg5\" (UniqueName: \"kubernetes.io/projected/56723c9c-15bf-4eaa-896c-ea5d07066b27-kube-api-access-24hg5\") pod \"cinder-db-sync-44vd8\" (UID: \"56723c9c-15bf-4eaa-896c-ea5d07066b27\") " pod="openstack/cinder-db-sync-44vd8" Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.242355 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1073f594-fe1e-47df-8db9-ad04fc701143-config\") pod \"dnsmasq-dns-58dd9ff6bc-ljwg7\" (UID: \"1073f594-fe1e-47df-8db9-ad04fc701143\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-ljwg7" Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.242375 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjmmc\" (UniqueName: \"kubernetes.io/projected/1073f594-fe1e-47df-8db9-ad04fc701143-kube-api-access-gjmmc\") pod \"dnsmasq-dns-58dd9ff6bc-ljwg7\" (UID: \"1073f594-fe1e-47df-8db9-ad04fc701143\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-ljwg7" Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.242405 4895 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1073f594-fe1e-47df-8db9-ad04fc701143-ovsdbserver-nb\") pod \"dnsmasq-dns-58dd9ff6bc-ljwg7\" (UID: \"1073f594-fe1e-47df-8db9-ad04fc701143\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-ljwg7" Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.242436 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1073f594-fe1e-47df-8db9-ad04fc701143-dns-swift-storage-0\") pod \"dnsmasq-dns-58dd9ff6bc-ljwg7\" (UID: \"1073f594-fe1e-47df-8db9-ad04fc701143\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-ljwg7" Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.244919 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/56723c9c-15bf-4eaa-896c-ea5d07066b27-etc-machine-id\") pod \"cinder-db-sync-44vd8\" (UID: \"56723c9c-15bf-4eaa-896c-ea5d07066b27\") " pod="openstack/cinder-db-sync-44vd8" Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.248406 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56723c9c-15bf-4eaa-896c-ea5d07066b27-config-data\") pod \"cinder-db-sync-44vd8\" (UID: \"56723c9c-15bf-4eaa-896c-ea5d07066b27\") " pod="openstack/cinder-db-sync-44vd8" Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.256061 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/56723c9c-15bf-4eaa-896c-ea5d07066b27-db-sync-config-data\") pod \"cinder-db-sync-44vd8\" (UID: \"56723c9c-15bf-4eaa-896c-ea5d07066b27\") " pod="openstack/cinder-db-sync-44vd8" Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.261997 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56723c9c-15bf-4eaa-896c-ea5d07066b27-combined-ca-bundle\") pod \"cinder-db-sync-44vd8\" (UID: \"56723c9c-15bf-4eaa-896c-ea5d07066b27\") " pod="openstack/cinder-db-sync-44vd8" Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.262499 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-prmpk"] Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.264318 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-prmpk" Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.280135 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.284774 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.285386 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-86fdd" Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.285472 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.285517 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.287256 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.287676 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.288079 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58dd9ff6bc-ljwg7"] Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.335206 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/56723c9c-15bf-4eaa-896c-ea5d07066b27-scripts\") pod \"cinder-db-sync-44vd8\" (UID: \"56723c9c-15bf-4eaa-896c-ea5d07066b27\") " pod="openstack/cinder-db-sync-44vd8" Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.344179 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1073f594-fe1e-47df-8db9-ad04fc701143-ovsdbserver-sb\") pod \"dnsmasq-dns-58dd9ff6bc-ljwg7\" (UID: \"1073f594-fe1e-47df-8db9-ad04fc701143\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-ljwg7" Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.344233 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07cedd96-e60e-40e6-9ae7-c29728b9e62c-combined-ca-bundle\") pod \"placement-db-sync-prmpk\" (UID: \"07cedd96-e60e-40e6-9ae7-c29728b9e62c\") " pod="openstack/placement-db-sync-prmpk" Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.344295 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07cedd96-e60e-40e6-9ae7-c29728b9e62c-config-data\") pod \"placement-db-sync-prmpk\" (UID: \"07cedd96-e60e-40e6-9ae7-c29728b9e62c\") " pod="openstack/placement-db-sync-prmpk" Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.344328 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1073f594-fe1e-47df-8db9-ad04fc701143-dns-svc\") pod \"dnsmasq-dns-58dd9ff6bc-ljwg7\" (UID: \"1073f594-fe1e-47df-8db9-ad04fc701143\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-ljwg7" Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.344371 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/28a2d92c-8fd5-43a8-813c-7b1c49264fcd-scripts\") pod \"ceilometer-0\" (UID: \"28a2d92c-8fd5-43a8-813c-7b1c49264fcd\") " pod="openstack/ceilometer-0" Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.344395 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/28a2d92c-8fd5-43a8-813c-7b1c49264fcd-run-httpd\") pod \"ceilometer-0\" (UID: \"28a2d92c-8fd5-43a8-813c-7b1c49264fcd\") " pod="openstack/ceilometer-0" Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.344433 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1073f594-fe1e-47df-8db9-ad04fc701143-config\") pod \"dnsmasq-dns-58dd9ff6bc-ljwg7\" (UID: \"1073f594-fe1e-47df-8db9-ad04fc701143\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-ljwg7" Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.344463 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gjmmc\" (UniqueName: \"kubernetes.io/projected/1073f594-fe1e-47df-8db9-ad04fc701143-kube-api-access-gjmmc\") pod 
\"dnsmasq-dns-58dd9ff6bc-ljwg7\" (UID: \"1073f594-fe1e-47df-8db9-ad04fc701143\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-ljwg7" Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.344487 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/07cedd96-e60e-40e6-9ae7-c29728b9e62c-logs\") pod \"placement-db-sync-prmpk\" (UID: \"07cedd96-e60e-40e6-9ae7-c29728b9e62c\") " pod="openstack/placement-db-sync-prmpk" Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.344513 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1073f594-fe1e-47df-8db9-ad04fc701143-ovsdbserver-nb\") pod \"dnsmasq-dns-58dd9ff6bc-ljwg7\" (UID: \"1073f594-fe1e-47df-8db9-ad04fc701143\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-ljwg7" Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.344535 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/07cedd96-e60e-40e6-9ae7-c29728b9e62c-scripts\") pod \"placement-db-sync-prmpk\" (UID: \"07cedd96-e60e-40e6-9ae7-c29728b9e62c\") " pod="openstack/placement-db-sync-prmpk" Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.344561 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxbg4\" (UniqueName: \"kubernetes.io/projected/28a2d92c-8fd5-43a8-813c-7b1c49264fcd-kube-api-access-kxbg4\") pod \"ceilometer-0\" (UID: \"28a2d92c-8fd5-43a8-813c-7b1c49264fcd\") " pod="openstack/ceilometer-0" Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.344582 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/28a2d92c-8fd5-43a8-813c-7b1c49264fcd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"28a2d92c-8fd5-43a8-813c-7b1c49264fcd\") " pod="openstack/ceilometer-0" Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.344612 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1073f594-fe1e-47df-8db9-ad04fc701143-dns-swift-storage-0\") pod \"dnsmasq-dns-58dd9ff6bc-ljwg7\" (UID: \"1073f594-fe1e-47df-8db9-ad04fc701143\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-ljwg7" Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.344631 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28a2d92c-8fd5-43a8-813c-7b1c49264fcd-config-data\") pod \"ceilometer-0\" (UID: \"28a2d92c-8fd5-43a8-813c-7b1c49264fcd\") " pod="openstack/ceilometer-0" Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.344654 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8m5ls\" (UniqueName: \"kubernetes.io/projected/07cedd96-e60e-40e6-9ae7-c29728b9e62c-kube-api-access-8m5ls\") pod \"placement-db-sync-prmpk\" (UID: \"07cedd96-e60e-40e6-9ae7-c29728b9e62c\") " pod="openstack/placement-db-sync-prmpk" Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.344685 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28a2d92c-8fd5-43a8-813c-7b1c49264fcd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"28a2d92c-8fd5-43a8-813c-7b1c49264fcd\") " pod="openstack/ceilometer-0" Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.344756 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/28a2d92c-8fd5-43a8-813c-7b1c49264fcd-log-httpd\") pod \"ceilometer-0\" (UID: \"28a2d92c-8fd5-43a8-813c-7b1c49264fcd\") " pod="openstack/ceilometer-0" 
Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.345845 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1073f594-fe1e-47df-8db9-ad04fc701143-ovsdbserver-sb\") pod \"dnsmasq-dns-58dd9ff6bc-ljwg7\" (UID: \"1073f594-fe1e-47df-8db9-ad04fc701143\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-ljwg7" Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.346702 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1073f594-fe1e-47df-8db9-ad04fc701143-dns-svc\") pod \"dnsmasq-dns-58dd9ff6bc-ljwg7\" (UID: \"1073f594-fe1e-47df-8db9-ad04fc701143\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-ljwg7" Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.349149 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-9lrrh" Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.349968 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1073f594-fe1e-47df-8db9-ad04fc701143-config\") pod \"dnsmasq-dns-58dd9ff6bc-ljwg7\" (UID: \"1073f594-fe1e-47df-8db9-ad04fc701143\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-ljwg7" Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.350531 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1073f594-fe1e-47df-8db9-ad04fc701143-ovsdbserver-nb\") pod \"dnsmasq-dns-58dd9ff6bc-ljwg7\" (UID: \"1073f594-fe1e-47df-8db9-ad04fc701143\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-ljwg7" Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.350893 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1073f594-fe1e-47df-8db9-ad04fc701143-dns-swift-storage-0\") pod \"dnsmasq-dns-58dd9ff6bc-ljwg7\" (UID: 
\"1073f594-fe1e-47df-8db9-ad04fc701143\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-ljwg7" Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.389893 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-vjgr8" Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.400336 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjmmc\" (UniqueName: \"kubernetes.io/projected/1073f594-fe1e-47df-8db9-ad04fc701143-kube-api-access-gjmmc\") pod \"dnsmasq-dns-58dd9ff6bc-ljwg7\" (UID: \"1073f594-fe1e-47df-8db9-ad04fc701143\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-ljwg7" Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.403321 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.419351 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24hg5\" (UniqueName: \"kubernetes.io/projected/56723c9c-15bf-4eaa-896c-ea5d07066b27-kube-api-access-24hg5\") pod \"cinder-db-sync-44vd8\" (UID: \"56723c9c-15bf-4eaa-896c-ea5d07066b27\") " pod="openstack/cinder-db-sync-44vd8" Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.449995 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a756fe09-2c73-430d-be27-34caa885311c-db-sync-config-data\") pod \"a756fe09-2c73-430d-be27-34caa885311c\" (UID: \"a756fe09-2c73-430d-be27-34caa885311c\") " Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.471758 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgqxt\" (UniqueName: \"kubernetes.io/projected/a756fe09-2c73-430d-be27-34caa885311c-kube-api-access-zgqxt\") pod \"a756fe09-2c73-430d-be27-34caa885311c\" (UID: \"a756fe09-2c73-430d-be27-34caa885311c\") " Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.472018 4895 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a756fe09-2c73-430d-be27-34caa885311c-config-data\") pod \"a756fe09-2c73-430d-be27-34caa885311c\" (UID: \"a756fe09-2c73-430d-be27-34caa885311c\") " Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.472108 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a756fe09-2c73-430d-be27-34caa885311c-combined-ca-bundle\") pod \"a756fe09-2c73-430d-be27-34caa885311c\" (UID: \"a756fe09-2c73-430d-be27-34caa885311c\") " Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.473177 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/28a2d92c-8fd5-43a8-813c-7b1c49264fcd-scripts\") pod \"ceilometer-0\" (UID: \"28a2d92c-8fd5-43a8-813c-7b1c49264fcd\") " pod="openstack/ceilometer-0" Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.473236 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/28a2d92c-8fd5-43a8-813c-7b1c49264fcd-run-httpd\") pod \"ceilometer-0\" (UID: \"28a2d92c-8fd5-43a8-813c-7b1c49264fcd\") " pod="openstack/ceilometer-0" Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.473344 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/07cedd96-e60e-40e6-9ae7-c29728b9e62c-logs\") pod \"placement-db-sync-prmpk\" (UID: \"07cedd96-e60e-40e6-9ae7-c29728b9e62c\") " pod="openstack/placement-db-sync-prmpk" Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.473429 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/07cedd96-e60e-40e6-9ae7-c29728b9e62c-scripts\") pod \"placement-db-sync-prmpk\" (UID: 
\"07cedd96-e60e-40e6-9ae7-c29728b9e62c\") " pod="openstack/placement-db-sync-prmpk" Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.473483 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kxbg4\" (UniqueName: \"kubernetes.io/projected/28a2d92c-8fd5-43a8-813c-7b1c49264fcd-kube-api-access-kxbg4\") pod \"ceilometer-0\" (UID: \"28a2d92c-8fd5-43a8-813c-7b1c49264fcd\") " pod="openstack/ceilometer-0" Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.473530 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/28a2d92c-8fd5-43a8-813c-7b1c49264fcd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"28a2d92c-8fd5-43a8-813c-7b1c49264fcd\") " pod="openstack/ceilometer-0" Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.473586 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28a2d92c-8fd5-43a8-813c-7b1c49264fcd-config-data\") pod \"ceilometer-0\" (UID: \"28a2d92c-8fd5-43a8-813c-7b1c49264fcd\") " pod="openstack/ceilometer-0" Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.473643 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8m5ls\" (UniqueName: \"kubernetes.io/projected/07cedd96-e60e-40e6-9ae7-c29728b9e62c-kube-api-access-8m5ls\") pod \"placement-db-sync-prmpk\" (UID: \"07cedd96-e60e-40e6-9ae7-c29728b9e62c\") " pod="openstack/placement-db-sync-prmpk" Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.473692 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28a2d92c-8fd5-43a8-813c-7b1c49264fcd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"28a2d92c-8fd5-43a8-813c-7b1c49264fcd\") " pod="openstack/ceilometer-0" Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.475365 4895 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/28a2d92c-8fd5-43a8-813c-7b1c49264fcd-log-httpd\") pod \"ceilometer-0\" (UID: \"28a2d92c-8fd5-43a8-813c-7b1c49264fcd\") " pod="openstack/ceilometer-0" Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.475486 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07cedd96-e60e-40e6-9ae7-c29728b9e62c-combined-ca-bundle\") pod \"placement-db-sync-prmpk\" (UID: \"07cedd96-e60e-40e6-9ae7-c29728b9e62c\") " pod="openstack/placement-db-sync-prmpk" Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.475620 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07cedd96-e60e-40e6-9ae7-c29728b9e62c-config-data\") pod \"placement-db-sync-prmpk\" (UID: \"07cedd96-e60e-40e6-9ae7-c29728b9e62c\") " pod="openstack/placement-db-sync-prmpk" Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.484608 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/28a2d92c-8fd5-43a8-813c-7b1c49264fcd-log-httpd\") pod \"ceilometer-0\" (UID: \"28a2d92c-8fd5-43a8-813c-7b1c49264fcd\") " pod="openstack/ceilometer-0" Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.484912 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/28a2d92c-8fd5-43a8-813c-7b1c49264fcd-run-httpd\") pod \"ceilometer-0\" (UID: \"28a2d92c-8fd5-43a8-813c-7b1c49264fcd\") " pod="openstack/ceilometer-0" Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.485190 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/07cedd96-e60e-40e6-9ae7-c29728b9e62c-logs\") pod \"placement-db-sync-prmpk\" (UID: 
\"07cedd96-e60e-40e6-9ae7-c29728b9e62c\") " pod="openstack/placement-db-sync-prmpk" Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.485303 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-prmpk"] Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.486932 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/28a2d92c-8fd5-43a8-813c-7b1c49264fcd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"28a2d92c-8fd5-43a8-813c-7b1c49264fcd\") " pod="openstack/ceilometer-0" Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.498119 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a756fe09-2c73-430d-be27-34caa885311c-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "a756fe09-2c73-430d-be27-34caa885311c" (UID: "a756fe09-2c73-430d-be27-34caa885311c"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.498409 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07cedd96-e60e-40e6-9ae7-c29728b9e62c-combined-ca-bundle\") pod \"placement-db-sync-prmpk\" (UID: \"07cedd96-e60e-40e6-9ae7-c29728b9e62c\") " pod="openstack/placement-db-sync-prmpk" Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.512395 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/28a2d92c-8fd5-43a8-813c-7b1c49264fcd-scripts\") pod \"ceilometer-0\" (UID: \"28a2d92c-8fd5-43a8-813c-7b1c49264fcd\") " pod="openstack/ceilometer-0" Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.512969 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-xdfqx"] Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.513731 4895 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07cedd96-e60e-40e6-9ae7-c29728b9e62c-config-data\") pod \"placement-db-sync-prmpk\" (UID: \"07cedd96-e60e-40e6-9ae7-c29728b9e62c\") " pod="openstack/placement-db-sync-prmpk" Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.513842 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a756fe09-2c73-430d-be27-34caa885311c-kube-api-access-zgqxt" (OuterVolumeSpecName: "kube-api-access-zgqxt") pod "a756fe09-2c73-430d-be27-34caa885311c" (UID: "a756fe09-2c73-430d-be27-34caa885311c"). InnerVolumeSpecName "kube-api-access-zgqxt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:44:50 crc kubenswrapper[4895]: E1202 07:44:50.515832 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a756fe09-2c73-430d-be27-34caa885311c" containerName="glance-db-sync" Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.515856 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="a756fe09-2c73-430d-be27-34caa885311c" containerName="glance-db-sync" Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.516248 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="a756fe09-2c73-430d-be27-34caa885311c" containerName="glance-db-sync" Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.517446 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-xdfqx" Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.527100 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a756fe09-2c73-430d-be27-34caa885311c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a756fe09-2c73-430d-be27-34caa885311c" (UID: "a756fe09-2c73-430d-be27-34caa885311c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.536102 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.536378 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-8wlh5" Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.547183 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/07cedd96-e60e-40e6-9ae7-c29728b9e62c-scripts\") pod \"placement-db-sync-prmpk\" (UID: \"07cedd96-e60e-40e6-9ae7-c29728b9e62c\") " pod="openstack/placement-db-sync-prmpk" Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.547400 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-vjgr8" event={"ID":"a756fe09-2c73-430d-be27-34caa885311c","Type":"ContainerDied","Data":"efe2dec243970af3fd6db4dc8eb44ed2f10c21509a6e77e373baa3a86b918f46"} Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.547426 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28a2d92c-8fd5-43a8-813c-7b1c49264fcd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"28a2d92c-8fd5-43a8-813c-7b1c49264fcd\") " pod="openstack/ceilometer-0" Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.547446 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="efe2dec243970af3fd6db4dc8eb44ed2f10c21509a6e77e373baa3a86b918f46" Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.547591 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-vjgr8" Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.569701 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28a2d92c-8fd5-43a8-813c-7b1c49264fcd-config-data\") pod \"ceilometer-0\" (UID: \"28a2d92c-8fd5-43a8-813c-7b1c49264fcd\") " pod="openstack/ceilometer-0" Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.570617 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxbg4\" (UniqueName: \"kubernetes.io/projected/28a2d92c-8fd5-43a8-813c-7b1c49264fcd-kube-api-access-kxbg4\") pod \"ceilometer-0\" (UID: \"28a2d92c-8fd5-43a8-813c-7b1c49264fcd\") " pod="openstack/ceilometer-0" Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.571033 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8m5ls\" (UniqueName: \"kubernetes.io/projected/07cedd96-e60e-40e6-9ae7-c29728b9e62c-kube-api-access-8m5ls\") pod \"placement-db-sync-prmpk\" (UID: \"07cedd96-e60e-40e6-9ae7-c29728b9e62c\") " pod="openstack/placement-db-sync-prmpk" Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.579566 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/96ece5f3-3dc5-41db-a8e9-37e6f9054dd8-db-sync-config-data\") pod \"barbican-db-sync-xdfqx\" (UID: \"96ece5f3-3dc5-41db-a8e9-37e6f9054dd8\") " pod="openstack/barbican-db-sync-xdfqx" Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.579671 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vf6nh\" (UniqueName: \"kubernetes.io/projected/96ece5f3-3dc5-41db-a8e9-37e6f9054dd8-kube-api-access-vf6nh\") pod \"barbican-db-sync-xdfqx\" (UID: \"96ece5f3-3dc5-41db-a8e9-37e6f9054dd8\") " pod="openstack/barbican-db-sync-xdfqx" Dec 02 07:44:50 crc 
kubenswrapper[4895]: I1202 07:44:50.579812 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96ece5f3-3dc5-41db-a8e9-37e6f9054dd8-combined-ca-bundle\") pod \"barbican-db-sync-xdfqx\" (UID: \"96ece5f3-3dc5-41db-a8e9-37e6f9054dd8\") " pod="openstack/barbican-db-sync-xdfqx" Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.584959 4895 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a756fe09-2c73-430d-be27-34caa885311c-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.587214 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgqxt\" (UniqueName: \"kubernetes.io/projected/a756fe09-2c73-430d-be27-34caa885311c-kube-api-access-zgqxt\") on node \"crc\" DevicePath \"\"" Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.587261 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a756fe09-2c73-430d-be27-34caa885311c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.613823 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-44vd8" Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.618823 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-xdfqx"] Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.626767 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58dd9ff6bc-ljwg7" Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.651417 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-prmpk" Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.696197 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/96ece5f3-3dc5-41db-a8e9-37e6f9054dd8-db-sync-config-data\") pod \"barbican-db-sync-xdfqx\" (UID: \"96ece5f3-3dc5-41db-a8e9-37e6f9054dd8\") " pod="openstack/barbican-db-sync-xdfqx" Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.696259 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vf6nh\" (UniqueName: \"kubernetes.io/projected/96ece5f3-3dc5-41db-a8e9-37e6f9054dd8-kube-api-access-vf6nh\") pod \"barbican-db-sync-xdfqx\" (UID: \"96ece5f3-3dc5-41db-a8e9-37e6f9054dd8\") " pod="openstack/barbican-db-sync-xdfqx" Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.696328 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96ece5f3-3dc5-41db-a8e9-37e6f9054dd8-combined-ca-bundle\") pod \"barbican-db-sync-xdfqx\" (UID: \"96ece5f3-3dc5-41db-a8e9-37e6f9054dd8\") " pod="openstack/barbican-db-sync-xdfqx" Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.712818 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/96ece5f3-3dc5-41db-a8e9-37e6f9054dd8-db-sync-config-data\") pod \"barbican-db-sync-xdfqx\" (UID: \"96ece5f3-3dc5-41db-a8e9-37e6f9054dd8\") " pod="openstack/barbican-db-sync-xdfqx" Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.727583 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96ece5f3-3dc5-41db-a8e9-37e6f9054dd8-combined-ca-bundle\") pod \"barbican-db-sync-xdfqx\" (UID: \"96ece5f3-3dc5-41db-a8e9-37e6f9054dd8\") " pod="openstack/barbican-db-sync-xdfqx" Dec 02 07:44:50 crc 
kubenswrapper[4895]: I1202 07:44:50.762096 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vf6nh\" (UniqueName: \"kubernetes.io/projected/96ece5f3-3dc5-41db-a8e9-37e6f9054dd8-kube-api-access-vf6nh\") pod \"barbican-db-sync-xdfqx\" (UID: \"96ece5f3-3dc5-41db-a8e9-37e6f9054dd8\") " pod="openstack/barbican-db-sync-xdfqx" Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.765998 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.803013 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a756fe09-2c73-430d-be27-34caa885311c-config-data" (OuterVolumeSpecName: "config-data") pod "a756fe09-2c73-430d-be27-34caa885311c" (UID: "a756fe09-2c73-430d-be27-34caa885311c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.809719 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a756fe09-2c73-430d-be27-34caa885311c-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.880668 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-xdfqx" Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.900705 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-884pr"] Dec 02 07:44:50 crc kubenswrapper[4895]: I1202 07:44:50.934207 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5959f8865f-h77pb"] Dec 02 07:44:51 crc kubenswrapper[4895]: I1202 07:44:51.211994 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-9lrrh"] Dec 02 07:44:51 crc kubenswrapper[4895]: W1202 07:44:51.226475 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6d3d0dfa_b0dd_4b27_8751_3483a85dc490.slice/crio-c3b38afad3f78d18b686fd37d79fa09d572814c13ccc1f8daeaef07e360d6294 WatchSource:0}: Error finding container c3b38afad3f78d18b686fd37d79fa09d572814c13ccc1f8daeaef07e360d6294: Status 404 returned error can't find the container with id c3b38afad3f78d18b686fd37d79fa09d572814c13ccc1f8daeaef07e360d6294 Dec 02 07:44:51 crc kubenswrapper[4895]: I1202 07:44:51.514288 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-44vd8"] Dec 02 07:44:51 crc kubenswrapper[4895]: I1202 07:44:51.522031 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58dd9ff6bc-ljwg7"] Dec 02 07:44:51 crc kubenswrapper[4895]: W1202 07:44:51.570504 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1073f594_fe1e_47df_8db9_ad04fc701143.slice/crio-6b78b07825d5aa23ce9f203405743c1752c9f8c4d7542c3caefc2e092fa646db WatchSource:0}: Error finding container 6b78b07825d5aa23ce9f203405743c1752c9f8c4d7542c3caefc2e092fa646db: Status 404 returned error can't find the container with id 6b78b07825d5aa23ce9f203405743c1752c9f8c4d7542c3caefc2e092fa646db Dec 02 07:44:51 crc kubenswrapper[4895]: I1202 
07:44:51.594407 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-884pr" event={"ID":"6eb223e5-3856-4849-881f-86683f0e8bc9","Type":"ContainerStarted","Data":"9f87395686eb4293111dd47a55d66e6fd9c827da84446d2ce43c4aa195645589"} Dec 02 07:44:51 crc kubenswrapper[4895]: I1202 07:44:51.594457 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-884pr" event={"ID":"6eb223e5-3856-4849-881f-86683f0e8bc9","Type":"ContainerStarted","Data":"f3e14c6ac80bbb7fe62149eabf5b40522dc86ae5cd5c7e7fbf4a4eef42c35750"} Dec 02 07:44:51 crc kubenswrapper[4895]: I1202 07:44:51.606247 4895 generic.go:334] "Generic (PLEG): container finished" podID="f8b510ff-ce03-4faf-8c18-7cb50debede2" containerID="483b0959f49984237ed656cacae0ed88fa1041945ccbd41be3462d14221a582c" exitCode=0 Dec 02 07:44:51 crc kubenswrapper[4895]: I1202 07:44:51.606339 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5959f8865f-h77pb" event={"ID":"f8b510ff-ce03-4faf-8c18-7cb50debede2","Type":"ContainerDied","Data":"483b0959f49984237ed656cacae0ed88fa1041945ccbd41be3462d14221a582c"} Dec 02 07:44:51 crc kubenswrapper[4895]: I1202 07:44:51.606375 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5959f8865f-h77pb" event={"ID":"f8b510ff-ce03-4faf-8c18-7cb50debede2","Type":"ContainerStarted","Data":"52a48655fc249eb82ec723d5fd1436fb5b095fab165de9c3924ade49fd4331d0"} Dec 02 07:44:51 crc kubenswrapper[4895]: I1202 07:44:51.624113 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-9lrrh" event={"ID":"6d3d0dfa-b0dd-4b27-8751-3483a85dc490","Type":"ContainerStarted","Data":"9352b834616a69ecbcd66b6e814ff88f5658fd5608184279b61d4a311c968b79"} Dec 02 07:44:51 crc kubenswrapper[4895]: I1202 07:44:51.624215 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-9lrrh" 
event={"ID":"6d3d0dfa-b0dd-4b27-8751-3483a85dc490","Type":"ContainerStarted","Data":"c3b38afad3f78d18b686fd37d79fa09d572814c13ccc1f8daeaef07e360d6294"} Dec 02 07:44:51 crc kubenswrapper[4895]: I1202 07:44:51.632460 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-884pr" podStartSLOduration=2.632429956 podStartE2EDuration="2.632429956s" podCreationTimestamp="2025-12-02 07:44:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:44:51.62446853 +0000 UTC m=+1302.795328153" watchObservedRunningTime="2025-12-02 07:44:51.632429956 +0000 UTC m=+1302.803289559" Dec 02 07:44:51 crc kubenswrapper[4895]: I1202 07:44:51.653876 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-xdfqx"] Dec 02 07:44:51 crc kubenswrapper[4895]: W1202 07:44:51.691010 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod96ece5f3_3dc5_41db_a8e9_37e6f9054dd8.slice/crio-ca10957317bbfad55352c85d5156932a78c708152777c151f3cbac0595b00e3d WatchSource:0}: Error finding container ca10957317bbfad55352c85d5156932a78c708152777c151f3cbac0595b00e3d: Status 404 returned error can't find the container with id ca10957317bbfad55352c85d5156932a78c708152777c151f3cbac0595b00e3d Dec 02 07:44:51 crc kubenswrapper[4895]: I1202 07:44:51.720259 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-9lrrh" podStartSLOduration=2.720230873 podStartE2EDuration="2.720230873s" podCreationTimestamp="2025-12-02 07:44:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:44:51.676556867 +0000 UTC m=+1302.847416510" watchObservedRunningTime="2025-12-02 07:44:51.720230873 +0000 UTC m=+1302.891090486" Dec 02 07:44:51 crc 
kubenswrapper[4895]: I1202 07:44:51.764697 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-prmpk"] Dec 02 07:44:51 crc kubenswrapper[4895]: I1202 07:44:51.784069 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 02 07:44:52 crc kubenswrapper[4895]: I1202 07:44:52.045102 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58dd9ff6bc-ljwg7"] Dec 02 07:44:52 crc kubenswrapper[4895]: I1202 07:44:52.090039 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-rbvw7"] Dec 02 07:44:52 crc kubenswrapper[4895]: I1202 07:44:52.091943 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-rbvw7" Dec 02 07:44:52 crc kubenswrapper[4895]: I1202 07:44:52.116580 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-rbvw7"] Dec 02 07:44:52 crc kubenswrapper[4895]: I1202 07:44:52.138297 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5959f8865f-h77pb" Dec 02 07:44:52 crc kubenswrapper[4895]: I1202 07:44:52.172539 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5b232bc6-67c7-4add-9057-806e74ef162e-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-rbvw7\" (UID: \"5b232bc6-67c7-4add-9057-806e74ef162e\") " pod="openstack/dnsmasq-dns-785d8bcb8c-rbvw7" Dec 02 07:44:52 crc kubenswrapper[4895]: I1202 07:44:52.172608 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5b232bc6-67c7-4add-9057-806e74ef162e-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-rbvw7\" (UID: \"5b232bc6-67c7-4add-9057-806e74ef162e\") " pod="openstack/dnsmasq-dns-785d8bcb8c-rbvw7" Dec 02 07:44:52 crc kubenswrapper[4895]: I1202 07:44:52.172669 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5b232bc6-67c7-4add-9057-806e74ef162e-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-rbvw7\" (UID: \"5b232bc6-67c7-4add-9057-806e74ef162e\") " pod="openstack/dnsmasq-dns-785d8bcb8c-rbvw7" Dec 02 07:44:52 crc kubenswrapper[4895]: I1202 07:44:52.172692 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b232bc6-67c7-4add-9057-806e74ef162e-config\") pod \"dnsmasq-dns-785d8bcb8c-rbvw7\" (UID: \"5b232bc6-67c7-4add-9057-806e74ef162e\") " pod="openstack/dnsmasq-dns-785d8bcb8c-rbvw7" Dec 02 07:44:52 crc kubenswrapper[4895]: I1202 07:44:52.172712 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjdqf\" (UniqueName: \"kubernetes.io/projected/5b232bc6-67c7-4add-9057-806e74ef162e-kube-api-access-bjdqf\") pod \"dnsmasq-dns-785d8bcb8c-rbvw7\" 
(UID: \"5b232bc6-67c7-4add-9057-806e74ef162e\") " pod="openstack/dnsmasq-dns-785d8bcb8c-rbvw7" Dec 02 07:44:52 crc kubenswrapper[4895]: I1202 07:44:52.172802 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5b232bc6-67c7-4add-9057-806e74ef162e-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-rbvw7\" (UID: \"5b232bc6-67c7-4add-9057-806e74ef162e\") " pod="openstack/dnsmasq-dns-785d8bcb8c-rbvw7" Dec 02 07:44:52 crc kubenswrapper[4895]: I1202 07:44:52.277624 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f8b510ff-ce03-4faf-8c18-7cb50debede2-config\") pod \"f8b510ff-ce03-4faf-8c18-7cb50debede2\" (UID: \"f8b510ff-ce03-4faf-8c18-7cb50debede2\") " Dec 02 07:44:52 crc kubenswrapper[4895]: I1202 07:44:52.277683 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f8b510ff-ce03-4faf-8c18-7cb50debede2-ovsdbserver-sb\") pod \"f8b510ff-ce03-4faf-8c18-7cb50debede2\" (UID: \"f8b510ff-ce03-4faf-8c18-7cb50debede2\") " Dec 02 07:44:52 crc kubenswrapper[4895]: I1202 07:44:52.277766 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xbznt\" (UniqueName: \"kubernetes.io/projected/f8b510ff-ce03-4faf-8c18-7cb50debede2-kube-api-access-xbznt\") pod \"f8b510ff-ce03-4faf-8c18-7cb50debede2\" (UID: \"f8b510ff-ce03-4faf-8c18-7cb50debede2\") " Dec 02 07:44:52 crc kubenswrapper[4895]: I1202 07:44:52.277897 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f8b510ff-ce03-4faf-8c18-7cb50debede2-dns-svc\") pod \"f8b510ff-ce03-4faf-8c18-7cb50debede2\" (UID: \"f8b510ff-ce03-4faf-8c18-7cb50debede2\") " Dec 02 07:44:52 crc kubenswrapper[4895]: I1202 07:44:52.277937 4895 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f8b510ff-ce03-4faf-8c18-7cb50debede2-ovsdbserver-nb\") pod \"f8b510ff-ce03-4faf-8c18-7cb50debede2\" (UID: \"f8b510ff-ce03-4faf-8c18-7cb50debede2\") " Dec 02 07:44:52 crc kubenswrapper[4895]: I1202 07:44:52.277970 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f8b510ff-ce03-4faf-8c18-7cb50debede2-dns-swift-storage-0\") pod \"f8b510ff-ce03-4faf-8c18-7cb50debede2\" (UID: \"f8b510ff-ce03-4faf-8c18-7cb50debede2\") " Dec 02 07:44:52 crc kubenswrapper[4895]: I1202 07:44:52.278341 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5b232bc6-67c7-4add-9057-806e74ef162e-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-rbvw7\" (UID: \"5b232bc6-67c7-4add-9057-806e74ef162e\") " pod="openstack/dnsmasq-dns-785d8bcb8c-rbvw7" Dec 02 07:44:52 crc kubenswrapper[4895]: I1202 07:44:52.278376 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5b232bc6-67c7-4add-9057-806e74ef162e-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-rbvw7\" (UID: \"5b232bc6-67c7-4add-9057-806e74ef162e\") " pod="openstack/dnsmasq-dns-785d8bcb8c-rbvw7" Dec 02 07:44:52 crc kubenswrapper[4895]: I1202 07:44:52.278412 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5b232bc6-67c7-4add-9057-806e74ef162e-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-rbvw7\" (UID: \"5b232bc6-67c7-4add-9057-806e74ef162e\") " pod="openstack/dnsmasq-dns-785d8bcb8c-rbvw7" Dec 02 07:44:52 crc kubenswrapper[4895]: I1202 07:44:52.278475 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/5b232bc6-67c7-4add-9057-806e74ef162e-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-rbvw7\" (UID: \"5b232bc6-67c7-4add-9057-806e74ef162e\") " pod="openstack/dnsmasq-dns-785d8bcb8c-rbvw7" Dec 02 07:44:52 crc kubenswrapper[4895]: I1202 07:44:52.278499 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b232bc6-67c7-4add-9057-806e74ef162e-config\") pod \"dnsmasq-dns-785d8bcb8c-rbvw7\" (UID: \"5b232bc6-67c7-4add-9057-806e74ef162e\") " pod="openstack/dnsmasq-dns-785d8bcb8c-rbvw7" Dec 02 07:44:52 crc kubenswrapper[4895]: I1202 07:44:52.278519 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bjdqf\" (UniqueName: \"kubernetes.io/projected/5b232bc6-67c7-4add-9057-806e74ef162e-kube-api-access-bjdqf\") pod \"dnsmasq-dns-785d8bcb8c-rbvw7\" (UID: \"5b232bc6-67c7-4add-9057-806e74ef162e\") " pod="openstack/dnsmasq-dns-785d8bcb8c-rbvw7" Dec 02 07:44:52 crc kubenswrapper[4895]: I1202 07:44:52.280710 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5b232bc6-67c7-4add-9057-806e74ef162e-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-rbvw7\" (UID: \"5b232bc6-67c7-4add-9057-806e74ef162e\") " pod="openstack/dnsmasq-dns-785d8bcb8c-rbvw7" Dec 02 07:44:52 crc kubenswrapper[4895]: I1202 07:44:52.282727 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b232bc6-67c7-4add-9057-806e74ef162e-config\") pod \"dnsmasq-dns-785d8bcb8c-rbvw7\" (UID: \"5b232bc6-67c7-4add-9057-806e74ef162e\") " pod="openstack/dnsmasq-dns-785d8bcb8c-rbvw7" Dec 02 07:44:52 crc kubenswrapper[4895]: I1202 07:44:52.283407 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5b232bc6-67c7-4add-9057-806e74ef162e-dns-swift-storage-0\") pod 
\"dnsmasq-dns-785d8bcb8c-rbvw7\" (UID: \"5b232bc6-67c7-4add-9057-806e74ef162e\") " pod="openstack/dnsmasq-dns-785d8bcb8c-rbvw7" Dec 02 07:44:52 crc kubenswrapper[4895]: I1202 07:44:52.287377 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5b232bc6-67c7-4add-9057-806e74ef162e-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-rbvw7\" (UID: \"5b232bc6-67c7-4add-9057-806e74ef162e\") " pod="openstack/dnsmasq-dns-785d8bcb8c-rbvw7" Dec 02 07:44:52 crc kubenswrapper[4895]: I1202 07:44:52.290282 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5b232bc6-67c7-4add-9057-806e74ef162e-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-rbvw7\" (UID: \"5b232bc6-67c7-4add-9057-806e74ef162e\") " pod="openstack/dnsmasq-dns-785d8bcb8c-rbvw7" Dec 02 07:44:52 crc kubenswrapper[4895]: I1202 07:44:52.313960 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8b510ff-ce03-4faf-8c18-7cb50debede2-kube-api-access-xbznt" (OuterVolumeSpecName: "kube-api-access-xbznt") pod "f8b510ff-ce03-4faf-8c18-7cb50debede2" (UID: "f8b510ff-ce03-4faf-8c18-7cb50debede2"). InnerVolumeSpecName "kube-api-access-xbznt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:44:52 crc kubenswrapper[4895]: I1202 07:44:52.320435 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjdqf\" (UniqueName: \"kubernetes.io/projected/5b232bc6-67c7-4add-9057-806e74ef162e-kube-api-access-bjdqf\") pod \"dnsmasq-dns-785d8bcb8c-rbvw7\" (UID: \"5b232bc6-67c7-4add-9057-806e74ef162e\") " pod="openstack/dnsmasq-dns-785d8bcb8c-rbvw7" Dec 02 07:44:52 crc kubenswrapper[4895]: I1202 07:44:52.348920 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8b510ff-ce03-4faf-8c18-7cb50debede2-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f8b510ff-ce03-4faf-8c18-7cb50debede2" (UID: "f8b510ff-ce03-4faf-8c18-7cb50debede2"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:44:52 crc kubenswrapper[4895]: I1202 07:44:52.370698 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8b510ff-ce03-4faf-8c18-7cb50debede2-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f8b510ff-ce03-4faf-8c18-7cb50debede2" (UID: "f8b510ff-ce03-4faf-8c18-7cb50debede2"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:44:52 crc kubenswrapper[4895]: I1202 07:44:52.377503 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8b510ff-ce03-4faf-8c18-7cb50debede2-config" (OuterVolumeSpecName: "config") pod "f8b510ff-ce03-4faf-8c18-7cb50debede2" (UID: "f8b510ff-ce03-4faf-8c18-7cb50debede2"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:44:52 crc kubenswrapper[4895]: I1202 07:44:52.382103 4895 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f8b510ff-ce03-4faf-8c18-7cb50debede2-config\") on node \"crc\" DevicePath \"\"" Dec 02 07:44:52 crc kubenswrapper[4895]: I1202 07:44:52.382144 4895 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f8b510ff-ce03-4faf-8c18-7cb50debede2-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 02 07:44:52 crc kubenswrapper[4895]: I1202 07:44:52.382171 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xbznt\" (UniqueName: \"kubernetes.io/projected/f8b510ff-ce03-4faf-8c18-7cb50debede2-kube-api-access-xbznt\") on node \"crc\" DevicePath \"\"" Dec 02 07:44:52 crc kubenswrapper[4895]: I1202 07:44:52.382181 4895 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f8b510ff-ce03-4faf-8c18-7cb50debede2-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 02 07:44:52 crc kubenswrapper[4895]: I1202 07:44:52.403388 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8b510ff-ce03-4faf-8c18-7cb50debede2-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "f8b510ff-ce03-4faf-8c18-7cb50debede2" (UID: "f8b510ff-ce03-4faf-8c18-7cb50debede2"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:44:52 crc kubenswrapper[4895]: I1202 07:44:52.426497 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8b510ff-ce03-4faf-8c18-7cb50debede2-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f8b510ff-ce03-4faf-8c18-7cb50debede2" (UID: "f8b510ff-ce03-4faf-8c18-7cb50debede2"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:44:52 crc kubenswrapper[4895]: I1202 07:44:52.433390 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-rbvw7" Dec 02 07:44:52 crc kubenswrapper[4895]: I1202 07:44:52.485559 4895 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f8b510ff-ce03-4faf-8c18-7cb50debede2-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 02 07:44:52 crc kubenswrapper[4895]: I1202 07:44:52.485634 4895 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f8b510ff-ce03-4faf-8c18-7cb50debede2-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 02 07:44:52 crc kubenswrapper[4895]: I1202 07:44:52.652405 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"28a2d92c-8fd5-43a8-813c-7b1c49264fcd","Type":"ContainerStarted","Data":"074c4f0214159d8576c6c43d0b4e82313b1bda2590e848736b79ba02629dd243"} Dec 02 07:44:52 crc kubenswrapper[4895]: I1202 07:44:52.654463 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-44vd8" event={"ID":"56723c9c-15bf-4eaa-896c-ea5d07066b27","Type":"ContainerStarted","Data":"c581b50e91e8a4d4727c372eb491f2ecedfe4ca397486c2e558627ca7d830318"} Dec 02 07:44:52 crc kubenswrapper[4895]: I1202 07:44:52.659885 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-prmpk" event={"ID":"07cedd96-e60e-40e6-9ae7-c29728b9e62c","Type":"ContainerStarted","Data":"3dd237dc0cfeaf0af1d2e587df325baedc49c1cf030db9596fd7eec69934291e"} Dec 02 07:44:52 crc kubenswrapper[4895]: I1202 07:44:52.671252 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-xdfqx" 
event={"ID":"96ece5f3-3dc5-41db-a8e9-37e6f9054dd8","Type":"ContainerStarted","Data":"ca10957317bbfad55352c85d5156932a78c708152777c151f3cbac0595b00e3d"} Dec 02 07:44:52 crc kubenswrapper[4895]: I1202 07:44:52.678778 4895 generic.go:334] "Generic (PLEG): container finished" podID="1073f594-fe1e-47df-8db9-ad04fc701143" containerID="6f880cff7624282886e9f00f96626147b05d20a8b0ae1e5c525082d5646a9806" exitCode=0 Dec 02 07:44:52 crc kubenswrapper[4895]: I1202 07:44:52.678854 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dd9ff6bc-ljwg7" event={"ID":"1073f594-fe1e-47df-8db9-ad04fc701143","Type":"ContainerDied","Data":"6f880cff7624282886e9f00f96626147b05d20a8b0ae1e5c525082d5646a9806"} Dec 02 07:44:52 crc kubenswrapper[4895]: I1202 07:44:52.678873 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dd9ff6bc-ljwg7" event={"ID":"1073f594-fe1e-47df-8db9-ad04fc701143","Type":"ContainerStarted","Data":"6b78b07825d5aa23ce9f203405743c1752c9f8c4d7542c3caefc2e092fa646db"} Dec 02 07:44:52 crc kubenswrapper[4895]: I1202 07:44:52.681686 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5959f8865f-h77pb" Dec 02 07:44:52 crc kubenswrapper[4895]: I1202 07:44:52.681821 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5959f8865f-h77pb" event={"ID":"f8b510ff-ce03-4faf-8c18-7cb50debede2","Type":"ContainerDied","Data":"52a48655fc249eb82ec723d5fd1436fb5b095fab165de9c3924ade49fd4331d0"} Dec 02 07:44:52 crc kubenswrapper[4895]: I1202 07:44:52.681892 4895 scope.go:117] "RemoveContainer" containerID="483b0959f49984237ed656cacae0ed88fa1041945ccbd41be3462d14221a582c" Dec 02 07:44:52 crc kubenswrapper[4895]: I1202 07:44:52.818068 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5959f8865f-h77pb"] Dec 02 07:44:52 crc kubenswrapper[4895]: I1202 07:44:52.823016 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5959f8865f-h77pb"] Dec 02 07:44:52 crc kubenswrapper[4895]: I1202 07:44:52.971368 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 02 07:44:52 crc kubenswrapper[4895]: E1202 07:44:52.994352 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8b510ff-ce03-4faf-8c18-7cb50debede2" containerName="init" Dec 02 07:44:52 crc kubenswrapper[4895]: I1202 07:44:52.994389 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8b510ff-ce03-4faf-8c18-7cb50debede2" containerName="init" Dec 02 07:44:52 crc kubenswrapper[4895]: I1202 07:44:52.994761 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8b510ff-ce03-4faf-8c18-7cb50debede2" containerName="init" Dec 02 07:44:53 crc kubenswrapper[4895]: I1202 07:44:52.995896 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 02 07:44:53 crc kubenswrapper[4895]: I1202 07:44:52.996004 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 02 07:44:53 crc kubenswrapper[4895]: I1202 07:44:53.003896 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-hwlmx" Dec 02 07:44:53 crc kubenswrapper[4895]: I1202 07:44:53.004669 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 02 07:44:53 crc kubenswrapper[4895]: I1202 07:44:53.004832 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Dec 02 07:44:53 crc kubenswrapper[4895]: I1202 07:44:53.110352 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d8ca8ab-4f8e-4136-af19-c7160ee702d9-scripts\") pod \"glance-default-external-api-0\" (UID: \"5d8ca8ab-4f8e-4136-af19-c7160ee702d9\") " pod="openstack/glance-default-external-api-0" Dec 02 07:44:53 crc kubenswrapper[4895]: I1202 07:44:53.110410 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d8ca8ab-4f8e-4136-af19-c7160ee702d9-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"5d8ca8ab-4f8e-4136-af19-c7160ee702d9\") " pod="openstack/glance-default-external-api-0" Dec 02 07:44:53 crc kubenswrapper[4895]: I1202 07:44:53.110459 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"5d8ca8ab-4f8e-4136-af19-c7160ee702d9\") " pod="openstack/glance-default-external-api-0" Dec 02 07:44:53 crc kubenswrapper[4895]: I1202 07:44:53.110530 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/5d8ca8ab-4f8e-4136-af19-c7160ee702d9-logs\") pod \"glance-default-external-api-0\" (UID: \"5d8ca8ab-4f8e-4136-af19-c7160ee702d9\") " pod="openstack/glance-default-external-api-0" Dec 02 07:44:53 crc kubenswrapper[4895]: I1202 07:44:53.110549 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26txx\" (UniqueName: \"kubernetes.io/projected/5d8ca8ab-4f8e-4136-af19-c7160ee702d9-kube-api-access-26txx\") pod \"glance-default-external-api-0\" (UID: \"5d8ca8ab-4f8e-4136-af19-c7160ee702d9\") " pod="openstack/glance-default-external-api-0" Dec 02 07:44:53 crc kubenswrapper[4895]: I1202 07:44:53.110586 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d8ca8ab-4f8e-4136-af19-c7160ee702d9-config-data\") pod \"glance-default-external-api-0\" (UID: \"5d8ca8ab-4f8e-4136-af19-c7160ee702d9\") " pod="openstack/glance-default-external-api-0" Dec 02 07:44:53 crc kubenswrapper[4895]: I1202 07:44:53.110631 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5d8ca8ab-4f8e-4136-af19-c7160ee702d9-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"5d8ca8ab-4f8e-4136-af19-c7160ee702d9\") " pod="openstack/glance-default-external-api-0" Dec 02 07:44:53 crc kubenswrapper[4895]: W1202 07:44:53.162117 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5b232bc6_67c7_4add_9057_806e74ef162e.slice/crio-ad473fa35cda87ac095d267204f35bacb11843bafcf3b8adfabe8bdc2145efed WatchSource:0}: Error finding container ad473fa35cda87ac095d267204f35bacb11843bafcf3b8adfabe8bdc2145efed: Status 404 returned error can't find the container with id ad473fa35cda87ac095d267204f35bacb11843bafcf3b8adfabe8bdc2145efed Dec 02 07:44:53 crc 
kubenswrapper[4895]: I1202 07:44:53.184563 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8b510ff-ce03-4faf-8c18-7cb50debede2" path="/var/lib/kubelet/pods/f8b510ff-ce03-4faf-8c18-7cb50debede2/volumes" Dec 02 07:44:53 crc kubenswrapper[4895]: I1202 07:44:53.188343 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-rbvw7"] Dec 02 07:44:53 crc kubenswrapper[4895]: I1202 07:44:53.213453 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d8ca8ab-4f8e-4136-af19-c7160ee702d9-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"5d8ca8ab-4f8e-4136-af19-c7160ee702d9\") " pod="openstack/glance-default-external-api-0" Dec 02 07:44:53 crc kubenswrapper[4895]: I1202 07:44:53.213681 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"5d8ca8ab-4f8e-4136-af19-c7160ee702d9\") " pod="openstack/glance-default-external-api-0" Dec 02 07:44:53 crc kubenswrapper[4895]: I1202 07:44:53.213855 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d8ca8ab-4f8e-4136-af19-c7160ee702d9-logs\") pod \"glance-default-external-api-0\" (UID: \"5d8ca8ab-4f8e-4136-af19-c7160ee702d9\") " pod="openstack/glance-default-external-api-0" Dec 02 07:44:53 crc kubenswrapper[4895]: I1202 07:44:53.213875 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26txx\" (UniqueName: \"kubernetes.io/projected/5d8ca8ab-4f8e-4136-af19-c7160ee702d9-kube-api-access-26txx\") pod \"glance-default-external-api-0\" (UID: \"5d8ca8ab-4f8e-4136-af19-c7160ee702d9\") " pod="openstack/glance-default-external-api-0" Dec 02 07:44:53 crc kubenswrapper[4895]: I1202 07:44:53.214087 
4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d8ca8ab-4f8e-4136-af19-c7160ee702d9-config-data\") pod \"glance-default-external-api-0\" (UID: \"5d8ca8ab-4f8e-4136-af19-c7160ee702d9\") " pod="openstack/glance-default-external-api-0" Dec 02 07:44:53 crc kubenswrapper[4895]: I1202 07:44:53.215304 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5d8ca8ab-4f8e-4136-af19-c7160ee702d9-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"5d8ca8ab-4f8e-4136-af19-c7160ee702d9\") " pod="openstack/glance-default-external-api-0" Dec 02 07:44:53 crc kubenswrapper[4895]: I1202 07:44:53.215522 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d8ca8ab-4f8e-4136-af19-c7160ee702d9-scripts\") pod \"glance-default-external-api-0\" (UID: \"5d8ca8ab-4f8e-4136-af19-c7160ee702d9\") " pod="openstack/glance-default-external-api-0" Dec 02 07:44:53 crc kubenswrapper[4895]: I1202 07:44:53.216558 4895 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"5d8ca8ab-4f8e-4136-af19-c7160ee702d9\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-external-api-0" Dec 02 07:44:53 crc kubenswrapper[4895]: I1202 07:44:53.221128 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d8ca8ab-4f8e-4136-af19-c7160ee702d9-logs\") pod \"glance-default-external-api-0\" (UID: \"5d8ca8ab-4f8e-4136-af19-c7160ee702d9\") " pod="openstack/glance-default-external-api-0" Dec 02 07:44:53 crc kubenswrapper[4895]: I1202 07:44:53.221537 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" 
(UniqueName: \"kubernetes.io/empty-dir/5d8ca8ab-4f8e-4136-af19-c7160ee702d9-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"5d8ca8ab-4f8e-4136-af19-c7160ee702d9\") " pod="openstack/glance-default-external-api-0" Dec 02 07:44:53 crc kubenswrapper[4895]: I1202 07:44:53.225861 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d8ca8ab-4f8e-4136-af19-c7160ee702d9-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"5d8ca8ab-4f8e-4136-af19-c7160ee702d9\") " pod="openstack/glance-default-external-api-0" Dec 02 07:44:53 crc kubenswrapper[4895]: I1202 07:44:53.231902 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d8ca8ab-4f8e-4136-af19-c7160ee702d9-scripts\") pod \"glance-default-external-api-0\" (UID: \"5d8ca8ab-4f8e-4136-af19-c7160ee702d9\") " pod="openstack/glance-default-external-api-0" Dec 02 07:44:53 crc kubenswrapper[4895]: I1202 07:44:53.245716 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d8ca8ab-4f8e-4136-af19-c7160ee702d9-config-data\") pod \"glance-default-external-api-0\" (UID: \"5d8ca8ab-4f8e-4136-af19-c7160ee702d9\") " pod="openstack/glance-default-external-api-0" Dec 02 07:44:53 crc kubenswrapper[4895]: I1202 07:44:53.249880 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"5d8ca8ab-4f8e-4136-af19-c7160ee702d9\") " pod="openstack/glance-default-external-api-0" Dec 02 07:44:53 crc kubenswrapper[4895]: I1202 07:44:53.252409 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26txx\" (UniqueName: \"kubernetes.io/projected/5d8ca8ab-4f8e-4136-af19-c7160ee702d9-kube-api-access-26txx\") pod 
\"glance-default-external-api-0\" (UID: \"5d8ca8ab-4f8e-4136-af19-c7160ee702d9\") " pod="openstack/glance-default-external-api-0" Dec 02 07:44:53 crc kubenswrapper[4895]: I1202 07:44:53.282697 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 02 07:44:53 crc kubenswrapper[4895]: I1202 07:44:53.287037 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 02 07:44:53 crc kubenswrapper[4895]: I1202 07:44:53.291415 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 02 07:44:53 crc kubenswrapper[4895]: I1202 07:44:53.315253 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 02 07:44:53 crc kubenswrapper[4895]: I1202 07:44:53.398207 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 02 07:44:53 crc kubenswrapper[4895]: I1202 07:44:53.420626 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-58dd9ff6bc-ljwg7" Dec 02 07:44:53 crc kubenswrapper[4895]: I1202 07:44:53.441911 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dceee011-f12c-4dc4-8b4d-c43b09cc84ce-logs\") pod \"glance-default-internal-api-0\" (UID: \"dceee011-f12c-4dc4-8b4d-c43b09cc84ce\") " pod="openstack/glance-default-internal-api-0" Dec 02 07:44:53 crc kubenswrapper[4895]: I1202 07:44:53.442029 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dceee011-f12c-4dc4-8b4d-c43b09cc84ce-scripts\") pod \"glance-default-internal-api-0\" (UID: \"dceee011-f12c-4dc4-8b4d-c43b09cc84ce\") " pod="openstack/glance-default-internal-api-0" Dec 02 07:44:53 crc kubenswrapper[4895]: I1202 07:44:53.442063 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dceee011-f12c-4dc4-8b4d-c43b09cc84ce-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"dceee011-f12c-4dc4-8b4d-c43b09cc84ce\") " pod="openstack/glance-default-internal-api-0" Dec 02 07:44:53 crc kubenswrapper[4895]: I1202 07:44:53.442107 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dceee011-f12c-4dc4-8b4d-c43b09cc84ce-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"dceee011-f12c-4dc4-8b4d-c43b09cc84ce\") " pod="openstack/glance-default-internal-api-0" Dec 02 07:44:53 crc kubenswrapper[4895]: I1202 07:44:53.442150 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dceee011-f12c-4dc4-8b4d-c43b09cc84ce-config-data\") pod \"glance-default-internal-api-0\" (UID: 
\"dceee011-f12c-4dc4-8b4d-c43b09cc84ce\") " pod="openstack/glance-default-internal-api-0" Dec 02 07:44:53 crc kubenswrapper[4895]: I1202 07:44:53.442183 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"dceee011-f12c-4dc4-8b4d-c43b09cc84ce\") " pod="openstack/glance-default-internal-api-0" Dec 02 07:44:53 crc kubenswrapper[4895]: I1202 07:44:53.442251 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sztjd\" (UniqueName: \"kubernetes.io/projected/dceee011-f12c-4dc4-8b4d-c43b09cc84ce-kube-api-access-sztjd\") pod \"glance-default-internal-api-0\" (UID: \"dceee011-f12c-4dc4-8b4d-c43b09cc84ce\") " pod="openstack/glance-default-internal-api-0" Dec 02 07:44:53 crc kubenswrapper[4895]: I1202 07:44:53.543352 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1073f594-fe1e-47df-8db9-ad04fc701143-ovsdbserver-sb\") pod \"1073f594-fe1e-47df-8db9-ad04fc701143\" (UID: \"1073f594-fe1e-47df-8db9-ad04fc701143\") " Dec 02 07:44:53 crc kubenswrapper[4895]: I1202 07:44:53.543428 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gjmmc\" (UniqueName: \"kubernetes.io/projected/1073f594-fe1e-47df-8db9-ad04fc701143-kube-api-access-gjmmc\") pod \"1073f594-fe1e-47df-8db9-ad04fc701143\" (UID: \"1073f594-fe1e-47df-8db9-ad04fc701143\") " Dec 02 07:44:53 crc kubenswrapper[4895]: I1202 07:44:53.543449 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1073f594-fe1e-47df-8db9-ad04fc701143-ovsdbserver-nb\") pod \"1073f594-fe1e-47df-8db9-ad04fc701143\" (UID: \"1073f594-fe1e-47df-8db9-ad04fc701143\") " Dec 02 07:44:53 crc 
kubenswrapper[4895]: I1202 07:44:53.543479 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1073f594-fe1e-47df-8db9-ad04fc701143-dns-svc\") pod \"1073f594-fe1e-47df-8db9-ad04fc701143\" (UID: \"1073f594-fe1e-47df-8db9-ad04fc701143\") " Dec 02 07:44:53 crc kubenswrapper[4895]: I1202 07:44:53.543545 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1073f594-fe1e-47df-8db9-ad04fc701143-config\") pod \"1073f594-fe1e-47df-8db9-ad04fc701143\" (UID: \"1073f594-fe1e-47df-8db9-ad04fc701143\") " Dec 02 07:44:53 crc kubenswrapper[4895]: I1202 07:44:53.543793 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1073f594-fe1e-47df-8db9-ad04fc701143-dns-swift-storage-0\") pod \"1073f594-fe1e-47df-8db9-ad04fc701143\" (UID: \"1073f594-fe1e-47df-8db9-ad04fc701143\") " Dec 02 07:44:53 crc kubenswrapper[4895]: I1202 07:44:53.544136 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dceee011-f12c-4dc4-8b4d-c43b09cc84ce-scripts\") pod \"glance-default-internal-api-0\" (UID: \"dceee011-f12c-4dc4-8b4d-c43b09cc84ce\") " pod="openstack/glance-default-internal-api-0" Dec 02 07:44:53 crc kubenswrapper[4895]: I1202 07:44:53.544166 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dceee011-f12c-4dc4-8b4d-c43b09cc84ce-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"dceee011-f12c-4dc4-8b4d-c43b09cc84ce\") " pod="openstack/glance-default-internal-api-0" Dec 02 07:44:53 crc kubenswrapper[4895]: I1202 07:44:53.544206 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/dceee011-f12c-4dc4-8b4d-c43b09cc84ce-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"dceee011-f12c-4dc4-8b4d-c43b09cc84ce\") " pod="openstack/glance-default-internal-api-0" Dec 02 07:44:53 crc kubenswrapper[4895]: I1202 07:44:53.544241 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dceee011-f12c-4dc4-8b4d-c43b09cc84ce-config-data\") pod \"glance-default-internal-api-0\" (UID: \"dceee011-f12c-4dc4-8b4d-c43b09cc84ce\") " pod="openstack/glance-default-internal-api-0" Dec 02 07:44:53 crc kubenswrapper[4895]: I1202 07:44:53.544266 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"dceee011-f12c-4dc4-8b4d-c43b09cc84ce\") " pod="openstack/glance-default-internal-api-0" Dec 02 07:44:53 crc kubenswrapper[4895]: I1202 07:44:53.544316 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sztjd\" (UniqueName: \"kubernetes.io/projected/dceee011-f12c-4dc4-8b4d-c43b09cc84ce-kube-api-access-sztjd\") pod \"glance-default-internal-api-0\" (UID: \"dceee011-f12c-4dc4-8b4d-c43b09cc84ce\") " pod="openstack/glance-default-internal-api-0" Dec 02 07:44:53 crc kubenswrapper[4895]: I1202 07:44:53.544370 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dceee011-f12c-4dc4-8b4d-c43b09cc84ce-logs\") pod \"glance-default-internal-api-0\" (UID: \"dceee011-f12c-4dc4-8b4d-c43b09cc84ce\") " pod="openstack/glance-default-internal-api-0" Dec 02 07:44:53 crc kubenswrapper[4895]: I1202 07:44:53.544654 4895 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod 
\"glance-default-internal-api-0\" (UID: \"dceee011-f12c-4dc4-8b4d-c43b09cc84ce\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-internal-api-0" Dec 02 07:44:53 crc kubenswrapper[4895]: I1202 07:44:53.545170 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dceee011-f12c-4dc4-8b4d-c43b09cc84ce-logs\") pod \"glance-default-internal-api-0\" (UID: \"dceee011-f12c-4dc4-8b4d-c43b09cc84ce\") " pod="openstack/glance-default-internal-api-0" Dec 02 07:44:53 crc kubenswrapper[4895]: I1202 07:44:53.545430 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dceee011-f12c-4dc4-8b4d-c43b09cc84ce-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"dceee011-f12c-4dc4-8b4d-c43b09cc84ce\") " pod="openstack/glance-default-internal-api-0" Dec 02 07:44:53 crc kubenswrapper[4895]: I1202 07:44:53.551879 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dceee011-f12c-4dc4-8b4d-c43b09cc84ce-scripts\") pod \"glance-default-internal-api-0\" (UID: \"dceee011-f12c-4dc4-8b4d-c43b09cc84ce\") " pod="openstack/glance-default-internal-api-0" Dec 02 07:44:53 crc kubenswrapper[4895]: I1202 07:44:53.558056 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dceee011-f12c-4dc4-8b4d-c43b09cc84ce-config-data\") pod \"glance-default-internal-api-0\" (UID: \"dceee011-f12c-4dc4-8b4d-c43b09cc84ce\") " pod="openstack/glance-default-internal-api-0" Dec 02 07:44:53 crc kubenswrapper[4895]: I1202 07:44:53.564355 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1073f594-fe1e-47df-8db9-ad04fc701143-kube-api-access-gjmmc" (OuterVolumeSpecName: "kube-api-access-gjmmc") pod "1073f594-fe1e-47df-8db9-ad04fc701143" (UID: 
"1073f594-fe1e-47df-8db9-ad04fc701143"). InnerVolumeSpecName "kube-api-access-gjmmc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:44:53 crc kubenswrapper[4895]: I1202 07:44:53.565163 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dceee011-f12c-4dc4-8b4d-c43b09cc84ce-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"dceee011-f12c-4dc4-8b4d-c43b09cc84ce\") " pod="openstack/glance-default-internal-api-0" Dec 02 07:44:53 crc kubenswrapper[4895]: I1202 07:44:53.573680 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sztjd\" (UniqueName: \"kubernetes.io/projected/dceee011-f12c-4dc4-8b4d-c43b09cc84ce-kube-api-access-sztjd\") pod \"glance-default-internal-api-0\" (UID: \"dceee011-f12c-4dc4-8b4d-c43b09cc84ce\") " pod="openstack/glance-default-internal-api-0" Dec 02 07:44:53 crc kubenswrapper[4895]: I1202 07:44:53.580341 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1073f594-fe1e-47df-8db9-ad04fc701143-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1073f594-fe1e-47df-8db9-ad04fc701143" (UID: "1073f594-fe1e-47df-8db9-ad04fc701143"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:44:53 crc kubenswrapper[4895]: I1202 07:44:53.602917 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"dceee011-f12c-4dc4-8b4d-c43b09cc84ce\") " pod="openstack/glance-default-internal-api-0" Dec 02 07:44:53 crc kubenswrapper[4895]: I1202 07:44:53.631112 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1073f594-fe1e-47df-8db9-ad04fc701143-config" (OuterVolumeSpecName: "config") pod "1073f594-fe1e-47df-8db9-ad04fc701143" (UID: "1073f594-fe1e-47df-8db9-ad04fc701143"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:44:53 crc kubenswrapper[4895]: I1202 07:44:53.647084 4895 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1073f594-fe1e-47df-8db9-ad04fc701143-config\") on node \"crc\" DevicePath \"\"" Dec 02 07:44:53 crc kubenswrapper[4895]: I1202 07:44:53.647129 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gjmmc\" (UniqueName: \"kubernetes.io/projected/1073f594-fe1e-47df-8db9-ad04fc701143-kube-api-access-gjmmc\") on node \"crc\" DevicePath \"\"" Dec 02 07:44:53 crc kubenswrapper[4895]: I1202 07:44:53.647142 4895 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1073f594-fe1e-47df-8db9-ad04fc701143-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 02 07:44:53 crc kubenswrapper[4895]: I1202 07:44:53.649186 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1073f594-fe1e-47df-8db9-ad04fc701143-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1073f594-fe1e-47df-8db9-ad04fc701143" (UID: "1073f594-fe1e-47df-8db9-ad04fc701143"). 
InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:44:53 crc kubenswrapper[4895]: I1202 07:44:53.678806 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1073f594-fe1e-47df-8db9-ad04fc701143-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1073f594-fe1e-47df-8db9-ad04fc701143" (UID: "1073f594-fe1e-47df-8db9-ad04fc701143"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:44:53 crc kubenswrapper[4895]: I1202 07:44:53.680254 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1073f594-fe1e-47df-8db9-ad04fc701143-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "1073f594-fe1e-47df-8db9-ad04fc701143" (UID: "1073f594-fe1e-47df-8db9-ad04fc701143"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:44:53 crc kubenswrapper[4895]: I1202 07:44:53.700654 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 02 07:44:53 crc kubenswrapper[4895]: I1202 07:44:53.750654 4895 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1073f594-fe1e-47df-8db9-ad04fc701143-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 02 07:44:53 crc kubenswrapper[4895]: I1202 07:44:53.750702 4895 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1073f594-fe1e-47df-8db9-ad04fc701143-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 02 07:44:53 crc kubenswrapper[4895]: I1202 07:44:53.750721 4895 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1073f594-fe1e-47df-8db9-ad04fc701143-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 02 07:44:53 crc kubenswrapper[4895]: I1202 07:44:53.770686 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-rbvw7" event={"ID":"5b232bc6-67c7-4add-9057-806e74ef162e","Type":"ContainerStarted","Data":"ad473fa35cda87ac095d267204f35bacb11843bafcf3b8adfabe8bdc2145efed"} Dec 02 07:44:53 crc kubenswrapper[4895]: I1202 07:44:53.773289 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dd9ff6bc-ljwg7" event={"ID":"1073f594-fe1e-47df-8db9-ad04fc701143","Type":"ContainerDied","Data":"6b78b07825d5aa23ce9f203405743c1752c9f8c4d7542c3caefc2e092fa646db"} Dec 02 07:44:53 crc kubenswrapper[4895]: I1202 07:44:53.773365 4895 scope.go:117] "RemoveContainer" containerID="6f880cff7624282886e9f00f96626147b05d20a8b0ae1e5c525082d5646a9806" Dec 02 07:44:53 crc kubenswrapper[4895]: I1202 07:44:53.780710 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-58dd9ff6bc-ljwg7" Dec 02 07:44:53 crc kubenswrapper[4895]: I1202 07:44:53.913628 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58dd9ff6bc-ljwg7"] Dec 02 07:44:53 crc kubenswrapper[4895]: I1202 07:44:53.943463 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-58dd9ff6bc-ljwg7"] Dec 02 07:44:53 crc kubenswrapper[4895]: I1202 07:44:53.973897 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 02 07:44:54 crc kubenswrapper[4895]: I1202 07:44:54.111811 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 02 07:44:54 crc kubenswrapper[4895]: I1202 07:44:54.140278 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 02 07:44:54 crc kubenswrapper[4895]: I1202 07:44:54.239227 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 02 07:44:54 crc kubenswrapper[4895]: W1202 07:44:54.291510 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5d8ca8ab_4f8e_4136_af19_c7160ee702d9.slice/crio-a02354787d3d93316d679740b7c5287ef2cc7f2a9bf88de3d1ca2d45a8b77dc4 WatchSource:0}: Error finding container a02354787d3d93316d679740b7c5287ef2cc7f2a9bf88de3d1ca2d45a8b77dc4: Status 404 returned error can't find the container with id a02354787d3d93316d679740b7c5287ef2cc7f2a9bf88de3d1ca2d45a8b77dc4 Dec 02 07:44:54 crc kubenswrapper[4895]: I1202 07:44:54.622054 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 02 07:44:54 crc kubenswrapper[4895]: W1202 07:44:54.640496 4895 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddceee011_f12c_4dc4_8b4d_c43b09cc84ce.slice/crio-e6cab384060dcdd4ea8d71e292ad1a4275d86a538e35c4a80cb1756fb8c87595 WatchSource:0}: Error finding container e6cab384060dcdd4ea8d71e292ad1a4275d86a538e35c4a80cb1756fb8c87595: Status 404 returned error can't find the container with id e6cab384060dcdd4ea8d71e292ad1a4275d86a538e35c4a80cb1756fb8c87595 Dec 02 07:44:54 crc kubenswrapper[4895]: I1202 07:44:54.804149 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"5d8ca8ab-4f8e-4136-af19-c7160ee702d9","Type":"ContainerStarted","Data":"a02354787d3d93316d679740b7c5287ef2cc7f2a9bf88de3d1ca2d45a8b77dc4"} Dec 02 07:44:54 crc kubenswrapper[4895]: I1202 07:44:54.825861 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"dceee011-f12c-4dc4-8b4d-c43b09cc84ce","Type":"ContainerStarted","Data":"e6cab384060dcdd4ea8d71e292ad1a4275d86a538e35c4a80cb1756fb8c87595"} Dec 02 07:44:54 crc kubenswrapper[4895]: I1202 07:44:54.833459 4895 generic.go:334] "Generic (PLEG): container finished" podID="5b232bc6-67c7-4add-9057-806e74ef162e" containerID="18a2d89aa40cfde56edc4ff7f64e1ab696ea8e939680ba7f979507aea578cc0e" exitCode=0 Dec 02 07:44:54 crc kubenswrapper[4895]: I1202 07:44:54.833611 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-rbvw7" event={"ID":"5b232bc6-67c7-4add-9057-806e74ef162e","Type":"ContainerDied","Data":"18a2d89aa40cfde56edc4ff7f64e1ab696ea8e939680ba7f979507aea578cc0e"} Dec 02 07:44:55 crc kubenswrapper[4895]: I1202 07:44:55.175366 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1073f594-fe1e-47df-8db9-ad04fc701143" path="/var/lib/kubelet/pods/1073f594-fe1e-47df-8db9-ad04fc701143/volumes" Dec 02 07:44:55 crc kubenswrapper[4895]: I1202 07:44:55.860722 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-external-api-0" event={"ID":"5d8ca8ab-4f8e-4136-af19-c7160ee702d9","Type":"ContainerStarted","Data":"9dd2de43101bd15a63899f93fa5f4674b1e87771e1333014f5984a35d7e02ca5"} Dec 02 07:44:55 crc kubenswrapper[4895]: I1202 07:44:55.868313 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"dceee011-f12c-4dc4-8b4d-c43b09cc84ce","Type":"ContainerStarted","Data":"11f7a71e478f781683343acc1c1a870382e06f398a82941947fee61288ba8056"} Dec 02 07:44:55 crc kubenswrapper[4895]: I1202 07:44:55.871406 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-rbvw7" event={"ID":"5b232bc6-67c7-4add-9057-806e74ef162e","Type":"ContainerStarted","Data":"8d4392f2e8bcb7eff491b6e3fcb4e08ec284a1dff98eb11958465bf1f2f1121b"} Dec 02 07:44:55 crc kubenswrapper[4895]: I1202 07:44:55.872853 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-785d8bcb8c-rbvw7" Dec 02 07:44:55 crc kubenswrapper[4895]: I1202 07:44:55.912085 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-785d8bcb8c-rbvw7" podStartSLOduration=3.910898528 podStartE2EDuration="3.910898528s" podCreationTimestamp="2025-12-02 07:44:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:44:55.900978722 +0000 UTC m=+1307.071838355" watchObservedRunningTime="2025-12-02 07:44:55.910898528 +0000 UTC m=+1307.081758141" Dec 02 07:44:56 crc kubenswrapper[4895]: I1202 07:44:56.899875 4895 generic.go:334] "Generic (PLEG): container finished" podID="6eb223e5-3856-4849-881f-86683f0e8bc9" containerID="9f87395686eb4293111dd47a55d66e6fd9c827da84446d2ce43c4aa195645589" exitCode=0 Dec 02 07:44:56 crc kubenswrapper[4895]: I1202 07:44:56.899925 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-884pr" 
event={"ID":"6eb223e5-3856-4849-881f-86683f0e8bc9","Type":"ContainerDied","Data":"9f87395686eb4293111dd47a55d66e6fd9c827da84446d2ce43c4aa195645589"} Dec 02 07:44:56 crc kubenswrapper[4895]: I1202 07:44:56.911842 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"5d8ca8ab-4f8e-4136-af19-c7160ee702d9","Type":"ContainerStarted","Data":"cc01dba6a12823a786ef85742323fc19955bd6378e5e78dad0c3fa2752740ab5"} Dec 02 07:44:56 crc kubenswrapper[4895]: I1202 07:44:56.911913 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="5d8ca8ab-4f8e-4136-af19-c7160ee702d9" containerName="glance-log" containerID="cri-o://9dd2de43101bd15a63899f93fa5f4674b1e87771e1333014f5984a35d7e02ca5" gracePeriod=30 Dec 02 07:44:56 crc kubenswrapper[4895]: I1202 07:44:56.911986 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="5d8ca8ab-4f8e-4136-af19-c7160ee702d9" containerName="glance-httpd" containerID="cri-o://cc01dba6a12823a786ef85742323fc19955bd6378e5e78dad0c3fa2752740ab5" gracePeriod=30 Dec 02 07:44:56 crc kubenswrapper[4895]: I1202 07:44:56.918321 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"dceee011-f12c-4dc4-8b4d-c43b09cc84ce","Type":"ContainerStarted","Data":"d1738591f9bcf10b1e1f9bb09d896c07bc0aa8bffe1f03f3eca74fa8b486f744"} Dec 02 07:44:56 crc kubenswrapper[4895]: I1202 07:44:56.918418 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="dceee011-f12c-4dc4-8b4d-c43b09cc84ce" containerName="glance-log" containerID="cri-o://11f7a71e478f781683343acc1c1a870382e06f398a82941947fee61288ba8056" gracePeriod=30 Dec 02 07:44:56 crc kubenswrapper[4895]: I1202 07:44:56.918482 4895 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/glance-default-internal-api-0" podUID="dceee011-f12c-4dc4-8b4d-c43b09cc84ce" containerName="glance-httpd" containerID="cri-o://d1738591f9bcf10b1e1f9bb09d896c07bc0aa8bffe1f03f3eca74fa8b486f744" gracePeriod=30 Dec 02 07:44:56 crc kubenswrapper[4895]: I1202 07:44:56.962915 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.962891701 podStartE2EDuration="4.962891701s" podCreationTimestamp="2025-12-02 07:44:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:44:56.950541279 +0000 UTC m=+1308.121400912" watchObservedRunningTime="2025-12-02 07:44:56.962891701 +0000 UTC m=+1308.133751314" Dec 02 07:44:56 crc kubenswrapper[4895]: I1202 07:44:56.974056 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.974032444 podStartE2EDuration="5.974032444s" podCreationTimestamp="2025-12-02 07:44:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:44:56.973014992 +0000 UTC m=+1308.143874605" watchObservedRunningTime="2025-12-02 07:44:56.974032444 +0000 UTC m=+1308.144892077" Dec 02 07:44:57 crc kubenswrapper[4895]: I1202 07:44:57.938049 4895 generic.go:334] "Generic (PLEG): container finished" podID="5d8ca8ab-4f8e-4136-af19-c7160ee702d9" containerID="cc01dba6a12823a786ef85742323fc19955bd6378e5e78dad0c3fa2752740ab5" exitCode=143 Dec 02 07:44:57 crc kubenswrapper[4895]: I1202 07:44:57.938338 4895 generic.go:334] "Generic (PLEG): container finished" podID="5d8ca8ab-4f8e-4136-af19-c7160ee702d9" containerID="9dd2de43101bd15a63899f93fa5f4674b1e87771e1333014f5984a35d7e02ca5" exitCode=143 Dec 02 07:44:57 crc kubenswrapper[4895]: I1202 07:44:57.938131 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-external-api-0" event={"ID":"5d8ca8ab-4f8e-4136-af19-c7160ee702d9","Type":"ContainerDied","Data":"cc01dba6a12823a786ef85742323fc19955bd6378e5e78dad0c3fa2752740ab5"} Dec 02 07:44:57 crc kubenswrapper[4895]: I1202 07:44:57.938428 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"5d8ca8ab-4f8e-4136-af19-c7160ee702d9","Type":"ContainerDied","Data":"9dd2de43101bd15a63899f93fa5f4674b1e87771e1333014f5984a35d7e02ca5"} Dec 02 07:44:57 crc kubenswrapper[4895]: I1202 07:44:57.942444 4895 generic.go:334] "Generic (PLEG): container finished" podID="dceee011-f12c-4dc4-8b4d-c43b09cc84ce" containerID="d1738591f9bcf10b1e1f9bb09d896c07bc0aa8bffe1f03f3eca74fa8b486f744" exitCode=143 Dec 02 07:44:57 crc kubenswrapper[4895]: I1202 07:44:57.942466 4895 generic.go:334] "Generic (PLEG): container finished" podID="dceee011-f12c-4dc4-8b4d-c43b09cc84ce" containerID="11f7a71e478f781683343acc1c1a870382e06f398a82941947fee61288ba8056" exitCode=143 Dec 02 07:44:57 crc kubenswrapper[4895]: I1202 07:44:57.942521 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"dceee011-f12c-4dc4-8b4d-c43b09cc84ce","Type":"ContainerDied","Data":"d1738591f9bcf10b1e1f9bb09d896c07bc0aa8bffe1f03f3eca74fa8b486f744"} Dec 02 07:44:57 crc kubenswrapper[4895]: I1202 07:44:57.942546 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"dceee011-f12c-4dc4-8b4d-c43b09cc84ce","Type":"ContainerDied","Data":"11f7a71e478f781683343acc1c1a870382e06f398a82941947fee61288ba8056"} Dec 02 07:45:00 crc kubenswrapper[4895]: I1202 07:45:00.146631 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411025-s8zs7"] Dec 02 07:45:00 crc kubenswrapper[4895]: E1202 07:45:00.150109 4895 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="1073f594-fe1e-47df-8db9-ad04fc701143" containerName="init" Dec 02 07:45:00 crc kubenswrapper[4895]: I1202 07:45:00.150140 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="1073f594-fe1e-47df-8db9-ad04fc701143" containerName="init" Dec 02 07:45:00 crc kubenswrapper[4895]: I1202 07:45:00.150392 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="1073f594-fe1e-47df-8db9-ad04fc701143" containerName="init" Dec 02 07:45:00 crc kubenswrapper[4895]: I1202 07:45:00.151181 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411025-s8zs7" Dec 02 07:45:00 crc kubenswrapper[4895]: I1202 07:45:00.153772 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 02 07:45:00 crc kubenswrapper[4895]: I1202 07:45:00.153915 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 02 07:45:00 crc kubenswrapper[4895]: I1202 07:45:00.156828 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411025-s8zs7"] Dec 02 07:45:00 crc kubenswrapper[4895]: I1202 07:45:00.229346 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d5a909a9-9821-48da-8599-3162f92f4202-config-volume\") pod \"collect-profiles-29411025-s8zs7\" (UID: \"d5a909a9-9821-48da-8599-3162f92f4202\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411025-s8zs7" Dec 02 07:45:00 crc kubenswrapper[4895]: I1202 07:45:00.229410 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d5a909a9-9821-48da-8599-3162f92f4202-secret-volume\") pod \"collect-profiles-29411025-s8zs7\" 
(UID: \"d5a909a9-9821-48da-8599-3162f92f4202\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411025-s8zs7" Dec 02 07:45:00 crc kubenswrapper[4895]: I1202 07:45:00.229673 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzm8k\" (UniqueName: \"kubernetes.io/projected/d5a909a9-9821-48da-8599-3162f92f4202-kube-api-access-qzm8k\") pod \"collect-profiles-29411025-s8zs7\" (UID: \"d5a909a9-9821-48da-8599-3162f92f4202\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411025-s8zs7" Dec 02 07:45:00 crc kubenswrapper[4895]: I1202 07:45:00.333218 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qzm8k\" (UniqueName: \"kubernetes.io/projected/d5a909a9-9821-48da-8599-3162f92f4202-kube-api-access-qzm8k\") pod \"collect-profiles-29411025-s8zs7\" (UID: \"d5a909a9-9821-48da-8599-3162f92f4202\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411025-s8zs7" Dec 02 07:45:00 crc kubenswrapper[4895]: I1202 07:45:00.333532 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d5a909a9-9821-48da-8599-3162f92f4202-config-volume\") pod \"collect-profiles-29411025-s8zs7\" (UID: \"d5a909a9-9821-48da-8599-3162f92f4202\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411025-s8zs7" Dec 02 07:45:00 crc kubenswrapper[4895]: I1202 07:45:00.333564 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d5a909a9-9821-48da-8599-3162f92f4202-secret-volume\") pod \"collect-profiles-29411025-s8zs7\" (UID: \"d5a909a9-9821-48da-8599-3162f92f4202\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411025-s8zs7" Dec 02 07:45:00 crc kubenswrapper[4895]: I1202 07:45:00.334773 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-volume\" (UniqueName: \"kubernetes.io/configmap/d5a909a9-9821-48da-8599-3162f92f4202-config-volume\") pod \"collect-profiles-29411025-s8zs7\" (UID: \"d5a909a9-9821-48da-8599-3162f92f4202\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411025-s8zs7" Dec 02 07:45:00 crc kubenswrapper[4895]: I1202 07:45:00.357117 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d5a909a9-9821-48da-8599-3162f92f4202-secret-volume\") pod \"collect-profiles-29411025-s8zs7\" (UID: \"d5a909a9-9821-48da-8599-3162f92f4202\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411025-s8zs7" Dec 02 07:45:00 crc kubenswrapper[4895]: I1202 07:45:00.359975 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qzm8k\" (UniqueName: \"kubernetes.io/projected/d5a909a9-9821-48da-8599-3162f92f4202-kube-api-access-qzm8k\") pod \"collect-profiles-29411025-s8zs7\" (UID: \"d5a909a9-9821-48da-8599-3162f92f4202\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411025-s8zs7" Dec 02 07:45:00 crc kubenswrapper[4895]: I1202 07:45:00.483910 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411025-s8zs7" Dec 02 07:45:02 crc kubenswrapper[4895]: I1202 07:45:02.452931 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-785d8bcb8c-rbvw7" Dec 02 07:45:02 crc kubenswrapper[4895]: I1202 07:45:02.567391 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-hdjr4"] Dec 02 07:45:02 crc kubenswrapper[4895]: I1202 07:45:02.567773 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-764c5664d7-hdjr4" podUID="5977660a-cdf5-4a65-8b8a-bbd944ec5736" containerName="dnsmasq-dns" containerID="cri-o://9a905c0785603a08aafb95096d190ca7e7d02066b5f4d4bcda08e56d643dd1a7" gracePeriod=10 Dec 02 07:45:03 crc kubenswrapper[4895]: I1202 07:45:03.012352 4895 generic.go:334] "Generic (PLEG): container finished" podID="5977660a-cdf5-4a65-8b8a-bbd944ec5736" containerID="9a905c0785603a08aafb95096d190ca7e7d02066b5f4d4bcda08e56d643dd1a7" exitCode=0 Dec 02 07:45:03 crc kubenswrapper[4895]: I1202 07:45:03.012440 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-hdjr4" event={"ID":"5977660a-cdf5-4a65-8b8a-bbd944ec5736","Type":"ContainerDied","Data":"9a905c0785603a08aafb95096d190ca7e7d02066b5f4d4bcda08e56d643dd1a7"} Dec 02 07:45:04 crc kubenswrapper[4895]: I1202 07:45:04.485213 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-884pr" Dec 02 07:45:04 crc kubenswrapper[4895]: I1202 07:45:04.651450 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6eb223e5-3856-4849-881f-86683f0e8bc9-combined-ca-bundle\") pod \"6eb223e5-3856-4849-881f-86683f0e8bc9\" (UID: \"6eb223e5-3856-4849-881f-86683f0e8bc9\") " Dec 02 07:45:04 crc kubenswrapper[4895]: I1202 07:45:04.651540 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bmkkp\" (UniqueName: \"kubernetes.io/projected/6eb223e5-3856-4849-881f-86683f0e8bc9-kube-api-access-bmkkp\") pod \"6eb223e5-3856-4849-881f-86683f0e8bc9\" (UID: \"6eb223e5-3856-4849-881f-86683f0e8bc9\") " Dec 02 07:45:04 crc kubenswrapper[4895]: I1202 07:45:04.651704 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6eb223e5-3856-4849-881f-86683f0e8bc9-scripts\") pod \"6eb223e5-3856-4849-881f-86683f0e8bc9\" (UID: \"6eb223e5-3856-4849-881f-86683f0e8bc9\") " Dec 02 07:45:04 crc kubenswrapper[4895]: I1202 07:45:04.651726 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6eb223e5-3856-4849-881f-86683f0e8bc9-config-data\") pod \"6eb223e5-3856-4849-881f-86683f0e8bc9\" (UID: \"6eb223e5-3856-4849-881f-86683f0e8bc9\") " Dec 02 07:45:04 crc kubenswrapper[4895]: I1202 07:45:04.651767 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6eb223e5-3856-4849-881f-86683f0e8bc9-fernet-keys\") pod \"6eb223e5-3856-4849-881f-86683f0e8bc9\" (UID: \"6eb223e5-3856-4849-881f-86683f0e8bc9\") " Dec 02 07:45:04 crc kubenswrapper[4895]: I1202 07:45:04.651817 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/6eb223e5-3856-4849-881f-86683f0e8bc9-credential-keys\") pod \"6eb223e5-3856-4849-881f-86683f0e8bc9\" (UID: \"6eb223e5-3856-4849-881f-86683f0e8bc9\") " Dec 02 07:45:04 crc kubenswrapper[4895]: I1202 07:45:04.659632 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6eb223e5-3856-4849-881f-86683f0e8bc9-scripts" (OuterVolumeSpecName: "scripts") pod "6eb223e5-3856-4849-881f-86683f0e8bc9" (UID: "6eb223e5-3856-4849-881f-86683f0e8bc9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:45:04 crc kubenswrapper[4895]: I1202 07:45:04.660013 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6eb223e5-3856-4849-881f-86683f0e8bc9-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "6eb223e5-3856-4849-881f-86683f0e8bc9" (UID: "6eb223e5-3856-4849-881f-86683f0e8bc9"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:45:04 crc kubenswrapper[4895]: I1202 07:45:04.660302 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6eb223e5-3856-4849-881f-86683f0e8bc9-kube-api-access-bmkkp" (OuterVolumeSpecName: "kube-api-access-bmkkp") pod "6eb223e5-3856-4849-881f-86683f0e8bc9" (UID: "6eb223e5-3856-4849-881f-86683f0e8bc9"). InnerVolumeSpecName "kube-api-access-bmkkp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:45:04 crc kubenswrapper[4895]: I1202 07:45:04.660327 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6eb223e5-3856-4849-881f-86683f0e8bc9-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "6eb223e5-3856-4849-881f-86683f0e8bc9" (UID: "6eb223e5-3856-4849-881f-86683f0e8bc9"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:45:04 crc kubenswrapper[4895]: I1202 07:45:04.682725 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6eb223e5-3856-4849-881f-86683f0e8bc9-config-data" (OuterVolumeSpecName: "config-data") pod "6eb223e5-3856-4849-881f-86683f0e8bc9" (UID: "6eb223e5-3856-4849-881f-86683f0e8bc9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:45:04 crc kubenswrapper[4895]: I1202 07:45:04.683354 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6eb223e5-3856-4849-881f-86683f0e8bc9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6eb223e5-3856-4849-881f-86683f0e8bc9" (UID: "6eb223e5-3856-4849-881f-86683f0e8bc9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:45:04 crc kubenswrapper[4895]: I1202 07:45:04.754330 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6eb223e5-3856-4849-881f-86683f0e8bc9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 07:45:04 crc kubenswrapper[4895]: I1202 07:45:04.754371 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bmkkp\" (UniqueName: \"kubernetes.io/projected/6eb223e5-3856-4849-881f-86683f0e8bc9-kube-api-access-bmkkp\") on node \"crc\" DevicePath \"\"" Dec 02 07:45:04 crc kubenswrapper[4895]: I1202 07:45:04.754384 4895 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6eb223e5-3856-4849-881f-86683f0e8bc9-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 07:45:04 crc kubenswrapper[4895]: I1202 07:45:04.754396 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6eb223e5-3856-4849-881f-86683f0e8bc9-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 
07:45:04 crc kubenswrapper[4895]: I1202 07:45:04.754405 4895 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6eb223e5-3856-4849-881f-86683f0e8bc9-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 02 07:45:04 crc kubenswrapper[4895]: I1202 07:45:04.754414 4895 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6eb223e5-3856-4849-881f-86683f0e8bc9-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 02 07:45:04 crc kubenswrapper[4895]: I1202 07:45:04.936965 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 02 07:45:05 crc kubenswrapper[4895]: I1202 07:45:05.034677 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 02 07:45:05 crc kubenswrapper[4895]: I1202 07:45:05.034671 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"dceee011-f12c-4dc4-8b4d-c43b09cc84ce","Type":"ContainerDied","Data":"e6cab384060dcdd4ea8d71e292ad1a4275d86a538e35c4a80cb1756fb8c87595"} Dec 02 07:45:05 crc kubenswrapper[4895]: I1202 07:45:05.034969 4895 scope.go:117] "RemoveContainer" containerID="d1738591f9bcf10b1e1f9bb09d896c07bc0aa8bffe1f03f3eca74fa8b486f744" Dec 02 07:45:05 crc kubenswrapper[4895]: I1202 07:45:05.040040 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-884pr" event={"ID":"6eb223e5-3856-4849-881f-86683f0e8bc9","Type":"ContainerDied","Data":"f3e14c6ac80bbb7fe62149eabf5b40522dc86ae5cd5c7e7fbf4a4eef42c35750"} Dec 02 07:45:05 crc kubenswrapper[4895]: I1202 07:45:05.040084 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f3e14c6ac80bbb7fe62149eabf5b40522dc86ae5cd5c7e7fbf4a4eef42c35750" Dec 02 07:45:05 crc kubenswrapper[4895]: I1202 07:45:05.040156 4895 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-884pr" Dec 02 07:45:05 crc kubenswrapper[4895]: I1202 07:45:05.065006 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dceee011-f12c-4dc4-8b4d-c43b09cc84ce-logs\") pod \"dceee011-f12c-4dc4-8b4d-c43b09cc84ce\" (UID: \"dceee011-f12c-4dc4-8b4d-c43b09cc84ce\") " Dec 02 07:45:05 crc kubenswrapper[4895]: I1202 07:45:05.065093 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"dceee011-f12c-4dc4-8b4d-c43b09cc84ce\" (UID: \"dceee011-f12c-4dc4-8b4d-c43b09cc84ce\") " Dec 02 07:45:05 crc kubenswrapper[4895]: I1202 07:45:05.065217 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dceee011-f12c-4dc4-8b4d-c43b09cc84ce-combined-ca-bundle\") pod \"dceee011-f12c-4dc4-8b4d-c43b09cc84ce\" (UID: \"dceee011-f12c-4dc4-8b4d-c43b09cc84ce\") " Dec 02 07:45:05 crc kubenswrapper[4895]: I1202 07:45:05.065292 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dceee011-f12c-4dc4-8b4d-c43b09cc84ce-scripts\") pod \"dceee011-f12c-4dc4-8b4d-c43b09cc84ce\" (UID: \"dceee011-f12c-4dc4-8b4d-c43b09cc84ce\") " Dec 02 07:45:05 crc kubenswrapper[4895]: I1202 07:45:05.065338 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dceee011-f12c-4dc4-8b4d-c43b09cc84ce-httpd-run\") pod \"dceee011-f12c-4dc4-8b4d-c43b09cc84ce\" (UID: \"dceee011-f12c-4dc4-8b4d-c43b09cc84ce\") " Dec 02 07:45:05 crc kubenswrapper[4895]: I1202 07:45:05.065374 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sztjd\" (UniqueName: 
\"kubernetes.io/projected/dceee011-f12c-4dc4-8b4d-c43b09cc84ce-kube-api-access-sztjd\") pod \"dceee011-f12c-4dc4-8b4d-c43b09cc84ce\" (UID: \"dceee011-f12c-4dc4-8b4d-c43b09cc84ce\") " Dec 02 07:45:05 crc kubenswrapper[4895]: I1202 07:45:05.065406 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dceee011-f12c-4dc4-8b4d-c43b09cc84ce-config-data\") pod \"dceee011-f12c-4dc4-8b4d-c43b09cc84ce\" (UID: \"dceee011-f12c-4dc4-8b4d-c43b09cc84ce\") " Dec 02 07:45:05 crc kubenswrapper[4895]: I1202 07:45:05.068067 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dceee011-f12c-4dc4-8b4d-c43b09cc84ce-logs" (OuterVolumeSpecName: "logs") pod "dceee011-f12c-4dc4-8b4d-c43b09cc84ce" (UID: "dceee011-f12c-4dc4-8b4d-c43b09cc84ce"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:45:05 crc kubenswrapper[4895]: I1202 07:45:05.068501 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dceee011-f12c-4dc4-8b4d-c43b09cc84ce-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "dceee011-f12c-4dc4-8b4d-c43b09cc84ce" (UID: "dceee011-f12c-4dc4-8b4d-c43b09cc84ce"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:45:05 crc kubenswrapper[4895]: I1202 07:45:05.070875 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dceee011-f12c-4dc4-8b4d-c43b09cc84ce-scripts" (OuterVolumeSpecName: "scripts") pod "dceee011-f12c-4dc4-8b4d-c43b09cc84ce" (UID: "dceee011-f12c-4dc4-8b4d-c43b09cc84ce"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:45:05 crc kubenswrapper[4895]: I1202 07:45:05.072885 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "dceee011-f12c-4dc4-8b4d-c43b09cc84ce" (UID: "dceee011-f12c-4dc4-8b4d-c43b09cc84ce"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 02 07:45:05 crc kubenswrapper[4895]: I1202 07:45:05.077782 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dceee011-f12c-4dc4-8b4d-c43b09cc84ce-kube-api-access-sztjd" (OuterVolumeSpecName: "kube-api-access-sztjd") pod "dceee011-f12c-4dc4-8b4d-c43b09cc84ce" (UID: "dceee011-f12c-4dc4-8b4d-c43b09cc84ce"). InnerVolumeSpecName "kube-api-access-sztjd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:45:05 crc kubenswrapper[4895]: I1202 07:45:05.100910 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dceee011-f12c-4dc4-8b4d-c43b09cc84ce-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dceee011-f12c-4dc4-8b4d-c43b09cc84ce" (UID: "dceee011-f12c-4dc4-8b4d-c43b09cc84ce"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:45:05 crc kubenswrapper[4895]: I1202 07:45:05.122731 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dceee011-f12c-4dc4-8b4d-c43b09cc84ce-config-data" (OuterVolumeSpecName: "config-data") pod "dceee011-f12c-4dc4-8b4d-c43b09cc84ce" (UID: "dceee011-f12c-4dc4-8b4d-c43b09cc84ce"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:45:05 crc kubenswrapper[4895]: I1202 07:45:05.167925 4895 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dceee011-f12c-4dc4-8b4d-c43b09cc84ce-logs\") on node \"crc\" DevicePath \"\"" Dec 02 07:45:05 crc kubenswrapper[4895]: I1202 07:45:05.168000 4895 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Dec 02 07:45:05 crc kubenswrapper[4895]: I1202 07:45:05.168018 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dceee011-f12c-4dc4-8b4d-c43b09cc84ce-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 07:45:05 crc kubenswrapper[4895]: I1202 07:45:05.168032 4895 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dceee011-f12c-4dc4-8b4d-c43b09cc84ce-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 07:45:05 crc kubenswrapper[4895]: I1202 07:45:05.168043 4895 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dceee011-f12c-4dc4-8b4d-c43b09cc84ce-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 02 07:45:05 crc kubenswrapper[4895]: I1202 07:45:05.168056 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sztjd\" (UniqueName: \"kubernetes.io/projected/dceee011-f12c-4dc4-8b4d-c43b09cc84ce-kube-api-access-sztjd\") on node \"crc\" DevicePath \"\"" Dec 02 07:45:05 crc kubenswrapper[4895]: I1202 07:45:05.168068 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dceee011-f12c-4dc4-8b4d-c43b09cc84ce-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 07:45:05 crc kubenswrapper[4895]: I1202 07:45:05.194865 4895 operation_generator.go:917] UnmountDevice succeeded 
for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Dec 02 07:45:05 crc kubenswrapper[4895]: I1202 07:45:05.270101 4895 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Dec 02 07:45:05 crc kubenswrapper[4895]: I1202 07:45:05.366157 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 02 07:45:05 crc kubenswrapper[4895]: I1202 07:45:05.377973 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 02 07:45:05 crc kubenswrapper[4895]: I1202 07:45:05.391725 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 02 07:45:05 crc kubenswrapper[4895]: E1202 07:45:05.392183 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dceee011-f12c-4dc4-8b4d-c43b09cc84ce" containerName="glance-httpd" Dec 02 07:45:05 crc kubenswrapper[4895]: I1202 07:45:05.392200 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="dceee011-f12c-4dc4-8b4d-c43b09cc84ce" containerName="glance-httpd" Dec 02 07:45:05 crc kubenswrapper[4895]: E1202 07:45:05.392237 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6eb223e5-3856-4849-881f-86683f0e8bc9" containerName="keystone-bootstrap" Dec 02 07:45:05 crc kubenswrapper[4895]: I1202 07:45:05.392244 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="6eb223e5-3856-4849-881f-86683f0e8bc9" containerName="keystone-bootstrap" Dec 02 07:45:05 crc kubenswrapper[4895]: E1202 07:45:05.392255 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dceee011-f12c-4dc4-8b4d-c43b09cc84ce" containerName="glance-log" Dec 02 07:45:05 crc kubenswrapper[4895]: I1202 07:45:05.392261 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="dceee011-f12c-4dc4-8b4d-c43b09cc84ce" 
containerName="glance-log" Dec 02 07:45:05 crc kubenswrapper[4895]: I1202 07:45:05.392480 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="dceee011-f12c-4dc4-8b4d-c43b09cc84ce" containerName="glance-log" Dec 02 07:45:05 crc kubenswrapper[4895]: I1202 07:45:05.392489 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="dceee011-f12c-4dc4-8b4d-c43b09cc84ce" containerName="glance-httpd" Dec 02 07:45:05 crc kubenswrapper[4895]: I1202 07:45:05.392515 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="6eb223e5-3856-4849-881f-86683f0e8bc9" containerName="keystone-bootstrap" Dec 02 07:45:05 crc kubenswrapper[4895]: I1202 07:45:05.393612 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 02 07:45:05 crc kubenswrapper[4895]: I1202 07:45:05.396655 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 02 07:45:05 crc kubenswrapper[4895]: I1202 07:45:05.396893 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 02 07:45:05 crc kubenswrapper[4895]: I1202 07:45:05.425325 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 02 07:45:05 crc kubenswrapper[4895]: I1202 07:45:05.473537 4895 patch_prober.go:28] interesting pod/machine-config-daemon-wfcg7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 07:45:05 crc kubenswrapper[4895]: I1202 07:45:05.473615 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 07:45:05 crc kubenswrapper[4895]: I1202 07:45:05.582549 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/21dea3da-8ebc-4b04-86fa-19a539bd6cc9-scripts\") pod \"glance-default-internal-api-0\" (UID: \"21dea3da-8ebc-4b04-86fa-19a539bd6cc9\") " pod="openstack/glance-default-internal-api-0" Dec 02 07:45:05 crc kubenswrapper[4895]: I1202 07:45:05.582623 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/21dea3da-8ebc-4b04-86fa-19a539bd6cc9-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"21dea3da-8ebc-4b04-86fa-19a539bd6cc9\") " pod="openstack/glance-default-internal-api-0" Dec 02 07:45:05 crc kubenswrapper[4895]: I1202 07:45:05.582903 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/21dea3da-8ebc-4b04-86fa-19a539bd6cc9-logs\") pod \"glance-default-internal-api-0\" (UID: \"21dea3da-8ebc-4b04-86fa-19a539bd6cc9\") " pod="openstack/glance-default-internal-api-0" Dec 02 07:45:05 crc kubenswrapper[4895]: I1202 07:45:05.582963 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/21dea3da-8ebc-4b04-86fa-19a539bd6cc9-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"21dea3da-8ebc-4b04-86fa-19a539bd6cc9\") " pod="openstack/glance-default-internal-api-0" Dec 02 07:45:05 crc kubenswrapper[4895]: I1202 07:45:05.583183 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2b2j5\" (UniqueName: \"kubernetes.io/projected/21dea3da-8ebc-4b04-86fa-19a539bd6cc9-kube-api-access-2b2j5\") pod 
\"glance-default-internal-api-0\" (UID: \"21dea3da-8ebc-4b04-86fa-19a539bd6cc9\") " pod="openstack/glance-default-internal-api-0" Dec 02 07:45:05 crc kubenswrapper[4895]: I1202 07:45:05.583252 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21dea3da-8ebc-4b04-86fa-19a539bd6cc9-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"21dea3da-8ebc-4b04-86fa-19a539bd6cc9\") " pod="openstack/glance-default-internal-api-0" Dec 02 07:45:05 crc kubenswrapper[4895]: I1202 07:45:05.583474 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"21dea3da-8ebc-4b04-86fa-19a539bd6cc9\") " pod="openstack/glance-default-internal-api-0" Dec 02 07:45:05 crc kubenswrapper[4895]: I1202 07:45:05.583554 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21dea3da-8ebc-4b04-86fa-19a539bd6cc9-config-data\") pod \"glance-default-internal-api-0\" (UID: \"21dea3da-8ebc-4b04-86fa-19a539bd6cc9\") " pod="openstack/glance-default-internal-api-0" Dec 02 07:45:05 crc kubenswrapper[4895]: I1202 07:45:05.630814 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-884pr"] Dec 02 07:45:05 crc kubenswrapper[4895]: I1202 07:45:05.640788 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-884pr"] Dec 02 07:45:05 crc kubenswrapper[4895]: I1202 07:45:05.686437 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2b2j5\" (UniqueName: \"kubernetes.io/projected/21dea3da-8ebc-4b04-86fa-19a539bd6cc9-kube-api-access-2b2j5\") pod \"glance-default-internal-api-0\" (UID: 
\"21dea3da-8ebc-4b04-86fa-19a539bd6cc9\") " pod="openstack/glance-default-internal-api-0" Dec 02 07:45:05 crc kubenswrapper[4895]: I1202 07:45:05.686503 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21dea3da-8ebc-4b04-86fa-19a539bd6cc9-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"21dea3da-8ebc-4b04-86fa-19a539bd6cc9\") " pod="openstack/glance-default-internal-api-0" Dec 02 07:45:05 crc kubenswrapper[4895]: I1202 07:45:05.686584 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"21dea3da-8ebc-4b04-86fa-19a539bd6cc9\") " pod="openstack/glance-default-internal-api-0" Dec 02 07:45:05 crc kubenswrapper[4895]: I1202 07:45:05.686619 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21dea3da-8ebc-4b04-86fa-19a539bd6cc9-config-data\") pod \"glance-default-internal-api-0\" (UID: \"21dea3da-8ebc-4b04-86fa-19a539bd6cc9\") " pod="openstack/glance-default-internal-api-0" Dec 02 07:45:05 crc kubenswrapper[4895]: I1202 07:45:05.686654 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/21dea3da-8ebc-4b04-86fa-19a539bd6cc9-scripts\") pod \"glance-default-internal-api-0\" (UID: \"21dea3da-8ebc-4b04-86fa-19a539bd6cc9\") " pod="openstack/glance-default-internal-api-0" Dec 02 07:45:05 crc kubenswrapper[4895]: I1202 07:45:05.686670 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/21dea3da-8ebc-4b04-86fa-19a539bd6cc9-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"21dea3da-8ebc-4b04-86fa-19a539bd6cc9\") " pod="openstack/glance-default-internal-api-0" Dec 02 
07:45:05 crc kubenswrapper[4895]: I1202 07:45:05.686714 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/21dea3da-8ebc-4b04-86fa-19a539bd6cc9-logs\") pod \"glance-default-internal-api-0\" (UID: \"21dea3da-8ebc-4b04-86fa-19a539bd6cc9\") " pod="openstack/glance-default-internal-api-0" Dec 02 07:45:05 crc kubenswrapper[4895]: I1202 07:45:05.686751 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/21dea3da-8ebc-4b04-86fa-19a539bd6cc9-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"21dea3da-8ebc-4b04-86fa-19a539bd6cc9\") " pod="openstack/glance-default-internal-api-0" Dec 02 07:45:05 crc kubenswrapper[4895]: I1202 07:45:05.686949 4895 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"21dea3da-8ebc-4b04-86fa-19a539bd6cc9\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-internal-api-0" Dec 02 07:45:05 crc kubenswrapper[4895]: I1202 07:45:05.690305 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/21dea3da-8ebc-4b04-86fa-19a539bd6cc9-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"21dea3da-8ebc-4b04-86fa-19a539bd6cc9\") " pod="openstack/glance-default-internal-api-0" Dec 02 07:45:05 crc kubenswrapper[4895]: I1202 07:45:05.690600 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/21dea3da-8ebc-4b04-86fa-19a539bd6cc9-logs\") pod \"glance-default-internal-api-0\" (UID: \"21dea3da-8ebc-4b04-86fa-19a539bd6cc9\") " pod="openstack/glance-default-internal-api-0" Dec 02 07:45:05 crc kubenswrapper[4895]: I1202 07:45:05.694533 4895 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/21dea3da-8ebc-4b04-86fa-19a539bd6cc9-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"21dea3da-8ebc-4b04-86fa-19a539bd6cc9\") " pod="openstack/glance-default-internal-api-0" Dec 02 07:45:05 crc kubenswrapper[4895]: I1202 07:45:05.694707 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21dea3da-8ebc-4b04-86fa-19a539bd6cc9-config-data\") pod \"glance-default-internal-api-0\" (UID: \"21dea3da-8ebc-4b04-86fa-19a539bd6cc9\") " pod="openstack/glance-default-internal-api-0" Dec 02 07:45:05 crc kubenswrapper[4895]: I1202 07:45:05.697692 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/21dea3da-8ebc-4b04-86fa-19a539bd6cc9-scripts\") pod \"glance-default-internal-api-0\" (UID: \"21dea3da-8ebc-4b04-86fa-19a539bd6cc9\") " pod="openstack/glance-default-internal-api-0" Dec 02 07:45:05 crc kubenswrapper[4895]: I1202 07:45:05.700083 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21dea3da-8ebc-4b04-86fa-19a539bd6cc9-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"21dea3da-8ebc-4b04-86fa-19a539bd6cc9\") " pod="openstack/glance-default-internal-api-0" Dec 02 07:45:05 crc kubenswrapper[4895]: I1202 07:45:05.713251 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2b2j5\" (UniqueName: \"kubernetes.io/projected/21dea3da-8ebc-4b04-86fa-19a539bd6cc9-kube-api-access-2b2j5\") pod \"glance-default-internal-api-0\" (UID: \"21dea3da-8ebc-4b04-86fa-19a539bd6cc9\") " pod="openstack/glance-default-internal-api-0" Dec 02 07:45:05 crc kubenswrapper[4895]: I1202 07:45:05.718704 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-xggw9"] Dec 02 07:45:05 crc 
kubenswrapper[4895]: I1202 07:45:05.720401 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-xggw9" Dec 02 07:45:05 crc kubenswrapper[4895]: I1202 07:45:05.723680 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-qm9nx" Dec 02 07:45:05 crc kubenswrapper[4895]: I1202 07:45:05.723958 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 02 07:45:05 crc kubenswrapper[4895]: I1202 07:45:05.724114 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 02 07:45:05 crc kubenswrapper[4895]: I1202 07:45:05.725146 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 02 07:45:05 crc kubenswrapper[4895]: I1202 07:45:05.725406 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 02 07:45:05 crc kubenswrapper[4895]: I1202 07:45:05.729152 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-xggw9"] Dec 02 07:45:05 crc kubenswrapper[4895]: I1202 07:45:05.741705 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"21dea3da-8ebc-4b04-86fa-19a539bd6cc9\") " pod="openstack/glance-default-internal-api-0" Dec 02 07:45:05 crc kubenswrapper[4895]: I1202 07:45:05.773624 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 02 07:45:05 crc kubenswrapper[4895]: I1202 07:45:05.895084 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7d15bc9-7912-4eab-9c22-23630caecbb4-combined-ca-bundle\") pod \"keystone-bootstrap-xggw9\" (UID: \"e7d15bc9-7912-4eab-9c22-23630caecbb4\") " pod="openstack/keystone-bootstrap-xggw9" Dec 02 07:45:05 crc kubenswrapper[4895]: I1202 07:45:05.895197 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e7d15bc9-7912-4eab-9c22-23630caecbb4-credential-keys\") pod \"keystone-bootstrap-xggw9\" (UID: \"e7d15bc9-7912-4eab-9c22-23630caecbb4\") " pod="openstack/keystone-bootstrap-xggw9" Dec 02 07:45:05 crc kubenswrapper[4895]: I1202 07:45:05.895267 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7d15bc9-7912-4eab-9c22-23630caecbb4-scripts\") pod \"keystone-bootstrap-xggw9\" (UID: \"e7d15bc9-7912-4eab-9c22-23630caecbb4\") " pod="openstack/keystone-bootstrap-xggw9" Dec 02 07:45:05 crc kubenswrapper[4895]: I1202 07:45:05.895309 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e7d15bc9-7912-4eab-9c22-23630caecbb4-fernet-keys\") pod \"keystone-bootstrap-xggw9\" (UID: \"e7d15bc9-7912-4eab-9c22-23630caecbb4\") " pod="openstack/keystone-bootstrap-xggw9" Dec 02 07:45:05 crc kubenswrapper[4895]: I1202 07:45:05.895375 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5ql8\" (UniqueName: \"kubernetes.io/projected/e7d15bc9-7912-4eab-9c22-23630caecbb4-kube-api-access-m5ql8\") pod \"keystone-bootstrap-xggw9\" (UID: 
\"e7d15bc9-7912-4eab-9c22-23630caecbb4\") " pod="openstack/keystone-bootstrap-xggw9" Dec 02 07:45:05 crc kubenswrapper[4895]: I1202 07:45:05.895409 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7d15bc9-7912-4eab-9c22-23630caecbb4-config-data\") pod \"keystone-bootstrap-xggw9\" (UID: \"e7d15bc9-7912-4eab-9c22-23630caecbb4\") " pod="openstack/keystone-bootstrap-xggw9" Dec 02 07:45:05 crc kubenswrapper[4895]: I1202 07:45:05.997435 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5ql8\" (UniqueName: \"kubernetes.io/projected/e7d15bc9-7912-4eab-9c22-23630caecbb4-kube-api-access-m5ql8\") pod \"keystone-bootstrap-xggw9\" (UID: \"e7d15bc9-7912-4eab-9c22-23630caecbb4\") " pod="openstack/keystone-bootstrap-xggw9" Dec 02 07:45:05 crc kubenswrapper[4895]: I1202 07:45:05.997535 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7d15bc9-7912-4eab-9c22-23630caecbb4-config-data\") pod \"keystone-bootstrap-xggw9\" (UID: \"e7d15bc9-7912-4eab-9c22-23630caecbb4\") " pod="openstack/keystone-bootstrap-xggw9" Dec 02 07:45:05 crc kubenswrapper[4895]: I1202 07:45:05.997585 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7d15bc9-7912-4eab-9c22-23630caecbb4-combined-ca-bundle\") pod \"keystone-bootstrap-xggw9\" (UID: \"e7d15bc9-7912-4eab-9c22-23630caecbb4\") " pod="openstack/keystone-bootstrap-xggw9" Dec 02 07:45:05 crc kubenswrapper[4895]: I1202 07:45:05.997679 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e7d15bc9-7912-4eab-9c22-23630caecbb4-credential-keys\") pod \"keystone-bootstrap-xggw9\" (UID: \"e7d15bc9-7912-4eab-9c22-23630caecbb4\") " 
pod="openstack/keystone-bootstrap-xggw9" Dec 02 07:45:05 crc kubenswrapper[4895]: I1202 07:45:05.997782 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7d15bc9-7912-4eab-9c22-23630caecbb4-scripts\") pod \"keystone-bootstrap-xggw9\" (UID: \"e7d15bc9-7912-4eab-9c22-23630caecbb4\") " pod="openstack/keystone-bootstrap-xggw9" Dec 02 07:45:05 crc kubenswrapper[4895]: I1202 07:45:05.997855 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e7d15bc9-7912-4eab-9c22-23630caecbb4-fernet-keys\") pod \"keystone-bootstrap-xggw9\" (UID: \"e7d15bc9-7912-4eab-9c22-23630caecbb4\") " pod="openstack/keystone-bootstrap-xggw9" Dec 02 07:45:06 crc kubenswrapper[4895]: I1202 07:45:06.002837 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e7d15bc9-7912-4eab-9c22-23630caecbb4-credential-keys\") pod \"keystone-bootstrap-xggw9\" (UID: \"e7d15bc9-7912-4eab-9c22-23630caecbb4\") " pod="openstack/keystone-bootstrap-xggw9" Dec 02 07:45:06 crc kubenswrapper[4895]: I1202 07:45:06.003932 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e7d15bc9-7912-4eab-9c22-23630caecbb4-fernet-keys\") pod \"keystone-bootstrap-xggw9\" (UID: \"e7d15bc9-7912-4eab-9c22-23630caecbb4\") " pod="openstack/keystone-bootstrap-xggw9" Dec 02 07:45:06 crc kubenswrapper[4895]: I1202 07:45:06.004473 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7d15bc9-7912-4eab-9c22-23630caecbb4-config-data\") pod \"keystone-bootstrap-xggw9\" (UID: \"e7d15bc9-7912-4eab-9c22-23630caecbb4\") " pod="openstack/keystone-bootstrap-xggw9" Dec 02 07:45:06 crc kubenswrapper[4895]: I1202 07:45:06.007884 4895 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7d15bc9-7912-4eab-9c22-23630caecbb4-combined-ca-bundle\") pod \"keystone-bootstrap-xggw9\" (UID: \"e7d15bc9-7912-4eab-9c22-23630caecbb4\") " pod="openstack/keystone-bootstrap-xggw9" Dec 02 07:45:06 crc kubenswrapper[4895]: I1202 07:45:06.030718 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7d15bc9-7912-4eab-9c22-23630caecbb4-scripts\") pod \"keystone-bootstrap-xggw9\" (UID: \"e7d15bc9-7912-4eab-9c22-23630caecbb4\") " pod="openstack/keystone-bootstrap-xggw9" Dec 02 07:45:06 crc kubenswrapper[4895]: I1202 07:45:06.052116 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5ql8\" (UniqueName: \"kubernetes.io/projected/e7d15bc9-7912-4eab-9c22-23630caecbb4-kube-api-access-m5ql8\") pod \"keystone-bootstrap-xggw9\" (UID: \"e7d15bc9-7912-4eab-9c22-23630caecbb4\") " pod="openstack/keystone-bootstrap-xggw9" Dec 02 07:45:06 crc kubenswrapper[4895]: I1202 07:45:06.133264 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-xggw9" Dec 02 07:45:07 crc kubenswrapper[4895]: I1202 07:45:07.154433 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6eb223e5-3856-4849-881f-86683f0e8bc9" path="/var/lib/kubelet/pods/6eb223e5-3856-4849-881f-86683f0e8bc9/volumes" Dec 02 07:45:07 crc kubenswrapper[4895]: I1202 07:45:07.156727 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dceee011-f12c-4dc4-8b4d-c43b09cc84ce" path="/var/lib/kubelet/pods/dceee011-f12c-4dc4-8b4d-c43b09cc84ce/volumes" Dec 02 07:45:11 crc kubenswrapper[4895]: I1202 07:45:11.968613 4895 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-764c5664d7-hdjr4" podUID="5977660a-cdf5-4a65-8b8a-bbd944ec5736" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.131:5353: i/o timeout" Dec 02 07:45:13 crc kubenswrapper[4895]: E1202 07:45:13.793289 4895 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Dec 02 07:45:13 crc kubenswrapper[4895]: E1202 07:45:13.794199 4895 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db 
upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vf6nh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-xdfqx_openstack(96ece5f3-3dc5-41db-a8e9-37e6f9054dd8): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 07:45:13 crc kubenswrapper[4895]: E1202 07:45:13.795398 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-xdfqx" 
podUID="96ece5f3-3dc5-41db-a8e9-37e6f9054dd8" Dec 02 07:45:13 crc kubenswrapper[4895]: I1202 07:45:13.905301 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-hdjr4" Dec 02 07:45:13 crc kubenswrapper[4895]: I1202 07:45:13.932638 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 02 07:45:13 crc kubenswrapper[4895]: I1202 07:45:13.981805 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5977660a-cdf5-4a65-8b8a-bbd944ec5736-dns-svc\") pod \"5977660a-cdf5-4a65-8b8a-bbd944ec5736\" (UID: \"5977660a-cdf5-4a65-8b8a-bbd944ec5736\") " Dec 02 07:45:13 crc kubenswrapper[4895]: I1202 07:45:13.981881 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d8ca8ab-4f8e-4136-af19-c7160ee702d9-config-data\") pod \"5d8ca8ab-4f8e-4136-af19-c7160ee702d9\" (UID: \"5d8ca8ab-4f8e-4136-af19-c7160ee702d9\") " Dec 02 07:45:13 crc kubenswrapper[4895]: I1202 07:45:13.981908 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5d8ca8ab-4f8e-4136-af19-c7160ee702d9-httpd-run\") pod \"5d8ca8ab-4f8e-4136-af19-c7160ee702d9\" (UID: \"5d8ca8ab-4f8e-4136-af19-c7160ee702d9\") " Dec 02 07:45:13 crc kubenswrapper[4895]: I1202 07:45:13.981930 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d8ca8ab-4f8e-4136-af19-c7160ee702d9-scripts\") pod \"5d8ca8ab-4f8e-4136-af19-c7160ee702d9\" (UID: \"5d8ca8ab-4f8e-4136-af19-c7160ee702d9\") " Dec 02 07:45:13 crc kubenswrapper[4895]: I1202 07:45:13.981954 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage07-crc\") pod \"5d8ca8ab-4f8e-4136-af19-c7160ee702d9\" (UID: \"5d8ca8ab-4f8e-4136-af19-c7160ee702d9\") " Dec 02 07:45:13 crc kubenswrapper[4895]: I1202 07:45:13.981993 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d927z\" (UniqueName: \"kubernetes.io/projected/5977660a-cdf5-4a65-8b8a-bbd944ec5736-kube-api-access-d927z\") pod \"5977660a-cdf5-4a65-8b8a-bbd944ec5736\" (UID: \"5977660a-cdf5-4a65-8b8a-bbd944ec5736\") " Dec 02 07:45:13 crc kubenswrapper[4895]: I1202 07:45:13.982050 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d8ca8ab-4f8e-4136-af19-c7160ee702d9-logs\") pod \"5d8ca8ab-4f8e-4136-af19-c7160ee702d9\" (UID: \"5d8ca8ab-4f8e-4136-af19-c7160ee702d9\") " Dec 02 07:45:13 crc kubenswrapper[4895]: I1202 07:45:13.982080 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5977660a-cdf5-4a65-8b8a-bbd944ec5736-dns-swift-storage-0\") pod \"5977660a-cdf5-4a65-8b8a-bbd944ec5736\" (UID: \"5977660a-cdf5-4a65-8b8a-bbd944ec5736\") " Dec 02 07:45:13 crc kubenswrapper[4895]: I1202 07:45:13.982109 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d8ca8ab-4f8e-4136-af19-c7160ee702d9-combined-ca-bundle\") pod \"5d8ca8ab-4f8e-4136-af19-c7160ee702d9\" (UID: \"5d8ca8ab-4f8e-4136-af19-c7160ee702d9\") " Dec 02 07:45:13 crc kubenswrapper[4895]: I1202 07:45:13.983217 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d8ca8ab-4f8e-4136-af19-c7160ee702d9-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "5d8ca8ab-4f8e-4136-af19-c7160ee702d9" (UID: "5d8ca8ab-4f8e-4136-af19-c7160ee702d9"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:45:13 crc kubenswrapper[4895]: I1202 07:45:13.983445 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-26txx\" (UniqueName: \"kubernetes.io/projected/5d8ca8ab-4f8e-4136-af19-c7160ee702d9-kube-api-access-26txx\") pod \"5d8ca8ab-4f8e-4136-af19-c7160ee702d9\" (UID: \"5d8ca8ab-4f8e-4136-af19-c7160ee702d9\") " Dec 02 07:45:13 crc kubenswrapper[4895]: I1202 07:45:13.983436 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d8ca8ab-4f8e-4136-af19-c7160ee702d9-logs" (OuterVolumeSpecName: "logs") pod "5d8ca8ab-4f8e-4136-af19-c7160ee702d9" (UID: "5d8ca8ab-4f8e-4136-af19-c7160ee702d9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:45:13 crc kubenswrapper[4895]: I1202 07:45:13.983574 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5977660a-cdf5-4a65-8b8a-bbd944ec5736-ovsdbserver-nb\") pod \"5977660a-cdf5-4a65-8b8a-bbd944ec5736\" (UID: \"5977660a-cdf5-4a65-8b8a-bbd944ec5736\") " Dec 02 07:45:13 crc kubenswrapper[4895]: I1202 07:45:13.983628 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5977660a-cdf5-4a65-8b8a-bbd944ec5736-ovsdbserver-sb\") pod \"5977660a-cdf5-4a65-8b8a-bbd944ec5736\" (UID: \"5977660a-cdf5-4a65-8b8a-bbd944ec5736\") " Dec 02 07:45:13 crc kubenswrapper[4895]: I1202 07:45:13.983685 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5977660a-cdf5-4a65-8b8a-bbd944ec5736-config\") pod \"5977660a-cdf5-4a65-8b8a-bbd944ec5736\" (UID: \"5977660a-cdf5-4a65-8b8a-bbd944ec5736\") " Dec 02 07:45:13 crc kubenswrapper[4895]: I1202 07:45:13.984354 4895 reconciler_common.go:293] "Volume detached for 
volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5d8ca8ab-4f8e-4136-af19-c7160ee702d9-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 02 07:45:13 crc kubenswrapper[4895]: I1202 07:45:13.984398 4895 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d8ca8ab-4f8e-4136-af19-c7160ee702d9-logs\") on node \"crc\" DevicePath \"\"" Dec 02 07:45:14 crc kubenswrapper[4895]: I1202 07:45:14.040191 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5977660a-cdf5-4a65-8b8a-bbd944ec5736-kube-api-access-d927z" (OuterVolumeSpecName: "kube-api-access-d927z") pod "5977660a-cdf5-4a65-8b8a-bbd944ec5736" (UID: "5977660a-cdf5-4a65-8b8a-bbd944ec5736"). InnerVolumeSpecName "kube-api-access-d927z". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:45:14 crc kubenswrapper[4895]: I1202 07:45:14.046127 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d8ca8ab-4f8e-4136-af19-c7160ee702d9-kube-api-access-26txx" (OuterVolumeSpecName: "kube-api-access-26txx") pod "5d8ca8ab-4f8e-4136-af19-c7160ee702d9" (UID: "5d8ca8ab-4f8e-4136-af19-c7160ee702d9"). InnerVolumeSpecName "kube-api-access-26txx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:45:14 crc kubenswrapper[4895]: I1202 07:45:14.047381 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d8ca8ab-4f8e-4136-af19-c7160ee702d9-scripts" (OuterVolumeSpecName: "scripts") pod "5d8ca8ab-4f8e-4136-af19-c7160ee702d9" (UID: "5d8ca8ab-4f8e-4136-af19-c7160ee702d9"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:45:14 crc kubenswrapper[4895]: I1202 07:45:14.050647 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d8ca8ab-4f8e-4136-af19-c7160ee702d9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5d8ca8ab-4f8e-4136-af19-c7160ee702d9" (UID: "5d8ca8ab-4f8e-4136-af19-c7160ee702d9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:45:14 crc kubenswrapper[4895]: I1202 07:45:14.053917 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "5d8ca8ab-4f8e-4136-af19-c7160ee702d9" (UID: "5d8ca8ab-4f8e-4136-af19-c7160ee702d9"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 02 07:45:14 crc kubenswrapper[4895]: I1202 07:45:14.066277 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5977660a-cdf5-4a65-8b8a-bbd944ec5736-config" (OuterVolumeSpecName: "config") pod "5977660a-cdf5-4a65-8b8a-bbd944ec5736" (UID: "5977660a-cdf5-4a65-8b8a-bbd944ec5736"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:45:14 crc kubenswrapper[4895]: I1202 07:45:14.069576 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5977660a-cdf5-4a65-8b8a-bbd944ec5736-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "5977660a-cdf5-4a65-8b8a-bbd944ec5736" (UID: "5977660a-cdf5-4a65-8b8a-bbd944ec5736"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:45:14 crc kubenswrapper[4895]: I1202 07:45:14.070788 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5977660a-cdf5-4a65-8b8a-bbd944ec5736-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5977660a-cdf5-4a65-8b8a-bbd944ec5736" (UID: "5977660a-cdf5-4a65-8b8a-bbd944ec5736"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:45:14 crc kubenswrapper[4895]: I1202 07:45:14.078016 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d8ca8ab-4f8e-4136-af19-c7160ee702d9-config-data" (OuterVolumeSpecName: "config-data") pod "5d8ca8ab-4f8e-4136-af19-c7160ee702d9" (UID: "5d8ca8ab-4f8e-4136-af19-c7160ee702d9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:45:14 crc kubenswrapper[4895]: I1202 07:45:14.087584 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d8ca8ab-4f8e-4136-af19-c7160ee702d9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 07:45:14 crc kubenswrapper[4895]: I1202 07:45:14.087622 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-26txx\" (UniqueName: \"kubernetes.io/projected/5d8ca8ab-4f8e-4136-af19-c7160ee702d9-kube-api-access-26txx\") on node \"crc\" DevicePath \"\"" Dec 02 07:45:14 crc kubenswrapper[4895]: I1202 07:45:14.087638 4895 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5977660a-cdf5-4a65-8b8a-bbd944ec5736-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 02 07:45:14 crc kubenswrapper[4895]: I1202 07:45:14.087656 4895 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5977660a-cdf5-4a65-8b8a-bbd944ec5736-config\") on node \"crc\" DevicePath \"\"" Dec 02 07:45:14 crc 
kubenswrapper[4895]: I1202 07:45:14.087666 4895 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5977660a-cdf5-4a65-8b8a-bbd944ec5736-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 02 07:45:14 crc kubenswrapper[4895]: I1202 07:45:14.087675 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d8ca8ab-4f8e-4136-af19-c7160ee702d9-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 07:45:14 crc kubenswrapper[4895]: I1202 07:45:14.087685 4895 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d8ca8ab-4f8e-4136-af19-c7160ee702d9-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 07:45:14 crc kubenswrapper[4895]: I1202 07:45:14.087724 4895 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Dec 02 07:45:14 crc kubenswrapper[4895]: I1202 07:45:14.087735 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d927z\" (UniqueName: \"kubernetes.io/projected/5977660a-cdf5-4a65-8b8a-bbd944ec5736-kube-api-access-d927z\") on node \"crc\" DevicePath \"\"" Dec 02 07:45:14 crc kubenswrapper[4895]: I1202 07:45:14.092868 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5977660a-cdf5-4a65-8b8a-bbd944ec5736-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "5977660a-cdf5-4a65-8b8a-bbd944ec5736" (UID: "5977660a-cdf5-4a65-8b8a-bbd944ec5736"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:45:14 crc kubenswrapper[4895]: I1202 07:45:14.099828 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5977660a-cdf5-4a65-8b8a-bbd944ec5736-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "5977660a-cdf5-4a65-8b8a-bbd944ec5736" (UID: "5977660a-cdf5-4a65-8b8a-bbd944ec5736"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:45:14 crc kubenswrapper[4895]: I1202 07:45:14.110715 4895 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Dec 02 07:45:14 crc kubenswrapper[4895]: I1202 07:45:14.149438 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-hdjr4" Dec 02 07:45:14 crc kubenswrapper[4895]: I1202 07:45:14.151894 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-hdjr4" event={"ID":"5977660a-cdf5-4a65-8b8a-bbd944ec5736","Type":"ContainerDied","Data":"c43e3762f2da55210e8057d2a79ba5a0c4dfeb49809b48005388ce6e3c733014"} Dec 02 07:45:14 crc kubenswrapper[4895]: I1202 07:45:14.155574 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 02 07:45:14 crc kubenswrapper[4895]: I1202 07:45:14.155846 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"5d8ca8ab-4f8e-4136-af19-c7160ee702d9","Type":"ContainerDied","Data":"a02354787d3d93316d679740b7c5287ef2cc7f2a9bf88de3d1ca2d45a8b77dc4"} Dec 02 07:45:14 crc kubenswrapper[4895]: E1202 07:45:14.157776 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-xdfqx" podUID="96ece5f3-3dc5-41db-a8e9-37e6f9054dd8" Dec 02 07:45:14 crc kubenswrapper[4895]: I1202 07:45:14.189881 4895 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5977660a-cdf5-4a65-8b8a-bbd944ec5736-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 02 07:45:14 crc kubenswrapper[4895]: I1202 07:45:14.189912 4895 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Dec 02 07:45:14 crc kubenswrapper[4895]: I1202 07:45:14.189925 4895 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5977660a-cdf5-4a65-8b8a-bbd944ec5736-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 02 07:45:14 crc kubenswrapper[4895]: I1202 07:45:14.217333 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-hdjr4"] Dec 02 07:45:14 crc kubenswrapper[4895]: I1202 07:45:14.229519 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-hdjr4"] Dec 02 07:45:14 crc kubenswrapper[4895]: I1202 07:45:14.244816 4895 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 02 07:45:14 crc kubenswrapper[4895]: I1202 07:45:14.256174 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 02 07:45:14 crc kubenswrapper[4895]: I1202 07:45:14.271224 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 02 07:45:14 crc kubenswrapper[4895]: E1202 07:45:14.271699 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5977660a-cdf5-4a65-8b8a-bbd944ec5736" containerName="init" Dec 02 07:45:14 crc kubenswrapper[4895]: I1202 07:45:14.271717 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="5977660a-cdf5-4a65-8b8a-bbd944ec5736" containerName="init" Dec 02 07:45:14 crc kubenswrapper[4895]: E1202 07:45:14.271755 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d8ca8ab-4f8e-4136-af19-c7160ee702d9" containerName="glance-httpd" Dec 02 07:45:14 crc kubenswrapper[4895]: I1202 07:45:14.271762 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d8ca8ab-4f8e-4136-af19-c7160ee702d9" containerName="glance-httpd" Dec 02 07:45:14 crc kubenswrapper[4895]: E1202 07:45:14.271784 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5977660a-cdf5-4a65-8b8a-bbd944ec5736" containerName="dnsmasq-dns" Dec 02 07:45:14 crc kubenswrapper[4895]: I1202 07:45:14.271790 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="5977660a-cdf5-4a65-8b8a-bbd944ec5736" containerName="dnsmasq-dns" Dec 02 07:45:14 crc kubenswrapper[4895]: E1202 07:45:14.271802 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d8ca8ab-4f8e-4136-af19-c7160ee702d9" containerName="glance-log" Dec 02 07:45:14 crc kubenswrapper[4895]: I1202 07:45:14.271808 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d8ca8ab-4f8e-4136-af19-c7160ee702d9" containerName="glance-log" Dec 02 07:45:14 crc kubenswrapper[4895]: I1202 
07:45:14.272009 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d8ca8ab-4f8e-4136-af19-c7160ee702d9" containerName="glance-httpd" Dec 02 07:45:14 crc kubenswrapper[4895]: I1202 07:45:14.272028 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="5977660a-cdf5-4a65-8b8a-bbd944ec5736" containerName="dnsmasq-dns" Dec 02 07:45:14 crc kubenswrapper[4895]: I1202 07:45:14.272037 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d8ca8ab-4f8e-4136-af19-c7160ee702d9" containerName="glance-log" Dec 02 07:45:14 crc kubenswrapper[4895]: I1202 07:45:14.273153 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 02 07:45:14 crc kubenswrapper[4895]: I1202 07:45:14.278885 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 02 07:45:14 crc kubenswrapper[4895]: I1202 07:45:14.279589 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 02 07:45:14 crc kubenswrapper[4895]: I1202 07:45:14.280128 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 02 07:45:14 crc kubenswrapper[4895]: I1202 07:45:14.393624 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3768c25-d6e0-4d93-a8c9-6b869977f267-config-data\") pod \"glance-default-external-api-0\" (UID: \"e3768c25-d6e0-4d93-a8c9-6b869977f267\") " pod="openstack/glance-default-external-api-0" Dec 02 07:45:14 crc kubenswrapper[4895]: I1202 07:45:14.393698 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3768c25-d6e0-4d93-a8c9-6b869977f267-logs\") pod \"glance-default-external-api-0\" (UID: \"e3768c25-d6e0-4d93-a8c9-6b869977f267\") " 
pod="openstack/glance-default-external-api-0" Dec 02 07:45:14 crc kubenswrapper[4895]: I1202 07:45:14.393757 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvks4\" (UniqueName: \"kubernetes.io/projected/e3768c25-d6e0-4d93-a8c9-6b869977f267-kube-api-access-fvks4\") pod \"glance-default-external-api-0\" (UID: \"e3768c25-d6e0-4d93-a8c9-6b869977f267\") " pod="openstack/glance-default-external-api-0" Dec 02 07:45:14 crc kubenswrapper[4895]: I1202 07:45:14.393801 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3768c25-d6e0-4d93-a8c9-6b869977f267-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"e3768c25-d6e0-4d93-a8c9-6b869977f267\") " pod="openstack/glance-default-external-api-0" Dec 02 07:45:14 crc kubenswrapper[4895]: I1202 07:45:14.393852 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3768c25-d6e0-4d93-a8c9-6b869977f267-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e3768c25-d6e0-4d93-a8c9-6b869977f267\") " pod="openstack/glance-default-external-api-0" Dec 02 07:45:14 crc kubenswrapper[4895]: I1202 07:45:14.393913 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"e3768c25-d6e0-4d93-a8c9-6b869977f267\") " pod="openstack/glance-default-external-api-0" Dec 02 07:45:14 crc kubenswrapper[4895]: I1202 07:45:14.393971 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3768c25-d6e0-4d93-a8c9-6b869977f267-scripts\") pod \"glance-default-external-api-0\" (UID: 
\"e3768c25-d6e0-4d93-a8c9-6b869977f267\") " pod="openstack/glance-default-external-api-0" Dec 02 07:45:14 crc kubenswrapper[4895]: I1202 07:45:14.394005 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e3768c25-d6e0-4d93-a8c9-6b869977f267-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e3768c25-d6e0-4d93-a8c9-6b869977f267\") " pod="openstack/glance-default-external-api-0" Dec 02 07:45:14 crc kubenswrapper[4895]: I1202 07:45:14.495510 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fvks4\" (UniqueName: \"kubernetes.io/projected/e3768c25-d6e0-4d93-a8c9-6b869977f267-kube-api-access-fvks4\") pod \"glance-default-external-api-0\" (UID: \"e3768c25-d6e0-4d93-a8c9-6b869977f267\") " pod="openstack/glance-default-external-api-0" Dec 02 07:45:14 crc kubenswrapper[4895]: I1202 07:45:14.495573 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3768c25-d6e0-4d93-a8c9-6b869977f267-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"e3768c25-d6e0-4d93-a8c9-6b869977f267\") " pod="openstack/glance-default-external-api-0" Dec 02 07:45:14 crc kubenswrapper[4895]: I1202 07:45:14.495631 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3768c25-d6e0-4d93-a8c9-6b869977f267-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e3768c25-d6e0-4d93-a8c9-6b869977f267\") " pod="openstack/glance-default-external-api-0" Dec 02 07:45:14 crc kubenswrapper[4895]: I1202 07:45:14.495681 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: 
\"e3768c25-d6e0-4d93-a8c9-6b869977f267\") " pod="openstack/glance-default-external-api-0" Dec 02 07:45:14 crc kubenswrapper[4895]: I1202 07:45:14.495733 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3768c25-d6e0-4d93-a8c9-6b869977f267-scripts\") pod \"glance-default-external-api-0\" (UID: \"e3768c25-d6e0-4d93-a8c9-6b869977f267\") " pod="openstack/glance-default-external-api-0" Dec 02 07:45:14 crc kubenswrapper[4895]: I1202 07:45:14.495773 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e3768c25-d6e0-4d93-a8c9-6b869977f267-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e3768c25-d6e0-4d93-a8c9-6b869977f267\") " pod="openstack/glance-default-external-api-0" Dec 02 07:45:14 crc kubenswrapper[4895]: I1202 07:45:14.495802 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3768c25-d6e0-4d93-a8c9-6b869977f267-config-data\") pod \"glance-default-external-api-0\" (UID: \"e3768c25-d6e0-4d93-a8c9-6b869977f267\") " pod="openstack/glance-default-external-api-0" Dec 02 07:45:14 crc kubenswrapper[4895]: I1202 07:45:14.495822 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3768c25-d6e0-4d93-a8c9-6b869977f267-logs\") pod \"glance-default-external-api-0\" (UID: \"e3768c25-d6e0-4d93-a8c9-6b869977f267\") " pod="openstack/glance-default-external-api-0" Dec 02 07:45:14 crc kubenswrapper[4895]: I1202 07:45:14.496539 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3768c25-d6e0-4d93-a8c9-6b869977f267-logs\") pod \"glance-default-external-api-0\" (UID: \"e3768c25-d6e0-4d93-a8c9-6b869977f267\") " pod="openstack/glance-default-external-api-0" Dec 02 07:45:14 crc 
kubenswrapper[4895]: I1202 07:45:14.496723 4895 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"e3768c25-d6e0-4d93-a8c9-6b869977f267\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-external-api-0" Dec 02 07:45:14 crc kubenswrapper[4895]: I1202 07:45:14.496891 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e3768c25-d6e0-4d93-a8c9-6b869977f267-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e3768c25-d6e0-4d93-a8c9-6b869977f267\") " pod="openstack/glance-default-external-api-0" Dec 02 07:45:14 crc kubenswrapper[4895]: I1202 07:45:14.500857 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3768c25-d6e0-4d93-a8c9-6b869977f267-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"e3768c25-d6e0-4d93-a8c9-6b869977f267\") " pod="openstack/glance-default-external-api-0" Dec 02 07:45:14 crc kubenswrapper[4895]: I1202 07:45:14.501687 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3768c25-d6e0-4d93-a8c9-6b869977f267-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e3768c25-d6e0-4d93-a8c9-6b869977f267\") " pod="openstack/glance-default-external-api-0" Dec 02 07:45:14 crc kubenswrapper[4895]: I1202 07:45:14.502370 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3768c25-d6e0-4d93-a8c9-6b869977f267-scripts\") pod \"glance-default-external-api-0\" (UID: \"e3768c25-d6e0-4d93-a8c9-6b869977f267\") " pod="openstack/glance-default-external-api-0" Dec 02 07:45:14 crc kubenswrapper[4895]: I1202 07:45:14.507139 4895 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3768c25-d6e0-4d93-a8c9-6b869977f267-config-data\") pod \"glance-default-external-api-0\" (UID: \"e3768c25-d6e0-4d93-a8c9-6b869977f267\") " pod="openstack/glance-default-external-api-0" Dec 02 07:45:14 crc kubenswrapper[4895]: I1202 07:45:14.520051 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvks4\" (UniqueName: \"kubernetes.io/projected/e3768c25-d6e0-4d93-a8c9-6b869977f267-kube-api-access-fvks4\") pod \"glance-default-external-api-0\" (UID: \"e3768c25-d6e0-4d93-a8c9-6b869977f267\") " pod="openstack/glance-default-external-api-0" Dec 02 07:45:14 crc kubenswrapper[4895]: I1202 07:45:14.539832 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"e3768c25-d6e0-4d93-a8c9-6b869977f267\") " pod="openstack/glance-default-external-api-0" Dec 02 07:45:14 crc kubenswrapper[4895]: I1202 07:45:14.596269 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 02 07:45:15 crc kubenswrapper[4895]: I1202 07:45:15.157246 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5977660a-cdf5-4a65-8b8a-bbd944ec5736" path="/var/lib/kubelet/pods/5977660a-cdf5-4a65-8b8a-bbd944ec5736/volumes" Dec 02 07:45:15 crc kubenswrapper[4895]: I1202 07:45:15.158379 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d8ca8ab-4f8e-4136-af19-c7160ee702d9" path="/var/lib/kubelet/pods/5d8ca8ab-4f8e-4136-af19-c7160ee702d9/volumes" Dec 02 07:45:15 crc kubenswrapper[4895]: I1202 07:45:15.680727 4895 scope.go:117] "RemoveContainer" containerID="11f7a71e478f781683343acc1c1a870382e06f398a82941947fee61288ba8056" Dec 02 07:45:15 crc kubenswrapper[4895]: E1202 07:45:15.687177 4895 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Dec 02 07:45:15 crc kubenswrapper[4895]: E1202 07:45:15.687958 4895 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-24hg5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin
:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-44vd8_openstack(56723c9c-15bf-4eaa-896c-ea5d07066b27): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 07:45:15 crc kubenswrapper[4895]: E1202 07:45:15.689263 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-44vd8" podUID="56723c9c-15bf-4eaa-896c-ea5d07066b27" Dec 02 07:45:15 crc kubenswrapper[4895]: I1202 07:45:15.736414 4895 scope.go:117] "RemoveContainer" containerID="9a905c0785603a08aafb95096d190ca7e7d02066b5f4d4bcda08e56d643dd1a7" Dec 02 07:45:15 crc kubenswrapper[4895]: I1202 07:45:15.861847 4895 scope.go:117] "RemoveContainer" containerID="6b964e030987aeb72ab4f97c379e8c654950a5733d9ac70a637495e2e188905b" Dec 02 07:45:15 crc kubenswrapper[4895]: I1202 07:45:15.977037 4895 scope.go:117] "RemoveContainer" containerID="cc01dba6a12823a786ef85742323fc19955bd6378e5e78dad0c3fa2752740ab5" Dec 02 07:45:16 crc kubenswrapper[4895]: I1202 07:45:16.037014 4895 scope.go:117] "RemoveContainer" containerID="9dd2de43101bd15a63899f93fa5f4674b1e87771e1333014f5984a35d7e02ca5" Dec 02 07:45:16 crc kubenswrapper[4895]: I1202 07:45:16.228417 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"28a2d92c-8fd5-43a8-813c-7b1c49264fcd","Type":"ContainerStarted","Data":"8bfb9fa755b1d3e73df5a493d8f23979f6da47a21ed4d8ae8683e54eed3de8a6"} Dec 02 07:45:16 crc kubenswrapper[4895]: I1202 07:45:16.239354 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-prmpk" 
event={"ID":"07cedd96-e60e-40e6-9ae7-c29728b9e62c","Type":"ContainerStarted","Data":"a6b05ad2818111c94be943845c6204ff5c39fb35c07db9ae40f5c8e318b1f644"} Dec 02 07:45:16 crc kubenswrapper[4895]: I1202 07:45:16.253766 4895 generic.go:334] "Generic (PLEG): container finished" podID="6d3d0dfa-b0dd-4b27-8751-3483a85dc490" containerID="9352b834616a69ecbcd66b6e814ff88f5658fd5608184279b61d4a311c968b79" exitCode=0 Dec 02 07:45:16 crc kubenswrapper[4895]: I1202 07:45:16.255035 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-9lrrh" event={"ID":"6d3d0dfa-b0dd-4b27-8751-3483a85dc490","Type":"ContainerDied","Data":"9352b834616a69ecbcd66b6e814ff88f5658fd5608184279b61d4a311c968b79"} Dec 02 07:45:16 crc kubenswrapper[4895]: E1202 07:45:16.255803 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-44vd8" podUID="56723c9c-15bf-4eaa-896c-ea5d07066b27" Dec 02 07:45:16 crc kubenswrapper[4895]: I1202 07:45:16.255875 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411025-s8zs7"] Dec 02 07:45:16 crc kubenswrapper[4895]: W1202 07:45:16.260311 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd5a909a9_9821_48da_8599_3162f92f4202.slice/crio-004247c0c734c3013e3aa796bde7cb8ced98964745339b5d34961711c44ca04e WatchSource:0}: Error finding container 004247c0c734c3013e3aa796bde7cb8ced98964745339b5d34961711c44ca04e: Status 404 returned error can't find the container with id 004247c0c734c3013e3aa796bde7cb8ced98964745339b5d34961711c44ca04e Dec 02 07:45:16 crc kubenswrapper[4895]: I1202 07:45:16.284291 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/placement-db-sync-prmpk" podStartSLOduration=4.233677564 podStartE2EDuration="26.284270285s" podCreationTimestamp="2025-12-02 07:44:50 +0000 UTC" firstStartedPulling="2025-12-02 07:44:51.754197611 +0000 UTC m=+1302.925057224" lastFinishedPulling="2025-12-02 07:45:13.804790332 +0000 UTC m=+1324.975649945" observedRunningTime="2025-12-02 07:45:16.277804335 +0000 UTC m=+1327.448663948" watchObservedRunningTime="2025-12-02 07:45:16.284270285 +0000 UTC m=+1327.455129898" Dec 02 07:45:16 crc kubenswrapper[4895]: I1202 07:45:16.367416 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-xggw9"] Dec 02 07:45:16 crc kubenswrapper[4895]: I1202 07:45:16.402638 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 02 07:45:16 crc kubenswrapper[4895]: I1202 07:45:16.462491 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 02 07:45:16 crc kubenswrapper[4895]: I1202 07:45:16.970074 4895 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-764c5664d7-hdjr4" podUID="5977660a-cdf5-4a65-8b8a-bbd944ec5736" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.131:5353: i/o timeout" Dec 02 07:45:17 crc kubenswrapper[4895]: I1202 07:45:17.160406 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 02 07:45:17 crc kubenswrapper[4895]: I1202 07:45:17.268044 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"21dea3da-8ebc-4b04-86fa-19a539bd6cc9","Type":"ContainerStarted","Data":"e60d35c8b56f6319e4e9e6ca44287e73a7f3f042c2e5330d10f68d04d703572e"} Dec 02 07:45:17 crc kubenswrapper[4895]: I1202 07:45:17.268100 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"21dea3da-8ebc-4b04-86fa-19a539bd6cc9","Type":"ContainerStarted","Data":"2979ac525ad48c4421c35ec93026a9445beae6a73f9d8b0c5640d3d7542b2801"} Dec 02 07:45:17 crc kubenswrapper[4895]: I1202 07:45:17.273826 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xggw9" event={"ID":"e7d15bc9-7912-4eab-9c22-23630caecbb4","Type":"ContainerStarted","Data":"020f73bbd49e945d5c90d4d98cfbb78206c4dba84b8249b508b6c2f0f41eb7e3"} Dec 02 07:45:17 crc kubenswrapper[4895]: I1202 07:45:17.273928 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xggw9" event={"ID":"e7d15bc9-7912-4eab-9c22-23630caecbb4","Type":"ContainerStarted","Data":"a1fc0ddd0c9a7d6bc2525042eade3078ceaf98532eb2185ba3b719c67000dfd9"} Dec 02 07:45:17 crc kubenswrapper[4895]: I1202 07:45:17.275858 4895 generic.go:334] "Generic (PLEG): container finished" podID="d5a909a9-9821-48da-8599-3162f92f4202" containerID="84bf831f6fe6915ec319cbaa86b8f1d2e40ea646bbe4a7dbb8934b9a4a1ff83e" exitCode=0 Dec 02 07:45:17 crc kubenswrapper[4895]: I1202 07:45:17.275896 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411025-s8zs7" event={"ID":"d5a909a9-9821-48da-8599-3162f92f4202","Type":"ContainerDied","Data":"84bf831f6fe6915ec319cbaa86b8f1d2e40ea646bbe4a7dbb8934b9a4a1ff83e"} Dec 02 07:45:17 crc kubenswrapper[4895]: I1202 07:45:17.275946 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411025-s8zs7" event={"ID":"d5a909a9-9821-48da-8599-3162f92f4202","Type":"ContainerStarted","Data":"004247c0c734c3013e3aa796bde7cb8ced98964745339b5d34961711c44ca04e"} Dec 02 07:45:17 crc kubenswrapper[4895]: I1202 07:45:17.312175 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-xggw9" podStartSLOduration=12.312152794 podStartE2EDuration="12.312152794s" podCreationTimestamp="2025-12-02 
07:45:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:45:17.308540772 +0000 UTC m=+1328.479400395" watchObservedRunningTime="2025-12-02 07:45:17.312152794 +0000 UTC m=+1328.483012407" Dec 02 07:45:17 crc kubenswrapper[4895]: W1202 07:45:17.592195 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode3768c25_d6e0_4d93_a8c9_6b869977f267.slice/crio-7b65a580e0a793a2de615a15be44a27168b9ff98b0fab2b9b479c8dde0d62a0a WatchSource:0}: Error finding container 7b65a580e0a793a2de615a15be44a27168b9ff98b0fab2b9b479c8dde0d62a0a: Status 404 returned error can't find the container with id 7b65a580e0a793a2de615a15be44a27168b9ff98b0fab2b9b479c8dde0d62a0a Dec 02 07:45:17 crc kubenswrapper[4895]: I1202 07:45:17.691729 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-9lrrh" Dec 02 07:45:17 crc kubenswrapper[4895]: I1202 07:45:17.873614 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bdk67\" (UniqueName: \"kubernetes.io/projected/6d3d0dfa-b0dd-4b27-8751-3483a85dc490-kube-api-access-bdk67\") pod \"6d3d0dfa-b0dd-4b27-8751-3483a85dc490\" (UID: \"6d3d0dfa-b0dd-4b27-8751-3483a85dc490\") " Dec 02 07:45:17 crc kubenswrapper[4895]: I1202 07:45:17.873843 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d3d0dfa-b0dd-4b27-8751-3483a85dc490-combined-ca-bundle\") pod \"6d3d0dfa-b0dd-4b27-8751-3483a85dc490\" (UID: \"6d3d0dfa-b0dd-4b27-8751-3483a85dc490\") " Dec 02 07:45:17 crc kubenswrapper[4895]: I1202 07:45:17.873894 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6d3d0dfa-b0dd-4b27-8751-3483a85dc490-config\") pod 
\"6d3d0dfa-b0dd-4b27-8751-3483a85dc490\" (UID: \"6d3d0dfa-b0dd-4b27-8751-3483a85dc490\") " Dec 02 07:45:17 crc kubenswrapper[4895]: I1202 07:45:17.878097 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d3d0dfa-b0dd-4b27-8751-3483a85dc490-kube-api-access-bdk67" (OuterVolumeSpecName: "kube-api-access-bdk67") pod "6d3d0dfa-b0dd-4b27-8751-3483a85dc490" (UID: "6d3d0dfa-b0dd-4b27-8751-3483a85dc490"). InnerVolumeSpecName "kube-api-access-bdk67". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:45:17 crc kubenswrapper[4895]: I1202 07:45:17.909177 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d3d0dfa-b0dd-4b27-8751-3483a85dc490-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6d3d0dfa-b0dd-4b27-8751-3483a85dc490" (UID: "6d3d0dfa-b0dd-4b27-8751-3483a85dc490"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:45:17 crc kubenswrapper[4895]: I1202 07:45:17.911276 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d3d0dfa-b0dd-4b27-8751-3483a85dc490-config" (OuterVolumeSpecName: "config") pod "6d3d0dfa-b0dd-4b27-8751-3483a85dc490" (UID: "6d3d0dfa-b0dd-4b27-8751-3483a85dc490"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:45:17 crc kubenswrapper[4895]: I1202 07:45:17.975831 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d3d0dfa-b0dd-4b27-8751-3483a85dc490-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 07:45:17 crc kubenswrapper[4895]: I1202 07:45:17.975875 4895 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/6d3d0dfa-b0dd-4b27-8751-3483a85dc490-config\") on node \"crc\" DevicePath \"\"" Dec 02 07:45:17 crc kubenswrapper[4895]: I1202 07:45:17.975885 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bdk67\" (UniqueName: \"kubernetes.io/projected/6d3d0dfa-b0dd-4b27-8751-3483a85dc490-kube-api-access-bdk67\") on node \"crc\" DevicePath \"\"" Dec 02 07:45:18 crc kubenswrapper[4895]: I1202 07:45:18.288926 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e3768c25-d6e0-4d93-a8c9-6b869977f267","Type":"ContainerStarted","Data":"7b65a580e0a793a2de615a15be44a27168b9ff98b0fab2b9b479c8dde0d62a0a"} Dec 02 07:45:18 crc kubenswrapper[4895]: I1202 07:45:18.291460 4895 generic.go:334] "Generic (PLEG): container finished" podID="07cedd96-e60e-40e6-9ae7-c29728b9e62c" containerID="a6b05ad2818111c94be943845c6204ff5c39fb35c07db9ae40f5c8e318b1f644" exitCode=0 Dec 02 07:45:18 crc kubenswrapper[4895]: I1202 07:45:18.291916 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-prmpk" event={"ID":"07cedd96-e60e-40e6-9ae7-c29728b9e62c","Type":"ContainerDied","Data":"a6b05ad2818111c94be943845c6204ff5c39fb35c07db9ae40f5c8e318b1f644"} Dec 02 07:45:18 crc kubenswrapper[4895]: I1202 07:45:18.294574 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-9lrrh" 
event={"ID":"6d3d0dfa-b0dd-4b27-8751-3483a85dc490","Type":"ContainerDied","Data":"c3b38afad3f78d18b686fd37d79fa09d572814c13ccc1f8daeaef07e360d6294"} Dec 02 07:45:18 crc kubenswrapper[4895]: I1202 07:45:18.294605 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c3b38afad3f78d18b686fd37d79fa09d572814c13ccc1f8daeaef07e360d6294" Dec 02 07:45:18 crc kubenswrapper[4895]: I1202 07:45:18.294666 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-9lrrh" Dec 02 07:45:18 crc kubenswrapper[4895]: I1202 07:45:18.298006 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"28a2d92c-8fd5-43a8-813c-7b1c49264fcd","Type":"ContainerStarted","Data":"ef44fd0d43646d18ce0e1b8c6bdb73245931a867ce992bd072016aac43c5116a"} Dec 02 07:45:18 crc kubenswrapper[4895]: I1202 07:45:18.636383 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-nk9m9"] Dec 02 07:45:18 crc kubenswrapper[4895]: E1202 07:45:18.637216 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d3d0dfa-b0dd-4b27-8751-3483a85dc490" containerName="neutron-db-sync" Dec 02 07:45:18 crc kubenswrapper[4895]: I1202 07:45:18.637243 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d3d0dfa-b0dd-4b27-8751-3483a85dc490" containerName="neutron-db-sync" Dec 02 07:45:18 crc kubenswrapper[4895]: I1202 07:45:18.637439 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d3d0dfa-b0dd-4b27-8751-3483a85dc490" containerName="neutron-db-sync" Dec 02 07:45:18 crc kubenswrapper[4895]: I1202 07:45:18.640295 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-nk9m9" Dec 02 07:45:18 crc kubenswrapper[4895]: I1202 07:45:18.681755 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-nk9m9"] Dec 02 07:45:18 crc kubenswrapper[4895]: I1202 07:45:18.778107 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-558857dd7b-r29g8"] Dec 02 07:45:18 crc kubenswrapper[4895]: I1202 07:45:18.785190 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-558857dd7b-r29g8" Dec 02 07:45:18 crc kubenswrapper[4895]: I1202 07:45:18.788004 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Dec 02 07:45:18 crc kubenswrapper[4895]: I1202 07:45:18.788163 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-xr2dd" Dec 02 07:45:18 crc kubenswrapper[4895]: I1202 07:45:18.788283 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 02 07:45:18 crc kubenswrapper[4895]: I1202 07:45:18.788840 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 02 07:45:18 crc kubenswrapper[4895]: I1202 07:45:18.794139 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-558857dd7b-r29g8"] Dec 02 07:45:18 crc kubenswrapper[4895]: I1202 07:45:18.816572 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8xxk\" (UniqueName: \"kubernetes.io/projected/540a6298-e74a-48b3-aa5d-93d03c6871de-kube-api-access-f8xxk\") pod \"dnsmasq-dns-55f844cf75-nk9m9\" (UID: \"540a6298-e74a-48b3-aa5d-93d03c6871de\") " pod="openstack/dnsmasq-dns-55f844cf75-nk9m9" Dec 02 07:45:18 crc kubenswrapper[4895]: I1202 07:45:18.816633 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/540a6298-e74a-48b3-aa5d-93d03c6871de-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-nk9m9\" (UID: \"540a6298-e74a-48b3-aa5d-93d03c6871de\") " pod="openstack/dnsmasq-dns-55f844cf75-nk9m9" Dec 02 07:45:18 crc kubenswrapper[4895]: I1202 07:45:18.816723 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/540a6298-e74a-48b3-aa5d-93d03c6871de-config\") pod \"dnsmasq-dns-55f844cf75-nk9m9\" (UID: \"540a6298-e74a-48b3-aa5d-93d03c6871de\") " pod="openstack/dnsmasq-dns-55f844cf75-nk9m9" Dec 02 07:45:18 crc kubenswrapper[4895]: I1202 07:45:18.816772 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/540a6298-e74a-48b3-aa5d-93d03c6871de-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-nk9m9\" (UID: \"540a6298-e74a-48b3-aa5d-93d03c6871de\") " pod="openstack/dnsmasq-dns-55f844cf75-nk9m9" Dec 02 07:45:18 crc kubenswrapper[4895]: I1202 07:45:18.816794 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/540a6298-e74a-48b3-aa5d-93d03c6871de-dns-svc\") pod \"dnsmasq-dns-55f844cf75-nk9m9\" (UID: \"540a6298-e74a-48b3-aa5d-93d03c6871de\") " pod="openstack/dnsmasq-dns-55f844cf75-nk9m9" Dec 02 07:45:18 crc kubenswrapper[4895]: I1202 07:45:18.816833 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/540a6298-e74a-48b3-aa5d-93d03c6871de-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-nk9m9\" (UID: \"540a6298-e74a-48b3-aa5d-93d03c6871de\") " pod="openstack/dnsmasq-dns-55f844cf75-nk9m9" Dec 02 07:45:18 crc kubenswrapper[4895]: I1202 07:45:18.873052 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411025-s8zs7" Dec 02 07:45:18 crc kubenswrapper[4895]: I1202 07:45:18.920564 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/540a6298-e74a-48b3-aa5d-93d03c6871de-dns-svc\") pod \"dnsmasq-dns-55f844cf75-nk9m9\" (UID: \"540a6298-e74a-48b3-aa5d-93d03c6871de\") " pod="openstack/dnsmasq-dns-55f844cf75-nk9m9" Dec 02 07:45:18 crc kubenswrapper[4895]: I1202 07:45:18.920622 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/540a6298-e74a-48b3-aa5d-93d03c6871de-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-nk9m9\" (UID: \"540a6298-e74a-48b3-aa5d-93d03c6871de\") " pod="openstack/dnsmasq-dns-55f844cf75-nk9m9" Dec 02 07:45:18 crc kubenswrapper[4895]: I1202 07:45:18.920650 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/bff723dc-9ac5-4a07-bcac-1c43b1007d41-httpd-config\") pod \"neutron-558857dd7b-r29g8\" (UID: \"bff723dc-9ac5-4a07-bcac-1c43b1007d41\") " pod="openstack/neutron-558857dd7b-r29g8" Dec 02 07:45:18 crc kubenswrapper[4895]: I1202 07:45:18.920675 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8jbh\" (UniqueName: \"kubernetes.io/projected/bff723dc-9ac5-4a07-bcac-1c43b1007d41-kube-api-access-h8jbh\") pod \"neutron-558857dd7b-r29g8\" (UID: \"bff723dc-9ac5-4a07-bcac-1c43b1007d41\") " pod="openstack/neutron-558857dd7b-r29g8" Dec 02 07:45:18 crc kubenswrapper[4895]: I1202 07:45:18.920696 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8xxk\" (UniqueName: \"kubernetes.io/projected/540a6298-e74a-48b3-aa5d-93d03c6871de-kube-api-access-f8xxk\") pod \"dnsmasq-dns-55f844cf75-nk9m9\" (UID: 
\"540a6298-e74a-48b3-aa5d-93d03c6871de\") " pod="openstack/dnsmasq-dns-55f844cf75-nk9m9" Dec 02 07:45:18 crc kubenswrapper[4895]: I1202 07:45:18.920751 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/540a6298-e74a-48b3-aa5d-93d03c6871de-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-nk9m9\" (UID: \"540a6298-e74a-48b3-aa5d-93d03c6871de\") " pod="openstack/dnsmasq-dns-55f844cf75-nk9m9" Dec 02 07:45:18 crc kubenswrapper[4895]: I1202 07:45:18.920778 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bff723dc-9ac5-4a07-bcac-1c43b1007d41-combined-ca-bundle\") pod \"neutron-558857dd7b-r29g8\" (UID: \"bff723dc-9ac5-4a07-bcac-1c43b1007d41\") " pod="openstack/neutron-558857dd7b-r29g8" Dec 02 07:45:18 crc kubenswrapper[4895]: I1202 07:45:18.920803 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/bff723dc-9ac5-4a07-bcac-1c43b1007d41-config\") pod \"neutron-558857dd7b-r29g8\" (UID: \"bff723dc-9ac5-4a07-bcac-1c43b1007d41\") " pod="openstack/neutron-558857dd7b-r29g8" Dec 02 07:45:18 crc kubenswrapper[4895]: I1202 07:45:18.920856 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bff723dc-9ac5-4a07-bcac-1c43b1007d41-ovndb-tls-certs\") pod \"neutron-558857dd7b-r29g8\" (UID: \"bff723dc-9ac5-4a07-bcac-1c43b1007d41\") " pod="openstack/neutron-558857dd7b-r29g8" Dec 02 07:45:18 crc kubenswrapper[4895]: I1202 07:45:18.920892 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/540a6298-e74a-48b3-aa5d-93d03c6871de-config\") pod \"dnsmasq-dns-55f844cf75-nk9m9\" (UID: \"540a6298-e74a-48b3-aa5d-93d03c6871de\") " 
pod="openstack/dnsmasq-dns-55f844cf75-nk9m9" Dec 02 07:45:18 crc kubenswrapper[4895]: I1202 07:45:18.920928 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/540a6298-e74a-48b3-aa5d-93d03c6871de-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-nk9m9\" (UID: \"540a6298-e74a-48b3-aa5d-93d03c6871de\") " pod="openstack/dnsmasq-dns-55f844cf75-nk9m9" Dec 02 07:45:18 crc kubenswrapper[4895]: I1202 07:45:18.922682 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/540a6298-e74a-48b3-aa5d-93d03c6871de-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-nk9m9\" (UID: \"540a6298-e74a-48b3-aa5d-93d03c6871de\") " pod="openstack/dnsmasq-dns-55f844cf75-nk9m9" Dec 02 07:45:18 crc kubenswrapper[4895]: I1202 07:45:18.922691 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/540a6298-e74a-48b3-aa5d-93d03c6871de-config\") pod \"dnsmasq-dns-55f844cf75-nk9m9\" (UID: \"540a6298-e74a-48b3-aa5d-93d03c6871de\") " pod="openstack/dnsmasq-dns-55f844cf75-nk9m9" Dec 02 07:45:18 crc kubenswrapper[4895]: I1202 07:45:18.922810 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/540a6298-e74a-48b3-aa5d-93d03c6871de-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-nk9m9\" (UID: \"540a6298-e74a-48b3-aa5d-93d03c6871de\") " pod="openstack/dnsmasq-dns-55f844cf75-nk9m9" Dec 02 07:45:18 crc kubenswrapper[4895]: I1202 07:45:18.923165 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/540a6298-e74a-48b3-aa5d-93d03c6871de-dns-svc\") pod \"dnsmasq-dns-55f844cf75-nk9m9\" (UID: \"540a6298-e74a-48b3-aa5d-93d03c6871de\") " pod="openstack/dnsmasq-dns-55f844cf75-nk9m9" Dec 02 07:45:18 crc kubenswrapper[4895]: I1202 
07:45:18.923364 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/540a6298-e74a-48b3-aa5d-93d03c6871de-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-nk9m9\" (UID: \"540a6298-e74a-48b3-aa5d-93d03c6871de\") " pod="openstack/dnsmasq-dns-55f844cf75-nk9m9" Dec 02 07:45:18 crc kubenswrapper[4895]: I1202 07:45:18.971075 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8xxk\" (UniqueName: \"kubernetes.io/projected/540a6298-e74a-48b3-aa5d-93d03c6871de-kube-api-access-f8xxk\") pod \"dnsmasq-dns-55f844cf75-nk9m9\" (UID: \"540a6298-e74a-48b3-aa5d-93d03c6871de\") " pod="openstack/dnsmasq-dns-55f844cf75-nk9m9" Dec 02 07:45:18 crc kubenswrapper[4895]: I1202 07:45:18.989558 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-nk9m9" Dec 02 07:45:19 crc kubenswrapper[4895]: I1202 07:45:19.029971 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d5a909a9-9821-48da-8599-3162f92f4202-secret-volume\") pod \"d5a909a9-9821-48da-8599-3162f92f4202\" (UID: \"d5a909a9-9821-48da-8599-3162f92f4202\") " Dec 02 07:45:19 crc kubenswrapper[4895]: I1202 07:45:19.030616 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d5a909a9-9821-48da-8599-3162f92f4202-config-volume\") pod \"d5a909a9-9821-48da-8599-3162f92f4202\" (UID: \"d5a909a9-9821-48da-8599-3162f92f4202\") " Dec 02 07:45:19 crc kubenswrapper[4895]: I1202 07:45:19.030760 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qzm8k\" (UniqueName: \"kubernetes.io/projected/d5a909a9-9821-48da-8599-3162f92f4202-kube-api-access-qzm8k\") pod \"d5a909a9-9821-48da-8599-3162f92f4202\" (UID: \"d5a909a9-9821-48da-8599-3162f92f4202\") " Dec 02 07:45:19 
crc kubenswrapper[4895]: I1202 07:45:19.031095 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bff723dc-9ac5-4a07-bcac-1c43b1007d41-ovndb-tls-certs\") pod \"neutron-558857dd7b-r29g8\" (UID: \"bff723dc-9ac5-4a07-bcac-1c43b1007d41\") " pod="openstack/neutron-558857dd7b-r29g8" Dec 02 07:45:19 crc kubenswrapper[4895]: I1202 07:45:19.031364 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/bff723dc-9ac5-4a07-bcac-1c43b1007d41-httpd-config\") pod \"neutron-558857dd7b-r29g8\" (UID: \"bff723dc-9ac5-4a07-bcac-1c43b1007d41\") " pod="openstack/neutron-558857dd7b-r29g8" Dec 02 07:45:19 crc kubenswrapper[4895]: I1202 07:45:19.031385 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h8jbh\" (UniqueName: \"kubernetes.io/projected/bff723dc-9ac5-4a07-bcac-1c43b1007d41-kube-api-access-h8jbh\") pod \"neutron-558857dd7b-r29g8\" (UID: \"bff723dc-9ac5-4a07-bcac-1c43b1007d41\") " pod="openstack/neutron-558857dd7b-r29g8" Dec 02 07:45:19 crc kubenswrapper[4895]: I1202 07:45:19.031525 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bff723dc-9ac5-4a07-bcac-1c43b1007d41-combined-ca-bundle\") pod \"neutron-558857dd7b-r29g8\" (UID: \"bff723dc-9ac5-4a07-bcac-1c43b1007d41\") " pod="openstack/neutron-558857dd7b-r29g8" Dec 02 07:45:19 crc kubenswrapper[4895]: I1202 07:45:19.031555 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/bff723dc-9ac5-4a07-bcac-1c43b1007d41-config\") pod \"neutron-558857dd7b-r29g8\" (UID: \"bff723dc-9ac5-4a07-bcac-1c43b1007d41\") " pod="openstack/neutron-558857dd7b-r29g8" Dec 02 07:45:19 crc kubenswrapper[4895]: I1202 07:45:19.032675 4895 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/configmap/d5a909a9-9821-48da-8599-3162f92f4202-config-volume" (OuterVolumeSpecName: "config-volume") pod "d5a909a9-9821-48da-8599-3162f92f4202" (UID: "d5a909a9-9821-48da-8599-3162f92f4202"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:45:19 crc kubenswrapper[4895]: I1202 07:45:19.058600 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bff723dc-9ac5-4a07-bcac-1c43b1007d41-ovndb-tls-certs\") pod \"neutron-558857dd7b-r29g8\" (UID: \"bff723dc-9ac5-4a07-bcac-1c43b1007d41\") " pod="openstack/neutron-558857dd7b-r29g8" Dec 02 07:45:19 crc kubenswrapper[4895]: I1202 07:45:19.058817 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5a909a9-9821-48da-8599-3162f92f4202-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "d5a909a9-9821-48da-8599-3162f92f4202" (UID: "d5a909a9-9821-48da-8599-3162f92f4202"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:45:19 crc kubenswrapper[4895]: I1202 07:45:19.059003 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/bff723dc-9ac5-4a07-bcac-1c43b1007d41-config\") pod \"neutron-558857dd7b-r29g8\" (UID: \"bff723dc-9ac5-4a07-bcac-1c43b1007d41\") " pod="openstack/neutron-558857dd7b-r29g8" Dec 02 07:45:19 crc kubenswrapper[4895]: I1202 07:45:19.075759 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/bff723dc-9ac5-4a07-bcac-1c43b1007d41-httpd-config\") pod \"neutron-558857dd7b-r29g8\" (UID: \"bff723dc-9ac5-4a07-bcac-1c43b1007d41\") " pod="openstack/neutron-558857dd7b-r29g8" Dec 02 07:45:19 crc kubenswrapper[4895]: I1202 07:45:19.076258 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bff723dc-9ac5-4a07-bcac-1c43b1007d41-combined-ca-bundle\") pod \"neutron-558857dd7b-r29g8\" (UID: \"bff723dc-9ac5-4a07-bcac-1c43b1007d41\") " pod="openstack/neutron-558857dd7b-r29g8" Dec 02 07:45:19 crc kubenswrapper[4895]: I1202 07:45:19.077999 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5a909a9-9821-48da-8599-3162f92f4202-kube-api-access-qzm8k" (OuterVolumeSpecName: "kube-api-access-qzm8k") pod "d5a909a9-9821-48da-8599-3162f92f4202" (UID: "d5a909a9-9821-48da-8599-3162f92f4202"). InnerVolumeSpecName "kube-api-access-qzm8k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:45:19 crc kubenswrapper[4895]: I1202 07:45:19.092099 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8jbh\" (UniqueName: \"kubernetes.io/projected/bff723dc-9ac5-4a07-bcac-1c43b1007d41-kube-api-access-h8jbh\") pod \"neutron-558857dd7b-r29g8\" (UID: \"bff723dc-9ac5-4a07-bcac-1c43b1007d41\") " pod="openstack/neutron-558857dd7b-r29g8" Dec 02 07:45:19 crc kubenswrapper[4895]: I1202 07:45:19.130541 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-558857dd7b-r29g8" Dec 02 07:45:19 crc kubenswrapper[4895]: I1202 07:45:19.133843 4895 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d5a909a9-9821-48da-8599-3162f92f4202-config-volume\") on node \"crc\" DevicePath \"\"" Dec 02 07:45:19 crc kubenswrapper[4895]: I1202 07:45:19.133881 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qzm8k\" (UniqueName: \"kubernetes.io/projected/d5a909a9-9821-48da-8599-3162f92f4202-kube-api-access-qzm8k\") on node \"crc\" DevicePath \"\"" Dec 02 07:45:19 crc kubenswrapper[4895]: I1202 07:45:19.133893 4895 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d5a909a9-9821-48da-8599-3162f92f4202-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 02 07:45:19 crc kubenswrapper[4895]: I1202 07:45:19.375918 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e3768c25-d6e0-4d93-a8c9-6b869977f267","Type":"ContainerStarted","Data":"a571b1549c1970d506653fb10b7f5f596f0318794b3aff01ecdce75169a33e67"} Dec 02 07:45:19 crc kubenswrapper[4895]: I1202 07:45:19.395277 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411025-s8zs7" 
event={"ID":"d5a909a9-9821-48da-8599-3162f92f4202","Type":"ContainerDied","Data":"004247c0c734c3013e3aa796bde7cb8ced98964745339b5d34961711c44ca04e"} Dec 02 07:45:19 crc kubenswrapper[4895]: I1202 07:45:19.395327 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="004247c0c734c3013e3aa796bde7cb8ced98964745339b5d34961711c44ca04e" Dec 02 07:45:19 crc kubenswrapper[4895]: I1202 07:45:19.395328 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411025-s8zs7" Dec 02 07:45:19 crc kubenswrapper[4895]: I1202 07:45:19.416486 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"21dea3da-8ebc-4b04-86fa-19a539bd6cc9","Type":"ContainerStarted","Data":"036f9ade09b808d7661bab5c17d24da5c8e38d6318f235d5349c9d48e757bb70"} Dec 02 07:45:19 crc kubenswrapper[4895]: I1202 07:45:19.460783 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=14.460727152 podStartE2EDuration="14.460727152s" podCreationTimestamp="2025-12-02 07:45:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:45:19.451377474 +0000 UTC m=+1330.622237087" watchObservedRunningTime="2025-12-02 07:45:19.460727152 +0000 UTC m=+1330.631586775" Dec 02 07:45:19 crc kubenswrapper[4895]: I1202 07:45:19.908393 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-nk9m9"] Dec 02 07:45:20 crc kubenswrapper[4895]: I1202 07:45:20.095695 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-prmpk" Dec 02 07:45:20 crc kubenswrapper[4895]: I1202 07:45:20.220668 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-558857dd7b-r29g8"] Dec 02 07:45:20 crc kubenswrapper[4895]: I1202 07:45:20.281829 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8m5ls\" (UniqueName: \"kubernetes.io/projected/07cedd96-e60e-40e6-9ae7-c29728b9e62c-kube-api-access-8m5ls\") pod \"07cedd96-e60e-40e6-9ae7-c29728b9e62c\" (UID: \"07cedd96-e60e-40e6-9ae7-c29728b9e62c\") " Dec 02 07:45:20 crc kubenswrapper[4895]: I1202 07:45:20.282079 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/07cedd96-e60e-40e6-9ae7-c29728b9e62c-logs\") pod \"07cedd96-e60e-40e6-9ae7-c29728b9e62c\" (UID: \"07cedd96-e60e-40e6-9ae7-c29728b9e62c\") " Dec 02 07:45:20 crc kubenswrapper[4895]: I1202 07:45:20.282112 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07cedd96-e60e-40e6-9ae7-c29728b9e62c-combined-ca-bundle\") pod \"07cedd96-e60e-40e6-9ae7-c29728b9e62c\" (UID: \"07cedd96-e60e-40e6-9ae7-c29728b9e62c\") " Dec 02 07:45:20 crc kubenswrapper[4895]: I1202 07:45:20.282146 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07cedd96-e60e-40e6-9ae7-c29728b9e62c-config-data\") pod \"07cedd96-e60e-40e6-9ae7-c29728b9e62c\" (UID: \"07cedd96-e60e-40e6-9ae7-c29728b9e62c\") " Dec 02 07:45:20 crc kubenswrapper[4895]: I1202 07:45:20.282189 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/07cedd96-e60e-40e6-9ae7-c29728b9e62c-scripts\") pod \"07cedd96-e60e-40e6-9ae7-c29728b9e62c\" (UID: \"07cedd96-e60e-40e6-9ae7-c29728b9e62c\") " Dec 02 07:45:20 crc 
kubenswrapper[4895]: I1202 07:45:20.284681 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/07cedd96-e60e-40e6-9ae7-c29728b9e62c-logs" (OuterVolumeSpecName: "logs") pod "07cedd96-e60e-40e6-9ae7-c29728b9e62c" (UID: "07cedd96-e60e-40e6-9ae7-c29728b9e62c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:45:20 crc kubenswrapper[4895]: I1202 07:45:20.290356 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07cedd96-e60e-40e6-9ae7-c29728b9e62c-kube-api-access-8m5ls" (OuterVolumeSpecName: "kube-api-access-8m5ls") pod "07cedd96-e60e-40e6-9ae7-c29728b9e62c" (UID: "07cedd96-e60e-40e6-9ae7-c29728b9e62c"). InnerVolumeSpecName "kube-api-access-8m5ls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:45:20 crc kubenswrapper[4895]: I1202 07:45:20.300840 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07cedd96-e60e-40e6-9ae7-c29728b9e62c-scripts" (OuterVolumeSpecName: "scripts") pod "07cedd96-e60e-40e6-9ae7-c29728b9e62c" (UID: "07cedd96-e60e-40e6-9ae7-c29728b9e62c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:45:20 crc kubenswrapper[4895]: I1202 07:45:20.324923 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07cedd96-e60e-40e6-9ae7-c29728b9e62c-config-data" (OuterVolumeSpecName: "config-data") pod "07cedd96-e60e-40e6-9ae7-c29728b9e62c" (UID: "07cedd96-e60e-40e6-9ae7-c29728b9e62c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:45:20 crc kubenswrapper[4895]: I1202 07:45:20.335885 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07cedd96-e60e-40e6-9ae7-c29728b9e62c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "07cedd96-e60e-40e6-9ae7-c29728b9e62c" (UID: "07cedd96-e60e-40e6-9ae7-c29728b9e62c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:45:20 crc kubenswrapper[4895]: I1202 07:45:20.384764 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8m5ls\" (UniqueName: \"kubernetes.io/projected/07cedd96-e60e-40e6-9ae7-c29728b9e62c-kube-api-access-8m5ls\") on node \"crc\" DevicePath \"\"" Dec 02 07:45:20 crc kubenswrapper[4895]: I1202 07:45:20.384813 4895 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/07cedd96-e60e-40e6-9ae7-c29728b9e62c-logs\") on node \"crc\" DevicePath \"\"" Dec 02 07:45:20 crc kubenswrapper[4895]: I1202 07:45:20.384828 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07cedd96-e60e-40e6-9ae7-c29728b9e62c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 07:45:20 crc kubenswrapper[4895]: I1202 07:45:20.384839 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07cedd96-e60e-40e6-9ae7-c29728b9e62c-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 07:45:20 crc kubenswrapper[4895]: I1202 07:45:20.384849 4895 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/07cedd96-e60e-40e6-9ae7-c29728b9e62c-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 07:45:20 crc kubenswrapper[4895]: I1202 07:45:20.428601 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-558857dd7b-r29g8" 
event={"ID":"bff723dc-9ac5-4a07-bcac-1c43b1007d41","Type":"ContainerStarted","Data":"19d1141a1e44741a385d60aba84cf1c0e30d559ce9f211ab11e2bf6235c4b2a8"} Dec 02 07:45:20 crc kubenswrapper[4895]: I1202 07:45:20.436619 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e3768c25-d6e0-4d93-a8c9-6b869977f267","Type":"ContainerStarted","Data":"39531e98c51dd7763778d471934687a95420f617360bba0a8c74e13306b7bc1e"} Dec 02 07:45:20 crc kubenswrapper[4895]: I1202 07:45:20.441327 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-prmpk" Dec 02 07:45:20 crc kubenswrapper[4895]: I1202 07:45:20.441386 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-prmpk" event={"ID":"07cedd96-e60e-40e6-9ae7-c29728b9e62c","Type":"ContainerDied","Data":"3dd237dc0cfeaf0af1d2e587df325baedc49c1cf030db9596fd7eec69934291e"} Dec 02 07:45:20 crc kubenswrapper[4895]: I1202 07:45:20.441452 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3dd237dc0cfeaf0af1d2e587df325baedc49c1cf030db9596fd7eec69934291e" Dec 02 07:45:20 crc kubenswrapper[4895]: I1202 07:45:20.457501 4895 generic.go:334] "Generic (PLEG): container finished" podID="540a6298-e74a-48b3-aa5d-93d03c6871de" containerID="d7ddf51525ffcff16a99c0e84742f8bee980e8ae5d18775d07c033636237fd92" exitCode=0 Dec 02 07:45:20 crc kubenswrapper[4895]: I1202 07:45:20.459109 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-nk9m9" event={"ID":"540a6298-e74a-48b3-aa5d-93d03c6871de","Type":"ContainerDied","Data":"d7ddf51525ffcff16a99c0e84742f8bee980e8ae5d18775d07c033636237fd92"} Dec 02 07:45:20 crc kubenswrapper[4895]: I1202 07:45:20.459161 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-nk9m9" 
event={"ID":"540a6298-e74a-48b3-aa5d-93d03c6871de","Type":"ContainerStarted","Data":"f5f472ae41ae3434b5ddbe57cbd0ad299ac197f6ccf9f7eae48fdc85a2ba886e"} Dec 02 07:45:20 crc kubenswrapper[4895]: I1202 07:45:20.481260 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=6.481230064 podStartE2EDuration="6.481230064s" podCreationTimestamp="2025-12-02 07:45:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:45:20.474051382 +0000 UTC m=+1331.644911005" watchObservedRunningTime="2025-12-02 07:45:20.481230064 +0000 UTC m=+1331.652089677" Dec 02 07:45:20 crc kubenswrapper[4895]: I1202 07:45:20.522650 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-54957dcd96-7sx87"] Dec 02 07:45:20 crc kubenswrapper[4895]: E1202 07:45:20.523162 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5a909a9-9821-48da-8599-3162f92f4202" containerName="collect-profiles" Dec 02 07:45:20 crc kubenswrapper[4895]: I1202 07:45:20.523184 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5a909a9-9821-48da-8599-3162f92f4202" containerName="collect-profiles" Dec 02 07:45:20 crc kubenswrapper[4895]: E1202 07:45:20.523213 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07cedd96-e60e-40e6-9ae7-c29728b9e62c" containerName="placement-db-sync" Dec 02 07:45:20 crc kubenswrapper[4895]: I1202 07:45:20.523223 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="07cedd96-e60e-40e6-9ae7-c29728b9e62c" containerName="placement-db-sync" Dec 02 07:45:20 crc kubenswrapper[4895]: I1202 07:45:20.523411 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="07cedd96-e60e-40e6-9ae7-c29728b9e62c" containerName="placement-db-sync" Dec 02 07:45:20 crc kubenswrapper[4895]: I1202 07:45:20.523440 4895 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="d5a909a9-9821-48da-8599-3162f92f4202" containerName="collect-profiles" Dec 02 07:45:20 crc kubenswrapper[4895]: I1202 07:45:20.524508 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-54957dcd96-7sx87" Dec 02 07:45:20 crc kubenswrapper[4895]: I1202 07:45:20.530296 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Dec 02 07:45:20 crc kubenswrapper[4895]: I1202 07:45:20.530782 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Dec 02 07:45:20 crc kubenswrapper[4895]: I1202 07:45:20.531122 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 02 07:45:20 crc kubenswrapper[4895]: I1202 07:45:20.531328 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-86fdd" Dec 02 07:45:20 crc kubenswrapper[4895]: I1202 07:45:20.532080 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 02 07:45:20 crc kubenswrapper[4895]: I1202 07:45:20.552897 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-54957dcd96-7sx87"] Dec 02 07:45:20 crc kubenswrapper[4895]: I1202 07:45:20.698347 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68bddf66-0b9f-4bc8-916b-aa0abfbf13c3-config-data\") pod \"placement-54957dcd96-7sx87\" (UID: \"68bddf66-0b9f-4bc8-916b-aa0abfbf13c3\") " pod="openstack/placement-54957dcd96-7sx87" Dec 02 07:45:20 crc kubenswrapper[4895]: I1202 07:45:20.698421 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/68bddf66-0b9f-4bc8-916b-aa0abfbf13c3-internal-tls-certs\") pod \"placement-54957dcd96-7sx87\" 
(UID: \"68bddf66-0b9f-4bc8-916b-aa0abfbf13c3\") " pod="openstack/placement-54957dcd96-7sx87" Dec 02 07:45:20 crc kubenswrapper[4895]: I1202 07:45:20.698466 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfld8\" (UniqueName: \"kubernetes.io/projected/68bddf66-0b9f-4bc8-916b-aa0abfbf13c3-kube-api-access-qfld8\") pod \"placement-54957dcd96-7sx87\" (UID: \"68bddf66-0b9f-4bc8-916b-aa0abfbf13c3\") " pod="openstack/placement-54957dcd96-7sx87" Dec 02 07:45:20 crc kubenswrapper[4895]: I1202 07:45:20.698510 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68bddf66-0b9f-4bc8-916b-aa0abfbf13c3-scripts\") pod \"placement-54957dcd96-7sx87\" (UID: \"68bddf66-0b9f-4bc8-916b-aa0abfbf13c3\") " pod="openstack/placement-54957dcd96-7sx87" Dec 02 07:45:20 crc kubenswrapper[4895]: I1202 07:45:20.698537 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/68bddf66-0b9f-4bc8-916b-aa0abfbf13c3-logs\") pod \"placement-54957dcd96-7sx87\" (UID: \"68bddf66-0b9f-4bc8-916b-aa0abfbf13c3\") " pod="openstack/placement-54957dcd96-7sx87" Dec 02 07:45:20 crc kubenswrapper[4895]: I1202 07:45:20.698558 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68bddf66-0b9f-4bc8-916b-aa0abfbf13c3-combined-ca-bundle\") pod \"placement-54957dcd96-7sx87\" (UID: \"68bddf66-0b9f-4bc8-916b-aa0abfbf13c3\") " pod="openstack/placement-54957dcd96-7sx87" Dec 02 07:45:20 crc kubenswrapper[4895]: I1202 07:45:20.698594 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/68bddf66-0b9f-4bc8-916b-aa0abfbf13c3-public-tls-certs\") pod 
\"placement-54957dcd96-7sx87\" (UID: \"68bddf66-0b9f-4bc8-916b-aa0abfbf13c3\") " pod="openstack/placement-54957dcd96-7sx87" Dec 02 07:45:20 crc kubenswrapper[4895]: I1202 07:45:20.800037 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68bddf66-0b9f-4bc8-916b-aa0abfbf13c3-config-data\") pod \"placement-54957dcd96-7sx87\" (UID: \"68bddf66-0b9f-4bc8-916b-aa0abfbf13c3\") " pod="openstack/placement-54957dcd96-7sx87" Dec 02 07:45:20 crc kubenswrapper[4895]: I1202 07:45:20.800092 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/68bddf66-0b9f-4bc8-916b-aa0abfbf13c3-internal-tls-certs\") pod \"placement-54957dcd96-7sx87\" (UID: \"68bddf66-0b9f-4bc8-916b-aa0abfbf13c3\") " pod="openstack/placement-54957dcd96-7sx87" Dec 02 07:45:20 crc kubenswrapper[4895]: I1202 07:45:20.800126 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qfld8\" (UniqueName: \"kubernetes.io/projected/68bddf66-0b9f-4bc8-916b-aa0abfbf13c3-kube-api-access-qfld8\") pod \"placement-54957dcd96-7sx87\" (UID: \"68bddf66-0b9f-4bc8-916b-aa0abfbf13c3\") " pod="openstack/placement-54957dcd96-7sx87" Dec 02 07:45:20 crc kubenswrapper[4895]: I1202 07:45:20.800168 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68bddf66-0b9f-4bc8-916b-aa0abfbf13c3-scripts\") pod \"placement-54957dcd96-7sx87\" (UID: \"68bddf66-0b9f-4bc8-916b-aa0abfbf13c3\") " pod="openstack/placement-54957dcd96-7sx87" Dec 02 07:45:20 crc kubenswrapper[4895]: I1202 07:45:20.800191 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/68bddf66-0b9f-4bc8-916b-aa0abfbf13c3-logs\") pod \"placement-54957dcd96-7sx87\" (UID: \"68bddf66-0b9f-4bc8-916b-aa0abfbf13c3\") " 
pod="openstack/placement-54957dcd96-7sx87" Dec 02 07:45:20 crc kubenswrapper[4895]: I1202 07:45:20.800208 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68bddf66-0b9f-4bc8-916b-aa0abfbf13c3-combined-ca-bundle\") pod \"placement-54957dcd96-7sx87\" (UID: \"68bddf66-0b9f-4bc8-916b-aa0abfbf13c3\") " pod="openstack/placement-54957dcd96-7sx87" Dec 02 07:45:20 crc kubenswrapper[4895]: I1202 07:45:20.800231 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/68bddf66-0b9f-4bc8-916b-aa0abfbf13c3-public-tls-certs\") pod \"placement-54957dcd96-7sx87\" (UID: \"68bddf66-0b9f-4bc8-916b-aa0abfbf13c3\") " pod="openstack/placement-54957dcd96-7sx87" Dec 02 07:45:20 crc kubenswrapper[4895]: I1202 07:45:20.801783 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/68bddf66-0b9f-4bc8-916b-aa0abfbf13c3-logs\") pod \"placement-54957dcd96-7sx87\" (UID: \"68bddf66-0b9f-4bc8-916b-aa0abfbf13c3\") " pod="openstack/placement-54957dcd96-7sx87" Dec 02 07:45:20 crc kubenswrapper[4895]: I1202 07:45:20.806375 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/68bddf66-0b9f-4bc8-916b-aa0abfbf13c3-public-tls-certs\") pod \"placement-54957dcd96-7sx87\" (UID: \"68bddf66-0b9f-4bc8-916b-aa0abfbf13c3\") " pod="openstack/placement-54957dcd96-7sx87" Dec 02 07:45:20 crc kubenswrapper[4895]: I1202 07:45:20.808850 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68bddf66-0b9f-4bc8-916b-aa0abfbf13c3-scripts\") pod \"placement-54957dcd96-7sx87\" (UID: \"68bddf66-0b9f-4bc8-916b-aa0abfbf13c3\") " pod="openstack/placement-54957dcd96-7sx87" Dec 02 07:45:20 crc kubenswrapper[4895]: I1202 07:45:20.809426 4895 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68bddf66-0b9f-4bc8-916b-aa0abfbf13c3-config-data\") pod \"placement-54957dcd96-7sx87\" (UID: \"68bddf66-0b9f-4bc8-916b-aa0abfbf13c3\") " pod="openstack/placement-54957dcd96-7sx87" Dec 02 07:45:20 crc kubenswrapper[4895]: I1202 07:45:20.813408 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68bddf66-0b9f-4bc8-916b-aa0abfbf13c3-combined-ca-bundle\") pod \"placement-54957dcd96-7sx87\" (UID: \"68bddf66-0b9f-4bc8-916b-aa0abfbf13c3\") " pod="openstack/placement-54957dcd96-7sx87" Dec 02 07:45:20 crc kubenswrapper[4895]: I1202 07:45:20.813878 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/68bddf66-0b9f-4bc8-916b-aa0abfbf13c3-internal-tls-certs\") pod \"placement-54957dcd96-7sx87\" (UID: \"68bddf66-0b9f-4bc8-916b-aa0abfbf13c3\") " pod="openstack/placement-54957dcd96-7sx87" Dec 02 07:45:20 crc kubenswrapper[4895]: I1202 07:45:20.823433 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfld8\" (UniqueName: \"kubernetes.io/projected/68bddf66-0b9f-4bc8-916b-aa0abfbf13c3-kube-api-access-qfld8\") pod \"placement-54957dcd96-7sx87\" (UID: \"68bddf66-0b9f-4bc8-916b-aa0abfbf13c3\") " pod="openstack/placement-54957dcd96-7sx87" Dec 02 07:45:20 crc kubenswrapper[4895]: I1202 07:45:20.858616 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-54957dcd96-7sx87" Dec 02 07:45:21 crc kubenswrapper[4895]: I1202 07:45:21.393494 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-54957dcd96-7sx87"] Dec 02 07:45:21 crc kubenswrapper[4895]: W1202 07:45:21.400519 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod68bddf66_0b9f_4bc8_916b_aa0abfbf13c3.slice/crio-1cd723803b0bb7df564099d3f9f177aaf9565eb4053b96a52a29a416703f1444 WatchSource:0}: Error finding container 1cd723803b0bb7df564099d3f9f177aaf9565eb4053b96a52a29a416703f1444: Status 404 returned error can't find the container with id 1cd723803b0bb7df564099d3f9f177aaf9565eb4053b96a52a29a416703f1444 Dec 02 07:45:21 crc kubenswrapper[4895]: I1202 07:45:21.479442 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-558857dd7b-r29g8" event={"ID":"bff723dc-9ac5-4a07-bcac-1c43b1007d41","Type":"ContainerStarted","Data":"1251c1dd7c3b2a3560598b9b45d6575da2e8eba37f3d54df29fb249a9b058a75"} Dec 02 07:45:21 crc kubenswrapper[4895]: I1202 07:45:21.479832 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-558857dd7b-r29g8" event={"ID":"bff723dc-9ac5-4a07-bcac-1c43b1007d41","Type":"ContainerStarted","Data":"35059782385c8b9a5382e746768c0035769e05193982c4458b52a48d2a4d9f06"} Dec 02 07:45:21 crc kubenswrapper[4895]: I1202 07:45:21.480755 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-558857dd7b-r29g8" Dec 02 07:45:21 crc kubenswrapper[4895]: I1202 07:45:21.482696 4895 generic.go:334] "Generic (PLEG): container finished" podID="e7d15bc9-7912-4eab-9c22-23630caecbb4" containerID="020f73bbd49e945d5c90d4d98cfbb78206c4dba84b8249b508b6c2f0f41eb7e3" exitCode=0 Dec 02 07:45:21 crc kubenswrapper[4895]: I1202 07:45:21.482847 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xggw9" 
event={"ID":"e7d15bc9-7912-4eab-9c22-23630caecbb4","Type":"ContainerDied","Data":"020f73bbd49e945d5c90d4d98cfbb78206c4dba84b8249b508b6c2f0f41eb7e3"} Dec 02 07:45:21 crc kubenswrapper[4895]: I1202 07:45:21.493135 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-nk9m9" event={"ID":"540a6298-e74a-48b3-aa5d-93d03c6871de","Type":"ContainerStarted","Data":"940d3a91fbb2068793a0307956bc5c1a1b80502bb0c0f171208f9cca2368ccec"} Dec 02 07:45:21 crc kubenswrapper[4895]: I1202 07:45:21.497333 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-54957dcd96-7sx87" event={"ID":"68bddf66-0b9f-4bc8-916b-aa0abfbf13c3","Type":"ContainerStarted","Data":"1cd723803b0bb7df564099d3f9f177aaf9565eb4053b96a52a29a416703f1444"} Dec 02 07:45:21 crc kubenswrapper[4895]: I1202 07:45:21.528032 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-558857dd7b-r29g8" podStartSLOduration=3.528002425 podStartE2EDuration="3.528002425s" podCreationTimestamp="2025-12-02 07:45:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:45:21.499807045 +0000 UTC m=+1332.670666658" watchObservedRunningTime="2025-12-02 07:45:21.528002425 +0000 UTC m=+1332.698862058" Dec 02 07:45:21 crc kubenswrapper[4895]: I1202 07:45:21.582590 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-55f844cf75-nk9m9" podStartSLOduration=3.582570128 podStartE2EDuration="3.582570128s" podCreationTimestamp="2025-12-02 07:45:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:45:21.58039749 +0000 UTC m=+1332.751257113" watchObservedRunningTime="2025-12-02 07:45:21.582570128 +0000 UTC m=+1332.753429751" Dec 02 07:45:22 crc kubenswrapper[4895]: I1202 07:45:22.525432 4895 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/placement-54957dcd96-7sx87" event={"ID":"68bddf66-0b9f-4bc8-916b-aa0abfbf13c3","Type":"ContainerStarted","Data":"4bce6feae18b88a0dade864ed7f4db319704698221a61e3defcf26b5f9e0a73e"} Dec 02 07:45:22 crc kubenswrapper[4895]: I1202 07:45:22.526090 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-55f844cf75-nk9m9" Dec 02 07:45:23 crc kubenswrapper[4895]: I1202 07:45:23.349464 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-ddf8948cc-h2bbh"] Dec 02 07:45:23 crc kubenswrapper[4895]: I1202 07:45:23.353499 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-ddf8948cc-h2bbh" Dec 02 07:45:23 crc kubenswrapper[4895]: I1202 07:45:23.366431 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-ddf8948cc-h2bbh"] Dec 02 07:45:23 crc kubenswrapper[4895]: I1202 07:45:23.366601 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Dec 02 07:45:23 crc kubenswrapper[4895]: I1202 07:45:23.367252 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Dec 02 07:45:23 crc kubenswrapper[4895]: I1202 07:45:23.490578 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab5ec753-410a-4d4b-8071-ce60970ba4df-internal-tls-certs\") pod \"neutron-ddf8948cc-h2bbh\" (UID: \"ab5ec753-410a-4d4b-8071-ce60970ba4df\") " pod="openstack/neutron-ddf8948cc-h2bbh" Dec 02 07:45:23 crc kubenswrapper[4895]: I1202 07:45:23.490666 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ab5ec753-410a-4d4b-8071-ce60970ba4df-config\") pod \"neutron-ddf8948cc-h2bbh\" (UID: \"ab5ec753-410a-4d4b-8071-ce60970ba4df\") " 
pod="openstack/neutron-ddf8948cc-h2bbh" Dec 02 07:45:23 crc kubenswrapper[4895]: I1202 07:45:23.490686 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab5ec753-410a-4d4b-8071-ce60970ba4df-combined-ca-bundle\") pod \"neutron-ddf8948cc-h2bbh\" (UID: \"ab5ec753-410a-4d4b-8071-ce60970ba4df\") " pod="openstack/neutron-ddf8948cc-h2bbh" Dec 02 07:45:23 crc kubenswrapper[4895]: I1202 07:45:23.490763 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab5ec753-410a-4d4b-8071-ce60970ba4df-public-tls-certs\") pod \"neutron-ddf8948cc-h2bbh\" (UID: \"ab5ec753-410a-4d4b-8071-ce60970ba4df\") " pod="openstack/neutron-ddf8948cc-h2bbh" Dec 02 07:45:23 crc kubenswrapper[4895]: I1202 07:45:23.490888 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjzj5\" (UniqueName: \"kubernetes.io/projected/ab5ec753-410a-4d4b-8071-ce60970ba4df-kube-api-access-sjzj5\") pod \"neutron-ddf8948cc-h2bbh\" (UID: \"ab5ec753-410a-4d4b-8071-ce60970ba4df\") " pod="openstack/neutron-ddf8948cc-h2bbh" Dec 02 07:45:23 crc kubenswrapper[4895]: I1202 07:45:23.490938 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ab5ec753-410a-4d4b-8071-ce60970ba4df-httpd-config\") pod \"neutron-ddf8948cc-h2bbh\" (UID: \"ab5ec753-410a-4d4b-8071-ce60970ba4df\") " pod="openstack/neutron-ddf8948cc-h2bbh" Dec 02 07:45:23 crc kubenswrapper[4895]: I1202 07:45:23.490989 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab5ec753-410a-4d4b-8071-ce60970ba4df-ovndb-tls-certs\") pod \"neutron-ddf8948cc-h2bbh\" (UID: \"ab5ec753-410a-4d4b-8071-ce60970ba4df\") 
" pod="openstack/neutron-ddf8948cc-h2bbh" Dec 02 07:45:23 crc kubenswrapper[4895]: I1202 07:45:23.544233 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-54957dcd96-7sx87" event={"ID":"68bddf66-0b9f-4bc8-916b-aa0abfbf13c3","Type":"ContainerStarted","Data":"79507980e01b07ea773d434932da83cc407f386cc2f4f05c605e4f8341d7bef2"} Dec 02 07:45:23 crc kubenswrapper[4895]: I1202 07:45:23.544314 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-54957dcd96-7sx87" Dec 02 07:45:23 crc kubenswrapper[4895]: I1202 07:45:23.544337 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-54957dcd96-7sx87" Dec 02 07:45:23 crc kubenswrapper[4895]: I1202 07:45:23.575250 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-54957dcd96-7sx87" podStartSLOduration=3.575231299 podStartE2EDuration="3.575231299s" podCreationTimestamp="2025-12-02 07:45:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:45:23.568496481 +0000 UTC m=+1334.739356094" watchObservedRunningTime="2025-12-02 07:45:23.575231299 +0000 UTC m=+1334.746090912" Dec 02 07:45:23 crc kubenswrapper[4895]: I1202 07:45:23.593026 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ab5ec753-410a-4d4b-8071-ce60970ba4df-config\") pod \"neutron-ddf8948cc-h2bbh\" (UID: \"ab5ec753-410a-4d4b-8071-ce60970ba4df\") " pod="openstack/neutron-ddf8948cc-h2bbh" Dec 02 07:45:23 crc kubenswrapper[4895]: I1202 07:45:23.593103 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab5ec753-410a-4d4b-8071-ce60970ba4df-combined-ca-bundle\") pod \"neutron-ddf8948cc-h2bbh\" (UID: \"ab5ec753-410a-4d4b-8071-ce60970ba4df\") " 
pod="openstack/neutron-ddf8948cc-h2bbh" Dec 02 07:45:23 crc kubenswrapper[4895]: I1202 07:45:23.593138 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab5ec753-410a-4d4b-8071-ce60970ba4df-public-tls-certs\") pod \"neutron-ddf8948cc-h2bbh\" (UID: \"ab5ec753-410a-4d4b-8071-ce60970ba4df\") " pod="openstack/neutron-ddf8948cc-h2bbh" Dec 02 07:45:23 crc kubenswrapper[4895]: I1202 07:45:23.593206 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sjzj5\" (UniqueName: \"kubernetes.io/projected/ab5ec753-410a-4d4b-8071-ce60970ba4df-kube-api-access-sjzj5\") pod \"neutron-ddf8948cc-h2bbh\" (UID: \"ab5ec753-410a-4d4b-8071-ce60970ba4df\") " pod="openstack/neutron-ddf8948cc-h2bbh" Dec 02 07:45:23 crc kubenswrapper[4895]: I1202 07:45:23.593244 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ab5ec753-410a-4d4b-8071-ce60970ba4df-httpd-config\") pod \"neutron-ddf8948cc-h2bbh\" (UID: \"ab5ec753-410a-4d4b-8071-ce60970ba4df\") " pod="openstack/neutron-ddf8948cc-h2bbh" Dec 02 07:45:23 crc kubenswrapper[4895]: I1202 07:45:23.593289 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab5ec753-410a-4d4b-8071-ce60970ba4df-ovndb-tls-certs\") pod \"neutron-ddf8948cc-h2bbh\" (UID: \"ab5ec753-410a-4d4b-8071-ce60970ba4df\") " pod="openstack/neutron-ddf8948cc-h2bbh" Dec 02 07:45:23 crc kubenswrapper[4895]: I1202 07:45:23.593603 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab5ec753-410a-4d4b-8071-ce60970ba4df-internal-tls-certs\") pod \"neutron-ddf8948cc-h2bbh\" (UID: \"ab5ec753-410a-4d4b-8071-ce60970ba4df\") " pod="openstack/neutron-ddf8948cc-h2bbh" Dec 02 07:45:23 crc kubenswrapper[4895]: I1202 
07:45:23.604704 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab5ec753-410a-4d4b-8071-ce60970ba4df-public-tls-certs\") pod \"neutron-ddf8948cc-h2bbh\" (UID: \"ab5ec753-410a-4d4b-8071-ce60970ba4df\") " pod="openstack/neutron-ddf8948cc-h2bbh" Dec 02 07:45:23 crc kubenswrapper[4895]: I1202 07:45:23.605712 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab5ec753-410a-4d4b-8071-ce60970ba4df-internal-tls-certs\") pod \"neutron-ddf8948cc-h2bbh\" (UID: \"ab5ec753-410a-4d4b-8071-ce60970ba4df\") " pod="openstack/neutron-ddf8948cc-h2bbh" Dec 02 07:45:23 crc kubenswrapper[4895]: I1202 07:45:23.606380 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/ab5ec753-410a-4d4b-8071-ce60970ba4df-config\") pod \"neutron-ddf8948cc-h2bbh\" (UID: \"ab5ec753-410a-4d4b-8071-ce60970ba4df\") " pod="openstack/neutron-ddf8948cc-h2bbh" Dec 02 07:45:23 crc kubenswrapper[4895]: I1202 07:45:23.611256 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab5ec753-410a-4d4b-8071-ce60970ba4df-combined-ca-bundle\") pod \"neutron-ddf8948cc-h2bbh\" (UID: \"ab5ec753-410a-4d4b-8071-ce60970ba4df\") " pod="openstack/neutron-ddf8948cc-h2bbh" Dec 02 07:45:23 crc kubenswrapper[4895]: I1202 07:45:23.611810 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab5ec753-410a-4d4b-8071-ce60970ba4df-ovndb-tls-certs\") pod \"neutron-ddf8948cc-h2bbh\" (UID: \"ab5ec753-410a-4d4b-8071-ce60970ba4df\") " pod="openstack/neutron-ddf8948cc-h2bbh" Dec 02 07:45:23 crc kubenswrapper[4895]: I1202 07:45:23.623274 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: 
\"kubernetes.io/secret/ab5ec753-410a-4d4b-8071-ce60970ba4df-httpd-config\") pod \"neutron-ddf8948cc-h2bbh\" (UID: \"ab5ec753-410a-4d4b-8071-ce60970ba4df\") " pod="openstack/neutron-ddf8948cc-h2bbh" Dec 02 07:45:23 crc kubenswrapper[4895]: I1202 07:45:23.625682 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjzj5\" (UniqueName: \"kubernetes.io/projected/ab5ec753-410a-4d4b-8071-ce60970ba4df-kube-api-access-sjzj5\") pod \"neutron-ddf8948cc-h2bbh\" (UID: \"ab5ec753-410a-4d4b-8071-ce60970ba4df\") " pod="openstack/neutron-ddf8948cc-h2bbh" Dec 02 07:45:23 crc kubenswrapper[4895]: I1202 07:45:23.679630 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-ddf8948cc-h2bbh" Dec 02 07:45:24 crc kubenswrapper[4895]: I1202 07:45:24.525458 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-xggw9" Dec 02 07:45:24 crc kubenswrapper[4895]: I1202 07:45:24.558418 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-xggw9" Dec 02 07:45:24 crc kubenswrapper[4895]: I1202 07:45:24.558823 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xggw9" event={"ID":"e7d15bc9-7912-4eab-9c22-23630caecbb4","Type":"ContainerDied","Data":"a1fc0ddd0c9a7d6bc2525042eade3078ceaf98532eb2185ba3b719c67000dfd9"} Dec 02 07:45:24 crc kubenswrapper[4895]: I1202 07:45:24.558857 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a1fc0ddd0c9a7d6bc2525042eade3078ceaf98532eb2185ba3b719c67000dfd9" Dec 02 07:45:24 crc kubenswrapper[4895]: I1202 07:45:24.596823 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 02 07:45:24 crc kubenswrapper[4895]: I1202 07:45:24.596904 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 02 07:45:24 crc kubenswrapper[4895]: I1202 07:45:24.614540 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e7d15bc9-7912-4eab-9c22-23630caecbb4-credential-keys\") pod \"e7d15bc9-7912-4eab-9c22-23630caecbb4\" (UID: \"e7d15bc9-7912-4eab-9c22-23630caecbb4\") " Dec 02 07:45:24 crc kubenswrapper[4895]: I1202 07:45:24.614607 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7d15bc9-7912-4eab-9c22-23630caecbb4-combined-ca-bundle\") pod \"e7d15bc9-7912-4eab-9c22-23630caecbb4\" (UID: \"e7d15bc9-7912-4eab-9c22-23630caecbb4\") " Dec 02 07:45:24 crc kubenswrapper[4895]: I1202 07:45:24.614657 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m5ql8\" (UniqueName: \"kubernetes.io/projected/e7d15bc9-7912-4eab-9c22-23630caecbb4-kube-api-access-m5ql8\") pod \"e7d15bc9-7912-4eab-9c22-23630caecbb4\" (UID: 
\"e7d15bc9-7912-4eab-9c22-23630caecbb4\") " Dec 02 07:45:24 crc kubenswrapper[4895]: I1202 07:45:24.614802 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e7d15bc9-7912-4eab-9c22-23630caecbb4-fernet-keys\") pod \"e7d15bc9-7912-4eab-9c22-23630caecbb4\" (UID: \"e7d15bc9-7912-4eab-9c22-23630caecbb4\") " Dec 02 07:45:24 crc kubenswrapper[4895]: I1202 07:45:24.614903 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7d15bc9-7912-4eab-9c22-23630caecbb4-scripts\") pod \"e7d15bc9-7912-4eab-9c22-23630caecbb4\" (UID: \"e7d15bc9-7912-4eab-9c22-23630caecbb4\") " Dec 02 07:45:24 crc kubenswrapper[4895]: I1202 07:45:24.615067 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7d15bc9-7912-4eab-9c22-23630caecbb4-config-data\") pod \"e7d15bc9-7912-4eab-9c22-23630caecbb4\" (UID: \"e7d15bc9-7912-4eab-9c22-23630caecbb4\") " Dec 02 07:45:24 crc kubenswrapper[4895]: I1202 07:45:24.623201 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7d15bc9-7912-4eab-9c22-23630caecbb4-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "e7d15bc9-7912-4eab-9c22-23630caecbb4" (UID: "e7d15bc9-7912-4eab-9c22-23630caecbb4"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:45:24 crc kubenswrapper[4895]: I1202 07:45:24.623795 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7d15bc9-7912-4eab-9c22-23630caecbb4-kube-api-access-m5ql8" (OuterVolumeSpecName: "kube-api-access-m5ql8") pod "e7d15bc9-7912-4eab-9c22-23630caecbb4" (UID: "e7d15bc9-7912-4eab-9c22-23630caecbb4"). InnerVolumeSpecName "kube-api-access-m5ql8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:45:24 crc kubenswrapper[4895]: I1202 07:45:24.624979 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7d15bc9-7912-4eab-9c22-23630caecbb4-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "e7d15bc9-7912-4eab-9c22-23630caecbb4" (UID: "e7d15bc9-7912-4eab-9c22-23630caecbb4"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:45:24 crc kubenswrapper[4895]: I1202 07:45:24.628273 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7d15bc9-7912-4eab-9c22-23630caecbb4-scripts" (OuterVolumeSpecName: "scripts") pod "e7d15bc9-7912-4eab-9c22-23630caecbb4" (UID: "e7d15bc9-7912-4eab-9c22-23630caecbb4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:45:24 crc kubenswrapper[4895]: I1202 07:45:24.649644 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7d15bc9-7912-4eab-9c22-23630caecbb4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e7d15bc9-7912-4eab-9c22-23630caecbb4" (UID: "e7d15bc9-7912-4eab-9c22-23630caecbb4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:45:24 crc kubenswrapper[4895]: I1202 07:45:24.653558 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 02 07:45:24 crc kubenswrapper[4895]: I1202 07:45:24.660028 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7d15bc9-7912-4eab-9c22-23630caecbb4-config-data" (OuterVolumeSpecName: "config-data") pod "e7d15bc9-7912-4eab-9c22-23630caecbb4" (UID: "e7d15bc9-7912-4eab-9c22-23630caecbb4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:45:24 crc kubenswrapper[4895]: I1202 07:45:24.673052 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 02 07:45:24 crc kubenswrapper[4895]: I1202 07:45:24.722513 4895 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e7d15bc9-7912-4eab-9c22-23630caecbb4-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 02 07:45:24 crc kubenswrapper[4895]: I1202 07:45:24.722862 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7d15bc9-7912-4eab-9c22-23630caecbb4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 07:45:24 crc kubenswrapper[4895]: I1202 07:45:24.722979 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m5ql8\" (UniqueName: \"kubernetes.io/projected/e7d15bc9-7912-4eab-9c22-23630caecbb4-kube-api-access-m5ql8\") on node \"crc\" DevicePath \"\"" Dec 02 07:45:24 crc kubenswrapper[4895]: I1202 07:45:24.723066 4895 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e7d15bc9-7912-4eab-9c22-23630caecbb4-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 02 07:45:24 crc kubenswrapper[4895]: I1202 07:45:24.723143 4895 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7d15bc9-7912-4eab-9c22-23630caecbb4-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 07:45:24 crc kubenswrapper[4895]: I1202 07:45:24.723239 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7d15bc9-7912-4eab-9c22-23630caecbb4-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 07:45:25 crc kubenswrapper[4895]: I1202 07:45:25.571731 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/glance-default-external-api-0" Dec 02 07:45:25 crc kubenswrapper[4895]: I1202 07:45:25.572328 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 02 07:45:25 crc kubenswrapper[4895]: I1202 07:45:25.635784 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-56dbdc9bc-kgkw2"] Dec 02 07:45:25 crc kubenswrapper[4895]: E1202 07:45:25.636217 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7d15bc9-7912-4eab-9c22-23630caecbb4" containerName="keystone-bootstrap" Dec 02 07:45:25 crc kubenswrapper[4895]: I1202 07:45:25.636236 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7d15bc9-7912-4eab-9c22-23630caecbb4" containerName="keystone-bootstrap" Dec 02 07:45:25 crc kubenswrapper[4895]: I1202 07:45:25.636443 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7d15bc9-7912-4eab-9c22-23630caecbb4" containerName="keystone-bootstrap" Dec 02 07:45:25 crc kubenswrapper[4895]: I1202 07:45:25.637118 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-56dbdc9bc-kgkw2" Dec 02 07:45:25 crc kubenswrapper[4895]: I1202 07:45:25.649820 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Dec 02 07:45:25 crc kubenswrapper[4895]: I1202 07:45:25.650060 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Dec 02 07:45:25 crc kubenswrapper[4895]: I1202 07:45:25.650178 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 02 07:45:25 crc kubenswrapper[4895]: I1202 07:45:25.650203 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 02 07:45:25 crc kubenswrapper[4895]: I1202 07:45:25.650290 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 02 07:45:25 crc kubenswrapper[4895]: I1202 07:45:25.650969 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-qm9nx" Dec 02 07:45:25 crc kubenswrapper[4895]: I1202 07:45:25.653373 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-56dbdc9bc-kgkw2"] Dec 02 07:45:25 crc kubenswrapper[4895]: I1202 07:45:25.747026 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/247c892c-e00a-474e-8022-73bd1b2249f3-internal-tls-certs\") pod \"keystone-56dbdc9bc-kgkw2\" (UID: \"247c892c-e00a-474e-8022-73bd1b2249f3\") " pod="openstack/keystone-56dbdc9bc-kgkw2" Dec 02 07:45:25 crc kubenswrapper[4895]: I1202 07:45:25.747289 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/247c892c-e00a-474e-8022-73bd1b2249f3-fernet-keys\") pod \"keystone-56dbdc9bc-kgkw2\" (UID: \"247c892c-e00a-474e-8022-73bd1b2249f3\") " 
pod="openstack/keystone-56dbdc9bc-kgkw2" Dec 02 07:45:25 crc kubenswrapper[4895]: I1202 07:45:25.747567 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/247c892c-e00a-474e-8022-73bd1b2249f3-public-tls-certs\") pod \"keystone-56dbdc9bc-kgkw2\" (UID: \"247c892c-e00a-474e-8022-73bd1b2249f3\") " pod="openstack/keystone-56dbdc9bc-kgkw2" Dec 02 07:45:25 crc kubenswrapper[4895]: I1202 07:45:25.747710 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/247c892c-e00a-474e-8022-73bd1b2249f3-combined-ca-bundle\") pod \"keystone-56dbdc9bc-kgkw2\" (UID: \"247c892c-e00a-474e-8022-73bd1b2249f3\") " pod="openstack/keystone-56dbdc9bc-kgkw2" Dec 02 07:45:25 crc kubenswrapper[4895]: I1202 07:45:25.747792 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/247c892c-e00a-474e-8022-73bd1b2249f3-config-data\") pod \"keystone-56dbdc9bc-kgkw2\" (UID: \"247c892c-e00a-474e-8022-73bd1b2249f3\") " pod="openstack/keystone-56dbdc9bc-kgkw2" Dec 02 07:45:25 crc kubenswrapper[4895]: I1202 07:45:25.747859 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/247c892c-e00a-474e-8022-73bd1b2249f3-scripts\") pod \"keystone-56dbdc9bc-kgkw2\" (UID: \"247c892c-e00a-474e-8022-73bd1b2249f3\") " pod="openstack/keystone-56dbdc9bc-kgkw2" Dec 02 07:45:25 crc kubenswrapper[4895]: I1202 07:45:25.747945 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/247c892c-e00a-474e-8022-73bd1b2249f3-credential-keys\") pod \"keystone-56dbdc9bc-kgkw2\" (UID: \"247c892c-e00a-474e-8022-73bd1b2249f3\") " 
pod="openstack/keystone-56dbdc9bc-kgkw2" Dec 02 07:45:25 crc kubenswrapper[4895]: I1202 07:45:25.748013 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzxf2\" (UniqueName: \"kubernetes.io/projected/247c892c-e00a-474e-8022-73bd1b2249f3-kube-api-access-pzxf2\") pod \"keystone-56dbdc9bc-kgkw2\" (UID: \"247c892c-e00a-474e-8022-73bd1b2249f3\") " pod="openstack/keystone-56dbdc9bc-kgkw2" Dec 02 07:45:25 crc kubenswrapper[4895]: I1202 07:45:25.775498 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 02 07:45:25 crc kubenswrapper[4895]: I1202 07:45:25.775550 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 02 07:45:25 crc kubenswrapper[4895]: I1202 07:45:25.824039 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 02 07:45:25 crc kubenswrapper[4895]: I1202 07:45:25.833873 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 02 07:45:25 crc kubenswrapper[4895]: I1202 07:45:25.849401 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/247c892c-e00a-474e-8022-73bd1b2249f3-internal-tls-certs\") pod \"keystone-56dbdc9bc-kgkw2\" (UID: \"247c892c-e00a-474e-8022-73bd1b2249f3\") " pod="openstack/keystone-56dbdc9bc-kgkw2" Dec 02 07:45:25 crc kubenswrapper[4895]: I1202 07:45:25.849490 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/247c892c-e00a-474e-8022-73bd1b2249f3-fernet-keys\") pod \"keystone-56dbdc9bc-kgkw2\" (UID: \"247c892c-e00a-474e-8022-73bd1b2249f3\") " pod="openstack/keystone-56dbdc9bc-kgkw2" Dec 02 07:45:25 crc kubenswrapper[4895]: I1202 
07:45:25.849595 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/247c892c-e00a-474e-8022-73bd1b2249f3-public-tls-certs\") pod \"keystone-56dbdc9bc-kgkw2\" (UID: \"247c892c-e00a-474e-8022-73bd1b2249f3\") " pod="openstack/keystone-56dbdc9bc-kgkw2" Dec 02 07:45:25 crc kubenswrapper[4895]: I1202 07:45:25.849653 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/247c892c-e00a-474e-8022-73bd1b2249f3-combined-ca-bundle\") pod \"keystone-56dbdc9bc-kgkw2\" (UID: \"247c892c-e00a-474e-8022-73bd1b2249f3\") " pod="openstack/keystone-56dbdc9bc-kgkw2" Dec 02 07:45:25 crc kubenswrapper[4895]: I1202 07:45:25.849692 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/247c892c-e00a-474e-8022-73bd1b2249f3-config-data\") pod \"keystone-56dbdc9bc-kgkw2\" (UID: \"247c892c-e00a-474e-8022-73bd1b2249f3\") " pod="openstack/keystone-56dbdc9bc-kgkw2" Dec 02 07:45:25 crc kubenswrapper[4895]: I1202 07:45:25.849722 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/247c892c-e00a-474e-8022-73bd1b2249f3-scripts\") pod \"keystone-56dbdc9bc-kgkw2\" (UID: \"247c892c-e00a-474e-8022-73bd1b2249f3\") " pod="openstack/keystone-56dbdc9bc-kgkw2" Dec 02 07:45:25 crc kubenswrapper[4895]: I1202 07:45:25.849795 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/247c892c-e00a-474e-8022-73bd1b2249f3-credential-keys\") pod \"keystone-56dbdc9bc-kgkw2\" (UID: \"247c892c-e00a-474e-8022-73bd1b2249f3\") " pod="openstack/keystone-56dbdc9bc-kgkw2" Dec 02 07:45:25 crc kubenswrapper[4895]: I1202 07:45:25.849817 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-pzxf2\" (UniqueName: \"kubernetes.io/projected/247c892c-e00a-474e-8022-73bd1b2249f3-kube-api-access-pzxf2\") pod \"keystone-56dbdc9bc-kgkw2\" (UID: \"247c892c-e00a-474e-8022-73bd1b2249f3\") " pod="openstack/keystone-56dbdc9bc-kgkw2" Dec 02 07:45:25 crc kubenswrapper[4895]: I1202 07:45:25.860558 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/247c892c-e00a-474e-8022-73bd1b2249f3-public-tls-certs\") pod \"keystone-56dbdc9bc-kgkw2\" (UID: \"247c892c-e00a-474e-8022-73bd1b2249f3\") " pod="openstack/keystone-56dbdc9bc-kgkw2" Dec 02 07:45:25 crc kubenswrapper[4895]: I1202 07:45:25.862216 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/247c892c-e00a-474e-8022-73bd1b2249f3-fernet-keys\") pod \"keystone-56dbdc9bc-kgkw2\" (UID: \"247c892c-e00a-474e-8022-73bd1b2249f3\") " pod="openstack/keystone-56dbdc9bc-kgkw2" Dec 02 07:45:25 crc kubenswrapper[4895]: I1202 07:45:25.862620 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/247c892c-e00a-474e-8022-73bd1b2249f3-internal-tls-certs\") pod \"keystone-56dbdc9bc-kgkw2\" (UID: \"247c892c-e00a-474e-8022-73bd1b2249f3\") " pod="openstack/keystone-56dbdc9bc-kgkw2" Dec 02 07:45:25 crc kubenswrapper[4895]: I1202 07:45:25.868032 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/247c892c-e00a-474e-8022-73bd1b2249f3-config-data\") pod \"keystone-56dbdc9bc-kgkw2\" (UID: \"247c892c-e00a-474e-8022-73bd1b2249f3\") " pod="openstack/keystone-56dbdc9bc-kgkw2" Dec 02 07:45:25 crc kubenswrapper[4895]: I1202 07:45:25.869989 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/247c892c-e00a-474e-8022-73bd1b2249f3-credential-keys\") pod 
\"keystone-56dbdc9bc-kgkw2\" (UID: \"247c892c-e00a-474e-8022-73bd1b2249f3\") " pod="openstack/keystone-56dbdc9bc-kgkw2" Dec 02 07:45:25 crc kubenswrapper[4895]: I1202 07:45:25.871425 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/247c892c-e00a-474e-8022-73bd1b2249f3-scripts\") pod \"keystone-56dbdc9bc-kgkw2\" (UID: \"247c892c-e00a-474e-8022-73bd1b2249f3\") " pod="openstack/keystone-56dbdc9bc-kgkw2" Dec 02 07:45:25 crc kubenswrapper[4895]: I1202 07:45:25.874282 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzxf2\" (UniqueName: \"kubernetes.io/projected/247c892c-e00a-474e-8022-73bd1b2249f3-kube-api-access-pzxf2\") pod \"keystone-56dbdc9bc-kgkw2\" (UID: \"247c892c-e00a-474e-8022-73bd1b2249f3\") " pod="openstack/keystone-56dbdc9bc-kgkw2" Dec 02 07:45:25 crc kubenswrapper[4895]: I1202 07:45:25.882537 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/247c892c-e00a-474e-8022-73bd1b2249f3-combined-ca-bundle\") pod \"keystone-56dbdc9bc-kgkw2\" (UID: \"247c892c-e00a-474e-8022-73bd1b2249f3\") " pod="openstack/keystone-56dbdc9bc-kgkw2" Dec 02 07:45:25 crc kubenswrapper[4895]: I1202 07:45:25.995571 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-56dbdc9bc-kgkw2" Dec 02 07:45:26 crc kubenswrapper[4895]: I1202 07:45:26.580460 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 02 07:45:26 crc kubenswrapper[4895]: I1202 07:45:26.580922 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 02 07:45:27 crc kubenswrapper[4895]: I1202 07:45:27.847418 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 02 07:45:27 crc kubenswrapper[4895]: I1202 07:45:27.848072 4895 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 02 07:45:27 crc kubenswrapper[4895]: I1202 07:45:27.856716 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 02 07:45:28 crc kubenswrapper[4895]: I1202 07:45:28.750272 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-ddf8948cc-h2bbh"] Dec 02 07:45:28 crc kubenswrapper[4895]: W1202 07:45:28.785235 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podab5ec753_410a_4d4b_8071_ce60970ba4df.slice/crio-e47f6a84140bc13b1f6bdc81fbcae924a4055e6a0a8c633a5874fb7f744bfb46 WatchSource:0}: Error finding container e47f6a84140bc13b1f6bdc81fbcae924a4055e6a0a8c633a5874fb7f744bfb46: Status 404 returned error can't find the container with id e47f6a84140bc13b1f6bdc81fbcae924a4055e6a0a8c633a5874fb7f744bfb46 Dec 02 07:45:28 crc kubenswrapper[4895]: I1202 07:45:28.885791 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-56dbdc9bc-kgkw2"] Dec 02 07:45:29 crc kubenswrapper[4895]: I1202 07:45:28.995941 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-55f844cf75-nk9m9" Dec 02 07:45:29 crc 
kubenswrapper[4895]: I1202 07:45:29.101971 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-rbvw7"] Dec 02 07:45:29 crc kubenswrapper[4895]: I1202 07:45:29.102633 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-785d8bcb8c-rbvw7" podUID="5b232bc6-67c7-4add-9057-806e74ef162e" containerName="dnsmasq-dns" containerID="cri-o://8d4392f2e8bcb7eff491b6e3fcb4e08ec284a1dff98eb11958465bf1f2f1121b" gracePeriod=10 Dec 02 07:45:29 crc kubenswrapper[4895]: I1202 07:45:29.503423 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 02 07:45:29 crc kubenswrapper[4895]: I1202 07:45:29.503537 4895 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 02 07:45:29 crc kubenswrapper[4895]: I1202 07:45:29.696016 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-rbvw7" Dec 02 07:45:29 crc kubenswrapper[4895]: I1202 07:45:29.760090 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-xdfqx" event={"ID":"96ece5f3-3dc5-41db-a8e9-37e6f9054dd8","Type":"ContainerStarted","Data":"93b2d419eb18cfab0debf5c9a11d016c6acafb519c1028e43dceebb955ec1a84"} Dec 02 07:45:29 crc kubenswrapper[4895]: I1202 07:45:29.766924 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-56dbdc9bc-kgkw2" event={"ID":"247c892c-e00a-474e-8022-73bd1b2249f3","Type":"ContainerStarted","Data":"fe38dca9f6627e9e19b2be20b54cb47cb1aee5e491dae454c261bcbe08243752"} Dec 02 07:45:29 crc kubenswrapper[4895]: I1202 07:45:29.767012 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-56dbdc9bc-kgkw2" event={"ID":"247c892c-e00a-474e-8022-73bd1b2249f3","Type":"ContainerStarted","Data":"278224f012e51a3c3f0f8cebf3193397ed3b386576713370432fad71826ecd5f"} Dec 02 07:45:29 crc kubenswrapper[4895]: I1202 
07:45:29.768124 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-56dbdc9bc-kgkw2" Dec 02 07:45:29 crc kubenswrapper[4895]: I1202 07:45:29.791126 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-ddf8948cc-h2bbh" event={"ID":"ab5ec753-410a-4d4b-8071-ce60970ba4df","Type":"ContainerStarted","Data":"44ae8909515453d51c81fc2eab9723fc18e5cf8dc79ec16427db8d716e2d75dd"} Dec 02 07:45:29 crc kubenswrapper[4895]: I1202 07:45:29.791189 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-ddf8948cc-h2bbh" event={"ID":"ab5ec753-410a-4d4b-8071-ce60970ba4df","Type":"ContainerStarted","Data":"e47f6a84140bc13b1f6bdc81fbcae924a4055e6a0a8c633a5874fb7f744bfb46"} Dec 02 07:45:29 crc kubenswrapper[4895]: I1202 07:45:29.792540 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5b232bc6-67c7-4add-9057-806e74ef162e-ovsdbserver-nb\") pod \"5b232bc6-67c7-4add-9057-806e74ef162e\" (UID: \"5b232bc6-67c7-4add-9057-806e74ef162e\") " Dec 02 07:45:29 crc kubenswrapper[4895]: I1202 07:45:29.792603 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bjdqf\" (UniqueName: \"kubernetes.io/projected/5b232bc6-67c7-4add-9057-806e74ef162e-kube-api-access-bjdqf\") pod \"5b232bc6-67c7-4add-9057-806e74ef162e\" (UID: \"5b232bc6-67c7-4add-9057-806e74ef162e\") " Dec 02 07:45:29 crc kubenswrapper[4895]: I1202 07:45:29.792669 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5b232bc6-67c7-4add-9057-806e74ef162e-ovsdbserver-sb\") pod \"5b232bc6-67c7-4add-9057-806e74ef162e\" (UID: \"5b232bc6-67c7-4add-9057-806e74ef162e\") " Dec 02 07:45:29 crc kubenswrapper[4895]: I1202 07:45:29.792801 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/5b232bc6-67c7-4add-9057-806e74ef162e-config\") pod \"5b232bc6-67c7-4add-9057-806e74ef162e\" (UID: \"5b232bc6-67c7-4add-9057-806e74ef162e\") " Dec 02 07:45:29 crc kubenswrapper[4895]: I1202 07:45:29.792824 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5b232bc6-67c7-4add-9057-806e74ef162e-dns-svc\") pod \"5b232bc6-67c7-4add-9057-806e74ef162e\" (UID: \"5b232bc6-67c7-4add-9057-806e74ef162e\") " Dec 02 07:45:29 crc kubenswrapper[4895]: I1202 07:45:29.792853 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5b232bc6-67c7-4add-9057-806e74ef162e-dns-swift-storage-0\") pod \"5b232bc6-67c7-4add-9057-806e74ef162e\" (UID: \"5b232bc6-67c7-4add-9057-806e74ef162e\") " Dec 02 07:45:29 crc kubenswrapper[4895]: I1202 07:45:29.803316 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-xdfqx" podStartSLOduration=3.145528238 podStartE2EDuration="39.791724706s" podCreationTimestamp="2025-12-02 07:44:50 +0000 UTC" firstStartedPulling="2025-12-02 07:44:51.726461436 +0000 UTC m=+1302.897321049" lastFinishedPulling="2025-12-02 07:45:28.372657904 +0000 UTC m=+1339.543517517" observedRunningTime="2025-12-02 07:45:29.785561876 +0000 UTC m=+1340.956421509" watchObservedRunningTime="2025-12-02 07:45:29.791724706 +0000 UTC m=+1340.962584319" Dec 02 07:45:29 crc kubenswrapper[4895]: I1202 07:45:29.842544 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-56dbdc9bc-kgkw2" podStartSLOduration=4.842519943 podStartE2EDuration="4.842519943s" podCreationTimestamp="2025-12-02 07:45:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:45:29.828659786 +0000 UTC m=+1340.999519399" 
watchObservedRunningTime="2025-12-02 07:45:29.842519943 +0000 UTC m=+1341.013379556" Dec 02 07:45:29 crc kubenswrapper[4895]: I1202 07:45:29.846043 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"28a2d92c-8fd5-43a8-813c-7b1c49264fcd","Type":"ContainerStarted","Data":"b4214addbfaabc09997052827a3fdefb258c86367a4da84d2c8edacf38b19b42"} Dec 02 07:45:29 crc kubenswrapper[4895]: I1202 07:45:29.855367 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b232bc6-67c7-4add-9057-806e74ef162e-kube-api-access-bjdqf" (OuterVolumeSpecName: "kube-api-access-bjdqf") pod "5b232bc6-67c7-4add-9057-806e74ef162e" (UID: "5b232bc6-67c7-4add-9057-806e74ef162e"). InnerVolumeSpecName "kube-api-access-bjdqf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:45:29 crc kubenswrapper[4895]: I1202 07:45:29.881563 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-44vd8" event={"ID":"56723c9c-15bf-4eaa-896c-ea5d07066b27","Type":"ContainerStarted","Data":"b0e5d2da099fb073b8b5423e92932a90a8ea91c926fc7b91aa4ebeabcd5e1f3b"} Dec 02 07:45:29 crc kubenswrapper[4895]: I1202 07:45:29.893248 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b232bc6-67c7-4add-9057-806e74ef162e-config" (OuterVolumeSpecName: "config") pod "5b232bc6-67c7-4add-9057-806e74ef162e" (UID: "5b232bc6-67c7-4add-9057-806e74ef162e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:45:29 crc kubenswrapper[4895]: I1202 07:45:29.896762 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bjdqf\" (UniqueName: \"kubernetes.io/projected/5b232bc6-67c7-4add-9057-806e74ef162e-kube-api-access-bjdqf\") on node \"crc\" DevicePath \"\"" Dec 02 07:45:29 crc kubenswrapper[4895]: I1202 07:45:29.896884 4895 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b232bc6-67c7-4add-9057-806e74ef162e-config\") on node \"crc\" DevicePath \"\"" Dec 02 07:45:29 crc kubenswrapper[4895]: I1202 07:45:29.901516 4895 generic.go:334] "Generic (PLEG): container finished" podID="5b232bc6-67c7-4add-9057-806e74ef162e" containerID="8d4392f2e8bcb7eff491b6e3fcb4e08ec284a1dff98eb11958465bf1f2f1121b" exitCode=0 Dec 02 07:45:29 crc kubenswrapper[4895]: I1202 07:45:29.902223 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-rbvw7" event={"ID":"5b232bc6-67c7-4add-9057-806e74ef162e","Type":"ContainerDied","Data":"8d4392f2e8bcb7eff491b6e3fcb4e08ec284a1dff98eb11958465bf1f2f1121b"} Dec 02 07:45:29 crc kubenswrapper[4895]: I1202 07:45:29.902346 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-rbvw7" event={"ID":"5b232bc6-67c7-4add-9057-806e74ef162e","Type":"ContainerDied","Data":"ad473fa35cda87ac095d267204f35bacb11843bafcf3b8adfabe8bdc2145efed"} Dec 02 07:45:29 crc kubenswrapper[4895]: I1202 07:45:29.902447 4895 scope.go:117] "RemoveContainer" containerID="8d4392f2e8bcb7eff491b6e3fcb4e08ec284a1dff98eb11958465bf1f2f1121b" Dec 02 07:45:29 crc kubenswrapper[4895]: I1202 07:45:29.902839 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-rbvw7" Dec 02 07:45:29 crc kubenswrapper[4895]: I1202 07:45:29.919051 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 02 07:45:29 crc kubenswrapper[4895]: I1202 07:45:29.928086 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-44vd8" podStartSLOduration=4.15303256 podStartE2EDuration="40.928065602s" podCreationTimestamp="2025-12-02 07:44:49 +0000 UTC" firstStartedPulling="2025-12-02 07:44:51.591236025 +0000 UTC m=+1302.762095638" lastFinishedPulling="2025-12-02 07:45:28.366269067 +0000 UTC m=+1339.537128680" observedRunningTime="2025-12-02 07:45:29.906250078 +0000 UTC m=+1341.077109711" watchObservedRunningTime="2025-12-02 07:45:29.928065602 +0000 UTC m=+1341.098925205" Dec 02 07:45:30 crc kubenswrapper[4895]: I1202 07:45:30.002957 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b232bc6-67c7-4add-9057-806e74ef162e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "5b232bc6-67c7-4add-9057-806e74ef162e" (UID: "5b232bc6-67c7-4add-9057-806e74ef162e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:45:30 crc kubenswrapper[4895]: I1202 07:45:30.004640 4895 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5b232bc6-67c7-4add-9057-806e74ef162e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 02 07:45:30 crc kubenswrapper[4895]: I1202 07:45:30.024539 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b232bc6-67c7-4add-9057-806e74ef162e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "5b232bc6-67c7-4add-9057-806e74ef162e" (UID: "5b232bc6-67c7-4add-9057-806e74ef162e"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:45:30 crc kubenswrapper[4895]: I1202 07:45:30.033049 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b232bc6-67c7-4add-9057-806e74ef162e-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "5b232bc6-67c7-4add-9057-806e74ef162e" (UID: "5b232bc6-67c7-4add-9057-806e74ef162e"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:45:30 crc kubenswrapper[4895]: I1202 07:45:30.062369 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b232bc6-67c7-4add-9057-806e74ef162e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5b232bc6-67c7-4add-9057-806e74ef162e" (UID: "5b232bc6-67c7-4add-9057-806e74ef162e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:45:30 crc kubenswrapper[4895]: I1202 07:45:30.062480 4895 scope.go:117] "RemoveContainer" containerID="18a2d89aa40cfde56edc4ff7f64e1ab696ea8e939680ba7f979507aea578cc0e" Dec 02 07:45:30 crc kubenswrapper[4895]: I1202 07:45:30.127337 4895 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5b232bc6-67c7-4add-9057-806e74ef162e-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 02 07:45:30 crc kubenswrapper[4895]: I1202 07:45:30.127384 4895 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5b232bc6-67c7-4add-9057-806e74ef162e-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 02 07:45:30 crc kubenswrapper[4895]: I1202 07:45:30.127402 4895 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5b232bc6-67c7-4add-9057-806e74ef162e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 02 07:45:30 crc kubenswrapper[4895]: I1202 07:45:30.153709 4895 scope.go:117] 
"RemoveContainer" containerID="8d4392f2e8bcb7eff491b6e3fcb4e08ec284a1dff98eb11958465bf1f2f1121b" Dec 02 07:45:30 crc kubenswrapper[4895]: E1202 07:45:30.154402 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d4392f2e8bcb7eff491b6e3fcb4e08ec284a1dff98eb11958465bf1f2f1121b\": container with ID starting with 8d4392f2e8bcb7eff491b6e3fcb4e08ec284a1dff98eb11958465bf1f2f1121b not found: ID does not exist" containerID="8d4392f2e8bcb7eff491b6e3fcb4e08ec284a1dff98eb11958465bf1f2f1121b" Dec 02 07:45:30 crc kubenswrapper[4895]: I1202 07:45:30.154459 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d4392f2e8bcb7eff491b6e3fcb4e08ec284a1dff98eb11958465bf1f2f1121b"} err="failed to get container status \"8d4392f2e8bcb7eff491b6e3fcb4e08ec284a1dff98eb11958465bf1f2f1121b\": rpc error: code = NotFound desc = could not find container \"8d4392f2e8bcb7eff491b6e3fcb4e08ec284a1dff98eb11958465bf1f2f1121b\": container with ID starting with 8d4392f2e8bcb7eff491b6e3fcb4e08ec284a1dff98eb11958465bf1f2f1121b not found: ID does not exist" Dec 02 07:45:30 crc kubenswrapper[4895]: I1202 07:45:30.154492 4895 scope.go:117] "RemoveContainer" containerID="18a2d89aa40cfde56edc4ff7f64e1ab696ea8e939680ba7f979507aea578cc0e" Dec 02 07:45:30 crc kubenswrapper[4895]: E1202 07:45:30.155044 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18a2d89aa40cfde56edc4ff7f64e1ab696ea8e939680ba7f979507aea578cc0e\": container with ID starting with 18a2d89aa40cfde56edc4ff7f64e1ab696ea8e939680ba7f979507aea578cc0e not found: ID does not exist" containerID="18a2d89aa40cfde56edc4ff7f64e1ab696ea8e939680ba7f979507aea578cc0e" Dec 02 07:45:30 crc kubenswrapper[4895]: I1202 07:45:30.155066 4895 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"18a2d89aa40cfde56edc4ff7f64e1ab696ea8e939680ba7f979507aea578cc0e"} err="failed to get container status \"18a2d89aa40cfde56edc4ff7f64e1ab696ea8e939680ba7f979507aea578cc0e\": rpc error: code = NotFound desc = could not find container \"18a2d89aa40cfde56edc4ff7f64e1ab696ea8e939680ba7f979507aea578cc0e\": container with ID starting with 18a2d89aa40cfde56edc4ff7f64e1ab696ea8e939680ba7f979507aea578cc0e not found: ID does not exist" Dec 02 07:45:30 crc kubenswrapper[4895]: I1202 07:45:30.291068 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-rbvw7"] Dec 02 07:45:30 crc kubenswrapper[4895]: E1202 07:45:30.309389 4895 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5b232bc6_67c7_4add_9057_806e74ef162e.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5b232bc6_67c7_4add_9057_806e74ef162e.slice/crio-ad473fa35cda87ac095d267204f35bacb11843bafcf3b8adfabe8bdc2145efed\": RecentStats: unable to find data in memory cache]" Dec 02 07:45:30 crc kubenswrapper[4895]: I1202 07:45:30.311535 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-rbvw7"] Dec 02 07:45:30 crc kubenswrapper[4895]: I1202 07:45:30.918360 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-ddf8948cc-h2bbh" event={"ID":"ab5ec753-410a-4d4b-8071-ce60970ba4df","Type":"ContainerStarted","Data":"949ad4d21813d885979595286daba6ad241d3bf3aac10ca8c334398ba63d2324"} Dec 02 07:45:30 crc kubenswrapper[4895]: I1202 07:45:30.919166 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-ddf8948cc-h2bbh" Dec 02 07:45:31 crc kubenswrapper[4895]: I1202 07:45:31.164072 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="5b232bc6-67c7-4add-9057-806e74ef162e" path="/var/lib/kubelet/pods/5b232bc6-67c7-4add-9057-806e74ef162e/volumes" Dec 02 07:45:32 crc kubenswrapper[4895]: I1202 07:45:32.948505 4895 generic.go:334] "Generic (PLEG): container finished" podID="96ece5f3-3dc5-41db-a8e9-37e6f9054dd8" containerID="93b2d419eb18cfab0debf5c9a11d016c6acafb519c1028e43dceebb955ec1a84" exitCode=0 Dec 02 07:45:32 crc kubenswrapper[4895]: I1202 07:45:32.948589 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-xdfqx" event={"ID":"96ece5f3-3dc5-41db-a8e9-37e6f9054dd8","Type":"ContainerDied","Data":"93b2d419eb18cfab0debf5c9a11d016c6acafb519c1028e43dceebb955ec1a84"} Dec 02 07:45:32 crc kubenswrapper[4895]: I1202 07:45:32.968840 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-ddf8948cc-h2bbh" podStartSLOduration=9.968816583 podStartE2EDuration="9.968816583s" podCreationTimestamp="2025-12-02 07:45:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:45:30.944853217 +0000 UTC m=+1342.115712830" watchObservedRunningTime="2025-12-02 07:45:32.968816583 +0000 UTC m=+1344.139676186" Dec 02 07:45:34 crc kubenswrapper[4895]: I1202 07:45:34.979832 4895 generic.go:334] "Generic (PLEG): container finished" podID="56723c9c-15bf-4eaa-896c-ea5d07066b27" containerID="b0e5d2da099fb073b8b5423e92932a90a8ea91c926fc7b91aa4ebeabcd5e1f3b" exitCode=0 Dec 02 07:45:34 crc kubenswrapper[4895]: I1202 07:45:34.979940 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-44vd8" event={"ID":"56723c9c-15bf-4eaa-896c-ea5d07066b27","Type":"ContainerDied","Data":"b0e5d2da099fb073b8b5423e92932a90a8ea91c926fc7b91aa4ebeabcd5e1f3b"} Dec 02 07:45:35 crc kubenswrapper[4895]: I1202 07:45:35.473479 4895 patch_prober.go:28] interesting pod/machine-config-daemon-wfcg7 container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 07:45:35 crc kubenswrapper[4895]: I1202 07:45:35.474097 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 07:45:37 crc kubenswrapper[4895]: I1202 07:45:37.479038 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-xdfqx" Dec 02 07:45:37 crc kubenswrapper[4895]: I1202 07:45:37.529868 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96ece5f3-3dc5-41db-a8e9-37e6f9054dd8-combined-ca-bundle\") pod \"96ece5f3-3dc5-41db-a8e9-37e6f9054dd8\" (UID: \"96ece5f3-3dc5-41db-a8e9-37e6f9054dd8\") " Dec 02 07:45:37 crc kubenswrapper[4895]: I1202 07:45:37.530175 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vf6nh\" (UniqueName: \"kubernetes.io/projected/96ece5f3-3dc5-41db-a8e9-37e6f9054dd8-kube-api-access-vf6nh\") pod \"96ece5f3-3dc5-41db-a8e9-37e6f9054dd8\" (UID: \"96ece5f3-3dc5-41db-a8e9-37e6f9054dd8\") " Dec 02 07:45:37 crc kubenswrapper[4895]: I1202 07:45:37.530247 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/96ece5f3-3dc5-41db-a8e9-37e6f9054dd8-db-sync-config-data\") pod \"96ece5f3-3dc5-41db-a8e9-37e6f9054dd8\" (UID: \"96ece5f3-3dc5-41db-a8e9-37e6f9054dd8\") " Dec 02 07:45:37 crc kubenswrapper[4895]: I1202 07:45:37.539531 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/96ece5f3-3dc5-41db-a8e9-37e6f9054dd8-kube-api-access-vf6nh" (OuterVolumeSpecName: "kube-api-access-vf6nh") pod "96ece5f3-3dc5-41db-a8e9-37e6f9054dd8" (UID: "96ece5f3-3dc5-41db-a8e9-37e6f9054dd8"). InnerVolumeSpecName "kube-api-access-vf6nh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:45:37 crc kubenswrapper[4895]: I1202 07:45:37.539966 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96ece5f3-3dc5-41db-a8e9-37e6f9054dd8-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "96ece5f3-3dc5-41db-a8e9-37e6f9054dd8" (UID: "96ece5f3-3dc5-41db-a8e9-37e6f9054dd8"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:45:37 crc kubenswrapper[4895]: I1202 07:45:37.575056 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96ece5f3-3dc5-41db-a8e9-37e6f9054dd8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "96ece5f3-3dc5-41db-a8e9-37e6f9054dd8" (UID: "96ece5f3-3dc5-41db-a8e9-37e6f9054dd8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:45:37 crc kubenswrapper[4895]: I1202 07:45:37.632955 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96ece5f3-3dc5-41db-a8e9-37e6f9054dd8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 07:45:37 crc kubenswrapper[4895]: I1202 07:45:37.633003 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vf6nh\" (UniqueName: \"kubernetes.io/projected/96ece5f3-3dc5-41db-a8e9-37e6f9054dd8-kube-api-access-vf6nh\") on node \"crc\" DevicePath \"\"" Dec 02 07:45:37 crc kubenswrapper[4895]: I1202 07:45:37.633014 4895 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/96ece5f3-3dc5-41db-a8e9-37e6f9054dd8-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 07:45:38 crc kubenswrapper[4895]: I1202 07:45:38.018528 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-xdfqx" event={"ID":"96ece5f3-3dc5-41db-a8e9-37e6f9054dd8","Type":"ContainerDied","Data":"ca10957317bbfad55352c85d5156932a78c708152777c151f3cbac0595b00e3d"} Dec 02 07:45:38 crc kubenswrapper[4895]: I1202 07:45:38.018579 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ca10957317bbfad55352c85d5156932a78c708152777c151f3cbac0595b00e3d" Dec 02 07:45:38 crc kubenswrapper[4895]: I1202 07:45:38.018634 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-xdfqx" Dec 02 07:45:38 crc kubenswrapper[4895]: I1202 07:45:38.855028 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-b969f4967-hmqp8"] Dec 02 07:45:38 crc kubenswrapper[4895]: E1202 07:45:38.857379 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b232bc6-67c7-4add-9057-806e74ef162e" containerName="dnsmasq-dns" Dec 02 07:45:38 crc kubenswrapper[4895]: I1202 07:45:38.857400 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b232bc6-67c7-4add-9057-806e74ef162e" containerName="dnsmasq-dns" Dec 02 07:45:38 crc kubenswrapper[4895]: E1202 07:45:38.857597 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96ece5f3-3dc5-41db-a8e9-37e6f9054dd8" containerName="barbican-db-sync" Dec 02 07:45:38 crc kubenswrapper[4895]: I1202 07:45:38.857612 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="96ece5f3-3dc5-41db-a8e9-37e6f9054dd8" containerName="barbican-db-sync" Dec 02 07:45:38 crc kubenswrapper[4895]: E1202 07:45:38.857640 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b232bc6-67c7-4add-9057-806e74ef162e" containerName="init" Dec 02 07:45:38 crc kubenswrapper[4895]: I1202 07:45:38.857647 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b232bc6-67c7-4add-9057-806e74ef162e" containerName="init" Dec 02 07:45:38 crc kubenswrapper[4895]: I1202 07:45:38.858343 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b232bc6-67c7-4add-9057-806e74ef162e" containerName="dnsmasq-dns" Dec 02 07:45:38 crc kubenswrapper[4895]: I1202 07:45:38.858370 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="96ece5f3-3dc5-41db-a8e9-37e6f9054dd8" containerName="barbican-db-sync" Dec 02 07:45:38 crc kubenswrapper[4895]: I1202 07:45:38.863030 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-b969f4967-hmqp8" Dec 02 07:45:38 crc kubenswrapper[4895]: I1202 07:45:38.867708 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Dec 02 07:45:38 crc kubenswrapper[4895]: I1202 07:45:38.868345 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-8wlh5" Dec 02 07:45:38 crc kubenswrapper[4895]: I1202 07:45:38.868622 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 02 07:45:38 crc kubenswrapper[4895]: I1202 07:45:38.870852 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b34f139-ac6c-4a24-b478-c4563cce6a2c-config-data\") pod \"barbican-worker-b969f4967-hmqp8\" (UID: \"5b34f139-ac6c-4a24-b478-c4563cce6a2c\") " pod="openstack/barbican-worker-b969f4967-hmqp8" Dec 02 07:45:38 crc kubenswrapper[4895]: I1202 07:45:38.870937 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5b34f139-ac6c-4a24-b478-c4563cce6a2c-logs\") pod \"barbican-worker-b969f4967-hmqp8\" (UID: \"5b34f139-ac6c-4a24-b478-c4563cce6a2c\") " pod="openstack/barbican-worker-b969f4967-hmqp8" Dec 02 07:45:38 crc kubenswrapper[4895]: I1202 07:45:38.870960 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b34f139-ac6c-4a24-b478-c4563cce6a2c-combined-ca-bundle\") pod \"barbican-worker-b969f4967-hmqp8\" (UID: \"5b34f139-ac6c-4a24-b478-c4563cce6a2c\") " pod="openstack/barbican-worker-b969f4967-hmqp8" Dec 02 07:45:38 crc kubenswrapper[4895]: I1202 07:45:38.871008 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" 
(UniqueName: \"kubernetes.io/secret/5b34f139-ac6c-4a24-b478-c4563cce6a2c-config-data-custom\") pod \"barbican-worker-b969f4967-hmqp8\" (UID: \"5b34f139-ac6c-4a24-b478-c4563cce6a2c\") " pod="openstack/barbican-worker-b969f4967-hmqp8" Dec 02 07:45:38 crc kubenswrapper[4895]: I1202 07:45:38.871030 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7g5b\" (UniqueName: \"kubernetes.io/projected/5b34f139-ac6c-4a24-b478-c4563cce6a2c-kube-api-access-n7g5b\") pod \"barbican-worker-b969f4967-hmqp8\" (UID: \"5b34f139-ac6c-4a24-b478-c4563cce6a2c\") " pod="openstack/barbican-worker-b969f4967-hmqp8" Dec 02 07:45:38 crc kubenswrapper[4895]: I1202 07:45:38.890712 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-b969f4967-hmqp8"] Dec 02 07:45:38 crc kubenswrapper[4895]: I1202 07:45:38.946582 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-64685599d6-tgrm9"] Dec 02 07:45:38 crc kubenswrapper[4895]: I1202 07:45:38.948827 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-64685599d6-tgrm9" Dec 02 07:45:38 crc kubenswrapper[4895]: I1202 07:45:38.954503 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Dec 02 07:45:38 crc kubenswrapper[4895]: I1202 07:45:38.974122 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-64685599d6-tgrm9"] Dec 02 07:45:38 crc kubenswrapper[4895]: I1202 07:45:38.975136 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0c6d90b-6e06-4b01-a8d7-5761b6cb5c0f-logs\") pod \"barbican-keystone-listener-64685599d6-tgrm9\" (UID: \"e0c6d90b-6e06-4b01-a8d7-5761b6cb5c0f\") " pod="openstack/barbican-keystone-listener-64685599d6-tgrm9" Dec 02 07:45:38 crc kubenswrapper[4895]: I1202 07:45:38.975204 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b34f139-ac6c-4a24-b478-c4563cce6a2c-config-data\") pod \"barbican-worker-b969f4967-hmqp8\" (UID: \"5b34f139-ac6c-4a24-b478-c4563cce6a2c\") " pod="openstack/barbican-worker-b969f4967-hmqp8" Dec 02 07:45:38 crc kubenswrapper[4895]: I1202 07:45:38.975273 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e0c6d90b-6e06-4b01-a8d7-5761b6cb5c0f-config-data-custom\") pod \"barbican-keystone-listener-64685599d6-tgrm9\" (UID: \"e0c6d90b-6e06-4b01-a8d7-5761b6cb5c0f\") " pod="openstack/barbican-keystone-listener-64685599d6-tgrm9" Dec 02 07:45:38 crc kubenswrapper[4895]: I1202 07:45:38.975302 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5b34f139-ac6c-4a24-b478-c4563cce6a2c-logs\") pod \"barbican-worker-b969f4967-hmqp8\" (UID: 
\"5b34f139-ac6c-4a24-b478-c4563cce6a2c\") " pod="openstack/barbican-worker-b969f4967-hmqp8" Dec 02 07:45:38 crc kubenswrapper[4895]: I1202 07:45:38.975330 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b34f139-ac6c-4a24-b478-c4563cce6a2c-combined-ca-bundle\") pod \"barbican-worker-b969f4967-hmqp8\" (UID: \"5b34f139-ac6c-4a24-b478-c4563cce6a2c\") " pod="openstack/barbican-worker-b969f4967-hmqp8" Dec 02 07:45:38 crc kubenswrapper[4895]: I1202 07:45:38.975369 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjbgq\" (UniqueName: \"kubernetes.io/projected/e0c6d90b-6e06-4b01-a8d7-5761b6cb5c0f-kube-api-access-vjbgq\") pod \"barbican-keystone-listener-64685599d6-tgrm9\" (UID: \"e0c6d90b-6e06-4b01-a8d7-5761b6cb5c0f\") " pod="openstack/barbican-keystone-listener-64685599d6-tgrm9" Dec 02 07:45:38 crc kubenswrapper[4895]: I1202 07:45:38.975396 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0c6d90b-6e06-4b01-a8d7-5761b6cb5c0f-config-data\") pod \"barbican-keystone-listener-64685599d6-tgrm9\" (UID: \"e0c6d90b-6e06-4b01-a8d7-5761b6cb5c0f\") " pod="openstack/barbican-keystone-listener-64685599d6-tgrm9" Dec 02 07:45:38 crc kubenswrapper[4895]: I1202 07:45:38.975417 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5b34f139-ac6c-4a24-b478-c4563cce6a2c-config-data-custom\") pod \"barbican-worker-b969f4967-hmqp8\" (UID: \"5b34f139-ac6c-4a24-b478-c4563cce6a2c\") " pod="openstack/barbican-worker-b969f4967-hmqp8" Dec 02 07:45:38 crc kubenswrapper[4895]: I1202 07:45:38.975435 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e0c6d90b-6e06-4b01-a8d7-5761b6cb5c0f-combined-ca-bundle\") pod \"barbican-keystone-listener-64685599d6-tgrm9\" (UID: \"e0c6d90b-6e06-4b01-a8d7-5761b6cb5c0f\") " pod="openstack/barbican-keystone-listener-64685599d6-tgrm9" Dec 02 07:45:38 crc kubenswrapper[4895]: I1202 07:45:38.975457 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7g5b\" (UniqueName: \"kubernetes.io/projected/5b34f139-ac6c-4a24-b478-c4563cce6a2c-kube-api-access-n7g5b\") pod \"barbican-worker-b969f4967-hmqp8\" (UID: \"5b34f139-ac6c-4a24-b478-c4563cce6a2c\") " pod="openstack/barbican-worker-b969f4967-hmqp8" Dec 02 07:45:38 crc kubenswrapper[4895]: I1202 07:45:38.976834 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5b34f139-ac6c-4a24-b478-c4563cce6a2c-logs\") pod \"barbican-worker-b969f4967-hmqp8\" (UID: \"5b34f139-ac6c-4a24-b478-c4563cce6a2c\") " pod="openstack/barbican-worker-b969f4967-hmqp8" Dec 02 07:45:38 crc kubenswrapper[4895]: I1202 07:45:38.983515 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b34f139-ac6c-4a24-b478-c4563cce6a2c-combined-ca-bundle\") pod \"barbican-worker-b969f4967-hmqp8\" (UID: \"5b34f139-ac6c-4a24-b478-c4563cce6a2c\") " pod="openstack/barbican-worker-b969f4967-hmqp8" Dec 02 07:45:39 crc kubenswrapper[4895]: I1202 07:45:39.001436 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b34f139-ac6c-4a24-b478-c4563cce6a2c-config-data\") pod \"barbican-worker-b969f4967-hmqp8\" (UID: \"5b34f139-ac6c-4a24-b478-c4563cce6a2c\") " pod="openstack/barbican-worker-b969f4967-hmqp8" Dec 02 07:45:39 crc kubenswrapper[4895]: I1202 07:45:39.018907 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7g5b\" (UniqueName: 
\"kubernetes.io/projected/5b34f139-ac6c-4a24-b478-c4563cce6a2c-kube-api-access-n7g5b\") pod \"barbican-worker-b969f4967-hmqp8\" (UID: \"5b34f139-ac6c-4a24-b478-c4563cce6a2c\") " pod="openstack/barbican-worker-b969f4967-hmqp8" Dec 02 07:45:39 crc kubenswrapper[4895]: I1202 07:45:39.019146 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-nmt5c"] Dec 02 07:45:39 crc kubenswrapper[4895]: I1202 07:45:39.021216 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-nmt5c" Dec 02 07:45:39 crc kubenswrapper[4895]: I1202 07:45:39.022098 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5b34f139-ac6c-4a24-b478-c4563cce6a2c-config-data-custom\") pod \"barbican-worker-b969f4967-hmqp8\" (UID: \"5b34f139-ac6c-4a24-b478-c4563cce6a2c\") " pod="openstack/barbican-worker-b969f4967-hmqp8" Dec 02 07:45:39 crc kubenswrapper[4895]: I1202 07:45:39.039452 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-nmt5c"] Dec 02 07:45:39 crc kubenswrapper[4895]: I1202 07:45:39.077032 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0c6d90b-6e06-4b01-a8d7-5761b6cb5c0f-logs\") pod \"barbican-keystone-listener-64685599d6-tgrm9\" (UID: \"e0c6d90b-6e06-4b01-a8d7-5761b6cb5c0f\") " pod="openstack/barbican-keystone-listener-64685599d6-tgrm9" Dec 02 07:45:39 crc kubenswrapper[4895]: I1202 07:45:39.077163 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e0c6d90b-6e06-4b01-a8d7-5761b6cb5c0f-config-data-custom\") pod \"barbican-keystone-listener-64685599d6-tgrm9\" (UID: \"e0c6d90b-6e06-4b01-a8d7-5761b6cb5c0f\") " pod="openstack/barbican-keystone-listener-64685599d6-tgrm9" Dec 02 07:45:39 crc kubenswrapper[4895]: I1202 
07:45:39.077216 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjbgq\" (UniqueName: \"kubernetes.io/projected/e0c6d90b-6e06-4b01-a8d7-5761b6cb5c0f-kube-api-access-vjbgq\") pod \"barbican-keystone-listener-64685599d6-tgrm9\" (UID: \"e0c6d90b-6e06-4b01-a8d7-5761b6cb5c0f\") " pod="openstack/barbican-keystone-listener-64685599d6-tgrm9" Dec 02 07:45:39 crc kubenswrapper[4895]: I1202 07:45:39.077244 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0c6d90b-6e06-4b01-a8d7-5761b6cb5c0f-config-data\") pod \"barbican-keystone-listener-64685599d6-tgrm9\" (UID: \"e0c6d90b-6e06-4b01-a8d7-5761b6cb5c0f\") " pod="openstack/barbican-keystone-listener-64685599d6-tgrm9" Dec 02 07:45:39 crc kubenswrapper[4895]: I1202 07:45:39.077265 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0c6d90b-6e06-4b01-a8d7-5761b6cb5c0f-combined-ca-bundle\") pod \"barbican-keystone-listener-64685599d6-tgrm9\" (UID: \"e0c6d90b-6e06-4b01-a8d7-5761b6cb5c0f\") " pod="openstack/barbican-keystone-listener-64685599d6-tgrm9" Dec 02 07:45:39 crc kubenswrapper[4895]: I1202 07:45:39.082417 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0c6d90b-6e06-4b01-a8d7-5761b6cb5c0f-combined-ca-bundle\") pod \"barbican-keystone-listener-64685599d6-tgrm9\" (UID: \"e0c6d90b-6e06-4b01-a8d7-5761b6cb5c0f\") " pod="openstack/barbican-keystone-listener-64685599d6-tgrm9" Dec 02 07:45:39 crc kubenswrapper[4895]: I1202 07:45:39.082716 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0c6d90b-6e06-4b01-a8d7-5761b6cb5c0f-logs\") pod \"barbican-keystone-listener-64685599d6-tgrm9\" (UID: \"e0c6d90b-6e06-4b01-a8d7-5761b6cb5c0f\") " 
pod="openstack/barbican-keystone-listener-64685599d6-tgrm9" Dec 02 07:45:39 crc kubenswrapper[4895]: I1202 07:45:39.113089 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0c6d90b-6e06-4b01-a8d7-5761b6cb5c0f-config-data\") pod \"barbican-keystone-listener-64685599d6-tgrm9\" (UID: \"e0c6d90b-6e06-4b01-a8d7-5761b6cb5c0f\") " pod="openstack/barbican-keystone-listener-64685599d6-tgrm9" Dec 02 07:45:39 crc kubenswrapper[4895]: I1202 07:45:39.122550 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e0c6d90b-6e06-4b01-a8d7-5761b6cb5c0f-config-data-custom\") pod \"barbican-keystone-listener-64685599d6-tgrm9\" (UID: \"e0c6d90b-6e06-4b01-a8d7-5761b6cb5c0f\") " pod="openstack/barbican-keystone-listener-64685599d6-tgrm9" Dec 02 07:45:39 crc kubenswrapper[4895]: I1202 07:45:39.180116 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/90f80179-e9a8-493a-9a05-aec3d0b4be72-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-nmt5c\" (UID: \"90f80179-e9a8-493a-9a05-aec3d0b4be72\") " pod="openstack/dnsmasq-dns-85ff748b95-nmt5c" Dec 02 07:45:39 crc kubenswrapper[4895]: I1202 07:45:39.180167 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90f80179-e9a8-493a-9a05-aec3d0b4be72-config\") pod \"dnsmasq-dns-85ff748b95-nmt5c\" (UID: \"90f80179-e9a8-493a-9a05-aec3d0b4be72\") " pod="openstack/dnsmasq-dns-85ff748b95-nmt5c" Dec 02 07:45:39 crc kubenswrapper[4895]: I1202 07:45:39.180277 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfhsw\" (UniqueName: \"kubernetes.io/projected/90f80179-e9a8-493a-9a05-aec3d0b4be72-kube-api-access-dfhsw\") pod 
\"dnsmasq-dns-85ff748b95-nmt5c\" (UID: \"90f80179-e9a8-493a-9a05-aec3d0b4be72\") " pod="openstack/dnsmasq-dns-85ff748b95-nmt5c" Dec 02 07:45:39 crc kubenswrapper[4895]: I1202 07:45:39.180322 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/90f80179-e9a8-493a-9a05-aec3d0b4be72-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-nmt5c\" (UID: \"90f80179-e9a8-493a-9a05-aec3d0b4be72\") " pod="openstack/dnsmasq-dns-85ff748b95-nmt5c" Dec 02 07:45:39 crc kubenswrapper[4895]: I1202 07:45:39.180340 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/90f80179-e9a8-493a-9a05-aec3d0b4be72-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-nmt5c\" (UID: \"90f80179-e9a8-493a-9a05-aec3d0b4be72\") " pod="openstack/dnsmasq-dns-85ff748b95-nmt5c" Dec 02 07:45:39 crc kubenswrapper[4895]: I1202 07:45:39.180361 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/90f80179-e9a8-493a-9a05-aec3d0b4be72-dns-svc\") pod \"dnsmasq-dns-85ff748b95-nmt5c\" (UID: \"90f80179-e9a8-493a-9a05-aec3d0b4be72\") " pod="openstack/dnsmasq-dns-85ff748b95-nmt5c" Dec 02 07:45:39 crc kubenswrapper[4895]: I1202 07:45:39.181171 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjbgq\" (UniqueName: \"kubernetes.io/projected/e0c6d90b-6e06-4b01-a8d7-5761b6cb5c0f-kube-api-access-vjbgq\") pod \"barbican-keystone-listener-64685599d6-tgrm9\" (UID: \"e0c6d90b-6e06-4b01-a8d7-5761b6cb5c0f\") " pod="openstack/barbican-keystone-listener-64685599d6-tgrm9" Dec 02 07:45:39 crc kubenswrapper[4895]: I1202 07:45:39.195513 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-b969f4967-hmqp8" Dec 02 07:45:39 crc kubenswrapper[4895]: I1202 07:45:39.198484 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-89bf75f54-8mw6n"] Dec 02 07:45:39 crc kubenswrapper[4895]: I1202 07:45:39.200107 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-89bf75f54-8mw6n" Dec 02 07:45:39 crc kubenswrapper[4895]: I1202 07:45:39.206173 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Dec 02 07:45:39 crc kubenswrapper[4895]: I1202 07:45:39.213363 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-89bf75f54-8mw6n"] Dec 02 07:45:39 crc kubenswrapper[4895]: I1202 07:45:39.274385 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-64685599d6-tgrm9" Dec 02 07:45:39 crc kubenswrapper[4895]: I1202 07:45:39.281785 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1eba7853-b279-4568-b36b-09a1c9edb7b6-combined-ca-bundle\") pod \"barbican-api-89bf75f54-8mw6n\" (UID: \"1eba7853-b279-4568-b36b-09a1c9edb7b6\") " pod="openstack/barbican-api-89bf75f54-8mw6n" Dec 02 07:45:39 crc kubenswrapper[4895]: I1202 07:45:39.281857 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dfhsw\" (UniqueName: \"kubernetes.io/projected/90f80179-e9a8-493a-9a05-aec3d0b4be72-kube-api-access-dfhsw\") pod \"dnsmasq-dns-85ff748b95-nmt5c\" (UID: \"90f80179-e9a8-493a-9a05-aec3d0b4be72\") " pod="openstack/dnsmasq-dns-85ff748b95-nmt5c" Dec 02 07:45:39 crc kubenswrapper[4895]: I1202 07:45:39.281883 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrwmn\" (UniqueName: 
\"kubernetes.io/projected/1eba7853-b279-4568-b36b-09a1c9edb7b6-kube-api-access-hrwmn\") pod \"barbican-api-89bf75f54-8mw6n\" (UID: \"1eba7853-b279-4568-b36b-09a1c9edb7b6\") " pod="openstack/barbican-api-89bf75f54-8mw6n" Dec 02 07:45:39 crc kubenswrapper[4895]: I1202 07:45:39.281935 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1eba7853-b279-4568-b36b-09a1c9edb7b6-logs\") pod \"barbican-api-89bf75f54-8mw6n\" (UID: \"1eba7853-b279-4568-b36b-09a1c9edb7b6\") " pod="openstack/barbican-api-89bf75f54-8mw6n" Dec 02 07:45:39 crc kubenswrapper[4895]: I1202 07:45:39.281998 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/90f80179-e9a8-493a-9a05-aec3d0b4be72-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-nmt5c\" (UID: \"90f80179-e9a8-493a-9a05-aec3d0b4be72\") " pod="openstack/dnsmasq-dns-85ff748b95-nmt5c" Dec 02 07:45:39 crc kubenswrapper[4895]: I1202 07:45:39.282026 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/90f80179-e9a8-493a-9a05-aec3d0b4be72-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-nmt5c\" (UID: \"90f80179-e9a8-493a-9a05-aec3d0b4be72\") " pod="openstack/dnsmasq-dns-85ff748b95-nmt5c" Dec 02 07:45:39 crc kubenswrapper[4895]: I1202 07:45:39.282044 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/90f80179-e9a8-493a-9a05-aec3d0b4be72-dns-svc\") pod \"dnsmasq-dns-85ff748b95-nmt5c\" (UID: \"90f80179-e9a8-493a-9a05-aec3d0b4be72\") " pod="openstack/dnsmasq-dns-85ff748b95-nmt5c" Dec 02 07:45:39 crc kubenswrapper[4895]: I1202 07:45:39.282100 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/1eba7853-b279-4568-b36b-09a1c9edb7b6-config-data\") pod \"barbican-api-89bf75f54-8mw6n\" (UID: \"1eba7853-b279-4568-b36b-09a1c9edb7b6\") " pod="openstack/barbican-api-89bf75f54-8mw6n" Dec 02 07:45:39 crc kubenswrapper[4895]: I1202 07:45:39.282141 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/90f80179-e9a8-493a-9a05-aec3d0b4be72-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-nmt5c\" (UID: \"90f80179-e9a8-493a-9a05-aec3d0b4be72\") " pod="openstack/dnsmasq-dns-85ff748b95-nmt5c" Dec 02 07:45:39 crc kubenswrapper[4895]: I1202 07:45:39.282184 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90f80179-e9a8-493a-9a05-aec3d0b4be72-config\") pod \"dnsmasq-dns-85ff748b95-nmt5c\" (UID: \"90f80179-e9a8-493a-9a05-aec3d0b4be72\") " pod="openstack/dnsmasq-dns-85ff748b95-nmt5c" Dec 02 07:45:39 crc kubenswrapper[4895]: I1202 07:45:39.282217 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1eba7853-b279-4568-b36b-09a1c9edb7b6-config-data-custom\") pod \"barbican-api-89bf75f54-8mw6n\" (UID: \"1eba7853-b279-4568-b36b-09a1c9edb7b6\") " pod="openstack/barbican-api-89bf75f54-8mw6n" Dec 02 07:45:39 crc kubenswrapper[4895]: I1202 07:45:39.291078 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/90f80179-e9a8-493a-9a05-aec3d0b4be72-dns-svc\") pod \"dnsmasq-dns-85ff748b95-nmt5c\" (UID: \"90f80179-e9a8-493a-9a05-aec3d0b4be72\") " pod="openstack/dnsmasq-dns-85ff748b95-nmt5c" Dec 02 07:45:39 crc kubenswrapper[4895]: I1202 07:45:39.291208 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/90f80179-e9a8-493a-9a05-aec3d0b4be72-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-nmt5c\" (UID: \"90f80179-e9a8-493a-9a05-aec3d0b4be72\") " pod="openstack/dnsmasq-dns-85ff748b95-nmt5c" Dec 02 07:45:39 crc kubenswrapper[4895]: I1202 07:45:39.292492 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/90f80179-e9a8-493a-9a05-aec3d0b4be72-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-nmt5c\" (UID: \"90f80179-e9a8-493a-9a05-aec3d0b4be72\") " pod="openstack/dnsmasq-dns-85ff748b95-nmt5c" Dec 02 07:45:39 crc kubenswrapper[4895]: I1202 07:45:39.293057 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90f80179-e9a8-493a-9a05-aec3d0b4be72-config\") pod \"dnsmasq-dns-85ff748b95-nmt5c\" (UID: \"90f80179-e9a8-493a-9a05-aec3d0b4be72\") " pod="openstack/dnsmasq-dns-85ff748b95-nmt5c" Dec 02 07:45:39 crc kubenswrapper[4895]: I1202 07:45:39.296963 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/90f80179-e9a8-493a-9a05-aec3d0b4be72-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-nmt5c\" (UID: \"90f80179-e9a8-493a-9a05-aec3d0b4be72\") " pod="openstack/dnsmasq-dns-85ff748b95-nmt5c" Dec 02 07:45:39 crc kubenswrapper[4895]: I1202 07:45:39.320095 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfhsw\" (UniqueName: \"kubernetes.io/projected/90f80179-e9a8-493a-9a05-aec3d0b4be72-kube-api-access-dfhsw\") pod \"dnsmasq-dns-85ff748b95-nmt5c\" (UID: \"90f80179-e9a8-493a-9a05-aec3d0b4be72\") " pod="openstack/dnsmasq-dns-85ff748b95-nmt5c" Dec 02 07:45:39 crc kubenswrapper[4895]: I1202 07:45:39.385907 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrwmn\" (UniqueName: 
\"kubernetes.io/projected/1eba7853-b279-4568-b36b-09a1c9edb7b6-kube-api-access-hrwmn\") pod \"barbican-api-89bf75f54-8mw6n\" (UID: \"1eba7853-b279-4568-b36b-09a1c9edb7b6\") " pod="openstack/barbican-api-89bf75f54-8mw6n" Dec 02 07:45:39 crc kubenswrapper[4895]: I1202 07:45:39.385975 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1eba7853-b279-4568-b36b-09a1c9edb7b6-logs\") pod \"barbican-api-89bf75f54-8mw6n\" (UID: \"1eba7853-b279-4568-b36b-09a1c9edb7b6\") " pod="openstack/barbican-api-89bf75f54-8mw6n" Dec 02 07:45:39 crc kubenswrapper[4895]: I1202 07:45:39.386061 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1eba7853-b279-4568-b36b-09a1c9edb7b6-config-data\") pod \"barbican-api-89bf75f54-8mw6n\" (UID: \"1eba7853-b279-4568-b36b-09a1c9edb7b6\") " pod="openstack/barbican-api-89bf75f54-8mw6n" Dec 02 07:45:39 crc kubenswrapper[4895]: I1202 07:45:39.386134 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1eba7853-b279-4568-b36b-09a1c9edb7b6-config-data-custom\") pod \"barbican-api-89bf75f54-8mw6n\" (UID: \"1eba7853-b279-4568-b36b-09a1c9edb7b6\") " pod="openstack/barbican-api-89bf75f54-8mw6n" Dec 02 07:45:39 crc kubenswrapper[4895]: I1202 07:45:39.386197 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1eba7853-b279-4568-b36b-09a1c9edb7b6-combined-ca-bundle\") pod \"barbican-api-89bf75f54-8mw6n\" (UID: \"1eba7853-b279-4568-b36b-09a1c9edb7b6\") " pod="openstack/barbican-api-89bf75f54-8mw6n" Dec 02 07:45:39 crc kubenswrapper[4895]: I1202 07:45:39.390977 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/1eba7853-b279-4568-b36b-09a1c9edb7b6-combined-ca-bundle\") pod \"barbican-api-89bf75f54-8mw6n\" (UID: \"1eba7853-b279-4568-b36b-09a1c9edb7b6\") " pod="openstack/barbican-api-89bf75f54-8mw6n" Dec 02 07:45:39 crc kubenswrapper[4895]: I1202 07:45:39.397432 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1eba7853-b279-4568-b36b-09a1c9edb7b6-logs\") pod \"barbican-api-89bf75f54-8mw6n\" (UID: \"1eba7853-b279-4568-b36b-09a1c9edb7b6\") " pod="openstack/barbican-api-89bf75f54-8mw6n" Dec 02 07:45:39 crc kubenswrapper[4895]: I1202 07:45:39.405007 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1eba7853-b279-4568-b36b-09a1c9edb7b6-config-data-custom\") pod \"barbican-api-89bf75f54-8mw6n\" (UID: \"1eba7853-b279-4568-b36b-09a1c9edb7b6\") " pod="openstack/barbican-api-89bf75f54-8mw6n" Dec 02 07:45:39 crc kubenswrapper[4895]: I1202 07:45:39.408071 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1eba7853-b279-4568-b36b-09a1c9edb7b6-config-data\") pod \"barbican-api-89bf75f54-8mw6n\" (UID: \"1eba7853-b279-4568-b36b-09a1c9edb7b6\") " pod="openstack/barbican-api-89bf75f54-8mw6n" Dec 02 07:45:39 crc kubenswrapper[4895]: I1202 07:45:39.433617 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrwmn\" (UniqueName: \"kubernetes.io/projected/1eba7853-b279-4568-b36b-09a1c9edb7b6-kube-api-access-hrwmn\") pod \"barbican-api-89bf75f54-8mw6n\" (UID: \"1eba7853-b279-4568-b36b-09a1c9edb7b6\") " pod="openstack/barbican-api-89bf75f54-8mw6n" Dec 02 07:45:39 crc kubenswrapper[4895]: I1202 07:45:39.558094 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-nmt5c" Dec 02 07:45:39 crc kubenswrapper[4895]: I1202 07:45:39.581443 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-89bf75f54-8mw6n" Dec 02 07:45:40 crc kubenswrapper[4895]: I1202 07:45:40.059719 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-44vd8" event={"ID":"56723c9c-15bf-4eaa-896c-ea5d07066b27","Type":"ContainerDied","Data":"c581b50e91e8a4d4727c372eb491f2ecedfe4ca397486c2e558627ca7d830318"} Dec 02 07:45:40 crc kubenswrapper[4895]: I1202 07:45:40.060403 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c581b50e91e8a4d4727c372eb491f2ecedfe4ca397486c2e558627ca7d830318" Dec 02 07:45:40 crc kubenswrapper[4895]: I1202 07:45:40.122156 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-44vd8" Dec 02 07:45:40 crc kubenswrapper[4895]: I1202 07:45:40.203572 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/56723c9c-15bf-4eaa-896c-ea5d07066b27-db-sync-config-data\") pod \"56723c9c-15bf-4eaa-896c-ea5d07066b27\" (UID: \"56723c9c-15bf-4eaa-896c-ea5d07066b27\") " Dec 02 07:45:40 crc kubenswrapper[4895]: I1202 07:45:40.203628 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56723c9c-15bf-4eaa-896c-ea5d07066b27-config-data\") pod \"56723c9c-15bf-4eaa-896c-ea5d07066b27\" (UID: \"56723c9c-15bf-4eaa-896c-ea5d07066b27\") " Dec 02 07:45:40 crc kubenswrapper[4895]: I1202 07:45:40.203704 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/56723c9c-15bf-4eaa-896c-ea5d07066b27-etc-machine-id\") pod \"56723c9c-15bf-4eaa-896c-ea5d07066b27\" (UID: 
\"56723c9c-15bf-4eaa-896c-ea5d07066b27\") " Dec 02 07:45:40 crc kubenswrapper[4895]: I1202 07:45:40.204007 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56723c9c-15bf-4eaa-896c-ea5d07066b27-combined-ca-bundle\") pod \"56723c9c-15bf-4eaa-896c-ea5d07066b27\" (UID: \"56723c9c-15bf-4eaa-896c-ea5d07066b27\") " Dec 02 07:45:40 crc kubenswrapper[4895]: I1202 07:45:40.204056 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-24hg5\" (UniqueName: \"kubernetes.io/projected/56723c9c-15bf-4eaa-896c-ea5d07066b27-kube-api-access-24hg5\") pod \"56723c9c-15bf-4eaa-896c-ea5d07066b27\" (UID: \"56723c9c-15bf-4eaa-896c-ea5d07066b27\") " Dec 02 07:45:40 crc kubenswrapper[4895]: I1202 07:45:40.204137 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/56723c9c-15bf-4eaa-896c-ea5d07066b27-scripts\") pod \"56723c9c-15bf-4eaa-896c-ea5d07066b27\" (UID: \"56723c9c-15bf-4eaa-896c-ea5d07066b27\") " Dec 02 07:45:40 crc kubenswrapper[4895]: I1202 07:45:40.206382 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/56723c9c-15bf-4eaa-896c-ea5d07066b27-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "56723c9c-15bf-4eaa-896c-ea5d07066b27" (UID: "56723c9c-15bf-4eaa-896c-ea5d07066b27"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 07:45:40 crc kubenswrapper[4895]: I1202 07:45:40.208143 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56723c9c-15bf-4eaa-896c-ea5d07066b27-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "56723c9c-15bf-4eaa-896c-ea5d07066b27" (UID: "56723c9c-15bf-4eaa-896c-ea5d07066b27"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:45:40 crc kubenswrapper[4895]: I1202 07:45:40.216422 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56723c9c-15bf-4eaa-896c-ea5d07066b27-scripts" (OuterVolumeSpecName: "scripts") pod "56723c9c-15bf-4eaa-896c-ea5d07066b27" (UID: "56723c9c-15bf-4eaa-896c-ea5d07066b27"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:45:40 crc kubenswrapper[4895]: I1202 07:45:40.216976 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56723c9c-15bf-4eaa-896c-ea5d07066b27-kube-api-access-24hg5" (OuterVolumeSpecName: "kube-api-access-24hg5") pod "56723c9c-15bf-4eaa-896c-ea5d07066b27" (UID: "56723c9c-15bf-4eaa-896c-ea5d07066b27"). InnerVolumeSpecName "kube-api-access-24hg5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:45:40 crc kubenswrapper[4895]: I1202 07:45:40.270141 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56723c9c-15bf-4eaa-896c-ea5d07066b27-config-data" (OuterVolumeSpecName: "config-data") pod "56723c9c-15bf-4eaa-896c-ea5d07066b27" (UID: "56723c9c-15bf-4eaa-896c-ea5d07066b27"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:45:40 crc kubenswrapper[4895]: I1202 07:45:40.273854 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56723c9c-15bf-4eaa-896c-ea5d07066b27-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "56723c9c-15bf-4eaa-896c-ea5d07066b27" (UID: "56723c9c-15bf-4eaa-896c-ea5d07066b27"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:45:40 crc kubenswrapper[4895]: I1202 07:45:40.307770 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-24hg5\" (UniqueName: \"kubernetes.io/projected/56723c9c-15bf-4eaa-896c-ea5d07066b27-kube-api-access-24hg5\") on node \"crc\" DevicePath \"\"" Dec 02 07:45:40 crc kubenswrapper[4895]: I1202 07:45:40.307805 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56723c9c-15bf-4eaa-896c-ea5d07066b27-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 07:45:40 crc kubenswrapper[4895]: I1202 07:45:40.307816 4895 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/56723c9c-15bf-4eaa-896c-ea5d07066b27-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 07:45:40 crc kubenswrapper[4895]: I1202 07:45:40.307826 4895 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/56723c9c-15bf-4eaa-896c-ea5d07066b27-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 07:45:40 crc kubenswrapper[4895]: I1202 07:45:40.307845 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56723c9c-15bf-4eaa-896c-ea5d07066b27-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 07:45:40 crc kubenswrapper[4895]: I1202 07:45:40.307854 4895 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/56723c9c-15bf-4eaa-896c-ea5d07066b27-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 02 07:45:40 crc kubenswrapper[4895]: I1202 07:45:40.707398 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-b969f4967-hmqp8"] Dec 02 07:45:40 crc kubenswrapper[4895]: I1202 07:45:40.809377 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-89bf75f54-8mw6n"] Dec 02 
07:45:40 crc kubenswrapper[4895]: I1202 07:45:40.832913 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-64685599d6-tgrm9"] Dec 02 07:45:40 crc kubenswrapper[4895]: I1202 07:45:40.941003 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-nmt5c"] Dec 02 07:45:40 crc kubenswrapper[4895]: W1202 07:45:40.994071 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod90f80179_e9a8_493a_9a05_aec3d0b4be72.slice/crio-e9479ba5e1f7ac6852665f42cf9d0b6a9139e432e944137e9f0f331cec6623d3 WatchSource:0}: Error finding container e9479ba5e1f7ac6852665f42cf9d0b6a9139e432e944137e9f0f331cec6623d3: Status 404 returned error can't find the container with id e9479ba5e1f7ac6852665f42cf9d0b6a9139e432e944137e9f0f331cec6623d3 Dec 02 07:45:41 crc kubenswrapper[4895]: I1202 07:45:41.073883 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-89bf75f54-8mw6n" event={"ID":"1eba7853-b279-4568-b36b-09a1c9edb7b6","Type":"ContainerStarted","Data":"5cdbed925ded924f27c17fb3cc3435bd96fdaf4a3cb9e629d8cdf40911775a0e"} Dec 02 07:45:41 crc kubenswrapper[4895]: I1202 07:45:41.078760 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-nmt5c" event={"ID":"90f80179-e9a8-493a-9a05-aec3d0b4be72","Type":"ContainerStarted","Data":"e9479ba5e1f7ac6852665f42cf9d0b6a9139e432e944137e9f0f331cec6623d3"} Dec 02 07:45:41 crc kubenswrapper[4895]: I1202 07:45:41.080308 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-64685599d6-tgrm9" event={"ID":"e0c6d90b-6e06-4b01-a8d7-5761b6cb5c0f","Type":"ContainerStarted","Data":"1206a45e4999e55a0b5d421860da4163e9db962fe077072543287e4b6ba17c1f"} Dec 02 07:45:41 crc kubenswrapper[4895]: I1202 07:45:41.082394 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"28a2d92c-8fd5-43a8-813c-7b1c49264fcd","Type":"ContainerStarted","Data":"65e1edc58fc552f0456df5454e008ed05f70d97b624af09ecff7c585dfaa6423"} Dec 02 07:45:41 crc kubenswrapper[4895]: I1202 07:45:41.082599 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="28a2d92c-8fd5-43a8-813c-7b1c49264fcd" containerName="ceilometer-central-agent" containerID="cri-o://8bfb9fa755b1d3e73df5a493d8f23979f6da47a21ed4d8ae8683e54eed3de8a6" gracePeriod=30 Dec 02 07:45:41 crc kubenswrapper[4895]: I1202 07:45:41.082721 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 02 07:45:41 crc kubenswrapper[4895]: I1202 07:45:41.083267 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="28a2d92c-8fd5-43a8-813c-7b1c49264fcd" containerName="proxy-httpd" containerID="cri-o://65e1edc58fc552f0456df5454e008ed05f70d97b624af09ecff7c585dfaa6423" gracePeriod=30 Dec 02 07:45:41 crc kubenswrapper[4895]: I1202 07:45:41.083337 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="28a2d92c-8fd5-43a8-813c-7b1c49264fcd" containerName="sg-core" containerID="cri-o://b4214addbfaabc09997052827a3fdefb258c86367a4da84d2c8edacf38b19b42" gracePeriod=30 Dec 02 07:45:41 crc kubenswrapper[4895]: I1202 07:45:41.083400 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="28a2d92c-8fd5-43a8-813c-7b1c49264fcd" containerName="ceilometer-notification-agent" containerID="cri-o://ef44fd0d43646d18ce0e1b8c6bdb73245931a867ce992bd072016aac43c5116a" gracePeriod=30 Dec 02 07:45:41 crc kubenswrapper[4895]: I1202 07:45:41.091401 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-44vd8" Dec 02 07:45:41 crc kubenswrapper[4895]: I1202 07:45:41.092337 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-b969f4967-hmqp8" event={"ID":"5b34f139-ac6c-4a24-b478-c4563cce6a2c","Type":"ContainerStarted","Data":"bcfc57872432b827c048052a8d3a082a203e58df3d47f686b28c5e697df59acc"} Dec 02 07:45:41 crc kubenswrapper[4895]: I1202 07:45:41.123030 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.689617109 podStartE2EDuration="51.122978897s" podCreationTimestamp="2025-12-02 07:44:50 +0000 UTC" firstStartedPulling="2025-12-02 07:44:51.763006993 +0000 UTC m=+1302.933866606" lastFinishedPulling="2025-12-02 07:45:40.196368781 +0000 UTC m=+1351.367228394" observedRunningTime="2025-12-02 07:45:41.11242534 +0000 UTC m=+1352.283284983" watchObservedRunningTime="2025-12-02 07:45:41.122978897 +0000 UTC m=+1352.293838510" Dec 02 07:45:41 crc kubenswrapper[4895]: I1202 07:45:41.410709 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 02 07:45:41 crc kubenswrapper[4895]: E1202 07:45:41.411815 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56723c9c-15bf-4eaa-896c-ea5d07066b27" containerName="cinder-db-sync" Dec 02 07:45:41 crc kubenswrapper[4895]: I1202 07:45:41.411842 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="56723c9c-15bf-4eaa-896c-ea5d07066b27" containerName="cinder-db-sync" Dec 02 07:45:41 crc kubenswrapper[4895]: I1202 07:45:41.419046 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="56723c9c-15bf-4eaa-896c-ea5d07066b27" containerName="cinder-db-sync" Dec 02 07:45:41 crc kubenswrapper[4895]: I1202 07:45:41.422452 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 02 07:45:41 crc kubenswrapper[4895]: I1202 07:45:41.429042 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 02 07:45:41 crc kubenswrapper[4895]: I1202 07:45:41.429333 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-vbt2f" Dec 02 07:45:41 crc kubenswrapper[4895]: I1202 07:45:41.434123 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 02 07:45:41 crc kubenswrapper[4895]: I1202 07:45:41.434124 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 02 07:45:41 crc kubenswrapper[4895]: I1202 07:45:41.452709 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 02 07:45:41 crc kubenswrapper[4895]: I1202 07:45:41.521367 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-nmt5c"] Dec 02 07:45:41 crc kubenswrapper[4895]: I1202 07:45:41.552152 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/addf06e4-2353-4d9e-8c8a-8f8643c3b4a5-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"addf06e4-2353-4d9e-8c8a-8f8643c3b4a5\") " pod="openstack/cinder-scheduler-0" Dec 02 07:45:41 crc kubenswrapper[4895]: I1202 07:45:41.552224 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hjgc\" (UniqueName: \"kubernetes.io/projected/addf06e4-2353-4d9e-8c8a-8f8643c3b4a5-kube-api-access-9hjgc\") pod \"cinder-scheduler-0\" (UID: \"addf06e4-2353-4d9e-8c8a-8f8643c3b4a5\") " pod="openstack/cinder-scheduler-0" Dec 02 07:45:41 crc kubenswrapper[4895]: I1202 07:45:41.552265 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/addf06e4-2353-4d9e-8c8a-8f8643c3b4a5-config-data\") pod \"cinder-scheduler-0\" (UID: \"addf06e4-2353-4d9e-8c8a-8f8643c3b4a5\") " pod="openstack/cinder-scheduler-0" Dec 02 07:45:41 crc kubenswrapper[4895]: I1202 07:45:41.554952 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/addf06e4-2353-4d9e-8c8a-8f8643c3b4a5-scripts\") pod \"cinder-scheduler-0\" (UID: \"addf06e4-2353-4d9e-8c8a-8f8643c3b4a5\") " pod="openstack/cinder-scheduler-0" Dec 02 07:45:41 crc kubenswrapper[4895]: I1202 07:45:41.555178 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/addf06e4-2353-4d9e-8c8a-8f8643c3b4a5-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"addf06e4-2353-4d9e-8c8a-8f8643c3b4a5\") " pod="openstack/cinder-scheduler-0" Dec 02 07:45:41 crc kubenswrapper[4895]: I1202 07:45:41.555280 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/addf06e4-2353-4d9e-8c8a-8f8643c3b4a5-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"addf06e4-2353-4d9e-8c8a-8f8643c3b4a5\") " pod="openstack/cinder-scheduler-0" Dec 02 07:45:41 crc kubenswrapper[4895]: I1202 07:45:41.556217 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-jl55c"] Dec 02 07:45:41 crc kubenswrapper[4895]: I1202 07:45:41.560801 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-jl55c" Dec 02 07:45:41 crc kubenswrapper[4895]: I1202 07:45:41.596672 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-jl55c"] Dec 02 07:45:41 crc kubenswrapper[4895]: I1202 07:45:41.657586 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8510271f-316d-4292-8186-a8003fea402a-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-jl55c\" (UID: \"8510271f-316d-4292-8186-a8003fea402a\") " pod="openstack/dnsmasq-dns-5c9776ccc5-jl55c" Dec 02 07:45:41 crc kubenswrapper[4895]: I1202 07:45:41.657929 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8510271f-316d-4292-8186-a8003fea402a-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-jl55c\" (UID: \"8510271f-316d-4292-8186-a8003fea402a\") " pod="openstack/dnsmasq-dns-5c9776ccc5-jl55c" Dec 02 07:45:41 crc kubenswrapper[4895]: I1202 07:45:41.658024 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8510271f-316d-4292-8186-a8003fea402a-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-jl55c\" (UID: \"8510271f-316d-4292-8186-a8003fea402a\") " pod="openstack/dnsmasq-dns-5c9776ccc5-jl55c" Dec 02 07:45:41 crc kubenswrapper[4895]: I1202 07:45:41.658131 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/addf06e4-2353-4d9e-8c8a-8f8643c3b4a5-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"addf06e4-2353-4d9e-8c8a-8f8643c3b4a5\") " pod="openstack/cinder-scheduler-0" Dec 02 07:45:41 crc kubenswrapper[4895]: I1202 07:45:41.658235 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9hjgc\" 
(UniqueName: \"kubernetes.io/projected/addf06e4-2353-4d9e-8c8a-8f8643c3b4a5-kube-api-access-9hjgc\") pod \"cinder-scheduler-0\" (UID: \"addf06e4-2353-4d9e-8c8a-8f8643c3b4a5\") " pod="openstack/cinder-scheduler-0" Dec 02 07:45:41 crc kubenswrapper[4895]: I1202 07:45:41.658330 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8510271f-316d-4292-8186-a8003fea402a-config\") pod \"dnsmasq-dns-5c9776ccc5-jl55c\" (UID: \"8510271f-316d-4292-8186-a8003fea402a\") " pod="openstack/dnsmasq-dns-5c9776ccc5-jl55c" Dec 02 07:45:41 crc kubenswrapper[4895]: I1202 07:45:41.658425 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/addf06e4-2353-4d9e-8c8a-8f8643c3b4a5-config-data\") pod \"cinder-scheduler-0\" (UID: \"addf06e4-2353-4d9e-8c8a-8f8643c3b4a5\") " pod="openstack/cinder-scheduler-0" Dec 02 07:45:41 crc kubenswrapper[4895]: I1202 07:45:41.658525 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/addf06e4-2353-4d9e-8c8a-8f8643c3b4a5-scripts\") pod \"cinder-scheduler-0\" (UID: \"addf06e4-2353-4d9e-8c8a-8f8643c3b4a5\") " pod="openstack/cinder-scheduler-0" Dec 02 07:45:41 crc kubenswrapper[4895]: I1202 07:45:41.658614 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8sm2\" (UniqueName: \"kubernetes.io/projected/8510271f-316d-4292-8186-a8003fea402a-kube-api-access-d8sm2\") pod \"dnsmasq-dns-5c9776ccc5-jl55c\" (UID: \"8510271f-316d-4292-8186-a8003fea402a\") " pod="openstack/dnsmasq-dns-5c9776ccc5-jl55c" Dec 02 07:45:41 crc kubenswrapper[4895]: I1202 07:45:41.658761 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/addf06e4-2353-4d9e-8c8a-8f8643c3b4a5-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"addf06e4-2353-4d9e-8c8a-8f8643c3b4a5\") " pod="openstack/cinder-scheduler-0" Dec 02 07:45:41 crc kubenswrapper[4895]: I1202 07:45:41.658844 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8510271f-316d-4292-8186-a8003fea402a-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-jl55c\" (UID: \"8510271f-316d-4292-8186-a8003fea402a\") " pod="openstack/dnsmasq-dns-5c9776ccc5-jl55c" Dec 02 07:45:41 crc kubenswrapper[4895]: I1202 07:45:41.658948 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/addf06e4-2353-4d9e-8c8a-8f8643c3b4a5-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"addf06e4-2353-4d9e-8c8a-8f8643c3b4a5\") " pod="openstack/cinder-scheduler-0" Dec 02 07:45:41 crc kubenswrapper[4895]: I1202 07:45:41.659306 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/addf06e4-2353-4d9e-8c8a-8f8643c3b4a5-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"addf06e4-2353-4d9e-8c8a-8f8643c3b4a5\") " pod="openstack/cinder-scheduler-0" Dec 02 07:45:41 crc kubenswrapper[4895]: I1202 07:45:41.664648 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/addf06e4-2353-4d9e-8c8a-8f8643c3b4a5-scripts\") pod \"cinder-scheduler-0\" (UID: \"addf06e4-2353-4d9e-8c8a-8f8643c3b4a5\") " pod="openstack/cinder-scheduler-0" Dec 02 07:45:41 crc kubenswrapper[4895]: I1202 07:45:41.665314 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/addf06e4-2353-4d9e-8c8a-8f8643c3b4a5-config-data-custom\") pod \"cinder-scheduler-0\" (UID: 
\"addf06e4-2353-4d9e-8c8a-8f8643c3b4a5\") " pod="openstack/cinder-scheduler-0" Dec 02 07:45:41 crc kubenswrapper[4895]: I1202 07:45:41.665497 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/addf06e4-2353-4d9e-8c8a-8f8643c3b4a5-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"addf06e4-2353-4d9e-8c8a-8f8643c3b4a5\") " pod="openstack/cinder-scheduler-0" Dec 02 07:45:41 crc kubenswrapper[4895]: I1202 07:45:41.673148 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/addf06e4-2353-4d9e-8c8a-8f8643c3b4a5-config-data\") pod \"cinder-scheduler-0\" (UID: \"addf06e4-2353-4d9e-8c8a-8f8643c3b4a5\") " pod="openstack/cinder-scheduler-0" Dec 02 07:45:41 crc kubenswrapper[4895]: I1202 07:45:41.686202 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hjgc\" (UniqueName: \"kubernetes.io/projected/addf06e4-2353-4d9e-8c8a-8f8643c3b4a5-kube-api-access-9hjgc\") pod \"cinder-scheduler-0\" (UID: \"addf06e4-2353-4d9e-8c8a-8f8643c3b4a5\") " pod="openstack/cinder-scheduler-0" Dec 02 07:45:41 crc kubenswrapper[4895]: I1202 07:45:41.747093 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 02 07:45:41 crc kubenswrapper[4895]: I1202 07:45:41.761346 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8510271f-316d-4292-8186-a8003fea402a-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-jl55c\" (UID: \"8510271f-316d-4292-8186-a8003fea402a\") " pod="openstack/dnsmasq-dns-5c9776ccc5-jl55c" Dec 02 07:45:41 crc kubenswrapper[4895]: I1202 07:45:41.761422 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8510271f-316d-4292-8186-a8003fea402a-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-jl55c\" (UID: \"8510271f-316d-4292-8186-a8003fea402a\") " pod="openstack/dnsmasq-dns-5c9776ccc5-jl55c" Dec 02 07:45:41 crc kubenswrapper[4895]: I1202 07:45:41.761446 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8510271f-316d-4292-8186-a8003fea402a-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-jl55c\" (UID: \"8510271f-316d-4292-8186-a8003fea402a\") " pod="openstack/dnsmasq-dns-5c9776ccc5-jl55c" Dec 02 07:45:41 crc kubenswrapper[4895]: I1202 07:45:41.761498 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8510271f-316d-4292-8186-a8003fea402a-config\") pod \"dnsmasq-dns-5c9776ccc5-jl55c\" (UID: \"8510271f-316d-4292-8186-a8003fea402a\") " pod="openstack/dnsmasq-dns-5c9776ccc5-jl55c" Dec 02 07:45:41 crc kubenswrapper[4895]: I1202 07:45:41.761566 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8sm2\" (UniqueName: \"kubernetes.io/projected/8510271f-316d-4292-8186-a8003fea402a-kube-api-access-d8sm2\") pod \"dnsmasq-dns-5c9776ccc5-jl55c\" (UID: \"8510271f-316d-4292-8186-a8003fea402a\") " pod="openstack/dnsmasq-dns-5c9776ccc5-jl55c" 
Dec 02 07:45:41 crc kubenswrapper[4895]: I1202 07:45:41.761662 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8510271f-316d-4292-8186-a8003fea402a-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-jl55c\" (UID: \"8510271f-316d-4292-8186-a8003fea402a\") " pod="openstack/dnsmasq-dns-5c9776ccc5-jl55c" Dec 02 07:45:41 crc kubenswrapper[4895]: I1202 07:45:41.764328 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8510271f-316d-4292-8186-a8003fea402a-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-jl55c\" (UID: \"8510271f-316d-4292-8186-a8003fea402a\") " pod="openstack/dnsmasq-dns-5c9776ccc5-jl55c" Dec 02 07:45:41 crc kubenswrapper[4895]: I1202 07:45:41.765053 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8510271f-316d-4292-8186-a8003fea402a-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-jl55c\" (UID: \"8510271f-316d-4292-8186-a8003fea402a\") " pod="openstack/dnsmasq-dns-5c9776ccc5-jl55c" Dec 02 07:45:41 crc kubenswrapper[4895]: I1202 07:45:41.765849 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8510271f-316d-4292-8186-a8003fea402a-config\") pod \"dnsmasq-dns-5c9776ccc5-jl55c\" (UID: \"8510271f-316d-4292-8186-a8003fea402a\") " pod="openstack/dnsmasq-dns-5c9776ccc5-jl55c" Dec 02 07:45:41 crc kubenswrapper[4895]: I1202 07:45:41.766846 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8510271f-316d-4292-8186-a8003fea402a-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-jl55c\" (UID: \"8510271f-316d-4292-8186-a8003fea402a\") " pod="openstack/dnsmasq-dns-5c9776ccc5-jl55c" Dec 02 07:45:41 crc kubenswrapper[4895]: I1202 07:45:41.770293 4895 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8510271f-316d-4292-8186-a8003fea402a-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-jl55c\" (UID: \"8510271f-316d-4292-8186-a8003fea402a\") " pod="openstack/dnsmasq-dns-5c9776ccc5-jl55c" Dec 02 07:45:41 crc kubenswrapper[4895]: I1202 07:45:41.785297 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8sm2\" (UniqueName: \"kubernetes.io/projected/8510271f-316d-4292-8186-a8003fea402a-kube-api-access-d8sm2\") pod \"dnsmasq-dns-5c9776ccc5-jl55c\" (UID: \"8510271f-316d-4292-8186-a8003fea402a\") " pod="openstack/dnsmasq-dns-5c9776ccc5-jl55c" Dec 02 07:45:41 crc kubenswrapper[4895]: I1202 07:45:41.806855 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 02 07:45:41 crc kubenswrapper[4895]: I1202 07:45:41.808881 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 02 07:45:41 crc kubenswrapper[4895]: I1202 07:45:41.813244 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 02 07:45:41 crc kubenswrapper[4895]: I1202 07:45:41.826846 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 02 07:45:41 crc kubenswrapper[4895]: I1202 07:45:41.900402 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-jl55c" Dec 02 07:45:41 crc kubenswrapper[4895]: I1202 07:45:41.967245 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e6a8a0b0-ea5b-436e-a9a3-7c75ecca485a-config-data-custom\") pod \"cinder-api-0\" (UID: \"e6a8a0b0-ea5b-436e-a9a3-7c75ecca485a\") " pod="openstack/cinder-api-0" Dec 02 07:45:41 crc kubenswrapper[4895]: I1202 07:45:41.967304 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzgmc\" (UniqueName: \"kubernetes.io/projected/e6a8a0b0-ea5b-436e-a9a3-7c75ecca485a-kube-api-access-dzgmc\") pod \"cinder-api-0\" (UID: \"e6a8a0b0-ea5b-436e-a9a3-7c75ecca485a\") " pod="openstack/cinder-api-0" Dec 02 07:45:41 crc kubenswrapper[4895]: I1202 07:45:41.967341 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6a8a0b0-ea5b-436e-a9a3-7c75ecca485a-scripts\") pod \"cinder-api-0\" (UID: \"e6a8a0b0-ea5b-436e-a9a3-7c75ecca485a\") " pod="openstack/cinder-api-0" Dec 02 07:45:41 crc kubenswrapper[4895]: I1202 07:45:41.967427 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6a8a0b0-ea5b-436e-a9a3-7c75ecca485a-config-data\") pod \"cinder-api-0\" (UID: \"e6a8a0b0-ea5b-436e-a9a3-7c75ecca485a\") " pod="openstack/cinder-api-0" Dec 02 07:45:41 crc kubenswrapper[4895]: I1202 07:45:41.967455 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6a8a0b0-ea5b-436e-a9a3-7c75ecca485a-logs\") pod \"cinder-api-0\" (UID: \"e6a8a0b0-ea5b-436e-a9a3-7c75ecca485a\") " pod="openstack/cinder-api-0" Dec 02 07:45:41 crc kubenswrapper[4895]: I1202 07:45:41.967511 4895 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e6a8a0b0-ea5b-436e-a9a3-7c75ecca485a-etc-machine-id\") pod \"cinder-api-0\" (UID: \"e6a8a0b0-ea5b-436e-a9a3-7c75ecca485a\") " pod="openstack/cinder-api-0" Dec 02 07:45:41 crc kubenswrapper[4895]: I1202 07:45:41.967534 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6a8a0b0-ea5b-436e-a9a3-7c75ecca485a-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"e6a8a0b0-ea5b-436e-a9a3-7c75ecca485a\") " pod="openstack/cinder-api-0" Dec 02 07:45:42 crc kubenswrapper[4895]: I1202 07:45:42.078178 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6a8a0b0-ea5b-436e-a9a3-7c75ecca485a-logs\") pod \"cinder-api-0\" (UID: \"e6a8a0b0-ea5b-436e-a9a3-7c75ecca485a\") " pod="openstack/cinder-api-0" Dec 02 07:45:42 crc kubenswrapper[4895]: I1202 07:45:42.078290 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e6a8a0b0-ea5b-436e-a9a3-7c75ecca485a-etc-machine-id\") pod \"cinder-api-0\" (UID: \"e6a8a0b0-ea5b-436e-a9a3-7c75ecca485a\") " pod="openstack/cinder-api-0" Dec 02 07:45:42 crc kubenswrapper[4895]: I1202 07:45:42.078334 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6a8a0b0-ea5b-436e-a9a3-7c75ecca485a-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"e6a8a0b0-ea5b-436e-a9a3-7c75ecca485a\") " pod="openstack/cinder-api-0" Dec 02 07:45:42 crc kubenswrapper[4895]: I1202 07:45:42.078426 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/e6a8a0b0-ea5b-436e-a9a3-7c75ecca485a-config-data-custom\") pod \"cinder-api-0\" (UID: \"e6a8a0b0-ea5b-436e-a9a3-7c75ecca485a\") " pod="openstack/cinder-api-0" Dec 02 07:45:42 crc kubenswrapper[4895]: I1202 07:45:42.078447 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dzgmc\" (UniqueName: \"kubernetes.io/projected/e6a8a0b0-ea5b-436e-a9a3-7c75ecca485a-kube-api-access-dzgmc\") pod \"cinder-api-0\" (UID: \"e6a8a0b0-ea5b-436e-a9a3-7c75ecca485a\") " pod="openstack/cinder-api-0" Dec 02 07:45:42 crc kubenswrapper[4895]: I1202 07:45:42.078474 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6a8a0b0-ea5b-436e-a9a3-7c75ecca485a-scripts\") pod \"cinder-api-0\" (UID: \"e6a8a0b0-ea5b-436e-a9a3-7c75ecca485a\") " pod="openstack/cinder-api-0" Dec 02 07:45:42 crc kubenswrapper[4895]: I1202 07:45:42.078521 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6a8a0b0-ea5b-436e-a9a3-7c75ecca485a-config-data\") pod \"cinder-api-0\" (UID: \"e6a8a0b0-ea5b-436e-a9a3-7c75ecca485a\") " pod="openstack/cinder-api-0" Dec 02 07:45:42 crc kubenswrapper[4895]: I1202 07:45:42.086819 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e6a8a0b0-ea5b-436e-a9a3-7c75ecca485a-etc-machine-id\") pod \"cinder-api-0\" (UID: \"e6a8a0b0-ea5b-436e-a9a3-7c75ecca485a\") " pod="openstack/cinder-api-0" Dec 02 07:45:42 crc kubenswrapper[4895]: I1202 07:45:42.086829 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e6a8a0b0-ea5b-436e-a9a3-7c75ecca485a-config-data-custom\") pod \"cinder-api-0\" (UID: \"e6a8a0b0-ea5b-436e-a9a3-7c75ecca485a\") " pod="openstack/cinder-api-0" Dec 02 07:45:42 crc kubenswrapper[4895]: I1202 
07:45:42.088434 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6a8a0b0-ea5b-436e-a9a3-7c75ecca485a-logs\") pod \"cinder-api-0\" (UID: \"e6a8a0b0-ea5b-436e-a9a3-7c75ecca485a\") " pod="openstack/cinder-api-0" Dec 02 07:45:42 crc kubenswrapper[4895]: I1202 07:45:42.101318 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6a8a0b0-ea5b-436e-a9a3-7c75ecca485a-config-data\") pod \"cinder-api-0\" (UID: \"e6a8a0b0-ea5b-436e-a9a3-7c75ecca485a\") " pod="openstack/cinder-api-0" Dec 02 07:45:42 crc kubenswrapper[4895]: I1202 07:45:42.105792 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6a8a0b0-ea5b-436e-a9a3-7c75ecca485a-scripts\") pod \"cinder-api-0\" (UID: \"e6a8a0b0-ea5b-436e-a9a3-7c75ecca485a\") " pod="openstack/cinder-api-0" Dec 02 07:45:42 crc kubenswrapper[4895]: I1202 07:45:42.113138 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6a8a0b0-ea5b-436e-a9a3-7c75ecca485a-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"e6a8a0b0-ea5b-436e-a9a3-7c75ecca485a\") " pod="openstack/cinder-api-0" Dec 02 07:45:42 crc kubenswrapper[4895]: I1202 07:45:42.121701 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzgmc\" (UniqueName: \"kubernetes.io/projected/e6a8a0b0-ea5b-436e-a9a3-7c75ecca485a-kube-api-access-dzgmc\") pod \"cinder-api-0\" (UID: \"e6a8a0b0-ea5b-436e-a9a3-7c75ecca485a\") " pod="openstack/cinder-api-0" Dec 02 07:45:42 crc kubenswrapper[4895]: I1202 07:45:42.150905 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 02 07:45:42 crc kubenswrapper[4895]: I1202 07:45:42.179599 4895 generic.go:334] "Generic (PLEG): container finished" podID="28a2d92c-8fd5-43a8-813c-7b1c49264fcd" containerID="65e1edc58fc552f0456df5454e008ed05f70d97b624af09ecff7c585dfaa6423" exitCode=0 Dec 02 07:45:42 crc kubenswrapper[4895]: I1202 07:45:42.179999 4895 generic.go:334] "Generic (PLEG): container finished" podID="28a2d92c-8fd5-43a8-813c-7b1c49264fcd" containerID="b4214addbfaabc09997052827a3fdefb258c86367a4da84d2c8edacf38b19b42" exitCode=2 Dec 02 07:45:42 crc kubenswrapper[4895]: I1202 07:45:42.180008 4895 generic.go:334] "Generic (PLEG): container finished" podID="28a2d92c-8fd5-43a8-813c-7b1c49264fcd" containerID="8bfb9fa755b1d3e73df5a493d8f23979f6da47a21ed4d8ae8683e54eed3de8a6" exitCode=0 Dec 02 07:45:42 crc kubenswrapper[4895]: I1202 07:45:42.180060 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"28a2d92c-8fd5-43a8-813c-7b1c49264fcd","Type":"ContainerDied","Data":"65e1edc58fc552f0456df5454e008ed05f70d97b624af09ecff7c585dfaa6423"} Dec 02 07:45:42 crc kubenswrapper[4895]: I1202 07:45:42.180100 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"28a2d92c-8fd5-43a8-813c-7b1c49264fcd","Type":"ContainerDied","Data":"b4214addbfaabc09997052827a3fdefb258c86367a4da84d2c8edacf38b19b42"} Dec 02 07:45:42 crc kubenswrapper[4895]: I1202 07:45:42.180111 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"28a2d92c-8fd5-43a8-813c-7b1c49264fcd","Type":"ContainerDied","Data":"8bfb9fa755b1d3e73df5a493d8f23979f6da47a21ed4d8ae8683e54eed3de8a6"} Dec 02 07:45:42 crc kubenswrapper[4895]: I1202 07:45:42.203626 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-89bf75f54-8mw6n" 
event={"ID":"1eba7853-b279-4568-b36b-09a1c9edb7b6","Type":"ContainerStarted","Data":"cf4d65ab2ba96ae4bc80a0b7d054ed0e797b68492137a02f6a2798347f9acbf4"} Dec 02 07:45:42 crc kubenswrapper[4895]: I1202 07:45:42.203693 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-89bf75f54-8mw6n" event={"ID":"1eba7853-b279-4568-b36b-09a1c9edb7b6","Type":"ContainerStarted","Data":"e82651dcffa9a06d933859b1cefdfb52d2b7e7e3c9403401e298a1029d5a87ca"} Dec 02 07:45:42 crc kubenswrapper[4895]: I1202 07:45:42.204712 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-89bf75f54-8mw6n" Dec 02 07:45:42 crc kubenswrapper[4895]: I1202 07:45:42.204782 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-89bf75f54-8mw6n" Dec 02 07:45:42 crc kubenswrapper[4895]: I1202 07:45:42.232866 4895 generic.go:334] "Generic (PLEG): container finished" podID="90f80179-e9a8-493a-9a05-aec3d0b4be72" containerID="c05b51454f75e9c380a3561e712204a194fb6d6f8d99cf642bc77d83cf407d79" exitCode=0 Dec 02 07:45:42 crc kubenswrapper[4895]: I1202 07:45:42.232918 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-nmt5c" event={"ID":"90f80179-e9a8-493a-9a05-aec3d0b4be72","Type":"ContainerDied","Data":"c05b51454f75e9c380a3561e712204a194fb6d6f8d99cf642bc77d83cf407d79"} Dec 02 07:45:42 crc kubenswrapper[4895]: I1202 07:45:42.301347 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-89bf75f54-8mw6n" podStartSLOduration=3.301327185 podStartE2EDuration="3.301327185s" podCreationTimestamp="2025-12-02 07:45:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:45:42.265770189 +0000 UTC m=+1353.436629812" watchObservedRunningTime="2025-12-02 07:45:42.301327185 +0000 UTC m=+1353.472186808" Dec 02 07:45:42 crc kubenswrapper[4895]: 
E1202 07:45:42.528404 4895 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Dec 02 07:45:42 crc kubenswrapper[4895]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/90f80179-e9a8-493a-9a05-aec3d0b4be72/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Dec 02 07:45:42 crc kubenswrapper[4895]: > podSandboxID="e9479ba5e1f7ac6852665f42cf9d0b6a9139e432e944137e9f0f331cec6623d3" Dec 02 07:45:42 crc kubenswrapper[4895]: E1202 07:45:42.529163 4895 kuberuntime_manager.go:1274] "Unhandled Error" err=< Dec 02 07:45:42 crc kubenswrapper[4895]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv 
--log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n7ch57ch5c5hcch589hf7h577h659h96h5c8h5b4h55fhbbh667h565h5bchcbh58dh7dh5bch586h56ch574h598h67dh5c8h56dh8bh574h564hbch7q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-swift-storage-0,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-swift-storage-0,SubPath:dns-swift-storage-0,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-nb,SubPath:ovsdbserver-nb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-sb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-sb,SubPath:ovsdbserver-sb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dfhsw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 
},Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-85ff748b95-nmt5c_openstack(90f80179-e9a8-493a-9a05-aec3d0b4be72): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/90f80179-e9a8-493a-9a05-aec3d0b4be72/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Dec 02 07:45:42 crc kubenswrapper[4895]: > logger="UnhandledError" Dec 02 07:45:42 crc kubenswrapper[4895]: E1202 07:45:42.530381 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/90f80179-e9a8-493a-9a05-aec3d0b4be72/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-85ff748b95-nmt5c" podUID="90f80179-e9a8-493a-9a05-aec3d0b4be72" Dec 02 07:45:42 crc kubenswrapper[4895]: I1202 07:45:42.706919 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 02 07:45:42 crc kubenswrapper[4895]: I1202 07:45:42.803004 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-jl55c"] Dec 02 07:45:42 crc 
kubenswrapper[4895]: I1202 07:45:42.822050 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 02 07:45:43 crc kubenswrapper[4895]: I1202 07:45:43.243961 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"addf06e4-2353-4d9e-8c8a-8f8643c3b4a5","Type":"ContainerStarted","Data":"c88956d6f55f08f070e7d2ece3d0aad0a69f331aabc551f889f5cb4561ee107c"} Dec 02 07:45:43 crc kubenswrapper[4895]: I1202 07:45:43.249520 4895 generic.go:334] "Generic (PLEG): container finished" podID="28a2d92c-8fd5-43a8-813c-7b1c49264fcd" containerID="ef44fd0d43646d18ce0e1b8c6bdb73245931a867ce992bd072016aac43c5116a" exitCode=0 Dec 02 07:45:43 crc kubenswrapper[4895]: I1202 07:45:43.249544 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"28a2d92c-8fd5-43a8-813c-7b1c49264fcd","Type":"ContainerDied","Data":"ef44fd0d43646d18ce0e1b8c6bdb73245931a867ce992bd072016aac43c5116a"} Dec 02 07:45:43 crc kubenswrapper[4895]: I1202 07:45:43.773973 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-nmt5c" Dec 02 07:45:43 crc kubenswrapper[4895]: I1202 07:45:43.863566 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90f80179-e9a8-493a-9a05-aec3d0b4be72-config\") pod \"90f80179-e9a8-493a-9a05-aec3d0b4be72\" (UID: \"90f80179-e9a8-493a-9a05-aec3d0b4be72\") " Dec 02 07:45:43 crc kubenswrapper[4895]: I1202 07:45:43.863630 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/90f80179-e9a8-493a-9a05-aec3d0b4be72-ovsdbserver-sb\") pod \"90f80179-e9a8-493a-9a05-aec3d0b4be72\" (UID: \"90f80179-e9a8-493a-9a05-aec3d0b4be72\") " Dec 02 07:45:43 crc kubenswrapper[4895]: I1202 07:45:43.863896 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/90f80179-e9a8-493a-9a05-aec3d0b4be72-ovsdbserver-nb\") pod \"90f80179-e9a8-493a-9a05-aec3d0b4be72\" (UID: \"90f80179-e9a8-493a-9a05-aec3d0b4be72\") " Dec 02 07:45:43 crc kubenswrapper[4895]: I1202 07:45:43.863985 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/90f80179-e9a8-493a-9a05-aec3d0b4be72-dns-swift-storage-0\") pod \"90f80179-e9a8-493a-9a05-aec3d0b4be72\" (UID: \"90f80179-e9a8-493a-9a05-aec3d0b4be72\") " Dec 02 07:45:43 crc kubenswrapper[4895]: I1202 07:45:43.864069 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/90f80179-e9a8-493a-9a05-aec3d0b4be72-dns-svc\") pod \"90f80179-e9a8-493a-9a05-aec3d0b4be72\" (UID: \"90f80179-e9a8-493a-9a05-aec3d0b4be72\") " Dec 02 07:45:43 crc kubenswrapper[4895]: I1202 07:45:43.864121 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dfhsw\" 
(UniqueName: \"kubernetes.io/projected/90f80179-e9a8-493a-9a05-aec3d0b4be72-kube-api-access-dfhsw\") pod \"90f80179-e9a8-493a-9a05-aec3d0b4be72\" (UID: \"90f80179-e9a8-493a-9a05-aec3d0b4be72\") " Dec 02 07:45:43 crc kubenswrapper[4895]: I1202 07:45:43.886489 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90f80179-e9a8-493a-9a05-aec3d0b4be72-kube-api-access-dfhsw" (OuterVolumeSpecName: "kube-api-access-dfhsw") pod "90f80179-e9a8-493a-9a05-aec3d0b4be72" (UID: "90f80179-e9a8-493a-9a05-aec3d0b4be72"). InnerVolumeSpecName "kube-api-access-dfhsw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:45:43 crc kubenswrapper[4895]: I1202 07:45:43.975603 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dfhsw\" (UniqueName: \"kubernetes.io/projected/90f80179-e9a8-493a-9a05-aec3d0b4be72-kube-api-access-dfhsw\") on node \"crc\" DevicePath \"\"" Dec 02 07:45:43 crc kubenswrapper[4895]: I1202 07:45:43.998006 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90f80179-e9a8-493a-9a05-aec3d0b4be72-config" (OuterVolumeSpecName: "config") pod "90f80179-e9a8-493a-9a05-aec3d0b4be72" (UID: "90f80179-e9a8-493a-9a05-aec3d0b4be72"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:45:44 crc kubenswrapper[4895]: I1202 07:45:44.026295 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90f80179-e9a8-493a-9a05-aec3d0b4be72-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "90f80179-e9a8-493a-9a05-aec3d0b4be72" (UID: "90f80179-e9a8-493a-9a05-aec3d0b4be72"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:45:44 crc kubenswrapper[4895]: I1202 07:45:44.030660 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90f80179-e9a8-493a-9a05-aec3d0b4be72-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "90f80179-e9a8-493a-9a05-aec3d0b4be72" (UID: "90f80179-e9a8-493a-9a05-aec3d0b4be72"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:45:44 crc kubenswrapper[4895]: I1202 07:45:44.040867 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 02 07:45:44 crc kubenswrapper[4895]: I1202 07:45:44.052878 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90f80179-e9a8-493a-9a05-aec3d0b4be72-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "90f80179-e9a8-493a-9a05-aec3d0b4be72" (UID: "90f80179-e9a8-493a-9a05-aec3d0b4be72"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:45:44 crc kubenswrapper[4895]: I1202 07:45:44.056113 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90f80179-e9a8-493a-9a05-aec3d0b4be72-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "90f80179-e9a8-493a-9a05-aec3d0b4be72" (UID: "90f80179-e9a8-493a-9a05-aec3d0b4be72"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:45:44 crc kubenswrapper[4895]: I1202 07:45:44.077427 4895 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90f80179-e9a8-493a-9a05-aec3d0b4be72-config\") on node \"crc\" DevicePath \"\"" Dec 02 07:45:44 crc kubenswrapper[4895]: I1202 07:45:44.077465 4895 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/90f80179-e9a8-493a-9a05-aec3d0b4be72-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 02 07:45:44 crc kubenswrapper[4895]: I1202 07:45:44.077480 4895 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/90f80179-e9a8-493a-9a05-aec3d0b4be72-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 02 07:45:44 crc kubenswrapper[4895]: I1202 07:45:44.077494 4895 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/90f80179-e9a8-493a-9a05-aec3d0b4be72-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 02 07:45:44 crc kubenswrapper[4895]: I1202 07:45:44.077507 4895 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/90f80179-e9a8-493a-9a05-aec3d0b4be72-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 02 07:45:44 crc kubenswrapper[4895]: I1202 07:45:44.178787 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/28a2d92c-8fd5-43a8-813c-7b1c49264fcd-log-httpd\") pod \"28a2d92c-8fd5-43a8-813c-7b1c49264fcd\" (UID: \"28a2d92c-8fd5-43a8-813c-7b1c49264fcd\") " Dec 02 07:45:44 crc kubenswrapper[4895]: I1202 07:45:44.178910 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/28a2d92c-8fd5-43a8-813c-7b1c49264fcd-run-httpd\") pod 
\"28a2d92c-8fd5-43a8-813c-7b1c49264fcd\" (UID: \"28a2d92c-8fd5-43a8-813c-7b1c49264fcd\") " Dec 02 07:45:44 crc kubenswrapper[4895]: I1202 07:45:44.178937 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28a2d92c-8fd5-43a8-813c-7b1c49264fcd-config-data\") pod \"28a2d92c-8fd5-43a8-813c-7b1c49264fcd\" (UID: \"28a2d92c-8fd5-43a8-813c-7b1c49264fcd\") " Dec 02 07:45:44 crc kubenswrapper[4895]: I1202 07:45:44.178960 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28a2d92c-8fd5-43a8-813c-7b1c49264fcd-combined-ca-bundle\") pod \"28a2d92c-8fd5-43a8-813c-7b1c49264fcd\" (UID: \"28a2d92c-8fd5-43a8-813c-7b1c49264fcd\") " Dec 02 07:45:44 crc kubenswrapper[4895]: I1202 07:45:44.179083 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/28a2d92c-8fd5-43a8-813c-7b1c49264fcd-scripts\") pod \"28a2d92c-8fd5-43a8-813c-7b1c49264fcd\" (UID: \"28a2d92c-8fd5-43a8-813c-7b1c49264fcd\") " Dec 02 07:45:44 crc kubenswrapper[4895]: I1202 07:45:44.179171 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/28a2d92c-8fd5-43a8-813c-7b1c49264fcd-sg-core-conf-yaml\") pod \"28a2d92c-8fd5-43a8-813c-7b1c49264fcd\" (UID: \"28a2d92c-8fd5-43a8-813c-7b1c49264fcd\") " Dec 02 07:45:44 crc kubenswrapper[4895]: I1202 07:45:44.179263 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kxbg4\" (UniqueName: \"kubernetes.io/projected/28a2d92c-8fd5-43a8-813c-7b1c49264fcd-kube-api-access-kxbg4\") pod \"28a2d92c-8fd5-43a8-813c-7b1c49264fcd\" (UID: \"28a2d92c-8fd5-43a8-813c-7b1c49264fcd\") " Dec 02 07:45:44 crc kubenswrapper[4895]: I1202 07:45:44.182488 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/28a2d92c-8fd5-43a8-813c-7b1c49264fcd-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "28a2d92c-8fd5-43a8-813c-7b1c49264fcd" (UID: "28a2d92c-8fd5-43a8-813c-7b1c49264fcd"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:45:44 crc kubenswrapper[4895]: I1202 07:45:44.183171 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28a2d92c-8fd5-43a8-813c-7b1c49264fcd-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "28a2d92c-8fd5-43a8-813c-7b1c49264fcd" (UID: "28a2d92c-8fd5-43a8-813c-7b1c49264fcd"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:45:44 crc kubenswrapper[4895]: I1202 07:45:44.185552 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28a2d92c-8fd5-43a8-813c-7b1c49264fcd-kube-api-access-kxbg4" (OuterVolumeSpecName: "kube-api-access-kxbg4") pod "28a2d92c-8fd5-43a8-813c-7b1c49264fcd" (UID: "28a2d92c-8fd5-43a8-813c-7b1c49264fcd"). InnerVolumeSpecName "kube-api-access-kxbg4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:45:44 crc kubenswrapper[4895]: I1202 07:45:44.187919 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28a2d92c-8fd5-43a8-813c-7b1c49264fcd-scripts" (OuterVolumeSpecName: "scripts") pod "28a2d92c-8fd5-43a8-813c-7b1c49264fcd" (UID: "28a2d92c-8fd5-43a8-813c-7b1c49264fcd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:45:44 crc kubenswrapper[4895]: I1202 07:45:44.225882 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28a2d92c-8fd5-43a8-813c-7b1c49264fcd-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "28a2d92c-8fd5-43a8-813c-7b1c49264fcd" (UID: "28a2d92c-8fd5-43a8-813c-7b1c49264fcd"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:45:44 crc kubenswrapper[4895]: I1202 07:45:44.274264 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28a2d92c-8fd5-43a8-813c-7b1c49264fcd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "28a2d92c-8fd5-43a8-813c-7b1c49264fcd" (UID: "28a2d92c-8fd5-43a8-813c-7b1c49264fcd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:45:44 crc kubenswrapper[4895]: I1202 07:45:44.279118 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-jl55c" event={"ID":"8510271f-316d-4292-8186-a8003fea402a","Type":"ContainerStarted","Data":"f5a1ee8b278e30a9be1958e42c02e647676dd00646d873fb4d16da06fdc6aab0"} Dec 02 07:45:44 crc kubenswrapper[4895]: I1202 07:45:44.281584 4895 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/28a2d92c-8fd5-43a8-813c-7b1c49264fcd-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 02 07:45:44 crc kubenswrapper[4895]: I1202 07:45:44.281614 4895 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/28a2d92c-8fd5-43a8-813c-7b1c49264fcd-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 02 07:45:44 crc kubenswrapper[4895]: I1202 07:45:44.281629 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28a2d92c-8fd5-43a8-813c-7b1c49264fcd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 07:45:44 crc kubenswrapper[4895]: I1202 07:45:44.281645 4895 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/28a2d92c-8fd5-43a8-813c-7b1c49264fcd-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 07:45:44 crc kubenswrapper[4895]: I1202 07:45:44.281656 4895 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" 
(UniqueName: \"kubernetes.io/secret/28a2d92c-8fd5-43a8-813c-7b1c49264fcd-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 02 07:45:44 crc kubenswrapper[4895]: I1202 07:45:44.281667 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kxbg4\" (UniqueName: \"kubernetes.io/projected/28a2d92c-8fd5-43a8-813c-7b1c49264fcd-kube-api-access-kxbg4\") on node \"crc\" DevicePath \"\"" Dec 02 07:45:44 crc kubenswrapper[4895]: I1202 07:45:44.283011 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e6a8a0b0-ea5b-436e-a9a3-7c75ecca485a","Type":"ContainerStarted","Data":"262b70e5e0ed673c7ee5c30dd08276507e56747ff711448a272a7e479d08fbea"} Dec 02 07:45:44 crc kubenswrapper[4895]: I1202 07:45:44.288122 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-nmt5c" event={"ID":"90f80179-e9a8-493a-9a05-aec3d0b4be72","Type":"ContainerDied","Data":"e9479ba5e1f7ac6852665f42cf9d0b6a9139e432e944137e9f0f331cec6623d3"} Dec 02 07:45:44 crc kubenswrapper[4895]: I1202 07:45:44.288201 4895 scope.go:117] "RemoveContainer" containerID="c05b51454f75e9c380a3561e712204a194fb6d6f8d99cf642bc77d83cf407d79" Dec 02 07:45:44 crc kubenswrapper[4895]: I1202 07:45:44.288210 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-nmt5c" Dec 02 07:45:44 crc kubenswrapper[4895]: I1202 07:45:44.292950 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-64685599d6-tgrm9" event={"ID":"e0c6d90b-6e06-4b01-a8d7-5761b6cb5c0f","Type":"ContainerStarted","Data":"1ea05e687809a1075b370d099e40ef305622b4839ae26a0439d53df787025e36"} Dec 02 07:45:44 crc kubenswrapper[4895]: I1202 07:45:44.302969 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 02 07:45:44 crc kubenswrapper[4895]: I1202 07:45:44.304051 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"28a2d92c-8fd5-43a8-813c-7b1c49264fcd","Type":"ContainerDied","Data":"074c4f0214159d8576c6c43d0b4e82313b1bda2590e848736b79ba02629dd243"} Dec 02 07:45:44 crc kubenswrapper[4895]: I1202 07:45:44.304353 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28a2d92c-8fd5-43a8-813c-7b1c49264fcd-config-data" (OuterVolumeSpecName: "config-data") pod "28a2d92c-8fd5-43a8-813c-7b1c49264fcd" (UID: "28a2d92c-8fd5-43a8-813c-7b1c49264fcd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:45:44 crc kubenswrapper[4895]: I1202 07:45:44.351316 4895 scope.go:117] "RemoveContainer" containerID="65e1edc58fc552f0456df5454e008ed05f70d97b624af09ecff7c585dfaa6423" Dec 02 07:45:44 crc kubenswrapper[4895]: I1202 07:45:44.379588 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 02 07:45:44 crc kubenswrapper[4895]: I1202 07:45:44.389140 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28a2d92c-8fd5-43a8-813c-7b1c49264fcd-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 07:45:44 crc kubenswrapper[4895]: I1202 07:45:44.447330 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 02 07:45:44 crc kubenswrapper[4895]: I1202 07:45:44.472919 4895 scope.go:117] "RemoveContainer" containerID="b4214addbfaabc09997052827a3fdefb258c86367a4da84d2c8edacf38b19b42" Dec 02 07:45:44 crc kubenswrapper[4895]: I1202 07:45:44.516778 4895 scope.go:117] "RemoveContainer" containerID="ef44fd0d43646d18ce0e1b8c6bdb73245931a867ce992bd072016aac43c5116a" Dec 02 07:45:44 crc kubenswrapper[4895]: I1202 07:45:44.520904 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-85ff748b95-nmt5c"] Dec 02 07:45:44 crc kubenswrapper[4895]: I1202 07:45:44.533986 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 02 07:45:44 crc kubenswrapper[4895]: E1202 07:45:44.534698 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28a2d92c-8fd5-43a8-813c-7b1c49264fcd" containerName="proxy-httpd" Dec 02 07:45:44 crc kubenswrapper[4895]: I1202 07:45:44.534721 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="28a2d92c-8fd5-43a8-813c-7b1c49264fcd" containerName="proxy-httpd" Dec 02 07:45:44 crc kubenswrapper[4895]: E1202 07:45:44.534768 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28a2d92c-8fd5-43a8-813c-7b1c49264fcd" containerName="ceilometer-notification-agent" Dec 02 07:45:44 crc kubenswrapper[4895]: I1202 07:45:44.534779 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="28a2d92c-8fd5-43a8-813c-7b1c49264fcd" containerName="ceilometer-notification-agent" Dec 02 07:45:44 crc kubenswrapper[4895]: E1202 07:45:44.534790 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28a2d92c-8fd5-43a8-813c-7b1c49264fcd" containerName="ceilometer-central-agent" Dec 02 07:45:44 crc kubenswrapper[4895]: I1202 07:45:44.534796 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="28a2d92c-8fd5-43a8-813c-7b1c49264fcd" containerName="ceilometer-central-agent" Dec 02 07:45:44 crc kubenswrapper[4895]: E1202 07:45:44.534822 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28a2d92c-8fd5-43a8-813c-7b1c49264fcd" containerName="sg-core" Dec 02 07:45:44 crc kubenswrapper[4895]: I1202 07:45:44.534829 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="28a2d92c-8fd5-43a8-813c-7b1c49264fcd" containerName="sg-core" Dec 02 07:45:44 crc kubenswrapper[4895]: E1202 07:45:44.534838 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90f80179-e9a8-493a-9a05-aec3d0b4be72" containerName="init" 
Dec 02 07:45:44 crc kubenswrapper[4895]: I1202 07:45:44.534843 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="90f80179-e9a8-493a-9a05-aec3d0b4be72" containerName="init" Dec 02 07:45:44 crc kubenswrapper[4895]: I1202 07:45:44.535175 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="28a2d92c-8fd5-43a8-813c-7b1c49264fcd" containerName="ceilometer-notification-agent" Dec 02 07:45:44 crc kubenswrapper[4895]: I1202 07:45:44.535186 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="90f80179-e9a8-493a-9a05-aec3d0b4be72" containerName="init" Dec 02 07:45:44 crc kubenswrapper[4895]: I1202 07:45:44.535197 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="28a2d92c-8fd5-43a8-813c-7b1c49264fcd" containerName="proxy-httpd" Dec 02 07:45:44 crc kubenswrapper[4895]: I1202 07:45:44.535206 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="28a2d92c-8fd5-43a8-813c-7b1c49264fcd" containerName="sg-core" Dec 02 07:45:44 crc kubenswrapper[4895]: I1202 07:45:44.535218 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="28a2d92c-8fd5-43a8-813c-7b1c49264fcd" containerName="ceilometer-central-agent" Dec 02 07:45:44 crc kubenswrapper[4895]: I1202 07:45:44.540339 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 02 07:45:44 crc kubenswrapper[4895]: I1202 07:45:44.544091 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 02 07:45:44 crc kubenswrapper[4895]: I1202 07:45:44.545033 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 02 07:45:44 crc kubenswrapper[4895]: I1202 07:45:44.545228 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-nmt5c"] Dec 02 07:45:44 crc kubenswrapper[4895]: I1202 07:45:44.557020 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 02 07:45:44 crc kubenswrapper[4895]: I1202 07:45:44.607211 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1418eba9-bbbe-44f2-bb65-7081c8e0f25f-log-httpd\") pod \"ceilometer-0\" (UID: \"1418eba9-bbbe-44f2-bb65-7081c8e0f25f\") " pod="openstack/ceilometer-0" Dec 02 07:45:44 crc kubenswrapper[4895]: I1202 07:45:44.607701 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dx4lb\" (UniqueName: \"kubernetes.io/projected/1418eba9-bbbe-44f2-bb65-7081c8e0f25f-kube-api-access-dx4lb\") pod \"ceilometer-0\" (UID: \"1418eba9-bbbe-44f2-bb65-7081c8e0f25f\") " pod="openstack/ceilometer-0" Dec 02 07:45:44 crc kubenswrapper[4895]: I1202 07:45:44.607860 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1418eba9-bbbe-44f2-bb65-7081c8e0f25f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1418eba9-bbbe-44f2-bb65-7081c8e0f25f\") " pod="openstack/ceilometer-0" Dec 02 07:45:44 crc kubenswrapper[4895]: I1202 07:45:44.608005 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1418eba9-bbbe-44f2-bb65-7081c8e0f25f-run-httpd\") pod \"ceilometer-0\" (UID: \"1418eba9-bbbe-44f2-bb65-7081c8e0f25f\") " pod="openstack/ceilometer-0" Dec 02 07:45:44 crc kubenswrapper[4895]: I1202 07:45:44.608079 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1418eba9-bbbe-44f2-bb65-7081c8e0f25f-scripts\") pod \"ceilometer-0\" (UID: \"1418eba9-bbbe-44f2-bb65-7081c8e0f25f\") " pod="openstack/ceilometer-0" Dec 02 07:45:44 crc kubenswrapper[4895]: I1202 07:45:44.608205 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1418eba9-bbbe-44f2-bb65-7081c8e0f25f-config-data\") pod \"ceilometer-0\" (UID: \"1418eba9-bbbe-44f2-bb65-7081c8e0f25f\") " pod="openstack/ceilometer-0" Dec 02 07:45:44 crc kubenswrapper[4895]: I1202 07:45:44.608416 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1418eba9-bbbe-44f2-bb65-7081c8e0f25f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1418eba9-bbbe-44f2-bb65-7081c8e0f25f\") " pod="openstack/ceilometer-0" Dec 02 07:45:44 crc kubenswrapper[4895]: I1202 07:45:44.624515 4895 scope.go:117] "RemoveContainer" containerID="8bfb9fa755b1d3e73df5a493d8f23979f6da47a21ed4d8ae8683e54eed3de8a6" Dec 02 07:45:44 crc kubenswrapper[4895]: I1202 07:45:44.709698 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1418eba9-bbbe-44f2-bb65-7081c8e0f25f-run-httpd\") pod \"ceilometer-0\" (UID: \"1418eba9-bbbe-44f2-bb65-7081c8e0f25f\") " pod="openstack/ceilometer-0" Dec 02 07:45:44 crc kubenswrapper[4895]: I1202 07:45:44.709766 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/1418eba9-bbbe-44f2-bb65-7081c8e0f25f-scripts\") pod \"ceilometer-0\" (UID: \"1418eba9-bbbe-44f2-bb65-7081c8e0f25f\") " pod="openstack/ceilometer-0" Dec 02 07:45:44 crc kubenswrapper[4895]: I1202 07:45:44.709818 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1418eba9-bbbe-44f2-bb65-7081c8e0f25f-config-data\") pod \"ceilometer-0\" (UID: \"1418eba9-bbbe-44f2-bb65-7081c8e0f25f\") " pod="openstack/ceilometer-0" Dec 02 07:45:44 crc kubenswrapper[4895]: I1202 07:45:44.709862 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1418eba9-bbbe-44f2-bb65-7081c8e0f25f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1418eba9-bbbe-44f2-bb65-7081c8e0f25f\") " pod="openstack/ceilometer-0" Dec 02 07:45:44 crc kubenswrapper[4895]: I1202 07:45:44.709886 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1418eba9-bbbe-44f2-bb65-7081c8e0f25f-log-httpd\") pod \"ceilometer-0\" (UID: \"1418eba9-bbbe-44f2-bb65-7081c8e0f25f\") " pod="openstack/ceilometer-0" Dec 02 07:45:44 crc kubenswrapper[4895]: I1202 07:45:44.709927 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dx4lb\" (UniqueName: \"kubernetes.io/projected/1418eba9-bbbe-44f2-bb65-7081c8e0f25f-kube-api-access-dx4lb\") pod \"ceilometer-0\" (UID: \"1418eba9-bbbe-44f2-bb65-7081c8e0f25f\") " pod="openstack/ceilometer-0" Dec 02 07:45:44 crc kubenswrapper[4895]: I1202 07:45:44.709946 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1418eba9-bbbe-44f2-bb65-7081c8e0f25f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1418eba9-bbbe-44f2-bb65-7081c8e0f25f\") " pod="openstack/ceilometer-0" Dec 02 07:45:44 crc 
kubenswrapper[4895]: I1202 07:45:44.711300 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1418eba9-bbbe-44f2-bb65-7081c8e0f25f-run-httpd\") pod \"ceilometer-0\" (UID: \"1418eba9-bbbe-44f2-bb65-7081c8e0f25f\") " pod="openstack/ceilometer-0" Dec 02 07:45:44 crc kubenswrapper[4895]: I1202 07:45:44.711418 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1418eba9-bbbe-44f2-bb65-7081c8e0f25f-log-httpd\") pod \"ceilometer-0\" (UID: \"1418eba9-bbbe-44f2-bb65-7081c8e0f25f\") " pod="openstack/ceilometer-0" Dec 02 07:45:44 crc kubenswrapper[4895]: I1202 07:45:44.716983 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1418eba9-bbbe-44f2-bb65-7081c8e0f25f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1418eba9-bbbe-44f2-bb65-7081c8e0f25f\") " pod="openstack/ceilometer-0" Dec 02 07:45:44 crc kubenswrapper[4895]: I1202 07:45:44.719883 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1418eba9-bbbe-44f2-bb65-7081c8e0f25f-config-data\") pod \"ceilometer-0\" (UID: \"1418eba9-bbbe-44f2-bb65-7081c8e0f25f\") " pod="openstack/ceilometer-0" Dec 02 07:45:44 crc kubenswrapper[4895]: I1202 07:45:44.720284 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1418eba9-bbbe-44f2-bb65-7081c8e0f25f-scripts\") pod \"ceilometer-0\" (UID: \"1418eba9-bbbe-44f2-bb65-7081c8e0f25f\") " pod="openstack/ceilometer-0" Dec 02 07:45:44 crc kubenswrapper[4895]: I1202 07:45:44.720924 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1418eba9-bbbe-44f2-bb65-7081c8e0f25f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"1418eba9-bbbe-44f2-bb65-7081c8e0f25f\") " pod="openstack/ceilometer-0" Dec 02 07:45:44 crc kubenswrapper[4895]: I1202 07:45:44.731292 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dx4lb\" (UniqueName: \"kubernetes.io/projected/1418eba9-bbbe-44f2-bb65-7081c8e0f25f-kube-api-access-dx4lb\") pod \"ceilometer-0\" (UID: \"1418eba9-bbbe-44f2-bb65-7081c8e0f25f\") " pod="openstack/ceilometer-0" Dec 02 07:45:44 crc kubenswrapper[4895]: I1202 07:45:44.920359 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 02 07:45:45 crc kubenswrapper[4895]: I1202 07:45:45.199777 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28a2d92c-8fd5-43a8-813c-7b1c49264fcd" path="/var/lib/kubelet/pods/28a2d92c-8fd5-43a8-813c-7b1c49264fcd/volumes" Dec 02 07:45:45 crc kubenswrapper[4895]: I1202 07:45:45.201258 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90f80179-e9a8-493a-9a05-aec3d0b4be72" path="/var/lib/kubelet/pods/90f80179-e9a8-493a-9a05-aec3d0b4be72/volumes" Dec 02 07:45:45 crc kubenswrapper[4895]: I1202 07:45:45.336020 4895 generic.go:334] "Generic (PLEG): container finished" podID="8510271f-316d-4292-8186-a8003fea402a" containerID="95aa6bd0d063807af177fa72abf0b5a4edc46fca52ead31062b8deb065437f05" exitCode=0 Dec 02 07:45:45 crc kubenswrapper[4895]: I1202 07:45:45.336103 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-jl55c" event={"ID":"8510271f-316d-4292-8186-a8003fea402a","Type":"ContainerDied","Data":"95aa6bd0d063807af177fa72abf0b5a4edc46fca52ead31062b8deb065437f05"} Dec 02 07:45:45 crc kubenswrapper[4895]: I1202 07:45:45.411038 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e6a8a0b0-ea5b-436e-a9a3-7c75ecca485a","Type":"ContainerStarted","Data":"8e8854b87804d789cdb1f9a8afc05325856757545c54d3bc84cc2e97c1d6f4df"} Dec 02 07:45:45 crc 
kubenswrapper[4895]: I1202 07:45:45.504002 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 02 07:45:45 crc kubenswrapper[4895]: I1202 07:45:45.600395 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-64685599d6-tgrm9" event={"ID":"e0c6d90b-6e06-4b01-a8d7-5761b6cb5c0f","Type":"ContainerStarted","Data":"87a341d01cbe5679c7f66108701ad133b21f9226861ceb315e694aa0b420673a"} Dec 02 07:45:45 crc kubenswrapper[4895]: I1202 07:45:45.639379 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 02 07:45:45 crc kubenswrapper[4895]: I1202 07:45:45.652494 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"addf06e4-2353-4d9e-8c8a-8f8643c3b4a5","Type":"ContainerStarted","Data":"3c0b12770b942fb28f5657b757228bd6b23fcb3181571e0bc9090cf73fff6b0a"} Dec 02 07:45:45 crc kubenswrapper[4895]: I1202 07:45:45.712035 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-b969f4967-hmqp8" event={"ID":"5b34f139-ac6c-4a24-b478-c4563cce6a2c","Type":"ContainerStarted","Data":"7d6a5cbf4ac42d7b9bcb1f16b7d852ad3e604b72c9dfa43a52ca193e0d0f7f4e"} Dec 02 07:45:45 crc kubenswrapper[4895]: I1202 07:45:45.713774 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-b969f4967-hmqp8" event={"ID":"5b34f139-ac6c-4a24-b478-c4563cce6a2c","Type":"ContainerStarted","Data":"68d4a2538c6c04477ff11aefd007fcd9450afdb38ccbda6a64db4e5f865071b1"} Dec 02 07:45:45 crc kubenswrapper[4895]: I1202 07:45:45.717065 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-64685599d6-tgrm9" podStartSLOduration=5.122632504 podStartE2EDuration="7.717039171s" podCreationTimestamp="2025-12-02 07:45:38 +0000 UTC" firstStartedPulling="2025-12-02 07:45:40.985402314 +0000 UTC m=+1352.156261927" lastFinishedPulling="2025-12-02 07:45:43.579808981 +0000 
UTC m=+1354.750668594" observedRunningTime="2025-12-02 07:45:45.696487047 +0000 UTC m=+1356.867346660" watchObservedRunningTime="2025-12-02 07:45:45.717039171 +0000 UTC m=+1356.887898784" Dec 02 07:45:46 crc kubenswrapper[4895]: I1202 07:45:46.357721 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-b969f4967-hmqp8" podStartSLOduration=5.490529038 podStartE2EDuration="8.357697878s" podCreationTimestamp="2025-12-02 07:45:38 +0000 UTC" firstStartedPulling="2025-12-02 07:45:40.71777909 +0000 UTC m=+1351.888638703" lastFinishedPulling="2025-12-02 07:45:43.58494793 +0000 UTC m=+1354.755807543" observedRunningTime="2025-12-02 07:45:45.806664605 +0000 UTC m=+1356.977524218" watchObservedRunningTime="2025-12-02 07:45:46.357697878 +0000 UTC m=+1357.528557491" Dec 02 07:45:46 crc kubenswrapper[4895]: I1202 07:45:46.390367 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-788d454954-brr26"] Dec 02 07:45:46 crc kubenswrapper[4895]: I1202 07:45:46.392359 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-788d454954-brr26" Dec 02 07:45:46 crc kubenswrapper[4895]: I1202 07:45:46.395680 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Dec 02 07:45:46 crc kubenswrapper[4895]: I1202 07:45:46.401015 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Dec 02 07:45:46 crc kubenswrapper[4895]: I1202 07:45:46.427014 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-788d454954-brr26"] Dec 02 07:45:46 crc kubenswrapper[4895]: I1202 07:45:46.510327 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3d2bb1c-bd20-473e-b91a-7e2a63ec9f07-config-data\") pod \"barbican-api-788d454954-brr26\" (UID: \"e3d2bb1c-bd20-473e-b91a-7e2a63ec9f07\") " pod="openstack/barbican-api-788d454954-brr26" Dec 02 07:45:46 crc kubenswrapper[4895]: I1202 07:45:46.510908 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3d2bb1c-bd20-473e-b91a-7e2a63ec9f07-combined-ca-bundle\") pod \"barbican-api-788d454954-brr26\" (UID: \"e3d2bb1c-bd20-473e-b91a-7e2a63ec9f07\") " pod="openstack/barbican-api-788d454954-brr26" Dec 02 07:45:46 crc kubenswrapper[4895]: I1202 07:45:46.510947 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e3d2bb1c-bd20-473e-b91a-7e2a63ec9f07-config-data-custom\") pod \"barbican-api-788d454954-brr26\" (UID: \"e3d2bb1c-bd20-473e-b91a-7e2a63ec9f07\") " pod="openstack/barbican-api-788d454954-brr26" Dec 02 07:45:46 crc kubenswrapper[4895]: I1202 07:45:46.510990 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/e3d2bb1c-bd20-473e-b91a-7e2a63ec9f07-logs\") pod \"barbican-api-788d454954-brr26\" (UID: \"e3d2bb1c-bd20-473e-b91a-7e2a63ec9f07\") " pod="openstack/barbican-api-788d454954-brr26" Dec 02 07:45:46 crc kubenswrapper[4895]: I1202 07:45:46.511063 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxm6c\" (UniqueName: \"kubernetes.io/projected/e3d2bb1c-bd20-473e-b91a-7e2a63ec9f07-kube-api-access-pxm6c\") pod \"barbican-api-788d454954-brr26\" (UID: \"e3d2bb1c-bd20-473e-b91a-7e2a63ec9f07\") " pod="openstack/barbican-api-788d454954-brr26" Dec 02 07:45:46 crc kubenswrapper[4895]: I1202 07:45:46.511162 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3d2bb1c-bd20-473e-b91a-7e2a63ec9f07-public-tls-certs\") pod \"barbican-api-788d454954-brr26\" (UID: \"e3d2bb1c-bd20-473e-b91a-7e2a63ec9f07\") " pod="openstack/barbican-api-788d454954-brr26" Dec 02 07:45:46 crc kubenswrapper[4895]: I1202 07:45:46.511219 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3d2bb1c-bd20-473e-b91a-7e2a63ec9f07-internal-tls-certs\") pod \"barbican-api-788d454954-brr26\" (UID: \"e3d2bb1c-bd20-473e-b91a-7e2a63ec9f07\") " pod="openstack/barbican-api-788d454954-brr26" Dec 02 07:45:46 crc kubenswrapper[4895]: I1202 07:45:46.613491 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pxm6c\" (UniqueName: \"kubernetes.io/projected/e3d2bb1c-bd20-473e-b91a-7e2a63ec9f07-kube-api-access-pxm6c\") pod \"barbican-api-788d454954-brr26\" (UID: \"e3d2bb1c-bd20-473e-b91a-7e2a63ec9f07\") " pod="openstack/barbican-api-788d454954-brr26" Dec 02 07:45:46 crc kubenswrapper[4895]: I1202 07:45:46.613612 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3d2bb1c-bd20-473e-b91a-7e2a63ec9f07-public-tls-certs\") pod \"barbican-api-788d454954-brr26\" (UID: \"e3d2bb1c-bd20-473e-b91a-7e2a63ec9f07\") " pod="openstack/barbican-api-788d454954-brr26" Dec 02 07:45:46 crc kubenswrapper[4895]: I1202 07:45:46.613658 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3d2bb1c-bd20-473e-b91a-7e2a63ec9f07-internal-tls-certs\") pod \"barbican-api-788d454954-brr26\" (UID: \"e3d2bb1c-bd20-473e-b91a-7e2a63ec9f07\") " pod="openstack/barbican-api-788d454954-brr26" Dec 02 07:45:46 crc kubenswrapper[4895]: I1202 07:45:46.613726 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3d2bb1c-bd20-473e-b91a-7e2a63ec9f07-config-data\") pod \"barbican-api-788d454954-brr26\" (UID: \"e3d2bb1c-bd20-473e-b91a-7e2a63ec9f07\") " pod="openstack/barbican-api-788d454954-brr26" Dec 02 07:45:46 crc kubenswrapper[4895]: I1202 07:45:46.613783 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3d2bb1c-bd20-473e-b91a-7e2a63ec9f07-combined-ca-bundle\") pod \"barbican-api-788d454954-brr26\" (UID: \"e3d2bb1c-bd20-473e-b91a-7e2a63ec9f07\") " pod="openstack/barbican-api-788d454954-brr26" Dec 02 07:45:46 crc kubenswrapper[4895]: I1202 07:45:46.613812 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e3d2bb1c-bd20-473e-b91a-7e2a63ec9f07-config-data-custom\") pod \"barbican-api-788d454954-brr26\" (UID: \"e3d2bb1c-bd20-473e-b91a-7e2a63ec9f07\") " pod="openstack/barbican-api-788d454954-brr26" Dec 02 07:45:46 crc kubenswrapper[4895]: I1202 07:45:46.613850 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/e3d2bb1c-bd20-473e-b91a-7e2a63ec9f07-logs\") pod \"barbican-api-788d454954-brr26\" (UID: \"e3d2bb1c-bd20-473e-b91a-7e2a63ec9f07\") " pod="openstack/barbican-api-788d454954-brr26" Dec 02 07:45:46 crc kubenswrapper[4895]: I1202 07:45:46.614425 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3d2bb1c-bd20-473e-b91a-7e2a63ec9f07-logs\") pod \"barbican-api-788d454954-brr26\" (UID: \"e3d2bb1c-bd20-473e-b91a-7e2a63ec9f07\") " pod="openstack/barbican-api-788d454954-brr26" Dec 02 07:45:46 crc kubenswrapper[4895]: I1202 07:45:46.624094 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3d2bb1c-bd20-473e-b91a-7e2a63ec9f07-internal-tls-certs\") pod \"barbican-api-788d454954-brr26\" (UID: \"e3d2bb1c-bd20-473e-b91a-7e2a63ec9f07\") " pod="openstack/barbican-api-788d454954-brr26" Dec 02 07:45:46 crc kubenswrapper[4895]: I1202 07:45:46.624154 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e3d2bb1c-bd20-473e-b91a-7e2a63ec9f07-config-data-custom\") pod \"barbican-api-788d454954-brr26\" (UID: \"e3d2bb1c-bd20-473e-b91a-7e2a63ec9f07\") " pod="openstack/barbican-api-788d454954-brr26" Dec 02 07:45:46 crc kubenswrapper[4895]: I1202 07:45:46.624519 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3d2bb1c-bd20-473e-b91a-7e2a63ec9f07-public-tls-certs\") pod \"barbican-api-788d454954-brr26\" (UID: \"e3d2bb1c-bd20-473e-b91a-7e2a63ec9f07\") " pod="openstack/barbican-api-788d454954-brr26" Dec 02 07:45:46 crc kubenswrapper[4895]: I1202 07:45:46.625836 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3d2bb1c-bd20-473e-b91a-7e2a63ec9f07-combined-ca-bundle\") pod 
\"barbican-api-788d454954-brr26\" (UID: \"e3d2bb1c-bd20-473e-b91a-7e2a63ec9f07\") " pod="openstack/barbican-api-788d454954-brr26" Dec 02 07:45:46 crc kubenswrapper[4895]: I1202 07:45:46.626195 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3d2bb1c-bd20-473e-b91a-7e2a63ec9f07-config-data\") pod \"barbican-api-788d454954-brr26\" (UID: \"e3d2bb1c-bd20-473e-b91a-7e2a63ec9f07\") " pod="openstack/barbican-api-788d454954-brr26" Dec 02 07:45:46 crc kubenswrapper[4895]: I1202 07:45:46.635227 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxm6c\" (UniqueName: \"kubernetes.io/projected/e3d2bb1c-bd20-473e-b91a-7e2a63ec9f07-kube-api-access-pxm6c\") pod \"barbican-api-788d454954-brr26\" (UID: \"e3d2bb1c-bd20-473e-b91a-7e2a63ec9f07\") " pod="openstack/barbican-api-788d454954-brr26" Dec 02 07:45:46 crc kubenswrapper[4895]: I1202 07:45:46.736930 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-788d454954-brr26" Dec 02 07:45:46 crc kubenswrapper[4895]: I1202 07:45:46.741523 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-jl55c" event={"ID":"8510271f-316d-4292-8186-a8003fea402a","Type":"ContainerStarted","Data":"9ebb1deb43c00b84cf99cfd9c031bff1aff735c6eac15fb6a3fd84fb0f9c894d"} Dec 02 07:45:46 crc kubenswrapper[4895]: I1202 07:45:46.741947 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c9776ccc5-jl55c" Dec 02 07:45:46 crc kubenswrapper[4895]: I1202 07:45:46.753771 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e6a8a0b0-ea5b-436e-a9a3-7c75ecca485a","Type":"ContainerStarted","Data":"d1c1b8460b278a4a021b28e19501e2600387590063cf3d70b788396e89c503b0"} Dec 02 07:45:46 crc kubenswrapper[4895]: I1202 07:45:46.754220 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="e6a8a0b0-ea5b-436e-a9a3-7c75ecca485a" containerName="cinder-api-log" containerID="cri-o://8e8854b87804d789cdb1f9a8afc05325856757545c54d3bc84cc2e97c1d6f4df" gracePeriod=30 Dec 02 07:45:46 crc kubenswrapper[4895]: I1202 07:45:46.754366 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 02 07:45:46 crc kubenswrapper[4895]: I1202 07:45:46.754409 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="e6a8a0b0-ea5b-436e-a9a3-7c75ecca485a" containerName="cinder-api" containerID="cri-o://d1c1b8460b278a4a021b28e19501e2600387590063cf3d70b788396e89c503b0" gracePeriod=30 Dec 02 07:45:46 crc kubenswrapper[4895]: I1202 07:45:46.758432 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"1418eba9-bbbe-44f2-bb65-7081c8e0f25f","Type":"ContainerStarted","Data":"acd8159f31b6a2b7877e0052baa193e5ccf1701251075ff824f4e2e31674f969"} Dec 02 07:45:46 crc kubenswrapper[4895]: I1202 07:45:46.758493 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1418eba9-bbbe-44f2-bb65-7081c8e0f25f","Type":"ContainerStarted","Data":"252063a448124a283c7f67c6f4d9291c5a9b4dde9568f6541e5a20aee8c80c47"} Dec 02 07:45:46 crc kubenswrapper[4895]: I1202 07:45:46.773907 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"addf06e4-2353-4d9e-8c8a-8f8643c3b4a5","Type":"ContainerStarted","Data":"d1302c749db51673b22f1213ca7ddcebd8a85299ad22b3c296705264cca0d787"} Dec 02 07:45:46 crc kubenswrapper[4895]: I1202 07:45:46.789885 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c9776ccc5-jl55c" podStartSLOduration=5.789848766 podStartE2EDuration="5.789848766s" podCreationTimestamp="2025-12-02 07:45:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:45:46.766803475 +0000 UTC m=+1357.937663108" watchObservedRunningTime="2025-12-02 07:45:46.789848766 +0000 UTC m=+1357.960708379" Dec 02 07:45:46 crc kubenswrapper[4895]: I1202 07:45:46.826230 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=5.826206127 podStartE2EDuration="5.826206127s" podCreationTimestamp="2025-12-02 07:45:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:45:46.795230001 +0000 UTC m=+1357.966089614" watchObservedRunningTime="2025-12-02 07:45:46.826206127 +0000 UTC m=+1357.997065730" Dec 02 07:45:46 crc kubenswrapper[4895]: I1202 07:45:46.834641 4895 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.558406639 podStartE2EDuration="5.834613836s" podCreationTimestamp="2025-12-02 07:45:41 +0000 UTC" firstStartedPulling="2025-12-02 07:45:42.71687921 +0000 UTC m=+1353.887738823" lastFinishedPulling="2025-12-02 07:45:43.993086407 +0000 UTC m=+1355.163946020" observedRunningTime="2025-12-02 07:45:46.825241737 +0000 UTC m=+1357.996101370" watchObservedRunningTime="2025-12-02 07:45:46.834613836 +0000 UTC m=+1358.005473449" Dec 02 07:45:47 crc kubenswrapper[4895]: I1202 07:45:47.283793 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-788d454954-brr26"] Dec 02 07:45:47 crc kubenswrapper[4895]: W1202 07:45:47.320309 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode3d2bb1c_bd20_473e_b91a_7e2a63ec9f07.slice/crio-1cc3f46f7c91409910c521462a433b139a38e5268d48fc41e8dc3a7977ee1078 WatchSource:0}: Error finding container 1cc3f46f7c91409910c521462a433b139a38e5268d48fc41e8dc3a7977ee1078: Status 404 returned error can't find the container with id 1cc3f46f7c91409910c521462a433b139a38e5268d48fc41e8dc3a7977ee1078 Dec 02 07:45:47 crc kubenswrapper[4895]: I1202 07:45:47.800110 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-788d454954-brr26" event={"ID":"e3d2bb1c-bd20-473e-b91a-7e2a63ec9f07","Type":"ContainerStarted","Data":"1cc3f46f7c91409910c521462a433b139a38e5268d48fc41e8dc3a7977ee1078"} Dec 02 07:45:47 crc kubenswrapper[4895]: I1202 07:45:47.808979 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e6a8a0b0-ea5b-436e-a9a3-7c75ecca485a","Type":"ContainerDied","Data":"d1c1b8460b278a4a021b28e19501e2600387590063cf3d70b788396e89c503b0"} Dec 02 07:45:47 crc kubenswrapper[4895]: I1202 07:45:47.809088 4895 generic.go:334] "Generic (PLEG): container finished" podID="e6a8a0b0-ea5b-436e-a9a3-7c75ecca485a" 
containerID="d1c1b8460b278a4a021b28e19501e2600387590063cf3d70b788396e89c503b0" exitCode=0 Dec 02 07:45:47 crc kubenswrapper[4895]: I1202 07:45:47.809172 4895 generic.go:334] "Generic (PLEG): container finished" podID="e6a8a0b0-ea5b-436e-a9a3-7c75ecca485a" containerID="8e8854b87804d789cdb1f9a8afc05325856757545c54d3bc84cc2e97c1d6f4df" exitCode=143 Dec 02 07:45:47 crc kubenswrapper[4895]: I1202 07:45:47.809297 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e6a8a0b0-ea5b-436e-a9a3-7c75ecca485a","Type":"ContainerDied","Data":"8e8854b87804d789cdb1f9a8afc05325856757545c54d3bc84cc2e97c1d6f4df"} Dec 02 07:45:47 crc kubenswrapper[4895]: I1202 07:45:47.850856 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 02 07:45:47 crc kubenswrapper[4895]: I1202 07:45:47.906383 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6a8a0b0-ea5b-436e-a9a3-7c75ecca485a-logs\") pod \"e6a8a0b0-ea5b-436e-a9a3-7c75ecca485a\" (UID: \"e6a8a0b0-ea5b-436e-a9a3-7c75ecca485a\") " Dec 02 07:45:47 crc kubenswrapper[4895]: I1202 07:45:47.906448 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dzgmc\" (UniqueName: \"kubernetes.io/projected/e6a8a0b0-ea5b-436e-a9a3-7c75ecca485a-kube-api-access-dzgmc\") pod \"e6a8a0b0-ea5b-436e-a9a3-7c75ecca485a\" (UID: \"e6a8a0b0-ea5b-436e-a9a3-7c75ecca485a\") " Dec 02 07:45:47 crc kubenswrapper[4895]: I1202 07:45:47.906593 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6a8a0b0-ea5b-436e-a9a3-7c75ecca485a-combined-ca-bundle\") pod \"e6a8a0b0-ea5b-436e-a9a3-7c75ecca485a\" (UID: \"e6a8a0b0-ea5b-436e-a9a3-7c75ecca485a\") " Dec 02 07:45:47 crc kubenswrapper[4895]: I1202 07:45:47.906759 4895 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6a8a0b0-ea5b-436e-a9a3-7c75ecca485a-config-data\") pod \"e6a8a0b0-ea5b-436e-a9a3-7c75ecca485a\" (UID: \"e6a8a0b0-ea5b-436e-a9a3-7c75ecca485a\") " Dec 02 07:45:47 crc kubenswrapper[4895]: I1202 07:45:47.906823 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6a8a0b0-ea5b-436e-a9a3-7c75ecca485a-scripts\") pod \"e6a8a0b0-ea5b-436e-a9a3-7c75ecca485a\" (UID: \"e6a8a0b0-ea5b-436e-a9a3-7c75ecca485a\") " Dec 02 07:45:47 crc kubenswrapper[4895]: I1202 07:45:47.906872 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e6a8a0b0-ea5b-436e-a9a3-7c75ecca485a-config-data-custom\") pod \"e6a8a0b0-ea5b-436e-a9a3-7c75ecca485a\" (UID: \"e6a8a0b0-ea5b-436e-a9a3-7c75ecca485a\") " Dec 02 07:45:47 crc kubenswrapper[4895]: I1202 07:45:47.906888 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e6a8a0b0-ea5b-436e-a9a3-7c75ecca485a-etc-machine-id\") pod \"e6a8a0b0-ea5b-436e-a9a3-7c75ecca485a\" (UID: \"e6a8a0b0-ea5b-436e-a9a3-7c75ecca485a\") " Dec 02 07:45:47 crc kubenswrapper[4895]: I1202 07:45:47.907429 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6a8a0b0-ea5b-436e-a9a3-7c75ecca485a-logs" (OuterVolumeSpecName: "logs") pod "e6a8a0b0-ea5b-436e-a9a3-7c75ecca485a" (UID: "e6a8a0b0-ea5b-436e-a9a3-7c75ecca485a"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:45:47 crc kubenswrapper[4895]: I1202 07:45:47.910485 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e6a8a0b0-ea5b-436e-a9a3-7c75ecca485a-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "e6a8a0b0-ea5b-436e-a9a3-7c75ecca485a" (UID: "e6a8a0b0-ea5b-436e-a9a3-7c75ecca485a"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 07:45:47 crc kubenswrapper[4895]: I1202 07:45:47.915807 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6a8a0b0-ea5b-436e-a9a3-7c75ecca485a-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "e6a8a0b0-ea5b-436e-a9a3-7c75ecca485a" (UID: "e6a8a0b0-ea5b-436e-a9a3-7c75ecca485a"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:45:47 crc kubenswrapper[4895]: I1202 07:45:47.916147 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6a8a0b0-ea5b-436e-a9a3-7c75ecca485a-kube-api-access-dzgmc" (OuterVolumeSpecName: "kube-api-access-dzgmc") pod "e6a8a0b0-ea5b-436e-a9a3-7c75ecca485a" (UID: "e6a8a0b0-ea5b-436e-a9a3-7c75ecca485a"). InnerVolumeSpecName "kube-api-access-dzgmc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:45:47 crc kubenswrapper[4895]: I1202 07:45:47.916293 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6a8a0b0-ea5b-436e-a9a3-7c75ecca485a-scripts" (OuterVolumeSpecName: "scripts") pod "e6a8a0b0-ea5b-436e-a9a3-7c75ecca485a" (UID: "e6a8a0b0-ea5b-436e-a9a3-7c75ecca485a"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:45:47 crc kubenswrapper[4895]: I1202 07:45:47.938312 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6a8a0b0-ea5b-436e-a9a3-7c75ecca485a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e6a8a0b0-ea5b-436e-a9a3-7c75ecca485a" (UID: "e6a8a0b0-ea5b-436e-a9a3-7c75ecca485a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:45:47 crc kubenswrapper[4895]: I1202 07:45:47.983837 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6a8a0b0-ea5b-436e-a9a3-7c75ecca485a-config-data" (OuterVolumeSpecName: "config-data") pod "e6a8a0b0-ea5b-436e-a9a3-7c75ecca485a" (UID: "e6a8a0b0-ea5b-436e-a9a3-7c75ecca485a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:45:48 crc kubenswrapper[4895]: I1202 07:45:48.010884 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6a8a0b0-ea5b-436e-a9a3-7c75ecca485a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 07:45:48 crc kubenswrapper[4895]: I1202 07:45:48.010921 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6a8a0b0-ea5b-436e-a9a3-7c75ecca485a-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 07:45:48 crc kubenswrapper[4895]: I1202 07:45:48.010931 4895 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6a8a0b0-ea5b-436e-a9a3-7c75ecca485a-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 07:45:48 crc kubenswrapper[4895]: I1202 07:45:48.010941 4895 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e6a8a0b0-ea5b-436e-a9a3-7c75ecca485a-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 02 07:45:48 
crc kubenswrapper[4895]: I1202 07:45:48.010951 4895 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e6a8a0b0-ea5b-436e-a9a3-7c75ecca485a-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 02 07:45:48 crc kubenswrapper[4895]: I1202 07:45:48.010961 4895 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6a8a0b0-ea5b-436e-a9a3-7c75ecca485a-logs\") on node \"crc\" DevicePath \"\"" Dec 02 07:45:48 crc kubenswrapper[4895]: I1202 07:45:48.010971 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dzgmc\" (UniqueName: \"kubernetes.io/projected/e6a8a0b0-ea5b-436e-a9a3-7c75ecca485a-kube-api-access-dzgmc\") on node \"crc\" DevicePath \"\"" Dec 02 07:45:48 crc kubenswrapper[4895]: I1202 07:45:48.821826 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-788d454954-brr26" event={"ID":"e3d2bb1c-bd20-473e-b91a-7e2a63ec9f07","Type":"ContainerStarted","Data":"8f6e08f059d8d10b34bda28a99cf993bc10f7153af260e881771ef9437a89f77"} Dec 02 07:45:48 crc kubenswrapper[4895]: I1202 07:45:48.822948 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-788d454954-brr26" Dec 02 07:45:48 crc kubenswrapper[4895]: I1202 07:45:48.822970 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-788d454954-brr26" event={"ID":"e3d2bb1c-bd20-473e-b91a-7e2a63ec9f07","Type":"ContainerStarted","Data":"915e5d2b5f5c95e83c1104dc0136dd664c02203a632c18741317fd352c1f6413"} Dec 02 07:45:48 crc kubenswrapper[4895]: I1202 07:45:48.825543 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e6a8a0b0-ea5b-436e-a9a3-7c75ecca485a","Type":"ContainerDied","Data":"262b70e5e0ed673c7ee5c30dd08276507e56747ff711448a272a7e479d08fbea"} Dec 02 07:45:48 crc kubenswrapper[4895]: I1202 07:45:48.825672 4895 scope.go:117] "RemoveContainer" 
containerID="d1c1b8460b278a4a021b28e19501e2600387590063cf3d70b788396e89c503b0" Dec 02 07:45:48 crc kubenswrapper[4895]: I1202 07:45:48.825588 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 02 07:45:48 crc kubenswrapper[4895]: I1202 07:45:48.828371 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1418eba9-bbbe-44f2-bb65-7081c8e0f25f","Type":"ContainerStarted","Data":"91d73afad30f868a96875b2c1fc42a6c1c98b6a198695f9963c6980bee5da657"} Dec 02 07:45:48 crc kubenswrapper[4895]: I1202 07:45:48.884428 4895 scope.go:117] "RemoveContainer" containerID="8e8854b87804d789cdb1f9a8afc05325856757545c54d3bc84cc2e97c1d6f4df" Dec 02 07:45:48 crc kubenswrapper[4895]: I1202 07:45:48.904957 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-788d454954-brr26" podStartSLOduration=2.904932551 podStartE2EDuration="2.904932551s" podCreationTimestamp="2025-12-02 07:45:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:45:48.88479335 +0000 UTC m=+1360.055652963" watchObservedRunningTime="2025-12-02 07:45:48.904932551 +0000 UTC m=+1360.075792164" Dec 02 07:45:48 crc kubenswrapper[4895]: I1202 07:45:48.947511 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 02 07:45:48 crc kubenswrapper[4895]: I1202 07:45:48.972308 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Dec 02 07:45:48 crc kubenswrapper[4895]: I1202 07:45:48.988137 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 02 07:45:48 crc kubenswrapper[4895]: E1202 07:45:48.988703 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6a8a0b0-ea5b-436e-a9a3-7c75ecca485a" containerName="cinder-api" Dec 02 07:45:48 crc kubenswrapper[4895]: I1202 
07:45:48.996549 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6a8a0b0-ea5b-436e-a9a3-7c75ecca485a" containerName="cinder-api" Dec 02 07:45:48 crc kubenswrapper[4895]: E1202 07:45:48.996728 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6a8a0b0-ea5b-436e-a9a3-7c75ecca485a" containerName="cinder-api-log" Dec 02 07:45:48 crc kubenswrapper[4895]: I1202 07:45:48.996815 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6a8a0b0-ea5b-436e-a9a3-7c75ecca485a" containerName="cinder-api-log" Dec 02 07:45:48 crc kubenswrapper[4895]: I1202 07:45:48.997325 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6a8a0b0-ea5b-436e-a9a3-7c75ecca485a" containerName="cinder-api-log" Dec 02 07:45:48 crc kubenswrapper[4895]: I1202 07:45:48.997398 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6a8a0b0-ea5b-436e-a9a3-7c75ecca485a" containerName="cinder-api" Dec 02 07:45:48 crc kubenswrapper[4895]: I1202 07:45:48.998572 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 02 07:45:49 crc kubenswrapper[4895]: I1202 07:45:49.003128 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Dec 02 07:45:49 crc kubenswrapper[4895]: I1202 07:45:49.003578 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 02 07:45:49 crc kubenswrapper[4895]: I1202 07:45:49.004370 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Dec 02 07:45:49 crc kubenswrapper[4895]: I1202 07:45:49.016473 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 02 07:45:49 crc kubenswrapper[4895]: I1202 07:45:49.033056 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ebbed0ba-1d44-4421-a276-b075b0f31c3f-etc-machine-id\") pod \"cinder-api-0\" (UID: \"ebbed0ba-1d44-4421-a276-b075b0f31c3f\") " pod="openstack/cinder-api-0" Dec 02 07:45:49 crc kubenswrapper[4895]: I1202 07:45:49.033121 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ebbed0ba-1d44-4421-a276-b075b0f31c3f-config-data-custom\") pod \"cinder-api-0\" (UID: \"ebbed0ba-1d44-4421-a276-b075b0f31c3f\") " pod="openstack/cinder-api-0" Dec 02 07:45:49 crc kubenswrapper[4895]: I1202 07:45:49.033166 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ebbed0ba-1d44-4421-a276-b075b0f31c3f-scripts\") pod \"cinder-api-0\" (UID: \"ebbed0ba-1d44-4421-a276-b075b0f31c3f\") " pod="openstack/cinder-api-0" Dec 02 07:45:49 crc kubenswrapper[4895]: I1202 07:45:49.033192 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-qwtlc\" (UniqueName: \"kubernetes.io/projected/ebbed0ba-1d44-4421-a276-b075b0f31c3f-kube-api-access-qwtlc\") pod \"cinder-api-0\" (UID: \"ebbed0ba-1d44-4421-a276-b075b0f31c3f\") " pod="openstack/cinder-api-0" Dec 02 07:45:49 crc kubenswrapper[4895]: I1202 07:45:49.033239 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebbed0ba-1d44-4421-a276-b075b0f31c3f-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"ebbed0ba-1d44-4421-a276-b075b0f31c3f\") " pod="openstack/cinder-api-0" Dec 02 07:45:49 crc kubenswrapper[4895]: I1202 07:45:49.033316 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebbed0ba-1d44-4421-a276-b075b0f31c3f-config-data\") pod \"cinder-api-0\" (UID: \"ebbed0ba-1d44-4421-a276-b075b0f31c3f\") " pod="openstack/cinder-api-0" Dec 02 07:45:49 crc kubenswrapper[4895]: I1202 07:45:49.033336 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebbed0ba-1d44-4421-a276-b075b0f31c3f-public-tls-certs\") pod \"cinder-api-0\" (UID: \"ebbed0ba-1d44-4421-a276-b075b0f31c3f\") " pod="openstack/cinder-api-0" Dec 02 07:45:49 crc kubenswrapper[4895]: I1202 07:45:49.033357 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ebbed0ba-1d44-4421-a276-b075b0f31c3f-logs\") pod \"cinder-api-0\" (UID: \"ebbed0ba-1d44-4421-a276-b075b0f31c3f\") " pod="openstack/cinder-api-0" Dec 02 07:45:49 crc kubenswrapper[4895]: I1202 07:45:49.033381 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebbed0ba-1d44-4421-a276-b075b0f31c3f-internal-tls-certs\") pod 
\"cinder-api-0\" (UID: \"ebbed0ba-1d44-4421-a276-b075b0f31c3f\") " pod="openstack/cinder-api-0" Dec 02 07:45:49 crc kubenswrapper[4895]: I1202 07:45:49.141195 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebbed0ba-1d44-4421-a276-b075b0f31c3f-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"ebbed0ba-1d44-4421-a276-b075b0f31c3f\") " pod="openstack/cinder-api-0" Dec 02 07:45:49 crc kubenswrapper[4895]: I1202 07:45:49.141318 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebbed0ba-1d44-4421-a276-b075b0f31c3f-config-data\") pod \"cinder-api-0\" (UID: \"ebbed0ba-1d44-4421-a276-b075b0f31c3f\") " pod="openstack/cinder-api-0" Dec 02 07:45:49 crc kubenswrapper[4895]: I1202 07:45:49.141337 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebbed0ba-1d44-4421-a276-b075b0f31c3f-public-tls-certs\") pod \"cinder-api-0\" (UID: \"ebbed0ba-1d44-4421-a276-b075b0f31c3f\") " pod="openstack/cinder-api-0" Dec 02 07:45:49 crc kubenswrapper[4895]: I1202 07:45:49.141357 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ebbed0ba-1d44-4421-a276-b075b0f31c3f-logs\") pod \"cinder-api-0\" (UID: \"ebbed0ba-1d44-4421-a276-b075b0f31c3f\") " pod="openstack/cinder-api-0" Dec 02 07:45:49 crc kubenswrapper[4895]: I1202 07:45:49.141381 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebbed0ba-1d44-4421-a276-b075b0f31c3f-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"ebbed0ba-1d44-4421-a276-b075b0f31c3f\") " pod="openstack/cinder-api-0" Dec 02 07:45:49 crc kubenswrapper[4895]: I1202 07:45:49.141415 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ebbed0ba-1d44-4421-a276-b075b0f31c3f-etc-machine-id\") pod \"cinder-api-0\" (UID: \"ebbed0ba-1d44-4421-a276-b075b0f31c3f\") " pod="openstack/cinder-api-0" Dec 02 07:45:49 crc kubenswrapper[4895]: I1202 07:45:49.141444 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ebbed0ba-1d44-4421-a276-b075b0f31c3f-config-data-custom\") pod \"cinder-api-0\" (UID: \"ebbed0ba-1d44-4421-a276-b075b0f31c3f\") " pod="openstack/cinder-api-0" Dec 02 07:45:49 crc kubenswrapper[4895]: I1202 07:45:49.141469 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ebbed0ba-1d44-4421-a276-b075b0f31c3f-scripts\") pod \"cinder-api-0\" (UID: \"ebbed0ba-1d44-4421-a276-b075b0f31c3f\") " pod="openstack/cinder-api-0" Dec 02 07:45:49 crc kubenswrapper[4895]: I1202 07:45:49.141500 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qwtlc\" (UniqueName: \"kubernetes.io/projected/ebbed0ba-1d44-4421-a276-b075b0f31c3f-kube-api-access-qwtlc\") pod \"cinder-api-0\" (UID: \"ebbed0ba-1d44-4421-a276-b075b0f31c3f\") " pod="openstack/cinder-api-0" Dec 02 07:45:49 crc kubenswrapper[4895]: I1202 07:45:49.142778 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ebbed0ba-1d44-4421-a276-b075b0f31c3f-logs\") pod \"cinder-api-0\" (UID: \"ebbed0ba-1d44-4421-a276-b075b0f31c3f\") " pod="openstack/cinder-api-0" Dec 02 07:45:49 crc kubenswrapper[4895]: I1202 07:45:49.159483 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ebbed0ba-1d44-4421-a276-b075b0f31c3f-etc-machine-id\") pod \"cinder-api-0\" (UID: \"ebbed0ba-1d44-4421-a276-b075b0f31c3f\") " pod="openstack/cinder-api-0" Dec 02 07:45:49 crc 
kubenswrapper[4895]: I1202 07:45:49.160012 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebbed0ba-1d44-4421-a276-b075b0f31c3f-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"ebbed0ba-1d44-4421-a276-b075b0f31c3f\") " pod="openstack/cinder-api-0" Dec 02 07:45:49 crc kubenswrapper[4895]: I1202 07:45:49.165375 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ebbed0ba-1d44-4421-a276-b075b0f31c3f-scripts\") pod \"cinder-api-0\" (UID: \"ebbed0ba-1d44-4421-a276-b075b0f31c3f\") " pod="openstack/cinder-api-0" Dec 02 07:45:49 crc kubenswrapper[4895]: I1202 07:45:49.165414 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ebbed0ba-1d44-4421-a276-b075b0f31c3f-config-data-custom\") pod \"cinder-api-0\" (UID: \"ebbed0ba-1d44-4421-a276-b075b0f31c3f\") " pod="openstack/cinder-api-0" Dec 02 07:45:49 crc kubenswrapper[4895]: I1202 07:45:49.166579 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebbed0ba-1d44-4421-a276-b075b0f31c3f-config-data\") pod \"cinder-api-0\" (UID: \"ebbed0ba-1d44-4421-a276-b075b0f31c3f\") " pod="openstack/cinder-api-0" Dec 02 07:45:49 crc kubenswrapper[4895]: I1202 07:45:49.168022 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebbed0ba-1d44-4421-a276-b075b0f31c3f-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"ebbed0ba-1d44-4421-a276-b075b0f31c3f\") " pod="openstack/cinder-api-0" Dec 02 07:45:49 crc kubenswrapper[4895]: I1202 07:45:49.179533 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebbed0ba-1d44-4421-a276-b075b0f31c3f-public-tls-certs\") pod \"cinder-api-0\" (UID: 
\"ebbed0ba-1d44-4421-a276-b075b0f31c3f\") " pod="openstack/cinder-api-0" Dec 02 07:45:49 crc kubenswrapper[4895]: I1202 07:45:49.183301 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6a8a0b0-ea5b-436e-a9a3-7c75ecca485a" path="/var/lib/kubelet/pods/e6a8a0b0-ea5b-436e-a9a3-7c75ecca485a/volumes" Dec 02 07:45:49 crc kubenswrapper[4895]: I1202 07:45:49.197682 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwtlc\" (UniqueName: \"kubernetes.io/projected/ebbed0ba-1d44-4421-a276-b075b0f31c3f-kube-api-access-qwtlc\") pod \"cinder-api-0\" (UID: \"ebbed0ba-1d44-4421-a276-b075b0f31c3f\") " pod="openstack/cinder-api-0" Dec 02 07:45:49 crc kubenswrapper[4895]: I1202 07:45:49.231335 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-558857dd7b-r29g8" Dec 02 07:45:49 crc kubenswrapper[4895]: I1202 07:45:49.347899 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 02 07:45:49 crc kubenswrapper[4895]: I1202 07:45:49.851969 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1418eba9-bbbe-44f2-bb65-7081c8e0f25f","Type":"ContainerStarted","Data":"aab79fb37dce1fea7b6d2b53ac3b6a01bb5b1ed5f2e2a6dbbd27161499d2c679"} Dec 02 07:45:49 crc kubenswrapper[4895]: I1202 07:45:49.852600 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-788d454954-brr26" Dec 02 07:45:49 crc kubenswrapper[4895]: I1202 07:45:49.944692 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 02 07:45:49 crc kubenswrapper[4895]: W1202 07:45:49.954302 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podebbed0ba_1d44_4421_a276_b075b0f31c3f.slice/crio-afc2870a980ad288b478e5ac470edfdd3b1fd8aa6c287e4f1ad3e4aed3e85d81 WatchSource:0}: Error finding 
container afc2870a980ad288b478e5ac470edfdd3b1fd8aa6c287e4f1ad3e4aed3e85d81: Status 404 returned error can't find the container with id afc2870a980ad288b478e5ac470edfdd3b1fd8aa6c287e4f1ad3e4aed3e85d81 Dec 02 07:45:50 crc kubenswrapper[4895]: I1202 07:45:50.886169 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ebbed0ba-1d44-4421-a276-b075b0f31c3f","Type":"ContainerStarted","Data":"e169ef89006889ec6af2f91025c33437d207c43337cbf66cac7f9e3e6b3263f5"} Dec 02 07:45:50 crc kubenswrapper[4895]: I1202 07:45:50.886677 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ebbed0ba-1d44-4421-a276-b075b0f31c3f","Type":"ContainerStarted","Data":"afc2870a980ad288b478e5ac470edfdd3b1fd8aa6c287e4f1ad3e4aed3e85d81"} Dec 02 07:45:51 crc kubenswrapper[4895]: I1202 07:45:51.417266 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-89bf75f54-8mw6n" Dec 02 07:45:51 crc kubenswrapper[4895]: I1202 07:45:51.502023 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-89bf75f54-8mw6n" Dec 02 07:45:51 crc kubenswrapper[4895]: I1202 07:45:51.748314 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 02 07:45:51 crc kubenswrapper[4895]: I1202 07:45:51.896035 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ebbed0ba-1d44-4421-a276-b075b0f31c3f","Type":"ContainerStarted","Data":"3a0d36cdfb3f77e74dda0c49d0558e6c7571700d4bfd6cdaa1acbb5f35e6a972"} Dec 02 07:45:51 crc kubenswrapper[4895]: I1202 07:45:51.897487 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 02 07:45:51 crc kubenswrapper[4895]: I1202 07:45:51.906937 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5c9776ccc5-jl55c" Dec 02 07:45:51 crc 
kubenswrapper[4895]: I1202 07:45:51.926446 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1418eba9-bbbe-44f2-bb65-7081c8e0f25f","Type":"ContainerStarted","Data":"861f03ad651fa43aee376705baf3d11ccb4c41a4ac00fed84a1660f83a957992"} Dec 02 07:45:51 crc kubenswrapper[4895]: I1202 07:45:51.926497 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 02 07:45:51 crc kubenswrapper[4895]: I1202 07:45:51.935785 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.935761378 podStartE2EDuration="3.935761378s" podCreationTimestamp="2025-12-02 07:45:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:45:51.922008904 +0000 UTC m=+1363.092868507" watchObservedRunningTime="2025-12-02 07:45:51.935761378 +0000 UTC m=+1363.106620991" Dec 02 07:45:51 crc kubenswrapper[4895]: I1202 07:45:51.997428 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-nk9m9"] Dec 02 07:45:51 crc kubenswrapper[4895]: I1202 07:45:51.997777 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-55f844cf75-nk9m9" podUID="540a6298-e74a-48b3-aa5d-93d03c6871de" containerName="dnsmasq-dns" containerID="cri-o://940d3a91fbb2068793a0307956bc5c1a1b80502bb0c0f171208f9cca2368ccec" gracePeriod=10 Dec 02 07:45:52 crc kubenswrapper[4895]: I1202 07:45:52.021101 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.156555443 podStartE2EDuration="8.021076969s" podCreationTimestamp="2025-12-02 07:45:44 +0000 UTC" firstStartedPulling="2025-12-02 07:45:45.738042989 +0000 UTC m=+1356.908902592" lastFinishedPulling="2025-12-02 07:45:50.602564505 +0000 UTC m=+1361.773424118" observedRunningTime="2025-12-02 
07:45:52.018485169 +0000 UTC m=+1363.189344782" watchObservedRunningTime="2025-12-02 07:45:52.021076969 +0000 UTC m=+1363.191936582" Dec 02 07:45:52 crc kubenswrapper[4895]: I1202 07:45:52.248212 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 02 07:45:52 crc kubenswrapper[4895]: I1202 07:45:52.299218 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 02 07:45:52 crc kubenswrapper[4895]: I1202 07:45:52.520389 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-54957dcd96-7sx87" Dec 02 07:45:52 crc kubenswrapper[4895]: I1202 07:45:52.641509 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-54957dcd96-7sx87" Dec 02 07:45:53 crc kubenswrapper[4895]: I1202 07:45:53.243126 4895 generic.go:334] "Generic (PLEG): container finished" podID="540a6298-e74a-48b3-aa5d-93d03c6871de" containerID="940d3a91fbb2068793a0307956bc5c1a1b80502bb0c0f171208f9cca2368ccec" exitCode=0 Dec 02 07:45:53 crc kubenswrapper[4895]: I1202 07:45:53.243453 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="addf06e4-2353-4d9e-8c8a-8f8643c3b4a5" containerName="cinder-scheduler" containerID="cri-o://3c0b12770b942fb28f5657b757228bd6b23fcb3181571e0bc9090cf73fff6b0a" gracePeriod=30 Dec 02 07:45:53 crc kubenswrapper[4895]: I1202 07:45:53.246648 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="addf06e4-2353-4d9e-8c8a-8f8643c3b4a5" containerName="probe" containerID="cri-o://d1302c749db51673b22f1213ca7ddcebd8a85299ad22b3c296705264cca0d787" gracePeriod=30 Dec 02 07:45:53 crc kubenswrapper[4895]: I1202 07:45:53.261447 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-nk9m9" 
event={"ID":"540a6298-e74a-48b3-aa5d-93d03c6871de","Type":"ContainerDied","Data":"940d3a91fbb2068793a0307956bc5c1a1b80502bb0c0f171208f9cca2368ccec"} Dec 02 07:45:53 crc kubenswrapper[4895]: I1202 07:45:53.732055 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-ddf8948cc-h2bbh" Dec 02 07:45:53 crc kubenswrapper[4895]: I1202 07:45:53.752547 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-nk9m9" Dec 02 07:45:53 crc kubenswrapper[4895]: I1202 07:45:53.823604 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-558857dd7b-r29g8"] Dec 02 07:45:53 crc kubenswrapper[4895]: I1202 07:45:53.826037 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-558857dd7b-r29g8" podUID="bff723dc-9ac5-4a07-bcac-1c43b1007d41" containerName="neutron-api" containerID="cri-o://35059782385c8b9a5382e746768c0035769e05193982c4458b52a48d2a4d9f06" gracePeriod=30 Dec 02 07:45:53 crc kubenswrapper[4895]: I1202 07:45:53.826652 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-558857dd7b-r29g8" podUID="bff723dc-9ac5-4a07-bcac-1c43b1007d41" containerName="neutron-httpd" containerID="cri-o://1251c1dd7c3b2a3560598b9b45d6575da2e8eba37f3d54df29fb249a9b058a75" gracePeriod=30 Dec 02 07:45:53 crc kubenswrapper[4895]: I1202 07:45:53.887833 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/540a6298-e74a-48b3-aa5d-93d03c6871de-config\") pod \"540a6298-e74a-48b3-aa5d-93d03c6871de\" (UID: \"540a6298-e74a-48b3-aa5d-93d03c6871de\") " Dec 02 07:45:53 crc kubenswrapper[4895]: I1202 07:45:53.887981 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/540a6298-e74a-48b3-aa5d-93d03c6871de-dns-svc\") pod 
\"540a6298-e74a-48b3-aa5d-93d03c6871de\" (UID: \"540a6298-e74a-48b3-aa5d-93d03c6871de\") " Dec 02 07:45:53 crc kubenswrapper[4895]: I1202 07:45:53.888146 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f8xxk\" (UniqueName: \"kubernetes.io/projected/540a6298-e74a-48b3-aa5d-93d03c6871de-kube-api-access-f8xxk\") pod \"540a6298-e74a-48b3-aa5d-93d03c6871de\" (UID: \"540a6298-e74a-48b3-aa5d-93d03c6871de\") " Dec 02 07:45:53 crc kubenswrapper[4895]: I1202 07:45:53.888185 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/540a6298-e74a-48b3-aa5d-93d03c6871de-ovsdbserver-sb\") pod \"540a6298-e74a-48b3-aa5d-93d03c6871de\" (UID: \"540a6298-e74a-48b3-aa5d-93d03c6871de\") " Dec 02 07:45:53 crc kubenswrapper[4895]: I1202 07:45:53.888212 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/540a6298-e74a-48b3-aa5d-93d03c6871de-dns-swift-storage-0\") pod \"540a6298-e74a-48b3-aa5d-93d03c6871de\" (UID: \"540a6298-e74a-48b3-aa5d-93d03c6871de\") " Dec 02 07:45:53 crc kubenswrapper[4895]: I1202 07:45:53.888301 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/540a6298-e74a-48b3-aa5d-93d03c6871de-ovsdbserver-nb\") pod \"540a6298-e74a-48b3-aa5d-93d03c6871de\" (UID: \"540a6298-e74a-48b3-aa5d-93d03c6871de\") " Dec 02 07:45:53 crc kubenswrapper[4895]: I1202 07:45:53.903020 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/540a6298-e74a-48b3-aa5d-93d03c6871de-kube-api-access-f8xxk" (OuterVolumeSpecName: "kube-api-access-f8xxk") pod "540a6298-e74a-48b3-aa5d-93d03c6871de" (UID: "540a6298-e74a-48b3-aa5d-93d03c6871de"). InnerVolumeSpecName "kube-api-access-f8xxk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:45:53 crc kubenswrapper[4895]: I1202 07:45:53.992105 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f8xxk\" (UniqueName: \"kubernetes.io/projected/540a6298-e74a-48b3-aa5d-93d03c6871de-kube-api-access-f8xxk\") on node \"crc\" DevicePath \"\"" Dec 02 07:45:54 crc kubenswrapper[4895]: I1202 07:45:54.002209 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/540a6298-e74a-48b3-aa5d-93d03c6871de-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "540a6298-e74a-48b3-aa5d-93d03c6871de" (UID: "540a6298-e74a-48b3-aa5d-93d03c6871de"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:45:54 crc kubenswrapper[4895]: I1202 07:45:54.080990 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/540a6298-e74a-48b3-aa5d-93d03c6871de-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "540a6298-e74a-48b3-aa5d-93d03c6871de" (UID: "540a6298-e74a-48b3-aa5d-93d03c6871de"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:45:54 crc kubenswrapper[4895]: I1202 07:45:54.087465 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/540a6298-e74a-48b3-aa5d-93d03c6871de-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "540a6298-e74a-48b3-aa5d-93d03c6871de" (UID: "540a6298-e74a-48b3-aa5d-93d03c6871de"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:45:54 crc kubenswrapper[4895]: I1202 07:45:54.098518 4895 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/540a6298-e74a-48b3-aa5d-93d03c6871de-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 02 07:45:54 crc kubenswrapper[4895]: I1202 07:45:54.098573 4895 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/540a6298-e74a-48b3-aa5d-93d03c6871de-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 02 07:45:54 crc kubenswrapper[4895]: I1202 07:45:54.098589 4895 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/540a6298-e74a-48b3-aa5d-93d03c6871de-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 02 07:45:54 crc kubenswrapper[4895]: I1202 07:45:54.121175 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/540a6298-e74a-48b3-aa5d-93d03c6871de-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "540a6298-e74a-48b3-aa5d-93d03c6871de" (UID: "540a6298-e74a-48b3-aa5d-93d03c6871de"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:45:54 crc kubenswrapper[4895]: I1202 07:45:54.121837 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/540a6298-e74a-48b3-aa5d-93d03c6871de-config" (OuterVolumeSpecName: "config") pod "540a6298-e74a-48b3-aa5d-93d03c6871de" (UID: "540a6298-e74a-48b3-aa5d-93d03c6871de"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:45:54 crc kubenswrapper[4895]: I1202 07:45:54.203490 4895 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/540a6298-e74a-48b3-aa5d-93d03c6871de-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 02 07:45:54 crc kubenswrapper[4895]: I1202 07:45:54.203539 4895 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/540a6298-e74a-48b3-aa5d-93d03c6871de-config\") on node \"crc\" DevicePath \"\"" Dec 02 07:45:54 crc kubenswrapper[4895]: I1202 07:45:54.268196 4895 generic.go:334] "Generic (PLEG): container finished" podID="bff723dc-9ac5-4a07-bcac-1c43b1007d41" containerID="1251c1dd7c3b2a3560598b9b45d6575da2e8eba37f3d54df29fb249a9b058a75" exitCode=0 Dec 02 07:45:54 crc kubenswrapper[4895]: I1202 07:45:54.268275 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-558857dd7b-r29g8" event={"ID":"bff723dc-9ac5-4a07-bcac-1c43b1007d41","Type":"ContainerDied","Data":"1251c1dd7c3b2a3560598b9b45d6575da2e8eba37f3d54df29fb249a9b058a75"} Dec 02 07:45:54 crc kubenswrapper[4895]: I1202 07:45:54.270155 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-nk9m9" Dec 02 07:45:54 crc kubenswrapper[4895]: I1202 07:45:54.270208 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-nk9m9" event={"ID":"540a6298-e74a-48b3-aa5d-93d03c6871de","Type":"ContainerDied","Data":"f5f472ae41ae3434b5ddbe57cbd0ad299ac197f6ccf9f7eae48fdc85a2ba886e"} Dec 02 07:45:54 crc kubenswrapper[4895]: I1202 07:45:54.270248 4895 scope.go:117] "RemoveContainer" containerID="940d3a91fbb2068793a0307956bc5c1a1b80502bb0c0f171208f9cca2368ccec" Dec 02 07:45:54 crc kubenswrapper[4895]: I1202 07:45:54.309057 4895 scope.go:117] "RemoveContainer" containerID="d7ddf51525ffcff16a99c0e84742f8bee980e8ae5d18775d07c033636237fd92" Dec 02 07:45:54 crc kubenswrapper[4895]: I1202 07:45:54.342833 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-nk9m9"] Dec 02 07:45:54 crc kubenswrapper[4895]: I1202 07:45:54.354225 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-nk9m9"] Dec 02 07:45:55 crc kubenswrapper[4895]: I1202 07:45:55.159791 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="540a6298-e74a-48b3-aa5d-93d03c6871de" path="/var/lib/kubelet/pods/540a6298-e74a-48b3-aa5d-93d03c6871de/volumes" Dec 02 07:45:55 crc kubenswrapper[4895]: I1202 07:45:55.251814 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 02 07:45:55 crc kubenswrapper[4895]: I1202 07:45:55.310585 4895 generic.go:334] "Generic (PLEG): container finished" podID="addf06e4-2353-4d9e-8c8a-8f8643c3b4a5" containerID="d1302c749db51673b22f1213ca7ddcebd8a85299ad22b3c296705264cca0d787" exitCode=0 Dec 02 07:45:55 crc kubenswrapper[4895]: I1202 07:45:55.310637 4895 generic.go:334] "Generic (PLEG): container finished" podID="addf06e4-2353-4d9e-8c8a-8f8643c3b4a5" containerID="3c0b12770b942fb28f5657b757228bd6b23fcb3181571e0bc9090cf73fff6b0a" exitCode=0 Dec 02 07:45:55 crc kubenswrapper[4895]: I1202 07:45:55.310671 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"addf06e4-2353-4d9e-8c8a-8f8643c3b4a5","Type":"ContainerDied","Data":"d1302c749db51673b22f1213ca7ddcebd8a85299ad22b3c296705264cca0d787"} Dec 02 07:45:55 crc kubenswrapper[4895]: I1202 07:45:55.310932 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"addf06e4-2353-4d9e-8c8a-8f8643c3b4a5","Type":"ContainerDied","Data":"3c0b12770b942fb28f5657b757228bd6b23fcb3181571e0bc9090cf73fff6b0a"} Dec 02 07:45:55 crc kubenswrapper[4895]: I1202 07:45:55.310946 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"addf06e4-2353-4d9e-8c8a-8f8643c3b4a5","Type":"ContainerDied","Data":"c88956d6f55f08f070e7d2ece3d0aad0a69f331aabc551f889f5cb4561ee107c"} Dec 02 07:45:55 crc kubenswrapper[4895]: I1202 07:45:55.310966 4895 scope.go:117] "RemoveContainer" containerID="d1302c749db51673b22f1213ca7ddcebd8a85299ad22b3c296705264cca0d787" Dec 02 07:45:55 crc kubenswrapper[4895]: I1202 07:45:55.311167 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 02 07:45:55 crc kubenswrapper[4895]: I1202 07:45:55.329461 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/addf06e4-2353-4d9e-8c8a-8f8643c3b4a5-etc-machine-id\") pod \"addf06e4-2353-4d9e-8c8a-8f8643c3b4a5\" (UID: \"addf06e4-2353-4d9e-8c8a-8f8643c3b4a5\") " Dec 02 07:45:55 crc kubenswrapper[4895]: I1202 07:45:55.329595 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/addf06e4-2353-4d9e-8c8a-8f8643c3b4a5-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "addf06e4-2353-4d9e-8c8a-8f8643c3b4a5" (UID: "addf06e4-2353-4d9e-8c8a-8f8643c3b4a5"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 07:45:55 crc kubenswrapper[4895]: I1202 07:45:55.329772 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/addf06e4-2353-4d9e-8c8a-8f8643c3b4a5-config-data\") pod \"addf06e4-2353-4d9e-8c8a-8f8643c3b4a5\" (UID: \"addf06e4-2353-4d9e-8c8a-8f8643c3b4a5\") " Dec 02 07:45:55 crc kubenswrapper[4895]: I1202 07:45:55.329796 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/addf06e4-2353-4d9e-8c8a-8f8643c3b4a5-config-data-custom\") pod \"addf06e4-2353-4d9e-8c8a-8f8643c3b4a5\" (UID: \"addf06e4-2353-4d9e-8c8a-8f8643c3b4a5\") " Dec 02 07:45:55 crc kubenswrapper[4895]: I1202 07:45:55.329815 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/addf06e4-2353-4d9e-8c8a-8f8643c3b4a5-combined-ca-bundle\") pod \"addf06e4-2353-4d9e-8c8a-8f8643c3b4a5\" (UID: \"addf06e4-2353-4d9e-8c8a-8f8643c3b4a5\") " Dec 02 07:45:55 crc kubenswrapper[4895]: I1202 07:45:55.329865 4895 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/addf06e4-2353-4d9e-8c8a-8f8643c3b4a5-scripts\") pod \"addf06e4-2353-4d9e-8c8a-8f8643c3b4a5\" (UID: \"addf06e4-2353-4d9e-8c8a-8f8643c3b4a5\") " Dec 02 07:45:55 crc kubenswrapper[4895]: I1202 07:45:55.329968 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9hjgc\" (UniqueName: \"kubernetes.io/projected/addf06e4-2353-4d9e-8c8a-8f8643c3b4a5-kube-api-access-9hjgc\") pod \"addf06e4-2353-4d9e-8c8a-8f8643c3b4a5\" (UID: \"addf06e4-2353-4d9e-8c8a-8f8643c3b4a5\") " Dec 02 07:45:55 crc kubenswrapper[4895]: I1202 07:45:55.331777 4895 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/addf06e4-2353-4d9e-8c8a-8f8643c3b4a5-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 02 07:45:55 crc kubenswrapper[4895]: I1202 07:45:55.337966 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/addf06e4-2353-4d9e-8c8a-8f8643c3b4a5-kube-api-access-9hjgc" (OuterVolumeSpecName: "kube-api-access-9hjgc") pod "addf06e4-2353-4d9e-8c8a-8f8643c3b4a5" (UID: "addf06e4-2353-4d9e-8c8a-8f8643c3b4a5"). InnerVolumeSpecName "kube-api-access-9hjgc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:45:55 crc kubenswrapper[4895]: I1202 07:45:55.354293 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/addf06e4-2353-4d9e-8c8a-8f8643c3b4a5-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "addf06e4-2353-4d9e-8c8a-8f8643c3b4a5" (UID: "addf06e4-2353-4d9e-8c8a-8f8643c3b4a5"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:45:55 crc kubenswrapper[4895]: I1202 07:45:55.362525 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/addf06e4-2353-4d9e-8c8a-8f8643c3b4a5-scripts" (OuterVolumeSpecName: "scripts") pod "addf06e4-2353-4d9e-8c8a-8f8643c3b4a5" (UID: "addf06e4-2353-4d9e-8c8a-8f8643c3b4a5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:45:55 crc kubenswrapper[4895]: I1202 07:45:55.409469 4895 scope.go:117] "RemoveContainer" containerID="3c0b12770b942fb28f5657b757228bd6b23fcb3181571e0bc9090cf73fff6b0a" Dec 02 07:45:55 crc kubenswrapper[4895]: I1202 07:45:55.416439 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/addf06e4-2353-4d9e-8c8a-8f8643c3b4a5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "addf06e4-2353-4d9e-8c8a-8f8643c3b4a5" (UID: "addf06e4-2353-4d9e-8c8a-8f8643c3b4a5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:45:55 crc kubenswrapper[4895]: I1202 07:45:55.434906 4895 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/addf06e4-2353-4d9e-8c8a-8f8643c3b4a5-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 02 07:45:55 crc kubenswrapper[4895]: I1202 07:45:55.434965 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/addf06e4-2353-4d9e-8c8a-8f8643c3b4a5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 07:45:55 crc kubenswrapper[4895]: I1202 07:45:55.434975 4895 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/addf06e4-2353-4d9e-8c8a-8f8643c3b4a5-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 07:45:55 crc kubenswrapper[4895]: I1202 07:45:55.434987 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9hjgc\" (UniqueName: \"kubernetes.io/projected/addf06e4-2353-4d9e-8c8a-8f8643c3b4a5-kube-api-access-9hjgc\") on node \"crc\" DevicePath \"\"" Dec 02 07:45:55 crc kubenswrapper[4895]: I1202 07:45:55.449099 4895 scope.go:117] "RemoveContainer" containerID="d1302c749db51673b22f1213ca7ddcebd8a85299ad22b3c296705264cca0d787" Dec 02 07:45:55 crc kubenswrapper[4895]: E1202 07:45:55.449627 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1302c749db51673b22f1213ca7ddcebd8a85299ad22b3c296705264cca0d787\": container with ID starting with d1302c749db51673b22f1213ca7ddcebd8a85299ad22b3c296705264cca0d787 not found: ID does not exist" containerID="d1302c749db51673b22f1213ca7ddcebd8a85299ad22b3c296705264cca0d787" Dec 02 07:45:55 crc kubenswrapper[4895]: I1202 07:45:55.449667 4895 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"d1302c749db51673b22f1213ca7ddcebd8a85299ad22b3c296705264cca0d787"} err="failed to get container status \"d1302c749db51673b22f1213ca7ddcebd8a85299ad22b3c296705264cca0d787\": rpc error: code = NotFound desc = could not find container \"d1302c749db51673b22f1213ca7ddcebd8a85299ad22b3c296705264cca0d787\": container with ID starting with d1302c749db51673b22f1213ca7ddcebd8a85299ad22b3c296705264cca0d787 not found: ID does not exist" Dec 02 07:45:55 crc kubenswrapper[4895]: I1202 07:45:55.449693 4895 scope.go:117] "RemoveContainer" containerID="3c0b12770b942fb28f5657b757228bd6b23fcb3181571e0bc9090cf73fff6b0a" Dec 02 07:45:55 crc kubenswrapper[4895]: E1202 07:45:55.450624 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c0b12770b942fb28f5657b757228bd6b23fcb3181571e0bc9090cf73fff6b0a\": container with ID starting with 3c0b12770b942fb28f5657b757228bd6b23fcb3181571e0bc9090cf73fff6b0a not found: ID does not exist" containerID="3c0b12770b942fb28f5657b757228bd6b23fcb3181571e0bc9090cf73fff6b0a" Dec 02 07:45:55 crc kubenswrapper[4895]: I1202 07:45:55.450653 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c0b12770b942fb28f5657b757228bd6b23fcb3181571e0bc9090cf73fff6b0a"} err="failed to get container status \"3c0b12770b942fb28f5657b757228bd6b23fcb3181571e0bc9090cf73fff6b0a\": rpc error: code = NotFound desc = could not find container \"3c0b12770b942fb28f5657b757228bd6b23fcb3181571e0bc9090cf73fff6b0a\": container with ID starting with 3c0b12770b942fb28f5657b757228bd6b23fcb3181571e0bc9090cf73fff6b0a not found: ID does not exist" Dec 02 07:45:55 crc kubenswrapper[4895]: I1202 07:45:55.450666 4895 scope.go:117] "RemoveContainer" containerID="d1302c749db51673b22f1213ca7ddcebd8a85299ad22b3c296705264cca0d787" Dec 02 07:45:55 crc kubenswrapper[4895]: I1202 07:45:55.450944 4895 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"d1302c749db51673b22f1213ca7ddcebd8a85299ad22b3c296705264cca0d787"} err="failed to get container status \"d1302c749db51673b22f1213ca7ddcebd8a85299ad22b3c296705264cca0d787\": rpc error: code = NotFound desc = could not find container \"d1302c749db51673b22f1213ca7ddcebd8a85299ad22b3c296705264cca0d787\": container with ID starting with d1302c749db51673b22f1213ca7ddcebd8a85299ad22b3c296705264cca0d787 not found: ID does not exist" Dec 02 07:45:55 crc kubenswrapper[4895]: I1202 07:45:55.450972 4895 scope.go:117] "RemoveContainer" containerID="3c0b12770b942fb28f5657b757228bd6b23fcb3181571e0bc9090cf73fff6b0a" Dec 02 07:45:55 crc kubenswrapper[4895]: I1202 07:45:55.451120 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c0b12770b942fb28f5657b757228bd6b23fcb3181571e0bc9090cf73fff6b0a"} err="failed to get container status \"3c0b12770b942fb28f5657b757228bd6b23fcb3181571e0bc9090cf73fff6b0a\": rpc error: code = NotFound desc = could not find container \"3c0b12770b942fb28f5657b757228bd6b23fcb3181571e0bc9090cf73fff6b0a\": container with ID starting with 3c0b12770b942fb28f5657b757228bd6b23fcb3181571e0bc9090cf73fff6b0a not found: ID does not exist" Dec 02 07:45:55 crc kubenswrapper[4895]: I1202 07:45:55.494710 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/addf06e4-2353-4d9e-8c8a-8f8643c3b4a5-config-data" (OuterVolumeSpecName: "config-data") pod "addf06e4-2353-4d9e-8c8a-8f8643c3b4a5" (UID: "addf06e4-2353-4d9e-8c8a-8f8643c3b4a5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:45:55 crc kubenswrapper[4895]: I1202 07:45:55.536640 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/addf06e4-2353-4d9e-8c8a-8f8643c3b4a5-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 07:45:55 crc kubenswrapper[4895]: I1202 07:45:55.673398 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 02 07:45:55 crc kubenswrapper[4895]: I1202 07:45:55.684341 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 02 07:45:55 crc kubenswrapper[4895]: I1202 07:45:55.696409 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 02 07:45:55 crc kubenswrapper[4895]: E1202 07:45:55.696874 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="540a6298-e74a-48b3-aa5d-93d03c6871de" containerName="init" Dec 02 07:45:55 crc kubenswrapper[4895]: I1202 07:45:55.696896 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="540a6298-e74a-48b3-aa5d-93d03c6871de" containerName="init" Dec 02 07:45:55 crc kubenswrapper[4895]: E1202 07:45:55.696914 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="addf06e4-2353-4d9e-8c8a-8f8643c3b4a5" containerName="cinder-scheduler" Dec 02 07:45:55 crc kubenswrapper[4895]: I1202 07:45:55.696925 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="addf06e4-2353-4d9e-8c8a-8f8643c3b4a5" containerName="cinder-scheduler" Dec 02 07:45:55 crc kubenswrapper[4895]: E1202 07:45:55.696948 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="addf06e4-2353-4d9e-8c8a-8f8643c3b4a5" containerName="probe" Dec 02 07:45:55 crc kubenswrapper[4895]: I1202 07:45:55.696957 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="addf06e4-2353-4d9e-8c8a-8f8643c3b4a5" containerName="probe" Dec 02 07:45:55 crc kubenswrapper[4895]: E1202 07:45:55.696976 4895 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="540a6298-e74a-48b3-aa5d-93d03c6871de" containerName="dnsmasq-dns" Dec 02 07:45:55 crc kubenswrapper[4895]: I1202 07:45:55.696981 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="540a6298-e74a-48b3-aa5d-93d03c6871de" containerName="dnsmasq-dns" Dec 02 07:45:55 crc kubenswrapper[4895]: I1202 07:45:55.697162 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="addf06e4-2353-4d9e-8c8a-8f8643c3b4a5" containerName="probe" Dec 02 07:45:55 crc kubenswrapper[4895]: I1202 07:45:55.697185 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="addf06e4-2353-4d9e-8c8a-8f8643c3b4a5" containerName="cinder-scheduler" Dec 02 07:45:55 crc kubenswrapper[4895]: I1202 07:45:55.697200 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="540a6298-e74a-48b3-aa5d-93d03c6871de" containerName="dnsmasq-dns" Dec 02 07:45:55 crc kubenswrapper[4895]: I1202 07:45:55.698451 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 02 07:45:55 crc kubenswrapper[4895]: I1202 07:45:55.703809 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 02 07:45:55 crc kubenswrapper[4895]: I1202 07:45:55.713678 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 02 07:45:55 crc kubenswrapper[4895]: I1202 07:45:55.842360 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xp5fb\" (UniqueName: \"kubernetes.io/projected/836bba81-425e-4610-b191-2bbb2cfc1f79-kube-api-access-xp5fb\") pod \"cinder-scheduler-0\" (UID: \"836bba81-425e-4610-b191-2bbb2cfc1f79\") " pod="openstack/cinder-scheduler-0" Dec 02 07:45:55 crc kubenswrapper[4895]: I1202 07:45:55.842783 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/836bba81-425e-4610-b191-2bbb2cfc1f79-scripts\") pod \"cinder-scheduler-0\" (UID: \"836bba81-425e-4610-b191-2bbb2cfc1f79\") " pod="openstack/cinder-scheduler-0" Dec 02 07:45:55 crc kubenswrapper[4895]: I1202 07:45:55.843046 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/836bba81-425e-4610-b191-2bbb2cfc1f79-config-data\") pod \"cinder-scheduler-0\" (UID: \"836bba81-425e-4610-b191-2bbb2cfc1f79\") " pod="openstack/cinder-scheduler-0" Dec 02 07:45:55 crc kubenswrapper[4895]: I1202 07:45:55.843245 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/836bba81-425e-4610-b191-2bbb2cfc1f79-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"836bba81-425e-4610-b191-2bbb2cfc1f79\") " pod="openstack/cinder-scheduler-0" Dec 02 07:45:55 crc kubenswrapper[4895]: I1202 07:45:55.843646 4895 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/836bba81-425e-4610-b191-2bbb2cfc1f79-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"836bba81-425e-4610-b191-2bbb2cfc1f79\") " pod="openstack/cinder-scheduler-0" Dec 02 07:45:55 crc kubenswrapper[4895]: I1202 07:45:55.843805 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/836bba81-425e-4610-b191-2bbb2cfc1f79-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"836bba81-425e-4610-b191-2bbb2cfc1f79\") " pod="openstack/cinder-scheduler-0" Dec 02 07:45:55 crc kubenswrapper[4895]: I1202 07:45:55.946210 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/836bba81-425e-4610-b191-2bbb2cfc1f79-config-data\") pod \"cinder-scheduler-0\" (UID: \"836bba81-425e-4610-b191-2bbb2cfc1f79\") " pod="openstack/cinder-scheduler-0" Dec 02 07:45:55 crc kubenswrapper[4895]: I1202 07:45:55.946291 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/836bba81-425e-4610-b191-2bbb2cfc1f79-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"836bba81-425e-4610-b191-2bbb2cfc1f79\") " pod="openstack/cinder-scheduler-0" Dec 02 07:45:55 crc kubenswrapper[4895]: I1202 07:45:55.946377 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/836bba81-425e-4610-b191-2bbb2cfc1f79-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"836bba81-425e-4610-b191-2bbb2cfc1f79\") " pod="openstack/cinder-scheduler-0" Dec 02 07:45:55 crc kubenswrapper[4895]: I1202 07:45:55.946408 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/836bba81-425e-4610-b191-2bbb2cfc1f79-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"836bba81-425e-4610-b191-2bbb2cfc1f79\") " pod="openstack/cinder-scheduler-0" Dec 02 07:45:55 crc kubenswrapper[4895]: I1202 07:45:55.946444 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xp5fb\" (UniqueName: \"kubernetes.io/projected/836bba81-425e-4610-b191-2bbb2cfc1f79-kube-api-access-xp5fb\") pod \"cinder-scheduler-0\" (UID: \"836bba81-425e-4610-b191-2bbb2cfc1f79\") " pod="openstack/cinder-scheduler-0" Dec 02 07:45:55 crc kubenswrapper[4895]: I1202 07:45:55.946468 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/836bba81-425e-4610-b191-2bbb2cfc1f79-scripts\") pod \"cinder-scheduler-0\" (UID: \"836bba81-425e-4610-b191-2bbb2cfc1f79\") " pod="openstack/cinder-scheduler-0" Dec 02 07:45:55 crc kubenswrapper[4895]: I1202 07:45:55.956105 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/836bba81-425e-4610-b191-2bbb2cfc1f79-scripts\") pod \"cinder-scheduler-0\" (UID: \"836bba81-425e-4610-b191-2bbb2cfc1f79\") " pod="openstack/cinder-scheduler-0" Dec 02 07:45:55 crc kubenswrapper[4895]: I1202 07:45:55.956182 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/836bba81-425e-4610-b191-2bbb2cfc1f79-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"836bba81-425e-4610-b191-2bbb2cfc1f79\") " pod="openstack/cinder-scheduler-0" Dec 02 07:45:55 crc kubenswrapper[4895]: I1202 07:45:55.966209 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/836bba81-425e-4610-b191-2bbb2cfc1f79-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"836bba81-425e-4610-b191-2bbb2cfc1f79\") " pod="openstack/cinder-scheduler-0" 
Dec 02 07:45:55 crc kubenswrapper[4895]: I1202 07:45:55.968801 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/836bba81-425e-4610-b191-2bbb2cfc1f79-config-data\") pod \"cinder-scheduler-0\" (UID: \"836bba81-425e-4610-b191-2bbb2cfc1f79\") " pod="openstack/cinder-scheduler-0" Dec 02 07:45:55 crc kubenswrapper[4895]: I1202 07:45:55.984537 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/836bba81-425e-4610-b191-2bbb2cfc1f79-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"836bba81-425e-4610-b191-2bbb2cfc1f79\") " pod="openstack/cinder-scheduler-0" Dec 02 07:45:55 crc kubenswrapper[4895]: I1202 07:45:55.991313 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xp5fb\" (UniqueName: \"kubernetes.io/projected/836bba81-425e-4610-b191-2bbb2cfc1f79-kube-api-access-xp5fb\") pod \"cinder-scheduler-0\" (UID: \"836bba81-425e-4610-b191-2bbb2cfc1f79\") " pod="openstack/cinder-scheduler-0" Dec 02 07:45:56 crc kubenswrapper[4895]: I1202 07:45:56.021614 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 02 07:45:56 crc kubenswrapper[4895]: W1202 07:45:56.771832 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod836bba81_425e_4610_b191_2bbb2cfc1f79.slice/crio-c643cb3d9cdd9f0ff6dee778f7df8c92dfc49bf8be8b12f994f8471ca51b5517 WatchSource:0}: Error finding container c643cb3d9cdd9f0ff6dee778f7df8c92dfc49bf8be8b12f994f8471ca51b5517: Status 404 returned error can't find the container with id c643cb3d9cdd9f0ff6dee778f7df8c92dfc49bf8be8b12f994f8471ca51b5517 Dec 02 07:45:56 crc kubenswrapper[4895]: I1202 07:45:56.776380 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 02 07:45:57 crc kubenswrapper[4895]: I1202 07:45:57.157467 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="addf06e4-2353-4d9e-8c8a-8f8643c3b4a5" path="/var/lib/kubelet/pods/addf06e4-2353-4d9e-8c8a-8f8643c3b4a5/volumes" Dec 02 07:45:57 crc kubenswrapper[4895]: I1202 07:45:57.368873 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"836bba81-425e-4610-b191-2bbb2cfc1f79","Type":"ContainerStarted","Data":"c643cb3d9cdd9f0ff6dee778f7df8c92dfc49bf8be8b12f994f8471ca51b5517"} Dec 02 07:45:58 crc kubenswrapper[4895]: I1202 07:45:58.226072 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-56dbdc9bc-kgkw2" Dec 02 07:45:58 crc kubenswrapper[4895]: I1202 07:45:58.389468 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"836bba81-425e-4610-b191-2bbb2cfc1f79","Type":"ContainerStarted","Data":"8a861d47b18ce485f266fa0a57adf3455c385cacb617b37e3b45a4bd17799c71"} Dec 02 07:45:58 crc kubenswrapper[4895]: I1202 07:45:58.389527 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"836bba81-425e-4610-b191-2bbb2cfc1f79","Type":"ContainerStarted","Data":"bd9e831f88d074ed4ebcb3f0c21947564533211ce824af698b616217e7b83e86"} Dec 02 07:45:58 crc kubenswrapper[4895]: I1202 07:45:58.426666 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.426638858 podStartE2EDuration="3.426638858s" podCreationTimestamp="2025-12-02 07:45:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:45:58.422349586 +0000 UTC m=+1369.593209209" watchObservedRunningTime="2025-12-02 07:45:58.426638858 +0000 UTC m=+1369.597498501" Dec 02 07:45:58 crc kubenswrapper[4895]: I1202 07:45:58.711428 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-788d454954-brr26" Dec 02 07:45:58 crc kubenswrapper[4895]: I1202 07:45:58.948613 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-788d454954-brr26" Dec 02 07:45:59 crc kubenswrapper[4895]: I1202 07:45:59.038429 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-89bf75f54-8mw6n"] Dec 02 07:45:59 crc kubenswrapper[4895]: I1202 07:45:59.038727 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-89bf75f54-8mw6n" podUID="1eba7853-b279-4568-b36b-09a1c9edb7b6" containerName="barbican-api-log" containerID="cri-o://e82651dcffa9a06d933859b1cefdfb52d2b7e7e3c9403401e298a1029d5a87ca" gracePeriod=30 Dec 02 07:45:59 crc kubenswrapper[4895]: I1202 07:45:59.038878 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-89bf75f54-8mw6n" podUID="1eba7853-b279-4568-b36b-09a1c9edb7b6" containerName="barbican-api" containerID="cri-o://cf4d65ab2ba96ae4bc80a0b7d054ed0e797b68492137a02f6a2798347f9acbf4" gracePeriod=30 Dec 02 07:45:59 crc 
kubenswrapper[4895]: I1202 07:45:59.401249 4895 generic.go:334] "Generic (PLEG): container finished" podID="1eba7853-b279-4568-b36b-09a1c9edb7b6" containerID="e82651dcffa9a06d933859b1cefdfb52d2b7e7e3c9403401e298a1029d5a87ca" exitCode=143 Dec 02 07:45:59 crc kubenswrapper[4895]: I1202 07:45:59.402395 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-89bf75f54-8mw6n" event={"ID":"1eba7853-b279-4568-b36b-09a1c9edb7b6","Type":"ContainerDied","Data":"e82651dcffa9a06d933859b1cefdfb52d2b7e7e3c9403401e298a1029d5a87ca"} Dec 02 07:46:00 crc kubenswrapper[4895]: I1202 07:46:00.353360 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-558857dd7b-r29g8" Dec 02 07:46:00 crc kubenswrapper[4895]: I1202 07:46:00.427400 4895 generic.go:334] "Generic (PLEG): container finished" podID="bff723dc-9ac5-4a07-bcac-1c43b1007d41" containerID="35059782385c8b9a5382e746768c0035769e05193982c4458b52a48d2a4d9f06" exitCode=0 Dec 02 07:46:00 crc kubenswrapper[4895]: I1202 07:46:00.427461 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-558857dd7b-r29g8" event={"ID":"bff723dc-9ac5-4a07-bcac-1c43b1007d41","Type":"ContainerDied","Data":"35059782385c8b9a5382e746768c0035769e05193982c4458b52a48d2a4d9f06"} Dec 02 07:46:00 crc kubenswrapper[4895]: I1202 07:46:00.427497 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-558857dd7b-r29g8" event={"ID":"bff723dc-9ac5-4a07-bcac-1c43b1007d41","Type":"ContainerDied","Data":"19d1141a1e44741a385d60aba84cf1c0e30d559ce9f211ab11e2bf6235c4b2a8"} Dec 02 07:46:00 crc kubenswrapper[4895]: I1202 07:46:00.427522 4895 scope.go:117] "RemoveContainer" containerID="1251c1dd7c3b2a3560598b9b45d6575da2e8eba37f3d54df29fb249a9b058a75" Dec 02 07:46:00 crc kubenswrapper[4895]: I1202 07:46:00.427800 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-558857dd7b-r29g8" Dec 02 07:46:00 crc kubenswrapper[4895]: I1202 07:46:00.454878 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Dec 02 07:46:00 crc kubenswrapper[4895]: E1202 07:46:00.455535 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bff723dc-9ac5-4a07-bcac-1c43b1007d41" containerName="neutron-api" Dec 02 07:46:00 crc kubenswrapper[4895]: I1202 07:46:00.455554 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="bff723dc-9ac5-4a07-bcac-1c43b1007d41" containerName="neutron-api" Dec 02 07:46:00 crc kubenswrapper[4895]: E1202 07:46:00.455589 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bff723dc-9ac5-4a07-bcac-1c43b1007d41" containerName="neutron-httpd" Dec 02 07:46:00 crc kubenswrapper[4895]: I1202 07:46:00.455599 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="bff723dc-9ac5-4a07-bcac-1c43b1007d41" containerName="neutron-httpd" Dec 02 07:46:00 crc kubenswrapper[4895]: I1202 07:46:00.455892 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="bff723dc-9ac5-4a07-bcac-1c43b1007d41" containerName="neutron-httpd" Dec 02 07:46:00 crc kubenswrapper[4895]: I1202 07:46:00.455923 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="bff723dc-9ac5-4a07-bcac-1c43b1007d41" containerName="neutron-api" Dec 02 07:46:00 crc kubenswrapper[4895]: I1202 07:46:00.456930 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 02 07:46:00 crc kubenswrapper[4895]: I1202 07:46:00.461212 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-cfz96" Dec 02 07:46:00 crc kubenswrapper[4895]: I1202 07:46:00.461421 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Dec 02 07:46:00 crc kubenswrapper[4895]: I1202 07:46:00.466088 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Dec 02 07:46:00 crc kubenswrapper[4895]: I1202 07:46:00.470360 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h8jbh\" (UniqueName: \"kubernetes.io/projected/bff723dc-9ac5-4a07-bcac-1c43b1007d41-kube-api-access-h8jbh\") pod \"bff723dc-9ac5-4a07-bcac-1c43b1007d41\" (UID: \"bff723dc-9ac5-4a07-bcac-1c43b1007d41\") " Dec 02 07:46:00 crc kubenswrapper[4895]: I1202 07:46:00.470441 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bff723dc-9ac5-4a07-bcac-1c43b1007d41-combined-ca-bundle\") pod \"bff723dc-9ac5-4a07-bcac-1c43b1007d41\" (UID: \"bff723dc-9ac5-4a07-bcac-1c43b1007d41\") " Dec 02 07:46:00 crc kubenswrapper[4895]: I1202 07:46:00.470618 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bff723dc-9ac5-4a07-bcac-1c43b1007d41-ovndb-tls-certs\") pod \"bff723dc-9ac5-4a07-bcac-1c43b1007d41\" (UID: \"bff723dc-9ac5-4a07-bcac-1c43b1007d41\") " Dec 02 07:46:00 crc kubenswrapper[4895]: I1202 07:46:00.470757 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/bff723dc-9ac5-4a07-bcac-1c43b1007d41-httpd-config\") pod \"bff723dc-9ac5-4a07-bcac-1c43b1007d41\" (UID: 
\"bff723dc-9ac5-4a07-bcac-1c43b1007d41\") " Dec 02 07:46:00 crc kubenswrapper[4895]: I1202 07:46:00.472478 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/bff723dc-9ac5-4a07-bcac-1c43b1007d41-config\") pod \"bff723dc-9ac5-4a07-bcac-1c43b1007d41\" (UID: \"bff723dc-9ac5-4a07-bcac-1c43b1007d41\") " Dec 02 07:46:00 crc kubenswrapper[4895]: I1202 07:46:00.475085 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 02 07:46:00 crc kubenswrapper[4895]: I1202 07:46:00.511504 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bff723dc-9ac5-4a07-bcac-1c43b1007d41-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "bff723dc-9ac5-4a07-bcac-1c43b1007d41" (UID: "bff723dc-9ac5-4a07-bcac-1c43b1007d41"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:46:00 crc kubenswrapper[4895]: I1202 07:46:00.534345 4895 scope.go:117] "RemoveContainer" containerID="35059782385c8b9a5382e746768c0035769e05193982c4458b52a48d2a4d9f06" Dec 02 07:46:00 crc kubenswrapper[4895]: I1202 07:46:00.572390 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bff723dc-9ac5-4a07-bcac-1c43b1007d41-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bff723dc-9ac5-4a07-bcac-1c43b1007d41" (UID: "bff723dc-9ac5-4a07-bcac-1c43b1007d41"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:46:00 crc kubenswrapper[4895]: I1202 07:46:00.575019 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bff723dc-9ac5-4a07-bcac-1c43b1007d41-kube-api-access-h8jbh" (OuterVolumeSpecName: "kube-api-access-h8jbh") pod "bff723dc-9ac5-4a07-bcac-1c43b1007d41" (UID: "bff723dc-9ac5-4a07-bcac-1c43b1007d41"). InnerVolumeSpecName "kube-api-access-h8jbh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:46:00 crc kubenswrapper[4895]: I1202 07:46:00.577214 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5b52e937-5b7e-4179-9766-20a9c2f93e35-openstack-config-secret\") pod \"openstackclient\" (UID: \"5b52e937-5b7e-4179-9766-20a9c2f93e35\") " pod="openstack/openstackclient" Dec 02 07:46:00 crc kubenswrapper[4895]: I1202 07:46:00.577465 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5b52e937-5b7e-4179-9766-20a9c2f93e35-openstack-config\") pod \"openstackclient\" (UID: \"5b52e937-5b7e-4179-9766-20a9c2f93e35\") " pod="openstack/openstackclient" Dec 02 07:46:00 crc kubenswrapper[4895]: I1202 07:46:00.577535 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b52e937-5b7e-4179-9766-20a9c2f93e35-combined-ca-bundle\") pod \"openstackclient\" (UID: \"5b52e937-5b7e-4179-9766-20a9c2f93e35\") " pod="openstack/openstackclient" Dec 02 07:46:00 crc kubenswrapper[4895]: I1202 07:46:00.577563 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8xw5\" (UniqueName: \"kubernetes.io/projected/5b52e937-5b7e-4179-9766-20a9c2f93e35-kube-api-access-p8xw5\") pod \"openstackclient\" (UID: \"5b52e937-5b7e-4179-9766-20a9c2f93e35\") " pod="openstack/openstackclient" Dec 02 07:46:00 crc kubenswrapper[4895]: I1202 07:46:00.578269 4895 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/bff723dc-9ac5-4a07-bcac-1c43b1007d41-httpd-config\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:00 crc kubenswrapper[4895]: I1202 07:46:00.578299 4895 reconciler_common.go:293] "Volume detached for 
volume \"kube-api-access-h8jbh\" (UniqueName: \"kubernetes.io/projected/bff723dc-9ac5-4a07-bcac-1c43b1007d41-kube-api-access-h8jbh\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:00 crc kubenswrapper[4895]: I1202 07:46:00.578310 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bff723dc-9ac5-4a07-bcac-1c43b1007d41-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:00 crc kubenswrapper[4895]: I1202 07:46:00.603926 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bff723dc-9ac5-4a07-bcac-1c43b1007d41-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "bff723dc-9ac5-4a07-bcac-1c43b1007d41" (UID: "bff723dc-9ac5-4a07-bcac-1c43b1007d41"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:46:00 crc kubenswrapper[4895]: I1202 07:46:00.633211 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bff723dc-9ac5-4a07-bcac-1c43b1007d41-config" (OuterVolumeSpecName: "config") pod "bff723dc-9ac5-4a07-bcac-1c43b1007d41" (UID: "bff723dc-9ac5-4a07-bcac-1c43b1007d41"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:46:00 crc kubenswrapper[4895]: I1202 07:46:00.635454 4895 scope.go:117] "RemoveContainer" containerID="1251c1dd7c3b2a3560598b9b45d6575da2e8eba37f3d54df29fb249a9b058a75" Dec 02 07:46:00 crc kubenswrapper[4895]: E1202 07:46:00.636340 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1251c1dd7c3b2a3560598b9b45d6575da2e8eba37f3d54df29fb249a9b058a75\": container with ID starting with 1251c1dd7c3b2a3560598b9b45d6575da2e8eba37f3d54df29fb249a9b058a75 not found: ID does not exist" containerID="1251c1dd7c3b2a3560598b9b45d6575da2e8eba37f3d54df29fb249a9b058a75" Dec 02 07:46:00 crc kubenswrapper[4895]: I1202 07:46:00.636440 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1251c1dd7c3b2a3560598b9b45d6575da2e8eba37f3d54df29fb249a9b058a75"} err="failed to get container status \"1251c1dd7c3b2a3560598b9b45d6575da2e8eba37f3d54df29fb249a9b058a75\": rpc error: code = NotFound desc = could not find container \"1251c1dd7c3b2a3560598b9b45d6575da2e8eba37f3d54df29fb249a9b058a75\": container with ID starting with 1251c1dd7c3b2a3560598b9b45d6575da2e8eba37f3d54df29fb249a9b058a75 not found: ID does not exist" Dec 02 07:46:00 crc kubenswrapper[4895]: I1202 07:46:00.636479 4895 scope.go:117] "RemoveContainer" containerID="35059782385c8b9a5382e746768c0035769e05193982c4458b52a48d2a4d9f06" Dec 02 07:46:00 crc kubenswrapper[4895]: E1202 07:46:00.637273 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"35059782385c8b9a5382e746768c0035769e05193982c4458b52a48d2a4d9f06\": container with ID starting with 35059782385c8b9a5382e746768c0035769e05193982c4458b52a48d2a4d9f06 not found: ID does not exist" containerID="35059782385c8b9a5382e746768c0035769e05193982c4458b52a48d2a4d9f06" Dec 02 07:46:00 crc kubenswrapper[4895]: I1202 07:46:00.637339 
4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35059782385c8b9a5382e746768c0035769e05193982c4458b52a48d2a4d9f06"} err="failed to get container status \"35059782385c8b9a5382e746768c0035769e05193982c4458b52a48d2a4d9f06\": rpc error: code = NotFound desc = could not find container \"35059782385c8b9a5382e746768c0035769e05193982c4458b52a48d2a4d9f06\": container with ID starting with 35059782385c8b9a5382e746768c0035769e05193982c4458b52a48d2a4d9f06 not found: ID does not exist" Dec 02 07:46:00 crc kubenswrapper[4895]: I1202 07:46:00.680856 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5b52e937-5b7e-4179-9766-20a9c2f93e35-openstack-config\") pod \"openstackclient\" (UID: \"5b52e937-5b7e-4179-9766-20a9c2f93e35\") " pod="openstack/openstackclient" Dec 02 07:46:00 crc kubenswrapper[4895]: I1202 07:46:00.681010 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b52e937-5b7e-4179-9766-20a9c2f93e35-combined-ca-bundle\") pod \"openstackclient\" (UID: \"5b52e937-5b7e-4179-9766-20a9c2f93e35\") " pod="openstack/openstackclient" Dec 02 07:46:00 crc kubenswrapper[4895]: I1202 07:46:00.681035 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p8xw5\" (UniqueName: \"kubernetes.io/projected/5b52e937-5b7e-4179-9766-20a9c2f93e35-kube-api-access-p8xw5\") pod \"openstackclient\" (UID: \"5b52e937-5b7e-4179-9766-20a9c2f93e35\") " pod="openstack/openstackclient" Dec 02 07:46:00 crc kubenswrapper[4895]: I1202 07:46:00.681131 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5b52e937-5b7e-4179-9766-20a9c2f93e35-openstack-config-secret\") pod \"openstackclient\" (UID: \"5b52e937-5b7e-4179-9766-20a9c2f93e35\") " 
pod="openstack/openstackclient" Dec 02 07:46:00 crc kubenswrapper[4895]: I1202 07:46:00.681213 4895 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bff723dc-9ac5-4a07-bcac-1c43b1007d41-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:00 crc kubenswrapper[4895]: I1202 07:46:00.681224 4895 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/bff723dc-9ac5-4a07-bcac-1c43b1007d41-config\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:00 crc kubenswrapper[4895]: I1202 07:46:00.682702 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5b52e937-5b7e-4179-9766-20a9c2f93e35-openstack-config\") pod \"openstackclient\" (UID: \"5b52e937-5b7e-4179-9766-20a9c2f93e35\") " pod="openstack/openstackclient" Dec 02 07:46:00 crc kubenswrapper[4895]: I1202 07:46:00.690479 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b52e937-5b7e-4179-9766-20a9c2f93e35-combined-ca-bundle\") pod \"openstackclient\" (UID: \"5b52e937-5b7e-4179-9766-20a9c2f93e35\") " pod="openstack/openstackclient" Dec 02 07:46:00 crc kubenswrapper[4895]: I1202 07:46:00.691064 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5b52e937-5b7e-4179-9766-20a9c2f93e35-openstack-config-secret\") pod \"openstackclient\" (UID: \"5b52e937-5b7e-4179-9766-20a9c2f93e35\") " pod="openstack/openstackclient" Dec 02 07:46:00 crc kubenswrapper[4895]: I1202 07:46:00.706538 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8xw5\" (UniqueName: \"kubernetes.io/projected/5b52e937-5b7e-4179-9766-20a9c2f93e35-kube-api-access-p8xw5\") pod \"openstackclient\" (UID: \"5b52e937-5b7e-4179-9766-20a9c2f93e35\") " 
pod="openstack/openstackclient" Dec 02 07:46:00 crc kubenswrapper[4895]: I1202 07:46:00.773469 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-558857dd7b-r29g8"] Dec 02 07:46:00 crc kubenswrapper[4895]: I1202 07:46:00.785098 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-558857dd7b-r29g8"] Dec 02 07:46:00 crc kubenswrapper[4895]: I1202 07:46:00.923413 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 02 07:46:01 crc kubenswrapper[4895]: I1202 07:46:01.023131 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 02 07:46:01 crc kubenswrapper[4895]: I1202 07:46:01.171879 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bff723dc-9ac5-4a07-bcac-1c43b1007d41" path="/var/lib/kubelet/pods/bff723dc-9ac5-4a07-bcac-1c43b1007d41/volumes" Dec 02 07:46:01 crc kubenswrapper[4895]: I1202 07:46:01.570483 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 02 07:46:02 crc kubenswrapper[4895]: I1202 07:46:02.282612 4895 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-89bf75f54-8mw6n" podUID="1eba7853-b279-4568-b36b-09a1c9edb7b6" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.156:9311/healthcheck\": read tcp 10.217.0.2:60448->10.217.0.156:9311: read: connection reset by peer" Dec 02 07:46:02 crc kubenswrapper[4895]: I1202 07:46:02.282709 4895 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-89bf75f54-8mw6n" podUID="1eba7853-b279-4568-b36b-09a1c9edb7b6" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.156:9311/healthcheck\": read tcp 10.217.0.2:60442->10.217.0.156:9311: read: connection reset by peer" Dec 02 07:46:02 crc kubenswrapper[4895]: I1202 07:46:02.456653 4895 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/openstackclient" event={"ID":"5b52e937-5b7e-4179-9766-20a9c2f93e35","Type":"ContainerStarted","Data":"a4bdc7796a4884edabb4d9057bf9c70e96a6d7a6c1b52b2c5e342919ee7a7b2f"} Dec 02 07:46:02 crc kubenswrapper[4895]: I1202 07:46:02.469707 4895 generic.go:334] "Generic (PLEG): container finished" podID="1eba7853-b279-4568-b36b-09a1c9edb7b6" containerID="cf4d65ab2ba96ae4bc80a0b7d054ed0e797b68492137a02f6a2798347f9acbf4" exitCode=0 Dec 02 07:46:02 crc kubenswrapper[4895]: I1202 07:46:02.469785 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-89bf75f54-8mw6n" event={"ID":"1eba7853-b279-4568-b36b-09a1c9edb7b6","Type":"ContainerDied","Data":"cf4d65ab2ba96ae4bc80a0b7d054ed0e797b68492137a02f6a2798347f9acbf4"} Dec 02 07:46:02 crc kubenswrapper[4895]: I1202 07:46:02.884132 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-89bf75f54-8mw6n" Dec 02 07:46:03 crc kubenswrapper[4895]: I1202 07:46:03.051830 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1eba7853-b279-4568-b36b-09a1c9edb7b6-config-data-custom\") pod \"1eba7853-b279-4568-b36b-09a1c9edb7b6\" (UID: \"1eba7853-b279-4568-b36b-09a1c9edb7b6\") " Dec 02 07:46:03 crc kubenswrapper[4895]: I1202 07:46:03.051939 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1eba7853-b279-4568-b36b-09a1c9edb7b6-logs\") pod \"1eba7853-b279-4568-b36b-09a1c9edb7b6\" (UID: \"1eba7853-b279-4568-b36b-09a1c9edb7b6\") " Dec 02 07:46:03 crc kubenswrapper[4895]: I1202 07:46:03.052086 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hrwmn\" (UniqueName: \"kubernetes.io/projected/1eba7853-b279-4568-b36b-09a1c9edb7b6-kube-api-access-hrwmn\") pod \"1eba7853-b279-4568-b36b-09a1c9edb7b6\" (UID: 
\"1eba7853-b279-4568-b36b-09a1c9edb7b6\") " Dec 02 07:46:03 crc kubenswrapper[4895]: I1202 07:46:03.052149 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1eba7853-b279-4568-b36b-09a1c9edb7b6-combined-ca-bundle\") pod \"1eba7853-b279-4568-b36b-09a1c9edb7b6\" (UID: \"1eba7853-b279-4568-b36b-09a1c9edb7b6\") " Dec 02 07:46:03 crc kubenswrapper[4895]: I1202 07:46:03.052262 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1eba7853-b279-4568-b36b-09a1c9edb7b6-config-data\") pod \"1eba7853-b279-4568-b36b-09a1c9edb7b6\" (UID: \"1eba7853-b279-4568-b36b-09a1c9edb7b6\") " Dec 02 07:46:03 crc kubenswrapper[4895]: I1202 07:46:03.053101 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1eba7853-b279-4568-b36b-09a1c9edb7b6-logs" (OuterVolumeSpecName: "logs") pod "1eba7853-b279-4568-b36b-09a1c9edb7b6" (UID: "1eba7853-b279-4568-b36b-09a1c9edb7b6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:46:03 crc kubenswrapper[4895]: I1202 07:46:03.053463 4895 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1eba7853-b279-4568-b36b-09a1c9edb7b6-logs\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:03 crc kubenswrapper[4895]: I1202 07:46:03.064089 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1eba7853-b279-4568-b36b-09a1c9edb7b6-kube-api-access-hrwmn" (OuterVolumeSpecName: "kube-api-access-hrwmn") pod "1eba7853-b279-4568-b36b-09a1c9edb7b6" (UID: "1eba7853-b279-4568-b36b-09a1c9edb7b6"). InnerVolumeSpecName "kube-api-access-hrwmn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:46:03 crc kubenswrapper[4895]: I1202 07:46:03.086979 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1eba7853-b279-4568-b36b-09a1c9edb7b6-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "1eba7853-b279-4568-b36b-09a1c9edb7b6" (UID: "1eba7853-b279-4568-b36b-09a1c9edb7b6"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:46:03 crc kubenswrapper[4895]: I1202 07:46:03.092502 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1eba7853-b279-4568-b36b-09a1c9edb7b6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1eba7853-b279-4568-b36b-09a1c9edb7b6" (UID: "1eba7853-b279-4568-b36b-09a1c9edb7b6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:46:03 crc kubenswrapper[4895]: I1202 07:46:03.092519 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Dec 02 07:46:03 crc kubenswrapper[4895]: I1202 07:46:03.156506 4895 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1eba7853-b279-4568-b36b-09a1c9edb7b6-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:03 crc kubenswrapper[4895]: I1202 07:46:03.156540 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hrwmn\" (UniqueName: \"kubernetes.io/projected/1eba7853-b279-4568-b36b-09a1c9edb7b6-kube-api-access-hrwmn\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:03 crc kubenswrapper[4895]: I1202 07:46:03.156569 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1eba7853-b279-4568-b36b-09a1c9edb7b6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:03 crc kubenswrapper[4895]: I1202 
07:46:03.174092 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1eba7853-b279-4568-b36b-09a1c9edb7b6-config-data" (OuterVolumeSpecName: "config-data") pod "1eba7853-b279-4568-b36b-09a1c9edb7b6" (UID: "1eba7853-b279-4568-b36b-09a1c9edb7b6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:46:03 crc kubenswrapper[4895]: I1202 07:46:03.259121 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1eba7853-b279-4568-b36b-09a1c9edb7b6-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:03 crc kubenswrapper[4895]: I1202 07:46:03.485076 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-89bf75f54-8mw6n" event={"ID":"1eba7853-b279-4568-b36b-09a1c9edb7b6","Type":"ContainerDied","Data":"5cdbed925ded924f27c17fb3cc3435bd96fdaf4a3cb9e629d8cdf40911775a0e"} Dec 02 07:46:03 crc kubenswrapper[4895]: I1202 07:46:03.485149 4895 scope.go:117] "RemoveContainer" containerID="cf4d65ab2ba96ae4bc80a0b7d054ed0e797b68492137a02f6a2798347f9acbf4" Dec 02 07:46:03 crc kubenswrapper[4895]: I1202 07:46:03.485189 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-89bf75f54-8mw6n" Dec 02 07:46:03 crc kubenswrapper[4895]: I1202 07:46:03.531774 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-89bf75f54-8mw6n"] Dec 02 07:46:03 crc kubenswrapper[4895]: I1202 07:46:03.536117 4895 scope.go:117] "RemoveContainer" containerID="e82651dcffa9a06d933859b1cefdfb52d2b7e7e3c9403401e298a1029d5a87ca" Dec 02 07:46:03 crc kubenswrapper[4895]: I1202 07:46:03.546908 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-89bf75f54-8mw6n"] Dec 02 07:46:05 crc kubenswrapper[4895]: I1202 07:46:05.068244 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-6f6974886f-mmsbz"] Dec 02 07:46:05 crc kubenswrapper[4895]: E1202 07:46:05.069253 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1eba7853-b279-4568-b36b-09a1c9edb7b6" containerName="barbican-api" Dec 02 07:46:05 crc kubenswrapper[4895]: I1202 07:46:05.069271 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="1eba7853-b279-4568-b36b-09a1c9edb7b6" containerName="barbican-api" Dec 02 07:46:05 crc kubenswrapper[4895]: E1202 07:46:05.069299 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1eba7853-b279-4568-b36b-09a1c9edb7b6" containerName="barbican-api-log" Dec 02 07:46:05 crc kubenswrapper[4895]: I1202 07:46:05.069308 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="1eba7853-b279-4568-b36b-09a1c9edb7b6" containerName="barbican-api-log" Dec 02 07:46:05 crc kubenswrapper[4895]: I1202 07:46:05.069568 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="1eba7853-b279-4568-b36b-09a1c9edb7b6" containerName="barbican-api" Dec 02 07:46:05 crc kubenswrapper[4895]: I1202 07:46:05.069607 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="1eba7853-b279-4568-b36b-09a1c9edb7b6" containerName="barbican-api-log" Dec 02 07:46:05 crc kubenswrapper[4895]: I1202 07:46:05.071360 4895 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-6f6974886f-mmsbz" Dec 02 07:46:05 crc kubenswrapper[4895]: I1202 07:46:05.075983 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Dec 02 07:46:05 crc kubenswrapper[4895]: I1202 07:46:05.076730 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Dec 02 07:46:05 crc kubenswrapper[4895]: I1202 07:46:05.077284 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Dec 02 07:46:05 crc kubenswrapper[4895]: I1202 07:46:05.095309 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-6f6974886f-mmsbz"] Dec 02 07:46:05 crc kubenswrapper[4895]: I1202 07:46:05.169171 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1eba7853-b279-4568-b36b-09a1c9edb7b6" path="/var/lib/kubelet/pods/1eba7853-b279-4568-b36b-09a1c9edb7b6/volumes" Dec 02 07:46:05 crc kubenswrapper[4895]: I1202 07:46:05.200689 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/85e9e481-0762-42a8-a25a-7d50500f1236-internal-tls-certs\") pod \"swift-proxy-6f6974886f-mmsbz\" (UID: \"85e9e481-0762-42a8-a25a-7d50500f1236\") " pod="openstack/swift-proxy-6f6974886f-mmsbz" Dec 02 07:46:05 crc kubenswrapper[4895]: I1202 07:46:05.201298 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xnrz\" (UniqueName: \"kubernetes.io/projected/85e9e481-0762-42a8-a25a-7d50500f1236-kube-api-access-7xnrz\") pod \"swift-proxy-6f6974886f-mmsbz\" (UID: \"85e9e481-0762-42a8-a25a-7d50500f1236\") " pod="openstack/swift-proxy-6f6974886f-mmsbz" Dec 02 07:46:05 crc kubenswrapper[4895]: I1202 07:46:05.201344 4895 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/85e9e481-0762-42a8-a25a-7d50500f1236-etc-swift\") pod \"swift-proxy-6f6974886f-mmsbz\" (UID: \"85e9e481-0762-42a8-a25a-7d50500f1236\") " pod="openstack/swift-proxy-6f6974886f-mmsbz" Dec 02 07:46:05 crc kubenswrapper[4895]: I1202 07:46:05.201397 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85e9e481-0762-42a8-a25a-7d50500f1236-combined-ca-bundle\") pod \"swift-proxy-6f6974886f-mmsbz\" (UID: \"85e9e481-0762-42a8-a25a-7d50500f1236\") " pod="openstack/swift-proxy-6f6974886f-mmsbz" Dec 02 07:46:05 crc kubenswrapper[4895]: I1202 07:46:05.201471 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/85e9e481-0762-42a8-a25a-7d50500f1236-run-httpd\") pod \"swift-proxy-6f6974886f-mmsbz\" (UID: \"85e9e481-0762-42a8-a25a-7d50500f1236\") " pod="openstack/swift-proxy-6f6974886f-mmsbz" Dec 02 07:46:05 crc kubenswrapper[4895]: I1202 07:46:05.201505 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/85e9e481-0762-42a8-a25a-7d50500f1236-public-tls-certs\") pod \"swift-proxy-6f6974886f-mmsbz\" (UID: \"85e9e481-0762-42a8-a25a-7d50500f1236\") " pod="openstack/swift-proxy-6f6974886f-mmsbz" Dec 02 07:46:05 crc kubenswrapper[4895]: I1202 07:46:05.201525 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/85e9e481-0762-42a8-a25a-7d50500f1236-log-httpd\") pod \"swift-proxy-6f6974886f-mmsbz\" (UID: \"85e9e481-0762-42a8-a25a-7d50500f1236\") " pod="openstack/swift-proxy-6f6974886f-mmsbz" Dec 02 07:46:05 crc kubenswrapper[4895]: I1202 07:46:05.201597 4895 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85e9e481-0762-42a8-a25a-7d50500f1236-config-data\") pod \"swift-proxy-6f6974886f-mmsbz\" (UID: \"85e9e481-0762-42a8-a25a-7d50500f1236\") " pod="openstack/swift-proxy-6f6974886f-mmsbz" Dec 02 07:46:05 crc kubenswrapper[4895]: I1202 07:46:05.305205 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/85e9e481-0762-42a8-a25a-7d50500f1236-run-httpd\") pod \"swift-proxy-6f6974886f-mmsbz\" (UID: \"85e9e481-0762-42a8-a25a-7d50500f1236\") " pod="openstack/swift-proxy-6f6974886f-mmsbz" Dec 02 07:46:05 crc kubenswrapper[4895]: I1202 07:46:05.305284 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/85e9e481-0762-42a8-a25a-7d50500f1236-public-tls-certs\") pod \"swift-proxy-6f6974886f-mmsbz\" (UID: \"85e9e481-0762-42a8-a25a-7d50500f1236\") " pod="openstack/swift-proxy-6f6974886f-mmsbz" Dec 02 07:46:05 crc kubenswrapper[4895]: I1202 07:46:05.305317 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/85e9e481-0762-42a8-a25a-7d50500f1236-log-httpd\") pod \"swift-proxy-6f6974886f-mmsbz\" (UID: \"85e9e481-0762-42a8-a25a-7d50500f1236\") " pod="openstack/swift-proxy-6f6974886f-mmsbz" Dec 02 07:46:05 crc kubenswrapper[4895]: I1202 07:46:05.305408 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85e9e481-0762-42a8-a25a-7d50500f1236-config-data\") pod \"swift-proxy-6f6974886f-mmsbz\" (UID: \"85e9e481-0762-42a8-a25a-7d50500f1236\") " pod="openstack/swift-proxy-6f6974886f-mmsbz" Dec 02 07:46:05 crc kubenswrapper[4895]: I1202 07:46:05.305529 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/85e9e481-0762-42a8-a25a-7d50500f1236-internal-tls-certs\") pod \"swift-proxy-6f6974886f-mmsbz\" (UID: \"85e9e481-0762-42a8-a25a-7d50500f1236\") " pod="openstack/swift-proxy-6f6974886f-mmsbz" Dec 02 07:46:05 crc kubenswrapper[4895]: I1202 07:46:05.305578 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7xnrz\" (UniqueName: \"kubernetes.io/projected/85e9e481-0762-42a8-a25a-7d50500f1236-kube-api-access-7xnrz\") pod \"swift-proxy-6f6974886f-mmsbz\" (UID: \"85e9e481-0762-42a8-a25a-7d50500f1236\") " pod="openstack/swift-proxy-6f6974886f-mmsbz" Dec 02 07:46:05 crc kubenswrapper[4895]: I1202 07:46:05.305598 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/85e9e481-0762-42a8-a25a-7d50500f1236-etc-swift\") pod \"swift-proxy-6f6974886f-mmsbz\" (UID: \"85e9e481-0762-42a8-a25a-7d50500f1236\") " pod="openstack/swift-proxy-6f6974886f-mmsbz" Dec 02 07:46:05 crc kubenswrapper[4895]: I1202 07:46:05.305644 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85e9e481-0762-42a8-a25a-7d50500f1236-combined-ca-bundle\") pod \"swift-proxy-6f6974886f-mmsbz\" (UID: \"85e9e481-0762-42a8-a25a-7d50500f1236\") " pod="openstack/swift-proxy-6f6974886f-mmsbz" Dec 02 07:46:05 crc kubenswrapper[4895]: I1202 07:46:05.306135 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/85e9e481-0762-42a8-a25a-7d50500f1236-run-httpd\") pod \"swift-proxy-6f6974886f-mmsbz\" (UID: \"85e9e481-0762-42a8-a25a-7d50500f1236\") " pod="openstack/swift-proxy-6f6974886f-mmsbz" Dec 02 07:46:05 crc kubenswrapper[4895]: I1202 07:46:05.307510 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/85e9e481-0762-42a8-a25a-7d50500f1236-log-httpd\") pod \"swift-proxy-6f6974886f-mmsbz\" (UID: \"85e9e481-0762-42a8-a25a-7d50500f1236\") " pod="openstack/swift-proxy-6f6974886f-mmsbz" Dec 02 07:46:05 crc kubenswrapper[4895]: I1202 07:46:05.314457 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/85e9e481-0762-42a8-a25a-7d50500f1236-internal-tls-certs\") pod \"swift-proxy-6f6974886f-mmsbz\" (UID: \"85e9e481-0762-42a8-a25a-7d50500f1236\") " pod="openstack/swift-proxy-6f6974886f-mmsbz" Dec 02 07:46:05 crc kubenswrapper[4895]: I1202 07:46:05.315230 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/85e9e481-0762-42a8-a25a-7d50500f1236-public-tls-certs\") pod \"swift-proxy-6f6974886f-mmsbz\" (UID: \"85e9e481-0762-42a8-a25a-7d50500f1236\") " pod="openstack/swift-proxy-6f6974886f-mmsbz" Dec 02 07:46:05 crc kubenswrapper[4895]: I1202 07:46:05.315870 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/85e9e481-0762-42a8-a25a-7d50500f1236-etc-swift\") pod \"swift-proxy-6f6974886f-mmsbz\" (UID: \"85e9e481-0762-42a8-a25a-7d50500f1236\") " pod="openstack/swift-proxy-6f6974886f-mmsbz" Dec 02 07:46:05 crc kubenswrapper[4895]: I1202 07:46:05.318329 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85e9e481-0762-42a8-a25a-7d50500f1236-combined-ca-bundle\") pod \"swift-proxy-6f6974886f-mmsbz\" (UID: \"85e9e481-0762-42a8-a25a-7d50500f1236\") " pod="openstack/swift-proxy-6f6974886f-mmsbz" Dec 02 07:46:05 crc kubenswrapper[4895]: I1202 07:46:05.318641 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85e9e481-0762-42a8-a25a-7d50500f1236-config-data\") pod 
\"swift-proxy-6f6974886f-mmsbz\" (UID: \"85e9e481-0762-42a8-a25a-7d50500f1236\") " pod="openstack/swift-proxy-6f6974886f-mmsbz" Dec 02 07:46:05 crc kubenswrapper[4895]: I1202 07:46:05.327701 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xnrz\" (UniqueName: \"kubernetes.io/projected/85e9e481-0762-42a8-a25a-7d50500f1236-kube-api-access-7xnrz\") pod \"swift-proxy-6f6974886f-mmsbz\" (UID: \"85e9e481-0762-42a8-a25a-7d50500f1236\") " pod="openstack/swift-proxy-6f6974886f-mmsbz" Dec 02 07:46:05 crc kubenswrapper[4895]: I1202 07:46:05.391493 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-6f6974886f-mmsbz" Dec 02 07:46:05 crc kubenswrapper[4895]: I1202 07:46:05.473150 4895 patch_prober.go:28] interesting pod/machine-config-daemon-wfcg7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 07:46:05 crc kubenswrapper[4895]: I1202 07:46:05.473239 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 07:46:05 crc kubenswrapper[4895]: I1202 07:46:05.473301 4895 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" Dec 02 07:46:05 crc kubenswrapper[4895]: I1202 07:46:05.474474 4895 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a143326e40e351d8dd85edf0fa1f56c57dc56e760d18e0c6ec782a546a0196af"} 
pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 07:46:05 crc kubenswrapper[4895]: I1202 07:46:05.474551 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerName="machine-config-daemon" containerID="cri-o://a143326e40e351d8dd85edf0fa1f56c57dc56e760d18e0c6ec782a546a0196af" gracePeriod=600 Dec 02 07:46:05 crc kubenswrapper[4895]: I1202 07:46:05.656082 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 02 07:46:05 crc kubenswrapper[4895]: I1202 07:46:05.656775 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1418eba9-bbbe-44f2-bb65-7081c8e0f25f" containerName="ceilometer-central-agent" containerID="cri-o://acd8159f31b6a2b7877e0052baa193e5ccf1701251075ff824f4e2e31674f969" gracePeriod=30 Dec 02 07:46:05 crc kubenswrapper[4895]: I1202 07:46:05.656976 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1418eba9-bbbe-44f2-bb65-7081c8e0f25f" containerName="proxy-httpd" containerID="cri-o://861f03ad651fa43aee376705baf3d11ccb4c41a4ac00fed84a1660f83a957992" gracePeriod=30 Dec 02 07:46:05 crc kubenswrapper[4895]: I1202 07:46:05.657019 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1418eba9-bbbe-44f2-bb65-7081c8e0f25f" containerName="sg-core" containerID="cri-o://aab79fb37dce1fea7b6d2b53ac3b6a01bb5b1ed5f2e2a6dbbd27161499d2c679" gracePeriod=30 Dec 02 07:46:05 crc kubenswrapper[4895]: I1202 07:46:05.657048 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1418eba9-bbbe-44f2-bb65-7081c8e0f25f" containerName="ceilometer-notification-agent" 
containerID="cri-o://91d73afad30f868a96875b2c1fc42a6c1c98b6a198695f9963c6980bee5da657" gracePeriod=30 Dec 02 07:46:05 crc kubenswrapper[4895]: I1202 07:46:05.775850 4895 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="1418eba9-bbbe-44f2-bb65-7081c8e0f25f" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.160:3000/\": read tcp 10.217.0.2:42956->10.217.0.160:3000: read: connection reset by peer" Dec 02 07:46:06 crc kubenswrapper[4895]: I1202 07:46:06.241364 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-6f6974886f-mmsbz"] Dec 02 07:46:06 crc kubenswrapper[4895]: I1202 07:46:06.453141 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 02 07:46:06 crc kubenswrapper[4895]: I1202 07:46:06.548525 4895 generic.go:334] "Generic (PLEG): container finished" podID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerID="a143326e40e351d8dd85edf0fa1f56c57dc56e760d18e0c6ec782a546a0196af" exitCode=0 Dec 02 07:46:06 crc kubenswrapper[4895]: I1202 07:46:06.549799 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" event={"ID":"0468d2d1-a975-45a6-af9f-47adc432fab0","Type":"ContainerDied","Data":"a143326e40e351d8dd85edf0fa1f56c57dc56e760d18e0c6ec782a546a0196af"} Dec 02 07:46:06 crc kubenswrapper[4895]: I1202 07:46:06.549928 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" event={"ID":"0468d2d1-a975-45a6-af9f-47adc432fab0","Type":"ContainerStarted","Data":"9934068f902577cab2b9f5b749fcea3de9f9afb4c928af19ac4dbad51ebdaa03"} Dec 02 07:46:06 crc kubenswrapper[4895]: I1202 07:46:06.549964 4895 scope.go:117] "RemoveContainer" containerID="2f198fe0feb728e97ed5c4b77927f34e37b1755009f4a942cf361750a2e15740" Dec 02 07:46:06 crc kubenswrapper[4895]: I1202 07:46:06.554548 4895 generic.go:334] 
"Generic (PLEG): container finished" podID="1418eba9-bbbe-44f2-bb65-7081c8e0f25f" containerID="861f03ad651fa43aee376705baf3d11ccb4c41a4ac00fed84a1660f83a957992" exitCode=0 Dec 02 07:46:06 crc kubenswrapper[4895]: I1202 07:46:06.554583 4895 generic.go:334] "Generic (PLEG): container finished" podID="1418eba9-bbbe-44f2-bb65-7081c8e0f25f" containerID="aab79fb37dce1fea7b6d2b53ac3b6a01bb5b1ed5f2e2a6dbbd27161499d2c679" exitCode=2 Dec 02 07:46:06 crc kubenswrapper[4895]: I1202 07:46:06.554593 4895 generic.go:334] "Generic (PLEG): container finished" podID="1418eba9-bbbe-44f2-bb65-7081c8e0f25f" containerID="acd8159f31b6a2b7877e0052baa193e5ccf1701251075ff824f4e2e31674f969" exitCode=0 Dec 02 07:46:06 crc kubenswrapper[4895]: I1202 07:46:06.554639 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1418eba9-bbbe-44f2-bb65-7081c8e0f25f","Type":"ContainerDied","Data":"861f03ad651fa43aee376705baf3d11ccb4c41a4ac00fed84a1660f83a957992"} Dec 02 07:46:06 crc kubenswrapper[4895]: I1202 07:46:06.554709 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1418eba9-bbbe-44f2-bb65-7081c8e0f25f","Type":"ContainerDied","Data":"aab79fb37dce1fea7b6d2b53ac3b6a01bb5b1ed5f2e2a6dbbd27161499d2c679"} Dec 02 07:46:06 crc kubenswrapper[4895]: I1202 07:46:06.554724 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1418eba9-bbbe-44f2-bb65-7081c8e0f25f","Type":"ContainerDied","Data":"acd8159f31b6a2b7877e0052baa193e5ccf1701251075ff824f4e2e31674f969"} Dec 02 07:46:06 crc kubenswrapper[4895]: I1202 07:46:06.557032 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6f6974886f-mmsbz" event={"ID":"85e9e481-0762-42a8-a25a-7d50500f1236","Type":"ContainerStarted","Data":"3c171dd2f5f04681f363b255580423e1255efef85f62c66d8427210f76e945e1"} Dec 02 07:46:07 crc kubenswrapper[4895]: I1202 07:46:07.578582 4895 generic.go:334] "Generic (PLEG): container 
finished" podID="1418eba9-bbbe-44f2-bb65-7081c8e0f25f" containerID="91d73afad30f868a96875b2c1fc42a6c1c98b6a198695f9963c6980bee5da657" exitCode=0 Dec 02 07:46:07 crc kubenswrapper[4895]: I1202 07:46:07.578691 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1418eba9-bbbe-44f2-bb65-7081c8e0f25f","Type":"ContainerDied","Data":"91d73afad30f868a96875b2c1fc42a6c1c98b6a198695f9963c6980bee5da657"} Dec 02 07:46:07 crc kubenswrapper[4895]: I1202 07:46:07.583455 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6f6974886f-mmsbz" event={"ID":"85e9e481-0762-42a8-a25a-7d50500f1236","Type":"ContainerStarted","Data":"c0d40bd925f15211d99af8cacd53d2e85f799a87ce053777a156c72dcd0fd1bc"} Dec 02 07:46:07 crc kubenswrapper[4895]: I1202 07:46:07.583532 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6f6974886f-mmsbz" event={"ID":"85e9e481-0762-42a8-a25a-7d50500f1236","Type":"ContainerStarted","Data":"8905a04cd6af553d962d3110181cd121c314f07c69d9726566c2e6fedcbedc7d"} Dec 02 07:46:07 crc kubenswrapper[4895]: I1202 07:46:07.583912 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-6f6974886f-mmsbz" Dec 02 07:46:07 crc kubenswrapper[4895]: I1202 07:46:07.645539 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-6f6974886f-mmsbz" podStartSLOduration=2.645510656 podStartE2EDuration="2.645510656s" podCreationTimestamp="2025-12-02 07:46:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:46:07.605727679 +0000 UTC m=+1378.776587312" watchObservedRunningTime="2025-12-02 07:46:07.645510656 +0000 UTC m=+1378.816370269" Dec 02 07:46:07 crc kubenswrapper[4895]: I1202 07:46:07.865043 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 02 07:46:07 crc kubenswrapper[4895]: I1202 07:46:07.981051 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1418eba9-bbbe-44f2-bb65-7081c8e0f25f-run-httpd\") pod \"1418eba9-bbbe-44f2-bb65-7081c8e0f25f\" (UID: \"1418eba9-bbbe-44f2-bb65-7081c8e0f25f\") " Dec 02 07:46:07 crc kubenswrapper[4895]: I1202 07:46:07.981162 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1418eba9-bbbe-44f2-bb65-7081c8e0f25f-sg-core-conf-yaml\") pod \"1418eba9-bbbe-44f2-bb65-7081c8e0f25f\" (UID: \"1418eba9-bbbe-44f2-bb65-7081c8e0f25f\") " Dec 02 07:46:07 crc kubenswrapper[4895]: I1202 07:46:07.981240 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1418eba9-bbbe-44f2-bb65-7081c8e0f25f-combined-ca-bundle\") pod \"1418eba9-bbbe-44f2-bb65-7081c8e0f25f\" (UID: \"1418eba9-bbbe-44f2-bb65-7081c8e0f25f\") " Dec 02 07:46:07 crc kubenswrapper[4895]: I1202 07:46:07.981382 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1418eba9-bbbe-44f2-bb65-7081c8e0f25f-log-httpd\") pod \"1418eba9-bbbe-44f2-bb65-7081c8e0f25f\" (UID: \"1418eba9-bbbe-44f2-bb65-7081c8e0f25f\") " Dec 02 07:46:07 crc kubenswrapper[4895]: I1202 07:46:07.981473 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1418eba9-bbbe-44f2-bb65-7081c8e0f25f-scripts\") pod \"1418eba9-bbbe-44f2-bb65-7081c8e0f25f\" (UID: \"1418eba9-bbbe-44f2-bb65-7081c8e0f25f\") " Dec 02 07:46:07 crc kubenswrapper[4895]: I1202 07:46:07.981532 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/1418eba9-bbbe-44f2-bb65-7081c8e0f25f-config-data\") pod \"1418eba9-bbbe-44f2-bb65-7081c8e0f25f\" (UID: \"1418eba9-bbbe-44f2-bb65-7081c8e0f25f\") " Dec 02 07:46:07 crc kubenswrapper[4895]: I1202 07:46:07.981623 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dx4lb\" (UniqueName: \"kubernetes.io/projected/1418eba9-bbbe-44f2-bb65-7081c8e0f25f-kube-api-access-dx4lb\") pod \"1418eba9-bbbe-44f2-bb65-7081c8e0f25f\" (UID: \"1418eba9-bbbe-44f2-bb65-7081c8e0f25f\") " Dec 02 07:46:07 crc kubenswrapper[4895]: I1202 07:46:07.981692 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1418eba9-bbbe-44f2-bb65-7081c8e0f25f-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "1418eba9-bbbe-44f2-bb65-7081c8e0f25f" (UID: "1418eba9-bbbe-44f2-bb65-7081c8e0f25f"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:46:07 crc kubenswrapper[4895]: I1202 07:46:07.981923 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1418eba9-bbbe-44f2-bb65-7081c8e0f25f-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "1418eba9-bbbe-44f2-bb65-7081c8e0f25f" (UID: "1418eba9-bbbe-44f2-bb65-7081c8e0f25f"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:46:07 crc kubenswrapper[4895]: I1202 07:46:07.982695 4895 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1418eba9-bbbe-44f2-bb65-7081c8e0f25f-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:07 crc kubenswrapper[4895]: I1202 07:46:07.982721 4895 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1418eba9-bbbe-44f2-bb65-7081c8e0f25f-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:07 crc kubenswrapper[4895]: I1202 07:46:07.989735 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1418eba9-bbbe-44f2-bb65-7081c8e0f25f-kube-api-access-dx4lb" (OuterVolumeSpecName: "kube-api-access-dx4lb") pod "1418eba9-bbbe-44f2-bb65-7081c8e0f25f" (UID: "1418eba9-bbbe-44f2-bb65-7081c8e0f25f"). InnerVolumeSpecName "kube-api-access-dx4lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:46:07 crc kubenswrapper[4895]: I1202 07:46:07.991612 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1418eba9-bbbe-44f2-bb65-7081c8e0f25f-scripts" (OuterVolumeSpecName: "scripts") pod "1418eba9-bbbe-44f2-bb65-7081c8e0f25f" (UID: "1418eba9-bbbe-44f2-bb65-7081c8e0f25f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:46:08 crc kubenswrapper[4895]: I1202 07:46:08.033233 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1418eba9-bbbe-44f2-bb65-7081c8e0f25f-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "1418eba9-bbbe-44f2-bb65-7081c8e0f25f" (UID: "1418eba9-bbbe-44f2-bb65-7081c8e0f25f"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:46:08 crc kubenswrapper[4895]: I1202 07:46:08.084811 4895 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1418eba9-bbbe-44f2-bb65-7081c8e0f25f-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:08 crc kubenswrapper[4895]: I1202 07:46:08.084863 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dx4lb\" (UniqueName: \"kubernetes.io/projected/1418eba9-bbbe-44f2-bb65-7081c8e0f25f-kube-api-access-dx4lb\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:08 crc kubenswrapper[4895]: I1202 07:46:08.084878 4895 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1418eba9-bbbe-44f2-bb65-7081c8e0f25f-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:08 crc kubenswrapper[4895]: I1202 07:46:08.090432 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1418eba9-bbbe-44f2-bb65-7081c8e0f25f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1418eba9-bbbe-44f2-bb65-7081c8e0f25f" (UID: "1418eba9-bbbe-44f2-bb65-7081c8e0f25f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:46:08 crc kubenswrapper[4895]: I1202 07:46:08.131624 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1418eba9-bbbe-44f2-bb65-7081c8e0f25f-config-data" (OuterVolumeSpecName: "config-data") pod "1418eba9-bbbe-44f2-bb65-7081c8e0f25f" (UID: "1418eba9-bbbe-44f2-bb65-7081c8e0f25f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:46:08 crc kubenswrapper[4895]: I1202 07:46:08.187670 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1418eba9-bbbe-44f2-bb65-7081c8e0f25f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:08 crc kubenswrapper[4895]: I1202 07:46:08.187721 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1418eba9-bbbe-44f2-bb65-7081c8e0f25f-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:08 crc kubenswrapper[4895]: I1202 07:46:08.607795 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1418eba9-bbbe-44f2-bb65-7081c8e0f25f","Type":"ContainerDied","Data":"252063a448124a283c7f67c6f4d9291c5a9b4dde9568f6541e5a20aee8c80c47"} Dec 02 07:46:08 crc kubenswrapper[4895]: I1202 07:46:08.608003 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 02 07:46:08 crc kubenswrapper[4895]: I1202 07:46:08.608492 4895 scope.go:117] "RemoveContainer" containerID="861f03ad651fa43aee376705baf3d11ccb4c41a4ac00fed84a1660f83a957992" Dec 02 07:46:08 crc kubenswrapper[4895]: I1202 07:46:08.608475 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-6f6974886f-mmsbz" Dec 02 07:46:08 crc kubenswrapper[4895]: I1202 07:46:08.662047 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 02 07:46:08 crc kubenswrapper[4895]: I1202 07:46:08.678831 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 02 07:46:08 crc kubenswrapper[4895]: I1202 07:46:08.694280 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 02 07:46:08 crc kubenswrapper[4895]: E1202 07:46:08.695072 4895 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="1418eba9-bbbe-44f2-bb65-7081c8e0f25f" containerName="ceilometer-central-agent" Dec 02 07:46:08 crc kubenswrapper[4895]: I1202 07:46:08.695172 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="1418eba9-bbbe-44f2-bb65-7081c8e0f25f" containerName="ceilometer-central-agent" Dec 02 07:46:08 crc kubenswrapper[4895]: E1202 07:46:08.695231 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1418eba9-bbbe-44f2-bb65-7081c8e0f25f" containerName="ceilometer-notification-agent" Dec 02 07:46:08 crc kubenswrapper[4895]: I1202 07:46:08.695284 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="1418eba9-bbbe-44f2-bb65-7081c8e0f25f" containerName="ceilometer-notification-agent" Dec 02 07:46:08 crc kubenswrapper[4895]: E1202 07:46:08.695345 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1418eba9-bbbe-44f2-bb65-7081c8e0f25f" containerName="proxy-httpd" Dec 02 07:46:08 crc kubenswrapper[4895]: I1202 07:46:08.695393 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="1418eba9-bbbe-44f2-bb65-7081c8e0f25f" containerName="proxy-httpd" Dec 02 07:46:08 crc kubenswrapper[4895]: E1202 07:46:08.695466 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1418eba9-bbbe-44f2-bb65-7081c8e0f25f" containerName="sg-core" Dec 02 07:46:08 crc kubenswrapper[4895]: I1202 07:46:08.695534 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="1418eba9-bbbe-44f2-bb65-7081c8e0f25f" containerName="sg-core" Dec 02 07:46:08 crc kubenswrapper[4895]: I1202 07:46:08.698275 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="1418eba9-bbbe-44f2-bb65-7081c8e0f25f" containerName="proxy-httpd" Dec 02 07:46:08 crc kubenswrapper[4895]: I1202 07:46:08.698355 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="1418eba9-bbbe-44f2-bb65-7081c8e0f25f" containerName="sg-core" Dec 02 07:46:08 crc kubenswrapper[4895]: I1202 07:46:08.698389 4895 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="1418eba9-bbbe-44f2-bb65-7081c8e0f25f" containerName="ceilometer-central-agent" Dec 02 07:46:08 crc kubenswrapper[4895]: I1202 07:46:08.698406 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="1418eba9-bbbe-44f2-bb65-7081c8e0f25f" containerName="ceilometer-notification-agent" Dec 02 07:46:08 crc kubenswrapper[4895]: I1202 07:46:08.701146 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 02 07:46:08 crc kubenswrapper[4895]: I1202 07:46:08.708223 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 02 07:46:08 crc kubenswrapper[4895]: I1202 07:46:08.708388 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 02 07:46:08 crc kubenswrapper[4895]: I1202 07:46:08.720486 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 02 07:46:08 crc kubenswrapper[4895]: I1202 07:46:08.800000 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/61d371ba-ba51-4c1e-98ad-bd93f4bf2751-run-httpd\") pod \"ceilometer-0\" (UID: \"61d371ba-ba51-4c1e-98ad-bd93f4bf2751\") " pod="openstack/ceilometer-0" Dec 02 07:46:08 crc kubenswrapper[4895]: I1202 07:46:08.800061 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rr2wr\" (UniqueName: \"kubernetes.io/projected/61d371ba-ba51-4c1e-98ad-bd93f4bf2751-kube-api-access-rr2wr\") pod \"ceilometer-0\" (UID: \"61d371ba-ba51-4c1e-98ad-bd93f4bf2751\") " pod="openstack/ceilometer-0" Dec 02 07:46:08 crc kubenswrapper[4895]: I1202 07:46:08.800242 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61d371ba-ba51-4c1e-98ad-bd93f4bf2751-config-data\") pod 
\"ceilometer-0\" (UID: \"61d371ba-ba51-4c1e-98ad-bd93f4bf2751\") " pod="openstack/ceilometer-0" Dec 02 07:46:08 crc kubenswrapper[4895]: I1202 07:46:08.800280 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/61d371ba-ba51-4c1e-98ad-bd93f4bf2751-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"61d371ba-ba51-4c1e-98ad-bd93f4bf2751\") " pod="openstack/ceilometer-0" Dec 02 07:46:08 crc kubenswrapper[4895]: I1202 07:46:08.800384 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61d371ba-ba51-4c1e-98ad-bd93f4bf2751-scripts\") pod \"ceilometer-0\" (UID: \"61d371ba-ba51-4c1e-98ad-bd93f4bf2751\") " pod="openstack/ceilometer-0" Dec 02 07:46:08 crc kubenswrapper[4895]: I1202 07:46:08.800564 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/61d371ba-ba51-4c1e-98ad-bd93f4bf2751-log-httpd\") pod \"ceilometer-0\" (UID: \"61d371ba-ba51-4c1e-98ad-bd93f4bf2751\") " pod="openstack/ceilometer-0" Dec 02 07:46:08 crc kubenswrapper[4895]: I1202 07:46:08.800799 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61d371ba-ba51-4c1e-98ad-bd93f4bf2751-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"61d371ba-ba51-4c1e-98ad-bd93f4bf2751\") " pod="openstack/ceilometer-0" Dec 02 07:46:08 crc kubenswrapper[4895]: I1202 07:46:08.902712 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/61d371ba-ba51-4c1e-98ad-bd93f4bf2751-log-httpd\") pod \"ceilometer-0\" (UID: \"61d371ba-ba51-4c1e-98ad-bd93f4bf2751\") " pod="openstack/ceilometer-0" Dec 02 07:46:08 crc kubenswrapper[4895]: I1202 07:46:08.902825 4895 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61d371ba-ba51-4c1e-98ad-bd93f4bf2751-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"61d371ba-ba51-4c1e-98ad-bd93f4bf2751\") " pod="openstack/ceilometer-0" Dec 02 07:46:08 crc kubenswrapper[4895]: I1202 07:46:08.902900 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/61d371ba-ba51-4c1e-98ad-bd93f4bf2751-run-httpd\") pod \"ceilometer-0\" (UID: \"61d371ba-ba51-4c1e-98ad-bd93f4bf2751\") " pod="openstack/ceilometer-0" Dec 02 07:46:08 crc kubenswrapper[4895]: I1202 07:46:08.902923 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rr2wr\" (UniqueName: \"kubernetes.io/projected/61d371ba-ba51-4c1e-98ad-bd93f4bf2751-kube-api-access-rr2wr\") pod \"ceilometer-0\" (UID: \"61d371ba-ba51-4c1e-98ad-bd93f4bf2751\") " pod="openstack/ceilometer-0" Dec 02 07:46:08 crc kubenswrapper[4895]: I1202 07:46:08.902962 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61d371ba-ba51-4c1e-98ad-bd93f4bf2751-config-data\") pod \"ceilometer-0\" (UID: \"61d371ba-ba51-4c1e-98ad-bd93f4bf2751\") " pod="openstack/ceilometer-0" Dec 02 07:46:08 crc kubenswrapper[4895]: I1202 07:46:08.902983 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/61d371ba-ba51-4c1e-98ad-bd93f4bf2751-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"61d371ba-ba51-4c1e-98ad-bd93f4bf2751\") " pod="openstack/ceilometer-0" Dec 02 07:46:08 crc kubenswrapper[4895]: I1202 07:46:08.903008 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61d371ba-ba51-4c1e-98ad-bd93f4bf2751-scripts\") pod \"ceilometer-0\" (UID: 
\"61d371ba-ba51-4c1e-98ad-bd93f4bf2751\") " pod="openstack/ceilometer-0" Dec 02 07:46:08 crc kubenswrapper[4895]: I1202 07:46:08.904359 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/61d371ba-ba51-4c1e-98ad-bd93f4bf2751-log-httpd\") pod \"ceilometer-0\" (UID: \"61d371ba-ba51-4c1e-98ad-bd93f4bf2751\") " pod="openstack/ceilometer-0" Dec 02 07:46:08 crc kubenswrapper[4895]: I1202 07:46:08.904397 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/61d371ba-ba51-4c1e-98ad-bd93f4bf2751-run-httpd\") pod \"ceilometer-0\" (UID: \"61d371ba-ba51-4c1e-98ad-bd93f4bf2751\") " pod="openstack/ceilometer-0" Dec 02 07:46:08 crc kubenswrapper[4895]: I1202 07:46:08.910310 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61d371ba-ba51-4c1e-98ad-bd93f4bf2751-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"61d371ba-ba51-4c1e-98ad-bd93f4bf2751\") " pod="openstack/ceilometer-0" Dec 02 07:46:08 crc kubenswrapper[4895]: I1202 07:46:08.911032 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61d371ba-ba51-4c1e-98ad-bd93f4bf2751-config-data\") pod \"ceilometer-0\" (UID: \"61d371ba-ba51-4c1e-98ad-bd93f4bf2751\") " pod="openstack/ceilometer-0" Dec 02 07:46:08 crc kubenswrapper[4895]: I1202 07:46:08.912498 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/61d371ba-ba51-4c1e-98ad-bd93f4bf2751-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"61d371ba-ba51-4c1e-98ad-bd93f4bf2751\") " pod="openstack/ceilometer-0" Dec 02 07:46:08 crc kubenswrapper[4895]: I1202 07:46:08.918500 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/61d371ba-ba51-4c1e-98ad-bd93f4bf2751-scripts\") pod \"ceilometer-0\" (UID: \"61d371ba-ba51-4c1e-98ad-bd93f4bf2751\") " pod="openstack/ceilometer-0" Dec 02 07:46:08 crc kubenswrapper[4895]: I1202 07:46:08.922497 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rr2wr\" (UniqueName: \"kubernetes.io/projected/61d371ba-ba51-4c1e-98ad-bd93f4bf2751-kube-api-access-rr2wr\") pod \"ceilometer-0\" (UID: \"61d371ba-ba51-4c1e-98ad-bd93f4bf2751\") " pod="openstack/ceilometer-0" Dec 02 07:46:09 crc kubenswrapper[4895]: I1202 07:46:09.026403 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 02 07:46:09 crc kubenswrapper[4895]: I1202 07:46:09.178224 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1418eba9-bbbe-44f2-bb65-7081c8e0f25f" path="/var/lib/kubelet/pods/1418eba9-bbbe-44f2-bb65-7081c8e0f25f/volumes" Dec 02 07:46:11 crc kubenswrapper[4895]: I1202 07:46:11.954467 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 02 07:46:14 crc kubenswrapper[4895]: I1202 07:46:14.317766 4895 scope.go:117] "RemoveContainer" containerID="aab79fb37dce1fea7b6d2b53ac3b6a01bb5b1ed5f2e2a6dbbd27161499d2c679" Dec 02 07:46:14 crc kubenswrapper[4895]: I1202 07:46:14.417948 4895 scope.go:117] "RemoveContainer" containerID="91d73afad30f868a96875b2c1fc42a6c1c98b6a198695f9963c6980bee5da657" Dec 02 07:46:14 crc kubenswrapper[4895]: I1202 07:46:14.449130 4895 scope.go:117] "RemoveContainer" containerID="acd8159f31b6a2b7877e0052baa193e5ccf1701251075ff824f4e2e31674f969" Dec 02 07:46:14 crc kubenswrapper[4895]: I1202 07:46:14.961022 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 02 07:46:14 crc kubenswrapper[4895]: W1202 07:46:14.962632 4895 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod61d371ba_ba51_4c1e_98ad_bd93f4bf2751.slice/crio-08370f967c1be9a08ddadb22c88de110b585bde34b10c6a13c806fc2d922aa9e WatchSource:0}: Error finding container 08370f967c1be9a08ddadb22c88de110b585bde34b10c6a13c806fc2d922aa9e: Status 404 returned error can't find the container with id 08370f967c1be9a08ddadb22c88de110b585bde34b10c6a13c806fc2d922aa9e Dec 02 07:46:15 crc kubenswrapper[4895]: I1202 07:46:15.426565 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-6f6974886f-mmsbz" Dec 02 07:46:15 crc kubenswrapper[4895]: I1202 07:46:15.430638 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-6f6974886f-mmsbz" Dec 02 07:46:15 crc kubenswrapper[4895]: I1202 07:46:15.703380 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"61d371ba-ba51-4c1e-98ad-bd93f4bf2751","Type":"ContainerStarted","Data":"08370f967c1be9a08ddadb22c88de110b585bde34b10c6a13c806fc2d922aa9e"} Dec 02 07:46:15 crc kubenswrapper[4895]: I1202 07:46:15.707790 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"5b52e937-5b7e-4179-9766-20a9c2f93e35","Type":"ContainerStarted","Data":"12ca3ab2f0b64acec9c85ff2dcb3769838a447a3fd301eb6eff49f9f575c5ccb"} Dec 02 07:46:15 crc kubenswrapper[4895]: I1202 07:46:15.941774 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=3.023190639 podStartE2EDuration="15.94172678s" podCreationTimestamp="2025-12-02 07:46:00 +0000 UTC" firstStartedPulling="2025-12-02 07:46:01.535949605 +0000 UTC m=+1372.706809218" lastFinishedPulling="2025-12-02 07:46:14.454485746 +0000 UTC m=+1385.625345359" observedRunningTime="2025-12-02 07:46:15.739923537 +0000 UTC m=+1386.910783150" watchObservedRunningTime="2025-12-02 07:46:15.94172678 +0000 UTC m=+1387.112586393" Dec 02 
07:46:15 crc kubenswrapper[4895]: I1202 07:46:15.952713 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 02 07:46:15 crc kubenswrapper[4895]: I1202 07:46:15.953110 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="e3768c25-d6e0-4d93-a8c9-6b869977f267" containerName="glance-log" containerID="cri-o://a571b1549c1970d506653fb10b7f5f596f0318794b3aff01ecdce75169a33e67" gracePeriod=30 Dec 02 07:46:15 crc kubenswrapper[4895]: I1202 07:46:15.953332 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="e3768c25-d6e0-4d93-a8c9-6b869977f267" containerName="glance-httpd" containerID="cri-o://39531e98c51dd7763778d471934687a95420f617360bba0a8c74e13306b7bc1e" gracePeriod=30 Dec 02 07:46:16 crc kubenswrapper[4895]: I1202 07:46:16.723909 4895 generic.go:334] "Generic (PLEG): container finished" podID="e3768c25-d6e0-4d93-a8c9-6b869977f267" containerID="a571b1549c1970d506653fb10b7f5f596f0318794b3aff01ecdce75169a33e67" exitCode=143 Dec 02 07:46:16 crc kubenswrapper[4895]: I1202 07:46:16.724124 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e3768c25-d6e0-4d93-a8c9-6b869977f267","Type":"ContainerDied","Data":"a571b1549c1970d506653fb10b7f5f596f0318794b3aff01ecdce75169a33e67"} Dec 02 07:46:16 crc kubenswrapper[4895]: I1202 07:46:16.728577 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"61d371ba-ba51-4c1e-98ad-bd93f4bf2751","Type":"ContainerStarted","Data":"646b39ae7dcf06b620966c948a0ce0e376001257cad1f338a1f9a384b8f22905"} Dec 02 07:46:16 crc kubenswrapper[4895]: I1202 07:46:16.728629 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"61d371ba-ba51-4c1e-98ad-bd93f4bf2751","Type":"ContainerStarted","Data":"3827e3a4145a9aa54a19de6c11fffeeee5f0e5031057668f949e29e9e69a0850"} Dec 02 07:46:17 crc kubenswrapper[4895]: I1202 07:46:17.138415 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 02 07:46:17 crc kubenswrapper[4895]: I1202 07:46:17.138765 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="21dea3da-8ebc-4b04-86fa-19a539bd6cc9" containerName="glance-log" containerID="cri-o://e60d35c8b56f6319e4e9e6ca44287e73a7f3f042c2e5330d10f68d04d703572e" gracePeriod=30 Dec 02 07:46:17 crc kubenswrapper[4895]: I1202 07:46:17.138935 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="21dea3da-8ebc-4b04-86fa-19a539bd6cc9" containerName="glance-httpd" containerID="cri-o://036f9ade09b808d7661bab5c17d24da5c8e38d6318f235d5349c9d48e757bb70" gracePeriod=30 Dec 02 07:46:17 crc kubenswrapper[4895]: I1202 07:46:17.744563 4895 generic.go:334] "Generic (PLEG): container finished" podID="21dea3da-8ebc-4b04-86fa-19a539bd6cc9" containerID="e60d35c8b56f6319e4e9e6ca44287e73a7f3f042c2e5330d10f68d04d703572e" exitCode=143 Dec 02 07:46:17 crc kubenswrapper[4895]: I1202 07:46:17.744722 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"21dea3da-8ebc-4b04-86fa-19a539bd6cc9","Type":"ContainerDied","Data":"e60d35c8b56f6319e4e9e6ca44287e73a7f3f042c2e5330d10f68d04d703572e"} Dec 02 07:46:17 crc kubenswrapper[4895]: I1202 07:46:17.749199 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"61d371ba-ba51-4c1e-98ad-bd93f4bf2751","Type":"ContainerStarted","Data":"c020cb920b6af9892b763998844fc021015b1dc863011f2520f11781393d88fa"} Dec 02 07:46:19 crc kubenswrapper[4895]: I1202 07:46:19.623014 4895 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 02 07:46:19 crc kubenswrapper[4895]: I1202 07:46:19.666193 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e3768c25-d6e0-4d93-a8c9-6b869977f267-httpd-run\") pod \"e3768c25-d6e0-4d93-a8c9-6b869977f267\" (UID: \"e3768c25-d6e0-4d93-a8c9-6b869977f267\") " Dec 02 07:46:19 crc kubenswrapper[4895]: I1202 07:46:19.666321 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3768c25-d6e0-4d93-a8c9-6b869977f267-logs\") pod \"e3768c25-d6e0-4d93-a8c9-6b869977f267\" (UID: \"e3768c25-d6e0-4d93-a8c9-6b869977f267\") " Dec 02 07:46:19 crc kubenswrapper[4895]: I1202 07:46:19.666425 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fvks4\" (UniqueName: \"kubernetes.io/projected/e3768c25-d6e0-4d93-a8c9-6b869977f267-kube-api-access-fvks4\") pod \"e3768c25-d6e0-4d93-a8c9-6b869977f267\" (UID: \"e3768c25-d6e0-4d93-a8c9-6b869977f267\") " Dec 02 07:46:19 crc kubenswrapper[4895]: I1202 07:46:19.666458 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3768c25-d6e0-4d93-a8c9-6b869977f267-public-tls-certs\") pod \"e3768c25-d6e0-4d93-a8c9-6b869977f267\" (UID: \"e3768c25-d6e0-4d93-a8c9-6b869977f267\") " Dec 02 07:46:19 crc kubenswrapper[4895]: I1202 07:46:19.666582 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3768c25-d6e0-4d93-a8c9-6b869977f267-scripts\") pod \"e3768c25-d6e0-4d93-a8c9-6b869977f267\" (UID: \"e3768c25-d6e0-4d93-a8c9-6b869977f267\") " Dec 02 07:46:19 crc kubenswrapper[4895]: I1202 07:46:19.666608 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/e3768c25-d6e0-4d93-a8c9-6b869977f267-config-data\") pod \"e3768c25-d6e0-4d93-a8c9-6b869977f267\" (UID: \"e3768c25-d6e0-4d93-a8c9-6b869977f267\") " Dec 02 07:46:19 crc kubenswrapper[4895]: I1202 07:46:19.666663 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"e3768c25-d6e0-4d93-a8c9-6b869977f267\" (UID: \"e3768c25-d6e0-4d93-a8c9-6b869977f267\") " Dec 02 07:46:19 crc kubenswrapper[4895]: I1202 07:46:19.666728 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3768c25-d6e0-4d93-a8c9-6b869977f267-combined-ca-bundle\") pod \"e3768c25-d6e0-4d93-a8c9-6b869977f267\" (UID: \"e3768c25-d6e0-4d93-a8c9-6b869977f267\") " Dec 02 07:46:19 crc kubenswrapper[4895]: I1202 07:46:19.668342 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3768c25-d6e0-4d93-a8c9-6b869977f267-logs" (OuterVolumeSpecName: "logs") pod "e3768c25-d6e0-4d93-a8c9-6b869977f267" (UID: "e3768c25-d6e0-4d93-a8c9-6b869977f267"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:46:19 crc kubenswrapper[4895]: I1202 07:46:19.668677 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3768c25-d6e0-4d93-a8c9-6b869977f267-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "e3768c25-d6e0-4d93-a8c9-6b869977f267" (UID: "e3768c25-d6e0-4d93-a8c9-6b869977f267"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:46:19 crc kubenswrapper[4895]: I1202 07:46:19.675267 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3768c25-d6e0-4d93-a8c9-6b869977f267-scripts" (OuterVolumeSpecName: "scripts") pod "e3768c25-d6e0-4d93-a8c9-6b869977f267" (UID: "e3768c25-d6e0-4d93-a8c9-6b869977f267"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:46:19 crc kubenswrapper[4895]: I1202 07:46:19.675294 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3768c25-d6e0-4d93-a8c9-6b869977f267-kube-api-access-fvks4" (OuterVolumeSpecName: "kube-api-access-fvks4") pod "e3768c25-d6e0-4d93-a8c9-6b869977f267" (UID: "e3768c25-d6e0-4d93-a8c9-6b869977f267"). InnerVolumeSpecName "kube-api-access-fvks4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:46:19 crc kubenswrapper[4895]: I1202 07:46:19.676403 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "e3768c25-d6e0-4d93-a8c9-6b869977f267" (UID: "e3768c25-d6e0-4d93-a8c9-6b869977f267"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 02 07:46:19 crc kubenswrapper[4895]: I1202 07:46:19.717258 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3768c25-d6e0-4d93-a8c9-6b869977f267-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e3768c25-d6e0-4d93-a8c9-6b869977f267" (UID: "e3768c25-d6e0-4d93-a8c9-6b869977f267"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:46:19 crc kubenswrapper[4895]: I1202 07:46:19.732571 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3768c25-d6e0-4d93-a8c9-6b869977f267-config-data" (OuterVolumeSpecName: "config-data") pod "e3768c25-d6e0-4d93-a8c9-6b869977f267" (UID: "e3768c25-d6e0-4d93-a8c9-6b869977f267"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:46:19 crc kubenswrapper[4895]: I1202 07:46:19.761082 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3768c25-d6e0-4d93-a8c9-6b869977f267-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "e3768c25-d6e0-4d93-a8c9-6b869977f267" (UID: "e3768c25-d6e0-4d93-a8c9-6b869977f267"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:46:19 crc kubenswrapper[4895]: I1202 07:46:19.770939 4895 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Dec 02 07:46:19 crc kubenswrapper[4895]: I1202 07:46:19.770996 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3768c25-d6e0-4d93-a8c9-6b869977f267-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:19 crc kubenswrapper[4895]: I1202 07:46:19.771011 4895 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e3768c25-d6e0-4d93-a8c9-6b869977f267-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:19 crc kubenswrapper[4895]: I1202 07:46:19.771022 4895 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3768c25-d6e0-4d93-a8c9-6b869977f267-logs\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:19 crc kubenswrapper[4895]: 
I1202 07:46:19.771034 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fvks4\" (UniqueName: \"kubernetes.io/projected/e3768c25-d6e0-4d93-a8c9-6b869977f267-kube-api-access-fvks4\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:19 crc kubenswrapper[4895]: I1202 07:46:19.771045 4895 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3768c25-d6e0-4d93-a8c9-6b869977f267-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:19 crc kubenswrapper[4895]: I1202 07:46:19.771054 4895 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3768c25-d6e0-4d93-a8c9-6b869977f267-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:19 crc kubenswrapper[4895]: I1202 07:46:19.771064 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3768c25-d6e0-4d93-a8c9-6b869977f267-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:19 crc kubenswrapper[4895]: I1202 07:46:19.798141 4895 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Dec 02 07:46:19 crc kubenswrapper[4895]: I1202 07:46:19.807435 4895 generic.go:334] "Generic (PLEG): container finished" podID="e3768c25-d6e0-4d93-a8c9-6b869977f267" containerID="39531e98c51dd7763778d471934687a95420f617360bba0a8c74e13306b7bc1e" exitCode=0 Dec 02 07:46:19 crc kubenswrapper[4895]: I1202 07:46:19.807846 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e3768c25-d6e0-4d93-a8c9-6b869977f267","Type":"ContainerDied","Data":"39531e98c51dd7763778d471934687a95420f617360bba0a8c74e13306b7bc1e"} Dec 02 07:46:19 crc kubenswrapper[4895]: I1202 07:46:19.807918 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"e3768c25-d6e0-4d93-a8c9-6b869977f267","Type":"ContainerDied","Data":"7b65a580e0a793a2de615a15be44a27168b9ff98b0fab2b9b479c8dde0d62a0a"} Dec 02 07:46:19 crc kubenswrapper[4895]: I1202 07:46:19.807951 4895 scope.go:117] "RemoveContainer" containerID="39531e98c51dd7763778d471934687a95420f617360bba0a8c74e13306b7bc1e" Dec 02 07:46:19 crc kubenswrapper[4895]: I1202 07:46:19.808225 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 02 07:46:19 crc kubenswrapper[4895]: I1202 07:46:19.818075 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"61d371ba-ba51-4c1e-98ad-bd93f4bf2751","Type":"ContainerStarted","Data":"50af25234475432ce53fb05eadab7a874939726263fe09b0e0f385dc8b5ee858"} Dec 02 07:46:19 crc kubenswrapper[4895]: I1202 07:46:19.818320 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="61d371ba-ba51-4c1e-98ad-bd93f4bf2751" containerName="ceilometer-central-agent" containerID="cri-o://3827e3a4145a9aa54a19de6c11fffeeee5f0e5031057668f949e29e9e69a0850" gracePeriod=30 Dec 02 07:46:19 crc kubenswrapper[4895]: I1202 07:46:19.818657 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 02 07:46:19 crc kubenswrapper[4895]: I1202 07:46:19.819007 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="61d371ba-ba51-4c1e-98ad-bd93f4bf2751" containerName="proxy-httpd" containerID="cri-o://50af25234475432ce53fb05eadab7a874939726263fe09b0e0f385dc8b5ee858" gracePeriod=30 Dec 02 07:46:19 crc kubenswrapper[4895]: I1202 07:46:19.819067 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="61d371ba-ba51-4c1e-98ad-bd93f4bf2751" containerName="sg-core" 
containerID="cri-o://c020cb920b6af9892b763998844fc021015b1dc863011f2520f11781393d88fa" gracePeriod=30 Dec 02 07:46:19 crc kubenswrapper[4895]: I1202 07:46:19.819113 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="61d371ba-ba51-4c1e-98ad-bd93f4bf2751" containerName="ceilometer-notification-agent" containerID="cri-o://646b39ae7dcf06b620966c948a0ce0e376001257cad1f338a1f9a384b8f22905" gracePeriod=30 Dec 02 07:46:19 crc kubenswrapper[4895]: I1202 07:46:19.878275 4895 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:19 crc kubenswrapper[4895]: I1202 07:46:19.907973 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=8.017718524 podStartE2EDuration="11.907943923s" podCreationTimestamp="2025-12-02 07:46:08 +0000 UTC" firstStartedPulling="2025-12-02 07:46:14.966266678 +0000 UTC m=+1386.137126291" lastFinishedPulling="2025-12-02 07:46:18.856492087 +0000 UTC m=+1390.027351690" observedRunningTime="2025-12-02 07:46:19.850248723 +0000 UTC m=+1391.021108336" watchObservedRunningTime="2025-12-02 07:46:19.907943923 +0000 UTC m=+1391.078803536" Dec 02 07:46:19 crc kubenswrapper[4895]: I1202 07:46:19.959173 4895 scope.go:117] "RemoveContainer" containerID="a571b1549c1970d506653fb10b7f5f596f0318794b3aff01ecdce75169a33e67" Dec 02 07:46:20 crc kubenswrapper[4895]: I1202 07:46:20.029673 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 02 07:46:20 crc kubenswrapper[4895]: I1202 07:46:20.048405 4895 scope.go:117] "RemoveContainer" containerID="39531e98c51dd7763778d471934687a95420f617360bba0a8c74e13306b7bc1e" Dec 02 07:46:20 crc kubenswrapper[4895]: I1202 07:46:20.054940 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/glance-default-external-api-0"] Dec 02 07:46:20 crc kubenswrapper[4895]: E1202 07:46:20.055165 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39531e98c51dd7763778d471934687a95420f617360bba0a8c74e13306b7bc1e\": container with ID starting with 39531e98c51dd7763778d471934687a95420f617360bba0a8c74e13306b7bc1e not found: ID does not exist" containerID="39531e98c51dd7763778d471934687a95420f617360bba0a8c74e13306b7bc1e" Dec 02 07:46:20 crc kubenswrapper[4895]: I1202 07:46:20.055214 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39531e98c51dd7763778d471934687a95420f617360bba0a8c74e13306b7bc1e"} err="failed to get container status \"39531e98c51dd7763778d471934687a95420f617360bba0a8c74e13306b7bc1e\": rpc error: code = NotFound desc = could not find container \"39531e98c51dd7763778d471934687a95420f617360bba0a8c74e13306b7bc1e\": container with ID starting with 39531e98c51dd7763778d471934687a95420f617360bba0a8c74e13306b7bc1e not found: ID does not exist" Dec 02 07:46:20 crc kubenswrapper[4895]: I1202 07:46:20.055242 4895 scope.go:117] "RemoveContainer" containerID="a571b1549c1970d506653fb10b7f5f596f0318794b3aff01ecdce75169a33e67" Dec 02 07:46:20 crc kubenswrapper[4895]: I1202 07:46:20.064309 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 02 07:46:20 crc kubenswrapper[4895]: E1202 07:46:20.064970 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3768c25-d6e0-4d93-a8c9-6b869977f267" containerName="glance-httpd" Dec 02 07:46:20 crc kubenswrapper[4895]: I1202 07:46:20.064987 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3768c25-d6e0-4d93-a8c9-6b869977f267" containerName="glance-httpd" Dec 02 07:46:20 crc kubenswrapper[4895]: E1202 07:46:20.065000 4895 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="e3768c25-d6e0-4d93-a8c9-6b869977f267" containerName="glance-log" Dec 02 07:46:20 crc kubenswrapper[4895]: I1202 07:46:20.065006 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3768c25-d6e0-4d93-a8c9-6b869977f267" containerName="glance-log" Dec 02 07:46:20 crc kubenswrapper[4895]: I1202 07:46:20.065211 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3768c25-d6e0-4d93-a8c9-6b869977f267" containerName="glance-httpd" Dec 02 07:46:20 crc kubenswrapper[4895]: I1202 07:46:20.065240 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3768c25-d6e0-4d93-a8c9-6b869977f267" containerName="glance-log" Dec 02 07:46:20 crc kubenswrapper[4895]: I1202 07:46:20.066442 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 02 07:46:20 crc kubenswrapper[4895]: E1202 07:46:20.069025 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a571b1549c1970d506653fb10b7f5f596f0318794b3aff01ecdce75169a33e67\": container with ID starting with a571b1549c1970d506653fb10b7f5f596f0318794b3aff01ecdce75169a33e67 not found: ID does not exist" containerID="a571b1549c1970d506653fb10b7f5f596f0318794b3aff01ecdce75169a33e67" Dec 02 07:46:20 crc kubenswrapper[4895]: I1202 07:46:20.069062 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a571b1549c1970d506653fb10b7f5f596f0318794b3aff01ecdce75169a33e67"} err="failed to get container status \"a571b1549c1970d506653fb10b7f5f596f0318794b3aff01ecdce75169a33e67\": rpc error: code = NotFound desc = could not find container \"a571b1549c1970d506653fb10b7f5f596f0318794b3aff01ecdce75169a33e67\": container with ID starting with a571b1549c1970d506653fb10b7f5f596f0318794b3aff01ecdce75169a33e67 not found: ID does not exist" Dec 02 07:46:20 crc kubenswrapper[4895]: I1202 07:46:20.070150 4895 reflector.go:368] Caches 
populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 02 07:46:20 crc kubenswrapper[4895]: I1202 07:46:20.070196 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 02 07:46:20 crc kubenswrapper[4895]: I1202 07:46:20.087024 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-tlgbq"] Dec 02 07:46:20 crc kubenswrapper[4895]: I1202 07:46:20.096266 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-tlgbq" Dec 02 07:46:20 crc kubenswrapper[4895]: I1202 07:46:20.150157 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 02 07:46:20 crc kubenswrapper[4895]: I1202 07:46:20.180968 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-tlgbq"] Dec 02 07:46:20 crc kubenswrapper[4895]: I1202 07:46:20.191791 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-23cb-account-create-update-7svkf"] Dec 02 07:46:20 crc kubenswrapper[4895]: I1202 07:46:20.194335 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4869eb0-5e33-4837-8295-06ca17076e69-config-data\") pod \"glance-default-external-api-0\" (UID: \"e4869eb0-5e33-4837-8295-06ca17076e69\") " pod="openstack/glance-default-external-api-0" Dec 02 07:46:20 crc kubenswrapper[4895]: I1202 07:46:20.194430 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4869eb0-5e33-4837-8295-06ca17076e69-logs\") pod \"glance-default-external-api-0\" (UID: \"e4869eb0-5e33-4837-8295-06ca17076e69\") " pod="openstack/glance-default-external-api-0" Dec 02 07:46:20 crc kubenswrapper[4895]: I1202 07:46:20.194465 4895 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"e4869eb0-5e33-4837-8295-06ca17076e69\") " pod="openstack/glance-default-external-api-0" Dec 02 07:46:20 crc kubenswrapper[4895]: I1202 07:46:20.194486 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xg4lr\" (UniqueName: \"kubernetes.io/projected/e4869eb0-5e33-4837-8295-06ca17076e69-kube-api-access-xg4lr\") pod \"glance-default-external-api-0\" (UID: \"e4869eb0-5e33-4837-8295-06ca17076e69\") " pod="openstack/glance-default-external-api-0" Dec 02 07:46:20 crc kubenswrapper[4895]: I1202 07:46:20.194522 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4869eb0-5e33-4837-8295-06ca17076e69-scripts\") pod \"glance-default-external-api-0\" (UID: \"e4869eb0-5e33-4837-8295-06ca17076e69\") " pod="openstack/glance-default-external-api-0" Dec 02 07:46:20 crc kubenswrapper[4895]: I1202 07:46:20.194556 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4869eb0-5e33-4837-8295-06ca17076e69-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"e4869eb0-5e33-4837-8295-06ca17076e69\") " pod="openstack/glance-default-external-api-0" Dec 02 07:46:20 crc kubenswrapper[4895]: I1202 07:46:20.194594 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e4869eb0-5e33-4837-8295-06ca17076e69-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e4869eb0-5e33-4837-8295-06ca17076e69\") " pod="openstack/glance-default-external-api-0" Dec 02 07:46:20 crc kubenswrapper[4895]: I1202 07:46:20.194861 4895 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4869eb0-5e33-4837-8295-06ca17076e69-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e4869eb0-5e33-4837-8295-06ca17076e69\") " pod="openstack/glance-default-external-api-0" Dec 02 07:46:20 crc kubenswrapper[4895]: I1202 07:46:20.195123 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-23cb-account-create-update-7svkf" Dec 02 07:46:20 crc kubenswrapper[4895]: I1202 07:46:20.197380 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Dec 02 07:46:20 crc kubenswrapper[4895]: I1202 07:46:20.205678 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-zl6qf"] Dec 02 07:46:20 crc kubenswrapper[4895]: I1202 07:46:20.209950 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-zl6qf" Dec 02 07:46:20 crc kubenswrapper[4895]: I1202 07:46:20.222401 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-23cb-account-create-update-7svkf"] Dec 02 07:46:20 crc kubenswrapper[4895]: I1202 07:46:20.240349 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-zl6qf"] Dec 02 07:46:20 crc kubenswrapper[4895]: I1202 07:46:20.297053 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e4869eb0-5e33-4837-8295-06ca17076e69-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e4869eb0-5e33-4837-8295-06ca17076e69\") " pod="openstack/glance-default-external-api-0" Dec 02 07:46:20 crc kubenswrapper[4895]: I1202 07:46:20.297125 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e4869eb0-5e33-4837-8295-06ca17076e69-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e4869eb0-5e33-4837-8295-06ca17076e69\") " pod="openstack/glance-default-external-api-0" Dec 02 07:46:20 crc kubenswrapper[4895]: I1202 07:46:20.297232 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6z7fg\" (UniqueName: \"kubernetes.io/projected/68666f08-2df0-4f46-a22c-9f33cfb65732-kube-api-access-6z7fg\") pod \"nova-cell0-db-create-zl6qf\" (UID: \"68666f08-2df0-4f46-a22c-9f33cfb65732\") " pod="openstack/nova-cell0-db-create-zl6qf" Dec 02 07:46:20 crc kubenswrapper[4895]: I1202 07:46:20.297357 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kf2bb\" (UniqueName: \"kubernetes.io/projected/cb4ea5bf-abb5-4fc6-887a-46f19eee6493-kube-api-access-kf2bb\") pod \"nova-api-db-create-tlgbq\" (UID: \"cb4ea5bf-abb5-4fc6-887a-46f19eee6493\") " pod="openstack/nova-api-db-create-tlgbq" Dec 02 07:46:20 crc kubenswrapper[4895]: I1202 07:46:20.297425 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4869eb0-5e33-4837-8295-06ca17076e69-config-data\") pod \"glance-default-external-api-0\" (UID: \"e4869eb0-5e33-4837-8295-06ca17076e69\") " pod="openstack/glance-default-external-api-0" Dec 02 07:46:20 crc kubenswrapper[4895]: I1202 07:46:20.297615 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cb4ea5bf-abb5-4fc6-887a-46f19eee6493-operator-scripts\") pod \"nova-api-db-create-tlgbq\" (UID: \"cb4ea5bf-abb5-4fc6-887a-46f19eee6493\") " pod="openstack/nova-api-db-create-tlgbq" Dec 02 07:46:20 crc kubenswrapper[4895]: I1202 07:46:20.297693 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/e4869eb0-5e33-4837-8295-06ca17076e69-logs\") pod \"glance-default-external-api-0\" (UID: \"e4869eb0-5e33-4837-8295-06ca17076e69\") " pod="openstack/glance-default-external-api-0" Dec 02 07:46:20 crc kubenswrapper[4895]: I1202 07:46:20.297732 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"e4869eb0-5e33-4837-8295-06ca17076e69\") " pod="openstack/glance-default-external-api-0" Dec 02 07:46:20 crc kubenswrapper[4895]: I1202 07:46:20.297770 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xg4lr\" (UniqueName: \"kubernetes.io/projected/e4869eb0-5e33-4837-8295-06ca17076e69-kube-api-access-xg4lr\") pod \"glance-default-external-api-0\" (UID: \"e4869eb0-5e33-4837-8295-06ca17076e69\") " pod="openstack/glance-default-external-api-0" Dec 02 07:46:20 crc kubenswrapper[4895]: I1202 07:46:20.297812 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhf4q\" (UniqueName: \"kubernetes.io/projected/c880aad7-a43f-45d7-b7cc-b9252d06eadf-kube-api-access-jhf4q\") pod \"nova-api-23cb-account-create-update-7svkf\" (UID: \"c880aad7-a43f-45d7-b7cc-b9252d06eadf\") " pod="openstack/nova-api-23cb-account-create-update-7svkf" Dec 02 07:46:20 crc kubenswrapper[4895]: I1202 07:46:20.297860 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4869eb0-5e33-4837-8295-06ca17076e69-scripts\") pod \"glance-default-external-api-0\" (UID: \"e4869eb0-5e33-4837-8295-06ca17076e69\") " pod="openstack/glance-default-external-api-0" Dec 02 07:46:20 crc kubenswrapper[4895]: I1202 07:46:20.297957 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/e4869eb0-5e33-4837-8295-06ca17076e69-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"e4869eb0-5e33-4837-8295-06ca17076e69\") " pod="openstack/glance-default-external-api-0" Dec 02 07:46:20 crc kubenswrapper[4895]: I1202 07:46:20.298008 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c880aad7-a43f-45d7-b7cc-b9252d06eadf-operator-scripts\") pod \"nova-api-23cb-account-create-update-7svkf\" (UID: \"c880aad7-a43f-45d7-b7cc-b9252d06eadf\") " pod="openstack/nova-api-23cb-account-create-update-7svkf" Dec 02 07:46:20 crc kubenswrapper[4895]: I1202 07:46:20.298036 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/68666f08-2df0-4f46-a22c-9f33cfb65732-operator-scripts\") pod \"nova-cell0-db-create-zl6qf\" (UID: \"68666f08-2df0-4f46-a22c-9f33cfb65732\") " pod="openstack/nova-cell0-db-create-zl6qf" Dec 02 07:46:20 crc kubenswrapper[4895]: I1202 07:46:20.298479 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e4869eb0-5e33-4837-8295-06ca17076e69-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e4869eb0-5e33-4837-8295-06ca17076e69\") " pod="openstack/glance-default-external-api-0" Dec 02 07:46:20 crc kubenswrapper[4895]: I1202 07:46:20.299987 4895 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"e4869eb0-5e33-4837-8295-06ca17076e69\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-external-api-0" Dec 02 07:46:20 crc kubenswrapper[4895]: I1202 07:46:20.301585 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/e4869eb0-5e33-4837-8295-06ca17076e69-logs\") pod \"glance-default-external-api-0\" (UID: \"e4869eb0-5e33-4837-8295-06ca17076e69\") " pod="openstack/glance-default-external-api-0" Dec 02 07:46:20 crc kubenswrapper[4895]: I1202 07:46:20.310468 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4869eb0-5e33-4837-8295-06ca17076e69-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e4869eb0-5e33-4837-8295-06ca17076e69\") " pod="openstack/glance-default-external-api-0" Dec 02 07:46:20 crc kubenswrapper[4895]: I1202 07:46:20.321029 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4869eb0-5e33-4837-8295-06ca17076e69-config-data\") pod \"glance-default-external-api-0\" (UID: \"e4869eb0-5e33-4837-8295-06ca17076e69\") " pod="openstack/glance-default-external-api-0" Dec 02 07:46:20 crc kubenswrapper[4895]: I1202 07:46:20.324934 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4869eb0-5e33-4837-8295-06ca17076e69-scripts\") pod \"glance-default-external-api-0\" (UID: \"e4869eb0-5e33-4837-8295-06ca17076e69\") " pod="openstack/glance-default-external-api-0" Dec 02 07:46:20 crc kubenswrapper[4895]: I1202 07:46:20.325277 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4869eb0-5e33-4837-8295-06ca17076e69-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"e4869eb0-5e33-4837-8295-06ca17076e69\") " pod="openstack/glance-default-external-api-0" Dec 02 07:46:20 crc kubenswrapper[4895]: I1202 07:46:20.353891 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-qhr8x"] Dec 02 07:46:20 crc kubenswrapper[4895]: I1202 07:46:20.355414 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-qhr8x" Dec 02 07:46:20 crc kubenswrapper[4895]: I1202 07:46:20.361378 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xg4lr\" (UniqueName: \"kubernetes.io/projected/e4869eb0-5e33-4837-8295-06ca17076e69-kube-api-access-xg4lr\") pod \"glance-default-external-api-0\" (UID: \"e4869eb0-5e33-4837-8295-06ca17076e69\") " pod="openstack/glance-default-external-api-0" Dec 02 07:46:20 crc kubenswrapper[4895]: I1202 07:46:20.362348 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"e4869eb0-5e33-4837-8295-06ca17076e69\") " pod="openstack/glance-default-external-api-0" Dec 02 07:46:20 crc kubenswrapper[4895]: I1202 07:46:20.383438 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-b7d1-account-create-update-sqm7t"] Dec 02 07:46:20 crc kubenswrapper[4895]: I1202 07:46:20.385365 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-b7d1-account-create-update-sqm7t" Dec 02 07:46:20 crc kubenswrapper[4895]: I1202 07:46:20.391278 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Dec 02 07:46:20 crc kubenswrapper[4895]: I1202 07:46:20.397886 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-qhr8x"] Dec 02 07:46:20 crc kubenswrapper[4895]: I1202 07:46:20.399998 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c880aad7-a43f-45d7-b7cc-b9252d06eadf-operator-scripts\") pod \"nova-api-23cb-account-create-update-7svkf\" (UID: \"c880aad7-a43f-45d7-b7cc-b9252d06eadf\") " pod="openstack/nova-api-23cb-account-create-update-7svkf" Dec 02 07:46:20 crc kubenswrapper[4895]: I1202 07:46:20.400045 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/68666f08-2df0-4f46-a22c-9f33cfb65732-operator-scripts\") pod \"nova-cell0-db-create-zl6qf\" (UID: \"68666f08-2df0-4f46-a22c-9f33cfb65732\") " pod="openstack/nova-cell0-db-create-zl6qf" Dec 02 07:46:20 crc kubenswrapper[4895]: I1202 07:46:20.400120 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6z7fg\" (UniqueName: \"kubernetes.io/projected/68666f08-2df0-4f46-a22c-9f33cfb65732-kube-api-access-6z7fg\") pod \"nova-cell0-db-create-zl6qf\" (UID: \"68666f08-2df0-4f46-a22c-9f33cfb65732\") " pod="openstack/nova-cell0-db-create-zl6qf" Dec 02 07:46:20 crc kubenswrapper[4895]: I1202 07:46:20.400166 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kf2bb\" (UniqueName: \"kubernetes.io/projected/cb4ea5bf-abb5-4fc6-887a-46f19eee6493-kube-api-access-kf2bb\") pod \"nova-api-db-create-tlgbq\" (UID: \"cb4ea5bf-abb5-4fc6-887a-46f19eee6493\") " 
pod="openstack/nova-api-db-create-tlgbq" Dec 02 07:46:20 crc kubenswrapper[4895]: I1202 07:46:20.400250 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cb4ea5bf-abb5-4fc6-887a-46f19eee6493-operator-scripts\") pod \"nova-api-db-create-tlgbq\" (UID: \"cb4ea5bf-abb5-4fc6-887a-46f19eee6493\") " pod="openstack/nova-api-db-create-tlgbq" Dec 02 07:46:20 crc kubenswrapper[4895]: I1202 07:46:20.400310 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jhf4q\" (UniqueName: \"kubernetes.io/projected/c880aad7-a43f-45d7-b7cc-b9252d06eadf-kube-api-access-jhf4q\") pod \"nova-api-23cb-account-create-update-7svkf\" (UID: \"c880aad7-a43f-45d7-b7cc-b9252d06eadf\") " pod="openstack/nova-api-23cb-account-create-update-7svkf" Dec 02 07:46:20 crc kubenswrapper[4895]: I1202 07:46:20.401540 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/68666f08-2df0-4f46-a22c-9f33cfb65732-operator-scripts\") pod \"nova-cell0-db-create-zl6qf\" (UID: \"68666f08-2df0-4f46-a22c-9f33cfb65732\") " pod="openstack/nova-cell0-db-create-zl6qf" Dec 02 07:46:20 crc kubenswrapper[4895]: I1202 07:46:20.401566 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cb4ea5bf-abb5-4fc6-887a-46f19eee6493-operator-scripts\") pod \"nova-api-db-create-tlgbq\" (UID: \"cb4ea5bf-abb5-4fc6-887a-46f19eee6493\") " pod="openstack/nova-api-db-create-tlgbq" Dec 02 07:46:20 crc kubenswrapper[4895]: I1202 07:46:20.405018 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c880aad7-a43f-45d7-b7cc-b9252d06eadf-operator-scripts\") pod \"nova-api-23cb-account-create-update-7svkf\" (UID: \"c880aad7-a43f-45d7-b7cc-b9252d06eadf\") " 
pod="openstack/nova-api-23cb-account-create-update-7svkf" Dec 02 07:46:20 crc kubenswrapper[4895]: I1202 07:46:20.408146 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 02 07:46:20 crc kubenswrapper[4895]: I1202 07:46:20.413499 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-b7d1-account-create-update-sqm7t"] Dec 02 07:46:20 crc kubenswrapper[4895]: I1202 07:46:20.439578 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kf2bb\" (UniqueName: \"kubernetes.io/projected/cb4ea5bf-abb5-4fc6-887a-46f19eee6493-kube-api-access-kf2bb\") pod \"nova-api-db-create-tlgbq\" (UID: \"cb4ea5bf-abb5-4fc6-887a-46f19eee6493\") " pod="openstack/nova-api-db-create-tlgbq" Dec 02 07:46:20 crc kubenswrapper[4895]: I1202 07:46:20.439641 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhf4q\" (UniqueName: \"kubernetes.io/projected/c880aad7-a43f-45d7-b7cc-b9252d06eadf-kube-api-access-jhf4q\") pod \"nova-api-23cb-account-create-update-7svkf\" (UID: \"c880aad7-a43f-45d7-b7cc-b9252d06eadf\") " pod="openstack/nova-api-23cb-account-create-update-7svkf" Dec 02 07:46:20 crc kubenswrapper[4895]: I1202 07:46:20.439717 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6z7fg\" (UniqueName: \"kubernetes.io/projected/68666f08-2df0-4f46-a22c-9f33cfb65732-kube-api-access-6z7fg\") pod \"nova-cell0-db-create-zl6qf\" (UID: \"68666f08-2df0-4f46-a22c-9f33cfb65732\") " pod="openstack/nova-cell0-db-create-zl6qf" Dec 02 07:46:20 crc kubenswrapper[4895]: I1202 07:46:20.486015 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-f3a9-account-create-update-bfmns"] Dec 02 07:46:20 crc kubenswrapper[4895]: I1202 07:46:20.487895 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-f3a9-account-create-update-bfmns" Dec 02 07:46:20 crc kubenswrapper[4895]: I1202 07:46:20.501601 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e0df612e-785b-404d-b9ef-21c1ee57b14a-operator-scripts\") pod \"nova-cell1-db-create-qhr8x\" (UID: \"e0df612e-785b-404d-b9ef-21c1ee57b14a\") " pod="openstack/nova-cell1-db-create-qhr8x" Dec 02 07:46:20 crc kubenswrapper[4895]: I1202 07:46:20.501700 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bvn8\" (UniqueName: \"kubernetes.io/projected/e0df612e-785b-404d-b9ef-21c1ee57b14a-kube-api-access-4bvn8\") pod \"nova-cell1-db-create-qhr8x\" (UID: \"e0df612e-785b-404d-b9ef-21c1ee57b14a\") " pod="openstack/nova-cell1-db-create-qhr8x" Dec 02 07:46:20 crc kubenswrapper[4895]: I1202 07:46:20.501772 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hn8gg\" (UniqueName: \"kubernetes.io/projected/ea719270-c425-4d8b-8717-6c47a5556302-kube-api-access-hn8gg\") pod \"nova-cell0-b7d1-account-create-update-sqm7t\" (UID: \"ea719270-c425-4d8b-8717-6c47a5556302\") " pod="openstack/nova-cell0-b7d1-account-create-update-sqm7t" Dec 02 07:46:20 crc kubenswrapper[4895]: I1202 07:46:20.501832 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ea719270-c425-4d8b-8717-6c47a5556302-operator-scripts\") pod \"nova-cell0-b7d1-account-create-update-sqm7t\" (UID: \"ea719270-c425-4d8b-8717-6c47a5556302\") " pod="openstack/nova-cell0-b7d1-account-create-update-sqm7t" Dec 02 07:46:20 crc kubenswrapper[4895]: I1202 07:46:20.505648 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-f3a9-account-create-update-bfmns"] Dec 02 07:46:20 
crc kubenswrapper[4895]: I1202 07:46:20.523305 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Dec 02 07:46:20 crc kubenswrapper[4895]: I1202 07:46:20.524020 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-23cb-account-create-update-7svkf" Dec 02 07:46:20 crc kubenswrapper[4895]: I1202 07:46:20.592600 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-zl6qf" Dec 02 07:46:20 crc kubenswrapper[4895]: I1202 07:46:20.603341 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ea719270-c425-4d8b-8717-6c47a5556302-operator-scripts\") pod \"nova-cell0-b7d1-account-create-update-sqm7t\" (UID: \"ea719270-c425-4d8b-8717-6c47a5556302\") " pod="openstack/nova-cell0-b7d1-account-create-update-sqm7t" Dec 02 07:46:20 crc kubenswrapper[4895]: I1202 07:46:20.603426 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/393335e8-25d7-4364-ae52-eab9ac0d3fa0-operator-scripts\") pod \"nova-cell1-f3a9-account-create-update-bfmns\" (UID: \"393335e8-25d7-4364-ae52-eab9ac0d3fa0\") " pod="openstack/nova-cell1-f3a9-account-create-update-bfmns" Dec 02 07:46:20 crc kubenswrapper[4895]: I1202 07:46:20.603456 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e0df612e-785b-404d-b9ef-21c1ee57b14a-operator-scripts\") pod \"nova-cell1-db-create-qhr8x\" (UID: \"e0df612e-785b-404d-b9ef-21c1ee57b14a\") " pod="openstack/nova-cell1-db-create-qhr8x" Dec 02 07:46:20 crc kubenswrapper[4895]: I1202 07:46:20.603485 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rk8gk\" (UniqueName: 
\"kubernetes.io/projected/393335e8-25d7-4364-ae52-eab9ac0d3fa0-kube-api-access-rk8gk\") pod \"nova-cell1-f3a9-account-create-update-bfmns\" (UID: \"393335e8-25d7-4364-ae52-eab9ac0d3fa0\") " pod="openstack/nova-cell1-f3a9-account-create-update-bfmns" Dec 02 07:46:20 crc kubenswrapper[4895]: I1202 07:46:20.603543 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4bvn8\" (UniqueName: \"kubernetes.io/projected/e0df612e-785b-404d-b9ef-21c1ee57b14a-kube-api-access-4bvn8\") pod \"nova-cell1-db-create-qhr8x\" (UID: \"e0df612e-785b-404d-b9ef-21c1ee57b14a\") " pod="openstack/nova-cell1-db-create-qhr8x" Dec 02 07:46:20 crc kubenswrapper[4895]: I1202 07:46:20.603597 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hn8gg\" (UniqueName: \"kubernetes.io/projected/ea719270-c425-4d8b-8717-6c47a5556302-kube-api-access-hn8gg\") pod \"nova-cell0-b7d1-account-create-update-sqm7t\" (UID: \"ea719270-c425-4d8b-8717-6c47a5556302\") " pod="openstack/nova-cell0-b7d1-account-create-update-sqm7t" Dec 02 07:46:20 crc kubenswrapper[4895]: I1202 07:46:20.605040 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ea719270-c425-4d8b-8717-6c47a5556302-operator-scripts\") pod \"nova-cell0-b7d1-account-create-update-sqm7t\" (UID: \"ea719270-c425-4d8b-8717-6c47a5556302\") " pod="openstack/nova-cell0-b7d1-account-create-update-sqm7t" Dec 02 07:46:20 crc kubenswrapper[4895]: I1202 07:46:20.605546 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e0df612e-785b-404d-b9ef-21c1ee57b14a-operator-scripts\") pod \"nova-cell1-db-create-qhr8x\" (UID: \"e0df612e-785b-404d-b9ef-21c1ee57b14a\") " pod="openstack/nova-cell1-db-create-qhr8x" Dec 02 07:46:20 crc kubenswrapper[4895]: I1202 07:46:20.627165 4895 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-4bvn8\" (UniqueName: \"kubernetes.io/projected/e0df612e-785b-404d-b9ef-21c1ee57b14a-kube-api-access-4bvn8\") pod \"nova-cell1-db-create-qhr8x\" (UID: \"e0df612e-785b-404d-b9ef-21c1ee57b14a\") " pod="openstack/nova-cell1-db-create-qhr8x" Dec 02 07:46:20 crc kubenswrapper[4895]: I1202 07:46:20.628810 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hn8gg\" (UniqueName: \"kubernetes.io/projected/ea719270-c425-4d8b-8717-6c47a5556302-kube-api-access-hn8gg\") pod \"nova-cell0-b7d1-account-create-update-sqm7t\" (UID: \"ea719270-c425-4d8b-8717-6c47a5556302\") " pod="openstack/nova-cell0-b7d1-account-create-update-sqm7t" Dec 02 07:46:20 crc kubenswrapper[4895]: I1202 07:46:20.705194 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/393335e8-25d7-4364-ae52-eab9ac0d3fa0-operator-scripts\") pod \"nova-cell1-f3a9-account-create-update-bfmns\" (UID: \"393335e8-25d7-4364-ae52-eab9ac0d3fa0\") " pod="openstack/nova-cell1-f3a9-account-create-update-bfmns" Dec 02 07:46:20 crc kubenswrapper[4895]: I1202 07:46:20.705694 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rk8gk\" (UniqueName: \"kubernetes.io/projected/393335e8-25d7-4364-ae52-eab9ac0d3fa0-kube-api-access-rk8gk\") pod \"nova-cell1-f3a9-account-create-update-bfmns\" (UID: \"393335e8-25d7-4364-ae52-eab9ac0d3fa0\") " pod="openstack/nova-cell1-f3a9-account-create-update-bfmns" Dec 02 07:46:20 crc kubenswrapper[4895]: I1202 07:46:20.706813 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/393335e8-25d7-4364-ae52-eab9ac0d3fa0-operator-scripts\") pod \"nova-cell1-f3a9-account-create-update-bfmns\" (UID: \"393335e8-25d7-4364-ae52-eab9ac0d3fa0\") " pod="openstack/nova-cell1-f3a9-account-create-update-bfmns" Dec 02 07:46:20 crc 
kubenswrapper[4895]: I1202 07:46:20.724685 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-tlgbq" Dec 02 07:46:20 crc kubenswrapper[4895]: I1202 07:46:20.727312 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rk8gk\" (UniqueName: \"kubernetes.io/projected/393335e8-25d7-4364-ae52-eab9ac0d3fa0-kube-api-access-rk8gk\") pod \"nova-cell1-f3a9-account-create-update-bfmns\" (UID: \"393335e8-25d7-4364-ae52-eab9ac0d3fa0\") " pod="openstack/nova-cell1-f3a9-account-create-update-bfmns" Dec 02 07:46:20 crc kubenswrapper[4895]: I1202 07:46:20.808462 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-qhr8x" Dec 02 07:46:20 crc kubenswrapper[4895]: I1202 07:46:20.839693 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-b7d1-account-create-update-sqm7t" Dec 02 07:46:20 crc kubenswrapper[4895]: I1202 07:46:20.845419 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-f3a9-account-create-update-bfmns" Dec 02 07:46:20 crc kubenswrapper[4895]: I1202 07:46:20.880386 4895 generic.go:334] "Generic (PLEG): container finished" podID="21dea3da-8ebc-4b04-86fa-19a539bd6cc9" containerID="036f9ade09b808d7661bab5c17d24da5c8e38d6318f235d5349c9d48e757bb70" exitCode=0 Dec 02 07:46:20 crc kubenswrapper[4895]: I1202 07:46:20.880541 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"21dea3da-8ebc-4b04-86fa-19a539bd6cc9","Type":"ContainerDied","Data":"036f9ade09b808d7661bab5c17d24da5c8e38d6318f235d5349c9d48e757bb70"} Dec 02 07:46:20 crc kubenswrapper[4895]: I1202 07:46:20.926225 4895 generic.go:334] "Generic (PLEG): container finished" podID="61d371ba-ba51-4c1e-98ad-bd93f4bf2751" containerID="50af25234475432ce53fb05eadab7a874939726263fe09b0e0f385dc8b5ee858" exitCode=0 Dec 02 07:46:20 crc kubenswrapper[4895]: I1202 07:46:20.926285 4895 generic.go:334] "Generic (PLEG): container finished" podID="61d371ba-ba51-4c1e-98ad-bd93f4bf2751" containerID="c020cb920b6af9892b763998844fc021015b1dc863011f2520f11781393d88fa" exitCode=2 Dec 02 07:46:20 crc kubenswrapper[4895]: I1202 07:46:20.926297 4895 generic.go:334] "Generic (PLEG): container finished" podID="61d371ba-ba51-4c1e-98ad-bd93f4bf2751" containerID="646b39ae7dcf06b620966c948a0ce0e376001257cad1f338a1f9a384b8f22905" exitCode=0 Dec 02 07:46:20 crc kubenswrapper[4895]: I1202 07:46:20.926404 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"61d371ba-ba51-4c1e-98ad-bd93f4bf2751","Type":"ContainerDied","Data":"50af25234475432ce53fb05eadab7a874939726263fe09b0e0f385dc8b5ee858"} Dec 02 07:46:20 crc kubenswrapper[4895]: I1202 07:46:20.926463 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"61d371ba-ba51-4c1e-98ad-bd93f4bf2751","Type":"ContainerDied","Data":"c020cb920b6af9892b763998844fc021015b1dc863011f2520f11781393d88fa"} Dec 02 07:46:20 crc kubenswrapper[4895]: I1202 07:46:20.926479 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"61d371ba-ba51-4c1e-98ad-bd93f4bf2751","Type":"ContainerDied","Data":"646b39ae7dcf06b620966c948a0ce0e376001257cad1f338a1f9a384b8f22905"} Dec 02 07:46:21 crc kubenswrapper[4895]: I1202 07:46:21.134318 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 02 07:46:21 crc kubenswrapper[4895]: I1202 07:46:21.203679 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3768c25-d6e0-4d93-a8c9-6b869977f267" path="/var/lib/kubelet/pods/e3768c25-d6e0-4d93-a8c9-6b869977f267/volumes" Dec 02 07:46:21 crc kubenswrapper[4895]: I1202 07:46:21.224190 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/21dea3da-8ebc-4b04-86fa-19a539bd6cc9-httpd-run\") pod \"21dea3da-8ebc-4b04-86fa-19a539bd6cc9\" (UID: \"21dea3da-8ebc-4b04-86fa-19a539bd6cc9\") " Dec 02 07:46:21 crc kubenswrapper[4895]: I1202 07:46:21.224257 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2b2j5\" (UniqueName: \"kubernetes.io/projected/21dea3da-8ebc-4b04-86fa-19a539bd6cc9-kube-api-access-2b2j5\") pod \"21dea3da-8ebc-4b04-86fa-19a539bd6cc9\" (UID: \"21dea3da-8ebc-4b04-86fa-19a539bd6cc9\") " Dec 02 07:46:21 crc kubenswrapper[4895]: I1202 07:46:21.224415 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21dea3da-8ebc-4b04-86fa-19a539bd6cc9-config-data\") pod \"21dea3da-8ebc-4b04-86fa-19a539bd6cc9\" (UID: \"21dea3da-8ebc-4b04-86fa-19a539bd6cc9\") " Dec 02 07:46:21 crc kubenswrapper[4895]: I1202 
07:46:21.224459 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21dea3da-8ebc-4b04-86fa-19a539bd6cc9-combined-ca-bundle\") pod \"21dea3da-8ebc-4b04-86fa-19a539bd6cc9\" (UID: \"21dea3da-8ebc-4b04-86fa-19a539bd6cc9\") " Dec 02 07:46:21 crc kubenswrapper[4895]: I1202 07:46:21.224550 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/21dea3da-8ebc-4b04-86fa-19a539bd6cc9-scripts\") pod \"21dea3da-8ebc-4b04-86fa-19a539bd6cc9\" (UID: \"21dea3da-8ebc-4b04-86fa-19a539bd6cc9\") " Dec 02 07:46:21 crc kubenswrapper[4895]: I1202 07:46:21.224631 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/21dea3da-8ebc-4b04-86fa-19a539bd6cc9-internal-tls-certs\") pod \"21dea3da-8ebc-4b04-86fa-19a539bd6cc9\" (UID: \"21dea3da-8ebc-4b04-86fa-19a539bd6cc9\") " Dec 02 07:46:21 crc kubenswrapper[4895]: I1202 07:46:21.224732 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"21dea3da-8ebc-4b04-86fa-19a539bd6cc9\" (UID: \"21dea3da-8ebc-4b04-86fa-19a539bd6cc9\") " Dec 02 07:46:21 crc kubenswrapper[4895]: I1202 07:46:21.224817 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/21dea3da-8ebc-4b04-86fa-19a539bd6cc9-logs\") pod \"21dea3da-8ebc-4b04-86fa-19a539bd6cc9\" (UID: \"21dea3da-8ebc-4b04-86fa-19a539bd6cc9\") " Dec 02 07:46:21 crc kubenswrapper[4895]: I1202 07:46:21.231639 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/21dea3da-8ebc-4b04-86fa-19a539bd6cc9-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "21dea3da-8ebc-4b04-86fa-19a539bd6cc9" (UID: "21dea3da-8ebc-4b04-86fa-19a539bd6cc9"). 
InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:46:21 crc kubenswrapper[4895]: I1202 07:46:21.231925 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/21dea3da-8ebc-4b04-86fa-19a539bd6cc9-logs" (OuterVolumeSpecName: "logs") pod "21dea3da-8ebc-4b04-86fa-19a539bd6cc9" (UID: "21dea3da-8ebc-4b04-86fa-19a539bd6cc9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:46:21 crc kubenswrapper[4895]: I1202 07:46:21.237153 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "21dea3da-8ebc-4b04-86fa-19a539bd6cc9" (UID: "21dea3da-8ebc-4b04-86fa-19a539bd6cc9"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 02 07:46:21 crc kubenswrapper[4895]: I1202 07:46:21.243168 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21dea3da-8ebc-4b04-86fa-19a539bd6cc9-kube-api-access-2b2j5" (OuterVolumeSpecName: "kube-api-access-2b2j5") pod "21dea3da-8ebc-4b04-86fa-19a539bd6cc9" (UID: "21dea3da-8ebc-4b04-86fa-19a539bd6cc9"). InnerVolumeSpecName "kube-api-access-2b2j5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:46:21 crc kubenswrapper[4895]: I1202 07:46:21.244007 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21dea3da-8ebc-4b04-86fa-19a539bd6cc9-scripts" (OuterVolumeSpecName: "scripts") pod "21dea3da-8ebc-4b04-86fa-19a539bd6cc9" (UID: "21dea3da-8ebc-4b04-86fa-19a539bd6cc9"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:46:21 crc kubenswrapper[4895]: I1202 07:46:21.300667 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 02 07:46:21 crc kubenswrapper[4895]: I1202 07:46:21.304980 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21dea3da-8ebc-4b04-86fa-19a539bd6cc9-config-data" (OuterVolumeSpecName: "config-data") pod "21dea3da-8ebc-4b04-86fa-19a539bd6cc9" (UID: "21dea3da-8ebc-4b04-86fa-19a539bd6cc9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:46:21 crc kubenswrapper[4895]: I1202 07:46:21.327032 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21dea3da-8ebc-4b04-86fa-19a539bd6cc9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "21dea3da-8ebc-4b04-86fa-19a539bd6cc9" (UID: "21dea3da-8ebc-4b04-86fa-19a539bd6cc9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:46:21 crc kubenswrapper[4895]: I1202 07:46:21.334498 4895 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/21dea3da-8ebc-4b04-86fa-19a539bd6cc9-logs\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:21 crc kubenswrapper[4895]: I1202 07:46:21.334544 4895 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/21dea3da-8ebc-4b04-86fa-19a539bd6cc9-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:21 crc kubenswrapper[4895]: I1202 07:46:21.334557 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2b2j5\" (UniqueName: \"kubernetes.io/projected/21dea3da-8ebc-4b04-86fa-19a539bd6cc9-kube-api-access-2b2j5\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:21 crc kubenswrapper[4895]: I1202 07:46:21.334570 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21dea3da-8ebc-4b04-86fa-19a539bd6cc9-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:21 crc kubenswrapper[4895]: I1202 07:46:21.334579 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21dea3da-8ebc-4b04-86fa-19a539bd6cc9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:21 crc kubenswrapper[4895]: I1202 07:46:21.334587 4895 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/21dea3da-8ebc-4b04-86fa-19a539bd6cc9-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:21 crc kubenswrapper[4895]: I1202 07:46:21.334611 4895 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Dec 02 07:46:21 crc kubenswrapper[4895]: I1202 07:46:21.354200 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/nova-api-23cb-account-create-update-7svkf"] Dec 02 07:46:21 crc kubenswrapper[4895]: I1202 07:46:21.358381 4895 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Dec 02 07:46:21 crc kubenswrapper[4895]: I1202 07:46:21.380379 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21dea3da-8ebc-4b04-86fa-19a539bd6cc9-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "21dea3da-8ebc-4b04-86fa-19a539bd6cc9" (UID: "21dea3da-8ebc-4b04-86fa-19a539bd6cc9"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:46:21 crc kubenswrapper[4895]: I1202 07:46:21.437568 4895 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/21dea3da-8ebc-4b04-86fa-19a539bd6cc9-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:21 crc kubenswrapper[4895]: I1202 07:46:21.438224 4895 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:21 crc kubenswrapper[4895]: I1202 07:46:21.572302 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-zl6qf"] Dec 02 07:46:21 crc kubenswrapper[4895]: I1202 07:46:21.592944 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-tlgbq"] Dec 02 07:46:21 crc kubenswrapper[4895]: I1202 07:46:21.619613 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-qhr8x"] Dec 02 07:46:21 crc kubenswrapper[4895]: W1202 07:46:21.649582 4895 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode0df612e_785b_404d_b9ef_21c1ee57b14a.slice/crio-6dd02747aa215102c4f4979d9ce2819070c19ecc72233297acbcbdffed0e992b WatchSource:0}: Error finding container 6dd02747aa215102c4f4979d9ce2819070c19ecc72233297acbcbdffed0e992b: Status 404 returned error can't find the container with id 6dd02747aa215102c4f4979d9ce2819070c19ecc72233297acbcbdffed0e992b Dec 02 07:46:21 crc kubenswrapper[4895]: I1202 07:46:21.717192 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-b7d1-account-create-update-sqm7t"] Dec 02 07:46:21 crc kubenswrapper[4895]: I1202 07:46:21.733506 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-f3a9-account-create-update-bfmns"] Dec 02 07:46:21 crc kubenswrapper[4895]: W1202 07:46:21.742343 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podea719270_c425_4d8b_8717_6c47a5556302.slice/crio-7351a4675cea5b033ca78a5b1913b80d9fb3f7a89d4b845c9493a82d3c15ee20 WatchSource:0}: Error finding container 7351a4675cea5b033ca78a5b1913b80d9fb3f7a89d4b845c9493a82d3c15ee20: Status 404 returned error can't find the container with id 7351a4675cea5b033ca78a5b1913b80d9fb3f7a89d4b845c9493a82d3c15ee20 Dec 02 07:46:21 crc kubenswrapper[4895]: I1202 07:46:21.957060 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"21dea3da-8ebc-4b04-86fa-19a539bd6cc9","Type":"ContainerDied","Data":"2979ac525ad48c4421c35ec93026a9445beae6a73f9d8b0c5640d3d7542b2801"} Dec 02 07:46:21 crc kubenswrapper[4895]: I1202 07:46:21.957127 4895 scope.go:117] "RemoveContainer" containerID="036f9ade09b808d7661bab5c17d24da5c8e38d6318f235d5349c9d48e757bb70" Dec 02 07:46:21 crc kubenswrapper[4895]: I1202 07:46:21.957308 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 02 07:46:21 crc kubenswrapper[4895]: I1202 07:46:21.963316 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-b7d1-account-create-update-sqm7t" event={"ID":"ea719270-c425-4d8b-8717-6c47a5556302","Type":"ContainerStarted","Data":"7351a4675cea5b033ca78a5b1913b80d9fb3f7a89d4b845c9493a82d3c15ee20"} Dec 02 07:46:21 crc kubenswrapper[4895]: I1202 07:46:21.965494 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-zl6qf" event={"ID":"68666f08-2df0-4f46-a22c-9f33cfb65732","Type":"ContainerStarted","Data":"64d55fa59ae42c2b94eedd5c0718c32785d6d8f8fd9e60167c590468901ed0c0"} Dec 02 07:46:21 crc kubenswrapper[4895]: I1202 07:46:21.965529 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-zl6qf" event={"ID":"68666f08-2df0-4f46-a22c-9f33cfb65732","Type":"ContainerStarted","Data":"02614cd3c2b7b8bb81cc09e7c76d80c2dad1b73fc0b288a680b146e8e76547b5"} Dec 02 07:46:21 crc kubenswrapper[4895]: I1202 07:46:21.966670 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e4869eb0-5e33-4837-8295-06ca17076e69","Type":"ContainerStarted","Data":"9238f65472050ea35e994123a598f3005e48f17205c9891944f602d0eee17fa9"} Dec 02 07:46:21 crc kubenswrapper[4895]: I1202 07:46:21.970483 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-tlgbq" event={"ID":"cb4ea5bf-abb5-4fc6-887a-46f19eee6493","Type":"ContainerStarted","Data":"0d07a69fcf10e361c7cfe5b4d7e37de30f83587097e9599939b34745d3e3d8ea"} Dec 02 07:46:21 crc kubenswrapper[4895]: I1202 07:46:21.977154 4895 generic.go:334] "Generic (PLEG): container finished" podID="c880aad7-a43f-45d7-b7cc-b9252d06eadf" containerID="bb176098e0c7a61181ec4600276b01e97b71134a0d909bf6ea15be259cecec59" exitCode=0 Dec 02 07:46:21 crc kubenswrapper[4895]: I1202 07:46:21.977244 4895 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/nova-api-23cb-account-create-update-7svkf" event={"ID":"c880aad7-a43f-45d7-b7cc-b9252d06eadf","Type":"ContainerDied","Data":"bb176098e0c7a61181ec4600276b01e97b71134a0d909bf6ea15be259cecec59"} Dec 02 07:46:21 crc kubenswrapper[4895]: I1202 07:46:21.977283 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-23cb-account-create-update-7svkf" event={"ID":"c880aad7-a43f-45d7-b7cc-b9252d06eadf","Type":"ContainerStarted","Data":"94b9c7caea53ad66dcff0c97059e100416ca04c928ececb9bd8e872e6275730d"} Dec 02 07:46:21 crc kubenswrapper[4895]: I1202 07:46:21.980237 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-f3a9-account-create-update-bfmns" event={"ID":"393335e8-25d7-4364-ae52-eab9ac0d3fa0","Type":"ContainerStarted","Data":"60ca1d390c04daf28205846a18c69de0646817c06e0fd8eb20f5d7769044be5b"} Dec 02 07:46:21 crc kubenswrapper[4895]: I1202 07:46:21.981460 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-qhr8x" event={"ID":"e0df612e-785b-404d-b9ef-21c1ee57b14a","Type":"ContainerStarted","Data":"6dd02747aa215102c4f4979d9ce2819070c19ecc72233297acbcbdffed0e992b"} Dec 02 07:46:22 crc kubenswrapper[4895]: I1202 07:46:22.018514 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-db-create-zl6qf" podStartSLOduration=2.018487229 podStartE2EDuration="2.018487229s" podCreationTimestamp="2025-12-02 07:46:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:46:21.990275339 +0000 UTC m=+1393.161134942" watchObservedRunningTime="2025-12-02 07:46:22.018487229 +0000 UTC m=+1393.189346842" Dec 02 07:46:22 crc kubenswrapper[4895]: I1202 07:46:22.177805 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 02 07:46:22 crc kubenswrapper[4895]: I1202 
07:46:22.192500 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 02 07:46:22 crc kubenswrapper[4895]: I1202 07:46:22.197414 4895 scope.go:117] "RemoveContainer" containerID="e60d35c8b56f6319e4e9e6ca44287e73a7f3f042c2e5330d10f68d04d703572e" Dec 02 07:46:22 crc kubenswrapper[4895]: I1202 07:46:22.240865 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 02 07:46:22 crc kubenswrapper[4895]: E1202 07:46:22.241578 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21dea3da-8ebc-4b04-86fa-19a539bd6cc9" containerName="glance-httpd" Dec 02 07:46:22 crc kubenswrapper[4895]: I1202 07:46:22.241596 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="21dea3da-8ebc-4b04-86fa-19a539bd6cc9" containerName="glance-httpd" Dec 02 07:46:22 crc kubenswrapper[4895]: E1202 07:46:22.241614 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21dea3da-8ebc-4b04-86fa-19a539bd6cc9" containerName="glance-log" Dec 02 07:46:22 crc kubenswrapper[4895]: I1202 07:46:22.241621 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="21dea3da-8ebc-4b04-86fa-19a539bd6cc9" containerName="glance-log" Dec 02 07:46:22 crc kubenswrapper[4895]: I1202 07:46:22.242213 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="21dea3da-8ebc-4b04-86fa-19a539bd6cc9" containerName="glance-log" Dec 02 07:46:22 crc kubenswrapper[4895]: I1202 07:46:22.242243 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="21dea3da-8ebc-4b04-86fa-19a539bd6cc9" containerName="glance-httpd" Dec 02 07:46:22 crc kubenswrapper[4895]: I1202 07:46:22.246009 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 02 07:46:22 crc kubenswrapper[4895]: I1202 07:46:22.252490 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 02 07:46:22 crc kubenswrapper[4895]: I1202 07:46:22.252732 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 02 07:46:22 crc kubenswrapper[4895]: I1202 07:46:22.293861 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 02 07:46:22 crc kubenswrapper[4895]: I1202 07:46:22.370276 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/290c1303-bf41-4474-86ff-c9f5aa105cc3-scripts\") pod \"glance-default-internal-api-0\" (UID: \"290c1303-bf41-4474-86ff-c9f5aa105cc3\") " pod="openstack/glance-default-internal-api-0" Dec 02 07:46:22 crc kubenswrapper[4895]: I1202 07:46:22.370337 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/290c1303-bf41-4474-86ff-c9f5aa105cc3-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"290c1303-bf41-4474-86ff-c9f5aa105cc3\") " pod="openstack/glance-default-internal-api-0" Dec 02 07:46:22 crc kubenswrapper[4895]: I1202 07:46:22.370463 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"290c1303-bf41-4474-86ff-c9f5aa105cc3\") " pod="openstack/glance-default-internal-api-0" Dec 02 07:46:22 crc kubenswrapper[4895]: I1202 07:46:22.370535 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/290c1303-bf41-4474-86ff-c9f5aa105cc3-logs\") pod \"glance-default-internal-api-0\" (UID: \"290c1303-bf41-4474-86ff-c9f5aa105cc3\") " pod="openstack/glance-default-internal-api-0" Dec 02 07:46:22 crc kubenswrapper[4895]: I1202 07:46:22.370579 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/290c1303-bf41-4474-86ff-c9f5aa105cc3-config-data\") pod \"glance-default-internal-api-0\" (UID: \"290c1303-bf41-4474-86ff-c9f5aa105cc3\") " pod="openstack/glance-default-internal-api-0" Dec 02 07:46:22 crc kubenswrapper[4895]: I1202 07:46:22.370640 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/290c1303-bf41-4474-86ff-c9f5aa105cc3-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"290c1303-bf41-4474-86ff-c9f5aa105cc3\") " pod="openstack/glance-default-internal-api-0" Dec 02 07:46:22 crc kubenswrapper[4895]: I1202 07:46:22.370724 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/290c1303-bf41-4474-86ff-c9f5aa105cc3-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"290c1303-bf41-4474-86ff-c9f5aa105cc3\") " pod="openstack/glance-default-internal-api-0" Dec 02 07:46:22 crc kubenswrapper[4895]: I1202 07:46:22.370770 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ssqkl\" (UniqueName: \"kubernetes.io/projected/290c1303-bf41-4474-86ff-c9f5aa105cc3-kube-api-access-ssqkl\") pod \"glance-default-internal-api-0\" (UID: \"290c1303-bf41-4474-86ff-c9f5aa105cc3\") " pod="openstack/glance-default-internal-api-0" Dec 02 07:46:22 crc kubenswrapper[4895]: I1202 07:46:22.473438 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"290c1303-bf41-4474-86ff-c9f5aa105cc3\") " pod="openstack/glance-default-internal-api-0" Dec 02 07:46:22 crc kubenswrapper[4895]: I1202 07:46:22.474510 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/290c1303-bf41-4474-86ff-c9f5aa105cc3-logs\") pod \"glance-default-internal-api-0\" (UID: \"290c1303-bf41-4474-86ff-c9f5aa105cc3\") " pod="openstack/glance-default-internal-api-0" Dec 02 07:46:22 crc kubenswrapper[4895]: I1202 07:46:22.474876 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/290c1303-bf41-4474-86ff-c9f5aa105cc3-config-data\") pod \"glance-default-internal-api-0\" (UID: \"290c1303-bf41-4474-86ff-c9f5aa105cc3\") " pod="openstack/glance-default-internal-api-0" Dec 02 07:46:22 crc kubenswrapper[4895]: I1202 07:46:22.475126 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/290c1303-bf41-4474-86ff-c9f5aa105cc3-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"290c1303-bf41-4474-86ff-c9f5aa105cc3\") " pod="openstack/glance-default-internal-api-0" Dec 02 07:46:22 crc kubenswrapper[4895]: I1202 07:46:22.475232 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/290c1303-bf41-4474-86ff-c9f5aa105cc3-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"290c1303-bf41-4474-86ff-c9f5aa105cc3\") " pod="openstack/glance-default-internal-api-0" Dec 02 07:46:22 crc kubenswrapper[4895]: I1202 07:46:22.475333 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ssqkl\" (UniqueName: 
\"kubernetes.io/projected/290c1303-bf41-4474-86ff-c9f5aa105cc3-kube-api-access-ssqkl\") pod \"glance-default-internal-api-0\" (UID: \"290c1303-bf41-4474-86ff-c9f5aa105cc3\") " pod="openstack/glance-default-internal-api-0" Dec 02 07:46:22 crc kubenswrapper[4895]: I1202 07:46:22.475604 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/290c1303-bf41-4474-86ff-c9f5aa105cc3-scripts\") pod \"glance-default-internal-api-0\" (UID: \"290c1303-bf41-4474-86ff-c9f5aa105cc3\") " pod="openstack/glance-default-internal-api-0" Dec 02 07:46:22 crc kubenswrapper[4895]: I1202 07:46:22.475756 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/290c1303-bf41-4474-86ff-c9f5aa105cc3-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"290c1303-bf41-4474-86ff-c9f5aa105cc3\") " pod="openstack/glance-default-internal-api-0" Dec 02 07:46:22 crc kubenswrapper[4895]: I1202 07:46:22.479030 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/290c1303-bf41-4474-86ff-c9f5aa105cc3-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"290c1303-bf41-4474-86ff-c9f5aa105cc3\") " pod="openstack/glance-default-internal-api-0" Dec 02 07:46:22 crc kubenswrapper[4895]: I1202 07:46:22.484317 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/290c1303-bf41-4474-86ff-c9f5aa105cc3-logs\") pod \"glance-default-internal-api-0\" (UID: \"290c1303-bf41-4474-86ff-c9f5aa105cc3\") " pod="openstack/glance-default-internal-api-0" Dec 02 07:46:22 crc kubenswrapper[4895]: I1202 07:46:22.474234 4895 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" 
(UID: \"290c1303-bf41-4474-86ff-c9f5aa105cc3\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-internal-api-0" Dec 02 07:46:22 crc kubenswrapper[4895]: I1202 07:46:22.497389 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/290c1303-bf41-4474-86ff-c9f5aa105cc3-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"290c1303-bf41-4474-86ff-c9f5aa105cc3\") " pod="openstack/glance-default-internal-api-0" Dec 02 07:46:22 crc kubenswrapper[4895]: I1202 07:46:22.498921 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/290c1303-bf41-4474-86ff-c9f5aa105cc3-config-data\") pod \"glance-default-internal-api-0\" (UID: \"290c1303-bf41-4474-86ff-c9f5aa105cc3\") " pod="openstack/glance-default-internal-api-0" Dec 02 07:46:22 crc kubenswrapper[4895]: I1202 07:46:22.500067 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/290c1303-bf41-4474-86ff-c9f5aa105cc3-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"290c1303-bf41-4474-86ff-c9f5aa105cc3\") " pod="openstack/glance-default-internal-api-0" Dec 02 07:46:22 crc kubenswrapper[4895]: I1202 07:46:22.517491 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/290c1303-bf41-4474-86ff-c9f5aa105cc3-scripts\") pod \"glance-default-internal-api-0\" (UID: \"290c1303-bf41-4474-86ff-c9f5aa105cc3\") " pod="openstack/glance-default-internal-api-0" Dec 02 07:46:22 crc kubenswrapper[4895]: I1202 07:46:22.518990 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ssqkl\" (UniqueName: \"kubernetes.io/projected/290c1303-bf41-4474-86ff-c9f5aa105cc3-kube-api-access-ssqkl\") pod \"glance-default-internal-api-0\" (UID: 
\"290c1303-bf41-4474-86ff-c9f5aa105cc3\") " pod="openstack/glance-default-internal-api-0" Dec 02 07:46:22 crc kubenswrapper[4895]: I1202 07:46:22.537096 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"290c1303-bf41-4474-86ff-c9f5aa105cc3\") " pod="openstack/glance-default-internal-api-0" Dec 02 07:46:22 crc kubenswrapper[4895]: I1202 07:46:22.672631 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 02 07:46:23 crc kubenswrapper[4895]: I1202 07:46:23.020410 4895 generic.go:334] "Generic (PLEG): container finished" podID="ea719270-c425-4d8b-8717-6c47a5556302" containerID="19a1d8c117923ca651ad11ad738b188337a1af0826d51b2eb118181006dd5479" exitCode=0 Dec 02 07:46:23 crc kubenswrapper[4895]: I1202 07:46:23.020629 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-b7d1-account-create-update-sqm7t" event={"ID":"ea719270-c425-4d8b-8717-6c47a5556302","Type":"ContainerDied","Data":"19a1d8c117923ca651ad11ad738b188337a1af0826d51b2eb118181006dd5479"} Dec 02 07:46:23 crc kubenswrapper[4895]: I1202 07:46:23.032761 4895 generic.go:334] "Generic (PLEG): container finished" podID="393335e8-25d7-4364-ae52-eab9ac0d3fa0" containerID="23cadf09ae70804eb20adc9739731c7b4ef414d2337accf34d11dc986b7b6ba7" exitCode=0 Dec 02 07:46:23 crc kubenswrapper[4895]: I1202 07:46:23.032949 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-f3a9-account-create-update-bfmns" event={"ID":"393335e8-25d7-4364-ae52-eab9ac0d3fa0","Type":"ContainerDied","Data":"23cadf09ae70804eb20adc9739731c7b4ef414d2337accf34d11dc986b7b6ba7"} Dec 02 07:46:23 crc kubenswrapper[4895]: I1202 07:46:23.037714 4895 generic.go:334] "Generic (PLEG): container finished" podID="e0df612e-785b-404d-b9ef-21c1ee57b14a" 
containerID="d5777f068b1e0673ef51659de82f8858e811deeeea16f976ccf1ba303e0272c4" exitCode=0 Dec 02 07:46:23 crc kubenswrapper[4895]: I1202 07:46:23.037842 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-qhr8x" event={"ID":"e0df612e-785b-404d-b9ef-21c1ee57b14a","Type":"ContainerDied","Data":"d5777f068b1e0673ef51659de82f8858e811deeeea16f976ccf1ba303e0272c4"} Dec 02 07:46:23 crc kubenswrapper[4895]: I1202 07:46:23.048116 4895 generic.go:334] "Generic (PLEG): container finished" podID="cb4ea5bf-abb5-4fc6-887a-46f19eee6493" containerID="f274cc78e83e7f731660b694da5330a7d62b23969ffd44e4119df3815dcb2352" exitCode=0 Dec 02 07:46:23 crc kubenswrapper[4895]: I1202 07:46:23.050194 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-tlgbq" event={"ID":"cb4ea5bf-abb5-4fc6-887a-46f19eee6493","Type":"ContainerDied","Data":"f274cc78e83e7f731660b694da5330a7d62b23969ffd44e4119df3815dcb2352"} Dec 02 07:46:23 crc kubenswrapper[4895]: I1202 07:46:23.081829 4895 generic.go:334] "Generic (PLEG): container finished" podID="68666f08-2df0-4f46-a22c-9f33cfb65732" containerID="64d55fa59ae42c2b94eedd5c0718c32785d6d8f8fd9e60167c590468901ed0c0" exitCode=0 Dec 02 07:46:23 crc kubenswrapper[4895]: I1202 07:46:23.081957 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-zl6qf" event={"ID":"68666f08-2df0-4f46-a22c-9f33cfb65732","Type":"ContainerDied","Data":"64d55fa59ae42c2b94eedd5c0718c32785d6d8f8fd9e60167c590468901ed0c0"} Dec 02 07:46:23 crc kubenswrapper[4895]: I1202 07:46:23.091601 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e4869eb0-5e33-4837-8295-06ca17076e69","Type":"ContainerStarted","Data":"4362e47d57c98a2bd4e29c4d3aa4369c5e901649c4440c8bab3af675617ff778"} Dec 02 07:46:23 crc kubenswrapper[4895]: I1202 07:46:23.167365 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="21dea3da-8ebc-4b04-86fa-19a539bd6cc9" path="/var/lib/kubelet/pods/21dea3da-8ebc-4b04-86fa-19a539bd6cc9/volumes" Dec 02 07:46:23 crc kubenswrapper[4895]: I1202 07:46:23.440880 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 02 07:46:23 crc kubenswrapper[4895]: I1202 07:46:23.641172 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-23cb-account-create-update-7svkf" Dec 02 07:46:23 crc kubenswrapper[4895]: I1202 07:46:23.818861 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c880aad7-a43f-45d7-b7cc-b9252d06eadf-operator-scripts\") pod \"c880aad7-a43f-45d7-b7cc-b9252d06eadf\" (UID: \"c880aad7-a43f-45d7-b7cc-b9252d06eadf\") " Dec 02 07:46:23 crc kubenswrapper[4895]: I1202 07:46:23.819577 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhf4q\" (UniqueName: \"kubernetes.io/projected/c880aad7-a43f-45d7-b7cc-b9252d06eadf-kube-api-access-jhf4q\") pod \"c880aad7-a43f-45d7-b7cc-b9252d06eadf\" (UID: \"c880aad7-a43f-45d7-b7cc-b9252d06eadf\") " Dec 02 07:46:23 crc kubenswrapper[4895]: I1202 07:46:23.819947 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c880aad7-a43f-45d7-b7cc-b9252d06eadf-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c880aad7-a43f-45d7-b7cc-b9252d06eadf" (UID: "c880aad7-a43f-45d7-b7cc-b9252d06eadf"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:46:23 crc kubenswrapper[4895]: I1202 07:46:23.820363 4895 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c880aad7-a43f-45d7-b7cc-b9252d06eadf-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:23 crc kubenswrapper[4895]: I1202 07:46:23.830360 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c880aad7-a43f-45d7-b7cc-b9252d06eadf-kube-api-access-jhf4q" (OuterVolumeSpecName: "kube-api-access-jhf4q") pod "c880aad7-a43f-45d7-b7cc-b9252d06eadf" (UID: "c880aad7-a43f-45d7-b7cc-b9252d06eadf"). InnerVolumeSpecName "kube-api-access-jhf4q". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:46:23 crc kubenswrapper[4895]: I1202 07:46:23.871394 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 02 07:46:23 crc kubenswrapper[4895]: I1202 07:46:23.923272 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/61d371ba-ba51-4c1e-98ad-bd93f4bf2751-log-httpd\") pod \"61d371ba-ba51-4c1e-98ad-bd93f4bf2751\" (UID: \"61d371ba-ba51-4c1e-98ad-bd93f4bf2751\") " Dec 02 07:46:23 crc kubenswrapper[4895]: I1202 07:46:23.923346 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/61d371ba-ba51-4c1e-98ad-bd93f4bf2751-run-httpd\") pod \"61d371ba-ba51-4c1e-98ad-bd93f4bf2751\" (UID: \"61d371ba-ba51-4c1e-98ad-bd93f4bf2751\") " Dec 02 07:46:23 crc kubenswrapper[4895]: I1202 07:46:23.923496 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61d371ba-ba51-4c1e-98ad-bd93f4bf2751-config-data\") pod \"61d371ba-ba51-4c1e-98ad-bd93f4bf2751\" (UID: \"61d371ba-ba51-4c1e-98ad-bd93f4bf2751\") " 
Dec 02 07:46:23 crc kubenswrapper[4895]: I1202 07:46:23.923528 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rr2wr\" (UniqueName: \"kubernetes.io/projected/61d371ba-ba51-4c1e-98ad-bd93f4bf2751-kube-api-access-rr2wr\") pod \"61d371ba-ba51-4c1e-98ad-bd93f4bf2751\" (UID: \"61d371ba-ba51-4c1e-98ad-bd93f4bf2751\") " Dec 02 07:46:23 crc kubenswrapper[4895]: I1202 07:46:23.923571 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/61d371ba-ba51-4c1e-98ad-bd93f4bf2751-sg-core-conf-yaml\") pod \"61d371ba-ba51-4c1e-98ad-bd93f4bf2751\" (UID: \"61d371ba-ba51-4c1e-98ad-bd93f4bf2751\") " Dec 02 07:46:23 crc kubenswrapper[4895]: I1202 07:46:23.923590 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61d371ba-ba51-4c1e-98ad-bd93f4bf2751-scripts\") pod \"61d371ba-ba51-4c1e-98ad-bd93f4bf2751\" (UID: \"61d371ba-ba51-4c1e-98ad-bd93f4bf2751\") " Dec 02 07:46:23 crc kubenswrapper[4895]: I1202 07:46:23.923671 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61d371ba-ba51-4c1e-98ad-bd93f4bf2751-combined-ca-bundle\") pod \"61d371ba-ba51-4c1e-98ad-bd93f4bf2751\" (UID: \"61d371ba-ba51-4c1e-98ad-bd93f4bf2751\") " Dec 02 07:46:23 crc kubenswrapper[4895]: I1202 07:46:23.924018 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhf4q\" (UniqueName: \"kubernetes.io/projected/c880aad7-a43f-45d7-b7cc-b9252d06eadf-kube-api-access-jhf4q\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:23 crc kubenswrapper[4895]: I1202 07:46:23.924922 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/61d371ba-ba51-4c1e-98ad-bd93f4bf2751-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "61d371ba-ba51-4c1e-98ad-bd93f4bf2751" (UID: 
"61d371ba-ba51-4c1e-98ad-bd93f4bf2751"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:46:23 crc kubenswrapper[4895]: I1202 07:46:23.925267 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/61d371ba-ba51-4c1e-98ad-bd93f4bf2751-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "61d371ba-ba51-4c1e-98ad-bd93f4bf2751" (UID: "61d371ba-ba51-4c1e-98ad-bd93f4bf2751"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:46:23 crc kubenswrapper[4895]: I1202 07:46:23.932526 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61d371ba-ba51-4c1e-98ad-bd93f4bf2751-kube-api-access-rr2wr" (OuterVolumeSpecName: "kube-api-access-rr2wr") pod "61d371ba-ba51-4c1e-98ad-bd93f4bf2751" (UID: "61d371ba-ba51-4c1e-98ad-bd93f4bf2751"). InnerVolumeSpecName "kube-api-access-rr2wr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:46:23 crc kubenswrapper[4895]: I1202 07:46:23.934526 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61d371ba-ba51-4c1e-98ad-bd93f4bf2751-scripts" (OuterVolumeSpecName: "scripts") pod "61d371ba-ba51-4c1e-98ad-bd93f4bf2751" (UID: "61d371ba-ba51-4c1e-98ad-bd93f4bf2751"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:46:23 crc kubenswrapper[4895]: I1202 07:46:23.978962 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61d371ba-ba51-4c1e-98ad-bd93f4bf2751-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "61d371ba-ba51-4c1e-98ad-bd93f4bf2751" (UID: "61d371ba-ba51-4c1e-98ad-bd93f4bf2751"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:46:24 crc kubenswrapper[4895]: I1202 07:46:24.026264 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rr2wr\" (UniqueName: \"kubernetes.io/projected/61d371ba-ba51-4c1e-98ad-bd93f4bf2751-kube-api-access-rr2wr\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:24 crc kubenswrapper[4895]: I1202 07:46:24.026299 4895 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/61d371ba-ba51-4c1e-98ad-bd93f4bf2751-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:24 crc kubenswrapper[4895]: I1202 07:46:24.026311 4895 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61d371ba-ba51-4c1e-98ad-bd93f4bf2751-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:24 crc kubenswrapper[4895]: I1202 07:46:24.026320 4895 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/61d371ba-ba51-4c1e-98ad-bd93f4bf2751-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:24 crc kubenswrapper[4895]: I1202 07:46:24.026332 4895 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/61d371ba-ba51-4c1e-98ad-bd93f4bf2751-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:24 crc kubenswrapper[4895]: I1202 07:46:24.044780 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61d371ba-ba51-4c1e-98ad-bd93f4bf2751-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "61d371ba-ba51-4c1e-98ad-bd93f4bf2751" (UID: "61d371ba-ba51-4c1e-98ad-bd93f4bf2751"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:46:24 crc kubenswrapper[4895]: I1202 07:46:24.057806 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61d371ba-ba51-4c1e-98ad-bd93f4bf2751-config-data" (OuterVolumeSpecName: "config-data") pod "61d371ba-ba51-4c1e-98ad-bd93f4bf2751" (UID: "61d371ba-ba51-4c1e-98ad-bd93f4bf2751"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:46:24 crc kubenswrapper[4895]: I1202 07:46:24.128514 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-23cb-account-create-update-7svkf" Dec 02 07:46:24 crc kubenswrapper[4895]: I1202 07:46:24.128507 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-23cb-account-create-update-7svkf" event={"ID":"c880aad7-a43f-45d7-b7cc-b9252d06eadf","Type":"ContainerDied","Data":"94b9c7caea53ad66dcff0c97059e100416ca04c928ececb9bd8e872e6275730d"} Dec 02 07:46:24 crc kubenswrapper[4895]: I1202 07:46:24.129712 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="94b9c7caea53ad66dcff0c97059e100416ca04c928ececb9bd8e872e6275730d" Dec 02 07:46:24 crc kubenswrapper[4895]: I1202 07:46:24.132079 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61d371ba-ba51-4c1e-98ad-bd93f4bf2751-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:24 crc kubenswrapper[4895]: I1202 07:46:24.132121 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61d371ba-ba51-4c1e-98ad-bd93f4bf2751-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:24 crc kubenswrapper[4895]: I1202 07:46:24.132497 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"290c1303-bf41-4474-86ff-c9f5aa105cc3","Type":"ContainerStarted","Data":"af259c450ee7d7673b0fbe89cc10ca606d9ab65f5a9afd56072b19e32ed4be8c"} Dec 02 07:46:24 crc kubenswrapper[4895]: I1202 07:46:24.137135 4895 generic.go:334] "Generic (PLEG): container finished" podID="61d371ba-ba51-4c1e-98ad-bd93f4bf2751" containerID="3827e3a4145a9aa54a19de6c11fffeeee5f0e5031057668f949e29e9e69a0850" exitCode=0 Dec 02 07:46:24 crc kubenswrapper[4895]: I1202 07:46:24.137225 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"61d371ba-ba51-4c1e-98ad-bd93f4bf2751","Type":"ContainerDied","Data":"3827e3a4145a9aa54a19de6c11fffeeee5f0e5031057668f949e29e9e69a0850"} Dec 02 07:46:24 crc kubenswrapper[4895]: I1202 07:46:24.137244 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 02 07:46:24 crc kubenswrapper[4895]: I1202 07:46:24.137290 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"61d371ba-ba51-4c1e-98ad-bd93f4bf2751","Type":"ContainerDied","Data":"08370f967c1be9a08ddadb22c88de110b585bde34b10c6a13c806fc2d922aa9e"} Dec 02 07:46:24 crc kubenswrapper[4895]: I1202 07:46:24.137322 4895 scope.go:117] "RemoveContainer" containerID="50af25234475432ce53fb05eadab7a874939726263fe09b0e0f385dc8b5ee858" Dec 02 07:46:24 crc kubenswrapper[4895]: I1202 07:46:24.145985 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e4869eb0-5e33-4837-8295-06ca17076e69","Type":"ContainerStarted","Data":"2404d0d162ba97497121e295a4d0041b66d86ff11fa14a769019cf11872671c2"} Dec 02 07:46:24 crc kubenswrapper[4895]: I1202 07:46:24.187612 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.187589161 podStartE2EDuration="5.187589161s" podCreationTimestamp="2025-12-02 07:46:19 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:46:24.17488525 +0000 UTC m=+1395.345744903" watchObservedRunningTime="2025-12-02 07:46:24.187589161 +0000 UTC m=+1395.358448774" Dec 02 07:46:24 crc kubenswrapper[4895]: I1202 07:46:24.230346 4895 scope.go:117] "RemoveContainer" containerID="c020cb920b6af9892b763998844fc021015b1dc863011f2520f11781393d88fa" Dec 02 07:46:24 crc kubenswrapper[4895]: I1202 07:46:24.232141 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 02 07:46:24 crc kubenswrapper[4895]: I1202 07:46:24.249021 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 02 07:46:24 crc kubenswrapper[4895]: I1202 07:46:24.269819 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 02 07:46:24 crc kubenswrapper[4895]: E1202 07:46:24.270700 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61d371ba-ba51-4c1e-98ad-bd93f4bf2751" containerName="ceilometer-central-agent" Dec 02 07:46:24 crc kubenswrapper[4895]: I1202 07:46:24.270731 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="61d371ba-ba51-4c1e-98ad-bd93f4bf2751" containerName="ceilometer-central-agent" Dec 02 07:46:24 crc kubenswrapper[4895]: E1202 07:46:24.270785 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c880aad7-a43f-45d7-b7cc-b9252d06eadf" containerName="mariadb-account-create-update" Dec 02 07:46:24 crc kubenswrapper[4895]: I1202 07:46:24.270801 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="c880aad7-a43f-45d7-b7cc-b9252d06eadf" containerName="mariadb-account-create-update" Dec 02 07:46:24 crc kubenswrapper[4895]: E1202 07:46:24.270846 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61d371ba-ba51-4c1e-98ad-bd93f4bf2751" containerName="sg-core" Dec 02 07:46:24 crc kubenswrapper[4895]: I1202 07:46:24.270859 4895 
state_mem.go:107] "Deleted CPUSet assignment" podUID="61d371ba-ba51-4c1e-98ad-bd93f4bf2751" containerName="sg-core" Dec 02 07:46:24 crc kubenswrapper[4895]: E1202 07:46:24.270882 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61d371ba-ba51-4c1e-98ad-bd93f4bf2751" containerName="ceilometer-notification-agent" Dec 02 07:46:24 crc kubenswrapper[4895]: I1202 07:46:24.270895 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="61d371ba-ba51-4c1e-98ad-bd93f4bf2751" containerName="ceilometer-notification-agent" Dec 02 07:46:24 crc kubenswrapper[4895]: E1202 07:46:24.270932 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61d371ba-ba51-4c1e-98ad-bd93f4bf2751" containerName="proxy-httpd" Dec 02 07:46:24 crc kubenswrapper[4895]: I1202 07:46:24.270945 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="61d371ba-ba51-4c1e-98ad-bd93f4bf2751" containerName="proxy-httpd" Dec 02 07:46:24 crc kubenswrapper[4895]: I1202 07:46:24.271298 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="61d371ba-ba51-4c1e-98ad-bd93f4bf2751" containerName="ceilometer-central-agent" Dec 02 07:46:24 crc kubenswrapper[4895]: I1202 07:46:24.271347 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="61d371ba-ba51-4c1e-98ad-bd93f4bf2751" containerName="ceilometer-notification-agent" Dec 02 07:46:24 crc kubenswrapper[4895]: I1202 07:46:24.271379 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="c880aad7-a43f-45d7-b7cc-b9252d06eadf" containerName="mariadb-account-create-update" Dec 02 07:46:24 crc kubenswrapper[4895]: I1202 07:46:24.271400 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="61d371ba-ba51-4c1e-98ad-bd93f4bf2751" containerName="proxy-httpd" Dec 02 07:46:24 crc kubenswrapper[4895]: I1202 07:46:24.271416 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="61d371ba-ba51-4c1e-98ad-bd93f4bf2751" containerName="sg-core" Dec 02 07:46:24 crc 
kubenswrapper[4895]: I1202 07:46:24.273637 4895 scope.go:117] "RemoveContainer" containerID="646b39ae7dcf06b620966c948a0ce0e376001257cad1f338a1f9a384b8f22905" Dec 02 07:46:24 crc kubenswrapper[4895]: I1202 07:46:24.274807 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 02 07:46:24 crc kubenswrapper[4895]: I1202 07:46:24.279101 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 02 07:46:24 crc kubenswrapper[4895]: I1202 07:46:24.279459 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 02 07:46:24 crc kubenswrapper[4895]: I1202 07:46:24.295075 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 02 07:46:24 crc kubenswrapper[4895]: I1202 07:46:24.324576 4895 scope.go:117] "RemoveContainer" containerID="3827e3a4145a9aa54a19de6c11fffeeee5f0e5031057668f949e29e9e69a0850" Dec 02 07:46:24 crc kubenswrapper[4895]: I1202 07:46:24.402085 4895 scope.go:117] "RemoveContainer" containerID="50af25234475432ce53fb05eadab7a874939726263fe09b0e0f385dc8b5ee858" Dec 02 07:46:24 crc kubenswrapper[4895]: E1202 07:46:24.403907 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50af25234475432ce53fb05eadab7a874939726263fe09b0e0f385dc8b5ee858\": container with ID starting with 50af25234475432ce53fb05eadab7a874939726263fe09b0e0f385dc8b5ee858 not found: ID does not exist" containerID="50af25234475432ce53fb05eadab7a874939726263fe09b0e0f385dc8b5ee858" Dec 02 07:46:24 crc kubenswrapper[4895]: I1202 07:46:24.403960 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50af25234475432ce53fb05eadab7a874939726263fe09b0e0f385dc8b5ee858"} err="failed to get container status \"50af25234475432ce53fb05eadab7a874939726263fe09b0e0f385dc8b5ee858\": rpc error: code = NotFound desc = 
could not find container \"50af25234475432ce53fb05eadab7a874939726263fe09b0e0f385dc8b5ee858\": container with ID starting with 50af25234475432ce53fb05eadab7a874939726263fe09b0e0f385dc8b5ee858 not found: ID does not exist" Dec 02 07:46:24 crc kubenswrapper[4895]: I1202 07:46:24.403989 4895 scope.go:117] "RemoveContainer" containerID="c020cb920b6af9892b763998844fc021015b1dc863011f2520f11781393d88fa" Dec 02 07:46:24 crc kubenswrapper[4895]: E1202 07:46:24.406618 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c020cb920b6af9892b763998844fc021015b1dc863011f2520f11781393d88fa\": container with ID starting with c020cb920b6af9892b763998844fc021015b1dc863011f2520f11781393d88fa not found: ID does not exist" containerID="c020cb920b6af9892b763998844fc021015b1dc863011f2520f11781393d88fa" Dec 02 07:46:24 crc kubenswrapper[4895]: I1202 07:46:24.406680 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c020cb920b6af9892b763998844fc021015b1dc863011f2520f11781393d88fa"} err="failed to get container status \"c020cb920b6af9892b763998844fc021015b1dc863011f2520f11781393d88fa\": rpc error: code = NotFound desc = could not find container \"c020cb920b6af9892b763998844fc021015b1dc863011f2520f11781393d88fa\": container with ID starting with c020cb920b6af9892b763998844fc021015b1dc863011f2520f11781393d88fa not found: ID does not exist" Dec 02 07:46:24 crc kubenswrapper[4895]: I1202 07:46:24.406728 4895 scope.go:117] "RemoveContainer" containerID="646b39ae7dcf06b620966c948a0ce0e376001257cad1f338a1f9a384b8f22905" Dec 02 07:46:24 crc kubenswrapper[4895]: E1202 07:46:24.407524 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"646b39ae7dcf06b620966c948a0ce0e376001257cad1f338a1f9a384b8f22905\": container with ID starting with 646b39ae7dcf06b620966c948a0ce0e376001257cad1f338a1f9a384b8f22905 not 
found: ID does not exist" containerID="646b39ae7dcf06b620966c948a0ce0e376001257cad1f338a1f9a384b8f22905" Dec 02 07:46:24 crc kubenswrapper[4895]: I1202 07:46:24.407553 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"646b39ae7dcf06b620966c948a0ce0e376001257cad1f338a1f9a384b8f22905"} err="failed to get container status \"646b39ae7dcf06b620966c948a0ce0e376001257cad1f338a1f9a384b8f22905\": rpc error: code = NotFound desc = could not find container \"646b39ae7dcf06b620966c948a0ce0e376001257cad1f338a1f9a384b8f22905\": container with ID starting with 646b39ae7dcf06b620966c948a0ce0e376001257cad1f338a1f9a384b8f22905 not found: ID does not exist" Dec 02 07:46:24 crc kubenswrapper[4895]: I1202 07:46:24.407571 4895 scope.go:117] "RemoveContainer" containerID="3827e3a4145a9aa54a19de6c11fffeeee5f0e5031057668f949e29e9e69a0850" Dec 02 07:46:24 crc kubenswrapper[4895]: E1202 07:46:24.408903 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3827e3a4145a9aa54a19de6c11fffeeee5f0e5031057668f949e29e9e69a0850\": container with ID starting with 3827e3a4145a9aa54a19de6c11fffeeee5f0e5031057668f949e29e9e69a0850 not found: ID does not exist" containerID="3827e3a4145a9aa54a19de6c11fffeeee5f0e5031057668f949e29e9e69a0850" Dec 02 07:46:24 crc kubenswrapper[4895]: I1202 07:46:24.408989 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3827e3a4145a9aa54a19de6c11fffeeee5f0e5031057668f949e29e9e69a0850"} err="failed to get container status \"3827e3a4145a9aa54a19de6c11fffeeee5f0e5031057668f949e29e9e69a0850\": rpc error: code = NotFound desc = could not find container \"3827e3a4145a9aa54a19de6c11fffeeee5f0e5031057668f949e29e9e69a0850\": container with ID starting with 3827e3a4145a9aa54a19de6c11fffeeee5f0e5031057668f949e29e9e69a0850 not found: ID does not exist" Dec 02 07:46:24 crc kubenswrapper[4895]: I1202 07:46:24.452202 
4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ca46fe00-eb61-4baf-81b1-a2b91c754a99-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ca46fe00-eb61-4baf-81b1-a2b91c754a99\") " pod="openstack/ceilometer-0" Dec 02 07:46:24 crc kubenswrapper[4895]: I1202 07:46:24.452329 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ca46fe00-eb61-4baf-81b1-a2b91c754a99-scripts\") pod \"ceilometer-0\" (UID: \"ca46fe00-eb61-4baf-81b1-a2b91c754a99\") " pod="openstack/ceilometer-0" Dec 02 07:46:24 crc kubenswrapper[4895]: I1202 07:46:24.452406 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca46fe00-eb61-4baf-81b1-a2b91c754a99-config-data\") pod \"ceilometer-0\" (UID: \"ca46fe00-eb61-4baf-81b1-a2b91c754a99\") " pod="openstack/ceilometer-0" Dec 02 07:46:24 crc kubenswrapper[4895]: I1202 07:46:24.452551 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ca46fe00-eb61-4baf-81b1-a2b91c754a99-run-httpd\") pod \"ceilometer-0\" (UID: \"ca46fe00-eb61-4baf-81b1-a2b91c754a99\") " pod="openstack/ceilometer-0" Dec 02 07:46:24 crc kubenswrapper[4895]: I1202 07:46:24.452714 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ca46fe00-eb61-4baf-81b1-a2b91c754a99-log-httpd\") pod \"ceilometer-0\" (UID: \"ca46fe00-eb61-4baf-81b1-a2b91c754a99\") " pod="openstack/ceilometer-0" Dec 02 07:46:24 crc kubenswrapper[4895]: I1202 07:46:24.452781 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5pmm\" (UniqueName: 
\"kubernetes.io/projected/ca46fe00-eb61-4baf-81b1-a2b91c754a99-kube-api-access-q5pmm\") pod \"ceilometer-0\" (UID: \"ca46fe00-eb61-4baf-81b1-a2b91c754a99\") " pod="openstack/ceilometer-0" Dec 02 07:46:24 crc kubenswrapper[4895]: I1202 07:46:24.453124 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca46fe00-eb61-4baf-81b1-a2b91c754a99-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ca46fe00-eb61-4baf-81b1-a2b91c754a99\") " pod="openstack/ceilometer-0" Dec 02 07:46:24 crc kubenswrapper[4895]: I1202 07:46:24.555524 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ca46fe00-eb61-4baf-81b1-a2b91c754a99-log-httpd\") pod \"ceilometer-0\" (UID: \"ca46fe00-eb61-4baf-81b1-a2b91c754a99\") " pod="openstack/ceilometer-0" Dec 02 07:46:24 crc kubenswrapper[4895]: I1202 07:46:24.555575 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5pmm\" (UniqueName: \"kubernetes.io/projected/ca46fe00-eb61-4baf-81b1-a2b91c754a99-kube-api-access-q5pmm\") pod \"ceilometer-0\" (UID: \"ca46fe00-eb61-4baf-81b1-a2b91c754a99\") " pod="openstack/ceilometer-0" Dec 02 07:46:24 crc kubenswrapper[4895]: I1202 07:46:24.555614 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca46fe00-eb61-4baf-81b1-a2b91c754a99-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ca46fe00-eb61-4baf-81b1-a2b91c754a99\") " pod="openstack/ceilometer-0" Dec 02 07:46:24 crc kubenswrapper[4895]: I1202 07:46:24.555656 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ca46fe00-eb61-4baf-81b1-a2b91c754a99-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ca46fe00-eb61-4baf-81b1-a2b91c754a99\") " 
pod="openstack/ceilometer-0" Dec 02 07:46:24 crc kubenswrapper[4895]: I1202 07:46:24.555680 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ca46fe00-eb61-4baf-81b1-a2b91c754a99-scripts\") pod \"ceilometer-0\" (UID: \"ca46fe00-eb61-4baf-81b1-a2b91c754a99\") " pod="openstack/ceilometer-0" Dec 02 07:46:24 crc kubenswrapper[4895]: I1202 07:46:24.555704 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca46fe00-eb61-4baf-81b1-a2b91c754a99-config-data\") pod \"ceilometer-0\" (UID: \"ca46fe00-eb61-4baf-81b1-a2b91c754a99\") " pod="openstack/ceilometer-0" Dec 02 07:46:24 crc kubenswrapper[4895]: I1202 07:46:24.555768 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ca46fe00-eb61-4baf-81b1-a2b91c754a99-run-httpd\") pod \"ceilometer-0\" (UID: \"ca46fe00-eb61-4baf-81b1-a2b91c754a99\") " pod="openstack/ceilometer-0" Dec 02 07:46:24 crc kubenswrapper[4895]: I1202 07:46:24.556291 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ca46fe00-eb61-4baf-81b1-a2b91c754a99-run-httpd\") pod \"ceilometer-0\" (UID: \"ca46fe00-eb61-4baf-81b1-a2b91c754a99\") " pod="openstack/ceilometer-0" Dec 02 07:46:24 crc kubenswrapper[4895]: I1202 07:46:24.557199 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ca46fe00-eb61-4baf-81b1-a2b91c754a99-log-httpd\") pod \"ceilometer-0\" (UID: \"ca46fe00-eb61-4baf-81b1-a2b91c754a99\") " pod="openstack/ceilometer-0" Dec 02 07:46:24 crc kubenswrapper[4895]: I1202 07:46:24.561174 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ca46fe00-eb61-4baf-81b1-a2b91c754a99-sg-core-conf-yaml\") pod 
\"ceilometer-0\" (UID: \"ca46fe00-eb61-4baf-81b1-a2b91c754a99\") " pod="openstack/ceilometer-0" Dec 02 07:46:24 crc kubenswrapper[4895]: I1202 07:46:24.562097 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca46fe00-eb61-4baf-81b1-a2b91c754a99-config-data\") pod \"ceilometer-0\" (UID: \"ca46fe00-eb61-4baf-81b1-a2b91c754a99\") " pod="openstack/ceilometer-0" Dec 02 07:46:24 crc kubenswrapper[4895]: I1202 07:46:24.562510 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ca46fe00-eb61-4baf-81b1-a2b91c754a99-scripts\") pod \"ceilometer-0\" (UID: \"ca46fe00-eb61-4baf-81b1-a2b91c754a99\") " pod="openstack/ceilometer-0" Dec 02 07:46:24 crc kubenswrapper[4895]: I1202 07:46:24.564034 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca46fe00-eb61-4baf-81b1-a2b91c754a99-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ca46fe00-eb61-4baf-81b1-a2b91c754a99\") " pod="openstack/ceilometer-0" Dec 02 07:46:24 crc kubenswrapper[4895]: I1202 07:46:24.591344 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5pmm\" (UniqueName: \"kubernetes.io/projected/ca46fe00-eb61-4baf-81b1-a2b91c754a99-kube-api-access-q5pmm\") pod \"ceilometer-0\" (UID: \"ca46fe00-eb61-4baf-81b1-a2b91c754a99\") " pod="openstack/ceilometer-0" Dec 02 07:46:24 crc kubenswrapper[4895]: I1202 07:46:24.614948 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 02 07:46:24 crc kubenswrapper[4895]: I1202 07:46:24.647222 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-tlgbq" Dec 02 07:46:24 crc kubenswrapper[4895]: I1202 07:46:24.658548 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cb4ea5bf-abb5-4fc6-887a-46f19eee6493-operator-scripts\") pod \"cb4ea5bf-abb5-4fc6-887a-46f19eee6493\" (UID: \"cb4ea5bf-abb5-4fc6-887a-46f19eee6493\") " Dec 02 07:46:24 crc kubenswrapper[4895]: I1202 07:46:24.658619 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kf2bb\" (UniqueName: \"kubernetes.io/projected/cb4ea5bf-abb5-4fc6-887a-46f19eee6493-kube-api-access-kf2bb\") pod \"cb4ea5bf-abb5-4fc6-887a-46f19eee6493\" (UID: \"cb4ea5bf-abb5-4fc6-887a-46f19eee6493\") " Dec 02 07:46:24 crc kubenswrapper[4895]: I1202 07:46:24.659805 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb4ea5bf-abb5-4fc6-887a-46f19eee6493-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "cb4ea5bf-abb5-4fc6-887a-46f19eee6493" (UID: "cb4ea5bf-abb5-4fc6-887a-46f19eee6493"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:46:24 crc kubenswrapper[4895]: I1202 07:46:24.669532 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb4ea5bf-abb5-4fc6-887a-46f19eee6493-kube-api-access-kf2bb" (OuterVolumeSpecName: "kube-api-access-kf2bb") pod "cb4ea5bf-abb5-4fc6-887a-46f19eee6493" (UID: "cb4ea5bf-abb5-4fc6-887a-46f19eee6493"). InnerVolumeSpecName "kube-api-access-kf2bb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:46:24 crc kubenswrapper[4895]: I1202 07:46:24.778213 4895 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cb4ea5bf-abb5-4fc6-887a-46f19eee6493-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:24 crc kubenswrapper[4895]: I1202 07:46:24.778255 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kf2bb\" (UniqueName: \"kubernetes.io/projected/cb4ea5bf-abb5-4fc6-887a-46f19eee6493-kube-api-access-kf2bb\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:24 crc kubenswrapper[4895]: I1202 07:46:24.820685 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-f3a9-account-create-update-bfmns" Dec 02 07:46:24 crc kubenswrapper[4895]: I1202 07:46:24.834264 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-b7d1-account-create-update-sqm7t" Dec 02 07:46:24 crc kubenswrapper[4895]: I1202 07:46:24.889766 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-zl6qf" Dec 02 07:46:24 crc kubenswrapper[4895]: I1202 07:46:24.891499 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rk8gk\" (UniqueName: \"kubernetes.io/projected/393335e8-25d7-4364-ae52-eab9ac0d3fa0-kube-api-access-rk8gk\") pod \"393335e8-25d7-4364-ae52-eab9ac0d3fa0\" (UID: \"393335e8-25d7-4364-ae52-eab9ac0d3fa0\") " Dec 02 07:46:24 crc kubenswrapper[4895]: I1202 07:46:24.891555 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hn8gg\" (UniqueName: \"kubernetes.io/projected/ea719270-c425-4d8b-8717-6c47a5556302-kube-api-access-hn8gg\") pod \"ea719270-c425-4d8b-8717-6c47a5556302\" (UID: \"ea719270-c425-4d8b-8717-6c47a5556302\") " Dec 02 07:46:24 crc kubenswrapper[4895]: I1202 07:46:24.891607 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/393335e8-25d7-4364-ae52-eab9ac0d3fa0-operator-scripts\") pod \"393335e8-25d7-4364-ae52-eab9ac0d3fa0\" (UID: \"393335e8-25d7-4364-ae52-eab9ac0d3fa0\") " Dec 02 07:46:24 crc kubenswrapper[4895]: I1202 07:46:24.891791 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ea719270-c425-4d8b-8717-6c47a5556302-operator-scripts\") pod \"ea719270-c425-4d8b-8717-6c47a5556302\" (UID: \"ea719270-c425-4d8b-8717-6c47a5556302\") " Dec 02 07:46:24 crc kubenswrapper[4895]: I1202 07:46:24.892879 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea719270-c425-4d8b-8717-6c47a5556302-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ea719270-c425-4d8b-8717-6c47a5556302" (UID: "ea719270-c425-4d8b-8717-6c47a5556302"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:46:24 crc kubenswrapper[4895]: I1202 07:46:24.895454 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/393335e8-25d7-4364-ae52-eab9ac0d3fa0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "393335e8-25d7-4364-ae52-eab9ac0d3fa0" (UID: "393335e8-25d7-4364-ae52-eab9ac0d3fa0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:46:24 crc kubenswrapper[4895]: I1202 07:46:24.901096 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/393335e8-25d7-4364-ae52-eab9ac0d3fa0-kube-api-access-rk8gk" (OuterVolumeSpecName: "kube-api-access-rk8gk") pod "393335e8-25d7-4364-ae52-eab9ac0d3fa0" (UID: "393335e8-25d7-4364-ae52-eab9ac0d3fa0"). InnerVolumeSpecName "kube-api-access-rk8gk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:46:24 crc kubenswrapper[4895]: I1202 07:46:24.924515 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea719270-c425-4d8b-8717-6c47a5556302-kube-api-access-hn8gg" (OuterVolumeSpecName: "kube-api-access-hn8gg") pod "ea719270-c425-4d8b-8717-6c47a5556302" (UID: "ea719270-c425-4d8b-8717-6c47a5556302"). InnerVolumeSpecName "kube-api-access-hn8gg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:46:24 crc kubenswrapper[4895]: I1202 07:46:24.928410 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-qhr8x" Dec 02 07:46:24 crc kubenswrapper[4895]: I1202 07:46:24.994220 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/68666f08-2df0-4f46-a22c-9f33cfb65732-operator-scripts\") pod \"68666f08-2df0-4f46-a22c-9f33cfb65732\" (UID: \"68666f08-2df0-4f46-a22c-9f33cfb65732\") " Dec 02 07:46:24 crc kubenswrapper[4895]: I1202 07:46:24.994390 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e0df612e-785b-404d-b9ef-21c1ee57b14a-operator-scripts\") pod \"e0df612e-785b-404d-b9ef-21c1ee57b14a\" (UID: \"e0df612e-785b-404d-b9ef-21c1ee57b14a\") " Dec 02 07:46:24 crc kubenswrapper[4895]: I1202 07:46:24.994468 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4bvn8\" (UniqueName: \"kubernetes.io/projected/e0df612e-785b-404d-b9ef-21c1ee57b14a-kube-api-access-4bvn8\") pod \"e0df612e-785b-404d-b9ef-21c1ee57b14a\" (UID: \"e0df612e-785b-404d-b9ef-21c1ee57b14a\") " Dec 02 07:46:24 crc kubenswrapper[4895]: I1202 07:46:24.994532 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6z7fg\" (UniqueName: \"kubernetes.io/projected/68666f08-2df0-4f46-a22c-9f33cfb65732-kube-api-access-6z7fg\") pod \"68666f08-2df0-4f46-a22c-9f33cfb65732\" (UID: \"68666f08-2df0-4f46-a22c-9f33cfb65732\") " Dec 02 07:46:24 crc kubenswrapper[4895]: I1202 07:46:24.995051 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rk8gk\" (UniqueName: \"kubernetes.io/projected/393335e8-25d7-4364-ae52-eab9ac0d3fa0-kube-api-access-rk8gk\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:24 crc kubenswrapper[4895]: I1202 07:46:24.995072 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hn8gg\" (UniqueName: 
\"kubernetes.io/projected/ea719270-c425-4d8b-8717-6c47a5556302-kube-api-access-hn8gg\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:24 crc kubenswrapper[4895]: I1202 07:46:24.995084 4895 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/393335e8-25d7-4364-ae52-eab9ac0d3fa0-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:24 crc kubenswrapper[4895]: I1202 07:46:24.995096 4895 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ea719270-c425-4d8b-8717-6c47a5556302-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:24 crc kubenswrapper[4895]: I1202 07:46:24.998665 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68666f08-2df0-4f46-a22c-9f33cfb65732-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "68666f08-2df0-4f46-a22c-9f33cfb65732" (UID: "68666f08-2df0-4f46-a22c-9f33cfb65732"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:46:25 crc kubenswrapper[4895]: I1202 07:46:25.000372 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0df612e-785b-404d-b9ef-21c1ee57b14a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e0df612e-785b-404d-b9ef-21c1ee57b14a" (UID: "e0df612e-785b-404d-b9ef-21c1ee57b14a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:46:25 crc kubenswrapper[4895]: I1202 07:46:25.002959 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68666f08-2df0-4f46-a22c-9f33cfb65732-kube-api-access-6z7fg" (OuterVolumeSpecName: "kube-api-access-6z7fg") pod "68666f08-2df0-4f46-a22c-9f33cfb65732" (UID: "68666f08-2df0-4f46-a22c-9f33cfb65732"). InnerVolumeSpecName "kube-api-access-6z7fg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:46:25 crc kubenswrapper[4895]: I1202 07:46:25.009143 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0df612e-785b-404d-b9ef-21c1ee57b14a-kube-api-access-4bvn8" (OuterVolumeSpecName: "kube-api-access-4bvn8") pod "e0df612e-785b-404d-b9ef-21c1ee57b14a" (UID: "e0df612e-785b-404d-b9ef-21c1ee57b14a"). InnerVolumeSpecName "kube-api-access-4bvn8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:46:25 crc kubenswrapper[4895]: I1202 07:46:25.097302 4895 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/68666f08-2df0-4f46-a22c-9f33cfb65732-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:25 crc kubenswrapper[4895]: I1202 07:46:25.097910 4895 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e0df612e-785b-404d-b9ef-21c1ee57b14a-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:25 crc kubenswrapper[4895]: I1202 07:46:25.097923 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4bvn8\" (UniqueName: \"kubernetes.io/projected/e0df612e-785b-404d-b9ef-21c1ee57b14a-kube-api-access-4bvn8\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:25 crc kubenswrapper[4895]: I1202 07:46:25.097936 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6z7fg\" (UniqueName: \"kubernetes.io/projected/68666f08-2df0-4f46-a22c-9f33cfb65732-kube-api-access-6z7fg\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:25 crc kubenswrapper[4895]: I1202 07:46:25.161383 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61d371ba-ba51-4c1e-98ad-bd93f4bf2751" path="/var/lib/kubelet/pods/61d371ba-ba51-4c1e-98ad-bd93f4bf2751/volumes" Dec 02 07:46:25 crc kubenswrapper[4895]: I1202 07:46:25.178918 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-internal-api-0" event={"ID":"290c1303-bf41-4474-86ff-c9f5aa105cc3","Type":"ContainerStarted","Data":"fd8c7d4e19097367de3d3f49094033e0adeb083a5427064f86bcdaba564bc61c"} Dec 02 07:46:25 crc kubenswrapper[4895]: I1202 07:46:25.182597 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-b7d1-account-create-update-sqm7t" event={"ID":"ea719270-c425-4d8b-8717-6c47a5556302","Type":"ContainerDied","Data":"7351a4675cea5b033ca78a5b1913b80d9fb3f7a89d4b845c9493a82d3c15ee20"} Dec 02 07:46:25 crc kubenswrapper[4895]: I1202 07:46:25.182612 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-b7d1-account-create-update-sqm7t" Dec 02 07:46:25 crc kubenswrapper[4895]: I1202 07:46:25.182634 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7351a4675cea5b033ca78a5b1913b80d9fb3f7a89d4b845c9493a82d3c15ee20" Dec 02 07:46:25 crc kubenswrapper[4895]: I1202 07:46:25.187115 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-zl6qf" event={"ID":"68666f08-2df0-4f46-a22c-9f33cfb65732","Type":"ContainerDied","Data":"02614cd3c2b7b8bb81cc09e7c76d80c2dad1b73fc0b288a680b146e8e76547b5"} Dec 02 07:46:25 crc kubenswrapper[4895]: I1202 07:46:25.187166 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="02614cd3c2b7b8bb81cc09e7c76d80c2dad1b73fc0b288a680b146e8e76547b5" Dec 02 07:46:25 crc kubenswrapper[4895]: I1202 07:46:25.187231 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-zl6qf" Dec 02 07:46:25 crc kubenswrapper[4895]: I1202 07:46:25.194589 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-tlgbq" event={"ID":"cb4ea5bf-abb5-4fc6-887a-46f19eee6493","Type":"ContainerDied","Data":"0d07a69fcf10e361c7cfe5b4d7e37de30f83587097e9599939b34745d3e3d8ea"} Dec 02 07:46:25 crc kubenswrapper[4895]: I1202 07:46:25.194645 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0d07a69fcf10e361c7cfe5b4d7e37de30f83587097e9599939b34745d3e3d8ea" Dec 02 07:46:25 crc kubenswrapper[4895]: I1202 07:46:25.194656 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-tlgbq" Dec 02 07:46:25 crc kubenswrapper[4895]: I1202 07:46:25.196438 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-f3a9-account-create-update-bfmns" Dec 02 07:46:25 crc kubenswrapper[4895]: I1202 07:46:25.196464 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-f3a9-account-create-update-bfmns" event={"ID":"393335e8-25d7-4364-ae52-eab9ac0d3fa0","Type":"ContainerDied","Data":"60ca1d390c04daf28205846a18c69de0646817c06e0fd8eb20f5d7769044be5b"} Dec 02 07:46:25 crc kubenswrapper[4895]: I1202 07:46:25.196516 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="60ca1d390c04daf28205846a18c69de0646817c06e0fd8eb20f5d7769044be5b" Dec 02 07:46:25 crc kubenswrapper[4895]: I1202 07:46:25.200456 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-qhr8x" Dec 02 07:46:25 crc kubenswrapper[4895]: I1202 07:46:25.200505 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-qhr8x" event={"ID":"e0df612e-785b-404d-b9ef-21c1ee57b14a","Type":"ContainerDied","Data":"6dd02747aa215102c4f4979d9ce2819070c19ecc72233297acbcbdffed0e992b"} Dec 02 07:46:25 crc kubenswrapper[4895]: I1202 07:46:25.200533 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6dd02747aa215102c4f4979d9ce2819070c19ecc72233297acbcbdffed0e992b" Dec 02 07:46:25 crc kubenswrapper[4895]: I1202 07:46:25.320080 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 02 07:46:26 crc kubenswrapper[4895]: I1202 07:46:26.225198 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"290c1303-bf41-4474-86ff-c9f5aa105cc3","Type":"ContainerStarted","Data":"973ab025884cab7054f146e0f744a06e1f4e800f6c16521085496ffc96503509"} Dec 02 07:46:26 crc kubenswrapper[4895]: I1202 07:46:26.228279 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ca46fe00-eb61-4baf-81b1-a2b91c754a99","Type":"ContainerStarted","Data":"54525466bbbcc974ddc2b6ffbb848ae2022d5f5e38c50fb140ead0d9aad2a629"} Dec 02 07:46:26 crc kubenswrapper[4895]: I1202 07:46:26.273900 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.2738724 podStartE2EDuration="4.2738724s" podCreationTimestamp="2025-12-02 07:46:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:46:26.251942864 +0000 UTC m=+1397.422802487" watchObservedRunningTime="2025-12-02 07:46:26.2738724 +0000 UTC m=+1397.444732033" Dec 02 07:46:27 crc kubenswrapper[4895]: I1202 07:46:27.026407 4895 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 02 07:46:27 crc kubenswrapper[4895]: I1202 07:46:27.243521 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ca46fe00-eb61-4baf-81b1-a2b91c754a99","Type":"ContainerStarted","Data":"b0520efbfddb0b37fbb7a65afbe9383817ccf9f6ae11082d2fa3e2c3a88b743f"} Dec 02 07:46:27 crc kubenswrapper[4895]: I1202 07:46:27.244144 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ca46fe00-eb61-4baf-81b1-a2b91c754a99","Type":"ContainerStarted","Data":"8a759c6911d74b1f1d0481259494f6c68447fd16d6ae68cb602fe2ebde521347"} Dec 02 07:46:28 crc kubenswrapper[4895]: I1202 07:46:28.277329 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ca46fe00-eb61-4baf-81b1-a2b91c754a99","Type":"ContainerStarted","Data":"6ef5e37085909aad803297f2e65887d60d3b2a7265ec5b2edec0715738e2c133"} Dec 02 07:46:29 crc kubenswrapper[4895]: I1202 07:46:29.297927 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ca46fe00-eb61-4baf-81b1-a2b91c754a99","Type":"ContainerStarted","Data":"6ca835b0b75e3696527a82637f8aa060b70b3d711663ec42e2f269ae07704a6b"} Dec 02 07:46:29 crc kubenswrapper[4895]: I1202 07:46:29.299612 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 02 07:46:29 crc kubenswrapper[4895]: I1202 07:46:29.298208 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ca46fe00-eb61-4baf-81b1-a2b91c754a99" containerName="sg-core" containerID="cri-o://6ef5e37085909aad803297f2e65887d60d3b2a7265ec5b2edec0715738e2c133" gracePeriod=30 Dec 02 07:46:29 crc kubenswrapper[4895]: I1202 07:46:29.298249 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ca46fe00-eb61-4baf-81b1-a2b91c754a99" 
containerName="proxy-httpd" containerID="cri-o://6ca835b0b75e3696527a82637f8aa060b70b3d711663ec42e2f269ae07704a6b" gracePeriod=30 Dec 02 07:46:29 crc kubenswrapper[4895]: I1202 07:46:29.298292 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ca46fe00-eb61-4baf-81b1-a2b91c754a99" containerName="ceilometer-notification-agent" containerID="cri-o://b0520efbfddb0b37fbb7a65afbe9383817ccf9f6ae11082d2fa3e2c3a88b743f" gracePeriod=30 Dec 02 07:46:29 crc kubenswrapper[4895]: I1202 07:46:29.298177 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ca46fe00-eb61-4baf-81b1-a2b91c754a99" containerName="ceilometer-central-agent" containerID="cri-o://8a759c6911d74b1f1d0481259494f6c68447fd16d6ae68cb602fe2ebde521347" gracePeriod=30 Dec 02 07:46:29 crc kubenswrapper[4895]: I1202 07:46:29.343414 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.618516249 podStartE2EDuration="5.343385909s" podCreationTimestamp="2025-12-02 07:46:24 +0000 UTC" firstStartedPulling="2025-12-02 07:46:25.329165116 +0000 UTC m=+1396.500024729" lastFinishedPulling="2025-12-02 07:46:29.054034776 +0000 UTC m=+1400.224894389" observedRunningTime="2025-12-02 07:46:29.325114636 +0000 UTC m=+1400.495974259" watchObservedRunningTime="2025-12-02 07:46:29.343385909 +0000 UTC m=+1400.514245522" Dec 02 07:46:30 crc kubenswrapper[4895]: I1202 07:46:30.322592 4895 generic.go:334] "Generic (PLEG): container finished" podID="ca46fe00-eb61-4baf-81b1-a2b91c754a99" containerID="6ef5e37085909aad803297f2e65887d60d3b2a7265ec5b2edec0715738e2c133" exitCode=2 Dec 02 07:46:30 crc kubenswrapper[4895]: I1202 07:46:30.322636 4895 generic.go:334] "Generic (PLEG): container finished" podID="ca46fe00-eb61-4baf-81b1-a2b91c754a99" containerID="b0520efbfddb0b37fbb7a65afbe9383817ccf9f6ae11082d2fa3e2c3a88b743f" exitCode=0 Dec 02 07:46:30 crc 
kubenswrapper[4895]: I1202 07:46:30.322639 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ca46fe00-eb61-4baf-81b1-a2b91c754a99","Type":"ContainerDied","Data":"6ef5e37085909aad803297f2e65887d60d3b2a7265ec5b2edec0715738e2c133"} Dec 02 07:46:30 crc kubenswrapper[4895]: I1202 07:46:30.322698 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ca46fe00-eb61-4baf-81b1-a2b91c754a99","Type":"ContainerDied","Data":"b0520efbfddb0b37fbb7a65afbe9383817ccf9f6ae11082d2fa3e2c3a88b743f"} Dec 02 07:46:30 crc kubenswrapper[4895]: I1202 07:46:30.409108 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 02 07:46:30 crc kubenswrapper[4895]: I1202 07:46:30.409185 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 02 07:46:30 crc kubenswrapper[4895]: I1202 07:46:30.448470 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 02 07:46:30 crc kubenswrapper[4895]: I1202 07:46:30.452868 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 02 07:46:30 crc kubenswrapper[4895]: I1202 07:46:30.742267 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-chxlc"] Dec 02 07:46:30 crc kubenswrapper[4895]: E1202 07:46:30.742916 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb4ea5bf-abb5-4fc6-887a-46f19eee6493" containerName="mariadb-database-create" Dec 02 07:46:30 crc kubenswrapper[4895]: I1202 07:46:30.742947 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb4ea5bf-abb5-4fc6-887a-46f19eee6493" containerName="mariadb-database-create" Dec 02 07:46:30 crc kubenswrapper[4895]: E1202 07:46:30.742979 4895 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="e0df612e-785b-404d-b9ef-21c1ee57b14a" containerName="mariadb-database-create" Dec 02 07:46:30 crc kubenswrapper[4895]: I1202 07:46:30.742989 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0df612e-785b-404d-b9ef-21c1ee57b14a" containerName="mariadb-database-create" Dec 02 07:46:30 crc kubenswrapper[4895]: E1202 07:46:30.743027 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="393335e8-25d7-4364-ae52-eab9ac0d3fa0" containerName="mariadb-account-create-update" Dec 02 07:46:30 crc kubenswrapper[4895]: I1202 07:46:30.743041 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="393335e8-25d7-4364-ae52-eab9ac0d3fa0" containerName="mariadb-account-create-update" Dec 02 07:46:30 crc kubenswrapper[4895]: E1202 07:46:30.743063 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea719270-c425-4d8b-8717-6c47a5556302" containerName="mariadb-account-create-update" Dec 02 07:46:30 crc kubenswrapper[4895]: I1202 07:46:30.743076 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea719270-c425-4d8b-8717-6c47a5556302" containerName="mariadb-account-create-update" Dec 02 07:46:30 crc kubenswrapper[4895]: E1202 07:46:30.743101 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68666f08-2df0-4f46-a22c-9f33cfb65732" containerName="mariadb-database-create" Dec 02 07:46:30 crc kubenswrapper[4895]: I1202 07:46:30.743111 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="68666f08-2df0-4f46-a22c-9f33cfb65732" containerName="mariadb-database-create" Dec 02 07:46:30 crc kubenswrapper[4895]: I1202 07:46:30.757419 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="68666f08-2df0-4f46-a22c-9f33cfb65732" containerName="mariadb-database-create" Dec 02 07:46:30 crc kubenswrapper[4895]: I1202 07:46:30.757505 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea719270-c425-4d8b-8717-6c47a5556302" containerName="mariadb-account-create-update" 
Dec 02 07:46:30 crc kubenswrapper[4895]: I1202 07:46:30.757537 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="393335e8-25d7-4364-ae52-eab9ac0d3fa0" containerName="mariadb-account-create-update" Dec 02 07:46:30 crc kubenswrapper[4895]: I1202 07:46:30.757570 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb4ea5bf-abb5-4fc6-887a-46f19eee6493" containerName="mariadb-database-create" Dec 02 07:46:30 crc kubenswrapper[4895]: I1202 07:46:30.757615 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0df612e-785b-404d-b9ef-21c1ee57b14a" containerName="mariadb-database-create" Dec 02 07:46:30 crc kubenswrapper[4895]: I1202 07:46:30.760187 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-chxlc" Dec 02 07:46:30 crc kubenswrapper[4895]: I1202 07:46:30.768263 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-qp9rm" Dec 02 07:46:30 crc kubenswrapper[4895]: I1202 07:46:30.768569 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 02 07:46:30 crc kubenswrapper[4895]: I1202 07:46:30.779978 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Dec 02 07:46:30 crc kubenswrapper[4895]: I1202 07:46:30.790569 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-chxlc"] Dec 02 07:46:30 crc kubenswrapper[4895]: I1202 07:46:30.876447 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0ad0597-da06-43ca-bcbb-03eb78fb8b53-scripts\") pod \"nova-cell0-conductor-db-sync-chxlc\" (UID: \"f0ad0597-da06-43ca-bcbb-03eb78fb8b53\") " pod="openstack/nova-cell0-conductor-db-sync-chxlc" Dec 02 07:46:30 crc kubenswrapper[4895]: I1202 07:46:30.876631 4895 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0ad0597-da06-43ca-bcbb-03eb78fb8b53-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-chxlc\" (UID: \"f0ad0597-da06-43ca-bcbb-03eb78fb8b53\") " pod="openstack/nova-cell0-conductor-db-sync-chxlc" Dec 02 07:46:30 crc kubenswrapper[4895]: I1202 07:46:30.876767 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbqvt\" (UniqueName: \"kubernetes.io/projected/f0ad0597-da06-43ca-bcbb-03eb78fb8b53-kube-api-access-lbqvt\") pod \"nova-cell0-conductor-db-sync-chxlc\" (UID: \"f0ad0597-da06-43ca-bcbb-03eb78fb8b53\") " pod="openstack/nova-cell0-conductor-db-sync-chxlc" Dec 02 07:46:30 crc kubenswrapper[4895]: I1202 07:46:30.876848 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0ad0597-da06-43ca-bcbb-03eb78fb8b53-config-data\") pod \"nova-cell0-conductor-db-sync-chxlc\" (UID: \"f0ad0597-da06-43ca-bcbb-03eb78fb8b53\") " pod="openstack/nova-cell0-conductor-db-sync-chxlc" Dec 02 07:46:30 crc kubenswrapper[4895]: I1202 07:46:30.991316 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0ad0597-da06-43ca-bcbb-03eb78fb8b53-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-chxlc\" (UID: \"f0ad0597-da06-43ca-bcbb-03eb78fb8b53\") " pod="openstack/nova-cell0-conductor-db-sync-chxlc" Dec 02 07:46:30 crc kubenswrapper[4895]: I1202 07:46:30.991492 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lbqvt\" (UniqueName: \"kubernetes.io/projected/f0ad0597-da06-43ca-bcbb-03eb78fb8b53-kube-api-access-lbqvt\") pod \"nova-cell0-conductor-db-sync-chxlc\" (UID: \"f0ad0597-da06-43ca-bcbb-03eb78fb8b53\") " 
pod="openstack/nova-cell0-conductor-db-sync-chxlc" Dec 02 07:46:30 crc kubenswrapper[4895]: I1202 07:46:30.991574 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0ad0597-da06-43ca-bcbb-03eb78fb8b53-config-data\") pod \"nova-cell0-conductor-db-sync-chxlc\" (UID: \"f0ad0597-da06-43ca-bcbb-03eb78fb8b53\") " pod="openstack/nova-cell0-conductor-db-sync-chxlc" Dec 02 07:46:30 crc kubenswrapper[4895]: I1202 07:46:30.991628 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0ad0597-da06-43ca-bcbb-03eb78fb8b53-scripts\") pod \"nova-cell0-conductor-db-sync-chxlc\" (UID: \"f0ad0597-da06-43ca-bcbb-03eb78fb8b53\") " pod="openstack/nova-cell0-conductor-db-sync-chxlc" Dec 02 07:46:31 crc kubenswrapper[4895]: I1202 07:46:31.005056 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0ad0597-da06-43ca-bcbb-03eb78fb8b53-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-chxlc\" (UID: \"f0ad0597-da06-43ca-bcbb-03eb78fb8b53\") " pod="openstack/nova-cell0-conductor-db-sync-chxlc" Dec 02 07:46:31 crc kubenswrapper[4895]: I1202 07:46:31.009426 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0ad0597-da06-43ca-bcbb-03eb78fb8b53-scripts\") pod \"nova-cell0-conductor-db-sync-chxlc\" (UID: \"f0ad0597-da06-43ca-bcbb-03eb78fb8b53\") " pod="openstack/nova-cell0-conductor-db-sync-chxlc" Dec 02 07:46:31 crc kubenswrapper[4895]: I1202 07:46:31.010781 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0ad0597-da06-43ca-bcbb-03eb78fb8b53-config-data\") pod \"nova-cell0-conductor-db-sync-chxlc\" (UID: \"f0ad0597-da06-43ca-bcbb-03eb78fb8b53\") " pod="openstack/nova-cell0-conductor-db-sync-chxlc" Dec 02 
07:46:31 crc kubenswrapper[4895]: I1202 07:46:31.024405 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbqvt\" (UniqueName: \"kubernetes.io/projected/f0ad0597-da06-43ca-bcbb-03eb78fb8b53-kube-api-access-lbqvt\") pod \"nova-cell0-conductor-db-sync-chxlc\" (UID: \"f0ad0597-da06-43ca-bcbb-03eb78fb8b53\") " pod="openstack/nova-cell0-conductor-db-sync-chxlc" Dec 02 07:46:31 crc kubenswrapper[4895]: I1202 07:46:31.117270 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-chxlc" Dec 02 07:46:31 crc kubenswrapper[4895]: I1202 07:46:31.338628 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 02 07:46:31 crc kubenswrapper[4895]: I1202 07:46:31.339415 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 02 07:46:31 crc kubenswrapper[4895]: I1202 07:46:31.670886 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-chxlc"] Dec 02 07:46:31 crc kubenswrapper[4895]: W1202 07:46:31.673389 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf0ad0597_da06_43ca_bcbb_03eb78fb8b53.slice/crio-1a323f8f67008baf4f8482e15c62d834012f55f63242d088a35d6da61c5c1d00 WatchSource:0}: Error finding container 1a323f8f67008baf4f8482e15c62d834012f55f63242d088a35d6da61c5c1d00: Status 404 returned error can't find the container with id 1a323f8f67008baf4f8482e15c62d834012f55f63242d088a35d6da61c5c1d00 Dec 02 07:46:32 crc kubenswrapper[4895]: I1202 07:46:32.349490 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-chxlc" event={"ID":"f0ad0597-da06-43ca-bcbb-03eb78fb8b53","Type":"ContainerStarted","Data":"1a323f8f67008baf4f8482e15c62d834012f55f63242d088a35d6da61c5c1d00"} Dec 02 07:46:32 crc 
kubenswrapper[4895]: I1202 07:46:32.673905 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 02 07:46:32 crc kubenswrapper[4895]: I1202 07:46:32.674251 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 02 07:46:32 crc kubenswrapper[4895]: I1202 07:46:32.724071 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 02 07:46:32 crc kubenswrapper[4895]: I1202 07:46:32.742646 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 02 07:46:33 crc kubenswrapper[4895]: I1202 07:46:33.367420 4895 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 02 07:46:33 crc kubenswrapper[4895]: I1202 07:46:33.367451 4895 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 02 07:46:33 crc kubenswrapper[4895]: I1202 07:46:33.367886 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 02 07:46:33 crc kubenswrapper[4895]: I1202 07:46:33.367946 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 02 07:46:33 crc kubenswrapper[4895]: I1202 07:46:33.469404 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 02 07:46:33 crc kubenswrapper[4895]: I1202 07:46:33.476376 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 02 07:46:34 crc kubenswrapper[4895]: I1202 07:46:34.386830 4895 generic.go:334] "Generic (PLEG): container finished" podID="ca46fe00-eb61-4baf-81b1-a2b91c754a99" containerID="8a759c6911d74b1f1d0481259494f6c68447fd16d6ae68cb602fe2ebde521347" exitCode=0 Dec 02 07:46:34 crc 
kubenswrapper[4895]: I1202 07:46:34.386907 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ca46fe00-eb61-4baf-81b1-a2b91c754a99","Type":"ContainerDied","Data":"8a759c6911d74b1f1d0481259494f6c68447fd16d6ae68cb602fe2ebde521347"} Dec 02 07:46:35 crc kubenswrapper[4895]: I1202 07:46:35.442294 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 02 07:46:35 crc kubenswrapper[4895]: I1202 07:46:35.442734 4895 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 02 07:46:35 crc kubenswrapper[4895]: I1202 07:46:35.624871 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 02 07:46:41 crc kubenswrapper[4895]: I1202 07:46:41.461118 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-chxlc" event={"ID":"f0ad0597-da06-43ca-bcbb-03eb78fb8b53","Type":"ContainerStarted","Data":"b1a1c5160c9558d73203d10d42adf88f8ff05038c8885766513269062a2ce0c0"} Dec 02 07:46:41 crc kubenswrapper[4895]: I1202 07:46:41.493716 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-chxlc" podStartSLOduration=2.771753715 podStartE2EDuration="11.493684938s" podCreationTimestamp="2025-12-02 07:46:30 +0000 UTC" firstStartedPulling="2025-12-02 07:46:31.676773558 +0000 UTC m=+1402.847633171" lastFinishedPulling="2025-12-02 07:46:40.398704781 +0000 UTC m=+1411.569564394" observedRunningTime="2025-12-02 07:46:41.48467444 +0000 UTC m=+1412.655534053" watchObservedRunningTime="2025-12-02 07:46:41.493684938 +0000 UTC m=+1412.664544571" Dec 02 07:46:51 crc kubenswrapper[4895]: I1202 07:46:51.569398 4895 generic.go:334] "Generic (PLEG): container finished" podID="f0ad0597-da06-43ca-bcbb-03eb78fb8b53" containerID="b1a1c5160c9558d73203d10d42adf88f8ff05038c8885766513269062a2ce0c0" exitCode=0 Dec 02 
07:46:51 crc kubenswrapper[4895]: I1202 07:46:51.569580 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-chxlc" event={"ID":"f0ad0597-da06-43ca-bcbb-03eb78fb8b53","Type":"ContainerDied","Data":"b1a1c5160c9558d73203d10d42adf88f8ff05038c8885766513269062a2ce0c0"} Dec 02 07:46:52 crc kubenswrapper[4895]: I1202 07:46:52.981240 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-chxlc" Dec 02 07:46:53 crc kubenswrapper[4895]: I1202 07:46:53.105799 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0ad0597-da06-43ca-bcbb-03eb78fb8b53-combined-ca-bundle\") pod \"f0ad0597-da06-43ca-bcbb-03eb78fb8b53\" (UID: \"f0ad0597-da06-43ca-bcbb-03eb78fb8b53\") " Dec 02 07:46:53 crc kubenswrapper[4895]: I1202 07:46:53.105935 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0ad0597-da06-43ca-bcbb-03eb78fb8b53-config-data\") pod \"f0ad0597-da06-43ca-bcbb-03eb78fb8b53\" (UID: \"f0ad0597-da06-43ca-bcbb-03eb78fb8b53\") " Dec 02 07:46:53 crc kubenswrapper[4895]: I1202 07:46:53.106027 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0ad0597-da06-43ca-bcbb-03eb78fb8b53-scripts\") pod \"f0ad0597-da06-43ca-bcbb-03eb78fb8b53\" (UID: \"f0ad0597-da06-43ca-bcbb-03eb78fb8b53\") " Dec 02 07:46:53 crc kubenswrapper[4895]: I1202 07:46:53.106091 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lbqvt\" (UniqueName: \"kubernetes.io/projected/f0ad0597-da06-43ca-bcbb-03eb78fb8b53-kube-api-access-lbqvt\") pod \"f0ad0597-da06-43ca-bcbb-03eb78fb8b53\" (UID: \"f0ad0597-da06-43ca-bcbb-03eb78fb8b53\") " Dec 02 07:46:53 crc kubenswrapper[4895]: I1202 07:46:53.115173 4895 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0ad0597-da06-43ca-bcbb-03eb78fb8b53-scripts" (OuterVolumeSpecName: "scripts") pod "f0ad0597-da06-43ca-bcbb-03eb78fb8b53" (UID: "f0ad0597-da06-43ca-bcbb-03eb78fb8b53"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:46:53 crc kubenswrapper[4895]: I1202 07:46:53.115381 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0ad0597-da06-43ca-bcbb-03eb78fb8b53-kube-api-access-lbqvt" (OuterVolumeSpecName: "kube-api-access-lbqvt") pod "f0ad0597-da06-43ca-bcbb-03eb78fb8b53" (UID: "f0ad0597-da06-43ca-bcbb-03eb78fb8b53"). InnerVolumeSpecName "kube-api-access-lbqvt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:46:53 crc kubenswrapper[4895]: I1202 07:46:53.142833 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0ad0597-da06-43ca-bcbb-03eb78fb8b53-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f0ad0597-da06-43ca-bcbb-03eb78fb8b53" (UID: "f0ad0597-da06-43ca-bcbb-03eb78fb8b53"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:46:53 crc kubenswrapper[4895]: I1202 07:46:53.157622 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0ad0597-da06-43ca-bcbb-03eb78fb8b53-config-data" (OuterVolumeSpecName: "config-data") pod "f0ad0597-da06-43ca-bcbb-03eb78fb8b53" (UID: "f0ad0597-da06-43ca-bcbb-03eb78fb8b53"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:46:53 crc kubenswrapper[4895]: I1202 07:46:53.207988 4895 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0ad0597-da06-43ca-bcbb-03eb78fb8b53-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:53 crc kubenswrapper[4895]: I1202 07:46:53.208038 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lbqvt\" (UniqueName: \"kubernetes.io/projected/f0ad0597-da06-43ca-bcbb-03eb78fb8b53-kube-api-access-lbqvt\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:53 crc kubenswrapper[4895]: I1202 07:46:53.208053 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0ad0597-da06-43ca-bcbb-03eb78fb8b53-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:53 crc kubenswrapper[4895]: I1202 07:46:53.208063 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0ad0597-da06-43ca-bcbb-03eb78fb8b53-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:53 crc kubenswrapper[4895]: I1202 07:46:53.597947 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-chxlc" event={"ID":"f0ad0597-da06-43ca-bcbb-03eb78fb8b53","Type":"ContainerDied","Data":"1a323f8f67008baf4f8482e15c62d834012f55f63242d088a35d6da61c5c1d00"} Dec 02 07:46:53 crc kubenswrapper[4895]: I1202 07:46:53.598018 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1a323f8f67008baf4f8482e15c62d834012f55f63242d088a35d6da61c5c1d00" Dec 02 07:46:53 crc kubenswrapper[4895]: I1202 07:46:53.598052 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-chxlc" Dec 02 07:46:53 crc kubenswrapper[4895]: I1202 07:46:53.771388 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 02 07:46:53 crc kubenswrapper[4895]: E1202 07:46:53.773481 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0ad0597-da06-43ca-bcbb-03eb78fb8b53" containerName="nova-cell0-conductor-db-sync" Dec 02 07:46:53 crc kubenswrapper[4895]: I1202 07:46:53.773519 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0ad0597-da06-43ca-bcbb-03eb78fb8b53" containerName="nova-cell0-conductor-db-sync" Dec 02 07:46:53 crc kubenswrapper[4895]: I1202 07:46:53.774015 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0ad0597-da06-43ca-bcbb-03eb78fb8b53" containerName="nova-cell0-conductor-db-sync" Dec 02 07:46:53 crc kubenswrapper[4895]: I1202 07:46:53.781012 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 02 07:46:53 crc kubenswrapper[4895]: I1202 07:46:53.785910 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-qp9rm" Dec 02 07:46:53 crc kubenswrapper[4895]: I1202 07:46:53.813890 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 02 07:46:53 crc kubenswrapper[4895]: I1202 07:46:53.828192 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 02 07:46:53 crc kubenswrapper[4895]: I1202 07:46:53.932430 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87kr4\" (UniqueName: \"kubernetes.io/projected/65a02963-abb5-4f29-aa82-88ba6f859a00-kube-api-access-87kr4\") pod \"nova-cell0-conductor-0\" (UID: \"65a02963-abb5-4f29-aa82-88ba6f859a00\") " pod="openstack/nova-cell0-conductor-0" Dec 02 07:46:53 crc 
kubenswrapper[4895]: I1202 07:46:53.932505 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65a02963-abb5-4f29-aa82-88ba6f859a00-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"65a02963-abb5-4f29-aa82-88ba6f859a00\") " pod="openstack/nova-cell0-conductor-0" Dec 02 07:46:53 crc kubenswrapper[4895]: I1202 07:46:53.932629 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65a02963-abb5-4f29-aa82-88ba6f859a00-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"65a02963-abb5-4f29-aa82-88ba6f859a00\") " pod="openstack/nova-cell0-conductor-0" Dec 02 07:46:54 crc kubenswrapper[4895]: I1202 07:46:54.035606 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65a02963-abb5-4f29-aa82-88ba6f859a00-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"65a02963-abb5-4f29-aa82-88ba6f859a00\") " pod="openstack/nova-cell0-conductor-0" Dec 02 07:46:54 crc kubenswrapper[4895]: I1202 07:46:54.037006 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87kr4\" (UniqueName: \"kubernetes.io/projected/65a02963-abb5-4f29-aa82-88ba6f859a00-kube-api-access-87kr4\") pod \"nova-cell0-conductor-0\" (UID: \"65a02963-abb5-4f29-aa82-88ba6f859a00\") " pod="openstack/nova-cell0-conductor-0" Dec 02 07:46:54 crc kubenswrapper[4895]: I1202 07:46:54.037140 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65a02963-abb5-4f29-aa82-88ba6f859a00-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"65a02963-abb5-4f29-aa82-88ba6f859a00\") " pod="openstack/nova-cell0-conductor-0" Dec 02 07:46:54 crc kubenswrapper[4895]: I1202 07:46:54.040660 4895 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65a02963-abb5-4f29-aa82-88ba6f859a00-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"65a02963-abb5-4f29-aa82-88ba6f859a00\") " pod="openstack/nova-cell0-conductor-0" Dec 02 07:46:54 crc kubenswrapper[4895]: I1202 07:46:54.042852 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65a02963-abb5-4f29-aa82-88ba6f859a00-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"65a02963-abb5-4f29-aa82-88ba6f859a00\") " pod="openstack/nova-cell0-conductor-0" Dec 02 07:46:54 crc kubenswrapper[4895]: I1202 07:46:54.069173 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87kr4\" (UniqueName: \"kubernetes.io/projected/65a02963-abb5-4f29-aa82-88ba6f859a00-kube-api-access-87kr4\") pod \"nova-cell0-conductor-0\" (UID: \"65a02963-abb5-4f29-aa82-88ba6f859a00\") " pod="openstack/nova-cell0-conductor-0" Dec 02 07:46:54 crc kubenswrapper[4895]: I1202 07:46:54.147288 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 02 07:46:54 crc kubenswrapper[4895]: I1202 07:46:54.621488 4895 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="ca46fe00-eb61-4baf-81b1-a2b91c754a99" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Dec 02 07:46:54 crc kubenswrapper[4895]: I1202 07:46:54.678521 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 02 07:46:54 crc kubenswrapper[4895]: W1202 07:46:54.682798 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod65a02963_abb5_4f29_aa82_88ba6f859a00.slice/crio-01c44df798ca616a54cc3d96ce4d25a37d45d50a80f4a2d2b8f986f0c1428c2f WatchSource:0}: Error finding container 01c44df798ca616a54cc3d96ce4d25a37d45d50a80f4a2d2b8f986f0c1428c2f: Status 404 returned error can't find the container with id 01c44df798ca616a54cc3d96ce4d25a37d45d50a80f4a2d2b8f986f0c1428c2f Dec 02 07:46:55 crc kubenswrapper[4895]: I1202 07:46:55.623078 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"65a02963-abb5-4f29-aa82-88ba6f859a00","Type":"ContainerStarted","Data":"e4a7fe9750c9bc6c97a65a057cac01332e8866edaece81d177811b186bff46cd"} Dec 02 07:46:55 crc kubenswrapper[4895]: I1202 07:46:55.623643 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Dec 02 07:46:55 crc kubenswrapper[4895]: I1202 07:46:55.623663 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"65a02963-abb5-4f29-aa82-88ba6f859a00","Type":"ContainerStarted","Data":"01c44df798ca616a54cc3d96ce4d25a37d45d50a80f4a2d2b8f986f0c1428c2f"} Dec 02 07:46:55 crc kubenswrapper[4895]: I1202 07:46:55.651190 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.651167046 podStartE2EDuration="2.651167046s" podCreationTimestamp="2025-12-02 07:46:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:46:55.643013154 +0000 UTC m=+1426.813872767" watchObservedRunningTime="2025-12-02 07:46:55.651167046 +0000 UTC m=+1426.822026669" Dec 02 07:46:59 crc kubenswrapper[4895]: I1202 07:46:59.197828 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Dec 02 07:46:59 crc kubenswrapper[4895]: I1202 07:46:59.694941 4895 generic.go:334] "Generic (PLEG): container finished" podID="ca46fe00-eb61-4baf-81b1-a2b91c754a99" containerID="6ca835b0b75e3696527a82637f8aa060b70b3d711663ec42e2f269ae07704a6b" exitCode=137 Dec 02 07:46:59 crc kubenswrapper[4895]: I1202 07:46:59.695454 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ca46fe00-eb61-4baf-81b1-a2b91c754a99","Type":"ContainerDied","Data":"6ca835b0b75e3696527a82637f8aa060b70b3d711663ec42e2f269ae07704a6b"} Dec 02 07:46:59 crc kubenswrapper[4895]: I1202 07:46:59.695506 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ca46fe00-eb61-4baf-81b1-a2b91c754a99","Type":"ContainerDied","Data":"54525466bbbcc974ddc2b6ffbb848ae2022d5f5e38c50fb140ead0d9aad2a629"} Dec 02 07:46:59 crc kubenswrapper[4895]: I1202 07:46:59.695524 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="54525466bbbcc974ddc2b6ffbb848ae2022d5f5e38c50fb140ead0d9aad2a629" Dec 02 07:46:59 crc kubenswrapper[4895]: I1202 07:46:59.735077 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-h9wgm"] Dec 02 07:46:59 crc kubenswrapper[4895]: I1202 07:46:59.736911 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-h9wgm" Dec 02 07:46:59 crc kubenswrapper[4895]: I1202 07:46:59.737702 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 02 07:46:59 crc kubenswrapper[4895]: I1202 07:46:59.738605 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Dec 02 07:46:59 crc kubenswrapper[4895]: I1202 07:46:59.739924 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Dec 02 07:46:59 crc kubenswrapper[4895]: I1202 07:46:59.749680 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-h9wgm"] Dec 02 07:46:59 crc kubenswrapper[4895]: I1202 07:46:59.765317 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-km4tm"] Dec 02 07:46:59 crc kubenswrapper[4895]: E1202 07:46:59.765838 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca46fe00-eb61-4baf-81b1-a2b91c754a99" containerName="proxy-httpd" Dec 02 07:46:59 crc kubenswrapper[4895]: I1202 07:46:59.765856 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca46fe00-eb61-4baf-81b1-a2b91c754a99" containerName="proxy-httpd" Dec 02 07:46:59 crc kubenswrapper[4895]: E1202 07:46:59.765874 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca46fe00-eb61-4baf-81b1-a2b91c754a99" containerName="ceilometer-central-agent" Dec 02 07:46:59 crc kubenswrapper[4895]: I1202 07:46:59.765884 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca46fe00-eb61-4baf-81b1-a2b91c754a99" containerName="ceilometer-central-agent" Dec 02 07:46:59 crc kubenswrapper[4895]: E1202 07:46:59.765919 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca46fe00-eb61-4baf-81b1-a2b91c754a99" containerName="ceilometer-notification-agent" Dec 02 07:46:59 crc kubenswrapper[4895]: I1202 
07:46:59.765926 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca46fe00-eb61-4baf-81b1-a2b91c754a99" containerName="ceilometer-notification-agent" Dec 02 07:46:59 crc kubenswrapper[4895]: E1202 07:46:59.765940 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca46fe00-eb61-4baf-81b1-a2b91c754a99" containerName="sg-core" Dec 02 07:46:59 crc kubenswrapper[4895]: I1202 07:46:59.765946 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca46fe00-eb61-4baf-81b1-a2b91c754a99" containerName="sg-core" Dec 02 07:46:59 crc kubenswrapper[4895]: I1202 07:46:59.766133 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca46fe00-eb61-4baf-81b1-a2b91c754a99" containerName="proxy-httpd" Dec 02 07:46:59 crc kubenswrapper[4895]: I1202 07:46:59.766150 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca46fe00-eb61-4baf-81b1-a2b91c754a99" containerName="sg-core" Dec 02 07:46:59 crc kubenswrapper[4895]: I1202 07:46:59.766173 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca46fe00-eb61-4baf-81b1-a2b91c754a99" containerName="ceilometer-central-agent" Dec 02 07:46:59 crc kubenswrapper[4895]: I1202 07:46:59.766186 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca46fe00-eb61-4baf-81b1-a2b91c754a99" containerName="ceilometer-notification-agent" Dec 02 07:46:59 crc kubenswrapper[4895]: I1202 07:46:59.767981 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-km4tm" Dec 02 07:46:59 crc kubenswrapper[4895]: I1202 07:46:59.780583 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-km4tm"] Dec 02 07:46:59 crc kubenswrapper[4895]: I1202 07:46:59.883369 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ca46fe00-eb61-4baf-81b1-a2b91c754a99-run-httpd\") pod \"ca46fe00-eb61-4baf-81b1-a2b91c754a99\" (UID: \"ca46fe00-eb61-4baf-81b1-a2b91c754a99\") " Dec 02 07:46:59 crc kubenswrapper[4895]: I1202 07:46:59.883550 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q5pmm\" (UniqueName: \"kubernetes.io/projected/ca46fe00-eb61-4baf-81b1-a2b91c754a99-kube-api-access-q5pmm\") pod \"ca46fe00-eb61-4baf-81b1-a2b91c754a99\" (UID: \"ca46fe00-eb61-4baf-81b1-a2b91c754a99\") " Dec 02 07:46:59 crc kubenswrapper[4895]: I1202 07:46:59.883629 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ca46fe00-eb61-4baf-81b1-a2b91c754a99-log-httpd\") pod \"ca46fe00-eb61-4baf-81b1-a2b91c754a99\" (UID: \"ca46fe00-eb61-4baf-81b1-a2b91c754a99\") " Dec 02 07:46:59 crc kubenswrapper[4895]: I1202 07:46:59.883654 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca46fe00-eb61-4baf-81b1-a2b91c754a99-config-data\") pod \"ca46fe00-eb61-4baf-81b1-a2b91c754a99\" (UID: \"ca46fe00-eb61-4baf-81b1-a2b91c754a99\") " Dec 02 07:46:59 crc kubenswrapper[4895]: I1202 07:46:59.883690 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ca46fe00-eb61-4baf-81b1-a2b91c754a99-sg-core-conf-yaml\") pod \"ca46fe00-eb61-4baf-81b1-a2b91c754a99\" (UID: 
\"ca46fe00-eb61-4baf-81b1-a2b91c754a99\") " Dec 02 07:46:59 crc kubenswrapper[4895]: I1202 07:46:59.883713 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ca46fe00-eb61-4baf-81b1-a2b91c754a99-scripts\") pod \"ca46fe00-eb61-4baf-81b1-a2b91c754a99\" (UID: \"ca46fe00-eb61-4baf-81b1-a2b91c754a99\") " Dec 02 07:46:59 crc kubenswrapper[4895]: I1202 07:46:59.883775 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca46fe00-eb61-4baf-81b1-a2b91c754a99-combined-ca-bundle\") pod \"ca46fe00-eb61-4baf-81b1-a2b91c754a99\" (UID: \"ca46fe00-eb61-4baf-81b1-a2b91c754a99\") " Dec 02 07:46:59 crc kubenswrapper[4895]: I1202 07:46:59.884075 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjtlt\" (UniqueName: \"kubernetes.io/projected/4e5d3f70-f931-473a-af3c-e0858a46e311-kube-api-access-tjtlt\") pod \"nova-cell0-cell-mapping-h9wgm\" (UID: \"4e5d3f70-f931-473a-af3c-e0858a46e311\") " pod="openstack/nova-cell0-cell-mapping-h9wgm" Dec 02 07:46:59 crc kubenswrapper[4895]: I1202 07:46:59.884142 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e5d3f70-f931-473a-af3c-e0858a46e311-config-data\") pod \"nova-cell0-cell-mapping-h9wgm\" (UID: \"4e5d3f70-f931-473a-af3c-e0858a46e311\") " pod="openstack/nova-cell0-cell-mapping-h9wgm" Dec 02 07:46:59 crc kubenswrapper[4895]: I1202 07:46:59.884166 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4e5d3f70-f931-473a-af3c-e0858a46e311-scripts\") pod \"nova-cell0-cell-mapping-h9wgm\" (UID: \"4e5d3f70-f931-473a-af3c-e0858a46e311\") " pod="openstack/nova-cell0-cell-mapping-h9wgm" Dec 02 07:46:59 crc kubenswrapper[4895]: 
I1202 07:46:59.884182 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23d7d649-194e-4822-9ac2-9badcf844980-utilities\") pod \"redhat-operators-km4tm\" (UID: \"23d7d649-194e-4822-9ac2-9badcf844980\") " pod="openshift-marketplace/redhat-operators-km4tm" Dec 02 07:46:59 crc kubenswrapper[4895]: I1202 07:46:59.884206 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23d7d649-194e-4822-9ac2-9badcf844980-catalog-content\") pod \"redhat-operators-km4tm\" (UID: \"23d7d649-194e-4822-9ac2-9badcf844980\") " pod="openshift-marketplace/redhat-operators-km4tm" Dec 02 07:46:59 crc kubenswrapper[4895]: I1202 07:46:59.884262 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e5d3f70-f931-473a-af3c-e0858a46e311-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-h9wgm\" (UID: \"4e5d3f70-f931-473a-af3c-e0858a46e311\") " pod="openstack/nova-cell0-cell-mapping-h9wgm" Dec 02 07:46:59 crc kubenswrapper[4895]: I1202 07:46:59.884303 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppbcm\" (UniqueName: \"kubernetes.io/projected/23d7d649-194e-4822-9ac2-9badcf844980-kube-api-access-ppbcm\") pod \"redhat-operators-km4tm\" (UID: \"23d7d649-194e-4822-9ac2-9badcf844980\") " pod="openshift-marketplace/redhat-operators-km4tm" Dec 02 07:46:59 crc kubenswrapper[4895]: I1202 07:46:59.884407 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca46fe00-eb61-4baf-81b1-a2b91c754a99-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "ca46fe00-eb61-4baf-81b1-a2b91c754a99" (UID: "ca46fe00-eb61-4baf-81b1-a2b91c754a99"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:46:59 crc kubenswrapper[4895]: I1202 07:46:59.884730 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca46fe00-eb61-4baf-81b1-a2b91c754a99-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "ca46fe00-eb61-4baf-81b1-a2b91c754a99" (UID: "ca46fe00-eb61-4baf-81b1-a2b91c754a99"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:46:59 crc kubenswrapper[4895]: I1202 07:46:59.902501 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca46fe00-eb61-4baf-81b1-a2b91c754a99-kube-api-access-q5pmm" (OuterVolumeSpecName: "kube-api-access-q5pmm") pod "ca46fe00-eb61-4baf-81b1-a2b91c754a99" (UID: "ca46fe00-eb61-4baf-81b1-a2b91c754a99"). InnerVolumeSpecName "kube-api-access-q5pmm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:46:59 crc kubenswrapper[4895]: I1202 07:46:59.902631 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca46fe00-eb61-4baf-81b1-a2b91c754a99-scripts" (OuterVolumeSpecName: "scripts") pod "ca46fe00-eb61-4baf-81b1-a2b91c754a99" (UID: "ca46fe00-eb61-4baf-81b1-a2b91c754a99"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:46:59 crc kubenswrapper[4895]: I1202 07:46:59.988398 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e5d3f70-f931-473a-af3c-e0858a46e311-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-h9wgm\" (UID: \"4e5d3f70-f931-473a-af3c-e0858a46e311\") " pod="openstack/nova-cell0-cell-mapping-h9wgm" Dec 02 07:46:59 crc kubenswrapper[4895]: I1202 07:46:59.988523 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ppbcm\" (UniqueName: \"kubernetes.io/projected/23d7d649-194e-4822-9ac2-9badcf844980-kube-api-access-ppbcm\") pod \"redhat-operators-km4tm\" (UID: \"23d7d649-194e-4822-9ac2-9badcf844980\") " pod="openshift-marketplace/redhat-operators-km4tm" Dec 02 07:46:59 crc kubenswrapper[4895]: I1202 07:46:59.988671 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjtlt\" (UniqueName: \"kubernetes.io/projected/4e5d3f70-f931-473a-af3c-e0858a46e311-kube-api-access-tjtlt\") pod \"nova-cell0-cell-mapping-h9wgm\" (UID: \"4e5d3f70-f931-473a-af3c-e0858a46e311\") " pod="openstack/nova-cell0-cell-mapping-h9wgm" Dec 02 07:46:59 crc kubenswrapper[4895]: I1202 07:46:59.988756 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e5d3f70-f931-473a-af3c-e0858a46e311-config-data\") pod \"nova-cell0-cell-mapping-h9wgm\" (UID: \"4e5d3f70-f931-473a-af3c-e0858a46e311\") " pod="openstack/nova-cell0-cell-mapping-h9wgm" Dec 02 07:46:59 crc kubenswrapper[4895]: I1202 07:46:59.988789 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23d7d649-194e-4822-9ac2-9badcf844980-utilities\") pod \"redhat-operators-km4tm\" (UID: \"23d7d649-194e-4822-9ac2-9badcf844980\") " 
pod="openshift-marketplace/redhat-operators-km4tm" Dec 02 07:46:59 crc kubenswrapper[4895]: I1202 07:46:59.988817 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4e5d3f70-f931-473a-af3c-e0858a46e311-scripts\") pod \"nova-cell0-cell-mapping-h9wgm\" (UID: \"4e5d3f70-f931-473a-af3c-e0858a46e311\") " pod="openstack/nova-cell0-cell-mapping-h9wgm" Dec 02 07:46:59 crc kubenswrapper[4895]: I1202 07:46:59.988853 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23d7d649-194e-4822-9ac2-9badcf844980-catalog-content\") pod \"redhat-operators-km4tm\" (UID: \"23d7d649-194e-4822-9ac2-9badcf844980\") " pod="openshift-marketplace/redhat-operators-km4tm" Dec 02 07:46:59 crc kubenswrapper[4895]: I1202 07:46:59.988964 4895 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ca46fe00-eb61-4baf-81b1-a2b91c754a99-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:59 crc kubenswrapper[4895]: I1202 07:46:59.988981 4895 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ca46fe00-eb61-4baf-81b1-a2b91c754a99-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:59 crc kubenswrapper[4895]: I1202 07:46:59.988993 4895 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ca46fe00-eb61-4baf-81b1-a2b91c754a99-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:59 crc kubenswrapper[4895]: I1202 07:46:59.989004 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q5pmm\" (UniqueName: \"kubernetes.io/projected/ca46fe00-eb61-4baf-81b1-a2b91c754a99-kube-api-access-q5pmm\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:59 crc kubenswrapper[4895]: I1202 07:46:59.989570 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23d7d649-194e-4822-9ac2-9badcf844980-catalog-content\") pod \"redhat-operators-km4tm\" (UID: \"23d7d649-194e-4822-9ac2-9badcf844980\") " pod="openshift-marketplace/redhat-operators-km4tm" Dec 02 07:47:00 crc kubenswrapper[4895]: I1202 07:46:59.992593 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e5d3f70-f931-473a-af3c-e0858a46e311-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-h9wgm\" (UID: \"4e5d3f70-f931-473a-af3c-e0858a46e311\") " pod="openstack/nova-cell0-cell-mapping-h9wgm" Dec 02 07:47:00 crc kubenswrapper[4895]: I1202 07:46:59.992948 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23d7d649-194e-4822-9ac2-9badcf844980-utilities\") pod \"redhat-operators-km4tm\" (UID: \"23d7d649-194e-4822-9ac2-9badcf844980\") " pod="openshift-marketplace/redhat-operators-km4tm" Dec 02 07:47:00 crc kubenswrapper[4895]: I1202 07:46:59.995795 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4e5d3f70-f931-473a-af3c-e0858a46e311-scripts\") pod \"nova-cell0-cell-mapping-h9wgm\" (UID: \"4e5d3f70-f931-473a-af3c-e0858a46e311\") " pod="openstack/nova-cell0-cell-mapping-h9wgm" Dec 02 07:47:00 crc kubenswrapper[4895]: I1202 07:46:59.996389 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca46fe00-eb61-4baf-81b1-a2b91c754a99-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "ca46fe00-eb61-4baf-81b1-a2b91c754a99" (UID: "ca46fe00-eb61-4baf-81b1-a2b91c754a99"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:47:00 crc kubenswrapper[4895]: I1202 07:46:59.998717 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 02 07:47:00 crc kubenswrapper[4895]: I1202 07:47:00.002693 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e5d3f70-f931-473a-af3c-e0858a46e311-config-data\") pod \"nova-cell0-cell-mapping-h9wgm\" (UID: \"4e5d3f70-f931-473a-af3c-e0858a46e311\") " pod="openstack/nova-cell0-cell-mapping-h9wgm" Dec 02 07:47:00 crc kubenswrapper[4895]: I1202 07:47:00.009688 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 02 07:47:00 crc kubenswrapper[4895]: I1202 07:47:00.022513 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 02 07:47:00 crc kubenswrapper[4895]: I1202 07:47:00.027841 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppbcm\" (UniqueName: \"kubernetes.io/projected/23d7d649-194e-4822-9ac2-9badcf844980-kube-api-access-ppbcm\") pod \"redhat-operators-km4tm\" (UID: \"23d7d649-194e-4822-9ac2-9badcf844980\") " pod="openshift-marketplace/redhat-operators-km4tm" Dec 02 07:47:00 crc kubenswrapper[4895]: I1202 07:47:00.053108 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjtlt\" (UniqueName: \"kubernetes.io/projected/4e5d3f70-f931-473a-af3c-e0858a46e311-kube-api-access-tjtlt\") pod \"nova-cell0-cell-mapping-h9wgm\" (UID: \"4e5d3f70-f931-473a-af3c-e0858a46e311\") " pod="openstack/nova-cell0-cell-mapping-h9wgm" Dec 02 07:47:00 crc kubenswrapper[4895]: I1202 07:47:00.097242 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/087ae923-798c-4c9d-bdbb-43d64df1710a-config-data\") pod \"nova-api-0\" (UID: 
\"087ae923-798c-4c9d-bdbb-43d64df1710a\") " pod="openstack/nova-api-0" Dec 02 07:47:00 crc kubenswrapper[4895]: I1202 07:47:00.097351 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/087ae923-798c-4c9d-bdbb-43d64df1710a-logs\") pod \"nova-api-0\" (UID: \"087ae923-798c-4c9d-bdbb-43d64df1710a\") " pod="openstack/nova-api-0" Dec 02 07:47:00 crc kubenswrapper[4895]: I1202 07:47:00.097441 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62ssm\" (UniqueName: \"kubernetes.io/projected/087ae923-798c-4c9d-bdbb-43d64df1710a-kube-api-access-62ssm\") pod \"nova-api-0\" (UID: \"087ae923-798c-4c9d-bdbb-43d64df1710a\") " pod="openstack/nova-api-0" Dec 02 07:47:00 crc kubenswrapper[4895]: I1202 07:47:00.097478 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/087ae923-798c-4c9d-bdbb-43d64df1710a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"087ae923-798c-4c9d-bdbb-43d64df1710a\") " pod="openstack/nova-api-0" Dec 02 07:47:00 crc kubenswrapper[4895]: I1202 07:47:00.097538 4895 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ca46fe00-eb61-4baf-81b1-a2b91c754a99-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 02 07:47:00 crc kubenswrapper[4895]: I1202 07:47:00.102981 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 02 07:47:00 crc kubenswrapper[4895]: I1202 07:47:00.113761 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca46fe00-eb61-4baf-81b1-a2b91c754a99-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ca46fe00-eb61-4baf-81b1-a2b91c754a99" (UID: "ca46fe00-eb61-4baf-81b1-a2b91c754a99"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:47:00 crc kubenswrapper[4895]: I1202 07:47:00.122840 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-km4tm" Dec 02 07:47:00 crc kubenswrapper[4895]: I1202 07:47:00.130379 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 07:47:00 crc kubenswrapper[4895]: I1202 07:47:00.141619 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 02 07:47:00 crc kubenswrapper[4895]: I1202 07:47:00.156410 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 02 07:47:00 crc kubenswrapper[4895]: I1202 07:47:00.197311 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 07:47:00 crc kubenswrapper[4895]: I1202 07:47:00.200311 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-62ssm\" (UniqueName: \"kubernetes.io/projected/087ae923-798c-4c9d-bdbb-43d64df1710a-kube-api-access-62ssm\") pod \"nova-api-0\" (UID: \"087ae923-798c-4c9d-bdbb-43d64df1710a\") " pod="openstack/nova-api-0" Dec 02 07:47:00 crc kubenswrapper[4895]: I1202 07:47:00.200392 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/087ae923-798c-4c9d-bdbb-43d64df1710a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"087ae923-798c-4c9d-bdbb-43d64df1710a\") " pod="openstack/nova-api-0" Dec 02 07:47:00 crc kubenswrapper[4895]: I1202 07:47:00.200487 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/087ae923-798c-4c9d-bdbb-43d64df1710a-config-data\") pod \"nova-api-0\" (UID: \"087ae923-798c-4c9d-bdbb-43d64df1710a\") " pod="openstack/nova-api-0" Dec 02 07:47:00 crc kubenswrapper[4895]: 
I1202 07:47:00.200538 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/087ae923-798c-4c9d-bdbb-43d64df1710a-logs\") pod \"nova-api-0\" (UID: \"087ae923-798c-4c9d-bdbb-43d64df1710a\") " pod="openstack/nova-api-0" Dec 02 07:47:00 crc kubenswrapper[4895]: I1202 07:47:00.200614 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca46fe00-eb61-4baf-81b1-a2b91c754a99-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 07:47:00 crc kubenswrapper[4895]: I1202 07:47:00.201080 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/087ae923-798c-4c9d-bdbb-43d64df1710a-logs\") pod \"nova-api-0\" (UID: \"087ae923-798c-4c9d-bdbb-43d64df1710a\") " pod="openstack/nova-api-0" Dec 02 07:47:00 crc kubenswrapper[4895]: I1202 07:47:00.252862 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/087ae923-798c-4c9d-bdbb-43d64df1710a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"087ae923-798c-4c9d-bdbb-43d64df1710a\") " pod="openstack/nova-api-0" Dec 02 07:47:00 crc kubenswrapper[4895]: I1202 07:47:00.262958 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/087ae923-798c-4c9d-bdbb-43d64df1710a-config-data\") pod \"nova-api-0\" (UID: \"087ae923-798c-4c9d-bdbb-43d64df1710a\") " pod="openstack/nova-api-0" Dec 02 07:47:00 crc kubenswrapper[4895]: I1202 07:47:00.289597 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-62ssm\" (UniqueName: \"kubernetes.io/projected/087ae923-798c-4c9d-bdbb-43d64df1710a-kube-api-access-62ssm\") pod \"nova-api-0\" (UID: \"087ae923-798c-4c9d-bdbb-43d64df1710a\") " pod="openstack/nova-api-0" Dec 02 07:47:00 crc kubenswrapper[4895]: I1202 07:47:00.296547 
4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca46fe00-eb61-4baf-81b1-a2b91c754a99-config-data" (OuterVolumeSpecName: "config-data") pod "ca46fe00-eb61-4baf-81b1-a2b91c754a99" (UID: "ca46fe00-eb61-4baf-81b1-a2b91c754a99"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:47:00 crc kubenswrapper[4895]: I1202 07:47:00.298014 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 02 07:47:00 crc kubenswrapper[4895]: I1202 07:47:00.299821 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 02 07:47:00 crc kubenswrapper[4895]: I1202 07:47:00.301685 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gd4jw\" (UniqueName: \"kubernetes.io/projected/76dc5ff2-192c-42e5-80ca-17b405814be6-kube-api-access-gd4jw\") pod \"nova-scheduler-0\" (UID: \"76dc5ff2-192c-42e5-80ca-17b405814be6\") " pod="openstack/nova-scheduler-0" Dec 02 07:47:00 crc kubenswrapper[4895]: I1202 07:47:00.301755 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76dc5ff2-192c-42e5-80ca-17b405814be6-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"76dc5ff2-192c-42e5-80ca-17b405814be6\") " pod="openstack/nova-scheduler-0" Dec 02 07:47:00 crc kubenswrapper[4895]: I1202 07:47:00.301878 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76dc5ff2-192c-42e5-80ca-17b405814be6-config-data\") pod \"nova-scheduler-0\" (UID: \"76dc5ff2-192c-42e5-80ca-17b405814be6\") " pod="openstack/nova-scheduler-0" Dec 02 07:47:00 crc kubenswrapper[4895]: I1202 07:47:00.302104 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/ca46fe00-eb61-4baf-81b1-a2b91c754a99-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 07:47:00 crc kubenswrapper[4895]: I1202 07:47:00.311840 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 02 07:47:00 crc kubenswrapper[4895]: I1202 07:47:00.328241 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 07:47:00 crc kubenswrapper[4895]: I1202 07:47:00.345984 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 02 07:47:00 crc kubenswrapper[4895]: I1202 07:47:00.367733 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 02 07:47:00 crc kubenswrapper[4895]: I1202 07:47:00.368452 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-h9wgm" Dec 02 07:47:00 crc kubenswrapper[4895]: I1202 07:47:00.378851 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 02 07:47:00 crc kubenswrapper[4895]: I1202 07:47:00.384991 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 02 07:47:00 crc kubenswrapper[4895]: I1202 07:47:00.405570 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmkrq\" (UniqueName: \"kubernetes.io/projected/1e54197c-b432-4f6e-9bd9-ce1f15fde624-kube-api-access-fmkrq\") pod \"nova-cell1-novncproxy-0\" (UID: \"1e54197c-b432-4f6e-9bd9-ce1f15fde624\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 07:47:00 crc kubenswrapper[4895]: I1202 07:47:00.419392 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gd4jw\" (UniqueName: \"kubernetes.io/projected/76dc5ff2-192c-42e5-80ca-17b405814be6-kube-api-access-gd4jw\") pod \"nova-scheduler-0\" (UID: \"76dc5ff2-192c-42e5-80ca-17b405814be6\") " pod="openstack/nova-scheduler-0" Dec 02 07:47:00 crc kubenswrapper[4895]: I1202 07:47:00.419819 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b79289dc-3eed-4009-985d-03e6a8fd36ce-config-data\") pod \"nova-metadata-0\" (UID: \"b79289dc-3eed-4009-985d-03e6a8fd36ce\") " pod="openstack/nova-metadata-0" Dec 02 07:47:00 crc kubenswrapper[4895]: I1202 07:47:00.420286 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76dc5ff2-192c-42e5-80ca-17b405814be6-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"76dc5ff2-192c-42e5-80ca-17b405814be6\") " pod="openstack/nova-scheduler-0" Dec 02 07:47:00 crc kubenswrapper[4895]: I1202 07:47:00.420533 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4sbg\" (UniqueName: 
\"kubernetes.io/projected/b79289dc-3eed-4009-985d-03e6a8fd36ce-kube-api-access-h4sbg\") pod \"nova-metadata-0\" (UID: \"b79289dc-3eed-4009-985d-03e6a8fd36ce\") " pod="openstack/nova-metadata-0" Dec 02 07:47:00 crc kubenswrapper[4895]: I1202 07:47:00.425361 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76dc5ff2-192c-42e5-80ca-17b405814be6-config-data\") pod \"nova-scheduler-0\" (UID: \"76dc5ff2-192c-42e5-80ca-17b405814be6\") " pod="openstack/nova-scheduler-0" Dec 02 07:47:00 crc kubenswrapper[4895]: I1202 07:47:00.425427 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b79289dc-3eed-4009-985d-03e6a8fd36ce-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b79289dc-3eed-4009-985d-03e6a8fd36ce\") " pod="openstack/nova-metadata-0" Dec 02 07:47:00 crc kubenswrapper[4895]: I1202 07:47:00.425470 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b79289dc-3eed-4009-985d-03e6a8fd36ce-logs\") pod \"nova-metadata-0\" (UID: \"b79289dc-3eed-4009-985d-03e6a8fd36ce\") " pod="openstack/nova-metadata-0" Dec 02 07:47:00 crc kubenswrapper[4895]: I1202 07:47:00.425559 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e54197c-b432-4f6e-9bd9-ce1f15fde624-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"1e54197c-b432-4f6e-9bd9-ce1f15fde624\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 07:47:00 crc kubenswrapper[4895]: I1202 07:47:00.425617 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e54197c-b432-4f6e-9bd9-ce1f15fde624-config-data\") pod 
\"nova-cell1-novncproxy-0\" (UID: \"1e54197c-b432-4f6e-9bd9-ce1f15fde624\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 07:47:00 crc kubenswrapper[4895]: I1202 07:47:00.436226 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76dc5ff2-192c-42e5-80ca-17b405814be6-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"76dc5ff2-192c-42e5-80ca-17b405814be6\") " pod="openstack/nova-scheduler-0" Dec 02 07:47:00 crc kubenswrapper[4895]: I1202 07:47:00.442689 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76dc5ff2-192c-42e5-80ca-17b405814be6-config-data\") pod \"nova-scheduler-0\" (UID: \"76dc5ff2-192c-42e5-80ca-17b405814be6\") " pod="openstack/nova-scheduler-0" Dec 02 07:47:00 crc kubenswrapper[4895]: I1202 07:47:00.548580 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmkrq\" (UniqueName: \"kubernetes.io/projected/1e54197c-b432-4f6e-9bd9-ce1f15fde624-kube-api-access-fmkrq\") pod \"nova-cell1-novncproxy-0\" (UID: \"1e54197c-b432-4f6e-9bd9-ce1f15fde624\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 07:47:00 crc kubenswrapper[4895]: I1202 07:47:00.548689 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b79289dc-3eed-4009-985d-03e6a8fd36ce-config-data\") pod \"nova-metadata-0\" (UID: \"b79289dc-3eed-4009-985d-03e6a8fd36ce\") " pod="openstack/nova-metadata-0" Dec 02 07:47:00 crc kubenswrapper[4895]: I1202 07:47:00.548770 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h4sbg\" (UniqueName: \"kubernetes.io/projected/b79289dc-3eed-4009-985d-03e6a8fd36ce-kube-api-access-h4sbg\") pod \"nova-metadata-0\" (UID: \"b79289dc-3eed-4009-985d-03e6a8fd36ce\") " pod="openstack/nova-metadata-0" Dec 02 07:47:00 crc kubenswrapper[4895]: I1202 
07:47:00.548817 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b79289dc-3eed-4009-985d-03e6a8fd36ce-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b79289dc-3eed-4009-985d-03e6a8fd36ce\") " pod="openstack/nova-metadata-0" Dec 02 07:47:00 crc kubenswrapper[4895]: I1202 07:47:00.548840 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b79289dc-3eed-4009-985d-03e6a8fd36ce-logs\") pod \"nova-metadata-0\" (UID: \"b79289dc-3eed-4009-985d-03e6a8fd36ce\") " pod="openstack/nova-metadata-0" Dec 02 07:47:00 crc kubenswrapper[4895]: I1202 07:47:00.548865 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e54197c-b432-4f6e-9bd9-ce1f15fde624-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"1e54197c-b432-4f6e-9bd9-ce1f15fde624\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 07:47:00 crc kubenswrapper[4895]: I1202 07:47:00.548892 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e54197c-b432-4f6e-9bd9-ce1f15fde624-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"1e54197c-b432-4f6e-9bd9-ce1f15fde624\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 07:47:00 crc kubenswrapper[4895]: I1202 07:47:00.549401 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gd4jw\" (UniqueName: \"kubernetes.io/projected/76dc5ff2-192c-42e5-80ca-17b405814be6-kube-api-access-gd4jw\") pod \"nova-scheduler-0\" (UID: \"76dc5ff2-192c-42e5-80ca-17b405814be6\") " pod="openstack/nova-scheduler-0" Dec 02 07:47:00 crc kubenswrapper[4895]: I1202 07:47:00.564255 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/b79289dc-3eed-4009-985d-03e6a8fd36ce-config-data\") pod \"nova-metadata-0\" (UID: \"b79289dc-3eed-4009-985d-03e6a8fd36ce\") " pod="openstack/nova-metadata-0" Dec 02 07:47:00 crc kubenswrapper[4895]: I1202 07:47:00.570328 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b79289dc-3eed-4009-985d-03e6a8fd36ce-logs\") pod \"nova-metadata-0\" (UID: \"b79289dc-3eed-4009-985d-03e6a8fd36ce\") " pod="openstack/nova-metadata-0" Dec 02 07:47:00 crc kubenswrapper[4895]: I1202 07:47:00.574501 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 02 07:47:00 crc kubenswrapper[4895]: I1202 07:47:00.591451 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4sbg\" (UniqueName: \"kubernetes.io/projected/b79289dc-3eed-4009-985d-03e6a8fd36ce-kube-api-access-h4sbg\") pod \"nova-metadata-0\" (UID: \"b79289dc-3eed-4009-985d-03e6a8fd36ce\") " pod="openstack/nova-metadata-0" Dec 02 07:47:00 crc kubenswrapper[4895]: I1202 07:47:00.597694 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e54197c-b432-4f6e-9bd9-ce1f15fde624-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"1e54197c-b432-4f6e-9bd9-ce1f15fde624\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 07:47:00 crc kubenswrapper[4895]: I1202 07:47:00.625033 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 02 07:47:00 crc kubenswrapper[4895]: I1202 07:47:00.626204 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b79289dc-3eed-4009-985d-03e6a8fd36ce-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b79289dc-3eed-4009-985d-03e6a8fd36ce\") " pod="openstack/nova-metadata-0" Dec 02 07:47:00 crc kubenswrapper[4895]: I1202 
07:47:00.630464 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmkrq\" (UniqueName: \"kubernetes.io/projected/1e54197c-b432-4f6e-9bd9-ce1f15fde624-kube-api-access-fmkrq\") pod \"nova-cell1-novncproxy-0\" (UID: \"1e54197c-b432-4f6e-9bd9-ce1f15fde624\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 07:47:00 crc kubenswrapper[4895]: I1202 07:47:00.662398 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-zhzzg"] Dec 02 07:47:00 crc kubenswrapper[4895]: I1202 07:47:00.664522 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-zhzzg" Dec 02 07:47:00 crc kubenswrapper[4895]: I1202 07:47:00.666246 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 02 07:47:00 crc kubenswrapper[4895]: I1202 07:47:00.698669 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-zhzzg"] Dec 02 07:47:00 crc kubenswrapper[4895]: I1202 07:47:00.705474 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e54197c-b432-4f6e-9bd9-ce1f15fde624-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"1e54197c-b432-4f6e-9bd9-ce1f15fde624\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 07:47:00 crc kubenswrapper[4895]: I1202 07:47:00.726485 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 02 07:47:00 crc kubenswrapper[4895]: I1202 07:47:00.737698 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 02 07:47:00 crc kubenswrapper[4895]: I1202 07:47:00.755782 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hkvh\" (UniqueName: \"kubernetes.io/projected/24560b66-b4b2-4c54-98f4-2dbf30465373-kube-api-access-2hkvh\") pod \"dnsmasq-dns-757b4f8459-zhzzg\" (UID: \"24560b66-b4b2-4c54-98f4-2dbf30465373\") " pod="openstack/dnsmasq-dns-757b4f8459-zhzzg" Dec 02 07:47:00 crc kubenswrapper[4895]: I1202 07:47:00.755853 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/24560b66-b4b2-4c54-98f4-2dbf30465373-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-zhzzg\" (UID: \"24560b66-b4b2-4c54-98f4-2dbf30465373\") " pod="openstack/dnsmasq-dns-757b4f8459-zhzzg" Dec 02 07:47:00 crc kubenswrapper[4895]: I1202 07:47:00.755988 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/24560b66-b4b2-4c54-98f4-2dbf30465373-ovsdbserver-nb\") pod \"dnsmasq-dns-757b4f8459-zhzzg\" (UID: \"24560b66-b4b2-4c54-98f4-2dbf30465373\") " pod="openstack/dnsmasq-dns-757b4f8459-zhzzg" Dec 02 07:47:00 crc kubenswrapper[4895]: I1202 07:47:00.756014 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/24560b66-b4b2-4c54-98f4-2dbf30465373-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-zhzzg\" (UID: \"24560b66-b4b2-4c54-98f4-2dbf30465373\") " pod="openstack/dnsmasq-dns-757b4f8459-zhzzg" Dec 02 07:47:00 crc kubenswrapper[4895]: I1202 07:47:00.756043 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/24560b66-b4b2-4c54-98f4-2dbf30465373-dns-svc\") pod 
\"dnsmasq-dns-757b4f8459-zhzzg\" (UID: \"24560b66-b4b2-4c54-98f4-2dbf30465373\") " pod="openstack/dnsmasq-dns-757b4f8459-zhzzg" Dec 02 07:47:00 crc kubenswrapper[4895]: I1202 07:47:00.756074 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24560b66-b4b2-4c54-98f4-2dbf30465373-config\") pod \"dnsmasq-dns-757b4f8459-zhzzg\" (UID: \"24560b66-b4b2-4c54-98f4-2dbf30465373\") " pod="openstack/dnsmasq-dns-757b4f8459-zhzzg" Dec 02 07:47:00 crc kubenswrapper[4895]: I1202 07:47:00.859249 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/24560b66-b4b2-4c54-98f4-2dbf30465373-dns-svc\") pod \"dnsmasq-dns-757b4f8459-zhzzg\" (UID: \"24560b66-b4b2-4c54-98f4-2dbf30465373\") " pod="openstack/dnsmasq-dns-757b4f8459-zhzzg" Dec 02 07:47:00 crc kubenswrapper[4895]: I1202 07:47:00.859322 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24560b66-b4b2-4c54-98f4-2dbf30465373-config\") pod \"dnsmasq-dns-757b4f8459-zhzzg\" (UID: \"24560b66-b4b2-4c54-98f4-2dbf30465373\") " pod="openstack/dnsmasq-dns-757b4f8459-zhzzg" Dec 02 07:47:00 crc kubenswrapper[4895]: I1202 07:47:00.859403 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2hkvh\" (UniqueName: \"kubernetes.io/projected/24560b66-b4b2-4c54-98f4-2dbf30465373-kube-api-access-2hkvh\") pod \"dnsmasq-dns-757b4f8459-zhzzg\" (UID: \"24560b66-b4b2-4c54-98f4-2dbf30465373\") " pod="openstack/dnsmasq-dns-757b4f8459-zhzzg" Dec 02 07:47:00 crc kubenswrapper[4895]: I1202 07:47:00.859454 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/24560b66-b4b2-4c54-98f4-2dbf30465373-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-zhzzg\" (UID: 
\"24560b66-b4b2-4c54-98f4-2dbf30465373\") " pod="openstack/dnsmasq-dns-757b4f8459-zhzzg" Dec 02 07:47:00 crc kubenswrapper[4895]: I1202 07:47:00.859656 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/24560b66-b4b2-4c54-98f4-2dbf30465373-ovsdbserver-nb\") pod \"dnsmasq-dns-757b4f8459-zhzzg\" (UID: \"24560b66-b4b2-4c54-98f4-2dbf30465373\") " pod="openstack/dnsmasq-dns-757b4f8459-zhzzg" Dec 02 07:47:00 crc kubenswrapper[4895]: I1202 07:47:00.859701 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/24560b66-b4b2-4c54-98f4-2dbf30465373-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-zhzzg\" (UID: \"24560b66-b4b2-4c54-98f4-2dbf30465373\") " pod="openstack/dnsmasq-dns-757b4f8459-zhzzg" Dec 02 07:47:00 crc kubenswrapper[4895]: I1202 07:47:00.860842 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/24560b66-b4b2-4c54-98f4-2dbf30465373-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-zhzzg\" (UID: \"24560b66-b4b2-4c54-98f4-2dbf30465373\") " pod="openstack/dnsmasq-dns-757b4f8459-zhzzg" Dec 02 07:47:00 crc kubenswrapper[4895]: I1202 07:47:00.861514 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/24560b66-b4b2-4c54-98f4-2dbf30465373-dns-svc\") pod \"dnsmasq-dns-757b4f8459-zhzzg\" (UID: \"24560b66-b4b2-4c54-98f4-2dbf30465373\") " pod="openstack/dnsmasq-dns-757b4f8459-zhzzg" Dec 02 07:47:00 crc kubenswrapper[4895]: I1202 07:47:00.862316 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24560b66-b4b2-4c54-98f4-2dbf30465373-config\") pod \"dnsmasq-dns-757b4f8459-zhzzg\" (UID: \"24560b66-b4b2-4c54-98f4-2dbf30465373\") " pod="openstack/dnsmasq-dns-757b4f8459-zhzzg" 
Dec 02 07:47:00 crc kubenswrapper[4895]: I1202 07:47:00.865180 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/24560b66-b4b2-4c54-98f4-2dbf30465373-ovsdbserver-nb\") pod \"dnsmasq-dns-757b4f8459-zhzzg\" (UID: \"24560b66-b4b2-4c54-98f4-2dbf30465373\") " pod="openstack/dnsmasq-dns-757b4f8459-zhzzg" Dec 02 07:47:00 crc kubenswrapper[4895]: I1202 07:47:00.865355 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/24560b66-b4b2-4c54-98f4-2dbf30465373-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-zhzzg\" (UID: \"24560b66-b4b2-4c54-98f4-2dbf30465373\") " pod="openstack/dnsmasq-dns-757b4f8459-zhzzg" Dec 02 07:47:00 crc kubenswrapper[4895]: I1202 07:47:00.890123 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 02 07:47:00 crc kubenswrapper[4895]: I1202 07:47:00.943950 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 02 07:47:00 crc kubenswrapper[4895]: I1202 07:47:00.950923 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hkvh\" (UniqueName: \"kubernetes.io/projected/24560b66-b4b2-4c54-98f4-2dbf30465373-kube-api-access-2hkvh\") pod \"dnsmasq-dns-757b4f8459-zhzzg\" (UID: \"24560b66-b4b2-4c54-98f4-2dbf30465373\") " pod="openstack/dnsmasq-dns-757b4f8459-zhzzg" Dec 02 07:47:01 crc kubenswrapper[4895]: I1202 07:47:01.004728 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 02 07:47:01 crc kubenswrapper[4895]: I1202 07:47:01.012904 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 02 07:47:01 crc kubenswrapper[4895]: I1202 07:47:01.022034 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 02 07:47:01 crc kubenswrapper[4895]: I1202 07:47:01.023175 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 02 07:47:01 crc kubenswrapper[4895]: I1202 07:47:01.057921 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 02 07:47:01 crc kubenswrapper[4895]: I1202 07:47:01.078725 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c61fb3df-f75c-480e-a64c-8436aec04a67-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c61fb3df-f75c-480e-a64c-8436aec04a67\") " pod="openstack/ceilometer-0" Dec 02 07:47:01 crc kubenswrapper[4895]: I1202 07:47:01.079173 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c61fb3df-f75c-480e-a64c-8436aec04a67-config-data\") pod \"ceilometer-0\" (UID: \"c61fb3df-f75c-480e-a64c-8436aec04a67\") " pod="openstack/ceilometer-0" Dec 02 07:47:01 crc kubenswrapper[4895]: I1202 07:47:01.079234 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c61fb3df-f75c-480e-a64c-8436aec04a67-scripts\") pod \"ceilometer-0\" (UID: \"c61fb3df-f75c-480e-a64c-8436aec04a67\") " pod="openstack/ceilometer-0" Dec 02 07:47:01 crc kubenswrapper[4895]: I1202 07:47:01.079276 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c61fb3df-f75c-480e-a64c-8436aec04a67-run-httpd\") pod \"ceilometer-0\" (UID: \"c61fb3df-f75c-480e-a64c-8436aec04a67\") " 
pod="openstack/ceilometer-0" Dec 02 07:47:01 crc kubenswrapper[4895]: I1202 07:47:01.079365 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c61fb3df-f75c-480e-a64c-8436aec04a67-log-httpd\") pod \"ceilometer-0\" (UID: \"c61fb3df-f75c-480e-a64c-8436aec04a67\") " pod="openstack/ceilometer-0" Dec 02 07:47:01 crc kubenswrapper[4895]: I1202 07:47:01.079464 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c61fb3df-f75c-480e-a64c-8436aec04a67-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c61fb3df-f75c-480e-a64c-8436aec04a67\") " pod="openstack/ceilometer-0" Dec 02 07:47:01 crc kubenswrapper[4895]: I1202 07:47:01.079514 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zx4lr\" (UniqueName: \"kubernetes.io/projected/c61fb3df-f75c-480e-a64c-8436aec04a67-kube-api-access-zx4lr\") pod \"ceilometer-0\" (UID: \"c61fb3df-f75c-480e-a64c-8436aec04a67\") " pod="openstack/ceilometer-0" Dec 02 07:47:01 crc kubenswrapper[4895]: I1202 07:47:01.081924 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-zhzzg" Dec 02 07:47:01 crc kubenswrapper[4895]: I1202 07:47:01.175672 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca46fe00-eb61-4baf-81b1-a2b91c754a99" path="/var/lib/kubelet/pods/ca46fe00-eb61-4baf-81b1-a2b91c754a99/volumes" Dec 02 07:47:01 crc kubenswrapper[4895]: I1202 07:47:01.193508 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c61fb3df-f75c-480e-a64c-8436aec04a67-log-httpd\") pod \"ceilometer-0\" (UID: \"c61fb3df-f75c-480e-a64c-8436aec04a67\") " pod="openstack/ceilometer-0" Dec 02 07:47:01 crc kubenswrapper[4895]: I1202 07:47:01.193642 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c61fb3df-f75c-480e-a64c-8436aec04a67-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c61fb3df-f75c-480e-a64c-8436aec04a67\") " pod="openstack/ceilometer-0" Dec 02 07:47:01 crc kubenswrapper[4895]: I1202 07:47:01.193788 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zx4lr\" (UniqueName: \"kubernetes.io/projected/c61fb3df-f75c-480e-a64c-8436aec04a67-kube-api-access-zx4lr\") pod \"ceilometer-0\" (UID: \"c61fb3df-f75c-480e-a64c-8436aec04a67\") " pod="openstack/ceilometer-0" Dec 02 07:47:01 crc kubenswrapper[4895]: I1202 07:47:01.194012 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c61fb3df-f75c-480e-a64c-8436aec04a67-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c61fb3df-f75c-480e-a64c-8436aec04a67\") " pod="openstack/ceilometer-0" Dec 02 07:47:01 crc kubenswrapper[4895]: I1202 07:47:01.194299 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/c61fb3df-f75c-480e-a64c-8436aec04a67-config-data\") pod \"ceilometer-0\" (UID: \"c61fb3df-f75c-480e-a64c-8436aec04a67\") " pod="openstack/ceilometer-0" Dec 02 07:47:01 crc kubenswrapper[4895]: I1202 07:47:01.194335 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c61fb3df-f75c-480e-a64c-8436aec04a67-scripts\") pod \"ceilometer-0\" (UID: \"c61fb3df-f75c-480e-a64c-8436aec04a67\") " pod="openstack/ceilometer-0" Dec 02 07:47:01 crc kubenswrapper[4895]: I1202 07:47:01.194371 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c61fb3df-f75c-480e-a64c-8436aec04a67-run-httpd\") pod \"ceilometer-0\" (UID: \"c61fb3df-f75c-480e-a64c-8436aec04a67\") " pod="openstack/ceilometer-0" Dec 02 07:47:01 crc kubenswrapper[4895]: I1202 07:47:01.195328 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c61fb3df-f75c-480e-a64c-8436aec04a67-run-httpd\") pod \"ceilometer-0\" (UID: \"c61fb3df-f75c-480e-a64c-8436aec04a67\") " pod="openstack/ceilometer-0" Dec 02 07:47:01 crc kubenswrapper[4895]: I1202 07:47:01.203284 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c61fb3df-f75c-480e-a64c-8436aec04a67-log-httpd\") pod \"ceilometer-0\" (UID: \"c61fb3df-f75c-480e-a64c-8436aec04a67\") " pod="openstack/ceilometer-0" Dec 02 07:47:01 crc kubenswrapper[4895]: I1202 07:47:01.204900 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c61fb3df-f75c-480e-a64c-8436aec04a67-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c61fb3df-f75c-480e-a64c-8436aec04a67\") " pod="openstack/ceilometer-0" Dec 02 07:47:01 crc kubenswrapper[4895]: I1202 07:47:01.205258 4895 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c61fb3df-f75c-480e-a64c-8436aec04a67-scripts\") pod \"ceilometer-0\" (UID: \"c61fb3df-f75c-480e-a64c-8436aec04a67\") " pod="openstack/ceilometer-0" Dec 02 07:47:01 crc kubenswrapper[4895]: I1202 07:47:01.207809 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c61fb3df-f75c-480e-a64c-8436aec04a67-config-data\") pod \"ceilometer-0\" (UID: \"c61fb3df-f75c-480e-a64c-8436aec04a67\") " pod="openstack/ceilometer-0" Dec 02 07:47:01 crc kubenswrapper[4895]: I1202 07:47:01.224098 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c61fb3df-f75c-480e-a64c-8436aec04a67-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c61fb3df-f75c-480e-a64c-8436aec04a67\") " pod="openstack/ceilometer-0" Dec 02 07:47:01 crc kubenswrapper[4895]: I1202 07:47:01.239771 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zx4lr\" (UniqueName: \"kubernetes.io/projected/c61fb3df-f75c-480e-a64c-8436aec04a67-kube-api-access-zx4lr\") pod \"ceilometer-0\" (UID: \"c61fb3df-f75c-480e-a64c-8436aec04a67\") " pod="openstack/ceilometer-0" Dec 02 07:47:01 crc kubenswrapper[4895]: I1202 07:47:01.362694 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 02 07:47:01 crc kubenswrapper[4895]: I1202 07:47:01.544962 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-km4tm"] Dec 02 07:47:01 crc kubenswrapper[4895]: I1202 07:47:01.585688 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 02 07:47:01 crc kubenswrapper[4895]: I1202 07:47:01.614941 4895 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 02 07:47:01 crc kubenswrapper[4895]: I1202 07:47:01.634257 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-h9wgm"] Dec 02 07:47:01 crc kubenswrapper[4895]: I1202 07:47:01.807190 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-km4tm" event={"ID":"23d7d649-194e-4822-9ac2-9badcf844980","Type":"ContainerStarted","Data":"25e7965d92ffced6a51dc1ff1f88eef2b4ed9ac7682fd21def6aeea5e86994fe"} Dec 02 07:47:01 crc kubenswrapper[4895]: I1202 07:47:01.823046 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"087ae923-798c-4c9d-bdbb-43d64df1710a","Type":"ContainerStarted","Data":"c6256e9b736585193e92d2062e31cd6da7cd9f8a6a4255ea477f4f0319606403"} Dec 02 07:47:01 crc kubenswrapper[4895]: I1202 07:47:01.828212 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-h9wgm" event={"ID":"4e5d3f70-f931-473a-af3c-e0858a46e311","Type":"ContainerStarted","Data":"2735838beb06df8c15994d68505d185c2efcc5d02191cc86b3dc7d5612f9685f"} Dec 02 07:47:01 crc kubenswrapper[4895]: I1202 07:47:01.852327 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-8svml"] Dec 02 07:47:01 crc kubenswrapper[4895]: I1202 07:47:01.854493 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-8svml" Dec 02 07:47:01 crc kubenswrapper[4895]: I1202 07:47:01.858112 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 02 07:47:01 crc kubenswrapper[4895]: I1202 07:47:01.858286 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Dec 02 07:47:01 crc kubenswrapper[4895]: I1202 07:47:01.873992 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-8svml"] Dec 02 07:47:01 crc kubenswrapper[4895]: I1202 07:47:01.915270 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/845f4c45-cc2f-4b99-a42f-3c04b18730fe-config-data\") pod \"nova-cell1-conductor-db-sync-8svml\" (UID: \"845f4c45-cc2f-4b99-a42f-3c04b18730fe\") " pod="openstack/nova-cell1-conductor-db-sync-8svml" Dec 02 07:47:01 crc kubenswrapper[4895]: I1202 07:47:01.915509 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bn229\" (UniqueName: \"kubernetes.io/projected/845f4c45-cc2f-4b99-a42f-3c04b18730fe-kube-api-access-bn229\") pod \"nova-cell1-conductor-db-sync-8svml\" (UID: \"845f4c45-cc2f-4b99-a42f-3c04b18730fe\") " pod="openstack/nova-cell1-conductor-db-sync-8svml" Dec 02 07:47:01 crc kubenswrapper[4895]: I1202 07:47:01.915677 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/845f4c45-cc2f-4b99-a42f-3c04b18730fe-scripts\") pod \"nova-cell1-conductor-db-sync-8svml\" (UID: \"845f4c45-cc2f-4b99-a42f-3c04b18730fe\") " pod="openstack/nova-cell1-conductor-db-sync-8svml" Dec 02 07:47:01 crc kubenswrapper[4895]: I1202 07:47:01.916398 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/845f4c45-cc2f-4b99-a42f-3c04b18730fe-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-8svml\" (UID: \"845f4c45-cc2f-4b99-a42f-3c04b18730fe\") " pod="openstack/nova-cell1-conductor-db-sync-8svml" Dec 02 07:47:02 crc kubenswrapper[4895]: I1202 07:47:02.298674 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/845f4c45-cc2f-4b99-a42f-3c04b18730fe-config-data\") pod \"nova-cell1-conductor-db-sync-8svml\" (UID: \"845f4c45-cc2f-4b99-a42f-3c04b18730fe\") " pod="openstack/nova-cell1-conductor-db-sync-8svml" Dec 02 07:47:02 crc kubenswrapper[4895]: I1202 07:47:02.298804 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bn229\" (UniqueName: \"kubernetes.io/projected/845f4c45-cc2f-4b99-a42f-3c04b18730fe-kube-api-access-bn229\") pod \"nova-cell1-conductor-db-sync-8svml\" (UID: \"845f4c45-cc2f-4b99-a42f-3c04b18730fe\") " pod="openstack/nova-cell1-conductor-db-sync-8svml" Dec 02 07:47:02 crc kubenswrapper[4895]: I1202 07:47:02.298893 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/845f4c45-cc2f-4b99-a42f-3c04b18730fe-scripts\") pod \"nova-cell1-conductor-db-sync-8svml\" (UID: \"845f4c45-cc2f-4b99-a42f-3c04b18730fe\") " pod="openstack/nova-cell1-conductor-db-sync-8svml" Dec 02 07:47:02 crc kubenswrapper[4895]: I1202 07:47:02.298960 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/845f4c45-cc2f-4b99-a42f-3c04b18730fe-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-8svml\" (UID: \"845f4c45-cc2f-4b99-a42f-3c04b18730fe\") " pod="openstack/nova-cell1-conductor-db-sync-8svml" Dec 02 07:47:02 crc kubenswrapper[4895]: I1202 07:47:02.309579 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/845f4c45-cc2f-4b99-a42f-3c04b18730fe-scripts\") pod \"nova-cell1-conductor-db-sync-8svml\" (UID: \"845f4c45-cc2f-4b99-a42f-3c04b18730fe\") " pod="openstack/nova-cell1-conductor-db-sync-8svml" Dec 02 07:47:02 crc kubenswrapper[4895]: I1202 07:47:02.315627 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/845f4c45-cc2f-4b99-a42f-3c04b18730fe-config-data\") pod \"nova-cell1-conductor-db-sync-8svml\" (UID: \"845f4c45-cc2f-4b99-a42f-3c04b18730fe\") " pod="openstack/nova-cell1-conductor-db-sync-8svml" Dec 02 07:47:02 crc kubenswrapper[4895]: I1202 07:47:02.321344 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/845f4c45-cc2f-4b99-a42f-3c04b18730fe-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-8svml\" (UID: \"845f4c45-cc2f-4b99-a42f-3c04b18730fe\") " pod="openstack/nova-cell1-conductor-db-sync-8svml" Dec 02 07:47:02 crc kubenswrapper[4895]: I1202 07:47:02.350722 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 02 07:47:02 crc kubenswrapper[4895]: I1202 07:47:02.373332 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bn229\" (UniqueName: \"kubernetes.io/projected/845f4c45-cc2f-4b99-a42f-3c04b18730fe-kube-api-access-bn229\") pod \"nova-cell1-conductor-db-sync-8svml\" (UID: \"845f4c45-cc2f-4b99-a42f-3c04b18730fe\") " pod="openstack/nova-cell1-conductor-db-sync-8svml" Dec 02 07:47:02 crc kubenswrapper[4895]: I1202 07:47:02.386896 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 07:47:02 crc kubenswrapper[4895]: I1202 07:47:02.399507 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 07:47:02 crc kubenswrapper[4895]: I1202 07:47:02.449970 4895 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 02 07:47:02 crc kubenswrapper[4895]: I1202 07:47:02.499962 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-8svml" Dec 02 07:47:02 crc kubenswrapper[4895]: I1202 07:47:02.583944 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-zhzzg"] Dec 02 07:47:02 crc kubenswrapper[4895]: I1202 07:47:02.843051 4895 generic.go:334] "Generic (PLEG): container finished" podID="23d7d649-194e-4822-9ac2-9badcf844980" containerID="562183949f440640ad90c8b7ef79ee4da6e42b0eeb3d5b0f5d6fb78739832a55" exitCode=0 Dec 02 07:47:02 crc kubenswrapper[4895]: I1202 07:47:02.843533 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-km4tm" event={"ID":"23d7d649-194e-4822-9ac2-9badcf844980","Type":"ContainerDied","Data":"562183949f440640ad90c8b7ef79ee4da6e42b0eeb3d5b0f5d6fb78739832a55"} Dec 02 07:47:02 crc kubenswrapper[4895]: I1202 07:47:02.847769 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-zhzzg" event={"ID":"24560b66-b4b2-4c54-98f4-2dbf30465373","Type":"ContainerStarted","Data":"edaa950854b60d97234a76611f7b13a4389f65fb424dbe1174ecb06f9146263f"} Dec 02 07:47:02 crc kubenswrapper[4895]: I1202 07:47:02.854360 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"1e54197c-b432-4f6e-9bd9-ce1f15fde624","Type":"ContainerStarted","Data":"7b806473908ebc9882572a08ca9406b0f79310e09588a651b7891e009f4df689"} Dec 02 07:47:02 crc kubenswrapper[4895]: I1202 07:47:02.857016 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b79289dc-3eed-4009-985d-03e6a8fd36ce","Type":"ContainerStarted","Data":"8adf6abaaad38df5ff46aad92e77665d3615f6d5cce4d23efcd19d6162d5f4ad"} Dec 02 07:47:02 crc kubenswrapper[4895]: I1202 07:47:02.873622 4895 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"76dc5ff2-192c-42e5-80ca-17b405814be6","Type":"ContainerStarted","Data":"19108097f3d351fdb527e0c3cdfaebd4166bcda9a933a5416a0455e6549af455"} Dec 02 07:47:02 crc kubenswrapper[4895]: I1202 07:47:02.887998 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-h9wgm" event={"ID":"4e5d3f70-f931-473a-af3c-e0858a46e311","Type":"ContainerStarted","Data":"62b374c4faec2438d9bd41034a5738b43a2bf2fbe98618d82d77b24eeb955851"} Dec 02 07:47:02 crc kubenswrapper[4895]: I1202 07:47:02.893626 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c61fb3df-f75c-480e-a64c-8436aec04a67","Type":"ContainerStarted","Data":"b530282eff7a35513a7e9523a1206ab301d90f8c720681bbdfad82daf51a8e46"} Dec 02 07:47:02 crc kubenswrapper[4895]: I1202 07:47:02.924846 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-h9wgm" podStartSLOduration=3.924758783 podStartE2EDuration="3.924758783s" podCreationTimestamp="2025-12-02 07:46:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:47:02.9155459 +0000 UTC m=+1434.086405533" watchObservedRunningTime="2025-12-02 07:47:02.924758783 +0000 UTC m=+1434.095618406" Dec 02 07:47:03 crc kubenswrapper[4895]: I1202 07:47:03.136943 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-8svml"] Dec 02 07:47:03 crc kubenswrapper[4895]: E1202 07:47:03.227228 4895 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod24560b66_b4b2_4c54_98f4_2dbf30465373.slice/crio-conmon-5590d81d23ba137905161fc2b52460b69642aadd53f83964f3cd2617bf4b4959.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod24560b66_b4b2_4c54_98f4_2dbf30465373.slice/crio-5590d81d23ba137905161fc2b52460b69642aadd53f83964f3cd2617bf4b4959.scope\": RecentStats: unable to find data in memory cache]" Dec 02 07:47:03 crc kubenswrapper[4895]: I1202 07:47:03.914373 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c61fb3df-f75c-480e-a64c-8436aec04a67","Type":"ContainerStarted","Data":"db826e93f54f1a7bd736199df485403bbb1cc0edd174d541d14122e2efef2ee7"} Dec 02 07:47:03 crc kubenswrapper[4895]: I1202 07:47:03.929415 4895 generic.go:334] "Generic (PLEG): container finished" podID="24560b66-b4b2-4c54-98f4-2dbf30465373" containerID="5590d81d23ba137905161fc2b52460b69642aadd53f83964f3cd2617bf4b4959" exitCode=0 Dec 02 07:47:03 crc kubenswrapper[4895]: I1202 07:47:03.929881 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-zhzzg" event={"ID":"24560b66-b4b2-4c54-98f4-2dbf30465373","Type":"ContainerDied","Data":"5590d81d23ba137905161fc2b52460b69642aadd53f83964f3cd2617bf4b4959"} Dec 02 07:47:03 crc kubenswrapper[4895]: I1202 07:47:03.938593 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-8svml" event={"ID":"845f4c45-cc2f-4b99-a42f-3c04b18730fe","Type":"ContainerStarted","Data":"391b022ce7caf5397988776e4babbf6233f29f15977989bdaf58d428c0573bde"} Dec 02 07:47:03 crc kubenswrapper[4895]: I1202 07:47:03.939038 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-8svml" event={"ID":"845f4c45-cc2f-4b99-a42f-3c04b18730fe","Type":"ContainerStarted","Data":"a0710d417855ab420b0183608216508610497883652cb830574cff3ead83f14a"} Dec 02 07:47:04 crc kubenswrapper[4895]: I1202 07:47:04.004482 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-8svml" podStartSLOduration=3.00445129 
podStartE2EDuration="3.00445129s" podCreationTimestamp="2025-12-02 07:47:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:47:03.996511634 +0000 UTC m=+1435.167371247" watchObservedRunningTime="2025-12-02 07:47:04.00445129 +0000 UTC m=+1435.175310903" Dec 02 07:47:05 crc kubenswrapper[4895]: I1202 07:47:05.202664 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 07:47:05 crc kubenswrapper[4895]: I1202 07:47:05.223826 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 02 07:47:06 crc kubenswrapper[4895]: I1202 07:47:06.976339 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c61fb3df-f75c-480e-a64c-8436aec04a67","Type":"ContainerStarted","Data":"d97e252a66f722ee7e7efc1d1fd4e6e10074d3adb3119cddaa5537035c4d6b2c"} Dec 02 07:47:06 crc kubenswrapper[4895]: I1202 07:47:06.978858 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-km4tm" event={"ID":"23d7d649-194e-4822-9ac2-9badcf844980","Type":"ContainerStarted","Data":"de70ac10da286b83b5b9bbe5f2bc6e605a8bd878e07dc3dd045bc119eea20b84"} Dec 02 07:47:06 crc kubenswrapper[4895]: I1202 07:47:06.994820 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-zhzzg" event={"ID":"24560b66-b4b2-4c54-98f4-2dbf30465373","Type":"ContainerStarted","Data":"dcb07c6666b40670ddd6c822e6250b79b03ff7717ddff952acab4a55d8c1719d"} Dec 02 07:47:06 crc kubenswrapper[4895]: I1202 07:47:06.995022 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-757b4f8459-zhzzg" Dec 02 07:47:06 crc kubenswrapper[4895]: I1202 07:47:06.999184 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" 
event={"ID":"1e54197c-b432-4f6e-9bd9-ce1f15fde624","Type":"ContainerStarted","Data":"73869ae3518959278676d1703443fc666432748848ab698854d82c7dfc5aa174"} Dec 02 07:47:06 crc kubenswrapper[4895]: I1202 07:47:06.999355 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="1e54197c-b432-4f6e-9bd9-ce1f15fde624" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://73869ae3518959278676d1703443fc666432748848ab698854d82c7dfc5aa174" gracePeriod=30 Dec 02 07:47:07 crc kubenswrapper[4895]: I1202 07:47:07.011381 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"087ae923-798c-4c9d-bdbb-43d64df1710a","Type":"ContainerStarted","Data":"685053df2014249b5147b703ed5893284e684dacba668ae65f0140e012662566"} Dec 02 07:47:07 crc kubenswrapper[4895]: I1202 07:47:07.011446 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"087ae923-798c-4c9d-bdbb-43d64df1710a","Type":"ContainerStarted","Data":"3a2c02e21f0ccfffeea8ce8656f74521c9dfa25df5e758e18ce52926179c831d"} Dec 02 07:47:07 crc kubenswrapper[4895]: I1202 07:47:07.013442 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b79289dc-3eed-4009-985d-03e6a8fd36ce","Type":"ContainerStarted","Data":"0849f6535974db0d69a57bf817fe17b4218a1795d6c923a795bf8726eb1da305"} Dec 02 07:47:07 crc kubenswrapper[4895]: I1202 07:47:07.020520 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"76dc5ff2-192c-42e5-80ca-17b405814be6","Type":"ContainerStarted","Data":"59af142141d77afa67328db26a0d0ea0636463b23d3efd326b7355e7a7cba09b"} Dec 02 07:47:07 crc kubenswrapper[4895]: I1202 07:47:07.035682 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.314041598 podStartE2EDuration="7.035657798s" 
podCreationTimestamp="2025-12-02 07:47:00 +0000 UTC" firstStartedPulling="2025-12-02 07:47:02.476556481 +0000 UTC m=+1433.647416094" lastFinishedPulling="2025-12-02 07:47:06.198172681 +0000 UTC m=+1437.369032294" observedRunningTime="2025-12-02 07:47:07.027078403 +0000 UTC m=+1438.197938016" watchObservedRunningTime="2025-12-02 07:47:07.035657798 +0000 UTC m=+1438.206517411" Dec 02 07:47:07 crc kubenswrapper[4895]: I1202 07:47:07.060492 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-757b4f8459-zhzzg" podStartSLOduration=7.060462153 podStartE2EDuration="7.060462153s" podCreationTimestamp="2025-12-02 07:47:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:47:07.050258399 +0000 UTC m=+1438.221118022" watchObservedRunningTime="2025-12-02 07:47:07.060462153 +0000 UTC m=+1438.231321756" Dec 02 07:47:07 crc kubenswrapper[4895]: I1202 07:47:07.089676 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.508053663 podStartE2EDuration="8.089657534s" podCreationTimestamp="2025-12-02 07:46:59 +0000 UTC" firstStartedPulling="2025-12-02 07:47:01.614637021 +0000 UTC m=+1432.785496634" lastFinishedPulling="2025-12-02 07:47:06.196240892 +0000 UTC m=+1437.367100505" observedRunningTime="2025-12-02 07:47:07.085877187 +0000 UTC m=+1438.256736820" watchObservedRunningTime="2025-12-02 07:47:07.089657534 +0000 UTC m=+1438.260517147" Dec 02 07:47:07 crc kubenswrapper[4895]: I1202 07:47:07.108543 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=4.409491522 podStartE2EDuration="8.108518125s" podCreationTimestamp="2025-12-02 07:46:59 +0000 UTC" firstStartedPulling="2025-12-02 07:47:02.457036979 +0000 UTC m=+1433.627896602" lastFinishedPulling="2025-12-02 07:47:06.156063592 +0000 UTC 
m=+1437.326923205" observedRunningTime="2025-12-02 07:47:07.104710837 +0000 UTC m=+1438.275570450" watchObservedRunningTime="2025-12-02 07:47:07.108518125 +0000 UTC m=+1438.279377738" Dec 02 07:47:08 crc kubenswrapper[4895]: I1202 07:47:08.033077 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b79289dc-3eed-4009-985d-03e6a8fd36ce","Type":"ContainerStarted","Data":"9455075bd664ef262899e3193ae7b3ea348692e7b73156e82ea3a038935266ba"} Dec 02 07:47:08 crc kubenswrapper[4895]: I1202 07:47:08.033085 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="b79289dc-3eed-4009-985d-03e6a8fd36ce" containerName="nova-metadata-log" containerID="cri-o://0849f6535974db0d69a57bf817fe17b4218a1795d6c923a795bf8726eb1da305" gracePeriod=30 Dec 02 07:47:08 crc kubenswrapper[4895]: I1202 07:47:08.033633 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="b79289dc-3eed-4009-985d-03e6a8fd36ce" containerName="nova-metadata-metadata" containerID="cri-o://9455075bd664ef262899e3193ae7b3ea348692e7b73156e82ea3a038935266ba" gracePeriod=30 Dec 02 07:47:08 crc kubenswrapper[4895]: I1202 07:47:08.042590 4895 generic.go:334] "Generic (PLEG): container finished" podID="23d7d649-194e-4822-9ac2-9badcf844980" containerID="de70ac10da286b83b5b9bbe5f2bc6e605a8bd878e07dc3dd045bc119eea20b84" exitCode=0 Dec 02 07:47:08 crc kubenswrapper[4895]: I1202 07:47:08.043280 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-km4tm" event={"ID":"23d7d649-194e-4822-9ac2-9badcf844980","Type":"ContainerDied","Data":"de70ac10da286b83b5b9bbe5f2bc6e605a8bd878e07dc3dd045bc119eea20b84"} Dec 02 07:47:08 crc kubenswrapper[4895]: I1202 07:47:08.067919 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=4.335945063 
podStartE2EDuration="8.067889201s" podCreationTimestamp="2025-12-02 07:47:00 +0000 UTC" firstStartedPulling="2025-12-02 07:47:02.472711123 +0000 UTC m=+1433.643570736" lastFinishedPulling="2025-12-02 07:47:06.204655271 +0000 UTC m=+1437.375514874" observedRunningTime="2025-12-02 07:47:08.058919234 +0000 UTC m=+1439.229778907" watchObservedRunningTime="2025-12-02 07:47:08.067889201 +0000 UTC m=+1439.238748814" Dec 02 07:47:09 crc kubenswrapper[4895]: I1202 07:47:09.058716 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c61fb3df-f75c-480e-a64c-8436aec04a67","Type":"ContainerStarted","Data":"45a40dd7336d5a5330a9568330163779e86a8ff8d3f644291554642d5d1c1525"} Dec 02 07:47:09 crc kubenswrapper[4895]: I1202 07:47:09.065000 4895 generic.go:334] "Generic (PLEG): container finished" podID="b79289dc-3eed-4009-985d-03e6a8fd36ce" containerID="9455075bd664ef262899e3193ae7b3ea348692e7b73156e82ea3a038935266ba" exitCode=0 Dec 02 07:47:09 crc kubenswrapper[4895]: I1202 07:47:09.065044 4895 generic.go:334] "Generic (PLEG): container finished" podID="b79289dc-3eed-4009-985d-03e6a8fd36ce" containerID="0849f6535974db0d69a57bf817fe17b4218a1795d6c923a795bf8726eb1da305" exitCode=143 Dec 02 07:47:09 crc kubenswrapper[4895]: I1202 07:47:09.065072 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b79289dc-3eed-4009-985d-03e6a8fd36ce","Type":"ContainerDied","Data":"9455075bd664ef262899e3193ae7b3ea348692e7b73156e82ea3a038935266ba"} Dec 02 07:47:09 crc kubenswrapper[4895]: I1202 07:47:09.065144 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b79289dc-3eed-4009-985d-03e6a8fd36ce","Type":"ContainerDied","Data":"0849f6535974db0d69a57bf817fe17b4218a1795d6c923a795bf8726eb1da305"} Dec 02 07:47:09 crc kubenswrapper[4895]: I1202 07:47:09.200793 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 02 07:47:09 crc kubenswrapper[4895]: I1202 07:47:09.370442 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b79289dc-3eed-4009-985d-03e6a8fd36ce-config-data\") pod \"b79289dc-3eed-4009-985d-03e6a8fd36ce\" (UID: \"b79289dc-3eed-4009-985d-03e6a8fd36ce\") " Dec 02 07:47:09 crc kubenswrapper[4895]: I1202 07:47:09.370967 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h4sbg\" (UniqueName: \"kubernetes.io/projected/b79289dc-3eed-4009-985d-03e6a8fd36ce-kube-api-access-h4sbg\") pod \"b79289dc-3eed-4009-985d-03e6a8fd36ce\" (UID: \"b79289dc-3eed-4009-985d-03e6a8fd36ce\") " Dec 02 07:47:09 crc kubenswrapper[4895]: I1202 07:47:09.371118 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b79289dc-3eed-4009-985d-03e6a8fd36ce-combined-ca-bundle\") pod \"b79289dc-3eed-4009-985d-03e6a8fd36ce\" (UID: \"b79289dc-3eed-4009-985d-03e6a8fd36ce\") " Dec 02 07:47:09 crc kubenswrapper[4895]: I1202 07:47:09.371287 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b79289dc-3eed-4009-985d-03e6a8fd36ce-logs\") pod \"b79289dc-3eed-4009-985d-03e6a8fd36ce\" (UID: \"b79289dc-3eed-4009-985d-03e6a8fd36ce\") " Dec 02 07:47:09 crc kubenswrapper[4895]: I1202 07:47:09.372444 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b79289dc-3eed-4009-985d-03e6a8fd36ce-logs" (OuterVolumeSpecName: "logs") pod "b79289dc-3eed-4009-985d-03e6a8fd36ce" (UID: "b79289dc-3eed-4009-985d-03e6a8fd36ce"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:47:09 crc kubenswrapper[4895]: I1202 07:47:09.397114 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b79289dc-3eed-4009-985d-03e6a8fd36ce-kube-api-access-h4sbg" (OuterVolumeSpecName: "kube-api-access-h4sbg") pod "b79289dc-3eed-4009-985d-03e6a8fd36ce" (UID: "b79289dc-3eed-4009-985d-03e6a8fd36ce"). InnerVolumeSpecName "kube-api-access-h4sbg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:47:09 crc kubenswrapper[4895]: I1202 07:47:09.416821 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b79289dc-3eed-4009-985d-03e6a8fd36ce-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b79289dc-3eed-4009-985d-03e6a8fd36ce" (UID: "b79289dc-3eed-4009-985d-03e6a8fd36ce"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:47:09 crc kubenswrapper[4895]: I1202 07:47:09.418616 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b79289dc-3eed-4009-985d-03e6a8fd36ce-config-data" (OuterVolumeSpecName: "config-data") pod "b79289dc-3eed-4009-985d-03e6a8fd36ce" (UID: "b79289dc-3eed-4009-985d-03e6a8fd36ce"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:47:09 crc kubenswrapper[4895]: I1202 07:47:09.474122 4895 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b79289dc-3eed-4009-985d-03e6a8fd36ce-logs\") on node \"crc\" DevicePath \"\"" Dec 02 07:47:09 crc kubenswrapper[4895]: I1202 07:47:09.474162 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b79289dc-3eed-4009-985d-03e6a8fd36ce-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 07:47:09 crc kubenswrapper[4895]: I1202 07:47:09.474173 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h4sbg\" (UniqueName: \"kubernetes.io/projected/b79289dc-3eed-4009-985d-03e6a8fd36ce-kube-api-access-h4sbg\") on node \"crc\" DevicePath \"\"" Dec 02 07:47:09 crc kubenswrapper[4895]: I1202 07:47:09.474183 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b79289dc-3eed-4009-985d-03e6a8fd36ce-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 07:47:10 crc kubenswrapper[4895]: I1202 07:47:10.097242 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-km4tm" event={"ID":"23d7d649-194e-4822-9ac2-9badcf844980","Type":"ContainerStarted","Data":"96fb012b7e72edf5e8416df09d5a93ca5504eb98a74ca683eed935638bfe243b"} Dec 02 07:47:10 crc kubenswrapper[4895]: I1202 07:47:10.100581 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b79289dc-3eed-4009-985d-03e6a8fd36ce","Type":"ContainerDied","Data":"8adf6abaaad38df5ff46aad92e77665d3615f6d5cce4d23efcd19d6162d5f4ad"} Dec 02 07:47:10 crc kubenswrapper[4895]: I1202 07:47:10.100632 4895 scope.go:117] "RemoveContainer" containerID="9455075bd664ef262899e3193ae7b3ea348692e7b73156e82ea3a038935266ba" Dec 02 07:47:10 crc kubenswrapper[4895]: I1202 07:47:10.100865 4895 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 02 07:47:10 crc kubenswrapper[4895]: I1202 07:47:10.125340 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-km4tm" Dec 02 07:47:10 crc kubenswrapper[4895]: I1202 07:47:10.125394 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-km4tm" Dec 02 07:47:10 crc kubenswrapper[4895]: I1202 07:47:10.137870 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-km4tm" podStartSLOduration=5.153123695 podStartE2EDuration="11.137846326s" podCreationTimestamp="2025-12-02 07:46:59 +0000 UTC" firstStartedPulling="2025-12-02 07:47:02.847733108 +0000 UTC m=+1434.018592731" lastFinishedPulling="2025-12-02 07:47:08.832455739 +0000 UTC m=+1440.003315362" observedRunningTime="2025-12-02 07:47:10.125392901 +0000 UTC m=+1441.296252524" watchObservedRunningTime="2025-12-02 07:47:10.137846326 +0000 UTC m=+1441.308705959" Dec 02 07:47:10 crc kubenswrapper[4895]: I1202 07:47:10.222540 4895 scope.go:117] "RemoveContainer" containerID="0849f6535974db0d69a57bf817fe17b4218a1795d6c923a795bf8726eb1da305" Dec 02 07:47:10 crc kubenswrapper[4895]: I1202 07:47:10.335147 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 07:47:10 crc kubenswrapper[4895]: I1202 07:47:10.356452 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 07:47:10 crc kubenswrapper[4895]: I1202 07:47:10.372879 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 02 07:47:10 crc kubenswrapper[4895]: E1202 07:47:10.373568 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b79289dc-3eed-4009-985d-03e6a8fd36ce" containerName="nova-metadata-metadata" Dec 02 07:47:10 crc kubenswrapper[4895]: I1202 
07:47:10.373587 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="b79289dc-3eed-4009-985d-03e6a8fd36ce" containerName="nova-metadata-metadata" Dec 02 07:47:10 crc kubenswrapper[4895]: E1202 07:47:10.373621 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b79289dc-3eed-4009-985d-03e6a8fd36ce" containerName="nova-metadata-log" Dec 02 07:47:10 crc kubenswrapper[4895]: I1202 07:47:10.373630 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="b79289dc-3eed-4009-985d-03e6a8fd36ce" containerName="nova-metadata-log" Dec 02 07:47:10 crc kubenswrapper[4895]: I1202 07:47:10.373905 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="b79289dc-3eed-4009-985d-03e6a8fd36ce" containerName="nova-metadata-log" Dec 02 07:47:10 crc kubenswrapper[4895]: I1202 07:47:10.373929 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="b79289dc-3eed-4009-985d-03e6a8fd36ce" containerName="nova-metadata-metadata" Dec 02 07:47:10 crc kubenswrapper[4895]: I1202 07:47:10.375403 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 02 07:47:10 crc kubenswrapper[4895]: I1202 07:47:10.382228 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 02 07:47:10 crc kubenswrapper[4895]: I1202 07:47:10.382471 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 02 07:47:10 crc kubenswrapper[4895]: I1202 07:47:10.383084 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 07:47:10 crc kubenswrapper[4895]: I1202 07:47:10.389415 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 02 07:47:10 crc kubenswrapper[4895]: I1202 07:47:10.389558 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 02 07:47:10 crc kubenswrapper[4895]: I1202 07:47:10.554676 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d66gw\" (UniqueName: \"kubernetes.io/projected/dcdf64bc-7466-4639-a0a3-5cb16bed968a-kube-api-access-d66gw\") pod \"nova-metadata-0\" (UID: \"dcdf64bc-7466-4639-a0a3-5cb16bed968a\") " pod="openstack/nova-metadata-0" Dec 02 07:47:10 crc kubenswrapper[4895]: I1202 07:47:10.555859 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcdf64bc-7466-4639-a0a3-5cb16bed968a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"dcdf64bc-7466-4639-a0a3-5cb16bed968a\") " pod="openstack/nova-metadata-0" Dec 02 07:47:10 crc kubenswrapper[4895]: I1202 07:47:10.555989 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dcdf64bc-7466-4639-a0a3-5cb16bed968a-logs\") pod \"nova-metadata-0\" (UID: \"dcdf64bc-7466-4639-a0a3-5cb16bed968a\") " 
pod="openstack/nova-metadata-0" Dec 02 07:47:10 crc kubenswrapper[4895]: I1202 07:47:10.556161 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dcdf64bc-7466-4639-a0a3-5cb16bed968a-config-data\") pod \"nova-metadata-0\" (UID: \"dcdf64bc-7466-4639-a0a3-5cb16bed968a\") " pod="openstack/nova-metadata-0" Dec 02 07:47:10 crc kubenswrapper[4895]: I1202 07:47:10.556616 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/dcdf64bc-7466-4639-a0a3-5cb16bed968a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"dcdf64bc-7466-4639-a0a3-5cb16bed968a\") " pod="openstack/nova-metadata-0" Dec 02 07:47:10 crc kubenswrapper[4895]: I1202 07:47:10.576093 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 02 07:47:10 crc kubenswrapper[4895]: I1202 07:47:10.576385 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 02 07:47:10 crc kubenswrapper[4895]: I1202 07:47:10.613940 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 02 07:47:10 crc kubenswrapper[4895]: I1202 07:47:10.659440 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d66gw\" (UniqueName: \"kubernetes.io/projected/dcdf64bc-7466-4639-a0a3-5cb16bed968a-kube-api-access-d66gw\") pod \"nova-metadata-0\" (UID: \"dcdf64bc-7466-4639-a0a3-5cb16bed968a\") " pod="openstack/nova-metadata-0" Dec 02 07:47:10 crc kubenswrapper[4895]: I1202 07:47:10.659569 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcdf64bc-7466-4639-a0a3-5cb16bed968a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: 
\"dcdf64bc-7466-4639-a0a3-5cb16bed968a\") " pod="openstack/nova-metadata-0" Dec 02 07:47:10 crc kubenswrapper[4895]: I1202 07:47:10.659628 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dcdf64bc-7466-4639-a0a3-5cb16bed968a-logs\") pod \"nova-metadata-0\" (UID: \"dcdf64bc-7466-4639-a0a3-5cb16bed968a\") " pod="openstack/nova-metadata-0" Dec 02 07:47:10 crc kubenswrapper[4895]: I1202 07:47:10.659687 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dcdf64bc-7466-4639-a0a3-5cb16bed968a-config-data\") pod \"nova-metadata-0\" (UID: \"dcdf64bc-7466-4639-a0a3-5cb16bed968a\") " pod="openstack/nova-metadata-0" Dec 02 07:47:10 crc kubenswrapper[4895]: I1202 07:47:10.659708 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/dcdf64bc-7466-4639-a0a3-5cb16bed968a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"dcdf64bc-7466-4639-a0a3-5cb16bed968a\") " pod="openstack/nova-metadata-0" Dec 02 07:47:10 crc kubenswrapper[4895]: I1202 07:47:10.660510 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dcdf64bc-7466-4639-a0a3-5cb16bed968a-logs\") pod \"nova-metadata-0\" (UID: \"dcdf64bc-7466-4639-a0a3-5cb16bed968a\") " pod="openstack/nova-metadata-0" Dec 02 07:47:10 crc kubenswrapper[4895]: I1202 07:47:10.665324 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcdf64bc-7466-4639-a0a3-5cb16bed968a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"dcdf64bc-7466-4639-a0a3-5cb16bed968a\") " pod="openstack/nova-metadata-0" Dec 02 07:47:10 crc kubenswrapper[4895]: I1202 07:47:10.665614 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/dcdf64bc-7466-4639-a0a3-5cb16bed968a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"dcdf64bc-7466-4639-a0a3-5cb16bed968a\") " pod="openstack/nova-metadata-0" Dec 02 07:47:10 crc kubenswrapper[4895]: I1202 07:47:10.678612 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dcdf64bc-7466-4639-a0a3-5cb16bed968a-config-data\") pod \"nova-metadata-0\" (UID: \"dcdf64bc-7466-4639-a0a3-5cb16bed968a\") " pod="openstack/nova-metadata-0" Dec 02 07:47:10 crc kubenswrapper[4895]: I1202 07:47:10.682924 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d66gw\" (UniqueName: \"kubernetes.io/projected/dcdf64bc-7466-4639-a0a3-5cb16bed968a-kube-api-access-d66gw\") pod \"nova-metadata-0\" (UID: \"dcdf64bc-7466-4639-a0a3-5cb16bed968a\") " pod="openstack/nova-metadata-0" Dec 02 07:47:10 crc kubenswrapper[4895]: I1202 07:47:10.727254 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 02 07:47:10 crc kubenswrapper[4895]: I1202 07:47:10.727641 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 02 07:47:11 crc kubenswrapper[4895]: I1202 07:47:11.083823 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-757b4f8459-zhzzg" Dec 02 07:47:11 crc kubenswrapper[4895]: I1202 07:47:11.129771 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c61fb3df-f75c-480e-a64c-8436aec04a67","Type":"ContainerStarted","Data":"2d44b21df02a1b6d9237fec6ee1533177b8a03efad3b51389b892affcbdbe9c8"} Dec 02 07:47:11 crc kubenswrapper[4895]: I1202 07:47:11.130135 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 02 07:47:11 crc kubenswrapper[4895]: I1202 07:47:11.162279 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b79289dc-3eed-4009-985d-03e6a8fd36ce" path="/var/lib/kubelet/pods/b79289dc-3eed-4009-985d-03e6a8fd36ce/volumes" Dec 02 07:47:11 crc kubenswrapper[4895]: I1202 07:47:11.171536 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-jl55c"] Dec 02 07:47:11 crc kubenswrapper[4895]: I1202 07:47:11.171877 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c9776ccc5-jl55c" podUID="8510271f-316d-4292-8186-a8003fea402a" containerName="dnsmasq-dns" containerID="cri-o://9ebb1deb43c00b84cf99cfd9c031bff1aff735c6eac15fb6a3fd84fb0f9c894d" gracePeriod=10 Dec 02 07:47:11 crc kubenswrapper[4895]: I1202 07:47:11.207173 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.631191709 podStartE2EDuration="11.207142641s" podCreationTimestamp="2025-12-02 07:47:00 +0000 UTC" firstStartedPulling="2025-12-02 07:47:02.497480787 +0000 UTC m=+1433.668340400" lastFinishedPulling="2025-12-02 07:47:10.073431719 +0000 UTC m=+1441.244291332" observedRunningTime="2025-12-02 07:47:11.16041852 +0000 UTC m=+1442.331278153" 
watchObservedRunningTime="2025-12-02 07:47:11.207142641 +0000 UTC m=+1442.378002254" Dec 02 07:47:11 crc kubenswrapper[4895]: I1202 07:47:11.209150 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 02 07:47:11 crc kubenswrapper[4895]: I1202 07:47:11.213529 4895 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-km4tm" podUID="23d7d649-194e-4822-9ac2-9badcf844980" containerName="registry-server" probeResult="failure" output=< Dec 02 07:47:11 crc kubenswrapper[4895]: timeout: failed to connect service ":50051" within 1s Dec 02 07:47:11 crc kubenswrapper[4895]: > Dec 02 07:47:11 crc kubenswrapper[4895]: I1202 07:47:11.303448 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 07:47:11 crc kubenswrapper[4895]: I1202 07:47:11.434659 4895 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="087ae923-798c-4c9d-bdbb-43d64df1710a" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.180:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 02 07:47:11 crc kubenswrapper[4895]: I1202 07:47:11.476679 4895 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="087ae923-798c-4c9d-bdbb-43d64df1710a" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.180:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 02 07:47:11 crc kubenswrapper[4895]: I1202 07:47:11.856581 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-jl55c" Dec 02 07:47:12 crc kubenswrapper[4895]: I1202 07:47:12.014237 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8510271f-316d-4292-8186-a8003fea402a-dns-swift-storage-0\") pod \"8510271f-316d-4292-8186-a8003fea402a\" (UID: \"8510271f-316d-4292-8186-a8003fea402a\") " Dec 02 07:47:12 crc kubenswrapper[4895]: I1202 07:47:12.014326 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d8sm2\" (UniqueName: \"kubernetes.io/projected/8510271f-316d-4292-8186-a8003fea402a-kube-api-access-d8sm2\") pod \"8510271f-316d-4292-8186-a8003fea402a\" (UID: \"8510271f-316d-4292-8186-a8003fea402a\") " Dec 02 07:47:12 crc kubenswrapper[4895]: I1202 07:47:12.014361 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8510271f-316d-4292-8186-a8003fea402a-dns-svc\") pod \"8510271f-316d-4292-8186-a8003fea402a\" (UID: \"8510271f-316d-4292-8186-a8003fea402a\") " Dec 02 07:47:12 crc kubenswrapper[4895]: I1202 07:47:12.014377 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8510271f-316d-4292-8186-a8003fea402a-config\") pod \"8510271f-316d-4292-8186-a8003fea402a\" (UID: \"8510271f-316d-4292-8186-a8003fea402a\") " Dec 02 07:47:12 crc kubenswrapper[4895]: I1202 07:47:12.014397 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8510271f-316d-4292-8186-a8003fea402a-ovsdbserver-nb\") pod \"8510271f-316d-4292-8186-a8003fea402a\" (UID: \"8510271f-316d-4292-8186-a8003fea402a\") " Dec 02 07:47:12 crc kubenswrapper[4895]: I1202 07:47:12.014500 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/8510271f-316d-4292-8186-a8003fea402a-ovsdbserver-sb\") pod \"8510271f-316d-4292-8186-a8003fea402a\" (UID: \"8510271f-316d-4292-8186-a8003fea402a\") " Dec 02 07:47:12 crc kubenswrapper[4895]: I1202 07:47:12.024013 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8510271f-316d-4292-8186-a8003fea402a-kube-api-access-d8sm2" (OuterVolumeSpecName: "kube-api-access-d8sm2") pod "8510271f-316d-4292-8186-a8003fea402a" (UID: "8510271f-316d-4292-8186-a8003fea402a"). InnerVolumeSpecName "kube-api-access-d8sm2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:47:12 crc kubenswrapper[4895]: I1202 07:47:12.092419 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8510271f-316d-4292-8186-a8003fea402a-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "8510271f-316d-4292-8186-a8003fea402a" (UID: "8510271f-316d-4292-8186-a8003fea402a"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:47:12 crc kubenswrapper[4895]: I1202 07:47:12.108395 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8510271f-316d-4292-8186-a8003fea402a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8510271f-316d-4292-8186-a8003fea402a" (UID: "8510271f-316d-4292-8186-a8003fea402a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:47:12 crc kubenswrapper[4895]: I1202 07:47:12.114213 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8510271f-316d-4292-8186-a8003fea402a-config" (OuterVolumeSpecName: "config") pod "8510271f-316d-4292-8186-a8003fea402a" (UID: "8510271f-316d-4292-8186-a8003fea402a"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:47:12 crc kubenswrapper[4895]: I1202 07:47:12.114915 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8510271f-316d-4292-8186-a8003fea402a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8510271f-316d-4292-8186-a8003fea402a" (UID: "8510271f-316d-4292-8186-a8003fea402a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:47:12 crc kubenswrapper[4895]: I1202 07:47:12.118813 4895 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8510271f-316d-4292-8186-a8003fea402a-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 02 07:47:12 crc kubenswrapper[4895]: I1202 07:47:12.119991 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d8sm2\" (UniqueName: \"kubernetes.io/projected/8510271f-316d-4292-8186-a8003fea402a-kube-api-access-d8sm2\") on node \"crc\" DevicePath \"\"" Dec 02 07:47:12 crc kubenswrapper[4895]: I1202 07:47:12.120016 4895 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8510271f-316d-4292-8186-a8003fea402a-config\") on node \"crc\" DevicePath \"\"" Dec 02 07:47:12 crc kubenswrapper[4895]: I1202 07:47:12.120028 4895 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8510271f-316d-4292-8186-a8003fea402a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 02 07:47:12 crc kubenswrapper[4895]: I1202 07:47:12.120039 4895 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8510271f-316d-4292-8186-a8003fea402a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 02 07:47:12 crc kubenswrapper[4895]: I1202 07:47:12.130688 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/8510271f-316d-4292-8186-a8003fea402a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8510271f-316d-4292-8186-a8003fea402a" (UID: "8510271f-316d-4292-8186-a8003fea402a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:47:12 crc kubenswrapper[4895]: I1202 07:47:12.145638 4895 generic.go:334] "Generic (PLEG): container finished" podID="8510271f-316d-4292-8186-a8003fea402a" containerID="9ebb1deb43c00b84cf99cfd9c031bff1aff735c6eac15fb6a3fd84fb0f9c894d" exitCode=0 Dec 02 07:47:12 crc kubenswrapper[4895]: I1202 07:47:12.145715 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-jl55c" Dec 02 07:47:12 crc kubenswrapper[4895]: I1202 07:47:12.145729 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-jl55c" event={"ID":"8510271f-316d-4292-8186-a8003fea402a","Type":"ContainerDied","Data":"9ebb1deb43c00b84cf99cfd9c031bff1aff735c6eac15fb6a3fd84fb0f9c894d"} Dec 02 07:47:12 crc kubenswrapper[4895]: I1202 07:47:12.145783 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-jl55c" event={"ID":"8510271f-316d-4292-8186-a8003fea402a","Type":"ContainerDied","Data":"f5a1ee8b278e30a9be1958e42c02e647676dd00646d873fb4d16da06fdc6aab0"} Dec 02 07:47:12 crc kubenswrapper[4895]: I1202 07:47:12.145805 4895 scope.go:117] "RemoveContainer" containerID="9ebb1deb43c00b84cf99cfd9c031bff1aff735c6eac15fb6a3fd84fb0f9c894d" Dec 02 07:47:12 crc kubenswrapper[4895]: I1202 07:47:12.154809 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"dcdf64bc-7466-4639-a0a3-5cb16bed968a","Type":"ContainerStarted","Data":"5630c3665a92c590406a32e2d1495a8fd7a377c33abc271648c80730917decbd"} Dec 02 07:47:12 crc kubenswrapper[4895]: I1202 07:47:12.155134 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"dcdf64bc-7466-4639-a0a3-5cb16bed968a","Type":"ContainerStarted","Data":"f85e7cc72c51564223e35e7e78a2fd3acda4e4324a577c2dc648ed358a0e2341"} Dec 02 07:47:12 crc kubenswrapper[4895]: I1202 07:47:12.155203 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"dcdf64bc-7466-4639-a0a3-5cb16bed968a","Type":"ContainerStarted","Data":"733aa10facd67f6b8cc161505740a6438dcebae8c3e742e305f783fb27d4c4b9"} Dec 02 07:47:12 crc kubenswrapper[4895]: I1202 07:47:12.180811 4895 scope.go:117] "RemoveContainer" containerID="95aa6bd0d063807af177fa72abf0b5a4edc46fca52ead31062b8deb065437f05" Dec 02 07:47:12 crc kubenswrapper[4895]: I1202 07:47:12.188294 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.188271657 podStartE2EDuration="2.188271657s" podCreationTimestamp="2025-12-02 07:47:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:47:12.183139278 +0000 UTC m=+1443.353998891" watchObservedRunningTime="2025-12-02 07:47:12.188271657 +0000 UTC m=+1443.359131270" Dec 02 07:47:12 crc kubenswrapper[4895]: I1202 07:47:12.222065 4895 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8510271f-316d-4292-8186-a8003fea402a-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 02 07:47:12 crc kubenswrapper[4895]: I1202 07:47:12.228987 4895 scope.go:117] "RemoveContainer" containerID="9ebb1deb43c00b84cf99cfd9c031bff1aff735c6eac15fb6a3fd84fb0f9c894d" Dec 02 07:47:12 crc kubenswrapper[4895]: I1202 07:47:12.233254 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-jl55c"] Dec 02 07:47:12 crc kubenswrapper[4895]: E1202 07:47:12.233671 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"9ebb1deb43c00b84cf99cfd9c031bff1aff735c6eac15fb6a3fd84fb0f9c894d\": container with ID starting with 9ebb1deb43c00b84cf99cfd9c031bff1aff735c6eac15fb6a3fd84fb0f9c894d not found: ID does not exist" containerID="9ebb1deb43c00b84cf99cfd9c031bff1aff735c6eac15fb6a3fd84fb0f9c894d" Dec 02 07:47:12 crc kubenswrapper[4895]: I1202 07:47:12.233700 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ebb1deb43c00b84cf99cfd9c031bff1aff735c6eac15fb6a3fd84fb0f9c894d"} err="failed to get container status \"9ebb1deb43c00b84cf99cfd9c031bff1aff735c6eac15fb6a3fd84fb0f9c894d\": rpc error: code = NotFound desc = could not find container \"9ebb1deb43c00b84cf99cfd9c031bff1aff735c6eac15fb6a3fd84fb0f9c894d\": container with ID starting with 9ebb1deb43c00b84cf99cfd9c031bff1aff735c6eac15fb6a3fd84fb0f9c894d not found: ID does not exist" Dec 02 07:47:12 crc kubenswrapper[4895]: I1202 07:47:12.233725 4895 scope.go:117] "RemoveContainer" containerID="95aa6bd0d063807af177fa72abf0b5a4edc46fca52ead31062b8deb065437f05" Dec 02 07:47:12 crc kubenswrapper[4895]: E1202 07:47:12.234222 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95aa6bd0d063807af177fa72abf0b5a4edc46fca52ead31062b8deb065437f05\": container with ID starting with 95aa6bd0d063807af177fa72abf0b5a4edc46fca52ead31062b8deb065437f05 not found: ID does not exist" containerID="95aa6bd0d063807af177fa72abf0b5a4edc46fca52ead31062b8deb065437f05" Dec 02 07:47:12 crc kubenswrapper[4895]: I1202 07:47:12.234249 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95aa6bd0d063807af177fa72abf0b5a4edc46fca52ead31062b8deb065437f05"} err="failed to get container status \"95aa6bd0d063807af177fa72abf0b5a4edc46fca52ead31062b8deb065437f05\": rpc error: code = NotFound desc = could not find container \"95aa6bd0d063807af177fa72abf0b5a4edc46fca52ead31062b8deb065437f05\": container with ID 
starting with 95aa6bd0d063807af177fa72abf0b5a4edc46fca52ead31062b8deb065437f05 not found: ID does not exist" Dec 02 07:47:12 crc kubenswrapper[4895]: I1202 07:47:12.263355 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-jl55c"] Dec 02 07:47:13 crc kubenswrapper[4895]: I1202 07:47:13.154719 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8510271f-316d-4292-8186-a8003fea402a" path="/var/lib/kubelet/pods/8510271f-316d-4292-8186-a8003fea402a/volumes" Dec 02 07:47:13 crc kubenswrapper[4895]: I1202 07:47:13.174074 4895 generic.go:334] "Generic (PLEG): container finished" podID="4e5d3f70-f931-473a-af3c-e0858a46e311" containerID="62b374c4faec2438d9bd41034a5738b43a2bf2fbe98618d82d77b24eeb955851" exitCode=0 Dec 02 07:47:13 crc kubenswrapper[4895]: I1202 07:47:13.174165 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-h9wgm" event={"ID":"4e5d3f70-f931-473a-af3c-e0858a46e311","Type":"ContainerDied","Data":"62b374c4faec2438d9bd41034a5738b43a2bf2fbe98618d82d77b24eeb955851"} Dec 02 07:47:14 crc kubenswrapper[4895]: I1202 07:47:14.634142 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-h9wgm" Dec 02 07:47:14 crc kubenswrapper[4895]: I1202 07:47:14.790945 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tjtlt\" (UniqueName: \"kubernetes.io/projected/4e5d3f70-f931-473a-af3c-e0858a46e311-kube-api-access-tjtlt\") pod \"4e5d3f70-f931-473a-af3c-e0858a46e311\" (UID: \"4e5d3f70-f931-473a-af3c-e0858a46e311\") " Dec 02 07:47:14 crc kubenswrapper[4895]: I1202 07:47:14.792360 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e5d3f70-f931-473a-af3c-e0858a46e311-config-data\") pod \"4e5d3f70-f931-473a-af3c-e0858a46e311\" (UID: \"4e5d3f70-f931-473a-af3c-e0858a46e311\") " Dec 02 07:47:14 crc kubenswrapper[4895]: I1202 07:47:14.792395 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4e5d3f70-f931-473a-af3c-e0858a46e311-scripts\") pod \"4e5d3f70-f931-473a-af3c-e0858a46e311\" (UID: \"4e5d3f70-f931-473a-af3c-e0858a46e311\") " Dec 02 07:47:14 crc kubenswrapper[4895]: I1202 07:47:14.792440 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e5d3f70-f931-473a-af3c-e0858a46e311-combined-ca-bundle\") pod \"4e5d3f70-f931-473a-af3c-e0858a46e311\" (UID: \"4e5d3f70-f931-473a-af3c-e0858a46e311\") " Dec 02 07:47:14 crc kubenswrapper[4895]: I1202 07:47:14.805160 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e5d3f70-f931-473a-af3c-e0858a46e311-scripts" (OuterVolumeSpecName: "scripts") pod "4e5d3f70-f931-473a-af3c-e0858a46e311" (UID: "4e5d3f70-f931-473a-af3c-e0858a46e311"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:47:14 crc kubenswrapper[4895]: I1202 07:47:14.805332 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e5d3f70-f931-473a-af3c-e0858a46e311-kube-api-access-tjtlt" (OuterVolumeSpecName: "kube-api-access-tjtlt") pod "4e5d3f70-f931-473a-af3c-e0858a46e311" (UID: "4e5d3f70-f931-473a-af3c-e0858a46e311"). InnerVolumeSpecName "kube-api-access-tjtlt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:47:14 crc kubenswrapper[4895]: I1202 07:47:14.826730 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e5d3f70-f931-473a-af3c-e0858a46e311-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4e5d3f70-f931-473a-af3c-e0858a46e311" (UID: "4e5d3f70-f931-473a-af3c-e0858a46e311"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:47:14 crc kubenswrapper[4895]: I1202 07:47:14.833844 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e5d3f70-f931-473a-af3c-e0858a46e311-config-data" (OuterVolumeSpecName: "config-data") pod "4e5d3f70-f931-473a-af3c-e0858a46e311" (UID: "4e5d3f70-f931-473a-af3c-e0858a46e311"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:47:14 crc kubenswrapper[4895]: I1202 07:47:14.894926 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e5d3f70-f931-473a-af3c-e0858a46e311-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 07:47:14 crc kubenswrapper[4895]: I1202 07:47:14.895343 4895 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4e5d3f70-f931-473a-af3c-e0858a46e311-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 07:47:14 crc kubenswrapper[4895]: I1202 07:47:14.895353 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e5d3f70-f931-473a-af3c-e0858a46e311-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 07:47:14 crc kubenswrapper[4895]: I1202 07:47:14.895367 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tjtlt\" (UniqueName: \"kubernetes.io/projected/4e5d3f70-f931-473a-af3c-e0858a46e311-kube-api-access-tjtlt\") on node \"crc\" DevicePath \"\"" Dec 02 07:47:15 crc kubenswrapper[4895]: I1202 07:47:15.197078 4895 generic.go:334] "Generic (PLEG): container finished" podID="845f4c45-cc2f-4b99-a42f-3c04b18730fe" containerID="391b022ce7caf5397988776e4babbf6233f29f15977989bdaf58d428c0573bde" exitCode=0 Dec 02 07:47:15 crc kubenswrapper[4895]: I1202 07:47:15.197188 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-8svml" event={"ID":"845f4c45-cc2f-4b99-a42f-3c04b18730fe","Type":"ContainerDied","Data":"391b022ce7caf5397988776e4babbf6233f29f15977989bdaf58d428c0573bde"} Dec 02 07:47:15 crc kubenswrapper[4895]: I1202 07:47:15.199067 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-h9wgm" 
event={"ID":"4e5d3f70-f931-473a-af3c-e0858a46e311","Type":"ContainerDied","Data":"2735838beb06df8c15994d68505d185c2efcc5d02191cc86b3dc7d5612f9685f"} Dec 02 07:47:15 crc kubenswrapper[4895]: I1202 07:47:15.199155 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-h9wgm" Dec 02 07:47:15 crc kubenswrapper[4895]: I1202 07:47:15.199257 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2735838beb06df8c15994d68505d185c2efcc5d02191cc86b3dc7d5612f9685f" Dec 02 07:47:15 crc kubenswrapper[4895]: I1202 07:47:15.402491 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 02 07:47:15 crc kubenswrapper[4895]: I1202 07:47:15.402792 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="087ae923-798c-4c9d-bdbb-43d64df1710a" containerName="nova-api-log" containerID="cri-o://3a2c02e21f0ccfffeea8ce8656f74521c9dfa25df5e758e18ce52926179c831d" gracePeriod=30 Dec 02 07:47:15 crc kubenswrapper[4895]: I1202 07:47:15.402963 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="087ae923-798c-4c9d-bdbb-43d64df1710a" containerName="nova-api-api" containerID="cri-o://685053df2014249b5147b703ed5893284e684dacba668ae65f0140e012662566" gracePeriod=30 Dec 02 07:47:15 crc kubenswrapper[4895]: I1202 07:47:15.417667 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 07:47:15 crc kubenswrapper[4895]: I1202 07:47:15.418109 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="76dc5ff2-192c-42e5-80ca-17b405814be6" containerName="nova-scheduler-scheduler" containerID="cri-o://59af142141d77afa67328db26a0d0ea0636463b23d3efd326b7355e7a7cba09b" gracePeriod=30 Dec 02 07:47:15 crc kubenswrapper[4895]: I1202 07:47:15.454128 4895 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 07:47:15 crc kubenswrapper[4895]: I1202 07:47:15.454465 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="dcdf64bc-7466-4639-a0a3-5cb16bed968a" containerName="nova-metadata-log" containerID="cri-o://f85e7cc72c51564223e35e7e78a2fd3acda4e4324a577c2dc648ed358a0e2341" gracePeriod=30 Dec 02 07:47:15 crc kubenswrapper[4895]: I1202 07:47:15.454588 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="dcdf64bc-7466-4639-a0a3-5cb16bed968a" containerName="nova-metadata-metadata" containerID="cri-o://5630c3665a92c590406a32e2d1495a8fd7a377c33abc271648c80730917decbd" gracePeriod=30 Dec 02 07:47:15 crc kubenswrapper[4895]: E1202 07:47:15.589734 4895 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="59af142141d77afa67328db26a0d0ea0636463b23d3efd326b7355e7a7cba09b" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 02 07:47:15 crc kubenswrapper[4895]: E1202 07:47:15.595257 4895 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="59af142141d77afa67328db26a0d0ea0636463b23d3efd326b7355e7a7cba09b" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 02 07:47:15 crc kubenswrapper[4895]: E1202 07:47:15.596631 4895 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="59af142141d77afa67328db26a0d0ea0636463b23d3efd326b7355e7a7cba09b" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 02 07:47:15 crc kubenswrapper[4895]: E1202 
07:47:15.596681 4895 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="76dc5ff2-192c-42e5-80ca-17b405814be6" containerName="nova-scheduler-scheduler" Dec 02 07:47:15 crc kubenswrapper[4895]: I1202 07:47:15.728631 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 02 07:47:15 crc kubenswrapper[4895]: I1202 07:47:15.728715 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 02 07:47:16 crc kubenswrapper[4895]: I1202 07:47:16.152690 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 02 07:47:16 crc kubenswrapper[4895]: I1202 07:47:16.239056 4895 generic.go:334] "Generic (PLEG): container finished" podID="dcdf64bc-7466-4639-a0a3-5cb16bed968a" containerID="5630c3665a92c590406a32e2d1495a8fd7a377c33abc271648c80730917decbd" exitCode=0 Dec 02 07:47:16 crc kubenswrapper[4895]: I1202 07:47:16.239102 4895 generic.go:334] "Generic (PLEG): container finished" podID="dcdf64bc-7466-4639-a0a3-5cb16bed968a" containerID="f85e7cc72c51564223e35e7e78a2fd3acda4e4324a577c2dc648ed358a0e2341" exitCode=143 Dec 02 07:47:16 crc kubenswrapper[4895]: I1202 07:47:16.239152 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"dcdf64bc-7466-4639-a0a3-5cb16bed968a","Type":"ContainerDied","Data":"5630c3665a92c590406a32e2d1495a8fd7a377c33abc271648c80730917decbd"} Dec 02 07:47:16 crc kubenswrapper[4895]: I1202 07:47:16.239187 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"dcdf64bc-7466-4639-a0a3-5cb16bed968a","Type":"ContainerDied","Data":"f85e7cc72c51564223e35e7e78a2fd3acda4e4324a577c2dc648ed358a0e2341"} Dec 02 07:47:16 crc kubenswrapper[4895]: I1202 
07:47:16.239199 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"dcdf64bc-7466-4639-a0a3-5cb16bed968a","Type":"ContainerDied","Data":"733aa10facd67f6b8cc161505740a6438dcebae8c3e742e305f783fb27d4c4b9"} Dec 02 07:47:16 crc kubenswrapper[4895]: I1202 07:47:16.239218 4895 scope.go:117] "RemoveContainer" containerID="5630c3665a92c590406a32e2d1495a8fd7a377c33abc271648c80730917decbd" Dec 02 07:47:16 crc kubenswrapper[4895]: I1202 07:47:16.239909 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 02 07:47:16 crc kubenswrapper[4895]: I1202 07:47:16.247003 4895 generic.go:334] "Generic (PLEG): container finished" podID="087ae923-798c-4c9d-bdbb-43d64df1710a" containerID="3a2c02e21f0ccfffeea8ce8656f74521c9dfa25df5e758e18ce52926179c831d" exitCode=143 Dec 02 07:47:16 crc kubenswrapper[4895]: I1202 07:47:16.247151 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"087ae923-798c-4c9d-bdbb-43d64df1710a","Type":"ContainerDied","Data":"3a2c02e21f0ccfffeea8ce8656f74521c9dfa25df5e758e18ce52926179c831d"} Dec 02 07:47:16 crc kubenswrapper[4895]: I1202 07:47:16.278165 4895 scope.go:117] "RemoveContainer" containerID="f85e7cc72c51564223e35e7e78a2fd3acda4e4324a577c2dc648ed358a0e2341" Dec 02 07:47:16 crc kubenswrapper[4895]: I1202 07:47:16.304222 4895 scope.go:117] "RemoveContainer" containerID="5630c3665a92c590406a32e2d1495a8fd7a377c33abc271648c80730917decbd" Dec 02 07:47:16 crc kubenswrapper[4895]: E1202 07:47:16.304672 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5630c3665a92c590406a32e2d1495a8fd7a377c33abc271648c80730917decbd\": container with ID starting with 5630c3665a92c590406a32e2d1495a8fd7a377c33abc271648c80730917decbd not found: ID does not exist" containerID="5630c3665a92c590406a32e2d1495a8fd7a377c33abc271648c80730917decbd" Dec 02 07:47:16 
crc kubenswrapper[4895]: I1202 07:47:16.304714 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5630c3665a92c590406a32e2d1495a8fd7a377c33abc271648c80730917decbd"} err="failed to get container status \"5630c3665a92c590406a32e2d1495a8fd7a377c33abc271648c80730917decbd\": rpc error: code = NotFound desc = could not find container \"5630c3665a92c590406a32e2d1495a8fd7a377c33abc271648c80730917decbd\": container with ID starting with 5630c3665a92c590406a32e2d1495a8fd7a377c33abc271648c80730917decbd not found: ID does not exist" Dec 02 07:47:16 crc kubenswrapper[4895]: I1202 07:47:16.304755 4895 scope.go:117] "RemoveContainer" containerID="f85e7cc72c51564223e35e7e78a2fd3acda4e4324a577c2dc648ed358a0e2341" Dec 02 07:47:16 crc kubenswrapper[4895]: E1202 07:47:16.305363 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f85e7cc72c51564223e35e7e78a2fd3acda4e4324a577c2dc648ed358a0e2341\": container with ID starting with f85e7cc72c51564223e35e7e78a2fd3acda4e4324a577c2dc648ed358a0e2341 not found: ID does not exist" containerID="f85e7cc72c51564223e35e7e78a2fd3acda4e4324a577c2dc648ed358a0e2341" Dec 02 07:47:16 crc kubenswrapper[4895]: I1202 07:47:16.305441 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f85e7cc72c51564223e35e7e78a2fd3acda4e4324a577c2dc648ed358a0e2341"} err="failed to get container status \"f85e7cc72c51564223e35e7e78a2fd3acda4e4324a577c2dc648ed358a0e2341\": rpc error: code = NotFound desc = could not find container \"f85e7cc72c51564223e35e7e78a2fd3acda4e4324a577c2dc648ed358a0e2341\": container with ID starting with f85e7cc72c51564223e35e7e78a2fd3acda4e4324a577c2dc648ed358a0e2341 not found: ID does not exist" Dec 02 07:47:16 crc kubenswrapper[4895]: I1202 07:47:16.305495 4895 scope.go:117] "RemoveContainer" containerID="5630c3665a92c590406a32e2d1495a8fd7a377c33abc271648c80730917decbd" Dec 02 
07:47:16 crc kubenswrapper[4895]: I1202 07:47:16.305917 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5630c3665a92c590406a32e2d1495a8fd7a377c33abc271648c80730917decbd"} err="failed to get container status \"5630c3665a92c590406a32e2d1495a8fd7a377c33abc271648c80730917decbd\": rpc error: code = NotFound desc = could not find container \"5630c3665a92c590406a32e2d1495a8fd7a377c33abc271648c80730917decbd\": container with ID starting with 5630c3665a92c590406a32e2d1495a8fd7a377c33abc271648c80730917decbd not found: ID does not exist" Dec 02 07:47:16 crc kubenswrapper[4895]: I1202 07:47:16.306033 4895 scope.go:117] "RemoveContainer" containerID="f85e7cc72c51564223e35e7e78a2fd3acda4e4324a577c2dc648ed358a0e2341" Dec 02 07:47:16 crc kubenswrapper[4895]: I1202 07:47:16.306568 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f85e7cc72c51564223e35e7e78a2fd3acda4e4324a577c2dc648ed358a0e2341"} err="failed to get container status \"f85e7cc72c51564223e35e7e78a2fd3acda4e4324a577c2dc648ed358a0e2341\": rpc error: code = NotFound desc = could not find container \"f85e7cc72c51564223e35e7e78a2fd3acda4e4324a577c2dc648ed358a0e2341\": container with ID starting with f85e7cc72c51564223e35e7e78a2fd3acda4e4324a577c2dc648ed358a0e2341 not found: ID does not exist" Dec 02 07:47:16 crc kubenswrapper[4895]: I1202 07:47:16.338486 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dcdf64bc-7466-4639-a0a3-5cb16bed968a-config-data\") pod \"dcdf64bc-7466-4639-a0a3-5cb16bed968a\" (UID: \"dcdf64bc-7466-4639-a0a3-5cb16bed968a\") " Dec 02 07:47:16 crc kubenswrapper[4895]: I1202 07:47:16.338935 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcdf64bc-7466-4639-a0a3-5cb16bed968a-combined-ca-bundle\") pod 
\"dcdf64bc-7466-4639-a0a3-5cb16bed968a\" (UID: \"dcdf64bc-7466-4639-a0a3-5cb16bed968a\") " Dec 02 07:47:16 crc kubenswrapper[4895]: I1202 07:47:16.339009 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/dcdf64bc-7466-4639-a0a3-5cb16bed968a-nova-metadata-tls-certs\") pod \"dcdf64bc-7466-4639-a0a3-5cb16bed968a\" (UID: \"dcdf64bc-7466-4639-a0a3-5cb16bed968a\") " Dec 02 07:47:16 crc kubenswrapper[4895]: I1202 07:47:16.339043 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d66gw\" (UniqueName: \"kubernetes.io/projected/dcdf64bc-7466-4639-a0a3-5cb16bed968a-kube-api-access-d66gw\") pod \"dcdf64bc-7466-4639-a0a3-5cb16bed968a\" (UID: \"dcdf64bc-7466-4639-a0a3-5cb16bed968a\") " Dec 02 07:47:16 crc kubenswrapper[4895]: I1202 07:47:16.339082 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dcdf64bc-7466-4639-a0a3-5cb16bed968a-logs\") pod \"dcdf64bc-7466-4639-a0a3-5cb16bed968a\" (UID: \"dcdf64bc-7466-4639-a0a3-5cb16bed968a\") " Dec 02 07:47:16 crc kubenswrapper[4895]: I1202 07:47:16.340286 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dcdf64bc-7466-4639-a0a3-5cb16bed968a-logs" (OuterVolumeSpecName: "logs") pod "dcdf64bc-7466-4639-a0a3-5cb16bed968a" (UID: "dcdf64bc-7466-4639-a0a3-5cb16bed968a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:47:16 crc kubenswrapper[4895]: I1202 07:47:16.382059 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dcdf64bc-7466-4639-a0a3-5cb16bed968a-kube-api-access-d66gw" (OuterVolumeSpecName: "kube-api-access-d66gw") pod "dcdf64bc-7466-4639-a0a3-5cb16bed968a" (UID: "dcdf64bc-7466-4639-a0a3-5cb16bed968a"). InnerVolumeSpecName "kube-api-access-d66gw". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 07:47:16 crc kubenswrapper[4895]: I1202 07:47:16.412171 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dcdf64bc-7466-4639-a0a3-5cb16bed968a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dcdf64bc-7466-4639-a0a3-5cb16bed968a" (UID: "dcdf64bc-7466-4639-a0a3-5cb16bed968a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 07:47:16 crc kubenswrapper[4895]: I1202 07:47:16.436010 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dcdf64bc-7466-4639-a0a3-5cb16bed968a-config-data" (OuterVolumeSpecName: "config-data") pod "dcdf64bc-7466-4639-a0a3-5cb16bed968a" (UID: "dcdf64bc-7466-4639-a0a3-5cb16bed968a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 07:47:16 crc kubenswrapper[4895]: I1202 07:47:16.443199 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dcdf64bc-7466-4639-a0a3-5cb16bed968a-config-data\") on node \"crc\" DevicePath \"\""
Dec 02 07:47:16 crc kubenswrapper[4895]: I1202 07:47:16.443231 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcdf64bc-7466-4639-a0a3-5cb16bed968a-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 02 07:47:16 crc kubenswrapper[4895]: I1202 07:47:16.443244 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d66gw\" (UniqueName: \"kubernetes.io/projected/dcdf64bc-7466-4639-a0a3-5cb16bed968a-kube-api-access-d66gw\") on node \"crc\" DevicePath \"\""
Dec 02 07:47:16 crc kubenswrapper[4895]: I1202 07:47:16.443254 4895 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dcdf64bc-7466-4639-a0a3-5cb16bed968a-logs\") on node \"crc\" DevicePath \"\""
Dec 02 07:47:16 crc kubenswrapper[4895]: I1202 07:47:16.470617 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dcdf64bc-7466-4639-a0a3-5cb16bed968a-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "dcdf64bc-7466-4639-a0a3-5cb16bed968a" (UID: "dcdf64bc-7466-4639-a0a3-5cb16bed968a"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 07:47:16 crc kubenswrapper[4895]: I1202 07:47:16.545406 4895 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/dcdf64bc-7466-4639-a0a3-5cb16bed968a-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\""
Dec 02 07:47:16 crc kubenswrapper[4895]: I1202 07:47:16.685126 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-8svml"
Dec 02 07:47:16 crc kubenswrapper[4895]: I1202 07:47:16.705377 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Dec 02 07:47:16 crc kubenswrapper[4895]: I1202 07:47:16.721093 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Dec 02 07:47:16 crc kubenswrapper[4895]: I1202 07:47:16.743782 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Dec 02 07:47:16 crc kubenswrapper[4895]: E1202 07:47:16.744544 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8510271f-316d-4292-8186-a8003fea402a" containerName="dnsmasq-dns"
Dec 02 07:47:16 crc kubenswrapper[4895]: I1202 07:47:16.744567 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="8510271f-316d-4292-8186-a8003fea402a" containerName="dnsmasq-dns"
Dec 02 07:47:16 crc kubenswrapper[4895]: E1202 07:47:16.744593 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcdf64bc-7466-4639-a0a3-5cb16bed968a" containerName="nova-metadata-log"
Dec 02 07:47:16 crc kubenswrapper[4895]: I1202 07:47:16.744604 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcdf64bc-7466-4639-a0a3-5cb16bed968a" containerName="nova-metadata-log"
Dec 02 07:47:16 crc kubenswrapper[4895]: E1202 07:47:16.744616 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="845f4c45-cc2f-4b99-a42f-3c04b18730fe" containerName="nova-cell1-conductor-db-sync"
Dec 02 07:47:16 crc kubenswrapper[4895]: I1202 07:47:16.744624 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="845f4c45-cc2f-4b99-a42f-3c04b18730fe" containerName="nova-cell1-conductor-db-sync"
Dec 02 07:47:16 crc kubenswrapper[4895]: E1202 07:47:16.744642 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcdf64bc-7466-4639-a0a3-5cb16bed968a" containerName="nova-metadata-metadata"
Dec 02 07:47:16 crc kubenswrapper[4895]: I1202 07:47:16.744649 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcdf64bc-7466-4639-a0a3-5cb16bed968a" containerName="nova-metadata-metadata"
Dec 02 07:47:16 crc kubenswrapper[4895]: E1202 07:47:16.744667 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e5d3f70-f931-473a-af3c-e0858a46e311" containerName="nova-manage"
Dec 02 07:47:16 crc kubenswrapper[4895]: I1202 07:47:16.744672 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e5d3f70-f931-473a-af3c-e0858a46e311" containerName="nova-manage"
Dec 02 07:47:16 crc kubenswrapper[4895]: E1202 07:47:16.744699 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8510271f-316d-4292-8186-a8003fea402a" containerName="init"
Dec 02 07:47:16 crc kubenswrapper[4895]: I1202 07:47:16.744705 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="8510271f-316d-4292-8186-a8003fea402a" containerName="init"
Dec 02 07:47:16 crc kubenswrapper[4895]: I1202 07:47:16.744928 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="dcdf64bc-7466-4639-a0a3-5cb16bed968a" containerName="nova-metadata-log"
Dec 02 07:47:16 crc kubenswrapper[4895]: I1202 07:47:16.744942 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="dcdf64bc-7466-4639-a0a3-5cb16bed968a" containerName="nova-metadata-metadata"
Dec 02 07:47:16 crc kubenswrapper[4895]: I1202 07:47:16.744956 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="845f4c45-cc2f-4b99-a42f-3c04b18730fe" containerName="nova-cell1-conductor-db-sync"
Dec 02 07:47:16 crc kubenswrapper[4895]: I1202 07:47:16.744968 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="8510271f-316d-4292-8186-a8003fea402a" containerName="dnsmasq-dns"
Dec 02 07:47:16 crc kubenswrapper[4895]: I1202 07:47:16.744982 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e5d3f70-f931-473a-af3c-e0858a46e311" containerName="nova-manage"
Dec 02 07:47:16 crc kubenswrapper[4895]: I1202 07:47:16.750021 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Dec 02 07:47:16 crc kubenswrapper[4895]: I1202 07:47:16.754221 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Dec 02 07:47:16 crc kubenswrapper[4895]: I1202 07:47:16.754637 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Dec 02 07:47:16 crc kubenswrapper[4895]: I1202 07:47:16.756772 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Dec 02 07:47:16 crc kubenswrapper[4895]: I1202 07:47:16.851538 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bn229\" (UniqueName: \"kubernetes.io/projected/845f4c45-cc2f-4b99-a42f-3c04b18730fe-kube-api-access-bn229\") pod \"845f4c45-cc2f-4b99-a42f-3c04b18730fe\" (UID: \"845f4c45-cc2f-4b99-a42f-3c04b18730fe\") "
Dec 02 07:47:16 crc kubenswrapper[4895]: I1202 07:47:16.852014 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/845f4c45-cc2f-4b99-a42f-3c04b18730fe-config-data\") pod \"845f4c45-cc2f-4b99-a42f-3c04b18730fe\" (UID: \"845f4c45-cc2f-4b99-a42f-3c04b18730fe\") "
Dec 02 07:47:16 crc kubenswrapper[4895]: I1202 07:47:16.852178 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/845f4c45-cc2f-4b99-a42f-3c04b18730fe-scripts\") pod \"845f4c45-cc2f-4b99-a42f-3c04b18730fe\" (UID: \"845f4c45-cc2f-4b99-a42f-3c04b18730fe\") "
Dec 02 07:47:16 crc kubenswrapper[4895]: I1202 07:47:16.852543 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/845f4c45-cc2f-4b99-a42f-3c04b18730fe-combined-ca-bundle\") pod \"845f4c45-cc2f-4b99-a42f-3c04b18730fe\" (UID: \"845f4c45-cc2f-4b99-a42f-3c04b18730fe\") "
Dec 02 07:47:16 crc kubenswrapper[4895]: I1202 07:47:16.853086 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blbqv\" (UniqueName: \"kubernetes.io/projected/23ebd00f-3697-43da-a335-42d260a62237-kube-api-access-blbqv\") pod \"nova-metadata-0\" (UID: \"23ebd00f-3697-43da-a335-42d260a62237\") " pod="openstack/nova-metadata-0"
Dec 02 07:47:16 crc kubenswrapper[4895]: I1202 07:47:16.853299 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/23ebd00f-3697-43da-a335-42d260a62237-logs\") pod \"nova-metadata-0\" (UID: \"23ebd00f-3697-43da-a335-42d260a62237\") " pod="openstack/nova-metadata-0"
Dec 02 07:47:16 crc kubenswrapper[4895]: I1202 07:47:16.853441 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23ebd00f-3697-43da-a335-42d260a62237-config-data\") pod \"nova-metadata-0\" (UID: \"23ebd00f-3697-43da-a335-42d260a62237\") " pod="openstack/nova-metadata-0"
Dec 02 07:47:16 crc kubenswrapper[4895]: I1202 07:47:16.853622 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23ebd00f-3697-43da-a335-42d260a62237-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"23ebd00f-3697-43da-a335-42d260a62237\") " pod="openstack/nova-metadata-0"
Dec 02 07:47:16 crc kubenswrapper[4895]: I1202 07:47:16.853816 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/23ebd00f-3697-43da-a335-42d260a62237-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"23ebd00f-3697-43da-a335-42d260a62237\") " pod="openstack/nova-metadata-0"
Dec 02 07:47:16 crc kubenswrapper[4895]: I1202 07:47:16.870710 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/845f4c45-cc2f-4b99-a42f-3c04b18730fe-kube-api-access-bn229" (OuterVolumeSpecName: "kube-api-access-bn229") pod "845f4c45-cc2f-4b99-a42f-3c04b18730fe" (UID: "845f4c45-cc2f-4b99-a42f-3c04b18730fe"). InnerVolumeSpecName "kube-api-access-bn229". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 07:47:16 crc kubenswrapper[4895]: I1202 07:47:16.870980 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/845f4c45-cc2f-4b99-a42f-3c04b18730fe-scripts" (OuterVolumeSpecName: "scripts") pod "845f4c45-cc2f-4b99-a42f-3c04b18730fe" (UID: "845f4c45-cc2f-4b99-a42f-3c04b18730fe"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 07:47:16 crc kubenswrapper[4895]: I1202 07:47:16.886044 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/845f4c45-cc2f-4b99-a42f-3c04b18730fe-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "845f4c45-cc2f-4b99-a42f-3c04b18730fe" (UID: "845f4c45-cc2f-4b99-a42f-3c04b18730fe"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 07:47:16 crc kubenswrapper[4895]: I1202 07:47:16.890082 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/845f4c45-cc2f-4b99-a42f-3c04b18730fe-config-data" (OuterVolumeSpecName: "config-data") pod "845f4c45-cc2f-4b99-a42f-3c04b18730fe" (UID: "845f4c45-cc2f-4b99-a42f-3c04b18730fe"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 07:47:16 crc kubenswrapper[4895]: I1202 07:47:16.957684 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/23ebd00f-3697-43da-a335-42d260a62237-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"23ebd00f-3697-43da-a335-42d260a62237\") " pod="openstack/nova-metadata-0"
Dec 02 07:47:16 crc kubenswrapper[4895]: I1202 07:47:16.957814 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-blbqv\" (UniqueName: \"kubernetes.io/projected/23ebd00f-3697-43da-a335-42d260a62237-kube-api-access-blbqv\") pod \"nova-metadata-0\" (UID: \"23ebd00f-3697-43da-a335-42d260a62237\") " pod="openstack/nova-metadata-0"
Dec 02 07:47:16 crc kubenswrapper[4895]: I1202 07:47:16.957876 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/23ebd00f-3697-43da-a335-42d260a62237-logs\") pod \"nova-metadata-0\" (UID: \"23ebd00f-3697-43da-a335-42d260a62237\") " pod="openstack/nova-metadata-0"
Dec 02 07:47:16 crc kubenswrapper[4895]: I1202 07:47:16.957905 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23ebd00f-3697-43da-a335-42d260a62237-config-data\") pod \"nova-metadata-0\" (UID: \"23ebd00f-3697-43da-a335-42d260a62237\") " pod="openstack/nova-metadata-0"
Dec 02 07:47:16 crc kubenswrapper[4895]: I1202 07:47:16.957940 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23ebd00f-3697-43da-a335-42d260a62237-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"23ebd00f-3697-43da-a335-42d260a62237\") " pod="openstack/nova-metadata-0"
Dec 02 07:47:16 crc kubenswrapper[4895]: I1202 07:47:16.958009 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/845f4c45-cc2f-4b99-a42f-3c04b18730fe-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 02 07:47:16 crc kubenswrapper[4895]: I1202 07:47:16.958026 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bn229\" (UniqueName: \"kubernetes.io/projected/845f4c45-cc2f-4b99-a42f-3c04b18730fe-kube-api-access-bn229\") on node \"crc\" DevicePath \"\""
Dec 02 07:47:16 crc kubenswrapper[4895]: I1202 07:47:16.958040 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/845f4c45-cc2f-4b99-a42f-3c04b18730fe-config-data\") on node \"crc\" DevicePath \"\""
Dec 02 07:47:16 crc kubenswrapper[4895]: I1202 07:47:16.958050 4895 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/845f4c45-cc2f-4b99-a42f-3c04b18730fe-scripts\") on node \"crc\" DevicePath \"\""
Dec 02 07:47:16 crc kubenswrapper[4895]: I1202 07:47:16.959450 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/23ebd00f-3697-43da-a335-42d260a62237-logs\") pod \"nova-metadata-0\" (UID: \"23ebd00f-3697-43da-a335-42d260a62237\") " pod="openstack/nova-metadata-0"
Dec 02 07:47:16 crc kubenswrapper[4895]: I1202 07:47:16.962795 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23ebd00f-3697-43da-a335-42d260a62237-config-data\") pod \"nova-metadata-0\" (UID: \"23ebd00f-3697-43da-a335-42d260a62237\") " pod="openstack/nova-metadata-0"
Dec 02 07:47:16 crc kubenswrapper[4895]: I1202 07:47:16.963397 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/23ebd00f-3697-43da-a335-42d260a62237-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"23ebd00f-3697-43da-a335-42d260a62237\") " pod="openstack/nova-metadata-0"
Dec 02 07:47:16 crc kubenswrapper[4895]: I1202 07:47:16.966483 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23ebd00f-3697-43da-a335-42d260a62237-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"23ebd00f-3697-43da-a335-42d260a62237\") " pod="openstack/nova-metadata-0"
Dec 02 07:47:16 crc kubenswrapper[4895]: I1202 07:47:16.977940 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-blbqv\" (UniqueName: \"kubernetes.io/projected/23ebd00f-3697-43da-a335-42d260a62237-kube-api-access-blbqv\") pod \"nova-metadata-0\" (UID: \"23ebd00f-3697-43da-a335-42d260a62237\") " pod="openstack/nova-metadata-0"
Dec 02 07:47:17 crc kubenswrapper[4895]: I1202 07:47:17.081168 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Dec 02 07:47:17 crc kubenswrapper[4895]: I1202 07:47:17.155712 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dcdf64bc-7466-4639-a0a3-5cb16bed968a" path="/var/lib/kubelet/pods/dcdf64bc-7466-4639-a0a3-5cb16bed968a/volumes"
Dec 02 07:47:17 crc kubenswrapper[4895]: I1202 07:47:17.290013 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-8svml" event={"ID":"845f4c45-cc2f-4b99-a42f-3c04b18730fe","Type":"ContainerDied","Data":"a0710d417855ab420b0183608216508610497883652cb830574cff3ead83f14a"}
Dec 02 07:47:17 crc kubenswrapper[4895]: I1202 07:47:17.290110 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-8svml"
Dec 02 07:47:17 crc kubenswrapper[4895]: I1202 07:47:17.292497 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a0710d417855ab420b0183608216508610497883652cb830574cff3ead83f14a"
Dec 02 07:47:17 crc kubenswrapper[4895]: I1202 07:47:17.339176 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"]
Dec 02 07:47:17 crc kubenswrapper[4895]: I1202 07:47:17.342056 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Dec 02 07:47:17 crc kubenswrapper[4895]: I1202 07:47:17.352303 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data"
Dec 02 07:47:17 crc kubenswrapper[4895]: I1202 07:47:17.359447 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Dec 02 07:47:17 crc kubenswrapper[4895]: I1202 07:47:17.474010 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31223325-1372-4ea6-867e-f511b7dffc09-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"31223325-1372-4ea6-867e-f511b7dffc09\") " pod="openstack/nova-cell1-conductor-0"
Dec 02 07:47:17 crc kubenswrapper[4895]: I1202 07:47:17.474376 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31223325-1372-4ea6-867e-f511b7dffc09-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"31223325-1372-4ea6-867e-f511b7dffc09\") " pod="openstack/nova-cell1-conductor-0"
Dec 02 07:47:17 crc kubenswrapper[4895]: I1202 07:47:17.474494 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kj5fq\" (UniqueName: \"kubernetes.io/projected/31223325-1372-4ea6-867e-f511b7dffc09-kube-api-access-kj5fq\") pod \"nova-cell1-conductor-0\" (UID: \"31223325-1372-4ea6-867e-f511b7dffc09\") " pod="openstack/nova-cell1-conductor-0"
Dec 02 07:47:17 crc kubenswrapper[4895]: I1202 07:47:17.577209 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31223325-1372-4ea6-867e-f511b7dffc09-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"31223325-1372-4ea6-867e-f511b7dffc09\") " pod="openstack/nova-cell1-conductor-0"
Dec 02 07:47:17 crc kubenswrapper[4895]: I1202 07:47:17.577274 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kj5fq\" (UniqueName: \"kubernetes.io/projected/31223325-1372-4ea6-867e-f511b7dffc09-kube-api-access-kj5fq\") pod \"nova-cell1-conductor-0\" (UID: \"31223325-1372-4ea6-867e-f511b7dffc09\") " pod="openstack/nova-cell1-conductor-0"
Dec 02 07:47:17 crc kubenswrapper[4895]: I1202 07:47:17.577401 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31223325-1372-4ea6-867e-f511b7dffc09-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"31223325-1372-4ea6-867e-f511b7dffc09\") " pod="openstack/nova-cell1-conductor-0"
Dec 02 07:47:17 crc kubenswrapper[4895]: I1202 07:47:17.585696 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31223325-1372-4ea6-867e-f511b7dffc09-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"31223325-1372-4ea6-867e-f511b7dffc09\") " pod="openstack/nova-cell1-conductor-0"
Dec 02 07:47:17 crc kubenswrapper[4895]: I1202 07:47:17.585897 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31223325-1372-4ea6-867e-f511b7dffc09-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"31223325-1372-4ea6-867e-f511b7dffc09\") " pod="openstack/nova-cell1-conductor-0"
Dec 02 07:47:17 crc kubenswrapper[4895]: I1202 07:47:17.597877 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kj5fq\" (UniqueName: \"kubernetes.io/projected/31223325-1372-4ea6-867e-f511b7dffc09-kube-api-access-kj5fq\") pod \"nova-cell1-conductor-0\" (UID: \"31223325-1372-4ea6-867e-f511b7dffc09\") " pod="openstack/nova-cell1-conductor-0"
Dec 02 07:47:17 crc kubenswrapper[4895]: I1202 07:47:17.643315 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Dec 02 07:47:17 crc kubenswrapper[4895]: I1202 07:47:17.675811 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Dec 02 07:47:18 crc kubenswrapper[4895]: W1202 07:47:18.214568 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod31223325_1372_4ea6_867e_f511b7dffc09.slice/crio-9d571a69d16683c5710f79ff507e78a1b7e707cb6234248927b8f7b39697f311 WatchSource:0}: Error finding container 9d571a69d16683c5710f79ff507e78a1b7e707cb6234248927b8f7b39697f311: Status 404 returned error can't find the container with id 9d571a69d16683c5710f79ff507e78a1b7e707cb6234248927b8f7b39697f311
Dec 02 07:47:18 crc kubenswrapper[4895]: I1202 07:47:18.217301 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Dec 02 07:47:18 crc kubenswrapper[4895]: I1202 07:47:18.301832 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"23ebd00f-3697-43da-a335-42d260a62237","Type":"ContainerStarted","Data":"d33e2d232e3b83713b4b5bc7098bf0d68f44aaac578ecdb93f62f57e2d517660"}
Dec 02 07:47:18 crc kubenswrapper[4895]: I1202 07:47:18.301904 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"23ebd00f-3697-43da-a335-42d260a62237","Type":"ContainerStarted","Data":"7a93b7b1fa5e4df6ef9a5fd45c7bfb4f2453204ed865bcce093ea30558cc6ba4"}
Dec 02 07:47:18 crc kubenswrapper[4895]: I1202 07:47:18.301918 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"23ebd00f-3697-43da-a335-42d260a62237","Type":"ContainerStarted","Data":"a035b29b4053c54ec1c26c456cf2504604fab837bd6d2ac03adce178b1636081"}
Dec 02 07:47:18 crc kubenswrapper[4895]: I1202 07:47:18.305466 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"31223325-1372-4ea6-867e-f511b7dffc09","Type":"ContainerStarted","Data":"9d571a69d16683c5710f79ff507e78a1b7e707cb6234248927b8f7b39697f311"}
Dec 02 07:47:18 crc kubenswrapper[4895]: I1202 07:47:18.333097 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.333076013 podStartE2EDuration="2.333076013s" podCreationTimestamp="2025-12-02 07:47:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:47:18.329547225 +0000 UTC m=+1449.500406838" watchObservedRunningTime="2025-12-02 07:47:18.333076013 +0000 UTC m=+1449.503935626"
Dec 02 07:47:19 crc kubenswrapper[4895]: I1202 07:47:19.130200 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Dec 02 07:47:19 crc kubenswrapper[4895]: I1202 07:47:19.256684 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-62ssm\" (UniqueName: \"kubernetes.io/projected/087ae923-798c-4c9d-bdbb-43d64df1710a-kube-api-access-62ssm\") pod \"087ae923-798c-4c9d-bdbb-43d64df1710a\" (UID: \"087ae923-798c-4c9d-bdbb-43d64df1710a\") "
Dec 02 07:47:19 crc kubenswrapper[4895]: I1202 07:47:19.258706 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/087ae923-798c-4c9d-bdbb-43d64df1710a-combined-ca-bundle\") pod \"087ae923-798c-4c9d-bdbb-43d64df1710a\" (UID: \"087ae923-798c-4c9d-bdbb-43d64df1710a\") "
Dec 02 07:47:19 crc kubenswrapper[4895]: I1202 07:47:19.258851 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/087ae923-798c-4c9d-bdbb-43d64df1710a-logs\") pod \"087ae923-798c-4c9d-bdbb-43d64df1710a\" (UID: \"087ae923-798c-4c9d-bdbb-43d64df1710a\") "
Dec 02 07:47:19 crc kubenswrapper[4895]: I1202 07:47:19.258936 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/087ae923-798c-4c9d-bdbb-43d64df1710a-config-data\") pod \"087ae923-798c-4c9d-bdbb-43d64df1710a\" (UID: \"087ae923-798c-4c9d-bdbb-43d64df1710a\") "
Dec 02 07:47:19 crc kubenswrapper[4895]: I1202 07:47:19.260115 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/087ae923-798c-4c9d-bdbb-43d64df1710a-logs" (OuterVolumeSpecName: "logs") pod "087ae923-798c-4c9d-bdbb-43d64df1710a" (UID: "087ae923-798c-4c9d-bdbb-43d64df1710a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 02 07:47:19 crc kubenswrapper[4895]: I1202 07:47:19.263660 4895 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/087ae923-798c-4c9d-bdbb-43d64df1710a-logs\") on node \"crc\" DevicePath \"\""
Dec 02 07:47:19 crc kubenswrapper[4895]: I1202 07:47:19.267801 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/087ae923-798c-4c9d-bdbb-43d64df1710a-kube-api-access-62ssm" (OuterVolumeSpecName: "kube-api-access-62ssm") pod "087ae923-798c-4c9d-bdbb-43d64df1710a" (UID: "087ae923-798c-4c9d-bdbb-43d64df1710a"). InnerVolumeSpecName "kube-api-access-62ssm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 07:47:19 crc kubenswrapper[4895]: I1202 07:47:19.295901 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/087ae923-798c-4c9d-bdbb-43d64df1710a-config-data" (OuterVolumeSpecName: "config-data") pod "087ae923-798c-4c9d-bdbb-43d64df1710a" (UID: "087ae923-798c-4c9d-bdbb-43d64df1710a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 07:47:19 crc kubenswrapper[4895]: I1202 07:47:19.304354 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/087ae923-798c-4c9d-bdbb-43d64df1710a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "087ae923-798c-4c9d-bdbb-43d64df1710a" (UID: "087ae923-798c-4c9d-bdbb-43d64df1710a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 07:47:19 crc kubenswrapper[4895]: I1202 07:47:19.319136 4895 generic.go:334] "Generic (PLEG): container finished" podID="76dc5ff2-192c-42e5-80ca-17b405814be6" containerID="59af142141d77afa67328db26a0d0ea0636463b23d3efd326b7355e7a7cba09b" exitCode=0
Dec 02 07:47:19 crc kubenswrapper[4895]: I1202 07:47:19.319297 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"76dc5ff2-192c-42e5-80ca-17b405814be6","Type":"ContainerDied","Data":"59af142141d77afa67328db26a0d0ea0636463b23d3efd326b7355e7a7cba09b"}
Dec 02 07:47:19 crc kubenswrapper[4895]: I1202 07:47:19.335368 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"31223325-1372-4ea6-867e-f511b7dffc09","Type":"ContainerStarted","Data":"45d9908b6c5cd875b205c3155ba480192c4dc6d4df37c9c88146d86fdf68c7e6"}
Dec 02 07:47:19 crc kubenswrapper[4895]: I1202 07:47:19.335563 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0"
Dec 02 07:47:19 crc kubenswrapper[4895]: I1202 07:47:19.344446 4895 generic.go:334] "Generic (PLEG): container finished" podID="087ae923-798c-4c9d-bdbb-43d64df1710a" containerID="685053df2014249b5147b703ed5893284e684dacba668ae65f0140e012662566" exitCode=0
Dec 02 07:47:19 crc kubenswrapper[4895]: I1202 07:47:19.345013 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"087ae923-798c-4c9d-bdbb-43d64df1710a","Type":"ContainerDied","Data":"685053df2014249b5147b703ed5893284e684dacba668ae65f0140e012662566"}
Dec 02 07:47:19 crc kubenswrapper[4895]: I1202 07:47:19.345227 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"087ae923-798c-4c9d-bdbb-43d64df1710a","Type":"ContainerDied","Data":"c6256e9b736585193e92d2062e31cd6da7cd9f8a6a4255ea477f4f0319606403"}
Dec 02 07:47:19 crc kubenswrapper[4895]: I1202 07:47:19.345447 4895 scope.go:117] "RemoveContainer" containerID="685053df2014249b5147b703ed5893284e684dacba668ae65f0140e012662566"
Dec 02 07:47:19 crc kubenswrapper[4895]: I1202 07:47:19.345102 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Dec 02 07:47:19 crc kubenswrapper[4895]: I1202 07:47:19.366056 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/087ae923-798c-4c9d-bdbb-43d64df1710a-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 02 07:47:19 crc kubenswrapper[4895]: I1202 07:47:19.366108 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/087ae923-798c-4c9d-bdbb-43d64df1710a-config-data\") on node \"crc\" DevicePath \"\""
Dec 02 07:47:19 crc kubenswrapper[4895]: I1202 07:47:19.366122 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-62ssm\" (UniqueName: \"kubernetes.io/projected/087ae923-798c-4c9d-bdbb-43d64df1710a-kube-api-access-62ssm\") on node \"crc\" DevicePath \"\""
Dec 02 07:47:19 crc kubenswrapper[4895]: I1202 07:47:19.380507 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.380448213 podStartE2EDuration="2.380448213s" podCreationTimestamp="2025-12-02 07:47:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:47:19.356603558 +0000 UTC m=+1450.527463171" watchObservedRunningTime="2025-12-02 07:47:19.380448213 +0000 UTC m=+1450.551307826"
Dec 02 07:47:19 crc kubenswrapper[4895]: I1202 07:47:19.445846 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Dec 02 07:47:19 crc kubenswrapper[4895]: I1202 07:47:19.455301 4895 scope.go:117] "RemoveContainer" containerID="3a2c02e21f0ccfffeea8ce8656f74521c9dfa25df5e758e18ce52926179c831d"
Dec 02 07:47:19 crc kubenswrapper[4895]: I1202 07:47:19.464107 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Dec 02 07:47:19 crc kubenswrapper[4895]: I1202 07:47:19.479812 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Dec 02 07:47:19 crc kubenswrapper[4895]: E1202 07:47:19.480611 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="087ae923-798c-4c9d-bdbb-43d64df1710a" containerName="nova-api-api"
Dec 02 07:47:19 crc kubenswrapper[4895]: I1202 07:47:19.480640 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="087ae923-798c-4c9d-bdbb-43d64df1710a" containerName="nova-api-api"
Dec 02 07:47:19 crc kubenswrapper[4895]: E1202 07:47:19.480663 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="087ae923-798c-4c9d-bdbb-43d64df1710a" containerName="nova-api-log"
Dec 02 07:47:19 crc kubenswrapper[4895]: I1202 07:47:19.480672 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="087ae923-798c-4c9d-bdbb-43d64df1710a" containerName="nova-api-log"
Dec 02 07:47:19 crc kubenswrapper[4895]: I1202 07:47:19.480973 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="087ae923-798c-4c9d-bdbb-43d64df1710a" containerName="nova-api-api"
Dec 02 07:47:19 crc kubenswrapper[4895]: I1202 07:47:19.480991 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="087ae923-798c-4c9d-bdbb-43d64df1710a" containerName="nova-api-log"
Dec 02 07:47:19 crc kubenswrapper[4895]: I1202 07:47:19.486171 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Dec 02 07:47:19 crc kubenswrapper[4895]: I1202 07:47:19.488025 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Dec 02 07:47:19 crc kubenswrapper[4895]: I1202 07:47:19.491259 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Dec 02 07:47:19 crc kubenswrapper[4895]: I1202 07:47:19.495136 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Dec 02 07:47:19 crc kubenswrapper[4895]: I1202 07:47:19.498761 4895 scope.go:117] "RemoveContainer" containerID="685053df2014249b5147b703ed5893284e684dacba668ae65f0140e012662566"
Dec 02 07:47:19 crc kubenswrapper[4895]: E1202 07:47:19.500388 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"685053df2014249b5147b703ed5893284e684dacba668ae65f0140e012662566\": container with ID starting with 685053df2014249b5147b703ed5893284e684dacba668ae65f0140e012662566 not found: ID does not exist" containerID="685053df2014249b5147b703ed5893284e684dacba668ae65f0140e012662566"
Dec 02 07:47:19 crc kubenswrapper[4895]: I1202 07:47:19.500417 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"685053df2014249b5147b703ed5893284e684dacba668ae65f0140e012662566"} err="failed to get container status \"685053df2014249b5147b703ed5893284e684dacba668ae65f0140e012662566\": rpc error: code = NotFound desc = could not find container \"685053df2014249b5147b703ed5893284e684dacba668ae65f0140e012662566\": container with ID starting with 685053df2014249b5147b703ed5893284e684dacba668ae65f0140e012662566 not found: ID does not exist"
Dec 02 07:47:19 crc kubenswrapper[4895]: I1202 07:47:19.500442 4895 scope.go:117] "RemoveContainer" containerID="3a2c02e21f0ccfffeea8ce8656f74521c9dfa25df5e758e18ce52926179c831d"
Dec 02 07:47:19 crc kubenswrapper[4895]: E1202 07:47:19.515619 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a2c02e21f0ccfffeea8ce8656f74521c9dfa25df5e758e18ce52926179c831d\": container with ID starting with 3a2c02e21f0ccfffeea8ce8656f74521c9dfa25df5e758e18ce52926179c831d not found: ID does not exist" containerID="3a2c02e21f0ccfffeea8ce8656f74521c9dfa25df5e758e18ce52926179c831d"
Dec 02 07:47:19 crc kubenswrapper[4895]: I1202 07:47:19.515673 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a2c02e21f0ccfffeea8ce8656f74521c9dfa25df5e758e18ce52926179c831d"} err="failed to get container status \"3a2c02e21f0ccfffeea8ce8656f74521c9dfa25df5e758e18ce52926179c831d\": rpc error: code = NotFound desc = could not find container \"3a2c02e21f0ccfffeea8ce8656f74521c9dfa25df5e758e18ce52926179c831d\": container with ID starting with 3a2c02e21f0ccfffeea8ce8656f74521c9dfa25df5e758e18ce52926179c831d not found: ID does not exist"
Dec 02 07:47:19 crc kubenswrapper[4895]: I1202 07:47:19.580293 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gd4jw\" (UniqueName: \"kubernetes.io/projected/76dc5ff2-192c-42e5-80ca-17b405814be6-kube-api-access-gd4jw\") pod \"76dc5ff2-192c-42e5-80ca-17b405814be6\" (UID: \"76dc5ff2-192c-42e5-80ca-17b405814be6\") "
Dec 02 07:47:19 crc kubenswrapper[4895]: I1202 07:47:19.580451 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76dc5ff2-192c-42e5-80ca-17b405814be6-combined-ca-bundle\") pod \"76dc5ff2-192c-42e5-80ca-17b405814be6\" (UID: \"76dc5ff2-192c-42e5-80ca-17b405814be6\") "
Dec 02 07:47:19 crc kubenswrapper[4895]: I1202 07:47:19.580571 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76dc5ff2-192c-42e5-80ca-17b405814be6-config-data\") pod \"76dc5ff2-192c-42e5-80ca-17b405814be6\" (UID: \"76dc5ff2-192c-42e5-80ca-17b405814be6\") " Dec 02 07:47:19 crc kubenswrapper[4895]: I1202 07:47:19.581717 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/de57c107-7f7e-4d49-a986-9ffc3ca3d828-logs\") pod \"nova-api-0\" (UID: \"de57c107-7f7e-4d49-a986-9ffc3ca3d828\") " pod="openstack/nova-api-0" Dec 02 07:47:19 crc kubenswrapper[4895]: I1202 07:47:19.581867 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4d9b8\" (UniqueName: \"kubernetes.io/projected/de57c107-7f7e-4d49-a986-9ffc3ca3d828-kube-api-access-4d9b8\") pod \"nova-api-0\" (UID: \"de57c107-7f7e-4d49-a986-9ffc3ca3d828\") " pod="openstack/nova-api-0" Dec 02 07:47:19 crc kubenswrapper[4895]: I1202 07:47:19.581920 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de57c107-7f7e-4d49-a986-9ffc3ca3d828-config-data\") pod \"nova-api-0\" (UID: \"de57c107-7f7e-4d49-a986-9ffc3ca3d828\") " pod="openstack/nova-api-0" Dec 02 07:47:19 crc kubenswrapper[4895]: I1202 07:47:19.582134 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de57c107-7f7e-4d49-a986-9ffc3ca3d828-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"de57c107-7f7e-4d49-a986-9ffc3ca3d828\") " pod="openstack/nova-api-0" Dec 02 07:47:19 crc kubenswrapper[4895]: I1202 07:47:19.585125 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76dc5ff2-192c-42e5-80ca-17b405814be6-kube-api-access-gd4jw" (OuterVolumeSpecName: "kube-api-access-gd4jw") pod 
"76dc5ff2-192c-42e5-80ca-17b405814be6" (UID: "76dc5ff2-192c-42e5-80ca-17b405814be6"). InnerVolumeSpecName "kube-api-access-gd4jw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:47:19 crc kubenswrapper[4895]: I1202 07:47:19.611714 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76dc5ff2-192c-42e5-80ca-17b405814be6-config-data" (OuterVolumeSpecName: "config-data") pod "76dc5ff2-192c-42e5-80ca-17b405814be6" (UID: "76dc5ff2-192c-42e5-80ca-17b405814be6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:47:19 crc kubenswrapper[4895]: I1202 07:47:19.614516 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76dc5ff2-192c-42e5-80ca-17b405814be6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "76dc5ff2-192c-42e5-80ca-17b405814be6" (UID: "76dc5ff2-192c-42e5-80ca-17b405814be6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:47:19 crc kubenswrapper[4895]: I1202 07:47:19.683947 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/de57c107-7f7e-4d49-a986-9ffc3ca3d828-logs\") pod \"nova-api-0\" (UID: \"de57c107-7f7e-4d49-a986-9ffc3ca3d828\") " pod="openstack/nova-api-0" Dec 02 07:47:19 crc kubenswrapper[4895]: I1202 07:47:19.684050 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4d9b8\" (UniqueName: \"kubernetes.io/projected/de57c107-7f7e-4d49-a986-9ffc3ca3d828-kube-api-access-4d9b8\") pod \"nova-api-0\" (UID: \"de57c107-7f7e-4d49-a986-9ffc3ca3d828\") " pod="openstack/nova-api-0" Dec 02 07:47:19 crc kubenswrapper[4895]: I1202 07:47:19.684110 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de57c107-7f7e-4d49-a986-9ffc3ca3d828-config-data\") pod \"nova-api-0\" (UID: \"de57c107-7f7e-4d49-a986-9ffc3ca3d828\") " pod="openstack/nova-api-0" Dec 02 07:47:19 crc kubenswrapper[4895]: I1202 07:47:19.684297 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de57c107-7f7e-4d49-a986-9ffc3ca3d828-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"de57c107-7f7e-4d49-a986-9ffc3ca3d828\") " pod="openstack/nova-api-0" Dec 02 07:47:19 crc kubenswrapper[4895]: I1202 07:47:19.684392 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gd4jw\" (UniqueName: \"kubernetes.io/projected/76dc5ff2-192c-42e5-80ca-17b405814be6-kube-api-access-gd4jw\") on node \"crc\" DevicePath \"\"" Dec 02 07:47:19 crc kubenswrapper[4895]: I1202 07:47:19.684422 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76dc5ff2-192c-42e5-80ca-17b405814be6-combined-ca-bundle\") on node \"crc\" 
DevicePath \"\"" Dec 02 07:47:19 crc kubenswrapper[4895]: I1202 07:47:19.684441 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76dc5ff2-192c-42e5-80ca-17b405814be6-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 07:47:19 crc kubenswrapper[4895]: I1202 07:47:19.686198 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/de57c107-7f7e-4d49-a986-9ffc3ca3d828-logs\") pod \"nova-api-0\" (UID: \"de57c107-7f7e-4d49-a986-9ffc3ca3d828\") " pod="openstack/nova-api-0" Dec 02 07:47:19 crc kubenswrapper[4895]: I1202 07:47:19.691461 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de57c107-7f7e-4d49-a986-9ffc3ca3d828-config-data\") pod \"nova-api-0\" (UID: \"de57c107-7f7e-4d49-a986-9ffc3ca3d828\") " pod="openstack/nova-api-0" Dec 02 07:47:19 crc kubenswrapper[4895]: I1202 07:47:19.691544 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de57c107-7f7e-4d49-a986-9ffc3ca3d828-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"de57c107-7f7e-4d49-a986-9ffc3ca3d828\") " pod="openstack/nova-api-0" Dec 02 07:47:19 crc kubenswrapper[4895]: I1202 07:47:19.709310 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4d9b8\" (UniqueName: \"kubernetes.io/projected/de57c107-7f7e-4d49-a986-9ffc3ca3d828-kube-api-access-4d9b8\") pod \"nova-api-0\" (UID: \"de57c107-7f7e-4d49-a986-9ffc3ca3d828\") " pod="openstack/nova-api-0" Dec 02 07:47:19 crc kubenswrapper[4895]: I1202 07:47:19.826192 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 02 07:47:20 crc kubenswrapper[4895]: I1202 07:47:20.325212 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 02 07:47:20 crc kubenswrapper[4895]: I1202 07:47:20.355611 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"de57c107-7f7e-4d49-a986-9ffc3ca3d828","Type":"ContainerStarted","Data":"2db030db2c17b158bcafd98741b114d8b463b02291270ba8b658c53b266309e3"} Dec 02 07:47:20 crc kubenswrapper[4895]: I1202 07:47:20.360805 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 02 07:47:20 crc kubenswrapper[4895]: I1202 07:47:20.360841 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"76dc5ff2-192c-42e5-80ca-17b405814be6","Type":"ContainerDied","Data":"19108097f3d351fdb527e0c3cdfaebd4166bcda9a933a5416a0455e6549af455"} Dec 02 07:47:20 crc kubenswrapper[4895]: I1202 07:47:20.360964 4895 scope.go:117] "RemoveContainer" containerID="59af142141d77afa67328db26a0d0ea0636463b23d3efd326b7355e7a7cba09b" Dec 02 07:47:20 crc kubenswrapper[4895]: I1202 07:47:20.410947 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 07:47:20 crc kubenswrapper[4895]: I1202 07:47:20.427344 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 07:47:20 crc kubenswrapper[4895]: I1202 07:47:20.439849 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 07:47:20 crc kubenswrapper[4895]: E1202 07:47:20.440494 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76dc5ff2-192c-42e5-80ca-17b405814be6" containerName="nova-scheduler-scheduler" Dec 02 07:47:20 crc kubenswrapper[4895]: I1202 07:47:20.440523 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="76dc5ff2-192c-42e5-80ca-17b405814be6" 
containerName="nova-scheduler-scheduler" Dec 02 07:47:20 crc kubenswrapper[4895]: I1202 07:47:20.440874 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="76dc5ff2-192c-42e5-80ca-17b405814be6" containerName="nova-scheduler-scheduler" Dec 02 07:47:20 crc kubenswrapper[4895]: I1202 07:47:20.441830 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 02 07:47:20 crc kubenswrapper[4895]: I1202 07:47:20.444582 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 02 07:47:20 crc kubenswrapper[4895]: I1202 07:47:20.449840 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 07:47:20 crc kubenswrapper[4895]: I1202 07:47:20.502202 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mt4l\" (UniqueName: \"kubernetes.io/projected/2c347298-8afb-4598-ba75-64cd23db0935-kube-api-access-2mt4l\") pod \"nova-scheduler-0\" (UID: \"2c347298-8afb-4598-ba75-64cd23db0935\") " pod="openstack/nova-scheduler-0" Dec 02 07:47:20 crc kubenswrapper[4895]: I1202 07:47:20.502296 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c347298-8afb-4598-ba75-64cd23db0935-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"2c347298-8afb-4598-ba75-64cd23db0935\") " pod="openstack/nova-scheduler-0" Dec 02 07:47:20 crc kubenswrapper[4895]: I1202 07:47:20.503333 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c347298-8afb-4598-ba75-64cd23db0935-config-data\") pod \"nova-scheduler-0\" (UID: \"2c347298-8afb-4598-ba75-64cd23db0935\") " pod="openstack/nova-scheduler-0" Dec 02 07:47:20 crc kubenswrapper[4895]: I1202 07:47:20.604977 4895 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c347298-8afb-4598-ba75-64cd23db0935-config-data\") pod \"nova-scheduler-0\" (UID: \"2c347298-8afb-4598-ba75-64cd23db0935\") " pod="openstack/nova-scheduler-0" Dec 02 07:47:20 crc kubenswrapper[4895]: I1202 07:47:20.605089 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2mt4l\" (UniqueName: \"kubernetes.io/projected/2c347298-8afb-4598-ba75-64cd23db0935-kube-api-access-2mt4l\") pod \"nova-scheduler-0\" (UID: \"2c347298-8afb-4598-ba75-64cd23db0935\") " pod="openstack/nova-scheduler-0" Dec 02 07:47:20 crc kubenswrapper[4895]: I1202 07:47:20.605136 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c347298-8afb-4598-ba75-64cd23db0935-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"2c347298-8afb-4598-ba75-64cd23db0935\") " pod="openstack/nova-scheduler-0" Dec 02 07:47:20 crc kubenswrapper[4895]: I1202 07:47:20.619562 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c347298-8afb-4598-ba75-64cd23db0935-config-data\") pod \"nova-scheduler-0\" (UID: \"2c347298-8afb-4598-ba75-64cd23db0935\") " pod="openstack/nova-scheduler-0" Dec 02 07:47:20 crc kubenswrapper[4895]: I1202 07:47:20.619703 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c347298-8afb-4598-ba75-64cd23db0935-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"2c347298-8afb-4598-ba75-64cd23db0935\") " pod="openstack/nova-scheduler-0" Dec 02 07:47:20 crc kubenswrapper[4895]: I1202 07:47:20.621664 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mt4l\" (UniqueName: 
\"kubernetes.io/projected/2c347298-8afb-4598-ba75-64cd23db0935-kube-api-access-2mt4l\") pod \"nova-scheduler-0\" (UID: \"2c347298-8afb-4598-ba75-64cd23db0935\") " pod="openstack/nova-scheduler-0" Dec 02 07:47:20 crc kubenswrapper[4895]: I1202 07:47:20.772908 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 02 07:47:21 crc kubenswrapper[4895]: I1202 07:47:21.153377 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="087ae923-798c-4c9d-bdbb-43d64df1710a" path="/var/lib/kubelet/pods/087ae923-798c-4c9d-bdbb-43d64df1710a/volumes" Dec 02 07:47:21 crc kubenswrapper[4895]: I1202 07:47:21.155690 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76dc5ff2-192c-42e5-80ca-17b405814be6" path="/var/lib/kubelet/pods/76dc5ff2-192c-42e5-80ca-17b405814be6/volumes" Dec 02 07:47:21 crc kubenswrapper[4895]: I1202 07:47:21.196797 4895 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-km4tm" podUID="23d7d649-194e-4822-9ac2-9badcf844980" containerName="registry-server" probeResult="failure" output=< Dec 02 07:47:21 crc kubenswrapper[4895]: timeout: failed to connect service ":50051" within 1s Dec 02 07:47:21 crc kubenswrapper[4895]: > Dec 02 07:47:21 crc kubenswrapper[4895]: I1202 07:47:21.262781 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 07:47:21 crc kubenswrapper[4895]: W1202 07:47:21.265082 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2c347298_8afb_4598_ba75_64cd23db0935.slice/crio-40ff0a58a2c649fb977b6d4a2e60dda4b909165d9ab04fab3df6a841dca7625b WatchSource:0}: Error finding container 40ff0a58a2c649fb977b6d4a2e60dda4b909165d9ab04fab3df6a841dca7625b: Status 404 returned error can't find the container with id 40ff0a58a2c649fb977b6d4a2e60dda4b909165d9ab04fab3df6a841dca7625b Dec 02 07:47:21 crc 
kubenswrapper[4895]: I1202 07:47:21.378655 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2c347298-8afb-4598-ba75-64cd23db0935","Type":"ContainerStarted","Data":"40ff0a58a2c649fb977b6d4a2e60dda4b909165d9ab04fab3df6a841dca7625b"} Dec 02 07:47:21 crc kubenswrapper[4895]: I1202 07:47:21.381769 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"de57c107-7f7e-4d49-a986-9ffc3ca3d828","Type":"ContainerStarted","Data":"9c6c4a34755af17a63fa22b1a3b2be0d498dc4cbba5ef1a8c1c29481bc2c0d85"} Dec 02 07:47:21 crc kubenswrapper[4895]: I1202 07:47:21.381818 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"de57c107-7f7e-4d49-a986-9ffc3ca3d828","Type":"ContainerStarted","Data":"6c75181469b7e4e874c6cb08cc5589b4e51ef084d9a3f044bd9ad3e485bec0ff"} Dec 02 07:47:21 crc kubenswrapper[4895]: I1202 07:47:21.406689 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.4066696 podStartE2EDuration="2.4066696s" podCreationTimestamp="2025-12-02 07:47:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:47:21.406445973 +0000 UTC m=+1452.577305596" watchObservedRunningTime="2025-12-02 07:47:21.4066696 +0000 UTC m=+1452.577529223" Dec 02 07:47:22 crc kubenswrapper[4895]: I1202 07:47:22.081816 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 02 07:47:22 crc kubenswrapper[4895]: I1202 07:47:22.082394 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 02 07:47:22 crc kubenswrapper[4895]: I1202 07:47:22.393769 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"2c347298-8afb-4598-ba75-64cd23db0935","Type":"ContainerStarted","Data":"1715524a082c44f6e32e19ecd9c82eadacab0180c80d32d488c4380c16af9a29"} Dec 02 07:47:25 crc kubenswrapper[4895]: I1202 07:47:25.773440 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 02 07:47:27 crc kubenswrapper[4895]: I1202 07:47:27.082013 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 02 07:47:27 crc kubenswrapper[4895]: I1202 07:47:27.082615 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 02 07:47:27 crc kubenswrapper[4895]: I1202 07:47:27.705015 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Dec 02 07:47:27 crc kubenswrapper[4895]: I1202 07:47:27.737679 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=7.737655518 podStartE2EDuration="7.737655518s" podCreationTimestamp="2025-12-02 07:47:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:47:22.419485163 +0000 UTC m=+1453.590344796" watchObservedRunningTime="2025-12-02 07:47:27.737655518 +0000 UTC m=+1458.908515131" Dec 02 07:47:28 crc kubenswrapper[4895]: I1202 07:47:28.100991 4895 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="23ebd00f-3697-43da-a335-42d260a62237" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.188:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 02 07:47:28 crc kubenswrapper[4895]: I1202 07:47:28.101364 4895 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="23ebd00f-3697-43da-a335-42d260a62237" 
containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.188:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 02 07:47:29 crc kubenswrapper[4895]: I1202 07:47:29.827983 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 02 07:47:29 crc kubenswrapper[4895]: I1202 07:47:29.828504 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 02 07:47:30 crc kubenswrapper[4895]: I1202 07:47:30.311167 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-km4tm" Dec 02 07:47:30 crc kubenswrapper[4895]: I1202 07:47:30.370225 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-km4tm" Dec 02 07:47:30 crc kubenswrapper[4895]: I1202 07:47:30.773553 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 02 07:47:30 crc kubenswrapper[4895]: I1202 07:47:30.815141 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 02 07:47:30 crc kubenswrapper[4895]: I1202 07:47:30.910072 4895 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="de57c107-7f7e-4d49-a986-9ffc3ca3d828" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.190:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 02 07:47:30 crc kubenswrapper[4895]: I1202 07:47:30.910606 4895 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="de57c107-7f7e-4d49-a986-9ffc3ca3d828" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.190:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 02 07:47:30 crc kubenswrapper[4895]: 
I1202 07:47:30.955818 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-km4tm"] Dec 02 07:47:31 crc kubenswrapper[4895]: I1202 07:47:31.369258 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 02 07:47:31 crc kubenswrapper[4895]: I1202 07:47:31.492351 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-km4tm" podUID="23d7d649-194e-4822-9ac2-9badcf844980" containerName="registry-server" containerID="cri-o://96fb012b7e72edf5e8416df09d5a93ca5504eb98a74ca683eed935638bfe243b" gracePeriod=2 Dec 02 07:47:31 crc kubenswrapper[4895]: I1202 07:47:31.529387 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 02 07:47:32 crc kubenswrapper[4895]: I1202 07:47:32.045800 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-km4tm" Dec 02 07:47:32 crc kubenswrapper[4895]: I1202 07:47:32.098383 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23d7d649-194e-4822-9ac2-9badcf844980-catalog-content\") pod \"23d7d649-194e-4822-9ac2-9badcf844980\" (UID: \"23d7d649-194e-4822-9ac2-9badcf844980\") " Dec 02 07:47:32 crc kubenswrapper[4895]: I1202 07:47:32.098517 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23d7d649-194e-4822-9ac2-9badcf844980-utilities\") pod \"23d7d649-194e-4822-9ac2-9badcf844980\" (UID: \"23d7d649-194e-4822-9ac2-9badcf844980\") " Dec 02 07:47:32 crc kubenswrapper[4895]: I1202 07:47:32.098545 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ppbcm\" (UniqueName: 
\"kubernetes.io/projected/23d7d649-194e-4822-9ac2-9badcf844980-kube-api-access-ppbcm\") pod \"23d7d649-194e-4822-9ac2-9badcf844980\" (UID: \"23d7d649-194e-4822-9ac2-9badcf844980\") " Dec 02 07:47:32 crc kubenswrapper[4895]: I1202 07:47:32.099412 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/23d7d649-194e-4822-9ac2-9badcf844980-utilities" (OuterVolumeSpecName: "utilities") pod "23d7d649-194e-4822-9ac2-9badcf844980" (UID: "23d7d649-194e-4822-9ac2-9badcf844980"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:47:32 crc kubenswrapper[4895]: I1202 07:47:32.105196 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23d7d649-194e-4822-9ac2-9badcf844980-kube-api-access-ppbcm" (OuterVolumeSpecName: "kube-api-access-ppbcm") pod "23d7d649-194e-4822-9ac2-9badcf844980" (UID: "23d7d649-194e-4822-9ac2-9badcf844980"). InnerVolumeSpecName "kube-api-access-ppbcm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:47:32 crc kubenswrapper[4895]: I1202 07:47:32.201005 4895 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23d7d649-194e-4822-9ac2-9badcf844980-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 07:47:32 crc kubenswrapper[4895]: I1202 07:47:32.201309 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ppbcm\" (UniqueName: \"kubernetes.io/projected/23d7d649-194e-4822-9ac2-9badcf844980-kube-api-access-ppbcm\") on node \"crc\" DevicePath \"\"" Dec 02 07:47:32 crc kubenswrapper[4895]: I1202 07:47:32.227605 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/23d7d649-194e-4822-9ac2-9badcf844980-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "23d7d649-194e-4822-9ac2-9badcf844980" (UID: "23d7d649-194e-4822-9ac2-9badcf844980"). 
InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:47:32 crc kubenswrapper[4895]: I1202 07:47:32.304458 4895 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23d7d649-194e-4822-9ac2-9badcf844980-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 07:47:32 crc kubenswrapper[4895]: I1202 07:47:32.503813 4895 generic.go:334] "Generic (PLEG): container finished" podID="23d7d649-194e-4822-9ac2-9badcf844980" containerID="96fb012b7e72edf5e8416df09d5a93ca5504eb98a74ca683eed935638bfe243b" exitCode=0 Dec 02 07:47:32 crc kubenswrapper[4895]: I1202 07:47:32.504780 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-km4tm" Dec 02 07:47:32 crc kubenswrapper[4895]: I1202 07:47:32.507057 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-km4tm" event={"ID":"23d7d649-194e-4822-9ac2-9badcf844980","Type":"ContainerDied","Data":"96fb012b7e72edf5e8416df09d5a93ca5504eb98a74ca683eed935638bfe243b"} Dec 02 07:47:32 crc kubenswrapper[4895]: I1202 07:47:32.507138 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-km4tm" event={"ID":"23d7d649-194e-4822-9ac2-9badcf844980","Type":"ContainerDied","Data":"25e7965d92ffced6a51dc1ff1f88eef2b4ed9ac7682fd21def6aeea5e86994fe"} Dec 02 07:47:32 crc kubenswrapper[4895]: I1202 07:47:32.507161 4895 scope.go:117] "RemoveContainer" containerID="96fb012b7e72edf5e8416df09d5a93ca5504eb98a74ca683eed935638bfe243b" Dec 02 07:47:32 crc kubenswrapper[4895]: I1202 07:47:32.536651 4895 scope.go:117] "RemoveContainer" containerID="de70ac10da286b83b5b9bbe5f2bc6e605a8bd878e07dc3dd045bc119eea20b84" Dec 02 07:47:32 crc kubenswrapper[4895]: I1202 07:47:32.555642 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-km4tm"] Dec 02 07:47:32 
crc kubenswrapper[4895]: I1202 07:47:32.569228 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-km4tm"] Dec 02 07:47:32 crc kubenswrapper[4895]: I1202 07:47:32.585718 4895 scope.go:117] "RemoveContainer" containerID="562183949f440640ad90c8b7ef79ee4da6e42b0eeb3d5b0f5d6fb78739832a55" Dec 02 07:47:32 crc kubenswrapper[4895]: I1202 07:47:32.618693 4895 scope.go:117] "RemoveContainer" containerID="96fb012b7e72edf5e8416df09d5a93ca5504eb98a74ca683eed935638bfe243b" Dec 02 07:47:32 crc kubenswrapper[4895]: E1202 07:47:32.619398 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96fb012b7e72edf5e8416df09d5a93ca5504eb98a74ca683eed935638bfe243b\": container with ID starting with 96fb012b7e72edf5e8416df09d5a93ca5504eb98a74ca683eed935638bfe243b not found: ID does not exist" containerID="96fb012b7e72edf5e8416df09d5a93ca5504eb98a74ca683eed935638bfe243b" Dec 02 07:47:32 crc kubenswrapper[4895]: I1202 07:47:32.619475 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96fb012b7e72edf5e8416df09d5a93ca5504eb98a74ca683eed935638bfe243b"} err="failed to get container status \"96fb012b7e72edf5e8416df09d5a93ca5504eb98a74ca683eed935638bfe243b\": rpc error: code = NotFound desc = could not find container \"96fb012b7e72edf5e8416df09d5a93ca5504eb98a74ca683eed935638bfe243b\": container with ID starting with 96fb012b7e72edf5e8416df09d5a93ca5504eb98a74ca683eed935638bfe243b not found: ID does not exist" Dec 02 07:47:32 crc kubenswrapper[4895]: I1202 07:47:32.619524 4895 scope.go:117] "RemoveContainer" containerID="de70ac10da286b83b5b9bbe5f2bc6e605a8bd878e07dc3dd045bc119eea20b84" Dec 02 07:47:32 crc kubenswrapper[4895]: E1202 07:47:32.620154 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"de70ac10da286b83b5b9bbe5f2bc6e605a8bd878e07dc3dd045bc119eea20b84\": container with ID starting with de70ac10da286b83b5b9bbe5f2bc6e605a8bd878e07dc3dd045bc119eea20b84 not found: ID does not exist" containerID="de70ac10da286b83b5b9bbe5f2bc6e605a8bd878e07dc3dd045bc119eea20b84" Dec 02 07:47:32 crc kubenswrapper[4895]: I1202 07:47:32.620216 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de70ac10da286b83b5b9bbe5f2bc6e605a8bd878e07dc3dd045bc119eea20b84"} err="failed to get container status \"de70ac10da286b83b5b9bbe5f2bc6e605a8bd878e07dc3dd045bc119eea20b84\": rpc error: code = NotFound desc = could not find container \"de70ac10da286b83b5b9bbe5f2bc6e605a8bd878e07dc3dd045bc119eea20b84\": container with ID starting with de70ac10da286b83b5b9bbe5f2bc6e605a8bd878e07dc3dd045bc119eea20b84 not found: ID does not exist" Dec 02 07:47:32 crc kubenswrapper[4895]: I1202 07:47:32.620244 4895 scope.go:117] "RemoveContainer" containerID="562183949f440640ad90c8b7ef79ee4da6e42b0eeb3d5b0f5d6fb78739832a55" Dec 02 07:47:32 crc kubenswrapper[4895]: E1202 07:47:32.620531 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"562183949f440640ad90c8b7ef79ee4da6e42b0eeb3d5b0f5d6fb78739832a55\": container with ID starting with 562183949f440640ad90c8b7ef79ee4da6e42b0eeb3d5b0f5d6fb78739832a55 not found: ID does not exist" containerID="562183949f440640ad90c8b7ef79ee4da6e42b0eeb3d5b0f5d6fb78739832a55" Dec 02 07:47:32 crc kubenswrapper[4895]: I1202 07:47:32.620564 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"562183949f440640ad90c8b7ef79ee4da6e42b0eeb3d5b0f5d6fb78739832a55"} err="failed to get container status \"562183949f440640ad90c8b7ef79ee4da6e42b0eeb3d5b0f5d6fb78739832a55\": rpc error: code = NotFound desc = could not find container \"562183949f440640ad90c8b7ef79ee4da6e42b0eeb3d5b0f5d6fb78739832a55\": container with ID 
starting with 562183949f440640ad90c8b7ef79ee4da6e42b0eeb3d5b0f5d6fb78739832a55 not found: ID does not exist" Dec 02 07:47:33 crc kubenswrapper[4895]: I1202 07:47:33.156392 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23d7d649-194e-4822-9ac2-9badcf844980" path="/var/lib/kubelet/pods/23d7d649-194e-4822-9ac2-9badcf844980/volumes" Dec 02 07:47:35 crc kubenswrapper[4895]: I1202 07:47:35.509211 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 02 07:47:35 crc kubenswrapper[4895]: I1202 07:47:35.509970 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="a6606596-020b-4584-b9d2-8606a794a726" containerName="kube-state-metrics" containerID="cri-o://96c28dfc42cb25542267c38bbb8908b3a9df5fe26c5c721fc8d65994c2f6e015" gracePeriod=30 Dec 02 07:47:36 crc kubenswrapper[4895]: I1202 07:47:36.020669 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 02 07:47:36 crc kubenswrapper[4895]: I1202 07:47:36.121878 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b4zs8\" (UniqueName: \"kubernetes.io/projected/a6606596-020b-4584-b9d2-8606a794a726-kube-api-access-b4zs8\") pod \"a6606596-020b-4584-b9d2-8606a794a726\" (UID: \"a6606596-020b-4584-b9d2-8606a794a726\") " Dec 02 07:47:36 crc kubenswrapper[4895]: I1202 07:47:36.141233 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6606596-020b-4584-b9d2-8606a794a726-kube-api-access-b4zs8" (OuterVolumeSpecName: "kube-api-access-b4zs8") pod "a6606596-020b-4584-b9d2-8606a794a726" (UID: "a6606596-020b-4584-b9d2-8606a794a726"). InnerVolumeSpecName "kube-api-access-b4zs8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:47:36 crc kubenswrapper[4895]: I1202 07:47:36.224024 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b4zs8\" (UniqueName: \"kubernetes.io/projected/a6606596-020b-4584-b9d2-8606a794a726-kube-api-access-b4zs8\") on node \"crc\" DevicePath \"\"" Dec 02 07:47:36 crc kubenswrapper[4895]: I1202 07:47:36.549984 4895 generic.go:334] "Generic (PLEG): container finished" podID="a6606596-020b-4584-b9d2-8606a794a726" containerID="96c28dfc42cb25542267c38bbb8908b3a9df5fe26c5c721fc8d65994c2f6e015" exitCode=2 Dec 02 07:47:36 crc kubenswrapper[4895]: I1202 07:47:36.550042 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"a6606596-020b-4584-b9d2-8606a794a726","Type":"ContainerDied","Data":"96c28dfc42cb25542267c38bbb8908b3a9df5fe26c5c721fc8d65994c2f6e015"} Dec 02 07:47:36 crc kubenswrapper[4895]: I1202 07:47:36.550080 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"a6606596-020b-4584-b9d2-8606a794a726","Type":"ContainerDied","Data":"514d12a47b7555c807c0fc7cc849d58d67bebe8585b02344ceacecd024355d43"} Dec 02 07:47:36 crc kubenswrapper[4895]: I1202 07:47:36.550084 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 02 07:47:36 crc kubenswrapper[4895]: I1202 07:47:36.550102 4895 scope.go:117] "RemoveContainer" containerID="96c28dfc42cb25542267c38bbb8908b3a9df5fe26c5c721fc8d65994c2f6e015" Dec 02 07:47:36 crc kubenswrapper[4895]: I1202 07:47:36.590578 4895 scope.go:117] "RemoveContainer" containerID="96c28dfc42cb25542267c38bbb8908b3a9df5fe26c5c721fc8d65994c2f6e015" Dec 02 07:47:36 crc kubenswrapper[4895]: E1202 07:47:36.591291 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96c28dfc42cb25542267c38bbb8908b3a9df5fe26c5c721fc8d65994c2f6e015\": container with ID starting with 96c28dfc42cb25542267c38bbb8908b3a9df5fe26c5c721fc8d65994c2f6e015 not found: ID does not exist" containerID="96c28dfc42cb25542267c38bbb8908b3a9df5fe26c5c721fc8d65994c2f6e015" Dec 02 07:47:36 crc kubenswrapper[4895]: I1202 07:47:36.591353 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96c28dfc42cb25542267c38bbb8908b3a9df5fe26c5c721fc8d65994c2f6e015"} err="failed to get container status \"96c28dfc42cb25542267c38bbb8908b3a9df5fe26c5c721fc8d65994c2f6e015\": rpc error: code = NotFound desc = could not find container \"96c28dfc42cb25542267c38bbb8908b3a9df5fe26c5c721fc8d65994c2f6e015\": container with ID starting with 96c28dfc42cb25542267c38bbb8908b3a9df5fe26c5c721fc8d65994c2f6e015 not found: ID does not exist" Dec 02 07:47:36 crc kubenswrapper[4895]: I1202 07:47:36.591415 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 02 07:47:36 crc kubenswrapper[4895]: I1202 07:47:36.623273 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 02 07:47:36 crc kubenswrapper[4895]: I1202 07:47:36.637238 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Dec 02 07:47:36 crc kubenswrapper[4895]: E1202 
07:47:36.637733 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23d7d649-194e-4822-9ac2-9badcf844980" containerName="extract-utilities" Dec 02 07:47:36 crc kubenswrapper[4895]: I1202 07:47:36.637767 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="23d7d649-194e-4822-9ac2-9badcf844980" containerName="extract-utilities" Dec 02 07:47:36 crc kubenswrapper[4895]: E1202 07:47:36.637794 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23d7d649-194e-4822-9ac2-9badcf844980" containerName="registry-server" Dec 02 07:47:36 crc kubenswrapper[4895]: I1202 07:47:36.637803 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="23d7d649-194e-4822-9ac2-9badcf844980" containerName="registry-server" Dec 02 07:47:36 crc kubenswrapper[4895]: E1202 07:47:36.637822 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6606596-020b-4584-b9d2-8606a794a726" containerName="kube-state-metrics" Dec 02 07:47:36 crc kubenswrapper[4895]: I1202 07:47:36.637829 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6606596-020b-4584-b9d2-8606a794a726" containerName="kube-state-metrics" Dec 02 07:47:36 crc kubenswrapper[4895]: E1202 07:47:36.637838 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23d7d649-194e-4822-9ac2-9badcf844980" containerName="extract-content" Dec 02 07:47:36 crc kubenswrapper[4895]: I1202 07:47:36.637844 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="23d7d649-194e-4822-9ac2-9badcf844980" containerName="extract-content" Dec 02 07:47:36 crc kubenswrapper[4895]: I1202 07:47:36.638066 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="23d7d649-194e-4822-9ac2-9badcf844980" containerName="registry-server" Dec 02 07:47:36 crc kubenswrapper[4895]: I1202 07:47:36.638088 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6606596-020b-4584-b9d2-8606a794a726" containerName="kube-state-metrics" Dec 02 07:47:36 crc 
kubenswrapper[4895]: I1202 07:47:36.639186 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 02 07:47:36 crc kubenswrapper[4895]: I1202 07:47:36.641995 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Dec 02 07:47:36 crc kubenswrapper[4895]: I1202 07:47:36.642231 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Dec 02 07:47:36 crc kubenswrapper[4895]: I1202 07:47:36.651765 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 02 07:47:36 crc kubenswrapper[4895]: I1202 07:47:36.837675 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/0a3ec758-e19e-4286-bfed-a1d6d3010bfb-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"0a3ec758-e19e-4286-bfed-a1d6d3010bfb\") " pod="openstack/kube-state-metrics-0" Dec 02 07:47:36 crc kubenswrapper[4895]: I1202 07:47:36.838261 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a3ec758-e19e-4286-bfed-a1d6d3010bfb-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"0a3ec758-e19e-4286-bfed-a1d6d3010bfb\") " pod="openstack/kube-state-metrics-0" Dec 02 07:47:36 crc kubenswrapper[4895]: I1202 07:47:36.838399 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a3ec758-e19e-4286-bfed-a1d6d3010bfb-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"0a3ec758-e19e-4286-bfed-a1d6d3010bfb\") " pod="openstack/kube-state-metrics-0" Dec 02 07:47:36 crc kubenswrapper[4895]: I1202 07:47:36.838487 4895 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnnfp\" (UniqueName: \"kubernetes.io/projected/0a3ec758-e19e-4286-bfed-a1d6d3010bfb-kube-api-access-bnnfp\") pod \"kube-state-metrics-0\" (UID: \"0a3ec758-e19e-4286-bfed-a1d6d3010bfb\") " pod="openstack/kube-state-metrics-0" Dec 02 07:47:36 crc kubenswrapper[4895]: I1202 07:47:36.941322 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a3ec758-e19e-4286-bfed-a1d6d3010bfb-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"0a3ec758-e19e-4286-bfed-a1d6d3010bfb\") " pod="openstack/kube-state-metrics-0" Dec 02 07:47:36 crc kubenswrapper[4895]: I1202 07:47:36.942450 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bnnfp\" (UniqueName: \"kubernetes.io/projected/0a3ec758-e19e-4286-bfed-a1d6d3010bfb-kube-api-access-bnnfp\") pod \"kube-state-metrics-0\" (UID: \"0a3ec758-e19e-4286-bfed-a1d6d3010bfb\") " pod="openstack/kube-state-metrics-0" Dec 02 07:47:36 crc kubenswrapper[4895]: I1202 07:47:36.942498 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/0a3ec758-e19e-4286-bfed-a1d6d3010bfb-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"0a3ec758-e19e-4286-bfed-a1d6d3010bfb\") " pod="openstack/kube-state-metrics-0" Dec 02 07:47:36 crc kubenswrapper[4895]: I1202 07:47:36.942536 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a3ec758-e19e-4286-bfed-a1d6d3010bfb-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"0a3ec758-e19e-4286-bfed-a1d6d3010bfb\") " pod="openstack/kube-state-metrics-0" Dec 02 07:47:36 crc kubenswrapper[4895]: I1202 07:47:36.949197 4895 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/0a3ec758-e19e-4286-bfed-a1d6d3010bfb-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"0a3ec758-e19e-4286-bfed-a1d6d3010bfb\") " pod="openstack/kube-state-metrics-0" Dec 02 07:47:36 crc kubenswrapper[4895]: I1202 07:47:36.949551 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a3ec758-e19e-4286-bfed-a1d6d3010bfb-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"0a3ec758-e19e-4286-bfed-a1d6d3010bfb\") " pod="openstack/kube-state-metrics-0" Dec 02 07:47:36 crc kubenswrapper[4895]: I1202 07:47:36.951331 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a3ec758-e19e-4286-bfed-a1d6d3010bfb-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"0a3ec758-e19e-4286-bfed-a1d6d3010bfb\") " pod="openstack/kube-state-metrics-0" Dec 02 07:47:36 crc kubenswrapper[4895]: I1202 07:47:36.965593 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bnnfp\" (UniqueName: \"kubernetes.io/projected/0a3ec758-e19e-4286-bfed-a1d6d3010bfb-kube-api-access-bnnfp\") pod \"kube-state-metrics-0\" (UID: \"0a3ec758-e19e-4286-bfed-a1d6d3010bfb\") " pod="openstack/kube-state-metrics-0" Dec 02 07:47:37 crc kubenswrapper[4895]: I1202 07:47:37.090340 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 02 07:47:37 crc kubenswrapper[4895]: I1202 07:47:37.093372 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 02 07:47:37 crc kubenswrapper[4895]: I1202 07:47:37.098833 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 02 07:47:37 crc kubenswrapper[4895]: I1202 07:47:37.168198 4895 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6606596-020b-4584-b9d2-8606a794a726" path="/var/lib/kubelet/pods/a6606596-020b-4584-b9d2-8606a794a726/volumes" Dec 02 07:47:37 crc kubenswrapper[4895]: I1202 07:47:37.262000 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 02 07:47:37 crc kubenswrapper[4895]: I1202 07:47:37.416475 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 02 07:47:37 crc kubenswrapper[4895]: I1202 07:47:37.534135 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 02 07:47:37 crc kubenswrapper[4895]: I1202 07:47:37.534582 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c61fb3df-f75c-480e-a64c-8436aec04a67" containerName="ceilometer-central-agent" containerID="cri-o://db826e93f54f1a7bd736199df485403bbb1cc0edd174d541d14122e2efef2ee7" gracePeriod=30 Dec 02 07:47:37 crc kubenswrapper[4895]: I1202 07:47:37.534662 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c61fb3df-f75c-480e-a64c-8436aec04a67" containerName="proxy-httpd" containerID="cri-o://2d44b21df02a1b6d9237fec6ee1533177b8a03efad3b51389b892affcbdbe9c8" gracePeriod=30 Dec 02 07:47:37 crc kubenswrapper[4895]: I1202 07:47:37.534770 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c61fb3df-f75c-480e-a64c-8436aec04a67" containerName="sg-core" containerID="cri-o://45a40dd7336d5a5330a9568330163779e86a8ff8d3f644291554642d5d1c1525" gracePeriod=30 Dec 02 07:47:37 crc kubenswrapper[4895]: I1202 07:47:37.534817 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c61fb3df-f75c-480e-a64c-8436aec04a67" containerName="ceilometer-notification-agent" 
containerID="cri-o://d97e252a66f722ee7e7efc1d1fd4e6e10074d3adb3119cddaa5537035c4d6b2c" gracePeriod=30 Dec 02 07:47:37 crc kubenswrapper[4895]: I1202 07:47:37.555569 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fmkrq\" (UniqueName: \"kubernetes.io/projected/1e54197c-b432-4f6e-9bd9-ce1f15fde624-kube-api-access-fmkrq\") pod \"1e54197c-b432-4f6e-9bd9-ce1f15fde624\" (UID: \"1e54197c-b432-4f6e-9bd9-ce1f15fde624\") " Dec 02 07:47:37 crc kubenswrapper[4895]: I1202 07:47:37.556248 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e54197c-b432-4f6e-9bd9-ce1f15fde624-combined-ca-bundle\") pod \"1e54197c-b432-4f6e-9bd9-ce1f15fde624\" (UID: \"1e54197c-b432-4f6e-9bd9-ce1f15fde624\") " Dec 02 07:47:37 crc kubenswrapper[4895]: I1202 07:47:37.556382 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e54197c-b432-4f6e-9bd9-ce1f15fde624-config-data\") pod \"1e54197c-b432-4f6e-9bd9-ce1f15fde624\" (UID: \"1e54197c-b432-4f6e-9bd9-ce1f15fde624\") " Dec 02 07:47:37 crc kubenswrapper[4895]: I1202 07:47:37.563144 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e54197c-b432-4f6e-9bd9-ce1f15fde624-kube-api-access-fmkrq" (OuterVolumeSpecName: "kube-api-access-fmkrq") pod "1e54197c-b432-4f6e-9bd9-ce1f15fde624" (UID: "1e54197c-b432-4f6e-9bd9-ce1f15fde624"). InnerVolumeSpecName "kube-api-access-fmkrq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:47:37 crc kubenswrapper[4895]: I1202 07:47:37.575945 4895 generic.go:334] "Generic (PLEG): container finished" podID="1e54197c-b432-4f6e-9bd9-ce1f15fde624" containerID="73869ae3518959278676d1703443fc666432748848ab698854d82c7dfc5aa174" exitCode=137 Dec 02 07:47:37 crc kubenswrapper[4895]: I1202 07:47:37.576026 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"1e54197c-b432-4f6e-9bd9-ce1f15fde624","Type":"ContainerDied","Data":"73869ae3518959278676d1703443fc666432748848ab698854d82c7dfc5aa174"} Dec 02 07:47:37 crc kubenswrapper[4895]: I1202 07:47:37.576063 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"1e54197c-b432-4f6e-9bd9-ce1f15fde624","Type":"ContainerDied","Data":"7b806473908ebc9882572a08ca9406b0f79310e09588a651b7891e009f4df689"} Dec 02 07:47:37 crc kubenswrapper[4895]: I1202 07:47:37.576085 4895 scope.go:117] "RemoveContainer" containerID="73869ae3518959278676d1703443fc666432748848ab698854d82c7dfc5aa174" Dec 02 07:47:37 crc kubenswrapper[4895]: I1202 07:47:37.576207 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 02 07:47:37 crc kubenswrapper[4895]: I1202 07:47:37.591902 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 02 07:47:37 crc kubenswrapper[4895]: I1202 07:47:37.602991 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e54197c-b432-4f6e-9bd9-ce1f15fde624-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1e54197c-b432-4f6e-9bd9-ce1f15fde624" (UID: "1e54197c-b432-4f6e-9bd9-ce1f15fde624"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:47:37 crc kubenswrapper[4895]: I1202 07:47:37.632615 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e54197c-b432-4f6e-9bd9-ce1f15fde624-config-data" (OuterVolumeSpecName: "config-data") pod "1e54197c-b432-4f6e-9bd9-ce1f15fde624" (UID: "1e54197c-b432-4f6e-9bd9-ce1f15fde624"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:47:37 crc kubenswrapper[4895]: I1202 07:47:37.633707 4895 scope.go:117] "RemoveContainer" containerID="73869ae3518959278676d1703443fc666432748848ab698854d82c7dfc5aa174" Dec 02 07:47:37 crc kubenswrapper[4895]: E1202 07:47:37.639339 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"73869ae3518959278676d1703443fc666432748848ab698854d82c7dfc5aa174\": container with ID starting with 73869ae3518959278676d1703443fc666432748848ab698854d82c7dfc5aa174 not found: ID does not exist" containerID="73869ae3518959278676d1703443fc666432748848ab698854d82c7dfc5aa174" Dec 02 07:47:37 crc kubenswrapper[4895]: I1202 07:47:37.639448 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73869ae3518959278676d1703443fc666432748848ab698854d82c7dfc5aa174"} err="failed to get container status \"73869ae3518959278676d1703443fc666432748848ab698854d82c7dfc5aa174\": rpc error: code = NotFound desc = could not find container \"73869ae3518959278676d1703443fc666432748848ab698854d82c7dfc5aa174\": container with ID starting with 73869ae3518959278676d1703443fc666432748848ab698854d82c7dfc5aa174 not found: ID does not exist" Dec 02 07:47:37 crc kubenswrapper[4895]: I1202 07:47:37.682602 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fmkrq\" (UniqueName: \"kubernetes.io/projected/1e54197c-b432-4f6e-9bd9-ce1f15fde624-kube-api-access-fmkrq\") on node \"crc\" DevicePath \"\"" 
Dec 02 07:47:37 crc kubenswrapper[4895]: I1202 07:47:37.684753 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e54197c-b432-4f6e-9bd9-ce1f15fde624-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 07:47:37 crc kubenswrapper[4895]: I1202 07:47:37.684781 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e54197c-b432-4f6e-9bd9-ce1f15fde624-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 07:47:37 crc kubenswrapper[4895]: I1202 07:47:37.802078 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 02 07:47:37 crc kubenswrapper[4895]: I1202 07:47:37.921691 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 02 07:47:37 crc kubenswrapper[4895]: I1202 07:47:37.937341 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 02 07:47:37 crc kubenswrapper[4895]: I1202 07:47:37.952627 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 02 07:47:37 crc kubenswrapper[4895]: E1202 07:47:37.953381 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e54197c-b432-4f6e-9bd9-ce1f15fde624" containerName="nova-cell1-novncproxy-novncproxy" Dec 02 07:47:37 crc kubenswrapper[4895]: I1202 07:47:37.953408 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e54197c-b432-4f6e-9bd9-ce1f15fde624" containerName="nova-cell1-novncproxy-novncproxy" Dec 02 07:47:37 crc kubenswrapper[4895]: I1202 07:47:37.953764 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e54197c-b432-4f6e-9bd9-ce1f15fde624" containerName="nova-cell1-novncproxy-novncproxy" Dec 02 07:47:37 crc kubenswrapper[4895]: I1202 07:47:37.956965 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 02 07:47:37 crc kubenswrapper[4895]: I1202 07:47:37.958952 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Dec 02 07:47:37 crc kubenswrapper[4895]: I1202 07:47:37.963980 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Dec 02 07:47:37 crc kubenswrapper[4895]: I1202 07:47:37.964035 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 02 07:47:37 crc kubenswrapper[4895]: I1202 07:47:37.970801 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 02 07:47:38 crc kubenswrapper[4895]: I1202 07:47:38.096262 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/183c5216-30f9-4f75-865b-7f795ea149fb-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"183c5216-30f9-4f75-865b-7f795ea149fb\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 07:47:38 crc kubenswrapper[4895]: I1202 07:47:38.096356 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/183c5216-30f9-4f75-865b-7f795ea149fb-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"183c5216-30f9-4f75-865b-7f795ea149fb\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 07:47:38 crc kubenswrapper[4895]: I1202 07:47:38.096388 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9rzn\" (UniqueName: \"kubernetes.io/projected/183c5216-30f9-4f75-865b-7f795ea149fb-kube-api-access-j9rzn\") pod \"nova-cell1-novncproxy-0\" (UID: \"183c5216-30f9-4f75-865b-7f795ea149fb\") " pod="openstack/nova-cell1-novncproxy-0" Dec 
02 07:47:38 crc kubenswrapper[4895]: I1202 07:47:38.096407 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/183c5216-30f9-4f75-865b-7f795ea149fb-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"183c5216-30f9-4f75-865b-7f795ea149fb\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 07:47:38 crc kubenswrapper[4895]: I1202 07:47:38.096476 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/183c5216-30f9-4f75-865b-7f795ea149fb-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"183c5216-30f9-4f75-865b-7f795ea149fb\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 07:47:38 crc kubenswrapper[4895]: I1202 07:47:38.199950 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/183c5216-30f9-4f75-865b-7f795ea149fb-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"183c5216-30f9-4f75-865b-7f795ea149fb\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 07:47:38 crc kubenswrapper[4895]: I1202 07:47:38.200553 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/183c5216-30f9-4f75-865b-7f795ea149fb-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"183c5216-30f9-4f75-865b-7f795ea149fb\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 07:47:38 crc kubenswrapper[4895]: I1202 07:47:38.200724 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j9rzn\" (UniqueName: \"kubernetes.io/projected/183c5216-30f9-4f75-865b-7f795ea149fb-kube-api-access-j9rzn\") pod \"nova-cell1-novncproxy-0\" (UID: \"183c5216-30f9-4f75-865b-7f795ea149fb\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 07:47:38 crc 
kubenswrapper[4895]: I1202 07:47:38.201421 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/183c5216-30f9-4f75-865b-7f795ea149fb-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"183c5216-30f9-4f75-865b-7f795ea149fb\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 07:47:38 crc kubenswrapper[4895]: I1202 07:47:38.202052 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/183c5216-30f9-4f75-865b-7f795ea149fb-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"183c5216-30f9-4f75-865b-7f795ea149fb\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 07:47:38 crc kubenswrapper[4895]: I1202 07:47:38.205766 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/183c5216-30f9-4f75-865b-7f795ea149fb-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"183c5216-30f9-4f75-865b-7f795ea149fb\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 07:47:38 crc kubenswrapper[4895]: I1202 07:47:38.207143 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/183c5216-30f9-4f75-865b-7f795ea149fb-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"183c5216-30f9-4f75-865b-7f795ea149fb\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 07:47:38 crc kubenswrapper[4895]: I1202 07:47:38.207936 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/183c5216-30f9-4f75-865b-7f795ea149fb-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"183c5216-30f9-4f75-865b-7f795ea149fb\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 07:47:38 crc kubenswrapper[4895]: I1202 07:47:38.210382 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/183c5216-30f9-4f75-865b-7f795ea149fb-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"183c5216-30f9-4f75-865b-7f795ea149fb\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 07:47:38 crc kubenswrapper[4895]: I1202 07:47:38.222960 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9rzn\" (UniqueName: \"kubernetes.io/projected/183c5216-30f9-4f75-865b-7f795ea149fb-kube-api-access-j9rzn\") pod \"nova-cell1-novncproxy-0\" (UID: \"183c5216-30f9-4f75-865b-7f795ea149fb\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 07:47:38 crc kubenswrapper[4895]: I1202 07:47:38.340257 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 02 07:47:38 crc kubenswrapper[4895]: I1202 07:47:38.597840 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"0a3ec758-e19e-4286-bfed-a1d6d3010bfb","Type":"ContainerStarted","Data":"7dc2853c20a38045953efd3752aa502543cbbe08dd450481c9d49ada9a7e28ab"} Dec 02 07:47:38 crc kubenswrapper[4895]: I1202 07:47:38.598427 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"0a3ec758-e19e-4286-bfed-a1d6d3010bfb","Type":"ContainerStarted","Data":"8bdd186a6b7674f05fab18e7c56a6b0e62a67b5a53a50271787e7d6eeeda8493"} Dec 02 07:47:38 crc kubenswrapper[4895]: I1202 07:47:38.599085 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Dec 02 07:47:38 crc kubenswrapper[4895]: I1202 07:47:38.605532 4895 generic.go:334] "Generic (PLEG): container finished" podID="c61fb3df-f75c-480e-a64c-8436aec04a67" containerID="2d44b21df02a1b6d9237fec6ee1533177b8a03efad3b51389b892affcbdbe9c8" exitCode=0 Dec 02 07:47:38 crc kubenswrapper[4895]: I1202 07:47:38.605618 4895 generic.go:334] "Generic (PLEG): container finished" 
podID="c61fb3df-f75c-480e-a64c-8436aec04a67" containerID="45a40dd7336d5a5330a9568330163779e86a8ff8d3f644291554642d5d1c1525" exitCode=2 Dec 02 07:47:38 crc kubenswrapper[4895]: I1202 07:47:38.605627 4895 generic.go:334] "Generic (PLEG): container finished" podID="c61fb3df-f75c-480e-a64c-8436aec04a67" containerID="db826e93f54f1a7bd736199df485403bbb1cc0edd174d541d14122e2efef2ee7" exitCode=0 Dec 02 07:47:38 crc kubenswrapper[4895]: I1202 07:47:38.605844 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c61fb3df-f75c-480e-a64c-8436aec04a67","Type":"ContainerDied","Data":"2d44b21df02a1b6d9237fec6ee1533177b8a03efad3b51389b892affcbdbe9c8"} Dec 02 07:47:38 crc kubenswrapper[4895]: I1202 07:47:38.605901 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c61fb3df-f75c-480e-a64c-8436aec04a67","Type":"ContainerDied","Data":"45a40dd7336d5a5330a9568330163779e86a8ff8d3f644291554642d5d1c1525"} Dec 02 07:47:38 crc kubenswrapper[4895]: I1202 07:47:38.605915 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c61fb3df-f75c-480e-a64c-8436aec04a67","Type":"ContainerDied","Data":"db826e93f54f1a7bd736199df485403bbb1cc0edd174d541d14122e2efef2ee7"} Dec 02 07:47:38 crc kubenswrapper[4895]: I1202 07:47:38.681225 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.290831524 podStartE2EDuration="2.681196372s" podCreationTimestamp="2025-12-02 07:47:36 +0000 UTC" firstStartedPulling="2025-12-02 07:47:37.813153773 +0000 UTC m=+1468.984013386" lastFinishedPulling="2025-12-02 07:47:38.203518631 +0000 UTC m=+1469.374378234" observedRunningTime="2025-12-02 07:47:38.621967915 +0000 UTC m=+1469.792827528" watchObservedRunningTime="2025-12-02 07:47:38.681196372 +0000 UTC m=+1469.852055975" Dec 02 07:47:38 crc kubenswrapper[4895]: I1202 07:47:38.691200 4895 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 02 07:47:38 crc kubenswrapper[4895]: W1202 07:47:38.700904 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod183c5216_30f9_4f75_865b_7f795ea149fb.slice/crio-45048d305695bd4e78872a121b29efa32be5590ce3404fe1e6a4b774a3633a98 WatchSource:0}: Error finding container 45048d305695bd4e78872a121b29efa32be5590ce3404fe1e6a4b774a3633a98: Status 404 returned error can't find the container with id 45048d305695bd4e78872a121b29efa32be5590ce3404fe1e6a4b774a3633a98 Dec 02 07:47:39 crc kubenswrapper[4895]: I1202 07:47:39.163086 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e54197c-b432-4f6e-9bd9-ce1f15fde624" path="/var/lib/kubelet/pods/1e54197c-b432-4f6e-9bd9-ce1f15fde624/volumes" Dec 02 07:47:39 crc kubenswrapper[4895]: I1202 07:47:39.298536 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 02 07:47:39 crc kubenswrapper[4895]: I1202 07:47:39.428389 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c61fb3df-f75c-480e-a64c-8436aec04a67-combined-ca-bundle\") pod \"c61fb3df-f75c-480e-a64c-8436aec04a67\" (UID: \"c61fb3df-f75c-480e-a64c-8436aec04a67\") " Dec 02 07:47:39 crc kubenswrapper[4895]: I1202 07:47:39.429974 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c61fb3df-f75c-480e-a64c-8436aec04a67-run-httpd\") pod \"c61fb3df-f75c-480e-a64c-8436aec04a67\" (UID: \"c61fb3df-f75c-480e-a64c-8436aec04a67\") " Dec 02 07:47:39 crc kubenswrapper[4895]: I1202 07:47:39.430004 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c61fb3df-f75c-480e-a64c-8436aec04a67-log-httpd\") pod 
\"c61fb3df-f75c-480e-a64c-8436aec04a67\" (UID: \"c61fb3df-f75c-480e-a64c-8436aec04a67\") " Dec 02 07:47:39 crc kubenswrapper[4895]: I1202 07:47:39.430039 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c61fb3df-f75c-480e-a64c-8436aec04a67-config-data\") pod \"c61fb3df-f75c-480e-a64c-8436aec04a67\" (UID: \"c61fb3df-f75c-480e-a64c-8436aec04a67\") " Dec 02 07:47:39 crc kubenswrapper[4895]: I1202 07:47:39.430111 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c61fb3df-f75c-480e-a64c-8436aec04a67-sg-core-conf-yaml\") pod \"c61fb3df-f75c-480e-a64c-8436aec04a67\" (UID: \"c61fb3df-f75c-480e-a64c-8436aec04a67\") " Dec 02 07:47:39 crc kubenswrapper[4895]: I1202 07:47:39.430223 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c61fb3df-f75c-480e-a64c-8436aec04a67-scripts\") pod \"c61fb3df-f75c-480e-a64c-8436aec04a67\" (UID: \"c61fb3df-f75c-480e-a64c-8436aec04a67\") " Dec 02 07:47:39 crc kubenswrapper[4895]: I1202 07:47:39.430317 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zx4lr\" (UniqueName: \"kubernetes.io/projected/c61fb3df-f75c-480e-a64c-8436aec04a67-kube-api-access-zx4lr\") pod \"c61fb3df-f75c-480e-a64c-8436aec04a67\" (UID: \"c61fb3df-f75c-480e-a64c-8436aec04a67\") " Dec 02 07:47:39 crc kubenswrapper[4895]: I1202 07:47:39.430508 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c61fb3df-f75c-480e-a64c-8436aec04a67-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "c61fb3df-f75c-480e-a64c-8436aec04a67" (UID: "c61fb3df-f75c-480e-a64c-8436aec04a67"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:47:39 crc kubenswrapper[4895]: I1202 07:47:39.430565 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c61fb3df-f75c-480e-a64c-8436aec04a67-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "c61fb3df-f75c-480e-a64c-8436aec04a67" (UID: "c61fb3df-f75c-480e-a64c-8436aec04a67"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:47:39 crc kubenswrapper[4895]: I1202 07:47:39.431212 4895 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c61fb3df-f75c-480e-a64c-8436aec04a67-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 02 07:47:39 crc kubenswrapper[4895]: I1202 07:47:39.431251 4895 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c61fb3df-f75c-480e-a64c-8436aec04a67-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 02 07:47:39 crc kubenswrapper[4895]: I1202 07:47:39.435635 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c61fb3df-f75c-480e-a64c-8436aec04a67-kube-api-access-zx4lr" (OuterVolumeSpecName: "kube-api-access-zx4lr") pod "c61fb3df-f75c-480e-a64c-8436aec04a67" (UID: "c61fb3df-f75c-480e-a64c-8436aec04a67"). InnerVolumeSpecName "kube-api-access-zx4lr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:47:39 crc kubenswrapper[4895]: I1202 07:47:39.437576 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c61fb3df-f75c-480e-a64c-8436aec04a67-scripts" (OuterVolumeSpecName: "scripts") pod "c61fb3df-f75c-480e-a64c-8436aec04a67" (UID: "c61fb3df-f75c-480e-a64c-8436aec04a67"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:47:39 crc kubenswrapper[4895]: I1202 07:47:39.470602 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c61fb3df-f75c-480e-a64c-8436aec04a67-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "c61fb3df-f75c-480e-a64c-8436aec04a67" (UID: "c61fb3df-f75c-480e-a64c-8436aec04a67"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:47:39 crc kubenswrapper[4895]: I1202 07:47:39.511806 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c61fb3df-f75c-480e-a64c-8436aec04a67-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c61fb3df-f75c-480e-a64c-8436aec04a67" (UID: "c61fb3df-f75c-480e-a64c-8436aec04a67"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:47:39 crc kubenswrapper[4895]: I1202 07:47:39.533226 4895 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c61fb3df-f75c-480e-a64c-8436aec04a67-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 07:47:39 crc kubenswrapper[4895]: I1202 07:47:39.533513 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zx4lr\" (UniqueName: \"kubernetes.io/projected/c61fb3df-f75c-480e-a64c-8436aec04a67-kube-api-access-zx4lr\") on node \"crc\" DevicePath \"\"" Dec 02 07:47:39 crc kubenswrapper[4895]: I1202 07:47:39.533974 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c61fb3df-f75c-480e-a64c-8436aec04a67-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 07:47:39 crc kubenswrapper[4895]: I1202 07:47:39.534045 4895 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c61fb3df-f75c-480e-a64c-8436aec04a67-sg-core-conf-yaml\") on node 
\"crc\" DevicePath \"\"" Dec 02 07:47:39 crc kubenswrapper[4895]: I1202 07:47:39.574949 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c61fb3df-f75c-480e-a64c-8436aec04a67-config-data" (OuterVolumeSpecName: "config-data") pod "c61fb3df-f75c-480e-a64c-8436aec04a67" (UID: "c61fb3df-f75c-480e-a64c-8436aec04a67"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:47:39 crc kubenswrapper[4895]: I1202 07:47:39.624225 4895 generic.go:334] "Generic (PLEG): container finished" podID="c61fb3df-f75c-480e-a64c-8436aec04a67" containerID="d97e252a66f722ee7e7efc1d1fd4e6e10074d3adb3119cddaa5537035c4d6b2c" exitCode=0 Dec 02 07:47:39 crc kubenswrapper[4895]: I1202 07:47:39.624326 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 02 07:47:39 crc kubenswrapper[4895]: I1202 07:47:39.624340 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c61fb3df-f75c-480e-a64c-8436aec04a67","Type":"ContainerDied","Data":"d97e252a66f722ee7e7efc1d1fd4e6e10074d3adb3119cddaa5537035c4d6b2c"} Dec 02 07:47:39 crc kubenswrapper[4895]: I1202 07:47:39.624422 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c61fb3df-f75c-480e-a64c-8436aec04a67","Type":"ContainerDied","Data":"b530282eff7a35513a7e9523a1206ab301d90f8c720681bbdfad82daf51a8e46"} Dec 02 07:47:39 crc kubenswrapper[4895]: I1202 07:47:39.624452 4895 scope.go:117] "RemoveContainer" containerID="2d44b21df02a1b6d9237fec6ee1533177b8a03efad3b51389b892affcbdbe9c8" Dec 02 07:47:39 crc kubenswrapper[4895]: I1202 07:47:39.632233 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"183c5216-30f9-4f75-865b-7f795ea149fb","Type":"ContainerStarted","Data":"21f6d09bc2b80b8035a54dfa404bb01cbc6de2843d53dca435681f4b45dafd2f"} Dec 02 07:47:39 crc 
kubenswrapper[4895]: I1202 07:47:39.632586 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"183c5216-30f9-4f75-865b-7f795ea149fb","Type":"ContainerStarted","Data":"45048d305695bd4e78872a121b29efa32be5590ce3404fe1e6a4b774a3633a98"} Dec 02 07:47:39 crc kubenswrapper[4895]: I1202 07:47:39.636190 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c61fb3df-f75c-480e-a64c-8436aec04a67-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 07:47:39 crc kubenswrapper[4895]: I1202 07:47:39.662328 4895 scope.go:117] "RemoveContainer" containerID="45a40dd7336d5a5330a9568330163779e86a8ff8d3f644291554642d5d1c1525" Dec 02 07:47:39 crc kubenswrapper[4895]: I1202 07:47:39.670860 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.670827691 podStartE2EDuration="2.670827691s" podCreationTimestamp="2025-12-02 07:47:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:47:39.662958108 +0000 UTC m=+1470.833817741" watchObservedRunningTime="2025-12-02 07:47:39.670827691 +0000 UTC m=+1470.841687304" Dec 02 07:47:39 crc kubenswrapper[4895]: I1202 07:47:39.689462 4895 scope.go:117] "RemoveContainer" containerID="d97e252a66f722ee7e7efc1d1fd4e6e10074d3adb3119cddaa5537035c4d6b2c" Dec 02 07:47:39 crc kubenswrapper[4895]: I1202 07:47:39.690852 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 02 07:47:39 crc kubenswrapper[4895]: I1202 07:47:39.700781 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 02 07:47:39 crc kubenswrapper[4895]: I1202 07:47:39.719313 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 02 07:47:39 crc kubenswrapper[4895]: E1202 07:47:39.719928 4895 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c61fb3df-f75c-480e-a64c-8436aec04a67" containerName="ceilometer-central-agent" Dec 02 07:47:39 crc kubenswrapper[4895]: I1202 07:47:39.719957 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="c61fb3df-f75c-480e-a64c-8436aec04a67" containerName="ceilometer-central-agent" Dec 02 07:47:39 crc kubenswrapper[4895]: E1202 07:47:39.719981 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c61fb3df-f75c-480e-a64c-8436aec04a67" containerName="sg-core" Dec 02 07:47:39 crc kubenswrapper[4895]: I1202 07:47:39.719991 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="c61fb3df-f75c-480e-a64c-8436aec04a67" containerName="sg-core" Dec 02 07:47:39 crc kubenswrapper[4895]: E1202 07:47:39.720019 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c61fb3df-f75c-480e-a64c-8436aec04a67" containerName="ceilometer-notification-agent" Dec 02 07:47:39 crc kubenswrapper[4895]: I1202 07:47:39.720030 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="c61fb3df-f75c-480e-a64c-8436aec04a67" containerName="ceilometer-notification-agent" Dec 02 07:47:39 crc kubenswrapper[4895]: E1202 07:47:39.720048 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c61fb3df-f75c-480e-a64c-8436aec04a67" containerName="proxy-httpd" Dec 02 07:47:39 crc kubenswrapper[4895]: I1202 07:47:39.720057 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="c61fb3df-f75c-480e-a64c-8436aec04a67" containerName="proxy-httpd" Dec 02 07:47:39 crc kubenswrapper[4895]: I1202 07:47:39.720595 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="c61fb3df-f75c-480e-a64c-8436aec04a67" containerName="ceilometer-notification-agent" Dec 02 07:47:39 crc kubenswrapper[4895]: I1202 07:47:39.720620 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="c61fb3df-f75c-480e-a64c-8436aec04a67" containerName="proxy-httpd" Dec 02 07:47:39 crc kubenswrapper[4895]: 
I1202 07:47:39.720649 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="c61fb3df-f75c-480e-a64c-8436aec04a67" containerName="sg-core" Dec 02 07:47:39 crc kubenswrapper[4895]: I1202 07:47:39.720671 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="c61fb3df-f75c-480e-a64c-8436aec04a67" containerName="ceilometer-central-agent" Dec 02 07:47:39 crc kubenswrapper[4895]: I1202 07:47:39.728933 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 02 07:47:39 crc kubenswrapper[4895]: I1202 07:47:39.732142 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 02 07:47:39 crc kubenswrapper[4895]: I1202 07:47:39.732415 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 02 07:47:39 crc kubenswrapper[4895]: I1202 07:47:39.732584 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 02 07:47:39 crc kubenswrapper[4895]: I1202 07:47:39.735824 4895 scope.go:117] "RemoveContainer" containerID="db826e93f54f1a7bd736199df485403bbb1cc0edd174d541d14122e2efef2ee7" Dec 02 07:47:39 crc kubenswrapper[4895]: I1202 07:47:39.743603 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 02 07:47:39 crc kubenswrapper[4895]: I1202 07:47:39.767025 4895 scope.go:117] "RemoveContainer" containerID="2d44b21df02a1b6d9237fec6ee1533177b8a03efad3b51389b892affcbdbe9c8" Dec 02 07:47:39 crc kubenswrapper[4895]: E1202 07:47:39.767693 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d44b21df02a1b6d9237fec6ee1533177b8a03efad3b51389b892affcbdbe9c8\": container with ID starting with 2d44b21df02a1b6d9237fec6ee1533177b8a03efad3b51389b892affcbdbe9c8 not found: ID does not exist" 
containerID="2d44b21df02a1b6d9237fec6ee1533177b8a03efad3b51389b892affcbdbe9c8" Dec 02 07:47:39 crc kubenswrapper[4895]: I1202 07:47:39.767768 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d44b21df02a1b6d9237fec6ee1533177b8a03efad3b51389b892affcbdbe9c8"} err="failed to get container status \"2d44b21df02a1b6d9237fec6ee1533177b8a03efad3b51389b892affcbdbe9c8\": rpc error: code = NotFound desc = could not find container \"2d44b21df02a1b6d9237fec6ee1533177b8a03efad3b51389b892affcbdbe9c8\": container with ID starting with 2d44b21df02a1b6d9237fec6ee1533177b8a03efad3b51389b892affcbdbe9c8 not found: ID does not exist" Dec 02 07:47:39 crc kubenswrapper[4895]: I1202 07:47:39.767804 4895 scope.go:117] "RemoveContainer" containerID="45a40dd7336d5a5330a9568330163779e86a8ff8d3f644291554642d5d1c1525" Dec 02 07:47:39 crc kubenswrapper[4895]: E1202 07:47:39.768162 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45a40dd7336d5a5330a9568330163779e86a8ff8d3f644291554642d5d1c1525\": container with ID starting with 45a40dd7336d5a5330a9568330163779e86a8ff8d3f644291554642d5d1c1525 not found: ID does not exist" containerID="45a40dd7336d5a5330a9568330163779e86a8ff8d3f644291554642d5d1c1525" Dec 02 07:47:39 crc kubenswrapper[4895]: I1202 07:47:39.768207 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45a40dd7336d5a5330a9568330163779e86a8ff8d3f644291554642d5d1c1525"} err="failed to get container status \"45a40dd7336d5a5330a9568330163779e86a8ff8d3f644291554642d5d1c1525\": rpc error: code = NotFound desc = could not find container \"45a40dd7336d5a5330a9568330163779e86a8ff8d3f644291554642d5d1c1525\": container with ID starting with 45a40dd7336d5a5330a9568330163779e86a8ff8d3f644291554642d5d1c1525 not found: ID does not exist" Dec 02 07:47:39 crc kubenswrapper[4895]: I1202 07:47:39.768236 4895 scope.go:117] 
"RemoveContainer" containerID="d97e252a66f722ee7e7efc1d1fd4e6e10074d3adb3119cddaa5537035c4d6b2c" Dec 02 07:47:39 crc kubenswrapper[4895]: E1202 07:47:39.768451 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d97e252a66f722ee7e7efc1d1fd4e6e10074d3adb3119cddaa5537035c4d6b2c\": container with ID starting with d97e252a66f722ee7e7efc1d1fd4e6e10074d3adb3119cddaa5537035c4d6b2c not found: ID does not exist" containerID="d97e252a66f722ee7e7efc1d1fd4e6e10074d3adb3119cddaa5537035c4d6b2c" Dec 02 07:47:39 crc kubenswrapper[4895]: I1202 07:47:39.768490 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d97e252a66f722ee7e7efc1d1fd4e6e10074d3adb3119cddaa5537035c4d6b2c"} err="failed to get container status \"d97e252a66f722ee7e7efc1d1fd4e6e10074d3adb3119cddaa5537035c4d6b2c\": rpc error: code = NotFound desc = could not find container \"d97e252a66f722ee7e7efc1d1fd4e6e10074d3adb3119cddaa5537035c4d6b2c\": container with ID starting with d97e252a66f722ee7e7efc1d1fd4e6e10074d3adb3119cddaa5537035c4d6b2c not found: ID does not exist" Dec 02 07:47:39 crc kubenswrapper[4895]: I1202 07:47:39.768509 4895 scope.go:117] "RemoveContainer" containerID="db826e93f54f1a7bd736199df485403bbb1cc0edd174d541d14122e2efef2ee7" Dec 02 07:47:39 crc kubenswrapper[4895]: E1202 07:47:39.768756 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db826e93f54f1a7bd736199df485403bbb1cc0edd174d541d14122e2efef2ee7\": container with ID starting with db826e93f54f1a7bd736199df485403bbb1cc0edd174d541d14122e2efef2ee7 not found: ID does not exist" containerID="db826e93f54f1a7bd736199df485403bbb1cc0edd174d541d14122e2efef2ee7" Dec 02 07:47:39 crc kubenswrapper[4895]: I1202 07:47:39.768780 4895 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"db826e93f54f1a7bd736199df485403bbb1cc0edd174d541d14122e2efef2ee7"} err="failed to get container status \"db826e93f54f1a7bd736199df485403bbb1cc0edd174d541d14122e2efef2ee7\": rpc error: code = NotFound desc = could not find container \"db826e93f54f1a7bd736199df485403bbb1cc0edd174d541d14122e2efef2ee7\": container with ID starting with db826e93f54f1a7bd736199df485403bbb1cc0edd174d541d14122e2efef2ee7 not found: ID does not exist" Dec 02 07:47:39 crc kubenswrapper[4895]: I1202 07:47:39.831324 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 02 07:47:39 crc kubenswrapper[4895]: I1202 07:47:39.832789 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 02 07:47:39 crc kubenswrapper[4895]: I1202 07:47:39.833135 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 02 07:47:39 crc kubenswrapper[4895]: I1202 07:47:39.835179 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 02 07:47:39 crc kubenswrapper[4895]: I1202 07:47:39.843387 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54343b4a-8625-4b23-9d95-d67c5d649f3e-config-data\") pod \"ceilometer-0\" (UID: \"54343b4a-8625-4b23-9d95-d67c5d649f3e\") " pod="openstack/ceilometer-0" Dec 02 07:47:39 crc kubenswrapper[4895]: I1202 07:47:39.843666 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/54343b4a-8625-4b23-9d95-d67c5d649f3e-log-httpd\") pod \"ceilometer-0\" (UID: \"54343b4a-8625-4b23-9d95-d67c5d649f3e\") " pod="openstack/ceilometer-0" Dec 02 07:47:39 crc kubenswrapper[4895]: I1202 07:47:39.843856 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/54343b4a-8625-4b23-9d95-d67c5d649f3e-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"54343b4a-8625-4b23-9d95-d67c5d649f3e\") " pod="openstack/ceilometer-0" Dec 02 07:47:39 crc kubenswrapper[4895]: I1202 07:47:39.844001 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4qpj\" (UniqueName: \"kubernetes.io/projected/54343b4a-8625-4b23-9d95-d67c5d649f3e-kube-api-access-b4qpj\") pod \"ceilometer-0\" (UID: \"54343b4a-8625-4b23-9d95-d67c5d649f3e\") " pod="openstack/ceilometer-0" Dec 02 07:47:39 crc kubenswrapper[4895]: I1202 07:47:39.844153 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54343b4a-8625-4b23-9d95-d67c5d649f3e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"54343b4a-8625-4b23-9d95-d67c5d649f3e\") " pod="openstack/ceilometer-0" Dec 02 07:47:39 crc kubenswrapper[4895]: I1202 07:47:39.844277 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/54343b4a-8625-4b23-9d95-d67c5d649f3e-scripts\") pod \"ceilometer-0\" (UID: \"54343b4a-8625-4b23-9d95-d67c5d649f3e\") " pod="openstack/ceilometer-0" Dec 02 07:47:39 crc kubenswrapper[4895]: I1202 07:47:39.844430 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/54343b4a-8625-4b23-9d95-d67c5d649f3e-run-httpd\") pod \"ceilometer-0\" (UID: \"54343b4a-8625-4b23-9d95-d67c5d649f3e\") " pod="openstack/ceilometer-0" Dec 02 07:47:39 crc kubenswrapper[4895]: I1202 07:47:39.844558 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/54343b4a-8625-4b23-9d95-d67c5d649f3e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"54343b4a-8625-4b23-9d95-d67c5d649f3e\") " pod="openstack/ceilometer-0" Dec 02 07:47:39 crc kubenswrapper[4895]: I1202 07:47:39.948324 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/54343b4a-8625-4b23-9d95-d67c5d649f3e-run-httpd\") pod \"ceilometer-0\" (UID: \"54343b4a-8625-4b23-9d95-d67c5d649f3e\") " pod="openstack/ceilometer-0" Dec 02 07:47:39 crc kubenswrapper[4895]: I1202 07:47:39.948396 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/54343b4a-8625-4b23-9d95-d67c5d649f3e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"54343b4a-8625-4b23-9d95-d67c5d649f3e\") " pod="openstack/ceilometer-0" Dec 02 07:47:39 crc kubenswrapper[4895]: I1202 07:47:39.948501 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54343b4a-8625-4b23-9d95-d67c5d649f3e-config-data\") pod \"ceilometer-0\" (UID: \"54343b4a-8625-4b23-9d95-d67c5d649f3e\") " pod="openstack/ceilometer-0" Dec 02 07:47:39 crc kubenswrapper[4895]: I1202 07:47:39.948549 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/54343b4a-8625-4b23-9d95-d67c5d649f3e-log-httpd\") pod \"ceilometer-0\" (UID: \"54343b4a-8625-4b23-9d95-d67c5d649f3e\") " pod="openstack/ceilometer-0" Dec 02 07:47:39 crc kubenswrapper[4895]: I1202 07:47:39.948611 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/54343b4a-8625-4b23-9d95-d67c5d649f3e-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"54343b4a-8625-4b23-9d95-d67c5d649f3e\") " pod="openstack/ceilometer-0" Dec 02 07:47:39 crc kubenswrapper[4895]: I1202 
07:47:39.948631 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4qpj\" (UniqueName: \"kubernetes.io/projected/54343b4a-8625-4b23-9d95-d67c5d649f3e-kube-api-access-b4qpj\") pod \"ceilometer-0\" (UID: \"54343b4a-8625-4b23-9d95-d67c5d649f3e\") " pod="openstack/ceilometer-0" Dec 02 07:47:39 crc kubenswrapper[4895]: I1202 07:47:39.948674 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54343b4a-8625-4b23-9d95-d67c5d649f3e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"54343b4a-8625-4b23-9d95-d67c5d649f3e\") " pod="openstack/ceilometer-0" Dec 02 07:47:39 crc kubenswrapper[4895]: I1202 07:47:39.948693 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/54343b4a-8625-4b23-9d95-d67c5d649f3e-scripts\") pod \"ceilometer-0\" (UID: \"54343b4a-8625-4b23-9d95-d67c5d649f3e\") " pod="openstack/ceilometer-0" Dec 02 07:47:39 crc kubenswrapper[4895]: I1202 07:47:39.948978 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/54343b4a-8625-4b23-9d95-d67c5d649f3e-run-httpd\") pod \"ceilometer-0\" (UID: \"54343b4a-8625-4b23-9d95-d67c5d649f3e\") " pod="openstack/ceilometer-0" Dec 02 07:47:39 crc kubenswrapper[4895]: I1202 07:47:39.952032 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/54343b4a-8625-4b23-9d95-d67c5d649f3e-log-httpd\") pod \"ceilometer-0\" (UID: \"54343b4a-8625-4b23-9d95-d67c5d649f3e\") " pod="openstack/ceilometer-0" Dec 02 07:47:39 crc kubenswrapper[4895]: I1202 07:47:39.953617 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/54343b4a-8625-4b23-9d95-d67c5d649f3e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"54343b4a-8625-4b23-9d95-d67c5d649f3e\") " pod="openstack/ceilometer-0" Dec 02 07:47:39 crc kubenswrapper[4895]: I1202 07:47:39.954218 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/54343b4a-8625-4b23-9d95-d67c5d649f3e-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"54343b4a-8625-4b23-9d95-d67c5d649f3e\") " pod="openstack/ceilometer-0" Dec 02 07:47:39 crc kubenswrapper[4895]: I1202 07:47:39.956529 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54343b4a-8625-4b23-9d95-d67c5d649f3e-config-data\") pod \"ceilometer-0\" (UID: \"54343b4a-8625-4b23-9d95-d67c5d649f3e\") " pod="openstack/ceilometer-0" Dec 02 07:47:39 crc kubenswrapper[4895]: I1202 07:47:39.957998 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/54343b4a-8625-4b23-9d95-d67c5d649f3e-scripts\") pod \"ceilometer-0\" (UID: \"54343b4a-8625-4b23-9d95-d67c5d649f3e\") " pod="openstack/ceilometer-0" Dec 02 07:47:39 crc kubenswrapper[4895]: I1202 07:47:39.969766 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4qpj\" (UniqueName: \"kubernetes.io/projected/54343b4a-8625-4b23-9d95-d67c5d649f3e-kube-api-access-b4qpj\") pod \"ceilometer-0\" (UID: \"54343b4a-8625-4b23-9d95-d67c5d649f3e\") " pod="openstack/ceilometer-0" Dec 02 07:47:39 crc kubenswrapper[4895]: I1202 07:47:39.977890 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54343b4a-8625-4b23-9d95-d67c5d649f3e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"54343b4a-8625-4b23-9d95-d67c5d649f3e\") " pod="openstack/ceilometer-0" Dec 02 07:47:40 crc kubenswrapper[4895]: I1202 07:47:40.047273 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 02 07:47:40 crc kubenswrapper[4895]: I1202 07:47:40.575990 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 02 07:47:40 crc kubenswrapper[4895]: I1202 07:47:40.650133 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"54343b4a-8625-4b23-9d95-d67c5d649f3e","Type":"ContainerStarted","Data":"68c31a476984f4964e9e4ec22f35ed2cdba52e22d81dc49887989f3b97c486c2"} Dec 02 07:47:40 crc kubenswrapper[4895]: I1202 07:47:40.650969 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 02 07:47:40 crc kubenswrapper[4895]: I1202 07:47:40.658266 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 02 07:47:40 crc kubenswrapper[4895]: I1202 07:47:40.828969 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-zx9lx"] Dec 02 07:47:40 crc kubenswrapper[4895]: I1202 07:47:40.841732 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-zx9lx" Dec 02 07:47:40 crc kubenswrapper[4895]: I1202 07:47:40.904974 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-zx9lx"] Dec 02 07:47:40 crc kubenswrapper[4895]: I1202 07:47:40.972459 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9c5f33d2-0416-40b5-8133-324aa1a60118-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-zx9lx\" (UID: \"9c5f33d2-0416-40b5-8133-324aa1a60118\") " pod="openstack/dnsmasq-dns-89c5cd4d5-zx9lx" Dec 02 07:47:40 crc kubenswrapper[4895]: I1202 07:47:40.972577 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9c5f33d2-0416-40b5-8133-324aa1a60118-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-zx9lx\" (UID: \"9c5f33d2-0416-40b5-8133-324aa1a60118\") " pod="openstack/dnsmasq-dns-89c5cd4d5-zx9lx" Dec 02 07:47:40 crc kubenswrapper[4895]: I1202 07:47:40.972672 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fcvjf\" (UniqueName: \"kubernetes.io/projected/9c5f33d2-0416-40b5-8133-324aa1a60118-kube-api-access-fcvjf\") pod \"dnsmasq-dns-89c5cd4d5-zx9lx\" (UID: \"9c5f33d2-0416-40b5-8133-324aa1a60118\") " pod="openstack/dnsmasq-dns-89c5cd4d5-zx9lx" Dec 02 07:47:40 crc kubenswrapper[4895]: I1202 07:47:40.972772 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9c5f33d2-0416-40b5-8133-324aa1a60118-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-zx9lx\" (UID: \"9c5f33d2-0416-40b5-8133-324aa1a60118\") " pod="openstack/dnsmasq-dns-89c5cd4d5-zx9lx" Dec 02 07:47:40 crc kubenswrapper[4895]: I1202 07:47:40.972812 4895 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c5f33d2-0416-40b5-8133-324aa1a60118-config\") pod \"dnsmasq-dns-89c5cd4d5-zx9lx\" (UID: \"9c5f33d2-0416-40b5-8133-324aa1a60118\") " pod="openstack/dnsmasq-dns-89c5cd4d5-zx9lx" Dec 02 07:47:40 crc kubenswrapper[4895]: I1202 07:47:40.972840 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9c5f33d2-0416-40b5-8133-324aa1a60118-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-zx9lx\" (UID: \"9c5f33d2-0416-40b5-8133-324aa1a60118\") " pod="openstack/dnsmasq-dns-89c5cd4d5-zx9lx" Dec 02 07:47:41 crc kubenswrapper[4895]: I1202 07:47:41.074337 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c5f33d2-0416-40b5-8133-324aa1a60118-config\") pod \"dnsmasq-dns-89c5cd4d5-zx9lx\" (UID: \"9c5f33d2-0416-40b5-8133-324aa1a60118\") " pod="openstack/dnsmasq-dns-89c5cd4d5-zx9lx" Dec 02 07:47:41 crc kubenswrapper[4895]: I1202 07:47:41.074408 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9c5f33d2-0416-40b5-8133-324aa1a60118-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-zx9lx\" (UID: \"9c5f33d2-0416-40b5-8133-324aa1a60118\") " pod="openstack/dnsmasq-dns-89c5cd4d5-zx9lx" Dec 02 07:47:41 crc kubenswrapper[4895]: I1202 07:47:41.074446 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9c5f33d2-0416-40b5-8133-324aa1a60118-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-zx9lx\" (UID: \"9c5f33d2-0416-40b5-8133-324aa1a60118\") " pod="openstack/dnsmasq-dns-89c5cd4d5-zx9lx" Dec 02 07:47:41 crc kubenswrapper[4895]: I1202 07:47:41.074516 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9c5f33d2-0416-40b5-8133-324aa1a60118-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-zx9lx\" (UID: \"9c5f33d2-0416-40b5-8133-324aa1a60118\") " pod="openstack/dnsmasq-dns-89c5cd4d5-zx9lx" Dec 02 07:47:41 crc kubenswrapper[4895]: I1202 07:47:41.074587 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fcvjf\" (UniqueName: \"kubernetes.io/projected/9c5f33d2-0416-40b5-8133-324aa1a60118-kube-api-access-fcvjf\") pod \"dnsmasq-dns-89c5cd4d5-zx9lx\" (UID: \"9c5f33d2-0416-40b5-8133-324aa1a60118\") " pod="openstack/dnsmasq-dns-89c5cd4d5-zx9lx" Dec 02 07:47:41 crc kubenswrapper[4895]: I1202 07:47:41.074646 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9c5f33d2-0416-40b5-8133-324aa1a60118-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-zx9lx\" (UID: \"9c5f33d2-0416-40b5-8133-324aa1a60118\") " pod="openstack/dnsmasq-dns-89c5cd4d5-zx9lx" Dec 02 07:47:41 crc kubenswrapper[4895]: I1202 07:47:41.075669 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9c5f33d2-0416-40b5-8133-324aa1a60118-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-zx9lx\" (UID: \"9c5f33d2-0416-40b5-8133-324aa1a60118\") " pod="openstack/dnsmasq-dns-89c5cd4d5-zx9lx" Dec 02 07:47:41 crc kubenswrapper[4895]: I1202 07:47:41.076231 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c5f33d2-0416-40b5-8133-324aa1a60118-config\") pod \"dnsmasq-dns-89c5cd4d5-zx9lx\" (UID: \"9c5f33d2-0416-40b5-8133-324aa1a60118\") " pod="openstack/dnsmasq-dns-89c5cd4d5-zx9lx" Dec 02 07:47:41 crc kubenswrapper[4895]: I1202 07:47:41.076717 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/9c5f33d2-0416-40b5-8133-324aa1a60118-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-zx9lx\" (UID: \"9c5f33d2-0416-40b5-8133-324aa1a60118\") " pod="openstack/dnsmasq-dns-89c5cd4d5-zx9lx" Dec 02 07:47:41 crc kubenswrapper[4895]: I1202 07:47:41.077366 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9c5f33d2-0416-40b5-8133-324aa1a60118-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-zx9lx\" (UID: \"9c5f33d2-0416-40b5-8133-324aa1a60118\") " pod="openstack/dnsmasq-dns-89c5cd4d5-zx9lx" Dec 02 07:47:41 crc kubenswrapper[4895]: I1202 07:47:41.078023 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9c5f33d2-0416-40b5-8133-324aa1a60118-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-zx9lx\" (UID: \"9c5f33d2-0416-40b5-8133-324aa1a60118\") " pod="openstack/dnsmasq-dns-89c5cd4d5-zx9lx" Dec 02 07:47:41 crc kubenswrapper[4895]: I1202 07:47:41.120486 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fcvjf\" (UniqueName: \"kubernetes.io/projected/9c5f33d2-0416-40b5-8133-324aa1a60118-kube-api-access-fcvjf\") pod \"dnsmasq-dns-89c5cd4d5-zx9lx\" (UID: \"9c5f33d2-0416-40b5-8133-324aa1a60118\") " pod="openstack/dnsmasq-dns-89c5cd4d5-zx9lx" Dec 02 07:47:41 crc kubenswrapper[4895]: I1202 07:47:41.159124 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c61fb3df-f75c-480e-a64c-8436aec04a67" path="/var/lib/kubelet/pods/c61fb3df-f75c-480e-a64c-8436aec04a67/volumes" Dec 02 07:47:41 crc kubenswrapper[4895]: I1202 07:47:41.181399 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-zx9lx" Dec 02 07:47:41 crc kubenswrapper[4895]: I1202 07:47:41.663519 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"54343b4a-8625-4b23-9d95-d67c5d649f3e","Type":"ContainerStarted","Data":"16a1a0fd0a8e9f66be4f82a00840aca635f3009e5b749fc66ceb9b3d973f3780"} Dec 02 07:47:41 crc kubenswrapper[4895]: I1202 07:47:41.783027 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-zx9lx"] Dec 02 07:47:41 crc kubenswrapper[4895]: W1202 07:47:41.809823 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9c5f33d2_0416_40b5_8133_324aa1a60118.slice/crio-fc7eddce0ba5ae5098676a1f5c6537ff965b82512c0225e16f0345065f6c2e87 WatchSource:0}: Error finding container fc7eddce0ba5ae5098676a1f5c6537ff965b82512c0225e16f0345065f6c2e87: Status 404 returned error can't find the container with id fc7eddce0ba5ae5098676a1f5c6537ff965b82512c0225e16f0345065f6c2e87 Dec 02 07:47:42 crc kubenswrapper[4895]: I1202 07:47:42.678808 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"54343b4a-8625-4b23-9d95-d67c5d649f3e","Type":"ContainerStarted","Data":"c0f088ef401618517e67e28e5a328966463dc38f136b4491562d6b992d975022"} Dec 02 07:47:42 crc kubenswrapper[4895]: I1202 07:47:42.686760 4895 generic.go:334] "Generic (PLEG): container finished" podID="9c5f33d2-0416-40b5-8133-324aa1a60118" containerID="5d46ad9baa196f7d343ee2c2039ebc3c7e53ef6e2da357137a9c5ae843ce8ef6" exitCode=0 Dec 02 07:47:42 crc kubenswrapper[4895]: I1202 07:47:42.688413 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-zx9lx" event={"ID":"9c5f33d2-0416-40b5-8133-324aa1a60118","Type":"ContainerDied","Data":"5d46ad9baa196f7d343ee2c2039ebc3c7e53ef6e2da357137a9c5ae843ce8ef6"} Dec 02 07:47:42 crc kubenswrapper[4895]: I1202 
07:47:42.688451 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-zx9lx" event={"ID":"9c5f33d2-0416-40b5-8133-324aa1a60118","Type":"ContainerStarted","Data":"fc7eddce0ba5ae5098676a1f5c6537ff965b82512c0225e16f0345065f6c2e87"} Dec 02 07:47:43 crc kubenswrapper[4895]: I1202 07:47:43.341393 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 02 07:47:43 crc kubenswrapper[4895]: I1202 07:47:43.710689 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"54343b4a-8625-4b23-9d95-d67c5d649f3e","Type":"ContainerStarted","Data":"2fe7ea32536893137c42d8ec1e356b1446b1785f0a21cd68dcb97234eadd2da5"} Dec 02 07:47:43 crc kubenswrapper[4895]: I1202 07:47:43.713422 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-zx9lx" event={"ID":"9c5f33d2-0416-40b5-8133-324aa1a60118","Type":"ContainerStarted","Data":"9119b66d955826b5b5f6ec45ebae984f251b2adfcdada61729fc1808ffec5194"} Dec 02 07:47:43 crc kubenswrapper[4895]: I1202 07:47:43.714970 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-89c5cd4d5-zx9lx" Dec 02 07:47:43 crc kubenswrapper[4895]: I1202 07:47:43.750942 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-89c5cd4d5-zx9lx" podStartSLOduration=3.750915234 podStartE2EDuration="3.750915234s" podCreationTimestamp="2025-12-02 07:47:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:47:43.738022377 +0000 UTC m=+1474.908881990" watchObservedRunningTime="2025-12-02 07:47:43.750915234 +0000 UTC m=+1474.921774857" Dec 02 07:47:44 crc kubenswrapper[4895]: I1202 07:47:44.125189 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 02 07:47:44 crc kubenswrapper[4895]: I1202 
07:47:44.395998 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 02 07:47:44 crc kubenswrapper[4895]: I1202 07:47:44.396635 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="de57c107-7f7e-4d49-a986-9ffc3ca3d828" containerName="nova-api-log" containerID="cri-o://6c75181469b7e4e874c6cb08cc5589b4e51ef084d9a3f044bd9ad3e485bec0ff" gracePeriod=30 Dec 02 07:47:44 crc kubenswrapper[4895]: I1202 07:47:44.396730 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="de57c107-7f7e-4d49-a986-9ffc3ca3d828" containerName="nova-api-api" containerID="cri-o://9c6c4a34755af17a63fa22b1a3b2be0d498dc4cbba5ef1a8c1c29481bc2c0d85" gracePeriod=30 Dec 02 07:47:44 crc kubenswrapper[4895]: I1202 07:47:44.724662 4895 generic.go:334] "Generic (PLEG): container finished" podID="de57c107-7f7e-4d49-a986-9ffc3ca3d828" containerID="6c75181469b7e4e874c6cb08cc5589b4e51ef084d9a3f044bd9ad3e485bec0ff" exitCode=143 Dec 02 07:47:44 crc kubenswrapper[4895]: I1202 07:47:44.725791 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"de57c107-7f7e-4d49-a986-9ffc3ca3d828","Type":"ContainerDied","Data":"6c75181469b7e4e874c6cb08cc5589b4e51ef084d9a3f044bd9ad3e485bec0ff"} Dec 02 07:47:45 crc kubenswrapper[4895]: I1202 07:47:45.738947 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"54343b4a-8625-4b23-9d95-d67c5d649f3e","Type":"ContainerStarted","Data":"904262a3f6ba5e51d0fe3d22851d61e7a57ab99795d3be21d8f3e1987e167054"} Dec 02 07:47:45 crc kubenswrapper[4895]: I1202 07:47:45.738987 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="54343b4a-8625-4b23-9d95-d67c5d649f3e" containerName="ceilometer-central-agent" containerID="cri-o://16a1a0fd0a8e9f66be4f82a00840aca635f3009e5b749fc66ceb9b3d973f3780" gracePeriod=30 Dec 02 
07:47:45 crc kubenswrapper[4895]: I1202 07:47:45.739254 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="54343b4a-8625-4b23-9d95-d67c5d649f3e" containerName="proxy-httpd" containerID="cri-o://904262a3f6ba5e51d0fe3d22851d61e7a57ab99795d3be21d8f3e1987e167054" gracePeriod=30 Dec 02 07:47:45 crc kubenswrapper[4895]: I1202 07:47:45.739276 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="54343b4a-8625-4b23-9d95-d67c5d649f3e" containerName="sg-core" containerID="cri-o://2fe7ea32536893137c42d8ec1e356b1446b1785f0a21cd68dcb97234eadd2da5" gracePeriod=30 Dec 02 07:47:45 crc kubenswrapper[4895]: I1202 07:47:45.739291 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="54343b4a-8625-4b23-9d95-d67c5d649f3e" containerName="ceilometer-notification-agent" containerID="cri-o://c0f088ef401618517e67e28e5a328966463dc38f136b4491562d6b992d975022" gracePeriod=30 Dec 02 07:47:45 crc kubenswrapper[4895]: I1202 07:47:45.739702 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 02 07:47:45 crc kubenswrapper[4895]: I1202 07:47:45.768874 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.765969673 podStartE2EDuration="6.768857085s" podCreationTimestamp="2025-12-02 07:47:39 +0000 UTC" firstStartedPulling="2025-12-02 07:47:40.586282393 +0000 UTC m=+1471.757142006" lastFinishedPulling="2025-12-02 07:47:44.589169805 +0000 UTC m=+1475.760029418" observedRunningTime="2025-12-02 07:47:45.765530973 +0000 UTC m=+1476.936390606" watchObservedRunningTime="2025-12-02 07:47:45.768857085 +0000 UTC m=+1476.939716698" Dec 02 07:47:46 crc kubenswrapper[4895]: I1202 07:47:46.753618 4895 generic.go:334] "Generic (PLEG): container finished" podID="54343b4a-8625-4b23-9d95-d67c5d649f3e" 
containerID="904262a3f6ba5e51d0fe3d22851d61e7a57ab99795d3be21d8f3e1987e167054" exitCode=0 Dec 02 07:47:46 crc kubenswrapper[4895]: I1202 07:47:46.754840 4895 generic.go:334] "Generic (PLEG): container finished" podID="54343b4a-8625-4b23-9d95-d67c5d649f3e" containerID="2fe7ea32536893137c42d8ec1e356b1446b1785f0a21cd68dcb97234eadd2da5" exitCode=2 Dec 02 07:47:46 crc kubenswrapper[4895]: I1202 07:47:46.754949 4895 generic.go:334] "Generic (PLEG): container finished" podID="54343b4a-8625-4b23-9d95-d67c5d649f3e" containerID="c0f088ef401618517e67e28e5a328966463dc38f136b4491562d6b992d975022" exitCode=0 Dec 02 07:47:46 crc kubenswrapper[4895]: I1202 07:47:46.753678 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"54343b4a-8625-4b23-9d95-d67c5d649f3e","Type":"ContainerDied","Data":"904262a3f6ba5e51d0fe3d22851d61e7a57ab99795d3be21d8f3e1987e167054"} Dec 02 07:47:46 crc kubenswrapper[4895]: I1202 07:47:46.755128 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"54343b4a-8625-4b23-9d95-d67c5d649f3e","Type":"ContainerDied","Data":"2fe7ea32536893137c42d8ec1e356b1446b1785f0a21cd68dcb97234eadd2da5"} Dec 02 07:47:46 crc kubenswrapper[4895]: I1202 07:47:46.755191 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"54343b4a-8625-4b23-9d95-d67c5d649f3e","Type":"ContainerDied","Data":"c0f088ef401618517e67e28e5a328966463dc38f136b4491562d6b992d975022"} Dec 02 07:47:47 crc kubenswrapper[4895]: I1202 07:47:47.277889 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 02 07:47:47 crc kubenswrapper[4895]: I1202 07:47:47.769656 4895 generic.go:334] "Generic (PLEG): container finished" podID="de57c107-7f7e-4d49-a986-9ffc3ca3d828" containerID="9c6c4a34755af17a63fa22b1a3b2be0d498dc4cbba5ef1a8c1c29481bc2c0d85" exitCode=0 Dec 02 07:47:47 crc kubenswrapper[4895]: I1202 07:47:47.769845 4895 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"de57c107-7f7e-4d49-a986-9ffc3ca3d828","Type":"ContainerDied","Data":"9c6c4a34755af17a63fa22b1a3b2be0d498dc4cbba5ef1a8c1c29481bc2c0d85"} Dec 02 07:47:48 crc kubenswrapper[4895]: I1202 07:47:48.061917 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 02 07:47:48 crc kubenswrapper[4895]: I1202 07:47:48.159356 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de57c107-7f7e-4d49-a986-9ffc3ca3d828-config-data\") pod \"de57c107-7f7e-4d49-a986-9ffc3ca3d828\" (UID: \"de57c107-7f7e-4d49-a986-9ffc3ca3d828\") " Dec 02 07:47:48 crc kubenswrapper[4895]: I1202 07:47:48.159492 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d9b8\" (UniqueName: \"kubernetes.io/projected/de57c107-7f7e-4d49-a986-9ffc3ca3d828-kube-api-access-4d9b8\") pod \"de57c107-7f7e-4d49-a986-9ffc3ca3d828\" (UID: \"de57c107-7f7e-4d49-a986-9ffc3ca3d828\") " Dec 02 07:47:48 crc kubenswrapper[4895]: I1202 07:47:48.177131 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de57c107-7f7e-4d49-a986-9ffc3ca3d828-kube-api-access-4d9b8" (OuterVolumeSpecName: "kube-api-access-4d9b8") pod "de57c107-7f7e-4d49-a986-9ffc3ca3d828" (UID: "de57c107-7f7e-4d49-a986-9ffc3ca3d828"). InnerVolumeSpecName "kube-api-access-4d9b8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:47:48 crc kubenswrapper[4895]: I1202 07:47:48.206956 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de57c107-7f7e-4d49-a986-9ffc3ca3d828-config-data" (OuterVolumeSpecName: "config-data") pod "de57c107-7f7e-4d49-a986-9ffc3ca3d828" (UID: "de57c107-7f7e-4d49-a986-9ffc3ca3d828"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:47:48 crc kubenswrapper[4895]: I1202 07:47:48.262161 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/de57c107-7f7e-4d49-a986-9ffc3ca3d828-logs\") pod \"de57c107-7f7e-4d49-a986-9ffc3ca3d828\" (UID: \"de57c107-7f7e-4d49-a986-9ffc3ca3d828\") " Dec 02 07:47:48 crc kubenswrapper[4895]: I1202 07:47:48.262250 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de57c107-7f7e-4d49-a986-9ffc3ca3d828-combined-ca-bundle\") pod \"de57c107-7f7e-4d49-a986-9ffc3ca3d828\" (UID: \"de57c107-7f7e-4d49-a986-9ffc3ca3d828\") " Dec 02 07:47:48 crc kubenswrapper[4895]: I1202 07:47:48.262990 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de57c107-7f7e-4d49-a986-9ffc3ca3d828-logs" (OuterVolumeSpecName: "logs") pod "de57c107-7f7e-4d49-a986-9ffc3ca3d828" (UID: "de57c107-7f7e-4d49-a986-9ffc3ca3d828"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:47:48 crc kubenswrapper[4895]: I1202 07:47:48.263041 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d9b8\" (UniqueName: \"kubernetes.io/projected/de57c107-7f7e-4d49-a986-9ffc3ca3d828-kube-api-access-4d9b8\") on node \"crc\" DevicePath \"\"" Dec 02 07:47:48 crc kubenswrapper[4895]: I1202 07:47:48.263063 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de57c107-7f7e-4d49-a986-9ffc3ca3d828-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 07:47:48 crc kubenswrapper[4895]: I1202 07:47:48.298816 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de57c107-7f7e-4d49-a986-9ffc3ca3d828-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "de57c107-7f7e-4d49-a986-9ffc3ca3d828" (UID: "de57c107-7f7e-4d49-a986-9ffc3ca3d828"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:47:48 crc kubenswrapper[4895]: I1202 07:47:48.340845 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Dec 02 07:47:48 crc kubenswrapper[4895]: I1202 07:47:48.365461 4895 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/de57c107-7f7e-4d49-a986-9ffc3ca3d828-logs\") on node \"crc\" DevicePath \"\"" Dec 02 07:47:48 crc kubenswrapper[4895]: I1202 07:47:48.365497 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de57c107-7f7e-4d49-a986-9ffc3ca3d828-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 07:47:48 crc kubenswrapper[4895]: I1202 07:47:48.366224 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Dec 02 07:47:48 crc kubenswrapper[4895]: I1202 07:47:48.783834 4895 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"de57c107-7f7e-4d49-a986-9ffc3ca3d828","Type":"ContainerDied","Data":"2db030db2c17b158bcafd98741b114d8b463b02291270ba8b658c53b266309e3"} Dec 02 07:47:48 crc kubenswrapper[4895]: I1202 07:47:48.783906 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 02 07:47:48 crc kubenswrapper[4895]: I1202 07:47:48.783942 4895 scope.go:117] "RemoveContainer" containerID="9c6c4a34755af17a63fa22b1a3b2be0d498dc4cbba5ef1a8c1c29481bc2c0d85" Dec 02 07:47:48 crc kubenswrapper[4895]: I1202 07:47:48.805932 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Dec 02 07:47:48 crc kubenswrapper[4895]: I1202 07:47:48.819859 4895 scope.go:117] "RemoveContainer" containerID="6c75181469b7e4e874c6cb08cc5589b4e51ef084d9a3f044bd9ad3e485bec0ff" Dec 02 07:47:48 crc kubenswrapper[4895]: I1202 07:47:48.858450 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 02 07:47:48 crc kubenswrapper[4895]: I1202 07:47:48.870179 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 02 07:47:48 crc kubenswrapper[4895]: I1202 07:47:48.914788 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 02 07:47:48 crc kubenswrapper[4895]: E1202 07:47:48.915348 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de57c107-7f7e-4d49-a986-9ffc3ca3d828" containerName="nova-api-log" Dec 02 07:47:48 crc kubenswrapper[4895]: I1202 07:47:48.915370 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="de57c107-7f7e-4d49-a986-9ffc3ca3d828" containerName="nova-api-log" Dec 02 07:47:48 crc kubenswrapper[4895]: E1202 07:47:48.915391 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de57c107-7f7e-4d49-a986-9ffc3ca3d828" containerName="nova-api-api" Dec 02 07:47:48 crc 
kubenswrapper[4895]: I1202 07:47:48.915401 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="de57c107-7f7e-4d49-a986-9ffc3ca3d828" containerName="nova-api-api" Dec 02 07:47:48 crc kubenswrapper[4895]: I1202 07:47:48.915660 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="de57c107-7f7e-4d49-a986-9ffc3ca3d828" containerName="nova-api-log" Dec 02 07:47:48 crc kubenswrapper[4895]: I1202 07:47:48.915690 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="de57c107-7f7e-4d49-a986-9ffc3ca3d828" containerName="nova-api-api" Dec 02 07:47:48 crc kubenswrapper[4895]: I1202 07:47:48.917057 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 02 07:47:48 crc kubenswrapper[4895]: I1202 07:47:48.920255 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 02 07:47:48 crc kubenswrapper[4895]: I1202 07:47:48.920515 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Dec 02 07:47:48 crc kubenswrapper[4895]: I1202 07:47:48.920681 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Dec 02 07:47:48 crc kubenswrapper[4895]: I1202 07:47:48.940565 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 02 07:47:48 crc kubenswrapper[4895]: I1202 07:47:48.977354 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9905011-be22-4d3c-89f2-dd677c65b8a1-internal-tls-certs\") pod \"nova-api-0\" (UID: \"f9905011-be22-4d3c-89f2-dd677c65b8a1\") " pod="openstack/nova-api-0" Dec 02 07:47:48 crc kubenswrapper[4895]: I1202 07:47:48.977463 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/f9905011-be22-4d3c-89f2-dd677c65b8a1-logs\") pod \"nova-api-0\" (UID: \"f9905011-be22-4d3c-89f2-dd677c65b8a1\") " pod="openstack/nova-api-0" Dec 02 07:47:48 crc kubenswrapper[4895]: I1202 07:47:48.977728 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ssbzs\" (UniqueName: \"kubernetes.io/projected/f9905011-be22-4d3c-89f2-dd677c65b8a1-kube-api-access-ssbzs\") pod \"nova-api-0\" (UID: \"f9905011-be22-4d3c-89f2-dd677c65b8a1\") " pod="openstack/nova-api-0" Dec 02 07:47:48 crc kubenswrapper[4895]: I1202 07:47:48.977819 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9905011-be22-4d3c-89f2-dd677c65b8a1-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f9905011-be22-4d3c-89f2-dd677c65b8a1\") " pod="openstack/nova-api-0" Dec 02 07:47:48 crc kubenswrapper[4895]: I1202 07:47:48.977887 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9905011-be22-4d3c-89f2-dd677c65b8a1-public-tls-certs\") pod \"nova-api-0\" (UID: \"f9905011-be22-4d3c-89f2-dd677c65b8a1\") " pod="openstack/nova-api-0" Dec 02 07:47:48 crc kubenswrapper[4895]: I1202 07:47:48.978076 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9905011-be22-4d3c-89f2-dd677c65b8a1-config-data\") pod \"nova-api-0\" (UID: \"f9905011-be22-4d3c-89f2-dd677c65b8a1\") " pod="openstack/nova-api-0" Dec 02 07:47:49 crc kubenswrapper[4895]: I1202 07:47:49.031698 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-tx8bx"] Dec 02 07:47:49 crc kubenswrapper[4895]: I1202 07:47:49.033198 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-tx8bx" Dec 02 07:47:49 crc kubenswrapper[4895]: I1202 07:47:49.036071 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Dec 02 07:47:49 crc kubenswrapper[4895]: I1202 07:47:49.036359 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Dec 02 07:47:49 crc kubenswrapper[4895]: I1202 07:47:49.043991 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-tx8bx"] Dec 02 07:47:49 crc kubenswrapper[4895]: I1202 07:47:49.080763 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9905011-be22-4d3c-89f2-dd677c65b8a1-internal-tls-certs\") pod \"nova-api-0\" (UID: \"f9905011-be22-4d3c-89f2-dd677c65b8a1\") " pod="openstack/nova-api-0" Dec 02 07:47:49 crc kubenswrapper[4895]: I1202 07:47:49.080843 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f9905011-be22-4d3c-89f2-dd677c65b8a1-logs\") pod \"nova-api-0\" (UID: \"f9905011-be22-4d3c-89f2-dd677c65b8a1\") " pod="openstack/nova-api-0" Dec 02 07:47:49 crc kubenswrapper[4895]: I1202 07:47:49.080891 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r66k2\" (UniqueName: \"kubernetes.io/projected/4f3db479-516c-46b3-881d-7021a15c7a7d-kube-api-access-r66k2\") pod \"nova-cell1-cell-mapping-tx8bx\" (UID: \"4f3db479-516c-46b3-881d-7021a15c7a7d\") " pod="openstack/nova-cell1-cell-mapping-tx8bx" Dec 02 07:47:49 crc kubenswrapper[4895]: I1202 07:47:49.080957 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ssbzs\" (UniqueName: \"kubernetes.io/projected/f9905011-be22-4d3c-89f2-dd677c65b8a1-kube-api-access-ssbzs\") pod \"nova-api-0\" (UID: 
\"f9905011-be22-4d3c-89f2-dd677c65b8a1\") " pod="openstack/nova-api-0" Dec 02 07:47:49 crc kubenswrapper[4895]: I1202 07:47:49.080976 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9905011-be22-4d3c-89f2-dd677c65b8a1-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f9905011-be22-4d3c-89f2-dd677c65b8a1\") " pod="openstack/nova-api-0" Dec 02 07:47:49 crc kubenswrapper[4895]: I1202 07:47:49.080996 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f3db479-516c-46b3-881d-7021a15c7a7d-scripts\") pod \"nova-cell1-cell-mapping-tx8bx\" (UID: \"4f3db479-516c-46b3-881d-7021a15c7a7d\") " pod="openstack/nova-cell1-cell-mapping-tx8bx" Dec 02 07:47:49 crc kubenswrapper[4895]: I1202 07:47:49.081013 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9905011-be22-4d3c-89f2-dd677c65b8a1-public-tls-certs\") pod \"nova-api-0\" (UID: \"f9905011-be22-4d3c-89f2-dd677c65b8a1\") " pod="openstack/nova-api-0" Dec 02 07:47:49 crc kubenswrapper[4895]: I1202 07:47:49.081058 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f3db479-516c-46b3-881d-7021a15c7a7d-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-tx8bx\" (UID: \"4f3db479-516c-46b3-881d-7021a15c7a7d\") " pod="openstack/nova-cell1-cell-mapping-tx8bx" Dec 02 07:47:49 crc kubenswrapper[4895]: I1202 07:47:49.081087 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f3db479-516c-46b3-881d-7021a15c7a7d-config-data\") pod \"nova-cell1-cell-mapping-tx8bx\" (UID: \"4f3db479-516c-46b3-881d-7021a15c7a7d\") " pod="openstack/nova-cell1-cell-mapping-tx8bx" Dec 02 
07:47:49 crc kubenswrapper[4895]: I1202 07:47:49.081119 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9905011-be22-4d3c-89f2-dd677c65b8a1-config-data\") pod \"nova-api-0\" (UID: \"f9905011-be22-4d3c-89f2-dd677c65b8a1\") " pod="openstack/nova-api-0" Dec 02 07:47:49 crc kubenswrapper[4895]: I1202 07:47:49.081981 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f9905011-be22-4d3c-89f2-dd677c65b8a1-logs\") pod \"nova-api-0\" (UID: \"f9905011-be22-4d3c-89f2-dd677c65b8a1\") " pod="openstack/nova-api-0" Dec 02 07:47:49 crc kubenswrapper[4895]: I1202 07:47:49.085199 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9905011-be22-4d3c-89f2-dd677c65b8a1-internal-tls-certs\") pod \"nova-api-0\" (UID: \"f9905011-be22-4d3c-89f2-dd677c65b8a1\") " pod="openstack/nova-api-0" Dec 02 07:47:49 crc kubenswrapper[4895]: I1202 07:47:49.086040 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9905011-be22-4d3c-89f2-dd677c65b8a1-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f9905011-be22-4d3c-89f2-dd677c65b8a1\") " pod="openstack/nova-api-0" Dec 02 07:47:49 crc kubenswrapper[4895]: I1202 07:47:49.087438 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9905011-be22-4d3c-89f2-dd677c65b8a1-config-data\") pod \"nova-api-0\" (UID: \"f9905011-be22-4d3c-89f2-dd677c65b8a1\") " pod="openstack/nova-api-0" Dec 02 07:47:49 crc kubenswrapper[4895]: I1202 07:47:49.087725 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9905011-be22-4d3c-89f2-dd677c65b8a1-public-tls-certs\") pod \"nova-api-0\" (UID: 
\"f9905011-be22-4d3c-89f2-dd677c65b8a1\") " pod="openstack/nova-api-0" Dec 02 07:47:49 crc kubenswrapper[4895]: I1202 07:47:49.101193 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ssbzs\" (UniqueName: \"kubernetes.io/projected/f9905011-be22-4d3c-89f2-dd677c65b8a1-kube-api-access-ssbzs\") pod \"nova-api-0\" (UID: \"f9905011-be22-4d3c-89f2-dd677c65b8a1\") " pod="openstack/nova-api-0" Dec 02 07:47:49 crc kubenswrapper[4895]: I1202 07:47:49.155581 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de57c107-7f7e-4d49-a986-9ffc3ca3d828" path="/var/lib/kubelet/pods/de57c107-7f7e-4d49-a986-9ffc3ca3d828/volumes" Dec 02 07:47:49 crc kubenswrapper[4895]: I1202 07:47:49.182645 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r66k2\" (UniqueName: \"kubernetes.io/projected/4f3db479-516c-46b3-881d-7021a15c7a7d-kube-api-access-r66k2\") pod \"nova-cell1-cell-mapping-tx8bx\" (UID: \"4f3db479-516c-46b3-881d-7021a15c7a7d\") " pod="openstack/nova-cell1-cell-mapping-tx8bx" Dec 02 07:47:49 crc kubenswrapper[4895]: I1202 07:47:49.182755 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f3db479-516c-46b3-881d-7021a15c7a7d-scripts\") pod \"nova-cell1-cell-mapping-tx8bx\" (UID: \"4f3db479-516c-46b3-881d-7021a15c7a7d\") " pod="openstack/nova-cell1-cell-mapping-tx8bx" Dec 02 07:47:49 crc kubenswrapper[4895]: I1202 07:47:49.182803 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f3db479-516c-46b3-881d-7021a15c7a7d-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-tx8bx\" (UID: \"4f3db479-516c-46b3-881d-7021a15c7a7d\") " pod="openstack/nova-cell1-cell-mapping-tx8bx" Dec 02 07:47:49 crc kubenswrapper[4895]: I1202 07:47:49.182834 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/4f3db479-516c-46b3-881d-7021a15c7a7d-config-data\") pod \"nova-cell1-cell-mapping-tx8bx\" (UID: \"4f3db479-516c-46b3-881d-7021a15c7a7d\") " pod="openstack/nova-cell1-cell-mapping-tx8bx" Dec 02 07:47:49 crc kubenswrapper[4895]: I1202 07:47:49.187588 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f3db479-516c-46b3-881d-7021a15c7a7d-scripts\") pod \"nova-cell1-cell-mapping-tx8bx\" (UID: \"4f3db479-516c-46b3-881d-7021a15c7a7d\") " pod="openstack/nova-cell1-cell-mapping-tx8bx" Dec 02 07:47:49 crc kubenswrapper[4895]: I1202 07:47:49.187987 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f3db479-516c-46b3-881d-7021a15c7a7d-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-tx8bx\" (UID: \"4f3db479-516c-46b3-881d-7021a15c7a7d\") " pod="openstack/nova-cell1-cell-mapping-tx8bx" Dec 02 07:47:49 crc kubenswrapper[4895]: I1202 07:47:49.190875 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f3db479-516c-46b3-881d-7021a15c7a7d-config-data\") pod \"nova-cell1-cell-mapping-tx8bx\" (UID: \"4f3db479-516c-46b3-881d-7021a15c7a7d\") " pod="openstack/nova-cell1-cell-mapping-tx8bx" Dec 02 07:47:49 crc kubenswrapper[4895]: I1202 07:47:49.205289 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r66k2\" (UniqueName: \"kubernetes.io/projected/4f3db479-516c-46b3-881d-7021a15c7a7d-kube-api-access-r66k2\") pod \"nova-cell1-cell-mapping-tx8bx\" (UID: \"4f3db479-516c-46b3-881d-7021a15c7a7d\") " pod="openstack/nova-cell1-cell-mapping-tx8bx" Dec 02 07:47:49 crc kubenswrapper[4895]: I1202 07:47:49.243953 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 02 07:47:49 crc kubenswrapper[4895]: I1202 07:47:49.353711 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-tx8bx" Dec 02 07:47:49 crc kubenswrapper[4895]: I1202 07:47:49.730866 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 02 07:47:49 crc kubenswrapper[4895]: I1202 07:47:49.843930 4895 generic.go:334] "Generic (PLEG): container finished" podID="54343b4a-8625-4b23-9d95-d67c5d649f3e" containerID="16a1a0fd0a8e9f66be4f82a00840aca635f3009e5b749fc66ceb9b3d973f3780" exitCode=0 Dec 02 07:47:49 crc kubenswrapper[4895]: I1202 07:47:49.844407 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"54343b4a-8625-4b23-9d95-d67c5d649f3e","Type":"ContainerDied","Data":"16a1a0fd0a8e9f66be4f82a00840aca635f3009e5b749fc66ceb9b3d973f3780"} Dec 02 07:47:49 crc kubenswrapper[4895]: I1202 07:47:49.855517 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f9905011-be22-4d3c-89f2-dd677c65b8a1","Type":"ContainerStarted","Data":"635ec56c1f9f7f91cfd1ef06db5578c51136718ee22d322ff387f9627a4b9131"} Dec 02 07:47:49 crc kubenswrapper[4895]: I1202 07:47:49.867547 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-tx8bx"] Dec 02 07:47:50 crc kubenswrapper[4895]: I1202 07:47:50.024105 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 02 07:47:50 crc kubenswrapper[4895]: I1202 07:47:50.119312 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/54343b4a-8625-4b23-9d95-d67c5d649f3e-log-httpd\") pod \"54343b4a-8625-4b23-9d95-d67c5d649f3e\" (UID: \"54343b4a-8625-4b23-9d95-d67c5d649f3e\") " Dec 02 07:47:50 crc kubenswrapper[4895]: I1202 07:47:50.119388 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/54343b4a-8625-4b23-9d95-d67c5d649f3e-sg-core-conf-yaml\") pod \"54343b4a-8625-4b23-9d95-d67c5d649f3e\" (UID: \"54343b4a-8625-4b23-9d95-d67c5d649f3e\") " Dec 02 07:47:50 crc kubenswrapper[4895]: I1202 07:47:50.119488 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b4qpj\" (UniqueName: \"kubernetes.io/projected/54343b4a-8625-4b23-9d95-d67c5d649f3e-kube-api-access-b4qpj\") pod \"54343b4a-8625-4b23-9d95-d67c5d649f3e\" (UID: \"54343b4a-8625-4b23-9d95-d67c5d649f3e\") " Dec 02 07:47:50 crc kubenswrapper[4895]: I1202 07:47:50.119567 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54343b4a-8625-4b23-9d95-d67c5d649f3e-combined-ca-bundle\") pod \"54343b4a-8625-4b23-9d95-d67c5d649f3e\" (UID: \"54343b4a-8625-4b23-9d95-d67c5d649f3e\") " Dec 02 07:47:50 crc kubenswrapper[4895]: I1202 07:47:50.119696 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/54343b4a-8625-4b23-9d95-d67c5d649f3e-ceilometer-tls-certs\") pod \"54343b4a-8625-4b23-9d95-d67c5d649f3e\" (UID: \"54343b4a-8625-4b23-9d95-d67c5d649f3e\") " Dec 02 07:47:50 crc kubenswrapper[4895]: I1202 07:47:50.119806 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/54343b4a-8625-4b23-9d95-d67c5d649f3e-run-httpd\") pod \"54343b4a-8625-4b23-9d95-d67c5d649f3e\" (UID: \"54343b4a-8625-4b23-9d95-d67c5d649f3e\") " Dec 02 07:47:50 crc kubenswrapper[4895]: I1202 07:47:50.119936 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54343b4a-8625-4b23-9d95-d67c5d649f3e-config-data\") pod \"54343b4a-8625-4b23-9d95-d67c5d649f3e\" (UID: \"54343b4a-8625-4b23-9d95-d67c5d649f3e\") " Dec 02 07:47:50 crc kubenswrapper[4895]: I1202 07:47:50.119968 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/54343b4a-8625-4b23-9d95-d67c5d649f3e-scripts\") pod \"54343b4a-8625-4b23-9d95-d67c5d649f3e\" (UID: \"54343b4a-8625-4b23-9d95-d67c5d649f3e\") " Dec 02 07:47:50 crc kubenswrapper[4895]: I1202 07:47:50.120028 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54343b4a-8625-4b23-9d95-d67c5d649f3e-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "54343b4a-8625-4b23-9d95-d67c5d649f3e" (UID: "54343b4a-8625-4b23-9d95-d67c5d649f3e"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:47:50 crc kubenswrapper[4895]: I1202 07:47:50.120343 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54343b4a-8625-4b23-9d95-d67c5d649f3e-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "54343b4a-8625-4b23-9d95-d67c5d649f3e" (UID: "54343b4a-8625-4b23-9d95-d67c5d649f3e"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:47:50 crc kubenswrapper[4895]: I1202 07:47:50.121171 4895 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/54343b4a-8625-4b23-9d95-d67c5d649f3e-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 02 07:47:50 crc kubenswrapper[4895]: I1202 07:47:50.121195 4895 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/54343b4a-8625-4b23-9d95-d67c5d649f3e-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 02 07:47:50 crc kubenswrapper[4895]: I1202 07:47:50.124708 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54343b4a-8625-4b23-9d95-d67c5d649f3e-kube-api-access-b4qpj" (OuterVolumeSpecName: "kube-api-access-b4qpj") pod "54343b4a-8625-4b23-9d95-d67c5d649f3e" (UID: "54343b4a-8625-4b23-9d95-d67c5d649f3e"). InnerVolumeSpecName "kube-api-access-b4qpj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:47:50 crc kubenswrapper[4895]: I1202 07:47:50.125049 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54343b4a-8625-4b23-9d95-d67c5d649f3e-scripts" (OuterVolumeSpecName: "scripts") pod "54343b4a-8625-4b23-9d95-d67c5d649f3e" (UID: "54343b4a-8625-4b23-9d95-d67c5d649f3e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:47:50 crc kubenswrapper[4895]: I1202 07:47:50.165572 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54343b4a-8625-4b23-9d95-d67c5d649f3e-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "54343b4a-8625-4b23-9d95-d67c5d649f3e" (UID: "54343b4a-8625-4b23-9d95-d67c5d649f3e"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:47:50 crc kubenswrapper[4895]: I1202 07:47:50.198267 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54343b4a-8625-4b23-9d95-d67c5d649f3e-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "54343b4a-8625-4b23-9d95-d67c5d649f3e" (UID: "54343b4a-8625-4b23-9d95-d67c5d649f3e"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:47:50 crc kubenswrapper[4895]: I1202 07:47:50.221111 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54343b4a-8625-4b23-9d95-d67c5d649f3e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "54343b4a-8625-4b23-9d95-d67c5d649f3e" (UID: "54343b4a-8625-4b23-9d95-d67c5d649f3e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:47:50 crc kubenswrapper[4895]: I1202 07:47:50.223031 4895 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/54343b4a-8625-4b23-9d95-d67c5d649f3e-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 02 07:47:50 crc kubenswrapper[4895]: I1202 07:47:50.223066 4895 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/54343b4a-8625-4b23-9d95-d67c5d649f3e-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 07:47:50 crc kubenswrapper[4895]: I1202 07:47:50.223076 4895 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/54343b4a-8625-4b23-9d95-d67c5d649f3e-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 02 07:47:50 crc kubenswrapper[4895]: I1202 07:47:50.223087 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b4qpj\" (UniqueName: 
\"kubernetes.io/projected/54343b4a-8625-4b23-9d95-d67c5d649f3e-kube-api-access-b4qpj\") on node \"crc\" DevicePath \"\"" Dec 02 07:47:50 crc kubenswrapper[4895]: I1202 07:47:50.223101 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54343b4a-8625-4b23-9d95-d67c5d649f3e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 07:47:50 crc kubenswrapper[4895]: I1202 07:47:50.259205 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54343b4a-8625-4b23-9d95-d67c5d649f3e-config-data" (OuterVolumeSpecName: "config-data") pod "54343b4a-8625-4b23-9d95-d67c5d649f3e" (UID: "54343b4a-8625-4b23-9d95-d67c5d649f3e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:47:50 crc kubenswrapper[4895]: I1202 07:47:50.350095 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54343b4a-8625-4b23-9d95-d67c5d649f3e-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 07:47:50 crc kubenswrapper[4895]: I1202 07:47:50.873607 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"54343b4a-8625-4b23-9d95-d67c5d649f3e","Type":"ContainerDied","Data":"68c31a476984f4964e9e4ec22f35ed2cdba52e22d81dc49887989f3b97c486c2"} Dec 02 07:47:50 crc kubenswrapper[4895]: I1202 07:47:50.874134 4895 scope.go:117] "RemoveContainer" containerID="904262a3f6ba5e51d0fe3d22851d61e7a57ab99795d3be21d8f3e1987e167054" Dec 02 07:47:50 crc kubenswrapper[4895]: I1202 07:47:50.874407 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 02 07:47:50 crc kubenswrapper[4895]: I1202 07:47:50.876496 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f9905011-be22-4d3c-89f2-dd677c65b8a1","Type":"ContainerStarted","Data":"781691fcc2bea33451a07f5b4ba60cca495934f5e25f3b0b263b08c62ac11579"} Dec 02 07:47:50 crc kubenswrapper[4895]: I1202 07:47:50.876577 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f9905011-be22-4d3c-89f2-dd677c65b8a1","Type":"ContainerStarted","Data":"6a00169a318f763610e293363ab886d61a2e5a9459f9784fee212602fb5080bf"} Dec 02 07:47:50 crc kubenswrapper[4895]: I1202 07:47:50.882216 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-tx8bx" event={"ID":"4f3db479-516c-46b3-881d-7021a15c7a7d","Type":"ContainerStarted","Data":"c3e6869faec7cbcaa9cf951bb70a1cabb5f7864bf03d30d0b3bda1d511028024"} Dec 02 07:47:50 crc kubenswrapper[4895]: I1202 07:47:50.882266 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-tx8bx" event={"ID":"4f3db479-516c-46b3-881d-7021a15c7a7d","Type":"ContainerStarted","Data":"d22ada04275bb2177da11a1af82a3af23bcb4276f5f9275d236b5b399d5cfd79"} Dec 02 07:47:50 crc kubenswrapper[4895]: I1202 07:47:50.899317 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.899289431 podStartE2EDuration="2.899289431s" podCreationTimestamp="2025-12-02 07:47:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:47:50.896874247 +0000 UTC m=+1482.067733880" watchObservedRunningTime="2025-12-02 07:47:50.899289431 +0000 UTC m=+1482.070149064" Dec 02 07:47:50 crc kubenswrapper[4895]: I1202 07:47:50.923995 4895 scope.go:117] "RemoveContainer" 
containerID="2fe7ea32536893137c42d8ec1e356b1446b1785f0a21cd68dcb97234eadd2da5" Dec 02 07:47:50 crc kubenswrapper[4895]: I1202 07:47:50.926307 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-tx8bx" podStartSLOduration=1.926276293 podStartE2EDuration="1.926276293s" podCreationTimestamp="2025-12-02 07:47:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:47:50.919203575 +0000 UTC m=+1482.090063198" watchObservedRunningTime="2025-12-02 07:47:50.926276293 +0000 UTC m=+1482.097135946" Dec 02 07:47:51 crc kubenswrapper[4895]: I1202 07:47:51.014945 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 02 07:47:51 crc kubenswrapper[4895]: I1202 07:47:51.039733 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 02 07:47:51 crc kubenswrapper[4895]: I1202 07:47:51.050401 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 02 07:47:51 crc kubenswrapper[4895]: E1202 07:47:51.051827 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54343b4a-8625-4b23-9d95-d67c5d649f3e" containerName="proxy-httpd" Dec 02 07:47:51 crc kubenswrapper[4895]: I1202 07:47:51.051845 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="54343b4a-8625-4b23-9d95-d67c5d649f3e" containerName="proxy-httpd" Dec 02 07:47:51 crc kubenswrapper[4895]: E1202 07:47:51.051862 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54343b4a-8625-4b23-9d95-d67c5d649f3e" containerName="sg-core" Dec 02 07:47:51 crc kubenswrapper[4895]: I1202 07:47:51.051868 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="54343b4a-8625-4b23-9d95-d67c5d649f3e" containerName="sg-core" Dec 02 07:47:51 crc kubenswrapper[4895]: E1202 07:47:51.051885 4895 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="54343b4a-8625-4b23-9d95-d67c5d649f3e" containerName="ceilometer-notification-agent" Dec 02 07:47:51 crc kubenswrapper[4895]: I1202 07:47:51.051893 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="54343b4a-8625-4b23-9d95-d67c5d649f3e" containerName="ceilometer-notification-agent" Dec 02 07:47:51 crc kubenswrapper[4895]: E1202 07:47:51.051923 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54343b4a-8625-4b23-9d95-d67c5d649f3e" containerName="ceilometer-central-agent" Dec 02 07:47:51 crc kubenswrapper[4895]: I1202 07:47:51.051928 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="54343b4a-8625-4b23-9d95-d67c5d649f3e" containerName="ceilometer-central-agent" Dec 02 07:47:51 crc kubenswrapper[4895]: I1202 07:47:51.052329 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="54343b4a-8625-4b23-9d95-d67c5d649f3e" containerName="ceilometer-notification-agent" Dec 02 07:47:51 crc kubenswrapper[4895]: I1202 07:47:51.052377 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="54343b4a-8625-4b23-9d95-d67c5d649f3e" containerName="ceilometer-central-agent" Dec 02 07:47:51 crc kubenswrapper[4895]: I1202 07:47:51.052397 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="54343b4a-8625-4b23-9d95-d67c5d649f3e" containerName="sg-core" Dec 02 07:47:51 crc kubenswrapper[4895]: I1202 07:47:51.052409 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="54343b4a-8625-4b23-9d95-d67c5d649f3e" containerName="proxy-httpd" Dec 02 07:47:51 crc kubenswrapper[4895]: I1202 07:47:51.055619 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 02 07:47:51 crc kubenswrapper[4895]: I1202 07:47:51.067669 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 02 07:47:51 crc kubenswrapper[4895]: I1202 07:47:51.068330 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 02 07:47:51 crc kubenswrapper[4895]: I1202 07:47:51.079132 4895 scope.go:117] "RemoveContainer" containerID="c0f088ef401618517e67e28e5a328966463dc38f136b4491562d6b992d975022" Dec 02 07:47:51 crc kubenswrapper[4895]: I1202 07:47:51.070222 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 02 07:47:51 crc kubenswrapper[4895]: I1202 07:47:51.096323 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 02 07:47:51 crc kubenswrapper[4895]: I1202 07:47:51.163867 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54343b4a-8625-4b23-9d95-d67c5d649f3e" path="/var/lib/kubelet/pods/54343b4a-8625-4b23-9d95-d67c5d649f3e/volumes" Dec 02 07:47:51 crc kubenswrapper[4895]: I1202 07:47:51.176910 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4b0ee49-bed2-4691-8160-2edbebda27b7-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"f4b0ee49-bed2-4691-8160-2edbebda27b7\") " pod="openstack/ceilometer-0" Dec 02 07:47:51 crc kubenswrapper[4895]: I1202 07:47:51.177063 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4b0ee49-bed2-4691-8160-2edbebda27b7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f4b0ee49-bed2-4691-8160-2edbebda27b7\") " pod="openstack/ceilometer-0" Dec 02 07:47:51 crc kubenswrapper[4895]: I1202 07:47:51.177110 4895 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbsnz\" (UniqueName: \"kubernetes.io/projected/f4b0ee49-bed2-4691-8160-2edbebda27b7-kube-api-access-mbsnz\") pod \"ceilometer-0\" (UID: \"f4b0ee49-bed2-4691-8160-2edbebda27b7\") " pod="openstack/ceilometer-0" Dec 02 07:47:51 crc kubenswrapper[4895]: I1202 07:47:51.177174 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f4b0ee49-bed2-4691-8160-2edbebda27b7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f4b0ee49-bed2-4691-8160-2edbebda27b7\") " pod="openstack/ceilometer-0" Dec 02 07:47:51 crc kubenswrapper[4895]: I1202 07:47:51.177200 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f4b0ee49-bed2-4691-8160-2edbebda27b7-log-httpd\") pod \"ceilometer-0\" (UID: \"f4b0ee49-bed2-4691-8160-2edbebda27b7\") " pod="openstack/ceilometer-0" Dec 02 07:47:51 crc kubenswrapper[4895]: I1202 07:47:51.177253 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4b0ee49-bed2-4691-8160-2edbebda27b7-config-data\") pod \"ceilometer-0\" (UID: \"f4b0ee49-bed2-4691-8160-2edbebda27b7\") " pod="openstack/ceilometer-0" Dec 02 07:47:51 crc kubenswrapper[4895]: I1202 07:47:51.177316 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f4b0ee49-bed2-4691-8160-2edbebda27b7-run-httpd\") pod \"ceilometer-0\" (UID: \"f4b0ee49-bed2-4691-8160-2edbebda27b7\") " pod="openstack/ceilometer-0" Dec 02 07:47:51 crc kubenswrapper[4895]: I1202 07:47:51.177341 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/f4b0ee49-bed2-4691-8160-2edbebda27b7-scripts\") pod \"ceilometer-0\" (UID: \"f4b0ee49-bed2-4691-8160-2edbebda27b7\") " pod="openstack/ceilometer-0" Dec 02 07:47:51 crc kubenswrapper[4895]: I1202 07:47:51.190925 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-89c5cd4d5-zx9lx" Dec 02 07:47:51 crc kubenswrapper[4895]: I1202 07:47:51.199843 4895 scope.go:117] "RemoveContainer" containerID="16a1a0fd0a8e9f66be4f82a00840aca635f3009e5b749fc66ceb9b3d973f3780" Dec 02 07:47:51 crc kubenswrapper[4895]: I1202 07:47:51.280257 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4b0ee49-bed2-4691-8160-2edbebda27b7-config-data\") pod \"ceilometer-0\" (UID: \"f4b0ee49-bed2-4691-8160-2edbebda27b7\") " pod="openstack/ceilometer-0" Dec 02 07:47:51 crc kubenswrapper[4895]: I1202 07:47:51.280364 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f4b0ee49-bed2-4691-8160-2edbebda27b7-run-httpd\") pod \"ceilometer-0\" (UID: \"f4b0ee49-bed2-4691-8160-2edbebda27b7\") " pod="openstack/ceilometer-0" Dec 02 07:47:51 crc kubenswrapper[4895]: I1202 07:47:51.280394 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f4b0ee49-bed2-4691-8160-2edbebda27b7-scripts\") pod \"ceilometer-0\" (UID: \"f4b0ee49-bed2-4691-8160-2edbebda27b7\") " pod="openstack/ceilometer-0" Dec 02 07:47:51 crc kubenswrapper[4895]: I1202 07:47:51.280472 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4b0ee49-bed2-4691-8160-2edbebda27b7-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"f4b0ee49-bed2-4691-8160-2edbebda27b7\") " pod="openstack/ceilometer-0" Dec 02 07:47:51 crc kubenswrapper[4895]: I1202 07:47:51.280549 4895 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4b0ee49-bed2-4691-8160-2edbebda27b7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f4b0ee49-bed2-4691-8160-2edbebda27b7\") " pod="openstack/ceilometer-0" Dec 02 07:47:51 crc kubenswrapper[4895]: I1202 07:47:51.280576 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mbsnz\" (UniqueName: \"kubernetes.io/projected/f4b0ee49-bed2-4691-8160-2edbebda27b7-kube-api-access-mbsnz\") pod \"ceilometer-0\" (UID: \"f4b0ee49-bed2-4691-8160-2edbebda27b7\") " pod="openstack/ceilometer-0" Dec 02 07:47:51 crc kubenswrapper[4895]: I1202 07:47:51.280624 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f4b0ee49-bed2-4691-8160-2edbebda27b7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f4b0ee49-bed2-4691-8160-2edbebda27b7\") " pod="openstack/ceilometer-0" Dec 02 07:47:51 crc kubenswrapper[4895]: I1202 07:47:51.280642 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f4b0ee49-bed2-4691-8160-2edbebda27b7-log-httpd\") pod \"ceilometer-0\" (UID: \"f4b0ee49-bed2-4691-8160-2edbebda27b7\") " pod="openstack/ceilometer-0" Dec 02 07:47:51 crc kubenswrapper[4895]: I1202 07:47:51.282417 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f4b0ee49-bed2-4691-8160-2edbebda27b7-run-httpd\") pod \"ceilometer-0\" (UID: \"f4b0ee49-bed2-4691-8160-2edbebda27b7\") " pod="openstack/ceilometer-0" Dec 02 07:47:51 crc kubenswrapper[4895]: I1202 07:47:51.282593 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f4b0ee49-bed2-4691-8160-2edbebda27b7-log-httpd\") pod \"ceilometer-0\" (UID: 
\"f4b0ee49-bed2-4691-8160-2edbebda27b7\") " pod="openstack/ceilometer-0" Dec 02 07:47:51 crc kubenswrapper[4895]: I1202 07:47:51.289860 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f4b0ee49-bed2-4691-8160-2edbebda27b7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f4b0ee49-bed2-4691-8160-2edbebda27b7\") " pod="openstack/ceilometer-0" Dec 02 07:47:51 crc kubenswrapper[4895]: I1202 07:47:51.291113 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-zhzzg"] Dec 02 07:47:51 crc kubenswrapper[4895]: I1202 07:47:51.291445 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-757b4f8459-zhzzg" podUID="24560b66-b4b2-4c54-98f4-2dbf30465373" containerName="dnsmasq-dns" containerID="cri-o://dcb07c6666b40670ddd6c822e6250b79b03ff7717ddff952acab4a55d8c1719d" gracePeriod=10 Dec 02 07:47:51 crc kubenswrapper[4895]: I1202 07:47:51.292883 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4b0ee49-bed2-4691-8160-2edbebda27b7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f4b0ee49-bed2-4691-8160-2edbebda27b7\") " pod="openstack/ceilometer-0" Dec 02 07:47:51 crc kubenswrapper[4895]: I1202 07:47:51.297723 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f4b0ee49-bed2-4691-8160-2edbebda27b7-scripts\") pod \"ceilometer-0\" (UID: \"f4b0ee49-bed2-4691-8160-2edbebda27b7\") " pod="openstack/ceilometer-0" Dec 02 07:47:51 crc kubenswrapper[4895]: I1202 07:47:51.298919 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4b0ee49-bed2-4691-8160-2edbebda27b7-config-data\") pod \"ceilometer-0\" (UID: \"f4b0ee49-bed2-4691-8160-2edbebda27b7\") " pod="openstack/ceilometer-0" Dec 02 07:47:51 crc 
kubenswrapper[4895]: I1202 07:47:51.311838 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbsnz\" (UniqueName: \"kubernetes.io/projected/f4b0ee49-bed2-4691-8160-2edbebda27b7-kube-api-access-mbsnz\") pod \"ceilometer-0\" (UID: \"f4b0ee49-bed2-4691-8160-2edbebda27b7\") " pod="openstack/ceilometer-0" Dec 02 07:47:51 crc kubenswrapper[4895]: I1202 07:47:51.315725 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4b0ee49-bed2-4691-8160-2edbebda27b7-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"f4b0ee49-bed2-4691-8160-2edbebda27b7\") " pod="openstack/ceilometer-0" Dec 02 07:47:51 crc kubenswrapper[4895]: I1202 07:47:51.423320 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 02 07:47:51 crc kubenswrapper[4895]: I1202 07:47:51.904954 4895 generic.go:334] "Generic (PLEG): container finished" podID="24560b66-b4b2-4c54-98f4-2dbf30465373" containerID="dcb07c6666b40670ddd6c822e6250b79b03ff7717ddff952acab4a55d8c1719d" exitCode=0 Dec 02 07:47:51 crc kubenswrapper[4895]: I1202 07:47:51.905343 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-zhzzg" event={"ID":"24560b66-b4b2-4c54-98f4-2dbf30465373","Type":"ContainerDied","Data":"dcb07c6666b40670ddd6c822e6250b79b03ff7717ddff952acab4a55d8c1719d"} Dec 02 07:47:51 crc kubenswrapper[4895]: I1202 07:47:51.998672 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 02 07:47:52 crc kubenswrapper[4895]: W1202 07:47:52.009415 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf4b0ee49_bed2_4691_8160_2edbebda27b7.slice/crio-cd37e231d02f94c7bdd1c0857ed7634070474732b2fd6cf5880fe272e52a3cfc WatchSource:0}: Error finding container 
cd37e231d02f94c7bdd1c0857ed7634070474732b2fd6cf5880fe272e52a3cfc: Status 404 returned error can't find the container with id cd37e231d02f94c7bdd1c0857ed7634070474732b2fd6cf5880fe272e52a3cfc Dec 02 07:47:52 crc kubenswrapper[4895]: I1202 07:47:52.117946 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-zhzzg" Dec 02 07:47:52 crc kubenswrapper[4895]: I1202 07:47:52.202346 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/24560b66-b4b2-4c54-98f4-2dbf30465373-ovsdbserver-sb\") pod \"24560b66-b4b2-4c54-98f4-2dbf30465373\" (UID: \"24560b66-b4b2-4c54-98f4-2dbf30465373\") " Dec 02 07:47:52 crc kubenswrapper[4895]: I1202 07:47:52.202422 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/24560b66-b4b2-4c54-98f4-2dbf30465373-ovsdbserver-nb\") pod \"24560b66-b4b2-4c54-98f4-2dbf30465373\" (UID: \"24560b66-b4b2-4c54-98f4-2dbf30465373\") " Dec 02 07:47:52 crc kubenswrapper[4895]: I1202 07:47:52.202623 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/24560b66-b4b2-4c54-98f4-2dbf30465373-dns-svc\") pod \"24560b66-b4b2-4c54-98f4-2dbf30465373\" (UID: \"24560b66-b4b2-4c54-98f4-2dbf30465373\") " Dec 02 07:47:52 crc kubenswrapper[4895]: I1202 07:47:52.202643 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2hkvh\" (UniqueName: \"kubernetes.io/projected/24560b66-b4b2-4c54-98f4-2dbf30465373-kube-api-access-2hkvh\") pod \"24560b66-b4b2-4c54-98f4-2dbf30465373\" (UID: \"24560b66-b4b2-4c54-98f4-2dbf30465373\") " Dec 02 07:47:52 crc kubenswrapper[4895]: I1202 07:47:52.202777 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/24560b66-b4b2-4c54-98f4-2dbf30465373-config\") pod \"24560b66-b4b2-4c54-98f4-2dbf30465373\" (UID: \"24560b66-b4b2-4c54-98f4-2dbf30465373\") " Dec 02 07:47:52 crc kubenswrapper[4895]: I1202 07:47:52.202817 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/24560b66-b4b2-4c54-98f4-2dbf30465373-dns-swift-storage-0\") pod \"24560b66-b4b2-4c54-98f4-2dbf30465373\" (UID: \"24560b66-b4b2-4c54-98f4-2dbf30465373\") " Dec 02 07:47:52 crc kubenswrapper[4895]: I1202 07:47:52.223279 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24560b66-b4b2-4c54-98f4-2dbf30465373-kube-api-access-2hkvh" (OuterVolumeSpecName: "kube-api-access-2hkvh") pod "24560b66-b4b2-4c54-98f4-2dbf30465373" (UID: "24560b66-b4b2-4c54-98f4-2dbf30465373"). InnerVolumeSpecName "kube-api-access-2hkvh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:47:52 crc kubenswrapper[4895]: I1202 07:47:52.262510 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24560b66-b4b2-4c54-98f4-2dbf30465373-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "24560b66-b4b2-4c54-98f4-2dbf30465373" (UID: "24560b66-b4b2-4c54-98f4-2dbf30465373"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:47:52 crc kubenswrapper[4895]: I1202 07:47:52.266162 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24560b66-b4b2-4c54-98f4-2dbf30465373-config" (OuterVolumeSpecName: "config") pod "24560b66-b4b2-4c54-98f4-2dbf30465373" (UID: "24560b66-b4b2-4c54-98f4-2dbf30465373"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:47:52 crc kubenswrapper[4895]: I1202 07:47:52.266559 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24560b66-b4b2-4c54-98f4-2dbf30465373-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "24560b66-b4b2-4c54-98f4-2dbf30465373" (UID: "24560b66-b4b2-4c54-98f4-2dbf30465373"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:47:52 crc kubenswrapper[4895]: I1202 07:47:52.282944 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24560b66-b4b2-4c54-98f4-2dbf30465373-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "24560b66-b4b2-4c54-98f4-2dbf30465373" (UID: "24560b66-b4b2-4c54-98f4-2dbf30465373"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:47:52 crc kubenswrapper[4895]: I1202 07:47:52.287072 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24560b66-b4b2-4c54-98f4-2dbf30465373-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "24560b66-b4b2-4c54-98f4-2dbf30465373" (UID: "24560b66-b4b2-4c54-98f4-2dbf30465373"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:47:52 crc kubenswrapper[4895]: I1202 07:47:52.305217 4895 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/24560b66-b4b2-4c54-98f4-2dbf30465373-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 02 07:47:52 crc kubenswrapper[4895]: I1202 07:47:52.305270 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2hkvh\" (UniqueName: \"kubernetes.io/projected/24560b66-b4b2-4c54-98f4-2dbf30465373-kube-api-access-2hkvh\") on node \"crc\" DevicePath \"\"" Dec 02 07:47:52 crc kubenswrapper[4895]: I1202 07:47:52.305285 4895 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24560b66-b4b2-4c54-98f4-2dbf30465373-config\") on node \"crc\" DevicePath \"\"" Dec 02 07:47:52 crc kubenswrapper[4895]: I1202 07:47:52.305300 4895 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/24560b66-b4b2-4c54-98f4-2dbf30465373-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 02 07:47:52 crc kubenswrapper[4895]: I1202 07:47:52.305313 4895 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/24560b66-b4b2-4c54-98f4-2dbf30465373-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 02 07:47:52 crc kubenswrapper[4895]: I1202 07:47:52.305325 4895 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/24560b66-b4b2-4c54-98f4-2dbf30465373-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 02 07:47:52 crc kubenswrapper[4895]: I1202 07:47:52.922380 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-zhzzg" event={"ID":"24560b66-b4b2-4c54-98f4-2dbf30465373","Type":"ContainerDied","Data":"edaa950854b60d97234a76611f7b13a4389f65fb424dbe1174ecb06f9146263f"} Dec 02 07:47:52 crc 
kubenswrapper[4895]: I1202 07:47:52.922911 4895 scope.go:117] "RemoveContainer" containerID="dcb07c6666b40670ddd6c822e6250b79b03ff7717ddff952acab4a55d8c1719d" Dec 02 07:47:52 crc kubenswrapper[4895]: I1202 07:47:52.922494 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-zhzzg" Dec 02 07:47:52 crc kubenswrapper[4895]: I1202 07:47:52.925103 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f4b0ee49-bed2-4691-8160-2edbebda27b7","Type":"ContainerStarted","Data":"a18e722962390f2024c510ade1f26e4551f58f4c4c7c9b941662a44001c505ea"} Dec 02 07:47:52 crc kubenswrapper[4895]: I1202 07:47:52.925140 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f4b0ee49-bed2-4691-8160-2edbebda27b7","Type":"ContainerStarted","Data":"cd37e231d02f94c7bdd1c0857ed7634070474732b2fd6cf5880fe272e52a3cfc"} Dec 02 07:47:52 crc kubenswrapper[4895]: I1202 07:47:52.949904 4895 scope.go:117] "RemoveContainer" containerID="5590d81d23ba137905161fc2b52460b69642aadd53f83964f3cd2617bf4b4959" Dec 02 07:47:53 crc kubenswrapper[4895]: I1202 07:47:53.090155 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-zhzzg"] Dec 02 07:47:53 crc kubenswrapper[4895]: I1202 07:47:53.100723 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-zhzzg"] Dec 02 07:47:53 crc kubenswrapper[4895]: I1202 07:47:53.153769 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24560b66-b4b2-4c54-98f4-2dbf30465373" path="/var/lib/kubelet/pods/24560b66-b4b2-4c54-98f4-2dbf30465373/volumes" Dec 02 07:47:53 crc kubenswrapper[4895]: I1202 07:47:53.940554 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"f4b0ee49-bed2-4691-8160-2edbebda27b7","Type":"ContainerStarted","Data":"9b1129b02fb76389880616b0a4f07ba64c625c630960d277f41251bdc884c35b"} Dec 02 07:47:54 crc kubenswrapper[4895]: I1202 07:47:54.956646 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f4b0ee49-bed2-4691-8160-2edbebda27b7","Type":"ContainerStarted","Data":"24c551cd8bbb34832b5693a91b97f7fc6d801619091d62b54a02c1b5f9bcbd45"} Dec 02 07:47:56 crc kubenswrapper[4895]: I1202 07:47:56.984251 4895 generic.go:334] "Generic (PLEG): container finished" podID="4f3db479-516c-46b3-881d-7021a15c7a7d" containerID="c3e6869faec7cbcaa9cf951bb70a1cabb5f7864bf03d30d0b3bda1d511028024" exitCode=0 Dec 02 07:47:56 crc kubenswrapper[4895]: I1202 07:47:56.984371 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-tx8bx" event={"ID":"4f3db479-516c-46b3-881d-7021a15c7a7d","Type":"ContainerDied","Data":"c3e6869faec7cbcaa9cf951bb70a1cabb5f7864bf03d30d0b3bda1d511028024"} Dec 02 07:47:56 crc kubenswrapper[4895]: I1202 07:47:56.990173 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f4b0ee49-bed2-4691-8160-2edbebda27b7","Type":"ContainerStarted","Data":"c51c9cadb8af000c2a708fd441d7a16102397aef9d4301d9ddb87d8386fc6024"} Dec 02 07:47:56 crc kubenswrapper[4895]: I1202 07:47:56.990415 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 02 07:47:57 crc kubenswrapper[4895]: I1202 07:47:57.028296 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.319400759 podStartE2EDuration="7.028272845s" podCreationTimestamp="2025-12-02 07:47:50 +0000 UTC" firstStartedPulling="2025-12-02 07:47:52.013794241 +0000 UTC m=+1483.184653854" lastFinishedPulling="2025-12-02 07:47:55.722666327 +0000 UTC m=+1486.893525940" observedRunningTime="2025-12-02 07:47:57.022151765 +0000 UTC 
m=+1488.193011398" watchObservedRunningTime="2025-12-02 07:47:57.028272845 +0000 UTC m=+1488.199132458" Dec 02 07:47:58 crc kubenswrapper[4895]: I1202 07:47:58.497108 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-tx8bx" Dec 02 07:47:58 crc kubenswrapper[4895]: I1202 07:47:58.552540 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f3db479-516c-46b3-881d-7021a15c7a7d-combined-ca-bundle\") pod \"4f3db479-516c-46b3-881d-7021a15c7a7d\" (UID: \"4f3db479-516c-46b3-881d-7021a15c7a7d\") " Dec 02 07:47:58 crc kubenswrapper[4895]: I1202 07:47:58.552771 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f3db479-516c-46b3-881d-7021a15c7a7d-scripts\") pod \"4f3db479-516c-46b3-881d-7021a15c7a7d\" (UID: \"4f3db479-516c-46b3-881d-7021a15c7a7d\") " Dec 02 07:47:58 crc kubenswrapper[4895]: I1202 07:47:58.552809 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r66k2\" (UniqueName: \"kubernetes.io/projected/4f3db479-516c-46b3-881d-7021a15c7a7d-kube-api-access-r66k2\") pod \"4f3db479-516c-46b3-881d-7021a15c7a7d\" (UID: \"4f3db479-516c-46b3-881d-7021a15c7a7d\") " Dec 02 07:47:58 crc kubenswrapper[4895]: I1202 07:47:58.552992 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f3db479-516c-46b3-881d-7021a15c7a7d-config-data\") pod \"4f3db479-516c-46b3-881d-7021a15c7a7d\" (UID: \"4f3db479-516c-46b3-881d-7021a15c7a7d\") " Dec 02 07:47:58 crc kubenswrapper[4895]: I1202 07:47:58.561385 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f3db479-516c-46b3-881d-7021a15c7a7d-kube-api-access-r66k2" (OuterVolumeSpecName: "kube-api-access-r66k2") pod 
"4f3db479-516c-46b3-881d-7021a15c7a7d" (UID: "4f3db479-516c-46b3-881d-7021a15c7a7d"). InnerVolumeSpecName "kube-api-access-r66k2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:47:58 crc kubenswrapper[4895]: I1202 07:47:58.563110 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f3db479-516c-46b3-881d-7021a15c7a7d-scripts" (OuterVolumeSpecName: "scripts") pod "4f3db479-516c-46b3-881d-7021a15c7a7d" (UID: "4f3db479-516c-46b3-881d-7021a15c7a7d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:47:58 crc kubenswrapper[4895]: I1202 07:47:58.603771 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f3db479-516c-46b3-881d-7021a15c7a7d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4f3db479-516c-46b3-881d-7021a15c7a7d" (UID: "4f3db479-516c-46b3-881d-7021a15c7a7d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:47:58 crc kubenswrapper[4895]: I1202 07:47:58.604623 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f3db479-516c-46b3-881d-7021a15c7a7d-config-data" (OuterVolumeSpecName: "config-data") pod "4f3db479-516c-46b3-881d-7021a15c7a7d" (UID: "4f3db479-516c-46b3-881d-7021a15c7a7d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:47:58 crc kubenswrapper[4895]: I1202 07:47:58.655904 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f3db479-516c-46b3-881d-7021a15c7a7d-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 07:47:58 crc kubenswrapper[4895]: I1202 07:47:58.655937 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f3db479-516c-46b3-881d-7021a15c7a7d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 07:47:58 crc kubenswrapper[4895]: I1202 07:47:58.655949 4895 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f3db479-516c-46b3-881d-7021a15c7a7d-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 07:47:58 crc kubenswrapper[4895]: I1202 07:47:58.655958 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r66k2\" (UniqueName: \"kubernetes.io/projected/4f3db479-516c-46b3-881d-7021a15c7a7d-kube-api-access-r66k2\") on node \"crc\" DevicePath \"\"" Dec 02 07:47:59 crc kubenswrapper[4895]: I1202 07:47:59.016186 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-tx8bx" event={"ID":"4f3db479-516c-46b3-881d-7021a15c7a7d","Type":"ContainerDied","Data":"d22ada04275bb2177da11a1af82a3af23bcb4276f5f9275d236b5b399d5cfd79"} Dec 02 07:47:59 crc kubenswrapper[4895]: I1202 07:47:59.016259 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d22ada04275bb2177da11a1af82a3af23bcb4276f5f9275d236b5b399d5cfd79" Dec 02 07:47:59 crc kubenswrapper[4895]: I1202 07:47:59.016760 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-tx8bx" Dec 02 07:47:59 crc kubenswrapper[4895]: I1202 07:47:59.229263 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 02 07:47:59 crc kubenswrapper[4895]: I1202 07:47:59.229965 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="f9905011-be22-4d3c-89f2-dd677c65b8a1" containerName="nova-api-log" containerID="cri-o://6a00169a318f763610e293363ab886d61a2e5a9459f9784fee212602fb5080bf" gracePeriod=30 Dec 02 07:47:59 crc kubenswrapper[4895]: I1202 07:47:59.230127 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="f9905011-be22-4d3c-89f2-dd677c65b8a1" containerName="nova-api-api" containerID="cri-o://781691fcc2bea33451a07f5b4ba60cca495934f5e25f3b0b263b08c62ac11579" gracePeriod=30 Dec 02 07:47:59 crc kubenswrapper[4895]: I1202 07:47:59.294958 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 07:47:59 crc kubenswrapper[4895]: I1202 07:47:59.295494 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="2c347298-8afb-4598-ba75-64cd23db0935" containerName="nova-scheduler-scheduler" containerID="cri-o://1715524a082c44f6e32e19ecd9c82eadacab0180c80d32d488c4380c16af9a29" gracePeriod=30 Dec 02 07:47:59 crc kubenswrapper[4895]: I1202 07:47:59.321413 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 07:47:59 crc kubenswrapper[4895]: I1202 07:47:59.321755 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="23ebd00f-3697-43da-a335-42d260a62237" containerName="nova-metadata-log" containerID="cri-o://7a93b7b1fa5e4df6ef9a5fd45c7bfb4f2453204ed865bcce093ea30558cc6ba4" gracePeriod=30 Dec 02 07:47:59 crc kubenswrapper[4895]: I1202 07:47:59.321924 4895 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="23ebd00f-3697-43da-a335-42d260a62237" containerName="nova-metadata-metadata" containerID="cri-o://d33e2d232e3b83713b4b5bc7098bf0d68f44aaac578ecdb93f62f57e2d517660" gracePeriod=30 Dec 02 07:47:59 crc kubenswrapper[4895]: I1202 07:47:59.905010 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 02 07:48:00 crc kubenswrapper[4895]: I1202 07:48:00.005386 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9905011-be22-4d3c-89f2-dd677c65b8a1-public-tls-certs\") pod \"f9905011-be22-4d3c-89f2-dd677c65b8a1\" (UID: \"f9905011-be22-4d3c-89f2-dd677c65b8a1\") " Dec 02 07:48:00 crc kubenswrapper[4895]: I1202 07:48:00.005521 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9905011-be22-4d3c-89f2-dd677c65b8a1-config-data\") pod \"f9905011-be22-4d3c-89f2-dd677c65b8a1\" (UID: \"f9905011-be22-4d3c-89f2-dd677c65b8a1\") " Dec 02 07:48:00 crc kubenswrapper[4895]: I1202 07:48:00.005594 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9905011-be22-4d3c-89f2-dd677c65b8a1-internal-tls-certs\") pod \"f9905011-be22-4d3c-89f2-dd677c65b8a1\" (UID: \"f9905011-be22-4d3c-89f2-dd677c65b8a1\") " Dec 02 07:48:00 crc kubenswrapper[4895]: I1202 07:48:00.005678 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9905011-be22-4d3c-89f2-dd677c65b8a1-combined-ca-bundle\") pod \"f9905011-be22-4d3c-89f2-dd677c65b8a1\" (UID: \"f9905011-be22-4d3c-89f2-dd677c65b8a1\") " Dec 02 07:48:00 crc kubenswrapper[4895]: I1202 07:48:00.005824 4895 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-ssbzs\" (UniqueName: \"kubernetes.io/projected/f9905011-be22-4d3c-89f2-dd677c65b8a1-kube-api-access-ssbzs\") pod \"f9905011-be22-4d3c-89f2-dd677c65b8a1\" (UID: \"f9905011-be22-4d3c-89f2-dd677c65b8a1\") " Dec 02 07:48:00 crc kubenswrapper[4895]: I1202 07:48:00.005865 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f9905011-be22-4d3c-89f2-dd677c65b8a1-logs\") pod \"f9905011-be22-4d3c-89f2-dd677c65b8a1\" (UID: \"f9905011-be22-4d3c-89f2-dd677c65b8a1\") " Dec 02 07:48:00 crc kubenswrapper[4895]: I1202 07:48:00.006931 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9905011-be22-4d3c-89f2-dd677c65b8a1-logs" (OuterVolumeSpecName: "logs") pod "f9905011-be22-4d3c-89f2-dd677c65b8a1" (UID: "f9905011-be22-4d3c-89f2-dd677c65b8a1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:48:00 crc kubenswrapper[4895]: I1202 07:48:00.026149 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9905011-be22-4d3c-89f2-dd677c65b8a1-kube-api-access-ssbzs" (OuterVolumeSpecName: "kube-api-access-ssbzs") pod "f9905011-be22-4d3c-89f2-dd677c65b8a1" (UID: "f9905011-be22-4d3c-89f2-dd677c65b8a1"). InnerVolumeSpecName "kube-api-access-ssbzs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:48:00 crc kubenswrapper[4895]: I1202 07:48:00.047156 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9905011-be22-4d3c-89f2-dd677c65b8a1-config-data" (OuterVolumeSpecName: "config-data") pod "f9905011-be22-4d3c-89f2-dd677c65b8a1" (UID: "f9905011-be22-4d3c-89f2-dd677c65b8a1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:48:00 crc kubenswrapper[4895]: I1202 07:48:00.047833 4895 generic.go:334] "Generic (PLEG): container finished" podID="23ebd00f-3697-43da-a335-42d260a62237" containerID="7a93b7b1fa5e4df6ef9a5fd45c7bfb4f2453204ed865bcce093ea30558cc6ba4" exitCode=143 Dec 02 07:48:00 crc kubenswrapper[4895]: I1202 07:48:00.047914 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"23ebd00f-3697-43da-a335-42d260a62237","Type":"ContainerDied","Data":"7a93b7b1fa5e4df6ef9a5fd45c7bfb4f2453204ed865bcce093ea30558cc6ba4"} Dec 02 07:48:00 crc kubenswrapper[4895]: I1202 07:48:00.051061 4895 generic.go:334] "Generic (PLEG): container finished" podID="f9905011-be22-4d3c-89f2-dd677c65b8a1" containerID="781691fcc2bea33451a07f5b4ba60cca495934f5e25f3b0b263b08c62ac11579" exitCode=0 Dec 02 07:48:00 crc kubenswrapper[4895]: I1202 07:48:00.051076 4895 generic.go:334] "Generic (PLEG): container finished" podID="f9905011-be22-4d3c-89f2-dd677c65b8a1" containerID="6a00169a318f763610e293363ab886d61a2e5a9459f9784fee212602fb5080bf" exitCode=143 Dec 02 07:48:00 crc kubenswrapper[4895]: I1202 07:48:00.051092 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f9905011-be22-4d3c-89f2-dd677c65b8a1","Type":"ContainerDied","Data":"781691fcc2bea33451a07f5b4ba60cca495934f5e25f3b0b263b08c62ac11579"} Dec 02 07:48:00 crc kubenswrapper[4895]: I1202 07:48:00.051109 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f9905011-be22-4d3c-89f2-dd677c65b8a1","Type":"ContainerDied","Data":"6a00169a318f763610e293363ab886d61a2e5a9459f9784fee212602fb5080bf"} Dec 02 07:48:00 crc kubenswrapper[4895]: I1202 07:48:00.051118 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"f9905011-be22-4d3c-89f2-dd677c65b8a1","Type":"ContainerDied","Data":"635ec56c1f9f7f91cfd1ef06db5578c51136718ee22d322ff387f9627a4b9131"} Dec 02 07:48:00 crc kubenswrapper[4895]: I1202 07:48:00.051135 4895 scope.go:117] "RemoveContainer" containerID="781691fcc2bea33451a07f5b4ba60cca495934f5e25f3b0b263b08c62ac11579" Dec 02 07:48:00 crc kubenswrapper[4895]: I1202 07:48:00.051272 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 02 07:48:00 crc kubenswrapper[4895]: I1202 07:48:00.062966 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9905011-be22-4d3c-89f2-dd677c65b8a1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f9905011-be22-4d3c-89f2-dd677c65b8a1" (UID: "f9905011-be22-4d3c-89f2-dd677c65b8a1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:48:00 crc kubenswrapper[4895]: I1202 07:48:00.081511 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9905011-be22-4d3c-89f2-dd677c65b8a1-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "f9905011-be22-4d3c-89f2-dd677c65b8a1" (UID: "f9905011-be22-4d3c-89f2-dd677c65b8a1"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:48:00 crc kubenswrapper[4895]: I1202 07:48:00.081950 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9905011-be22-4d3c-89f2-dd677c65b8a1-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "f9905011-be22-4d3c-89f2-dd677c65b8a1" (UID: "f9905011-be22-4d3c-89f2-dd677c65b8a1"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:48:00 crc kubenswrapper[4895]: I1202 07:48:00.087444 4895 scope.go:117] "RemoveContainer" containerID="6a00169a318f763610e293363ab886d61a2e5a9459f9784fee212602fb5080bf" Dec 02 07:48:00 crc kubenswrapper[4895]: I1202 07:48:00.108558 4895 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9905011-be22-4d3c-89f2-dd677c65b8a1-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:00 crc kubenswrapper[4895]: I1202 07:48:00.108634 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9905011-be22-4d3c-89f2-dd677c65b8a1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:00 crc kubenswrapper[4895]: I1202 07:48:00.108645 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ssbzs\" (UniqueName: \"kubernetes.io/projected/f9905011-be22-4d3c-89f2-dd677c65b8a1-kube-api-access-ssbzs\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:00 crc kubenswrapper[4895]: I1202 07:48:00.108655 4895 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f9905011-be22-4d3c-89f2-dd677c65b8a1-logs\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:00 crc kubenswrapper[4895]: I1202 07:48:00.108666 4895 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9905011-be22-4d3c-89f2-dd677c65b8a1-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:00 crc kubenswrapper[4895]: I1202 07:48:00.108676 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9905011-be22-4d3c-89f2-dd677c65b8a1-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:00 crc kubenswrapper[4895]: I1202 07:48:00.116437 4895 scope.go:117] "RemoveContainer" 
containerID="781691fcc2bea33451a07f5b4ba60cca495934f5e25f3b0b263b08c62ac11579" Dec 02 07:48:00 crc kubenswrapper[4895]: E1202 07:48:00.117025 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"781691fcc2bea33451a07f5b4ba60cca495934f5e25f3b0b263b08c62ac11579\": container with ID starting with 781691fcc2bea33451a07f5b4ba60cca495934f5e25f3b0b263b08c62ac11579 not found: ID does not exist" containerID="781691fcc2bea33451a07f5b4ba60cca495934f5e25f3b0b263b08c62ac11579" Dec 02 07:48:00 crc kubenswrapper[4895]: I1202 07:48:00.117084 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"781691fcc2bea33451a07f5b4ba60cca495934f5e25f3b0b263b08c62ac11579"} err="failed to get container status \"781691fcc2bea33451a07f5b4ba60cca495934f5e25f3b0b263b08c62ac11579\": rpc error: code = NotFound desc = could not find container \"781691fcc2bea33451a07f5b4ba60cca495934f5e25f3b0b263b08c62ac11579\": container with ID starting with 781691fcc2bea33451a07f5b4ba60cca495934f5e25f3b0b263b08c62ac11579 not found: ID does not exist" Dec 02 07:48:00 crc kubenswrapper[4895]: I1202 07:48:00.117113 4895 scope.go:117] "RemoveContainer" containerID="6a00169a318f763610e293363ab886d61a2e5a9459f9784fee212602fb5080bf" Dec 02 07:48:00 crc kubenswrapper[4895]: E1202 07:48:00.117637 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a00169a318f763610e293363ab886d61a2e5a9459f9784fee212602fb5080bf\": container with ID starting with 6a00169a318f763610e293363ab886d61a2e5a9459f9784fee212602fb5080bf not found: ID does not exist" containerID="6a00169a318f763610e293363ab886d61a2e5a9459f9784fee212602fb5080bf" Dec 02 07:48:00 crc kubenswrapper[4895]: I1202 07:48:00.117665 4895 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"6a00169a318f763610e293363ab886d61a2e5a9459f9784fee212602fb5080bf"} err="failed to get container status \"6a00169a318f763610e293363ab886d61a2e5a9459f9784fee212602fb5080bf\": rpc error: code = NotFound desc = could not find container \"6a00169a318f763610e293363ab886d61a2e5a9459f9784fee212602fb5080bf\": container with ID starting with 6a00169a318f763610e293363ab886d61a2e5a9459f9784fee212602fb5080bf not found: ID does not exist" Dec 02 07:48:00 crc kubenswrapper[4895]: I1202 07:48:00.117682 4895 scope.go:117] "RemoveContainer" containerID="781691fcc2bea33451a07f5b4ba60cca495934f5e25f3b0b263b08c62ac11579" Dec 02 07:48:00 crc kubenswrapper[4895]: I1202 07:48:00.118249 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"781691fcc2bea33451a07f5b4ba60cca495934f5e25f3b0b263b08c62ac11579"} err="failed to get container status \"781691fcc2bea33451a07f5b4ba60cca495934f5e25f3b0b263b08c62ac11579\": rpc error: code = NotFound desc = could not find container \"781691fcc2bea33451a07f5b4ba60cca495934f5e25f3b0b263b08c62ac11579\": container with ID starting with 781691fcc2bea33451a07f5b4ba60cca495934f5e25f3b0b263b08c62ac11579 not found: ID does not exist" Dec 02 07:48:00 crc kubenswrapper[4895]: I1202 07:48:00.118298 4895 scope.go:117] "RemoveContainer" containerID="6a00169a318f763610e293363ab886d61a2e5a9459f9784fee212602fb5080bf" Dec 02 07:48:00 crc kubenswrapper[4895]: I1202 07:48:00.118606 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a00169a318f763610e293363ab886d61a2e5a9459f9784fee212602fb5080bf"} err="failed to get container status \"6a00169a318f763610e293363ab886d61a2e5a9459f9784fee212602fb5080bf\": rpc error: code = NotFound desc = could not find container \"6a00169a318f763610e293363ab886d61a2e5a9459f9784fee212602fb5080bf\": container with ID starting with 6a00169a318f763610e293363ab886d61a2e5a9459f9784fee212602fb5080bf not found: ID does not 
exist" Dec 02 07:48:00 crc kubenswrapper[4895]: I1202 07:48:00.427554 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 02 07:48:00 crc kubenswrapper[4895]: I1202 07:48:00.449054 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 02 07:48:00 crc kubenswrapper[4895]: I1202 07:48:00.464832 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 02 07:48:00 crc kubenswrapper[4895]: E1202 07:48:00.465467 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24560b66-b4b2-4c54-98f4-2dbf30465373" containerName="dnsmasq-dns" Dec 02 07:48:00 crc kubenswrapper[4895]: I1202 07:48:00.465504 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="24560b66-b4b2-4c54-98f4-2dbf30465373" containerName="dnsmasq-dns" Dec 02 07:48:00 crc kubenswrapper[4895]: E1202 07:48:00.465551 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24560b66-b4b2-4c54-98f4-2dbf30465373" containerName="init" Dec 02 07:48:00 crc kubenswrapper[4895]: I1202 07:48:00.465564 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="24560b66-b4b2-4c54-98f4-2dbf30465373" containerName="init" Dec 02 07:48:00 crc kubenswrapper[4895]: E1202 07:48:00.465597 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f3db479-516c-46b3-881d-7021a15c7a7d" containerName="nova-manage" Dec 02 07:48:00 crc kubenswrapper[4895]: I1202 07:48:00.465608 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f3db479-516c-46b3-881d-7021a15c7a7d" containerName="nova-manage" Dec 02 07:48:00 crc kubenswrapper[4895]: E1202 07:48:00.465634 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9905011-be22-4d3c-89f2-dd677c65b8a1" containerName="nova-api-api" Dec 02 07:48:00 crc kubenswrapper[4895]: I1202 07:48:00.465645 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9905011-be22-4d3c-89f2-dd677c65b8a1" containerName="nova-api-api" Dec 02 
07:48:00 crc kubenswrapper[4895]: E1202 07:48:00.465675 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9905011-be22-4d3c-89f2-dd677c65b8a1" containerName="nova-api-log" Dec 02 07:48:00 crc kubenswrapper[4895]: I1202 07:48:00.465686 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9905011-be22-4d3c-89f2-dd677c65b8a1" containerName="nova-api-log" Dec 02 07:48:00 crc kubenswrapper[4895]: I1202 07:48:00.466061 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9905011-be22-4d3c-89f2-dd677c65b8a1" containerName="nova-api-log" Dec 02 07:48:00 crc kubenswrapper[4895]: I1202 07:48:00.466096 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f3db479-516c-46b3-881d-7021a15c7a7d" containerName="nova-manage" Dec 02 07:48:00 crc kubenswrapper[4895]: I1202 07:48:00.466108 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="24560b66-b4b2-4c54-98f4-2dbf30465373" containerName="dnsmasq-dns" Dec 02 07:48:00 crc kubenswrapper[4895]: I1202 07:48:00.466160 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9905011-be22-4d3c-89f2-dd677c65b8a1" containerName="nova-api-api" Dec 02 07:48:00 crc kubenswrapper[4895]: I1202 07:48:00.467666 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 02 07:48:00 crc kubenswrapper[4895]: I1202 07:48:00.473342 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Dec 02 07:48:00 crc kubenswrapper[4895]: I1202 07:48:00.473468 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 02 07:48:00 crc kubenswrapper[4895]: I1202 07:48:00.488358 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 02 07:48:00 crc kubenswrapper[4895]: I1202 07:48:00.491362 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Dec 02 07:48:00 crc kubenswrapper[4895]: I1202 07:48:00.518185 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f762a68c-cabc-4842-844a-1db6710e3ee9-config-data\") pod \"nova-api-0\" (UID: \"f762a68c-cabc-4842-844a-1db6710e3ee9\") " pod="openstack/nova-api-0" Dec 02 07:48:00 crc kubenswrapper[4895]: I1202 07:48:00.518281 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f762a68c-cabc-4842-844a-1db6710e3ee9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f762a68c-cabc-4842-844a-1db6710e3ee9\") " pod="openstack/nova-api-0" Dec 02 07:48:00 crc kubenswrapper[4895]: I1202 07:48:00.518338 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jh6f\" (UniqueName: \"kubernetes.io/projected/f762a68c-cabc-4842-844a-1db6710e3ee9-kube-api-access-2jh6f\") pod \"nova-api-0\" (UID: \"f762a68c-cabc-4842-844a-1db6710e3ee9\") " pod="openstack/nova-api-0" Dec 02 07:48:00 crc kubenswrapper[4895]: I1202 07:48:00.518428 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f762a68c-cabc-4842-844a-1db6710e3ee9-internal-tls-certs\") pod \"nova-api-0\" (UID: \"f762a68c-cabc-4842-844a-1db6710e3ee9\") " pod="openstack/nova-api-0" Dec 02 07:48:00 crc kubenswrapper[4895]: I1202 07:48:00.518661 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f762a68c-cabc-4842-844a-1db6710e3ee9-public-tls-certs\") pod \"nova-api-0\" (UID: \"f762a68c-cabc-4842-844a-1db6710e3ee9\") " pod="openstack/nova-api-0" Dec 02 07:48:00 crc kubenswrapper[4895]: I1202 07:48:00.518692 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f762a68c-cabc-4842-844a-1db6710e3ee9-logs\") pod \"nova-api-0\" (UID: \"f762a68c-cabc-4842-844a-1db6710e3ee9\") " pod="openstack/nova-api-0" Dec 02 07:48:00 crc kubenswrapper[4895]: I1202 07:48:00.620735 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jh6f\" (UniqueName: \"kubernetes.io/projected/f762a68c-cabc-4842-844a-1db6710e3ee9-kube-api-access-2jh6f\") pod \"nova-api-0\" (UID: \"f762a68c-cabc-4842-844a-1db6710e3ee9\") " pod="openstack/nova-api-0" Dec 02 07:48:00 crc kubenswrapper[4895]: I1202 07:48:00.621134 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f762a68c-cabc-4842-844a-1db6710e3ee9-internal-tls-certs\") pod \"nova-api-0\" (UID: \"f762a68c-cabc-4842-844a-1db6710e3ee9\") " pod="openstack/nova-api-0" Dec 02 07:48:00 crc kubenswrapper[4895]: I1202 07:48:00.621402 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f762a68c-cabc-4842-844a-1db6710e3ee9-public-tls-certs\") pod \"nova-api-0\" (UID: \"f762a68c-cabc-4842-844a-1db6710e3ee9\") " 
pod="openstack/nova-api-0" Dec 02 07:48:00 crc kubenswrapper[4895]: I1202 07:48:00.621489 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f762a68c-cabc-4842-844a-1db6710e3ee9-logs\") pod \"nova-api-0\" (UID: \"f762a68c-cabc-4842-844a-1db6710e3ee9\") " pod="openstack/nova-api-0" Dec 02 07:48:00 crc kubenswrapper[4895]: I1202 07:48:00.621634 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f762a68c-cabc-4842-844a-1db6710e3ee9-config-data\") pod \"nova-api-0\" (UID: \"f762a68c-cabc-4842-844a-1db6710e3ee9\") " pod="openstack/nova-api-0" Dec 02 07:48:00 crc kubenswrapper[4895]: I1202 07:48:00.621753 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f762a68c-cabc-4842-844a-1db6710e3ee9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f762a68c-cabc-4842-844a-1db6710e3ee9\") " pod="openstack/nova-api-0" Dec 02 07:48:00 crc kubenswrapper[4895]: I1202 07:48:00.622435 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f762a68c-cabc-4842-844a-1db6710e3ee9-logs\") pod \"nova-api-0\" (UID: \"f762a68c-cabc-4842-844a-1db6710e3ee9\") " pod="openstack/nova-api-0" Dec 02 07:48:00 crc kubenswrapper[4895]: I1202 07:48:00.627985 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f762a68c-cabc-4842-844a-1db6710e3ee9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f762a68c-cabc-4842-844a-1db6710e3ee9\") " pod="openstack/nova-api-0" Dec 02 07:48:00 crc kubenswrapper[4895]: I1202 07:48:00.629323 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f762a68c-cabc-4842-844a-1db6710e3ee9-public-tls-certs\") pod 
\"nova-api-0\" (UID: \"f762a68c-cabc-4842-844a-1db6710e3ee9\") " pod="openstack/nova-api-0" Dec 02 07:48:00 crc kubenswrapper[4895]: I1202 07:48:00.633378 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f762a68c-cabc-4842-844a-1db6710e3ee9-config-data\") pod \"nova-api-0\" (UID: \"f762a68c-cabc-4842-844a-1db6710e3ee9\") " pod="openstack/nova-api-0" Dec 02 07:48:00 crc kubenswrapper[4895]: I1202 07:48:00.638634 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f762a68c-cabc-4842-844a-1db6710e3ee9-internal-tls-certs\") pod \"nova-api-0\" (UID: \"f762a68c-cabc-4842-844a-1db6710e3ee9\") " pod="openstack/nova-api-0" Dec 02 07:48:00 crc kubenswrapper[4895]: I1202 07:48:00.642576 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jh6f\" (UniqueName: \"kubernetes.io/projected/f762a68c-cabc-4842-844a-1db6710e3ee9-kube-api-access-2jh6f\") pod \"nova-api-0\" (UID: \"f762a68c-cabc-4842-844a-1db6710e3ee9\") " pod="openstack/nova-api-0" Dec 02 07:48:00 crc kubenswrapper[4895]: E1202 07:48:00.779019 4895 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1715524a082c44f6e32e19ecd9c82eadacab0180c80d32d488c4380c16af9a29" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 02 07:48:00 crc kubenswrapper[4895]: E1202 07:48:00.781032 4895 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1715524a082c44f6e32e19ecd9c82eadacab0180c80d32d488c4380c16af9a29" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 02 07:48:00 crc kubenswrapper[4895]: E1202 07:48:00.784949 4895 log.go:32] 
"ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1715524a082c44f6e32e19ecd9c82eadacab0180c80d32d488c4380c16af9a29" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 02 07:48:00 crc kubenswrapper[4895]: E1202 07:48:00.785034 4895 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="2c347298-8afb-4598-ba75-64cd23db0935" containerName="nova-scheduler-scheduler" Dec 02 07:48:00 crc kubenswrapper[4895]: I1202 07:48:00.831871 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 02 07:48:01 crc kubenswrapper[4895]: I1202 07:48:01.163180 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9905011-be22-4d3c-89f2-dd677c65b8a1" path="/var/lib/kubelet/pods/f9905011-be22-4d3c-89f2-dd677c65b8a1/volumes" Dec 02 07:48:01 crc kubenswrapper[4895]: I1202 07:48:01.345215 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 02 07:48:02 crc kubenswrapper[4895]: I1202 07:48:02.081042 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f762a68c-cabc-4842-844a-1db6710e3ee9","Type":"ContainerStarted","Data":"e21126490e30d0f2abdaa9c6468d800825eee8d11c80c4baee6ce5e501917408"} Dec 02 07:48:02 crc kubenswrapper[4895]: I1202 07:48:02.081599 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f762a68c-cabc-4842-844a-1db6710e3ee9","Type":"ContainerStarted","Data":"fc3f6c60c5579c4ea14cacedf2c65b1a8d013562a1bf58b34c7f59f9c0b1bdbe"} Dec 02 07:48:02 crc kubenswrapper[4895]: I1202 07:48:02.081624 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"f762a68c-cabc-4842-844a-1db6710e3ee9","Type":"ContainerStarted","Data":"58bc85987b637dd3201cd07bd859b57218cf7cf9d0e7867f0d422f70d7b00677"} Dec 02 07:48:02 crc kubenswrapper[4895]: I1202 07:48:02.123059 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.123012694 podStartE2EDuration="2.123012694s" podCreationTimestamp="2025-12-02 07:48:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:48:02.110212507 +0000 UTC m=+1493.281072190" watchObservedRunningTime="2025-12-02 07:48:02.123012694 +0000 UTC m=+1493.293872327" Dec 02 07:48:02 crc kubenswrapper[4895]: I1202 07:48:02.741617 4895 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="23ebd00f-3697-43da-a335-42d260a62237" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.188:8775/\": read tcp 10.217.0.2:42274->10.217.0.188:8775: read: connection reset by peer" Dec 02 07:48:02 crc kubenswrapper[4895]: I1202 07:48:02.741631 4895 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="23ebd00f-3697-43da-a335-42d260a62237" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.188:8775/\": read tcp 10.217.0.2:42288->10.217.0.188:8775: read: connection reset by peer" Dec 02 07:48:03 crc kubenswrapper[4895]: I1202 07:48:03.102895 4895 generic.go:334] "Generic (PLEG): container finished" podID="23ebd00f-3697-43da-a335-42d260a62237" containerID="d33e2d232e3b83713b4b5bc7098bf0d68f44aaac578ecdb93f62f57e2d517660" exitCode=0 Dec 02 07:48:03 crc kubenswrapper[4895]: I1202 07:48:03.105315 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"23ebd00f-3697-43da-a335-42d260a62237","Type":"ContainerDied","Data":"d33e2d232e3b83713b4b5bc7098bf0d68f44aaac578ecdb93f62f57e2d517660"} Dec 02 07:48:03 crc kubenswrapper[4895]: I1202 07:48:03.306438 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 02 07:48:03 crc kubenswrapper[4895]: I1202 07:48:03.397505 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-blbqv\" (UniqueName: \"kubernetes.io/projected/23ebd00f-3697-43da-a335-42d260a62237-kube-api-access-blbqv\") pod \"23ebd00f-3697-43da-a335-42d260a62237\" (UID: \"23ebd00f-3697-43da-a335-42d260a62237\") " Dec 02 07:48:03 crc kubenswrapper[4895]: I1202 07:48:03.397622 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23ebd00f-3697-43da-a335-42d260a62237-config-data\") pod \"23ebd00f-3697-43da-a335-42d260a62237\" (UID: \"23ebd00f-3697-43da-a335-42d260a62237\") " Dec 02 07:48:03 crc kubenswrapper[4895]: I1202 07:48:03.397715 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/23ebd00f-3697-43da-a335-42d260a62237-nova-metadata-tls-certs\") pod \"23ebd00f-3697-43da-a335-42d260a62237\" (UID: \"23ebd00f-3697-43da-a335-42d260a62237\") " Dec 02 07:48:03 crc kubenswrapper[4895]: I1202 07:48:03.397930 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/23ebd00f-3697-43da-a335-42d260a62237-logs\") pod \"23ebd00f-3697-43da-a335-42d260a62237\" (UID: \"23ebd00f-3697-43da-a335-42d260a62237\") " Dec 02 07:48:03 crc kubenswrapper[4895]: I1202 07:48:03.397963 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/23ebd00f-3697-43da-a335-42d260a62237-combined-ca-bundle\") pod \"23ebd00f-3697-43da-a335-42d260a62237\" (UID: \"23ebd00f-3697-43da-a335-42d260a62237\") " Dec 02 07:48:03 crc kubenswrapper[4895]: I1202 07:48:03.402044 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/23ebd00f-3697-43da-a335-42d260a62237-logs" (OuterVolumeSpecName: "logs") pod "23ebd00f-3697-43da-a335-42d260a62237" (UID: "23ebd00f-3697-43da-a335-42d260a62237"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:48:03 crc kubenswrapper[4895]: I1202 07:48:03.409518 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23ebd00f-3697-43da-a335-42d260a62237-kube-api-access-blbqv" (OuterVolumeSpecName: "kube-api-access-blbqv") pod "23ebd00f-3697-43da-a335-42d260a62237" (UID: "23ebd00f-3697-43da-a335-42d260a62237"). InnerVolumeSpecName "kube-api-access-blbqv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:48:03 crc kubenswrapper[4895]: I1202 07:48:03.450361 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23ebd00f-3697-43da-a335-42d260a62237-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "23ebd00f-3697-43da-a335-42d260a62237" (UID: "23ebd00f-3697-43da-a335-42d260a62237"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:48:03 crc kubenswrapper[4895]: I1202 07:48:03.453104 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23ebd00f-3697-43da-a335-42d260a62237-config-data" (OuterVolumeSpecName: "config-data") pod "23ebd00f-3697-43da-a335-42d260a62237" (UID: "23ebd00f-3697-43da-a335-42d260a62237"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:48:03 crc kubenswrapper[4895]: I1202 07:48:03.501339 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-blbqv\" (UniqueName: \"kubernetes.io/projected/23ebd00f-3697-43da-a335-42d260a62237-kube-api-access-blbqv\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:03 crc kubenswrapper[4895]: I1202 07:48:03.501380 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23ebd00f-3697-43da-a335-42d260a62237-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:03 crc kubenswrapper[4895]: I1202 07:48:03.501403 4895 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/23ebd00f-3697-43da-a335-42d260a62237-logs\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:03 crc kubenswrapper[4895]: I1202 07:48:03.501415 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23ebd00f-3697-43da-a335-42d260a62237-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:03 crc kubenswrapper[4895]: I1202 07:48:03.543102 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23ebd00f-3697-43da-a335-42d260a62237-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "23ebd00f-3697-43da-a335-42d260a62237" (UID: "23ebd00f-3697-43da-a335-42d260a62237"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:48:03 crc kubenswrapper[4895]: I1202 07:48:03.603726 4895 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/23ebd00f-3697-43da-a335-42d260a62237-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:04 crc kubenswrapper[4895]: I1202 07:48:04.133143 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"23ebd00f-3697-43da-a335-42d260a62237","Type":"ContainerDied","Data":"a035b29b4053c54ec1c26c456cf2504604fab837bd6d2ac03adce178b1636081"} Dec 02 07:48:04 crc kubenswrapper[4895]: I1202 07:48:04.133219 4895 scope.go:117] "RemoveContainer" containerID="d33e2d232e3b83713b4b5bc7098bf0d68f44aaac578ecdb93f62f57e2d517660" Dec 02 07:48:04 crc kubenswrapper[4895]: I1202 07:48:04.133272 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 02 07:48:04 crc kubenswrapper[4895]: I1202 07:48:04.171354 4895 scope.go:117] "RemoveContainer" containerID="7a93b7b1fa5e4df6ef9a5fd45c7bfb4f2453204ed865bcce093ea30558cc6ba4" Dec 02 07:48:04 crc kubenswrapper[4895]: I1202 07:48:04.193295 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 07:48:04 crc kubenswrapper[4895]: I1202 07:48:04.206602 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 07:48:04 crc kubenswrapper[4895]: I1202 07:48:04.218308 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 02 07:48:04 crc kubenswrapper[4895]: E1202 07:48:04.218800 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23ebd00f-3697-43da-a335-42d260a62237" containerName="nova-metadata-log" Dec 02 07:48:04 crc kubenswrapper[4895]: I1202 07:48:04.218815 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="23ebd00f-3697-43da-a335-42d260a62237" 
containerName="nova-metadata-log" Dec 02 07:48:04 crc kubenswrapper[4895]: E1202 07:48:04.218840 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23ebd00f-3697-43da-a335-42d260a62237" containerName="nova-metadata-metadata" Dec 02 07:48:04 crc kubenswrapper[4895]: I1202 07:48:04.218847 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="23ebd00f-3697-43da-a335-42d260a62237" containerName="nova-metadata-metadata" Dec 02 07:48:04 crc kubenswrapper[4895]: I1202 07:48:04.219076 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="23ebd00f-3697-43da-a335-42d260a62237" containerName="nova-metadata-metadata" Dec 02 07:48:04 crc kubenswrapper[4895]: I1202 07:48:04.219095 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="23ebd00f-3697-43da-a335-42d260a62237" containerName="nova-metadata-log" Dec 02 07:48:04 crc kubenswrapper[4895]: I1202 07:48:04.220270 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 02 07:48:04 crc kubenswrapper[4895]: I1202 07:48:04.226850 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 02 07:48:04 crc kubenswrapper[4895]: I1202 07:48:04.227120 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 02 07:48:04 crc kubenswrapper[4895]: I1202 07:48:04.232165 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 07:48:04 crc kubenswrapper[4895]: I1202 07:48:04.318040 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2d2c6f66-9fe8-4d15-92f6-f29493e6cd7b-logs\") pod \"nova-metadata-0\" (UID: \"2d2c6f66-9fe8-4d15-92f6-f29493e6cd7b\") " pod="openstack/nova-metadata-0" Dec 02 07:48:04 crc kubenswrapper[4895]: I1202 07:48:04.318098 4895 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdqn2\" (UniqueName: \"kubernetes.io/projected/2d2c6f66-9fe8-4d15-92f6-f29493e6cd7b-kube-api-access-vdqn2\") pod \"nova-metadata-0\" (UID: \"2d2c6f66-9fe8-4d15-92f6-f29493e6cd7b\") " pod="openstack/nova-metadata-0" Dec 02 07:48:04 crc kubenswrapper[4895]: I1202 07:48:04.318131 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d2c6f66-9fe8-4d15-92f6-f29493e6cd7b-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"2d2c6f66-9fe8-4d15-92f6-f29493e6cd7b\") " pod="openstack/nova-metadata-0" Dec 02 07:48:04 crc kubenswrapper[4895]: I1202 07:48:04.318228 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d2c6f66-9fe8-4d15-92f6-f29493e6cd7b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2d2c6f66-9fe8-4d15-92f6-f29493e6cd7b\") " pod="openstack/nova-metadata-0" Dec 02 07:48:04 crc kubenswrapper[4895]: I1202 07:48:04.318324 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d2c6f66-9fe8-4d15-92f6-f29493e6cd7b-config-data\") pod \"nova-metadata-0\" (UID: \"2d2c6f66-9fe8-4d15-92f6-f29493e6cd7b\") " pod="openstack/nova-metadata-0" Dec 02 07:48:04 crc kubenswrapper[4895]: I1202 07:48:04.421018 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d2c6f66-9fe8-4d15-92f6-f29493e6cd7b-config-data\") pod \"nova-metadata-0\" (UID: \"2d2c6f66-9fe8-4d15-92f6-f29493e6cd7b\") " pod="openstack/nova-metadata-0" Dec 02 07:48:04 crc kubenswrapper[4895]: I1202 07:48:04.421139 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/2d2c6f66-9fe8-4d15-92f6-f29493e6cd7b-logs\") pod \"nova-metadata-0\" (UID: \"2d2c6f66-9fe8-4d15-92f6-f29493e6cd7b\") " pod="openstack/nova-metadata-0" Dec 02 07:48:04 crc kubenswrapper[4895]: I1202 07:48:04.421191 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vdqn2\" (UniqueName: \"kubernetes.io/projected/2d2c6f66-9fe8-4d15-92f6-f29493e6cd7b-kube-api-access-vdqn2\") pod \"nova-metadata-0\" (UID: \"2d2c6f66-9fe8-4d15-92f6-f29493e6cd7b\") " pod="openstack/nova-metadata-0" Dec 02 07:48:04 crc kubenswrapper[4895]: I1202 07:48:04.421226 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d2c6f66-9fe8-4d15-92f6-f29493e6cd7b-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"2d2c6f66-9fe8-4d15-92f6-f29493e6cd7b\") " pod="openstack/nova-metadata-0" Dec 02 07:48:04 crc kubenswrapper[4895]: I1202 07:48:04.421263 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d2c6f66-9fe8-4d15-92f6-f29493e6cd7b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2d2c6f66-9fe8-4d15-92f6-f29493e6cd7b\") " pod="openstack/nova-metadata-0" Dec 02 07:48:04 crc kubenswrapper[4895]: I1202 07:48:04.421775 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2d2c6f66-9fe8-4d15-92f6-f29493e6cd7b-logs\") pod \"nova-metadata-0\" (UID: \"2d2c6f66-9fe8-4d15-92f6-f29493e6cd7b\") " pod="openstack/nova-metadata-0" Dec 02 07:48:04 crc kubenswrapper[4895]: I1202 07:48:04.427494 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d2c6f66-9fe8-4d15-92f6-f29493e6cd7b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2d2c6f66-9fe8-4d15-92f6-f29493e6cd7b\") " pod="openstack/nova-metadata-0" Dec 
02 07:48:04 crc kubenswrapper[4895]: I1202 07:48:04.428238 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d2c6f66-9fe8-4d15-92f6-f29493e6cd7b-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"2d2c6f66-9fe8-4d15-92f6-f29493e6cd7b\") " pod="openstack/nova-metadata-0" Dec 02 07:48:04 crc kubenswrapper[4895]: I1202 07:48:04.429641 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d2c6f66-9fe8-4d15-92f6-f29493e6cd7b-config-data\") pod \"nova-metadata-0\" (UID: \"2d2c6f66-9fe8-4d15-92f6-f29493e6cd7b\") " pod="openstack/nova-metadata-0" Dec 02 07:48:04 crc kubenswrapper[4895]: I1202 07:48:04.444980 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdqn2\" (UniqueName: \"kubernetes.io/projected/2d2c6f66-9fe8-4d15-92f6-f29493e6cd7b-kube-api-access-vdqn2\") pod \"nova-metadata-0\" (UID: \"2d2c6f66-9fe8-4d15-92f6-f29493e6cd7b\") " pod="openstack/nova-metadata-0" Dec 02 07:48:04 crc kubenswrapper[4895]: I1202 07:48:04.545293 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 02 07:48:05 crc kubenswrapper[4895]: E1202 07:48:05.013429 4895 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2c347298_8afb_4598_ba75_64cd23db0935.slice/crio-conmon-1715524a082c44f6e32e19ecd9c82eadacab0180c80d32d488c4380c16af9a29.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2c347298_8afb_4598_ba75_64cd23db0935.slice/crio-1715524a082c44f6e32e19ecd9c82eadacab0180c80d32d488c4380c16af9a29.scope\": RecentStats: unable to find data in memory cache]" Dec 02 07:48:05 crc kubenswrapper[4895]: I1202 07:48:05.069416 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 07:48:05 crc kubenswrapper[4895]: I1202 07:48:05.161379 4895 generic.go:334] "Generic (PLEG): container finished" podID="2c347298-8afb-4598-ba75-64cd23db0935" containerID="1715524a082c44f6e32e19ecd9c82eadacab0180c80d32d488c4380c16af9a29" exitCode=0 Dec 02 07:48:05 crc kubenswrapper[4895]: I1202 07:48:05.173196 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23ebd00f-3697-43da-a335-42d260a62237" path="/var/lib/kubelet/pods/23ebd00f-3697-43da-a335-42d260a62237/volumes" Dec 02 07:48:05 crc kubenswrapper[4895]: I1202 07:48:05.174354 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2d2c6f66-9fe8-4d15-92f6-f29493e6cd7b","Type":"ContainerStarted","Data":"ca9480eb873cb42c7438f27a0592e23bd26270b5d0d893aa3b2e61758d9f0968"} Dec 02 07:48:05 crc kubenswrapper[4895]: I1202 07:48:05.174399 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2c347298-8afb-4598-ba75-64cd23db0935","Type":"ContainerDied","Data":"1715524a082c44f6e32e19ecd9c82eadacab0180c80d32d488c4380c16af9a29"} Dec 02 07:48:05 crc 
kubenswrapper[4895]: I1202 07:48:05.243296 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 02 07:48:05 crc kubenswrapper[4895]: I1202 07:48:05.350854 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c347298-8afb-4598-ba75-64cd23db0935-combined-ca-bundle\") pod \"2c347298-8afb-4598-ba75-64cd23db0935\" (UID: \"2c347298-8afb-4598-ba75-64cd23db0935\") " Dec 02 07:48:05 crc kubenswrapper[4895]: I1202 07:48:05.350942 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c347298-8afb-4598-ba75-64cd23db0935-config-data\") pod \"2c347298-8afb-4598-ba75-64cd23db0935\" (UID: \"2c347298-8afb-4598-ba75-64cd23db0935\") " Dec 02 07:48:05 crc kubenswrapper[4895]: I1202 07:48:05.351090 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2mt4l\" (UniqueName: \"kubernetes.io/projected/2c347298-8afb-4598-ba75-64cd23db0935-kube-api-access-2mt4l\") pod \"2c347298-8afb-4598-ba75-64cd23db0935\" (UID: \"2c347298-8afb-4598-ba75-64cd23db0935\") " Dec 02 07:48:05 crc kubenswrapper[4895]: I1202 07:48:05.358540 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c347298-8afb-4598-ba75-64cd23db0935-kube-api-access-2mt4l" (OuterVolumeSpecName: "kube-api-access-2mt4l") pod "2c347298-8afb-4598-ba75-64cd23db0935" (UID: "2c347298-8afb-4598-ba75-64cd23db0935"). InnerVolumeSpecName "kube-api-access-2mt4l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:48:05 crc kubenswrapper[4895]: I1202 07:48:05.433650 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c347298-8afb-4598-ba75-64cd23db0935-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2c347298-8afb-4598-ba75-64cd23db0935" (UID: "2c347298-8afb-4598-ba75-64cd23db0935"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:48:05 crc kubenswrapper[4895]: I1202 07:48:05.433706 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c347298-8afb-4598-ba75-64cd23db0935-config-data" (OuterVolumeSpecName: "config-data") pod "2c347298-8afb-4598-ba75-64cd23db0935" (UID: "2c347298-8afb-4598-ba75-64cd23db0935"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:48:05 crc kubenswrapper[4895]: I1202 07:48:05.453559 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c347298-8afb-4598-ba75-64cd23db0935-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:05 crc kubenswrapper[4895]: I1202 07:48:05.453631 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c347298-8afb-4598-ba75-64cd23db0935-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:05 crc kubenswrapper[4895]: I1202 07:48:05.453641 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2mt4l\" (UniqueName: \"kubernetes.io/projected/2c347298-8afb-4598-ba75-64cd23db0935-kube-api-access-2mt4l\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:05 crc kubenswrapper[4895]: I1202 07:48:05.473109 4895 patch_prober.go:28] interesting pod/machine-config-daemon-wfcg7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 07:48:05 crc kubenswrapper[4895]: I1202 07:48:05.473174 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 07:48:06 crc kubenswrapper[4895]: I1202 07:48:06.180698 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2d2c6f66-9fe8-4d15-92f6-f29493e6cd7b","Type":"ContainerStarted","Data":"1581cb4c4b70dcc4008550020a88177eb72fd5b2057dc2f0204082b9090480c2"} Dec 02 07:48:06 crc kubenswrapper[4895]: I1202 07:48:06.181560 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2d2c6f66-9fe8-4d15-92f6-f29493e6cd7b","Type":"ContainerStarted","Data":"7a9cd5ea2cd01d61f6bb76eff238c970ae03c6d9c57bfc437465a95ac614529c"} Dec 02 07:48:06 crc kubenswrapper[4895]: I1202 07:48:06.182957 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2c347298-8afb-4598-ba75-64cd23db0935","Type":"ContainerDied","Data":"40ff0a58a2c649fb977b6d4a2e60dda4b909165d9ab04fab3df6a841dca7625b"} Dec 02 07:48:06 crc kubenswrapper[4895]: I1202 07:48:06.183017 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 02 07:48:06 crc kubenswrapper[4895]: I1202 07:48:06.183017 4895 scope.go:117] "RemoveContainer" containerID="1715524a082c44f6e32e19ecd9c82eadacab0180c80d32d488c4380c16af9a29" Dec 02 07:48:06 crc kubenswrapper[4895]: I1202 07:48:06.206879 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.206557925 podStartE2EDuration="2.206557925s" podCreationTimestamp="2025-12-02 07:48:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:48:06.203094927 +0000 UTC m=+1497.373954610" watchObservedRunningTime="2025-12-02 07:48:06.206557925 +0000 UTC m=+1497.377417548" Dec 02 07:48:06 crc kubenswrapper[4895]: I1202 07:48:06.246114 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 07:48:06 crc kubenswrapper[4895]: I1202 07:48:06.272841 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 07:48:06 crc kubenswrapper[4895]: I1202 07:48:06.290827 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 07:48:06 crc kubenswrapper[4895]: E1202 07:48:06.291554 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c347298-8afb-4598-ba75-64cd23db0935" containerName="nova-scheduler-scheduler" Dec 02 07:48:06 crc kubenswrapper[4895]: I1202 07:48:06.291589 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c347298-8afb-4598-ba75-64cd23db0935" containerName="nova-scheduler-scheduler" Dec 02 07:48:06 crc kubenswrapper[4895]: I1202 07:48:06.292047 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c347298-8afb-4598-ba75-64cd23db0935" containerName="nova-scheduler-scheduler" Dec 02 07:48:06 crc kubenswrapper[4895]: I1202 07:48:06.293284 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 02 07:48:06 crc kubenswrapper[4895]: I1202 07:48:06.301239 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 02 07:48:06 crc kubenswrapper[4895]: I1202 07:48:06.307959 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 07:48:06 crc kubenswrapper[4895]: I1202 07:48:06.372970 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d08915b6-6f79-40e4-8c26-d9f82606b4cc-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d08915b6-6f79-40e4-8c26-d9f82606b4cc\") " pod="openstack/nova-scheduler-0" Dec 02 07:48:06 crc kubenswrapper[4895]: I1202 07:48:06.373031 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d08915b6-6f79-40e4-8c26-d9f82606b4cc-config-data\") pod \"nova-scheduler-0\" (UID: \"d08915b6-6f79-40e4-8c26-d9f82606b4cc\") " pod="openstack/nova-scheduler-0" Dec 02 07:48:06 crc kubenswrapper[4895]: I1202 07:48:06.373062 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cp9jt\" (UniqueName: \"kubernetes.io/projected/d08915b6-6f79-40e4-8c26-d9f82606b4cc-kube-api-access-cp9jt\") pod \"nova-scheduler-0\" (UID: \"d08915b6-6f79-40e4-8c26-d9f82606b4cc\") " pod="openstack/nova-scheduler-0" Dec 02 07:48:06 crc kubenswrapper[4895]: I1202 07:48:06.475827 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d08915b6-6f79-40e4-8c26-d9f82606b4cc-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d08915b6-6f79-40e4-8c26-d9f82606b4cc\") " pod="openstack/nova-scheduler-0" Dec 02 07:48:06 crc kubenswrapper[4895]: I1202 07:48:06.475972 4895 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d08915b6-6f79-40e4-8c26-d9f82606b4cc-config-data\") pod \"nova-scheduler-0\" (UID: \"d08915b6-6f79-40e4-8c26-d9f82606b4cc\") " pod="openstack/nova-scheduler-0" Dec 02 07:48:06 crc kubenswrapper[4895]: I1202 07:48:06.476057 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cp9jt\" (UniqueName: \"kubernetes.io/projected/d08915b6-6f79-40e4-8c26-d9f82606b4cc-kube-api-access-cp9jt\") pod \"nova-scheduler-0\" (UID: \"d08915b6-6f79-40e4-8c26-d9f82606b4cc\") " pod="openstack/nova-scheduler-0" Dec 02 07:48:06 crc kubenswrapper[4895]: I1202 07:48:06.484351 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d08915b6-6f79-40e4-8c26-d9f82606b4cc-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d08915b6-6f79-40e4-8c26-d9f82606b4cc\") " pod="openstack/nova-scheduler-0" Dec 02 07:48:06 crc kubenswrapper[4895]: I1202 07:48:06.500494 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d08915b6-6f79-40e4-8c26-d9f82606b4cc-config-data\") pod \"nova-scheduler-0\" (UID: \"d08915b6-6f79-40e4-8c26-d9f82606b4cc\") " pod="openstack/nova-scheduler-0" Dec 02 07:48:06 crc kubenswrapper[4895]: I1202 07:48:06.511806 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cp9jt\" (UniqueName: \"kubernetes.io/projected/d08915b6-6f79-40e4-8c26-d9f82606b4cc-kube-api-access-cp9jt\") pod \"nova-scheduler-0\" (UID: \"d08915b6-6f79-40e4-8c26-d9f82606b4cc\") " pod="openstack/nova-scheduler-0" Dec 02 07:48:06 crc kubenswrapper[4895]: I1202 07:48:06.619121 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 02 07:48:07 crc kubenswrapper[4895]: I1202 07:48:07.160175 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c347298-8afb-4598-ba75-64cd23db0935" path="/var/lib/kubelet/pods/2c347298-8afb-4598-ba75-64cd23db0935/volumes" Dec 02 07:48:07 crc kubenswrapper[4895]: I1202 07:48:07.182350 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 07:48:08 crc kubenswrapper[4895]: I1202 07:48:08.212409 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d08915b6-6f79-40e4-8c26-d9f82606b4cc","Type":"ContainerStarted","Data":"cb9866d7f2171a1626ecf3c4140a850dff5554a37f5e78b53e02cd154e5fe2d5"} Dec 02 07:48:08 crc kubenswrapper[4895]: I1202 07:48:08.213009 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d08915b6-6f79-40e4-8c26-d9f82606b4cc","Type":"ContainerStarted","Data":"3e2e331854c4e9337578d89d34663f953235d7bc3b4d554ef471426d1bd82237"} Dec 02 07:48:08 crc kubenswrapper[4895]: I1202 07:48:08.238165 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.238027072 podStartE2EDuration="2.238027072s" podCreationTimestamp="2025-12-02 07:48:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:48:08.233424009 +0000 UTC m=+1499.404283662" watchObservedRunningTime="2025-12-02 07:48:08.238027072 +0000 UTC m=+1499.408886695" Dec 02 07:48:09 crc kubenswrapper[4895]: I1202 07:48:09.545471 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 02 07:48:09 crc kubenswrapper[4895]: I1202 07:48:09.546054 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 02 07:48:10 crc kubenswrapper[4895]: I1202 
07:48:10.832202 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 02 07:48:10 crc kubenswrapper[4895]: I1202 07:48:10.832303 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 02 07:48:11 crc kubenswrapper[4895]: I1202 07:48:11.619585 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 02 07:48:11 crc kubenswrapper[4895]: I1202 07:48:11.849002 4895 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="f762a68c-cabc-4842-844a-1db6710e3ee9" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.199:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 02 07:48:11 crc kubenswrapper[4895]: I1202 07:48:11.849398 4895 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="f762a68c-cabc-4842-844a-1db6710e3ee9" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.199:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 02 07:48:14 crc kubenswrapper[4895]: I1202 07:48:14.545485 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 02 07:48:14 crc kubenswrapper[4895]: I1202 07:48:14.546015 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 02 07:48:15 crc kubenswrapper[4895]: I1202 07:48:15.565090 4895 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="2d2c6f66-9fe8-4d15-92f6-f29493e6cd7b" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.200:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 02 07:48:15 crc kubenswrapper[4895]: I1202 07:48:15.565095 4895 prober.go:107] "Probe failed" 
probeType="Startup" pod="openstack/nova-metadata-0" podUID="2d2c6f66-9fe8-4d15-92f6-f29493e6cd7b" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.200:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 02 07:48:16 crc kubenswrapper[4895]: I1202 07:48:16.620026 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 02 07:48:16 crc kubenswrapper[4895]: I1202 07:48:16.716600 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 02 07:48:17 crc kubenswrapper[4895]: I1202 07:48:17.453797 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 02 07:48:20 crc kubenswrapper[4895]: I1202 07:48:20.839279 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 02 07:48:20 crc kubenswrapper[4895]: I1202 07:48:20.840388 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 02 07:48:20 crc kubenswrapper[4895]: I1202 07:48:20.840866 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 02 07:48:20 crc kubenswrapper[4895]: I1202 07:48:20.846410 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 02 07:48:21 crc kubenswrapper[4895]: I1202 07:48:21.438175 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 02 07:48:21 crc kubenswrapper[4895]: I1202 07:48:21.468464 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 02 07:48:21 crc kubenswrapper[4895]: I1202 07:48:21.483769 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 02 07:48:24 crc kubenswrapper[4895]: I1202 07:48:24.551782 4895 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 02 07:48:24 crc kubenswrapper[4895]: I1202 07:48:24.552351 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 02 07:48:24 crc kubenswrapper[4895]: I1202 07:48:24.560996 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 02 07:48:24 crc kubenswrapper[4895]: I1202 07:48:24.578051 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 02 07:48:35 crc kubenswrapper[4895]: I1202 07:48:35.473613 4895 patch_prober.go:28] interesting pod/machine-config-daemon-wfcg7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 07:48:35 crc kubenswrapper[4895]: I1202 07:48:35.474544 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 07:48:43 crc kubenswrapper[4895]: I1202 07:48:43.841602 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-vpwvp"] Dec 02 07:48:43 crc kubenswrapper[4895]: I1202 07:48:43.844726 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vpwvp" Dec 02 07:48:43 crc kubenswrapper[4895]: I1202 07:48:43.862713 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vpwvp"] Dec 02 07:48:43 crc kubenswrapper[4895]: I1202 07:48:43.916288 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e203ec5f-dd45-44bb-97b2-fd8a548ce231-utilities\") pod \"redhat-marketplace-vpwvp\" (UID: \"e203ec5f-dd45-44bb-97b2-fd8a548ce231\") " pod="openshift-marketplace/redhat-marketplace-vpwvp" Dec 02 07:48:43 crc kubenswrapper[4895]: I1202 07:48:43.916448 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e203ec5f-dd45-44bb-97b2-fd8a548ce231-catalog-content\") pod \"redhat-marketplace-vpwvp\" (UID: \"e203ec5f-dd45-44bb-97b2-fd8a548ce231\") " pod="openshift-marketplace/redhat-marketplace-vpwvp" Dec 02 07:48:43 crc kubenswrapper[4895]: I1202 07:48:43.916486 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2nmk\" (UniqueName: \"kubernetes.io/projected/e203ec5f-dd45-44bb-97b2-fd8a548ce231-kube-api-access-p2nmk\") pod \"redhat-marketplace-vpwvp\" (UID: \"e203ec5f-dd45-44bb-97b2-fd8a548ce231\") " pod="openshift-marketplace/redhat-marketplace-vpwvp" Dec 02 07:48:44 crc kubenswrapper[4895]: I1202 07:48:44.018329 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e203ec5f-dd45-44bb-97b2-fd8a548ce231-catalog-content\") pod \"redhat-marketplace-vpwvp\" (UID: \"e203ec5f-dd45-44bb-97b2-fd8a548ce231\") " pod="openshift-marketplace/redhat-marketplace-vpwvp" Dec 02 07:48:44 crc kubenswrapper[4895]: I1202 07:48:44.018420 4895 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-p2nmk\" (UniqueName: \"kubernetes.io/projected/e203ec5f-dd45-44bb-97b2-fd8a548ce231-kube-api-access-p2nmk\") pod \"redhat-marketplace-vpwvp\" (UID: \"e203ec5f-dd45-44bb-97b2-fd8a548ce231\") " pod="openshift-marketplace/redhat-marketplace-vpwvp" Dec 02 07:48:44 crc kubenswrapper[4895]: I1202 07:48:44.018489 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e203ec5f-dd45-44bb-97b2-fd8a548ce231-utilities\") pod \"redhat-marketplace-vpwvp\" (UID: \"e203ec5f-dd45-44bb-97b2-fd8a548ce231\") " pod="openshift-marketplace/redhat-marketplace-vpwvp" Dec 02 07:48:44 crc kubenswrapper[4895]: I1202 07:48:44.019132 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e203ec5f-dd45-44bb-97b2-fd8a548ce231-catalog-content\") pod \"redhat-marketplace-vpwvp\" (UID: \"e203ec5f-dd45-44bb-97b2-fd8a548ce231\") " pod="openshift-marketplace/redhat-marketplace-vpwvp" Dec 02 07:48:44 crc kubenswrapper[4895]: I1202 07:48:44.019181 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e203ec5f-dd45-44bb-97b2-fd8a548ce231-utilities\") pod \"redhat-marketplace-vpwvp\" (UID: \"e203ec5f-dd45-44bb-97b2-fd8a548ce231\") " pod="openshift-marketplace/redhat-marketplace-vpwvp" Dec 02 07:48:44 crc kubenswrapper[4895]: I1202 07:48:44.109186 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2nmk\" (UniqueName: \"kubernetes.io/projected/e203ec5f-dd45-44bb-97b2-fd8a548ce231-kube-api-access-p2nmk\") pod \"redhat-marketplace-vpwvp\" (UID: \"e203ec5f-dd45-44bb-97b2-fd8a548ce231\") " pod="openshift-marketplace/redhat-marketplace-vpwvp" Dec 02 07:48:44 crc kubenswrapper[4895]: I1202 07:48:44.179243 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vpwvp" Dec 02 07:48:44 crc kubenswrapper[4895]: I1202 07:48:44.546856 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 02 07:48:44 crc kubenswrapper[4895]: I1202 07:48:44.682849 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Dec 02 07:48:44 crc kubenswrapper[4895]: I1202 07:48:44.683555 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="5b52e937-5b7e-4179-9766-20a9c2f93e35" containerName="openstackclient" containerID="cri-o://12ca3ab2f0b64acec9c85ff2dcb3769838a447a3fd301eb6eff49f9f575c5ccb" gracePeriod=2 Dec 02 07:48:44 crc kubenswrapper[4895]: I1202 07:48:44.728784 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Dec 02 07:48:44 crc kubenswrapper[4895]: I1202 07:48:44.803467 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 02 07:48:44 crc kubenswrapper[4895]: I1202 07:48:44.804289 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="9a3bcb64-db25-4f04-8624-af10542e9f10" containerName="openstack-network-exporter" containerID="cri-o://f92e86e4ef56e11c6550ddfd03d9e3a46bb2f030d0256069562686b8ad550a7f" gracePeriod=300 Dec 02 07:48:44 crc kubenswrapper[4895]: E1202 07:48:44.870796 4895 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Dec 02 07:48:44 crc kubenswrapper[4895]: E1202 07:48:44.870856 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0d1cb194-5325-40c2-bbd4-0a48821e12aa-config-data podName:0d1cb194-5325-40c2-bbd4-0a48821e12aa nodeName:}" failed. No retries permitted until 2025-12-02 07:48:45.370841455 +0000 UTC m=+1536.541701068 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/0d1cb194-5325-40c2-bbd4-0a48821e12aa-config-data") pod "rabbitmq-cell1-server-0" (UID: "0d1cb194-5325-40c2-bbd4-0a48821e12aa") : configmap "rabbitmq-cell1-config-data" not found Dec 02 07:48:44 crc kubenswrapper[4895]: I1202 07:48:44.929241 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron5a3b-account-delete-949mv"] Dec 02 07:48:44 crc kubenswrapper[4895]: E1202 07:48:44.929845 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b52e937-5b7e-4179-9766-20a9c2f93e35" containerName="openstackclient" Dec 02 07:48:44 crc kubenswrapper[4895]: I1202 07:48:44.929867 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b52e937-5b7e-4179-9766-20a9c2f93e35" containerName="openstackclient" Dec 02 07:48:44 crc kubenswrapper[4895]: I1202 07:48:44.930147 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b52e937-5b7e-4179-9766-20a9c2f93e35" containerName="openstackclient" Dec 02 07:48:44 crc kubenswrapper[4895]: I1202 07:48:44.941496 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron5a3b-account-delete-949mv" Dec 02 07:48:44 crc kubenswrapper[4895]: I1202 07:48:44.953809 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron5a3b-account-delete-949mv"] Dec 02 07:48:44 crc kubenswrapper[4895]: I1202 07:48:44.976566 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5cae5c9e-9159-4e78-9809-1801d0e35131-operator-scripts\") pod \"neutron5a3b-account-delete-949mv\" (UID: \"5cae5c9e-9159-4e78-9809-1801d0e35131\") " pod="openstack/neutron5a3b-account-delete-949mv" Dec 02 07:48:44 crc kubenswrapper[4895]: I1202 07:48:44.976701 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxxmx\" (UniqueName: \"kubernetes.io/projected/5cae5c9e-9159-4e78-9809-1801d0e35131-kube-api-access-sxxmx\") pod \"neutron5a3b-account-delete-949mv\" (UID: \"5cae5c9e-9159-4e78-9809-1801d0e35131\") " pod="openstack/neutron5a3b-account-delete-949mv" Dec 02 07:48:45 crc kubenswrapper[4895]: I1202 07:48:45.080154 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxxmx\" (UniqueName: \"kubernetes.io/projected/5cae5c9e-9159-4e78-9809-1801d0e35131-kube-api-access-sxxmx\") pod \"neutron5a3b-account-delete-949mv\" (UID: \"5cae5c9e-9159-4e78-9809-1801d0e35131\") " pod="openstack/neutron5a3b-account-delete-949mv" Dec 02 07:48:45 crc kubenswrapper[4895]: I1202 07:48:45.080271 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5cae5c9e-9159-4e78-9809-1801d0e35131-operator-scripts\") pod \"neutron5a3b-account-delete-949mv\" (UID: \"5cae5c9e-9159-4e78-9809-1801d0e35131\") " pod="openstack/neutron5a3b-account-delete-949mv" Dec 02 07:48:45 crc kubenswrapper[4895]: I1202 07:48:45.081628 4895 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5cae5c9e-9159-4e78-9809-1801d0e35131-operator-scripts\") pod \"neutron5a3b-account-delete-949mv\" (UID: \"5cae5c9e-9159-4e78-9809-1801d0e35131\") " pod="openstack/neutron5a3b-account-delete-949mv" Dec 02 07:48:45 crc kubenswrapper[4895]: I1202 07:48:45.172277 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxxmx\" (UniqueName: \"kubernetes.io/projected/5cae5c9e-9159-4e78-9809-1801d0e35131-kube-api-access-sxxmx\") pod \"neutron5a3b-account-delete-949mv\" (UID: \"5cae5c9e-9159-4e78-9809-1801d0e35131\") " pod="openstack/neutron5a3b-account-delete-949mv" Dec 02 07:48:45 crc kubenswrapper[4895]: I1202 07:48:45.200951 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder7d85-account-delete-j8sgc"] Dec 02 07:48:45 crc kubenswrapper[4895]: I1202 07:48:45.202578 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder7d85-account-delete-j8sgc" Dec 02 07:48:45 crc kubenswrapper[4895]: I1202 07:48:45.279852 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-9lrrh"] Dec 02 07:48:45 crc kubenswrapper[4895]: I1202 07:48:45.298880 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjzj5\" (UniqueName: \"kubernetes.io/projected/f28e5fd3-456b-4960-a3a9-1134e3eecb1f-kube-api-access-sjzj5\") pod \"cinder7d85-account-delete-j8sgc\" (UID: \"f28e5fd3-456b-4960-a3a9-1134e3eecb1f\") " pod="openstack/cinder7d85-account-delete-j8sgc" Dec 02 07:48:45 crc kubenswrapper[4895]: I1202 07:48:45.299109 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f28e5fd3-456b-4960-a3a9-1134e3eecb1f-operator-scripts\") pod \"cinder7d85-account-delete-j8sgc\" (UID: 
\"f28e5fd3-456b-4960-a3a9-1134e3eecb1f\") " pod="openstack/cinder7d85-account-delete-j8sgc" Dec 02 07:48:45 crc kubenswrapper[4895]: E1202 07:48:45.300712 4895 configmap.go:193] Couldn't get configMap openstack/ovncontroller-metrics-config: configmap "ovncontroller-metrics-config" not found Dec 02 07:48:45 crc kubenswrapper[4895]: I1202 07:48:45.308861 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-9lrrh"] Dec 02 07:48:45 crc kubenswrapper[4895]: E1202 07:48:45.312102 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8206622d-b224-4744-9358-ad7c10d98ca1-config podName:8206622d-b224-4744-9358-ad7c10d98ca1 nodeName:}" failed. No retries permitted until 2025-12-02 07:48:45.81206292 +0000 UTC m=+1536.982922533 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/8206622d-b224-4744-9358-ad7c10d98ca1-config") pod "ovn-controller-metrics-qvrkm" (UID: "8206622d-b224-4744-9358-ad7c10d98ca1") : configmap "ovncontroller-metrics-config" not found Dec 02 07:48:45 crc kubenswrapper[4895]: I1202 07:48:45.333594 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder7d85-account-delete-j8sgc"] Dec 02 07:48:45 crc kubenswrapper[4895]: I1202 07:48:45.348104 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron5a3b-account-delete-949mv" Dec 02 07:48:45 crc kubenswrapper[4895]: I1202 07:48:45.392862 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="9a3bcb64-db25-4f04-8624-af10542e9f10" containerName="ovsdbserver-sb" containerID="cri-o://00f29c5ae0bb6e7bc18499f6d66bee4cc18c2d48981f4cf8697279c90c4396ff" gracePeriod=300 Dec 02 07:48:45 crc kubenswrapper[4895]: I1202 07:48:45.401009 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sjzj5\" (UniqueName: \"kubernetes.io/projected/f28e5fd3-456b-4960-a3a9-1134e3eecb1f-kube-api-access-sjzj5\") pod \"cinder7d85-account-delete-j8sgc\" (UID: \"f28e5fd3-456b-4960-a3a9-1134e3eecb1f\") " pod="openstack/cinder7d85-account-delete-j8sgc" Dec 02 07:48:45 crc kubenswrapper[4895]: I1202 07:48:45.401158 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f28e5fd3-456b-4960-a3a9-1134e3eecb1f-operator-scripts\") pod \"cinder7d85-account-delete-j8sgc\" (UID: \"f28e5fd3-456b-4960-a3a9-1134e3eecb1f\") " pod="openstack/cinder7d85-account-delete-j8sgc" Dec 02 07:48:45 crc kubenswrapper[4895]: I1202 07:48:45.402085 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f28e5fd3-456b-4960-a3a9-1134e3eecb1f-operator-scripts\") pod \"cinder7d85-account-delete-j8sgc\" (UID: \"f28e5fd3-456b-4960-a3a9-1134e3eecb1f\") " pod="openstack/cinder7d85-account-delete-j8sgc" Dec 02 07:48:45 crc kubenswrapper[4895]: E1202 07:48:45.402162 4895 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Dec 02 07:48:45 crc kubenswrapper[4895]: E1202 07:48:45.402211 4895 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/0d1cb194-5325-40c2-bbd4-0a48821e12aa-config-data podName:0d1cb194-5325-40c2-bbd4-0a48821e12aa nodeName:}" failed. No retries permitted until 2025-12-02 07:48:46.402197082 +0000 UTC m=+1537.573056695 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/0d1cb194-5325-40c2-bbd4-0a48821e12aa-config-data") pod "rabbitmq-cell1-server-0" (UID: "0d1cb194-5325-40c2-bbd4-0a48821e12aa") : configmap "rabbitmq-cell1-config-data" not found Dec 02 07:48:45 crc kubenswrapper[4895]: I1202 07:48:45.411287 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Dec 02 07:48:45 crc kubenswrapper[4895]: I1202 07:48:45.411647 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="446b5a26-8e57-4765-bb7d-275cf05996dd" containerName="ovn-northd" containerID="cri-o://9c18aeb9311a3ffa5790c2f236b884d856db73bab542194f9a4509de984dba58" gracePeriod=30 Dec 02 07:48:45 crc kubenswrapper[4895]: I1202 07:48:45.412251 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="446b5a26-8e57-4765-bb7d-275cf05996dd" containerName="openstack-network-exporter" containerID="cri-o://8f7f80f7975fea79b1c3bcefa5a8a41052d690e193ab88673538d60ad2720b9a" gracePeriod=30 Dec 02 07:48:45 crc kubenswrapper[4895]: I1202 07:48:45.452813 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjzj5\" (UniqueName: \"kubernetes.io/projected/f28e5fd3-456b-4960-a3a9-1134e3eecb1f-kube-api-access-sjzj5\") pod \"cinder7d85-account-delete-j8sgc\" (UID: \"f28e5fd3-456b-4960-a3a9-1134e3eecb1f\") " pod="openstack/cinder7d85-account-delete-j8sgc" Dec 02 07:48:45 crc kubenswrapper[4895]: I1202 07:48:45.601569 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder7d85-account-delete-j8sgc" Dec 02 07:48:45 crc kubenswrapper[4895]: I1202 07:48:45.675321 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placementa8bc-account-delete-jz5nc"] Dec 02 07:48:45 crc kubenswrapper[4895]: I1202 07:48:45.701559 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placementa8bc-account-delete-jz5nc"] Dec 02 07:48:45 crc kubenswrapper[4895]: I1202 07:48:45.701710 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placementa8bc-account-delete-jz5nc" Dec 02 07:48:45 crc kubenswrapper[4895]: I1202 07:48:45.814992 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 02 07:48:45 crc kubenswrapper[4895]: I1202 07:48:45.820949 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7067a12f-0245-45f5-a806-591d5999c7f0-operator-scripts\") pod \"placementa8bc-account-delete-jz5nc\" (UID: \"7067a12f-0245-45f5-a806-591d5999c7f0\") " pod="openstack/placementa8bc-account-delete-jz5nc" Dec 02 07:48:45 crc kubenswrapper[4895]: I1202 07:48:45.821026 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvg65\" (UniqueName: \"kubernetes.io/projected/7067a12f-0245-45f5-a806-591d5999c7f0-kube-api-access-gvg65\") pod \"placementa8bc-account-delete-jz5nc\" (UID: \"7067a12f-0245-45f5-a806-591d5999c7f0\") " pod="openstack/placementa8bc-account-delete-jz5nc" Dec 02 07:48:45 crc kubenswrapper[4895]: E1202 07:48:45.821208 4895 configmap.go:193] Couldn't get configMap openstack/ovncontroller-metrics-config: configmap "ovncontroller-metrics-config" not found Dec 02 07:48:45 crc kubenswrapper[4895]: E1202 07:48:45.821249 4895 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/8206622d-b224-4744-9358-ad7c10d98ca1-config podName:8206622d-b224-4744-9358-ad7c10d98ca1 nodeName:}" failed. No retries permitted until 2025-12-02 07:48:46.82123387 +0000 UTC m=+1537.992093483 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/8206622d-b224-4744-9358-ad7c10d98ca1-config") pod "ovn-controller-metrics-qvrkm" (UID: "8206622d-b224-4744-9358-ad7c10d98ca1") : configmap "ovncontroller-metrics-config" not found Dec 02 07:48:45 crc kubenswrapper[4895]: I1202 07:48:45.879195 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance5583-account-delete-xm6hg"] Dec 02 07:48:45 crc kubenswrapper[4895]: I1202 07:48:45.882407 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance5583-account-delete-xm6hg" Dec 02 07:48:45 crc kubenswrapper[4895]: I1202 07:48:45.890415 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vpwvp" event={"ID":"e203ec5f-dd45-44bb-97b2-fd8a548ce231","Type":"ContainerStarted","Data":"63e3069f74e0c2f5114d28209eb436a5390bf351b874275c933348a22af2709a"} Dec 02 07:48:45 crc kubenswrapper[4895]: I1202 07:48:45.923628 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-44vd8"] Dec 02 07:48:45 crc kubenswrapper[4895]: I1202 07:48:45.924601 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvg65\" (UniqueName: \"kubernetes.io/projected/7067a12f-0245-45f5-a806-591d5999c7f0-kube-api-access-gvg65\") pod \"placementa8bc-account-delete-jz5nc\" (UID: \"7067a12f-0245-45f5-a806-591d5999c7f0\") " pod="openstack/placementa8bc-account-delete-jz5nc" Dec 02 07:48:45 crc kubenswrapper[4895]: I1202 07:48:45.924854 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/7067a12f-0245-45f5-a806-591d5999c7f0-operator-scripts\") pod \"placementa8bc-account-delete-jz5nc\" (UID: \"7067a12f-0245-45f5-a806-591d5999c7f0\") " pod="openstack/placementa8bc-account-delete-jz5nc" Dec 02 07:48:45 crc kubenswrapper[4895]: E1202 07:48:45.926400 4895 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Dec 02 07:48:45 crc kubenswrapper[4895]: E1202 07:48:45.926468 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ca98cba7-4127-4d25-a139-1a42224331f2-config-data podName:ca98cba7-4127-4d25-a139-1a42224331f2 nodeName:}" failed. No retries permitted until 2025-12-02 07:48:46.426451298 +0000 UTC m=+1537.597310911 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/ca98cba7-4127-4d25-a139-1a42224331f2-config-data") pod "rabbitmq-server-0" (UID: "ca98cba7-4127-4d25-a139-1a42224331f2") : configmap "rabbitmq-config-data" not found Dec 02 07:48:45 crc kubenswrapper[4895]: I1202 07:48:45.934690 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7067a12f-0245-45f5-a806-591d5999c7f0-operator-scripts\") pod \"placementa8bc-account-delete-jz5nc\" (UID: \"7067a12f-0245-45f5-a806-591d5999c7f0\") " pod="openstack/placementa8bc-account-delete-jz5nc" Dec 02 07:48:45 crc kubenswrapper[4895]: I1202 07:48:45.942866 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_9a3bcb64-db25-4f04-8624-af10542e9f10/ovsdbserver-sb/0.log" Dec 02 07:48:45 crc kubenswrapper[4895]: I1202 07:48:45.942959 4895 generic.go:334] "Generic (PLEG): container finished" podID="9a3bcb64-db25-4f04-8624-af10542e9f10" containerID="f92e86e4ef56e11c6550ddfd03d9e3a46bb2f030d0256069562686b8ad550a7f" exitCode=2 Dec 02 07:48:45 crc kubenswrapper[4895]: I1202 07:48:45.943487 4895 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"9a3bcb64-db25-4f04-8624-af10542e9f10","Type":"ContainerDied","Data":"f92e86e4ef56e11c6550ddfd03d9e3a46bb2f030d0256069562686b8ad550a7f"} Dec 02 07:48:45 crc kubenswrapper[4895]: I1202 07:48:45.982910 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-9vczq"] Dec 02 07:48:46 crc kubenswrapper[4895]: I1202 07:48:46.031412 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-zx9lx"] Dec 02 07:48:46 crc kubenswrapper[4895]: I1202 07:48:46.031807 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-89c5cd4d5-zx9lx" podUID="9c5f33d2-0416-40b5-8133-324aa1a60118" containerName="dnsmasq-dns" containerID="cri-o://9119b66d955826b5b5f6ec45ebae984f251b2adfcdada61729fc1808ffec5194" gracePeriod=10 Dec 02 07:48:46 crc kubenswrapper[4895]: I1202 07:48:46.082551 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-44vd8"] Dec 02 07:48:46 crc kubenswrapper[4895]: I1202 07:48:46.085844 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvg65\" (UniqueName: \"kubernetes.io/projected/7067a12f-0245-45f5-a806-591d5999c7f0-kube-api-access-gvg65\") pod \"placementa8bc-account-delete-jz5nc\" (UID: \"7067a12f-0245-45f5-a806-591d5999c7f0\") " pod="openstack/placementa8bc-account-delete-jz5nc" Dec 02 07:48:46 crc kubenswrapper[4895]: I1202 07:48:46.138472 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6b3c2445-8bce-4d09-ad86-02c1ba6495fb-operator-scripts\") pod \"glance5583-account-delete-xm6hg\" (UID: \"6b3c2445-8bce-4d09-ad86-02c1ba6495fb\") " pod="openstack/glance5583-account-delete-xm6hg" Dec 02 07:48:46 crc kubenswrapper[4895]: I1202 07:48:46.138578 4895 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjnqw\" (UniqueName: \"kubernetes.io/projected/6b3c2445-8bce-4d09-ad86-02c1ba6495fb-kube-api-access-hjnqw\") pod \"glance5583-account-delete-xm6hg\" (UID: \"6b3c2445-8bce-4d09-ad86-02c1ba6495fb\") " pod="openstack/glance5583-account-delete-xm6hg" Dec 02 07:48:46 crc kubenswrapper[4895]: I1202 07:48:46.139688 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance5583-account-delete-xm6hg"] Dec 02 07:48:46 crc kubenswrapper[4895]: I1202 07:48:46.190798 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placementa8bc-account-delete-jz5nc" Dec 02 07:48:46 crc kubenswrapper[4895]: I1202 07:48:46.222890 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vpwvp"] Dec 02 07:48:46 crc kubenswrapper[4895]: I1202 07:48:46.243195 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hjnqw\" (UniqueName: \"kubernetes.io/projected/6b3c2445-8bce-4d09-ad86-02c1ba6495fb-kube-api-access-hjnqw\") pod \"glance5583-account-delete-xm6hg\" (UID: \"6b3c2445-8bce-4d09-ad86-02c1ba6495fb\") " pod="openstack/glance5583-account-delete-xm6hg" Dec 02 07:48:46 crc kubenswrapper[4895]: I1202 07:48:46.243374 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6b3c2445-8bce-4d09-ad86-02c1ba6495fb-operator-scripts\") pod \"glance5583-account-delete-xm6hg\" (UID: \"6b3c2445-8bce-4d09-ad86-02c1ba6495fb\") " pod="openstack/glance5583-account-delete-xm6hg" Dec 02 07:48:46 crc kubenswrapper[4895]: E1202 07:48:46.262871 4895 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="9c18aeb9311a3ffa5790c2f236b884d856db73bab542194f9a4509de984dba58" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Dec 02 07:48:46 crc kubenswrapper[4895]: I1202 07:48:46.299126 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6b3c2445-8bce-4d09-ad86-02c1ba6495fb-operator-scripts\") pod \"glance5583-account-delete-xm6hg\" (UID: \"6b3c2445-8bce-4d09-ad86-02c1ba6495fb\") " pod="openstack/glance5583-account-delete-xm6hg" Dec 02 07:48:46 crc kubenswrapper[4895]: E1202 07:48:46.311888 4895 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9c18aeb9311a3ffa5790c2f236b884d856db73bab542194f9a4509de984dba58" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Dec 02 07:48:46 crc kubenswrapper[4895]: E1202 07:48:46.322353 4895 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9c18aeb9311a3ffa5790c2f236b884d856db73bab542194f9a4509de984dba58" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Dec 02 07:48:46 crc kubenswrapper[4895]: E1202 07:48:46.322433 4895 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-northd-0" podUID="446b5a26-8e57-4765-bb7d-275cf05996dd" containerName="ovn-northd" Dec 02 07:48:46 crc kubenswrapper[4895]: I1202 07:48:46.329216 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican4aa4-account-delete-pvmbl"] Dec 02 07:48:46 crc kubenswrapper[4895]: I1202 07:48:46.330775 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican4aa4-account-delete-pvmbl" Dec 02 07:48:46 crc kubenswrapper[4895]: I1202 07:48:46.366980 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican4aa4-account-delete-pvmbl"] Dec 02 07:48:46 crc kubenswrapper[4895]: I1202 07:48:46.387263 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjnqw\" (UniqueName: \"kubernetes.io/projected/6b3c2445-8bce-4d09-ad86-02c1ba6495fb-kube-api-access-hjnqw\") pod \"glance5583-account-delete-xm6hg\" (UID: \"6b3c2445-8bce-4d09-ad86-02c1ba6495fb\") " pod="openstack/glance5583-account-delete-xm6hg" Dec 02 07:48:46 crc kubenswrapper[4895]: I1202 07:48:46.406055 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-prmpk"] Dec 02 07:48:46 crc kubenswrapper[4895]: I1202 07:48:46.425138 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-prmpk"] Dec 02 07:48:46 crc kubenswrapper[4895]: I1202 07:48:46.457378 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2fm8\" (UniqueName: \"kubernetes.io/projected/d42411e0-2228-4a1a-9d31-e3788f2b1f0c-kube-api-access-s2fm8\") pod \"barbican4aa4-account-delete-pvmbl\" (UID: \"d42411e0-2228-4a1a-9d31-e3788f2b1f0c\") " pod="openstack/barbican4aa4-account-delete-pvmbl" Dec 02 07:48:46 crc kubenswrapper[4895]: I1202 07:48:46.457557 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d42411e0-2228-4a1a-9d31-e3788f2b1f0c-operator-scripts\") pod \"barbican4aa4-account-delete-pvmbl\" (UID: \"d42411e0-2228-4a1a-9d31-e3788f2b1f0c\") " pod="openstack/barbican4aa4-account-delete-pvmbl" Dec 02 07:48:46 crc kubenswrapper[4895]: E1202 07:48:46.457703 4895 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap 
"rabbitmq-cell1-config-data" not found Dec 02 07:48:46 crc kubenswrapper[4895]: E1202 07:48:46.457787 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0d1cb194-5325-40c2-bbd4-0a48821e12aa-config-data podName:0d1cb194-5325-40c2-bbd4-0a48821e12aa nodeName:}" failed. No retries permitted until 2025-12-02 07:48:48.457768154 +0000 UTC m=+1539.628627767 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/0d1cb194-5325-40c2-bbd4-0a48821e12aa-config-data") pod "rabbitmq-cell1-server-0" (UID: "0d1cb194-5325-40c2-bbd4-0a48821e12aa") : configmap "rabbitmq-cell1-config-data" not found Dec 02 07:48:46 crc kubenswrapper[4895]: E1202 07:48:46.458049 4895 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Dec 02 07:48:46 crc kubenswrapper[4895]: E1202 07:48:46.458139 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ca98cba7-4127-4d25-a139-1a42224331f2-config-data podName:ca98cba7-4127-4d25-a139-1a42224331f2 nodeName:}" failed. No retries permitted until 2025-12-02 07:48:47.458117214 +0000 UTC m=+1538.628976827 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/ca98cba7-4127-4d25-a139-1a42224331f2-config-data") pod "rabbitmq-server-0" (UID: "ca98cba7-4127-4d25-a139-1a42224331f2") : configmap "rabbitmq-config-data" not found Dec 02 07:48:46 crc kubenswrapper[4895]: I1202 07:48:46.514561 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-vjgr8"] Dec 02 07:48:46 crc kubenswrapper[4895]: I1202 07:48:46.530914 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-qvrkm"] Dec 02 07:48:46 crc kubenswrapper[4895]: I1202 07:48:46.531429 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-metrics-qvrkm" podUID="8206622d-b224-4744-9358-ad7c10d98ca1" containerName="openstack-network-exporter" containerID="cri-o://06ff8339033df96a83077a62e929bc5d1df2839a17c03cd0e1008d9758abd8ec" gracePeriod=30 Dec 02 07:48:46 crc kubenswrapper[4895]: I1202 07:48:46.544261 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ftfwq"] Dec 02 07:48:46 crc kubenswrapper[4895]: I1202 07:48:46.549902 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-ddf8948cc-h2bbh"] Dec 02 07:48:46 crc kubenswrapper[4895]: I1202 07:48:46.550293 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-ddf8948cc-h2bbh" podUID="ab5ec753-410a-4d4b-8071-ce60970ba4df" containerName="neutron-api" containerID="cri-o://44ae8909515453d51c81fc2eab9723fc18e5cf8dc79ec16427db8d716e2d75dd" gracePeriod=30 Dec 02 07:48:46 crc kubenswrapper[4895]: I1202 07:48:46.550488 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-ddf8948cc-h2bbh" podUID="ab5ec753-410a-4d4b-8071-ce60970ba4df" containerName="neutron-httpd" containerID="cri-o://949ad4d21813d885979595286daba6ad241d3bf3aac10ca8c334398ba63d2324" gracePeriod=30 Dec 02 07:48:46 crc 
kubenswrapper[4895]: I1202 07:48:46.562031 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d42411e0-2228-4a1a-9d31-e3788f2b1f0c-operator-scripts\") pod \"barbican4aa4-account-delete-pvmbl\" (UID: \"d42411e0-2228-4a1a-9d31-e3788f2b1f0c\") " pod="openstack/barbican4aa4-account-delete-pvmbl" Dec 02 07:48:46 crc kubenswrapper[4895]: I1202 07:48:46.563143 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2fm8\" (UniqueName: \"kubernetes.io/projected/d42411e0-2228-4a1a-9d31-e3788f2b1f0c-kube-api-access-s2fm8\") pod \"barbican4aa4-account-delete-pvmbl\" (UID: \"d42411e0-2228-4a1a-9d31-e3788f2b1f0c\") " pod="openstack/barbican4aa4-account-delete-pvmbl" Dec 02 07:48:46 crc kubenswrapper[4895]: I1202 07:48:46.564569 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d42411e0-2228-4a1a-9d31-e3788f2b1f0c-operator-scripts\") pod \"barbican4aa4-account-delete-pvmbl\" (UID: \"d42411e0-2228-4a1a-9d31-e3788f2b1f0c\") " pod="openstack/barbican4aa4-account-delete-pvmbl" Dec 02 07:48:46 crc kubenswrapper[4895]: I1202 07:48:46.571173 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-vjgr8"] Dec 02 07:48:46 crc kubenswrapper[4895]: I1202 07:48:46.591438 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-xdfqx"] Dec 02 07:48:46 crc kubenswrapper[4895]: I1202 07:48:46.595635 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance5583-account-delete-xm6hg" Dec 02 07:48:46 crc kubenswrapper[4895]: I1202 07:48:46.602068 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 02 07:48:46 crc kubenswrapper[4895]: I1202 07:48:46.602779 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="93ac7640-b11c-48f4-b537-45bebe4af01b" containerName="openstack-network-exporter" containerID="cri-o://702d499c8eb77b3784f109189f3605813a2864341fb22907bb8c80622cf297f0" gracePeriod=300 Dec 02 07:48:46 crc kubenswrapper[4895]: I1202 07:48:46.608078 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2fm8\" (UniqueName: \"kubernetes.io/projected/d42411e0-2228-4a1a-9d31-e3788f2b1f0c-kube-api-access-s2fm8\") pod \"barbican4aa4-account-delete-pvmbl\" (UID: \"d42411e0-2228-4a1a-9d31-e3788f2b1f0c\") " pod="openstack/barbican4aa4-account-delete-pvmbl" Dec 02 07:48:46 crc kubenswrapper[4895]: I1202 07:48:46.639650 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-74rn2"] Dec 02 07:48:46 crc kubenswrapper[4895]: I1202 07:48:46.651418 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-xdfqx"] Dec 02 07:48:46 crc kubenswrapper[4895]: I1202 07:48:46.679886 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-74rn2"] Dec 02 07:48:46 crc kubenswrapper[4895]: I1202 07:48:46.697105 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"] Dec 02 07:48:46 crc kubenswrapper[4895]: I1202 07:48:46.698881 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="11b8ece5-4192-4e13-a1c7-86ed3c627ddf" containerName="account-server" containerID="cri-o://8d7d30533f5cf82d2d0c96a4a07759e65bd3a99d6c9ea5aff2ebef3f2b8c14c4" gracePeriod=30 Dec 02 07:48:46 crc 
kubenswrapper[4895]: I1202 07:48:46.699046 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="11b8ece5-4192-4e13-a1c7-86ed3c627ddf" containerName="container-updater" containerID="cri-o://888d3356ae1d79bdd97a607512a80b88b92e4f4d410a00d50371b35e64a5142d" gracePeriod=30 Dec 02 07:48:46 crc kubenswrapper[4895]: I1202 07:48:46.699074 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="11b8ece5-4192-4e13-a1c7-86ed3c627ddf" containerName="object-server" containerID="cri-o://d16cb117b475cbe7eca7173bb117167934dc524dc42dabe3df6e81fc2b6e379b" gracePeriod=30 Dec 02 07:48:46 crc kubenswrapper[4895]: I1202 07:48:46.699099 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="11b8ece5-4192-4e13-a1c7-86ed3c627ddf" containerName="container-auditor" containerID="cri-o://e3b88de13161d2c3c54de60370d9bf827fce15fa277067e0d26ffdbf3decddf8" gracePeriod=30 Dec 02 07:48:46 crc kubenswrapper[4895]: I1202 07:48:46.699140 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="11b8ece5-4192-4e13-a1c7-86ed3c627ddf" containerName="container-replicator" containerID="cri-o://089c0b3d1c4ddc2fa892f504974b05872110b8d4c58cb70cafbaa74e93b8f452" gracePeriod=30 Dec 02 07:48:46 crc kubenswrapper[4895]: I1202 07:48:46.699179 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="11b8ece5-4192-4e13-a1c7-86ed3c627ddf" containerName="container-server" containerID="cri-o://fc2eeaa58100482b1a1ad56b93b5adeb32cce704c0f70987468501c900cb3962" gracePeriod=30 Dec 02 07:48:46 crc kubenswrapper[4895]: I1202 07:48:46.699249 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="11b8ece5-4192-4e13-a1c7-86ed3c627ddf" containerName="account-reaper" 
containerID="cri-o://f6567d126c5ab9260bebb4b4a822d71e559ec543cfdf3fa7202150e3115569cc" gracePeriod=30 Dec 02 07:48:46 crc kubenswrapper[4895]: I1202 07:48:46.699299 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="11b8ece5-4192-4e13-a1c7-86ed3c627ddf" containerName="account-auditor" containerID="cri-o://0b8c5691b63ae4c345789ff614121edd0fbac8d28ec4dd714cbad15af4ead78a" gracePeriod=30 Dec 02 07:48:46 crc kubenswrapper[4895]: I1202 07:48:46.699352 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="11b8ece5-4192-4e13-a1c7-86ed3c627ddf" containerName="account-replicator" containerID="cri-o://0b782ab48a476bfcb22366e4a8e52dc20222254cc4ec4ca87a05e85213f1e6e8" gracePeriod=30 Dec 02 07:48:46 crc kubenswrapper[4895]: I1202 07:48:46.699400 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="11b8ece5-4192-4e13-a1c7-86ed3c627ddf" containerName="object-expirer" containerID="cri-o://a772c3088bf7934e4656b200f802e6851dd25d0e8355f14cbbc3a035463513c0" gracePeriod=30 Dec 02 07:48:46 crc kubenswrapper[4895]: I1202 07:48:46.699482 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="11b8ece5-4192-4e13-a1c7-86ed3c627ddf" containerName="object-auditor" containerID="cri-o://79658c290b2b8a920d6b1879c4cfd278d8b997d5f9b72c39a6d6c7310f20f615" gracePeriod=30 Dec 02 07:48:46 crc kubenswrapper[4895]: I1202 07:48:46.699525 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="11b8ece5-4192-4e13-a1c7-86ed3c627ddf" containerName="object-updater" containerID="cri-o://81270f78df9b0d8cee1ae380d9bd934ce978faa7f9860ae475f4316e91185bda" gracePeriod=30 Dec 02 07:48:46 crc kubenswrapper[4895]: I1202 07:48:46.699629 4895 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/swift-storage-0" podUID="11b8ece5-4192-4e13-a1c7-86ed3c627ddf" containerName="object-replicator" containerID="cri-o://d0ab0e1bcdef49a9178a125186928df5512dbbec58106b2c12e7d8acd9b931e4" gracePeriod=30 Dec 02 07:48:46 crc kubenswrapper[4895]: I1202 07:48:46.699633 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="11b8ece5-4192-4e13-a1c7-86ed3c627ddf" containerName="swift-recon-cron" containerID="cri-o://7c1e2c74168dac752cdee201c2e0c1b2faf7132d8e553780fa6dba40aeeeaa1e" gracePeriod=30 Dec 02 07:48:46 crc kubenswrapper[4895]: I1202 07:48:46.699707 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="11b8ece5-4192-4e13-a1c7-86ed3c627ddf" containerName="rsync" containerID="cri-o://5ced108d1ab8442c1fac1fe0fc3c7939f98a90c737db7a6f1aced0c0edb070a4" gracePeriod=30 Dec 02 07:48:46 crc kubenswrapper[4895]: I1202 07:48:46.720546 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 02 07:48:46 crc kubenswrapper[4895]: I1202 07:48:46.720964 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="836bba81-425e-4610-b191-2bbb2cfc1f79" containerName="cinder-scheduler" containerID="cri-o://bd9e831f88d074ed4ebcb3f0c21947564533211ce824af698b616217e7b83e86" gracePeriod=30 Dec 02 07:48:46 crc kubenswrapper[4895]: I1202 07:48:46.721145 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="836bba81-425e-4610-b191-2bbb2cfc1f79" containerName="probe" containerID="cri-o://8a861d47b18ce485f266fa0a57adf3455c385cacb617b37e3b45a4bd17799c71" gracePeriod=30 Dec 02 07:48:46 crc kubenswrapper[4895]: I1202 07:48:46.724940 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="93ac7640-b11c-48f4-b537-45bebe4af01b" 
containerName="ovsdbserver-nb" containerID="cri-o://37e31157e4862cb11e5acde395d6bd0df5dd7a9d1818a4a02968a675039e325e" gracePeriod=300 Dec 02 07:48:46 crc kubenswrapper[4895]: I1202 07:48:46.750656 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/novaapi23cb-account-delete-g8msv"] Dec 02 07:48:46 crc kubenswrapper[4895]: I1202 07:48:46.752445 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/novaapi23cb-account-delete-g8msv" Dec 02 07:48:46 crc kubenswrapper[4895]: I1202 07:48:46.880195 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bb4hb\" (UniqueName: \"kubernetes.io/projected/96831697-ba2e-477e-954f-e4ad0cf30f92-kube-api-access-bb4hb\") pod \"novaapi23cb-account-delete-g8msv\" (UID: \"96831697-ba2e-477e-954f-e4ad0cf30f92\") " pod="openstack/novaapi23cb-account-delete-g8msv" Dec 02 07:48:46 crc kubenswrapper[4895]: I1202 07:48:46.880342 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/96831697-ba2e-477e-954f-e4ad0cf30f92-operator-scripts\") pod \"novaapi23cb-account-delete-g8msv\" (UID: \"96831697-ba2e-477e-954f-e4ad0cf30f92\") " pod="openstack/novaapi23cb-account-delete-g8msv" Dec 02 07:48:46 crc kubenswrapper[4895]: E1202 07:48:46.880560 4895 configmap.go:193] Couldn't get configMap openstack/ovncontroller-metrics-config: configmap "ovncontroller-metrics-config" not found Dec 02 07:48:46 crc kubenswrapper[4895]: E1202 07:48:46.880612 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8206622d-b224-4744-9358-ad7c10d98ca1-config podName:8206622d-b224-4744-9358-ad7c10d98ca1 nodeName:}" failed. No retries permitted until 2025-12-02 07:48:48.88059572 +0000 UTC m=+1540.051455333 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/8206622d-b224-4744-9358-ad7c10d98ca1-config") pod "ovn-controller-metrics-qvrkm" (UID: "8206622d-b224-4744-9358-ad7c10d98ca1") : configmap "ovncontroller-metrics-config" not found Dec 02 07:48:46 crc kubenswrapper[4895]: I1202 07:48:46.895128 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 02 07:48:46 crc kubenswrapper[4895]: I1202 07:48:46.946638 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="ebbed0ba-1d44-4421-a276-b075b0f31c3f" containerName="cinder-api-log" containerID="cri-o://e169ef89006889ec6af2f91025c33437d207c43337cbf66cac7f9e3e6b3263f5" gracePeriod=30 Dec 02 07:48:46 crc kubenswrapper[4895]: I1202 07:48:46.948807 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="ebbed0ba-1d44-4421-a276-b075b0f31c3f" containerName="cinder-api" containerID="cri-o://3a0d36cdfb3f77e74dda0c49d0558e6c7571700d4bfd6cdaa1acbb5f35e6a972" gracePeriod=30 Dec 02 07:48:46 crc kubenswrapper[4895]: I1202 07:48:46.975668 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/novaapi23cb-account-delete-g8msv"] Dec 02 07:48:47 crc kubenswrapper[4895]: I1202 07:48:47.004411 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-tx8bx"] Dec 02 07:48:47 crc kubenswrapper[4895]: I1202 07:48:47.012267 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/96831697-ba2e-477e-954f-e4ad0cf30f92-operator-scripts\") pod \"novaapi23cb-account-delete-g8msv\" (UID: \"96831697-ba2e-477e-954f-e4ad0cf30f92\") " pod="openstack/novaapi23cb-account-delete-g8msv" Dec 02 07:48:47 crc kubenswrapper[4895]: I1202 07:48:47.012855 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-bb4hb\" (UniqueName: \"kubernetes.io/projected/96831697-ba2e-477e-954f-e4ad0cf30f92-kube-api-access-bb4hb\") pod \"novaapi23cb-account-delete-g8msv\" (UID: \"96831697-ba2e-477e-954f-e4ad0cf30f92\") " pod="openstack/novaapi23cb-account-delete-g8msv" Dec 02 07:48:47 crc kubenswrapper[4895]: I1202 07:48:47.038997 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/96831697-ba2e-477e-954f-e4ad0cf30f92-operator-scripts\") pod \"novaapi23cb-account-delete-g8msv\" (UID: \"96831697-ba2e-477e-954f-e4ad0cf30f92\") " pod="openstack/novaapi23cb-account-delete-g8msv" Dec 02 07:48:47 crc kubenswrapper[4895]: I1202 07:48:47.046207 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bb4hb\" (UniqueName: \"kubernetes.io/projected/96831697-ba2e-477e-954f-e4ad0cf30f92-kube-api-access-bb4hb\") pod \"novaapi23cb-account-delete-g8msv\" (UID: \"96831697-ba2e-477e-954f-e4ad0cf30f92\") " pod="openstack/novaapi23cb-account-delete-g8msv" Dec 02 07:48:47 crc kubenswrapper[4895]: I1202 07:48:47.095309 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-tx8bx"] Dec 02 07:48:47 crc kubenswrapper[4895]: I1202 07:48:47.352108 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07cedd96-e60e-40e6-9ae7-c29728b9e62c" path="/var/lib/kubelet/pods/07cedd96-e60e-40e6-9ae7-c29728b9e62c/volumes" Dec 02 07:48:47 crc kubenswrapper[4895]: I1202 07:48:47.354098 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f3db479-516c-46b3-881d-7021a15c7a7d" path="/var/lib/kubelet/pods/4f3db479-516c-46b3-881d-7021a15c7a7d/volumes" Dec 02 07:48:47 crc kubenswrapper[4895]: I1202 07:48:47.389822 4895 generic.go:334] "Generic (PLEG): container finished" podID="11b8ece5-4192-4e13-a1c7-86ed3c627ddf" containerID="a772c3088bf7934e4656b200f802e6851dd25d0e8355f14cbbc3a035463513c0" exitCode=0 Dec 02 07:48:47 
crc kubenswrapper[4895]: I1202 07:48:47.389860 4895 generic.go:334] "Generic (PLEG): container finished" podID="11b8ece5-4192-4e13-a1c7-86ed3c627ddf" containerID="81270f78df9b0d8cee1ae380d9bd934ce978faa7f9860ae475f4316e91185bda" exitCode=0 Dec 02 07:48:47 crc kubenswrapper[4895]: I1202 07:48:47.389869 4895 generic.go:334] "Generic (PLEG): container finished" podID="11b8ece5-4192-4e13-a1c7-86ed3c627ddf" containerID="79658c290b2b8a920d6b1879c4cfd278d8b997d5f9b72c39a6d6c7310f20f615" exitCode=0 Dec 02 07:48:47 crc kubenswrapper[4895]: I1202 07:48:47.389878 4895 generic.go:334] "Generic (PLEG): container finished" podID="11b8ece5-4192-4e13-a1c7-86ed3c627ddf" containerID="d0ab0e1bcdef49a9178a125186928df5512dbbec58106b2c12e7d8acd9b931e4" exitCode=0 Dec 02 07:48:47 crc kubenswrapper[4895]: I1202 07:48:47.389886 4895 generic.go:334] "Generic (PLEG): container finished" podID="11b8ece5-4192-4e13-a1c7-86ed3c627ddf" containerID="888d3356ae1d79bdd97a607512a80b88b92e4f4d410a00d50371b35e64a5142d" exitCode=0 Dec 02 07:48:47 crc kubenswrapper[4895]: I1202 07:48:47.389900 4895 generic.go:334] "Generic (PLEG): container finished" podID="11b8ece5-4192-4e13-a1c7-86ed3c627ddf" containerID="e3b88de13161d2c3c54de60370d9bf827fce15fa277067e0d26ffdbf3decddf8" exitCode=0 Dec 02 07:48:47 crc kubenswrapper[4895]: I1202 07:48:47.389909 4895 generic.go:334] "Generic (PLEG): container finished" podID="11b8ece5-4192-4e13-a1c7-86ed3c627ddf" containerID="089c0b3d1c4ddc2fa892f504974b05872110b8d4c58cb70cafbaa74e93b8f452" exitCode=0 Dec 02 07:48:47 crc kubenswrapper[4895]: I1202 07:48:47.389916 4895 generic.go:334] "Generic (PLEG): container finished" podID="11b8ece5-4192-4e13-a1c7-86ed3c627ddf" containerID="f6567d126c5ab9260bebb4b4a822d71e559ec543cfdf3fa7202150e3115569cc" exitCode=0 Dec 02 07:48:47 crc kubenswrapper[4895]: I1202 07:48:47.389923 4895 generic.go:334] "Generic (PLEG): container finished" podID="11b8ece5-4192-4e13-a1c7-86ed3c627ddf" 
containerID="0b8c5691b63ae4c345789ff614121edd0fbac8d28ec4dd714cbad15af4ead78a" exitCode=0 Dec 02 07:48:47 crc kubenswrapper[4895]: I1202 07:48:47.389930 4895 generic.go:334] "Generic (PLEG): container finished" podID="11b8ece5-4192-4e13-a1c7-86ed3c627ddf" containerID="0b782ab48a476bfcb22366e4a8e52dc20222254cc4ec4ca87a05e85213f1e6e8" exitCode=0 Dec 02 07:48:47 crc kubenswrapper[4895]: I1202 07:48:47.405855 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-qvrkm_8206622d-b224-4744-9358-ad7c10d98ca1/openstack-network-exporter/0.log" Dec 02 07:48:47 crc kubenswrapper[4895]: I1202 07:48:47.405901 4895 generic.go:334] "Generic (PLEG): container finished" podID="8206622d-b224-4744-9358-ad7c10d98ca1" containerID="06ff8339033df96a83077a62e929bc5d1df2839a17c03cd0e1008d9758abd8ec" exitCode=2 Dec 02 07:48:47 crc kubenswrapper[4895]: I1202 07:48:47.433035 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56723c9c-15bf-4eaa-896c-ea5d07066b27" path="/var/lib/kubelet/pods/56723c9c-15bf-4eaa-896c-ea5d07066b27/volumes" Dec 02 07:48:47 crc kubenswrapper[4895]: I1202 07:48:47.434668 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d3d0dfa-b0dd-4b27-8751-3483a85dc490" path="/var/lib/kubelet/pods/6d3d0dfa-b0dd-4b27-8751-3483a85dc490/volumes" Dec 02 07:48:47 crc kubenswrapper[4895]: I1202 07:48:47.435336 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96ece5f3-3dc5-41db-a8e9-37e6f9054dd8" path="/var/lib/kubelet/pods/96ece5f3-3dc5-41db-a8e9-37e6f9054dd8/volumes" Dec 02 07:48:47 crc kubenswrapper[4895]: I1202 07:48:47.436119 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a756fe09-2c73-430d-be27-34caa885311c" path="/var/lib/kubelet/pods/a756fe09-2c73-430d-be27-34caa885311c/volumes" Dec 02 07:48:47 crc kubenswrapper[4895]: I1202 07:48:47.456213 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="caa608fc-52f9-426b-aca3-610fe5e245e0" path="/var/lib/kubelet/pods/caa608fc-52f9-426b-aca3-610fe5e245e0/volumes" Dec 02 07:48:47 crc kubenswrapper[4895]: I1202 07:48:47.457339 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-h9wgm"] Dec 02 07:48:47 crc kubenswrapper[4895]: I1202 07:48:47.457374 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"11b8ece5-4192-4e13-a1c7-86ed3c627ddf","Type":"ContainerDied","Data":"a772c3088bf7934e4656b200f802e6851dd25d0e8355f14cbbc3a035463513c0"} Dec 02 07:48:47 crc kubenswrapper[4895]: I1202 07:48:47.457403 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"11b8ece5-4192-4e13-a1c7-86ed3c627ddf","Type":"ContainerDied","Data":"81270f78df9b0d8cee1ae380d9bd934ce978faa7f9860ae475f4316e91185bda"} Dec 02 07:48:47 crc kubenswrapper[4895]: I1202 07:48:47.457422 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-h9wgm"] Dec 02 07:48:47 crc kubenswrapper[4895]: I1202 07:48:47.457440 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"11b8ece5-4192-4e13-a1c7-86ed3c627ddf","Type":"ContainerDied","Data":"79658c290b2b8a920d6b1879c4cfd278d8b997d5f9b72c39a6d6c7310f20f615"} Dec 02 07:48:47 crc kubenswrapper[4895]: I1202 07:48:47.457451 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"11b8ece5-4192-4e13-a1c7-86ed3c627ddf","Type":"ContainerDied","Data":"d0ab0e1bcdef49a9178a125186928df5512dbbec58106b2c12e7d8acd9b931e4"} Dec 02 07:48:47 crc kubenswrapper[4895]: I1202 07:48:47.457469 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-54957dcd96-7sx87"] Dec 02 07:48:47 crc kubenswrapper[4895]: I1202 07:48:47.457489 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"11b8ece5-4192-4e13-a1c7-86ed3c627ddf","Type":"ContainerDied","Data":"888d3356ae1d79bdd97a607512a80b88b92e4f4d410a00d50371b35e64a5142d"} Dec 02 07:48:47 crc kubenswrapper[4895]: I1202 07:48:47.457499 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"11b8ece5-4192-4e13-a1c7-86ed3c627ddf","Type":"ContainerDied","Data":"e3b88de13161d2c3c54de60370d9bf827fce15fa277067e0d26ffdbf3decddf8"} Dec 02 07:48:47 crc kubenswrapper[4895]: I1202 07:48:47.457510 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"11b8ece5-4192-4e13-a1c7-86ed3c627ddf","Type":"ContainerDied","Data":"089c0b3d1c4ddc2fa892f504974b05872110b8d4c58cb70cafbaa74e93b8f452"} Dec 02 07:48:47 crc kubenswrapper[4895]: I1202 07:48:47.457524 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"11b8ece5-4192-4e13-a1c7-86ed3c627ddf","Type":"ContainerDied","Data":"f6567d126c5ab9260bebb4b4a822d71e559ec543cfdf3fa7202150e3115569cc"} Dec 02 07:48:47 crc kubenswrapper[4895]: I1202 07:48:47.457537 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"11b8ece5-4192-4e13-a1c7-86ed3c627ddf","Type":"ContainerDied","Data":"0b8c5691b63ae4c345789ff614121edd0fbac8d28ec4dd714cbad15af4ead78a"} Dec 02 07:48:47 crc kubenswrapper[4895]: I1202 07:48:47.457549 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"11b8ece5-4192-4e13-a1c7-86ed3c627ddf","Type":"ContainerDied","Data":"0b782ab48a476bfcb22366e4a8e52dc20222254cc4ec4ca87a05e85213f1e6e8"} Dec 02 07:48:47 crc kubenswrapper[4895]: I1202 07:48:47.457564 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-qvrkm" event={"ID":"8206622d-b224-4744-9358-ad7c10d98ca1","Type":"ContainerDied","Data":"06ff8339033df96a83077a62e929bc5d1df2839a17c03cd0e1008d9758abd8ec"} Dec 02 07:48:47 crc kubenswrapper[4895]: 
I1202 07:48:47.457877 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-54957dcd96-7sx87" podUID="68bddf66-0b9f-4bc8-916b-aa0abfbf13c3" containerName="placement-log" containerID="cri-o://4bce6feae18b88a0dade864ed7f4db319704698221a61e3defcf26b5f9e0a73e" gracePeriod=30 Dec 02 07:48:47 crc kubenswrapper[4895]: E1202 07:48:47.458423 4895 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9c5f33d2_0416_40b5_8133_324aa1a60118.slice/crio-9119b66d955826b5b5f6ec45ebae984f251b2adfcdada61729fc1808ffec5194.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod11b8ece5_4192_4e13_a1c7_86ed3c627ddf.slice/crio-conmon-0b8c5691b63ae4c345789ff614121edd0fbac8d28ec4dd714cbad15af4ead78a.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod11b8ece5_4192_4e13_a1c7_86ed3c627ddf.slice/crio-e3b88de13161d2c3c54de60370d9bf827fce15fa277067e0d26ffdbf3decddf8.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod93ac7640_b11c_48f4_b537_45bebe4af01b.slice/crio-37e31157e4862cb11e5acde395d6bd0df5dd7a9d1818a4a02968a675039e325e.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod11b8ece5_4192_4e13_a1c7_86ed3c627ddf.slice/crio-d0ab0e1bcdef49a9178a125186928df5512dbbec58106b2c12e7d8acd9b931e4.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podebbed0ba_1d44_4421_a276_b075b0f31c3f.slice/crio-e169ef89006889ec6af2f91025c33437d207c43337cbf66cac7f9e3e6b3263f5.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod11b8ece5_4192_4e13_a1c7_86ed3c627ddf.slice/crio-81270f78df9b0d8cee1ae380d9bd934ce978faa7f9860ae475f4316e91185bda.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod11b8ece5_4192_4e13_a1c7_86ed3c627ddf.slice/crio-conmon-79658c290b2b8a920d6b1879c4cfd278d8b997d5f9b72c39a6d6c7310f20f615.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podab5ec753_410a_4d4b_8071_ce60970ba4df.slice/crio-949ad4d21813d885979595286daba6ad241d3bf3aac10ca8c334398ba63d2324.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod11b8ece5_4192_4e13_a1c7_86ed3c627ddf.slice/crio-conmon-f6567d126c5ab9260bebb4b4a822d71e559ec543cfdf3fa7202150e3115569cc.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod11b8ece5_4192_4e13_a1c7_86ed3c627ddf.slice/crio-conmon-089c0b3d1c4ddc2fa892f504974b05872110b8d4c58cb70cafbaa74e93b8f452.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podab5ec753_410a_4d4b_8071_ce60970ba4df.slice/crio-conmon-949ad4d21813d885979595286daba6ad241d3bf3aac10ca8c334398ba63d2324.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod11b8ece5_4192_4e13_a1c7_86ed3c627ddf.slice/crio-79658c290b2b8a920d6b1879c4cfd278d8b997d5f9b72c39a6d6c7310f20f615.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod11b8ece5_4192_4e13_a1c7_86ed3c627ddf.slice/crio-0b8c5691b63ae4c345789ff614121edd0fbac8d28ec4dd714cbad15af4ead78a.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod11b8ece5_4192_4e13_a1c7_86ed3c627ddf.slice/crio-conmon-a772c3088bf7934e4656b200f802e6851dd25d0e8355f14cbbc3a035463513c0.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5b52e937_5b7e_4179_9766_20a9c2f93e35.slice/crio-conmon-12ca3ab2f0b64acec9c85ff2dcb3769838a447a3fd301eb6eff49f9f575c5ccb.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod11b8ece5_4192_4e13_a1c7_86ed3c627ddf.slice/crio-5ced108d1ab8442c1fac1fe0fc3c7939f98a90c737db7a6f1aced0c0edb070a4.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod11b8ece5_4192_4e13_a1c7_86ed3c627ddf.slice/crio-f6567d126c5ab9260bebb4b4a822d71e559ec543cfdf3fa7202150e3115569cc.scope\": RecentStats: unable to find data in memory cache]" Dec 02 07:48:47 crc kubenswrapper[4895]: I1202 07:48:47.458673 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-54957dcd96-7sx87" podUID="68bddf66-0b9f-4bc8-916b-aa0abfbf13c3" containerName="placement-api" containerID="cri-o://79507980e01b07ea773d434932da83cc407f386cc2f4f05c605e4f8341d7bef2" gracePeriod=30 Dec 02 07:48:47 crc kubenswrapper[4895]: I1202 07:48:47.502925 4895 generic.go:334] "Generic (PLEG): container finished" podID="e203ec5f-dd45-44bb-97b2-fd8a548ce231" containerID="ac87a2d485aad6118b59d6c284310ff45daf6ef1a233203c74c4b8a0fe1c07d3" exitCode=0 Dec 02 07:48:47 crc kubenswrapper[4895]: I1202 07:48:47.503398 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vpwvp" event={"ID":"e203ec5f-dd45-44bb-97b2-fd8a548ce231","Type":"ContainerDied","Data":"ac87a2d485aad6118b59d6c284310ff45daf6ef1a233203c74c4b8a0fe1c07d3"} Dec 02 07:48:47 crc kubenswrapper[4895]: I1202 07:48:47.528201 4895 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_93ac7640-b11c-48f4-b537-45bebe4af01b/ovsdbserver-nb/0.log" Dec 02 07:48:47 crc kubenswrapper[4895]: I1202 07:48:47.528253 4895 generic.go:334] "Generic (PLEG): container finished" podID="93ac7640-b11c-48f4-b537-45bebe4af01b" containerID="702d499c8eb77b3784f109189f3605813a2864341fb22907bb8c80622cf297f0" exitCode=2 Dec 02 07:48:47 crc kubenswrapper[4895]: I1202 07:48:47.528286 4895 generic.go:334] "Generic (PLEG): container finished" podID="93ac7640-b11c-48f4-b537-45bebe4af01b" containerID="37e31157e4862cb11e5acde395d6bd0df5dd7a9d1818a4a02968a675039e325e" exitCode=143 Dec 02 07:48:47 crc kubenswrapper[4895]: I1202 07:48:47.528798 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"93ac7640-b11c-48f4-b537-45bebe4af01b","Type":"ContainerDied","Data":"702d499c8eb77b3784f109189f3605813a2864341fb22907bb8c80622cf297f0"} Dec 02 07:48:47 crc kubenswrapper[4895]: I1202 07:48:47.528847 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"93ac7640-b11c-48f4-b537-45bebe4af01b","Type":"ContainerDied","Data":"37e31157e4862cb11e5acde395d6bd0df5dd7a9d1818a4a02968a675039e325e"} Dec 02 07:48:47 crc kubenswrapper[4895]: I1202 07:48:47.534553 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 02 07:48:47 crc kubenswrapper[4895]: I1202 07:48:47.535220 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="e4869eb0-5e33-4837-8295-06ca17076e69" containerName="glance-log" containerID="cri-o://4362e47d57c98a2bd4e29c4d3aa4369c5e901649c4440c8bab3af675617ff778" gracePeriod=30 Dec 02 07:48:47 crc kubenswrapper[4895]: I1202 07:48:47.535373 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" 
podUID="e4869eb0-5e33-4837-8295-06ca17076e69" containerName="glance-httpd" containerID="cri-o://2404d0d162ba97497121e295a4d0041b66d86ff11fa14a769019cf11872671c2" gracePeriod=30 Dec 02 07:48:47 crc kubenswrapper[4895]: I1202 07:48:47.567578 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_9a3bcb64-db25-4f04-8624-af10542e9f10/ovsdbserver-sb/0.log" Dec 02 07:48:47 crc kubenswrapper[4895]: I1202 07:48:47.567641 4895 generic.go:334] "Generic (PLEG): container finished" podID="9a3bcb64-db25-4f04-8624-af10542e9f10" containerID="00f29c5ae0bb6e7bc18499f6d66bee4cc18c2d48981f4cf8697279c90c4396ff" exitCode=143 Dec 02 07:48:47 crc kubenswrapper[4895]: I1202 07:48:47.567766 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"9a3bcb64-db25-4f04-8624-af10542e9f10","Type":"ContainerDied","Data":"00f29c5ae0bb6e7bc18499f6d66bee4cc18c2d48981f4cf8697279c90c4396ff"} Dec 02 07:48:47 crc kubenswrapper[4895]: E1202 07:48:47.567915 4895 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Dec 02 07:48:47 crc kubenswrapper[4895]: E1202 07:48:47.567979 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ca98cba7-4127-4d25-a139-1a42224331f2-config-data podName:ca98cba7-4127-4d25-a139-1a42224331f2 nodeName:}" failed. No retries permitted until 2025-12-02 07:48:49.567962447 +0000 UTC m=+1540.738822060 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/ca98cba7-4127-4d25-a139-1a42224331f2-config-data") pod "rabbitmq-server-0" (UID: "ca98cba7-4127-4d25-a139-1a42224331f2") : configmap "rabbitmq-config-data" not found Dec 02 07:48:47 crc kubenswrapper[4895]: I1202 07:48:47.602532 4895 generic.go:334] "Generic (PLEG): container finished" podID="446b5a26-8e57-4765-bb7d-275cf05996dd" containerID="8f7f80f7975fea79b1c3bcefa5a8a41052d690e193ab88673538d60ad2720b9a" exitCode=2 Dec 02 07:48:47 crc kubenswrapper[4895]: I1202 07:48:47.602951 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"446b5a26-8e57-4765-bb7d-275cf05996dd","Type":"ContainerDied","Data":"8f7f80f7975fea79b1c3bcefa5a8a41052d690e193ab88673538d60ad2720b9a"} Dec 02 07:48:47 crc kubenswrapper[4895]: I1202 07:48:47.608065 4895 generic.go:334] "Generic (PLEG): container finished" podID="5b52e937-5b7e-4179-9766-20a9c2f93e35" containerID="12ca3ab2f0b64acec9c85ff2dcb3769838a447a3fd301eb6eff49f9f575c5ccb" exitCode=137 Dec 02 07:48:47 crc kubenswrapper[4895]: I1202 07:48:47.632004 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 02 07:48:47 crc kubenswrapper[4895]: I1202 07:48:47.632370 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="290c1303-bf41-4474-86ff-c9f5aa105cc3" containerName="glance-log" containerID="cri-o://fd8c7d4e19097367de3d3f49094033e0adeb083a5427064f86bcdaba564bc61c" gracePeriod=30 Dec 02 07:48:47 crc kubenswrapper[4895]: I1202 07:48:47.633190 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="290c1303-bf41-4474-86ff-c9f5aa105cc3" containerName="glance-httpd" containerID="cri-o://973ab025884cab7054f146e0f744a06e1f4e800f6c16521085496ffc96503509" gracePeriod=30 Dec 02 07:48:47 crc kubenswrapper[4895]: 
I1202 07:48:47.679852 4895 generic.go:334] "Generic (PLEG): container finished" podID="9c5f33d2-0416-40b5-8133-324aa1a60118" containerID="9119b66d955826b5b5f6ec45ebae984f251b2adfcdada61729fc1808ffec5194" exitCode=0 Dec 02 07:48:47 crc kubenswrapper[4895]: I1202 07:48:47.679924 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-zx9lx" event={"ID":"9c5f33d2-0416-40b5-8133-324aa1a60118","Type":"ContainerDied","Data":"9119b66d955826b5b5f6ec45ebae984f251b2adfcdada61729fc1808ffec5194"} Dec 02 07:48:47 crc kubenswrapper[4895]: I1202 07:48:47.726546 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 02 07:48:47 crc kubenswrapper[4895]: I1202 07:48:47.784148 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/novacell0b7d1-account-delete-wchwk"] Dec 02 07:48:47 crc kubenswrapper[4895]: I1202 07:48:47.803112 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/novacell0b7d1-account-delete-wchwk" Dec 02 07:48:47 crc kubenswrapper[4895]: I1202 07:48:47.813701 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 02 07:48:47 crc kubenswrapper[4895]: I1202 07:48:47.818840 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican4aa4-account-delete-pvmbl" Dec 02 07:48:47 crc kubenswrapper[4895]: I1202 07:48:47.835731 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/novaapi23cb-account-delete-g8msv" Dec 02 07:48:47 crc kubenswrapper[4895]: I1202 07:48:47.852998 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 02 07:48:47 crc kubenswrapper[4895]: I1202 07:48:47.853332 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="f762a68c-cabc-4842-844a-1db6710e3ee9" containerName="nova-api-log" containerID="cri-o://fc3f6c60c5579c4ea14cacedf2c65b1a8d013562a1bf58b34c7f59f9c0b1bdbe" gracePeriod=30 Dec 02 07:48:47 crc kubenswrapper[4895]: I1202 07:48:47.854814 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="f762a68c-cabc-4842-844a-1db6710e3ee9" containerName="nova-api-api" containerID="cri-o://e21126490e30d0f2abdaa9c6468d800825eee8d11c80c4baee6ce5e501917408" gracePeriod=30 Dec 02 07:48:47 crc kubenswrapper[4895]: I1202 07:48:47.869826 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/novacell0b7d1-account-delete-wchwk"] Dec 02 07:48:47 crc kubenswrapper[4895]: I1202 07:48:47.881272 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-64685599d6-tgrm9"] Dec 02 07:48:47 crc kubenswrapper[4895]: I1202 07:48:47.881620 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-64685599d6-tgrm9" podUID="e0c6d90b-6e06-4b01-a8d7-5761b6cb5c0f" containerName="barbican-keystone-listener-log" containerID="cri-o://1ea05e687809a1075b370d099e40ef305622b4839ae26a0439d53df787025e36" gracePeriod=30 Dec 02 07:48:47 crc kubenswrapper[4895]: I1202 07:48:47.882204 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-64685599d6-tgrm9" podUID="e0c6d90b-6e06-4b01-a8d7-5761b6cb5c0f" containerName="barbican-keystone-listener" 
containerID="cri-o://87a341d01cbe5679c7f66108701ad133b21f9226861ceb315e694aa0b420673a" gracePeriod=30 Dec 02 07:48:47 crc kubenswrapper[4895]: I1202 07:48:47.885936 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5696e7d9-103a-4bf7-9b05-1959e92cf46a-operator-scripts\") pod \"novacell0b7d1-account-delete-wchwk\" (UID: \"5696e7d9-103a-4bf7-9b05-1959e92cf46a\") " pod="openstack/novacell0b7d1-account-delete-wchwk" Dec 02 07:48:47 crc kubenswrapper[4895]: I1202 07:48:47.886018 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4mwj\" (UniqueName: \"kubernetes.io/projected/5696e7d9-103a-4bf7-9b05-1959e92cf46a-kube-api-access-z4mwj\") pod \"novacell0b7d1-account-delete-wchwk\" (UID: \"5696e7d9-103a-4bf7-9b05-1959e92cf46a\") " pod="openstack/novacell0b7d1-account-delete-wchwk" Dec 02 07:48:47 crc kubenswrapper[4895]: I1202 07:48:47.891052 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_9a3bcb64-db25-4f04-8624-af10542e9f10/ovsdbserver-sb/0.log" Dec 02 07:48:47 crc kubenswrapper[4895]: I1202 07:48:47.891146 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 02 07:48:47 crc kubenswrapper[4895]: I1202 07:48:47.898345 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-zx9lx" Dec 02 07:48:47 crc kubenswrapper[4895]: I1202 07:48:47.905544 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="0d1cb194-5325-40c2-bbd4-0a48821e12aa" containerName="rabbitmq" containerID="cri-o://825f000e90e467b37377e382a45ce9ec58ad6ced7e5a761f9a5ac0cc1b0ded3d" gracePeriod=604800 Dec 02 07:48:47 crc kubenswrapper[4895]: I1202 07:48:47.917993 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-788d454954-brr26"] Dec 02 07:48:47 crc kubenswrapper[4895]: I1202 07:48:47.918242 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-788d454954-brr26" podUID="e3d2bb1c-bd20-473e-b91a-7e2a63ec9f07" containerName="barbican-api-log" containerID="cri-o://915e5d2b5f5c95e83c1104dc0136dd664c02203a632c18741317fd352c1f6413" gracePeriod=30 Dec 02 07:48:47 crc kubenswrapper[4895]: I1202 07:48:47.918358 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-788d454954-brr26" podUID="e3d2bb1c-bd20-473e-b91a-7e2a63ec9f07" containerName="barbican-api" containerID="cri-o://8f6e08f059d8d10b34bda28a99cf993bc10f7153af260e881771ef9437a89f77" gracePeriod=30 Dec 02 07:48:47 crc kubenswrapper[4895]: I1202 07:48:47.929080 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-qvrkm_8206622d-b224-4744-9358-ad7c10d98ca1/openstack-network-exporter/0.log" Dec 02 07:48:47 crc kubenswrapper[4895]: I1202 07:48:47.929143 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-qvrkm" Dec 02 07:48:47 crc kubenswrapper[4895]: I1202 07:48:47.943861 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 07:48:47 crc kubenswrapper[4895]: I1202 07:48:47.944198 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="2d2c6f66-9fe8-4d15-92f6-f29493e6cd7b" containerName="nova-metadata-log" containerID="cri-o://7a9cd5ea2cd01d61f6bb76eff238c970ae03c6d9c57bfc437465a95ac614529c" gracePeriod=30 Dec 02 07:48:47 crc kubenswrapper[4895]: I1202 07:48:47.944398 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="2d2c6f66-9fe8-4d15-92f6-f29493e6cd7b" containerName="nova-metadata-metadata" containerID="cri-o://1581cb4c4b70dcc4008550020a88177eb72fd5b2057dc2f0204082b9090480c2" gracePeriod=30 Dec 02 07:48:47 crc kubenswrapper[4895]: I1202 07:48:47.987604 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/8206622d-b224-4744-9358-ad7c10d98ca1-ovs-rundir\") pod \"8206622d-b224-4744-9358-ad7c10d98ca1\" (UID: \"8206622d-b224-4744-9358-ad7c10d98ca1\") " Dec 02 07:48:47 crc kubenswrapper[4895]: I1202 07:48:47.987718 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a3bcb64-db25-4f04-8624-af10542e9f10-config\") pod \"9a3bcb64-db25-4f04-8624-af10542e9f10\" (UID: \"9a3bcb64-db25-4f04-8624-af10542e9f10\") " Dec 02 07:48:47 crc kubenswrapper[4895]: I1202 07:48:47.987788 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8206622d-b224-4744-9358-ad7c10d98ca1-config\") pod \"8206622d-b224-4744-9358-ad7c10d98ca1\" (UID: \"8206622d-b224-4744-9358-ad7c10d98ca1\") " Dec 02 07:48:47 crc kubenswrapper[4895]: I1202 
07:48:47.987824 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dcxlq\" (UniqueName: \"kubernetes.io/projected/8206622d-b224-4744-9358-ad7c10d98ca1-kube-api-access-dcxlq\") pod \"8206622d-b224-4744-9358-ad7c10d98ca1\" (UID: \"8206622d-b224-4744-9358-ad7c10d98ca1\") " Dec 02 07:48:47 crc kubenswrapper[4895]: I1202 07:48:47.987859 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/8206622d-b224-4744-9358-ad7c10d98ca1-ovn-rundir\") pod \"8206622d-b224-4744-9358-ad7c10d98ca1\" (UID: \"8206622d-b224-4744-9358-ad7c10d98ca1\") " Dec 02 07:48:47 crc kubenswrapper[4895]: I1202 07:48:47.987950 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-79dp7\" (UniqueName: \"kubernetes.io/projected/9a3bcb64-db25-4f04-8624-af10542e9f10-kube-api-access-79dp7\") pod \"9a3bcb64-db25-4f04-8624-af10542e9f10\" (UID: \"9a3bcb64-db25-4f04-8624-af10542e9f10\") " Dec 02 07:48:47 crc kubenswrapper[4895]: I1202 07:48:47.988027 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a3bcb64-db25-4f04-8624-af10542e9f10-ovsdbserver-sb-tls-certs\") pod \"9a3bcb64-db25-4f04-8624-af10542e9f10\" (UID: \"9a3bcb64-db25-4f04-8624-af10542e9f10\") " Dec 02 07:48:47 crc kubenswrapper[4895]: I1202 07:48:47.988060 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcvjf\" (UniqueName: \"kubernetes.io/projected/9c5f33d2-0416-40b5-8133-324aa1a60118-kube-api-access-fcvjf\") pod \"9c5f33d2-0416-40b5-8133-324aa1a60118\" (UID: \"9c5f33d2-0416-40b5-8133-324aa1a60118\") " Dec 02 07:48:47 crc kubenswrapper[4895]: I1202 07:48:47.988087 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/8206622d-b224-4744-9358-ad7c10d98ca1-metrics-certs-tls-certs\") pod \"8206622d-b224-4744-9358-ad7c10d98ca1\" (UID: \"8206622d-b224-4744-9358-ad7c10d98ca1\") " Dec 02 07:48:47 crc kubenswrapper[4895]: I1202 07:48:47.988113 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c5f33d2-0416-40b5-8133-324aa1a60118-config\") pod \"9c5f33d2-0416-40b5-8133-324aa1a60118\" (UID: \"9c5f33d2-0416-40b5-8133-324aa1a60118\") " Dec 02 07:48:47 crc kubenswrapper[4895]: I1202 07:48:47.988141 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9c5f33d2-0416-40b5-8133-324aa1a60118-dns-swift-storage-0\") pod \"9c5f33d2-0416-40b5-8133-324aa1a60118\" (UID: \"9c5f33d2-0416-40b5-8133-324aa1a60118\") " Dec 02 07:48:47 crc kubenswrapper[4895]: I1202 07:48:47.988225 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9c5f33d2-0416-40b5-8133-324aa1a60118-dns-svc\") pod \"9c5f33d2-0416-40b5-8133-324aa1a60118\" (UID: \"9c5f33d2-0416-40b5-8133-324aa1a60118\") " Dec 02 07:48:47 crc kubenswrapper[4895]: I1202 07:48:47.988269 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8206622d-b224-4744-9358-ad7c10d98ca1-combined-ca-bundle\") pod \"8206622d-b224-4744-9358-ad7c10d98ca1\" (UID: \"8206622d-b224-4744-9358-ad7c10d98ca1\") " Dec 02 07:48:47 crc kubenswrapper[4895]: I1202 07:48:47.988287 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a3bcb64-db25-4f04-8624-af10542e9f10-metrics-certs-tls-certs\") pod \"9a3bcb64-db25-4f04-8624-af10542e9f10\" (UID: \"9a3bcb64-db25-4f04-8624-af10542e9f10\") " Dec 02 07:48:47 crc kubenswrapper[4895]: I1202 
07:48:47.988309 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9c5f33d2-0416-40b5-8133-324aa1a60118-ovsdbserver-nb\") pod \"9c5f33d2-0416-40b5-8133-324aa1a60118\" (UID: \"9c5f33d2-0416-40b5-8133-324aa1a60118\") " Dec 02 07:48:47 crc kubenswrapper[4895]: I1202 07:48:47.988368 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a3bcb64-db25-4f04-8624-af10542e9f10-combined-ca-bundle\") pod \"9a3bcb64-db25-4f04-8624-af10542e9f10\" (UID: \"9a3bcb64-db25-4f04-8624-af10542e9f10\") " Dec 02 07:48:47 crc kubenswrapper[4895]: I1202 07:48:47.988398 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-sb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"9a3bcb64-db25-4f04-8624-af10542e9f10\" (UID: \"9a3bcb64-db25-4f04-8624-af10542e9f10\") " Dec 02 07:48:47 crc kubenswrapper[4895]: I1202 07:48:47.988434 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9c5f33d2-0416-40b5-8133-324aa1a60118-ovsdbserver-sb\") pod \"9c5f33d2-0416-40b5-8133-324aa1a60118\" (UID: \"9c5f33d2-0416-40b5-8133-324aa1a60118\") " Dec 02 07:48:47 crc kubenswrapper[4895]: I1202 07:48:47.988466 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9a3bcb64-db25-4f04-8624-af10542e9f10-scripts\") pod \"9a3bcb64-db25-4f04-8624-af10542e9f10\" (UID: \"9a3bcb64-db25-4f04-8624-af10542e9f10\") " Dec 02 07:48:47 crc kubenswrapper[4895]: I1202 07:48:47.988493 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9a3bcb64-db25-4f04-8624-af10542e9f10-ovsdb-rundir\") pod \"9a3bcb64-db25-4f04-8624-af10542e9f10\" (UID: 
\"9a3bcb64-db25-4f04-8624-af10542e9f10\") " Dec 02 07:48:47 crc kubenswrapper[4895]: I1202 07:48:47.988877 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5696e7d9-103a-4bf7-9b05-1959e92cf46a-operator-scripts\") pod \"novacell0b7d1-account-delete-wchwk\" (UID: \"5696e7d9-103a-4bf7-9b05-1959e92cf46a\") " pod="openstack/novacell0b7d1-account-delete-wchwk" Dec 02 07:48:47 crc kubenswrapper[4895]: I1202 07:48:47.989009 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4mwj\" (UniqueName: \"kubernetes.io/projected/5696e7d9-103a-4bf7-9b05-1959e92cf46a-kube-api-access-z4mwj\") pod \"novacell0b7d1-account-delete-wchwk\" (UID: \"5696e7d9-103a-4bf7-9b05-1959e92cf46a\") " pod="openstack/novacell0b7d1-account-delete-wchwk" Dec 02 07:48:47 crc kubenswrapper[4895]: I1202 07:48:47.998150 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8206622d-b224-4744-9358-ad7c10d98ca1-ovs-rundir" (OuterVolumeSpecName: "ovs-rundir") pod "8206622d-b224-4744-9358-ad7c10d98ca1" (UID: "8206622d-b224-4744-9358-ad7c10d98ca1"). InnerVolumeSpecName "ovs-rundir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 07:48:48 crc kubenswrapper[4895]: I1202 07:48:48.006695 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8206622d-b224-4744-9358-ad7c10d98ca1-config" (OuterVolumeSpecName: "config") pod "8206622d-b224-4744-9358-ad7c10d98ca1" (UID: "8206622d-b224-4744-9358-ad7c10d98ca1"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:48:48 crc kubenswrapper[4895]: I1202 07:48:48.019254 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8206622d-b224-4744-9358-ad7c10d98ca1-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "8206622d-b224-4744-9358-ad7c10d98ca1" (UID: "8206622d-b224-4744-9358-ad7c10d98ca1"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 07:48:48 crc kubenswrapper[4895]: I1202 07:48:48.039349 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a3bcb64-db25-4f04-8624-af10542e9f10-config" (OuterVolumeSpecName: "config") pod "9a3bcb64-db25-4f04-8624-af10542e9f10" (UID: "9a3bcb64-db25-4f04-8624-af10542e9f10"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:48:48 crc kubenswrapper[4895]: I1202 07:48:48.059851 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a3bcb64-db25-4f04-8624-af10542e9f10-scripts" (OuterVolumeSpecName: "scripts") pod "9a3bcb64-db25-4f04-8624-af10542e9f10" (UID: "9a3bcb64-db25-4f04-8624-af10542e9f10"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:48:48 crc kubenswrapper[4895]: I1202 07:48:48.039390 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-b969f4967-hmqp8"] Dec 02 07:48:48 crc kubenswrapper[4895]: I1202 07:48:48.060961 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-b969f4967-hmqp8" podUID="5b34f139-ac6c-4a24-b478-c4563cce6a2c" containerName="barbican-worker-log" containerID="cri-o://68d4a2538c6c04477ff11aefd007fcd9450afdb38ccbda6a64db4e5f865071b1" gracePeriod=30 Dec 02 07:48:48 crc kubenswrapper[4895]: I1202 07:48:48.040370 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5696e7d9-103a-4bf7-9b05-1959e92cf46a-operator-scripts\") pod \"novacell0b7d1-account-delete-wchwk\" (UID: \"5696e7d9-103a-4bf7-9b05-1959e92cf46a\") " pod="openstack/novacell0b7d1-account-delete-wchwk" Dec 02 07:48:48 crc kubenswrapper[4895]: I1202 07:48:48.061697 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-b969f4967-hmqp8" podUID="5b34f139-ac6c-4a24-b478-c4563cce6a2c" containerName="barbican-worker" containerID="cri-o://7d6a5cbf4ac42d7b9bcb1f16b7d852ad3e604b72c9dfa43a52ca193e0d0f7f4e" gracePeriod=30 Dec 02 07:48:48 crc kubenswrapper[4895]: I1202 07:48:48.062027 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a3bcb64-db25-4f04-8624-af10542e9f10-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "9a3bcb64-db25-4f04-8624-af10542e9f10" (UID: "9a3bcb64-db25-4f04-8624-af10542e9f10"). InnerVolumeSpecName "ovsdb-rundir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:48:48 crc kubenswrapper[4895]: I1202 07:48:48.086186 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a3bcb64-db25-4f04-8624-af10542e9f10-kube-api-access-79dp7" (OuterVolumeSpecName: "kube-api-access-79dp7") pod "9a3bcb64-db25-4f04-8624-af10542e9f10" (UID: "9a3bcb64-db25-4f04-8624-af10542e9f10"). InnerVolumeSpecName "kube-api-access-79dp7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:48:48 crc kubenswrapper[4895]: I1202 07:48:48.091621 4895 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9a3bcb64-db25-4f04-8624-af10542e9f10-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:48 crc kubenswrapper[4895]: I1202 07:48:48.142987 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "ovndbcluster-sb-etc-ovn") pod "9a3bcb64-db25-4f04-8624-af10542e9f10" (UID: "9a3bcb64-db25-4f04-8624-af10542e9f10"). InnerVolumeSpecName "local-storage12-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 02 07:48:48 crc kubenswrapper[4895]: I1202 07:48:48.143798 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4mwj\" (UniqueName: \"kubernetes.io/projected/5696e7d9-103a-4bf7-9b05-1959e92cf46a-kube-api-access-z4mwj\") pod \"novacell0b7d1-account-delete-wchwk\" (UID: \"5696e7d9-103a-4bf7-9b05-1959e92cf46a\") " pod="openstack/novacell0b7d1-account-delete-wchwk" Dec 02 07:48:48 crc kubenswrapper[4895]: I1202 07:48:48.145083 4895 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9a3bcb64-db25-4f04-8624-af10542e9f10-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:48 crc kubenswrapper[4895]: I1202 07:48:48.145235 4895 reconciler_common.go:293] "Volume detached for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/8206622d-b224-4744-9358-ad7c10d98ca1-ovs-rundir\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:48 crc kubenswrapper[4895]: I1202 07:48:48.145247 4895 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a3bcb64-db25-4f04-8624-af10542e9f10-config\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:48 crc kubenswrapper[4895]: I1202 07:48:48.145261 4895 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8206622d-b224-4744-9358-ad7c10d98ca1-config\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:48 crc kubenswrapper[4895]: I1202 07:48:48.145272 4895 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/8206622d-b224-4744-9358-ad7c10d98ca1-ovn-rundir\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:48 crc kubenswrapper[4895]: I1202 07:48:48.145288 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-79dp7\" (UniqueName: 
\"kubernetes.io/projected/9a3bcb64-db25-4f04-8624-af10542e9f10-kube-api-access-79dp7\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:48 crc kubenswrapper[4895]: I1202 07:48:48.149020 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8206622d-b224-4744-9358-ad7c10d98ca1-kube-api-access-dcxlq" (OuterVolumeSpecName: "kube-api-access-dcxlq") pod "8206622d-b224-4744-9358-ad7c10d98ca1" (UID: "8206622d-b224-4744-9358-ad7c10d98ca1"). InnerVolumeSpecName "kube-api-access-dcxlq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:48:48 crc kubenswrapper[4895]: I1202 07:48:48.156954 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a3bcb64-db25-4f04-8624-af10542e9f10-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9a3bcb64-db25-4f04-8624-af10542e9f10" (UID: "9a3bcb64-db25-4f04-8624-af10542e9f10"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:48:48 crc kubenswrapper[4895]: I1202 07:48:48.164394 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 02 07:48:48 crc kubenswrapper[4895]: I1202 07:48:48.165172 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c5f33d2-0416-40b5-8133-324aa1a60118-kube-api-access-fcvjf" (OuterVolumeSpecName: "kube-api-access-fcvjf") pod "9c5f33d2-0416-40b5-8133-324aa1a60118" (UID: "9c5f33d2-0416-40b5-8133-324aa1a60118"). InnerVolumeSpecName "kube-api-access-fcvjf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:48:48 crc kubenswrapper[4895]: I1202 07:48:48.212844 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-qhr8x"] Dec 02 07:48:48 crc kubenswrapper[4895]: I1202 07:48:48.231826 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 02 07:48:48 crc kubenswrapper[4895]: I1202 07:48:48.232463 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="31223325-1372-4ea6-867e-f511b7dffc09" containerName="nova-cell1-conductor-conductor" containerID="cri-o://45d9908b6c5cd875b205c3155ba480192c4dc6d4df37c9c88146d86fdf68c7e6" gracePeriod=30 Dec 02 07:48:48 crc kubenswrapper[4895]: I1202 07:48:48.249942 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-qhr8x"] Dec 02 07:48:48 crc kubenswrapper[4895]: I1202 07:48:48.253502 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a3bcb64-db25-4f04-8624-af10542e9f10-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:48 crc kubenswrapper[4895]: I1202 07:48:48.253559 4895 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Dec 02 07:48:48 crc kubenswrapper[4895]: I1202 07:48:48.253573 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dcxlq\" (UniqueName: \"kubernetes.io/projected/8206622d-b224-4744-9358-ad7c10d98ca1-kube-api-access-dcxlq\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:48 crc kubenswrapper[4895]: I1202 07:48:48.253589 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcvjf\" (UniqueName: \"kubernetes.io/projected/9c5f33d2-0416-40b5-8133-324aa1a60118-kube-api-access-fcvjf\") on node \"crc\" DevicePath \"\"" 
Dec 02 07:48:48 crc kubenswrapper[4895]: I1202 07:48:48.255705 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8206622d-b224-4744-9358-ad7c10d98ca1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8206622d-b224-4744-9358-ad7c10d98ca1" (UID: "8206622d-b224-4744-9358-ad7c10d98ca1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:48:48 crc kubenswrapper[4895]: I1202 07:48:48.255820 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-8svml"] Dec 02 07:48:48 crc kubenswrapper[4895]: I1202 07:48:48.265907 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-8svml"] Dec 02 07:48:48 crc kubenswrapper[4895]: I1202 07:48:48.296824 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-f3a9-account-create-update-bfmns"] Dec 02 07:48:48 crc kubenswrapper[4895]: I1202 07:48:48.301779 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-f3a9-account-create-update-bfmns"] Dec 02 07:48:48 crc kubenswrapper[4895]: I1202 07:48:48.309287 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/novacell0b7d1-account-delete-wchwk" Dec 02 07:48:48 crc kubenswrapper[4895]: I1202 07:48:48.310534 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-chxlc"] Dec 02 07:48:48 crc kubenswrapper[4895]: I1202 07:48:48.356041 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8206622d-b224-4744-9358-ad7c10d98ca1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:48 crc kubenswrapper[4895]: I1202 07:48:48.361409 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 02 07:48:48 crc kubenswrapper[4895]: I1202 07:48:48.361724 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="65a02963-abb5-4f29-aa82-88ba6f859a00" containerName="nova-cell0-conductor-conductor" containerID="cri-o://e4a7fe9750c9bc6c97a65a057cac01332e8866edaece81d177811b186bff46cd" gracePeriod=30 Dec 02 07:48:48 crc kubenswrapper[4895]: I1202 07:48:48.415900 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="ca98cba7-4127-4d25-a139-1a42224331f2" containerName="rabbitmq" containerID="cri-o://5d044ff799057808b8d67f79590923f9bd83b515bcd050be4b95fa7aeeb31f38" gracePeriod=604800 Dec 02 07:48:48 crc kubenswrapper[4895]: I1202 07:48:48.456464 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-9vczq" podUID="6b463255-a237-46b0-826d-1e6fc849f0aa" containerName="ovs-vswitchd" containerID="cri-o://7cc0795476568fadf42dedac218a2fe6065e25675de99e96dafc929b4ec7b9ff" gracePeriod=28 Dec 02 07:48:48 crc kubenswrapper[4895]: E1202 07:48:48.458971 4895 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Dec 02 07:48:48 crc kubenswrapper[4895]: E1202 
07:48:48.459041 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0d1cb194-5325-40c2-bbd4-0a48821e12aa-config-data podName:0d1cb194-5325-40c2-bbd4-0a48821e12aa nodeName:}" failed. No retries permitted until 2025-12-02 07:48:52.459023104 +0000 UTC m=+1543.629882717 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/0d1cb194-5325-40c2-bbd4-0a48821e12aa-config-data") pod "rabbitmq-cell1-server-0" (UID: "0d1cb194-5325-40c2-bbd4-0a48821e12aa") : configmap "rabbitmq-cell1-config-data" not found Dec 02 07:48:48 crc kubenswrapper[4895]: I1202 07:48:48.505825 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 02 07:48:48 crc kubenswrapper[4895]: I1202 07:48:48.506191 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="183c5216-30f9-4f75-865b-7f795ea149fb" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://21f6d09bc2b80b8035a54dfa404bb01cbc6de2843d53dca435681f4b45dafd2f" gracePeriod=30 Dec 02 07:48:48 crc kubenswrapper[4895]: I1202 07:48:48.570905 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-chxlc"] Dec 02 07:48:48 crc kubenswrapper[4895]: I1202 07:48:48.581384 4895 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Dec 02 07:48:48 crc kubenswrapper[4895]: I1202 07:48:48.681245 4895 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:48 crc kubenswrapper[4895]: I1202 07:48:48.700062 4895 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-cell1-novncproxy-0" podUID="183c5216-30f9-4f75-865b-7f795ea149fb" 
containerName="nova-cell1-novncproxy-novncproxy" probeResult="failure" output="Get \"https://10.217.0.193:6080/vnc_lite.html\": read tcp 10.217.0.2:39160->10.217.0.193:6080: read: connection reset by peer" Dec 02 07:48:48 crc kubenswrapper[4895]: I1202 07:48:48.700532 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/nova-cell1-novncproxy-0" podUID="183c5216-30f9-4f75-865b-7f795ea149fb" containerName="nova-cell1-novncproxy-novncproxy" probeResult="failure" output="Get \"https://10.217.0.193:6080/vnc_lite.html\": read tcp 10.217.0.2:39148->10.217.0.193:6080: read: connection reset by peer" Dec 02 07:48:48 crc kubenswrapper[4895]: I1202 07:48:48.716932 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-wjx7g"] Dec 02 07:48:48 crc kubenswrapper[4895]: E1202 07:48:48.718276 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a3bcb64-db25-4f04-8624-af10542e9f10" containerName="openstack-network-exporter" Dec 02 07:48:48 crc kubenswrapper[4895]: I1202 07:48:48.718293 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a3bcb64-db25-4f04-8624-af10542e9f10" containerName="openstack-network-exporter" Dec 02 07:48:48 crc kubenswrapper[4895]: E1202 07:48:48.718323 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c5f33d2-0416-40b5-8133-324aa1a60118" containerName="init" Dec 02 07:48:48 crc kubenswrapper[4895]: I1202 07:48:48.718332 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c5f33d2-0416-40b5-8133-324aa1a60118" containerName="init" Dec 02 07:48:48 crc kubenswrapper[4895]: E1202 07:48:48.718354 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c5f33d2-0416-40b5-8133-324aa1a60118" containerName="dnsmasq-dns" Dec 02 07:48:48 crc kubenswrapper[4895]: I1202 07:48:48.718361 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c5f33d2-0416-40b5-8133-324aa1a60118" containerName="dnsmasq-dns" Dec 02 07:48:48 crc 
kubenswrapper[4895]: E1202 07:48:48.718383 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8206622d-b224-4744-9358-ad7c10d98ca1" containerName="openstack-network-exporter" Dec 02 07:48:48 crc kubenswrapper[4895]: I1202 07:48:48.718390 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="8206622d-b224-4744-9358-ad7c10d98ca1" containerName="openstack-network-exporter" Dec 02 07:48:48 crc kubenswrapper[4895]: E1202 07:48:48.718425 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a3bcb64-db25-4f04-8624-af10542e9f10" containerName="ovsdbserver-sb" Dec 02 07:48:48 crc kubenswrapper[4895]: I1202 07:48:48.718431 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a3bcb64-db25-4f04-8624-af10542e9f10" containerName="ovsdbserver-sb" Dec 02 07:48:48 crc kubenswrapper[4895]: I1202 07:48:48.718810 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a3bcb64-db25-4f04-8624-af10542e9f10" containerName="openstack-network-exporter" Dec 02 07:48:48 crc kubenswrapper[4895]: I1202 07:48:48.718828 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c5f33d2-0416-40b5-8133-324aa1a60118" containerName="dnsmasq-dns" Dec 02 07:48:48 crc kubenswrapper[4895]: I1202 07:48:48.718840 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="8206622d-b224-4744-9358-ad7c10d98ca1" containerName="openstack-network-exporter" Dec 02 07:48:48 crc kubenswrapper[4895]: I1202 07:48:48.718860 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a3bcb64-db25-4f04-8624-af10542e9f10" containerName="ovsdbserver-sb" Dec 02 07:48:48 crc kubenswrapper[4895]: I1202 07:48:48.739627 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wjx7g" Dec 02 07:48:48 crc kubenswrapper[4895]: I1202 07:48:48.804050 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wjx7g"] Dec 02 07:48:48 crc kubenswrapper[4895]: I1202 07:48:48.846972 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swqwh\" (UniqueName: \"kubernetes.io/projected/6ebf9714-5e6d-415c-a0aa-adab0d3e46e9-kube-api-access-swqwh\") pod \"certified-operators-wjx7g\" (UID: \"6ebf9714-5e6d-415c-a0aa-adab0d3e46e9\") " pod="openshift-marketplace/certified-operators-wjx7g" Dec 02 07:48:48 crc kubenswrapper[4895]: I1202 07:48:48.851617 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ebf9714-5e6d-415c-a0aa-adab0d3e46e9-catalog-content\") pod \"certified-operators-wjx7g\" (UID: \"6ebf9714-5e6d-415c-a0aa-adab0d3e46e9\") " pod="openshift-marketplace/certified-operators-wjx7g" Dec 02 07:48:48 crc kubenswrapper[4895]: I1202 07:48:48.859246 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ebf9714-5e6d-415c-a0aa-adab0d3e46e9-utilities\") pod \"certified-operators-wjx7g\" (UID: \"6ebf9714-5e6d-415c-a0aa-adab0d3e46e9\") " pod="openshift-marketplace/certified-operators-wjx7g" Dec 02 07:48:48 crc kubenswrapper[4895]: I1202 07:48:48.935145 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c5f33d2-0416-40b5-8133-324aa1a60118-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9c5f33d2-0416-40b5-8133-324aa1a60118" (UID: "9c5f33d2-0416-40b5-8133-324aa1a60118"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:48:48 crc kubenswrapper[4895]: I1202 07:48:48.959400 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c5f33d2-0416-40b5-8133-324aa1a60118-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9c5f33d2-0416-40b5-8133-324aa1a60118" (UID: "9c5f33d2-0416-40b5-8133-324aa1a60118"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:48:48 crc kubenswrapper[4895]: E1202 07:48:48.960030 4895 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err=< Dec 02 07:48:48 crc kubenswrapper[4895]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Dec 02 07:48:48 crc kubenswrapper[4895]: + source /usr/local/bin/container-scripts/functions Dec 02 07:48:48 crc kubenswrapper[4895]: ++ OVNBridge=br-int Dec 02 07:48:48 crc kubenswrapper[4895]: ++ OVNRemote=tcp:localhost:6642 Dec 02 07:48:48 crc kubenswrapper[4895]: ++ OVNEncapType=geneve Dec 02 07:48:48 crc kubenswrapper[4895]: ++ OVNAvailabilityZones= Dec 02 07:48:48 crc kubenswrapper[4895]: ++ EnableChassisAsGateway=true Dec 02 07:48:48 crc kubenswrapper[4895]: ++ PhysicalNetworks= Dec 02 07:48:48 crc kubenswrapper[4895]: ++ OVNHostName= Dec 02 07:48:48 crc kubenswrapper[4895]: ++ DB_FILE=/etc/openvswitch/conf.db Dec 02 07:48:48 crc kubenswrapper[4895]: ++ ovs_dir=/var/lib/openvswitch Dec 02 07:48:48 crc kubenswrapper[4895]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Dec 02 07:48:48 crc kubenswrapper[4895]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Dec 02 07:48:48 crc kubenswrapper[4895]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Dec 02 07:48:48 crc kubenswrapper[4895]: + '[' '!' 
-f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Dec 02 07:48:48 crc kubenswrapper[4895]: + sleep 0.5 Dec 02 07:48:48 crc kubenswrapper[4895]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Dec 02 07:48:48 crc kubenswrapper[4895]: + sleep 0.5 Dec 02 07:48:48 crc kubenswrapper[4895]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Dec 02 07:48:48 crc kubenswrapper[4895]: + sleep 0.5 Dec 02 07:48:48 crc kubenswrapper[4895]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Dec 02 07:48:48 crc kubenswrapper[4895]: + cleanup_ovsdb_server_semaphore Dec 02 07:48:48 crc kubenswrapper[4895]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Dec 02 07:48:48 crc kubenswrapper[4895]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Dec 02 07:48:48 crc kubenswrapper[4895]: > execCommand=["/usr/local/bin/container-scripts/stop-ovsdb-server.sh"] containerName="ovsdb-server" pod="openstack/ovn-controller-ovs-9vczq" message=< Dec 02 07:48:48 crc kubenswrapper[4895]: Exiting ovsdb-server (5) [ OK ] Dec 02 07:48:48 crc kubenswrapper[4895]: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Dec 02 07:48:48 crc kubenswrapper[4895]: + source /usr/local/bin/container-scripts/functions Dec 02 07:48:48 crc kubenswrapper[4895]: ++ OVNBridge=br-int Dec 02 07:48:48 crc kubenswrapper[4895]: ++ OVNRemote=tcp:localhost:6642 Dec 02 07:48:48 crc kubenswrapper[4895]: ++ OVNEncapType=geneve Dec 02 07:48:48 crc kubenswrapper[4895]: ++ OVNAvailabilityZones= Dec 02 07:48:48 crc kubenswrapper[4895]: ++ EnableChassisAsGateway=true Dec 02 07:48:48 crc kubenswrapper[4895]: ++ PhysicalNetworks= Dec 02 07:48:48 crc kubenswrapper[4895]: ++ OVNHostName= Dec 02 07:48:48 crc kubenswrapper[4895]: ++ DB_FILE=/etc/openvswitch/conf.db Dec 02 07:48:48 crc kubenswrapper[4895]: ++ ovs_dir=/var/lib/openvswitch Dec 02 07:48:48 crc kubenswrapper[4895]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Dec 02 
07:48:48 crc kubenswrapper[4895]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Dec 02 07:48:48 crc kubenswrapper[4895]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Dec 02 07:48:48 crc kubenswrapper[4895]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Dec 02 07:48:48 crc kubenswrapper[4895]: + sleep 0.5 Dec 02 07:48:48 crc kubenswrapper[4895]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Dec 02 07:48:48 crc kubenswrapper[4895]: + sleep 0.5 Dec 02 07:48:48 crc kubenswrapper[4895]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Dec 02 07:48:48 crc kubenswrapper[4895]: + sleep 0.5 Dec 02 07:48:48 crc kubenswrapper[4895]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Dec 02 07:48:48 crc kubenswrapper[4895]: + cleanup_ovsdb_server_semaphore Dec 02 07:48:48 crc kubenswrapper[4895]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Dec 02 07:48:48 crc kubenswrapper[4895]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Dec 02 07:48:48 crc kubenswrapper[4895]: > Dec 02 07:48:48 crc kubenswrapper[4895]: E1202 07:48:48.961913 4895 kuberuntime_container.go:691] "PreStop hook failed" err=< Dec 02 07:48:48 crc kubenswrapper[4895]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Dec 02 07:48:48 crc kubenswrapper[4895]: + source /usr/local/bin/container-scripts/functions Dec 02 07:48:48 crc kubenswrapper[4895]: ++ OVNBridge=br-int Dec 02 07:48:48 crc kubenswrapper[4895]: ++ OVNRemote=tcp:localhost:6642 Dec 02 07:48:48 crc kubenswrapper[4895]: ++ OVNEncapType=geneve Dec 02 07:48:48 crc kubenswrapper[4895]: ++ OVNAvailabilityZones= Dec 02 07:48:48 crc kubenswrapper[4895]: ++ EnableChassisAsGateway=true Dec 02 07:48:48 crc kubenswrapper[4895]: ++ PhysicalNetworks= Dec 02 07:48:48 crc kubenswrapper[4895]: ++ OVNHostName= Dec 02 
07:48:48 crc kubenswrapper[4895]: ++ DB_FILE=/etc/openvswitch/conf.db Dec 02 07:48:48 crc kubenswrapper[4895]: ++ ovs_dir=/var/lib/openvswitch Dec 02 07:48:48 crc kubenswrapper[4895]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Dec 02 07:48:48 crc kubenswrapper[4895]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Dec 02 07:48:48 crc kubenswrapper[4895]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Dec 02 07:48:48 crc kubenswrapper[4895]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Dec 02 07:48:48 crc kubenswrapper[4895]: + sleep 0.5 Dec 02 07:48:48 crc kubenswrapper[4895]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Dec 02 07:48:48 crc kubenswrapper[4895]: + sleep 0.5 Dec 02 07:48:48 crc kubenswrapper[4895]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Dec 02 07:48:48 crc kubenswrapper[4895]: + sleep 0.5 Dec 02 07:48:48 crc kubenswrapper[4895]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Dec 02 07:48:48 crc kubenswrapper[4895]: + cleanup_ovsdb_server_semaphore Dec 02 07:48:48 crc kubenswrapper[4895]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Dec 02 07:48:48 crc kubenswrapper[4895]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Dec 02 07:48:48 crc kubenswrapper[4895]: > pod="openstack/ovn-controller-ovs-9vczq" podUID="6b463255-a237-46b0-826d-1e6fc849f0aa" containerName="ovsdb-server" containerID="cri-o://6aa3fa36a02a2278bdcd14541013304e85cab914b53ba0de74342a5fb50d00cd" Dec 02 07:48:48 crc kubenswrapper[4895]: I1202 07:48:48.962617 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-9vczq" podUID="6b463255-a237-46b0-826d-1e6fc849f0aa" containerName="ovsdb-server" containerID="cri-o://6aa3fa36a02a2278bdcd14541013304e85cab914b53ba0de74342a5fb50d00cd" gracePeriod=28 Dec 02 07:48:48 crc kubenswrapper[4895]: I1202 
07:48:48.960676 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c5f33d2-0416-40b5-8133-324aa1a60118-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "9c5f33d2-0416-40b5-8133-324aa1a60118" (UID: "9c5f33d2-0416-40b5-8133-324aa1a60118"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:48:48 crc kubenswrapper[4895]: I1202 07:48:48.969983 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c5f33d2-0416-40b5-8133-324aa1a60118-config" (OuterVolumeSpecName: "config") pod "9c5f33d2-0416-40b5-8133-324aa1a60118" (UID: "9c5f33d2-0416-40b5-8133-324aa1a60118"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:48:48 crc kubenswrapper[4895]: I1202 07:48:48.975286 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ebf9714-5e6d-415c-a0aa-adab0d3e46e9-utilities\") pod \"certified-operators-wjx7g\" (UID: \"6ebf9714-5e6d-415c-a0aa-adab0d3e46e9\") " pod="openshift-marketplace/certified-operators-wjx7g" Dec 02 07:48:48 crc kubenswrapper[4895]: I1202 07:48:48.975941 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ebf9714-5e6d-415c-a0aa-adab0d3e46e9-utilities\") pod \"certified-operators-wjx7g\" (UID: \"6ebf9714-5e6d-415c-a0aa-adab0d3e46e9\") " pod="openshift-marketplace/certified-operators-wjx7g" Dec 02 07:48:48 crc kubenswrapper[4895]: I1202 07:48:48.976516 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swqwh\" (UniqueName: \"kubernetes.io/projected/6ebf9714-5e6d-415c-a0aa-adab0d3e46e9-kube-api-access-swqwh\") pod \"certified-operators-wjx7g\" (UID: \"6ebf9714-5e6d-415c-a0aa-adab0d3e46e9\") " pod="openshift-marketplace/certified-operators-wjx7g" 
Dec 02 07:48:48 crc kubenswrapper[4895]: I1202 07:48:48.976887 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ebf9714-5e6d-415c-a0aa-adab0d3e46e9-catalog-content\") pod \"certified-operators-wjx7g\" (UID: \"6ebf9714-5e6d-415c-a0aa-adab0d3e46e9\") " pod="openshift-marketplace/certified-operators-wjx7g" Dec 02 07:48:48 crc kubenswrapper[4895]: I1202 07:48:48.977531 4895 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9c5f33d2-0416-40b5-8133-324aa1a60118-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:48 crc kubenswrapper[4895]: I1202 07:48:48.979208 4895 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9c5f33d2-0416-40b5-8133-324aa1a60118-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:48 crc kubenswrapper[4895]: I1202 07:48:48.979282 4895 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c5f33d2-0416-40b5-8133-324aa1a60118-config\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:48 crc kubenswrapper[4895]: I1202 07:48:48.979361 4895 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9c5f33d2-0416-40b5-8133-324aa1a60118-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:48 crc kubenswrapper[4895]: I1202 07:48:48.979798 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ebf9714-5e6d-415c-a0aa-adab0d3e46e9-catalog-content\") pod \"certified-operators-wjx7g\" (UID: \"6ebf9714-5e6d-415c-a0aa-adab0d3e46e9\") " pod="openshift-marketplace/certified-operators-wjx7g" Dec 02 07:48:48 crc kubenswrapper[4895]: I1202 07:48:48.997508 4895 generic.go:334] "Generic (PLEG): container finished" 
podID="2d2c6f66-9fe8-4d15-92f6-f29493e6cd7b" containerID="7a9cd5ea2cd01d61f6bb76eff238c970ae03c6d9c57bfc437465a95ac614529c" exitCode=143 Dec 02 07:48:48 crc kubenswrapper[4895]: I1202 07:48:48.998018 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2d2c6f66-9fe8-4d15-92f6-f29493e6cd7b","Type":"ContainerDied","Data":"7a9cd5ea2cd01d61f6bb76eff238c970ae03c6d9c57bfc437465a95ac614529c"} Dec 02 07:48:49 crc kubenswrapper[4895]: I1202 07:48:49.010015 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 02 07:48:49 crc kubenswrapper[4895]: I1202 07:48:49.013223 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swqwh\" (UniqueName: \"kubernetes.io/projected/6ebf9714-5e6d-415c-a0aa-adab0d3e46e9-kube-api-access-swqwh\") pod \"certified-operators-wjx7g\" (UID: \"6ebf9714-5e6d-415c-a0aa-adab0d3e46e9\") " pod="openshift-marketplace/certified-operators-wjx7g" Dec 02 07:48:49 crc kubenswrapper[4895]: I1202 07:48:49.014071 4895 generic.go:334] "Generic (PLEG): container finished" podID="5b34f139-ac6c-4a24-b478-c4563cce6a2c" containerID="68d4a2538c6c04477ff11aefd007fcd9450afdb38ccbda6a64db4e5f865071b1" exitCode=143 Dec 02 07:48:49 crc kubenswrapper[4895]: I1202 07:48:49.014289 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-b969f4967-hmqp8" event={"ID":"5b34f139-ac6c-4a24-b478-c4563cce6a2c","Type":"ContainerDied","Data":"68d4a2538c6c04477ff11aefd007fcd9450afdb38ccbda6a64db4e5f865071b1"} Dec 02 07:48:49 crc kubenswrapper[4895]: I1202 07:48:49.032353 4895 generic.go:334] "Generic (PLEG): container finished" podID="e4869eb0-5e33-4837-8295-06ca17076e69" containerID="4362e47d57c98a2bd4e29c4d3aa4369c5e901649c4440c8bab3af675617ff778" exitCode=143 Dec 02 07:48:49 crc kubenswrapper[4895]: I1202 07:48:49.033103 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-external-api-0" event={"ID":"e4869eb0-5e33-4837-8295-06ca17076e69","Type":"ContainerDied","Data":"4362e47d57c98a2bd4e29c4d3aa4369c5e901649c4440c8bab3af675617ff778"} Dec 02 07:48:49 crc kubenswrapper[4895]: I1202 07:48:49.036045 4895 generic.go:334] "Generic (PLEG): container finished" podID="f762a68c-cabc-4842-844a-1db6710e3ee9" containerID="fc3f6c60c5579c4ea14cacedf2c65b1a8d013562a1bf58b34c7f59f9c0b1bdbe" exitCode=143 Dec 02 07:48:49 crc kubenswrapper[4895]: I1202 07:48:49.036240 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f762a68c-cabc-4842-844a-1db6710e3ee9","Type":"ContainerDied","Data":"fc3f6c60c5579c4ea14cacedf2c65b1a8d013562a1bf58b34c7f59f9c0b1bdbe"} Dec 02 07:48:49 crc kubenswrapper[4895]: I1202 07:48:49.038223 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-zx9lx" event={"ID":"9c5f33d2-0416-40b5-8133-324aa1a60118","Type":"ContainerDied","Data":"fc7eddce0ba5ae5098676a1f5c6537ff965b82512c0225e16f0345065f6c2e87"} Dec 02 07:48:49 crc kubenswrapper[4895]: I1202 07:48:49.038394 4895 scope.go:117] "RemoveContainer" containerID="9119b66d955826b5b5f6ec45ebae984f251b2adfcdada61729fc1808ffec5194" Dec 02 07:48:49 crc kubenswrapper[4895]: I1202 07:48:49.038710 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-zx9lx" Dec 02 07:48:49 crc kubenswrapper[4895]: I1202 07:48:49.091952 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wjx7g" Dec 02 07:48:49 crc kubenswrapper[4895]: I1202 07:48:49.113354 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-qvrkm_8206622d-b224-4744-9358-ad7c10d98ca1/openstack-network-exporter/0.log" Dec 02 07:48:49 crc kubenswrapper[4895]: I1202 07:48:49.113988 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-qvrkm" Dec 02 07:48:49 crc kubenswrapper[4895]: I1202 07:48:49.115071 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-qvrkm" event={"ID":"8206622d-b224-4744-9358-ad7c10d98ca1","Type":"ContainerDied","Data":"fa59c78f12ab59582ba18ac98344d7edfd8ba52bfd4fc14f8ee735e6d81c9907"} Dec 02 07:48:49 crc kubenswrapper[4895]: I1202 07:48:49.124996 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a3bcb64-db25-4f04-8624-af10542e9f10-ovsdbserver-sb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-sb-tls-certs") pod "9a3bcb64-db25-4f04-8624-af10542e9f10" (UID: "9a3bcb64-db25-4f04-8624-af10542e9f10"). InnerVolumeSpecName "ovsdbserver-sb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:48:49 crc kubenswrapper[4895]: I1202 07:48:49.127196 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_93ac7640-b11c-48f4-b537-45bebe4af01b/ovsdbserver-nb/0.log" Dec 02 07:48:49 crc kubenswrapper[4895]: I1202 07:48:49.127293 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 02 07:48:49 crc kubenswrapper[4895]: I1202 07:48:49.130913 4895 generic.go:334] "Generic (PLEG): container finished" podID="ebbed0ba-1d44-4421-a276-b075b0f31c3f" containerID="e169ef89006889ec6af2f91025c33437d207c43337cbf66cac7f9e3e6b3263f5" exitCode=143 Dec 02 07:48:49 crc kubenswrapper[4895]: I1202 07:48:49.130978 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ebbed0ba-1d44-4421-a276-b075b0f31c3f","Type":"ContainerDied","Data":"e169ef89006889ec6af2f91025c33437d207c43337cbf66cac7f9e3e6b3263f5"} Dec 02 07:48:49 crc kubenswrapper[4895]: I1202 07:48:49.136890 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a3bcb64-db25-4f04-8624-af10542e9f10-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "9a3bcb64-db25-4f04-8624-af10542e9f10" (UID: "9a3bcb64-db25-4f04-8624-af10542e9f10"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:48:49 crc kubenswrapper[4895]: I1202 07:48:49.145005 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c5f33d2-0416-40b5-8133-324aa1a60118-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9c5f33d2-0416-40b5-8133-324aa1a60118" (UID: "9c5f33d2-0416-40b5-8133-324aa1a60118"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:48:49 crc kubenswrapper[4895]: I1202 07:48:49.163990 4895 scope.go:117] "RemoveContainer" containerID="5d46ad9baa196f7d343ee2c2039ebc3c7e53ef6e2da357137a9c5ae843ce8ef6" Dec 02 07:48:49 crc kubenswrapper[4895]: I1202 07:48:49.189109 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 02 07:48:49 crc kubenswrapper[4895]: E1202 07:48:49.202133 4895 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e4a7fe9750c9bc6c97a65a057cac01332e8866edaece81d177811b186bff46cd" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 02 07:48:49 crc kubenswrapper[4895]: I1202 07:48:49.202904 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="393335e8-25d7-4364-ae52-eab9ac0d3fa0" path="/var/lib/kubelet/pods/393335e8-25d7-4364-ae52-eab9ac0d3fa0/volumes" Dec 02 07:48:49 crc kubenswrapper[4895]: I1202 07:48:49.203471 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e5d3f70-f931-473a-af3c-e0858a46e311" path="/var/lib/kubelet/pods/4e5d3f70-f931-473a-af3c-e0858a46e311/volumes" Dec 02 07:48:49 crc kubenswrapper[4895]: I1202 07:48:49.204073 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="845f4c45-cc2f-4b99-a42f-3c04b18730fe" path="/var/lib/kubelet/pods/845f4c45-cc2f-4b99-a42f-3c04b18730fe/volumes" Dec 02 07:48:49 crc kubenswrapper[4895]: I1202 07:48:49.204602 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0df612e-785b-404d-b9ef-21c1ee57b14a" path="/var/lib/kubelet/pods/e0df612e-785b-404d-b9ef-21c1ee57b14a/volumes" Dec 02 07:48:49 crc kubenswrapper[4895]: I1202 07:48:49.205959 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0ad0597-da06-43ca-bcbb-03eb78fb8b53" path="/var/lib/kubelet/pods/f0ad0597-da06-43ca-bcbb-03eb78fb8b53/volumes" Dec 02 07:48:49 crc kubenswrapper[4895]: E1202 07:48:49.211959 4895 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="e4a7fe9750c9bc6c97a65a057cac01332e8866edaece81d177811b186bff46cd" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 02 07:48:49 crc kubenswrapper[4895]: I1202 07:48:49.212285 4895 generic.go:334] "Generic (PLEG): container finished" podID="e0c6d90b-6e06-4b01-a8d7-5761b6cb5c0f" containerID="1ea05e687809a1075b370d099e40ef305622b4839ae26a0439d53df787025e36" exitCode=143 Dec 02 07:48:49 crc kubenswrapper[4895]: I1202 07:48:49.213327 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5b52e937-5b7e-4179-9766-20a9c2f93e35-openstack-config-secret\") pod \"5b52e937-5b7e-4179-9766-20a9c2f93e35\" (UID: \"5b52e937-5b7e-4179-9766-20a9c2f93e35\") " Dec 02 07:48:49 crc kubenswrapper[4895]: I1202 07:48:49.213458 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p8xw5\" (UniqueName: \"kubernetes.io/projected/5b52e937-5b7e-4179-9766-20a9c2f93e35-kube-api-access-p8xw5\") pod \"5b52e937-5b7e-4179-9766-20a9c2f93e35\" (UID: \"5b52e937-5b7e-4179-9766-20a9c2f93e35\") " Dec 02 07:48:49 crc kubenswrapper[4895]: I1202 07:48:49.213532 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5b52e937-5b7e-4179-9766-20a9c2f93e35-openstack-config\") pod \"5b52e937-5b7e-4179-9766-20a9c2f93e35\" (UID: \"5b52e937-5b7e-4179-9766-20a9c2f93e35\") " Dec 02 07:48:49 crc kubenswrapper[4895]: I1202 07:48:49.213599 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b52e937-5b7e-4179-9766-20a9c2f93e35-combined-ca-bundle\") pod \"5b52e937-5b7e-4179-9766-20a9c2f93e35\" (UID: \"5b52e937-5b7e-4179-9766-20a9c2f93e35\") " Dec 02 07:48:49 crc kubenswrapper[4895]: I1202 07:48:49.214200 4895 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/9a3bcb64-db25-4f04-8624-af10542e9f10-ovsdbserver-sb-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:49 crc kubenswrapper[4895]: I1202 07:48:49.214218 4895 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9c5f33d2-0416-40b5-8133-324aa1a60118-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:49 crc kubenswrapper[4895]: I1202 07:48:49.214228 4895 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a3bcb64-db25-4f04-8624-af10542e9f10-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:49 crc kubenswrapper[4895]: E1202 07:48:49.222842 4895 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e4a7fe9750c9bc6c97a65a057cac01332e8866edaece81d177811b186bff46cd" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 02 07:48:49 crc kubenswrapper[4895]: E1202 07:48:49.222973 4895 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="65a02963-abb5-4f29-aa82-88ba6f859a00" containerName="nova-cell0-conductor-conductor" Dec 02 07:48:49 crc kubenswrapper[4895]: I1202 07:48:49.237732 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b52e937-5b7e-4179-9766-20a9c2f93e35-kube-api-access-p8xw5" (OuterVolumeSpecName: "kube-api-access-p8xw5") pod "5b52e937-5b7e-4179-9766-20a9c2f93e35" (UID: "5b52e937-5b7e-4179-9766-20a9c2f93e35"). InnerVolumeSpecName "kube-api-access-p8xw5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:48:49 crc kubenswrapper[4895]: I1202 07:48:49.247262 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8206622d-b224-4744-9358-ad7c10d98ca1-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "8206622d-b224-4744-9358-ad7c10d98ca1" (UID: "8206622d-b224-4744-9358-ad7c10d98ca1"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:48:49 crc kubenswrapper[4895]: I1202 07:48:49.256494 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-cell1-galera-0" podUID="ace60b46-ed73-43ba-8d95-b81b03a6bd0a" containerName="galera" containerID="cri-o://8a2f68e796838e87e45698b40183c455794d730caf5af19a07c35fd150b09fe1" gracePeriod=29 Dec 02 07:48:49 crc kubenswrapper[4895]: I1202 07:48:49.317244 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93ac7640-b11c-48f4-b537-45bebe4af01b-config\") pod \"93ac7640-b11c-48f4-b537-45bebe4af01b\" (UID: \"93ac7640-b11c-48f4-b537-45bebe4af01b\") " Dec 02 07:48:49 crc kubenswrapper[4895]: I1202 07:48:49.317301 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/93ac7640-b11c-48f4-b537-45bebe4af01b-scripts\") pod \"93ac7640-b11c-48f4-b537-45bebe4af01b\" (UID: \"93ac7640-b11c-48f4-b537-45bebe4af01b\") " Dec 02 07:48:49 crc kubenswrapper[4895]: I1202 07:48:49.317390 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k9v7k\" (UniqueName: \"kubernetes.io/projected/93ac7640-b11c-48f4-b537-45bebe4af01b-kube-api-access-k9v7k\") pod \"93ac7640-b11c-48f4-b537-45bebe4af01b\" (UID: \"93ac7640-b11c-48f4-b537-45bebe4af01b\") " Dec 02 07:48:49 crc kubenswrapper[4895]: I1202 07:48:49.317446 4895 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93ac7640-b11c-48f4-b537-45bebe4af01b-combined-ca-bundle\") pod \"93ac7640-b11c-48f4-b537-45bebe4af01b\" (UID: \"93ac7640-b11c-48f4-b537-45bebe4af01b\") " Dec 02 07:48:49 crc kubenswrapper[4895]: I1202 07:48:49.317496 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/93ac7640-b11c-48f4-b537-45bebe4af01b-ovsdbserver-nb-tls-certs\") pod \"93ac7640-b11c-48f4-b537-45bebe4af01b\" (UID: \"93ac7640-b11c-48f4-b537-45bebe4af01b\") " Dec 02 07:48:49 crc kubenswrapper[4895]: I1202 07:48:49.317514 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/93ac7640-b11c-48f4-b537-45bebe4af01b-metrics-certs-tls-certs\") pod \"93ac7640-b11c-48f4-b537-45bebe4af01b\" (UID: \"93ac7640-b11c-48f4-b537-45bebe4af01b\") " Dec 02 07:48:49 crc kubenswrapper[4895]: I1202 07:48:49.317538 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/93ac7640-b11c-48f4-b537-45bebe4af01b-ovsdb-rundir\") pod \"93ac7640-b11c-48f4-b537-45bebe4af01b\" (UID: \"93ac7640-b11c-48f4-b537-45bebe4af01b\") " Dec 02 07:48:49 crc kubenswrapper[4895]: I1202 07:48:49.317689 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-nb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"93ac7640-b11c-48f4-b537-45bebe4af01b\" (UID: \"93ac7640-b11c-48f4-b537-45bebe4af01b\") " Dec 02 07:48:49 crc kubenswrapper[4895]: I1202 07:48:49.318245 4895 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8206622d-b224-4744-9358-ad7c10d98ca1-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" 
Dec 02 07:48:49 crc kubenswrapper[4895]: I1202 07:48:49.318257 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p8xw5\" (UniqueName: \"kubernetes.io/projected/5b52e937-5b7e-4179-9766-20a9c2f93e35-kube-api-access-p8xw5\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:49 crc kubenswrapper[4895]: I1202 07:48:49.320566 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93ac7640-b11c-48f4-b537-45bebe4af01b-config" (OuterVolumeSpecName: "config") pod "93ac7640-b11c-48f4-b537-45bebe4af01b" (UID: "93ac7640-b11c-48f4-b537-45bebe4af01b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:48:49 crc kubenswrapper[4895]: I1202 07:48:49.321064 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93ac7640-b11c-48f4-b537-45bebe4af01b-scripts" (OuterVolumeSpecName: "scripts") pod "93ac7640-b11c-48f4-b537-45bebe4af01b" (UID: "93ac7640-b11c-48f4-b537-45bebe4af01b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:48:49 crc kubenswrapper[4895]: I1202 07:48:49.333932 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/93ac7640-b11c-48f4-b537-45bebe4af01b-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "93ac7640-b11c-48f4-b537-45bebe4af01b" (UID: "93ac7640-b11c-48f4-b537-45bebe4af01b"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:48:49 crc kubenswrapper[4895]: I1202 07:48:49.340500 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "ovndbcluster-nb-etc-ovn") pod "93ac7640-b11c-48f4-b537-45bebe4af01b" (UID: "93ac7640-b11c-48f4-b537-45bebe4af01b"). InnerVolumeSpecName "local-storage04-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 02 07:48:49 crc kubenswrapper[4895]: I1202 07:48:49.342506 4895 generic.go:334] "Generic (PLEG): container finished" podID="11b8ece5-4192-4e13-a1c7-86ed3c627ddf" containerID="5ced108d1ab8442c1fac1fe0fc3c7939f98a90c737db7a6f1aced0c0edb070a4" exitCode=0 Dec 02 07:48:49 crc kubenswrapper[4895]: I1202 07:48:49.342551 4895 generic.go:334] "Generic (PLEG): container finished" podID="11b8ece5-4192-4e13-a1c7-86ed3c627ddf" containerID="d16cb117b475cbe7eca7173bb117167934dc524dc42dabe3df6e81fc2b6e379b" exitCode=0 Dec 02 07:48:49 crc kubenswrapper[4895]: I1202 07:48:49.342562 4895 generic.go:334] "Generic (PLEG): container finished" podID="11b8ece5-4192-4e13-a1c7-86ed3c627ddf" containerID="fc2eeaa58100482b1a1ad56b93b5adeb32cce704c0f70987468501c900cb3962" exitCode=0 Dec 02 07:48:49 crc kubenswrapper[4895]: I1202 07:48:49.342574 4895 generic.go:334] "Generic (PLEG): container finished" podID="11b8ece5-4192-4e13-a1c7-86ed3c627ddf" containerID="8d7d30533f5cf82d2d0c96a4a07759e65bd3a99d6c9ea5aff2ebef3f2b8c14c4" exitCode=0 Dec 02 07:48:49 crc kubenswrapper[4895]: I1202 07:48:49.348211 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93ac7640-b11c-48f4-b537-45bebe4af01b-kube-api-access-k9v7k" (OuterVolumeSpecName: "kube-api-access-k9v7k") pod "93ac7640-b11c-48f4-b537-45bebe4af01b" (UID: "93ac7640-b11c-48f4-b537-45bebe4af01b"). InnerVolumeSpecName "kube-api-access-k9v7k". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:48:49 crc kubenswrapper[4895]: I1202 07:48:49.350655 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b52e937-5b7e-4179-9766-20a9c2f93e35-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "5b52e937-5b7e-4179-9766-20a9c2f93e35" (UID: "5b52e937-5b7e-4179-9766-20a9c2f93e35"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:48:49 crc kubenswrapper[4895]: I1202 07:48:49.352825 4895 generic.go:334] "Generic (PLEG): container finished" podID="e3d2bb1c-bd20-473e-b91a-7e2a63ec9f07" containerID="915e5d2b5f5c95e83c1104dc0136dd664c02203a632c18741317fd352c1f6413" exitCode=143 Dec 02 07:48:49 crc kubenswrapper[4895]: I1202 07:48:49.372286 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b52e937-5b7e-4179-9766-20a9c2f93e35-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5b52e937-5b7e-4179-9766-20a9c2f93e35" (UID: "5b52e937-5b7e-4179-9766-20a9c2f93e35"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:48:49 crc kubenswrapper[4895]: I1202 07:48:49.373788 4895 generic.go:334] "Generic (PLEG): container finished" podID="68bddf66-0b9f-4bc8-916b-aa0abfbf13c3" containerID="4bce6feae18b88a0dade864ed7f4db319704698221a61e3defcf26b5f9e0a73e" exitCode=143 Dec 02 07:48:49 crc kubenswrapper[4895]: I1202 07:48:49.379893 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_9a3bcb64-db25-4f04-8624-af10542e9f10/ovsdbserver-sb/0.log" Dec 02 07:48:49 crc kubenswrapper[4895]: I1202 07:48:49.380440 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 02 07:48:49 crc kubenswrapper[4895]: I1202 07:48:49.403162 4895 generic.go:334] "Generic (PLEG): container finished" podID="ab5ec753-410a-4d4b-8071-ce60970ba4df" containerID="949ad4d21813d885979595286daba6ad241d3bf3aac10ca8c334398ba63d2324" exitCode=0 Dec 02 07:48:49 crc kubenswrapper[4895]: I1202 07:48:49.411342 4895 generic.go:334] "Generic (PLEG): container finished" podID="290c1303-bf41-4474-86ff-c9f5aa105cc3" containerID="fd8c7d4e19097367de3d3f49094033e0adeb083a5427064f86bcdaba564bc61c" exitCode=143 Dec 02 07:48:49 crc kubenswrapper[4895]: I1202 07:48:49.430216 4895 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Dec 02 07:48:49 crc kubenswrapper[4895]: I1202 07:48:49.430258 4895 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5b52e937-5b7e-4179-9766-20a9c2f93e35-openstack-config\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:49 crc kubenswrapper[4895]: I1202 07:48:49.430273 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b52e937-5b7e-4179-9766-20a9c2f93e35-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:49 crc kubenswrapper[4895]: I1202 07:48:49.430294 4895 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93ac7640-b11c-48f4-b537-45bebe4af01b-config\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:49 crc kubenswrapper[4895]: I1202 07:48:49.430307 4895 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/93ac7640-b11c-48f4-b537-45bebe4af01b-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:49 crc kubenswrapper[4895]: I1202 07:48:49.430322 4895 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-k9v7k\" (UniqueName: \"kubernetes.io/projected/93ac7640-b11c-48f4-b537-45bebe4af01b-kube-api-access-k9v7k\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:49 crc kubenswrapper[4895]: I1202 07:48:49.430333 4895 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/93ac7640-b11c-48f4-b537-45bebe4af01b-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:49 crc kubenswrapper[4895]: I1202 07:48:49.443725 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b52e937-5b7e-4179-9766-20a9c2f93e35-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "5b52e937-5b7e-4179-9766-20a9c2f93e35" (UID: "5b52e937-5b7e-4179-9766-20a9c2f93e35"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:48:49 crc kubenswrapper[4895]: I1202 07:48:49.470266 4895 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Dec 02 07:48:49 crc kubenswrapper[4895]: I1202 07:48:49.516527 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93ac7640-b11c-48f4-b537-45bebe4af01b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "93ac7640-b11c-48f4-b537-45bebe4af01b" (UID: "93ac7640-b11c-48f4-b537-45bebe4af01b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:48:49 crc kubenswrapper[4895]: I1202 07:48:49.549817 4895 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:49 crc kubenswrapper[4895]: I1202 07:48:49.549863 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93ac7640-b11c-48f4-b537-45bebe4af01b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:49 crc kubenswrapper[4895]: I1202 07:48:49.549878 4895 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5b52e937-5b7e-4179-9766-20a9c2f93e35-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:49 crc kubenswrapper[4895]: I1202 07:48:49.581990 4895 scope.go:117] "RemoveContainer" containerID="06ff8339033df96a83077a62e929bc5d1df2839a17c03cd0e1008d9758abd8ec" Dec 02 07:48:49 crc kubenswrapper[4895]: I1202 07:48:49.599289 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-64685599d6-tgrm9" event={"ID":"e0c6d90b-6e06-4b01-a8d7-5761b6cb5c0f","Type":"ContainerDied","Data":"1ea05e687809a1075b370d099e40ef305622b4839ae26a0439d53df787025e36"} Dec 02 07:48:49 crc kubenswrapper[4895]: I1202 07:48:49.599339 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"11b8ece5-4192-4e13-a1c7-86ed3c627ddf","Type":"ContainerDied","Data":"5ced108d1ab8442c1fac1fe0fc3c7939f98a90c737db7a6f1aced0c0edb070a4"} Dec 02 07:48:49 crc kubenswrapper[4895]: I1202 07:48:49.599357 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"11b8ece5-4192-4e13-a1c7-86ed3c627ddf","Type":"ContainerDied","Data":"d16cb117b475cbe7eca7173bb117167934dc524dc42dabe3df6e81fc2b6e379b"} Dec 02 07:48:49 crc kubenswrapper[4895]: 
I1202 07:48:49.599372 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"11b8ece5-4192-4e13-a1c7-86ed3c627ddf","Type":"ContainerDied","Data":"fc2eeaa58100482b1a1ad56b93b5adeb32cce704c0f70987468501c900cb3962"} Dec 02 07:48:49 crc kubenswrapper[4895]: I1202 07:48:49.599387 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"11b8ece5-4192-4e13-a1c7-86ed3c627ddf","Type":"ContainerDied","Data":"8d7d30533f5cf82d2d0c96a4a07759e65bd3a99d6c9ea5aff2ebef3f2b8c14c4"} Dec 02 07:48:49 crc kubenswrapper[4895]: I1202 07:48:49.599402 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-788d454954-brr26" event={"ID":"e3d2bb1c-bd20-473e-b91a-7e2a63ec9f07","Type":"ContainerDied","Data":"915e5d2b5f5c95e83c1104dc0136dd664c02203a632c18741317fd352c1f6413"} Dec 02 07:48:49 crc kubenswrapper[4895]: I1202 07:48:49.599426 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 07:48:49 crc kubenswrapper[4895]: I1202 07:48:49.599447 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-54957dcd96-7sx87" event={"ID":"68bddf66-0b9f-4bc8-916b-aa0abfbf13c3","Type":"ContainerDied","Data":"4bce6feae18b88a0dade864ed7f4db319704698221a61e3defcf26b5f9e0a73e"} Dec 02 07:48:49 crc kubenswrapper[4895]: I1202 07:48:49.599459 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"9a3bcb64-db25-4f04-8624-af10542e9f10","Type":"ContainerDied","Data":"bff0285bdb4d1c719faa987576935a0217bbc0dd92601edd9e2e954e3b65b35b"} Dec 02 07:48:49 crc kubenswrapper[4895]: I1202 07:48:49.599472 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-ddf8948cc-h2bbh" event={"ID":"ab5ec753-410a-4d4b-8071-ce60970ba4df","Type":"ContainerDied","Data":"949ad4d21813d885979595286daba6ad241d3bf3aac10ca8c334398ba63d2324"} Dec 02 07:48:49 crc kubenswrapper[4895]: I1202 
07:48:49.599486 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"290c1303-bf41-4474-86ff-c9f5aa105cc3","Type":"ContainerDied","Data":"fd8c7d4e19097367de3d3f49094033e0adeb083a5427064f86bcdaba564bc61c"} Dec 02 07:48:49 crc kubenswrapper[4895]: I1202 07:48:49.599790 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93ac7640-b11c-48f4-b537-45bebe4af01b-ovsdbserver-nb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-nb-tls-certs") pod "93ac7640-b11c-48f4-b537-45bebe4af01b" (UID: "93ac7640-b11c-48f4-b537-45bebe4af01b"). InnerVolumeSpecName "ovsdbserver-nb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:48:49 crc kubenswrapper[4895]: I1202 07:48:49.599730 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="d08915b6-6f79-40e4-8c26-d9f82606b4cc" containerName="nova-scheduler-scheduler" containerID="cri-o://cb9866d7f2171a1626ecf3c4140a850dff5554a37f5e78b53e02cd154e5fe2d5" gracePeriod=30 Dec 02 07:48:49 crc kubenswrapper[4895]: I1202 07:48:49.646447 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93ac7640-b11c-48f4-b537-45bebe4af01b-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "93ac7640-b11c-48f4-b537-45bebe4af01b" (UID: "93ac7640-b11c-48f4-b537-45bebe4af01b"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:48:49 crc kubenswrapper[4895]: I1202 07:48:49.652919 4895 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/93ac7640-b11c-48f4-b537-45bebe4af01b-ovsdbserver-nb-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:49 crc kubenswrapper[4895]: I1202 07:48:49.652955 4895 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/93ac7640-b11c-48f4-b537-45bebe4af01b-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:49 crc kubenswrapper[4895]: E1202 07:48:49.653266 4895 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Dec 02 07:48:49 crc kubenswrapper[4895]: E1202 07:48:49.653340 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ca98cba7-4127-4d25-a139-1a42224331f2-config-data podName:ca98cba7-4127-4d25-a139-1a42224331f2 nodeName:}" failed. No retries permitted until 2025-12-02 07:48:53.653316883 +0000 UTC m=+1544.824176506 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/ca98cba7-4127-4d25-a139-1a42224331f2-config-data") pod "rabbitmq-server-0" (UID: "ca98cba7-4127-4d25-a139-1a42224331f2") : configmap "rabbitmq-config-data" not found Dec 02 07:48:49 crc kubenswrapper[4895]: I1202 07:48:49.670550 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance5583-account-delete-xm6hg"] Dec 02 07:48:49 crc kubenswrapper[4895]: I1202 07:48:49.794387 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder7d85-account-delete-j8sgc"] Dec 02 07:48:49 crc kubenswrapper[4895]: I1202 07:48:49.803754 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron5a3b-account-delete-949mv"] Dec 02 07:48:49 crc kubenswrapper[4895]: I1202 07:48:49.831028 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placementa8bc-account-delete-jz5nc"] Dec 02 07:48:49 crc kubenswrapper[4895]: I1202 07:48:49.853603 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-zx9lx"] Dec 02 07:48:49 crc kubenswrapper[4895]: I1202 07:48:49.862464 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-zx9lx"] Dec 02 07:48:49 crc kubenswrapper[4895]: E1202 07:48:49.870109 4895 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6aa3fa36a02a2278bdcd14541013304e85cab914b53ba0de74342a5fb50d00cd is running failed: container process not found" containerID="6aa3fa36a02a2278bdcd14541013304e85cab914b53ba0de74342a5fb50d00cd" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 02 07:48:49 crc kubenswrapper[4895]: E1202 07:48:49.871756 4895 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 
6aa3fa36a02a2278bdcd14541013304e85cab914b53ba0de74342a5fb50d00cd is running failed: container process not found" containerID="6aa3fa36a02a2278bdcd14541013304e85cab914b53ba0de74342a5fb50d00cd" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 02 07:48:49 crc kubenswrapper[4895]: E1202 07:48:49.872323 4895 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6aa3fa36a02a2278bdcd14541013304e85cab914b53ba0de74342a5fb50d00cd is running failed: container process not found" containerID="6aa3fa36a02a2278bdcd14541013304e85cab914b53ba0de74342a5fb50d00cd" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 02 07:48:49 crc kubenswrapper[4895]: E1202 07:48:49.872400 4895 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6aa3fa36a02a2278bdcd14541013304e85cab914b53ba0de74342a5fb50d00cd is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-9vczq" podUID="6b463255-a237-46b0-826d-1e6fc849f0aa" containerName="ovsdb-server" Dec 02 07:48:49 crc kubenswrapper[4895]: I1202 07:48:49.874286 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 02 07:48:49 crc kubenswrapper[4895]: E1202 07:48:49.876396 4895 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="7cc0795476568fadf42dedac218a2fe6065e25675de99e96dafc929b4ec7b9ff" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 02 07:48:49 crc kubenswrapper[4895]: E1202 07:48:49.881123 4895 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="7cc0795476568fadf42dedac218a2fe6065e25675de99e96dafc929b4ec7b9ff" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 02 07:48:49 crc kubenswrapper[4895]: E1202 07:48:49.883530 4895 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="7cc0795476568fadf42dedac218a2fe6065e25675de99e96dafc929b4ec7b9ff" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 02 07:48:49 crc kubenswrapper[4895]: E1202 07:48:49.883580 4895 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-9vczq" podUID="6b463255-a237-46b0-826d-1e6fc849f0aa" containerName="ovs-vswitchd" Dec 02 07:48:49 crc kubenswrapper[4895]: I1202 07:48:49.886040 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 02 07:48:49 crc kubenswrapper[4895]: I1202 07:48:49.894511 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-qvrkm"] Dec 02 07:48:49 crc kubenswrapper[4895]: I1202 07:48:49.901825 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-metrics-qvrkm"] Dec 02 07:48:49 crc kubenswrapper[4895]: I1202 07:48:49.926246 4895 scope.go:117] "RemoveContainer" containerID="12ca3ab2f0b64acec9c85ff2dcb3769838a447a3fd301eb6eff49f9f575c5ccb" Dec 02 07:48:50 crc kubenswrapper[4895]: I1202 07:48:50.057594 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-6f6974886f-mmsbz"] Dec 02 07:48:50 crc kubenswrapper[4895]: I1202 07:48:50.060373 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-6f6974886f-mmsbz" podUID="85e9e481-0762-42a8-a25a-7d50500f1236" 
containerName="proxy-server" containerID="cri-o://c0d40bd925f15211d99af8cacd53d2e85f799a87ce053777a156c72dcd0fd1bc" gracePeriod=30 Dec 02 07:48:50 crc kubenswrapper[4895]: I1202 07:48:50.063877 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-6f6974886f-mmsbz" podUID="85e9e481-0762-42a8-a25a-7d50500f1236" containerName="proxy-httpd" containerID="cri-o://8905a04cd6af553d962d3110181cd121c314f07c69d9726566c2e6fedcbedc7d" gracePeriod=30 Dec 02 07:48:50 crc kubenswrapper[4895]: I1202 07:48:50.077929 4895 scope.go:117] "RemoveContainer" containerID="f92e86e4ef56e11c6550ddfd03d9e3a46bb2f030d0256069562686b8ad550a7f" Dec 02 07:48:50 crc kubenswrapper[4895]: I1202 07:48:50.207014 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican4aa4-account-delete-pvmbl"] Dec 02 07:48:50 crc kubenswrapper[4895]: I1202 07:48:50.393860 4895 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-6f6974886f-mmsbz" podUID="85e9e481-0762-42a8-a25a-7d50500f1236" containerName="proxy-server" probeResult="failure" output="Get \"https://10.217.0.165:8080/healthcheck\": dial tcp 10.217.0.165:8080: connect: connection refused" Dec 02 07:48:50 crc kubenswrapper[4895]: I1202 07:48:50.396000 4895 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-6f6974886f-mmsbz" podUID="85e9e481-0762-42a8-a25a-7d50500f1236" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.165:8080/healthcheck\": dial tcp 10.217.0.165:8080: connect: connection refused" Dec 02 07:48:50 crc kubenswrapper[4895]: I1202 07:48:50.444090 4895 scope.go:117] "RemoveContainer" containerID="00f29c5ae0bb6e7bc18499f6d66bee4cc18c2d48981f4cf8697279c90c4396ff" Dec 02 07:48:50 crc kubenswrapper[4895]: W1202 07:48:50.475049 4895 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd42411e0_2228_4a1a_9d31_e3788f2b1f0c.slice/crio-09ffbce5ff992ce45a6588d7df661406a36c03879603fc1bd1229eece009e3bf WatchSource:0}: Error finding container 09ffbce5ff992ce45a6588d7df661406a36c03879603fc1bd1229eece009e3bf: Status 404 returned error can't find the container with id 09ffbce5ff992ce45a6588d7df661406a36c03879603fc1bd1229eece009e3bf Dec 02 07:48:50 crc kubenswrapper[4895]: I1202 07:48:50.533055 4895 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-cell1-galera-0" podUID="ace60b46-ed73-43ba-8d95-b81b03a6bd0a" containerName="galera" probeResult="failure" output="" Dec 02 07:48:50 crc kubenswrapper[4895]: I1202 07:48:50.542897 4895 generic.go:334] "Generic (PLEG): container finished" podID="5b34f139-ac6c-4a24-b478-c4563cce6a2c" containerID="7d6a5cbf4ac42d7b9bcb1f16b7d852ad3e604b72c9dfa43a52ca193e0d0f7f4e" exitCode=0 Dec 02 07:48:50 crc kubenswrapper[4895]: I1202 07:48:50.542984 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-b969f4967-hmqp8" event={"ID":"5b34f139-ac6c-4a24-b478-c4563cce6a2c","Type":"ContainerDied","Data":"7d6a5cbf4ac42d7b9bcb1f16b7d852ad3e604b72c9dfa43a52ca193e0d0f7f4e"} Dec 02 07:48:50 crc kubenswrapper[4895]: I1202 07:48:50.546564 4895 generic.go:334] "Generic (PLEG): container finished" podID="e203ec5f-dd45-44bb-97b2-fd8a548ce231" containerID="420c0838ecd0aae08b37769c218f02c3925b9911317837e7d544bfd4a42c3463" exitCode=0 Dec 02 07:48:50 crc kubenswrapper[4895]: I1202 07:48:50.546657 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vpwvp" event={"ID":"e203ec5f-dd45-44bb-97b2-fd8a548ce231","Type":"ContainerDied","Data":"420c0838ecd0aae08b37769c218f02c3925b9911317837e7d544bfd4a42c3463"} Dec 02 07:48:50 crc kubenswrapper[4895]: I1202 07:48:50.550170 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placementa8bc-account-delete-jz5nc" 
event={"ID":"7067a12f-0245-45f5-a806-591d5999c7f0","Type":"ContainerStarted","Data":"fb26a44230d333a13630d130d653b763d7dcb89f113e8876425cd193ebb107d4"} Dec 02 07:48:50 crc kubenswrapper[4895]: I1202 07:48:50.587188 4895 generic.go:334] "Generic (PLEG): container finished" podID="85e9e481-0762-42a8-a25a-7d50500f1236" containerID="8905a04cd6af553d962d3110181cd121c314f07c69d9726566c2e6fedcbedc7d" exitCode=0 Dec 02 07:48:50 crc kubenswrapper[4895]: I1202 07:48:50.587327 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6f6974886f-mmsbz" event={"ID":"85e9e481-0762-42a8-a25a-7d50500f1236","Type":"ContainerDied","Data":"8905a04cd6af553d962d3110181cd121c314f07c69d9726566c2e6fedcbedc7d"} Dec 02 07:48:50 crc kubenswrapper[4895]: I1202 07:48:50.600877 4895 generic.go:334] "Generic (PLEG): container finished" podID="e0c6d90b-6e06-4b01-a8d7-5761b6cb5c0f" containerID="87a341d01cbe5679c7f66108701ad133b21f9226861ceb315e694aa0b420673a" exitCode=0 Dec 02 07:48:50 crc kubenswrapper[4895]: I1202 07:48:50.601023 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-64685599d6-tgrm9" event={"ID":"e0c6d90b-6e06-4b01-a8d7-5761b6cb5c0f","Type":"ContainerDied","Data":"87a341d01cbe5679c7f66108701ad133b21f9226861ceb315e694aa0b420673a"} Dec 02 07:48:50 crc kubenswrapper[4895]: I1202 07:48:50.610312 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder7d85-account-delete-j8sgc" event={"ID":"f28e5fd3-456b-4960-a3a9-1134e3eecb1f","Type":"ContainerStarted","Data":"3cbef3cc9d8370b2c106b6a1f67d7cde1be7e59e73377a5eabb4fc8c2688067a"} Dec 02 07:48:50 crc kubenswrapper[4895]: I1202 07:48:50.616353 4895 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="ebbed0ba-1d44-4421-a276-b075b0f31c3f" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.162:8776/healthcheck\": read tcp 10.217.0.2:39590->10.217.0.162:8776: read: connection reset by peer" 
Dec 02 07:48:50 crc kubenswrapper[4895]: I1202 07:48:50.647132 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_93ac7640-b11c-48f4-b537-45bebe4af01b/ovsdbserver-nb/0.log" Dec 02 07:48:50 crc kubenswrapper[4895]: I1202 07:48:50.647223 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"93ac7640-b11c-48f4-b537-45bebe4af01b","Type":"ContainerDied","Data":"9898ced809f4f02ded24f28135de90d0170c28170f7395759aab814117ee8368"} Dec 02 07:48:50 crc kubenswrapper[4895]: I1202 07:48:50.647345 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 02 07:48:50 crc kubenswrapper[4895]: I1202 07:48:50.733980 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/novacell0b7d1-account-delete-wchwk"] Dec 02 07:48:50 crc kubenswrapper[4895]: I1202 07:48:50.734334 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance5583-account-delete-xm6hg" event={"ID":"6b3c2445-8bce-4d09-ad86-02c1ba6495fb","Type":"ContainerStarted","Data":"f9caf1a101a5817e21ca4e677736eb4472cb7fca73f00ec0f90330253ace0248"} Dec 02 07:48:50 crc kubenswrapper[4895]: I1202 07:48:50.734374 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance5583-account-delete-xm6hg" event={"ID":"6b3c2445-8bce-4d09-ad86-02c1ba6495fb","Type":"ContainerStarted","Data":"446dfc1cb253a69c411beac7da8e5a22f44d1cea25cb336c6af447ebccd97a50"} Dec 02 07:48:50 crc kubenswrapper[4895]: I1202 07:48:50.781900 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/novaapi23cb-account-delete-g8msv"] Dec 02 07:48:50 crc kubenswrapper[4895]: I1202 07:48:50.795447 4895 generic.go:334] "Generic (PLEG): container finished" podID="6b463255-a237-46b0-826d-1e6fc849f0aa" containerID="6aa3fa36a02a2278bdcd14541013304e85cab914b53ba0de74342a5fb50d00cd" exitCode=0 Dec 02 07:48:50 crc kubenswrapper[4895]: I1202 07:48:50.795520 4895 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-9vczq" event={"ID":"6b463255-a237-46b0-826d-1e6fc849f0aa","Type":"ContainerDied","Data":"6aa3fa36a02a2278bdcd14541013304e85cab914b53ba0de74342a5fb50d00cd"} Dec 02 07:48:50 crc kubenswrapper[4895]: I1202 07:48:50.802165 4895 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="e4869eb0-5e33-4837-8295-06ca17076e69" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.167:9292/healthcheck\": read tcp 10.217.0.2:34752->10.217.0.167:9292: read: connection reset by peer" Dec 02 07:48:50 crc kubenswrapper[4895]: I1202 07:48:50.802346 4895 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="e4869eb0-5e33-4837-8295-06ca17076e69" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.167:9292/healthcheck\": read tcp 10.217.0.2:34766->10.217.0.167:9292: read: connection reset by peer" Dec 02 07:48:50 crc kubenswrapper[4895]: I1202 07:48:50.808888 4895 generic.go:334] "Generic (PLEG): container finished" podID="836bba81-425e-4610-b191-2bbb2cfc1f79" containerID="8a861d47b18ce485f266fa0a57adf3455c385cacb617b37e3b45a4bd17799c71" exitCode=0 Dec 02 07:48:50 crc kubenswrapper[4895]: I1202 07:48:50.808974 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"836bba81-425e-4610-b191-2bbb2cfc1f79","Type":"ContainerDied","Data":"8a861d47b18ce485f266fa0a57adf3455c385cacb617b37e3b45a4bd17799c71"} Dec 02 07:48:50 crc kubenswrapper[4895]: I1202 07:48:50.814095 4895 generic.go:334] "Generic (PLEG): container finished" podID="183c5216-30f9-4f75-865b-7f795ea149fb" containerID="21f6d09bc2b80b8035a54dfa404bb01cbc6de2843d53dca435681f4b45dafd2f" exitCode=0 Dec 02 07:48:50 crc kubenswrapper[4895]: I1202 07:48:50.814193 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell1-novncproxy-0" event={"ID":"183c5216-30f9-4f75-865b-7f795ea149fb","Type":"ContainerDied","Data":"21f6d09bc2b80b8035a54dfa404bb01cbc6de2843d53dca435681f4b45dafd2f"} Dec 02 07:48:50 crc kubenswrapper[4895]: I1202 07:48:50.818923 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron5a3b-account-delete-949mv" event={"ID":"5cae5c9e-9159-4e78-9809-1801d0e35131","Type":"ContainerStarted","Data":"1ffc4dad2b26cfe1658af060ac03b29a5bd8150a1c2d6396811081f31b4196d0"} Dec 02 07:48:50 crc kubenswrapper[4895]: I1202 07:48:50.900923 4895 scope.go:117] "RemoveContainer" containerID="702d499c8eb77b3784f109189f3605813a2864341fb22907bb8c80622cf297f0" Dec 02 07:48:50 crc kubenswrapper[4895]: I1202 07:48:50.995035 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-b969f4967-hmqp8" Dec 02 07:48:50 crc kubenswrapper[4895]: I1202 07:48:50.996107 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 02 07:48:51 crc kubenswrapper[4895]: I1202 07:48:51.004618 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-64685599d6-tgrm9" Dec 02 07:48:51 crc kubenswrapper[4895]: I1202 07:48:51.008939 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 02 07:48:51 crc kubenswrapper[4895]: I1202 07:48:51.066299 4895 scope.go:117] "RemoveContainer" containerID="37e31157e4862cb11e5acde395d6bd0df5dd7a9d1818a4a02968a675039e325e" Dec 02 07:48:51 crc kubenswrapper[4895]: I1202 07:48:51.111041 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0c6d90b-6e06-4b01-a8d7-5761b6cb5c0f-combined-ca-bundle\") pod \"e0c6d90b-6e06-4b01-a8d7-5761b6cb5c0f\" (UID: \"e0c6d90b-6e06-4b01-a8d7-5761b6cb5c0f\") " Dec 02 07:48:51 crc kubenswrapper[4895]: I1202 07:48:51.111458 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0c6d90b-6e06-4b01-a8d7-5761b6cb5c0f-config-data\") pod \"e0c6d90b-6e06-4b01-a8d7-5761b6cb5c0f\" (UID: \"e0c6d90b-6e06-4b01-a8d7-5761b6cb5c0f\") " Dec 02 07:48:51 crc kubenswrapper[4895]: I1202 07:48:51.111519 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b34f139-ac6c-4a24-b478-c4563cce6a2c-combined-ca-bundle\") pod \"5b34f139-ac6c-4a24-b478-c4563cce6a2c\" (UID: \"5b34f139-ac6c-4a24-b478-c4563cce6a2c\") " Dec 02 07:48:51 crc kubenswrapper[4895]: I1202 07:48:51.111556 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n7g5b\" (UniqueName: \"kubernetes.io/projected/5b34f139-ac6c-4a24-b478-c4563cce6a2c-kube-api-access-n7g5b\") pod \"5b34f139-ac6c-4a24-b478-c4563cce6a2c\" (UID: \"5b34f139-ac6c-4a24-b478-c4563cce6a2c\") " Dec 02 07:48:51 crc kubenswrapper[4895]: I1202 07:48:51.111589 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0c6d90b-6e06-4b01-a8d7-5761b6cb5c0f-logs\") pod \"e0c6d90b-6e06-4b01-a8d7-5761b6cb5c0f\" (UID: \"e0c6d90b-6e06-4b01-a8d7-5761b6cb5c0f\") " Dec 02 07:48:51 crc kubenswrapper[4895]: I1202 07:48:51.111608 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b34f139-ac6c-4a24-b478-c4563cce6a2c-config-data\") pod \"5b34f139-ac6c-4a24-b478-c4563cce6a2c\" (UID: \"5b34f139-ac6c-4a24-b478-c4563cce6a2c\") " Dec 02 07:48:51 crc kubenswrapper[4895]: I1202 07:48:51.111734 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vjbgq\" (UniqueName: \"kubernetes.io/projected/e0c6d90b-6e06-4b01-a8d7-5761b6cb5c0f-kube-api-access-vjbgq\") pod \"e0c6d90b-6e06-4b01-a8d7-5761b6cb5c0f\" (UID: \"e0c6d90b-6e06-4b01-a8d7-5761b6cb5c0f\") " Dec 02 07:48:51 crc kubenswrapper[4895]: I1202 07:48:51.111763 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e0c6d90b-6e06-4b01-a8d7-5761b6cb5c0f-config-data-custom\") pod \"e0c6d90b-6e06-4b01-a8d7-5761b6cb5c0f\" (UID: \"e0c6d90b-6e06-4b01-a8d7-5761b6cb5c0f\") " Dec 02 07:48:51 crc kubenswrapper[4895]: I1202 07:48:51.111891 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5b34f139-ac6c-4a24-b478-c4563cce6a2c-logs\") pod \"5b34f139-ac6c-4a24-b478-c4563cce6a2c\" (UID: \"5b34f139-ac6c-4a24-b478-c4563cce6a2c\") " Dec 02 07:48:51 crc kubenswrapper[4895]: I1202 07:48:51.111925 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5b34f139-ac6c-4a24-b478-c4563cce6a2c-config-data-custom\") pod \"5b34f139-ac6c-4a24-b478-c4563cce6a2c\" (UID: \"5b34f139-ac6c-4a24-b478-c4563cce6a2c\") " Dec 02 07:48:51 crc kubenswrapper[4895]: I1202 
07:48:51.115582 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b34f139-ac6c-4a24-b478-c4563cce6a2c-logs" (OuterVolumeSpecName: "logs") pod "5b34f139-ac6c-4a24-b478-c4563cce6a2c" (UID: "5b34f139-ac6c-4a24-b478-c4563cce6a2c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:48:51 crc kubenswrapper[4895]: I1202 07:48:51.122840 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0c6d90b-6e06-4b01-a8d7-5761b6cb5c0f-logs" (OuterVolumeSpecName: "logs") pod "e0c6d90b-6e06-4b01-a8d7-5761b6cb5c0f" (UID: "e0c6d90b-6e06-4b01-a8d7-5761b6cb5c0f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:48:51 crc kubenswrapper[4895]: I1202 07:48:51.143309 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0c6d90b-6e06-4b01-a8d7-5761b6cb5c0f-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "e0c6d90b-6e06-4b01-a8d7-5761b6cb5c0f" (UID: "e0c6d90b-6e06-4b01-a8d7-5761b6cb5c0f"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:48:51 crc kubenswrapper[4895]: I1202 07:48:51.170766 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0c6d90b-6e06-4b01-a8d7-5761b6cb5c0f-kube-api-access-vjbgq" (OuterVolumeSpecName: "kube-api-access-vjbgq") pod "e0c6d90b-6e06-4b01-a8d7-5761b6cb5c0f" (UID: "e0c6d90b-6e06-4b01-a8d7-5761b6cb5c0f"). InnerVolumeSpecName "kube-api-access-vjbgq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:48:51 crc kubenswrapper[4895]: I1202 07:48:51.170820 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b34f139-ac6c-4a24-b478-c4563cce6a2c-kube-api-access-n7g5b" (OuterVolumeSpecName: "kube-api-access-n7g5b") pod "5b34f139-ac6c-4a24-b478-c4563cce6a2c" (UID: "5b34f139-ac6c-4a24-b478-c4563cce6a2c"). InnerVolumeSpecName "kube-api-access-n7g5b". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:48:51 crc kubenswrapper[4895]: I1202 07:48:51.172454 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b34f139-ac6c-4a24-b478-c4563cce6a2c-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "5b34f139-ac6c-4a24-b478-c4563cce6a2c" (UID: "5b34f139-ac6c-4a24-b478-c4563cce6a2c"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:48:51 crc kubenswrapper[4895]: I1202 07:48:51.181286 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b52e937-5b7e-4179-9766-20a9c2f93e35" path="/var/lib/kubelet/pods/5b52e937-5b7e-4179-9766-20a9c2f93e35/volumes" Dec 02 07:48:51 crc kubenswrapper[4895]: I1202 07:48:51.181932 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8206622d-b224-4744-9358-ad7c10d98ca1" path="/var/lib/kubelet/pods/8206622d-b224-4744-9358-ad7c10d98ca1/volumes" Dec 02 07:48:51 crc kubenswrapper[4895]: I1202 07:48:51.183025 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93ac7640-b11c-48f4-b537-45bebe4af01b" path="/var/lib/kubelet/pods/93ac7640-b11c-48f4-b537-45bebe4af01b/volumes" Dec 02 07:48:51 crc kubenswrapper[4895]: I1202 07:48:51.184985 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a3bcb64-db25-4f04-8624-af10542e9f10" path="/var/lib/kubelet/pods/9a3bcb64-db25-4f04-8624-af10542e9f10/volumes" Dec 02 07:48:51 crc 
kubenswrapper[4895]: I1202 07:48:51.189271 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c5f33d2-0416-40b5-8133-324aa1a60118" path="/var/lib/kubelet/pods/9c5f33d2-0416-40b5-8133-324aa1a60118/volumes" Dec 02 07:48:51 crc kubenswrapper[4895]: E1202 07:48:51.218775 4895 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9c18aeb9311a3ffa5790c2f236b884d856db73bab542194f9a4509de984dba58" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Dec 02 07:48:51 crc kubenswrapper[4895]: I1202 07:48:51.224492 4895 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5b34f139-ac6c-4a24-b478-c4563cce6a2c-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:51 crc kubenswrapper[4895]: I1202 07:48:51.226484 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n7g5b\" (UniqueName: \"kubernetes.io/projected/5b34f139-ac6c-4a24-b478-c4563cce6a2c-kube-api-access-n7g5b\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:51 crc kubenswrapper[4895]: I1202 07:48:51.226596 4895 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0c6d90b-6e06-4b01-a8d7-5761b6cb5c0f-logs\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:51 crc kubenswrapper[4895]: I1202 07:48:51.226675 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vjbgq\" (UniqueName: \"kubernetes.io/projected/e0c6d90b-6e06-4b01-a8d7-5761b6cb5c0f-kube-api-access-vjbgq\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:51 crc kubenswrapper[4895]: I1202 07:48:51.226783 4895 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e0c6d90b-6e06-4b01-a8d7-5761b6cb5c0f-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 02 
07:48:51 crc kubenswrapper[4895]: I1202 07:48:51.226851 4895 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5b34f139-ac6c-4a24-b478-c4563cce6a2c-logs\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:51 crc kubenswrapper[4895]: E1202 07:48:51.226981 4895 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9c18aeb9311a3ffa5790c2f236b884d856db73bab542194f9a4509de984dba58" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Dec 02 07:48:51 crc kubenswrapper[4895]: E1202 07:48:51.229887 4895 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9c18aeb9311a3ffa5790c2f236b884d856db73bab542194f9a4509de984dba58" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Dec 02 07:48:51 crc kubenswrapper[4895]: E1202 07:48:51.230032 4895 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-northd-0" podUID="446b5a26-8e57-4765-bb7d-275cf05996dd" containerName="ovn-northd" Dec 02 07:48:51 crc kubenswrapper[4895]: I1202 07:48:51.512323 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0c6d90b-6e06-4b01-a8d7-5761b6cb5c0f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e0c6d90b-6e06-4b01-a8d7-5761b6cb5c0f" (UID: "e0c6d90b-6e06-4b01-a8d7-5761b6cb5c0f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:48:51 crc kubenswrapper[4895]: I1202 07:48:51.538583 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0c6d90b-6e06-4b01-a8d7-5761b6cb5c0f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:51 crc kubenswrapper[4895]: E1202 07:48:51.624964 4895 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="cb9866d7f2171a1626ecf3c4140a850dff5554a37f5e78b53e02cd154e5fe2d5" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 02 07:48:51 crc kubenswrapper[4895]: E1202 07:48:51.637065 4895 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="cb9866d7f2171a1626ecf3c4140a850dff5554a37f5e78b53e02cd154e5fe2d5" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 02 07:48:51 crc kubenswrapper[4895]: I1202 07:48:51.637416 4895 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="2d2c6f66-9fe8-4d15-92f6-f29493e6cd7b" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.200:8775/\": read tcp 10.217.0.2:42774->10.217.0.200:8775: read: connection reset by peer" Dec 02 07:48:51 crc kubenswrapper[4895]: I1202 07:48:51.637511 4895 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="2d2c6f66-9fe8-4d15-92f6-f29493e6cd7b" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.200:8775/\": read tcp 10.217.0.2:42780->10.217.0.200:8775: read: connection reset by peer" Dec 02 07:48:51 crc kubenswrapper[4895]: E1202 07:48:51.658252 4895 log.go:32] "ExecSync cmd from runtime service failed" err="rpc 
error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="cb9866d7f2171a1626ecf3c4140a850dff5554a37f5e78b53e02cd154e5fe2d5" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Dec 02 07:48:51 crc kubenswrapper[4895]: E1202 07:48:51.658348 4895 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="d08915b6-6f79-40e4-8c26-d9f82606b4cc" containerName="nova-scheduler-scheduler"
Dec 02 07:48:51 crc kubenswrapper[4895]: I1202 07:48:51.710392 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b34f139-ac6c-4a24-b478-c4563cce6a2c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5b34f139-ac6c-4a24-b478-c4563cce6a2c" (UID: "5b34f139-ac6c-4a24-b478-c4563cce6a2c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 07:48:51 crc kubenswrapper[4895]: I1202 07:48:51.737088 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b34f139-ac6c-4a24-b478-c4563cce6a2c-config-data" (OuterVolumeSpecName: "config-data") pod "5b34f139-ac6c-4a24-b478-c4563cce6a2c" (UID: "5b34f139-ac6c-4a24-b478-c4563cce6a2c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 07:48:51 crc kubenswrapper[4895]: I1202 07:48:51.747577 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b34f139-ac6c-4a24-b478-c4563cce6a2c-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 02 07:48:51 crc kubenswrapper[4895]: I1202 07:48:51.747611 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b34f139-ac6c-4a24-b478-c4563cce6a2c-config-data\") on node \"crc\" DevicePath \"\""
Dec 02 07:48:52 crc kubenswrapper[4895]: I1202 07:48:51.808661 4895 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-788d454954-brr26" podUID="e3d2bb1c-bd20-473e-b91a-7e2a63ec9f07" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.161:9311/healthcheck\": read tcp 10.217.0.2:44360->10.217.0.161:9311: read: connection reset by peer"
Dec 02 07:48:52 crc kubenswrapper[4895]: I1202 07:48:51.808646 4895 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-788d454954-brr26" podUID="e3d2bb1c-bd20-473e-b91a-7e2a63ec9f07" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.161:9311/healthcheck\": read tcp 10.217.0.2:44370->10.217.0.161:9311: read: connection reset by peer"
Dec 02 07:48:52 crc kubenswrapper[4895]: I1202 07:48:51.851375 4895 generic.go:334] "Generic (PLEG): container finished" podID="6b3c2445-8bce-4d09-ad86-02c1ba6495fb" containerID="f9caf1a101a5817e21ca4e677736eb4472cb7fca73f00ec0f90330253ace0248" exitCode=0
Dec 02 07:48:52 crc kubenswrapper[4895]: I1202 07:48:51.869106 4895 generic.go:334] "Generic (PLEG): container finished" podID="85e9e481-0762-42a8-a25a-7d50500f1236" containerID="c0d40bd925f15211d99af8cacd53d2e85f799a87ce053777a156c72dcd0fd1bc" exitCode=0
Dec 02 07:48:52 crc kubenswrapper[4895]: I1202 07:48:51.870840 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0c6d90b-6e06-4b01-a8d7-5761b6cb5c0f-config-data" (OuterVolumeSpecName: "config-data") pod "e0c6d90b-6e06-4b01-a8d7-5761b6cb5c0f" (UID: "e0c6d90b-6e06-4b01-a8d7-5761b6cb5c0f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 07:48:52 crc kubenswrapper[4895]: I1202 07:48:51.896473 4895 generic.go:334] "Generic (PLEG): container finished" podID="2d2c6f66-9fe8-4d15-92f6-f29493e6cd7b" containerID="1581cb4c4b70dcc4008550020a88177eb72fd5b2057dc2f0204082b9090480c2" exitCode=0
Dec 02 07:48:52 crc kubenswrapper[4895]: I1202 07:48:51.898571 4895 generic.go:334] "Generic (PLEG): container finished" podID="ab5ec753-410a-4d4b-8071-ce60970ba4df" containerID="44ae8909515453d51c81fc2eab9723fc18e5cf8dc79ec16427db8d716e2d75dd" exitCode=0
Dec 02 07:48:52 crc kubenswrapper[4895]: I1202 07:48:51.916935 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron5a3b-account-delete-949mv" podStartSLOduration=7.916902118 podStartE2EDuration="7.916902118s" podCreationTimestamp="2025-12-02 07:48:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:48:51.909269161 +0000 UTC m=+1543.080128784" watchObservedRunningTime="2025-12-02 07:48:51.916902118 +0000 UTC m=+1543.087761731"
Dec 02 07:48:52 crc kubenswrapper[4895]: I1202 07:48:51.922328 4895 generic.go:334] "Generic (PLEG): container finished" podID="ebbed0ba-1d44-4421-a276-b075b0f31c3f" containerID="3a0d36cdfb3f77e74dda0c49d0558e6c7571700d4bfd6cdaa1acbb5f35e6a972" exitCode=0
Dec 02 07:48:52 crc kubenswrapper[4895]: I1202 07:48:51.932571 4895 generic.go:334] "Generic (PLEG): container finished" podID="e4869eb0-5e33-4837-8295-06ca17076e69" containerID="2404d0d162ba97497121e295a4d0041b66d86ff11fa14a769019cf11872671c2" exitCode=0
Dec 02 07:48:52 crc kubenswrapper[4895]: I1202 07:48:51.941706 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican4aa4-account-delete-pvmbl" podStartSLOduration=6.941678545 podStartE2EDuration="6.941678545s" podCreationTimestamp="2025-12-02 07:48:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:48:51.936302109 +0000 UTC m=+1543.107161722" watchObservedRunningTime="2025-12-02 07:48:51.941678545 +0000 UTC m=+1543.112538178"
Dec 02 07:48:52 crc kubenswrapper[4895]: I1202 07:48:51.958196 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0c6d90b-6e06-4b01-a8d7-5761b6cb5c0f-config-data\") on node \"crc\" DevicePath \"\""
Dec 02 07:48:52 crc kubenswrapper[4895]: I1202 07:48:51.972381 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-b969f4967-hmqp8"
Dec 02 07:48:52 crc kubenswrapper[4895]: I1202 07:48:51.990057 4895 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/placement-54957dcd96-7sx87" podUID="68bddf66-0b9f-4bc8-916b-aa0abfbf13c3" containerName="placement-log" probeResult="failure" output="Get \"https://10.217.0.150:8778/\": read tcp 10.217.0.2:57560->10.217.0.150:8778: read: connection reset by peer"
Dec 02 07:48:52 crc kubenswrapper[4895]: I1202 07:48:51.990515 4895 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/placement-54957dcd96-7sx87" podUID="68bddf66-0b9f-4bc8-916b-aa0abfbf13c3" containerName="placement-api" probeResult="failure" output="Get \"https://10.217.0.150:8778/\": read tcp 10.217.0.2:57564->10.217.0.150:8778: read: connection reset by peer"
Dec 02 07:48:52 crc kubenswrapper[4895]: I1202 07:48:51.994480 4895 generic.go:334] "Generic (PLEG): container finished" podID="ace60b46-ed73-43ba-8d95-b81b03a6bd0a" containerID="8a2f68e796838e87e45698b40183c455794d730caf5af19a07c35fd150b09fe1" exitCode=0
Dec 02 07:48:52 crc kubenswrapper[4895]: I1202 07:48:52.012416 4895 generic.go:334] "Generic (PLEG): container finished" podID="290c1303-bf41-4474-86ff-c9f5aa105cc3" containerID="973ab025884cab7054f146e0f744a06e1f4e800f6c16521085496ffc96503509" exitCode=0
Dec 02 07:48:52 crc kubenswrapper[4895]: I1202 07:48:52.015907 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-64685599d6-tgrm9"
Dec 02 07:48:52 crc kubenswrapper[4895]: W1202 07:48:52.125937 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6ebf9714_5e6d_415c_a0aa_adab0d3e46e9.slice/crio-a22f4c905d0801dd52d8abeba3d1d1cda84aa90931f8f427a2eaefb256ac2937 WatchSource:0}: Error finding container a22f4c905d0801dd52d8abeba3d1d1cda84aa90931f8f427a2eaefb256ac2937: Status 404 returned error can't find the container with id a22f4c905d0801dd52d8abeba3d1d1cda84aa90931f8f427a2eaefb256ac2937
Dec 02 07:48:52 crc kubenswrapper[4895]: E1202 07:48:52.466310 4895 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="1.324s"
Dec 02 07:48:52 crc kubenswrapper[4895]: I1202 07:48:52.466981 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance5583-account-delete-xm6hg" event={"ID":"6b3c2445-8bce-4d09-ad86-02c1ba6495fb","Type":"ContainerDied","Data":"f9caf1a101a5817e21ca4e677736eb4472cb7fca73f00ec0f90330253ace0248"}
Dec 02 07:48:52 crc kubenswrapper[4895]: I1202 07:48:52.467017 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wjx7g"]
Dec 02 07:48:52 crc kubenswrapper[4895]: I1202 07:48:52.467057 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6f6974886f-mmsbz" event={"ID":"85e9e481-0762-42a8-a25a-7d50500f1236","Type":"ContainerDied","Data":"c0d40bd925f15211d99af8cacd53d2e85f799a87ce053777a156c72dcd0fd1bc"}
Dec 02 crc kubenswrapper[4895]: I1202 07:48:52.467075 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Dec 02 07:48:52 crc kubenswrapper[4895]: I1202 07:48:52.467251 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6f6974886f-mmsbz" event={"ID":"85e9e481-0762-42a8-a25a-7d50500f1236","Type":"ContainerDied","Data":"3c171dd2f5f04681f363b255580423e1255efef85f62c66d8427210f76e945e1"}
Dec 02 07:48:52 crc kubenswrapper[4895]: I1202 07:48:52.467338 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3c171dd2f5f04681f363b255580423e1255efef85f62c66d8427210f76e945e1"
Dec 02 07:48:52 crc kubenswrapper[4895]: I1202 07:48:52.467376 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron5a3b-account-delete-949mv" event={"ID":"5cae5c9e-9159-4e78-9809-1801d0e35131","Type":"ContainerStarted","Data":"886e593440f5f547e624e04c372422144b2af46990afd1b6c63c56f2dacb354f"}
Dec 02 07:48:52 crc kubenswrapper[4895]: I1202 07:48:52.467397 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2d2c6f66-9fe8-4d15-92f6-f29493e6cd7b","Type":"ContainerDied","Data":"1581cb4c4b70dcc4008550020a88177eb72fd5b2057dc2f0204082b9090480c2"}
Dec 02 07:48:52 crc kubenswrapper[4895]: I1202 07:48:52.467419 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-ddf8948cc-h2bbh" event={"ID":"ab5ec753-410a-4d4b-8071-ce60970ba4df","Type":"ContainerDied","Data":"44ae8909515453d51c81fc2eab9723fc18e5cf8dc79ec16427db8d716e2d75dd"}
Dec 02 07:48:52 crc kubenswrapper[4895]: I1202 07:48:52.467440 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican4aa4-account-delete-pvmbl" event={"ID":"d42411e0-2228-4a1a-9d31-e3788f2b1f0c","Type":"ContainerStarted","Data":"9feda8f6a8375fc053369762aade6135c47ba96ba66d70d924ce2840072589a8"}
Dec 02 07:48:52 crc kubenswrapper[4895]: I1202 07:48:52.467457 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican4aa4-account-delete-pvmbl" event={"ID":"d42411e0-2228-4a1a-9d31-e3788f2b1f0c","Type":"ContainerStarted","Data":"09ffbce5ff992ce45a6588d7df661406a36c03879603fc1bd1229eece009e3bf"}
Dec 02 07:48:52 crc kubenswrapper[4895]: I1202 07:48:52.467471 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"183c5216-30f9-4f75-865b-7f795ea149fb","Type":"ContainerDied","Data":"45048d305695bd4e78872a121b29efa32be5590ce3404fe1e6a4b774a3633a98"}
Dec 02 07:48:52 crc kubenswrapper[4895]: I1202 07:48:52.467487 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="45048d305695bd4e78872a121b29efa32be5590ce3404fe1e6a4b774a3633a98"
Dec 02 07:48:52 crc kubenswrapper[4895]: I1202 07:48:52.467500 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ebbed0ba-1d44-4421-a276-b075b0f31c3f","Type":"ContainerDied","Data":"3a0d36cdfb3f77e74dda0c49d0558e6c7571700d4bfd6cdaa1acbb5f35e6a972"}
Dec 02 07:48:52 crc kubenswrapper[4895]: I1202 07:48:52.467513 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ebbed0ba-1d44-4421-a276-b075b0f31c3f","Type":"ContainerDied","Data":"afc2870a980ad288b478e5ac470edfdd3b1fd8aa6c287e4f1ad3e4aed3e85d81"}
Dec 02 07:48:52 crc kubenswrapper[4895]: I1202 07:48:52.467522 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="afc2870a980ad288b478e5ac470edfdd3b1fd8aa6c287e4f1ad3e4aed3e85d81"
Dec 02 07:48:52 crc kubenswrapper[4895]: I1202 07:48:52.467532 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e4869eb0-5e33-4837-8295-06ca17076e69","Type":"ContainerDied","Data":"2404d0d162ba97497121e295a4d0041b66d86ff11fa14a769019cf11872671c2"}
Dec 02 07:48:52 crc kubenswrapper[4895]: I1202 07:48:52.467547 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell0b7d1-account-delete-wchwk" event={"ID":"5696e7d9-103a-4bf7-9b05-1959e92cf46a","Type":"ContainerStarted","Data":"25f7933f00f9b17f29235bd1a7b5edcd4d3fbcb2a26e043877c66f025f2ac33d"}
Dec 02 07:48:52 crc kubenswrapper[4895]: I1202 07:48:52.467562 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placementa8bc-account-delete-jz5nc" event={"ID":"7067a12f-0245-45f5-a806-591d5999c7f0","Type":"ContainerStarted","Data":"ef6521d9b2bbb3f545c7a699a0a97be4fbbe9fbc46696c6a6287b9e2ee4ce0a8"}
Dec 02 07:48:52 crc kubenswrapper[4895]: I1202 07:48:52.467576 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novaapi23cb-account-delete-g8msv" event={"ID":"96831697-ba2e-477e-954f-e4ad0cf30f92","Type":"ContainerStarted","Data":"6a7ef559b071bec63c6a6f0c36f0541136299e206893522ad1b6e213a924da0a"}
Dec 02 07:48:52 crc kubenswrapper[4895]: I1202 07:48:52.467589 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-b969f4967-hmqp8" event={"ID":"5b34f139-ac6c-4a24-b478-c4563cce6a2c","Type":"ContainerDied","Data":"bcfc57872432b827c048052a8d3a082a203e58df3d47f686b28c5e697df59acc"}
Dec 02 07:48:52 crc kubenswrapper[4895]: I1202 07:48:52.467606 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"ace60b46-ed73-43ba-8d95-b81b03a6bd0a","Type":"ContainerDied","Data":"8a2f68e796838e87e45698b40183c455794d730caf5af19a07c35fd150b09fe1"}
Dec 02 07:48:52 crc kubenswrapper[4895]: I1202 07:48:52.467619 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"ace60b46-ed73-43ba-8d95-b81b03a6bd0a","Type":"ContainerDied","Data":"ffc2b962d7c7f4ed2514cc2330646801262c0beffbad1e391ff22f88fe90cf93"}
Dec 02 07:48:52 crc kubenswrapper[4895]: I1202 07:48:52.467630 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ffc2b962d7c7f4ed2514cc2330646801262c0beffbad1e391ff22f88fe90cf93"
Dec 02 07:48:52 crc kubenswrapper[4895]: I1202 07:48:52.467640 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder7d85-account-delete-j8sgc" event={"ID":"f28e5fd3-456b-4960-a3a9-1134e3eecb1f","Type":"ContainerStarted","Data":"b69fdaed99708c1282b6d6c9ebd2f76e8972907d394090d23baa40f074e3ff2c"}
Dec 02 07:48:52 crc kubenswrapper[4895]: I1202 07:48:52.467653 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"290c1303-bf41-4474-86ff-c9f5aa105cc3","Type":"ContainerDied","Data":"973ab025884cab7054f146e0f744a06e1f4e800f6c16521085496ffc96503509"}
Dec 02 07:48:52 crc kubenswrapper[4895]: I1202 07:48:52.467668 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-64685599d6-tgrm9" event={"ID":"e0c6d90b-6e06-4b01-a8d7-5761b6cb5c0f","Type":"ContainerDied","Data":"1206a45e4999e55a0b5d421860da4163e9db962fe077072543287e4b6ba17c1f"}
Dec 02 07:48:52 crc kubenswrapper[4895]: I1202 07:48:52.467694 4895 scope.go:117] "RemoveContainer" containerID="7d6a5cbf4ac42d7b9bcb1f16b7d852ad3e604b72c9dfa43a52ca193e0d0f7f4e"
Dec 02 07:48:52 crc kubenswrapper[4895]: I1202 07:48:52.469399 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f4b0ee49-bed2-4691-8160-2edbebda27b7" containerName="ceilometer-central-agent" containerID="cri-o://a18e722962390f2024c510ade1f26e4551f58f4c4c7c9b941662a44001c505ea" gracePeriod=30
Dec 02 07:48:52 crc kubenswrapper[4895]: I1202 07:48:52.469731 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f4b0ee49-bed2-4691-8160-2edbebda27b7" containerName="ceilometer-notification-agent" containerID="cri-o://9b1129b02fb76389880616b0a4f07ba64c625c630960d277f41251bdc884c35b" gracePeriod=30
Dec 02 07:48:52 crc kubenswrapper[4895]: I1202 07:48:52.469749 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f4b0ee49-bed2-4691-8160-2edbebda27b7" containerName="sg-core" containerID="cri-o://24c551cd8bbb34832b5693a91b97f7fc6d801619091d62b54a02c1b5f9bcbd45" gracePeriod=30
Dec 02 07:48:52 crc kubenswrapper[4895]: I1202 07:48:52.469920 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f4b0ee49-bed2-4691-8160-2edbebda27b7" containerName="proxy-httpd" containerID="cri-o://c51c9cadb8af000c2a708fd441d7a16102397aef9d4301d9ddb87d8386fc6024" gracePeriod=30
Dec 02 07:48:52 crc kubenswrapper[4895]: E1202 07:48:52.475211 4895 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found
Dec 02 07:48:52 crc kubenswrapper[4895]: E1202 07:48:52.475263 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0d1cb194-5325-40c2-bbd4-0a48821e12aa-config-data podName:0d1cb194-5325-40c2-bbd4-0a48821e12aa nodeName:}" failed. No retries permitted until 2025-12-02 07:49:00.47524595 +0000 UTC m=+1551.646105563 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/0d1cb194-5325-40c2-bbd4-0a48821e12aa-config-data") pod "rabbitmq-cell1-server-0" (UID: "0d1cb194-5325-40c2-bbd4-0a48821e12aa") : configmap "rabbitmq-cell1-config-data" not found
Dec 02 07:48:52 crc kubenswrapper[4895]: I1202 07:48:52.560629 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"]
Dec 02 07:48:52 crc kubenswrapper[4895]: I1202 07:48:52.583028 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="0a3ec758-e19e-4286-bfed-a1d6d3010bfb" containerName="kube-state-metrics" containerID="cri-o://7dc2853c20a38045953efd3752aa502543cbbe08dd450481c9d49ada9a7e28ab" gracePeriod=30
Dec 02 07:48:52 crc kubenswrapper[4895]: I1202 07:48:52.680965 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"]
Dec 02 07:48:52 crc kubenswrapper[4895]: I1202 07:48:52.681259 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/memcached-0" podUID="b15097a8-ac9a-4886-a839-272b662561c5" containerName="memcached" containerID="cri-o://749c0f6ea01ac411d0209d4472b7bd79cfc38bd8f584ebdd6968b35f5d12cdc7" gracePeriod=30
Dec 02 07:48:52 crc kubenswrapper[4895]: E1202 07:48:52.712594 4895 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="45d9908b6c5cd875b205c3155ba480192c4dc6d4df37c9c88146d86fdf68c7e6" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Dec 02 07:48:52 crc kubenswrapper[4895]: E1202 07:48:52.728360 4895 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="45d9908b6c5cd875b205c3155ba480192c4dc6d4df37c9c88146d86fdf68c7e6" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Dec 02 07:48:52 crc kubenswrapper[4895]: E1202 07:48:52.751924 4895 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="45d9908b6c5cd875b205c3155ba480192c4dc6d4df37c9c88146d86fdf68c7e6" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Dec 02 07:48:52 crc kubenswrapper[4895]: E1202 07:48:52.752014 4895 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell1-conductor-0" podUID="31223325-1372-4ea6-867e-f511b7dffc09" containerName="nova-cell1-conductor-conductor"
Dec 02 07:48:52 crc kubenswrapper[4895]: I1202 07:48:52.795821 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-xggw9"]
Dec 02 07:48:52 crc kubenswrapper[4895]: I1202 07:48:52.862819 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-xggw9"]
Dec 02 07:48:52 crc kubenswrapper[4895]: I1202 07:48:52.925347 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-tx72w"]
Dec 02 07:48:52 crc kubenswrapper[4895]: I1202 07:48:52.950500 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-tx72w"]
Dec 02 07:48:52 crc kubenswrapper[4895]: I1202 07:48:52.965364 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystonea3d7-account-delete-jgvsf"]
Dec 02 07:48:52 crc kubenswrapper[4895]: E1202 07:48:52.974296 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0c6d90b-6e06-4b01-a8d7-5761b6cb5c0f" containerName="barbican-keystone-listener-log"
Dec 02 07:48:52 crc kubenswrapper[4895]: I1202 07:48:52.974331 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0c6d90b-6e06-4b01-a8d7-5761b6cb5c0f" containerName="barbican-keystone-listener-log"
Dec 02 07:48:52 crc kubenswrapper[4895]: E1202 07:48:52.974353 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b34f139-ac6c-4a24-b478-c4563cce6a2c" containerName="barbican-worker-log"
Dec 02 07:48:52 crc kubenswrapper[4895]: I1202 07:48:52.974359 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b34f139-ac6c-4a24-b478-c4563cce6a2c" containerName="barbican-worker-log"
Dec 02 07:48:52 crc kubenswrapper[4895]: E1202 07:48:52.974368 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93ac7640-b11c-48f4-b537-45bebe4af01b" containerName="ovsdbserver-nb"
Dec 02 07:48:52 crc kubenswrapper[4895]: I1202 07:48:52.974374 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="93ac7640-b11c-48f4-b537-45bebe4af01b" containerName="ovsdbserver-nb"
Dec 02 07:48:52 crc kubenswrapper[4895]: E1202 07:48:52.974402 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93ac7640-b11c-48f4-b537-45bebe4af01b" containerName="openstack-network-exporter"
Dec 02 07:48:52 crc kubenswrapper[4895]: I1202 07:48:52.974408 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="93ac7640-b11c-48f4-b537-45bebe4af01b" containerName="openstack-network-exporter"
Dec 02 07:48:52 crc kubenswrapper[4895]: E1202 07:48:52.974422 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0c6d90b-6e06-4b01-a8d7-5761b6cb5c0f" containerName="barbican-keystone-listener"
Dec 02 07:48:52 crc kubenswrapper[4895]: I1202 07:48:52.974428 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0c6d90b-6e06-4b01-a8d7-5761b6cb5c0f" containerName="barbican-keystone-listener"
Dec 02 07:48:52 crc kubenswrapper[4895]: E1202 07:48:52.974436 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b34f139-ac6c-4a24-b478-c4563cce6a2c" containerName="barbican-worker"
Dec 02 07:48:52 crc kubenswrapper[4895]: I1202 07:48:52.974441 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b34f139-ac6c-4a24-b478-c4563cce6a2c" containerName="barbican-worker"
Dec 02 07:48:52 crc kubenswrapper[4895]: I1202 07:48:52.974701 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0c6d90b-6e06-4b01-a8d7-5761b6cb5c0f" containerName="barbican-keystone-listener-log"
Dec 02 07:48:52 crc kubenswrapper[4895]: I1202 07:48:52.974713 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="93ac7640-b11c-48f4-b537-45bebe4af01b" containerName="ovsdbserver-nb"
Dec 02 07:48:52 crc kubenswrapper[4895]: I1202 07:48:52.974719 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="93ac7640-b11c-48f4-b537-45bebe4af01b" containerName="openstack-network-exporter"
Dec 02 07:48:52 crc kubenswrapper[4895]: I1202 07:48:52.974732 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b34f139-ac6c-4a24-b478-c4563cce6a2c" containerName="barbican-worker"
Dec 02 07:48:52 crc kubenswrapper[4895]: I1202 07:48:52.974760 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b34f139-ac6c-4a24-b478-c4563cce6a2c" containerName="barbican-worker-log"
Dec 02 07:48:52 crc kubenswrapper[4895]: I1202 07:48:52.974781 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0c6d90b-6e06-4b01-a8d7-5761b6cb5c0f" containerName="barbican-keystone-listener"
Dec 02 07:48:52 crc kubenswrapper[4895]: I1202 07:48:52.975512 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystonea3d7-account-delete-jgvsf"
Dec 02 07:48:53 crc kubenswrapper[4895]: I1202 07:48:53.005192 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-56dbdc9bc-kgkw2"]
Dec 02 07:48:53 crc kubenswrapper[4895]: I1202 07:48:53.005493 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/keystone-56dbdc9bc-kgkw2" podUID="247c892c-e00a-474e-8022-73bd1b2249f3" containerName="keystone-api" containerID="cri-o://fe38dca9f6627e9e19b2be20b54cb47cb1aee5e491dae454c261bcbe08243752" gracePeriod=30
Dec 02 07:48:53 crc kubenswrapper[4895]: I1202 07:48:53.050380 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"]
Dec 02 07:48:53 crc kubenswrapper[4895]: I1202 07:48:53.073268 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Dec 02 07:48:53 crc kubenswrapper[4895]: I1202 07:48:53.081059 4895 generic.go:334] "Generic (PLEG): container finished" podID="d42411e0-2228-4a1a-9d31-e3788f2b1f0c" containerID="9feda8f6a8375fc053369762aade6135c47ba96ba66d70d924ce2840072589a8" exitCode=0
Dec 02 07:48:53 crc kubenswrapper[4895]: I1202 07:48:53.081201 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican4aa4-account-delete-pvmbl" event={"ID":"d42411e0-2228-4a1a-9d31-e3788f2b1f0c","Type":"ContainerDied","Data":"9feda8f6a8375fc053369762aade6135c47ba96ba66d70d924ce2840072589a8"}
Dec 02 07:48:53 crc kubenswrapper[4895]: I1202 07:48:53.093267 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystonea3d7-account-delete-jgvsf"]
Dec 02 07:48:53 crc kubenswrapper[4895]: I1202 07:48:53.101218 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8zdjw\" (UniqueName: \"kubernetes.io/projected/ffbc11e8-7e53-46db-bcd7-35b4ab5d7fb9-kube-api-access-8zdjw\") pod \"keystonea3d7-account-delete-jgvsf\" (UID: \"ffbc11e8-7e53-46db-bcd7-35b4ab5d7fb9\") " pod="openstack/keystonea3d7-account-delete-jgvsf"
Dec 02 07:48:53 crc kubenswrapper[4895]: I1202 07:48:53.101445 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ffbc11e8-7e53-46db-bcd7-35b4ab5d7fb9-operator-scripts\") pod \"keystonea3d7-account-delete-jgvsf\" (UID: \"ffbc11e8-7e53-46db-bcd7-35b4ab5d7fb9\") " pod="openstack/keystonea3d7-account-delete-jgvsf"
Dec 02 07:48:53 crc kubenswrapper[4895]: I1202 07:48:53.117952 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-hzt8n"]
Dec 02 07:48:53 crc kubenswrapper[4895]: I1202 07:48:53.137323 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystonea3d7-account-delete-jgvsf"]
Dec 02 07:48:53 crc kubenswrapper[4895]: I1202 07:48:53.138473 4895 generic.go:334] "Generic (PLEG): container finished" podID="e3d2bb1c-bd20-473e-b91a-7e2a63ec9f07" containerID="8f6e08f059d8d10b34bda28a99cf993bc10f7153af260e881771ef9437a89f77" exitCode=0
Dec 02 07:48:53 crc kubenswrapper[4895]: I1202 07:48:53.138612 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-788d454954-brr26" event={"ID":"e3d2bb1c-bd20-473e-b91a-7e2a63ec9f07","Type":"ContainerDied","Data":"8f6e08f059d8d10b34bda28a99cf993bc10f7153af260e881771ef9437a89f77"}
Dec 02 07:48:53 crc kubenswrapper[4895]: I1202 07:48:53.164907 4895 generic.go:334] "Generic (PLEG): container finished" podID="68bddf66-0b9f-4bc8-916b-aa0abfbf13c3" containerID="79507980e01b07ea773d434932da83cc407f386cc2f4f05c605e4f8341d7bef2" exitCode=0
Dec 02 07:48:53 crc kubenswrapper[4895]: I1202 07:48:53.168092 4895 generic.go:334] "Generic (PLEG): container finished" podID="f28e5fd3-456b-4960-a3a9-1134e3eecb1f" containerID="b69fdaed99708c1282b6d6c9ebd2f76e8972907d394090d23baa40f074e3ff2c" exitCode=0
Dec 02 07:48:53 crc kubenswrapper[4895]: I1202 07:48:53.184286 4895 generic.go:334] "Generic (PLEG): container finished" podID="5cae5c9e-9159-4e78-9809-1801d0e35131" containerID="886e593440f5f547e624e04c372422144b2af46990afd1b6c63c56f2dacb354f" exitCode=0
Dec 02 07:48:53 crc kubenswrapper[4895]: I1202 07:48:53.198105 4895 generic.go:334] "Generic (PLEG): container finished" podID="7067a12f-0245-45f5-a806-591d5999c7f0" containerID="ef6521d9b2bbb3f545c7a699a0a97be4fbbe9fbc46696c6a6287b9e2ee4ce0a8" exitCode=0
Dec 02 07:48:53 crc kubenswrapper[4895]: I1202 07:48:53.202426 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/183c5216-30f9-4f75-865b-7f795ea149fb-nova-novncproxy-tls-certs\") pod \"183c5216-30f9-4f75-865b-7f795ea149fb\" (UID: \"183c5216-30f9-4f75-865b-7f795ea149fb\") "
Dec 02 07:48:53 crc kubenswrapper[4895]: I1202 07:48:53.202549 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/183c5216-30f9-4f75-865b-7f795ea149fb-config-data\") pod \"183c5216-30f9-4f75-865b-7f795ea149fb\" (UID: \"183c5216-30f9-4f75-865b-7f795ea149fb\") "
Dec 02 07:48:53 crc kubenswrapper[4895]: I1202 07:48:53.202618 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/183c5216-30f9-4f75-865b-7f795ea149fb-vencrypt-tls-certs\") pod \"183c5216-30f9-4f75-865b-7f795ea149fb\" (UID: \"183c5216-30f9-4f75-865b-7f795ea149fb\") "
Dec 02 07:48:53 crc kubenswrapper[4895]: I1202 07:48:53.204386 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/183c5216-30f9-4f75-865b-7f795ea149fb-combined-ca-bundle\") pod \"183c5216-30f9-4f75-865b-7f795ea149fb\" (UID: \"183c5216-30f9-4f75-865b-7f795ea149fb\") "
Dec 02 07:48:53 crc kubenswrapper[4895]: I1202 07:48:53.204534 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j9rzn\" (UniqueName: \"kubernetes.io/projected/183c5216-30f9-4f75-865b-7f795ea149fb-kube-api-access-j9rzn\") pod \"183c5216-30f9-4f75-865b-7f795ea149fb\" (UID: \"183c5216-30f9-4f75-865b-7f795ea149fb\") "
Dec 02 07:48:53 crc kubenswrapper[4895]: I1202 07:48:53.204956 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8zdjw\" (UniqueName: \"kubernetes.io/projected/ffbc11e8-7e53-46db-bcd7-35b4ab5d7fb9-kube-api-access-8zdjw\") pod \"keystonea3d7-account-delete-jgvsf\" (UID: \"ffbc11e8-7e53-46db-bcd7-35b4ab5d7fb9\") " pod="openstack/keystonea3d7-account-delete-jgvsf"
Dec 02 07:48:53 crc kubenswrapper[4895]: I1202 07:48:53.205145 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ffbc11e8-7e53-46db-bcd7-35b4ab5d7fb9-operator-scripts\") pod \"keystonea3d7-account-delete-jgvsf\" (UID: \"ffbc11e8-7e53-46db-bcd7-35b4ab5d7fb9\") " pod="openstack/keystonea3d7-account-delete-jgvsf"
Dec 02 07:48:53 crc kubenswrapper[4895]: E1202 07:48:53.205379 4895 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found
Dec 02 07:48:53 crc kubenswrapper[4895]: E1202 07:48:53.205466 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ffbc11e8-7e53-46db-bcd7-35b4ab5d7fb9-operator-scripts podName:ffbc11e8-7e53-46db-bcd7-35b4ab5d7fb9 nodeName:}" failed. No retries permitted until 2025-12-02 07:48:53.705446905 +0000 UTC m=+1544.876306518 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/ffbc11e8-7e53-46db-bcd7-35b4ab5d7fb9-operator-scripts") pod "keystonea3d7-account-delete-jgvsf" (UID: "ffbc11e8-7e53-46db-bcd7-35b4ab5d7fb9") : configmap "openstack-scripts" not found
Dec 02 07:48:53 crc kubenswrapper[4895]: I1202 07:48:53.218022 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/183c5216-30f9-4f75-865b-7f795ea149fb-kube-api-access-j9rzn" (OuterVolumeSpecName: "kube-api-access-j9rzn") pod "183c5216-30f9-4f75-865b-7f795ea149fb" (UID: "183c5216-30f9-4f75-865b-7f795ea149fb"). InnerVolumeSpecName "kube-api-access-j9rzn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 07:48:53 crc kubenswrapper[4895]: E1202 07:48:53.221333 4895 projected.go:194] Error preparing data for projected volume kube-api-access-8zdjw for pod openstack/keystonea3d7-account-delete-jgvsf: failed to fetch token: serviceaccounts "galera-openstack" not found
Dec 02 07:48:53 crc kubenswrapper[4895]: E1202 07:48:53.221428 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ffbc11e8-7e53-46db-bcd7-35b4ab5d7fb9-kube-api-access-8zdjw podName:ffbc11e8-7e53-46db-bcd7-35b4ab5d7fb9 nodeName:}" failed. No retries permitted until 2025-12-02 07:48:53.72140569 +0000 UTC m=+1544.892265303 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-8zdjw" (UniqueName: "kubernetes.io/projected/ffbc11e8-7e53-46db-bcd7-35b4ab5d7fb9-kube-api-access-8zdjw") pod "keystonea3d7-account-delete-jgvsf" (UID: "ffbc11e8-7e53-46db-bcd7-35b4ab5d7fb9") : failed to fetch token: serviceaccounts "galera-openstack" not found
Dec 02 07:48:53 crc kubenswrapper[4895]: I1202 07:48:53.245387 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7d15bc9-7912-4eab-9c22-23630caecbb4" path="/var/lib/kubelet/pods/e7d15bc9-7912-4eab-9c22-23630caecbb4/volumes"
Dec 02 07:48:53 crc kubenswrapper[4895]: I1202 07:48:53.246117 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ecfcde89-8c82-43c8-b59b-4145640a2737" path="/var/lib/kubelet/pods/ecfcde89-8c82-43c8-b59b-4145640a2737/volumes"
Dec 02 07:48:53 crc kubenswrapper[4895]: I1202 07:48:53.300073 4895 generic.go:334] "Generic (PLEG): container finished" podID="0a3ec758-e19e-4286-bfed-a1d6d3010bfb" containerID="7dc2853c20a38045953efd3752aa502543cbbe08dd450481c9d49ada9a7e28ab" exitCode=2
Dec 02 07:48:53 crc kubenswrapper[4895]: I1202 07:48:53.305717 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/183c5216-30f9-4f75-865b-7f795ea149fb-vencrypt-tls-certs" (OuterVolumeSpecName: "vencrypt-tls-certs") pod "183c5216-30f9-4f75-865b-7f795ea149fb" (UID: "183c5216-30f9-4f75-865b-7f795ea149fb"). InnerVolumeSpecName "vencrypt-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 07:48:53 crc kubenswrapper[4895]: I1202 07:48:53.307641 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j9rzn\" (UniqueName: \"kubernetes.io/projected/183c5216-30f9-4f75-865b-7f795ea149fb-kube-api-access-j9rzn\") on node \"crc\" DevicePath \"\""
Dec 02 07:48:53 crc kubenswrapper[4895]: I1202 07:48:53.307678 4895 reconciler_common.go:293] "Volume detached for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/183c5216-30f9-4f75-865b-7f795ea149fb-vencrypt-tls-certs\") on node \"crc\" DevicePath \"\""
Dec 02 07:48:53 crc kubenswrapper[4895]: I1202 07:48:53.359657 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/183c5216-30f9-4f75-865b-7f795ea149fb-config-data" (OuterVolumeSpecName: "config-data") pod "183c5216-30f9-4f75-865b-7f795ea149fb" (UID: "183c5216-30f9-4f75-865b-7f795ea149fb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 07:48:53 crc kubenswrapper[4895]: I1202 07:48:53.372595 4895 generic.go:334] "Generic (PLEG): container finished" podID="f762a68c-cabc-4842-844a-1db6710e3ee9" containerID="e21126490e30d0f2abdaa9c6468d800825eee8d11c80c4baee6ce5e501917408" exitCode=0
Dec 02 07:48:53 crc kubenswrapper[4895]: I1202 07:48:53.396688 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/183c5216-30f9-4f75-865b-7f795ea149fb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "183c5216-30f9-4f75-865b-7f795ea149fb" (UID: "183c5216-30f9-4f75-865b-7f795ea149fb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 07:48:53 crc kubenswrapper[4895]: I1202 07:48:53.406388 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/183c5216-30f9-4f75-865b-7f795ea149fb-nova-novncproxy-tls-certs" (OuterVolumeSpecName: "nova-novncproxy-tls-certs") pod "183c5216-30f9-4f75-865b-7f795ea149fb" (UID: "183c5216-30f9-4f75-865b-7f795ea149fb"). InnerVolumeSpecName "nova-novncproxy-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 07:48:53 crc kubenswrapper[4895]: I1202 07:48:53.419219 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/183c5216-30f9-4f75-865b-7f795ea149fb-config-data\") on node \"crc\" DevicePath \"\""
Dec 02 07:48:53 crc kubenswrapper[4895]: I1202 07:48:53.419259 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/183c5216-30f9-4f75-865b-7f795ea149fb-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 02 07:48:53 crc kubenswrapper[4895]: I1202 07:48:53.419272 4895 reconciler_common.go:293] "Volume detached for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/183c5216-30f9-4f75-865b-7f795ea149fb-nova-novncproxy-tls-certs\") on node \"crc\" DevicePath \"\""
Dec 02 07:48:53 crc kubenswrapper[4895]: I1202 07:48:53.522949 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-galera-0" podUID="38385316-fca8-41b0-b0ff-570a9cd71e8a" containerName="galera" containerID="cri-o://6aae636a2f05d49cb09841be65cf88064cfd592fbc7ebcfc8e0589c5f285e704" gracePeriod=30
Dec 02 07:48:53 crc kubenswrapper[4895]: I1202 07:48:53.715405 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-a3d7-account-create-update-dmv94"]
Dec 02 07:48:53 crc kubenswrapper[4895]: I1202 07:48:53.716023 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod"
pod="openstack/placement-54957dcd96-7sx87" event={"ID":"68bddf66-0b9f-4bc8-916b-aa0abfbf13c3","Type":"ContainerDied","Data":"79507980e01b07ea773d434932da83cc407f386cc2f4f05c605e4f8341d7bef2"} Dec 02 07:48:53 crc kubenswrapper[4895]: I1202 07:48:53.716066 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-a3d7-account-create-update-dmv94"] Dec 02 07:48:53 crc kubenswrapper[4895]: I1202 07:48:53.716092 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder7d85-account-delete-j8sgc" event={"ID":"f28e5fd3-456b-4960-a3a9-1134e3eecb1f","Type":"ContainerDied","Data":"b69fdaed99708c1282b6d6c9ebd2f76e8972907d394090d23baa40f074e3ff2c"} Dec 02 07:48:53 crc kubenswrapper[4895]: I1202 07:48:53.716113 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron5a3b-account-delete-949mv" event={"ID":"5cae5c9e-9159-4e78-9809-1801d0e35131","Type":"ContainerDied","Data":"886e593440f5f547e624e04c372422144b2af46990afd1b6c63c56f2dacb354f"} Dec 02 07:48:53 crc kubenswrapper[4895]: I1202 07:48:53.716132 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-hzt8n"] Dec 02 07:48:53 crc kubenswrapper[4895]: I1202 07:48:53.716159 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placementa8bc-account-delete-jz5nc" event={"ID":"7067a12f-0245-45f5-a806-591d5999c7f0","Type":"ContainerDied","Data":"ef6521d9b2bbb3f545c7a699a0a97be4fbbe9fbc46696c6a6287b9e2ee4ce0a8"} Dec 02 07:48:53 crc kubenswrapper[4895]: I1202 07:48:53.716173 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wjx7g" event={"ID":"6ebf9714-5e6d-415c-a0aa-adab0d3e46e9","Type":"ContainerStarted","Data":"a22f4c905d0801dd52d8abeba3d1d1cda84aa90931f8f427a2eaefb256ac2937"} Dec 02 07:48:53 crc kubenswrapper[4895]: I1202 07:48:53.716189 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" 
event={"ID":"0a3ec758-e19e-4286-bfed-a1d6d3010bfb","Type":"ContainerDied","Data":"7dc2853c20a38045953efd3752aa502543cbbe08dd450481c9d49ada9a7e28ab"} Dec 02 07:48:53 crc kubenswrapper[4895]: I1202 07:48:53.716203 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e4869eb0-5e33-4837-8295-06ca17076e69","Type":"ContainerDied","Data":"9238f65472050ea35e994123a598f3005e48f17205c9891944f602d0eee17fa9"} Dec 02 07:48:53 crc kubenswrapper[4895]: I1202 07:48:53.720663 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9238f65472050ea35e994123a598f3005e48f17205c9891944f602d0eee17fa9" Dec 02 07:48:53 crc kubenswrapper[4895]: I1202 07:48:53.720680 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f762a68c-cabc-4842-844a-1db6710e3ee9","Type":"ContainerDied","Data":"e21126490e30d0f2abdaa9c6468d800825eee8d11c80c4baee6ce5e501917408"} Dec 02 07:48:53 crc kubenswrapper[4895]: I1202 07:48:53.732591 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8zdjw\" (UniqueName: \"kubernetes.io/projected/ffbc11e8-7e53-46db-bcd7-35b4ab5d7fb9-kube-api-access-8zdjw\") pod \"keystonea3d7-account-delete-jgvsf\" (UID: \"ffbc11e8-7e53-46db-bcd7-35b4ab5d7fb9\") " pod="openstack/keystonea3d7-account-delete-jgvsf" Dec 02 07:48:53 crc kubenswrapper[4895]: I1202 07:48:53.732768 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ffbc11e8-7e53-46db-bcd7-35b4ab5d7fb9-operator-scripts\") pod \"keystonea3d7-account-delete-jgvsf\" (UID: \"ffbc11e8-7e53-46db-bcd7-35b4ab5d7fb9\") " pod="openstack/keystonea3d7-account-delete-jgvsf" Dec 02 07:48:53 crc kubenswrapper[4895]: E1202 07:48:53.732938 4895 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 02 07:48:53 crc 
kubenswrapper[4895]: E1202 07:48:53.732996 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ffbc11e8-7e53-46db-bcd7-35b4ab5d7fb9-operator-scripts podName:ffbc11e8-7e53-46db-bcd7-35b4ab5d7fb9 nodeName:}" failed. No retries permitted until 2025-12-02 07:48:54.732978013 +0000 UTC m=+1545.903837626 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/ffbc11e8-7e53-46db-bcd7-35b4ab5d7fb9-operator-scripts") pod "keystonea3d7-account-delete-jgvsf" (UID: "ffbc11e8-7e53-46db-bcd7-35b4ab5d7fb9") : configmap "openstack-scripts" not found Dec 02 07:48:53 crc kubenswrapper[4895]: E1202 07:48:53.733591 4895 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Dec 02 07:48:53 crc kubenswrapper[4895]: E1202 07:48:53.733616 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ca98cba7-4127-4d25-a139-1a42224331f2-config-data podName:ca98cba7-4127-4d25-a139-1a42224331f2 nodeName:}" failed. No retries permitted until 2025-12-02 07:49:01.733608193 +0000 UTC m=+1552.904467806 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/ca98cba7-4127-4d25-a139-1a42224331f2-config-data") pod "rabbitmq-server-0" (UID: "ca98cba7-4127-4d25-a139-1a42224331f2") : configmap "rabbitmq-config-data" not found Dec 02 07:48:53 crc kubenswrapper[4895]: I1202 07:48:53.773452 4895 scope.go:117] "RemoveContainer" containerID="68d4a2538c6c04477ff11aefd007fcd9450afdb38ccbda6a64db4e5f865071b1" Dec 02 07:48:53 crc kubenswrapper[4895]: E1202 07:48:53.773851 4895 projected.go:194] Error preparing data for projected volume kube-api-access-8zdjw for pod openstack/keystonea3d7-account-delete-jgvsf: failed to fetch token: serviceaccounts "galera-openstack" not found Dec 02 07:48:53 crc kubenswrapper[4895]: E1202 07:48:53.773923 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ffbc11e8-7e53-46db-bcd7-35b4ab5d7fb9-kube-api-access-8zdjw podName:ffbc11e8-7e53-46db-bcd7-35b4ab5d7fb9 nodeName:}" failed. No retries permitted until 2025-12-02 07:48:54.77390286 +0000 UTC m=+1545.944762473 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-8zdjw" (UniqueName: "kubernetes.io/projected/ffbc11e8-7e53-46db-bcd7-35b4ab5d7fb9-kube-api-access-8zdjw") pod "keystonea3d7-account-delete-jgvsf" (UID: "ffbc11e8-7e53-46db-bcd7-35b4ab5d7fb9") : failed to fetch token: serviceaccounts "galera-openstack" not found Dec 02 07:48:53 crc kubenswrapper[4895]: I1202 07:48:53.945128 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 02 07:48:53 crc kubenswrapper[4895]: I1202 07:48:53.946288 4895 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/memcached-0" podUID="b15097a8-ac9a-4886-a839-272b662561c5" containerName="memcached" probeResult="failure" output="dial tcp 10.217.0.104:11211: connect: connection refused" Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.042503 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ace60b46-ed73-43ba-8d95-b81b03a6bd0a\" (UID: \"ace60b46-ed73-43ba-8d95-b81b03a6bd0a\") " Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.042556 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/ace60b46-ed73-43ba-8d95-b81b03a6bd0a-config-data-generated\") pod \"ace60b46-ed73-43ba-8d95-b81b03a6bd0a\" (UID: \"ace60b46-ed73-43ba-8d95-b81b03a6bd0a\") " Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.042586 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ace60b46-ed73-43ba-8d95-b81b03a6bd0a-operator-scripts\") pod \"ace60b46-ed73-43ba-8d95-b81b03a6bd0a\" (UID: \"ace60b46-ed73-43ba-8d95-b81b03a6bd0a\") " Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.042612 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/ace60b46-ed73-43ba-8d95-b81b03a6bd0a-galera-tls-certs\") pod \"ace60b46-ed73-43ba-8d95-b81b03a6bd0a\" (UID: \"ace60b46-ed73-43ba-8d95-b81b03a6bd0a\") " Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.042724 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: 
\"kubernetes.io/configmap/ace60b46-ed73-43ba-8d95-b81b03a6bd0a-config-data-default\") pod \"ace60b46-ed73-43ba-8d95-b81b03a6bd0a\" (UID: \"ace60b46-ed73-43ba-8d95-b81b03a6bd0a\") " Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.042798 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ace60b46-ed73-43ba-8d95-b81b03a6bd0a-combined-ca-bundle\") pod \"ace60b46-ed73-43ba-8d95-b81b03a6bd0a\" (UID: \"ace60b46-ed73-43ba-8d95-b81b03a6bd0a\") " Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.042899 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ace60b46-ed73-43ba-8d95-b81b03a6bd0a-kolla-config\") pod \"ace60b46-ed73-43ba-8d95-b81b03a6bd0a\" (UID: \"ace60b46-ed73-43ba-8d95-b81b03a6bd0a\") " Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.042936 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6q684\" (UniqueName: \"kubernetes.io/projected/ace60b46-ed73-43ba-8d95-b81b03a6bd0a-kube-api-access-6q684\") pod \"ace60b46-ed73-43ba-8d95-b81b03a6bd0a\" (UID: \"ace60b46-ed73-43ba-8d95-b81b03a6bd0a\") " Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.045319 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ace60b46-ed73-43ba-8d95-b81b03a6bd0a-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "ace60b46-ed73-43ba-8d95-b81b03a6bd0a" (UID: "ace60b46-ed73-43ba-8d95-b81b03a6bd0a"). InnerVolumeSpecName "config-data-generated". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.046138 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ace60b46-ed73-43ba-8d95-b81b03a6bd0a-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "ace60b46-ed73-43ba-8d95-b81b03a6bd0a" (UID: "ace60b46-ed73-43ba-8d95-b81b03a6bd0a"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.048502 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ace60b46-ed73-43ba-8d95-b81b03a6bd0a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ace60b46-ed73-43ba-8d95-b81b03a6bd0a" (UID: "ace60b46-ed73-43ba-8d95-b81b03a6bd0a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.049718 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ace60b46-ed73-43ba-8d95-b81b03a6bd0a-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "ace60b46-ed73-43ba-8d95-b81b03a6bd0a" (UID: "ace60b46-ed73-43ba-8d95-b81b03a6bd0a"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.066669 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ace60b46-ed73-43ba-8d95-b81b03a6bd0a-kube-api-access-6q684" (OuterVolumeSpecName: "kube-api-access-6q684") pod "ace60b46-ed73-43ba-8d95-b81b03a6bd0a" (UID: "ace60b46-ed73-43ba-8d95-b81b03a6bd0a"). InnerVolumeSpecName "kube-api-access-6q684". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.072040 4895 scope.go:117] "RemoveContainer" containerID="87a341d01cbe5679c7f66108701ad133b21f9226861ceb315e694aa0b420673a" Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.110798 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-b969f4967-hmqp8"] Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.131834 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-6f6974886f-mmsbz" Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.136634 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-worker-b969f4967-hmqp8"] Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.140243 4895 scope.go:117] "RemoveContainer" containerID="1ea05e687809a1075b370d099e40ef305622b4839ae26a0439d53df787025e36" Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.140536 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.144576 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.147055 4895 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ace60b46-ed73-43ba-8d95-b81b03a6bd0a-kolla-config\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.147086 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6q684\" (UniqueName: \"kubernetes.io/projected/ace60b46-ed73-43ba-8d95-b81b03a6bd0a-kube-api-access-6q684\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.147099 4895 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/ace60b46-ed73-43ba-8d95-b81b03a6bd0a-config-data-generated\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.147110 4895 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ace60b46-ed73-43ba-8d95-b81b03a6bd0a-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.147121 4895 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/ace60b46-ed73-43ba-8d95-b81b03a6bd0a-config-data-default\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.161372 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "mysql-db") pod "ace60b46-ed73-43ba-8d95-b81b03a6bd0a" (UID: "ace60b46-ed73-43ba-8d95-b81b03a6bd0a"). InnerVolumeSpecName "local-storage05-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 02 07:48:54 crc kubenswrapper[4895]: E1202 07:48:54.161524 4895 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e4a7fe9750c9bc6c97a65a057cac01332e8866edaece81d177811b186bff46cd" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 02 07:48:54 crc kubenswrapper[4895]: E1202 07:48:54.167140 4895 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e4a7fe9750c9bc6c97a65a057cac01332e8866edaece81d177811b186bff46cd" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 02 07:48:54 crc kubenswrapper[4895]: E1202 07:48:54.168527 4895 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e4a7fe9750c9bc6c97a65a057cac01332e8866edaece81d177811b186bff46cd" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 02 07:48:54 crc kubenswrapper[4895]: E1202 07:48:54.168568 4895 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="65a02963-abb5-4f29-aa82-88ba6f859a00" containerName="nova-cell0-conductor-conductor" Dec 02 07:48:54 crc kubenswrapper[4895]: E1202 07:48:54.209720 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-8zdjw operator-scripts], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/keystonea3d7-account-delete-jgvsf" podUID="ffbc11e8-7e53-46db-bcd7-35b4ab5d7fb9" Dec 02 07:48:54 crc 
kubenswrapper[4895]: I1202 07:48:54.222680 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.243246 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-788d454954-brr26" Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.254432 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"e4869eb0-5e33-4837-8295-06ca17076e69\" (UID: \"e4869eb0-5e33-4837-8295-06ca17076e69\") " Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.254508 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xg4lr\" (UniqueName: \"kubernetes.io/projected/e4869eb0-5e33-4837-8295-06ca17076e69-kube-api-access-xg4lr\") pod \"e4869eb0-5e33-4837-8295-06ca17076e69\" (UID: \"e4869eb0-5e33-4837-8295-06ca17076e69\") " Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.254559 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4869eb0-5e33-4837-8295-06ca17076e69-logs\") pod \"e4869eb0-5e33-4837-8295-06ca17076e69\" (UID: \"e4869eb0-5e33-4837-8295-06ca17076e69\") " Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.254605 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebbed0ba-1d44-4421-a276-b075b0f31c3f-combined-ca-bundle\") pod \"ebbed0ba-1d44-4421-a276-b075b0f31c3f\" (UID: \"ebbed0ba-1d44-4421-a276-b075b0f31c3f\") " Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.254634 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85e9e481-0762-42a8-a25a-7d50500f1236-config-data\") pod 
\"85e9e481-0762-42a8-a25a-7d50500f1236\" (UID: \"85e9e481-0762-42a8-a25a-7d50500f1236\") " Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.254688 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4869eb0-5e33-4837-8295-06ca17076e69-scripts\") pod \"e4869eb0-5e33-4837-8295-06ca17076e69\" (UID: \"e4869eb0-5e33-4837-8295-06ca17076e69\") " Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.254719 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ebbed0ba-1d44-4421-a276-b075b0f31c3f-etc-machine-id\") pod \"ebbed0ba-1d44-4421-a276-b075b0f31c3f\" (UID: \"ebbed0ba-1d44-4421-a276-b075b0f31c3f\") " Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.254806 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/85e9e481-0762-42a8-a25a-7d50500f1236-run-httpd\") pod \"85e9e481-0762-42a8-a25a-7d50500f1236\" (UID: \"85e9e481-0762-42a8-a25a-7d50500f1236\") " Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.254835 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7xnrz\" (UniqueName: \"kubernetes.io/projected/85e9e481-0762-42a8-a25a-7d50500f1236-kube-api-access-7xnrz\") pod \"85e9e481-0762-42a8-a25a-7d50500f1236\" (UID: \"85e9e481-0762-42a8-a25a-7d50500f1236\") " Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.254868 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebbed0ba-1d44-4421-a276-b075b0f31c3f-internal-tls-certs\") pod \"ebbed0ba-1d44-4421-a276-b075b0f31c3f\" (UID: \"ebbed0ba-1d44-4421-a276-b075b0f31c3f\") " Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.254888 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/ebbed0ba-1d44-4421-a276-b075b0f31c3f-config-data\") pod \"ebbed0ba-1d44-4421-a276-b075b0f31c3f\" (UID: \"ebbed0ba-1d44-4421-a276-b075b0f31c3f\") " Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.254918 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ebbed0ba-1d44-4421-a276-b075b0f31c3f-logs\") pod \"ebbed0ba-1d44-4421-a276-b075b0f31c3f\" (UID: \"ebbed0ba-1d44-4421-a276-b075b0f31c3f\") " Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.255010 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ebbed0ba-1d44-4421-a276-b075b0f31c3f-scripts\") pod \"ebbed0ba-1d44-4421-a276-b075b0f31c3f\" (UID: \"ebbed0ba-1d44-4421-a276-b075b0f31c3f\") " Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.255037 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/85e9e481-0762-42a8-a25a-7d50500f1236-public-tls-certs\") pod \"85e9e481-0762-42a8-a25a-7d50500f1236\" (UID: \"85e9e481-0762-42a8-a25a-7d50500f1236\") " Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.255067 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/85e9e481-0762-42a8-a25a-7d50500f1236-internal-tls-certs\") pod \"85e9e481-0762-42a8-a25a-7d50500f1236\" (UID: \"85e9e481-0762-42a8-a25a-7d50500f1236\") " Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.255122 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ebbed0ba-1d44-4421-a276-b075b0f31c3f-config-data-custom\") pod \"ebbed0ba-1d44-4421-a276-b075b0f31c3f\" (UID: \"ebbed0ba-1d44-4421-a276-b075b0f31c3f\") " Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 
07:48:54.255148 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qwtlc\" (UniqueName: \"kubernetes.io/projected/ebbed0ba-1d44-4421-a276-b075b0f31c3f-kube-api-access-qwtlc\") pod \"ebbed0ba-1d44-4421-a276-b075b0f31c3f\" (UID: \"ebbed0ba-1d44-4421-a276-b075b0f31c3f\") " Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.255183 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4869eb0-5e33-4837-8295-06ca17076e69-config-data\") pod \"e4869eb0-5e33-4837-8295-06ca17076e69\" (UID: \"e4869eb0-5e33-4837-8295-06ca17076e69\") " Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.255205 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4869eb0-5e33-4837-8295-06ca17076e69-public-tls-certs\") pod \"e4869eb0-5e33-4837-8295-06ca17076e69\" (UID: \"e4869eb0-5e33-4837-8295-06ca17076e69\") " Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.255253 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4869eb0-5e33-4837-8295-06ca17076e69-combined-ca-bundle\") pod \"e4869eb0-5e33-4837-8295-06ca17076e69\" (UID: \"e4869eb0-5e33-4837-8295-06ca17076e69\") " Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.255286 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e4869eb0-5e33-4837-8295-06ca17076e69-httpd-run\") pod \"e4869eb0-5e33-4837-8295-06ca17076e69\" (UID: \"e4869eb0-5e33-4837-8295-06ca17076e69\") " Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.255308 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/85e9e481-0762-42a8-a25a-7d50500f1236-etc-swift\") pod 
\"85e9e481-0762-42a8-a25a-7d50500f1236\" (UID: \"85e9e481-0762-42a8-a25a-7d50500f1236\") " Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.255332 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/85e9e481-0762-42a8-a25a-7d50500f1236-log-httpd\") pod \"85e9e481-0762-42a8-a25a-7d50500f1236\" (UID: \"85e9e481-0762-42a8-a25a-7d50500f1236\") " Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.255390 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85e9e481-0762-42a8-a25a-7d50500f1236-combined-ca-bundle\") pod \"85e9e481-0762-42a8-a25a-7d50500f1236\" (UID: \"85e9e481-0762-42a8-a25a-7d50500f1236\") " Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.255434 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebbed0ba-1d44-4421-a276-b075b0f31c3f-public-tls-certs\") pod \"ebbed0ba-1d44-4421-a276-b075b0f31c3f\" (UID: \"ebbed0ba-1d44-4421-a276-b075b0f31c3f\") " Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.256138 4895 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.262577 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance5583-account-delete-xm6hg" Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.262793 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ebbed0ba-1d44-4421-a276-b075b0f31c3f-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "ebbed0ba-1d44-4421-a276-b075b0f31c3f" (UID: "ebbed0ba-1d44-4421-a276-b075b0f31c3f"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.265203 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85e9e481-0762-42a8-a25a-7d50500f1236-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "85e9e481-0762-42a8-a25a-7d50500f1236" (UID: "85e9e481-0762-42a8-a25a-7d50500f1236"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.281237 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85e9e481-0762-42a8-a25a-7d50500f1236-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "85e9e481-0762-42a8-a25a-7d50500f1236" (UID: "85e9e481-0762-42a8-a25a-7d50500f1236"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.292724 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4869eb0-5e33-4837-8295-06ca17076e69-logs" (OuterVolumeSpecName: "logs") pod "e4869eb0-5e33-4837-8295-06ca17076e69" (UID: "e4869eb0-5e33-4837-8295-06ca17076e69"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.295484 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ebbed0ba-1d44-4421-a276-b075b0f31c3f-logs" (OuterVolumeSpecName: "logs") pod "ebbed0ba-1d44-4421-a276-b075b0f31c3f" (UID: "ebbed0ba-1d44-4421-a276-b075b0f31c3f"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.296900 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ebbed0ba-1d44-4421-a276-b075b0f31c3f-kube-api-access-qwtlc" (OuterVolumeSpecName: "kube-api-access-qwtlc") pod "ebbed0ba-1d44-4421-a276-b075b0f31c3f" (UID: "ebbed0ba-1d44-4421-a276-b075b0f31c3f"). InnerVolumeSpecName "kube-api-access-qwtlc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.298449 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4869eb0-5e33-4837-8295-06ca17076e69-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "e4869eb0-5e33-4837-8295-06ca17076e69" (UID: "e4869eb0-5e33-4837-8295-06ca17076e69"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.303132 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "e4869eb0-5e33-4837-8295-06ca17076e69" (UID: "e4869eb0-5e33-4837-8295-06ca17076e69"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.320005 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebbed0ba-1d44-4421-a276-b075b0f31c3f-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "ebbed0ba-1d44-4421-a276-b075b0f31c3f" (UID: "ebbed0ba-1d44-4421-a276-b075b0f31c3f"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.345797 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebbed0ba-1d44-4421-a276-b075b0f31c3f-scripts" (OuterVolumeSpecName: "scripts") pod "ebbed0ba-1d44-4421-a276-b075b0f31c3f" (UID: "ebbed0ba-1d44-4421-a276-b075b0f31c3f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.346024 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4869eb0-5e33-4837-8295-06ca17076e69-kube-api-access-xg4lr" (OuterVolumeSpecName: "kube-api-access-xg4lr") pod "e4869eb0-5e33-4837-8295-06ca17076e69" (UID: "e4869eb0-5e33-4837-8295-06ca17076e69"). InnerVolumeSpecName "kube-api-access-xg4lr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.348062 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4869eb0-5e33-4837-8295-06ca17076e69-scripts" (OuterVolumeSpecName: "scripts") pod "e4869eb0-5e33-4837-8295-06ca17076e69" (UID: "e4869eb0-5e33-4837-8295-06ca17076e69"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.367376 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hjnqw\" (UniqueName: \"kubernetes.io/projected/6b3c2445-8bce-4d09-ad86-02c1ba6495fb-kube-api-access-hjnqw\") pod \"6b3c2445-8bce-4d09-ad86-02c1ba6495fb\" (UID: \"6b3c2445-8bce-4d09-ad86-02c1ba6495fb\") " Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.367578 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e3d2bb1c-bd20-473e-b91a-7e2a63ec9f07-config-data-custom\") pod \"e3d2bb1c-bd20-473e-b91a-7e2a63ec9f07\" (UID: \"e3d2bb1c-bd20-473e-b91a-7e2a63ec9f07\") " Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.367652 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ssqkl\" (UniqueName: \"kubernetes.io/projected/290c1303-bf41-4474-86ff-c9f5aa105cc3-kube-api-access-ssqkl\") pod \"290c1303-bf41-4474-86ff-c9f5aa105cc3\" (UID: \"290c1303-bf41-4474-86ff-c9f5aa105cc3\") " Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.368220 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/290c1303-bf41-4474-86ff-c9f5aa105cc3-scripts\") pod \"290c1303-bf41-4474-86ff-c9f5aa105cc3\" (UID: \"290c1303-bf41-4474-86ff-c9f5aa105cc3\") " Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.368307 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3d2bb1c-bd20-473e-b91a-7e2a63ec9f07-config-data\") pod \"e3d2bb1c-bd20-473e-b91a-7e2a63ec9f07\" (UID: \"e3d2bb1c-bd20-473e-b91a-7e2a63ec9f07\") " Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.368556 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/e3d2bb1c-bd20-473e-b91a-7e2a63ec9f07-combined-ca-bundle\") pod \"e3d2bb1c-bd20-473e-b91a-7e2a63ec9f07\" (UID: \"e3d2bb1c-bd20-473e-b91a-7e2a63ec9f07\") " Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.368624 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/290c1303-bf41-4474-86ff-c9f5aa105cc3-logs\") pod \"290c1303-bf41-4474-86ff-c9f5aa105cc3\" (UID: \"290c1303-bf41-4474-86ff-c9f5aa105cc3\") " Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.368661 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pxm6c\" (UniqueName: \"kubernetes.io/projected/e3d2bb1c-bd20-473e-b91a-7e2a63ec9f07-kube-api-access-pxm6c\") pod \"e3d2bb1c-bd20-473e-b91a-7e2a63ec9f07\" (UID: \"e3d2bb1c-bd20-473e-b91a-7e2a63ec9f07\") " Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.368815 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3d2bb1c-bd20-473e-b91a-7e2a63ec9f07-public-tls-certs\") pod \"e3d2bb1c-bd20-473e-b91a-7e2a63ec9f07\" (UID: \"e3d2bb1c-bd20-473e-b91a-7e2a63ec9f07\") " Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.368853 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/290c1303-bf41-4474-86ff-c9f5aa105cc3-combined-ca-bundle\") pod \"290c1303-bf41-4474-86ff-c9f5aa105cc3\" (UID: \"290c1303-bf41-4474-86ff-c9f5aa105cc3\") " Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.368897 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3d2bb1c-bd20-473e-b91a-7e2a63ec9f07-logs\") pod \"e3d2bb1c-bd20-473e-b91a-7e2a63ec9f07\" (UID: \"e3d2bb1c-bd20-473e-b91a-7e2a63ec9f07\") " Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 
07:48:54.369253 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/290c1303-bf41-4474-86ff-c9f5aa105cc3-internal-tls-certs\") pod \"290c1303-bf41-4474-86ff-c9f5aa105cc3\" (UID: \"290c1303-bf41-4474-86ff-c9f5aa105cc3\") " Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.369981 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/290c1303-bf41-4474-86ff-c9f5aa105cc3-config-data\") pod \"290c1303-bf41-4474-86ff-c9f5aa105cc3\" (UID: \"290c1303-bf41-4474-86ff-c9f5aa105cc3\") " Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.370027 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6b3c2445-8bce-4d09-ad86-02c1ba6495fb-operator-scripts\") pod \"6b3c2445-8bce-4d09-ad86-02c1ba6495fb\" (UID: \"6b3c2445-8bce-4d09-ad86-02c1ba6495fb\") " Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.370092 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"290c1303-bf41-4474-86ff-c9f5aa105cc3\" (UID: \"290c1303-bf41-4474-86ff-c9f5aa105cc3\") " Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.370415 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3d2bb1c-bd20-473e-b91a-7e2a63ec9f07-internal-tls-certs\") pod \"e3d2bb1c-bd20-473e-b91a-7e2a63ec9f07\" (UID: \"e3d2bb1c-bd20-473e-b91a-7e2a63ec9f07\") " Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.370514 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/290c1303-bf41-4474-86ff-c9f5aa105cc3-httpd-run\") pod \"290c1303-bf41-4474-86ff-c9f5aa105cc3\" (UID: 
\"290c1303-bf41-4474-86ff-c9f5aa105cc3\") " Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.372393 4895 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ebbed0ba-1d44-4421-a276-b075b0f31c3f-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.372453 4895 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ebbed0ba-1d44-4421-a276-b075b0f31c3f-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.372471 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qwtlc\" (UniqueName: \"kubernetes.io/projected/ebbed0ba-1d44-4421-a276-b075b0f31c3f-kube-api-access-qwtlc\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.372485 4895 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e4869eb0-5e33-4837-8295-06ca17076e69-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.372499 4895 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/85e9e481-0762-42a8-a25a-7d50500f1236-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.372533 4895 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.372546 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xg4lr\" (UniqueName: \"kubernetes.io/projected/e4869eb0-5e33-4837-8295-06ca17076e69-kube-api-access-xg4lr\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.372558 4895 reconciler_common.go:293] 
"Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4869eb0-5e33-4837-8295-06ca17076e69-logs\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.372568 4895 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4869eb0-5e33-4837-8295-06ca17076e69-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.372578 4895 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ebbed0ba-1d44-4421-a276-b075b0f31c3f-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.372590 4895 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/85e9e481-0762-42a8-a25a-7d50500f1236-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.372600 4895 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ebbed0ba-1d44-4421-a276-b075b0f31c3f-logs\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.374752 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/290c1303-bf41-4474-86ff-c9f5aa105cc3-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "290c1303-bf41-4474-86ff-c9f5aa105cc3" (UID: "290c1303-bf41-4474-86ff-c9f5aa105cc3"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.378802 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b3c2445-8bce-4d09-ad86-02c1ba6495fb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6b3c2445-8bce-4d09-ad86-02c1ba6495fb" (UID: "6b3c2445-8bce-4d09-ad86-02c1ba6495fb"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.380801 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/290c1303-bf41-4474-86ff-c9f5aa105cc3-logs" (OuterVolumeSpecName: "logs") pod "290c1303-bf41-4474-86ff-c9f5aa105cc3" (UID: "290c1303-bf41-4474-86ff-c9f5aa105cc3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.381079 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3d2bb1c-bd20-473e-b91a-7e2a63ec9f07-logs" (OuterVolumeSpecName: "logs") pod "e3d2bb1c-bd20-473e-b91a-7e2a63ec9f07" (UID: "e3d2bb1c-bd20-473e-b91a-7e2a63ec9f07"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.389051 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3d2bb1c-bd20-473e-b91a-7e2a63ec9f07-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "e3d2bb1c-bd20-473e-b91a-7e2a63ec9f07" (UID: "e3d2bb1c-bd20-473e-b91a-7e2a63ec9f07"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.394192 4895 generic.go:334] "Generic (PLEG): container finished" podID="d08915b6-6f79-40e4-8c26-d9f82606b4cc" containerID="cb9866d7f2171a1626ecf3c4140a850dff5554a37f5e78b53e02cd154e5fe2d5" exitCode=0 Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.394276 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d08915b6-6f79-40e4-8c26-d9f82606b4cc","Type":"ContainerDied","Data":"cb9866d7f2171a1626ecf3c4140a850dff5554a37f5e78b53e02cd154e5fe2d5"} Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.394465 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85e9e481-0762-42a8-a25a-7d50500f1236-kube-api-access-7xnrz" (OuterVolumeSpecName: "kube-api-access-7xnrz") pod "85e9e481-0762-42a8-a25a-7d50500f1236" (UID: "85e9e481-0762-42a8-a25a-7d50500f1236"). InnerVolumeSpecName "kube-api-access-7xnrz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.396113 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-54957dcd96-7sx87" event={"ID":"68bddf66-0b9f-4bc8-916b-aa0abfbf13c3","Type":"ContainerDied","Data":"1cd723803b0bb7df564099d3f9f177aaf9565eb4053b96a52a29a416703f1444"} Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.396138 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1cd723803b0bb7df564099d3f9f177aaf9565eb4053b96a52a29a416703f1444" Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.397254 4895 generic.go:334] "Generic (PLEG): container finished" podID="b15097a8-ac9a-4886-a839-272b662561c5" containerID="749c0f6ea01ac411d0209d4472b7bd79cfc38bd8f584ebdd6968b35f5d12cdc7" exitCode=0 Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.397292 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"b15097a8-ac9a-4886-a839-272b662561c5","Type":"ContainerDied","Data":"749c0f6ea01ac411d0209d4472b7bd79cfc38bd8f584ebdd6968b35f5d12cdc7"} Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.398676 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placementa8bc-account-delete-jz5nc" event={"ID":"7067a12f-0245-45f5-a806-591d5999c7f0","Type":"ContainerDied","Data":"fb26a44230d333a13630d130d653b763d7dcb89f113e8876425cd193ebb107d4"} Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.398726 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fb26a44230d333a13630d130d653b763d7dcb89f113e8876425cd193ebb107d4" Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.401759 4895 generic.go:334] "Generic (PLEG): container finished" podID="f4b0ee49-bed2-4691-8160-2edbebda27b7" containerID="c51c9cadb8af000c2a708fd441d7a16102397aef9d4301d9ddb87d8386fc6024" exitCode=0 Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.401784 4895 
generic.go:334] "Generic (PLEG): container finished" podID="f4b0ee49-bed2-4691-8160-2edbebda27b7" containerID="24c551cd8bbb34832b5693a91b97f7fc6d801619091d62b54a02c1b5f9bcbd45" exitCode=2 Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.401792 4895 generic.go:334] "Generic (PLEG): container finished" podID="f4b0ee49-bed2-4691-8160-2edbebda27b7" containerID="a18e722962390f2024c510ade1f26e4551f58f4c4c7c9b941662a44001c505ea" exitCode=0 Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.401821 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f4b0ee49-bed2-4691-8160-2edbebda27b7","Type":"ContainerDied","Data":"c51c9cadb8af000c2a708fd441d7a16102397aef9d4301d9ddb87d8386fc6024"} Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.401838 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f4b0ee49-bed2-4691-8160-2edbebda27b7","Type":"ContainerDied","Data":"24c551cd8bbb34832b5693a91b97f7fc6d801619091d62b54a02c1b5f9bcbd45"} Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.401848 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f4b0ee49-bed2-4691-8160-2edbebda27b7","Type":"ContainerDied","Data":"a18e722962390f2024c510ade1f26e4551f58f4c4c7c9b941662a44001c505ea"} Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.407157 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "290c1303-bf41-4474-86ff-c9f5aa105cc3" (UID: "290c1303-bf41-4474-86ff-c9f5aa105cc3"). InnerVolumeSpecName "local-storage09-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.407649 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ace60b46-ed73-43ba-8d95-b81b03a6bd0a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ace60b46-ed73-43ba-8d95-b81b03a6bd0a" (UID: "ace60b46-ed73-43ba-8d95-b81b03a6bd0a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.418960 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"290c1303-bf41-4474-86ff-c9f5aa105cc3","Type":"ContainerDied","Data":"af259c450ee7d7673b0fbe89cc10ca606d9ab65f5a9afd56072b19e32ed4be8c"} Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.419046 4895 scope.go:117] "RemoveContainer" containerID="973ab025884cab7054f146e0f744a06e1f4e800f6c16521085496ffc96503509" Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.419153 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b3c2445-8bce-4d09-ad86-02c1ba6495fb-kube-api-access-hjnqw" (OuterVolumeSpecName: "kube-api-access-hjnqw") pod "6b3c2445-8bce-4d09-ad86-02c1ba6495fb" (UID: "6b3c2445-8bce-4d09-ad86-02c1ba6495fb"). InnerVolumeSpecName "kube-api-access-hjnqw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.419297 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.420936 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/290c1303-bf41-4474-86ff-c9f5aa105cc3-scripts" (OuterVolumeSpecName: "scripts") pod "290c1303-bf41-4474-86ff-c9f5aa105cc3" (UID: "290c1303-bf41-4474-86ff-c9f5aa105cc3"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.430381 4895 generic.go:334] "Generic (PLEG): container finished" podID="0d1cb194-5325-40c2-bbd4-0a48821e12aa" containerID="825f000e90e467b37377e382a45ce9ec58ad6ced7e5a761f9a5ac0cc1b0ded3d" exitCode=0 Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.430465 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"0d1cb194-5325-40c2-bbd4-0a48821e12aa","Type":"ContainerDied","Data":"825f000e90e467b37377e382a45ce9ec58ad6ced7e5a761f9a5ac0cc1b0ded3d"} Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.434259 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/290c1303-bf41-4474-86ff-c9f5aa105cc3-kube-api-access-ssqkl" (OuterVolumeSpecName: "kube-api-access-ssqkl") pod "290c1303-bf41-4474-86ff-c9f5aa105cc3" (UID: "290c1303-bf41-4474-86ff-c9f5aa105cc3"). InnerVolumeSpecName "kube-api-access-ssqkl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.439372 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance5583-account-delete-xm6hg" Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.439569 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance5583-account-delete-xm6hg" event={"ID":"6b3c2445-8bce-4d09-ad86-02c1ba6495fb","Type":"ContainerDied","Data":"446dfc1cb253a69c411beac7da8e5a22f44d1cea25cb336c6af447ebccd97a50"} Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.439631 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="446dfc1cb253a69c411beac7da8e5a22f44d1cea25cb336c6af447ebccd97a50" Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.455990 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3d2bb1c-bd20-473e-b91a-7e2a63ec9f07-kube-api-access-pxm6c" (OuterVolumeSpecName: "kube-api-access-pxm6c") pod "e3d2bb1c-bd20-473e-b91a-7e2a63ec9f07" (UID: "e3d2bb1c-bd20-473e-b91a-7e2a63ec9f07"). InnerVolumeSpecName "kube-api-access-pxm6c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.456174 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2d2c6f66-9fe8-4d15-92f6-f29493e6cd7b","Type":"ContainerDied","Data":"ca9480eb873cb42c7438f27a0592e23bd26270b5d0d893aa3b2e61758d9f0968"} Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.456223 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ca9480eb873cb42c7438f27a0592e23bd26270b5d0d893aa3b2e61758d9f0968" Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.458331 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85e9e481-0762-42a8-a25a-7d50500f1236-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "85e9e481-0762-42a8-a25a-7d50500f1236" (UID: "85e9e481-0762-42a8-a25a-7d50500f1236"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.466817 4895 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.473668 4895 generic.go:334] "Generic (PLEG): container finished" podID="836bba81-425e-4610-b191-2bbb2cfc1f79" containerID="bd9e831f88d074ed4ebcb3f0c21947564533211ce824af698b616217e7b83e86" exitCode=0 Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.474183 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"836bba81-425e-4610-b191-2bbb2cfc1f79","Type":"ContainerDied","Data":"bd9e831f88d074ed4ebcb3f0c21947564533211ce824af698b616217e7b83e86"} Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.476449 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"0a3ec758-e19e-4286-bfed-a1d6d3010bfb","Type":"ContainerDied","Data":"8bdd186a6b7674f05fab18e7c56a6b0e62a67b5a53a50271787e7d6eeeda8493"} Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.476585 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8bdd186a6b7674f05fab18e7c56a6b0e62a67b5a53a50271787e7d6eeeda8493" Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.477391 4895 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/85e9e481-0762-42a8-a25a-7d50500f1236-etc-swift\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.478326 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hjnqw\" (UniqueName: \"kubernetes.io/projected/6b3c2445-8bce-4d09-ad86-02c1ba6495fb-kube-api-access-hjnqw\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.478373 4895 reconciler_common.go:293] 
"Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e3d2bb1c-bd20-473e-b91a-7e2a63ec9f07-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.478389 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ace60b46-ed73-43ba-8d95-b81b03a6bd0a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.478420 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ssqkl\" (UniqueName: \"kubernetes.io/projected/290c1303-bf41-4474-86ff-c9f5aa105cc3-kube-api-access-ssqkl\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.478433 4895 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/290c1303-bf41-4474-86ff-c9f5aa105cc3-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.478450 4895 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/290c1303-bf41-4474-86ff-c9f5aa105cc3-logs\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.478462 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pxm6c\" (UniqueName: \"kubernetes.io/projected/e3d2bb1c-bd20-473e-b91a-7e2a63ec9f07-kube-api-access-pxm6c\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.478524 4895 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3d2bb1c-bd20-473e-b91a-7e2a63ec9f07-logs\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.478540 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7xnrz\" (UniqueName: 
\"kubernetes.io/projected/85e9e481-0762-42a8-a25a-7d50500f1236-kube-api-access-7xnrz\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.478552 4895 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.478567 4895 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6b3c2445-8bce-4d09-ad86-02c1ba6495fb-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.478609 4895 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.478698 4895 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/290c1303-bf41-4474-86ff-c9f5aa105cc3-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.480357 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-788d454954-brr26" event={"ID":"e3d2bb1c-bd20-473e-b91a-7e2a63ec9f07","Type":"ContainerDied","Data":"1cc3f46f7c91409910c521462a433b139a38e5268d48fc41e8dc3a7977ee1078"} Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.480478 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-788d454954-brr26" Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.489978 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell0b7d1-account-delete-wchwk" event={"ID":"5696e7d9-103a-4bf7-9b05-1959e92cf46a","Type":"ContainerStarted","Data":"0c2c388094b95cef4d9070468d30cc3bb7a5071f95547b2ee0b18119aa7ce3f9"} Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.491291 4895 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack/novacell0b7d1-account-delete-wchwk" secret="" err="secret \"galera-openstack-dockercfg-jskc8\" not found" Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.492897 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder7d85-account-delete-j8sgc" event={"ID":"f28e5fd3-456b-4960-a3a9-1134e3eecb1f","Type":"ContainerDied","Data":"3cbef3cc9d8370b2c106b6a1f67d7cde1be7e59e73377a5eabb4fc8c2688067a"} Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.493008 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3cbef3cc9d8370b2c106b6a1f67d7cde1be7e59e73377a5eabb4fc8c2688067a" Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.518590 4895 generic.go:334] "Generic (PLEG): container finished" podID="6ebf9714-5e6d-415c-a0aa-adab0d3e46e9" containerID="cc3ec4d62ef18a3145302b1f913c2b11bc95cfa5e826aece7c00bbdc8aea0e34" exitCode=0 Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.518733 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wjx7g" event={"ID":"6ebf9714-5e6d-415c-a0aa-adab0d3e46e9","Type":"ContainerDied","Data":"cc3ec4d62ef18a3145302b1f913c2b11bc95cfa5e826aece7c00bbdc8aea0e34"} Dec 02 07:48:54 crc kubenswrapper[4895]: E1202 07:48:54.587257 4895 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 02 07:48:54 crc kubenswrapper[4895]: 
E1202 07:48:54.587323 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5696e7d9-103a-4bf7-9b05-1959e92cf46a-operator-scripts podName:5696e7d9-103a-4bf7-9b05-1959e92cf46a nodeName:}" failed. No retries permitted until 2025-12-02 07:48:55.087306373 +0000 UTC m=+1546.258165986 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/5696e7d9-103a-4bf7-9b05-1959e92cf46a-operator-scripts") pod "novacell0b7d1-account-delete-wchwk" (UID: "5696e7d9-103a-4bf7-9b05-1959e92cf46a") : configmap "openstack-scripts" not found Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.593494 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/novacell0b7d1-account-delete-wchwk" podStartSLOduration=8.593472503 podStartE2EDuration="8.593472503s" podCreationTimestamp="2025-12-02 07:48:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:48:54.58853406 +0000 UTC m=+1545.759393673" watchObservedRunningTime="2025-12-02 07:48:54.593472503 +0000 UTC m=+1545.764332116" Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.605691 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vpwvp" event={"ID":"e203ec5f-dd45-44bb-97b2-fd8a548ce231","Type":"ContainerStarted","Data":"87ac20d95fd6a459d393ff2c140af98b6f77476df00b59a3a877db168b3478f1"} Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.647397 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f762a68c-cabc-4842-844a-1db6710e3ee9","Type":"ContainerDied","Data":"58bc85987b637dd3201cd07bd859b57218cf7cf9d0e7867f0d422f70d7b00677"} Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.647781 4895 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="58bc85987b637dd3201cd07bd859b57218cf7cf9d0e7867f0d422f70d7b00677" Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.691139 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-vpwvp" podStartSLOduration=7.976293887 podStartE2EDuration="11.691110208s" podCreationTimestamp="2025-12-02 07:48:43 +0000 UTC" firstStartedPulling="2025-12-02 07:48:47.510672723 +0000 UTC m=+1538.681532336" lastFinishedPulling="2025-12-02 07:48:51.225489044 +0000 UTC m=+1542.396348657" observedRunningTime="2025-12-02 07:48:54.689287881 +0000 UTC m=+1545.860147514" watchObservedRunningTime="2025-12-02 07:48:54.691110208 +0000 UTC m=+1545.861969831" Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.695101 4895 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.696780 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebbed0ba-1d44-4421-a276-b075b0f31c3f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ebbed0ba-1d44-4421-a276-b075b0f31c3f" (UID: "ebbed0ba-1d44-4421-a276-b075b0f31c3f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.702523 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novaapi23cb-account-delete-g8msv" event={"ID":"96831697-ba2e-477e-954f-e4ad0cf30f92","Type":"ContainerStarted","Data":"0ea2b37615e5717b134e70582d30afd9a8248506c11d128a320a1ec2c2f21f39"} Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.703603 4895 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openstack/novaapi23cb-account-delete-g8msv" secret="" err="secret \"galera-openstack-dockercfg-jskc8\" not found" Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.709952 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.710054 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-ddf8948cc-h2bbh" event={"ID":"ab5ec753-410a-4d4b-8071-ce60970ba4df","Type":"ContainerDied","Data":"e47f6a84140bc13b1f6bdc81fbcae924a4055e6a0a8c633a5874fb7f744bfb46"} Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.710097 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e47f6a84140bc13b1f6bdc81fbcae924a4055e6a0a8c633a5874fb7f744bfb46" Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.710645 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.711361 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-6f6974886f-mmsbz" Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.711557 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.711909 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.712226 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystonea3d7-account-delete-jgvsf" Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.761440 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/novaapi23cb-account-delete-g8msv" podStartSLOduration=9.761412875 podStartE2EDuration="9.761412875s" podCreationTimestamp="2025-12-02 07:48:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:48:54.724969356 +0000 UTC m=+1545.895828979" watchObservedRunningTime="2025-12-02 07:48:54.761412875 +0000 UTC m=+1545.932272488" Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.795837 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8zdjw\" (UniqueName: \"kubernetes.io/projected/ffbc11e8-7e53-46db-bcd7-35b4ab5d7fb9-kube-api-access-8zdjw\") pod \"keystonea3d7-account-delete-jgvsf\" (UID: \"ffbc11e8-7e53-46db-bcd7-35b4ab5d7fb9\") " pod="openstack/keystonea3d7-account-delete-jgvsf" Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.795979 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ffbc11e8-7e53-46db-bcd7-35b4ab5d7fb9-operator-scripts\") pod \"keystonea3d7-account-delete-jgvsf\" (UID: \"ffbc11e8-7e53-46db-bcd7-35b4ab5d7fb9\") " pod="openstack/keystonea3d7-account-delete-jgvsf" Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.796220 4895 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.796236 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebbed0ba-1d44-4421-a276-b075b0f31c3f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:54 
crc kubenswrapper[4895]: E1202 07:48:54.796716 4895 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 02 07:48:54 crc kubenswrapper[4895]: E1202 07:48:54.797375 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ffbc11e8-7e53-46db-bcd7-35b4ab5d7fb9-operator-scripts podName:ffbc11e8-7e53-46db-bcd7-35b4ab5d7fb9 nodeName:}" failed. No retries permitted until 2025-12-02 07:48:56.797238294 +0000 UTC m=+1547.968097907 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/ffbc11e8-7e53-46db-bcd7-35b4ab5d7fb9-operator-scripts") pod "keystonea3d7-account-delete-jgvsf" (UID: "ffbc11e8-7e53-46db-bcd7-35b4ab5d7fb9") : configmap "openstack-scripts" not found Dec 02 07:48:54 crc kubenswrapper[4895]: E1202 07:48:54.799057 4895 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 02 07:48:54 crc kubenswrapper[4895]: E1202 07:48:54.799176 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/96831697-ba2e-477e-954f-e4ad0cf30f92-operator-scripts podName:96831697-ba2e-477e-954f-e4ad0cf30f92 nodeName:}" failed. No retries permitted until 2025-12-02 07:48:55.299143473 +0000 UTC m=+1546.470003086 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/96831697-ba2e-477e-954f-e4ad0cf30f92-operator-scripts") pod "novaapi23cb-account-delete-g8msv" (UID: "96831697-ba2e-477e-954f-e4ad0cf30f92") : configmap "openstack-scripts" not found Dec 02 07:48:54 crc kubenswrapper[4895]: E1202 07:48:54.805523 4895 projected.go:194] Error preparing data for projected volume kube-api-access-8zdjw for pod openstack/keystonea3d7-account-delete-jgvsf: failed to fetch token: serviceaccounts "galera-openstack" not found Dec 02 07:48:54 crc kubenswrapper[4895]: E1202 07:48:54.805633 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ffbc11e8-7e53-46db-bcd7-35b4ab5d7fb9-kube-api-access-8zdjw podName:ffbc11e8-7e53-46db-bcd7-35b4ab5d7fb9 nodeName:}" failed. No retries permitted until 2025-12-02 07:48:56.805608574 +0000 UTC m=+1547.976468367 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-8zdjw" (UniqueName: "kubernetes.io/projected/ffbc11e8-7e53-46db-bcd7-35b4ab5d7fb9-kube-api-access-8zdjw") pod "keystonea3d7-account-delete-jgvsf" (UID: "ffbc11e8-7e53-46db-bcd7-35b4ab5d7fb9") : failed to fetch token: serviceaccounts "galera-openstack" not found Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.838370 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85e9e481-0762-42a8-a25a-7d50500f1236-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "85e9e481-0762-42a8-a25a-7d50500f1236" (UID: "85e9e481-0762-42a8-a25a-7d50500f1236"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:48:54 crc kubenswrapper[4895]: E1202 07:48:54.871641 4895 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6aa3fa36a02a2278bdcd14541013304e85cab914b53ba0de74342a5fb50d00cd is running failed: container process not found" containerID="6aa3fa36a02a2278bdcd14541013304e85cab914b53ba0de74342a5fb50d00cd" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 02 07:48:54 crc kubenswrapper[4895]: E1202 07:48:54.872071 4895 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6aa3fa36a02a2278bdcd14541013304e85cab914b53ba0de74342a5fb50d00cd is running failed: container process not found" containerID="6aa3fa36a02a2278bdcd14541013304e85cab914b53ba0de74342a5fb50d00cd" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 02 07:48:54 crc kubenswrapper[4895]: E1202 07:48:54.872353 4895 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6aa3fa36a02a2278bdcd14541013304e85cab914b53ba0de74342a5fb50d00cd is running failed: container process not found" containerID="6aa3fa36a02a2278bdcd14541013304e85cab914b53ba0de74342a5fb50d00cd" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 02 07:48:54 crc kubenswrapper[4895]: E1202 07:48:54.872405 4895 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6aa3fa36a02a2278bdcd14541013304e85cab914b53ba0de74342a5fb50d00cd is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-9vczq" podUID="6b463255-a237-46b0-826d-1e6fc849f0aa" containerName="ovsdb-server" Dec 02 07:48:54 crc kubenswrapper[4895]: E1202 07:48:54.875944 4895 
log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="7cc0795476568fadf42dedac218a2fe6065e25675de99e96dafc929b4ec7b9ff" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 02 07:48:54 crc kubenswrapper[4895]: E1202 07:48:54.877600 4895 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="7cc0795476568fadf42dedac218a2fe6065e25675de99e96dafc929b4ec7b9ff" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.877717 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-skg8w"] Dec 02 07:48:54 crc kubenswrapper[4895]: E1202 07:48:54.884004 4895 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="7cc0795476568fadf42dedac218a2fe6065e25675de99e96dafc929b4ec7b9ff" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 02 07:48:54 crc kubenswrapper[4895]: E1202 07:48:54.884113 4895 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-9vczq" podUID="6b463255-a237-46b0-826d-1e6fc849f0aa" containerName="ovs-vswitchd" Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.904127 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-skg8w"] Dec 02 07:48:54 crc kubenswrapper[4895]: I1202 07:48:54.913533 4895 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/85e9e481-0762-42a8-a25a-7d50500f1236-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.001547 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron5a3b-account-delete-949mv"] Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.011458 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5a3b-account-create-update-ztphx"] Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.037890 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-5a3b-account-create-update-ztphx"] Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.050723 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/290c1303-bf41-4474-86ff-c9f5aa105cc3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "290c1303-bf41-4474-86ff-c9f5aa105cc3" (UID: "290c1303-bf41-4474-86ff-c9f5aa105cc3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.053021 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/290c1303-bf41-4474-86ff-c9f5aa105cc3-config-data" (OuterVolumeSpecName: "config-data") pod "290c1303-bf41-4474-86ff-c9f5aa105cc3" (UID: "290c1303-bf41-4474-86ff-c9f5aa105cc3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.087987 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3d2bb1c-bd20-473e-b91a-7e2a63ec9f07-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "e3d2bb1c-bd20-473e-b91a-7e2a63ec9f07" (UID: "e3d2bb1c-bd20-473e-b91a-7e2a63ec9f07"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.094023 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebbed0ba-1d44-4421-a276-b075b0f31c3f-config-data" (OuterVolumeSpecName: "config-data") pod "ebbed0ba-1d44-4421-a276-b075b0f31c3f" (UID: "ebbed0ba-1d44-4421-a276-b075b0f31c3f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.108042 4895 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.109903 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3d2bb1c-bd20-473e-b91a-7e2a63ec9f07-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e3d2bb1c-bd20-473e-b91a-7e2a63ec9f07" (UID: "e3d2bb1c-bd20-473e-b91a-7e2a63ec9f07"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.114349 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3d2bb1c-bd20-473e-b91a-7e2a63ec9f07-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "e3d2bb1c-bd20-473e-b91a-7e2a63ec9f07" (UID: "e3d2bb1c-bd20-473e-b91a-7e2a63ec9f07"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.116952 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ace60b46-ed73-43ba-8d95-b81b03a6bd0a-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "ace60b46-ed73-43ba-8d95-b81b03a6bd0a" (UID: "ace60b46-ed73-43ba-8d95-b81b03a6bd0a"). InnerVolumeSpecName "galera-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.117687 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/ace60b46-ed73-43ba-8d95-b81b03a6bd0a-galera-tls-certs\") pod \"ace60b46-ed73-43ba-8d95-b81b03a6bd0a\" (UID: \"ace60b46-ed73-43ba-8d95-b81b03a6bd0a\") " Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.118641 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3d2bb1c-bd20-473e-b91a-7e2a63ec9f07-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.118660 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/290c1303-bf41-4474-86ff-c9f5aa105cc3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.118678 4895 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3d2bb1c-bd20-473e-b91a-7e2a63ec9f07-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.118690 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebbed0ba-1d44-4421-a276-b075b0f31c3f-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.118703 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/290c1303-bf41-4474-86ff-c9f5aa105cc3-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.118716 4895 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Dec 02 
07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.118762 4895 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3d2bb1c-bd20-473e-b91a-7e2a63ec9f07-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:55 crc kubenswrapper[4895]: E1202 07:48:55.118864 4895 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 02 07:48:55 crc kubenswrapper[4895]: E1202 07:48:55.118932 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5696e7d9-103a-4bf7-9b05-1959e92cf46a-operator-scripts podName:5696e7d9-103a-4bf7-9b05-1959e92cf46a nodeName:}" failed. No retries permitted until 2025-12-02 07:48:56.118910327 +0000 UTC m=+1547.289769940 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/5696e7d9-103a-4bf7-9b05-1959e92cf46a-operator-scripts") pod "novacell0b7d1-account-delete-wchwk" (UID: "5696e7d9-103a-4bf7-9b05-1959e92cf46a") : configmap "openstack-scripts" not found Dec 02 07:48:55 crc kubenswrapper[4895]: W1202 07:48:55.119083 4895 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/ace60b46-ed73-43ba-8d95-b81b03a6bd0a/volumes/kubernetes.io~secret/galera-tls-certs Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.119127 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ace60b46-ed73-43ba-8d95-b81b03a6bd0a-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "ace60b46-ed73-43ba-8d95-b81b03a6bd0a" (UID: "ace60b46-ed73-43ba-8d95-b81b03a6bd0a"). InnerVolumeSpecName "galera-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.132251 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85e9e481-0762-42a8-a25a-7d50500f1236-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "85e9e481-0762-42a8-a25a-7d50500f1236" (UID: "85e9e481-0762-42a8-a25a-7d50500f1236"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.175200 4895 scope.go:117] "RemoveContainer" containerID="fd8c7d4e19097367de3d3f49094033e0adeb083a5427064f86bcdaba564bc61c" Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.176202 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebbed0ba-1d44-4421-a276-b075b0f31c3f-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "ebbed0ba-1d44-4421-a276-b075b0f31c3f" (UID: "ebbed0ba-1d44-4421-a276-b075b0f31c3f"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.177044 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4869eb0-5e33-4837-8295-06ca17076e69-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "e4869eb0-5e33-4837-8295-06ca17076e69" (UID: "e4869eb0-5e33-4837-8295-06ca17076e69"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.183147 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4869eb0-5e33-4837-8295-06ca17076e69-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e4869eb0-5e33-4837-8295-06ca17076e69" (UID: "e4869eb0-5e33-4837-8295-06ca17076e69"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.185972 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85e9e481-0762-42a8-a25a-7d50500f1236-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "85e9e481-0762-42a8-a25a-7d50500f1236" (UID: "85e9e481-0762-42a8-a25a-7d50500f1236"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.190623 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4869eb0-5e33-4837-8295-06ca17076e69-config-data" (OuterVolumeSpecName: "config-data") pod "e4869eb0-5e33-4837-8295-06ca17076e69" (UID: "e4869eb0-5e33-4837-8295-06ca17076e69"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.198592 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3d2bb1c-bd20-473e-b91a-7e2a63ec9f07-config-data" (OuterVolumeSpecName: "config-data") pod "e3d2bb1c-bd20-473e-b91a-7e2a63ec9f07" (UID: "e3d2bb1c-bd20-473e-b91a-7e2a63ec9f07"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.209327 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e30fe62-fac0-425f-ba6f-277033d652d1" path="/var/lib/kubelet/pods/2e30fe62-fac0-425f-ba6f-277033d652d1/volumes" Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.210477 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b34f139-ac6c-4a24-b478-c4563cce6a2c" path="/var/lib/kubelet/pods/5b34f139-ac6c-4a24-b478-c4563cce6a2c/volumes" Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.212608 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-54957dcd96-7sx87" Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.212608 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-ddf8948cc-h2bbh" Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.213342 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="936e9805-90f9-43dd-ad0c-f248ea86a3c5" path="/var/lib/kubelet/pods/936e9805-90f9-43dd-ad0c-f248ea86a3c5/volumes" Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.214911 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce68a83d-fdca-4ae6-8e1b-ab7dffb77bb8" path="/var/lib/kubelet/pods/ce68a83d-fdca-4ae6-8e1b-ab7dffb77bb8/volumes" Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.216001 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eec7076a-ba39-484b-9c7f-4eb78d449de2" path="/var/lib/kubelet/pods/eec7076a-ba39-484b-9c7f-4eb78d449de2/volumes" Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.220433 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4869eb0-5e33-4837-8295-06ca17076e69-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.220469 4895 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4869eb0-5e33-4837-8295-06ca17076e69-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.220483 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4869eb0-5e33-4837-8295-06ca17076e69-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.220497 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/85e9e481-0762-42a8-a25a-7d50500f1236-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.220512 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3d2bb1c-bd20-473e-b91a-7e2a63ec9f07-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.220524 4895 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebbed0ba-1d44-4421-a276-b075b0f31c3f-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.220535 4895 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/ace60b46-ed73-43ba-8d95-b81b03a6bd0a-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.220547 4895 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/85e9e481-0762-42a8-a25a-7d50500f1236-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.258088 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.268138 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.288557 4895 scope.go:117] "RemoveContainer" containerID="8f6e08f059d8d10b34bda28a99cf993bc10f7153af260e881771ef9437a89f77" Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.291097 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebbed0ba-1d44-4421-a276-b075b0f31c3f-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "ebbed0ba-1d44-4421-a276-b075b0f31c3f" (UID: "ebbed0ba-1d44-4421-a276-b075b0f31c3f"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.339196 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab5ec753-410a-4d4b-8071-ce60970ba4df-internal-tls-certs\") pod \"ab5ec753-410a-4d4b-8071-ce60970ba4df\" (UID: \"ab5ec753-410a-4d4b-8071-ce60970ba4df\") " Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.339283 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/68bddf66-0b9f-4bc8-916b-aa0abfbf13c3-logs\") pod \"68bddf66-0b9f-4bc8-916b-aa0abfbf13c3\" (UID: \"68bddf66-0b9f-4bc8-916b-aa0abfbf13c3\") " Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.339317 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d2c6f66-9fe8-4d15-92f6-f29493e6cd7b-combined-ca-bundle\") pod \"2d2c6f66-9fe8-4d15-92f6-f29493e6cd7b\" (UID: \"2d2c6f66-9fe8-4d15-92f6-f29493e6cd7b\") " Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.339392 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68bddf66-0b9f-4bc8-916b-aa0abfbf13c3-combined-ca-bundle\") pod 
\"68bddf66-0b9f-4bc8-916b-aa0abfbf13c3\" (UID: \"68bddf66-0b9f-4bc8-916b-aa0abfbf13c3\") " Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.339453 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/68bddf66-0b9f-4bc8-916b-aa0abfbf13c3-public-tls-certs\") pod \"68bddf66-0b9f-4bc8-916b-aa0abfbf13c3\" (UID: \"68bddf66-0b9f-4bc8-916b-aa0abfbf13c3\") " Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.339515 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d2c6f66-9fe8-4d15-92f6-f29493e6cd7b-nova-metadata-tls-certs\") pod \"2d2c6f66-9fe8-4d15-92f6-f29493e6cd7b\" (UID: \"2d2c6f66-9fe8-4d15-92f6-f29493e6cd7b\") " Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.339555 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/68bddf66-0b9f-4bc8-916b-aa0abfbf13c3-internal-tls-certs\") pod \"68bddf66-0b9f-4bc8-916b-aa0abfbf13c3\" (UID: \"68bddf66-0b9f-4bc8-916b-aa0abfbf13c3\") " Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.339626 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ab5ec753-410a-4d4b-8071-ce60970ba4df-config\") pod \"ab5ec753-410a-4d4b-8071-ce60970ba4df\" (UID: \"ab5ec753-410a-4d4b-8071-ce60970ba4df\") " Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.339663 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vdqn2\" (UniqueName: \"kubernetes.io/projected/2d2c6f66-9fe8-4d15-92f6-f29493e6cd7b-kube-api-access-vdqn2\") pod \"2d2c6f66-9fe8-4d15-92f6-f29493e6cd7b\" (UID: \"2d2c6f66-9fe8-4d15-92f6-f29493e6cd7b\") " Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.339720 4895 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab5ec753-410a-4d4b-8071-ce60970ba4df-ovndb-tls-certs\") pod \"ab5ec753-410a-4d4b-8071-ce60970ba4df\" (UID: \"ab5ec753-410a-4d4b-8071-ce60970ba4df\") " Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.339804 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68bddf66-0b9f-4bc8-916b-aa0abfbf13c3-scripts\") pod \"68bddf66-0b9f-4bc8-916b-aa0abfbf13c3\" (UID: \"68bddf66-0b9f-4bc8-916b-aa0abfbf13c3\") " Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.339862 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ab5ec753-410a-4d4b-8071-ce60970ba4df-httpd-config\") pod \"ab5ec753-410a-4d4b-8071-ce60970ba4df\" (UID: \"ab5ec753-410a-4d4b-8071-ce60970ba4df\") " Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.340086 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68bddf66-0b9f-4bc8-916b-aa0abfbf13c3-config-data\") pod \"68bddf66-0b9f-4bc8-916b-aa0abfbf13c3\" (UID: \"68bddf66-0b9f-4bc8-916b-aa0abfbf13c3\") " Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.340119 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2d2c6f66-9fe8-4d15-92f6-f29493e6cd7b-logs\") pod \"2d2c6f66-9fe8-4d15-92f6-f29493e6cd7b\" (UID: \"2d2c6f66-9fe8-4d15-92f6-f29493e6cd7b\") " Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.340138 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sjzj5\" (UniqueName: \"kubernetes.io/projected/ab5ec753-410a-4d4b-8071-ce60970ba4df-kube-api-access-sjzj5\") pod \"ab5ec753-410a-4d4b-8071-ce60970ba4df\" (UID: \"ab5ec753-410a-4d4b-8071-ce60970ba4df\") " Dec 02 
07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.340168 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab5ec753-410a-4d4b-8071-ce60970ba4df-public-tls-certs\") pod \"ab5ec753-410a-4d4b-8071-ce60970ba4df\" (UID: \"ab5ec753-410a-4d4b-8071-ce60970ba4df\") " Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.340226 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qfld8\" (UniqueName: \"kubernetes.io/projected/68bddf66-0b9f-4bc8-916b-aa0abfbf13c3-kube-api-access-qfld8\") pod \"68bddf66-0b9f-4bc8-916b-aa0abfbf13c3\" (UID: \"68bddf66-0b9f-4bc8-916b-aa0abfbf13c3\") " Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.340247 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab5ec753-410a-4d4b-8071-ce60970ba4df-combined-ca-bundle\") pod \"ab5ec753-410a-4d4b-8071-ce60970ba4df\" (UID: \"ab5ec753-410a-4d4b-8071-ce60970ba4df\") " Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.340266 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d2c6f66-9fe8-4d15-92f6-f29493e6cd7b-config-data\") pod \"2d2c6f66-9fe8-4d15-92f6-f29493e6cd7b\" (UID: \"2d2c6f66-9fe8-4d15-92f6-f29493e6cd7b\") " Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.340790 4895 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebbed0ba-1d44-4421-a276-b075b0f31c3f-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:55 crc kubenswrapper[4895]: E1202 07:48:55.340875 4895 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 02 07:48:55 crc kubenswrapper[4895]: E1202 07:48:55.340975 4895 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/96831697-ba2e-477e-954f-e4ad0cf30f92-operator-scripts podName:96831697-ba2e-477e-954f-e4ad0cf30f92 nodeName:}" failed. No retries permitted until 2025-12-02 07:48:56.340950103 +0000 UTC m=+1547.511809716 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/96831697-ba2e-477e-954f-e4ad0cf30f92-operator-scripts") pod "novaapi23cb-account-delete-g8msv" (UID: "96831697-ba2e-477e-954f-e4ad0cf30f92") : configmap "openstack-scripts" not found Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.342183 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/68bddf66-0b9f-4bc8-916b-aa0abfbf13c3-logs" (OuterVolumeSpecName: "logs") pod "68bddf66-0b9f-4bc8-916b-aa0abfbf13c3" (UID: "68bddf66-0b9f-4bc8-916b-aa0abfbf13c3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.380725 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d2c6f66-9fe8-4d15-92f6-f29493e6cd7b-logs" (OuterVolumeSpecName: "logs") pod "2d2c6f66-9fe8-4d15-92f6-f29493e6cd7b" (UID: "2d2c6f66-9fe8-4d15-92f6-f29493e6cd7b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.424835 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68bddf66-0b9f-4bc8-916b-aa0abfbf13c3-kube-api-access-qfld8" (OuterVolumeSpecName: "kube-api-access-qfld8") pod "68bddf66-0b9f-4bc8-916b-aa0abfbf13c3" (UID: "68bddf66-0b9f-4bc8-916b-aa0abfbf13c3"). InnerVolumeSpecName "kube-api-access-qfld8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.434510 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab5ec753-410a-4d4b-8071-ce60970ba4df-kube-api-access-sjzj5" (OuterVolumeSpecName: "kube-api-access-sjzj5") pod "ab5ec753-410a-4d4b-8071-ce60970ba4df" (UID: "ab5ec753-410a-4d4b-8071-ce60970ba4df"). InnerVolumeSpecName "kube-api-access-sjzj5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.434550 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab5ec753-410a-4d4b-8071-ce60970ba4df-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "ab5ec753-410a-4d4b-8071-ce60970ba4df" (UID: "ab5ec753-410a-4d4b-8071-ce60970ba4df"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.443089 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f762a68c-cabc-4842-844a-1db6710e3ee9-config-data\") pod \"f762a68c-cabc-4842-844a-1db6710e3ee9\" (UID: \"f762a68c-cabc-4842-844a-1db6710e3ee9\") " Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.443203 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f762a68c-cabc-4842-844a-1db6710e3ee9-combined-ca-bundle\") pod \"f762a68c-cabc-4842-844a-1db6710e3ee9\" (UID: \"f762a68c-cabc-4842-844a-1db6710e3ee9\") " Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.443389 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f762a68c-cabc-4842-844a-1db6710e3ee9-logs\") pod \"f762a68c-cabc-4842-844a-1db6710e3ee9\" (UID: \"f762a68c-cabc-4842-844a-1db6710e3ee9\") " Dec 02 
07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.443531 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f762a68c-cabc-4842-844a-1db6710e3ee9-internal-tls-certs\") pod \"f762a68c-cabc-4842-844a-1db6710e3ee9\" (UID: \"f762a68c-cabc-4842-844a-1db6710e3ee9\") " Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.443809 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2jh6f\" (UniqueName: \"kubernetes.io/projected/f762a68c-cabc-4842-844a-1db6710e3ee9-kube-api-access-2jh6f\") pod \"f762a68c-cabc-4842-844a-1db6710e3ee9\" (UID: \"f762a68c-cabc-4842-844a-1db6710e3ee9\") " Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.443847 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f762a68c-cabc-4842-844a-1db6710e3ee9-public-tls-certs\") pod \"f762a68c-cabc-4842-844a-1db6710e3ee9\" (UID: \"f762a68c-cabc-4842-844a-1db6710e3ee9\") " Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.444870 4895 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ab5ec753-410a-4d4b-8071-ce60970ba4df-httpd-config\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.444913 4895 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2d2c6f66-9fe8-4d15-92f6-f29493e6cd7b-logs\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.444951 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sjzj5\" (UniqueName: \"kubernetes.io/projected/ab5ec753-410a-4d4b-8071-ce60970ba4df-kube-api-access-sjzj5\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.444970 4895 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-qfld8\" (UniqueName: \"kubernetes.io/projected/68bddf66-0b9f-4bc8-916b-aa0abfbf13c3-kube-api-access-qfld8\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.444978 4895 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/68bddf66-0b9f-4bc8-916b-aa0abfbf13c3-logs\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.445670 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f762a68c-cabc-4842-844a-1db6710e3ee9-logs" (OuterVolumeSpecName: "logs") pod "f762a68c-cabc-4842-844a-1db6710e3ee9" (UID: "f762a68c-cabc-4842-844a-1db6710e3ee9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.448112 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68bddf66-0b9f-4bc8-916b-aa0abfbf13c3-scripts" (OuterVolumeSpecName: "scripts") pod "68bddf66-0b9f-4bc8-916b-aa0abfbf13c3" (UID: "68bddf66-0b9f-4bc8-916b-aa0abfbf13c3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.474576 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f762a68c-cabc-4842-844a-1db6710e3ee9-kube-api-access-2jh6f" (OuterVolumeSpecName: "kube-api-access-2jh6f") pod "f762a68c-cabc-4842-844a-1db6710e3ee9" (UID: "f762a68c-cabc-4842-844a-1db6710e3ee9"). InnerVolumeSpecName "kube-api-access-2jh6f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.504942 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d2c6f66-9fe8-4d15-92f6-f29493e6cd7b-kube-api-access-vdqn2" (OuterVolumeSpecName: "kube-api-access-vdqn2") pod "2d2c6f66-9fe8-4d15-92f6-f29493e6cd7b" (UID: "2d2c6f66-9fe8-4d15-92f6-f29493e6cd7b"). InnerVolumeSpecName "kube-api-access-vdqn2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.546798 4895 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f762a68c-cabc-4842-844a-1db6710e3ee9-logs\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.546827 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2jh6f\" (UniqueName: \"kubernetes.io/projected/f762a68c-cabc-4842-844a-1db6710e3ee9-kube-api-access-2jh6f\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.546838 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vdqn2\" (UniqueName: \"kubernetes.io/projected/2d2c6f66-9fe8-4d15-92f6-f29493e6cd7b-kube-api-access-vdqn2\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.546846 4895 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68bddf66-0b9f-4bc8-916b-aa0abfbf13c3-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.661368 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85e9e481-0762-42a8-a25a-7d50500f1236-config-data" (OuterVolumeSpecName: "config-data") pod "85e9e481-0762-42a8-a25a-7d50500f1236" (UID: "85e9e481-0762-42a8-a25a-7d50500f1236"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.666258 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d2c6f66-9fe8-4d15-92f6-f29493e6cd7b-config-data" (OuterVolumeSpecName: "config-data") pod "2d2c6f66-9fe8-4d15-92f6-f29493e6cd7b" (UID: "2d2c6f66-9fe8-4d15-92f6-f29493e6cd7b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.673306 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f762a68c-cabc-4842-844a-1db6710e3ee9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f762a68c-cabc-4842-844a-1db6710e3ee9" (UID: "f762a68c-cabc-4842-844a-1db6710e3ee9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.691938 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f762a68c-cabc-4842-844a-1db6710e3ee9-config-data" (OuterVolumeSpecName: "config-data") pod "f762a68c-cabc-4842-844a-1db6710e3ee9" (UID: "f762a68c-cabc-4842-844a-1db6710e3ee9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.732921 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.732960 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.732978 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-lgt57"] Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.732990 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-lgt57"] Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.733003 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-7d85-account-create-update-bdsdf"] Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.733014 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder7d85-account-delete-j8sgc"] Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.733029 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-7d85-account-create-update-bdsdf"] Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.733053 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-mvm85"] Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.733078 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-mvm85"] Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.733091 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-a8bc-account-create-update-8tvzs"] Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.733101 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placementa8bc-account-delete-jz5nc"] Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.733112 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/placement-a8bc-account-create-update-8tvzs"] Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.749056 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.751389 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"b15097a8-ac9a-4886-a839-272b662561c5","Type":"ContainerDied","Data":"d34b4b01f58d4efb52d58e25e1b6d67170cdffcd12cc39dacc5cc3b16536ff67"} Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.753898 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d34b4b01f58d4efb52d58e25e1b6d67170cdffcd12cc39dacc5cc3b16536ff67" Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.754465 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-nbxdg"] Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.754888 4895 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-ftfwq" podUID="84116ead-6214-4d5f-98a3-c89b08cf1306" containerName="ovn-controller" probeResult="failure" output="command timed out" Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.761351 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f762a68c-cabc-4842-844a-1db6710e3ee9-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "f762a68c-cabc-4842-844a-1db6710e3ee9" (UID: "f762a68c-cabc-4842-844a-1db6710e3ee9"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.765603 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"836bba81-425e-4610-b191-2bbb2cfc1f79","Type":"ContainerDied","Data":"c643cb3d9cdd9f0ff6dee778f7df8c92dfc49bf8be8b12f994f8471ca51b5517"} Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.765651 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c643cb3d9cdd9f0ff6dee778f7df8c92dfc49bf8be8b12f994f8471ca51b5517" Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.765824 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder7d85-account-delete-j8sgc" Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.768439 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85e9e481-0762-42a8-a25a-7d50500f1236-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.768488 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f762a68c-cabc-4842-844a-1db6710e3ee9-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.768507 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d2c6f66-9fe8-4d15-92f6-f29493e6cd7b-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.768518 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f762a68c-cabc-4842-844a-1db6710e3ee9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.768533 4895 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/f762a68c-cabc-4842-844a-1db6710e3ee9-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.769921 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-nbxdg"] Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.776971 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placementa8bc-account-delete-jz5nc" Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.782542 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican4aa4-account-delete-pvmbl" event={"ID":"d42411e0-2228-4a1a-9d31-e3788f2b1f0c","Type":"ContainerDied","Data":"09ffbce5ff992ce45a6588d7df661406a36c03879603fc1bd1229eece009e3bf"} Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.782600 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="09ffbce5ff992ce45a6588d7df661406a36c03879603fc1bd1229eece009e3bf" Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.788664 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"0d1cb194-5325-40c2-bbd4-0a48821e12aa","Type":"ContainerDied","Data":"698ab9cd60a9ca0d4905b9578ec18327a647140523bec61f8d2e460409a34dcb"} Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.788717 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="698ab9cd60a9ca0d4905b9578ec18327a647140523bec61f8d2e460409a34dcb" Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.791312 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystonea3d7-account-delete-jgvsf" Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.799472 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance5583-account-delete-xm6hg"] Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.799874 4895 generic.go:334] "Generic (PLEG): container finished" podID="ca98cba7-4127-4d25-a139-1a42224331f2" containerID="5d044ff799057808b8d67f79590923f9bd83b515bcd050be4b95fa7aeeb31f38" exitCode=0 Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.799929 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"ca98cba7-4127-4d25-a139-1a42224331f2","Type":"ContainerDied","Data":"5d044ff799057808b8d67f79590923f9bd83b515bcd050be4b95fa7aeeb31f38"} Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.806082 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d08915b6-6f79-40e4-8c26-d9f82606b4cc","Type":"ContainerDied","Data":"3e2e331854c4e9337578d89d34663f953235d7bc3b4d554ef471426d1bd82237"} Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.806127 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3e2e331854c4e9337578d89d34663f953235d7bc3b4d554ef471426d1bd82237" Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.812803 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance5583-account-delete-xm6hg"] Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.816558 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.819838 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-54957dcd96-7sx87" Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.843933 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-ddf8948cc-h2bbh" Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.844704 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.844993 4895 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack/novaapi23cb-account-delete-g8msv" secret="" err="secret \"galera-openstack-dockercfg-jskc8\" not found" Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.845059 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron5a3b-account-delete-949mv" event={"ID":"5cae5c9e-9159-4e78-9809-1801d0e35131","Type":"ContainerDied","Data":"1ffc4dad2b26cfe1658af060ac03b29a5bd8150a1c2d6396811081f31b4196d0"} Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.845095 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1ffc4dad2b26cfe1658af060ac03b29a5bd8150a1c2d6396811081f31b4196d0" Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.845276 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.846780 4895 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openstack/novacell0b7d1-account-delete-wchwk" secret="" err="secret \"galera-openstack-dockercfg-jskc8\" not found" Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.892483 4895 scope.go:117] "RemoveContainer" containerID="915e5d2b5f5c95e83c1104dc0136dd664c02203a632c18741317fd352c1f6413" Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.892712 4895 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-ftfwq" podUID="84116ead-6214-4d5f-98a3-c89b08cf1306" containerName="ovn-controller" probeResult="failure" output=< Dec 02 07:48:55 crc kubenswrapper[4895]: ERROR - Failed to get connection status from ovn-controller, ovn-appctl exit status: 0 Dec 02 07:48:55 crc kubenswrapper[4895]: > Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.896039 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.896282 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.897574 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron5a3b-account-delete-949mv" Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.898471 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.908801 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican4aa4-account-delete-pvmbl" Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.908998 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-5583-account-create-update-vwhn6"] Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.919400 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.932314 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab5ec753-410a-4d4b-8071-ce60970ba4df-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "ab5ec753-410a-4d4b-8071-ce60970ba4df" (UID: "ab5ec753-410a-4d4b-8071-ce60970ba4df"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:48:55 crc kubenswrapper[4895]: I1202 07:48:55.944238 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-5583-account-create-update-vwhn6"] Dec 02 07:48:56 crc kubenswrapper[4895]: E1202 07:48:56.016270 4895 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err=< Dec 02 07:48:56 crc kubenswrapper[4895]: command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: 2025-12-02T07:48:47Z|00001|fatal_signal|WARN|terminating with signal 14 (Alarm clock) Dec 02 07:48:56 crc kubenswrapper[4895]: /etc/init.d/functions: line 589: 428 Alarm clock "$@" Dec 02 07:48:56 crc kubenswrapper[4895]: > execCommand=["/usr/share/ovn/scripts/ovn-ctl","stop_controller"] containerName="ovn-controller" pod="openstack/ovn-controller-ftfwq" message=< Dec 02 07:48:56 crc kubenswrapper[4895]: Exiting ovn-controller (1) [FAILED] Dec 02 07:48:56 crc kubenswrapper[4895]: Killing ovn-controller (1) [ OK ] Dec 02 07:48:56 crc kubenswrapper[4895]: Killing ovn-controller (1) with SIGKILL [ OK ] Dec 02 07:48:56 crc kubenswrapper[4895]: 2025-12-02T07:48:47Z|00001|fatal_signal|WARN|terminating with signal 14 (Alarm clock) Dec 02 07:48:56 crc kubenswrapper[4895]: /etc/init.d/functions: line 589: 428 Alarm clock "$@" Dec 02 07:48:56 crc kubenswrapper[4895]: > Dec 02 07:48:56 crc kubenswrapper[4895]: E1202 07:48:56.016321 4895 kuberuntime_container.go:691] "PreStop hook failed" err=< Dec 02 07:48:56 crc kubenswrapper[4895]: command 
'/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: 2025-12-02T07:48:47Z|00001|fatal_signal|WARN|terminating with signal 14 (Alarm clock) Dec 02 07:48:56 crc kubenswrapper[4895]: /etc/init.d/functions: line 589: 428 Alarm clock "$@" Dec 02 07:48:56 crc kubenswrapper[4895]: > pod="openstack/ovn-controller-ftfwq" podUID="84116ead-6214-4d5f-98a3-c89b08cf1306" containerName="ovn-controller" containerID="cri-o://71bd075d30ee48222b192e19ea3e173bfae0e94488a7e8ecbc6fd0d9989b9830" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.016372 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ftfwq" podUID="84116ead-6214-4d5f-98a3-c89b08cf1306" containerName="ovn-controller" containerID="cri-o://71bd075d30ee48222b192e19ea3e173bfae0e94488a7e8ecbc6fd0d9989b9830" gracePeriod=21 Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.017483 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-sfwtc"] Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.020840 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xp5fb\" (UniqueName: \"kubernetes.io/projected/836bba81-425e-4610-b191-2bbb2cfc1f79-kube-api-access-xp5fb\") pod \"836bba81-425e-4610-b191-2bbb2cfc1f79\" (UID: \"836bba81-425e-4610-b191-2bbb2cfc1f79\") " Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.020879 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5cae5c9e-9159-4e78-9809-1801d0e35131-operator-scripts\") pod \"5cae5c9e-9159-4e78-9809-1801d0e35131\" (UID: \"5cae5c9e-9159-4e78-9809-1801d0e35131\") " Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.020908 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0d1cb194-5325-40c2-bbd4-0a48821e12aa-pod-info\") pod 
\"0d1cb194-5325-40c2-bbd4-0a48821e12aa\" (UID: \"0d1cb194-5325-40c2-bbd4-0a48821e12aa\") " Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.020958 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"0d1cb194-5325-40c2-bbd4-0a48821e12aa\" (UID: \"0d1cb194-5325-40c2-bbd4-0a48821e12aa\") " Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.020981 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ca98cba7-4127-4d25-a139-1a42224331f2-erlang-cookie-secret\") pod \"ca98cba7-4127-4d25-a139-1a42224331f2\" (UID: \"ca98cba7-4127-4d25-a139-1a42224331f2\") " Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.021009 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0d1cb194-5325-40c2-bbd4-0a48821e12aa-erlang-cookie-secret\") pod \"0d1cb194-5325-40c2-bbd4-0a48821e12aa\" (UID: \"0d1cb194-5325-40c2-bbd4-0a48821e12aa\") " Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.021034 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0d1cb194-5325-40c2-bbd4-0a48821e12aa-rabbitmq-confd\") pod \"0d1cb194-5325-40c2-bbd4-0a48821e12aa\" (UID: \"0d1cb194-5325-40c2-bbd4-0a48821e12aa\") " Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.021069 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ca98cba7-4127-4d25-a139-1a42224331f2-rabbitmq-confd\") pod \"ca98cba7-4127-4d25-a139-1a42224331f2\" (UID: \"ca98cba7-4127-4d25-a139-1a42224331f2\") " Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.025892 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/836bba81-425e-4610-b191-2bbb2cfc1f79-combined-ca-bundle\") pod \"836bba81-425e-4610-b191-2bbb2cfc1f79\" (UID: \"836bba81-425e-4610-b191-2bbb2cfc1f79\") " Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.025942 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d08915b6-6f79-40e4-8c26-d9f82606b4cc-combined-ca-bundle\") pod \"d08915b6-6f79-40e4-8c26-d9f82606b4cc\" (UID: \"d08915b6-6f79-40e4-8c26-d9f82606b4cc\") " Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.025974 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mjhg9\" (UniqueName: \"kubernetes.io/projected/0d1cb194-5325-40c2-bbd4-0a48821e12aa-kube-api-access-mjhg9\") pod \"0d1cb194-5325-40c2-bbd4-0a48821e12aa\" (UID: \"0d1cb194-5325-40c2-bbd4-0a48821e12aa\") " Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.026002 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ca98cba7-4127-4d25-a139-1a42224331f2-server-conf\") pod \"ca98cba7-4127-4d25-a139-1a42224331f2\" (UID: \"ca98cba7-4127-4d25-a139-1a42224331f2\") " Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.026028 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bnnfp\" (UniqueName: \"kubernetes.io/projected/0a3ec758-e19e-4286-bfed-a1d6d3010bfb-kube-api-access-bnnfp\") pod \"0a3ec758-e19e-4286-bfed-a1d6d3010bfb\" (UID: \"0a3ec758-e19e-4286-bfed-a1d6d3010bfb\") " Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.026051 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ca98cba7-4127-4d25-a139-1a42224331f2-rabbitmq-tls\") pod \"ca98cba7-4127-4d25-a139-1a42224331f2\" (UID: 
\"ca98cba7-4127-4d25-a139-1a42224331f2\") " Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.026299 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d42411e0-2228-4a1a-9d31-e3788f2b1f0c-operator-scripts\") pod \"d42411e0-2228-4a1a-9d31-e3788f2b1f0c\" (UID: \"d42411e0-2228-4a1a-9d31-e3788f2b1f0c\") " Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.026347 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/b15097a8-ac9a-4886-a839-272b662561c5-memcached-tls-certs\") pod \"b15097a8-ac9a-4886-a839-272b662561c5\" (UID: \"b15097a8-ac9a-4886-a839-272b662561c5\") " Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.026377 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ca98cba7-4127-4d25-a139-1a42224331f2-rabbitmq-erlang-cookie\") pod \"ca98cba7-4127-4d25-a139-1a42224331f2\" (UID: \"ca98cba7-4127-4d25-a139-1a42224331f2\") " Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.026423 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/836bba81-425e-4610-b191-2bbb2cfc1f79-scripts\") pod \"836bba81-425e-4610-b191-2bbb2cfc1f79\" (UID: \"836bba81-425e-4610-b191-2bbb2cfc1f79\") " Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.026451 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b15097a8-ac9a-4886-a839-272b662561c5-kolla-config\") pod \"b15097a8-ac9a-4886-a839-272b662561c5\" (UID: \"b15097a8-ac9a-4886-a839-272b662561c5\") " Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.026518 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/ca98cba7-4127-4d25-a139-1a42224331f2-pod-info\") pod \"ca98cba7-4127-4d25-a139-1a42224331f2\" (UID: \"ca98cba7-4127-4d25-a139-1a42224331f2\") " Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.026554 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ca98cba7-4127-4d25-a139-1a42224331f2\" (UID: \"ca98cba7-4127-4d25-a139-1a42224331f2\") " Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.026583 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0d1cb194-5325-40c2-bbd4-0a48821e12aa-plugins-conf\") pod \"0d1cb194-5325-40c2-bbd4-0a48821e12aa\" (UID: \"0d1cb194-5325-40c2-bbd4-0a48821e12aa\") " Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.026631 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ca98cba7-4127-4d25-a139-1a42224331f2-rabbitmq-plugins\") pod \"ca98cba7-4127-4d25-a139-1a42224331f2\" (UID: \"ca98cba7-4127-4d25-a139-1a42224331f2\") " Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.026655 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s2fm8\" (UniqueName: \"kubernetes.io/projected/d42411e0-2228-4a1a-9d31-e3788f2b1f0c-kube-api-access-s2fm8\") pod \"d42411e0-2228-4a1a-9d31-e3788f2b1f0c\" (UID: \"d42411e0-2228-4a1a-9d31-e3788f2b1f0c\") " Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.026785 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cp9jt\" (UniqueName: \"kubernetes.io/projected/d08915b6-6f79-40e4-8c26-d9f82606b4cc-kube-api-access-cp9jt\") pod \"d08915b6-6f79-40e4-8c26-d9f82606b4cc\" (UID: \"d08915b6-6f79-40e4-8c26-d9f82606b4cc\") " Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 
07:48:56.026811 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b15097a8-ac9a-4886-a839-272b662561c5-combined-ca-bundle\") pod \"b15097a8-ac9a-4886-a839-272b662561c5\" (UID: \"b15097a8-ac9a-4886-a839-272b662561c5\") " Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.026844 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7067a12f-0245-45f5-a806-591d5999c7f0-operator-scripts\") pod \"7067a12f-0245-45f5-a806-591d5999c7f0\" (UID: \"7067a12f-0245-45f5-a806-591d5999c7f0\") " Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.026876 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b15097a8-ac9a-4886-a839-272b662561c5-config-data\") pod \"b15097a8-ac9a-4886-a839-272b662561c5\" (UID: \"b15097a8-ac9a-4886-a839-272b662561c5\") " Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.026904 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ca98cba7-4127-4d25-a139-1a42224331f2-plugins-conf\") pod \"ca98cba7-4127-4d25-a139-1a42224331f2\" (UID: \"ca98cba7-4127-4d25-a139-1a42224331f2\") " Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.026944 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gvg65\" (UniqueName: \"kubernetes.io/projected/7067a12f-0245-45f5-a806-591d5999c7f0-kube-api-access-gvg65\") pod \"7067a12f-0245-45f5-a806-591d5999c7f0\" (UID: \"7067a12f-0245-45f5-a806-591d5999c7f0\") " Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.026979 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0d1cb194-5325-40c2-bbd4-0a48821e12aa-config-data\") pod 
\"0d1cb194-5325-40c2-bbd4-0a48821e12aa\" (UID: \"0d1cb194-5325-40c2-bbd4-0a48821e12aa\") " Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.027021 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/836bba81-425e-4610-b191-2bbb2cfc1f79-etc-machine-id\") pod \"836bba81-425e-4610-b191-2bbb2cfc1f79\" (UID: \"836bba81-425e-4610-b191-2bbb2cfc1f79\") " Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.027089 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sjzj5\" (UniqueName: \"kubernetes.io/projected/f28e5fd3-456b-4960-a3a9-1134e3eecb1f-kube-api-access-sjzj5\") pod \"f28e5fd3-456b-4960-a3a9-1134e3eecb1f\" (UID: \"f28e5fd3-456b-4960-a3a9-1134e3eecb1f\") " Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.027152 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0d1cb194-5325-40c2-bbd4-0a48821e12aa-rabbitmq-plugins\") pod \"0d1cb194-5325-40c2-bbd4-0a48821e12aa\" (UID: \"0d1cb194-5325-40c2-bbd4-0a48821e12aa\") " Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.027226 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0d1cb194-5325-40c2-bbd4-0a48821e12aa-rabbitmq-tls\") pod \"0d1cb194-5325-40c2-bbd4-0a48821e12aa\" (UID: \"0d1cb194-5325-40c2-bbd4-0a48821e12aa\") " Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.027282 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/836bba81-425e-4610-b191-2bbb2cfc1f79-config-data-custom\") pod \"836bba81-425e-4610-b191-2bbb2cfc1f79\" (UID: \"836bba81-425e-4610-b191-2bbb2cfc1f79\") " Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.027316 4895 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0d1cb194-5325-40c2-bbd4-0a48821e12aa-rabbitmq-erlang-cookie\") pod \"0d1cb194-5325-40c2-bbd4-0a48821e12aa\" (UID: \"0d1cb194-5325-40c2-bbd4-0a48821e12aa\") " Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.027344 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zcmp2\" (UniqueName: \"kubernetes.io/projected/b15097a8-ac9a-4886-a839-272b662561c5-kube-api-access-zcmp2\") pod \"b15097a8-ac9a-4886-a839-272b662561c5\" (UID: \"b15097a8-ac9a-4886-a839-272b662561c5\") " Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.027376 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f28e5fd3-456b-4960-a3a9-1134e3eecb1f-operator-scripts\") pod \"f28e5fd3-456b-4960-a3a9-1134e3eecb1f\" (UID: \"f28e5fd3-456b-4960-a3a9-1134e3eecb1f\") " Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.027431 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d08915b6-6f79-40e4-8c26-d9f82606b4cc-config-data\") pod \"d08915b6-6f79-40e4-8c26-d9f82606b4cc\" (UID: \"d08915b6-6f79-40e4-8c26-d9f82606b4cc\") " Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.027463 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sxxmx\" (UniqueName: \"kubernetes.io/projected/5cae5c9e-9159-4e78-9809-1801d0e35131-kube-api-access-sxxmx\") pod \"5cae5c9e-9159-4e78-9809-1801d0e35131\" (UID: \"5cae5c9e-9159-4e78-9809-1801d0e35131\") " Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.027501 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0d1cb194-5325-40c2-bbd4-0a48821e12aa-server-conf\") pod 
\"0d1cb194-5325-40c2-bbd4-0a48821e12aa\" (UID: \"0d1cb194-5325-40c2-bbd4-0a48821e12aa\") " Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.027532 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a3ec758-e19e-4286-bfed-a1d6d3010bfb-kube-state-metrics-tls-certs\") pod \"0a3ec758-e19e-4286-bfed-a1d6d3010bfb\" (UID: \"0a3ec758-e19e-4286-bfed-a1d6d3010bfb\") " Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.027596 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a3ec758-e19e-4286-bfed-a1d6d3010bfb-combined-ca-bundle\") pod \"0a3ec758-e19e-4286-bfed-a1d6d3010bfb\" (UID: \"0a3ec758-e19e-4286-bfed-a1d6d3010bfb\") " Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.027623 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/0a3ec758-e19e-4286-bfed-a1d6d3010bfb-kube-state-metrics-tls-config\") pod \"0a3ec758-e19e-4286-bfed-a1d6d3010bfb\" (UID: \"0a3ec758-e19e-4286-bfed-a1d6d3010bfb\") " Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.027641 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/836bba81-425e-4610-b191-2bbb2cfc1f79-config-data\") pod \"836bba81-425e-4610-b191-2bbb2cfc1f79\" (UID: \"836bba81-425e-4610-b191-2bbb2cfc1f79\") " Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.027662 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ca98cba7-4127-4d25-a139-1a42224331f2-config-data\") pod \"ca98cba7-4127-4d25-a139-1a42224331f2\" (UID: \"ca98cba7-4127-4d25-a139-1a42224331f2\") " Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.027681 4895 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-m9r2n\" (UniqueName: \"kubernetes.io/projected/ca98cba7-4127-4d25-a139-1a42224331f2-kube-api-access-m9r2n\") pod \"ca98cba7-4127-4d25-a139-1a42224331f2\" (UID: \"ca98cba7-4127-4d25-a139-1a42224331f2\") " Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.029939 4895 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab5ec753-410a-4d4b-8071-ce60970ba4df-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.030375 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b15097a8-ac9a-4886-a839-272b662561c5-config-data" (OuterVolumeSpecName: "config-data") pod "b15097a8-ac9a-4886-a839-272b662561c5" (UID: "b15097a8-ac9a-4886-a839-272b662561c5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.031421 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca98cba7-4127-4d25-a139-1a42224331f2-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "ca98cba7-4127-4d25-a139-1a42224331f2" (UID: "ca98cba7-4127-4d25-a139-1a42224331f2"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.033016 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b15097a8-ac9a-4886-a839-272b662561c5-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "b15097a8-ac9a-4886-a839-272b662561c5" (UID: "b15097a8-ac9a-4886-a839-272b662561c5"). InnerVolumeSpecName "kolla-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.034027 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/836bba81-425e-4610-b191-2bbb2cfc1f79-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "836bba81-425e-4610-b191-2bbb2cfc1f79" (UID: "836bba81-425e-4610-b191-2bbb2cfc1f79"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.034902 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5cae5c9e-9159-4e78-9809-1801d0e35131-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5cae5c9e-9159-4e78-9809-1801d0e35131" (UID: "5cae5c9e-9159-4e78-9809-1801d0e35131"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.036803 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0d1cb194-5325-40c2-bbd4-0a48821e12aa-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "0d1cb194-5325-40c2-bbd4-0a48821e12aa" (UID: "0d1cb194-5325-40c2-bbd4-0a48821e12aa"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.039830 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0d1cb194-5325-40c2-bbd4-0a48821e12aa-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "0d1cb194-5325-40c2-bbd4-0a48821e12aa" (UID: "0d1cb194-5325-40c2-bbd4-0a48821e12aa"). InnerVolumeSpecName "rabbitmq-erlang-cookie". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.042222 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d1cb194-5325-40c2-bbd4-0a48821e12aa-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "0d1cb194-5325-40c2-bbd4-0a48821e12aa" (UID: "0d1cb194-5325-40c2-bbd4-0a48821e12aa"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.044702 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f28e5fd3-456b-4960-a3a9-1134e3eecb1f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f28e5fd3-456b-4960-a3a9-1134e3eecb1f" (UID: "f28e5fd3-456b-4960-a3a9-1134e3eecb1f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.046610 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca98cba7-4127-4d25-a139-1a42224331f2-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "ca98cba7-4127-4d25-a139-1a42224331f2" (UID: "ca98cba7-4127-4d25-a139-1a42224331f2"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.057443 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d08915b6-6f79-40e4-8c26-d9f82606b4cc-kube-api-access-cp9jt" (OuterVolumeSpecName: "kube-api-access-cp9jt") pod "d08915b6-6f79-40e4-8c26-d9f82606b4cc" (UID: "d08915b6-6f79-40e4-8c26-d9f82606b4cc"). InnerVolumeSpecName "kube-api-access-cp9jt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.058634 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab5ec753-410a-4d4b-8071-ce60970ba4df-config" (OuterVolumeSpecName: "config") pod "ab5ec753-410a-4d4b-8071-ce60970ba4df" (UID: "ab5ec753-410a-4d4b-8071-ce60970ba4df"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.058863 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/836bba81-425e-4610-b191-2bbb2cfc1f79-scripts" (OuterVolumeSpecName: "scripts") pod "836bba81-425e-4610-b191-2bbb2cfc1f79" (UID: "836bba81-425e-4610-b191-2bbb2cfc1f79"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.059880 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca98cba7-4127-4d25-a139-1a42224331f2-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "ca98cba7-4127-4d25-a139-1a42224331f2" (UID: "ca98cba7-4127-4d25-a139-1a42224331f2"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.059925 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca98cba7-4127-4d25-a139-1a42224331f2-kube-api-access-m9r2n" (OuterVolumeSpecName: "kube-api-access-m9r2n") pod "ca98cba7-4127-4d25-a139-1a42224331f2" (UID: "ca98cba7-4127-4d25-a139-1a42224331f2"). InnerVolumeSpecName "kube-api-access-m9r2n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.060662 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca98cba7-4127-4d25-a139-1a42224331f2-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "ca98cba7-4127-4d25-a139-1a42224331f2" (UID: "ca98cba7-4127-4d25-a139-1a42224331f2"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.060683 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d42411e0-2228-4a1a-9d31-e3788f2b1f0c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d42411e0-2228-4a1a-9d31-e3788f2b1f0c" (UID: "d42411e0-2228-4a1a-9d31-e3788f2b1f0c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.060845 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7067a12f-0245-45f5-a806-591d5999c7f0-kube-api-access-gvg65" (OuterVolumeSpecName: "kube-api-access-gvg65") pod "7067a12f-0245-45f5-a806-591d5999c7f0" (UID: "7067a12f-0245-45f5-a806-591d5999c7f0"). InnerVolumeSpecName "kube-api-access-gvg65". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.065400 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7067a12f-0245-45f5-a806-591d5999c7f0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7067a12f-0245-45f5-a806-591d5999c7f0" (UID: "7067a12f-0245-45f5-a806-591d5999c7f0"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.068787 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-sfwtc"] Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.071913 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab5ec753-410a-4d4b-8071-ce60970ba4df-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "ab5ec753-410a-4d4b-8071-ce60970ba4df" (UID: "ab5ec753-410a-4d4b-8071-ce60970ba4df"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.073091 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/836bba81-425e-4610-b191-2bbb2cfc1f79-kube-api-access-xp5fb" (OuterVolumeSpecName: "kube-api-access-xp5fb") pod "836bba81-425e-4610-b191-2bbb2cfc1f79" (UID: "836bba81-425e-4610-b191-2bbb2cfc1f79"). InnerVolumeSpecName "kube-api-access-xp5fb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.080786 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca98cba7-4127-4d25-a139-1a42224331f2-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "ca98cba7-4127-4d25-a139-1a42224331f2" (UID: "ca98cba7-4127-4d25-a139-1a42224331f2"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.081982 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68bddf66-0b9f-4bc8-916b-aa0abfbf13c3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "68bddf66-0b9f-4bc8-916b-aa0abfbf13c3" (UID: "68bddf66-0b9f-4bc8-916b-aa0abfbf13c3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.082638 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/ca98cba7-4127-4d25-a139-1a42224331f2-pod-info" (OuterVolumeSpecName: "pod-info") pod "ca98cba7-4127-4d25-a139-1a42224331f2" (UID: "ca98cba7-4127-4d25-a139-1a42224331f2"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.083947 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/0d1cb194-5325-40c2-bbd4-0a48821e12aa-pod-info" (OuterVolumeSpecName: "pod-info") pod "0d1cb194-5325-40c2-bbd4-0a48821e12aa" (UID: "0d1cb194-5325-40c2-bbd4-0a48821e12aa"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.089779 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a3ec758-e19e-4286-bfed-a1d6d3010bfb-kube-api-access-bnnfp" (OuterVolumeSpecName: "kube-api-access-bnnfp") pod "0a3ec758-e19e-4286-bfed-a1d6d3010bfb" (UID: "0a3ec758-e19e-4286-bfed-a1d6d3010bfb"). InnerVolumeSpecName "kube-api-access-bnnfp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.093942 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "persistence") pod "0d1cb194-5325-40c2-bbd4-0a48821e12aa" (UID: "0d1cb194-5325-40c2-bbd4-0a48821e12aa"). InnerVolumeSpecName "local-storage02-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.098052 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d1cb194-5325-40c2-bbd4-0a48821e12aa-kube-api-access-mjhg9" (OuterVolumeSpecName: "kube-api-access-mjhg9") pod "0d1cb194-5325-40c2-bbd4-0a48821e12aa" (UID: "0d1cb194-5325-40c2-bbd4-0a48821e12aa"). InnerVolumeSpecName "kube-api-access-mjhg9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.102231 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d1cb194-5325-40c2-bbd4-0a48821e12aa-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "0d1cb194-5325-40c2-bbd4-0a48821e12aa" (UID: "0d1cb194-5325-40c2-bbd4-0a48821e12aa"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.102306 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d1cb194-5325-40c2-bbd4-0a48821e12aa-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "0d1cb194-5325-40c2-bbd4-0a48821e12aa" (UID: "0d1cb194-5325-40c2-bbd4-0a48821e12aa"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.102381 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f28e5fd3-456b-4960-a3a9-1134e3eecb1f-kube-api-access-sjzj5" (OuterVolumeSpecName: "kube-api-access-sjzj5") pod "f28e5fd3-456b-4960-a3a9-1134e3eecb1f" (UID: "f28e5fd3-456b-4960-a3a9-1134e3eecb1f"). InnerVolumeSpecName "kube-api-access-sjzj5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.102443 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5cae5c9e-9159-4e78-9809-1801d0e35131-kube-api-access-sxxmx" (OuterVolumeSpecName: "kube-api-access-sxxmx") pod "5cae5c9e-9159-4e78-9809-1801d0e35131" (UID: "5cae5c9e-9159-4e78-9809-1801d0e35131"). InnerVolumeSpecName "kube-api-access-sxxmx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.103144 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d42411e0-2228-4a1a-9d31-e3788f2b1f0c-kube-api-access-s2fm8" (OuterVolumeSpecName: "kube-api-access-s2fm8") pod "d42411e0-2228-4a1a-9d31-e3788f2b1f0c" (UID: "d42411e0-2228-4a1a-9d31-e3788f2b1f0c"). InnerVolumeSpecName "kube-api-access-s2fm8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.106552 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b15097a8-ac9a-4886-a839-272b662561c5-kube-api-access-zcmp2" (OuterVolumeSpecName: "kube-api-access-zcmp2") pod "b15097a8-ac9a-4886-a839-272b662561c5" (UID: "b15097a8-ac9a-4886-a839-272b662561c5"). InnerVolumeSpecName "kube-api-access-zcmp2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.106668 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "persistence") pod "ca98cba7-4127-4d25-a139-1a42224331f2" (UID: "ca98cba7-4127-4d25-a139-1a42224331f2"). InnerVolumeSpecName "local-storage01-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.132940 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xp5fb\" (UniqueName: \"kubernetes.io/projected/836bba81-425e-4610-b191-2bbb2cfc1f79-kube-api-access-xp5fb\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.132974 4895 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5cae5c9e-9159-4e78-9809-1801d0e35131-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.132985 4895 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0d1cb194-5325-40c2-bbd4-0a48821e12aa-pod-info\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.133008 4895 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.133019 4895 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ca98cba7-4127-4d25-a139-1a42224331f2-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.133033 4895 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0d1cb194-5325-40c2-bbd4-0a48821e12aa-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.133047 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mjhg9\" (UniqueName: \"kubernetes.io/projected/0d1cb194-5325-40c2-bbd4-0a48821e12aa-kube-api-access-mjhg9\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:56 crc 
kubenswrapper[4895]: I1202 07:48:56.133058 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bnnfp\" (UniqueName: \"kubernetes.io/projected/0a3ec758-e19e-4286-bfed-a1d6d3010bfb-kube-api-access-bnnfp\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.133067 4895 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ca98cba7-4127-4d25-a139-1a42224331f2-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.133076 4895 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d42411e0-2228-4a1a-9d31-e3788f2b1f0c-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.133088 4895 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ca98cba7-4127-4d25-a139-1a42224331f2-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.133099 4895 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/836bba81-425e-4610-b191-2bbb2cfc1f79-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.133110 4895 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b15097a8-ac9a-4886-a839-272b662561c5-kolla-config\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.133120 4895 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ca98cba7-4127-4d25-a139-1a42224331f2-pod-info\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.133136 4895 reconciler_common.go:286] "operationExecutor.UnmountDevice started for 
volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.133148 4895 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0d1cb194-5325-40c2-bbd4-0a48821e12aa-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.133157 4895 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ca98cba7-4127-4d25-a139-1a42224331f2-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.133166 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s2fm8\" (UniqueName: \"kubernetes.io/projected/d42411e0-2228-4a1a-9d31-e3788f2b1f0c-kube-api-access-s2fm8\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.133174 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cp9jt\" (UniqueName: \"kubernetes.io/projected/d08915b6-6f79-40e4-8c26-d9f82606b4cc-kube-api-access-cp9jt\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.133183 4895 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7067a12f-0245-45f5-a806-591d5999c7f0-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.133193 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b15097a8-ac9a-4886-a839-272b662561c5-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.133202 4895 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ca98cba7-4127-4d25-a139-1a42224331f2-plugins-conf\") on node \"crc\" 
DevicePath \"\"" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.133210 4895 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab5ec753-410a-4d4b-8071-ce60970ba4df-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.133219 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gvg65\" (UniqueName: \"kubernetes.io/projected/7067a12f-0245-45f5-a806-591d5999c7f0-kube-api-access-gvg65\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.133228 4895 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/836bba81-425e-4610-b191-2bbb2cfc1f79-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.133240 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sjzj5\" (UniqueName: \"kubernetes.io/projected/f28e5fd3-456b-4960-a3a9-1134e3eecb1f-kube-api-access-sjzj5\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.133248 4895 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0d1cb194-5325-40c2-bbd4-0a48821e12aa-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.133257 4895 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0d1cb194-5325-40c2-bbd4-0a48821e12aa-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.133266 4895 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0d1cb194-5325-40c2-bbd4-0a48821e12aa-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 
07:48:56.133276 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zcmp2\" (UniqueName: \"kubernetes.io/projected/b15097a8-ac9a-4886-a839-272b662561c5-kube-api-access-zcmp2\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.133289 4895 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f28e5fd3-456b-4960-a3a9-1134e3eecb1f-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.133299 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68bddf66-0b9f-4bc8-916b-aa0abfbf13c3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.133307 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sxxmx\" (UniqueName: \"kubernetes.io/projected/5cae5c9e-9159-4e78-9809-1801d0e35131-kube-api-access-sxxmx\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.133316 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m9r2n\" (UniqueName: \"kubernetes.io/projected/ca98cba7-4127-4d25-a139-1a42224331f2-kube-api-access-m9r2n\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.133326 4895 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/ab5ec753-410a-4d4b-8071-ce60970ba4df-config\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:56 crc kubenswrapper[4895]: E1202 07:48:56.134797 4895 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 02 07:48:56 crc kubenswrapper[4895]: E1202 07:48:56.134877 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5696e7d9-103a-4bf7-9b05-1959e92cf46a-operator-scripts 
podName:5696e7d9-103a-4bf7-9b05-1959e92cf46a nodeName:}" failed. No retries permitted until 2025-12-02 07:48:58.134849971 +0000 UTC m=+1549.305709584 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/5696e7d9-103a-4bf7-9b05-1959e92cf46a-operator-scripts") pod "novacell0b7d1-account-delete-wchwk" (UID: "5696e7d9-103a-4bf7-9b05-1959e92cf46a") : configmap "openstack-scripts" not found Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.160177 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/836bba81-425e-4610-b191-2bbb2cfc1f79-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "836bba81-425e-4610-b191-2bbb2cfc1f79" (UID: "836bba81-425e-4610-b191-2bbb2cfc1f79"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.200603 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican4aa4-account-delete-pvmbl"] Dec 02 07:48:56 crc kubenswrapper[4895]: E1202 07:48:56.225131 4895 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9c18aeb9311a3ffa5790c2f236b884d856db73bab542194f9a4509de984dba58" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.225147 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68bddf66-0b9f-4bc8-916b-aa0abfbf13c3-config-data" (OuterVolumeSpecName: "config-data") pod "68bddf66-0b9f-4bc8-916b-aa0abfbf13c3" (UID: "68bddf66-0b9f-4bc8-916b-aa0abfbf13c3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.239556 4895 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/836bba81-425e-4610-b191-2bbb2cfc1f79-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.239589 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68bddf66-0b9f-4bc8-916b-aa0abfbf13c3-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.254259 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-4aa4-account-create-update-t4vvh"] Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.257196 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68bddf66-0b9f-4bc8-916b-aa0abfbf13c3-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "68bddf66-0b9f-4bc8-916b-aa0abfbf13c3" (UID: "68bddf66-0b9f-4bc8-916b-aa0abfbf13c3"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:48:56 crc kubenswrapper[4895]: E1202 07:48:56.269510 4895 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9c18aeb9311a3ffa5790c2f236b884d856db73bab542194f9a4509de984dba58" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Dec 02 07:48:56 crc kubenswrapper[4895]: E1202 07:48:56.278043 4895 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9c18aeb9311a3ffa5790c2f236b884d856db73bab542194f9a4509de984dba58" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Dec 02 07:48:56 crc kubenswrapper[4895]: E1202 07:48:56.278127 4895 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-northd-0" podUID="446b5a26-8e57-4765-bb7d-275cf05996dd" containerName="ovn-northd" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.292074 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-4aa4-account-create-update-t4vvh"] Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.344024 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.344584 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d2c6f66-9fe8-4d15-92f6-f29493e6cd7b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2d2c6f66-9fe8-4d15-92f6-f29493e6cd7b" (UID: "2d2c6f66-9fe8-4d15-92f6-f29493e6cd7b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.350261 4895 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/68bddf66-0b9f-4bc8-916b-aa0abfbf13c3-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.350305 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d2c6f66-9fe8-4d15-92f6-f29493e6cd7b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:56 crc kubenswrapper[4895]: E1202 07:48:56.350387 4895 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 02 07:48:56 crc kubenswrapper[4895]: E1202 07:48:56.350478 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/96831697-ba2e-477e-954f-e4ad0cf30f92-operator-scripts podName:96831697-ba2e-477e-954f-e4ad0cf30f92 nodeName:}" failed. No retries permitted until 2025-12-02 07:48:58.350455069 +0000 UTC m=+1549.521314682 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/96831697-ba2e-477e-954f-e4ad0cf30f92-operator-scripts") pod "novaapi23cb-account-delete-g8msv" (UID: "96831697-ba2e-477e-954f-e4ad0cf30f92") : configmap "openstack-scripts" not found Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.352524 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d08915b6-6f79-40e4-8c26-d9f82606b4cc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d08915b6-6f79-40e4-8c26-d9f82606b4cc" (UID: "d08915b6-6f79-40e4-8c26-d9f82606b4cc"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.358517 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/290c1303-bf41-4474-86ff-c9f5aa105cc3-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "290c1303-bf41-4474-86ff-c9f5aa105cc3" (UID: "290c1303-bf41-4474-86ff-c9f5aa105cc3"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.365487 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab5ec753-410a-4d4b-8071-ce60970ba4df-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ab5ec753-410a-4d4b-8071-ce60970ba4df" (UID: "ab5ec753-410a-4d4b-8071-ce60970ba4df"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.366446 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.374849 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.383901 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.389734 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-788d454954-brr26"] Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.409871 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-788d454954-brr26"] Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.454617 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab5ec753-410a-4d4b-8071-ce60970ba4df-combined-ca-bundle\") on node \"crc\" 
DevicePath \"\"" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.455132 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d08915b6-6f79-40e4-8c26-d9f82606b4cc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.455191 4895 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/290c1303-bf41-4474-86ff-c9f5aa105cc3-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.465095 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-tlgbq"] Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.473813 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-tlgbq"] Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.479904 4895 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.499211 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-23cb-account-create-update-7svkf"] Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.504124 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f762a68c-cabc-4842-844a-1db6710e3ee9-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "f762a68c-cabc-4842-844a-1db6710e3ee9" (UID: "f762a68c-cabc-4842-844a-1db6710e3ee9"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.509938 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novaapi23cb-account-delete-g8msv"] Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.515858 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d2c6f66-9fe8-4d15-92f6-f29493e6cd7b-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "2d2c6f66-9fe8-4d15-92f6-f29493e6cd7b" (UID: "2d2c6f66-9fe8-4d15-92f6-f29493e6cd7b"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.518213 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-23cb-account-create-update-7svkf"] Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.519442 4895 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/keystone-56dbdc9bc-kgkw2" podUID="247c892c-e00a-474e-8022-73bd1b2249f3" containerName="keystone-api" probeResult="failure" output="Get \"https://10.217.0.152:5000/v3\": read tcp 10.217.0.2:46642->10.217.0.152:5000: read: connection reset by peer" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.556947 4895 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.556999 4895 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d2c6f66-9fe8-4d15-92f6-f29493e6cd7b-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.557020 4895 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/f762a68c-cabc-4842-844a-1db6710e3ee9-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.560821 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-zl6qf"] Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.572005 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-zl6qf"] Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.580774 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d08915b6-6f79-40e4-8c26-d9f82606b4cc-config-data" (OuterVolumeSpecName: "config-data") pod "d08915b6-6f79-40e4-8c26-d9f82606b4cc" (UID: "d08915b6-6f79-40e4-8c26-d9f82606b4cc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.583065 4895 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.585106 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novacell0b7d1-account-delete-wchwk"] Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.596682 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a3ec758-e19e-4286-bfed-a1d6d3010bfb-kube-state-metrics-tls-config" (OuterVolumeSpecName: "kube-state-metrics-tls-config") pod "0a3ec758-e19e-4286-bfed-a1d6d3010bfb" (UID: "0a3ec758-e19e-4286-bfed-a1d6d3010bfb"). InnerVolumeSpecName "kube-state-metrics-tls-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.599150 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-b7d1-account-create-update-sqm7t"] Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.611900 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a3ec758-e19e-4286-bfed-a1d6d3010bfb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0a3ec758-e19e-4286-bfed-a1d6d3010bfb" (UID: "0a3ec758-e19e-4286-bfed-a1d6d3010bfb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.643102 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-b7d1-account-create-update-sqm7t"] Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.659200 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d1cb194-5325-40c2-bbd4-0a48821e12aa-server-conf" (OuterVolumeSpecName: "server-conf") pod "0d1cb194-5325-40c2-bbd4-0a48821e12aa" (UID: "0d1cb194-5325-40c2-bbd4-0a48821e12aa"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.660166 4895 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.660194 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d08915b6-6f79-40e4-8c26-d9f82606b4cc-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.660209 4895 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0d1cb194-5325-40c2-bbd4-0a48821e12aa-server-conf\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.660219 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a3ec758-e19e-4286-bfed-a1d6d3010bfb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.660227 4895 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/0a3ec758-e19e-4286-bfed-a1d6d3010bfb-kube-state-metrics-tls-config\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.699118 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b15097a8-ac9a-4886-a839-272b662561c5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b15097a8-ac9a-4886-a839-272b662561c5" (UID: "b15097a8-ac9a-4886-a839-272b662561c5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.700164 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca98cba7-4127-4d25-a139-1a42224331f2-server-conf" (OuterVolumeSpecName: "server-conf") pod "ca98cba7-4127-4d25-a139-1a42224331f2" (UID: "ca98cba7-4127-4d25-a139-1a42224331f2"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.709570 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab5ec753-410a-4d4b-8071-ce60970ba4df-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "ab5ec753-410a-4d4b-8071-ce60970ba4df" (UID: "ab5ec753-410a-4d4b-8071-ce60970ba4df"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.717129 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-6f6974886f-mmsbz"] Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.727615 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-proxy-6f6974886f-mmsbz"] Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.765673 4895 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab5ec753-410a-4d4b-8071-ce60970ba4df-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.765708 4895 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ca98cba7-4127-4d25-a139-1a42224331f2-server-conf\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.765722 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b15097a8-ac9a-4886-a839-272b662561c5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.770693 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca98cba7-4127-4d25-a139-1a42224331f2-config-data" (OuterVolumeSpecName: "config-data") pod "ca98cba7-4127-4d25-a139-1a42224331f2" (UID: "ca98cba7-4127-4d25-a139-1a42224331f2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.771282 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/836bba81-425e-4610-b191-2bbb2cfc1f79-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "836bba81-425e-4610-b191-2bbb2cfc1f79" (UID: "836bba81-425e-4610-b191-2bbb2cfc1f79"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.774515 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca98cba7-4127-4d25-a139-1a42224331f2-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "ca98cba7-4127-4d25-a139-1a42224331f2" (UID: "ca98cba7-4127-4d25-a139-1a42224331f2"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.818643 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b15097a8-ac9a-4886-a839-272b662561c5-memcached-tls-certs" (OuterVolumeSpecName: "memcached-tls-certs") pod "b15097a8-ac9a-4886-a839-272b662561c5" (UID: "b15097a8-ac9a-4886-a839-272b662561c5"). InnerVolumeSpecName "memcached-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.836454 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d1cb194-5325-40c2-bbd4-0a48821e12aa-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "0d1cb194-5325-40c2-bbd4-0a48821e12aa" (UID: "0d1cb194-5325-40c2-bbd4-0a48821e12aa"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.860732 4895 generic.go:334] "Generic (PLEG): container finished" podID="31223325-1372-4ea6-867e-f511b7dffc09" containerID="45d9908b6c5cd875b205c3155ba480192c4dc6d4df37c9c88146d86fdf68c7e6" exitCode=0 Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.860950 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"31223325-1372-4ea6-867e-f511b7dffc09","Type":"ContainerDied","Data":"45d9908b6c5cd875b205c3155ba480192c4dc6d4df37c9c88146d86fdf68c7e6"} Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.863094 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68bddf66-0b9f-4bc8-916b-aa0abfbf13c3-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "68bddf66-0b9f-4bc8-916b-aa0abfbf13c3" (UID: "68bddf66-0b9f-4bc8-916b-aa0abfbf13c3"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.868257 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"ca98cba7-4127-4d25-a139-1a42224331f2","Type":"ContainerDied","Data":"15a095a70eb867e75188aa85a8bc8725e7974c8213e3fee9e754f1ad56e47533"} Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.868371 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.868438 4895 scope.go:117] "RemoveContainer" containerID="5d044ff799057808b8d67f79590923f9bd83b515bcd050be4b95fa7aeeb31f38" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.868522 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8zdjw\" (UniqueName: \"kubernetes.io/projected/ffbc11e8-7e53-46db-bcd7-35b4ab5d7fb9-kube-api-access-8zdjw\") pod \"keystonea3d7-account-delete-jgvsf\" (UID: \"ffbc11e8-7e53-46db-bcd7-35b4ab5d7fb9\") " pod="openstack/keystonea3d7-account-delete-jgvsf" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.868905 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ffbc11e8-7e53-46db-bcd7-35b4ab5d7fb9-operator-scripts\") pod \"keystonea3d7-account-delete-jgvsf\" (UID: \"ffbc11e8-7e53-46db-bcd7-35b4ab5d7fb9\") " pod="openstack/keystonea3d7-account-delete-jgvsf" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.872817 4895 reconciler_common.go:293] "Volume detached for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/b15097a8-ac9a-4886-a839-272b662561c5-memcached-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:56 crc kubenswrapper[4895]: E1202 07:48:56.873308 4895 projected.go:194] Error preparing data for projected volume kube-api-access-8zdjw for pod openstack/keystonea3d7-account-delete-jgvsf: failed to fetch token: serviceaccounts "galera-openstack" not found Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.873604 4895 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/68bddf66-0b9f-4bc8-916b-aa0abfbf13c3-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:56 crc kubenswrapper[4895]: E1202 07:48:56.873650 4895 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/ffbc11e8-7e53-46db-bcd7-35b4ab5d7fb9-kube-api-access-8zdjw podName:ffbc11e8-7e53-46db-bcd7-35b4ab5d7fb9 nodeName:}" failed. No retries permitted until 2025-12-02 07:49:00.873385784 +0000 UTC m=+1552.044245397 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-8zdjw" (UniqueName: "kubernetes.io/projected/ffbc11e8-7e53-46db-bcd7-35b4ab5d7fb9-kube-api-access-8zdjw") pod "keystonea3d7-account-delete-jgvsf" (UID: "ffbc11e8-7e53-46db-bcd7-35b4ab5d7fb9") : failed to fetch token: serviceaccounts "galera-openstack" not found Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.873669 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ca98cba7-4127-4d25-a139-1a42224331f2-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:56 crc kubenswrapper[4895]: E1202 07:48:56.873672 4895 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.873683 4895 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0d1cb194-5325-40c2-bbd4-0a48821e12aa-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.873702 4895 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ca98cba7-4127-4d25-a139-1a42224331f2-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:56 crc kubenswrapper[4895]: E1202 07:48:56.874260 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ffbc11e8-7e53-46db-bcd7-35b4ab5d7fb9-operator-scripts podName:ffbc11e8-7e53-46db-bcd7-35b4ab5d7fb9 nodeName:}" failed. No retries permitted until 2025-12-02 07:49:00.874247651 +0000 UTC m=+1552.045107264 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/ffbc11e8-7e53-46db-bcd7-35b4ab5d7fb9-operator-scripts") pod "keystonea3d7-account-delete-jgvsf" (UID: "ffbc11e8-7e53-46db-bcd7-35b4ab5d7fb9") : configmap "openstack-scripts" not found Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.874280 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/836bba81-425e-4610-b191-2bbb2cfc1f79-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.879008 4895 generic.go:334] "Generic (PLEG): container finished" podID="247c892c-e00a-474e-8022-73bd1b2249f3" containerID="fe38dca9f6627e9e19b2be20b54cb47cb1aee5e491dae454c261bcbe08243752" exitCode=0 Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.879095 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-56dbdc9bc-kgkw2" event={"ID":"247c892c-e00a-474e-8022-73bd1b2249f3","Type":"ContainerDied","Data":"fe38dca9f6627e9e19b2be20b54cb47cb1aee5e491dae454c261bcbe08243752"} Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.881611 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d1cb194-5325-40c2-bbd4-0a48821e12aa-config-data" (OuterVolumeSpecName: "config-data") pod "0d1cb194-5325-40c2-bbd4-0a48821e12aa" (UID: "0d1cb194-5325-40c2-bbd4-0a48821e12aa"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.891505 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ftfwq_84116ead-6214-4d5f-98a3-c89b08cf1306/ovn-controller/0.log" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.891574 4895 generic.go:334] "Generic (PLEG): container finished" podID="84116ead-6214-4d5f-98a3-c89b08cf1306" containerID="71bd075d30ee48222b192e19ea3e173bfae0e94488a7e8ecbc6fd0d9989b9830" exitCode=137 Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.891721 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ftfwq" event={"ID":"84116ead-6214-4d5f-98a3-c89b08cf1306","Type":"ContainerDied","Data":"71bd075d30ee48222b192e19ea3e173bfae0e94488a7e8ecbc6fd0d9989b9830"} Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.891779 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ftfwq" event={"ID":"84116ead-6214-4d5f-98a3-c89b08cf1306","Type":"ContainerDied","Data":"a9282f39595827e587c997c71e195d6fcced31850b4abd4a89e96be2134beb38"} Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.891794 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a9282f39595827e587c997c71e195d6fcced31850b4abd4a89e96be2134beb38" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.898013 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a3ec758-e19e-4286-bfed-a1d6d3010bfb-kube-state-metrics-tls-certs" (OuterVolumeSpecName: "kube-state-metrics-tls-certs") pod "0a3ec758-e19e-4286-bfed-a1d6d3010bfb" (UID: "0a3ec758-e19e-4286-bfed-a1d6d3010bfb"). InnerVolumeSpecName "kube-state-metrics-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.910209 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/836bba81-425e-4610-b191-2bbb2cfc1f79-config-data" (OuterVolumeSpecName: "config-data") pod "836bba81-425e-4610-b191-2bbb2cfc1f79" (UID: "836bba81-425e-4610-b191-2bbb2cfc1f79"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.915710 4895 generic.go:334] "Generic (PLEG): container finished" podID="f4b0ee49-bed2-4691-8160-2edbebda27b7" containerID="9b1129b02fb76389880616b0a4f07ba64c625c630960d277f41251bdc884c35b" exitCode=0 Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.915785 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f4b0ee49-bed2-4691-8160-2edbebda27b7","Type":"ContainerDied","Data":"9b1129b02fb76389880616b0a4f07ba64c625c630960d277f41251bdc884c35b"} Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.915871 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f4b0ee49-bed2-4691-8160-2edbebda27b7","Type":"ContainerDied","Data":"cd37e231d02f94c7bdd1c0857ed7634070474732b2fd6cf5880fe272e52a3cfc"} Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.915903 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cd37e231d02f94c7bdd1c0857ed7634070474732b2fd6cf5880fe272e52a3cfc" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.916548 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.923353 4895 generic.go:334] "Generic (PLEG): container finished" podID="38385316-fca8-41b0-b0ff-570a9cd71e8a" containerID="6aae636a2f05d49cb09841be65cf88064cfd592fbc7ebcfc8e0589c5f285e704" exitCode=0 Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.923566 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/novacell0b7d1-account-delete-wchwk" podUID="5696e7d9-103a-4bf7-9b05-1959e92cf46a" containerName="mariadb-account-delete" containerID="cri-o://0c2c388094b95cef4d9070468d30cc3bb7a5071f95547b2ee0b18119aa7ce3f9" gracePeriod=30 Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.923704 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder7d85-account-delete-j8sgc" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.923727 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystonea3d7-account-delete-jgvsf" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.923845 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"38385316-fca8-41b0-b0ff-570a9cd71e8a","Type":"ContainerDied","Data":"6aae636a2f05d49cb09841be65cf88064cfd592fbc7ebcfc8e0589c5f285e704"} Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.923924 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.924198 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron5a3b-account-delete-949mv" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.923907 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/novaapi23cb-account-delete-g8msv" podUID="96831697-ba2e-477e-954f-e4ad0cf30f92" containerName="mariadb-account-delete" containerID="cri-o://0ea2b37615e5717b134e70582d30afd9a8248506c11d128a320a1ec2c2f21f39" gracePeriod=30 Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.924380 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican4aa4-account-delete-pvmbl" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.924425 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.924474 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.924513 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.924563 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.924604 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placementa8bc-account-delete-jz5nc" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.955056 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ftfwq_84116ead-6214-4d5f-98a3-c89b08cf1306/ovn-controller/0.log" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.955144 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ftfwq" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.955721 4895 scope.go:117] "RemoveContainer" containerID="dae6ee95ef6df69cc075b37be3c7109ee4cab3f60c4bdf61b7793f530ffc9ab5" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.978664 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4b0ee49-bed2-4691-8160-2edbebda27b7-ceilometer-tls-certs\") pod \"f4b0ee49-bed2-4691-8160-2edbebda27b7\" (UID: \"f4b0ee49-bed2-4691-8160-2edbebda27b7\") " Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.978785 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f4b0ee49-bed2-4691-8160-2edbebda27b7-scripts\") pod \"f4b0ee49-bed2-4691-8160-2edbebda27b7\" (UID: \"f4b0ee49-bed2-4691-8160-2edbebda27b7\") " Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.978833 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mbsnz\" (UniqueName: \"kubernetes.io/projected/f4b0ee49-bed2-4691-8160-2edbebda27b7-kube-api-access-mbsnz\") pod \"f4b0ee49-bed2-4691-8160-2edbebda27b7\" (UID: \"f4b0ee49-bed2-4691-8160-2edbebda27b7\") " Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.978877 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4b0ee49-bed2-4691-8160-2edbebda27b7-config-data\") pod \"f4b0ee49-bed2-4691-8160-2edbebda27b7\" (UID: \"f4b0ee49-bed2-4691-8160-2edbebda27b7\") " Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.978900 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f4b0ee49-bed2-4691-8160-2edbebda27b7-log-httpd\") pod \"f4b0ee49-bed2-4691-8160-2edbebda27b7\" (UID: \"f4b0ee49-bed2-4691-8160-2edbebda27b7\") " 
Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.978937 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4b0ee49-bed2-4691-8160-2edbebda27b7-combined-ca-bundle\") pod \"f4b0ee49-bed2-4691-8160-2edbebda27b7\" (UID: \"f4b0ee49-bed2-4691-8160-2edbebda27b7\") " Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.978967 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f4b0ee49-bed2-4691-8160-2edbebda27b7-sg-core-conf-yaml\") pod \"f4b0ee49-bed2-4691-8160-2edbebda27b7\" (UID: \"f4b0ee49-bed2-4691-8160-2edbebda27b7\") " Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.979012 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f4b0ee49-bed2-4691-8160-2edbebda27b7-run-httpd\") pod \"f4b0ee49-bed2-4691-8160-2edbebda27b7\" (UID: \"f4b0ee49-bed2-4691-8160-2edbebda27b7\") " Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.980197 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0d1cb194-5325-40c2-bbd4-0a48821e12aa-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.980220 4895 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a3ec758-e19e-4286-bfed-a1d6d3010bfb-kube-state-metrics-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.980234 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/836bba81-425e-4610-b191-2bbb2cfc1f79-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.980638 4895 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/empty-dir/f4b0ee49-bed2-4691-8160-2edbebda27b7-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "f4b0ee49-bed2-4691-8160-2edbebda27b7" (UID: "f4b0ee49-bed2-4691-8160-2edbebda27b7"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.980883 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f4b0ee49-bed2-4691-8160-2edbebda27b7-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "f4b0ee49-bed2-4691-8160-2edbebda27b7" (UID: "f4b0ee49-bed2-4691-8160-2edbebda27b7"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.984711 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.987561 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4b0ee49-bed2-4691-8160-2edbebda27b7-kube-api-access-mbsnz" (OuterVolumeSpecName: "kube-api-access-mbsnz") pod "f4b0ee49-bed2-4691-8160-2edbebda27b7" (UID: "f4b0ee49-bed2-4691-8160-2edbebda27b7"). InnerVolumeSpecName "kube-api-access-mbsnz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.991206 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 07:48:56 crc kubenswrapper[4895]: I1202 07:48:56.992140 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4b0ee49-bed2-4691-8160-2edbebda27b7-scripts" (OuterVolumeSpecName: "scripts") pod "f4b0ee49-bed2-4691-8160-2edbebda27b7" (UID: "f4b0ee49-bed2-4691-8160-2edbebda27b7"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:48:57 crc kubenswrapper[4895]: I1202 07:48:57.037553 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 02 07:48:57 crc kubenswrapper[4895]: I1202 07:48:57.037643 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 02 07:48:57 crc kubenswrapper[4895]: I1202 07:48:57.062503 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4b0ee49-bed2-4691-8160-2edbebda27b7-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "f4b0ee49-bed2-4691-8160-2edbebda27b7" (UID: "f4b0ee49-bed2-4691-8160-2edbebda27b7"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:48:57 crc kubenswrapper[4895]: I1202 07:48:57.091587 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/84116ead-6214-4d5f-98a3-c89b08cf1306-var-run\") pod \"84116ead-6214-4d5f-98a3-c89b08cf1306\" (UID: \"84116ead-6214-4d5f-98a3-c89b08cf1306\") " Dec 02 07:48:57 crc kubenswrapper[4895]: I1202 07:48:57.091647 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c8fl7\" (UniqueName: \"kubernetes.io/projected/84116ead-6214-4d5f-98a3-c89b08cf1306-kube-api-access-c8fl7\") pod \"84116ead-6214-4d5f-98a3-c89b08cf1306\" (UID: \"84116ead-6214-4d5f-98a3-c89b08cf1306\") " Dec 02 07:48:57 crc kubenswrapper[4895]: I1202 07:48:57.091686 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/84116ead-6214-4d5f-98a3-c89b08cf1306-var-run-ovn\") pod \"84116ead-6214-4d5f-98a3-c89b08cf1306\" (UID: \"84116ead-6214-4d5f-98a3-c89b08cf1306\") " Dec 02 07:48:57 crc kubenswrapper[4895]: I1202 07:48:57.091704 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/84116ead-6214-4d5f-98a3-c89b08cf1306-var-log-ovn\") pod \"84116ead-6214-4d5f-98a3-c89b08cf1306\" (UID: \"84116ead-6214-4d5f-98a3-c89b08cf1306\") " Dec 02 07:48:57 crc kubenswrapper[4895]: I1202 07:48:57.091847 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/84116ead-6214-4d5f-98a3-c89b08cf1306-scripts\") pod \"84116ead-6214-4d5f-98a3-c89b08cf1306\" (UID: \"84116ead-6214-4d5f-98a3-c89b08cf1306\") " Dec 02 07:48:57 crc kubenswrapper[4895]: I1202 07:48:57.091944 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84116ead-6214-4d5f-98a3-c89b08cf1306-combined-ca-bundle\") pod \"84116ead-6214-4d5f-98a3-c89b08cf1306\" (UID: \"84116ead-6214-4d5f-98a3-c89b08cf1306\") " Dec 02 07:48:57 crc kubenswrapper[4895]: I1202 07:48:57.091980 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/84116ead-6214-4d5f-98a3-c89b08cf1306-ovn-controller-tls-certs\") pod \"84116ead-6214-4d5f-98a3-c89b08cf1306\" (UID: \"84116ead-6214-4d5f-98a3-c89b08cf1306\") " Dec 02 07:48:57 crc kubenswrapper[4895]: I1202 07:48:57.092345 4895 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f4b0ee49-bed2-4691-8160-2edbebda27b7-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:57 crc kubenswrapper[4895]: I1202 07:48:57.092357 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mbsnz\" (UniqueName: \"kubernetes.io/projected/f4b0ee49-bed2-4691-8160-2edbebda27b7-kube-api-access-mbsnz\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:57 crc kubenswrapper[4895]: I1202 07:48:57.092367 4895 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/f4b0ee49-bed2-4691-8160-2edbebda27b7-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:57 crc kubenswrapper[4895]: I1202 07:48:57.092375 4895 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f4b0ee49-bed2-4691-8160-2edbebda27b7-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:57 crc kubenswrapper[4895]: I1202 07:48:57.092385 4895 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4b0ee49-bed2-4691-8160-2edbebda27b7-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:57 crc kubenswrapper[4895]: I1202 07:48:57.098932 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/84116ead-6214-4d5f-98a3-c89b08cf1306-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "84116ead-6214-4d5f-98a3-c89b08cf1306" (UID: "84116ead-6214-4d5f-98a3-c89b08cf1306"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 07:48:57 crc kubenswrapper[4895]: I1202 07:48:57.099009 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/84116ead-6214-4d5f-98a3-c89b08cf1306-var-run" (OuterVolumeSpecName: "var-run") pod "84116ead-6214-4d5f-98a3-c89b08cf1306" (UID: "84116ead-6214-4d5f-98a3-c89b08cf1306"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 07:48:57 crc kubenswrapper[4895]: I1202 07:48:57.099334 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/84116ead-6214-4d5f-98a3-c89b08cf1306-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "84116ead-6214-4d5f-98a3-c89b08cf1306" (UID: "84116ead-6214-4d5f-98a3-c89b08cf1306"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 07:48:57 crc kubenswrapper[4895]: I1202 07:48:57.100618 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84116ead-6214-4d5f-98a3-c89b08cf1306-scripts" (OuterVolumeSpecName: "scripts") pod "84116ead-6214-4d5f-98a3-c89b08cf1306" (UID: "84116ead-6214-4d5f-98a3-c89b08cf1306"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:48:57 crc kubenswrapper[4895]: I1202 07:48:57.152085 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84116ead-6214-4d5f-98a3-c89b08cf1306-kube-api-access-c8fl7" (OuterVolumeSpecName: "kube-api-access-c8fl7") pod "84116ead-6214-4d5f-98a3-c89b08cf1306" (UID: "84116ead-6214-4d5f-98a3-c89b08cf1306"). InnerVolumeSpecName "kube-api-access-c8fl7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:48:57 crc kubenswrapper[4895]: I1202 07:48:57.200481 4895 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/84116ead-6214-4d5f-98a3-c89b08cf1306-var-run\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:57 crc kubenswrapper[4895]: I1202 07:48:57.200520 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c8fl7\" (UniqueName: \"kubernetes.io/projected/84116ead-6214-4d5f-98a3-c89b08cf1306-kube-api-access-c8fl7\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:57 crc kubenswrapper[4895]: I1202 07:48:57.200532 4895 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/84116ead-6214-4d5f-98a3-c89b08cf1306-var-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:57 crc kubenswrapper[4895]: I1202 07:48:57.200541 4895 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/84116ead-6214-4d5f-98a3-c89b08cf1306-var-log-ovn\") on node \"crc\" DevicePath \"\"" 
Dec 02 07:48:57 crc kubenswrapper[4895]: I1202 07:48:57.200552 4895 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/84116ead-6214-4d5f-98a3-c89b08cf1306-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:57 crc kubenswrapper[4895]: I1202 07:48:57.232829 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4b0ee49-bed2-4691-8160-2edbebda27b7-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "f4b0ee49-bed2-4691-8160-2edbebda27b7" (UID: "f4b0ee49-bed2-4691-8160-2edbebda27b7"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:48:57 crc kubenswrapper[4895]: I1202 07:48:57.238989 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84116ead-6214-4d5f-98a3-c89b08cf1306-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "84116ead-6214-4d5f-98a3-c89b08cf1306" (UID: "84116ead-6214-4d5f-98a3-c89b08cf1306"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:48:57 crc kubenswrapper[4895]: E1202 07:48:57.272930 4895 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6aae636a2f05d49cb09841be65cf88064cfd592fbc7ebcfc8e0589c5f285e704 is running failed: container process not found" containerID="6aae636a2f05d49cb09841be65cf88064cfd592fbc7ebcfc8e0589c5f285e704" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Dec 02 07:48:57 crc kubenswrapper[4895]: E1202 07:48:57.281679 4895 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6aae636a2f05d49cb09841be65cf88064cfd592fbc7ebcfc8e0589c5f285e704 is running failed: container process not found" containerID="6aae636a2f05d49cb09841be65cf88064cfd592fbc7ebcfc8e0589c5f285e704" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Dec 02 07:48:57 crc kubenswrapper[4895]: E1202 07:48:57.282368 4895 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6aae636a2f05d49cb09841be65cf88064cfd592fbc7ebcfc8e0589c5f285e704 is running failed: container process not found" containerID="6aae636a2f05d49cb09841be65cf88064cfd592fbc7ebcfc8e0589c5f285e704" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Dec 02 07:48:57 crc kubenswrapper[4895]: E1202 07:48:57.282457 4895 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6aae636a2f05d49cb09841be65cf88064cfd592fbc7ebcfc8e0589c5f285e704 is running failed: container process not found" probeType="Readiness" pod="openstack/openstack-galera-0" podUID="38385316-fca8-41b0-b0ff-570a9cd71e8a" containerName="galera" Dec 02 07:48:57 crc kubenswrapper[4895]: I1202 07:48:57.302642 4895 
reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f4b0ee49-bed2-4691-8160-2edbebda27b7-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:57 crc kubenswrapper[4895]: I1202 07:48:57.302681 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84116ead-6214-4d5f-98a3-c89b08cf1306-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:57 crc kubenswrapper[4895]: I1202 07:48:57.343620 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4b0ee49-bed2-4691-8160-2edbebda27b7-config-data" (OuterVolumeSpecName: "config-data") pod "f4b0ee49-bed2-4691-8160-2edbebda27b7" (UID: "f4b0ee49-bed2-4691-8160-2edbebda27b7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:48:57 crc kubenswrapper[4895]: I1202 07:48:57.368774 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="137d4f28-0e97-4154-90f1-22426094ef5e" path="/var/lib/kubelet/pods/137d4f28-0e97-4154-90f1-22426094ef5e/volumes" Dec 02 07:48:57 crc kubenswrapper[4895]: I1202 07:48:57.369180 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84116ead-6214-4d5f-98a3-c89b08cf1306-ovn-controller-tls-certs" (OuterVolumeSpecName: "ovn-controller-tls-certs") pod "84116ead-6214-4d5f-98a3-c89b08cf1306" (UID: "84116ead-6214-4d5f-98a3-c89b08cf1306"). InnerVolumeSpecName "ovn-controller-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:48:57 crc kubenswrapper[4895]: I1202 07:48:57.370779 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="183c5216-30f9-4f75-865b-7f795ea149fb" path="/var/lib/kubelet/pods/183c5216-30f9-4f75-865b-7f795ea149fb/volumes" Dec 02 07:48:57 crc kubenswrapper[4895]: I1202 07:48:57.371367 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d2c6f66-9fe8-4d15-92f6-f29493e6cd7b" path="/var/lib/kubelet/pods/2d2c6f66-9fe8-4d15-92f6-f29493e6cd7b/volumes" Dec 02 07:48:57 crc kubenswrapper[4895]: I1202 07:48:57.372012 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41f5b5b1-5555-42cd-b212-71f5e6c5d0c3" path="/var/lib/kubelet/pods/41f5b5b1-5555-42cd-b212-71f5e6c5d0c3/volumes" Dec 02 07:48:57 crc kubenswrapper[4895]: I1202 07:48:57.373514 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66d2c26b-ed57-435f-845e-e4d51a4d9aa3" path="/var/lib/kubelet/pods/66d2c26b-ed57-435f-845e-e4d51a4d9aa3/volumes" Dec 02 07:48:57 crc kubenswrapper[4895]: I1202 07:48:57.374383 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68666f08-2df0-4f46-a22c-9f33cfb65732" path="/var/lib/kubelet/pods/68666f08-2df0-4f46-a22c-9f33cfb65732/volumes" Dec 02 07:48:57 crc kubenswrapper[4895]: I1202 07:48:57.375227 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b3c2445-8bce-4d09-ad86-02c1ba6495fb" path="/var/lib/kubelet/pods/6b3c2445-8bce-4d09-ad86-02c1ba6495fb/volumes" Dec 02 07:48:57 crc kubenswrapper[4895]: I1202 07:48:57.399836 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4b0ee49-bed2-4691-8160-2edbebda27b7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f4b0ee49-bed2-4691-8160-2edbebda27b7" (UID: "f4b0ee49-bed2-4691-8160-2edbebda27b7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:48:57 crc kubenswrapper[4895]: I1202 07:48:57.404238 4895 reconciler_common.go:293] "Volume detached for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/84116ead-6214-4d5f-98a3-c89b08cf1306-ovn-controller-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:57 crc kubenswrapper[4895]: I1202 07:48:57.404282 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4b0ee49-bed2-4691-8160-2edbebda27b7-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:57 crc kubenswrapper[4895]: I1202 07:48:57.404296 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4b0ee49-bed2-4691-8160-2edbebda27b7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:57 crc kubenswrapper[4895]: I1202 07:48:57.409568 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ddddc35-bd7e-4d40-804f-aa2193b6cd16" path="/var/lib/kubelet/pods/7ddddc35-bd7e-4d40-804f-aa2193b6cd16/volumes" Dec 02 07:48:57 crc kubenswrapper[4895]: I1202 07:48:57.410210 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85e9e481-0762-42a8-a25a-7d50500f1236" path="/var/lib/kubelet/pods/85e9e481-0762-42a8-a25a-7d50500f1236/volumes" Dec 02 07:48:57 crc kubenswrapper[4895]: I1202 07:48:57.410796 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="915ca98c-6878-4b7a-ba75-75ab97ce5900" path="/var/lib/kubelet/pods/915ca98c-6878-4b7a-ba75-75ab97ce5900/volumes" Dec 02 07:48:57 crc kubenswrapper[4895]: I1202 07:48:57.433158 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a98f117-14fc-47c5-9106-c9a3daf161f8" path="/var/lib/kubelet/pods/9a98f117-14fc-47c5-9106-c9a3daf161f8/volumes" Dec 02 07:48:57 crc kubenswrapper[4895]: I1202 07:48:57.433756 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="9f38520b-350c-4c3c-9bd2-48bf3c492299" path="/var/lib/kubelet/pods/9f38520b-350c-4c3c-9bd2-48bf3c492299/volumes" Dec 02 07:48:57 crc kubenswrapper[4895]: I1202 07:48:57.434344 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad4346a5-f2f0-4809-b1d3-0c9a70b51cbd" path="/var/lib/kubelet/pods/ad4346a5-f2f0-4809-b1d3-0c9a70b51cbd/volumes" Dec 02 07:48:57 crc kubenswrapper[4895]: I1202 07:48:57.435964 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c880aad7-a43f-45d7-b7cc-b9252d06eadf" path="/var/lib/kubelet/pods/c880aad7-a43f-45d7-b7cc-b9252d06eadf/volumes" Dec 02 07:48:57 crc kubenswrapper[4895]: I1202 07:48:57.436686 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb4ea5bf-abb5-4fc6-887a-46f19eee6493" path="/var/lib/kubelet/pods/cb4ea5bf-abb5-4fc6-887a-46f19eee6493/volumes" Dec 02 07:48:57 crc kubenswrapper[4895]: I1202 07:48:57.437301 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3d2bb1c-bd20-473e-b91a-7e2a63ec9f07" path="/var/lib/kubelet/pods/e3d2bb1c-bd20-473e-b91a-7e2a63ec9f07/volumes" Dec 02 07:48:57 crc kubenswrapper[4895]: I1202 07:48:57.438553 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4869eb0-5e33-4837-8295-06ca17076e69" path="/var/lib/kubelet/pods/e4869eb0-5e33-4837-8295-06ca17076e69/volumes" Dec 02 07:48:57 crc kubenswrapper[4895]: I1202 07:48:57.439670 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea719270-c425-4d8b-8717-6c47a5556302" path="/var/lib/kubelet/pods/ea719270-c425-4d8b-8717-6c47a5556302/volumes" Dec 02 07:48:57 crc kubenswrapper[4895]: I1202 07:48:57.440432 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ebbed0ba-1d44-4421-a276-b075b0f31c3f" path="/var/lib/kubelet/pods/ebbed0ba-1d44-4421-a276-b075b0f31c3f/volumes" Dec 02 07:48:57 crc kubenswrapper[4895]: I1202 07:48:57.441658 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="f762a68c-cabc-4842-844a-1db6710e3ee9" path="/var/lib/kubelet/pods/f762a68c-cabc-4842-844a-1db6710e3ee9/volumes" Dec 02 07:48:57 crc kubenswrapper[4895]: I1202 07:48:57.444641 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 02 07:48:57 crc kubenswrapper[4895]: I1202 07:48:57.444671 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 02 07:48:57 crc kubenswrapper[4895]: I1202 07:48:57.444687 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-ddf8948cc-h2bbh"] Dec 02 07:48:57 crc kubenswrapper[4895]: I1202 07:48:57.444702 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-ddf8948cc-h2bbh"] Dec 02 07:48:57 crc kubenswrapper[4895]: I1202 07:48:57.444714 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 02 07:48:57 crc kubenswrapper[4895]: I1202 07:48:57.444730 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 02 07:48:57 crc kubenswrapper[4895]: E1202 07:48:57.677329 4895 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 45d9908b6c5cd875b205c3155ba480192c4dc6d4df37c9c88146d86fdf68c7e6 is running failed: container process not found" containerID="45d9908b6c5cd875b205c3155ba480192c4dc6d4df37c9c88146d86fdf68c7e6" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 02 07:48:57 crc kubenswrapper[4895]: E1202 07:48:57.679118 4895 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 45d9908b6c5cd875b205c3155ba480192c4dc6d4df37c9c88146d86fdf68c7e6 is running failed: container process not found" containerID="45d9908b6c5cd875b205c3155ba480192c4dc6d4df37c9c88146d86fdf68c7e6" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 
02 07:48:57 crc kubenswrapper[4895]: E1202 07:48:57.679891 4895 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 45d9908b6c5cd875b205c3155ba480192c4dc6d4df37c9c88146d86fdf68c7e6 is running failed: container process not found" containerID="45d9908b6c5cd875b205c3155ba480192c4dc6d4df37c9c88146d86fdf68c7e6" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 02 07:48:57 crc kubenswrapper[4895]: E1202 07:48:57.679942 4895 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 45d9908b6c5cd875b205c3155ba480192c4dc6d4df37c9c88146d86fdf68c7e6 is running failed: container process not found" probeType="Readiness" pod="openstack/nova-cell1-conductor-0" podUID="31223325-1372-4ea6-867e-f511b7dffc09" containerName="nova-cell1-conductor-conductor" Dec 02 07:48:57 crc kubenswrapper[4895]: E1202 07:48:57.854475 4895 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod446b5a26_8e57_4765_bb7d_275cf05996dd.slice/crio-9c18aeb9311a3ffa5790c2f236b884d856db73bab542194f9a4509de984dba58.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod446b5a26_8e57_4765_bb7d_275cf05996dd.slice/crio-conmon-9c18aeb9311a3ffa5790c2f236b884d856db73bab542194f9a4509de984dba58.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod65a02963_abb5_4f29_aa82_88ba6f859a00.slice/crio-conmon-e4a7fe9750c9bc6c97a65a057cac01332e8866edaece81d177811b186bff46cd.scope\": RecentStats: unable to find data in memory cache], [\"/system.slice/system-systemd\\\\x2dcoredump.slice/systemd-coredump@0-73583-0.service\": RecentStats: unable to find data in memory cache]" Dec 02 07:48:57 crc 
kubenswrapper[4895]: I1202 07:48:57.930425 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Dec 02 07:48:57 crc kubenswrapper[4895]: I1202 07:48:57.965228 4895 generic.go:334] "Generic (PLEG): container finished" podID="65a02963-abb5-4f29-aa82-88ba6f859a00" containerID="e4a7fe9750c9bc6c97a65a057cac01332e8866edaece81d177811b186bff46cd" exitCode=0 Dec 02 07:48:57 crc kubenswrapper[4895]: I1202 07:48:57.965343 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"65a02963-abb5-4f29-aa82-88ba6f859a00","Type":"ContainerDied","Data":"e4a7fe9750c9bc6c97a65a057cac01332e8866edaece81d177811b186bff46cd"} Dec 02 07:48:57 crc kubenswrapper[4895]: I1202 07:48:57.969479 4895 generic.go:334] "Generic (PLEG): container finished" podID="6ebf9714-5e6d-415c-a0aa-adab0d3e46e9" containerID="4dc420dbf673d00f97311d57a4404d16d8b6c032b5b89c29e8505019899d42c9" exitCode=0 Dec 02 07:48:57 crc kubenswrapper[4895]: I1202 07:48:57.969610 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wjx7g" event={"ID":"6ebf9714-5e6d-415c-a0aa-adab0d3e46e9","Type":"ContainerDied","Data":"4dc420dbf673d00f97311d57a4404d16d8b6c032b5b89c29e8505019899d42c9"} Dec 02 07:48:57 crc kubenswrapper[4895]: I1202 07:48:57.973310 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"38385316-fca8-41b0-b0ff-570a9cd71e8a","Type":"ContainerDied","Data":"0be7fbe507321827b307a582000ca34981fa4418347f3ceed7cd877618c413d3"} Dec 02 07:48:57 crc kubenswrapper[4895]: I1202 07:48:57.973382 4895 scope.go:117] "RemoveContainer" containerID="6aae636a2f05d49cb09841be65cf88064cfd592fbc7ebcfc8e0589c5f285e704" Dec 02 07:48:57 crc kubenswrapper[4895]: I1202 07:48:57.973533 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Dec 02 07:48:57 crc kubenswrapper[4895]: I1202 07:48:57.983545 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"31223325-1372-4ea6-867e-f511b7dffc09","Type":"ContainerDied","Data":"9d571a69d16683c5710f79ff507e78a1b7e707cb6234248927b8f7b39697f311"} Dec 02 07:48:57 crc kubenswrapper[4895]: I1202 07:48:57.983615 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9d571a69d16683c5710f79ff507e78a1b7e707cb6234248927b8f7b39697f311" Dec 02 07:48:57 crc kubenswrapper[4895]: I1202 07:48:57.997303 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-56dbdc9bc-kgkw2" event={"ID":"247c892c-e00a-474e-8022-73bd1b2249f3","Type":"ContainerDied","Data":"278224f012e51a3c3f0f8cebf3193397ed3b386576713370432fad71826ecd5f"} Dec 02 07:48:57 crc kubenswrapper[4895]: I1202 07:48:57.997362 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="278224f012e51a3c3f0f8cebf3193397ed3b386576713370432fad71826ecd5f" Dec 02 07:48:58 crc kubenswrapper[4895]: I1202 07:48:58.030179 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/38385316-fca8-41b0-b0ff-570a9cd71e8a-config-data-generated\") pod \"38385316-fca8-41b0-b0ff-570a9cd71e8a\" (UID: \"38385316-fca8-41b0-b0ff-570a9cd71e8a\") " Dec 02 07:48:58 crc kubenswrapper[4895]: I1202 07:48:58.030257 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/38385316-fca8-41b0-b0ff-570a9cd71e8a-config-data-default\") pod \"38385316-fca8-41b0-b0ff-570a9cd71e8a\" (UID: \"38385316-fca8-41b0-b0ff-570a9cd71e8a\") " Dec 02 07:48:58 crc kubenswrapper[4895]: I1202 07:48:58.030292 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/38385316-fca8-41b0-b0ff-570a9cd71e8a-galera-tls-certs\") pod \"38385316-fca8-41b0-b0ff-570a9cd71e8a\" (UID: \"38385316-fca8-41b0-b0ff-570a9cd71e8a\") " Dec 02 07:48:58 crc kubenswrapper[4895]: I1202 07:48:58.030317 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38385316-fca8-41b0-b0ff-570a9cd71e8a-combined-ca-bundle\") pod \"38385316-fca8-41b0-b0ff-570a9cd71e8a\" (UID: \"38385316-fca8-41b0-b0ff-570a9cd71e8a\") " Dec 02 07:48:58 crc kubenswrapper[4895]: I1202 07:48:58.030352 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/38385316-fca8-41b0-b0ff-570a9cd71e8a-operator-scripts\") pod \"38385316-fca8-41b0-b0ff-570a9cd71e8a\" (UID: \"38385316-fca8-41b0-b0ff-570a9cd71e8a\") " Dec 02 07:48:58 crc kubenswrapper[4895]: I1202 07:48:58.030384 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"38385316-fca8-41b0-b0ff-570a9cd71e8a\" (UID: \"38385316-fca8-41b0-b0ff-570a9cd71e8a\") " Dec 02 07:48:58 crc kubenswrapper[4895]: I1202 07:48:58.030399 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/38385316-fca8-41b0-b0ff-570a9cd71e8a-kolla-config\") pod \"38385316-fca8-41b0-b0ff-570a9cd71e8a\" (UID: \"38385316-fca8-41b0-b0ff-570a9cd71e8a\") " Dec 02 07:48:58 crc kubenswrapper[4895]: I1202 07:48:58.031612 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w5z9q\" (UniqueName: \"kubernetes.io/projected/38385316-fca8-41b0-b0ff-570a9cd71e8a-kube-api-access-w5z9q\") pod \"38385316-fca8-41b0-b0ff-570a9cd71e8a\" (UID: \"38385316-fca8-41b0-b0ff-570a9cd71e8a\") " Dec 02 07:48:58 crc 
kubenswrapper[4895]: I1202 07:48:58.038364 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_446b5a26-8e57-4765-bb7d-275cf05996dd/ovn-northd/0.log" Dec 02 07:48:58 crc kubenswrapper[4895]: I1202 07:48:58.038413 4895 generic.go:334] "Generic (PLEG): container finished" podID="446b5a26-8e57-4765-bb7d-275cf05996dd" containerID="9c18aeb9311a3ffa5790c2f236b884d856db73bab542194f9a4509de984dba58" exitCode=139 Dec 02 07:48:58 crc kubenswrapper[4895]: I1202 07:48:58.038534 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ftfwq" Dec 02 07:48:58 crc kubenswrapper[4895]: I1202 07:48:58.043300 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"446b5a26-8e57-4765-bb7d-275cf05996dd","Type":"ContainerDied","Data":"9c18aeb9311a3ffa5790c2f236b884d856db73bab542194f9a4509de984dba58"} Dec 02 07:48:58 crc kubenswrapper[4895]: I1202 07:48:58.038364 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38385316-fca8-41b0-b0ff-570a9cd71e8a-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "38385316-fca8-41b0-b0ff-570a9cd71e8a" (UID: "38385316-fca8-41b0-b0ff-570a9cd71e8a"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:48:58 crc kubenswrapper[4895]: I1202 07:48:58.044850 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38385316-fca8-41b0-b0ff-570a9cd71e8a-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "38385316-fca8-41b0-b0ff-570a9cd71e8a" (UID: "38385316-fca8-41b0-b0ff-570a9cd71e8a"). InnerVolumeSpecName "config-data-generated". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:48:58 crc kubenswrapper[4895]: I1202 07:48:58.047003 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38385316-fca8-41b0-b0ff-570a9cd71e8a-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "38385316-fca8-41b0-b0ff-570a9cd71e8a" (UID: "38385316-fca8-41b0-b0ff-570a9cd71e8a"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:48:58 crc kubenswrapper[4895]: I1202 07:48:58.047473 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 02 07:48:58 crc kubenswrapper[4895]: I1202 07:48:58.048024 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38385316-fca8-41b0-b0ff-570a9cd71e8a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "38385316-fca8-41b0-b0ff-570a9cd71e8a" (UID: "38385316-fca8-41b0-b0ff-570a9cd71e8a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:48:58 crc kubenswrapper[4895]: I1202 07:48:58.055444 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 02 07:48:58 crc kubenswrapper[4895]: I1202 07:48:58.056001 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38385316-fca8-41b0-b0ff-570a9cd71e8a-kube-api-access-w5z9q" (OuterVolumeSpecName: "kube-api-access-w5z9q") pod "38385316-fca8-41b0-b0ff-570a9cd71e8a" (UID: "38385316-fca8-41b0-b0ff-570a9cd71e8a"). InnerVolumeSpecName "kube-api-access-w5z9q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:48:58 crc kubenswrapper[4895]: I1202 07:48:58.058261 4895 scope.go:117] "RemoveContainer" containerID="70b9062cebf51e2cb33bacac0c59956df85ea562e2a492f0a1c5926a2d8af62e" Dec 02 07:48:58 crc kubenswrapper[4895]: I1202 07:48:58.060664 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38385316-fca8-41b0-b0ff-570a9cd71e8a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "38385316-fca8-41b0-b0ff-570a9cd71e8a" (UID: "38385316-fca8-41b0-b0ff-570a9cd71e8a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:48:58 crc kubenswrapper[4895]: I1202 07:48:58.061131 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "mysql-db") pod "38385316-fca8-41b0-b0ff-570a9cd71e8a" (UID: "38385316-fca8-41b0-b0ff-570a9cd71e8a"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 02 07:48:58 crc kubenswrapper[4895]: I1202 07:48:58.072489 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-56dbdc9bc-kgkw2" Dec 02 07:48:58 crc kubenswrapper[4895]: I1202 07:48:58.083367 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystonea3d7-account-delete-jgvsf"] Dec 02 07:48:58 crc kubenswrapper[4895]: I1202 07:48:58.094679 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 02 07:48:58 crc kubenswrapper[4895]: I1202 07:48:58.108152 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystonea3d7-account-delete-jgvsf"] Dec 02 07:48:58 crc kubenswrapper[4895]: I1202 07:48:58.116903 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38385316-fca8-41b0-b0ff-570a9cd71e8a-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "38385316-fca8-41b0-b0ff-570a9cd71e8a" (UID: "38385316-fca8-41b0-b0ff-570a9cd71e8a"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:48:58 crc kubenswrapper[4895]: I1202 07:48:58.120872 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_446b5a26-8e57-4765-bb7d-275cf05996dd/ovn-northd/0.log" Dec 02 07:48:58 crc kubenswrapper[4895]: I1202 07:48:58.121036 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Dec 02 07:48:58 crc kubenswrapper[4895]: I1202 07:48:58.132792 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder7d85-account-delete-j8sgc"] Dec 02 07:48:58 crc kubenswrapper[4895]: I1202 07:48:58.133683 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65a02963-abb5-4f29-aa82-88ba6f859a00-config-data\") pod \"65a02963-abb5-4f29-aa82-88ba6f859a00\" (UID: \"65a02963-abb5-4f29-aa82-88ba6f859a00\") " Dec 02 07:48:58 crc kubenswrapper[4895]: I1202 07:48:58.133972 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31223325-1372-4ea6-867e-f511b7dffc09-config-data\") pod \"31223325-1372-4ea6-867e-f511b7dffc09\" (UID: \"31223325-1372-4ea6-867e-f511b7dffc09\") " Dec 02 07:48:58 crc kubenswrapper[4895]: I1202 07:48:58.134035 4895 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/247c892c-e00a-474e-8022-73bd1b2249f3-fernet-keys\") pod \"247c892c-e00a-474e-8022-73bd1b2249f3\" (UID: \"247c892c-e00a-474e-8022-73bd1b2249f3\") " Dec 02 07:48:58 crc kubenswrapper[4895]: I1202 07:48:58.134071 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/446b5a26-8e57-4765-bb7d-275cf05996dd-metrics-certs-tls-certs\") pod \"446b5a26-8e57-4765-bb7d-275cf05996dd\" (UID: \"446b5a26-8e57-4765-bb7d-275cf05996dd\") " Dec 02 07:48:58 crc kubenswrapper[4895]: I1202 07:48:58.134111 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/247c892c-e00a-474e-8022-73bd1b2249f3-config-data\") pod \"247c892c-e00a-474e-8022-73bd1b2249f3\" (UID: \"247c892c-e00a-474e-8022-73bd1b2249f3\") " Dec 02 07:48:58 crc kubenswrapper[4895]: I1202 07:48:58.134183 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/446b5a26-8e57-4765-bb7d-275cf05996dd-combined-ca-bundle\") pod \"446b5a26-8e57-4765-bb7d-275cf05996dd\" (UID: \"446b5a26-8e57-4765-bb7d-275cf05996dd\") " Dec 02 07:48:58 crc kubenswrapper[4895]: I1202 07:48:58.134249 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kj5fq\" (UniqueName: \"kubernetes.io/projected/31223325-1372-4ea6-867e-f511b7dffc09-kube-api-access-kj5fq\") pod \"31223325-1372-4ea6-867e-f511b7dffc09\" (UID: \"31223325-1372-4ea6-867e-f511b7dffc09\") " Dec 02 07:48:58 crc kubenswrapper[4895]: I1202 07:48:58.134333 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31223325-1372-4ea6-867e-f511b7dffc09-combined-ca-bundle\") pod 
\"31223325-1372-4ea6-867e-f511b7dffc09\" (UID: \"31223325-1372-4ea6-867e-f511b7dffc09\") " Dec 02 07:48:58 crc kubenswrapper[4895]: I1202 07:48:58.134370 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/247c892c-e00a-474e-8022-73bd1b2249f3-combined-ca-bundle\") pod \"247c892c-e00a-474e-8022-73bd1b2249f3\" (UID: \"247c892c-e00a-474e-8022-73bd1b2249f3\") " Dec 02 07:48:58 crc kubenswrapper[4895]: I1202 07:48:58.134392 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/247c892c-e00a-474e-8022-73bd1b2249f3-internal-tls-certs\") pod \"247c892c-e00a-474e-8022-73bd1b2249f3\" (UID: \"247c892c-e00a-474e-8022-73bd1b2249f3\") " Dec 02 07:48:58 crc kubenswrapper[4895]: I1202 07:48:58.134429 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65a02963-abb5-4f29-aa82-88ba6f859a00-combined-ca-bundle\") pod \"65a02963-abb5-4f29-aa82-88ba6f859a00\" (UID: \"65a02963-abb5-4f29-aa82-88ba6f859a00\") " Dec 02 07:48:58 crc kubenswrapper[4895]: I1202 07:48:58.134454 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/446b5a26-8e57-4765-bb7d-275cf05996dd-config\") pod \"446b5a26-8e57-4765-bb7d-275cf05996dd\" (UID: \"446b5a26-8e57-4765-bb7d-275cf05996dd\") " Dec 02 07:48:58 crc kubenswrapper[4895]: I1202 07:48:58.134479 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/446b5a26-8e57-4765-bb7d-275cf05996dd-ovn-northd-tls-certs\") pod \"446b5a26-8e57-4765-bb7d-275cf05996dd\" (UID: \"446b5a26-8e57-4765-bb7d-275cf05996dd\") " Dec 02 07:48:58 crc kubenswrapper[4895]: I1202 07:48:58.134507 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/446b5a26-8e57-4765-bb7d-275cf05996dd-ovn-rundir\") pod \"446b5a26-8e57-4765-bb7d-275cf05996dd\" (UID: \"446b5a26-8e57-4765-bb7d-275cf05996dd\") " Dec 02 07:48:58 crc kubenswrapper[4895]: I1202 07:48:58.134544 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/247c892c-e00a-474e-8022-73bd1b2249f3-scripts\") pod \"247c892c-e00a-474e-8022-73bd1b2249f3\" (UID: \"247c892c-e00a-474e-8022-73bd1b2249f3\") " Dec 02 07:48:58 crc kubenswrapper[4895]: I1202 07:48:58.134568 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-87kr4\" (UniqueName: \"kubernetes.io/projected/65a02963-abb5-4f29-aa82-88ba6f859a00-kube-api-access-87kr4\") pod \"65a02963-abb5-4f29-aa82-88ba6f859a00\" (UID: \"65a02963-abb5-4f29-aa82-88ba6f859a00\") " Dec 02 07:48:58 crc kubenswrapper[4895]: I1202 07:48:58.134593 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pzxf2\" (UniqueName: \"kubernetes.io/projected/247c892c-e00a-474e-8022-73bd1b2249f3-kube-api-access-pzxf2\") pod \"247c892c-e00a-474e-8022-73bd1b2249f3\" (UID: \"247c892c-e00a-474e-8022-73bd1b2249f3\") " Dec 02 07:48:58 crc kubenswrapper[4895]: I1202 07:48:58.134620 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/247c892c-e00a-474e-8022-73bd1b2249f3-public-tls-certs\") pod \"247c892c-e00a-474e-8022-73bd1b2249f3\" (UID: \"247c892c-e00a-474e-8022-73bd1b2249f3\") " Dec 02 07:48:58 crc kubenswrapper[4895]: I1202 07:48:58.134655 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ns8ww\" (UniqueName: \"kubernetes.io/projected/446b5a26-8e57-4765-bb7d-275cf05996dd-kube-api-access-ns8ww\") pod \"446b5a26-8e57-4765-bb7d-275cf05996dd\" (UID: 
\"446b5a26-8e57-4765-bb7d-275cf05996dd\") " Dec 02 07:48:58 crc kubenswrapper[4895]: I1202 07:48:58.134717 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/446b5a26-8e57-4765-bb7d-275cf05996dd-scripts\") pod \"446b5a26-8e57-4765-bb7d-275cf05996dd\" (UID: \"446b5a26-8e57-4765-bb7d-275cf05996dd\") " Dec 02 07:48:58 crc kubenswrapper[4895]: I1202 07:48:58.134753 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/247c892c-e00a-474e-8022-73bd1b2249f3-credential-keys\") pod \"247c892c-e00a-474e-8022-73bd1b2249f3\" (UID: \"247c892c-e00a-474e-8022-73bd1b2249f3\") " Dec 02 07:48:58 crc kubenswrapper[4895]: I1202 07:48:58.135260 4895 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/38385316-fca8-41b0-b0ff-570a9cd71e8a-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:58 crc kubenswrapper[4895]: I1202 07:48:58.135290 4895 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Dec 02 07:48:58 crc kubenswrapper[4895]: I1202 07:48:58.135303 4895 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/38385316-fca8-41b0-b0ff-570a9cd71e8a-kolla-config\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:58 crc kubenswrapper[4895]: I1202 07:48:58.135316 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w5z9q\" (UniqueName: \"kubernetes.io/projected/38385316-fca8-41b0-b0ff-570a9cd71e8a-kube-api-access-w5z9q\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:58 crc kubenswrapper[4895]: I1202 07:48:58.135353 4895 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/ffbc11e8-7e53-46db-bcd7-35b4ab5d7fb9-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:58 crc kubenswrapper[4895]: I1202 07:48:58.135382 4895 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/38385316-fca8-41b0-b0ff-570a9cd71e8a-config-data-generated\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:58 crc kubenswrapper[4895]: I1202 07:48:58.135393 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8zdjw\" (UniqueName: \"kubernetes.io/projected/ffbc11e8-7e53-46db-bcd7-35b4ab5d7fb9-kube-api-access-8zdjw\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:58 crc kubenswrapper[4895]: I1202 07:48:58.135402 4895 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/38385316-fca8-41b0-b0ff-570a9cd71e8a-config-data-default\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:58 crc kubenswrapper[4895]: I1202 07:48:58.135411 4895 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/38385316-fca8-41b0-b0ff-570a9cd71e8a-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:58 crc kubenswrapper[4895]: I1202 07:48:58.135421 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38385316-fca8-41b0-b0ff-570a9cd71e8a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:58 crc kubenswrapper[4895]: I1202 07:48:58.140213 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/446b5a26-8e57-4765-bb7d-275cf05996dd-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "446b5a26-8e57-4765-bb7d-275cf05996dd" (UID: "446b5a26-8e57-4765-bb7d-275cf05996dd"). InnerVolumeSpecName "ovn-rundir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:48:58 crc kubenswrapper[4895]: I1202 07:48:58.140775 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/446b5a26-8e57-4765-bb7d-275cf05996dd-config" (OuterVolumeSpecName: "config") pod "446b5a26-8e57-4765-bb7d-275cf05996dd" (UID: "446b5a26-8e57-4765-bb7d-275cf05996dd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:48:58 crc kubenswrapper[4895]: I1202 07:48:58.142920 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/446b5a26-8e57-4765-bb7d-275cf05996dd-scripts" (OuterVolumeSpecName: "scripts") pod "446b5a26-8e57-4765-bb7d-275cf05996dd" (UID: "446b5a26-8e57-4765-bb7d-275cf05996dd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:48:58 crc kubenswrapper[4895]: E1202 07:48:58.149207 4895 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 02 07:48:58 crc kubenswrapper[4895]: E1202 07:48:58.149496 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5696e7d9-103a-4bf7-9b05-1959e92cf46a-operator-scripts podName:5696e7d9-103a-4bf7-9b05-1959e92cf46a nodeName:}" failed. No retries permitted until 2025-12-02 07:49:02.14929005 +0000 UTC m=+1553.320149663 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/5696e7d9-103a-4bf7-9b05-1959e92cf46a-operator-scripts") pod "novacell0b7d1-account-delete-wchwk" (UID: "5696e7d9-103a-4bf7-9b05-1959e92cf46a") : configmap "openstack-scripts" not found Dec 02 07:48:58 crc kubenswrapper[4895]: I1202 07:48:58.153293 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder7d85-account-delete-j8sgc"] Dec 02 07:48:58 crc kubenswrapper[4895]: I1202 07:48:58.164577 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/247c892c-e00a-474e-8022-73bd1b2249f3-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "247c892c-e00a-474e-8022-73bd1b2249f3" (UID: "247c892c-e00a-474e-8022-73bd1b2249f3"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:48:58 crc kubenswrapper[4895]: I1202 07:48:58.166397 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31223325-1372-4ea6-867e-f511b7dffc09-kube-api-access-kj5fq" (OuterVolumeSpecName: "kube-api-access-kj5fq") pod "31223325-1372-4ea6-867e-f511b7dffc09" (UID: "31223325-1372-4ea6-867e-f511b7dffc09"). InnerVolumeSpecName "kube-api-access-kj5fq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:48:58 crc kubenswrapper[4895]: I1202 07:48:58.171012 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/247c892c-e00a-474e-8022-73bd1b2249f3-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "247c892c-e00a-474e-8022-73bd1b2249f3" (UID: "247c892c-e00a-474e-8022-73bd1b2249f3"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:48:58 crc kubenswrapper[4895]: I1202 07:48:58.175869 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/446b5a26-8e57-4765-bb7d-275cf05996dd-kube-api-access-ns8ww" (OuterVolumeSpecName: "kube-api-access-ns8ww") pod "446b5a26-8e57-4765-bb7d-275cf05996dd" (UID: "446b5a26-8e57-4765-bb7d-275cf05996dd"). InnerVolumeSpecName "kube-api-access-ns8ww". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:48:58 crc kubenswrapper[4895]: I1202 07:48:58.188855 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican4aa4-account-delete-pvmbl"] Dec 02 07:48:58 crc kubenswrapper[4895]: I1202 07:48:58.190340 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/247c892c-e00a-474e-8022-73bd1b2249f3-scripts" (OuterVolumeSpecName: "scripts") pod "247c892c-e00a-474e-8022-73bd1b2249f3" (UID: "247c892c-e00a-474e-8022-73bd1b2249f3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:48:58 crc kubenswrapper[4895]: I1202 07:48:58.190467 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65a02963-abb5-4f29-aa82-88ba6f859a00-kube-api-access-87kr4" (OuterVolumeSpecName: "kube-api-access-87kr4") pod "65a02963-abb5-4f29-aa82-88ba6f859a00" (UID: "65a02963-abb5-4f29-aa82-88ba6f859a00"). InnerVolumeSpecName "kube-api-access-87kr4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:48:58 crc kubenswrapper[4895]: I1202 07:48:58.190521 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/247c892c-e00a-474e-8022-73bd1b2249f3-kube-api-access-pzxf2" (OuterVolumeSpecName: "kube-api-access-pzxf2") pod "247c892c-e00a-474e-8022-73bd1b2249f3" (UID: "247c892c-e00a-474e-8022-73bd1b2249f3"). InnerVolumeSpecName "kube-api-access-pzxf2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:48:58 crc kubenswrapper[4895]: I1202 07:48:58.191022 4895 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Dec 02 07:48:58 crc kubenswrapper[4895]: I1202 07:48:58.219489 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/247c892c-e00a-474e-8022-73bd1b2249f3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "247c892c-e00a-474e-8022-73bd1b2249f3" (UID: "247c892c-e00a-474e-8022-73bd1b2249f3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:48:58 crc kubenswrapper[4895]: I1202 07:48:58.222057 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican4aa4-account-delete-pvmbl"] Dec 02 07:48:58 crc kubenswrapper[4895]: I1202 07:48:58.234097 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65a02963-abb5-4f29-aa82-88ba6f859a00-config-data" (OuterVolumeSpecName: "config-data") pod "65a02963-abb5-4f29-aa82-88ba6f859a00" (UID: "65a02963-abb5-4f29-aa82-88ba6f859a00"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:48:58 crc kubenswrapper[4895]: I1202 07:48:58.246613 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65a02963-abb5-4f29-aa82-88ba6f859a00-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:58 crc kubenswrapper[4895]: I1202 07:48:58.247119 4895 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/247c892c-e00a-474e-8022-73bd1b2249f3-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:58 crc kubenswrapper[4895]: I1202 07:48:58.247193 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kj5fq\" (UniqueName: \"kubernetes.io/projected/31223325-1372-4ea6-867e-f511b7dffc09-kube-api-access-kj5fq\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:58 crc kubenswrapper[4895]: I1202 07:48:58.247250 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/247c892c-e00a-474e-8022-73bd1b2249f3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:58 crc kubenswrapper[4895]: I1202 07:48:58.247319 4895 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:58 crc kubenswrapper[4895]: I1202 07:48:58.247421 4895 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/446b5a26-8e57-4765-bb7d-275cf05996dd-config\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:58 crc kubenswrapper[4895]: I1202 07:48:58.247519 4895 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/446b5a26-8e57-4765-bb7d-275cf05996dd-ovn-rundir\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:58 crc kubenswrapper[4895]: I1202 07:48:58.247718 4895 reconciler_common.go:293] "Volume detached for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/247c892c-e00a-474e-8022-73bd1b2249f3-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:58 crc kubenswrapper[4895]: I1202 07:48:58.247812 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-87kr4\" (UniqueName: \"kubernetes.io/projected/65a02963-abb5-4f29-aa82-88ba6f859a00-kube-api-access-87kr4\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:58 crc kubenswrapper[4895]: I1202 07:48:58.247882 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pzxf2\" (UniqueName: \"kubernetes.io/projected/247c892c-e00a-474e-8022-73bd1b2249f3-kube-api-access-pzxf2\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:58 crc kubenswrapper[4895]: I1202 07:48:58.247952 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ns8ww\" (UniqueName: \"kubernetes.io/projected/446b5a26-8e57-4765-bb7d-275cf05996dd-kube-api-access-ns8ww\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:58 crc kubenswrapper[4895]: I1202 07:48:58.248030 4895 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/446b5a26-8e57-4765-bb7d-275cf05996dd-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:58 crc kubenswrapper[4895]: I1202 07:48:58.248099 4895 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/247c892c-e00a-474e-8022-73bd1b2249f3-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:58 crc kubenswrapper[4895]: I1202 07:48:58.246887 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/247c892c-e00a-474e-8022-73bd1b2249f3-config-data" (OuterVolumeSpecName: "config-data") pod "247c892c-e00a-474e-8022-73bd1b2249f3" (UID: "247c892c-e00a-474e-8022-73bd1b2249f3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:48:58 crc kubenswrapper[4895]: I1202 07:48:58.256218 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 07:48:58 crc kubenswrapper[4895]: I1202 07:48:58.267479 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 07:48:58 crc kubenswrapper[4895]: I1202 07:48:58.282155 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Dec 02 07:48:58 crc kubenswrapper[4895]: I1202 07:48:58.283265 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/247c892c-e00a-474e-8022-73bd1b2249f3-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "247c892c-e00a-474e-8022-73bd1b2249f3" (UID: "247c892c-e00a-474e-8022-73bd1b2249f3"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:48:58 crc kubenswrapper[4895]: I1202 07:48:58.299210 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/memcached-0"] Dec 02 07:48:58 crc kubenswrapper[4895]: I1202 07:48:58.308603 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31223325-1372-4ea6-867e-f511b7dffc09-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "31223325-1372-4ea6-867e-f511b7dffc09" (UID: "31223325-1372-4ea6-867e-f511b7dffc09"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:48:58 crc kubenswrapper[4895]: I1202 07:48:58.311100 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron5a3b-account-delete-949mv"] Dec 02 07:48:58 crc kubenswrapper[4895]: I1202 07:48:58.315412 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/446b5a26-8e57-4765-bb7d-275cf05996dd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "446b5a26-8e57-4765-bb7d-275cf05996dd" (UID: "446b5a26-8e57-4765-bb7d-275cf05996dd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:48:58 crc kubenswrapper[4895]: I1202 07:48:58.320319 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron5a3b-account-delete-949mv"] Dec 02 07:48:58 crc kubenswrapper[4895]: I1202 07:48:58.338118 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 02 07:48:58 crc kubenswrapper[4895]: I1202 07:48:58.338758 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/247c892c-e00a-474e-8022-73bd1b2249f3-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "247c892c-e00a-474e-8022-73bd1b2249f3" (UID: "247c892c-e00a-474e-8022-73bd1b2249f3"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:48:58 crc kubenswrapper[4895]: I1202 07:48:58.349845 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/247c892c-e00a-474e-8022-73bd1b2249f3-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:58 crc kubenswrapper[4895]: I1202 07:48:58.350025 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/446b5a26-8e57-4765-bb7d-275cf05996dd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:58 crc kubenswrapper[4895]: I1202 07:48:58.350091 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31223325-1372-4ea6-867e-f511b7dffc09-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:58 crc kubenswrapper[4895]: I1202 07:48:58.350153 4895 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/247c892c-e00a-474e-8022-73bd1b2249f3-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:58 crc kubenswrapper[4895]: I1202 07:48:58.350213 4895 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/247c892c-e00a-474e-8022-73bd1b2249f3-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:58 crc kubenswrapper[4895]: I1202 07:48:58.350280 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 02 07:48:58 crc kubenswrapper[4895]: I1202 07:48:58.364524 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/446b5a26-8e57-4765-bb7d-275cf05996dd-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "446b5a26-8e57-4765-bb7d-275cf05996dd" (UID: "446b5a26-8e57-4765-bb7d-275cf05996dd"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:48:58 crc kubenswrapper[4895]: I1202 07:48:58.366423 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/446b5a26-8e57-4765-bb7d-275cf05996dd-ovn-northd-tls-certs" (OuterVolumeSpecName: "ovn-northd-tls-certs") pod "446b5a26-8e57-4765-bb7d-275cf05996dd" (UID: "446b5a26-8e57-4765-bb7d-275cf05996dd"). InnerVolumeSpecName "ovn-northd-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:48:58 crc kubenswrapper[4895]: I1202 07:48:58.372525 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ftfwq"] Dec 02 07:48:58 crc kubenswrapper[4895]: I1202 07:48:58.382243 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-ftfwq"] Dec 02 07:48:58 crc kubenswrapper[4895]: I1202 07:48:58.390275 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 02 07:48:58 crc kubenswrapper[4895]: I1202 07:48:58.394986 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 02 07:48:58 crc kubenswrapper[4895]: I1202 07:48:58.398535 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65a02963-abb5-4f29-aa82-88ba6f859a00-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "65a02963-abb5-4f29-aa82-88ba6f859a00" (UID: "65a02963-abb5-4f29-aa82-88ba6f859a00"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:48:58 crc kubenswrapper[4895]: I1202 07:48:58.402907 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31223325-1372-4ea6-867e-f511b7dffc09-config-data" (OuterVolumeSpecName: "config-data") pod "31223325-1372-4ea6-867e-f511b7dffc09" (UID: "31223325-1372-4ea6-867e-f511b7dffc09"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:48:58 crc kubenswrapper[4895]: I1202 07:48:58.424201 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 02 07:48:58 crc kubenswrapper[4895]: I1202 07:48:58.436609 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 02 07:48:58 crc kubenswrapper[4895]: I1202 07:48:58.446798 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 02 07:48:58 crc kubenswrapper[4895]: I1202 07:48:58.451834 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65a02963-abb5-4f29-aa82-88ba6f859a00-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:58 crc kubenswrapper[4895]: I1202 07:48:58.451868 4895 reconciler_common.go:293] "Volume detached for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/446b5a26-8e57-4765-bb7d-275cf05996dd-ovn-northd-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:58 crc kubenswrapper[4895]: E1202 07:48:58.451961 4895 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 02 07:48:58 crc kubenswrapper[4895]: I1202 07:48:58.452030 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31223325-1372-4ea6-867e-f511b7dffc09-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:58 crc kubenswrapper[4895]: E1202 07:48:58.452131 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/96831697-ba2e-477e-954f-e4ad0cf30f92-operator-scripts podName:96831697-ba2e-477e-954f-e4ad0cf30f92 nodeName:}" failed. No retries permitted until 2025-12-02 07:49:02.452102129 +0000 UTC m=+1553.622961952 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/96831697-ba2e-477e-954f-e4ad0cf30f92-operator-scripts") pod "novaapi23cb-account-delete-g8msv" (UID: "96831697-ba2e-477e-954f-e4ad0cf30f92") : configmap "openstack-scripts" not found Dec 02 07:48:58 crc kubenswrapper[4895]: I1202 07:48:58.452585 4895 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/446b5a26-8e57-4765-bb7d-275cf05996dd-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:58 crc kubenswrapper[4895]: I1202 07:48:58.452642 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 02 07:48:58 crc kubenswrapper[4895]: I1202 07:48:58.475577 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placementa8bc-account-delete-jz5nc"] Dec 02 07:48:58 crc kubenswrapper[4895]: I1202 07:48:58.490753 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placementa8bc-account-delete-jz5nc"] Dec 02 07:48:58 crc kubenswrapper[4895]: I1202 07:48:58.497661 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-54957dcd96-7sx87"] Dec 02 07:48:58 crc kubenswrapper[4895]: I1202 07:48:58.503822 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-54957dcd96-7sx87"] Dec 02 07:48:58 crc kubenswrapper[4895]: I1202 07:48:58.510836 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Dec 02 07:48:58 crc kubenswrapper[4895]: I1202 07:48:58.517291 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-galera-0"] Dec 02 07:48:59 crc kubenswrapper[4895]: I1202 07:48:59.066916 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_446b5a26-8e57-4765-bb7d-275cf05996dd/ovn-northd/0.log" Dec 02 07:48:59 crc kubenswrapper[4895]: I1202 07:48:59.067455 4895 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/ovn-northd-0" event={"ID":"446b5a26-8e57-4765-bb7d-275cf05996dd","Type":"ContainerDied","Data":"fcab07b3a6e3e24e623ff43fafd5d6c39c2f1c8eea31976b8d55fe2707cec11a"} Dec 02 07:48:59 crc kubenswrapper[4895]: I1202 07:48:59.067505 4895 scope.go:117] "RemoveContainer" containerID="8f7f80f7975fea79b1c3bcefa5a8a41052d690e193ab88673538d60ad2720b9a" Dec 02 07:48:59 crc kubenswrapper[4895]: I1202 07:48:59.067677 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Dec 02 07:48:59 crc kubenswrapper[4895]: I1202 07:48:59.077167 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 02 07:48:59 crc kubenswrapper[4895]: I1202 07:48:59.077193 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"65a02963-abb5-4f29-aa82-88ba6f859a00","Type":"ContainerDied","Data":"01c44df798ca616a54cc3d96ce4d25a37d45d50a80f4a2d2b8f986f0c1428c2f"} Dec 02 07:48:59 crc kubenswrapper[4895]: I1202 07:48:59.082046 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wjx7g" event={"ID":"6ebf9714-5e6d-415c-a0aa-adab0d3e46e9","Type":"ContainerStarted","Data":"867d58af9aa78f6e2b186a556b4dd87418cbe07902e00bd2851b918d8af40dcf"} Dec 02 07:48:59 crc kubenswrapper[4895]: I1202 07:48:59.089524 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-56dbdc9bc-kgkw2" Dec 02 07:48:59 crc kubenswrapper[4895]: I1202 07:48:59.089521 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 02 07:48:59 crc kubenswrapper[4895]: I1202 07:48:59.091911 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-wjx7g" Dec 02 07:48:59 crc kubenswrapper[4895]: I1202 07:48:59.092215 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-wjx7g" Dec 02 07:48:59 crc kubenswrapper[4895]: I1202 07:48:59.112099 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-wjx7g" podStartSLOduration=8.140589056 podStartE2EDuration="12.112073728s" podCreationTimestamp="2025-12-02 07:48:47 +0000 UTC" firstStartedPulling="2025-12-02 07:48:54.535202688 +0000 UTC m=+1545.706062301" lastFinishedPulling="2025-12-02 07:48:58.50668736 +0000 UTC m=+1549.677546973" observedRunningTime="2025-12-02 07:48:59.108614952 +0000 UTC m=+1550.279474615" watchObservedRunningTime="2025-12-02 07:48:59.112073728 +0000 UTC m=+1550.282933341" Dec 02 07:48:59 crc kubenswrapper[4895]: I1202 07:48:59.116834 4895 scope.go:117] "RemoveContainer" containerID="9c18aeb9311a3ffa5790c2f236b884d856db73bab542194f9a4509de984dba58" Dec 02 07:48:59 crc kubenswrapper[4895]: I1202 07:48:59.157235 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a3ec758-e19e-4286-bfed-a1d6d3010bfb" path="/var/lib/kubelet/pods/0a3ec758-e19e-4286-bfed-a1d6d3010bfb/volumes" Dec 02 07:48:59 crc kubenswrapper[4895]: I1202 07:48:59.160682 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d1cb194-5325-40c2-bbd4-0a48821e12aa" path="/var/lib/kubelet/pods/0d1cb194-5325-40c2-bbd4-0a48821e12aa/volumes" Dec 02 07:48:59 crc kubenswrapper[4895]: I1202 07:48:59.164248 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="290c1303-bf41-4474-86ff-c9f5aa105cc3" 
path="/var/lib/kubelet/pods/290c1303-bf41-4474-86ff-c9f5aa105cc3/volumes" Dec 02 07:48:59 crc kubenswrapper[4895]: I1202 07:48:59.165092 4895 scope.go:117] "RemoveContainer" containerID="e4a7fe9750c9bc6c97a65a057cac01332e8866edaece81d177811b186bff46cd" Dec 02 07:48:59 crc kubenswrapper[4895]: I1202 07:48:59.168944 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38385316-fca8-41b0-b0ff-570a9cd71e8a" path="/var/lib/kubelet/pods/38385316-fca8-41b0-b0ff-570a9cd71e8a/volumes" Dec 02 07:48:59 crc kubenswrapper[4895]: I1202 07:48:59.170908 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5cae5c9e-9159-4e78-9809-1801d0e35131" path="/var/lib/kubelet/pods/5cae5c9e-9159-4e78-9809-1801d0e35131/volumes" Dec 02 07:48:59 crc kubenswrapper[4895]: I1202 07:48:59.171540 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68bddf66-0b9f-4bc8-916b-aa0abfbf13c3" path="/var/lib/kubelet/pods/68bddf66-0b9f-4bc8-916b-aa0abfbf13c3/volumes" Dec 02 07:48:59 crc kubenswrapper[4895]: I1202 07:48:59.172902 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7067a12f-0245-45f5-a806-591d5999c7f0" path="/var/lib/kubelet/pods/7067a12f-0245-45f5-a806-591d5999c7f0/volumes" Dec 02 07:48:59 crc kubenswrapper[4895]: I1202 07:48:59.173395 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="836bba81-425e-4610-b191-2bbb2cfc1f79" path="/var/lib/kubelet/pods/836bba81-425e-4610-b191-2bbb2cfc1f79/volumes" Dec 02 07:48:59 crc kubenswrapper[4895]: I1202 07:48:59.174302 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84116ead-6214-4d5f-98a3-c89b08cf1306" path="/var/lib/kubelet/pods/84116ead-6214-4d5f-98a3-c89b08cf1306/volumes" Dec 02 07:48:59 crc kubenswrapper[4895]: I1202 07:48:59.175450 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab5ec753-410a-4d4b-8071-ce60970ba4df" 
path="/var/lib/kubelet/pods/ab5ec753-410a-4d4b-8071-ce60970ba4df/volumes" Dec 02 07:48:59 crc kubenswrapper[4895]: I1202 07:48:59.176119 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b15097a8-ac9a-4886-a839-272b662561c5" path="/var/lib/kubelet/pods/b15097a8-ac9a-4886-a839-272b662561c5/volumes" Dec 02 07:48:59 crc kubenswrapper[4895]: I1202 07:48:59.176831 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca98cba7-4127-4d25-a139-1a42224331f2" path="/var/lib/kubelet/pods/ca98cba7-4127-4d25-a139-1a42224331f2/volumes" Dec 02 07:48:59 crc kubenswrapper[4895]: I1202 07:48:59.178373 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d08915b6-6f79-40e4-8c26-d9f82606b4cc" path="/var/lib/kubelet/pods/d08915b6-6f79-40e4-8c26-d9f82606b4cc/volumes" Dec 02 07:48:59 crc kubenswrapper[4895]: I1202 07:48:59.180624 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d42411e0-2228-4a1a-9d31-e3788f2b1f0c" path="/var/lib/kubelet/pods/d42411e0-2228-4a1a-9d31-e3788f2b1f0c/volumes" Dec 02 07:48:59 crc kubenswrapper[4895]: I1202 07:48:59.182300 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f28e5fd3-456b-4960-a3a9-1134e3eecb1f" path="/var/lib/kubelet/pods/f28e5fd3-456b-4960-a3a9-1134e3eecb1f/volumes" Dec 02 07:48:59 crc kubenswrapper[4895]: I1202 07:48:59.182894 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b0ee49-bed2-4691-8160-2edbebda27b7" path="/var/lib/kubelet/pods/f4b0ee49-bed2-4691-8160-2edbebda27b7/volumes" Dec 02 07:48:59 crc kubenswrapper[4895]: I1202 07:48:59.184333 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ffbc11e8-7e53-46db-bcd7-35b4ab5d7fb9" path="/var/lib/kubelet/pods/ffbc11e8-7e53-46db-bcd7-35b4ab5d7fb9/volumes" Dec 02 07:48:59 crc kubenswrapper[4895]: I1202 07:48:59.184890 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Dec 02 07:48:59 crc 
kubenswrapper[4895]: I1202 07:48:59.184924 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-northd-0"] Dec 02 07:48:59 crc kubenswrapper[4895]: I1202 07:48:59.184945 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 02 07:48:59 crc kubenswrapper[4895]: I1202 07:48:59.184959 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 02 07:48:59 crc kubenswrapper[4895]: I1202 07:48:59.184971 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 02 07:48:59 crc kubenswrapper[4895]: I1202 07:48:59.184983 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 02 07:48:59 crc kubenswrapper[4895]: I1202 07:48:59.208024 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-56dbdc9bc-kgkw2"] Dec 02 07:48:59 crc kubenswrapper[4895]: I1202 07:48:59.220129 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-56dbdc9bc-kgkw2"] Dec 02 07:48:59 crc kubenswrapper[4895]: I1202 07:48:59.584976 4895 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="2d2c6f66-9fe8-4d15-92f6-f29493e6cd7b" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.200:8775/\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 02 07:48:59 crc kubenswrapper[4895]: I1202 07:48:59.588213 4895 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="2d2c6f66-9fe8-4d15-92f6-f29493e6cd7b" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.200:8775/\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 02 07:48:59 crc kubenswrapper[4895]: E1202 07:48:59.872194 4895 log.go:32] "ExecSync cmd from runtime 
service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6aa3fa36a02a2278bdcd14541013304e85cab914b53ba0de74342a5fb50d00cd is running failed: container process not found" containerID="6aa3fa36a02a2278bdcd14541013304e85cab914b53ba0de74342a5fb50d00cd" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 02 07:48:59 crc kubenswrapper[4895]: E1202 07:48:59.872854 4895 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6aa3fa36a02a2278bdcd14541013304e85cab914b53ba0de74342a5fb50d00cd is running failed: container process not found" containerID="6aa3fa36a02a2278bdcd14541013304e85cab914b53ba0de74342a5fb50d00cd" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 02 07:48:59 crc kubenswrapper[4895]: E1202 07:48:59.872851 4895 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="7cc0795476568fadf42dedac218a2fe6065e25675de99e96dafc929b4ec7b9ff" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 02 07:48:59 crc kubenswrapper[4895]: E1202 07:48:59.873409 4895 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6aa3fa36a02a2278bdcd14541013304e85cab914b53ba0de74342a5fb50d00cd is running failed: container process not found" containerID="6aa3fa36a02a2278bdcd14541013304e85cab914b53ba0de74342a5fb50d00cd" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 02 07:48:59 crc kubenswrapper[4895]: E1202 07:48:59.873535 4895 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6aa3fa36a02a2278bdcd14541013304e85cab914b53ba0de74342a5fb50d00cd is running 
failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-9vczq" podUID="6b463255-a237-46b0-826d-1e6fc849f0aa" containerName="ovsdb-server" Dec 02 07:48:59 crc kubenswrapper[4895]: E1202 07:48:59.874556 4895 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="7cc0795476568fadf42dedac218a2fe6065e25675de99e96dafc929b4ec7b9ff" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 02 07:48:59 crc kubenswrapper[4895]: E1202 07:48:59.876572 4895 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="7cc0795476568fadf42dedac218a2fe6065e25675de99e96dafc929b4ec7b9ff" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 02 07:48:59 crc kubenswrapper[4895]: E1202 07:48:59.876624 4895 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-9vczq" podUID="6b463255-a237-46b0-826d-1e6fc849f0aa" containerName="ovs-vswitchd" Dec 02 07:49:00 crc kubenswrapper[4895]: I1202 07:49:00.157459 4895 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-wjx7g" podUID="6ebf9714-5e6d-415c-a0aa-adab0d3e46e9" containerName="registry-server" probeResult="failure" output=< Dec 02 07:49:00 crc kubenswrapper[4895]: timeout: failed to connect service ":50051" within 1s Dec 02 07:49:00 crc kubenswrapper[4895]: > Dec 02 07:49:00 crc kubenswrapper[4895]: I1202 07:49:00.429783 4895 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="ca98cba7-4127-4d25-a139-1a42224331f2" 
containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.100:5671: i/o timeout" Dec 02 07:49:00 crc kubenswrapper[4895]: I1202 07:49:00.749459 4895 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="0d1cb194-5325-40c2-bbd4-0a48821e12aa" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.101:5671: i/o timeout" Dec 02 07:49:01 crc kubenswrapper[4895]: I1202 07:49:01.161204 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="247c892c-e00a-474e-8022-73bd1b2249f3" path="/var/lib/kubelet/pods/247c892c-e00a-474e-8022-73bd1b2249f3/volumes" Dec 02 07:49:01 crc kubenswrapper[4895]: I1202 07:49:01.162457 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31223325-1372-4ea6-867e-f511b7dffc09" path="/var/lib/kubelet/pods/31223325-1372-4ea6-867e-f511b7dffc09/volumes" Dec 02 07:49:01 crc kubenswrapper[4895]: I1202 07:49:01.164293 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="446b5a26-8e57-4765-bb7d-275cf05996dd" path="/var/lib/kubelet/pods/446b5a26-8e57-4765-bb7d-275cf05996dd/volumes" Dec 02 07:49:01 crc kubenswrapper[4895]: I1202 07:49:01.167124 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65a02963-abb5-4f29-aa82-88ba6f859a00" path="/var/lib/kubelet/pods/65a02963-abb5-4f29-aa82-88ba6f859a00/volumes" Dec 02 07:49:02 crc kubenswrapper[4895]: E1202 07:49:02.236421 4895 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 02 07:49:02 crc kubenswrapper[4895]: E1202 07:49:02.236898 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5696e7d9-103a-4bf7-9b05-1959e92cf46a-operator-scripts podName:5696e7d9-103a-4bf7-9b05-1959e92cf46a nodeName:}" failed. No retries permitted until 2025-12-02 07:49:10.236874007 +0000 UTC m=+1561.407733620 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/5696e7d9-103a-4bf7-9b05-1959e92cf46a-operator-scripts") pod "novacell0b7d1-account-delete-wchwk" (UID: "5696e7d9-103a-4bf7-9b05-1959e92cf46a") : configmap "openstack-scripts" not found Dec 02 07:49:02 crc kubenswrapper[4895]: E1202 07:49:02.542097 4895 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 02 07:49:02 crc kubenswrapper[4895]: E1202 07:49:02.542196 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/96831697-ba2e-477e-954f-e4ad0cf30f92-operator-scripts podName:96831697-ba2e-477e-954f-e4ad0cf30f92 nodeName:}" failed. No retries permitted until 2025-12-02 07:49:10.542176522 +0000 UTC m=+1561.713036135 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/96831697-ba2e-477e-954f-e4ad0cf30f92-operator-scripts") pod "novaapi23cb-account-delete-g8msv" (UID: "96831697-ba2e-477e-954f-e4ad0cf30f92") : configmap "openstack-scripts" not found Dec 02 07:49:04 crc kubenswrapper[4895]: I1202 07:49:04.180594 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-vpwvp" Dec 02 07:49:04 crc kubenswrapper[4895]: I1202 07:49:04.181056 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-vpwvp" Dec 02 07:49:04 crc kubenswrapper[4895]: I1202 07:49:04.241262 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-vpwvp" Dec 02 07:49:04 crc kubenswrapper[4895]: E1202 07:49:04.870540 4895 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6aa3fa36a02a2278bdcd14541013304e85cab914b53ba0de74342a5fb50d00cd is running failed: 
container process not found" containerID="6aa3fa36a02a2278bdcd14541013304e85cab914b53ba0de74342a5fb50d00cd" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 02 07:49:04 crc kubenswrapper[4895]: E1202 07:49:04.871111 4895 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6aa3fa36a02a2278bdcd14541013304e85cab914b53ba0de74342a5fb50d00cd is running failed: container process not found" containerID="6aa3fa36a02a2278bdcd14541013304e85cab914b53ba0de74342a5fb50d00cd" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 02 07:49:04 crc kubenswrapper[4895]: E1202 07:49:04.871799 4895 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6aa3fa36a02a2278bdcd14541013304e85cab914b53ba0de74342a5fb50d00cd is running failed: container process not found" containerID="6aa3fa36a02a2278bdcd14541013304e85cab914b53ba0de74342a5fb50d00cd" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 02 07:49:04 crc kubenswrapper[4895]: E1202 07:49:04.871864 4895 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6aa3fa36a02a2278bdcd14541013304e85cab914b53ba0de74342a5fb50d00cd is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-9vczq" podUID="6b463255-a237-46b0-826d-1e6fc849f0aa" containerName="ovsdb-server" Dec 02 07:49:04 crc kubenswrapper[4895]: E1202 07:49:04.872850 4895 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="7cc0795476568fadf42dedac218a2fe6065e25675de99e96dafc929b4ec7b9ff" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 02 07:49:04 crc 
kubenswrapper[4895]: E1202 07:49:04.875419 4895 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="7cc0795476568fadf42dedac218a2fe6065e25675de99e96dafc929b4ec7b9ff" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 02 07:49:04 crc kubenswrapper[4895]: E1202 07:49:04.878514 4895 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="7cc0795476568fadf42dedac218a2fe6065e25675de99e96dafc929b4ec7b9ff" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 02 07:49:04 crc kubenswrapper[4895]: E1202 07:49:04.878610 4895 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-9vczq" podUID="6b463255-a237-46b0-826d-1e6fc849f0aa" containerName="ovs-vswitchd" Dec 02 07:49:05 crc kubenswrapper[4895]: I1202 07:49:05.232088 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-vpwvp" Dec 02 07:49:05 crc kubenswrapper[4895]: I1202 07:49:05.293215 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vpwvp"] Dec 02 07:49:05 crc kubenswrapper[4895]: I1202 07:49:05.473988 4895 patch_prober.go:28] interesting pod/machine-config-daemon-wfcg7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 07:49:05 crc kubenswrapper[4895]: I1202 07:49:05.474074 4895 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 07:49:05 crc kubenswrapper[4895]: I1202 07:49:05.474154 4895 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" Dec 02 07:49:05 crc kubenswrapper[4895]: I1202 07:49:05.476294 4895 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9934068f902577cab2b9f5b749fcea3de9f9afb4c928af19ac4dbad51ebdaa03"} pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 07:49:05 crc kubenswrapper[4895]: I1202 07:49:05.476492 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerName="machine-config-daemon" containerID="cri-o://9934068f902577cab2b9f5b749fcea3de9f9afb4c928af19ac4dbad51ebdaa03" gracePeriod=600 Dec 02 07:49:06 crc kubenswrapper[4895]: E1202 07:49:06.118422 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 07:49:06 crc kubenswrapper[4895]: I1202 07:49:06.229616 4895 generic.go:334] "Generic (PLEG): container finished" podID="0468d2d1-a975-45a6-af9f-47adc432fab0" 
containerID="9934068f902577cab2b9f5b749fcea3de9f9afb4c928af19ac4dbad51ebdaa03" exitCode=0 Dec 02 07:49:06 crc kubenswrapper[4895]: I1202 07:49:06.229685 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" event={"ID":"0468d2d1-a975-45a6-af9f-47adc432fab0","Type":"ContainerDied","Data":"9934068f902577cab2b9f5b749fcea3de9f9afb4c928af19ac4dbad51ebdaa03"} Dec 02 07:49:06 crc kubenswrapper[4895]: I1202 07:49:06.230756 4895 scope.go:117] "RemoveContainer" containerID="a143326e40e351d8dd85edf0fa1f56c57dc56e760d18e0c6ec782a546a0196af" Dec 02 07:49:06 crc kubenswrapper[4895]: I1202 07:49:06.231756 4895 scope.go:117] "RemoveContainer" containerID="9934068f902577cab2b9f5b749fcea3de9f9afb4c928af19ac4dbad51ebdaa03" Dec 02 07:49:06 crc kubenswrapper[4895]: E1202 07:49:06.232607 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 07:49:07 crc kubenswrapper[4895]: I1202 07:49:07.244398 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-vpwvp" podUID="e203ec5f-dd45-44bb-97b2-fd8a548ce231" containerName="registry-server" containerID="cri-o://87ac20d95fd6a459d393ff2c140af98b6f77476df00b59a3a877db168b3478f1" gracePeriod=2 Dec 02 07:49:07 crc kubenswrapper[4895]: I1202 07:49:07.744621 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vpwvp" Dec 02 07:49:07 crc kubenswrapper[4895]: I1202 07:49:07.746892 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e203ec5f-dd45-44bb-97b2-fd8a548ce231-utilities\") pod \"e203ec5f-dd45-44bb-97b2-fd8a548ce231\" (UID: \"e203ec5f-dd45-44bb-97b2-fd8a548ce231\") " Dec 02 07:49:07 crc kubenswrapper[4895]: I1202 07:49:07.747052 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e203ec5f-dd45-44bb-97b2-fd8a548ce231-catalog-content\") pod \"e203ec5f-dd45-44bb-97b2-fd8a548ce231\" (UID: \"e203ec5f-dd45-44bb-97b2-fd8a548ce231\") " Dec 02 07:49:07 crc kubenswrapper[4895]: I1202 07:49:07.747115 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p2nmk\" (UniqueName: \"kubernetes.io/projected/e203ec5f-dd45-44bb-97b2-fd8a548ce231-kube-api-access-p2nmk\") pod \"e203ec5f-dd45-44bb-97b2-fd8a548ce231\" (UID: \"e203ec5f-dd45-44bb-97b2-fd8a548ce231\") " Dec 02 07:49:07 crc kubenswrapper[4895]: I1202 07:49:07.748334 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e203ec5f-dd45-44bb-97b2-fd8a548ce231-utilities" (OuterVolumeSpecName: "utilities") pod "e203ec5f-dd45-44bb-97b2-fd8a548ce231" (UID: "e203ec5f-dd45-44bb-97b2-fd8a548ce231"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:49:07 crc kubenswrapper[4895]: I1202 07:49:07.755778 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e203ec5f-dd45-44bb-97b2-fd8a548ce231-kube-api-access-p2nmk" (OuterVolumeSpecName: "kube-api-access-p2nmk") pod "e203ec5f-dd45-44bb-97b2-fd8a548ce231" (UID: "e203ec5f-dd45-44bb-97b2-fd8a548ce231"). InnerVolumeSpecName "kube-api-access-p2nmk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:49:07 crc kubenswrapper[4895]: I1202 07:49:07.788339 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e203ec5f-dd45-44bb-97b2-fd8a548ce231-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e203ec5f-dd45-44bb-97b2-fd8a548ce231" (UID: "e203ec5f-dd45-44bb-97b2-fd8a548ce231"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:49:07 crc kubenswrapper[4895]: I1202 07:49:07.848521 4895 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e203ec5f-dd45-44bb-97b2-fd8a548ce231-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 07:49:07 crc kubenswrapper[4895]: I1202 07:49:07.848565 4895 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e203ec5f-dd45-44bb-97b2-fd8a548ce231-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 07:49:07 crc kubenswrapper[4895]: I1202 07:49:07.848580 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p2nmk\" (UniqueName: \"kubernetes.io/projected/e203ec5f-dd45-44bb-97b2-fd8a548ce231-kube-api-access-p2nmk\") on node \"crc\" DevicePath \"\"" Dec 02 07:49:08 crc kubenswrapper[4895]: I1202 07:49:08.263041 4895 generic.go:334] "Generic (PLEG): container finished" podID="e203ec5f-dd45-44bb-97b2-fd8a548ce231" containerID="87ac20d95fd6a459d393ff2c140af98b6f77476df00b59a3a877db168b3478f1" exitCode=0 Dec 02 07:49:08 crc kubenswrapper[4895]: I1202 07:49:08.263127 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vpwvp" event={"ID":"e203ec5f-dd45-44bb-97b2-fd8a548ce231","Type":"ContainerDied","Data":"87ac20d95fd6a459d393ff2c140af98b6f77476df00b59a3a877db168b3478f1"} Dec 02 07:49:08 crc kubenswrapper[4895]: I1202 07:49:08.263148 4895 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vpwvp" Dec 02 07:49:08 crc kubenswrapper[4895]: I1202 07:49:08.263192 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vpwvp" event={"ID":"e203ec5f-dd45-44bb-97b2-fd8a548ce231","Type":"ContainerDied","Data":"63e3069f74e0c2f5114d28209eb436a5390bf351b874275c933348a22af2709a"} Dec 02 07:49:08 crc kubenswrapper[4895]: I1202 07:49:08.263233 4895 scope.go:117] "RemoveContainer" containerID="87ac20d95fd6a459d393ff2c140af98b6f77476df00b59a3a877db168b3478f1" Dec 02 07:49:08 crc kubenswrapper[4895]: I1202 07:49:08.331367 4895 scope.go:117] "RemoveContainer" containerID="420c0838ecd0aae08b37769c218f02c3925b9911317837e7d544bfd4a42c3463" Dec 02 07:49:08 crc kubenswrapper[4895]: I1202 07:49:08.340132 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vpwvp"] Dec 02 07:49:08 crc kubenswrapper[4895]: I1202 07:49:08.346625 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-vpwvp"] Dec 02 07:49:08 crc kubenswrapper[4895]: I1202 07:49:08.362350 4895 scope.go:117] "RemoveContainer" containerID="ac87a2d485aad6118b59d6c284310ff45daf6ef1a233203c74c4b8a0fe1c07d3" Dec 02 07:49:08 crc kubenswrapper[4895]: I1202 07:49:08.403784 4895 scope.go:117] "RemoveContainer" containerID="87ac20d95fd6a459d393ff2c140af98b6f77476df00b59a3a877db168b3478f1" Dec 02 07:49:08 crc kubenswrapper[4895]: E1202 07:49:08.404776 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87ac20d95fd6a459d393ff2c140af98b6f77476df00b59a3a877db168b3478f1\": container with ID starting with 87ac20d95fd6a459d393ff2c140af98b6f77476df00b59a3a877db168b3478f1 not found: ID does not exist" containerID="87ac20d95fd6a459d393ff2c140af98b6f77476df00b59a3a877db168b3478f1" Dec 02 07:49:08 crc kubenswrapper[4895]: I1202 07:49:08.404913 4895 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87ac20d95fd6a459d393ff2c140af98b6f77476df00b59a3a877db168b3478f1"} err="failed to get container status \"87ac20d95fd6a459d393ff2c140af98b6f77476df00b59a3a877db168b3478f1\": rpc error: code = NotFound desc = could not find container \"87ac20d95fd6a459d393ff2c140af98b6f77476df00b59a3a877db168b3478f1\": container with ID starting with 87ac20d95fd6a459d393ff2c140af98b6f77476df00b59a3a877db168b3478f1 not found: ID does not exist" Dec 02 07:49:08 crc kubenswrapper[4895]: I1202 07:49:08.405010 4895 scope.go:117] "RemoveContainer" containerID="420c0838ecd0aae08b37769c218f02c3925b9911317837e7d544bfd4a42c3463" Dec 02 07:49:08 crc kubenswrapper[4895]: E1202 07:49:08.405646 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"420c0838ecd0aae08b37769c218f02c3925b9911317837e7d544bfd4a42c3463\": container with ID starting with 420c0838ecd0aae08b37769c218f02c3925b9911317837e7d544bfd4a42c3463 not found: ID does not exist" containerID="420c0838ecd0aae08b37769c218f02c3925b9911317837e7d544bfd4a42c3463" Dec 02 07:49:08 crc kubenswrapper[4895]: I1202 07:49:08.405693 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"420c0838ecd0aae08b37769c218f02c3925b9911317837e7d544bfd4a42c3463"} err="failed to get container status \"420c0838ecd0aae08b37769c218f02c3925b9911317837e7d544bfd4a42c3463\": rpc error: code = NotFound desc = could not find container \"420c0838ecd0aae08b37769c218f02c3925b9911317837e7d544bfd4a42c3463\": container with ID starting with 420c0838ecd0aae08b37769c218f02c3925b9911317837e7d544bfd4a42c3463 not found: ID does not exist" Dec 02 07:49:08 crc kubenswrapper[4895]: I1202 07:49:08.405729 4895 scope.go:117] "RemoveContainer" containerID="ac87a2d485aad6118b59d6c284310ff45daf6ef1a233203c74c4b8a0fe1c07d3" Dec 02 07:49:08 crc kubenswrapper[4895]: E1202 
07:49:08.406025 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac87a2d485aad6118b59d6c284310ff45daf6ef1a233203c74c4b8a0fe1c07d3\": container with ID starting with ac87a2d485aad6118b59d6c284310ff45daf6ef1a233203c74c4b8a0fe1c07d3 not found: ID does not exist" containerID="ac87a2d485aad6118b59d6c284310ff45daf6ef1a233203c74c4b8a0fe1c07d3" Dec 02 07:49:08 crc kubenswrapper[4895]: I1202 07:49:08.406074 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac87a2d485aad6118b59d6c284310ff45daf6ef1a233203c74c4b8a0fe1c07d3"} err="failed to get container status \"ac87a2d485aad6118b59d6c284310ff45daf6ef1a233203c74c4b8a0fe1c07d3\": rpc error: code = NotFound desc = could not find container \"ac87a2d485aad6118b59d6c284310ff45daf6ef1a233203c74c4b8a0fe1c07d3\": container with ID starting with ac87a2d485aad6118b59d6c284310ff45daf6ef1a233203c74c4b8a0fe1c07d3 not found: ID does not exist" Dec 02 07:49:09 crc kubenswrapper[4895]: I1202 07:49:09.172828 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e203ec5f-dd45-44bb-97b2-fd8a548ce231" path="/var/lib/kubelet/pods/e203ec5f-dd45-44bb-97b2-fd8a548ce231/volumes" Dec 02 07:49:09 crc kubenswrapper[4895]: I1202 07:49:09.173694 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-wjx7g" Dec 02 07:49:09 crc kubenswrapper[4895]: I1202 07:49:09.253874 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-wjx7g" Dec 02 07:49:09 crc kubenswrapper[4895]: E1202 07:49:09.871387 4895 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6aa3fa36a02a2278bdcd14541013304e85cab914b53ba0de74342a5fb50d00cd is running failed: container process not found" 
containerID="6aa3fa36a02a2278bdcd14541013304e85cab914b53ba0de74342a5fb50d00cd" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 02 07:49:09 crc kubenswrapper[4895]: E1202 07:49:09.872107 4895 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6aa3fa36a02a2278bdcd14541013304e85cab914b53ba0de74342a5fb50d00cd is running failed: container process not found" containerID="6aa3fa36a02a2278bdcd14541013304e85cab914b53ba0de74342a5fb50d00cd" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 02 07:49:09 crc kubenswrapper[4895]: E1202 07:49:09.872917 4895 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6aa3fa36a02a2278bdcd14541013304e85cab914b53ba0de74342a5fb50d00cd is running failed: container process not found" containerID="6aa3fa36a02a2278bdcd14541013304e85cab914b53ba0de74342a5fb50d00cd" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 02 07:49:09 crc kubenswrapper[4895]: E1202 07:49:09.873027 4895 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6aa3fa36a02a2278bdcd14541013304e85cab914b53ba0de74342a5fb50d00cd is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-9vczq" podUID="6b463255-a237-46b0-826d-1e6fc849f0aa" containerName="ovsdb-server" Dec 02 07:49:09 crc kubenswrapper[4895]: E1202 07:49:09.873460 4895 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="7cc0795476568fadf42dedac218a2fe6065e25675de99e96dafc929b4ec7b9ff" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 02 07:49:09 crc kubenswrapper[4895]: E1202 
07:49:09.877025 4895 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="7cc0795476568fadf42dedac218a2fe6065e25675de99e96dafc929b4ec7b9ff" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 02 07:49:09 crc kubenswrapper[4895]: E1202 07:49:09.881527 4895 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="7cc0795476568fadf42dedac218a2fe6065e25675de99e96dafc929b4ec7b9ff" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 02 07:49:09 crc kubenswrapper[4895]: E1202 07:49:09.881712 4895 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-9vczq" podUID="6b463255-a237-46b0-826d-1e6fc849f0aa" containerName="ovs-vswitchd" Dec 02 07:49:10 crc kubenswrapper[4895]: I1202 07:49:10.003124 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wjx7g"] Dec 02 07:49:10 crc kubenswrapper[4895]: E1202 07:49:10.303701 4895 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 02 07:49:10 crc kubenswrapper[4895]: E1202 07:49:10.303860 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5696e7d9-103a-4bf7-9b05-1959e92cf46a-operator-scripts podName:5696e7d9-103a-4bf7-9b05-1959e92cf46a nodeName:}" failed. No retries permitted until 2025-12-02 07:49:26.303830379 +0000 UTC m=+1577.474690022 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/5696e7d9-103a-4bf7-9b05-1959e92cf46a-operator-scripts") pod "novacell0b7d1-account-delete-wchwk" (UID: "5696e7d9-103a-4bf7-9b05-1959e92cf46a") : configmap "openstack-scripts" not found Dec 02 07:49:10 crc kubenswrapper[4895]: I1202 07:49:10.316975 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-wjx7g" podUID="6ebf9714-5e6d-415c-a0aa-adab0d3e46e9" containerName="registry-server" containerID="cri-o://867d58af9aa78f6e2b186a556b4dd87418cbe07902e00bd2851b918d8af40dcf" gracePeriod=2 Dec 02 07:49:10 crc kubenswrapper[4895]: E1202 07:49:10.609035 4895 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 02 07:49:10 crc kubenswrapper[4895]: E1202 07:49:10.609177 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/96831697-ba2e-477e-954f-e4ad0cf30f92-operator-scripts podName:96831697-ba2e-477e-954f-e4ad0cf30f92 nodeName:}" failed. No retries permitted until 2025-12-02 07:49:26.609145585 +0000 UTC m=+1577.780005238 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/96831697-ba2e-477e-954f-e4ad0cf30f92-operator-scripts") pod "novaapi23cb-account-delete-g8msv" (UID: "96831697-ba2e-477e-954f-e4ad0cf30f92") : configmap "openstack-scripts" not found Dec 02 07:49:11 crc kubenswrapper[4895]: I1202 07:49:11.352772 4895 generic.go:334] "Generic (PLEG): container finished" podID="6ebf9714-5e6d-415c-a0aa-adab0d3e46e9" containerID="867d58af9aa78f6e2b186a556b4dd87418cbe07902e00bd2851b918d8af40dcf" exitCode=0 Dec 02 07:49:11 crc kubenswrapper[4895]: I1202 07:49:11.352841 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wjx7g" event={"ID":"6ebf9714-5e6d-415c-a0aa-adab0d3e46e9","Type":"ContainerDied","Data":"867d58af9aa78f6e2b186a556b4dd87418cbe07902e00bd2851b918d8af40dcf"} Dec 02 07:49:11 crc kubenswrapper[4895]: I1202 07:49:11.403893 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wjx7g" Dec 02 07:49:11 crc kubenswrapper[4895]: I1202 07:49:11.527399 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-swqwh\" (UniqueName: \"kubernetes.io/projected/6ebf9714-5e6d-415c-a0aa-adab0d3e46e9-kube-api-access-swqwh\") pod \"6ebf9714-5e6d-415c-a0aa-adab0d3e46e9\" (UID: \"6ebf9714-5e6d-415c-a0aa-adab0d3e46e9\") " Dec 02 07:49:11 crc kubenswrapper[4895]: I1202 07:49:11.528327 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ebf9714-5e6d-415c-a0aa-adab0d3e46e9-utilities\") pod \"6ebf9714-5e6d-415c-a0aa-adab0d3e46e9\" (UID: \"6ebf9714-5e6d-415c-a0aa-adab0d3e46e9\") " Dec 02 07:49:11 crc kubenswrapper[4895]: I1202 07:49:11.528549 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/6ebf9714-5e6d-415c-a0aa-adab0d3e46e9-catalog-content\") pod \"6ebf9714-5e6d-415c-a0aa-adab0d3e46e9\" (UID: \"6ebf9714-5e6d-415c-a0aa-adab0d3e46e9\") " Dec 02 07:49:11 crc kubenswrapper[4895]: I1202 07:49:11.529458 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ebf9714-5e6d-415c-a0aa-adab0d3e46e9-utilities" (OuterVolumeSpecName: "utilities") pod "6ebf9714-5e6d-415c-a0aa-adab0d3e46e9" (UID: "6ebf9714-5e6d-415c-a0aa-adab0d3e46e9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:49:11 crc kubenswrapper[4895]: I1202 07:49:11.535162 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ebf9714-5e6d-415c-a0aa-adab0d3e46e9-kube-api-access-swqwh" (OuterVolumeSpecName: "kube-api-access-swqwh") pod "6ebf9714-5e6d-415c-a0aa-adab0d3e46e9" (UID: "6ebf9714-5e6d-415c-a0aa-adab0d3e46e9"). InnerVolumeSpecName "kube-api-access-swqwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:49:11 crc kubenswrapper[4895]: I1202 07:49:11.589471 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ebf9714-5e6d-415c-a0aa-adab0d3e46e9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6ebf9714-5e6d-415c-a0aa-adab0d3e46e9" (UID: "6ebf9714-5e6d-415c-a0aa-adab0d3e46e9"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:49:11 crc kubenswrapper[4895]: I1202 07:49:11.630568 4895 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ebf9714-5e6d-415c-a0aa-adab0d3e46e9-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 07:49:11 crc kubenswrapper[4895]: I1202 07:49:11.630641 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-swqwh\" (UniqueName: \"kubernetes.io/projected/6ebf9714-5e6d-415c-a0aa-adab0d3e46e9-kube-api-access-swqwh\") on node \"crc\" DevicePath \"\"" Dec 02 07:49:11 crc kubenswrapper[4895]: I1202 07:49:11.630664 4895 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ebf9714-5e6d-415c-a0aa-adab0d3e46e9-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 07:49:12 crc kubenswrapper[4895]: I1202 07:49:12.374312 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wjx7g" event={"ID":"6ebf9714-5e6d-415c-a0aa-adab0d3e46e9","Type":"ContainerDied","Data":"a22f4c905d0801dd52d8abeba3d1d1cda84aa90931f8f427a2eaefb256ac2937"} Dec 02 07:49:12 crc kubenswrapper[4895]: I1202 07:49:12.374477 4895 scope.go:117] "RemoveContainer" containerID="867d58af9aa78f6e2b186a556b4dd87418cbe07902e00bd2851b918d8af40dcf" Dec 02 07:49:12 crc kubenswrapper[4895]: I1202 07:49:12.374860 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wjx7g" Dec 02 07:49:12 crc kubenswrapper[4895]: I1202 07:49:12.420372 4895 scope.go:117] "RemoveContainer" containerID="4dc420dbf673d00f97311d57a4404d16d8b6c032b5b89c29e8505019899d42c9" Dec 02 07:49:12 crc kubenswrapper[4895]: I1202 07:49:12.452679 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wjx7g"] Dec 02 07:49:12 crc kubenswrapper[4895]: I1202 07:49:12.460879 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-wjx7g"] Dec 02 07:49:12 crc kubenswrapper[4895]: I1202 07:49:12.473517 4895 scope.go:117] "RemoveContainer" containerID="cc3ec4d62ef18a3145302b1f913c2b11bc95cfa5e826aece7c00bbdc8aea0e34" Dec 02 07:49:13 crc kubenswrapper[4895]: I1202 07:49:13.158543 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ebf9714-5e6d-415c-a0aa-adab0d3e46e9" path="/var/lib/kubelet/pods/6ebf9714-5e6d-415c-a0aa-adab0d3e46e9/volumes" Dec 02 07:49:14 crc kubenswrapper[4895]: E1202 07:49:14.871688 4895 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6aa3fa36a02a2278bdcd14541013304e85cab914b53ba0de74342a5fb50d00cd is running failed: container process not found" containerID="6aa3fa36a02a2278bdcd14541013304e85cab914b53ba0de74342a5fb50d00cd" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 02 07:49:14 crc kubenswrapper[4895]: E1202 07:49:14.873435 4895 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="7cc0795476568fadf42dedac218a2fe6065e25675de99e96dafc929b4ec7b9ff" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 02 07:49:14 crc kubenswrapper[4895]: E1202 07:49:14.873583 4895 log.go:32] 
"ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6aa3fa36a02a2278bdcd14541013304e85cab914b53ba0de74342a5fb50d00cd is running failed: container process not found" containerID="6aa3fa36a02a2278bdcd14541013304e85cab914b53ba0de74342a5fb50d00cd" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 02 07:49:14 crc kubenswrapper[4895]: E1202 07:49:14.874021 4895 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6aa3fa36a02a2278bdcd14541013304e85cab914b53ba0de74342a5fb50d00cd is running failed: container process not found" containerID="6aa3fa36a02a2278bdcd14541013304e85cab914b53ba0de74342a5fb50d00cd" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 02 07:49:14 crc kubenswrapper[4895]: E1202 07:49:14.874049 4895 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6aa3fa36a02a2278bdcd14541013304e85cab914b53ba0de74342a5fb50d00cd is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-9vczq" podUID="6b463255-a237-46b0-826d-1e6fc849f0aa" containerName="ovsdb-server" Dec 02 07:49:14 crc kubenswrapper[4895]: E1202 07:49:14.875232 4895 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="7cc0795476568fadf42dedac218a2fe6065e25675de99e96dafc929b4ec7b9ff" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 02 07:49:14 crc kubenswrapper[4895]: E1202 07:49:14.878130 4895 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="7cc0795476568fadf42dedac218a2fe6065e25675de99e96dafc929b4ec7b9ff" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 02 07:49:14 crc kubenswrapper[4895]: E1202 07:49:14.878167 4895 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-9vczq" podUID="6b463255-a237-46b0-826d-1e6fc849f0aa" containerName="ovs-vswitchd" Dec 02 07:49:17 crc kubenswrapper[4895]: I1202 07:49:17.046415 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-9vczq_6b463255-a237-46b0-826d-1e6fc849f0aa/ovs-vswitchd/0.log" Dec 02 07:49:17 crc kubenswrapper[4895]: I1202 07:49:17.048481 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-9vczq" Dec 02 07:49:17 crc kubenswrapper[4895]: I1202 07:49:17.159212 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Dec 02 07:49:17 crc kubenswrapper[4895]: I1202 07:49:17.180475 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8vwv6\" (UniqueName: \"kubernetes.io/projected/6b463255-a237-46b0-826d-1e6fc849f0aa-kube-api-access-8vwv6\") pod \"6b463255-a237-46b0-826d-1e6fc849f0aa\" (UID: \"6b463255-a237-46b0-826d-1e6fc849f0aa\") " Dec 02 07:49:17 crc kubenswrapper[4895]: I1202 07:49:17.180535 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6b463255-a237-46b0-826d-1e6fc849f0aa-scripts\") pod \"6b463255-a237-46b0-826d-1e6fc849f0aa\" (UID: \"6b463255-a237-46b0-826d-1e6fc849f0aa\") " Dec 02 07:49:17 crc kubenswrapper[4895]: I1202 07:49:17.180648 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/6b463255-a237-46b0-826d-1e6fc849f0aa-etc-ovs\") pod \"6b463255-a237-46b0-826d-1e6fc849f0aa\" (UID: \"6b463255-a237-46b0-826d-1e6fc849f0aa\") " Dec 02 07:49:17 crc kubenswrapper[4895]: I1202 07:49:17.180673 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/6b463255-a237-46b0-826d-1e6fc849f0aa-var-lib\") pod \"6b463255-a237-46b0-826d-1e6fc849f0aa\" (UID: \"6b463255-a237-46b0-826d-1e6fc849f0aa\") " Dec 02 07:49:17 crc kubenswrapper[4895]: I1202 07:49:17.180780 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6b463255-a237-46b0-826d-1e6fc849f0aa-var-run\") pod \"6b463255-a237-46b0-826d-1e6fc849f0aa\" (UID: \"6b463255-a237-46b0-826d-1e6fc849f0aa\") " Dec 02 07:49:17 crc kubenswrapper[4895]: I1202 07:49:17.180908 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: 
\"kubernetes.io/host-path/6b463255-a237-46b0-826d-1e6fc849f0aa-var-log\") pod \"6b463255-a237-46b0-826d-1e6fc849f0aa\" (UID: \"6b463255-a237-46b0-826d-1e6fc849f0aa\") " Dec 02 07:49:17 crc kubenswrapper[4895]: I1202 07:49:17.180769 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6b463255-a237-46b0-826d-1e6fc849f0aa-etc-ovs" (OuterVolumeSpecName: "etc-ovs") pod "6b463255-a237-46b0-826d-1e6fc849f0aa" (UID: "6b463255-a237-46b0-826d-1e6fc849f0aa"). InnerVolumeSpecName "etc-ovs". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 07:49:17 crc kubenswrapper[4895]: I1202 07:49:17.180808 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6b463255-a237-46b0-826d-1e6fc849f0aa-var-lib" (OuterVolumeSpecName: "var-lib") pod "6b463255-a237-46b0-826d-1e6fc849f0aa" (UID: "6b463255-a237-46b0-826d-1e6fc849f0aa"). InnerVolumeSpecName "var-lib". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 07:49:17 crc kubenswrapper[4895]: I1202 07:49:17.180837 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6b463255-a237-46b0-826d-1e6fc849f0aa-var-run" (OuterVolumeSpecName: "var-run") pod "6b463255-a237-46b0-826d-1e6fc849f0aa" (UID: "6b463255-a237-46b0-826d-1e6fc849f0aa"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 07:49:17 crc kubenswrapper[4895]: I1202 07:49:17.181180 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6b463255-a237-46b0-826d-1e6fc849f0aa-var-log" (OuterVolumeSpecName: "var-log") pod "6b463255-a237-46b0-826d-1e6fc849f0aa" (UID: "6b463255-a237-46b0-826d-1e6fc849f0aa"). InnerVolumeSpecName "var-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 07:49:17 crc kubenswrapper[4895]: I1202 07:49:17.182202 4895 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/6b463255-a237-46b0-826d-1e6fc849f0aa-var-log\") on node \"crc\" DevicePath \"\"" Dec 02 07:49:17 crc kubenswrapper[4895]: I1202 07:49:17.182224 4895 reconciler_common.go:293] "Volume detached for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/6b463255-a237-46b0-826d-1e6fc849f0aa-etc-ovs\") on node \"crc\" DevicePath \"\"" Dec 02 07:49:17 crc kubenswrapper[4895]: I1202 07:49:17.182237 4895 reconciler_common.go:293] "Volume detached for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/6b463255-a237-46b0-826d-1e6fc849f0aa-var-lib\") on node \"crc\" DevicePath \"\"" Dec 02 07:49:17 crc kubenswrapper[4895]: I1202 07:49:17.182252 4895 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6b463255-a237-46b0-826d-1e6fc849f0aa-var-run\") on node \"crc\" DevicePath \"\"" Dec 02 07:49:17 crc kubenswrapper[4895]: I1202 07:49:17.182464 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b463255-a237-46b0-826d-1e6fc849f0aa-scripts" (OuterVolumeSpecName: "scripts") pod "6b463255-a237-46b0-826d-1e6fc849f0aa" (UID: "6b463255-a237-46b0-826d-1e6fc849f0aa"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:49:17 crc kubenswrapper[4895]: I1202 07:49:17.190561 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b463255-a237-46b0-826d-1e6fc849f0aa-kube-api-access-8vwv6" (OuterVolumeSpecName: "kube-api-access-8vwv6") pod "6b463255-a237-46b0-826d-1e6fc849f0aa" (UID: "6b463255-a237-46b0-826d-1e6fc849f0aa"). InnerVolumeSpecName "kube-api-access-8vwv6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:49:17 crc kubenswrapper[4895]: I1202 07:49:17.283579 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swift\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"11b8ece5-4192-4e13-a1c7-86ed3c627ddf\" (UID: \"11b8ece5-4192-4e13-a1c7-86ed3c627ddf\") " Dec 02 07:49:17 crc kubenswrapper[4895]: I1202 07:49:17.283734 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/11b8ece5-4192-4e13-a1c7-86ed3c627ddf-lock\") pod \"11b8ece5-4192-4e13-a1c7-86ed3c627ddf\" (UID: \"11b8ece5-4192-4e13-a1c7-86ed3c627ddf\") " Dec 02 07:49:17 crc kubenswrapper[4895]: I1202 07:49:17.283828 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pz7qf\" (UniqueName: \"kubernetes.io/projected/11b8ece5-4192-4e13-a1c7-86ed3c627ddf-kube-api-access-pz7qf\") pod \"11b8ece5-4192-4e13-a1c7-86ed3c627ddf\" (UID: \"11b8ece5-4192-4e13-a1c7-86ed3c627ddf\") " Dec 02 07:49:17 crc kubenswrapper[4895]: I1202 07:49:17.283852 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/11b8ece5-4192-4e13-a1c7-86ed3c627ddf-cache\") pod \"11b8ece5-4192-4e13-a1c7-86ed3c627ddf\" (UID: \"11b8ece5-4192-4e13-a1c7-86ed3c627ddf\") " Dec 02 07:49:17 crc kubenswrapper[4895]: I1202 07:49:17.283893 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/11b8ece5-4192-4e13-a1c7-86ed3c627ddf-etc-swift\") pod \"11b8ece5-4192-4e13-a1c7-86ed3c627ddf\" (UID: \"11b8ece5-4192-4e13-a1c7-86ed3c627ddf\") " Dec 02 07:49:17 crc kubenswrapper[4895]: I1202 07:49:17.284259 4895 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6b463255-a237-46b0-826d-1e6fc849f0aa-scripts\") on node \"crc\" 
DevicePath \"\"" Dec 02 07:49:17 crc kubenswrapper[4895]: I1202 07:49:17.284287 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8vwv6\" (UniqueName: \"kubernetes.io/projected/6b463255-a237-46b0-826d-1e6fc849f0aa-kube-api-access-8vwv6\") on node \"crc\" DevicePath \"\"" Dec 02 07:49:17 crc kubenswrapper[4895]: I1202 07:49:17.285157 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/11b8ece5-4192-4e13-a1c7-86ed3c627ddf-lock" (OuterVolumeSpecName: "lock") pod "11b8ece5-4192-4e13-a1c7-86ed3c627ddf" (UID: "11b8ece5-4192-4e13-a1c7-86ed3c627ddf"). InnerVolumeSpecName "lock". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:49:17 crc kubenswrapper[4895]: I1202 07:49:17.285196 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/11b8ece5-4192-4e13-a1c7-86ed3c627ddf-cache" (OuterVolumeSpecName: "cache") pod "11b8ece5-4192-4e13-a1c7-86ed3c627ddf" (UID: "11b8ece5-4192-4e13-a1c7-86ed3c627ddf"). InnerVolumeSpecName "cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:49:17 crc kubenswrapper[4895]: I1202 07:49:17.288666 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11b8ece5-4192-4e13-a1c7-86ed3c627ddf-kube-api-access-pz7qf" (OuterVolumeSpecName: "kube-api-access-pz7qf") pod "11b8ece5-4192-4e13-a1c7-86ed3c627ddf" (UID: "11b8ece5-4192-4e13-a1c7-86ed3c627ddf"). InnerVolumeSpecName "kube-api-access-pz7qf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:49:17 crc kubenswrapper[4895]: I1202 07:49:17.289237 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "swift") pod "11b8ece5-4192-4e13-a1c7-86ed3c627ddf" (UID: "11b8ece5-4192-4e13-a1c7-86ed3c627ddf"). InnerVolumeSpecName "local-storage06-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 02 07:49:17 crc kubenswrapper[4895]: I1202 07:49:17.289469 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11b8ece5-4192-4e13-a1c7-86ed3c627ddf-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "11b8ece5-4192-4e13-a1c7-86ed3c627ddf" (UID: "11b8ece5-4192-4e13-a1c7-86ed3c627ddf"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:49:17 crc kubenswrapper[4895]: I1202 07:49:17.386621 4895 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Dec 02 07:49:17 crc kubenswrapper[4895]: I1202 07:49:17.386695 4895 reconciler_common.go:293] "Volume detached for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/11b8ece5-4192-4e13-a1c7-86ed3c627ddf-lock\") on node \"crc\" DevicePath \"\"" Dec 02 07:49:17 crc kubenswrapper[4895]: I1202 07:49:17.386721 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pz7qf\" (UniqueName: \"kubernetes.io/projected/11b8ece5-4192-4e13-a1c7-86ed3c627ddf-kube-api-access-pz7qf\") on node \"crc\" DevicePath \"\"" Dec 02 07:49:17 crc kubenswrapper[4895]: I1202 07:49:17.386765 4895 reconciler_common.go:293] "Volume detached for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/11b8ece5-4192-4e13-a1c7-86ed3c627ddf-cache\") on node \"crc\" DevicePath \"\"" Dec 02 07:49:17 crc kubenswrapper[4895]: I1202 07:49:17.386784 4895 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/11b8ece5-4192-4e13-a1c7-86ed3c627ddf-etc-swift\") on node \"crc\" DevicePath \"\"" Dec 02 07:49:17 crc kubenswrapper[4895]: I1202 07:49:17.418718 4895 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on 
node "crc" Dec 02 07:49:17 crc kubenswrapper[4895]: I1202 07:49:17.464601 4895 generic.go:334] "Generic (PLEG): container finished" podID="11b8ece5-4192-4e13-a1c7-86ed3c627ddf" containerID="7c1e2c74168dac752cdee201c2e0c1b2faf7132d8e553780fa6dba40aeeeaa1e" exitCode=137 Dec 02 07:49:17 crc kubenswrapper[4895]: I1202 07:49:17.464731 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"11b8ece5-4192-4e13-a1c7-86ed3c627ddf","Type":"ContainerDied","Data":"7c1e2c74168dac752cdee201c2e0c1b2faf7132d8e553780fa6dba40aeeeaa1e"} Dec 02 07:49:17 crc kubenswrapper[4895]: I1202 07:49:17.464818 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"11b8ece5-4192-4e13-a1c7-86ed3c627ddf","Type":"ContainerDied","Data":"cfa04424dbc0599f02e0955508bdf471dbc21a51487954c16d2f02e8491eeb11"} Dec 02 07:49:17 crc kubenswrapper[4895]: I1202 07:49:17.464857 4895 scope.go:117] "RemoveContainer" containerID="7c1e2c74168dac752cdee201c2e0c1b2faf7132d8e553780fa6dba40aeeeaa1e" Dec 02 07:49:17 crc kubenswrapper[4895]: I1202 07:49:17.465535 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Dec 02 07:49:17 crc kubenswrapper[4895]: I1202 07:49:17.469083 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-9vczq_6b463255-a237-46b0-826d-1e6fc849f0aa/ovs-vswitchd/0.log" Dec 02 07:49:17 crc kubenswrapper[4895]: I1202 07:49:17.470921 4895 generic.go:334] "Generic (PLEG): container finished" podID="6b463255-a237-46b0-826d-1e6fc849f0aa" containerID="7cc0795476568fadf42dedac218a2fe6065e25675de99e96dafc929b4ec7b9ff" exitCode=137 Dec 02 07:49:17 crc kubenswrapper[4895]: I1202 07:49:17.470999 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-9vczq" event={"ID":"6b463255-a237-46b0-826d-1e6fc849f0aa","Type":"ContainerDied","Data":"7cc0795476568fadf42dedac218a2fe6065e25675de99e96dafc929b4ec7b9ff"} Dec 02 07:49:17 crc kubenswrapper[4895]: I1202 07:49:17.471066 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-9vczq" event={"ID":"6b463255-a237-46b0-826d-1e6fc849f0aa","Type":"ContainerDied","Data":"25ce4a1dff9b5389e15afb76acc5a9e737daad17a6b331ac30f0a60b0dd0a16e"} Dec 02 07:49:17 crc kubenswrapper[4895]: I1202 07:49:17.471213 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-9vczq" Dec 02 07:49:17 crc kubenswrapper[4895]: I1202 07:49:17.488532 4895 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Dec 02 07:49:17 crc kubenswrapper[4895]: I1202 07:49:17.499777 4895 scope.go:117] "RemoveContainer" containerID="5ced108d1ab8442c1fac1fe0fc3c7939f98a90c737db7a6f1aced0c0edb070a4" Dec 02 07:49:17 crc kubenswrapper[4895]: I1202 07:49:17.537784 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-9vczq"] Dec 02 07:49:17 crc kubenswrapper[4895]: I1202 07:49:17.545911 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-ovs-9vczq"] Dec 02 07:49:17 crc kubenswrapper[4895]: I1202 07:49:17.546504 4895 scope.go:117] "RemoveContainer" containerID="a772c3088bf7934e4656b200f802e6851dd25d0e8355f14cbbc3a035463513c0" Dec 02 07:49:17 crc kubenswrapper[4895]: I1202 07:49:17.556865 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"] Dec 02 07:49:17 crc kubenswrapper[4895]: I1202 07:49:17.565827 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-storage-0"] Dec 02 07:49:17 crc kubenswrapper[4895]: I1202 07:49:17.574379 4895 scope.go:117] "RemoveContainer" containerID="81270f78df9b0d8cee1ae380d9bd934ce978faa7f9860ae475f4316e91185bda" Dec 02 07:49:17 crc kubenswrapper[4895]: I1202 07:49:17.628862 4895 scope.go:117] "RemoveContainer" containerID="79658c290b2b8a920d6b1879c4cfd278d8b997d5f9b72c39a6d6c7310f20f615" Dec 02 07:49:17 crc kubenswrapper[4895]: I1202 07:49:17.650263 4895 scope.go:117] "RemoveContainer" containerID="d0ab0e1bcdef49a9178a125186928df5512dbbec58106b2c12e7d8acd9b931e4" Dec 02 07:49:17 crc kubenswrapper[4895]: I1202 07:49:17.672203 4895 scope.go:117] "RemoveContainer" 
containerID="d16cb117b475cbe7eca7173bb117167934dc524dc42dabe3df6e81fc2b6e379b" Dec 02 07:49:17 crc kubenswrapper[4895]: I1202 07:49:17.693671 4895 scope.go:117] "RemoveContainer" containerID="888d3356ae1d79bdd97a607512a80b88b92e4f4d410a00d50371b35e64a5142d" Dec 02 07:49:17 crc kubenswrapper[4895]: I1202 07:49:17.720856 4895 scope.go:117] "RemoveContainer" containerID="e3b88de13161d2c3c54de60370d9bf827fce15fa277067e0d26ffdbf3decddf8" Dec 02 07:49:17 crc kubenswrapper[4895]: I1202 07:49:17.744708 4895 scope.go:117] "RemoveContainer" containerID="089c0b3d1c4ddc2fa892f504974b05872110b8d4c58cb70cafbaa74e93b8f452" Dec 02 07:49:17 crc kubenswrapper[4895]: I1202 07:49:17.766481 4895 scope.go:117] "RemoveContainer" containerID="fc2eeaa58100482b1a1ad56b93b5adeb32cce704c0f70987468501c900cb3962" Dec 02 07:49:17 crc kubenswrapper[4895]: I1202 07:49:17.790374 4895 scope.go:117] "RemoveContainer" containerID="f6567d126c5ab9260bebb4b4a822d71e559ec543cfdf3fa7202150e3115569cc" Dec 02 07:49:17 crc kubenswrapper[4895]: I1202 07:49:17.832845 4895 scope.go:117] "RemoveContainer" containerID="0b8c5691b63ae4c345789ff614121edd0fbac8d28ec4dd714cbad15af4ead78a" Dec 02 07:49:17 crc kubenswrapper[4895]: I1202 07:49:17.861400 4895 scope.go:117] "RemoveContainer" containerID="0b782ab48a476bfcb22366e4a8e52dc20222254cc4ec4ca87a05e85213f1e6e8" Dec 02 07:49:17 crc kubenswrapper[4895]: I1202 07:49:17.898108 4895 scope.go:117] "RemoveContainer" containerID="8d7d30533f5cf82d2d0c96a4a07759e65bd3a99d6c9ea5aff2ebef3f2b8c14c4" Dec 02 07:49:17 crc kubenswrapper[4895]: I1202 07:49:17.934231 4895 scope.go:117] "RemoveContainer" containerID="7c1e2c74168dac752cdee201c2e0c1b2faf7132d8e553780fa6dba40aeeeaa1e" Dec 02 07:49:17 crc kubenswrapper[4895]: E1202 07:49:17.935270 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c1e2c74168dac752cdee201c2e0c1b2faf7132d8e553780fa6dba40aeeeaa1e\": container with ID starting with 
7c1e2c74168dac752cdee201c2e0c1b2faf7132d8e553780fa6dba40aeeeaa1e not found: ID does not exist" containerID="7c1e2c74168dac752cdee201c2e0c1b2faf7132d8e553780fa6dba40aeeeaa1e" Dec 02 07:49:17 crc kubenswrapper[4895]: I1202 07:49:17.935345 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c1e2c74168dac752cdee201c2e0c1b2faf7132d8e553780fa6dba40aeeeaa1e"} err="failed to get container status \"7c1e2c74168dac752cdee201c2e0c1b2faf7132d8e553780fa6dba40aeeeaa1e\": rpc error: code = NotFound desc = could not find container \"7c1e2c74168dac752cdee201c2e0c1b2faf7132d8e553780fa6dba40aeeeaa1e\": container with ID starting with 7c1e2c74168dac752cdee201c2e0c1b2faf7132d8e553780fa6dba40aeeeaa1e not found: ID does not exist" Dec 02 07:49:17 crc kubenswrapper[4895]: I1202 07:49:17.935398 4895 scope.go:117] "RemoveContainer" containerID="5ced108d1ab8442c1fac1fe0fc3c7939f98a90c737db7a6f1aced0c0edb070a4" Dec 02 07:49:17 crc kubenswrapper[4895]: E1202 07:49:17.936060 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ced108d1ab8442c1fac1fe0fc3c7939f98a90c737db7a6f1aced0c0edb070a4\": container with ID starting with 5ced108d1ab8442c1fac1fe0fc3c7939f98a90c737db7a6f1aced0c0edb070a4 not found: ID does not exist" containerID="5ced108d1ab8442c1fac1fe0fc3c7939f98a90c737db7a6f1aced0c0edb070a4" Dec 02 07:49:17 crc kubenswrapper[4895]: I1202 07:49:17.936126 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ced108d1ab8442c1fac1fe0fc3c7939f98a90c737db7a6f1aced0c0edb070a4"} err="failed to get container status \"5ced108d1ab8442c1fac1fe0fc3c7939f98a90c737db7a6f1aced0c0edb070a4\": rpc error: code = NotFound desc = could not find container \"5ced108d1ab8442c1fac1fe0fc3c7939f98a90c737db7a6f1aced0c0edb070a4\": container with ID starting with 5ced108d1ab8442c1fac1fe0fc3c7939f98a90c737db7a6f1aced0c0edb070a4 not found: ID does not 
exist" Dec 02 07:49:17 crc kubenswrapper[4895]: I1202 07:49:17.936223 4895 scope.go:117] "RemoveContainer" containerID="a772c3088bf7934e4656b200f802e6851dd25d0e8355f14cbbc3a035463513c0" Dec 02 07:49:17 crc kubenswrapper[4895]: E1202 07:49:17.936671 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a772c3088bf7934e4656b200f802e6851dd25d0e8355f14cbbc3a035463513c0\": container with ID starting with a772c3088bf7934e4656b200f802e6851dd25d0e8355f14cbbc3a035463513c0 not found: ID does not exist" containerID="a772c3088bf7934e4656b200f802e6851dd25d0e8355f14cbbc3a035463513c0" Dec 02 07:49:17 crc kubenswrapper[4895]: I1202 07:49:17.936706 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a772c3088bf7934e4656b200f802e6851dd25d0e8355f14cbbc3a035463513c0"} err="failed to get container status \"a772c3088bf7934e4656b200f802e6851dd25d0e8355f14cbbc3a035463513c0\": rpc error: code = NotFound desc = could not find container \"a772c3088bf7934e4656b200f802e6851dd25d0e8355f14cbbc3a035463513c0\": container with ID starting with a772c3088bf7934e4656b200f802e6851dd25d0e8355f14cbbc3a035463513c0 not found: ID does not exist" Dec 02 07:49:17 crc kubenswrapper[4895]: I1202 07:49:17.936957 4895 scope.go:117] "RemoveContainer" containerID="81270f78df9b0d8cee1ae380d9bd934ce978faa7f9860ae475f4316e91185bda" Dec 02 07:49:17 crc kubenswrapper[4895]: E1202 07:49:17.937695 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"81270f78df9b0d8cee1ae380d9bd934ce978faa7f9860ae475f4316e91185bda\": container with ID starting with 81270f78df9b0d8cee1ae380d9bd934ce978faa7f9860ae475f4316e91185bda not found: ID does not exist" containerID="81270f78df9b0d8cee1ae380d9bd934ce978faa7f9860ae475f4316e91185bda" Dec 02 07:49:17 crc kubenswrapper[4895]: I1202 07:49:17.937781 4895 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81270f78df9b0d8cee1ae380d9bd934ce978faa7f9860ae475f4316e91185bda"} err="failed to get container status \"81270f78df9b0d8cee1ae380d9bd934ce978faa7f9860ae475f4316e91185bda\": rpc error: code = NotFound desc = could not find container \"81270f78df9b0d8cee1ae380d9bd934ce978faa7f9860ae475f4316e91185bda\": container with ID starting with 81270f78df9b0d8cee1ae380d9bd934ce978faa7f9860ae475f4316e91185bda not found: ID does not exist" Dec 02 07:49:17 crc kubenswrapper[4895]: I1202 07:49:17.937828 4895 scope.go:117] "RemoveContainer" containerID="79658c290b2b8a920d6b1879c4cfd278d8b997d5f9b72c39a6d6c7310f20f615" Dec 02 07:49:17 crc kubenswrapper[4895]: E1202 07:49:17.938548 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79658c290b2b8a920d6b1879c4cfd278d8b997d5f9b72c39a6d6c7310f20f615\": container with ID starting with 79658c290b2b8a920d6b1879c4cfd278d8b997d5f9b72c39a6d6c7310f20f615 not found: ID does not exist" containerID="79658c290b2b8a920d6b1879c4cfd278d8b997d5f9b72c39a6d6c7310f20f615" Dec 02 07:49:17 crc kubenswrapper[4895]: I1202 07:49:17.938584 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79658c290b2b8a920d6b1879c4cfd278d8b997d5f9b72c39a6d6c7310f20f615"} err="failed to get container status \"79658c290b2b8a920d6b1879c4cfd278d8b997d5f9b72c39a6d6c7310f20f615\": rpc error: code = NotFound desc = could not find container \"79658c290b2b8a920d6b1879c4cfd278d8b997d5f9b72c39a6d6c7310f20f615\": container with ID starting with 79658c290b2b8a920d6b1879c4cfd278d8b997d5f9b72c39a6d6c7310f20f615 not found: ID does not exist" Dec 02 07:49:17 crc kubenswrapper[4895]: I1202 07:49:17.938605 4895 scope.go:117] "RemoveContainer" containerID="d0ab0e1bcdef49a9178a125186928df5512dbbec58106b2c12e7d8acd9b931e4" Dec 02 07:49:17 crc kubenswrapper[4895]: E1202 07:49:17.939186 4895 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d0ab0e1bcdef49a9178a125186928df5512dbbec58106b2c12e7d8acd9b931e4\": container with ID starting with d0ab0e1bcdef49a9178a125186928df5512dbbec58106b2c12e7d8acd9b931e4 not found: ID does not exist" containerID="d0ab0e1bcdef49a9178a125186928df5512dbbec58106b2c12e7d8acd9b931e4" Dec 02 07:49:17 crc kubenswrapper[4895]: I1202 07:49:17.939281 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0ab0e1bcdef49a9178a125186928df5512dbbec58106b2c12e7d8acd9b931e4"} err="failed to get container status \"d0ab0e1bcdef49a9178a125186928df5512dbbec58106b2c12e7d8acd9b931e4\": rpc error: code = NotFound desc = could not find container \"d0ab0e1bcdef49a9178a125186928df5512dbbec58106b2c12e7d8acd9b931e4\": container with ID starting with d0ab0e1bcdef49a9178a125186928df5512dbbec58106b2c12e7d8acd9b931e4 not found: ID does not exist" Dec 02 07:49:17 crc kubenswrapper[4895]: I1202 07:49:17.939301 4895 scope.go:117] "RemoveContainer" containerID="d16cb117b475cbe7eca7173bb117167934dc524dc42dabe3df6e81fc2b6e379b" Dec 02 07:49:17 crc kubenswrapper[4895]: E1202 07:49:17.939819 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d16cb117b475cbe7eca7173bb117167934dc524dc42dabe3df6e81fc2b6e379b\": container with ID starting with d16cb117b475cbe7eca7173bb117167934dc524dc42dabe3df6e81fc2b6e379b not found: ID does not exist" containerID="d16cb117b475cbe7eca7173bb117167934dc524dc42dabe3df6e81fc2b6e379b" Dec 02 07:49:17 crc kubenswrapper[4895]: I1202 07:49:17.939846 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d16cb117b475cbe7eca7173bb117167934dc524dc42dabe3df6e81fc2b6e379b"} err="failed to get container status \"d16cb117b475cbe7eca7173bb117167934dc524dc42dabe3df6e81fc2b6e379b\": rpc error: code = NotFound desc = could 
not find container \"d16cb117b475cbe7eca7173bb117167934dc524dc42dabe3df6e81fc2b6e379b\": container with ID starting with d16cb117b475cbe7eca7173bb117167934dc524dc42dabe3df6e81fc2b6e379b not found: ID does not exist" Dec 02 07:49:17 crc kubenswrapper[4895]: I1202 07:49:17.939862 4895 scope.go:117] "RemoveContainer" containerID="888d3356ae1d79bdd97a607512a80b88b92e4f4d410a00d50371b35e64a5142d" Dec 02 07:49:17 crc kubenswrapper[4895]: E1202 07:49:17.940415 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"888d3356ae1d79bdd97a607512a80b88b92e4f4d410a00d50371b35e64a5142d\": container with ID starting with 888d3356ae1d79bdd97a607512a80b88b92e4f4d410a00d50371b35e64a5142d not found: ID does not exist" containerID="888d3356ae1d79bdd97a607512a80b88b92e4f4d410a00d50371b35e64a5142d" Dec 02 07:49:17 crc kubenswrapper[4895]: I1202 07:49:17.940481 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"888d3356ae1d79bdd97a607512a80b88b92e4f4d410a00d50371b35e64a5142d"} err="failed to get container status \"888d3356ae1d79bdd97a607512a80b88b92e4f4d410a00d50371b35e64a5142d\": rpc error: code = NotFound desc = could not find container \"888d3356ae1d79bdd97a607512a80b88b92e4f4d410a00d50371b35e64a5142d\": container with ID starting with 888d3356ae1d79bdd97a607512a80b88b92e4f4d410a00d50371b35e64a5142d not found: ID does not exist" Dec 02 07:49:17 crc kubenswrapper[4895]: I1202 07:49:17.940523 4895 scope.go:117] "RemoveContainer" containerID="e3b88de13161d2c3c54de60370d9bf827fce15fa277067e0d26ffdbf3decddf8" Dec 02 07:49:17 crc kubenswrapper[4895]: E1202 07:49:17.940972 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3b88de13161d2c3c54de60370d9bf827fce15fa277067e0d26ffdbf3decddf8\": container with ID starting with e3b88de13161d2c3c54de60370d9bf827fce15fa277067e0d26ffdbf3decddf8 not found: 
ID does not exist" containerID="e3b88de13161d2c3c54de60370d9bf827fce15fa277067e0d26ffdbf3decddf8" Dec 02 07:49:17 crc kubenswrapper[4895]: I1202 07:49:17.941061 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3b88de13161d2c3c54de60370d9bf827fce15fa277067e0d26ffdbf3decddf8"} err="failed to get container status \"e3b88de13161d2c3c54de60370d9bf827fce15fa277067e0d26ffdbf3decddf8\": rpc error: code = NotFound desc = could not find container \"e3b88de13161d2c3c54de60370d9bf827fce15fa277067e0d26ffdbf3decddf8\": container with ID starting with e3b88de13161d2c3c54de60370d9bf827fce15fa277067e0d26ffdbf3decddf8 not found: ID does not exist" Dec 02 07:49:17 crc kubenswrapper[4895]: I1202 07:49:17.941080 4895 scope.go:117] "RemoveContainer" containerID="089c0b3d1c4ddc2fa892f504974b05872110b8d4c58cb70cafbaa74e93b8f452" Dec 02 07:49:17 crc kubenswrapper[4895]: E1202 07:49:17.941510 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"089c0b3d1c4ddc2fa892f504974b05872110b8d4c58cb70cafbaa74e93b8f452\": container with ID starting with 089c0b3d1c4ddc2fa892f504974b05872110b8d4c58cb70cafbaa74e93b8f452 not found: ID does not exist" containerID="089c0b3d1c4ddc2fa892f504974b05872110b8d4c58cb70cafbaa74e93b8f452" Dec 02 07:49:17 crc kubenswrapper[4895]: I1202 07:49:17.941570 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"089c0b3d1c4ddc2fa892f504974b05872110b8d4c58cb70cafbaa74e93b8f452"} err="failed to get container status \"089c0b3d1c4ddc2fa892f504974b05872110b8d4c58cb70cafbaa74e93b8f452\": rpc error: code = NotFound desc = could not find container \"089c0b3d1c4ddc2fa892f504974b05872110b8d4c58cb70cafbaa74e93b8f452\": container with ID starting with 089c0b3d1c4ddc2fa892f504974b05872110b8d4c58cb70cafbaa74e93b8f452 not found: ID does not exist" Dec 02 07:49:17 crc kubenswrapper[4895]: I1202 07:49:17.941617 4895 
scope.go:117] "RemoveContainer" containerID="fc2eeaa58100482b1a1ad56b93b5adeb32cce704c0f70987468501c900cb3962" Dec 02 07:49:17 crc kubenswrapper[4895]: E1202 07:49:17.942089 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc2eeaa58100482b1a1ad56b93b5adeb32cce704c0f70987468501c900cb3962\": container with ID starting with fc2eeaa58100482b1a1ad56b93b5adeb32cce704c0f70987468501c900cb3962 not found: ID does not exist" containerID="fc2eeaa58100482b1a1ad56b93b5adeb32cce704c0f70987468501c900cb3962" Dec 02 07:49:17 crc kubenswrapper[4895]: I1202 07:49:17.942124 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc2eeaa58100482b1a1ad56b93b5adeb32cce704c0f70987468501c900cb3962"} err="failed to get container status \"fc2eeaa58100482b1a1ad56b93b5adeb32cce704c0f70987468501c900cb3962\": rpc error: code = NotFound desc = could not find container \"fc2eeaa58100482b1a1ad56b93b5adeb32cce704c0f70987468501c900cb3962\": container with ID starting with fc2eeaa58100482b1a1ad56b93b5adeb32cce704c0f70987468501c900cb3962 not found: ID does not exist" Dec 02 07:49:17 crc kubenswrapper[4895]: I1202 07:49:17.942147 4895 scope.go:117] "RemoveContainer" containerID="f6567d126c5ab9260bebb4b4a822d71e559ec543cfdf3fa7202150e3115569cc" Dec 02 07:49:17 crc kubenswrapper[4895]: E1202 07:49:17.942479 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6567d126c5ab9260bebb4b4a822d71e559ec543cfdf3fa7202150e3115569cc\": container with ID starting with f6567d126c5ab9260bebb4b4a822d71e559ec543cfdf3fa7202150e3115569cc not found: ID does not exist" containerID="f6567d126c5ab9260bebb4b4a822d71e559ec543cfdf3fa7202150e3115569cc" Dec 02 07:49:17 crc kubenswrapper[4895]: I1202 07:49:17.942519 4895 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"f6567d126c5ab9260bebb4b4a822d71e559ec543cfdf3fa7202150e3115569cc"} err="failed to get container status \"f6567d126c5ab9260bebb4b4a822d71e559ec543cfdf3fa7202150e3115569cc\": rpc error: code = NotFound desc = could not find container \"f6567d126c5ab9260bebb4b4a822d71e559ec543cfdf3fa7202150e3115569cc\": container with ID starting with f6567d126c5ab9260bebb4b4a822d71e559ec543cfdf3fa7202150e3115569cc not found: ID does not exist" Dec 02 07:49:17 crc kubenswrapper[4895]: I1202 07:49:17.942544 4895 scope.go:117] "RemoveContainer" containerID="0b8c5691b63ae4c345789ff614121edd0fbac8d28ec4dd714cbad15af4ead78a" Dec 02 07:49:17 crc kubenswrapper[4895]: E1202 07:49:17.942866 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b8c5691b63ae4c345789ff614121edd0fbac8d28ec4dd714cbad15af4ead78a\": container with ID starting with 0b8c5691b63ae4c345789ff614121edd0fbac8d28ec4dd714cbad15af4ead78a not found: ID does not exist" containerID="0b8c5691b63ae4c345789ff614121edd0fbac8d28ec4dd714cbad15af4ead78a" Dec 02 07:49:17 crc kubenswrapper[4895]: I1202 07:49:17.942893 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b8c5691b63ae4c345789ff614121edd0fbac8d28ec4dd714cbad15af4ead78a"} err="failed to get container status \"0b8c5691b63ae4c345789ff614121edd0fbac8d28ec4dd714cbad15af4ead78a\": rpc error: code = NotFound desc = could not find container \"0b8c5691b63ae4c345789ff614121edd0fbac8d28ec4dd714cbad15af4ead78a\": container with ID starting with 0b8c5691b63ae4c345789ff614121edd0fbac8d28ec4dd714cbad15af4ead78a not found: ID does not exist" Dec 02 07:49:17 crc kubenswrapper[4895]: I1202 07:49:17.942910 4895 scope.go:117] "RemoveContainer" containerID="0b782ab48a476bfcb22366e4a8e52dc20222254cc4ec4ca87a05e85213f1e6e8" Dec 02 07:49:17 crc kubenswrapper[4895]: E1202 07:49:17.943237 4895 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"0b782ab48a476bfcb22366e4a8e52dc20222254cc4ec4ca87a05e85213f1e6e8\": container with ID starting with 0b782ab48a476bfcb22366e4a8e52dc20222254cc4ec4ca87a05e85213f1e6e8 not found: ID does not exist" containerID="0b782ab48a476bfcb22366e4a8e52dc20222254cc4ec4ca87a05e85213f1e6e8" Dec 02 07:49:17 crc kubenswrapper[4895]: I1202 07:49:17.943268 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b782ab48a476bfcb22366e4a8e52dc20222254cc4ec4ca87a05e85213f1e6e8"} err="failed to get container status \"0b782ab48a476bfcb22366e4a8e52dc20222254cc4ec4ca87a05e85213f1e6e8\": rpc error: code = NotFound desc = could not find container \"0b782ab48a476bfcb22366e4a8e52dc20222254cc4ec4ca87a05e85213f1e6e8\": container with ID starting with 0b782ab48a476bfcb22366e4a8e52dc20222254cc4ec4ca87a05e85213f1e6e8 not found: ID does not exist" Dec 02 07:49:17 crc kubenswrapper[4895]: I1202 07:49:17.943287 4895 scope.go:117] "RemoveContainer" containerID="8d7d30533f5cf82d2d0c96a4a07759e65bd3a99d6c9ea5aff2ebef3f2b8c14c4" Dec 02 07:49:17 crc kubenswrapper[4895]: E1202 07:49:17.943620 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d7d30533f5cf82d2d0c96a4a07759e65bd3a99d6c9ea5aff2ebef3f2b8c14c4\": container with ID starting with 8d7d30533f5cf82d2d0c96a4a07759e65bd3a99d6c9ea5aff2ebef3f2b8c14c4 not found: ID does not exist" containerID="8d7d30533f5cf82d2d0c96a4a07759e65bd3a99d6c9ea5aff2ebef3f2b8c14c4" Dec 02 07:49:17 crc kubenswrapper[4895]: I1202 07:49:17.943654 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d7d30533f5cf82d2d0c96a4a07759e65bd3a99d6c9ea5aff2ebef3f2b8c14c4"} err="failed to get container status \"8d7d30533f5cf82d2d0c96a4a07759e65bd3a99d6c9ea5aff2ebef3f2b8c14c4\": rpc error: code = NotFound desc = could not find container 
\"8d7d30533f5cf82d2d0c96a4a07759e65bd3a99d6c9ea5aff2ebef3f2b8c14c4\": container with ID starting with 8d7d30533f5cf82d2d0c96a4a07759e65bd3a99d6c9ea5aff2ebef3f2b8c14c4 not found: ID does not exist" Dec 02 07:49:17 crc kubenswrapper[4895]: I1202 07:49:17.943675 4895 scope.go:117] "RemoveContainer" containerID="7cc0795476568fadf42dedac218a2fe6065e25675de99e96dafc929b4ec7b9ff" Dec 02 07:49:17 crc kubenswrapper[4895]: I1202 07:49:17.971408 4895 scope.go:117] "RemoveContainer" containerID="6aa3fa36a02a2278bdcd14541013304e85cab914b53ba0de74342a5fb50d00cd" Dec 02 07:49:17 crc kubenswrapper[4895]: I1202 07:49:17.994417 4895 scope.go:117] "RemoveContainer" containerID="0ede9d97beff32f5baa392f8045826873652cfccb4629442677b4573fc94434a" Dec 02 07:49:18 crc kubenswrapper[4895]: I1202 07:49:18.036484 4895 scope.go:117] "RemoveContainer" containerID="7cc0795476568fadf42dedac218a2fe6065e25675de99e96dafc929b4ec7b9ff" Dec 02 07:49:18 crc kubenswrapper[4895]: E1202 07:49:18.037168 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7cc0795476568fadf42dedac218a2fe6065e25675de99e96dafc929b4ec7b9ff\": container with ID starting with 7cc0795476568fadf42dedac218a2fe6065e25675de99e96dafc929b4ec7b9ff not found: ID does not exist" containerID="7cc0795476568fadf42dedac218a2fe6065e25675de99e96dafc929b4ec7b9ff" Dec 02 07:49:18 crc kubenswrapper[4895]: I1202 07:49:18.037217 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7cc0795476568fadf42dedac218a2fe6065e25675de99e96dafc929b4ec7b9ff"} err="failed to get container status \"7cc0795476568fadf42dedac218a2fe6065e25675de99e96dafc929b4ec7b9ff\": rpc error: code = NotFound desc = could not find container \"7cc0795476568fadf42dedac218a2fe6065e25675de99e96dafc929b4ec7b9ff\": container with ID starting with 7cc0795476568fadf42dedac218a2fe6065e25675de99e96dafc929b4ec7b9ff not found: ID does not exist" Dec 02 07:49:18 crc 
kubenswrapper[4895]: I1202 07:49:18.037253 4895 scope.go:117] "RemoveContainer" containerID="6aa3fa36a02a2278bdcd14541013304e85cab914b53ba0de74342a5fb50d00cd" Dec 02 07:49:18 crc kubenswrapper[4895]: E1202 07:49:18.037999 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6aa3fa36a02a2278bdcd14541013304e85cab914b53ba0de74342a5fb50d00cd\": container with ID starting with 6aa3fa36a02a2278bdcd14541013304e85cab914b53ba0de74342a5fb50d00cd not found: ID does not exist" containerID="6aa3fa36a02a2278bdcd14541013304e85cab914b53ba0de74342a5fb50d00cd" Dec 02 07:49:18 crc kubenswrapper[4895]: I1202 07:49:18.038029 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6aa3fa36a02a2278bdcd14541013304e85cab914b53ba0de74342a5fb50d00cd"} err="failed to get container status \"6aa3fa36a02a2278bdcd14541013304e85cab914b53ba0de74342a5fb50d00cd\": rpc error: code = NotFound desc = could not find container \"6aa3fa36a02a2278bdcd14541013304e85cab914b53ba0de74342a5fb50d00cd\": container with ID starting with 6aa3fa36a02a2278bdcd14541013304e85cab914b53ba0de74342a5fb50d00cd not found: ID does not exist" Dec 02 07:49:18 crc kubenswrapper[4895]: I1202 07:49:18.038048 4895 scope.go:117] "RemoveContainer" containerID="0ede9d97beff32f5baa392f8045826873652cfccb4629442677b4573fc94434a" Dec 02 07:49:18 crc kubenswrapper[4895]: E1202 07:49:18.038680 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ede9d97beff32f5baa392f8045826873652cfccb4629442677b4573fc94434a\": container with ID starting with 0ede9d97beff32f5baa392f8045826873652cfccb4629442677b4573fc94434a not found: ID does not exist" containerID="0ede9d97beff32f5baa392f8045826873652cfccb4629442677b4573fc94434a" Dec 02 07:49:18 crc kubenswrapper[4895]: I1202 07:49:18.038707 4895 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"0ede9d97beff32f5baa392f8045826873652cfccb4629442677b4573fc94434a"} err="failed to get container status \"0ede9d97beff32f5baa392f8045826873652cfccb4629442677b4573fc94434a\": rpc error: code = NotFound desc = could not find container \"0ede9d97beff32f5baa392f8045826873652cfccb4629442677b4573fc94434a\": container with ID starting with 0ede9d97beff32f5baa392f8045826873652cfccb4629442677b4573fc94434a not found: ID does not exist" Dec 02 07:49:19 crc kubenswrapper[4895]: I1202 07:49:19.161510 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11b8ece5-4192-4e13-a1c7-86ed3c627ddf" path="/var/lib/kubelet/pods/11b8ece5-4192-4e13-a1c7-86ed3c627ddf/volumes" Dec 02 07:49:19 crc kubenswrapper[4895]: I1202 07:49:19.167338 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b463255-a237-46b0-826d-1e6fc849f0aa" path="/var/lib/kubelet/pods/6b463255-a237-46b0-826d-1e6fc849f0aa/volumes" Dec 02 07:49:20 crc kubenswrapper[4895]: I1202 07:49:20.141349 4895 scope.go:117] "RemoveContainer" containerID="9934068f902577cab2b9f5b749fcea3de9f9afb4c928af19ac4dbad51ebdaa03" Dec 02 07:49:20 crc kubenswrapper[4895]: E1202 07:49:20.141850 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 07:49:22 crc kubenswrapper[4895]: I1202 07:49:22.499651 4895 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pode0c6d90b-6e06-4b01-a8d7-5761b6cb5c0f"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pode0c6d90b-6e06-4b01-a8d7-5761b6cb5c0f] : Timed out while waiting for systemd to 
remove kubepods-besteffort-pode0c6d90b_6e06_4b01_a8d7_5761b6cb5c0f.slice" Dec 02 07:49:22 crc kubenswrapper[4895]: E1202 07:49:22.500283 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort pode0c6d90b-6e06-4b01-a8d7-5761b6cb5c0f] : unable to destroy cgroup paths for cgroup [kubepods besteffort pode0c6d90b-6e06-4b01-a8d7-5761b6cb5c0f] : Timed out while waiting for systemd to remove kubepods-besteffort-pode0c6d90b_6e06_4b01_a8d7_5761b6cb5c0f.slice" pod="openstack/barbican-keystone-listener-64685599d6-tgrm9" podUID="e0c6d90b-6e06-4b01-a8d7-5761b6cb5c0f" Dec 02 07:49:22 crc kubenswrapper[4895]: I1202 07:49:22.557300 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-64685599d6-tgrm9" Dec 02 07:49:22 crc kubenswrapper[4895]: I1202 07:49:22.602039 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-64685599d6-tgrm9"] Dec 02 07:49:22 crc kubenswrapper[4895]: I1202 07:49:22.614792 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-keystone-listener-64685599d6-tgrm9"] Dec 02 07:49:22 crc kubenswrapper[4895]: I1202 07:49:22.678374 4895 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="290c1303-bf41-4474-86ff-c9f5aa105cc3" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.174:9292/healthcheck\": dial tcp 10.217.0.174:9292: i/o timeout" Dec 02 07:49:22 crc kubenswrapper[4895]: I1202 07:49:22.680723 4895 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="290c1303-bf41-4474-86ff-c9f5aa105cc3" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.174:9292/healthcheck\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 02 07:49:23 crc 
kubenswrapper[4895]: I1202 07:49:23.157167 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0c6d90b-6e06-4b01-a8d7-5761b6cb5c0f" path="/var/lib/kubelet/pods/e0c6d90b-6e06-4b01-a8d7-5761b6cb5c0f/volumes" Dec 02 07:49:23 crc kubenswrapper[4895]: I1202 07:49:23.469676 4895 scope.go:117] "RemoveContainer" containerID="7e1fc19a4fb8bbfeda2cc4b937706c5f0d0cf2fabcee3636fcd1a49acddeb02d" Dec 02 07:49:23 crc kubenswrapper[4895]: I1202 07:49:23.680386 4895 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-ddf8948cc-h2bbh" podUID="ab5ec753-410a-4d4b-8071-ce60970ba4df" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.151:9696/\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 02 07:49:23 crc kubenswrapper[4895]: I1202 07:49:23.710113 4895 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pode0c6d90b-6e06-4b01-a8d7-5761b6cb5c0f"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pode0c6d90b-6e06-4b01-a8d7-5761b6cb5c0f] : Timed out while waiting for systemd to remove kubepods-besteffort-pode0c6d90b_6e06_4b01_a8d7_5761b6cb5c0f.slice" Dec 02 07:49:25 crc kubenswrapper[4895]: I1202 07:49:25.760031 4895 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","podace60b46-ed73-43ba-8d95-b81b03a6bd0a"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort podace60b46-ed73-43ba-8d95-b81b03a6bd0a] : Timed out while waiting for systemd to remove kubepods-besteffort-podace60b46_ed73_43ba_8d95_b81b03a6bd0a.slice" Dec 02 07:49:25 crc kubenswrapper[4895]: E1202 07:49:25.760093 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort podace60b46-ed73-43ba-8d95-b81b03a6bd0a] : unable to destroy cgroup paths for cgroup [kubepods besteffort 
podace60b46-ed73-43ba-8d95-b81b03a6bd0a] : Timed out while waiting for systemd to remove kubepods-besteffort-podace60b46_ed73_43ba_8d95_b81b03a6bd0a.slice" pod="openstack/openstack-cell1-galera-0" podUID="ace60b46-ed73-43ba-8d95-b81b03a6bd0a" Dec 02 07:49:26 crc kubenswrapper[4895]: E1202 07:49:26.377305 4895 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 02 07:49:26 crc kubenswrapper[4895]: E1202 07:49:26.377421 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5696e7d9-103a-4bf7-9b05-1959e92cf46a-operator-scripts podName:5696e7d9-103a-4bf7-9b05-1959e92cf46a nodeName:}" failed. No retries permitted until 2025-12-02 07:49:58.3773993 +0000 UTC m=+1609.548258933 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/5696e7d9-103a-4bf7-9b05-1959e92cf46a-operator-scripts") pod "novacell0b7d1-account-delete-wchwk" (UID: "5696e7d9-103a-4bf7-9b05-1959e92cf46a") : configmap "openstack-scripts" not found Dec 02 07:49:26 crc kubenswrapper[4895]: I1202 07:49:26.612040 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 02 07:49:26 crc kubenswrapper[4895]: I1202 07:49:26.643516 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 02 07:49:26 crc kubenswrapper[4895]: I1202 07:49:26.649191 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 02 07:49:26 crc kubenswrapper[4895]: E1202 07:49:26.680168 4895 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 02 07:49:26 crc kubenswrapper[4895]: E1202 07:49:26.680286 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/96831697-ba2e-477e-954f-e4ad0cf30f92-operator-scripts podName:96831697-ba2e-477e-954f-e4ad0cf30f92 nodeName:}" failed. No retries permitted until 2025-12-02 07:49:58.68025851 +0000 UTC m=+1609.851118133 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/96831697-ba2e-477e-954f-e4ad0cf30f92-operator-scripts") pod "novaapi23cb-account-delete-g8msv" (UID: "96831697-ba2e-477e-954f-e4ad0cf30f92") : configmap "openstack-scripts" not found Dec 02 07:49:27 crc kubenswrapper[4895]: I1202 07:49:27.152180 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ace60b46-ed73-43ba-8d95-b81b03a6bd0a" path="/var/lib/kubelet/pods/ace60b46-ed73-43ba-8d95-b81b03a6bd0a/volumes" Dec 02 07:49:27 crc kubenswrapper[4895]: I1202 07:49:27.647796 4895 generic.go:334] "Generic (PLEG): container finished" podID="5696e7d9-103a-4bf7-9b05-1959e92cf46a" containerID="0c2c388094b95cef4d9070468d30cc3bb7a5071f95547b2ee0b18119aa7ce3f9" exitCode=137 Dec 02 07:49:27 crc kubenswrapper[4895]: I1202 07:49:27.647890 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell0b7d1-account-delete-wchwk" 
event={"ID":"5696e7d9-103a-4bf7-9b05-1959e92cf46a","Type":"ContainerDied","Data":"0c2c388094b95cef4d9070468d30cc3bb7a5071f95547b2ee0b18119aa7ce3f9"} Dec 02 07:49:27 crc kubenswrapper[4895]: I1202 07:49:27.651258 4895 generic.go:334] "Generic (PLEG): container finished" podID="96831697-ba2e-477e-954f-e4ad0cf30f92" containerID="0ea2b37615e5717b134e70582d30afd9a8248506c11d128a320a1ec2c2f21f39" exitCode=137 Dec 02 07:49:27 crc kubenswrapper[4895]: I1202 07:49:27.651350 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novaapi23cb-account-delete-g8msv" event={"ID":"96831697-ba2e-477e-954f-e4ad0cf30f92","Type":"ContainerDied","Data":"0ea2b37615e5717b134e70582d30afd9a8248506c11d128a320a1ec2c2f21f39"} Dec 02 07:49:27 crc kubenswrapper[4895]: I1202 07:49:27.980938 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/novaapi23cb-account-delete-g8msv" Dec 02 07:49:27 crc kubenswrapper[4895]: I1202 07:49:27.990052 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/novacell0b7d1-account-delete-wchwk" Dec 02 07:49:28 crc kubenswrapper[4895]: I1202 07:49:28.012934 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5696e7d9-103a-4bf7-9b05-1959e92cf46a-operator-scripts\") pod \"5696e7d9-103a-4bf7-9b05-1959e92cf46a\" (UID: \"5696e7d9-103a-4bf7-9b05-1959e92cf46a\") " Dec 02 07:49:28 crc kubenswrapper[4895]: I1202 07:49:28.013057 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bb4hb\" (UniqueName: \"kubernetes.io/projected/96831697-ba2e-477e-954f-e4ad0cf30f92-kube-api-access-bb4hb\") pod \"96831697-ba2e-477e-954f-e4ad0cf30f92\" (UID: \"96831697-ba2e-477e-954f-e4ad0cf30f92\") " Dec 02 07:49:28 crc kubenswrapper[4895]: I1202 07:49:28.013128 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/96831697-ba2e-477e-954f-e4ad0cf30f92-operator-scripts\") pod \"96831697-ba2e-477e-954f-e4ad0cf30f92\" (UID: \"96831697-ba2e-477e-954f-e4ad0cf30f92\") " Dec 02 07:49:28 crc kubenswrapper[4895]: I1202 07:49:28.013248 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z4mwj\" (UniqueName: \"kubernetes.io/projected/5696e7d9-103a-4bf7-9b05-1959e92cf46a-kube-api-access-z4mwj\") pod \"5696e7d9-103a-4bf7-9b05-1959e92cf46a\" (UID: \"5696e7d9-103a-4bf7-9b05-1959e92cf46a\") " Dec 02 07:49:28 crc kubenswrapper[4895]: I1202 07:49:28.014185 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5696e7d9-103a-4bf7-9b05-1959e92cf46a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5696e7d9-103a-4bf7-9b05-1959e92cf46a" (UID: "5696e7d9-103a-4bf7-9b05-1959e92cf46a"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:49:28 crc kubenswrapper[4895]: I1202 07:49:28.014232 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/96831697-ba2e-477e-954f-e4ad0cf30f92-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "96831697-ba2e-477e-954f-e4ad0cf30f92" (UID: "96831697-ba2e-477e-954f-e4ad0cf30f92"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:49:28 crc kubenswrapper[4895]: I1202 07:49:28.063704 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96831697-ba2e-477e-954f-e4ad0cf30f92-kube-api-access-bb4hb" (OuterVolumeSpecName: "kube-api-access-bb4hb") pod "96831697-ba2e-477e-954f-e4ad0cf30f92" (UID: "96831697-ba2e-477e-954f-e4ad0cf30f92"). InnerVolumeSpecName "kube-api-access-bb4hb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:49:28 crc kubenswrapper[4895]: I1202 07:49:28.063966 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5696e7d9-103a-4bf7-9b05-1959e92cf46a-kube-api-access-z4mwj" (OuterVolumeSpecName: "kube-api-access-z4mwj") pod "5696e7d9-103a-4bf7-9b05-1959e92cf46a" (UID: "5696e7d9-103a-4bf7-9b05-1959e92cf46a"). InnerVolumeSpecName "kube-api-access-z4mwj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:49:28 crc kubenswrapper[4895]: I1202 07:49:28.116072 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bb4hb\" (UniqueName: \"kubernetes.io/projected/96831697-ba2e-477e-954f-e4ad0cf30f92-kube-api-access-bb4hb\") on node \"crc\" DevicePath \"\"" Dec 02 07:49:28 crc kubenswrapper[4895]: I1202 07:49:28.116123 4895 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/96831697-ba2e-477e-954f-e4ad0cf30f92-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 07:49:28 crc kubenswrapper[4895]: I1202 07:49:28.116137 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z4mwj\" (UniqueName: \"kubernetes.io/projected/5696e7d9-103a-4bf7-9b05-1959e92cf46a-kube-api-access-z4mwj\") on node \"crc\" DevicePath \"\"" Dec 02 07:49:28 crc kubenswrapper[4895]: I1202 07:49:28.116149 4895 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5696e7d9-103a-4bf7-9b05-1959e92cf46a-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 07:49:28 crc kubenswrapper[4895]: I1202 07:49:28.672177 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/novaapi23cb-account-delete-g8msv" Dec 02 07:49:28 crc kubenswrapper[4895]: I1202 07:49:28.674030 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novaapi23cb-account-delete-g8msv" event={"ID":"96831697-ba2e-477e-954f-e4ad0cf30f92","Type":"ContainerDied","Data":"6a7ef559b071bec63c6a6f0c36f0541136299e206893522ad1b6e213a924da0a"} Dec 02 07:49:28 crc kubenswrapper[4895]: I1202 07:49:28.674460 4895 scope.go:117] "RemoveContainer" containerID="0ea2b37615e5717b134e70582d30afd9a8248506c11d128a320a1ec2c2f21f39" Dec 02 07:49:28 crc kubenswrapper[4895]: I1202 07:49:28.679333 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell0b7d1-account-delete-wchwk" event={"ID":"5696e7d9-103a-4bf7-9b05-1959e92cf46a","Type":"ContainerDied","Data":"25f7933f00f9b17f29235bd1a7b5edcd4d3fbcb2a26e043877c66f025f2ac33d"} Dec 02 07:49:28 crc kubenswrapper[4895]: I1202 07:49:28.679441 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/novacell0b7d1-account-delete-wchwk" Dec 02 07:49:28 crc kubenswrapper[4895]: I1202 07:49:28.712024 4895 scope.go:117] "RemoveContainer" containerID="0c2c388094b95cef4d9070468d30cc3bb7a5071f95547b2ee0b18119aa7ce3f9" Dec 02 07:49:28 crc kubenswrapper[4895]: I1202 07:49:28.726311 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novaapi23cb-account-delete-g8msv"] Dec 02 07:49:28 crc kubenswrapper[4895]: I1202 07:49:28.747199 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/novaapi23cb-account-delete-g8msv"] Dec 02 07:49:28 crc kubenswrapper[4895]: I1202 07:49:28.756949 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novacell0b7d1-account-delete-wchwk"] Dec 02 07:49:28 crc kubenswrapper[4895]: I1202 07:49:28.768172 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/novacell0b7d1-account-delete-wchwk"] Dec 02 07:49:29 crc kubenswrapper[4895]: I1202 07:49:29.156447 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5696e7d9-103a-4bf7-9b05-1959e92cf46a" path="/var/lib/kubelet/pods/5696e7d9-103a-4bf7-9b05-1959e92cf46a/volumes" Dec 02 07:49:29 crc kubenswrapper[4895]: I1202 07:49:29.158113 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96831697-ba2e-477e-954f-e4ad0cf30f92" path="/var/lib/kubelet/pods/96831697-ba2e-477e-954f-e4ad0cf30f92/volumes" Dec 02 07:49:31 crc kubenswrapper[4895]: I1202 07:49:31.141962 4895 scope.go:117] "RemoveContainer" containerID="9934068f902577cab2b9f5b749fcea3de9f9afb4c928af19ac4dbad51ebdaa03" Dec 02 07:49:31 crc kubenswrapper[4895]: E1202 07:49:31.143113 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 07:49:44 crc kubenswrapper[4895]: I1202 07:49:44.141268 4895 scope.go:117] "RemoveContainer" containerID="9934068f902577cab2b9f5b749fcea3de9f9afb4c928af19ac4dbad51ebdaa03" Dec 02 07:49:44 crc kubenswrapper[4895]: E1202 07:49:44.142302 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 07:49:57 crc kubenswrapper[4895]: I1202 07:49:57.140942 4895 scope.go:117] "RemoveContainer" containerID="9934068f902577cab2b9f5b749fcea3de9f9afb4c928af19ac4dbad51ebdaa03" Dec 02 07:49:57 crc kubenswrapper[4895]: E1202 07:49:57.141904 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 07:50:12 crc kubenswrapper[4895]: I1202 07:50:12.142195 4895 scope.go:117] "RemoveContainer" containerID="9934068f902577cab2b9f5b749fcea3de9f9afb4c928af19ac4dbad51ebdaa03" Dec 02 07:50:12 crc kubenswrapper[4895]: E1202 07:50:12.143807 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 07:50:23 crc kubenswrapper[4895]: I1202 07:50:23.973303 4895 scope.go:117] "RemoveContainer" containerID="7403a5b9ce852233942b59a898f050425071a9defb4aef479e474d871e9de273" Dec 02 07:50:24 crc kubenswrapper[4895]: I1202 07:50:24.013845 4895 scope.go:117] "RemoveContainer" containerID="749c0f6ea01ac411d0209d4472b7bd79cfc38bd8f584ebdd6968b35f5d12cdc7" Dec 02 07:50:24 crc kubenswrapper[4895]: I1202 07:50:24.058393 4895 scope.go:117] "RemoveContainer" containerID="4fdd958fc1822c12a2d4aca9d8bd5fd877dcad2ca93c61ee85e2640247da17f0" Dec 02 07:50:24 crc kubenswrapper[4895]: I1202 07:50:24.086402 4895 scope.go:117] "RemoveContainer" containerID="15c0a8a60d6e51b79c5a48224057195f23217f2a902e4569436fc6187c88a4ee" Dec 02 07:50:24 crc kubenswrapper[4895]: I1202 07:50:24.120427 4895 scope.go:117] "RemoveContainer" containerID="9e4586f8b3fb6ca58d5504dd173c8353883757566be74d1cb6c65e2158e6f973" Dec 02 07:50:24 crc kubenswrapper[4895]: I1202 07:50:24.141587 4895 scope.go:117] "RemoveContainer" containerID="9934068f902577cab2b9f5b749fcea3de9f9afb4c928af19ac4dbad51ebdaa03" Dec 02 07:50:24 crc kubenswrapper[4895]: E1202 07:50:24.141934 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 07:50:24 crc kubenswrapper[4895]: I1202 07:50:24.152994 4895 scope.go:117] "RemoveContainer" containerID="8a2f68e796838e87e45698b40183c455794d730caf5af19a07c35fd150b09fe1" Dec 02 07:50:24 crc 
kubenswrapper[4895]: I1202 07:50:24.192852 4895 scope.go:117] "RemoveContainer" containerID="7ae3e5d5ec8de27bf0dd2f2d640b40e84c4d95a0e1317e0fee0317f8b1e9f187" Dec 02 07:50:24 crc kubenswrapper[4895]: I1202 07:50:24.218588 4895 scope.go:117] "RemoveContainer" containerID="699083b59cc3e89c8cdcea80a7f38a966d522c9c52625be50b3fa816e59f7830" Dec 02 07:50:24 crc kubenswrapper[4895]: I1202 07:50:24.283051 4895 scope.go:117] "RemoveContainer" containerID="28f9cb6d02e60c3a6d26b50a6fa46604e2e69011e552700cbb792dbf252b2632" Dec 02 07:50:24 crc kubenswrapper[4895]: I1202 07:50:24.311919 4895 scope.go:117] "RemoveContainer" containerID="d8cc3e500cc7cf167ba6655926e2bd0f9e523259b1d217e4a231d4180d10b525" Dec 02 07:50:24 crc kubenswrapper[4895]: I1202 07:50:24.345464 4895 scope.go:117] "RemoveContainer" containerID="71bd075d30ee48222b192e19ea3e173bfae0e94488a7e8ecbc6fd0d9989b9830" Dec 02 07:50:24 crc kubenswrapper[4895]: I1202 07:50:24.373340 4895 scope.go:117] "RemoveContainer" containerID="825f000e90e467b37377e382a45ce9ec58ad6ced7e5a761f9a5ac0cc1b0ded3d" Dec 02 07:50:24 crc kubenswrapper[4895]: I1202 07:50:24.402183 4895 scope.go:117] "RemoveContainer" containerID="39340c4fd973c571bd458064ab8a8ad372022cf6e584357ba7e6b31eaf6221a0" Dec 02 07:50:35 crc kubenswrapper[4895]: I1202 07:50:35.142251 4895 scope.go:117] "RemoveContainer" containerID="9934068f902577cab2b9f5b749fcea3de9f9afb4c928af19ac4dbad51ebdaa03" Dec 02 07:50:35 crc kubenswrapper[4895]: E1202 07:50:35.143972 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 07:50:48 crc kubenswrapper[4895]: I1202 07:50:48.141863 4895 scope.go:117] 
"RemoveContainer" containerID="9934068f902577cab2b9f5b749fcea3de9f9afb4c928af19ac4dbad51ebdaa03" Dec 02 07:50:48 crc kubenswrapper[4895]: E1202 07:50:48.143112 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 07:51:02 crc kubenswrapper[4895]: I1202 07:51:02.141333 4895 scope.go:117] "RemoveContainer" containerID="9934068f902577cab2b9f5b749fcea3de9f9afb4c928af19ac4dbad51ebdaa03" Dec 02 07:51:02 crc kubenswrapper[4895]: E1202 07:51:02.142410 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 07:51:17 crc kubenswrapper[4895]: I1202 07:51:17.142207 4895 scope.go:117] "RemoveContainer" containerID="9934068f902577cab2b9f5b749fcea3de9f9afb4c928af19ac4dbad51ebdaa03" Dec 02 07:51:17 crc kubenswrapper[4895]: E1202 07:51:17.143610 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 07:51:24 crc kubenswrapper[4895]: I1202 07:51:24.676052 
4895 scope.go:117] "RemoveContainer" containerID="a6b05ad2818111c94be943845c6204ff5c39fb35c07db9ae40f5c8e318b1f644" Dec 02 07:51:24 crc kubenswrapper[4895]: I1202 07:51:24.717199 4895 scope.go:117] "RemoveContainer" containerID="0cea6d2353c72398e48c1c3d3ded7a0154a5a87ae593e426abedaa9a74830e8a" Dec 02 07:51:24 crc kubenswrapper[4895]: I1202 07:51:24.764562 4895 scope.go:117] "RemoveContainer" containerID="020f73bbd49e945d5c90d4d98cfbb78206c4dba84b8249b508b6c2f0f41eb7e3" Dec 02 07:51:24 crc kubenswrapper[4895]: I1202 07:51:24.821834 4895 scope.go:117] "RemoveContainer" containerID="4bce6feae18b88a0dade864ed7f4db319704698221a61e3defcf26b5f9e0a73e" Dec 02 07:51:24 crc kubenswrapper[4895]: I1202 07:51:24.843992 4895 scope.go:117] "RemoveContainer" containerID="07083e71540680643e55a6b2c8400f1fab96294f90701438ad40cbeb3539c27f" Dec 02 07:51:24 crc kubenswrapper[4895]: I1202 07:51:24.868436 4895 scope.go:117] "RemoveContainer" containerID="9f87395686eb4293111dd47a55d66e6fd9c827da84446d2ce43c4aa195645589" Dec 02 07:51:24 crc kubenswrapper[4895]: I1202 07:51:24.928210 4895 scope.go:117] "RemoveContainer" containerID="34ab968b34804011274e618923c761b412f63061861d97d3ae783f4629e6063e" Dec 02 07:51:24 crc kubenswrapper[4895]: I1202 07:51:24.975696 4895 scope.go:117] "RemoveContainer" containerID="ab55d7c053fe9195f13a2d5dd467990069644dc38f18867902edf1e3259825f1" Dec 02 07:51:25 crc kubenswrapper[4895]: I1202 07:51:25.002447 4895 scope.go:117] "RemoveContainer" containerID="79507980e01b07ea773d434932da83cc407f386cc2f4f05c605e4f8341d7bef2" Dec 02 07:51:25 crc kubenswrapper[4895]: I1202 07:51:25.022977 4895 scope.go:117] "RemoveContainer" containerID="c280f2831c6a36b2ad18f9301b8a0472b08702b0278b211544534ea69aa1a406" Dec 02 07:51:25 crc kubenswrapper[4895]: I1202 07:51:25.049575 4895 scope.go:117] "RemoveContainer" containerID="9352b834616a69ecbcd66b6e814ff88f5658fd5608184279b61d4a311c968b79" Dec 02 07:51:25 crc kubenswrapper[4895]: I1202 07:51:25.117260 4895 scope.go:117] 
"RemoveContainer" containerID="7502623bd7676b5c87fc1f5d8cba25df3796cbe037e5edeae59bd59e11e37241" Dec 02 07:51:25 crc kubenswrapper[4895]: I1202 07:51:25.155431 4895 scope.go:117] "RemoveContainer" containerID="25d16ba51aa46d2a8ba22560c34b650e3b46eec242185ecc21922126defa6823" Dec 02 07:51:25 crc kubenswrapper[4895]: I1202 07:51:25.214205 4895 scope.go:117] "RemoveContainer" containerID="29cac239a36ec47979b458bd7cbe4f5af5cb9aa6e860de93677ffafa0b13e0a8" Dec 02 07:51:30 crc kubenswrapper[4895]: I1202 07:51:30.140820 4895 scope.go:117] "RemoveContainer" containerID="9934068f902577cab2b9f5b749fcea3de9f9afb4c928af19ac4dbad51ebdaa03" Dec 02 07:51:30 crc kubenswrapper[4895]: E1202 07:51:30.141707 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 07:51:45 crc kubenswrapper[4895]: I1202 07:51:45.141685 4895 scope.go:117] "RemoveContainer" containerID="9934068f902577cab2b9f5b749fcea3de9f9afb4c928af19ac4dbad51ebdaa03" Dec 02 07:51:45 crc kubenswrapper[4895]: E1202 07:51:45.142721 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 07:51:57 crc kubenswrapper[4895]: I1202 07:51:57.142574 4895 scope.go:117] "RemoveContainer" containerID="9934068f902577cab2b9f5b749fcea3de9f9afb4c928af19ac4dbad51ebdaa03" Dec 02 07:51:57 crc 
kubenswrapper[4895]: E1202 07:51:57.143569 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 07:52:09 crc kubenswrapper[4895]: I1202 07:52:09.146080 4895 scope.go:117] "RemoveContainer" containerID="9934068f902577cab2b9f5b749fcea3de9f9afb4c928af19ac4dbad51ebdaa03" Dec 02 07:52:09 crc kubenswrapper[4895]: E1202 07:52:09.146946 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 07:52:21 crc kubenswrapper[4895]: I1202 07:52:21.141914 4895 scope.go:117] "RemoveContainer" containerID="9934068f902577cab2b9f5b749fcea3de9f9afb4c928af19ac4dbad51ebdaa03" Dec 02 07:52:21 crc kubenswrapper[4895]: E1202 07:52:21.142857 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.075699 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-gjtf4"] Dec 02 
07:52:24 crc kubenswrapper[4895]: E1202 07:52:24.076453 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e203ec5f-dd45-44bb-97b2-fd8a548ce231" containerName="registry-server" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.076472 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="e203ec5f-dd45-44bb-97b2-fd8a548ce231" containerName="registry-server" Dec 02 07:52:24 crc kubenswrapper[4895]: E1202 07:52:24.076489 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f28e5fd3-456b-4960-a3a9-1134e3eecb1f" containerName="mariadb-account-delete" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.076497 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="f28e5fd3-456b-4960-a3a9-1134e3eecb1f" containerName="mariadb-account-delete" Dec 02 07:52:24 crc kubenswrapper[4895]: E1202 07:52:24.076515 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11b8ece5-4192-4e13-a1c7-86ed3c627ddf" containerName="object-server" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.076524 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="11b8ece5-4192-4e13-a1c7-86ed3c627ddf" containerName="object-server" Dec 02 07:52:24 crc kubenswrapper[4895]: E1202 07:52:24.076537 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b15097a8-ac9a-4886-a839-272b662561c5" containerName="memcached" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.076544 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="b15097a8-ac9a-4886-a839-272b662561c5" containerName="memcached" Dec 02 07:52:24 crc kubenswrapper[4895]: E1202 07:52:24.076556 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11b8ece5-4192-4e13-a1c7-86ed3c627ddf" containerName="container-auditor" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.076568 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="11b8ece5-4192-4e13-a1c7-86ed3c627ddf" containerName="container-auditor" Dec 02 07:52:24 crc 
kubenswrapper[4895]: E1202 07:52:24.076579 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11b8ece5-4192-4e13-a1c7-86ed3c627ddf" containerName="rsync" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.076586 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="11b8ece5-4192-4e13-a1c7-86ed3c627ddf" containerName="rsync" Dec 02 07:52:24 crc kubenswrapper[4895]: E1202 07:52:24.076599 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="183c5216-30f9-4f75-865b-7f795ea149fb" containerName="nova-cell1-novncproxy-novncproxy" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.076606 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="183c5216-30f9-4f75-865b-7f795ea149fb" containerName="nova-cell1-novncproxy-novncproxy" Dec 02 07:52:24 crc kubenswrapper[4895]: E1202 07:52:24.076619 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11b8ece5-4192-4e13-a1c7-86ed3c627ddf" containerName="object-auditor" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.076626 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="11b8ece5-4192-4e13-a1c7-86ed3c627ddf" containerName="object-auditor" Dec 02 07:52:24 crc kubenswrapper[4895]: E1202 07:52:24.076639 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65a02963-abb5-4f29-aa82-88ba6f859a00" containerName="nova-cell0-conductor-conductor" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.076647 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="65a02963-abb5-4f29-aa82-88ba6f859a00" containerName="nova-cell0-conductor-conductor" Dec 02 07:52:24 crc kubenswrapper[4895]: E1202 07:52:24.076662 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b0ee49-bed2-4691-8160-2edbebda27b7" containerName="proxy-httpd" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.076669 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b0ee49-bed2-4691-8160-2edbebda27b7" containerName="proxy-httpd" 
Dec 02 07:52:24 crc kubenswrapper[4895]: E1202 07:52:24.076681 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d08915b6-6f79-40e4-8c26-d9f82606b4cc" containerName="nova-scheduler-scheduler" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.076689 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="d08915b6-6f79-40e4-8c26-d9f82606b4cc" containerName="nova-scheduler-scheduler" Dec 02 07:52:24 crc kubenswrapper[4895]: E1202 07:52:24.076707 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebbed0ba-1d44-4421-a276-b075b0f31c3f" containerName="cinder-api-log" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.076714 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebbed0ba-1d44-4421-a276-b075b0f31c3f" containerName="cinder-api-log" Dec 02 07:52:24 crc kubenswrapper[4895]: E1202 07:52:24.076725 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="446b5a26-8e57-4765-bb7d-275cf05996dd" containerName="ovn-northd" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.076733 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="446b5a26-8e57-4765-bb7d-275cf05996dd" containerName="ovn-northd" Dec 02 07:52:24 crc kubenswrapper[4895]: E1202 07:52:24.076762 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3d2bb1c-bd20-473e-b91a-7e2a63ec9f07" containerName="barbican-api-log" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.076772 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3d2bb1c-bd20-473e-b91a-7e2a63ec9f07" containerName="barbican-api-log" Dec 02 07:52:24 crc kubenswrapper[4895]: E1202 07:52:24.076784 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f762a68c-cabc-4842-844a-1db6710e3ee9" containerName="nova-api-log" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.076791 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="f762a68c-cabc-4842-844a-1db6710e3ee9" containerName="nova-api-log" Dec 02 
07:52:24 crc kubenswrapper[4895]: E1202 07:52:24.076800 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11b8ece5-4192-4e13-a1c7-86ed3c627ddf" containerName="object-expirer" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.076810 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="11b8ece5-4192-4e13-a1c7-86ed3c627ddf" containerName="object-expirer" Dec 02 07:52:24 crc kubenswrapper[4895]: E1202 07:52:24.076822 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85e9e481-0762-42a8-a25a-7d50500f1236" containerName="proxy-httpd" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.076829 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="85e9e481-0762-42a8-a25a-7d50500f1236" containerName="proxy-httpd" Dec 02 07:52:24 crc kubenswrapper[4895]: E1202 07:52:24.076843 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7067a12f-0245-45f5-a806-591d5999c7f0" containerName="mariadb-account-delete" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.076851 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="7067a12f-0245-45f5-a806-591d5999c7f0" containerName="mariadb-account-delete" Dec 02 07:52:24 crc kubenswrapper[4895]: E1202 07:52:24.076863 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ace60b46-ed73-43ba-8d95-b81b03a6bd0a" containerName="galera" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.076870 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="ace60b46-ed73-43ba-8d95-b81b03a6bd0a" containerName="galera" Dec 02 07:52:24 crc kubenswrapper[4895]: E1202 07:52:24.076884 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b0ee49-bed2-4691-8160-2edbebda27b7" containerName="sg-core" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.076892 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b0ee49-bed2-4691-8160-2edbebda27b7" containerName="sg-core" Dec 02 07:52:24 crc kubenswrapper[4895]: E1202 
07:52:24.076906 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11b8ece5-4192-4e13-a1c7-86ed3c627ddf" containerName="account-reaper" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.076914 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="11b8ece5-4192-4e13-a1c7-86ed3c627ddf" containerName="account-reaper" Dec 02 07:52:24 crc kubenswrapper[4895]: E1202 07:52:24.076928 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d42411e0-2228-4a1a-9d31-e3788f2b1f0c" containerName="mariadb-account-delete" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.076935 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="d42411e0-2228-4a1a-9d31-e3788f2b1f0c" containerName="mariadb-account-delete" Dec 02 07:52:24 crc kubenswrapper[4895]: E1202 07:52:24.076946 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ace60b46-ed73-43ba-8d95-b81b03a6bd0a" containerName="mysql-bootstrap" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.076954 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="ace60b46-ed73-43ba-8d95-b81b03a6bd0a" containerName="mysql-bootstrap" Dec 02 07:52:24 crc kubenswrapper[4895]: E1202 07:52:24.076965 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="836bba81-425e-4610-b191-2bbb2cfc1f79" containerName="cinder-scheduler" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.076973 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="836bba81-425e-4610-b191-2bbb2cfc1f79" containerName="cinder-scheduler" Dec 02 07:52:24 crc kubenswrapper[4895]: E1202 07:52:24.076984 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f762a68c-cabc-4842-844a-1db6710e3ee9" containerName="nova-api-api" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.076993 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="f762a68c-cabc-4842-844a-1db6710e3ee9" containerName="nova-api-api" Dec 02 07:52:24 crc kubenswrapper[4895]: E1202 
07:52:24.077006 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4869eb0-5e33-4837-8295-06ca17076e69" containerName="glance-log" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.077014 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4869eb0-5e33-4837-8295-06ca17076e69" containerName="glance-log" Dec 02 07:52:24 crc kubenswrapper[4895]: E1202 07:52:24.077026 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b463255-a237-46b0-826d-1e6fc849f0aa" containerName="ovsdb-server" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.077034 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b463255-a237-46b0-826d-1e6fc849f0aa" containerName="ovsdb-server" Dec 02 07:52:24 crc kubenswrapper[4895]: E1202 07:52:24.077047 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebbed0ba-1d44-4421-a276-b075b0f31c3f" containerName="cinder-api" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.077054 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebbed0ba-1d44-4421-a276-b075b0f31c3f" containerName="cinder-api" Dec 02 07:52:24 crc kubenswrapper[4895]: E1202 07:52:24.077070 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="290c1303-bf41-4474-86ff-c9f5aa105cc3" containerName="glance-log" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.077078 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="290c1303-bf41-4474-86ff-c9f5aa105cc3" containerName="glance-log" Dec 02 07:52:24 crc kubenswrapper[4895]: E1202 07:52:24.077088 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5696e7d9-103a-4bf7-9b05-1959e92cf46a" containerName="mariadb-account-delete" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.077096 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="5696e7d9-103a-4bf7-9b05-1959e92cf46a" containerName="mariadb-account-delete" Dec 02 07:52:24 crc kubenswrapper[4895]: E1202 07:52:24.077108 4895 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ebf9714-5e6d-415c-a0aa-adab0d3e46e9" containerName="extract-utilities" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.077137 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ebf9714-5e6d-415c-a0aa-adab0d3e46e9" containerName="extract-utilities" Dec 02 07:52:24 crc kubenswrapper[4895]: E1202 07:52:24.077152 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11b8ece5-4192-4e13-a1c7-86ed3c627ddf" containerName="account-replicator" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.077160 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="11b8ece5-4192-4e13-a1c7-86ed3c627ddf" containerName="account-replicator" Dec 02 07:52:24 crc kubenswrapper[4895]: E1202 07:52:24.077169 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31223325-1372-4ea6-867e-f511b7dffc09" containerName="nova-cell1-conductor-conductor" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.077177 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="31223325-1372-4ea6-867e-f511b7dffc09" containerName="nova-cell1-conductor-conductor" Dec 02 07:52:24 crc kubenswrapper[4895]: E1202 07:52:24.077187 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="446b5a26-8e57-4765-bb7d-275cf05996dd" containerName="openstack-network-exporter" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.077195 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="446b5a26-8e57-4765-bb7d-275cf05996dd" containerName="openstack-network-exporter" Dec 02 07:52:24 crc kubenswrapper[4895]: E1202 07:52:24.077206 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11b8ece5-4192-4e13-a1c7-86ed3c627ddf" containerName="swift-recon-cron" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.077213 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="11b8ece5-4192-4e13-a1c7-86ed3c627ddf" containerName="swift-recon-cron" Dec 02 07:52:24 
crc kubenswrapper[4895]: E1202 07:52:24.077220 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="247c892c-e00a-474e-8022-73bd1b2249f3" containerName="keystone-api" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.077228 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="247c892c-e00a-474e-8022-73bd1b2249f3" containerName="keystone-api" Dec 02 07:52:24 crc kubenswrapper[4895]: E1202 07:52:24.077239 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11b8ece5-4192-4e13-a1c7-86ed3c627ddf" containerName="account-server" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.077247 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="11b8ece5-4192-4e13-a1c7-86ed3c627ddf" containerName="account-server" Dec 02 07:52:24 crc kubenswrapper[4895]: E1202 07:52:24.077260 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11b8ece5-4192-4e13-a1c7-86ed3c627ddf" containerName="object-replicator" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.077268 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="11b8ece5-4192-4e13-a1c7-86ed3c627ddf" containerName="object-replicator" Dec 02 07:52:24 crc kubenswrapper[4895]: E1202 07:52:24.077280 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11b8ece5-4192-4e13-a1c7-86ed3c627ddf" containerName="container-updater" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.077310 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="11b8ece5-4192-4e13-a1c7-86ed3c627ddf" containerName="container-updater" Dec 02 07:52:24 crc kubenswrapper[4895]: E1202 07:52:24.077319 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96831697-ba2e-477e-954f-e4ad0cf30f92" containerName="mariadb-account-delete" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.077327 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="96831697-ba2e-477e-954f-e4ad0cf30f92" containerName="mariadb-account-delete" Dec 02 07:52:24 
crc kubenswrapper[4895]: E1202 07:52:24.077344 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d2c6f66-9fe8-4d15-92f6-f29493e6cd7b" containerName="nova-metadata-metadata" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.077354 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d2c6f66-9fe8-4d15-92f6-f29493e6cd7b" containerName="nova-metadata-metadata" Dec 02 07:52:24 crc kubenswrapper[4895]: E1202 07:52:24.077367 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e203ec5f-dd45-44bb-97b2-fd8a548ce231" containerName="extract-utilities" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.077378 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="e203ec5f-dd45-44bb-97b2-fd8a548ce231" containerName="extract-utilities" Dec 02 07:52:24 crc kubenswrapper[4895]: E1202 07:52:24.077390 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d2c6f66-9fe8-4d15-92f6-f29493e6cd7b" containerName="nova-metadata-log" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.077398 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d2c6f66-9fe8-4d15-92f6-f29493e6cd7b" containerName="nova-metadata-log" Dec 02 07:52:24 crc kubenswrapper[4895]: E1202 07:52:24.077412 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84116ead-6214-4d5f-98a3-c89b08cf1306" containerName="ovn-controller" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.077419 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="84116ead-6214-4d5f-98a3-c89b08cf1306" containerName="ovn-controller" Dec 02 07:52:24 crc kubenswrapper[4895]: E1202 07:52:24.077432 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e203ec5f-dd45-44bb-97b2-fd8a548ce231" containerName="extract-content" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.077440 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="e203ec5f-dd45-44bb-97b2-fd8a548ce231" containerName="extract-content" Dec 02 
07:52:24 crc kubenswrapper[4895]: E1202 07:52:24.077455 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="836bba81-425e-4610-b191-2bbb2cfc1f79" containerName="probe" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.077464 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="836bba81-425e-4610-b191-2bbb2cfc1f79" containerName="probe" Dec 02 07:52:24 crc kubenswrapper[4895]: E1202 07:52:24.077474 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38385316-fca8-41b0-b0ff-570a9cd71e8a" containerName="galera" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.077483 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="38385316-fca8-41b0-b0ff-570a9cd71e8a" containerName="galera" Dec 02 07:52:24 crc kubenswrapper[4895]: E1202 07:52:24.077492 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab5ec753-410a-4d4b-8071-ce60970ba4df" containerName="neutron-api" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.077501 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab5ec753-410a-4d4b-8071-ce60970ba4df" containerName="neutron-api" Dec 02 07:52:24 crc kubenswrapper[4895]: E1202 07:52:24.077513 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d1cb194-5325-40c2-bbd4-0a48821e12aa" containerName="setup-container" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.077521 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d1cb194-5325-40c2-bbd4-0a48821e12aa" containerName="setup-container" Dec 02 07:52:24 crc kubenswrapper[4895]: E1202 07:52:24.077532 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b0ee49-bed2-4691-8160-2edbebda27b7" containerName="ceilometer-central-agent" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.077541 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b0ee49-bed2-4691-8160-2edbebda27b7" containerName="ceilometer-central-agent" Dec 02 07:52:24 crc kubenswrapper[4895]: E1202 
07:52:24.077554 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b0ee49-bed2-4691-8160-2edbebda27b7" containerName="ceilometer-notification-agent" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.077565 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b0ee49-bed2-4691-8160-2edbebda27b7" containerName="ceilometer-notification-agent" Dec 02 07:52:24 crc kubenswrapper[4895]: E1202 07:52:24.077576 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68bddf66-0b9f-4bc8-916b-aa0abfbf13c3" containerName="placement-log" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.077584 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="68bddf66-0b9f-4bc8-916b-aa0abfbf13c3" containerName="placement-log" Dec 02 07:52:24 crc kubenswrapper[4895]: E1202 07:52:24.077596 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11b8ece5-4192-4e13-a1c7-86ed3c627ddf" containerName="container-replicator" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.077604 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="11b8ece5-4192-4e13-a1c7-86ed3c627ddf" containerName="container-replicator" Dec 02 07:52:24 crc kubenswrapper[4895]: E1202 07:52:24.077615 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68bddf66-0b9f-4bc8-916b-aa0abfbf13c3" containerName="placement-api" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.077623 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="68bddf66-0b9f-4bc8-916b-aa0abfbf13c3" containerName="placement-api" Dec 02 07:52:24 crc kubenswrapper[4895]: E1202 07:52:24.077634 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4869eb0-5e33-4837-8295-06ca17076e69" containerName="glance-httpd" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.077642 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4869eb0-5e33-4837-8295-06ca17076e69" containerName="glance-httpd" Dec 02 07:52:24 crc 
kubenswrapper[4895]: E1202 07:52:24.077649 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b3c2445-8bce-4d09-ad86-02c1ba6495fb" containerName="mariadb-account-delete" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.077657 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b3c2445-8bce-4d09-ad86-02c1ba6495fb" containerName="mariadb-account-delete" Dec 02 07:52:24 crc kubenswrapper[4895]: E1202 07:52:24.077665 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b463255-a237-46b0-826d-1e6fc849f0aa" containerName="ovsdb-server-init" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.077674 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b463255-a237-46b0-826d-1e6fc849f0aa" containerName="ovsdb-server-init" Dec 02 07:52:24 crc kubenswrapper[4895]: E1202 07:52:24.077689 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b463255-a237-46b0-826d-1e6fc849f0aa" containerName="ovs-vswitchd" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.077697 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b463255-a237-46b0-826d-1e6fc849f0aa" containerName="ovs-vswitchd" Dec 02 07:52:24 crc kubenswrapper[4895]: E1202 07:52:24.077707 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5cae5c9e-9159-4e78-9809-1801d0e35131" containerName="mariadb-account-delete" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.077718 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="5cae5c9e-9159-4e78-9809-1801d0e35131" containerName="mariadb-account-delete" Dec 02 07:52:24 crc kubenswrapper[4895]: E1202 07:52:24.077727 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11b8ece5-4192-4e13-a1c7-86ed3c627ddf" containerName="container-server" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.077736 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="11b8ece5-4192-4e13-a1c7-86ed3c627ddf" containerName="container-server" Dec 02 
07:52:24 crc kubenswrapper[4895]: E1202 07:52:24.077769 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38385316-fca8-41b0-b0ff-570a9cd71e8a" containerName="mysql-bootstrap" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.077778 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="38385316-fca8-41b0-b0ff-570a9cd71e8a" containerName="mysql-bootstrap" Dec 02 07:52:24 crc kubenswrapper[4895]: E1202 07:52:24.077789 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11b8ece5-4192-4e13-a1c7-86ed3c627ddf" containerName="account-auditor" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.077797 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="11b8ece5-4192-4e13-a1c7-86ed3c627ddf" containerName="account-auditor" Dec 02 07:52:24 crc kubenswrapper[4895]: E1202 07:52:24.077811 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ebf9714-5e6d-415c-a0aa-adab0d3e46e9" containerName="extract-content" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.077819 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ebf9714-5e6d-415c-a0aa-adab0d3e46e9" containerName="extract-content" Dec 02 07:52:24 crc kubenswrapper[4895]: E1202 07:52:24.077829 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab5ec753-410a-4d4b-8071-ce60970ba4df" containerName="neutron-httpd" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.077837 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab5ec753-410a-4d4b-8071-ce60970ba4df" containerName="neutron-httpd" Dec 02 07:52:24 crc kubenswrapper[4895]: E1202 07:52:24.077848 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a3ec758-e19e-4286-bfed-a1d6d3010bfb" containerName="kube-state-metrics" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.077855 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a3ec758-e19e-4286-bfed-a1d6d3010bfb" containerName="kube-state-metrics" Dec 02 07:52:24 crc 
kubenswrapper[4895]: E1202 07:52:24.077869 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca98cba7-4127-4d25-a139-1a42224331f2" containerName="setup-container" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.077877 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca98cba7-4127-4d25-a139-1a42224331f2" containerName="setup-container" Dec 02 07:52:24 crc kubenswrapper[4895]: E1202 07:52:24.077885 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca98cba7-4127-4d25-a139-1a42224331f2" containerName="rabbitmq" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.077893 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca98cba7-4127-4d25-a139-1a42224331f2" containerName="rabbitmq" Dec 02 07:52:24 crc kubenswrapper[4895]: E1202 07:52:24.077902 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11b8ece5-4192-4e13-a1c7-86ed3c627ddf" containerName="object-updater" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.077910 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="11b8ece5-4192-4e13-a1c7-86ed3c627ddf" containerName="object-updater" Dec 02 07:52:24 crc kubenswrapper[4895]: E1202 07:52:24.077920 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3d2bb1c-bd20-473e-b91a-7e2a63ec9f07" containerName="barbican-api" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.077928 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3d2bb1c-bd20-473e-b91a-7e2a63ec9f07" containerName="barbican-api" Dec 02 07:52:24 crc kubenswrapper[4895]: E1202 07:52:24.077944 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d1cb194-5325-40c2-bbd4-0a48821e12aa" containerName="rabbitmq" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.077952 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d1cb194-5325-40c2-bbd4-0a48821e12aa" containerName="rabbitmq" Dec 02 07:52:24 crc kubenswrapper[4895]: E1202 07:52:24.077961 4895 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="290c1303-bf41-4474-86ff-c9f5aa105cc3" containerName="glance-httpd" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.077969 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="290c1303-bf41-4474-86ff-c9f5aa105cc3" containerName="glance-httpd" Dec 02 07:52:24 crc kubenswrapper[4895]: E1202 07:52:24.077981 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ebf9714-5e6d-415c-a0aa-adab0d3e46e9" containerName="registry-server" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.077988 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ebf9714-5e6d-415c-a0aa-adab0d3e46e9" containerName="registry-server" Dec 02 07:52:24 crc kubenswrapper[4895]: E1202 07:52:24.077999 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85e9e481-0762-42a8-a25a-7d50500f1236" containerName="proxy-server" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.078006 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="85e9e481-0762-42a8-a25a-7d50500f1236" containerName="proxy-server" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.078183 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab5ec753-410a-4d4b-8071-ce60970ba4df" containerName="neutron-httpd" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.078201 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4869eb0-5e33-4837-8295-06ca17076e69" containerName="glance-httpd" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.078214 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="290c1303-bf41-4474-86ff-c9f5aa105cc3" containerName="glance-log" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.078226 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="38385316-fca8-41b0-b0ff-570a9cd71e8a" containerName="galera" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.078236 4895 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="2d2c6f66-9fe8-4d15-92f6-f29493e6cd7b" containerName="nova-metadata-log" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.078246 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3d2bb1c-bd20-473e-b91a-7e2a63ec9f07" containerName="barbican-api" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.078259 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="65a02963-abb5-4f29-aa82-88ba6f859a00" containerName="nova-cell0-conductor-conductor" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.078271 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="84116ead-6214-4d5f-98a3-c89b08cf1306" containerName="ovn-controller" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.078282 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="f762a68c-cabc-4842-844a-1db6710e3ee9" containerName="nova-api-log" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.078294 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="11b8ece5-4192-4e13-a1c7-86ed3c627ddf" containerName="object-replicator" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.078306 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="11b8ece5-4192-4e13-a1c7-86ed3c627ddf" containerName="swift-recon-cron" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.078316 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="11b8ece5-4192-4e13-a1c7-86ed3c627ddf" containerName="account-auditor" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.078329 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b0ee49-bed2-4691-8160-2edbebda27b7" containerName="proxy-httpd" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.078341 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3d2bb1c-bd20-473e-b91a-7e2a63ec9f07" containerName="barbican-api-log" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 
07:52:24.078350 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b0ee49-bed2-4691-8160-2edbebda27b7" containerName="sg-core" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.078360 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="446b5a26-8e57-4765-bb7d-275cf05996dd" containerName="ovn-northd" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.078372 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b463255-a237-46b0-826d-1e6fc849f0aa" containerName="ovs-vswitchd" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.078379 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="290c1303-bf41-4474-86ff-c9f5aa105cc3" containerName="glance-httpd" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.078386 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b0ee49-bed2-4691-8160-2edbebda27b7" containerName="ceilometer-notification-agent" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.078396 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="5cae5c9e-9159-4e78-9809-1801d0e35131" containerName="mariadb-account-delete" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.078407 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="11b8ece5-4192-4e13-a1c7-86ed3c627ddf" containerName="object-updater" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.078417 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="68bddf66-0b9f-4bc8-916b-aa0abfbf13c3" containerName="placement-api" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.078426 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="7067a12f-0245-45f5-a806-591d5999c7f0" containerName="mariadb-account-delete" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.078433 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b0ee49-bed2-4691-8160-2edbebda27b7" containerName="ceilometer-central-agent" 
Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.078441 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b3c2445-8bce-4d09-ad86-02c1ba6495fb" containerName="mariadb-account-delete" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.078454 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d2c6f66-9fe8-4d15-92f6-f29493e6cd7b" containerName="nova-metadata-metadata" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.078467 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="d42411e0-2228-4a1a-9d31-e3788f2b1f0c" containerName="mariadb-account-delete" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.078476 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="b15097a8-ac9a-4886-a839-272b662561c5" containerName="memcached" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.078486 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="183c5216-30f9-4f75-865b-7f795ea149fb" containerName="nova-cell1-novncproxy-novncproxy" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.078499 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="11b8ece5-4192-4e13-a1c7-86ed3c627ddf" containerName="account-server" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.078509 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="11b8ece5-4192-4e13-a1c7-86ed3c627ddf" containerName="container-replicator" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.078516 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="11b8ece5-4192-4e13-a1c7-86ed3c627ddf" containerName="object-auditor" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.078524 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab5ec753-410a-4d4b-8071-ce60970ba4df" containerName="neutron-api" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.078536 4895 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="6ebf9714-5e6d-415c-a0aa-adab0d3e46e9" containerName="registry-server" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.078549 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="836bba81-425e-4610-b191-2bbb2cfc1f79" containerName="probe" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.078561 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="836bba81-425e-4610-b191-2bbb2cfc1f79" containerName="cinder-scheduler" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.078569 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="e203ec5f-dd45-44bb-97b2-fd8a548ce231" containerName="registry-server" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.078580 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="96831697-ba2e-477e-954f-e4ad0cf30f92" containerName="mariadb-account-delete" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.078591 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="11b8ece5-4192-4e13-a1c7-86ed3c627ddf" containerName="container-auditor" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.078599 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="11b8ece5-4192-4e13-a1c7-86ed3c627ddf" containerName="object-server" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.078607 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="247c892c-e00a-474e-8022-73bd1b2249f3" containerName="keystone-api" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.078615 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="85e9e481-0762-42a8-a25a-7d50500f1236" containerName="proxy-server" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.078628 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="ace60b46-ed73-43ba-8d95-b81b03a6bd0a" containerName="galera" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.078637 4895 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="11b8ece5-4192-4e13-a1c7-86ed3c627ddf" containerName="container-updater" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.078647 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebbed0ba-1d44-4421-a276-b075b0f31c3f" containerName="cinder-api-log" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.078655 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="f762a68c-cabc-4842-844a-1db6710e3ee9" containerName="nova-api-api" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.078666 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="11b8ece5-4192-4e13-a1c7-86ed3c627ddf" containerName="object-expirer" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.078673 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="5696e7d9-103a-4bf7-9b05-1959e92cf46a" containerName="mariadb-account-delete" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.078681 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a3ec758-e19e-4286-bfed-a1d6d3010bfb" containerName="kube-state-metrics" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.078687 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="446b5a26-8e57-4765-bb7d-275cf05996dd" containerName="openstack-network-exporter" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.078697 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d1cb194-5325-40c2-bbd4-0a48821e12aa" containerName="rabbitmq" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.078704 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca98cba7-4127-4d25-a139-1a42224331f2" containerName="rabbitmq" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.078712 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4869eb0-5e33-4837-8295-06ca17076e69" containerName="glance-log" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.078720 4895 
memory_manager.go:354] "RemoveStaleState removing state" podUID="11b8ece5-4192-4e13-a1c7-86ed3c627ddf" containerName="container-server" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.078731 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="11b8ece5-4192-4e13-a1c7-86ed3c627ddf" containerName="account-reaper" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.078756 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="68bddf66-0b9f-4bc8-916b-aa0abfbf13c3" containerName="placement-log" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.078766 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="f28e5fd3-456b-4960-a3a9-1134e3eecb1f" containerName="mariadb-account-delete" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.078773 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="11b8ece5-4192-4e13-a1c7-86ed3c627ddf" containerName="account-replicator" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.078779 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="11b8ece5-4192-4e13-a1c7-86ed3c627ddf" containerName="rsync" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.078789 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebbed0ba-1d44-4421-a276-b075b0f31c3f" containerName="cinder-api" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.078797 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="31223325-1372-4ea6-867e-f511b7dffc09" containerName="nova-cell1-conductor-conductor" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.078806 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="d08915b6-6f79-40e4-8c26-d9f82606b4cc" containerName="nova-scheduler-scheduler" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.078814 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b463255-a237-46b0-826d-1e6fc849f0aa" containerName="ovsdb-server" Dec 02 07:52:24 crc 
kubenswrapper[4895]: I1202 07:52:24.078822 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="85e9e481-0762-42a8-a25a-7d50500f1236" containerName="proxy-httpd" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.080052 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gjtf4" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.089421 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gjtf4"] Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.173256 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8sgff\" (UniqueName: \"kubernetes.io/projected/956f3abe-0e48-45c8-88d3-52632967ccf1-kube-api-access-8sgff\") pod \"community-operators-gjtf4\" (UID: \"956f3abe-0e48-45c8-88d3-52632967ccf1\") " pod="openshift-marketplace/community-operators-gjtf4" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.173334 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/956f3abe-0e48-45c8-88d3-52632967ccf1-catalog-content\") pod \"community-operators-gjtf4\" (UID: \"956f3abe-0e48-45c8-88d3-52632967ccf1\") " pod="openshift-marketplace/community-operators-gjtf4" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.173439 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/956f3abe-0e48-45c8-88d3-52632967ccf1-utilities\") pod \"community-operators-gjtf4\" (UID: \"956f3abe-0e48-45c8-88d3-52632967ccf1\") " pod="openshift-marketplace/community-operators-gjtf4" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.274465 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/956f3abe-0e48-45c8-88d3-52632967ccf1-utilities\") pod \"community-operators-gjtf4\" (UID: \"956f3abe-0e48-45c8-88d3-52632967ccf1\") " pod="openshift-marketplace/community-operators-gjtf4" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.274555 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8sgff\" (UniqueName: \"kubernetes.io/projected/956f3abe-0e48-45c8-88d3-52632967ccf1-kube-api-access-8sgff\") pod \"community-operators-gjtf4\" (UID: \"956f3abe-0e48-45c8-88d3-52632967ccf1\") " pod="openshift-marketplace/community-operators-gjtf4" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.274586 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/956f3abe-0e48-45c8-88d3-52632967ccf1-catalog-content\") pod \"community-operators-gjtf4\" (UID: \"956f3abe-0e48-45c8-88d3-52632967ccf1\") " pod="openshift-marketplace/community-operators-gjtf4" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.275093 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/956f3abe-0e48-45c8-88d3-52632967ccf1-utilities\") pod \"community-operators-gjtf4\" (UID: \"956f3abe-0e48-45c8-88d3-52632967ccf1\") " pod="openshift-marketplace/community-operators-gjtf4" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.275147 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/956f3abe-0e48-45c8-88d3-52632967ccf1-catalog-content\") pod \"community-operators-gjtf4\" (UID: \"956f3abe-0e48-45c8-88d3-52632967ccf1\") " pod="openshift-marketplace/community-operators-gjtf4" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.295586 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8sgff\" (UniqueName: 
\"kubernetes.io/projected/956f3abe-0e48-45c8-88d3-52632967ccf1-kube-api-access-8sgff\") pod \"community-operators-gjtf4\" (UID: \"956f3abe-0e48-45c8-88d3-52632967ccf1\") " pod="openshift-marketplace/community-operators-gjtf4" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.401620 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gjtf4" Dec 02 07:52:24 crc kubenswrapper[4895]: I1202 07:52:24.716164 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gjtf4"] Dec 02 07:52:25 crc kubenswrapper[4895]: I1202 07:52:25.284919 4895 generic.go:334] "Generic (PLEG): container finished" podID="956f3abe-0e48-45c8-88d3-52632967ccf1" containerID="a63bca6d90f5b1fd4f2b0e23a4d959f984bc656c936ef04ce7fd7469115faeca" exitCode=0 Dec 02 07:52:25 crc kubenswrapper[4895]: I1202 07:52:25.284985 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gjtf4" event={"ID":"956f3abe-0e48-45c8-88d3-52632967ccf1","Type":"ContainerDied","Data":"a63bca6d90f5b1fd4f2b0e23a4d959f984bc656c936ef04ce7fd7469115faeca"} Dec 02 07:52:25 crc kubenswrapper[4895]: I1202 07:52:25.285023 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gjtf4" event={"ID":"956f3abe-0e48-45c8-88d3-52632967ccf1","Type":"ContainerStarted","Data":"eb62f49a90a395052afca95715bdffd5ca38a9b525c7d88124f59776c6cfafb4"} Dec 02 07:52:25 crc kubenswrapper[4895]: I1202 07:52:25.287672 4895 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 02 07:52:25 crc kubenswrapper[4895]: I1202 07:52:25.789208 4895 scope.go:117] "RemoveContainer" containerID="8a861d47b18ce485f266fa0a57adf3455c385cacb617b37e3b45a4bd17799c71" Dec 02 07:52:25 crc kubenswrapper[4895]: I1202 07:52:25.812247 4895 scope.go:117] "RemoveContainer" 
containerID="23cadf09ae70804eb20adc9739731c7b4ef414d2337accf34d11dc986b7b6ba7" Dec 02 07:52:25 crc kubenswrapper[4895]: I1202 07:52:25.853123 4895 scope.go:117] "RemoveContainer" containerID="e169ef89006889ec6af2f91025c33437d207c43337cbf66cac7f9e3e6b3263f5" Dec 02 07:52:25 crc kubenswrapper[4895]: I1202 07:52:25.871309 4895 scope.go:117] "RemoveContainer" containerID="bb176098e0c7a61181ec4600276b01e97b71134a0d909bf6ea15be259cecec59" Dec 02 07:52:25 crc kubenswrapper[4895]: I1202 07:52:25.888880 4895 scope.go:117] "RemoveContainer" containerID="949ad4d21813d885979595286daba6ad241d3bf3aac10ca8c334398ba63d2324" Dec 02 07:52:25 crc kubenswrapper[4895]: I1202 07:52:25.916388 4895 scope.go:117] "RemoveContainer" containerID="93b2d419eb18cfab0debf5c9a11d016c6acafb519c1028e43dceebb955ec1a84" Dec 02 07:52:25 crc kubenswrapper[4895]: I1202 07:52:25.952697 4895 scope.go:117] "RemoveContainer" containerID="f274cc78e83e7f731660b694da5330a7d62b23969ffd44e4119df3815dcb2352" Dec 02 07:52:25 crc kubenswrapper[4895]: I1202 07:52:25.977939 4895 scope.go:117] "RemoveContainer" containerID="d5777f068b1e0673ef51659de82f8858e811deeeea16f976ccf1ba303e0272c4" Dec 02 07:52:26 crc kubenswrapper[4895]: I1202 07:52:26.002486 4895 scope.go:117] "RemoveContainer" containerID="8905a04cd6af553d962d3110181cd121c314f07c69d9726566c2e6fedcbedc7d" Dec 02 07:52:26 crc kubenswrapper[4895]: I1202 07:52:26.039985 4895 scope.go:117] "RemoveContainer" containerID="19a1d8c117923ca651ad11ad738b188337a1af0826d51b2eb118181006dd5479" Dec 02 07:52:26 crc kubenswrapper[4895]: I1202 07:52:26.063073 4895 scope.go:117] "RemoveContainer" containerID="bd9e831f88d074ed4ebcb3f0c21947564533211ce824af698b616217e7b83e86" Dec 02 07:52:26 crc kubenswrapper[4895]: I1202 07:52:26.088217 4895 scope.go:117] "RemoveContainer" containerID="b0e5d2da099fb073b8b5423e92932a90a8ea91c926fc7b91aa4ebeabcd5e1f3b" Dec 02 07:52:26 crc kubenswrapper[4895]: I1202 07:52:26.149399 4895 scope.go:117] "RemoveContainer" 
containerID="fe38dca9f6627e9e19b2be20b54cb47cb1aee5e491dae454c261bcbe08243752" Dec 02 07:52:26 crc kubenswrapper[4895]: I1202 07:52:26.176840 4895 scope.go:117] "RemoveContainer" containerID="c0d40bd925f15211d99af8cacd53d2e85f799a87ce053777a156c72dcd0fd1bc" Dec 02 07:52:26 crc kubenswrapper[4895]: I1202 07:52:26.199175 4895 scope.go:117] "RemoveContainer" containerID="44ae8909515453d51c81fc2eab9723fc18e5cf8dc79ec16427db8d716e2d75dd" Dec 02 07:52:26 crc kubenswrapper[4895]: I1202 07:52:26.218611 4895 scope.go:117] "RemoveContainer" containerID="2404d0d162ba97497121e295a4d0041b66d86ff11fa14a769019cf11872671c2" Dec 02 07:52:26 crc kubenswrapper[4895]: I1202 07:52:26.249835 4895 scope.go:117] "RemoveContainer" containerID="64d55fa59ae42c2b94eedd5c0718c32785d6d8f8fd9e60167c590468901ed0c0" Dec 02 07:52:26 crc kubenswrapper[4895]: I1202 07:52:26.277125 4895 scope.go:117] "RemoveContainer" containerID="4362e47d57c98a2bd4e29c4d3aa4369c5e901649c4440c8bab3af675617ff778" Dec 02 07:52:26 crc kubenswrapper[4895]: I1202 07:52:26.302667 4895 scope.go:117] "RemoveContainer" containerID="3a0d36cdfb3f77e74dda0c49d0558e6c7571700d4bfd6cdaa1acbb5f35e6a972" Dec 02 07:52:26 crc kubenswrapper[4895]: I1202 07:52:26.313617 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gjtf4" event={"ID":"956f3abe-0e48-45c8-88d3-52632967ccf1","Type":"ContainerStarted","Data":"5a67faf12edf71ed5fed90d564fd903ceb98c484811460ed9e1345bcd16ae628"} Dec 02 07:52:27 crc kubenswrapper[4895]: I1202 07:52:27.332353 4895 generic.go:334] "Generic (PLEG): container finished" podID="956f3abe-0e48-45c8-88d3-52632967ccf1" containerID="5a67faf12edf71ed5fed90d564fd903ceb98c484811460ed9e1345bcd16ae628" exitCode=0 Dec 02 07:52:27 crc kubenswrapper[4895]: I1202 07:52:27.332425 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gjtf4" 
event={"ID":"956f3abe-0e48-45c8-88d3-52632967ccf1","Type":"ContainerDied","Data":"5a67faf12edf71ed5fed90d564fd903ceb98c484811460ed9e1345bcd16ae628"} Dec 02 07:52:28 crc kubenswrapper[4895]: I1202 07:52:28.344406 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gjtf4" event={"ID":"956f3abe-0e48-45c8-88d3-52632967ccf1","Type":"ContainerStarted","Data":"d57025562ba3949d0a4988e60b87a92f5a33cbe71c9a80431fc7306d96185c31"} Dec 02 07:52:28 crc kubenswrapper[4895]: I1202 07:52:28.367901 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-gjtf4" podStartSLOduration=1.714723231 podStartE2EDuration="4.367878397s" podCreationTimestamp="2025-12-02 07:52:24 +0000 UTC" firstStartedPulling="2025-12-02 07:52:25.287371294 +0000 UTC m=+1756.458230907" lastFinishedPulling="2025-12-02 07:52:27.94052643 +0000 UTC m=+1759.111386073" observedRunningTime="2025-12-02 07:52:28.365613898 +0000 UTC m=+1759.536473511" watchObservedRunningTime="2025-12-02 07:52:28.367878397 +0000 UTC m=+1759.538738020" Dec 02 07:52:34 crc kubenswrapper[4895]: I1202 07:52:34.402099 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-gjtf4" Dec 02 07:52:34 crc kubenswrapper[4895]: I1202 07:52:34.402596 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-gjtf4" Dec 02 07:52:34 crc kubenswrapper[4895]: I1202 07:52:34.449499 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-gjtf4" Dec 02 07:52:35 crc kubenswrapper[4895]: I1202 07:52:35.467880 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-gjtf4" Dec 02 07:52:35 crc kubenswrapper[4895]: I1202 07:52:35.520828 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-gjtf4"] Dec 02 07:52:36 crc kubenswrapper[4895]: I1202 07:52:36.142070 4895 scope.go:117] "RemoveContainer" containerID="9934068f902577cab2b9f5b749fcea3de9f9afb4c928af19ac4dbad51ebdaa03" Dec 02 07:52:36 crc kubenswrapper[4895]: E1202 07:52:36.142513 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 07:52:37 crc kubenswrapper[4895]: I1202 07:52:37.431684 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-gjtf4" podUID="956f3abe-0e48-45c8-88d3-52632967ccf1" containerName="registry-server" containerID="cri-o://d57025562ba3949d0a4988e60b87a92f5a33cbe71c9a80431fc7306d96185c31" gracePeriod=2 Dec 02 07:52:38 crc kubenswrapper[4895]: I1202 07:52:38.340397 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gjtf4" Dec 02 07:52:38 crc kubenswrapper[4895]: I1202 07:52:38.441144 4895 generic.go:334] "Generic (PLEG): container finished" podID="956f3abe-0e48-45c8-88d3-52632967ccf1" containerID="d57025562ba3949d0a4988e60b87a92f5a33cbe71c9a80431fc7306d96185c31" exitCode=0 Dec 02 07:52:38 crc kubenswrapper[4895]: I1202 07:52:38.441216 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gjtf4" Dec 02 07:52:38 crc kubenswrapper[4895]: I1202 07:52:38.441218 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gjtf4" event={"ID":"956f3abe-0e48-45c8-88d3-52632967ccf1","Type":"ContainerDied","Data":"d57025562ba3949d0a4988e60b87a92f5a33cbe71c9a80431fc7306d96185c31"} Dec 02 07:52:38 crc kubenswrapper[4895]: I1202 07:52:38.441327 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gjtf4" event={"ID":"956f3abe-0e48-45c8-88d3-52632967ccf1","Type":"ContainerDied","Data":"eb62f49a90a395052afca95715bdffd5ca38a9b525c7d88124f59776c6cfafb4"} Dec 02 07:52:38 crc kubenswrapper[4895]: I1202 07:52:38.441354 4895 scope.go:117] "RemoveContainer" containerID="d57025562ba3949d0a4988e60b87a92f5a33cbe71c9a80431fc7306d96185c31" Dec 02 07:52:38 crc kubenswrapper[4895]: I1202 07:52:38.459628 4895 scope.go:117] "RemoveContainer" containerID="5a67faf12edf71ed5fed90d564fd903ceb98c484811460ed9e1345bcd16ae628" Dec 02 07:52:38 crc kubenswrapper[4895]: I1202 07:52:38.483569 4895 scope.go:117] "RemoveContainer" containerID="a63bca6d90f5b1fd4f2b0e23a4d959f984bc656c936ef04ce7fd7469115faeca" Dec 02 07:52:38 crc kubenswrapper[4895]: I1202 07:52:38.507520 4895 scope.go:117] "RemoveContainer" containerID="d57025562ba3949d0a4988e60b87a92f5a33cbe71c9a80431fc7306d96185c31" Dec 02 07:52:38 crc kubenswrapper[4895]: E1202 07:52:38.508138 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d57025562ba3949d0a4988e60b87a92f5a33cbe71c9a80431fc7306d96185c31\": container with ID starting with d57025562ba3949d0a4988e60b87a92f5a33cbe71c9a80431fc7306d96185c31 not found: ID does not exist" containerID="d57025562ba3949d0a4988e60b87a92f5a33cbe71c9a80431fc7306d96185c31" Dec 02 07:52:38 crc kubenswrapper[4895]: I1202 07:52:38.508190 4895 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d57025562ba3949d0a4988e60b87a92f5a33cbe71c9a80431fc7306d96185c31"} err="failed to get container status \"d57025562ba3949d0a4988e60b87a92f5a33cbe71c9a80431fc7306d96185c31\": rpc error: code = NotFound desc = could not find container \"d57025562ba3949d0a4988e60b87a92f5a33cbe71c9a80431fc7306d96185c31\": container with ID starting with d57025562ba3949d0a4988e60b87a92f5a33cbe71c9a80431fc7306d96185c31 not found: ID does not exist" Dec 02 07:52:38 crc kubenswrapper[4895]: I1202 07:52:38.508222 4895 scope.go:117] "RemoveContainer" containerID="5a67faf12edf71ed5fed90d564fd903ceb98c484811460ed9e1345bcd16ae628" Dec 02 07:52:38 crc kubenswrapper[4895]: E1202 07:52:38.508613 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a67faf12edf71ed5fed90d564fd903ceb98c484811460ed9e1345bcd16ae628\": container with ID starting with 5a67faf12edf71ed5fed90d564fd903ceb98c484811460ed9e1345bcd16ae628 not found: ID does not exist" containerID="5a67faf12edf71ed5fed90d564fd903ceb98c484811460ed9e1345bcd16ae628" Dec 02 07:52:38 crc kubenswrapper[4895]: I1202 07:52:38.508647 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a67faf12edf71ed5fed90d564fd903ceb98c484811460ed9e1345bcd16ae628"} err="failed to get container status \"5a67faf12edf71ed5fed90d564fd903ceb98c484811460ed9e1345bcd16ae628\": rpc error: code = NotFound desc = could not find container \"5a67faf12edf71ed5fed90d564fd903ceb98c484811460ed9e1345bcd16ae628\": container with ID starting with 5a67faf12edf71ed5fed90d564fd903ceb98c484811460ed9e1345bcd16ae628 not found: ID does not exist" Dec 02 07:52:38 crc kubenswrapper[4895]: I1202 07:52:38.508673 4895 scope.go:117] "RemoveContainer" containerID="a63bca6d90f5b1fd4f2b0e23a4d959f984bc656c936ef04ce7fd7469115faeca" Dec 02 07:52:38 crc kubenswrapper[4895]: E1202 07:52:38.509091 4895 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a63bca6d90f5b1fd4f2b0e23a4d959f984bc656c936ef04ce7fd7469115faeca\": container with ID starting with a63bca6d90f5b1fd4f2b0e23a4d959f984bc656c936ef04ce7fd7469115faeca not found: ID does not exist" containerID="a63bca6d90f5b1fd4f2b0e23a4d959f984bc656c936ef04ce7fd7469115faeca" Dec 02 07:52:38 crc kubenswrapper[4895]: I1202 07:52:38.509120 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a63bca6d90f5b1fd4f2b0e23a4d959f984bc656c936ef04ce7fd7469115faeca"} err="failed to get container status \"a63bca6d90f5b1fd4f2b0e23a4d959f984bc656c936ef04ce7fd7469115faeca\": rpc error: code = NotFound desc = could not find container \"a63bca6d90f5b1fd4f2b0e23a4d959f984bc656c936ef04ce7fd7469115faeca\": container with ID starting with a63bca6d90f5b1fd4f2b0e23a4d959f984bc656c936ef04ce7fd7469115faeca not found: ID does not exist" Dec 02 07:52:38 crc kubenswrapper[4895]: I1202 07:52:38.527335 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/956f3abe-0e48-45c8-88d3-52632967ccf1-utilities\") pod \"956f3abe-0e48-45c8-88d3-52632967ccf1\" (UID: \"956f3abe-0e48-45c8-88d3-52632967ccf1\") " Dec 02 07:52:38 crc kubenswrapper[4895]: I1202 07:52:38.527418 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/956f3abe-0e48-45c8-88d3-52632967ccf1-catalog-content\") pod \"956f3abe-0e48-45c8-88d3-52632967ccf1\" (UID: \"956f3abe-0e48-45c8-88d3-52632967ccf1\") " Dec 02 07:52:38 crc kubenswrapper[4895]: I1202 07:52:38.527613 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8sgff\" (UniqueName: \"kubernetes.io/projected/956f3abe-0e48-45c8-88d3-52632967ccf1-kube-api-access-8sgff\") pod \"956f3abe-0e48-45c8-88d3-52632967ccf1\" 
(UID: \"956f3abe-0e48-45c8-88d3-52632967ccf1\") " Dec 02 07:52:38 crc kubenswrapper[4895]: I1202 07:52:38.528120 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/956f3abe-0e48-45c8-88d3-52632967ccf1-utilities" (OuterVolumeSpecName: "utilities") pod "956f3abe-0e48-45c8-88d3-52632967ccf1" (UID: "956f3abe-0e48-45c8-88d3-52632967ccf1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:52:38 crc kubenswrapper[4895]: I1202 07:52:38.533257 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/956f3abe-0e48-45c8-88d3-52632967ccf1-kube-api-access-8sgff" (OuterVolumeSpecName: "kube-api-access-8sgff") pod "956f3abe-0e48-45c8-88d3-52632967ccf1" (UID: "956f3abe-0e48-45c8-88d3-52632967ccf1"). InnerVolumeSpecName "kube-api-access-8sgff". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:52:38 crc kubenswrapper[4895]: I1202 07:52:38.582981 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/956f3abe-0e48-45c8-88d3-52632967ccf1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "956f3abe-0e48-45c8-88d3-52632967ccf1" (UID: "956f3abe-0e48-45c8-88d3-52632967ccf1"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:52:38 crc kubenswrapper[4895]: I1202 07:52:38.630040 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8sgff\" (UniqueName: \"kubernetes.io/projected/956f3abe-0e48-45c8-88d3-52632967ccf1-kube-api-access-8sgff\") on node \"crc\" DevicePath \"\"" Dec 02 07:52:38 crc kubenswrapper[4895]: I1202 07:52:38.630088 4895 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/956f3abe-0e48-45c8-88d3-52632967ccf1-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 07:52:38 crc kubenswrapper[4895]: I1202 07:52:38.630100 4895 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/956f3abe-0e48-45c8-88d3-52632967ccf1-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 07:52:38 crc kubenswrapper[4895]: I1202 07:52:38.771250 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gjtf4"] Dec 02 07:52:38 crc kubenswrapper[4895]: I1202 07:52:38.778478 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-gjtf4"] Dec 02 07:52:39 crc kubenswrapper[4895]: I1202 07:52:39.150657 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="956f3abe-0e48-45c8-88d3-52632967ccf1" path="/var/lib/kubelet/pods/956f3abe-0e48-45c8-88d3-52632967ccf1/volumes" Dec 02 07:52:49 crc kubenswrapper[4895]: I1202 07:52:49.145086 4895 scope.go:117] "RemoveContainer" containerID="9934068f902577cab2b9f5b749fcea3de9f9afb4c928af19ac4dbad51ebdaa03" Dec 02 07:52:49 crc kubenswrapper[4895]: E1202 07:52:49.145789 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 07:53:01 crc kubenswrapper[4895]: I1202 07:53:01.142252 4895 scope.go:117] "RemoveContainer" containerID="9934068f902577cab2b9f5b749fcea3de9f9afb4c928af19ac4dbad51ebdaa03" Dec 02 07:53:01 crc kubenswrapper[4895]: E1202 07:53:01.143656 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 07:53:13 crc kubenswrapper[4895]: I1202 07:53:13.141832 4895 scope.go:117] "RemoveContainer" containerID="9934068f902577cab2b9f5b749fcea3de9f9afb4c928af19ac4dbad51ebdaa03" Dec 02 07:53:13 crc kubenswrapper[4895]: E1202 07:53:13.142985 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 07:53:26 crc kubenswrapper[4895]: I1202 07:53:26.550066 4895 scope.go:117] "RemoveContainer" containerID="8a759c6911d74b1f1d0481259494f6c68447fd16d6ae68cb602fe2ebde521347" Dec 02 07:53:26 crc kubenswrapper[4895]: I1202 07:53:26.579287 4895 scope.go:117] "RemoveContainer" containerID="b1a1c5160c9558d73203d10d42adf88f8ff05038c8885766513269062a2ce0c0" Dec 02 07:53:26 crc kubenswrapper[4895]: I1202 07:53:26.627705 4895 
scope.go:117] "RemoveContainer" containerID="6ca835b0b75e3696527a82637f8aa060b70b3d711663ec42e2f269ae07704a6b" Dec 02 07:53:26 crc kubenswrapper[4895]: I1202 07:53:26.659547 4895 scope.go:117] "RemoveContainer" containerID="6ef5e37085909aad803297f2e65887d60d3b2a7265ec5b2edec0715738e2c133" Dec 02 07:53:26 crc kubenswrapper[4895]: I1202 07:53:26.680547 4895 scope.go:117] "RemoveContainer" containerID="b0520efbfddb0b37fbb7a65afbe9383817ccf9f6ae11082d2fa3e2c3a88b743f" Dec 02 07:53:26 crc kubenswrapper[4895]: I1202 07:53:26.706359 4895 scope.go:117] "RemoveContainer" containerID="391b022ce7caf5397988776e4babbf6233f29f15977989bdaf58d428c0573bde" Dec 02 07:53:26 crc kubenswrapper[4895]: I1202 07:53:26.762506 4895 scope.go:117] "RemoveContainer" containerID="62b374c4faec2438d9bd41034a5738b43a2bf2fbe98618d82d77b24eeb955851" Dec 02 07:53:26 crc kubenswrapper[4895]: I1202 07:53:26.826486 4895 scope.go:117] "RemoveContainer" containerID="45d9908b6c5cd875b205c3155ba480192c4dc6d4df37c9c88146d86fdf68c7e6" Dec 02 07:53:28 crc kubenswrapper[4895]: I1202 07:53:28.140837 4895 scope.go:117] "RemoveContainer" containerID="9934068f902577cab2b9f5b749fcea3de9f9afb4c928af19ac4dbad51ebdaa03" Dec 02 07:53:28 crc kubenswrapper[4895]: E1202 07:53:28.141431 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 07:53:40 crc kubenswrapper[4895]: I1202 07:53:40.142572 4895 scope.go:117] "RemoveContainer" containerID="9934068f902577cab2b9f5b749fcea3de9f9afb4c928af19ac4dbad51ebdaa03" Dec 02 07:53:40 crc kubenswrapper[4895]: E1202 07:53:40.143687 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 07:53:52 crc kubenswrapper[4895]: I1202 07:53:52.141389 4895 scope.go:117] "RemoveContainer" containerID="9934068f902577cab2b9f5b749fcea3de9f9afb4c928af19ac4dbad51ebdaa03" Dec 02 07:53:52 crc kubenswrapper[4895]: E1202 07:53:52.143070 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 07:54:03 crc kubenswrapper[4895]: I1202 07:54:03.141886 4895 scope.go:117] "RemoveContainer" containerID="9934068f902577cab2b9f5b749fcea3de9f9afb4c928af19ac4dbad51ebdaa03" Dec 02 07:54:03 crc kubenswrapper[4895]: E1202 07:54:03.143094 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 07:54:18 crc kubenswrapper[4895]: I1202 07:54:18.141339 4895 scope.go:117] "RemoveContainer" containerID="9934068f902577cab2b9f5b749fcea3de9f9afb4c928af19ac4dbad51ebdaa03" Dec 02 07:54:18 crc kubenswrapper[4895]: I1202 07:54:18.336791 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" event={"ID":"0468d2d1-a975-45a6-af9f-47adc432fab0","Type":"ContainerStarted","Data":"3c44d318ec461337d9eed3b738b223965ab15c013c5374cf7d1e76d7977871d2"} Dec 02 07:54:26 crc kubenswrapper[4895]: I1202 07:54:26.952215 4895 scope.go:117] "RemoveContainer" containerID="fc3f6c60c5579c4ea14cacedf2c65b1a8d013562a1bf58b34c7f59f9c0b1bdbe" Dec 02 07:54:26 crc kubenswrapper[4895]: I1202 07:54:26.992970 4895 scope.go:117] "RemoveContainer" containerID="c3e6869faec7cbcaa9cf951bb70a1cabb5f7864bf03d30d0b3bda1d511028024" Dec 02 07:54:27 crc kubenswrapper[4895]: I1202 07:54:27.066678 4895 scope.go:117] "RemoveContainer" containerID="21f6d09bc2b80b8035a54dfa404bb01cbc6de2843d53dca435681f4b45dafd2f" Dec 02 07:54:27 crc kubenswrapper[4895]: I1202 07:54:27.105364 4895 scope.go:117] "RemoveContainer" containerID="1581cb4c4b70dcc4008550020a88177eb72fd5b2057dc2f0204082b9090480c2" Dec 02 07:54:27 crc kubenswrapper[4895]: I1202 07:54:27.138399 4895 scope.go:117] "RemoveContainer" containerID="a18e722962390f2024c510ade1f26e4551f58f4c4c7c9b941662a44001c505ea" Dec 02 07:54:27 crc kubenswrapper[4895]: I1202 07:54:27.160660 4895 scope.go:117] "RemoveContainer" containerID="7dc2853c20a38045953efd3752aa502543cbbe08dd450481c9d49ada9a7e28ab" Dec 02 07:54:27 crc kubenswrapper[4895]: I1202 07:54:27.184378 4895 scope.go:117] "RemoveContainer" containerID="e21126490e30d0f2abdaa9c6468d800825eee8d11c80c4baee6ce5e501917408" Dec 02 07:54:27 crc kubenswrapper[4895]: I1202 07:54:27.212122 4895 scope.go:117] "RemoveContainer" containerID="c51c9cadb8af000c2a708fd441d7a16102397aef9d4301d9ddb87d8386fc6024" Dec 02 07:54:27 crc kubenswrapper[4895]: I1202 07:54:27.235505 4895 scope.go:117] "RemoveContainer" containerID="24c551cd8bbb34832b5693a91b97f7fc6d801619091d62b54a02c1b5f9bcbd45" Dec 02 07:54:27 crc kubenswrapper[4895]: I1202 07:54:27.255626 4895 scope.go:117] "RemoveContainer" 
containerID="7a9cd5ea2cd01d61f6bb76eff238c970ae03c6d9c57bfc437465a95ac614529c" Dec 02 07:54:27 crc kubenswrapper[4895]: I1202 07:54:27.280615 4895 scope.go:117] "RemoveContainer" containerID="cb9866d7f2171a1626ecf3c4140a850dff5554a37f5e78b53e02cd154e5fe2d5" Dec 02 07:54:27 crc kubenswrapper[4895]: I1202 07:54:27.313151 4895 scope.go:117] "RemoveContainer" containerID="9b1129b02fb76389880616b0a4f07ba64c625c630960d277f41251bdc884c35b" Dec 02 07:55:27 crc kubenswrapper[4895]: I1202 07:55:27.467216 4895 scope.go:117] "RemoveContainer" containerID="886e593440f5f547e624e04c372422144b2af46990afd1b6c63c56f2dacb354f" Dec 02 07:55:27 crc kubenswrapper[4895]: I1202 07:55:27.497160 4895 scope.go:117] "RemoveContainer" containerID="9feda8f6a8375fc053369762aade6135c47ba96ba66d70d924ce2840072589a8" Dec 02 07:55:27 crc kubenswrapper[4895]: I1202 07:55:27.519262 4895 scope.go:117] "RemoveContainer" containerID="ef6521d9b2bbb3f545c7a699a0a97be4fbbe9fbc46696c6a6287b9e2ee4ce0a8" Dec 02 07:55:27 crc kubenswrapper[4895]: I1202 07:55:27.562089 4895 scope.go:117] "RemoveContainer" containerID="f9caf1a101a5817e21ca4e677736eb4472cb7fca73f00ec0f90330253ace0248" Dec 02 07:55:27 crc kubenswrapper[4895]: I1202 07:55:27.587620 4895 scope.go:117] "RemoveContainer" containerID="b69fdaed99708c1282b6d6c9ebd2f76e8972907d394090d23baa40f074e3ff2c" Dec 02 07:56:35 crc kubenswrapper[4895]: I1202 07:56:35.474044 4895 patch_prober.go:28] interesting pod/machine-config-daemon-wfcg7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 07:56:35 crc kubenswrapper[4895]: I1202 07:56:35.474658 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 07:57:05 crc kubenswrapper[4895]: I1202 07:57:05.474685 4895 patch_prober.go:28] interesting pod/machine-config-daemon-wfcg7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 07:57:05 crc kubenswrapper[4895]: I1202 07:57:05.475148 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 07:57:35 crc kubenswrapper[4895]: I1202 07:57:35.473105 4895 patch_prober.go:28] interesting pod/machine-config-daemon-wfcg7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 07:57:35 crc kubenswrapper[4895]: I1202 07:57:35.473710 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 07:57:35 crc kubenswrapper[4895]: I1202 07:57:35.473790 4895 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" Dec 02 07:57:35 crc kubenswrapper[4895]: I1202 07:57:35.474612 4895 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"3c44d318ec461337d9eed3b738b223965ab15c013c5374cf7d1e76d7977871d2"} pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 07:57:35 crc kubenswrapper[4895]: I1202 07:57:35.474677 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerName="machine-config-daemon" containerID="cri-o://3c44d318ec461337d9eed3b738b223965ab15c013c5374cf7d1e76d7977871d2" gracePeriod=600 Dec 02 07:57:36 crc kubenswrapper[4895]: I1202 07:57:36.141175 4895 generic.go:334] "Generic (PLEG): container finished" podID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerID="3c44d318ec461337d9eed3b738b223965ab15c013c5374cf7d1e76d7977871d2" exitCode=0 Dec 02 07:57:36 crc kubenswrapper[4895]: I1202 07:57:36.141250 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" event={"ID":"0468d2d1-a975-45a6-af9f-47adc432fab0","Type":"ContainerDied","Data":"3c44d318ec461337d9eed3b738b223965ab15c013c5374cf7d1e76d7977871d2"} Dec 02 07:57:36 crc kubenswrapper[4895]: I1202 07:57:36.141541 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" event={"ID":"0468d2d1-a975-45a6-af9f-47adc432fab0","Type":"ContainerStarted","Data":"238d09bc62d590c652fe96ecc9345d3c3a5705c84a89d4f55cbeac25f302e139"} Dec 02 07:57:36 crc kubenswrapper[4895]: I1202 07:57:36.141562 4895 scope.go:117] "RemoveContainer" containerID="9934068f902577cab2b9f5b749fcea3de9f9afb4c928af19ac4dbad51ebdaa03" Dec 02 07:57:42 crc kubenswrapper[4895]: I1202 07:57:42.702488 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-2sspq"] Dec 02 07:57:42 crc kubenswrapper[4895]: E1202 07:57:42.706134 
4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="956f3abe-0e48-45c8-88d3-52632967ccf1" containerName="extract-utilities" Dec 02 07:57:42 crc kubenswrapper[4895]: I1202 07:57:42.706159 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="956f3abe-0e48-45c8-88d3-52632967ccf1" containerName="extract-utilities" Dec 02 07:57:42 crc kubenswrapper[4895]: E1202 07:57:42.706176 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="956f3abe-0e48-45c8-88d3-52632967ccf1" containerName="registry-server" Dec 02 07:57:42 crc kubenswrapper[4895]: I1202 07:57:42.706185 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="956f3abe-0e48-45c8-88d3-52632967ccf1" containerName="registry-server" Dec 02 07:57:42 crc kubenswrapper[4895]: E1202 07:57:42.706198 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="956f3abe-0e48-45c8-88d3-52632967ccf1" containerName="extract-content" Dec 02 07:57:42 crc kubenswrapper[4895]: I1202 07:57:42.706207 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="956f3abe-0e48-45c8-88d3-52632967ccf1" containerName="extract-content" Dec 02 07:57:42 crc kubenswrapper[4895]: I1202 07:57:42.706451 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="956f3abe-0e48-45c8-88d3-52632967ccf1" containerName="registry-server" Dec 02 07:57:42 crc kubenswrapper[4895]: I1202 07:57:42.708967 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2sspq" Dec 02 07:57:42 crc kubenswrapper[4895]: I1202 07:57:42.718422 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2sspq"] Dec 02 07:57:42 crc kubenswrapper[4895]: I1202 07:57:42.750839 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvfln\" (UniqueName: \"kubernetes.io/projected/a34e0e85-3063-4a2c-a763-e34ed502f6c6-kube-api-access-tvfln\") pod \"redhat-operators-2sspq\" (UID: \"a34e0e85-3063-4a2c-a763-e34ed502f6c6\") " pod="openshift-marketplace/redhat-operators-2sspq" Dec 02 07:57:42 crc kubenswrapper[4895]: I1202 07:57:42.750995 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a34e0e85-3063-4a2c-a763-e34ed502f6c6-utilities\") pod \"redhat-operators-2sspq\" (UID: \"a34e0e85-3063-4a2c-a763-e34ed502f6c6\") " pod="openshift-marketplace/redhat-operators-2sspq" Dec 02 07:57:42 crc kubenswrapper[4895]: I1202 07:57:42.751094 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a34e0e85-3063-4a2c-a763-e34ed502f6c6-catalog-content\") pod \"redhat-operators-2sspq\" (UID: \"a34e0e85-3063-4a2c-a763-e34ed502f6c6\") " pod="openshift-marketplace/redhat-operators-2sspq" Dec 02 07:57:42 crc kubenswrapper[4895]: I1202 07:57:42.852325 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvfln\" (UniqueName: \"kubernetes.io/projected/a34e0e85-3063-4a2c-a763-e34ed502f6c6-kube-api-access-tvfln\") pod \"redhat-operators-2sspq\" (UID: \"a34e0e85-3063-4a2c-a763-e34ed502f6c6\") " pod="openshift-marketplace/redhat-operators-2sspq" Dec 02 07:57:42 crc kubenswrapper[4895]: I1202 07:57:42.852429 4895 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a34e0e85-3063-4a2c-a763-e34ed502f6c6-utilities\") pod \"redhat-operators-2sspq\" (UID: \"a34e0e85-3063-4a2c-a763-e34ed502f6c6\") " pod="openshift-marketplace/redhat-operators-2sspq" Dec 02 07:57:42 crc kubenswrapper[4895]: I1202 07:57:42.852501 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a34e0e85-3063-4a2c-a763-e34ed502f6c6-catalog-content\") pod \"redhat-operators-2sspq\" (UID: \"a34e0e85-3063-4a2c-a763-e34ed502f6c6\") " pod="openshift-marketplace/redhat-operators-2sspq" Dec 02 07:57:42 crc kubenswrapper[4895]: I1202 07:57:42.853012 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a34e0e85-3063-4a2c-a763-e34ed502f6c6-utilities\") pod \"redhat-operators-2sspq\" (UID: \"a34e0e85-3063-4a2c-a763-e34ed502f6c6\") " pod="openshift-marketplace/redhat-operators-2sspq" Dec 02 07:57:42 crc kubenswrapper[4895]: I1202 07:57:42.853179 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a34e0e85-3063-4a2c-a763-e34ed502f6c6-catalog-content\") pod \"redhat-operators-2sspq\" (UID: \"a34e0e85-3063-4a2c-a763-e34ed502f6c6\") " pod="openshift-marketplace/redhat-operators-2sspq" Dec 02 07:57:42 crc kubenswrapper[4895]: I1202 07:57:42.877584 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvfln\" (UniqueName: \"kubernetes.io/projected/a34e0e85-3063-4a2c-a763-e34ed502f6c6-kube-api-access-tvfln\") pod \"redhat-operators-2sspq\" (UID: \"a34e0e85-3063-4a2c-a763-e34ed502f6c6\") " pod="openshift-marketplace/redhat-operators-2sspq" Dec 02 07:57:43 crc kubenswrapper[4895]: I1202 07:57:43.063762 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2sspq" Dec 02 07:57:43 crc kubenswrapper[4895]: I1202 07:57:43.523476 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2sspq"] Dec 02 07:57:44 crc kubenswrapper[4895]: I1202 07:57:44.212074 4895 generic.go:334] "Generic (PLEG): container finished" podID="a34e0e85-3063-4a2c-a763-e34ed502f6c6" containerID="560b6327efea7d828c93e7112d8993b9d34f7ccd8732de90efadd43153b6d188" exitCode=0 Dec 02 07:57:44 crc kubenswrapper[4895]: I1202 07:57:44.212137 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2sspq" event={"ID":"a34e0e85-3063-4a2c-a763-e34ed502f6c6","Type":"ContainerDied","Data":"560b6327efea7d828c93e7112d8993b9d34f7ccd8732de90efadd43153b6d188"} Dec 02 07:57:44 crc kubenswrapper[4895]: I1202 07:57:44.212378 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2sspq" event={"ID":"a34e0e85-3063-4a2c-a763-e34ed502f6c6","Type":"ContainerStarted","Data":"0eb2c05230b69e3dfe70fe5be7d62ba53cd3bab874687955e528ddbee39bfc10"} Dec 02 07:57:44 crc kubenswrapper[4895]: I1202 07:57:44.213797 4895 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 02 07:57:46 crc kubenswrapper[4895]: I1202 07:57:46.231449 4895 generic.go:334] "Generic (PLEG): container finished" podID="a34e0e85-3063-4a2c-a763-e34ed502f6c6" containerID="5cabb404726c08ce53b547b50e5261804455b18f58921de1753455f5d27d1f20" exitCode=0 Dec 02 07:57:46 crc kubenswrapper[4895]: I1202 07:57:46.231507 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2sspq" event={"ID":"a34e0e85-3063-4a2c-a763-e34ed502f6c6","Type":"ContainerDied","Data":"5cabb404726c08ce53b547b50e5261804455b18f58921de1753455f5d27d1f20"} Dec 02 07:57:47 crc kubenswrapper[4895]: I1202 07:57:47.243412 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-2sspq" event={"ID":"a34e0e85-3063-4a2c-a763-e34ed502f6c6","Type":"ContainerStarted","Data":"19f02892ec3ff7a0b48ff1375e05cc49f8901c64fd996e8e01a9573eca072d43"} Dec 02 07:57:47 crc kubenswrapper[4895]: I1202 07:57:47.263026 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-2sspq" podStartSLOduration=2.460284695 podStartE2EDuration="5.263007001s" podCreationTimestamp="2025-12-02 07:57:42 +0000 UTC" firstStartedPulling="2025-12-02 07:57:44.213516175 +0000 UTC m=+2075.384375788" lastFinishedPulling="2025-12-02 07:57:47.016238451 +0000 UTC m=+2078.187098094" observedRunningTime="2025-12-02 07:57:47.26072574 +0000 UTC m=+2078.431585373" watchObservedRunningTime="2025-12-02 07:57:47.263007001 +0000 UTC m=+2078.433866614" Dec 02 07:57:53 crc kubenswrapper[4895]: I1202 07:57:53.064904 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-2sspq" Dec 02 07:57:53 crc kubenswrapper[4895]: I1202 07:57:53.065638 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-2sspq" Dec 02 07:57:53 crc kubenswrapper[4895]: I1202 07:57:53.118365 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-2sspq" Dec 02 07:57:53 crc kubenswrapper[4895]: I1202 07:57:53.348938 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-2sspq" Dec 02 07:57:53 crc kubenswrapper[4895]: I1202 07:57:53.408304 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2sspq"] Dec 02 07:57:55 crc kubenswrapper[4895]: I1202 07:57:55.314511 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-2sspq" podUID="a34e0e85-3063-4a2c-a763-e34ed502f6c6" 
containerName="registry-server" containerID="cri-o://19f02892ec3ff7a0b48ff1375e05cc49f8901c64fd996e8e01a9573eca072d43" gracePeriod=2 Dec 02 07:57:55 crc kubenswrapper[4895]: I1202 07:57:55.719329 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2sspq" Dec 02 07:57:55 crc kubenswrapper[4895]: I1202 07:57:55.862340 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tvfln\" (UniqueName: \"kubernetes.io/projected/a34e0e85-3063-4a2c-a763-e34ed502f6c6-kube-api-access-tvfln\") pod \"a34e0e85-3063-4a2c-a763-e34ed502f6c6\" (UID: \"a34e0e85-3063-4a2c-a763-e34ed502f6c6\") " Dec 02 07:57:55 crc kubenswrapper[4895]: I1202 07:57:55.862441 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a34e0e85-3063-4a2c-a763-e34ed502f6c6-utilities\") pod \"a34e0e85-3063-4a2c-a763-e34ed502f6c6\" (UID: \"a34e0e85-3063-4a2c-a763-e34ed502f6c6\") " Dec 02 07:57:55 crc kubenswrapper[4895]: I1202 07:57:55.862477 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a34e0e85-3063-4a2c-a763-e34ed502f6c6-catalog-content\") pod \"a34e0e85-3063-4a2c-a763-e34ed502f6c6\" (UID: \"a34e0e85-3063-4a2c-a763-e34ed502f6c6\") " Dec 02 07:57:55 crc kubenswrapper[4895]: I1202 07:57:55.863273 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a34e0e85-3063-4a2c-a763-e34ed502f6c6-utilities" (OuterVolumeSpecName: "utilities") pod "a34e0e85-3063-4a2c-a763-e34ed502f6c6" (UID: "a34e0e85-3063-4a2c-a763-e34ed502f6c6"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:57:55 crc kubenswrapper[4895]: I1202 07:57:55.869965 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a34e0e85-3063-4a2c-a763-e34ed502f6c6-kube-api-access-tvfln" (OuterVolumeSpecName: "kube-api-access-tvfln") pod "a34e0e85-3063-4a2c-a763-e34ed502f6c6" (UID: "a34e0e85-3063-4a2c-a763-e34ed502f6c6"). InnerVolumeSpecName "kube-api-access-tvfln". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:57:55 crc kubenswrapper[4895]: I1202 07:57:55.964364 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tvfln\" (UniqueName: \"kubernetes.io/projected/a34e0e85-3063-4a2c-a763-e34ed502f6c6-kube-api-access-tvfln\") on node \"crc\" DevicePath \"\"" Dec 02 07:57:55 crc kubenswrapper[4895]: I1202 07:57:55.964395 4895 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a34e0e85-3063-4a2c-a763-e34ed502f6c6-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 07:57:56 crc kubenswrapper[4895]: I1202 07:57:56.327375 4895 generic.go:334] "Generic (PLEG): container finished" podID="a34e0e85-3063-4a2c-a763-e34ed502f6c6" containerID="19f02892ec3ff7a0b48ff1375e05cc49f8901c64fd996e8e01a9573eca072d43" exitCode=0 Dec 02 07:57:56 crc kubenswrapper[4895]: I1202 07:57:56.327448 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2sspq" Dec 02 07:57:56 crc kubenswrapper[4895]: I1202 07:57:56.327497 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2sspq" event={"ID":"a34e0e85-3063-4a2c-a763-e34ed502f6c6","Type":"ContainerDied","Data":"19f02892ec3ff7a0b48ff1375e05cc49f8901c64fd996e8e01a9573eca072d43"} Dec 02 07:57:56 crc kubenswrapper[4895]: I1202 07:57:56.329145 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2sspq" event={"ID":"a34e0e85-3063-4a2c-a763-e34ed502f6c6","Type":"ContainerDied","Data":"0eb2c05230b69e3dfe70fe5be7d62ba53cd3bab874687955e528ddbee39bfc10"} Dec 02 07:57:56 crc kubenswrapper[4895]: I1202 07:57:56.329181 4895 scope.go:117] "RemoveContainer" containerID="19f02892ec3ff7a0b48ff1375e05cc49f8901c64fd996e8e01a9573eca072d43" Dec 02 07:57:56 crc kubenswrapper[4895]: I1202 07:57:56.352415 4895 scope.go:117] "RemoveContainer" containerID="5cabb404726c08ce53b547b50e5261804455b18f58921de1753455f5d27d1f20" Dec 02 07:57:56 crc kubenswrapper[4895]: I1202 07:57:56.374012 4895 scope.go:117] "RemoveContainer" containerID="560b6327efea7d828c93e7112d8993b9d34f7ccd8732de90efadd43153b6d188" Dec 02 07:57:56 crc kubenswrapper[4895]: I1202 07:57:56.413395 4895 scope.go:117] "RemoveContainer" containerID="19f02892ec3ff7a0b48ff1375e05cc49f8901c64fd996e8e01a9573eca072d43" Dec 02 07:57:56 crc kubenswrapper[4895]: E1202 07:57:56.414139 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19f02892ec3ff7a0b48ff1375e05cc49f8901c64fd996e8e01a9573eca072d43\": container with ID starting with 19f02892ec3ff7a0b48ff1375e05cc49f8901c64fd996e8e01a9573eca072d43 not found: ID does not exist" containerID="19f02892ec3ff7a0b48ff1375e05cc49f8901c64fd996e8e01a9573eca072d43" Dec 02 07:57:56 crc kubenswrapper[4895]: I1202 07:57:56.414204 4895 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19f02892ec3ff7a0b48ff1375e05cc49f8901c64fd996e8e01a9573eca072d43"} err="failed to get container status \"19f02892ec3ff7a0b48ff1375e05cc49f8901c64fd996e8e01a9573eca072d43\": rpc error: code = NotFound desc = could not find container \"19f02892ec3ff7a0b48ff1375e05cc49f8901c64fd996e8e01a9573eca072d43\": container with ID starting with 19f02892ec3ff7a0b48ff1375e05cc49f8901c64fd996e8e01a9573eca072d43 not found: ID does not exist" Dec 02 07:57:56 crc kubenswrapper[4895]: I1202 07:57:56.414244 4895 scope.go:117] "RemoveContainer" containerID="5cabb404726c08ce53b547b50e5261804455b18f58921de1753455f5d27d1f20" Dec 02 07:57:56 crc kubenswrapper[4895]: E1202 07:57:56.414835 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5cabb404726c08ce53b547b50e5261804455b18f58921de1753455f5d27d1f20\": container with ID starting with 5cabb404726c08ce53b547b50e5261804455b18f58921de1753455f5d27d1f20 not found: ID does not exist" containerID="5cabb404726c08ce53b547b50e5261804455b18f58921de1753455f5d27d1f20" Dec 02 07:57:56 crc kubenswrapper[4895]: I1202 07:57:56.414880 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5cabb404726c08ce53b547b50e5261804455b18f58921de1753455f5d27d1f20"} err="failed to get container status \"5cabb404726c08ce53b547b50e5261804455b18f58921de1753455f5d27d1f20\": rpc error: code = NotFound desc = could not find container \"5cabb404726c08ce53b547b50e5261804455b18f58921de1753455f5d27d1f20\": container with ID starting with 5cabb404726c08ce53b547b50e5261804455b18f58921de1753455f5d27d1f20 not found: ID does not exist" Dec 02 07:57:56 crc kubenswrapper[4895]: I1202 07:57:56.414917 4895 scope.go:117] "RemoveContainer" containerID="560b6327efea7d828c93e7112d8993b9d34f7ccd8732de90efadd43153b6d188" Dec 02 07:57:56 crc kubenswrapper[4895]: E1202 07:57:56.415365 4895 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"560b6327efea7d828c93e7112d8993b9d34f7ccd8732de90efadd43153b6d188\": container with ID starting with 560b6327efea7d828c93e7112d8993b9d34f7ccd8732de90efadd43153b6d188 not found: ID does not exist" containerID="560b6327efea7d828c93e7112d8993b9d34f7ccd8732de90efadd43153b6d188" Dec 02 07:57:56 crc kubenswrapper[4895]: I1202 07:57:56.415395 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"560b6327efea7d828c93e7112d8993b9d34f7ccd8732de90efadd43153b6d188"} err="failed to get container status \"560b6327efea7d828c93e7112d8993b9d34f7ccd8732de90efadd43153b6d188\": rpc error: code = NotFound desc = could not find container \"560b6327efea7d828c93e7112d8993b9d34f7ccd8732de90efadd43153b6d188\": container with ID starting with 560b6327efea7d828c93e7112d8993b9d34f7ccd8732de90efadd43153b6d188 not found: ID does not exist" Dec 02 07:57:56 crc kubenswrapper[4895]: I1202 07:57:56.885146 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a34e0e85-3063-4a2c-a763-e34ed502f6c6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a34e0e85-3063-4a2c-a763-e34ed502f6c6" (UID: "a34e0e85-3063-4a2c-a763-e34ed502f6c6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:57:56 crc kubenswrapper[4895]: I1202 07:57:56.968834 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2sspq"] Dec 02 07:57:56 crc kubenswrapper[4895]: I1202 07:57:56.974550 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-2sspq"] Dec 02 07:57:56 crc kubenswrapper[4895]: I1202 07:57:56.980415 4895 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a34e0e85-3063-4a2c-a763-e34ed502f6c6-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 07:57:57 crc kubenswrapper[4895]: I1202 07:57:57.150648 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a34e0e85-3063-4a2c-a763-e34ed502f6c6" path="/var/lib/kubelet/pods/a34e0e85-3063-4a2c-a763-e34ed502f6c6/volumes" Dec 02 07:58:50 crc kubenswrapper[4895]: I1202 07:58:50.391675 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-x5d7f"] Dec 02 07:58:50 crc kubenswrapper[4895]: E1202 07:58:50.393007 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a34e0e85-3063-4a2c-a763-e34ed502f6c6" containerName="registry-server" Dec 02 07:58:50 crc kubenswrapper[4895]: I1202 07:58:50.393029 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="a34e0e85-3063-4a2c-a763-e34ed502f6c6" containerName="registry-server" Dec 02 07:58:50 crc kubenswrapper[4895]: E1202 07:58:50.393049 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a34e0e85-3063-4a2c-a763-e34ed502f6c6" containerName="extract-content" Dec 02 07:58:50 crc kubenswrapper[4895]: I1202 07:58:50.393059 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="a34e0e85-3063-4a2c-a763-e34ed502f6c6" containerName="extract-content" Dec 02 07:58:50 crc kubenswrapper[4895]: E1202 07:58:50.393112 4895 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="a34e0e85-3063-4a2c-a763-e34ed502f6c6" containerName="extract-utilities" Dec 02 07:58:50 crc kubenswrapper[4895]: I1202 07:58:50.393130 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="a34e0e85-3063-4a2c-a763-e34ed502f6c6" containerName="extract-utilities" Dec 02 07:58:50 crc kubenswrapper[4895]: I1202 07:58:50.393554 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="a34e0e85-3063-4a2c-a763-e34ed502f6c6" containerName="registry-server" Dec 02 07:58:50 crc kubenswrapper[4895]: I1202 07:58:50.395225 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x5d7f" Dec 02 07:58:50 crc kubenswrapper[4895]: I1202 07:58:50.401700 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-x5d7f"] Dec 02 07:58:50 crc kubenswrapper[4895]: I1202 07:58:50.566149 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6f5tr\" (UniqueName: \"kubernetes.io/projected/df7e25b8-587a-4fe4-b33c-96dfc34f3e24-kube-api-access-6f5tr\") pod \"redhat-marketplace-x5d7f\" (UID: \"df7e25b8-587a-4fe4-b33c-96dfc34f3e24\") " pod="openshift-marketplace/redhat-marketplace-x5d7f" Dec 02 07:58:50 crc kubenswrapper[4895]: I1202 07:58:50.566297 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df7e25b8-587a-4fe4-b33c-96dfc34f3e24-utilities\") pod \"redhat-marketplace-x5d7f\" (UID: \"df7e25b8-587a-4fe4-b33c-96dfc34f3e24\") " pod="openshift-marketplace/redhat-marketplace-x5d7f" Dec 02 07:58:50 crc kubenswrapper[4895]: I1202 07:58:50.566336 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df7e25b8-587a-4fe4-b33c-96dfc34f3e24-catalog-content\") pod \"redhat-marketplace-x5d7f\" (UID: 
\"df7e25b8-587a-4fe4-b33c-96dfc34f3e24\") " pod="openshift-marketplace/redhat-marketplace-x5d7f" Dec 02 07:58:50 crc kubenswrapper[4895]: I1202 07:58:50.667525 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df7e25b8-587a-4fe4-b33c-96dfc34f3e24-utilities\") pod \"redhat-marketplace-x5d7f\" (UID: \"df7e25b8-587a-4fe4-b33c-96dfc34f3e24\") " pod="openshift-marketplace/redhat-marketplace-x5d7f" Dec 02 07:58:50 crc kubenswrapper[4895]: I1202 07:58:50.667574 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df7e25b8-587a-4fe4-b33c-96dfc34f3e24-catalog-content\") pod \"redhat-marketplace-x5d7f\" (UID: \"df7e25b8-587a-4fe4-b33c-96dfc34f3e24\") " pod="openshift-marketplace/redhat-marketplace-x5d7f" Dec 02 07:58:50 crc kubenswrapper[4895]: I1202 07:58:50.667632 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6f5tr\" (UniqueName: \"kubernetes.io/projected/df7e25b8-587a-4fe4-b33c-96dfc34f3e24-kube-api-access-6f5tr\") pod \"redhat-marketplace-x5d7f\" (UID: \"df7e25b8-587a-4fe4-b33c-96dfc34f3e24\") " pod="openshift-marketplace/redhat-marketplace-x5d7f" Dec 02 07:58:50 crc kubenswrapper[4895]: I1202 07:58:50.668429 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df7e25b8-587a-4fe4-b33c-96dfc34f3e24-utilities\") pod \"redhat-marketplace-x5d7f\" (UID: \"df7e25b8-587a-4fe4-b33c-96dfc34f3e24\") " pod="openshift-marketplace/redhat-marketplace-x5d7f" Dec 02 07:58:50 crc kubenswrapper[4895]: I1202 07:58:50.668676 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df7e25b8-587a-4fe4-b33c-96dfc34f3e24-catalog-content\") pod \"redhat-marketplace-x5d7f\" (UID: \"df7e25b8-587a-4fe4-b33c-96dfc34f3e24\") " 
pod="openshift-marketplace/redhat-marketplace-x5d7f" Dec 02 07:58:50 crc kubenswrapper[4895]: I1202 07:58:50.692456 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6f5tr\" (UniqueName: \"kubernetes.io/projected/df7e25b8-587a-4fe4-b33c-96dfc34f3e24-kube-api-access-6f5tr\") pod \"redhat-marketplace-x5d7f\" (UID: \"df7e25b8-587a-4fe4-b33c-96dfc34f3e24\") " pod="openshift-marketplace/redhat-marketplace-x5d7f" Dec 02 07:58:50 crc kubenswrapper[4895]: I1202 07:58:50.733878 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x5d7f" Dec 02 07:58:51 crc kubenswrapper[4895]: I1202 07:58:51.213838 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-x5d7f"] Dec 02 07:58:51 crc kubenswrapper[4895]: I1202 07:58:51.761281 4895 generic.go:334] "Generic (PLEG): container finished" podID="df7e25b8-587a-4fe4-b33c-96dfc34f3e24" containerID="c2c6d4c1b63c403dcaa5b7192645c8bba30e4556ab5a848f67a98e24f5dbc0f6" exitCode=0 Dec 02 07:58:51 crc kubenswrapper[4895]: I1202 07:58:51.761426 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x5d7f" event={"ID":"df7e25b8-587a-4fe4-b33c-96dfc34f3e24","Type":"ContainerDied","Data":"c2c6d4c1b63c403dcaa5b7192645c8bba30e4556ab5a848f67a98e24f5dbc0f6"} Dec 02 07:58:51 crc kubenswrapper[4895]: I1202 07:58:51.761500 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x5d7f" event={"ID":"df7e25b8-587a-4fe4-b33c-96dfc34f3e24","Type":"ContainerStarted","Data":"ebfe1750de91713db5eeff5188d737def929f2342e555ce3f87bfd7a0b2b7aad"} Dec 02 07:58:53 crc kubenswrapper[4895]: I1202 07:58:53.778263 4895 generic.go:334] "Generic (PLEG): container finished" podID="df7e25b8-587a-4fe4-b33c-96dfc34f3e24" containerID="e0c1b34498d41bb1d1ae17bd0450eb690e3d2ab2a0607c2be1dbacf675df56b9" exitCode=0 Dec 02 07:58:53 crc 
kubenswrapper[4895]: I1202 07:58:53.778565 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x5d7f" event={"ID":"df7e25b8-587a-4fe4-b33c-96dfc34f3e24","Type":"ContainerDied","Data":"e0c1b34498d41bb1d1ae17bd0450eb690e3d2ab2a0607c2be1dbacf675df56b9"} Dec 02 07:58:54 crc kubenswrapper[4895]: I1202 07:58:54.791657 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x5d7f" event={"ID":"df7e25b8-587a-4fe4-b33c-96dfc34f3e24","Type":"ContainerStarted","Data":"4b92f0c5702674d4dd4c2efff22e9a1c41bd87971e201abacaa076e14afafb33"} Dec 02 07:58:54 crc kubenswrapper[4895]: I1202 07:58:54.817544 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-x5d7f" podStartSLOduration=2.247211201 podStartE2EDuration="4.817510542s" podCreationTimestamp="2025-12-02 07:58:50 +0000 UTC" firstStartedPulling="2025-12-02 07:58:51.767534551 +0000 UTC m=+2142.938394204" lastFinishedPulling="2025-12-02 07:58:54.337833932 +0000 UTC m=+2145.508693545" observedRunningTime="2025-12-02 07:58:54.809214364 +0000 UTC m=+2145.980073977" watchObservedRunningTime="2025-12-02 07:58:54.817510542 +0000 UTC m=+2145.988370155" Dec 02 07:59:00 crc kubenswrapper[4895]: I1202 07:59:00.734012 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-x5d7f" Dec 02 07:59:00 crc kubenswrapper[4895]: I1202 07:59:00.734545 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-x5d7f" Dec 02 07:59:00 crc kubenswrapper[4895]: I1202 07:59:00.778124 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-x5d7f" Dec 02 07:59:00 crc kubenswrapper[4895]: I1202 07:59:00.884887 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-marketplace-x5d7f" Dec 02 07:59:01 crc kubenswrapper[4895]: I1202 07:59:01.012702 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-x5d7f"] Dec 02 07:59:02 crc kubenswrapper[4895]: I1202 07:59:02.862217 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-x5d7f" podUID="df7e25b8-587a-4fe4-b33c-96dfc34f3e24" containerName="registry-server" containerID="cri-o://4b92f0c5702674d4dd4c2efff22e9a1c41bd87971e201abacaa076e14afafb33" gracePeriod=2 Dec 02 07:59:03 crc kubenswrapper[4895]: I1202 07:59:03.779132 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x5d7f" Dec 02 07:59:03 crc kubenswrapper[4895]: I1202 07:59:03.796557 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df7e25b8-587a-4fe4-b33c-96dfc34f3e24-catalog-content\") pod \"df7e25b8-587a-4fe4-b33c-96dfc34f3e24\" (UID: \"df7e25b8-587a-4fe4-b33c-96dfc34f3e24\") " Dec 02 07:59:03 crc kubenswrapper[4895]: I1202 07:59:03.796626 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6f5tr\" (UniqueName: \"kubernetes.io/projected/df7e25b8-587a-4fe4-b33c-96dfc34f3e24-kube-api-access-6f5tr\") pod \"df7e25b8-587a-4fe4-b33c-96dfc34f3e24\" (UID: \"df7e25b8-587a-4fe4-b33c-96dfc34f3e24\") " Dec 02 07:59:03 crc kubenswrapper[4895]: I1202 07:59:03.796688 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df7e25b8-587a-4fe4-b33c-96dfc34f3e24-utilities\") pod \"df7e25b8-587a-4fe4-b33c-96dfc34f3e24\" (UID: \"df7e25b8-587a-4fe4-b33c-96dfc34f3e24\") " Dec 02 07:59:03 crc kubenswrapper[4895]: I1202 07:59:03.797691 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/empty-dir/df7e25b8-587a-4fe4-b33c-96dfc34f3e24-utilities" (OuterVolumeSpecName: "utilities") pod "df7e25b8-587a-4fe4-b33c-96dfc34f3e24" (UID: "df7e25b8-587a-4fe4-b33c-96dfc34f3e24"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:59:03 crc kubenswrapper[4895]: I1202 07:59:03.805333 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df7e25b8-587a-4fe4-b33c-96dfc34f3e24-kube-api-access-6f5tr" (OuterVolumeSpecName: "kube-api-access-6f5tr") pod "df7e25b8-587a-4fe4-b33c-96dfc34f3e24" (UID: "df7e25b8-587a-4fe4-b33c-96dfc34f3e24"). InnerVolumeSpecName "kube-api-access-6f5tr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:59:03 crc kubenswrapper[4895]: I1202 07:59:03.819855 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df7e25b8-587a-4fe4-b33c-96dfc34f3e24-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "df7e25b8-587a-4fe4-b33c-96dfc34f3e24" (UID: "df7e25b8-587a-4fe4-b33c-96dfc34f3e24"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:59:03 crc kubenswrapper[4895]: I1202 07:59:03.871614 4895 generic.go:334] "Generic (PLEG): container finished" podID="df7e25b8-587a-4fe4-b33c-96dfc34f3e24" containerID="4b92f0c5702674d4dd4c2efff22e9a1c41bd87971e201abacaa076e14afafb33" exitCode=0 Dec 02 07:59:03 crc kubenswrapper[4895]: I1202 07:59:03.871666 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x5d7f" event={"ID":"df7e25b8-587a-4fe4-b33c-96dfc34f3e24","Type":"ContainerDied","Data":"4b92f0c5702674d4dd4c2efff22e9a1c41bd87971e201abacaa076e14afafb33"} Dec 02 07:59:03 crc kubenswrapper[4895]: I1202 07:59:03.871696 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x5d7f" event={"ID":"df7e25b8-587a-4fe4-b33c-96dfc34f3e24","Type":"ContainerDied","Data":"ebfe1750de91713db5eeff5188d737def929f2342e555ce3f87bfd7a0b2b7aad"} Dec 02 07:59:03 crc kubenswrapper[4895]: I1202 07:59:03.871716 4895 scope.go:117] "RemoveContainer" containerID="4b92f0c5702674d4dd4c2efff22e9a1c41bd87971e201abacaa076e14afafb33" Dec 02 07:59:03 crc kubenswrapper[4895]: I1202 07:59:03.871851 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x5d7f" Dec 02 07:59:03 crc kubenswrapper[4895]: I1202 07:59:03.891249 4895 scope.go:117] "RemoveContainer" containerID="e0c1b34498d41bb1d1ae17bd0450eb690e3d2ab2a0607c2be1dbacf675df56b9" Dec 02 07:59:03 crc kubenswrapper[4895]: I1202 07:59:03.898873 4895 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df7e25b8-587a-4fe4-b33c-96dfc34f3e24-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 07:59:03 crc kubenswrapper[4895]: I1202 07:59:03.898912 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6f5tr\" (UniqueName: \"kubernetes.io/projected/df7e25b8-587a-4fe4-b33c-96dfc34f3e24-kube-api-access-6f5tr\") on node \"crc\" DevicePath \"\"" Dec 02 07:59:03 crc kubenswrapper[4895]: I1202 07:59:03.898928 4895 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df7e25b8-587a-4fe4-b33c-96dfc34f3e24-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 07:59:03 crc kubenswrapper[4895]: I1202 07:59:03.903390 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-x5d7f"] Dec 02 07:59:03 crc kubenswrapper[4895]: I1202 07:59:03.915317 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-x5d7f"] Dec 02 07:59:03 crc kubenswrapper[4895]: I1202 07:59:03.916220 4895 scope.go:117] "RemoveContainer" containerID="c2c6d4c1b63c403dcaa5b7192645c8bba30e4556ab5a848f67a98e24f5dbc0f6" Dec 02 07:59:03 crc kubenswrapper[4895]: I1202 07:59:03.943842 4895 scope.go:117] "RemoveContainer" containerID="4b92f0c5702674d4dd4c2efff22e9a1c41bd87971e201abacaa076e14afafb33" Dec 02 07:59:03 crc kubenswrapper[4895]: E1202 07:59:03.944394 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"4b92f0c5702674d4dd4c2efff22e9a1c41bd87971e201abacaa076e14afafb33\": container with ID starting with 4b92f0c5702674d4dd4c2efff22e9a1c41bd87971e201abacaa076e14afafb33 not found: ID does not exist" containerID="4b92f0c5702674d4dd4c2efff22e9a1c41bd87971e201abacaa076e14afafb33" Dec 02 07:59:03 crc kubenswrapper[4895]: I1202 07:59:03.944444 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b92f0c5702674d4dd4c2efff22e9a1c41bd87971e201abacaa076e14afafb33"} err="failed to get container status \"4b92f0c5702674d4dd4c2efff22e9a1c41bd87971e201abacaa076e14afafb33\": rpc error: code = NotFound desc = could not find container \"4b92f0c5702674d4dd4c2efff22e9a1c41bd87971e201abacaa076e14afafb33\": container with ID starting with 4b92f0c5702674d4dd4c2efff22e9a1c41bd87971e201abacaa076e14afafb33 not found: ID does not exist" Dec 02 07:59:03 crc kubenswrapper[4895]: I1202 07:59:03.944476 4895 scope.go:117] "RemoveContainer" containerID="e0c1b34498d41bb1d1ae17bd0450eb690e3d2ab2a0607c2be1dbacf675df56b9" Dec 02 07:59:03 crc kubenswrapper[4895]: E1202 07:59:03.944826 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0c1b34498d41bb1d1ae17bd0450eb690e3d2ab2a0607c2be1dbacf675df56b9\": container with ID starting with e0c1b34498d41bb1d1ae17bd0450eb690e3d2ab2a0607c2be1dbacf675df56b9 not found: ID does not exist" containerID="e0c1b34498d41bb1d1ae17bd0450eb690e3d2ab2a0607c2be1dbacf675df56b9" Dec 02 07:59:03 crc kubenswrapper[4895]: I1202 07:59:03.944874 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0c1b34498d41bb1d1ae17bd0450eb690e3d2ab2a0607c2be1dbacf675df56b9"} err="failed to get container status \"e0c1b34498d41bb1d1ae17bd0450eb690e3d2ab2a0607c2be1dbacf675df56b9\": rpc error: code = NotFound desc = could not find container \"e0c1b34498d41bb1d1ae17bd0450eb690e3d2ab2a0607c2be1dbacf675df56b9\": container with ID 
starting with e0c1b34498d41bb1d1ae17bd0450eb690e3d2ab2a0607c2be1dbacf675df56b9 not found: ID does not exist" Dec 02 07:59:03 crc kubenswrapper[4895]: I1202 07:59:03.944909 4895 scope.go:117] "RemoveContainer" containerID="c2c6d4c1b63c403dcaa5b7192645c8bba30e4556ab5a848f67a98e24f5dbc0f6" Dec 02 07:59:03 crc kubenswrapper[4895]: E1202 07:59:03.945237 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c2c6d4c1b63c403dcaa5b7192645c8bba30e4556ab5a848f67a98e24f5dbc0f6\": container with ID starting with c2c6d4c1b63c403dcaa5b7192645c8bba30e4556ab5a848f67a98e24f5dbc0f6 not found: ID does not exist" containerID="c2c6d4c1b63c403dcaa5b7192645c8bba30e4556ab5a848f67a98e24f5dbc0f6" Dec 02 07:59:03 crc kubenswrapper[4895]: I1202 07:59:03.945291 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c2c6d4c1b63c403dcaa5b7192645c8bba30e4556ab5a848f67a98e24f5dbc0f6"} err="failed to get container status \"c2c6d4c1b63c403dcaa5b7192645c8bba30e4556ab5a848f67a98e24f5dbc0f6\": rpc error: code = NotFound desc = could not find container \"c2c6d4c1b63c403dcaa5b7192645c8bba30e4556ab5a848f67a98e24f5dbc0f6\": container with ID starting with c2c6d4c1b63c403dcaa5b7192645c8bba30e4556ab5a848f67a98e24f5dbc0f6 not found: ID does not exist" Dec 02 07:59:05 crc kubenswrapper[4895]: I1202 07:59:05.151273 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df7e25b8-587a-4fe4-b33c-96dfc34f3e24" path="/var/lib/kubelet/pods/df7e25b8-587a-4fe4-b33c-96dfc34f3e24/volumes" Dec 02 07:59:15 crc kubenswrapper[4895]: I1202 07:59:15.401283 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-r777k"] Dec 02 07:59:15 crc kubenswrapper[4895]: E1202 07:59:15.402965 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df7e25b8-587a-4fe4-b33c-96dfc34f3e24" containerName="extract-content" Dec 02 07:59:15 crc 
kubenswrapper[4895]: I1202 07:59:15.402987 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="df7e25b8-587a-4fe4-b33c-96dfc34f3e24" containerName="extract-content" Dec 02 07:59:15 crc kubenswrapper[4895]: E1202 07:59:15.403012 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df7e25b8-587a-4fe4-b33c-96dfc34f3e24" containerName="extract-utilities" Dec 02 07:59:15 crc kubenswrapper[4895]: I1202 07:59:15.403022 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="df7e25b8-587a-4fe4-b33c-96dfc34f3e24" containerName="extract-utilities" Dec 02 07:59:15 crc kubenswrapper[4895]: E1202 07:59:15.403051 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df7e25b8-587a-4fe4-b33c-96dfc34f3e24" containerName="registry-server" Dec 02 07:59:15 crc kubenswrapper[4895]: I1202 07:59:15.403068 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="df7e25b8-587a-4fe4-b33c-96dfc34f3e24" containerName="registry-server" Dec 02 07:59:15 crc kubenswrapper[4895]: I1202 07:59:15.403280 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="df7e25b8-587a-4fe4-b33c-96dfc34f3e24" containerName="registry-server" Dec 02 07:59:15 crc kubenswrapper[4895]: I1202 07:59:15.405194 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-r777k" Dec 02 07:59:15 crc kubenswrapper[4895]: I1202 07:59:15.412721 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-r777k"] Dec 02 07:59:15 crc kubenswrapper[4895]: I1202 07:59:15.483483 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/acfff40c-249b-49a0-8f2e-4120c1e90628-utilities\") pod \"certified-operators-r777k\" (UID: \"acfff40c-249b-49a0-8f2e-4120c1e90628\") " pod="openshift-marketplace/certified-operators-r777k" Dec 02 07:59:15 crc kubenswrapper[4895]: I1202 07:59:15.483569 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sc5pt\" (UniqueName: \"kubernetes.io/projected/acfff40c-249b-49a0-8f2e-4120c1e90628-kube-api-access-sc5pt\") pod \"certified-operators-r777k\" (UID: \"acfff40c-249b-49a0-8f2e-4120c1e90628\") " pod="openshift-marketplace/certified-operators-r777k" Dec 02 07:59:15 crc kubenswrapper[4895]: I1202 07:59:15.483610 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/acfff40c-249b-49a0-8f2e-4120c1e90628-catalog-content\") pod \"certified-operators-r777k\" (UID: \"acfff40c-249b-49a0-8f2e-4120c1e90628\") " pod="openshift-marketplace/certified-operators-r777k" Dec 02 07:59:15 crc kubenswrapper[4895]: I1202 07:59:15.584931 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/acfff40c-249b-49a0-8f2e-4120c1e90628-utilities\") pod \"certified-operators-r777k\" (UID: \"acfff40c-249b-49a0-8f2e-4120c1e90628\") " pod="openshift-marketplace/certified-operators-r777k" Dec 02 07:59:15 crc kubenswrapper[4895]: I1202 07:59:15.585015 4895 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-sc5pt\" (UniqueName: \"kubernetes.io/projected/acfff40c-249b-49a0-8f2e-4120c1e90628-kube-api-access-sc5pt\") pod \"certified-operators-r777k\" (UID: \"acfff40c-249b-49a0-8f2e-4120c1e90628\") " pod="openshift-marketplace/certified-operators-r777k" Dec 02 07:59:15 crc kubenswrapper[4895]: I1202 07:59:15.585050 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/acfff40c-249b-49a0-8f2e-4120c1e90628-catalog-content\") pod \"certified-operators-r777k\" (UID: \"acfff40c-249b-49a0-8f2e-4120c1e90628\") " pod="openshift-marketplace/certified-operators-r777k" Dec 02 07:59:15 crc kubenswrapper[4895]: I1202 07:59:15.585619 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/acfff40c-249b-49a0-8f2e-4120c1e90628-utilities\") pod \"certified-operators-r777k\" (UID: \"acfff40c-249b-49a0-8f2e-4120c1e90628\") " pod="openshift-marketplace/certified-operators-r777k" Dec 02 07:59:15 crc kubenswrapper[4895]: I1202 07:59:15.585776 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/acfff40c-249b-49a0-8f2e-4120c1e90628-catalog-content\") pod \"certified-operators-r777k\" (UID: \"acfff40c-249b-49a0-8f2e-4120c1e90628\") " pod="openshift-marketplace/certified-operators-r777k" Dec 02 07:59:15 crc kubenswrapper[4895]: I1202 07:59:15.608080 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sc5pt\" (UniqueName: \"kubernetes.io/projected/acfff40c-249b-49a0-8f2e-4120c1e90628-kube-api-access-sc5pt\") pod \"certified-operators-r777k\" (UID: \"acfff40c-249b-49a0-8f2e-4120c1e90628\") " pod="openshift-marketplace/certified-operators-r777k" Dec 02 07:59:15 crc kubenswrapper[4895]: I1202 07:59:15.740211 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-r777k" Dec 02 07:59:16 crc kubenswrapper[4895]: I1202 07:59:16.047566 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-r777k"] Dec 02 07:59:16 crc kubenswrapper[4895]: I1202 07:59:16.999579 4895 generic.go:334] "Generic (PLEG): container finished" podID="acfff40c-249b-49a0-8f2e-4120c1e90628" containerID="a1d7526dd0058e5824d17dac33cd94ef04a742ea0ef49775255a0b5b60f004ae" exitCode=0 Dec 02 07:59:16 crc kubenswrapper[4895]: I1202 07:59:16.999678 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r777k" event={"ID":"acfff40c-249b-49a0-8f2e-4120c1e90628","Type":"ContainerDied","Data":"a1d7526dd0058e5824d17dac33cd94ef04a742ea0ef49775255a0b5b60f004ae"} Dec 02 07:59:17 crc kubenswrapper[4895]: I1202 07:59:16.999892 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r777k" event={"ID":"acfff40c-249b-49a0-8f2e-4120c1e90628","Type":"ContainerStarted","Data":"8eaedaedb2e92b9ed3a0537fe96cbc62b0058f9def5fae3d1d0019e0d60d62dd"} Dec 02 07:59:19 crc kubenswrapper[4895]: I1202 07:59:19.019935 4895 generic.go:334] "Generic (PLEG): container finished" podID="acfff40c-249b-49a0-8f2e-4120c1e90628" containerID="e3cf3769287b95a0a56d0107bb9a000b91cde571204f05095125eb87ed8864b2" exitCode=0 Dec 02 07:59:19 crc kubenswrapper[4895]: I1202 07:59:19.020017 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r777k" event={"ID":"acfff40c-249b-49a0-8f2e-4120c1e90628","Type":"ContainerDied","Data":"e3cf3769287b95a0a56d0107bb9a000b91cde571204f05095125eb87ed8864b2"} Dec 02 07:59:21 crc kubenswrapper[4895]: I1202 07:59:21.042188 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r777k" 
event={"ID":"acfff40c-249b-49a0-8f2e-4120c1e90628","Type":"ContainerStarted","Data":"ea565f849c013b678b374bf8e57fb89fa1a80baccb7f791fe2351cb06ab3a6a9"} Dec 02 07:59:21 crc kubenswrapper[4895]: I1202 07:59:21.065783 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-r777k" podStartSLOduration=2.851440605 podStartE2EDuration="6.065763885s" podCreationTimestamp="2025-12-02 07:59:15 +0000 UTC" firstStartedPulling="2025-12-02 07:59:17.002635282 +0000 UTC m=+2168.173494925" lastFinishedPulling="2025-12-02 07:59:20.216958592 +0000 UTC m=+2171.387818205" observedRunningTime="2025-12-02 07:59:21.061702428 +0000 UTC m=+2172.232562041" watchObservedRunningTime="2025-12-02 07:59:21.065763885 +0000 UTC m=+2172.236623498" Dec 02 07:59:25 crc kubenswrapper[4895]: I1202 07:59:25.741057 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-r777k" Dec 02 07:59:25 crc kubenswrapper[4895]: I1202 07:59:25.741327 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-r777k" Dec 02 07:59:25 crc kubenswrapper[4895]: I1202 07:59:25.787767 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-r777k" Dec 02 07:59:26 crc kubenswrapper[4895]: I1202 07:59:26.145710 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-r777k" Dec 02 07:59:26 crc kubenswrapper[4895]: I1202 07:59:26.197049 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-r777k"] Dec 02 07:59:28 crc kubenswrapper[4895]: I1202 07:59:28.094577 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-r777k" podUID="acfff40c-249b-49a0-8f2e-4120c1e90628" containerName="registry-server" 
containerID="cri-o://ea565f849c013b678b374bf8e57fb89fa1a80baccb7f791fe2351cb06ab3a6a9" gracePeriod=2 Dec 02 07:59:29 crc kubenswrapper[4895]: I1202 07:59:29.032210 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-r777k" Dec 02 07:59:29 crc kubenswrapper[4895]: I1202 07:59:29.037483 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/acfff40c-249b-49a0-8f2e-4120c1e90628-utilities\") pod \"acfff40c-249b-49a0-8f2e-4120c1e90628\" (UID: \"acfff40c-249b-49a0-8f2e-4120c1e90628\") " Dec 02 07:59:29 crc kubenswrapper[4895]: I1202 07:59:29.037624 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/acfff40c-249b-49a0-8f2e-4120c1e90628-catalog-content\") pod \"acfff40c-249b-49a0-8f2e-4120c1e90628\" (UID: \"acfff40c-249b-49a0-8f2e-4120c1e90628\") " Dec 02 07:59:29 crc kubenswrapper[4895]: I1202 07:59:29.037672 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sc5pt\" (UniqueName: \"kubernetes.io/projected/acfff40c-249b-49a0-8f2e-4120c1e90628-kube-api-access-sc5pt\") pod \"acfff40c-249b-49a0-8f2e-4120c1e90628\" (UID: \"acfff40c-249b-49a0-8f2e-4120c1e90628\") " Dec 02 07:59:29 crc kubenswrapper[4895]: I1202 07:59:29.038958 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/acfff40c-249b-49a0-8f2e-4120c1e90628-utilities" (OuterVolumeSpecName: "utilities") pod "acfff40c-249b-49a0-8f2e-4120c1e90628" (UID: "acfff40c-249b-49a0-8f2e-4120c1e90628"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:59:29 crc kubenswrapper[4895]: I1202 07:59:29.047394 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/acfff40c-249b-49a0-8f2e-4120c1e90628-kube-api-access-sc5pt" (OuterVolumeSpecName: "kube-api-access-sc5pt") pod "acfff40c-249b-49a0-8f2e-4120c1e90628" (UID: "acfff40c-249b-49a0-8f2e-4120c1e90628"). InnerVolumeSpecName "kube-api-access-sc5pt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:59:29 crc kubenswrapper[4895]: I1202 07:59:29.103932 4895 generic.go:334] "Generic (PLEG): container finished" podID="acfff40c-249b-49a0-8f2e-4120c1e90628" containerID="ea565f849c013b678b374bf8e57fb89fa1a80baccb7f791fe2351cb06ab3a6a9" exitCode=0 Dec 02 07:59:29 crc kubenswrapper[4895]: I1202 07:59:29.104001 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r777k" event={"ID":"acfff40c-249b-49a0-8f2e-4120c1e90628","Type":"ContainerDied","Data":"ea565f849c013b678b374bf8e57fb89fa1a80baccb7f791fe2351cb06ab3a6a9"} Dec 02 07:59:29 crc kubenswrapper[4895]: I1202 07:59:29.104006 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-r777k" Dec 02 07:59:29 crc kubenswrapper[4895]: I1202 07:59:29.104053 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r777k" event={"ID":"acfff40c-249b-49a0-8f2e-4120c1e90628","Type":"ContainerDied","Data":"8eaedaedb2e92b9ed3a0537fe96cbc62b0058f9def5fae3d1d0019e0d60d62dd"} Dec 02 07:59:29 crc kubenswrapper[4895]: I1202 07:59:29.104088 4895 scope.go:117] "RemoveContainer" containerID="ea565f849c013b678b374bf8e57fb89fa1a80baccb7f791fe2351cb06ab3a6a9" Dec 02 07:59:29 crc kubenswrapper[4895]: I1202 07:59:29.123703 4895 scope.go:117] "RemoveContainer" containerID="e3cf3769287b95a0a56d0107bb9a000b91cde571204f05095125eb87ed8864b2" Dec 02 07:59:29 crc kubenswrapper[4895]: I1202 07:59:29.138374 4895 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/acfff40c-249b-49a0-8f2e-4120c1e90628-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 07:59:29 crc kubenswrapper[4895]: I1202 07:59:29.138404 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sc5pt\" (UniqueName: \"kubernetes.io/projected/acfff40c-249b-49a0-8f2e-4120c1e90628-kube-api-access-sc5pt\") on node \"crc\" DevicePath \"\"" Dec 02 07:59:29 crc kubenswrapper[4895]: I1202 07:59:29.152675 4895 scope.go:117] "RemoveContainer" containerID="a1d7526dd0058e5824d17dac33cd94ef04a742ea0ef49775255a0b5b60f004ae" Dec 02 07:59:29 crc kubenswrapper[4895]: I1202 07:59:29.175424 4895 scope.go:117] "RemoveContainer" containerID="ea565f849c013b678b374bf8e57fb89fa1a80baccb7f791fe2351cb06ab3a6a9" Dec 02 07:59:29 crc kubenswrapper[4895]: E1202 07:59:29.175901 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea565f849c013b678b374bf8e57fb89fa1a80baccb7f791fe2351cb06ab3a6a9\": container with ID starting with 
ea565f849c013b678b374bf8e57fb89fa1a80baccb7f791fe2351cb06ab3a6a9 not found: ID does not exist" containerID="ea565f849c013b678b374bf8e57fb89fa1a80baccb7f791fe2351cb06ab3a6a9" Dec 02 07:59:29 crc kubenswrapper[4895]: I1202 07:59:29.175935 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea565f849c013b678b374bf8e57fb89fa1a80baccb7f791fe2351cb06ab3a6a9"} err="failed to get container status \"ea565f849c013b678b374bf8e57fb89fa1a80baccb7f791fe2351cb06ab3a6a9\": rpc error: code = NotFound desc = could not find container \"ea565f849c013b678b374bf8e57fb89fa1a80baccb7f791fe2351cb06ab3a6a9\": container with ID starting with ea565f849c013b678b374bf8e57fb89fa1a80baccb7f791fe2351cb06ab3a6a9 not found: ID does not exist" Dec 02 07:59:29 crc kubenswrapper[4895]: I1202 07:59:29.175968 4895 scope.go:117] "RemoveContainer" containerID="e3cf3769287b95a0a56d0107bb9a000b91cde571204f05095125eb87ed8864b2" Dec 02 07:59:29 crc kubenswrapper[4895]: E1202 07:59:29.176218 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3cf3769287b95a0a56d0107bb9a000b91cde571204f05095125eb87ed8864b2\": container with ID starting with e3cf3769287b95a0a56d0107bb9a000b91cde571204f05095125eb87ed8864b2 not found: ID does not exist" containerID="e3cf3769287b95a0a56d0107bb9a000b91cde571204f05095125eb87ed8864b2" Dec 02 07:59:29 crc kubenswrapper[4895]: I1202 07:59:29.176260 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3cf3769287b95a0a56d0107bb9a000b91cde571204f05095125eb87ed8864b2"} err="failed to get container status \"e3cf3769287b95a0a56d0107bb9a000b91cde571204f05095125eb87ed8864b2\": rpc error: code = NotFound desc = could not find container \"e3cf3769287b95a0a56d0107bb9a000b91cde571204f05095125eb87ed8864b2\": container with ID starting with e3cf3769287b95a0a56d0107bb9a000b91cde571204f05095125eb87ed8864b2 not found: ID does not 
exist" Dec 02 07:59:29 crc kubenswrapper[4895]: I1202 07:59:29.176278 4895 scope.go:117] "RemoveContainer" containerID="a1d7526dd0058e5824d17dac33cd94ef04a742ea0ef49775255a0b5b60f004ae" Dec 02 07:59:29 crc kubenswrapper[4895]: E1202 07:59:29.176523 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1d7526dd0058e5824d17dac33cd94ef04a742ea0ef49775255a0b5b60f004ae\": container with ID starting with a1d7526dd0058e5824d17dac33cd94ef04a742ea0ef49775255a0b5b60f004ae not found: ID does not exist" containerID="a1d7526dd0058e5824d17dac33cd94ef04a742ea0ef49775255a0b5b60f004ae" Dec 02 07:59:29 crc kubenswrapper[4895]: I1202 07:59:29.176550 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1d7526dd0058e5824d17dac33cd94ef04a742ea0ef49775255a0b5b60f004ae"} err="failed to get container status \"a1d7526dd0058e5824d17dac33cd94ef04a742ea0ef49775255a0b5b60f004ae\": rpc error: code = NotFound desc = could not find container \"a1d7526dd0058e5824d17dac33cd94ef04a742ea0ef49775255a0b5b60f004ae\": container with ID starting with a1d7526dd0058e5824d17dac33cd94ef04a742ea0ef49775255a0b5b60f004ae not found: ID does not exist" Dec 02 07:59:29 crc kubenswrapper[4895]: I1202 07:59:29.250902 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/acfff40c-249b-49a0-8f2e-4120c1e90628-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "acfff40c-249b-49a0-8f2e-4120c1e90628" (UID: "acfff40c-249b-49a0-8f2e-4120c1e90628"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:59:29 crc kubenswrapper[4895]: I1202 07:59:29.341658 4895 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/acfff40c-249b-49a0-8f2e-4120c1e90628-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 07:59:29 crc kubenswrapper[4895]: I1202 07:59:29.451096 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-r777k"] Dec 02 07:59:29 crc kubenswrapper[4895]: I1202 07:59:29.457016 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-r777k"] Dec 02 07:59:31 crc kubenswrapper[4895]: I1202 07:59:31.154516 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="acfff40c-249b-49a0-8f2e-4120c1e90628" path="/var/lib/kubelet/pods/acfff40c-249b-49a0-8f2e-4120c1e90628/volumes" Dec 02 07:59:35 crc kubenswrapper[4895]: I1202 07:59:35.474105 4895 patch_prober.go:28] interesting pod/machine-config-daemon-wfcg7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 07:59:35 crc kubenswrapper[4895]: I1202 07:59:35.476243 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 08:00:00 crc kubenswrapper[4895]: I1202 08:00:00.152490 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411040-qskgv"] Dec 02 08:00:00 crc kubenswrapper[4895]: E1202 08:00:00.153977 4895 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="acfff40c-249b-49a0-8f2e-4120c1e90628" containerName="registry-server" Dec 02 08:00:00 crc kubenswrapper[4895]: I1202 08:00:00.153996 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="acfff40c-249b-49a0-8f2e-4120c1e90628" containerName="registry-server" Dec 02 08:00:00 crc kubenswrapper[4895]: E1202 08:00:00.154013 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acfff40c-249b-49a0-8f2e-4120c1e90628" containerName="extract-utilities" Dec 02 08:00:00 crc kubenswrapper[4895]: I1202 08:00:00.154021 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="acfff40c-249b-49a0-8f2e-4120c1e90628" containerName="extract-utilities" Dec 02 08:00:00 crc kubenswrapper[4895]: E1202 08:00:00.154034 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acfff40c-249b-49a0-8f2e-4120c1e90628" containerName="extract-content" Dec 02 08:00:00 crc kubenswrapper[4895]: I1202 08:00:00.154041 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="acfff40c-249b-49a0-8f2e-4120c1e90628" containerName="extract-content" Dec 02 08:00:00 crc kubenswrapper[4895]: I1202 08:00:00.154230 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="acfff40c-249b-49a0-8f2e-4120c1e90628" containerName="registry-server" Dec 02 08:00:00 crc kubenswrapper[4895]: I1202 08:00:00.154944 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411040-qskgv" Dec 02 08:00:00 crc kubenswrapper[4895]: I1202 08:00:00.158925 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 02 08:00:00 crc kubenswrapper[4895]: I1202 08:00:00.159695 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 02 08:00:00 crc kubenswrapper[4895]: I1202 08:00:00.175502 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411040-qskgv"] Dec 02 08:00:00 crc kubenswrapper[4895]: I1202 08:00:00.237207 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c5fb87c2-8e73-495e-afdf-a2886910d986-config-volume\") pod \"collect-profiles-29411040-qskgv\" (UID: \"c5fb87c2-8e73-495e-afdf-a2886910d986\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411040-qskgv" Dec 02 08:00:00 crc kubenswrapper[4895]: I1202 08:00:00.237270 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c5fb87c2-8e73-495e-afdf-a2886910d986-secret-volume\") pod \"collect-profiles-29411040-qskgv\" (UID: \"c5fb87c2-8e73-495e-afdf-a2886910d986\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411040-qskgv" Dec 02 08:00:00 crc kubenswrapper[4895]: I1202 08:00:00.237326 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z69dd\" (UniqueName: \"kubernetes.io/projected/c5fb87c2-8e73-495e-afdf-a2886910d986-kube-api-access-z69dd\") pod \"collect-profiles-29411040-qskgv\" (UID: \"c5fb87c2-8e73-495e-afdf-a2886910d986\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29411040-qskgv" Dec 02 08:00:00 crc kubenswrapper[4895]: I1202 08:00:00.338848 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c5fb87c2-8e73-495e-afdf-a2886910d986-config-volume\") pod \"collect-profiles-29411040-qskgv\" (UID: \"c5fb87c2-8e73-495e-afdf-a2886910d986\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411040-qskgv" Dec 02 08:00:00 crc kubenswrapper[4895]: I1202 08:00:00.338901 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c5fb87c2-8e73-495e-afdf-a2886910d986-secret-volume\") pod \"collect-profiles-29411040-qskgv\" (UID: \"c5fb87c2-8e73-495e-afdf-a2886910d986\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411040-qskgv" Dec 02 08:00:00 crc kubenswrapper[4895]: I1202 08:00:00.338923 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z69dd\" (UniqueName: \"kubernetes.io/projected/c5fb87c2-8e73-495e-afdf-a2886910d986-kube-api-access-z69dd\") pod \"collect-profiles-29411040-qskgv\" (UID: \"c5fb87c2-8e73-495e-afdf-a2886910d986\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411040-qskgv" Dec 02 08:00:00 crc kubenswrapper[4895]: I1202 08:00:00.340561 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c5fb87c2-8e73-495e-afdf-a2886910d986-config-volume\") pod \"collect-profiles-29411040-qskgv\" (UID: \"c5fb87c2-8e73-495e-afdf-a2886910d986\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411040-qskgv" Dec 02 08:00:00 crc kubenswrapper[4895]: I1202 08:00:00.345509 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/c5fb87c2-8e73-495e-afdf-a2886910d986-secret-volume\") pod \"collect-profiles-29411040-qskgv\" (UID: \"c5fb87c2-8e73-495e-afdf-a2886910d986\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411040-qskgv" Dec 02 08:00:00 crc kubenswrapper[4895]: I1202 08:00:00.358703 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z69dd\" (UniqueName: \"kubernetes.io/projected/c5fb87c2-8e73-495e-afdf-a2886910d986-kube-api-access-z69dd\") pod \"collect-profiles-29411040-qskgv\" (UID: \"c5fb87c2-8e73-495e-afdf-a2886910d986\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411040-qskgv" Dec 02 08:00:00 crc kubenswrapper[4895]: I1202 08:00:00.475835 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411040-qskgv" Dec 02 08:00:00 crc kubenswrapper[4895]: I1202 08:00:00.688599 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411040-qskgv"] Dec 02 08:00:00 crc kubenswrapper[4895]: I1202 08:00:00.951124 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411040-qskgv" event={"ID":"c5fb87c2-8e73-495e-afdf-a2886910d986","Type":"ContainerStarted","Data":"fb2380bbefb8ea9c516cbef089e781cf7406c97d9c8b1f40d62d00c034d7b125"} Dec 02 08:00:00 crc kubenswrapper[4895]: I1202 08:00:00.951627 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411040-qskgv" event={"ID":"c5fb87c2-8e73-495e-afdf-a2886910d986","Type":"ContainerStarted","Data":"892cf99ab5835c3c16728ee06406b1bce940230f7aff82add0e4f391b56e3070"} Dec 02 08:00:00 crc kubenswrapper[4895]: I1202 08:00:00.974245 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29411040-qskgv" 
podStartSLOduration=0.974214934 podStartE2EDuration="974.214934ms" podCreationTimestamp="2025-12-02 08:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:00:00.966083131 +0000 UTC m=+2212.136942744" watchObservedRunningTime="2025-12-02 08:00:00.974214934 +0000 UTC m=+2212.145074547" Dec 02 08:00:01 crc kubenswrapper[4895]: I1202 08:00:01.961456 4895 generic.go:334] "Generic (PLEG): container finished" podID="c5fb87c2-8e73-495e-afdf-a2886910d986" containerID="fb2380bbefb8ea9c516cbef089e781cf7406c97d9c8b1f40d62d00c034d7b125" exitCode=0 Dec 02 08:00:01 crc kubenswrapper[4895]: I1202 08:00:01.961530 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411040-qskgv" event={"ID":"c5fb87c2-8e73-495e-afdf-a2886910d986","Type":"ContainerDied","Data":"fb2380bbefb8ea9c516cbef089e781cf7406c97d9c8b1f40d62d00c034d7b125"} Dec 02 08:00:03 crc kubenswrapper[4895]: I1202 08:00:03.240009 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411040-qskgv" Dec 02 08:00:03 crc kubenswrapper[4895]: I1202 08:00:03.385124 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z69dd\" (UniqueName: \"kubernetes.io/projected/c5fb87c2-8e73-495e-afdf-a2886910d986-kube-api-access-z69dd\") pod \"c5fb87c2-8e73-495e-afdf-a2886910d986\" (UID: \"c5fb87c2-8e73-495e-afdf-a2886910d986\") " Dec 02 08:00:03 crc kubenswrapper[4895]: I1202 08:00:03.385217 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c5fb87c2-8e73-495e-afdf-a2886910d986-secret-volume\") pod \"c5fb87c2-8e73-495e-afdf-a2886910d986\" (UID: \"c5fb87c2-8e73-495e-afdf-a2886910d986\") " Dec 02 08:00:03 crc kubenswrapper[4895]: I1202 08:00:03.385254 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c5fb87c2-8e73-495e-afdf-a2886910d986-config-volume\") pod \"c5fb87c2-8e73-495e-afdf-a2886910d986\" (UID: \"c5fb87c2-8e73-495e-afdf-a2886910d986\") " Dec 02 08:00:03 crc kubenswrapper[4895]: I1202 08:00:03.386114 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5fb87c2-8e73-495e-afdf-a2886910d986-config-volume" (OuterVolumeSpecName: "config-volume") pod "c5fb87c2-8e73-495e-afdf-a2886910d986" (UID: "c5fb87c2-8e73-495e-afdf-a2886910d986"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:00:03 crc kubenswrapper[4895]: I1202 08:00:03.390802 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5fb87c2-8e73-495e-afdf-a2886910d986-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "c5fb87c2-8e73-495e-afdf-a2886910d986" (UID: "c5fb87c2-8e73-495e-afdf-a2886910d986"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:00:03 crc kubenswrapper[4895]: I1202 08:00:03.392532 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5fb87c2-8e73-495e-afdf-a2886910d986-kube-api-access-z69dd" (OuterVolumeSpecName: "kube-api-access-z69dd") pod "c5fb87c2-8e73-495e-afdf-a2886910d986" (UID: "c5fb87c2-8e73-495e-afdf-a2886910d986"). InnerVolumeSpecName "kube-api-access-z69dd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:00:03 crc kubenswrapper[4895]: I1202 08:00:03.487422 4895 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c5fb87c2-8e73-495e-afdf-a2886910d986-config-volume\") on node \"crc\" DevicePath \"\"" Dec 02 08:00:03 crc kubenswrapper[4895]: I1202 08:00:03.487491 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z69dd\" (UniqueName: \"kubernetes.io/projected/c5fb87c2-8e73-495e-afdf-a2886910d986-kube-api-access-z69dd\") on node \"crc\" DevicePath \"\"" Dec 02 08:00:03 crc kubenswrapper[4895]: I1202 08:00:03.487501 4895 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c5fb87c2-8e73-495e-afdf-a2886910d986-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 02 08:00:03 crc kubenswrapper[4895]: I1202 08:00:03.976927 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411040-qskgv" event={"ID":"c5fb87c2-8e73-495e-afdf-a2886910d986","Type":"ContainerDied","Data":"892cf99ab5835c3c16728ee06406b1bce940230f7aff82add0e4f391b56e3070"} Dec 02 08:00:03 crc kubenswrapper[4895]: I1202 08:00:03.976978 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="892cf99ab5835c3c16728ee06406b1bce940230f7aff82add0e4f391b56e3070" Dec 02 08:00:03 crc kubenswrapper[4895]: I1202 08:00:03.976990 4895 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411040-qskgv" Dec 02 08:00:04 crc kubenswrapper[4895]: I1202 08:00:04.308780 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29410995-cbx2h"] Dec 02 08:00:04 crc kubenswrapper[4895]: I1202 08:00:04.315150 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29410995-cbx2h"] Dec 02 08:00:05 crc kubenswrapper[4895]: I1202 08:00:05.149016 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e8f451a-ac50-4e0a-bf8e-e6d505305177" path="/var/lib/kubelet/pods/7e8f451a-ac50-4e0a-bf8e-e6d505305177/volumes" Dec 02 08:00:05 crc kubenswrapper[4895]: I1202 08:00:05.474230 4895 patch_prober.go:28] interesting pod/machine-config-daemon-wfcg7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 08:00:05 crc kubenswrapper[4895]: I1202 08:00:05.474330 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 08:00:27 crc kubenswrapper[4895]: I1202 08:00:27.782846 4895 scope.go:117] "RemoveContainer" containerID="9c4150dad8ac9bf277c6659f2996173dcea7cf6a3077852fb2e4dea67dec1703" Dec 02 08:00:35 crc kubenswrapper[4895]: I1202 08:00:35.473664 4895 patch_prober.go:28] interesting pod/machine-config-daemon-wfcg7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Dec 02 08:00:35 crc kubenswrapper[4895]: I1202 08:00:35.474268 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 08:00:35 crc kubenswrapper[4895]: I1202 08:00:35.474349 4895 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" Dec 02 08:00:35 crc kubenswrapper[4895]: I1202 08:00:35.475152 4895 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"238d09bc62d590c652fe96ecc9345d3c3a5705c84a89d4f55cbeac25f302e139"} pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 08:00:35 crc kubenswrapper[4895]: I1202 08:00:35.475226 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerName="machine-config-daemon" containerID="cri-o://238d09bc62d590c652fe96ecc9345d3c3a5705c84a89d4f55cbeac25f302e139" gracePeriod=600 Dec 02 08:00:35 crc kubenswrapper[4895]: E1202 08:00:35.599609 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 08:00:36 crc kubenswrapper[4895]: 
I1202 08:00:36.274645 4895 generic.go:334] "Generic (PLEG): container finished" podID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerID="238d09bc62d590c652fe96ecc9345d3c3a5705c84a89d4f55cbeac25f302e139" exitCode=0 Dec 02 08:00:36 crc kubenswrapper[4895]: I1202 08:00:36.274696 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" event={"ID":"0468d2d1-a975-45a6-af9f-47adc432fab0","Type":"ContainerDied","Data":"238d09bc62d590c652fe96ecc9345d3c3a5705c84a89d4f55cbeac25f302e139"} Dec 02 08:00:36 crc kubenswrapper[4895]: I1202 08:00:36.274794 4895 scope.go:117] "RemoveContainer" containerID="3c44d318ec461337d9eed3b738b223965ab15c013c5374cf7d1e76d7977871d2" Dec 02 08:00:36 crc kubenswrapper[4895]: I1202 08:00:36.275457 4895 scope.go:117] "RemoveContainer" containerID="238d09bc62d590c652fe96ecc9345d3c3a5705c84a89d4f55cbeac25f302e139" Dec 02 08:00:36 crc kubenswrapper[4895]: E1202 08:00:36.275796 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 08:00:48 crc kubenswrapper[4895]: I1202 08:00:48.141922 4895 scope.go:117] "RemoveContainer" containerID="238d09bc62d590c652fe96ecc9345d3c3a5705c84a89d4f55cbeac25f302e139" Dec 02 08:00:48 crc kubenswrapper[4895]: E1202 08:00:48.143360 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 08:01:03 crc kubenswrapper[4895]: I1202 08:01:03.141771 4895 scope.go:117] "RemoveContainer" containerID="238d09bc62d590c652fe96ecc9345d3c3a5705c84a89d4f55cbeac25f302e139" Dec 02 08:01:03 crc kubenswrapper[4895]: E1202 08:01:03.142547 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 08:01:14 crc kubenswrapper[4895]: I1202 08:01:14.140794 4895 scope.go:117] "RemoveContainer" containerID="238d09bc62d590c652fe96ecc9345d3c3a5705c84a89d4f55cbeac25f302e139" Dec 02 08:01:14 crc kubenswrapper[4895]: E1202 08:01:14.141863 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 08:01:28 crc kubenswrapper[4895]: I1202 08:01:28.141102 4895 scope.go:117] "RemoveContainer" containerID="238d09bc62d590c652fe96ecc9345d3c3a5705c84a89d4f55cbeac25f302e139" Dec 02 08:01:28 crc kubenswrapper[4895]: E1202 08:01:28.142445 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 08:01:40 crc kubenswrapper[4895]: I1202 08:01:40.141092 4895 scope.go:117] "RemoveContainer" containerID="238d09bc62d590c652fe96ecc9345d3c3a5705c84a89d4f55cbeac25f302e139" Dec 02 08:01:40 crc kubenswrapper[4895]: E1202 08:01:40.142850 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 08:01:51 crc kubenswrapper[4895]: I1202 08:01:51.141755 4895 scope.go:117] "RemoveContainer" containerID="238d09bc62d590c652fe96ecc9345d3c3a5705c84a89d4f55cbeac25f302e139" Dec 02 08:01:51 crc kubenswrapper[4895]: E1202 08:01:51.144333 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 08:02:03 crc kubenswrapper[4895]: I1202 08:02:03.141936 4895 scope.go:117] "RemoveContainer" containerID="238d09bc62d590c652fe96ecc9345d3c3a5705c84a89d4f55cbeac25f302e139" Dec 02 08:02:03 crc kubenswrapper[4895]: E1202 08:02:03.142893 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 08:02:14 crc kubenswrapper[4895]: I1202 08:02:14.141480 4895 scope.go:117] "RemoveContainer" containerID="238d09bc62d590c652fe96ecc9345d3c3a5705c84a89d4f55cbeac25f302e139" Dec 02 08:02:14 crc kubenswrapper[4895]: E1202 08:02:14.143306 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 08:02:27 crc kubenswrapper[4895]: I1202 08:02:27.141538 4895 scope.go:117] "RemoveContainer" containerID="238d09bc62d590c652fe96ecc9345d3c3a5705c84a89d4f55cbeac25f302e139" Dec 02 08:02:27 crc kubenswrapper[4895]: E1202 08:02:27.142296 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 08:02:39 crc kubenswrapper[4895]: I1202 08:02:39.146708 4895 scope.go:117] "RemoveContainer" containerID="238d09bc62d590c652fe96ecc9345d3c3a5705c84a89d4f55cbeac25f302e139" Dec 02 08:02:39 crc kubenswrapper[4895]: E1202 08:02:39.149335 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 08:02:50 crc kubenswrapper[4895]: I1202 08:02:50.141834 4895 scope.go:117] "RemoveContainer" containerID="238d09bc62d590c652fe96ecc9345d3c3a5705c84a89d4f55cbeac25f302e139" Dec 02 08:02:50 crc kubenswrapper[4895]: E1202 08:02:50.142970 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 08:03:02 crc kubenswrapper[4895]: I1202 08:03:02.141600 4895 scope.go:117] "RemoveContainer" containerID="238d09bc62d590c652fe96ecc9345d3c3a5705c84a89d4f55cbeac25f302e139" Dec 02 08:03:02 crc kubenswrapper[4895]: E1202 08:03:02.142335 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 08:03:15 crc kubenswrapper[4895]: I1202 08:03:15.141471 4895 scope.go:117] "RemoveContainer" containerID="238d09bc62d590c652fe96ecc9345d3c3a5705c84a89d4f55cbeac25f302e139" Dec 02 08:03:15 crc kubenswrapper[4895]: E1202 08:03:15.143618 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 08:03:27 crc kubenswrapper[4895]: I1202 08:03:27.141059 4895 scope.go:117] "RemoveContainer" containerID="238d09bc62d590c652fe96ecc9345d3c3a5705c84a89d4f55cbeac25f302e139" Dec 02 08:03:27 crc kubenswrapper[4895]: E1202 08:03:27.142196 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 08:03:33 crc kubenswrapper[4895]: I1202 08:03:33.396278 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-vqj2s"] Dec 02 08:03:33 crc kubenswrapper[4895]: E1202 08:03:33.397104 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5fb87c2-8e73-495e-afdf-a2886910d986" containerName="collect-profiles" Dec 02 08:03:33 crc kubenswrapper[4895]: I1202 08:03:33.397122 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5fb87c2-8e73-495e-afdf-a2886910d986" containerName="collect-profiles" Dec 02 08:03:33 crc kubenswrapper[4895]: I1202 08:03:33.397349 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5fb87c2-8e73-495e-afdf-a2886910d986" containerName="collect-profiles" Dec 02 08:03:33 crc kubenswrapper[4895]: I1202 08:03:33.398670 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vqj2s" Dec 02 08:03:33 crc kubenswrapper[4895]: I1202 08:03:33.420544 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vqj2s"] Dec 02 08:03:33 crc kubenswrapper[4895]: I1202 08:03:33.449125 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/06c8791b-365b-4ac6-bea0-4144e069e7bd-utilities\") pod \"community-operators-vqj2s\" (UID: \"06c8791b-365b-4ac6-bea0-4144e069e7bd\") " pod="openshift-marketplace/community-operators-vqj2s" Dec 02 08:03:33 crc kubenswrapper[4895]: I1202 08:03:33.449235 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/06c8791b-365b-4ac6-bea0-4144e069e7bd-catalog-content\") pod \"community-operators-vqj2s\" (UID: \"06c8791b-365b-4ac6-bea0-4144e069e7bd\") " pod="openshift-marketplace/community-operators-vqj2s" Dec 02 08:03:33 crc kubenswrapper[4895]: I1202 08:03:33.449445 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fh6sz\" (UniqueName: \"kubernetes.io/projected/06c8791b-365b-4ac6-bea0-4144e069e7bd-kube-api-access-fh6sz\") pod \"community-operators-vqj2s\" (UID: \"06c8791b-365b-4ac6-bea0-4144e069e7bd\") " pod="openshift-marketplace/community-operators-vqj2s" Dec 02 08:03:33 crc kubenswrapper[4895]: I1202 08:03:33.551004 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/06c8791b-365b-4ac6-bea0-4144e069e7bd-utilities\") pod \"community-operators-vqj2s\" (UID: \"06c8791b-365b-4ac6-bea0-4144e069e7bd\") " pod="openshift-marketplace/community-operators-vqj2s" Dec 02 08:03:33 crc kubenswrapper[4895]: I1202 08:03:33.551399 4895 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/06c8791b-365b-4ac6-bea0-4144e069e7bd-catalog-content\") pod \"community-operators-vqj2s\" (UID: \"06c8791b-365b-4ac6-bea0-4144e069e7bd\") " pod="openshift-marketplace/community-operators-vqj2s" Dec 02 08:03:33 crc kubenswrapper[4895]: I1202 08:03:33.551697 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/06c8791b-365b-4ac6-bea0-4144e069e7bd-utilities\") pod \"community-operators-vqj2s\" (UID: \"06c8791b-365b-4ac6-bea0-4144e069e7bd\") " pod="openshift-marketplace/community-operators-vqj2s" Dec 02 08:03:33 crc kubenswrapper[4895]: I1202 08:03:33.551910 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/06c8791b-365b-4ac6-bea0-4144e069e7bd-catalog-content\") pod \"community-operators-vqj2s\" (UID: \"06c8791b-365b-4ac6-bea0-4144e069e7bd\") " pod="openshift-marketplace/community-operators-vqj2s" Dec 02 08:03:33 crc kubenswrapper[4895]: I1202 08:03:33.552883 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fh6sz\" (UniqueName: \"kubernetes.io/projected/06c8791b-365b-4ac6-bea0-4144e069e7bd-kube-api-access-fh6sz\") pod \"community-operators-vqj2s\" (UID: \"06c8791b-365b-4ac6-bea0-4144e069e7bd\") " pod="openshift-marketplace/community-operators-vqj2s" Dec 02 08:03:33 crc kubenswrapper[4895]: I1202 08:03:33.572658 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fh6sz\" (UniqueName: \"kubernetes.io/projected/06c8791b-365b-4ac6-bea0-4144e069e7bd-kube-api-access-fh6sz\") pod \"community-operators-vqj2s\" (UID: \"06c8791b-365b-4ac6-bea0-4144e069e7bd\") " pod="openshift-marketplace/community-operators-vqj2s" Dec 02 08:03:33 crc kubenswrapper[4895]: I1202 08:03:33.752178 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vqj2s" Dec 02 08:03:34 crc kubenswrapper[4895]: I1202 08:03:34.075511 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vqj2s"] Dec 02 08:03:34 crc kubenswrapper[4895]: I1202 08:03:34.908025 4895 generic.go:334] "Generic (PLEG): container finished" podID="06c8791b-365b-4ac6-bea0-4144e069e7bd" containerID="d5569a8dc5707a03fbffce20aba3850b0ce2306f264a17617bbc67ca353dd039" exitCode=0 Dec 02 08:03:34 crc kubenswrapper[4895]: I1202 08:03:34.908158 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vqj2s" event={"ID":"06c8791b-365b-4ac6-bea0-4144e069e7bd","Type":"ContainerDied","Data":"d5569a8dc5707a03fbffce20aba3850b0ce2306f264a17617bbc67ca353dd039"} Dec 02 08:03:34 crc kubenswrapper[4895]: I1202 08:03:34.909551 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vqj2s" event={"ID":"06c8791b-365b-4ac6-bea0-4144e069e7bd","Type":"ContainerStarted","Data":"bd29c04b8b61d5efe30fdef778ac763918b030bc526b58977fd3896105628920"} Dec 02 08:03:34 crc kubenswrapper[4895]: I1202 08:03:34.911459 4895 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 02 08:03:36 crc kubenswrapper[4895]: I1202 08:03:36.934958 4895 generic.go:334] "Generic (PLEG): container finished" podID="06c8791b-365b-4ac6-bea0-4144e069e7bd" containerID="65f9da33cc3a325f0ceae5293cccad7866c0e37476a121ac3ce221e373b8442c" exitCode=0 Dec 02 08:03:36 crc kubenswrapper[4895]: I1202 08:03:36.935290 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vqj2s" event={"ID":"06c8791b-365b-4ac6-bea0-4144e069e7bd","Type":"ContainerDied","Data":"65f9da33cc3a325f0ceae5293cccad7866c0e37476a121ac3ce221e373b8442c"} Dec 02 08:03:38 crc kubenswrapper[4895]: I1202 08:03:38.141684 4895 scope.go:117] "RemoveContainer" 
containerID="238d09bc62d590c652fe96ecc9345d3c3a5705c84a89d4f55cbeac25f302e139" Dec 02 08:03:38 crc kubenswrapper[4895]: E1202 08:03:38.142304 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 08:03:38 crc kubenswrapper[4895]: I1202 08:03:38.955224 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vqj2s" event={"ID":"06c8791b-365b-4ac6-bea0-4144e069e7bd","Type":"ContainerStarted","Data":"c8ff56b7512a2b7a46ae6e7861f9d0687519e44cd4608a93a723b36577c34bc5"} Dec 02 08:03:38 crc kubenswrapper[4895]: I1202 08:03:38.975253 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-vqj2s" podStartSLOduration=2.966776092 podStartE2EDuration="5.975225498s" podCreationTimestamp="2025-12-02 08:03:33 +0000 UTC" firstStartedPulling="2025-12-02 08:03:34.911123803 +0000 UTC m=+2426.081983416" lastFinishedPulling="2025-12-02 08:03:37.919573209 +0000 UTC m=+2429.090432822" observedRunningTime="2025-12-02 08:03:38.973166274 +0000 UTC m=+2430.144025907" watchObservedRunningTime="2025-12-02 08:03:38.975225498 +0000 UTC m=+2430.146085111" Dec 02 08:03:43 crc kubenswrapper[4895]: I1202 08:03:43.752821 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-vqj2s" Dec 02 08:03:43 crc kubenswrapper[4895]: I1202 08:03:43.755031 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-vqj2s" Dec 02 08:03:43 crc kubenswrapper[4895]: I1202 08:03:43.792887 4895 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-vqj2s" Dec 02 08:03:44 crc kubenswrapper[4895]: I1202 08:03:44.029875 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-vqj2s" Dec 02 08:03:44 crc kubenswrapper[4895]: I1202 08:03:44.076490 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vqj2s"] Dec 02 08:03:46 crc kubenswrapper[4895]: I1202 08:03:46.009612 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-vqj2s" podUID="06c8791b-365b-4ac6-bea0-4144e069e7bd" containerName="registry-server" containerID="cri-o://c8ff56b7512a2b7a46ae6e7861f9d0687519e44cd4608a93a723b36577c34bc5" gracePeriod=2 Dec 02 08:03:47 crc kubenswrapper[4895]: I1202 08:03:47.021565 4895 generic.go:334] "Generic (PLEG): container finished" podID="06c8791b-365b-4ac6-bea0-4144e069e7bd" containerID="c8ff56b7512a2b7a46ae6e7861f9d0687519e44cd4608a93a723b36577c34bc5" exitCode=0 Dec 02 08:03:47 crc kubenswrapper[4895]: I1202 08:03:47.021650 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vqj2s" event={"ID":"06c8791b-365b-4ac6-bea0-4144e069e7bd","Type":"ContainerDied","Data":"c8ff56b7512a2b7a46ae6e7861f9d0687519e44cd4608a93a723b36577c34bc5"} Dec 02 08:03:47 crc kubenswrapper[4895]: I1202 08:03:47.498765 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vqj2s" Dec 02 08:03:47 crc kubenswrapper[4895]: I1202 08:03:47.599901 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/06c8791b-365b-4ac6-bea0-4144e069e7bd-catalog-content\") pod \"06c8791b-365b-4ac6-bea0-4144e069e7bd\" (UID: \"06c8791b-365b-4ac6-bea0-4144e069e7bd\") " Dec 02 08:03:47 crc kubenswrapper[4895]: I1202 08:03:47.600094 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/06c8791b-365b-4ac6-bea0-4144e069e7bd-utilities\") pod \"06c8791b-365b-4ac6-bea0-4144e069e7bd\" (UID: \"06c8791b-365b-4ac6-bea0-4144e069e7bd\") " Dec 02 08:03:47 crc kubenswrapper[4895]: I1202 08:03:47.600134 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fh6sz\" (UniqueName: \"kubernetes.io/projected/06c8791b-365b-4ac6-bea0-4144e069e7bd-kube-api-access-fh6sz\") pod \"06c8791b-365b-4ac6-bea0-4144e069e7bd\" (UID: \"06c8791b-365b-4ac6-bea0-4144e069e7bd\") " Dec 02 08:03:47 crc kubenswrapper[4895]: I1202 08:03:47.601005 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/06c8791b-365b-4ac6-bea0-4144e069e7bd-utilities" (OuterVolumeSpecName: "utilities") pod "06c8791b-365b-4ac6-bea0-4144e069e7bd" (UID: "06c8791b-365b-4ac6-bea0-4144e069e7bd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:03:47 crc kubenswrapper[4895]: I1202 08:03:47.605907 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06c8791b-365b-4ac6-bea0-4144e069e7bd-kube-api-access-fh6sz" (OuterVolumeSpecName: "kube-api-access-fh6sz") pod "06c8791b-365b-4ac6-bea0-4144e069e7bd" (UID: "06c8791b-365b-4ac6-bea0-4144e069e7bd"). InnerVolumeSpecName "kube-api-access-fh6sz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:03:47 crc kubenswrapper[4895]: I1202 08:03:47.652335 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/06c8791b-365b-4ac6-bea0-4144e069e7bd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "06c8791b-365b-4ac6-bea0-4144e069e7bd" (UID: "06c8791b-365b-4ac6-bea0-4144e069e7bd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:03:47 crc kubenswrapper[4895]: I1202 08:03:47.702023 4895 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/06c8791b-365b-4ac6-bea0-4144e069e7bd-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 08:03:47 crc kubenswrapper[4895]: I1202 08:03:47.702065 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fh6sz\" (UniqueName: \"kubernetes.io/projected/06c8791b-365b-4ac6-bea0-4144e069e7bd-kube-api-access-fh6sz\") on node \"crc\" DevicePath \"\"" Dec 02 08:03:47 crc kubenswrapper[4895]: I1202 08:03:47.702098 4895 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/06c8791b-365b-4ac6-bea0-4144e069e7bd-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 08:03:48 crc kubenswrapper[4895]: I1202 08:03:48.030052 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vqj2s" event={"ID":"06c8791b-365b-4ac6-bea0-4144e069e7bd","Type":"ContainerDied","Data":"bd29c04b8b61d5efe30fdef778ac763918b030bc526b58977fd3896105628920"} Dec 02 08:03:48 crc kubenswrapper[4895]: I1202 08:03:48.030117 4895 scope.go:117] "RemoveContainer" containerID="c8ff56b7512a2b7a46ae6e7861f9d0687519e44cd4608a93a723b36577c34bc5" Dec 02 08:03:48 crc kubenswrapper[4895]: I1202 08:03:48.030145 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vqj2s" Dec 02 08:03:48 crc kubenswrapper[4895]: I1202 08:03:48.069755 4895 scope.go:117] "RemoveContainer" containerID="65f9da33cc3a325f0ceae5293cccad7866c0e37476a121ac3ce221e373b8442c" Dec 02 08:03:48 crc kubenswrapper[4895]: I1202 08:03:48.071334 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vqj2s"] Dec 02 08:03:48 crc kubenswrapper[4895]: I1202 08:03:48.078144 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-vqj2s"] Dec 02 08:03:48 crc kubenswrapper[4895]: I1202 08:03:48.089848 4895 scope.go:117] "RemoveContainer" containerID="d5569a8dc5707a03fbffce20aba3850b0ce2306f264a17617bbc67ca353dd039" Dec 02 08:03:49 crc kubenswrapper[4895]: I1202 08:03:49.151098 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06c8791b-365b-4ac6-bea0-4144e069e7bd" path="/var/lib/kubelet/pods/06c8791b-365b-4ac6-bea0-4144e069e7bd/volumes" Dec 02 08:03:53 crc kubenswrapper[4895]: I1202 08:03:53.141545 4895 scope.go:117] "RemoveContainer" containerID="238d09bc62d590c652fe96ecc9345d3c3a5705c84a89d4f55cbeac25f302e139" Dec 02 08:03:53 crc kubenswrapper[4895]: E1202 08:03:53.141840 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 08:04:07 crc kubenswrapper[4895]: I1202 08:04:07.141667 4895 scope.go:117] "RemoveContainer" containerID="238d09bc62d590c652fe96ecc9345d3c3a5705c84a89d4f55cbeac25f302e139" Dec 02 08:04:07 crc kubenswrapper[4895]: E1202 08:04:07.142503 4895 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 08:04:21 crc kubenswrapper[4895]: I1202 08:04:21.142403 4895 scope.go:117] "RemoveContainer" containerID="238d09bc62d590c652fe96ecc9345d3c3a5705c84a89d4f55cbeac25f302e139" Dec 02 08:04:21 crc kubenswrapper[4895]: E1202 08:04:21.143824 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 08:04:35 crc kubenswrapper[4895]: I1202 08:04:35.142018 4895 scope.go:117] "RemoveContainer" containerID="238d09bc62d590c652fe96ecc9345d3c3a5705c84a89d4f55cbeac25f302e139" Dec 02 08:04:35 crc kubenswrapper[4895]: E1202 08:04:35.142852 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 08:04:49 crc kubenswrapper[4895]: I1202 08:04:49.148449 4895 scope.go:117] "RemoveContainer" containerID="238d09bc62d590c652fe96ecc9345d3c3a5705c84a89d4f55cbeac25f302e139" Dec 02 08:04:49 crc kubenswrapper[4895]: E1202 08:04:49.151544 4895 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 08:05:03 crc kubenswrapper[4895]: I1202 08:05:03.141223 4895 scope.go:117] "RemoveContainer" containerID="238d09bc62d590c652fe96ecc9345d3c3a5705c84a89d4f55cbeac25f302e139" Dec 02 08:05:03 crc kubenswrapper[4895]: E1202 08:05:03.142165 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 08:05:16 crc kubenswrapper[4895]: I1202 08:05:16.141278 4895 scope.go:117] "RemoveContainer" containerID="238d09bc62d590c652fe96ecc9345d3c3a5705c84a89d4f55cbeac25f302e139" Dec 02 08:05:16 crc kubenswrapper[4895]: E1202 08:05:16.142079 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 08:05:30 crc kubenswrapper[4895]: I1202 08:05:30.141669 4895 scope.go:117] "RemoveContainer" containerID="238d09bc62d590c652fe96ecc9345d3c3a5705c84a89d4f55cbeac25f302e139" Dec 02 08:05:30 crc kubenswrapper[4895]: E1202 08:05:30.142789 4895 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 08:05:45 crc kubenswrapper[4895]: I1202 08:05:45.141793 4895 scope.go:117] "RemoveContainer" containerID="238d09bc62d590c652fe96ecc9345d3c3a5705c84a89d4f55cbeac25f302e139" Dec 02 08:05:46 crc kubenswrapper[4895]: I1202 08:05:46.011915 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" event={"ID":"0468d2d1-a975-45a6-af9f-47adc432fab0","Type":"ContainerStarted","Data":"52a93dd2b6f73ad029251d44c93237d97d9df4ee4ea0c15ada2a5ea88c35966c"} Dec 02 08:08:05 crc kubenswrapper[4895]: I1202 08:08:05.473088 4895 patch_prober.go:28] interesting pod/machine-config-daemon-wfcg7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 08:08:05 crc kubenswrapper[4895]: I1202 08:08:05.473707 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 08:08:35 crc kubenswrapper[4895]: I1202 08:08:35.473795 4895 patch_prober.go:28] interesting pod/machine-config-daemon-wfcg7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: 
connect: connection refused" start-of-body= Dec 02 08:08:35 crc kubenswrapper[4895]: I1202 08:08:35.474373 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 08:09:05 crc kubenswrapper[4895]: I1202 08:09:05.473987 4895 patch_prober.go:28] interesting pod/machine-config-daemon-wfcg7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 08:09:05 crc kubenswrapper[4895]: I1202 08:09:05.474629 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 08:09:05 crc kubenswrapper[4895]: I1202 08:09:05.474688 4895 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" Dec 02 08:09:05 crc kubenswrapper[4895]: I1202 08:09:05.475454 4895 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"52a93dd2b6f73ad029251d44c93237d97d9df4ee4ea0c15ada2a5ea88c35966c"} pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 08:09:05 crc kubenswrapper[4895]: I1202 08:09:05.475511 4895 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerName="machine-config-daemon" containerID="cri-o://52a93dd2b6f73ad029251d44c93237d97d9df4ee4ea0c15ada2a5ea88c35966c" gracePeriod=600 Dec 02 08:09:05 crc kubenswrapper[4895]: I1202 08:09:05.672640 4895 generic.go:334] "Generic (PLEG): container finished" podID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerID="52a93dd2b6f73ad029251d44c93237d97d9df4ee4ea0c15ada2a5ea88c35966c" exitCode=0 Dec 02 08:09:05 crc kubenswrapper[4895]: I1202 08:09:05.672693 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" event={"ID":"0468d2d1-a975-45a6-af9f-47adc432fab0","Type":"ContainerDied","Data":"52a93dd2b6f73ad029251d44c93237d97d9df4ee4ea0c15ada2a5ea88c35966c"} Dec 02 08:09:05 crc kubenswrapper[4895]: I1202 08:09:05.672762 4895 scope.go:117] "RemoveContainer" containerID="238d09bc62d590c652fe96ecc9345d3c3a5705c84a89d4f55cbeac25f302e139" Dec 02 08:09:06 crc kubenswrapper[4895]: I1202 08:09:06.682826 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" event={"ID":"0468d2d1-a975-45a6-af9f-47adc432fab0","Type":"ContainerStarted","Data":"facd37752598c06651c649139fe0757b632d9e66966e92bbd8b7b13be5f8e977"} Dec 02 08:09:15 crc kubenswrapper[4895]: I1202 08:09:15.265306 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-72zpq"] Dec 02 08:09:15 crc kubenswrapper[4895]: E1202 08:09:15.267528 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06c8791b-365b-4ac6-bea0-4144e069e7bd" containerName="extract-content" Dec 02 08:09:15 crc kubenswrapper[4895]: I1202 08:09:15.267597 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="06c8791b-365b-4ac6-bea0-4144e069e7bd" containerName="extract-content" Dec 02 08:09:15 crc kubenswrapper[4895]: E1202 08:09:15.267629 4895 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06c8791b-365b-4ac6-bea0-4144e069e7bd" containerName="registry-server" Dec 02 08:09:15 crc kubenswrapper[4895]: I1202 08:09:15.267638 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="06c8791b-365b-4ac6-bea0-4144e069e7bd" containerName="registry-server" Dec 02 08:09:15 crc kubenswrapper[4895]: E1202 08:09:15.267657 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06c8791b-365b-4ac6-bea0-4144e069e7bd" containerName="extract-utilities" Dec 02 08:09:15 crc kubenswrapper[4895]: I1202 08:09:15.267671 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="06c8791b-365b-4ac6-bea0-4144e069e7bd" containerName="extract-utilities" Dec 02 08:09:15 crc kubenswrapper[4895]: I1202 08:09:15.267935 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="06c8791b-365b-4ac6-bea0-4144e069e7bd" containerName="registry-server" Dec 02 08:09:15 crc kubenswrapper[4895]: I1202 08:09:15.269191 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-72zpq" Dec 02 08:09:15 crc kubenswrapper[4895]: I1202 08:09:15.286464 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-72zpq"] Dec 02 08:09:15 crc kubenswrapper[4895]: I1202 08:09:15.342914 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b698d496-7a66-460c-9f2a-82fb23e3ed69-utilities\") pod \"certified-operators-72zpq\" (UID: \"b698d496-7a66-460c-9f2a-82fb23e3ed69\") " pod="openshift-marketplace/certified-operators-72zpq" Dec 02 08:09:15 crc kubenswrapper[4895]: I1202 08:09:15.343028 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fc5zh\" (UniqueName: \"kubernetes.io/projected/b698d496-7a66-460c-9f2a-82fb23e3ed69-kube-api-access-fc5zh\") pod \"certified-operators-72zpq\" (UID: \"b698d496-7a66-460c-9f2a-82fb23e3ed69\") " pod="openshift-marketplace/certified-operators-72zpq" Dec 02 08:09:15 crc kubenswrapper[4895]: I1202 08:09:15.343286 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b698d496-7a66-460c-9f2a-82fb23e3ed69-catalog-content\") pod \"certified-operators-72zpq\" (UID: \"b698d496-7a66-460c-9f2a-82fb23e3ed69\") " pod="openshift-marketplace/certified-operators-72zpq" Dec 02 08:09:15 crc kubenswrapper[4895]: I1202 08:09:15.444777 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fc5zh\" (UniqueName: \"kubernetes.io/projected/b698d496-7a66-460c-9f2a-82fb23e3ed69-kube-api-access-fc5zh\") pod \"certified-operators-72zpq\" (UID: \"b698d496-7a66-460c-9f2a-82fb23e3ed69\") " pod="openshift-marketplace/certified-operators-72zpq" Dec 02 08:09:15 crc kubenswrapper[4895]: I1202 08:09:15.444866 4895 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b698d496-7a66-460c-9f2a-82fb23e3ed69-catalog-content\") pod \"certified-operators-72zpq\" (UID: \"b698d496-7a66-460c-9f2a-82fb23e3ed69\") " pod="openshift-marketplace/certified-operators-72zpq" Dec 02 08:09:15 crc kubenswrapper[4895]: I1202 08:09:15.444961 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b698d496-7a66-460c-9f2a-82fb23e3ed69-utilities\") pod \"certified-operators-72zpq\" (UID: \"b698d496-7a66-460c-9f2a-82fb23e3ed69\") " pod="openshift-marketplace/certified-operators-72zpq" Dec 02 08:09:15 crc kubenswrapper[4895]: I1202 08:09:15.445802 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b698d496-7a66-460c-9f2a-82fb23e3ed69-utilities\") pod \"certified-operators-72zpq\" (UID: \"b698d496-7a66-460c-9f2a-82fb23e3ed69\") " pod="openshift-marketplace/certified-operators-72zpq" Dec 02 08:09:15 crc kubenswrapper[4895]: I1202 08:09:15.445816 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b698d496-7a66-460c-9f2a-82fb23e3ed69-catalog-content\") pod \"certified-operators-72zpq\" (UID: \"b698d496-7a66-460c-9f2a-82fb23e3ed69\") " pod="openshift-marketplace/certified-operators-72zpq" Dec 02 08:09:15 crc kubenswrapper[4895]: I1202 08:09:15.475043 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fc5zh\" (UniqueName: \"kubernetes.io/projected/b698d496-7a66-460c-9f2a-82fb23e3ed69-kube-api-access-fc5zh\") pod \"certified-operators-72zpq\" (UID: \"b698d496-7a66-460c-9f2a-82fb23e3ed69\") " pod="openshift-marketplace/certified-operators-72zpq" Dec 02 08:09:15 crc kubenswrapper[4895]: I1202 08:09:15.624653 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-72zpq" Dec 02 08:09:16 crc kubenswrapper[4895]: I1202 08:09:16.163954 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-72zpq"] Dec 02 08:09:16 crc kubenswrapper[4895]: I1202 08:09:16.758551 4895 generic.go:334] "Generic (PLEG): container finished" podID="b698d496-7a66-460c-9f2a-82fb23e3ed69" containerID="a1721a29e38ac09a164c8263ddbcb265df33dda61d154a4f8d804531bbb03af6" exitCode=0 Dec 02 08:09:16 crc kubenswrapper[4895]: I1202 08:09:16.758604 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-72zpq" event={"ID":"b698d496-7a66-460c-9f2a-82fb23e3ed69","Type":"ContainerDied","Data":"a1721a29e38ac09a164c8263ddbcb265df33dda61d154a4f8d804531bbb03af6"} Dec 02 08:09:16 crc kubenswrapper[4895]: I1202 08:09:16.758896 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-72zpq" event={"ID":"b698d496-7a66-460c-9f2a-82fb23e3ed69","Type":"ContainerStarted","Data":"f9ebf7ee2b70aa46cc8df1dfe2abb0968a39a4d47371a2bfdb4c7cb2a4b1893c"} Dec 02 08:09:16 crc kubenswrapper[4895]: I1202 08:09:16.762167 4895 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 02 08:09:17 crc kubenswrapper[4895]: I1202 08:09:17.768777 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-72zpq" event={"ID":"b698d496-7a66-460c-9f2a-82fb23e3ed69","Type":"ContainerStarted","Data":"dc8b7b401702b1cc0fc87c7bac00ba37a1bdfbbf36f57506ab7dc155fd999859"} Dec 02 08:09:18 crc kubenswrapper[4895]: I1202 08:09:18.778569 4895 generic.go:334] "Generic (PLEG): container finished" podID="b698d496-7a66-460c-9f2a-82fb23e3ed69" containerID="dc8b7b401702b1cc0fc87c7bac00ba37a1bdfbbf36f57506ab7dc155fd999859" exitCode=0 Dec 02 08:09:18 crc kubenswrapper[4895]: I1202 08:09:18.778706 4895 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/certified-operators-72zpq" event={"ID":"b698d496-7a66-460c-9f2a-82fb23e3ed69","Type":"ContainerDied","Data":"dc8b7b401702b1cc0fc87c7bac00ba37a1bdfbbf36f57506ab7dc155fd999859"} Dec 02 08:09:19 crc kubenswrapper[4895]: I1202 08:09:19.789526 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-72zpq" event={"ID":"b698d496-7a66-460c-9f2a-82fb23e3ed69","Type":"ContainerStarted","Data":"1e50afd271d637bebeb913e5eb3b88de397ce922b56321cd023a4fc8497b287d"} Dec 02 08:09:19 crc kubenswrapper[4895]: I1202 08:09:19.808594 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-72zpq" podStartSLOduration=2.364520714 podStartE2EDuration="4.808569515s" podCreationTimestamp="2025-12-02 08:09:15 +0000 UTC" firstStartedPulling="2025-12-02 08:09:16.761831062 +0000 UTC m=+2767.932690675" lastFinishedPulling="2025-12-02 08:09:19.205879863 +0000 UTC m=+2770.376739476" observedRunningTime="2025-12-02 08:09:19.805221271 +0000 UTC m=+2770.976080884" watchObservedRunningTime="2025-12-02 08:09:19.808569515 +0000 UTC m=+2770.979429128" Dec 02 08:09:25 crc kubenswrapper[4895]: I1202 08:09:25.625709 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-72zpq" Dec 02 08:09:25 crc kubenswrapper[4895]: I1202 08:09:25.626617 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-72zpq" Dec 02 08:09:25 crc kubenswrapper[4895]: I1202 08:09:25.666512 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-72zpq" Dec 02 08:09:25 crc kubenswrapper[4895]: I1202 08:09:25.882134 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-72zpq" Dec 02 08:09:25 crc kubenswrapper[4895]: I1202 
08:09:25.946091 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-72zpq"] Dec 02 08:09:27 crc kubenswrapper[4895]: I1202 08:09:27.846301 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-72zpq" podUID="b698d496-7a66-460c-9f2a-82fb23e3ed69" containerName="registry-server" containerID="cri-o://1e50afd271d637bebeb913e5eb3b88de397ce922b56321cd023a4fc8497b287d" gracePeriod=2 Dec 02 08:09:28 crc kubenswrapper[4895]: I1202 08:09:28.471259 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-72zpq" Dec 02 08:09:28 crc kubenswrapper[4895]: I1202 08:09:28.574505 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fc5zh\" (UniqueName: \"kubernetes.io/projected/b698d496-7a66-460c-9f2a-82fb23e3ed69-kube-api-access-fc5zh\") pod \"b698d496-7a66-460c-9f2a-82fb23e3ed69\" (UID: \"b698d496-7a66-460c-9f2a-82fb23e3ed69\") " Dec 02 08:09:28 crc kubenswrapper[4895]: I1202 08:09:28.574614 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b698d496-7a66-460c-9f2a-82fb23e3ed69-catalog-content\") pod \"b698d496-7a66-460c-9f2a-82fb23e3ed69\" (UID: \"b698d496-7a66-460c-9f2a-82fb23e3ed69\") " Dec 02 08:09:28 crc kubenswrapper[4895]: I1202 08:09:28.574712 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b698d496-7a66-460c-9f2a-82fb23e3ed69-utilities\") pod \"b698d496-7a66-460c-9f2a-82fb23e3ed69\" (UID: \"b698d496-7a66-460c-9f2a-82fb23e3ed69\") " Dec 02 08:09:28 crc kubenswrapper[4895]: I1202 08:09:28.575801 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b698d496-7a66-460c-9f2a-82fb23e3ed69-utilities" (OuterVolumeSpecName: 
"utilities") pod "b698d496-7a66-460c-9f2a-82fb23e3ed69" (UID: "b698d496-7a66-460c-9f2a-82fb23e3ed69"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:09:28 crc kubenswrapper[4895]: I1202 08:09:28.582077 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b698d496-7a66-460c-9f2a-82fb23e3ed69-kube-api-access-fc5zh" (OuterVolumeSpecName: "kube-api-access-fc5zh") pod "b698d496-7a66-460c-9f2a-82fb23e3ed69" (UID: "b698d496-7a66-460c-9f2a-82fb23e3ed69"). InnerVolumeSpecName "kube-api-access-fc5zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:09:28 crc kubenswrapper[4895]: I1202 08:09:28.628420 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b698d496-7a66-460c-9f2a-82fb23e3ed69-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b698d496-7a66-460c-9f2a-82fb23e3ed69" (UID: "b698d496-7a66-460c-9f2a-82fb23e3ed69"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:09:28 crc kubenswrapper[4895]: I1202 08:09:28.676818 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fc5zh\" (UniqueName: \"kubernetes.io/projected/b698d496-7a66-460c-9f2a-82fb23e3ed69-kube-api-access-fc5zh\") on node \"crc\" DevicePath \"\"" Dec 02 08:09:28 crc kubenswrapper[4895]: I1202 08:09:28.676931 4895 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b698d496-7a66-460c-9f2a-82fb23e3ed69-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 08:09:28 crc kubenswrapper[4895]: I1202 08:09:28.676954 4895 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b698d496-7a66-460c-9f2a-82fb23e3ed69-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 08:09:28 crc kubenswrapper[4895]: I1202 08:09:28.856329 4895 generic.go:334] "Generic (PLEG): container finished" podID="b698d496-7a66-460c-9f2a-82fb23e3ed69" containerID="1e50afd271d637bebeb913e5eb3b88de397ce922b56321cd023a4fc8497b287d" exitCode=0 Dec 02 08:09:28 crc kubenswrapper[4895]: I1202 08:09:28.856395 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-72zpq" event={"ID":"b698d496-7a66-460c-9f2a-82fb23e3ed69","Type":"ContainerDied","Data":"1e50afd271d637bebeb913e5eb3b88de397ce922b56321cd023a4fc8497b287d"} Dec 02 08:09:28 crc kubenswrapper[4895]: I1202 08:09:28.856441 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-72zpq" event={"ID":"b698d496-7a66-460c-9f2a-82fb23e3ed69","Type":"ContainerDied","Data":"f9ebf7ee2b70aa46cc8df1dfe2abb0968a39a4d47371a2bfdb4c7cb2a4b1893c"} Dec 02 08:09:28 crc kubenswrapper[4895]: I1202 08:09:28.856467 4895 scope.go:117] "RemoveContainer" containerID="1e50afd271d637bebeb913e5eb3b88de397ce922b56321cd023a4fc8497b287d" Dec 02 08:09:28 crc kubenswrapper[4895]: I1202 
08:09:28.857370 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-72zpq" Dec 02 08:09:28 crc kubenswrapper[4895]: I1202 08:09:28.876352 4895 scope.go:117] "RemoveContainer" containerID="dc8b7b401702b1cc0fc87c7bac00ba37a1bdfbbf36f57506ab7dc155fd999859" Dec 02 08:09:28 crc kubenswrapper[4895]: I1202 08:09:28.897266 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-72zpq"] Dec 02 08:09:28 crc kubenswrapper[4895]: I1202 08:09:28.903801 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-72zpq"] Dec 02 08:09:28 crc kubenswrapper[4895]: I1202 08:09:28.918135 4895 scope.go:117] "RemoveContainer" containerID="a1721a29e38ac09a164c8263ddbcb265df33dda61d154a4f8d804531bbb03af6" Dec 02 08:09:28 crc kubenswrapper[4895]: I1202 08:09:28.938530 4895 scope.go:117] "RemoveContainer" containerID="1e50afd271d637bebeb913e5eb3b88de397ce922b56321cd023a4fc8497b287d" Dec 02 08:09:28 crc kubenswrapper[4895]: E1202 08:09:28.939019 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e50afd271d637bebeb913e5eb3b88de397ce922b56321cd023a4fc8497b287d\": container with ID starting with 1e50afd271d637bebeb913e5eb3b88de397ce922b56321cd023a4fc8497b287d not found: ID does not exist" containerID="1e50afd271d637bebeb913e5eb3b88de397ce922b56321cd023a4fc8497b287d" Dec 02 08:09:28 crc kubenswrapper[4895]: I1202 08:09:28.939054 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e50afd271d637bebeb913e5eb3b88de397ce922b56321cd023a4fc8497b287d"} err="failed to get container status \"1e50afd271d637bebeb913e5eb3b88de397ce922b56321cd023a4fc8497b287d\": rpc error: code = NotFound desc = could not find container \"1e50afd271d637bebeb913e5eb3b88de397ce922b56321cd023a4fc8497b287d\": container with ID starting with 
1e50afd271d637bebeb913e5eb3b88de397ce922b56321cd023a4fc8497b287d not found: ID does not exist" Dec 02 08:09:28 crc kubenswrapper[4895]: I1202 08:09:28.939084 4895 scope.go:117] "RemoveContainer" containerID="dc8b7b401702b1cc0fc87c7bac00ba37a1bdfbbf36f57506ab7dc155fd999859" Dec 02 08:09:28 crc kubenswrapper[4895]: E1202 08:09:28.939464 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc8b7b401702b1cc0fc87c7bac00ba37a1bdfbbf36f57506ab7dc155fd999859\": container with ID starting with dc8b7b401702b1cc0fc87c7bac00ba37a1bdfbbf36f57506ab7dc155fd999859 not found: ID does not exist" containerID="dc8b7b401702b1cc0fc87c7bac00ba37a1bdfbbf36f57506ab7dc155fd999859" Dec 02 08:09:28 crc kubenswrapper[4895]: I1202 08:09:28.939550 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc8b7b401702b1cc0fc87c7bac00ba37a1bdfbbf36f57506ab7dc155fd999859"} err="failed to get container status \"dc8b7b401702b1cc0fc87c7bac00ba37a1bdfbbf36f57506ab7dc155fd999859\": rpc error: code = NotFound desc = could not find container \"dc8b7b401702b1cc0fc87c7bac00ba37a1bdfbbf36f57506ab7dc155fd999859\": container with ID starting with dc8b7b401702b1cc0fc87c7bac00ba37a1bdfbbf36f57506ab7dc155fd999859 not found: ID does not exist" Dec 02 08:09:28 crc kubenswrapper[4895]: I1202 08:09:28.939615 4895 scope.go:117] "RemoveContainer" containerID="a1721a29e38ac09a164c8263ddbcb265df33dda61d154a4f8d804531bbb03af6" Dec 02 08:09:28 crc kubenswrapper[4895]: E1202 08:09:28.940338 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1721a29e38ac09a164c8263ddbcb265df33dda61d154a4f8d804531bbb03af6\": container with ID starting with a1721a29e38ac09a164c8263ddbcb265df33dda61d154a4f8d804531bbb03af6 not found: ID does not exist" containerID="a1721a29e38ac09a164c8263ddbcb265df33dda61d154a4f8d804531bbb03af6" Dec 02 08:09:28 crc 
kubenswrapper[4895]: I1202 08:09:28.940395 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1721a29e38ac09a164c8263ddbcb265df33dda61d154a4f8d804531bbb03af6"} err="failed to get container status \"a1721a29e38ac09a164c8263ddbcb265df33dda61d154a4f8d804531bbb03af6\": rpc error: code = NotFound desc = could not find container \"a1721a29e38ac09a164c8263ddbcb265df33dda61d154a4f8d804531bbb03af6\": container with ID starting with a1721a29e38ac09a164c8263ddbcb265df33dda61d154a4f8d804531bbb03af6 not found: ID does not exist" Dec 02 08:09:29 crc kubenswrapper[4895]: I1202 08:09:29.150374 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b698d496-7a66-460c-9f2a-82fb23e3ed69" path="/var/lib/kubelet/pods/b698d496-7a66-460c-9f2a-82fb23e3ed69/volumes" Dec 02 08:11:05 crc kubenswrapper[4895]: I1202 08:11:05.474155 4895 patch_prober.go:28] interesting pod/machine-config-daemon-wfcg7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 08:11:05 crc kubenswrapper[4895]: I1202 08:11:05.474935 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 08:11:35 crc kubenswrapper[4895]: I1202 08:11:35.473764 4895 patch_prober.go:28] interesting pod/machine-config-daemon-wfcg7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 08:11:35 crc kubenswrapper[4895]: I1202 08:11:35.474394 4895 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 08:12:05 crc kubenswrapper[4895]: I1202 08:12:05.473851 4895 patch_prober.go:28] interesting pod/machine-config-daemon-wfcg7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 08:12:05 crc kubenswrapper[4895]: I1202 08:12:05.474694 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 08:12:05 crc kubenswrapper[4895]: I1202 08:12:05.474827 4895 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" Dec 02 08:12:05 crc kubenswrapper[4895]: I1202 08:12:05.475794 4895 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"facd37752598c06651c649139fe0757b632d9e66966e92bbd8b7b13be5f8e977"} pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 08:12:05 crc kubenswrapper[4895]: I1202 08:12:05.475907 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" 
containerName="machine-config-daemon" containerID="cri-o://facd37752598c06651c649139fe0757b632d9e66966e92bbd8b7b13be5f8e977" gracePeriod=600 Dec 02 08:12:05 crc kubenswrapper[4895]: E1202 08:12:05.591622 4895 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0468d2d1_a975_45a6_af9f_47adc432fab0.slice/crio-facd37752598c06651c649139fe0757b632d9e66966e92bbd8b7b13be5f8e977.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0468d2d1_a975_45a6_af9f_47adc432fab0.slice/crio-conmon-facd37752598c06651c649139fe0757b632d9e66966e92bbd8b7b13be5f8e977.scope\": RecentStats: unable to find data in memory cache]" Dec 02 08:12:05 crc kubenswrapper[4895]: E1202 08:12:05.598906 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 08:12:06 crc kubenswrapper[4895]: I1202 08:12:06.112993 4895 generic.go:334] "Generic (PLEG): container finished" podID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerID="facd37752598c06651c649139fe0757b632d9e66966e92bbd8b7b13be5f8e977" exitCode=0 Dec 02 08:12:06 crc kubenswrapper[4895]: I1202 08:12:06.113065 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" event={"ID":"0468d2d1-a975-45a6-af9f-47adc432fab0","Type":"ContainerDied","Data":"facd37752598c06651c649139fe0757b632d9e66966e92bbd8b7b13be5f8e977"} Dec 02 08:12:06 crc kubenswrapper[4895]: I1202 08:12:06.113111 4895 scope.go:117] "RemoveContainer" 
containerID="52a93dd2b6f73ad029251d44c93237d97d9df4ee4ea0c15ada2a5ea88c35966c" Dec 02 08:12:06 crc kubenswrapper[4895]: I1202 08:12:06.113909 4895 scope.go:117] "RemoveContainer" containerID="facd37752598c06651c649139fe0757b632d9e66966e92bbd8b7b13be5f8e977" Dec 02 08:12:06 crc kubenswrapper[4895]: E1202 08:12:06.114199 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 08:12:17 crc kubenswrapper[4895]: I1202 08:12:17.141255 4895 scope.go:117] "RemoveContainer" containerID="facd37752598c06651c649139fe0757b632d9e66966e92bbd8b7b13be5f8e977" Dec 02 08:12:17 crc kubenswrapper[4895]: E1202 08:12:17.142170 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 08:12:30 crc kubenswrapper[4895]: I1202 08:12:30.140992 4895 scope.go:117] "RemoveContainer" containerID="facd37752598c06651c649139fe0757b632d9e66966e92bbd8b7b13be5f8e977" Dec 02 08:12:30 crc kubenswrapper[4895]: E1202 08:12:30.141764 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 08:12:42 crc kubenswrapper[4895]: I1202 08:12:42.141998 4895 scope.go:117] "RemoveContainer" containerID="facd37752598c06651c649139fe0757b632d9e66966e92bbd8b7b13be5f8e977" Dec 02 08:12:42 crc kubenswrapper[4895]: E1202 08:12:42.142997 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 08:12:56 crc kubenswrapper[4895]: I1202 08:12:56.141196 4895 scope.go:117] "RemoveContainer" containerID="facd37752598c06651c649139fe0757b632d9e66966e92bbd8b7b13be5f8e977" Dec 02 08:12:56 crc kubenswrapper[4895]: E1202 08:12:56.142131 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 08:13:09 crc kubenswrapper[4895]: I1202 08:13:09.147409 4895 scope.go:117] "RemoveContainer" containerID="facd37752598c06651c649139fe0757b632d9e66966e92bbd8b7b13be5f8e977" Dec 02 08:13:09 crc kubenswrapper[4895]: E1202 08:13:09.148452 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 08:13:22 crc kubenswrapper[4895]: I1202 08:13:22.141353 4895 scope.go:117] "RemoveContainer" containerID="facd37752598c06651c649139fe0757b632d9e66966e92bbd8b7b13be5f8e977" Dec 02 08:13:22 crc kubenswrapper[4895]: E1202 08:13:22.142127 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 08:13:33 crc kubenswrapper[4895]: I1202 08:13:33.141559 4895 scope.go:117] "RemoveContainer" containerID="facd37752598c06651c649139fe0757b632d9e66966e92bbd8b7b13be5f8e977" Dec 02 08:13:33 crc kubenswrapper[4895]: E1202 08:13:33.142235 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 08:13:46 crc kubenswrapper[4895]: I1202 08:13:46.141285 4895 scope.go:117] "RemoveContainer" containerID="facd37752598c06651c649139fe0757b632d9e66966e92bbd8b7b13be5f8e977" Dec 02 08:13:46 crc kubenswrapper[4895]: E1202 08:13:46.142417 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 08:13:58 crc kubenswrapper[4895]: I1202 08:13:58.140641 4895 scope.go:117] "RemoveContainer" containerID="facd37752598c06651c649139fe0757b632d9e66966e92bbd8b7b13be5f8e977" Dec 02 08:13:58 crc kubenswrapper[4895]: E1202 08:13:58.141394 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 08:14:12 crc kubenswrapper[4895]: I1202 08:14:12.141581 4895 scope.go:117] "RemoveContainer" containerID="facd37752598c06651c649139fe0757b632d9e66966e92bbd8b7b13be5f8e977" Dec 02 08:14:12 crc kubenswrapper[4895]: E1202 08:14:12.142668 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 08:14:26 crc kubenswrapper[4895]: I1202 08:14:26.141800 4895 scope.go:117] "RemoveContainer" containerID="facd37752598c06651c649139fe0757b632d9e66966e92bbd8b7b13be5f8e977" Dec 02 08:14:26 crc kubenswrapper[4895]: E1202 08:14:26.142558 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 08:14:39 crc kubenswrapper[4895]: I1202 08:14:39.144970 4895 scope.go:117] "RemoveContainer" containerID="facd37752598c06651c649139fe0757b632d9e66966e92bbd8b7b13be5f8e977" Dec 02 08:14:39 crc kubenswrapper[4895]: E1202 08:14:39.145877 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 08:14:54 crc kubenswrapper[4895]: I1202 08:14:54.141352 4895 scope.go:117] "RemoveContainer" containerID="facd37752598c06651c649139fe0757b632d9e66966e92bbd8b7b13be5f8e977" Dec 02 08:14:54 crc kubenswrapper[4895]: E1202 08:14:54.142176 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 08:15:00 crc kubenswrapper[4895]: I1202 08:15:00.156670 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411055-sf524"] Dec 02 08:15:00 crc kubenswrapper[4895]: E1202 08:15:00.157591 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b698d496-7a66-460c-9f2a-82fb23e3ed69" 
containerName="extract-content" Dec 02 08:15:00 crc kubenswrapper[4895]: I1202 08:15:00.157606 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="b698d496-7a66-460c-9f2a-82fb23e3ed69" containerName="extract-content" Dec 02 08:15:00 crc kubenswrapper[4895]: E1202 08:15:00.157635 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b698d496-7a66-460c-9f2a-82fb23e3ed69" containerName="registry-server" Dec 02 08:15:00 crc kubenswrapper[4895]: I1202 08:15:00.157641 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="b698d496-7a66-460c-9f2a-82fb23e3ed69" containerName="registry-server" Dec 02 08:15:00 crc kubenswrapper[4895]: E1202 08:15:00.157654 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b698d496-7a66-460c-9f2a-82fb23e3ed69" containerName="extract-utilities" Dec 02 08:15:00 crc kubenswrapper[4895]: I1202 08:15:00.157663 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="b698d496-7a66-460c-9f2a-82fb23e3ed69" containerName="extract-utilities" Dec 02 08:15:00 crc kubenswrapper[4895]: I1202 08:15:00.157855 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="b698d496-7a66-460c-9f2a-82fb23e3ed69" containerName="registry-server" Dec 02 08:15:00 crc kubenswrapper[4895]: I1202 08:15:00.158514 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411055-sf524" Dec 02 08:15:00 crc kubenswrapper[4895]: I1202 08:15:00.162096 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 02 08:15:00 crc kubenswrapper[4895]: I1202 08:15:00.162306 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 02 08:15:00 crc kubenswrapper[4895]: I1202 08:15:00.170635 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411055-sf524"] Dec 02 08:15:00 crc kubenswrapper[4895]: I1202 08:15:00.230377 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dtwpb\" (UniqueName: \"kubernetes.io/projected/7d428c26-837d-4f01-ba20-08bee0b90363-kube-api-access-dtwpb\") pod \"collect-profiles-29411055-sf524\" (UID: \"7d428c26-837d-4f01-ba20-08bee0b90363\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411055-sf524" Dec 02 08:15:00 crc kubenswrapper[4895]: I1202 08:15:00.230456 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7d428c26-837d-4f01-ba20-08bee0b90363-config-volume\") pod \"collect-profiles-29411055-sf524\" (UID: \"7d428c26-837d-4f01-ba20-08bee0b90363\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411055-sf524" Dec 02 08:15:00 crc kubenswrapper[4895]: I1202 08:15:00.230573 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7d428c26-837d-4f01-ba20-08bee0b90363-secret-volume\") pod \"collect-profiles-29411055-sf524\" (UID: \"7d428c26-837d-4f01-ba20-08bee0b90363\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29411055-sf524" Dec 02 08:15:00 crc kubenswrapper[4895]: I1202 08:15:00.331533 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7d428c26-837d-4f01-ba20-08bee0b90363-secret-volume\") pod \"collect-profiles-29411055-sf524\" (UID: \"7d428c26-837d-4f01-ba20-08bee0b90363\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411055-sf524" Dec 02 08:15:00 crc kubenswrapper[4895]: I1202 08:15:00.331664 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dtwpb\" (UniqueName: \"kubernetes.io/projected/7d428c26-837d-4f01-ba20-08bee0b90363-kube-api-access-dtwpb\") pod \"collect-profiles-29411055-sf524\" (UID: \"7d428c26-837d-4f01-ba20-08bee0b90363\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411055-sf524" Dec 02 08:15:00 crc kubenswrapper[4895]: I1202 08:15:00.331701 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7d428c26-837d-4f01-ba20-08bee0b90363-config-volume\") pod \"collect-profiles-29411055-sf524\" (UID: \"7d428c26-837d-4f01-ba20-08bee0b90363\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411055-sf524" Dec 02 08:15:00 crc kubenswrapper[4895]: I1202 08:15:00.332836 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7d428c26-837d-4f01-ba20-08bee0b90363-config-volume\") pod \"collect-profiles-29411055-sf524\" (UID: \"7d428c26-837d-4f01-ba20-08bee0b90363\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411055-sf524" Dec 02 08:15:00 crc kubenswrapper[4895]: I1202 08:15:00.352321 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/7d428c26-837d-4f01-ba20-08bee0b90363-secret-volume\") pod \"collect-profiles-29411055-sf524\" (UID: \"7d428c26-837d-4f01-ba20-08bee0b90363\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411055-sf524" Dec 02 08:15:00 crc kubenswrapper[4895]: I1202 08:15:00.352331 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dtwpb\" (UniqueName: \"kubernetes.io/projected/7d428c26-837d-4f01-ba20-08bee0b90363-kube-api-access-dtwpb\") pod \"collect-profiles-29411055-sf524\" (UID: \"7d428c26-837d-4f01-ba20-08bee0b90363\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411055-sf524" Dec 02 08:15:00 crc kubenswrapper[4895]: I1202 08:15:00.499260 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411055-sf524" Dec 02 08:15:00 crc kubenswrapper[4895]: I1202 08:15:00.728833 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411055-sf524"] Dec 02 08:15:01 crc kubenswrapper[4895]: I1202 08:15:01.489256 4895 generic.go:334] "Generic (PLEG): container finished" podID="7d428c26-837d-4f01-ba20-08bee0b90363" containerID="d300717031a0462a0984129bb75e3958ec0bb4646df2606fb9028a0d706051b5" exitCode=0 Dec 02 08:15:01 crc kubenswrapper[4895]: I1202 08:15:01.489304 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411055-sf524" event={"ID":"7d428c26-837d-4f01-ba20-08bee0b90363","Type":"ContainerDied","Data":"d300717031a0462a0984129bb75e3958ec0bb4646df2606fb9028a0d706051b5"} Dec 02 08:15:01 crc kubenswrapper[4895]: I1202 08:15:01.489358 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411055-sf524" 
event={"ID":"7d428c26-837d-4f01-ba20-08bee0b90363","Type":"ContainerStarted","Data":"556be50c0aa011a09b98bdb118e98f5bc42282474080d5e5e1646d0533c6b065"} Dec 02 08:15:02 crc kubenswrapper[4895]: I1202 08:15:02.768669 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411055-sf524" Dec 02 08:15:02 crc kubenswrapper[4895]: I1202 08:15:02.970471 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7d428c26-837d-4f01-ba20-08bee0b90363-secret-volume\") pod \"7d428c26-837d-4f01-ba20-08bee0b90363\" (UID: \"7d428c26-837d-4f01-ba20-08bee0b90363\") " Dec 02 08:15:02 crc kubenswrapper[4895]: I1202 08:15:02.970577 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7d428c26-837d-4f01-ba20-08bee0b90363-config-volume\") pod \"7d428c26-837d-4f01-ba20-08bee0b90363\" (UID: \"7d428c26-837d-4f01-ba20-08bee0b90363\") " Dec 02 08:15:02 crc kubenswrapper[4895]: I1202 08:15:02.970627 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dtwpb\" (UniqueName: \"kubernetes.io/projected/7d428c26-837d-4f01-ba20-08bee0b90363-kube-api-access-dtwpb\") pod \"7d428c26-837d-4f01-ba20-08bee0b90363\" (UID: \"7d428c26-837d-4f01-ba20-08bee0b90363\") " Dec 02 08:15:02 crc kubenswrapper[4895]: I1202 08:15:02.971635 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d428c26-837d-4f01-ba20-08bee0b90363-config-volume" (OuterVolumeSpecName: "config-volume") pod "7d428c26-837d-4f01-ba20-08bee0b90363" (UID: "7d428c26-837d-4f01-ba20-08bee0b90363"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:15:02 crc kubenswrapper[4895]: I1202 08:15:02.971877 4895 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7d428c26-837d-4f01-ba20-08bee0b90363-config-volume\") on node \"crc\" DevicePath \"\"" Dec 02 08:15:02 crc kubenswrapper[4895]: I1202 08:15:02.976604 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d428c26-837d-4f01-ba20-08bee0b90363-kube-api-access-dtwpb" (OuterVolumeSpecName: "kube-api-access-dtwpb") pod "7d428c26-837d-4f01-ba20-08bee0b90363" (UID: "7d428c26-837d-4f01-ba20-08bee0b90363"). InnerVolumeSpecName "kube-api-access-dtwpb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:15:02 crc kubenswrapper[4895]: I1202 08:15:02.980481 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d428c26-837d-4f01-ba20-08bee0b90363-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "7d428c26-837d-4f01-ba20-08bee0b90363" (UID: "7d428c26-837d-4f01-ba20-08bee0b90363"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:15:03 crc kubenswrapper[4895]: I1202 08:15:03.072416 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dtwpb\" (UniqueName: \"kubernetes.io/projected/7d428c26-837d-4f01-ba20-08bee0b90363-kube-api-access-dtwpb\") on node \"crc\" DevicePath \"\"" Dec 02 08:15:03 crc kubenswrapper[4895]: I1202 08:15:03.072486 4895 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7d428c26-837d-4f01-ba20-08bee0b90363-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 02 08:15:03 crc kubenswrapper[4895]: I1202 08:15:03.504055 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411055-sf524" event={"ID":"7d428c26-837d-4f01-ba20-08bee0b90363","Type":"ContainerDied","Data":"556be50c0aa011a09b98bdb118e98f5bc42282474080d5e5e1646d0533c6b065"} Dec 02 08:15:03 crc kubenswrapper[4895]: I1202 08:15:03.504319 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="556be50c0aa011a09b98bdb118e98f5bc42282474080d5e5e1646d0533c6b065" Dec 02 08:15:03 crc kubenswrapper[4895]: I1202 08:15:03.504109 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411055-sf524" Dec 02 08:15:03 crc kubenswrapper[4895]: I1202 08:15:03.838980 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411010-cjlm7"] Dec 02 08:15:03 crc kubenswrapper[4895]: I1202 08:15:03.844705 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411010-cjlm7"] Dec 02 08:15:05 crc kubenswrapper[4895]: I1202 08:15:05.150278 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a15c2bd-78d9-4178-95b8-170663887f4b" path="/var/lib/kubelet/pods/1a15c2bd-78d9-4178-95b8-170663887f4b/volumes" Dec 02 08:15:06 crc kubenswrapper[4895]: I1202 08:15:06.141136 4895 scope.go:117] "RemoveContainer" containerID="facd37752598c06651c649139fe0757b632d9e66966e92bbd8b7b13be5f8e977" Dec 02 08:15:06 crc kubenswrapper[4895]: E1202 08:15:06.141671 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 08:15:17 crc kubenswrapper[4895]: I1202 08:15:17.141615 4895 scope.go:117] "RemoveContainer" containerID="facd37752598c06651c649139fe0757b632d9e66966e92bbd8b7b13be5f8e977" Dec 02 08:15:17 crc kubenswrapper[4895]: E1202 08:15:17.142504 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 08:15:28 crc kubenswrapper[4895]: I1202 08:15:28.101862 4895 scope.go:117] "RemoveContainer" containerID="1a8b2be9a073f50336930908d5803e5e4de3461802484b0226855dd3218dec08" Dec 02 08:15:28 crc kubenswrapper[4895]: I1202 08:15:28.141641 4895 scope.go:117] "RemoveContainer" containerID="facd37752598c06651c649139fe0757b632d9e66966e92bbd8b7b13be5f8e977" Dec 02 08:15:28 crc kubenswrapper[4895]: E1202 08:15:28.141944 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 08:15:39 crc kubenswrapper[4895]: I1202 08:15:39.147455 4895 scope.go:117] "RemoveContainer" containerID="facd37752598c06651c649139fe0757b632d9e66966e92bbd8b7b13be5f8e977" Dec 02 08:15:39 crc kubenswrapper[4895]: E1202 08:15:39.148701 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 08:15:54 crc kubenswrapper[4895]: I1202 08:15:54.141616 4895 scope.go:117] "RemoveContainer" containerID="facd37752598c06651c649139fe0757b632d9e66966e92bbd8b7b13be5f8e977" Dec 02 08:15:54 crc kubenswrapper[4895]: E1202 08:15:54.144241 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 08:16:05 crc kubenswrapper[4895]: I1202 08:16:05.141770 4895 scope.go:117] "RemoveContainer" containerID="facd37752598c06651c649139fe0757b632d9e66966e92bbd8b7b13be5f8e977" Dec 02 08:16:05 crc kubenswrapper[4895]: E1202 08:16:05.142814 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 08:16:17 crc kubenswrapper[4895]: I1202 08:16:17.142122 4895 scope.go:117] "RemoveContainer" containerID="facd37752598c06651c649139fe0757b632d9e66966e92bbd8b7b13be5f8e977" Dec 02 08:16:17 crc kubenswrapper[4895]: E1202 08:16:17.142999 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 08:16:29 crc kubenswrapper[4895]: I1202 08:16:29.153812 4895 scope.go:117] "RemoveContainer" containerID="facd37752598c06651c649139fe0757b632d9e66966e92bbd8b7b13be5f8e977" Dec 02 08:16:29 crc kubenswrapper[4895]: E1202 08:16:29.155361 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 08:16:32 crc kubenswrapper[4895]: I1202 08:16:32.568033 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-69k6c"] Dec 02 08:16:32 crc kubenswrapper[4895]: E1202 08:16:32.570355 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d428c26-837d-4f01-ba20-08bee0b90363" containerName="collect-profiles" Dec 02 08:16:32 crc kubenswrapper[4895]: I1202 08:16:32.570413 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d428c26-837d-4f01-ba20-08bee0b90363" containerName="collect-profiles" Dec 02 08:16:32 crc kubenswrapper[4895]: I1202 08:16:32.570697 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d428c26-837d-4f01-ba20-08bee0b90363" containerName="collect-profiles" Dec 02 08:16:32 crc kubenswrapper[4895]: I1202 08:16:32.575655 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-69k6c" Dec 02 08:16:32 crc kubenswrapper[4895]: I1202 08:16:32.583657 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-69k6c"] Dec 02 08:16:32 crc kubenswrapper[4895]: I1202 08:16:32.649752 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6dbbeb5d-27e5-43ad-84bd-33e9970246b2-catalog-content\") pod \"community-operators-69k6c\" (UID: \"6dbbeb5d-27e5-43ad-84bd-33e9970246b2\") " pod="openshift-marketplace/community-operators-69k6c" Dec 02 08:16:32 crc kubenswrapper[4895]: I1202 08:16:32.650308 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6dbbeb5d-27e5-43ad-84bd-33e9970246b2-utilities\") pod \"community-operators-69k6c\" (UID: \"6dbbeb5d-27e5-43ad-84bd-33e9970246b2\") " pod="openshift-marketplace/community-operators-69k6c" Dec 02 08:16:32 crc kubenswrapper[4895]: I1202 08:16:32.650415 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rp9sx\" (UniqueName: \"kubernetes.io/projected/6dbbeb5d-27e5-43ad-84bd-33e9970246b2-kube-api-access-rp9sx\") pod \"community-operators-69k6c\" (UID: \"6dbbeb5d-27e5-43ad-84bd-33e9970246b2\") " pod="openshift-marketplace/community-operators-69k6c" Dec 02 08:16:32 crc kubenswrapper[4895]: I1202 08:16:32.751965 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rp9sx\" (UniqueName: \"kubernetes.io/projected/6dbbeb5d-27e5-43ad-84bd-33e9970246b2-kube-api-access-rp9sx\") pod \"community-operators-69k6c\" (UID: \"6dbbeb5d-27e5-43ad-84bd-33e9970246b2\") " pod="openshift-marketplace/community-operators-69k6c" Dec 02 08:16:32 crc kubenswrapper[4895]: I1202 08:16:32.752032 4895 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6dbbeb5d-27e5-43ad-84bd-33e9970246b2-catalog-content\") pod \"community-operators-69k6c\" (UID: \"6dbbeb5d-27e5-43ad-84bd-33e9970246b2\") " pod="openshift-marketplace/community-operators-69k6c" Dec 02 08:16:32 crc kubenswrapper[4895]: I1202 08:16:32.752051 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6dbbeb5d-27e5-43ad-84bd-33e9970246b2-utilities\") pod \"community-operators-69k6c\" (UID: \"6dbbeb5d-27e5-43ad-84bd-33e9970246b2\") " pod="openshift-marketplace/community-operators-69k6c" Dec 02 08:16:32 crc kubenswrapper[4895]: I1202 08:16:32.752648 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6dbbeb5d-27e5-43ad-84bd-33e9970246b2-utilities\") pod \"community-operators-69k6c\" (UID: \"6dbbeb5d-27e5-43ad-84bd-33e9970246b2\") " pod="openshift-marketplace/community-operators-69k6c" Dec 02 08:16:32 crc kubenswrapper[4895]: I1202 08:16:32.752678 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6dbbeb5d-27e5-43ad-84bd-33e9970246b2-catalog-content\") pod \"community-operators-69k6c\" (UID: \"6dbbeb5d-27e5-43ad-84bd-33e9970246b2\") " pod="openshift-marketplace/community-operators-69k6c" Dec 02 08:16:32 crc kubenswrapper[4895]: I1202 08:16:32.774587 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rp9sx\" (UniqueName: \"kubernetes.io/projected/6dbbeb5d-27e5-43ad-84bd-33e9970246b2-kube-api-access-rp9sx\") pod \"community-operators-69k6c\" (UID: \"6dbbeb5d-27e5-43ad-84bd-33e9970246b2\") " pod="openshift-marketplace/community-operators-69k6c" Dec 02 08:16:32 crc kubenswrapper[4895]: I1202 08:16:32.956125 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-69k6c" Dec 02 08:16:34 crc kubenswrapper[4895]: I1202 08:16:34.182500 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-69k6c"] Dec 02 08:16:34 crc kubenswrapper[4895]: I1202 08:16:34.231551 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-69k6c" event={"ID":"6dbbeb5d-27e5-43ad-84bd-33e9970246b2","Type":"ContainerStarted","Data":"b8b88035ea441cdae5ebf125fccf12164cb340453740dcc64b2ea11fd5ab2c92"} Dec 02 08:16:35 crc kubenswrapper[4895]: I1202 08:16:35.240833 4895 generic.go:334] "Generic (PLEG): container finished" podID="6dbbeb5d-27e5-43ad-84bd-33e9970246b2" containerID="d7014334b74327694019ab6b82006622ee7d9dd323a6e6983388281797649e35" exitCode=0 Dec 02 08:16:35 crc kubenswrapper[4895]: I1202 08:16:35.240898 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-69k6c" event={"ID":"6dbbeb5d-27e5-43ad-84bd-33e9970246b2","Type":"ContainerDied","Data":"d7014334b74327694019ab6b82006622ee7d9dd323a6e6983388281797649e35"} Dec 02 08:16:35 crc kubenswrapper[4895]: I1202 08:16:35.244127 4895 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 02 08:16:36 crc kubenswrapper[4895]: I1202 08:16:36.250493 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-69k6c" event={"ID":"6dbbeb5d-27e5-43ad-84bd-33e9970246b2","Type":"ContainerStarted","Data":"e5561bbd0aa74ba34829b9a9f60e3933719c505a446cc2eab0d1e6aa756e16a0"} Dec 02 08:16:36 crc kubenswrapper[4895]: I1202 08:16:36.362683 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-mjv5m"] Dec 02 08:16:36 crc kubenswrapper[4895]: I1202 08:16:36.364693 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mjv5m" Dec 02 08:16:36 crc kubenswrapper[4895]: I1202 08:16:36.376684 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mjv5m"] Dec 02 08:16:36 crc kubenswrapper[4895]: I1202 08:16:36.521658 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/344be053-873e-4d70-931a-2bf419ea2d20-utilities\") pod \"redhat-marketplace-mjv5m\" (UID: \"344be053-873e-4d70-931a-2bf419ea2d20\") " pod="openshift-marketplace/redhat-marketplace-mjv5m" Dec 02 08:16:36 crc kubenswrapper[4895]: I1202 08:16:36.521735 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2r2n9\" (UniqueName: \"kubernetes.io/projected/344be053-873e-4d70-931a-2bf419ea2d20-kube-api-access-2r2n9\") pod \"redhat-marketplace-mjv5m\" (UID: \"344be053-873e-4d70-931a-2bf419ea2d20\") " pod="openshift-marketplace/redhat-marketplace-mjv5m" Dec 02 08:16:36 crc kubenswrapper[4895]: I1202 08:16:36.522013 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/344be053-873e-4d70-931a-2bf419ea2d20-catalog-content\") pod \"redhat-marketplace-mjv5m\" (UID: \"344be053-873e-4d70-931a-2bf419ea2d20\") " pod="openshift-marketplace/redhat-marketplace-mjv5m" Dec 02 08:16:36 crc kubenswrapper[4895]: I1202 08:16:36.623893 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2r2n9\" (UniqueName: \"kubernetes.io/projected/344be053-873e-4d70-931a-2bf419ea2d20-kube-api-access-2r2n9\") pod \"redhat-marketplace-mjv5m\" (UID: \"344be053-873e-4d70-931a-2bf419ea2d20\") " pod="openshift-marketplace/redhat-marketplace-mjv5m" Dec 02 08:16:36 crc kubenswrapper[4895]: I1202 08:16:36.624020 4895 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/344be053-873e-4d70-931a-2bf419ea2d20-catalog-content\") pod \"redhat-marketplace-mjv5m\" (UID: \"344be053-873e-4d70-931a-2bf419ea2d20\") " pod="openshift-marketplace/redhat-marketplace-mjv5m" Dec 02 08:16:36 crc kubenswrapper[4895]: I1202 08:16:36.624068 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/344be053-873e-4d70-931a-2bf419ea2d20-utilities\") pod \"redhat-marketplace-mjv5m\" (UID: \"344be053-873e-4d70-931a-2bf419ea2d20\") " pod="openshift-marketplace/redhat-marketplace-mjv5m" Dec 02 08:16:36 crc kubenswrapper[4895]: I1202 08:16:36.624760 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/344be053-873e-4d70-931a-2bf419ea2d20-utilities\") pod \"redhat-marketplace-mjv5m\" (UID: \"344be053-873e-4d70-931a-2bf419ea2d20\") " pod="openshift-marketplace/redhat-marketplace-mjv5m" Dec 02 08:16:36 crc kubenswrapper[4895]: I1202 08:16:36.626050 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/344be053-873e-4d70-931a-2bf419ea2d20-catalog-content\") pod \"redhat-marketplace-mjv5m\" (UID: \"344be053-873e-4d70-931a-2bf419ea2d20\") " pod="openshift-marketplace/redhat-marketplace-mjv5m" Dec 02 08:16:36 crc kubenswrapper[4895]: I1202 08:16:36.650160 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2r2n9\" (UniqueName: \"kubernetes.io/projected/344be053-873e-4d70-931a-2bf419ea2d20-kube-api-access-2r2n9\") pod \"redhat-marketplace-mjv5m\" (UID: \"344be053-873e-4d70-931a-2bf419ea2d20\") " pod="openshift-marketplace/redhat-marketplace-mjv5m" Dec 02 08:16:36 crc kubenswrapper[4895]: I1202 08:16:36.686274 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mjv5m" Dec 02 08:16:37 crc kubenswrapper[4895]: I1202 08:16:37.138578 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mjv5m"] Dec 02 08:16:37 crc kubenswrapper[4895]: I1202 08:16:37.261109 4895 generic.go:334] "Generic (PLEG): container finished" podID="6dbbeb5d-27e5-43ad-84bd-33e9970246b2" containerID="e5561bbd0aa74ba34829b9a9f60e3933719c505a446cc2eab0d1e6aa756e16a0" exitCode=0 Dec 02 08:16:37 crc kubenswrapper[4895]: I1202 08:16:37.261235 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-69k6c" event={"ID":"6dbbeb5d-27e5-43ad-84bd-33e9970246b2","Type":"ContainerDied","Data":"e5561bbd0aa74ba34829b9a9f60e3933719c505a446cc2eab0d1e6aa756e16a0"} Dec 02 08:16:37 crc kubenswrapper[4895]: I1202 08:16:37.262459 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mjv5m" event={"ID":"344be053-873e-4d70-931a-2bf419ea2d20","Type":"ContainerStarted","Data":"c34ed1cc1062ba50f11bec404ecf9a90fea647afecedc097572fd4dccb3b0c49"} Dec 02 08:16:37 crc kubenswrapper[4895]: I1202 08:16:37.365936 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-hlf8n"] Dec 02 08:16:37 crc kubenswrapper[4895]: I1202 08:16:37.367606 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-hlf8n" Dec 02 08:16:37 crc kubenswrapper[4895]: I1202 08:16:37.373761 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hlf8n"] Dec 02 08:16:37 crc kubenswrapper[4895]: I1202 08:16:37.443130 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f74a79fb-9449-4094-a428-a2582277a38e-utilities\") pod \"redhat-operators-hlf8n\" (UID: \"f74a79fb-9449-4094-a428-a2582277a38e\") " pod="openshift-marketplace/redhat-operators-hlf8n" Dec 02 08:16:37 crc kubenswrapper[4895]: I1202 08:16:37.443186 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f74a79fb-9449-4094-a428-a2582277a38e-catalog-content\") pod \"redhat-operators-hlf8n\" (UID: \"f74a79fb-9449-4094-a428-a2582277a38e\") " pod="openshift-marketplace/redhat-operators-hlf8n" Dec 02 08:16:37 crc kubenswrapper[4895]: I1202 08:16:37.443534 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jqvm\" (UniqueName: \"kubernetes.io/projected/f74a79fb-9449-4094-a428-a2582277a38e-kube-api-access-6jqvm\") pod \"redhat-operators-hlf8n\" (UID: \"f74a79fb-9449-4094-a428-a2582277a38e\") " pod="openshift-marketplace/redhat-operators-hlf8n" Dec 02 08:16:37 crc kubenswrapper[4895]: I1202 08:16:37.545047 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jqvm\" (UniqueName: \"kubernetes.io/projected/f74a79fb-9449-4094-a428-a2582277a38e-kube-api-access-6jqvm\") pod \"redhat-operators-hlf8n\" (UID: \"f74a79fb-9449-4094-a428-a2582277a38e\") " pod="openshift-marketplace/redhat-operators-hlf8n" Dec 02 08:16:37 crc kubenswrapper[4895]: I1202 08:16:37.545332 4895 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f74a79fb-9449-4094-a428-a2582277a38e-utilities\") pod \"redhat-operators-hlf8n\" (UID: \"f74a79fb-9449-4094-a428-a2582277a38e\") " pod="openshift-marketplace/redhat-operators-hlf8n" Dec 02 08:16:37 crc kubenswrapper[4895]: I1202 08:16:37.545363 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f74a79fb-9449-4094-a428-a2582277a38e-catalog-content\") pod \"redhat-operators-hlf8n\" (UID: \"f74a79fb-9449-4094-a428-a2582277a38e\") " pod="openshift-marketplace/redhat-operators-hlf8n" Dec 02 08:16:37 crc kubenswrapper[4895]: I1202 08:16:37.545823 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f74a79fb-9449-4094-a428-a2582277a38e-catalog-content\") pod \"redhat-operators-hlf8n\" (UID: \"f74a79fb-9449-4094-a428-a2582277a38e\") " pod="openshift-marketplace/redhat-operators-hlf8n" Dec 02 08:16:37 crc kubenswrapper[4895]: I1202 08:16:37.545866 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f74a79fb-9449-4094-a428-a2582277a38e-utilities\") pod \"redhat-operators-hlf8n\" (UID: \"f74a79fb-9449-4094-a428-a2582277a38e\") " pod="openshift-marketplace/redhat-operators-hlf8n" Dec 02 08:16:37 crc kubenswrapper[4895]: I1202 08:16:37.563868 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jqvm\" (UniqueName: \"kubernetes.io/projected/f74a79fb-9449-4094-a428-a2582277a38e-kube-api-access-6jqvm\") pod \"redhat-operators-hlf8n\" (UID: \"f74a79fb-9449-4094-a428-a2582277a38e\") " pod="openshift-marketplace/redhat-operators-hlf8n" Dec 02 08:16:37 crc kubenswrapper[4895]: I1202 08:16:37.734102 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-hlf8n" Dec 02 08:16:38 crc kubenswrapper[4895]: I1202 08:16:38.247367 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hlf8n"] Dec 02 08:16:38 crc kubenswrapper[4895]: I1202 08:16:38.276470 4895 generic.go:334] "Generic (PLEG): container finished" podID="344be053-873e-4d70-931a-2bf419ea2d20" containerID="c7b6065471d40cecea70b5a83b72a3fd5a079dac5cc5e08eedecc50fa3413018" exitCode=0 Dec 02 08:16:38 crc kubenswrapper[4895]: I1202 08:16:38.276556 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mjv5m" event={"ID":"344be053-873e-4d70-931a-2bf419ea2d20","Type":"ContainerDied","Data":"c7b6065471d40cecea70b5a83b72a3fd5a079dac5cc5e08eedecc50fa3413018"} Dec 02 08:16:38 crc kubenswrapper[4895]: I1202 08:16:38.280141 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-69k6c" event={"ID":"6dbbeb5d-27e5-43ad-84bd-33e9970246b2","Type":"ContainerStarted","Data":"c03e01d70a48c06d1ba31d943ee725b6d13963ddd35a1421447845b4d95d7805"} Dec 02 08:16:38 crc kubenswrapper[4895]: I1202 08:16:38.288516 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hlf8n" event={"ID":"f74a79fb-9449-4094-a428-a2582277a38e","Type":"ContainerStarted","Data":"276dae0f173ec0f990ad03ea2fb295ccaa298f3825db9a954861a944e8c88cd1"} Dec 02 08:16:39 crc kubenswrapper[4895]: I1202 08:16:39.170676 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-69k6c" podStartSLOduration=4.448411586 podStartE2EDuration="7.17065245s" podCreationTimestamp="2025-12-02 08:16:32 +0000 UTC" firstStartedPulling="2025-12-02 08:16:35.243859988 +0000 UTC m=+3206.414719601" lastFinishedPulling="2025-12-02 08:16:37.966100852 +0000 UTC m=+3209.136960465" observedRunningTime="2025-12-02 08:16:38.331645939 +0000 UTC 
m=+3209.502505542" watchObservedRunningTime="2025-12-02 08:16:39.17065245 +0000 UTC m=+3210.341512063" Dec 02 08:16:39 crc kubenswrapper[4895]: I1202 08:16:39.297415 4895 generic.go:334] "Generic (PLEG): container finished" podID="f74a79fb-9449-4094-a428-a2582277a38e" containerID="7c48d964d247ed01144865410bf306603368a86b9796c47b80e8a90ef3771bff" exitCode=0 Dec 02 08:16:39 crc kubenswrapper[4895]: I1202 08:16:39.297555 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hlf8n" event={"ID":"f74a79fb-9449-4094-a428-a2582277a38e","Type":"ContainerDied","Data":"7c48d964d247ed01144865410bf306603368a86b9796c47b80e8a90ef3771bff"} Dec 02 08:16:40 crc kubenswrapper[4895]: I1202 08:16:40.307767 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hlf8n" event={"ID":"f74a79fb-9449-4094-a428-a2582277a38e","Type":"ContainerStarted","Data":"38e2e4fb70856fec952f01f015c4f82f41f944e4cb0dfa154eedcf39c3cb1c2c"} Dec 02 08:16:40 crc kubenswrapper[4895]: I1202 08:16:40.309703 4895 generic.go:334] "Generic (PLEG): container finished" podID="344be053-873e-4d70-931a-2bf419ea2d20" containerID="860a8fe02ba72802ee0e1b3bd8fac4e273464f7c2b6275cc8af417e0dee130c9" exitCode=0 Dec 02 08:16:40 crc kubenswrapper[4895]: I1202 08:16:40.309815 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mjv5m" event={"ID":"344be053-873e-4d70-931a-2bf419ea2d20","Type":"ContainerDied","Data":"860a8fe02ba72802ee0e1b3bd8fac4e273464f7c2b6275cc8af417e0dee130c9"} Dec 02 08:16:41 crc kubenswrapper[4895]: I1202 08:16:41.320002 4895 generic.go:334] "Generic (PLEG): container finished" podID="f74a79fb-9449-4094-a428-a2582277a38e" containerID="38e2e4fb70856fec952f01f015c4f82f41f944e4cb0dfa154eedcf39c3cb1c2c" exitCode=0 Dec 02 08:16:41 crc kubenswrapper[4895]: I1202 08:16:41.320116 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hlf8n" 
event={"ID":"f74a79fb-9449-4094-a428-a2582277a38e","Type":"ContainerDied","Data":"38e2e4fb70856fec952f01f015c4f82f41f944e4cb0dfa154eedcf39c3cb1c2c"} Dec 02 08:16:41 crc kubenswrapper[4895]: I1202 08:16:41.325124 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mjv5m" event={"ID":"344be053-873e-4d70-931a-2bf419ea2d20","Type":"ContainerStarted","Data":"ab49686df4bcd94495d5b86c425010ee96aa86df946811d4126630c5948906ff"} Dec 02 08:16:41 crc kubenswrapper[4895]: I1202 08:16:41.364640 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-mjv5m" podStartSLOduration=2.714796845 podStartE2EDuration="5.364619507s" podCreationTimestamp="2025-12-02 08:16:36 +0000 UTC" firstStartedPulling="2025-12-02 08:16:38.278464836 +0000 UTC m=+3209.449324449" lastFinishedPulling="2025-12-02 08:16:40.928287498 +0000 UTC m=+3212.099147111" observedRunningTime="2025-12-02 08:16:41.360672284 +0000 UTC m=+3212.531531907" watchObservedRunningTime="2025-12-02 08:16:41.364619507 +0000 UTC m=+3212.535479120" Dec 02 08:16:42 crc kubenswrapper[4895]: I1202 08:16:42.335307 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hlf8n" event={"ID":"f74a79fb-9449-4094-a428-a2582277a38e","Type":"ContainerStarted","Data":"e82cf666b80dedcb7dbc8245770c56eb5937726fb8095d1885074a8924bc6034"} Dec 02 08:16:42 crc kubenswrapper[4895]: I1202 08:16:42.360000 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-hlf8n" podStartSLOduration=2.643606099 podStartE2EDuration="5.35998s" podCreationTimestamp="2025-12-02 08:16:37 +0000 UTC" firstStartedPulling="2025-12-02 08:16:39.298971401 +0000 UTC m=+3210.469831014" lastFinishedPulling="2025-12-02 08:16:42.015345302 +0000 UTC m=+3213.186204915" observedRunningTime="2025-12-02 08:16:42.353684824 +0000 UTC m=+3213.524544437" 
watchObservedRunningTime="2025-12-02 08:16:42.35998 +0000 UTC m=+3213.530839603" Dec 02 08:16:42 crc kubenswrapper[4895]: I1202 08:16:42.956403 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-69k6c" Dec 02 08:16:42 crc kubenswrapper[4895]: I1202 08:16:42.956467 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-69k6c" Dec 02 08:16:43 crc kubenswrapper[4895]: I1202 08:16:43.057355 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-69k6c" Dec 02 08:16:43 crc kubenswrapper[4895]: I1202 08:16:43.387924 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-69k6c" Dec 02 08:16:44 crc kubenswrapper[4895]: I1202 08:16:44.141624 4895 scope.go:117] "RemoveContainer" containerID="facd37752598c06651c649139fe0757b632d9e66966e92bbd8b7b13be5f8e977" Dec 02 08:16:44 crc kubenswrapper[4895]: E1202 08:16:44.141903 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 08:16:46 crc kubenswrapper[4895]: I1202 08:16:46.687446 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-mjv5m" Dec 02 08:16:46 crc kubenswrapper[4895]: I1202 08:16:46.687807 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-mjv5m" Dec 02 08:16:46 crc kubenswrapper[4895]: I1202 08:16:46.784083 4895 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-mjv5m" Dec 02 08:16:47 crc kubenswrapper[4895]: I1202 08:16:47.420356 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-mjv5m" Dec 02 08:16:47 crc kubenswrapper[4895]: I1202 08:16:47.734774 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-hlf8n" Dec 02 08:16:47 crc kubenswrapper[4895]: I1202 08:16:47.735110 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-hlf8n" Dec 02 08:16:47 crc kubenswrapper[4895]: I1202 08:16:47.779269 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-hlf8n" Dec 02 08:16:48 crc kubenswrapper[4895]: I1202 08:16:48.436843 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-hlf8n" Dec 02 08:16:49 crc kubenswrapper[4895]: I1202 08:16:49.951674 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-69k6c"] Dec 02 08:16:49 crc kubenswrapper[4895]: I1202 08:16:49.951969 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-69k6c" podUID="6dbbeb5d-27e5-43ad-84bd-33e9970246b2" containerName="registry-server" containerID="cri-o://c03e01d70a48c06d1ba31d943ee725b6d13963ddd35a1421447845b4d95d7805" gracePeriod=2 Dec 02 08:16:50 crc kubenswrapper[4895]: I1202 08:16:50.963673 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mjv5m"] Dec 02 08:16:50 crc kubenswrapper[4895]: I1202 08:16:50.964458 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-mjv5m" podUID="344be053-873e-4d70-931a-2bf419ea2d20" 
containerName="registry-server" containerID="cri-o://ab49686df4bcd94495d5b86c425010ee96aa86df946811d4126630c5948906ff" gracePeriod=2 Dec 02 08:16:52 crc kubenswrapper[4895]: I1202 08:16:52.360695 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hlf8n"] Dec 02 08:16:52 crc kubenswrapper[4895]: I1202 08:16:52.362222 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-hlf8n" podUID="f74a79fb-9449-4094-a428-a2582277a38e" containerName="registry-server" containerID="cri-o://e82cf666b80dedcb7dbc8245770c56eb5937726fb8095d1885074a8924bc6034" gracePeriod=2 Dec 02 08:16:52 crc kubenswrapper[4895]: I1202 08:16:52.419647 4895 generic.go:334] "Generic (PLEG): container finished" podID="6dbbeb5d-27e5-43ad-84bd-33e9970246b2" containerID="c03e01d70a48c06d1ba31d943ee725b6d13963ddd35a1421447845b4d95d7805" exitCode=0 Dec 02 08:16:52 crc kubenswrapper[4895]: I1202 08:16:52.419730 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-69k6c" event={"ID":"6dbbeb5d-27e5-43ad-84bd-33e9970246b2","Type":"ContainerDied","Data":"c03e01d70a48c06d1ba31d943ee725b6d13963ddd35a1421447845b4d95d7805"} Dec 02 08:16:52 crc kubenswrapper[4895]: I1202 08:16:52.422488 4895 generic.go:334] "Generic (PLEG): container finished" podID="344be053-873e-4d70-931a-2bf419ea2d20" containerID="ab49686df4bcd94495d5b86c425010ee96aa86df946811d4126630c5948906ff" exitCode=0 Dec 02 08:16:52 crc kubenswrapper[4895]: I1202 08:16:52.422546 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mjv5m" event={"ID":"344be053-873e-4d70-931a-2bf419ea2d20","Type":"ContainerDied","Data":"ab49686df4bcd94495d5b86c425010ee96aa86df946811d4126630c5948906ff"} Dec 02 08:16:52 crc kubenswrapper[4895]: I1202 08:16:52.754251 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-hlf8n" Dec 02 08:16:52 crc kubenswrapper[4895]: I1202 08:16:52.822538 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f74a79fb-9449-4094-a428-a2582277a38e-catalog-content\") pod \"f74a79fb-9449-4094-a428-a2582277a38e\" (UID: \"f74a79fb-9449-4094-a428-a2582277a38e\") " Dec 02 08:16:52 crc kubenswrapper[4895]: I1202 08:16:52.822733 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f74a79fb-9449-4094-a428-a2582277a38e-utilities\") pod \"f74a79fb-9449-4094-a428-a2582277a38e\" (UID: \"f74a79fb-9449-4094-a428-a2582277a38e\") " Dec 02 08:16:52 crc kubenswrapper[4895]: I1202 08:16:52.822876 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6jqvm\" (UniqueName: \"kubernetes.io/projected/f74a79fb-9449-4094-a428-a2582277a38e-kube-api-access-6jqvm\") pod \"f74a79fb-9449-4094-a428-a2582277a38e\" (UID: \"f74a79fb-9449-4094-a428-a2582277a38e\") " Dec 02 08:16:52 crc kubenswrapper[4895]: I1202 08:16:52.823778 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f74a79fb-9449-4094-a428-a2582277a38e-utilities" (OuterVolumeSpecName: "utilities") pod "f74a79fb-9449-4094-a428-a2582277a38e" (UID: "f74a79fb-9449-4094-a428-a2582277a38e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:16:52 crc kubenswrapper[4895]: I1202 08:16:52.829937 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f74a79fb-9449-4094-a428-a2582277a38e-kube-api-access-6jqvm" (OuterVolumeSpecName: "kube-api-access-6jqvm") pod "f74a79fb-9449-4094-a428-a2582277a38e" (UID: "f74a79fb-9449-4094-a428-a2582277a38e"). InnerVolumeSpecName "kube-api-access-6jqvm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:16:52 crc kubenswrapper[4895]: I1202 08:16:52.924864 4895 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f74a79fb-9449-4094-a428-a2582277a38e-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 08:16:52 crc kubenswrapper[4895]: I1202 08:16:52.924893 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6jqvm\" (UniqueName: \"kubernetes.io/projected/f74a79fb-9449-4094-a428-a2582277a38e-kube-api-access-6jqvm\") on node \"crc\" DevicePath \"\"" Dec 02 08:16:52 crc kubenswrapper[4895]: I1202 08:16:52.938500 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f74a79fb-9449-4094-a428-a2582277a38e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f74a79fb-9449-4094-a428-a2582277a38e" (UID: "f74a79fb-9449-4094-a428-a2582277a38e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:16:52 crc kubenswrapper[4895]: E1202 08:16:52.957440 4895 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c03e01d70a48c06d1ba31d943ee725b6d13963ddd35a1421447845b4d95d7805 is running failed: container process not found" containerID="c03e01d70a48c06d1ba31d943ee725b6d13963ddd35a1421447845b4d95d7805" cmd=["grpc_health_probe","-addr=:50051"] Dec 02 08:16:52 crc kubenswrapper[4895]: E1202 08:16:52.958188 4895 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c03e01d70a48c06d1ba31d943ee725b6d13963ddd35a1421447845b4d95d7805 is running failed: container process not found" containerID="c03e01d70a48c06d1ba31d943ee725b6d13963ddd35a1421447845b4d95d7805" cmd=["grpc_health_probe","-addr=:50051"] Dec 02 08:16:52 crc kubenswrapper[4895]: E1202 
08:16:52.958543 4895 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c03e01d70a48c06d1ba31d943ee725b6d13963ddd35a1421447845b4d95d7805 is running failed: container process not found" containerID="c03e01d70a48c06d1ba31d943ee725b6d13963ddd35a1421447845b4d95d7805" cmd=["grpc_health_probe","-addr=:50051"] Dec 02 08:16:52 crc kubenswrapper[4895]: E1202 08:16:52.958569 4895 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c03e01d70a48c06d1ba31d943ee725b6d13963ddd35a1421447845b4d95d7805 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/community-operators-69k6c" podUID="6dbbeb5d-27e5-43ad-84bd-33e9970246b2" containerName="registry-server" Dec 02 08:16:53 crc kubenswrapper[4895]: I1202 08:16:53.026481 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-69k6c" Dec 02 08:16:53 crc kubenswrapper[4895]: I1202 08:16:53.026663 4895 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f74a79fb-9449-4094-a428-a2582277a38e-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 08:16:53 crc kubenswrapper[4895]: I1202 08:16:53.127531 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6dbbeb5d-27e5-43ad-84bd-33e9970246b2-catalog-content\") pod \"6dbbeb5d-27e5-43ad-84bd-33e9970246b2\" (UID: \"6dbbeb5d-27e5-43ad-84bd-33e9970246b2\") " Dec 02 08:16:53 crc kubenswrapper[4895]: I1202 08:16:53.127694 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6dbbeb5d-27e5-43ad-84bd-33e9970246b2-utilities\") pod \"6dbbeb5d-27e5-43ad-84bd-33e9970246b2\" (UID: 
\"6dbbeb5d-27e5-43ad-84bd-33e9970246b2\") " Dec 02 08:16:53 crc kubenswrapper[4895]: I1202 08:16:53.127718 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rp9sx\" (UniqueName: \"kubernetes.io/projected/6dbbeb5d-27e5-43ad-84bd-33e9970246b2-kube-api-access-rp9sx\") pod \"6dbbeb5d-27e5-43ad-84bd-33e9970246b2\" (UID: \"6dbbeb5d-27e5-43ad-84bd-33e9970246b2\") " Dec 02 08:16:53 crc kubenswrapper[4895]: I1202 08:16:53.128548 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6dbbeb5d-27e5-43ad-84bd-33e9970246b2-utilities" (OuterVolumeSpecName: "utilities") pod "6dbbeb5d-27e5-43ad-84bd-33e9970246b2" (UID: "6dbbeb5d-27e5-43ad-84bd-33e9970246b2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:16:53 crc kubenswrapper[4895]: I1202 08:16:53.130367 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6dbbeb5d-27e5-43ad-84bd-33e9970246b2-kube-api-access-rp9sx" (OuterVolumeSpecName: "kube-api-access-rp9sx") pod "6dbbeb5d-27e5-43ad-84bd-33e9970246b2" (UID: "6dbbeb5d-27e5-43ad-84bd-33e9970246b2"). InnerVolumeSpecName "kube-api-access-rp9sx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:16:53 crc kubenswrapper[4895]: I1202 08:16:53.146918 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mjv5m" Dec 02 08:16:53 crc kubenswrapper[4895]: I1202 08:16:53.191190 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6dbbeb5d-27e5-43ad-84bd-33e9970246b2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6dbbeb5d-27e5-43ad-84bd-33e9970246b2" (UID: "6dbbeb5d-27e5-43ad-84bd-33e9970246b2"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:16:53 crc kubenswrapper[4895]: I1202 08:16:53.228951 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/344be053-873e-4d70-931a-2bf419ea2d20-utilities\") pod \"344be053-873e-4d70-931a-2bf419ea2d20\" (UID: \"344be053-873e-4d70-931a-2bf419ea2d20\") " Dec 02 08:16:53 crc kubenswrapper[4895]: I1202 08:16:53.229070 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/344be053-873e-4d70-931a-2bf419ea2d20-catalog-content\") pod \"344be053-873e-4d70-931a-2bf419ea2d20\" (UID: \"344be053-873e-4d70-931a-2bf419ea2d20\") " Dec 02 08:16:53 crc kubenswrapper[4895]: I1202 08:16:53.229275 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2r2n9\" (UniqueName: \"kubernetes.io/projected/344be053-873e-4d70-931a-2bf419ea2d20-kube-api-access-2r2n9\") pod \"344be053-873e-4d70-931a-2bf419ea2d20\" (UID: \"344be053-873e-4d70-931a-2bf419ea2d20\") " Dec 02 08:16:53 crc kubenswrapper[4895]: I1202 08:16:53.229626 4895 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6dbbeb5d-27e5-43ad-84bd-33e9970246b2-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 08:16:53 crc kubenswrapper[4895]: I1202 08:16:53.229644 4895 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6dbbeb5d-27e5-43ad-84bd-33e9970246b2-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 08:16:53 crc kubenswrapper[4895]: I1202 08:16:53.229763 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rp9sx\" (UniqueName: \"kubernetes.io/projected/6dbbeb5d-27e5-43ad-84bd-33e9970246b2-kube-api-access-rp9sx\") on node \"crc\" DevicePath \"\"" Dec 02 08:16:53 crc kubenswrapper[4895]: I1202 08:16:53.229977 
4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/344be053-873e-4d70-931a-2bf419ea2d20-utilities" (OuterVolumeSpecName: "utilities") pod "344be053-873e-4d70-931a-2bf419ea2d20" (UID: "344be053-873e-4d70-931a-2bf419ea2d20"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:16:53 crc kubenswrapper[4895]: I1202 08:16:53.231682 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/344be053-873e-4d70-931a-2bf419ea2d20-kube-api-access-2r2n9" (OuterVolumeSpecName: "kube-api-access-2r2n9") pod "344be053-873e-4d70-931a-2bf419ea2d20" (UID: "344be053-873e-4d70-931a-2bf419ea2d20"). InnerVolumeSpecName "kube-api-access-2r2n9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:16:53 crc kubenswrapper[4895]: I1202 08:16:53.248809 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/344be053-873e-4d70-931a-2bf419ea2d20-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "344be053-873e-4d70-931a-2bf419ea2d20" (UID: "344be053-873e-4d70-931a-2bf419ea2d20"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:16:53 crc kubenswrapper[4895]: I1202 08:16:53.330985 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2r2n9\" (UniqueName: \"kubernetes.io/projected/344be053-873e-4d70-931a-2bf419ea2d20-kube-api-access-2r2n9\") on node \"crc\" DevicePath \"\"" Dec 02 08:16:53 crc kubenswrapper[4895]: I1202 08:16:53.331025 4895 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/344be053-873e-4d70-931a-2bf419ea2d20-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 08:16:53 crc kubenswrapper[4895]: I1202 08:16:53.331039 4895 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/344be053-873e-4d70-931a-2bf419ea2d20-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 08:16:53 crc kubenswrapper[4895]: I1202 08:16:53.431568 4895 generic.go:334] "Generic (PLEG): container finished" podID="f74a79fb-9449-4094-a428-a2582277a38e" containerID="e82cf666b80dedcb7dbc8245770c56eb5937726fb8095d1885074a8924bc6034" exitCode=0 Dec 02 08:16:53 crc kubenswrapper[4895]: I1202 08:16:53.431609 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hlf8n" event={"ID":"f74a79fb-9449-4094-a428-a2582277a38e","Type":"ContainerDied","Data":"e82cf666b80dedcb7dbc8245770c56eb5937726fb8095d1885074a8924bc6034"} Dec 02 08:16:53 crc kubenswrapper[4895]: I1202 08:16:53.431672 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-hlf8n" Dec 02 08:16:53 crc kubenswrapper[4895]: I1202 08:16:53.431732 4895 scope.go:117] "RemoveContainer" containerID="e82cf666b80dedcb7dbc8245770c56eb5937726fb8095d1885074a8924bc6034" Dec 02 08:16:53 crc kubenswrapper[4895]: I1202 08:16:53.431928 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hlf8n" event={"ID":"f74a79fb-9449-4094-a428-a2582277a38e","Type":"ContainerDied","Data":"276dae0f173ec0f990ad03ea2fb295ccaa298f3825db9a954861a944e8c88cd1"} Dec 02 08:16:53 crc kubenswrapper[4895]: I1202 08:16:53.435854 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mjv5m" Dec 02 08:16:53 crc kubenswrapper[4895]: I1202 08:16:53.435875 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mjv5m" event={"ID":"344be053-873e-4d70-931a-2bf419ea2d20","Type":"ContainerDied","Data":"c34ed1cc1062ba50f11bec404ecf9a90fea647afecedc097572fd4dccb3b0c49"} Dec 02 08:16:53 crc kubenswrapper[4895]: I1202 08:16:53.439224 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-69k6c" event={"ID":"6dbbeb5d-27e5-43ad-84bd-33e9970246b2","Type":"ContainerDied","Data":"b8b88035ea441cdae5ebf125fccf12164cb340453740dcc64b2ea11fd5ab2c92"} Dec 02 08:16:53 crc kubenswrapper[4895]: I1202 08:16:53.439324 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-69k6c" Dec 02 08:16:53 crc kubenswrapper[4895]: I1202 08:16:53.458484 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hlf8n"] Dec 02 08:16:53 crc kubenswrapper[4895]: I1202 08:16:53.459916 4895 scope.go:117] "RemoveContainer" containerID="38e2e4fb70856fec952f01f015c4f82f41f944e4cb0dfa154eedcf39c3cb1c2c" Dec 02 08:16:53 crc kubenswrapper[4895]: I1202 08:16:53.469419 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-hlf8n"] Dec 02 08:16:53 crc kubenswrapper[4895]: I1202 08:16:53.506090 4895 scope.go:117] "RemoveContainer" containerID="7c48d964d247ed01144865410bf306603368a86b9796c47b80e8a90ef3771bff" Dec 02 08:16:53 crc kubenswrapper[4895]: I1202 08:16:53.512288 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mjv5m"] Dec 02 08:16:53 crc kubenswrapper[4895]: I1202 08:16:53.522019 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-mjv5m"] Dec 02 08:16:53 crc kubenswrapper[4895]: I1202 08:16:53.527545 4895 scope.go:117] "RemoveContainer" containerID="e82cf666b80dedcb7dbc8245770c56eb5937726fb8095d1885074a8924bc6034" Dec 02 08:16:53 crc kubenswrapper[4895]: E1202 08:16:53.528262 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e82cf666b80dedcb7dbc8245770c56eb5937726fb8095d1885074a8924bc6034\": container with ID starting with e82cf666b80dedcb7dbc8245770c56eb5937726fb8095d1885074a8924bc6034 not found: ID does not exist" containerID="e82cf666b80dedcb7dbc8245770c56eb5937726fb8095d1885074a8924bc6034" Dec 02 08:16:53 crc kubenswrapper[4895]: I1202 08:16:53.528367 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e82cf666b80dedcb7dbc8245770c56eb5937726fb8095d1885074a8924bc6034"} 
err="failed to get container status \"e82cf666b80dedcb7dbc8245770c56eb5937726fb8095d1885074a8924bc6034\": rpc error: code = NotFound desc = could not find container \"e82cf666b80dedcb7dbc8245770c56eb5937726fb8095d1885074a8924bc6034\": container with ID starting with e82cf666b80dedcb7dbc8245770c56eb5937726fb8095d1885074a8924bc6034 not found: ID does not exist" Dec 02 08:16:53 crc kubenswrapper[4895]: I1202 08:16:53.528411 4895 scope.go:117] "RemoveContainer" containerID="38e2e4fb70856fec952f01f015c4f82f41f944e4cb0dfa154eedcf39c3cb1c2c" Dec 02 08:16:53 crc kubenswrapper[4895]: E1202 08:16:53.528958 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38e2e4fb70856fec952f01f015c4f82f41f944e4cb0dfa154eedcf39c3cb1c2c\": container with ID starting with 38e2e4fb70856fec952f01f015c4f82f41f944e4cb0dfa154eedcf39c3cb1c2c not found: ID does not exist" containerID="38e2e4fb70856fec952f01f015c4f82f41f944e4cb0dfa154eedcf39c3cb1c2c" Dec 02 08:16:53 crc kubenswrapper[4895]: I1202 08:16:53.529018 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38e2e4fb70856fec952f01f015c4f82f41f944e4cb0dfa154eedcf39c3cb1c2c"} err="failed to get container status \"38e2e4fb70856fec952f01f015c4f82f41f944e4cb0dfa154eedcf39c3cb1c2c\": rpc error: code = NotFound desc = could not find container \"38e2e4fb70856fec952f01f015c4f82f41f944e4cb0dfa154eedcf39c3cb1c2c\": container with ID starting with 38e2e4fb70856fec952f01f015c4f82f41f944e4cb0dfa154eedcf39c3cb1c2c not found: ID does not exist" Dec 02 08:16:53 crc kubenswrapper[4895]: I1202 08:16:53.529055 4895 scope.go:117] "RemoveContainer" containerID="7c48d964d247ed01144865410bf306603368a86b9796c47b80e8a90ef3771bff" Dec 02 08:16:53 crc kubenswrapper[4895]: E1202 08:16:53.529461 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"7c48d964d247ed01144865410bf306603368a86b9796c47b80e8a90ef3771bff\": container with ID starting with 7c48d964d247ed01144865410bf306603368a86b9796c47b80e8a90ef3771bff not found: ID does not exist" containerID="7c48d964d247ed01144865410bf306603368a86b9796c47b80e8a90ef3771bff" Dec 02 08:16:53 crc kubenswrapper[4895]: I1202 08:16:53.529503 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c48d964d247ed01144865410bf306603368a86b9796c47b80e8a90ef3771bff"} err="failed to get container status \"7c48d964d247ed01144865410bf306603368a86b9796c47b80e8a90ef3771bff\": rpc error: code = NotFound desc = could not find container \"7c48d964d247ed01144865410bf306603368a86b9796c47b80e8a90ef3771bff\": container with ID starting with 7c48d964d247ed01144865410bf306603368a86b9796c47b80e8a90ef3771bff not found: ID does not exist" Dec 02 08:16:53 crc kubenswrapper[4895]: I1202 08:16:53.529535 4895 scope.go:117] "RemoveContainer" containerID="ab49686df4bcd94495d5b86c425010ee96aa86df946811d4126630c5948906ff" Dec 02 08:16:53 crc kubenswrapper[4895]: I1202 08:16:53.529602 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-69k6c"] Dec 02 08:16:53 crc kubenswrapper[4895]: I1202 08:16:53.537090 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-69k6c"] Dec 02 08:16:53 crc kubenswrapper[4895]: I1202 08:16:53.552097 4895 scope.go:117] "RemoveContainer" containerID="860a8fe02ba72802ee0e1b3bd8fac4e273464f7c2b6275cc8af417e0dee130c9" Dec 02 08:16:53 crc kubenswrapper[4895]: I1202 08:16:53.577814 4895 scope.go:117] "RemoveContainer" containerID="c7b6065471d40cecea70b5a83b72a3fd5a079dac5cc5e08eedecc50fa3413018" Dec 02 08:16:53 crc kubenswrapper[4895]: I1202 08:16:53.649511 4895 scope.go:117] "RemoveContainer" containerID="c03e01d70a48c06d1ba31d943ee725b6d13963ddd35a1421447845b4d95d7805" Dec 02 08:16:53 crc kubenswrapper[4895]: I1202 08:16:53.675089 4895 
scope.go:117] "RemoveContainer" containerID="e5561bbd0aa74ba34829b9a9f60e3933719c505a446cc2eab0d1e6aa756e16a0" Dec 02 08:16:53 crc kubenswrapper[4895]: I1202 08:16:53.701282 4895 scope.go:117] "RemoveContainer" containerID="d7014334b74327694019ab6b82006622ee7d9dd323a6e6983388281797649e35" Dec 02 08:16:55 crc kubenswrapper[4895]: I1202 08:16:55.149615 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="344be053-873e-4d70-931a-2bf419ea2d20" path="/var/lib/kubelet/pods/344be053-873e-4d70-931a-2bf419ea2d20/volumes" Dec 02 08:16:55 crc kubenswrapper[4895]: I1202 08:16:55.151250 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6dbbeb5d-27e5-43ad-84bd-33e9970246b2" path="/var/lib/kubelet/pods/6dbbeb5d-27e5-43ad-84bd-33e9970246b2/volumes" Dec 02 08:16:55 crc kubenswrapper[4895]: I1202 08:16:55.152149 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f74a79fb-9449-4094-a428-a2582277a38e" path="/var/lib/kubelet/pods/f74a79fb-9449-4094-a428-a2582277a38e/volumes" Dec 02 08:16:59 crc kubenswrapper[4895]: I1202 08:16:59.145632 4895 scope.go:117] "RemoveContainer" containerID="facd37752598c06651c649139fe0757b632d9e66966e92bbd8b7b13be5f8e977" Dec 02 08:16:59 crc kubenswrapper[4895]: E1202 08:16:59.146277 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 08:17:10 crc kubenswrapper[4895]: I1202 08:17:10.141213 4895 scope.go:117] "RemoveContainer" containerID="facd37752598c06651c649139fe0757b632d9e66966e92bbd8b7b13be5f8e977" Dec 02 08:17:10 crc kubenswrapper[4895]: I1202 08:17:10.600840 4895 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" event={"ID":"0468d2d1-a975-45a6-af9f-47adc432fab0","Type":"ContainerStarted","Data":"29becf2820851f021dae62c6240700b44f570ccb9b682a8aba8c4e77ea0d11b8"} Dec 02 08:19:35 crc kubenswrapper[4895]: I1202 08:19:35.473932 4895 patch_prober.go:28] interesting pod/machine-config-daemon-wfcg7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 08:19:35 crc kubenswrapper[4895]: I1202 08:19:35.474599 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 08:20:02 crc kubenswrapper[4895]: I1202 08:20:02.870265 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-bhq6b"] Dec 02 08:20:02 crc kubenswrapper[4895]: E1202 08:20:02.871951 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f74a79fb-9449-4094-a428-a2582277a38e" containerName="registry-server" Dec 02 08:20:02 crc kubenswrapper[4895]: I1202 08:20:02.871970 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="f74a79fb-9449-4094-a428-a2582277a38e" containerName="registry-server" Dec 02 08:20:02 crc kubenswrapper[4895]: E1202 08:20:02.871983 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6dbbeb5d-27e5-43ad-84bd-33e9970246b2" containerName="extract-content" Dec 02 08:20:02 crc kubenswrapper[4895]: I1202 08:20:02.871989 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="6dbbeb5d-27e5-43ad-84bd-33e9970246b2" containerName="extract-content" Dec 02 08:20:02 crc kubenswrapper[4895]: E1202 
08:20:02.872006 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="344be053-873e-4d70-931a-2bf419ea2d20" containerName="extract-content" Dec 02 08:20:02 crc kubenswrapper[4895]: I1202 08:20:02.872012 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="344be053-873e-4d70-931a-2bf419ea2d20" containerName="extract-content" Dec 02 08:20:02 crc kubenswrapper[4895]: E1202 08:20:02.872024 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="344be053-873e-4d70-931a-2bf419ea2d20" containerName="extract-utilities" Dec 02 08:20:02 crc kubenswrapper[4895]: I1202 08:20:02.872030 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="344be053-873e-4d70-931a-2bf419ea2d20" containerName="extract-utilities" Dec 02 08:20:02 crc kubenswrapper[4895]: E1202 08:20:02.872039 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6dbbeb5d-27e5-43ad-84bd-33e9970246b2" containerName="registry-server" Dec 02 08:20:02 crc kubenswrapper[4895]: I1202 08:20:02.872044 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="6dbbeb5d-27e5-43ad-84bd-33e9970246b2" containerName="registry-server" Dec 02 08:20:02 crc kubenswrapper[4895]: E1202 08:20:02.872056 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="344be053-873e-4d70-931a-2bf419ea2d20" containerName="registry-server" Dec 02 08:20:02 crc kubenswrapper[4895]: I1202 08:20:02.872061 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="344be053-873e-4d70-931a-2bf419ea2d20" containerName="registry-server" Dec 02 08:20:02 crc kubenswrapper[4895]: E1202 08:20:02.872078 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f74a79fb-9449-4094-a428-a2582277a38e" containerName="extract-content" Dec 02 08:20:02 crc kubenswrapper[4895]: I1202 08:20:02.872084 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="f74a79fb-9449-4094-a428-a2582277a38e" containerName="extract-content" Dec 02 08:20:02 crc kubenswrapper[4895]: E1202 
08:20:02.872090 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6dbbeb5d-27e5-43ad-84bd-33e9970246b2" containerName="extract-utilities" Dec 02 08:20:02 crc kubenswrapper[4895]: I1202 08:20:02.872125 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="6dbbeb5d-27e5-43ad-84bd-33e9970246b2" containerName="extract-utilities" Dec 02 08:20:02 crc kubenswrapper[4895]: E1202 08:20:02.872141 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f74a79fb-9449-4094-a428-a2582277a38e" containerName="extract-utilities" Dec 02 08:20:02 crc kubenswrapper[4895]: I1202 08:20:02.872146 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="f74a79fb-9449-4094-a428-a2582277a38e" containerName="extract-utilities" Dec 02 08:20:02 crc kubenswrapper[4895]: I1202 08:20:02.872293 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="f74a79fb-9449-4094-a428-a2582277a38e" containerName="registry-server" Dec 02 08:20:02 crc kubenswrapper[4895]: I1202 08:20:02.872307 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="344be053-873e-4d70-931a-2bf419ea2d20" containerName="registry-server" Dec 02 08:20:02 crc kubenswrapper[4895]: I1202 08:20:02.872316 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="6dbbeb5d-27e5-43ad-84bd-33e9970246b2" containerName="registry-server" Dec 02 08:20:02 crc kubenswrapper[4895]: I1202 08:20:02.873531 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bhq6b" Dec 02 08:20:02 crc kubenswrapper[4895]: I1202 08:20:02.881993 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bhq6b"] Dec 02 08:20:02 crc kubenswrapper[4895]: I1202 08:20:02.929907 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0b3137b-126a-460c-84d3-5c20eb12e235-catalog-content\") pod \"certified-operators-bhq6b\" (UID: \"d0b3137b-126a-460c-84d3-5c20eb12e235\") " pod="openshift-marketplace/certified-operators-bhq6b" Dec 02 08:20:02 crc kubenswrapper[4895]: I1202 08:20:02.929949 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0b3137b-126a-460c-84d3-5c20eb12e235-utilities\") pod \"certified-operators-bhq6b\" (UID: \"d0b3137b-126a-460c-84d3-5c20eb12e235\") " pod="openshift-marketplace/certified-operators-bhq6b" Dec 02 08:20:02 crc kubenswrapper[4895]: I1202 08:20:02.930018 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-926p9\" (UniqueName: \"kubernetes.io/projected/d0b3137b-126a-460c-84d3-5c20eb12e235-kube-api-access-926p9\") pod \"certified-operators-bhq6b\" (UID: \"d0b3137b-126a-460c-84d3-5c20eb12e235\") " pod="openshift-marketplace/certified-operators-bhq6b" Dec 02 08:20:03 crc kubenswrapper[4895]: I1202 08:20:03.032325 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-926p9\" (UniqueName: \"kubernetes.io/projected/d0b3137b-126a-460c-84d3-5c20eb12e235-kube-api-access-926p9\") pod \"certified-operators-bhq6b\" (UID: \"d0b3137b-126a-460c-84d3-5c20eb12e235\") " pod="openshift-marketplace/certified-operators-bhq6b" Dec 02 08:20:03 crc kubenswrapper[4895]: I1202 08:20:03.032442 4895 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0b3137b-126a-460c-84d3-5c20eb12e235-catalog-content\") pod \"certified-operators-bhq6b\" (UID: \"d0b3137b-126a-460c-84d3-5c20eb12e235\") " pod="openshift-marketplace/certified-operators-bhq6b" Dec 02 08:20:03 crc kubenswrapper[4895]: I1202 08:20:03.032467 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0b3137b-126a-460c-84d3-5c20eb12e235-utilities\") pod \"certified-operators-bhq6b\" (UID: \"d0b3137b-126a-460c-84d3-5c20eb12e235\") " pod="openshift-marketplace/certified-operators-bhq6b" Dec 02 08:20:03 crc kubenswrapper[4895]: I1202 08:20:03.033102 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0b3137b-126a-460c-84d3-5c20eb12e235-catalog-content\") pod \"certified-operators-bhq6b\" (UID: \"d0b3137b-126a-460c-84d3-5c20eb12e235\") " pod="openshift-marketplace/certified-operators-bhq6b" Dec 02 08:20:03 crc kubenswrapper[4895]: I1202 08:20:03.033112 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0b3137b-126a-460c-84d3-5c20eb12e235-utilities\") pod \"certified-operators-bhq6b\" (UID: \"d0b3137b-126a-460c-84d3-5c20eb12e235\") " pod="openshift-marketplace/certified-operators-bhq6b" Dec 02 08:20:03 crc kubenswrapper[4895]: I1202 08:20:03.055335 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-926p9\" (UniqueName: \"kubernetes.io/projected/d0b3137b-126a-460c-84d3-5c20eb12e235-kube-api-access-926p9\") pod \"certified-operators-bhq6b\" (UID: \"d0b3137b-126a-460c-84d3-5c20eb12e235\") " pod="openshift-marketplace/certified-operators-bhq6b" Dec 02 08:20:03 crc kubenswrapper[4895]: I1202 08:20:03.197325 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bhq6b" Dec 02 08:20:03 crc kubenswrapper[4895]: I1202 08:20:03.741299 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bhq6b"] Dec 02 08:20:04 crc kubenswrapper[4895]: I1202 08:20:04.038805 4895 generic.go:334] "Generic (PLEG): container finished" podID="d0b3137b-126a-460c-84d3-5c20eb12e235" containerID="2cb4d2118b7b76b55a7a3ccb1537da37fc8d0f32df1e70e9d87b23be896e1777" exitCode=0 Dec 02 08:20:04 crc kubenswrapper[4895]: I1202 08:20:04.038936 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bhq6b" event={"ID":"d0b3137b-126a-460c-84d3-5c20eb12e235","Type":"ContainerDied","Data":"2cb4d2118b7b76b55a7a3ccb1537da37fc8d0f32df1e70e9d87b23be896e1777"} Dec 02 08:20:04 crc kubenswrapper[4895]: I1202 08:20:04.041662 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bhq6b" event={"ID":"d0b3137b-126a-460c-84d3-5c20eb12e235","Type":"ContainerStarted","Data":"7c74f2df12c5e1f82b2c8ce090324799bb98b15e19feaf8c3e61437925008b36"} Dec 02 08:20:05 crc kubenswrapper[4895]: I1202 08:20:05.052564 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bhq6b" event={"ID":"d0b3137b-126a-460c-84d3-5c20eb12e235","Type":"ContainerStarted","Data":"a3f36ce3673204e25dfeddb47fcffa167ff3096a341b98d37b0bb0e2ed9d316e"} Dec 02 08:20:05 crc kubenswrapper[4895]: I1202 08:20:05.473653 4895 patch_prober.go:28] interesting pod/machine-config-daemon-wfcg7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 08:20:05 crc kubenswrapper[4895]: I1202 08:20:05.473797 4895 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 08:20:06 crc kubenswrapper[4895]: I1202 08:20:06.067901 4895 generic.go:334] "Generic (PLEG): container finished" podID="d0b3137b-126a-460c-84d3-5c20eb12e235" containerID="a3f36ce3673204e25dfeddb47fcffa167ff3096a341b98d37b0bb0e2ed9d316e" exitCode=0 Dec 02 08:20:06 crc kubenswrapper[4895]: I1202 08:20:06.067968 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bhq6b" event={"ID":"d0b3137b-126a-460c-84d3-5c20eb12e235","Type":"ContainerDied","Data":"a3f36ce3673204e25dfeddb47fcffa167ff3096a341b98d37b0bb0e2ed9d316e"} Dec 02 08:20:07 crc kubenswrapper[4895]: I1202 08:20:07.080050 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bhq6b" event={"ID":"d0b3137b-126a-460c-84d3-5c20eb12e235","Type":"ContainerStarted","Data":"f28d862ea9d99ef8e555612747a18e56c9203bffdbc563985e0d2dab802437bf"} Dec 02 08:20:07 crc kubenswrapper[4895]: I1202 08:20:07.102399 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-bhq6b" podStartSLOduration=2.369047476 podStartE2EDuration="5.102376475s" podCreationTimestamp="2025-12-02 08:20:02 +0000 UTC" firstStartedPulling="2025-12-02 08:20:04.04215607 +0000 UTC m=+3415.213015703" lastFinishedPulling="2025-12-02 08:20:06.775485079 +0000 UTC m=+3417.946344702" observedRunningTime="2025-12-02 08:20:07.097251194 +0000 UTC m=+3418.268110847" watchObservedRunningTime="2025-12-02 08:20:07.102376475 +0000 UTC m=+3418.273236078" Dec 02 08:20:13 crc kubenswrapper[4895]: I1202 08:20:13.197578 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-bhq6b" Dec 
02 08:20:13 crc kubenswrapper[4895]: I1202 08:20:13.198470 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-bhq6b" Dec 02 08:20:13 crc kubenswrapper[4895]: I1202 08:20:13.244800 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-bhq6b" Dec 02 08:20:14 crc kubenswrapper[4895]: I1202 08:20:14.172279 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-bhq6b" Dec 02 08:20:14 crc kubenswrapper[4895]: I1202 08:20:14.216001 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bhq6b"] Dec 02 08:20:16 crc kubenswrapper[4895]: I1202 08:20:16.148105 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-bhq6b" podUID="d0b3137b-126a-460c-84d3-5c20eb12e235" containerName="registry-server" containerID="cri-o://f28d862ea9d99ef8e555612747a18e56c9203bffdbc563985e0d2dab802437bf" gracePeriod=2 Dec 02 08:20:17 crc kubenswrapper[4895]: I1202 08:20:17.045206 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bhq6b" Dec 02 08:20:17 crc kubenswrapper[4895]: I1202 08:20:17.158178 4895 generic.go:334] "Generic (PLEG): container finished" podID="d0b3137b-126a-460c-84d3-5c20eb12e235" containerID="f28d862ea9d99ef8e555612747a18e56c9203bffdbc563985e0d2dab802437bf" exitCode=0 Dec 02 08:20:17 crc kubenswrapper[4895]: I1202 08:20:17.158229 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bhq6b" event={"ID":"d0b3137b-126a-460c-84d3-5c20eb12e235","Type":"ContainerDied","Data":"f28d862ea9d99ef8e555612747a18e56c9203bffdbc563985e0d2dab802437bf"} Dec 02 08:20:17 crc kubenswrapper[4895]: I1202 08:20:17.158281 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bhq6b" event={"ID":"d0b3137b-126a-460c-84d3-5c20eb12e235","Type":"ContainerDied","Data":"7c74f2df12c5e1f82b2c8ce090324799bb98b15e19feaf8c3e61437925008b36"} Dec 02 08:20:17 crc kubenswrapper[4895]: I1202 08:20:17.158318 4895 scope.go:117] "RemoveContainer" containerID="f28d862ea9d99ef8e555612747a18e56c9203bffdbc563985e0d2dab802437bf" Dec 02 08:20:17 crc kubenswrapper[4895]: I1202 08:20:17.158315 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bhq6b" Dec 02 08:20:17 crc kubenswrapper[4895]: I1202 08:20:17.170118 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-926p9\" (UniqueName: \"kubernetes.io/projected/d0b3137b-126a-460c-84d3-5c20eb12e235-kube-api-access-926p9\") pod \"d0b3137b-126a-460c-84d3-5c20eb12e235\" (UID: \"d0b3137b-126a-460c-84d3-5c20eb12e235\") " Dec 02 08:20:17 crc kubenswrapper[4895]: I1202 08:20:17.170233 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0b3137b-126a-460c-84d3-5c20eb12e235-catalog-content\") pod \"d0b3137b-126a-460c-84d3-5c20eb12e235\" (UID: \"d0b3137b-126a-460c-84d3-5c20eb12e235\") " Dec 02 08:20:17 crc kubenswrapper[4895]: I1202 08:20:17.170431 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0b3137b-126a-460c-84d3-5c20eb12e235-utilities\") pod \"d0b3137b-126a-460c-84d3-5c20eb12e235\" (UID: \"d0b3137b-126a-460c-84d3-5c20eb12e235\") " Dec 02 08:20:17 crc kubenswrapper[4895]: I1202 08:20:17.171182 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d0b3137b-126a-460c-84d3-5c20eb12e235-utilities" (OuterVolumeSpecName: "utilities") pod "d0b3137b-126a-460c-84d3-5c20eb12e235" (UID: "d0b3137b-126a-460c-84d3-5c20eb12e235"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:20:17 crc kubenswrapper[4895]: I1202 08:20:17.179017 4895 scope.go:117] "RemoveContainer" containerID="a3f36ce3673204e25dfeddb47fcffa167ff3096a341b98d37b0bb0e2ed9d316e" Dec 02 08:20:17 crc kubenswrapper[4895]: I1202 08:20:17.180630 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0b3137b-126a-460c-84d3-5c20eb12e235-kube-api-access-926p9" (OuterVolumeSpecName: "kube-api-access-926p9") pod "d0b3137b-126a-460c-84d3-5c20eb12e235" (UID: "d0b3137b-126a-460c-84d3-5c20eb12e235"). InnerVolumeSpecName "kube-api-access-926p9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:20:17 crc kubenswrapper[4895]: I1202 08:20:17.220287 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d0b3137b-126a-460c-84d3-5c20eb12e235-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d0b3137b-126a-460c-84d3-5c20eb12e235" (UID: "d0b3137b-126a-460c-84d3-5c20eb12e235"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:20:17 crc kubenswrapper[4895]: I1202 08:20:17.225033 4895 scope.go:117] "RemoveContainer" containerID="2cb4d2118b7b76b55a7a3ccb1537da37fc8d0f32df1e70e9d87b23be896e1777" Dec 02 08:20:17 crc kubenswrapper[4895]: I1202 08:20:17.245635 4895 scope.go:117] "RemoveContainer" containerID="f28d862ea9d99ef8e555612747a18e56c9203bffdbc563985e0d2dab802437bf" Dec 02 08:20:17 crc kubenswrapper[4895]: E1202 08:20:17.246443 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f28d862ea9d99ef8e555612747a18e56c9203bffdbc563985e0d2dab802437bf\": container with ID starting with f28d862ea9d99ef8e555612747a18e56c9203bffdbc563985e0d2dab802437bf not found: ID does not exist" containerID="f28d862ea9d99ef8e555612747a18e56c9203bffdbc563985e0d2dab802437bf" Dec 02 08:20:17 crc kubenswrapper[4895]: I1202 08:20:17.246498 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f28d862ea9d99ef8e555612747a18e56c9203bffdbc563985e0d2dab802437bf"} err="failed to get container status \"f28d862ea9d99ef8e555612747a18e56c9203bffdbc563985e0d2dab802437bf\": rpc error: code = NotFound desc = could not find container \"f28d862ea9d99ef8e555612747a18e56c9203bffdbc563985e0d2dab802437bf\": container with ID starting with f28d862ea9d99ef8e555612747a18e56c9203bffdbc563985e0d2dab802437bf not found: ID does not exist" Dec 02 08:20:17 crc kubenswrapper[4895]: I1202 08:20:17.246542 4895 scope.go:117] "RemoveContainer" containerID="a3f36ce3673204e25dfeddb47fcffa167ff3096a341b98d37b0bb0e2ed9d316e" Dec 02 08:20:17 crc kubenswrapper[4895]: E1202 08:20:17.246885 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a3f36ce3673204e25dfeddb47fcffa167ff3096a341b98d37b0bb0e2ed9d316e\": container with ID starting with 
a3f36ce3673204e25dfeddb47fcffa167ff3096a341b98d37b0bb0e2ed9d316e not found: ID does not exist" containerID="a3f36ce3673204e25dfeddb47fcffa167ff3096a341b98d37b0bb0e2ed9d316e" Dec 02 08:20:17 crc kubenswrapper[4895]: I1202 08:20:17.246915 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3f36ce3673204e25dfeddb47fcffa167ff3096a341b98d37b0bb0e2ed9d316e"} err="failed to get container status \"a3f36ce3673204e25dfeddb47fcffa167ff3096a341b98d37b0bb0e2ed9d316e\": rpc error: code = NotFound desc = could not find container \"a3f36ce3673204e25dfeddb47fcffa167ff3096a341b98d37b0bb0e2ed9d316e\": container with ID starting with a3f36ce3673204e25dfeddb47fcffa167ff3096a341b98d37b0bb0e2ed9d316e not found: ID does not exist" Dec 02 08:20:17 crc kubenswrapper[4895]: I1202 08:20:17.246929 4895 scope.go:117] "RemoveContainer" containerID="2cb4d2118b7b76b55a7a3ccb1537da37fc8d0f32df1e70e9d87b23be896e1777" Dec 02 08:20:17 crc kubenswrapper[4895]: E1202 08:20:17.247417 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2cb4d2118b7b76b55a7a3ccb1537da37fc8d0f32df1e70e9d87b23be896e1777\": container with ID starting with 2cb4d2118b7b76b55a7a3ccb1537da37fc8d0f32df1e70e9d87b23be896e1777 not found: ID does not exist" containerID="2cb4d2118b7b76b55a7a3ccb1537da37fc8d0f32df1e70e9d87b23be896e1777" Dec 02 08:20:17 crc kubenswrapper[4895]: I1202 08:20:17.247485 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2cb4d2118b7b76b55a7a3ccb1537da37fc8d0f32df1e70e9d87b23be896e1777"} err="failed to get container status \"2cb4d2118b7b76b55a7a3ccb1537da37fc8d0f32df1e70e9d87b23be896e1777\": rpc error: code = NotFound desc = could not find container \"2cb4d2118b7b76b55a7a3ccb1537da37fc8d0f32df1e70e9d87b23be896e1777\": container with ID starting with 2cb4d2118b7b76b55a7a3ccb1537da37fc8d0f32df1e70e9d87b23be896e1777 not found: ID does not 
exist" Dec 02 08:20:17 crc kubenswrapper[4895]: I1202 08:20:17.273152 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-926p9\" (UniqueName: \"kubernetes.io/projected/d0b3137b-126a-460c-84d3-5c20eb12e235-kube-api-access-926p9\") on node \"crc\" DevicePath \"\"" Dec 02 08:20:17 crc kubenswrapper[4895]: I1202 08:20:17.273196 4895 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0b3137b-126a-460c-84d3-5c20eb12e235-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 08:20:17 crc kubenswrapper[4895]: I1202 08:20:17.273206 4895 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0b3137b-126a-460c-84d3-5c20eb12e235-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 08:20:17 crc kubenswrapper[4895]: I1202 08:20:17.501975 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bhq6b"] Dec 02 08:20:17 crc kubenswrapper[4895]: I1202 08:20:17.507800 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-bhq6b"] Dec 02 08:20:19 crc kubenswrapper[4895]: I1202 08:20:19.151548 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0b3137b-126a-460c-84d3-5c20eb12e235" path="/var/lib/kubelet/pods/d0b3137b-126a-460c-84d3-5c20eb12e235/volumes" Dec 02 08:20:35 crc kubenswrapper[4895]: I1202 08:20:35.473544 4895 patch_prober.go:28] interesting pod/machine-config-daemon-wfcg7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 08:20:35 crc kubenswrapper[4895]: I1202 08:20:35.474129 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 08:20:35 crc kubenswrapper[4895]: I1202 08:20:35.474205 4895 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" Dec 02 08:20:35 crc kubenswrapper[4895]: I1202 08:20:35.475083 4895 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"29becf2820851f021dae62c6240700b44f570ccb9b682a8aba8c4e77ea0d11b8"} pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 08:20:35 crc kubenswrapper[4895]: I1202 08:20:35.475156 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerName="machine-config-daemon" containerID="cri-o://29becf2820851f021dae62c6240700b44f570ccb9b682a8aba8c4e77ea0d11b8" gracePeriod=600 Dec 02 08:20:36 crc kubenswrapper[4895]: I1202 08:20:36.319983 4895 generic.go:334] "Generic (PLEG): container finished" podID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerID="29becf2820851f021dae62c6240700b44f570ccb9b682a8aba8c4e77ea0d11b8" exitCode=0 Dec 02 08:20:36 crc kubenswrapper[4895]: I1202 08:20:36.320031 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" event={"ID":"0468d2d1-a975-45a6-af9f-47adc432fab0","Type":"ContainerDied","Data":"29becf2820851f021dae62c6240700b44f570ccb9b682a8aba8c4e77ea0d11b8"} Dec 02 08:20:36 crc kubenswrapper[4895]: I1202 08:20:36.320693 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" 
event={"ID":"0468d2d1-a975-45a6-af9f-47adc432fab0","Type":"ContainerStarted","Data":"85cc12b4c8ab8b07033c83c08e99e7c6dfb03217541e445f2b564ba95f1d74c3"} Dec 02 08:20:36 crc kubenswrapper[4895]: I1202 08:20:36.320730 4895 scope.go:117] "RemoveContainer" containerID="facd37752598c06651c649139fe0757b632d9e66966e92bbd8b7b13be5f8e977" Dec 02 08:22:35 crc kubenswrapper[4895]: I1202 08:22:35.473696 4895 patch_prober.go:28] interesting pod/machine-config-daemon-wfcg7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 08:22:35 crc kubenswrapper[4895]: I1202 08:22:35.475700 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 08:23:05 crc kubenswrapper[4895]: I1202 08:23:05.474132 4895 patch_prober.go:28] interesting pod/machine-config-daemon-wfcg7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 08:23:05 crc kubenswrapper[4895]: I1202 08:23:05.475223 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 08:23:35 crc kubenswrapper[4895]: I1202 08:23:35.473908 4895 patch_prober.go:28] interesting pod/machine-config-daemon-wfcg7 container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 08:23:35 crc kubenswrapper[4895]: I1202 08:23:35.474509 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 08:23:35 crc kubenswrapper[4895]: I1202 08:23:35.474567 4895 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" Dec 02 08:23:35 crc kubenswrapper[4895]: I1202 08:23:35.475312 4895 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"85cc12b4c8ab8b07033c83c08e99e7c6dfb03217541e445f2b564ba95f1d74c3"} pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 08:23:35 crc kubenswrapper[4895]: I1202 08:23:35.475369 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerName="machine-config-daemon" containerID="cri-o://85cc12b4c8ab8b07033c83c08e99e7c6dfb03217541e445f2b564ba95f1d74c3" gracePeriod=600 Dec 02 08:23:35 crc kubenswrapper[4895]: E1202 08:23:35.594782 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 08:23:35 crc kubenswrapper[4895]: I1202 08:23:35.904966 4895 generic.go:334] "Generic (PLEG): container finished" podID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerID="85cc12b4c8ab8b07033c83c08e99e7c6dfb03217541e445f2b564ba95f1d74c3" exitCode=0 Dec 02 08:23:35 crc kubenswrapper[4895]: I1202 08:23:35.905022 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" event={"ID":"0468d2d1-a975-45a6-af9f-47adc432fab0","Type":"ContainerDied","Data":"85cc12b4c8ab8b07033c83c08e99e7c6dfb03217541e445f2b564ba95f1d74c3"} Dec 02 08:23:35 crc kubenswrapper[4895]: I1202 08:23:35.905089 4895 scope.go:117] "RemoveContainer" containerID="29becf2820851f021dae62c6240700b44f570ccb9b682a8aba8c4e77ea0d11b8" Dec 02 08:23:35 crc kubenswrapper[4895]: I1202 08:23:35.906087 4895 scope.go:117] "RemoveContainer" containerID="85cc12b4c8ab8b07033c83c08e99e7c6dfb03217541e445f2b564ba95f1d74c3" Dec 02 08:23:35 crc kubenswrapper[4895]: E1202 08:23:35.906558 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 08:23:47 crc kubenswrapper[4895]: I1202 08:23:47.141177 4895 scope.go:117] "RemoveContainer" containerID="85cc12b4c8ab8b07033c83c08e99e7c6dfb03217541e445f2b564ba95f1d74c3" Dec 02 08:23:47 crc kubenswrapper[4895]: E1202 08:23:47.142444 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 08:23:58 crc kubenswrapper[4895]: I1202 08:23:58.141243 4895 scope.go:117] "RemoveContainer" containerID="85cc12b4c8ab8b07033c83c08e99e7c6dfb03217541e445f2b564ba95f1d74c3" Dec 02 08:23:58 crc kubenswrapper[4895]: E1202 08:23:58.141985 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 08:24:11 crc kubenswrapper[4895]: I1202 08:24:11.142335 4895 scope.go:117] "RemoveContainer" containerID="85cc12b4c8ab8b07033c83c08e99e7c6dfb03217541e445f2b564ba95f1d74c3" Dec 02 08:24:11 crc kubenswrapper[4895]: E1202 08:24:11.144347 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 08:24:25 crc kubenswrapper[4895]: I1202 08:24:25.143381 4895 scope.go:117] "RemoveContainer" containerID="85cc12b4c8ab8b07033c83c08e99e7c6dfb03217541e445f2b564ba95f1d74c3" Dec 02 08:24:25 crc kubenswrapper[4895]: E1202 08:24:25.144698 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 08:24:36 crc kubenswrapper[4895]: I1202 08:24:36.141169 4895 scope.go:117] "RemoveContainer" containerID="85cc12b4c8ab8b07033c83c08e99e7c6dfb03217541e445f2b564ba95f1d74c3" Dec 02 08:24:36 crc kubenswrapper[4895]: E1202 08:24:36.142014 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 08:24:47 crc kubenswrapper[4895]: I1202 08:24:47.142124 4895 scope.go:117] "RemoveContainer" containerID="85cc12b4c8ab8b07033c83c08e99e7c6dfb03217541e445f2b564ba95f1d74c3" Dec 02 08:24:47 crc kubenswrapper[4895]: E1202 08:24:47.143880 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 08:24:59 crc kubenswrapper[4895]: I1202 08:24:59.145532 4895 scope.go:117] "RemoveContainer" containerID="85cc12b4c8ab8b07033c83c08e99e7c6dfb03217541e445f2b564ba95f1d74c3" Dec 02 08:24:59 crc kubenswrapper[4895]: E1202 08:24:59.146567 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 08:25:10 crc kubenswrapper[4895]: I1202 08:25:10.140553 4895 scope.go:117] "RemoveContainer" containerID="85cc12b4c8ab8b07033c83c08e99e7c6dfb03217541e445f2b564ba95f1d74c3" Dec 02 08:25:10 crc kubenswrapper[4895]: E1202 08:25:10.141378 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 08:25:24 crc kubenswrapper[4895]: I1202 08:25:24.141540 4895 scope.go:117] "RemoveContainer" containerID="85cc12b4c8ab8b07033c83c08e99e7c6dfb03217541e445f2b564ba95f1d74c3" Dec 02 08:25:24 crc kubenswrapper[4895]: E1202 08:25:24.142320 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 08:25:38 crc kubenswrapper[4895]: I1202 08:25:38.141570 4895 scope.go:117] "RemoveContainer" containerID="85cc12b4c8ab8b07033c83c08e99e7c6dfb03217541e445f2b564ba95f1d74c3" Dec 02 08:25:38 crc kubenswrapper[4895]: E1202 08:25:38.142505 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 08:25:49 crc kubenswrapper[4895]: I1202 08:25:49.146201 4895 scope.go:117] "RemoveContainer" containerID="85cc12b4c8ab8b07033c83c08e99e7c6dfb03217541e445f2b564ba95f1d74c3" Dec 02 08:25:49 crc kubenswrapper[4895]: E1202 08:25:49.147037 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 08:26:03 crc kubenswrapper[4895]: I1202 08:26:03.141936 4895 scope.go:117] "RemoveContainer" containerID="85cc12b4c8ab8b07033c83c08e99e7c6dfb03217541e445f2b564ba95f1d74c3" Dec 02 08:26:03 crc kubenswrapper[4895]: E1202 08:26:03.143287 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 08:26:16 crc kubenswrapper[4895]: I1202 08:26:16.140532 4895 scope.go:117] "RemoveContainer" containerID="85cc12b4c8ab8b07033c83c08e99e7c6dfb03217541e445f2b564ba95f1d74c3" Dec 02 08:26:16 crc kubenswrapper[4895]: E1202 08:26:16.141320 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 08:26:30 crc kubenswrapper[4895]: I1202 08:26:30.142206 4895 scope.go:117] "RemoveContainer" containerID="85cc12b4c8ab8b07033c83c08e99e7c6dfb03217541e445f2b564ba95f1d74c3" Dec 02 08:26:30 crc kubenswrapper[4895]: E1202 08:26:30.143407 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 08:26:37 crc kubenswrapper[4895]: I1202 08:26:37.624213 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-5scxs"] Dec 02 08:26:37 crc kubenswrapper[4895]: E1202 08:26:37.627788 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0b3137b-126a-460c-84d3-5c20eb12e235" containerName="extract-content" Dec 02 08:26:37 crc kubenswrapper[4895]: I1202 08:26:37.627945 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0b3137b-126a-460c-84d3-5c20eb12e235" containerName="extract-content" Dec 02 08:26:37 crc kubenswrapper[4895]: E1202 08:26:37.628076 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0b3137b-126a-460c-84d3-5c20eb12e235" containerName="registry-server" Dec 02 08:26:37 crc kubenswrapper[4895]: I1202 08:26:37.628190 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0b3137b-126a-460c-84d3-5c20eb12e235" containerName="registry-server" Dec 02 08:26:37 crc 
kubenswrapper[4895]: E1202 08:26:37.628352 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0b3137b-126a-460c-84d3-5c20eb12e235" containerName="extract-utilities" Dec 02 08:26:37 crc kubenswrapper[4895]: I1202 08:26:37.628476 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0b3137b-126a-460c-84d3-5c20eb12e235" containerName="extract-utilities" Dec 02 08:26:37 crc kubenswrapper[4895]: I1202 08:26:37.628966 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0b3137b-126a-460c-84d3-5c20eb12e235" containerName="registry-server" Dec 02 08:26:37 crc kubenswrapper[4895]: I1202 08:26:37.631197 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5scxs" Dec 02 08:26:37 crc kubenswrapper[4895]: I1202 08:26:37.650076 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5scxs"] Dec 02 08:26:37 crc kubenswrapper[4895]: I1202 08:26:37.751912 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85bd2ce1-1805-4305-a6a0-bfd7ee5e67ef-utilities\") pod \"redhat-operators-5scxs\" (UID: \"85bd2ce1-1805-4305-a6a0-bfd7ee5e67ef\") " pod="openshift-marketplace/redhat-operators-5scxs" Dec 02 08:26:37 crc kubenswrapper[4895]: I1202 08:26:37.752111 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j52p9\" (UniqueName: \"kubernetes.io/projected/85bd2ce1-1805-4305-a6a0-bfd7ee5e67ef-kube-api-access-j52p9\") pod \"redhat-operators-5scxs\" (UID: \"85bd2ce1-1805-4305-a6a0-bfd7ee5e67ef\") " pod="openshift-marketplace/redhat-operators-5scxs" Dec 02 08:26:37 crc kubenswrapper[4895]: I1202 08:26:37.752249 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/85bd2ce1-1805-4305-a6a0-bfd7ee5e67ef-catalog-content\") pod \"redhat-operators-5scxs\" (UID: \"85bd2ce1-1805-4305-a6a0-bfd7ee5e67ef\") " pod="openshift-marketplace/redhat-operators-5scxs" Dec 02 08:26:37 crc kubenswrapper[4895]: I1202 08:26:37.854018 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j52p9\" (UniqueName: \"kubernetes.io/projected/85bd2ce1-1805-4305-a6a0-bfd7ee5e67ef-kube-api-access-j52p9\") pod \"redhat-operators-5scxs\" (UID: \"85bd2ce1-1805-4305-a6a0-bfd7ee5e67ef\") " pod="openshift-marketplace/redhat-operators-5scxs" Dec 02 08:26:37 crc kubenswrapper[4895]: I1202 08:26:37.854077 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85bd2ce1-1805-4305-a6a0-bfd7ee5e67ef-catalog-content\") pod \"redhat-operators-5scxs\" (UID: \"85bd2ce1-1805-4305-a6a0-bfd7ee5e67ef\") " pod="openshift-marketplace/redhat-operators-5scxs" Dec 02 08:26:37 crc kubenswrapper[4895]: I1202 08:26:37.854115 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85bd2ce1-1805-4305-a6a0-bfd7ee5e67ef-utilities\") pod \"redhat-operators-5scxs\" (UID: \"85bd2ce1-1805-4305-a6a0-bfd7ee5e67ef\") " pod="openshift-marketplace/redhat-operators-5scxs" Dec 02 08:26:37 crc kubenswrapper[4895]: I1202 08:26:37.854693 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85bd2ce1-1805-4305-a6a0-bfd7ee5e67ef-utilities\") pod \"redhat-operators-5scxs\" (UID: \"85bd2ce1-1805-4305-a6a0-bfd7ee5e67ef\") " pod="openshift-marketplace/redhat-operators-5scxs" Dec 02 08:26:37 crc kubenswrapper[4895]: I1202 08:26:37.854695 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/85bd2ce1-1805-4305-a6a0-bfd7ee5e67ef-catalog-content\") pod \"redhat-operators-5scxs\" (UID: \"85bd2ce1-1805-4305-a6a0-bfd7ee5e67ef\") " pod="openshift-marketplace/redhat-operators-5scxs" Dec 02 08:26:37 crc kubenswrapper[4895]: I1202 08:26:37.876305 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j52p9\" (UniqueName: \"kubernetes.io/projected/85bd2ce1-1805-4305-a6a0-bfd7ee5e67ef-kube-api-access-j52p9\") pod \"redhat-operators-5scxs\" (UID: \"85bd2ce1-1805-4305-a6a0-bfd7ee5e67ef\") " pod="openshift-marketplace/redhat-operators-5scxs" Dec 02 08:26:37 crc kubenswrapper[4895]: I1202 08:26:37.970868 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5scxs" Dec 02 08:26:38 crc kubenswrapper[4895]: I1202 08:26:38.428214 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5scxs"] Dec 02 08:26:38 crc kubenswrapper[4895]: I1202 08:26:38.468626 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5scxs" event={"ID":"85bd2ce1-1805-4305-a6a0-bfd7ee5e67ef","Type":"ContainerStarted","Data":"1b6ecebcc01f63f0cac7120fc943f2e96eede78ecd9ca5dffa4e44186ebe6516"} Dec 02 08:26:39 crc kubenswrapper[4895]: I1202 08:26:39.478421 4895 generic.go:334] "Generic (PLEG): container finished" podID="85bd2ce1-1805-4305-a6a0-bfd7ee5e67ef" containerID="8e1d2509bffbd9aa580eb3dba25d892348fa4130f3f6af64ef181c797769cc96" exitCode=0 Dec 02 08:26:39 crc kubenswrapper[4895]: I1202 08:26:39.478468 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5scxs" event={"ID":"85bd2ce1-1805-4305-a6a0-bfd7ee5e67ef","Type":"ContainerDied","Data":"8e1d2509bffbd9aa580eb3dba25d892348fa4130f3f6af64ef181c797769cc96"} Dec 02 08:26:39 crc kubenswrapper[4895]: I1202 08:26:39.481099 4895 provider.go:102] Refreshing cache for provider: 
*credentialprovider.defaultDockerConfigProvider Dec 02 08:26:41 crc kubenswrapper[4895]: I1202 08:26:41.141865 4895 scope.go:117] "RemoveContainer" containerID="85cc12b4c8ab8b07033c83c08e99e7c6dfb03217541e445f2b564ba95f1d74c3" Dec 02 08:26:41 crc kubenswrapper[4895]: E1202 08:26:41.142383 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 08:26:41 crc kubenswrapper[4895]: I1202 08:26:41.493989 4895 generic.go:334] "Generic (PLEG): container finished" podID="85bd2ce1-1805-4305-a6a0-bfd7ee5e67ef" containerID="3d4b3f63c43ebf495488e6955f5710ccff5c7c4fed14355e25b591adc8df0d86" exitCode=0 Dec 02 08:26:41 crc kubenswrapper[4895]: I1202 08:26:41.494031 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5scxs" event={"ID":"85bd2ce1-1805-4305-a6a0-bfd7ee5e67ef","Type":"ContainerDied","Data":"3d4b3f63c43ebf495488e6955f5710ccff5c7c4fed14355e25b591adc8df0d86"} Dec 02 08:26:43 crc kubenswrapper[4895]: I1202 08:26:43.531827 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5scxs" event={"ID":"85bd2ce1-1805-4305-a6a0-bfd7ee5e67ef","Type":"ContainerStarted","Data":"5c1abfc17b37d586013a98f87caa588e0e5fa0f5af69413f77e148015261f609"} Dec 02 08:26:43 crc kubenswrapper[4895]: I1202 08:26:43.549560 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-5scxs" podStartSLOduration=3.4935704530000002 podStartE2EDuration="6.549543733s" podCreationTimestamp="2025-12-02 08:26:37 +0000 UTC" firstStartedPulling="2025-12-02 08:26:39.480686395 +0000 UTC 
m=+3810.651546018" lastFinishedPulling="2025-12-02 08:26:42.536659685 +0000 UTC m=+3813.707519298" observedRunningTime="2025-12-02 08:26:43.54880815 +0000 UTC m=+3814.719667783" watchObservedRunningTime="2025-12-02 08:26:43.549543733 +0000 UTC m=+3814.720403346" Dec 02 08:26:47 crc kubenswrapper[4895]: I1202 08:26:47.971998 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-5scxs" Dec 02 08:26:47 crc kubenswrapper[4895]: I1202 08:26:47.972475 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-5scxs" Dec 02 08:26:48 crc kubenswrapper[4895]: I1202 08:26:48.017578 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-5scxs" Dec 02 08:26:48 crc kubenswrapper[4895]: I1202 08:26:48.607633 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-5scxs" Dec 02 08:26:48 crc kubenswrapper[4895]: I1202 08:26:48.654928 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5scxs"] Dec 02 08:26:50 crc kubenswrapper[4895]: I1202 08:26:50.580939 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-5scxs" podUID="85bd2ce1-1805-4305-a6a0-bfd7ee5e67ef" containerName="registry-server" containerID="cri-o://5c1abfc17b37d586013a98f87caa588e0e5fa0f5af69413f77e148015261f609" gracePeriod=2 Dec 02 08:26:52 crc kubenswrapper[4895]: I1202 08:26:52.599104 4895 generic.go:334] "Generic (PLEG): container finished" podID="85bd2ce1-1805-4305-a6a0-bfd7ee5e67ef" containerID="5c1abfc17b37d586013a98f87caa588e0e5fa0f5af69413f77e148015261f609" exitCode=0 Dec 02 08:26:52 crc kubenswrapper[4895]: I1202 08:26:52.599442 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5scxs" 
event={"ID":"85bd2ce1-1805-4305-a6a0-bfd7ee5e67ef","Type":"ContainerDied","Data":"5c1abfc17b37d586013a98f87caa588e0e5fa0f5af69413f77e148015261f609"} Dec 02 08:26:52 crc kubenswrapper[4895]: I1202 08:26:52.831819 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5scxs" Dec 02 08:26:52 crc kubenswrapper[4895]: I1202 08:26:52.962301 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85bd2ce1-1805-4305-a6a0-bfd7ee5e67ef-utilities\") pod \"85bd2ce1-1805-4305-a6a0-bfd7ee5e67ef\" (UID: \"85bd2ce1-1805-4305-a6a0-bfd7ee5e67ef\") " Dec 02 08:26:52 crc kubenswrapper[4895]: I1202 08:26:52.962366 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j52p9\" (UniqueName: \"kubernetes.io/projected/85bd2ce1-1805-4305-a6a0-bfd7ee5e67ef-kube-api-access-j52p9\") pod \"85bd2ce1-1805-4305-a6a0-bfd7ee5e67ef\" (UID: \"85bd2ce1-1805-4305-a6a0-bfd7ee5e67ef\") " Dec 02 08:26:52 crc kubenswrapper[4895]: I1202 08:26:52.962397 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85bd2ce1-1805-4305-a6a0-bfd7ee5e67ef-catalog-content\") pod \"85bd2ce1-1805-4305-a6a0-bfd7ee5e67ef\" (UID: \"85bd2ce1-1805-4305-a6a0-bfd7ee5e67ef\") " Dec 02 08:26:52 crc kubenswrapper[4895]: I1202 08:26:52.963484 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85bd2ce1-1805-4305-a6a0-bfd7ee5e67ef-utilities" (OuterVolumeSpecName: "utilities") pod "85bd2ce1-1805-4305-a6a0-bfd7ee5e67ef" (UID: "85bd2ce1-1805-4305-a6a0-bfd7ee5e67ef"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:26:52 crc kubenswrapper[4895]: I1202 08:26:52.967902 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85bd2ce1-1805-4305-a6a0-bfd7ee5e67ef-kube-api-access-j52p9" (OuterVolumeSpecName: "kube-api-access-j52p9") pod "85bd2ce1-1805-4305-a6a0-bfd7ee5e67ef" (UID: "85bd2ce1-1805-4305-a6a0-bfd7ee5e67ef"). InnerVolumeSpecName "kube-api-access-j52p9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:26:53 crc kubenswrapper[4895]: I1202 08:26:53.064350 4895 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85bd2ce1-1805-4305-a6a0-bfd7ee5e67ef-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 08:26:53 crc kubenswrapper[4895]: I1202 08:26:53.064389 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j52p9\" (UniqueName: \"kubernetes.io/projected/85bd2ce1-1805-4305-a6a0-bfd7ee5e67ef-kube-api-access-j52p9\") on node \"crc\" DevicePath \"\"" Dec 02 08:26:53 crc kubenswrapper[4895]: I1202 08:26:53.076516 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85bd2ce1-1805-4305-a6a0-bfd7ee5e67ef-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "85bd2ce1-1805-4305-a6a0-bfd7ee5e67ef" (UID: "85bd2ce1-1805-4305-a6a0-bfd7ee5e67ef"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:26:53 crc kubenswrapper[4895]: I1202 08:26:53.141305 4895 scope.go:117] "RemoveContainer" containerID="85cc12b4c8ab8b07033c83c08e99e7c6dfb03217541e445f2b564ba95f1d74c3" Dec 02 08:26:53 crc kubenswrapper[4895]: E1202 08:26:53.141525 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 08:26:53 crc kubenswrapper[4895]: I1202 08:26:53.166268 4895 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85bd2ce1-1805-4305-a6a0-bfd7ee5e67ef-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 08:26:53 crc kubenswrapper[4895]: I1202 08:26:53.612267 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5scxs" event={"ID":"85bd2ce1-1805-4305-a6a0-bfd7ee5e67ef","Type":"ContainerDied","Data":"1b6ecebcc01f63f0cac7120fc943f2e96eede78ecd9ca5dffa4e44186ebe6516"} Dec 02 08:26:53 crc kubenswrapper[4895]: I1202 08:26:53.612340 4895 scope.go:117] "RemoveContainer" containerID="5c1abfc17b37d586013a98f87caa588e0e5fa0f5af69413f77e148015261f609" Dec 02 08:26:53 crc kubenswrapper[4895]: I1202 08:26:53.612400 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5scxs" Dec 02 08:26:53 crc kubenswrapper[4895]: I1202 08:26:53.644279 4895 scope.go:117] "RemoveContainer" containerID="3d4b3f63c43ebf495488e6955f5710ccff5c7c4fed14355e25b591adc8df0d86" Dec 02 08:26:53 crc kubenswrapper[4895]: I1202 08:26:53.651943 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5scxs"] Dec 02 08:26:53 crc kubenswrapper[4895]: I1202 08:26:53.659994 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-5scxs"] Dec 02 08:26:53 crc kubenswrapper[4895]: I1202 08:26:53.668370 4895 scope.go:117] "RemoveContainer" containerID="8e1d2509bffbd9aa580eb3dba25d892348fa4130f3f6af64ef181c797769cc96" Dec 02 08:26:55 crc kubenswrapper[4895]: I1202 08:26:55.155020 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85bd2ce1-1805-4305-a6a0-bfd7ee5e67ef" path="/var/lib/kubelet/pods/85bd2ce1-1805-4305-a6a0-bfd7ee5e67ef/volumes" Dec 02 08:27:07 crc kubenswrapper[4895]: I1202 08:27:07.140860 4895 scope.go:117] "RemoveContainer" containerID="85cc12b4c8ab8b07033c83c08e99e7c6dfb03217541e445f2b564ba95f1d74c3" Dec 02 08:27:07 crc kubenswrapper[4895]: E1202 08:27:07.143522 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 08:27:19 crc kubenswrapper[4895]: I1202 08:27:19.145267 4895 scope.go:117] "RemoveContainer" containerID="85cc12b4c8ab8b07033c83c08e99e7c6dfb03217541e445f2b564ba95f1d74c3" Dec 02 08:27:19 crc kubenswrapper[4895]: E1202 08:27:19.145972 4895 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 08:27:33 crc kubenswrapper[4895]: I1202 08:27:33.298345 4895 scope.go:117] "RemoveContainer" containerID="85cc12b4c8ab8b07033c83c08e99e7c6dfb03217541e445f2b564ba95f1d74c3" Dec 02 08:27:33 crc kubenswrapper[4895]: E1202 08:27:33.299180 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 08:27:44 crc kubenswrapper[4895]: I1202 08:27:44.141452 4895 scope.go:117] "RemoveContainer" containerID="85cc12b4c8ab8b07033c83c08e99e7c6dfb03217541e445f2b564ba95f1d74c3" Dec 02 08:27:44 crc kubenswrapper[4895]: E1202 08:27:44.142105 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 08:27:56 crc kubenswrapper[4895]: I1202 08:27:56.141951 4895 scope.go:117] "RemoveContainer" containerID="85cc12b4c8ab8b07033c83c08e99e7c6dfb03217541e445f2b564ba95f1d74c3" Dec 02 08:27:56 crc kubenswrapper[4895]: E1202 08:27:56.142758 4895 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 08:28:11 crc kubenswrapper[4895]: I1202 08:28:11.140824 4895 scope.go:117] "RemoveContainer" containerID="85cc12b4c8ab8b07033c83c08e99e7c6dfb03217541e445f2b564ba95f1d74c3" Dec 02 08:28:11 crc kubenswrapper[4895]: E1202 08:28:11.141574 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 08:28:23 crc kubenswrapper[4895]: I1202 08:28:23.141770 4895 scope.go:117] "RemoveContainer" containerID="85cc12b4c8ab8b07033c83c08e99e7c6dfb03217541e445f2b564ba95f1d74c3" Dec 02 08:28:23 crc kubenswrapper[4895]: E1202 08:28:23.142580 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 08:28:37 crc kubenswrapper[4895]: I1202 08:28:37.141140 4895 scope.go:117] "RemoveContainer" containerID="85cc12b4c8ab8b07033c83c08e99e7c6dfb03217541e445f2b564ba95f1d74c3" Dec 02 08:28:37 crc kubenswrapper[4895]: I1202 08:28:37.802980 4895 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" event={"ID":"0468d2d1-a975-45a6-af9f-47adc432fab0","Type":"ContainerStarted","Data":"f0330135d16cf0ecef762657edb6a8c1cd122703e741ebf6a219c8d1a213c648"} Dec 02 08:28:50 crc kubenswrapper[4895]: I1202 08:28:50.958101 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-nh7ts"] Dec 02 08:28:50 crc kubenswrapper[4895]: E1202 08:28:50.958916 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85bd2ce1-1805-4305-a6a0-bfd7ee5e67ef" containerName="extract-content" Dec 02 08:28:50 crc kubenswrapper[4895]: I1202 08:28:50.958928 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="85bd2ce1-1805-4305-a6a0-bfd7ee5e67ef" containerName="extract-content" Dec 02 08:28:50 crc kubenswrapper[4895]: E1202 08:28:50.958949 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85bd2ce1-1805-4305-a6a0-bfd7ee5e67ef" containerName="extract-utilities" Dec 02 08:28:50 crc kubenswrapper[4895]: I1202 08:28:50.958955 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="85bd2ce1-1805-4305-a6a0-bfd7ee5e67ef" containerName="extract-utilities" Dec 02 08:28:50 crc kubenswrapper[4895]: E1202 08:28:50.958975 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85bd2ce1-1805-4305-a6a0-bfd7ee5e67ef" containerName="registry-server" Dec 02 08:28:50 crc kubenswrapper[4895]: I1202 08:28:50.958981 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="85bd2ce1-1805-4305-a6a0-bfd7ee5e67ef" containerName="registry-server" Dec 02 08:28:50 crc kubenswrapper[4895]: I1202 08:28:50.959130 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="85bd2ce1-1805-4305-a6a0-bfd7ee5e67ef" containerName="registry-server" Dec 02 08:28:50 crc kubenswrapper[4895]: I1202 08:28:50.960262 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nh7ts" Dec 02 08:28:50 crc kubenswrapper[4895]: I1202 08:28:50.978889 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nh7ts"] Dec 02 08:28:50 crc kubenswrapper[4895]: I1202 08:28:50.983122 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxgc6\" (UniqueName: \"kubernetes.io/projected/ee83c563-125d-4bc0-92a7-bba42d9c5d27-kube-api-access-gxgc6\") pod \"redhat-marketplace-nh7ts\" (UID: \"ee83c563-125d-4bc0-92a7-bba42d9c5d27\") " pod="openshift-marketplace/redhat-marketplace-nh7ts" Dec 02 08:28:50 crc kubenswrapper[4895]: I1202 08:28:50.985886 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee83c563-125d-4bc0-92a7-bba42d9c5d27-catalog-content\") pod \"redhat-marketplace-nh7ts\" (UID: \"ee83c563-125d-4bc0-92a7-bba42d9c5d27\") " pod="openshift-marketplace/redhat-marketplace-nh7ts" Dec 02 08:28:50 crc kubenswrapper[4895]: I1202 08:28:50.986002 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee83c563-125d-4bc0-92a7-bba42d9c5d27-utilities\") pod \"redhat-marketplace-nh7ts\" (UID: \"ee83c563-125d-4bc0-92a7-bba42d9c5d27\") " pod="openshift-marketplace/redhat-marketplace-nh7ts" Dec 02 08:28:51 crc kubenswrapper[4895]: I1202 08:28:51.087277 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxgc6\" (UniqueName: \"kubernetes.io/projected/ee83c563-125d-4bc0-92a7-bba42d9c5d27-kube-api-access-gxgc6\") pod \"redhat-marketplace-nh7ts\" (UID: \"ee83c563-125d-4bc0-92a7-bba42d9c5d27\") " pod="openshift-marketplace/redhat-marketplace-nh7ts" Dec 02 08:28:51 crc kubenswrapper[4895]: I1202 08:28:51.087332 4895 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee83c563-125d-4bc0-92a7-bba42d9c5d27-catalog-content\") pod \"redhat-marketplace-nh7ts\" (UID: \"ee83c563-125d-4bc0-92a7-bba42d9c5d27\") " pod="openshift-marketplace/redhat-marketplace-nh7ts" Dec 02 08:28:51 crc kubenswrapper[4895]: I1202 08:28:51.087363 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee83c563-125d-4bc0-92a7-bba42d9c5d27-utilities\") pod \"redhat-marketplace-nh7ts\" (UID: \"ee83c563-125d-4bc0-92a7-bba42d9c5d27\") " pod="openshift-marketplace/redhat-marketplace-nh7ts" Dec 02 08:28:51 crc kubenswrapper[4895]: I1202 08:28:51.087922 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee83c563-125d-4bc0-92a7-bba42d9c5d27-catalog-content\") pod \"redhat-marketplace-nh7ts\" (UID: \"ee83c563-125d-4bc0-92a7-bba42d9c5d27\") " pod="openshift-marketplace/redhat-marketplace-nh7ts" Dec 02 08:28:51 crc kubenswrapper[4895]: I1202 08:28:51.087952 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee83c563-125d-4bc0-92a7-bba42d9c5d27-utilities\") pod \"redhat-marketplace-nh7ts\" (UID: \"ee83c563-125d-4bc0-92a7-bba42d9c5d27\") " pod="openshift-marketplace/redhat-marketplace-nh7ts" Dec 02 08:28:51 crc kubenswrapper[4895]: I1202 08:28:51.106039 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxgc6\" (UniqueName: \"kubernetes.io/projected/ee83c563-125d-4bc0-92a7-bba42d9c5d27-kube-api-access-gxgc6\") pod \"redhat-marketplace-nh7ts\" (UID: \"ee83c563-125d-4bc0-92a7-bba42d9c5d27\") " pod="openshift-marketplace/redhat-marketplace-nh7ts" Dec 02 08:28:51 crc kubenswrapper[4895]: I1202 08:28:51.279002 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nh7ts" Dec 02 08:28:51 crc kubenswrapper[4895]: I1202 08:28:51.760898 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nh7ts"] Dec 02 08:28:51 crc kubenswrapper[4895]: I1202 08:28:51.905800 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nh7ts" event={"ID":"ee83c563-125d-4bc0-92a7-bba42d9c5d27","Type":"ContainerStarted","Data":"e68b4e5942ebf514b434960ac4051dc343f83cae8ac050ddcdb750a5b6ebdec3"} Dec 02 08:28:52 crc kubenswrapper[4895]: I1202 08:28:52.352483 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-kfbmk"] Dec 02 08:28:52 crc kubenswrapper[4895]: I1202 08:28:52.354257 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kfbmk" Dec 02 08:28:52 crc kubenswrapper[4895]: I1202 08:28:52.364779 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kfbmk"] Dec 02 08:28:52 crc kubenswrapper[4895]: I1202 08:28:52.404206 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1525c8c-d2f3-4625-8269-a31d80124239-catalog-content\") pod \"community-operators-kfbmk\" (UID: \"f1525c8c-d2f3-4625-8269-a31d80124239\") " pod="openshift-marketplace/community-operators-kfbmk" Dec 02 08:28:52 crc kubenswrapper[4895]: I1202 08:28:52.404269 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8g7w\" (UniqueName: \"kubernetes.io/projected/f1525c8c-d2f3-4625-8269-a31d80124239-kube-api-access-c8g7w\") pod \"community-operators-kfbmk\" (UID: \"f1525c8c-d2f3-4625-8269-a31d80124239\") " pod="openshift-marketplace/community-operators-kfbmk" Dec 02 08:28:52 crc kubenswrapper[4895]: I1202 
08:28:52.404417 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1525c8c-d2f3-4625-8269-a31d80124239-utilities\") pod \"community-operators-kfbmk\" (UID: \"f1525c8c-d2f3-4625-8269-a31d80124239\") " pod="openshift-marketplace/community-operators-kfbmk" Dec 02 08:28:52 crc kubenswrapper[4895]: I1202 08:28:52.506003 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1525c8c-d2f3-4625-8269-a31d80124239-utilities\") pod \"community-operators-kfbmk\" (UID: \"f1525c8c-d2f3-4625-8269-a31d80124239\") " pod="openshift-marketplace/community-operators-kfbmk" Dec 02 08:28:52 crc kubenswrapper[4895]: I1202 08:28:52.506169 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1525c8c-d2f3-4625-8269-a31d80124239-catalog-content\") pod \"community-operators-kfbmk\" (UID: \"f1525c8c-d2f3-4625-8269-a31d80124239\") " pod="openshift-marketplace/community-operators-kfbmk" Dec 02 08:28:52 crc kubenswrapper[4895]: I1202 08:28:52.506232 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8g7w\" (UniqueName: \"kubernetes.io/projected/f1525c8c-d2f3-4625-8269-a31d80124239-kube-api-access-c8g7w\") pod \"community-operators-kfbmk\" (UID: \"f1525c8c-d2f3-4625-8269-a31d80124239\") " pod="openshift-marketplace/community-operators-kfbmk" Dec 02 08:28:52 crc kubenswrapper[4895]: I1202 08:28:52.506670 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1525c8c-d2f3-4625-8269-a31d80124239-utilities\") pod \"community-operators-kfbmk\" (UID: \"f1525c8c-d2f3-4625-8269-a31d80124239\") " pod="openshift-marketplace/community-operators-kfbmk" Dec 02 08:28:52 crc kubenswrapper[4895]: I1202 08:28:52.506694 4895 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1525c8c-d2f3-4625-8269-a31d80124239-catalog-content\") pod \"community-operators-kfbmk\" (UID: \"f1525c8c-d2f3-4625-8269-a31d80124239\") " pod="openshift-marketplace/community-operators-kfbmk" Dec 02 08:28:52 crc kubenswrapper[4895]: I1202 08:28:52.526768 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8g7w\" (UniqueName: \"kubernetes.io/projected/f1525c8c-d2f3-4625-8269-a31d80124239-kube-api-access-c8g7w\") pod \"community-operators-kfbmk\" (UID: \"f1525c8c-d2f3-4625-8269-a31d80124239\") " pod="openshift-marketplace/community-operators-kfbmk" Dec 02 08:28:52 crc kubenswrapper[4895]: I1202 08:28:52.684300 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kfbmk" Dec 02 08:28:52 crc kubenswrapper[4895]: I1202 08:28:52.944177 4895 generic.go:334] "Generic (PLEG): container finished" podID="ee83c563-125d-4bc0-92a7-bba42d9c5d27" containerID="06e01c7971bde57b5447ebdae349e39a15c5f817847e8c4777698b628537c18d" exitCode=0 Dec 02 08:28:52 crc kubenswrapper[4895]: I1202 08:28:52.944222 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nh7ts" event={"ID":"ee83c563-125d-4bc0-92a7-bba42d9c5d27","Type":"ContainerDied","Data":"06e01c7971bde57b5447ebdae349e39a15c5f817847e8c4777698b628537c18d"} Dec 02 08:28:53 crc kubenswrapper[4895]: I1202 08:28:53.288065 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kfbmk"] Dec 02 08:28:53 crc kubenswrapper[4895]: I1202 08:28:53.951762 4895 generic.go:334] "Generic (PLEG): container finished" podID="f1525c8c-d2f3-4625-8269-a31d80124239" containerID="b4c610691667e51ccf2fc72b9a49d1c99fb37e2beb374451bd94b23f2c8153e2" exitCode=0 Dec 02 08:28:53 crc kubenswrapper[4895]: I1202 08:28:53.951830 4895 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kfbmk" event={"ID":"f1525c8c-d2f3-4625-8269-a31d80124239","Type":"ContainerDied","Data":"b4c610691667e51ccf2fc72b9a49d1c99fb37e2beb374451bd94b23f2c8153e2"} Dec 02 08:28:53 crc kubenswrapper[4895]: I1202 08:28:53.951911 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kfbmk" event={"ID":"f1525c8c-d2f3-4625-8269-a31d80124239","Type":"ContainerStarted","Data":"c7d9650522261bd8cc3534aa6fc81b6c47054b20997d0cc5482ef17e8f7ccb54"} Dec 02 08:28:54 crc kubenswrapper[4895]: I1202 08:28:54.975874 4895 generic.go:334] "Generic (PLEG): container finished" podID="ee83c563-125d-4bc0-92a7-bba42d9c5d27" containerID="f6358108d422ee2c758b936eea97faab3ae5516075ed8d31e4445e4e41b109f8" exitCode=0 Dec 02 08:28:54 crc kubenswrapper[4895]: I1202 08:28:54.975888 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nh7ts" event={"ID":"ee83c563-125d-4bc0-92a7-bba42d9c5d27","Type":"ContainerDied","Data":"f6358108d422ee2c758b936eea97faab3ae5516075ed8d31e4445e4e41b109f8"} Dec 02 08:28:55 crc kubenswrapper[4895]: I1202 08:28:55.986275 4895 generic.go:334] "Generic (PLEG): container finished" podID="f1525c8c-d2f3-4625-8269-a31d80124239" containerID="63cdb3f095d618a78c4fcdca3f4ad987f3d5f28ee755a69d2620ad5b59c73c70" exitCode=0 Dec 02 08:28:55 crc kubenswrapper[4895]: I1202 08:28:55.986405 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kfbmk" event={"ID":"f1525c8c-d2f3-4625-8269-a31d80124239","Type":"ContainerDied","Data":"63cdb3f095d618a78c4fcdca3f4ad987f3d5f28ee755a69d2620ad5b59c73c70"} Dec 02 08:28:55 crc kubenswrapper[4895]: I1202 08:28:55.990650 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nh7ts" 
event={"ID":"ee83c563-125d-4bc0-92a7-bba42d9c5d27","Type":"ContainerStarted","Data":"60aa41cb1bb951acd1cd48977f24db9709539bbd2ee611bc536b30148d225932"} Dec 02 08:28:56 crc kubenswrapper[4895]: I1202 08:28:56.028871 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-nh7ts" podStartSLOduration=3.467081231 podStartE2EDuration="6.028849745s" podCreationTimestamp="2025-12-02 08:28:50 +0000 UTC" firstStartedPulling="2025-12-02 08:28:52.95119889 +0000 UTC m=+3944.122058503" lastFinishedPulling="2025-12-02 08:28:55.512967394 +0000 UTC m=+3946.683827017" observedRunningTime="2025-12-02 08:28:56.027187084 +0000 UTC m=+3947.198046697" watchObservedRunningTime="2025-12-02 08:28:56.028849745 +0000 UTC m=+3947.199709368" Dec 02 08:28:57 crc kubenswrapper[4895]: I1202 08:28:56.999685 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kfbmk" event={"ID":"f1525c8c-d2f3-4625-8269-a31d80124239","Type":"ContainerStarted","Data":"f2237d278f418011ed3418a9f74311dc21f1d3b8a6e0f324a7366127ec26bdfe"} Dec 02 08:28:57 crc kubenswrapper[4895]: I1202 08:28:57.019933 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-kfbmk" podStartSLOduration=2.325466459 podStartE2EDuration="5.019911583s" podCreationTimestamp="2025-12-02 08:28:52 +0000 UTC" firstStartedPulling="2025-12-02 08:28:53.959257827 +0000 UTC m=+3945.130117440" lastFinishedPulling="2025-12-02 08:28:56.653702951 +0000 UTC m=+3947.824562564" observedRunningTime="2025-12-02 08:28:57.016519838 +0000 UTC m=+3948.187379451" watchObservedRunningTime="2025-12-02 08:28:57.019911583 +0000 UTC m=+3948.190771206" Dec 02 08:29:01 crc kubenswrapper[4895]: I1202 08:29:01.280165 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-nh7ts" Dec 02 08:29:01 crc kubenswrapper[4895]: I1202 08:29:01.280662 
4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-nh7ts" Dec 02 08:29:01 crc kubenswrapper[4895]: I1202 08:29:01.349378 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-nh7ts" Dec 02 08:29:02 crc kubenswrapper[4895]: I1202 08:29:02.084268 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-nh7ts" Dec 02 08:29:02 crc kubenswrapper[4895]: I1202 08:29:02.134551 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nh7ts"] Dec 02 08:29:02 crc kubenswrapper[4895]: I1202 08:29:02.685292 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-kfbmk" Dec 02 08:29:02 crc kubenswrapper[4895]: I1202 08:29:02.685365 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-kfbmk" Dec 02 08:29:02 crc kubenswrapper[4895]: I1202 08:29:02.748134 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-kfbmk" Dec 02 08:29:03 crc kubenswrapper[4895]: I1202 08:29:03.097710 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-kfbmk" Dec 02 08:29:04 crc kubenswrapper[4895]: I1202 08:29:04.060242 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-nh7ts" podUID="ee83c563-125d-4bc0-92a7-bba42d9c5d27" containerName="registry-server" containerID="cri-o://60aa41cb1bb951acd1cd48977f24db9709539bbd2ee611bc536b30148d225932" gracePeriod=2 Dec 02 08:29:04 crc kubenswrapper[4895]: I1202 08:29:04.345805 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kfbmk"] Dec 02 08:29:04 crc 
kubenswrapper[4895]: I1202 08:29:04.568783 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nh7ts" Dec 02 08:29:04 crc kubenswrapper[4895]: I1202 08:29:04.725714 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee83c563-125d-4bc0-92a7-bba42d9c5d27-catalog-content\") pod \"ee83c563-125d-4bc0-92a7-bba42d9c5d27\" (UID: \"ee83c563-125d-4bc0-92a7-bba42d9c5d27\") " Dec 02 08:29:04 crc kubenswrapper[4895]: I1202 08:29:04.725788 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gxgc6\" (UniqueName: \"kubernetes.io/projected/ee83c563-125d-4bc0-92a7-bba42d9c5d27-kube-api-access-gxgc6\") pod \"ee83c563-125d-4bc0-92a7-bba42d9c5d27\" (UID: \"ee83c563-125d-4bc0-92a7-bba42d9c5d27\") " Dec 02 08:29:04 crc kubenswrapper[4895]: I1202 08:29:04.725855 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee83c563-125d-4bc0-92a7-bba42d9c5d27-utilities\") pod \"ee83c563-125d-4bc0-92a7-bba42d9c5d27\" (UID: \"ee83c563-125d-4bc0-92a7-bba42d9c5d27\") " Dec 02 08:29:04 crc kubenswrapper[4895]: I1202 08:29:04.726754 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee83c563-125d-4bc0-92a7-bba42d9c5d27-utilities" (OuterVolumeSpecName: "utilities") pod "ee83c563-125d-4bc0-92a7-bba42d9c5d27" (UID: "ee83c563-125d-4bc0-92a7-bba42d9c5d27"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:29:04 crc kubenswrapper[4895]: I1202 08:29:04.733378 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee83c563-125d-4bc0-92a7-bba42d9c5d27-kube-api-access-gxgc6" (OuterVolumeSpecName: "kube-api-access-gxgc6") pod "ee83c563-125d-4bc0-92a7-bba42d9c5d27" (UID: "ee83c563-125d-4bc0-92a7-bba42d9c5d27"). InnerVolumeSpecName "kube-api-access-gxgc6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:29:04 crc kubenswrapper[4895]: I1202 08:29:04.744862 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee83c563-125d-4bc0-92a7-bba42d9c5d27-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ee83c563-125d-4bc0-92a7-bba42d9c5d27" (UID: "ee83c563-125d-4bc0-92a7-bba42d9c5d27"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:29:04 crc kubenswrapper[4895]: I1202 08:29:04.836455 4895 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee83c563-125d-4bc0-92a7-bba42d9c5d27-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 08:29:04 crc kubenswrapper[4895]: I1202 08:29:04.836690 4895 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee83c563-125d-4bc0-92a7-bba42d9c5d27-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 08:29:04 crc kubenswrapper[4895]: I1202 08:29:04.836804 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gxgc6\" (UniqueName: \"kubernetes.io/projected/ee83c563-125d-4bc0-92a7-bba42d9c5d27-kube-api-access-gxgc6\") on node \"crc\" DevicePath \"\"" Dec 02 08:29:05 crc kubenswrapper[4895]: I1202 08:29:05.071340 4895 generic.go:334] "Generic (PLEG): container finished" podID="ee83c563-125d-4bc0-92a7-bba42d9c5d27" 
containerID="60aa41cb1bb951acd1cd48977f24db9709539bbd2ee611bc536b30148d225932" exitCode=0 Dec 02 08:29:05 crc kubenswrapper[4895]: I1202 08:29:05.071447 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nh7ts" Dec 02 08:29:05 crc kubenswrapper[4895]: I1202 08:29:05.071505 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nh7ts" event={"ID":"ee83c563-125d-4bc0-92a7-bba42d9c5d27","Type":"ContainerDied","Data":"60aa41cb1bb951acd1cd48977f24db9709539bbd2ee611bc536b30148d225932"} Dec 02 08:29:05 crc kubenswrapper[4895]: I1202 08:29:05.071544 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nh7ts" event={"ID":"ee83c563-125d-4bc0-92a7-bba42d9c5d27","Type":"ContainerDied","Data":"e68b4e5942ebf514b434960ac4051dc343f83cae8ac050ddcdb750a5b6ebdec3"} Dec 02 08:29:05 crc kubenswrapper[4895]: I1202 08:29:05.071549 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-kfbmk" podUID="f1525c8c-d2f3-4625-8269-a31d80124239" containerName="registry-server" containerID="cri-o://f2237d278f418011ed3418a9f74311dc21f1d3b8a6e0f324a7366127ec26bdfe" gracePeriod=2 Dec 02 08:29:05 crc kubenswrapper[4895]: I1202 08:29:05.071573 4895 scope.go:117] "RemoveContainer" containerID="60aa41cb1bb951acd1cd48977f24db9709539bbd2ee611bc536b30148d225932" Dec 02 08:29:05 crc kubenswrapper[4895]: I1202 08:29:05.102707 4895 scope.go:117] "RemoveContainer" containerID="f6358108d422ee2c758b936eea97faab3ae5516075ed8d31e4445e4e41b109f8" Dec 02 08:29:05 crc kubenswrapper[4895]: I1202 08:29:05.116193 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nh7ts"] Dec 02 08:29:05 crc kubenswrapper[4895]: I1202 08:29:05.123462 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-nh7ts"] Dec 02 08:29:05 crc kubenswrapper[4895]: I1202 08:29:05.126489 4895 scope.go:117] "RemoveContainer" containerID="06e01c7971bde57b5447ebdae349e39a15c5f817847e8c4777698b628537c18d" Dec 02 08:29:05 crc kubenswrapper[4895]: I1202 08:29:05.152009 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee83c563-125d-4bc0-92a7-bba42d9c5d27" path="/var/lib/kubelet/pods/ee83c563-125d-4bc0-92a7-bba42d9c5d27/volumes" Dec 02 08:29:05 crc kubenswrapper[4895]: I1202 08:29:05.158595 4895 scope.go:117] "RemoveContainer" containerID="60aa41cb1bb951acd1cd48977f24db9709539bbd2ee611bc536b30148d225932" Dec 02 08:29:05 crc kubenswrapper[4895]: E1202 08:29:05.159016 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60aa41cb1bb951acd1cd48977f24db9709539bbd2ee611bc536b30148d225932\": container with ID starting with 60aa41cb1bb951acd1cd48977f24db9709539bbd2ee611bc536b30148d225932 not found: ID does not exist" containerID="60aa41cb1bb951acd1cd48977f24db9709539bbd2ee611bc536b30148d225932" Dec 02 08:29:05 crc kubenswrapper[4895]: I1202 08:29:05.159157 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60aa41cb1bb951acd1cd48977f24db9709539bbd2ee611bc536b30148d225932"} err="failed to get container status \"60aa41cb1bb951acd1cd48977f24db9709539bbd2ee611bc536b30148d225932\": rpc error: code = NotFound desc = could not find container \"60aa41cb1bb951acd1cd48977f24db9709539bbd2ee611bc536b30148d225932\": container with ID starting with 60aa41cb1bb951acd1cd48977f24db9709539bbd2ee611bc536b30148d225932 not found: ID does not exist" Dec 02 08:29:05 crc kubenswrapper[4895]: I1202 08:29:05.159285 4895 scope.go:117] "RemoveContainer" containerID="f6358108d422ee2c758b936eea97faab3ae5516075ed8d31e4445e4e41b109f8" Dec 02 08:29:05 crc kubenswrapper[4895]: E1202 08:29:05.159898 4895 log.go:32] "ContainerStatus 
from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6358108d422ee2c758b936eea97faab3ae5516075ed8d31e4445e4e41b109f8\": container with ID starting with f6358108d422ee2c758b936eea97faab3ae5516075ed8d31e4445e4e41b109f8 not found: ID does not exist" containerID="f6358108d422ee2c758b936eea97faab3ae5516075ed8d31e4445e4e41b109f8" Dec 02 08:29:05 crc kubenswrapper[4895]: I1202 08:29:05.159997 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6358108d422ee2c758b936eea97faab3ae5516075ed8d31e4445e4e41b109f8"} err="failed to get container status \"f6358108d422ee2c758b936eea97faab3ae5516075ed8d31e4445e4e41b109f8\": rpc error: code = NotFound desc = could not find container \"f6358108d422ee2c758b936eea97faab3ae5516075ed8d31e4445e4e41b109f8\": container with ID starting with f6358108d422ee2c758b936eea97faab3ae5516075ed8d31e4445e4e41b109f8 not found: ID does not exist" Dec 02 08:29:05 crc kubenswrapper[4895]: I1202 08:29:05.160063 4895 scope.go:117] "RemoveContainer" containerID="06e01c7971bde57b5447ebdae349e39a15c5f817847e8c4777698b628537c18d" Dec 02 08:29:05 crc kubenswrapper[4895]: E1202 08:29:05.160832 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06e01c7971bde57b5447ebdae349e39a15c5f817847e8c4777698b628537c18d\": container with ID starting with 06e01c7971bde57b5447ebdae349e39a15c5f817847e8c4777698b628537c18d not found: ID does not exist" containerID="06e01c7971bde57b5447ebdae349e39a15c5f817847e8c4777698b628537c18d" Dec 02 08:29:05 crc kubenswrapper[4895]: I1202 08:29:05.160863 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06e01c7971bde57b5447ebdae349e39a15c5f817847e8c4777698b628537c18d"} err="failed to get container status \"06e01c7971bde57b5447ebdae349e39a15c5f817847e8c4777698b628537c18d\": rpc error: code = NotFound desc = could not find 
container \"06e01c7971bde57b5447ebdae349e39a15c5f817847e8c4777698b628537c18d\": container with ID starting with 06e01c7971bde57b5447ebdae349e39a15c5f817847e8c4777698b628537c18d not found: ID does not exist" Dec 02 08:29:05 crc kubenswrapper[4895]: I1202 08:29:05.441535 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kfbmk" Dec 02 08:29:05 crc kubenswrapper[4895]: I1202 08:29:05.544179 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1525c8c-d2f3-4625-8269-a31d80124239-utilities\") pod \"f1525c8c-d2f3-4625-8269-a31d80124239\" (UID: \"f1525c8c-d2f3-4625-8269-a31d80124239\") " Dec 02 08:29:05 crc kubenswrapper[4895]: I1202 08:29:05.544531 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c8g7w\" (UniqueName: \"kubernetes.io/projected/f1525c8c-d2f3-4625-8269-a31d80124239-kube-api-access-c8g7w\") pod \"f1525c8c-d2f3-4625-8269-a31d80124239\" (UID: \"f1525c8c-d2f3-4625-8269-a31d80124239\") " Dec 02 08:29:05 crc kubenswrapper[4895]: I1202 08:29:05.544703 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1525c8c-d2f3-4625-8269-a31d80124239-catalog-content\") pod \"f1525c8c-d2f3-4625-8269-a31d80124239\" (UID: \"f1525c8c-d2f3-4625-8269-a31d80124239\") " Dec 02 08:29:05 crc kubenswrapper[4895]: I1202 08:29:05.545498 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1525c8c-d2f3-4625-8269-a31d80124239-utilities" (OuterVolumeSpecName: "utilities") pod "f1525c8c-d2f3-4625-8269-a31d80124239" (UID: "f1525c8c-d2f3-4625-8269-a31d80124239"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:29:05 crc kubenswrapper[4895]: I1202 08:29:05.550566 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1525c8c-d2f3-4625-8269-a31d80124239-kube-api-access-c8g7w" (OuterVolumeSpecName: "kube-api-access-c8g7w") pod "f1525c8c-d2f3-4625-8269-a31d80124239" (UID: "f1525c8c-d2f3-4625-8269-a31d80124239"). InnerVolumeSpecName "kube-api-access-c8g7w". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:29:05 crc kubenswrapper[4895]: I1202 08:29:05.646812 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c8g7w\" (UniqueName: \"kubernetes.io/projected/f1525c8c-d2f3-4625-8269-a31d80124239-kube-api-access-c8g7w\") on node \"crc\" DevicePath \"\"" Dec 02 08:29:05 crc kubenswrapper[4895]: I1202 08:29:05.646847 4895 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1525c8c-d2f3-4625-8269-a31d80124239-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 08:29:05 crc kubenswrapper[4895]: I1202 08:29:05.790105 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1525c8c-d2f3-4625-8269-a31d80124239-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f1525c8c-d2f3-4625-8269-a31d80124239" (UID: "f1525c8c-d2f3-4625-8269-a31d80124239"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:29:05 crc kubenswrapper[4895]: I1202 08:29:05.849174 4895 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1525c8c-d2f3-4625-8269-a31d80124239-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 08:29:06 crc kubenswrapper[4895]: I1202 08:29:06.081757 4895 generic.go:334] "Generic (PLEG): container finished" podID="f1525c8c-d2f3-4625-8269-a31d80124239" containerID="f2237d278f418011ed3418a9f74311dc21f1d3b8a6e0f324a7366127ec26bdfe" exitCode=0 Dec 02 08:29:06 crc kubenswrapper[4895]: I1202 08:29:06.081815 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kfbmk" Dec 02 08:29:06 crc kubenswrapper[4895]: I1202 08:29:06.081810 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kfbmk" event={"ID":"f1525c8c-d2f3-4625-8269-a31d80124239","Type":"ContainerDied","Data":"f2237d278f418011ed3418a9f74311dc21f1d3b8a6e0f324a7366127ec26bdfe"} Dec 02 08:29:06 crc kubenswrapper[4895]: I1202 08:29:06.081898 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kfbmk" event={"ID":"f1525c8c-d2f3-4625-8269-a31d80124239","Type":"ContainerDied","Data":"c7d9650522261bd8cc3534aa6fc81b6c47054b20997d0cc5482ef17e8f7ccb54"} Dec 02 08:29:06 crc kubenswrapper[4895]: I1202 08:29:06.081933 4895 scope.go:117] "RemoveContainer" containerID="f2237d278f418011ed3418a9f74311dc21f1d3b8a6e0f324a7366127ec26bdfe" Dec 02 08:29:06 crc kubenswrapper[4895]: I1202 08:29:06.107117 4895 scope.go:117] "RemoveContainer" containerID="63cdb3f095d618a78c4fcdca3f4ad987f3d5f28ee755a69d2620ad5b59c73c70" Dec 02 08:29:06 crc kubenswrapper[4895]: I1202 08:29:06.115292 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kfbmk"] Dec 02 08:29:06 crc kubenswrapper[4895]: 
I1202 08:29:06.121963 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-kfbmk"] Dec 02 08:29:06 crc kubenswrapper[4895]: I1202 08:29:06.145681 4895 scope.go:117] "RemoveContainer" containerID="b4c610691667e51ccf2fc72b9a49d1c99fb37e2beb374451bd94b23f2c8153e2" Dec 02 08:29:06 crc kubenswrapper[4895]: I1202 08:29:06.169517 4895 scope.go:117] "RemoveContainer" containerID="f2237d278f418011ed3418a9f74311dc21f1d3b8a6e0f324a7366127ec26bdfe" Dec 02 08:29:06 crc kubenswrapper[4895]: E1202 08:29:06.169928 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2237d278f418011ed3418a9f74311dc21f1d3b8a6e0f324a7366127ec26bdfe\": container with ID starting with f2237d278f418011ed3418a9f74311dc21f1d3b8a6e0f324a7366127ec26bdfe not found: ID does not exist" containerID="f2237d278f418011ed3418a9f74311dc21f1d3b8a6e0f324a7366127ec26bdfe" Dec 02 08:29:06 crc kubenswrapper[4895]: I1202 08:29:06.169956 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2237d278f418011ed3418a9f74311dc21f1d3b8a6e0f324a7366127ec26bdfe"} err="failed to get container status \"f2237d278f418011ed3418a9f74311dc21f1d3b8a6e0f324a7366127ec26bdfe\": rpc error: code = NotFound desc = could not find container \"f2237d278f418011ed3418a9f74311dc21f1d3b8a6e0f324a7366127ec26bdfe\": container with ID starting with f2237d278f418011ed3418a9f74311dc21f1d3b8a6e0f324a7366127ec26bdfe not found: ID does not exist" Dec 02 08:29:06 crc kubenswrapper[4895]: I1202 08:29:06.169974 4895 scope.go:117] "RemoveContainer" containerID="63cdb3f095d618a78c4fcdca3f4ad987f3d5f28ee755a69d2620ad5b59c73c70" Dec 02 08:29:06 crc kubenswrapper[4895]: E1202 08:29:06.170154 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63cdb3f095d618a78c4fcdca3f4ad987f3d5f28ee755a69d2620ad5b59c73c70\": container 
with ID starting with 63cdb3f095d618a78c4fcdca3f4ad987f3d5f28ee755a69d2620ad5b59c73c70 not found: ID does not exist" containerID="63cdb3f095d618a78c4fcdca3f4ad987f3d5f28ee755a69d2620ad5b59c73c70" Dec 02 08:29:06 crc kubenswrapper[4895]: I1202 08:29:06.170173 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63cdb3f095d618a78c4fcdca3f4ad987f3d5f28ee755a69d2620ad5b59c73c70"} err="failed to get container status \"63cdb3f095d618a78c4fcdca3f4ad987f3d5f28ee755a69d2620ad5b59c73c70\": rpc error: code = NotFound desc = could not find container \"63cdb3f095d618a78c4fcdca3f4ad987f3d5f28ee755a69d2620ad5b59c73c70\": container with ID starting with 63cdb3f095d618a78c4fcdca3f4ad987f3d5f28ee755a69d2620ad5b59c73c70 not found: ID does not exist" Dec 02 08:29:06 crc kubenswrapper[4895]: I1202 08:29:06.170186 4895 scope.go:117] "RemoveContainer" containerID="b4c610691667e51ccf2fc72b9a49d1c99fb37e2beb374451bd94b23f2c8153e2" Dec 02 08:29:06 crc kubenswrapper[4895]: E1202 08:29:06.170445 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b4c610691667e51ccf2fc72b9a49d1c99fb37e2beb374451bd94b23f2c8153e2\": container with ID starting with b4c610691667e51ccf2fc72b9a49d1c99fb37e2beb374451bd94b23f2c8153e2 not found: ID does not exist" containerID="b4c610691667e51ccf2fc72b9a49d1c99fb37e2beb374451bd94b23f2c8153e2" Dec 02 08:29:06 crc kubenswrapper[4895]: I1202 08:29:06.170465 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4c610691667e51ccf2fc72b9a49d1c99fb37e2beb374451bd94b23f2c8153e2"} err="failed to get container status \"b4c610691667e51ccf2fc72b9a49d1c99fb37e2beb374451bd94b23f2c8153e2\": rpc error: code = NotFound desc = could not find container \"b4c610691667e51ccf2fc72b9a49d1c99fb37e2beb374451bd94b23f2c8153e2\": container with ID starting with b4c610691667e51ccf2fc72b9a49d1c99fb37e2beb374451bd94b23f2c8153e2 not 
found: ID does not exist" Dec 02 08:29:07 crc kubenswrapper[4895]: I1202 08:29:07.157014 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1525c8c-d2f3-4625-8269-a31d80124239" path="/var/lib/kubelet/pods/f1525c8c-d2f3-4625-8269-a31d80124239/volumes" Dec 02 08:30:00 crc kubenswrapper[4895]: I1202 08:30:00.199945 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411070-4t7z9"] Dec 02 08:30:00 crc kubenswrapper[4895]: E1202 08:30:00.201100 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee83c563-125d-4bc0-92a7-bba42d9c5d27" containerName="extract-content" Dec 02 08:30:00 crc kubenswrapper[4895]: I1202 08:30:00.201113 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee83c563-125d-4bc0-92a7-bba42d9c5d27" containerName="extract-content" Dec 02 08:30:00 crc kubenswrapper[4895]: E1202 08:30:00.201135 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1525c8c-d2f3-4625-8269-a31d80124239" containerName="extract-utilities" Dec 02 08:30:00 crc kubenswrapper[4895]: I1202 08:30:00.201161 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1525c8c-d2f3-4625-8269-a31d80124239" containerName="extract-utilities" Dec 02 08:30:00 crc kubenswrapper[4895]: E1202 08:30:00.201173 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee83c563-125d-4bc0-92a7-bba42d9c5d27" containerName="extract-utilities" Dec 02 08:30:00 crc kubenswrapper[4895]: I1202 08:30:00.201179 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee83c563-125d-4bc0-92a7-bba42d9c5d27" containerName="extract-utilities" Dec 02 08:30:00 crc kubenswrapper[4895]: E1202 08:30:00.201198 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1525c8c-d2f3-4625-8269-a31d80124239" containerName="extract-content" Dec 02 08:30:00 crc kubenswrapper[4895]: I1202 08:30:00.201204 4895 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="f1525c8c-d2f3-4625-8269-a31d80124239" containerName="extract-content" Dec 02 08:30:00 crc kubenswrapper[4895]: E1202 08:30:00.201232 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee83c563-125d-4bc0-92a7-bba42d9c5d27" containerName="registry-server" Dec 02 08:30:00 crc kubenswrapper[4895]: I1202 08:30:00.201240 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee83c563-125d-4bc0-92a7-bba42d9c5d27" containerName="registry-server" Dec 02 08:30:00 crc kubenswrapper[4895]: E1202 08:30:00.201246 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1525c8c-d2f3-4625-8269-a31d80124239" containerName="registry-server" Dec 02 08:30:00 crc kubenswrapper[4895]: I1202 08:30:00.201252 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1525c8c-d2f3-4625-8269-a31d80124239" containerName="registry-server" Dec 02 08:30:00 crc kubenswrapper[4895]: I1202 08:30:00.201429 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee83c563-125d-4bc0-92a7-bba42d9c5d27" containerName="registry-server" Dec 02 08:30:00 crc kubenswrapper[4895]: I1202 08:30:00.201448 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1525c8c-d2f3-4625-8269-a31d80124239" containerName="registry-server" Dec 02 08:30:00 crc kubenswrapper[4895]: I1202 08:30:00.202184 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411070-4t7z9" Dec 02 08:30:00 crc kubenswrapper[4895]: I1202 08:30:00.204099 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 02 08:30:00 crc kubenswrapper[4895]: I1202 08:30:00.204121 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 02 08:30:00 crc kubenswrapper[4895]: I1202 08:30:00.214133 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411070-4t7z9"] Dec 02 08:30:00 crc kubenswrapper[4895]: I1202 08:30:00.403050 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhpbw\" (UniqueName: \"kubernetes.io/projected/52f3bf5b-6437-4d51-ab30-adef4a9f0753-kube-api-access-lhpbw\") pod \"collect-profiles-29411070-4t7z9\" (UID: \"52f3bf5b-6437-4d51-ab30-adef4a9f0753\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411070-4t7z9" Dec 02 08:30:00 crc kubenswrapper[4895]: I1202 08:30:00.403177 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/52f3bf5b-6437-4d51-ab30-adef4a9f0753-secret-volume\") pod \"collect-profiles-29411070-4t7z9\" (UID: \"52f3bf5b-6437-4d51-ab30-adef4a9f0753\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411070-4t7z9" Dec 02 08:30:00 crc kubenswrapper[4895]: I1202 08:30:00.403508 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/52f3bf5b-6437-4d51-ab30-adef4a9f0753-config-volume\") pod \"collect-profiles-29411070-4t7z9\" (UID: \"52f3bf5b-6437-4d51-ab30-adef4a9f0753\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29411070-4t7z9" Dec 02 08:30:00 crc kubenswrapper[4895]: I1202 08:30:00.504438 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/52f3bf5b-6437-4d51-ab30-adef4a9f0753-config-volume\") pod \"collect-profiles-29411070-4t7z9\" (UID: \"52f3bf5b-6437-4d51-ab30-adef4a9f0753\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411070-4t7z9" Dec 02 08:30:00 crc kubenswrapper[4895]: I1202 08:30:00.504487 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhpbw\" (UniqueName: \"kubernetes.io/projected/52f3bf5b-6437-4d51-ab30-adef4a9f0753-kube-api-access-lhpbw\") pod \"collect-profiles-29411070-4t7z9\" (UID: \"52f3bf5b-6437-4d51-ab30-adef4a9f0753\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411070-4t7z9" Dec 02 08:30:00 crc kubenswrapper[4895]: I1202 08:30:00.504511 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/52f3bf5b-6437-4d51-ab30-adef4a9f0753-secret-volume\") pod \"collect-profiles-29411070-4t7z9\" (UID: \"52f3bf5b-6437-4d51-ab30-adef4a9f0753\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411070-4t7z9" Dec 02 08:30:00 crc kubenswrapper[4895]: I1202 08:30:00.505363 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/52f3bf5b-6437-4d51-ab30-adef4a9f0753-config-volume\") pod \"collect-profiles-29411070-4t7z9\" (UID: \"52f3bf5b-6437-4d51-ab30-adef4a9f0753\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411070-4t7z9" Dec 02 08:30:00 crc kubenswrapper[4895]: I1202 08:30:00.513650 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/52f3bf5b-6437-4d51-ab30-adef4a9f0753-secret-volume\") pod \"collect-profiles-29411070-4t7z9\" (UID: \"52f3bf5b-6437-4d51-ab30-adef4a9f0753\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411070-4t7z9" Dec 02 08:30:00 crc kubenswrapper[4895]: I1202 08:30:00.522107 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhpbw\" (UniqueName: \"kubernetes.io/projected/52f3bf5b-6437-4d51-ab30-adef4a9f0753-kube-api-access-lhpbw\") pod \"collect-profiles-29411070-4t7z9\" (UID: \"52f3bf5b-6437-4d51-ab30-adef4a9f0753\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411070-4t7z9" Dec 02 08:30:00 crc kubenswrapper[4895]: I1202 08:30:00.822261 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411070-4t7z9" Dec 02 08:30:01 crc kubenswrapper[4895]: I1202 08:30:01.218118 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411070-4t7z9"] Dec 02 08:30:01 crc kubenswrapper[4895]: I1202 08:30:01.535540 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411070-4t7z9" event={"ID":"52f3bf5b-6437-4d51-ab30-adef4a9f0753","Type":"ContainerStarted","Data":"857ea8d5b4f835cbbf213d62c5fa0279820ef248c2010fe8d06e9c9ff762ed94"} Dec 02 08:30:01 crc kubenswrapper[4895]: I1202 08:30:01.536038 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411070-4t7z9" event={"ID":"52f3bf5b-6437-4d51-ab30-adef4a9f0753","Type":"ContainerStarted","Data":"055db3b723e2090cac0b2c2233a7a53d5e09c2ee983d80320996bb74cffc433a"} Dec 02 08:30:01 crc kubenswrapper[4895]: I1202 08:30:01.561156 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29411070-4t7z9" 
podStartSLOduration=1.561142812 podStartE2EDuration="1.561142812s" podCreationTimestamp="2025-12-02 08:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:30:01.558895252 +0000 UTC m=+4012.729754865" watchObservedRunningTime="2025-12-02 08:30:01.561142812 +0000 UTC m=+4012.732002425" Dec 02 08:30:02 crc kubenswrapper[4895]: I1202 08:30:02.544486 4895 generic.go:334] "Generic (PLEG): container finished" podID="52f3bf5b-6437-4d51-ab30-adef4a9f0753" containerID="857ea8d5b4f835cbbf213d62c5fa0279820ef248c2010fe8d06e9c9ff762ed94" exitCode=0 Dec 02 08:30:02 crc kubenswrapper[4895]: I1202 08:30:02.544536 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411070-4t7z9" event={"ID":"52f3bf5b-6437-4d51-ab30-adef4a9f0753","Type":"ContainerDied","Data":"857ea8d5b4f835cbbf213d62c5fa0279820ef248c2010fe8d06e9c9ff762ed94"} Dec 02 08:30:03 crc kubenswrapper[4895]: I1202 08:30:03.832667 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411070-4t7z9" Dec 02 08:30:03 crc kubenswrapper[4895]: I1202 08:30:03.988596 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lhpbw\" (UniqueName: \"kubernetes.io/projected/52f3bf5b-6437-4d51-ab30-adef4a9f0753-kube-api-access-lhpbw\") pod \"52f3bf5b-6437-4d51-ab30-adef4a9f0753\" (UID: \"52f3bf5b-6437-4d51-ab30-adef4a9f0753\") " Dec 02 08:30:03 crc kubenswrapper[4895]: I1202 08:30:03.988712 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/52f3bf5b-6437-4d51-ab30-adef4a9f0753-config-volume\") pod \"52f3bf5b-6437-4d51-ab30-adef4a9f0753\" (UID: \"52f3bf5b-6437-4d51-ab30-adef4a9f0753\") " Dec 02 08:30:03 crc kubenswrapper[4895]: I1202 08:30:03.988754 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/52f3bf5b-6437-4d51-ab30-adef4a9f0753-secret-volume\") pod \"52f3bf5b-6437-4d51-ab30-adef4a9f0753\" (UID: \"52f3bf5b-6437-4d51-ab30-adef4a9f0753\") " Dec 02 08:30:03 crc kubenswrapper[4895]: I1202 08:30:03.990123 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/52f3bf5b-6437-4d51-ab30-adef4a9f0753-config-volume" (OuterVolumeSpecName: "config-volume") pod "52f3bf5b-6437-4d51-ab30-adef4a9f0753" (UID: "52f3bf5b-6437-4d51-ab30-adef4a9f0753"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:30:03 crc kubenswrapper[4895]: I1202 08:30:03.994698 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52f3bf5b-6437-4d51-ab30-adef4a9f0753-kube-api-access-lhpbw" (OuterVolumeSpecName: "kube-api-access-lhpbw") pod "52f3bf5b-6437-4d51-ab30-adef4a9f0753" (UID: "52f3bf5b-6437-4d51-ab30-adef4a9f0753"). 
InnerVolumeSpecName "kube-api-access-lhpbw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:30:03 crc kubenswrapper[4895]: I1202 08:30:03.995957 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52f3bf5b-6437-4d51-ab30-adef4a9f0753-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "52f3bf5b-6437-4d51-ab30-adef4a9f0753" (UID: "52f3bf5b-6437-4d51-ab30-adef4a9f0753"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:30:04 crc kubenswrapper[4895]: I1202 08:30:04.090004 4895 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/52f3bf5b-6437-4d51-ab30-adef4a9f0753-config-volume\") on node \"crc\" DevicePath \"\"" Dec 02 08:30:04 crc kubenswrapper[4895]: I1202 08:30:04.090038 4895 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/52f3bf5b-6437-4d51-ab30-adef4a9f0753-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 02 08:30:04 crc kubenswrapper[4895]: I1202 08:30:04.090048 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lhpbw\" (UniqueName: \"kubernetes.io/projected/52f3bf5b-6437-4d51-ab30-adef4a9f0753-kube-api-access-lhpbw\") on node \"crc\" DevicePath \"\"" Dec 02 08:30:04 crc kubenswrapper[4895]: I1202 08:30:04.280146 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411025-s8zs7"] Dec 02 08:30:04 crc kubenswrapper[4895]: I1202 08:30:04.286314 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411025-s8zs7"] Dec 02 08:30:04 crc kubenswrapper[4895]: I1202 08:30:04.583679 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411070-4t7z9" 
event={"ID":"52f3bf5b-6437-4d51-ab30-adef4a9f0753","Type":"ContainerDied","Data":"055db3b723e2090cac0b2c2233a7a53d5e09c2ee983d80320996bb74cffc433a"} Dec 02 08:30:04 crc kubenswrapper[4895]: I1202 08:30:04.583761 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="055db3b723e2090cac0b2c2233a7a53d5e09c2ee983d80320996bb74cffc433a" Dec 02 08:30:04 crc kubenswrapper[4895]: I1202 08:30:04.583772 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411070-4t7z9" Dec 02 08:30:05 crc kubenswrapper[4895]: I1202 08:30:05.154109 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5a909a9-9821-48da-8599-3162f92f4202" path="/var/lib/kubelet/pods/d5a909a9-9821-48da-8599-3162f92f4202/volumes" Dec 02 08:30:28 crc kubenswrapper[4895]: I1202 08:30:28.462846 4895 scope.go:117] "RemoveContainer" containerID="84bf831f6fe6915ec319cbaa86b8f1d2e40ea646bbe4a7dbb8934b9a4a1ff83e" Dec 02 08:31:05 crc kubenswrapper[4895]: I1202 08:31:05.474149 4895 patch_prober.go:28] interesting pod/machine-config-daemon-wfcg7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 08:31:05 crc kubenswrapper[4895]: I1202 08:31:05.475107 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 08:31:21 crc kubenswrapper[4895]: I1202 08:31:21.185763 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-nfdmg"] Dec 02 08:31:21 crc kubenswrapper[4895]: E1202 08:31:21.187809 
4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52f3bf5b-6437-4d51-ab30-adef4a9f0753" containerName="collect-profiles" Dec 02 08:31:21 crc kubenswrapper[4895]: I1202 08:31:21.187948 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="52f3bf5b-6437-4d51-ab30-adef4a9f0753" containerName="collect-profiles" Dec 02 08:31:21 crc kubenswrapper[4895]: I1202 08:31:21.188272 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="52f3bf5b-6437-4d51-ab30-adef4a9f0753" containerName="collect-profiles" Dec 02 08:31:21 crc kubenswrapper[4895]: I1202 08:31:21.189726 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nfdmg" Dec 02 08:31:21 crc kubenswrapper[4895]: I1202 08:31:21.201707 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nfdmg"] Dec 02 08:31:21 crc kubenswrapper[4895]: I1202 08:31:21.375551 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de6019d5-9e27-4eec-bdd4-c0092ae66e32-utilities\") pod \"certified-operators-nfdmg\" (UID: \"de6019d5-9e27-4eec-bdd4-c0092ae66e32\") " pod="openshift-marketplace/certified-operators-nfdmg" Dec 02 08:31:21 crc kubenswrapper[4895]: I1202 08:31:21.375646 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brftv\" (UniqueName: \"kubernetes.io/projected/de6019d5-9e27-4eec-bdd4-c0092ae66e32-kube-api-access-brftv\") pod \"certified-operators-nfdmg\" (UID: \"de6019d5-9e27-4eec-bdd4-c0092ae66e32\") " pod="openshift-marketplace/certified-operators-nfdmg" Dec 02 08:31:21 crc kubenswrapper[4895]: I1202 08:31:21.375674 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/de6019d5-9e27-4eec-bdd4-c0092ae66e32-catalog-content\") pod \"certified-operators-nfdmg\" (UID: \"de6019d5-9e27-4eec-bdd4-c0092ae66e32\") " pod="openshift-marketplace/certified-operators-nfdmg" Dec 02 08:31:21 crc kubenswrapper[4895]: I1202 08:31:21.477281 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de6019d5-9e27-4eec-bdd4-c0092ae66e32-utilities\") pod \"certified-operators-nfdmg\" (UID: \"de6019d5-9e27-4eec-bdd4-c0092ae66e32\") " pod="openshift-marketplace/certified-operators-nfdmg" Dec 02 08:31:21 crc kubenswrapper[4895]: I1202 08:31:21.477370 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-brftv\" (UniqueName: \"kubernetes.io/projected/de6019d5-9e27-4eec-bdd4-c0092ae66e32-kube-api-access-brftv\") pod \"certified-operators-nfdmg\" (UID: \"de6019d5-9e27-4eec-bdd4-c0092ae66e32\") " pod="openshift-marketplace/certified-operators-nfdmg" Dec 02 08:31:21 crc kubenswrapper[4895]: I1202 08:31:21.477397 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de6019d5-9e27-4eec-bdd4-c0092ae66e32-catalog-content\") pod \"certified-operators-nfdmg\" (UID: \"de6019d5-9e27-4eec-bdd4-c0092ae66e32\") " pod="openshift-marketplace/certified-operators-nfdmg" Dec 02 08:31:21 crc kubenswrapper[4895]: I1202 08:31:21.477885 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de6019d5-9e27-4eec-bdd4-c0092ae66e32-utilities\") pod \"certified-operators-nfdmg\" (UID: \"de6019d5-9e27-4eec-bdd4-c0092ae66e32\") " pod="openshift-marketplace/certified-operators-nfdmg" Dec 02 08:31:21 crc kubenswrapper[4895]: I1202 08:31:21.477936 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/de6019d5-9e27-4eec-bdd4-c0092ae66e32-catalog-content\") pod \"certified-operators-nfdmg\" (UID: \"de6019d5-9e27-4eec-bdd4-c0092ae66e32\") " pod="openshift-marketplace/certified-operators-nfdmg" Dec 02 08:31:21 crc kubenswrapper[4895]: I1202 08:31:21.495733 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-brftv\" (UniqueName: \"kubernetes.io/projected/de6019d5-9e27-4eec-bdd4-c0092ae66e32-kube-api-access-brftv\") pod \"certified-operators-nfdmg\" (UID: \"de6019d5-9e27-4eec-bdd4-c0092ae66e32\") " pod="openshift-marketplace/certified-operators-nfdmg" Dec 02 08:31:21 crc kubenswrapper[4895]: I1202 08:31:21.518841 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nfdmg" Dec 02 08:31:22 crc kubenswrapper[4895]: I1202 08:31:22.052843 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nfdmg"] Dec 02 08:31:22 crc kubenswrapper[4895]: I1202 08:31:22.293603 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nfdmg" event={"ID":"de6019d5-9e27-4eec-bdd4-c0092ae66e32","Type":"ContainerStarted","Data":"4702e42345e78c5223e9fccb3341bb67fcbf6d0691b7fe7b48f8d180b5ad5bf5"} Dec 02 08:31:22 crc kubenswrapper[4895]: I1202 08:31:22.294883 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nfdmg" event={"ID":"de6019d5-9e27-4eec-bdd4-c0092ae66e32","Type":"ContainerStarted","Data":"348d1ab313fc56ddca5e0d66cf8e63f9c10aba02a2a9df42f0639bcb1d8ecb8c"} Dec 02 08:31:23 crc kubenswrapper[4895]: I1202 08:31:23.307010 4895 generic.go:334] "Generic (PLEG): container finished" podID="de6019d5-9e27-4eec-bdd4-c0092ae66e32" containerID="4702e42345e78c5223e9fccb3341bb67fcbf6d0691b7fe7b48f8d180b5ad5bf5" exitCode=0 Dec 02 08:31:23 crc kubenswrapper[4895]: I1202 08:31:23.307101 4895 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-nfdmg" event={"ID":"de6019d5-9e27-4eec-bdd4-c0092ae66e32","Type":"ContainerDied","Data":"4702e42345e78c5223e9fccb3341bb67fcbf6d0691b7fe7b48f8d180b5ad5bf5"} Dec 02 08:31:24 crc kubenswrapper[4895]: I1202 08:31:24.319016 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nfdmg" event={"ID":"de6019d5-9e27-4eec-bdd4-c0092ae66e32","Type":"ContainerStarted","Data":"a00f900943cd7581d012d7d08002a0113826fb3c9ddce6d2a28d836d93a51e78"} Dec 02 08:31:25 crc kubenswrapper[4895]: I1202 08:31:25.337951 4895 generic.go:334] "Generic (PLEG): container finished" podID="de6019d5-9e27-4eec-bdd4-c0092ae66e32" containerID="a00f900943cd7581d012d7d08002a0113826fb3c9ddce6d2a28d836d93a51e78" exitCode=0 Dec 02 08:31:25 crc kubenswrapper[4895]: I1202 08:31:25.338027 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nfdmg" event={"ID":"de6019d5-9e27-4eec-bdd4-c0092ae66e32","Type":"ContainerDied","Data":"a00f900943cd7581d012d7d08002a0113826fb3c9ddce6d2a28d836d93a51e78"} Dec 02 08:31:26 crc kubenswrapper[4895]: I1202 08:31:26.352556 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nfdmg" event={"ID":"de6019d5-9e27-4eec-bdd4-c0092ae66e32","Type":"ContainerStarted","Data":"d79bc93932f4107cf0a0039a9d0a95ecc81271e7d46e4bf7df1dac05a0e7cabe"} Dec 02 08:31:26 crc kubenswrapper[4895]: I1202 08:31:26.381076 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-nfdmg" podStartSLOduration=2.898075584 podStartE2EDuration="5.38105334s" podCreationTimestamp="2025-12-02 08:31:21 +0000 UTC" firstStartedPulling="2025-12-02 08:31:23.310379672 +0000 UTC m=+4094.481239296" lastFinishedPulling="2025-12-02 08:31:25.793357439 +0000 UTC m=+4096.964217052" observedRunningTime="2025-12-02 08:31:26.376920661 +0000 UTC m=+4097.547780304" 
watchObservedRunningTime="2025-12-02 08:31:26.38105334 +0000 UTC m=+4097.551912963" Dec 02 08:31:31 crc kubenswrapper[4895]: I1202 08:31:31.519725 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-nfdmg" Dec 02 08:31:31 crc kubenswrapper[4895]: I1202 08:31:31.520958 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-nfdmg" Dec 02 08:31:31 crc kubenswrapper[4895]: I1202 08:31:31.575394 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-nfdmg" Dec 02 08:31:32 crc kubenswrapper[4895]: I1202 08:31:32.917688 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-nfdmg" Dec 02 08:31:32 crc kubenswrapper[4895]: I1202 08:31:32.961030 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nfdmg"] Dec 02 08:31:34 crc kubenswrapper[4895]: I1202 08:31:34.898091 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-nfdmg" podUID="de6019d5-9e27-4eec-bdd4-c0092ae66e32" containerName="registry-server" containerID="cri-o://d79bc93932f4107cf0a0039a9d0a95ecc81271e7d46e4bf7df1dac05a0e7cabe" gracePeriod=2 Dec 02 08:31:35 crc kubenswrapper[4895]: I1202 08:31:35.280831 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nfdmg" Dec 02 08:31:35 crc kubenswrapper[4895]: I1202 08:31:35.388703 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-brftv\" (UniqueName: \"kubernetes.io/projected/de6019d5-9e27-4eec-bdd4-c0092ae66e32-kube-api-access-brftv\") pod \"de6019d5-9e27-4eec-bdd4-c0092ae66e32\" (UID: \"de6019d5-9e27-4eec-bdd4-c0092ae66e32\") " Dec 02 08:31:35 crc kubenswrapper[4895]: I1202 08:31:35.388801 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de6019d5-9e27-4eec-bdd4-c0092ae66e32-utilities\") pod \"de6019d5-9e27-4eec-bdd4-c0092ae66e32\" (UID: \"de6019d5-9e27-4eec-bdd4-c0092ae66e32\") " Dec 02 08:31:35 crc kubenswrapper[4895]: I1202 08:31:35.388888 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de6019d5-9e27-4eec-bdd4-c0092ae66e32-catalog-content\") pod \"de6019d5-9e27-4eec-bdd4-c0092ae66e32\" (UID: \"de6019d5-9e27-4eec-bdd4-c0092ae66e32\") " Dec 02 08:31:35 crc kubenswrapper[4895]: I1202 08:31:35.390056 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de6019d5-9e27-4eec-bdd4-c0092ae66e32-utilities" (OuterVolumeSpecName: "utilities") pod "de6019d5-9e27-4eec-bdd4-c0092ae66e32" (UID: "de6019d5-9e27-4eec-bdd4-c0092ae66e32"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:31:35 crc kubenswrapper[4895]: I1202 08:31:35.400579 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de6019d5-9e27-4eec-bdd4-c0092ae66e32-kube-api-access-brftv" (OuterVolumeSpecName: "kube-api-access-brftv") pod "de6019d5-9e27-4eec-bdd4-c0092ae66e32" (UID: "de6019d5-9e27-4eec-bdd4-c0092ae66e32"). InnerVolumeSpecName "kube-api-access-brftv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:31:35 crc kubenswrapper[4895]: I1202 08:31:35.438702 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de6019d5-9e27-4eec-bdd4-c0092ae66e32-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "de6019d5-9e27-4eec-bdd4-c0092ae66e32" (UID: "de6019d5-9e27-4eec-bdd4-c0092ae66e32"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:31:35 crc kubenswrapper[4895]: I1202 08:31:35.473705 4895 patch_prober.go:28] interesting pod/machine-config-daemon-wfcg7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 08:31:35 crc kubenswrapper[4895]: I1202 08:31:35.473846 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 08:31:35 crc kubenswrapper[4895]: I1202 08:31:35.491183 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-brftv\" (UniqueName: \"kubernetes.io/projected/de6019d5-9e27-4eec-bdd4-c0092ae66e32-kube-api-access-brftv\") on node \"crc\" DevicePath \"\"" Dec 02 08:31:35 crc kubenswrapper[4895]: I1202 08:31:35.491307 4895 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de6019d5-9e27-4eec-bdd4-c0092ae66e32-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 08:31:35 crc kubenswrapper[4895]: I1202 08:31:35.491330 4895 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/de6019d5-9e27-4eec-bdd4-c0092ae66e32-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 08:31:35 crc kubenswrapper[4895]: I1202 08:31:35.908586 4895 generic.go:334] "Generic (PLEG): container finished" podID="de6019d5-9e27-4eec-bdd4-c0092ae66e32" containerID="d79bc93932f4107cf0a0039a9d0a95ecc81271e7d46e4bf7df1dac05a0e7cabe" exitCode=0 Dec 02 08:31:35 crc kubenswrapper[4895]: I1202 08:31:35.908641 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nfdmg" event={"ID":"de6019d5-9e27-4eec-bdd4-c0092ae66e32","Type":"ContainerDied","Data":"d79bc93932f4107cf0a0039a9d0a95ecc81271e7d46e4bf7df1dac05a0e7cabe"} Dec 02 08:31:35 crc kubenswrapper[4895]: I1202 08:31:35.908701 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nfdmg" event={"ID":"de6019d5-9e27-4eec-bdd4-c0092ae66e32","Type":"ContainerDied","Data":"348d1ab313fc56ddca5e0d66cf8e63f9c10aba02a2a9df42f0639bcb1d8ecb8c"} Dec 02 08:31:35 crc kubenswrapper[4895]: I1202 08:31:35.908731 4895 scope.go:117] "RemoveContainer" containerID="d79bc93932f4107cf0a0039a9d0a95ecc81271e7d46e4bf7df1dac05a0e7cabe" Dec 02 08:31:35 crc kubenswrapper[4895]: I1202 08:31:35.911166 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nfdmg" Dec 02 08:31:35 crc kubenswrapper[4895]: I1202 08:31:35.939361 4895 scope.go:117] "RemoveContainer" containerID="a00f900943cd7581d012d7d08002a0113826fb3c9ddce6d2a28d836d93a51e78" Dec 02 08:31:35 crc kubenswrapper[4895]: I1202 08:31:35.956680 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nfdmg"] Dec 02 08:31:35 crc kubenswrapper[4895]: I1202 08:31:35.966056 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-nfdmg"] Dec 02 08:31:35 crc kubenswrapper[4895]: I1202 08:31:35.978941 4895 scope.go:117] "RemoveContainer" containerID="4702e42345e78c5223e9fccb3341bb67fcbf6d0691b7fe7b48f8d180b5ad5bf5" Dec 02 08:31:36 crc kubenswrapper[4895]: I1202 08:31:36.007522 4895 scope.go:117] "RemoveContainer" containerID="d79bc93932f4107cf0a0039a9d0a95ecc81271e7d46e4bf7df1dac05a0e7cabe" Dec 02 08:31:36 crc kubenswrapper[4895]: E1202 08:31:36.008016 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d79bc93932f4107cf0a0039a9d0a95ecc81271e7d46e4bf7df1dac05a0e7cabe\": container with ID starting with d79bc93932f4107cf0a0039a9d0a95ecc81271e7d46e4bf7df1dac05a0e7cabe not found: ID does not exist" containerID="d79bc93932f4107cf0a0039a9d0a95ecc81271e7d46e4bf7df1dac05a0e7cabe" Dec 02 08:31:36 crc kubenswrapper[4895]: I1202 08:31:36.008052 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d79bc93932f4107cf0a0039a9d0a95ecc81271e7d46e4bf7df1dac05a0e7cabe"} err="failed to get container status \"d79bc93932f4107cf0a0039a9d0a95ecc81271e7d46e4bf7df1dac05a0e7cabe\": rpc error: code = NotFound desc = could not find container \"d79bc93932f4107cf0a0039a9d0a95ecc81271e7d46e4bf7df1dac05a0e7cabe\": container with ID starting with d79bc93932f4107cf0a0039a9d0a95ecc81271e7d46e4bf7df1dac05a0e7cabe not 
found: ID does not exist" Dec 02 08:31:36 crc kubenswrapper[4895]: I1202 08:31:36.008074 4895 scope.go:117] "RemoveContainer" containerID="a00f900943cd7581d012d7d08002a0113826fb3c9ddce6d2a28d836d93a51e78" Dec 02 08:31:36 crc kubenswrapper[4895]: E1202 08:31:36.008902 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a00f900943cd7581d012d7d08002a0113826fb3c9ddce6d2a28d836d93a51e78\": container with ID starting with a00f900943cd7581d012d7d08002a0113826fb3c9ddce6d2a28d836d93a51e78 not found: ID does not exist" containerID="a00f900943cd7581d012d7d08002a0113826fb3c9ddce6d2a28d836d93a51e78" Dec 02 08:31:36 crc kubenswrapper[4895]: I1202 08:31:36.008953 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a00f900943cd7581d012d7d08002a0113826fb3c9ddce6d2a28d836d93a51e78"} err="failed to get container status \"a00f900943cd7581d012d7d08002a0113826fb3c9ddce6d2a28d836d93a51e78\": rpc error: code = NotFound desc = could not find container \"a00f900943cd7581d012d7d08002a0113826fb3c9ddce6d2a28d836d93a51e78\": container with ID starting with a00f900943cd7581d012d7d08002a0113826fb3c9ddce6d2a28d836d93a51e78 not found: ID does not exist" Dec 02 08:31:36 crc kubenswrapper[4895]: I1202 08:31:36.008988 4895 scope.go:117] "RemoveContainer" containerID="4702e42345e78c5223e9fccb3341bb67fcbf6d0691b7fe7b48f8d180b5ad5bf5" Dec 02 08:31:36 crc kubenswrapper[4895]: E1202 08:31:36.009528 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4702e42345e78c5223e9fccb3341bb67fcbf6d0691b7fe7b48f8d180b5ad5bf5\": container with ID starting with 4702e42345e78c5223e9fccb3341bb67fcbf6d0691b7fe7b48f8d180b5ad5bf5 not found: ID does not exist" containerID="4702e42345e78c5223e9fccb3341bb67fcbf6d0691b7fe7b48f8d180b5ad5bf5" Dec 02 08:31:36 crc kubenswrapper[4895]: I1202 08:31:36.009963 4895 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4702e42345e78c5223e9fccb3341bb67fcbf6d0691b7fe7b48f8d180b5ad5bf5"} err="failed to get container status \"4702e42345e78c5223e9fccb3341bb67fcbf6d0691b7fe7b48f8d180b5ad5bf5\": rpc error: code = NotFound desc = could not find container \"4702e42345e78c5223e9fccb3341bb67fcbf6d0691b7fe7b48f8d180b5ad5bf5\": container with ID starting with 4702e42345e78c5223e9fccb3341bb67fcbf6d0691b7fe7b48f8d180b5ad5bf5 not found: ID does not exist" Dec 02 08:31:37 crc kubenswrapper[4895]: I1202 08:31:37.151856 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de6019d5-9e27-4eec-bdd4-c0092ae66e32" path="/var/lib/kubelet/pods/de6019d5-9e27-4eec-bdd4-c0092ae66e32/volumes" Dec 02 08:32:05 crc kubenswrapper[4895]: I1202 08:32:05.473930 4895 patch_prober.go:28] interesting pod/machine-config-daemon-wfcg7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 08:32:05 crc kubenswrapper[4895]: I1202 08:32:05.474538 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 08:32:05 crc kubenswrapper[4895]: I1202 08:32:05.474600 4895 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" Dec 02 08:32:05 crc kubenswrapper[4895]: I1202 08:32:05.475260 4895 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"f0330135d16cf0ecef762657edb6a8c1cd122703e741ebf6a219c8d1a213c648"} pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 08:32:05 crc kubenswrapper[4895]: I1202 08:32:05.475309 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerName="machine-config-daemon" containerID="cri-o://f0330135d16cf0ecef762657edb6a8c1cd122703e741ebf6a219c8d1a213c648" gracePeriod=600 Dec 02 08:32:06 crc kubenswrapper[4895]: I1202 08:32:06.148462 4895 generic.go:334] "Generic (PLEG): container finished" podID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerID="f0330135d16cf0ecef762657edb6a8c1cd122703e741ebf6a219c8d1a213c648" exitCode=0 Dec 02 08:32:06 crc kubenswrapper[4895]: I1202 08:32:06.148506 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" event={"ID":"0468d2d1-a975-45a6-af9f-47adc432fab0","Type":"ContainerDied","Data":"f0330135d16cf0ecef762657edb6a8c1cd122703e741ebf6a219c8d1a213c648"} Dec 02 08:32:06 crc kubenswrapper[4895]: I1202 08:32:06.149710 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" event={"ID":"0468d2d1-a975-45a6-af9f-47adc432fab0","Type":"ContainerStarted","Data":"69a4bb5ee156ed278d803f64131ce7c25664438c59334774293c1c389adc3c0a"} Dec 02 08:32:06 crc kubenswrapper[4895]: I1202 08:32:06.149790 4895 scope.go:117] "RemoveContainer" containerID="85cc12b4c8ab8b07033c83c08e99e7c6dfb03217541e445f2b564ba95f1d74c3" Dec 02 08:34:05 crc kubenswrapper[4895]: I1202 08:34:05.474041 4895 patch_prober.go:28] interesting pod/machine-config-daemon-wfcg7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe 
status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 08:34:05 crc kubenswrapper[4895]: I1202 08:34:05.474889 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 08:34:35 crc kubenswrapper[4895]: I1202 08:34:35.473423 4895 patch_prober.go:28] interesting pod/machine-config-daemon-wfcg7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 08:34:35 crc kubenswrapper[4895]: I1202 08:34:35.474133 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 08:35:05 crc kubenswrapper[4895]: I1202 08:35:05.474453 4895 patch_prober.go:28] interesting pod/machine-config-daemon-wfcg7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 08:35:05 crc kubenswrapper[4895]: I1202 08:35:05.475104 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" Dec 02 08:35:05 crc kubenswrapper[4895]: I1202 08:35:05.475171 4895 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" Dec 02 08:35:05 crc kubenswrapper[4895]: I1202 08:35:05.475997 4895 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"69a4bb5ee156ed278d803f64131ce7c25664438c59334774293c1c389adc3c0a"} pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 08:35:05 crc kubenswrapper[4895]: I1202 08:35:05.476076 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerName="machine-config-daemon" containerID="cri-o://69a4bb5ee156ed278d803f64131ce7c25664438c59334774293c1c389adc3c0a" gracePeriod=600 Dec 02 08:35:05 crc kubenswrapper[4895]: E1202 08:35:05.607696 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 08:35:05 crc kubenswrapper[4895]: I1202 08:35:05.978557 4895 generic.go:334] "Generic (PLEG): container finished" podID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerID="69a4bb5ee156ed278d803f64131ce7c25664438c59334774293c1c389adc3c0a" exitCode=0 Dec 02 08:35:05 crc kubenswrapper[4895]: I1202 08:35:05.978633 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" 
event={"ID":"0468d2d1-a975-45a6-af9f-47adc432fab0","Type":"ContainerDied","Data":"69a4bb5ee156ed278d803f64131ce7c25664438c59334774293c1c389adc3c0a"} Dec 02 08:35:05 crc kubenswrapper[4895]: I1202 08:35:05.978696 4895 scope.go:117] "RemoveContainer" containerID="f0330135d16cf0ecef762657edb6a8c1cd122703e741ebf6a219c8d1a213c648" Dec 02 08:35:05 crc kubenswrapper[4895]: I1202 08:35:05.980530 4895 scope.go:117] "RemoveContainer" containerID="69a4bb5ee156ed278d803f64131ce7c25664438c59334774293c1c389adc3c0a" Dec 02 08:35:05 crc kubenswrapper[4895]: E1202 08:35:05.980891 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 08:35:17 crc kubenswrapper[4895]: I1202 08:35:17.141448 4895 scope.go:117] "RemoveContainer" containerID="69a4bb5ee156ed278d803f64131ce7c25664438c59334774293c1c389adc3c0a" Dec 02 08:35:17 crc kubenswrapper[4895]: E1202 08:35:17.142355 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 08:35:30 crc kubenswrapper[4895]: I1202 08:35:30.140712 4895 scope.go:117] "RemoveContainer" containerID="69a4bb5ee156ed278d803f64131ce7c25664438c59334774293c1c389adc3c0a" Dec 02 08:35:30 crc kubenswrapper[4895]: E1202 08:35:30.141627 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 08:35:44 crc kubenswrapper[4895]: I1202 08:35:44.140928 4895 scope.go:117] "RemoveContainer" containerID="69a4bb5ee156ed278d803f64131ce7c25664438c59334774293c1c389adc3c0a" Dec 02 08:35:44 crc kubenswrapper[4895]: E1202 08:35:44.141711 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 08:35:58 crc kubenswrapper[4895]: I1202 08:35:58.141407 4895 scope.go:117] "RemoveContainer" containerID="69a4bb5ee156ed278d803f64131ce7c25664438c59334774293c1c389adc3c0a" Dec 02 08:35:58 crc kubenswrapper[4895]: E1202 08:35:58.142581 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 08:36:12 crc kubenswrapper[4895]: I1202 08:36:12.141467 4895 scope.go:117] "RemoveContainer" containerID="69a4bb5ee156ed278d803f64131ce7c25664438c59334774293c1c389adc3c0a" Dec 02 08:36:12 crc kubenswrapper[4895]: E1202 08:36:12.142233 4895 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 08:36:27 crc kubenswrapper[4895]: I1202 08:36:27.140529 4895 scope.go:117] "RemoveContainer" containerID="69a4bb5ee156ed278d803f64131ce7c25664438c59334774293c1c389adc3c0a" Dec 02 08:36:27 crc kubenswrapper[4895]: E1202 08:36:27.141216 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 08:36:39 crc kubenswrapper[4895]: I1202 08:36:39.145918 4895 scope.go:117] "RemoveContainer" containerID="69a4bb5ee156ed278d803f64131ce7c25664438c59334774293c1c389adc3c0a" Dec 02 08:36:39 crc kubenswrapper[4895]: E1202 08:36:39.149369 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 08:36:40 crc kubenswrapper[4895]: I1202 08:36:40.074670 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-w599w"] Dec 02 08:36:40 crc kubenswrapper[4895]: E1202 08:36:40.075082 4895 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="de6019d5-9e27-4eec-bdd4-c0092ae66e32" containerName="extract-utilities" Dec 02 08:36:40 crc kubenswrapper[4895]: I1202 08:36:40.075108 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="de6019d5-9e27-4eec-bdd4-c0092ae66e32" containerName="extract-utilities" Dec 02 08:36:40 crc kubenswrapper[4895]: E1202 08:36:40.075125 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de6019d5-9e27-4eec-bdd4-c0092ae66e32" containerName="registry-server" Dec 02 08:36:40 crc kubenswrapper[4895]: I1202 08:36:40.075133 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="de6019d5-9e27-4eec-bdd4-c0092ae66e32" containerName="registry-server" Dec 02 08:36:40 crc kubenswrapper[4895]: E1202 08:36:40.075155 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de6019d5-9e27-4eec-bdd4-c0092ae66e32" containerName="extract-content" Dec 02 08:36:40 crc kubenswrapper[4895]: I1202 08:36:40.075166 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="de6019d5-9e27-4eec-bdd4-c0092ae66e32" containerName="extract-content" Dec 02 08:36:40 crc kubenswrapper[4895]: I1202 08:36:40.075389 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="de6019d5-9e27-4eec-bdd4-c0092ae66e32" containerName="registry-server" Dec 02 08:36:40 crc kubenswrapper[4895]: I1202 08:36:40.076826 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-w599w" Dec 02 08:36:40 crc kubenswrapper[4895]: I1202 08:36:40.103197 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-w599w"] Dec 02 08:36:40 crc kubenswrapper[4895]: I1202 08:36:40.181134 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd06d10f-6a00-4507-abbe-9f05710bc54f-utilities\") pod \"redhat-operators-w599w\" (UID: \"bd06d10f-6a00-4507-abbe-9f05710bc54f\") " pod="openshift-marketplace/redhat-operators-w599w" Dec 02 08:36:40 crc kubenswrapper[4895]: I1202 08:36:40.181376 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd06d10f-6a00-4507-abbe-9f05710bc54f-catalog-content\") pod \"redhat-operators-w599w\" (UID: \"bd06d10f-6a00-4507-abbe-9f05710bc54f\") " pod="openshift-marketplace/redhat-operators-w599w" Dec 02 08:36:40 crc kubenswrapper[4895]: I1202 08:36:40.181448 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q69nb\" (UniqueName: \"kubernetes.io/projected/bd06d10f-6a00-4507-abbe-9f05710bc54f-kube-api-access-q69nb\") pod \"redhat-operators-w599w\" (UID: \"bd06d10f-6a00-4507-abbe-9f05710bc54f\") " pod="openshift-marketplace/redhat-operators-w599w" Dec 02 08:36:40 crc kubenswrapper[4895]: I1202 08:36:40.283364 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd06d10f-6a00-4507-abbe-9f05710bc54f-catalog-content\") pod \"redhat-operators-w599w\" (UID: \"bd06d10f-6a00-4507-abbe-9f05710bc54f\") " pod="openshift-marketplace/redhat-operators-w599w" Dec 02 08:36:40 crc kubenswrapper[4895]: I1202 08:36:40.283423 4895 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-q69nb\" (UniqueName: \"kubernetes.io/projected/bd06d10f-6a00-4507-abbe-9f05710bc54f-kube-api-access-q69nb\") pod \"redhat-operators-w599w\" (UID: \"bd06d10f-6a00-4507-abbe-9f05710bc54f\") " pod="openshift-marketplace/redhat-operators-w599w" Dec 02 08:36:40 crc kubenswrapper[4895]: I1202 08:36:40.283482 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd06d10f-6a00-4507-abbe-9f05710bc54f-utilities\") pod \"redhat-operators-w599w\" (UID: \"bd06d10f-6a00-4507-abbe-9f05710bc54f\") " pod="openshift-marketplace/redhat-operators-w599w" Dec 02 08:36:40 crc kubenswrapper[4895]: I1202 08:36:40.284287 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd06d10f-6a00-4507-abbe-9f05710bc54f-utilities\") pod \"redhat-operators-w599w\" (UID: \"bd06d10f-6a00-4507-abbe-9f05710bc54f\") " pod="openshift-marketplace/redhat-operators-w599w" Dec 02 08:36:40 crc kubenswrapper[4895]: I1202 08:36:40.284507 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd06d10f-6a00-4507-abbe-9f05710bc54f-catalog-content\") pod \"redhat-operators-w599w\" (UID: \"bd06d10f-6a00-4507-abbe-9f05710bc54f\") " pod="openshift-marketplace/redhat-operators-w599w" Dec 02 08:36:40 crc kubenswrapper[4895]: I1202 08:36:40.303161 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q69nb\" (UniqueName: \"kubernetes.io/projected/bd06d10f-6a00-4507-abbe-9f05710bc54f-kube-api-access-q69nb\") pod \"redhat-operators-w599w\" (UID: \"bd06d10f-6a00-4507-abbe-9f05710bc54f\") " pod="openshift-marketplace/redhat-operators-w599w" Dec 02 08:36:40 crc kubenswrapper[4895]: I1202 08:36:40.401033 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-w599w" Dec 02 08:36:40 crc kubenswrapper[4895]: I1202 08:36:40.826865 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-w599w"] Dec 02 08:36:41 crc kubenswrapper[4895]: I1202 08:36:41.248271 4895 generic.go:334] "Generic (PLEG): container finished" podID="bd06d10f-6a00-4507-abbe-9f05710bc54f" containerID="319acef4a2ecff6c1652ccea7993b009d9fd673879458210ee02ef52a5f37f29" exitCode=0 Dec 02 08:36:41 crc kubenswrapper[4895]: I1202 08:36:41.248325 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w599w" event={"ID":"bd06d10f-6a00-4507-abbe-9f05710bc54f","Type":"ContainerDied","Data":"319acef4a2ecff6c1652ccea7993b009d9fd673879458210ee02ef52a5f37f29"} Dec 02 08:36:41 crc kubenswrapper[4895]: I1202 08:36:41.248354 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w599w" event={"ID":"bd06d10f-6a00-4507-abbe-9f05710bc54f","Type":"ContainerStarted","Data":"20f3e5bb5067fe7a278264b59663dd16517e2cda7dd1fe6ae37545f5ea611670"} Dec 02 08:36:41 crc kubenswrapper[4895]: I1202 08:36:41.250553 4895 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 02 08:36:44 crc kubenswrapper[4895]: I1202 08:36:44.274086 4895 generic.go:334] "Generic (PLEG): container finished" podID="bd06d10f-6a00-4507-abbe-9f05710bc54f" containerID="3147b4580779b01ffff627c98f5e9edb033c610ab81c695bf28f31aa303ccace" exitCode=0 Dec 02 08:36:44 crc kubenswrapper[4895]: I1202 08:36:44.274181 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w599w" event={"ID":"bd06d10f-6a00-4507-abbe-9f05710bc54f","Type":"ContainerDied","Data":"3147b4580779b01ffff627c98f5e9edb033c610ab81c695bf28f31aa303ccace"} Dec 02 08:36:45 crc kubenswrapper[4895]: I1202 08:36:45.285300 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-w599w" event={"ID":"bd06d10f-6a00-4507-abbe-9f05710bc54f","Type":"ContainerStarted","Data":"996a2f6f5c4ed518a0b635136863a35f36bf7d1469c128ff627970cb0a07f93e"} Dec 02 08:36:45 crc kubenswrapper[4895]: I1202 08:36:45.310484 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-w599w" podStartSLOduration=1.833735185 podStartE2EDuration="5.310467254s" podCreationTimestamp="2025-12-02 08:36:40 +0000 UTC" firstStartedPulling="2025-12-02 08:36:41.250290296 +0000 UTC m=+4412.421149909" lastFinishedPulling="2025-12-02 08:36:44.727022375 +0000 UTC m=+4415.897881978" observedRunningTime="2025-12-02 08:36:45.305987065 +0000 UTC m=+4416.476846688" watchObservedRunningTime="2025-12-02 08:36:45.310467254 +0000 UTC m=+4416.481326867" Dec 02 08:36:50 crc kubenswrapper[4895]: I1202 08:36:50.401856 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-w599w" Dec 02 08:36:50 crc kubenswrapper[4895]: I1202 08:36:50.402217 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-w599w" Dec 02 08:36:50 crc kubenswrapper[4895]: I1202 08:36:50.444223 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-w599w" Dec 02 08:36:51 crc kubenswrapper[4895]: I1202 08:36:51.365945 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-w599w" Dec 02 08:36:51 crc kubenswrapper[4895]: I1202 08:36:51.414332 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-w599w"] Dec 02 08:36:53 crc kubenswrapper[4895]: I1202 08:36:53.141879 4895 scope.go:117] "RemoveContainer" containerID="69a4bb5ee156ed278d803f64131ce7c25664438c59334774293c1c389adc3c0a" Dec 02 08:36:53 crc kubenswrapper[4895]: E1202 08:36:53.142180 
4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 08:36:53 crc kubenswrapper[4895]: I1202 08:36:53.345456 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-w599w" podUID="bd06d10f-6a00-4507-abbe-9f05710bc54f" containerName="registry-server" containerID="cri-o://996a2f6f5c4ed518a0b635136863a35f36bf7d1469c128ff627970cb0a07f93e" gracePeriod=2 Dec 02 08:36:54 crc kubenswrapper[4895]: I1202 08:36:54.262681 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-w599w" Dec 02 08:36:54 crc kubenswrapper[4895]: I1202 08:36:54.356673 4895 generic.go:334] "Generic (PLEG): container finished" podID="bd06d10f-6a00-4507-abbe-9f05710bc54f" containerID="996a2f6f5c4ed518a0b635136863a35f36bf7d1469c128ff627970cb0a07f93e" exitCode=0 Dec 02 08:36:54 crc kubenswrapper[4895]: I1202 08:36:54.356718 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-w599w" Dec 02 08:36:54 crc kubenswrapper[4895]: I1202 08:36:54.356752 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w599w" event={"ID":"bd06d10f-6a00-4507-abbe-9f05710bc54f","Type":"ContainerDied","Data":"996a2f6f5c4ed518a0b635136863a35f36bf7d1469c128ff627970cb0a07f93e"} Dec 02 08:36:54 crc kubenswrapper[4895]: I1202 08:36:54.357233 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w599w" event={"ID":"bd06d10f-6a00-4507-abbe-9f05710bc54f","Type":"ContainerDied","Data":"20f3e5bb5067fe7a278264b59663dd16517e2cda7dd1fe6ae37545f5ea611670"} Dec 02 08:36:54 crc kubenswrapper[4895]: I1202 08:36:54.357254 4895 scope.go:117] "RemoveContainer" containerID="996a2f6f5c4ed518a0b635136863a35f36bf7d1469c128ff627970cb0a07f93e" Dec 02 08:36:54 crc kubenswrapper[4895]: I1202 08:36:54.376665 4895 scope.go:117] "RemoveContainer" containerID="3147b4580779b01ffff627c98f5e9edb033c610ab81c695bf28f31aa303ccace" Dec 02 08:36:54 crc kubenswrapper[4895]: I1202 08:36:54.387667 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd06d10f-6a00-4507-abbe-9f05710bc54f-utilities\") pod \"bd06d10f-6a00-4507-abbe-9f05710bc54f\" (UID: \"bd06d10f-6a00-4507-abbe-9f05710bc54f\") " Dec 02 08:36:54 crc kubenswrapper[4895]: I1202 08:36:54.387924 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q69nb\" (UniqueName: \"kubernetes.io/projected/bd06d10f-6a00-4507-abbe-9f05710bc54f-kube-api-access-q69nb\") pod \"bd06d10f-6a00-4507-abbe-9f05710bc54f\" (UID: \"bd06d10f-6a00-4507-abbe-9f05710bc54f\") " Dec 02 08:36:54 crc kubenswrapper[4895]: I1202 08:36:54.388077 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/bd06d10f-6a00-4507-abbe-9f05710bc54f-catalog-content\") pod \"bd06d10f-6a00-4507-abbe-9f05710bc54f\" (UID: \"bd06d10f-6a00-4507-abbe-9f05710bc54f\") " Dec 02 08:36:54 crc kubenswrapper[4895]: I1202 08:36:54.388974 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd06d10f-6a00-4507-abbe-9f05710bc54f-utilities" (OuterVolumeSpecName: "utilities") pod "bd06d10f-6a00-4507-abbe-9f05710bc54f" (UID: "bd06d10f-6a00-4507-abbe-9f05710bc54f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:36:54 crc kubenswrapper[4895]: I1202 08:36:54.396013 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd06d10f-6a00-4507-abbe-9f05710bc54f-kube-api-access-q69nb" (OuterVolumeSpecName: "kube-api-access-q69nb") pod "bd06d10f-6a00-4507-abbe-9f05710bc54f" (UID: "bd06d10f-6a00-4507-abbe-9f05710bc54f"). InnerVolumeSpecName "kube-api-access-q69nb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:36:54 crc kubenswrapper[4895]: I1202 08:36:54.408033 4895 scope.go:117] "RemoveContainer" containerID="319acef4a2ecff6c1652ccea7993b009d9fd673879458210ee02ef52a5f37f29" Dec 02 08:36:54 crc kubenswrapper[4895]: I1202 08:36:54.448298 4895 scope.go:117] "RemoveContainer" containerID="996a2f6f5c4ed518a0b635136863a35f36bf7d1469c128ff627970cb0a07f93e" Dec 02 08:36:54 crc kubenswrapper[4895]: E1202 08:36:54.448609 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"996a2f6f5c4ed518a0b635136863a35f36bf7d1469c128ff627970cb0a07f93e\": container with ID starting with 996a2f6f5c4ed518a0b635136863a35f36bf7d1469c128ff627970cb0a07f93e not found: ID does not exist" containerID="996a2f6f5c4ed518a0b635136863a35f36bf7d1469c128ff627970cb0a07f93e" Dec 02 08:36:54 crc kubenswrapper[4895]: I1202 08:36:54.448640 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"996a2f6f5c4ed518a0b635136863a35f36bf7d1469c128ff627970cb0a07f93e"} err="failed to get container status \"996a2f6f5c4ed518a0b635136863a35f36bf7d1469c128ff627970cb0a07f93e\": rpc error: code = NotFound desc = could not find container \"996a2f6f5c4ed518a0b635136863a35f36bf7d1469c128ff627970cb0a07f93e\": container with ID starting with 996a2f6f5c4ed518a0b635136863a35f36bf7d1469c128ff627970cb0a07f93e not found: ID does not exist" Dec 02 08:36:54 crc kubenswrapper[4895]: I1202 08:36:54.448661 4895 scope.go:117] "RemoveContainer" containerID="3147b4580779b01ffff627c98f5e9edb033c610ab81c695bf28f31aa303ccace" Dec 02 08:36:54 crc kubenswrapper[4895]: E1202 08:36:54.449069 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3147b4580779b01ffff627c98f5e9edb033c610ab81c695bf28f31aa303ccace\": container with ID starting with 
3147b4580779b01ffff627c98f5e9edb033c610ab81c695bf28f31aa303ccace not found: ID does not exist" containerID="3147b4580779b01ffff627c98f5e9edb033c610ab81c695bf28f31aa303ccace" Dec 02 08:36:54 crc kubenswrapper[4895]: I1202 08:36:54.449273 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3147b4580779b01ffff627c98f5e9edb033c610ab81c695bf28f31aa303ccace"} err="failed to get container status \"3147b4580779b01ffff627c98f5e9edb033c610ab81c695bf28f31aa303ccace\": rpc error: code = NotFound desc = could not find container \"3147b4580779b01ffff627c98f5e9edb033c610ab81c695bf28f31aa303ccace\": container with ID starting with 3147b4580779b01ffff627c98f5e9edb033c610ab81c695bf28f31aa303ccace not found: ID does not exist" Dec 02 08:36:54 crc kubenswrapper[4895]: I1202 08:36:54.449286 4895 scope.go:117] "RemoveContainer" containerID="319acef4a2ecff6c1652ccea7993b009d9fd673879458210ee02ef52a5f37f29" Dec 02 08:36:54 crc kubenswrapper[4895]: E1202 08:36:54.449515 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"319acef4a2ecff6c1652ccea7993b009d9fd673879458210ee02ef52a5f37f29\": container with ID starting with 319acef4a2ecff6c1652ccea7993b009d9fd673879458210ee02ef52a5f37f29 not found: ID does not exist" containerID="319acef4a2ecff6c1652ccea7993b009d9fd673879458210ee02ef52a5f37f29" Dec 02 08:36:54 crc kubenswrapper[4895]: I1202 08:36:54.449536 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"319acef4a2ecff6c1652ccea7993b009d9fd673879458210ee02ef52a5f37f29"} err="failed to get container status \"319acef4a2ecff6c1652ccea7993b009d9fd673879458210ee02ef52a5f37f29\": rpc error: code = NotFound desc = could not find container \"319acef4a2ecff6c1652ccea7993b009d9fd673879458210ee02ef52a5f37f29\": container with ID starting with 319acef4a2ecff6c1652ccea7993b009d9fd673879458210ee02ef52a5f37f29 not found: ID does not 
exist" Dec 02 08:36:54 crc kubenswrapper[4895]: I1202 08:36:54.489199 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q69nb\" (UniqueName: \"kubernetes.io/projected/bd06d10f-6a00-4507-abbe-9f05710bc54f-kube-api-access-q69nb\") on node \"crc\" DevicePath \"\"" Dec 02 08:36:54 crc kubenswrapper[4895]: I1202 08:36:54.489236 4895 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd06d10f-6a00-4507-abbe-9f05710bc54f-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 08:36:55 crc kubenswrapper[4895]: I1202 08:36:55.250157 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd06d10f-6a00-4507-abbe-9f05710bc54f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bd06d10f-6a00-4507-abbe-9f05710bc54f" (UID: "bd06d10f-6a00-4507-abbe-9f05710bc54f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:36:55 crc kubenswrapper[4895]: I1202 08:36:55.301383 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-w599w"] Dec 02 08:36:55 crc kubenswrapper[4895]: I1202 08:36:55.303070 4895 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd06d10f-6a00-4507-abbe-9f05710bc54f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 08:36:55 crc kubenswrapper[4895]: I1202 08:36:55.307808 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-w599w"] Dec 02 08:36:57 crc kubenswrapper[4895]: I1202 08:36:57.150937 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd06d10f-6a00-4507-abbe-9f05710bc54f" path="/var/lib/kubelet/pods/bd06d10f-6a00-4507-abbe-9f05710bc54f/volumes" Dec 02 08:37:06 crc kubenswrapper[4895]: I1202 08:37:06.140800 4895 scope.go:117] "RemoveContainer" 
containerID="69a4bb5ee156ed278d803f64131ce7c25664438c59334774293c1c389adc3c0a" Dec 02 08:37:06 crc kubenswrapper[4895]: E1202 08:37:06.141587 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 08:37:21 crc kubenswrapper[4895]: I1202 08:37:21.140951 4895 scope.go:117] "RemoveContainer" containerID="69a4bb5ee156ed278d803f64131ce7c25664438c59334774293c1c389adc3c0a" Dec 02 08:37:21 crc kubenswrapper[4895]: E1202 08:37:21.141834 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 08:37:32 crc kubenswrapper[4895]: I1202 08:37:32.141456 4895 scope.go:117] "RemoveContainer" containerID="69a4bb5ee156ed278d803f64131ce7c25664438c59334774293c1c389adc3c0a" Dec 02 08:37:32 crc kubenswrapper[4895]: E1202 08:37:32.142086 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 08:37:44 crc kubenswrapper[4895]: I1202 08:37:44.140824 4895 scope.go:117] 
"RemoveContainer" containerID="69a4bb5ee156ed278d803f64131ce7c25664438c59334774293c1c389adc3c0a" Dec 02 08:37:44 crc kubenswrapper[4895]: E1202 08:37:44.141591 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 08:37:59 crc kubenswrapper[4895]: I1202 08:37:59.147175 4895 scope.go:117] "RemoveContainer" containerID="69a4bb5ee156ed278d803f64131ce7c25664438c59334774293c1c389adc3c0a" Dec 02 08:37:59 crc kubenswrapper[4895]: E1202 08:37:59.148197 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 08:38:12 crc kubenswrapper[4895]: I1202 08:38:12.141314 4895 scope.go:117] "RemoveContainer" containerID="69a4bb5ee156ed278d803f64131ce7c25664438c59334774293c1c389adc3c0a" Dec 02 08:38:12 crc kubenswrapper[4895]: E1202 08:38:12.142104 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 08:38:24 crc kubenswrapper[4895]: I1202 08:38:24.140937 
4895 scope.go:117] "RemoveContainer" containerID="69a4bb5ee156ed278d803f64131ce7c25664438c59334774293c1c389adc3c0a" Dec 02 08:38:24 crc kubenswrapper[4895]: E1202 08:38:24.141530 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 08:38:35 crc kubenswrapper[4895]: I1202 08:38:35.142474 4895 scope.go:117] "RemoveContainer" containerID="69a4bb5ee156ed278d803f64131ce7c25664438c59334774293c1c389adc3c0a" Dec 02 08:38:35 crc kubenswrapper[4895]: E1202 08:38:35.143135 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 08:38:50 crc kubenswrapper[4895]: I1202 08:38:50.141637 4895 scope.go:117] "RemoveContainer" containerID="69a4bb5ee156ed278d803f64131ce7c25664438c59334774293c1c389adc3c0a" Dec 02 08:38:50 crc kubenswrapper[4895]: E1202 08:38:50.142499 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 08:39:04 crc kubenswrapper[4895]: I1202 
08:39:04.140903 4895 scope.go:117] "RemoveContainer" containerID="69a4bb5ee156ed278d803f64131ce7c25664438c59334774293c1c389adc3c0a" Dec 02 08:39:04 crc kubenswrapper[4895]: E1202 08:39:04.141675 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 08:39:16 crc kubenswrapper[4895]: I1202 08:39:16.141269 4895 scope.go:117] "RemoveContainer" containerID="69a4bb5ee156ed278d803f64131ce7c25664438c59334774293c1c389adc3c0a" Dec 02 08:39:16 crc kubenswrapper[4895]: E1202 08:39:16.142120 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 08:39:31 crc kubenswrapper[4895]: I1202 08:39:31.140991 4895 scope.go:117] "RemoveContainer" containerID="69a4bb5ee156ed278d803f64131ce7c25664438c59334774293c1c389adc3c0a" Dec 02 08:39:31 crc kubenswrapper[4895]: E1202 08:39:31.141862 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 08:39:44 crc 
kubenswrapper[4895]: I1202 08:39:44.140636 4895 scope.go:117] "RemoveContainer" containerID="69a4bb5ee156ed278d803f64131ce7c25664438c59334774293c1c389adc3c0a" Dec 02 08:39:44 crc kubenswrapper[4895]: E1202 08:39:44.141228 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 08:39:54 crc kubenswrapper[4895]: I1202 08:39:54.474406 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-xhnzx"] Dec 02 08:39:54 crc kubenswrapper[4895]: E1202 08:39:54.475359 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd06d10f-6a00-4507-abbe-9f05710bc54f" containerName="registry-server" Dec 02 08:39:54 crc kubenswrapper[4895]: I1202 08:39:54.475375 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd06d10f-6a00-4507-abbe-9f05710bc54f" containerName="registry-server" Dec 02 08:39:54 crc kubenswrapper[4895]: E1202 08:39:54.475408 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd06d10f-6a00-4507-abbe-9f05710bc54f" containerName="extract-content" Dec 02 08:39:54 crc kubenswrapper[4895]: I1202 08:39:54.475417 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd06d10f-6a00-4507-abbe-9f05710bc54f" containerName="extract-content" Dec 02 08:39:54 crc kubenswrapper[4895]: E1202 08:39:54.475434 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd06d10f-6a00-4507-abbe-9f05710bc54f" containerName="extract-utilities" Dec 02 08:39:54 crc kubenswrapper[4895]: I1202 08:39:54.475445 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd06d10f-6a00-4507-abbe-9f05710bc54f" 
containerName="extract-utilities" Dec 02 08:39:54 crc kubenswrapper[4895]: I1202 08:39:54.480137 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd06d10f-6a00-4507-abbe-9f05710bc54f" containerName="registry-server" Dec 02 08:39:54 crc kubenswrapper[4895]: I1202 08:39:54.481531 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xhnzx" Dec 02 08:39:54 crc kubenswrapper[4895]: I1202 08:39:54.482810 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xhnzx"] Dec 02 08:39:54 crc kubenswrapper[4895]: I1202 08:39:54.572288 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pklzk\" (UniqueName: \"kubernetes.io/projected/a4f36a39-319e-4610-b26c-1287b4334f41-kube-api-access-pklzk\") pod \"redhat-marketplace-xhnzx\" (UID: \"a4f36a39-319e-4610-b26c-1287b4334f41\") " pod="openshift-marketplace/redhat-marketplace-xhnzx" Dec 02 08:39:54 crc kubenswrapper[4895]: I1202 08:39:54.572334 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4f36a39-319e-4610-b26c-1287b4334f41-catalog-content\") pod \"redhat-marketplace-xhnzx\" (UID: \"a4f36a39-319e-4610-b26c-1287b4334f41\") " pod="openshift-marketplace/redhat-marketplace-xhnzx" Dec 02 08:39:54 crc kubenswrapper[4895]: I1202 08:39:54.572381 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4f36a39-319e-4610-b26c-1287b4334f41-utilities\") pod \"redhat-marketplace-xhnzx\" (UID: \"a4f36a39-319e-4610-b26c-1287b4334f41\") " pod="openshift-marketplace/redhat-marketplace-xhnzx" Dec 02 08:39:54 crc kubenswrapper[4895]: I1202 08:39:54.673214 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4f36a39-319e-4610-b26c-1287b4334f41-utilities\") pod \"redhat-marketplace-xhnzx\" (UID: \"a4f36a39-319e-4610-b26c-1287b4334f41\") " pod="openshift-marketplace/redhat-marketplace-xhnzx" Dec 02 08:39:54 crc kubenswrapper[4895]: I1202 08:39:54.673635 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pklzk\" (UniqueName: \"kubernetes.io/projected/a4f36a39-319e-4610-b26c-1287b4334f41-kube-api-access-pklzk\") pod \"redhat-marketplace-xhnzx\" (UID: \"a4f36a39-319e-4610-b26c-1287b4334f41\") " pod="openshift-marketplace/redhat-marketplace-xhnzx" Dec 02 08:39:54 crc kubenswrapper[4895]: I1202 08:39:54.673663 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4f36a39-319e-4610-b26c-1287b4334f41-catalog-content\") pod \"redhat-marketplace-xhnzx\" (UID: \"a4f36a39-319e-4610-b26c-1287b4334f41\") " pod="openshift-marketplace/redhat-marketplace-xhnzx" Dec 02 08:39:54 crc kubenswrapper[4895]: I1202 08:39:54.673851 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4f36a39-319e-4610-b26c-1287b4334f41-utilities\") pod \"redhat-marketplace-xhnzx\" (UID: \"a4f36a39-319e-4610-b26c-1287b4334f41\") " pod="openshift-marketplace/redhat-marketplace-xhnzx" Dec 02 08:39:54 crc kubenswrapper[4895]: I1202 08:39:54.674053 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4f36a39-319e-4610-b26c-1287b4334f41-catalog-content\") pod \"redhat-marketplace-xhnzx\" (UID: \"a4f36a39-319e-4610-b26c-1287b4334f41\") " pod="openshift-marketplace/redhat-marketplace-xhnzx" Dec 02 08:39:54 crc kubenswrapper[4895]: I1202 08:39:54.702981 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pklzk\" (UniqueName: 
\"kubernetes.io/projected/a4f36a39-319e-4610-b26c-1287b4334f41-kube-api-access-pklzk\") pod \"redhat-marketplace-xhnzx\" (UID: \"a4f36a39-319e-4610-b26c-1287b4334f41\") " pod="openshift-marketplace/redhat-marketplace-xhnzx" Dec 02 08:39:54 crc kubenswrapper[4895]: I1202 08:39:54.800833 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xhnzx" Dec 02 08:39:55 crc kubenswrapper[4895]: I1202 08:39:55.140940 4895 scope.go:117] "RemoveContainer" containerID="69a4bb5ee156ed278d803f64131ce7c25664438c59334774293c1c389adc3c0a" Dec 02 08:39:55 crc kubenswrapper[4895]: E1202 08:39:55.141445 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 08:39:55 crc kubenswrapper[4895]: I1202 08:39:55.312368 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xhnzx"] Dec 02 08:39:55 crc kubenswrapper[4895]: W1202 08:39:55.317909 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda4f36a39_319e_4610_b26c_1287b4334f41.slice/crio-8a58bea8a54a30f2abbd1e6a758c2ab740f060f9f8ef4721fdd0c29c47b18123 WatchSource:0}: Error finding container 8a58bea8a54a30f2abbd1e6a758c2ab740f060f9f8ef4721fdd0c29c47b18123: Status 404 returned error can't find the container with id 8a58bea8a54a30f2abbd1e6a758c2ab740f060f9f8ef4721fdd0c29c47b18123 Dec 02 08:39:55 crc kubenswrapper[4895]: I1202 08:39:55.967859 4895 generic.go:334] "Generic (PLEG): container finished" podID="a4f36a39-319e-4610-b26c-1287b4334f41" 
containerID="3e2e3e8782f9f25f3c029505e766cf513b349da0963c87519445d597ed3809c4" exitCode=0 Dec 02 08:39:55 crc kubenswrapper[4895]: I1202 08:39:55.967971 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xhnzx" event={"ID":"a4f36a39-319e-4610-b26c-1287b4334f41","Type":"ContainerDied","Data":"3e2e3e8782f9f25f3c029505e766cf513b349da0963c87519445d597ed3809c4"} Dec 02 08:39:55 crc kubenswrapper[4895]: I1202 08:39:55.968304 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xhnzx" event={"ID":"a4f36a39-319e-4610-b26c-1287b4334f41","Type":"ContainerStarted","Data":"8a58bea8a54a30f2abbd1e6a758c2ab740f060f9f8ef4721fdd0c29c47b18123"} Dec 02 08:39:56 crc kubenswrapper[4895]: I1202 08:39:56.978375 4895 generic.go:334] "Generic (PLEG): container finished" podID="a4f36a39-319e-4610-b26c-1287b4334f41" containerID="f4f0b39e7bbfc983f9cd22a7aa46de3e28fd835e0f25f927b8ce5c6ad02e6d2e" exitCode=0 Dec 02 08:39:56 crc kubenswrapper[4895]: I1202 08:39:56.978429 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xhnzx" event={"ID":"a4f36a39-319e-4610-b26c-1287b4334f41","Type":"ContainerDied","Data":"f4f0b39e7bbfc983f9cd22a7aa46de3e28fd835e0f25f927b8ce5c6ad02e6d2e"} Dec 02 08:39:58 crc kubenswrapper[4895]: I1202 08:39:58.995792 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xhnzx" event={"ID":"a4f36a39-319e-4610-b26c-1287b4334f41","Type":"ContainerStarted","Data":"9c986d1dad7dc627ede095382f3112525773e314d2805152c45fa9df8a3e70b8"} Dec 02 08:39:59 crc kubenswrapper[4895]: I1202 08:39:59.023590 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-xhnzx" podStartSLOduration=3.144609374 podStartE2EDuration="5.023574281s" podCreationTimestamp="2025-12-02 08:39:54 +0000 UTC" firstStartedPulling="2025-12-02 08:39:55.970733236 +0000 
UTC m=+4607.141592849" lastFinishedPulling="2025-12-02 08:39:57.849698133 +0000 UTC m=+4609.020557756" observedRunningTime="2025-12-02 08:39:59.018983309 +0000 UTC m=+4610.189842942" watchObservedRunningTime="2025-12-02 08:39:59.023574281 +0000 UTC m=+4610.194433894" Dec 02 08:39:59 crc kubenswrapper[4895]: I1202 08:39:59.987840 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-xbsj8"] Dec 02 08:39:59 crc kubenswrapper[4895]: I1202 08:39:59.992253 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-xbsj8"] Dec 02 08:40:00 crc kubenswrapper[4895]: I1202 08:40:00.097954 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-s9sdp"] Dec 02 08:40:00 crc kubenswrapper[4895]: I1202 08:40:00.099072 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-s9sdp" Dec 02 08:40:00 crc kubenswrapper[4895]: I1202 08:40:00.101469 4895 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-pcv42" Dec 02 08:40:00 crc kubenswrapper[4895]: I1202 08:40:00.101496 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Dec 02 08:40:00 crc kubenswrapper[4895]: I1202 08:40:00.101504 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Dec 02 08:40:00 crc kubenswrapper[4895]: I1202 08:40:00.101504 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Dec 02 08:40:00 crc kubenswrapper[4895]: I1202 08:40:00.117502 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-s9sdp"] Dec 02 08:40:00 crc kubenswrapper[4895]: I1202 08:40:00.254021 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: 
\"kubernetes.io/configmap/c4d1531f-c7c0-42b8-8b39-50b0f280079f-crc-storage\") pod \"crc-storage-crc-s9sdp\" (UID: \"c4d1531f-c7c0-42b8-8b39-50b0f280079f\") " pod="crc-storage/crc-storage-crc-s9sdp" Dec 02 08:40:00 crc kubenswrapper[4895]: I1202 08:40:00.254880 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbcxd\" (UniqueName: \"kubernetes.io/projected/c4d1531f-c7c0-42b8-8b39-50b0f280079f-kube-api-access-hbcxd\") pod \"crc-storage-crc-s9sdp\" (UID: \"c4d1531f-c7c0-42b8-8b39-50b0f280079f\") " pod="crc-storage/crc-storage-crc-s9sdp" Dec 02 08:40:00 crc kubenswrapper[4895]: I1202 08:40:00.255228 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/c4d1531f-c7c0-42b8-8b39-50b0f280079f-node-mnt\") pod \"crc-storage-crc-s9sdp\" (UID: \"c4d1531f-c7c0-42b8-8b39-50b0f280079f\") " pod="crc-storage/crc-storage-crc-s9sdp" Dec 02 08:40:00 crc kubenswrapper[4895]: I1202 08:40:00.356267 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/c4d1531f-c7c0-42b8-8b39-50b0f280079f-crc-storage\") pod \"crc-storage-crc-s9sdp\" (UID: \"c4d1531f-c7c0-42b8-8b39-50b0f280079f\") " pod="crc-storage/crc-storage-crc-s9sdp" Dec 02 08:40:00 crc kubenswrapper[4895]: I1202 08:40:00.356308 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hbcxd\" (UniqueName: \"kubernetes.io/projected/c4d1531f-c7c0-42b8-8b39-50b0f280079f-kube-api-access-hbcxd\") pod \"crc-storage-crc-s9sdp\" (UID: \"c4d1531f-c7c0-42b8-8b39-50b0f280079f\") " pod="crc-storage/crc-storage-crc-s9sdp" Dec 02 08:40:00 crc kubenswrapper[4895]: I1202 08:40:00.356361 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/c4d1531f-c7c0-42b8-8b39-50b0f280079f-node-mnt\") 
pod \"crc-storage-crc-s9sdp\" (UID: \"c4d1531f-c7c0-42b8-8b39-50b0f280079f\") " pod="crc-storage/crc-storage-crc-s9sdp" Dec 02 08:40:00 crc kubenswrapper[4895]: I1202 08:40:00.356660 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/c4d1531f-c7c0-42b8-8b39-50b0f280079f-node-mnt\") pod \"crc-storage-crc-s9sdp\" (UID: \"c4d1531f-c7c0-42b8-8b39-50b0f280079f\") " pod="crc-storage/crc-storage-crc-s9sdp" Dec 02 08:40:00 crc kubenswrapper[4895]: I1202 08:40:00.357342 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/c4d1531f-c7c0-42b8-8b39-50b0f280079f-crc-storage\") pod \"crc-storage-crc-s9sdp\" (UID: \"c4d1531f-c7c0-42b8-8b39-50b0f280079f\") " pod="crc-storage/crc-storage-crc-s9sdp" Dec 02 08:40:00 crc kubenswrapper[4895]: I1202 08:40:00.374299 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbcxd\" (UniqueName: \"kubernetes.io/projected/c4d1531f-c7c0-42b8-8b39-50b0f280079f-kube-api-access-hbcxd\") pod \"crc-storage-crc-s9sdp\" (UID: \"c4d1531f-c7c0-42b8-8b39-50b0f280079f\") " pod="crc-storage/crc-storage-crc-s9sdp" Dec 02 08:40:00 crc kubenswrapper[4895]: I1202 08:40:00.416436 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-s9sdp" Dec 02 08:40:00 crc kubenswrapper[4895]: I1202 08:40:00.835719 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-s9sdp"] Dec 02 08:40:00 crc kubenswrapper[4895]: I1202 08:40:00.865264 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-gq5nd"] Dec 02 08:40:00 crc kubenswrapper[4895]: I1202 08:40:00.867013 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gq5nd" Dec 02 08:40:00 crc kubenswrapper[4895]: I1202 08:40:00.881000 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gq5nd"] Dec 02 08:40:00 crc kubenswrapper[4895]: I1202 08:40:00.967282 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/afa99875-6279-406c-a5e2-1023624e80d9-catalog-content\") pod \"community-operators-gq5nd\" (UID: \"afa99875-6279-406c-a5e2-1023624e80d9\") " pod="openshift-marketplace/community-operators-gq5nd" Dec 02 08:40:00 crc kubenswrapper[4895]: I1202 08:40:00.967367 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9z47p\" (UniqueName: \"kubernetes.io/projected/afa99875-6279-406c-a5e2-1023624e80d9-kube-api-access-9z47p\") pod \"community-operators-gq5nd\" (UID: \"afa99875-6279-406c-a5e2-1023624e80d9\") " pod="openshift-marketplace/community-operators-gq5nd" Dec 02 08:40:00 crc kubenswrapper[4895]: I1202 08:40:00.967389 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/afa99875-6279-406c-a5e2-1023624e80d9-utilities\") pod \"community-operators-gq5nd\" (UID: \"afa99875-6279-406c-a5e2-1023624e80d9\") " pod="openshift-marketplace/community-operators-gq5nd" Dec 02 08:40:01 crc kubenswrapper[4895]: I1202 08:40:01.068358 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9z47p\" (UniqueName: \"kubernetes.io/projected/afa99875-6279-406c-a5e2-1023624e80d9-kube-api-access-9z47p\") pod \"community-operators-gq5nd\" (UID: \"afa99875-6279-406c-a5e2-1023624e80d9\") " pod="openshift-marketplace/community-operators-gq5nd" Dec 02 08:40:01 crc kubenswrapper[4895]: I1202 08:40:01.068420 4895 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/afa99875-6279-406c-a5e2-1023624e80d9-utilities\") pod \"community-operators-gq5nd\" (UID: \"afa99875-6279-406c-a5e2-1023624e80d9\") " pod="openshift-marketplace/community-operators-gq5nd" Dec 02 08:40:01 crc kubenswrapper[4895]: I1202 08:40:01.068858 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/afa99875-6279-406c-a5e2-1023624e80d9-catalog-content\") pod \"community-operators-gq5nd\" (UID: \"afa99875-6279-406c-a5e2-1023624e80d9\") " pod="openshift-marketplace/community-operators-gq5nd" Dec 02 08:40:01 crc kubenswrapper[4895]: I1202 08:40:01.069260 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/afa99875-6279-406c-a5e2-1023624e80d9-utilities\") pod \"community-operators-gq5nd\" (UID: \"afa99875-6279-406c-a5e2-1023624e80d9\") " pod="openshift-marketplace/community-operators-gq5nd" Dec 02 08:40:01 crc kubenswrapper[4895]: I1202 08:40:01.069292 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/afa99875-6279-406c-a5e2-1023624e80d9-catalog-content\") pod \"community-operators-gq5nd\" (UID: \"afa99875-6279-406c-a5e2-1023624e80d9\") " pod="openshift-marketplace/community-operators-gq5nd" Dec 02 08:40:01 crc kubenswrapper[4895]: I1202 08:40:01.150574 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9z47p\" (UniqueName: \"kubernetes.io/projected/afa99875-6279-406c-a5e2-1023624e80d9-kube-api-access-9z47p\") pod \"community-operators-gq5nd\" (UID: \"afa99875-6279-406c-a5e2-1023624e80d9\") " pod="openshift-marketplace/community-operators-gq5nd" Dec 02 08:40:01 crc kubenswrapper[4895]: I1202 08:40:01.151620 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="1e33d18d-577d-4610-9653-d3acf0bd9578" path="/var/lib/kubelet/pods/1e33d18d-577d-4610-9653-d3acf0bd9578/volumes" Dec 02 08:40:01 crc kubenswrapper[4895]: I1202 08:40:01.209008 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gq5nd" Dec 02 08:40:01 crc kubenswrapper[4895]: I1202 08:40:01.470891 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gq5nd"] Dec 02 08:40:02 crc kubenswrapper[4895]: I1202 08:40:02.019008 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-s9sdp" event={"ID":"c4d1531f-c7c0-42b8-8b39-50b0f280079f","Type":"ContainerStarted","Data":"f9b36d8fd882938d93304b065a9a61dea8e16adc20777a08d824135c94c6f752"} Dec 02 08:40:02 crc kubenswrapper[4895]: I1202 08:40:02.020516 4895 generic.go:334] "Generic (PLEG): container finished" podID="afa99875-6279-406c-a5e2-1023624e80d9" containerID="5591f0f56a5becdc0541aae40da6ed31d8dc645cc43fb7b9d7a617bee29b4903" exitCode=0 Dec 02 08:40:02 crc kubenswrapper[4895]: I1202 08:40:02.020546 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gq5nd" event={"ID":"afa99875-6279-406c-a5e2-1023624e80d9","Type":"ContainerDied","Data":"5591f0f56a5becdc0541aae40da6ed31d8dc645cc43fb7b9d7a617bee29b4903"} Dec 02 08:40:02 crc kubenswrapper[4895]: I1202 08:40:02.020562 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gq5nd" event={"ID":"afa99875-6279-406c-a5e2-1023624e80d9","Type":"ContainerStarted","Data":"b7871390fc6dbe2ba1e69944ae9cbc0e517e7fb2cd6c8ddbefdfab03e2c69b6b"} Dec 02 08:40:03 crc kubenswrapper[4895]: I1202 08:40:03.029367 4895 generic.go:334] "Generic (PLEG): container finished" podID="c4d1531f-c7c0-42b8-8b39-50b0f280079f" containerID="0ebdb8aa2004d72423d7de16f89e0c5be4bb4adf06c82756f17f7df43fafee24" exitCode=0 Dec 02 08:40:03 crc 
kubenswrapper[4895]: I1202 08:40:03.029417 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-s9sdp" event={"ID":"c4d1531f-c7c0-42b8-8b39-50b0f280079f","Type":"ContainerDied","Data":"0ebdb8aa2004d72423d7de16f89e0c5be4bb4adf06c82756f17f7df43fafee24"} Dec 02 08:40:04 crc kubenswrapper[4895]: I1202 08:40:04.317260 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-s9sdp" Dec 02 08:40:04 crc kubenswrapper[4895]: I1202 08:40:04.414847 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hbcxd\" (UniqueName: \"kubernetes.io/projected/c4d1531f-c7c0-42b8-8b39-50b0f280079f-kube-api-access-hbcxd\") pod \"c4d1531f-c7c0-42b8-8b39-50b0f280079f\" (UID: \"c4d1531f-c7c0-42b8-8b39-50b0f280079f\") " Dec 02 08:40:04 crc kubenswrapper[4895]: I1202 08:40:04.414941 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/c4d1531f-c7c0-42b8-8b39-50b0f280079f-crc-storage\") pod \"c4d1531f-c7c0-42b8-8b39-50b0f280079f\" (UID: \"c4d1531f-c7c0-42b8-8b39-50b0f280079f\") " Dec 02 08:40:04 crc kubenswrapper[4895]: I1202 08:40:04.415013 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/c4d1531f-c7c0-42b8-8b39-50b0f280079f-node-mnt\") pod \"c4d1531f-c7c0-42b8-8b39-50b0f280079f\" (UID: \"c4d1531f-c7c0-42b8-8b39-50b0f280079f\") " Dec 02 08:40:04 crc kubenswrapper[4895]: I1202 08:40:04.415214 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c4d1531f-c7c0-42b8-8b39-50b0f280079f-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "c4d1531f-c7c0-42b8-8b39-50b0f280079f" (UID: "c4d1531f-c7c0-42b8-8b39-50b0f280079f"). InnerVolumeSpecName "node-mnt". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 08:40:04 crc kubenswrapper[4895]: I1202 08:40:04.415437 4895 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/c4d1531f-c7c0-42b8-8b39-50b0f280079f-node-mnt\") on node \"crc\" DevicePath \"\"" Dec 02 08:40:04 crc kubenswrapper[4895]: I1202 08:40:04.428657 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4d1531f-c7c0-42b8-8b39-50b0f280079f-kube-api-access-hbcxd" (OuterVolumeSpecName: "kube-api-access-hbcxd") pod "c4d1531f-c7c0-42b8-8b39-50b0f280079f" (UID: "c4d1531f-c7c0-42b8-8b39-50b0f280079f"). InnerVolumeSpecName "kube-api-access-hbcxd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:40:04 crc kubenswrapper[4895]: I1202 08:40:04.438966 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4d1531f-c7c0-42b8-8b39-50b0f280079f-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "c4d1531f-c7c0-42b8-8b39-50b0f280079f" (UID: "c4d1531f-c7c0-42b8-8b39-50b0f280079f"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:40:04 crc kubenswrapper[4895]: I1202 08:40:04.516683 4895 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/c4d1531f-c7c0-42b8-8b39-50b0f280079f-crc-storage\") on node \"crc\" DevicePath \"\"" Dec 02 08:40:04 crc kubenswrapper[4895]: I1202 08:40:04.516714 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hbcxd\" (UniqueName: \"kubernetes.io/projected/c4d1531f-c7c0-42b8-8b39-50b0f280079f-kube-api-access-hbcxd\") on node \"crc\" DevicePath \"\"" Dec 02 08:40:04 crc kubenswrapper[4895]: I1202 08:40:04.801622 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-xhnzx" Dec 02 08:40:04 crc kubenswrapper[4895]: I1202 08:40:04.802097 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-xhnzx" Dec 02 08:40:04 crc kubenswrapper[4895]: I1202 08:40:04.841372 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-xhnzx" Dec 02 08:40:05 crc kubenswrapper[4895]: I1202 08:40:05.047981 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-s9sdp" Dec 02 08:40:05 crc kubenswrapper[4895]: I1202 08:40:05.047997 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-s9sdp" event={"ID":"c4d1531f-c7c0-42b8-8b39-50b0f280079f","Type":"ContainerDied","Data":"f9b36d8fd882938d93304b065a9a61dea8e16adc20777a08d824135c94c6f752"} Dec 02 08:40:05 crc kubenswrapper[4895]: I1202 08:40:05.048032 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f9b36d8fd882938d93304b065a9a61dea8e16adc20777a08d824135c94c6f752" Dec 02 08:40:05 crc kubenswrapper[4895]: I1202 08:40:05.050224 4895 generic.go:334] "Generic (PLEG): container finished" podID="afa99875-6279-406c-a5e2-1023624e80d9" containerID="aa88c226c4937b1d2c61213697669e4c8f366b630a31c6a3b8f0c85e0fe676c0" exitCode=0 Dec 02 08:40:05 crc kubenswrapper[4895]: I1202 08:40:05.050272 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gq5nd" event={"ID":"afa99875-6279-406c-a5e2-1023624e80d9","Type":"ContainerDied","Data":"aa88c226c4937b1d2c61213697669e4c8f366b630a31c6a3b8f0c85e0fe676c0"} Dec 02 08:40:05 crc kubenswrapper[4895]: I1202 08:40:05.092959 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-xhnzx" Dec 02 08:40:06 crc kubenswrapper[4895]: I1202 08:40:06.074229 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gq5nd" event={"ID":"afa99875-6279-406c-a5e2-1023624e80d9","Type":"ContainerStarted","Data":"fd51cb3e304793b041a38702f8dcc4bac555ed4c2e288a6ac2da945739b5eda7"} Dec 02 08:40:06 crc kubenswrapper[4895]: I1202 08:40:06.096120 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-gq5nd" podStartSLOduration=2.600475435 podStartE2EDuration="6.096104842s" podCreationTimestamp="2025-12-02 08:40:00 +0000 UTC" 
firstStartedPulling="2025-12-02 08:40:02.021799088 +0000 UTC m=+4613.192658701" lastFinishedPulling="2025-12-02 08:40:05.517428495 +0000 UTC m=+4616.688288108" observedRunningTime="2025-12-02 08:40:06.091642513 +0000 UTC m=+4617.262502126" watchObservedRunningTime="2025-12-02 08:40:06.096104842 +0000 UTC m=+4617.266964455" Dec 02 08:40:06 crc kubenswrapper[4895]: I1202 08:40:06.538590 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-s9sdp"] Dec 02 08:40:06 crc kubenswrapper[4895]: I1202 08:40:06.544920 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-s9sdp"] Dec 02 08:40:06 crc kubenswrapper[4895]: I1202 08:40:06.701665 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-fkx6r"] Dec 02 08:40:06 crc kubenswrapper[4895]: E1202 08:40:06.702041 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4d1531f-c7c0-42b8-8b39-50b0f280079f" containerName="storage" Dec 02 08:40:06 crc kubenswrapper[4895]: I1202 08:40:06.702058 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4d1531f-c7c0-42b8-8b39-50b0f280079f" containerName="storage" Dec 02 08:40:06 crc kubenswrapper[4895]: I1202 08:40:06.702196 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4d1531f-c7c0-42b8-8b39-50b0f280079f" containerName="storage" Dec 02 08:40:06 crc kubenswrapper[4895]: I1202 08:40:06.702673 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-fkx6r" Dec 02 08:40:06 crc kubenswrapper[4895]: I1202 08:40:06.706247 4895 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-pcv42" Dec 02 08:40:06 crc kubenswrapper[4895]: I1202 08:40:06.707083 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Dec 02 08:40:06 crc kubenswrapper[4895]: I1202 08:40:06.707080 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Dec 02 08:40:06 crc kubenswrapper[4895]: I1202 08:40:06.707114 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Dec 02 08:40:06 crc kubenswrapper[4895]: I1202 08:40:06.714072 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-fkx6r"] Dec 02 08:40:06 crc kubenswrapper[4895]: I1202 08:40:06.858453 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/43971b5a-f4a8-4e61-b832-fcf5ff66ffb3-crc-storage\") pod \"crc-storage-crc-fkx6r\" (UID: \"43971b5a-f4a8-4e61-b832-fcf5ff66ffb3\") " pod="crc-storage/crc-storage-crc-fkx6r" Dec 02 08:40:06 crc kubenswrapper[4895]: I1202 08:40:06.858646 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7c8p\" (UniqueName: \"kubernetes.io/projected/43971b5a-f4a8-4e61-b832-fcf5ff66ffb3-kube-api-access-m7c8p\") pod \"crc-storage-crc-fkx6r\" (UID: \"43971b5a-f4a8-4e61-b832-fcf5ff66ffb3\") " pod="crc-storage/crc-storage-crc-fkx6r" Dec 02 08:40:06 crc kubenswrapper[4895]: I1202 08:40:06.858682 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/43971b5a-f4a8-4e61-b832-fcf5ff66ffb3-node-mnt\") pod \"crc-storage-crc-fkx6r\" (UID: 
\"43971b5a-f4a8-4e61-b832-fcf5ff66ffb3\") " pod="crc-storage/crc-storage-crc-fkx6r" Dec 02 08:40:06 crc kubenswrapper[4895]: I1202 08:40:06.959910 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7c8p\" (UniqueName: \"kubernetes.io/projected/43971b5a-f4a8-4e61-b832-fcf5ff66ffb3-kube-api-access-m7c8p\") pod \"crc-storage-crc-fkx6r\" (UID: \"43971b5a-f4a8-4e61-b832-fcf5ff66ffb3\") " pod="crc-storage/crc-storage-crc-fkx6r" Dec 02 08:40:06 crc kubenswrapper[4895]: I1202 08:40:06.959971 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/43971b5a-f4a8-4e61-b832-fcf5ff66ffb3-node-mnt\") pod \"crc-storage-crc-fkx6r\" (UID: \"43971b5a-f4a8-4e61-b832-fcf5ff66ffb3\") " pod="crc-storage/crc-storage-crc-fkx6r" Dec 02 08:40:06 crc kubenswrapper[4895]: I1202 08:40:06.960040 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/43971b5a-f4a8-4e61-b832-fcf5ff66ffb3-crc-storage\") pod \"crc-storage-crc-fkx6r\" (UID: \"43971b5a-f4a8-4e61-b832-fcf5ff66ffb3\") " pod="crc-storage/crc-storage-crc-fkx6r" Dec 02 08:40:06 crc kubenswrapper[4895]: I1202 08:40:06.960424 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/43971b5a-f4a8-4e61-b832-fcf5ff66ffb3-node-mnt\") pod \"crc-storage-crc-fkx6r\" (UID: \"43971b5a-f4a8-4e61-b832-fcf5ff66ffb3\") " pod="crc-storage/crc-storage-crc-fkx6r" Dec 02 08:40:06 crc kubenswrapper[4895]: I1202 08:40:06.960976 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/43971b5a-f4a8-4e61-b832-fcf5ff66ffb3-crc-storage\") pod \"crc-storage-crc-fkx6r\" (UID: \"43971b5a-f4a8-4e61-b832-fcf5ff66ffb3\") " pod="crc-storage/crc-storage-crc-fkx6r" Dec 02 08:40:06 crc kubenswrapper[4895]: I1202 08:40:06.979867 4895 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7c8p\" (UniqueName: \"kubernetes.io/projected/43971b5a-f4a8-4e61-b832-fcf5ff66ffb3-kube-api-access-m7c8p\") pod \"crc-storage-crc-fkx6r\" (UID: \"43971b5a-f4a8-4e61-b832-fcf5ff66ffb3\") " pod="crc-storage/crc-storage-crc-fkx6r" Dec 02 08:40:07 crc kubenswrapper[4895]: I1202 08:40:07.026818 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-fkx6r" Dec 02 08:40:07 crc kubenswrapper[4895]: I1202 08:40:07.142877 4895 scope.go:117] "RemoveContainer" containerID="69a4bb5ee156ed278d803f64131ce7c25664438c59334774293c1c389adc3c0a" Dec 02 08:40:07 crc kubenswrapper[4895]: I1202 08:40:07.159998 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4d1531f-c7c0-42b8-8b39-50b0f280079f" path="/var/lib/kubelet/pods/c4d1531f-c7c0-42b8-8b39-50b0f280079f/volumes" Dec 02 08:40:07 crc kubenswrapper[4895]: I1202 08:40:07.444910 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-fkx6r"] Dec 02 08:40:07 crc kubenswrapper[4895]: I1202 08:40:07.455397 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xhnzx"] Dec 02 08:40:08 crc kubenswrapper[4895]: I1202 08:40:08.086871 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-fkx6r" event={"ID":"43971b5a-f4a8-4e61-b832-fcf5ff66ffb3","Type":"ContainerStarted","Data":"02e8aa2ee187efce3087e6e00b02eb0327a658e4c31d9f21bd421265a1aa6f5b"} Dec 02 08:40:08 crc kubenswrapper[4895]: I1202 08:40:08.091323 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" event={"ID":"0468d2d1-a975-45a6-af9f-47adc432fab0","Type":"ContainerStarted","Data":"32bf7d392743b71deb119b4fd3e6dd2e4aeb7c86e6abc8aa43066f6a5cc4af85"} Dec 02 08:40:08 crc kubenswrapper[4895]: I1202 08:40:08.091450 4895 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-xhnzx" podUID="a4f36a39-319e-4610-b26c-1287b4334f41" containerName="registry-server" containerID="cri-o://9c986d1dad7dc627ede095382f3112525773e314d2805152c45fa9df8a3e70b8" gracePeriod=2 Dec 02 08:40:08 crc kubenswrapper[4895]: E1202 08:40:08.273730 4895 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda4f36a39_319e_4610_b26c_1287b4334f41.slice/crio-9c986d1dad7dc627ede095382f3112525773e314d2805152c45fa9df8a3e70b8.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda4f36a39_319e_4610_b26c_1287b4334f41.slice/crio-conmon-9c986d1dad7dc627ede095382f3112525773e314d2805152c45fa9df8a3e70b8.scope\": RecentStats: unable to find data in memory cache]" Dec 02 08:40:08 crc kubenswrapper[4895]: I1202 08:40:08.479400 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xhnzx" Dec 02 08:40:08 crc kubenswrapper[4895]: I1202 08:40:08.581516 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4f36a39-319e-4610-b26c-1287b4334f41-catalog-content\") pod \"a4f36a39-319e-4610-b26c-1287b4334f41\" (UID: \"a4f36a39-319e-4610-b26c-1287b4334f41\") " Dec 02 08:40:08 crc kubenswrapper[4895]: I1202 08:40:08.581624 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pklzk\" (UniqueName: \"kubernetes.io/projected/a4f36a39-319e-4610-b26c-1287b4334f41-kube-api-access-pklzk\") pod \"a4f36a39-319e-4610-b26c-1287b4334f41\" (UID: \"a4f36a39-319e-4610-b26c-1287b4334f41\") " Dec 02 08:40:08 crc kubenswrapper[4895]: I1202 08:40:08.581705 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4f36a39-319e-4610-b26c-1287b4334f41-utilities\") pod \"a4f36a39-319e-4610-b26c-1287b4334f41\" (UID: \"a4f36a39-319e-4610-b26c-1287b4334f41\") " Dec 02 08:40:08 crc kubenswrapper[4895]: I1202 08:40:08.582967 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4f36a39-319e-4610-b26c-1287b4334f41-utilities" (OuterVolumeSpecName: "utilities") pod "a4f36a39-319e-4610-b26c-1287b4334f41" (UID: "a4f36a39-319e-4610-b26c-1287b4334f41"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:40:08 crc kubenswrapper[4895]: I1202 08:40:08.595095 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4f36a39-319e-4610-b26c-1287b4334f41-kube-api-access-pklzk" (OuterVolumeSpecName: "kube-api-access-pklzk") pod "a4f36a39-319e-4610-b26c-1287b4334f41" (UID: "a4f36a39-319e-4610-b26c-1287b4334f41"). InnerVolumeSpecName "kube-api-access-pklzk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:40:08 crc kubenswrapper[4895]: I1202 08:40:08.601205 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4f36a39-319e-4610-b26c-1287b4334f41-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a4f36a39-319e-4610-b26c-1287b4334f41" (UID: "a4f36a39-319e-4610-b26c-1287b4334f41"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:40:08 crc kubenswrapper[4895]: I1202 08:40:08.682832 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pklzk\" (UniqueName: \"kubernetes.io/projected/a4f36a39-319e-4610-b26c-1287b4334f41-kube-api-access-pklzk\") on node \"crc\" DevicePath \"\"" Dec 02 08:40:08 crc kubenswrapper[4895]: I1202 08:40:08.682860 4895 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4f36a39-319e-4610-b26c-1287b4334f41-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 08:40:08 crc kubenswrapper[4895]: I1202 08:40:08.682870 4895 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4f36a39-319e-4610-b26c-1287b4334f41-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 08:40:09 crc kubenswrapper[4895]: I1202 08:40:09.099843 4895 generic.go:334] "Generic (PLEG): container finished" podID="43971b5a-f4a8-4e61-b832-fcf5ff66ffb3" containerID="e5418c00bf050fca001742068d75b484a8a97c0f1ddc8bc711fcfdae8af0404f" exitCode=0 Dec 02 08:40:09 crc kubenswrapper[4895]: I1202 08:40:09.099895 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-fkx6r" event={"ID":"43971b5a-f4a8-4e61-b832-fcf5ff66ffb3","Type":"ContainerDied","Data":"e5418c00bf050fca001742068d75b484a8a97c0f1ddc8bc711fcfdae8af0404f"} Dec 02 08:40:09 crc kubenswrapper[4895]: I1202 08:40:09.103330 4895 generic.go:334] "Generic (PLEG): container finished" 
podID="a4f36a39-319e-4610-b26c-1287b4334f41" containerID="9c986d1dad7dc627ede095382f3112525773e314d2805152c45fa9df8a3e70b8" exitCode=0 Dec 02 08:40:09 crc kubenswrapper[4895]: I1202 08:40:09.103380 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xhnzx" event={"ID":"a4f36a39-319e-4610-b26c-1287b4334f41","Type":"ContainerDied","Data":"9c986d1dad7dc627ede095382f3112525773e314d2805152c45fa9df8a3e70b8"} Dec 02 08:40:09 crc kubenswrapper[4895]: I1202 08:40:09.103405 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xhnzx" event={"ID":"a4f36a39-319e-4610-b26c-1287b4334f41","Type":"ContainerDied","Data":"8a58bea8a54a30f2abbd1e6a758c2ab740f060f9f8ef4721fdd0c29c47b18123"} Dec 02 08:40:09 crc kubenswrapper[4895]: I1202 08:40:09.103415 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xhnzx" Dec 02 08:40:09 crc kubenswrapper[4895]: I1202 08:40:09.103421 4895 scope.go:117] "RemoveContainer" containerID="9c986d1dad7dc627ede095382f3112525773e314d2805152c45fa9df8a3e70b8" Dec 02 08:40:09 crc kubenswrapper[4895]: I1202 08:40:09.132084 4895 scope.go:117] "RemoveContainer" containerID="f4f0b39e7bbfc983f9cd22a7aa46de3e28fd835e0f25f927b8ce5c6ad02e6d2e" Dec 02 08:40:09 crc kubenswrapper[4895]: I1202 08:40:09.172854 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xhnzx"] Dec 02 08:40:09 crc kubenswrapper[4895]: I1202 08:40:09.173117 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-xhnzx"] Dec 02 08:40:09 crc kubenswrapper[4895]: I1202 08:40:09.567385 4895 scope.go:117] "RemoveContainer" containerID="3e2e3e8782f9f25f3c029505e766cf513b349da0963c87519445d597ed3809c4" Dec 02 08:40:09 crc kubenswrapper[4895]: I1202 08:40:09.597454 4895 scope.go:117] "RemoveContainer" 
containerID="9c986d1dad7dc627ede095382f3112525773e314d2805152c45fa9df8a3e70b8" Dec 02 08:40:09 crc kubenswrapper[4895]: E1202 08:40:09.598373 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c986d1dad7dc627ede095382f3112525773e314d2805152c45fa9df8a3e70b8\": container with ID starting with 9c986d1dad7dc627ede095382f3112525773e314d2805152c45fa9df8a3e70b8 not found: ID does not exist" containerID="9c986d1dad7dc627ede095382f3112525773e314d2805152c45fa9df8a3e70b8" Dec 02 08:40:09 crc kubenswrapper[4895]: I1202 08:40:09.598490 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c986d1dad7dc627ede095382f3112525773e314d2805152c45fa9df8a3e70b8"} err="failed to get container status \"9c986d1dad7dc627ede095382f3112525773e314d2805152c45fa9df8a3e70b8\": rpc error: code = NotFound desc = could not find container \"9c986d1dad7dc627ede095382f3112525773e314d2805152c45fa9df8a3e70b8\": container with ID starting with 9c986d1dad7dc627ede095382f3112525773e314d2805152c45fa9df8a3e70b8 not found: ID does not exist" Dec 02 08:40:09 crc kubenswrapper[4895]: I1202 08:40:09.598577 4895 scope.go:117] "RemoveContainer" containerID="f4f0b39e7bbfc983f9cd22a7aa46de3e28fd835e0f25f927b8ce5c6ad02e6d2e" Dec 02 08:40:09 crc kubenswrapper[4895]: E1202 08:40:09.599132 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4f0b39e7bbfc983f9cd22a7aa46de3e28fd835e0f25f927b8ce5c6ad02e6d2e\": container with ID starting with f4f0b39e7bbfc983f9cd22a7aa46de3e28fd835e0f25f927b8ce5c6ad02e6d2e not found: ID does not exist" containerID="f4f0b39e7bbfc983f9cd22a7aa46de3e28fd835e0f25f927b8ce5c6ad02e6d2e" Dec 02 08:40:09 crc kubenswrapper[4895]: I1202 08:40:09.599188 4895 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"f4f0b39e7bbfc983f9cd22a7aa46de3e28fd835e0f25f927b8ce5c6ad02e6d2e"} err="failed to get container status \"f4f0b39e7bbfc983f9cd22a7aa46de3e28fd835e0f25f927b8ce5c6ad02e6d2e\": rpc error: code = NotFound desc = could not find container \"f4f0b39e7bbfc983f9cd22a7aa46de3e28fd835e0f25f927b8ce5c6ad02e6d2e\": container with ID starting with f4f0b39e7bbfc983f9cd22a7aa46de3e28fd835e0f25f927b8ce5c6ad02e6d2e not found: ID does not exist" Dec 02 08:40:09 crc kubenswrapper[4895]: I1202 08:40:09.599220 4895 scope.go:117] "RemoveContainer" containerID="3e2e3e8782f9f25f3c029505e766cf513b349da0963c87519445d597ed3809c4" Dec 02 08:40:09 crc kubenswrapper[4895]: E1202 08:40:09.599822 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e2e3e8782f9f25f3c029505e766cf513b349da0963c87519445d597ed3809c4\": container with ID starting with 3e2e3e8782f9f25f3c029505e766cf513b349da0963c87519445d597ed3809c4 not found: ID does not exist" containerID="3e2e3e8782f9f25f3c029505e766cf513b349da0963c87519445d597ed3809c4" Dec 02 08:40:09 crc kubenswrapper[4895]: I1202 08:40:09.599950 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e2e3e8782f9f25f3c029505e766cf513b349da0963c87519445d597ed3809c4"} err="failed to get container status \"3e2e3e8782f9f25f3c029505e766cf513b349da0963c87519445d597ed3809c4\": rpc error: code = NotFound desc = could not find container \"3e2e3e8782f9f25f3c029505e766cf513b349da0963c87519445d597ed3809c4\": container with ID starting with 3e2e3e8782f9f25f3c029505e766cf513b349da0963c87519445d597ed3809c4 not found: ID does not exist" Dec 02 08:40:10 crc kubenswrapper[4895]: I1202 08:40:10.400661 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-fkx6r" Dec 02 08:40:10 crc kubenswrapper[4895]: I1202 08:40:10.505474 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/43971b5a-f4a8-4e61-b832-fcf5ff66ffb3-crc-storage\") pod \"43971b5a-f4a8-4e61-b832-fcf5ff66ffb3\" (UID: \"43971b5a-f4a8-4e61-b832-fcf5ff66ffb3\") " Dec 02 08:40:10 crc kubenswrapper[4895]: I1202 08:40:10.505560 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/43971b5a-f4a8-4e61-b832-fcf5ff66ffb3-node-mnt\") pod \"43971b5a-f4a8-4e61-b832-fcf5ff66ffb3\" (UID: \"43971b5a-f4a8-4e61-b832-fcf5ff66ffb3\") " Dec 02 08:40:10 crc kubenswrapper[4895]: I1202 08:40:10.505803 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/43971b5a-f4a8-4e61-b832-fcf5ff66ffb3-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "43971b5a-f4a8-4e61-b832-fcf5ff66ffb3" (UID: "43971b5a-f4a8-4e61-b832-fcf5ff66ffb3"). InnerVolumeSpecName "node-mnt". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 08:40:10 crc kubenswrapper[4895]: I1202 08:40:10.506381 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m7c8p\" (UniqueName: \"kubernetes.io/projected/43971b5a-f4a8-4e61-b832-fcf5ff66ffb3-kube-api-access-m7c8p\") pod \"43971b5a-f4a8-4e61-b832-fcf5ff66ffb3\" (UID: \"43971b5a-f4a8-4e61-b832-fcf5ff66ffb3\") " Dec 02 08:40:10 crc kubenswrapper[4895]: I1202 08:40:10.506769 4895 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/43971b5a-f4a8-4e61-b832-fcf5ff66ffb3-node-mnt\") on node \"crc\" DevicePath \"\"" Dec 02 08:40:10 crc kubenswrapper[4895]: I1202 08:40:10.514720 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43971b5a-f4a8-4e61-b832-fcf5ff66ffb3-kube-api-access-m7c8p" (OuterVolumeSpecName: "kube-api-access-m7c8p") pod "43971b5a-f4a8-4e61-b832-fcf5ff66ffb3" (UID: "43971b5a-f4a8-4e61-b832-fcf5ff66ffb3"). InnerVolumeSpecName "kube-api-access-m7c8p". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:40:10 crc kubenswrapper[4895]: I1202 08:40:10.536722 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43971b5a-f4a8-4e61-b832-fcf5ff66ffb3-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "43971b5a-f4a8-4e61-b832-fcf5ff66ffb3" (UID: "43971b5a-f4a8-4e61-b832-fcf5ff66ffb3"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:40:10 crc kubenswrapper[4895]: I1202 08:40:10.608116 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m7c8p\" (UniqueName: \"kubernetes.io/projected/43971b5a-f4a8-4e61-b832-fcf5ff66ffb3-kube-api-access-m7c8p\") on node \"crc\" DevicePath \"\"" Dec 02 08:40:10 crc kubenswrapper[4895]: I1202 08:40:10.608152 4895 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/43971b5a-f4a8-4e61-b832-fcf5ff66ffb3-crc-storage\") on node \"crc\" DevicePath \"\"" Dec 02 08:40:11 crc kubenswrapper[4895]: I1202 08:40:11.119024 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-fkx6r" event={"ID":"43971b5a-f4a8-4e61-b832-fcf5ff66ffb3","Type":"ContainerDied","Data":"02e8aa2ee187efce3087e6e00b02eb0327a658e4c31d9f21bd421265a1aa6f5b"} Dec 02 08:40:11 crc kubenswrapper[4895]: I1202 08:40:11.119593 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="02e8aa2ee187efce3087e6e00b02eb0327a658e4c31d9f21bd421265a1aa6f5b" Dec 02 08:40:11 crc kubenswrapper[4895]: I1202 08:40:11.119569 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-fkx6r" Dec 02 08:40:11 crc kubenswrapper[4895]: I1202 08:40:11.150419 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4f36a39-319e-4610-b26c-1287b4334f41" path="/var/lib/kubelet/pods/a4f36a39-319e-4610-b26c-1287b4334f41/volumes" Dec 02 08:40:11 crc kubenswrapper[4895]: I1202 08:40:11.210026 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-gq5nd" Dec 02 08:40:11 crc kubenswrapper[4895]: I1202 08:40:11.210083 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-gq5nd" Dec 02 08:40:11 crc kubenswrapper[4895]: I1202 08:40:11.250441 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-gq5nd" Dec 02 08:40:12 crc kubenswrapper[4895]: I1202 08:40:12.178963 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-gq5nd" Dec 02 08:40:13 crc kubenswrapper[4895]: I1202 08:40:13.250921 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gq5nd"] Dec 02 08:40:14 crc kubenswrapper[4895]: I1202 08:40:14.142068 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-gq5nd" podUID="afa99875-6279-406c-a5e2-1023624e80d9" containerName="registry-server" containerID="cri-o://fd51cb3e304793b041a38702f8dcc4bac555ed4c2e288a6ac2da945739b5eda7" gracePeriod=2 Dec 02 08:40:14 crc kubenswrapper[4895]: I1202 08:40:14.533559 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gq5nd" Dec 02 08:40:14 crc kubenswrapper[4895]: I1202 08:40:14.675591 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/afa99875-6279-406c-a5e2-1023624e80d9-utilities\") pod \"afa99875-6279-406c-a5e2-1023624e80d9\" (UID: \"afa99875-6279-406c-a5e2-1023624e80d9\") " Dec 02 08:40:14 crc kubenswrapper[4895]: I1202 08:40:14.675992 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9z47p\" (UniqueName: \"kubernetes.io/projected/afa99875-6279-406c-a5e2-1023624e80d9-kube-api-access-9z47p\") pod \"afa99875-6279-406c-a5e2-1023624e80d9\" (UID: \"afa99875-6279-406c-a5e2-1023624e80d9\") " Dec 02 08:40:14 crc kubenswrapper[4895]: I1202 08:40:14.676025 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/afa99875-6279-406c-a5e2-1023624e80d9-catalog-content\") pod \"afa99875-6279-406c-a5e2-1023624e80d9\" (UID: \"afa99875-6279-406c-a5e2-1023624e80d9\") " Dec 02 08:40:14 crc kubenswrapper[4895]: I1202 08:40:14.676807 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/afa99875-6279-406c-a5e2-1023624e80d9-utilities" (OuterVolumeSpecName: "utilities") pod "afa99875-6279-406c-a5e2-1023624e80d9" (UID: "afa99875-6279-406c-a5e2-1023624e80d9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:40:14 crc kubenswrapper[4895]: I1202 08:40:14.681856 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/afa99875-6279-406c-a5e2-1023624e80d9-kube-api-access-9z47p" (OuterVolumeSpecName: "kube-api-access-9z47p") pod "afa99875-6279-406c-a5e2-1023624e80d9" (UID: "afa99875-6279-406c-a5e2-1023624e80d9"). InnerVolumeSpecName "kube-api-access-9z47p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:40:14 crc kubenswrapper[4895]: I1202 08:40:14.778534 4895 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/afa99875-6279-406c-a5e2-1023624e80d9-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 08:40:14 crc kubenswrapper[4895]: I1202 08:40:14.778590 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9z47p\" (UniqueName: \"kubernetes.io/projected/afa99875-6279-406c-a5e2-1023624e80d9-kube-api-access-9z47p\") on node \"crc\" DevicePath \"\"" Dec 02 08:40:14 crc kubenswrapper[4895]: I1202 08:40:14.927476 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/afa99875-6279-406c-a5e2-1023624e80d9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "afa99875-6279-406c-a5e2-1023624e80d9" (UID: "afa99875-6279-406c-a5e2-1023624e80d9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:40:14 crc kubenswrapper[4895]: I1202 08:40:14.981620 4895 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/afa99875-6279-406c-a5e2-1023624e80d9-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 08:40:15 crc kubenswrapper[4895]: I1202 08:40:15.154454 4895 generic.go:334] "Generic (PLEG): container finished" podID="afa99875-6279-406c-a5e2-1023624e80d9" containerID="fd51cb3e304793b041a38702f8dcc4bac555ed4c2e288a6ac2da945739b5eda7" exitCode=0 Dec 02 08:40:15 crc kubenswrapper[4895]: I1202 08:40:15.154503 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gq5nd" event={"ID":"afa99875-6279-406c-a5e2-1023624e80d9","Type":"ContainerDied","Data":"fd51cb3e304793b041a38702f8dcc4bac555ed4c2e288a6ac2da945739b5eda7"} Dec 02 08:40:15 crc kubenswrapper[4895]: I1202 08:40:15.154547 4895 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/community-operators-gq5nd" Dec 02 08:40:15 crc kubenswrapper[4895]: I1202 08:40:15.154881 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gq5nd" event={"ID":"afa99875-6279-406c-a5e2-1023624e80d9","Type":"ContainerDied","Data":"b7871390fc6dbe2ba1e69944ae9cbc0e517e7fb2cd6c8ddbefdfab03e2c69b6b"} Dec 02 08:40:15 crc kubenswrapper[4895]: I1202 08:40:15.154973 4895 scope.go:117] "RemoveContainer" containerID="fd51cb3e304793b041a38702f8dcc4bac555ed4c2e288a6ac2da945739b5eda7" Dec 02 08:40:15 crc kubenswrapper[4895]: I1202 08:40:15.175150 4895 scope.go:117] "RemoveContainer" containerID="aa88c226c4937b1d2c61213697669e4c8f366b630a31c6a3b8f0c85e0fe676c0" Dec 02 08:40:15 crc kubenswrapper[4895]: I1202 08:40:15.205783 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gq5nd"] Dec 02 08:40:15 crc kubenswrapper[4895]: I1202 08:40:15.213099 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-gq5nd"] Dec 02 08:40:15 crc kubenswrapper[4895]: I1202 08:40:15.219524 4895 scope.go:117] "RemoveContainer" containerID="5591f0f56a5becdc0541aae40da6ed31d8dc645cc43fb7b9d7a617bee29b4903" Dec 02 08:40:15 crc kubenswrapper[4895]: I1202 08:40:15.234591 4895 scope.go:117] "RemoveContainer" containerID="fd51cb3e304793b041a38702f8dcc4bac555ed4c2e288a6ac2da945739b5eda7" Dec 02 08:40:15 crc kubenswrapper[4895]: E1202 08:40:15.235216 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd51cb3e304793b041a38702f8dcc4bac555ed4c2e288a6ac2da945739b5eda7\": container with ID starting with fd51cb3e304793b041a38702f8dcc4bac555ed4c2e288a6ac2da945739b5eda7 not found: ID does not exist" containerID="fd51cb3e304793b041a38702f8dcc4bac555ed4c2e288a6ac2da945739b5eda7" Dec 02 08:40:15 crc kubenswrapper[4895]: I1202 08:40:15.235277 
4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd51cb3e304793b041a38702f8dcc4bac555ed4c2e288a6ac2da945739b5eda7"} err="failed to get container status \"fd51cb3e304793b041a38702f8dcc4bac555ed4c2e288a6ac2da945739b5eda7\": rpc error: code = NotFound desc = could not find container \"fd51cb3e304793b041a38702f8dcc4bac555ed4c2e288a6ac2da945739b5eda7\": container with ID starting with fd51cb3e304793b041a38702f8dcc4bac555ed4c2e288a6ac2da945739b5eda7 not found: ID does not exist" Dec 02 08:40:15 crc kubenswrapper[4895]: I1202 08:40:15.235317 4895 scope.go:117] "RemoveContainer" containerID="aa88c226c4937b1d2c61213697669e4c8f366b630a31c6a3b8f0c85e0fe676c0" Dec 02 08:40:15 crc kubenswrapper[4895]: E1202 08:40:15.235699 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa88c226c4937b1d2c61213697669e4c8f366b630a31c6a3b8f0c85e0fe676c0\": container with ID starting with aa88c226c4937b1d2c61213697669e4c8f366b630a31c6a3b8f0c85e0fe676c0 not found: ID does not exist" containerID="aa88c226c4937b1d2c61213697669e4c8f366b630a31c6a3b8f0c85e0fe676c0" Dec 02 08:40:15 crc kubenswrapper[4895]: I1202 08:40:15.235787 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa88c226c4937b1d2c61213697669e4c8f366b630a31c6a3b8f0c85e0fe676c0"} err="failed to get container status \"aa88c226c4937b1d2c61213697669e4c8f366b630a31c6a3b8f0c85e0fe676c0\": rpc error: code = NotFound desc = could not find container \"aa88c226c4937b1d2c61213697669e4c8f366b630a31c6a3b8f0c85e0fe676c0\": container with ID starting with aa88c226c4937b1d2c61213697669e4c8f366b630a31c6a3b8f0c85e0fe676c0 not found: ID does not exist" Dec 02 08:40:15 crc kubenswrapper[4895]: I1202 08:40:15.235820 4895 scope.go:117] "RemoveContainer" containerID="5591f0f56a5becdc0541aae40da6ed31d8dc645cc43fb7b9d7a617bee29b4903" Dec 02 08:40:15 crc kubenswrapper[4895]: E1202 
08:40:15.236154 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5591f0f56a5becdc0541aae40da6ed31d8dc645cc43fb7b9d7a617bee29b4903\": container with ID starting with 5591f0f56a5becdc0541aae40da6ed31d8dc645cc43fb7b9d7a617bee29b4903 not found: ID does not exist" containerID="5591f0f56a5becdc0541aae40da6ed31d8dc645cc43fb7b9d7a617bee29b4903" Dec 02 08:40:15 crc kubenswrapper[4895]: I1202 08:40:15.236198 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5591f0f56a5becdc0541aae40da6ed31d8dc645cc43fb7b9d7a617bee29b4903"} err="failed to get container status \"5591f0f56a5becdc0541aae40da6ed31d8dc645cc43fb7b9d7a617bee29b4903\": rpc error: code = NotFound desc = could not find container \"5591f0f56a5becdc0541aae40da6ed31d8dc645cc43fb7b9d7a617bee29b4903\": container with ID starting with 5591f0f56a5becdc0541aae40da6ed31d8dc645cc43fb7b9d7a617bee29b4903 not found: ID does not exist" Dec 02 08:40:17 crc kubenswrapper[4895]: I1202 08:40:17.149272 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="afa99875-6279-406c-a5e2-1023624e80d9" path="/var/lib/kubelet/pods/afa99875-6279-406c-a5e2-1023624e80d9/volumes" Dec 02 08:40:28 crc kubenswrapper[4895]: I1202 08:40:28.707479 4895 scope.go:117] "RemoveContainer" containerID="0744137f31f20b6d47db9ff1933beb8d7bdd7b25221e6e5d3687702c5553e529" Dec 02 08:41:23 crc kubenswrapper[4895]: I1202 08:41:23.271493 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-4t6fm"] Dec 02 08:41:23 crc kubenswrapper[4895]: E1202 08:41:23.272300 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afa99875-6279-406c-a5e2-1023624e80d9" containerName="registry-server" Dec 02 08:41:23 crc kubenswrapper[4895]: I1202 08:41:23.272312 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="afa99875-6279-406c-a5e2-1023624e80d9" 
containerName="registry-server" Dec 02 08:41:23 crc kubenswrapper[4895]: E1202 08:41:23.272334 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4f36a39-319e-4610-b26c-1287b4334f41" containerName="extract-content" Dec 02 08:41:23 crc kubenswrapper[4895]: I1202 08:41:23.272340 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4f36a39-319e-4610-b26c-1287b4334f41" containerName="extract-content" Dec 02 08:41:23 crc kubenswrapper[4895]: E1202 08:41:23.272347 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afa99875-6279-406c-a5e2-1023624e80d9" containerName="extract-content" Dec 02 08:41:23 crc kubenswrapper[4895]: I1202 08:41:23.272355 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="afa99875-6279-406c-a5e2-1023624e80d9" containerName="extract-content" Dec 02 08:41:23 crc kubenswrapper[4895]: E1202 08:41:23.272366 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4f36a39-319e-4610-b26c-1287b4334f41" containerName="extract-utilities" Dec 02 08:41:23 crc kubenswrapper[4895]: I1202 08:41:23.272371 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4f36a39-319e-4610-b26c-1287b4334f41" containerName="extract-utilities" Dec 02 08:41:23 crc kubenswrapper[4895]: E1202 08:41:23.272382 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afa99875-6279-406c-a5e2-1023624e80d9" containerName="extract-utilities" Dec 02 08:41:23 crc kubenswrapper[4895]: I1202 08:41:23.272387 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="afa99875-6279-406c-a5e2-1023624e80d9" containerName="extract-utilities" Dec 02 08:41:23 crc kubenswrapper[4895]: E1202 08:41:23.272403 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43971b5a-f4a8-4e61-b832-fcf5ff66ffb3" containerName="storage" Dec 02 08:41:23 crc kubenswrapper[4895]: I1202 08:41:23.272409 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="43971b5a-f4a8-4e61-b832-fcf5ff66ffb3" 
containerName="storage" Dec 02 08:41:23 crc kubenswrapper[4895]: E1202 08:41:23.272421 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4f36a39-319e-4610-b26c-1287b4334f41" containerName="registry-server" Dec 02 08:41:23 crc kubenswrapper[4895]: I1202 08:41:23.272426 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4f36a39-319e-4610-b26c-1287b4334f41" containerName="registry-server" Dec 02 08:41:23 crc kubenswrapper[4895]: I1202 08:41:23.272554 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4f36a39-319e-4610-b26c-1287b4334f41" containerName="registry-server" Dec 02 08:41:23 crc kubenswrapper[4895]: I1202 08:41:23.272568 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="afa99875-6279-406c-a5e2-1023624e80d9" containerName="registry-server" Dec 02 08:41:23 crc kubenswrapper[4895]: I1202 08:41:23.272582 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="43971b5a-f4a8-4e61-b832-fcf5ff66ffb3" containerName="storage" Dec 02 08:41:23 crc kubenswrapper[4895]: I1202 08:41:23.273596 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4t6fm" Dec 02 08:41:23 crc kubenswrapper[4895]: I1202 08:41:23.290445 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4t6fm"] Dec 02 08:41:23 crc kubenswrapper[4895]: I1202 08:41:23.429764 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9mqv\" (UniqueName: \"kubernetes.io/projected/a96dabb0-65af-43a3-9b7f-1c5f99e0135b-kube-api-access-d9mqv\") pod \"certified-operators-4t6fm\" (UID: \"a96dabb0-65af-43a3-9b7f-1c5f99e0135b\") " pod="openshift-marketplace/certified-operators-4t6fm" Dec 02 08:41:23 crc kubenswrapper[4895]: I1202 08:41:23.429928 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a96dabb0-65af-43a3-9b7f-1c5f99e0135b-utilities\") pod \"certified-operators-4t6fm\" (UID: \"a96dabb0-65af-43a3-9b7f-1c5f99e0135b\") " pod="openshift-marketplace/certified-operators-4t6fm" Dec 02 08:41:23 crc kubenswrapper[4895]: I1202 08:41:23.430061 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a96dabb0-65af-43a3-9b7f-1c5f99e0135b-catalog-content\") pod \"certified-operators-4t6fm\" (UID: \"a96dabb0-65af-43a3-9b7f-1c5f99e0135b\") " pod="openshift-marketplace/certified-operators-4t6fm" Dec 02 08:41:23 crc kubenswrapper[4895]: I1202 08:41:23.531352 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9mqv\" (UniqueName: \"kubernetes.io/projected/a96dabb0-65af-43a3-9b7f-1c5f99e0135b-kube-api-access-d9mqv\") pod \"certified-operators-4t6fm\" (UID: \"a96dabb0-65af-43a3-9b7f-1c5f99e0135b\") " pod="openshift-marketplace/certified-operators-4t6fm" Dec 02 08:41:23 crc kubenswrapper[4895]: I1202 08:41:23.531457 4895 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a96dabb0-65af-43a3-9b7f-1c5f99e0135b-utilities\") pod \"certified-operators-4t6fm\" (UID: \"a96dabb0-65af-43a3-9b7f-1c5f99e0135b\") " pod="openshift-marketplace/certified-operators-4t6fm" Dec 02 08:41:23 crc kubenswrapper[4895]: I1202 08:41:23.531508 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a96dabb0-65af-43a3-9b7f-1c5f99e0135b-catalog-content\") pod \"certified-operators-4t6fm\" (UID: \"a96dabb0-65af-43a3-9b7f-1c5f99e0135b\") " pod="openshift-marketplace/certified-operators-4t6fm" Dec 02 08:41:23 crc kubenswrapper[4895]: I1202 08:41:23.532172 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a96dabb0-65af-43a3-9b7f-1c5f99e0135b-utilities\") pod \"certified-operators-4t6fm\" (UID: \"a96dabb0-65af-43a3-9b7f-1c5f99e0135b\") " pod="openshift-marketplace/certified-operators-4t6fm" Dec 02 08:41:23 crc kubenswrapper[4895]: I1202 08:41:23.532314 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a96dabb0-65af-43a3-9b7f-1c5f99e0135b-catalog-content\") pod \"certified-operators-4t6fm\" (UID: \"a96dabb0-65af-43a3-9b7f-1c5f99e0135b\") " pod="openshift-marketplace/certified-operators-4t6fm" Dec 02 08:41:23 crc kubenswrapper[4895]: I1202 08:41:23.552540 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9mqv\" (UniqueName: \"kubernetes.io/projected/a96dabb0-65af-43a3-9b7f-1c5f99e0135b-kube-api-access-d9mqv\") pod \"certified-operators-4t6fm\" (UID: \"a96dabb0-65af-43a3-9b7f-1c5f99e0135b\") " pod="openshift-marketplace/certified-operators-4t6fm" Dec 02 08:41:23 crc kubenswrapper[4895]: I1202 08:41:23.596839 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4t6fm" Dec 02 08:41:23 crc kubenswrapper[4895]: I1202 08:41:23.897856 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4t6fm"] Dec 02 08:41:24 crc kubenswrapper[4895]: I1202 08:41:24.818968 4895 generic.go:334] "Generic (PLEG): container finished" podID="a96dabb0-65af-43a3-9b7f-1c5f99e0135b" containerID="ab5c56c2c29f70ce6843ca474c9a97be6082d426ccfda2b332016af6c7803b14" exitCode=0 Dec 02 08:41:24 crc kubenswrapper[4895]: I1202 08:41:24.819031 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4t6fm" event={"ID":"a96dabb0-65af-43a3-9b7f-1c5f99e0135b","Type":"ContainerDied","Data":"ab5c56c2c29f70ce6843ca474c9a97be6082d426ccfda2b332016af6c7803b14"} Dec 02 08:41:24 crc kubenswrapper[4895]: I1202 08:41:24.819302 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4t6fm" event={"ID":"a96dabb0-65af-43a3-9b7f-1c5f99e0135b","Type":"ContainerStarted","Data":"4454c40f18fc09889d3aba8cfa255fccee8710abf163b7831c3b3b99101b683d"} Dec 02 08:41:26 crc kubenswrapper[4895]: I1202 08:41:26.833838 4895 generic.go:334] "Generic (PLEG): container finished" podID="a96dabb0-65af-43a3-9b7f-1c5f99e0135b" containerID="799d9dee9b8547a758bb021b9a02e6f4f5d13da0dce0600a58716cb1a2d9ddc1" exitCode=0 Dec 02 08:41:26 crc kubenswrapper[4895]: I1202 08:41:26.833916 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4t6fm" event={"ID":"a96dabb0-65af-43a3-9b7f-1c5f99e0135b","Type":"ContainerDied","Data":"799d9dee9b8547a758bb021b9a02e6f4f5d13da0dce0600a58716cb1a2d9ddc1"} Dec 02 08:41:29 crc kubenswrapper[4895]: I1202 08:41:29.855654 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4t6fm" 
event={"ID":"a96dabb0-65af-43a3-9b7f-1c5f99e0135b","Type":"ContainerStarted","Data":"b913d2ca27ef54abb0294c98efa7d70e8201e83276e708cceca78c50b30a76ab"} Dec 02 08:41:29 crc kubenswrapper[4895]: I1202 08:41:29.874506 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-4t6fm" podStartSLOduration=3.05463707 podStartE2EDuration="6.874483599s" podCreationTimestamp="2025-12-02 08:41:23 +0000 UTC" firstStartedPulling="2025-12-02 08:41:24.820779707 +0000 UTC m=+4695.991639320" lastFinishedPulling="2025-12-02 08:41:28.640626236 +0000 UTC m=+4699.811485849" observedRunningTime="2025-12-02 08:41:29.872393114 +0000 UTC m=+4701.043252747" watchObservedRunningTime="2025-12-02 08:41:29.874483599 +0000 UTC m=+4701.045343232" Dec 02 08:41:33 crc kubenswrapper[4895]: I1202 08:41:33.597173 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-4t6fm" Dec 02 08:41:33 crc kubenswrapper[4895]: I1202 08:41:33.789216 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-4t6fm" Dec 02 08:41:33 crc kubenswrapper[4895]: I1202 08:41:33.830795 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-4t6fm" Dec 02 08:41:33 crc kubenswrapper[4895]: I1202 08:41:33.924456 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-4t6fm" Dec 02 08:41:34 crc kubenswrapper[4895]: I1202 08:41:34.071384 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4t6fm"] Dec 02 08:41:35 crc kubenswrapper[4895]: I1202 08:41:35.896926 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-4t6fm" podUID="a96dabb0-65af-43a3-9b7f-1c5f99e0135b" containerName="registry-server" 
containerID="cri-o://b913d2ca27ef54abb0294c98efa7d70e8201e83276e708cceca78c50b30a76ab" gracePeriod=2 Dec 02 08:41:36 crc kubenswrapper[4895]: I1202 08:41:36.908766 4895 generic.go:334] "Generic (PLEG): container finished" podID="a96dabb0-65af-43a3-9b7f-1c5f99e0135b" containerID="b913d2ca27ef54abb0294c98efa7d70e8201e83276e708cceca78c50b30a76ab" exitCode=0 Dec 02 08:41:36 crc kubenswrapper[4895]: I1202 08:41:36.908803 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4t6fm" event={"ID":"a96dabb0-65af-43a3-9b7f-1c5f99e0135b","Type":"ContainerDied","Data":"b913d2ca27ef54abb0294c98efa7d70e8201e83276e708cceca78c50b30a76ab"} Dec 02 08:41:37 crc kubenswrapper[4895]: I1202 08:41:37.105353 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4t6fm" Dec 02 08:41:37 crc kubenswrapper[4895]: I1202 08:41:37.143818 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a96dabb0-65af-43a3-9b7f-1c5f99e0135b-catalog-content\") pod \"a96dabb0-65af-43a3-9b7f-1c5f99e0135b\" (UID: \"a96dabb0-65af-43a3-9b7f-1c5f99e0135b\") " Dec 02 08:41:37 crc kubenswrapper[4895]: I1202 08:41:37.143882 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a96dabb0-65af-43a3-9b7f-1c5f99e0135b-utilities\") pod \"a96dabb0-65af-43a3-9b7f-1c5f99e0135b\" (UID: \"a96dabb0-65af-43a3-9b7f-1c5f99e0135b\") " Dec 02 08:41:37 crc kubenswrapper[4895]: I1202 08:41:37.143902 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d9mqv\" (UniqueName: \"kubernetes.io/projected/a96dabb0-65af-43a3-9b7f-1c5f99e0135b-kube-api-access-d9mqv\") pod \"a96dabb0-65af-43a3-9b7f-1c5f99e0135b\" (UID: \"a96dabb0-65af-43a3-9b7f-1c5f99e0135b\") " Dec 02 08:41:37 crc kubenswrapper[4895]: I1202 
08:41:37.144914 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a96dabb0-65af-43a3-9b7f-1c5f99e0135b-utilities" (OuterVolumeSpecName: "utilities") pod "a96dabb0-65af-43a3-9b7f-1c5f99e0135b" (UID: "a96dabb0-65af-43a3-9b7f-1c5f99e0135b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:41:37 crc kubenswrapper[4895]: I1202 08:41:37.154203 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a96dabb0-65af-43a3-9b7f-1c5f99e0135b-kube-api-access-d9mqv" (OuterVolumeSpecName: "kube-api-access-d9mqv") pod "a96dabb0-65af-43a3-9b7f-1c5f99e0135b" (UID: "a96dabb0-65af-43a3-9b7f-1c5f99e0135b"). InnerVolumeSpecName "kube-api-access-d9mqv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:41:37 crc kubenswrapper[4895]: I1202 08:41:37.194398 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a96dabb0-65af-43a3-9b7f-1c5f99e0135b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a96dabb0-65af-43a3-9b7f-1c5f99e0135b" (UID: "a96dabb0-65af-43a3-9b7f-1c5f99e0135b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:41:37 crc kubenswrapper[4895]: I1202 08:41:37.245168 4895 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a96dabb0-65af-43a3-9b7f-1c5f99e0135b-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 08:41:37 crc kubenswrapper[4895]: I1202 08:41:37.245204 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d9mqv\" (UniqueName: \"kubernetes.io/projected/a96dabb0-65af-43a3-9b7f-1c5f99e0135b-kube-api-access-d9mqv\") on node \"crc\" DevicePath \"\"" Dec 02 08:41:37 crc kubenswrapper[4895]: I1202 08:41:37.245215 4895 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a96dabb0-65af-43a3-9b7f-1c5f99e0135b-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 08:41:37 crc kubenswrapper[4895]: I1202 08:41:37.922018 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4t6fm" event={"ID":"a96dabb0-65af-43a3-9b7f-1c5f99e0135b","Type":"ContainerDied","Data":"4454c40f18fc09889d3aba8cfa255fccee8710abf163b7831c3b3b99101b683d"} Dec 02 08:41:37 crc kubenswrapper[4895]: I1202 08:41:37.922079 4895 scope.go:117] "RemoveContainer" containerID="b913d2ca27ef54abb0294c98efa7d70e8201e83276e708cceca78c50b30a76ab" Dec 02 08:41:37 crc kubenswrapper[4895]: I1202 08:41:37.922094 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4t6fm" Dec 02 08:41:37 crc kubenswrapper[4895]: I1202 08:41:37.950926 4895 scope.go:117] "RemoveContainer" containerID="799d9dee9b8547a758bb021b9a02e6f4f5d13da0dce0600a58716cb1a2d9ddc1" Dec 02 08:41:37 crc kubenswrapper[4895]: I1202 08:41:37.957546 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4t6fm"] Dec 02 08:41:37 crc kubenswrapper[4895]: I1202 08:41:37.964610 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-4t6fm"] Dec 02 08:41:37 crc kubenswrapper[4895]: I1202 08:41:37.973576 4895 scope.go:117] "RemoveContainer" containerID="ab5c56c2c29f70ce6843ca474c9a97be6082d426ccfda2b332016af6c7803b14" Dec 02 08:41:39 crc kubenswrapper[4895]: I1202 08:41:39.152009 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a96dabb0-65af-43a3-9b7f-1c5f99e0135b" path="/var/lib/kubelet/pods/a96dabb0-65af-43a3-9b7f-1c5f99e0135b/volumes" Dec 02 08:42:35 crc kubenswrapper[4895]: I1202 08:42:35.473262 4895 patch_prober.go:28] interesting pod/machine-config-daemon-wfcg7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 08:42:35 crc kubenswrapper[4895]: I1202 08:42:35.474188 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 08:43:05 crc kubenswrapper[4895]: I1202 08:43:05.473336 4895 patch_prober.go:28] interesting pod/machine-config-daemon-wfcg7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness 
probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 08:43:05 crc kubenswrapper[4895]: I1202 08:43:05.474006 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 08:43:16 crc kubenswrapper[4895]: I1202 08:43:16.016139 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5d7b5456f5-5nlvc"] Dec 02 08:43:16 crc kubenswrapper[4895]: E1202 08:43:16.016912 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a96dabb0-65af-43a3-9b7f-1c5f99e0135b" containerName="registry-server" Dec 02 08:43:16 crc kubenswrapper[4895]: I1202 08:43:16.016925 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="a96dabb0-65af-43a3-9b7f-1c5f99e0135b" containerName="registry-server" Dec 02 08:43:16 crc kubenswrapper[4895]: E1202 08:43:16.016937 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a96dabb0-65af-43a3-9b7f-1c5f99e0135b" containerName="extract-utilities" Dec 02 08:43:16 crc kubenswrapper[4895]: I1202 08:43:16.016943 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="a96dabb0-65af-43a3-9b7f-1c5f99e0135b" containerName="extract-utilities" Dec 02 08:43:16 crc kubenswrapper[4895]: E1202 08:43:16.016955 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a96dabb0-65af-43a3-9b7f-1c5f99e0135b" containerName="extract-content" Dec 02 08:43:16 crc kubenswrapper[4895]: I1202 08:43:16.016962 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="a96dabb0-65af-43a3-9b7f-1c5f99e0135b" containerName="extract-content" Dec 02 08:43:16 crc kubenswrapper[4895]: I1202 08:43:16.017121 4895 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="a96dabb0-65af-43a3-9b7f-1c5f99e0135b" containerName="registry-server" Dec 02 08:43:16 crc kubenswrapper[4895]: I1202 08:43:16.017873 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d7b5456f5-5nlvc" Dec 02 08:43:16 crc kubenswrapper[4895]: I1202 08:43:16.020327 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-bt2xd" Dec 02 08:43:16 crc kubenswrapper[4895]: I1202 08:43:16.020493 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Dec 02 08:43:16 crc kubenswrapper[4895]: I1202 08:43:16.020560 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Dec 02 08:43:16 crc kubenswrapper[4895]: I1202 08:43:16.020655 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Dec 02 08:43:16 crc kubenswrapper[4895]: I1202 08:43:16.020686 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Dec 02 08:43:16 crc kubenswrapper[4895]: I1202 08:43:16.031236 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d7b5456f5-5nlvc"] Dec 02 08:43:16 crc kubenswrapper[4895]: I1202 08:43:16.125396 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f400aae-225f-4349-ad0c-c4e2cfcaf833-config\") pod \"dnsmasq-dns-5d7b5456f5-5nlvc\" (UID: \"5f400aae-225f-4349-ad0c-c4e2cfcaf833\") " pod="openstack/dnsmasq-dns-5d7b5456f5-5nlvc" Dec 02 08:43:16 crc kubenswrapper[4895]: I1202 08:43:16.125824 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njr2h\" (UniqueName: \"kubernetes.io/projected/5f400aae-225f-4349-ad0c-c4e2cfcaf833-kube-api-access-njr2h\") pod \"dnsmasq-dns-5d7b5456f5-5nlvc\" (UID: 
\"5f400aae-225f-4349-ad0c-c4e2cfcaf833\") " pod="openstack/dnsmasq-dns-5d7b5456f5-5nlvc" Dec 02 08:43:16 crc kubenswrapper[4895]: I1202 08:43:16.125849 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5f400aae-225f-4349-ad0c-c4e2cfcaf833-dns-svc\") pod \"dnsmasq-dns-5d7b5456f5-5nlvc\" (UID: \"5f400aae-225f-4349-ad0c-c4e2cfcaf833\") " pod="openstack/dnsmasq-dns-5d7b5456f5-5nlvc" Dec 02 08:43:16 crc kubenswrapper[4895]: I1202 08:43:16.228612 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f400aae-225f-4349-ad0c-c4e2cfcaf833-config\") pod \"dnsmasq-dns-5d7b5456f5-5nlvc\" (UID: \"5f400aae-225f-4349-ad0c-c4e2cfcaf833\") " pod="openstack/dnsmasq-dns-5d7b5456f5-5nlvc" Dec 02 08:43:16 crc kubenswrapper[4895]: I1202 08:43:16.228936 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5f400aae-225f-4349-ad0c-c4e2cfcaf833-dns-svc\") pod \"dnsmasq-dns-5d7b5456f5-5nlvc\" (UID: \"5f400aae-225f-4349-ad0c-c4e2cfcaf833\") " pod="openstack/dnsmasq-dns-5d7b5456f5-5nlvc" Dec 02 08:43:16 crc kubenswrapper[4895]: I1202 08:43:16.229082 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njr2h\" (UniqueName: \"kubernetes.io/projected/5f400aae-225f-4349-ad0c-c4e2cfcaf833-kube-api-access-njr2h\") pod \"dnsmasq-dns-5d7b5456f5-5nlvc\" (UID: \"5f400aae-225f-4349-ad0c-c4e2cfcaf833\") " pod="openstack/dnsmasq-dns-5d7b5456f5-5nlvc" Dec 02 08:43:16 crc kubenswrapper[4895]: I1202 08:43:16.229543 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f400aae-225f-4349-ad0c-c4e2cfcaf833-config\") pod \"dnsmasq-dns-5d7b5456f5-5nlvc\" (UID: \"5f400aae-225f-4349-ad0c-c4e2cfcaf833\") " pod="openstack/dnsmasq-dns-5d7b5456f5-5nlvc" 
Dec 02 08:43:16 crc kubenswrapper[4895]: I1202 08:43:16.230172 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5f400aae-225f-4349-ad0c-c4e2cfcaf833-dns-svc\") pod \"dnsmasq-dns-5d7b5456f5-5nlvc\" (UID: \"5f400aae-225f-4349-ad0c-c4e2cfcaf833\") " pod="openstack/dnsmasq-dns-5d7b5456f5-5nlvc" Dec 02 08:43:16 crc kubenswrapper[4895]: I1202 08:43:16.260487 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njr2h\" (UniqueName: \"kubernetes.io/projected/5f400aae-225f-4349-ad0c-c4e2cfcaf833-kube-api-access-njr2h\") pod \"dnsmasq-dns-5d7b5456f5-5nlvc\" (UID: \"5f400aae-225f-4349-ad0c-c4e2cfcaf833\") " pod="openstack/dnsmasq-dns-5d7b5456f5-5nlvc" Dec 02 08:43:16 crc kubenswrapper[4895]: I1202 08:43:16.337115 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d7b5456f5-5nlvc" Dec 02 08:43:16 crc kubenswrapper[4895]: I1202 08:43:16.450049 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-98ddfc8f-vcggd"] Dec 02 08:43:16 crc kubenswrapper[4895]: I1202 08:43:16.451936 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-98ddfc8f-vcggd" Dec 02 08:43:16 crc kubenswrapper[4895]: I1202 08:43:16.482151 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-98ddfc8f-vcggd"] Dec 02 08:43:16 crc kubenswrapper[4895]: I1202 08:43:16.552637 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/06719ca1-7571-4a00-ab0e-cbbf89f793e3-dns-svc\") pod \"dnsmasq-dns-98ddfc8f-vcggd\" (UID: \"06719ca1-7571-4a00-ab0e-cbbf89f793e3\") " pod="openstack/dnsmasq-dns-98ddfc8f-vcggd" Dec 02 08:43:16 crc kubenswrapper[4895]: I1202 08:43:16.552993 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06719ca1-7571-4a00-ab0e-cbbf89f793e3-config\") pod \"dnsmasq-dns-98ddfc8f-vcggd\" (UID: \"06719ca1-7571-4a00-ab0e-cbbf89f793e3\") " pod="openstack/dnsmasq-dns-98ddfc8f-vcggd" Dec 02 08:43:16 crc kubenswrapper[4895]: I1202 08:43:16.553155 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7bvt\" (UniqueName: \"kubernetes.io/projected/06719ca1-7571-4a00-ab0e-cbbf89f793e3-kube-api-access-x7bvt\") pod \"dnsmasq-dns-98ddfc8f-vcggd\" (UID: \"06719ca1-7571-4a00-ab0e-cbbf89f793e3\") " pod="openstack/dnsmasq-dns-98ddfc8f-vcggd" Dec 02 08:43:16 crc kubenswrapper[4895]: I1202 08:43:16.656557 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/06719ca1-7571-4a00-ab0e-cbbf89f793e3-dns-svc\") pod \"dnsmasq-dns-98ddfc8f-vcggd\" (UID: \"06719ca1-7571-4a00-ab0e-cbbf89f793e3\") " pod="openstack/dnsmasq-dns-98ddfc8f-vcggd" Dec 02 08:43:16 crc kubenswrapper[4895]: I1202 08:43:16.657000 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/06719ca1-7571-4a00-ab0e-cbbf89f793e3-config\") pod \"dnsmasq-dns-98ddfc8f-vcggd\" (UID: \"06719ca1-7571-4a00-ab0e-cbbf89f793e3\") " pod="openstack/dnsmasq-dns-98ddfc8f-vcggd" Dec 02 08:43:16 crc kubenswrapper[4895]: I1202 08:43:16.657095 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7bvt\" (UniqueName: \"kubernetes.io/projected/06719ca1-7571-4a00-ab0e-cbbf89f793e3-kube-api-access-x7bvt\") pod \"dnsmasq-dns-98ddfc8f-vcggd\" (UID: \"06719ca1-7571-4a00-ab0e-cbbf89f793e3\") " pod="openstack/dnsmasq-dns-98ddfc8f-vcggd" Dec 02 08:43:16 crc kubenswrapper[4895]: I1202 08:43:16.658668 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/06719ca1-7571-4a00-ab0e-cbbf89f793e3-dns-svc\") pod \"dnsmasq-dns-98ddfc8f-vcggd\" (UID: \"06719ca1-7571-4a00-ab0e-cbbf89f793e3\") " pod="openstack/dnsmasq-dns-98ddfc8f-vcggd" Dec 02 08:43:16 crc kubenswrapper[4895]: I1202 08:43:16.661357 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06719ca1-7571-4a00-ab0e-cbbf89f793e3-config\") pod \"dnsmasq-dns-98ddfc8f-vcggd\" (UID: \"06719ca1-7571-4a00-ab0e-cbbf89f793e3\") " pod="openstack/dnsmasq-dns-98ddfc8f-vcggd" Dec 02 08:43:16 crc kubenswrapper[4895]: I1202 08:43:16.697422 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7bvt\" (UniqueName: \"kubernetes.io/projected/06719ca1-7571-4a00-ab0e-cbbf89f793e3-kube-api-access-x7bvt\") pod \"dnsmasq-dns-98ddfc8f-vcggd\" (UID: \"06719ca1-7571-4a00-ab0e-cbbf89f793e3\") " pod="openstack/dnsmasq-dns-98ddfc8f-vcggd" Dec 02 08:43:16 crc kubenswrapper[4895]: I1202 08:43:16.763458 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d7b5456f5-5nlvc"] Dec 02 08:43:16 crc kubenswrapper[4895]: I1202 08:43:16.779564 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-98ddfc8f-vcggd" Dec 02 08:43:17 crc kubenswrapper[4895]: I1202 08:43:17.207497 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Dec 02 08:43:17 crc kubenswrapper[4895]: I1202 08:43:17.209161 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 02 08:43:17 crc kubenswrapper[4895]: I1202 08:43:17.212579 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Dec 02 08:43:17 crc kubenswrapper[4895]: I1202 08:43:17.212794 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Dec 02 08:43:17 crc kubenswrapper[4895]: I1202 08:43:17.212818 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-7cgr2" Dec 02 08:43:17 crc kubenswrapper[4895]: I1202 08:43:17.213018 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Dec 02 08:43:17 crc kubenswrapper[4895]: I1202 08:43:17.216753 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Dec 02 08:43:17 crc kubenswrapper[4895]: I1202 08:43:17.221107 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 02 08:43:17 crc kubenswrapper[4895]: I1202 08:43:17.264058 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/7665a2f5-45bb-4972-9631-456ed63a9da2-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"7665a2f5-45bb-4972-9631-456ed63a9da2\") " pod="openstack/rabbitmq-server-0" Dec 02 08:43:17 crc kubenswrapper[4895]: I1202 08:43:17.264370 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-7eb78b69-cce8-463f-8ce0-16375b1ef171\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7eb78b69-cce8-463f-8ce0-16375b1ef171\") pod \"rabbitmq-server-0\" (UID: \"7665a2f5-45bb-4972-9631-456ed63a9da2\") " pod="openstack/rabbitmq-server-0" Dec 02 08:43:17 crc kubenswrapper[4895]: I1202 08:43:17.264873 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/7665a2f5-45bb-4972-9631-456ed63a9da2-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"7665a2f5-45bb-4972-9631-456ed63a9da2\") " pod="openstack/rabbitmq-server-0" Dec 02 08:43:17 crc kubenswrapper[4895]: I1202 08:43:17.265300 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/7665a2f5-45bb-4972-9631-456ed63a9da2-server-conf\") pod \"rabbitmq-server-0\" (UID: \"7665a2f5-45bb-4972-9631-456ed63a9da2\") " pod="openstack/rabbitmq-server-0" Dec 02 08:43:17 crc kubenswrapper[4895]: I1202 08:43:17.265406 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/7665a2f5-45bb-4972-9631-456ed63a9da2-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"7665a2f5-45bb-4972-9631-456ed63a9da2\") " pod="openstack/rabbitmq-server-0" Dec 02 08:43:17 crc kubenswrapper[4895]: I1202 08:43:17.265506 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/7665a2f5-45bb-4972-9631-456ed63a9da2-pod-info\") pod \"rabbitmq-server-0\" (UID: \"7665a2f5-45bb-4972-9631-456ed63a9da2\") " pod="openstack/rabbitmq-server-0" Dec 02 08:43:17 crc kubenswrapper[4895]: I1202 08:43:17.265633 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9lfp\" (UniqueName: 
\"kubernetes.io/projected/7665a2f5-45bb-4972-9631-456ed63a9da2-kube-api-access-v9lfp\") pod \"rabbitmq-server-0\" (UID: \"7665a2f5-45bb-4972-9631-456ed63a9da2\") " pod="openstack/rabbitmq-server-0" Dec 02 08:43:17 crc kubenswrapper[4895]: I1202 08:43:17.265774 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/7665a2f5-45bb-4972-9631-456ed63a9da2-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"7665a2f5-45bb-4972-9631-456ed63a9da2\") " pod="openstack/rabbitmq-server-0" Dec 02 08:43:17 crc kubenswrapper[4895]: I1202 08:43:17.265881 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/7665a2f5-45bb-4972-9631-456ed63a9da2-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"7665a2f5-45bb-4972-9631-456ed63a9da2\") " pod="openstack/rabbitmq-server-0" Dec 02 08:43:17 crc kubenswrapper[4895]: I1202 08:43:17.270956 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-98ddfc8f-vcggd"] Dec 02 08:43:17 crc kubenswrapper[4895]: I1202 08:43:17.367220 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/7665a2f5-45bb-4972-9631-456ed63a9da2-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"7665a2f5-45bb-4972-9631-456ed63a9da2\") " pod="openstack/rabbitmq-server-0" Dec 02 08:43:17 crc kubenswrapper[4895]: I1202 08:43:17.367581 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/7665a2f5-45bb-4972-9631-456ed63a9da2-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"7665a2f5-45bb-4972-9631-456ed63a9da2\") " pod="openstack/rabbitmq-server-0" Dec 02 08:43:17 crc kubenswrapper[4895]: I1202 08:43:17.367620 4895 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pvc-7eb78b69-cce8-463f-8ce0-16375b1ef171\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7eb78b69-cce8-463f-8ce0-16375b1ef171\") pod \"rabbitmq-server-0\" (UID: \"7665a2f5-45bb-4972-9631-456ed63a9da2\") " pod="openstack/rabbitmq-server-0" Dec 02 08:43:17 crc kubenswrapper[4895]: I1202 08:43:17.367640 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/7665a2f5-45bb-4972-9631-456ed63a9da2-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"7665a2f5-45bb-4972-9631-456ed63a9da2\") " pod="openstack/rabbitmq-server-0" Dec 02 08:43:17 crc kubenswrapper[4895]: I1202 08:43:17.367693 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/7665a2f5-45bb-4972-9631-456ed63a9da2-server-conf\") pod \"rabbitmq-server-0\" (UID: \"7665a2f5-45bb-4972-9631-456ed63a9da2\") " pod="openstack/rabbitmq-server-0" Dec 02 08:43:17 crc kubenswrapper[4895]: I1202 08:43:17.367712 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/7665a2f5-45bb-4972-9631-456ed63a9da2-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"7665a2f5-45bb-4972-9631-456ed63a9da2\") " pod="openstack/rabbitmq-server-0" Dec 02 08:43:17 crc kubenswrapper[4895]: I1202 08:43:17.367729 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/7665a2f5-45bb-4972-9631-456ed63a9da2-pod-info\") pod \"rabbitmq-server-0\" (UID: \"7665a2f5-45bb-4972-9631-456ed63a9da2\") " pod="openstack/rabbitmq-server-0" Dec 02 08:43:17 crc kubenswrapper[4895]: I1202 08:43:17.367761 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9lfp\" (UniqueName: 
\"kubernetes.io/projected/7665a2f5-45bb-4972-9631-456ed63a9da2-kube-api-access-v9lfp\") pod \"rabbitmq-server-0\" (UID: \"7665a2f5-45bb-4972-9631-456ed63a9da2\") " pod="openstack/rabbitmq-server-0" Dec 02 08:43:17 crc kubenswrapper[4895]: I1202 08:43:17.367782 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/7665a2f5-45bb-4972-9631-456ed63a9da2-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"7665a2f5-45bb-4972-9631-456ed63a9da2\") " pod="openstack/rabbitmq-server-0" Dec 02 08:43:17 crc kubenswrapper[4895]: I1202 08:43:17.368623 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/7665a2f5-45bb-4972-9631-456ed63a9da2-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"7665a2f5-45bb-4972-9631-456ed63a9da2\") " pod="openstack/rabbitmq-server-0" Dec 02 08:43:17 crc kubenswrapper[4895]: I1202 08:43:17.369169 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/7665a2f5-45bb-4972-9631-456ed63a9da2-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"7665a2f5-45bb-4972-9631-456ed63a9da2\") " pod="openstack/rabbitmq-server-0" Dec 02 08:43:17 crc kubenswrapper[4895]: I1202 08:43:17.369295 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/7665a2f5-45bb-4972-9631-456ed63a9da2-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"7665a2f5-45bb-4972-9631-456ed63a9da2\") " pod="openstack/rabbitmq-server-0" Dec 02 08:43:17 crc kubenswrapper[4895]: I1202 08:43:17.369518 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/7665a2f5-45bb-4972-9631-456ed63a9da2-server-conf\") pod \"rabbitmq-server-0\" (UID: \"7665a2f5-45bb-4972-9631-456ed63a9da2\") " 
pod="openstack/rabbitmq-server-0" Dec 02 08:43:17 crc kubenswrapper[4895]: I1202 08:43:17.372421 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/7665a2f5-45bb-4972-9631-456ed63a9da2-pod-info\") pod \"rabbitmq-server-0\" (UID: \"7665a2f5-45bb-4972-9631-456ed63a9da2\") " pod="openstack/rabbitmq-server-0" Dec 02 08:43:17 crc kubenswrapper[4895]: I1202 08:43:17.372431 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/7665a2f5-45bb-4972-9631-456ed63a9da2-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"7665a2f5-45bb-4972-9631-456ed63a9da2\") " pod="openstack/rabbitmq-server-0" Dec 02 08:43:17 crc kubenswrapper[4895]: I1202 08:43:17.373453 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/7665a2f5-45bb-4972-9631-456ed63a9da2-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"7665a2f5-45bb-4972-9631-456ed63a9da2\") " pod="openstack/rabbitmq-server-0" Dec 02 08:43:17 crc kubenswrapper[4895]: I1202 08:43:17.376686 4895 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 02 08:43:17 crc kubenswrapper[4895]: I1202 08:43:17.376749 4895 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-7eb78b69-cce8-463f-8ce0-16375b1ef171\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7eb78b69-cce8-463f-8ce0-16375b1ef171\") pod \"rabbitmq-server-0\" (UID: \"7665a2f5-45bb-4972-9631-456ed63a9da2\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/bc43ae5c44dd4ed501f2535978d3baf80a461f20cdd11d3425cd6ff237251cb7/globalmount\"" pod="openstack/rabbitmq-server-0" Dec 02 08:43:17 crc kubenswrapper[4895]: I1202 08:43:17.389004 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9lfp\" (UniqueName: \"kubernetes.io/projected/7665a2f5-45bb-4972-9631-456ed63a9da2-kube-api-access-v9lfp\") pod \"rabbitmq-server-0\" (UID: \"7665a2f5-45bb-4972-9631-456ed63a9da2\") " pod="openstack/rabbitmq-server-0" Dec 02 08:43:17 crc kubenswrapper[4895]: I1202 08:43:17.427422 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-7eb78b69-cce8-463f-8ce0-16375b1ef171\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7eb78b69-cce8-463f-8ce0-16375b1ef171\") pod \"rabbitmq-server-0\" (UID: \"7665a2f5-45bb-4972-9631-456ed63a9da2\") " pod="openstack/rabbitmq-server-0" Dec 02 08:43:17 crc kubenswrapper[4895]: I1202 08:43:17.533120 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 02 08:43:17 crc kubenswrapper[4895]: I1202 08:43:17.614953 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 02 08:43:17 crc kubenswrapper[4895]: I1202 08:43:17.617209 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:43:17 crc kubenswrapper[4895]: I1202 08:43:17.620321 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Dec 02 08:43:17 crc kubenswrapper[4895]: I1202 08:43:17.620341 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-mhv2p" Dec 02 08:43:17 crc kubenswrapper[4895]: I1202 08:43:17.620341 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Dec 02 08:43:17 crc kubenswrapper[4895]: I1202 08:43:17.620467 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Dec 02 08:43:17 crc kubenswrapper[4895]: I1202 08:43:17.621384 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Dec 02 08:43:17 crc kubenswrapper[4895]: I1202 08:43:17.629950 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 02 08:43:17 crc kubenswrapper[4895]: I1202 08:43:17.673273 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c2d3da5e-ef77-4e6c-8040-ef3000794d29-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"c2d3da5e-ef77-4e6c-8040-ef3000794d29\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:43:17 crc kubenswrapper[4895]: I1202 08:43:17.673650 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-918701b9-c5a0-4ea9-8711-8ba343e88562\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-918701b9-c5a0-4ea9-8711-8ba343e88562\") pod \"rabbitmq-cell1-server-0\" (UID: \"c2d3da5e-ef77-4e6c-8040-ef3000794d29\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:43:17 crc kubenswrapper[4895]: I1202 
08:43:17.673700 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c2d3da5e-ef77-4e6c-8040-ef3000794d29-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"c2d3da5e-ef77-4e6c-8040-ef3000794d29\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:43:17 crc kubenswrapper[4895]: I1202 08:43:17.673777 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c2d3da5e-ef77-4e6c-8040-ef3000794d29-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c2d3da5e-ef77-4e6c-8040-ef3000794d29\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:43:17 crc kubenswrapper[4895]: I1202 08:43:17.673813 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c2d3da5e-ef77-4e6c-8040-ef3000794d29-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"c2d3da5e-ef77-4e6c-8040-ef3000794d29\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:43:17 crc kubenswrapper[4895]: I1202 08:43:17.673832 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckvx2\" (UniqueName: \"kubernetes.io/projected/c2d3da5e-ef77-4e6c-8040-ef3000794d29-kube-api-access-ckvx2\") pod \"rabbitmq-cell1-server-0\" (UID: \"c2d3da5e-ef77-4e6c-8040-ef3000794d29\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:43:17 crc kubenswrapper[4895]: I1202 08:43:17.673859 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c2d3da5e-ef77-4e6c-8040-ef3000794d29-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c2d3da5e-ef77-4e6c-8040-ef3000794d29\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:43:17 crc 
kubenswrapper[4895]: I1202 08:43:17.673886 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c2d3da5e-ef77-4e6c-8040-ef3000794d29-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"c2d3da5e-ef77-4e6c-8040-ef3000794d29\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:43:17 crc kubenswrapper[4895]: I1202 08:43:17.673912 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c2d3da5e-ef77-4e6c-8040-ef3000794d29-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"c2d3da5e-ef77-4e6c-8040-ef3000794d29\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:43:17 crc kubenswrapper[4895]: I1202 08:43:17.684540 4895 generic.go:334] "Generic (PLEG): container finished" podID="5f400aae-225f-4349-ad0c-c4e2cfcaf833" containerID="352f5de4e4d4933767fcc536b93ce8208cf14541ac655d13aaafef6f6af0b208" exitCode=0 Dec 02 08:43:17 crc kubenswrapper[4895]: I1202 08:43:17.684611 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d7b5456f5-5nlvc" event={"ID":"5f400aae-225f-4349-ad0c-c4e2cfcaf833","Type":"ContainerDied","Data":"352f5de4e4d4933767fcc536b93ce8208cf14541ac655d13aaafef6f6af0b208"} Dec 02 08:43:17 crc kubenswrapper[4895]: I1202 08:43:17.684637 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d7b5456f5-5nlvc" event={"ID":"5f400aae-225f-4349-ad0c-c4e2cfcaf833","Type":"ContainerStarted","Data":"762745d5bacdabac9d9cf2a17076818401865d2e247f6108e639a715dfcf73db"} Dec 02 08:43:17 crc kubenswrapper[4895]: I1202 08:43:17.687561 4895 generic.go:334] "Generic (PLEG): container finished" podID="06719ca1-7571-4a00-ab0e-cbbf89f793e3" containerID="df69ebf56b1ca681b69bc5cd2cc94bb3f28527ad72d69a26f1285ef569fc00e7" exitCode=0 Dec 02 08:43:17 crc kubenswrapper[4895]: I1202 08:43:17.687594 4895 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-98ddfc8f-vcggd" event={"ID":"06719ca1-7571-4a00-ab0e-cbbf89f793e3","Type":"ContainerDied","Data":"df69ebf56b1ca681b69bc5cd2cc94bb3f28527ad72d69a26f1285ef569fc00e7"} Dec 02 08:43:17 crc kubenswrapper[4895]: I1202 08:43:17.687618 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-98ddfc8f-vcggd" event={"ID":"06719ca1-7571-4a00-ab0e-cbbf89f793e3","Type":"ContainerStarted","Data":"f4415a5b8ca2530ba4ae63b5e89e9a89b23307bc59627fef7c3fab810dc47355"} Dec 02 08:43:17 crc kubenswrapper[4895]: I1202 08:43:17.775530 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c2d3da5e-ef77-4e6c-8040-ef3000794d29-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"c2d3da5e-ef77-4e6c-8040-ef3000794d29\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:43:17 crc kubenswrapper[4895]: I1202 08:43:17.775575 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckvx2\" (UniqueName: \"kubernetes.io/projected/c2d3da5e-ef77-4e6c-8040-ef3000794d29-kube-api-access-ckvx2\") pod \"rabbitmq-cell1-server-0\" (UID: \"c2d3da5e-ef77-4e6c-8040-ef3000794d29\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:43:17 crc kubenswrapper[4895]: I1202 08:43:17.775611 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c2d3da5e-ef77-4e6c-8040-ef3000794d29-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c2d3da5e-ef77-4e6c-8040-ef3000794d29\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:43:17 crc kubenswrapper[4895]: I1202 08:43:17.775652 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c2d3da5e-ef77-4e6c-8040-ef3000794d29-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"c2d3da5e-ef77-4e6c-8040-ef3000794d29\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:43:17 crc kubenswrapper[4895]: I1202 08:43:17.775692 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c2d3da5e-ef77-4e6c-8040-ef3000794d29-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"c2d3da5e-ef77-4e6c-8040-ef3000794d29\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:43:17 crc kubenswrapper[4895]: I1202 08:43:17.775786 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c2d3da5e-ef77-4e6c-8040-ef3000794d29-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"c2d3da5e-ef77-4e6c-8040-ef3000794d29\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:43:17 crc kubenswrapper[4895]: I1202 08:43:17.775840 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-918701b9-c5a0-4ea9-8711-8ba343e88562\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-918701b9-c5a0-4ea9-8711-8ba343e88562\") pod \"rabbitmq-cell1-server-0\" (UID: \"c2d3da5e-ef77-4e6c-8040-ef3000794d29\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:43:17 crc kubenswrapper[4895]: I1202 08:43:17.775878 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c2d3da5e-ef77-4e6c-8040-ef3000794d29-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"c2d3da5e-ef77-4e6c-8040-ef3000794d29\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:43:17 crc kubenswrapper[4895]: I1202 08:43:17.776255 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c2d3da5e-ef77-4e6c-8040-ef3000794d29-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"c2d3da5e-ef77-4e6c-8040-ef3000794d29\") 
" pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:43:17 crc kubenswrapper[4895]: I1202 08:43:17.776503 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c2d3da5e-ef77-4e6c-8040-ef3000794d29-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"c2d3da5e-ef77-4e6c-8040-ef3000794d29\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:43:17 crc kubenswrapper[4895]: I1202 08:43:17.777339 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c2d3da5e-ef77-4e6c-8040-ef3000794d29-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c2d3da5e-ef77-4e6c-8040-ef3000794d29\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:43:17 crc kubenswrapper[4895]: I1202 08:43:17.778527 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c2d3da5e-ef77-4e6c-8040-ef3000794d29-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c2d3da5e-ef77-4e6c-8040-ef3000794d29\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:43:17 crc kubenswrapper[4895]: I1202 08:43:17.780691 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c2d3da5e-ef77-4e6c-8040-ef3000794d29-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"c2d3da5e-ef77-4e6c-8040-ef3000794d29\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:43:17 crc kubenswrapper[4895]: I1202 08:43:17.781289 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c2d3da5e-ef77-4e6c-8040-ef3000794d29-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"c2d3da5e-ef77-4e6c-8040-ef3000794d29\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:43:17 crc kubenswrapper[4895]: I1202 08:43:17.782196 4895 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c2d3da5e-ef77-4e6c-8040-ef3000794d29-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c2d3da5e-ef77-4e6c-8040-ef3000794d29\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:43:17 crc kubenswrapper[4895]: I1202 08:43:17.782528 4895 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 02 08:43:17 crc kubenswrapper[4895]: I1202 08:43:17.782582 4895 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-918701b9-c5a0-4ea9-8711-8ba343e88562\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-918701b9-c5a0-4ea9-8711-8ba343e88562\") pod \"rabbitmq-cell1-server-0\" (UID: \"c2d3da5e-ef77-4e6c-8040-ef3000794d29\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/11bea3178402b74d8565de09d958d81983ae748ac7ef3168967b3521a42ba14e/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:43:17 crc kubenswrapper[4895]: I1202 08:43:17.782926 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c2d3da5e-ef77-4e6c-8040-ef3000794d29-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"c2d3da5e-ef77-4e6c-8040-ef3000794d29\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:43:17 crc kubenswrapper[4895]: I1202 08:43:17.795814 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckvx2\" (UniqueName: \"kubernetes.io/projected/c2d3da5e-ef77-4e6c-8040-ef3000794d29-kube-api-access-ckvx2\") pod \"rabbitmq-cell1-server-0\" (UID: \"c2d3da5e-ef77-4e6c-8040-ef3000794d29\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:43:17 crc kubenswrapper[4895]: I1202 08:43:17.833718 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"pvc-918701b9-c5a0-4ea9-8711-8ba343e88562\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-918701b9-c5a0-4ea9-8711-8ba343e88562\") pod \"rabbitmq-cell1-server-0\" (UID: \"c2d3da5e-ef77-4e6c-8040-ef3000794d29\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:43:17 crc kubenswrapper[4895]: E1202 08:43:17.868481 4895 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Dec 02 08:43:17 crc kubenswrapper[4895]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/5f400aae-225f-4349-ad0c-c4e2cfcaf833/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Dec 02 08:43:17 crc kubenswrapper[4895]: > podSandboxID="762745d5bacdabac9d9cf2a17076818401865d2e247f6108e639a715dfcf73db" Dec 02 08:43:17 crc kubenswrapper[4895]: E1202 08:43:17.868668 4895 kuberuntime_manager.go:1274] "Unhandled Error" err=< Dec 02 08:43:17 crc kubenswrapper[4895]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv 
--log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n8chc6h5bh56fh546hb7hc8h67h5bchffh577h697h5b5h5bdh59bhf6hf4h558hb5h578h595h5cchfbh644h59ch7fh654h547h587h5cbh5d5h8fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-njr2h,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 
},Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-5d7b5456f5-5nlvc_openstack(5f400aae-225f-4349-ad0c-c4e2cfcaf833): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/5f400aae-225f-4349-ad0c-c4e2cfcaf833/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Dec 02 08:43:17 crc kubenswrapper[4895]: > logger="UnhandledError" Dec 02 08:43:17 crc kubenswrapper[4895]: E1202 08:43:17.869911 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/5f400aae-225f-4349-ad0c-c4e2cfcaf833/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-5d7b5456f5-5nlvc" podUID="5f400aae-225f-4349-ad0c-c4e2cfcaf833" Dec 02 08:43:17 crc kubenswrapper[4895]: I1202 08:43:17.935421 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:43:18 crc kubenswrapper[4895]: I1202 08:43:18.030654 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 02 08:43:18 crc kubenswrapper[4895]: I1202 08:43:18.176843 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 02 08:43:18 crc kubenswrapper[4895]: W1202 08:43:18.182520 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2d3da5e_ef77_4e6c_8040_ef3000794d29.slice/crio-56990082aac0476a0e8d5ae7604014c1cfe2b5cd5ba762739a2ec9e517720507 WatchSource:0}: Error finding container 56990082aac0476a0e8d5ae7604014c1cfe2b5cd5ba762739a2ec9e517720507: Status 404 returned error can't find the container with id 56990082aac0476a0e8d5ae7604014c1cfe2b5cd5ba762739a2ec9e517720507 Dec 02 08:43:18 crc kubenswrapper[4895]: I1202 08:43:18.620302 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Dec 02 08:43:18 crc kubenswrapper[4895]: I1202 08:43:18.622041 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Dec 02 08:43:18 crc kubenswrapper[4895]: I1202 08:43:18.624082 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Dec 02 08:43:18 crc kubenswrapper[4895]: I1202 08:43:18.624109 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Dec 02 08:43:18 crc kubenswrapper[4895]: I1202 08:43:18.624678 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-s7x4h" Dec 02 08:43:18 crc kubenswrapper[4895]: I1202 08:43:18.630450 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Dec 02 08:43:18 crc kubenswrapper[4895]: I1202 08:43:18.632816 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Dec 02 08:43:18 crc kubenswrapper[4895]: I1202 08:43:18.633434 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Dec 02 08:43:18 crc kubenswrapper[4895]: I1202 08:43:18.696627 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"7665a2f5-45bb-4972-9631-456ed63a9da2","Type":"ContainerStarted","Data":"32b4fff28c31289beee34985ce94905335cf956fd78d6baf03487b88c87187e8"} Dec 02 08:43:18 crc kubenswrapper[4895]: I1202 08:43:18.697938 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-98ddfc8f-vcggd" event={"ID":"06719ca1-7571-4a00-ab0e-cbbf89f793e3","Type":"ContainerStarted","Data":"d005b72044bc1e2c07a10573c98b4fb475545b20a5b46e8e4aad77182f1aa04e"} Dec 02 08:43:18 crc kubenswrapper[4895]: I1202 08:43:18.698918 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-98ddfc8f-vcggd" Dec 02 08:43:18 crc kubenswrapper[4895]: I1202 08:43:18.699131 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/340f7b33-817a-47bb-90f7-69a41144137d-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"340f7b33-817a-47bb-90f7-69a41144137d\") " pod="openstack/openstack-galera-0" Dec 02 08:43:18 crc kubenswrapper[4895]: I1202 08:43:18.699174 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-e00b0a2f-6d15-47f9-b679-70f7283525f6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e00b0a2f-6d15-47f9-b679-70f7283525f6\") pod \"openstack-galera-0\" (UID: \"340f7b33-817a-47bb-90f7-69a41144137d\") " pod="openstack/openstack-galera-0" Dec 02 08:43:18 crc kubenswrapper[4895]: I1202 08:43:18.699210 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/340f7b33-817a-47bb-90f7-69a41144137d-config-data-default\") pod \"openstack-galera-0\" (UID: \"340f7b33-817a-47bb-90f7-69a41144137d\") " pod="openstack/openstack-galera-0" Dec 02 08:43:18 crc kubenswrapper[4895]: I1202 08:43:18.699242 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/340f7b33-817a-47bb-90f7-69a41144137d-config-data-generated\") pod \"openstack-galera-0\" (UID: \"340f7b33-817a-47bb-90f7-69a41144137d\") " pod="openstack/openstack-galera-0" Dec 02 08:43:18 crc kubenswrapper[4895]: I1202 08:43:18.699290 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xbdr\" (UniqueName: \"kubernetes.io/projected/340f7b33-817a-47bb-90f7-69a41144137d-kube-api-access-9xbdr\") pod \"openstack-galera-0\" (UID: \"340f7b33-817a-47bb-90f7-69a41144137d\") " pod="openstack/openstack-galera-0" Dec 02 08:43:18 crc kubenswrapper[4895]: I1202 08:43:18.699342 4895 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/340f7b33-817a-47bb-90f7-69a41144137d-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"340f7b33-817a-47bb-90f7-69a41144137d\") " pod="openstack/openstack-galera-0" Dec 02 08:43:18 crc kubenswrapper[4895]: I1202 08:43:18.699377 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/340f7b33-817a-47bb-90f7-69a41144137d-operator-scripts\") pod \"openstack-galera-0\" (UID: \"340f7b33-817a-47bb-90f7-69a41144137d\") " pod="openstack/openstack-galera-0" Dec 02 08:43:18 crc kubenswrapper[4895]: I1202 08:43:18.699413 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/340f7b33-817a-47bb-90f7-69a41144137d-kolla-config\") pod \"openstack-galera-0\" (UID: \"340f7b33-817a-47bb-90f7-69a41144137d\") " pod="openstack/openstack-galera-0" Dec 02 08:43:18 crc kubenswrapper[4895]: I1202 08:43:18.699508 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c2d3da5e-ef77-4e6c-8040-ef3000794d29","Type":"ContainerStarted","Data":"56990082aac0476a0e8d5ae7604014c1cfe2b5cd5ba762739a2ec9e517720507"} Dec 02 08:43:18 crc kubenswrapper[4895]: I1202 08:43:18.715522 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-98ddfc8f-vcggd" podStartSLOduration=2.7155055900000002 podStartE2EDuration="2.71550559s" podCreationTimestamp="2025-12-02 08:43:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:43:18.715206601 +0000 UTC m=+4809.886066214" watchObservedRunningTime="2025-12-02 08:43:18.71550559 +0000 UTC m=+4809.886365213" Dec 02 08:43:18 crc kubenswrapper[4895]: 
I1202 08:43:18.801495 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/340f7b33-817a-47bb-90f7-69a41144137d-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"340f7b33-817a-47bb-90f7-69a41144137d\") " pod="openstack/openstack-galera-0" Dec 02 08:43:18 crc kubenswrapper[4895]: I1202 08:43:18.801554 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-e00b0a2f-6d15-47f9-b679-70f7283525f6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e00b0a2f-6d15-47f9-b679-70f7283525f6\") pod \"openstack-galera-0\" (UID: \"340f7b33-817a-47bb-90f7-69a41144137d\") " pod="openstack/openstack-galera-0" Dec 02 08:43:18 crc kubenswrapper[4895]: I1202 08:43:18.801583 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/340f7b33-817a-47bb-90f7-69a41144137d-config-data-default\") pod \"openstack-galera-0\" (UID: \"340f7b33-817a-47bb-90f7-69a41144137d\") " pod="openstack/openstack-galera-0" Dec 02 08:43:18 crc kubenswrapper[4895]: I1202 08:43:18.801604 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/340f7b33-817a-47bb-90f7-69a41144137d-config-data-generated\") pod \"openstack-galera-0\" (UID: \"340f7b33-817a-47bb-90f7-69a41144137d\") " pod="openstack/openstack-galera-0" Dec 02 08:43:18 crc kubenswrapper[4895]: I1202 08:43:18.801641 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9xbdr\" (UniqueName: \"kubernetes.io/projected/340f7b33-817a-47bb-90f7-69a41144137d-kube-api-access-9xbdr\") pod \"openstack-galera-0\" (UID: \"340f7b33-817a-47bb-90f7-69a41144137d\") " pod="openstack/openstack-galera-0" Dec 02 08:43:18 crc kubenswrapper[4895]: I1202 08:43:18.801713 4895 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/340f7b33-817a-47bb-90f7-69a41144137d-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"340f7b33-817a-47bb-90f7-69a41144137d\") " pod="openstack/openstack-galera-0" Dec 02 08:43:18 crc kubenswrapper[4895]: I1202 08:43:18.801802 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/340f7b33-817a-47bb-90f7-69a41144137d-operator-scripts\") pod \"openstack-galera-0\" (UID: \"340f7b33-817a-47bb-90f7-69a41144137d\") " pod="openstack/openstack-galera-0" Dec 02 08:43:18 crc kubenswrapper[4895]: I1202 08:43:18.801832 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/340f7b33-817a-47bb-90f7-69a41144137d-kolla-config\") pod \"openstack-galera-0\" (UID: \"340f7b33-817a-47bb-90f7-69a41144137d\") " pod="openstack/openstack-galera-0" Dec 02 08:43:18 crc kubenswrapper[4895]: I1202 08:43:18.802558 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/340f7b33-817a-47bb-90f7-69a41144137d-kolla-config\") pod \"openstack-galera-0\" (UID: \"340f7b33-817a-47bb-90f7-69a41144137d\") " pod="openstack/openstack-galera-0" Dec 02 08:43:18 crc kubenswrapper[4895]: I1202 08:43:18.803340 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/340f7b33-817a-47bb-90f7-69a41144137d-config-data-generated\") pod \"openstack-galera-0\" (UID: \"340f7b33-817a-47bb-90f7-69a41144137d\") " pod="openstack/openstack-galera-0" Dec 02 08:43:18 crc kubenswrapper[4895]: I1202 08:43:18.803533 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: 
\"kubernetes.io/configmap/340f7b33-817a-47bb-90f7-69a41144137d-config-data-default\") pod \"openstack-galera-0\" (UID: \"340f7b33-817a-47bb-90f7-69a41144137d\") " pod="openstack/openstack-galera-0" Dec 02 08:43:18 crc kubenswrapper[4895]: I1202 08:43:18.806461 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/340f7b33-817a-47bb-90f7-69a41144137d-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"340f7b33-817a-47bb-90f7-69a41144137d\") " pod="openstack/openstack-galera-0" Dec 02 08:43:18 crc kubenswrapper[4895]: I1202 08:43:18.808387 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/340f7b33-817a-47bb-90f7-69a41144137d-operator-scripts\") pod \"openstack-galera-0\" (UID: \"340f7b33-817a-47bb-90f7-69a41144137d\") " pod="openstack/openstack-galera-0" Dec 02 08:43:18 crc kubenswrapper[4895]: I1202 08:43:18.817759 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/340f7b33-817a-47bb-90f7-69a41144137d-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"340f7b33-817a-47bb-90f7-69a41144137d\") " pod="openstack/openstack-galera-0" Dec 02 08:43:18 crc kubenswrapper[4895]: I1202 08:43:18.829253 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xbdr\" (UniqueName: \"kubernetes.io/projected/340f7b33-817a-47bb-90f7-69a41144137d-kube-api-access-9xbdr\") pod \"openstack-galera-0\" (UID: \"340f7b33-817a-47bb-90f7-69a41144137d\") " pod="openstack/openstack-galera-0" Dec 02 08:43:18 crc kubenswrapper[4895]: I1202 08:43:18.833661 4895 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 02 08:43:18 crc kubenswrapper[4895]: I1202 08:43:18.833715 4895 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-e00b0a2f-6d15-47f9-b679-70f7283525f6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e00b0a2f-6d15-47f9-b679-70f7283525f6\") pod \"openstack-galera-0\" (UID: \"340f7b33-817a-47bb-90f7-69a41144137d\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/c29d44538f9451436ed0da65f59af3a673ee8878c4fe6d47837a73d4dd864a99/globalmount\"" pod="openstack/openstack-galera-0" Dec 02 08:43:19 crc kubenswrapper[4895]: I1202 08:43:19.109146 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Dec 02 08:43:19 crc kubenswrapper[4895]: I1202 08:43:19.110182 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Dec 02 08:43:19 crc kubenswrapper[4895]: I1202 08:43:19.112370 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-f5phh" Dec 02 08:43:19 crc kubenswrapper[4895]: I1202 08:43:19.112647 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Dec 02 08:43:19 crc kubenswrapper[4895]: I1202 08:43:19.119733 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 02 08:43:19 crc kubenswrapper[4895]: I1202 08:43:19.185600 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-e00b0a2f-6d15-47f9-b679-70f7283525f6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e00b0a2f-6d15-47f9-b679-70f7283525f6\") pod \"openstack-galera-0\" (UID: \"340f7b33-817a-47bb-90f7-69a41144137d\") " pod="openstack/openstack-galera-0" Dec 02 08:43:19 crc kubenswrapper[4895]: I1202 08:43:19.207575 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5tw2\" (UniqueName: 
\"kubernetes.io/projected/a4b2f1ac-db64-4f2e-8d51-8470c6d1e4f9-kube-api-access-q5tw2\") pod \"memcached-0\" (UID: \"a4b2f1ac-db64-4f2e-8d51-8470c6d1e4f9\") " pod="openstack/memcached-0" Dec 02 08:43:19 crc kubenswrapper[4895]: I1202 08:43:19.207689 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a4b2f1ac-db64-4f2e-8d51-8470c6d1e4f9-kolla-config\") pod \"memcached-0\" (UID: \"a4b2f1ac-db64-4f2e-8d51-8470c6d1e4f9\") " pod="openstack/memcached-0" Dec 02 08:43:19 crc kubenswrapper[4895]: I1202 08:43:19.207711 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a4b2f1ac-db64-4f2e-8d51-8470c6d1e4f9-config-data\") pod \"memcached-0\" (UID: \"a4b2f1ac-db64-4f2e-8d51-8470c6d1e4f9\") " pod="openstack/memcached-0" Dec 02 08:43:19 crc kubenswrapper[4895]: I1202 08:43:19.294967 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Dec 02 08:43:19 crc kubenswrapper[4895]: I1202 08:43:19.309036 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a4b2f1ac-db64-4f2e-8d51-8470c6d1e4f9-kolla-config\") pod \"memcached-0\" (UID: \"a4b2f1ac-db64-4f2e-8d51-8470c6d1e4f9\") " pod="openstack/memcached-0" Dec 02 08:43:19 crc kubenswrapper[4895]: I1202 08:43:19.309086 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a4b2f1ac-db64-4f2e-8d51-8470c6d1e4f9-config-data\") pod \"memcached-0\" (UID: \"a4b2f1ac-db64-4f2e-8d51-8470c6d1e4f9\") " pod="openstack/memcached-0" Dec 02 08:43:19 crc kubenswrapper[4895]: I1202 08:43:19.309128 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5tw2\" (UniqueName: \"kubernetes.io/projected/a4b2f1ac-db64-4f2e-8d51-8470c6d1e4f9-kube-api-access-q5tw2\") pod \"memcached-0\" (UID: \"a4b2f1ac-db64-4f2e-8d51-8470c6d1e4f9\") " pod="openstack/memcached-0" Dec 02 08:43:19 crc kubenswrapper[4895]: I1202 08:43:19.310168 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a4b2f1ac-db64-4f2e-8d51-8470c6d1e4f9-kolla-config\") pod \"memcached-0\" (UID: \"a4b2f1ac-db64-4f2e-8d51-8470c6d1e4f9\") " pod="openstack/memcached-0" Dec 02 08:43:19 crc kubenswrapper[4895]: I1202 08:43:19.310313 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a4b2f1ac-db64-4f2e-8d51-8470c6d1e4f9-config-data\") pod \"memcached-0\" (UID: \"a4b2f1ac-db64-4f2e-8d51-8470c6d1e4f9\") " pod="openstack/memcached-0" Dec 02 08:43:19 crc kubenswrapper[4895]: I1202 08:43:19.347856 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5tw2\" (UniqueName: 
\"kubernetes.io/projected/a4b2f1ac-db64-4f2e-8d51-8470c6d1e4f9-kube-api-access-q5tw2\") pod \"memcached-0\" (UID: \"a4b2f1ac-db64-4f2e-8d51-8470c6d1e4f9\") " pod="openstack/memcached-0" Dec 02 08:43:19 crc kubenswrapper[4895]: I1202 08:43:19.428876 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Dec 02 08:43:19 crc kubenswrapper[4895]: I1202 08:43:19.709060 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c2d3da5e-ef77-4e6c-8040-ef3000794d29","Type":"ContainerStarted","Data":"6674e75dfd35f2994edea8ee60804b1e48a883232d70c5c9855dfbf5610bfa25"} Dec 02 08:43:19 crc kubenswrapper[4895]: I1202 08:43:19.710815 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"7665a2f5-45bb-4972-9631-456ed63a9da2","Type":"ContainerStarted","Data":"faeb2e140d036d7094647159d5e7ace7ba6dcf7aa1fd2e79dc0d6109ed84a54c"} Dec 02 08:43:19 crc kubenswrapper[4895]: I1202 08:43:19.713264 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d7b5456f5-5nlvc" event={"ID":"5f400aae-225f-4349-ad0c-c4e2cfcaf833","Type":"ContainerStarted","Data":"b0a958345e4af3f2d1e5d20fc711fbdf4183afd4ada1b51690aba8078f78c414"} Dec 02 08:43:19 crc kubenswrapper[4895]: I1202 08:43:19.713524 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5d7b5456f5-5nlvc" Dec 02 08:43:19 crc kubenswrapper[4895]: I1202 08:43:19.794071 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5d7b5456f5-5nlvc" podStartSLOduration=4.794043113 podStartE2EDuration="4.794043113s" podCreationTimestamp="2025-12-02 08:43:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:43:19.786513079 +0000 UTC m=+4810.957372692" watchObservedRunningTime="2025-12-02 08:43:19.794043113 
+0000 UTC m=+4810.964902726" Dec 02 08:43:19 crc kubenswrapper[4895]: I1202 08:43:19.871794 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Dec 02 08:43:19 crc kubenswrapper[4895]: W1202 08:43:19.881847 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod340f7b33_817a_47bb_90f7_69a41144137d.slice/crio-a591ac04d75acbdadc5f015adb8fc98a542cc6162bfe367ed1e8b73594146ddf WatchSource:0}: Error finding container a591ac04d75acbdadc5f015adb8fc98a542cc6162bfe367ed1e8b73594146ddf: Status 404 returned error can't find the container with id a591ac04d75acbdadc5f015adb8fc98a542cc6162bfe367ed1e8b73594146ddf Dec 02 08:43:19 crc kubenswrapper[4895]: I1202 08:43:19.922298 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 02 08:43:19 crc kubenswrapper[4895]: W1202 08:43:19.926809 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda4b2f1ac_db64_4f2e_8d51_8470c6d1e4f9.slice/crio-45cbdb8c1fc0ac941b23d659b0c17de557d9cfdf9554f297862888d0efb84d5e WatchSource:0}: Error finding container 45cbdb8c1fc0ac941b23d659b0c17de557d9cfdf9554f297862888d0efb84d5e: Status 404 returned error can't find the container with id 45cbdb8c1fc0ac941b23d659b0c17de557d9cfdf9554f297862888d0efb84d5e Dec 02 08:43:20 crc kubenswrapper[4895]: I1202 08:43:20.265600 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 02 08:43:20 crc kubenswrapper[4895]: I1202 08:43:20.266777 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 02 08:43:20 crc kubenswrapper[4895]: I1202 08:43:20.269439 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-wsdp9" Dec 02 08:43:20 crc kubenswrapper[4895]: I1202 08:43:20.269484 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Dec 02 08:43:20 crc kubenswrapper[4895]: I1202 08:43:20.271693 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Dec 02 08:43:20 crc kubenswrapper[4895]: I1202 08:43:20.271837 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Dec 02 08:43:20 crc kubenswrapper[4895]: I1202 08:43:20.286595 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 02 08:43:20 crc kubenswrapper[4895]: I1202 08:43:20.425797 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/2857fca5-4863-4518-b69e-4ceeb0625fb5-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"2857fca5-4863-4518-b69e-4ceeb0625fb5\") " pod="openstack/openstack-cell1-galera-0" Dec 02 08:43:20 crc kubenswrapper[4895]: I1202 08:43:20.426091 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-deb2ef45-e088-4427-9e46-30566dd37609\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-deb2ef45-e088-4427-9e46-30566dd37609\") pod \"openstack-cell1-galera-0\" (UID: \"2857fca5-4863-4518-b69e-4ceeb0625fb5\") " pod="openstack/openstack-cell1-galera-0" Dec 02 08:43:20 crc kubenswrapper[4895]: I1202 08:43:20.426136 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/2857fca5-4863-4518-b69e-4ceeb0625fb5-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"2857fca5-4863-4518-b69e-4ceeb0625fb5\") " pod="openstack/openstack-cell1-galera-0" Dec 02 08:43:20 crc kubenswrapper[4895]: I1202 08:43:20.426179 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/2857fca5-4863-4518-b69e-4ceeb0625fb5-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"2857fca5-4863-4518-b69e-4ceeb0625fb5\") " pod="openstack/openstack-cell1-galera-0" Dec 02 08:43:20 crc kubenswrapper[4895]: I1202 08:43:20.426239 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2857fca5-4863-4518-b69e-4ceeb0625fb5-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"2857fca5-4863-4518-b69e-4ceeb0625fb5\") " pod="openstack/openstack-cell1-galera-0" Dec 02 08:43:20 crc kubenswrapper[4895]: I1202 08:43:20.426294 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/2857fca5-4863-4518-b69e-4ceeb0625fb5-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"2857fca5-4863-4518-b69e-4ceeb0625fb5\") " pod="openstack/openstack-cell1-galera-0" Dec 02 08:43:20 crc kubenswrapper[4895]: I1202 08:43:20.426344 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2857fca5-4863-4518-b69e-4ceeb0625fb5-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"2857fca5-4863-4518-b69e-4ceeb0625fb5\") " pod="openstack/openstack-cell1-galera-0" Dec 02 08:43:20 crc kubenswrapper[4895]: I1202 08:43:20.426414 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-l9f99\" (UniqueName: \"kubernetes.io/projected/2857fca5-4863-4518-b69e-4ceeb0625fb5-kube-api-access-l9f99\") pod \"openstack-cell1-galera-0\" (UID: \"2857fca5-4863-4518-b69e-4ceeb0625fb5\") " pod="openstack/openstack-cell1-galera-0" Dec 02 08:43:20 crc kubenswrapper[4895]: I1202 08:43:20.527774 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2857fca5-4863-4518-b69e-4ceeb0625fb5-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"2857fca5-4863-4518-b69e-4ceeb0625fb5\") " pod="openstack/openstack-cell1-galera-0" Dec 02 08:43:20 crc kubenswrapper[4895]: I1202 08:43:20.527822 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/2857fca5-4863-4518-b69e-4ceeb0625fb5-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"2857fca5-4863-4518-b69e-4ceeb0625fb5\") " pod="openstack/openstack-cell1-galera-0" Dec 02 08:43:20 crc kubenswrapper[4895]: I1202 08:43:20.527853 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2857fca5-4863-4518-b69e-4ceeb0625fb5-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"2857fca5-4863-4518-b69e-4ceeb0625fb5\") " pod="openstack/openstack-cell1-galera-0" Dec 02 08:43:20 crc kubenswrapper[4895]: I1202 08:43:20.527901 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9f99\" (UniqueName: \"kubernetes.io/projected/2857fca5-4863-4518-b69e-4ceeb0625fb5-kube-api-access-l9f99\") pod \"openstack-cell1-galera-0\" (UID: \"2857fca5-4863-4518-b69e-4ceeb0625fb5\") " pod="openstack/openstack-cell1-galera-0" Dec 02 08:43:20 crc kubenswrapper[4895]: I1202 08:43:20.528076 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/2857fca5-4863-4518-b69e-4ceeb0625fb5-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"2857fca5-4863-4518-b69e-4ceeb0625fb5\") " pod="openstack/openstack-cell1-galera-0" Dec 02 08:43:20 crc kubenswrapper[4895]: I1202 08:43:20.528873 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-deb2ef45-e088-4427-9e46-30566dd37609\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-deb2ef45-e088-4427-9e46-30566dd37609\") pod \"openstack-cell1-galera-0\" (UID: \"2857fca5-4863-4518-b69e-4ceeb0625fb5\") " pod="openstack/openstack-cell1-galera-0" Dec 02 08:43:20 crc kubenswrapper[4895]: I1202 08:43:20.528986 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2857fca5-4863-4518-b69e-4ceeb0625fb5-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"2857fca5-4863-4518-b69e-4ceeb0625fb5\") " pod="openstack/openstack-cell1-galera-0" Dec 02 08:43:20 crc kubenswrapper[4895]: I1202 08:43:20.529012 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/2857fca5-4863-4518-b69e-4ceeb0625fb5-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"2857fca5-4863-4518-b69e-4ceeb0625fb5\") " pod="openstack/openstack-cell1-galera-0" Dec 02 08:43:20 crc kubenswrapper[4895]: I1202 08:43:20.529110 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/2857fca5-4863-4518-b69e-4ceeb0625fb5-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"2857fca5-4863-4518-b69e-4ceeb0625fb5\") " pod="openstack/openstack-cell1-galera-0" Dec 02 08:43:20 crc kubenswrapper[4895]: I1202 08:43:20.529423 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: 
\"kubernetes.io/empty-dir/2857fca5-4863-4518-b69e-4ceeb0625fb5-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"2857fca5-4863-4518-b69e-4ceeb0625fb5\") " pod="openstack/openstack-cell1-galera-0" Dec 02 08:43:20 crc kubenswrapper[4895]: I1202 08:43:20.529477 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2857fca5-4863-4518-b69e-4ceeb0625fb5-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"2857fca5-4863-4518-b69e-4ceeb0625fb5\") " pod="openstack/openstack-cell1-galera-0" Dec 02 08:43:20 crc kubenswrapper[4895]: I1202 08:43:20.529996 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2857fca5-4863-4518-b69e-4ceeb0625fb5-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"2857fca5-4863-4518-b69e-4ceeb0625fb5\") " pod="openstack/openstack-cell1-galera-0" Dec 02 08:43:20 crc kubenswrapper[4895]: I1202 08:43:20.532274 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/2857fca5-4863-4518-b69e-4ceeb0625fb5-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"2857fca5-4863-4518-b69e-4ceeb0625fb5\") " pod="openstack/openstack-cell1-galera-0" Dec 02 08:43:20 crc kubenswrapper[4895]: I1202 08:43:20.532314 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2857fca5-4863-4518-b69e-4ceeb0625fb5-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"2857fca5-4863-4518-b69e-4ceeb0625fb5\") " pod="openstack/openstack-cell1-galera-0" Dec 02 08:43:20 crc kubenswrapper[4895]: I1202 08:43:20.534498 4895 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 02 08:43:20 crc kubenswrapper[4895]: I1202 08:43:20.534526 4895 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-deb2ef45-e088-4427-9e46-30566dd37609\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-deb2ef45-e088-4427-9e46-30566dd37609\") pod \"openstack-cell1-galera-0\" (UID: \"2857fca5-4863-4518-b69e-4ceeb0625fb5\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/c3539ef04f9a543e98f264376b05a1bcc805469d5f72b5ae49dd0c18e912ca3a/globalmount\"" pod="openstack/openstack-cell1-galera-0" Dec 02 08:43:20 crc kubenswrapper[4895]: I1202 08:43:20.545575 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9f99\" (UniqueName: \"kubernetes.io/projected/2857fca5-4863-4518-b69e-4ceeb0625fb5-kube-api-access-l9f99\") pod \"openstack-cell1-galera-0\" (UID: \"2857fca5-4863-4518-b69e-4ceeb0625fb5\") " pod="openstack/openstack-cell1-galera-0" Dec 02 08:43:20 crc kubenswrapper[4895]: I1202 08:43:20.558687 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-deb2ef45-e088-4427-9e46-30566dd37609\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-deb2ef45-e088-4427-9e46-30566dd37609\") pod \"openstack-cell1-galera-0\" (UID: \"2857fca5-4863-4518-b69e-4ceeb0625fb5\") " pod="openstack/openstack-cell1-galera-0" Dec 02 08:43:20 crc kubenswrapper[4895]: I1202 08:43:20.613605 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 02 08:43:20 crc kubenswrapper[4895]: I1202 08:43:20.729129 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"340f7b33-817a-47bb-90f7-69a41144137d","Type":"ContainerStarted","Data":"35f12744a1e6869f89486494ed69189615d734c4f719b2a5ea7e3dddff833466"} Dec 02 08:43:20 crc kubenswrapper[4895]: I1202 08:43:20.729201 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"340f7b33-817a-47bb-90f7-69a41144137d","Type":"ContainerStarted","Data":"a591ac04d75acbdadc5f015adb8fc98a542cc6162bfe367ed1e8b73594146ddf"} Dec 02 08:43:20 crc kubenswrapper[4895]: I1202 08:43:20.734215 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"a4b2f1ac-db64-4f2e-8d51-8470c6d1e4f9","Type":"ContainerStarted","Data":"7231e985e8d6062f28c2ad0183d3897620e31ee085f185190e5a2b6044a99c3e"} Dec 02 08:43:20 crc kubenswrapper[4895]: I1202 08:43:20.734298 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"a4b2f1ac-db64-4f2e-8d51-8470c6d1e4f9","Type":"ContainerStarted","Data":"45cbdb8c1fc0ac941b23d659b0c17de557d9cfdf9554f297862888d0efb84d5e"} Dec 02 08:43:20 crc kubenswrapper[4895]: I1202 08:43:20.780829 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=1.7808119919999998 podStartE2EDuration="1.780811992s" podCreationTimestamp="2025-12-02 08:43:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:43:20.77689933 +0000 UTC m=+4811.947758943" watchObservedRunningTime="2025-12-02 08:43:20.780811992 +0000 UTC m=+4811.951671605" Dec 02 08:43:21 crc kubenswrapper[4895]: I1202 08:43:21.063078 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 02 
08:43:21 crc kubenswrapper[4895]: I1202 08:43:21.748112 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"2857fca5-4863-4518-b69e-4ceeb0625fb5","Type":"ContainerStarted","Data":"d122f27f2a1d63ca202eec71c810be277a80ee1c945317628eeffbf6faf552bf"} Dec 02 08:43:21 crc kubenswrapper[4895]: I1202 08:43:21.748468 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"2857fca5-4863-4518-b69e-4ceeb0625fb5","Type":"ContainerStarted","Data":"4d1cf64d6d7d614dce774b4dc2390c166d9459ce304ecf0e759c3216ab7f8b43"} Dec 02 08:43:21 crc kubenswrapper[4895]: I1202 08:43:21.748830 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Dec 02 08:43:24 crc kubenswrapper[4895]: I1202 08:43:24.773417 4895 generic.go:334] "Generic (PLEG): container finished" podID="340f7b33-817a-47bb-90f7-69a41144137d" containerID="35f12744a1e6869f89486494ed69189615d734c4f719b2a5ea7e3dddff833466" exitCode=0 Dec 02 08:43:24 crc kubenswrapper[4895]: I1202 08:43:24.773517 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"340f7b33-817a-47bb-90f7-69a41144137d","Type":"ContainerDied","Data":"35f12744a1e6869f89486494ed69189615d734c4f719b2a5ea7e3dddff833466"} Dec 02 08:43:24 crc kubenswrapper[4895]: I1202 08:43:24.778252 4895 generic.go:334] "Generic (PLEG): container finished" podID="2857fca5-4863-4518-b69e-4ceeb0625fb5" containerID="d122f27f2a1d63ca202eec71c810be277a80ee1c945317628eeffbf6faf552bf" exitCode=0 Dec 02 08:43:24 crc kubenswrapper[4895]: I1202 08:43:24.778342 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"2857fca5-4863-4518-b69e-4ceeb0625fb5","Type":"ContainerDied","Data":"d122f27f2a1d63ca202eec71c810be277a80ee1c945317628eeffbf6faf552bf"} Dec 02 08:43:25 crc kubenswrapper[4895]: I1202 08:43:25.788226 4895 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"2857fca5-4863-4518-b69e-4ceeb0625fb5","Type":"ContainerStarted","Data":"420511065fdeef819f1c9ff53ebc857d65389345d4dbb287c94de5e6c6c0132f"} Dec 02 08:43:25 crc kubenswrapper[4895]: I1202 08:43:25.790691 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"340f7b33-817a-47bb-90f7-69a41144137d","Type":"ContainerStarted","Data":"e07e38755151116c6cda82d18d9d0e9a00f1c02ce1ee5a66b8b687744b66815e"} Dec 02 08:43:25 crc kubenswrapper[4895]: I1202 08:43:25.816882 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=6.816863416 podStartE2EDuration="6.816863416s" podCreationTimestamp="2025-12-02 08:43:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:43:25.809716935 +0000 UTC m=+4816.980576568" watchObservedRunningTime="2025-12-02 08:43:25.816863416 +0000 UTC m=+4816.987723049" Dec 02 08:43:25 crc kubenswrapper[4895]: I1202 08:43:25.830667 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=8.830643425 podStartE2EDuration="8.830643425s" podCreationTimestamp="2025-12-02 08:43:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:43:25.82855488 +0000 UTC m=+4816.999414543" watchObservedRunningTime="2025-12-02 08:43:25.830643425 +0000 UTC m=+4817.001503048" Dec 02 08:43:26 crc kubenswrapper[4895]: I1202 08:43:26.338931 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5d7b5456f5-5nlvc" Dec 02 08:43:26 crc kubenswrapper[4895]: I1202 08:43:26.780887 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-98ddfc8f-vcggd" 
Dec 02 08:43:26 crc kubenswrapper[4895]: I1202 08:43:26.840704 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d7b5456f5-5nlvc"] Dec 02 08:43:26 crc kubenswrapper[4895]: I1202 08:43:26.840973 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5d7b5456f5-5nlvc" podUID="5f400aae-225f-4349-ad0c-c4e2cfcaf833" containerName="dnsmasq-dns" containerID="cri-o://b0a958345e4af3f2d1e5d20fc711fbdf4183afd4ada1b51690aba8078f78c414" gracePeriod=10 Dec 02 08:43:27 crc kubenswrapper[4895]: I1202 08:43:27.808396 4895 generic.go:334] "Generic (PLEG): container finished" podID="5f400aae-225f-4349-ad0c-c4e2cfcaf833" containerID="b0a958345e4af3f2d1e5d20fc711fbdf4183afd4ada1b51690aba8078f78c414" exitCode=0 Dec 02 08:43:27 crc kubenswrapper[4895]: I1202 08:43:27.808452 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d7b5456f5-5nlvc" event={"ID":"5f400aae-225f-4349-ad0c-c4e2cfcaf833","Type":"ContainerDied","Data":"b0a958345e4af3f2d1e5d20fc711fbdf4183afd4ada1b51690aba8078f78c414"} Dec 02 08:43:27 crc kubenswrapper[4895]: I1202 08:43:27.809116 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d7b5456f5-5nlvc" event={"ID":"5f400aae-225f-4349-ad0c-c4e2cfcaf833","Type":"ContainerDied","Data":"762745d5bacdabac9d9cf2a17076818401865d2e247f6108e639a715dfcf73db"} Dec 02 08:43:27 crc kubenswrapper[4895]: I1202 08:43:27.809135 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="762745d5bacdabac9d9cf2a17076818401865d2e247f6108e639a715dfcf73db" Dec 02 08:43:27 crc kubenswrapper[4895]: I1202 08:43:27.834756 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d7b5456f5-5nlvc" Dec 02 08:43:27 crc kubenswrapper[4895]: I1202 08:43:27.939992 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-njr2h\" (UniqueName: \"kubernetes.io/projected/5f400aae-225f-4349-ad0c-c4e2cfcaf833-kube-api-access-njr2h\") pod \"5f400aae-225f-4349-ad0c-c4e2cfcaf833\" (UID: \"5f400aae-225f-4349-ad0c-c4e2cfcaf833\") " Dec 02 08:43:27 crc kubenswrapper[4895]: I1202 08:43:27.940260 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5f400aae-225f-4349-ad0c-c4e2cfcaf833-dns-svc\") pod \"5f400aae-225f-4349-ad0c-c4e2cfcaf833\" (UID: \"5f400aae-225f-4349-ad0c-c4e2cfcaf833\") " Dec 02 08:43:27 crc kubenswrapper[4895]: I1202 08:43:27.940344 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f400aae-225f-4349-ad0c-c4e2cfcaf833-config\") pod \"5f400aae-225f-4349-ad0c-c4e2cfcaf833\" (UID: \"5f400aae-225f-4349-ad0c-c4e2cfcaf833\") " Dec 02 08:43:27 crc kubenswrapper[4895]: I1202 08:43:27.945980 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f400aae-225f-4349-ad0c-c4e2cfcaf833-kube-api-access-njr2h" (OuterVolumeSpecName: "kube-api-access-njr2h") pod "5f400aae-225f-4349-ad0c-c4e2cfcaf833" (UID: "5f400aae-225f-4349-ad0c-c4e2cfcaf833"). InnerVolumeSpecName "kube-api-access-njr2h". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:43:27 crc kubenswrapper[4895]: I1202 08:43:27.986057 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f400aae-225f-4349-ad0c-c4e2cfcaf833-config" (OuterVolumeSpecName: "config") pod "5f400aae-225f-4349-ad0c-c4e2cfcaf833" (UID: "5f400aae-225f-4349-ad0c-c4e2cfcaf833"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:43:27 crc kubenswrapper[4895]: I1202 08:43:27.988042 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f400aae-225f-4349-ad0c-c4e2cfcaf833-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5f400aae-225f-4349-ad0c-c4e2cfcaf833" (UID: "5f400aae-225f-4349-ad0c-c4e2cfcaf833"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:43:28 crc kubenswrapper[4895]: I1202 08:43:28.042140 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-njr2h\" (UniqueName: \"kubernetes.io/projected/5f400aae-225f-4349-ad0c-c4e2cfcaf833-kube-api-access-njr2h\") on node \"crc\" DevicePath \"\"" Dec 02 08:43:28 crc kubenswrapper[4895]: I1202 08:43:28.042193 4895 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5f400aae-225f-4349-ad0c-c4e2cfcaf833-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 02 08:43:28 crc kubenswrapper[4895]: I1202 08:43:28.042207 4895 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f400aae-225f-4349-ad0c-c4e2cfcaf833-config\") on node \"crc\" DevicePath \"\"" Dec 02 08:43:28 crc kubenswrapper[4895]: I1202 08:43:28.815893 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d7b5456f5-5nlvc" Dec 02 08:43:28 crc kubenswrapper[4895]: I1202 08:43:28.855898 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d7b5456f5-5nlvc"] Dec 02 08:43:28 crc kubenswrapper[4895]: I1202 08:43:28.863112 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5d7b5456f5-5nlvc"] Dec 02 08:43:29 crc kubenswrapper[4895]: I1202 08:43:29.151278 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f400aae-225f-4349-ad0c-c4e2cfcaf833" path="/var/lib/kubelet/pods/5f400aae-225f-4349-ad0c-c4e2cfcaf833/volumes" Dec 02 08:43:29 crc kubenswrapper[4895]: I1202 08:43:29.296034 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Dec 02 08:43:29 crc kubenswrapper[4895]: I1202 08:43:29.296100 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Dec 02 08:43:29 crc kubenswrapper[4895]: I1202 08:43:29.429944 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Dec 02 08:43:29 crc kubenswrapper[4895]: I1202 08:43:29.479533 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Dec 02 08:43:29 crc kubenswrapper[4895]: I1202 08:43:29.896107 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Dec 02 08:43:30 crc kubenswrapper[4895]: I1202 08:43:30.614186 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Dec 02 08:43:30 crc kubenswrapper[4895]: I1202 08:43:30.614222 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Dec 02 08:43:32 crc kubenswrapper[4895]: I1202 08:43:32.792297 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/openstack-cell1-galera-0" Dec 02 08:43:32 crc kubenswrapper[4895]: I1202 08:43:32.865187 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Dec 02 08:43:35 crc kubenswrapper[4895]: I1202 08:43:35.473864 4895 patch_prober.go:28] interesting pod/machine-config-daemon-wfcg7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 08:43:35 crc kubenswrapper[4895]: I1202 08:43:35.474241 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 08:43:35 crc kubenswrapper[4895]: I1202 08:43:35.474284 4895 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" Dec 02 08:43:35 crc kubenswrapper[4895]: I1202 08:43:35.475068 4895 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"32bf7d392743b71deb119b4fd3e6dd2e4aeb7c86e6abc8aa43066f6a5cc4af85"} pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 08:43:35 crc kubenswrapper[4895]: I1202 08:43:35.475125 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerName="machine-config-daemon" containerID="cri-o://32bf7d392743b71deb119b4fd3e6dd2e4aeb7c86e6abc8aa43066f6a5cc4af85" gracePeriod=600 
Dec 02 08:43:36 crc kubenswrapper[4895]: I1202 08:43:36.209335 4895 generic.go:334] "Generic (PLEG): container finished" podID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerID="32bf7d392743b71deb119b4fd3e6dd2e4aeb7c86e6abc8aa43066f6a5cc4af85" exitCode=0 Dec 02 08:43:36 crc kubenswrapper[4895]: I1202 08:43:36.209406 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" event={"ID":"0468d2d1-a975-45a6-af9f-47adc432fab0","Type":"ContainerDied","Data":"32bf7d392743b71deb119b4fd3e6dd2e4aeb7c86e6abc8aa43066f6a5cc4af85"} Dec 02 08:43:36 crc kubenswrapper[4895]: I1202 08:43:36.209882 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" event={"ID":"0468d2d1-a975-45a6-af9f-47adc432fab0","Type":"ContainerStarted","Data":"5f3bd2c356ea4faeb32e1c7dba83504ac381c8d421d130309e1326c5000e3875"} Dec 02 08:43:36 crc kubenswrapper[4895]: I1202 08:43:36.209916 4895 scope.go:117] "RemoveContainer" containerID="69a4bb5ee156ed278d803f64131ce7c25664438c59334774293c1c389adc3c0a" Dec 02 08:43:52 crc kubenswrapper[4895]: I1202 08:43:52.409934 4895 generic.go:334] "Generic (PLEG): container finished" podID="7665a2f5-45bb-4972-9631-456ed63a9da2" containerID="faeb2e140d036d7094647159d5e7ace7ba6dcf7aa1fd2e79dc0d6109ed84a54c" exitCode=0 Dec 02 08:43:52 crc kubenswrapper[4895]: I1202 08:43:52.410007 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"7665a2f5-45bb-4972-9631-456ed63a9da2","Type":"ContainerDied","Data":"faeb2e140d036d7094647159d5e7ace7ba6dcf7aa1fd2e79dc0d6109ed84a54c"} Dec 02 08:43:52 crc kubenswrapper[4895]: I1202 08:43:52.412265 4895 generic.go:334] "Generic (PLEG): container finished" podID="c2d3da5e-ef77-4e6c-8040-ef3000794d29" containerID="6674e75dfd35f2994edea8ee60804b1e48a883232d70c5c9855dfbf5610bfa25" exitCode=0 Dec 02 08:43:52 crc kubenswrapper[4895]: I1202 08:43:52.412299 4895 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c2d3da5e-ef77-4e6c-8040-ef3000794d29","Type":"ContainerDied","Data":"6674e75dfd35f2994edea8ee60804b1e48a883232d70c5c9855dfbf5610bfa25"} Dec 02 08:43:53 crc kubenswrapper[4895]: I1202 08:43:53.420603 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c2d3da5e-ef77-4e6c-8040-ef3000794d29","Type":"ContainerStarted","Data":"a061b50c3f6e797dcbe189fb9abbe41d639a768e32dadbd12a1902a012645b95"} Dec 02 08:43:53 crc kubenswrapper[4895]: I1202 08:43:53.421137 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:43:53 crc kubenswrapper[4895]: I1202 08:43:53.423220 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"7665a2f5-45bb-4972-9631-456ed63a9da2","Type":"ContainerStarted","Data":"c0c9677beed3d2c79a872bedf8489cea4ec46b9afa1fcc188b40cff9ebfebe2b"} Dec 02 08:43:53 crc kubenswrapper[4895]: I1202 08:43:53.424335 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Dec 02 08:43:53 crc kubenswrapper[4895]: I1202 08:43:53.457260 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.457243841 podStartE2EDuration="37.457243841s" podCreationTimestamp="2025-12-02 08:43:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:43:53.449874312 +0000 UTC m=+4844.620733945" watchObservedRunningTime="2025-12-02 08:43:53.457243841 +0000 UTC m=+4844.628103454" Dec 02 08:43:53 crc kubenswrapper[4895]: I1202 08:43:53.484535 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.484518389 podStartE2EDuration="37.484518389s" 
podCreationTimestamp="2025-12-02 08:43:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:43:53.480909037 +0000 UTC m=+4844.651768650" watchObservedRunningTime="2025-12-02 08:43:53.484518389 +0000 UTC m=+4844.655378002" Dec 02 08:44:07 crc kubenswrapper[4895]: I1202 08:44:07.535916 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Dec 02 08:44:07 crc kubenswrapper[4895]: I1202 08:44:07.938381 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:44:13 crc kubenswrapper[4895]: I1202 08:44:13.207814 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b7946d7b9-rffv6"] Dec 02 08:44:13 crc kubenswrapper[4895]: E1202 08:44:13.208821 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f400aae-225f-4349-ad0c-c4e2cfcaf833" containerName="init" Dec 02 08:44:13 crc kubenswrapper[4895]: I1202 08:44:13.208843 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f400aae-225f-4349-ad0c-c4e2cfcaf833" containerName="init" Dec 02 08:44:13 crc kubenswrapper[4895]: E1202 08:44:13.208870 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f400aae-225f-4349-ad0c-c4e2cfcaf833" containerName="dnsmasq-dns" Dec 02 08:44:13 crc kubenswrapper[4895]: I1202 08:44:13.208878 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f400aae-225f-4349-ad0c-c4e2cfcaf833" containerName="dnsmasq-dns" Dec 02 08:44:13 crc kubenswrapper[4895]: I1202 08:44:13.209103 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f400aae-225f-4349-ad0c-c4e2cfcaf833" containerName="dnsmasq-dns" Dec 02 08:44:13 crc kubenswrapper[4895]: I1202 08:44:13.210174 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b7946d7b9-rffv6" Dec 02 08:44:13 crc kubenswrapper[4895]: I1202 08:44:13.217026 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b7946d7b9-rffv6"] Dec 02 08:44:13 crc kubenswrapper[4895]: I1202 08:44:13.401569 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cq5hl\" (UniqueName: \"kubernetes.io/projected/b8201768-d804-4993-bf5a-8e81c0be77d0-kube-api-access-cq5hl\") pod \"dnsmasq-dns-5b7946d7b9-rffv6\" (UID: \"b8201768-d804-4993-bf5a-8e81c0be77d0\") " pod="openstack/dnsmasq-dns-5b7946d7b9-rffv6" Dec 02 08:44:13 crc kubenswrapper[4895]: I1202 08:44:13.401669 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8201768-d804-4993-bf5a-8e81c0be77d0-config\") pod \"dnsmasq-dns-5b7946d7b9-rffv6\" (UID: \"b8201768-d804-4993-bf5a-8e81c0be77d0\") " pod="openstack/dnsmasq-dns-5b7946d7b9-rffv6" Dec 02 08:44:13 crc kubenswrapper[4895]: I1202 08:44:13.401730 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b8201768-d804-4993-bf5a-8e81c0be77d0-dns-svc\") pod \"dnsmasq-dns-5b7946d7b9-rffv6\" (UID: \"b8201768-d804-4993-bf5a-8e81c0be77d0\") " pod="openstack/dnsmasq-dns-5b7946d7b9-rffv6" Dec 02 08:44:13 crc kubenswrapper[4895]: I1202 08:44:13.502848 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cq5hl\" (UniqueName: \"kubernetes.io/projected/b8201768-d804-4993-bf5a-8e81c0be77d0-kube-api-access-cq5hl\") pod \"dnsmasq-dns-5b7946d7b9-rffv6\" (UID: \"b8201768-d804-4993-bf5a-8e81c0be77d0\") " pod="openstack/dnsmasq-dns-5b7946d7b9-rffv6" Dec 02 08:44:13 crc kubenswrapper[4895]: I1202 08:44:13.502943 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/b8201768-d804-4993-bf5a-8e81c0be77d0-config\") pod \"dnsmasq-dns-5b7946d7b9-rffv6\" (UID: \"b8201768-d804-4993-bf5a-8e81c0be77d0\") " pod="openstack/dnsmasq-dns-5b7946d7b9-rffv6" Dec 02 08:44:13 crc kubenswrapper[4895]: I1202 08:44:13.502968 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b8201768-d804-4993-bf5a-8e81c0be77d0-dns-svc\") pod \"dnsmasq-dns-5b7946d7b9-rffv6\" (UID: \"b8201768-d804-4993-bf5a-8e81c0be77d0\") " pod="openstack/dnsmasq-dns-5b7946d7b9-rffv6" Dec 02 08:44:13 crc kubenswrapper[4895]: I1202 08:44:13.503978 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b8201768-d804-4993-bf5a-8e81c0be77d0-dns-svc\") pod \"dnsmasq-dns-5b7946d7b9-rffv6\" (UID: \"b8201768-d804-4993-bf5a-8e81c0be77d0\") " pod="openstack/dnsmasq-dns-5b7946d7b9-rffv6" Dec 02 08:44:13 crc kubenswrapper[4895]: I1202 08:44:13.503987 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8201768-d804-4993-bf5a-8e81c0be77d0-config\") pod \"dnsmasq-dns-5b7946d7b9-rffv6\" (UID: \"b8201768-d804-4993-bf5a-8e81c0be77d0\") " pod="openstack/dnsmasq-dns-5b7946d7b9-rffv6" Dec 02 08:44:13 crc kubenswrapper[4895]: I1202 08:44:13.521543 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cq5hl\" (UniqueName: \"kubernetes.io/projected/b8201768-d804-4993-bf5a-8e81c0be77d0-kube-api-access-cq5hl\") pod \"dnsmasq-dns-5b7946d7b9-rffv6\" (UID: \"b8201768-d804-4993-bf5a-8e81c0be77d0\") " pod="openstack/dnsmasq-dns-5b7946d7b9-rffv6" Dec 02 08:44:13 crc kubenswrapper[4895]: I1202 08:44:13.578553 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b7946d7b9-rffv6" Dec 02 08:44:13 crc kubenswrapper[4895]: I1202 08:44:13.878126 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b7946d7b9-rffv6"] Dec 02 08:44:13 crc kubenswrapper[4895]: I1202 08:44:13.975119 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 02 08:44:14 crc kubenswrapper[4895]: I1202 08:44:14.413148 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 02 08:44:14 crc kubenswrapper[4895]: I1202 08:44:14.597038 4895 generic.go:334] "Generic (PLEG): container finished" podID="b8201768-d804-4993-bf5a-8e81c0be77d0" containerID="deb8889c5c1dc400f16a10c7a3794c3c01a4819d62d6e9e35144caadf15c48d9" exitCode=0 Dec 02 08:44:14 crc kubenswrapper[4895]: I1202 08:44:14.597100 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b7946d7b9-rffv6" event={"ID":"b8201768-d804-4993-bf5a-8e81c0be77d0","Type":"ContainerDied","Data":"deb8889c5c1dc400f16a10c7a3794c3c01a4819d62d6e9e35144caadf15c48d9"} Dec 02 08:44:14 crc kubenswrapper[4895]: I1202 08:44:14.597158 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b7946d7b9-rffv6" event={"ID":"b8201768-d804-4993-bf5a-8e81c0be77d0","Type":"ContainerStarted","Data":"b750817bf6c9b95d40b25b4247570e66d947b40aa32e73c02fa9f96b5f62c1c7"} Dec 02 08:44:15 crc kubenswrapper[4895]: I1202 08:44:15.606462 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b7946d7b9-rffv6" event={"ID":"b8201768-d804-4993-bf5a-8e81c0be77d0","Type":"ContainerStarted","Data":"a6319ba5ee6d16baaba31e0ba7839f456a8ec4d3be0d0d1469a8822429b5b767"} Dec 02 08:44:15 crc kubenswrapper[4895]: I1202 08:44:15.606832 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5b7946d7b9-rffv6" Dec 02 08:44:15 crc kubenswrapper[4895]: I1202 08:44:15.628780 4895 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5b7946d7b9-rffv6" podStartSLOduration=2.6287625180000003 podStartE2EDuration="2.628762518s" podCreationTimestamp="2025-12-02 08:44:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:44:15.625108515 +0000 UTC m=+4866.795968128" watchObservedRunningTime="2025-12-02 08:44:15.628762518 +0000 UTC m=+4866.799622131" Dec 02 08:44:15 crc kubenswrapper[4895]: I1202 08:44:15.850332 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="7665a2f5-45bb-4972-9631-456ed63a9da2" containerName="rabbitmq" containerID="cri-o://c0c9677beed3d2c79a872bedf8489cea4ec46b9afa1fcc188b40cff9ebfebe2b" gracePeriod=604799 Dec 02 08:44:16 crc kubenswrapper[4895]: I1202 08:44:16.246020 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="c2d3da5e-ef77-4e6c-8040-ef3000794d29" containerName="rabbitmq" containerID="cri-o://a061b50c3f6e797dcbe189fb9abbe41d639a768e32dadbd12a1902a012645b95" gracePeriod=604799 Dec 02 08:44:17 crc kubenswrapper[4895]: I1202 08:44:17.534797 4895 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="7665a2f5-45bb-4972-9631-456ed63a9da2" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.237:5672: connect: connection refused" Dec 02 08:44:17 crc kubenswrapper[4895]: I1202 08:44:17.936714 4895 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="c2d3da5e-ef77-4e6c-8040-ef3000794d29" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.238:5672: connect: connection refused" Dec 02 08:44:22 crc kubenswrapper[4895]: I1202 08:44:22.447804 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 02 08:44:22 crc kubenswrapper[4895]: I1202 08:44:22.567787 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/7665a2f5-45bb-4972-9631-456ed63a9da2-pod-info\") pod \"7665a2f5-45bb-4972-9631-456ed63a9da2\" (UID: \"7665a2f5-45bb-4972-9631-456ed63a9da2\") " Dec 02 08:44:22 crc kubenswrapper[4895]: I1202 08:44:22.567844 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/7665a2f5-45bb-4972-9631-456ed63a9da2-rabbitmq-erlang-cookie\") pod \"7665a2f5-45bb-4972-9631-456ed63a9da2\" (UID: \"7665a2f5-45bb-4972-9631-456ed63a9da2\") " Dec 02 08:44:22 crc kubenswrapper[4895]: I1202 08:44:22.567874 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/7665a2f5-45bb-4972-9631-456ed63a9da2-erlang-cookie-secret\") pod \"7665a2f5-45bb-4972-9631-456ed63a9da2\" (UID: \"7665a2f5-45bb-4972-9631-456ed63a9da2\") " Dec 02 08:44:22 crc kubenswrapper[4895]: I1202 08:44:22.567923 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/7665a2f5-45bb-4972-9631-456ed63a9da2-plugins-conf\") pod \"7665a2f5-45bb-4972-9631-456ed63a9da2\" (UID: \"7665a2f5-45bb-4972-9631-456ed63a9da2\") " Dec 02 08:44:22 crc kubenswrapper[4895]: I1202 08:44:22.567983 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v9lfp\" (UniqueName: \"kubernetes.io/projected/7665a2f5-45bb-4972-9631-456ed63a9da2-kube-api-access-v9lfp\") pod \"7665a2f5-45bb-4972-9631-456ed63a9da2\" (UID: \"7665a2f5-45bb-4972-9631-456ed63a9da2\") " Dec 02 08:44:22 crc kubenswrapper[4895]: I1202 08:44:22.568001 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/7665a2f5-45bb-4972-9631-456ed63a9da2-rabbitmq-confd\") pod \"7665a2f5-45bb-4972-9631-456ed63a9da2\" (UID: \"7665a2f5-45bb-4972-9631-456ed63a9da2\") " Dec 02 08:44:22 crc kubenswrapper[4895]: I1202 08:44:22.568064 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/7665a2f5-45bb-4972-9631-456ed63a9da2-rabbitmq-plugins\") pod \"7665a2f5-45bb-4972-9631-456ed63a9da2\" (UID: \"7665a2f5-45bb-4972-9631-456ed63a9da2\") " Dec 02 08:44:22 crc kubenswrapper[4895]: I1202 08:44:22.568122 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/7665a2f5-45bb-4972-9631-456ed63a9da2-server-conf\") pod \"7665a2f5-45bb-4972-9631-456ed63a9da2\" (UID: \"7665a2f5-45bb-4972-9631-456ed63a9da2\") " Dec 02 08:44:22 crc kubenswrapper[4895]: I1202 08:44:22.568256 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7eb78b69-cce8-463f-8ce0-16375b1ef171\") pod \"7665a2f5-45bb-4972-9631-456ed63a9da2\" (UID: \"7665a2f5-45bb-4972-9631-456ed63a9da2\") " Dec 02 08:44:22 crc kubenswrapper[4895]: I1202 08:44:22.569144 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7665a2f5-45bb-4972-9631-456ed63a9da2-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "7665a2f5-45bb-4972-9631-456ed63a9da2" (UID: "7665a2f5-45bb-4972-9631-456ed63a9da2"). InnerVolumeSpecName "rabbitmq-erlang-cookie". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:44:22 crc kubenswrapper[4895]: I1202 08:44:22.569249 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7665a2f5-45bb-4972-9631-456ed63a9da2-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "7665a2f5-45bb-4972-9631-456ed63a9da2" (UID: "7665a2f5-45bb-4972-9631-456ed63a9da2"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:44:22 crc kubenswrapper[4895]: I1202 08:44:22.569515 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7665a2f5-45bb-4972-9631-456ed63a9da2-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "7665a2f5-45bb-4972-9631-456ed63a9da2" (UID: "7665a2f5-45bb-4972-9631-456ed63a9da2"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:44:22 crc kubenswrapper[4895]: I1202 08:44:22.575562 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7665a2f5-45bb-4972-9631-456ed63a9da2-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "7665a2f5-45bb-4972-9631-456ed63a9da2" (UID: "7665a2f5-45bb-4972-9631-456ed63a9da2"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:44:22 crc kubenswrapper[4895]: I1202 08:44:22.584715 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7eb78b69-cce8-463f-8ce0-16375b1ef171" (OuterVolumeSpecName: "persistence") pod "7665a2f5-45bb-4972-9631-456ed63a9da2" (UID: "7665a2f5-45bb-4972-9631-456ed63a9da2"). InnerVolumeSpecName "pvc-7eb78b69-cce8-463f-8ce0-16375b1ef171". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 02 08:44:22 crc kubenswrapper[4895]: I1202 08:44:22.588993 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/7665a2f5-45bb-4972-9631-456ed63a9da2-pod-info" (OuterVolumeSpecName: "pod-info") pod "7665a2f5-45bb-4972-9631-456ed63a9da2" (UID: "7665a2f5-45bb-4972-9631-456ed63a9da2"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 02 08:44:22 crc kubenswrapper[4895]: I1202 08:44:22.589368 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7665a2f5-45bb-4972-9631-456ed63a9da2-kube-api-access-v9lfp" (OuterVolumeSpecName: "kube-api-access-v9lfp") pod "7665a2f5-45bb-4972-9631-456ed63a9da2" (UID: "7665a2f5-45bb-4972-9631-456ed63a9da2"). InnerVolumeSpecName "kube-api-access-v9lfp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:44:22 crc kubenswrapper[4895]: I1202 08:44:22.599480 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7665a2f5-45bb-4972-9631-456ed63a9da2-server-conf" (OuterVolumeSpecName: "server-conf") pod "7665a2f5-45bb-4972-9631-456ed63a9da2" (UID: "7665a2f5-45bb-4972-9631-456ed63a9da2"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:44:22 crc kubenswrapper[4895]: I1202 08:44:22.678279 4895 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/7665a2f5-45bb-4972-9631-456ed63a9da2-server-conf\") on node \"crc\" DevicePath \"\"" Dec 02 08:44:22 crc kubenswrapper[4895]: I1202 08:44:22.680206 4895 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-7eb78b69-cce8-463f-8ce0-16375b1ef171\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7eb78b69-cce8-463f-8ce0-16375b1ef171\") on node \"crc\" " Dec 02 08:44:22 crc kubenswrapper[4895]: I1202 08:44:22.680230 4895 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/7665a2f5-45bb-4972-9631-456ed63a9da2-pod-info\") on node \"crc\" DevicePath \"\"" Dec 02 08:44:22 crc kubenswrapper[4895]: I1202 08:44:22.680303 4895 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/7665a2f5-45bb-4972-9631-456ed63a9da2-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 02 08:44:22 crc kubenswrapper[4895]: I1202 08:44:22.680324 4895 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/7665a2f5-45bb-4972-9631-456ed63a9da2-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 02 08:44:22 crc kubenswrapper[4895]: I1202 08:44:22.680336 4895 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/7665a2f5-45bb-4972-9631-456ed63a9da2-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 02 08:44:22 crc kubenswrapper[4895]: I1202 08:44:22.680455 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v9lfp\" (UniqueName: \"kubernetes.io/projected/7665a2f5-45bb-4972-9631-456ed63a9da2-kube-api-access-v9lfp\") on node \"crc\" DevicePath 
\"\"" Dec 02 08:44:22 crc kubenswrapper[4895]: I1202 08:44:22.680471 4895 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/7665a2f5-45bb-4972-9631-456ed63a9da2-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 02 08:44:22 crc kubenswrapper[4895]: I1202 08:44:22.693046 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7665a2f5-45bb-4972-9631-456ed63a9da2-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "7665a2f5-45bb-4972-9631-456ed63a9da2" (UID: "7665a2f5-45bb-4972-9631-456ed63a9da2"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:44:22 crc kubenswrapper[4895]: I1202 08:44:22.693825 4895 generic.go:334] "Generic (PLEG): container finished" podID="c2d3da5e-ef77-4e6c-8040-ef3000794d29" containerID="a061b50c3f6e797dcbe189fb9abbe41d639a768e32dadbd12a1902a012645b95" exitCode=0 Dec 02 08:44:22 crc kubenswrapper[4895]: I1202 08:44:22.693953 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c2d3da5e-ef77-4e6c-8040-ef3000794d29","Type":"ContainerDied","Data":"a061b50c3f6e797dcbe189fb9abbe41d639a768e32dadbd12a1902a012645b95"} Dec 02 08:44:22 crc kubenswrapper[4895]: I1202 08:44:22.697773 4895 generic.go:334] "Generic (PLEG): container finished" podID="7665a2f5-45bb-4972-9631-456ed63a9da2" containerID="c0c9677beed3d2c79a872bedf8489cea4ec46b9afa1fcc188b40cff9ebfebe2b" exitCode=0 Dec 02 08:44:22 crc kubenswrapper[4895]: I1202 08:44:22.697826 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"7665a2f5-45bb-4972-9631-456ed63a9da2","Type":"ContainerDied","Data":"c0c9677beed3d2c79a872bedf8489cea4ec46b9afa1fcc188b40cff9ebfebe2b"} Dec 02 08:44:22 crc kubenswrapper[4895]: I1202 08:44:22.697857 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"7665a2f5-45bb-4972-9631-456ed63a9da2","Type":"ContainerDied","Data":"32b4fff28c31289beee34985ce94905335cf956fd78d6baf03487b88c87187e8"} Dec 02 08:44:22 crc kubenswrapper[4895]: I1202 08:44:22.697879 4895 scope.go:117] "RemoveContainer" containerID="c0c9677beed3d2c79a872bedf8489cea4ec46b9afa1fcc188b40cff9ebfebe2b" Dec 02 08:44:22 crc kubenswrapper[4895]: I1202 08:44:22.697893 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 02 08:44:22 crc kubenswrapper[4895]: I1202 08:44:22.708084 4895 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Dec 02 08:44:22 crc kubenswrapper[4895]: I1202 08:44:22.708303 4895 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-7eb78b69-cce8-463f-8ce0-16375b1ef171" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7eb78b69-cce8-463f-8ce0-16375b1ef171") on node "crc" Dec 02 08:44:22 crc kubenswrapper[4895]: I1202 08:44:22.730956 4895 scope.go:117] "RemoveContainer" containerID="faeb2e140d036d7094647159d5e7ace7ba6dcf7aa1fd2e79dc0d6109ed84a54c" Dec 02 08:44:22 crc kubenswrapper[4895]: I1202 08:44:22.764245 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 02 08:44:22 crc kubenswrapper[4895]: I1202 08:44:22.771072 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 02 08:44:22 crc kubenswrapper[4895]: I1202 08:44:22.772960 4895 scope.go:117] "RemoveContainer" containerID="c0c9677beed3d2c79a872bedf8489cea4ec46b9afa1fcc188b40cff9ebfebe2b" Dec 02 08:44:22 crc kubenswrapper[4895]: E1202 08:44:22.776303 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0c9677beed3d2c79a872bedf8489cea4ec46b9afa1fcc188b40cff9ebfebe2b\": container with ID starting with 
c0c9677beed3d2c79a872bedf8489cea4ec46b9afa1fcc188b40cff9ebfebe2b not found: ID does not exist" containerID="c0c9677beed3d2c79a872bedf8489cea4ec46b9afa1fcc188b40cff9ebfebe2b" Dec 02 08:44:22 crc kubenswrapper[4895]: I1202 08:44:22.776335 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0c9677beed3d2c79a872bedf8489cea4ec46b9afa1fcc188b40cff9ebfebe2b"} err="failed to get container status \"c0c9677beed3d2c79a872bedf8489cea4ec46b9afa1fcc188b40cff9ebfebe2b\": rpc error: code = NotFound desc = could not find container \"c0c9677beed3d2c79a872bedf8489cea4ec46b9afa1fcc188b40cff9ebfebe2b\": container with ID starting with c0c9677beed3d2c79a872bedf8489cea4ec46b9afa1fcc188b40cff9ebfebe2b not found: ID does not exist" Dec 02 08:44:22 crc kubenswrapper[4895]: I1202 08:44:22.776358 4895 scope.go:117] "RemoveContainer" containerID="faeb2e140d036d7094647159d5e7ace7ba6dcf7aa1fd2e79dc0d6109ed84a54c" Dec 02 08:44:22 crc kubenswrapper[4895]: E1202 08:44:22.778924 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"faeb2e140d036d7094647159d5e7ace7ba6dcf7aa1fd2e79dc0d6109ed84a54c\": container with ID starting with faeb2e140d036d7094647159d5e7ace7ba6dcf7aa1fd2e79dc0d6109ed84a54c not found: ID does not exist" containerID="faeb2e140d036d7094647159d5e7ace7ba6dcf7aa1fd2e79dc0d6109ed84a54c" Dec 02 08:44:22 crc kubenswrapper[4895]: I1202 08:44:22.778982 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"faeb2e140d036d7094647159d5e7ace7ba6dcf7aa1fd2e79dc0d6109ed84a54c"} err="failed to get container status \"faeb2e140d036d7094647159d5e7ace7ba6dcf7aa1fd2e79dc0d6109ed84a54c\": rpc error: code = NotFound desc = could not find container \"faeb2e140d036d7094647159d5e7ace7ba6dcf7aa1fd2e79dc0d6109ed84a54c\": container with ID starting with faeb2e140d036d7094647159d5e7ace7ba6dcf7aa1fd2e79dc0d6109ed84a54c not found: ID does not 
exist" Dec 02 08:44:22 crc kubenswrapper[4895]: I1202 08:44:22.782538 4895 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/7665a2f5-45bb-4972-9631-456ed63a9da2-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 02 08:44:22 crc kubenswrapper[4895]: I1202 08:44:22.782594 4895 reconciler_common.go:293] "Volume detached for volume \"pvc-7eb78b69-cce8-463f-8ce0-16375b1ef171\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7eb78b69-cce8-463f-8ce0-16375b1ef171\") on node \"crc\" DevicePath \"\"" Dec 02 08:44:22 crc kubenswrapper[4895]: I1202 08:44:22.786652 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Dec 02 08:44:22 crc kubenswrapper[4895]: E1202 08:44:22.787106 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7665a2f5-45bb-4972-9631-456ed63a9da2" containerName="rabbitmq" Dec 02 08:44:22 crc kubenswrapper[4895]: I1202 08:44:22.787120 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="7665a2f5-45bb-4972-9631-456ed63a9da2" containerName="rabbitmq" Dec 02 08:44:22 crc kubenswrapper[4895]: E1202 08:44:22.787160 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7665a2f5-45bb-4972-9631-456ed63a9da2" containerName="setup-container" Dec 02 08:44:22 crc kubenswrapper[4895]: I1202 08:44:22.787167 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="7665a2f5-45bb-4972-9631-456ed63a9da2" containerName="setup-container" Dec 02 08:44:22 crc kubenswrapper[4895]: I1202 08:44:22.787432 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="7665a2f5-45bb-4972-9631-456ed63a9da2" containerName="rabbitmq" Dec 02 08:44:22 crc kubenswrapper[4895]: I1202 08:44:22.788685 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 02 08:44:22 crc kubenswrapper[4895]: I1202 08:44:22.806716 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Dec 02 08:44:22 crc kubenswrapper[4895]: I1202 08:44:22.807012 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Dec 02 08:44:22 crc kubenswrapper[4895]: I1202 08:44:22.807265 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-7cgr2" Dec 02 08:44:22 crc kubenswrapper[4895]: I1202 08:44:22.807381 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Dec 02 08:44:22 crc kubenswrapper[4895]: I1202 08:44:22.807962 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Dec 02 08:44:22 crc kubenswrapper[4895]: I1202 08:44:22.836364 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 02 08:44:22 crc kubenswrapper[4895]: I1202 08:44:22.984359 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1a551304-11d9-432c-bd8d-074239ed81c9-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"1a551304-11d9-432c-bd8d-074239ed81c9\") " pod="openstack/rabbitmq-server-0" Dec 02 08:44:22 crc kubenswrapper[4895]: I1202 08:44:22.984402 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1a551304-11d9-432c-bd8d-074239ed81c9-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"1a551304-11d9-432c-bd8d-074239ed81c9\") " pod="openstack/rabbitmq-server-0" Dec 02 08:44:22 crc kubenswrapper[4895]: I1202 08:44:22.984455 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"pvc-7eb78b69-cce8-463f-8ce0-16375b1ef171\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7eb78b69-cce8-463f-8ce0-16375b1ef171\") pod \"rabbitmq-server-0\" (UID: \"1a551304-11d9-432c-bd8d-074239ed81c9\") " pod="openstack/rabbitmq-server-0" Dec 02 08:44:22 crc kubenswrapper[4895]: I1202 08:44:22.984508 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1a551304-11d9-432c-bd8d-074239ed81c9-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"1a551304-11d9-432c-bd8d-074239ed81c9\") " pod="openstack/rabbitmq-server-0" Dec 02 08:44:22 crc kubenswrapper[4895]: I1202 08:44:22.984525 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1a551304-11d9-432c-bd8d-074239ed81c9-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"1a551304-11d9-432c-bd8d-074239ed81c9\") " pod="openstack/rabbitmq-server-0" Dec 02 08:44:22 crc kubenswrapper[4895]: I1202 08:44:22.984545 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1a551304-11d9-432c-bd8d-074239ed81c9-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"1a551304-11d9-432c-bd8d-074239ed81c9\") " pod="openstack/rabbitmq-server-0" Dec 02 08:44:22 crc kubenswrapper[4895]: I1202 08:44:22.984582 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pltr2\" (UniqueName: \"kubernetes.io/projected/1a551304-11d9-432c-bd8d-074239ed81c9-kube-api-access-pltr2\") pod \"rabbitmq-server-0\" (UID: \"1a551304-11d9-432c-bd8d-074239ed81c9\") " pod="openstack/rabbitmq-server-0" Dec 02 08:44:22 crc kubenswrapper[4895]: I1202 08:44:22.984596 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1a551304-11d9-432c-bd8d-074239ed81c9-server-conf\") pod \"rabbitmq-server-0\" (UID: \"1a551304-11d9-432c-bd8d-074239ed81c9\") " pod="openstack/rabbitmq-server-0" Dec 02 08:44:22 crc kubenswrapper[4895]: I1202 08:44:22.984617 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1a551304-11d9-432c-bd8d-074239ed81c9-pod-info\") pod \"rabbitmq-server-0\" (UID: \"1a551304-11d9-432c-bd8d-074239ed81c9\") " pod="openstack/rabbitmq-server-0" Dec 02 08:44:23 crc kubenswrapper[4895]: I1202 08:44:23.007663 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:44:23 crc kubenswrapper[4895]: I1202 08:44:23.085774 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1a551304-11d9-432c-bd8d-074239ed81c9-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"1a551304-11d9-432c-bd8d-074239ed81c9\") " pod="openstack/rabbitmq-server-0" Dec 02 08:44:23 crc kubenswrapper[4895]: I1202 08:44:23.085825 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1a551304-11d9-432c-bd8d-074239ed81c9-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"1a551304-11d9-432c-bd8d-074239ed81c9\") " pod="openstack/rabbitmq-server-0" Dec 02 08:44:23 crc kubenswrapper[4895]: I1202 08:44:23.085879 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-7eb78b69-cce8-463f-8ce0-16375b1ef171\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7eb78b69-cce8-463f-8ce0-16375b1ef171\") pod \"rabbitmq-server-0\" (UID: \"1a551304-11d9-432c-bd8d-074239ed81c9\") " pod="openstack/rabbitmq-server-0" Dec 02 08:44:23 crc kubenswrapper[4895]: 
I1202 08:44:23.085908 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1a551304-11d9-432c-bd8d-074239ed81c9-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"1a551304-11d9-432c-bd8d-074239ed81c9\") " pod="openstack/rabbitmq-server-0" Dec 02 08:44:23 crc kubenswrapper[4895]: I1202 08:44:23.085930 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1a551304-11d9-432c-bd8d-074239ed81c9-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"1a551304-11d9-432c-bd8d-074239ed81c9\") " pod="openstack/rabbitmq-server-0" Dec 02 08:44:23 crc kubenswrapper[4895]: I1202 08:44:23.085953 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1a551304-11d9-432c-bd8d-074239ed81c9-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"1a551304-11d9-432c-bd8d-074239ed81c9\") " pod="openstack/rabbitmq-server-0" Dec 02 08:44:23 crc kubenswrapper[4895]: I1202 08:44:23.085991 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pltr2\" (UniqueName: \"kubernetes.io/projected/1a551304-11d9-432c-bd8d-074239ed81c9-kube-api-access-pltr2\") pod \"rabbitmq-server-0\" (UID: \"1a551304-11d9-432c-bd8d-074239ed81c9\") " pod="openstack/rabbitmq-server-0" Dec 02 08:44:23 crc kubenswrapper[4895]: I1202 08:44:23.086008 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1a551304-11d9-432c-bd8d-074239ed81c9-server-conf\") pod \"rabbitmq-server-0\" (UID: \"1a551304-11d9-432c-bd8d-074239ed81c9\") " pod="openstack/rabbitmq-server-0" Dec 02 08:44:23 crc kubenswrapper[4895]: I1202 08:44:23.086033 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/1a551304-11d9-432c-bd8d-074239ed81c9-pod-info\") pod \"rabbitmq-server-0\" (UID: \"1a551304-11d9-432c-bd8d-074239ed81c9\") " pod="openstack/rabbitmq-server-0" Dec 02 08:44:23 crc kubenswrapper[4895]: I1202 08:44:23.086476 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1a551304-11d9-432c-bd8d-074239ed81c9-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"1a551304-11d9-432c-bd8d-074239ed81c9\") " pod="openstack/rabbitmq-server-0" Dec 02 08:44:23 crc kubenswrapper[4895]: I1202 08:44:23.086781 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1a551304-11d9-432c-bd8d-074239ed81c9-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"1a551304-11d9-432c-bd8d-074239ed81c9\") " pod="openstack/rabbitmq-server-0" Dec 02 08:44:23 crc kubenswrapper[4895]: I1202 08:44:23.087701 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1a551304-11d9-432c-bd8d-074239ed81c9-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"1a551304-11d9-432c-bd8d-074239ed81c9\") " pod="openstack/rabbitmq-server-0" Dec 02 08:44:23 crc kubenswrapper[4895]: I1202 08:44:23.091303 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1a551304-11d9-432c-bd8d-074239ed81c9-pod-info\") pod \"rabbitmq-server-0\" (UID: \"1a551304-11d9-432c-bd8d-074239ed81c9\") " pod="openstack/rabbitmq-server-0" Dec 02 08:44:23 crc kubenswrapper[4895]: I1202 08:44:23.091930 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1a551304-11d9-432c-bd8d-074239ed81c9-server-conf\") pod \"rabbitmq-server-0\" (UID: \"1a551304-11d9-432c-bd8d-074239ed81c9\") " pod="openstack/rabbitmq-server-0" Dec 02 08:44:23 
crc kubenswrapper[4895]: I1202 08:44:23.092141 4895 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 02 08:44:23 crc kubenswrapper[4895]: I1202 08:44:23.092169 4895 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-7eb78b69-cce8-463f-8ce0-16375b1ef171\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7eb78b69-cce8-463f-8ce0-16375b1ef171\") pod \"rabbitmq-server-0\" (UID: \"1a551304-11d9-432c-bd8d-074239ed81c9\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/bc43ae5c44dd4ed501f2535978d3baf80a461f20cdd11d3425cd6ff237251cb7/globalmount\"" pod="openstack/rabbitmq-server-0" Dec 02 08:44:23 crc kubenswrapper[4895]: I1202 08:44:23.109757 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1a551304-11d9-432c-bd8d-074239ed81c9-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"1a551304-11d9-432c-bd8d-074239ed81c9\") " pod="openstack/rabbitmq-server-0" Dec 02 08:44:23 crc kubenswrapper[4895]: I1202 08:44:23.112311 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pltr2\" (UniqueName: \"kubernetes.io/projected/1a551304-11d9-432c-bd8d-074239ed81c9-kube-api-access-pltr2\") pod \"rabbitmq-server-0\" (UID: \"1a551304-11d9-432c-bd8d-074239ed81c9\") " pod="openstack/rabbitmq-server-0" Dec 02 08:44:23 crc kubenswrapper[4895]: I1202 08:44:23.113160 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1a551304-11d9-432c-bd8d-074239ed81c9-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"1a551304-11d9-432c-bd8d-074239ed81c9\") " pod="openstack/rabbitmq-server-0" Dec 02 08:44:23 crc kubenswrapper[4895]: I1202 08:44:23.149493 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="7665a2f5-45bb-4972-9631-456ed63a9da2" path="/var/lib/kubelet/pods/7665a2f5-45bb-4972-9631-456ed63a9da2/volumes" Dec 02 08:44:23 crc kubenswrapper[4895]: I1202 08:44:23.154418 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-7eb78b69-cce8-463f-8ce0-16375b1ef171\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7eb78b69-cce8-463f-8ce0-16375b1ef171\") pod \"rabbitmq-server-0\" (UID: \"1a551304-11d9-432c-bd8d-074239ed81c9\") " pod="openstack/rabbitmq-server-0" Dec 02 08:44:23 crc kubenswrapper[4895]: I1202 08:44:23.165461 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 02 08:44:23 crc kubenswrapper[4895]: I1202 08:44:23.192054 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ckvx2\" (UniqueName: \"kubernetes.io/projected/c2d3da5e-ef77-4e6c-8040-ef3000794d29-kube-api-access-ckvx2\") pod \"c2d3da5e-ef77-4e6c-8040-ef3000794d29\" (UID: \"c2d3da5e-ef77-4e6c-8040-ef3000794d29\") " Dec 02 08:44:23 crc kubenswrapper[4895]: I1202 08:44:23.192176 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c2d3da5e-ef77-4e6c-8040-ef3000794d29-plugins-conf\") pod \"c2d3da5e-ef77-4e6c-8040-ef3000794d29\" (UID: \"c2d3da5e-ef77-4e6c-8040-ef3000794d29\") " Dec 02 08:44:23 crc kubenswrapper[4895]: I1202 08:44:23.192257 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c2d3da5e-ef77-4e6c-8040-ef3000794d29-rabbitmq-erlang-cookie\") pod \"c2d3da5e-ef77-4e6c-8040-ef3000794d29\" (UID: \"c2d3da5e-ef77-4e6c-8040-ef3000794d29\") " Dec 02 08:44:23 crc kubenswrapper[4895]: I1202 08:44:23.192323 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/c2d3da5e-ef77-4e6c-8040-ef3000794d29-erlang-cookie-secret\") pod \"c2d3da5e-ef77-4e6c-8040-ef3000794d29\" (UID: \"c2d3da5e-ef77-4e6c-8040-ef3000794d29\") " Dec 02 08:44:23 crc kubenswrapper[4895]: I1202 08:44:23.192372 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c2d3da5e-ef77-4e6c-8040-ef3000794d29-server-conf\") pod \"c2d3da5e-ef77-4e6c-8040-ef3000794d29\" (UID: \"c2d3da5e-ef77-4e6c-8040-ef3000794d29\") " Dec 02 08:44:23 crc kubenswrapper[4895]: I1202 08:44:23.192480 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-918701b9-c5a0-4ea9-8711-8ba343e88562\") pod \"c2d3da5e-ef77-4e6c-8040-ef3000794d29\" (UID: \"c2d3da5e-ef77-4e6c-8040-ef3000794d29\") " Dec 02 08:44:23 crc kubenswrapper[4895]: I1202 08:44:23.192527 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c2d3da5e-ef77-4e6c-8040-ef3000794d29-pod-info\") pod \"c2d3da5e-ef77-4e6c-8040-ef3000794d29\" (UID: \"c2d3da5e-ef77-4e6c-8040-ef3000794d29\") " Dec 02 08:44:23 crc kubenswrapper[4895]: I1202 08:44:23.192555 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c2d3da5e-ef77-4e6c-8040-ef3000794d29-rabbitmq-confd\") pod \"c2d3da5e-ef77-4e6c-8040-ef3000794d29\" (UID: \"c2d3da5e-ef77-4e6c-8040-ef3000794d29\") " Dec 02 08:44:23 crc kubenswrapper[4895]: I1202 08:44:23.192574 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c2d3da5e-ef77-4e6c-8040-ef3000794d29-rabbitmq-plugins\") pod \"c2d3da5e-ef77-4e6c-8040-ef3000794d29\" (UID: \"c2d3da5e-ef77-4e6c-8040-ef3000794d29\") " Dec 02 08:44:23 crc kubenswrapper[4895]: I1202 
08:44:23.192871 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2d3da5e-ef77-4e6c-8040-ef3000794d29-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "c2d3da5e-ef77-4e6c-8040-ef3000794d29" (UID: "c2d3da5e-ef77-4e6c-8040-ef3000794d29"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:44:23 crc kubenswrapper[4895]: I1202 08:44:23.192993 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2d3da5e-ef77-4e6c-8040-ef3000794d29-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "c2d3da5e-ef77-4e6c-8040-ef3000794d29" (UID: "c2d3da5e-ef77-4e6c-8040-ef3000794d29"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:44:23 crc kubenswrapper[4895]: I1202 08:44:23.193140 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2d3da5e-ef77-4e6c-8040-ef3000794d29-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "c2d3da5e-ef77-4e6c-8040-ef3000794d29" (UID: "c2d3da5e-ef77-4e6c-8040-ef3000794d29"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:44:23 crc kubenswrapper[4895]: I1202 08:44:23.193781 4895 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c2d3da5e-ef77-4e6c-8040-ef3000794d29-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 02 08:44:23 crc kubenswrapper[4895]: I1202 08:44:23.193800 4895 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c2d3da5e-ef77-4e6c-8040-ef3000794d29-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 02 08:44:23 crc kubenswrapper[4895]: I1202 08:44:23.193808 4895 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c2d3da5e-ef77-4e6c-8040-ef3000794d29-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 02 08:44:23 crc kubenswrapper[4895]: I1202 08:44:23.196988 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2d3da5e-ef77-4e6c-8040-ef3000794d29-kube-api-access-ckvx2" (OuterVolumeSpecName: "kube-api-access-ckvx2") pod "c2d3da5e-ef77-4e6c-8040-ef3000794d29" (UID: "c2d3da5e-ef77-4e6c-8040-ef3000794d29"). InnerVolumeSpecName "kube-api-access-ckvx2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:44:23 crc kubenswrapper[4895]: I1202 08:44:23.201704 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/c2d3da5e-ef77-4e6c-8040-ef3000794d29-pod-info" (OuterVolumeSpecName: "pod-info") pod "c2d3da5e-ef77-4e6c-8040-ef3000794d29" (UID: "c2d3da5e-ef77-4e6c-8040-ef3000794d29"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 02 08:44:23 crc kubenswrapper[4895]: I1202 08:44:23.201904 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2d3da5e-ef77-4e6c-8040-ef3000794d29-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "c2d3da5e-ef77-4e6c-8040-ef3000794d29" (UID: "c2d3da5e-ef77-4e6c-8040-ef3000794d29"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:44:23 crc kubenswrapper[4895]: I1202 08:44:23.215943 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-918701b9-c5a0-4ea9-8711-8ba343e88562" (OuterVolumeSpecName: "persistence") pod "c2d3da5e-ef77-4e6c-8040-ef3000794d29" (UID: "c2d3da5e-ef77-4e6c-8040-ef3000794d29"). InnerVolumeSpecName "pvc-918701b9-c5a0-4ea9-8711-8ba343e88562". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 02 08:44:23 crc kubenswrapper[4895]: I1202 08:44:23.238385 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2d3da5e-ef77-4e6c-8040-ef3000794d29-server-conf" (OuterVolumeSpecName: "server-conf") pod "c2d3da5e-ef77-4e6c-8040-ef3000794d29" (UID: "c2d3da5e-ef77-4e6c-8040-ef3000794d29"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:44:23 crc kubenswrapper[4895]: I1202 08:44:23.288462 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2d3da5e-ef77-4e6c-8040-ef3000794d29-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "c2d3da5e-ef77-4e6c-8040-ef3000794d29" (UID: "c2d3da5e-ef77-4e6c-8040-ef3000794d29"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:44:23 crc kubenswrapper[4895]: I1202 08:44:23.295700 4895 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c2d3da5e-ef77-4e6c-8040-ef3000794d29-pod-info\") on node \"crc\" DevicePath \"\"" Dec 02 08:44:23 crc kubenswrapper[4895]: I1202 08:44:23.295725 4895 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c2d3da5e-ef77-4e6c-8040-ef3000794d29-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 02 08:44:23 crc kubenswrapper[4895]: I1202 08:44:23.295736 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ckvx2\" (UniqueName: \"kubernetes.io/projected/c2d3da5e-ef77-4e6c-8040-ef3000794d29-kube-api-access-ckvx2\") on node \"crc\" DevicePath \"\"" Dec 02 08:44:23 crc kubenswrapper[4895]: I1202 08:44:23.295766 4895 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c2d3da5e-ef77-4e6c-8040-ef3000794d29-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 02 08:44:23 crc kubenswrapper[4895]: I1202 08:44:23.295775 4895 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c2d3da5e-ef77-4e6c-8040-ef3000794d29-server-conf\") on node \"crc\" DevicePath \"\"" Dec 02 08:44:23 crc kubenswrapper[4895]: I1202 08:44:23.295802 4895 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-918701b9-c5a0-4ea9-8711-8ba343e88562\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-918701b9-c5a0-4ea9-8711-8ba343e88562\") on node \"crc\" " Dec 02 08:44:23 crc kubenswrapper[4895]: I1202 08:44:23.315888 4895 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Dec 02 08:44:23 crc kubenswrapper[4895]: I1202 08:44:23.316200 4895 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-918701b9-c5a0-4ea9-8711-8ba343e88562" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-918701b9-c5a0-4ea9-8711-8ba343e88562") on node "crc" Dec 02 08:44:23 crc kubenswrapper[4895]: I1202 08:44:23.397147 4895 reconciler_common.go:293] "Volume detached for volume \"pvc-918701b9-c5a0-4ea9-8711-8ba343e88562\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-918701b9-c5a0-4ea9-8711-8ba343e88562\") on node \"crc\" DevicePath \"\"" Dec 02 08:44:23 crc kubenswrapper[4895]: I1202 08:44:23.403592 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 02 08:44:23 crc kubenswrapper[4895]: I1202 08:44:23.579972 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5b7946d7b9-rffv6" Dec 02 08:44:23 crc kubenswrapper[4895]: I1202 08:44:23.634852 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-98ddfc8f-vcggd"] Dec 02 08:44:23 crc kubenswrapper[4895]: I1202 08:44:23.635142 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-98ddfc8f-vcggd" podUID="06719ca1-7571-4a00-ab0e-cbbf89f793e3" containerName="dnsmasq-dns" containerID="cri-o://d005b72044bc1e2c07a10573c98b4fb475545b20a5b46e8e4aad77182f1aa04e" gracePeriod=10 Dec 02 08:44:23 crc kubenswrapper[4895]: I1202 08:44:23.708480 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c2d3da5e-ef77-4e6c-8040-ef3000794d29","Type":"ContainerDied","Data":"56990082aac0476a0e8d5ae7604014c1cfe2b5cd5ba762739a2ec9e517720507"} Dec 02 08:44:23 crc kubenswrapper[4895]: I1202 08:44:23.708562 4895 scope.go:117] "RemoveContainer" containerID="a061b50c3f6e797dcbe189fb9abbe41d639a768e32dadbd12a1902a012645b95" Dec 02 08:44:23 crc kubenswrapper[4895]: I1202 
08:44:23.708884 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:44:23 crc kubenswrapper[4895]: I1202 08:44:23.720119 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"1a551304-11d9-432c-bd8d-074239ed81c9","Type":"ContainerStarted","Data":"bf2e5a4e7d035c193acfe8c227aac56704ad9e79860b4c33651114309701b582"} Dec 02 08:44:23 crc kubenswrapper[4895]: I1202 08:44:23.740305 4895 scope.go:117] "RemoveContainer" containerID="6674e75dfd35f2994edea8ee60804b1e48a883232d70c5c9855dfbf5610bfa25" Dec 02 08:44:23 crc kubenswrapper[4895]: I1202 08:44:23.756102 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 02 08:44:23 crc kubenswrapper[4895]: I1202 08:44:23.774859 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 02 08:44:23 crc kubenswrapper[4895]: I1202 08:44:23.794394 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 02 08:44:23 crc kubenswrapper[4895]: E1202 08:44:23.794974 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2d3da5e-ef77-4e6c-8040-ef3000794d29" containerName="rabbitmq" Dec 02 08:44:23 crc kubenswrapper[4895]: I1202 08:44:23.795004 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2d3da5e-ef77-4e6c-8040-ef3000794d29" containerName="rabbitmq" Dec 02 08:44:23 crc kubenswrapper[4895]: E1202 08:44:23.795041 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2d3da5e-ef77-4e6c-8040-ef3000794d29" containerName="setup-container" Dec 02 08:44:23 crc kubenswrapper[4895]: I1202 08:44:23.795049 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2d3da5e-ef77-4e6c-8040-ef3000794d29" containerName="setup-container" Dec 02 08:44:23 crc kubenswrapper[4895]: I1202 08:44:23.795236 4895 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="c2d3da5e-ef77-4e6c-8040-ef3000794d29" containerName="rabbitmq" Dec 02 08:44:23 crc kubenswrapper[4895]: I1202 08:44:23.796598 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:44:23 crc kubenswrapper[4895]: I1202 08:44:23.802135 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-mhv2p" Dec 02 08:44:23 crc kubenswrapper[4895]: I1202 08:44:23.802354 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Dec 02 08:44:23 crc kubenswrapper[4895]: I1202 08:44:23.802644 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Dec 02 08:44:23 crc kubenswrapper[4895]: I1202 08:44:23.803003 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Dec 02 08:44:23 crc kubenswrapper[4895]: I1202 08:44:23.814692 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Dec 02 08:44:23 crc kubenswrapper[4895]: I1202 08:44:23.826856 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 02 08:44:23 crc kubenswrapper[4895]: I1202 08:44:23.909265 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-918701b9-c5a0-4ea9-8711-8ba343e88562\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-918701b9-c5a0-4ea9-8711-8ba343e88562\") pod \"rabbitmq-cell1-server-0\" (UID: \"4fc945ba-c10f-4460-a3ed-e075da154b6a\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:44:23 crc kubenswrapper[4895]: I1202 08:44:23.909356 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/4fc945ba-c10f-4460-a3ed-e075da154b6a-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"4fc945ba-c10f-4460-a3ed-e075da154b6a\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:44:23 crc kubenswrapper[4895]: I1202 08:44:23.909408 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4fc945ba-c10f-4460-a3ed-e075da154b6a-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"4fc945ba-c10f-4460-a3ed-e075da154b6a\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:44:23 crc kubenswrapper[4895]: I1202 08:44:23.909452 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4fc945ba-c10f-4460-a3ed-e075da154b6a-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"4fc945ba-c10f-4460-a3ed-e075da154b6a\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:44:23 crc kubenswrapper[4895]: I1202 08:44:23.909536 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4fc945ba-c10f-4460-a3ed-e075da154b6a-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"4fc945ba-c10f-4460-a3ed-e075da154b6a\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:44:23 crc kubenswrapper[4895]: I1202 08:44:23.909567 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4fc945ba-c10f-4460-a3ed-e075da154b6a-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"4fc945ba-c10f-4460-a3ed-e075da154b6a\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:44:23 crc kubenswrapper[4895]: I1202 08:44:23.909591 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-n4fzj\" (UniqueName: \"kubernetes.io/projected/4fc945ba-c10f-4460-a3ed-e075da154b6a-kube-api-access-n4fzj\") pod \"rabbitmq-cell1-server-0\" (UID: \"4fc945ba-c10f-4460-a3ed-e075da154b6a\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:44:23 crc kubenswrapper[4895]: I1202 08:44:23.909615 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4fc945ba-c10f-4460-a3ed-e075da154b6a-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4fc945ba-c10f-4460-a3ed-e075da154b6a\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:44:23 crc kubenswrapper[4895]: I1202 08:44:23.909694 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4fc945ba-c10f-4460-a3ed-e075da154b6a-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4fc945ba-c10f-4460-a3ed-e075da154b6a\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:44:24 crc kubenswrapper[4895]: I1202 08:44:24.010548 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4fc945ba-c10f-4460-a3ed-e075da154b6a-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"4fc945ba-c10f-4460-a3ed-e075da154b6a\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:44:24 crc kubenswrapper[4895]: I1202 08:44:24.010659 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4fc945ba-c10f-4460-a3ed-e075da154b6a-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"4fc945ba-c10f-4460-a3ed-e075da154b6a\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:44:24 crc kubenswrapper[4895]: I1202 08:44:24.010713 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/4fc945ba-c10f-4460-a3ed-e075da154b6a-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"4fc945ba-c10f-4460-a3ed-e075da154b6a\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:44:24 crc kubenswrapper[4895]: I1202 08:44:24.010762 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n4fzj\" (UniqueName: \"kubernetes.io/projected/4fc945ba-c10f-4460-a3ed-e075da154b6a-kube-api-access-n4fzj\") pod \"rabbitmq-cell1-server-0\" (UID: \"4fc945ba-c10f-4460-a3ed-e075da154b6a\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:44:24 crc kubenswrapper[4895]: I1202 08:44:24.010791 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4fc945ba-c10f-4460-a3ed-e075da154b6a-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4fc945ba-c10f-4460-a3ed-e075da154b6a\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:44:24 crc kubenswrapper[4895]: I1202 08:44:24.010825 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4fc945ba-c10f-4460-a3ed-e075da154b6a-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4fc945ba-c10f-4460-a3ed-e075da154b6a\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:44:24 crc kubenswrapper[4895]: I1202 08:44:24.010853 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-918701b9-c5a0-4ea9-8711-8ba343e88562\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-918701b9-c5a0-4ea9-8711-8ba343e88562\") pod \"rabbitmq-cell1-server-0\" (UID: \"4fc945ba-c10f-4460-a3ed-e075da154b6a\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:44:24 crc kubenswrapper[4895]: I1202 08:44:24.010893 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/4fc945ba-c10f-4460-a3ed-e075da154b6a-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"4fc945ba-c10f-4460-a3ed-e075da154b6a\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:44:24 crc kubenswrapper[4895]: I1202 08:44:24.010922 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4fc945ba-c10f-4460-a3ed-e075da154b6a-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"4fc945ba-c10f-4460-a3ed-e075da154b6a\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:44:24 crc kubenswrapper[4895]: I1202 08:44:24.011494 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4fc945ba-c10f-4460-a3ed-e075da154b6a-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"4fc945ba-c10f-4460-a3ed-e075da154b6a\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:44:24 crc kubenswrapper[4895]: I1202 08:44:24.011534 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4fc945ba-c10f-4460-a3ed-e075da154b6a-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"4fc945ba-c10f-4460-a3ed-e075da154b6a\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:44:24 crc kubenswrapper[4895]: I1202 08:44:24.015973 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4fc945ba-c10f-4460-a3ed-e075da154b6a-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4fc945ba-c10f-4460-a3ed-e075da154b6a\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:44:24 crc kubenswrapper[4895]: I1202 08:44:24.016335 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4fc945ba-c10f-4460-a3ed-e075da154b6a-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"4fc945ba-c10f-4460-a3ed-e075da154b6a\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:44:24 crc kubenswrapper[4895]: I1202 08:44:24.018487 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4fc945ba-c10f-4460-a3ed-e075da154b6a-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"4fc945ba-c10f-4460-a3ed-e075da154b6a\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:44:24 crc kubenswrapper[4895]: I1202 08:44:24.018531 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4fc945ba-c10f-4460-a3ed-e075da154b6a-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"4fc945ba-c10f-4460-a3ed-e075da154b6a\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:44:24 crc kubenswrapper[4895]: I1202 08:44:24.018969 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4fc945ba-c10f-4460-a3ed-e075da154b6a-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"4fc945ba-c10f-4460-a3ed-e075da154b6a\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:44:24 crc kubenswrapper[4895]: I1202 08:44:24.028570 4895 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 02 08:44:24 crc kubenswrapper[4895]: I1202 08:44:24.028626 4895 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-918701b9-c5a0-4ea9-8711-8ba343e88562\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-918701b9-c5a0-4ea9-8711-8ba343e88562\") pod \"rabbitmq-cell1-server-0\" (UID: \"4fc945ba-c10f-4460-a3ed-e075da154b6a\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/11bea3178402b74d8565de09d958d81983ae748ac7ef3168967b3521a42ba14e/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:44:24 crc kubenswrapper[4895]: I1202 08:44:24.033058 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4fzj\" (UniqueName: \"kubernetes.io/projected/4fc945ba-c10f-4460-a3ed-e075da154b6a-kube-api-access-n4fzj\") pod \"rabbitmq-cell1-server-0\" (UID: \"4fc945ba-c10f-4460-a3ed-e075da154b6a\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:44:24 crc kubenswrapper[4895]: I1202 08:44:24.063445 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-918701b9-c5a0-4ea9-8711-8ba343e88562\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-918701b9-c5a0-4ea9-8711-8ba343e88562\") pod \"rabbitmq-cell1-server-0\" (UID: \"4fc945ba-c10f-4460-a3ed-e075da154b6a\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:44:24 crc kubenswrapper[4895]: I1202 08:44:24.131636 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:44:24 crc kubenswrapper[4895]: I1202 08:44:24.376204 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-98ddfc8f-vcggd" Dec 02 08:44:24 crc kubenswrapper[4895]: I1202 08:44:24.518374 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/06719ca1-7571-4a00-ab0e-cbbf89f793e3-dns-svc\") pod \"06719ca1-7571-4a00-ab0e-cbbf89f793e3\" (UID: \"06719ca1-7571-4a00-ab0e-cbbf89f793e3\") " Dec 02 08:44:24 crc kubenswrapper[4895]: I1202 08:44:24.518486 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7bvt\" (UniqueName: \"kubernetes.io/projected/06719ca1-7571-4a00-ab0e-cbbf89f793e3-kube-api-access-x7bvt\") pod \"06719ca1-7571-4a00-ab0e-cbbf89f793e3\" (UID: \"06719ca1-7571-4a00-ab0e-cbbf89f793e3\") " Dec 02 08:44:24 crc kubenswrapper[4895]: I1202 08:44:24.518777 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06719ca1-7571-4a00-ab0e-cbbf89f793e3-config\") pod \"06719ca1-7571-4a00-ab0e-cbbf89f793e3\" (UID: \"06719ca1-7571-4a00-ab0e-cbbf89f793e3\") " Dec 02 08:44:24 crc kubenswrapper[4895]: I1202 08:44:24.548173 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06719ca1-7571-4a00-ab0e-cbbf89f793e3-kube-api-access-x7bvt" (OuterVolumeSpecName: "kube-api-access-x7bvt") pod "06719ca1-7571-4a00-ab0e-cbbf89f793e3" (UID: "06719ca1-7571-4a00-ab0e-cbbf89f793e3"). InnerVolumeSpecName "kube-api-access-x7bvt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:44:24 crc kubenswrapper[4895]: I1202 08:44:24.576777 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06719ca1-7571-4a00-ab0e-cbbf89f793e3-config" (OuterVolumeSpecName: "config") pod "06719ca1-7571-4a00-ab0e-cbbf89f793e3" (UID: "06719ca1-7571-4a00-ab0e-cbbf89f793e3"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:44:24 crc kubenswrapper[4895]: I1202 08:44:24.577200 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06719ca1-7571-4a00-ab0e-cbbf89f793e3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "06719ca1-7571-4a00-ab0e-cbbf89f793e3" (UID: "06719ca1-7571-4a00-ab0e-cbbf89f793e3"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:44:24 crc kubenswrapper[4895]: I1202 08:44:24.620813 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7bvt\" (UniqueName: \"kubernetes.io/projected/06719ca1-7571-4a00-ab0e-cbbf89f793e3-kube-api-access-x7bvt\") on node \"crc\" DevicePath \"\"" Dec 02 08:44:24 crc kubenswrapper[4895]: I1202 08:44:24.620858 4895 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06719ca1-7571-4a00-ab0e-cbbf89f793e3-config\") on node \"crc\" DevicePath \"\"" Dec 02 08:44:24 crc kubenswrapper[4895]: I1202 08:44:24.620867 4895 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/06719ca1-7571-4a00-ab0e-cbbf89f793e3-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 02 08:44:24 crc kubenswrapper[4895]: I1202 08:44:24.641244 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 02 08:44:24 crc kubenswrapper[4895]: I1202 08:44:24.736437 4895 generic.go:334] "Generic (PLEG): container finished" podID="06719ca1-7571-4a00-ab0e-cbbf89f793e3" containerID="d005b72044bc1e2c07a10573c98b4fb475545b20a5b46e8e4aad77182f1aa04e" exitCode=0 Dec 02 08:44:24 crc kubenswrapper[4895]: I1202 08:44:24.736546 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-98ddfc8f-vcggd" Dec 02 08:44:24 crc kubenswrapper[4895]: I1202 08:44:24.736546 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-98ddfc8f-vcggd" event={"ID":"06719ca1-7571-4a00-ab0e-cbbf89f793e3","Type":"ContainerDied","Data":"d005b72044bc1e2c07a10573c98b4fb475545b20a5b46e8e4aad77182f1aa04e"} Dec 02 08:44:24 crc kubenswrapper[4895]: I1202 08:44:24.737037 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-98ddfc8f-vcggd" event={"ID":"06719ca1-7571-4a00-ab0e-cbbf89f793e3","Type":"ContainerDied","Data":"f4415a5b8ca2530ba4ae63b5e89e9a89b23307bc59627fef7c3fab810dc47355"} Dec 02 08:44:24 crc kubenswrapper[4895]: I1202 08:44:24.737065 4895 scope.go:117] "RemoveContainer" containerID="d005b72044bc1e2c07a10573c98b4fb475545b20a5b46e8e4aad77182f1aa04e" Dec 02 08:44:24 crc kubenswrapper[4895]: I1202 08:44:24.739153 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"4fc945ba-c10f-4460-a3ed-e075da154b6a","Type":"ContainerStarted","Data":"95fb69e3dd8748c09e111047ba58bba0cd4eaa491ca1dcfb47e3c12f513798a7"} Dec 02 08:44:24 crc kubenswrapper[4895]: I1202 08:44:24.765479 4895 scope.go:117] "RemoveContainer" containerID="df69ebf56b1ca681b69bc5cd2cc94bb3f28527ad72d69a26f1285ef569fc00e7" Dec 02 08:44:24 crc kubenswrapper[4895]: I1202 08:44:24.796730 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-98ddfc8f-vcggd"] Dec 02 08:44:24 crc kubenswrapper[4895]: I1202 08:44:24.802695 4895 scope.go:117] "RemoveContainer" containerID="d005b72044bc1e2c07a10573c98b4fb475545b20a5b46e8e4aad77182f1aa04e" Dec 02 08:44:24 crc kubenswrapper[4895]: I1202 08:44:24.803200 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-98ddfc8f-vcggd"] Dec 02 08:44:24 crc kubenswrapper[4895]: E1202 08:44:24.804264 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: 
code = NotFound desc = could not find container \"d005b72044bc1e2c07a10573c98b4fb475545b20a5b46e8e4aad77182f1aa04e\": container with ID starting with d005b72044bc1e2c07a10573c98b4fb475545b20a5b46e8e4aad77182f1aa04e not found: ID does not exist" containerID="d005b72044bc1e2c07a10573c98b4fb475545b20a5b46e8e4aad77182f1aa04e" Dec 02 08:44:24 crc kubenswrapper[4895]: I1202 08:44:24.804306 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d005b72044bc1e2c07a10573c98b4fb475545b20a5b46e8e4aad77182f1aa04e"} err="failed to get container status \"d005b72044bc1e2c07a10573c98b4fb475545b20a5b46e8e4aad77182f1aa04e\": rpc error: code = NotFound desc = could not find container \"d005b72044bc1e2c07a10573c98b4fb475545b20a5b46e8e4aad77182f1aa04e\": container with ID starting with d005b72044bc1e2c07a10573c98b4fb475545b20a5b46e8e4aad77182f1aa04e not found: ID does not exist" Dec 02 08:44:24 crc kubenswrapper[4895]: I1202 08:44:24.804343 4895 scope.go:117] "RemoveContainer" containerID="df69ebf56b1ca681b69bc5cd2cc94bb3f28527ad72d69a26f1285ef569fc00e7" Dec 02 08:44:24 crc kubenswrapper[4895]: E1202 08:44:24.805067 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df69ebf56b1ca681b69bc5cd2cc94bb3f28527ad72d69a26f1285ef569fc00e7\": container with ID starting with df69ebf56b1ca681b69bc5cd2cc94bb3f28527ad72d69a26f1285ef569fc00e7 not found: ID does not exist" containerID="df69ebf56b1ca681b69bc5cd2cc94bb3f28527ad72d69a26f1285ef569fc00e7" Dec 02 08:44:24 crc kubenswrapper[4895]: I1202 08:44:24.805128 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df69ebf56b1ca681b69bc5cd2cc94bb3f28527ad72d69a26f1285ef569fc00e7"} err="failed to get container status \"df69ebf56b1ca681b69bc5cd2cc94bb3f28527ad72d69a26f1285ef569fc00e7\": rpc error: code = NotFound desc = could not find container 
\"df69ebf56b1ca681b69bc5cd2cc94bb3f28527ad72d69a26f1285ef569fc00e7\": container with ID starting with df69ebf56b1ca681b69bc5cd2cc94bb3f28527ad72d69a26f1285ef569fc00e7 not found: ID does not exist" Dec 02 08:44:25 crc kubenswrapper[4895]: I1202 08:44:25.153141 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06719ca1-7571-4a00-ab0e-cbbf89f793e3" path="/var/lib/kubelet/pods/06719ca1-7571-4a00-ab0e-cbbf89f793e3/volumes" Dec 02 08:44:25 crc kubenswrapper[4895]: I1202 08:44:25.153927 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2d3da5e-ef77-4e6c-8040-ef3000794d29" path="/var/lib/kubelet/pods/c2d3da5e-ef77-4e6c-8040-ef3000794d29/volumes" Dec 02 08:44:25 crc kubenswrapper[4895]: I1202 08:44:25.783094 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"1a551304-11d9-432c-bd8d-074239ed81c9","Type":"ContainerStarted","Data":"26a937a892a3387319a9f500dda03008bdb9f31bd5ca232d7b4207e26662092b"} Dec 02 08:44:26 crc kubenswrapper[4895]: I1202 08:44:26.792299 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"4fc945ba-c10f-4460-a3ed-e075da154b6a","Type":"ContainerStarted","Data":"16433bf2385e84d5df65a03bda56159546e7f6520ce2139d510766ddf897be16"} Dec 02 08:44:59 crc kubenswrapper[4895]: I1202 08:44:59.090902 4895 generic.go:334] "Generic (PLEG): container finished" podID="4fc945ba-c10f-4460-a3ed-e075da154b6a" containerID="16433bf2385e84d5df65a03bda56159546e7f6520ce2139d510766ddf897be16" exitCode=0 Dec 02 08:44:59 crc kubenswrapper[4895]: I1202 08:44:59.090997 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"4fc945ba-c10f-4460-a3ed-e075da154b6a","Type":"ContainerDied","Data":"16433bf2385e84d5df65a03bda56159546e7f6520ce2139d510766ddf897be16"} Dec 02 08:44:59 crc kubenswrapper[4895]: I1202 08:44:59.093770 4895 generic.go:334] "Generic (PLEG): container finished" 
podID="1a551304-11d9-432c-bd8d-074239ed81c9" containerID="26a937a892a3387319a9f500dda03008bdb9f31bd5ca232d7b4207e26662092b" exitCode=0 Dec 02 08:44:59 crc kubenswrapper[4895]: I1202 08:44:59.093862 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"1a551304-11d9-432c-bd8d-074239ed81c9","Type":"ContainerDied","Data":"26a937a892a3387319a9f500dda03008bdb9f31bd5ca232d7b4207e26662092b"} Dec 02 08:45:00 crc kubenswrapper[4895]: I1202 08:45:00.104525 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"4fc945ba-c10f-4460-a3ed-e075da154b6a","Type":"ContainerStarted","Data":"69203c325fb5d1f3d33e49a2e852d10d64b7cd09e973779a9a3dcb745a7f0656"} Dec 02 08:45:00 crc kubenswrapper[4895]: I1202 08:45:00.104825 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:45:00 crc kubenswrapper[4895]: I1202 08:45:00.108669 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"1a551304-11d9-432c-bd8d-074239ed81c9","Type":"ContainerStarted","Data":"2e231212ee997ace16e093ec2b68ae36068c2ebf55b86fa3e028dee17d8f6177"} Dec 02 08:45:00 crc kubenswrapper[4895]: I1202 08:45:00.109262 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Dec 02 08:45:00 crc kubenswrapper[4895]: I1202 08:45:00.180879 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.180856336 podStartE2EDuration="37.180856336s" podCreationTimestamp="2025-12-02 08:44:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:45:00.1735772 +0000 UTC m=+4911.344436833" watchObservedRunningTime="2025-12-02 08:45:00.180856336 +0000 UTC m=+4911.351715969" Dec 02 08:45:00 crc kubenswrapper[4895]: I1202 
08:45:00.216676 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411085-v5dfx"] Dec 02 08:45:00 crc kubenswrapper[4895]: E1202 08:45:00.218335 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06719ca1-7571-4a00-ab0e-cbbf89f793e3" containerName="dnsmasq-dns" Dec 02 08:45:00 crc kubenswrapper[4895]: I1202 08:45:00.218427 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="06719ca1-7571-4a00-ab0e-cbbf89f793e3" containerName="dnsmasq-dns" Dec 02 08:45:00 crc kubenswrapper[4895]: E1202 08:45:00.218555 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06719ca1-7571-4a00-ab0e-cbbf89f793e3" containerName="init" Dec 02 08:45:00 crc kubenswrapper[4895]: I1202 08:45:00.218625 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="06719ca1-7571-4a00-ab0e-cbbf89f793e3" containerName="init" Dec 02 08:45:00 crc kubenswrapper[4895]: I1202 08:45:00.219045 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="06719ca1-7571-4a00-ab0e-cbbf89f793e3" containerName="dnsmasq-dns" Dec 02 08:45:00 crc kubenswrapper[4895]: I1202 08:45:00.220593 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411085-v5dfx" Dec 02 08:45:00 crc kubenswrapper[4895]: I1202 08:45:00.228481 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 02 08:45:00 crc kubenswrapper[4895]: I1202 08:45:00.230140 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 02 08:45:00 crc kubenswrapper[4895]: I1202 08:45:00.267106 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411085-v5dfx"] Dec 02 08:45:00 crc kubenswrapper[4895]: I1202 08:45:00.280968 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=38.280941339 podStartE2EDuration="38.280941339s" podCreationTimestamp="2025-12-02 08:44:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:45:00.229487288 +0000 UTC m=+4911.400346911" watchObservedRunningTime="2025-12-02 08:45:00.280941339 +0000 UTC m=+4911.451800972" Dec 02 08:45:00 crc kubenswrapper[4895]: I1202 08:45:00.459726 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpkdq\" (UniqueName: \"kubernetes.io/projected/6858333d-2201-4f94-a119-c92c9dbf7cce-kube-api-access-fpkdq\") pod \"collect-profiles-29411085-v5dfx\" (UID: \"6858333d-2201-4f94-a119-c92c9dbf7cce\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411085-v5dfx" Dec 02 08:45:00 crc kubenswrapper[4895]: I1202 08:45:00.459980 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6858333d-2201-4f94-a119-c92c9dbf7cce-config-volume\") pod 
\"collect-profiles-29411085-v5dfx\" (UID: \"6858333d-2201-4f94-a119-c92c9dbf7cce\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411085-v5dfx" Dec 02 08:45:00 crc kubenswrapper[4895]: I1202 08:45:00.460109 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6858333d-2201-4f94-a119-c92c9dbf7cce-secret-volume\") pod \"collect-profiles-29411085-v5dfx\" (UID: \"6858333d-2201-4f94-a119-c92c9dbf7cce\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411085-v5dfx" Dec 02 08:45:00 crc kubenswrapper[4895]: I1202 08:45:00.561504 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fpkdq\" (UniqueName: \"kubernetes.io/projected/6858333d-2201-4f94-a119-c92c9dbf7cce-kube-api-access-fpkdq\") pod \"collect-profiles-29411085-v5dfx\" (UID: \"6858333d-2201-4f94-a119-c92c9dbf7cce\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411085-v5dfx" Dec 02 08:45:00 crc kubenswrapper[4895]: I1202 08:45:00.561593 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6858333d-2201-4f94-a119-c92c9dbf7cce-config-volume\") pod \"collect-profiles-29411085-v5dfx\" (UID: \"6858333d-2201-4f94-a119-c92c9dbf7cce\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411085-v5dfx" Dec 02 08:45:00 crc kubenswrapper[4895]: I1202 08:45:00.561615 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6858333d-2201-4f94-a119-c92c9dbf7cce-secret-volume\") pod \"collect-profiles-29411085-v5dfx\" (UID: \"6858333d-2201-4f94-a119-c92c9dbf7cce\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411085-v5dfx" Dec 02 08:45:00 crc kubenswrapper[4895]: I1202 08:45:00.562861 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6858333d-2201-4f94-a119-c92c9dbf7cce-config-volume\") pod \"collect-profiles-29411085-v5dfx\" (UID: \"6858333d-2201-4f94-a119-c92c9dbf7cce\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411085-v5dfx" Dec 02 08:45:00 crc kubenswrapper[4895]: I1202 08:45:00.566730 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6858333d-2201-4f94-a119-c92c9dbf7cce-secret-volume\") pod \"collect-profiles-29411085-v5dfx\" (UID: \"6858333d-2201-4f94-a119-c92c9dbf7cce\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411085-v5dfx" Dec 02 08:45:00 crc kubenswrapper[4895]: I1202 08:45:00.583966 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpkdq\" (UniqueName: \"kubernetes.io/projected/6858333d-2201-4f94-a119-c92c9dbf7cce-kube-api-access-fpkdq\") pod \"collect-profiles-29411085-v5dfx\" (UID: \"6858333d-2201-4f94-a119-c92c9dbf7cce\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411085-v5dfx" Dec 02 08:45:00 crc kubenswrapper[4895]: I1202 08:45:00.873068 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411085-v5dfx" Dec 02 08:45:01 crc kubenswrapper[4895]: I1202 08:45:01.370422 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411085-v5dfx"] Dec 02 08:45:02 crc kubenswrapper[4895]: I1202 08:45:02.124187 4895 generic.go:334] "Generic (PLEG): container finished" podID="6858333d-2201-4f94-a119-c92c9dbf7cce" containerID="cd9c72188aebc52ab1dfe2e9f41694f764e98994862c2ae178b4f1b0280854fb" exitCode=0 Dec 02 08:45:02 crc kubenswrapper[4895]: I1202 08:45:02.124299 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411085-v5dfx" event={"ID":"6858333d-2201-4f94-a119-c92c9dbf7cce","Type":"ContainerDied","Data":"cd9c72188aebc52ab1dfe2e9f41694f764e98994862c2ae178b4f1b0280854fb"} Dec 02 08:45:02 crc kubenswrapper[4895]: I1202 08:45:02.124496 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411085-v5dfx" event={"ID":"6858333d-2201-4f94-a119-c92c9dbf7cce","Type":"ContainerStarted","Data":"0a391939274245d9b96d9f323473273b7c2d4fc3c0229dcc08576aa76bfd4d07"} Dec 02 08:45:03 crc kubenswrapper[4895]: I1202 08:45:03.437112 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411085-v5dfx" Dec 02 08:45:03 crc kubenswrapper[4895]: I1202 08:45:03.509093 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6858333d-2201-4f94-a119-c92c9dbf7cce-secret-volume\") pod \"6858333d-2201-4f94-a119-c92c9dbf7cce\" (UID: \"6858333d-2201-4f94-a119-c92c9dbf7cce\") " Dec 02 08:45:03 crc kubenswrapper[4895]: I1202 08:45:03.509349 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6858333d-2201-4f94-a119-c92c9dbf7cce-config-volume\") pod \"6858333d-2201-4f94-a119-c92c9dbf7cce\" (UID: \"6858333d-2201-4f94-a119-c92c9dbf7cce\") " Dec 02 08:45:03 crc kubenswrapper[4895]: I1202 08:45:03.509406 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fpkdq\" (UniqueName: \"kubernetes.io/projected/6858333d-2201-4f94-a119-c92c9dbf7cce-kube-api-access-fpkdq\") pod \"6858333d-2201-4f94-a119-c92c9dbf7cce\" (UID: \"6858333d-2201-4f94-a119-c92c9dbf7cce\") " Dec 02 08:45:03 crc kubenswrapper[4895]: I1202 08:45:03.510018 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6858333d-2201-4f94-a119-c92c9dbf7cce-config-volume" (OuterVolumeSpecName: "config-volume") pod "6858333d-2201-4f94-a119-c92c9dbf7cce" (UID: "6858333d-2201-4f94-a119-c92c9dbf7cce"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:45:03 crc kubenswrapper[4895]: I1202 08:45:03.526964 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6858333d-2201-4f94-a119-c92c9dbf7cce-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "6858333d-2201-4f94-a119-c92c9dbf7cce" (UID: "6858333d-2201-4f94-a119-c92c9dbf7cce"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:45:03 crc kubenswrapper[4895]: I1202 08:45:03.527025 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6858333d-2201-4f94-a119-c92c9dbf7cce-kube-api-access-fpkdq" (OuterVolumeSpecName: "kube-api-access-fpkdq") pod "6858333d-2201-4f94-a119-c92c9dbf7cce" (UID: "6858333d-2201-4f94-a119-c92c9dbf7cce"). InnerVolumeSpecName "kube-api-access-fpkdq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:45:03 crc kubenswrapper[4895]: I1202 08:45:03.611064 4895 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6858333d-2201-4f94-a119-c92c9dbf7cce-config-volume\") on node \"crc\" DevicePath \"\"" Dec 02 08:45:03 crc kubenswrapper[4895]: I1202 08:45:03.611109 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fpkdq\" (UniqueName: \"kubernetes.io/projected/6858333d-2201-4f94-a119-c92c9dbf7cce-kube-api-access-fpkdq\") on node \"crc\" DevicePath \"\"" Dec 02 08:45:03 crc kubenswrapper[4895]: I1202 08:45:03.611121 4895 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6858333d-2201-4f94-a119-c92c9dbf7cce-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 02 08:45:04 crc kubenswrapper[4895]: I1202 08:45:04.143184 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411085-v5dfx" event={"ID":"6858333d-2201-4f94-a119-c92c9dbf7cce","Type":"ContainerDied","Data":"0a391939274245d9b96d9f323473273b7c2d4fc3c0229dcc08576aa76bfd4d07"} Dec 02 08:45:04 crc kubenswrapper[4895]: I1202 08:45:04.143493 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0a391939274245d9b96d9f323473273b7c2d4fc3c0229dcc08576aa76bfd4d07" Dec 02 08:45:04 crc kubenswrapper[4895]: I1202 08:45:04.143258 4895 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411085-v5dfx" Dec 02 08:45:04 crc kubenswrapper[4895]: I1202 08:45:04.516373 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411040-qskgv"] Dec 02 08:45:04 crc kubenswrapper[4895]: I1202 08:45:04.522306 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411040-qskgv"] Dec 02 08:45:05 crc kubenswrapper[4895]: I1202 08:45:05.150422 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5fb87c2-8e73-495e-afdf-a2886910d986" path="/var/lib/kubelet/pods/c5fb87c2-8e73-495e-afdf-a2886910d986/volumes" Dec 02 08:45:13 crc kubenswrapper[4895]: I1202 08:45:13.168893 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Dec 02 08:45:14 crc kubenswrapper[4895]: I1202 08:45:14.135796 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:45:24 crc kubenswrapper[4895]: I1202 08:45:24.922719 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-1-default"] Dec 02 08:45:24 crc kubenswrapper[4895]: E1202 08:45:24.923670 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6858333d-2201-4f94-a119-c92c9dbf7cce" containerName="collect-profiles" Dec 02 08:45:24 crc kubenswrapper[4895]: I1202 08:45:24.923687 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="6858333d-2201-4f94-a119-c92c9dbf7cce" containerName="collect-profiles" Dec 02 08:45:24 crc kubenswrapper[4895]: I1202 08:45:24.923914 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="6858333d-2201-4f94-a119-c92c9dbf7cce" containerName="collect-profiles" Dec 02 08:45:24 crc kubenswrapper[4895]: I1202 08:45:24.924571 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-1-default" Dec 02 08:45:24 crc kubenswrapper[4895]: I1202 08:45:24.926855 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-djw6d" Dec 02 08:45:24 crc kubenswrapper[4895]: I1202 08:45:24.928664 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-1-default"] Dec 02 08:45:25 crc kubenswrapper[4895]: I1202 08:45:25.055714 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7sx7\" (UniqueName: \"kubernetes.io/projected/592917fd-3d88-4847-af54-c09b7ff35190-kube-api-access-g7sx7\") pod \"mariadb-client-1-default\" (UID: \"592917fd-3d88-4847-af54-c09b7ff35190\") " pod="openstack/mariadb-client-1-default" Dec 02 08:45:25 crc kubenswrapper[4895]: I1202 08:45:25.157521 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7sx7\" (UniqueName: \"kubernetes.io/projected/592917fd-3d88-4847-af54-c09b7ff35190-kube-api-access-g7sx7\") pod \"mariadb-client-1-default\" (UID: \"592917fd-3d88-4847-af54-c09b7ff35190\") " pod="openstack/mariadb-client-1-default" Dec 02 08:45:25 crc kubenswrapper[4895]: I1202 08:45:25.182654 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7sx7\" (UniqueName: \"kubernetes.io/projected/592917fd-3d88-4847-af54-c09b7ff35190-kube-api-access-g7sx7\") pod \"mariadb-client-1-default\" (UID: \"592917fd-3d88-4847-af54-c09b7ff35190\") " pod="openstack/mariadb-client-1-default" Dec 02 08:45:25 crc kubenswrapper[4895]: I1202 08:45:25.241488 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-1-default" Dec 02 08:45:25 crc kubenswrapper[4895]: I1202 08:45:25.720313 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-1-default"] Dec 02 08:45:25 crc kubenswrapper[4895]: W1202 08:45:25.727875 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod592917fd_3d88_4847_af54_c09b7ff35190.slice/crio-92209f1eb4374380f270dc3e74e2be1b471e3842fcf35441a8ce739b2485ed4c WatchSource:0}: Error finding container 92209f1eb4374380f270dc3e74e2be1b471e3842fcf35441a8ce739b2485ed4c: Status 404 returned error can't find the container with id 92209f1eb4374380f270dc3e74e2be1b471e3842fcf35441a8ce739b2485ed4c Dec 02 08:45:26 crc kubenswrapper[4895]: I1202 08:45:26.355273 4895 generic.go:334] "Generic (PLEG): container finished" podID="592917fd-3d88-4847-af54-c09b7ff35190" containerID="d1455e109e3f0574aded0226a1b09f401df9d32f7c0ba525cb55ef448fb81738" exitCode=0 Dec 02 08:45:26 crc kubenswrapper[4895]: I1202 08:45:26.355323 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-1-default" event={"ID":"592917fd-3d88-4847-af54-c09b7ff35190","Type":"ContainerDied","Data":"d1455e109e3f0574aded0226a1b09f401df9d32f7c0ba525cb55ef448fb81738"} Dec 02 08:45:26 crc kubenswrapper[4895]: I1202 08:45:26.356374 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-1-default" event={"ID":"592917fd-3d88-4847-af54-c09b7ff35190","Type":"ContainerStarted","Data":"92209f1eb4374380f270dc3e74e2be1b471e3842fcf35441a8ce739b2485ed4c"} Dec 02 08:45:27 crc kubenswrapper[4895]: I1202 08:45:27.685779 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-1-default" Dec 02 08:45:27 crc kubenswrapper[4895]: I1202 08:45:27.710931 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-1-default_592917fd-3d88-4847-af54-c09b7ff35190/mariadb-client-1-default/0.log" Dec 02 08:45:27 crc kubenswrapper[4895]: I1202 08:45:27.736890 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-1-default"] Dec 02 08:45:27 crc kubenswrapper[4895]: I1202 08:45:27.742102 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-1-default"] Dec 02 08:45:27 crc kubenswrapper[4895]: I1202 08:45:27.796135 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g7sx7\" (UniqueName: \"kubernetes.io/projected/592917fd-3d88-4847-af54-c09b7ff35190-kube-api-access-g7sx7\") pod \"592917fd-3d88-4847-af54-c09b7ff35190\" (UID: \"592917fd-3d88-4847-af54-c09b7ff35190\") " Dec 02 08:45:27 crc kubenswrapper[4895]: I1202 08:45:27.802548 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/592917fd-3d88-4847-af54-c09b7ff35190-kube-api-access-g7sx7" (OuterVolumeSpecName: "kube-api-access-g7sx7") pod "592917fd-3d88-4847-af54-c09b7ff35190" (UID: "592917fd-3d88-4847-af54-c09b7ff35190"). InnerVolumeSpecName "kube-api-access-g7sx7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:45:27 crc kubenswrapper[4895]: I1202 08:45:27.897666 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g7sx7\" (UniqueName: \"kubernetes.io/projected/592917fd-3d88-4847-af54-c09b7ff35190-kube-api-access-g7sx7\") on node \"crc\" DevicePath \"\"" Dec 02 08:45:28 crc kubenswrapper[4895]: I1202 08:45:28.140136 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-2-default"] Dec 02 08:45:28 crc kubenswrapper[4895]: E1202 08:45:28.140707 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="592917fd-3d88-4847-af54-c09b7ff35190" containerName="mariadb-client-1-default" Dec 02 08:45:28 crc kubenswrapper[4895]: I1202 08:45:28.140762 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="592917fd-3d88-4847-af54-c09b7ff35190" containerName="mariadb-client-1-default" Dec 02 08:45:28 crc kubenswrapper[4895]: I1202 08:45:28.140970 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="592917fd-3d88-4847-af54-c09b7ff35190" containerName="mariadb-client-1-default" Dec 02 08:45:28 crc kubenswrapper[4895]: I1202 08:45:28.141645 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-2-default" Dec 02 08:45:28 crc kubenswrapper[4895]: I1202 08:45:28.156156 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-2-default"] Dec 02 08:45:28 crc kubenswrapper[4895]: I1202 08:45:28.303518 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvrg5\" (UniqueName: \"kubernetes.io/projected/6d9b6de8-1d2b-401d-b3ce-598000dd3b7d-kube-api-access-xvrg5\") pod \"mariadb-client-2-default\" (UID: \"6d9b6de8-1d2b-401d-b3ce-598000dd3b7d\") " pod="openstack/mariadb-client-2-default" Dec 02 08:45:28 crc kubenswrapper[4895]: I1202 08:45:28.375259 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="92209f1eb4374380f270dc3e74e2be1b471e3842fcf35441a8ce739b2485ed4c" Dec 02 08:45:28 crc kubenswrapper[4895]: I1202 08:45:28.375325 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1-default" Dec 02 08:45:28 crc kubenswrapper[4895]: I1202 08:45:28.406181 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvrg5\" (UniqueName: \"kubernetes.io/projected/6d9b6de8-1d2b-401d-b3ce-598000dd3b7d-kube-api-access-xvrg5\") pod \"mariadb-client-2-default\" (UID: \"6d9b6de8-1d2b-401d-b3ce-598000dd3b7d\") " pod="openstack/mariadb-client-2-default" Dec 02 08:45:28 crc kubenswrapper[4895]: I1202 08:45:28.427379 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvrg5\" (UniqueName: \"kubernetes.io/projected/6d9b6de8-1d2b-401d-b3ce-598000dd3b7d-kube-api-access-xvrg5\") pod \"mariadb-client-2-default\" (UID: \"6d9b6de8-1d2b-401d-b3ce-598000dd3b7d\") " pod="openstack/mariadb-client-2-default" Dec 02 08:45:28 crc kubenswrapper[4895]: I1202 08:45:28.457979 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-2-default" Dec 02 08:45:28 crc kubenswrapper[4895]: I1202 08:45:28.771301 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-2-default"] Dec 02 08:45:28 crc kubenswrapper[4895]: I1202 08:45:28.869666 4895 scope.go:117] "RemoveContainer" containerID="fb2380bbefb8ea9c516cbef089e781cf7406c97d9c8b1f40d62d00c034d7b125" Dec 02 08:45:29 crc kubenswrapper[4895]: I1202 08:45:29.152863 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="592917fd-3d88-4847-af54-c09b7ff35190" path="/var/lib/kubelet/pods/592917fd-3d88-4847-af54-c09b7ff35190/volumes" Dec 02 08:45:29 crc kubenswrapper[4895]: I1202 08:45:29.384760 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2-default" event={"ID":"6d9b6de8-1d2b-401d-b3ce-598000dd3b7d","Type":"ContainerStarted","Data":"314a5bb4aa677fc60fe900ff1f1544794ba3896d8e03f593bec50c68bcb84802"} Dec 02 08:45:29 crc kubenswrapper[4895]: I1202 08:45:29.384807 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2-default" event={"ID":"6d9b6de8-1d2b-401d-b3ce-598000dd3b7d","Type":"ContainerStarted","Data":"a0b6cef3cf29fe8555ca1ca8b42d5bf95254a0047fbdbb52e980aa77bb284e4e"} Dec 02 08:45:29 crc kubenswrapper[4895]: I1202 08:45:29.401376 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-client-2-default" podStartSLOduration=1.401324742 podStartE2EDuration="1.401324742s" podCreationTimestamp="2025-12-02 08:45:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:45:29.399563237 +0000 UTC m=+4940.570422850" watchObservedRunningTime="2025-12-02 08:45:29.401324742 +0000 UTC m=+4940.572184365" Dec 02 08:45:30 crc kubenswrapper[4895]: I1202 08:45:30.398106 4895 generic.go:334] "Generic (PLEG): container finished" 
podID="6d9b6de8-1d2b-401d-b3ce-598000dd3b7d" containerID="314a5bb4aa677fc60fe900ff1f1544794ba3896d8e03f593bec50c68bcb84802" exitCode=1 Dec 02 08:45:30 crc kubenswrapper[4895]: I1202 08:45:30.398226 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2-default" event={"ID":"6d9b6de8-1d2b-401d-b3ce-598000dd3b7d","Type":"ContainerDied","Data":"314a5bb4aa677fc60fe900ff1f1544794ba3896d8e03f593bec50c68bcb84802"} Dec 02 08:45:31 crc kubenswrapper[4895]: I1202 08:45:31.812050 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2-default" Dec 02 08:45:31 crc kubenswrapper[4895]: I1202 08:45:31.856850 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-2-default"] Dec 02 08:45:31 crc kubenswrapper[4895]: I1202 08:45:31.862007 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-2-default"] Dec 02 08:45:31 crc kubenswrapper[4895]: I1202 08:45:31.966496 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xvrg5\" (UniqueName: \"kubernetes.io/projected/6d9b6de8-1d2b-401d-b3ce-598000dd3b7d-kube-api-access-xvrg5\") pod \"6d9b6de8-1d2b-401d-b3ce-598000dd3b7d\" (UID: \"6d9b6de8-1d2b-401d-b3ce-598000dd3b7d\") " Dec 02 08:45:31 crc kubenswrapper[4895]: I1202 08:45:31.973963 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d9b6de8-1d2b-401d-b3ce-598000dd3b7d-kube-api-access-xvrg5" (OuterVolumeSpecName: "kube-api-access-xvrg5") pod "6d9b6de8-1d2b-401d-b3ce-598000dd3b7d" (UID: "6d9b6de8-1d2b-401d-b3ce-598000dd3b7d"). InnerVolumeSpecName "kube-api-access-xvrg5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:45:32 crc kubenswrapper[4895]: I1202 08:45:32.068611 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xvrg5\" (UniqueName: \"kubernetes.io/projected/6d9b6de8-1d2b-401d-b3ce-598000dd3b7d-kube-api-access-xvrg5\") on node \"crc\" DevicePath \"\"" Dec 02 08:45:32 crc kubenswrapper[4895]: I1202 08:45:32.315958 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-1"] Dec 02 08:45:32 crc kubenswrapper[4895]: E1202 08:45:32.319445 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d9b6de8-1d2b-401d-b3ce-598000dd3b7d" containerName="mariadb-client-2-default" Dec 02 08:45:32 crc kubenswrapper[4895]: I1202 08:45:32.319675 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d9b6de8-1d2b-401d-b3ce-598000dd3b7d" containerName="mariadb-client-2-default" Dec 02 08:45:32 crc kubenswrapper[4895]: I1202 08:45:32.320157 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d9b6de8-1d2b-401d-b3ce-598000dd3b7d" containerName="mariadb-client-2-default" Dec 02 08:45:32 crc kubenswrapper[4895]: I1202 08:45:32.321157 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1" Dec 02 08:45:32 crc kubenswrapper[4895]: I1202 08:45:32.332405 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-1"] Dec 02 08:45:32 crc kubenswrapper[4895]: I1202 08:45:32.416927 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a0b6cef3cf29fe8555ca1ca8b42d5bf95254a0047fbdbb52e980aa77bb284e4e" Dec 02 08:45:32 crc kubenswrapper[4895]: I1202 08:45:32.416960 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-2-default" Dec 02 08:45:32 crc kubenswrapper[4895]: I1202 08:45:32.475356 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9p6kc\" (UniqueName: \"kubernetes.io/projected/70994d99-01f3-4875-ac56-63e2f31a8266-kube-api-access-9p6kc\") pod \"mariadb-client-1\" (UID: \"70994d99-01f3-4875-ac56-63e2f31a8266\") " pod="openstack/mariadb-client-1" Dec 02 08:45:32 crc kubenswrapper[4895]: I1202 08:45:32.576809 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9p6kc\" (UniqueName: \"kubernetes.io/projected/70994d99-01f3-4875-ac56-63e2f31a8266-kube-api-access-9p6kc\") pod \"mariadb-client-1\" (UID: \"70994d99-01f3-4875-ac56-63e2f31a8266\") " pod="openstack/mariadb-client-1" Dec 02 08:45:32 crc kubenswrapper[4895]: I1202 08:45:32.593297 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9p6kc\" (UniqueName: \"kubernetes.io/projected/70994d99-01f3-4875-ac56-63e2f31a8266-kube-api-access-9p6kc\") pod \"mariadb-client-1\" (UID: \"70994d99-01f3-4875-ac56-63e2f31a8266\") " pod="openstack/mariadb-client-1" Dec 02 08:45:32 crc kubenswrapper[4895]: I1202 08:45:32.640553 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-1" Dec 02 08:45:32 crc kubenswrapper[4895]: I1202 08:45:32.978047 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-1"] Dec 02 08:45:33 crc kubenswrapper[4895]: I1202 08:45:33.171478 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d9b6de8-1d2b-401d-b3ce-598000dd3b7d" path="/var/lib/kubelet/pods/6d9b6de8-1d2b-401d-b3ce-598000dd3b7d/volumes" Dec 02 08:45:33 crc kubenswrapper[4895]: I1202 08:45:33.427015 4895 generic.go:334] "Generic (PLEG): container finished" podID="70994d99-01f3-4875-ac56-63e2f31a8266" containerID="e21a8c652bb01fe819ac06cfd20fdc2c4c844373697f522b561fac416c9aa8ab" exitCode=0 Dec 02 08:45:33 crc kubenswrapper[4895]: I1202 08:45:33.427070 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-1" event={"ID":"70994d99-01f3-4875-ac56-63e2f31a8266","Type":"ContainerDied","Data":"e21a8c652bb01fe819ac06cfd20fdc2c4c844373697f522b561fac416c9aa8ab"} Dec 02 08:45:33 crc kubenswrapper[4895]: I1202 08:45:33.427479 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-1" event={"ID":"70994d99-01f3-4875-ac56-63e2f31a8266","Type":"ContainerStarted","Data":"0ad9c841ef639b3bd1507acc8e20d954966d0eebf0d2a0ec5fa31ad9761b3fdb"} Dec 02 08:45:34 crc kubenswrapper[4895]: I1202 08:45:34.784623 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-1" Dec 02 08:45:34 crc kubenswrapper[4895]: I1202 08:45:34.801459 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-1_70994d99-01f3-4875-ac56-63e2f31a8266/mariadb-client-1/0.log" Dec 02 08:45:34 crc kubenswrapper[4895]: I1202 08:45:34.831272 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-1"] Dec 02 08:45:34 crc kubenswrapper[4895]: I1202 08:45:34.840031 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-1"] Dec 02 08:45:34 crc kubenswrapper[4895]: I1202 08:45:34.919373 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9p6kc\" (UniqueName: \"kubernetes.io/projected/70994d99-01f3-4875-ac56-63e2f31a8266-kube-api-access-9p6kc\") pod \"70994d99-01f3-4875-ac56-63e2f31a8266\" (UID: \"70994d99-01f3-4875-ac56-63e2f31a8266\") " Dec 02 08:45:34 crc kubenswrapper[4895]: I1202 08:45:34.933082 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70994d99-01f3-4875-ac56-63e2f31a8266-kube-api-access-9p6kc" (OuterVolumeSpecName: "kube-api-access-9p6kc") pod "70994d99-01f3-4875-ac56-63e2f31a8266" (UID: "70994d99-01f3-4875-ac56-63e2f31a8266"). InnerVolumeSpecName "kube-api-access-9p6kc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:45:35 crc kubenswrapper[4895]: I1202 08:45:35.020860 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9p6kc\" (UniqueName: \"kubernetes.io/projected/70994d99-01f3-4875-ac56-63e2f31a8266-kube-api-access-9p6kc\") on node \"crc\" DevicePath \"\"" Dec 02 08:45:35 crc kubenswrapper[4895]: I1202 08:45:35.150094 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70994d99-01f3-4875-ac56-63e2f31a8266" path="/var/lib/kubelet/pods/70994d99-01f3-4875-ac56-63e2f31a8266/volumes" Dec 02 08:45:35 crc kubenswrapper[4895]: I1202 08:45:35.265715 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-4-default"] Dec 02 08:45:35 crc kubenswrapper[4895]: E1202 08:45:35.266409 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70994d99-01f3-4875-ac56-63e2f31a8266" containerName="mariadb-client-1" Dec 02 08:45:35 crc kubenswrapper[4895]: I1202 08:45:35.266435 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="70994d99-01f3-4875-ac56-63e2f31a8266" containerName="mariadb-client-1" Dec 02 08:45:35 crc kubenswrapper[4895]: I1202 08:45:35.266846 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="70994d99-01f3-4875-ac56-63e2f31a8266" containerName="mariadb-client-1" Dec 02 08:45:35 crc kubenswrapper[4895]: I1202 08:45:35.267551 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-4-default" Dec 02 08:45:35 crc kubenswrapper[4895]: I1202 08:45:35.279712 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-4-default"] Dec 02 08:45:35 crc kubenswrapper[4895]: I1202 08:45:35.427556 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87dvw\" (UniqueName: \"kubernetes.io/projected/245f008e-b72b-4ab4-ad08-9cc89f39018b-kube-api-access-87dvw\") pod \"mariadb-client-4-default\" (UID: \"245f008e-b72b-4ab4-ad08-9cc89f39018b\") " pod="openstack/mariadb-client-4-default" Dec 02 08:45:35 crc kubenswrapper[4895]: I1202 08:45:35.446367 4895 scope.go:117] "RemoveContainer" containerID="e21a8c652bb01fe819ac06cfd20fdc2c4c844373697f522b561fac416c9aa8ab" Dec 02 08:45:35 crc kubenswrapper[4895]: I1202 08:45:35.446492 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1" Dec 02 08:45:35 crc kubenswrapper[4895]: I1202 08:45:35.474046 4895 patch_prober.go:28] interesting pod/machine-config-daemon-wfcg7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 08:45:35 crc kubenswrapper[4895]: I1202 08:45:35.474516 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 08:45:35 crc kubenswrapper[4895]: I1202 08:45:35.528733 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87dvw\" (UniqueName: 
\"kubernetes.io/projected/245f008e-b72b-4ab4-ad08-9cc89f39018b-kube-api-access-87dvw\") pod \"mariadb-client-4-default\" (UID: \"245f008e-b72b-4ab4-ad08-9cc89f39018b\") " pod="openstack/mariadb-client-4-default" Dec 02 08:45:35 crc kubenswrapper[4895]: I1202 08:45:35.551318 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87dvw\" (UniqueName: \"kubernetes.io/projected/245f008e-b72b-4ab4-ad08-9cc89f39018b-kube-api-access-87dvw\") pod \"mariadb-client-4-default\" (UID: \"245f008e-b72b-4ab4-ad08-9cc89f39018b\") " pod="openstack/mariadb-client-4-default" Dec 02 08:45:35 crc kubenswrapper[4895]: I1202 08:45:35.610315 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-4-default" Dec 02 08:45:36 crc kubenswrapper[4895]: I1202 08:45:36.105649 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-4-default"] Dec 02 08:45:36 crc kubenswrapper[4895]: W1202 08:45:36.109325 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod245f008e_b72b_4ab4_ad08_9cc89f39018b.slice/crio-acb19b554abbeafc505cbac352e5bf8d55fceaa1f56a362038698c3a4626ccb1 WatchSource:0}: Error finding container acb19b554abbeafc505cbac352e5bf8d55fceaa1f56a362038698c3a4626ccb1: Status 404 returned error can't find the container with id acb19b554abbeafc505cbac352e5bf8d55fceaa1f56a362038698c3a4626ccb1 Dec 02 08:45:36 crc kubenswrapper[4895]: I1202 08:45:36.455640 4895 generic.go:334] "Generic (PLEG): container finished" podID="245f008e-b72b-4ab4-ad08-9cc89f39018b" containerID="f878460e0b4a0877664a27db856ce73d083ed35fe3b1b80f06c55b474f033f48" exitCode=0 Dec 02 08:45:36 crc kubenswrapper[4895]: I1202 08:45:36.455673 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-4-default" 
event={"ID":"245f008e-b72b-4ab4-ad08-9cc89f39018b","Type":"ContainerDied","Data":"f878460e0b4a0877664a27db856ce73d083ed35fe3b1b80f06c55b474f033f48"} Dec 02 08:45:36 crc kubenswrapper[4895]: I1202 08:45:36.455707 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-4-default" event={"ID":"245f008e-b72b-4ab4-ad08-9cc89f39018b","Type":"ContainerStarted","Data":"acb19b554abbeafc505cbac352e5bf8d55fceaa1f56a362038698c3a4626ccb1"} Dec 02 08:45:37 crc kubenswrapper[4895]: I1202 08:45:37.833821 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-4-default" Dec 02 08:45:37 crc kubenswrapper[4895]: I1202 08:45:37.857764 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-4-default_245f008e-b72b-4ab4-ad08-9cc89f39018b/mariadb-client-4-default/0.log" Dec 02 08:45:37 crc kubenswrapper[4895]: I1202 08:45:37.888312 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-4-default"] Dec 02 08:45:37 crc kubenswrapper[4895]: I1202 08:45:37.893276 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-4-default"] Dec 02 08:45:37 crc kubenswrapper[4895]: I1202 08:45:37.964358 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-87dvw\" (UniqueName: \"kubernetes.io/projected/245f008e-b72b-4ab4-ad08-9cc89f39018b-kube-api-access-87dvw\") pod \"245f008e-b72b-4ab4-ad08-9cc89f39018b\" (UID: \"245f008e-b72b-4ab4-ad08-9cc89f39018b\") " Dec 02 08:45:37 crc kubenswrapper[4895]: I1202 08:45:37.970540 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/245f008e-b72b-4ab4-ad08-9cc89f39018b-kube-api-access-87dvw" (OuterVolumeSpecName: "kube-api-access-87dvw") pod "245f008e-b72b-4ab4-ad08-9cc89f39018b" (UID: "245f008e-b72b-4ab4-ad08-9cc89f39018b"). InnerVolumeSpecName "kube-api-access-87dvw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:45:38 crc kubenswrapper[4895]: I1202 08:45:38.066325 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-87dvw\" (UniqueName: \"kubernetes.io/projected/245f008e-b72b-4ab4-ad08-9cc89f39018b-kube-api-access-87dvw\") on node \"crc\" DevicePath \"\"" Dec 02 08:45:38 crc kubenswrapper[4895]: I1202 08:45:38.486812 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="acb19b554abbeafc505cbac352e5bf8d55fceaa1f56a362038698c3a4626ccb1" Dec 02 08:45:38 crc kubenswrapper[4895]: I1202 08:45:38.486904 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-4-default" Dec 02 08:45:39 crc kubenswrapper[4895]: I1202 08:45:39.150467 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="245f008e-b72b-4ab4-ad08-9cc89f39018b" path="/var/lib/kubelet/pods/245f008e-b72b-4ab4-ad08-9cc89f39018b/volumes" Dec 02 08:45:41 crc kubenswrapper[4895]: I1202 08:45:41.470611 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-5-default"] Dec 02 08:45:41 crc kubenswrapper[4895]: E1202 08:45:41.471181 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="245f008e-b72b-4ab4-ad08-9cc89f39018b" containerName="mariadb-client-4-default" Dec 02 08:45:41 crc kubenswrapper[4895]: I1202 08:45:41.471195 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="245f008e-b72b-4ab4-ad08-9cc89f39018b" containerName="mariadb-client-4-default" Dec 02 08:45:41 crc kubenswrapper[4895]: I1202 08:45:41.471347 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="245f008e-b72b-4ab4-ad08-9cc89f39018b" containerName="mariadb-client-4-default" Dec 02 08:45:41 crc kubenswrapper[4895]: I1202 08:45:41.471910 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-5-default" Dec 02 08:45:41 crc kubenswrapper[4895]: I1202 08:45:41.476609 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-djw6d" Dec 02 08:45:41 crc kubenswrapper[4895]: I1202 08:45:41.480364 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-5-default"] Dec 02 08:45:41 crc kubenswrapper[4895]: I1202 08:45:41.624675 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twbxd\" (UniqueName: \"kubernetes.io/projected/4ed325fa-9a1a-45e8-aec2-c2a1e0a6c794-kube-api-access-twbxd\") pod \"mariadb-client-5-default\" (UID: \"4ed325fa-9a1a-45e8-aec2-c2a1e0a6c794\") " pod="openstack/mariadb-client-5-default" Dec 02 08:45:41 crc kubenswrapper[4895]: I1202 08:45:41.726785 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-twbxd\" (UniqueName: \"kubernetes.io/projected/4ed325fa-9a1a-45e8-aec2-c2a1e0a6c794-kube-api-access-twbxd\") pod \"mariadb-client-5-default\" (UID: \"4ed325fa-9a1a-45e8-aec2-c2a1e0a6c794\") " pod="openstack/mariadb-client-5-default" Dec 02 08:45:41 crc kubenswrapper[4895]: I1202 08:45:41.744438 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-twbxd\" (UniqueName: \"kubernetes.io/projected/4ed325fa-9a1a-45e8-aec2-c2a1e0a6c794-kube-api-access-twbxd\") pod \"mariadb-client-5-default\" (UID: \"4ed325fa-9a1a-45e8-aec2-c2a1e0a6c794\") " pod="openstack/mariadb-client-5-default" Dec 02 08:45:41 crc kubenswrapper[4895]: I1202 08:45:41.804701 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-5-default" Dec 02 08:45:42 crc kubenswrapper[4895]: I1202 08:45:42.318681 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-5-default"] Dec 02 08:45:42 crc kubenswrapper[4895]: I1202 08:45:42.518170 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-5-default" event={"ID":"4ed325fa-9a1a-45e8-aec2-c2a1e0a6c794","Type":"ContainerStarted","Data":"e50b9fac9d9d532e2a2aac9045cefcb8e025b42ce74d220bc98a83a8243ff0cc"} Dec 02 08:45:42 crc kubenswrapper[4895]: I1202 08:45:42.518236 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-5-default" event={"ID":"4ed325fa-9a1a-45e8-aec2-c2a1e0a6c794","Type":"ContainerStarted","Data":"ec3b64dfb3240a5df60ee4a37d028b5ab92f8d31e0885a5cc20cfed1310116e4"} Dec 02 08:45:42 crc kubenswrapper[4895]: I1202 08:45:42.535711 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-client-5-default" podStartSLOduration=1.5356937579999999 podStartE2EDuration="1.535693758s" podCreationTimestamp="2025-12-02 08:45:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:45:42.53124012 +0000 UTC m=+4953.702099753" watchObservedRunningTime="2025-12-02 08:45:42.535693758 +0000 UTC m=+4953.706553371" Dec 02 08:45:42 crc kubenswrapper[4895]: I1202 08:45:42.577139 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-5-default_4ed325fa-9a1a-45e8-aec2-c2a1e0a6c794/mariadb-client-5-default/0.log" Dec 02 08:45:43 crc kubenswrapper[4895]: I1202 08:45:43.527188 4895 generic.go:334] "Generic (PLEG): container finished" podID="4ed325fa-9a1a-45e8-aec2-c2a1e0a6c794" containerID="e50b9fac9d9d532e2a2aac9045cefcb8e025b42ce74d220bc98a83a8243ff0cc" exitCode=0 Dec 02 08:45:43 crc kubenswrapper[4895]: I1202 08:45:43.527313 4895 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-5-default" event={"ID":"4ed325fa-9a1a-45e8-aec2-c2a1e0a6c794","Type":"ContainerDied","Data":"e50b9fac9d9d532e2a2aac9045cefcb8e025b42ce74d220bc98a83a8243ff0cc"} Dec 02 08:45:44 crc kubenswrapper[4895]: I1202 08:45:44.863026 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-5-default" Dec 02 08:45:44 crc kubenswrapper[4895]: I1202 08:45:44.906280 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-5-default"] Dec 02 08:45:44 crc kubenswrapper[4895]: I1202 08:45:44.912711 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-5-default"] Dec 02 08:45:44 crc kubenswrapper[4895]: I1202 08:45:44.973464 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-twbxd\" (UniqueName: \"kubernetes.io/projected/4ed325fa-9a1a-45e8-aec2-c2a1e0a6c794-kube-api-access-twbxd\") pod \"4ed325fa-9a1a-45e8-aec2-c2a1e0a6c794\" (UID: \"4ed325fa-9a1a-45e8-aec2-c2a1e0a6c794\") " Dec 02 08:45:44 crc kubenswrapper[4895]: I1202 08:45:44.979040 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ed325fa-9a1a-45e8-aec2-c2a1e0a6c794-kube-api-access-twbxd" (OuterVolumeSpecName: "kube-api-access-twbxd") pod "4ed325fa-9a1a-45e8-aec2-c2a1e0a6c794" (UID: "4ed325fa-9a1a-45e8-aec2-c2a1e0a6c794"). InnerVolumeSpecName "kube-api-access-twbxd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:45:45 crc kubenswrapper[4895]: I1202 08:45:45.038891 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-6-default"] Dec 02 08:45:45 crc kubenswrapper[4895]: E1202 08:45:45.039438 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ed325fa-9a1a-45e8-aec2-c2a1e0a6c794" containerName="mariadb-client-5-default" Dec 02 08:45:45 crc kubenswrapper[4895]: I1202 08:45:45.039457 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ed325fa-9a1a-45e8-aec2-c2a1e0a6c794" containerName="mariadb-client-5-default" Dec 02 08:45:45 crc kubenswrapper[4895]: I1202 08:45:45.039698 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ed325fa-9a1a-45e8-aec2-c2a1e0a6c794" containerName="mariadb-client-5-default" Dec 02 08:45:45 crc kubenswrapper[4895]: I1202 08:45:45.040529 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-6-default" Dec 02 08:45:45 crc kubenswrapper[4895]: I1202 08:45:45.059751 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-6-default"] Dec 02 08:45:45 crc kubenswrapper[4895]: I1202 08:45:45.074619 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-twbxd\" (UniqueName: \"kubernetes.io/projected/4ed325fa-9a1a-45e8-aec2-c2a1e0a6c794-kube-api-access-twbxd\") on node \"crc\" DevicePath \"\"" Dec 02 08:45:45 crc kubenswrapper[4895]: I1202 08:45:45.152607 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ed325fa-9a1a-45e8-aec2-c2a1e0a6c794" path="/var/lib/kubelet/pods/4ed325fa-9a1a-45e8-aec2-c2a1e0a6c794/volumes" Dec 02 08:45:45 crc kubenswrapper[4895]: I1202 08:45:45.176604 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2dfv\" (UniqueName: 
\"kubernetes.io/projected/d5add3df-73ee-4852-ba12-dfb7c1766f20-kube-api-access-n2dfv\") pod \"mariadb-client-6-default\" (UID: \"d5add3df-73ee-4852-ba12-dfb7c1766f20\") " pod="openstack/mariadb-client-6-default" Dec 02 08:45:45 crc kubenswrapper[4895]: I1202 08:45:45.278793 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n2dfv\" (UniqueName: \"kubernetes.io/projected/d5add3df-73ee-4852-ba12-dfb7c1766f20-kube-api-access-n2dfv\") pod \"mariadb-client-6-default\" (UID: \"d5add3df-73ee-4852-ba12-dfb7c1766f20\") " pod="openstack/mariadb-client-6-default" Dec 02 08:45:45 crc kubenswrapper[4895]: I1202 08:45:45.301602 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2dfv\" (UniqueName: \"kubernetes.io/projected/d5add3df-73ee-4852-ba12-dfb7c1766f20-kube-api-access-n2dfv\") pod \"mariadb-client-6-default\" (UID: \"d5add3df-73ee-4852-ba12-dfb7c1766f20\") " pod="openstack/mariadb-client-6-default" Dec 02 08:45:45 crc kubenswrapper[4895]: I1202 08:45:45.368259 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-6-default" Dec 02 08:45:45 crc kubenswrapper[4895]: I1202 08:45:45.547023 4895 scope.go:117] "RemoveContainer" containerID="e50b9fac9d9d532e2a2aac9045cefcb8e025b42ce74d220bc98a83a8243ff0cc" Dec 02 08:45:45 crc kubenswrapper[4895]: I1202 08:45:45.547071 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-5-default" Dec 02 08:45:45 crc kubenswrapper[4895]: I1202 08:45:45.877065 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-6-default"] Dec 02 08:45:45 crc kubenswrapper[4895]: W1202 08:45:45.882158 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd5add3df_73ee_4852_ba12_dfb7c1766f20.slice/crio-033284e0cdd3de941734e5b62c400cc96ed77e542a4b57a3b93a1cabb25c9d5d WatchSource:0}: Error finding container 033284e0cdd3de941734e5b62c400cc96ed77e542a4b57a3b93a1cabb25c9d5d: Status 404 returned error can't find the container with id 033284e0cdd3de941734e5b62c400cc96ed77e542a4b57a3b93a1cabb25c9d5d Dec 02 08:45:46 crc kubenswrapper[4895]: I1202 08:45:46.559902 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-6-default" event={"ID":"d5add3df-73ee-4852-ba12-dfb7c1766f20","Type":"ContainerStarted","Data":"ab345234fe9f0d1d421b4bbf8d00fd65397eafdb6e174992773fb2a75f5eb2ef"} Dec 02 08:45:46 crc kubenswrapper[4895]: I1202 08:45:46.560391 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-6-default" event={"ID":"d5add3df-73ee-4852-ba12-dfb7c1766f20","Type":"ContainerStarted","Data":"033284e0cdd3de941734e5b62c400cc96ed77e542a4b57a3b93a1cabb25c9d5d"} Dec 02 08:45:46 crc kubenswrapper[4895]: I1202 08:45:46.589346 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-client-6-default" podStartSLOduration=1.589318368 podStartE2EDuration="1.589318368s" podCreationTimestamp="2025-12-02 08:45:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:45:46.581356301 +0000 UTC m=+4957.752215964" watchObservedRunningTime="2025-12-02 08:45:46.589318368 +0000 UTC m=+4957.760177991" Dec 02 08:45:46 crc kubenswrapper[4895]: 
I1202 08:45:46.639393 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-6-default_d5add3df-73ee-4852-ba12-dfb7c1766f20/mariadb-client-6-default/0.log" Dec 02 08:45:47 crc kubenswrapper[4895]: I1202 08:45:47.569650 4895 generic.go:334] "Generic (PLEG): container finished" podID="d5add3df-73ee-4852-ba12-dfb7c1766f20" containerID="ab345234fe9f0d1d421b4bbf8d00fd65397eafdb6e174992773fb2a75f5eb2ef" exitCode=1 Dec 02 08:45:47 crc kubenswrapper[4895]: I1202 08:45:47.569751 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-6-default" event={"ID":"d5add3df-73ee-4852-ba12-dfb7c1766f20","Type":"ContainerDied","Data":"ab345234fe9f0d1d421b4bbf8d00fd65397eafdb6e174992773fb2a75f5eb2ef"} Dec 02 08:45:48 crc kubenswrapper[4895]: I1202 08:45:48.990295 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-6-default" Dec 02 08:45:49 crc kubenswrapper[4895]: I1202 08:45:49.037236 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n2dfv\" (UniqueName: \"kubernetes.io/projected/d5add3df-73ee-4852-ba12-dfb7c1766f20-kube-api-access-n2dfv\") pod \"d5add3df-73ee-4852-ba12-dfb7c1766f20\" (UID: \"d5add3df-73ee-4852-ba12-dfb7c1766f20\") " Dec 02 08:45:49 crc kubenswrapper[4895]: I1202 08:45:49.038837 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-6-default"] Dec 02 08:45:49 crc kubenswrapper[4895]: I1202 08:45:49.044479 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-6-default"] Dec 02 08:45:49 crc kubenswrapper[4895]: I1202 08:45:49.045660 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5add3df-73ee-4852-ba12-dfb7c1766f20-kube-api-access-n2dfv" (OuterVolumeSpecName: "kube-api-access-n2dfv") pod "d5add3df-73ee-4852-ba12-dfb7c1766f20" (UID: "d5add3df-73ee-4852-ba12-dfb7c1766f20"). 
InnerVolumeSpecName "kube-api-access-n2dfv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:45:49 crc kubenswrapper[4895]: I1202 08:45:49.140149 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n2dfv\" (UniqueName: \"kubernetes.io/projected/d5add3df-73ee-4852-ba12-dfb7c1766f20-kube-api-access-n2dfv\") on node \"crc\" DevicePath \"\"" Dec 02 08:45:49 crc kubenswrapper[4895]: I1202 08:45:49.154962 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5add3df-73ee-4852-ba12-dfb7c1766f20" path="/var/lib/kubelet/pods/d5add3df-73ee-4852-ba12-dfb7c1766f20/volumes" Dec 02 08:45:49 crc kubenswrapper[4895]: I1202 08:45:49.192994 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-7-default"] Dec 02 08:45:49 crc kubenswrapper[4895]: E1202 08:45:49.193700 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5add3df-73ee-4852-ba12-dfb7c1766f20" containerName="mariadb-client-6-default" Dec 02 08:45:49 crc kubenswrapper[4895]: I1202 08:45:49.193753 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5add3df-73ee-4852-ba12-dfb7c1766f20" containerName="mariadb-client-6-default" Dec 02 08:45:49 crc kubenswrapper[4895]: I1202 08:45:49.194020 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5add3df-73ee-4852-ba12-dfb7c1766f20" containerName="mariadb-client-6-default" Dec 02 08:45:49 crc kubenswrapper[4895]: I1202 08:45:49.195067 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-7-default" Dec 02 08:45:49 crc kubenswrapper[4895]: I1202 08:45:49.198523 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-7-default"] Dec 02 08:45:49 crc kubenswrapper[4895]: I1202 08:45:49.242385 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pct7q\" (UniqueName: \"kubernetes.io/projected/6fa8f398-1c61-4cd3-8c9d-694af60cbadd-kube-api-access-pct7q\") pod \"mariadb-client-7-default\" (UID: \"6fa8f398-1c61-4cd3-8c9d-694af60cbadd\") " pod="openstack/mariadb-client-7-default" Dec 02 08:45:49 crc kubenswrapper[4895]: I1202 08:45:49.344024 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pct7q\" (UniqueName: \"kubernetes.io/projected/6fa8f398-1c61-4cd3-8c9d-694af60cbadd-kube-api-access-pct7q\") pod \"mariadb-client-7-default\" (UID: \"6fa8f398-1c61-4cd3-8c9d-694af60cbadd\") " pod="openstack/mariadb-client-7-default" Dec 02 08:45:49 crc kubenswrapper[4895]: I1202 08:45:49.364604 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pct7q\" (UniqueName: \"kubernetes.io/projected/6fa8f398-1c61-4cd3-8c9d-694af60cbadd-kube-api-access-pct7q\") pod \"mariadb-client-7-default\" (UID: \"6fa8f398-1c61-4cd3-8c9d-694af60cbadd\") " pod="openstack/mariadb-client-7-default" Dec 02 08:45:49 crc kubenswrapper[4895]: I1202 08:45:49.519336 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-7-default" Dec 02 08:45:49 crc kubenswrapper[4895]: I1202 08:45:49.591987 4895 scope.go:117] "RemoveContainer" containerID="ab345234fe9f0d1d421b4bbf8d00fd65397eafdb6e174992773fb2a75f5eb2ef" Dec 02 08:45:49 crc kubenswrapper[4895]: I1202 08:45:49.592064 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-6-default" Dec 02 08:45:50 crc kubenswrapper[4895]: I1202 08:45:50.011704 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-7-default"] Dec 02 08:45:50 crc kubenswrapper[4895]: I1202 08:45:50.603259 4895 generic.go:334] "Generic (PLEG): container finished" podID="6fa8f398-1c61-4cd3-8c9d-694af60cbadd" containerID="e6945bf8ec19b3f33bfbbe32a1790d6f153dcdd29fea0d52bbb2ac72701756c3" exitCode=0 Dec 02 08:45:50 crc kubenswrapper[4895]: I1202 08:45:50.603391 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-7-default" event={"ID":"6fa8f398-1c61-4cd3-8c9d-694af60cbadd","Type":"ContainerDied","Data":"e6945bf8ec19b3f33bfbbe32a1790d6f153dcdd29fea0d52bbb2ac72701756c3"} Dec 02 08:45:50 crc kubenswrapper[4895]: I1202 08:45:50.603602 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-7-default" event={"ID":"6fa8f398-1c61-4cd3-8c9d-694af60cbadd","Type":"ContainerStarted","Data":"51acdea1489098eb667858aabcc35b3f2a5c53f36ee9c3aefc939a0ba234f904"} Dec 02 08:45:51 crc kubenswrapper[4895]: I1202 08:45:51.931700 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-7-default" Dec 02 08:45:51 crc kubenswrapper[4895]: I1202 08:45:51.949062 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-7-default_6fa8f398-1c61-4cd3-8c9d-694af60cbadd/mariadb-client-7-default/0.log" Dec 02 08:45:51 crc kubenswrapper[4895]: I1202 08:45:51.974967 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-7-default"] Dec 02 08:45:51 crc kubenswrapper[4895]: I1202 08:45:51.983489 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-7-default"] Dec 02 08:45:51 crc kubenswrapper[4895]: I1202 08:45:51.990356 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pct7q\" (UniqueName: \"kubernetes.io/projected/6fa8f398-1c61-4cd3-8c9d-694af60cbadd-kube-api-access-pct7q\") pod \"6fa8f398-1c61-4cd3-8c9d-694af60cbadd\" (UID: \"6fa8f398-1c61-4cd3-8c9d-694af60cbadd\") " Dec 02 08:45:52 crc kubenswrapper[4895]: I1202 08:45:52.039055 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6fa8f398-1c61-4cd3-8c9d-694af60cbadd-kube-api-access-pct7q" (OuterVolumeSpecName: "kube-api-access-pct7q") pod "6fa8f398-1c61-4cd3-8c9d-694af60cbadd" (UID: "6fa8f398-1c61-4cd3-8c9d-694af60cbadd"). InnerVolumeSpecName "kube-api-access-pct7q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:45:52 crc kubenswrapper[4895]: I1202 08:45:52.092533 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pct7q\" (UniqueName: \"kubernetes.io/projected/6fa8f398-1c61-4cd3-8c9d-694af60cbadd-kube-api-access-pct7q\") on node \"crc\" DevicePath \"\"" Dec 02 08:45:52 crc kubenswrapper[4895]: I1202 08:45:52.112165 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-2"] Dec 02 08:45:52 crc kubenswrapper[4895]: E1202 08:45:52.112656 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fa8f398-1c61-4cd3-8c9d-694af60cbadd" containerName="mariadb-client-7-default" Dec 02 08:45:52 crc kubenswrapper[4895]: I1202 08:45:52.112681 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fa8f398-1c61-4cd3-8c9d-694af60cbadd" containerName="mariadb-client-7-default" Dec 02 08:45:52 crc kubenswrapper[4895]: I1202 08:45:52.112902 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="6fa8f398-1c61-4cd3-8c9d-694af60cbadd" containerName="mariadb-client-7-default" Dec 02 08:45:52 crc kubenswrapper[4895]: I1202 08:45:52.113550 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-2" Dec 02 08:45:52 crc kubenswrapper[4895]: I1202 08:45:52.119276 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-2"] Dec 02 08:45:52 crc kubenswrapper[4895]: I1202 08:45:52.194384 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jpbks\" (UniqueName: \"kubernetes.io/projected/4c317329-bf77-4ed7-8238-0599e731bc77-kube-api-access-jpbks\") pod \"mariadb-client-2\" (UID: \"4c317329-bf77-4ed7-8238-0599e731bc77\") " pod="openstack/mariadb-client-2" Dec 02 08:45:52 crc kubenswrapper[4895]: I1202 08:45:52.295789 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jpbks\" (UniqueName: \"kubernetes.io/projected/4c317329-bf77-4ed7-8238-0599e731bc77-kube-api-access-jpbks\") pod \"mariadb-client-2\" (UID: \"4c317329-bf77-4ed7-8238-0599e731bc77\") " pod="openstack/mariadb-client-2" Dec 02 08:45:52 crc kubenswrapper[4895]: I1202 08:45:52.313132 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jpbks\" (UniqueName: \"kubernetes.io/projected/4c317329-bf77-4ed7-8238-0599e731bc77-kube-api-access-jpbks\") pod \"mariadb-client-2\" (UID: \"4c317329-bf77-4ed7-8238-0599e731bc77\") " pod="openstack/mariadb-client-2" Dec 02 08:45:52 crc kubenswrapper[4895]: I1202 08:45:52.467134 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2" Dec 02 08:45:52 crc kubenswrapper[4895]: I1202 08:45:52.619261 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="51acdea1489098eb667858aabcc35b3f2a5c53f36ee9c3aefc939a0ba234f904" Dec 02 08:45:52 crc kubenswrapper[4895]: I1202 08:45:52.619298 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-7-default" Dec 02 08:45:52 crc kubenswrapper[4895]: I1202 08:45:52.930480 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-2"] Dec 02 08:45:52 crc kubenswrapper[4895]: W1202 08:45:52.934371 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4c317329_bf77_4ed7_8238_0599e731bc77.slice/crio-87b03ee2718725958382b40f83f7ebd1b74be22f0ca32863cc26603b403fd9ae WatchSource:0}: Error finding container 87b03ee2718725958382b40f83f7ebd1b74be22f0ca32863cc26603b403fd9ae: Status 404 returned error can't find the container with id 87b03ee2718725958382b40f83f7ebd1b74be22f0ca32863cc26603b403fd9ae Dec 02 08:45:53 crc kubenswrapper[4895]: I1202 08:45:53.151250 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6fa8f398-1c61-4cd3-8c9d-694af60cbadd" path="/var/lib/kubelet/pods/6fa8f398-1c61-4cd3-8c9d-694af60cbadd/volumes" Dec 02 08:45:53 crc kubenswrapper[4895]: I1202 08:45:53.630593 4895 generic.go:334] "Generic (PLEG): container finished" podID="4c317329-bf77-4ed7-8238-0599e731bc77" containerID="0cc2dff8aa801ad5caa0a021877d182149ec1ef71cecc823a39956ab402ba2f2" exitCode=0 Dec 02 08:45:53 crc kubenswrapper[4895]: I1202 08:45:53.630678 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2" event={"ID":"4c317329-bf77-4ed7-8238-0599e731bc77","Type":"ContainerDied","Data":"0cc2dff8aa801ad5caa0a021877d182149ec1ef71cecc823a39956ab402ba2f2"} Dec 02 08:45:53 crc kubenswrapper[4895]: I1202 08:45:53.630728 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2" event={"ID":"4c317329-bf77-4ed7-8238-0599e731bc77","Type":"ContainerStarted","Data":"87b03ee2718725958382b40f83f7ebd1b74be22f0ca32863cc26603b403fd9ae"} Dec 02 08:45:55 crc kubenswrapper[4895]: I1202 08:45:55.007971 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-2" Dec 02 08:45:55 crc kubenswrapper[4895]: I1202 08:45:55.025225 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-2_4c317329-bf77-4ed7-8238-0599e731bc77/mariadb-client-2/0.log" Dec 02 08:45:55 crc kubenswrapper[4895]: I1202 08:45:55.037114 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jpbks\" (UniqueName: \"kubernetes.io/projected/4c317329-bf77-4ed7-8238-0599e731bc77-kube-api-access-jpbks\") pod \"4c317329-bf77-4ed7-8238-0599e731bc77\" (UID: \"4c317329-bf77-4ed7-8238-0599e731bc77\") " Dec 02 08:45:55 crc kubenswrapper[4895]: I1202 08:45:55.044097 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c317329-bf77-4ed7-8238-0599e731bc77-kube-api-access-jpbks" (OuterVolumeSpecName: "kube-api-access-jpbks") pod "4c317329-bf77-4ed7-8238-0599e731bc77" (UID: "4c317329-bf77-4ed7-8238-0599e731bc77"). InnerVolumeSpecName "kube-api-access-jpbks". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:45:55 crc kubenswrapper[4895]: I1202 08:45:55.055642 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-2"] Dec 02 08:45:55 crc kubenswrapper[4895]: I1202 08:45:55.060938 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-2"] Dec 02 08:45:55 crc kubenswrapper[4895]: I1202 08:45:55.139530 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jpbks\" (UniqueName: \"kubernetes.io/projected/4c317329-bf77-4ed7-8238-0599e731bc77-kube-api-access-jpbks\") on node \"crc\" DevicePath \"\"" Dec 02 08:45:55 crc kubenswrapper[4895]: I1202 08:45:55.151898 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c317329-bf77-4ed7-8238-0599e731bc77" path="/var/lib/kubelet/pods/4c317329-bf77-4ed7-8238-0599e731bc77/volumes" Dec 02 08:45:55 crc kubenswrapper[4895]: E1202 08:45:55.221164 4895 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4c317329_bf77_4ed7_8238_0599e731bc77.slice\": RecentStats: unable to find data in memory cache]" Dec 02 08:45:55 crc kubenswrapper[4895]: I1202 08:45:55.663427 4895 scope.go:117] "RemoveContainer" containerID="0cc2dff8aa801ad5caa0a021877d182149ec1ef71cecc823a39956ab402ba2f2" Dec 02 08:45:55 crc kubenswrapper[4895]: I1202 08:45:55.663454 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-2" Dec 02 08:46:05 crc kubenswrapper[4895]: I1202 08:46:05.474096 4895 patch_prober.go:28] interesting pod/machine-config-daemon-wfcg7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 08:46:05 crc kubenswrapper[4895]: I1202 08:46:05.474722 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 08:46:28 crc kubenswrapper[4895]: I1202 08:46:28.948685 4895 scope.go:117] "RemoveContainer" containerID="0ebdb8aa2004d72423d7de16f89e0c5be4bb4adf06c82756f17f7df43fafee24" Dec 02 08:46:35 crc kubenswrapper[4895]: I1202 08:46:35.473448 4895 patch_prober.go:28] interesting pod/machine-config-daemon-wfcg7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 08:46:35 crc kubenswrapper[4895]: I1202 08:46:35.474089 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 08:46:35 crc kubenswrapper[4895]: I1202 08:46:35.474148 4895 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" Dec 02 08:46:35 crc kubenswrapper[4895]: I1202 
08:46:35.474896 4895 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5f3bd2c356ea4faeb32e1c7dba83504ac381c8d421d130309e1326c5000e3875"} pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 08:46:35 crc kubenswrapper[4895]: I1202 08:46:35.474961 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerName="machine-config-daemon" containerID="cri-o://5f3bd2c356ea4faeb32e1c7dba83504ac381c8d421d130309e1326c5000e3875" gracePeriod=600 Dec 02 08:46:35 crc kubenswrapper[4895]: E1202 08:46:35.609579 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 08:46:35 crc kubenswrapper[4895]: I1202 08:46:35.986358 4895 generic.go:334] "Generic (PLEG): container finished" podID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerID="5f3bd2c356ea4faeb32e1c7dba83504ac381c8d421d130309e1326c5000e3875" exitCode=0 Dec 02 08:46:35 crc kubenswrapper[4895]: I1202 08:46:35.986408 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" event={"ID":"0468d2d1-a975-45a6-af9f-47adc432fab0","Type":"ContainerDied","Data":"5f3bd2c356ea4faeb32e1c7dba83504ac381c8d421d130309e1326c5000e3875"} Dec 02 08:46:35 crc kubenswrapper[4895]: I1202 08:46:35.986452 4895 scope.go:117] "RemoveContainer" 
containerID="32bf7d392743b71deb119b4fd3e6dd2e4aeb7c86e6abc8aa43066f6a5cc4af85" Dec 02 08:46:35 crc kubenswrapper[4895]: I1202 08:46:35.987010 4895 scope.go:117] "RemoveContainer" containerID="5f3bd2c356ea4faeb32e1c7dba83504ac381c8d421d130309e1326c5000e3875" Dec 02 08:46:35 crc kubenswrapper[4895]: E1202 08:46:35.987318 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 08:46:50 crc kubenswrapper[4895]: I1202 08:46:50.140707 4895 scope.go:117] "RemoveContainer" containerID="5f3bd2c356ea4faeb32e1c7dba83504ac381c8d421d130309e1326c5000e3875" Dec 02 08:46:50 crc kubenswrapper[4895]: E1202 08:46:50.141570 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 08:47:05 crc kubenswrapper[4895]: I1202 08:47:05.141382 4895 scope.go:117] "RemoveContainer" containerID="5f3bd2c356ea4faeb32e1c7dba83504ac381c8d421d130309e1326c5000e3875" Dec 02 08:47:05 crc kubenswrapper[4895]: E1202 08:47:05.142139 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 08:47:19 crc kubenswrapper[4895]: I1202 08:47:19.147616 4895 scope.go:117] "RemoveContainer" containerID="5f3bd2c356ea4faeb32e1c7dba83504ac381c8d421d130309e1326c5000e3875" Dec 02 08:47:19 crc kubenswrapper[4895]: E1202 08:47:19.150620 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 08:47:21 crc kubenswrapper[4895]: I1202 08:47:21.758051 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-9l8mm"] Dec 02 08:47:21 crc kubenswrapper[4895]: E1202 08:47:21.759072 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c317329-bf77-4ed7-8238-0599e731bc77" containerName="mariadb-client-2" Dec 02 08:47:21 crc kubenswrapper[4895]: I1202 08:47:21.759090 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c317329-bf77-4ed7-8238-0599e731bc77" containerName="mariadb-client-2" Dec 02 08:47:21 crc kubenswrapper[4895]: I1202 08:47:21.759267 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c317329-bf77-4ed7-8238-0599e731bc77" containerName="mariadb-client-2" Dec 02 08:47:21 crc kubenswrapper[4895]: I1202 08:47:21.761267 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9l8mm" Dec 02 08:47:21 crc kubenswrapper[4895]: I1202 08:47:21.769199 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9l8mm"] Dec 02 08:47:21 crc kubenswrapper[4895]: I1202 08:47:21.893621 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4f43124-15db-4bfb-be8a-475936ba39a1-catalog-content\") pod \"redhat-operators-9l8mm\" (UID: \"a4f43124-15db-4bfb-be8a-475936ba39a1\") " pod="openshift-marketplace/redhat-operators-9l8mm" Dec 02 08:47:21 crc kubenswrapper[4895]: I1202 08:47:21.893682 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4f43124-15db-4bfb-be8a-475936ba39a1-utilities\") pod \"redhat-operators-9l8mm\" (UID: \"a4f43124-15db-4bfb-be8a-475936ba39a1\") " pod="openshift-marketplace/redhat-operators-9l8mm" Dec 02 08:47:21 crc kubenswrapper[4895]: I1202 08:47:21.893834 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kc7ww\" (UniqueName: \"kubernetes.io/projected/a4f43124-15db-4bfb-be8a-475936ba39a1-kube-api-access-kc7ww\") pod \"redhat-operators-9l8mm\" (UID: \"a4f43124-15db-4bfb-be8a-475936ba39a1\") " pod="openshift-marketplace/redhat-operators-9l8mm" Dec 02 08:47:21 crc kubenswrapper[4895]: I1202 08:47:21.994789 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kc7ww\" (UniqueName: \"kubernetes.io/projected/a4f43124-15db-4bfb-be8a-475936ba39a1-kube-api-access-kc7ww\") pod \"redhat-operators-9l8mm\" (UID: \"a4f43124-15db-4bfb-be8a-475936ba39a1\") " pod="openshift-marketplace/redhat-operators-9l8mm" Dec 02 08:47:21 crc kubenswrapper[4895]: I1202 08:47:21.995110 4895 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4f43124-15db-4bfb-be8a-475936ba39a1-catalog-content\") pod \"redhat-operators-9l8mm\" (UID: \"a4f43124-15db-4bfb-be8a-475936ba39a1\") " pod="openshift-marketplace/redhat-operators-9l8mm" Dec 02 08:47:21 crc kubenswrapper[4895]: I1202 08:47:21.995251 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4f43124-15db-4bfb-be8a-475936ba39a1-utilities\") pod \"redhat-operators-9l8mm\" (UID: \"a4f43124-15db-4bfb-be8a-475936ba39a1\") " pod="openshift-marketplace/redhat-operators-9l8mm" Dec 02 08:47:21 crc kubenswrapper[4895]: I1202 08:47:21.995720 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4f43124-15db-4bfb-be8a-475936ba39a1-catalog-content\") pod \"redhat-operators-9l8mm\" (UID: \"a4f43124-15db-4bfb-be8a-475936ba39a1\") " pod="openshift-marketplace/redhat-operators-9l8mm" Dec 02 08:47:21 crc kubenswrapper[4895]: I1202 08:47:21.995757 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4f43124-15db-4bfb-be8a-475936ba39a1-utilities\") pod \"redhat-operators-9l8mm\" (UID: \"a4f43124-15db-4bfb-be8a-475936ba39a1\") " pod="openshift-marketplace/redhat-operators-9l8mm" Dec 02 08:47:22 crc kubenswrapper[4895]: I1202 08:47:22.017195 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kc7ww\" (UniqueName: \"kubernetes.io/projected/a4f43124-15db-4bfb-be8a-475936ba39a1-kube-api-access-kc7ww\") pod \"redhat-operators-9l8mm\" (UID: \"a4f43124-15db-4bfb-be8a-475936ba39a1\") " pod="openshift-marketplace/redhat-operators-9l8mm" Dec 02 08:47:22 crc kubenswrapper[4895]: I1202 08:47:22.092925 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9l8mm" Dec 02 08:47:22 crc kubenswrapper[4895]: I1202 08:47:22.612404 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9l8mm"] Dec 02 08:47:23 crc kubenswrapper[4895]: I1202 08:47:23.387753 4895 generic.go:334] "Generic (PLEG): container finished" podID="a4f43124-15db-4bfb-be8a-475936ba39a1" containerID="8f48f67b7becbb94c42914ab238ed50ffb53da2f52e4e8aef595e8d5ace500b1" exitCode=0 Dec 02 08:47:23 crc kubenswrapper[4895]: I1202 08:47:23.387852 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9l8mm" event={"ID":"a4f43124-15db-4bfb-be8a-475936ba39a1","Type":"ContainerDied","Data":"8f48f67b7becbb94c42914ab238ed50ffb53da2f52e4e8aef595e8d5ace500b1"} Dec 02 08:47:23 crc kubenswrapper[4895]: I1202 08:47:23.388038 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9l8mm" event={"ID":"a4f43124-15db-4bfb-be8a-475936ba39a1","Type":"ContainerStarted","Data":"c775ef84ad0ef134f4ecd7467db8bbf8bc967da0773284480b33691d21f173f4"} Dec 02 08:47:23 crc kubenswrapper[4895]: I1202 08:47:23.389773 4895 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 02 08:47:25 crc kubenswrapper[4895]: I1202 08:47:25.413218 4895 generic.go:334] "Generic (PLEG): container finished" podID="a4f43124-15db-4bfb-be8a-475936ba39a1" containerID="64a4bcd26bdd551824767b05e19e4abd74fd10feb7812c1718d95d7570612d6e" exitCode=0 Dec 02 08:47:25 crc kubenswrapper[4895]: I1202 08:47:25.413432 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9l8mm" event={"ID":"a4f43124-15db-4bfb-be8a-475936ba39a1","Type":"ContainerDied","Data":"64a4bcd26bdd551824767b05e19e4abd74fd10feb7812c1718d95d7570612d6e"} Dec 02 08:47:27 crc kubenswrapper[4895]: I1202 08:47:27.435726 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-9l8mm" event={"ID":"a4f43124-15db-4bfb-be8a-475936ba39a1","Type":"ContainerStarted","Data":"a8062f8e54ef37ca5b662528ddc3e1ede1736f1d4a39d66c73d08af39baaa73b"} Dec 02 08:47:27 crc kubenswrapper[4895]: I1202 08:47:27.472445 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-9l8mm" podStartSLOduration=3.065852546 podStartE2EDuration="6.472427412s" podCreationTimestamp="2025-12-02 08:47:21 +0000 UTC" firstStartedPulling="2025-12-02 08:47:23.38947311 +0000 UTC m=+5054.560332743" lastFinishedPulling="2025-12-02 08:47:26.796047956 +0000 UTC m=+5057.966907609" observedRunningTime="2025-12-02 08:47:27.469318165 +0000 UTC m=+5058.640177778" watchObservedRunningTime="2025-12-02 08:47:27.472427412 +0000 UTC m=+5058.643287025" Dec 02 08:47:32 crc kubenswrapper[4895]: I1202 08:47:32.094163 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-9l8mm" Dec 02 08:47:32 crc kubenswrapper[4895]: I1202 08:47:32.094504 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-9l8mm" Dec 02 08:47:33 crc kubenswrapper[4895]: I1202 08:47:33.137291 4895 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-9l8mm" podUID="a4f43124-15db-4bfb-be8a-475936ba39a1" containerName="registry-server" probeResult="failure" output=< Dec 02 08:47:33 crc kubenswrapper[4895]: timeout: failed to connect service ":50051" within 1s Dec 02 08:47:33 crc kubenswrapper[4895]: > Dec 02 08:47:33 crc kubenswrapper[4895]: I1202 08:47:33.141124 4895 scope.go:117] "RemoveContainer" containerID="5f3bd2c356ea4faeb32e1c7dba83504ac381c8d421d130309e1326c5000e3875" Dec 02 08:47:33 crc kubenswrapper[4895]: E1202 08:47:33.141471 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 08:47:42 crc kubenswrapper[4895]: I1202 08:47:42.161283 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-9l8mm" Dec 02 08:47:42 crc kubenswrapper[4895]: I1202 08:47:42.209664 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-9l8mm" Dec 02 08:47:42 crc kubenswrapper[4895]: I1202 08:47:42.402467 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9l8mm"] Dec 02 08:47:43 crc kubenswrapper[4895]: I1202 08:47:43.584456 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-9l8mm" podUID="a4f43124-15db-4bfb-be8a-475936ba39a1" containerName="registry-server" containerID="cri-o://a8062f8e54ef37ca5b662528ddc3e1ede1736f1d4a39d66c73d08af39baaa73b" gracePeriod=2 Dec 02 08:47:44 crc kubenswrapper[4895]: I1202 08:47:44.007212 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9l8mm" Dec 02 08:47:44 crc kubenswrapper[4895]: I1202 08:47:44.042858 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4f43124-15db-4bfb-be8a-475936ba39a1-utilities\") pod \"a4f43124-15db-4bfb-be8a-475936ba39a1\" (UID: \"a4f43124-15db-4bfb-be8a-475936ba39a1\") " Dec 02 08:47:44 crc kubenswrapper[4895]: I1202 08:47:44.042922 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4f43124-15db-4bfb-be8a-475936ba39a1-catalog-content\") pod \"a4f43124-15db-4bfb-be8a-475936ba39a1\" (UID: \"a4f43124-15db-4bfb-be8a-475936ba39a1\") " Dec 02 08:47:44 crc kubenswrapper[4895]: I1202 08:47:44.043007 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kc7ww\" (UniqueName: \"kubernetes.io/projected/a4f43124-15db-4bfb-be8a-475936ba39a1-kube-api-access-kc7ww\") pod \"a4f43124-15db-4bfb-be8a-475936ba39a1\" (UID: \"a4f43124-15db-4bfb-be8a-475936ba39a1\") " Dec 02 08:47:44 crc kubenswrapper[4895]: I1202 08:47:44.043926 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4f43124-15db-4bfb-be8a-475936ba39a1-utilities" (OuterVolumeSpecName: "utilities") pod "a4f43124-15db-4bfb-be8a-475936ba39a1" (UID: "a4f43124-15db-4bfb-be8a-475936ba39a1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:47:44 crc kubenswrapper[4895]: I1202 08:47:44.049998 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4f43124-15db-4bfb-be8a-475936ba39a1-kube-api-access-kc7ww" (OuterVolumeSpecName: "kube-api-access-kc7ww") pod "a4f43124-15db-4bfb-be8a-475936ba39a1" (UID: "a4f43124-15db-4bfb-be8a-475936ba39a1"). InnerVolumeSpecName "kube-api-access-kc7ww". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:47:44 crc kubenswrapper[4895]: I1202 08:47:44.145581 4895 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4f43124-15db-4bfb-be8a-475936ba39a1-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 08:47:44 crc kubenswrapper[4895]: I1202 08:47:44.145998 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kc7ww\" (UniqueName: \"kubernetes.io/projected/a4f43124-15db-4bfb-be8a-475936ba39a1-kube-api-access-kc7ww\") on node \"crc\" DevicePath \"\"" Dec 02 08:47:44 crc kubenswrapper[4895]: I1202 08:47:44.165494 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4f43124-15db-4bfb-be8a-475936ba39a1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a4f43124-15db-4bfb-be8a-475936ba39a1" (UID: "a4f43124-15db-4bfb-be8a-475936ba39a1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:47:44 crc kubenswrapper[4895]: I1202 08:47:44.248608 4895 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4f43124-15db-4bfb-be8a-475936ba39a1-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 08:47:44 crc kubenswrapper[4895]: I1202 08:47:44.598802 4895 generic.go:334] "Generic (PLEG): container finished" podID="a4f43124-15db-4bfb-be8a-475936ba39a1" containerID="a8062f8e54ef37ca5b662528ddc3e1ede1736f1d4a39d66c73d08af39baaa73b" exitCode=0 Dec 02 08:47:44 crc kubenswrapper[4895]: I1202 08:47:44.598902 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9l8mm" event={"ID":"a4f43124-15db-4bfb-be8a-475936ba39a1","Type":"ContainerDied","Data":"a8062f8e54ef37ca5b662528ddc3e1ede1736f1d4a39d66c73d08af39baaa73b"} Dec 02 08:47:44 crc kubenswrapper[4895]: I1202 08:47:44.598973 4895 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-9l8mm" event={"ID":"a4f43124-15db-4bfb-be8a-475936ba39a1","Type":"ContainerDied","Data":"c775ef84ad0ef134f4ecd7467db8bbf8bc967da0773284480b33691d21f173f4"} Dec 02 08:47:44 crc kubenswrapper[4895]: I1202 08:47:44.599030 4895 scope.go:117] "RemoveContainer" containerID="a8062f8e54ef37ca5b662528ddc3e1ede1736f1d4a39d66c73d08af39baaa73b" Dec 02 08:47:44 crc kubenswrapper[4895]: I1202 08:47:44.599991 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9l8mm" Dec 02 08:47:44 crc kubenswrapper[4895]: I1202 08:47:44.636249 4895 scope.go:117] "RemoveContainer" containerID="64a4bcd26bdd551824767b05e19e4abd74fd10feb7812c1718d95d7570612d6e" Dec 02 08:47:44 crc kubenswrapper[4895]: I1202 08:47:44.657415 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9l8mm"] Dec 02 08:47:44 crc kubenswrapper[4895]: I1202 08:47:44.671293 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-9l8mm"] Dec 02 08:47:44 crc kubenswrapper[4895]: I1202 08:47:44.676223 4895 scope.go:117] "RemoveContainer" containerID="8f48f67b7becbb94c42914ab238ed50ffb53da2f52e4e8aef595e8d5ace500b1" Dec 02 08:47:44 crc kubenswrapper[4895]: I1202 08:47:44.702815 4895 scope.go:117] "RemoveContainer" containerID="a8062f8e54ef37ca5b662528ddc3e1ede1736f1d4a39d66c73d08af39baaa73b" Dec 02 08:47:44 crc kubenswrapper[4895]: E1202 08:47:44.703446 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8062f8e54ef37ca5b662528ddc3e1ede1736f1d4a39d66c73d08af39baaa73b\": container with ID starting with a8062f8e54ef37ca5b662528ddc3e1ede1736f1d4a39d66c73d08af39baaa73b not found: ID does not exist" containerID="a8062f8e54ef37ca5b662528ddc3e1ede1736f1d4a39d66c73d08af39baaa73b" Dec 02 08:47:44 crc kubenswrapper[4895]: I1202 08:47:44.703490 4895 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8062f8e54ef37ca5b662528ddc3e1ede1736f1d4a39d66c73d08af39baaa73b"} err="failed to get container status \"a8062f8e54ef37ca5b662528ddc3e1ede1736f1d4a39d66c73d08af39baaa73b\": rpc error: code = NotFound desc = could not find container \"a8062f8e54ef37ca5b662528ddc3e1ede1736f1d4a39d66c73d08af39baaa73b\": container with ID starting with a8062f8e54ef37ca5b662528ddc3e1ede1736f1d4a39d66c73d08af39baaa73b not found: ID does not exist" Dec 02 08:47:44 crc kubenswrapper[4895]: I1202 08:47:44.703519 4895 scope.go:117] "RemoveContainer" containerID="64a4bcd26bdd551824767b05e19e4abd74fd10feb7812c1718d95d7570612d6e" Dec 02 08:47:44 crc kubenswrapper[4895]: E1202 08:47:44.703972 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64a4bcd26bdd551824767b05e19e4abd74fd10feb7812c1718d95d7570612d6e\": container with ID starting with 64a4bcd26bdd551824767b05e19e4abd74fd10feb7812c1718d95d7570612d6e not found: ID does not exist" containerID="64a4bcd26bdd551824767b05e19e4abd74fd10feb7812c1718d95d7570612d6e" Dec 02 08:47:44 crc kubenswrapper[4895]: I1202 08:47:44.704039 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64a4bcd26bdd551824767b05e19e4abd74fd10feb7812c1718d95d7570612d6e"} err="failed to get container status \"64a4bcd26bdd551824767b05e19e4abd74fd10feb7812c1718d95d7570612d6e\": rpc error: code = NotFound desc = could not find container \"64a4bcd26bdd551824767b05e19e4abd74fd10feb7812c1718d95d7570612d6e\": container with ID starting with 64a4bcd26bdd551824767b05e19e4abd74fd10feb7812c1718d95d7570612d6e not found: ID does not exist" Dec 02 08:47:44 crc kubenswrapper[4895]: I1202 08:47:44.704069 4895 scope.go:117] "RemoveContainer" containerID="8f48f67b7becbb94c42914ab238ed50ffb53da2f52e4e8aef595e8d5ace500b1" Dec 02 08:47:44 crc kubenswrapper[4895]: E1202 
08:47:44.704406 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f48f67b7becbb94c42914ab238ed50ffb53da2f52e4e8aef595e8d5ace500b1\": container with ID starting with 8f48f67b7becbb94c42914ab238ed50ffb53da2f52e4e8aef595e8d5ace500b1 not found: ID does not exist" containerID="8f48f67b7becbb94c42914ab238ed50ffb53da2f52e4e8aef595e8d5ace500b1" Dec 02 08:47:44 crc kubenswrapper[4895]: I1202 08:47:44.704493 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f48f67b7becbb94c42914ab238ed50ffb53da2f52e4e8aef595e8d5ace500b1"} err="failed to get container status \"8f48f67b7becbb94c42914ab238ed50ffb53da2f52e4e8aef595e8d5ace500b1\": rpc error: code = NotFound desc = could not find container \"8f48f67b7becbb94c42914ab238ed50ffb53da2f52e4e8aef595e8d5ace500b1\": container with ID starting with 8f48f67b7becbb94c42914ab238ed50ffb53da2f52e4e8aef595e8d5ace500b1 not found: ID does not exist" Dec 02 08:47:45 crc kubenswrapper[4895]: I1202 08:47:45.150631 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4f43124-15db-4bfb-be8a-475936ba39a1" path="/var/lib/kubelet/pods/a4f43124-15db-4bfb-be8a-475936ba39a1/volumes" Dec 02 08:47:48 crc kubenswrapper[4895]: I1202 08:47:48.141206 4895 scope.go:117] "RemoveContainer" containerID="5f3bd2c356ea4faeb32e1c7dba83504ac381c8d421d130309e1326c5000e3875" Dec 02 08:47:48 crc kubenswrapper[4895]: E1202 08:47:48.141758 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 08:48:01 crc kubenswrapper[4895]: I1202 08:48:01.142759 
4895 scope.go:117] "RemoveContainer" containerID="5f3bd2c356ea4faeb32e1c7dba83504ac381c8d421d130309e1326c5000e3875" Dec 02 08:48:01 crc kubenswrapper[4895]: E1202 08:48:01.143801 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 08:48:12 crc kubenswrapper[4895]: I1202 08:48:12.141317 4895 scope.go:117] "RemoveContainer" containerID="5f3bd2c356ea4faeb32e1c7dba83504ac381c8d421d130309e1326c5000e3875" Dec 02 08:48:12 crc kubenswrapper[4895]: E1202 08:48:12.142288 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 08:48:26 crc kubenswrapper[4895]: I1202 08:48:26.141826 4895 scope.go:117] "RemoveContainer" containerID="5f3bd2c356ea4faeb32e1c7dba83504ac381c8d421d130309e1326c5000e3875" Dec 02 08:48:26 crc kubenswrapper[4895]: E1202 08:48:26.142578 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 08:48:38 crc kubenswrapper[4895]: I1202 
08:48:38.141319 4895 scope.go:117] "RemoveContainer" containerID="5f3bd2c356ea4faeb32e1c7dba83504ac381c8d421d130309e1326c5000e3875" Dec 02 08:48:38 crc kubenswrapper[4895]: E1202 08:48:38.142841 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 08:48:49 crc kubenswrapper[4895]: I1202 08:48:49.146186 4895 scope.go:117] "RemoveContainer" containerID="5f3bd2c356ea4faeb32e1c7dba83504ac381c8d421d130309e1326c5000e3875" Dec 02 08:48:49 crc kubenswrapper[4895]: E1202 08:48:49.147088 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 08:49:03 crc kubenswrapper[4895]: I1202 08:49:03.140622 4895 scope.go:117] "RemoveContainer" containerID="5f3bd2c356ea4faeb32e1c7dba83504ac381c8d421d130309e1326c5000e3875" Dec 02 08:49:03 crc kubenswrapper[4895]: E1202 08:49:03.142489 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 08:49:18 crc 
kubenswrapper[4895]: I1202 08:49:18.141297 4895 scope.go:117] "RemoveContainer" containerID="5f3bd2c356ea4faeb32e1c7dba83504ac381c8d421d130309e1326c5000e3875" Dec 02 08:49:18 crc kubenswrapper[4895]: E1202 08:49:18.142177 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 08:49:29 crc kubenswrapper[4895]: I1202 08:49:29.082945 4895 scope.go:117] "RemoveContainer" containerID="b0a958345e4af3f2d1e5d20fc711fbdf4183afd4ada1b51690aba8078f78c414" Dec 02 08:49:29 crc kubenswrapper[4895]: I1202 08:49:29.107595 4895 scope.go:117] "RemoveContainer" containerID="352f5de4e4d4933767fcc536b93ce8208cf14541ac655d13aaafef6f6af0b208" Dec 02 08:49:30 crc kubenswrapper[4895]: I1202 08:49:30.141265 4895 scope.go:117] "RemoveContainer" containerID="5f3bd2c356ea4faeb32e1c7dba83504ac381c8d421d130309e1326c5000e3875" Dec 02 08:49:30 crc kubenswrapper[4895]: E1202 08:49:30.141857 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 08:49:44 crc kubenswrapper[4895]: I1202 08:49:44.141017 4895 scope.go:117] "RemoveContainer" containerID="5f3bd2c356ea4faeb32e1c7dba83504ac381c8d421d130309e1326c5000e3875" Dec 02 08:49:44 crc kubenswrapper[4895]: E1202 08:49:44.141881 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 08:49:55 crc kubenswrapper[4895]: I1202 08:49:55.140899 4895 scope.go:117] "RemoveContainer" containerID="5f3bd2c356ea4faeb32e1c7dba83504ac381c8d421d130309e1326c5000e3875" Dec 02 08:49:55 crc kubenswrapper[4895]: E1202 08:49:55.141835 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 08:50:09 crc kubenswrapper[4895]: I1202 08:50:09.146572 4895 scope.go:117] "RemoveContainer" containerID="5f3bd2c356ea4faeb32e1c7dba83504ac381c8d421d130309e1326c5000e3875" Dec 02 08:50:09 crc kubenswrapper[4895]: E1202 08:50:09.147334 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 08:50:09 crc kubenswrapper[4895]: I1202 08:50:09.492952 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-copy-data"] Dec 02 08:50:09 crc kubenswrapper[4895]: E1202 08:50:09.493727 4895 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="a4f43124-15db-4bfb-be8a-475936ba39a1" containerName="extract-utilities" Dec 02 08:50:09 crc kubenswrapper[4895]: I1202 08:50:09.493800 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4f43124-15db-4bfb-be8a-475936ba39a1" containerName="extract-utilities" Dec 02 08:50:09 crc kubenswrapper[4895]: E1202 08:50:09.493831 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4f43124-15db-4bfb-be8a-475936ba39a1" containerName="extract-content" Dec 02 08:50:09 crc kubenswrapper[4895]: I1202 08:50:09.493840 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4f43124-15db-4bfb-be8a-475936ba39a1" containerName="extract-content" Dec 02 08:50:09 crc kubenswrapper[4895]: E1202 08:50:09.493863 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4f43124-15db-4bfb-be8a-475936ba39a1" containerName="registry-server" Dec 02 08:50:09 crc kubenswrapper[4895]: I1202 08:50:09.493872 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4f43124-15db-4bfb-be8a-475936ba39a1" containerName="registry-server" Dec 02 08:50:09 crc kubenswrapper[4895]: I1202 08:50:09.494071 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4f43124-15db-4bfb-be8a-475936ba39a1" containerName="registry-server" Dec 02 08:50:09 crc kubenswrapper[4895]: I1202 08:50:09.494688 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-copy-data" Dec 02 08:50:09 crc kubenswrapper[4895]: I1202 08:50:09.496670 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-djw6d" Dec 02 08:50:09 crc kubenswrapper[4895]: I1202 08:50:09.499647 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-copy-data"] Dec 02 08:50:09 crc kubenswrapper[4895]: I1202 08:50:09.622769 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjg7c\" (UniqueName: \"kubernetes.io/projected/59279a72-fa91-4e40-ac44-f52fa931e496-kube-api-access-qjg7c\") pod \"mariadb-copy-data\" (UID: \"59279a72-fa91-4e40-ac44-f52fa931e496\") " pod="openstack/mariadb-copy-data" Dec 02 08:50:09 crc kubenswrapper[4895]: I1202 08:50:09.622875 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-50e93f03-cf63-4f19-9375-5c838d5fbb9a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-50e93f03-cf63-4f19-9375-5c838d5fbb9a\") pod \"mariadb-copy-data\" (UID: \"59279a72-fa91-4e40-ac44-f52fa931e496\") " pod="openstack/mariadb-copy-data" Dec 02 08:50:09 crc kubenswrapper[4895]: I1202 08:50:09.725016 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjg7c\" (UniqueName: \"kubernetes.io/projected/59279a72-fa91-4e40-ac44-f52fa931e496-kube-api-access-qjg7c\") pod \"mariadb-copy-data\" (UID: \"59279a72-fa91-4e40-ac44-f52fa931e496\") " pod="openstack/mariadb-copy-data" Dec 02 08:50:09 crc kubenswrapper[4895]: I1202 08:50:09.725123 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-50e93f03-cf63-4f19-9375-5c838d5fbb9a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-50e93f03-cf63-4f19-9375-5c838d5fbb9a\") pod \"mariadb-copy-data\" (UID: \"59279a72-fa91-4e40-ac44-f52fa931e496\") " pod="openstack/mariadb-copy-data" 
Dec 02 08:50:09 crc kubenswrapper[4895]: I1202 08:50:09.730762 4895 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 02 08:50:09 crc kubenswrapper[4895]: I1202 08:50:09.730811 4895 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-50e93f03-cf63-4f19-9375-5c838d5fbb9a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-50e93f03-cf63-4f19-9375-5c838d5fbb9a\") pod \"mariadb-copy-data\" (UID: \"59279a72-fa91-4e40-ac44-f52fa931e496\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/8409c4c9aeca49d5d11d958d1d107e63ea7aa7181cf4490e9d396893f3c45396/globalmount\"" pod="openstack/mariadb-copy-data" Dec 02 08:50:09 crc kubenswrapper[4895]: I1202 08:50:09.753956 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjg7c\" (UniqueName: \"kubernetes.io/projected/59279a72-fa91-4e40-ac44-f52fa931e496-kube-api-access-qjg7c\") pod \"mariadb-copy-data\" (UID: \"59279a72-fa91-4e40-ac44-f52fa931e496\") " pod="openstack/mariadb-copy-data" Dec 02 08:50:09 crc kubenswrapper[4895]: I1202 08:50:09.760790 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-50e93f03-cf63-4f19-9375-5c838d5fbb9a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-50e93f03-cf63-4f19-9375-5c838d5fbb9a\") pod \"mariadb-copy-data\" (UID: \"59279a72-fa91-4e40-ac44-f52fa931e496\") " pod="openstack/mariadb-copy-data" Dec 02 08:50:09 crc kubenswrapper[4895]: I1202 08:50:09.829168 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-copy-data" Dec 02 08:50:10 crc kubenswrapper[4895]: I1202 08:50:10.341206 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-copy-data"] Dec 02 08:50:10 crc kubenswrapper[4895]: I1202 08:50:10.822633 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"59279a72-fa91-4e40-ac44-f52fa931e496","Type":"ContainerStarted","Data":"582d8667705abd9bc581afdc5c507c8174f696d78d92df0367b69d8228c50e9a"} Dec 02 08:50:10 crc kubenswrapper[4895]: I1202 08:50:10.823188 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"59279a72-fa91-4e40-ac44-f52fa931e496","Type":"ContainerStarted","Data":"aa26abe2c5c22a9e93dd5a9fdf7dd3e27580fdab941ee210dd81f22d731378f0"} Dec 02 08:50:10 crc kubenswrapper[4895]: I1202 08:50:10.839970 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-copy-data" podStartSLOduration=2.839934289 podStartE2EDuration="2.839934289s" podCreationTimestamp="2025-12-02 08:50:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:50:10.838891577 +0000 UTC m=+5222.009751220" watchObservedRunningTime="2025-12-02 08:50:10.839934289 +0000 UTC m=+5222.010793922" Dec 02 08:50:13 crc kubenswrapper[4895]: I1202 08:50:13.842122 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"] Dec 02 08:50:13 crc kubenswrapper[4895]: I1202 08:50:13.845178 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Dec 02 08:50:13 crc kubenswrapper[4895]: I1202 08:50:13.852276 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Dec 02 08:50:14 crc kubenswrapper[4895]: I1202 08:50:14.007770 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7zl5\" (UniqueName: \"kubernetes.io/projected/e188bd25-764e-49b9-b9e8-f39a3bb5c7c2-kube-api-access-x7zl5\") pod \"mariadb-client\" (UID: \"e188bd25-764e-49b9-b9e8-f39a3bb5c7c2\") " pod="openstack/mariadb-client" Dec 02 08:50:14 crc kubenswrapper[4895]: I1202 08:50:14.109299 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7zl5\" (UniqueName: \"kubernetes.io/projected/e188bd25-764e-49b9-b9e8-f39a3bb5c7c2-kube-api-access-x7zl5\") pod \"mariadb-client\" (UID: \"e188bd25-764e-49b9-b9e8-f39a3bb5c7c2\") " pod="openstack/mariadb-client" Dec 02 08:50:14 crc kubenswrapper[4895]: I1202 08:50:14.129699 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7zl5\" (UniqueName: \"kubernetes.io/projected/e188bd25-764e-49b9-b9e8-f39a3bb5c7c2-kube-api-access-x7zl5\") pod \"mariadb-client\" (UID: \"e188bd25-764e-49b9-b9e8-f39a3bb5c7c2\") " pod="openstack/mariadb-client" Dec 02 08:50:14 crc kubenswrapper[4895]: I1202 08:50:14.171017 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Dec 02 08:50:14 crc kubenswrapper[4895]: I1202 08:50:14.597343 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Dec 02 08:50:14 crc kubenswrapper[4895]: W1202 08:50:14.601824 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode188bd25_764e_49b9_b9e8_f39a3bb5c7c2.slice/crio-2f516c57225e8e65760cbdb369e4a16e0d13a4db1318101cac0840f2d1371092 WatchSource:0}: Error finding container 2f516c57225e8e65760cbdb369e4a16e0d13a4db1318101cac0840f2d1371092: Status 404 returned error can't find the container with id 2f516c57225e8e65760cbdb369e4a16e0d13a4db1318101cac0840f2d1371092 Dec 02 08:50:14 crc kubenswrapper[4895]: I1202 08:50:14.859308 4895 generic.go:334] "Generic (PLEG): container finished" podID="e188bd25-764e-49b9-b9e8-f39a3bb5c7c2" containerID="60d04769a21fc7f67db7408c348add2dd04715b258ae02675f19d22715307928" exitCode=0 Dec 02 08:50:14 crc kubenswrapper[4895]: I1202 08:50:14.859357 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"e188bd25-764e-49b9-b9e8-f39a3bb5c7c2","Type":"ContainerDied","Data":"60d04769a21fc7f67db7408c348add2dd04715b258ae02675f19d22715307928"} Dec 02 08:50:14 crc kubenswrapper[4895]: I1202 08:50:14.859817 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"e188bd25-764e-49b9-b9e8-f39a3bb5c7c2","Type":"ContainerStarted","Data":"2f516c57225e8e65760cbdb369e4a16e0d13a4db1318101cac0840f2d1371092"} Dec 02 08:50:16 crc kubenswrapper[4895]: I1202 08:50:16.201083 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Dec 02 08:50:16 crc kubenswrapper[4895]: I1202 08:50:16.231465 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client_e188bd25-764e-49b9-b9e8-f39a3bb5c7c2/mariadb-client/0.log" Dec 02 08:50:16 crc kubenswrapper[4895]: I1202 08:50:16.269490 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Dec 02 08:50:16 crc kubenswrapper[4895]: I1202 08:50:16.277212 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"] Dec 02 08:50:16 crc kubenswrapper[4895]: I1202 08:50:16.345169 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zl5\" (UniqueName: \"kubernetes.io/projected/e188bd25-764e-49b9-b9e8-f39a3bb5c7c2-kube-api-access-x7zl5\") pod \"e188bd25-764e-49b9-b9e8-f39a3bb5c7c2\" (UID: \"e188bd25-764e-49b9-b9e8-f39a3bb5c7c2\") " Dec 02 08:50:16 crc kubenswrapper[4895]: I1202 08:50:16.352912 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e188bd25-764e-49b9-b9e8-f39a3bb5c7c2-kube-api-access-x7zl5" (OuterVolumeSpecName: "kube-api-access-x7zl5") pod "e188bd25-764e-49b9-b9e8-f39a3bb5c7c2" (UID: "e188bd25-764e-49b9-b9e8-f39a3bb5c7c2"). InnerVolumeSpecName "kube-api-access-x7zl5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:50:16 crc kubenswrapper[4895]: I1202 08:50:16.447885 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zl5\" (UniqueName: \"kubernetes.io/projected/e188bd25-764e-49b9-b9e8-f39a3bb5c7c2-kube-api-access-x7zl5\") on node \"crc\" DevicePath \"\"" Dec 02 08:50:16 crc kubenswrapper[4895]: I1202 08:50:16.465813 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"] Dec 02 08:50:16 crc kubenswrapper[4895]: E1202 08:50:16.468617 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e188bd25-764e-49b9-b9e8-f39a3bb5c7c2" containerName="mariadb-client" Dec 02 08:50:16 crc kubenswrapper[4895]: I1202 08:50:16.468645 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="e188bd25-764e-49b9-b9e8-f39a3bb5c7c2" containerName="mariadb-client" Dec 02 08:50:16 crc kubenswrapper[4895]: I1202 08:50:16.468919 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="e188bd25-764e-49b9-b9e8-f39a3bb5c7c2" containerName="mariadb-client" Dec 02 08:50:16 crc kubenswrapper[4895]: I1202 08:50:16.469877 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Dec 02 08:50:16 crc kubenswrapper[4895]: I1202 08:50:16.479094 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Dec 02 08:50:16 crc kubenswrapper[4895]: I1202 08:50:16.549476 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9pg8d\" (UniqueName: \"kubernetes.io/projected/25f7428e-7530-4407-ba4c-eaf0fd7fbfa4-kube-api-access-9pg8d\") pod \"mariadb-client\" (UID: \"25f7428e-7530-4407-ba4c-eaf0fd7fbfa4\") " pod="openstack/mariadb-client" Dec 02 08:50:16 crc kubenswrapper[4895]: I1202 08:50:16.650583 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9pg8d\" (UniqueName: \"kubernetes.io/projected/25f7428e-7530-4407-ba4c-eaf0fd7fbfa4-kube-api-access-9pg8d\") pod \"mariadb-client\" (UID: \"25f7428e-7530-4407-ba4c-eaf0fd7fbfa4\") " pod="openstack/mariadb-client" Dec 02 08:50:16 crc kubenswrapper[4895]: I1202 08:50:16.668950 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9pg8d\" (UniqueName: \"kubernetes.io/projected/25f7428e-7530-4407-ba4c-eaf0fd7fbfa4-kube-api-access-9pg8d\") pod \"mariadb-client\" (UID: \"25f7428e-7530-4407-ba4c-eaf0fd7fbfa4\") " pod="openstack/mariadb-client" Dec 02 08:50:16 crc kubenswrapper[4895]: I1202 08:50:16.792916 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Dec 02 08:50:16 crc kubenswrapper[4895]: I1202 08:50:16.879752 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2f516c57225e8e65760cbdb369e4a16e0d13a4db1318101cac0840f2d1371092" Dec 02 08:50:16 crc kubenswrapper[4895]: I1202 08:50:16.879823 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Dec 02 08:50:16 crc kubenswrapper[4895]: I1202 08:50:16.899474 4895 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/mariadb-client" oldPodUID="e188bd25-764e-49b9-b9e8-f39a3bb5c7c2" podUID="25f7428e-7530-4407-ba4c-eaf0fd7fbfa4" Dec 02 08:50:17 crc kubenswrapper[4895]: I1202 08:50:17.154071 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e188bd25-764e-49b9-b9e8-f39a3bb5c7c2" path="/var/lib/kubelet/pods/e188bd25-764e-49b9-b9e8-f39a3bb5c7c2/volumes" Dec 02 08:50:17 crc kubenswrapper[4895]: I1202 08:50:17.243406 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Dec 02 08:50:17 crc kubenswrapper[4895]: W1202 08:50:17.248557 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod25f7428e_7530_4407_ba4c_eaf0fd7fbfa4.slice/crio-404aea4e915f37a32dfee432198e6a4e179cea2c2fb48aa9f06e52fb8acfdab3 WatchSource:0}: Error finding container 404aea4e915f37a32dfee432198e6a4e179cea2c2fb48aa9f06e52fb8acfdab3: Status 404 returned error can't find the container with id 404aea4e915f37a32dfee432198e6a4e179cea2c2fb48aa9f06e52fb8acfdab3 Dec 02 08:50:17 crc kubenswrapper[4895]: I1202 08:50:17.892861 4895 generic.go:334] "Generic (PLEG): container finished" podID="25f7428e-7530-4407-ba4c-eaf0fd7fbfa4" containerID="681615d548d5595f7ec8bb18dd9cab1a7c2d65ef97f90121bfe2acdef2bf07b9" exitCode=0 Dec 02 08:50:17 crc kubenswrapper[4895]: I1202 08:50:17.892936 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"25f7428e-7530-4407-ba4c-eaf0fd7fbfa4","Type":"ContainerDied","Data":"681615d548d5595f7ec8bb18dd9cab1a7c2d65ef97f90121bfe2acdef2bf07b9"} Dec 02 08:50:17 crc kubenswrapper[4895]: I1202 08:50:17.892989 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" 
event={"ID":"25f7428e-7530-4407-ba4c-eaf0fd7fbfa4","Type":"ContainerStarted","Data":"404aea4e915f37a32dfee432198e6a4e179cea2c2fb48aa9f06e52fb8acfdab3"} Dec 02 08:50:19 crc kubenswrapper[4895]: I1202 08:50:19.250076 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Dec 02 08:50:19 crc kubenswrapper[4895]: I1202 08:50:19.282508 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client_25f7428e-7530-4407-ba4c-eaf0fd7fbfa4/mariadb-client/0.log" Dec 02 08:50:19 crc kubenswrapper[4895]: I1202 08:50:19.294713 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9pg8d\" (UniqueName: \"kubernetes.io/projected/25f7428e-7530-4407-ba4c-eaf0fd7fbfa4-kube-api-access-9pg8d\") pod \"25f7428e-7530-4407-ba4c-eaf0fd7fbfa4\" (UID: \"25f7428e-7530-4407-ba4c-eaf0fd7fbfa4\") " Dec 02 08:50:19 crc kubenswrapper[4895]: I1202 08:50:19.307576 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25f7428e-7530-4407-ba4c-eaf0fd7fbfa4-kube-api-access-9pg8d" (OuterVolumeSpecName: "kube-api-access-9pg8d") pod "25f7428e-7530-4407-ba4c-eaf0fd7fbfa4" (UID: "25f7428e-7530-4407-ba4c-eaf0fd7fbfa4"). InnerVolumeSpecName "kube-api-access-9pg8d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:50:19 crc kubenswrapper[4895]: I1202 08:50:19.320938 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Dec 02 08:50:19 crc kubenswrapper[4895]: I1202 08:50:19.335194 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"] Dec 02 08:50:19 crc kubenswrapper[4895]: I1202 08:50:19.396941 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9pg8d\" (UniqueName: \"kubernetes.io/projected/25f7428e-7530-4407-ba4c-eaf0fd7fbfa4-kube-api-access-9pg8d\") on node \"crc\" DevicePath \"\"" Dec 02 08:50:19 crc kubenswrapper[4895]: I1202 08:50:19.932309 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="404aea4e915f37a32dfee432198e6a4e179cea2c2fb48aa9f06e52fb8acfdab3" Dec 02 08:50:19 crc kubenswrapper[4895]: I1202 08:50:19.932369 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Dec 02 08:50:21 crc kubenswrapper[4895]: I1202 08:50:21.155439 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25f7428e-7530-4407-ba4c-eaf0fd7fbfa4" path="/var/lib/kubelet/pods/25f7428e-7530-4407-ba4c-eaf0fd7fbfa4/volumes" Dec 02 08:50:24 crc kubenswrapper[4895]: I1202 08:50:24.141814 4895 scope.go:117] "RemoveContainer" containerID="5f3bd2c356ea4faeb32e1c7dba83504ac381c8d421d130309e1326c5000e3875" Dec 02 08:50:24 crc kubenswrapper[4895]: E1202 08:50:24.142396 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 08:50:38 crc kubenswrapper[4895]: 
I1202 08:50:38.036566 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-sbmzx"] Dec 02 08:50:38 crc kubenswrapper[4895]: E1202 08:50:38.038434 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25f7428e-7530-4407-ba4c-eaf0fd7fbfa4" containerName="mariadb-client" Dec 02 08:50:38 crc kubenswrapper[4895]: I1202 08:50:38.038464 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="25f7428e-7530-4407-ba4c-eaf0fd7fbfa4" containerName="mariadb-client" Dec 02 08:50:38 crc kubenswrapper[4895]: I1202 08:50:38.038825 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="25f7428e-7530-4407-ba4c-eaf0fd7fbfa4" containerName="mariadb-client" Dec 02 08:50:38 crc kubenswrapper[4895]: I1202 08:50:38.041401 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sbmzx" Dec 02 08:50:38 crc kubenswrapper[4895]: I1202 08:50:38.057941 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sbmzx"] Dec 02 08:50:38 crc kubenswrapper[4895]: I1202 08:50:38.058960 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd110df3-0ac9-4e17-ad1c-881e6a5c9565-utilities\") pod \"community-operators-sbmzx\" (UID: \"dd110df3-0ac9-4e17-ad1c-881e6a5c9565\") " pod="openshift-marketplace/community-operators-sbmzx" Dec 02 08:50:38 crc kubenswrapper[4895]: I1202 08:50:38.059081 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd110df3-0ac9-4e17-ad1c-881e6a5c9565-catalog-content\") pod \"community-operators-sbmzx\" (UID: \"dd110df3-0ac9-4e17-ad1c-881e6a5c9565\") " pod="openshift-marketplace/community-operators-sbmzx" Dec 02 08:50:38 crc kubenswrapper[4895]: I1202 08:50:38.059168 4895 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2x9hq\" (UniqueName: \"kubernetes.io/projected/dd110df3-0ac9-4e17-ad1c-881e6a5c9565-kube-api-access-2x9hq\") pod \"community-operators-sbmzx\" (UID: \"dd110df3-0ac9-4e17-ad1c-881e6a5c9565\") " pod="openshift-marketplace/community-operators-sbmzx" Dec 02 08:50:38 crc kubenswrapper[4895]: I1202 08:50:38.160881 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2x9hq\" (UniqueName: \"kubernetes.io/projected/dd110df3-0ac9-4e17-ad1c-881e6a5c9565-kube-api-access-2x9hq\") pod \"community-operators-sbmzx\" (UID: \"dd110df3-0ac9-4e17-ad1c-881e6a5c9565\") " pod="openshift-marketplace/community-operators-sbmzx" Dec 02 08:50:38 crc kubenswrapper[4895]: I1202 08:50:38.160983 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd110df3-0ac9-4e17-ad1c-881e6a5c9565-utilities\") pod \"community-operators-sbmzx\" (UID: \"dd110df3-0ac9-4e17-ad1c-881e6a5c9565\") " pod="openshift-marketplace/community-operators-sbmzx" Dec 02 08:50:38 crc kubenswrapper[4895]: I1202 08:50:38.161050 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd110df3-0ac9-4e17-ad1c-881e6a5c9565-catalog-content\") pod \"community-operators-sbmzx\" (UID: \"dd110df3-0ac9-4e17-ad1c-881e6a5c9565\") " pod="openshift-marketplace/community-operators-sbmzx" Dec 02 08:50:38 crc kubenswrapper[4895]: I1202 08:50:38.161636 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd110df3-0ac9-4e17-ad1c-881e6a5c9565-utilities\") pod \"community-operators-sbmzx\" (UID: \"dd110df3-0ac9-4e17-ad1c-881e6a5c9565\") " pod="openshift-marketplace/community-operators-sbmzx" Dec 02 08:50:38 crc kubenswrapper[4895]: I1202 08:50:38.161709 4895 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd110df3-0ac9-4e17-ad1c-881e6a5c9565-catalog-content\") pod \"community-operators-sbmzx\" (UID: \"dd110df3-0ac9-4e17-ad1c-881e6a5c9565\") " pod="openshift-marketplace/community-operators-sbmzx" Dec 02 08:50:38 crc kubenswrapper[4895]: I1202 08:50:38.182820 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2x9hq\" (UniqueName: \"kubernetes.io/projected/dd110df3-0ac9-4e17-ad1c-881e6a5c9565-kube-api-access-2x9hq\") pod \"community-operators-sbmzx\" (UID: \"dd110df3-0ac9-4e17-ad1c-881e6a5c9565\") " pod="openshift-marketplace/community-operators-sbmzx" Dec 02 08:50:38 crc kubenswrapper[4895]: I1202 08:50:38.381607 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sbmzx" Dec 02 08:50:38 crc kubenswrapper[4895]: I1202 08:50:38.914069 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sbmzx"] Dec 02 08:50:39 crc kubenswrapper[4895]: I1202 08:50:39.147041 4895 scope.go:117] "RemoveContainer" containerID="5f3bd2c356ea4faeb32e1c7dba83504ac381c8d421d130309e1326c5000e3875" Dec 02 08:50:39 crc kubenswrapper[4895]: E1202 08:50:39.147667 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 08:50:39 crc kubenswrapper[4895]: I1202 08:50:39.161246 4895 generic.go:334] "Generic (PLEG): container finished" podID="dd110df3-0ac9-4e17-ad1c-881e6a5c9565" 
containerID="66f37d2552e5d4750493a2502ba09cfeab0a9c586edc0eee04958f10138ca341" exitCode=0 Dec 02 08:50:39 crc kubenswrapper[4895]: I1202 08:50:39.162264 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sbmzx" event={"ID":"dd110df3-0ac9-4e17-ad1c-881e6a5c9565","Type":"ContainerDied","Data":"66f37d2552e5d4750493a2502ba09cfeab0a9c586edc0eee04958f10138ca341"} Dec 02 08:50:39 crc kubenswrapper[4895]: I1202 08:50:39.162347 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sbmzx" event={"ID":"dd110df3-0ac9-4e17-ad1c-881e6a5c9565","Type":"ContainerStarted","Data":"6ef0b0ee931a2e291b5c6af6bd7f249f61bc57b05c943611ae809d4f2622c5fa"} Dec 02 08:50:40 crc kubenswrapper[4895]: I1202 08:50:40.170519 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sbmzx" event={"ID":"dd110df3-0ac9-4e17-ad1c-881e6a5c9565","Type":"ContainerStarted","Data":"61840b26958b44597fcf8b81e40ad0f9f5fe559467d3bf5545c2f7571f211ba4"} Dec 02 08:50:41 crc kubenswrapper[4895]: I1202 08:50:41.182321 4895 generic.go:334] "Generic (PLEG): container finished" podID="dd110df3-0ac9-4e17-ad1c-881e6a5c9565" containerID="61840b26958b44597fcf8b81e40ad0f9f5fe559467d3bf5545c2f7571f211ba4" exitCode=0 Dec 02 08:50:41 crc kubenswrapper[4895]: I1202 08:50:41.182466 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sbmzx" event={"ID":"dd110df3-0ac9-4e17-ad1c-881e6a5c9565","Type":"ContainerDied","Data":"61840b26958b44597fcf8b81e40ad0f9f5fe559467d3bf5545c2f7571f211ba4"} Dec 02 08:50:42 crc kubenswrapper[4895]: I1202 08:50:42.193722 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sbmzx" event={"ID":"dd110df3-0ac9-4e17-ad1c-881e6a5c9565","Type":"ContainerStarted","Data":"1a0cf4e0da3d016b4396e748d41771890ac9dec8dfda8866aa9b988dc7bb68c1"} Dec 02 08:50:42 crc 
kubenswrapper[4895]: I1202 08:50:42.226606 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-sbmzx" podStartSLOduration=2.6957745810000002 podStartE2EDuration="5.226573655s" podCreationTimestamp="2025-12-02 08:50:37 +0000 UTC" firstStartedPulling="2025-12-02 08:50:39.163573494 +0000 UTC m=+5250.334433117" lastFinishedPulling="2025-12-02 08:50:41.694372578 +0000 UTC m=+5252.865232191" observedRunningTime="2025-12-02 08:50:42.215506661 +0000 UTC m=+5253.386366284" watchObservedRunningTime="2025-12-02 08:50:42.226573655 +0000 UTC m=+5253.397433358" Dec 02 08:50:48 crc kubenswrapper[4895]: I1202 08:50:48.382374 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-sbmzx" Dec 02 08:50:48 crc kubenswrapper[4895]: I1202 08:50:48.384680 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-sbmzx" Dec 02 08:50:48 crc kubenswrapper[4895]: I1202 08:50:48.447559 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-sbmzx" Dec 02 08:50:49 crc kubenswrapper[4895]: I1202 08:50:49.323138 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-sbmzx" Dec 02 08:50:49 crc kubenswrapper[4895]: I1202 08:50:49.405420 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-sbmzx"] Dec 02 08:50:51 crc kubenswrapper[4895]: I1202 08:50:51.141330 4895 scope.go:117] "RemoveContainer" containerID="5f3bd2c356ea4faeb32e1c7dba83504ac381c8d421d130309e1326c5000e3875" Dec 02 08:50:51 crc kubenswrapper[4895]: E1202 08:50:51.141717 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 08:50:51 crc kubenswrapper[4895]: I1202 08:50:51.287150 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-sbmzx" podUID="dd110df3-0ac9-4e17-ad1c-881e6a5c9565" containerName="registry-server" containerID="cri-o://1a0cf4e0da3d016b4396e748d41771890ac9dec8dfda8866aa9b988dc7bb68c1" gracePeriod=2 Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.269576 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sbmzx" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.300422 4895 generic.go:334] "Generic (PLEG): container finished" podID="dd110df3-0ac9-4e17-ad1c-881e6a5c9565" containerID="1a0cf4e0da3d016b4396e748d41771890ac9dec8dfda8866aa9b988dc7bb68c1" exitCode=0 Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.300504 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sbmzx" event={"ID":"dd110df3-0ac9-4e17-ad1c-881e6a5c9565","Type":"ContainerDied","Data":"1a0cf4e0da3d016b4396e748d41771890ac9dec8dfda8866aa9b988dc7bb68c1"} Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.300570 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sbmzx" event={"ID":"dd110df3-0ac9-4e17-ad1c-881e6a5c9565","Type":"ContainerDied","Data":"6ef0b0ee931a2e291b5c6af6bd7f249f61bc57b05c943611ae809d4f2622c5fa"} Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.300604 4895 scope.go:117] "RemoveContainer" containerID="1a0cf4e0da3d016b4396e748d41771890ac9dec8dfda8866aa9b988dc7bb68c1" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.300654 4895 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sbmzx" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.345204 4895 scope.go:117] "RemoveContainer" containerID="61840b26958b44597fcf8b81e40ad0f9f5fe559467d3bf5545c2f7571f211ba4" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.350668 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd110df3-0ac9-4e17-ad1c-881e6a5c9565-catalog-content\") pod \"dd110df3-0ac9-4e17-ad1c-881e6a5c9565\" (UID: \"dd110df3-0ac9-4e17-ad1c-881e6a5c9565\") " Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.351115 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2x9hq\" (UniqueName: \"kubernetes.io/projected/dd110df3-0ac9-4e17-ad1c-881e6a5c9565-kube-api-access-2x9hq\") pod \"dd110df3-0ac9-4e17-ad1c-881e6a5c9565\" (UID: \"dd110df3-0ac9-4e17-ad1c-881e6a5c9565\") " Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.351226 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd110df3-0ac9-4e17-ad1c-881e6a5c9565-utilities\") pod \"dd110df3-0ac9-4e17-ad1c-881e6a5c9565\" (UID: \"dd110df3-0ac9-4e17-ad1c-881e6a5c9565\") " Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.354112 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd110df3-0ac9-4e17-ad1c-881e6a5c9565-utilities" (OuterVolumeSpecName: "utilities") pod "dd110df3-0ac9-4e17-ad1c-881e6a5c9565" (UID: "dd110df3-0ac9-4e17-ad1c-881e6a5c9565"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.362956 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd110df3-0ac9-4e17-ad1c-881e6a5c9565-kube-api-access-2x9hq" (OuterVolumeSpecName: "kube-api-access-2x9hq") pod "dd110df3-0ac9-4e17-ad1c-881e6a5c9565" (UID: "dd110df3-0ac9-4e17-ad1c-881e6a5c9565"). InnerVolumeSpecName "kube-api-access-2x9hq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.378165 4895 scope.go:117] "RemoveContainer" containerID="66f37d2552e5d4750493a2502ba09cfeab0a9c586edc0eee04958f10138ca341" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.416449 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd110df3-0ac9-4e17-ad1c-881e6a5c9565-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dd110df3-0ac9-4e17-ad1c-881e6a5c9565" (UID: "dd110df3-0ac9-4e17-ad1c-881e6a5c9565"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.454861 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2x9hq\" (UniqueName: \"kubernetes.io/projected/dd110df3-0ac9-4e17-ad1c-881e6a5c9565-kube-api-access-2x9hq\") on node \"crc\" DevicePath \"\"" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.454907 4895 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd110df3-0ac9-4e17-ad1c-881e6a5c9565-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.454921 4895 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd110df3-0ac9-4e17-ad1c-881e6a5c9565-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.480711 4895 scope.go:117] "RemoveContainer" containerID="1a0cf4e0da3d016b4396e748d41771890ac9dec8dfda8866aa9b988dc7bb68c1" Dec 02 08:50:52 crc kubenswrapper[4895]: E1202 08:50:52.481459 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a0cf4e0da3d016b4396e748d41771890ac9dec8dfda8866aa9b988dc7bb68c1\": container with ID starting with 1a0cf4e0da3d016b4396e748d41771890ac9dec8dfda8866aa9b988dc7bb68c1 not found: ID does not exist" containerID="1a0cf4e0da3d016b4396e748d41771890ac9dec8dfda8866aa9b988dc7bb68c1" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.481576 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a0cf4e0da3d016b4396e748d41771890ac9dec8dfda8866aa9b988dc7bb68c1"} err="failed to get container status \"1a0cf4e0da3d016b4396e748d41771890ac9dec8dfda8866aa9b988dc7bb68c1\": rpc error: code = NotFound desc = could not find container \"1a0cf4e0da3d016b4396e748d41771890ac9dec8dfda8866aa9b988dc7bb68c1\": container with ID 
starting with 1a0cf4e0da3d016b4396e748d41771890ac9dec8dfda8866aa9b988dc7bb68c1 not found: ID does not exist" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.481687 4895 scope.go:117] "RemoveContainer" containerID="61840b26958b44597fcf8b81e40ad0f9f5fe559467d3bf5545c2f7571f211ba4" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.482229 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 02 08:50:52 crc kubenswrapper[4895]: E1202 08:50:52.482249 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61840b26958b44597fcf8b81e40ad0f9f5fe559467d3bf5545c2f7571f211ba4\": container with ID starting with 61840b26958b44597fcf8b81e40ad0f9f5fe559467d3bf5545c2f7571f211ba4 not found: ID does not exist" containerID="61840b26958b44597fcf8b81e40ad0f9f5fe559467d3bf5545c2f7571f211ba4" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.482449 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61840b26958b44597fcf8b81e40ad0f9f5fe559467d3bf5545c2f7571f211ba4"} err="failed to get container status \"61840b26958b44597fcf8b81e40ad0f9f5fe559467d3bf5545c2f7571f211ba4\": rpc error: code = NotFound desc = could not find container \"61840b26958b44597fcf8b81e40ad0f9f5fe559467d3bf5545c2f7571f211ba4\": container with ID starting with 61840b26958b44597fcf8b81e40ad0f9f5fe559467d3bf5545c2f7571f211ba4 not found: ID does not exist" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.482491 4895 scope.go:117] "RemoveContainer" containerID="66f37d2552e5d4750493a2502ba09cfeab0a9c586edc0eee04958f10138ca341" Dec 02 08:50:52 crc kubenswrapper[4895]: E1202 08:50:52.482989 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66f37d2552e5d4750493a2502ba09cfeab0a9c586edc0eee04958f10138ca341\": container with ID starting with 
66f37d2552e5d4750493a2502ba09cfeab0a9c586edc0eee04958f10138ca341 not found: ID does not exist" containerID="66f37d2552e5d4750493a2502ba09cfeab0a9c586edc0eee04958f10138ca341" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.483085 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66f37d2552e5d4750493a2502ba09cfeab0a9c586edc0eee04958f10138ca341"} err="failed to get container status \"66f37d2552e5d4750493a2502ba09cfeab0a9c586edc0eee04958f10138ca341\": rpc error: code = NotFound desc = could not find container \"66f37d2552e5d4750493a2502ba09cfeab0a9c586edc0eee04958f10138ca341\": container with ID starting with 66f37d2552e5d4750493a2502ba09cfeab0a9c586edc0eee04958f10138ca341 not found: ID does not exist" Dec 02 08:50:52 crc kubenswrapper[4895]: E1202 08:50:52.483023 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd110df3-0ac9-4e17-ad1c-881e6a5c9565" containerName="registry-server" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.483161 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd110df3-0ac9-4e17-ad1c-881e6a5c9565" containerName="registry-server" Dec 02 08:50:52 crc kubenswrapper[4895]: E1202 08:50:52.483231 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd110df3-0ac9-4e17-ad1c-881e6a5c9565" containerName="extract-utilities" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.483244 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd110df3-0ac9-4e17-ad1c-881e6a5c9565" containerName="extract-utilities" Dec 02 08:50:52 crc kubenswrapper[4895]: E1202 08:50:52.483327 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd110df3-0ac9-4e17-ad1c-881e6a5c9565" containerName="extract-content" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.483342 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd110df3-0ac9-4e17-ad1c-881e6a5c9565" containerName="extract-content" Dec 02 08:50:52 crc 
kubenswrapper[4895]: I1202 08:50:52.483949 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd110df3-0ac9-4e17-ad1c-881e6a5c9565" containerName="registry-server" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.485897 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.491280 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-1"] Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.493290 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-1" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.499373 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.510873 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-rgxj2" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.511195 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.511559 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.515611 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-2"] Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.519025 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-2" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.559601 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-1"] Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.567437 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-2"] Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.637882 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-sbmzx"] Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.644971 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-sbmzx"] Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.659580 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/36385a11-0e9f-41c4-a386-ff3710a53b75-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"36385a11-0e9f-41c4-a386-ff3710a53b75\") " pod="openstack/ovsdbserver-sb-1" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.659647 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-fcdf9c1d-288e-4f5c-93a0-bee9d02069ed\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fcdf9c1d-288e-4f5c-93a0-bee9d02069ed\") pod \"ovsdbserver-sb-1\" (UID: \"36385a11-0e9f-41c4-a386-ff3710a53b75\") " pod="openstack/ovsdbserver-sb-1" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.659672 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhcnk\" (UniqueName: \"kubernetes.io/projected/e09ee80b-8154-4dfb-8dd9-df40a3aded0a-kube-api-access-xhcnk\") pod \"ovsdbserver-sb-0\" (UID: \"e09ee80b-8154-4dfb-8dd9-df40a3aded0a\") " pod="openstack/ovsdbserver-sb-0" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.659700 4895 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/36385a11-0e9f-41c4-a386-ff3710a53b75-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"36385a11-0e9f-41c4-a386-ff3710a53b75\") " pod="openstack/ovsdbserver-sb-1" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.659721 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzpgc\" (UniqueName: \"kubernetes.io/projected/78b393f8-7861-4b27-af3d-4a70cd2afa7e-kube-api-access-mzpgc\") pod \"ovsdbserver-sb-2\" (UID: \"78b393f8-7861-4b27-af3d-4a70cd2afa7e\") " pod="openstack/ovsdbserver-sb-2" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.659796 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36385a11-0e9f-41c4-a386-ff3710a53b75-config\") pod \"ovsdbserver-sb-1\" (UID: \"36385a11-0e9f-41c4-a386-ff3710a53b75\") " pod="openstack/ovsdbserver-sb-1" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.659814 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/78b393f8-7861-4b27-af3d-4a70cd2afa7e-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"78b393f8-7861-4b27-af3d-4a70cd2afa7e\") " pod="openstack/ovsdbserver-sb-2" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.659833 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e09ee80b-8154-4dfb-8dd9-df40a3aded0a-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"e09ee80b-8154-4dfb-8dd9-df40a3aded0a\") " pod="openstack/ovsdbserver-sb-0" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.659855 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"pvc-27324e8d-3d47-413d-881e-e2c37b637a96\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-27324e8d-3d47-413d-881e-e2c37b637a96\") pod \"ovsdbserver-sb-0\" (UID: \"e09ee80b-8154-4dfb-8dd9-df40a3aded0a\") " pod="openstack/ovsdbserver-sb-0" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.660246 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/78b393f8-7861-4b27-af3d-4a70cd2afa7e-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"78b393f8-7861-4b27-af3d-4a70cd2afa7e\") " pod="openstack/ovsdbserver-sb-2" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.660330 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-4eb2098e-8556-4156-a491-9eabc7759f22\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4eb2098e-8556-4156-a491-9eabc7759f22\") pod \"ovsdbserver-sb-2\" (UID: \"78b393f8-7861-4b27-af3d-4a70cd2afa7e\") " pod="openstack/ovsdbserver-sb-2" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.660394 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78b393f8-7861-4b27-af3d-4a70cd2afa7e-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"78b393f8-7861-4b27-af3d-4a70cd2afa7e\") " pod="openstack/ovsdbserver-sb-2" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.660462 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e09ee80b-8154-4dfb-8dd9-df40a3aded0a-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"e09ee80b-8154-4dfb-8dd9-df40a3aded0a\") " pod="openstack/ovsdbserver-sb-0" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.660516 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36385a11-0e9f-41c4-a386-ff3710a53b75-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"36385a11-0e9f-41c4-a386-ff3710a53b75\") " pod="openstack/ovsdbserver-sb-1" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.660587 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e09ee80b-8154-4dfb-8dd9-df40a3aded0a-config\") pod \"ovsdbserver-sb-0\" (UID: \"e09ee80b-8154-4dfb-8dd9-df40a3aded0a\") " pod="openstack/ovsdbserver-sb-0" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.660628 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzlbg\" (UniqueName: \"kubernetes.io/projected/36385a11-0e9f-41c4-a386-ff3710a53b75-kube-api-access-gzlbg\") pod \"ovsdbserver-sb-1\" (UID: \"36385a11-0e9f-41c4-a386-ff3710a53b75\") " pod="openstack/ovsdbserver-sb-1" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.660684 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78b393f8-7861-4b27-af3d-4a70cd2afa7e-config\") pod \"ovsdbserver-sb-2\" (UID: \"78b393f8-7861-4b27-af3d-4a70cd2afa7e\") " pod="openstack/ovsdbserver-sb-2" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.660719 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e09ee80b-8154-4dfb-8dd9-df40a3aded0a-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"e09ee80b-8154-4dfb-8dd9-df40a3aded0a\") " pod="openstack/ovsdbserver-sb-0" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.676393 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.678647 4895 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.682401 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.682769 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-kv4mw" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.685660 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.694248 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.707145 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-1"] Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.708784 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-1" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.718547 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-2"] Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.720838 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-2" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.725770 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-2"] Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.744421 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-1"] Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.762162 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/78b393f8-7861-4b27-af3d-4a70cd2afa7e-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"78b393f8-7861-4b27-af3d-4a70cd2afa7e\") " pod="openstack/ovsdbserver-sb-2" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.762228 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e09ee80b-8154-4dfb-8dd9-df40a3aded0a-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"e09ee80b-8154-4dfb-8dd9-df40a3aded0a\") " pod="openstack/ovsdbserver-sb-0" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.762274 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a799ab29-f667-4d1f-af0f-9d0123379f79-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"a799ab29-f667-4d1f-af0f-9d0123379f79\") " pod="openstack/ovsdbserver-nb-0" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.762324 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-27324e8d-3d47-413d-881e-e2c37b637a96\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-27324e8d-3d47-413d-881e-e2c37b637a96\") pod \"ovsdbserver-sb-0\" (UID: \"e09ee80b-8154-4dfb-8dd9-df40a3aded0a\") " pod="openstack/ovsdbserver-sb-0" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.762390 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/78b393f8-7861-4b27-af3d-4a70cd2afa7e-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"78b393f8-7861-4b27-af3d-4a70cd2afa7e\") " pod="openstack/ovsdbserver-sb-2" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.762432 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-4eb2098e-8556-4156-a491-9eabc7759f22\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4eb2098e-8556-4156-a491-9eabc7759f22\") pod \"ovsdbserver-sb-2\" (UID: \"78b393f8-7861-4b27-af3d-4a70cd2afa7e\") " pod="openstack/ovsdbserver-sb-2" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.762483 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a799ab29-f667-4d1f-af0f-9d0123379f79-config\") pod \"ovsdbserver-nb-0\" (UID: \"a799ab29-f667-4d1f-af0f-9d0123379f79\") " pod="openstack/ovsdbserver-nb-0" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.762520 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78b393f8-7861-4b27-af3d-4a70cd2afa7e-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"78b393f8-7861-4b27-af3d-4a70cd2afa7e\") " pod="openstack/ovsdbserver-sb-2" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.762569 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e09ee80b-8154-4dfb-8dd9-df40a3aded0a-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"e09ee80b-8154-4dfb-8dd9-df40a3aded0a\") " pod="openstack/ovsdbserver-sb-0" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.762614 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36385a11-0e9f-41c4-a386-ff3710a53b75-combined-ca-bundle\") pod 
\"ovsdbserver-sb-1\" (UID: \"36385a11-0e9f-41c4-a386-ff3710a53b75\") " pod="openstack/ovsdbserver-sb-1" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.762668 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e09ee80b-8154-4dfb-8dd9-df40a3aded0a-config\") pod \"ovsdbserver-sb-0\" (UID: \"e09ee80b-8154-4dfb-8dd9-df40a3aded0a\") " pod="openstack/ovsdbserver-sb-0" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.762708 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2gvj\" (UniqueName: \"kubernetes.io/projected/a799ab29-f667-4d1f-af0f-9d0123379f79-kube-api-access-v2gvj\") pod \"ovsdbserver-nb-0\" (UID: \"a799ab29-f667-4d1f-af0f-9d0123379f79\") " pod="openstack/ovsdbserver-nb-0" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.762788 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzlbg\" (UniqueName: \"kubernetes.io/projected/36385a11-0e9f-41c4-a386-ff3710a53b75-kube-api-access-gzlbg\") pod \"ovsdbserver-sb-1\" (UID: \"36385a11-0e9f-41c4-a386-ff3710a53b75\") " pod="openstack/ovsdbserver-sb-1" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.762839 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78b393f8-7861-4b27-af3d-4a70cd2afa7e-config\") pod \"ovsdbserver-sb-2\" (UID: \"78b393f8-7861-4b27-af3d-4a70cd2afa7e\") " pod="openstack/ovsdbserver-sb-2" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.762874 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a799ab29-f667-4d1f-af0f-9d0123379f79-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"a799ab29-f667-4d1f-af0f-9d0123379f79\") " pod="openstack/ovsdbserver-nb-0" Dec 02 08:50:52 crc 
kubenswrapper[4895]: I1202 08:50:52.762910 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e09ee80b-8154-4dfb-8dd9-df40a3aded0a-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"e09ee80b-8154-4dfb-8dd9-df40a3aded0a\") " pod="openstack/ovsdbserver-sb-0" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.762965 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/36385a11-0e9f-41c4-a386-ff3710a53b75-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"36385a11-0e9f-41c4-a386-ff3710a53b75\") " pod="openstack/ovsdbserver-sb-1" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.763009 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a799ab29-f667-4d1f-af0f-9d0123379f79-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"a799ab29-f667-4d1f-af0f-9d0123379f79\") " pod="openstack/ovsdbserver-nb-0" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.763061 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-fcdf9c1d-288e-4f5c-93a0-bee9d02069ed\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fcdf9c1d-288e-4f5c-93a0-bee9d02069ed\") pod \"ovsdbserver-sb-1\" (UID: \"36385a11-0e9f-41c4-a386-ff3710a53b75\") " pod="openstack/ovsdbserver-sb-1" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.763096 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xhcnk\" (UniqueName: \"kubernetes.io/projected/e09ee80b-8154-4dfb-8dd9-df40a3aded0a-kube-api-access-xhcnk\") pod \"ovsdbserver-sb-0\" (UID: \"e09ee80b-8154-4dfb-8dd9-df40a3aded0a\") " pod="openstack/ovsdbserver-sb-0" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.763151 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/36385a11-0e9f-41c4-a386-ff3710a53b75-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"36385a11-0e9f-41c4-a386-ff3710a53b75\") " pod="openstack/ovsdbserver-sb-1" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.763185 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mzpgc\" (UniqueName: \"kubernetes.io/projected/78b393f8-7861-4b27-af3d-4a70cd2afa7e-kube-api-access-mzpgc\") pod \"ovsdbserver-sb-2\" (UID: \"78b393f8-7861-4b27-af3d-4a70cd2afa7e\") " pod="openstack/ovsdbserver-sb-2" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.763227 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-96a03fb0-064d-4cd8-9096-75ba63b23d27\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-96a03fb0-064d-4cd8-9096-75ba63b23d27\") pod \"ovsdbserver-nb-0\" (UID: \"a799ab29-f667-4d1f-af0f-9d0123379f79\") " pod="openstack/ovsdbserver-nb-0" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.763270 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36385a11-0e9f-41c4-a386-ff3710a53b75-config\") pod \"ovsdbserver-sb-1\" (UID: \"36385a11-0e9f-41c4-a386-ff3710a53b75\") " pod="openstack/ovsdbserver-sb-1" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.763380 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/78b393f8-7861-4b27-af3d-4a70cd2afa7e-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"78b393f8-7861-4b27-af3d-4a70cd2afa7e\") " pod="openstack/ovsdbserver-sb-2" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.764117 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e09ee80b-8154-4dfb-8dd9-df40a3aded0a-config\") pod \"ovsdbserver-sb-0\" (UID: 
\"e09ee80b-8154-4dfb-8dd9-df40a3aded0a\") " pod="openstack/ovsdbserver-sb-0" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.764148 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78b393f8-7861-4b27-af3d-4a70cd2afa7e-config\") pod \"ovsdbserver-sb-2\" (UID: \"78b393f8-7861-4b27-af3d-4a70cd2afa7e\") " pod="openstack/ovsdbserver-sb-2" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.764667 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/36385a11-0e9f-41c4-a386-ff3710a53b75-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"36385a11-0e9f-41c4-a386-ff3710a53b75\") " pod="openstack/ovsdbserver-sb-1" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.765012 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e09ee80b-8154-4dfb-8dd9-df40a3aded0a-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"e09ee80b-8154-4dfb-8dd9-df40a3aded0a\") " pod="openstack/ovsdbserver-sb-0" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.765277 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/36385a11-0e9f-41c4-a386-ff3710a53b75-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"36385a11-0e9f-41c4-a386-ff3710a53b75\") " pod="openstack/ovsdbserver-sb-1" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.765971 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e09ee80b-8154-4dfb-8dd9-df40a3aded0a-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"e09ee80b-8154-4dfb-8dd9-df40a3aded0a\") " pod="openstack/ovsdbserver-sb-0" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.766341 4895 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. 
Skipping MountDevice... Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.766369 4895 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-27324e8d-3d47-413d-881e-e2c37b637a96\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-27324e8d-3d47-413d-881e-e2c37b637a96\") pod \"ovsdbserver-sb-0\" (UID: \"e09ee80b-8154-4dfb-8dd9-df40a3aded0a\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/b1eb4eb09c99b6d25ca5a309779e6e31f3dd559f9645c12978a3829f04ab364d/globalmount\"" pod="openstack/ovsdbserver-sb-0" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.766397 4895 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.766425 4895 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-fcdf9c1d-288e-4f5c-93a0-bee9d02069ed\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fcdf9c1d-288e-4f5c-93a0-bee9d02069ed\") pod \"ovsdbserver-sb-1\" (UID: \"36385a11-0e9f-41c4-a386-ff3710a53b75\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/ce8fe06cd79447c41cb8d2f8f6018d2b9fe2e578322b0f86238db4f698da65de/globalmount\"" pod="openstack/ovsdbserver-sb-1" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.766557 4895 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.766592 4895 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-4eb2098e-8556-4156-a491-9eabc7759f22\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4eb2098e-8556-4156-a491-9eabc7759f22\") pod \"ovsdbserver-sb-2\" (UID: \"78b393f8-7861-4b27-af3d-4a70cd2afa7e\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/50f772dd209cbf7d5381af639cfb19e06d5c7ad633b8d048b25d107b07f3ec83/globalmount\"" pod="openstack/ovsdbserver-sb-2" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.767734 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/78b393f8-7861-4b27-af3d-4a70cd2afa7e-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"78b393f8-7861-4b27-af3d-4a70cd2afa7e\") " pod="openstack/ovsdbserver-sb-2" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.769261 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36385a11-0e9f-41c4-a386-ff3710a53b75-config\") pod \"ovsdbserver-sb-1\" (UID: \"36385a11-0e9f-41c4-a386-ff3710a53b75\") " pod="openstack/ovsdbserver-sb-1" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.772882 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36385a11-0e9f-41c4-a386-ff3710a53b75-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"36385a11-0e9f-41c4-a386-ff3710a53b75\") " pod="openstack/ovsdbserver-sb-1" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.777579 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78b393f8-7861-4b27-af3d-4a70cd2afa7e-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"78b393f8-7861-4b27-af3d-4a70cd2afa7e\") " pod="openstack/ovsdbserver-sb-2" 
Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.782359 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e09ee80b-8154-4dfb-8dd9-df40a3aded0a-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"e09ee80b-8154-4dfb-8dd9-df40a3aded0a\") " pod="openstack/ovsdbserver-sb-0" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.786059 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhcnk\" (UniqueName: \"kubernetes.io/projected/e09ee80b-8154-4dfb-8dd9-df40a3aded0a-kube-api-access-xhcnk\") pod \"ovsdbserver-sb-0\" (UID: \"e09ee80b-8154-4dfb-8dd9-df40a3aded0a\") " pod="openstack/ovsdbserver-sb-0" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.787946 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzlbg\" (UniqueName: \"kubernetes.io/projected/36385a11-0e9f-41c4-a386-ff3710a53b75-kube-api-access-gzlbg\") pod \"ovsdbserver-sb-1\" (UID: \"36385a11-0e9f-41c4-a386-ff3710a53b75\") " pod="openstack/ovsdbserver-sb-1" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.789504 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzpgc\" (UniqueName: \"kubernetes.io/projected/78b393f8-7861-4b27-af3d-4a70cd2afa7e-kube-api-access-mzpgc\") pod \"ovsdbserver-sb-2\" (UID: \"78b393f8-7861-4b27-af3d-4a70cd2afa7e\") " pod="openstack/ovsdbserver-sb-2" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.821235 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-fcdf9c1d-288e-4f5c-93a0-bee9d02069ed\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fcdf9c1d-288e-4f5c-93a0-bee9d02069ed\") pod \"ovsdbserver-sb-1\" (UID: \"36385a11-0e9f-41c4-a386-ff3710a53b75\") " pod="openstack/ovsdbserver-sb-1" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.821250 4895 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"pvc-4eb2098e-8556-4156-a491-9eabc7759f22\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4eb2098e-8556-4156-a491-9eabc7759f22\") pod \"ovsdbserver-sb-2\" (UID: \"78b393f8-7861-4b27-af3d-4a70cd2afa7e\") " pod="openstack/ovsdbserver-sb-2" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.830169 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-27324e8d-3d47-413d-881e-e2c37b637a96\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-27324e8d-3d47-413d-881e-e2c37b637a96\") pod \"ovsdbserver-sb-0\" (UID: \"e09ee80b-8154-4dfb-8dd9-df40a3aded0a\") " pod="openstack/ovsdbserver-sb-0" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.856683 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-1" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.864417 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r55lp\" (UniqueName: \"kubernetes.io/projected/2e1d668e-3d4a-43a1-9fa4-8a1f478aa316-kube-api-access-r55lp\") pod \"ovsdbserver-nb-1\" (UID: \"2e1d668e-3d4a-43a1-9fa4-8a1f478aa316\") " pod="openstack/ovsdbserver-nb-1" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.864484 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f6a3fdc9-294e-403e-b1b3-178a47b3c692-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"f6a3fdc9-294e-403e-b1b3-178a47b3c692\") " pod="openstack/ovsdbserver-nb-2" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.864508 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6a3fdc9-294e-403e-b1b3-178a47b3c692-config\") pod \"ovsdbserver-nb-2\" (UID: \"f6a3fdc9-294e-403e-b1b3-178a47b3c692\") " pod="openstack/ovsdbserver-nb-2" Dec 02 08:50:52 
crc kubenswrapper[4895]: I1202 08:50:52.864543 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-c01bf820-5367-4d3e-831b-a9ee0713221e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c01bf820-5367-4d3e-831b-a9ee0713221e\") pod \"ovsdbserver-nb-1\" (UID: \"2e1d668e-3d4a-43a1-9fa4-8a1f478aa316\") " pod="openstack/ovsdbserver-nb-1" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.864590 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a799ab29-f667-4d1f-af0f-9d0123379f79-config\") pod \"ovsdbserver-nb-0\" (UID: \"a799ab29-f667-4d1f-af0f-9d0123379f79\") " pod="openstack/ovsdbserver-nb-0" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.864955 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7r4cb\" (UniqueName: \"kubernetes.io/projected/f6a3fdc9-294e-403e-b1b3-178a47b3c692-kube-api-access-7r4cb\") pod \"ovsdbserver-nb-2\" (UID: \"f6a3fdc9-294e-403e-b1b3-178a47b3c692\") " pod="openstack/ovsdbserver-nb-2" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.864985 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2e1d668e-3d4a-43a1-9fa4-8a1f478aa316-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"2e1d668e-3d4a-43a1-9fa4-8a1f478aa316\") " pod="openstack/ovsdbserver-nb-1" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.865019 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e1d668e-3d4a-43a1-9fa4-8a1f478aa316-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"2e1d668e-3d4a-43a1-9fa4-8a1f478aa316\") " pod="openstack/ovsdbserver-nb-1" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.865055 4895 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v2gvj\" (UniqueName: \"kubernetes.io/projected/a799ab29-f667-4d1f-af0f-9d0123379f79-kube-api-access-v2gvj\") pod \"ovsdbserver-nb-0\" (UID: \"a799ab29-f667-4d1f-af0f-9d0123379f79\") " pod="openstack/ovsdbserver-nb-0" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.865075 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e1d668e-3d4a-43a1-9fa4-8a1f478aa316-config\") pod \"ovsdbserver-nb-1\" (UID: \"2e1d668e-3d4a-43a1-9fa4-8a1f478aa316\") " pod="openstack/ovsdbserver-nb-1" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.865098 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6a3fdc9-294e-403e-b1b3-178a47b3c692-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"f6a3fdc9-294e-403e-b1b3-178a47b3c692\") " pod="openstack/ovsdbserver-nb-2" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.865140 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a799ab29-f667-4d1f-af0f-9d0123379f79-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"a799ab29-f667-4d1f-af0f-9d0123379f79\") " pod="openstack/ovsdbserver-nb-0" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.865198 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a799ab29-f667-4d1f-af0f-9d0123379f79-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"a799ab29-f667-4d1f-af0f-9d0123379f79\") " pod="openstack/ovsdbserver-nb-0" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.865259 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-96a03fb0-064d-4cd8-9096-75ba63b23d27\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-96a03fb0-064d-4cd8-9096-75ba63b23d27\") pod \"ovsdbserver-nb-0\" (UID: \"a799ab29-f667-4d1f-af0f-9d0123379f79\") " pod="openstack/ovsdbserver-nb-0" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.865295 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a799ab29-f667-4d1f-af0f-9d0123379f79-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"a799ab29-f667-4d1f-af0f-9d0123379f79\") " pod="openstack/ovsdbserver-nb-0" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.865331 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-abc71ae0-d3d2-4847-aaab-f7521f72fa03\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-abc71ae0-d3d2-4847-aaab-f7521f72fa03\") pod \"ovsdbserver-nb-2\" (UID: \"f6a3fdc9-294e-403e-b1b3-178a47b3c692\") " pod="openstack/ovsdbserver-nb-2" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.865368 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f6a3fdc9-294e-403e-b1b3-178a47b3c692-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"f6a3fdc9-294e-403e-b1b3-178a47b3c692\") " pod="openstack/ovsdbserver-nb-2" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.865946 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a799ab29-f667-4d1f-af0f-9d0123379f79-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"a799ab29-f667-4d1f-af0f-9d0123379f79\") " pod="openstack/ovsdbserver-nb-0" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.866725 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a799ab29-f667-4d1f-af0f-9d0123379f79-scripts\") pod \"ovsdbserver-nb-0\" (UID: 
\"a799ab29-f667-4d1f-af0f-9d0123379f79\") " pod="openstack/ovsdbserver-nb-0" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.865363 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a799ab29-f667-4d1f-af0f-9d0123379f79-config\") pod \"ovsdbserver-nb-0\" (UID: \"a799ab29-f667-4d1f-af0f-9d0123379f79\") " pod="openstack/ovsdbserver-nb-0" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.866990 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2e1d668e-3d4a-43a1-9fa4-8a1f478aa316-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"2e1d668e-3d4a-43a1-9fa4-8a1f478aa316\") " pod="openstack/ovsdbserver-nb-1" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.870197 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a799ab29-f667-4d1f-af0f-9d0123379f79-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"a799ab29-f667-4d1f-af0f-9d0123379f79\") " pod="openstack/ovsdbserver-nb-0" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.871157 4895 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.871219 4895 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-96a03fb0-064d-4cd8-9096-75ba63b23d27\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-96a03fb0-064d-4cd8-9096-75ba63b23d27\") pod \"ovsdbserver-nb-0\" (UID: \"a799ab29-f667-4d1f-af0f-9d0123379f79\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/b3efe51317aff732d390f0ce0bccc04f4a2e751c2f39c091fda21af9c7d4adc8/globalmount\"" pod="openstack/ovsdbserver-nb-0" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.874125 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-2" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.888576 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2gvj\" (UniqueName: \"kubernetes.io/projected/a799ab29-f667-4d1f-af0f-9d0123379f79-kube-api-access-v2gvj\") pod \"ovsdbserver-nb-0\" (UID: \"a799ab29-f667-4d1f-af0f-9d0123379f79\") " pod="openstack/ovsdbserver-nb-0" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.913763 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-96a03fb0-064d-4cd8-9096-75ba63b23d27\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-96a03fb0-064d-4cd8-9096-75ba63b23d27\") pod \"ovsdbserver-nb-0\" (UID: \"a799ab29-f667-4d1f-af0f-9d0123379f79\") " pod="openstack/ovsdbserver-nb-0" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.968608 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-abc71ae0-d3d2-4847-aaab-f7521f72fa03\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-abc71ae0-d3d2-4847-aaab-f7521f72fa03\") pod \"ovsdbserver-nb-2\" (UID: \"f6a3fdc9-294e-403e-b1b3-178a47b3c692\") " pod="openstack/ovsdbserver-nb-2" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.968974 4895 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f6a3fdc9-294e-403e-b1b3-178a47b3c692-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"f6a3fdc9-294e-403e-b1b3-178a47b3c692\") " pod="openstack/ovsdbserver-nb-2" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.968998 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2e1d668e-3d4a-43a1-9fa4-8a1f478aa316-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"2e1d668e-3d4a-43a1-9fa4-8a1f478aa316\") " pod="openstack/ovsdbserver-nb-1" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.969035 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r55lp\" (UniqueName: \"kubernetes.io/projected/2e1d668e-3d4a-43a1-9fa4-8a1f478aa316-kube-api-access-r55lp\") pod \"ovsdbserver-nb-1\" (UID: \"2e1d668e-3d4a-43a1-9fa4-8a1f478aa316\") " pod="openstack/ovsdbserver-nb-1" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.969055 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f6a3fdc9-294e-403e-b1b3-178a47b3c692-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"f6a3fdc9-294e-403e-b1b3-178a47b3c692\") " pod="openstack/ovsdbserver-nb-2" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.969069 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6a3fdc9-294e-403e-b1b3-178a47b3c692-config\") pod \"ovsdbserver-nb-2\" (UID: \"f6a3fdc9-294e-403e-b1b3-178a47b3c692\") " pod="openstack/ovsdbserver-nb-2" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.969102 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-c01bf820-5367-4d3e-831b-a9ee0713221e\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c01bf820-5367-4d3e-831b-a9ee0713221e\") pod \"ovsdbserver-nb-1\" (UID: \"2e1d668e-3d4a-43a1-9fa4-8a1f478aa316\") " pod="openstack/ovsdbserver-nb-1" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.969133 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7r4cb\" (UniqueName: \"kubernetes.io/projected/f6a3fdc9-294e-403e-b1b3-178a47b3c692-kube-api-access-7r4cb\") pod \"ovsdbserver-nb-2\" (UID: \"f6a3fdc9-294e-403e-b1b3-178a47b3c692\") " pod="openstack/ovsdbserver-nb-2" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.969150 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2e1d668e-3d4a-43a1-9fa4-8a1f478aa316-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"2e1d668e-3d4a-43a1-9fa4-8a1f478aa316\") " pod="openstack/ovsdbserver-nb-1" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.969184 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e1d668e-3d4a-43a1-9fa4-8a1f478aa316-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"2e1d668e-3d4a-43a1-9fa4-8a1f478aa316\") " pod="openstack/ovsdbserver-nb-1" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.969209 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e1d668e-3d4a-43a1-9fa4-8a1f478aa316-config\") pod \"ovsdbserver-nb-1\" (UID: \"2e1d668e-3d4a-43a1-9fa4-8a1f478aa316\") " pod="openstack/ovsdbserver-nb-1" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.969230 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6a3fdc9-294e-403e-b1b3-178a47b3c692-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"f6a3fdc9-294e-403e-b1b3-178a47b3c692\") " 
pod="openstack/ovsdbserver-nb-2" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.970042 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f6a3fdc9-294e-403e-b1b3-178a47b3c692-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"f6a3fdc9-294e-403e-b1b3-178a47b3c692\") " pod="openstack/ovsdbserver-nb-2" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.970412 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2e1d668e-3d4a-43a1-9fa4-8a1f478aa316-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"2e1d668e-3d4a-43a1-9fa4-8a1f478aa316\") " pod="openstack/ovsdbserver-nb-1" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.971724 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f6a3fdc9-294e-403e-b1b3-178a47b3c692-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"f6a3fdc9-294e-403e-b1b3-178a47b3c692\") " pod="openstack/ovsdbserver-nb-2" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.972068 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2e1d668e-3d4a-43a1-9fa4-8a1f478aa316-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"2e1d668e-3d4a-43a1-9fa4-8a1f478aa316\") " pod="openstack/ovsdbserver-nb-1" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.972729 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6a3fdc9-294e-403e-b1b3-178a47b3c692-config\") pod \"ovsdbserver-nb-2\" (UID: \"f6a3fdc9-294e-403e-b1b3-178a47b3c692\") " pod="openstack/ovsdbserver-nb-2" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.973174 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e1d668e-3d4a-43a1-9fa4-8a1f478aa316-config\") 
pod \"ovsdbserver-nb-1\" (UID: \"2e1d668e-3d4a-43a1-9fa4-8a1f478aa316\") " pod="openstack/ovsdbserver-nb-1" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.974386 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6a3fdc9-294e-403e-b1b3-178a47b3c692-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"f6a3fdc9-294e-403e-b1b3-178a47b3c692\") " pod="openstack/ovsdbserver-nb-2" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.975238 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e1d668e-3d4a-43a1-9fa4-8a1f478aa316-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"2e1d668e-3d4a-43a1-9fa4-8a1f478aa316\") " pod="openstack/ovsdbserver-nb-1" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.977229 4895 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.977303 4895 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-c01bf820-5367-4d3e-831b-a9ee0713221e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c01bf820-5367-4d3e-831b-a9ee0713221e\") pod \"ovsdbserver-nb-1\" (UID: \"2e1d668e-3d4a-43a1-9fa4-8a1f478aa316\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/11e517db25791293019939db0be2a9ccbb73c6a6a04e69cc897bf8b5174d4688/globalmount\"" pod="openstack/ovsdbserver-nb-1" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.978653 4895 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.978692 4895 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-abc71ae0-d3d2-4847-aaab-f7521f72fa03\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-abc71ae0-d3d2-4847-aaab-f7521f72fa03\") pod \"ovsdbserver-nb-2\" (UID: \"f6a3fdc9-294e-403e-b1b3-178a47b3c692\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/f31798462e1814d1f4361b78a020ade2181f607897c4914684874e3ff078db2a/globalmount\"" pod="openstack/ovsdbserver-nb-2" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.992550 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7r4cb\" (UniqueName: \"kubernetes.io/projected/f6a3fdc9-294e-403e-b1b3-178a47b3c692-kube-api-access-7r4cb\") pod \"ovsdbserver-nb-2\" (UID: \"f6a3fdc9-294e-403e-b1b3-178a47b3c692\") " pod="openstack/ovsdbserver-nb-2" Dec 02 08:50:52 crc kubenswrapper[4895]: I1202 08:50:52.996311 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 02 08:50:53 crc kubenswrapper[4895]: I1202 08:50:52.999997 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r55lp\" (UniqueName: \"kubernetes.io/projected/2e1d668e-3d4a-43a1-9fa4-8a1f478aa316-kube-api-access-r55lp\") pod \"ovsdbserver-nb-1\" (UID: \"2e1d668e-3d4a-43a1-9fa4-8a1f478aa316\") " pod="openstack/ovsdbserver-nb-1" Dec 02 08:50:53 crc kubenswrapper[4895]: I1202 08:50:53.046340 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-abc71ae0-d3d2-4847-aaab-f7521f72fa03\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-abc71ae0-d3d2-4847-aaab-f7521f72fa03\") pod \"ovsdbserver-nb-2\" (UID: \"f6a3fdc9-294e-403e-b1b3-178a47b3c692\") " pod="openstack/ovsdbserver-nb-2" Dec 02 08:50:53 crc kubenswrapper[4895]: I1202 08:50:53.069678 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-c01bf820-5367-4d3e-831b-a9ee0713221e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c01bf820-5367-4d3e-831b-a9ee0713221e\") pod \"ovsdbserver-nb-1\" (UID: \"2e1d668e-3d4a-43a1-9fa4-8a1f478aa316\") " pod="openstack/ovsdbserver-nb-1" Dec 02 08:50:53 crc kubenswrapper[4895]: I1202 08:50:53.133215 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 02 08:50:53 crc kubenswrapper[4895]: I1202 08:50:53.152757 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd110df3-0ac9-4e17-ad1c-881e6a5c9565" path="/var/lib/kubelet/pods/dd110df3-0ac9-4e17-ad1c-881e6a5c9565/volumes" Dec 02 08:50:53 crc kubenswrapper[4895]: I1202 08:50:53.332031 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-1" Dec 02 08:50:53 crc kubenswrapper[4895]: I1202 08:50:53.339551 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-2" Dec 02 08:50:53 crc kubenswrapper[4895]: I1202 08:50:53.462776 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-1"] Dec 02 08:50:53 crc kubenswrapper[4895]: W1202 08:50:53.570287 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod78b393f8_7861_4b27_af3d_4a70cd2afa7e.slice/crio-2eb0893232f2afbce2c9cf9b46f73ebcf070d5eac446ff7319284ca003392a8b WatchSource:0}: Error finding container 2eb0893232f2afbce2c9cf9b46f73ebcf070d5eac446ff7319284ca003392a8b: Status 404 returned error can't find the container with id 2eb0893232f2afbce2c9cf9b46f73ebcf070d5eac446ff7319284ca003392a8b Dec 02 08:50:53 crc kubenswrapper[4895]: I1202 08:50:53.572487 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-2"] Dec 02 08:50:53 crc kubenswrapper[4895]: I1202 08:50:53.637753 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 02 08:50:53 crc kubenswrapper[4895]: W1202 08:50:53.644837 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda799ab29_f667_4d1f_af0f_9d0123379f79.slice/crio-11dfeecba83ff4583bc407c97d52f53d04b6d7d27904cfe6924f6cffd62c2191 WatchSource:0}: Error finding container 11dfeecba83ff4583bc407c97d52f53d04b6d7d27904cfe6924f6cffd62c2191: Status 404 returned error can't find the container with id 11dfeecba83ff4583bc407c97d52f53d04b6d7d27904cfe6924f6cffd62c2191 Dec 02 08:50:53 crc kubenswrapper[4895]: I1202 08:50:53.944994 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-1"] Dec 02 08:50:54 crc kubenswrapper[4895]: I1202 08:50:54.047237 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 02 08:50:54 crc kubenswrapper[4895]: W1202 08:50:54.050837 4895 manager.go:1169] Failed to 
process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode09ee80b_8154_4dfb_8dd9_df40a3aded0a.slice/crio-fb5d4ebb18b0fffe58d95ed86bc120f342f874f54ff633eba0e5e99b00f51b99 WatchSource:0}: Error finding container fb5d4ebb18b0fffe58d95ed86bc120f342f874f54ff633eba0e5e99b00f51b99: Status 404 returned error can't find the container with id fb5d4ebb18b0fffe58d95ed86bc120f342f874f54ff633eba0e5e99b00f51b99 Dec 02 08:50:54 crc kubenswrapper[4895]: I1202 08:50:54.319456 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"36385a11-0e9f-41c4-a386-ff3710a53b75","Type":"ContainerStarted","Data":"6cb0ee82626e56984808575cdb0557afe5fb501002fc2b299213281af508f826"} Dec 02 08:50:54 crc kubenswrapper[4895]: I1202 08:50:54.319498 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"36385a11-0e9f-41c4-a386-ff3710a53b75","Type":"ContainerStarted","Data":"ab530e69ca73310905b1cd470132a5ef10af4f1a2f59da56b570bf7968e5ef2a"} Dec 02 08:50:54 crc kubenswrapper[4895]: I1202 08:50:54.319508 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"36385a11-0e9f-41c4-a386-ff3710a53b75","Type":"ContainerStarted","Data":"f63d3030c6a823805dba866f8866927bf6cf57630912819f8c5ed212d0b97db3"} Dec 02 08:50:54 crc kubenswrapper[4895]: I1202 08:50:54.324490 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"a799ab29-f667-4d1f-af0f-9d0123379f79","Type":"ContainerStarted","Data":"edccdf777c5b26dc0a01556dda29b0f3c553b9d08a8d34341496570cecaa2218"} Dec 02 08:50:54 crc kubenswrapper[4895]: I1202 08:50:54.324523 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"a799ab29-f667-4d1f-af0f-9d0123379f79","Type":"ContainerStarted","Data":"deb6707147bb17237783a474fe0df8c9c0766540bb13d0b2f6fb8929c4c06e81"} Dec 02 08:50:54 crc 
kubenswrapper[4895]: I1202 08:50:54.324534 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"a799ab29-f667-4d1f-af0f-9d0123379f79","Type":"ContainerStarted","Data":"11dfeecba83ff4583bc407c97d52f53d04b6d7d27904cfe6924f6cffd62c2191"} Dec 02 08:50:54 crc kubenswrapper[4895]: I1202 08:50:54.327134 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"2e1d668e-3d4a-43a1-9fa4-8a1f478aa316","Type":"ContainerStarted","Data":"ab83d88b0060245e63fa1a7582315e9a0c06f337cc450587a0f600e661f7a124"} Dec 02 08:50:54 crc kubenswrapper[4895]: I1202 08:50:54.327166 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"2e1d668e-3d4a-43a1-9fa4-8a1f478aa316","Type":"ContainerStarted","Data":"4ad807c708bfc6172d3fb0a2fff2d58b75758c0a2352110e507e0384fb55dbab"} Dec 02 08:50:54 crc kubenswrapper[4895]: I1202 08:50:54.327177 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"2e1d668e-3d4a-43a1-9fa4-8a1f478aa316","Type":"ContainerStarted","Data":"55c7ce2538b02c9851fd6e708851138dfe5a4d9776964dee2d5c075aafd7011f"} Dec 02 08:50:54 crc kubenswrapper[4895]: I1202 08:50:54.330329 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"e09ee80b-8154-4dfb-8dd9-df40a3aded0a","Type":"ContainerStarted","Data":"682aa8056536a4fa0f25aea8e522f7c9c0d2b9d4ad029ab6264162007c7ad8be"} Dec 02 08:50:54 crc kubenswrapper[4895]: I1202 08:50:54.330368 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"e09ee80b-8154-4dfb-8dd9-df40a3aded0a","Type":"ContainerStarted","Data":"fb5d4ebb18b0fffe58d95ed86bc120f342f874f54ff633eba0e5e99b00f51b99"} Dec 02 08:50:54 crc kubenswrapper[4895]: I1202 08:50:54.332778 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" 
event={"ID":"78b393f8-7861-4b27-af3d-4a70cd2afa7e","Type":"ContainerStarted","Data":"61b3fbcbe714fa04fbc1ba8169c7bd5f86943eac75e7a0b09876ba7d71757403"} Dec 02 08:50:54 crc kubenswrapper[4895]: I1202 08:50:54.332815 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"78b393f8-7861-4b27-af3d-4a70cd2afa7e","Type":"ContainerStarted","Data":"1b1401f6b8a09cc7b08f14a37619437a80bbd8f54a23e4d5c5c1d7e93982c42e"} Dec 02 08:50:54 crc kubenswrapper[4895]: I1202 08:50:54.332827 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"78b393f8-7861-4b27-af3d-4a70cd2afa7e","Type":"ContainerStarted","Data":"2eb0893232f2afbce2c9cf9b46f73ebcf070d5eac446ff7319284ca003392a8b"} Dec 02 08:50:54 crc kubenswrapper[4895]: I1202 08:50:54.356465 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-1" podStartSLOduration=3.356436497 podStartE2EDuration="3.356436497s" podCreationTimestamp="2025-12-02 08:50:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:50:54.341996588 +0000 UTC m=+5265.512856211" watchObservedRunningTime="2025-12-02 08:50:54.356436497 +0000 UTC m=+5265.527296110" Dec 02 08:50:54 crc kubenswrapper[4895]: I1202 08:50:54.371760 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-2" podStartSLOduration=3.371709422 podStartE2EDuration="3.371709422s" podCreationTimestamp="2025-12-02 08:50:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:50:54.364581671 +0000 UTC m=+5265.535441294" watchObservedRunningTime="2025-12-02 08:50:54.371709422 +0000 UTC m=+5265.542569035" Dec 02 08:50:54 crc kubenswrapper[4895]: I1202 08:50:54.390688 4895 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=3.390664762 podStartE2EDuration="3.390664762s" podCreationTimestamp="2025-12-02 08:50:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:50:54.384720807 +0000 UTC m=+5265.555580410" watchObservedRunningTime="2025-12-02 08:50:54.390664762 +0000 UTC m=+5265.561524375" Dec 02 08:50:54 crc kubenswrapper[4895]: I1202 08:50:54.425659 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-1" podStartSLOduration=3.425624829 podStartE2EDuration="3.425624829s" podCreationTimestamp="2025-12-02 08:50:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:50:54.406770163 +0000 UTC m=+5265.577629776" watchObservedRunningTime="2025-12-02 08:50:54.425624829 +0000 UTC m=+5265.596484452" Dec 02 08:50:55 crc kubenswrapper[4895]: I1202 08:50:55.017296 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-2"] Dec 02 08:50:55 crc kubenswrapper[4895]: W1202 08:50:55.017421 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf6a3fdc9_294e_403e_b1b3_178a47b3c692.slice/crio-1da76ee3fbd7ce96b9582a03dc0d8b0d92898e7275c7365a7d51e962b661e03d WatchSource:0}: Error finding container 1da76ee3fbd7ce96b9582a03dc0d8b0d92898e7275c7365a7d51e962b661e03d: Status 404 returned error can't find the container with id 1da76ee3fbd7ce96b9582a03dc0d8b0d92898e7275c7365a7d51e962b661e03d Dec 02 08:50:55 crc kubenswrapper[4895]: I1202 08:50:55.357120 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"e09ee80b-8154-4dfb-8dd9-df40a3aded0a","Type":"ContainerStarted","Data":"2d0f3de0e6300fc42ef1ef7c9e613bd025bd045585ba5ab1738cfb5ec0863fbf"} Dec 02 
08:50:55 crc kubenswrapper[4895]: I1202 08:50:55.359932 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"f6a3fdc9-294e-403e-b1b3-178a47b3c692","Type":"ContainerStarted","Data":"0800f647ded3d7e97e8da9ef0dadb02721fff7ed62793ed4a762ac1ed3f789a6"} Dec 02 08:50:55 crc kubenswrapper[4895]: I1202 08:50:55.359984 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"f6a3fdc9-294e-403e-b1b3-178a47b3c692","Type":"ContainerStarted","Data":"1da76ee3fbd7ce96b9582a03dc0d8b0d92898e7275c7365a7d51e962b661e03d"} Dec 02 08:50:55 crc kubenswrapper[4895]: I1202 08:50:55.378651 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=4.378629968 podStartE2EDuration="4.378629968s" podCreationTimestamp="2025-12-02 08:50:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:50:55.374938483 +0000 UTC m=+5266.545798106" watchObservedRunningTime="2025-12-02 08:50:55.378629968 +0000 UTC m=+5266.549489581" Dec 02 08:50:55 crc kubenswrapper[4895]: I1202 08:50:55.858643 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-1" Dec 02 08:50:55 crc kubenswrapper[4895]: I1202 08:50:55.875452 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-2" Dec 02 08:50:55 crc kubenswrapper[4895]: I1202 08:50:55.997783 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Dec 02 08:50:56 crc kubenswrapper[4895]: I1202 08:50:56.133446 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Dec 02 08:50:56 crc kubenswrapper[4895]: I1202 08:50:56.332250 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/ovsdbserver-nb-1" Dec 02 08:50:56 crc kubenswrapper[4895]: I1202 08:50:56.384185 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"f6a3fdc9-294e-403e-b1b3-178a47b3c692","Type":"ContainerStarted","Data":"691a82c43341b839cd44aa6ee709ae88a27d52e7e649e2c6f1f855a07655216d"} Dec 02 08:50:56 crc kubenswrapper[4895]: I1202 08:50:56.426388 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-2" podStartSLOduration=5.426354842 podStartE2EDuration="5.426354842s" podCreationTimestamp="2025-12-02 08:50:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:50:56.416073312 +0000 UTC m=+5267.586932975" watchObservedRunningTime="2025-12-02 08:50:56.426354842 +0000 UTC m=+5267.597214455" Dec 02 08:50:57 crc kubenswrapper[4895]: I1202 08:50:57.858103 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-1" Dec 02 08:50:57 crc kubenswrapper[4895]: I1202 08:50:57.875221 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-2" Dec 02 08:50:57 crc kubenswrapper[4895]: I1202 08:50:57.997091 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Dec 02 08:50:58 crc kubenswrapper[4895]: I1202 08:50:58.134170 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Dec 02 08:50:58 crc kubenswrapper[4895]: I1202 08:50:58.333020 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-1" Dec 02 08:50:58 crc kubenswrapper[4895]: I1202 08:50:58.340303 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-2" Dec 02 08:50:58 crc kubenswrapper[4895]: I1202 08:50:58.902450 4895 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openstack/ovsdbserver-sb-1" Dec 02 08:50:58 crc kubenswrapper[4895]: I1202 08:50:58.907723 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-2" Dec 02 08:50:58 crc kubenswrapper[4895]: I1202 08:50:58.952400 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-2" Dec 02 08:50:58 crc kubenswrapper[4895]: I1202 08:50:58.963942 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-1" Dec 02 08:50:59 crc kubenswrapper[4895]: I1202 08:50:59.054142 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Dec 02 08:50:59 crc kubenswrapper[4895]: I1202 08:50:59.126022 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Dec 02 08:50:59 crc kubenswrapper[4895]: I1202 08:50:59.176161 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-578bcccf49-d2hnk"] Dec 02 08:50:59 crc kubenswrapper[4895]: I1202 08:50:59.188956 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-578bcccf49-d2hnk" Dec 02 08:50:59 crc kubenswrapper[4895]: I1202 08:50:59.194071 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Dec 02 08:50:59 crc kubenswrapper[4895]: I1202 08:50:59.208852 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-578bcccf49-d2hnk"] Dec 02 08:50:59 crc kubenswrapper[4895]: I1202 08:50:59.217585 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Dec 02 08:50:59 crc kubenswrapper[4895]: I1202 08:50:59.294721 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bd12f3cf-71eb-48c7-afeb-b30cc06c6720-dns-svc\") pod \"dnsmasq-dns-578bcccf49-d2hnk\" (UID: \"bd12f3cf-71eb-48c7-afeb-b30cc06c6720\") " pod="openstack/dnsmasq-dns-578bcccf49-d2hnk" Dec 02 08:50:59 crc kubenswrapper[4895]: I1202 08:50:59.294807 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfjg7\" (UniqueName: \"kubernetes.io/projected/bd12f3cf-71eb-48c7-afeb-b30cc06c6720-kube-api-access-pfjg7\") pod \"dnsmasq-dns-578bcccf49-d2hnk\" (UID: \"bd12f3cf-71eb-48c7-afeb-b30cc06c6720\") " pod="openstack/dnsmasq-dns-578bcccf49-d2hnk" Dec 02 08:50:59 crc kubenswrapper[4895]: I1202 08:50:59.294877 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bd12f3cf-71eb-48c7-afeb-b30cc06c6720-ovsdbserver-sb\") pod \"dnsmasq-dns-578bcccf49-d2hnk\" (UID: \"bd12f3cf-71eb-48c7-afeb-b30cc06c6720\") " pod="openstack/dnsmasq-dns-578bcccf49-d2hnk" Dec 02 08:50:59 crc kubenswrapper[4895]: I1202 08:50:59.295014 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/bd12f3cf-71eb-48c7-afeb-b30cc06c6720-config\") pod \"dnsmasq-dns-578bcccf49-d2hnk\" (UID: \"bd12f3cf-71eb-48c7-afeb-b30cc06c6720\") " pod="openstack/dnsmasq-dns-578bcccf49-d2hnk" Dec 02 08:50:59 crc kubenswrapper[4895]: I1202 08:50:59.340427 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-2" Dec 02 08:50:59 crc kubenswrapper[4895]: I1202 08:50:59.393098 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-1" Dec 02 08:50:59 crc kubenswrapper[4895]: I1202 08:50:59.394332 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-2" Dec 02 08:50:59 crc kubenswrapper[4895]: I1202 08:50:59.396759 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd12f3cf-71eb-48c7-afeb-b30cc06c6720-config\") pod \"dnsmasq-dns-578bcccf49-d2hnk\" (UID: \"bd12f3cf-71eb-48c7-afeb-b30cc06c6720\") " pod="openstack/dnsmasq-dns-578bcccf49-d2hnk" Dec 02 08:50:59 crc kubenswrapper[4895]: I1202 08:50:59.396887 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bd12f3cf-71eb-48c7-afeb-b30cc06c6720-dns-svc\") pod \"dnsmasq-dns-578bcccf49-d2hnk\" (UID: \"bd12f3cf-71eb-48c7-afeb-b30cc06c6720\") " pod="openstack/dnsmasq-dns-578bcccf49-d2hnk" Dec 02 08:50:59 crc kubenswrapper[4895]: I1202 08:50:59.396925 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfjg7\" (UniqueName: \"kubernetes.io/projected/bd12f3cf-71eb-48c7-afeb-b30cc06c6720-kube-api-access-pfjg7\") pod \"dnsmasq-dns-578bcccf49-d2hnk\" (UID: \"bd12f3cf-71eb-48c7-afeb-b30cc06c6720\") " pod="openstack/dnsmasq-dns-578bcccf49-d2hnk" Dec 02 08:50:59 crc kubenswrapper[4895]: I1202 08:50:59.396968 4895 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bd12f3cf-71eb-48c7-afeb-b30cc06c6720-ovsdbserver-sb\") pod \"dnsmasq-dns-578bcccf49-d2hnk\" (UID: \"bd12f3cf-71eb-48c7-afeb-b30cc06c6720\") " pod="openstack/dnsmasq-dns-578bcccf49-d2hnk" Dec 02 08:50:59 crc kubenswrapper[4895]: I1202 08:50:59.398152 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bd12f3cf-71eb-48c7-afeb-b30cc06c6720-ovsdbserver-sb\") pod \"dnsmasq-dns-578bcccf49-d2hnk\" (UID: \"bd12f3cf-71eb-48c7-afeb-b30cc06c6720\") " pod="openstack/dnsmasq-dns-578bcccf49-d2hnk" Dec 02 08:50:59 crc kubenswrapper[4895]: I1202 08:50:59.398818 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bd12f3cf-71eb-48c7-afeb-b30cc06c6720-dns-svc\") pod \"dnsmasq-dns-578bcccf49-d2hnk\" (UID: \"bd12f3cf-71eb-48c7-afeb-b30cc06c6720\") " pod="openstack/dnsmasq-dns-578bcccf49-d2hnk" Dec 02 08:50:59 crc kubenswrapper[4895]: I1202 08:50:59.399211 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd12f3cf-71eb-48c7-afeb-b30cc06c6720-config\") pod \"dnsmasq-dns-578bcccf49-d2hnk\" (UID: \"bd12f3cf-71eb-48c7-afeb-b30cc06c6720\") " pod="openstack/dnsmasq-dns-578bcccf49-d2hnk" Dec 02 08:50:59 crc kubenswrapper[4895]: I1202 08:50:59.437857 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfjg7\" (UniqueName: \"kubernetes.io/projected/bd12f3cf-71eb-48c7-afeb-b30cc06c6720-kube-api-access-pfjg7\") pod \"dnsmasq-dns-578bcccf49-d2hnk\" (UID: \"bd12f3cf-71eb-48c7-afeb-b30cc06c6720\") " pod="openstack/dnsmasq-dns-578bcccf49-d2hnk" Dec 02 08:50:59 crc kubenswrapper[4895]: I1202 08:50:59.476062 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-578bcccf49-d2hnk"] Dec 02 08:50:59 crc kubenswrapper[4895]: I1202 08:50:59.479703 
4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-1" Dec 02 08:50:59 crc kubenswrapper[4895]: I1202 08:50:59.481058 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-578bcccf49-d2hnk" Dec 02 08:50:59 crc kubenswrapper[4895]: I1202 08:50:59.493426 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-68fdf4b965-f6gm5"] Dec 02 08:50:59 crc kubenswrapper[4895]: I1202 08:50:59.495315 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-68fdf4b965-f6gm5" Dec 02 08:50:59 crc kubenswrapper[4895]: I1202 08:50:59.495502 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Dec 02 08:50:59 crc kubenswrapper[4895]: I1202 08:50:59.498182 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Dec 02 08:50:59 crc kubenswrapper[4895]: I1202 08:50:59.513845 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-68fdf4b965-f6gm5"] Dec 02 08:50:59 crc kubenswrapper[4895]: I1202 08:50:59.615242 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1389de2c-a59b-4963-ab49-824c6df666d1-ovsdbserver-sb\") pod \"dnsmasq-dns-68fdf4b965-f6gm5\" (UID: \"1389de2c-a59b-4963-ab49-824c6df666d1\") " pod="openstack/dnsmasq-dns-68fdf4b965-f6gm5" Dec 02 08:50:59 crc kubenswrapper[4895]: I1202 08:50:59.615298 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1389de2c-a59b-4963-ab49-824c6df666d1-config\") pod \"dnsmasq-dns-68fdf4b965-f6gm5\" (UID: \"1389de2c-a59b-4963-ab49-824c6df666d1\") " pod="openstack/dnsmasq-dns-68fdf4b965-f6gm5" Dec 02 08:50:59 crc kubenswrapper[4895]: I1202 08:50:59.615595 4895 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1389de2c-a59b-4963-ab49-824c6df666d1-dns-svc\") pod \"dnsmasq-dns-68fdf4b965-f6gm5\" (UID: \"1389de2c-a59b-4963-ab49-824c6df666d1\") " pod="openstack/dnsmasq-dns-68fdf4b965-f6gm5" Dec 02 08:50:59 crc kubenswrapper[4895]: I1202 08:50:59.615719 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rx9fz\" (UniqueName: \"kubernetes.io/projected/1389de2c-a59b-4963-ab49-824c6df666d1-kube-api-access-rx9fz\") pod \"dnsmasq-dns-68fdf4b965-f6gm5\" (UID: \"1389de2c-a59b-4963-ab49-824c6df666d1\") " pod="openstack/dnsmasq-dns-68fdf4b965-f6gm5" Dec 02 08:50:59 crc kubenswrapper[4895]: I1202 08:50:59.615936 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1389de2c-a59b-4963-ab49-824c6df666d1-ovsdbserver-nb\") pod \"dnsmasq-dns-68fdf4b965-f6gm5\" (UID: \"1389de2c-a59b-4963-ab49-824c6df666d1\") " pod="openstack/dnsmasq-dns-68fdf4b965-f6gm5" Dec 02 08:50:59 crc kubenswrapper[4895]: I1202 08:50:59.718440 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rx9fz\" (UniqueName: \"kubernetes.io/projected/1389de2c-a59b-4963-ab49-824c6df666d1-kube-api-access-rx9fz\") pod \"dnsmasq-dns-68fdf4b965-f6gm5\" (UID: \"1389de2c-a59b-4963-ab49-824c6df666d1\") " pod="openstack/dnsmasq-dns-68fdf4b965-f6gm5" Dec 02 08:50:59 crc kubenswrapper[4895]: I1202 08:50:59.719154 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1389de2c-a59b-4963-ab49-824c6df666d1-ovsdbserver-nb\") pod \"dnsmasq-dns-68fdf4b965-f6gm5\" (UID: \"1389de2c-a59b-4963-ab49-824c6df666d1\") " pod="openstack/dnsmasq-dns-68fdf4b965-f6gm5" Dec 02 08:50:59 crc kubenswrapper[4895]: I1202 
08:50:59.719184 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1389de2c-a59b-4963-ab49-824c6df666d1-ovsdbserver-sb\") pod \"dnsmasq-dns-68fdf4b965-f6gm5\" (UID: \"1389de2c-a59b-4963-ab49-824c6df666d1\") " pod="openstack/dnsmasq-dns-68fdf4b965-f6gm5" Dec 02 08:50:59 crc kubenswrapper[4895]: I1202 08:50:59.719212 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1389de2c-a59b-4963-ab49-824c6df666d1-config\") pod \"dnsmasq-dns-68fdf4b965-f6gm5\" (UID: \"1389de2c-a59b-4963-ab49-824c6df666d1\") " pod="openstack/dnsmasq-dns-68fdf4b965-f6gm5" Dec 02 08:50:59 crc kubenswrapper[4895]: I1202 08:50:59.719270 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1389de2c-a59b-4963-ab49-824c6df666d1-dns-svc\") pod \"dnsmasq-dns-68fdf4b965-f6gm5\" (UID: \"1389de2c-a59b-4963-ab49-824c6df666d1\") " pod="openstack/dnsmasq-dns-68fdf4b965-f6gm5" Dec 02 08:50:59 crc kubenswrapper[4895]: I1202 08:50:59.720607 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1389de2c-a59b-4963-ab49-824c6df666d1-dns-svc\") pod \"dnsmasq-dns-68fdf4b965-f6gm5\" (UID: \"1389de2c-a59b-4963-ab49-824c6df666d1\") " pod="openstack/dnsmasq-dns-68fdf4b965-f6gm5" Dec 02 08:50:59 crc kubenswrapper[4895]: I1202 08:50:59.720930 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1389de2c-a59b-4963-ab49-824c6df666d1-ovsdbserver-nb\") pod \"dnsmasq-dns-68fdf4b965-f6gm5\" (UID: \"1389de2c-a59b-4963-ab49-824c6df666d1\") " pod="openstack/dnsmasq-dns-68fdf4b965-f6gm5" Dec 02 08:50:59 crc kubenswrapper[4895]: I1202 08:50:59.721343 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/1389de2c-a59b-4963-ab49-824c6df666d1-config\") pod \"dnsmasq-dns-68fdf4b965-f6gm5\" (UID: \"1389de2c-a59b-4963-ab49-824c6df666d1\") " pod="openstack/dnsmasq-dns-68fdf4b965-f6gm5" Dec 02 08:50:59 crc kubenswrapper[4895]: I1202 08:50:59.722289 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1389de2c-a59b-4963-ab49-824c6df666d1-ovsdbserver-sb\") pod \"dnsmasq-dns-68fdf4b965-f6gm5\" (UID: \"1389de2c-a59b-4963-ab49-824c6df666d1\") " pod="openstack/dnsmasq-dns-68fdf4b965-f6gm5" Dec 02 08:50:59 crc kubenswrapper[4895]: I1202 08:50:59.739109 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rx9fz\" (UniqueName: \"kubernetes.io/projected/1389de2c-a59b-4963-ab49-824c6df666d1-kube-api-access-rx9fz\") pod \"dnsmasq-dns-68fdf4b965-f6gm5\" (UID: \"1389de2c-a59b-4963-ab49-824c6df666d1\") " pod="openstack/dnsmasq-dns-68fdf4b965-f6gm5" Dec 02 08:50:59 crc kubenswrapper[4895]: I1202 08:50:59.914668 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-68fdf4b965-f6gm5" Dec 02 08:51:00 crc kubenswrapper[4895]: I1202 08:51:00.004470 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-578bcccf49-d2hnk"] Dec 02 08:51:00 crc kubenswrapper[4895]: W1202 08:51:00.007342 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbd12f3cf_71eb_48c7_afeb_b30cc06c6720.slice/crio-d5c925117d07ec2ca5f20dbe0c069dfbd10cb9de480683643e24084f46c57789 WatchSource:0}: Error finding container d5c925117d07ec2ca5f20dbe0c069dfbd10cb9de480683643e24084f46c57789: Status 404 returned error can't find the container with id d5c925117d07ec2ca5f20dbe0c069dfbd10cb9de480683643e24084f46c57789 Dec 02 08:51:00 crc kubenswrapper[4895]: I1202 08:51:00.207893 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-68fdf4b965-f6gm5"] Dec 02 08:51:00 crc kubenswrapper[4895]: W1202 08:51:00.212902 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1389de2c_a59b_4963_ab49_824c6df666d1.slice/crio-bcf6c5e7eaa6f4244e26d6920bcd5cbfba207f1bbeb9eaec15c2faa4ff0022ab WatchSource:0}: Error finding container bcf6c5e7eaa6f4244e26d6920bcd5cbfba207f1bbeb9eaec15c2faa4ff0022ab: Status 404 returned error can't find the container with id bcf6c5e7eaa6f4244e26d6920bcd5cbfba207f1bbeb9eaec15c2faa4ff0022ab Dec 02 08:51:00 crc kubenswrapper[4895]: I1202 08:51:00.427909 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68fdf4b965-f6gm5" event={"ID":"1389de2c-a59b-4963-ab49-824c6df666d1","Type":"ContainerStarted","Data":"9c77d4c808e5f41a0e36134b657ad33c15d6f6445daaaac293e1f103d79d10f4"} Dec 02 08:51:00 crc kubenswrapper[4895]: I1202 08:51:00.427995 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68fdf4b965-f6gm5" 
event={"ID":"1389de2c-a59b-4963-ab49-824c6df666d1","Type":"ContainerStarted","Data":"bcf6c5e7eaa6f4244e26d6920bcd5cbfba207f1bbeb9eaec15c2faa4ff0022ab"} Dec 02 08:51:00 crc kubenswrapper[4895]: I1202 08:51:00.431423 4895 generic.go:334] "Generic (PLEG): container finished" podID="bd12f3cf-71eb-48c7-afeb-b30cc06c6720" containerID="7214bc1aa5166f1a902ad2c6c68a9057171fa7f8e2c9865cf35d9321b4e2df89" exitCode=0 Dec 02 08:51:00 crc kubenswrapper[4895]: I1202 08:51:00.431505 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-578bcccf49-d2hnk" event={"ID":"bd12f3cf-71eb-48c7-afeb-b30cc06c6720","Type":"ContainerDied","Data":"7214bc1aa5166f1a902ad2c6c68a9057171fa7f8e2c9865cf35d9321b4e2df89"} Dec 02 08:51:00 crc kubenswrapper[4895]: I1202 08:51:00.435454 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-578bcccf49-d2hnk" event={"ID":"bd12f3cf-71eb-48c7-afeb-b30cc06c6720","Type":"ContainerStarted","Data":"d5c925117d07ec2ca5f20dbe0c069dfbd10cb9de480683643e24084f46c57789"} Dec 02 08:51:00 crc kubenswrapper[4895]: I1202 08:51:00.508875 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-2" Dec 02 08:51:00 crc kubenswrapper[4895]: I1202 08:51:00.833476 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-578bcccf49-d2hnk" Dec 02 08:51:00 crc kubenswrapper[4895]: I1202 08:51:00.957166 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bd12f3cf-71eb-48c7-afeb-b30cc06c6720-ovsdbserver-sb\") pod \"bd12f3cf-71eb-48c7-afeb-b30cc06c6720\" (UID: \"bd12f3cf-71eb-48c7-afeb-b30cc06c6720\") " Dec 02 08:51:00 crc kubenswrapper[4895]: I1202 08:51:00.957361 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pfjg7\" (UniqueName: \"kubernetes.io/projected/bd12f3cf-71eb-48c7-afeb-b30cc06c6720-kube-api-access-pfjg7\") pod \"bd12f3cf-71eb-48c7-afeb-b30cc06c6720\" (UID: \"bd12f3cf-71eb-48c7-afeb-b30cc06c6720\") " Dec 02 08:51:00 crc kubenswrapper[4895]: I1202 08:51:00.957395 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bd12f3cf-71eb-48c7-afeb-b30cc06c6720-dns-svc\") pod \"bd12f3cf-71eb-48c7-afeb-b30cc06c6720\" (UID: \"bd12f3cf-71eb-48c7-afeb-b30cc06c6720\") " Dec 02 08:51:00 crc kubenswrapper[4895]: I1202 08:51:00.957439 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd12f3cf-71eb-48c7-afeb-b30cc06c6720-config\") pod \"bd12f3cf-71eb-48c7-afeb-b30cc06c6720\" (UID: \"bd12f3cf-71eb-48c7-afeb-b30cc06c6720\") " Dec 02 08:51:00 crc kubenswrapper[4895]: I1202 08:51:00.963450 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd12f3cf-71eb-48c7-afeb-b30cc06c6720-kube-api-access-pfjg7" (OuterVolumeSpecName: "kube-api-access-pfjg7") pod "bd12f3cf-71eb-48c7-afeb-b30cc06c6720" (UID: "bd12f3cf-71eb-48c7-afeb-b30cc06c6720"). InnerVolumeSpecName "kube-api-access-pfjg7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:51:00 crc kubenswrapper[4895]: I1202 08:51:00.980389 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd12f3cf-71eb-48c7-afeb-b30cc06c6720-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "bd12f3cf-71eb-48c7-afeb-b30cc06c6720" (UID: "bd12f3cf-71eb-48c7-afeb-b30cc06c6720"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:51:00 crc kubenswrapper[4895]: E1202 08:51:00.981416 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/bd12f3cf-71eb-48c7-afeb-b30cc06c6720-ovsdbserver-sb podName:bd12f3cf-71eb-48c7-afeb-b30cc06c6720 nodeName:}" failed. No retries permitted until 2025-12-02 08:51:01.48137779 +0000 UTC m=+5272.652237403 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "ovsdbserver-sb" (UniqueName: "kubernetes.io/configmap/bd12f3cf-71eb-48c7-afeb-b30cc06c6720-ovsdbserver-sb") pod "bd12f3cf-71eb-48c7-afeb-b30cc06c6720" (UID: "bd12f3cf-71eb-48c7-afeb-b30cc06c6720") : error deleting /var/lib/kubelet/pods/bd12f3cf-71eb-48c7-afeb-b30cc06c6720/volume-subpaths: remove /var/lib/kubelet/pods/bd12f3cf-71eb-48c7-afeb-b30cc06c6720/volume-subpaths: no such file or directory Dec 02 08:51:00 crc kubenswrapper[4895]: I1202 08:51:00.981656 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd12f3cf-71eb-48c7-afeb-b30cc06c6720-config" (OuterVolumeSpecName: "config") pod "bd12f3cf-71eb-48c7-afeb-b30cc06c6720" (UID: "bd12f3cf-71eb-48c7-afeb-b30cc06c6720"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:51:01 crc kubenswrapper[4895]: I1202 08:51:01.060432 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pfjg7\" (UniqueName: \"kubernetes.io/projected/bd12f3cf-71eb-48c7-afeb-b30cc06c6720-kube-api-access-pfjg7\") on node \"crc\" DevicePath \"\"" Dec 02 08:51:01 crc kubenswrapper[4895]: I1202 08:51:01.060474 4895 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bd12f3cf-71eb-48c7-afeb-b30cc06c6720-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 02 08:51:01 crc kubenswrapper[4895]: I1202 08:51:01.060485 4895 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd12f3cf-71eb-48c7-afeb-b30cc06c6720-config\") on node \"crc\" DevicePath \"\"" Dec 02 08:51:01 crc kubenswrapper[4895]: I1202 08:51:01.444201 4895 generic.go:334] "Generic (PLEG): container finished" podID="1389de2c-a59b-4963-ab49-824c6df666d1" containerID="9c77d4c808e5f41a0e36134b657ad33c15d6f6445daaaac293e1f103d79d10f4" exitCode=0 Dec 02 08:51:01 crc kubenswrapper[4895]: I1202 08:51:01.444290 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68fdf4b965-f6gm5" event={"ID":"1389de2c-a59b-4963-ab49-824c6df666d1","Type":"ContainerDied","Data":"9c77d4c808e5f41a0e36134b657ad33c15d6f6445daaaac293e1f103d79d10f4"} Dec 02 08:51:01 crc kubenswrapper[4895]: I1202 08:51:01.447939 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-578bcccf49-d2hnk" Dec 02 08:51:01 crc kubenswrapper[4895]: I1202 08:51:01.447927 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-578bcccf49-d2hnk" event={"ID":"bd12f3cf-71eb-48c7-afeb-b30cc06c6720","Type":"ContainerDied","Data":"d5c925117d07ec2ca5f20dbe0c069dfbd10cb9de480683643e24084f46c57789"} Dec 02 08:51:01 crc kubenswrapper[4895]: I1202 08:51:01.448173 4895 scope.go:117] "RemoveContainer" containerID="7214bc1aa5166f1a902ad2c6c68a9057171fa7f8e2c9865cf35d9321b4e2df89" Dec 02 08:51:01 crc kubenswrapper[4895]: I1202 08:51:01.572726 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bd12f3cf-71eb-48c7-afeb-b30cc06c6720-ovsdbserver-sb\") pod \"bd12f3cf-71eb-48c7-afeb-b30cc06c6720\" (UID: \"bd12f3cf-71eb-48c7-afeb-b30cc06c6720\") " Dec 02 08:51:01 crc kubenswrapper[4895]: I1202 08:51:01.574706 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd12f3cf-71eb-48c7-afeb-b30cc06c6720-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "bd12f3cf-71eb-48c7-afeb-b30cc06c6720" (UID: "bd12f3cf-71eb-48c7-afeb-b30cc06c6720"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:51:01 crc kubenswrapper[4895]: I1202 08:51:01.674840 4895 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bd12f3cf-71eb-48c7-afeb-b30cc06c6720-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 02 08:51:01 crc kubenswrapper[4895]: I1202 08:51:01.853486 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-578bcccf49-d2hnk"] Dec 02 08:51:01 crc kubenswrapper[4895]: I1202 08:51:01.859808 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-578bcccf49-d2hnk"] Dec 02 08:51:02 crc kubenswrapper[4895]: I1202 08:51:02.142149 4895 scope.go:117] "RemoveContainer" containerID="5f3bd2c356ea4faeb32e1c7dba83504ac381c8d421d130309e1326c5000e3875" Dec 02 08:51:02 crc kubenswrapper[4895]: E1202 08:51:02.143106 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 08:51:02 crc kubenswrapper[4895]: I1202 08:51:02.461883 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68fdf4b965-f6gm5" event={"ID":"1389de2c-a59b-4963-ab49-824c6df666d1","Type":"ContainerStarted","Data":"a1f45b47ec81a501d51a19f21177dfb8c3b2bc94ddff39d3fe5634b12d509be8"} Dec 02 08:51:02 crc kubenswrapper[4895]: I1202 08:51:02.462488 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-68fdf4b965-f6gm5" Dec 02 08:51:02 crc kubenswrapper[4895]: I1202 08:51:02.495169 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-68fdf4b965-f6gm5" 
podStartSLOduration=3.495136184 podStartE2EDuration="3.495136184s" podCreationTimestamp="2025-12-02 08:50:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:51:02.489214029 +0000 UTC m=+5273.660073662" watchObservedRunningTime="2025-12-02 08:51:02.495136184 +0000 UTC m=+5273.665995807" Dec 02 08:51:03 crc kubenswrapper[4895]: I1202 08:51:03.160028 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd12f3cf-71eb-48c7-afeb-b30cc06c6720" path="/var/lib/kubelet/pods/bd12f3cf-71eb-48c7-afeb-b30cc06c6720/volumes" Dec 02 08:51:03 crc kubenswrapper[4895]: I1202 08:51:03.479248 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-copy-data"] Dec 02 08:51:03 crc kubenswrapper[4895]: E1202 08:51:03.479671 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd12f3cf-71eb-48c7-afeb-b30cc06c6720" containerName="init" Dec 02 08:51:03 crc kubenswrapper[4895]: I1202 08:51:03.479689 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd12f3cf-71eb-48c7-afeb-b30cc06c6720" containerName="init" Dec 02 08:51:03 crc kubenswrapper[4895]: I1202 08:51:03.479961 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd12f3cf-71eb-48c7-afeb-b30cc06c6720" containerName="init" Dec 02 08:51:03 crc kubenswrapper[4895]: I1202 08:51:03.480662 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-copy-data" Dec 02 08:51:03 crc kubenswrapper[4895]: I1202 08:51:03.487412 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovn-data-cert" Dec 02 08:51:03 crc kubenswrapper[4895]: I1202 08:51:03.497710 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-copy-data"] Dec 02 08:51:03 crc kubenswrapper[4895]: I1202 08:51:03.615578 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/1b36ab26-6b37-4af5-bafe-35ef3c888ea9-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"1b36ab26-6b37-4af5-bafe-35ef3c888ea9\") " pod="openstack/ovn-copy-data" Dec 02 08:51:03 crc kubenswrapper[4895]: I1202 08:51:03.615692 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkplx\" (UniqueName: \"kubernetes.io/projected/1b36ab26-6b37-4af5-bafe-35ef3c888ea9-kube-api-access-xkplx\") pod \"ovn-copy-data\" (UID: \"1b36ab26-6b37-4af5-bafe-35ef3c888ea9\") " pod="openstack/ovn-copy-data" Dec 02 08:51:03 crc kubenswrapper[4895]: I1202 08:51:03.615776 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-95fa9521-57ce-4e7f-b49d-775cd966ae37\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-95fa9521-57ce-4e7f-b49d-775cd966ae37\") pod \"ovn-copy-data\" (UID: \"1b36ab26-6b37-4af5-bafe-35ef3c888ea9\") " pod="openstack/ovn-copy-data" Dec 02 08:51:03 crc kubenswrapper[4895]: I1202 08:51:03.717855 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/1b36ab26-6b37-4af5-bafe-35ef3c888ea9-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"1b36ab26-6b37-4af5-bafe-35ef3c888ea9\") " pod="openstack/ovn-copy-data" Dec 02 08:51:03 crc kubenswrapper[4895]: I1202 08:51:03.717938 4895 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-xkplx\" (UniqueName: \"kubernetes.io/projected/1b36ab26-6b37-4af5-bafe-35ef3c888ea9-kube-api-access-xkplx\") pod \"ovn-copy-data\" (UID: \"1b36ab26-6b37-4af5-bafe-35ef3c888ea9\") " pod="openstack/ovn-copy-data" Dec 02 08:51:03 crc kubenswrapper[4895]: I1202 08:51:03.717970 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-95fa9521-57ce-4e7f-b49d-775cd966ae37\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-95fa9521-57ce-4e7f-b49d-775cd966ae37\") pod \"ovn-copy-data\" (UID: \"1b36ab26-6b37-4af5-bafe-35ef3c888ea9\") " pod="openstack/ovn-copy-data" Dec 02 08:51:03 crc kubenswrapper[4895]: I1202 08:51:03.723097 4895 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 02 08:51:03 crc kubenswrapper[4895]: I1202 08:51:03.723155 4895 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-95fa9521-57ce-4e7f-b49d-775cd966ae37\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-95fa9521-57ce-4e7f-b49d-775cd966ae37\") pod \"ovn-copy-data\" (UID: \"1b36ab26-6b37-4af5-bafe-35ef3c888ea9\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/57a9d52ec2fe74a21a6c6b6a27189c624012b911a2e91e10fd26030f0cbb3f0e/globalmount\"" pod="openstack/ovn-copy-data" Dec 02 08:51:03 crc kubenswrapper[4895]: I1202 08:51:03.731961 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/1b36ab26-6b37-4af5-bafe-35ef3c888ea9-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"1b36ab26-6b37-4af5-bafe-35ef3c888ea9\") " pod="openstack/ovn-copy-data" Dec 02 08:51:03 crc kubenswrapper[4895]: I1202 08:51:03.756525 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkplx\" (UniqueName: 
\"kubernetes.io/projected/1b36ab26-6b37-4af5-bafe-35ef3c888ea9-kube-api-access-xkplx\") pod \"ovn-copy-data\" (UID: \"1b36ab26-6b37-4af5-bafe-35ef3c888ea9\") " pod="openstack/ovn-copy-data" Dec 02 08:51:03 crc kubenswrapper[4895]: I1202 08:51:03.782849 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-95fa9521-57ce-4e7f-b49d-775cd966ae37\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-95fa9521-57ce-4e7f-b49d-775cd966ae37\") pod \"ovn-copy-data\" (UID: \"1b36ab26-6b37-4af5-bafe-35ef3c888ea9\") " pod="openstack/ovn-copy-data" Dec 02 08:51:03 crc kubenswrapper[4895]: I1202 08:51:03.819258 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-copy-data" Dec 02 08:51:04 crc kubenswrapper[4895]: I1202 08:51:04.417617 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-copy-data"] Dec 02 08:51:04 crc kubenswrapper[4895]: I1202 08:51:04.491844 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"1b36ab26-6b37-4af5-bafe-35ef3c888ea9","Type":"ContainerStarted","Data":"554d77df12aa559d3405ab66ffef7756667388cf42d2939f9d381efac894f53c"} Dec 02 08:51:05 crc kubenswrapper[4895]: I1202 08:51:05.505562 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"1b36ab26-6b37-4af5-bafe-35ef3c888ea9","Type":"ContainerStarted","Data":"922ca77dd580b2b692105a955f4ea6aec16dfc8119cb2bf53760f1b64a2e2119"} Dec 02 08:51:05 crc kubenswrapper[4895]: I1202 08:51:05.531257 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-copy-data" podStartSLOduration=3.531237777 podStartE2EDuration="3.531237777s" podCreationTimestamp="2025-12-02 08:51:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:51:05.525840249 +0000 UTC m=+5276.696699892" 
watchObservedRunningTime="2025-12-02 08:51:05.531237777 +0000 UTC m=+5276.702097390" Dec 02 08:51:09 crc kubenswrapper[4895]: I1202 08:51:09.916922 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-68fdf4b965-f6gm5" Dec 02 08:51:09 crc kubenswrapper[4895]: I1202 08:51:09.992338 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b7946d7b9-rffv6"] Dec 02 08:51:09 crc kubenswrapper[4895]: I1202 08:51:09.992894 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5b7946d7b9-rffv6" podUID="b8201768-d804-4993-bf5a-8e81c0be77d0" containerName="dnsmasq-dns" containerID="cri-o://a6319ba5ee6d16baaba31e0ba7839f456a8ec4d3be0d0d1469a8822429b5b767" gracePeriod=10 Dec 02 08:51:10 crc kubenswrapper[4895]: I1202 08:51:10.539733 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b7946d7b9-rffv6" Dec 02 08:51:10 crc kubenswrapper[4895]: I1202 08:51:10.558426 4895 generic.go:334] "Generic (PLEG): container finished" podID="b8201768-d804-4993-bf5a-8e81c0be77d0" containerID="a6319ba5ee6d16baaba31e0ba7839f456a8ec4d3be0d0d1469a8822429b5b767" exitCode=0 Dec 02 08:51:10 crc kubenswrapper[4895]: I1202 08:51:10.558530 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b7946d7b9-rffv6" Dec 02 08:51:10 crc kubenswrapper[4895]: I1202 08:51:10.558503 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b7946d7b9-rffv6" event={"ID":"b8201768-d804-4993-bf5a-8e81c0be77d0","Type":"ContainerDied","Data":"a6319ba5ee6d16baaba31e0ba7839f456a8ec4d3be0d0d1469a8822429b5b767"} Dec 02 08:51:10 crc kubenswrapper[4895]: I1202 08:51:10.558840 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b7946d7b9-rffv6" event={"ID":"b8201768-d804-4993-bf5a-8e81c0be77d0","Type":"ContainerDied","Data":"b750817bf6c9b95d40b25b4247570e66d947b40aa32e73c02fa9f96b5f62c1c7"} Dec 02 08:51:10 crc kubenswrapper[4895]: I1202 08:51:10.558867 4895 scope.go:117] "RemoveContainer" containerID="a6319ba5ee6d16baaba31e0ba7839f456a8ec4d3be0d0d1469a8822429b5b767" Dec 02 08:51:10 crc kubenswrapper[4895]: I1202 08:51:10.593439 4895 scope.go:117] "RemoveContainer" containerID="deb8889c5c1dc400f16a10c7a3794c3c01a4819d62d6e9e35144caadf15c48d9" Dec 02 08:51:10 crc kubenswrapper[4895]: I1202 08:51:10.639001 4895 scope.go:117] "RemoveContainer" containerID="a6319ba5ee6d16baaba31e0ba7839f456a8ec4d3be0d0d1469a8822429b5b767" Dec 02 08:51:10 crc kubenswrapper[4895]: E1202 08:51:10.639582 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a6319ba5ee6d16baaba31e0ba7839f456a8ec4d3be0d0d1469a8822429b5b767\": container with ID starting with a6319ba5ee6d16baaba31e0ba7839f456a8ec4d3be0d0d1469a8822429b5b767 not found: ID does not exist" containerID="a6319ba5ee6d16baaba31e0ba7839f456a8ec4d3be0d0d1469a8822429b5b767" Dec 02 08:51:10 crc kubenswrapper[4895]: I1202 08:51:10.639655 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6319ba5ee6d16baaba31e0ba7839f456a8ec4d3be0d0d1469a8822429b5b767"} err="failed to get container status 
\"a6319ba5ee6d16baaba31e0ba7839f456a8ec4d3be0d0d1469a8822429b5b767\": rpc error: code = NotFound desc = could not find container \"a6319ba5ee6d16baaba31e0ba7839f456a8ec4d3be0d0d1469a8822429b5b767\": container with ID starting with a6319ba5ee6d16baaba31e0ba7839f456a8ec4d3be0d0d1469a8822429b5b767 not found: ID does not exist" Dec 02 08:51:10 crc kubenswrapper[4895]: I1202 08:51:10.639684 4895 scope.go:117] "RemoveContainer" containerID="deb8889c5c1dc400f16a10c7a3794c3c01a4819d62d6e9e35144caadf15c48d9" Dec 02 08:51:10 crc kubenswrapper[4895]: E1202 08:51:10.640156 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"deb8889c5c1dc400f16a10c7a3794c3c01a4819d62d6e9e35144caadf15c48d9\": container with ID starting with deb8889c5c1dc400f16a10c7a3794c3c01a4819d62d6e9e35144caadf15c48d9 not found: ID does not exist" containerID="deb8889c5c1dc400f16a10c7a3794c3c01a4819d62d6e9e35144caadf15c48d9" Dec 02 08:51:10 crc kubenswrapper[4895]: I1202 08:51:10.640186 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"deb8889c5c1dc400f16a10c7a3794c3c01a4819d62d6e9e35144caadf15c48d9"} err="failed to get container status \"deb8889c5c1dc400f16a10c7a3794c3c01a4819d62d6e9e35144caadf15c48d9\": rpc error: code = NotFound desc = could not find container \"deb8889c5c1dc400f16a10c7a3794c3c01a4819d62d6e9e35144caadf15c48d9\": container with ID starting with deb8889c5c1dc400f16a10c7a3794c3c01a4819d62d6e9e35144caadf15c48d9 not found: ID does not exist" Dec 02 08:51:10 crc kubenswrapper[4895]: I1202 08:51:10.677471 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8201768-d804-4993-bf5a-8e81c0be77d0-config\") pod \"b8201768-d804-4993-bf5a-8e81c0be77d0\" (UID: \"b8201768-d804-4993-bf5a-8e81c0be77d0\") " Dec 02 08:51:10 crc kubenswrapper[4895]: I1202 08:51:10.677523 4895 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b8201768-d804-4993-bf5a-8e81c0be77d0-dns-svc\") pod \"b8201768-d804-4993-bf5a-8e81c0be77d0\" (UID: \"b8201768-d804-4993-bf5a-8e81c0be77d0\") " Dec 02 08:51:10 crc kubenswrapper[4895]: I1202 08:51:10.677696 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cq5hl\" (UniqueName: \"kubernetes.io/projected/b8201768-d804-4993-bf5a-8e81c0be77d0-kube-api-access-cq5hl\") pod \"b8201768-d804-4993-bf5a-8e81c0be77d0\" (UID: \"b8201768-d804-4993-bf5a-8e81c0be77d0\") " Dec 02 08:51:10 crc kubenswrapper[4895]: I1202 08:51:10.684683 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8201768-d804-4993-bf5a-8e81c0be77d0-kube-api-access-cq5hl" (OuterVolumeSpecName: "kube-api-access-cq5hl") pod "b8201768-d804-4993-bf5a-8e81c0be77d0" (UID: "b8201768-d804-4993-bf5a-8e81c0be77d0"). InnerVolumeSpecName "kube-api-access-cq5hl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:51:10 crc kubenswrapper[4895]: I1202 08:51:10.726682 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8201768-d804-4993-bf5a-8e81c0be77d0-config" (OuterVolumeSpecName: "config") pod "b8201768-d804-4993-bf5a-8e81c0be77d0" (UID: "b8201768-d804-4993-bf5a-8e81c0be77d0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:51:10 crc kubenswrapper[4895]: I1202 08:51:10.728107 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8201768-d804-4993-bf5a-8e81c0be77d0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b8201768-d804-4993-bf5a-8e81c0be77d0" (UID: "b8201768-d804-4993-bf5a-8e81c0be77d0"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:51:10 crc kubenswrapper[4895]: I1202 08:51:10.779797 4895 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b8201768-d804-4993-bf5a-8e81c0be77d0-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 02 08:51:10 crc kubenswrapper[4895]: I1202 08:51:10.779834 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cq5hl\" (UniqueName: \"kubernetes.io/projected/b8201768-d804-4993-bf5a-8e81c0be77d0-kube-api-access-cq5hl\") on node \"crc\" DevicePath \"\"" Dec 02 08:51:10 crc kubenswrapper[4895]: I1202 08:51:10.779846 4895 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8201768-d804-4993-bf5a-8e81c0be77d0-config\") on node \"crc\" DevicePath \"\"" Dec 02 08:51:10 crc kubenswrapper[4895]: I1202 08:51:10.823775 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Dec 02 08:51:10 crc kubenswrapper[4895]: E1202 08:51:10.824295 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8201768-d804-4993-bf5a-8e81c0be77d0" containerName="dnsmasq-dns" Dec 02 08:51:10 crc kubenswrapper[4895]: I1202 08:51:10.824323 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8201768-d804-4993-bf5a-8e81c0be77d0" containerName="dnsmasq-dns" Dec 02 08:51:10 crc kubenswrapper[4895]: E1202 08:51:10.824369 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8201768-d804-4993-bf5a-8e81c0be77d0" containerName="init" Dec 02 08:51:10 crc kubenswrapper[4895]: I1202 08:51:10.824381 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8201768-d804-4993-bf5a-8e81c0be77d0" containerName="init" Dec 02 08:51:10 crc kubenswrapper[4895]: I1202 08:51:10.824588 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8201768-d804-4993-bf5a-8e81c0be77d0" containerName="dnsmasq-dns" Dec 02 08:51:10 crc kubenswrapper[4895]: I1202 
08:51:10.825922 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Dec 02 08:51:10 crc kubenswrapper[4895]: I1202 08:51:10.836158 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Dec 02 08:51:10 crc kubenswrapper[4895]: I1202 08:51:10.836238 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-sh2zr" Dec 02 08:51:10 crc kubenswrapper[4895]: I1202 08:51:10.836308 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Dec 02 08:51:10 crc kubenswrapper[4895]: I1202 08:51:10.859939 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 02 08:51:10 crc kubenswrapper[4895]: I1202 08:51:10.911874 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b7946d7b9-rffv6"] Dec 02 08:51:10 crc kubenswrapper[4895]: I1202 08:51:10.921325 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b7946d7b9-rffv6"] Dec 02 08:51:10 crc kubenswrapper[4895]: I1202 08:51:10.982822 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ee53cb1-8853-4c8d-a58a-beba1d7cbea4-config\") pod \"ovn-northd-0\" (UID: \"7ee53cb1-8853-4c8d-a58a-beba1d7cbea4\") " pod="openstack/ovn-northd-0" Dec 02 08:51:10 crc kubenswrapper[4895]: I1202 08:51:10.982892 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ee53cb1-8853-4c8d-a58a-beba1d7cbea4-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"7ee53cb1-8853-4c8d-a58a-beba1d7cbea4\") " pod="openstack/ovn-northd-0" Dec 02 08:51:10 crc kubenswrapper[4895]: I1202 08:51:10.982927 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-cn6f7\" (UniqueName: \"kubernetes.io/projected/7ee53cb1-8853-4c8d-a58a-beba1d7cbea4-kube-api-access-cn6f7\") pod \"ovn-northd-0\" (UID: \"7ee53cb1-8853-4c8d-a58a-beba1d7cbea4\") " pod="openstack/ovn-northd-0" Dec 02 08:51:10 crc kubenswrapper[4895]: I1202 08:51:10.982990 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7ee53cb1-8853-4c8d-a58a-beba1d7cbea4-scripts\") pod \"ovn-northd-0\" (UID: \"7ee53cb1-8853-4c8d-a58a-beba1d7cbea4\") " pod="openstack/ovn-northd-0" Dec 02 08:51:10 crc kubenswrapper[4895]: I1202 08:51:10.983037 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7ee53cb1-8853-4c8d-a58a-beba1d7cbea4-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"7ee53cb1-8853-4c8d-a58a-beba1d7cbea4\") " pod="openstack/ovn-northd-0" Dec 02 08:51:11 crc kubenswrapper[4895]: I1202 08:51:11.085172 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ee53cb1-8853-4c8d-a58a-beba1d7cbea4-config\") pod \"ovn-northd-0\" (UID: \"7ee53cb1-8853-4c8d-a58a-beba1d7cbea4\") " pod="openstack/ovn-northd-0" Dec 02 08:51:11 crc kubenswrapper[4895]: I1202 08:51:11.086184 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ee53cb1-8853-4c8d-a58a-beba1d7cbea4-config\") pod \"ovn-northd-0\" (UID: \"7ee53cb1-8853-4c8d-a58a-beba1d7cbea4\") " pod="openstack/ovn-northd-0" Dec 02 08:51:11 crc kubenswrapper[4895]: I1202 08:51:11.086534 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ee53cb1-8853-4c8d-a58a-beba1d7cbea4-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"7ee53cb1-8853-4c8d-a58a-beba1d7cbea4\") " 
pod="openstack/ovn-northd-0" Dec 02 08:51:11 crc kubenswrapper[4895]: I1202 08:51:11.086799 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cn6f7\" (UniqueName: \"kubernetes.io/projected/7ee53cb1-8853-4c8d-a58a-beba1d7cbea4-kube-api-access-cn6f7\") pod \"ovn-northd-0\" (UID: \"7ee53cb1-8853-4c8d-a58a-beba1d7cbea4\") " pod="openstack/ovn-northd-0" Dec 02 08:51:11 crc kubenswrapper[4895]: I1202 08:51:11.087315 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7ee53cb1-8853-4c8d-a58a-beba1d7cbea4-scripts\") pod \"ovn-northd-0\" (UID: \"7ee53cb1-8853-4c8d-a58a-beba1d7cbea4\") " pod="openstack/ovn-northd-0" Dec 02 08:51:11 crc kubenswrapper[4895]: I1202 08:51:11.088047 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7ee53cb1-8853-4c8d-a58a-beba1d7cbea4-scripts\") pod \"ovn-northd-0\" (UID: \"7ee53cb1-8853-4c8d-a58a-beba1d7cbea4\") " pod="openstack/ovn-northd-0" Dec 02 08:51:11 crc kubenswrapper[4895]: I1202 08:51:11.088198 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7ee53cb1-8853-4c8d-a58a-beba1d7cbea4-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"7ee53cb1-8853-4c8d-a58a-beba1d7cbea4\") " pod="openstack/ovn-northd-0" Dec 02 08:51:11 crc kubenswrapper[4895]: I1202 08:51:11.088646 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7ee53cb1-8853-4c8d-a58a-beba1d7cbea4-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"7ee53cb1-8853-4c8d-a58a-beba1d7cbea4\") " pod="openstack/ovn-northd-0" Dec 02 08:51:11 crc kubenswrapper[4895]: I1202 08:51:11.093317 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/7ee53cb1-8853-4c8d-a58a-beba1d7cbea4-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"7ee53cb1-8853-4c8d-a58a-beba1d7cbea4\") " pod="openstack/ovn-northd-0" Dec 02 08:51:11 crc kubenswrapper[4895]: I1202 08:51:11.104350 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cn6f7\" (UniqueName: \"kubernetes.io/projected/7ee53cb1-8853-4c8d-a58a-beba1d7cbea4-kube-api-access-cn6f7\") pod \"ovn-northd-0\" (UID: \"7ee53cb1-8853-4c8d-a58a-beba1d7cbea4\") " pod="openstack/ovn-northd-0" Dec 02 08:51:11 crc kubenswrapper[4895]: I1202 08:51:11.159309 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8201768-d804-4993-bf5a-8e81c0be77d0" path="/var/lib/kubelet/pods/b8201768-d804-4993-bf5a-8e81c0be77d0/volumes" Dec 02 08:51:11 crc kubenswrapper[4895]: I1202 08:51:11.159305 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Dec 02 08:51:11 crc kubenswrapper[4895]: I1202 08:51:11.691310 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 02 08:51:12 crc kubenswrapper[4895]: I1202 08:51:12.587482 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"7ee53cb1-8853-4c8d-a58a-beba1d7cbea4","Type":"ContainerStarted","Data":"caefb9875a19710041f478f8ced16f5ad369b5827cf76df60b62592179392eff"} Dec 02 08:51:12 crc kubenswrapper[4895]: I1202 08:51:12.588050 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Dec 02 08:51:12 crc kubenswrapper[4895]: I1202 08:51:12.588066 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"7ee53cb1-8853-4c8d-a58a-beba1d7cbea4","Type":"ContainerStarted","Data":"7724606cfad58dbc12b71aa1d2d269029c9850c3c42498d43c5ee09648947b2b"} Dec 02 08:51:12 crc kubenswrapper[4895]: I1202 08:51:12.588078 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-northd-0" event={"ID":"7ee53cb1-8853-4c8d-a58a-beba1d7cbea4","Type":"ContainerStarted","Data":"3f6e966c17f50445d99397b9f64f5a9f95a327dd52f19e40d1882a0bb65c7fe2"} Dec 02 08:51:12 crc kubenswrapper[4895]: I1202 08:51:12.628428 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.628393621 podStartE2EDuration="2.628393621s" podCreationTimestamp="2025-12-02 08:51:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:51:12.613458356 +0000 UTC m=+5283.784317979" watchObservedRunningTime="2025-12-02 08:51:12.628393621 +0000 UTC m=+5283.799253234" Dec 02 08:51:13 crc kubenswrapper[4895]: I1202 08:51:13.142026 4895 scope.go:117] "RemoveContainer" containerID="5f3bd2c356ea4faeb32e1c7dba83504ac381c8d421d130309e1326c5000e3875" Dec 02 08:51:13 crc kubenswrapper[4895]: E1202 08:51:13.143054 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 08:51:16 crc kubenswrapper[4895]: I1202 08:51:16.354051 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-r6xf4"] Dec 02 08:51:16 crc kubenswrapper[4895]: I1202 08:51:16.355740 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-r6xf4" Dec 02 08:51:16 crc kubenswrapper[4895]: I1202 08:51:16.372884 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-5328-account-create-update-js8vj"] Dec 02 08:51:16 crc kubenswrapper[4895]: I1202 08:51:16.374251 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5328-account-create-update-js8vj" Dec 02 08:51:16 crc kubenswrapper[4895]: I1202 08:51:16.376461 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Dec 02 08:51:16 crc kubenswrapper[4895]: I1202 08:51:16.385965 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-r6xf4"] Dec 02 08:51:16 crc kubenswrapper[4895]: I1202 08:51:16.393160 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5328-account-create-update-js8vj"] Dec 02 08:51:16 crc kubenswrapper[4895]: I1202 08:51:16.498622 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7kjh7\" (UniqueName: \"kubernetes.io/projected/dc7ae511-f6f1-4369-92ef-2314d47f0bc9-kube-api-access-7kjh7\") pod \"keystone-db-create-r6xf4\" (UID: \"dc7ae511-f6f1-4369-92ef-2314d47f0bc9\") " pod="openstack/keystone-db-create-r6xf4" Dec 02 08:51:16 crc kubenswrapper[4895]: I1202 08:51:16.498684 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6ssx\" (UniqueName: \"kubernetes.io/projected/00e12a0b-dbf2-4e96-ae0e-b6c71cdd9ca2-kube-api-access-t6ssx\") pod \"keystone-5328-account-create-update-js8vj\" (UID: \"00e12a0b-dbf2-4e96-ae0e-b6c71cdd9ca2\") " pod="openstack/keystone-5328-account-create-update-js8vj" Dec 02 08:51:16 crc kubenswrapper[4895]: I1202 08:51:16.498784 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/00e12a0b-dbf2-4e96-ae0e-b6c71cdd9ca2-operator-scripts\") pod \"keystone-5328-account-create-update-js8vj\" (UID: \"00e12a0b-dbf2-4e96-ae0e-b6c71cdd9ca2\") " pod="openstack/keystone-5328-account-create-update-js8vj" Dec 02 08:51:16 crc kubenswrapper[4895]: I1202 08:51:16.498825 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dc7ae511-f6f1-4369-92ef-2314d47f0bc9-operator-scripts\") pod \"keystone-db-create-r6xf4\" (UID: \"dc7ae511-f6f1-4369-92ef-2314d47f0bc9\") " pod="openstack/keystone-db-create-r6xf4" Dec 02 08:51:16 crc kubenswrapper[4895]: I1202 08:51:16.600420 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dc7ae511-f6f1-4369-92ef-2314d47f0bc9-operator-scripts\") pod \"keystone-db-create-r6xf4\" (UID: \"dc7ae511-f6f1-4369-92ef-2314d47f0bc9\") " pod="openstack/keystone-db-create-r6xf4" Dec 02 08:51:16 crc kubenswrapper[4895]: I1202 08:51:16.600527 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7kjh7\" (UniqueName: \"kubernetes.io/projected/dc7ae511-f6f1-4369-92ef-2314d47f0bc9-kube-api-access-7kjh7\") pod \"keystone-db-create-r6xf4\" (UID: \"dc7ae511-f6f1-4369-92ef-2314d47f0bc9\") " pod="openstack/keystone-db-create-r6xf4" Dec 02 08:51:16 crc kubenswrapper[4895]: I1202 08:51:16.600553 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t6ssx\" (UniqueName: \"kubernetes.io/projected/00e12a0b-dbf2-4e96-ae0e-b6c71cdd9ca2-kube-api-access-t6ssx\") pod \"keystone-5328-account-create-update-js8vj\" (UID: \"00e12a0b-dbf2-4e96-ae0e-b6c71cdd9ca2\") " pod="openstack/keystone-5328-account-create-update-js8vj" Dec 02 08:51:16 crc kubenswrapper[4895]: I1202 08:51:16.600612 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/00e12a0b-dbf2-4e96-ae0e-b6c71cdd9ca2-operator-scripts\") pod \"keystone-5328-account-create-update-js8vj\" (UID: \"00e12a0b-dbf2-4e96-ae0e-b6c71cdd9ca2\") " pod="openstack/keystone-5328-account-create-update-js8vj" Dec 02 08:51:16 crc kubenswrapper[4895]: I1202 08:51:16.601378 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/00e12a0b-dbf2-4e96-ae0e-b6c71cdd9ca2-operator-scripts\") pod \"keystone-5328-account-create-update-js8vj\" (UID: \"00e12a0b-dbf2-4e96-ae0e-b6c71cdd9ca2\") " pod="openstack/keystone-5328-account-create-update-js8vj" Dec 02 08:51:16 crc kubenswrapper[4895]: I1202 08:51:16.602250 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dc7ae511-f6f1-4369-92ef-2314d47f0bc9-operator-scripts\") pod \"keystone-db-create-r6xf4\" (UID: \"dc7ae511-f6f1-4369-92ef-2314d47f0bc9\") " pod="openstack/keystone-db-create-r6xf4" Dec 02 08:51:16 crc kubenswrapper[4895]: I1202 08:51:16.620945 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7kjh7\" (UniqueName: \"kubernetes.io/projected/dc7ae511-f6f1-4369-92ef-2314d47f0bc9-kube-api-access-7kjh7\") pod \"keystone-db-create-r6xf4\" (UID: \"dc7ae511-f6f1-4369-92ef-2314d47f0bc9\") " pod="openstack/keystone-db-create-r6xf4" Dec 02 08:51:16 crc kubenswrapper[4895]: I1202 08:51:16.626991 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6ssx\" (UniqueName: \"kubernetes.io/projected/00e12a0b-dbf2-4e96-ae0e-b6c71cdd9ca2-kube-api-access-t6ssx\") pod \"keystone-5328-account-create-update-js8vj\" (UID: \"00e12a0b-dbf2-4e96-ae0e-b6c71cdd9ca2\") " pod="openstack/keystone-5328-account-create-update-js8vj" Dec 02 08:51:16 crc kubenswrapper[4895]: I1202 08:51:16.690880 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-r6xf4" Dec 02 08:51:16 crc kubenswrapper[4895]: I1202 08:51:16.700652 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5328-account-create-update-js8vj" Dec 02 08:51:17 crc kubenswrapper[4895]: I1202 08:51:17.131364 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-r6xf4"] Dec 02 08:51:17 crc kubenswrapper[4895]: W1202 08:51:17.139697 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddc7ae511_f6f1_4369_92ef_2314d47f0bc9.slice/crio-599ac8163a4e267eb32da1d042f7b96c5447683c688d48f01f6590fcf92c0e53 WatchSource:0}: Error finding container 599ac8163a4e267eb32da1d042f7b96c5447683c688d48f01f6590fcf92c0e53: Status 404 returned error can't find the container with id 599ac8163a4e267eb32da1d042f7b96c5447683c688d48f01f6590fcf92c0e53 Dec 02 08:51:17 crc kubenswrapper[4895]: I1202 08:51:17.196827 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5328-account-create-update-js8vj"] Dec 02 08:51:17 crc kubenswrapper[4895]: W1202 08:51:17.206553 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod00e12a0b_dbf2_4e96_ae0e_b6c71cdd9ca2.slice/crio-6aa138c983a596fd70c7a098fbe5ff2e6168a73eda2ce308c533d9711e0931fe WatchSource:0}: Error finding container 6aa138c983a596fd70c7a098fbe5ff2e6168a73eda2ce308c533d9711e0931fe: Status 404 returned error can't find the container with id 6aa138c983a596fd70c7a098fbe5ff2e6168a73eda2ce308c533d9711e0931fe Dec 02 08:51:17 crc kubenswrapper[4895]: I1202 08:51:17.635954 4895 generic.go:334] "Generic (PLEG): container finished" podID="dc7ae511-f6f1-4369-92ef-2314d47f0bc9" containerID="2dbf99bfd146113519ce737354e8a75a92110def9bd511e5dca1d8d1ae7e3434" exitCode=0 Dec 02 08:51:17 crc kubenswrapper[4895]: I1202 
08:51:17.636060 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-r6xf4" event={"ID":"dc7ae511-f6f1-4369-92ef-2314d47f0bc9","Type":"ContainerDied","Data":"2dbf99bfd146113519ce737354e8a75a92110def9bd511e5dca1d8d1ae7e3434"} Dec 02 08:51:17 crc kubenswrapper[4895]: I1202 08:51:17.636236 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-r6xf4" event={"ID":"dc7ae511-f6f1-4369-92ef-2314d47f0bc9","Type":"ContainerStarted","Data":"599ac8163a4e267eb32da1d042f7b96c5447683c688d48f01f6590fcf92c0e53"} Dec 02 08:51:17 crc kubenswrapper[4895]: I1202 08:51:17.638124 4895 generic.go:334] "Generic (PLEG): container finished" podID="00e12a0b-dbf2-4e96-ae0e-b6c71cdd9ca2" containerID="f87ceeedd521a992123bb2260bb4ac98491bd716dbc831a0b8357008ccf19d6b" exitCode=0 Dec 02 08:51:17 crc kubenswrapper[4895]: I1202 08:51:17.638172 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5328-account-create-update-js8vj" event={"ID":"00e12a0b-dbf2-4e96-ae0e-b6c71cdd9ca2","Type":"ContainerDied","Data":"f87ceeedd521a992123bb2260bb4ac98491bd716dbc831a0b8357008ccf19d6b"} Dec 02 08:51:17 crc kubenswrapper[4895]: I1202 08:51:17.638202 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5328-account-create-update-js8vj" event={"ID":"00e12a0b-dbf2-4e96-ae0e-b6c71cdd9ca2","Type":"ContainerStarted","Data":"6aa138c983a596fd70c7a098fbe5ff2e6168a73eda2ce308c533d9711e0931fe"} Dec 02 08:51:19 crc kubenswrapper[4895]: I1202 08:51:19.106178 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5328-account-create-update-js8vj" Dec 02 08:51:19 crc kubenswrapper[4895]: I1202 08:51:19.112889 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-r6xf4" Dec 02 08:51:19 crc kubenswrapper[4895]: I1202 08:51:19.243101 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/00e12a0b-dbf2-4e96-ae0e-b6c71cdd9ca2-operator-scripts\") pod \"00e12a0b-dbf2-4e96-ae0e-b6c71cdd9ca2\" (UID: \"00e12a0b-dbf2-4e96-ae0e-b6c71cdd9ca2\") " Dec 02 08:51:19 crc kubenswrapper[4895]: I1202 08:51:19.243172 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7kjh7\" (UniqueName: \"kubernetes.io/projected/dc7ae511-f6f1-4369-92ef-2314d47f0bc9-kube-api-access-7kjh7\") pod \"dc7ae511-f6f1-4369-92ef-2314d47f0bc9\" (UID: \"dc7ae511-f6f1-4369-92ef-2314d47f0bc9\") " Dec 02 08:51:19 crc kubenswrapper[4895]: I1202 08:51:19.243255 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t6ssx\" (UniqueName: \"kubernetes.io/projected/00e12a0b-dbf2-4e96-ae0e-b6c71cdd9ca2-kube-api-access-t6ssx\") pod \"00e12a0b-dbf2-4e96-ae0e-b6c71cdd9ca2\" (UID: \"00e12a0b-dbf2-4e96-ae0e-b6c71cdd9ca2\") " Dec 02 08:51:19 crc kubenswrapper[4895]: I1202 08:51:19.243429 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dc7ae511-f6f1-4369-92ef-2314d47f0bc9-operator-scripts\") pod \"dc7ae511-f6f1-4369-92ef-2314d47f0bc9\" (UID: \"dc7ae511-f6f1-4369-92ef-2314d47f0bc9\") " Dec 02 08:51:19 crc kubenswrapper[4895]: I1202 08:51:19.244192 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00e12a0b-dbf2-4e96-ae0e-b6c71cdd9ca2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "00e12a0b-dbf2-4e96-ae0e-b6c71cdd9ca2" (UID: "00e12a0b-dbf2-4e96-ae0e-b6c71cdd9ca2"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:51:19 crc kubenswrapper[4895]: I1202 08:51:19.244477 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc7ae511-f6f1-4369-92ef-2314d47f0bc9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "dc7ae511-f6f1-4369-92ef-2314d47f0bc9" (UID: "dc7ae511-f6f1-4369-92ef-2314d47f0bc9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:51:19 crc kubenswrapper[4895]: I1202 08:51:19.244995 4895 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dc7ae511-f6f1-4369-92ef-2314d47f0bc9-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 08:51:19 crc kubenswrapper[4895]: I1202 08:51:19.245155 4895 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/00e12a0b-dbf2-4e96-ae0e-b6c71cdd9ca2-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 08:51:19 crc kubenswrapper[4895]: I1202 08:51:19.251119 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00e12a0b-dbf2-4e96-ae0e-b6c71cdd9ca2-kube-api-access-t6ssx" (OuterVolumeSpecName: "kube-api-access-t6ssx") pod "00e12a0b-dbf2-4e96-ae0e-b6c71cdd9ca2" (UID: "00e12a0b-dbf2-4e96-ae0e-b6c71cdd9ca2"). InnerVolumeSpecName "kube-api-access-t6ssx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:51:19 crc kubenswrapper[4895]: I1202 08:51:19.251169 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc7ae511-f6f1-4369-92ef-2314d47f0bc9-kube-api-access-7kjh7" (OuterVolumeSpecName: "kube-api-access-7kjh7") pod "dc7ae511-f6f1-4369-92ef-2314d47f0bc9" (UID: "dc7ae511-f6f1-4369-92ef-2314d47f0bc9"). InnerVolumeSpecName "kube-api-access-7kjh7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:51:19 crc kubenswrapper[4895]: I1202 08:51:19.347041 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7kjh7\" (UniqueName: \"kubernetes.io/projected/dc7ae511-f6f1-4369-92ef-2314d47f0bc9-kube-api-access-7kjh7\") on node \"crc\" DevicePath \"\"" Dec 02 08:51:19 crc kubenswrapper[4895]: I1202 08:51:19.347085 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t6ssx\" (UniqueName: \"kubernetes.io/projected/00e12a0b-dbf2-4e96-ae0e-b6c71cdd9ca2-kube-api-access-t6ssx\") on node \"crc\" DevicePath \"\"" Dec 02 08:51:19 crc kubenswrapper[4895]: I1202 08:51:19.670906 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-r6xf4" event={"ID":"dc7ae511-f6f1-4369-92ef-2314d47f0bc9","Type":"ContainerDied","Data":"599ac8163a4e267eb32da1d042f7b96c5447683c688d48f01f6590fcf92c0e53"} Dec 02 08:51:19 crc kubenswrapper[4895]: I1202 08:51:19.671322 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="599ac8163a4e267eb32da1d042f7b96c5447683c688d48f01f6590fcf92c0e53" Dec 02 08:51:19 crc kubenswrapper[4895]: I1202 08:51:19.670951 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-r6xf4" Dec 02 08:51:19 crc kubenswrapper[4895]: I1202 08:51:19.673549 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5328-account-create-update-js8vj" event={"ID":"00e12a0b-dbf2-4e96-ae0e-b6c71cdd9ca2","Type":"ContainerDied","Data":"6aa138c983a596fd70c7a098fbe5ff2e6168a73eda2ce308c533d9711e0931fe"} Dec 02 08:51:19 crc kubenswrapper[4895]: I1202 08:51:19.673593 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-5328-account-create-update-js8vj" Dec 02 08:51:19 crc kubenswrapper[4895]: I1202 08:51:19.673604 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6aa138c983a596fd70c7a098fbe5ff2e6168a73eda2ce308c533d9711e0931fe" Dec 02 08:51:21 crc kubenswrapper[4895]: I1202 08:51:21.212063 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Dec 02 08:51:21 crc kubenswrapper[4895]: I1202 08:51:21.858677 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-scqpw"] Dec 02 08:51:21 crc kubenswrapper[4895]: E1202 08:51:21.859046 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc7ae511-f6f1-4369-92ef-2314d47f0bc9" containerName="mariadb-database-create" Dec 02 08:51:21 crc kubenswrapper[4895]: I1202 08:51:21.859058 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc7ae511-f6f1-4369-92ef-2314d47f0bc9" containerName="mariadb-database-create" Dec 02 08:51:21 crc kubenswrapper[4895]: E1202 08:51:21.859071 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00e12a0b-dbf2-4e96-ae0e-b6c71cdd9ca2" containerName="mariadb-account-create-update" Dec 02 08:51:21 crc kubenswrapper[4895]: I1202 08:51:21.859077 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="00e12a0b-dbf2-4e96-ae0e-b6c71cdd9ca2" containerName="mariadb-account-create-update" Dec 02 08:51:21 crc kubenswrapper[4895]: I1202 08:51:21.859229 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc7ae511-f6f1-4369-92ef-2314d47f0bc9" containerName="mariadb-database-create" Dec 02 08:51:21 crc kubenswrapper[4895]: I1202 08:51:21.859243 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="00e12a0b-dbf2-4e96-ae0e-b6c71cdd9ca2" containerName="mariadb-account-create-update" Dec 02 08:51:21 crc kubenswrapper[4895]: I1202 08:51:21.859819 4895 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/keystone-db-sync-scqpw" Dec 02 08:51:21 crc kubenswrapper[4895]: I1202 08:51:21.863992 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 02 08:51:21 crc kubenswrapper[4895]: I1202 08:51:21.864763 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 02 08:51:21 crc kubenswrapper[4895]: I1202 08:51:21.871429 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-njttf" Dec 02 08:51:21 crc kubenswrapper[4895]: I1202 08:51:21.871429 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 02 08:51:21 crc kubenswrapper[4895]: I1202 08:51:21.882207 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-scqpw"] Dec 02 08:51:21 crc kubenswrapper[4895]: I1202 08:51:21.997106 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a6dd260-7a03-4151-a57a-ed65069db068-config-data\") pod \"keystone-db-sync-scqpw\" (UID: \"9a6dd260-7a03-4151-a57a-ed65069db068\") " pod="openstack/keystone-db-sync-scqpw" Dec 02 08:51:21 crc kubenswrapper[4895]: I1202 08:51:21.997611 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a6dd260-7a03-4151-a57a-ed65069db068-combined-ca-bundle\") pod \"keystone-db-sync-scqpw\" (UID: \"9a6dd260-7a03-4151-a57a-ed65069db068\") " pod="openstack/keystone-db-sync-scqpw" Dec 02 08:51:21 crc kubenswrapper[4895]: I1202 08:51:21.998124 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czq67\" (UniqueName: \"kubernetes.io/projected/9a6dd260-7a03-4151-a57a-ed65069db068-kube-api-access-czq67\") pod \"keystone-db-sync-scqpw\" 
(UID: \"9a6dd260-7a03-4151-a57a-ed65069db068\") " pod="openstack/keystone-db-sync-scqpw" Dec 02 08:51:22 crc kubenswrapper[4895]: I1202 08:51:22.099849 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a6dd260-7a03-4151-a57a-ed65069db068-config-data\") pod \"keystone-db-sync-scqpw\" (UID: \"9a6dd260-7a03-4151-a57a-ed65069db068\") " pod="openstack/keystone-db-sync-scqpw" Dec 02 08:51:22 crc kubenswrapper[4895]: I1202 08:51:22.099949 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a6dd260-7a03-4151-a57a-ed65069db068-combined-ca-bundle\") pod \"keystone-db-sync-scqpw\" (UID: \"9a6dd260-7a03-4151-a57a-ed65069db068\") " pod="openstack/keystone-db-sync-scqpw" Dec 02 08:51:22 crc kubenswrapper[4895]: I1202 08:51:22.100021 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czq67\" (UniqueName: \"kubernetes.io/projected/9a6dd260-7a03-4151-a57a-ed65069db068-kube-api-access-czq67\") pod \"keystone-db-sync-scqpw\" (UID: \"9a6dd260-7a03-4151-a57a-ed65069db068\") " pod="openstack/keystone-db-sync-scqpw" Dec 02 08:51:22 crc kubenswrapper[4895]: I1202 08:51:22.107113 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a6dd260-7a03-4151-a57a-ed65069db068-combined-ca-bundle\") pod \"keystone-db-sync-scqpw\" (UID: \"9a6dd260-7a03-4151-a57a-ed65069db068\") " pod="openstack/keystone-db-sync-scqpw" Dec 02 08:51:22 crc kubenswrapper[4895]: I1202 08:51:22.107297 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a6dd260-7a03-4151-a57a-ed65069db068-config-data\") pod \"keystone-db-sync-scqpw\" (UID: \"9a6dd260-7a03-4151-a57a-ed65069db068\") " pod="openstack/keystone-db-sync-scqpw" Dec 02 08:51:22 crc 
kubenswrapper[4895]: I1202 08:51:22.115894 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-czq67\" (UniqueName: \"kubernetes.io/projected/9a6dd260-7a03-4151-a57a-ed65069db068-kube-api-access-czq67\") pod \"keystone-db-sync-scqpw\" (UID: \"9a6dd260-7a03-4151-a57a-ed65069db068\") " pod="openstack/keystone-db-sync-scqpw" Dec 02 08:51:22 crc kubenswrapper[4895]: I1202 08:51:22.179868 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-scqpw" Dec 02 08:51:22 crc kubenswrapper[4895]: I1202 08:51:22.697860 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-scqpw"] Dec 02 08:51:23 crc kubenswrapper[4895]: I1202 08:51:23.709461 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-scqpw" event={"ID":"9a6dd260-7a03-4151-a57a-ed65069db068","Type":"ContainerStarted","Data":"8af73bd21f4459f704ff31f24c06903e8ad8671a2957bf5bcb32c908b4f977db"} Dec 02 08:51:23 crc kubenswrapper[4895]: I1202 08:51:23.709821 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-scqpw" event={"ID":"9a6dd260-7a03-4151-a57a-ed65069db068","Type":"ContainerStarted","Data":"1dd4c0e3b9e4bf5af205d0f52109f878e3b5e6dc12d04785c8e8a126ba5fb9b0"} Dec 02 08:51:23 crc kubenswrapper[4895]: I1202 08:51:23.754127 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-scqpw" podStartSLOduration=2.754096584 podStartE2EDuration="2.754096584s" podCreationTimestamp="2025-12-02 08:51:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:51:23.738351875 +0000 UTC m=+5294.909211498" watchObservedRunningTime="2025-12-02 08:51:23.754096584 +0000 UTC m=+5294.924956237" Dec 02 08:51:24 crc kubenswrapper[4895]: I1202 08:51:24.718977 4895 generic.go:334] "Generic (PLEG): container 
finished" podID="9a6dd260-7a03-4151-a57a-ed65069db068" containerID="8af73bd21f4459f704ff31f24c06903e8ad8671a2957bf5bcb32c908b4f977db" exitCode=0 Dec 02 08:51:24 crc kubenswrapper[4895]: I1202 08:51:24.719025 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-scqpw" event={"ID":"9a6dd260-7a03-4151-a57a-ed65069db068","Type":"ContainerDied","Data":"8af73bd21f4459f704ff31f24c06903e8ad8671a2957bf5bcb32c908b4f977db"} Dec 02 08:51:25 crc kubenswrapper[4895]: I1202 08:51:25.143532 4895 scope.go:117] "RemoveContainer" containerID="5f3bd2c356ea4faeb32e1c7dba83504ac381c8d421d130309e1326c5000e3875" Dec 02 08:51:25 crc kubenswrapper[4895]: E1202 08:51:25.144048 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 08:51:26 crc kubenswrapper[4895]: I1202 08:51:26.182471 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-scqpw" Dec 02 08:51:26 crc kubenswrapper[4895]: I1202 08:51:26.283633 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a6dd260-7a03-4151-a57a-ed65069db068-config-data\") pod \"9a6dd260-7a03-4151-a57a-ed65069db068\" (UID: \"9a6dd260-7a03-4151-a57a-ed65069db068\") " Dec 02 08:51:26 crc kubenswrapper[4895]: I1202 08:51:26.283932 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a6dd260-7a03-4151-a57a-ed65069db068-combined-ca-bundle\") pod \"9a6dd260-7a03-4151-a57a-ed65069db068\" (UID: \"9a6dd260-7a03-4151-a57a-ed65069db068\") " Dec 02 08:51:26 crc kubenswrapper[4895]: I1202 08:51:26.283992 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-czq67\" (UniqueName: \"kubernetes.io/projected/9a6dd260-7a03-4151-a57a-ed65069db068-kube-api-access-czq67\") pod \"9a6dd260-7a03-4151-a57a-ed65069db068\" (UID: \"9a6dd260-7a03-4151-a57a-ed65069db068\") " Dec 02 08:51:26 crc kubenswrapper[4895]: I1202 08:51:26.292132 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a6dd260-7a03-4151-a57a-ed65069db068-kube-api-access-czq67" (OuterVolumeSpecName: "kube-api-access-czq67") pod "9a6dd260-7a03-4151-a57a-ed65069db068" (UID: "9a6dd260-7a03-4151-a57a-ed65069db068"). InnerVolumeSpecName "kube-api-access-czq67". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:51:26 crc kubenswrapper[4895]: I1202 08:51:26.320733 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a6dd260-7a03-4151-a57a-ed65069db068-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9a6dd260-7a03-4151-a57a-ed65069db068" (UID: "9a6dd260-7a03-4151-a57a-ed65069db068"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:51:26 crc kubenswrapper[4895]: I1202 08:51:26.354605 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a6dd260-7a03-4151-a57a-ed65069db068-config-data" (OuterVolumeSpecName: "config-data") pod "9a6dd260-7a03-4151-a57a-ed65069db068" (UID: "9a6dd260-7a03-4151-a57a-ed65069db068"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:51:26 crc kubenswrapper[4895]: I1202 08:51:26.386944 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a6dd260-7a03-4151-a57a-ed65069db068-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 08:51:26 crc kubenswrapper[4895]: I1202 08:51:26.386989 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-czq67\" (UniqueName: \"kubernetes.io/projected/9a6dd260-7a03-4151-a57a-ed65069db068-kube-api-access-czq67\") on node \"crc\" DevicePath \"\"" Dec 02 08:51:26 crc kubenswrapper[4895]: I1202 08:51:26.387010 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a6dd260-7a03-4151-a57a-ed65069db068-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 08:51:26 crc kubenswrapper[4895]: I1202 08:51:26.753093 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-scqpw" event={"ID":"9a6dd260-7a03-4151-a57a-ed65069db068","Type":"ContainerDied","Data":"1dd4c0e3b9e4bf5af205d0f52109f878e3b5e6dc12d04785c8e8a126ba5fb9b0"} Dec 02 08:51:26 crc kubenswrapper[4895]: I1202 08:51:26.753143 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1dd4c0e3b9e4bf5af205d0f52109f878e3b5e6dc12d04785c8e8a126ba5fb9b0" Dec 02 08:51:26 crc kubenswrapper[4895]: I1202 08:51:26.753180 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-scqpw" Dec 02 08:51:27 crc kubenswrapper[4895]: I1202 08:51:27.043037 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-g7gpq"] Dec 02 08:51:27 crc kubenswrapper[4895]: E1202 08:51:27.043443 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a6dd260-7a03-4151-a57a-ed65069db068" containerName="keystone-db-sync" Dec 02 08:51:27 crc kubenswrapper[4895]: I1202 08:51:27.043461 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a6dd260-7a03-4151-a57a-ed65069db068" containerName="keystone-db-sync" Dec 02 08:51:27 crc kubenswrapper[4895]: I1202 08:51:27.043641 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a6dd260-7a03-4151-a57a-ed65069db068" containerName="keystone-db-sync" Dec 02 08:51:27 crc kubenswrapper[4895]: I1202 08:51:27.044304 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-g7gpq" Dec 02 08:51:27 crc kubenswrapper[4895]: I1202 08:51:27.052762 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 02 08:51:27 crc kubenswrapper[4895]: I1202 08:51:27.052838 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 02 08:51:27 crc kubenswrapper[4895]: I1202 08:51:27.052762 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 02 08:51:27 crc kubenswrapper[4895]: I1202 08:51:27.053464 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-597fd75467-lvrkg"] Dec 02 08:51:27 crc kubenswrapper[4895]: I1202 08:51:27.055710 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-597fd75467-lvrkg" Dec 02 08:51:27 crc kubenswrapper[4895]: I1202 08:51:27.056906 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 02 08:51:27 crc kubenswrapper[4895]: I1202 08:51:27.068645 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-g7gpq"] Dec 02 08:51:27 crc kubenswrapper[4895]: I1202 08:51:27.072681 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-njttf" Dec 02 08:51:27 crc kubenswrapper[4895]: I1202 08:51:27.083115 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-597fd75467-lvrkg"] Dec 02 08:51:27 crc kubenswrapper[4895]: I1202 08:51:27.104245 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/14a88421-d642-4535-b965-f0b329908b7e-fernet-keys\") pod \"keystone-bootstrap-g7gpq\" (UID: \"14a88421-d642-4535-b965-f0b329908b7e\") " pod="openstack/keystone-bootstrap-g7gpq" Dec 02 08:51:27 crc kubenswrapper[4895]: I1202 08:51:27.104360 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14a88421-d642-4535-b965-f0b329908b7e-config-data\") pod \"keystone-bootstrap-g7gpq\" (UID: \"14a88421-d642-4535-b965-f0b329908b7e\") " pod="openstack/keystone-bootstrap-g7gpq" Dec 02 08:51:27 crc kubenswrapper[4895]: I1202 08:51:27.104440 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nddm\" (UniqueName: \"kubernetes.io/projected/14a88421-d642-4535-b965-f0b329908b7e-kube-api-access-6nddm\") pod \"keystone-bootstrap-g7gpq\" (UID: \"14a88421-d642-4535-b965-f0b329908b7e\") " pod="openstack/keystone-bootstrap-g7gpq" Dec 02 08:51:27 crc kubenswrapper[4895]: I1202 08:51:27.104487 4895 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14a88421-d642-4535-b965-f0b329908b7e-combined-ca-bundle\") pod \"keystone-bootstrap-g7gpq\" (UID: \"14a88421-d642-4535-b965-f0b329908b7e\") " pod="openstack/keystone-bootstrap-g7gpq" Dec 02 08:51:27 crc kubenswrapper[4895]: I1202 08:51:27.104518 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/14a88421-d642-4535-b965-f0b329908b7e-scripts\") pod \"keystone-bootstrap-g7gpq\" (UID: \"14a88421-d642-4535-b965-f0b329908b7e\") " pod="openstack/keystone-bootstrap-g7gpq" Dec 02 08:51:27 crc kubenswrapper[4895]: I1202 08:51:27.104565 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/14a88421-d642-4535-b965-f0b329908b7e-credential-keys\") pod \"keystone-bootstrap-g7gpq\" (UID: \"14a88421-d642-4535-b965-f0b329908b7e\") " pod="openstack/keystone-bootstrap-g7gpq" Dec 02 08:51:27 crc kubenswrapper[4895]: I1202 08:51:27.205638 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/14a88421-d642-4535-b965-f0b329908b7e-fernet-keys\") pod \"keystone-bootstrap-g7gpq\" (UID: \"14a88421-d642-4535-b965-f0b329908b7e\") " pod="openstack/keystone-bootstrap-g7gpq" Dec 02 08:51:27 crc kubenswrapper[4895]: I1202 08:51:27.205700 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s64r4\" (UniqueName: \"kubernetes.io/projected/45adac47-56b1-42f9-82d3-65341c6446a5-kube-api-access-s64r4\") pod \"dnsmasq-dns-597fd75467-lvrkg\" (UID: \"45adac47-56b1-42f9-82d3-65341c6446a5\") " pod="openstack/dnsmasq-dns-597fd75467-lvrkg" Dec 02 08:51:27 crc kubenswrapper[4895]: I1202 08:51:27.205793 4895 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/45adac47-56b1-42f9-82d3-65341c6446a5-ovsdbserver-sb\") pod \"dnsmasq-dns-597fd75467-lvrkg\" (UID: \"45adac47-56b1-42f9-82d3-65341c6446a5\") " pod="openstack/dnsmasq-dns-597fd75467-lvrkg" Dec 02 08:51:27 crc kubenswrapper[4895]: I1202 08:51:27.206165 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/45adac47-56b1-42f9-82d3-65341c6446a5-dns-svc\") pod \"dnsmasq-dns-597fd75467-lvrkg\" (UID: \"45adac47-56b1-42f9-82d3-65341c6446a5\") " pod="openstack/dnsmasq-dns-597fd75467-lvrkg" Dec 02 08:51:27 crc kubenswrapper[4895]: I1202 08:51:27.206286 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14a88421-d642-4535-b965-f0b329908b7e-config-data\") pod \"keystone-bootstrap-g7gpq\" (UID: \"14a88421-d642-4535-b965-f0b329908b7e\") " pod="openstack/keystone-bootstrap-g7gpq" Dec 02 08:51:27 crc kubenswrapper[4895]: I1202 08:51:27.207112 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6nddm\" (UniqueName: \"kubernetes.io/projected/14a88421-d642-4535-b965-f0b329908b7e-kube-api-access-6nddm\") pod \"keystone-bootstrap-g7gpq\" (UID: \"14a88421-d642-4535-b965-f0b329908b7e\") " pod="openstack/keystone-bootstrap-g7gpq" Dec 02 08:51:27 crc kubenswrapper[4895]: I1202 08:51:27.207218 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14a88421-d642-4535-b965-f0b329908b7e-combined-ca-bundle\") pod \"keystone-bootstrap-g7gpq\" (UID: \"14a88421-d642-4535-b965-f0b329908b7e\") " pod="openstack/keystone-bootstrap-g7gpq" Dec 02 08:51:27 crc kubenswrapper[4895]: I1202 08:51:27.207665 4895 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/14a88421-d642-4535-b965-f0b329908b7e-scripts\") pod \"keystone-bootstrap-g7gpq\" (UID: \"14a88421-d642-4535-b965-f0b329908b7e\") " pod="openstack/keystone-bootstrap-g7gpq" Dec 02 08:51:27 crc kubenswrapper[4895]: I1202 08:51:27.207782 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/14a88421-d642-4535-b965-f0b329908b7e-credential-keys\") pod \"keystone-bootstrap-g7gpq\" (UID: \"14a88421-d642-4535-b965-f0b329908b7e\") " pod="openstack/keystone-bootstrap-g7gpq" Dec 02 08:51:27 crc kubenswrapper[4895]: I1202 08:51:27.207812 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45adac47-56b1-42f9-82d3-65341c6446a5-config\") pod \"dnsmasq-dns-597fd75467-lvrkg\" (UID: \"45adac47-56b1-42f9-82d3-65341c6446a5\") " pod="openstack/dnsmasq-dns-597fd75467-lvrkg" Dec 02 08:51:27 crc kubenswrapper[4895]: I1202 08:51:27.207853 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/45adac47-56b1-42f9-82d3-65341c6446a5-ovsdbserver-nb\") pod \"dnsmasq-dns-597fd75467-lvrkg\" (UID: \"45adac47-56b1-42f9-82d3-65341c6446a5\") " pod="openstack/dnsmasq-dns-597fd75467-lvrkg" Dec 02 08:51:27 crc kubenswrapper[4895]: I1202 08:51:27.212802 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/14a88421-d642-4535-b965-f0b329908b7e-scripts\") pod \"keystone-bootstrap-g7gpq\" (UID: \"14a88421-d642-4535-b965-f0b329908b7e\") " pod="openstack/keystone-bootstrap-g7gpq" Dec 02 08:51:27 crc kubenswrapper[4895]: I1202 08:51:27.214041 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/14a88421-d642-4535-b965-f0b329908b7e-fernet-keys\") pod \"keystone-bootstrap-g7gpq\" (UID: \"14a88421-d642-4535-b965-f0b329908b7e\") " pod="openstack/keystone-bootstrap-g7gpq" Dec 02 08:51:27 crc kubenswrapper[4895]: I1202 08:51:27.214443 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14a88421-d642-4535-b965-f0b329908b7e-config-data\") pod \"keystone-bootstrap-g7gpq\" (UID: \"14a88421-d642-4535-b965-f0b329908b7e\") " pod="openstack/keystone-bootstrap-g7gpq" Dec 02 08:51:27 crc kubenswrapper[4895]: I1202 08:51:27.216870 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14a88421-d642-4535-b965-f0b329908b7e-combined-ca-bundle\") pod \"keystone-bootstrap-g7gpq\" (UID: \"14a88421-d642-4535-b965-f0b329908b7e\") " pod="openstack/keystone-bootstrap-g7gpq" Dec 02 08:51:27 crc kubenswrapper[4895]: I1202 08:51:27.219339 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/14a88421-d642-4535-b965-f0b329908b7e-credential-keys\") pod \"keystone-bootstrap-g7gpq\" (UID: \"14a88421-d642-4535-b965-f0b329908b7e\") " pod="openstack/keystone-bootstrap-g7gpq" Dec 02 08:51:27 crc kubenswrapper[4895]: I1202 08:51:27.243058 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nddm\" (UniqueName: \"kubernetes.io/projected/14a88421-d642-4535-b965-f0b329908b7e-kube-api-access-6nddm\") pod \"keystone-bootstrap-g7gpq\" (UID: \"14a88421-d642-4535-b965-f0b329908b7e\") " pod="openstack/keystone-bootstrap-g7gpq" Dec 02 08:51:27 crc kubenswrapper[4895]: I1202 08:51:27.309519 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45adac47-56b1-42f9-82d3-65341c6446a5-config\") pod \"dnsmasq-dns-597fd75467-lvrkg\" (UID: 
\"45adac47-56b1-42f9-82d3-65341c6446a5\") " pod="openstack/dnsmasq-dns-597fd75467-lvrkg" Dec 02 08:51:27 crc kubenswrapper[4895]: I1202 08:51:27.309583 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/45adac47-56b1-42f9-82d3-65341c6446a5-ovsdbserver-nb\") pod \"dnsmasq-dns-597fd75467-lvrkg\" (UID: \"45adac47-56b1-42f9-82d3-65341c6446a5\") " pod="openstack/dnsmasq-dns-597fd75467-lvrkg" Dec 02 08:51:27 crc kubenswrapper[4895]: I1202 08:51:27.309629 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s64r4\" (UniqueName: \"kubernetes.io/projected/45adac47-56b1-42f9-82d3-65341c6446a5-kube-api-access-s64r4\") pod \"dnsmasq-dns-597fd75467-lvrkg\" (UID: \"45adac47-56b1-42f9-82d3-65341c6446a5\") " pod="openstack/dnsmasq-dns-597fd75467-lvrkg" Dec 02 08:51:27 crc kubenswrapper[4895]: I1202 08:51:27.309663 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/45adac47-56b1-42f9-82d3-65341c6446a5-ovsdbserver-sb\") pod \"dnsmasq-dns-597fd75467-lvrkg\" (UID: \"45adac47-56b1-42f9-82d3-65341c6446a5\") " pod="openstack/dnsmasq-dns-597fd75467-lvrkg" Dec 02 08:51:27 crc kubenswrapper[4895]: I1202 08:51:27.309712 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/45adac47-56b1-42f9-82d3-65341c6446a5-dns-svc\") pod \"dnsmasq-dns-597fd75467-lvrkg\" (UID: \"45adac47-56b1-42f9-82d3-65341c6446a5\") " pod="openstack/dnsmasq-dns-597fd75467-lvrkg" Dec 02 08:51:27 crc kubenswrapper[4895]: I1202 08:51:27.310703 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45adac47-56b1-42f9-82d3-65341c6446a5-config\") pod \"dnsmasq-dns-597fd75467-lvrkg\" (UID: \"45adac47-56b1-42f9-82d3-65341c6446a5\") " 
pod="openstack/dnsmasq-dns-597fd75467-lvrkg" Dec 02 08:51:27 crc kubenswrapper[4895]: I1202 08:51:27.310880 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/45adac47-56b1-42f9-82d3-65341c6446a5-dns-svc\") pod \"dnsmasq-dns-597fd75467-lvrkg\" (UID: \"45adac47-56b1-42f9-82d3-65341c6446a5\") " pod="openstack/dnsmasq-dns-597fd75467-lvrkg" Dec 02 08:51:27 crc kubenswrapper[4895]: I1202 08:51:27.311547 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/45adac47-56b1-42f9-82d3-65341c6446a5-ovsdbserver-sb\") pod \"dnsmasq-dns-597fd75467-lvrkg\" (UID: \"45adac47-56b1-42f9-82d3-65341c6446a5\") " pod="openstack/dnsmasq-dns-597fd75467-lvrkg" Dec 02 08:51:27 crc kubenswrapper[4895]: I1202 08:51:27.311721 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/45adac47-56b1-42f9-82d3-65341c6446a5-ovsdbserver-nb\") pod \"dnsmasq-dns-597fd75467-lvrkg\" (UID: \"45adac47-56b1-42f9-82d3-65341c6446a5\") " pod="openstack/dnsmasq-dns-597fd75467-lvrkg" Dec 02 08:51:27 crc kubenswrapper[4895]: I1202 08:51:27.331977 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s64r4\" (UniqueName: \"kubernetes.io/projected/45adac47-56b1-42f9-82d3-65341c6446a5-kube-api-access-s64r4\") pod \"dnsmasq-dns-597fd75467-lvrkg\" (UID: \"45adac47-56b1-42f9-82d3-65341c6446a5\") " pod="openstack/dnsmasq-dns-597fd75467-lvrkg" Dec 02 08:51:27 crc kubenswrapper[4895]: I1202 08:51:27.368681 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-g7gpq" Dec 02 08:51:27 crc kubenswrapper[4895]: I1202 08:51:27.389015 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-597fd75467-lvrkg" Dec 02 08:51:27 crc kubenswrapper[4895]: I1202 08:51:27.863909 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-g7gpq"] Dec 02 08:51:27 crc kubenswrapper[4895]: I1202 08:51:27.912913 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-597fd75467-lvrkg"] Dec 02 08:51:28 crc kubenswrapper[4895]: I1202 08:51:28.788174 4895 generic.go:334] "Generic (PLEG): container finished" podID="45adac47-56b1-42f9-82d3-65341c6446a5" containerID="99d316631338e3a7ef1ffcc42362375c1e56430558a3255199b1ceb7ef8eea9d" exitCode=0 Dec 02 08:51:28 crc kubenswrapper[4895]: I1202 08:51:28.789643 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-597fd75467-lvrkg" event={"ID":"45adac47-56b1-42f9-82d3-65341c6446a5","Type":"ContainerDied","Data":"99d316631338e3a7ef1ffcc42362375c1e56430558a3255199b1ceb7ef8eea9d"} Dec 02 08:51:28 crc kubenswrapper[4895]: I1202 08:51:28.789685 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-597fd75467-lvrkg" event={"ID":"45adac47-56b1-42f9-82d3-65341c6446a5","Type":"ContainerStarted","Data":"a307a1c60c2072d6603d2cdc538551b7f83152d2f900699299091cea770107a3"} Dec 02 08:51:28 crc kubenswrapper[4895]: I1202 08:51:28.798543 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-g7gpq" event={"ID":"14a88421-d642-4535-b965-f0b329908b7e","Type":"ContainerStarted","Data":"39f727fa570af71bfb0738805b753dfa3fc3507325e32440311d039152c4d455"} Dec 02 08:51:28 crc kubenswrapper[4895]: I1202 08:51:28.798603 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-g7gpq" event={"ID":"14a88421-d642-4535-b965-f0b329908b7e","Type":"ContainerStarted","Data":"a394fc6089084d53704bfd1c7c44702680f86a85055316d9117aa9cab5493e19"} Dec 02 08:51:28 crc kubenswrapper[4895]: I1202 08:51:28.870508 4895 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-g7gpq" podStartSLOduration=1.8704843960000002 podStartE2EDuration="1.870484396s" podCreationTimestamp="2025-12-02 08:51:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:51:28.864567922 +0000 UTC m=+5300.035427545" watchObservedRunningTime="2025-12-02 08:51:28.870484396 +0000 UTC m=+5300.041344019" Dec 02 08:51:29 crc kubenswrapper[4895]: I1202 08:51:29.186128 4895 scope.go:117] "RemoveContainer" containerID="314a5bb4aa677fc60fe900ff1f1544794ba3896d8e03f593bec50c68bcb84802" Dec 02 08:51:29 crc kubenswrapper[4895]: I1202 08:51:29.211444 4895 scope.go:117] "RemoveContainer" containerID="d1455e109e3f0574aded0226a1b09f401df9d32f7c0ba525cb55ef448fb81738" Dec 02 08:51:29 crc kubenswrapper[4895]: I1202 08:51:29.811962 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-597fd75467-lvrkg" event={"ID":"45adac47-56b1-42f9-82d3-65341c6446a5","Type":"ContainerStarted","Data":"044ac01be679ca0f730a7d3890c4b5c48e0a04c2d42e95fd625b0922ed03c99e"} Dec 02 08:51:29 crc kubenswrapper[4895]: I1202 08:51:29.850035 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-597fd75467-lvrkg" podStartSLOduration=2.85000039 podStartE2EDuration="2.85000039s" podCreationTimestamp="2025-12-02 08:51:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:51:29.833936769 +0000 UTC m=+5301.004796442" watchObservedRunningTime="2025-12-02 08:51:29.85000039 +0000 UTC m=+5301.020860023" Dec 02 08:51:30 crc kubenswrapper[4895]: I1202 08:51:30.823061 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-597fd75467-lvrkg" Dec 02 08:51:31 crc kubenswrapper[4895]: I1202 08:51:31.840441 4895 
generic.go:334] "Generic (PLEG): container finished" podID="14a88421-d642-4535-b965-f0b329908b7e" containerID="39f727fa570af71bfb0738805b753dfa3fc3507325e32440311d039152c4d455" exitCode=0 Dec 02 08:51:31 crc kubenswrapper[4895]: I1202 08:51:31.840536 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-g7gpq" event={"ID":"14a88421-d642-4535-b965-f0b329908b7e","Type":"ContainerDied","Data":"39f727fa570af71bfb0738805b753dfa3fc3507325e32440311d039152c4d455"} Dec 02 08:51:33 crc kubenswrapper[4895]: I1202 08:51:33.303558 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-g7gpq" Dec 02 08:51:33 crc kubenswrapper[4895]: I1202 08:51:33.369773 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/14a88421-d642-4535-b965-f0b329908b7e-scripts\") pod \"14a88421-d642-4535-b965-f0b329908b7e\" (UID: \"14a88421-d642-4535-b965-f0b329908b7e\") " Dec 02 08:51:33 crc kubenswrapper[4895]: I1202 08:51:33.369873 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6nddm\" (UniqueName: \"kubernetes.io/projected/14a88421-d642-4535-b965-f0b329908b7e-kube-api-access-6nddm\") pod \"14a88421-d642-4535-b965-f0b329908b7e\" (UID: \"14a88421-d642-4535-b965-f0b329908b7e\") " Dec 02 08:51:33 crc kubenswrapper[4895]: I1202 08:51:33.369909 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/14a88421-d642-4535-b965-f0b329908b7e-fernet-keys\") pod \"14a88421-d642-4535-b965-f0b329908b7e\" (UID: \"14a88421-d642-4535-b965-f0b329908b7e\") " Dec 02 08:51:33 crc kubenswrapper[4895]: I1202 08:51:33.370081 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14a88421-d642-4535-b965-f0b329908b7e-combined-ca-bundle\") pod 
\"14a88421-d642-4535-b965-f0b329908b7e\" (UID: \"14a88421-d642-4535-b965-f0b329908b7e\") " Dec 02 08:51:33 crc kubenswrapper[4895]: I1202 08:51:33.370219 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14a88421-d642-4535-b965-f0b329908b7e-config-data\") pod \"14a88421-d642-4535-b965-f0b329908b7e\" (UID: \"14a88421-d642-4535-b965-f0b329908b7e\") " Dec 02 08:51:33 crc kubenswrapper[4895]: I1202 08:51:33.370357 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/14a88421-d642-4535-b965-f0b329908b7e-credential-keys\") pod \"14a88421-d642-4535-b965-f0b329908b7e\" (UID: \"14a88421-d642-4535-b965-f0b329908b7e\") " Dec 02 08:51:33 crc kubenswrapper[4895]: I1202 08:51:33.380764 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14a88421-d642-4535-b965-f0b329908b7e-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "14a88421-d642-4535-b965-f0b329908b7e" (UID: "14a88421-d642-4535-b965-f0b329908b7e"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:51:33 crc kubenswrapper[4895]: I1202 08:51:33.381065 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14a88421-d642-4535-b965-f0b329908b7e-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "14a88421-d642-4535-b965-f0b329908b7e" (UID: "14a88421-d642-4535-b965-f0b329908b7e"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:51:33 crc kubenswrapper[4895]: I1202 08:51:33.381895 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14a88421-d642-4535-b965-f0b329908b7e-scripts" (OuterVolumeSpecName: "scripts") pod "14a88421-d642-4535-b965-f0b329908b7e" (UID: "14a88421-d642-4535-b965-f0b329908b7e"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:51:33 crc kubenswrapper[4895]: I1202 08:51:33.381925 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14a88421-d642-4535-b965-f0b329908b7e-kube-api-access-6nddm" (OuterVolumeSpecName: "kube-api-access-6nddm") pod "14a88421-d642-4535-b965-f0b329908b7e" (UID: "14a88421-d642-4535-b965-f0b329908b7e"). InnerVolumeSpecName "kube-api-access-6nddm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:51:33 crc kubenswrapper[4895]: I1202 08:51:33.410952 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14a88421-d642-4535-b965-f0b329908b7e-config-data" (OuterVolumeSpecName: "config-data") pod "14a88421-d642-4535-b965-f0b329908b7e" (UID: "14a88421-d642-4535-b965-f0b329908b7e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:51:33 crc kubenswrapper[4895]: I1202 08:51:33.421228 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14a88421-d642-4535-b965-f0b329908b7e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "14a88421-d642-4535-b965-f0b329908b7e" (UID: "14a88421-d642-4535-b965-f0b329908b7e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:51:33 crc kubenswrapper[4895]: I1202 08:51:33.473010 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14a88421-d642-4535-b965-f0b329908b7e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 08:51:33 crc kubenswrapper[4895]: I1202 08:51:33.473049 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14a88421-d642-4535-b965-f0b329908b7e-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 08:51:33 crc kubenswrapper[4895]: I1202 08:51:33.473061 4895 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/14a88421-d642-4535-b965-f0b329908b7e-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 02 08:51:33 crc kubenswrapper[4895]: I1202 08:51:33.473069 4895 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/14a88421-d642-4535-b965-f0b329908b7e-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 08:51:33 crc kubenswrapper[4895]: I1202 08:51:33.473080 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6nddm\" (UniqueName: \"kubernetes.io/projected/14a88421-d642-4535-b965-f0b329908b7e-kube-api-access-6nddm\") on node \"crc\" DevicePath \"\"" Dec 02 08:51:33 crc kubenswrapper[4895]: I1202 08:51:33.473091 4895 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/14a88421-d642-4535-b965-f0b329908b7e-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 02 08:51:33 crc kubenswrapper[4895]: I1202 08:51:33.873681 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-g7gpq" event={"ID":"14a88421-d642-4535-b965-f0b329908b7e","Type":"ContainerDied","Data":"a394fc6089084d53704bfd1c7c44702680f86a85055316d9117aa9cab5493e19"} Dec 02 08:51:33 crc kubenswrapper[4895]: I1202 
08:51:33.874044 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a394fc6089084d53704bfd1c7c44702680f86a85055316d9117aa9cab5493e19" Dec 02 08:51:33 crc kubenswrapper[4895]: I1202 08:51:33.873754 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-g7gpq" Dec 02 08:51:33 crc kubenswrapper[4895]: I1202 08:51:33.985597 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-g7gpq"] Dec 02 08:51:33 crc kubenswrapper[4895]: I1202 08:51:33.992276 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-g7gpq"] Dec 02 08:51:34 crc kubenswrapper[4895]: I1202 08:51:34.078186 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-4fwjz"] Dec 02 08:51:34 crc kubenswrapper[4895]: E1202 08:51:34.078813 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14a88421-d642-4535-b965-f0b329908b7e" containerName="keystone-bootstrap" Dec 02 08:51:34 crc kubenswrapper[4895]: I1202 08:51:34.078904 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="14a88421-d642-4535-b965-f0b329908b7e" containerName="keystone-bootstrap" Dec 02 08:51:34 crc kubenswrapper[4895]: I1202 08:51:34.079235 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="14a88421-d642-4535-b965-f0b329908b7e" containerName="keystone-bootstrap" Dec 02 08:51:34 crc kubenswrapper[4895]: I1202 08:51:34.080035 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-4fwjz" Dec 02 08:51:34 crc kubenswrapper[4895]: I1202 08:51:34.083824 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 02 08:51:34 crc kubenswrapper[4895]: I1202 08:51:34.083949 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 02 08:51:34 crc kubenswrapper[4895]: I1202 08:51:34.084021 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 02 08:51:34 crc kubenswrapper[4895]: I1202 08:51:34.084327 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-njttf" Dec 02 08:51:34 crc kubenswrapper[4895]: I1202 08:51:34.085025 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 02 08:51:34 crc kubenswrapper[4895]: I1202 08:51:34.104835 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-4fwjz"] Dec 02 08:51:34 crc kubenswrapper[4895]: I1202 08:51:34.186978 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f55786e-cf8a-4ce7-affc-1952b6a2e1ad-config-data\") pod \"keystone-bootstrap-4fwjz\" (UID: \"5f55786e-cf8a-4ce7-affc-1952b6a2e1ad\") " pod="openstack/keystone-bootstrap-4fwjz" Dec 02 08:51:34 crc kubenswrapper[4895]: I1202 08:51:34.187040 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f55786e-cf8a-4ce7-affc-1952b6a2e1ad-combined-ca-bundle\") pod \"keystone-bootstrap-4fwjz\" (UID: \"5f55786e-cf8a-4ce7-affc-1952b6a2e1ad\") " pod="openstack/keystone-bootstrap-4fwjz" Dec 02 08:51:34 crc kubenswrapper[4895]: I1202 08:51:34.187075 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5f55786e-cf8a-4ce7-affc-1952b6a2e1ad-fernet-keys\") pod \"keystone-bootstrap-4fwjz\" (UID: \"5f55786e-cf8a-4ce7-affc-1952b6a2e1ad\") " pod="openstack/keystone-bootstrap-4fwjz" Dec 02 08:51:34 crc kubenswrapper[4895]: I1202 08:51:34.187219 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f55786e-cf8a-4ce7-affc-1952b6a2e1ad-scripts\") pod \"keystone-bootstrap-4fwjz\" (UID: \"5f55786e-cf8a-4ce7-affc-1952b6a2e1ad\") " pod="openstack/keystone-bootstrap-4fwjz" Dec 02 08:51:34 crc kubenswrapper[4895]: I1202 08:51:34.187248 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5f55786e-cf8a-4ce7-affc-1952b6a2e1ad-credential-keys\") pod \"keystone-bootstrap-4fwjz\" (UID: \"5f55786e-cf8a-4ce7-affc-1952b6a2e1ad\") " pod="openstack/keystone-bootstrap-4fwjz" Dec 02 08:51:34 crc kubenswrapper[4895]: I1202 08:51:34.187285 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgw6p\" (UniqueName: \"kubernetes.io/projected/5f55786e-cf8a-4ce7-affc-1952b6a2e1ad-kube-api-access-kgw6p\") pod \"keystone-bootstrap-4fwjz\" (UID: \"5f55786e-cf8a-4ce7-affc-1952b6a2e1ad\") " pod="openstack/keystone-bootstrap-4fwjz" Dec 02 08:51:34 crc kubenswrapper[4895]: I1202 08:51:34.289003 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f55786e-cf8a-4ce7-affc-1952b6a2e1ad-combined-ca-bundle\") pod \"keystone-bootstrap-4fwjz\" (UID: \"5f55786e-cf8a-4ce7-affc-1952b6a2e1ad\") " pod="openstack/keystone-bootstrap-4fwjz" Dec 02 08:51:34 crc kubenswrapper[4895]: I1202 08:51:34.289071 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/5f55786e-cf8a-4ce7-affc-1952b6a2e1ad-fernet-keys\") pod \"keystone-bootstrap-4fwjz\" (UID: \"5f55786e-cf8a-4ce7-affc-1952b6a2e1ad\") " pod="openstack/keystone-bootstrap-4fwjz" Dec 02 08:51:34 crc kubenswrapper[4895]: I1202 08:51:34.289349 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f55786e-cf8a-4ce7-affc-1952b6a2e1ad-scripts\") pod \"keystone-bootstrap-4fwjz\" (UID: \"5f55786e-cf8a-4ce7-affc-1952b6a2e1ad\") " pod="openstack/keystone-bootstrap-4fwjz" Dec 02 08:51:34 crc kubenswrapper[4895]: I1202 08:51:34.290357 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5f55786e-cf8a-4ce7-affc-1952b6a2e1ad-credential-keys\") pod \"keystone-bootstrap-4fwjz\" (UID: \"5f55786e-cf8a-4ce7-affc-1952b6a2e1ad\") " pod="openstack/keystone-bootstrap-4fwjz" Dec 02 08:51:34 crc kubenswrapper[4895]: I1202 08:51:34.290430 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kgw6p\" (UniqueName: \"kubernetes.io/projected/5f55786e-cf8a-4ce7-affc-1952b6a2e1ad-kube-api-access-kgw6p\") pod \"keystone-bootstrap-4fwjz\" (UID: \"5f55786e-cf8a-4ce7-affc-1952b6a2e1ad\") " pod="openstack/keystone-bootstrap-4fwjz" Dec 02 08:51:34 crc kubenswrapper[4895]: I1202 08:51:34.292486 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f55786e-cf8a-4ce7-affc-1952b6a2e1ad-config-data\") pod \"keystone-bootstrap-4fwjz\" (UID: \"5f55786e-cf8a-4ce7-affc-1952b6a2e1ad\") " pod="openstack/keystone-bootstrap-4fwjz" Dec 02 08:51:34 crc kubenswrapper[4895]: I1202 08:51:34.296136 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5f55786e-cf8a-4ce7-affc-1952b6a2e1ad-fernet-keys\") pod \"keystone-bootstrap-4fwjz\" (UID: 
\"5f55786e-cf8a-4ce7-affc-1952b6a2e1ad\") " pod="openstack/keystone-bootstrap-4fwjz" Dec 02 08:51:34 crc kubenswrapper[4895]: I1202 08:51:34.296201 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f55786e-cf8a-4ce7-affc-1952b6a2e1ad-combined-ca-bundle\") pod \"keystone-bootstrap-4fwjz\" (UID: \"5f55786e-cf8a-4ce7-affc-1952b6a2e1ad\") " pod="openstack/keystone-bootstrap-4fwjz" Dec 02 08:51:34 crc kubenswrapper[4895]: I1202 08:51:34.296253 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f55786e-cf8a-4ce7-affc-1952b6a2e1ad-scripts\") pod \"keystone-bootstrap-4fwjz\" (UID: \"5f55786e-cf8a-4ce7-affc-1952b6a2e1ad\") " pod="openstack/keystone-bootstrap-4fwjz" Dec 02 08:51:34 crc kubenswrapper[4895]: I1202 08:51:34.297505 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5f55786e-cf8a-4ce7-affc-1952b6a2e1ad-credential-keys\") pod \"keystone-bootstrap-4fwjz\" (UID: \"5f55786e-cf8a-4ce7-affc-1952b6a2e1ad\") " pod="openstack/keystone-bootstrap-4fwjz" Dec 02 08:51:34 crc kubenswrapper[4895]: I1202 08:51:34.298667 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f55786e-cf8a-4ce7-affc-1952b6a2e1ad-config-data\") pod \"keystone-bootstrap-4fwjz\" (UID: \"5f55786e-cf8a-4ce7-affc-1952b6a2e1ad\") " pod="openstack/keystone-bootstrap-4fwjz" Dec 02 08:51:34 crc kubenswrapper[4895]: I1202 08:51:34.308854 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgw6p\" (UniqueName: \"kubernetes.io/projected/5f55786e-cf8a-4ce7-affc-1952b6a2e1ad-kube-api-access-kgw6p\") pod \"keystone-bootstrap-4fwjz\" (UID: \"5f55786e-cf8a-4ce7-affc-1952b6a2e1ad\") " pod="openstack/keystone-bootstrap-4fwjz" Dec 02 08:51:34 crc kubenswrapper[4895]: I1202 
08:51:34.462488 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-4fwjz" Dec 02 08:51:34 crc kubenswrapper[4895]: I1202 08:51:34.925276 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-4fwjz"] Dec 02 08:51:35 crc kubenswrapper[4895]: I1202 08:51:35.150663 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14a88421-d642-4535-b965-f0b329908b7e" path="/var/lib/kubelet/pods/14a88421-d642-4535-b965-f0b329908b7e/volumes" Dec 02 08:51:35 crc kubenswrapper[4895]: I1202 08:51:35.891887 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-4fwjz" event={"ID":"5f55786e-cf8a-4ce7-affc-1952b6a2e1ad","Type":"ContainerStarted","Data":"a02563ddf1b97e68acdff8249884b6fec1ebcc0a9ad8d06bfafb51afebc52679"} Dec 02 08:51:35 crc kubenswrapper[4895]: I1202 08:51:35.892362 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-4fwjz" event={"ID":"5f55786e-cf8a-4ce7-affc-1952b6a2e1ad","Type":"ContainerStarted","Data":"22ee7237e812344c37742db297e5c0a343917945c5627cc71680f44a3b72e539"} Dec 02 08:51:35 crc kubenswrapper[4895]: I1202 08:51:35.917024 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-4fwjz" podStartSLOduration=1.917001204 podStartE2EDuration="1.917001204s" podCreationTimestamp="2025-12-02 08:51:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:51:35.910937057 +0000 UTC m=+5307.081796710" watchObservedRunningTime="2025-12-02 08:51:35.917001204 +0000 UTC m=+5307.087860837" Dec 02 08:51:36 crc kubenswrapper[4895]: I1202 08:51:36.140469 4895 scope.go:117] "RemoveContainer" containerID="5f3bd2c356ea4faeb32e1c7dba83504ac381c8d421d130309e1326c5000e3875" Dec 02 08:51:36 crc kubenswrapper[4895]: I1202 08:51:36.901724 4895 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" event={"ID":"0468d2d1-a975-45a6-af9f-47adc432fab0","Type":"ContainerStarted","Data":"0ee4c8392d6e79739cbb4ca35ecfead7d1526fc2afd1bf1fe50512c39f515cec"} Dec 02 08:51:37 crc kubenswrapper[4895]: I1202 08:51:37.390718 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-597fd75467-lvrkg" Dec 02 08:51:37 crc kubenswrapper[4895]: I1202 08:51:37.468896 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-68fdf4b965-f6gm5"] Dec 02 08:51:37 crc kubenswrapper[4895]: I1202 08:51:37.469141 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-68fdf4b965-f6gm5" podUID="1389de2c-a59b-4963-ab49-824c6df666d1" containerName="dnsmasq-dns" containerID="cri-o://a1f45b47ec81a501d51a19f21177dfb8c3b2bc94ddff39d3fe5634b12d509be8" gracePeriod=10 Dec 02 08:51:37 crc kubenswrapper[4895]: I1202 08:51:37.920998 4895 generic.go:334] "Generic (PLEG): container finished" podID="1389de2c-a59b-4963-ab49-824c6df666d1" containerID="a1f45b47ec81a501d51a19f21177dfb8c3b2bc94ddff39d3fe5634b12d509be8" exitCode=0 Dec 02 08:51:37 crc kubenswrapper[4895]: I1202 08:51:37.921382 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68fdf4b965-f6gm5" event={"ID":"1389de2c-a59b-4963-ab49-824c6df666d1","Type":"ContainerDied","Data":"a1f45b47ec81a501d51a19f21177dfb8c3b2bc94ddff39d3fe5634b12d509be8"} Dec 02 08:51:37 crc kubenswrapper[4895]: I1202 08:51:37.925044 4895 generic.go:334] "Generic (PLEG): container finished" podID="5f55786e-cf8a-4ce7-affc-1952b6a2e1ad" containerID="a02563ddf1b97e68acdff8249884b6fec1ebcc0a9ad8d06bfafb51afebc52679" exitCode=0 Dec 02 08:51:37 crc kubenswrapper[4895]: I1202 08:51:37.925074 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-4fwjz" 
event={"ID":"5f55786e-cf8a-4ce7-affc-1952b6a2e1ad","Type":"ContainerDied","Data":"a02563ddf1b97e68acdff8249884b6fec1ebcc0a9ad8d06bfafb51afebc52679"} Dec 02 08:51:38 crc kubenswrapper[4895]: I1202 08:51:38.046355 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-68fdf4b965-f6gm5" Dec 02 08:51:38 crc kubenswrapper[4895]: I1202 08:51:38.162692 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1389de2c-a59b-4963-ab49-824c6df666d1-ovsdbserver-sb\") pod \"1389de2c-a59b-4963-ab49-824c6df666d1\" (UID: \"1389de2c-a59b-4963-ab49-824c6df666d1\") " Dec 02 08:51:38 crc kubenswrapper[4895]: I1202 08:51:38.163123 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1389de2c-a59b-4963-ab49-824c6df666d1-dns-svc\") pod \"1389de2c-a59b-4963-ab49-824c6df666d1\" (UID: \"1389de2c-a59b-4963-ab49-824c6df666d1\") " Dec 02 08:51:38 crc kubenswrapper[4895]: I1202 08:51:38.163176 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1389de2c-a59b-4963-ab49-824c6df666d1-ovsdbserver-nb\") pod \"1389de2c-a59b-4963-ab49-824c6df666d1\" (UID: \"1389de2c-a59b-4963-ab49-824c6df666d1\") " Dec 02 08:51:38 crc kubenswrapper[4895]: I1202 08:51:38.163224 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1389de2c-a59b-4963-ab49-824c6df666d1-config\") pod \"1389de2c-a59b-4963-ab49-824c6df666d1\" (UID: \"1389de2c-a59b-4963-ab49-824c6df666d1\") " Dec 02 08:51:38 crc kubenswrapper[4895]: I1202 08:51:38.163327 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rx9fz\" (UniqueName: \"kubernetes.io/projected/1389de2c-a59b-4963-ab49-824c6df666d1-kube-api-access-rx9fz\") 
pod \"1389de2c-a59b-4963-ab49-824c6df666d1\" (UID: \"1389de2c-a59b-4963-ab49-824c6df666d1\") " Dec 02 08:51:38 crc kubenswrapper[4895]: I1202 08:51:38.172510 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1389de2c-a59b-4963-ab49-824c6df666d1-kube-api-access-rx9fz" (OuterVolumeSpecName: "kube-api-access-rx9fz") pod "1389de2c-a59b-4963-ab49-824c6df666d1" (UID: "1389de2c-a59b-4963-ab49-824c6df666d1"). InnerVolumeSpecName "kube-api-access-rx9fz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:51:38 crc kubenswrapper[4895]: I1202 08:51:38.206541 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1389de2c-a59b-4963-ab49-824c6df666d1-config" (OuterVolumeSpecName: "config") pod "1389de2c-a59b-4963-ab49-824c6df666d1" (UID: "1389de2c-a59b-4963-ab49-824c6df666d1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:51:38 crc kubenswrapper[4895]: I1202 08:51:38.206713 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1389de2c-a59b-4963-ab49-824c6df666d1-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1389de2c-a59b-4963-ab49-824c6df666d1" (UID: "1389de2c-a59b-4963-ab49-824c6df666d1"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:51:38 crc kubenswrapper[4895]: I1202 08:51:38.207337 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1389de2c-a59b-4963-ab49-824c6df666d1-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1389de2c-a59b-4963-ab49-824c6df666d1" (UID: "1389de2c-a59b-4963-ab49-824c6df666d1"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:51:38 crc kubenswrapper[4895]: I1202 08:51:38.209674 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1389de2c-a59b-4963-ab49-824c6df666d1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1389de2c-a59b-4963-ab49-824c6df666d1" (UID: "1389de2c-a59b-4963-ab49-824c6df666d1"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:51:38 crc kubenswrapper[4895]: I1202 08:51:38.265611 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rx9fz\" (UniqueName: \"kubernetes.io/projected/1389de2c-a59b-4963-ab49-824c6df666d1-kube-api-access-rx9fz\") on node \"crc\" DevicePath \"\"" Dec 02 08:51:38 crc kubenswrapper[4895]: I1202 08:51:38.265640 4895 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1389de2c-a59b-4963-ab49-824c6df666d1-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 02 08:51:38 crc kubenswrapper[4895]: I1202 08:51:38.265651 4895 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1389de2c-a59b-4963-ab49-824c6df666d1-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 02 08:51:38 crc kubenswrapper[4895]: I1202 08:51:38.265659 4895 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1389de2c-a59b-4963-ab49-824c6df666d1-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 02 08:51:38 crc kubenswrapper[4895]: I1202 08:51:38.265668 4895 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1389de2c-a59b-4963-ab49-824c6df666d1-config\") on node \"crc\" DevicePath \"\"" Dec 02 08:51:38 crc kubenswrapper[4895]: I1202 08:51:38.941434 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68fdf4b965-f6gm5" 
event={"ID":"1389de2c-a59b-4963-ab49-824c6df666d1","Type":"ContainerDied","Data":"bcf6c5e7eaa6f4244e26d6920bcd5cbfba207f1bbeb9eaec15c2faa4ff0022ab"} Dec 02 08:51:38 crc kubenswrapper[4895]: I1202 08:51:38.942056 4895 scope.go:117] "RemoveContainer" containerID="a1f45b47ec81a501d51a19f21177dfb8c3b2bc94ddff39d3fe5634b12d509be8" Dec 02 08:51:38 crc kubenswrapper[4895]: I1202 08:51:38.941605 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-68fdf4b965-f6gm5" Dec 02 08:51:38 crc kubenswrapper[4895]: I1202 08:51:38.989302 4895 scope.go:117] "RemoveContainer" containerID="9c77d4c808e5f41a0e36134b657ad33c15d6f6445daaaac293e1f103d79d10f4" Dec 02 08:51:39 crc kubenswrapper[4895]: I1202 08:51:39.014350 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-68fdf4b965-f6gm5"] Dec 02 08:51:39 crc kubenswrapper[4895]: I1202 08:51:39.024405 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-68fdf4b965-f6gm5"] Dec 02 08:51:39 crc kubenswrapper[4895]: I1202 08:51:39.159331 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1389de2c-a59b-4963-ab49-824c6df666d1" path="/var/lib/kubelet/pods/1389de2c-a59b-4963-ab49-824c6df666d1/volumes" Dec 02 08:51:39 crc kubenswrapper[4895]: I1202 08:51:39.342630 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-4fwjz" Dec 02 08:51:39 crc kubenswrapper[4895]: I1202 08:51:39.487792 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f55786e-cf8a-4ce7-affc-1952b6a2e1ad-config-data\") pod \"5f55786e-cf8a-4ce7-affc-1952b6a2e1ad\" (UID: \"5f55786e-cf8a-4ce7-affc-1952b6a2e1ad\") " Dec 02 08:51:39 crc kubenswrapper[4895]: I1202 08:51:39.488311 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kgw6p\" (UniqueName: \"kubernetes.io/projected/5f55786e-cf8a-4ce7-affc-1952b6a2e1ad-kube-api-access-kgw6p\") pod \"5f55786e-cf8a-4ce7-affc-1952b6a2e1ad\" (UID: \"5f55786e-cf8a-4ce7-affc-1952b6a2e1ad\") " Dec 02 08:51:39 crc kubenswrapper[4895]: I1202 08:51:39.488363 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f55786e-cf8a-4ce7-affc-1952b6a2e1ad-scripts\") pod \"5f55786e-cf8a-4ce7-affc-1952b6a2e1ad\" (UID: \"5f55786e-cf8a-4ce7-affc-1952b6a2e1ad\") " Dec 02 08:51:39 crc kubenswrapper[4895]: I1202 08:51:39.488426 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5f55786e-cf8a-4ce7-affc-1952b6a2e1ad-fernet-keys\") pod \"5f55786e-cf8a-4ce7-affc-1952b6a2e1ad\" (UID: \"5f55786e-cf8a-4ce7-affc-1952b6a2e1ad\") " Dec 02 08:51:39 crc kubenswrapper[4895]: I1202 08:51:39.488532 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f55786e-cf8a-4ce7-affc-1952b6a2e1ad-combined-ca-bundle\") pod \"5f55786e-cf8a-4ce7-affc-1952b6a2e1ad\" (UID: \"5f55786e-cf8a-4ce7-affc-1952b6a2e1ad\") " Dec 02 08:51:39 crc kubenswrapper[4895]: I1202 08:51:39.488610 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/5f55786e-cf8a-4ce7-affc-1952b6a2e1ad-credential-keys\") pod \"5f55786e-cf8a-4ce7-affc-1952b6a2e1ad\" (UID: \"5f55786e-cf8a-4ce7-affc-1952b6a2e1ad\") " Dec 02 08:51:39 crc kubenswrapper[4895]: I1202 08:51:39.494119 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f55786e-cf8a-4ce7-affc-1952b6a2e1ad-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "5f55786e-cf8a-4ce7-affc-1952b6a2e1ad" (UID: "5f55786e-cf8a-4ce7-affc-1952b6a2e1ad"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:51:39 crc kubenswrapper[4895]: I1202 08:51:39.494249 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f55786e-cf8a-4ce7-affc-1952b6a2e1ad-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "5f55786e-cf8a-4ce7-affc-1952b6a2e1ad" (UID: "5f55786e-cf8a-4ce7-affc-1952b6a2e1ad"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:51:39 crc kubenswrapper[4895]: I1202 08:51:39.494313 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f55786e-cf8a-4ce7-affc-1952b6a2e1ad-scripts" (OuterVolumeSpecName: "scripts") pod "5f55786e-cf8a-4ce7-affc-1952b6a2e1ad" (UID: "5f55786e-cf8a-4ce7-affc-1952b6a2e1ad"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:51:39 crc kubenswrapper[4895]: I1202 08:51:39.495625 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f55786e-cf8a-4ce7-affc-1952b6a2e1ad-kube-api-access-kgw6p" (OuterVolumeSpecName: "kube-api-access-kgw6p") pod "5f55786e-cf8a-4ce7-affc-1952b6a2e1ad" (UID: "5f55786e-cf8a-4ce7-affc-1952b6a2e1ad"). InnerVolumeSpecName "kube-api-access-kgw6p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:51:39 crc kubenswrapper[4895]: I1202 08:51:39.510128 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f55786e-cf8a-4ce7-affc-1952b6a2e1ad-config-data" (OuterVolumeSpecName: "config-data") pod "5f55786e-cf8a-4ce7-affc-1952b6a2e1ad" (UID: "5f55786e-cf8a-4ce7-affc-1952b6a2e1ad"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:51:39 crc kubenswrapper[4895]: I1202 08:51:39.514262 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f55786e-cf8a-4ce7-affc-1952b6a2e1ad-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5f55786e-cf8a-4ce7-affc-1952b6a2e1ad" (UID: "5f55786e-cf8a-4ce7-affc-1952b6a2e1ad"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:51:39 crc kubenswrapper[4895]: I1202 08:51:39.590881 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kgw6p\" (UniqueName: \"kubernetes.io/projected/5f55786e-cf8a-4ce7-affc-1952b6a2e1ad-kube-api-access-kgw6p\") on node \"crc\" DevicePath \"\"" Dec 02 08:51:39 crc kubenswrapper[4895]: I1202 08:51:39.590920 4895 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f55786e-cf8a-4ce7-affc-1952b6a2e1ad-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 08:51:39 crc kubenswrapper[4895]: I1202 08:51:39.590929 4895 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5f55786e-cf8a-4ce7-affc-1952b6a2e1ad-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 02 08:51:39 crc kubenswrapper[4895]: I1202 08:51:39.590944 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f55786e-cf8a-4ce7-affc-1952b6a2e1ad-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 
02 08:51:39 crc kubenswrapper[4895]: I1202 08:51:39.590955 4895 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5f55786e-cf8a-4ce7-affc-1952b6a2e1ad-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 02 08:51:39 crc kubenswrapper[4895]: I1202 08:51:39.590963 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f55786e-cf8a-4ce7-affc-1952b6a2e1ad-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 08:51:39 crc kubenswrapper[4895]: I1202 08:51:39.956866 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-4fwjz" event={"ID":"5f55786e-cf8a-4ce7-affc-1952b6a2e1ad","Type":"ContainerDied","Data":"22ee7237e812344c37742db297e5c0a343917945c5627cc71680f44a3b72e539"} Dec 02 08:51:39 crc kubenswrapper[4895]: I1202 08:51:39.956948 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="22ee7237e812344c37742db297e5c0a343917945c5627cc71680f44a3b72e539" Dec 02 08:51:39 crc kubenswrapper[4895]: I1202 08:51:39.956982 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-4fwjz" Dec 02 08:51:40 crc kubenswrapper[4895]: I1202 08:51:40.054383 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-7db5ccf6f4-vpr49"] Dec 02 08:51:40 crc kubenswrapper[4895]: E1202 08:51:40.055129 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1389de2c-a59b-4963-ab49-824c6df666d1" containerName="dnsmasq-dns" Dec 02 08:51:40 crc kubenswrapper[4895]: I1202 08:51:40.055162 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="1389de2c-a59b-4963-ab49-824c6df666d1" containerName="dnsmasq-dns" Dec 02 08:51:40 crc kubenswrapper[4895]: E1202 08:51:40.055186 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f55786e-cf8a-4ce7-affc-1952b6a2e1ad" containerName="keystone-bootstrap" Dec 02 08:51:40 crc kubenswrapper[4895]: I1202 08:51:40.055199 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f55786e-cf8a-4ce7-affc-1952b6a2e1ad" containerName="keystone-bootstrap" Dec 02 08:51:40 crc kubenswrapper[4895]: E1202 08:51:40.055267 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1389de2c-a59b-4963-ab49-824c6df666d1" containerName="init" Dec 02 08:51:40 crc kubenswrapper[4895]: I1202 08:51:40.055281 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="1389de2c-a59b-4963-ab49-824c6df666d1" containerName="init" Dec 02 08:51:40 crc kubenswrapper[4895]: I1202 08:51:40.055580 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f55786e-cf8a-4ce7-affc-1952b6a2e1ad" containerName="keystone-bootstrap" Dec 02 08:51:40 crc kubenswrapper[4895]: I1202 08:51:40.055610 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="1389de2c-a59b-4963-ab49-824c6df666d1" containerName="dnsmasq-dns" Dec 02 08:51:40 crc kubenswrapper[4895]: I1202 08:51:40.057281 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-7db5ccf6f4-vpr49" Dec 02 08:51:40 crc kubenswrapper[4895]: I1202 08:51:40.060234 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 02 08:51:40 crc kubenswrapper[4895]: I1202 08:51:40.061155 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 02 08:51:40 crc kubenswrapper[4895]: I1202 08:51:40.061157 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-njttf" Dec 02 08:51:40 crc kubenswrapper[4895]: I1202 08:51:40.071477 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 02 08:51:40 crc kubenswrapper[4895]: I1202 08:51:40.098842 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7db5ccf6f4-vpr49"] Dec 02 08:51:40 crc kubenswrapper[4895]: I1202 08:51:40.203621 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9df75d68-0a6b-4fcb-8241-3ec3b95a0a1c-config-data\") pod \"keystone-7db5ccf6f4-vpr49\" (UID: \"9df75d68-0a6b-4fcb-8241-3ec3b95a0a1c\") " pod="openstack/keystone-7db5ccf6f4-vpr49" Dec 02 08:51:40 crc kubenswrapper[4895]: I1202 08:51:40.203727 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frtxm\" (UniqueName: \"kubernetes.io/projected/9df75d68-0a6b-4fcb-8241-3ec3b95a0a1c-kube-api-access-frtxm\") pod \"keystone-7db5ccf6f4-vpr49\" (UID: \"9df75d68-0a6b-4fcb-8241-3ec3b95a0a1c\") " pod="openstack/keystone-7db5ccf6f4-vpr49" Dec 02 08:51:40 crc kubenswrapper[4895]: I1202 08:51:40.203828 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9df75d68-0a6b-4fcb-8241-3ec3b95a0a1c-scripts\") pod \"keystone-7db5ccf6f4-vpr49\" (UID: 
\"9df75d68-0a6b-4fcb-8241-3ec3b95a0a1c\") " pod="openstack/keystone-7db5ccf6f4-vpr49" Dec 02 08:51:40 crc kubenswrapper[4895]: I1202 08:51:40.205099 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9df75d68-0a6b-4fcb-8241-3ec3b95a0a1c-combined-ca-bundle\") pod \"keystone-7db5ccf6f4-vpr49\" (UID: \"9df75d68-0a6b-4fcb-8241-3ec3b95a0a1c\") " pod="openstack/keystone-7db5ccf6f4-vpr49" Dec 02 08:51:40 crc kubenswrapper[4895]: I1202 08:51:40.205332 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9df75d68-0a6b-4fcb-8241-3ec3b95a0a1c-credential-keys\") pod \"keystone-7db5ccf6f4-vpr49\" (UID: \"9df75d68-0a6b-4fcb-8241-3ec3b95a0a1c\") " pod="openstack/keystone-7db5ccf6f4-vpr49" Dec 02 08:51:40 crc kubenswrapper[4895]: I1202 08:51:40.205398 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9df75d68-0a6b-4fcb-8241-3ec3b95a0a1c-fernet-keys\") pod \"keystone-7db5ccf6f4-vpr49\" (UID: \"9df75d68-0a6b-4fcb-8241-3ec3b95a0a1c\") " pod="openstack/keystone-7db5ccf6f4-vpr49" Dec 02 08:51:40 crc kubenswrapper[4895]: I1202 08:51:40.307643 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9df75d68-0a6b-4fcb-8241-3ec3b95a0a1c-config-data\") pod \"keystone-7db5ccf6f4-vpr49\" (UID: \"9df75d68-0a6b-4fcb-8241-3ec3b95a0a1c\") " pod="openstack/keystone-7db5ccf6f4-vpr49" Dec 02 08:51:40 crc kubenswrapper[4895]: I1202 08:51:40.307730 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-frtxm\" (UniqueName: \"kubernetes.io/projected/9df75d68-0a6b-4fcb-8241-3ec3b95a0a1c-kube-api-access-frtxm\") pod \"keystone-7db5ccf6f4-vpr49\" (UID: 
\"9df75d68-0a6b-4fcb-8241-3ec3b95a0a1c\") " pod="openstack/keystone-7db5ccf6f4-vpr49" Dec 02 08:51:40 crc kubenswrapper[4895]: I1202 08:51:40.307851 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9df75d68-0a6b-4fcb-8241-3ec3b95a0a1c-scripts\") pod \"keystone-7db5ccf6f4-vpr49\" (UID: \"9df75d68-0a6b-4fcb-8241-3ec3b95a0a1c\") " pod="openstack/keystone-7db5ccf6f4-vpr49" Dec 02 08:51:40 crc kubenswrapper[4895]: I1202 08:51:40.307885 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9df75d68-0a6b-4fcb-8241-3ec3b95a0a1c-combined-ca-bundle\") pod \"keystone-7db5ccf6f4-vpr49\" (UID: \"9df75d68-0a6b-4fcb-8241-3ec3b95a0a1c\") " pod="openstack/keystone-7db5ccf6f4-vpr49" Dec 02 08:51:40 crc kubenswrapper[4895]: I1202 08:51:40.307923 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9df75d68-0a6b-4fcb-8241-3ec3b95a0a1c-credential-keys\") pod \"keystone-7db5ccf6f4-vpr49\" (UID: \"9df75d68-0a6b-4fcb-8241-3ec3b95a0a1c\") " pod="openstack/keystone-7db5ccf6f4-vpr49" Dec 02 08:51:40 crc kubenswrapper[4895]: I1202 08:51:40.307945 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9df75d68-0a6b-4fcb-8241-3ec3b95a0a1c-fernet-keys\") pod \"keystone-7db5ccf6f4-vpr49\" (UID: \"9df75d68-0a6b-4fcb-8241-3ec3b95a0a1c\") " pod="openstack/keystone-7db5ccf6f4-vpr49" Dec 02 08:51:40 crc kubenswrapper[4895]: I1202 08:51:40.323457 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9df75d68-0a6b-4fcb-8241-3ec3b95a0a1c-combined-ca-bundle\") pod \"keystone-7db5ccf6f4-vpr49\" (UID: \"9df75d68-0a6b-4fcb-8241-3ec3b95a0a1c\") " pod="openstack/keystone-7db5ccf6f4-vpr49" Dec 02 08:51:40 crc 
kubenswrapper[4895]: I1202 08:51:40.324214 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9df75d68-0a6b-4fcb-8241-3ec3b95a0a1c-scripts\") pod \"keystone-7db5ccf6f4-vpr49\" (UID: \"9df75d68-0a6b-4fcb-8241-3ec3b95a0a1c\") " pod="openstack/keystone-7db5ccf6f4-vpr49" Dec 02 08:51:40 crc kubenswrapper[4895]: I1202 08:51:40.324594 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9df75d68-0a6b-4fcb-8241-3ec3b95a0a1c-credential-keys\") pod \"keystone-7db5ccf6f4-vpr49\" (UID: \"9df75d68-0a6b-4fcb-8241-3ec3b95a0a1c\") " pod="openstack/keystone-7db5ccf6f4-vpr49" Dec 02 08:51:40 crc kubenswrapper[4895]: I1202 08:51:40.325105 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9df75d68-0a6b-4fcb-8241-3ec3b95a0a1c-fernet-keys\") pod \"keystone-7db5ccf6f4-vpr49\" (UID: \"9df75d68-0a6b-4fcb-8241-3ec3b95a0a1c\") " pod="openstack/keystone-7db5ccf6f4-vpr49" Dec 02 08:51:40 crc kubenswrapper[4895]: I1202 08:51:40.325150 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9df75d68-0a6b-4fcb-8241-3ec3b95a0a1c-config-data\") pod \"keystone-7db5ccf6f4-vpr49\" (UID: \"9df75d68-0a6b-4fcb-8241-3ec3b95a0a1c\") " pod="openstack/keystone-7db5ccf6f4-vpr49" Dec 02 08:51:40 crc kubenswrapper[4895]: I1202 08:51:40.333112 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-frtxm\" (UniqueName: \"kubernetes.io/projected/9df75d68-0a6b-4fcb-8241-3ec3b95a0a1c-kube-api-access-frtxm\") pod \"keystone-7db5ccf6f4-vpr49\" (UID: \"9df75d68-0a6b-4fcb-8241-3ec3b95a0a1c\") " pod="openstack/keystone-7db5ccf6f4-vpr49" Dec 02 08:51:40 crc kubenswrapper[4895]: I1202 08:51:40.386663 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-7db5ccf6f4-vpr49" Dec 02 08:51:40 crc kubenswrapper[4895]: I1202 08:51:40.633245 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7db5ccf6f4-vpr49"] Dec 02 08:51:40 crc kubenswrapper[4895]: I1202 08:51:40.966006 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7db5ccf6f4-vpr49" event={"ID":"9df75d68-0a6b-4fcb-8241-3ec3b95a0a1c","Type":"ContainerStarted","Data":"ad57555b76598df1ae99e6b8e62e41417562be78c10c2b3ab0dd480ea66cf050"} Dec 02 08:51:40 crc kubenswrapper[4895]: I1202 08:51:40.966432 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-7db5ccf6f4-vpr49" Dec 02 08:51:40 crc kubenswrapper[4895]: I1202 08:51:40.966448 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7db5ccf6f4-vpr49" event={"ID":"9df75d68-0a6b-4fcb-8241-3ec3b95a0a1c","Type":"ContainerStarted","Data":"3f20e55564a5e1a2e60f02374e6153743bd465802a0829bf585a55c29bfea035"} Dec 02 08:51:40 crc kubenswrapper[4895]: I1202 08:51:40.983702 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-7db5ccf6f4-vpr49" podStartSLOduration=0.983685291 podStartE2EDuration="983.685291ms" podCreationTimestamp="2025-12-02 08:51:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:51:40.982678759 +0000 UTC m=+5312.153538372" watchObservedRunningTime="2025-12-02 08:51:40.983685291 +0000 UTC m=+5312.154544904" Dec 02 08:52:11 crc kubenswrapper[4895]: I1202 08:52:11.985978 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-7db5ccf6f4-vpr49" Dec 02 08:52:16 crc kubenswrapper[4895]: I1202 08:52:16.319520 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Dec 02 08:52:16 crc kubenswrapper[4895]: I1202 08:52:16.321933 4895 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 02 08:52:16 crc kubenswrapper[4895]: I1202 08:52:16.324410 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Dec 02 08:52:16 crc kubenswrapper[4895]: I1202 08:52:16.324479 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Dec 02 08:52:16 crc kubenswrapper[4895]: I1202 08:52:16.324624 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-lcnsh" Dec 02 08:52:16 crc kubenswrapper[4895]: I1202 08:52:16.335130 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 02 08:52:16 crc kubenswrapper[4895]: I1202 08:52:16.348875 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Dec 02 08:52:16 crc kubenswrapper[4895]: E1202 08:52:16.351719 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-qwkh8 openstack-config openstack-config-secret], unattached volumes=[], failed to process volumes=[kube-api-access-qwkh8 openstack-config openstack-config-secret]: context canceled" pod="openstack/openstackclient" podUID="6b2495e2-2e3f-422d-b757-421c5c4fc241" Dec 02 08:52:16 crc kubenswrapper[4895]: I1202 08:52:16.360944 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Dec 02 08:52:16 crc kubenswrapper[4895]: I1202 08:52:16.389254 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Dec 02 08:52:16 crc kubenswrapper[4895]: I1202 08:52:16.390661 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 02 08:52:16 crc kubenswrapper[4895]: I1202 08:52:16.401025 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 02 08:52:16 crc kubenswrapper[4895]: I1202 08:52:16.523195 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/cfe1444d-9391-4b5b-a770-14e55da2a63d-openstack-config\") pod \"openstackclient\" (UID: \"cfe1444d-9391-4b5b-a770-14e55da2a63d\") " pod="openstack/openstackclient" Dec 02 08:52:16 crc kubenswrapper[4895]: I1202 08:52:16.523437 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/cfe1444d-9391-4b5b-a770-14e55da2a63d-openstack-config-secret\") pod \"openstackclient\" (UID: \"cfe1444d-9391-4b5b-a770-14e55da2a63d\") " pod="openstack/openstackclient" Dec 02 08:52:16 crc kubenswrapper[4895]: I1202 08:52:16.523513 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5gtm\" (UniqueName: \"kubernetes.io/projected/cfe1444d-9391-4b5b-a770-14e55da2a63d-kube-api-access-b5gtm\") pod \"openstackclient\" (UID: \"cfe1444d-9391-4b5b-a770-14e55da2a63d\") " pod="openstack/openstackclient" Dec 02 08:52:16 crc kubenswrapper[4895]: I1202 08:52:16.625608 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/cfe1444d-9391-4b5b-a770-14e55da2a63d-openstack-config\") pod \"openstackclient\" (UID: \"cfe1444d-9391-4b5b-a770-14e55da2a63d\") " pod="openstack/openstackclient" Dec 02 08:52:16 crc kubenswrapper[4895]: I1202 08:52:16.625671 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: 
\"kubernetes.io/secret/cfe1444d-9391-4b5b-a770-14e55da2a63d-openstack-config-secret\") pod \"openstackclient\" (UID: \"cfe1444d-9391-4b5b-a770-14e55da2a63d\") " pod="openstack/openstackclient" Dec 02 08:52:16 crc kubenswrapper[4895]: I1202 08:52:16.625696 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5gtm\" (UniqueName: \"kubernetes.io/projected/cfe1444d-9391-4b5b-a770-14e55da2a63d-kube-api-access-b5gtm\") pod \"openstackclient\" (UID: \"cfe1444d-9391-4b5b-a770-14e55da2a63d\") " pod="openstack/openstackclient" Dec 02 08:52:16 crc kubenswrapper[4895]: I1202 08:52:16.627916 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/cfe1444d-9391-4b5b-a770-14e55da2a63d-openstack-config\") pod \"openstackclient\" (UID: \"cfe1444d-9391-4b5b-a770-14e55da2a63d\") " pod="openstack/openstackclient" Dec 02 08:52:16 crc kubenswrapper[4895]: I1202 08:52:16.634018 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/cfe1444d-9391-4b5b-a770-14e55da2a63d-openstack-config-secret\") pod \"openstackclient\" (UID: \"cfe1444d-9391-4b5b-a770-14e55da2a63d\") " pod="openstack/openstackclient" Dec 02 08:52:16 crc kubenswrapper[4895]: I1202 08:52:16.649559 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5gtm\" (UniqueName: \"kubernetes.io/projected/cfe1444d-9391-4b5b-a770-14e55da2a63d-kube-api-access-b5gtm\") pod \"openstackclient\" (UID: \"cfe1444d-9391-4b5b-a770-14e55da2a63d\") " pod="openstack/openstackclient" Dec 02 08:52:16 crc kubenswrapper[4895]: I1202 08:52:16.712512 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 02 08:52:17 crc kubenswrapper[4895]: I1202 08:52:17.152704 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b2495e2-2e3f-422d-b757-421c5c4fc241" path="/var/lib/kubelet/pods/6b2495e2-2e3f-422d-b757-421c5c4fc241/volumes" Dec 02 08:52:17 crc kubenswrapper[4895]: I1202 08:52:17.180597 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 02 08:52:17 crc kubenswrapper[4895]: I1202 08:52:17.373311 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 02 08:52:17 crc kubenswrapper[4895]: I1202 08:52:17.373341 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"cfe1444d-9391-4b5b-a770-14e55da2a63d","Type":"ContainerStarted","Data":"f20b52de7232adf766d00e3e7d74feff75df26b00c31c85e3d851ff09c011c1f"} Dec 02 08:52:17 crc kubenswrapper[4895]: I1202 08:52:17.373425 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"cfe1444d-9391-4b5b-a770-14e55da2a63d","Type":"ContainerStarted","Data":"28c84aab3962bac78bff9b13116b288a052dfafa60af3a9b8f01633d011931ba"} Dec 02 08:52:17 crc kubenswrapper[4895]: I1202 08:52:17.384673 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 02 08:52:17 crc kubenswrapper[4895]: I1202 08:52:17.396253 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=1.3962307520000001 podStartE2EDuration="1.396230752s" podCreationTimestamp="2025-12-02 08:52:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:52:17.390561546 +0000 UTC m=+5348.561421169" watchObservedRunningTime="2025-12-02 08:52:17.396230752 +0000 UTC m=+5348.567090365" Dec 02 08:52:17 crc kubenswrapper[4895]: I1202 08:52:17.399502 4895 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="6b2495e2-2e3f-422d-b757-421c5c4fc241" podUID="cfe1444d-9391-4b5b-a770-14e55da2a63d" Dec 02 08:52:18 crc kubenswrapper[4895]: I1202 08:52:18.381616 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 02 08:52:18 crc kubenswrapper[4895]: I1202 08:52:18.405128 4895 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="6b2495e2-2e3f-422d-b757-421c5c4fc241" podUID="cfe1444d-9391-4b5b-a770-14e55da2a63d" Dec 02 08:52:29 crc kubenswrapper[4895]: I1202 08:52:29.408888 4895 scope.go:117] "RemoveContainer" containerID="f878460e0b4a0877664a27db856ce73d083ed35fe3b1b80f06c55b474f033f48" Dec 02 08:52:29 crc kubenswrapper[4895]: I1202 08:52:29.439081 4895 scope.go:117] "RemoveContainer" containerID="e6945bf8ec19b3f33bfbbe32a1790d6f153dcdd29fea0d52bbb2ac72701756c3" Dec 02 08:52:32 crc kubenswrapper[4895]: I1202 08:52:32.183708 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-p5572"] Dec 02 08:52:32 crc kubenswrapper[4895]: I1202 08:52:32.190140 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-p5572" Dec 02 08:52:32 crc kubenswrapper[4895]: I1202 08:52:32.203568 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-p5572"] Dec 02 08:52:32 crc kubenswrapper[4895]: I1202 08:52:32.321548 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-js9vn\" (UniqueName: \"kubernetes.io/projected/75eed9f8-e31e-4f7c-bc83-3384d1b874bd-kube-api-access-js9vn\") pod \"certified-operators-p5572\" (UID: \"75eed9f8-e31e-4f7c-bc83-3384d1b874bd\") " pod="openshift-marketplace/certified-operators-p5572" Dec 02 08:52:32 crc kubenswrapper[4895]: I1202 08:52:32.321648 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75eed9f8-e31e-4f7c-bc83-3384d1b874bd-catalog-content\") pod \"certified-operators-p5572\" (UID: \"75eed9f8-e31e-4f7c-bc83-3384d1b874bd\") " pod="openshift-marketplace/certified-operators-p5572" Dec 02 08:52:32 crc kubenswrapper[4895]: I1202 08:52:32.321748 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75eed9f8-e31e-4f7c-bc83-3384d1b874bd-utilities\") pod \"certified-operators-p5572\" (UID: \"75eed9f8-e31e-4f7c-bc83-3384d1b874bd\") " pod="openshift-marketplace/certified-operators-p5572" Dec 02 08:52:32 crc kubenswrapper[4895]: I1202 08:52:32.423078 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75eed9f8-e31e-4f7c-bc83-3384d1b874bd-catalog-content\") pod \"certified-operators-p5572\" (UID: \"75eed9f8-e31e-4f7c-bc83-3384d1b874bd\") " pod="openshift-marketplace/certified-operators-p5572" Dec 02 08:52:32 crc kubenswrapper[4895]: I1202 08:52:32.423542 4895 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75eed9f8-e31e-4f7c-bc83-3384d1b874bd-utilities\") pod \"certified-operators-p5572\" (UID: \"75eed9f8-e31e-4f7c-bc83-3384d1b874bd\") " pod="openshift-marketplace/certified-operators-p5572" Dec 02 08:52:32 crc kubenswrapper[4895]: I1202 08:52:32.423612 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-js9vn\" (UniqueName: \"kubernetes.io/projected/75eed9f8-e31e-4f7c-bc83-3384d1b874bd-kube-api-access-js9vn\") pod \"certified-operators-p5572\" (UID: \"75eed9f8-e31e-4f7c-bc83-3384d1b874bd\") " pod="openshift-marketplace/certified-operators-p5572" Dec 02 08:52:32 crc kubenswrapper[4895]: I1202 08:52:32.423901 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75eed9f8-e31e-4f7c-bc83-3384d1b874bd-catalog-content\") pod \"certified-operators-p5572\" (UID: \"75eed9f8-e31e-4f7c-bc83-3384d1b874bd\") " pod="openshift-marketplace/certified-operators-p5572" Dec 02 08:52:32 crc kubenswrapper[4895]: I1202 08:52:32.424079 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75eed9f8-e31e-4f7c-bc83-3384d1b874bd-utilities\") pod \"certified-operators-p5572\" (UID: \"75eed9f8-e31e-4f7c-bc83-3384d1b874bd\") " pod="openshift-marketplace/certified-operators-p5572" Dec 02 08:52:32 crc kubenswrapper[4895]: I1202 08:52:32.443583 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-js9vn\" (UniqueName: \"kubernetes.io/projected/75eed9f8-e31e-4f7c-bc83-3384d1b874bd-kube-api-access-js9vn\") pod \"certified-operators-p5572\" (UID: \"75eed9f8-e31e-4f7c-bc83-3384d1b874bd\") " pod="openshift-marketplace/certified-operators-p5572" Dec 02 08:52:32 crc kubenswrapper[4895]: I1202 08:52:32.546259 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-p5572" Dec 02 08:52:33 crc kubenswrapper[4895]: I1202 08:52:33.075057 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-p5572"] Dec 02 08:52:33 crc kubenswrapper[4895]: W1202 08:52:33.084941 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod75eed9f8_e31e_4f7c_bc83_3384d1b874bd.slice/crio-dfe647f09b802ba4add339b25152b1e81102fca23623078152705fd57d3022df WatchSource:0}: Error finding container dfe647f09b802ba4add339b25152b1e81102fca23623078152705fd57d3022df: Status 404 returned error can't find the container with id dfe647f09b802ba4add339b25152b1e81102fca23623078152705fd57d3022df Dec 02 08:52:34 crc kubenswrapper[4895]: I1202 08:52:34.175299 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p5572" event={"ID":"75eed9f8-e31e-4f7c-bc83-3384d1b874bd","Type":"ContainerStarted","Data":"dfe647f09b802ba4add339b25152b1e81102fca23623078152705fd57d3022df"} Dec 02 08:52:34 crc kubenswrapper[4895]: E1202 08:52:34.350901 4895 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod75eed9f8_e31e_4f7c_bc83_3384d1b874bd.slice/crio-conmon-08879a19c3a3c6cc6e95d86ed6b9dc26fff7f52b55604248cd3ca7127a8ffd46.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod75eed9f8_e31e_4f7c_bc83_3384d1b874bd.slice/crio-08879a19c3a3c6cc6e95d86ed6b9dc26fff7f52b55604248cd3ca7127a8ffd46.scope\": RecentStats: unable to find data in memory cache]" Dec 02 08:52:35 crc kubenswrapper[4895]: I1202 08:52:35.191916 4895 generic.go:334] "Generic (PLEG): container finished" podID="75eed9f8-e31e-4f7c-bc83-3384d1b874bd" 
containerID="08879a19c3a3c6cc6e95d86ed6b9dc26fff7f52b55604248cd3ca7127a8ffd46" exitCode=0 Dec 02 08:52:35 crc kubenswrapper[4895]: I1202 08:52:35.192051 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p5572" event={"ID":"75eed9f8-e31e-4f7c-bc83-3384d1b874bd","Type":"ContainerDied","Data":"08879a19c3a3c6cc6e95d86ed6b9dc26fff7f52b55604248cd3ca7127a8ffd46"} Dec 02 08:52:35 crc kubenswrapper[4895]: I1202 08:52:35.196215 4895 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 02 08:52:36 crc kubenswrapper[4895]: I1202 08:52:36.205560 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p5572" event={"ID":"75eed9f8-e31e-4f7c-bc83-3384d1b874bd","Type":"ContainerStarted","Data":"c04618868fa033c4c220f425bb82b1e8a6c770a16ad7e9e00a8acebcc025d8e6"} Dec 02 08:52:37 crc kubenswrapper[4895]: I1202 08:52:37.215586 4895 generic.go:334] "Generic (PLEG): container finished" podID="75eed9f8-e31e-4f7c-bc83-3384d1b874bd" containerID="c04618868fa033c4c220f425bb82b1e8a6c770a16ad7e9e00a8acebcc025d8e6" exitCode=0 Dec 02 08:52:37 crc kubenswrapper[4895]: I1202 08:52:37.215654 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p5572" event={"ID":"75eed9f8-e31e-4f7c-bc83-3384d1b874bd","Type":"ContainerDied","Data":"c04618868fa033c4c220f425bb82b1e8a6c770a16ad7e9e00a8acebcc025d8e6"} Dec 02 08:52:38 crc kubenswrapper[4895]: I1202 08:52:38.225606 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p5572" event={"ID":"75eed9f8-e31e-4f7c-bc83-3384d1b874bd","Type":"ContainerStarted","Data":"97fef6de437e20cf0ea6669d682c19cb3454ac478c99cad77e4cf5f3320ffc8d"} Dec 02 08:52:38 crc kubenswrapper[4895]: I1202 08:52:38.248784 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-p5572" 
podStartSLOduration=3.555619845 podStartE2EDuration="6.248759329s" podCreationTimestamp="2025-12-02 08:52:32 +0000 UTC" firstStartedPulling="2025-12-02 08:52:35.195872072 +0000 UTC m=+5366.366731685" lastFinishedPulling="2025-12-02 08:52:37.889011546 +0000 UTC m=+5369.059871169" observedRunningTime="2025-12-02 08:52:38.245108945 +0000 UTC m=+5369.415968558" watchObservedRunningTime="2025-12-02 08:52:38.248759329 +0000 UTC m=+5369.419618962" Dec 02 08:52:42 crc kubenswrapper[4895]: I1202 08:52:42.546885 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-p5572" Dec 02 08:52:42 crc kubenswrapper[4895]: I1202 08:52:42.547461 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-p5572" Dec 02 08:52:42 crc kubenswrapper[4895]: I1202 08:52:42.595364 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-p5572" Dec 02 08:52:43 crc kubenswrapper[4895]: I1202 08:52:43.325042 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-p5572" Dec 02 08:52:43 crc kubenswrapper[4895]: I1202 08:52:43.394067 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-p5572"] Dec 02 08:52:45 crc kubenswrapper[4895]: I1202 08:52:45.302338 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-p5572" podUID="75eed9f8-e31e-4f7c-bc83-3384d1b874bd" containerName="registry-server" containerID="cri-o://97fef6de437e20cf0ea6669d682c19cb3454ac478c99cad77e4cf5f3320ffc8d" gracePeriod=2 Dec 02 08:52:45 crc kubenswrapper[4895]: I1202 08:52:45.743680 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-p5572" Dec 02 08:52:45 crc kubenswrapper[4895]: I1202 08:52:45.865339 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75eed9f8-e31e-4f7c-bc83-3384d1b874bd-catalog-content\") pod \"75eed9f8-e31e-4f7c-bc83-3384d1b874bd\" (UID: \"75eed9f8-e31e-4f7c-bc83-3384d1b874bd\") " Dec 02 08:52:45 crc kubenswrapper[4895]: I1202 08:52:45.865819 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75eed9f8-e31e-4f7c-bc83-3384d1b874bd-utilities\") pod \"75eed9f8-e31e-4f7c-bc83-3384d1b874bd\" (UID: \"75eed9f8-e31e-4f7c-bc83-3384d1b874bd\") " Dec 02 08:52:45 crc kubenswrapper[4895]: I1202 08:52:45.865968 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-js9vn\" (UniqueName: \"kubernetes.io/projected/75eed9f8-e31e-4f7c-bc83-3384d1b874bd-kube-api-access-js9vn\") pod \"75eed9f8-e31e-4f7c-bc83-3384d1b874bd\" (UID: \"75eed9f8-e31e-4f7c-bc83-3384d1b874bd\") " Dec 02 08:52:45 crc kubenswrapper[4895]: I1202 08:52:45.866601 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75eed9f8-e31e-4f7c-bc83-3384d1b874bd-utilities" (OuterVolumeSpecName: "utilities") pod "75eed9f8-e31e-4f7c-bc83-3384d1b874bd" (UID: "75eed9f8-e31e-4f7c-bc83-3384d1b874bd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:52:45 crc kubenswrapper[4895]: I1202 08:52:45.873730 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75eed9f8-e31e-4f7c-bc83-3384d1b874bd-kube-api-access-js9vn" (OuterVolumeSpecName: "kube-api-access-js9vn") pod "75eed9f8-e31e-4f7c-bc83-3384d1b874bd" (UID: "75eed9f8-e31e-4f7c-bc83-3384d1b874bd"). InnerVolumeSpecName "kube-api-access-js9vn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:52:45 crc kubenswrapper[4895]: I1202 08:52:45.922445 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75eed9f8-e31e-4f7c-bc83-3384d1b874bd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "75eed9f8-e31e-4f7c-bc83-3384d1b874bd" (UID: "75eed9f8-e31e-4f7c-bc83-3384d1b874bd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:52:45 crc kubenswrapper[4895]: I1202 08:52:45.968198 4895 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75eed9f8-e31e-4f7c-bc83-3384d1b874bd-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 08:52:45 crc kubenswrapper[4895]: I1202 08:52:45.968812 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-js9vn\" (UniqueName: \"kubernetes.io/projected/75eed9f8-e31e-4f7c-bc83-3384d1b874bd-kube-api-access-js9vn\") on node \"crc\" DevicePath \"\"" Dec 02 08:52:45 crc kubenswrapper[4895]: I1202 08:52:45.968890 4895 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75eed9f8-e31e-4f7c-bc83-3384d1b874bd-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 08:52:46 crc kubenswrapper[4895]: I1202 08:52:46.315935 4895 generic.go:334] "Generic (PLEG): container finished" podID="75eed9f8-e31e-4f7c-bc83-3384d1b874bd" containerID="97fef6de437e20cf0ea6669d682c19cb3454ac478c99cad77e4cf5f3320ffc8d" exitCode=0 Dec 02 08:52:46 crc kubenswrapper[4895]: I1202 08:52:46.315990 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p5572" event={"ID":"75eed9f8-e31e-4f7c-bc83-3384d1b874bd","Type":"ContainerDied","Data":"97fef6de437e20cf0ea6669d682c19cb3454ac478c99cad77e4cf5f3320ffc8d"} Dec 02 08:52:46 crc kubenswrapper[4895]: I1202 08:52:46.316028 4895 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/certified-operators-p5572" Dec 02 08:52:46 crc kubenswrapper[4895]: I1202 08:52:46.316054 4895 scope.go:117] "RemoveContainer" containerID="97fef6de437e20cf0ea6669d682c19cb3454ac478c99cad77e4cf5f3320ffc8d" Dec 02 08:52:46 crc kubenswrapper[4895]: I1202 08:52:46.316038 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p5572" event={"ID":"75eed9f8-e31e-4f7c-bc83-3384d1b874bd","Type":"ContainerDied","Data":"dfe647f09b802ba4add339b25152b1e81102fca23623078152705fd57d3022df"} Dec 02 08:52:46 crc kubenswrapper[4895]: I1202 08:52:46.338526 4895 scope.go:117] "RemoveContainer" containerID="c04618868fa033c4c220f425bb82b1e8a6c770a16ad7e9e00a8acebcc025d8e6" Dec 02 08:52:46 crc kubenswrapper[4895]: I1202 08:52:46.355709 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-p5572"] Dec 02 08:52:46 crc kubenswrapper[4895]: I1202 08:52:46.366726 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-p5572"] Dec 02 08:52:46 crc kubenswrapper[4895]: I1202 08:52:46.382094 4895 scope.go:117] "RemoveContainer" containerID="08879a19c3a3c6cc6e95d86ed6b9dc26fff7f52b55604248cd3ca7127a8ffd46" Dec 02 08:52:46 crc kubenswrapper[4895]: I1202 08:52:46.414675 4895 scope.go:117] "RemoveContainer" containerID="97fef6de437e20cf0ea6669d682c19cb3454ac478c99cad77e4cf5f3320ffc8d" Dec 02 08:52:46 crc kubenswrapper[4895]: E1202 08:52:46.415422 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97fef6de437e20cf0ea6669d682c19cb3454ac478c99cad77e4cf5f3320ffc8d\": container with ID starting with 97fef6de437e20cf0ea6669d682c19cb3454ac478c99cad77e4cf5f3320ffc8d not found: ID does not exist" containerID="97fef6de437e20cf0ea6669d682c19cb3454ac478c99cad77e4cf5f3320ffc8d" Dec 02 08:52:46 crc kubenswrapper[4895]: I1202 08:52:46.415463 
4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97fef6de437e20cf0ea6669d682c19cb3454ac478c99cad77e4cf5f3320ffc8d"} err="failed to get container status \"97fef6de437e20cf0ea6669d682c19cb3454ac478c99cad77e4cf5f3320ffc8d\": rpc error: code = NotFound desc = could not find container \"97fef6de437e20cf0ea6669d682c19cb3454ac478c99cad77e4cf5f3320ffc8d\": container with ID starting with 97fef6de437e20cf0ea6669d682c19cb3454ac478c99cad77e4cf5f3320ffc8d not found: ID does not exist" Dec 02 08:52:46 crc kubenswrapper[4895]: I1202 08:52:46.415492 4895 scope.go:117] "RemoveContainer" containerID="c04618868fa033c4c220f425bb82b1e8a6c770a16ad7e9e00a8acebcc025d8e6" Dec 02 08:52:46 crc kubenswrapper[4895]: E1202 08:52:46.416525 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c04618868fa033c4c220f425bb82b1e8a6c770a16ad7e9e00a8acebcc025d8e6\": container with ID starting with c04618868fa033c4c220f425bb82b1e8a6c770a16ad7e9e00a8acebcc025d8e6 not found: ID does not exist" containerID="c04618868fa033c4c220f425bb82b1e8a6c770a16ad7e9e00a8acebcc025d8e6" Dec 02 08:52:46 crc kubenswrapper[4895]: I1202 08:52:46.416575 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c04618868fa033c4c220f425bb82b1e8a6c770a16ad7e9e00a8acebcc025d8e6"} err="failed to get container status \"c04618868fa033c4c220f425bb82b1e8a6c770a16ad7e9e00a8acebcc025d8e6\": rpc error: code = NotFound desc = could not find container \"c04618868fa033c4c220f425bb82b1e8a6c770a16ad7e9e00a8acebcc025d8e6\": container with ID starting with c04618868fa033c4c220f425bb82b1e8a6c770a16ad7e9e00a8acebcc025d8e6 not found: ID does not exist" Dec 02 08:52:46 crc kubenswrapper[4895]: I1202 08:52:46.416650 4895 scope.go:117] "RemoveContainer" containerID="08879a19c3a3c6cc6e95d86ed6b9dc26fff7f52b55604248cd3ca7127a8ffd46" Dec 02 08:52:46 crc kubenswrapper[4895]: E1202 
08:52:46.417866 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08879a19c3a3c6cc6e95d86ed6b9dc26fff7f52b55604248cd3ca7127a8ffd46\": container with ID starting with 08879a19c3a3c6cc6e95d86ed6b9dc26fff7f52b55604248cd3ca7127a8ffd46 not found: ID does not exist" containerID="08879a19c3a3c6cc6e95d86ed6b9dc26fff7f52b55604248cd3ca7127a8ffd46" Dec 02 08:52:46 crc kubenswrapper[4895]: I1202 08:52:46.418333 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08879a19c3a3c6cc6e95d86ed6b9dc26fff7f52b55604248cd3ca7127a8ffd46"} err="failed to get container status \"08879a19c3a3c6cc6e95d86ed6b9dc26fff7f52b55604248cd3ca7127a8ffd46\": rpc error: code = NotFound desc = could not find container \"08879a19c3a3c6cc6e95d86ed6b9dc26fff7f52b55604248cd3ca7127a8ffd46\": container with ID starting with 08879a19c3a3c6cc6e95d86ed6b9dc26fff7f52b55604248cd3ca7127a8ffd46 not found: ID does not exist" Dec 02 08:52:47 crc kubenswrapper[4895]: I1202 08:52:47.156700 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75eed9f8-e31e-4f7c-bc83-3384d1b874bd" path="/var/lib/kubelet/pods/75eed9f8-e31e-4f7c-bc83-3384d1b874bd/volumes" Dec 02 08:53:07 crc kubenswrapper[4895]: I1202 08:53:07.893770 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-h4zfb"] Dec 02 08:53:07 crc kubenswrapper[4895]: E1202 08:53:07.895278 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75eed9f8-e31e-4f7c-bc83-3384d1b874bd" containerName="extract-content" Dec 02 08:53:07 crc kubenswrapper[4895]: I1202 08:53:07.895302 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="75eed9f8-e31e-4f7c-bc83-3384d1b874bd" containerName="extract-content" Dec 02 08:53:07 crc kubenswrapper[4895]: E1202 08:53:07.895345 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75eed9f8-e31e-4f7c-bc83-3384d1b874bd" 
containerName="extract-utilities" Dec 02 08:53:07 crc kubenswrapper[4895]: I1202 08:53:07.895356 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="75eed9f8-e31e-4f7c-bc83-3384d1b874bd" containerName="extract-utilities" Dec 02 08:53:07 crc kubenswrapper[4895]: E1202 08:53:07.895377 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75eed9f8-e31e-4f7c-bc83-3384d1b874bd" containerName="registry-server" Dec 02 08:53:07 crc kubenswrapper[4895]: I1202 08:53:07.895385 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="75eed9f8-e31e-4f7c-bc83-3384d1b874bd" containerName="registry-server" Dec 02 08:53:07 crc kubenswrapper[4895]: I1202 08:53:07.895614 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="75eed9f8-e31e-4f7c-bc83-3384d1b874bd" containerName="registry-server" Dec 02 08:53:07 crc kubenswrapper[4895]: I1202 08:53:07.897790 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h4zfb" Dec 02 08:53:07 crc kubenswrapper[4895]: I1202 08:53:07.913942 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-h4zfb"] Dec 02 08:53:07 crc kubenswrapper[4895]: I1202 08:53:07.940632 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjn7s\" (UniqueName: \"kubernetes.io/projected/32090237-b356-4a1b-b728-4d20f018294d-kube-api-access-jjn7s\") pod \"redhat-marketplace-h4zfb\" (UID: \"32090237-b356-4a1b-b728-4d20f018294d\") " pod="openshift-marketplace/redhat-marketplace-h4zfb" Dec 02 08:53:07 crc kubenswrapper[4895]: I1202 08:53:07.940748 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/32090237-b356-4a1b-b728-4d20f018294d-catalog-content\") pod \"redhat-marketplace-h4zfb\" (UID: \"32090237-b356-4a1b-b728-4d20f018294d\") " 
pod="openshift-marketplace/redhat-marketplace-h4zfb" Dec 02 08:53:07 crc kubenswrapper[4895]: I1202 08:53:07.940791 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/32090237-b356-4a1b-b728-4d20f018294d-utilities\") pod \"redhat-marketplace-h4zfb\" (UID: \"32090237-b356-4a1b-b728-4d20f018294d\") " pod="openshift-marketplace/redhat-marketplace-h4zfb" Dec 02 08:53:08 crc kubenswrapper[4895]: I1202 08:53:08.042192 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjn7s\" (UniqueName: \"kubernetes.io/projected/32090237-b356-4a1b-b728-4d20f018294d-kube-api-access-jjn7s\") pod \"redhat-marketplace-h4zfb\" (UID: \"32090237-b356-4a1b-b728-4d20f018294d\") " pod="openshift-marketplace/redhat-marketplace-h4zfb" Dec 02 08:53:08 crc kubenswrapper[4895]: I1202 08:53:08.042322 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/32090237-b356-4a1b-b728-4d20f018294d-catalog-content\") pod \"redhat-marketplace-h4zfb\" (UID: \"32090237-b356-4a1b-b728-4d20f018294d\") " pod="openshift-marketplace/redhat-marketplace-h4zfb" Dec 02 08:53:08 crc kubenswrapper[4895]: I1202 08:53:08.042371 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/32090237-b356-4a1b-b728-4d20f018294d-utilities\") pod \"redhat-marketplace-h4zfb\" (UID: \"32090237-b356-4a1b-b728-4d20f018294d\") " pod="openshift-marketplace/redhat-marketplace-h4zfb" Dec 02 08:53:08 crc kubenswrapper[4895]: I1202 08:53:08.043037 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/32090237-b356-4a1b-b728-4d20f018294d-catalog-content\") pod \"redhat-marketplace-h4zfb\" (UID: \"32090237-b356-4a1b-b728-4d20f018294d\") " 
pod="openshift-marketplace/redhat-marketplace-h4zfb" Dec 02 08:53:08 crc kubenswrapper[4895]: I1202 08:53:08.043142 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/32090237-b356-4a1b-b728-4d20f018294d-utilities\") pod \"redhat-marketplace-h4zfb\" (UID: \"32090237-b356-4a1b-b728-4d20f018294d\") " pod="openshift-marketplace/redhat-marketplace-h4zfb" Dec 02 08:53:08 crc kubenswrapper[4895]: I1202 08:53:08.069112 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjn7s\" (UniqueName: \"kubernetes.io/projected/32090237-b356-4a1b-b728-4d20f018294d-kube-api-access-jjn7s\") pod \"redhat-marketplace-h4zfb\" (UID: \"32090237-b356-4a1b-b728-4d20f018294d\") " pod="openshift-marketplace/redhat-marketplace-h4zfb" Dec 02 08:53:08 crc kubenswrapper[4895]: I1202 08:53:08.245012 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h4zfb" Dec 02 08:53:08 crc kubenswrapper[4895]: I1202 08:53:08.759962 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-h4zfb"] Dec 02 08:53:09 crc kubenswrapper[4895]: I1202 08:53:09.546316 4895 generic.go:334] "Generic (PLEG): container finished" podID="32090237-b356-4a1b-b728-4d20f018294d" containerID="71ef8690b1f0419bc5e87d5b6bb318d6fd484df7a212b32cb26da49c3788f354" exitCode=0 Dec 02 08:53:09 crc kubenswrapper[4895]: I1202 08:53:09.546706 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h4zfb" event={"ID":"32090237-b356-4a1b-b728-4d20f018294d","Type":"ContainerDied","Data":"71ef8690b1f0419bc5e87d5b6bb318d6fd484df7a212b32cb26da49c3788f354"} Dec 02 08:53:09 crc kubenswrapper[4895]: I1202 08:53:09.546767 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h4zfb" 
event={"ID":"32090237-b356-4a1b-b728-4d20f018294d","Type":"ContainerStarted","Data":"955b1e46f5a44cc590e0324da162e4caefe0e71e9719737066b1e80cdd2d75a9"} Dec 02 08:53:11 crc kubenswrapper[4895]: I1202 08:53:11.567449 4895 generic.go:334] "Generic (PLEG): container finished" podID="32090237-b356-4a1b-b728-4d20f018294d" containerID="5f81751daf5f7361f9129cf72dfc27e6a05d8d8eaaf0385389a1576eaddb352a" exitCode=0 Dec 02 08:53:11 crc kubenswrapper[4895]: I1202 08:53:11.567574 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h4zfb" event={"ID":"32090237-b356-4a1b-b728-4d20f018294d","Type":"ContainerDied","Data":"5f81751daf5f7361f9129cf72dfc27e6a05d8d8eaaf0385389a1576eaddb352a"} Dec 02 08:53:12 crc kubenswrapper[4895]: I1202 08:53:12.622199 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h4zfb" event={"ID":"32090237-b356-4a1b-b728-4d20f018294d","Type":"ContainerStarted","Data":"0de0840015dd3565d93b90c1b116b0a19302033b22d792394c313aaf3606f749"} Dec 02 08:53:12 crc kubenswrapper[4895]: I1202 08:53:12.647799 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-h4zfb" podStartSLOduration=3.17118136 podStartE2EDuration="5.647715536s" podCreationTimestamp="2025-12-02 08:53:07 +0000 UTC" firstStartedPulling="2025-12-02 08:53:09.549308135 +0000 UTC m=+5400.720167758" lastFinishedPulling="2025-12-02 08:53:12.025842321 +0000 UTC m=+5403.196701934" observedRunningTime="2025-12-02 08:53:12.641920877 +0000 UTC m=+5403.812780490" watchObservedRunningTime="2025-12-02 08:53:12.647715536 +0000 UTC m=+5403.818575139" Dec 02 08:53:18 crc kubenswrapper[4895]: I1202 08:53:18.245360 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-h4zfb" Dec 02 08:53:18 crc kubenswrapper[4895]: I1202 08:53:18.247529 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/redhat-marketplace-h4zfb" Dec 02 08:53:18 crc kubenswrapper[4895]: I1202 08:53:18.307432 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-h4zfb" Dec 02 08:53:18 crc kubenswrapper[4895]: I1202 08:53:18.727840 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-h4zfb" Dec 02 08:53:18 crc kubenswrapper[4895]: I1202 08:53:18.782756 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-h4zfb"] Dec 02 08:53:20 crc kubenswrapper[4895]: I1202 08:53:20.690136 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-h4zfb" podUID="32090237-b356-4a1b-b728-4d20f018294d" containerName="registry-server" containerID="cri-o://0de0840015dd3565d93b90c1b116b0a19302033b22d792394c313aaf3606f749" gracePeriod=2 Dec 02 08:53:21 crc kubenswrapper[4895]: I1202 08:53:21.235733 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h4zfb" Dec 02 08:53:21 crc kubenswrapper[4895]: I1202 08:53:21.337103 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jjn7s\" (UniqueName: \"kubernetes.io/projected/32090237-b356-4a1b-b728-4d20f018294d-kube-api-access-jjn7s\") pod \"32090237-b356-4a1b-b728-4d20f018294d\" (UID: \"32090237-b356-4a1b-b728-4d20f018294d\") " Dec 02 08:53:21 crc kubenswrapper[4895]: I1202 08:53:21.337833 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/32090237-b356-4a1b-b728-4d20f018294d-utilities\") pod \"32090237-b356-4a1b-b728-4d20f018294d\" (UID: \"32090237-b356-4a1b-b728-4d20f018294d\") " Dec 02 08:53:21 crc kubenswrapper[4895]: I1202 08:53:21.338206 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/32090237-b356-4a1b-b728-4d20f018294d-catalog-content\") pod \"32090237-b356-4a1b-b728-4d20f018294d\" (UID: \"32090237-b356-4a1b-b728-4d20f018294d\") " Dec 02 08:53:21 crc kubenswrapper[4895]: I1202 08:53:21.339634 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/32090237-b356-4a1b-b728-4d20f018294d-utilities" (OuterVolumeSpecName: "utilities") pod "32090237-b356-4a1b-b728-4d20f018294d" (UID: "32090237-b356-4a1b-b728-4d20f018294d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:53:21 crc kubenswrapper[4895]: I1202 08:53:21.344013 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32090237-b356-4a1b-b728-4d20f018294d-kube-api-access-jjn7s" (OuterVolumeSpecName: "kube-api-access-jjn7s") pod "32090237-b356-4a1b-b728-4d20f018294d" (UID: "32090237-b356-4a1b-b728-4d20f018294d"). InnerVolumeSpecName "kube-api-access-jjn7s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:53:21 crc kubenswrapper[4895]: I1202 08:53:21.359332 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/32090237-b356-4a1b-b728-4d20f018294d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "32090237-b356-4a1b-b728-4d20f018294d" (UID: "32090237-b356-4a1b-b728-4d20f018294d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:53:21 crc kubenswrapper[4895]: I1202 08:53:21.440315 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jjn7s\" (UniqueName: \"kubernetes.io/projected/32090237-b356-4a1b-b728-4d20f018294d-kube-api-access-jjn7s\") on node \"crc\" DevicePath \"\"" Dec 02 08:53:21 crc kubenswrapper[4895]: I1202 08:53:21.440363 4895 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/32090237-b356-4a1b-b728-4d20f018294d-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 08:53:21 crc kubenswrapper[4895]: I1202 08:53:21.440373 4895 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/32090237-b356-4a1b-b728-4d20f018294d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 08:53:21 crc kubenswrapper[4895]: I1202 08:53:21.702630 4895 generic.go:334] "Generic (PLEG): container finished" podID="32090237-b356-4a1b-b728-4d20f018294d" containerID="0de0840015dd3565d93b90c1b116b0a19302033b22d792394c313aaf3606f749" exitCode=0 Dec 02 08:53:21 crc kubenswrapper[4895]: I1202 08:53:21.702713 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h4zfb" event={"ID":"32090237-b356-4a1b-b728-4d20f018294d","Type":"ContainerDied","Data":"0de0840015dd3565d93b90c1b116b0a19302033b22d792394c313aaf3606f749"} Dec 02 08:53:21 crc kubenswrapper[4895]: I1202 08:53:21.702781 4895 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-h4zfb" event={"ID":"32090237-b356-4a1b-b728-4d20f018294d","Type":"ContainerDied","Data":"955b1e46f5a44cc590e0324da162e4caefe0e71e9719737066b1e80cdd2d75a9"} Dec 02 08:53:21 crc kubenswrapper[4895]: I1202 08:53:21.702812 4895 scope.go:117] "RemoveContainer" containerID="0de0840015dd3565d93b90c1b116b0a19302033b22d792394c313aaf3606f749" Dec 02 08:53:21 crc kubenswrapper[4895]: I1202 08:53:21.703057 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h4zfb" Dec 02 08:53:21 crc kubenswrapper[4895]: I1202 08:53:21.737970 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-h4zfb"] Dec 02 08:53:21 crc kubenswrapper[4895]: I1202 08:53:21.743491 4895 scope.go:117] "RemoveContainer" containerID="5f81751daf5f7361f9129cf72dfc27e6a05d8d8eaaf0385389a1576eaddb352a" Dec 02 08:53:21 crc kubenswrapper[4895]: I1202 08:53:21.751388 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-h4zfb"] Dec 02 08:53:21 crc kubenswrapper[4895]: I1202 08:53:21.769831 4895 scope.go:117] "RemoveContainer" containerID="71ef8690b1f0419bc5e87d5b6bb318d6fd484df7a212b32cb26da49c3788f354" Dec 02 08:53:21 crc kubenswrapper[4895]: I1202 08:53:21.812641 4895 scope.go:117] "RemoveContainer" containerID="0de0840015dd3565d93b90c1b116b0a19302033b22d792394c313aaf3606f749" Dec 02 08:53:21 crc kubenswrapper[4895]: E1202 08:53:21.813572 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0de0840015dd3565d93b90c1b116b0a19302033b22d792394c313aaf3606f749\": container with ID starting with 0de0840015dd3565d93b90c1b116b0a19302033b22d792394c313aaf3606f749 not found: ID does not exist" containerID="0de0840015dd3565d93b90c1b116b0a19302033b22d792394c313aaf3606f749" Dec 02 08:53:21 crc kubenswrapper[4895]: I1202 08:53:21.813609 4895 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0de0840015dd3565d93b90c1b116b0a19302033b22d792394c313aaf3606f749"} err="failed to get container status \"0de0840015dd3565d93b90c1b116b0a19302033b22d792394c313aaf3606f749\": rpc error: code = NotFound desc = could not find container \"0de0840015dd3565d93b90c1b116b0a19302033b22d792394c313aaf3606f749\": container with ID starting with 0de0840015dd3565d93b90c1b116b0a19302033b22d792394c313aaf3606f749 not found: ID does not exist" Dec 02 08:53:21 crc kubenswrapper[4895]: I1202 08:53:21.813634 4895 scope.go:117] "RemoveContainer" containerID="5f81751daf5f7361f9129cf72dfc27e6a05d8d8eaaf0385389a1576eaddb352a" Dec 02 08:53:21 crc kubenswrapper[4895]: E1202 08:53:21.814130 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f81751daf5f7361f9129cf72dfc27e6a05d8d8eaaf0385389a1576eaddb352a\": container with ID starting with 5f81751daf5f7361f9129cf72dfc27e6a05d8d8eaaf0385389a1576eaddb352a not found: ID does not exist" containerID="5f81751daf5f7361f9129cf72dfc27e6a05d8d8eaaf0385389a1576eaddb352a" Dec 02 08:53:21 crc kubenswrapper[4895]: I1202 08:53:21.814193 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f81751daf5f7361f9129cf72dfc27e6a05d8d8eaaf0385389a1576eaddb352a"} err="failed to get container status \"5f81751daf5f7361f9129cf72dfc27e6a05d8d8eaaf0385389a1576eaddb352a\": rpc error: code = NotFound desc = could not find container \"5f81751daf5f7361f9129cf72dfc27e6a05d8d8eaaf0385389a1576eaddb352a\": container with ID starting with 5f81751daf5f7361f9129cf72dfc27e6a05d8d8eaaf0385389a1576eaddb352a not found: ID does not exist" Dec 02 08:53:21 crc kubenswrapper[4895]: I1202 08:53:21.814225 4895 scope.go:117] "RemoveContainer" containerID="71ef8690b1f0419bc5e87d5b6bb318d6fd484df7a212b32cb26da49c3788f354" Dec 02 08:53:21 crc kubenswrapper[4895]: E1202 
08:53:21.814619 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71ef8690b1f0419bc5e87d5b6bb318d6fd484df7a212b32cb26da49c3788f354\": container with ID starting with 71ef8690b1f0419bc5e87d5b6bb318d6fd484df7a212b32cb26da49c3788f354 not found: ID does not exist" containerID="71ef8690b1f0419bc5e87d5b6bb318d6fd484df7a212b32cb26da49c3788f354" Dec 02 08:53:21 crc kubenswrapper[4895]: I1202 08:53:21.814646 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71ef8690b1f0419bc5e87d5b6bb318d6fd484df7a212b32cb26da49c3788f354"} err="failed to get container status \"71ef8690b1f0419bc5e87d5b6bb318d6fd484df7a212b32cb26da49c3788f354\": rpc error: code = NotFound desc = could not find container \"71ef8690b1f0419bc5e87d5b6bb318d6fd484df7a212b32cb26da49c3788f354\": container with ID starting with 71ef8690b1f0419bc5e87d5b6bb318d6fd484df7a212b32cb26da49c3788f354 not found: ID does not exist" Dec 02 08:53:23 crc kubenswrapper[4895]: I1202 08:53:23.167929 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32090237-b356-4a1b-b728-4d20f018294d" path="/var/lib/kubelet/pods/32090237-b356-4a1b-b728-4d20f018294d/volumes" Dec 02 08:53:56 crc kubenswrapper[4895]: I1202 08:53:56.345105 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-snb7q"] Dec 02 08:53:56 crc kubenswrapper[4895]: E1202 08:53:56.346265 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32090237-b356-4a1b-b728-4d20f018294d" containerName="extract-content" Dec 02 08:53:56 crc kubenswrapper[4895]: I1202 08:53:56.346282 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="32090237-b356-4a1b-b728-4d20f018294d" containerName="extract-content" Dec 02 08:53:56 crc kubenswrapper[4895]: E1202 08:53:56.346317 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32090237-b356-4a1b-b728-4d20f018294d" 
containerName="registry-server" Dec 02 08:53:56 crc kubenswrapper[4895]: I1202 08:53:56.346324 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="32090237-b356-4a1b-b728-4d20f018294d" containerName="registry-server" Dec 02 08:53:56 crc kubenswrapper[4895]: E1202 08:53:56.346335 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32090237-b356-4a1b-b728-4d20f018294d" containerName="extract-utilities" Dec 02 08:53:56 crc kubenswrapper[4895]: I1202 08:53:56.346341 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="32090237-b356-4a1b-b728-4d20f018294d" containerName="extract-utilities" Dec 02 08:53:56 crc kubenswrapper[4895]: I1202 08:53:56.346508 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="32090237-b356-4a1b-b728-4d20f018294d" containerName="registry-server" Dec 02 08:53:56 crc kubenswrapper[4895]: I1202 08:53:56.347312 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-snb7q" Dec 02 08:53:56 crc kubenswrapper[4895]: I1202 08:53:56.359834 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-345f-account-create-update-qpz2w"] Dec 02 08:53:56 crc kubenswrapper[4895]: I1202 08:53:56.361547 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-345f-account-create-update-qpz2w" Dec 02 08:53:56 crc kubenswrapper[4895]: I1202 08:53:56.365567 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Dec 02 08:53:56 crc kubenswrapper[4895]: I1202 08:53:56.378251 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-snb7q"] Dec 02 08:53:56 crc kubenswrapper[4895]: I1202 08:53:56.386075 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-345f-account-create-update-qpz2w"] Dec 02 08:53:56 crc kubenswrapper[4895]: I1202 08:53:56.432649 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c20cf098-8a77-4677-959a-9264e799bb6a-operator-scripts\") pod \"barbican-345f-account-create-update-qpz2w\" (UID: \"c20cf098-8a77-4677-959a-9264e799bb6a\") " pod="openstack/barbican-345f-account-create-update-qpz2w" Dec 02 08:53:56 crc kubenswrapper[4895]: I1202 08:53:56.432915 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ctcc\" (UniqueName: \"kubernetes.io/projected/5ec9f4cd-d497-4bb1-aea0-bb28977e971d-kube-api-access-9ctcc\") pod \"barbican-db-create-snb7q\" (UID: \"5ec9f4cd-d497-4bb1-aea0-bb28977e971d\") " pod="openstack/barbican-db-create-snb7q" Dec 02 08:53:56 crc kubenswrapper[4895]: I1202 08:53:56.433246 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k49qd\" (UniqueName: \"kubernetes.io/projected/c20cf098-8a77-4677-959a-9264e799bb6a-kube-api-access-k49qd\") pod \"barbican-345f-account-create-update-qpz2w\" (UID: \"c20cf098-8a77-4677-959a-9264e799bb6a\") " pod="openstack/barbican-345f-account-create-update-qpz2w" Dec 02 08:53:56 crc kubenswrapper[4895]: I1202 08:53:56.433406 4895 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5ec9f4cd-d497-4bb1-aea0-bb28977e971d-operator-scripts\") pod \"barbican-db-create-snb7q\" (UID: \"5ec9f4cd-d497-4bb1-aea0-bb28977e971d\") " pod="openstack/barbican-db-create-snb7q" Dec 02 08:53:56 crc kubenswrapper[4895]: I1202 08:53:56.535183 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k49qd\" (UniqueName: \"kubernetes.io/projected/c20cf098-8a77-4677-959a-9264e799bb6a-kube-api-access-k49qd\") pod \"barbican-345f-account-create-update-qpz2w\" (UID: \"c20cf098-8a77-4677-959a-9264e799bb6a\") " pod="openstack/barbican-345f-account-create-update-qpz2w" Dec 02 08:53:56 crc kubenswrapper[4895]: I1202 08:53:56.535689 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5ec9f4cd-d497-4bb1-aea0-bb28977e971d-operator-scripts\") pod \"barbican-db-create-snb7q\" (UID: \"5ec9f4cd-d497-4bb1-aea0-bb28977e971d\") " pod="openstack/barbican-db-create-snb7q" Dec 02 08:53:56 crc kubenswrapper[4895]: I1202 08:53:56.535782 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c20cf098-8a77-4677-959a-9264e799bb6a-operator-scripts\") pod \"barbican-345f-account-create-update-qpz2w\" (UID: \"c20cf098-8a77-4677-959a-9264e799bb6a\") " pod="openstack/barbican-345f-account-create-update-qpz2w" Dec 02 08:53:56 crc kubenswrapper[4895]: I1202 08:53:56.535864 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9ctcc\" (UniqueName: \"kubernetes.io/projected/5ec9f4cd-d497-4bb1-aea0-bb28977e971d-kube-api-access-9ctcc\") pod \"barbican-db-create-snb7q\" (UID: \"5ec9f4cd-d497-4bb1-aea0-bb28977e971d\") " pod="openstack/barbican-db-create-snb7q" Dec 02 08:53:56 crc kubenswrapper[4895]: I1202 
08:53:56.536856 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c20cf098-8a77-4677-959a-9264e799bb6a-operator-scripts\") pod \"barbican-345f-account-create-update-qpz2w\" (UID: \"c20cf098-8a77-4677-959a-9264e799bb6a\") " pod="openstack/barbican-345f-account-create-update-qpz2w" Dec 02 08:53:56 crc kubenswrapper[4895]: I1202 08:53:56.536872 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5ec9f4cd-d497-4bb1-aea0-bb28977e971d-operator-scripts\") pod \"barbican-db-create-snb7q\" (UID: \"5ec9f4cd-d497-4bb1-aea0-bb28977e971d\") " pod="openstack/barbican-db-create-snb7q" Dec 02 08:53:56 crc kubenswrapper[4895]: I1202 08:53:56.558670 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ctcc\" (UniqueName: \"kubernetes.io/projected/5ec9f4cd-d497-4bb1-aea0-bb28977e971d-kube-api-access-9ctcc\") pod \"barbican-db-create-snb7q\" (UID: \"5ec9f4cd-d497-4bb1-aea0-bb28977e971d\") " pod="openstack/barbican-db-create-snb7q" Dec 02 08:53:56 crc kubenswrapper[4895]: I1202 08:53:56.563012 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k49qd\" (UniqueName: \"kubernetes.io/projected/c20cf098-8a77-4677-959a-9264e799bb6a-kube-api-access-k49qd\") pod \"barbican-345f-account-create-update-qpz2w\" (UID: \"c20cf098-8a77-4677-959a-9264e799bb6a\") " pod="openstack/barbican-345f-account-create-update-qpz2w" Dec 02 08:53:56 crc kubenswrapper[4895]: I1202 08:53:56.681212 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-snb7q" Dec 02 08:53:56 crc kubenswrapper[4895]: I1202 08:53:56.690465 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-345f-account-create-update-qpz2w" Dec 02 08:53:57 crc kubenswrapper[4895]: I1202 08:53:57.153319 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-345f-account-create-update-qpz2w"] Dec 02 08:53:57 crc kubenswrapper[4895]: I1202 08:53:57.198196 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-snb7q"] Dec 02 08:53:58 crc kubenswrapper[4895]: I1202 08:53:58.055816 4895 generic.go:334] "Generic (PLEG): container finished" podID="c20cf098-8a77-4677-959a-9264e799bb6a" containerID="f95b461efe7c0e15bbf698e725cf1b5d926c2c7e79046120926734bf8421fe6b" exitCode=0 Dec 02 08:53:58 crc kubenswrapper[4895]: I1202 08:53:58.055920 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-345f-account-create-update-qpz2w" event={"ID":"c20cf098-8a77-4677-959a-9264e799bb6a","Type":"ContainerDied","Data":"f95b461efe7c0e15bbf698e725cf1b5d926c2c7e79046120926734bf8421fe6b"} Dec 02 08:53:58 crc kubenswrapper[4895]: I1202 08:53:58.056258 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-345f-account-create-update-qpz2w" event={"ID":"c20cf098-8a77-4677-959a-9264e799bb6a","Type":"ContainerStarted","Data":"c52a753781fe43f1a55a318cd28498187b9932aabfb8f2fa2ebe3005f782f9ce"} Dec 02 08:53:58 crc kubenswrapper[4895]: I1202 08:53:58.059045 4895 generic.go:334] "Generic (PLEG): container finished" podID="5ec9f4cd-d497-4bb1-aea0-bb28977e971d" containerID="3ee877cee51e74863abd814c7c20c27274c71190aa6fc06006ba34827c0c6ec2" exitCode=0 Dec 02 08:53:58 crc kubenswrapper[4895]: I1202 08:53:58.059103 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-snb7q" event={"ID":"5ec9f4cd-d497-4bb1-aea0-bb28977e971d","Type":"ContainerDied","Data":"3ee877cee51e74863abd814c7c20c27274c71190aa6fc06006ba34827c0c6ec2"} Dec 02 08:53:58 crc kubenswrapper[4895]: I1202 08:53:58.059127 4895 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/barbican-db-create-snb7q" event={"ID":"5ec9f4cd-d497-4bb1-aea0-bb28977e971d","Type":"ContainerStarted","Data":"4b3efc29fbf0125d60b40427aecad2c214156a93ba5a7f2bdca5d3e3e94c6202"} Dec 02 08:53:59 crc kubenswrapper[4895]: I1202 08:53:59.485086 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-345f-account-create-update-qpz2w" Dec 02 08:53:59 crc kubenswrapper[4895]: I1202 08:53:59.492290 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c20cf098-8a77-4677-959a-9264e799bb6a-operator-scripts\") pod \"c20cf098-8a77-4677-959a-9264e799bb6a\" (UID: \"c20cf098-8a77-4677-959a-9264e799bb6a\") " Dec 02 08:53:59 crc kubenswrapper[4895]: I1202 08:53:59.492512 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k49qd\" (UniqueName: \"kubernetes.io/projected/c20cf098-8a77-4677-959a-9264e799bb6a-kube-api-access-k49qd\") pod \"c20cf098-8a77-4677-959a-9264e799bb6a\" (UID: \"c20cf098-8a77-4677-959a-9264e799bb6a\") " Dec 02 08:53:59 crc kubenswrapper[4895]: I1202 08:53:59.493046 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c20cf098-8a77-4677-959a-9264e799bb6a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c20cf098-8a77-4677-959a-9264e799bb6a" (UID: "c20cf098-8a77-4677-959a-9264e799bb6a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:53:59 crc kubenswrapper[4895]: I1202 08:53:59.493452 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-snb7q" Dec 02 08:53:59 crc kubenswrapper[4895]: I1202 08:53:59.502184 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c20cf098-8a77-4677-959a-9264e799bb6a-kube-api-access-k49qd" (OuterVolumeSpecName: "kube-api-access-k49qd") pod "c20cf098-8a77-4677-959a-9264e799bb6a" (UID: "c20cf098-8a77-4677-959a-9264e799bb6a"). InnerVolumeSpecName "kube-api-access-k49qd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:53:59 crc kubenswrapper[4895]: I1202 08:53:59.594367 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5ec9f4cd-d497-4bb1-aea0-bb28977e971d-operator-scripts\") pod \"5ec9f4cd-d497-4bb1-aea0-bb28977e971d\" (UID: \"5ec9f4cd-d497-4bb1-aea0-bb28977e971d\") " Dec 02 08:53:59 crc kubenswrapper[4895]: I1202 08:53:59.594503 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9ctcc\" (UniqueName: \"kubernetes.io/projected/5ec9f4cd-d497-4bb1-aea0-bb28977e971d-kube-api-access-9ctcc\") pod \"5ec9f4cd-d497-4bb1-aea0-bb28977e971d\" (UID: \"5ec9f4cd-d497-4bb1-aea0-bb28977e971d\") " Dec 02 08:53:59 crc kubenswrapper[4895]: I1202 08:53:59.595091 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k49qd\" (UniqueName: \"kubernetes.io/projected/c20cf098-8a77-4677-959a-9264e799bb6a-kube-api-access-k49qd\") on node \"crc\" DevicePath \"\"" Dec 02 08:53:59 crc kubenswrapper[4895]: I1202 08:53:59.595119 4895 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c20cf098-8a77-4677-959a-9264e799bb6a-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 08:53:59 crc kubenswrapper[4895]: I1202 08:53:59.596155 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/5ec9f4cd-d497-4bb1-aea0-bb28977e971d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5ec9f4cd-d497-4bb1-aea0-bb28977e971d" (UID: "5ec9f4cd-d497-4bb1-aea0-bb28977e971d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:53:59 crc kubenswrapper[4895]: I1202 08:53:59.598220 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ec9f4cd-d497-4bb1-aea0-bb28977e971d-kube-api-access-9ctcc" (OuterVolumeSpecName: "kube-api-access-9ctcc") pod "5ec9f4cd-d497-4bb1-aea0-bb28977e971d" (UID: "5ec9f4cd-d497-4bb1-aea0-bb28977e971d"). InnerVolumeSpecName "kube-api-access-9ctcc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:53:59 crc kubenswrapper[4895]: I1202 08:53:59.697347 4895 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5ec9f4cd-d497-4bb1-aea0-bb28977e971d-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 08:53:59 crc kubenswrapper[4895]: I1202 08:53:59.697682 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9ctcc\" (UniqueName: \"kubernetes.io/projected/5ec9f4cd-d497-4bb1-aea0-bb28977e971d-kube-api-access-9ctcc\") on node \"crc\" DevicePath \"\"" Dec 02 08:54:00 crc kubenswrapper[4895]: I1202 08:54:00.086910 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-345f-account-create-update-qpz2w" event={"ID":"c20cf098-8a77-4677-959a-9264e799bb6a","Type":"ContainerDied","Data":"c52a753781fe43f1a55a318cd28498187b9932aabfb8f2fa2ebe3005f782f9ce"} Dec 02 08:54:00 crc kubenswrapper[4895]: I1202 08:54:00.087277 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c52a753781fe43f1a55a318cd28498187b9932aabfb8f2fa2ebe3005f782f9ce" Dec 02 08:54:00 crc kubenswrapper[4895]: I1202 08:54:00.086967 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-345f-account-create-update-qpz2w" Dec 02 08:54:00 crc kubenswrapper[4895]: I1202 08:54:00.090010 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-snb7q" event={"ID":"5ec9f4cd-d497-4bb1-aea0-bb28977e971d","Type":"ContainerDied","Data":"4b3efc29fbf0125d60b40427aecad2c214156a93ba5a7f2bdca5d3e3e94c6202"} Dec 02 08:54:00 crc kubenswrapper[4895]: I1202 08:54:00.090030 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-snb7q" Dec 02 08:54:00 crc kubenswrapper[4895]: I1202 08:54:00.090037 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4b3efc29fbf0125d60b40427aecad2c214156a93ba5a7f2bdca5d3e3e94c6202" Dec 02 08:54:01 crc kubenswrapper[4895]: I1202 08:54:01.620725 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-6j44z"] Dec 02 08:54:01 crc kubenswrapper[4895]: E1202 08:54:01.621256 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ec9f4cd-d497-4bb1-aea0-bb28977e971d" containerName="mariadb-database-create" Dec 02 08:54:01 crc kubenswrapper[4895]: I1202 08:54:01.621273 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ec9f4cd-d497-4bb1-aea0-bb28977e971d" containerName="mariadb-database-create" Dec 02 08:54:01 crc kubenswrapper[4895]: E1202 08:54:01.621314 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c20cf098-8a77-4677-959a-9264e799bb6a" containerName="mariadb-account-create-update" Dec 02 08:54:01 crc kubenswrapper[4895]: I1202 08:54:01.621321 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="c20cf098-8a77-4677-959a-9264e799bb6a" containerName="mariadb-account-create-update" Dec 02 08:54:01 crc kubenswrapper[4895]: I1202 08:54:01.621494 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="c20cf098-8a77-4677-959a-9264e799bb6a" 
containerName="mariadb-account-create-update" Dec 02 08:54:01 crc kubenswrapper[4895]: I1202 08:54:01.621516 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ec9f4cd-d497-4bb1-aea0-bb28977e971d" containerName="mariadb-database-create" Dec 02 08:54:01 crc kubenswrapper[4895]: I1202 08:54:01.622334 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-6j44z" Dec 02 08:54:01 crc kubenswrapper[4895]: I1202 08:54:01.625097 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-q2k6k" Dec 02 08:54:01 crc kubenswrapper[4895]: I1202 08:54:01.625153 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 02 08:54:01 crc kubenswrapper[4895]: I1202 08:54:01.627447 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/11de1ef0-4dea-4745-9d17-8fcc2a89f38c-db-sync-config-data\") pod \"barbican-db-sync-6j44z\" (UID: \"11de1ef0-4dea-4745-9d17-8fcc2a89f38c\") " pod="openstack/barbican-db-sync-6j44z" Dec 02 08:54:01 crc kubenswrapper[4895]: I1202 08:54:01.627675 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnlwz\" (UniqueName: \"kubernetes.io/projected/11de1ef0-4dea-4745-9d17-8fcc2a89f38c-kube-api-access-wnlwz\") pod \"barbican-db-sync-6j44z\" (UID: \"11de1ef0-4dea-4745-9d17-8fcc2a89f38c\") " pod="openstack/barbican-db-sync-6j44z" Dec 02 08:54:01 crc kubenswrapper[4895]: I1202 08:54:01.627895 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11de1ef0-4dea-4745-9d17-8fcc2a89f38c-combined-ca-bundle\") pod \"barbican-db-sync-6j44z\" (UID: \"11de1ef0-4dea-4745-9d17-8fcc2a89f38c\") " pod="openstack/barbican-db-sync-6j44z" Dec 02 
08:54:01 crc kubenswrapper[4895]: I1202 08:54:01.632770 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-6j44z"] Dec 02 08:54:01 crc kubenswrapper[4895]: I1202 08:54:01.730226 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11de1ef0-4dea-4745-9d17-8fcc2a89f38c-combined-ca-bundle\") pod \"barbican-db-sync-6j44z\" (UID: \"11de1ef0-4dea-4745-9d17-8fcc2a89f38c\") " pod="openstack/barbican-db-sync-6j44z" Dec 02 08:54:01 crc kubenswrapper[4895]: I1202 08:54:01.730379 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/11de1ef0-4dea-4745-9d17-8fcc2a89f38c-db-sync-config-data\") pod \"barbican-db-sync-6j44z\" (UID: \"11de1ef0-4dea-4745-9d17-8fcc2a89f38c\") " pod="openstack/barbican-db-sync-6j44z" Dec 02 08:54:01 crc kubenswrapper[4895]: I1202 08:54:01.730410 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wnlwz\" (UniqueName: \"kubernetes.io/projected/11de1ef0-4dea-4745-9d17-8fcc2a89f38c-kube-api-access-wnlwz\") pod \"barbican-db-sync-6j44z\" (UID: \"11de1ef0-4dea-4745-9d17-8fcc2a89f38c\") " pod="openstack/barbican-db-sync-6j44z" Dec 02 08:54:01 crc kubenswrapper[4895]: I1202 08:54:01.738664 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/11de1ef0-4dea-4745-9d17-8fcc2a89f38c-db-sync-config-data\") pod \"barbican-db-sync-6j44z\" (UID: \"11de1ef0-4dea-4745-9d17-8fcc2a89f38c\") " pod="openstack/barbican-db-sync-6j44z" Dec 02 08:54:01 crc kubenswrapper[4895]: I1202 08:54:01.742461 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11de1ef0-4dea-4745-9d17-8fcc2a89f38c-combined-ca-bundle\") pod \"barbican-db-sync-6j44z\" (UID: 
\"11de1ef0-4dea-4745-9d17-8fcc2a89f38c\") " pod="openstack/barbican-db-sync-6j44z" Dec 02 08:54:01 crc kubenswrapper[4895]: I1202 08:54:01.750523 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wnlwz\" (UniqueName: \"kubernetes.io/projected/11de1ef0-4dea-4745-9d17-8fcc2a89f38c-kube-api-access-wnlwz\") pod \"barbican-db-sync-6j44z\" (UID: \"11de1ef0-4dea-4745-9d17-8fcc2a89f38c\") " pod="openstack/barbican-db-sync-6j44z" Dec 02 08:54:01 crc kubenswrapper[4895]: I1202 08:54:01.951477 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-6j44z" Dec 02 08:54:02 crc kubenswrapper[4895]: I1202 08:54:02.206304 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-6j44z"] Dec 02 08:54:02 crc kubenswrapper[4895]: W1202 08:54:02.210426 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod11de1ef0_4dea_4745_9d17_8fcc2a89f38c.slice/crio-3f9b7984d11cecf8dd99954be8b232fb7e5400e78a5495c09963891ed9a16767 WatchSource:0}: Error finding container 3f9b7984d11cecf8dd99954be8b232fb7e5400e78a5495c09963891ed9a16767: Status 404 returned error can't find the container with id 3f9b7984d11cecf8dd99954be8b232fb7e5400e78a5495c09963891ed9a16767 Dec 02 08:54:03 crc kubenswrapper[4895]: I1202 08:54:03.119468 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-6j44z" event={"ID":"11de1ef0-4dea-4745-9d17-8fcc2a89f38c","Type":"ContainerStarted","Data":"811d80e26c656246a54363e16a23b46646e3fff64dd86dcd033c54b8f9248b29"} Dec 02 08:54:03 crc kubenswrapper[4895]: I1202 08:54:03.119517 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-6j44z" event={"ID":"11de1ef0-4dea-4745-9d17-8fcc2a89f38c","Type":"ContainerStarted","Data":"3f9b7984d11cecf8dd99954be8b232fb7e5400e78a5495c09963891ed9a16767"} Dec 02 08:54:03 crc 
kubenswrapper[4895]: I1202 08:54:03.137930 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-6j44z" podStartSLOduration=2.137912117 podStartE2EDuration="2.137912117s" podCreationTimestamp="2025-12-02 08:54:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:54:03.137160284 +0000 UTC m=+5454.308019907" watchObservedRunningTime="2025-12-02 08:54:03.137912117 +0000 UTC m=+5454.308771730" Dec 02 08:54:04 crc kubenswrapper[4895]: I1202 08:54:04.129674 4895 generic.go:334] "Generic (PLEG): container finished" podID="11de1ef0-4dea-4745-9d17-8fcc2a89f38c" containerID="811d80e26c656246a54363e16a23b46646e3fff64dd86dcd033c54b8f9248b29" exitCode=0 Dec 02 08:54:04 crc kubenswrapper[4895]: I1202 08:54:04.129808 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-6j44z" event={"ID":"11de1ef0-4dea-4745-9d17-8fcc2a89f38c","Type":"ContainerDied","Data":"811d80e26c656246a54363e16a23b46646e3fff64dd86dcd033c54b8f9248b29"} Dec 02 08:54:05 crc kubenswrapper[4895]: I1202 08:54:05.430220 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-6j44z" Dec 02 08:54:05 crc kubenswrapper[4895]: I1202 08:54:05.473612 4895 patch_prober.go:28] interesting pod/machine-config-daemon-wfcg7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 08:54:05 crc kubenswrapper[4895]: I1202 08:54:05.473678 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 08:54:05 crc kubenswrapper[4895]: I1202 08:54:05.502792 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/11de1ef0-4dea-4745-9d17-8fcc2a89f38c-db-sync-config-data\") pod \"11de1ef0-4dea-4745-9d17-8fcc2a89f38c\" (UID: \"11de1ef0-4dea-4745-9d17-8fcc2a89f38c\") " Dec 02 08:54:05 crc kubenswrapper[4895]: I1202 08:54:05.502871 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11de1ef0-4dea-4745-9d17-8fcc2a89f38c-combined-ca-bundle\") pod \"11de1ef0-4dea-4745-9d17-8fcc2a89f38c\" (UID: \"11de1ef0-4dea-4745-9d17-8fcc2a89f38c\") " Dec 02 08:54:05 crc kubenswrapper[4895]: I1202 08:54:05.502919 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wnlwz\" (UniqueName: \"kubernetes.io/projected/11de1ef0-4dea-4745-9d17-8fcc2a89f38c-kube-api-access-wnlwz\") pod \"11de1ef0-4dea-4745-9d17-8fcc2a89f38c\" (UID: \"11de1ef0-4dea-4745-9d17-8fcc2a89f38c\") " Dec 02 08:54:05 crc kubenswrapper[4895]: I1202 08:54:05.508891 4895 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11de1ef0-4dea-4745-9d17-8fcc2a89f38c-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "11de1ef0-4dea-4745-9d17-8fcc2a89f38c" (UID: "11de1ef0-4dea-4745-9d17-8fcc2a89f38c"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:54:05 crc kubenswrapper[4895]: I1202 08:54:05.508894 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11de1ef0-4dea-4745-9d17-8fcc2a89f38c-kube-api-access-wnlwz" (OuterVolumeSpecName: "kube-api-access-wnlwz") pod "11de1ef0-4dea-4745-9d17-8fcc2a89f38c" (UID: "11de1ef0-4dea-4745-9d17-8fcc2a89f38c"). InnerVolumeSpecName "kube-api-access-wnlwz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:54:05 crc kubenswrapper[4895]: I1202 08:54:05.534433 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11de1ef0-4dea-4745-9d17-8fcc2a89f38c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "11de1ef0-4dea-4745-9d17-8fcc2a89f38c" (UID: "11de1ef0-4dea-4745-9d17-8fcc2a89f38c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:54:05 crc kubenswrapper[4895]: I1202 08:54:05.604866 4895 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/11de1ef0-4dea-4745-9d17-8fcc2a89f38c-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 08:54:05 crc kubenswrapper[4895]: I1202 08:54:05.605257 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11de1ef0-4dea-4745-9d17-8fcc2a89f38c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 08:54:05 crc kubenswrapper[4895]: I1202 08:54:05.605273 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wnlwz\" (UniqueName: \"kubernetes.io/projected/11de1ef0-4dea-4745-9d17-8fcc2a89f38c-kube-api-access-wnlwz\") on node \"crc\" DevicePath \"\"" Dec 02 08:54:06 crc kubenswrapper[4895]: I1202 08:54:06.147586 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-6j44z" event={"ID":"11de1ef0-4dea-4745-9d17-8fcc2a89f38c","Type":"ContainerDied","Data":"3f9b7984d11cecf8dd99954be8b232fb7e5400e78a5495c09963891ed9a16767"} Dec 02 08:54:06 crc kubenswrapper[4895]: I1202 08:54:06.147624 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3f9b7984d11cecf8dd99954be8b232fb7e5400e78a5495c09963891ed9a16767" Dec 02 08:54:06 crc kubenswrapper[4895]: I1202 08:54:06.147675 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-6j44z" Dec 02 08:54:06 crc kubenswrapper[4895]: I1202 08:54:06.395785 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-7cfc66c54c-2fd8f"] Dec 02 08:54:06 crc kubenswrapper[4895]: E1202 08:54:06.396191 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11de1ef0-4dea-4745-9d17-8fcc2a89f38c" containerName="barbican-db-sync" Dec 02 08:54:06 crc kubenswrapper[4895]: I1202 08:54:06.396204 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="11de1ef0-4dea-4745-9d17-8fcc2a89f38c" containerName="barbican-db-sync" Dec 02 08:54:06 crc kubenswrapper[4895]: I1202 08:54:06.396366 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="11de1ef0-4dea-4745-9d17-8fcc2a89f38c" containerName="barbican-db-sync" Dec 02 08:54:06 crc kubenswrapper[4895]: I1202 08:54:06.397288 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-7cfc66c54c-2fd8f" Dec 02 08:54:06 crc kubenswrapper[4895]: I1202 08:54:06.403719 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-q2k6k" Dec 02 08:54:06 crc kubenswrapper[4895]: I1202 08:54:06.405833 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 02 08:54:06 crc kubenswrapper[4895]: I1202 08:54:06.420344 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Dec 02 08:54:06 crc kubenswrapper[4895]: I1202 08:54:06.421361 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3a651bae-3fe4-4805-8db1-ef32665084e8-logs\") pod \"barbican-worker-7cfc66c54c-2fd8f\" (UID: \"3a651bae-3fe4-4805-8db1-ef32665084e8\") " pod="openstack/barbican-worker-7cfc66c54c-2fd8f" Dec 02 08:54:06 crc kubenswrapper[4895]: I1202 08:54:06.421510 4895 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a651bae-3fe4-4805-8db1-ef32665084e8-combined-ca-bundle\") pod \"barbican-worker-7cfc66c54c-2fd8f\" (UID: \"3a651bae-3fe4-4805-8db1-ef32665084e8\") " pod="openstack/barbican-worker-7cfc66c54c-2fd8f" Dec 02 08:54:06 crc kubenswrapper[4895]: I1202 08:54:06.421591 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mj9l\" (UniqueName: \"kubernetes.io/projected/3a651bae-3fe4-4805-8db1-ef32665084e8-kube-api-access-5mj9l\") pod \"barbican-worker-7cfc66c54c-2fd8f\" (UID: \"3a651bae-3fe4-4805-8db1-ef32665084e8\") " pod="openstack/barbican-worker-7cfc66c54c-2fd8f" Dec 02 08:54:06 crc kubenswrapper[4895]: I1202 08:54:06.421618 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3a651bae-3fe4-4805-8db1-ef32665084e8-config-data-custom\") pod \"barbican-worker-7cfc66c54c-2fd8f\" (UID: \"3a651bae-3fe4-4805-8db1-ef32665084e8\") " pod="openstack/barbican-worker-7cfc66c54c-2fd8f" Dec 02 08:54:06 crc kubenswrapper[4895]: I1202 08:54:06.421724 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a651bae-3fe4-4805-8db1-ef32665084e8-config-data\") pod \"barbican-worker-7cfc66c54c-2fd8f\" (UID: \"3a651bae-3fe4-4805-8db1-ef32665084e8\") " pod="openstack/barbican-worker-7cfc66c54c-2fd8f" Dec 02 08:54:06 crc kubenswrapper[4895]: I1202 08:54:06.437697 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-7cfc66c54c-2fd8f"] Dec 02 08:54:06 crc kubenswrapper[4895]: I1202 08:54:06.475947 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-7d7f6798b6-7qknd"] Dec 02 08:54:06 crc kubenswrapper[4895]: 
I1202 08:54:06.478802 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-7d7f6798b6-7qknd" Dec 02 08:54:06 crc kubenswrapper[4895]: I1202 08:54:06.493472 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Dec 02 08:54:06 crc kubenswrapper[4895]: I1202 08:54:06.524701 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5mj9l\" (UniqueName: \"kubernetes.io/projected/3a651bae-3fe4-4805-8db1-ef32665084e8-kube-api-access-5mj9l\") pod \"barbican-worker-7cfc66c54c-2fd8f\" (UID: \"3a651bae-3fe4-4805-8db1-ef32665084e8\") " pod="openstack/barbican-worker-7cfc66c54c-2fd8f" Dec 02 08:54:06 crc kubenswrapper[4895]: I1202 08:54:06.524789 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3a651bae-3fe4-4805-8db1-ef32665084e8-config-data-custom\") pod \"barbican-worker-7cfc66c54c-2fd8f\" (UID: \"3a651bae-3fe4-4805-8db1-ef32665084e8\") " pod="openstack/barbican-worker-7cfc66c54c-2fd8f" Dec 02 08:54:06 crc kubenswrapper[4895]: I1202 08:54:06.524878 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a651bae-3fe4-4805-8db1-ef32665084e8-config-data\") pod \"barbican-worker-7cfc66c54c-2fd8f\" (UID: \"3a651bae-3fe4-4805-8db1-ef32665084e8\") " pod="openstack/barbican-worker-7cfc66c54c-2fd8f" Dec 02 08:54:06 crc kubenswrapper[4895]: I1202 08:54:06.525241 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-7d7f6798b6-7qknd"] Dec 02 08:54:06 crc kubenswrapper[4895]: I1202 08:54:06.525315 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3a651bae-3fe4-4805-8db1-ef32665084e8-logs\") pod \"barbican-worker-7cfc66c54c-2fd8f\" 
(UID: \"3a651bae-3fe4-4805-8db1-ef32665084e8\") " pod="openstack/barbican-worker-7cfc66c54c-2fd8f" Dec 02 08:54:06 crc kubenswrapper[4895]: I1202 08:54:06.525354 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a651bae-3fe4-4805-8db1-ef32665084e8-combined-ca-bundle\") pod \"barbican-worker-7cfc66c54c-2fd8f\" (UID: \"3a651bae-3fe4-4805-8db1-ef32665084e8\") " pod="openstack/barbican-worker-7cfc66c54c-2fd8f" Dec 02 08:54:06 crc kubenswrapper[4895]: I1202 08:54:06.526066 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3a651bae-3fe4-4805-8db1-ef32665084e8-logs\") pod \"barbican-worker-7cfc66c54c-2fd8f\" (UID: \"3a651bae-3fe4-4805-8db1-ef32665084e8\") " pod="openstack/barbican-worker-7cfc66c54c-2fd8f" Dec 02 08:54:06 crc kubenswrapper[4895]: I1202 08:54:06.544172 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a651bae-3fe4-4805-8db1-ef32665084e8-config-data\") pod \"barbican-worker-7cfc66c54c-2fd8f\" (UID: \"3a651bae-3fe4-4805-8db1-ef32665084e8\") " pod="openstack/barbican-worker-7cfc66c54c-2fd8f" Dec 02 08:54:06 crc kubenswrapper[4895]: I1202 08:54:06.544172 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3a651bae-3fe4-4805-8db1-ef32665084e8-config-data-custom\") pod \"barbican-worker-7cfc66c54c-2fd8f\" (UID: \"3a651bae-3fe4-4805-8db1-ef32665084e8\") " pod="openstack/barbican-worker-7cfc66c54c-2fd8f" Dec 02 08:54:06 crc kubenswrapper[4895]: I1202 08:54:06.547296 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a651bae-3fe4-4805-8db1-ef32665084e8-combined-ca-bundle\") pod \"barbican-worker-7cfc66c54c-2fd8f\" (UID: \"3a651bae-3fe4-4805-8db1-ef32665084e8\") " 
pod="openstack/barbican-worker-7cfc66c54c-2fd8f" Dec 02 08:54:06 crc kubenswrapper[4895]: I1202 08:54:06.561601 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mj9l\" (UniqueName: \"kubernetes.io/projected/3a651bae-3fe4-4805-8db1-ef32665084e8-kube-api-access-5mj9l\") pod \"barbican-worker-7cfc66c54c-2fd8f\" (UID: \"3a651bae-3fe4-4805-8db1-ef32665084e8\") " pod="openstack/barbican-worker-7cfc66c54c-2fd8f" Dec 02 08:54:06 crc kubenswrapper[4895]: I1202 08:54:06.576932 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5d8ffddd5c-l4qnp"] Dec 02 08:54:06 crc kubenswrapper[4895]: I1202 08:54:06.578525 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d8ffddd5c-l4qnp" Dec 02 08:54:06 crc kubenswrapper[4895]: I1202 08:54:06.593094 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d8ffddd5c-l4qnp"] Dec 02 08:54:06 crc kubenswrapper[4895]: I1202 08:54:06.615449 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-74489445b6-mqcrt"] Dec 02 08:54:06 crc kubenswrapper[4895]: I1202 08:54:06.617806 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-74489445b6-mqcrt" Dec 02 08:54:06 crc kubenswrapper[4895]: I1202 08:54:06.620537 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Dec 02 08:54:06 crc kubenswrapper[4895]: I1202 08:54:06.628447 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/04b41b51-5490-42b8-9f37-837c9c8a3c2d-config-data-custom\") pod \"barbican-keystone-listener-7d7f6798b6-7qknd\" (UID: \"04b41b51-5490-42b8-9f37-837c9c8a3c2d\") " pod="openstack/barbican-keystone-listener-7d7f6798b6-7qknd" Dec 02 08:54:06 crc kubenswrapper[4895]: I1202 08:54:06.628512 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46cd1ecc-48c3-4e7a-812e-1fd8e7ce3f9c-config\") pod \"dnsmasq-dns-5d8ffddd5c-l4qnp\" (UID: \"46cd1ecc-48c3-4e7a-812e-1fd8e7ce3f9c\") " pod="openstack/dnsmasq-dns-5d8ffddd5c-l4qnp" Dec 02 08:54:06 crc kubenswrapper[4895]: I1202 08:54:06.628582 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b097c65-8c8e-435d-8f97-f717edd603a5-config-data\") pod \"barbican-api-74489445b6-mqcrt\" (UID: \"1b097c65-8c8e-435d-8f97-f717edd603a5\") " pod="openstack/barbican-api-74489445b6-mqcrt" Dec 02 08:54:06 crc kubenswrapper[4895]: I1202 08:54:06.628620 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/46cd1ecc-48c3-4e7a-812e-1fd8e7ce3f9c-ovsdbserver-nb\") pod \"dnsmasq-dns-5d8ffddd5c-l4qnp\" (UID: \"46cd1ecc-48c3-4e7a-812e-1fd8e7ce3f9c\") " pod="openstack/dnsmasq-dns-5d8ffddd5c-l4qnp" Dec 02 08:54:06 crc kubenswrapper[4895]: I1202 08:54:06.633906 4895 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04b41b51-5490-42b8-9f37-837c9c8a3c2d-combined-ca-bundle\") pod \"barbican-keystone-listener-7d7f6798b6-7qknd\" (UID: \"04b41b51-5490-42b8-9f37-837c9c8a3c2d\") " pod="openstack/barbican-keystone-listener-7d7f6798b6-7qknd" Dec 02 08:54:06 crc kubenswrapper[4895]: I1202 08:54:06.634095 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/46cd1ecc-48c3-4e7a-812e-1fd8e7ce3f9c-dns-svc\") pod \"dnsmasq-dns-5d8ffddd5c-l4qnp\" (UID: \"46cd1ecc-48c3-4e7a-812e-1fd8e7ce3f9c\") " pod="openstack/dnsmasq-dns-5d8ffddd5c-l4qnp" Dec 02 08:54:06 crc kubenswrapper[4895]: I1202 08:54:06.634184 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b097c65-8c8e-435d-8f97-f717edd603a5-combined-ca-bundle\") pod \"barbican-api-74489445b6-mqcrt\" (UID: \"1b097c65-8c8e-435d-8f97-f717edd603a5\") " pod="openstack/barbican-api-74489445b6-mqcrt" Dec 02 08:54:06 crc kubenswrapper[4895]: I1202 08:54:06.634224 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6n4wl\" (UniqueName: \"kubernetes.io/projected/04b41b51-5490-42b8-9f37-837c9c8a3c2d-kube-api-access-6n4wl\") pod \"barbican-keystone-listener-7d7f6798b6-7qknd\" (UID: \"04b41b51-5490-42b8-9f37-837c9c8a3c2d\") " pod="openstack/barbican-keystone-listener-7d7f6798b6-7qknd" Dec 02 08:54:06 crc kubenswrapper[4895]: I1202 08:54:06.634257 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/04b41b51-5490-42b8-9f37-837c9c8a3c2d-logs\") pod \"barbican-keystone-listener-7d7f6798b6-7qknd\" (UID: \"04b41b51-5490-42b8-9f37-837c9c8a3c2d\") " 
pod="openstack/barbican-keystone-listener-7d7f6798b6-7qknd" Dec 02 08:54:06 crc kubenswrapper[4895]: I1202 08:54:06.634341 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1b097c65-8c8e-435d-8f97-f717edd603a5-config-data-custom\") pod \"barbican-api-74489445b6-mqcrt\" (UID: \"1b097c65-8c8e-435d-8f97-f717edd603a5\") " pod="openstack/barbican-api-74489445b6-mqcrt" Dec 02 08:54:06 crc kubenswrapper[4895]: I1202 08:54:06.634410 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/46cd1ecc-48c3-4e7a-812e-1fd8e7ce3f9c-ovsdbserver-sb\") pod \"dnsmasq-dns-5d8ffddd5c-l4qnp\" (UID: \"46cd1ecc-48c3-4e7a-812e-1fd8e7ce3f9c\") " pod="openstack/dnsmasq-dns-5d8ffddd5c-l4qnp" Dec 02 08:54:06 crc kubenswrapper[4895]: I1202 08:54:06.634459 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fftj4\" (UniqueName: \"kubernetes.io/projected/1b097c65-8c8e-435d-8f97-f717edd603a5-kube-api-access-fftj4\") pod \"barbican-api-74489445b6-mqcrt\" (UID: \"1b097c65-8c8e-435d-8f97-f717edd603a5\") " pod="openstack/barbican-api-74489445b6-mqcrt" Dec 02 08:54:06 crc kubenswrapper[4895]: I1202 08:54:06.634526 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04b41b51-5490-42b8-9f37-837c9c8a3c2d-config-data\") pod \"barbican-keystone-listener-7d7f6798b6-7qknd\" (UID: \"04b41b51-5490-42b8-9f37-837c9c8a3c2d\") " pod="openstack/barbican-keystone-listener-7d7f6798b6-7qknd" Dec 02 08:54:06 crc kubenswrapper[4895]: I1202 08:54:06.634630 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r955k\" (UniqueName: 
\"kubernetes.io/projected/46cd1ecc-48c3-4e7a-812e-1fd8e7ce3f9c-kube-api-access-r955k\") pod \"dnsmasq-dns-5d8ffddd5c-l4qnp\" (UID: \"46cd1ecc-48c3-4e7a-812e-1fd8e7ce3f9c\") " pod="openstack/dnsmasq-dns-5d8ffddd5c-l4qnp" Dec 02 08:54:06 crc kubenswrapper[4895]: I1202 08:54:06.634764 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1b097c65-8c8e-435d-8f97-f717edd603a5-logs\") pod \"barbican-api-74489445b6-mqcrt\" (UID: \"1b097c65-8c8e-435d-8f97-f717edd603a5\") " pod="openstack/barbican-api-74489445b6-mqcrt" Dec 02 08:54:06 crc kubenswrapper[4895]: I1202 08:54:06.636153 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-74489445b6-mqcrt"] Dec 02 08:54:06 crc kubenswrapper[4895]: I1202 08:54:06.725263 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-7cfc66c54c-2fd8f" Dec 02 08:54:06 crc kubenswrapper[4895]: I1202 08:54:06.751878 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r955k\" (UniqueName: \"kubernetes.io/projected/46cd1ecc-48c3-4e7a-812e-1fd8e7ce3f9c-kube-api-access-r955k\") pod \"dnsmasq-dns-5d8ffddd5c-l4qnp\" (UID: \"46cd1ecc-48c3-4e7a-812e-1fd8e7ce3f9c\") " pod="openstack/dnsmasq-dns-5d8ffddd5c-l4qnp" Dec 02 08:54:06 crc kubenswrapper[4895]: I1202 08:54:06.753278 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1b097c65-8c8e-435d-8f97-f717edd603a5-logs\") pod \"barbican-api-74489445b6-mqcrt\" (UID: \"1b097c65-8c8e-435d-8f97-f717edd603a5\") " pod="openstack/barbican-api-74489445b6-mqcrt" Dec 02 08:54:06 crc kubenswrapper[4895]: I1202 08:54:06.753815 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1b097c65-8c8e-435d-8f97-f717edd603a5-logs\") pod 
\"barbican-api-74489445b6-mqcrt\" (UID: \"1b097c65-8c8e-435d-8f97-f717edd603a5\") " pod="openstack/barbican-api-74489445b6-mqcrt" Dec 02 08:54:06 crc kubenswrapper[4895]: I1202 08:54:06.753957 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/04b41b51-5490-42b8-9f37-837c9c8a3c2d-config-data-custom\") pod \"barbican-keystone-listener-7d7f6798b6-7qknd\" (UID: \"04b41b51-5490-42b8-9f37-837c9c8a3c2d\") " pod="openstack/barbican-keystone-listener-7d7f6798b6-7qknd" Dec 02 08:54:06 crc kubenswrapper[4895]: I1202 08:54:06.753993 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46cd1ecc-48c3-4e7a-812e-1fd8e7ce3f9c-config\") pod \"dnsmasq-dns-5d8ffddd5c-l4qnp\" (UID: \"46cd1ecc-48c3-4e7a-812e-1fd8e7ce3f9c\") " pod="openstack/dnsmasq-dns-5d8ffddd5c-l4qnp" Dec 02 08:54:06 crc kubenswrapper[4895]: I1202 08:54:06.754061 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b097c65-8c8e-435d-8f97-f717edd603a5-config-data\") pod \"barbican-api-74489445b6-mqcrt\" (UID: \"1b097c65-8c8e-435d-8f97-f717edd603a5\") " pod="openstack/barbican-api-74489445b6-mqcrt" Dec 02 08:54:06 crc kubenswrapper[4895]: I1202 08:54:06.754107 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/46cd1ecc-48c3-4e7a-812e-1fd8e7ce3f9c-ovsdbserver-nb\") pod \"dnsmasq-dns-5d8ffddd5c-l4qnp\" (UID: \"46cd1ecc-48c3-4e7a-812e-1fd8e7ce3f9c\") " pod="openstack/dnsmasq-dns-5d8ffddd5c-l4qnp" Dec 02 08:54:06 crc kubenswrapper[4895]: I1202 08:54:06.754143 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04b41b51-5490-42b8-9f37-837c9c8a3c2d-combined-ca-bundle\") pod 
\"barbican-keystone-listener-7d7f6798b6-7qknd\" (UID: \"04b41b51-5490-42b8-9f37-837c9c8a3c2d\") " pod="openstack/barbican-keystone-listener-7d7f6798b6-7qknd" Dec 02 08:54:06 crc kubenswrapper[4895]: I1202 08:54:06.754212 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/46cd1ecc-48c3-4e7a-812e-1fd8e7ce3f9c-dns-svc\") pod \"dnsmasq-dns-5d8ffddd5c-l4qnp\" (UID: \"46cd1ecc-48c3-4e7a-812e-1fd8e7ce3f9c\") " pod="openstack/dnsmasq-dns-5d8ffddd5c-l4qnp" Dec 02 08:54:06 crc kubenswrapper[4895]: I1202 08:54:06.754245 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b097c65-8c8e-435d-8f97-f717edd603a5-combined-ca-bundle\") pod \"barbican-api-74489445b6-mqcrt\" (UID: \"1b097c65-8c8e-435d-8f97-f717edd603a5\") " pod="openstack/barbican-api-74489445b6-mqcrt" Dec 02 08:54:06 crc kubenswrapper[4895]: I1202 08:54:06.754265 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6n4wl\" (UniqueName: \"kubernetes.io/projected/04b41b51-5490-42b8-9f37-837c9c8a3c2d-kube-api-access-6n4wl\") pod \"barbican-keystone-listener-7d7f6798b6-7qknd\" (UID: \"04b41b51-5490-42b8-9f37-837c9c8a3c2d\") " pod="openstack/barbican-keystone-listener-7d7f6798b6-7qknd" Dec 02 08:54:06 crc kubenswrapper[4895]: I1202 08:54:06.754283 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/04b41b51-5490-42b8-9f37-837c9c8a3c2d-logs\") pod \"barbican-keystone-listener-7d7f6798b6-7qknd\" (UID: \"04b41b51-5490-42b8-9f37-837c9c8a3c2d\") " pod="openstack/barbican-keystone-listener-7d7f6798b6-7qknd" Dec 02 08:54:06 crc kubenswrapper[4895]: I1202 08:54:06.754337 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/1b097c65-8c8e-435d-8f97-f717edd603a5-config-data-custom\") pod \"barbican-api-74489445b6-mqcrt\" (UID: \"1b097c65-8c8e-435d-8f97-f717edd603a5\") " pod="openstack/barbican-api-74489445b6-mqcrt" Dec 02 08:54:06 crc kubenswrapper[4895]: I1202 08:54:06.754380 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/46cd1ecc-48c3-4e7a-812e-1fd8e7ce3f9c-ovsdbserver-sb\") pod \"dnsmasq-dns-5d8ffddd5c-l4qnp\" (UID: \"46cd1ecc-48c3-4e7a-812e-1fd8e7ce3f9c\") " pod="openstack/dnsmasq-dns-5d8ffddd5c-l4qnp" Dec 02 08:54:06 crc kubenswrapper[4895]: I1202 08:54:06.754410 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fftj4\" (UniqueName: \"kubernetes.io/projected/1b097c65-8c8e-435d-8f97-f717edd603a5-kube-api-access-fftj4\") pod \"barbican-api-74489445b6-mqcrt\" (UID: \"1b097c65-8c8e-435d-8f97-f717edd603a5\") " pod="openstack/barbican-api-74489445b6-mqcrt" Dec 02 08:54:06 crc kubenswrapper[4895]: I1202 08:54:06.755181 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/04b41b51-5490-42b8-9f37-837c9c8a3c2d-logs\") pod \"barbican-keystone-listener-7d7f6798b6-7qknd\" (UID: \"04b41b51-5490-42b8-9f37-837c9c8a3c2d\") " pod="openstack/barbican-keystone-listener-7d7f6798b6-7qknd" Dec 02 08:54:06 crc kubenswrapper[4895]: I1202 08:54:06.756152 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/46cd1ecc-48c3-4e7a-812e-1fd8e7ce3f9c-dns-svc\") pod \"dnsmasq-dns-5d8ffddd5c-l4qnp\" (UID: \"46cd1ecc-48c3-4e7a-812e-1fd8e7ce3f9c\") " pod="openstack/dnsmasq-dns-5d8ffddd5c-l4qnp" Dec 02 08:54:06 crc kubenswrapper[4895]: I1202 08:54:06.757409 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/46cd1ecc-48c3-4e7a-812e-1fd8e7ce3f9c-config\") pod \"dnsmasq-dns-5d8ffddd5c-l4qnp\" (UID: \"46cd1ecc-48c3-4e7a-812e-1fd8e7ce3f9c\") " pod="openstack/dnsmasq-dns-5d8ffddd5c-l4qnp" Dec 02 08:54:06 crc kubenswrapper[4895]: I1202 08:54:06.757638 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04b41b51-5490-42b8-9f37-837c9c8a3c2d-config-data\") pod \"barbican-keystone-listener-7d7f6798b6-7qknd\" (UID: \"04b41b51-5490-42b8-9f37-837c9c8a3c2d\") " pod="openstack/barbican-keystone-listener-7d7f6798b6-7qknd" Dec 02 08:54:06 crc kubenswrapper[4895]: I1202 08:54:06.758151 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/46cd1ecc-48c3-4e7a-812e-1fd8e7ce3f9c-ovsdbserver-nb\") pod \"dnsmasq-dns-5d8ffddd5c-l4qnp\" (UID: \"46cd1ecc-48c3-4e7a-812e-1fd8e7ce3f9c\") " pod="openstack/dnsmasq-dns-5d8ffddd5c-l4qnp" Dec 02 08:54:06 crc kubenswrapper[4895]: I1202 08:54:06.762253 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/46cd1ecc-48c3-4e7a-812e-1fd8e7ce3f9c-ovsdbserver-sb\") pod \"dnsmasq-dns-5d8ffddd5c-l4qnp\" (UID: \"46cd1ecc-48c3-4e7a-812e-1fd8e7ce3f9c\") " pod="openstack/dnsmasq-dns-5d8ffddd5c-l4qnp" Dec 02 08:54:06 crc kubenswrapper[4895]: I1202 08:54:06.763002 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b097c65-8c8e-435d-8f97-f717edd603a5-config-data\") pod \"barbican-api-74489445b6-mqcrt\" (UID: \"1b097c65-8c8e-435d-8f97-f717edd603a5\") " pod="openstack/barbican-api-74489445b6-mqcrt" Dec 02 08:54:06 crc kubenswrapper[4895]: I1202 08:54:06.764348 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/1b097c65-8c8e-435d-8f97-f717edd603a5-combined-ca-bundle\") pod \"barbican-api-74489445b6-mqcrt\" (UID: \"1b097c65-8c8e-435d-8f97-f717edd603a5\") " pod="openstack/barbican-api-74489445b6-mqcrt" Dec 02 08:54:06 crc kubenswrapper[4895]: I1202 08:54:06.764420 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/04b41b51-5490-42b8-9f37-837c9c8a3c2d-config-data-custom\") pod \"barbican-keystone-listener-7d7f6798b6-7qknd\" (UID: \"04b41b51-5490-42b8-9f37-837c9c8a3c2d\") " pod="openstack/barbican-keystone-listener-7d7f6798b6-7qknd" Dec 02 08:54:06 crc kubenswrapper[4895]: I1202 08:54:06.765629 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1b097c65-8c8e-435d-8f97-f717edd603a5-config-data-custom\") pod \"barbican-api-74489445b6-mqcrt\" (UID: \"1b097c65-8c8e-435d-8f97-f717edd603a5\") " pod="openstack/barbican-api-74489445b6-mqcrt" Dec 02 08:54:06 crc kubenswrapper[4895]: I1202 08:54:06.765667 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04b41b51-5490-42b8-9f37-837c9c8a3c2d-combined-ca-bundle\") pod \"barbican-keystone-listener-7d7f6798b6-7qknd\" (UID: \"04b41b51-5490-42b8-9f37-837c9c8a3c2d\") " pod="openstack/barbican-keystone-listener-7d7f6798b6-7qknd" Dec 02 08:54:06 crc kubenswrapper[4895]: I1202 08:54:06.766850 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04b41b51-5490-42b8-9f37-837c9c8a3c2d-config-data\") pod \"barbican-keystone-listener-7d7f6798b6-7qknd\" (UID: \"04b41b51-5490-42b8-9f37-837c9c8a3c2d\") " pod="openstack/barbican-keystone-listener-7d7f6798b6-7qknd" Dec 02 08:54:06 crc kubenswrapper[4895]: I1202 08:54:06.770007 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r955k\" 
(UniqueName: \"kubernetes.io/projected/46cd1ecc-48c3-4e7a-812e-1fd8e7ce3f9c-kube-api-access-r955k\") pod \"dnsmasq-dns-5d8ffddd5c-l4qnp\" (UID: \"46cd1ecc-48c3-4e7a-812e-1fd8e7ce3f9c\") " pod="openstack/dnsmasq-dns-5d8ffddd5c-l4qnp" Dec 02 08:54:06 crc kubenswrapper[4895]: I1202 08:54:06.780944 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6n4wl\" (UniqueName: \"kubernetes.io/projected/04b41b51-5490-42b8-9f37-837c9c8a3c2d-kube-api-access-6n4wl\") pod \"barbican-keystone-listener-7d7f6798b6-7qknd\" (UID: \"04b41b51-5490-42b8-9f37-837c9c8a3c2d\") " pod="openstack/barbican-keystone-listener-7d7f6798b6-7qknd" Dec 02 08:54:06 crc kubenswrapper[4895]: I1202 08:54:06.782546 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fftj4\" (UniqueName: \"kubernetes.io/projected/1b097c65-8c8e-435d-8f97-f717edd603a5-kube-api-access-fftj4\") pod \"barbican-api-74489445b6-mqcrt\" (UID: \"1b097c65-8c8e-435d-8f97-f717edd603a5\") " pod="openstack/barbican-api-74489445b6-mqcrt" Dec 02 08:54:06 crc kubenswrapper[4895]: I1202 08:54:06.813828 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-7d7f6798b6-7qknd" Dec 02 08:54:06 crc kubenswrapper[4895]: I1202 08:54:06.948969 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d8ffddd5c-l4qnp" Dec 02 08:54:06 crc kubenswrapper[4895]: I1202 08:54:06.962283 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-74489445b6-mqcrt" Dec 02 08:54:07 crc kubenswrapper[4895]: I1202 08:54:07.221347 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-7cfc66c54c-2fd8f"] Dec 02 08:54:07 crc kubenswrapper[4895]: W1202 08:54:07.228064 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3a651bae_3fe4_4805_8db1_ef32665084e8.slice/crio-c8ed23ac417752e7a1ffc53f598f4652c15956aca27021ae9d8a73c6fe01c5de WatchSource:0}: Error finding container c8ed23ac417752e7a1ffc53f598f4652c15956aca27021ae9d8a73c6fe01c5de: Status 404 returned error can't find the container with id c8ed23ac417752e7a1ffc53f598f4652c15956aca27021ae9d8a73c6fe01c5de Dec 02 08:54:07 crc kubenswrapper[4895]: I1202 08:54:07.275551 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d8ffddd5c-l4qnp"] Dec 02 08:54:07 crc kubenswrapper[4895]: W1202 08:54:07.287627 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod46cd1ecc_48c3_4e7a_812e_1fd8e7ce3f9c.slice/crio-b2c7e0539016ee513484d9ee60753167fd438b673b1e0be3f5a1657bfd7e06d4 WatchSource:0}: Error finding container b2c7e0539016ee513484d9ee60753167fd438b673b1e0be3f5a1657bfd7e06d4: Status 404 returned error can't find the container with id b2c7e0539016ee513484d9ee60753167fd438b673b1e0be3f5a1657bfd7e06d4 Dec 02 08:54:07 crc kubenswrapper[4895]: I1202 08:54:07.313332 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-7d7f6798b6-7qknd"] Dec 02 08:54:07 crc kubenswrapper[4895]: W1202 08:54:07.323666 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod04b41b51_5490_42b8_9f37_837c9c8a3c2d.slice/crio-1aaad0546f45128c2ee3d28bc3d3608d5c95e2ca757c2aa61cf1f782a0f5222d WatchSource:0}: Error 
finding container 1aaad0546f45128c2ee3d28bc3d3608d5c95e2ca757c2aa61cf1f782a0f5222d: Status 404 returned error can't find the container with id 1aaad0546f45128c2ee3d28bc3d3608d5c95e2ca757c2aa61cf1f782a0f5222d Dec 02 08:54:07 crc kubenswrapper[4895]: I1202 08:54:07.531528 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-74489445b6-mqcrt"] Dec 02 08:54:07 crc kubenswrapper[4895]: W1202 08:54:07.539254 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1b097c65_8c8e_435d_8f97_f717edd603a5.slice/crio-9a0b63eeb5539b81ce03b3154236c65398dc7fa1ebbf6d439ae0343f5da62afe WatchSource:0}: Error finding container 9a0b63eeb5539b81ce03b3154236c65398dc7fa1ebbf6d439ae0343f5da62afe: Status 404 returned error can't find the container with id 9a0b63eeb5539b81ce03b3154236c65398dc7fa1ebbf6d439ae0343f5da62afe Dec 02 08:54:08 crc kubenswrapper[4895]: I1202 08:54:08.178078 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7d7f6798b6-7qknd" event={"ID":"04b41b51-5490-42b8-9f37-837c9c8a3c2d","Type":"ContainerStarted","Data":"f2ebf14c9ef82275ede630d20f89a7dd4a4039636df24609cb863ac91653cef0"} Dec 02 08:54:08 crc kubenswrapper[4895]: I1202 08:54:08.178467 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7d7f6798b6-7qknd" event={"ID":"04b41b51-5490-42b8-9f37-837c9c8a3c2d","Type":"ContainerStarted","Data":"96af5f7cf65adc1c674ed6b9cfaab2f3b2da37f5a5dc418dbce791dc5b1cb22d"} Dec 02 08:54:08 crc kubenswrapper[4895]: I1202 08:54:08.178484 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7d7f6798b6-7qknd" event={"ID":"04b41b51-5490-42b8-9f37-837c9c8a3c2d","Type":"ContainerStarted","Data":"1aaad0546f45128c2ee3d28bc3d3608d5c95e2ca757c2aa61cf1f782a0f5222d"} Dec 02 08:54:08 crc kubenswrapper[4895]: I1202 08:54:08.181598 4895 generic.go:334] "Generic 
(PLEG): container finished" podID="46cd1ecc-48c3-4e7a-812e-1fd8e7ce3f9c" containerID="49bc79ef3a6e9b4c1f3b026be3b3a6b97e7820855ee514abeb3c2755a7503028" exitCode=0 Dec 02 08:54:08 crc kubenswrapper[4895]: I1202 08:54:08.181716 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d8ffddd5c-l4qnp" event={"ID":"46cd1ecc-48c3-4e7a-812e-1fd8e7ce3f9c","Type":"ContainerDied","Data":"49bc79ef3a6e9b4c1f3b026be3b3a6b97e7820855ee514abeb3c2755a7503028"} Dec 02 08:54:08 crc kubenswrapper[4895]: I1202 08:54:08.181828 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d8ffddd5c-l4qnp" event={"ID":"46cd1ecc-48c3-4e7a-812e-1fd8e7ce3f9c","Type":"ContainerStarted","Data":"b2c7e0539016ee513484d9ee60753167fd438b673b1e0be3f5a1657bfd7e06d4"} Dec 02 08:54:08 crc kubenswrapper[4895]: I1202 08:54:08.189587 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-74489445b6-mqcrt" event={"ID":"1b097c65-8c8e-435d-8f97-f717edd603a5","Type":"ContainerStarted","Data":"15bb7c48c68c7c3074b4c0ab00b26f7290e61ba66e7aed062e25137cb3cf76db"} Dec 02 08:54:08 crc kubenswrapper[4895]: I1202 08:54:08.189640 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-74489445b6-mqcrt" event={"ID":"1b097c65-8c8e-435d-8f97-f717edd603a5","Type":"ContainerStarted","Data":"cbb5dbec39626d65c9b7702e15cb09190f0dd288214aab186cab31c6b3512ed3"} Dec 02 08:54:08 crc kubenswrapper[4895]: I1202 08:54:08.189653 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-74489445b6-mqcrt" event={"ID":"1b097c65-8c8e-435d-8f97-f717edd603a5","Type":"ContainerStarted","Data":"9a0b63eeb5539b81ce03b3154236c65398dc7fa1ebbf6d439ae0343f5da62afe"} Dec 02 08:54:08 crc kubenswrapper[4895]: I1202 08:54:08.190384 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-74489445b6-mqcrt" Dec 02 08:54:08 crc kubenswrapper[4895]: I1202 08:54:08.190418 4895 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-74489445b6-mqcrt" Dec 02 08:54:08 crc kubenswrapper[4895]: I1202 08:54:08.198261 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-7d7f6798b6-7qknd" podStartSLOduration=2.198220965 podStartE2EDuration="2.198220965s" podCreationTimestamp="2025-12-02 08:54:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:54:08.197682058 +0000 UTC m=+5459.368541671" watchObservedRunningTime="2025-12-02 08:54:08.198220965 +0000 UTC m=+5459.369080588" Dec 02 08:54:08 crc kubenswrapper[4895]: I1202 08:54:08.201366 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7cfc66c54c-2fd8f" event={"ID":"3a651bae-3fe4-4805-8db1-ef32665084e8","Type":"ContainerStarted","Data":"a938c34dcdffd77bcb470e1d7679b803d67ad562b5372cc0266a45c3177c20ae"} Dec 02 08:54:08 crc kubenswrapper[4895]: I1202 08:54:08.201431 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7cfc66c54c-2fd8f" event={"ID":"3a651bae-3fe4-4805-8db1-ef32665084e8","Type":"ContainerStarted","Data":"e1a3f96e7276f32f86842948958b3b4d5c38c168d8320daa644734a37482e090"} Dec 02 08:54:08 crc kubenswrapper[4895]: I1202 08:54:08.201444 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7cfc66c54c-2fd8f" event={"ID":"3a651bae-3fe4-4805-8db1-ef32665084e8","Type":"ContainerStarted","Data":"c8ed23ac417752e7a1ffc53f598f4652c15956aca27021ae9d8a73c6fe01c5de"} Dec 02 08:54:08 crc kubenswrapper[4895]: I1202 08:54:08.365490 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-74489445b6-mqcrt" podStartSLOduration=2.365468468 podStartE2EDuration="2.365468468s" podCreationTimestamp="2025-12-02 08:54:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:54:08.362222506 +0000 UTC m=+5459.533082129" watchObservedRunningTime="2025-12-02 08:54:08.365468468 +0000 UTC m=+5459.536328081" Dec 02 08:54:08 crc kubenswrapper[4895]: I1202 08:54:08.386177 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-7cfc66c54c-2fd8f" podStartSLOduration=2.386148141 podStartE2EDuration="2.386148141s" podCreationTimestamp="2025-12-02 08:54:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:54:08.380257108 +0000 UTC m=+5459.551116731" watchObservedRunningTime="2025-12-02 08:54:08.386148141 +0000 UTC m=+5459.557007754" Dec 02 08:54:09 crc kubenswrapper[4895]: I1202 08:54:09.214348 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d8ffddd5c-l4qnp" event={"ID":"46cd1ecc-48c3-4e7a-812e-1fd8e7ce3f9c","Type":"ContainerStarted","Data":"30ae8fb05bff460fc03435118cb2277ee572a00d13210037531c1b51af875ef6"} Dec 02 08:54:09 crc kubenswrapper[4895]: I1202 08:54:09.238938 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5d8ffddd5c-l4qnp" podStartSLOduration=3.238918331 podStartE2EDuration="3.238918331s" podCreationTimestamp="2025-12-02 08:54:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:54:09.23566619 +0000 UTC m=+5460.406525803" watchObservedRunningTime="2025-12-02 08:54:09.238918331 +0000 UTC m=+5460.409777944" Dec 02 08:54:10 crc kubenswrapper[4895]: I1202 08:54:10.225098 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5d8ffddd5c-l4qnp" Dec 02 08:54:16 crc kubenswrapper[4895]: I1202 08:54:16.951008 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/dnsmasq-dns-5d8ffddd5c-l4qnp" Dec 02 08:54:17 crc kubenswrapper[4895]: I1202 08:54:17.012126 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-597fd75467-lvrkg"] Dec 02 08:54:17 crc kubenswrapper[4895]: I1202 08:54:17.012433 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-597fd75467-lvrkg" podUID="45adac47-56b1-42f9-82d3-65341c6446a5" containerName="dnsmasq-dns" containerID="cri-o://044ac01be679ca0f730a7d3890c4b5c48e0a04c2d42e95fd625b0922ed03c99e" gracePeriod=10 Dec 02 08:54:17 crc kubenswrapper[4895]: I1202 08:54:17.313944 4895 generic.go:334] "Generic (PLEG): container finished" podID="45adac47-56b1-42f9-82d3-65341c6446a5" containerID="044ac01be679ca0f730a7d3890c4b5c48e0a04c2d42e95fd625b0922ed03c99e" exitCode=0 Dec 02 08:54:17 crc kubenswrapper[4895]: I1202 08:54:17.314348 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-597fd75467-lvrkg" event={"ID":"45adac47-56b1-42f9-82d3-65341c6446a5","Type":"ContainerDied","Data":"044ac01be679ca0f730a7d3890c4b5c48e0a04c2d42e95fd625b0922ed03c99e"} Dec 02 08:54:17 crc kubenswrapper[4895]: I1202 08:54:17.487413 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-597fd75467-lvrkg" Dec 02 08:54:17 crc kubenswrapper[4895]: I1202 08:54:17.624483 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45adac47-56b1-42f9-82d3-65341c6446a5-config\") pod \"45adac47-56b1-42f9-82d3-65341c6446a5\" (UID: \"45adac47-56b1-42f9-82d3-65341c6446a5\") " Dec 02 08:54:17 crc kubenswrapper[4895]: I1202 08:54:17.624595 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s64r4\" (UniqueName: \"kubernetes.io/projected/45adac47-56b1-42f9-82d3-65341c6446a5-kube-api-access-s64r4\") pod \"45adac47-56b1-42f9-82d3-65341c6446a5\" (UID: \"45adac47-56b1-42f9-82d3-65341c6446a5\") " Dec 02 08:54:17 crc kubenswrapper[4895]: I1202 08:54:17.624662 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/45adac47-56b1-42f9-82d3-65341c6446a5-ovsdbserver-nb\") pod \"45adac47-56b1-42f9-82d3-65341c6446a5\" (UID: \"45adac47-56b1-42f9-82d3-65341c6446a5\") " Dec 02 08:54:17 crc kubenswrapper[4895]: I1202 08:54:17.624756 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/45adac47-56b1-42f9-82d3-65341c6446a5-dns-svc\") pod \"45adac47-56b1-42f9-82d3-65341c6446a5\" (UID: \"45adac47-56b1-42f9-82d3-65341c6446a5\") " Dec 02 08:54:17 crc kubenswrapper[4895]: I1202 08:54:17.624852 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/45adac47-56b1-42f9-82d3-65341c6446a5-ovsdbserver-sb\") pod \"45adac47-56b1-42f9-82d3-65341c6446a5\" (UID: \"45adac47-56b1-42f9-82d3-65341c6446a5\") " Dec 02 08:54:17 crc kubenswrapper[4895]: I1202 08:54:17.647102 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/45adac47-56b1-42f9-82d3-65341c6446a5-kube-api-access-s64r4" (OuterVolumeSpecName: "kube-api-access-s64r4") pod "45adac47-56b1-42f9-82d3-65341c6446a5" (UID: "45adac47-56b1-42f9-82d3-65341c6446a5"). InnerVolumeSpecName "kube-api-access-s64r4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:54:17 crc kubenswrapper[4895]: I1202 08:54:17.674141 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45adac47-56b1-42f9-82d3-65341c6446a5-config" (OuterVolumeSpecName: "config") pod "45adac47-56b1-42f9-82d3-65341c6446a5" (UID: "45adac47-56b1-42f9-82d3-65341c6446a5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:54:17 crc kubenswrapper[4895]: I1202 08:54:17.677415 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45adac47-56b1-42f9-82d3-65341c6446a5-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "45adac47-56b1-42f9-82d3-65341c6446a5" (UID: "45adac47-56b1-42f9-82d3-65341c6446a5"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:54:17 crc kubenswrapper[4895]: I1202 08:54:17.679540 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45adac47-56b1-42f9-82d3-65341c6446a5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "45adac47-56b1-42f9-82d3-65341c6446a5" (UID: "45adac47-56b1-42f9-82d3-65341c6446a5"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:54:17 crc kubenswrapper[4895]: I1202 08:54:17.684672 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45adac47-56b1-42f9-82d3-65341c6446a5-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "45adac47-56b1-42f9-82d3-65341c6446a5" (UID: "45adac47-56b1-42f9-82d3-65341c6446a5"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:54:17 crc kubenswrapper[4895]: I1202 08:54:17.726692 4895 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/45adac47-56b1-42f9-82d3-65341c6446a5-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 02 08:54:17 crc kubenswrapper[4895]: I1202 08:54:17.726727 4895 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45adac47-56b1-42f9-82d3-65341c6446a5-config\") on node \"crc\" DevicePath \"\"" Dec 02 08:54:17 crc kubenswrapper[4895]: I1202 08:54:17.726755 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s64r4\" (UniqueName: \"kubernetes.io/projected/45adac47-56b1-42f9-82d3-65341c6446a5-kube-api-access-s64r4\") on node \"crc\" DevicePath \"\"" Dec 02 08:54:17 crc kubenswrapper[4895]: I1202 08:54:17.726765 4895 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/45adac47-56b1-42f9-82d3-65341c6446a5-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 02 08:54:17 crc kubenswrapper[4895]: I1202 08:54:17.726774 4895 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/45adac47-56b1-42f9-82d3-65341c6446a5-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 02 08:54:18 crc kubenswrapper[4895]: I1202 08:54:18.326314 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-597fd75467-lvrkg" event={"ID":"45adac47-56b1-42f9-82d3-65341c6446a5","Type":"ContainerDied","Data":"a307a1c60c2072d6603d2cdc538551b7f83152d2f900699299091cea770107a3"} Dec 02 08:54:18 crc kubenswrapper[4895]: I1202 08:54:18.326768 4895 scope.go:117] "RemoveContainer" containerID="044ac01be679ca0f730a7d3890c4b5c48e0a04c2d42e95fd625b0922ed03c99e" Dec 02 08:54:18 crc kubenswrapper[4895]: I1202 08:54:18.326384 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-597fd75467-lvrkg" Dec 02 08:54:18 crc kubenswrapper[4895]: I1202 08:54:18.353500 4895 scope.go:117] "RemoveContainer" containerID="99d316631338e3a7ef1ffcc42362375c1e56430558a3255199b1ceb7ef8eea9d" Dec 02 08:54:18 crc kubenswrapper[4895]: I1202 08:54:18.372391 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-597fd75467-lvrkg"] Dec 02 08:54:18 crc kubenswrapper[4895]: I1202 08:54:18.381122 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-597fd75467-lvrkg"] Dec 02 08:54:18 crc kubenswrapper[4895]: I1202 08:54:18.430335 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-74489445b6-mqcrt" Dec 02 08:54:18 crc kubenswrapper[4895]: I1202 08:54:18.490233 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-74489445b6-mqcrt" Dec 02 08:54:19 crc kubenswrapper[4895]: I1202 08:54:19.155330 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45adac47-56b1-42f9-82d3-65341c6446a5" path="/var/lib/kubelet/pods/45adac47-56b1-42f9-82d3-65341c6446a5/volumes" Dec 02 08:54:22 crc kubenswrapper[4895]: I1202 08:54:22.390109 4895 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-597fd75467-lvrkg" podUID="45adac47-56b1-42f9-82d3-65341c6446a5" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.1.17:5353: i/o timeout" Dec 02 08:54:30 crc kubenswrapper[4895]: I1202 08:54:30.589517 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-qj5cd"] Dec 02 08:54:30 crc kubenswrapper[4895]: E1202 08:54:30.590544 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45adac47-56b1-42f9-82d3-65341c6446a5" containerName="dnsmasq-dns" Dec 02 08:54:30 crc kubenswrapper[4895]: I1202 08:54:30.590568 4895 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="45adac47-56b1-42f9-82d3-65341c6446a5" containerName="dnsmasq-dns" Dec 02 08:54:30 crc kubenswrapper[4895]: E1202 08:54:30.590588 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45adac47-56b1-42f9-82d3-65341c6446a5" containerName="init" Dec 02 08:54:30 crc kubenswrapper[4895]: I1202 08:54:30.590594 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="45adac47-56b1-42f9-82d3-65341c6446a5" containerName="init" Dec 02 08:54:30 crc kubenswrapper[4895]: I1202 08:54:30.590795 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="45adac47-56b1-42f9-82d3-65341c6446a5" containerName="dnsmasq-dns" Dec 02 08:54:30 crc kubenswrapper[4895]: I1202 08:54:30.591431 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-qj5cd" Dec 02 08:54:30 crc kubenswrapper[4895]: I1202 08:54:30.599418 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-qj5cd"] Dec 02 08:54:30 crc kubenswrapper[4895]: I1202 08:54:30.629967 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6clsn\" (UniqueName: \"kubernetes.io/projected/f1c10dc0-d131-46da-b015-2f1dc1843723-kube-api-access-6clsn\") pod \"neutron-db-create-qj5cd\" (UID: \"f1c10dc0-d131-46da-b015-2f1dc1843723\") " pod="openstack/neutron-db-create-qj5cd" Dec 02 08:54:30 crc kubenswrapper[4895]: I1202 08:54:30.630105 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f1c10dc0-d131-46da-b015-2f1dc1843723-operator-scripts\") pod \"neutron-db-create-qj5cd\" (UID: \"f1c10dc0-d131-46da-b015-2f1dc1843723\") " pod="openstack/neutron-db-create-qj5cd" Dec 02 08:54:30 crc kubenswrapper[4895]: I1202 08:54:30.708037 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-323e-account-create-update-w4q2t"] Dec 02 08:54:30 crc 
kubenswrapper[4895]: I1202 08:54:30.709248 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-323e-account-create-update-w4q2t" Dec 02 08:54:30 crc kubenswrapper[4895]: I1202 08:54:30.712529 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Dec 02 08:54:30 crc kubenswrapper[4895]: I1202 08:54:30.720433 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-323e-account-create-update-w4q2t"] Dec 02 08:54:30 crc kubenswrapper[4895]: I1202 08:54:30.731798 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f1c10dc0-d131-46da-b015-2f1dc1843723-operator-scripts\") pod \"neutron-db-create-qj5cd\" (UID: \"f1c10dc0-d131-46da-b015-2f1dc1843723\") " pod="openstack/neutron-db-create-qj5cd" Dec 02 08:54:30 crc kubenswrapper[4895]: I1202 08:54:30.731945 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6clsn\" (UniqueName: \"kubernetes.io/projected/f1c10dc0-d131-46da-b015-2f1dc1843723-kube-api-access-6clsn\") pod \"neutron-db-create-qj5cd\" (UID: \"f1c10dc0-d131-46da-b015-2f1dc1843723\") " pod="openstack/neutron-db-create-qj5cd" Dec 02 08:54:30 crc kubenswrapper[4895]: I1202 08:54:30.732884 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f1c10dc0-d131-46da-b015-2f1dc1843723-operator-scripts\") pod \"neutron-db-create-qj5cd\" (UID: \"f1c10dc0-d131-46da-b015-2f1dc1843723\") " pod="openstack/neutron-db-create-qj5cd" Dec 02 08:54:30 crc kubenswrapper[4895]: I1202 08:54:30.758333 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6clsn\" (UniqueName: \"kubernetes.io/projected/f1c10dc0-d131-46da-b015-2f1dc1843723-kube-api-access-6clsn\") pod \"neutron-db-create-qj5cd\" (UID: 
\"f1c10dc0-d131-46da-b015-2f1dc1843723\") " pod="openstack/neutron-db-create-qj5cd" Dec 02 08:54:30 crc kubenswrapper[4895]: I1202 08:54:30.833695 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/503a1114-4530-4147-950a-efa451c46545-operator-scripts\") pod \"neutron-323e-account-create-update-w4q2t\" (UID: \"503a1114-4530-4147-950a-efa451c46545\") " pod="openstack/neutron-323e-account-create-update-w4q2t" Dec 02 08:54:30 crc kubenswrapper[4895]: I1202 08:54:30.834425 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mt58w\" (UniqueName: \"kubernetes.io/projected/503a1114-4530-4147-950a-efa451c46545-kube-api-access-mt58w\") pod \"neutron-323e-account-create-update-w4q2t\" (UID: \"503a1114-4530-4147-950a-efa451c46545\") " pod="openstack/neutron-323e-account-create-update-w4q2t" Dec 02 08:54:30 crc kubenswrapper[4895]: I1202 08:54:30.936558 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mt58w\" (UniqueName: \"kubernetes.io/projected/503a1114-4530-4147-950a-efa451c46545-kube-api-access-mt58w\") pod \"neutron-323e-account-create-update-w4q2t\" (UID: \"503a1114-4530-4147-950a-efa451c46545\") " pod="openstack/neutron-323e-account-create-update-w4q2t" Dec 02 08:54:30 crc kubenswrapper[4895]: I1202 08:54:30.936645 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/503a1114-4530-4147-950a-efa451c46545-operator-scripts\") pod \"neutron-323e-account-create-update-w4q2t\" (UID: \"503a1114-4530-4147-950a-efa451c46545\") " pod="openstack/neutron-323e-account-create-update-w4q2t" Dec 02 08:54:30 crc kubenswrapper[4895]: I1202 08:54:30.937396 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/503a1114-4530-4147-950a-efa451c46545-operator-scripts\") pod \"neutron-323e-account-create-update-w4q2t\" (UID: \"503a1114-4530-4147-950a-efa451c46545\") " pod="openstack/neutron-323e-account-create-update-w4q2t" Dec 02 08:54:30 crc kubenswrapper[4895]: I1202 08:54:30.957165 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-qj5cd" Dec 02 08:54:30 crc kubenswrapper[4895]: I1202 08:54:30.975587 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mt58w\" (UniqueName: \"kubernetes.io/projected/503a1114-4530-4147-950a-efa451c46545-kube-api-access-mt58w\") pod \"neutron-323e-account-create-update-w4q2t\" (UID: \"503a1114-4530-4147-950a-efa451c46545\") " pod="openstack/neutron-323e-account-create-update-w4q2t" Dec 02 08:54:31 crc kubenswrapper[4895]: I1202 08:54:31.027808 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-323e-account-create-update-w4q2t" Dec 02 08:54:31 crc kubenswrapper[4895]: I1202 08:54:31.406292 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-qj5cd"] Dec 02 08:54:31 crc kubenswrapper[4895]: W1202 08:54:31.411710 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf1c10dc0_d131_46da_b015_2f1dc1843723.slice/crio-2982b5c8fe9bf11df0a9e706792947ae0f87547ed0ff79d24a7ab08bc429656b WatchSource:0}: Error finding container 2982b5c8fe9bf11df0a9e706792947ae0f87547ed0ff79d24a7ab08bc429656b: Status 404 returned error can't find the container with id 2982b5c8fe9bf11df0a9e706792947ae0f87547ed0ff79d24a7ab08bc429656b Dec 02 08:54:31 crc kubenswrapper[4895]: I1202 08:54:31.510885 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-323e-account-create-update-w4q2t"] Dec 02 08:54:31 crc kubenswrapper[4895]: I1202 08:54:31.526305 4895 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-qj5cd" event={"ID":"f1c10dc0-d131-46da-b015-2f1dc1843723","Type":"ContainerStarted","Data":"2982b5c8fe9bf11df0a9e706792947ae0f87547ed0ff79d24a7ab08bc429656b"} Dec 02 08:54:31 crc kubenswrapper[4895]: I1202 08:54:31.528918 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-323e-account-create-update-w4q2t" event={"ID":"503a1114-4530-4147-950a-efa451c46545","Type":"ContainerStarted","Data":"2b6982388c255d1d6bc61e2907f26d36fe9adeac5066350e61b92e88c0b5a1bf"} Dec 02 08:54:32 crc kubenswrapper[4895]: I1202 08:54:32.540159 4895 generic.go:334] "Generic (PLEG): container finished" podID="f1c10dc0-d131-46da-b015-2f1dc1843723" containerID="33fa1fce993d47df6ec702c7fc2ed2eaf36ab13d08ddef01a3555be0b0638d61" exitCode=0 Dec 02 08:54:32 crc kubenswrapper[4895]: I1202 08:54:32.540256 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-qj5cd" event={"ID":"f1c10dc0-d131-46da-b015-2f1dc1843723","Type":"ContainerDied","Data":"33fa1fce993d47df6ec702c7fc2ed2eaf36ab13d08ddef01a3555be0b0638d61"} Dec 02 08:54:32 crc kubenswrapper[4895]: I1202 08:54:32.543942 4895 generic.go:334] "Generic (PLEG): container finished" podID="503a1114-4530-4147-950a-efa451c46545" containerID="479aa7736c632ffde4307b61d4f05e5599c0104a4fe2368c24a895db3c4bf37d" exitCode=0 Dec 02 08:54:32 crc kubenswrapper[4895]: I1202 08:54:32.544011 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-323e-account-create-update-w4q2t" event={"ID":"503a1114-4530-4147-950a-efa451c46545","Type":"ContainerDied","Data":"479aa7736c632ffde4307b61d4f05e5599c0104a4fe2368c24a895db3c4bf37d"} Dec 02 08:54:33 crc kubenswrapper[4895]: I1202 08:54:33.931080 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-qj5cd" Dec 02 08:54:34 crc kubenswrapper[4895]: I1202 08:54:34.002719 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6clsn\" (UniqueName: \"kubernetes.io/projected/f1c10dc0-d131-46da-b015-2f1dc1843723-kube-api-access-6clsn\") pod \"f1c10dc0-d131-46da-b015-2f1dc1843723\" (UID: \"f1c10dc0-d131-46da-b015-2f1dc1843723\") " Dec 02 08:54:34 crc kubenswrapper[4895]: I1202 08:54:34.002870 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f1c10dc0-d131-46da-b015-2f1dc1843723-operator-scripts\") pod \"f1c10dc0-d131-46da-b015-2f1dc1843723\" (UID: \"f1c10dc0-d131-46da-b015-2f1dc1843723\") " Dec 02 08:54:34 crc kubenswrapper[4895]: I1202 08:54:34.003572 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1c10dc0-d131-46da-b015-2f1dc1843723-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f1c10dc0-d131-46da-b015-2f1dc1843723" (UID: "f1c10dc0-d131-46da-b015-2f1dc1843723"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:54:34 crc kubenswrapper[4895]: I1202 08:54:34.008539 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1c10dc0-d131-46da-b015-2f1dc1843723-kube-api-access-6clsn" (OuterVolumeSpecName: "kube-api-access-6clsn") pod "f1c10dc0-d131-46da-b015-2f1dc1843723" (UID: "f1c10dc0-d131-46da-b015-2f1dc1843723"). InnerVolumeSpecName "kube-api-access-6clsn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:54:34 crc kubenswrapper[4895]: I1202 08:54:34.012928 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-323e-account-create-update-w4q2t" Dec 02 08:54:34 crc kubenswrapper[4895]: I1202 08:54:34.104292 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/503a1114-4530-4147-950a-efa451c46545-operator-scripts\") pod \"503a1114-4530-4147-950a-efa451c46545\" (UID: \"503a1114-4530-4147-950a-efa451c46545\") " Dec 02 08:54:34 crc kubenswrapper[4895]: I1202 08:54:34.104469 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mt58w\" (UniqueName: \"kubernetes.io/projected/503a1114-4530-4147-950a-efa451c46545-kube-api-access-mt58w\") pod \"503a1114-4530-4147-950a-efa451c46545\" (UID: \"503a1114-4530-4147-950a-efa451c46545\") " Dec 02 08:54:34 crc kubenswrapper[4895]: I1202 08:54:34.104879 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6clsn\" (UniqueName: \"kubernetes.io/projected/f1c10dc0-d131-46da-b015-2f1dc1843723-kube-api-access-6clsn\") on node \"crc\" DevicePath \"\"" Dec 02 08:54:34 crc kubenswrapper[4895]: I1202 08:54:34.104898 4895 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f1c10dc0-d131-46da-b015-2f1dc1843723-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 08:54:34 crc kubenswrapper[4895]: I1202 08:54:34.106158 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/503a1114-4530-4147-950a-efa451c46545-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "503a1114-4530-4147-950a-efa451c46545" (UID: "503a1114-4530-4147-950a-efa451c46545"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 08:54:34 crc kubenswrapper[4895]: I1202 08:54:34.109818 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/503a1114-4530-4147-950a-efa451c46545-kube-api-access-mt58w" (OuterVolumeSpecName: "kube-api-access-mt58w") pod "503a1114-4530-4147-950a-efa451c46545" (UID: "503a1114-4530-4147-950a-efa451c46545"). InnerVolumeSpecName "kube-api-access-mt58w". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 08:54:34 crc kubenswrapper[4895]: I1202 08:54:34.207346 4895 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/503a1114-4530-4147-950a-efa451c46545-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 02 08:54:34 crc kubenswrapper[4895]: I1202 08:54:34.207382 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mt58w\" (UniqueName: \"kubernetes.io/projected/503a1114-4530-4147-950a-efa451c46545-kube-api-access-mt58w\") on node \"crc\" DevicePath \"\""
Dec 02 08:54:34 crc kubenswrapper[4895]: I1202 08:54:34.565884 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-323e-account-create-update-w4q2t"
Dec 02 08:54:34 crc kubenswrapper[4895]: I1202 08:54:34.565904 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-323e-account-create-update-w4q2t" event={"ID":"503a1114-4530-4147-950a-efa451c46545","Type":"ContainerDied","Data":"2b6982388c255d1d6bc61e2907f26d36fe9adeac5066350e61b92e88c0b5a1bf"}
Dec 02 08:54:34 crc kubenswrapper[4895]: I1202 08:54:34.566347 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2b6982388c255d1d6bc61e2907f26d36fe9adeac5066350e61b92e88c0b5a1bf"
Dec 02 08:54:34 crc kubenswrapper[4895]: I1202 08:54:34.568431 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-qj5cd" event={"ID":"f1c10dc0-d131-46da-b015-2f1dc1843723","Type":"ContainerDied","Data":"2982b5c8fe9bf11df0a9e706792947ae0f87547ed0ff79d24a7ab08bc429656b"}
Dec 02 08:54:34 crc kubenswrapper[4895]: I1202 08:54:34.568472 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2982b5c8fe9bf11df0a9e706792947ae0f87547ed0ff79d24a7ab08bc429656b"
Dec 02 08:54:34 crc kubenswrapper[4895]: I1202 08:54:34.568595 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-qj5cd"
Dec 02 08:54:35 crc kubenswrapper[4895]: I1202 08:54:35.474130 4895 patch_prober.go:28] interesting pod/machine-config-daemon-wfcg7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 02 08:54:35 crc kubenswrapper[4895]: I1202 08:54:35.474889 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 02 08:54:35 crc kubenswrapper[4895]: I1202 08:54:35.888192 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-z2sln"]
Dec 02 08:54:35 crc kubenswrapper[4895]: E1202 08:54:35.888710 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1c10dc0-d131-46da-b015-2f1dc1843723" containerName="mariadb-database-create"
Dec 02 08:54:35 crc kubenswrapper[4895]: I1202 08:54:35.888751 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1c10dc0-d131-46da-b015-2f1dc1843723" containerName="mariadb-database-create"
Dec 02 08:54:35 crc kubenswrapper[4895]: E1202 08:54:35.888792 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="503a1114-4530-4147-950a-efa451c46545" containerName="mariadb-account-create-update"
Dec 02 08:54:35 crc kubenswrapper[4895]: I1202 08:54:35.888803 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="503a1114-4530-4147-950a-efa451c46545" containerName="mariadb-account-create-update"
Dec 02 08:54:35 crc kubenswrapper[4895]: I1202 08:54:35.889008 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="503a1114-4530-4147-950a-efa451c46545" containerName="mariadb-account-create-update"
Dec 02 08:54:35 crc kubenswrapper[4895]: I1202 08:54:35.889030 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1c10dc0-d131-46da-b015-2f1dc1843723" containerName="mariadb-database-create"
Dec 02 08:54:35 crc kubenswrapper[4895]: I1202 08:54:35.889905 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-z2sln"
Dec 02 08:54:35 crc kubenswrapper[4895]: I1202 08:54:35.893040 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config"
Dec 02 08:54:35 crc kubenswrapper[4895]: I1202 08:54:35.893288 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config"
Dec 02 08:54:35 crc kubenswrapper[4895]: I1202 08:54:35.897162 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-bqc86"
Dec 02 08:54:35 crc kubenswrapper[4895]: I1202 08:54:35.901620 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-z2sln"]
Dec 02 08:54:35 crc kubenswrapper[4895]: I1202 08:54:35.937627 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8qsc\" (UniqueName: \"kubernetes.io/projected/a00cc999-01c0-4d79-870c-4e84aff41706-kube-api-access-h8qsc\") pod \"neutron-db-sync-z2sln\" (UID: \"a00cc999-01c0-4d79-870c-4e84aff41706\") " pod="openstack/neutron-db-sync-z2sln"
Dec 02 08:54:35 crc kubenswrapper[4895]: I1202 08:54:35.938358 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a00cc999-01c0-4d79-870c-4e84aff41706-combined-ca-bundle\") pod \"neutron-db-sync-z2sln\" (UID: \"a00cc999-01c0-4d79-870c-4e84aff41706\") " pod="openstack/neutron-db-sync-z2sln"
Dec 02 08:54:35 crc kubenswrapper[4895]: I1202 08:54:35.938521 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a00cc999-01c0-4d79-870c-4e84aff41706-config\") pod \"neutron-db-sync-z2sln\" (UID: \"a00cc999-01c0-4d79-870c-4e84aff41706\") " pod="openstack/neutron-db-sync-z2sln"
Dec 02 08:54:36 crc kubenswrapper[4895]: I1202 08:54:36.040202 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a00cc999-01c0-4d79-870c-4e84aff41706-combined-ca-bundle\") pod \"neutron-db-sync-z2sln\" (UID: \"a00cc999-01c0-4d79-870c-4e84aff41706\") " pod="openstack/neutron-db-sync-z2sln"
Dec 02 08:54:36 crc kubenswrapper[4895]: I1202 08:54:36.040283 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a00cc999-01c0-4d79-870c-4e84aff41706-config\") pod \"neutron-db-sync-z2sln\" (UID: \"a00cc999-01c0-4d79-870c-4e84aff41706\") " pod="openstack/neutron-db-sync-z2sln"
Dec 02 08:54:36 crc kubenswrapper[4895]: I1202 08:54:36.040371 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h8qsc\" (UniqueName: \"kubernetes.io/projected/a00cc999-01c0-4d79-870c-4e84aff41706-kube-api-access-h8qsc\") pod \"neutron-db-sync-z2sln\" (UID: \"a00cc999-01c0-4d79-870c-4e84aff41706\") " pod="openstack/neutron-db-sync-z2sln"
Dec 02 08:54:36 crc kubenswrapper[4895]: I1202 08:54:36.046205 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a00cc999-01c0-4d79-870c-4e84aff41706-combined-ca-bundle\") pod \"neutron-db-sync-z2sln\" (UID: \"a00cc999-01c0-4d79-870c-4e84aff41706\") " pod="openstack/neutron-db-sync-z2sln"
Dec 02 08:54:36 crc kubenswrapper[4895]: I1202 08:54:36.046360 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/a00cc999-01c0-4d79-870c-4e84aff41706-config\") pod \"neutron-db-sync-z2sln\" (UID: \"a00cc999-01c0-4d79-870c-4e84aff41706\") " pod="openstack/neutron-db-sync-z2sln"
Dec 02 08:54:36 crc kubenswrapper[4895]: I1202 08:54:36.058077 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8qsc\" (UniqueName: \"kubernetes.io/projected/a00cc999-01c0-4d79-870c-4e84aff41706-kube-api-access-h8qsc\") pod \"neutron-db-sync-z2sln\" (UID: \"a00cc999-01c0-4d79-870c-4e84aff41706\") " pod="openstack/neutron-db-sync-z2sln"
Dec 02 08:54:36 crc kubenswrapper[4895]: I1202 08:54:36.209697 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-z2sln"
Dec 02 08:54:36 crc kubenswrapper[4895]: I1202 08:54:36.673567 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-z2sln"]
Dec 02 08:54:37 crc kubenswrapper[4895]: I1202 08:54:37.607247 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-z2sln" event={"ID":"a00cc999-01c0-4d79-870c-4e84aff41706","Type":"ContainerStarted","Data":"710da3364236afd6e2492aebc51c183d4b941b4e773510efb788e163681e1435"}
Dec 02 08:54:37 crc kubenswrapper[4895]: I1202 08:54:37.608168 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-z2sln" event={"ID":"a00cc999-01c0-4d79-870c-4e84aff41706","Type":"ContainerStarted","Data":"683f37c703fe267d5ea303b52e85f05a0c210a68f19d773e43dd6ce406699e18"}
Dec 02 08:54:37 crc kubenswrapper[4895]: I1202 08:54:37.641149 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-z2sln" podStartSLOduration=2.64111881 podStartE2EDuration="2.64111881s" podCreationTimestamp="2025-12-02 08:54:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:54:37.63693662 +0000 UTC m=+5488.807796263" watchObservedRunningTime="2025-12-02 08:54:37.64111881 +0000 UTC m=+5488.811978443"
Dec 02 08:54:41 crc kubenswrapper[4895]: I1202 08:54:41.659827 4895 generic.go:334] "Generic (PLEG): container finished" podID="a00cc999-01c0-4d79-870c-4e84aff41706" containerID="710da3364236afd6e2492aebc51c183d4b941b4e773510efb788e163681e1435" exitCode=0
Dec 02 08:54:41 crc kubenswrapper[4895]: I1202 08:54:41.659988 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-z2sln" event={"ID":"a00cc999-01c0-4d79-870c-4e84aff41706","Type":"ContainerDied","Data":"710da3364236afd6e2492aebc51c183d4b941b4e773510efb788e163681e1435"}
Dec 02 08:54:42 crc kubenswrapper[4895]: I1202 08:54:42.970146 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-z2sln"
Dec 02 08:54:43 crc kubenswrapper[4895]: I1202 08:54:43.066375 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h8qsc\" (UniqueName: \"kubernetes.io/projected/a00cc999-01c0-4d79-870c-4e84aff41706-kube-api-access-h8qsc\") pod \"a00cc999-01c0-4d79-870c-4e84aff41706\" (UID: \"a00cc999-01c0-4d79-870c-4e84aff41706\") "
Dec 02 08:54:43 crc kubenswrapper[4895]: I1202 08:54:43.067166 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a00cc999-01c0-4d79-870c-4e84aff41706-combined-ca-bundle\") pod \"a00cc999-01c0-4d79-870c-4e84aff41706\" (UID: \"a00cc999-01c0-4d79-870c-4e84aff41706\") "
Dec 02 08:54:43 crc kubenswrapper[4895]: I1202 08:54:43.067287 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a00cc999-01c0-4d79-870c-4e84aff41706-config\") pod \"a00cc999-01c0-4d79-870c-4e84aff41706\" (UID: \"a00cc999-01c0-4d79-870c-4e84aff41706\") "
Dec 02 08:54:43 crc kubenswrapper[4895]: I1202 08:54:43.075083 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a00cc999-01c0-4d79-870c-4e84aff41706-kube-api-access-h8qsc" (OuterVolumeSpecName: "kube-api-access-h8qsc") pod "a00cc999-01c0-4d79-870c-4e84aff41706" (UID: "a00cc999-01c0-4d79-870c-4e84aff41706"). InnerVolumeSpecName "kube-api-access-h8qsc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 08:54:43 crc kubenswrapper[4895]: I1202 08:54:43.097240 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a00cc999-01c0-4d79-870c-4e84aff41706-config" (OuterVolumeSpecName: "config") pod "a00cc999-01c0-4d79-870c-4e84aff41706" (UID: "a00cc999-01c0-4d79-870c-4e84aff41706"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 08:54:43 crc kubenswrapper[4895]: I1202 08:54:43.100694 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a00cc999-01c0-4d79-870c-4e84aff41706-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a00cc999-01c0-4d79-870c-4e84aff41706" (UID: "a00cc999-01c0-4d79-870c-4e84aff41706"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 08:54:43 crc kubenswrapper[4895]: I1202 08:54:43.172087 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h8qsc\" (UniqueName: \"kubernetes.io/projected/a00cc999-01c0-4d79-870c-4e84aff41706-kube-api-access-h8qsc\") on node \"crc\" DevicePath \"\""
Dec 02 08:54:43 crc kubenswrapper[4895]: I1202 08:54:43.172476 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a00cc999-01c0-4d79-870c-4e84aff41706-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 02 08:54:43 crc kubenswrapper[4895]: I1202 08:54:43.172492 4895 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/a00cc999-01c0-4d79-870c-4e84aff41706-config\") on node \"crc\" DevicePath \"\""
Dec 02 08:54:43 crc kubenswrapper[4895]: I1202 08:54:43.677249 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-z2sln" event={"ID":"a00cc999-01c0-4d79-870c-4e84aff41706","Type":"ContainerDied","Data":"683f37c703fe267d5ea303b52e85f05a0c210a68f19d773e43dd6ce406699e18"}
Dec 02 08:54:43 crc kubenswrapper[4895]: I1202 08:54:43.677293 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="683f37c703fe267d5ea303b52e85f05a0c210a68f19d773e43dd6ce406699e18"
Dec 02 08:54:43 crc kubenswrapper[4895]: I1202 08:54:43.677344 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-z2sln"
Dec 02 08:54:43 crc kubenswrapper[4895]: I1202 08:54:43.930109 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-64dcccd5c7-mwdjx"]
Dec 02 08:54:43 crc kubenswrapper[4895]: E1202 08:54:43.930495 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a00cc999-01c0-4d79-870c-4e84aff41706" containerName="neutron-db-sync"
Dec 02 08:54:43 crc kubenswrapper[4895]: I1202 08:54:43.930511 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="a00cc999-01c0-4d79-870c-4e84aff41706" containerName="neutron-db-sync"
Dec 02 08:54:43 crc kubenswrapper[4895]: I1202 08:54:43.930682 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="a00cc999-01c0-4d79-870c-4e84aff41706" containerName="neutron-db-sync"
Dec 02 08:54:43 crc kubenswrapper[4895]: I1202 08:54:43.931970 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-64dcccd5c7-mwdjx"
Dec 02 08:54:43 crc kubenswrapper[4895]: I1202 08:54:43.951566 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-64dcccd5c7-mwdjx"]
Dec 02 08:54:43 crc kubenswrapper[4895]: I1202 08:54:43.985360 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/343e44b9-20db-43e6-8b44-c78a6159b631-ovsdbserver-sb\") pod \"dnsmasq-dns-64dcccd5c7-mwdjx\" (UID: \"343e44b9-20db-43e6-8b44-c78a6159b631\") " pod="openstack/dnsmasq-dns-64dcccd5c7-mwdjx"
Dec 02 08:54:43 crc kubenswrapper[4895]: I1202 08:54:43.985441 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gprm\" (UniqueName: \"kubernetes.io/projected/343e44b9-20db-43e6-8b44-c78a6159b631-kube-api-access-2gprm\") pod \"dnsmasq-dns-64dcccd5c7-mwdjx\" (UID: \"343e44b9-20db-43e6-8b44-c78a6159b631\") " pod="openstack/dnsmasq-dns-64dcccd5c7-mwdjx"
Dec 02 08:54:43 crc kubenswrapper[4895]: I1202 08:54:43.985533 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/343e44b9-20db-43e6-8b44-c78a6159b631-dns-svc\") pod \"dnsmasq-dns-64dcccd5c7-mwdjx\" (UID: \"343e44b9-20db-43e6-8b44-c78a6159b631\") " pod="openstack/dnsmasq-dns-64dcccd5c7-mwdjx"
Dec 02 08:54:43 crc kubenswrapper[4895]: I1202 08:54:43.985616 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/343e44b9-20db-43e6-8b44-c78a6159b631-config\") pod \"dnsmasq-dns-64dcccd5c7-mwdjx\" (UID: \"343e44b9-20db-43e6-8b44-c78a6159b631\") " pod="openstack/dnsmasq-dns-64dcccd5c7-mwdjx"
Dec 02 08:54:43 crc kubenswrapper[4895]: I1202 08:54:43.985679 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/343e44b9-20db-43e6-8b44-c78a6159b631-ovsdbserver-nb\") pod \"dnsmasq-dns-64dcccd5c7-mwdjx\" (UID: \"343e44b9-20db-43e6-8b44-c78a6159b631\") " pod="openstack/dnsmasq-dns-64dcccd5c7-mwdjx"
Dec 02 08:54:44 crc kubenswrapper[4895]: I1202 08:54:44.064242 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-6849c5c4f-gh46j"]
Dec 02 08:54:44 crc kubenswrapper[4895]: I1202 08:54:44.066485 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6849c5c4f-gh46j"
Dec 02 08:54:44 crc kubenswrapper[4895]: I1202 08:54:44.068900 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-bqc86"
Dec 02 08:54:44 crc kubenswrapper[4895]: I1202 08:54:44.069166 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config"
Dec 02 08:54:44 crc kubenswrapper[4895]: I1202 08:54:44.075009 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config"
Dec 02 08:54:44 crc kubenswrapper[4895]: I1202 08:54:44.081071 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6849c5c4f-gh46j"]
Dec 02 08:54:44 crc kubenswrapper[4895]: I1202 08:54:44.087946 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f410d809-a0e1-465b-a495-868c22d9b9c7-config\") pod \"neutron-6849c5c4f-gh46j\" (UID: \"f410d809-a0e1-465b-a495-868c22d9b9c7\") " pod="openstack/neutron-6849c5c4f-gh46j"
Dec 02 08:54:44 crc kubenswrapper[4895]: I1202 08:54:44.088002 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/343e44b9-20db-43e6-8b44-c78a6159b631-ovsdbserver-nb\") pod \"dnsmasq-dns-64dcccd5c7-mwdjx\" (UID: \"343e44b9-20db-43e6-8b44-c78a6159b631\") " pod="openstack/dnsmasq-dns-64dcccd5c7-mwdjx"
Dec 02 08:54:44 crc kubenswrapper[4895]: I1202 08:54:44.088053 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/343e44b9-20db-43e6-8b44-c78a6159b631-ovsdbserver-sb\") pod \"dnsmasq-dns-64dcccd5c7-mwdjx\" (UID: \"343e44b9-20db-43e6-8b44-c78a6159b631\") " pod="openstack/dnsmasq-dns-64dcccd5c7-mwdjx"
Dec 02 08:54:44 crc kubenswrapper[4895]: I1202 08:54:44.088098 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7nqk2\" (UniqueName: \"kubernetes.io/projected/f410d809-a0e1-465b-a495-868c22d9b9c7-kube-api-access-7nqk2\") pod \"neutron-6849c5c4f-gh46j\" (UID: \"f410d809-a0e1-465b-a495-868c22d9b9c7\") " pod="openstack/neutron-6849c5c4f-gh46j"
Dec 02 08:54:44 crc kubenswrapper[4895]: I1202 08:54:44.088127 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f410d809-a0e1-465b-a495-868c22d9b9c7-httpd-config\") pod \"neutron-6849c5c4f-gh46j\" (UID: \"f410d809-a0e1-465b-a495-868c22d9b9c7\") " pod="openstack/neutron-6849c5c4f-gh46j"
Dec 02 08:54:44 crc kubenswrapper[4895]: I1202 08:54:44.088154 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2gprm\" (UniqueName: \"kubernetes.io/projected/343e44b9-20db-43e6-8b44-c78a6159b631-kube-api-access-2gprm\") pod \"dnsmasq-dns-64dcccd5c7-mwdjx\" (UID: \"343e44b9-20db-43e6-8b44-c78a6159b631\") " pod="openstack/dnsmasq-dns-64dcccd5c7-mwdjx"
Dec 02 08:54:44 crc kubenswrapper[4895]: I1202 08:54:44.088183 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/343e44b9-20db-43e6-8b44-c78a6159b631-dns-svc\") pod \"dnsmasq-dns-64dcccd5c7-mwdjx\" (UID: \"343e44b9-20db-43e6-8b44-c78a6159b631\") " pod="openstack/dnsmasq-dns-64dcccd5c7-mwdjx"
Dec 02 08:54:44 crc kubenswrapper[4895]: I1202 08:54:44.088211 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f410d809-a0e1-465b-a495-868c22d9b9c7-combined-ca-bundle\") pod \"neutron-6849c5c4f-gh46j\" (UID: \"f410d809-a0e1-465b-a495-868c22d9b9c7\") " pod="openstack/neutron-6849c5c4f-gh46j"
Dec 02 08:54:44 crc kubenswrapper[4895]: I1202 08:54:44.088251 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/343e44b9-20db-43e6-8b44-c78a6159b631-config\") pod \"dnsmasq-dns-64dcccd5c7-mwdjx\" (UID: \"343e44b9-20db-43e6-8b44-c78a6159b631\") " pod="openstack/dnsmasq-dns-64dcccd5c7-mwdjx"
Dec 02 08:54:44 crc kubenswrapper[4895]: I1202 08:54:44.089203 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/343e44b9-20db-43e6-8b44-c78a6159b631-config\") pod \"dnsmasq-dns-64dcccd5c7-mwdjx\" (UID: \"343e44b9-20db-43e6-8b44-c78a6159b631\") " pod="openstack/dnsmasq-dns-64dcccd5c7-mwdjx"
Dec 02 08:54:44 crc kubenswrapper[4895]: I1202 08:54:44.089956 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/343e44b9-20db-43e6-8b44-c78a6159b631-ovsdbserver-nb\") pod \"dnsmasq-dns-64dcccd5c7-mwdjx\" (UID: \"343e44b9-20db-43e6-8b44-c78a6159b631\") " pod="openstack/dnsmasq-dns-64dcccd5c7-mwdjx"
Dec 02 08:54:44 crc kubenswrapper[4895]: I1202 08:54:44.090603 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/343e44b9-20db-43e6-8b44-c78a6159b631-ovsdbserver-sb\") pod \"dnsmasq-dns-64dcccd5c7-mwdjx\" (UID: \"343e44b9-20db-43e6-8b44-c78a6159b631\") " pod="openstack/dnsmasq-dns-64dcccd5c7-mwdjx"
Dec 02 08:54:44 crc kubenswrapper[4895]: I1202 08:54:44.091629 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/343e44b9-20db-43e6-8b44-c78a6159b631-dns-svc\") pod \"dnsmasq-dns-64dcccd5c7-mwdjx\" (UID: \"343e44b9-20db-43e6-8b44-c78a6159b631\") " pod="openstack/dnsmasq-dns-64dcccd5c7-mwdjx"
Dec 02 08:54:44 crc kubenswrapper[4895]: I1202 08:54:44.119240 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gprm\" (UniqueName: \"kubernetes.io/projected/343e44b9-20db-43e6-8b44-c78a6159b631-kube-api-access-2gprm\") pod \"dnsmasq-dns-64dcccd5c7-mwdjx\" (UID: \"343e44b9-20db-43e6-8b44-c78a6159b631\") " pod="openstack/dnsmasq-dns-64dcccd5c7-mwdjx"
Dec 02 08:54:44 crc kubenswrapper[4895]: I1202 08:54:44.189098 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f410d809-a0e1-465b-a495-868c22d9b9c7-combined-ca-bundle\") pod \"neutron-6849c5c4f-gh46j\" (UID: \"f410d809-a0e1-465b-a495-868c22d9b9c7\") " pod="openstack/neutron-6849c5c4f-gh46j"
Dec 02 08:54:44 crc kubenswrapper[4895]: I1202 08:54:44.189495 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f410d809-a0e1-465b-a495-868c22d9b9c7-config\") pod \"neutron-6849c5c4f-gh46j\" (UID: \"f410d809-a0e1-465b-a495-868c22d9b9c7\") " pod="openstack/neutron-6849c5c4f-gh46j"
Dec 02 08:54:44 crc kubenswrapper[4895]: I1202 08:54:44.189608 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7nqk2\" (UniqueName: \"kubernetes.io/projected/f410d809-a0e1-465b-a495-868c22d9b9c7-kube-api-access-7nqk2\") pod \"neutron-6849c5c4f-gh46j\" (UID: \"f410d809-a0e1-465b-a495-868c22d9b9c7\") " pod="openstack/neutron-6849c5c4f-gh46j"
Dec 02 08:54:44 crc kubenswrapper[4895]: I1202 08:54:44.189654 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f410d809-a0e1-465b-a495-868c22d9b9c7-httpd-config\") pod \"neutron-6849c5c4f-gh46j\" (UID: \"f410d809-a0e1-465b-a495-868c22d9b9c7\") " pod="openstack/neutron-6849c5c4f-gh46j"
Dec 02 08:54:44 crc kubenswrapper[4895]: I1202 08:54:44.193252 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f410d809-a0e1-465b-a495-868c22d9b9c7-combined-ca-bundle\") pod \"neutron-6849c5c4f-gh46j\" (UID: \"f410d809-a0e1-465b-a495-868c22d9b9c7\") " pod="openstack/neutron-6849c5c4f-gh46j"
Dec 02 08:54:44 crc kubenswrapper[4895]: I1202 08:54:44.194191 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/f410d809-a0e1-465b-a495-868c22d9b9c7-config\") pod \"neutron-6849c5c4f-gh46j\" (UID: \"f410d809-a0e1-465b-a495-868c22d9b9c7\") " pod="openstack/neutron-6849c5c4f-gh46j"
Dec 02 08:54:44 crc kubenswrapper[4895]: I1202 08:54:44.195016 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f410d809-a0e1-465b-a495-868c22d9b9c7-httpd-config\") pod \"neutron-6849c5c4f-gh46j\" (UID: \"f410d809-a0e1-465b-a495-868c22d9b9c7\") " pod="openstack/neutron-6849c5c4f-gh46j"
Dec 02 08:54:44 crc kubenswrapper[4895]: I1202 08:54:44.211146 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7nqk2\" (UniqueName: \"kubernetes.io/projected/f410d809-a0e1-465b-a495-868c22d9b9c7-kube-api-access-7nqk2\") pod \"neutron-6849c5c4f-gh46j\" (UID: \"f410d809-a0e1-465b-a495-868c22d9b9c7\") " pod="openstack/neutron-6849c5c4f-gh46j"
Dec 02 08:54:44 crc kubenswrapper[4895]: I1202 08:54:44.250668 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-64dcccd5c7-mwdjx"
Dec 02 08:54:44 crc kubenswrapper[4895]: I1202 08:54:44.385590 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6849c5c4f-gh46j"
Dec 02 08:54:44 crc kubenswrapper[4895]: I1202 08:54:44.720477 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-64dcccd5c7-mwdjx"]
Dec 02 08:54:44 crc kubenswrapper[4895]: W1202 08:54:44.722296 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod343e44b9_20db_43e6_8b44_c78a6159b631.slice/crio-cc40b8d7aab4a10b6a3fd3878a73fee6788e3fc2a5833b8fbaaeaa96ec68a320 WatchSource:0}: Error finding container cc40b8d7aab4a10b6a3fd3878a73fee6788e3fc2a5833b8fbaaeaa96ec68a320: Status 404 returned error can't find the container with id cc40b8d7aab4a10b6a3fd3878a73fee6788e3fc2a5833b8fbaaeaa96ec68a320
Dec 02 08:54:44 crc kubenswrapper[4895]: I1202 08:54:44.962692 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6849c5c4f-gh46j"]
Dec 02 08:54:44 crc kubenswrapper[4895]: W1202 08:54:44.968034 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf410d809_a0e1_465b_a495_868c22d9b9c7.slice/crio-ea9705bbc8fdeca8f420c84115dcc04ab2675cde67b4599243cdddc57275908d WatchSource:0}: Error finding container ea9705bbc8fdeca8f420c84115dcc04ab2675cde67b4599243cdddc57275908d: Status 404 returned error can't find the container with id ea9705bbc8fdeca8f420c84115dcc04ab2675cde67b4599243cdddc57275908d
Dec 02 08:54:45 crc kubenswrapper[4895]: I1202 08:54:45.701557 4895 generic.go:334] "Generic (PLEG): container finished" podID="343e44b9-20db-43e6-8b44-c78a6159b631" containerID="a9219e6f2c3e5cff8935dc4203f4f99ac8ff7da48f062bfb653addad1cf80a18" exitCode=0
Dec 02 08:54:45 crc kubenswrapper[4895]: I1202 08:54:45.701654 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64dcccd5c7-mwdjx" event={"ID":"343e44b9-20db-43e6-8b44-c78a6159b631","Type":"ContainerDied","Data":"a9219e6f2c3e5cff8935dc4203f4f99ac8ff7da48f062bfb653addad1cf80a18"}
Dec 02 08:54:45 crc kubenswrapper[4895]: I1202 08:54:45.702240 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64dcccd5c7-mwdjx" event={"ID":"343e44b9-20db-43e6-8b44-c78a6159b631","Type":"ContainerStarted","Data":"cc40b8d7aab4a10b6a3fd3878a73fee6788e3fc2a5833b8fbaaeaa96ec68a320"}
Dec 02 08:54:45 crc kubenswrapper[4895]: I1202 08:54:45.705216 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6849c5c4f-gh46j" event={"ID":"f410d809-a0e1-465b-a495-868c22d9b9c7","Type":"ContainerStarted","Data":"74bc4eac2fda8c97fac75e74082350e2aa4745d49bd35e0f952ddee940e450f2"}
Dec 02 08:54:45 crc kubenswrapper[4895]: I1202 08:54:45.705253 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6849c5c4f-gh46j" event={"ID":"f410d809-a0e1-465b-a495-868c22d9b9c7","Type":"ContainerStarted","Data":"4ab9463d29e42234f725963bcc3d0c5a48ad746e69655fb25cf225e835fa5468"}
Dec 02 08:54:45 crc kubenswrapper[4895]: I1202 08:54:45.705264 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6849c5c4f-gh46j" event={"ID":"f410d809-a0e1-465b-a495-868c22d9b9c7","Type":"ContainerStarted","Data":"ea9705bbc8fdeca8f420c84115dcc04ab2675cde67b4599243cdddc57275908d"}
Dec 02 08:54:45 crc kubenswrapper[4895]: I1202 08:54:45.705362 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-6849c5c4f-gh46j"
Dec 02 08:54:45 crc kubenswrapper[4895]: I1202 08:54:45.755589 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-6849c5c4f-gh46j" podStartSLOduration=1.755565672 podStartE2EDuration="1.755565672s" podCreationTimestamp="2025-12-02 08:54:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:54:45.744774566 +0000 UTC m=+5496.915634189" watchObservedRunningTime="2025-12-02 08:54:45.755565672 +0000 UTC m=+5496.926425285"
Dec 02 08:54:46 crc kubenswrapper[4895]: I1202 08:54:46.727062 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64dcccd5c7-mwdjx" event={"ID":"343e44b9-20db-43e6-8b44-c78a6159b631","Type":"ContainerStarted","Data":"d37019dad4cb761c10e4bfbfcf77a1111d2a9cfdc6cc4d6d8e8d5c933336921e"}
Dec 02 08:54:46 crc kubenswrapper[4895]: I1202 08:54:46.727561 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-64dcccd5c7-mwdjx"
Dec 02 08:54:46 crc kubenswrapper[4895]: I1202 08:54:46.756599 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-64dcccd5c7-mwdjx" podStartSLOduration=3.756573134 podStartE2EDuration="3.756573134s" podCreationTimestamp="2025-12-02 08:54:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:54:46.747717748 +0000 UTC m=+5497.918577371" watchObservedRunningTime="2025-12-02 08:54:46.756573134 +0000 UTC m=+5497.927432737"
Dec 02 08:54:54 crc kubenswrapper[4895]: I1202 08:54:54.253010 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-64dcccd5c7-mwdjx"
Dec 02 08:54:54 crc kubenswrapper[4895]: I1202 08:54:54.333350 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d8ffddd5c-l4qnp"]
Dec 02 08:54:54 crc kubenswrapper[4895]: I1202 08:54:54.333731 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5d8ffddd5c-l4qnp" podUID="46cd1ecc-48c3-4e7a-812e-1fd8e7ce3f9c" containerName="dnsmasq-dns" containerID="cri-o://30ae8fb05bff460fc03435118cb2277ee572a00d13210037531c1b51af875ef6" gracePeriod=10
Dec 02 08:54:54 crc kubenswrapper[4895]: I1202 08:54:54.804264 4895 generic.go:334] "Generic (PLEG): container finished" podID="46cd1ecc-48c3-4e7a-812e-1fd8e7ce3f9c" containerID="30ae8fb05bff460fc03435118cb2277ee572a00d13210037531c1b51af875ef6" exitCode=0
Dec 02 08:54:54 crc kubenswrapper[4895]: I1202 08:54:54.804458 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d8ffddd5c-l4qnp" event={"ID":"46cd1ecc-48c3-4e7a-812e-1fd8e7ce3f9c","Type":"ContainerDied","Data":"30ae8fb05bff460fc03435118cb2277ee572a00d13210037531c1b51af875ef6"}
Dec 02 08:54:54 crc kubenswrapper[4895]: I1202 08:54:54.945780 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d8ffddd5c-l4qnp"
Dec 02 08:54:55 crc kubenswrapper[4895]: I1202 08:54:55.149309 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/46cd1ecc-48c3-4e7a-812e-1fd8e7ce3f9c-dns-svc\") pod \"46cd1ecc-48c3-4e7a-812e-1fd8e7ce3f9c\" (UID: \"46cd1ecc-48c3-4e7a-812e-1fd8e7ce3f9c\") "
Dec 02 08:54:55 crc kubenswrapper[4895]: I1202 08:54:55.149825 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46cd1ecc-48c3-4e7a-812e-1fd8e7ce3f9c-config\") pod \"46cd1ecc-48c3-4e7a-812e-1fd8e7ce3f9c\" (UID: \"46cd1ecc-48c3-4e7a-812e-1fd8e7ce3f9c\") "
Dec 02 08:54:55 crc kubenswrapper[4895]: I1202 08:54:55.149947 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/46cd1ecc-48c3-4e7a-812e-1fd8e7ce3f9c-ovsdbserver-nb\") pod \"46cd1ecc-48c3-4e7a-812e-1fd8e7ce3f9c\" (UID: \"46cd1ecc-48c3-4e7a-812e-1fd8e7ce3f9c\") "
Dec 02 08:54:55 crc kubenswrapper[4895]: I1202 08:54:55.150052 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/46cd1ecc-48c3-4e7a-812e-1fd8e7ce3f9c-ovsdbserver-sb\") pod \"46cd1ecc-48c3-4e7a-812e-1fd8e7ce3f9c\" (UID: \"46cd1ecc-48c3-4e7a-812e-1fd8e7ce3f9c\") "
Dec 02 08:54:55 crc kubenswrapper[4895]: I1202 08:54:55.150121 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r955k\" (UniqueName: \"kubernetes.io/projected/46cd1ecc-48c3-4e7a-812e-1fd8e7ce3f9c-kube-api-access-r955k\") pod \"46cd1ecc-48c3-4e7a-812e-1fd8e7ce3f9c\" (UID: \"46cd1ecc-48c3-4e7a-812e-1fd8e7ce3f9c\") "
Dec 02 08:54:55 crc kubenswrapper[4895]: I1202 08:54:55.170183 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46cd1ecc-48c3-4e7a-812e-1fd8e7ce3f9c-kube-api-access-r955k" (OuterVolumeSpecName: "kube-api-access-r955k") pod "46cd1ecc-48c3-4e7a-812e-1fd8e7ce3f9c" (UID: "46cd1ecc-48c3-4e7a-812e-1fd8e7ce3f9c"). InnerVolumeSpecName "kube-api-access-r955k". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 08:54:55 crc kubenswrapper[4895]: I1202 08:54:55.199234 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46cd1ecc-48c3-4e7a-812e-1fd8e7ce3f9c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "46cd1ecc-48c3-4e7a-812e-1fd8e7ce3f9c" (UID: "46cd1ecc-48c3-4e7a-812e-1fd8e7ce3f9c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 08:54:55 crc kubenswrapper[4895]: I1202 08:54:55.202787 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46cd1ecc-48c3-4e7a-812e-1fd8e7ce3f9c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "46cd1ecc-48c3-4e7a-812e-1fd8e7ce3f9c" (UID: "46cd1ecc-48c3-4e7a-812e-1fd8e7ce3f9c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 08:54:55 crc kubenswrapper[4895]: I1202 08:54:55.207917 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46cd1ecc-48c3-4e7a-812e-1fd8e7ce3f9c-config" (OuterVolumeSpecName: "config") pod "46cd1ecc-48c3-4e7a-812e-1fd8e7ce3f9c" (UID: "46cd1ecc-48c3-4e7a-812e-1fd8e7ce3f9c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 08:54:55 crc kubenswrapper[4895]: I1202 08:54:55.214616 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46cd1ecc-48c3-4e7a-812e-1fd8e7ce3f9c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "46cd1ecc-48c3-4e7a-812e-1fd8e7ce3f9c" (UID: "46cd1ecc-48c3-4e7a-812e-1fd8e7ce3f9c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 08:54:55 crc kubenswrapper[4895]: I1202 08:54:55.252183 4895 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/46cd1ecc-48c3-4e7a-812e-1fd8e7ce3f9c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Dec 02 08:54:55 crc kubenswrapper[4895]: I1202 08:54:55.252344 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r955k\" (UniqueName: \"kubernetes.io/projected/46cd1ecc-48c3-4e7a-812e-1fd8e7ce3f9c-kube-api-access-r955k\") on node \"crc\" DevicePath \"\""
Dec 02 08:54:55 crc kubenswrapper[4895]: I1202 08:54:55.252369 4895 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/46cd1ecc-48c3-4e7a-812e-1fd8e7ce3f9c-dns-svc\") on node \"crc\" DevicePath \"\""
Dec 02 08:54:55 crc kubenswrapper[4895]: I1202 08:54:55.252397 4895 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46cd1ecc-48c3-4e7a-812e-1fd8e7ce3f9c-config\") on node \"crc\" DevicePath \"\""
Dec 02 08:54:55 crc kubenswrapper[4895]: I1202
08:54:55.252408 4895 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/46cd1ecc-48c3-4e7a-812e-1fd8e7ce3f9c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 02 08:54:55 crc kubenswrapper[4895]: I1202 08:54:55.816280 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d8ffddd5c-l4qnp" event={"ID":"46cd1ecc-48c3-4e7a-812e-1fd8e7ce3f9c","Type":"ContainerDied","Data":"b2c7e0539016ee513484d9ee60753167fd438b673b1e0be3f5a1657bfd7e06d4"} Dec 02 08:54:55 crc kubenswrapper[4895]: I1202 08:54:55.816348 4895 scope.go:117] "RemoveContainer" containerID="30ae8fb05bff460fc03435118cb2277ee572a00d13210037531c1b51af875ef6" Dec 02 08:54:55 crc kubenswrapper[4895]: I1202 08:54:55.816360 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d8ffddd5c-l4qnp" Dec 02 08:54:55 crc kubenswrapper[4895]: I1202 08:54:55.864121 4895 scope.go:117] "RemoveContainer" containerID="49bc79ef3a6e9b4c1f3b026be3b3a6b97e7820855ee514abeb3c2755a7503028" Dec 02 08:54:55 crc kubenswrapper[4895]: I1202 08:54:55.877309 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d8ffddd5c-l4qnp"] Dec 02 08:54:55 crc kubenswrapper[4895]: I1202 08:54:55.885934 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5d8ffddd5c-l4qnp"] Dec 02 08:54:57 crc kubenswrapper[4895]: I1202 08:54:57.150207 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46cd1ecc-48c3-4e7a-812e-1fd8e7ce3f9c" path="/var/lib/kubelet/pods/46cd1ecc-48c3-4e7a-812e-1fd8e7ce3f9c/volumes" Dec 02 08:55:05 crc kubenswrapper[4895]: I1202 08:55:05.473137 4895 patch_prober.go:28] interesting pod/machine-config-daemon-wfcg7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
start-of-body= Dec 02 08:55:05 crc kubenswrapper[4895]: I1202 08:55:05.473867 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 08:55:05 crc kubenswrapper[4895]: I1202 08:55:05.473920 4895 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" Dec 02 08:55:05 crc kubenswrapper[4895]: I1202 08:55:05.474719 4895 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0ee4c8392d6e79739cbb4ca35ecfead7d1526fc2afd1bf1fe50512c39f515cec"} pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 08:55:05 crc kubenswrapper[4895]: I1202 08:55:05.474803 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerName="machine-config-daemon" containerID="cri-o://0ee4c8392d6e79739cbb4ca35ecfead7d1526fc2afd1bf1fe50512c39f515cec" gracePeriod=600 Dec 02 08:55:05 crc kubenswrapper[4895]: I1202 08:55:05.922090 4895 generic.go:334] "Generic (PLEG): container finished" podID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerID="0ee4c8392d6e79739cbb4ca35ecfead7d1526fc2afd1bf1fe50512c39f515cec" exitCode=0 Dec 02 08:55:05 crc kubenswrapper[4895]: I1202 08:55:05.922171 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" 
event={"ID":"0468d2d1-a975-45a6-af9f-47adc432fab0","Type":"ContainerDied","Data":"0ee4c8392d6e79739cbb4ca35ecfead7d1526fc2afd1bf1fe50512c39f515cec"} Dec 02 08:55:05 crc kubenswrapper[4895]: I1202 08:55:05.922794 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" event={"ID":"0468d2d1-a975-45a6-af9f-47adc432fab0","Type":"ContainerStarted","Data":"d95d9d738930381cc247f4992573f82ec7c89226f23be0d771d1a3b1b3cf1812"} Dec 02 08:55:05 crc kubenswrapper[4895]: I1202 08:55:05.922827 4895 scope.go:117] "RemoveContainer" containerID="5f3bd2c356ea4faeb32e1c7dba83504ac381c8d421d130309e1326c5000e3875" Dec 02 08:55:14 crc kubenswrapper[4895]: I1202 08:55:14.404149 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-6849c5c4f-gh46j" Dec 02 08:55:21 crc kubenswrapper[4895]: I1202 08:55:21.611495 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-m8rvv"] Dec 02 08:55:21 crc kubenswrapper[4895]: E1202 08:55:21.612804 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46cd1ecc-48c3-4e7a-812e-1fd8e7ce3f9c" containerName="dnsmasq-dns" Dec 02 08:55:21 crc kubenswrapper[4895]: I1202 08:55:21.612826 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="46cd1ecc-48c3-4e7a-812e-1fd8e7ce3f9c" containerName="dnsmasq-dns" Dec 02 08:55:21 crc kubenswrapper[4895]: E1202 08:55:21.612843 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46cd1ecc-48c3-4e7a-812e-1fd8e7ce3f9c" containerName="init" Dec 02 08:55:21 crc kubenswrapper[4895]: I1202 08:55:21.612855 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="46cd1ecc-48c3-4e7a-812e-1fd8e7ce3f9c" containerName="init" Dec 02 08:55:21 crc kubenswrapper[4895]: I1202 08:55:21.613085 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="46cd1ecc-48c3-4e7a-812e-1fd8e7ce3f9c" containerName="dnsmasq-dns" Dec 02 08:55:21 crc 
kubenswrapper[4895]: I1202 08:55:21.613992 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-m8rvv" Dec 02 08:55:21 crc kubenswrapper[4895]: I1202 08:55:21.618757 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-4006-account-create-update-xvvqf"] Dec 02 08:55:21 crc kubenswrapper[4895]: I1202 08:55:21.620249 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-4006-account-create-update-xvvqf" Dec 02 08:55:21 crc kubenswrapper[4895]: I1202 08:55:21.622071 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Dec 02 08:55:21 crc kubenswrapper[4895]: I1202 08:55:21.626121 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-m8rvv"] Dec 02 08:55:21 crc kubenswrapper[4895]: I1202 08:55:21.634511 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-4006-account-create-update-xvvqf"] Dec 02 08:55:21 crc kubenswrapper[4895]: I1202 08:55:21.636368 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/38805e40-5f7b-4d9d-91fc-e70e17f03233-operator-scripts\") pod \"glance-db-create-m8rvv\" (UID: \"38805e40-5f7b-4d9d-91fc-e70e17f03233\") " pod="openstack/glance-db-create-m8rvv" Dec 02 08:55:21 crc kubenswrapper[4895]: I1202 08:55:21.636551 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/03f8b53e-94ed-4cd6-a863-c78912879a7f-operator-scripts\") pod \"glance-4006-account-create-update-xvvqf\" (UID: \"03f8b53e-94ed-4cd6-a863-c78912879a7f\") " pod="openstack/glance-4006-account-create-update-xvvqf" Dec 02 08:55:21 crc kubenswrapper[4895]: I1202 08:55:21.636877 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-xwm7l\" (UniqueName: \"kubernetes.io/projected/03f8b53e-94ed-4cd6-a863-c78912879a7f-kube-api-access-xwm7l\") pod \"glance-4006-account-create-update-xvvqf\" (UID: \"03f8b53e-94ed-4cd6-a863-c78912879a7f\") " pod="openstack/glance-4006-account-create-update-xvvqf" Dec 02 08:55:21 crc kubenswrapper[4895]: I1202 08:55:21.636915 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sz58n\" (UniqueName: \"kubernetes.io/projected/38805e40-5f7b-4d9d-91fc-e70e17f03233-kube-api-access-sz58n\") pod \"glance-db-create-m8rvv\" (UID: \"38805e40-5f7b-4d9d-91fc-e70e17f03233\") " pod="openstack/glance-db-create-m8rvv" Dec 02 08:55:21 crc kubenswrapper[4895]: I1202 08:55:21.739495 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/38805e40-5f7b-4d9d-91fc-e70e17f03233-operator-scripts\") pod \"glance-db-create-m8rvv\" (UID: \"38805e40-5f7b-4d9d-91fc-e70e17f03233\") " pod="openstack/glance-db-create-m8rvv" Dec 02 08:55:21 crc kubenswrapper[4895]: I1202 08:55:21.739598 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/03f8b53e-94ed-4cd6-a863-c78912879a7f-operator-scripts\") pod \"glance-4006-account-create-update-xvvqf\" (UID: \"03f8b53e-94ed-4cd6-a863-c78912879a7f\") " pod="openstack/glance-4006-account-create-update-xvvqf" Dec 02 08:55:21 crc kubenswrapper[4895]: I1202 08:55:21.739690 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwm7l\" (UniqueName: \"kubernetes.io/projected/03f8b53e-94ed-4cd6-a863-c78912879a7f-kube-api-access-xwm7l\") pod \"glance-4006-account-create-update-xvvqf\" (UID: \"03f8b53e-94ed-4cd6-a863-c78912879a7f\") " pod="openstack/glance-4006-account-create-update-xvvqf" Dec 02 08:55:21 crc kubenswrapper[4895]: I1202 08:55:21.739711 4895 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sz58n\" (UniqueName: \"kubernetes.io/projected/38805e40-5f7b-4d9d-91fc-e70e17f03233-kube-api-access-sz58n\") pod \"glance-db-create-m8rvv\" (UID: \"38805e40-5f7b-4d9d-91fc-e70e17f03233\") " pod="openstack/glance-db-create-m8rvv" Dec 02 08:55:21 crc kubenswrapper[4895]: I1202 08:55:21.740622 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/38805e40-5f7b-4d9d-91fc-e70e17f03233-operator-scripts\") pod \"glance-db-create-m8rvv\" (UID: \"38805e40-5f7b-4d9d-91fc-e70e17f03233\") " pod="openstack/glance-db-create-m8rvv" Dec 02 08:55:21 crc kubenswrapper[4895]: I1202 08:55:21.740640 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/03f8b53e-94ed-4cd6-a863-c78912879a7f-operator-scripts\") pod \"glance-4006-account-create-update-xvvqf\" (UID: \"03f8b53e-94ed-4cd6-a863-c78912879a7f\") " pod="openstack/glance-4006-account-create-update-xvvqf" Dec 02 08:55:21 crc kubenswrapper[4895]: I1202 08:55:21.759380 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwm7l\" (UniqueName: \"kubernetes.io/projected/03f8b53e-94ed-4cd6-a863-c78912879a7f-kube-api-access-xwm7l\") pod \"glance-4006-account-create-update-xvvqf\" (UID: \"03f8b53e-94ed-4cd6-a863-c78912879a7f\") " pod="openstack/glance-4006-account-create-update-xvvqf" Dec 02 08:55:21 crc kubenswrapper[4895]: I1202 08:55:21.759396 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sz58n\" (UniqueName: \"kubernetes.io/projected/38805e40-5f7b-4d9d-91fc-e70e17f03233-kube-api-access-sz58n\") pod \"glance-db-create-m8rvv\" (UID: \"38805e40-5f7b-4d9d-91fc-e70e17f03233\") " pod="openstack/glance-db-create-m8rvv" Dec 02 08:55:21 crc kubenswrapper[4895]: I1202 08:55:21.947643 4895 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openstack/glance-db-create-m8rvv" Dec 02 08:55:21 crc kubenswrapper[4895]: I1202 08:55:21.958873 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-4006-account-create-update-xvvqf" Dec 02 08:55:22 crc kubenswrapper[4895]: I1202 08:55:22.494077 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-m8rvv"] Dec 02 08:55:22 crc kubenswrapper[4895]: I1202 08:55:22.501732 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-4006-account-create-update-xvvqf"] Dec 02 08:55:23 crc kubenswrapper[4895]: I1202 08:55:23.155448 4895 generic.go:334] "Generic (PLEG): container finished" podID="38805e40-5f7b-4d9d-91fc-e70e17f03233" containerID="021e0aeb1985d12cd5381159443f360933eadaed7bc26a2ccaa8c0e3e9cdc374" exitCode=0 Dec 02 08:55:23 crc kubenswrapper[4895]: I1202 08:55:23.155762 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-m8rvv" event={"ID":"38805e40-5f7b-4d9d-91fc-e70e17f03233","Type":"ContainerDied","Data":"021e0aeb1985d12cd5381159443f360933eadaed7bc26a2ccaa8c0e3e9cdc374"} Dec 02 08:55:23 crc kubenswrapper[4895]: I1202 08:55:23.155963 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-m8rvv" event={"ID":"38805e40-5f7b-4d9d-91fc-e70e17f03233","Type":"ContainerStarted","Data":"c504415755b6ed38016142af76610f46f1b6a0cfaf3d6425473d4b3483651194"} Dec 02 08:55:23 crc kubenswrapper[4895]: I1202 08:55:23.157668 4895 generic.go:334] "Generic (PLEG): container finished" podID="03f8b53e-94ed-4cd6-a863-c78912879a7f" containerID="cab76992dee572ef2fccea4bd55f2f55d7beee4f58370f7d39c4e77688a8710d" exitCode=0 Dec 02 08:55:23 crc kubenswrapper[4895]: I1202 08:55:23.157705 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-4006-account-create-update-xvvqf" 
event={"ID":"03f8b53e-94ed-4cd6-a863-c78912879a7f","Type":"ContainerDied","Data":"cab76992dee572ef2fccea4bd55f2f55d7beee4f58370f7d39c4e77688a8710d"} Dec 02 08:55:23 crc kubenswrapper[4895]: I1202 08:55:23.157726 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-4006-account-create-update-xvvqf" event={"ID":"03f8b53e-94ed-4cd6-a863-c78912879a7f","Type":"ContainerStarted","Data":"9b6194c04785da81ae6ef1f786c82d6f04caf39e9b597f17408045b7ba53bb24"} Dec 02 08:55:24 crc kubenswrapper[4895]: I1202 08:55:24.570695 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-4006-account-create-update-xvvqf" Dec 02 08:55:24 crc kubenswrapper[4895]: I1202 08:55:24.579458 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-m8rvv" Dec 02 08:55:24 crc kubenswrapper[4895]: I1202 08:55:24.608861 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xwm7l\" (UniqueName: \"kubernetes.io/projected/03f8b53e-94ed-4cd6-a863-c78912879a7f-kube-api-access-xwm7l\") pod \"03f8b53e-94ed-4cd6-a863-c78912879a7f\" (UID: \"03f8b53e-94ed-4cd6-a863-c78912879a7f\") " Dec 02 08:55:24 crc kubenswrapper[4895]: I1202 08:55:24.608916 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/38805e40-5f7b-4d9d-91fc-e70e17f03233-operator-scripts\") pod \"38805e40-5f7b-4d9d-91fc-e70e17f03233\" (UID: \"38805e40-5f7b-4d9d-91fc-e70e17f03233\") " Dec 02 08:55:24 crc kubenswrapper[4895]: I1202 08:55:24.610008 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38805e40-5f7b-4d9d-91fc-e70e17f03233-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "38805e40-5f7b-4d9d-91fc-e70e17f03233" (UID: "38805e40-5f7b-4d9d-91fc-e70e17f03233"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:55:24 crc kubenswrapper[4895]: I1202 08:55:24.660766 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03f8b53e-94ed-4cd6-a863-c78912879a7f-kube-api-access-xwm7l" (OuterVolumeSpecName: "kube-api-access-xwm7l") pod "03f8b53e-94ed-4cd6-a863-c78912879a7f" (UID: "03f8b53e-94ed-4cd6-a863-c78912879a7f"). InnerVolumeSpecName "kube-api-access-xwm7l". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:55:24 crc kubenswrapper[4895]: I1202 08:55:24.710234 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sz58n\" (UniqueName: \"kubernetes.io/projected/38805e40-5f7b-4d9d-91fc-e70e17f03233-kube-api-access-sz58n\") pod \"38805e40-5f7b-4d9d-91fc-e70e17f03233\" (UID: \"38805e40-5f7b-4d9d-91fc-e70e17f03233\") " Dec 02 08:55:24 crc kubenswrapper[4895]: I1202 08:55:24.710447 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/03f8b53e-94ed-4cd6-a863-c78912879a7f-operator-scripts\") pod \"03f8b53e-94ed-4cd6-a863-c78912879a7f\" (UID: \"03f8b53e-94ed-4cd6-a863-c78912879a7f\") " Dec 02 08:55:24 crc kubenswrapper[4895]: I1202 08:55:24.710997 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xwm7l\" (UniqueName: \"kubernetes.io/projected/03f8b53e-94ed-4cd6-a863-c78912879a7f-kube-api-access-xwm7l\") on node \"crc\" DevicePath \"\"" Dec 02 08:55:24 crc kubenswrapper[4895]: I1202 08:55:24.711026 4895 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/38805e40-5f7b-4d9d-91fc-e70e17f03233-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 08:55:24 crc kubenswrapper[4895]: I1202 08:55:24.711508 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/03f8b53e-94ed-4cd6-a863-c78912879a7f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "03f8b53e-94ed-4cd6-a863-c78912879a7f" (UID: "03f8b53e-94ed-4cd6-a863-c78912879a7f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:55:24 crc kubenswrapper[4895]: I1202 08:55:24.715166 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38805e40-5f7b-4d9d-91fc-e70e17f03233-kube-api-access-sz58n" (OuterVolumeSpecName: "kube-api-access-sz58n") pod "38805e40-5f7b-4d9d-91fc-e70e17f03233" (UID: "38805e40-5f7b-4d9d-91fc-e70e17f03233"). InnerVolumeSpecName "kube-api-access-sz58n". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:55:24 crc kubenswrapper[4895]: I1202 08:55:24.812782 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sz58n\" (UniqueName: \"kubernetes.io/projected/38805e40-5f7b-4d9d-91fc-e70e17f03233-kube-api-access-sz58n\") on node \"crc\" DevicePath \"\"" Dec 02 08:55:24 crc kubenswrapper[4895]: I1202 08:55:24.812835 4895 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/03f8b53e-94ed-4cd6-a863-c78912879a7f-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 08:55:25 crc kubenswrapper[4895]: I1202 08:55:25.176736 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-m8rvv" event={"ID":"38805e40-5f7b-4d9d-91fc-e70e17f03233","Type":"ContainerDied","Data":"c504415755b6ed38016142af76610f46f1b6a0cfaf3d6425473d4b3483651194"} Dec 02 08:55:25 crc kubenswrapper[4895]: I1202 08:55:25.176877 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c504415755b6ed38016142af76610f46f1b6a0cfaf3d6425473d4b3483651194" Dec 02 08:55:25 crc kubenswrapper[4895]: I1202 08:55:25.176948 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-m8rvv" Dec 02 08:55:25 crc kubenswrapper[4895]: I1202 08:55:25.177717 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-4006-account-create-update-xvvqf" event={"ID":"03f8b53e-94ed-4cd6-a863-c78912879a7f","Type":"ContainerDied","Data":"9b6194c04785da81ae6ef1f786c82d6f04caf39e9b597f17408045b7ba53bb24"} Dec 02 08:55:25 crc kubenswrapper[4895]: I1202 08:55:25.177752 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9b6194c04785da81ae6ef1f786c82d6f04caf39e9b597f17408045b7ba53bb24" Dec 02 08:55:25 crc kubenswrapper[4895]: I1202 08:55:25.177788 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-4006-account-create-update-xvvqf" Dec 02 08:55:26 crc kubenswrapper[4895]: I1202 08:55:26.831878 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-zmtns"] Dec 02 08:55:26 crc kubenswrapper[4895]: E1202 08:55:26.832665 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03f8b53e-94ed-4cd6-a863-c78912879a7f" containerName="mariadb-account-create-update" Dec 02 08:55:26 crc kubenswrapper[4895]: I1202 08:55:26.832680 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="03f8b53e-94ed-4cd6-a863-c78912879a7f" containerName="mariadb-account-create-update" Dec 02 08:55:26 crc kubenswrapper[4895]: E1202 08:55:26.832703 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38805e40-5f7b-4d9d-91fc-e70e17f03233" containerName="mariadb-database-create" Dec 02 08:55:26 crc kubenswrapper[4895]: I1202 08:55:26.832710 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="38805e40-5f7b-4d9d-91fc-e70e17f03233" containerName="mariadb-database-create" Dec 02 08:55:26 crc kubenswrapper[4895]: I1202 08:55:26.832964 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="03f8b53e-94ed-4cd6-a863-c78912879a7f" 
containerName="mariadb-account-create-update" Dec 02 08:55:26 crc kubenswrapper[4895]: I1202 08:55:26.832980 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="38805e40-5f7b-4d9d-91fc-e70e17f03233" containerName="mariadb-database-create" Dec 02 08:55:26 crc kubenswrapper[4895]: I1202 08:55:26.833730 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-zmtns" Dec 02 08:55:26 crc kubenswrapper[4895]: I1202 08:55:26.836334 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-thr6k" Dec 02 08:55:26 crc kubenswrapper[4895]: I1202 08:55:26.846583 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Dec 02 08:55:26 crc kubenswrapper[4895]: I1202 08:55:26.848207 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-zmtns"] Dec 02 08:55:26 crc kubenswrapper[4895]: I1202 08:55:26.960921 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/49929bec-a95b-40ab-b9bd-b162ecbee391-db-sync-config-data\") pod \"glance-db-sync-zmtns\" (UID: \"49929bec-a95b-40ab-b9bd-b162ecbee391\") " pod="openstack/glance-db-sync-zmtns" Dec 02 08:55:26 crc kubenswrapper[4895]: I1202 08:55:26.960988 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49929bec-a95b-40ab-b9bd-b162ecbee391-combined-ca-bundle\") pod \"glance-db-sync-zmtns\" (UID: \"49929bec-a95b-40ab-b9bd-b162ecbee391\") " pod="openstack/glance-db-sync-zmtns" Dec 02 08:55:26 crc kubenswrapper[4895]: I1202 08:55:26.961209 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58nhf\" (UniqueName: 
\"kubernetes.io/projected/49929bec-a95b-40ab-b9bd-b162ecbee391-kube-api-access-58nhf\") pod \"glance-db-sync-zmtns\" (UID: \"49929bec-a95b-40ab-b9bd-b162ecbee391\") " pod="openstack/glance-db-sync-zmtns" Dec 02 08:55:26 crc kubenswrapper[4895]: I1202 08:55:26.961470 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49929bec-a95b-40ab-b9bd-b162ecbee391-config-data\") pod \"glance-db-sync-zmtns\" (UID: \"49929bec-a95b-40ab-b9bd-b162ecbee391\") " pod="openstack/glance-db-sync-zmtns" Dec 02 08:55:27 crc kubenswrapper[4895]: I1202 08:55:27.063062 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49929bec-a95b-40ab-b9bd-b162ecbee391-combined-ca-bundle\") pod \"glance-db-sync-zmtns\" (UID: \"49929bec-a95b-40ab-b9bd-b162ecbee391\") " pod="openstack/glance-db-sync-zmtns" Dec 02 08:55:27 crc kubenswrapper[4895]: I1202 08:55:27.063139 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58nhf\" (UniqueName: \"kubernetes.io/projected/49929bec-a95b-40ab-b9bd-b162ecbee391-kube-api-access-58nhf\") pod \"glance-db-sync-zmtns\" (UID: \"49929bec-a95b-40ab-b9bd-b162ecbee391\") " pod="openstack/glance-db-sync-zmtns" Dec 02 08:55:27 crc kubenswrapper[4895]: I1202 08:55:27.063210 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49929bec-a95b-40ab-b9bd-b162ecbee391-config-data\") pod \"glance-db-sync-zmtns\" (UID: \"49929bec-a95b-40ab-b9bd-b162ecbee391\") " pod="openstack/glance-db-sync-zmtns" Dec 02 08:55:27 crc kubenswrapper[4895]: I1202 08:55:27.063319 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/49929bec-a95b-40ab-b9bd-b162ecbee391-db-sync-config-data\") pod 
\"glance-db-sync-zmtns\" (UID: \"49929bec-a95b-40ab-b9bd-b162ecbee391\") " pod="openstack/glance-db-sync-zmtns" Dec 02 08:55:27 crc kubenswrapper[4895]: I1202 08:55:27.068306 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49929bec-a95b-40ab-b9bd-b162ecbee391-config-data\") pod \"glance-db-sync-zmtns\" (UID: \"49929bec-a95b-40ab-b9bd-b162ecbee391\") " pod="openstack/glance-db-sync-zmtns" Dec 02 08:55:27 crc kubenswrapper[4895]: I1202 08:55:27.068347 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/49929bec-a95b-40ab-b9bd-b162ecbee391-db-sync-config-data\") pod \"glance-db-sync-zmtns\" (UID: \"49929bec-a95b-40ab-b9bd-b162ecbee391\") " pod="openstack/glance-db-sync-zmtns" Dec 02 08:55:27 crc kubenswrapper[4895]: I1202 08:55:27.069365 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49929bec-a95b-40ab-b9bd-b162ecbee391-combined-ca-bundle\") pod \"glance-db-sync-zmtns\" (UID: \"49929bec-a95b-40ab-b9bd-b162ecbee391\") " pod="openstack/glance-db-sync-zmtns" Dec 02 08:55:27 crc kubenswrapper[4895]: I1202 08:55:27.081719 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58nhf\" (UniqueName: \"kubernetes.io/projected/49929bec-a95b-40ab-b9bd-b162ecbee391-kube-api-access-58nhf\") pod \"glance-db-sync-zmtns\" (UID: \"49929bec-a95b-40ab-b9bd-b162ecbee391\") " pod="openstack/glance-db-sync-zmtns" Dec 02 08:55:27 crc kubenswrapper[4895]: I1202 08:55:27.164632 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-zmtns" Dec 02 08:55:27 crc kubenswrapper[4895]: I1202 08:55:27.774598 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-zmtns"] Dec 02 08:55:28 crc kubenswrapper[4895]: I1202 08:55:28.204175 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-zmtns" event={"ID":"49929bec-a95b-40ab-b9bd-b162ecbee391","Type":"ContainerStarted","Data":"70e0278d1346515fbd94e5f768dbe34e6add7088e35e2fe33a64ca00a95ae57e"} Dec 02 08:55:29 crc kubenswrapper[4895]: I1202 08:55:29.216026 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-zmtns" event={"ID":"49929bec-a95b-40ab-b9bd-b162ecbee391","Type":"ContainerStarted","Data":"8072ee769b8e8452e2eccecfc680da0cab8d9f6d03cba69fb765cff36cb4e93c"} Dec 02 08:55:29 crc kubenswrapper[4895]: I1202 08:55:29.238481 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-zmtns" podStartSLOduration=3.238453334 podStartE2EDuration="3.238453334s" podCreationTimestamp="2025-12-02 08:55:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:55:29.233977354 +0000 UTC m=+5540.404836977" watchObservedRunningTime="2025-12-02 08:55:29.238453334 +0000 UTC m=+5540.409312957" Dec 02 08:55:32 crc kubenswrapper[4895]: I1202 08:55:32.255604 4895 generic.go:334] "Generic (PLEG): container finished" podID="49929bec-a95b-40ab-b9bd-b162ecbee391" containerID="8072ee769b8e8452e2eccecfc680da0cab8d9f6d03cba69fb765cff36cb4e93c" exitCode=0 Dec 02 08:55:32 crc kubenswrapper[4895]: I1202 08:55:32.255706 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-zmtns" event={"ID":"49929bec-a95b-40ab-b9bd-b162ecbee391","Type":"ContainerDied","Data":"8072ee769b8e8452e2eccecfc680da0cab8d9f6d03cba69fb765cff36cb4e93c"} Dec 02 08:55:33 crc kubenswrapper[4895]: 
I1202 08:55:33.625064 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-zmtns" Dec 02 08:55:33 crc kubenswrapper[4895]: I1202 08:55:33.809928 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49929bec-a95b-40ab-b9bd-b162ecbee391-config-data\") pod \"49929bec-a95b-40ab-b9bd-b162ecbee391\" (UID: \"49929bec-a95b-40ab-b9bd-b162ecbee391\") " Dec 02 08:55:33 crc kubenswrapper[4895]: I1202 08:55:33.810478 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/49929bec-a95b-40ab-b9bd-b162ecbee391-db-sync-config-data\") pod \"49929bec-a95b-40ab-b9bd-b162ecbee391\" (UID: \"49929bec-a95b-40ab-b9bd-b162ecbee391\") " Dec 02 08:55:33 crc kubenswrapper[4895]: I1202 08:55:33.810529 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49929bec-a95b-40ab-b9bd-b162ecbee391-combined-ca-bundle\") pod \"49929bec-a95b-40ab-b9bd-b162ecbee391\" (UID: \"49929bec-a95b-40ab-b9bd-b162ecbee391\") " Dec 02 08:55:33 crc kubenswrapper[4895]: I1202 08:55:33.810611 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-58nhf\" (UniqueName: \"kubernetes.io/projected/49929bec-a95b-40ab-b9bd-b162ecbee391-kube-api-access-58nhf\") pod \"49929bec-a95b-40ab-b9bd-b162ecbee391\" (UID: \"49929bec-a95b-40ab-b9bd-b162ecbee391\") " Dec 02 08:55:33 crc kubenswrapper[4895]: I1202 08:55:33.817102 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49929bec-a95b-40ab-b9bd-b162ecbee391-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "49929bec-a95b-40ab-b9bd-b162ecbee391" (UID: "49929bec-a95b-40ab-b9bd-b162ecbee391"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:55:33 crc kubenswrapper[4895]: I1202 08:55:33.817110 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49929bec-a95b-40ab-b9bd-b162ecbee391-kube-api-access-58nhf" (OuterVolumeSpecName: "kube-api-access-58nhf") pod "49929bec-a95b-40ab-b9bd-b162ecbee391" (UID: "49929bec-a95b-40ab-b9bd-b162ecbee391"). InnerVolumeSpecName "kube-api-access-58nhf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:55:33 crc kubenswrapper[4895]: I1202 08:55:33.837064 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49929bec-a95b-40ab-b9bd-b162ecbee391-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "49929bec-a95b-40ab-b9bd-b162ecbee391" (UID: "49929bec-a95b-40ab-b9bd-b162ecbee391"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:55:33 crc kubenswrapper[4895]: I1202 08:55:33.862768 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49929bec-a95b-40ab-b9bd-b162ecbee391-config-data" (OuterVolumeSpecName: "config-data") pod "49929bec-a95b-40ab-b9bd-b162ecbee391" (UID: "49929bec-a95b-40ab-b9bd-b162ecbee391"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:55:33 crc kubenswrapper[4895]: I1202 08:55:33.913153 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49929bec-a95b-40ab-b9bd-b162ecbee391-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 08:55:33 crc kubenswrapper[4895]: I1202 08:55:33.913192 4895 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/49929bec-a95b-40ab-b9bd-b162ecbee391-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 08:55:33 crc kubenswrapper[4895]: I1202 08:55:33.913205 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49929bec-a95b-40ab-b9bd-b162ecbee391-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 08:55:33 crc kubenswrapper[4895]: I1202 08:55:33.913217 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-58nhf\" (UniqueName: \"kubernetes.io/projected/49929bec-a95b-40ab-b9bd-b162ecbee391-kube-api-access-58nhf\") on node \"crc\" DevicePath \"\"" Dec 02 08:55:34 crc kubenswrapper[4895]: I1202 08:55:34.278975 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-zmtns" event={"ID":"49929bec-a95b-40ab-b9bd-b162ecbee391","Type":"ContainerDied","Data":"70e0278d1346515fbd94e5f768dbe34e6add7088e35e2fe33a64ca00a95ae57e"} Dec 02 08:55:34 crc kubenswrapper[4895]: I1202 08:55:34.279026 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="70e0278d1346515fbd94e5f768dbe34e6add7088e35e2fe33a64ca00a95ae57e" Dec 02 08:55:34 crc kubenswrapper[4895]: I1202 08:55:34.279071 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-zmtns" Dec 02 08:55:34 crc kubenswrapper[4895]: I1202 08:55:34.598310 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 02 08:55:34 crc kubenswrapper[4895]: E1202 08:55:34.598767 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49929bec-a95b-40ab-b9bd-b162ecbee391" containerName="glance-db-sync" Dec 02 08:55:34 crc kubenswrapper[4895]: I1202 08:55:34.598784 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="49929bec-a95b-40ab-b9bd-b162ecbee391" containerName="glance-db-sync" Dec 02 08:55:34 crc kubenswrapper[4895]: I1202 08:55:34.598964 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="49929bec-a95b-40ab-b9bd-b162ecbee391" containerName="glance-db-sync" Dec 02 08:55:34 crc kubenswrapper[4895]: I1202 08:55:34.600274 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 02 08:55:34 crc kubenswrapper[4895]: I1202 08:55:34.604143 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-thr6k" Dec 02 08:55:34 crc kubenswrapper[4895]: I1202 08:55:34.604406 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Dec 02 08:55:34 crc kubenswrapper[4895]: I1202 08:55:34.604518 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Dec 02 08:55:34 crc kubenswrapper[4895]: I1202 08:55:34.605698 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 02 08:55:34 crc kubenswrapper[4895]: I1202 08:55:34.617359 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 02 08:55:34 crc kubenswrapper[4895]: I1202 08:55:34.728869 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05476608-670a-4a7c-8e69-43c131014737-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"05476608-670a-4a7c-8e69-43c131014737\") " pod="openstack/glance-default-external-api-0" Dec 02 08:55:34 crc kubenswrapper[4895]: I1202 08:55:34.729267 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhf96\" (UniqueName: \"kubernetes.io/projected/05476608-670a-4a7c-8e69-43c131014737-kube-api-access-hhf96\") pod \"glance-default-external-api-0\" (UID: \"05476608-670a-4a7c-8e69-43c131014737\") " pod="openstack/glance-default-external-api-0" Dec 02 08:55:34 crc kubenswrapper[4895]: I1202 08:55:34.729290 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/05476608-670a-4a7c-8e69-43c131014737-scripts\") pod \"glance-default-external-api-0\" (UID: \"05476608-670a-4a7c-8e69-43c131014737\") " pod="openstack/glance-default-external-api-0" Dec 02 08:55:34 crc kubenswrapper[4895]: I1202 08:55:34.729312 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/05476608-670a-4a7c-8e69-43c131014737-ceph\") pod \"glance-default-external-api-0\" (UID: \"05476608-670a-4a7c-8e69-43c131014737\") " pod="openstack/glance-default-external-api-0" Dec 02 08:55:34 crc kubenswrapper[4895]: I1202 08:55:34.729352 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/05476608-670a-4a7c-8e69-43c131014737-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"05476608-670a-4a7c-8e69-43c131014737\") " pod="openstack/glance-default-external-api-0" Dec 02 08:55:34 crc kubenswrapper[4895]: I1202 08:55:34.729407 4895 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05476608-670a-4a7c-8e69-43c131014737-config-data\") pod \"glance-default-external-api-0\" (UID: \"05476608-670a-4a7c-8e69-43c131014737\") " pod="openstack/glance-default-external-api-0" Dec 02 08:55:34 crc kubenswrapper[4895]: I1202 08:55:34.729431 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/05476608-670a-4a7c-8e69-43c131014737-logs\") pod \"glance-default-external-api-0\" (UID: \"05476608-670a-4a7c-8e69-43c131014737\") " pod="openstack/glance-default-external-api-0" Dec 02 08:55:34 crc kubenswrapper[4895]: I1202 08:55:34.738670 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-785b8787c9-tnskb"] Dec 02 08:55:34 crc kubenswrapper[4895]: I1202 08:55:34.740517 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785b8787c9-tnskb" Dec 02 08:55:34 crc kubenswrapper[4895]: I1202 08:55:34.751645 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785b8787c9-tnskb"] Dec 02 08:55:34 crc kubenswrapper[4895]: I1202 08:55:34.838838 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05476608-670a-4a7c-8e69-43c131014737-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"05476608-670a-4a7c-8e69-43c131014737\") " pod="openstack/glance-default-external-api-0" Dec 02 08:55:34 crc kubenswrapper[4895]: I1202 08:55:34.838963 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hhf96\" (UniqueName: \"kubernetes.io/projected/05476608-670a-4a7c-8e69-43c131014737-kube-api-access-hhf96\") pod \"glance-default-external-api-0\" (UID: \"05476608-670a-4a7c-8e69-43c131014737\") " 
pod="openstack/glance-default-external-api-0" Dec 02 08:55:34 crc kubenswrapper[4895]: I1202 08:55:34.839388 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/05476608-670a-4a7c-8e69-43c131014737-scripts\") pod \"glance-default-external-api-0\" (UID: \"05476608-670a-4a7c-8e69-43c131014737\") " pod="openstack/glance-default-external-api-0" Dec 02 08:55:34 crc kubenswrapper[4895]: I1202 08:55:34.839444 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/05476608-670a-4a7c-8e69-43c131014737-ceph\") pod \"glance-default-external-api-0\" (UID: \"05476608-670a-4a7c-8e69-43c131014737\") " pod="openstack/glance-default-external-api-0" Dec 02 08:55:34 crc kubenswrapper[4895]: I1202 08:55:34.840154 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/05476608-670a-4a7c-8e69-43c131014737-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"05476608-670a-4a7c-8e69-43c131014737\") " pod="openstack/glance-default-external-api-0" Dec 02 08:55:34 crc kubenswrapper[4895]: I1202 08:55:34.840232 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05476608-670a-4a7c-8e69-43c131014737-config-data\") pod \"glance-default-external-api-0\" (UID: \"05476608-670a-4a7c-8e69-43c131014737\") " pod="openstack/glance-default-external-api-0" Dec 02 08:55:34 crc kubenswrapper[4895]: I1202 08:55:34.840281 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/05476608-670a-4a7c-8e69-43c131014737-logs\") pod \"glance-default-external-api-0\" (UID: \"05476608-670a-4a7c-8e69-43c131014737\") " pod="openstack/glance-default-external-api-0" Dec 02 08:55:34 crc kubenswrapper[4895]: I1202 08:55:34.841027 4895 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/05476608-670a-4a7c-8e69-43c131014737-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"05476608-670a-4a7c-8e69-43c131014737\") " pod="openstack/glance-default-external-api-0" Dec 02 08:55:34 crc kubenswrapper[4895]: I1202 08:55:34.841084 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/05476608-670a-4a7c-8e69-43c131014737-logs\") pod \"glance-default-external-api-0\" (UID: \"05476608-670a-4a7c-8e69-43c131014737\") " pod="openstack/glance-default-external-api-0" Dec 02 08:55:34 crc kubenswrapper[4895]: I1202 08:55:34.848481 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05476608-670a-4a7c-8e69-43c131014737-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"05476608-670a-4a7c-8e69-43c131014737\") " pod="openstack/glance-default-external-api-0" Dec 02 08:55:34 crc kubenswrapper[4895]: I1202 08:55:34.850437 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05476608-670a-4a7c-8e69-43c131014737-config-data\") pod \"glance-default-external-api-0\" (UID: \"05476608-670a-4a7c-8e69-43c131014737\") " pod="openstack/glance-default-external-api-0" Dec 02 08:55:34 crc kubenswrapper[4895]: I1202 08:55:34.859924 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/05476608-670a-4a7c-8e69-43c131014737-ceph\") pod \"glance-default-external-api-0\" (UID: \"05476608-670a-4a7c-8e69-43c131014737\") " pod="openstack/glance-default-external-api-0" Dec 02 08:55:34 crc kubenswrapper[4895]: I1202 08:55:34.860483 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/05476608-670a-4a7c-8e69-43c131014737-scripts\") pod \"glance-default-external-api-0\" (UID: \"05476608-670a-4a7c-8e69-43c131014737\") " pod="openstack/glance-default-external-api-0" Dec 02 08:55:34 crc kubenswrapper[4895]: I1202 08:55:34.873646 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhf96\" (UniqueName: \"kubernetes.io/projected/05476608-670a-4a7c-8e69-43c131014737-kube-api-access-hhf96\") pod \"glance-default-external-api-0\" (UID: \"05476608-670a-4a7c-8e69-43c131014737\") " pod="openstack/glance-default-external-api-0" Dec 02 08:55:34 crc kubenswrapper[4895]: I1202 08:55:34.919403 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 02 08:55:34 crc kubenswrapper[4895]: I1202 08:55:34.942630 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c20caf23-05bb-4108-aed7-9a676667d36c-dns-svc\") pod \"dnsmasq-dns-785b8787c9-tnskb\" (UID: \"c20caf23-05bb-4108-aed7-9a676667d36c\") " pod="openstack/dnsmasq-dns-785b8787c9-tnskb" Dec 02 08:55:34 crc kubenswrapper[4895]: I1202 08:55:34.942696 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c20caf23-05bb-4108-aed7-9a676667d36c-ovsdbserver-nb\") pod \"dnsmasq-dns-785b8787c9-tnskb\" (UID: \"c20caf23-05bb-4108-aed7-9a676667d36c\") " pod="openstack/dnsmasq-dns-785b8787c9-tnskb" Dec 02 08:55:34 crc kubenswrapper[4895]: I1202 08:55:34.942724 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gp656\" (UniqueName: \"kubernetes.io/projected/c20caf23-05bb-4108-aed7-9a676667d36c-kube-api-access-gp656\") pod \"dnsmasq-dns-785b8787c9-tnskb\" (UID: \"c20caf23-05bb-4108-aed7-9a676667d36c\") " 
pod="openstack/dnsmasq-dns-785b8787c9-tnskb" Dec 02 08:55:34 crc kubenswrapper[4895]: I1202 08:55:34.942802 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c20caf23-05bb-4108-aed7-9a676667d36c-config\") pod \"dnsmasq-dns-785b8787c9-tnskb\" (UID: \"c20caf23-05bb-4108-aed7-9a676667d36c\") " pod="openstack/dnsmasq-dns-785b8787c9-tnskb" Dec 02 08:55:34 crc kubenswrapper[4895]: I1202 08:55:34.943489 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c20caf23-05bb-4108-aed7-9a676667d36c-ovsdbserver-sb\") pod \"dnsmasq-dns-785b8787c9-tnskb\" (UID: \"c20caf23-05bb-4108-aed7-9a676667d36c\") " pod="openstack/dnsmasq-dns-785b8787c9-tnskb" Dec 02 08:55:35 crc kubenswrapper[4895]: I1202 08:55:35.045518 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c20caf23-05bb-4108-aed7-9a676667d36c-ovsdbserver-sb\") pod \"dnsmasq-dns-785b8787c9-tnskb\" (UID: \"c20caf23-05bb-4108-aed7-9a676667d36c\") " pod="openstack/dnsmasq-dns-785b8787c9-tnskb" Dec 02 08:55:35 crc kubenswrapper[4895]: I1202 08:55:35.045708 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c20caf23-05bb-4108-aed7-9a676667d36c-dns-svc\") pod \"dnsmasq-dns-785b8787c9-tnskb\" (UID: \"c20caf23-05bb-4108-aed7-9a676667d36c\") " pod="openstack/dnsmasq-dns-785b8787c9-tnskb" Dec 02 08:55:35 crc kubenswrapper[4895]: I1202 08:55:35.045754 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c20caf23-05bb-4108-aed7-9a676667d36c-ovsdbserver-nb\") pod \"dnsmasq-dns-785b8787c9-tnskb\" (UID: \"c20caf23-05bb-4108-aed7-9a676667d36c\") " pod="openstack/dnsmasq-dns-785b8787c9-tnskb" 
Dec 02 08:55:35 crc kubenswrapper[4895]: I1202 08:55:35.045777 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gp656\" (UniqueName: \"kubernetes.io/projected/c20caf23-05bb-4108-aed7-9a676667d36c-kube-api-access-gp656\") pod \"dnsmasq-dns-785b8787c9-tnskb\" (UID: \"c20caf23-05bb-4108-aed7-9a676667d36c\") " pod="openstack/dnsmasq-dns-785b8787c9-tnskb" Dec 02 08:55:35 crc kubenswrapper[4895]: I1202 08:55:35.045853 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c20caf23-05bb-4108-aed7-9a676667d36c-config\") pod \"dnsmasq-dns-785b8787c9-tnskb\" (UID: \"c20caf23-05bb-4108-aed7-9a676667d36c\") " pod="openstack/dnsmasq-dns-785b8787c9-tnskb" Dec 02 08:55:35 crc kubenswrapper[4895]: I1202 08:55:35.047619 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c20caf23-05bb-4108-aed7-9a676667d36c-config\") pod \"dnsmasq-dns-785b8787c9-tnskb\" (UID: \"c20caf23-05bb-4108-aed7-9a676667d36c\") " pod="openstack/dnsmasq-dns-785b8787c9-tnskb" Dec 02 08:55:35 crc kubenswrapper[4895]: I1202 08:55:35.047915 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c20caf23-05bb-4108-aed7-9a676667d36c-ovsdbserver-nb\") pod \"dnsmasq-dns-785b8787c9-tnskb\" (UID: \"c20caf23-05bb-4108-aed7-9a676667d36c\") " pod="openstack/dnsmasq-dns-785b8787c9-tnskb" Dec 02 08:55:35 crc kubenswrapper[4895]: I1202 08:55:35.047944 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c20caf23-05bb-4108-aed7-9a676667d36c-ovsdbserver-sb\") pod \"dnsmasq-dns-785b8787c9-tnskb\" (UID: \"c20caf23-05bb-4108-aed7-9a676667d36c\") " pod="openstack/dnsmasq-dns-785b8787c9-tnskb" Dec 02 08:55:35 crc kubenswrapper[4895]: I1202 08:55:35.048343 4895 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c20caf23-05bb-4108-aed7-9a676667d36c-dns-svc\") pod \"dnsmasq-dns-785b8787c9-tnskb\" (UID: \"c20caf23-05bb-4108-aed7-9a676667d36c\") " pod="openstack/dnsmasq-dns-785b8787c9-tnskb" Dec 02 08:55:35 crc kubenswrapper[4895]: I1202 08:55:35.087650 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gp656\" (UniqueName: \"kubernetes.io/projected/c20caf23-05bb-4108-aed7-9a676667d36c-kube-api-access-gp656\") pod \"dnsmasq-dns-785b8787c9-tnskb\" (UID: \"c20caf23-05bb-4108-aed7-9a676667d36c\") " pod="openstack/dnsmasq-dns-785b8787c9-tnskb" Dec 02 08:55:35 crc kubenswrapper[4895]: I1202 08:55:35.189574 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 02 08:55:35 crc kubenswrapper[4895]: I1202 08:55:35.191526 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 02 08:55:35 crc kubenswrapper[4895]: I1202 08:55:35.202102 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 02 08:55:35 crc kubenswrapper[4895]: I1202 08:55:35.204953 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 02 08:55:35 crc kubenswrapper[4895]: I1202 08:55:35.261184 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd13fbd9-1640-45e7-ba4d-10c3e459e7f0-scripts\") pod \"glance-default-internal-api-0\" (UID: \"cd13fbd9-1640-45e7-ba4d-10c3e459e7f0\") " pod="openstack/glance-default-internal-api-0" Dec 02 08:55:35 crc kubenswrapper[4895]: I1202 08:55:35.261266 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/cd13fbd9-1640-45e7-ba4d-10c3e459e7f0-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"cd13fbd9-1640-45e7-ba4d-10c3e459e7f0\") " pod="openstack/glance-default-internal-api-0" Dec 02 08:55:35 crc kubenswrapper[4895]: I1202 08:55:35.261319 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/cd13fbd9-1640-45e7-ba4d-10c3e459e7f0-ceph\") pod \"glance-default-internal-api-0\" (UID: \"cd13fbd9-1640-45e7-ba4d-10c3e459e7f0\") " pod="openstack/glance-default-internal-api-0" Dec 02 08:55:35 crc kubenswrapper[4895]: I1202 08:55:35.261370 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd13fbd9-1640-45e7-ba4d-10c3e459e7f0-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"cd13fbd9-1640-45e7-ba4d-10c3e459e7f0\") " pod="openstack/glance-default-internal-api-0" Dec 02 08:55:35 crc kubenswrapper[4895]: I1202 08:55:35.261435 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4lv8q\" (UniqueName: \"kubernetes.io/projected/cd13fbd9-1640-45e7-ba4d-10c3e459e7f0-kube-api-access-4lv8q\") pod \"glance-default-internal-api-0\" (UID: \"cd13fbd9-1640-45e7-ba4d-10c3e459e7f0\") " pod="openstack/glance-default-internal-api-0" Dec 02 08:55:35 crc kubenswrapper[4895]: I1202 08:55:35.261467 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd13fbd9-1640-45e7-ba4d-10c3e459e7f0-config-data\") pod \"glance-default-internal-api-0\" (UID: \"cd13fbd9-1640-45e7-ba4d-10c3e459e7f0\") " pod="openstack/glance-default-internal-api-0" Dec 02 08:55:35 crc kubenswrapper[4895]: I1202 08:55:35.261537 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cd13fbd9-1640-45e7-ba4d-10c3e459e7f0-logs\") pod \"glance-default-internal-api-0\" (UID: \"cd13fbd9-1640-45e7-ba4d-10c3e459e7f0\") " pod="openstack/glance-default-internal-api-0" Dec 02 08:55:35 crc kubenswrapper[4895]: I1202 08:55:35.364045 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4lv8q\" (UniqueName: \"kubernetes.io/projected/cd13fbd9-1640-45e7-ba4d-10c3e459e7f0-kube-api-access-4lv8q\") pod \"glance-default-internal-api-0\" (UID: \"cd13fbd9-1640-45e7-ba4d-10c3e459e7f0\") " pod="openstack/glance-default-internal-api-0" Dec 02 08:55:35 crc kubenswrapper[4895]: I1202 08:55:35.364101 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd13fbd9-1640-45e7-ba4d-10c3e459e7f0-config-data\") pod \"glance-default-internal-api-0\" (UID: \"cd13fbd9-1640-45e7-ba4d-10c3e459e7f0\") " pod="openstack/glance-default-internal-api-0" Dec 02 08:55:35 crc kubenswrapper[4895]: I1202 08:55:35.364150 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cd13fbd9-1640-45e7-ba4d-10c3e459e7f0-logs\") pod \"glance-default-internal-api-0\" (UID: \"cd13fbd9-1640-45e7-ba4d-10c3e459e7f0\") " pod="openstack/glance-default-internal-api-0" Dec 02 08:55:35 crc kubenswrapper[4895]: I1202 08:55:35.364223 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd13fbd9-1640-45e7-ba4d-10c3e459e7f0-scripts\") pod \"glance-default-internal-api-0\" (UID: \"cd13fbd9-1640-45e7-ba4d-10c3e459e7f0\") " pod="openstack/glance-default-internal-api-0" Dec 02 08:55:35 crc kubenswrapper[4895]: I1202 08:55:35.364252 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/cd13fbd9-1640-45e7-ba4d-10c3e459e7f0-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"cd13fbd9-1640-45e7-ba4d-10c3e459e7f0\") " pod="openstack/glance-default-internal-api-0" Dec 02 08:55:35 crc kubenswrapper[4895]: I1202 08:55:35.364282 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/cd13fbd9-1640-45e7-ba4d-10c3e459e7f0-ceph\") pod \"glance-default-internal-api-0\" (UID: \"cd13fbd9-1640-45e7-ba4d-10c3e459e7f0\") " pod="openstack/glance-default-internal-api-0" Dec 02 08:55:35 crc kubenswrapper[4895]: I1202 08:55:35.364307 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785b8787c9-tnskb" Dec 02 08:55:35 crc kubenswrapper[4895]: I1202 08:55:35.364328 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd13fbd9-1640-45e7-ba4d-10c3e459e7f0-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"cd13fbd9-1640-45e7-ba4d-10c3e459e7f0\") " pod="openstack/glance-default-internal-api-0" Dec 02 08:55:35 crc kubenswrapper[4895]: I1202 08:55:35.365377 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cd13fbd9-1640-45e7-ba4d-10c3e459e7f0-logs\") pod \"glance-default-internal-api-0\" (UID: \"cd13fbd9-1640-45e7-ba4d-10c3e459e7f0\") " pod="openstack/glance-default-internal-api-0" Dec 02 08:55:35 crc kubenswrapper[4895]: I1202 08:55:35.365608 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cd13fbd9-1640-45e7-ba4d-10c3e459e7f0-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"cd13fbd9-1640-45e7-ba4d-10c3e459e7f0\") " pod="openstack/glance-default-internal-api-0" Dec 02 08:55:35 crc kubenswrapper[4895]: I1202 08:55:35.371213 4895 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/cd13fbd9-1640-45e7-ba4d-10c3e459e7f0-ceph\") pod \"glance-default-internal-api-0\" (UID: \"cd13fbd9-1640-45e7-ba4d-10c3e459e7f0\") " pod="openstack/glance-default-internal-api-0" Dec 02 08:55:35 crc kubenswrapper[4895]: I1202 08:55:35.371657 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd13fbd9-1640-45e7-ba4d-10c3e459e7f0-config-data\") pod \"glance-default-internal-api-0\" (UID: \"cd13fbd9-1640-45e7-ba4d-10c3e459e7f0\") " pod="openstack/glance-default-internal-api-0" Dec 02 08:55:35 crc kubenswrapper[4895]: I1202 08:55:35.371995 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd13fbd9-1640-45e7-ba4d-10c3e459e7f0-scripts\") pod \"glance-default-internal-api-0\" (UID: \"cd13fbd9-1640-45e7-ba4d-10c3e459e7f0\") " pod="openstack/glance-default-internal-api-0" Dec 02 08:55:35 crc kubenswrapper[4895]: I1202 08:55:35.372598 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd13fbd9-1640-45e7-ba4d-10c3e459e7f0-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"cd13fbd9-1640-45e7-ba4d-10c3e459e7f0\") " pod="openstack/glance-default-internal-api-0" Dec 02 08:55:35 crc kubenswrapper[4895]: I1202 08:55:35.389058 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4lv8q\" (UniqueName: \"kubernetes.io/projected/cd13fbd9-1640-45e7-ba4d-10c3e459e7f0-kube-api-access-4lv8q\") pod \"glance-default-internal-api-0\" (UID: \"cd13fbd9-1640-45e7-ba4d-10c3e459e7f0\") " pod="openstack/glance-default-internal-api-0" Dec 02 08:55:35 crc kubenswrapper[4895]: I1202 08:55:35.542264 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 02 08:55:35 crc kubenswrapper[4895]: I1202 08:55:35.673248 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 02 08:55:35 crc kubenswrapper[4895]: I1202 08:55:35.859082 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785b8787c9-tnskb"] Dec 02 08:55:36 crc kubenswrapper[4895]: I1202 08:55:36.153259 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 02 08:55:36 crc kubenswrapper[4895]: I1202 08:55:36.185564 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 02 08:55:36 crc kubenswrapper[4895]: W1202 08:55:36.196374 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcd13fbd9_1640_45e7_ba4d_10c3e459e7f0.slice/crio-7ddf8d1545fcfd11e7975789b9182d1208020a4669fce575a17b3e450dfc0dfd WatchSource:0}: Error finding container 7ddf8d1545fcfd11e7975789b9182d1208020a4669fce575a17b3e450dfc0dfd: Status 404 returned error can't find the container with id 7ddf8d1545fcfd11e7975789b9182d1208020a4669fce575a17b3e450dfc0dfd Dec 02 08:55:36 crc kubenswrapper[4895]: I1202 08:55:36.331148 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"05476608-670a-4a7c-8e69-43c131014737","Type":"ContainerStarted","Data":"cf81f9d222d74a59f30d04e449276cb55da374d05e22b05a56f78a7520392516"} Dec 02 08:55:36 crc kubenswrapper[4895]: I1202 08:55:36.335321 4895 generic.go:334] "Generic (PLEG): container finished" podID="c20caf23-05bb-4108-aed7-9a676667d36c" containerID="e73a7fd66a9efc1214b35f4a424e697a207170d3d4064befb540d27073d2cb7f" exitCode=0 Dec 02 08:55:36 crc kubenswrapper[4895]: I1202 08:55:36.335391 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-785b8787c9-tnskb" event={"ID":"c20caf23-05bb-4108-aed7-9a676667d36c","Type":"ContainerDied","Data":"e73a7fd66a9efc1214b35f4a424e697a207170d3d4064befb540d27073d2cb7f"} Dec 02 08:55:36 crc kubenswrapper[4895]: I1202 08:55:36.335457 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785b8787c9-tnskb" event={"ID":"c20caf23-05bb-4108-aed7-9a676667d36c","Type":"ContainerStarted","Data":"69ae9baf5b61d075b05ca6e4ff6c947c238a5ef85b32611531e814bd16d6260a"} Dec 02 08:55:36 crc kubenswrapper[4895]: I1202 08:55:36.340260 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"cd13fbd9-1640-45e7-ba4d-10c3e459e7f0","Type":"ContainerStarted","Data":"7ddf8d1545fcfd11e7975789b9182d1208020a4669fce575a17b3e450dfc0dfd"} Dec 02 08:55:37 crc kubenswrapper[4895]: I1202 08:55:37.357228 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785b8787c9-tnskb" event={"ID":"c20caf23-05bb-4108-aed7-9a676667d36c","Type":"ContainerStarted","Data":"13afd0326088506fe5546c30e7ee9680958bf3d0b45c6f1aed060c39ebbc4e4a"} Dec 02 08:55:37 crc kubenswrapper[4895]: I1202 08:55:37.357793 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-785b8787c9-tnskb" Dec 02 08:55:37 crc kubenswrapper[4895]: I1202 08:55:37.365730 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"cd13fbd9-1640-45e7-ba4d-10c3e459e7f0","Type":"ContainerStarted","Data":"9222eac2b3a031fd84ac2b38ed60980d683ee35081210b4355e9c9df1f165284"} Dec 02 08:55:37 crc kubenswrapper[4895]: I1202 08:55:37.368595 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"05476608-670a-4a7c-8e69-43c131014737","Type":"ContainerStarted","Data":"32f14bad0b5555de1926be31e055047b7a450f09200a9c7a10722c34da8c363a"} Dec 02 08:55:37 crc kubenswrapper[4895]: I1202 
08:55:37.368647 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"05476608-670a-4a7c-8e69-43c131014737","Type":"ContainerStarted","Data":"498a7499b57ca19b3f689465cacc251cd0266b455ea52d81bd3ce7d0d038e611"} Dec 02 08:55:37 crc kubenswrapper[4895]: I1202 08:55:37.368775 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="05476608-670a-4a7c-8e69-43c131014737" containerName="glance-log" containerID="cri-o://498a7499b57ca19b3f689465cacc251cd0266b455ea52d81bd3ce7d0d038e611" gracePeriod=30 Dec 02 08:55:37 crc kubenswrapper[4895]: I1202 08:55:37.368982 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="05476608-670a-4a7c-8e69-43c131014737" containerName="glance-httpd" containerID="cri-o://32f14bad0b5555de1926be31e055047b7a450f09200a9c7a10722c34da8c363a" gracePeriod=30 Dec 02 08:55:37 crc kubenswrapper[4895]: I1202 08:55:37.389822 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-785b8787c9-tnskb" podStartSLOduration=3.389791363 podStartE2EDuration="3.389791363s" podCreationTimestamp="2025-12-02 08:55:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:55:37.379508024 +0000 UTC m=+5548.550367637" watchObservedRunningTime="2025-12-02 08:55:37.389791363 +0000 UTC m=+5548.560650976" Dec 02 08:55:37 crc kubenswrapper[4895]: I1202 08:55:37.415719 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.415692799 podStartE2EDuration="3.415692799s" podCreationTimestamp="2025-12-02 08:55:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 
08:55:37.407957038 +0000 UTC m=+5548.578816671" watchObservedRunningTime="2025-12-02 08:55:37.415692799 +0000 UTC m=+5548.586552412" Dec 02 08:55:37 crc kubenswrapper[4895]: I1202 08:55:37.986124 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 02 08:55:38 crc kubenswrapper[4895]: I1202 08:55:38.122454 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/05476608-670a-4a7c-8e69-43c131014737-httpd-run\") pod \"05476608-670a-4a7c-8e69-43c131014737\" (UID: \"05476608-670a-4a7c-8e69-43c131014737\") " Dec 02 08:55:38 crc kubenswrapper[4895]: I1202 08:55:38.122523 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hhf96\" (UniqueName: \"kubernetes.io/projected/05476608-670a-4a7c-8e69-43c131014737-kube-api-access-hhf96\") pod \"05476608-670a-4a7c-8e69-43c131014737\" (UID: \"05476608-670a-4a7c-8e69-43c131014737\") " Dec 02 08:55:38 crc kubenswrapper[4895]: I1202 08:55:38.122544 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/05476608-670a-4a7c-8e69-43c131014737-logs\") pod \"05476608-670a-4a7c-8e69-43c131014737\" (UID: \"05476608-670a-4a7c-8e69-43c131014737\") " Dec 02 08:55:38 crc kubenswrapper[4895]: I1202 08:55:38.122570 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/05476608-670a-4a7c-8e69-43c131014737-scripts\") pod \"05476608-670a-4a7c-8e69-43c131014737\" (UID: \"05476608-670a-4a7c-8e69-43c131014737\") " Dec 02 08:55:38 crc kubenswrapper[4895]: I1202 08:55:38.122615 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/05476608-670a-4a7c-8e69-43c131014737-ceph\") pod \"05476608-670a-4a7c-8e69-43c131014737\" (UID: 
\"05476608-670a-4a7c-8e69-43c131014737\") " Dec 02 08:55:38 crc kubenswrapper[4895]: I1202 08:55:38.122681 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05476608-670a-4a7c-8e69-43c131014737-combined-ca-bundle\") pod \"05476608-670a-4a7c-8e69-43c131014737\" (UID: \"05476608-670a-4a7c-8e69-43c131014737\") " Dec 02 08:55:38 crc kubenswrapper[4895]: I1202 08:55:38.122820 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05476608-670a-4a7c-8e69-43c131014737-config-data\") pod \"05476608-670a-4a7c-8e69-43c131014737\" (UID: \"05476608-670a-4a7c-8e69-43c131014737\") " Dec 02 08:55:38 crc kubenswrapper[4895]: I1202 08:55:38.123871 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05476608-670a-4a7c-8e69-43c131014737-logs" (OuterVolumeSpecName: "logs") pod "05476608-670a-4a7c-8e69-43c131014737" (UID: "05476608-670a-4a7c-8e69-43c131014737"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:55:38 crc kubenswrapper[4895]: I1202 08:55:38.124530 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05476608-670a-4a7c-8e69-43c131014737-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "05476608-670a-4a7c-8e69-43c131014737" (UID: "05476608-670a-4a7c-8e69-43c131014737"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:55:38 crc kubenswrapper[4895]: I1202 08:55:38.133823 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05476608-670a-4a7c-8e69-43c131014737-scripts" (OuterVolumeSpecName: "scripts") pod "05476608-670a-4a7c-8e69-43c131014737" (UID: "05476608-670a-4a7c-8e69-43c131014737"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:55:38 crc kubenswrapper[4895]: I1202 08:55:38.135666 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05476608-670a-4a7c-8e69-43c131014737-ceph" (OuterVolumeSpecName: "ceph") pod "05476608-670a-4a7c-8e69-43c131014737" (UID: "05476608-670a-4a7c-8e69-43c131014737"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:55:38 crc kubenswrapper[4895]: I1202 08:55:38.137260 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05476608-670a-4a7c-8e69-43c131014737-kube-api-access-hhf96" (OuterVolumeSpecName: "kube-api-access-hhf96") pod "05476608-670a-4a7c-8e69-43c131014737" (UID: "05476608-670a-4a7c-8e69-43c131014737"). InnerVolumeSpecName "kube-api-access-hhf96". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:55:38 crc kubenswrapper[4895]: I1202 08:55:38.158897 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05476608-670a-4a7c-8e69-43c131014737-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "05476608-670a-4a7c-8e69-43c131014737" (UID: "05476608-670a-4a7c-8e69-43c131014737"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:55:38 crc kubenswrapper[4895]: I1202 08:55:38.176830 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05476608-670a-4a7c-8e69-43c131014737-config-data" (OuterVolumeSpecName: "config-data") pod "05476608-670a-4a7c-8e69-43c131014737" (UID: "05476608-670a-4a7c-8e69-43c131014737"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:55:38 crc kubenswrapper[4895]: I1202 08:55:38.225761 4895 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/05476608-670a-4a7c-8e69-43c131014737-ceph\") on node \"crc\" DevicePath \"\"" Dec 02 08:55:38 crc kubenswrapper[4895]: I1202 08:55:38.225889 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05476608-670a-4a7c-8e69-43c131014737-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 08:55:38 crc kubenswrapper[4895]: I1202 08:55:38.225978 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05476608-670a-4a7c-8e69-43c131014737-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 08:55:38 crc kubenswrapper[4895]: I1202 08:55:38.226053 4895 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/05476608-670a-4a7c-8e69-43c131014737-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 02 08:55:38 crc kubenswrapper[4895]: I1202 08:55:38.226132 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hhf96\" (UniqueName: \"kubernetes.io/projected/05476608-670a-4a7c-8e69-43c131014737-kube-api-access-hhf96\") on node \"crc\" DevicePath \"\"" Dec 02 08:55:38 crc kubenswrapper[4895]: I1202 08:55:38.226211 4895 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/05476608-670a-4a7c-8e69-43c131014737-logs\") on node \"crc\" DevicePath \"\"" Dec 02 08:55:38 crc kubenswrapper[4895]: I1202 08:55:38.226282 4895 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/05476608-670a-4a7c-8e69-43c131014737-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 08:55:38 crc kubenswrapper[4895]: I1202 08:55:38.365184 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/glance-default-internal-api-0"] Dec 02 08:55:38 crc kubenswrapper[4895]: I1202 08:55:38.387046 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"cd13fbd9-1640-45e7-ba4d-10c3e459e7f0","Type":"ContainerStarted","Data":"bbda25dbf82d3494ba54881f6510563aa5d56766a86eb75b342513c1a7082500"} Dec 02 08:55:38 crc kubenswrapper[4895]: I1202 08:55:38.391603 4895 generic.go:334] "Generic (PLEG): container finished" podID="05476608-670a-4a7c-8e69-43c131014737" containerID="32f14bad0b5555de1926be31e055047b7a450f09200a9c7a10722c34da8c363a" exitCode=143 Dec 02 08:55:38 crc kubenswrapper[4895]: I1202 08:55:38.391807 4895 generic.go:334] "Generic (PLEG): container finished" podID="05476608-670a-4a7c-8e69-43c131014737" containerID="498a7499b57ca19b3f689465cacc251cd0266b455ea52d81bd3ce7d0d038e611" exitCode=143 Dec 02 08:55:38 crc kubenswrapper[4895]: I1202 08:55:38.393166 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 02 08:55:38 crc kubenswrapper[4895]: I1202 08:55:38.394974 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"05476608-670a-4a7c-8e69-43c131014737","Type":"ContainerDied","Data":"32f14bad0b5555de1926be31e055047b7a450f09200a9c7a10722c34da8c363a"} Dec 02 08:55:38 crc kubenswrapper[4895]: I1202 08:55:38.395023 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"05476608-670a-4a7c-8e69-43c131014737","Type":"ContainerDied","Data":"498a7499b57ca19b3f689465cacc251cd0266b455ea52d81bd3ce7d0d038e611"} Dec 02 08:55:38 crc kubenswrapper[4895]: I1202 08:55:38.395037 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"05476608-670a-4a7c-8e69-43c131014737","Type":"ContainerDied","Data":"cf81f9d222d74a59f30d04e449276cb55da374d05e22b05a56f78a7520392516"} Dec 02 08:55:38 crc kubenswrapper[4895]: I1202 08:55:38.395054 4895 scope.go:117] "RemoveContainer" containerID="32f14bad0b5555de1926be31e055047b7a450f09200a9c7a10722c34da8c363a" Dec 02 08:55:38 crc kubenswrapper[4895]: I1202 08:55:38.422382 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.422348666 podStartE2EDuration="3.422348666s" podCreationTimestamp="2025-12-02 08:55:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:55:38.409942391 +0000 UTC m=+5549.580802004" watchObservedRunningTime="2025-12-02 08:55:38.422348666 +0000 UTC m=+5549.593208289" Dec 02 08:55:38 crc kubenswrapper[4895]: I1202 08:55:38.435693 4895 scope.go:117] "RemoveContainer" containerID="498a7499b57ca19b3f689465cacc251cd0266b455ea52d81bd3ce7d0d038e611" Dec 02 08:55:38 crc kubenswrapper[4895]: I1202 08:55:38.436923 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 02 08:55:38 crc kubenswrapper[4895]: I1202 08:55:38.443918 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 02 08:55:38 crc kubenswrapper[4895]: I1202 08:55:38.467996 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 02 08:55:38 crc kubenswrapper[4895]: E1202 08:55:38.468405 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05476608-670a-4a7c-8e69-43c131014737" containerName="glance-log" Dec 02 08:55:38 crc kubenswrapper[4895]: I1202 08:55:38.468424 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="05476608-670a-4a7c-8e69-43c131014737" containerName="glance-log" Dec 02 08:55:38 crc kubenswrapper[4895]: 
E1202 08:55:38.468438 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05476608-670a-4a7c-8e69-43c131014737" containerName="glance-httpd" Dec 02 08:55:38 crc kubenswrapper[4895]: I1202 08:55:38.468445 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="05476608-670a-4a7c-8e69-43c131014737" containerName="glance-httpd" Dec 02 08:55:38 crc kubenswrapper[4895]: I1202 08:55:38.468655 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="05476608-670a-4a7c-8e69-43c131014737" containerName="glance-httpd" Dec 02 08:55:38 crc kubenswrapper[4895]: I1202 08:55:38.468680 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="05476608-670a-4a7c-8e69-43c131014737" containerName="glance-log" Dec 02 08:55:38 crc kubenswrapper[4895]: I1202 08:55:38.470452 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 02 08:55:38 crc kubenswrapper[4895]: I1202 08:55:38.473318 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 02 08:55:38 crc kubenswrapper[4895]: I1202 08:55:38.514115 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 02 08:55:38 crc kubenswrapper[4895]: I1202 08:55:38.515528 4895 scope.go:117] "RemoveContainer" containerID="32f14bad0b5555de1926be31e055047b7a450f09200a9c7a10722c34da8c363a" Dec 02 08:55:38 crc kubenswrapper[4895]: E1202 08:55:38.516157 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32f14bad0b5555de1926be31e055047b7a450f09200a9c7a10722c34da8c363a\": container with ID starting with 32f14bad0b5555de1926be31e055047b7a450f09200a9c7a10722c34da8c363a not found: ID does not exist" containerID="32f14bad0b5555de1926be31e055047b7a450f09200a9c7a10722c34da8c363a" Dec 02 08:55:38 crc kubenswrapper[4895]: I1202 08:55:38.516208 4895 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32f14bad0b5555de1926be31e055047b7a450f09200a9c7a10722c34da8c363a"} err="failed to get container status \"32f14bad0b5555de1926be31e055047b7a450f09200a9c7a10722c34da8c363a\": rpc error: code = NotFound desc = could not find container \"32f14bad0b5555de1926be31e055047b7a450f09200a9c7a10722c34da8c363a\": container with ID starting with 32f14bad0b5555de1926be31e055047b7a450f09200a9c7a10722c34da8c363a not found: ID does not exist" Dec 02 08:55:38 crc kubenswrapper[4895]: I1202 08:55:38.516243 4895 scope.go:117] "RemoveContainer" containerID="498a7499b57ca19b3f689465cacc251cd0266b455ea52d81bd3ce7d0d038e611" Dec 02 08:55:38 crc kubenswrapper[4895]: E1202 08:55:38.516575 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"498a7499b57ca19b3f689465cacc251cd0266b455ea52d81bd3ce7d0d038e611\": container with ID starting with 498a7499b57ca19b3f689465cacc251cd0266b455ea52d81bd3ce7d0d038e611 not found: ID does not exist" containerID="498a7499b57ca19b3f689465cacc251cd0266b455ea52d81bd3ce7d0d038e611" Dec 02 08:55:38 crc kubenswrapper[4895]: I1202 08:55:38.516608 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"498a7499b57ca19b3f689465cacc251cd0266b455ea52d81bd3ce7d0d038e611"} err="failed to get container status \"498a7499b57ca19b3f689465cacc251cd0266b455ea52d81bd3ce7d0d038e611\": rpc error: code = NotFound desc = could not find container \"498a7499b57ca19b3f689465cacc251cd0266b455ea52d81bd3ce7d0d038e611\": container with ID starting with 498a7499b57ca19b3f689465cacc251cd0266b455ea52d81bd3ce7d0d038e611 not found: ID does not exist" Dec 02 08:55:38 crc kubenswrapper[4895]: I1202 08:55:38.516631 4895 scope.go:117] "RemoveContainer" containerID="32f14bad0b5555de1926be31e055047b7a450f09200a9c7a10722c34da8c363a" Dec 02 08:55:38 crc kubenswrapper[4895]: I1202 
08:55:38.517054 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32f14bad0b5555de1926be31e055047b7a450f09200a9c7a10722c34da8c363a"} err="failed to get container status \"32f14bad0b5555de1926be31e055047b7a450f09200a9c7a10722c34da8c363a\": rpc error: code = NotFound desc = could not find container \"32f14bad0b5555de1926be31e055047b7a450f09200a9c7a10722c34da8c363a\": container with ID starting with 32f14bad0b5555de1926be31e055047b7a450f09200a9c7a10722c34da8c363a not found: ID does not exist" Dec 02 08:55:38 crc kubenswrapper[4895]: I1202 08:55:38.517075 4895 scope.go:117] "RemoveContainer" containerID="498a7499b57ca19b3f689465cacc251cd0266b455ea52d81bd3ce7d0d038e611" Dec 02 08:55:38 crc kubenswrapper[4895]: I1202 08:55:38.517288 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"498a7499b57ca19b3f689465cacc251cd0266b455ea52d81bd3ce7d0d038e611"} err="failed to get container status \"498a7499b57ca19b3f689465cacc251cd0266b455ea52d81bd3ce7d0d038e611\": rpc error: code = NotFound desc = could not find container \"498a7499b57ca19b3f689465cacc251cd0266b455ea52d81bd3ce7d0d038e611\": container with ID starting with 498a7499b57ca19b3f689465cacc251cd0266b455ea52d81bd3ce7d0d038e611 not found: ID does not exist" Dec 02 08:55:38 crc kubenswrapper[4895]: I1202 08:55:38.636138 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d9c01c8-fbe2-4c99-9c4a-edc0560aebd6-scripts\") pod \"glance-default-external-api-0\" (UID: \"2d9c01c8-fbe2-4c99-9c4a-edc0560aebd6\") " pod="openstack/glance-default-external-api-0" Dec 02 08:55:38 crc kubenswrapper[4895]: I1202 08:55:38.636183 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/2d9c01c8-fbe2-4c99-9c4a-edc0560aebd6-ceph\") pod 
\"glance-default-external-api-0\" (UID: \"2d9c01c8-fbe2-4c99-9c4a-edc0560aebd6\") " pod="openstack/glance-default-external-api-0" Dec 02 08:55:38 crc kubenswrapper[4895]: I1202 08:55:38.636230 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8flz\" (UniqueName: \"kubernetes.io/projected/2d9c01c8-fbe2-4c99-9c4a-edc0560aebd6-kube-api-access-s8flz\") pod \"glance-default-external-api-0\" (UID: \"2d9c01c8-fbe2-4c99-9c4a-edc0560aebd6\") " pod="openstack/glance-default-external-api-0" Dec 02 08:55:38 crc kubenswrapper[4895]: I1202 08:55:38.636278 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2d9c01c8-fbe2-4c99-9c4a-edc0560aebd6-logs\") pod \"glance-default-external-api-0\" (UID: \"2d9c01c8-fbe2-4c99-9c4a-edc0560aebd6\") " pod="openstack/glance-default-external-api-0" Dec 02 08:55:38 crc kubenswrapper[4895]: I1202 08:55:38.636299 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2d9c01c8-fbe2-4c99-9c4a-edc0560aebd6-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2d9c01c8-fbe2-4c99-9c4a-edc0560aebd6\") " pod="openstack/glance-default-external-api-0" Dec 02 08:55:38 crc kubenswrapper[4895]: I1202 08:55:38.636335 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d9c01c8-fbe2-4c99-9c4a-edc0560aebd6-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2d9c01c8-fbe2-4c99-9c4a-edc0560aebd6\") " pod="openstack/glance-default-external-api-0" Dec 02 08:55:38 crc kubenswrapper[4895]: I1202 08:55:38.636364 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/2d9c01c8-fbe2-4c99-9c4a-edc0560aebd6-config-data\") pod \"glance-default-external-api-0\" (UID: \"2d9c01c8-fbe2-4c99-9c4a-edc0560aebd6\") " pod="openstack/glance-default-external-api-0" Dec 02 08:55:38 crc kubenswrapper[4895]: I1202 08:55:38.737754 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d9c01c8-fbe2-4c99-9c4a-edc0560aebd6-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2d9c01c8-fbe2-4c99-9c4a-edc0560aebd6\") " pod="openstack/glance-default-external-api-0" Dec 02 08:55:38 crc kubenswrapper[4895]: I1202 08:55:38.737871 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d9c01c8-fbe2-4c99-9c4a-edc0560aebd6-config-data\") pod \"glance-default-external-api-0\" (UID: \"2d9c01c8-fbe2-4c99-9c4a-edc0560aebd6\") " pod="openstack/glance-default-external-api-0" Dec 02 08:55:38 crc kubenswrapper[4895]: I1202 08:55:38.738711 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d9c01c8-fbe2-4c99-9c4a-edc0560aebd6-scripts\") pod \"glance-default-external-api-0\" (UID: \"2d9c01c8-fbe2-4c99-9c4a-edc0560aebd6\") " pod="openstack/glance-default-external-api-0" Dec 02 08:55:38 crc kubenswrapper[4895]: I1202 08:55:38.738774 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/2d9c01c8-fbe2-4c99-9c4a-edc0560aebd6-ceph\") pod \"glance-default-external-api-0\" (UID: \"2d9c01c8-fbe2-4c99-9c4a-edc0560aebd6\") " pod="openstack/glance-default-external-api-0" Dec 02 08:55:38 crc kubenswrapper[4895]: I1202 08:55:38.738845 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8flz\" (UniqueName: 
\"kubernetes.io/projected/2d9c01c8-fbe2-4c99-9c4a-edc0560aebd6-kube-api-access-s8flz\") pod \"glance-default-external-api-0\" (UID: \"2d9c01c8-fbe2-4c99-9c4a-edc0560aebd6\") " pod="openstack/glance-default-external-api-0" Dec 02 08:55:38 crc kubenswrapper[4895]: I1202 08:55:38.738922 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2d9c01c8-fbe2-4c99-9c4a-edc0560aebd6-logs\") pod \"glance-default-external-api-0\" (UID: \"2d9c01c8-fbe2-4c99-9c4a-edc0560aebd6\") " pod="openstack/glance-default-external-api-0" Dec 02 08:55:38 crc kubenswrapper[4895]: I1202 08:55:38.739012 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2d9c01c8-fbe2-4c99-9c4a-edc0560aebd6-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2d9c01c8-fbe2-4c99-9c4a-edc0560aebd6\") " pod="openstack/glance-default-external-api-0" Dec 02 08:55:38 crc kubenswrapper[4895]: I1202 08:55:38.739595 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2d9c01c8-fbe2-4c99-9c4a-edc0560aebd6-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2d9c01c8-fbe2-4c99-9c4a-edc0560aebd6\") " pod="openstack/glance-default-external-api-0" Dec 02 08:55:38 crc kubenswrapper[4895]: I1202 08:55:38.739662 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2d9c01c8-fbe2-4c99-9c4a-edc0560aebd6-logs\") pod \"glance-default-external-api-0\" (UID: \"2d9c01c8-fbe2-4c99-9c4a-edc0560aebd6\") " pod="openstack/glance-default-external-api-0" Dec 02 08:55:38 crc kubenswrapper[4895]: I1202 08:55:38.745983 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d9c01c8-fbe2-4c99-9c4a-edc0560aebd6-combined-ca-bundle\") pod 
\"glance-default-external-api-0\" (UID: \"2d9c01c8-fbe2-4c99-9c4a-edc0560aebd6\") " pod="openstack/glance-default-external-api-0" Dec 02 08:55:38 crc kubenswrapper[4895]: I1202 08:55:38.746082 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d9c01c8-fbe2-4c99-9c4a-edc0560aebd6-config-data\") pod \"glance-default-external-api-0\" (UID: \"2d9c01c8-fbe2-4c99-9c4a-edc0560aebd6\") " pod="openstack/glance-default-external-api-0" Dec 02 08:55:38 crc kubenswrapper[4895]: I1202 08:55:38.749355 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/2d9c01c8-fbe2-4c99-9c4a-edc0560aebd6-ceph\") pod \"glance-default-external-api-0\" (UID: \"2d9c01c8-fbe2-4c99-9c4a-edc0560aebd6\") " pod="openstack/glance-default-external-api-0" Dec 02 08:55:38 crc kubenswrapper[4895]: I1202 08:55:38.753961 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d9c01c8-fbe2-4c99-9c4a-edc0560aebd6-scripts\") pod \"glance-default-external-api-0\" (UID: \"2d9c01c8-fbe2-4c99-9c4a-edc0560aebd6\") " pod="openstack/glance-default-external-api-0" Dec 02 08:55:38 crc kubenswrapper[4895]: I1202 08:55:38.757808 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8flz\" (UniqueName: \"kubernetes.io/projected/2d9c01c8-fbe2-4c99-9c4a-edc0560aebd6-kube-api-access-s8flz\") pod \"glance-default-external-api-0\" (UID: \"2d9c01c8-fbe2-4c99-9c4a-edc0560aebd6\") " pod="openstack/glance-default-external-api-0" Dec 02 08:55:38 crc kubenswrapper[4895]: I1202 08:55:38.813418 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 02 08:55:39 crc kubenswrapper[4895]: I1202 08:55:39.151592 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05476608-670a-4a7c-8e69-43c131014737" path="/var/lib/kubelet/pods/05476608-670a-4a7c-8e69-43c131014737/volumes" Dec 02 08:55:39 crc kubenswrapper[4895]: I1202 08:55:39.384797 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 02 08:55:39 crc kubenswrapper[4895]: I1202 08:55:39.418227 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2d9c01c8-fbe2-4c99-9c4a-edc0560aebd6","Type":"ContainerStarted","Data":"1a7bbe60536d3d5d4eb8933eb461adebc6a2b62b76d6bb90727b5b313ea906f1"} Dec 02 08:55:39 crc kubenswrapper[4895]: I1202 08:55:39.423562 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="cd13fbd9-1640-45e7-ba4d-10c3e459e7f0" containerName="glance-httpd" containerID="cri-o://bbda25dbf82d3494ba54881f6510563aa5d56766a86eb75b342513c1a7082500" gracePeriod=30 Dec 02 08:55:39 crc kubenswrapper[4895]: I1202 08:55:39.423558 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="cd13fbd9-1640-45e7-ba4d-10c3e459e7f0" containerName="glance-log" containerID="cri-o://9222eac2b3a031fd84ac2b38ed60980d683ee35081210b4355e9c9df1f165284" gracePeriod=30 Dec 02 08:55:40 crc kubenswrapper[4895]: I1202 08:55:40.106225 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 02 08:55:40 crc kubenswrapper[4895]: I1202 08:55:40.268324 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/cd13fbd9-1640-45e7-ba4d-10c3e459e7f0-ceph\") pod \"cd13fbd9-1640-45e7-ba4d-10c3e459e7f0\" (UID: \"cd13fbd9-1640-45e7-ba4d-10c3e459e7f0\") " Dec 02 08:55:40 crc kubenswrapper[4895]: I1202 08:55:40.268573 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd13fbd9-1640-45e7-ba4d-10c3e459e7f0-config-data\") pod \"cd13fbd9-1640-45e7-ba4d-10c3e459e7f0\" (UID: \"cd13fbd9-1640-45e7-ba4d-10c3e459e7f0\") " Dec 02 08:55:40 crc kubenswrapper[4895]: I1202 08:55:40.268643 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cd13fbd9-1640-45e7-ba4d-10c3e459e7f0-logs\") pod \"cd13fbd9-1640-45e7-ba4d-10c3e459e7f0\" (UID: \"cd13fbd9-1640-45e7-ba4d-10c3e459e7f0\") " Dec 02 08:55:40 crc kubenswrapper[4895]: I1202 08:55:40.268702 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd13fbd9-1640-45e7-ba4d-10c3e459e7f0-scripts\") pod \"cd13fbd9-1640-45e7-ba4d-10c3e459e7f0\" (UID: \"cd13fbd9-1640-45e7-ba4d-10c3e459e7f0\") " Dec 02 08:55:40 crc kubenswrapper[4895]: I1202 08:55:40.269694 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd13fbd9-1640-45e7-ba4d-10c3e459e7f0-logs" (OuterVolumeSpecName: "logs") pod "cd13fbd9-1640-45e7-ba4d-10c3e459e7f0" (UID: "cd13fbd9-1640-45e7-ba4d-10c3e459e7f0"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:55:40 crc kubenswrapper[4895]: I1202 08:55:40.269823 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4lv8q\" (UniqueName: \"kubernetes.io/projected/cd13fbd9-1640-45e7-ba4d-10c3e459e7f0-kube-api-access-4lv8q\") pod \"cd13fbd9-1640-45e7-ba4d-10c3e459e7f0\" (UID: \"cd13fbd9-1640-45e7-ba4d-10c3e459e7f0\") " Dec 02 08:55:40 crc kubenswrapper[4895]: I1202 08:55:40.269924 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd13fbd9-1640-45e7-ba4d-10c3e459e7f0-combined-ca-bundle\") pod \"cd13fbd9-1640-45e7-ba4d-10c3e459e7f0\" (UID: \"cd13fbd9-1640-45e7-ba4d-10c3e459e7f0\") " Dec 02 08:55:40 crc kubenswrapper[4895]: I1202 08:55:40.271896 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cd13fbd9-1640-45e7-ba4d-10c3e459e7f0-httpd-run\") pod \"cd13fbd9-1640-45e7-ba4d-10c3e459e7f0\" (UID: \"cd13fbd9-1640-45e7-ba4d-10c3e459e7f0\") " Dec 02 08:55:40 crc kubenswrapper[4895]: I1202 08:55:40.272255 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd13fbd9-1640-45e7-ba4d-10c3e459e7f0-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "cd13fbd9-1640-45e7-ba4d-10c3e459e7f0" (UID: "cd13fbd9-1640-45e7-ba4d-10c3e459e7f0"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:55:40 crc kubenswrapper[4895]: I1202 08:55:40.273132 4895 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cd13fbd9-1640-45e7-ba4d-10c3e459e7f0-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 02 08:55:40 crc kubenswrapper[4895]: I1202 08:55:40.273159 4895 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cd13fbd9-1640-45e7-ba4d-10c3e459e7f0-logs\") on node \"crc\" DevicePath \"\"" Dec 02 08:55:40 crc kubenswrapper[4895]: I1202 08:55:40.274125 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd13fbd9-1640-45e7-ba4d-10c3e459e7f0-kube-api-access-4lv8q" (OuterVolumeSpecName: "kube-api-access-4lv8q") pod "cd13fbd9-1640-45e7-ba4d-10c3e459e7f0" (UID: "cd13fbd9-1640-45e7-ba4d-10c3e459e7f0"). InnerVolumeSpecName "kube-api-access-4lv8q". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:55:40 crc kubenswrapper[4895]: I1202 08:55:40.275901 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd13fbd9-1640-45e7-ba4d-10c3e459e7f0-scripts" (OuterVolumeSpecName: "scripts") pod "cd13fbd9-1640-45e7-ba4d-10c3e459e7f0" (UID: "cd13fbd9-1640-45e7-ba4d-10c3e459e7f0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:55:40 crc kubenswrapper[4895]: I1202 08:55:40.276065 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd13fbd9-1640-45e7-ba4d-10c3e459e7f0-ceph" (OuterVolumeSpecName: "ceph") pod "cd13fbd9-1640-45e7-ba4d-10c3e459e7f0" (UID: "cd13fbd9-1640-45e7-ba4d-10c3e459e7f0"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:55:40 crc kubenswrapper[4895]: I1202 08:55:40.307103 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd13fbd9-1640-45e7-ba4d-10c3e459e7f0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cd13fbd9-1640-45e7-ba4d-10c3e459e7f0" (UID: "cd13fbd9-1640-45e7-ba4d-10c3e459e7f0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:55:40 crc kubenswrapper[4895]: I1202 08:55:40.329470 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd13fbd9-1640-45e7-ba4d-10c3e459e7f0-config-data" (OuterVolumeSpecName: "config-data") pod "cd13fbd9-1640-45e7-ba4d-10c3e459e7f0" (UID: "cd13fbd9-1640-45e7-ba4d-10c3e459e7f0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:55:40 crc kubenswrapper[4895]: I1202 08:55:40.375780 4895 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/cd13fbd9-1640-45e7-ba4d-10c3e459e7f0-ceph\") on node \"crc\" DevicePath \"\"" Dec 02 08:55:40 crc kubenswrapper[4895]: I1202 08:55:40.375826 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd13fbd9-1640-45e7-ba4d-10c3e459e7f0-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 08:55:40 crc kubenswrapper[4895]: I1202 08:55:40.376280 4895 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd13fbd9-1640-45e7-ba4d-10c3e459e7f0-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 08:55:40 crc kubenswrapper[4895]: I1202 08:55:40.376319 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4lv8q\" (UniqueName: \"kubernetes.io/projected/cd13fbd9-1640-45e7-ba4d-10c3e459e7f0-kube-api-access-4lv8q\") on node \"crc\" DevicePath \"\"" Dec 02 08:55:40 crc 
kubenswrapper[4895]: I1202 08:55:40.376329 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd13fbd9-1640-45e7-ba4d-10c3e459e7f0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 08:55:40 crc kubenswrapper[4895]: I1202 08:55:40.436604 4895 generic.go:334] "Generic (PLEG): container finished" podID="cd13fbd9-1640-45e7-ba4d-10c3e459e7f0" containerID="bbda25dbf82d3494ba54881f6510563aa5d56766a86eb75b342513c1a7082500" exitCode=0 Dec 02 08:55:40 crc kubenswrapper[4895]: I1202 08:55:40.436641 4895 generic.go:334] "Generic (PLEG): container finished" podID="cd13fbd9-1640-45e7-ba4d-10c3e459e7f0" containerID="9222eac2b3a031fd84ac2b38ed60980d683ee35081210b4355e9c9df1f165284" exitCode=143 Dec 02 08:55:40 crc kubenswrapper[4895]: I1202 08:55:40.436704 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"cd13fbd9-1640-45e7-ba4d-10c3e459e7f0","Type":"ContainerDied","Data":"bbda25dbf82d3494ba54881f6510563aa5d56766a86eb75b342513c1a7082500"} Dec 02 08:55:40 crc kubenswrapper[4895]: I1202 08:55:40.436732 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"cd13fbd9-1640-45e7-ba4d-10c3e459e7f0","Type":"ContainerDied","Data":"9222eac2b3a031fd84ac2b38ed60980d683ee35081210b4355e9c9df1f165284"} Dec 02 08:55:40 crc kubenswrapper[4895]: I1202 08:55:40.436754 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"cd13fbd9-1640-45e7-ba4d-10c3e459e7f0","Type":"ContainerDied","Data":"7ddf8d1545fcfd11e7975789b9182d1208020a4669fce575a17b3e450dfc0dfd"} Dec 02 08:55:40 crc kubenswrapper[4895]: I1202 08:55:40.436679 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 02 08:55:40 crc kubenswrapper[4895]: I1202 08:55:40.436771 4895 scope.go:117] "RemoveContainer" containerID="bbda25dbf82d3494ba54881f6510563aa5d56766a86eb75b342513c1a7082500" Dec 02 08:55:40 crc kubenswrapper[4895]: I1202 08:55:40.440320 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2d9c01c8-fbe2-4c99-9c4a-edc0560aebd6","Type":"ContainerStarted","Data":"a467d7fd4acf073b3008fe8d8d3c8690e72c4989f5c7e9a09a9d89db485e1d74"} Dec 02 08:55:40 crc kubenswrapper[4895]: I1202 08:55:40.483835 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 02 08:55:40 crc kubenswrapper[4895]: I1202 08:55:40.484231 4895 scope.go:117] "RemoveContainer" containerID="9222eac2b3a031fd84ac2b38ed60980d683ee35081210b4355e9c9df1f165284" Dec 02 08:55:40 crc kubenswrapper[4895]: I1202 08:55:40.488256 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 02 08:55:40 crc kubenswrapper[4895]: I1202 08:55:40.516647 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 02 08:55:40 crc kubenswrapper[4895]: E1202 08:55:40.517312 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd13fbd9-1640-45e7-ba4d-10c3e459e7f0" containerName="glance-log" Dec 02 08:55:40 crc kubenswrapper[4895]: I1202 08:55:40.517335 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd13fbd9-1640-45e7-ba4d-10c3e459e7f0" containerName="glance-log" Dec 02 08:55:40 crc kubenswrapper[4895]: E1202 08:55:40.517345 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd13fbd9-1640-45e7-ba4d-10c3e459e7f0" containerName="glance-httpd" Dec 02 08:55:40 crc kubenswrapper[4895]: I1202 08:55:40.517353 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd13fbd9-1640-45e7-ba4d-10c3e459e7f0" 
containerName="glance-httpd" Dec 02 08:55:40 crc kubenswrapper[4895]: I1202 08:55:40.517544 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd13fbd9-1640-45e7-ba4d-10c3e459e7f0" containerName="glance-httpd" Dec 02 08:55:40 crc kubenswrapper[4895]: I1202 08:55:40.517573 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd13fbd9-1640-45e7-ba4d-10c3e459e7f0" containerName="glance-log" Dec 02 08:55:40 crc kubenswrapper[4895]: I1202 08:55:40.518913 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 02 08:55:40 crc kubenswrapper[4895]: I1202 08:55:40.523107 4895 scope.go:117] "RemoveContainer" containerID="bbda25dbf82d3494ba54881f6510563aa5d56766a86eb75b342513c1a7082500" Dec 02 08:55:40 crc kubenswrapper[4895]: E1202 08:55:40.523973 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bbda25dbf82d3494ba54881f6510563aa5d56766a86eb75b342513c1a7082500\": container with ID starting with bbda25dbf82d3494ba54881f6510563aa5d56766a86eb75b342513c1a7082500 not found: ID does not exist" containerID="bbda25dbf82d3494ba54881f6510563aa5d56766a86eb75b342513c1a7082500" Dec 02 08:55:40 crc kubenswrapper[4895]: I1202 08:55:40.524021 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bbda25dbf82d3494ba54881f6510563aa5d56766a86eb75b342513c1a7082500"} err="failed to get container status \"bbda25dbf82d3494ba54881f6510563aa5d56766a86eb75b342513c1a7082500\": rpc error: code = NotFound desc = could not find container \"bbda25dbf82d3494ba54881f6510563aa5d56766a86eb75b342513c1a7082500\": container with ID starting with bbda25dbf82d3494ba54881f6510563aa5d56766a86eb75b342513c1a7082500 not found: ID does not exist" Dec 02 08:55:40 crc kubenswrapper[4895]: I1202 08:55:40.524052 4895 scope.go:117] "RemoveContainer" 
containerID="9222eac2b3a031fd84ac2b38ed60980d683ee35081210b4355e9c9df1f165284" Dec 02 08:55:40 crc kubenswrapper[4895]: I1202 08:55:40.524414 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 02 08:55:40 crc kubenswrapper[4895]: E1202 08:55:40.524968 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9222eac2b3a031fd84ac2b38ed60980d683ee35081210b4355e9c9df1f165284\": container with ID starting with 9222eac2b3a031fd84ac2b38ed60980d683ee35081210b4355e9c9df1f165284 not found: ID does not exist" containerID="9222eac2b3a031fd84ac2b38ed60980d683ee35081210b4355e9c9df1f165284" Dec 02 08:55:40 crc kubenswrapper[4895]: I1202 08:55:40.524994 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9222eac2b3a031fd84ac2b38ed60980d683ee35081210b4355e9c9df1f165284"} err="failed to get container status \"9222eac2b3a031fd84ac2b38ed60980d683ee35081210b4355e9c9df1f165284\": rpc error: code = NotFound desc = could not find container \"9222eac2b3a031fd84ac2b38ed60980d683ee35081210b4355e9c9df1f165284\": container with ID starting with 9222eac2b3a031fd84ac2b38ed60980d683ee35081210b4355e9c9df1f165284 not found: ID does not exist" Dec 02 08:55:40 crc kubenswrapper[4895]: I1202 08:55:40.525015 4895 scope.go:117] "RemoveContainer" containerID="bbda25dbf82d3494ba54881f6510563aa5d56766a86eb75b342513c1a7082500" Dec 02 08:55:40 crc kubenswrapper[4895]: I1202 08:55:40.525233 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bbda25dbf82d3494ba54881f6510563aa5d56766a86eb75b342513c1a7082500"} err="failed to get container status \"bbda25dbf82d3494ba54881f6510563aa5d56766a86eb75b342513c1a7082500\": rpc error: code = NotFound desc = could not find container \"bbda25dbf82d3494ba54881f6510563aa5d56766a86eb75b342513c1a7082500\": container with ID starting with 
bbda25dbf82d3494ba54881f6510563aa5d56766a86eb75b342513c1a7082500 not found: ID does not exist" Dec 02 08:55:40 crc kubenswrapper[4895]: I1202 08:55:40.525256 4895 scope.go:117] "RemoveContainer" containerID="9222eac2b3a031fd84ac2b38ed60980d683ee35081210b4355e9c9df1f165284" Dec 02 08:55:40 crc kubenswrapper[4895]: I1202 08:55:40.525497 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9222eac2b3a031fd84ac2b38ed60980d683ee35081210b4355e9c9df1f165284"} err="failed to get container status \"9222eac2b3a031fd84ac2b38ed60980d683ee35081210b4355e9c9df1f165284\": rpc error: code = NotFound desc = could not find container \"9222eac2b3a031fd84ac2b38ed60980d683ee35081210b4355e9c9df1f165284\": container with ID starting with 9222eac2b3a031fd84ac2b38ed60980d683ee35081210b4355e9c9df1f165284 not found: ID does not exist" Dec 02 08:55:40 crc kubenswrapper[4895]: I1202 08:55:40.546031 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 02 08:55:40 crc kubenswrapper[4895]: I1202 08:55:40.690049 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/773bc693-f07d-4938-8980-3099a7dbc5dd-logs\") pod \"glance-default-internal-api-0\" (UID: \"773bc693-f07d-4938-8980-3099a7dbc5dd\") " pod="openstack/glance-default-internal-api-0" Dec 02 08:55:40 crc kubenswrapper[4895]: I1202 08:55:40.690471 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/773bc693-f07d-4938-8980-3099a7dbc5dd-ceph\") pod \"glance-default-internal-api-0\" (UID: \"773bc693-f07d-4938-8980-3099a7dbc5dd\") " pod="openstack/glance-default-internal-api-0" Dec 02 08:55:40 crc kubenswrapper[4895]: I1202 08:55:40.690498 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" 
(UniqueName: \"kubernetes.io/empty-dir/773bc693-f07d-4938-8980-3099a7dbc5dd-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"773bc693-f07d-4938-8980-3099a7dbc5dd\") " pod="openstack/glance-default-internal-api-0" Dec 02 08:55:40 crc kubenswrapper[4895]: I1202 08:55:40.690529 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzttl\" (UniqueName: \"kubernetes.io/projected/773bc693-f07d-4938-8980-3099a7dbc5dd-kube-api-access-mzttl\") pod \"glance-default-internal-api-0\" (UID: \"773bc693-f07d-4938-8980-3099a7dbc5dd\") " pod="openstack/glance-default-internal-api-0" Dec 02 08:55:40 crc kubenswrapper[4895]: I1202 08:55:40.690558 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/773bc693-f07d-4938-8980-3099a7dbc5dd-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"773bc693-f07d-4938-8980-3099a7dbc5dd\") " pod="openstack/glance-default-internal-api-0" Dec 02 08:55:40 crc kubenswrapper[4895]: I1202 08:55:40.690606 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/773bc693-f07d-4938-8980-3099a7dbc5dd-scripts\") pod \"glance-default-internal-api-0\" (UID: \"773bc693-f07d-4938-8980-3099a7dbc5dd\") " pod="openstack/glance-default-internal-api-0" Dec 02 08:55:40 crc kubenswrapper[4895]: I1202 08:55:40.690645 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/773bc693-f07d-4938-8980-3099a7dbc5dd-config-data\") pod \"glance-default-internal-api-0\" (UID: \"773bc693-f07d-4938-8980-3099a7dbc5dd\") " pod="openstack/glance-default-internal-api-0" Dec 02 08:55:40 crc kubenswrapper[4895]: I1202 08:55:40.792078 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ceph\" (UniqueName: \"kubernetes.io/projected/773bc693-f07d-4938-8980-3099a7dbc5dd-ceph\") pod \"glance-default-internal-api-0\" (UID: \"773bc693-f07d-4938-8980-3099a7dbc5dd\") " pod="openstack/glance-default-internal-api-0" Dec 02 08:55:40 crc kubenswrapper[4895]: I1202 08:55:40.792124 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/773bc693-f07d-4938-8980-3099a7dbc5dd-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"773bc693-f07d-4938-8980-3099a7dbc5dd\") " pod="openstack/glance-default-internal-api-0" Dec 02 08:55:40 crc kubenswrapper[4895]: I1202 08:55:40.792155 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mzttl\" (UniqueName: \"kubernetes.io/projected/773bc693-f07d-4938-8980-3099a7dbc5dd-kube-api-access-mzttl\") pod \"glance-default-internal-api-0\" (UID: \"773bc693-f07d-4938-8980-3099a7dbc5dd\") " pod="openstack/glance-default-internal-api-0" Dec 02 08:55:40 crc kubenswrapper[4895]: I1202 08:55:40.792186 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/773bc693-f07d-4938-8980-3099a7dbc5dd-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"773bc693-f07d-4938-8980-3099a7dbc5dd\") " pod="openstack/glance-default-internal-api-0" Dec 02 08:55:40 crc kubenswrapper[4895]: I1202 08:55:40.792241 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/773bc693-f07d-4938-8980-3099a7dbc5dd-scripts\") pod \"glance-default-internal-api-0\" (UID: \"773bc693-f07d-4938-8980-3099a7dbc5dd\") " pod="openstack/glance-default-internal-api-0" Dec 02 08:55:40 crc kubenswrapper[4895]: I1202 08:55:40.792280 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/773bc693-f07d-4938-8980-3099a7dbc5dd-config-data\") pod \"glance-default-internal-api-0\" (UID: \"773bc693-f07d-4938-8980-3099a7dbc5dd\") " pod="openstack/glance-default-internal-api-0" Dec 02 08:55:40 crc kubenswrapper[4895]: I1202 08:55:40.792317 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/773bc693-f07d-4938-8980-3099a7dbc5dd-logs\") pod \"glance-default-internal-api-0\" (UID: \"773bc693-f07d-4938-8980-3099a7dbc5dd\") " pod="openstack/glance-default-internal-api-0" Dec 02 08:55:40 crc kubenswrapper[4895]: I1202 08:55:40.792821 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/773bc693-f07d-4938-8980-3099a7dbc5dd-logs\") pod \"glance-default-internal-api-0\" (UID: \"773bc693-f07d-4938-8980-3099a7dbc5dd\") " pod="openstack/glance-default-internal-api-0" Dec 02 08:55:40 crc kubenswrapper[4895]: I1202 08:55:40.793906 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/773bc693-f07d-4938-8980-3099a7dbc5dd-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"773bc693-f07d-4938-8980-3099a7dbc5dd\") " pod="openstack/glance-default-internal-api-0" Dec 02 08:55:40 crc kubenswrapper[4895]: I1202 08:55:40.799686 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/773bc693-f07d-4938-8980-3099a7dbc5dd-config-data\") pod \"glance-default-internal-api-0\" (UID: \"773bc693-f07d-4938-8980-3099a7dbc5dd\") " pod="openstack/glance-default-internal-api-0" Dec 02 08:55:40 crc kubenswrapper[4895]: I1202 08:55:40.800323 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/773bc693-f07d-4938-8980-3099a7dbc5dd-scripts\") pod \"glance-default-internal-api-0\" (UID: 
\"773bc693-f07d-4938-8980-3099a7dbc5dd\") " pod="openstack/glance-default-internal-api-0" Dec 02 08:55:40 crc kubenswrapper[4895]: I1202 08:55:40.800712 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/773bc693-f07d-4938-8980-3099a7dbc5dd-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"773bc693-f07d-4938-8980-3099a7dbc5dd\") " pod="openstack/glance-default-internal-api-0" Dec 02 08:55:40 crc kubenswrapper[4895]: I1202 08:55:40.800935 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/773bc693-f07d-4938-8980-3099a7dbc5dd-ceph\") pod \"glance-default-internal-api-0\" (UID: \"773bc693-f07d-4938-8980-3099a7dbc5dd\") " pod="openstack/glance-default-internal-api-0" Dec 02 08:55:40 crc kubenswrapper[4895]: I1202 08:55:40.816669 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzttl\" (UniqueName: \"kubernetes.io/projected/773bc693-f07d-4938-8980-3099a7dbc5dd-kube-api-access-mzttl\") pod \"glance-default-internal-api-0\" (UID: \"773bc693-f07d-4938-8980-3099a7dbc5dd\") " pod="openstack/glance-default-internal-api-0" Dec 02 08:55:40 crc kubenswrapper[4895]: I1202 08:55:40.846831 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 02 08:55:41 crc kubenswrapper[4895]: I1202 08:55:41.154269 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd13fbd9-1640-45e7-ba4d-10c3e459e7f0" path="/var/lib/kubelet/pods/cd13fbd9-1640-45e7-ba4d-10c3e459e7f0/volumes" Dec 02 08:55:41 crc kubenswrapper[4895]: I1202 08:55:41.439012 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 02 08:55:41 crc kubenswrapper[4895]: W1202 08:55:41.441875 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod773bc693_f07d_4938_8980_3099a7dbc5dd.slice/crio-cd8945dc3b543224fde2d78afdbbec6fb4ee5d8f98340ca7e2432d547ad0a045 WatchSource:0}: Error finding container cd8945dc3b543224fde2d78afdbbec6fb4ee5d8f98340ca7e2432d547ad0a045: Status 404 returned error can't find the container with id cd8945dc3b543224fde2d78afdbbec6fb4ee5d8f98340ca7e2432d547ad0a045 Dec 02 08:55:41 crc kubenswrapper[4895]: I1202 08:55:41.451338 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2d9c01c8-fbe2-4c99-9c4a-edc0560aebd6","Type":"ContainerStarted","Data":"da1c3a95269074d1f09180dc4ae3943a96d109e287418ba82bc8eb967de3da9b"} Dec 02 08:55:41 crc kubenswrapper[4895]: I1202 08:55:41.491083 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.491059505 podStartE2EDuration="3.491059505s" podCreationTimestamp="2025-12-02 08:55:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:55:41.476629145 +0000 UTC m=+5552.647488778" watchObservedRunningTime="2025-12-02 08:55:41.491059505 +0000 UTC m=+5552.661919118" Dec 02 08:55:42 crc kubenswrapper[4895]: I1202 08:55:42.468307 4895 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"773bc693-f07d-4938-8980-3099a7dbc5dd","Type":"ContainerStarted","Data":"aa831bd876d7a69f0f63864e7275c900c7bd4478df5baff23116f39ba661ed6a"} Dec 02 08:55:42 crc kubenswrapper[4895]: I1202 08:55:42.468662 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"773bc693-f07d-4938-8980-3099a7dbc5dd","Type":"ContainerStarted","Data":"cd8945dc3b543224fde2d78afdbbec6fb4ee5d8f98340ca7e2432d547ad0a045"} Dec 02 08:55:43 crc kubenswrapper[4895]: I1202 08:55:43.480460 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"773bc693-f07d-4938-8980-3099a7dbc5dd","Type":"ContainerStarted","Data":"35d0b895398752a7799047c2a0e5d3eb54f05528de7ecc20484d047f9950c08a"} Dec 02 08:55:43 crc kubenswrapper[4895]: I1202 08:55:43.518241 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.5182087600000003 podStartE2EDuration="3.51820876s" podCreationTimestamp="2025-12-02 08:55:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:55:43.501037495 +0000 UTC m=+5554.671897118" watchObservedRunningTime="2025-12-02 08:55:43.51820876 +0000 UTC m=+5554.689068403" Dec 02 08:55:45 crc kubenswrapper[4895]: I1202 08:55:45.366732 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-785b8787c9-tnskb" Dec 02 08:55:45 crc kubenswrapper[4895]: I1202 08:55:45.437171 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-64dcccd5c7-mwdjx"] Dec 02 08:55:45 crc kubenswrapper[4895]: I1202 08:55:45.437462 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-64dcccd5c7-mwdjx" 
podUID="343e44b9-20db-43e6-8b44-c78a6159b631" containerName="dnsmasq-dns" containerID="cri-o://d37019dad4cb761c10e4bfbfcf77a1111d2a9cfdc6cc4d6d8e8d5c933336921e" gracePeriod=10 Dec 02 08:55:46 crc kubenswrapper[4895]: I1202 08:55:46.464507 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-64dcccd5c7-mwdjx" Dec 02 08:55:46 crc kubenswrapper[4895]: I1202 08:55:46.528393 4895 generic.go:334] "Generic (PLEG): container finished" podID="343e44b9-20db-43e6-8b44-c78a6159b631" containerID="d37019dad4cb761c10e4bfbfcf77a1111d2a9cfdc6cc4d6d8e8d5c933336921e" exitCode=0 Dec 02 08:55:46 crc kubenswrapper[4895]: I1202 08:55:46.528452 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64dcccd5c7-mwdjx" event={"ID":"343e44b9-20db-43e6-8b44-c78a6159b631","Type":"ContainerDied","Data":"d37019dad4cb761c10e4bfbfcf77a1111d2a9cfdc6cc4d6d8e8d5c933336921e"} Dec 02 08:55:46 crc kubenswrapper[4895]: I1202 08:55:46.528493 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64dcccd5c7-mwdjx" event={"ID":"343e44b9-20db-43e6-8b44-c78a6159b631","Type":"ContainerDied","Data":"cc40b8d7aab4a10b6a3fd3878a73fee6788e3fc2a5833b8fbaaeaa96ec68a320"} Dec 02 08:55:46 crc kubenswrapper[4895]: I1202 08:55:46.528525 4895 scope.go:117] "RemoveContainer" containerID="d37019dad4cb761c10e4bfbfcf77a1111d2a9cfdc6cc4d6d8e8d5c933336921e" Dec 02 08:55:46 crc kubenswrapper[4895]: I1202 08:55:46.528574 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-64dcccd5c7-mwdjx" Dec 02 08:55:46 crc kubenswrapper[4895]: I1202 08:55:46.553550 4895 scope.go:117] "RemoveContainer" containerID="a9219e6f2c3e5cff8935dc4203f4f99ac8ff7da48f062bfb653addad1cf80a18" Dec 02 08:55:46 crc kubenswrapper[4895]: I1202 08:55:46.575898 4895 scope.go:117] "RemoveContainer" containerID="d37019dad4cb761c10e4bfbfcf77a1111d2a9cfdc6cc4d6d8e8d5c933336921e" Dec 02 08:55:46 crc kubenswrapper[4895]: E1202 08:55:46.576393 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d37019dad4cb761c10e4bfbfcf77a1111d2a9cfdc6cc4d6d8e8d5c933336921e\": container with ID starting with d37019dad4cb761c10e4bfbfcf77a1111d2a9cfdc6cc4d6d8e8d5c933336921e not found: ID does not exist" containerID="d37019dad4cb761c10e4bfbfcf77a1111d2a9cfdc6cc4d6d8e8d5c933336921e" Dec 02 08:55:46 crc kubenswrapper[4895]: I1202 08:55:46.576426 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d37019dad4cb761c10e4bfbfcf77a1111d2a9cfdc6cc4d6d8e8d5c933336921e"} err="failed to get container status \"d37019dad4cb761c10e4bfbfcf77a1111d2a9cfdc6cc4d6d8e8d5c933336921e\": rpc error: code = NotFound desc = could not find container \"d37019dad4cb761c10e4bfbfcf77a1111d2a9cfdc6cc4d6d8e8d5c933336921e\": container with ID starting with d37019dad4cb761c10e4bfbfcf77a1111d2a9cfdc6cc4d6d8e8d5c933336921e not found: ID does not exist" Dec 02 08:55:46 crc kubenswrapper[4895]: I1202 08:55:46.576447 4895 scope.go:117] "RemoveContainer" containerID="a9219e6f2c3e5cff8935dc4203f4f99ac8ff7da48f062bfb653addad1cf80a18" Dec 02 08:55:46 crc kubenswrapper[4895]: E1202 08:55:46.576827 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9219e6f2c3e5cff8935dc4203f4f99ac8ff7da48f062bfb653addad1cf80a18\": container with ID starting with 
a9219e6f2c3e5cff8935dc4203f4f99ac8ff7da48f062bfb653addad1cf80a18 not found: ID does not exist" containerID="a9219e6f2c3e5cff8935dc4203f4f99ac8ff7da48f062bfb653addad1cf80a18" Dec 02 08:55:46 crc kubenswrapper[4895]: I1202 08:55:46.576889 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9219e6f2c3e5cff8935dc4203f4f99ac8ff7da48f062bfb653addad1cf80a18"} err="failed to get container status \"a9219e6f2c3e5cff8935dc4203f4f99ac8ff7da48f062bfb653addad1cf80a18\": rpc error: code = NotFound desc = could not find container \"a9219e6f2c3e5cff8935dc4203f4f99ac8ff7da48f062bfb653addad1cf80a18\": container with ID starting with a9219e6f2c3e5cff8935dc4203f4f99ac8ff7da48f062bfb653addad1cf80a18 not found: ID does not exist" Dec 02 08:55:46 crc kubenswrapper[4895]: I1202 08:55:46.611387 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/343e44b9-20db-43e6-8b44-c78a6159b631-ovsdbserver-nb\") pod \"343e44b9-20db-43e6-8b44-c78a6159b631\" (UID: \"343e44b9-20db-43e6-8b44-c78a6159b631\") " Dec 02 08:55:46 crc kubenswrapper[4895]: I1202 08:55:46.611552 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/343e44b9-20db-43e6-8b44-c78a6159b631-dns-svc\") pod \"343e44b9-20db-43e6-8b44-c78a6159b631\" (UID: \"343e44b9-20db-43e6-8b44-c78a6159b631\") " Dec 02 08:55:46 crc kubenswrapper[4895]: I1202 08:55:46.611691 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2gprm\" (UniqueName: \"kubernetes.io/projected/343e44b9-20db-43e6-8b44-c78a6159b631-kube-api-access-2gprm\") pod \"343e44b9-20db-43e6-8b44-c78a6159b631\" (UID: \"343e44b9-20db-43e6-8b44-c78a6159b631\") " Dec 02 08:55:46 crc kubenswrapper[4895]: I1202 08:55:46.611965 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/343e44b9-20db-43e6-8b44-c78a6159b631-ovsdbserver-sb\") pod \"343e44b9-20db-43e6-8b44-c78a6159b631\" (UID: \"343e44b9-20db-43e6-8b44-c78a6159b631\") " Dec 02 08:55:46 crc kubenswrapper[4895]: I1202 08:55:46.612007 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/343e44b9-20db-43e6-8b44-c78a6159b631-config\") pod \"343e44b9-20db-43e6-8b44-c78a6159b631\" (UID: \"343e44b9-20db-43e6-8b44-c78a6159b631\") " Dec 02 08:55:46 crc kubenswrapper[4895]: I1202 08:55:46.618238 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/343e44b9-20db-43e6-8b44-c78a6159b631-kube-api-access-2gprm" (OuterVolumeSpecName: "kube-api-access-2gprm") pod "343e44b9-20db-43e6-8b44-c78a6159b631" (UID: "343e44b9-20db-43e6-8b44-c78a6159b631"). InnerVolumeSpecName "kube-api-access-2gprm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:55:46 crc kubenswrapper[4895]: I1202 08:55:46.654433 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/343e44b9-20db-43e6-8b44-c78a6159b631-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "343e44b9-20db-43e6-8b44-c78a6159b631" (UID: "343e44b9-20db-43e6-8b44-c78a6159b631"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:55:46 crc kubenswrapper[4895]: I1202 08:55:46.655931 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/343e44b9-20db-43e6-8b44-c78a6159b631-config" (OuterVolumeSpecName: "config") pod "343e44b9-20db-43e6-8b44-c78a6159b631" (UID: "343e44b9-20db-43e6-8b44-c78a6159b631"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:55:46 crc kubenswrapper[4895]: E1202 08:55:46.663447 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/343e44b9-20db-43e6-8b44-c78a6159b631-ovsdbserver-nb podName:343e44b9-20db-43e6-8b44-c78a6159b631 nodeName:}" failed. No retries permitted until 2025-12-02 08:55:47.163399907 +0000 UTC m=+5558.334259550 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "ovsdbserver-nb" (UniqueName: "kubernetes.io/configmap/343e44b9-20db-43e6-8b44-c78a6159b631-ovsdbserver-nb") pod "343e44b9-20db-43e6-8b44-c78a6159b631" (UID: "343e44b9-20db-43e6-8b44-c78a6159b631") : error deleting /var/lib/kubelet/pods/343e44b9-20db-43e6-8b44-c78a6159b631/volume-subpaths: remove /var/lib/kubelet/pods/343e44b9-20db-43e6-8b44-c78a6159b631/volume-subpaths: no such file or directory Dec 02 08:55:46 crc kubenswrapper[4895]: I1202 08:55:46.663811 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/343e44b9-20db-43e6-8b44-c78a6159b631-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "343e44b9-20db-43e6-8b44-c78a6159b631" (UID: "343e44b9-20db-43e6-8b44-c78a6159b631"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:55:46 crc kubenswrapper[4895]: I1202 08:55:46.714482 4895 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/343e44b9-20db-43e6-8b44-c78a6159b631-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 02 08:55:46 crc kubenswrapper[4895]: I1202 08:55:46.714526 4895 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/343e44b9-20db-43e6-8b44-c78a6159b631-config\") on node \"crc\" DevicePath \"\"" Dec 02 08:55:46 crc kubenswrapper[4895]: I1202 08:55:46.714539 4895 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/343e44b9-20db-43e6-8b44-c78a6159b631-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 02 08:55:46 crc kubenswrapper[4895]: I1202 08:55:46.714553 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2gprm\" (UniqueName: \"kubernetes.io/projected/343e44b9-20db-43e6-8b44-c78a6159b631-kube-api-access-2gprm\") on node \"crc\" DevicePath \"\"" Dec 02 08:55:47 crc kubenswrapper[4895]: I1202 08:55:47.223604 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/343e44b9-20db-43e6-8b44-c78a6159b631-ovsdbserver-nb\") pod \"343e44b9-20db-43e6-8b44-c78a6159b631\" (UID: \"343e44b9-20db-43e6-8b44-c78a6159b631\") " Dec 02 08:55:47 crc kubenswrapper[4895]: I1202 08:55:47.224648 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/343e44b9-20db-43e6-8b44-c78a6159b631-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "343e44b9-20db-43e6-8b44-c78a6159b631" (UID: "343e44b9-20db-43e6-8b44-c78a6159b631"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:55:47 crc kubenswrapper[4895]: I1202 08:55:47.326273 4895 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/343e44b9-20db-43e6-8b44-c78a6159b631-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 02 08:55:47 crc kubenswrapper[4895]: I1202 08:55:47.476977 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-64dcccd5c7-mwdjx"] Dec 02 08:55:47 crc kubenswrapper[4895]: I1202 08:55:47.488170 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-64dcccd5c7-mwdjx"] Dec 02 08:55:48 crc kubenswrapper[4895]: I1202 08:55:48.813982 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 02 08:55:48 crc kubenswrapper[4895]: I1202 08:55:48.814471 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 02 08:55:48 crc kubenswrapper[4895]: I1202 08:55:48.845431 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 02 08:55:48 crc kubenswrapper[4895]: I1202 08:55:48.865202 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 02 08:55:49 crc kubenswrapper[4895]: I1202 08:55:49.153608 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="343e44b9-20db-43e6-8b44-c78a6159b631" path="/var/lib/kubelet/pods/343e44b9-20db-43e6-8b44-c78a6159b631/volumes" Dec 02 08:55:49 crc kubenswrapper[4895]: I1202 08:55:49.559787 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 02 08:55:49 crc kubenswrapper[4895]: I1202 08:55:49.560149 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 
02 08:55:50 crc kubenswrapper[4895]: I1202 08:55:50.848554 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 02 08:55:50 crc kubenswrapper[4895]: I1202 08:55:50.848640 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 02 08:55:50 crc kubenswrapper[4895]: I1202 08:55:50.878485 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 02 08:55:50 crc kubenswrapper[4895]: I1202 08:55:50.900389 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 02 08:55:51 crc kubenswrapper[4895]: I1202 08:55:51.575897 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 02 08:55:51 crc kubenswrapper[4895]: I1202 08:55:51.575951 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 02 08:55:51 crc kubenswrapper[4895]: I1202 08:55:51.583495 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 02 08:55:51 crc kubenswrapper[4895]: I1202 08:55:51.583618 4895 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 02 08:55:51 crc kubenswrapper[4895]: I1202 08:55:51.609662 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 02 08:55:53 crc kubenswrapper[4895]: I1202 08:55:53.672169 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 02 08:55:53 crc kubenswrapper[4895]: I1202 08:55:53.672623 4895 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 02 08:55:53 crc kubenswrapper[4895]: I1202 08:55:53.729888 4895 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 02 08:55:59 crc kubenswrapper[4895]: I1202 08:55:59.656932 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-nmzg7"] Dec 02 08:55:59 crc kubenswrapper[4895]: E1202 08:55:59.658030 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="343e44b9-20db-43e6-8b44-c78a6159b631" containerName="init" Dec 02 08:55:59 crc kubenswrapper[4895]: I1202 08:55:59.658046 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="343e44b9-20db-43e6-8b44-c78a6159b631" containerName="init" Dec 02 08:55:59 crc kubenswrapper[4895]: E1202 08:55:59.658062 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="343e44b9-20db-43e6-8b44-c78a6159b631" containerName="dnsmasq-dns" Dec 02 08:55:59 crc kubenswrapper[4895]: I1202 08:55:59.658069 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="343e44b9-20db-43e6-8b44-c78a6159b631" containerName="dnsmasq-dns" Dec 02 08:55:59 crc kubenswrapper[4895]: I1202 08:55:59.658259 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="343e44b9-20db-43e6-8b44-c78a6159b631" containerName="dnsmasq-dns" Dec 02 08:55:59 crc kubenswrapper[4895]: I1202 08:55:59.658905 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-nmzg7" Dec 02 08:55:59 crc kubenswrapper[4895]: I1202 08:55:59.665804 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-nmzg7"] Dec 02 08:55:59 crc kubenswrapper[4895]: I1202 08:55:59.667431 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66hbc\" (UniqueName: \"kubernetes.io/projected/c59d4c29-3b2b-4966-afcf-beb3f0ea1502-kube-api-access-66hbc\") pod \"placement-db-create-nmzg7\" (UID: \"c59d4c29-3b2b-4966-afcf-beb3f0ea1502\") " pod="openstack/placement-db-create-nmzg7" Dec 02 08:55:59 crc kubenswrapper[4895]: I1202 08:55:59.667531 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c59d4c29-3b2b-4966-afcf-beb3f0ea1502-operator-scripts\") pod \"placement-db-create-nmzg7\" (UID: \"c59d4c29-3b2b-4966-afcf-beb3f0ea1502\") " pod="openstack/placement-db-create-nmzg7" Dec 02 08:55:59 crc kubenswrapper[4895]: I1202 08:55:59.758247 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-3f33-account-create-update-rlk7v"] Dec 02 08:55:59 crc kubenswrapper[4895]: I1202 08:55:59.780898 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-3f33-account-create-update-rlk7v" Dec 02 08:55:59 crc kubenswrapper[4895]: I1202 08:55:59.787809 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-3f33-account-create-update-rlk7v"] Dec 02 08:55:59 crc kubenswrapper[4895]: I1202 08:55:59.789200 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Dec 02 08:55:59 crc kubenswrapper[4895]: I1202 08:55:59.793005 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66hbc\" (UniqueName: \"kubernetes.io/projected/c59d4c29-3b2b-4966-afcf-beb3f0ea1502-kube-api-access-66hbc\") pod \"placement-db-create-nmzg7\" (UID: \"c59d4c29-3b2b-4966-afcf-beb3f0ea1502\") " pod="openstack/placement-db-create-nmzg7" Dec 02 08:55:59 crc kubenswrapper[4895]: I1202 08:55:59.793253 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c59d4c29-3b2b-4966-afcf-beb3f0ea1502-operator-scripts\") pod \"placement-db-create-nmzg7\" (UID: \"c59d4c29-3b2b-4966-afcf-beb3f0ea1502\") " pod="openstack/placement-db-create-nmzg7" Dec 02 08:55:59 crc kubenswrapper[4895]: I1202 08:55:59.796689 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c59d4c29-3b2b-4966-afcf-beb3f0ea1502-operator-scripts\") pod \"placement-db-create-nmzg7\" (UID: \"c59d4c29-3b2b-4966-afcf-beb3f0ea1502\") " pod="openstack/placement-db-create-nmzg7" Dec 02 08:55:59 crc kubenswrapper[4895]: I1202 08:55:59.815752 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66hbc\" (UniqueName: \"kubernetes.io/projected/c59d4c29-3b2b-4966-afcf-beb3f0ea1502-kube-api-access-66hbc\") pod \"placement-db-create-nmzg7\" (UID: \"c59d4c29-3b2b-4966-afcf-beb3f0ea1502\") " pod="openstack/placement-db-create-nmzg7" Dec 02 08:55:59 crc 
kubenswrapper[4895]: I1202 08:55:59.894970 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1dca9eb0-0064-474c-864d-2a3ee5a37609-operator-scripts\") pod \"placement-3f33-account-create-update-rlk7v\" (UID: \"1dca9eb0-0064-474c-864d-2a3ee5a37609\") " pod="openstack/placement-3f33-account-create-update-rlk7v" Dec 02 08:55:59 crc kubenswrapper[4895]: I1202 08:55:59.895048 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfwhb\" (UniqueName: \"kubernetes.io/projected/1dca9eb0-0064-474c-864d-2a3ee5a37609-kube-api-access-cfwhb\") pod \"placement-3f33-account-create-update-rlk7v\" (UID: \"1dca9eb0-0064-474c-864d-2a3ee5a37609\") " pod="openstack/placement-3f33-account-create-update-rlk7v" Dec 02 08:55:59 crc kubenswrapper[4895]: I1202 08:55:59.980076 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-nmzg7" Dec 02 08:55:59 crc kubenswrapper[4895]: I1202 08:55:59.997300 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1dca9eb0-0064-474c-864d-2a3ee5a37609-operator-scripts\") pod \"placement-3f33-account-create-update-rlk7v\" (UID: \"1dca9eb0-0064-474c-864d-2a3ee5a37609\") " pod="openstack/placement-3f33-account-create-update-rlk7v" Dec 02 08:55:59 crc kubenswrapper[4895]: I1202 08:55:59.997375 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cfwhb\" (UniqueName: \"kubernetes.io/projected/1dca9eb0-0064-474c-864d-2a3ee5a37609-kube-api-access-cfwhb\") pod \"placement-3f33-account-create-update-rlk7v\" (UID: \"1dca9eb0-0064-474c-864d-2a3ee5a37609\") " pod="openstack/placement-3f33-account-create-update-rlk7v" Dec 02 08:55:59 crc kubenswrapper[4895]: I1202 08:55:59.998642 4895 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1dca9eb0-0064-474c-864d-2a3ee5a37609-operator-scripts\") pod \"placement-3f33-account-create-update-rlk7v\" (UID: \"1dca9eb0-0064-474c-864d-2a3ee5a37609\") " pod="openstack/placement-3f33-account-create-update-rlk7v" Dec 02 08:56:00 crc kubenswrapper[4895]: I1202 08:56:00.022782 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfwhb\" (UniqueName: \"kubernetes.io/projected/1dca9eb0-0064-474c-864d-2a3ee5a37609-kube-api-access-cfwhb\") pod \"placement-3f33-account-create-update-rlk7v\" (UID: \"1dca9eb0-0064-474c-864d-2a3ee5a37609\") " pod="openstack/placement-3f33-account-create-update-rlk7v" Dec 02 08:56:00 crc kubenswrapper[4895]: I1202 08:56:00.105896 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-3f33-account-create-update-rlk7v" Dec 02 08:56:00 crc kubenswrapper[4895]: I1202 08:56:00.506690 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-nmzg7"] Dec 02 08:56:00 crc kubenswrapper[4895]: W1202 08:56:00.507400 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc59d4c29_3b2b_4966_afcf_beb3f0ea1502.slice/crio-1c22f7b7d10c6c54f370638c2e0ddf21cbf78e2598fbdc58eb8bfb5e943bbec7 WatchSource:0}: Error finding container 1c22f7b7d10c6c54f370638c2e0ddf21cbf78e2598fbdc58eb8bfb5e943bbec7: Status 404 returned error can't find the container with id 1c22f7b7d10c6c54f370638c2e0ddf21cbf78e2598fbdc58eb8bfb5e943bbec7 Dec 02 08:56:00 crc kubenswrapper[4895]: I1202 08:56:00.612758 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-3f33-account-create-update-rlk7v"] Dec 02 08:56:00 crc kubenswrapper[4895]: W1202 08:56:00.618787 4895 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1dca9eb0_0064_474c_864d_2a3ee5a37609.slice/crio-dfbd1697a3a07d48d045f5a48ff8d0ce1449d5b4b216a9f3edd2227fc603d1e4 WatchSource:0}: Error finding container dfbd1697a3a07d48d045f5a48ff8d0ce1449d5b4b216a9f3edd2227fc603d1e4: Status 404 returned error can't find the container with id dfbd1697a3a07d48d045f5a48ff8d0ce1449d5b4b216a9f3edd2227fc603d1e4 Dec 02 08:56:00 crc kubenswrapper[4895]: I1202 08:56:00.717728 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-nmzg7" event={"ID":"c59d4c29-3b2b-4966-afcf-beb3f0ea1502","Type":"ContainerStarted","Data":"7f396d433321705302823b9107bbacfefd5ae4d6825c0b6f6e0a00cfe6dc7c8b"} Dec 02 08:56:00 crc kubenswrapper[4895]: I1202 08:56:00.717853 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-nmzg7" event={"ID":"c59d4c29-3b2b-4966-afcf-beb3f0ea1502","Type":"ContainerStarted","Data":"1c22f7b7d10c6c54f370638c2e0ddf21cbf78e2598fbdc58eb8bfb5e943bbec7"} Dec 02 08:56:00 crc kubenswrapper[4895]: I1202 08:56:00.720538 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-3f33-account-create-update-rlk7v" event={"ID":"1dca9eb0-0064-474c-864d-2a3ee5a37609","Type":"ContainerStarted","Data":"dfbd1697a3a07d48d045f5a48ff8d0ce1449d5b4b216a9f3edd2227fc603d1e4"} Dec 02 08:56:00 crc kubenswrapper[4895]: I1202 08:56:00.740823 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-create-nmzg7" podStartSLOduration=1.7407991680000001 podStartE2EDuration="1.740799168s" podCreationTimestamp="2025-12-02 08:55:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:56:00.733718608 +0000 UTC m=+5571.904578211" watchObservedRunningTime="2025-12-02 08:56:00.740799168 +0000 UTC m=+5571.911658801" Dec 02 08:56:01 crc kubenswrapper[4895]: I1202 
08:56:01.736119 4895 generic.go:334] "Generic (PLEG): container finished" podID="1dca9eb0-0064-474c-864d-2a3ee5a37609" containerID="ac39b4ea1f3525107544fb98140ed2c16e32d35c021325ebe0a6fe774e827d2e" exitCode=0 Dec 02 08:56:01 crc kubenswrapper[4895]: I1202 08:56:01.736238 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-3f33-account-create-update-rlk7v" event={"ID":"1dca9eb0-0064-474c-864d-2a3ee5a37609","Type":"ContainerDied","Data":"ac39b4ea1f3525107544fb98140ed2c16e32d35c021325ebe0a6fe774e827d2e"} Dec 02 08:56:01 crc kubenswrapper[4895]: I1202 08:56:01.741202 4895 generic.go:334] "Generic (PLEG): container finished" podID="c59d4c29-3b2b-4966-afcf-beb3f0ea1502" containerID="7f396d433321705302823b9107bbacfefd5ae4d6825c0b6f6e0a00cfe6dc7c8b" exitCode=0 Dec 02 08:56:01 crc kubenswrapper[4895]: I1202 08:56:01.741405 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-nmzg7" event={"ID":"c59d4c29-3b2b-4966-afcf-beb3f0ea1502","Type":"ContainerDied","Data":"7f396d433321705302823b9107bbacfefd5ae4d6825c0b6f6e0a00cfe6dc7c8b"} Dec 02 08:56:03 crc kubenswrapper[4895]: I1202 08:56:03.155222 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-nmzg7" Dec 02 08:56:03 crc kubenswrapper[4895]: I1202 08:56:03.162407 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-3f33-account-create-update-rlk7v" Dec 02 08:56:03 crc kubenswrapper[4895]: I1202 08:56:03.289843 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfwhb\" (UniqueName: \"kubernetes.io/projected/1dca9eb0-0064-474c-864d-2a3ee5a37609-kube-api-access-cfwhb\") pod \"1dca9eb0-0064-474c-864d-2a3ee5a37609\" (UID: \"1dca9eb0-0064-474c-864d-2a3ee5a37609\") " Dec 02 08:56:03 crc kubenswrapper[4895]: I1202 08:56:03.289931 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1dca9eb0-0064-474c-864d-2a3ee5a37609-operator-scripts\") pod \"1dca9eb0-0064-474c-864d-2a3ee5a37609\" (UID: \"1dca9eb0-0064-474c-864d-2a3ee5a37609\") " Dec 02 08:56:03 crc kubenswrapper[4895]: I1202 08:56:03.290014 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-66hbc\" (UniqueName: \"kubernetes.io/projected/c59d4c29-3b2b-4966-afcf-beb3f0ea1502-kube-api-access-66hbc\") pod \"c59d4c29-3b2b-4966-afcf-beb3f0ea1502\" (UID: \"c59d4c29-3b2b-4966-afcf-beb3f0ea1502\") " Dec 02 08:56:03 crc kubenswrapper[4895]: I1202 08:56:03.290042 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c59d4c29-3b2b-4966-afcf-beb3f0ea1502-operator-scripts\") pod \"c59d4c29-3b2b-4966-afcf-beb3f0ea1502\" (UID: \"c59d4c29-3b2b-4966-afcf-beb3f0ea1502\") " Dec 02 08:56:03 crc kubenswrapper[4895]: I1202 08:56:03.290783 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1dca9eb0-0064-474c-864d-2a3ee5a37609-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1dca9eb0-0064-474c-864d-2a3ee5a37609" (UID: "1dca9eb0-0064-474c-864d-2a3ee5a37609"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:56:03 crc kubenswrapper[4895]: I1202 08:56:03.290790 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c59d4c29-3b2b-4966-afcf-beb3f0ea1502-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c59d4c29-3b2b-4966-afcf-beb3f0ea1502" (UID: "c59d4c29-3b2b-4966-afcf-beb3f0ea1502"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:56:03 crc kubenswrapper[4895]: I1202 08:56:03.295907 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c59d4c29-3b2b-4966-afcf-beb3f0ea1502-kube-api-access-66hbc" (OuterVolumeSpecName: "kube-api-access-66hbc") pod "c59d4c29-3b2b-4966-afcf-beb3f0ea1502" (UID: "c59d4c29-3b2b-4966-afcf-beb3f0ea1502"). InnerVolumeSpecName "kube-api-access-66hbc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:56:03 crc kubenswrapper[4895]: I1202 08:56:03.295923 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1dca9eb0-0064-474c-864d-2a3ee5a37609-kube-api-access-cfwhb" (OuterVolumeSpecName: "kube-api-access-cfwhb") pod "1dca9eb0-0064-474c-864d-2a3ee5a37609" (UID: "1dca9eb0-0064-474c-864d-2a3ee5a37609"). InnerVolumeSpecName "kube-api-access-cfwhb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:56:03 crc kubenswrapper[4895]: I1202 08:56:03.392208 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfwhb\" (UniqueName: \"kubernetes.io/projected/1dca9eb0-0064-474c-864d-2a3ee5a37609-kube-api-access-cfwhb\") on node \"crc\" DevicePath \"\"" Dec 02 08:56:03 crc kubenswrapper[4895]: I1202 08:56:03.392239 4895 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1dca9eb0-0064-474c-864d-2a3ee5a37609-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 08:56:03 crc kubenswrapper[4895]: I1202 08:56:03.392249 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-66hbc\" (UniqueName: \"kubernetes.io/projected/c59d4c29-3b2b-4966-afcf-beb3f0ea1502-kube-api-access-66hbc\") on node \"crc\" DevicePath \"\"" Dec 02 08:56:03 crc kubenswrapper[4895]: I1202 08:56:03.392260 4895 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c59d4c29-3b2b-4966-afcf-beb3f0ea1502-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 08:56:03 crc kubenswrapper[4895]: I1202 08:56:03.782774 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-nmzg7" event={"ID":"c59d4c29-3b2b-4966-afcf-beb3f0ea1502","Type":"ContainerDied","Data":"1c22f7b7d10c6c54f370638c2e0ddf21cbf78e2598fbdc58eb8bfb5e943bbec7"} Dec 02 08:56:03 crc kubenswrapper[4895]: I1202 08:56:03.782837 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1c22f7b7d10c6c54f370638c2e0ddf21cbf78e2598fbdc58eb8bfb5e943bbec7" Dec 02 08:56:03 crc kubenswrapper[4895]: I1202 08:56:03.782906 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-nmzg7" Dec 02 08:56:03 crc kubenswrapper[4895]: I1202 08:56:03.790136 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-3f33-account-create-update-rlk7v" event={"ID":"1dca9eb0-0064-474c-864d-2a3ee5a37609","Type":"ContainerDied","Data":"dfbd1697a3a07d48d045f5a48ff8d0ce1449d5b4b216a9f3edd2227fc603d1e4"} Dec 02 08:56:03 crc kubenswrapper[4895]: I1202 08:56:03.790192 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dfbd1697a3a07d48d045f5a48ff8d0ce1449d5b4b216a9f3edd2227fc603d1e4" Dec 02 08:56:03 crc kubenswrapper[4895]: I1202 08:56:03.790265 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-3f33-account-create-update-rlk7v" Dec 02 08:56:05 crc kubenswrapper[4895]: I1202 08:56:05.049711 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-76dbf459b5-tdz8d"] Dec 02 08:56:05 crc kubenswrapper[4895]: E1202 08:56:05.050460 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1dca9eb0-0064-474c-864d-2a3ee5a37609" containerName="mariadb-account-create-update" Dec 02 08:56:05 crc kubenswrapper[4895]: I1202 08:56:05.050474 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="1dca9eb0-0064-474c-864d-2a3ee5a37609" containerName="mariadb-account-create-update" Dec 02 08:56:05 crc kubenswrapper[4895]: E1202 08:56:05.050493 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c59d4c29-3b2b-4966-afcf-beb3f0ea1502" containerName="mariadb-database-create" Dec 02 08:56:05 crc kubenswrapper[4895]: I1202 08:56:05.050500 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="c59d4c29-3b2b-4966-afcf-beb3f0ea1502" containerName="mariadb-database-create" Dec 02 08:56:05 crc kubenswrapper[4895]: I1202 08:56:05.052458 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="1dca9eb0-0064-474c-864d-2a3ee5a37609" 
containerName="mariadb-account-create-update" Dec 02 08:56:05 crc kubenswrapper[4895]: I1202 08:56:05.052523 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="c59d4c29-3b2b-4966-afcf-beb3f0ea1502" containerName="mariadb-database-create" Dec 02 08:56:05 crc kubenswrapper[4895]: I1202 08:56:05.054109 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76dbf459b5-tdz8d" Dec 02 08:56:05 crc kubenswrapper[4895]: I1202 08:56:05.067821 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76dbf459b5-tdz8d"] Dec 02 08:56:05 crc kubenswrapper[4895]: I1202 08:56:05.119893 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-h7zhh"] Dec 02 08:56:05 crc kubenswrapper[4895]: I1202 08:56:05.122049 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-h7zhh" Dec 02 08:56:05 crc kubenswrapper[4895]: I1202 08:56:05.126678 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 02 08:56:05 crc kubenswrapper[4895]: I1202 08:56:05.126778 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 02 08:56:05 crc kubenswrapper[4895]: I1202 08:56:05.126778 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-5zkxs" Dec 02 08:56:05 crc kubenswrapper[4895]: I1202 08:56:05.131924 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-h7zhh"] Dec 02 08:56:05 crc kubenswrapper[4895]: I1202 08:56:05.226637 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f1ce383-de6c-4845-8aa1-d97f8057fd90-config-data\") pod \"placement-db-sync-h7zhh\" (UID: \"1f1ce383-de6c-4845-8aa1-d97f8057fd90\") " pod="openstack/placement-db-sync-h7zhh" Dec 
Dec 02 08:56:05 crc kubenswrapper[4895]: I1202 08:56:05.226685 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e49de284-87ac-44c3-b301-99b4a5262b56-ovsdbserver-sb\") pod \"dnsmasq-dns-76dbf459b5-tdz8d\" (UID: \"e49de284-87ac-44c3-b301-99b4a5262b56\") " pod="openstack/dnsmasq-dns-76dbf459b5-tdz8d"
Dec 02 08:56:05 crc kubenswrapper[4895]: I1202 08:56:05.226753 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f1ce383-de6c-4845-8aa1-d97f8057fd90-combined-ca-bundle\") pod \"placement-db-sync-h7zhh\" (UID: \"1f1ce383-de6c-4845-8aa1-d97f8057fd90\") " pod="openstack/placement-db-sync-h7zhh"
Dec 02 08:56:05 crc kubenswrapper[4895]: I1202 08:56:05.227197 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e49de284-87ac-44c3-b301-99b4a5262b56-ovsdbserver-nb\") pod \"dnsmasq-dns-76dbf459b5-tdz8d\" (UID: \"e49de284-87ac-44c3-b301-99b4a5262b56\") " pod="openstack/dnsmasq-dns-76dbf459b5-tdz8d"
Dec 02 08:56:05 crc kubenswrapper[4895]: I1202 08:56:05.227230 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e49de284-87ac-44c3-b301-99b4a5262b56-dns-svc\") pod \"dnsmasq-dns-76dbf459b5-tdz8d\" (UID: \"e49de284-87ac-44c3-b301-99b4a5262b56\") " pod="openstack/dnsmasq-dns-76dbf459b5-tdz8d"
Dec 02 08:56:05 crc kubenswrapper[4895]: I1202 08:56:05.227259 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grbkn\" (UniqueName: \"kubernetes.io/projected/1f1ce383-de6c-4845-8aa1-d97f8057fd90-kube-api-access-grbkn\") pod \"placement-db-sync-h7zhh\" (UID: \"1f1ce383-de6c-4845-8aa1-d97f8057fd90\") " pod="openstack/placement-db-sync-h7zhh"
Dec 02 08:56:05 crc kubenswrapper[4895]: I1202 08:56:05.227286 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1f1ce383-de6c-4845-8aa1-d97f8057fd90-logs\") pod \"placement-db-sync-h7zhh\" (UID: \"1f1ce383-de6c-4845-8aa1-d97f8057fd90\") " pod="openstack/placement-db-sync-h7zhh"
Dec 02 08:56:05 crc kubenswrapper[4895]: I1202 08:56:05.227316 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e49de284-87ac-44c3-b301-99b4a5262b56-config\") pod \"dnsmasq-dns-76dbf459b5-tdz8d\" (UID: \"e49de284-87ac-44c3-b301-99b4a5262b56\") " pod="openstack/dnsmasq-dns-76dbf459b5-tdz8d"
Dec 02 08:56:05 crc kubenswrapper[4895]: I1202 08:56:05.227344 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ccsqs\" (UniqueName: \"kubernetes.io/projected/e49de284-87ac-44c3-b301-99b4a5262b56-kube-api-access-ccsqs\") pod \"dnsmasq-dns-76dbf459b5-tdz8d\" (UID: \"e49de284-87ac-44c3-b301-99b4a5262b56\") " pod="openstack/dnsmasq-dns-76dbf459b5-tdz8d"
Dec 02 08:56:05 crc kubenswrapper[4895]: I1202 08:56:05.227359 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1f1ce383-de6c-4845-8aa1-d97f8057fd90-scripts\") pod \"placement-db-sync-h7zhh\" (UID: \"1f1ce383-de6c-4845-8aa1-d97f8057fd90\") " pod="openstack/placement-db-sync-h7zhh"
Dec 02 08:56:05 crc kubenswrapper[4895]: I1202 08:56:05.329009 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1f1ce383-de6c-4845-8aa1-d97f8057fd90-logs\") pod \"placement-db-sync-h7zhh\" (UID: \"1f1ce383-de6c-4845-8aa1-d97f8057fd90\") " pod="openstack/placement-db-sync-h7zhh"
Dec 02 08:56:05 crc kubenswrapper[4895]: I1202 08:56:05.329065 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e49de284-87ac-44c3-b301-99b4a5262b56-config\") pod \"dnsmasq-dns-76dbf459b5-tdz8d\" (UID: \"e49de284-87ac-44c3-b301-99b4a5262b56\") " pod="openstack/dnsmasq-dns-76dbf459b5-tdz8d"
Dec 02 08:56:05 crc kubenswrapper[4895]: I1202 08:56:05.329095 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ccsqs\" (UniqueName: \"kubernetes.io/projected/e49de284-87ac-44c3-b301-99b4a5262b56-kube-api-access-ccsqs\") pod \"dnsmasq-dns-76dbf459b5-tdz8d\" (UID: \"e49de284-87ac-44c3-b301-99b4a5262b56\") " pod="openstack/dnsmasq-dns-76dbf459b5-tdz8d"
Dec 02 08:56:05 crc kubenswrapper[4895]: I1202 08:56:05.329114 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1f1ce383-de6c-4845-8aa1-d97f8057fd90-scripts\") pod \"placement-db-sync-h7zhh\" (UID: \"1f1ce383-de6c-4845-8aa1-d97f8057fd90\") " pod="openstack/placement-db-sync-h7zhh"
Dec 02 08:56:05 crc kubenswrapper[4895]: I1202 08:56:05.329201 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f1ce383-de6c-4845-8aa1-d97f8057fd90-config-data\") pod \"placement-db-sync-h7zhh\" (UID: \"1f1ce383-de6c-4845-8aa1-d97f8057fd90\") " pod="openstack/placement-db-sync-h7zhh"
Dec 02 08:56:05 crc kubenswrapper[4895]: I1202 08:56:05.329226 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e49de284-87ac-44c3-b301-99b4a5262b56-ovsdbserver-sb\") pod \"dnsmasq-dns-76dbf459b5-tdz8d\" (UID: \"e49de284-87ac-44c3-b301-99b4a5262b56\") " pod="openstack/dnsmasq-dns-76dbf459b5-tdz8d"
Dec 02 08:56:05 crc kubenswrapper[4895]: I1202 08:56:05.329259 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f1ce383-de6c-4845-8aa1-d97f8057fd90-combined-ca-bundle\") pod \"placement-db-sync-h7zhh\" (UID: \"1f1ce383-de6c-4845-8aa1-d97f8057fd90\") " pod="openstack/placement-db-sync-h7zhh"
Dec 02 08:56:05 crc kubenswrapper[4895]: I1202 08:56:05.329284 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e49de284-87ac-44c3-b301-99b4a5262b56-ovsdbserver-nb\") pod \"dnsmasq-dns-76dbf459b5-tdz8d\" (UID: \"e49de284-87ac-44c3-b301-99b4a5262b56\") " pod="openstack/dnsmasq-dns-76dbf459b5-tdz8d"
Dec 02 08:56:05 crc kubenswrapper[4895]: I1202 08:56:05.329303 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e49de284-87ac-44c3-b301-99b4a5262b56-dns-svc\") pod \"dnsmasq-dns-76dbf459b5-tdz8d\" (UID: \"e49de284-87ac-44c3-b301-99b4a5262b56\") " pod="openstack/dnsmasq-dns-76dbf459b5-tdz8d"
Dec 02 08:56:05 crc kubenswrapper[4895]: I1202 08:56:05.329334 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-grbkn\" (UniqueName: \"kubernetes.io/projected/1f1ce383-de6c-4845-8aa1-d97f8057fd90-kube-api-access-grbkn\") pod \"placement-db-sync-h7zhh\" (UID: \"1f1ce383-de6c-4845-8aa1-d97f8057fd90\") " pod="openstack/placement-db-sync-h7zhh"
Dec 02 08:56:05 crc kubenswrapper[4895]: I1202 08:56:05.330696 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1f1ce383-de6c-4845-8aa1-d97f8057fd90-logs\") pod \"placement-db-sync-h7zhh\" (UID: \"1f1ce383-de6c-4845-8aa1-d97f8057fd90\") " pod="openstack/placement-db-sync-h7zhh"
Dec 02 08:56:05 crc kubenswrapper[4895]: I1202 08:56:05.330720 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e49de284-87ac-44c3-b301-99b4a5262b56-ovsdbserver-nb\") pod \"dnsmasq-dns-76dbf459b5-tdz8d\" (UID: \"e49de284-87ac-44c3-b301-99b4a5262b56\") " pod="openstack/dnsmasq-dns-76dbf459b5-tdz8d"
Dec 02 08:56:05 crc kubenswrapper[4895]: I1202 08:56:05.330778 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e49de284-87ac-44c3-b301-99b4a5262b56-config\") pod \"dnsmasq-dns-76dbf459b5-tdz8d\" (UID: \"e49de284-87ac-44c3-b301-99b4a5262b56\") " pod="openstack/dnsmasq-dns-76dbf459b5-tdz8d"
Dec 02 08:56:05 crc kubenswrapper[4895]: I1202 08:56:05.330807 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e49de284-87ac-44c3-b301-99b4a5262b56-ovsdbserver-sb\") pod \"dnsmasq-dns-76dbf459b5-tdz8d\" (UID: \"e49de284-87ac-44c3-b301-99b4a5262b56\") " pod="openstack/dnsmasq-dns-76dbf459b5-tdz8d"
Dec 02 08:56:05 crc kubenswrapper[4895]: I1202 08:56:05.330777 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e49de284-87ac-44c3-b301-99b4a5262b56-dns-svc\") pod \"dnsmasq-dns-76dbf459b5-tdz8d\" (UID: \"e49de284-87ac-44c3-b301-99b4a5262b56\") " pod="openstack/dnsmasq-dns-76dbf459b5-tdz8d"
Dec 02 08:56:05 crc kubenswrapper[4895]: I1202 08:56:05.335095 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1f1ce383-de6c-4845-8aa1-d97f8057fd90-scripts\") pod \"placement-db-sync-h7zhh\" (UID: \"1f1ce383-de6c-4845-8aa1-d97f8057fd90\") " pod="openstack/placement-db-sync-h7zhh"
Dec 02 08:56:05 crc kubenswrapper[4895]: I1202 08:56:05.335491 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f1ce383-de6c-4845-8aa1-d97f8057fd90-config-data\") pod \"placement-db-sync-h7zhh\" (UID: \"1f1ce383-de6c-4845-8aa1-d97f8057fd90\") " pod="openstack/placement-db-sync-h7zhh"
Dec 02 08:56:05 crc kubenswrapper[4895]: I1202 08:56:05.336469 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f1ce383-de6c-4845-8aa1-d97f8057fd90-combined-ca-bundle\") pod \"placement-db-sync-h7zhh\" (UID: \"1f1ce383-de6c-4845-8aa1-d97f8057fd90\") " pod="openstack/placement-db-sync-h7zhh"
Dec 02 08:56:05 crc kubenswrapper[4895]: I1202 08:56:05.356337 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-grbkn\" (UniqueName: \"kubernetes.io/projected/1f1ce383-de6c-4845-8aa1-d97f8057fd90-kube-api-access-grbkn\") pod \"placement-db-sync-h7zhh\" (UID: \"1f1ce383-de6c-4845-8aa1-d97f8057fd90\") " pod="openstack/placement-db-sync-h7zhh"
Dec 02 08:56:05 crc kubenswrapper[4895]: I1202 08:56:05.358226 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ccsqs\" (UniqueName: \"kubernetes.io/projected/e49de284-87ac-44c3-b301-99b4a5262b56-kube-api-access-ccsqs\") pod \"dnsmasq-dns-76dbf459b5-tdz8d\" (UID: \"e49de284-87ac-44c3-b301-99b4a5262b56\") " pod="openstack/dnsmasq-dns-76dbf459b5-tdz8d"
Dec 02 08:56:05 crc kubenswrapper[4895]: I1202 08:56:05.386142 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76dbf459b5-tdz8d"
Dec 02 08:56:05 crc kubenswrapper[4895]: I1202 08:56:05.443291 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-h7zhh"
Dec 02 08:56:05 crc kubenswrapper[4895]: I1202 08:56:05.883572 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76dbf459b5-tdz8d"]
Dec 02 08:56:05 crc kubenswrapper[4895]: I1202 08:56:05.934512 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-h7zhh"]
Dec 02 08:56:05 crc kubenswrapper[4895]: W1202 08:56:05.940365 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1f1ce383_de6c_4845_8aa1_d97f8057fd90.slice/crio-20fe16baf78ee7f8c8dbe0bfd92d53ad4c2cc26cf34446ec22b1b65f73ea6cb5 WatchSource:0}: Error finding container 20fe16baf78ee7f8c8dbe0bfd92d53ad4c2cc26cf34446ec22b1b65f73ea6cb5: Status 404 returned error can't find the container with id 20fe16baf78ee7f8c8dbe0bfd92d53ad4c2cc26cf34446ec22b1b65f73ea6cb5
Dec 02 08:56:06 crc kubenswrapper[4895]: I1202 08:56:06.821681 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-h7zhh" event={"ID":"1f1ce383-de6c-4845-8aa1-d97f8057fd90","Type":"ContainerStarted","Data":"89cf5bc2f3e3a6102e5f96662b60004fe0576e6ba2cc20d8bb44190d4ad4b432"}
Dec 02 08:56:06 crc kubenswrapper[4895]: I1202 08:56:06.822021 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-h7zhh" event={"ID":"1f1ce383-de6c-4845-8aa1-d97f8057fd90","Type":"ContainerStarted","Data":"20fe16baf78ee7f8c8dbe0bfd92d53ad4c2cc26cf34446ec22b1b65f73ea6cb5"}
Dec 02 08:56:06 crc kubenswrapper[4895]: I1202 08:56:06.823684 4895 generic.go:334] "Generic (PLEG): container finished" podID="e49de284-87ac-44c3-b301-99b4a5262b56" containerID="df24bd7f6ab5cb75849a83ad219183a79cab563758b94727f24ba721e730b6b4" exitCode=0
Dec 02 08:56:06 crc kubenswrapper[4895]: I1202 08:56:06.823709 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76dbf459b5-tdz8d" event={"ID":"e49de284-87ac-44c3-b301-99b4a5262b56","Type":"ContainerDied","Data":"df24bd7f6ab5cb75849a83ad219183a79cab563758b94727f24ba721e730b6b4"}
Dec 02 08:56:06 crc kubenswrapper[4895]: I1202 08:56:06.823724 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76dbf459b5-tdz8d" event={"ID":"e49de284-87ac-44c3-b301-99b4a5262b56","Type":"ContainerStarted","Data":"ff305cd230451bba484642633a9a6546ee9ae1a2aff0aee8f7cb4dc347b64fa2"}
Dec 02 08:56:06 crc kubenswrapper[4895]: I1202 08:56:06.849137 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-h7zhh" podStartSLOduration=1.8491163990000001 podStartE2EDuration="1.849116399s" podCreationTimestamp="2025-12-02 08:56:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:56:06.839597423 +0000 UTC m=+5578.010457046" watchObservedRunningTime="2025-12-02 08:56:06.849116399 +0000 UTC m=+5578.019976012"
Dec 02 08:56:07 crc kubenswrapper[4895]: I1202 08:56:07.836439 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76dbf459b5-tdz8d" event={"ID":"e49de284-87ac-44c3-b301-99b4a5262b56","Type":"ContainerStarted","Data":"7b2a88b4b9839a421519c4f70694368ee2ed93b9e60d060bf21eb060ee65025f"}
Dec 02 08:56:07 crc kubenswrapper[4895]: I1202 08:56:07.836817 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-76dbf459b5-tdz8d"
Dec 02 08:56:07 crc kubenswrapper[4895]: I1202 08:56:07.839069 4895 generic.go:334] "Generic (PLEG): container finished" podID="1f1ce383-de6c-4845-8aa1-d97f8057fd90" containerID="89cf5bc2f3e3a6102e5f96662b60004fe0576e6ba2cc20d8bb44190d4ad4b432" exitCode=0
Dec 02 08:56:07 crc kubenswrapper[4895]: I1202 08:56:07.839097 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-h7zhh" event={"ID":"1f1ce383-de6c-4845-8aa1-d97f8057fd90","Type":"ContainerDied","Data":"89cf5bc2f3e3a6102e5f96662b60004fe0576e6ba2cc20d8bb44190d4ad4b432"}
Dec 02 08:56:07 crc kubenswrapper[4895]: I1202 08:56:07.864904 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-76dbf459b5-tdz8d" podStartSLOduration=2.864879039 podStartE2EDuration="2.864879039s" podCreationTimestamp="2025-12-02 08:56:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:56:07.855528849 +0000 UTC m=+5579.026388482" watchObservedRunningTime="2025-12-02 08:56:07.864879039 +0000 UTC m=+5579.035738672"
Dec 02 08:56:09 crc kubenswrapper[4895]: I1202 08:56:09.178654 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-h7zhh"
Dec 02 08:56:09 crc kubenswrapper[4895]: I1202 08:56:09.335845 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1f1ce383-de6c-4845-8aa1-d97f8057fd90-scripts\") pod \"1f1ce383-de6c-4845-8aa1-d97f8057fd90\" (UID: \"1f1ce383-de6c-4845-8aa1-d97f8057fd90\") "
Dec 02 08:56:09 crc kubenswrapper[4895]: I1202 08:56:09.335922 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1f1ce383-de6c-4845-8aa1-d97f8057fd90-logs\") pod \"1f1ce383-de6c-4845-8aa1-d97f8057fd90\" (UID: \"1f1ce383-de6c-4845-8aa1-d97f8057fd90\") "
Dec 02 08:56:09 crc kubenswrapper[4895]: I1202 08:56:09.336016 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f1ce383-de6c-4845-8aa1-d97f8057fd90-config-data\") pod \"1f1ce383-de6c-4845-8aa1-d97f8057fd90\" (UID: \"1f1ce383-de6c-4845-8aa1-d97f8057fd90\") "
Dec 02 08:56:09 crc kubenswrapper[4895]: I1202 08:56:09.336050 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f1ce383-de6c-4845-8aa1-d97f8057fd90-combined-ca-bundle\") pod \"1f1ce383-de6c-4845-8aa1-d97f8057fd90\" (UID: \"1f1ce383-de6c-4845-8aa1-d97f8057fd90\") "
Dec 02 08:56:09 crc kubenswrapper[4895]: I1202 08:56:09.336187 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-grbkn\" (UniqueName: \"kubernetes.io/projected/1f1ce383-de6c-4845-8aa1-d97f8057fd90-kube-api-access-grbkn\") pod \"1f1ce383-de6c-4845-8aa1-d97f8057fd90\" (UID: \"1f1ce383-de6c-4845-8aa1-d97f8057fd90\") "
Dec 02 08:56:09 crc kubenswrapper[4895]: I1202 08:56:09.336300 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f1ce383-de6c-4845-8aa1-d97f8057fd90-logs" (OuterVolumeSpecName: "logs") pod "1f1ce383-de6c-4845-8aa1-d97f8057fd90" (UID: "1f1ce383-de6c-4845-8aa1-d97f8057fd90"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 02 08:56:09 crc kubenswrapper[4895]: I1202 08:56:09.336612 4895 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1f1ce383-de6c-4845-8aa1-d97f8057fd90-logs\") on node \"crc\" DevicePath \"\""
Dec 02 08:56:09 crc kubenswrapper[4895]: I1202 08:56:09.341367 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f1ce383-de6c-4845-8aa1-d97f8057fd90-scripts" (OuterVolumeSpecName: "scripts") pod "1f1ce383-de6c-4845-8aa1-d97f8057fd90" (UID: "1f1ce383-de6c-4845-8aa1-d97f8057fd90"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 08:56:09 crc kubenswrapper[4895]: I1202 08:56:09.345961 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f1ce383-de6c-4845-8aa1-d97f8057fd90-kube-api-access-grbkn" (OuterVolumeSpecName: "kube-api-access-grbkn") pod "1f1ce383-de6c-4845-8aa1-d97f8057fd90" (UID: "1f1ce383-de6c-4845-8aa1-d97f8057fd90"). InnerVolumeSpecName "kube-api-access-grbkn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 08:56:09 crc kubenswrapper[4895]: I1202 08:56:09.361925 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f1ce383-de6c-4845-8aa1-d97f8057fd90-config-data" (OuterVolumeSpecName: "config-data") pod "1f1ce383-de6c-4845-8aa1-d97f8057fd90" (UID: "1f1ce383-de6c-4845-8aa1-d97f8057fd90"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 08:56:09 crc kubenswrapper[4895]: I1202 08:56:09.370282 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f1ce383-de6c-4845-8aa1-d97f8057fd90-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1f1ce383-de6c-4845-8aa1-d97f8057fd90" (UID: "1f1ce383-de6c-4845-8aa1-d97f8057fd90"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 08:56:09 crc kubenswrapper[4895]: I1202 08:56:09.437945 4895 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1f1ce383-de6c-4845-8aa1-d97f8057fd90-scripts\") on node \"crc\" DevicePath \"\""
Dec 02 08:56:09 crc kubenswrapper[4895]: I1202 08:56:09.437982 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f1ce383-de6c-4845-8aa1-d97f8057fd90-config-data\") on node \"crc\" DevicePath \"\""
Dec 02 08:56:09 crc kubenswrapper[4895]: I1202 08:56:09.437994 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f1ce383-de6c-4845-8aa1-d97f8057fd90-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 02 08:56:09 crc kubenswrapper[4895]: I1202 08:56:09.438010 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-grbkn\" (UniqueName: \"kubernetes.io/projected/1f1ce383-de6c-4845-8aa1-d97f8057fd90-kube-api-access-grbkn\") on node \"crc\" DevicePath \"\""
Dec 02 08:56:09 crc kubenswrapper[4895]: I1202 08:56:09.863876 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-h7zhh" event={"ID":"1f1ce383-de6c-4845-8aa1-d97f8057fd90","Type":"ContainerDied","Data":"20fe16baf78ee7f8c8dbe0bfd92d53ad4c2cc26cf34446ec22b1b65f73ea6cb5"}
Dec 02 08:56:09 crc kubenswrapper[4895]: I1202 08:56:09.863921 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="20fe16baf78ee7f8c8dbe0bfd92d53ad4c2cc26cf34446ec22b1b65f73ea6cb5"
Dec 02 08:56:09 crc kubenswrapper[4895]: I1202 08:56:09.863940 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-h7zhh"
Dec 02 08:56:09 crc kubenswrapper[4895]: I1202 08:56:09.940921 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-857448b6bd-q99p5"]
Dec 02 08:56:09 crc kubenswrapper[4895]: E1202 08:56:09.941580 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f1ce383-de6c-4845-8aa1-d97f8057fd90" containerName="placement-db-sync"
Dec 02 08:56:09 crc kubenswrapper[4895]: I1202 08:56:09.941622 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f1ce383-de6c-4845-8aa1-d97f8057fd90" containerName="placement-db-sync"
Dec 02 08:56:09 crc kubenswrapper[4895]: I1202 08:56:09.941978 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f1ce383-de6c-4845-8aa1-d97f8057fd90" containerName="placement-db-sync"
Dec 02 08:56:09 crc kubenswrapper[4895]: I1202 08:56:09.943763 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-857448b6bd-q99p5"
Dec 02 08:56:09 crc kubenswrapper[4895]: I1202 08:56:09.948375 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data"
Dec 02 08:56:09 crc kubenswrapper[4895]: I1202 08:56:09.948458 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts"
Dec 02 08:56:09 crc kubenswrapper[4895]: I1202 08:56:09.948605 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-5zkxs"
Dec 02 08:56:09 crc kubenswrapper[4895]: I1202 08:56:09.952201 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-857448b6bd-q99p5"]
Dec 02 08:56:10 crc kubenswrapper[4895]: I1202 08:56:10.049840 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-897xp\" (UniqueName: \"kubernetes.io/projected/d07803fb-bcf8-4411-9f7e-b2ca58361b51-kube-api-access-897xp\") pod \"placement-857448b6bd-q99p5\" (UID: \"d07803fb-bcf8-4411-9f7e-b2ca58361b51\") " pod="openstack/placement-857448b6bd-q99p5"
Dec 02 08:56:10 crc kubenswrapper[4895]: I1202 08:56:10.050025 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d07803fb-bcf8-4411-9f7e-b2ca58361b51-combined-ca-bundle\") pod \"placement-857448b6bd-q99p5\" (UID: \"d07803fb-bcf8-4411-9f7e-b2ca58361b51\") " pod="openstack/placement-857448b6bd-q99p5"
Dec 02 08:56:10 crc kubenswrapper[4895]: I1202 08:56:10.050122 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d07803fb-bcf8-4411-9f7e-b2ca58361b51-config-data\") pod \"placement-857448b6bd-q99p5\" (UID: \"d07803fb-bcf8-4411-9f7e-b2ca58361b51\") " pod="openstack/placement-857448b6bd-q99p5"
Dec 02 08:56:10 crc kubenswrapper[4895]: I1202 08:56:10.050177 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d07803fb-bcf8-4411-9f7e-b2ca58361b51-logs\") pod \"placement-857448b6bd-q99p5\" (UID: \"d07803fb-bcf8-4411-9f7e-b2ca58361b51\") " pod="openstack/placement-857448b6bd-q99p5"
Dec 02 08:56:10 crc kubenswrapper[4895]: I1202 08:56:10.050365 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d07803fb-bcf8-4411-9f7e-b2ca58361b51-scripts\") pod \"placement-857448b6bd-q99p5\" (UID: \"d07803fb-bcf8-4411-9f7e-b2ca58361b51\") " pod="openstack/placement-857448b6bd-q99p5"
Dec 02 08:56:10 crc kubenswrapper[4895]: I1202 08:56:10.152452 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-897xp\" (UniqueName: \"kubernetes.io/projected/d07803fb-bcf8-4411-9f7e-b2ca58361b51-kube-api-access-897xp\") pod \"placement-857448b6bd-q99p5\" (UID: \"d07803fb-bcf8-4411-9f7e-b2ca58361b51\") " pod="openstack/placement-857448b6bd-q99p5"
Dec 02 08:56:10 crc kubenswrapper[4895]: I1202 08:56:10.152532 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d07803fb-bcf8-4411-9f7e-b2ca58361b51-combined-ca-bundle\") pod \"placement-857448b6bd-q99p5\" (UID: \"d07803fb-bcf8-4411-9f7e-b2ca58361b51\") " pod="openstack/placement-857448b6bd-q99p5"
Dec 02 08:56:10 crc kubenswrapper[4895]: I1202 08:56:10.152574 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d07803fb-bcf8-4411-9f7e-b2ca58361b51-config-data\") pod \"placement-857448b6bd-q99p5\" (UID: \"d07803fb-bcf8-4411-9f7e-b2ca58361b51\") " pod="openstack/placement-857448b6bd-q99p5"
Dec 02 08:56:10 crc kubenswrapper[4895]: I1202 08:56:10.152605 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d07803fb-bcf8-4411-9f7e-b2ca58361b51-logs\") pod \"placement-857448b6bd-q99p5\" (UID: \"d07803fb-bcf8-4411-9f7e-b2ca58361b51\") " pod="openstack/placement-857448b6bd-q99p5"
Dec 02 08:56:10 crc kubenswrapper[4895]: I1202 08:56:10.152672 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d07803fb-bcf8-4411-9f7e-b2ca58361b51-scripts\") pod \"placement-857448b6bd-q99p5\" (UID: \"d07803fb-bcf8-4411-9f7e-b2ca58361b51\") " pod="openstack/placement-857448b6bd-q99p5"
Dec 02 08:56:10 crc kubenswrapper[4895]: I1202 08:56:10.153835 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d07803fb-bcf8-4411-9f7e-b2ca58361b51-logs\") pod \"placement-857448b6bd-q99p5\" (UID: \"d07803fb-bcf8-4411-9f7e-b2ca58361b51\") " pod="openstack/placement-857448b6bd-q99p5"
Dec 02 08:56:10 crc kubenswrapper[4895]: I1202 08:56:10.156968 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d07803fb-bcf8-4411-9f7e-b2ca58361b51-scripts\") pod \"placement-857448b6bd-q99p5\" (UID: \"d07803fb-bcf8-4411-9f7e-b2ca58361b51\") " pod="openstack/placement-857448b6bd-q99p5"
Dec 02 08:56:10 crc kubenswrapper[4895]: I1202 08:56:10.157189 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d07803fb-bcf8-4411-9f7e-b2ca58361b51-combined-ca-bundle\") pod \"placement-857448b6bd-q99p5\" (UID: \"d07803fb-bcf8-4411-9f7e-b2ca58361b51\") " pod="openstack/placement-857448b6bd-q99p5"
Dec 02 08:56:10 crc kubenswrapper[4895]: I1202 08:56:10.158775 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d07803fb-bcf8-4411-9f7e-b2ca58361b51-config-data\") pod \"placement-857448b6bd-q99p5\" (UID: \"d07803fb-bcf8-4411-9f7e-b2ca58361b51\") " pod="openstack/placement-857448b6bd-q99p5"
Dec 02 08:56:10 crc kubenswrapper[4895]: I1202 08:56:10.168775 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-897xp\" (UniqueName: \"kubernetes.io/projected/d07803fb-bcf8-4411-9f7e-b2ca58361b51-kube-api-access-897xp\") pod \"placement-857448b6bd-q99p5\" (UID: \"d07803fb-bcf8-4411-9f7e-b2ca58361b51\") " pod="openstack/placement-857448b6bd-q99p5"
Dec 02 08:56:10 crc kubenswrapper[4895]: I1202 08:56:10.267530 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-857448b6bd-q99p5"
Dec 02 08:56:10 crc kubenswrapper[4895]: I1202 08:56:10.747042 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-857448b6bd-q99p5"]
Dec 02 08:56:10 crc kubenswrapper[4895]: I1202 08:56:10.879210 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-857448b6bd-q99p5" event={"ID":"d07803fb-bcf8-4411-9f7e-b2ca58361b51","Type":"ContainerStarted","Data":"e12c5f1a3dd3b0d1d7f38247ab92b80b7cc269b2b8785d7f1d78cec7fee10caa"}
Dec 02 08:56:11 crc kubenswrapper[4895]: I1202 08:56:11.894355 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-857448b6bd-q99p5" event={"ID":"d07803fb-bcf8-4411-9f7e-b2ca58361b51","Type":"ContainerStarted","Data":"0d37e7a57cd7759a05d6f9f74e108d63c6ba378d3ff4c2eaee39a013e1047e97"}
Dec 02 08:56:11 crc kubenswrapper[4895]: I1202 08:56:11.894903 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-857448b6bd-q99p5" event={"ID":"d07803fb-bcf8-4411-9f7e-b2ca58361b51","Type":"ContainerStarted","Data":"6437cca795d1da6bc117b190ac32d436f5f818ce5d56f67b73e1f087f85af686"}
Dec 02 08:56:11 crc kubenswrapper[4895]: I1202 08:56:11.895186 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-857448b6bd-q99p5"
Dec 02 08:56:11 crc kubenswrapper[4895]: I1202 08:56:11.895242 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-857448b6bd-q99p5"
Dec 02 08:56:11 crc kubenswrapper[4895]: I1202 08:56:11.919134 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-857448b6bd-q99p5" podStartSLOduration=2.919115477 podStartE2EDuration="2.919115477s" podCreationTimestamp="2025-12-02 08:56:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:56:11.914024539 +0000 UTC m=+5583.084884162" watchObservedRunningTime="2025-12-02 08:56:11.919115477 +0000 UTC m=+5583.089975090"
Dec 02 08:56:15 crc kubenswrapper[4895]: I1202 08:56:15.388393 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-76dbf459b5-tdz8d"
Dec 02 08:56:15 crc kubenswrapper[4895]: I1202 08:56:15.454699 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785b8787c9-tnskb"]
Dec 02 08:56:15 crc kubenswrapper[4895]: I1202 08:56:15.455009 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-785b8787c9-tnskb" podUID="c20caf23-05bb-4108-aed7-9a676667d36c" containerName="dnsmasq-dns" containerID="cri-o://13afd0326088506fe5546c30e7ee9680958bf3d0b45c6f1aed060c39ebbc4e4a" gracePeriod=10
Dec 02 08:56:15 crc kubenswrapper[4895]: I1202 08:56:15.889857 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785b8787c9-tnskb"
Dec 02 08:56:15 crc kubenswrapper[4895]: I1202 08:56:15.936697 4895 generic.go:334] "Generic (PLEG): container finished" podID="c20caf23-05bb-4108-aed7-9a676667d36c" containerID="13afd0326088506fe5546c30e7ee9680958bf3d0b45c6f1aed060c39ebbc4e4a" exitCode=0
Dec 02 08:56:15 crc kubenswrapper[4895]: I1202 08:56:15.936790 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785b8787c9-tnskb" event={"ID":"c20caf23-05bb-4108-aed7-9a676667d36c","Type":"ContainerDied","Data":"13afd0326088506fe5546c30e7ee9680958bf3d0b45c6f1aed060c39ebbc4e4a"}
Dec 02 08:56:15 crc kubenswrapper[4895]: I1202 08:56:15.936826 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785b8787c9-tnskb" event={"ID":"c20caf23-05bb-4108-aed7-9a676667d36c","Type":"ContainerDied","Data":"69ae9baf5b61d075b05ca6e4ff6c947c238a5ef85b32611531e814bd16d6260a"}
Dec 02 08:56:15 crc kubenswrapper[4895]: I1202 08:56:15.936845 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785b8787c9-tnskb"
Dec 02 08:56:15 crc kubenswrapper[4895]: I1202 08:56:15.936853 4895 scope.go:117] "RemoveContainer" containerID="13afd0326088506fe5546c30e7ee9680958bf3d0b45c6f1aed060c39ebbc4e4a"
Dec 02 08:56:15 crc kubenswrapper[4895]: I1202 08:56:15.956564 4895 scope.go:117] "RemoveContainer" containerID="e73a7fd66a9efc1214b35f4a424e697a207170d3d4064befb540d27073d2cb7f"
Dec 02 08:56:15 crc kubenswrapper[4895]: I1202 08:56:15.979676 4895 scope.go:117] "RemoveContainer" containerID="13afd0326088506fe5546c30e7ee9680958bf3d0b45c6f1aed060c39ebbc4e4a"
Dec 02 08:56:15 crc kubenswrapper[4895]: E1202 08:56:15.980276 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13afd0326088506fe5546c30e7ee9680958bf3d0b45c6f1aed060c39ebbc4e4a\": container with ID starting with 13afd0326088506fe5546c30e7ee9680958bf3d0b45c6f1aed060c39ebbc4e4a not found: ID does not exist" containerID="13afd0326088506fe5546c30e7ee9680958bf3d0b45c6f1aed060c39ebbc4e4a"
Dec 02 08:56:15 crc kubenswrapper[4895]: I1202 08:56:15.980331 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13afd0326088506fe5546c30e7ee9680958bf3d0b45c6f1aed060c39ebbc4e4a"} err="failed to get container status \"13afd0326088506fe5546c30e7ee9680958bf3d0b45c6f1aed060c39ebbc4e4a\": rpc error: code = NotFound desc = could not find container \"13afd0326088506fe5546c30e7ee9680958bf3d0b45c6f1aed060c39ebbc4e4a\": container with ID starting with 13afd0326088506fe5546c30e7ee9680958bf3d0b45c6f1aed060c39ebbc4e4a not found: ID does not exist"
Dec 02 08:56:15 crc kubenswrapper[4895]: I1202 08:56:15.980365 4895 scope.go:117] "RemoveContainer" containerID="e73a7fd66a9efc1214b35f4a424e697a207170d3d4064befb540d27073d2cb7f"
Dec 02 08:56:15 crc kubenswrapper[4895]: E1202 08:56:15.980711 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e73a7fd66a9efc1214b35f4a424e697a207170d3d4064befb540d27073d2cb7f\": container with ID starting with e73a7fd66a9efc1214b35f4a424e697a207170d3d4064befb540d27073d2cb7f not found: ID does not exist" containerID="e73a7fd66a9efc1214b35f4a424e697a207170d3d4064befb540d27073d2cb7f"
Dec 02 08:56:15 crc kubenswrapper[4895]: I1202 08:56:15.980779 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e73a7fd66a9efc1214b35f4a424e697a207170d3d4064befb540d27073d2cb7f"} err="failed to get container status \"e73a7fd66a9efc1214b35f4a424e697a207170d3d4064befb540d27073d2cb7f\": rpc error: code = NotFound desc = could not find container \"e73a7fd66a9efc1214b35f4a424e697a207170d3d4064befb540d27073d2cb7f\": container with ID starting with e73a7fd66a9efc1214b35f4a424e697a207170d3d4064befb540d27073d2cb7f not found: ID does not exist"
Dec 02 08:56:16 crc kubenswrapper[4895]: I1202 08:56:16.069250 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c20caf23-05bb-4108-aed7-9a676667d36c-ovsdbserver-sb\") pod \"c20caf23-05bb-4108-aed7-9a676667d36c\" (UID: \"c20caf23-05bb-4108-aed7-9a676667d36c\") "
Dec 02 08:56:16 crc kubenswrapper[4895]: I1202 08:56:16.069327 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c20caf23-05bb-4108-aed7-9a676667d36c-config\") pod \"c20caf23-05bb-4108-aed7-9a676667d36c\" (UID: \"c20caf23-05bb-4108-aed7-9a676667d36c\") "
Dec 02 08:56:16 crc kubenswrapper[4895]: I1202 08:56:16.069396 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c20caf23-05bb-4108-aed7-9a676667d36c-ovsdbserver-nb\") pod \"c20caf23-05bb-4108-aed7-9a676667d36c\" (UID: \"c20caf23-05bb-4108-aed7-9a676667d36c\") "
Dec 02 08:56:16 crc kubenswrapper[4895]: I1202 08:56:16.069423 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c20caf23-05bb-4108-aed7-9a676667d36c-dns-svc\") pod \"c20caf23-05bb-4108-aed7-9a676667d36c\" (UID: \"c20caf23-05bb-4108-aed7-9a676667d36c\") "
Dec 02 08:56:16 crc kubenswrapper[4895]: I1202 08:56:16.069503 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gp656\" (UniqueName: \"kubernetes.io/projected/c20caf23-05bb-4108-aed7-9a676667d36c-kube-api-access-gp656\") pod \"c20caf23-05bb-4108-aed7-9a676667d36c\" (UID: \"c20caf23-05bb-4108-aed7-9a676667d36c\") "
Dec 02 08:56:16 crc kubenswrapper[4895]: I1202 08:56:16.075191 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c20caf23-05bb-4108-aed7-9a676667d36c-kube-api-access-gp656" (OuterVolumeSpecName: "kube-api-access-gp656") pod "c20caf23-05bb-4108-aed7-9a676667d36c" (UID: "c20caf23-05bb-4108-aed7-9a676667d36c"). InnerVolumeSpecName "kube-api-access-gp656". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 08:56:16 crc kubenswrapper[4895]: I1202 08:56:16.114126 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c20caf23-05bb-4108-aed7-9a676667d36c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c20caf23-05bb-4108-aed7-9a676667d36c" (UID: "c20caf23-05bb-4108-aed7-9a676667d36c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 08:56:16 crc kubenswrapper[4895]: I1202 08:56:16.118672 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c20caf23-05bb-4108-aed7-9a676667d36c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c20caf23-05bb-4108-aed7-9a676667d36c" (UID: "c20caf23-05bb-4108-aed7-9a676667d36c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 08:56:16 crc kubenswrapper[4895]: I1202 08:56:16.121589 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c20caf23-05bb-4108-aed7-9a676667d36c-config" (OuterVolumeSpecName: "config") pod "c20caf23-05bb-4108-aed7-9a676667d36c" (UID: "c20caf23-05bb-4108-aed7-9a676667d36c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 08:56:16 crc kubenswrapper[4895]: I1202 08:56:16.123116 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c20caf23-05bb-4108-aed7-9a676667d36c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c20caf23-05bb-4108-aed7-9a676667d36c" (UID: "c20caf23-05bb-4108-aed7-9a676667d36c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 08:56:16 crc kubenswrapper[4895]: I1202 08:56:16.171724 4895 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c20caf23-05bb-4108-aed7-9a676667d36c-config\") on node \"crc\" DevicePath \"\""
Dec 02 08:56:16 crc kubenswrapper[4895]: I1202 08:56:16.172056 4895 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c20caf23-05bb-4108-aed7-9a676667d36c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Dec 02 08:56:16 crc kubenswrapper[4895]: I1202 08:56:16.172068 4895 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c20caf23-05bb-4108-aed7-9a676667d36c-dns-svc\") on node \"crc\" DevicePath \"\""
Dec 02 08:56:16 crc kubenswrapper[4895]: I1202 08:56:16.172078 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gp656\" (UniqueName: \"kubernetes.io/projected/c20caf23-05bb-4108-aed7-9a676667d36c-kube-api-access-gp656\") on node \"crc\" DevicePath \"\""
Dec 02 08:56:16 crc 
kubenswrapper[4895]: I1202 08:56:16.172088 4895 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c20caf23-05bb-4108-aed7-9a676667d36c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 02 08:56:16 crc kubenswrapper[4895]: I1202 08:56:16.272823 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785b8787c9-tnskb"] Dec 02 08:56:16 crc kubenswrapper[4895]: I1202 08:56:16.281087 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-785b8787c9-tnskb"] Dec 02 08:56:17 crc kubenswrapper[4895]: I1202 08:56:17.152432 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c20caf23-05bb-4108-aed7-9a676667d36c" path="/var/lib/kubelet/pods/c20caf23-05bb-4108-aed7-9a676667d36c/volumes" Dec 02 08:56:29 crc kubenswrapper[4895]: I1202 08:56:29.694825 4895 scope.go:117] "RemoveContainer" containerID="60d04769a21fc7f67db7408c348add2dd04715b258ae02675f19d22715307928" Dec 02 08:56:29 crc kubenswrapper[4895]: I1202 08:56:29.722334 4895 scope.go:117] "RemoveContainer" containerID="681615d548d5595f7ec8bb18dd9cab1a7c2d65ef97f90121bfe2acdef2bf07b9" Dec 02 08:56:41 crc kubenswrapper[4895]: I1202 08:56:41.331484 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-857448b6bd-q99p5" Dec 02 08:56:41 crc kubenswrapper[4895]: I1202 08:56:41.332518 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-857448b6bd-q99p5" Dec 02 08:57:02 crc kubenswrapper[4895]: I1202 08:57:02.049984 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-rwhl6"] Dec 02 08:57:02 crc kubenswrapper[4895]: E1202 08:57:02.051244 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c20caf23-05bb-4108-aed7-9a676667d36c" containerName="dnsmasq-dns" Dec 02 08:57:02 crc kubenswrapper[4895]: I1202 08:57:02.051265 4895 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="c20caf23-05bb-4108-aed7-9a676667d36c" containerName="dnsmasq-dns" Dec 02 08:57:02 crc kubenswrapper[4895]: E1202 08:57:02.051285 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c20caf23-05bb-4108-aed7-9a676667d36c" containerName="init" Dec 02 08:57:02 crc kubenswrapper[4895]: I1202 08:57:02.051294 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="c20caf23-05bb-4108-aed7-9a676667d36c" containerName="init" Dec 02 08:57:02 crc kubenswrapper[4895]: I1202 08:57:02.051548 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="c20caf23-05bb-4108-aed7-9a676667d36c" containerName="dnsmasq-dns" Dec 02 08:57:02 crc kubenswrapper[4895]: I1202 08:57:02.052471 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-rwhl6" Dec 02 08:57:02 crc kubenswrapper[4895]: I1202 08:57:02.061515 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-rwhl6"] Dec 02 08:57:02 crc kubenswrapper[4895]: I1202 08:57:02.142859 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-86jd8"] Dec 02 08:57:02 crc kubenswrapper[4895]: I1202 08:57:02.145325 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-86jd8" Dec 02 08:57:02 crc kubenswrapper[4895]: I1202 08:57:02.153372 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-86jd8"] Dec 02 08:57:02 crc kubenswrapper[4895]: I1202 08:57:02.227368 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpxr9\" (UniqueName: \"kubernetes.io/projected/2c2079ee-4b91-4755-8f76-9a57e60b27ba-kube-api-access-vpxr9\") pod \"nova-api-db-create-rwhl6\" (UID: \"2c2079ee-4b91-4755-8f76-9a57e60b27ba\") " pod="openstack/nova-api-db-create-rwhl6" Dec 02 08:57:02 crc kubenswrapper[4895]: I1202 08:57:02.227475 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2c2079ee-4b91-4755-8f76-9a57e60b27ba-operator-scripts\") pod \"nova-api-db-create-rwhl6\" (UID: \"2c2079ee-4b91-4755-8f76-9a57e60b27ba\") " pod="openstack/nova-api-db-create-rwhl6" Dec 02 08:57:02 crc kubenswrapper[4895]: I1202 08:57:02.251547 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-p5cfg"] Dec 02 08:57:02 crc kubenswrapper[4895]: I1202 08:57:02.252896 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-p5cfg" Dec 02 08:57:02 crc kubenswrapper[4895]: I1202 08:57:02.268957 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-b117-account-create-update-t7r7w"] Dec 02 08:57:02 crc kubenswrapper[4895]: I1202 08:57:02.270422 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-b117-account-create-update-t7r7w" Dec 02 08:57:02 crc kubenswrapper[4895]: I1202 08:57:02.277105 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-p5cfg"] Dec 02 08:57:02 crc kubenswrapper[4895]: I1202 08:57:02.277887 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Dec 02 08:57:02 crc kubenswrapper[4895]: I1202 08:57:02.294830 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-b117-account-create-update-t7r7w"] Dec 02 08:57:02 crc kubenswrapper[4895]: I1202 08:57:02.329515 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vpxr9\" (UniqueName: \"kubernetes.io/projected/2c2079ee-4b91-4755-8f76-9a57e60b27ba-kube-api-access-vpxr9\") pod \"nova-api-db-create-rwhl6\" (UID: \"2c2079ee-4b91-4755-8f76-9a57e60b27ba\") " pod="openstack/nova-api-db-create-rwhl6" Dec 02 08:57:02 crc kubenswrapper[4895]: I1202 08:57:02.330078 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/53745490-f6e4-4f78-964b-5a52444211b8-operator-scripts\") pod \"nova-cell0-db-create-86jd8\" (UID: \"53745490-f6e4-4f78-964b-5a52444211b8\") " pod="openstack/nova-cell0-db-create-86jd8" Dec 02 08:57:02 crc kubenswrapper[4895]: I1202 08:57:02.330207 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqfnm\" (UniqueName: \"kubernetes.io/projected/53745490-f6e4-4f78-964b-5a52444211b8-kube-api-access-zqfnm\") pod \"nova-cell0-db-create-86jd8\" (UID: \"53745490-f6e4-4f78-964b-5a52444211b8\") " pod="openstack/nova-cell0-db-create-86jd8" Dec 02 08:57:02 crc kubenswrapper[4895]: I1202 08:57:02.330401 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/2c2079ee-4b91-4755-8f76-9a57e60b27ba-operator-scripts\") pod \"nova-api-db-create-rwhl6\" (UID: \"2c2079ee-4b91-4755-8f76-9a57e60b27ba\") " pod="openstack/nova-api-db-create-rwhl6" Dec 02 08:57:02 crc kubenswrapper[4895]: I1202 08:57:02.331989 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2c2079ee-4b91-4755-8f76-9a57e60b27ba-operator-scripts\") pod \"nova-api-db-create-rwhl6\" (UID: \"2c2079ee-4b91-4755-8f76-9a57e60b27ba\") " pod="openstack/nova-api-db-create-rwhl6" Dec 02 08:57:02 crc kubenswrapper[4895]: I1202 08:57:02.374588 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpxr9\" (UniqueName: \"kubernetes.io/projected/2c2079ee-4b91-4755-8f76-9a57e60b27ba-kube-api-access-vpxr9\") pod \"nova-api-db-create-rwhl6\" (UID: \"2c2079ee-4b91-4755-8f76-9a57e60b27ba\") " pod="openstack/nova-api-db-create-rwhl6" Dec 02 08:57:02 crc kubenswrapper[4895]: I1202 08:57:02.392547 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-rwhl6" Dec 02 08:57:02 crc kubenswrapper[4895]: I1202 08:57:02.435643 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l444q\" (UniqueName: \"kubernetes.io/projected/febef79d-8c1e-4f62-b362-268f7d459291-kube-api-access-l444q\") pod \"nova-api-b117-account-create-update-t7r7w\" (UID: \"febef79d-8c1e-4f62-b362-268f7d459291\") " pod="openstack/nova-api-b117-account-create-update-t7r7w" Dec 02 08:57:02 crc kubenswrapper[4895]: I1202 08:57:02.435708 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/53b6ca0c-81ea-4711-bc3c-d9a7a205543b-operator-scripts\") pod \"nova-cell1-db-create-p5cfg\" (UID: \"53b6ca0c-81ea-4711-bc3c-d9a7a205543b\") " pod="openstack/nova-cell1-db-create-p5cfg" Dec 02 08:57:02 crc kubenswrapper[4895]: I1202 08:57:02.435959 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/febef79d-8c1e-4f62-b362-268f7d459291-operator-scripts\") pod \"nova-api-b117-account-create-update-t7r7w\" (UID: \"febef79d-8c1e-4f62-b362-268f7d459291\") " pod="openstack/nova-api-b117-account-create-update-t7r7w" Dec 02 08:57:02 crc kubenswrapper[4895]: I1202 08:57:02.436226 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/53745490-f6e4-4f78-964b-5a52444211b8-operator-scripts\") pod \"nova-cell0-db-create-86jd8\" (UID: \"53745490-f6e4-4f78-964b-5a52444211b8\") " pod="openstack/nova-cell0-db-create-86jd8" Dec 02 08:57:02 crc kubenswrapper[4895]: I1202 08:57:02.436417 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqfnm\" (UniqueName: 
\"kubernetes.io/projected/53745490-f6e4-4f78-964b-5a52444211b8-kube-api-access-zqfnm\") pod \"nova-cell0-db-create-86jd8\" (UID: \"53745490-f6e4-4f78-964b-5a52444211b8\") " pod="openstack/nova-cell0-db-create-86jd8" Dec 02 08:57:02 crc kubenswrapper[4895]: I1202 08:57:02.436456 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqw5z\" (UniqueName: \"kubernetes.io/projected/53b6ca0c-81ea-4711-bc3c-d9a7a205543b-kube-api-access-cqw5z\") pod \"nova-cell1-db-create-p5cfg\" (UID: \"53b6ca0c-81ea-4711-bc3c-d9a7a205543b\") " pod="openstack/nova-cell1-db-create-p5cfg" Dec 02 08:57:02 crc kubenswrapper[4895]: I1202 08:57:02.437346 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/53745490-f6e4-4f78-964b-5a52444211b8-operator-scripts\") pod \"nova-cell0-db-create-86jd8\" (UID: \"53745490-f6e4-4f78-964b-5a52444211b8\") " pod="openstack/nova-cell0-db-create-86jd8" Dec 02 08:57:02 crc kubenswrapper[4895]: I1202 08:57:02.466360 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqfnm\" (UniqueName: \"kubernetes.io/projected/53745490-f6e4-4f78-964b-5a52444211b8-kube-api-access-zqfnm\") pod \"nova-cell0-db-create-86jd8\" (UID: \"53745490-f6e4-4f78-964b-5a52444211b8\") " pod="openstack/nova-cell0-db-create-86jd8" Dec 02 08:57:02 crc kubenswrapper[4895]: I1202 08:57:02.470003 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-86jd8" Dec 02 08:57:02 crc kubenswrapper[4895]: I1202 08:57:02.491422 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-ac5f-account-create-update-8qtm2"] Dec 02 08:57:02 crc kubenswrapper[4895]: I1202 08:57:02.492854 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-ac5f-account-create-update-8qtm2" Dec 02 08:57:02 crc kubenswrapper[4895]: I1202 08:57:02.495190 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Dec 02 08:57:02 crc kubenswrapper[4895]: I1202 08:57:02.505707 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-ac5f-account-create-update-8qtm2"] Dec 02 08:57:02 crc kubenswrapper[4895]: I1202 08:57:02.539288 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/febef79d-8c1e-4f62-b362-268f7d459291-operator-scripts\") pod \"nova-api-b117-account-create-update-t7r7w\" (UID: \"febef79d-8c1e-4f62-b362-268f7d459291\") " pod="openstack/nova-api-b117-account-create-update-t7r7w" Dec 02 08:57:02 crc kubenswrapper[4895]: I1202 08:57:02.539653 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqw5z\" (UniqueName: \"kubernetes.io/projected/53b6ca0c-81ea-4711-bc3c-d9a7a205543b-kube-api-access-cqw5z\") pod \"nova-cell1-db-create-p5cfg\" (UID: \"53b6ca0c-81ea-4711-bc3c-d9a7a205543b\") " pod="openstack/nova-cell1-db-create-p5cfg" Dec 02 08:57:02 crc kubenswrapper[4895]: I1202 08:57:02.539699 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l444q\" (UniqueName: \"kubernetes.io/projected/febef79d-8c1e-4f62-b362-268f7d459291-kube-api-access-l444q\") pod \"nova-api-b117-account-create-update-t7r7w\" (UID: \"febef79d-8c1e-4f62-b362-268f7d459291\") " pod="openstack/nova-api-b117-account-create-update-t7r7w" Dec 02 08:57:02 crc kubenswrapper[4895]: I1202 08:57:02.539718 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/53b6ca0c-81ea-4711-bc3c-d9a7a205543b-operator-scripts\") pod \"nova-cell1-db-create-p5cfg\" (UID: 
\"53b6ca0c-81ea-4711-bc3c-d9a7a205543b\") " pod="openstack/nova-cell1-db-create-p5cfg" Dec 02 08:57:02 crc kubenswrapper[4895]: I1202 08:57:02.540630 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/53b6ca0c-81ea-4711-bc3c-d9a7a205543b-operator-scripts\") pod \"nova-cell1-db-create-p5cfg\" (UID: \"53b6ca0c-81ea-4711-bc3c-d9a7a205543b\") " pod="openstack/nova-cell1-db-create-p5cfg" Dec 02 08:57:02 crc kubenswrapper[4895]: I1202 08:57:02.541167 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/febef79d-8c1e-4f62-b362-268f7d459291-operator-scripts\") pod \"nova-api-b117-account-create-update-t7r7w\" (UID: \"febef79d-8c1e-4f62-b362-268f7d459291\") " pod="openstack/nova-api-b117-account-create-update-t7r7w" Dec 02 08:57:02 crc kubenswrapper[4895]: I1202 08:57:02.599382 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l444q\" (UniqueName: \"kubernetes.io/projected/febef79d-8c1e-4f62-b362-268f7d459291-kube-api-access-l444q\") pod \"nova-api-b117-account-create-update-t7r7w\" (UID: \"febef79d-8c1e-4f62-b362-268f7d459291\") " pod="openstack/nova-api-b117-account-create-update-t7r7w" Dec 02 08:57:02 crc kubenswrapper[4895]: I1202 08:57:02.604721 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-b117-account-create-update-t7r7w" Dec 02 08:57:02 crc kubenswrapper[4895]: I1202 08:57:02.606534 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqw5z\" (UniqueName: \"kubernetes.io/projected/53b6ca0c-81ea-4711-bc3c-d9a7a205543b-kube-api-access-cqw5z\") pod \"nova-cell1-db-create-p5cfg\" (UID: \"53b6ca0c-81ea-4711-bc3c-d9a7a205543b\") " pod="openstack/nova-cell1-db-create-p5cfg" Dec 02 08:57:02 crc kubenswrapper[4895]: I1202 08:57:02.645592 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tl4w\" (UniqueName: \"kubernetes.io/projected/b8c143fa-5ab3-4e36-9da4-69095eedf045-kube-api-access-6tl4w\") pod \"nova-cell0-ac5f-account-create-update-8qtm2\" (UID: \"b8c143fa-5ab3-4e36-9da4-69095eedf045\") " pod="openstack/nova-cell0-ac5f-account-create-update-8qtm2" Dec 02 08:57:02 crc kubenswrapper[4895]: I1202 08:57:02.645693 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b8c143fa-5ab3-4e36-9da4-69095eedf045-operator-scripts\") pod \"nova-cell0-ac5f-account-create-update-8qtm2\" (UID: \"b8c143fa-5ab3-4e36-9da4-69095eedf045\") " pod="openstack/nova-cell0-ac5f-account-create-update-8qtm2" Dec 02 08:57:02 crc kubenswrapper[4895]: I1202 08:57:02.688185 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-55cd-account-create-update-vbfqv"] Dec 02 08:57:02 crc kubenswrapper[4895]: I1202 08:57:02.689963 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-55cd-account-create-update-vbfqv" Dec 02 08:57:02 crc kubenswrapper[4895]: I1202 08:57:02.693623 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Dec 02 08:57:02 crc kubenswrapper[4895]: I1202 08:57:02.697823 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-55cd-account-create-update-vbfqv"] Dec 02 08:57:02 crc kubenswrapper[4895]: I1202 08:57:02.748936 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6tl4w\" (UniqueName: \"kubernetes.io/projected/b8c143fa-5ab3-4e36-9da4-69095eedf045-kube-api-access-6tl4w\") pod \"nova-cell0-ac5f-account-create-update-8qtm2\" (UID: \"b8c143fa-5ab3-4e36-9da4-69095eedf045\") " pod="openstack/nova-cell0-ac5f-account-create-update-8qtm2" Dec 02 08:57:02 crc kubenswrapper[4895]: I1202 08:57:02.749043 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qcgm8\" (UniqueName: \"kubernetes.io/projected/3f04db4c-ba44-4d58-8471-7ad3abfc0eaf-kube-api-access-qcgm8\") pod \"nova-cell1-55cd-account-create-update-vbfqv\" (UID: \"3f04db4c-ba44-4d58-8471-7ad3abfc0eaf\") " pod="openstack/nova-cell1-55cd-account-create-update-vbfqv" Dec 02 08:57:02 crc kubenswrapper[4895]: I1202 08:57:02.749087 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b8c143fa-5ab3-4e36-9da4-69095eedf045-operator-scripts\") pod \"nova-cell0-ac5f-account-create-update-8qtm2\" (UID: \"b8c143fa-5ab3-4e36-9da4-69095eedf045\") " pod="openstack/nova-cell0-ac5f-account-create-update-8qtm2" Dec 02 08:57:02 crc kubenswrapper[4895]: I1202 08:57:02.749168 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/3f04db4c-ba44-4d58-8471-7ad3abfc0eaf-operator-scripts\") pod \"nova-cell1-55cd-account-create-update-vbfqv\" (UID: \"3f04db4c-ba44-4d58-8471-7ad3abfc0eaf\") " pod="openstack/nova-cell1-55cd-account-create-update-vbfqv" Dec 02 08:57:02 crc kubenswrapper[4895]: I1202 08:57:02.752488 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b8c143fa-5ab3-4e36-9da4-69095eedf045-operator-scripts\") pod \"nova-cell0-ac5f-account-create-update-8qtm2\" (UID: \"b8c143fa-5ab3-4e36-9da4-69095eedf045\") " pod="openstack/nova-cell0-ac5f-account-create-update-8qtm2" Dec 02 08:57:02 crc kubenswrapper[4895]: I1202 08:57:02.767933 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tl4w\" (UniqueName: \"kubernetes.io/projected/b8c143fa-5ab3-4e36-9da4-69095eedf045-kube-api-access-6tl4w\") pod \"nova-cell0-ac5f-account-create-update-8qtm2\" (UID: \"b8c143fa-5ab3-4e36-9da4-69095eedf045\") " pod="openstack/nova-cell0-ac5f-account-create-update-8qtm2" Dec 02 08:57:02 crc kubenswrapper[4895]: I1202 08:57:02.851230 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3f04db4c-ba44-4d58-8471-7ad3abfc0eaf-operator-scripts\") pod \"nova-cell1-55cd-account-create-update-vbfqv\" (UID: \"3f04db4c-ba44-4d58-8471-7ad3abfc0eaf\") " pod="openstack/nova-cell1-55cd-account-create-update-vbfqv" Dec 02 08:57:02 crc kubenswrapper[4895]: I1202 08:57:02.851368 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qcgm8\" (UniqueName: \"kubernetes.io/projected/3f04db4c-ba44-4d58-8471-7ad3abfc0eaf-kube-api-access-qcgm8\") pod \"nova-cell1-55cd-account-create-update-vbfqv\" (UID: \"3f04db4c-ba44-4d58-8471-7ad3abfc0eaf\") " pod="openstack/nova-cell1-55cd-account-create-update-vbfqv" Dec 02 08:57:02 crc kubenswrapper[4895]: I1202 
08:57:02.852358 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3f04db4c-ba44-4d58-8471-7ad3abfc0eaf-operator-scripts\") pod \"nova-cell1-55cd-account-create-update-vbfqv\" (UID: \"3f04db4c-ba44-4d58-8471-7ad3abfc0eaf\") " pod="openstack/nova-cell1-55cd-account-create-update-vbfqv" Dec 02 08:57:02 crc kubenswrapper[4895]: I1202 08:57:02.869329 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-p5cfg" Dec 02 08:57:02 crc kubenswrapper[4895]: I1202 08:57:02.876245 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qcgm8\" (UniqueName: \"kubernetes.io/projected/3f04db4c-ba44-4d58-8471-7ad3abfc0eaf-kube-api-access-qcgm8\") pod \"nova-cell1-55cd-account-create-update-vbfqv\" (UID: \"3f04db4c-ba44-4d58-8471-7ad3abfc0eaf\") " pod="openstack/nova-cell1-55cd-account-create-update-vbfqv" Dec 02 08:57:02 crc kubenswrapper[4895]: I1202 08:57:02.967379 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-ac5f-account-create-update-8qtm2" Dec 02 08:57:02 crc kubenswrapper[4895]: I1202 08:57:02.978709 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-rwhl6"] Dec 02 08:57:03 crc kubenswrapper[4895]: I1202 08:57:03.026208 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-55cd-account-create-update-vbfqv" Dec 02 08:57:03 crc kubenswrapper[4895]: I1202 08:57:03.157285 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-86jd8"] Dec 02 08:57:03 crc kubenswrapper[4895]: I1202 08:57:03.216752 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-b117-account-create-update-t7r7w"] Dec 02 08:57:03 crc kubenswrapper[4895]: W1202 08:57:03.221114 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfebef79d_8c1e_4f62_b362_268f7d459291.slice/crio-3f147b036ccdb8e8e8122c5b06b88628a4b6f8458a1c5acaba5caa4dd72b78bb WatchSource:0}: Error finding container 3f147b036ccdb8e8e8122c5b06b88628a4b6f8458a1c5acaba5caa4dd72b78bb: Status 404 returned error can't find the container with id 3f147b036ccdb8e8e8122c5b06b88628a4b6f8458a1c5acaba5caa4dd72b78bb Dec 02 08:57:03 crc kubenswrapper[4895]: I1202 08:57:03.420529 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-86jd8" event={"ID":"53745490-f6e4-4f78-964b-5a52444211b8","Type":"ContainerStarted","Data":"b4d87c37a7d9389373be96d04762a5f793229346c643cbec811a5ba2a2a9ac6f"} Dec 02 08:57:03 crc kubenswrapper[4895]: I1202 08:57:03.422344 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-b117-account-create-update-t7r7w" event={"ID":"febef79d-8c1e-4f62-b362-268f7d459291","Type":"ContainerStarted","Data":"3f147b036ccdb8e8e8122c5b06b88628a4b6f8458a1c5acaba5caa4dd72b78bb"} Dec 02 08:57:03 crc kubenswrapper[4895]: I1202 08:57:03.424073 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-rwhl6" event={"ID":"2c2079ee-4b91-4755-8f76-9a57e60b27ba","Type":"ContainerStarted","Data":"01778b44a5637b1b452f9e8d62db32b8f175d3bcdab0f73044cb4e91880d7d5a"} Dec 02 08:57:03 crc kubenswrapper[4895]: I1202 08:57:03.424105 4895 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-rwhl6" event={"ID":"2c2079ee-4b91-4755-8f76-9a57e60b27ba","Type":"ContainerStarted","Data":"e70d722e85bea33f6fd2b74b645057509dcd3b2e023eac895876359a6a3e7263"} Dec 02 08:57:03 crc kubenswrapper[4895]: I1202 08:57:03.450288 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-db-create-rwhl6" podStartSLOduration=1.450267121 podStartE2EDuration="1.450267121s" podCreationTimestamp="2025-12-02 08:57:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:57:03.447134463 +0000 UTC m=+5634.617994076" watchObservedRunningTime="2025-12-02 08:57:03.450267121 +0000 UTC m=+5634.621126734" Dec 02 08:57:03 crc kubenswrapper[4895]: I1202 08:57:03.468784 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-p5cfg"] Dec 02 08:57:03 crc kubenswrapper[4895]: W1202 08:57:03.468915 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod53b6ca0c_81ea_4711_bc3c_d9a7a205543b.slice/crio-dcb975720fb7e25ec8ab1047ed7f7c008a258badac1166ba2144d21e06c44cc2 WatchSource:0}: Error finding container dcb975720fb7e25ec8ab1047ed7f7c008a258badac1166ba2144d21e06c44cc2: Status 404 returned error can't find the container with id dcb975720fb7e25ec8ab1047ed7f7c008a258badac1166ba2144d21e06c44cc2 Dec 02 08:57:03 crc kubenswrapper[4895]: I1202 08:57:03.602306 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-ac5f-account-create-update-8qtm2"] Dec 02 08:57:03 crc kubenswrapper[4895]: I1202 08:57:03.773148 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-55cd-account-create-update-vbfqv"] Dec 02 08:57:03 crc kubenswrapper[4895]: W1202 08:57:03.791273 4895 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3f04db4c_ba44_4d58_8471_7ad3abfc0eaf.slice/crio-5a71afba495b9657e08eef72ea39d8c6cd961fb1ec1ce471dc36ae94af3001d0 WatchSource:0}: Error finding container 5a71afba495b9657e08eef72ea39d8c6cd961fb1ec1ce471dc36ae94af3001d0: Status 404 returned error can't find the container with id 5a71afba495b9657e08eef72ea39d8c6cd961fb1ec1ce471dc36ae94af3001d0 Dec 02 08:57:04 crc kubenswrapper[4895]: I1202 08:57:04.434312 4895 generic.go:334] "Generic (PLEG): container finished" podID="53745490-f6e4-4f78-964b-5a52444211b8" containerID="5846aea8535c4c81f42aa817588d7c069916674c6ec58d53f95ba7e50e1afd69" exitCode=0 Dec 02 08:57:04 crc kubenswrapper[4895]: I1202 08:57:04.434411 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-86jd8" event={"ID":"53745490-f6e4-4f78-964b-5a52444211b8","Type":"ContainerDied","Data":"5846aea8535c4c81f42aa817588d7c069916674c6ec58d53f95ba7e50e1afd69"} Dec 02 08:57:04 crc kubenswrapper[4895]: I1202 08:57:04.435788 4895 generic.go:334] "Generic (PLEG): container finished" podID="febef79d-8c1e-4f62-b362-268f7d459291" containerID="474ca1e736651704b9b7179fb0341b9d7cea3973f5d792d97a2014d345e0e6ce" exitCode=0 Dec 02 08:57:04 crc kubenswrapper[4895]: I1202 08:57:04.435862 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-b117-account-create-update-t7r7w" event={"ID":"febef79d-8c1e-4f62-b362-268f7d459291","Type":"ContainerDied","Data":"474ca1e736651704b9b7179fb0341b9d7cea3973f5d792d97a2014d345e0e6ce"} Dec 02 08:57:04 crc kubenswrapper[4895]: I1202 08:57:04.437577 4895 generic.go:334] "Generic (PLEG): container finished" podID="3f04db4c-ba44-4d58-8471-7ad3abfc0eaf" containerID="33933149fc4e814612d9e8a55824e49dbc990abd5ad9a7dddba28905e0caf926" exitCode=0 Dec 02 08:57:04 crc kubenswrapper[4895]: I1202 08:57:04.437631 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-55cd-account-create-update-vbfqv" 
event={"ID":"3f04db4c-ba44-4d58-8471-7ad3abfc0eaf","Type":"ContainerDied","Data":"33933149fc4e814612d9e8a55824e49dbc990abd5ad9a7dddba28905e0caf926"} Dec 02 08:57:04 crc kubenswrapper[4895]: I1202 08:57:04.437654 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-55cd-account-create-update-vbfqv" event={"ID":"3f04db4c-ba44-4d58-8471-7ad3abfc0eaf","Type":"ContainerStarted","Data":"5a71afba495b9657e08eef72ea39d8c6cd961fb1ec1ce471dc36ae94af3001d0"} Dec 02 08:57:04 crc kubenswrapper[4895]: I1202 08:57:04.439837 4895 generic.go:334] "Generic (PLEG): container finished" podID="2c2079ee-4b91-4755-8f76-9a57e60b27ba" containerID="01778b44a5637b1b452f9e8d62db32b8f175d3bcdab0f73044cb4e91880d7d5a" exitCode=0 Dec 02 08:57:04 crc kubenswrapper[4895]: I1202 08:57:04.439882 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-rwhl6" event={"ID":"2c2079ee-4b91-4755-8f76-9a57e60b27ba","Type":"ContainerDied","Data":"01778b44a5637b1b452f9e8d62db32b8f175d3bcdab0f73044cb4e91880d7d5a"} Dec 02 08:57:04 crc kubenswrapper[4895]: I1202 08:57:04.441176 4895 generic.go:334] "Generic (PLEG): container finished" podID="53b6ca0c-81ea-4711-bc3c-d9a7a205543b" containerID="7c7141925d7ca869b099b95d2dc6329e23ecbb6e4665edad03d84b68ed82c03a" exitCode=0 Dec 02 08:57:04 crc kubenswrapper[4895]: I1202 08:57:04.441217 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-p5cfg" event={"ID":"53b6ca0c-81ea-4711-bc3c-d9a7a205543b","Type":"ContainerDied","Data":"7c7141925d7ca869b099b95d2dc6329e23ecbb6e4665edad03d84b68ed82c03a"} Dec 02 08:57:04 crc kubenswrapper[4895]: I1202 08:57:04.441234 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-p5cfg" event={"ID":"53b6ca0c-81ea-4711-bc3c-d9a7a205543b","Type":"ContainerStarted","Data":"dcb975720fb7e25ec8ab1047ed7f7c008a258badac1166ba2144d21e06c44cc2"} Dec 02 08:57:04 crc kubenswrapper[4895]: I1202 08:57:04.442416 4895 
generic.go:334] "Generic (PLEG): container finished" podID="b8c143fa-5ab3-4e36-9da4-69095eedf045" containerID="37415d328d83cda2a36bd73d9ccf0307bda7503b24038509c10dc0bbd1d99c07" exitCode=0 Dec 02 08:57:04 crc kubenswrapper[4895]: I1202 08:57:04.442453 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-ac5f-account-create-update-8qtm2" event={"ID":"b8c143fa-5ab3-4e36-9da4-69095eedf045","Type":"ContainerDied","Data":"37415d328d83cda2a36bd73d9ccf0307bda7503b24038509c10dc0bbd1d99c07"} Dec 02 08:57:04 crc kubenswrapper[4895]: I1202 08:57:04.442473 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-ac5f-account-create-update-8qtm2" event={"ID":"b8c143fa-5ab3-4e36-9da4-69095eedf045","Type":"ContainerStarted","Data":"503faf2b7cd71073c2342ae862b57e75453450d4345421096774fcc2a3f47a08"} Dec 02 08:57:05 crc kubenswrapper[4895]: I1202 08:57:05.474916 4895 patch_prober.go:28] interesting pod/machine-config-daemon-wfcg7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 08:57:05 crc kubenswrapper[4895]: I1202 08:57:05.475466 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 08:57:05 crc kubenswrapper[4895]: I1202 08:57:05.836361 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-p5cfg" Dec 02 08:57:06 crc kubenswrapper[4895]: I1202 08:57:06.000987 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-rwhl6" Dec 02 08:57:06 crc kubenswrapper[4895]: I1202 08:57:06.013976 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-ac5f-account-create-update-8qtm2" Dec 02 08:57:06 crc kubenswrapper[4895]: I1202 08:57:06.020326 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-b117-account-create-update-t7r7w" Dec 02 08:57:06 crc kubenswrapper[4895]: I1202 08:57:06.032025 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2c2079ee-4b91-4755-8f76-9a57e60b27ba-operator-scripts\") pod \"2c2079ee-4b91-4755-8f76-9a57e60b27ba\" (UID: \"2c2079ee-4b91-4755-8f76-9a57e60b27ba\") " Dec 02 08:57:06 crc kubenswrapper[4895]: I1202 08:57:06.032091 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/53b6ca0c-81ea-4711-bc3c-d9a7a205543b-operator-scripts\") pod \"53b6ca0c-81ea-4711-bc3c-d9a7a205543b\" (UID: \"53b6ca0c-81ea-4711-bc3c-d9a7a205543b\") " Dec 02 08:57:06 crc kubenswrapper[4895]: I1202 08:57:06.032203 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cqw5z\" (UniqueName: \"kubernetes.io/projected/53b6ca0c-81ea-4711-bc3c-d9a7a205543b-kube-api-access-cqw5z\") pod \"53b6ca0c-81ea-4711-bc3c-d9a7a205543b\" (UID: \"53b6ca0c-81ea-4711-bc3c-d9a7a205543b\") " Dec 02 08:57:06 crc kubenswrapper[4895]: I1202 08:57:06.032282 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6tl4w\" (UniqueName: \"kubernetes.io/projected/b8c143fa-5ab3-4e36-9da4-69095eedf045-kube-api-access-6tl4w\") pod \"b8c143fa-5ab3-4e36-9da4-69095eedf045\" (UID: \"b8c143fa-5ab3-4e36-9da4-69095eedf045\") " Dec 02 08:57:06 crc kubenswrapper[4895]: I1202 08:57:06.032323 4895 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b8c143fa-5ab3-4e36-9da4-69095eedf045-operator-scripts\") pod \"b8c143fa-5ab3-4e36-9da4-69095eedf045\" (UID: \"b8c143fa-5ab3-4e36-9da4-69095eedf045\") " Dec 02 08:57:06 crc kubenswrapper[4895]: I1202 08:57:06.032369 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l444q\" (UniqueName: \"kubernetes.io/projected/febef79d-8c1e-4f62-b362-268f7d459291-kube-api-access-l444q\") pod \"febef79d-8c1e-4f62-b362-268f7d459291\" (UID: \"febef79d-8c1e-4f62-b362-268f7d459291\") " Dec 02 08:57:06 crc kubenswrapper[4895]: I1202 08:57:06.032536 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/febef79d-8c1e-4f62-b362-268f7d459291-operator-scripts\") pod \"febef79d-8c1e-4f62-b362-268f7d459291\" (UID: \"febef79d-8c1e-4f62-b362-268f7d459291\") " Dec 02 08:57:06 crc kubenswrapper[4895]: I1202 08:57:06.032781 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vpxr9\" (UniqueName: \"kubernetes.io/projected/2c2079ee-4b91-4755-8f76-9a57e60b27ba-kube-api-access-vpxr9\") pod \"2c2079ee-4b91-4755-8f76-9a57e60b27ba\" (UID: \"2c2079ee-4b91-4755-8f76-9a57e60b27ba\") " Dec 02 08:57:06 crc kubenswrapper[4895]: I1202 08:57:06.034358 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53b6ca0c-81ea-4711-bc3c-d9a7a205543b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "53b6ca0c-81ea-4711-bc3c-d9a7a205543b" (UID: "53b6ca0c-81ea-4711-bc3c-d9a7a205543b"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:57:06 crc kubenswrapper[4895]: I1202 08:57:06.034725 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c2079ee-4b91-4755-8f76-9a57e60b27ba-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2c2079ee-4b91-4755-8f76-9a57e60b27ba" (UID: "2c2079ee-4b91-4755-8f76-9a57e60b27ba"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:57:06 crc kubenswrapper[4895]: I1202 08:57:06.035163 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8c143fa-5ab3-4e36-9da4-69095eedf045-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b8c143fa-5ab3-4e36-9da4-69095eedf045" (UID: "b8c143fa-5ab3-4e36-9da4-69095eedf045"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:57:06 crc kubenswrapper[4895]: I1202 08:57:06.035857 4895 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2c2079ee-4b91-4755-8f76-9a57e60b27ba-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 08:57:06 crc kubenswrapper[4895]: I1202 08:57:06.035877 4895 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/53b6ca0c-81ea-4711-bc3c-d9a7a205543b-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 08:57:06 crc kubenswrapper[4895]: I1202 08:57:06.035889 4895 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b8c143fa-5ab3-4e36-9da4-69095eedf045-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 08:57:06 crc kubenswrapper[4895]: I1202 08:57:06.036403 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/febef79d-8c1e-4f62-b362-268f7d459291-operator-scripts" 
(OuterVolumeSpecName: "operator-scripts") pod "febef79d-8c1e-4f62-b362-268f7d459291" (UID: "febef79d-8c1e-4f62-b362-268f7d459291"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:57:06 crc kubenswrapper[4895]: I1202 08:57:06.040836 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53b6ca0c-81ea-4711-bc3c-d9a7a205543b-kube-api-access-cqw5z" (OuterVolumeSpecName: "kube-api-access-cqw5z") pod "53b6ca0c-81ea-4711-bc3c-d9a7a205543b" (UID: "53b6ca0c-81ea-4711-bc3c-d9a7a205543b"). InnerVolumeSpecName "kube-api-access-cqw5z". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:57:06 crc kubenswrapper[4895]: I1202 08:57:06.040980 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/febef79d-8c1e-4f62-b362-268f7d459291-kube-api-access-l444q" (OuterVolumeSpecName: "kube-api-access-l444q") pod "febef79d-8c1e-4f62-b362-268f7d459291" (UID: "febef79d-8c1e-4f62-b362-268f7d459291"). InnerVolumeSpecName "kube-api-access-l444q". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:57:06 crc kubenswrapper[4895]: I1202 08:57:06.042370 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-55cd-account-create-update-vbfqv" Dec 02 08:57:06 crc kubenswrapper[4895]: I1202 08:57:06.042809 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c2079ee-4b91-4755-8f76-9a57e60b27ba-kube-api-access-vpxr9" (OuterVolumeSpecName: "kube-api-access-vpxr9") pod "2c2079ee-4b91-4755-8f76-9a57e60b27ba" (UID: "2c2079ee-4b91-4755-8f76-9a57e60b27ba"). InnerVolumeSpecName "kube-api-access-vpxr9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:57:06 crc kubenswrapper[4895]: I1202 08:57:06.044020 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8c143fa-5ab3-4e36-9da4-69095eedf045-kube-api-access-6tl4w" (OuterVolumeSpecName: "kube-api-access-6tl4w") pod "b8c143fa-5ab3-4e36-9da4-69095eedf045" (UID: "b8c143fa-5ab3-4e36-9da4-69095eedf045"). InnerVolumeSpecName "kube-api-access-6tl4w". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:57:06 crc kubenswrapper[4895]: I1202 08:57:06.044557 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-86jd8" Dec 02 08:57:06 crc kubenswrapper[4895]: I1202 08:57:06.136975 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3f04db4c-ba44-4d58-8471-7ad3abfc0eaf-operator-scripts\") pod \"3f04db4c-ba44-4d58-8471-7ad3abfc0eaf\" (UID: \"3f04db4c-ba44-4d58-8471-7ad3abfc0eaf\") " Dec 02 08:57:06 crc kubenswrapper[4895]: I1202 08:57:06.137036 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zqfnm\" (UniqueName: \"kubernetes.io/projected/53745490-f6e4-4f78-964b-5a52444211b8-kube-api-access-zqfnm\") pod \"53745490-f6e4-4f78-964b-5a52444211b8\" (UID: \"53745490-f6e4-4f78-964b-5a52444211b8\") " Dec 02 08:57:06 crc kubenswrapper[4895]: I1202 08:57:06.137097 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qcgm8\" (UniqueName: \"kubernetes.io/projected/3f04db4c-ba44-4d58-8471-7ad3abfc0eaf-kube-api-access-qcgm8\") pod \"3f04db4c-ba44-4d58-8471-7ad3abfc0eaf\" (UID: \"3f04db4c-ba44-4d58-8471-7ad3abfc0eaf\") " Dec 02 08:57:06 crc kubenswrapper[4895]: I1202 08:57:06.137171 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/53745490-f6e4-4f78-964b-5a52444211b8-operator-scripts\") pod \"53745490-f6e4-4f78-964b-5a52444211b8\" (UID: \"53745490-f6e4-4f78-964b-5a52444211b8\") " Dec 02 08:57:06 crc kubenswrapper[4895]: I1202 08:57:06.137388 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vpxr9\" (UniqueName: \"kubernetes.io/projected/2c2079ee-4b91-4755-8f76-9a57e60b27ba-kube-api-access-vpxr9\") on node \"crc\" DevicePath \"\"" Dec 02 08:57:06 crc kubenswrapper[4895]: I1202 08:57:06.137405 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cqw5z\" (UniqueName: \"kubernetes.io/projected/53b6ca0c-81ea-4711-bc3c-d9a7a205543b-kube-api-access-cqw5z\") on node \"crc\" DevicePath \"\"" Dec 02 08:57:06 crc kubenswrapper[4895]: I1202 08:57:06.137416 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6tl4w\" (UniqueName: \"kubernetes.io/projected/b8c143fa-5ab3-4e36-9da4-69095eedf045-kube-api-access-6tl4w\") on node \"crc\" DevicePath \"\"" Dec 02 08:57:06 crc kubenswrapper[4895]: I1202 08:57:06.137426 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l444q\" (UniqueName: \"kubernetes.io/projected/febef79d-8c1e-4f62-b362-268f7d459291-kube-api-access-l444q\") on node \"crc\" DevicePath \"\"" Dec 02 08:57:06 crc kubenswrapper[4895]: I1202 08:57:06.137435 4895 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/febef79d-8c1e-4f62-b362-268f7d459291-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 08:57:06 crc kubenswrapper[4895]: I1202 08:57:06.137448 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f04db4c-ba44-4d58-8471-7ad3abfc0eaf-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3f04db4c-ba44-4d58-8471-7ad3abfc0eaf" (UID: "3f04db4c-ba44-4d58-8471-7ad3abfc0eaf"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:57:06 crc kubenswrapper[4895]: I1202 08:57:06.137720 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53745490-f6e4-4f78-964b-5a52444211b8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "53745490-f6e4-4f78-964b-5a52444211b8" (UID: "53745490-f6e4-4f78-964b-5a52444211b8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:57:06 crc kubenswrapper[4895]: I1202 08:57:06.139946 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f04db4c-ba44-4d58-8471-7ad3abfc0eaf-kube-api-access-qcgm8" (OuterVolumeSpecName: "kube-api-access-qcgm8") pod "3f04db4c-ba44-4d58-8471-7ad3abfc0eaf" (UID: "3f04db4c-ba44-4d58-8471-7ad3abfc0eaf"). InnerVolumeSpecName "kube-api-access-qcgm8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:57:06 crc kubenswrapper[4895]: I1202 08:57:06.140261 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53745490-f6e4-4f78-964b-5a52444211b8-kube-api-access-zqfnm" (OuterVolumeSpecName: "kube-api-access-zqfnm") pod "53745490-f6e4-4f78-964b-5a52444211b8" (UID: "53745490-f6e4-4f78-964b-5a52444211b8"). InnerVolumeSpecName "kube-api-access-zqfnm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:57:06 crc kubenswrapper[4895]: I1202 08:57:06.241411 4895 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/53745490-f6e4-4f78-964b-5a52444211b8-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 08:57:06 crc kubenswrapper[4895]: I1202 08:57:06.241478 4895 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3f04db4c-ba44-4d58-8471-7ad3abfc0eaf-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 08:57:06 crc kubenswrapper[4895]: I1202 08:57:06.241496 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zqfnm\" (UniqueName: \"kubernetes.io/projected/53745490-f6e4-4f78-964b-5a52444211b8-kube-api-access-zqfnm\") on node \"crc\" DevicePath \"\"" Dec 02 08:57:06 crc kubenswrapper[4895]: I1202 08:57:06.241511 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qcgm8\" (UniqueName: \"kubernetes.io/projected/3f04db4c-ba44-4d58-8471-7ad3abfc0eaf-kube-api-access-qcgm8\") on node \"crc\" DevicePath \"\"" Dec 02 08:57:06 crc kubenswrapper[4895]: I1202 08:57:06.462555 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-86jd8" event={"ID":"53745490-f6e4-4f78-964b-5a52444211b8","Type":"ContainerDied","Data":"b4d87c37a7d9389373be96d04762a5f793229346c643cbec811a5ba2a2a9ac6f"} Dec 02 08:57:06 crc kubenswrapper[4895]: I1202 08:57:06.462597 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b4d87c37a7d9389373be96d04762a5f793229346c643cbec811a5ba2a2a9ac6f" Dec 02 08:57:06 crc kubenswrapper[4895]: I1202 08:57:06.462660 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-86jd8" Dec 02 08:57:06 crc kubenswrapper[4895]: I1202 08:57:06.467279 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-b117-account-create-update-t7r7w" Dec 02 08:57:06 crc kubenswrapper[4895]: I1202 08:57:06.467359 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-b117-account-create-update-t7r7w" event={"ID":"febef79d-8c1e-4f62-b362-268f7d459291","Type":"ContainerDied","Data":"3f147b036ccdb8e8e8122c5b06b88628a4b6f8458a1c5acaba5caa4dd72b78bb"} Dec 02 08:57:06 crc kubenswrapper[4895]: I1202 08:57:06.467422 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3f147b036ccdb8e8e8122c5b06b88628a4b6f8458a1c5acaba5caa4dd72b78bb" Dec 02 08:57:06 crc kubenswrapper[4895]: I1202 08:57:06.471668 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-55cd-account-create-update-vbfqv" Dec 02 08:57:06 crc kubenswrapper[4895]: I1202 08:57:06.471658 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-55cd-account-create-update-vbfqv" event={"ID":"3f04db4c-ba44-4d58-8471-7ad3abfc0eaf","Type":"ContainerDied","Data":"5a71afba495b9657e08eef72ea39d8c6cd961fb1ec1ce471dc36ae94af3001d0"} Dec 02 08:57:06 crc kubenswrapper[4895]: I1202 08:57:06.471807 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5a71afba495b9657e08eef72ea39d8c6cd961fb1ec1ce471dc36ae94af3001d0" Dec 02 08:57:06 crc kubenswrapper[4895]: I1202 08:57:06.474266 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-rwhl6" event={"ID":"2c2079ee-4b91-4755-8f76-9a57e60b27ba","Type":"ContainerDied","Data":"e70d722e85bea33f6fd2b74b645057509dcd3b2e023eac895876359a6a3e7263"} Dec 02 08:57:06 crc kubenswrapper[4895]: I1202 08:57:06.474308 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e70d722e85bea33f6fd2b74b645057509dcd3b2e023eac895876359a6a3e7263" Dec 02 08:57:06 crc kubenswrapper[4895]: I1202 08:57:06.474413 4895 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-rwhl6" Dec 02 08:57:06 crc kubenswrapper[4895]: I1202 08:57:06.484564 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-p5cfg" event={"ID":"53b6ca0c-81ea-4711-bc3c-d9a7a205543b","Type":"ContainerDied","Data":"dcb975720fb7e25ec8ab1047ed7f7c008a258badac1166ba2144d21e06c44cc2"} Dec 02 08:57:06 crc kubenswrapper[4895]: I1202 08:57:06.484809 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dcb975720fb7e25ec8ab1047ed7f7c008a258badac1166ba2144d21e06c44cc2" Dec 02 08:57:06 crc kubenswrapper[4895]: I1202 08:57:06.484845 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-p5cfg" Dec 02 08:57:06 crc kubenswrapper[4895]: I1202 08:57:06.488682 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-ac5f-account-create-update-8qtm2" event={"ID":"b8c143fa-5ab3-4e36-9da4-69095eedf045","Type":"ContainerDied","Data":"503faf2b7cd71073c2342ae862b57e75453450d4345421096774fcc2a3f47a08"} Dec 02 08:57:06 crc kubenswrapper[4895]: I1202 08:57:06.488721 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="503faf2b7cd71073c2342ae862b57e75453450d4345421096774fcc2a3f47a08" Dec 02 08:57:06 crc kubenswrapper[4895]: I1202 08:57:06.488790 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-ac5f-account-create-update-8qtm2" Dec 02 08:57:07 crc kubenswrapper[4895]: I1202 08:57:07.630311 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-qgzzz"] Dec 02 08:57:07 crc kubenswrapper[4895]: E1202 08:57:07.631153 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8c143fa-5ab3-4e36-9da4-69095eedf045" containerName="mariadb-account-create-update" Dec 02 08:57:07 crc kubenswrapper[4895]: I1202 08:57:07.631170 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8c143fa-5ab3-4e36-9da4-69095eedf045" containerName="mariadb-account-create-update" Dec 02 08:57:07 crc kubenswrapper[4895]: E1202 08:57:07.631192 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="febef79d-8c1e-4f62-b362-268f7d459291" containerName="mariadb-account-create-update" Dec 02 08:57:07 crc kubenswrapper[4895]: I1202 08:57:07.631202 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="febef79d-8c1e-4f62-b362-268f7d459291" containerName="mariadb-account-create-update" Dec 02 08:57:07 crc kubenswrapper[4895]: E1202 08:57:07.631235 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c2079ee-4b91-4755-8f76-9a57e60b27ba" containerName="mariadb-database-create" Dec 02 08:57:07 crc kubenswrapper[4895]: I1202 08:57:07.631253 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c2079ee-4b91-4755-8f76-9a57e60b27ba" containerName="mariadb-database-create" Dec 02 08:57:07 crc kubenswrapper[4895]: E1202 08:57:07.631275 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53b6ca0c-81ea-4711-bc3c-d9a7a205543b" containerName="mariadb-database-create" Dec 02 08:57:07 crc kubenswrapper[4895]: I1202 08:57:07.631282 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="53b6ca0c-81ea-4711-bc3c-d9a7a205543b" containerName="mariadb-database-create" Dec 02 08:57:07 crc kubenswrapper[4895]: E1202 08:57:07.631291 4895 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53745490-f6e4-4f78-964b-5a52444211b8" containerName="mariadb-database-create" Dec 02 08:57:07 crc kubenswrapper[4895]: I1202 08:57:07.631298 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="53745490-f6e4-4f78-964b-5a52444211b8" containerName="mariadb-database-create" Dec 02 08:57:07 crc kubenswrapper[4895]: E1202 08:57:07.631310 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f04db4c-ba44-4d58-8471-7ad3abfc0eaf" containerName="mariadb-account-create-update" Dec 02 08:57:07 crc kubenswrapper[4895]: I1202 08:57:07.631318 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f04db4c-ba44-4d58-8471-7ad3abfc0eaf" containerName="mariadb-account-create-update" Dec 02 08:57:07 crc kubenswrapper[4895]: I1202 08:57:07.631525 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="53745490-f6e4-4f78-964b-5a52444211b8" containerName="mariadb-database-create" Dec 02 08:57:07 crc kubenswrapper[4895]: I1202 08:57:07.631540 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c2079ee-4b91-4755-8f76-9a57e60b27ba" containerName="mariadb-database-create" Dec 02 08:57:07 crc kubenswrapper[4895]: I1202 08:57:07.631552 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="febef79d-8c1e-4f62-b362-268f7d459291" containerName="mariadb-account-create-update" Dec 02 08:57:07 crc kubenswrapper[4895]: I1202 08:57:07.631567 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="53b6ca0c-81ea-4711-bc3c-d9a7a205543b" containerName="mariadb-database-create" Dec 02 08:57:07 crc kubenswrapper[4895]: I1202 08:57:07.631586 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f04db4c-ba44-4d58-8471-7ad3abfc0eaf" containerName="mariadb-account-create-update" Dec 02 08:57:07 crc kubenswrapper[4895]: I1202 08:57:07.631606 4895 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="b8c143fa-5ab3-4e36-9da4-69095eedf045" containerName="mariadb-account-create-update" Dec 02 08:57:07 crc kubenswrapper[4895]: I1202 08:57:07.632588 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-qgzzz" Dec 02 08:57:07 crc kubenswrapper[4895]: I1202 08:57:07.634934 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-tpkqr" Dec 02 08:57:07 crc kubenswrapper[4895]: I1202 08:57:07.637033 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 02 08:57:07 crc kubenswrapper[4895]: I1202 08:57:07.637057 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Dec 02 08:57:07 crc kubenswrapper[4895]: I1202 08:57:07.643495 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-qgzzz"] Dec 02 08:57:07 crc kubenswrapper[4895]: I1202 08:57:07.668401 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4lc6\" (UniqueName: \"kubernetes.io/projected/0c9341c1-1d76-442c-b16e-6afcb266c131-kube-api-access-t4lc6\") pod \"nova-cell0-conductor-db-sync-qgzzz\" (UID: \"0c9341c1-1d76-442c-b16e-6afcb266c131\") " pod="openstack/nova-cell0-conductor-db-sync-qgzzz" Dec 02 08:57:07 crc kubenswrapper[4895]: I1202 08:57:07.668488 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c9341c1-1d76-442c-b16e-6afcb266c131-scripts\") pod \"nova-cell0-conductor-db-sync-qgzzz\" (UID: \"0c9341c1-1d76-442c-b16e-6afcb266c131\") " pod="openstack/nova-cell0-conductor-db-sync-qgzzz" Dec 02 08:57:07 crc kubenswrapper[4895]: I1202 08:57:07.668598 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/0c9341c1-1d76-442c-b16e-6afcb266c131-config-data\") pod \"nova-cell0-conductor-db-sync-qgzzz\" (UID: \"0c9341c1-1d76-442c-b16e-6afcb266c131\") " pod="openstack/nova-cell0-conductor-db-sync-qgzzz" Dec 02 08:57:07 crc kubenswrapper[4895]: I1202 08:57:07.668719 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c9341c1-1d76-442c-b16e-6afcb266c131-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-qgzzz\" (UID: \"0c9341c1-1d76-442c-b16e-6afcb266c131\") " pod="openstack/nova-cell0-conductor-db-sync-qgzzz" Dec 02 08:57:07 crc kubenswrapper[4895]: I1202 08:57:07.770618 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c9341c1-1d76-442c-b16e-6afcb266c131-scripts\") pod \"nova-cell0-conductor-db-sync-qgzzz\" (UID: \"0c9341c1-1d76-442c-b16e-6afcb266c131\") " pod="openstack/nova-cell0-conductor-db-sync-qgzzz" Dec 02 08:57:07 crc kubenswrapper[4895]: I1202 08:57:07.770694 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c9341c1-1d76-442c-b16e-6afcb266c131-config-data\") pod \"nova-cell0-conductor-db-sync-qgzzz\" (UID: \"0c9341c1-1d76-442c-b16e-6afcb266c131\") " pod="openstack/nova-cell0-conductor-db-sync-qgzzz" Dec 02 08:57:07 crc kubenswrapper[4895]: I1202 08:57:07.770790 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c9341c1-1d76-442c-b16e-6afcb266c131-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-qgzzz\" (UID: \"0c9341c1-1d76-442c-b16e-6afcb266c131\") " pod="openstack/nova-cell0-conductor-db-sync-qgzzz" Dec 02 08:57:07 crc kubenswrapper[4895]: I1202 08:57:07.770837 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-t4lc6\" (UniqueName: \"kubernetes.io/projected/0c9341c1-1d76-442c-b16e-6afcb266c131-kube-api-access-t4lc6\") pod \"nova-cell0-conductor-db-sync-qgzzz\" (UID: \"0c9341c1-1d76-442c-b16e-6afcb266c131\") " pod="openstack/nova-cell0-conductor-db-sync-qgzzz" Dec 02 08:57:07 crc kubenswrapper[4895]: I1202 08:57:07.777307 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c9341c1-1d76-442c-b16e-6afcb266c131-scripts\") pod \"nova-cell0-conductor-db-sync-qgzzz\" (UID: \"0c9341c1-1d76-442c-b16e-6afcb266c131\") " pod="openstack/nova-cell0-conductor-db-sync-qgzzz" Dec 02 08:57:07 crc kubenswrapper[4895]: I1202 08:57:07.777488 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c9341c1-1d76-442c-b16e-6afcb266c131-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-qgzzz\" (UID: \"0c9341c1-1d76-442c-b16e-6afcb266c131\") " pod="openstack/nova-cell0-conductor-db-sync-qgzzz" Dec 02 08:57:07 crc kubenswrapper[4895]: I1202 08:57:07.780378 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c9341c1-1d76-442c-b16e-6afcb266c131-config-data\") pod \"nova-cell0-conductor-db-sync-qgzzz\" (UID: \"0c9341c1-1d76-442c-b16e-6afcb266c131\") " pod="openstack/nova-cell0-conductor-db-sync-qgzzz" Dec 02 08:57:07 crc kubenswrapper[4895]: I1202 08:57:07.789917 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4lc6\" (UniqueName: \"kubernetes.io/projected/0c9341c1-1d76-442c-b16e-6afcb266c131-kube-api-access-t4lc6\") pod \"nova-cell0-conductor-db-sync-qgzzz\" (UID: \"0c9341c1-1d76-442c-b16e-6afcb266c131\") " pod="openstack/nova-cell0-conductor-db-sync-qgzzz" Dec 02 08:57:07 crc kubenswrapper[4895]: I1202 08:57:07.949710 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-qgzzz" Dec 02 08:57:08 crc kubenswrapper[4895]: W1202 08:57:08.666863 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0c9341c1_1d76_442c_b16e_6afcb266c131.slice/crio-e8e5520de08aca14b92d13a9aa34605e0419948b39c4d14436c1602aa96497a0 WatchSource:0}: Error finding container e8e5520de08aca14b92d13a9aa34605e0419948b39c4d14436c1602aa96497a0: Status 404 returned error can't find the container with id e8e5520de08aca14b92d13a9aa34605e0419948b39c4d14436c1602aa96497a0 Dec 02 08:57:08 crc kubenswrapper[4895]: I1202 08:57:08.672201 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-qgzzz"] Dec 02 08:57:09 crc kubenswrapper[4895]: I1202 08:57:09.518706 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-qgzzz" event={"ID":"0c9341c1-1d76-442c-b16e-6afcb266c131","Type":"ContainerStarted","Data":"bff4156dcdd4edb34ab340cff93d5c0c7effff7e9e861e491111650ec66c6516"} Dec 02 08:57:09 crc kubenswrapper[4895]: I1202 08:57:09.519416 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-qgzzz" event={"ID":"0c9341c1-1d76-442c-b16e-6afcb266c131","Type":"ContainerStarted","Data":"e8e5520de08aca14b92d13a9aa34605e0419948b39c4d14436c1602aa96497a0"} Dec 02 08:57:09 crc kubenswrapper[4895]: I1202 08:57:09.546314 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-qgzzz" podStartSLOduration=2.5462919 podStartE2EDuration="2.5462919s" podCreationTimestamp="2025-12-02 08:57:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:57:09.535392361 +0000 UTC m=+5640.706251984" watchObservedRunningTime="2025-12-02 08:57:09.5462919 +0000 UTC m=+5640.717151513" 
Dec 02 08:57:14 crc kubenswrapper[4895]: I1202 08:57:14.559255 4895 generic.go:334] "Generic (PLEG): container finished" podID="0c9341c1-1d76-442c-b16e-6afcb266c131" containerID="bff4156dcdd4edb34ab340cff93d5c0c7effff7e9e861e491111650ec66c6516" exitCode=0 Dec 02 08:57:14 crc kubenswrapper[4895]: I1202 08:57:14.559913 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-qgzzz" event={"ID":"0c9341c1-1d76-442c-b16e-6afcb266c131","Type":"ContainerDied","Data":"bff4156dcdd4edb34ab340cff93d5c0c7effff7e9e861e491111650ec66c6516"} Dec 02 08:57:15 crc kubenswrapper[4895]: I1202 08:57:15.929760 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-qgzzz" Dec 02 08:57:16 crc kubenswrapper[4895]: I1202 08:57:16.025522 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c9341c1-1d76-442c-b16e-6afcb266c131-combined-ca-bundle\") pod \"0c9341c1-1d76-442c-b16e-6afcb266c131\" (UID: \"0c9341c1-1d76-442c-b16e-6afcb266c131\") " Dec 02 08:57:16 crc kubenswrapper[4895]: I1202 08:57:16.025585 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c9341c1-1d76-442c-b16e-6afcb266c131-scripts\") pod \"0c9341c1-1d76-442c-b16e-6afcb266c131\" (UID: \"0c9341c1-1d76-442c-b16e-6afcb266c131\") " Dec 02 08:57:16 crc kubenswrapper[4895]: I1202 08:57:16.025626 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t4lc6\" (UniqueName: \"kubernetes.io/projected/0c9341c1-1d76-442c-b16e-6afcb266c131-kube-api-access-t4lc6\") pod \"0c9341c1-1d76-442c-b16e-6afcb266c131\" (UID: \"0c9341c1-1d76-442c-b16e-6afcb266c131\") " Dec 02 08:57:16 crc kubenswrapper[4895]: I1202 08:57:16.025694 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/0c9341c1-1d76-442c-b16e-6afcb266c131-config-data\") pod \"0c9341c1-1d76-442c-b16e-6afcb266c131\" (UID: \"0c9341c1-1d76-442c-b16e-6afcb266c131\") " Dec 02 08:57:16 crc kubenswrapper[4895]: I1202 08:57:16.031198 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c9341c1-1d76-442c-b16e-6afcb266c131-scripts" (OuterVolumeSpecName: "scripts") pod "0c9341c1-1d76-442c-b16e-6afcb266c131" (UID: "0c9341c1-1d76-442c-b16e-6afcb266c131"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:57:16 crc kubenswrapper[4895]: I1202 08:57:16.032011 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c9341c1-1d76-442c-b16e-6afcb266c131-kube-api-access-t4lc6" (OuterVolumeSpecName: "kube-api-access-t4lc6") pod "0c9341c1-1d76-442c-b16e-6afcb266c131" (UID: "0c9341c1-1d76-442c-b16e-6afcb266c131"). InnerVolumeSpecName "kube-api-access-t4lc6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:57:16 crc kubenswrapper[4895]: I1202 08:57:16.053542 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c9341c1-1d76-442c-b16e-6afcb266c131-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0c9341c1-1d76-442c-b16e-6afcb266c131" (UID: "0c9341c1-1d76-442c-b16e-6afcb266c131"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:57:16 crc kubenswrapper[4895]: I1202 08:57:16.053855 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c9341c1-1d76-442c-b16e-6afcb266c131-config-data" (OuterVolumeSpecName: "config-data") pod "0c9341c1-1d76-442c-b16e-6afcb266c131" (UID: "0c9341c1-1d76-442c-b16e-6afcb266c131"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:57:16 crc kubenswrapper[4895]: I1202 08:57:16.127707 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c9341c1-1d76-442c-b16e-6afcb266c131-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 08:57:16 crc kubenswrapper[4895]: I1202 08:57:16.128049 4895 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c9341c1-1d76-442c-b16e-6afcb266c131-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 08:57:16 crc kubenswrapper[4895]: I1202 08:57:16.128118 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t4lc6\" (UniqueName: \"kubernetes.io/projected/0c9341c1-1d76-442c-b16e-6afcb266c131-kube-api-access-t4lc6\") on node \"crc\" DevicePath \"\"" Dec 02 08:57:16 crc kubenswrapper[4895]: I1202 08:57:16.128186 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c9341c1-1d76-442c-b16e-6afcb266c131-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 08:57:16 crc kubenswrapper[4895]: I1202 08:57:16.580079 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-qgzzz" event={"ID":"0c9341c1-1d76-442c-b16e-6afcb266c131","Type":"ContainerDied","Data":"e8e5520de08aca14b92d13a9aa34605e0419948b39c4d14436c1602aa96497a0"} Dec 02 08:57:16 crc kubenswrapper[4895]: I1202 08:57:16.580124 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e8e5520de08aca14b92d13a9aa34605e0419948b39c4d14436c1602aa96497a0" Dec 02 08:57:16 crc kubenswrapper[4895]: I1202 08:57:16.580183 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-qgzzz" Dec 02 08:57:16 crc kubenswrapper[4895]: I1202 08:57:16.659360 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 02 08:57:16 crc kubenswrapper[4895]: E1202 08:57:16.659871 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c9341c1-1d76-442c-b16e-6afcb266c131" containerName="nova-cell0-conductor-db-sync" Dec 02 08:57:16 crc kubenswrapper[4895]: I1202 08:57:16.659893 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c9341c1-1d76-442c-b16e-6afcb266c131" containerName="nova-cell0-conductor-db-sync" Dec 02 08:57:16 crc kubenswrapper[4895]: I1202 08:57:16.660133 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c9341c1-1d76-442c-b16e-6afcb266c131" containerName="nova-cell0-conductor-db-sync" Dec 02 08:57:16 crc kubenswrapper[4895]: I1202 08:57:16.660892 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 02 08:57:16 crc kubenswrapper[4895]: I1202 08:57:16.665632 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 02 08:57:16 crc kubenswrapper[4895]: I1202 08:57:16.671097 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-tpkqr" Dec 02 08:57:16 crc kubenswrapper[4895]: I1202 08:57:16.676413 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 02 08:57:16 crc kubenswrapper[4895]: I1202 08:57:16.739384 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b1a6c2e-ced5-4dc3-ae1a-ed6fdd7cd12b-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"9b1a6c2e-ced5-4dc3-ae1a-ed6fdd7cd12b\") " pod="openstack/nova-cell0-conductor-0" Dec 02 08:57:16 crc kubenswrapper[4895]: 
I1202 08:57:16.739586 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jt5p\" (UniqueName: \"kubernetes.io/projected/9b1a6c2e-ced5-4dc3-ae1a-ed6fdd7cd12b-kube-api-access-7jt5p\") pod \"nova-cell0-conductor-0\" (UID: \"9b1a6c2e-ced5-4dc3-ae1a-ed6fdd7cd12b\") " pod="openstack/nova-cell0-conductor-0" Dec 02 08:57:16 crc kubenswrapper[4895]: I1202 08:57:16.739659 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b1a6c2e-ced5-4dc3-ae1a-ed6fdd7cd12b-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"9b1a6c2e-ced5-4dc3-ae1a-ed6fdd7cd12b\") " pod="openstack/nova-cell0-conductor-0" Dec 02 08:57:16 crc kubenswrapper[4895]: I1202 08:57:16.840975 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b1a6c2e-ced5-4dc3-ae1a-ed6fdd7cd12b-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"9b1a6c2e-ced5-4dc3-ae1a-ed6fdd7cd12b\") " pod="openstack/nova-cell0-conductor-0" Dec 02 08:57:16 crc kubenswrapper[4895]: I1202 08:57:16.841119 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b1a6c2e-ced5-4dc3-ae1a-ed6fdd7cd12b-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"9b1a6c2e-ced5-4dc3-ae1a-ed6fdd7cd12b\") " pod="openstack/nova-cell0-conductor-0" Dec 02 08:57:16 crc kubenswrapper[4895]: I1202 08:57:16.841190 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jt5p\" (UniqueName: \"kubernetes.io/projected/9b1a6c2e-ced5-4dc3-ae1a-ed6fdd7cd12b-kube-api-access-7jt5p\") pod \"nova-cell0-conductor-0\" (UID: \"9b1a6c2e-ced5-4dc3-ae1a-ed6fdd7cd12b\") " pod="openstack/nova-cell0-conductor-0" Dec 02 08:57:16 crc kubenswrapper[4895]: I1202 08:57:16.848134 4895 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b1a6c2e-ced5-4dc3-ae1a-ed6fdd7cd12b-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"9b1a6c2e-ced5-4dc3-ae1a-ed6fdd7cd12b\") " pod="openstack/nova-cell0-conductor-0" Dec 02 08:57:16 crc kubenswrapper[4895]: I1202 08:57:16.848860 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b1a6c2e-ced5-4dc3-ae1a-ed6fdd7cd12b-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"9b1a6c2e-ced5-4dc3-ae1a-ed6fdd7cd12b\") " pod="openstack/nova-cell0-conductor-0" Dec 02 08:57:16 crc kubenswrapper[4895]: I1202 08:57:16.858445 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jt5p\" (UniqueName: \"kubernetes.io/projected/9b1a6c2e-ced5-4dc3-ae1a-ed6fdd7cd12b-kube-api-access-7jt5p\") pod \"nova-cell0-conductor-0\" (UID: \"9b1a6c2e-ced5-4dc3-ae1a-ed6fdd7cd12b\") " pod="openstack/nova-cell0-conductor-0" Dec 02 08:57:16 crc kubenswrapper[4895]: I1202 08:57:16.983723 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 02 08:57:17 crc kubenswrapper[4895]: I1202 08:57:17.513418 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 02 08:57:17 crc kubenswrapper[4895]: I1202 08:57:17.606363 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"9b1a6c2e-ced5-4dc3-ae1a-ed6fdd7cd12b","Type":"ContainerStarted","Data":"390644f43272f26b31e3fe7c106c02342de67361ff24afb4bb8144ba9a3d471e"} Dec 02 08:57:18 crc kubenswrapper[4895]: I1202 08:57:18.615847 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"9b1a6c2e-ced5-4dc3-ae1a-ed6fdd7cd12b","Type":"ContainerStarted","Data":"a4b6862707d31377831d1999c1f773f9f1687d41cbe8a19027aa93249bfde329"} Dec 02 08:57:18 crc kubenswrapper[4895]: I1202 08:57:18.617241 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Dec 02 08:57:27 crc kubenswrapper[4895]: I1202 08:57:27.008698 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Dec 02 08:57:27 crc kubenswrapper[4895]: I1202 08:57:27.023419 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=11.023402396 podStartE2EDuration="11.023402396s" podCreationTimestamp="2025-12-02 08:57:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:57:18.64911417 +0000 UTC m=+5649.819973793" watchObservedRunningTime="2025-12-02 08:57:27.023402396 +0000 UTC m=+5658.194262009" Dec 02 08:57:27 crc kubenswrapper[4895]: I1202 08:57:27.452277 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-mkdft"] Dec 02 08:57:27 crc kubenswrapper[4895]: I1202 08:57:27.453962 4895 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-mkdft" Dec 02 08:57:27 crc kubenswrapper[4895]: I1202 08:57:27.455777 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Dec 02 08:57:27 crc kubenswrapper[4895]: I1202 08:57:27.456022 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Dec 02 08:57:27 crc kubenswrapper[4895]: I1202 08:57:27.466040 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-mkdft"] Dec 02 08:57:27 crc kubenswrapper[4895]: I1202 08:57:27.549578 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c965d\" (UniqueName: \"kubernetes.io/projected/9bf5d46f-feea-4549-ad6c-3bf285b528ff-kube-api-access-c965d\") pod \"nova-cell0-cell-mapping-mkdft\" (UID: \"9bf5d46f-feea-4549-ad6c-3bf285b528ff\") " pod="openstack/nova-cell0-cell-mapping-mkdft" Dec 02 08:57:27 crc kubenswrapper[4895]: I1202 08:57:27.549802 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bf5d46f-feea-4549-ad6c-3bf285b528ff-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-mkdft\" (UID: \"9bf5d46f-feea-4549-ad6c-3bf285b528ff\") " pod="openstack/nova-cell0-cell-mapping-mkdft" Dec 02 08:57:27 crc kubenswrapper[4895]: I1202 08:57:27.549848 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9bf5d46f-feea-4549-ad6c-3bf285b528ff-scripts\") pod \"nova-cell0-cell-mapping-mkdft\" (UID: \"9bf5d46f-feea-4549-ad6c-3bf285b528ff\") " pod="openstack/nova-cell0-cell-mapping-mkdft" Dec 02 08:57:27 crc kubenswrapper[4895]: I1202 08:57:27.549872 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9bf5d46f-feea-4549-ad6c-3bf285b528ff-config-data\") pod \"nova-cell0-cell-mapping-mkdft\" (UID: \"9bf5d46f-feea-4549-ad6c-3bf285b528ff\") " pod="openstack/nova-cell0-cell-mapping-mkdft" Dec 02 08:57:27 crc kubenswrapper[4895]: I1202 08:57:27.588025 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 02 08:57:27 crc kubenswrapper[4895]: I1202 08:57:27.615523 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 02 08:57:27 crc kubenswrapper[4895]: I1202 08:57:27.624587 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 02 08:57:27 crc kubenswrapper[4895]: I1202 08:57:27.652185 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bf5d46f-feea-4549-ad6c-3bf285b528ff-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-mkdft\" (UID: \"9bf5d46f-feea-4549-ad6c-3bf285b528ff\") " pod="openstack/nova-cell0-cell-mapping-mkdft" Dec 02 08:57:27 crc kubenswrapper[4895]: I1202 08:57:27.652283 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6f59f1d-2249-429e-986b-496235570ca2-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c6f59f1d-2249-429e-986b-496235570ca2\") " pod="openstack/nova-api-0" Dec 02 08:57:27 crc kubenswrapper[4895]: I1202 08:57:27.652309 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9bf5d46f-feea-4549-ad6c-3bf285b528ff-scripts\") pod \"nova-cell0-cell-mapping-mkdft\" (UID: \"9bf5d46f-feea-4549-ad6c-3bf285b528ff\") " pod="openstack/nova-cell0-cell-mapping-mkdft" Dec 02 08:57:27 crc kubenswrapper[4895]: I1202 08:57:27.652327 4895 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9bf5d46f-feea-4549-ad6c-3bf285b528ff-config-data\") pod \"nova-cell0-cell-mapping-mkdft\" (UID: \"9bf5d46f-feea-4549-ad6c-3bf285b528ff\") " pod="openstack/nova-cell0-cell-mapping-mkdft" Dec 02 08:57:27 crc kubenswrapper[4895]: I1202 08:57:27.652395 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c965d\" (UniqueName: \"kubernetes.io/projected/9bf5d46f-feea-4549-ad6c-3bf285b528ff-kube-api-access-c965d\") pod \"nova-cell0-cell-mapping-mkdft\" (UID: \"9bf5d46f-feea-4549-ad6c-3bf285b528ff\") " pod="openstack/nova-cell0-cell-mapping-mkdft" Dec 02 08:57:27 crc kubenswrapper[4895]: I1202 08:57:27.652435 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c6f59f1d-2249-429e-986b-496235570ca2-logs\") pod \"nova-api-0\" (UID: \"c6f59f1d-2249-429e-986b-496235570ca2\") " pod="openstack/nova-api-0" Dec 02 08:57:27 crc kubenswrapper[4895]: I1202 08:57:27.652457 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6f59f1d-2249-429e-986b-496235570ca2-config-data\") pod \"nova-api-0\" (UID: \"c6f59f1d-2249-429e-986b-496235570ca2\") " pod="openstack/nova-api-0" Dec 02 08:57:27 crc kubenswrapper[4895]: I1202 08:57:27.652508 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lsw97\" (UniqueName: \"kubernetes.io/projected/c6f59f1d-2249-429e-986b-496235570ca2-kube-api-access-lsw97\") pod \"nova-api-0\" (UID: \"c6f59f1d-2249-429e-986b-496235570ca2\") " pod="openstack/nova-api-0" Dec 02 08:57:27 crc kubenswrapper[4895]: I1202 08:57:27.664029 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 02 08:57:27 crc kubenswrapper[4895]: I1202 08:57:27.667143 4895 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bf5d46f-feea-4549-ad6c-3bf285b528ff-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-mkdft\" (UID: \"9bf5d46f-feea-4549-ad6c-3bf285b528ff\") " pod="openstack/nova-cell0-cell-mapping-mkdft" Dec 02 08:57:27 crc kubenswrapper[4895]: I1202 08:57:27.667553 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9bf5d46f-feea-4549-ad6c-3bf285b528ff-scripts\") pod \"nova-cell0-cell-mapping-mkdft\" (UID: \"9bf5d46f-feea-4549-ad6c-3bf285b528ff\") " pod="openstack/nova-cell0-cell-mapping-mkdft" Dec 02 08:57:27 crc kubenswrapper[4895]: I1202 08:57:27.684930 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 02 08:57:27 crc kubenswrapper[4895]: I1202 08:57:27.686627 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 02 08:57:27 crc kubenswrapper[4895]: I1202 08:57:27.690794 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 02 08:57:27 crc kubenswrapper[4895]: I1202 08:57:27.698769 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c965d\" (UniqueName: \"kubernetes.io/projected/9bf5d46f-feea-4549-ad6c-3bf285b528ff-kube-api-access-c965d\") pod \"nova-cell0-cell-mapping-mkdft\" (UID: \"9bf5d46f-feea-4549-ad6c-3bf285b528ff\") " pod="openstack/nova-cell0-cell-mapping-mkdft" Dec 02 08:57:27 crc kubenswrapper[4895]: I1202 08:57:27.718201 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9bf5d46f-feea-4549-ad6c-3bf285b528ff-config-data\") pod \"nova-cell0-cell-mapping-mkdft\" (UID: \"9bf5d46f-feea-4549-ad6c-3bf285b528ff\") " pod="openstack/nova-cell0-cell-mapping-mkdft" Dec 02 08:57:27 crc kubenswrapper[4895]: I1202 
08:57:27.734660 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 02 08:57:27 crc kubenswrapper[4895]: I1202 08:57:27.753925 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c6f59f1d-2249-429e-986b-496235570ca2-logs\") pod \"nova-api-0\" (UID: \"c6f59f1d-2249-429e-986b-496235570ca2\") " pod="openstack/nova-api-0" Dec 02 08:57:27 crc kubenswrapper[4895]: I1202 08:57:27.754184 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6f59f1d-2249-429e-986b-496235570ca2-config-data\") pod \"nova-api-0\" (UID: \"c6f59f1d-2249-429e-986b-496235570ca2\") " pod="openstack/nova-api-0" Dec 02 08:57:27 crc kubenswrapper[4895]: I1202 08:57:27.754287 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lsw97\" (UniqueName: \"kubernetes.io/projected/c6f59f1d-2249-429e-986b-496235570ca2-kube-api-access-lsw97\") pod \"nova-api-0\" (UID: \"c6f59f1d-2249-429e-986b-496235570ca2\") " pod="openstack/nova-api-0" Dec 02 08:57:27 crc kubenswrapper[4895]: I1202 08:57:27.754382 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76e3c0b2-3994-476b-aead-29d0c1a7d7ce-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"76e3c0b2-3994-476b-aead-29d0c1a7d7ce\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 08:57:27 crc kubenswrapper[4895]: I1202 08:57:27.754471 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svs65\" (UniqueName: \"kubernetes.io/projected/76e3c0b2-3994-476b-aead-29d0c1a7d7ce-kube-api-access-svs65\") pod \"nova-cell1-novncproxy-0\" (UID: \"76e3c0b2-3994-476b-aead-29d0c1a7d7ce\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 08:57:27 crc kubenswrapper[4895]: 
I1202 08:57:27.754590 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6f59f1d-2249-429e-986b-496235570ca2-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c6f59f1d-2249-429e-986b-496235570ca2\") " pod="openstack/nova-api-0" Dec 02 08:57:27 crc kubenswrapper[4895]: I1202 08:57:27.754707 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76e3c0b2-3994-476b-aead-29d0c1a7d7ce-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"76e3c0b2-3994-476b-aead-29d0c1a7d7ce\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 08:57:27 crc kubenswrapper[4895]: I1202 08:57:27.755309 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c6f59f1d-2249-429e-986b-496235570ca2-logs\") pod \"nova-api-0\" (UID: \"c6f59f1d-2249-429e-986b-496235570ca2\") " pod="openstack/nova-api-0" Dec 02 08:57:27 crc kubenswrapper[4895]: I1202 08:57:27.762822 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6f59f1d-2249-429e-986b-496235570ca2-config-data\") pod \"nova-api-0\" (UID: \"c6f59f1d-2249-429e-986b-496235570ca2\") " pod="openstack/nova-api-0" Dec 02 08:57:27 crc kubenswrapper[4895]: I1202 08:57:27.765763 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6f59f1d-2249-429e-986b-496235570ca2-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c6f59f1d-2249-429e-986b-496235570ca2\") " pod="openstack/nova-api-0" Dec 02 08:57:27 crc kubenswrapper[4895]: I1202 08:57:27.777181 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-mkdft" Dec 02 08:57:27 crc kubenswrapper[4895]: I1202 08:57:27.790481 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lsw97\" (UniqueName: \"kubernetes.io/projected/c6f59f1d-2249-429e-986b-496235570ca2-kube-api-access-lsw97\") pod \"nova-api-0\" (UID: \"c6f59f1d-2249-429e-986b-496235570ca2\") " pod="openstack/nova-api-0" Dec 02 08:57:27 crc kubenswrapper[4895]: I1202 08:57:27.799812 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 08:57:27 crc kubenswrapper[4895]: I1202 08:57:27.801241 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 02 08:57:27 crc kubenswrapper[4895]: I1202 08:57:27.807220 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 02 08:57:27 crc kubenswrapper[4895]: I1202 08:57:27.818711 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 08:57:27 crc kubenswrapper[4895]: I1202 08:57:27.858375 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vqvg\" (UniqueName: \"kubernetes.io/projected/0790d1b7-ff03-4e24-b039-c28f9ed62bc9-kube-api-access-8vqvg\") pod \"nova-scheduler-0\" (UID: \"0790d1b7-ff03-4e24-b039-c28f9ed62bc9\") " pod="openstack/nova-scheduler-0" Dec 02 08:57:27 crc kubenswrapper[4895]: I1202 08:57:27.858451 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76e3c0b2-3994-476b-aead-29d0c1a7d7ce-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"76e3c0b2-3994-476b-aead-29d0c1a7d7ce\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 08:57:27 crc kubenswrapper[4895]: I1202 08:57:27.858485 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0790d1b7-ff03-4e24-b039-c28f9ed62bc9-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"0790d1b7-ff03-4e24-b039-c28f9ed62bc9\") " pod="openstack/nova-scheduler-0" Dec 02 08:57:27 crc kubenswrapper[4895]: I1202 08:57:27.858512 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0790d1b7-ff03-4e24-b039-c28f9ed62bc9-config-data\") pod \"nova-scheduler-0\" (UID: \"0790d1b7-ff03-4e24-b039-c28f9ed62bc9\") " pod="openstack/nova-scheduler-0" Dec 02 08:57:27 crc kubenswrapper[4895]: I1202 08:57:27.858605 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76e3c0b2-3994-476b-aead-29d0c1a7d7ce-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"76e3c0b2-3994-476b-aead-29d0c1a7d7ce\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 08:57:27 crc kubenswrapper[4895]: I1202 08:57:27.858652 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-svs65\" (UniqueName: \"kubernetes.io/projected/76e3c0b2-3994-476b-aead-29d0c1a7d7ce-kube-api-access-svs65\") pod \"nova-cell1-novncproxy-0\" (UID: \"76e3c0b2-3994-476b-aead-29d0c1a7d7ce\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 08:57:27 crc kubenswrapper[4895]: I1202 08:57:27.862656 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76e3c0b2-3994-476b-aead-29d0c1a7d7ce-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"76e3c0b2-3994-476b-aead-29d0c1a7d7ce\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 08:57:27 crc kubenswrapper[4895]: I1202 08:57:27.870815 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 02 08:57:27 crc kubenswrapper[4895]: I1202 08:57:27.873227 4895 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 02 08:57:27 crc kubenswrapper[4895]: I1202 08:57:27.882242 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 02 08:57:27 crc kubenswrapper[4895]: I1202 08:57:27.882450 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 08:57:27 crc kubenswrapper[4895]: I1202 08:57:27.891466 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76e3c0b2-3994-476b-aead-29d0c1a7d7ce-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"76e3c0b2-3994-476b-aead-29d0c1a7d7ce\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 08:57:27 crc kubenswrapper[4895]: I1202 08:57:27.918400 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6bc9f46b9f-5tt2h"] Dec 02 08:57:27 crc kubenswrapper[4895]: I1202 08:57:27.927492 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bc9f46b9f-5tt2h" Dec 02 08:57:27 crc kubenswrapper[4895]: I1202 08:57:27.939452 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-svs65\" (UniqueName: \"kubernetes.io/projected/76e3c0b2-3994-476b-aead-29d0c1a7d7ce-kube-api-access-svs65\") pod \"nova-cell1-novncproxy-0\" (UID: \"76e3c0b2-3994-476b-aead-29d0c1a7d7ce\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 08:57:27 crc kubenswrapper[4895]: I1202 08:57:27.940216 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 02 08:57:27 crc kubenswrapper[4895]: I1202 08:57:27.944963 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bc9f46b9f-5tt2h"] Dec 02 08:57:27 crc kubenswrapper[4895]: I1202 08:57:27.952929 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 02 08:57:27 crc kubenswrapper[4895]: I1202 08:57:27.960964 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0790d1b7-ff03-4e24-b039-c28f9ed62bc9-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"0790d1b7-ff03-4e24-b039-c28f9ed62bc9\") " pod="openstack/nova-scheduler-0" Dec 02 08:57:27 crc kubenswrapper[4895]: I1202 08:57:27.961023 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0790d1b7-ff03-4e24-b039-c28f9ed62bc9-config-data\") pod \"nova-scheduler-0\" (UID: \"0790d1b7-ff03-4e24-b039-c28f9ed62bc9\") " pod="openstack/nova-scheduler-0" Dec 02 08:57:27 crc kubenswrapper[4895]: I1202 08:57:27.961061 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d76b17de-f566-4259-b60e-95a57cb3a975-ovsdbserver-sb\") pod \"dnsmasq-dns-6bc9f46b9f-5tt2h\" (UID: \"d76b17de-f566-4259-b60e-95a57cb3a975\") " pod="openstack/dnsmasq-dns-6bc9f46b9f-5tt2h" Dec 02 08:57:27 crc kubenswrapper[4895]: I1202 08:57:27.961130 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d76b17de-f566-4259-b60e-95a57cb3a975-ovsdbserver-nb\") pod \"dnsmasq-dns-6bc9f46b9f-5tt2h\" (UID: \"d76b17de-f566-4259-b60e-95a57cb3a975\") " pod="openstack/dnsmasq-dns-6bc9f46b9f-5tt2h" Dec 02 08:57:27 crc kubenswrapper[4895]: I1202 08:57:27.961192 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d76b17de-f566-4259-b60e-95a57cb3a975-dns-svc\") pod \"dnsmasq-dns-6bc9f46b9f-5tt2h\" (UID: \"d76b17de-f566-4259-b60e-95a57cb3a975\") " pod="openstack/dnsmasq-dns-6bc9f46b9f-5tt2h" Dec 02 
08:57:27 crc kubenswrapper[4895]: I1202 08:57:27.961261 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d76b17de-f566-4259-b60e-95a57cb3a975-config\") pod \"dnsmasq-dns-6bc9f46b9f-5tt2h\" (UID: \"d76b17de-f566-4259-b60e-95a57cb3a975\") " pod="openstack/dnsmasq-dns-6bc9f46b9f-5tt2h" Dec 02 08:57:27 crc kubenswrapper[4895]: I1202 08:57:27.961289 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05e0834d-d816-442d-b6de-66fe0ce1704a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"05e0834d-d816-442d-b6de-66fe0ce1704a\") " pod="openstack/nova-metadata-0" Dec 02 08:57:27 crc kubenswrapper[4895]: I1202 08:57:27.961316 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/05e0834d-d816-442d-b6de-66fe0ce1704a-logs\") pod \"nova-metadata-0\" (UID: \"05e0834d-d816-442d-b6de-66fe0ce1704a\") " pod="openstack/nova-metadata-0" Dec 02 08:57:27 crc kubenswrapper[4895]: I1202 08:57:27.961336 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hh4mk\" (UniqueName: \"kubernetes.io/projected/05e0834d-d816-442d-b6de-66fe0ce1704a-kube-api-access-hh4mk\") pod \"nova-metadata-0\" (UID: \"05e0834d-d816-442d-b6de-66fe0ce1704a\") " pod="openstack/nova-metadata-0" Dec 02 08:57:27 crc kubenswrapper[4895]: I1202 08:57:27.961354 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05e0834d-d816-442d-b6de-66fe0ce1704a-config-data\") pod \"nova-metadata-0\" (UID: \"05e0834d-d816-442d-b6de-66fe0ce1704a\") " pod="openstack/nova-metadata-0" Dec 02 08:57:27 crc kubenswrapper[4895]: I1202 08:57:27.961375 4895 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmrpg\" (UniqueName: \"kubernetes.io/projected/d76b17de-f566-4259-b60e-95a57cb3a975-kube-api-access-lmrpg\") pod \"dnsmasq-dns-6bc9f46b9f-5tt2h\" (UID: \"d76b17de-f566-4259-b60e-95a57cb3a975\") " pod="openstack/dnsmasq-dns-6bc9f46b9f-5tt2h" Dec 02 08:57:27 crc kubenswrapper[4895]: I1202 08:57:27.961395 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vqvg\" (UniqueName: \"kubernetes.io/projected/0790d1b7-ff03-4e24-b039-c28f9ed62bc9-kube-api-access-8vqvg\") pod \"nova-scheduler-0\" (UID: \"0790d1b7-ff03-4e24-b039-c28f9ed62bc9\") " pod="openstack/nova-scheduler-0" Dec 02 08:57:27 crc kubenswrapper[4895]: I1202 08:57:27.969805 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0790d1b7-ff03-4e24-b039-c28f9ed62bc9-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"0790d1b7-ff03-4e24-b039-c28f9ed62bc9\") " pod="openstack/nova-scheduler-0" Dec 02 08:57:27 crc kubenswrapper[4895]: I1202 08:57:27.984509 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0790d1b7-ff03-4e24-b039-c28f9ed62bc9-config-data\") pod \"nova-scheduler-0\" (UID: \"0790d1b7-ff03-4e24-b039-c28f9ed62bc9\") " pod="openstack/nova-scheduler-0" Dec 02 08:57:27 crc kubenswrapper[4895]: I1202 08:57:27.992101 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vqvg\" (UniqueName: \"kubernetes.io/projected/0790d1b7-ff03-4e24-b039-c28f9ed62bc9-kube-api-access-8vqvg\") pod \"nova-scheduler-0\" (UID: \"0790d1b7-ff03-4e24-b039-c28f9ed62bc9\") " pod="openstack/nova-scheduler-0" Dec 02 08:57:28 crc kubenswrapper[4895]: I1202 08:57:28.062992 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/d76b17de-f566-4259-b60e-95a57cb3a975-config\") pod \"dnsmasq-dns-6bc9f46b9f-5tt2h\" (UID: \"d76b17de-f566-4259-b60e-95a57cb3a975\") " pod="openstack/dnsmasq-dns-6bc9f46b9f-5tt2h" Dec 02 08:57:28 crc kubenswrapper[4895]: I1202 08:57:28.063338 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05e0834d-d816-442d-b6de-66fe0ce1704a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"05e0834d-d816-442d-b6de-66fe0ce1704a\") " pod="openstack/nova-metadata-0" Dec 02 08:57:28 crc kubenswrapper[4895]: I1202 08:57:28.063369 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/05e0834d-d816-442d-b6de-66fe0ce1704a-logs\") pod \"nova-metadata-0\" (UID: \"05e0834d-d816-442d-b6de-66fe0ce1704a\") " pod="openstack/nova-metadata-0" Dec 02 08:57:28 crc kubenswrapper[4895]: I1202 08:57:28.063390 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05e0834d-d816-442d-b6de-66fe0ce1704a-config-data\") pod \"nova-metadata-0\" (UID: \"05e0834d-d816-442d-b6de-66fe0ce1704a\") " pod="openstack/nova-metadata-0" Dec 02 08:57:28 crc kubenswrapper[4895]: I1202 08:57:28.063406 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hh4mk\" (UniqueName: \"kubernetes.io/projected/05e0834d-d816-442d-b6de-66fe0ce1704a-kube-api-access-hh4mk\") pod \"nova-metadata-0\" (UID: \"05e0834d-d816-442d-b6de-66fe0ce1704a\") " pod="openstack/nova-metadata-0" Dec 02 08:57:28 crc kubenswrapper[4895]: I1202 08:57:28.063432 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lmrpg\" (UniqueName: \"kubernetes.io/projected/d76b17de-f566-4259-b60e-95a57cb3a975-kube-api-access-lmrpg\") pod \"dnsmasq-dns-6bc9f46b9f-5tt2h\" (UID: 
\"d76b17de-f566-4259-b60e-95a57cb3a975\") " pod="openstack/dnsmasq-dns-6bc9f46b9f-5tt2h" Dec 02 08:57:28 crc kubenswrapper[4895]: I1202 08:57:28.063510 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d76b17de-f566-4259-b60e-95a57cb3a975-ovsdbserver-sb\") pod \"dnsmasq-dns-6bc9f46b9f-5tt2h\" (UID: \"d76b17de-f566-4259-b60e-95a57cb3a975\") " pod="openstack/dnsmasq-dns-6bc9f46b9f-5tt2h" Dec 02 08:57:28 crc kubenswrapper[4895]: I1202 08:57:28.063561 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d76b17de-f566-4259-b60e-95a57cb3a975-ovsdbserver-nb\") pod \"dnsmasq-dns-6bc9f46b9f-5tt2h\" (UID: \"d76b17de-f566-4259-b60e-95a57cb3a975\") " pod="openstack/dnsmasq-dns-6bc9f46b9f-5tt2h" Dec 02 08:57:28 crc kubenswrapper[4895]: I1202 08:57:28.063601 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d76b17de-f566-4259-b60e-95a57cb3a975-dns-svc\") pod \"dnsmasq-dns-6bc9f46b9f-5tt2h\" (UID: \"d76b17de-f566-4259-b60e-95a57cb3a975\") " pod="openstack/dnsmasq-dns-6bc9f46b9f-5tt2h" Dec 02 08:57:28 crc kubenswrapper[4895]: I1202 08:57:28.064854 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d76b17de-f566-4259-b60e-95a57cb3a975-dns-svc\") pod \"dnsmasq-dns-6bc9f46b9f-5tt2h\" (UID: \"d76b17de-f566-4259-b60e-95a57cb3a975\") " pod="openstack/dnsmasq-dns-6bc9f46b9f-5tt2h" Dec 02 08:57:28 crc kubenswrapper[4895]: I1202 08:57:28.066155 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/05e0834d-d816-442d-b6de-66fe0ce1704a-logs\") pod \"nova-metadata-0\" (UID: \"05e0834d-d816-442d-b6de-66fe0ce1704a\") " pod="openstack/nova-metadata-0" Dec 02 08:57:28 crc kubenswrapper[4895]: I1202 
08:57:28.066633 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d76b17de-f566-4259-b60e-95a57cb3a975-ovsdbserver-sb\") pod \"dnsmasq-dns-6bc9f46b9f-5tt2h\" (UID: \"d76b17de-f566-4259-b60e-95a57cb3a975\") " pod="openstack/dnsmasq-dns-6bc9f46b9f-5tt2h" Dec 02 08:57:28 crc kubenswrapper[4895]: I1202 08:57:28.068140 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d76b17de-f566-4259-b60e-95a57cb3a975-ovsdbserver-nb\") pod \"dnsmasq-dns-6bc9f46b9f-5tt2h\" (UID: \"d76b17de-f566-4259-b60e-95a57cb3a975\") " pod="openstack/dnsmasq-dns-6bc9f46b9f-5tt2h" Dec 02 08:57:28 crc kubenswrapper[4895]: I1202 08:57:28.068251 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d76b17de-f566-4259-b60e-95a57cb3a975-config\") pod \"dnsmasq-dns-6bc9f46b9f-5tt2h\" (UID: \"d76b17de-f566-4259-b60e-95a57cb3a975\") " pod="openstack/dnsmasq-dns-6bc9f46b9f-5tt2h" Dec 02 08:57:28 crc kubenswrapper[4895]: I1202 08:57:28.068824 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05e0834d-d816-442d-b6de-66fe0ce1704a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"05e0834d-d816-442d-b6de-66fe0ce1704a\") " pod="openstack/nova-metadata-0" Dec 02 08:57:28 crc kubenswrapper[4895]: I1202 08:57:28.070438 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05e0834d-d816-442d-b6de-66fe0ce1704a-config-data\") pod \"nova-metadata-0\" (UID: \"05e0834d-d816-442d-b6de-66fe0ce1704a\") " pod="openstack/nova-metadata-0" Dec 02 08:57:28 crc kubenswrapper[4895]: I1202 08:57:28.085451 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmrpg\" (UniqueName: 
\"kubernetes.io/projected/d76b17de-f566-4259-b60e-95a57cb3a975-kube-api-access-lmrpg\") pod \"dnsmasq-dns-6bc9f46b9f-5tt2h\" (UID: \"d76b17de-f566-4259-b60e-95a57cb3a975\") " pod="openstack/dnsmasq-dns-6bc9f46b9f-5tt2h" Dec 02 08:57:28 crc kubenswrapper[4895]: I1202 08:57:28.087329 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hh4mk\" (UniqueName: \"kubernetes.io/projected/05e0834d-d816-442d-b6de-66fe0ce1704a-kube-api-access-hh4mk\") pod \"nova-metadata-0\" (UID: \"05e0834d-d816-442d-b6de-66fe0ce1704a\") " pod="openstack/nova-metadata-0" Dec 02 08:57:28 crc kubenswrapper[4895]: I1202 08:57:28.266546 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 02 08:57:28 crc kubenswrapper[4895]: I1202 08:57:28.302033 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 02 08:57:28 crc kubenswrapper[4895]: I1202 08:57:28.318802 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bc9f46b9f-5tt2h" Dec 02 08:57:28 crc kubenswrapper[4895]: I1202 08:57:28.542097 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-mkdft"] Dec 02 08:57:28 crc kubenswrapper[4895]: I1202 08:57:28.581508 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-m4pzc"] Dec 02 08:57:28 crc kubenswrapper[4895]: I1202 08:57:28.582933 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-m4pzc" Dec 02 08:57:28 crc kubenswrapper[4895]: I1202 08:57:28.586205 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 02 08:57:28 crc kubenswrapper[4895]: I1202 08:57:28.589700 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Dec 02 08:57:28 crc kubenswrapper[4895]: I1202 08:57:28.603645 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-m4pzc"] Dec 02 08:57:28 crc kubenswrapper[4895]: I1202 08:57:28.679731 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ddf2c8d7-9918-4162-86a3-68074211ecdb-scripts\") pod \"nova-cell1-conductor-db-sync-m4pzc\" (UID: \"ddf2c8d7-9918-4162-86a3-68074211ecdb\") " pod="openstack/nova-cell1-conductor-db-sync-m4pzc" Dec 02 08:57:28 crc kubenswrapper[4895]: I1202 08:57:28.679812 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fv5qw\" (UniqueName: \"kubernetes.io/projected/ddf2c8d7-9918-4162-86a3-68074211ecdb-kube-api-access-fv5qw\") pod \"nova-cell1-conductor-db-sync-m4pzc\" (UID: \"ddf2c8d7-9918-4162-86a3-68074211ecdb\") " pod="openstack/nova-cell1-conductor-db-sync-m4pzc" Dec 02 08:57:28 crc kubenswrapper[4895]: I1202 08:57:28.679867 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddf2c8d7-9918-4162-86a3-68074211ecdb-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-m4pzc\" (UID: \"ddf2c8d7-9918-4162-86a3-68074211ecdb\") " pod="openstack/nova-cell1-conductor-db-sync-m4pzc" Dec 02 08:57:28 crc kubenswrapper[4895]: I1202 08:57:28.679894 4895 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ddf2c8d7-9918-4162-86a3-68074211ecdb-config-data\") pod \"nova-cell1-conductor-db-sync-m4pzc\" (UID: \"ddf2c8d7-9918-4162-86a3-68074211ecdb\") " pod="openstack/nova-cell1-conductor-db-sync-m4pzc" Dec 02 08:57:28 crc kubenswrapper[4895]: I1202 08:57:28.685438 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 02 08:57:28 crc kubenswrapper[4895]: I1202 08:57:28.701814 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 02 08:57:28 crc kubenswrapper[4895]: I1202 08:57:28.749044 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"76e3c0b2-3994-476b-aead-29d0c1a7d7ce","Type":"ContainerStarted","Data":"ed3eb02a7c2ac8de716f6454be97406e6ca225e61aeb7e863e298edb937e7f0b"} Dec 02 08:57:28 crc kubenswrapper[4895]: I1202 08:57:28.750392 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-mkdft" event={"ID":"9bf5d46f-feea-4549-ad6c-3bf285b528ff","Type":"ContainerStarted","Data":"d042a04c10f34b0e62304bcbf410e7f007fac22899c65b9ee96a171953457324"} Dec 02 08:57:28 crc kubenswrapper[4895]: I1202 08:57:28.757147 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c6f59f1d-2249-429e-986b-496235570ca2","Type":"ContainerStarted","Data":"2f206566df3687f8b0707258a08e30f8d776d13376e72a18f5887cbb92477645"} Dec 02 08:57:28 crc kubenswrapper[4895]: I1202 08:57:28.782753 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ddf2c8d7-9918-4162-86a3-68074211ecdb-scripts\") pod \"nova-cell1-conductor-db-sync-m4pzc\" (UID: \"ddf2c8d7-9918-4162-86a3-68074211ecdb\") " pod="openstack/nova-cell1-conductor-db-sync-m4pzc" Dec 02 08:57:28 crc kubenswrapper[4895]: I1202 08:57:28.782817 4895 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fv5qw\" (UniqueName: \"kubernetes.io/projected/ddf2c8d7-9918-4162-86a3-68074211ecdb-kube-api-access-fv5qw\") pod \"nova-cell1-conductor-db-sync-m4pzc\" (UID: \"ddf2c8d7-9918-4162-86a3-68074211ecdb\") " pod="openstack/nova-cell1-conductor-db-sync-m4pzc" Dec 02 08:57:28 crc kubenswrapper[4895]: I1202 08:57:28.782888 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddf2c8d7-9918-4162-86a3-68074211ecdb-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-m4pzc\" (UID: \"ddf2c8d7-9918-4162-86a3-68074211ecdb\") " pod="openstack/nova-cell1-conductor-db-sync-m4pzc" Dec 02 08:57:28 crc kubenswrapper[4895]: I1202 08:57:28.782928 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ddf2c8d7-9918-4162-86a3-68074211ecdb-config-data\") pod \"nova-cell1-conductor-db-sync-m4pzc\" (UID: \"ddf2c8d7-9918-4162-86a3-68074211ecdb\") " pod="openstack/nova-cell1-conductor-db-sync-m4pzc" Dec 02 08:57:28 crc kubenswrapper[4895]: I1202 08:57:28.787902 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ddf2c8d7-9918-4162-86a3-68074211ecdb-config-data\") pod \"nova-cell1-conductor-db-sync-m4pzc\" (UID: \"ddf2c8d7-9918-4162-86a3-68074211ecdb\") " pod="openstack/nova-cell1-conductor-db-sync-m4pzc" Dec 02 08:57:28 crc kubenswrapper[4895]: I1202 08:57:28.793212 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ddf2c8d7-9918-4162-86a3-68074211ecdb-scripts\") pod \"nova-cell1-conductor-db-sync-m4pzc\" (UID: \"ddf2c8d7-9918-4162-86a3-68074211ecdb\") " pod="openstack/nova-cell1-conductor-db-sync-m4pzc" Dec 02 08:57:28 crc kubenswrapper[4895]: I1202 08:57:28.794639 4895 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddf2c8d7-9918-4162-86a3-68074211ecdb-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-m4pzc\" (UID: \"ddf2c8d7-9918-4162-86a3-68074211ecdb\") " pod="openstack/nova-cell1-conductor-db-sync-m4pzc" Dec 02 08:57:28 crc kubenswrapper[4895]: I1202 08:57:28.809921 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fv5qw\" (UniqueName: \"kubernetes.io/projected/ddf2c8d7-9918-4162-86a3-68074211ecdb-kube-api-access-fv5qw\") pod \"nova-cell1-conductor-db-sync-m4pzc\" (UID: \"ddf2c8d7-9918-4162-86a3-68074211ecdb\") " pod="openstack/nova-cell1-conductor-db-sync-m4pzc" Dec 02 08:57:28 crc kubenswrapper[4895]: I1202 08:57:28.856464 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 08:57:28 crc kubenswrapper[4895]: I1202 08:57:28.910508 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-m4pzc" Dec 02 08:57:29 crc kubenswrapper[4895]: I1202 08:57:29.037971 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 08:57:29 crc kubenswrapper[4895]: I1202 08:57:29.052905 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bc9f46b9f-5tt2h"] Dec 02 08:57:29 crc kubenswrapper[4895]: I1202 08:57:29.697809 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-m4pzc"] Dec 02 08:57:29 crc kubenswrapper[4895]: I1202 08:57:29.774457 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"05e0834d-d816-442d-b6de-66fe0ce1704a","Type":"ContainerStarted","Data":"64c5b686c3e5f022fd627d5bea51b074ae760bec8345fc633b6c242890067f0e"} Dec 02 08:57:29 crc kubenswrapper[4895]: I1202 08:57:29.774853 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"05e0834d-d816-442d-b6de-66fe0ce1704a","Type":"ContainerStarted","Data":"3894fe927fac5ba70804789d53a9b6bca1f0f4c68e6c228f3a2e629347f73781"} Dec 02 08:57:29 crc kubenswrapper[4895]: I1202 08:57:29.777262 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"0790d1b7-ff03-4e24-b039-c28f9ed62bc9","Type":"ContainerStarted","Data":"23c9ae2afaa444aea336f65ad1affa0d3ef2119339daa6dfad28b8fbd7bb7c45"} Dec 02 08:57:29 crc kubenswrapper[4895]: I1202 08:57:29.777411 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"0790d1b7-ff03-4e24-b039-c28f9ed62bc9","Type":"ContainerStarted","Data":"a49208fb9e1582663e592fa8d117741d4806653f06ef852fceaf19a878ecef10"} Dec 02 08:57:29 crc kubenswrapper[4895]: I1202 08:57:29.780256 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-mkdft" 
event={"ID":"9bf5d46f-feea-4549-ad6c-3bf285b528ff","Type":"ContainerStarted","Data":"cd8e6ec08927f3cbf7386adde680b90d0ef08c316bfaa14fc6145f460b789bba"} Dec 02 08:57:29 crc kubenswrapper[4895]: I1202 08:57:29.794055 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c6f59f1d-2249-429e-986b-496235570ca2","Type":"ContainerStarted","Data":"58855f0305fc651f474f5c03cd01086365d32dbfa7359a56aa632adc30282efb"} Dec 02 08:57:29 crc kubenswrapper[4895]: I1202 08:57:29.794108 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c6f59f1d-2249-429e-986b-496235570ca2","Type":"ContainerStarted","Data":"a987d0f4eba145cc01e39438ddd34f2a8059cc02ad43e7ec5eb7cbc2e4e3143e"} Dec 02 08:57:29 crc kubenswrapper[4895]: I1202 08:57:29.800133 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.80011506 podStartE2EDuration="2.80011506s" podCreationTimestamp="2025-12-02 08:57:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:57:29.795643331 +0000 UTC m=+5660.966502934" watchObservedRunningTime="2025-12-02 08:57:29.80011506 +0000 UTC m=+5660.970974673" Dec 02 08:57:29 crc kubenswrapper[4895]: I1202 08:57:29.812939 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"76e3c0b2-3994-476b-aead-29d0c1a7d7ce","Type":"ContainerStarted","Data":"7e9d737b175822cdc45bd8e52baeb99570dfc11e6cd8cd418de28a8a4ee982c7"} Dec 02 08:57:29 crc kubenswrapper[4895]: I1202 08:57:29.822873 4895 generic.go:334] "Generic (PLEG): container finished" podID="d76b17de-f566-4259-b60e-95a57cb3a975" containerID="2e03e816feaf1535197df409554dc6014bea53fc1cd377d98c18dd7b8a6e9c3d" exitCode=0 Dec 02 08:57:29 crc kubenswrapper[4895]: I1202 08:57:29.823000 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-6bc9f46b9f-5tt2h" event={"ID":"d76b17de-f566-4259-b60e-95a57cb3a975","Type":"ContainerDied","Data":"2e03e816feaf1535197df409554dc6014bea53fc1cd377d98c18dd7b8a6e9c3d"} Dec 02 08:57:29 crc kubenswrapper[4895]: I1202 08:57:29.823306 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc9f46b9f-5tt2h" event={"ID":"d76b17de-f566-4259-b60e-95a57cb3a975","Type":"ContainerStarted","Data":"7c555fa887686e788a5df3e67726685b5c66bb0c66c71632470e4497e32a4b24"} Dec 02 08:57:29 crc kubenswrapper[4895]: I1202 08:57:29.838173 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-mkdft" podStartSLOduration=2.838145673 podStartE2EDuration="2.838145673s" podCreationTimestamp="2025-12-02 08:57:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:57:29.819252315 +0000 UTC m=+5660.990111948" watchObservedRunningTime="2025-12-02 08:57:29.838145673 +0000 UTC m=+5661.009005286" Dec 02 08:57:29 crc kubenswrapper[4895]: I1202 08:57:29.852133 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.852108298 podStartE2EDuration="2.852108298s" podCreationTimestamp="2025-12-02 08:57:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:57:29.836321906 +0000 UTC m=+5661.007181529" watchObservedRunningTime="2025-12-02 08:57:29.852108298 +0000 UTC m=+5661.022967911" Dec 02 08:57:29 crc kubenswrapper[4895]: I1202 08:57:29.864469 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-m4pzc" event={"ID":"ddf2c8d7-9918-4162-86a3-68074211ecdb","Type":"ContainerStarted","Data":"c85f8a02377a17775fd87ae296d1c9c44f3fcde01024712c1a04d335b14d988c"} Dec 02 08:57:29 crc kubenswrapper[4895]: I1202 
08:57:29.873282 4895 scope.go:117] "RemoveContainer" containerID="39f727fa570af71bfb0738805b753dfa3fc3507325e32440311d039152c4d455" Dec 02 08:57:29 crc kubenswrapper[4895]: I1202 08:57:29.895470 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.895448706 podStartE2EDuration="2.895448706s" podCreationTimestamp="2025-12-02 08:57:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:57:29.876087233 +0000 UTC m=+5661.046946856" watchObservedRunningTime="2025-12-02 08:57:29.895448706 +0000 UTC m=+5661.066308329" Dec 02 08:57:30 crc kubenswrapper[4895]: I1202 08:57:30.876940 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc9f46b9f-5tt2h" event={"ID":"d76b17de-f566-4259-b60e-95a57cb3a975","Type":"ContainerStarted","Data":"3dd932aa59ba74a12d52b102faaf3f766798dbd7a056d06698c0da56d97dd2b9"} Dec 02 08:57:30 crc kubenswrapper[4895]: I1202 08:57:30.877445 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6bc9f46b9f-5tt2h" Dec 02 08:57:30 crc kubenswrapper[4895]: I1202 08:57:30.878787 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-m4pzc" event={"ID":"ddf2c8d7-9918-4162-86a3-68074211ecdb","Type":"ContainerStarted","Data":"08764e8907291fc0cd589e94ecb5de2e6a06891b5ff9971afcf38599b9f62c61"} Dec 02 08:57:30 crc kubenswrapper[4895]: I1202 08:57:30.882578 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"05e0834d-d816-442d-b6de-66fe0ce1704a","Type":"ContainerStarted","Data":"8545688732abb57f6a0dce94bc0cc1b0822b82b017eb4b7da5ab2baaef557276"} Dec 02 08:57:30 crc kubenswrapper[4895]: I1202 08:57:30.900064 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6bc9f46b9f-5tt2h" 
podStartSLOduration=3.9000451590000003 podStartE2EDuration="3.900045159s" podCreationTimestamp="2025-12-02 08:57:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:57:30.897165689 +0000 UTC m=+5662.068025302" watchObservedRunningTime="2025-12-02 08:57:30.900045159 +0000 UTC m=+5662.070904782" Dec 02 08:57:30 crc kubenswrapper[4895]: I1202 08:57:30.917966 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-m4pzc" podStartSLOduration=2.917893374 podStartE2EDuration="2.917893374s" podCreationTimestamp="2025-12-02 08:57:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:57:30.913260881 +0000 UTC m=+5662.084120514" watchObservedRunningTime="2025-12-02 08:57:30.917893374 +0000 UTC m=+5662.088752987" Dec 02 08:57:30 crc kubenswrapper[4895]: I1202 08:57:30.935671 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.935654187 podStartE2EDuration="3.935654187s" podCreationTimestamp="2025-12-02 08:57:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:57:30.933900062 +0000 UTC m=+5662.104759705" watchObservedRunningTime="2025-12-02 08:57:30.935654187 +0000 UTC m=+5662.106513810" Dec 02 08:57:32 crc kubenswrapper[4895]: I1202 08:57:32.940894 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 02 08:57:33 crc kubenswrapper[4895]: I1202 08:57:33.267578 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 02 08:57:33 crc kubenswrapper[4895]: I1202 08:57:33.303270 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/nova-metadata-0"
Dec 02 08:57:33 crc kubenswrapper[4895]: I1202 08:57:33.303361 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Dec 02 08:57:33 crc kubenswrapper[4895]: I1202 08:57:33.920819 4895 generic.go:334] "Generic (PLEG): container finished" podID="ddf2c8d7-9918-4162-86a3-68074211ecdb" containerID="08764e8907291fc0cd589e94ecb5de2e6a06891b5ff9971afcf38599b9f62c61" exitCode=0
Dec 02 08:57:33 crc kubenswrapper[4895]: I1202 08:57:33.920880 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-m4pzc" event={"ID":"ddf2c8d7-9918-4162-86a3-68074211ecdb","Type":"ContainerDied","Data":"08764e8907291fc0cd589e94ecb5de2e6a06891b5ff9971afcf38599b9f62c61"}
Dec 02 08:57:34 crc kubenswrapper[4895]: I1202 08:57:34.934304 4895 generic.go:334] "Generic (PLEG): container finished" podID="9bf5d46f-feea-4549-ad6c-3bf285b528ff" containerID="cd8e6ec08927f3cbf7386adde680b90d0ef08c316bfaa14fc6145f460b789bba" exitCode=0
Dec 02 08:57:34 crc kubenswrapper[4895]: I1202 08:57:34.934400 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-mkdft" event={"ID":"9bf5d46f-feea-4549-ad6c-3bf285b528ff","Type":"ContainerDied","Data":"cd8e6ec08927f3cbf7386adde680b90d0ef08c316bfaa14fc6145f460b789bba"}
Dec 02 08:57:35 crc kubenswrapper[4895]: I1202 08:57:35.316868 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-m4pzc"
Dec 02 08:57:35 crc kubenswrapper[4895]: I1202 08:57:35.460553 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddf2c8d7-9918-4162-86a3-68074211ecdb-combined-ca-bundle\") pod \"ddf2c8d7-9918-4162-86a3-68074211ecdb\" (UID: \"ddf2c8d7-9918-4162-86a3-68074211ecdb\") "
Dec 02 08:57:35 crc kubenswrapper[4895]: I1202 08:57:35.460900 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ddf2c8d7-9918-4162-86a3-68074211ecdb-scripts\") pod \"ddf2c8d7-9918-4162-86a3-68074211ecdb\" (UID: \"ddf2c8d7-9918-4162-86a3-68074211ecdb\") "
Dec 02 08:57:35 crc kubenswrapper[4895]: I1202 08:57:35.460955 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fv5qw\" (UniqueName: \"kubernetes.io/projected/ddf2c8d7-9918-4162-86a3-68074211ecdb-kube-api-access-fv5qw\") pod \"ddf2c8d7-9918-4162-86a3-68074211ecdb\" (UID: \"ddf2c8d7-9918-4162-86a3-68074211ecdb\") "
Dec 02 08:57:35 crc kubenswrapper[4895]: I1202 08:57:35.461060 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ddf2c8d7-9918-4162-86a3-68074211ecdb-config-data\") pod \"ddf2c8d7-9918-4162-86a3-68074211ecdb\" (UID: \"ddf2c8d7-9918-4162-86a3-68074211ecdb\") "
Dec 02 08:57:35 crc kubenswrapper[4895]: I1202 08:57:35.466586 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddf2c8d7-9918-4162-86a3-68074211ecdb-scripts" (OuterVolumeSpecName: "scripts") pod "ddf2c8d7-9918-4162-86a3-68074211ecdb" (UID: "ddf2c8d7-9918-4162-86a3-68074211ecdb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 08:57:35 crc kubenswrapper[4895]: I1202 08:57:35.467180 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ddf2c8d7-9918-4162-86a3-68074211ecdb-kube-api-access-fv5qw" (OuterVolumeSpecName: "kube-api-access-fv5qw") pod "ddf2c8d7-9918-4162-86a3-68074211ecdb" (UID: "ddf2c8d7-9918-4162-86a3-68074211ecdb"). InnerVolumeSpecName "kube-api-access-fv5qw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 08:57:35 crc kubenswrapper[4895]: I1202 08:57:35.473803 4895 patch_prober.go:28] interesting pod/machine-config-daemon-wfcg7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 02 08:57:35 crc kubenswrapper[4895]: I1202 08:57:35.473869 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 02 08:57:35 crc kubenswrapper[4895]: I1202 08:57:35.488180 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddf2c8d7-9918-4162-86a3-68074211ecdb-config-data" (OuterVolumeSpecName: "config-data") pod "ddf2c8d7-9918-4162-86a3-68074211ecdb" (UID: "ddf2c8d7-9918-4162-86a3-68074211ecdb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 08:57:35 crc kubenswrapper[4895]: I1202 08:57:35.491913 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddf2c8d7-9918-4162-86a3-68074211ecdb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ddf2c8d7-9918-4162-86a3-68074211ecdb" (UID: "ddf2c8d7-9918-4162-86a3-68074211ecdb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 08:57:35 crc kubenswrapper[4895]: I1202 08:57:35.564218 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddf2c8d7-9918-4162-86a3-68074211ecdb-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 02 08:57:35 crc kubenswrapper[4895]: I1202 08:57:35.564534 4895 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ddf2c8d7-9918-4162-86a3-68074211ecdb-scripts\") on node \"crc\" DevicePath \"\""
Dec 02 08:57:35 crc kubenswrapper[4895]: I1202 08:57:35.564581 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fv5qw\" (UniqueName: \"kubernetes.io/projected/ddf2c8d7-9918-4162-86a3-68074211ecdb-kube-api-access-fv5qw\") on node \"crc\" DevicePath \"\""
Dec 02 08:57:35 crc kubenswrapper[4895]: I1202 08:57:35.564600 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ddf2c8d7-9918-4162-86a3-68074211ecdb-config-data\") on node \"crc\" DevicePath \"\""
Dec 02 08:57:35 crc kubenswrapper[4895]: I1202 08:57:35.946911 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-m4pzc" event={"ID":"ddf2c8d7-9918-4162-86a3-68074211ecdb","Type":"ContainerDied","Data":"c85f8a02377a17775fd87ae296d1c9c44f3fcde01024712c1a04d335b14d988c"}
Dec 02 08:57:35 crc kubenswrapper[4895]: I1202 08:57:35.946970 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c85f8a02377a17775fd87ae296d1c9c44f3fcde01024712c1a04d335b14d988c"
Dec 02 08:57:35 crc kubenswrapper[4895]: I1202 08:57:35.947006 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-m4pzc"
Dec 02 08:57:36 crc kubenswrapper[4895]: I1202 08:57:36.032708 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"]
Dec 02 08:57:36 crc kubenswrapper[4895]: E1202 08:57:36.035052 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddf2c8d7-9918-4162-86a3-68074211ecdb" containerName="nova-cell1-conductor-db-sync"
Dec 02 08:57:36 crc kubenswrapper[4895]: I1202 08:57:36.035088 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddf2c8d7-9918-4162-86a3-68074211ecdb" containerName="nova-cell1-conductor-db-sync"
Dec 02 08:57:36 crc kubenswrapper[4895]: I1202 08:57:36.035353 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="ddf2c8d7-9918-4162-86a3-68074211ecdb" containerName="nova-cell1-conductor-db-sync"
Dec 02 08:57:36 crc kubenswrapper[4895]: I1202 08:57:36.086188 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Dec 02 08:57:36 crc kubenswrapper[4895]: I1202 08:57:36.099134 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data"
Dec 02 08:57:36 crc kubenswrapper[4895]: I1202 08:57:36.121025 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Dec 02 08:57:36 crc kubenswrapper[4895]: I1202 08:57:36.183145 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc4ee3e5-734a-43c6-86b5-8779253d5857-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"cc4ee3e5-734a-43c6-86b5-8779253d5857\") " pod="openstack/nova-cell1-conductor-0"
Dec 02 08:57:36 crc kubenswrapper[4895]: I1202 08:57:36.186163 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc4ee3e5-734a-43c6-86b5-8779253d5857-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"cc4ee3e5-734a-43c6-86b5-8779253d5857\") " pod="openstack/nova-cell1-conductor-0"
Dec 02 08:57:36 crc kubenswrapper[4895]: I1202 08:57:36.186327 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28psx\" (UniqueName: \"kubernetes.io/projected/cc4ee3e5-734a-43c6-86b5-8779253d5857-kube-api-access-28psx\") pod \"nova-cell1-conductor-0\" (UID: \"cc4ee3e5-734a-43c6-86b5-8779253d5857\") " pod="openstack/nova-cell1-conductor-0"
Dec 02 08:57:36 crc kubenswrapper[4895]: I1202 08:57:36.287967 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc4ee3e5-734a-43c6-86b5-8779253d5857-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"cc4ee3e5-734a-43c6-86b5-8779253d5857\") " pod="openstack/nova-cell1-conductor-0"
Dec 02 08:57:36 crc kubenswrapper[4895]: I1202 08:57:36.288041 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28psx\" (UniqueName: \"kubernetes.io/projected/cc4ee3e5-734a-43c6-86b5-8779253d5857-kube-api-access-28psx\") pod \"nova-cell1-conductor-0\" (UID: \"cc4ee3e5-734a-43c6-86b5-8779253d5857\") " pod="openstack/nova-cell1-conductor-0"
Dec 02 08:57:36 crc kubenswrapper[4895]: I1202 08:57:36.288150 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc4ee3e5-734a-43c6-86b5-8779253d5857-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"cc4ee3e5-734a-43c6-86b5-8779253d5857\") " pod="openstack/nova-cell1-conductor-0"
Dec 02 08:57:36 crc kubenswrapper[4895]: I1202 08:57:36.296463 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc4ee3e5-734a-43c6-86b5-8779253d5857-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"cc4ee3e5-734a-43c6-86b5-8779253d5857\") " pod="openstack/nova-cell1-conductor-0"
Dec 02 08:57:36 crc kubenswrapper[4895]: I1202 08:57:36.308692 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc4ee3e5-734a-43c6-86b5-8779253d5857-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"cc4ee3e5-734a-43c6-86b5-8779253d5857\") " pod="openstack/nova-cell1-conductor-0"
Dec 02 08:57:36 crc kubenswrapper[4895]: I1202 08:57:36.312903 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28psx\" (UniqueName: \"kubernetes.io/projected/cc4ee3e5-734a-43c6-86b5-8779253d5857-kube-api-access-28psx\") pod \"nova-cell1-conductor-0\" (UID: \"cc4ee3e5-734a-43c6-86b5-8779253d5857\") " pod="openstack/nova-cell1-conductor-0"
Dec 02 08:57:36 crc kubenswrapper[4895]: I1202 08:57:36.406583 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-mkdft"
Dec 02 08:57:36 crc kubenswrapper[4895]: I1202 08:57:36.424618 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Dec 02 08:57:36 crc kubenswrapper[4895]: I1202 08:57:36.593172 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9bf5d46f-feea-4549-ad6c-3bf285b528ff-config-data\") pod \"9bf5d46f-feea-4549-ad6c-3bf285b528ff\" (UID: \"9bf5d46f-feea-4549-ad6c-3bf285b528ff\") "
Dec 02 08:57:36 crc kubenswrapper[4895]: I1202 08:57:36.593737 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c965d\" (UniqueName: \"kubernetes.io/projected/9bf5d46f-feea-4549-ad6c-3bf285b528ff-kube-api-access-c965d\") pod \"9bf5d46f-feea-4549-ad6c-3bf285b528ff\" (UID: \"9bf5d46f-feea-4549-ad6c-3bf285b528ff\") "
Dec 02 08:57:36 crc kubenswrapper[4895]: I1202 08:57:36.593838 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9bf5d46f-feea-4549-ad6c-3bf285b528ff-scripts\") pod \"9bf5d46f-feea-4549-ad6c-3bf285b528ff\" (UID: \"9bf5d46f-feea-4549-ad6c-3bf285b528ff\") "
Dec 02 08:57:36 crc kubenswrapper[4895]: I1202 08:57:36.593889 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bf5d46f-feea-4549-ad6c-3bf285b528ff-combined-ca-bundle\") pod \"9bf5d46f-feea-4549-ad6c-3bf285b528ff\" (UID: \"9bf5d46f-feea-4549-ad6c-3bf285b528ff\") "
Dec 02 08:57:36 crc kubenswrapper[4895]: I1202 08:57:36.599833 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9bf5d46f-feea-4549-ad6c-3bf285b528ff-scripts" (OuterVolumeSpecName: "scripts") pod "9bf5d46f-feea-4549-ad6c-3bf285b528ff" (UID: "9bf5d46f-feea-4549-ad6c-3bf285b528ff"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 08:57:36 crc kubenswrapper[4895]: I1202 08:57:36.613442 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9bf5d46f-feea-4549-ad6c-3bf285b528ff-kube-api-access-c965d" (OuterVolumeSpecName: "kube-api-access-c965d") pod "9bf5d46f-feea-4549-ad6c-3bf285b528ff" (UID: "9bf5d46f-feea-4549-ad6c-3bf285b528ff"). InnerVolumeSpecName "kube-api-access-c965d". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 08:57:36 crc kubenswrapper[4895]: I1202 08:57:36.622962 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9bf5d46f-feea-4549-ad6c-3bf285b528ff-config-data" (OuterVolumeSpecName: "config-data") pod "9bf5d46f-feea-4549-ad6c-3bf285b528ff" (UID: "9bf5d46f-feea-4549-ad6c-3bf285b528ff"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 08:57:36 crc kubenswrapper[4895]: I1202 08:57:36.642662 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9bf5d46f-feea-4549-ad6c-3bf285b528ff-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9bf5d46f-feea-4549-ad6c-3bf285b528ff" (UID: "9bf5d46f-feea-4549-ad6c-3bf285b528ff"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 08:57:36 crc kubenswrapper[4895]: I1202 08:57:36.696379 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c965d\" (UniqueName: \"kubernetes.io/projected/9bf5d46f-feea-4549-ad6c-3bf285b528ff-kube-api-access-c965d\") on node \"crc\" DevicePath \"\""
Dec 02 08:57:36 crc kubenswrapper[4895]: I1202 08:57:36.696424 4895 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9bf5d46f-feea-4549-ad6c-3bf285b528ff-scripts\") on node \"crc\" DevicePath \"\""
Dec 02 08:57:36 crc kubenswrapper[4895]: I1202 08:57:36.696435 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bf5d46f-feea-4549-ad6c-3bf285b528ff-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 02 08:57:36 crc kubenswrapper[4895]: I1202 08:57:36.696444 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9bf5d46f-feea-4549-ad6c-3bf285b528ff-config-data\") on node \"crc\" DevicePath \"\""
Dec 02 08:57:36 crc kubenswrapper[4895]: I1202 08:57:36.956254 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Dec 02 08:57:36 crc kubenswrapper[4895]: I1202 08:57:36.958059 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-mkdft" event={"ID":"9bf5d46f-feea-4549-ad6c-3bf285b528ff","Type":"ContainerDied","Data":"d042a04c10f34b0e62304bcbf410e7f007fac22899c65b9ee96a171953457324"}
Dec 02 08:57:36 crc kubenswrapper[4895]: I1202 08:57:36.958100 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d042a04c10f34b0e62304bcbf410e7f007fac22899c65b9ee96a171953457324"
Dec 02 08:57:36 crc kubenswrapper[4895]: I1202 08:57:36.958156 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-mkdft"
Dec 02 08:57:37 crc kubenswrapper[4895]: I1202 08:57:37.156607 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Dec 02 08:57:37 crc kubenswrapper[4895]: I1202 08:57:37.157318 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="c6f59f1d-2249-429e-986b-496235570ca2" containerName="nova-api-log" containerID="cri-o://a987d0f4eba145cc01e39438ddd34f2a8059cc02ad43e7ec5eb7cbc2e4e3143e" gracePeriod=30
Dec 02 08:57:37 crc kubenswrapper[4895]: I1202 08:57:37.157456 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="c6f59f1d-2249-429e-986b-496235570ca2" containerName="nova-api-api" containerID="cri-o://58855f0305fc651f474f5c03cd01086365d32dbfa7359a56aa632adc30282efb" gracePeriod=30
Dec 02 08:57:37 crc kubenswrapper[4895]: I1202 08:57:37.170585 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Dec 02 08:57:37 crc kubenswrapper[4895]: I1202 08:57:37.170892 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="0790d1b7-ff03-4e24-b039-c28f9ed62bc9" containerName="nova-scheduler-scheduler" containerID="cri-o://23c9ae2afaa444aea336f65ad1affa0d3ef2119339daa6dfad28b8fbd7bb7c45" gracePeriod=30
Dec 02 08:57:37 crc kubenswrapper[4895]: I1202 08:57:37.209953 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Dec 02 08:57:37 crc kubenswrapper[4895]: I1202 08:57:37.210238 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="05e0834d-d816-442d-b6de-66fe0ce1704a" containerName="nova-metadata-log" containerID="cri-o://64c5b686c3e5f022fd627d5bea51b074ae760bec8345fc633b6c242890067f0e" gracePeriod=30
Dec 02 08:57:37 crc kubenswrapper[4895]: I1202 08:57:37.210466 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="05e0834d-d816-442d-b6de-66fe0ce1704a" containerName="nova-metadata-metadata" containerID="cri-o://8545688732abb57f6a0dce94bc0cc1b0822b82b017eb4b7da5ab2baaef557276" gracePeriod=30
Dec 02 08:57:37 crc kubenswrapper[4895]: I1202 08:57:37.658668 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Dec 02 08:57:37 crc kubenswrapper[4895]: I1202 08:57:37.801336 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Dec 02 08:57:37 crc kubenswrapper[4895]: I1202 08:57:37.826218 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6f59f1d-2249-429e-986b-496235570ca2-config-data\") pod \"c6f59f1d-2249-429e-986b-496235570ca2\" (UID: \"c6f59f1d-2249-429e-986b-496235570ca2\") "
Dec 02 08:57:37 crc kubenswrapper[4895]: I1202 08:57:37.826287 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c6f59f1d-2249-429e-986b-496235570ca2-logs\") pod \"c6f59f1d-2249-429e-986b-496235570ca2\" (UID: \"c6f59f1d-2249-429e-986b-496235570ca2\") "
Dec 02 08:57:37 crc kubenswrapper[4895]: I1202 08:57:37.826414 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lsw97\" (UniqueName: \"kubernetes.io/projected/c6f59f1d-2249-429e-986b-496235570ca2-kube-api-access-lsw97\") pod \"c6f59f1d-2249-429e-986b-496235570ca2\" (UID: \"c6f59f1d-2249-429e-986b-496235570ca2\") "
Dec 02 08:57:37 crc kubenswrapper[4895]: I1202 08:57:37.826471 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6f59f1d-2249-429e-986b-496235570ca2-combined-ca-bundle\") pod \"c6f59f1d-2249-429e-986b-496235570ca2\" (UID: \"c6f59f1d-2249-429e-986b-496235570ca2\") "
Dec 02 08:57:37 crc kubenswrapper[4895]: I1202 08:57:37.827121 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c6f59f1d-2249-429e-986b-496235570ca2-logs" (OuterVolumeSpecName: "logs") pod "c6f59f1d-2249-429e-986b-496235570ca2" (UID: "c6f59f1d-2249-429e-986b-496235570ca2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 02 08:57:37 crc kubenswrapper[4895]: I1202 08:57:37.833336 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6f59f1d-2249-429e-986b-496235570ca2-kube-api-access-lsw97" (OuterVolumeSpecName: "kube-api-access-lsw97") pod "c6f59f1d-2249-429e-986b-496235570ca2" (UID: "c6f59f1d-2249-429e-986b-496235570ca2"). InnerVolumeSpecName "kube-api-access-lsw97". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 08:57:37 crc kubenswrapper[4895]: I1202 08:57:37.856237 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6f59f1d-2249-429e-986b-496235570ca2-config-data" (OuterVolumeSpecName: "config-data") pod "c6f59f1d-2249-429e-986b-496235570ca2" (UID: "c6f59f1d-2249-429e-986b-496235570ca2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 08:57:37 crc kubenswrapper[4895]: I1202 08:57:37.872429 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6f59f1d-2249-429e-986b-496235570ca2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c6f59f1d-2249-429e-986b-496235570ca2" (UID: "c6f59f1d-2249-429e-986b-496235570ca2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 08:57:37 crc kubenswrapper[4895]: I1202 08:57:37.928447 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/05e0834d-d816-442d-b6de-66fe0ce1704a-logs\") pod \"05e0834d-d816-442d-b6de-66fe0ce1704a\" (UID: \"05e0834d-d816-442d-b6de-66fe0ce1704a\") "
Dec 02 08:57:37 crc kubenswrapper[4895]: I1202 08:57:37.928832 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05e0834d-d816-442d-b6de-66fe0ce1704a-config-data\") pod \"05e0834d-d816-442d-b6de-66fe0ce1704a\" (UID: \"05e0834d-d816-442d-b6de-66fe0ce1704a\") "
Dec 02 08:57:37 crc kubenswrapper[4895]: I1202 08:57:37.928875 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05e0834d-d816-442d-b6de-66fe0ce1704a-combined-ca-bundle\") pod \"05e0834d-d816-442d-b6de-66fe0ce1704a\" (UID: \"05e0834d-d816-442d-b6de-66fe0ce1704a\") "
Dec 02 08:57:37 crc kubenswrapper[4895]: I1202 08:57:37.928989 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hh4mk\" (UniqueName: \"kubernetes.io/projected/05e0834d-d816-442d-b6de-66fe0ce1704a-kube-api-access-hh4mk\") pod \"05e0834d-d816-442d-b6de-66fe0ce1704a\" (UID: \"05e0834d-d816-442d-b6de-66fe0ce1704a\") "
Dec 02 08:57:37 crc kubenswrapper[4895]: I1202 08:57:37.929259 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05e0834d-d816-442d-b6de-66fe0ce1704a-logs" (OuterVolumeSpecName: "logs") pod "05e0834d-d816-442d-b6de-66fe0ce1704a" (UID: "05e0834d-d816-442d-b6de-66fe0ce1704a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 02 08:57:37 crc kubenswrapper[4895]: I1202 08:57:37.929333 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lsw97\" (UniqueName: \"kubernetes.io/projected/c6f59f1d-2249-429e-986b-496235570ca2-kube-api-access-lsw97\") on node \"crc\" DevicePath \"\""
Dec 02 08:57:37 crc kubenswrapper[4895]: I1202 08:57:37.929350 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6f59f1d-2249-429e-986b-496235570ca2-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 02 08:57:37 crc kubenswrapper[4895]: I1202 08:57:37.929359 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6f59f1d-2249-429e-986b-496235570ca2-config-data\") on node \"crc\" DevicePath \"\""
Dec 02 08:57:37 crc kubenswrapper[4895]: I1202 08:57:37.929369 4895 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c6f59f1d-2249-429e-986b-496235570ca2-logs\") on node \"crc\" DevicePath \"\""
Dec 02 08:57:37 crc kubenswrapper[4895]: I1202 08:57:37.932325 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05e0834d-d816-442d-b6de-66fe0ce1704a-kube-api-access-hh4mk" (OuterVolumeSpecName: "kube-api-access-hh4mk") pod "05e0834d-d816-442d-b6de-66fe0ce1704a" (UID: "05e0834d-d816-442d-b6de-66fe0ce1704a"). InnerVolumeSpecName "kube-api-access-hh4mk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 08:57:37 crc kubenswrapper[4895]: I1202 08:57:37.940922 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0"
Dec 02 08:57:37 crc kubenswrapper[4895]: I1202 08:57:37.954494 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0"
Dec 02 08:57:37 crc kubenswrapper[4895]: I1202 08:57:37.955165 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05e0834d-d816-442d-b6de-66fe0ce1704a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "05e0834d-d816-442d-b6de-66fe0ce1704a" (UID: "05e0834d-d816-442d-b6de-66fe0ce1704a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 08:57:37 crc kubenswrapper[4895]: I1202 08:57:37.966966 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05e0834d-d816-442d-b6de-66fe0ce1704a-config-data" (OuterVolumeSpecName: "config-data") pod "05e0834d-d816-442d-b6de-66fe0ce1704a" (UID: "05e0834d-d816-442d-b6de-66fe0ce1704a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 08:57:37 crc kubenswrapper[4895]: I1202 08:57:37.971661 4895 generic.go:334] "Generic (PLEG): container finished" podID="c6f59f1d-2249-429e-986b-496235570ca2" containerID="58855f0305fc651f474f5c03cd01086365d32dbfa7359a56aa632adc30282efb" exitCode=0
Dec 02 08:57:37 crc kubenswrapper[4895]: I1202 08:57:37.971874 4895 generic.go:334] "Generic (PLEG): container finished" podID="c6f59f1d-2249-429e-986b-496235570ca2" containerID="a987d0f4eba145cc01e39438ddd34f2a8059cc02ad43e7ec5eb7cbc2e4e3143e" exitCode=143
Dec 02 08:57:37 crc kubenswrapper[4895]: I1202 08:57:37.972092 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c6f59f1d-2249-429e-986b-496235570ca2","Type":"ContainerDied","Data":"58855f0305fc651f474f5c03cd01086365d32dbfa7359a56aa632adc30282efb"}
Dec 02 08:57:37 crc kubenswrapper[4895]: I1202 08:57:37.972202 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c6f59f1d-2249-429e-986b-496235570ca2","Type":"ContainerDied","Data":"a987d0f4eba145cc01e39438ddd34f2a8059cc02ad43e7ec5eb7cbc2e4e3143e"}
Dec 02 08:57:37 crc kubenswrapper[4895]: I1202 08:57:37.972299 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c6f59f1d-2249-429e-986b-496235570ca2","Type":"ContainerDied","Data":"2f206566df3687f8b0707258a08e30f8d776d13376e72a18f5887cbb92477645"}
Dec 02 08:57:37 crc kubenswrapper[4895]: I1202 08:57:37.972394 4895 scope.go:117] "RemoveContainer" containerID="58855f0305fc651f474f5c03cd01086365d32dbfa7359a56aa632adc30282efb"
Dec 02 08:57:37 crc kubenswrapper[4895]: I1202 08:57:37.972792 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Dec 02 08:57:37 crc kubenswrapper[4895]: I1202 08:57:37.984140 4895 generic.go:334] "Generic (PLEG): container finished" podID="05e0834d-d816-442d-b6de-66fe0ce1704a" containerID="8545688732abb57f6a0dce94bc0cc1b0822b82b017eb4b7da5ab2baaef557276" exitCode=0
Dec 02 08:57:37 crc kubenswrapper[4895]: I1202 08:57:37.984181 4895 generic.go:334] "Generic (PLEG): container finished" podID="05e0834d-d816-442d-b6de-66fe0ce1704a" containerID="64c5b686c3e5f022fd627d5bea51b074ae760bec8345fc633b6c242890067f0e" exitCode=143
Dec 02 08:57:37 crc kubenswrapper[4895]: I1202 08:57:37.984234 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"05e0834d-d816-442d-b6de-66fe0ce1704a","Type":"ContainerDied","Data":"8545688732abb57f6a0dce94bc0cc1b0822b82b017eb4b7da5ab2baaef557276"}
Dec 02 08:57:37 crc kubenswrapper[4895]: I1202 08:57:37.984269 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"05e0834d-d816-442d-b6de-66fe0ce1704a","Type":"ContainerDied","Data":"64c5b686c3e5f022fd627d5bea51b074ae760bec8345fc633b6c242890067f0e"}
Dec 02 08:57:37 crc kubenswrapper[4895]: I1202 08:57:37.984283 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"05e0834d-d816-442d-b6de-66fe0ce1704a","Type":"ContainerDied","Data":"3894fe927fac5ba70804789d53a9b6bca1f0f4c68e6c228f3a2e629347f73781"}
Dec 02 08:57:37 crc kubenswrapper[4895]: I1202 08:57:37.984356 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Dec 02 08:57:37 crc kubenswrapper[4895]: I1202 08:57:37.992382 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"cc4ee3e5-734a-43c6-86b5-8779253d5857","Type":"ContainerStarted","Data":"2186033d0fbed000930850f33b94c4907eba9bc7a265da6f3fe910adc160235b"}
Dec 02 08:57:37 crc kubenswrapper[4895]: I1202 08:57:37.992441 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"cc4ee3e5-734a-43c6-86b5-8779253d5857","Type":"ContainerStarted","Data":"9c072549133c766a309b1951b8178f18de784e8bfe0656a354f8676ae4a4f122"}
Dec 02 08:57:37 crc kubenswrapper[4895]: I1202 08:57:37.992490 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0"
Dec 02 08:57:38 crc kubenswrapper[4895]: I1202 08:57:38.000739 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0"
Dec 02 08:57:38 crc kubenswrapper[4895]: I1202 08:57:38.010005 4895 scope.go:117] "RemoveContainer" containerID="a987d0f4eba145cc01e39438ddd34f2a8059cc02ad43e7ec5eb7cbc2e4e3143e"
Dec 02 08:57:38 crc kubenswrapper[4895]: I1202 08:57:38.020938 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.020917541 podStartE2EDuration="2.020917541s" podCreationTimestamp="2025-12-02 08:57:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:57:38.014048837 +0000 UTC m=+5669.184908470" watchObservedRunningTime="2025-12-02 08:57:38.020917541 +0000 UTC m=+5669.191777154"
Dec 02 08:57:38 crc kubenswrapper[4895]: I1202 08:57:38.032338 4895 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/05e0834d-d816-442d-b6de-66fe0ce1704a-logs\") on node \"crc\" DevicePath \"\""
Dec 02 08:57:38 crc kubenswrapper[4895]: I1202 08:57:38.032370 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05e0834d-d816-442d-b6de-66fe0ce1704a-config-data\") on node \"crc\" DevicePath \"\""
Dec 02 08:57:38 crc kubenswrapper[4895]: I1202 08:57:38.032380 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05e0834d-d816-442d-b6de-66fe0ce1704a-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 02 08:57:38 crc kubenswrapper[4895]: I1202 08:57:38.032390 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hh4mk\" (UniqueName: \"kubernetes.io/projected/05e0834d-d816-442d-b6de-66fe0ce1704a-kube-api-access-hh4mk\") on node \"crc\" DevicePath \"\""
Dec 02 08:57:38 crc kubenswrapper[4895]: I1202 08:57:38.062517 4895 scope.go:117] "RemoveContainer" containerID="58855f0305fc651f474f5c03cd01086365d32dbfa7359a56aa632adc30282efb"
Dec 02 08:57:38 crc kubenswrapper[4895]: E1202 08:57:38.063340 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58855f0305fc651f474f5c03cd01086365d32dbfa7359a56aa632adc30282efb\": container with ID starting with 58855f0305fc651f474f5c03cd01086365d32dbfa7359a56aa632adc30282efb not found: ID does not exist" containerID="58855f0305fc651f474f5c03cd01086365d32dbfa7359a56aa632adc30282efb"
Dec 02 08:57:38 crc kubenswrapper[4895]: I1202 08:57:38.063380 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58855f0305fc651f474f5c03cd01086365d32dbfa7359a56aa632adc30282efb"} err="failed to get container status \"58855f0305fc651f474f5c03cd01086365d32dbfa7359a56aa632adc30282efb\": rpc error: code = NotFound desc = could not find container \"58855f0305fc651f474f5c03cd01086365d32dbfa7359a56aa632adc30282efb\": container with ID starting with 58855f0305fc651f474f5c03cd01086365d32dbfa7359a56aa632adc30282efb not found: ID does not exist"
Dec 02 08:57:38 crc kubenswrapper[4895]: I1202 08:57:38.063415 4895 scope.go:117] "RemoveContainer" containerID="a987d0f4eba145cc01e39438ddd34f2a8059cc02ad43e7ec5eb7cbc2e4e3143e"
Dec 02 08:57:38 crc kubenswrapper[4895]: E1202 08:57:38.063646 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a987d0f4eba145cc01e39438ddd34f2a8059cc02ad43e7ec5eb7cbc2e4e3143e\": container with ID starting with a987d0f4eba145cc01e39438ddd34f2a8059cc02ad43e7ec5eb7cbc2e4e3143e not found: ID does not exist" containerID="a987d0f4eba145cc01e39438ddd34f2a8059cc02ad43e7ec5eb7cbc2e4e3143e"
Dec 02 08:57:38 crc kubenswrapper[4895]: I1202 08:57:38.063671 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a987d0f4eba145cc01e39438ddd34f2a8059cc02ad43e7ec5eb7cbc2e4e3143e"} err="failed to get container status \"a987d0f4eba145cc01e39438ddd34f2a8059cc02ad43e7ec5eb7cbc2e4e3143e\": rpc error: code = NotFound desc = could not find container \"a987d0f4eba145cc01e39438ddd34f2a8059cc02ad43e7ec5eb7cbc2e4e3143e\": container with ID starting with a987d0f4eba145cc01e39438ddd34f2a8059cc02ad43e7ec5eb7cbc2e4e3143e not found: ID does not exist"
Dec 02 08:57:38 crc kubenswrapper[4895]: I1202 08:57:38.063690 4895 scope.go:117] "RemoveContainer" containerID="58855f0305fc651f474f5c03cd01086365d32dbfa7359a56aa632adc30282efb"
Dec 02 08:57:38 crc kubenswrapper[4895]: I1202 08:57:38.064079 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58855f0305fc651f474f5c03cd01086365d32dbfa7359a56aa632adc30282efb"} err="failed to get container status \"58855f0305fc651f474f5c03cd01086365d32dbfa7359a56aa632adc30282efb\": rpc error: code = NotFound desc = could not find container \"58855f0305fc651f474f5c03cd01086365d32dbfa7359a56aa632adc30282efb\": container with ID starting with 58855f0305fc651f474f5c03cd01086365d32dbfa7359a56aa632adc30282efb not found: ID does not exist"
Dec 02 08:57:38 crc kubenswrapper[4895]: I1202 08:57:38.064101 4895 scope.go:117] "RemoveContainer" containerID="a987d0f4eba145cc01e39438ddd34f2a8059cc02ad43e7ec5eb7cbc2e4e3143e"
Dec 02 08:57:38 crc kubenswrapper[4895]: I1202 08:57:38.064364 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a987d0f4eba145cc01e39438ddd34f2a8059cc02ad43e7ec5eb7cbc2e4e3143e"} err="failed to get container status \"a987d0f4eba145cc01e39438ddd34f2a8059cc02ad43e7ec5eb7cbc2e4e3143e\": rpc error: code = NotFound desc = could not find container \"a987d0f4eba145cc01e39438ddd34f2a8059cc02ad43e7ec5eb7cbc2e4e3143e\": container with ID starting with a987d0f4eba145cc01e39438ddd34f2a8059cc02ad43e7ec5eb7cbc2e4e3143e not found: ID does not exist"
Dec 02 08:57:38 crc kubenswrapper[4895]: I1202 08:57:38.064388 4895 scope.go:117] "RemoveContainer" containerID="8545688732abb57f6a0dce94bc0cc1b0822b82b017eb4b7da5ab2baaef557276"
Dec 02 08:57:38 crc kubenswrapper[4895]: I1202 08:57:38.128020 4895 scope.go:117] "RemoveContainer" containerID="64c5b686c3e5f022fd627d5bea51b074ae760bec8345fc633b6c242890067f0e"
Dec 02 08:57:38 crc kubenswrapper[4895]: I1202 08:57:38.147284 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Dec 02 08:57:38 crc kubenswrapper[4895]: I1202 08:57:38.176827 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Dec 02 08:57:38 crc kubenswrapper[4895]: I1202 08:57:38.192837 4895 scope.go:117] "RemoveContainer" containerID="8545688732abb57f6a0dce94bc0cc1b0822b82b017eb4b7da5ab2baaef557276"
Dec 02 08:57:38 crc kubenswrapper[4895]: I1202 08:57:38.197368 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Dec 02 08:57:38 crc kubenswrapper[4895]: E1202 08:57:38.200193 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8545688732abb57f6a0dce94bc0cc1b0822b82b017eb4b7da5ab2baaef557276\": container with ID starting with 8545688732abb57f6a0dce94bc0cc1b0822b82b017eb4b7da5ab2baaef557276 not found: ID does not exist" containerID="8545688732abb57f6a0dce94bc0cc1b0822b82b017eb4b7da5ab2baaef557276"
Dec 02 08:57:38 crc kubenswrapper[4895]: I1202 08:57:38.200247 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8545688732abb57f6a0dce94bc0cc1b0822b82b017eb4b7da5ab2baaef557276"} err="failed to get container status \"8545688732abb57f6a0dce94bc0cc1b0822b82b017eb4b7da5ab2baaef557276\": rpc error: code = NotFound desc = could not find container \"8545688732abb57f6a0dce94bc0cc1b0822b82b017eb4b7da5ab2baaef557276\": container with ID starting with 8545688732abb57f6a0dce94bc0cc1b0822b82b017eb4b7da5ab2baaef557276 not found: ID does not exist"
Dec 02 08:57:38 crc kubenswrapper[4895]: I1202 08:57:38.200276 4895 scope.go:117] "RemoveContainer" containerID="64c5b686c3e5f022fd627d5bea51b074ae760bec8345fc633b6c242890067f0e"
Dec 02 08:57:38 crc kubenswrapper[4895]: E1202 08:57:38.204131 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64c5b686c3e5f022fd627d5bea51b074ae760bec8345fc633b6c242890067f0e\": container with ID starting with 64c5b686c3e5f022fd627d5bea51b074ae760bec8345fc633b6c242890067f0e not found: ID does not exist" containerID="64c5b686c3e5f022fd627d5bea51b074ae760bec8345fc633b6c242890067f0e"
Dec 02 08:57:38 crc kubenswrapper[4895]: I1202 08:57:38.204181 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64c5b686c3e5f022fd627d5bea51b074ae760bec8345fc633b6c242890067f0e"} err="failed to get container status \"64c5b686c3e5f022fd627d5bea51b074ae760bec8345fc633b6c242890067f0e\": rpc error: code = NotFound desc = could not find container
\"64c5b686c3e5f022fd627d5bea51b074ae760bec8345fc633b6c242890067f0e\": container with ID starting with 64c5b686c3e5f022fd627d5bea51b074ae760bec8345fc633b6c242890067f0e not found: ID does not exist" Dec 02 08:57:38 crc kubenswrapper[4895]: I1202 08:57:38.204217 4895 scope.go:117] "RemoveContainer" containerID="8545688732abb57f6a0dce94bc0cc1b0822b82b017eb4b7da5ab2baaef557276" Dec 02 08:57:38 crc kubenswrapper[4895]: I1202 08:57:38.209963 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8545688732abb57f6a0dce94bc0cc1b0822b82b017eb4b7da5ab2baaef557276"} err="failed to get container status \"8545688732abb57f6a0dce94bc0cc1b0822b82b017eb4b7da5ab2baaef557276\": rpc error: code = NotFound desc = could not find container \"8545688732abb57f6a0dce94bc0cc1b0822b82b017eb4b7da5ab2baaef557276\": container with ID starting with 8545688732abb57f6a0dce94bc0cc1b0822b82b017eb4b7da5ab2baaef557276 not found: ID does not exist" Dec 02 08:57:38 crc kubenswrapper[4895]: I1202 08:57:38.210024 4895 scope.go:117] "RemoveContainer" containerID="64c5b686c3e5f022fd627d5bea51b074ae760bec8345fc633b6c242890067f0e" Dec 02 08:57:38 crc kubenswrapper[4895]: I1202 08:57:38.211092 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64c5b686c3e5f022fd627d5bea51b074ae760bec8345fc633b6c242890067f0e"} err="failed to get container status \"64c5b686c3e5f022fd627d5bea51b074ae760bec8345fc633b6c242890067f0e\": rpc error: code = NotFound desc = could not find container \"64c5b686c3e5f022fd627d5bea51b074ae760bec8345fc633b6c242890067f0e\": container with ID starting with 64c5b686c3e5f022fd627d5bea51b074ae760bec8345fc633b6c242890067f0e not found: ID does not exist" Dec 02 08:57:38 crc kubenswrapper[4895]: I1202 08:57:38.232154 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 02 08:57:38 crc kubenswrapper[4895]: E1202 08:57:38.232700 4895 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="c6f59f1d-2249-429e-986b-496235570ca2" containerName="nova-api-api" Dec 02 08:57:38 crc kubenswrapper[4895]: I1202 08:57:38.232726 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6f59f1d-2249-429e-986b-496235570ca2" containerName="nova-api-api" Dec 02 08:57:38 crc kubenswrapper[4895]: E1202 08:57:38.232766 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05e0834d-d816-442d-b6de-66fe0ce1704a" containerName="nova-metadata-log" Dec 02 08:57:38 crc kubenswrapper[4895]: I1202 08:57:38.232774 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="05e0834d-d816-442d-b6de-66fe0ce1704a" containerName="nova-metadata-log" Dec 02 08:57:38 crc kubenswrapper[4895]: E1202 08:57:38.232791 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05e0834d-d816-442d-b6de-66fe0ce1704a" containerName="nova-metadata-metadata" Dec 02 08:57:38 crc kubenswrapper[4895]: I1202 08:57:38.232801 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="05e0834d-d816-442d-b6de-66fe0ce1704a" containerName="nova-metadata-metadata" Dec 02 08:57:38 crc kubenswrapper[4895]: E1202 08:57:38.232822 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6f59f1d-2249-429e-986b-496235570ca2" containerName="nova-api-log" Dec 02 08:57:38 crc kubenswrapper[4895]: I1202 08:57:38.232830 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6f59f1d-2249-429e-986b-496235570ca2" containerName="nova-api-log" Dec 02 08:57:38 crc kubenswrapper[4895]: E1202 08:57:38.232852 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bf5d46f-feea-4549-ad6c-3bf285b528ff" containerName="nova-manage" Dec 02 08:57:38 crc kubenswrapper[4895]: I1202 08:57:38.232857 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bf5d46f-feea-4549-ad6c-3bf285b528ff" containerName="nova-manage" Dec 02 08:57:38 crc kubenswrapper[4895]: I1202 08:57:38.233035 4895 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="c6f59f1d-2249-429e-986b-496235570ca2" containerName="nova-api-api" Dec 02 08:57:38 crc kubenswrapper[4895]: I1202 08:57:38.233049 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="9bf5d46f-feea-4549-ad6c-3bf285b528ff" containerName="nova-manage" Dec 02 08:57:38 crc kubenswrapper[4895]: I1202 08:57:38.233061 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6f59f1d-2249-429e-986b-496235570ca2" containerName="nova-api-log" Dec 02 08:57:38 crc kubenswrapper[4895]: I1202 08:57:38.233078 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="05e0834d-d816-442d-b6de-66fe0ce1704a" containerName="nova-metadata-log" Dec 02 08:57:38 crc kubenswrapper[4895]: I1202 08:57:38.233089 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="05e0834d-d816-442d-b6de-66fe0ce1704a" containerName="nova-metadata-metadata" Dec 02 08:57:38 crc kubenswrapper[4895]: I1202 08:57:38.234284 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 02 08:57:38 crc kubenswrapper[4895]: I1202 08:57:38.237001 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 02 08:57:38 crc kubenswrapper[4895]: I1202 08:57:38.252431 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 08:57:38 crc kubenswrapper[4895]: I1202 08:57:38.270465 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 02 08:57:38 crc kubenswrapper[4895]: I1202 08:57:38.283098 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 02 08:57:38 crc kubenswrapper[4895]: I1202 08:57:38.285173 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 02 08:57:38 crc kubenswrapper[4895]: I1202 08:57:38.288019 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 02 08:57:38 crc kubenswrapper[4895]: I1202 08:57:38.300500 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 08:57:38 crc kubenswrapper[4895]: I1202 08:57:38.325302 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6bc9f46b9f-5tt2h" Dec 02 08:57:38 crc kubenswrapper[4895]: I1202 08:57:38.356571 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5eee65f-af89-4505-9ab3-ce658f29d50a-config-data\") pod \"nova-api-0\" (UID: \"e5eee65f-af89-4505-9ab3-ce658f29d50a\") " pod="openstack/nova-api-0" Dec 02 08:57:38 crc kubenswrapper[4895]: I1202 08:57:38.356632 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9q7zs\" (UniqueName: \"kubernetes.io/projected/e5eee65f-af89-4505-9ab3-ce658f29d50a-kube-api-access-9q7zs\") pod \"nova-api-0\" (UID: \"e5eee65f-af89-4505-9ab3-ce658f29d50a\") " pod="openstack/nova-api-0" Dec 02 08:57:38 crc kubenswrapper[4895]: I1202 08:57:38.356668 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5eee65f-af89-4505-9ab3-ce658f29d50a-logs\") pod \"nova-api-0\" (UID: \"e5eee65f-af89-4505-9ab3-ce658f29d50a\") " pod="openstack/nova-api-0" Dec 02 08:57:38 crc kubenswrapper[4895]: I1202 08:57:38.356989 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5eee65f-af89-4505-9ab3-ce658f29d50a-combined-ca-bundle\") pod \"nova-api-0\" (UID: 
\"e5eee65f-af89-4505-9ab3-ce658f29d50a\") " pod="openstack/nova-api-0" Dec 02 08:57:38 crc kubenswrapper[4895]: I1202 08:57:38.407529 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-76dbf459b5-tdz8d"] Dec 02 08:57:38 crc kubenswrapper[4895]: I1202 08:57:38.407801 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-76dbf459b5-tdz8d" podUID="e49de284-87ac-44c3-b301-99b4a5262b56" containerName="dnsmasq-dns" containerID="cri-o://7b2a88b4b9839a421519c4f70694368ee2ed93b9e60d060bf21eb060ee65025f" gracePeriod=10 Dec 02 08:57:38 crc kubenswrapper[4895]: I1202 08:57:38.461925 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5eee65f-af89-4505-9ab3-ce658f29d50a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e5eee65f-af89-4505-9ab3-ce658f29d50a\") " pod="openstack/nova-api-0" Dec 02 08:57:38 crc kubenswrapper[4895]: I1202 08:57:38.462006 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqf78\" (UniqueName: \"kubernetes.io/projected/af0f2707-2c11-4b3e-bd6b-e6ff1f04a06b-kube-api-access-kqf78\") pod \"nova-metadata-0\" (UID: \"af0f2707-2c11-4b3e-bd6b-e6ff1f04a06b\") " pod="openstack/nova-metadata-0" Dec 02 08:57:38 crc kubenswrapper[4895]: I1202 08:57:38.462072 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af0f2707-2c11-4b3e-bd6b-e6ff1f04a06b-config-data\") pod \"nova-metadata-0\" (UID: \"af0f2707-2c11-4b3e-bd6b-e6ff1f04a06b\") " pod="openstack/nova-metadata-0" Dec 02 08:57:38 crc kubenswrapper[4895]: I1202 08:57:38.462174 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/af0f2707-2c11-4b3e-bd6b-e6ff1f04a06b-logs\") pod 
\"nova-metadata-0\" (UID: \"af0f2707-2c11-4b3e-bd6b-e6ff1f04a06b\") " pod="openstack/nova-metadata-0" Dec 02 08:57:38 crc kubenswrapper[4895]: I1202 08:57:38.462205 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5eee65f-af89-4505-9ab3-ce658f29d50a-config-data\") pod \"nova-api-0\" (UID: \"e5eee65f-af89-4505-9ab3-ce658f29d50a\") " pod="openstack/nova-api-0" Dec 02 08:57:38 crc kubenswrapper[4895]: I1202 08:57:38.462226 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9q7zs\" (UniqueName: \"kubernetes.io/projected/e5eee65f-af89-4505-9ab3-ce658f29d50a-kube-api-access-9q7zs\") pod \"nova-api-0\" (UID: \"e5eee65f-af89-4505-9ab3-ce658f29d50a\") " pod="openstack/nova-api-0" Dec 02 08:57:38 crc kubenswrapper[4895]: I1202 08:57:38.462246 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5eee65f-af89-4505-9ab3-ce658f29d50a-logs\") pod \"nova-api-0\" (UID: \"e5eee65f-af89-4505-9ab3-ce658f29d50a\") " pod="openstack/nova-api-0" Dec 02 08:57:38 crc kubenswrapper[4895]: I1202 08:57:38.462269 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af0f2707-2c11-4b3e-bd6b-e6ff1f04a06b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"af0f2707-2c11-4b3e-bd6b-e6ff1f04a06b\") " pod="openstack/nova-metadata-0" Dec 02 08:57:38 crc kubenswrapper[4895]: I1202 08:57:38.464235 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5eee65f-af89-4505-9ab3-ce658f29d50a-logs\") pod \"nova-api-0\" (UID: \"e5eee65f-af89-4505-9ab3-ce658f29d50a\") " pod="openstack/nova-api-0" Dec 02 08:57:38 crc kubenswrapper[4895]: I1202 08:57:38.470925 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/e5eee65f-af89-4505-9ab3-ce658f29d50a-config-data\") pod \"nova-api-0\" (UID: \"e5eee65f-af89-4505-9ab3-ce658f29d50a\") " pod="openstack/nova-api-0" Dec 02 08:57:38 crc kubenswrapper[4895]: I1202 08:57:38.490954 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9q7zs\" (UniqueName: \"kubernetes.io/projected/e5eee65f-af89-4505-9ab3-ce658f29d50a-kube-api-access-9q7zs\") pod \"nova-api-0\" (UID: \"e5eee65f-af89-4505-9ab3-ce658f29d50a\") " pod="openstack/nova-api-0" Dec 02 08:57:38 crc kubenswrapper[4895]: I1202 08:57:38.491579 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5eee65f-af89-4505-9ab3-ce658f29d50a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e5eee65f-af89-4505-9ab3-ce658f29d50a\") " pod="openstack/nova-api-0" Dec 02 08:57:38 crc kubenswrapper[4895]: I1202 08:57:38.563886 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kqf78\" (UniqueName: \"kubernetes.io/projected/af0f2707-2c11-4b3e-bd6b-e6ff1f04a06b-kube-api-access-kqf78\") pod \"nova-metadata-0\" (UID: \"af0f2707-2c11-4b3e-bd6b-e6ff1f04a06b\") " pod="openstack/nova-metadata-0" Dec 02 08:57:38 crc kubenswrapper[4895]: I1202 08:57:38.563949 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af0f2707-2c11-4b3e-bd6b-e6ff1f04a06b-config-data\") pod \"nova-metadata-0\" (UID: \"af0f2707-2c11-4b3e-bd6b-e6ff1f04a06b\") " pod="openstack/nova-metadata-0" Dec 02 08:57:38 crc kubenswrapper[4895]: I1202 08:57:38.564011 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/af0f2707-2c11-4b3e-bd6b-e6ff1f04a06b-logs\") pod \"nova-metadata-0\" (UID: \"af0f2707-2c11-4b3e-bd6b-e6ff1f04a06b\") " pod="openstack/nova-metadata-0" Dec 02 08:57:38 
crc kubenswrapper[4895]: I1202 08:57:38.564034 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af0f2707-2c11-4b3e-bd6b-e6ff1f04a06b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"af0f2707-2c11-4b3e-bd6b-e6ff1f04a06b\") " pod="openstack/nova-metadata-0" Dec 02 08:57:38 crc kubenswrapper[4895]: I1202 08:57:38.564517 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/af0f2707-2c11-4b3e-bd6b-e6ff1f04a06b-logs\") pod \"nova-metadata-0\" (UID: \"af0f2707-2c11-4b3e-bd6b-e6ff1f04a06b\") " pod="openstack/nova-metadata-0" Dec 02 08:57:38 crc kubenswrapper[4895]: I1202 08:57:38.569018 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af0f2707-2c11-4b3e-bd6b-e6ff1f04a06b-config-data\") pod \"nova-metadata-0\" (UID: \"af0f2707-2c11-4b3e-bd6b-e6ff1f04a06b\") " pod="openstack/nova-metadata-0" Dec 02 08:57:38 crc kubenswrapper[4895]: I1202 08:57:38.569241 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af0f2707-2c11-4b3e-bd6b-e6ff1f04a06b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"af0f2707-2c11-4b3e-bd6b-e6ff1f04a06b\") " pod="openstack/nova-metadata-0" Dec 02 08:57:38 crc kubenswrapper[4895]: I1202 08:57:38.584555 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 02 08:57:38 crc kubenswrapper[4895]: I1202 08:57:38.586648 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqf78\" (UniqueName: \"kubernetes.io/projected/af0f2707-2c11-4b3e-bd6b-e6ff1f04a06b-kube-api-access-kqf78\") pod \"nova-metadata-0\" (UID: \"af0f2707-2c11-4b3e-bd6b-e6ff1f04a06b\") " pod="openstack/nova-metadata-0" Dec 02 08:57:38 crc kubenswrapper[4895]: I1202 08:57:38.703428 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 02 08:57:38 crc kubenswrapper[4895]: I1202 08:57:38.913087 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76dbf459b5-tdz8d" Dec 02 08:57:39 crc kubenswrapper[4895]: I1202 08:57:39.006423 4895 generic.go:334] "Generic (PLEG): container finished" podID="e49de284-87ac-44c3-b301-99b4a5262b56" containerID="7b2a88b4b9839a421519c4f70694368ee2ed93b9e60d060bf21eb060ee65025f" exitCode=0 Dec 02 08:57:39 crc kubenswrapper[4895]: I1202 08:57:39.006497 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76dbf459b5-tdz8d" event={"ID":"e49de284-87ac-44c3-b301-99b4a5262b56","Type":"ContainerDied","Data":"7b2a88b4b9839a421519c4f70694368ee2ed93b9e60d060bf21eb060ee65025f"} Dec 02 08:57:39 crc kubenswrapper[4895]: I1202 08:57:39.006532 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76dbf459b5-tdz8d" event={"ID":"e49de284-87ac-44c3-b301-99b4a5262b56","Type":"ContainerDied","Data":"ff305cd230451bba484642633a9a6546ee9ae1a2aff0aee8f7cb4dc347b64fa2"} Dec 02 08:57:39 crc kubenswrapper[4895]: I1202 08:57:39.006555 4895 scope.go:117] "RemoveContainer" containerID="7b2a88b4b9839a421519c4f70694368ee2ed93b9e60d060bf21eb060ee65025f" Dec 02 08:57:39 crc kubenswrapper[4895]: I1202 08:57:39.006735 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-76dbf459b5-tdz8d" Dec 02 08:57:39 crc kubenswrapper[4895]: I1202 08:57:39.032173 4895 scope.go:117] "RemoveContainer" containerID="df24bd7f6ab5cb75849a83ad219183a79cab563758b94727f24ba721e730b6b4" Dec 02 08:57:39 crc kubenswrapper[4895]: I1202 08:57:39.054571 4895 scope.go:117] "RemoveContainer" containerID="7b2a88b4b9839a421519c4f70694368ee2ed93b9e60d060bf21eb060ee65025f" Dec 02 08:57:39 crc kubenswrapper[4895]: E1202 08:57:39.055476 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b2a88b4b9839a421519c4f70694368ee2ed93b9e60d060bf21eb060ee65025f\": container with ID starting with 7b2a88b4b9839a421519c4f70694368ee2ed93b9e60d060bf21eb060ee65025f not found: ID does not exist" containerID="7b2a88b4b9839a421519c4f70694368ee2ed93b9e60d060bf21eb060ee65025f" Dec 02 08:57:39 crc kubenswrapper[4895]: I1202 08:57:39.055564 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b2a88b4b9839a421519c4f70694368ee2ed93b9e60d060bf21eb060ee65025f"} err="failed to get container status \"7b2a88b4b9839a421519c4f70694368ee2ed93b9e60d060bf21eb060ee65025f\": rpc error: code = NotFound desc = could not find container \"7b2a88b4b9839a421519c4f70694368ee2ed93b9e60d060bf21eb060ee65025f\": container with ID starting with 7b2a88b4b9839a421519c4f70694368ee2ed93b9e60d060bf21eb060ee65025f not found: ID does not exist" Dec 02 08:57:39 crc kubenswrapper[4895]: I1202 08:57:39.055625 4895 scope.go:117] "RemoveContainer" containerID="df24bd7f6ab5cb75849a83ad219183a79cab563758b94727f24ba721e730b6b4" Dec 02 08:57:39 crc kubenswrapper[4895]: E1202 08:57:39.056040 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df24bd7f6ab5cb75849a83ad219183a79cab563758b94727f24ba721e730b6b4\": container with ID starting with 
df24bd7f6ab5cb75849a83ad219183a79cab563758b94727f24ba721e730b6b4 not found: ID does not exist" containerID="df24bd7f6ab5cb75849a83ad219183a79cab563758b94727f24ba721e730b6b4" Dec 02 08:57:39 crc kubenswrapper[4895]: I1202 08:57:39.056066 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df24bd7f6ab5cb75849a83ad219183a79cab563758b94727f24ba721e730b6b4"} err="failed to get container status \"df24bd7f6ab5cb75849a83ad219183a79cab563758b94727f24ba721e730b6b4\": rpc error: code = NotFound desc = could not find container \"df24bd7f6ab5cb75849a83ad219183a79cab563758b94727f24ba721e730b6b4\": container with ID starting with df24bd7f6ab5cb75849a83ad219183a79cab563758b94727f24ba721e730b6b4 not found: ID does not exist" Dec 02 08:57:39 crc kubenswrapper[4895]: I1202 08:57:39.075913 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e49de284-87ac-44c3-b301-99b4a5262b56-dns-svc\") pod \"e49de284-87ac-44c3-b301-99b4a5262b56\" (UID: \"e49de284-87ac-44c3-b301-99b4a5262b56\") " Dec 02 08:57:39 crc kubenswrapper[4895]: I1202 08:57:39.076083 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e49de284-87ac-44c3-b301-99b4a5262b56-ovsdbserver-nb\") pod \"e49de284-87ac-44c3-b301-99b4a5262b56\" (UID: \"e49de284-87ac-44c3-b301-99b4a5262b56\") " Dec 02 08:57:39 crc kubenswrapper[4895]: I1202 08:57:39.076136 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e49de284-87ac-44c3-b301-99b4a5262b56-ovsdbserver-sb\") pod \"e49de284-87ac-44c3-b301-99b4a5262b56\" (UID: \"e49de284-87ac-44c3-b301-99b4a5262b56\") " Dec 02 08:57:39 crc kubenswrapper[4895]: I1202 08:57:39.076294 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ccsqs\" (UniqueName: 
\"kubernetes.io/projected/e49de284-87ac-44c3-b301-99b4a5262b56-kube-api-access-ccsqs\") pod \"e49de284-87ac-44c3-b301-99b4a5262b56\" (UID: \"e49de284-87ac-44c3-b301-99b4a5262b56\") " Dec 02 08:57:39 crc kubenswrapper[4895]: I1202 08:57:39.076331 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e49de284-87ac-44c3-b301-99b4a5262b56-config\") pod \"e49de284-87ac-44c3-b301-99b4a5262b56\" (UID: \"e49de284-87ac-44c3-b301-99b4a5262b56\") " Dec 02 08:57:39 crc kubenswrapper[4895]: I1202 08:57:39.084728 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e49de284-87ac-44c3-b301-99b4a5262b56-kube-api-access-ccsqs" (OuterVolumeSpecName: "kube-api-access-ccsqs") pod "e49de284-87ac-44c3-b301-99b4a5262b56" (UID: "e49de284-87ac-44c3-b301-99b4a5262b56"). InnerVolumeSpecName "kube-api-access-ccsqs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:57:39 crc kubenswrapper[4895]: I1202 08:57:39.127214 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 02 08:57:39 crc kubenswrapper[4895]: I1202 08:57:39.137913 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e49de284-87ac-44c3-b301-99b4a5262b56-config" (OuterVolumeSpecName: "config") pod "e49de284-87ac-44c3-b301-99b4a5262b56" (UID: "e49de284-87ac-44c3-b301-99b4a5262b56"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:57:39 crc kubenswrapper[4895]: I1202 08:57:39.149064 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e49de284-87ac-44c3-b301-99b4a5262b56-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e49de284-87ac-44c3-b301-99b4a5262b56" (UID: "e49de284-87ac-44c3-b301-99b4a5262b56"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:57:39 crc kubenswrapper[4895]: I1202 08:57:39.151668 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e49de284-87ac-44c3-b301-99b4a5262b56-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e49de284-87ac-44c3-b301-99b4a5262b56" (UID: "e49de284-87ac-44c3-b301-99b4a5262b56"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:57:39 crc kubenswrapper[4895]: I1202 08:57:39.152976 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e49de284-87ac-44c3-b301-99b4a5262b56-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e49de284-87ac-44c3-b301-99b4a5262b56" (UID: "e49de284-87ac-44c3-b301-99b4a5262b56"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:57:39 crc kubenswrapper[4895]: I1202 08:57:39.165981 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05e0834d-d816-442d-b6de-66fe0ce1704a" path="/var/lib/kubelet/pods/05e0834d-d816-442d-b6de-66fe0ce1704a/volumes" Dec 02 08:57:39 crc kubenswrapper[4895]: I1202 08:57:39.166929 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6f59f1d-2249-429e-986b-496235570ca2" path="/var/lib/kubelet/pods/c6f59f1d-2249-429e-986b-496235570ca2/volumes" Dec 02 08:57:39 crc kubenswrapper[4895]: I1202 08:57:39.179230 4895 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e49de284-87ac-44c3-b301-99b4a5262b56-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 02 08:57:39 crc kubenswrapper[4895]: I1202 08:57:39.179287 4895 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e49de284-87ac-44c3-b301-99b4a5262b56-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 02 08:57:39 crc kubenswrapper[4895]: 
I1202 08:57:39.179300 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ccsqs\" (UniqueName: \"kubernetes.io/projected/e49de284-87ac-44c3-b301-99b4a5262b56-kube-api-access-ccsqs\") on node \"crc\" DevicePath \"\"" Dec 02 08:57:39 crc kubenswrapper[4895]: I1202 08:57:39.179316 4895 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e49de284-87ac-44c3-b301-99b4a5262b56-config\") on node \"crc\" DevicePath \"\"" Dec 02 08:57:39 crc kubenswrapper[4895]: I1202 08:57:39.179346 4895 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e49de284-87ac-44c3-b301-99b4a5262b56-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 02 08:57:39 crc kubenswrapper[4895]: I1202 08:57:39.227298 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 08:57:39 crc kubenswrapper[4895]: I1202 08:57:39.512244 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-76dbf459b5-tdz8d"] Dec 02 08:57:39 crc kubenswrapper[4895]: I1202 08:57:39.522331 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-76dbf459b5-tdz8d"] Dec 02 08:57:40 crc kubenswrapper[4895]: I1202 08:57:40.021000 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e5eee65f-af89-4505-9ab3-ce658f29d50a","Type":"ContainerStarted","Data":"bc3cbfdbd7a6e33e4248a28ec7818d50b9b5f15199e66a8f359d7d1e1b526bbc"} Dec 02 08:57:40 crc kubenswrapper[4895]: I1202 08:57:40.021044 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e5eee65f-af89-4505-9ab3-ce658f29d50a","Type":"ContainerStarted","Data":"6bf87be0829703145491fd71754129fd01a60254e1ffdde700c0de671b139a2b"} Dec 02 08:57:40 crc kubenswrapper[4895]: I1202 08:57:40.021054 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"e5eee65f-af89-4505-9ab3-ce658f29d50a","Type":"ContainerStarted","Data":"8b37daed33787e56c2ab14f7203bfe651a6cad50f71b8aa4029d15038f11878a"} Dec 02 08:57:40 crc kubenswrapper[4895]: I1202 08:57:40.024557 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"af0f2707-2c11-4b3e-bd6b-e6ff1f04a06b","Type":"ContainerStarted","Data":"29cf49e65df2290830f430596bdb5f9b14c5ddb557546b5075c8497646e810f8"} Dec 02 08:57:40 crc kubenswrapper[4895]: I1202 08:57:40.025009 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"af0f2707-2c11-4b3e-bd6b-e6ff1f04a06b","Type":"ContainerStarted","Data":"9d50a3d978f73c6b802c1d6f41acb8196ebeb4666dad0355429f4d20864e90ed"} Dec 02 08:57:40 crc kubenswrapper[4895]: I1202 08:57:40.025078 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"af0f2707-2c11-4b3e-bd6b-e6ff1f04a06b","Type":"ContainerStarted","Data":"e75c750f4ab4116abe6f3f0f296dc857fbabd652de25b4abfc89c52b20d24548"} Dec 02 08:57:40 crc kubenswrapper[4895]: I1202 08:57:40.041465 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.04144699 podStartE2EDuration="2.04144699s" podCreationTimestamp="2025-12-02 08:57:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:57:40.039483359 +0000 UTC m=+5671.210342962" watchObservedRunningTime="2025-12-02 08:57:40.04144699 +0000 UTC m=+5671.212306603" Dec 02 08:57:40 crc kubenswrapper[4895]: I1202 08:57:40.068708 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.068680347 podStartE2EDuration="2.068680347s" podCreationTimestamp="2025-12-02 08:57:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-12-02 08:57:40.068429839 +0000 UTC m=+5671.239289452" watchObservedRunningTime="2025-12-02 08:57:40.068680347 +0000 UTC m=+5671.239539960" Dec 02 08:57:41 crc kubenswrapper[4895]: I1202 08:57:41.166379 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e49de284-87ac-44c3-b301-99b4a5262b56" path="/var/lib/kubelet/pods/e49de284-87ac-44c3-b301-99b4a5262b56/volumes" Dec 02 08:57:41 crc kubenswrapper[4895]: I1202 08:57:41.836033 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 02 08:57:41 crc kubenswrapper[4895]: I1202 08:57:41.933676 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0790d1b7-ff03-4e24-b039-c28f9ed62bc9-combined-ca-bundle\") pod \"0790d1b7-ff03-4e24-b039-c28f9ed62bc9\" (UID: \"0790d1b7-ff03-4e24-b039-c28f9ed62bc9\") " Dec 02 08:57:41 crc kubenswrapper[4895]: I1202 08:57:41.933878 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0790d1b7-ff03-4e24-b039-c28f9ed62bc9-config-data\") pod \"0790d1b7-ff03-4e24-b039-c28f9ed62bc9\" (UID: \"0790d1b7-ff03-4e24-b039-c28f9ed62bc9\") " Dec 02 08:57:41 crc kubenswrapper[4895]: I1202 08:57:41.933987 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8vqvg\" (UniqueName: \"kubernetes.io/projected/0790d1b7-ff03-4e24-b039-c28f9ed62bc9-kube-api-access-8vqvg\") pod \"0790d1b7-ff03-4e24-b039-c28f9ed62bc9\" (UID: \"0790d1b7-ff03-4e24-b039-c28f9ed62bc9\") " Dec 02 08:57:41 crc kubenswrapper[4895]: I1202 08:57:41.941636 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0790d1b7-ff03-4e24-b039-c28f9ed62bc9-kube-api-access-8vqvg" (OuterVolumeSpecName: "kube-api-access-8vqvg") pod "0790d1b7-ff03-4e24-b039-c28f9ed62bc9" (UID: 
"0790d1b7-ff03-4e24-b039-c28f9ed62bc9"). InnerVolumeSpecName "kube-api-access-8vqvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:57:41 crc kubenswrapper[4895]: I1202 08:57:41.966576 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0790d1b7-ff03-4e24-b039-c28f9ed62bc9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0790d1b7-ff03-4e24-b039-c28f9ed62bc9" (UID: "0790d1b7-ff03-4e24-b039-c28f9ed62bc9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:57:41 crc kubenswrapper[4895]: I1202 08:57:41.966799 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0790d1b7-ff03-4e24-b039-c28f9ed62bc9-config-data" (OuterVolumeSpecName: "config-data") pod "0790d1b7-ff03-4e24-b039-c28f9ed62bc9" (UID: "0790d1b7-ff03-4e24-b039-c28f9ed62bc9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:57:42 crc kubenswrapper[4895]: I1202 08:57:42.036365 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0790d1b7-ff03-4e24-b039-c28f9ed62bc9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 08:57:42 crc kubenswrapper[4895]: I1202 08:57:42.036413 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0790d1b7-ff03-4e24-b039-c28f9ed62bc9-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 08:57:42 crc kubenswrapper[4895]: I1202 08:57:42.036431 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8vqvg\" (UniqueName: \"kubernetes.io/projected/0790d1b7-ff03-4e24-b039-c28f9ed62bc9-kube-api-access-8vqvg\") on node \"crc\" DevicePath \"\"" Dec 02 08:57:42 crc kubenswrapper[4895]: I1202 08:57:42.043663 4895 generic.go:334] "Generic (PLEG): container finished" podID="0790d1b7-ff03-4e24-b039-c28f9ed62bc9" 
containerID="23c9ae2afaa444aea336f65ad1affa0d3ef2119339daa6dfad28b8fbd7bb7c45" exitCode=0 Dec 02 08:57:42 crc kubenswrapper[4895]: I1202 08:57:42.043714 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"0790d1b7-ff03-4e24-b039-c28f9ed62bc9","Type":"ContainerDied","Data":"23c9ae2afaa444aea336f65ad1affa0d3ef2119339daa6dfad28b8fbd7bb7c45"} Dec 02 08:57:42 crc kubenswrapper[4895]: I1202 08:57:42.043759 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"0790d1b7-ff03-4e24-b039-c28f9ed62bc9","Type":"ContainerDied","Data":"a49208fb9e1582663e592fa8d117741d4806653f06ef852fceaf19a878ecef10"} Dec 02 08:57:42 crc kubenswrapper[4895]: I1202 08:57:42.043783 4895 scope.go:117] "RemoveContainer" containerID="23c9ae2afaa444aea336f65ad1affa0d3ef2119339daa6dfad28b8fbd7bb7c45" Dec 02 08:57:42 crc kubenswrapper[4895]: I1202 08:57:42.043925 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 02 08:57:42 crc kubenswrapper[4895]: I1202 08:57:42.081138 4895 scope.go:117] "RemoveContainer" containerID="23c9ae2afaa444aea336f65ad1affa0d3ef2119339daa6dfad28b8fbd7bb7c45" Dec 02 08:57:42 crc kubenswrapper[4895]: E1202 08:57:42.081786 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"23c9ae2afaa444aea336f65ad1affa0d3ef2119339daa6dfad28b8fbd7bb7c45\": container with ID starting with 23c9ae2afaa444aea336f65ad1affa0d3ef2119339daa6dfad28b8fbd7bb7c45 not found: ID does not exist" containerID="23c9ae2afaa444aea336f65ad1affa0d3ef2119339daa6dfad28b8fbd7bb7c45" Dec 02 08:57:42 crc kubenswrapper[4895]: I1202 08:57:42.081859 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23c9ae2afaa444aea336f65ad1affa0d3ef2119339daa6dfad28b8fbd7bb7c45"} err="failed to get container status 
\"23c9ae2afaa444aea336f65ad1affa0d3ef2119339daa6dfad28b8fbd7bb7c45\": rpc error: code = NotFound desc = could not find container \"23c9ae2afaa444aea336f65ad1affa0d3ef2119339daa6dfad28b8fbd7bb7c45\": container with ID starting with 23c9ae2afaa444aea336f65ad1affa0d3ef2119339daa6dfad28b8fbd7bb7c45 not found: ID does not exist" Dec 02 08:57:42 crc kubenswrapper[4895]: I1202 08:57:42.085939 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 08:57:42 crc kubenswrapper[4895]: I1202 08:57:42.110351 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 08:57:42 crc kubenswrapper[4895]: I1202 08:57:42.126865 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 08:57:42 crc kubenswrapper[4895]: E1202 08:57:42.127410 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e49de284-87ac-44c3-b301-99b4a5262b56" containerName="dnsmasq-dns" Dec 02 08:57:42 crc kubenswrapper[4895]: I1202 08:57:42.127428 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="e49de284-87ac-44c3-b301-99b4a5262b56" containerName="dnsmasq-dns" Dec 02 08:57:42 crc kubenswrapper[4895]: E1202 08:57:42.127469 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0790d1b7-ff03-4e24-b039-c28f9ed62bc9" containerName="nova-scheduler-scheduler" Dec 02 08:57:42 crc kubenswrapper[4895]: I1202 08:57:42.127478 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="0790d1b7-ff03-4e24-b039-c28f9ed62bc9" containerName="nova-scheduler-scheduler" Dec 02 08:57:42 crc kubenswrapper[4895]: E1202 08:57:42.127490 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e49de284-87ac-44c3-b301-99b4a5262b56" containerName="init" Dec 02 08:57:42 crc kubenswrapper[4895]: I1202 08:57:42.127497 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="e49de284-87ac-44c3-b301-99b4a5262b56" containerName="init" Dec 02 08:57:42 crc kubenswrapper[4895]: 
I1202 08:57:42.127684 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="e49de284-87ac-44c3-b301-99b4a5262b56" containerName="dnsmasq-dns" Dec 02 08:57:42 crc kubenswrapper[4895]: I1202 08:57:42.127700 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="0790d1b7-ff03-4e24-b039-c28f9ed62bc9" containerName="nova-scheduler-scheduler" Dec 02 08:57:42 crc kubenswrapper[4895]: I1202 08:57:42.128501 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 02 08:57:42 crc kubenswrapper[4895]: I1202 08:57:42.131875 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 02 08:57:42 crc kubenswrapper[4895]: I1202 08:57:42.139338 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 08:57:42 crc kubenswrapper[4895]: I1202 08:57:42.241245 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f53a4dc-0378-4727-a729-b7d520e28874-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"5f53a4dc-0378-4727-a729-b7d520e28874\") " pod="openstack/nova-scheduler-0" Dec 02 08:57:42 crc kubenswrapper[4895]: I1202 08:57:42.241319 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9p7h\" (UniqueName: \"kubernetes.io/projected/5f53a4dc-0378-4727-a729-b7d520e28874-kube-api-access-n9p7h\") pod \"nova-scheduler-0\" (UID: \"5f53a4dc-0378-4727-a729-b7d520e28874\") " pod="openstack/nova-scheduler-0" Dec 02 08:57:42 crc kubenswrapper[4895]: I1202 08:57:42.241378 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f53a4dc-0378-4727-a729-b7d520e28874-config-data\") pod \"nova-scheduler-0\" (UID: \"5f53a4dc-0378-4727-a729-b7d520e28874\") " 
pod="openstack/nova-scheduler-0" Dec 02 08:57:42 crc kubenswrapper[4895]: I1202 08:57:42.343496 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f53a4dc-0378-4727-a729-b7d520e28874-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"5f53a4dc-0378-4727-a729-b7d520e28874\") " pod="openstack/nova-scheduler-0" Dec 02 08:57:42 crc kubenswrapper[4895]: I1202 08:57:42.343573 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n9p7h\" (UniqueName: \"kubernetes.io/projected/5f53a4dc-0378-4727-a729-b7d520e28874-kube-api-access-n9p7h\") pod \"nova-scheduler-0\" (UID: \"5f53a4dc-0378-4727-a729-b7d520e28874\") " pod="openstack/nova-scheduler-0" Dec 02 08:57:42 crc kubenswrapper[4895]: I1202 08:57:42.343631 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f53a4dc-0378-4727-a729-b7d520e28874-config-data\") pod \"nova-scheduler-0\" (UID: \"5f53a4dc-0378-4727-a729-b7d520e28874\") " pod="openstack/nova-scheduler-0" Dec 02 08:57:42 crc kubenswrapper[4895]: I1202 08:57:42.347907 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f53a4dc-0378-4727-a729-b7d520e28874-config-data\") pod \"nova-scheduler-0\" (UID: \"5f53a4dc-0378-4727-a729-b7d520e28874\") " pod="openstack/nova-scheduler-0" Dec 02 08:57:42 crc kubenswrapper[4895]: I1202 08:57:42.348269 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f53a4dc-0378-4727-a729-b7d520e28874-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"5f53a4dc-0378-4727-a729-b7d520e28874\") " pod="openstack/nova-scheduler-0" Dec 02 08:57:42 crc kubenswrapper[4895]: I1202 08:57:42.362765 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-n9p7h\" (UniqueName: \"kubernetes.io/projected/5f53a4dc-0378-4727-a729-b7d520e28874-kube-api-access-n9p7h\") pod \"nova-scheduler-0\" (UID: \"5f53a4dc-0378-4727-a729-b7d520e28874\") " pod="openstack/nova-scheduler-0" Dec 02 08:57:42 crc kubenswrapper[4895]: I1202 08:57:42.450398 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 02 08:57:42 crc kubenswrapper[4895]: I1202 08:57:42.911980 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 08:57:43 crc kubenswrapper[4895]: I1202 08:57:43.054369 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"5f53a4dc-0378-4727-a729-b7d520e28874","Type":"ContainerStarted","Data":"05807c785822ee010894c5005d9a1dfce3045b15a7f219c89e08dee43a9a9a4b"} Dec 02 08:57:43 crc kubenswrapper[4895]: I1202 08:57:43.154231 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0790d1b7-ff03-4e24-b039-c28f9ed62bc9" path="/var/lib/kubelet/pods/0790d1b7-ff03-4e24-b039-c28f9ed62bc9/volumes" Dec 02 08:57:43 crc kubenswrapper[4895]: I1202 08:57:43.704938 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 02 08:57:43 crc kubenswrapper[4895]: I1202 08:57:43.706180 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 02 08:57:44 crc kubenswrapper[4895]: I1202 08:57:44.090308 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"5f53a4dc-0378-4727-a729-b7d520e28874","Type":"ContainerStarted","Data":"ec2a547d351296c4f7eba994d9d399694e172c25f99b50e500266f6a2706934b"} Dec 02 08:57:44 crc kubenswrapper[4895]: I1202 08:57:44.124032 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.124004279 podStartE2EDuration="2.124004279s" 
podCreationTimestamp="2025-12-02 08:57:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:57:44.120097197 +0000 UTC m=+5675.290956810" watchObservedRunningTime="2025-12-02 08:57:44.124004279 +0000 UTC m=+5675.294863882" Dec 02 08:57:46 crc kubenswrapper[4895]: I1202 08:57:46.461948 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Dec 02 08:57:47 crc kubenswrapper[4895]: I1202 08:57:47.017366 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-z52x8"] Dec 02 08:57:47 crc kubenswrapper[4895]: I1202 08:57:47.019295 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-z52x8" Dec 02 08:57:47 crc kubenswrapper[4895]: I1202 08:57:47.021397 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Dec 02 08:57:47 crc kubenswrapper[4895]: I1202 08:57:47.022382 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Dec 02 08:57:47 crc kubenswrapper[4895]: I1202 08:57:47.030812 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-z52x8"] Dec 02 08:57:47 crc kubenswrapper[4895]: I1202 08:57:47.070383 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dbd9d193-9ec3-49b4-8cbb-050637dc04fc-scripts\") pod \"nova-cell1-cell-mapping-z52x8\" (UID: \"dbd9d193-9ec3-49b4-8cbb-050637dc04fc\") " pod="openstack/nova-cell1-cell-mapping-z52x8" Dec 02 08:57:47 crc kubenswrapper[4895]: I1202 08:57:47.070430 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjnrs\" (UniqueName: 
\"kubernetes.io/projected/dbd9d193-9ec3-49b4-8cbb-050637dc04fc-kube-api-access-rjnrs\") pod \"nova-cell1-cell-mapping-z52x8\" (UID: \"dbd9d193-9ec3-49b4-8cbb-050637dc04fc\") " pod="openstack/nova-cell1-cell-mapping-z52x8" Dec 02 08:57:47 crc kubenswrapper[4895]: I1202 08:57:47.070655 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbd9d193-9ec3-49b4-8cbb-050637dc04fc-config-data\") pod \"nova-cell1-cell-mapping-z52x8\" (UID: \"dbd9d193-9ec3-49b4-8cbb-050637dc04fc\") " pod="openstack/nova-cell1-cell-mapping-z52x8" Dec 02 08:57:47 crc kubenswrapper[4895]: I1202 08:57:47.070980 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbd9d193-9ec3-49b4-8cbb-050637dc04fc-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-z52x8\" (UID: \"dbd9d193-9ec3-49b4-8cbb-050637dc04fc\") " pod="openstack/nova-cell1-cell-mapping-z52x8" Dec 02 08:57:47 crc kubenswrapper[4895]: I1202 08:57:47.171959 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbd9d193-9ec3-49b4-8cbb-050637dc04fc-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-z52x8\" (UID: \"dbd9d193-9ec3-49b4-8cbb-050637dc04fc\") " pod="openstack/nova-cell1-cell-mapping-z52x8" Dec 02 08:57:47 crc kubenswrapper[4895]: I1202 08:57:47.172307 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dbd9d193-9ec3-49b4-8cbb-050637dc04fc-scripts\") pod \"nova-cell1-cell-mapping-z52x8\" (UID: \"dbd9d193-9ec3-49b4-8cbb-050637dc04fc\") " pod="openstack/nova-cell1-cell-mapping-z52x8" Dec 02 08:57:47 crc kubenswrapper[4895]: I1202 08:57:47.172353 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rjnrs\" (UniqueName: 
\"kubernetes.io/projected/dbd9d193-9ec3-49b4-8cbb-050637dc04fc-kube-api-access-rjnrs\") pod \"nova-cell1-cell-mapping-z52x8\" (UID: \"dbd9d193-9ec3-49b4-8cbb-050637dc04fc\") " pod="openstack/nova-cell1-cell-mapping-z52x8" Dec 02 08:57:47 crc kubenswrapper[4895]: I1202 08:57:47.172397 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbd9d193-9ec3-49b4-8cbb-050637dc04fc-config-data\") pod \"nova-cell1-cell-mapping-z52x8\" (UID: \"dbd9d193-9ec3-49b4-8cbb-050637dc04fc\") " pod="openstack/nova-cell1-cell-mapping-z52x8" Dec 02 08:57:47 crc kubenswrapper[4895]: I1202 08:57:47.177613 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dbd9d193-9ec3-49b4-8cbb-050637dc04fc-scripts\") pod \"nova-cell1-cell-mapping-z52x8\" (UID: \"dbd9d193-9ec3-49b4-8cbb-050637dc04fc\") " pod="openstack/nova-cell1-cell-mapping-z52x8" Dec 02 08:57:47 crc kubenswrapper[4895]: I1202 08:57:47.178553 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbd9d193-9ec3-49b4-8cbb-050637dc04fc-config-data\") pod \"nova-cell1-cell-mapping-z52x8\" (UID: \"dbd9d193-9ec3-49b4-8cbb-050637dc04fc\") " pod="openstack/nova-cell1-cell-mapping-z52x8" Dec 02 08:57:47 crc kubenswrapper[4895]: I1202 08:57:47.189027 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbd9d193-9ec3-49b4-8cbb-050637dc04fc-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-z52x8\" (UID: \"dbd9d193-9ec3-49b4-8cbb-050637dc04fc\") " pod="openstack/nova-cell1-cell-mapping-z52x8" Dec 02 08:57:47 crc kubenswrapper[4895]: I1202 08:57:47.191076 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjnrs\" (UniqueName: \"kubernetes.io/projected/dbd9d193-9ec3-49b4-8cbb-050637dc04fc-kube-api-access-rjnrs\") pod 
\"nova-cell1-cell-mapping-z52x8\" (UID: \"dbd9d193-9ec3-49b4-8cbb-050637dc04fc\") " pod="openstack/nova-cell1-cell-mapping-z52x8" Dec 02 08:57:47 crc kubenswrapper[4895]: I1202 08:57:47.349878 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-z52x8" Dec 02 08:57:47 crc kubenswrapper[4895]: I1202 08:57:47.451786 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 02 08:57:47 crc kubenswrapper[4895]: I1202 08:57:47.763093 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-z52x8"] Dec 02 08:57:48 crc kubenswrapper[4895]: I1202 08:57:48.132457 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-z52x8" event={"ID":"dbd9d193-9ec3-49b4-8cbb-050637dc04fc","Type":"ContainerStarted","Data":"77f51277e69bf3adcf07535ca57b14667516f908f22b0db5454b6f7e6d17c7b8"} Dec 02 08:57:48 crc kubenswrapper[4895]: I1202 08:57:48.132818 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-z52x8" event={"ID":"dbd9d193-9ec3-49b4-8cbb-050637dc04fc","Type":"ContainerStarted","Data":"3a6118567f1b714c2569c19ba68024adda810ce86cc4eb26eb94ccaf37adadb9"} Dec 02 08:57:48 crc kubenswrapper[4895]: I1202 08:57:48.151449 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-z52x8" podStartSLOduration=2.151428033 podStartE2EDuration="2.151428033s" podCreationTimestamp="2025-12-02 08:57:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:57:48.146720417 +0000 UTC m=+5679.317580040" watchObservedRunningTime="2025-12-02 08:57:48.151428033 +0000 UTC m=+5679.322287646" Dec 02 08:57:48 crc kubenswrapper[4895]: I1202 08:57:48.585829 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/nova-api-0" Dec 02 08:57:48 crc kubenswrapper[4895]: I1202 08:57:48.585890 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 02 08:57:48 crc kubenswrapper[4895]: I1202 08:57:48.704540 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 02 08:57:48 crc kubenswrapper[4895]: I1202 08:57:48.704593 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 02 08:57:49 crc kubenswrapper[4895]: I1202 08:57:49.669294 4895 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="e5eee65f-af89-4505-9ab3-ce658f29d50a" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.65:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 02 08:57:49 crc kubenswrapper[4895]: I1202 08:57:49.670086 4895 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="e5eee65f-af89-4505-9ab3-ce658f29d50a" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.65:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 02 08:57:49 crc kubenswrapper[4895]: I1202 08:57:49.788104 4895 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="af0f2707-2c11-4b3e-bd6b-e6ff1f04a06b" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.66:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 02 08:57:49 crc kubenswrapper[4895]: I1202 08:57:49.788192 4895 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="af0f2707-2c11-4b3e-bd6b-e6ff1f04a06b" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.66:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" 
Dec 02 08:57:52 crc kubenswrapper[4895]: I1202 08:57:52.450527 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 02 08:57:52 crc kubenswrapper[4895]: I1202 08:57:52.486114 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 02 08:57:53 crc kubenswrapper[4895]: I1202 08:57:53.221949 4895 generic.go:334] "Generic (PLEG): container finished" podID="dbd9d193-9ec3-49b4-8cbb-050637dc04fc" containerID="77f51277e69bf3adcf07535ca57b14667516f908f22b0db5454b6f7e6d17c7b8" exitCode=0 Dec 02 08:57:53 crc kubenswrapper[4895]: I1202 08:57:53.222032 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-z52x8" event={"ID":"dbd9d193-9ec3-49b4-8cbb-050637dc04fc","Type":"ContainerDied","Data":"77f51277e69bf3adcf07535ca57b14667516f908f22b0db5454b6f7e6d17c7b8"} Dec 02 08:57:53 crc kubenswrapper[4895]: I1202 08:57:53.261497 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 02 08:57:54 crc kubenswrapper[4895]: I1202 08:57:54.539290 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-z52x8" Dec 02 08:57:54 crc kubenswrapper[4895]: I1202 08:57:54.659819 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rjnrs\" (UniqueName: \"kubernetes.io/projected/dbd9d193-9ec3-49b4-8cbb-050637dc04fc-kube-api-access-rjnrs\") pod \"dbd9d193-9ec3-49b4-8cbb-050637dc04fc\" (UID: \"dbd9d193-9ec3-49b4-8cbb-050637dc04fc\") " Dec 02 08:57:54 crc kubenswrapper[4895]: I1202 08:57:54.660206 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbd9d193-9ec3-49b4-8cbb-050637dc04fc-config-data\") pod \"dbd9d193-9ec3-49b4-8cbb-050637dc04fc\" (UID: \"dbd9d193-9ec3-49b4-8cbb-050637dc04fc\") " Dec 02 08:57:54 crc kubenswrapper[4895]: I1202 08:57:54.660371 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dbd9d193-9ec3-49b4-8cbb-050637dc04fc-scripts\") pod \"dbd9d193-9ec3-49b4-8cbb-050637dc04fc\" (UID: \"dbd9d193-9ec3-49b4-8cbb-050637dc04fc\") " Dec 02 08:57:54 crc kubenswrapper[4895]: I1202 08:57:54.661137 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbd9d193-9ec3-49b4-8cbb-050637dc04fc-combined-ca-bundle\") pod \"dbd9d193-9ec3-49b4-8cbb-050637dc04fc\" (UID: \"dbd9d193-9ec3-49b4-8cbb-050637dc04fc\") " Dec 02 08:57:54 crc kubenswrapper[4895]: I1202 08:57:54.666104 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbd9d193-9ec3-49b4-8cbb-050637dc04fc-scripts" (OuterVolumeSpecName: "scripts") pod "dbd9d193-9ec3-49b4-8cbb-050637dc04fc" (UID: "dbd9d193-9ec3-49b4-8cbb-050637dc04fc"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:57:54 crc kubenswrapper[4895]: I1202 08:57:54.666973 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dbd9d193-9ec3-49b4-8cbb-050637dc04fc-kube-api-access-rjnrs" (OuterVolumeSpecName: "kube-api-access-rjnrs") pod "dbd9d193-9ec3-49b4-8cbb-050637dc04fc" (UID: "dbd9d193-9ec3-49b4-8cbb-050637dc04fc"). InnerVolumeSpecName "kube-api-access-rjnrs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:57:54 crc kubenswrapper[4895]: I1202 08:57:54.687077 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbd9d193-9ec3-49b4-8cbb-050637dc04fc-config-data" (OuterVolumeSpecName: "config-data") pod "dbd9d193-9ec3-49b4-8cbb-050637dc04fc" (UID: "dbd9d193-9ec3-49b4-8cbb-050637dc04fc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:57:54 crc kubenswrapper[4895]: I1202 08:57:54.687266 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbd9d193-9ec3-49b4-8cbb-050637dc04fc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dbd9d193-9ec3-49b4-8cbb-050637dc04fc" (UID: "dbd9d193-9ec3-49b4-8cbb-050637dc04fc"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:57:54 crc kubenswrapper[4895]: I1202 08:57:54.763875 4895 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dbd9d193-9ec3-49b4-8cbb-050637dc04fc-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 08:57:54 crc kubenswrapper[4895]: I1202 08:57:54.763920 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbd9d193-9ec3-49b4-8cbb-050637dc04fc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 08:57:54 crc kubenswrapper[4895]: I1202 08:57:54.763932 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rjnrs\" (UniqueName: \"kubernetes.io/projected/dbd9d193-9ec3-49b4-8cbb-050637dc04fc-kube-api-access-rjnrs\") on node \"crc\" DevicePath \"\"" Dec 02 08:57:54 crc kubenswrapper[4895]: I1202 08:57:54.763941 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbd9d193-9ec3-49b4-8cbb-050637dc04fc-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 08:57:55 crc kubenswrapper[4895]: I1202 08:57:55.241150 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-z52x8" event={"ID":"dbd9d193-9ec3-49b4-8cbb-050637dc04fc","Type":"ContainerDied","Data":"3a6118567f1b714c2569c19ba68024adda810ce86cc4eb26eb94ccaf37adadb9"} Dec 02 08:57:55 crc kubenswrapper[4895]: I1202 08:57:55.241191 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3a6118567f1b714c2569c19ba68024adda810ce86cc4eb26eb94ccaf37adadb9" Dec 02 08:57:55 crc kubenswrapper[4895]: I1202 08:57:55.241227 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-z52x8" Dec 02 08:57:55 crc kubenswrapper[4895]: I1202 08:57:55.431311 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 02 08:57:55 crc kubenswrapper[4895]: I1202 08:57:55.431644 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="e5eee65f-af89-4505-9ab3-ce658f29d50a" containerName="nova-api-log" containerID="cri-o://6bf87be0829703145491fd71754129fd01a60254e1ffdde700c0de671b139a2b" gracePeriod=30 Dec 02 08:57:55 crc kubenswrapper[4895]: I1202 08:57:55.431836 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="e5eee65f-af89-4505-9ab3-ce658f29d50a" containerName="nova-api-api" containerID="cri-o://bc3cbfdbd7a6e33e4248a28ec7818d50b9b5f15199e66a8f359d7d1e1b526bbc" gracePeriod=30 Dec 02 08:57:55 crc kubenswrapper[4895]: I1202 08:57:55.443943 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 08:57:55 crc kubenswrapper[4895]: I1202 08:57:55.444184 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="5f53a4dc-0378-4727-a729-b7d520e28874" containerName="nova-scheduler-scheduler" containerID="cri-o://ec2a547d351296c4f7eba994d9d399694e172c25f99b50e500266f6a2706934b" gracePeriod=30 Dec 02 08:57:55 crc kubenswrapper[4895]: I1202 08:57:55.494161 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 08:57:55 crc kubenswrapper[4895]: I1202 08:57:55.494412 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="af0f2707-2c11-4b3e-bd6b-e6ff1f04a06b" containerName="nova-metadata-log" containerID="cri-o://9d50a3d978f73c6b802c1d6f41acb8196ebeb4666dad0355429f4d20864e90ed" gracePeriod=30 Dec 02 08:57:55 crc kubenswrapper[4895]: I1202 08:57:55.494575 4895 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="af0f2707-2c11-4b3e-bd6b-e6ff1f04a06b" containerName="nova-metadata-metadata" containerID="cri-o://29cf49e65df2290830f430596bdb5f9b14c5ddb557546b5075c8497646e810f8" gracePeriod=30 Dec 02 08:57:56 crc kubenswrapper[4895]: I1202 08:57:56.256084 4895 generic.go:334] "Generic (PLEG): container finished" podID="e5eee65f-af89-4505-9ab3-ce658f29d50a" containerID="6bf87be0829703145491fd71754129fd01a60254e1ffdde700c0de671b139a2b" exitCode=143 Dec 02 08:57:56 crc kubenswrapper[4895]: I1202 08:57:56.256203 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e5eee65f-af89-4505-9ab3-ce658f29d50a","Type":"ContainerDied","Data":"6bf87be0829703145491fd71754129fd01a60254e1ffdde700c0de671b139a2b"} Dec 02 08:57:56 crc kubenswrapper[4895]: I1202 08:57:56.259867 4895 generic.go:334] "Generic (PLEG): container finished" podID="af0f2707-2c11-4b3e-bd6b-e6ff1f04a06b" containerID="9d50a3d978f73c6b802c1d6f41acb8196ebeb4666dad0355429f4d20864e90ed" exitCode=143 Dec 02 08:57:56 crc kubenswrapper[4895]: I1202 08:57:56.259974 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"af0f2707-2c11-4b3e-bd6b-e6ff1f04a06b","Type":"ContainerDied","Data":"9d50a3d978f73c6b802c1d6f41acb8196ebeb4666dad0355429f4d20864e90ed"} Dec 02 08:57:57 crc kubenswrapper[4895]: E1202 08:57:57.453076 4895 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ec2a547d351296c4f7eba994d9d399694e172c25f99b50e500266f6a2706934b" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 02 08:57:57 crc kubenswrapper[4895]: E1202 08:57:57.454630 4895 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: 
container is stopping, stdout: , stderr: , exit code -1" containerID="ec2a547d351296c4f7eba994d9d399694e172c25f99b50e500266f6a2706934b" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 02 08:57:57 crc kubenswrapper[4895]: E1202 08:57:57.456506 4895 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ec2a547d351296c4f7eba994d9d399694e172c25f99b50e500266f6a2706934b" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 02 08:57:57 crc kubenswrapper[4895]: E1202 08:57:57.456904 4895 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="5f53a4dc-0378-4727-a729-b7d520e28874" containerName="nova-scheduler-scheduler" Dec 02 08:57:59 crc kubenswrapper[4895]: I1202 08:57:59.037416 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 02 08:57:59 crc kubenswrapper[4895]: I1202 08:57:59.047222 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 02 08:57:59 crc kubenswrapper[4895]: I1202 08:57:59.153174 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9q7zs\" (UniqueName: \"kubernetes.io/projected/e5eee65f-af89-4505-9ab3-ce658f29d50a-kube-api-access-9q7zs\") pod \"e5eee65f-af89-4505-9ab3-ce658f29d50a\" (UID: \"e5eee65f-af89-4505-9ab3-ce658f29d50a\") " Dec 02 08:57:59 crc kubenswrapper[4895]: I1202 08:57:59.153239 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5eee65f-af89-4505-9ab3-ce658f29d50a-config-data\") pod \"e5eee65f-af89-4505-9ab3-ce658f29d50a\" (UID: \"e5eee65f-af89-4505-9ab3-ce658f29d50a\") " Dec 02 08:57:59 crc kubenswrapper[4895]: I1202 08:57:59.153384 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af0f2707-2c11-4b3e-bd6b-e6ff1f04a06b-config-data\") pod \"af0f2707-2c11-4b3e-bd6b-e6ff1f04a06b\" (UID: \"af0f2707-2c11-4b3e-bd6b-e6ff1f04a06b\") " Dec 02 08:57:59 crc kubenswrapper[4895]: I1202 08:57:59.154083 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af0f2707-2c11-4b3e-bd6b-e6ff1f04a06b-combined-ca-bundle\") pod \"af0f2707-2c11-4b3e-bd6b-e6ff1f04a06b\" (UID: \"af0f2707-2c11-4b3e-bd6b-e6ff1f04a06b\") " Dec 02 08:57:59 crc kubenswrapper[4895]: I1202 08:57:59.154148 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/af0f2707-2c11-4b3e-bd6b-e6ff1f04a06b-logs\") pod \"af0f2707-2c11-4b3e-bd6b-e6ff1f04a06b\" (UID: \"af0f2707-2c11-4b3e-bd6b-e6ff1f04a06b\") " Dec 02 08:57:59 crc kubenswrapper[4895]: I1202 08:57:59.154188 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kqf78\" (UniqueName: 
\"kubernetes.io/projected/af0f2707-2c11-4b3e-bd6b-e6ff1f04a06b-kube-api-access-kqf78\") pod \"af0f2707-2c11-4b3e-bd6b-e6ff1f04a06b\" (UID: \"af0f2707-2c11-4b3e-bd6b-e6ff1f04a06b\") " Dec 02 08:57:59 crc kubenswrapper[4895]: I1202 08:57:59.154248 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5eee65f-af89-4505-9ab3-ce658f29d50a-combined-ca-bundle\") pod \"e5eee65f-af89-4505-9ab3-ce658f29d50a\" (UID: \"e5eee65f-af89-4505-9ab3-ce658f29d50a\") " Dec 02 08:57:59 crc kubenswrapper[4895]: I1202 08:57:59.154304 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5eee65f-af89-4505-9ab3-ce658f29d50a-logs\") pod \"e5eee65f-af89-4505-9ab3-ce658f29d50a\" (UID: \"e5eee65f-af89-4505-9ab3-ce658f29d50a\") " Dec 02 08:57:59 crc kubenswrapper[4895]: I1202 08:57:59.156617 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5eee65f-af89-4505-9ab3-ce658f29d50a-logs" (OuterVolumeSpecName: "logs") pod "e5eee65f-af89-4505-9ab3-ce658f29d50a" (UID: "e5eee65f-af89-4505-9ab3-ce658f29d50a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:57:59 crc kubenswrapper[4895]: I1202 08:57:59.157387 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af0f2707-2c11-4b3e-bd6b-e6ff1f04a06b-logs" (OuterVolumeSpecName: "logs") pod "af0f2707-2c11-4b3e-bd6b-e6ff1f04a06b" (UID: "af0f2707-2c11-4b3e-bd6b-e6ff1f04a06b"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:57:59 crc kubenswrapper[4895]: I1202 08:57:59.160570 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5eee65f-af89-4505-9ab3-ce658f29d50a-kube-api-access-9q7zs" (OuterVolumeSpecName: "kube-api-access-9q7zs") pod "e5eee65f-af89-4505-9ab3-ce658f29d50a" (UID: "e5eee65f-af89-4505-9ab3-ce658f29d50a"). InnerVolumeSpecName "kube-api-access-9q7zs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:57:59 crc kubenswrapper[4895]: I1202 08:57:59.160951 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af0f2707-2c11-4b3e-bd6b-e6ff1f04a06b-kube-api-access-kqf78" (OuterVolumeSpecName: "kube-api-access-kqf78") pod "af0f2707-2c11-4b3e-bd6b-e6ff1f04a06b" (UID: "af0f2707-2c11-4b3e-bd6b-e6ff1f04a06b"). InnerVolumeSpecName "kube-api-access-kqf78". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:57:59 crc kubenswrapper[4895]: I1202 08:57:59.184408 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5eee65f-af89-4505-9ab3-ce658f29d50a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e5eee65f-af89-4505-9ab3-ce658f29d50a" (UID: "e5eee65f-af89-4505-9ab3-ce658f29d50a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:57:59 crc kubenswrapper[4895]: I1202 08:57:59.184901 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af0f2707-2c11-4b3e-bd6b-e6ff1f04a06b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "af0f2707-2c11-4b3e-bd6b-e6ff1f04a06b" (UID: "af0f2707-2c11-4b3e-bd6b-e6ff1f04a06b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:57:59 crc kubenswrapper[4895]: I1202 08:57:59.186359 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af0f2707-2c11-4b3e-bd6b-e6ff1f04a06b-config-data" (OuterVolumeSpecName: "config-data") pod "af0f2707-2c11-4b3e-bd6b-e6ff1f04a06b" (UID: "af0f2707-2c11-4b3e-bd6b-e6ff1f04a06b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:57:59 crc kubenswrapper[4895]: I1202 08:57:59.186894 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5eee65f-af89-4505-9ab3-ce658f29d50a-config-data" (OuterVolumeSpecName: "config-data") pod "e5eee65f-af89-4505-9ab3-ce658f29d50a" (UID: "e5eee65f-af89-4505-9ab3-ce658f29d50a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:57:59 crc kubenswrapper[4895]: I1202 08:57:59.256151 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af0f2707-2c11-4b3e-bd6b-e6ff1f04a06b-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 08:57:59 crc kubenswrapper[4895]: I1202 08:57:59.256191 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af0f2707-2c11-4b3e-bd6b-e6ff1f04a06b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 08:57:59 crc kubenswrapper[4895]: I1202 08:57:59.256203 4895 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/af0f2707-2c11-4b3e-bd6b-e6ff1f04a06b-logs\") on node \"crc\" DevicePath \"\"" Dec 02 08:57:59 crc kubenswrapper[4895]: I1202 08:57:59.256213 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kqf78\" (UniqueName: \"kubernetes.io/projected/af0f2707-2c11-4b3e-bd6b-e6ff1f04a06b-kube-api-access-kqf78\") on node \"crc\" DevicePath \"\"" Dec 02 08:57:59 crc 
kubenswrapper[4895]: I1202 08:57:59.256223 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5eee65f-af89-4505-9ab3-ce658f29d50a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 08:57:59 crc kubenswrapper[4895]: I1202 08:57:59.256233 4895 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5eee65f-af89-4505-9ab3-ce658f29d50a-logs\") on node \"crc\" DevicePath \"\"" Dec 02 08:57:59 crc kubenswrapper[4895]: I1202 08:57:59.256243 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9q7zs\" (UniqueName: \"kubernetes.io/projected/e5eee65f-af89-4505-9ab3-ce658f29d50a-kube-api-access-9q7zs\") on node \"crc\" DevicePath \"\"" Dec 02 08:57:59 crc kubenswrapper[4895]: I1202 08:57:59.256252 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5eee65f-af89-4505-9ab3-ce658f29d50a-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 08:57:59 crc kubenswrapper[4895]: I1202 08:57:59.290200 4895 generic.go:334] "Generic (PLEG): container finished" podID="e5eee65f-af89-4505-9ab3-ce658f29d50a" containerID="bc3cbfdbd7a6e33e4248a28ec7818d50b9b5f15199e66a8f359d7d1e1b526bbc" exitCode=0 Dec 02 08:57:59 crc kubenswrapper[4895]: I1202 08:57:59.290254 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e5eee65f-af89-4505-9ab3-ce658f29d50a","Type":"ContainerDied","Data":"bc3cbfdbd7a6e33e4248a28ec7818d50b9b5f15199e66a8f359d7d1e1b526bbc"} Dec 02 08:57:59 crc kubenswrapper[4895]: I1202 08:57:59.290339 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e5eee65f-af89-4505-9ab3-ce658f29d50a","Type":"ContainerDied","Data":"8b37daed33787e56c2ab14f7203bfe651a6cad50f71b8aa4029d15038f11878a"} Dec 02 08:57:59 crc kubenswrapper[4895]: I1202 08:57:59.290365 4895 scope.go:117] "RemoveContainer" 
containerID="bc3cbfdbd7a6e33e4248a28ec7818d50b9b5f15199e66a8f359d7d1e1b526bbc" Dec 02 08:57:59 crc kubenswrapper[4895]: I1202 08:57:59.290634 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 02 08:57:59 crc kubenswrapper[4895]: I1202 08:57:59.294662 4895 generic.go:334] "Generic (PLEG): container finished" podID="af0f2707-2c11-4b3e-bd6b-e6ff1f04a06b" containerID="29cf49e65df2290830f430596bdb5f9b14c5ddb557546b5075c8497646e810f8" exitCode=0 Dec 02 08:57:59 crc kubenswrapper[4895]: I1202 08:57:59.294721 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 02 08:57:59 crc kubenswrapper[4895]: I1202 08:57:59.295005 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"af0f2707-2c11-4b3e-bd6b-e6ff1f04a06b","Type":"ContainerDied","Data":"29cf49e65df2290830f430596bdb5f9b14c5ddb557546b5075c8497646e810f8"} Dec 02 08:57:59 crc kubenswrapper[4895]: I1202 08:57:59.295047 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"af0f2707-2c11-4b3e-bd6b-e6ff1f04a06b","Type":"ContainerDied","Data":"e75c750f4ab4116abe6f3f0f296dc857fbabd652de25b4abfc89c52b20d24548"} Dec 02 08:57:59 crc kubenswrapper[4895]: I1202 08:57:59.318621 4895 scope.go:117] "RemoveContainer" containerID="6bf87be0829703145491fd71754129fd01a60254e1ffdde700c0de671b139a2b" Dec 02 08:57:59 crc kubenswrapper[4895]: I1202 08:57:59.339412 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 08:57:59 crc kubenswrapper[4895]: I1202 08:57:59.351945 4895 scope.go:117] "RemoveContainer" containerID="bc3cbfdbd7a6e33e4248a28ec7818d50b9b5f15199e66a8f359d7d1e1b526bbc" Dec 02 08:57:59 crc kubenswrapper[4895]: E1202 08:57:59.354240 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"bc3cbfdbd7a6e33e4248a28ec7818d50b9b5f15199e66a8f359d7d1e1b526bbc\": container with ID starting with bc3cbfdbd7a6e33e4248a28ec7818d50b9b5f15199e66a8f359d7d1e1b526bbc not found: ID does not exist" containerID="bc3cbfdbd7a6e33e4248a28ec7818d50b9b5f15199e66a8f359d7d1e1b526bbc" Dec 02 08:57:59 crc kubenswrapper[4895]: I1202 08:57:59.354292 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc3cbfdbd7a6e33e4248a28ec7818d50b9b5f15199e66a8f359d7d1e1b526bbc"} err="failed to get container status \"bc3cbfdbd7a6e33e4248a28ec7818d50b9b5f15199e66a8f359d7d1e1b526bbc\": rpc error: code = NotFound desc = could not find container \"bc3cbfdbd7a6e33e4248a28ec7818d50b9b5f15199e66a8f359d7d1e1b526bbc\": container with ID starting with bc3cbfdbd7a6e33e4248a28ec7818d50b9b5f15199e66a8f359d7d1e1b526bbc not found: ID does not exist" Dec 02 08:57:59 crc kubenswrapper[4895]: I1202 08:57:59.354339 4895 scope.go:117] "RemoveContainer" containerID="6bf87be0829703145491fd71754129fd01a60254e1ffdde700c0de671b139a2b" Dec 02 08:57:59 crc kubenswrapper[4895]: E1202 08:57:59.354600 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6bf87be0829703145491fd71754129fd01a60254e1ffdde700c0de671b139a2b\": container with ID starting with 6bf87be0829703145491fd71754129fd01a60254e1ffdde700c0de671b139a2b not found: ID does not exist" containerID="6bf87be0829703145491fd71754129fd01a60254e1ffdde700c0de671b139a2b" Dec 02 08:57:59 crc kubenswrapper[4895]: I1202 08:57:59.354638 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6bf87be0829703145491fd71754129fd01a60254e1ffdde700c0de671b139a2b"} err="failed to get container status \"6bf87be0829703145491fd71754129fd01a60254e1ffdde700c0de671b139a2b\": rpc error: code = NotFound desc = could not find container \"6bf87be0829703145491fd71754129fd01a60254e1ffdde700c0de671b139a2b\": container with ID 
starting with 6bf87be0829703145491fd71754129fd01a60254e1ffdde700c0de671b139a2b not found: ID does not exist" Dec 02 08:57:59 crc kubenswrapper[4895]: I1202 08:57:59.354659 4895 scope.go:117] "RemoveContainer" containerID="29cf49e65df2290830f430596bdb5f9b14c5ddb557546b5075c8497646e810f8" Dec 02 08:57:59 crc kubenswrapper[4895]: I1202 08:57:59.377883 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 08:57:59 crc kubenswrapper[4895]: I1202 08:57:59.395576 4895 scope.go:117] "RemoveContainer" containerID="9d50a3d978f73c6b802c1d6f41acb8196ebeb4666dad0355429f4d20864e90ed" Dec 02 08:57:59 crc kubenswrapper[4895]: I1202 08:57:59.396499 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 02 08:57:59 crc kubenswrapper[4895]: I1202 08:57:59.408566 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 02 08:57:59 crc kubenswrapper[4895]: I1202 08:57:59.435136 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 02 08:57:59 crc kubenswrapper[4895]: E1202 08:57:59.435606 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbd9d193-9ec3-49b4-8cbb-050637dc04fc" containerName="nova-manage" Dec 02 08:57:59 crc kubenswrapper[4895]: I1202 08:57:59.435624 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbd9d193-9ec3-49b4-8cbb-050637dc04fc" containerName="nova-manage" Dec 02 08:57:59 crc kubenswrapper[4895]: E1202 08:57:59.435637 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af0f2707-2c11-4b3e-bd6b-e6ff1f04a06b" containerName="nova-metadata-metadata" Dec 02 08:57:59 crc kubenswrapper[4895]: I1202 08:57:59.435644 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="af0f2707-2c11-4b3e-bd6b-e6ff1f04a06b" containerName="nova-metadata-metadata" Dec 02 08:57:59 crc kubenswrapper[4895]: E1202 08:57:59.435655 4895 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="e5eee65f-af89-4505-9ab3-ce658f29d50a" containerName="nova-api-log" Dec 02 08:57:59 crc kubenswrapper[4895]: I1202 08:57:59.435660 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5eee65f-af89-4505-9ab3-ce658f29d50a" containerName="nova-api-log" Dec 02 08:57:59 crc kubenswrapper[4895]: E1202 08:57:59.435688 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af0f2707-2c11-4b3e-bd6b-e6ff1f04a06b" containerName="nova-metadata-log" Dec 02 08:57:59 crc kubenswrapper[4895]: I1202 08:57:59.435695 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="af0f2707-2c11-4b3e-bd6b-e6ff1f04a06b" containerName="nova-metadata-log" Dec 02 08:57:59 crc kubenswrapper[4895]: E1202 08:57:59.435707 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5eee65f-af89-4505-9ab3-ce658f29d50a" containerName="nova-api-api" Dec 02 08:57:59 crc kubenswrapper[4895]: I1202 08:57:59.435712 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5eee65f-af89-4505-9ab3-ce658f29d50a" containerName="nova-api-api" Dec 02 08:57:59 crc kubenswrapper[4895]: I1202 08:57:59.435910 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="dbd9d193-9ec3-49b4-8cbb-050637dc04fc" containerName="nova-manage" Dec 02 08:57:59 crc kubenswrapper[4895]: I1202 08:57:59.435925 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5eee65f-af89-4505-9ab3-ce658f29d50a" containerName="nova-api-api" Dec 02 08:57:59 crc kubenswrapper[4895]: I1202 08:57:59.435935 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="af0f2707-2c11-4b3e-bd6b-e6ff1f04a06b" containerName="nova-metadata-log" Dec 02 08:57:59 crc kubenswrapper[4895]: I1202 08:57:59.435943 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5eee65f-af89-4505-9ab3-ce658f29d50a" containerName="nova-api-log" Dec 02 08:57:59 crc kubenswrapper[4895]: I1202 08:57:59.435953 4895 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="af0f2707-2c11-4b3e-bd6b-e6ff1f04a06b" containerName="nova-metadata-metadata" Dec 02 08:57:59 crc kubenswrapper[4895]: I1202 08:57:59.437803 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 02 08:57:59 crc kubenswrapper[4895]: I1202 08:57:59.440252 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 02 08:57:59 crc kubenswrapper[4895]: I1202 08:57:59.440555 4895 scope.go:117] "RemoveContainer" containerID="29cf49e65df2290830f430596bdb5f9b14c5ddb557546b5075c8497646e810f8" Dec 02 08:57:59 crc kubenswrapper[4895]: E1202 08:57:59.441000 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29cf49e65df2290830f430596bdb5f9b14c5ddb557546b5075c8497646e810f8\": container with ID starting with 29cf49e65df2290830f430596bdb5f9b14c5ddb557546b5075c8497646e810f8 not found: ID does not exist" containerID="29cf49e65df2290830f430596bdb5f9b14c5ddb557546b5075c8497646e810f8" Dec 02 08:57:59 crc kubenswrapper[4895]: I1202 08:57:59.441035 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29cf49e65df2290830f430596bdb5f9b14c5ddb557546b5075c8497646e810f8"} err="failed to get container status \"29cf49e65df2290830f430596bdb5f9b14c5ddb557546b5075c8497646e810f8\": rpc error: code = NotFound desc = could not find container \"29cf49e65df2290830f430596bdb5f9b14c5ddb557546b5075c8497646e810f8\": container with ID starting with 29cf49e65df2290830f430596bdb5f9b14c5ddb557546b5075c8497646e810f8 not found: ID does not exist" Dec 02 08:57:59 crc kubenswrapper[4895]: I1202 08:57:59.441064 4895 scope.go:117] "RemoveContainer" containerID="9d50a3d978f73c6b802c1d6f41acb8196ebeb4666dad0355429f4d20864e90ed" Dec 02 08:57:59 crc kubenswrapper[4895]: E1202 08:57:59.441672 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound 
desc = could not find container \"9d50a3d978f73c6b802c1d6f41acb8196ebeb4666dad0355429f4d20864e90ed\": container with ID starting with 9d50a3d978f73c6b802c1d6f41acb8196ebeb4666dad0355429f4d20864e90ed not found: ID does not exist" containerID="9d50a3d978f73c6b802c1d6f41acb8196ebeb4666dad0355429f4d20864e90ed" Dec 02 08:57:59 crc kubenswrapper[4895]: I1202 08:57:59.441700 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d50a3d978f73c6b802c1d6f41acb8196ebeb4666dad0355429f4d20864e90ed"} err="failed to get container status \"9d50a3d978f73c6b802c1d6f41acb8196ebeb4666dad0355429f4d20864e90ed\": rpc error: code = NotFound desc = could not find container \"9d50a3d978f73c6b802c1d6f41acb8196ebeb4666dad0355429f4d20864e90ed\": container with ID starting with 9d50a3d978f73c6b802c1d6f41acb8196ebeb4666dad0355429f4d20864e90ed not found: ID does not exist" Dec 02 08:57:59 crc kubenswrapper[4895]: I1202 08:57:59.459973 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 08:57:59 crc kubenswrapper[4895]: I1202 08:57:59.461601 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ba0fc5d-edf7-4eac-b2a3-0a2fa14ea6db-logs\") pod \"nova-metadata-0\" (UID: \"6ba0fc5d-edf7-4eac-b2a3-0a2fa14ea6db\") " pod="openstack/nova-metadata-0" Dec 02 08:57:59 crc kubenswrapper[4895]: I1202 08:57:59.461663 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ba0fc5d-edf7-4eac-b2a3-0a2fa14ea6db-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6ba0fc5d-edf7-4eac-b2a3-0a2fa14ea6db\") " pod="openstack/nova-metadata-0" Dec 02 08:57:59 crc kubenswrapper[4895]: I1202 08:57:59.462637 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/6ba0fc5d-edf7-4eac-b2a3-0a2fa14ea6db-config-data\") pod \"nova-metadata-0\" (UID: \"6ba0fc5d-edf7-4eac-b2a3-0a2fa14ea6db\") " pod="openstack/nova-metadata-0" Dec 02 08:57:59 crc kubenswrapper[4895]: I1202 08:57:59.462675 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zz7jn\" (UniqueName: \"kubernetes.io/projected/6ba0fc5d-edf7-4eac-b2a3-0a2fa14ea6db-kube-api-access-zz7jn\") pod \"nova-metadata-0\" (UID: \"6ba0fc5d-edf7-4eac-b2a3-0a2fa14ea6db\") " pod="openstack/nova-metadata-0" Dec 02 08:57:59 crc kubenswrapper[4895]: I1202 08:57:59.468158 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 02 08:57:59 crc kubenswrapper[4895]: I1202 08:57:59.470568 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 02 08:57:59 crc kubenswrapper[4895]: I1202 08:57:59.473658 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 02 08:57:59 crc kubenswrapper[4895]: I1202 08:57:59.476883 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 02 08:57:59 crc kubenswrapper[4895]: I1202 08:57:59.564244 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ba0fc5d-edf7-4eac-b2a3-0a2fa14ea6db-config-data\") pod \"nova-metadata-0\" (UID: \"6ba0fc5d-edf7-4eac-b2a3-0a2fa14ea6db\") " pod="openstack/nova-metadata-0" Dec 02 08:57:59 crc kubenswrapper[4895]: I1202 08:57:59.564289 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zz7jn\" (UniqueName: \"kubernetes.io/projected/6ba0fc5d-edf7-4eac-b2a3-0a2fa14ea6db-kube-api-access-zz7jn\") pod \"nova-metadata-0\" (UID: \"6ba0fc5d-edf7-4eac-b2a3-0a2fa14ea6db\") " pod="openstack/nova-metadata-0" Dec 02 08:57:59 crc kubenswrapper[4895]: I1202 08:57:59.564356 4895 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ba0fc5d-edf7-4eac-b2a3-0a2fa14ea6db-logs\") pod \"nova-metadata-0\" (UID: \"6ba0fc5d-edf7-4eac-b2a3-0a2fa14ea6db\") " pod="openstack/nova-metadata-0" Dec 02 08:57:59 crc kubenswrapper[4895]: I1202 08:57:59.564374 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ba0fc5d-edf7-4eac-b2a3-0a2fa14ea6db-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6ba0fc5d-edf7-4eac-b2a3-0a2fa14ea6db\") " pod="openstack/nova-metadata-0" Dec 02 08:57:59 crc kubenswrapper[4895]: I1202 08:57:59.565367 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ba0fc5d-edf7-4eac-b2a3-0a2fa14ea6db-logs\") pod \"nova-metadata-0\" (UID: \"6ba0fc5d-edf7-4eac-b2a3-0a2fa14ea6db\") " pod="openstack/nova-metadata-0" Dec 02 08:57:59 crc kubenswrapper[4895]: I1202 08:57:59.568557 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ba0fc5d-edf7-4eac-b2a3-0a2fa14ea6db-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6ba0fc5d-edf7-4eac-b2a3-0a2fa14ea6db\") " pod="openstack/nova-metadata-0" Dec 02 08:57:59 crc kubenswrapper[4895]: I1202 08:57:59.568663 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ba0fc5d-edf7-4eac-b2a3-0a2fa14ea6db-config-data\") pod \"nova-metadata-0\" (UID: \"6ba0fc5d-edf7-4eac-b2a3-0a2fa14ea6db\") " pod="openstack/nova-metadata-0" Dec 02 08:57:59 crc kubenswrapper[4895]: I1202 08:57:59.585584 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zz7jn\" (UniqueName: \"kubernetes.io/projected/6ba0fc5d-edf7-4eac-b2a3-0a2fa14ea6db-kube-api-access-zz7jn\") pod \"nova-metadata-0\" (UID: 
\"6ba0fc5d-edf7-4eac-b2a3-0a2fa14ea6db\") " pod="openstack/nova-metadata-0" Dec 02 08:57:59 crc kubenswrapper[4895]: I1202 08:57:59.666311 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c12acf7-543e-4b73-b873-e7ac86ad3471-config-data\") pod \"nova-api-0\" (UID: \"2c12acf7-543e-4b73-b873-e7ac86ad3471\") " pod="openstack/nova-api-0" Dec 02 08:57:59 crc kubenswrapper[4895]: I1202 08:57:59.666404 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c12acf7-543e-4b73-b873-e7ac86ad3471-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2c12acf7-543e-4b73-b873-e7ac86ad3471\") " pod="openstack/nova-api-0" Dec 02 08:57:59 crc kubenswrapper[4895]: I1202 08:57:59.667084 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2c12acf7-543e-4b73-b873-e7ac86ad3471-logs\") pod \"nova-api-0\" (UID: \"2c12acf7-543e-4b73-b873-e7ac86ad3471\") " pod="openstack/nova-api-0" Dec 02 08:57:59 crc kubenswrapper[4895]: I1202 08:57:59.667115 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmrh9\" (UniqueName: \"kubernetes.io/projected/2c12acf7-543e-4b73-b873-e7ac86ad3471-kube-api-access-fmrh9\") pod \"nova-api-0\" (UID: \"2c12acf7-543e-4b73-b873-e7ac86ad3471\") " pod="openstack/nova-api-0" Dec 02 08:58:00 crc kubenswrapper[4895]: I1202 08:57:59.759971 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 02 08:58:00 crc kubenswrapper[4895]: I1202 08:57:59.769720 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2c12acf7-543e-4b73-b873-e7ac86ad3471-logs\") pod \"nova-api-0\" (UID: \"2c12acf7-543e-4b73-b873-e7ac86ad3471\") " pod="openstack/nova-api-0" Dec 02 08:58:00 crc kubenswrapper[4895]: I1202 08:57:59.769789 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmrh9\" (UniqueName: \"kubernetes.io/projected/2c12acf7-543e-4b73-b873-e7ac86ad3471-kube-api-access-fmrh9\") pod \"nova-api-0\" (UID: \"2c12acf7-543e-4b73-b873-e7ac86ad3471\") " pod="openstack/nova-api-0" Dec 02 08:58:00 crc kubenswrapper[4895]: I1202 08:57:59.769856 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c12acf7-543e-4b73-b873-e7ac86ad3471-config-data\") pod \"nova-api-0\" (UID: \"2c12acf7-543e-4b73-b873-e7ac86ad3471\") " pod="openstack/nova-api-0" Dec 02 08:58:00 crc kubenswrapper[4895]: I1202 08:57:59.769936 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c12acf7-543e-4b73-b873-e7ac86ad3471-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2c12acf7-543e-4b73-b873-e7ac86ad3471\") " pod="openstack/nova-api-0" Dec 02 08:58:00 crc kubenswrapper[4895]: I1202 08:57:59.771241 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2c12acf7-543e-4b73-b873-e7ac86ad3471-logs\") pod \"nova-api-0\" (UID: \"2c12acf7-543e-4b73-b873-e7ac86ad3471\") " pod="openstack/nova-api-0" Dec 02 08:58:00 crc kubenswrapper[4895]: I1202 08:57:59.778392 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/2c12acf7-543e-4b73-b873-e7ac86ad3471-config-data\") pod \"nova-api-0\" (UID: \"2c12acf7-543e-4b73-b873-e7ac86ad3471\") " pod="openstack/nova-api-0" Dec 02 08:58:00 crc kubenswrapper[4895]: I1202 08:57:59.786347 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c12acf7-543e-4b73-b873-e7ac86ad3471-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2c12acf7-543e-4b73-b873-e7ac86ad3471\") " pod="openstack/nova-api-0" Dec 02 08:58:00 crc kubenswrapper[4895]: I1202 08:57:59.789071 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmrh9\" (UniqueName: \"kubernetes.io/projected/2c12acf7-543e-4b73-b873-e7ac86ad3471-kube-api-access-fmrh9\") pod \"nova-api-0\" (UID: \"2c12acf7-543e-4b73-b873-e7ac86ad3471\") " pod="openstack/nova-api-0" Dec 02 08:58:00 crc kubenswrapper[4895]: I1202 08:57:59.925290 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 02 08:58:00 crc kubenswrapper[4895]: I1202 08:58:00.081120 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f53a4dc-0378-4727-a729-b7d520e28874-combined-ca-bundle\") pod \"5f53a4dc-0378-4727-a729-b7d520e28874\" (UID: \"5f53a4dc-0378-4727-a729-b7d520e28874\") " Dec 02 08:58:00 crc kubenswrapper[4895]: I1202 08:58:00.081249 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n9p7h\" (UniqueName: \"kubernetes.io/projected/5f53a4dc-0378-4727-a729-b7d520e28874-kube-api-access-n9p7h\") pod \"5f53a4dc-0378-4727-a729-b7d520e28874\" (UID: \"5f53a4dc-0378-4727-a729-b7d520e28874\") " Dec 02 08:58:00 crc kubenswrapper[4895]: I1202 08:58:00.081697 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/5f53a4dc-0378-4727-a729-b7d520e28874-config-data\") pod \"5f53a4dc-0378-4727-a729-b7d520e28874\" (UID: \"5f53a4dc-0378-4727-a729-b7d520e28874\") " Dec 02 08:58:00 crc kubenswrapper[4895]: I1202 08:58:00.086092 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f53a4dc-0378-4727-a729-b7d520e28874-kube-api-access-n9p7h" (OuterVolumeSpecName: "kube-api-access-n9p7h") pod "5f53a4dc-0378-4727-a729-b7d520e28874" (UID: "5f53a4dc-0378-4727-a729-b7d520e28874"). InnerVolumeSpecName "kube-api-access-n9p7h". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:58:00 crc kubenswrapper[4895]: I1202 08:58:00.090890 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 02 08:58:00 crc kubenswrapper[4895]: I1202 08:58:00.113960 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f53a4dc-0378-4727-a729-b7d520e28874-config-data" (OuterVolumeSpecName: "config-data") pod "5f53a4dc-0378-4727-a729-b7d520e28874" (UID: "5f53a4dc-0378-4727-a729-b7d520e28874"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:58:00 crc kubenswrapper[4895]: I1202 08:58:00.122638 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f53a4dc-0378-4727-a729-b7d520e28874-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5f53a4dc-0378-4727-a729-b7d520e28874" (UID: "5f53a4dc-0378-4727-a729-b7d520e28874"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:58:00 crc kubenswrapper[4895]: I1202 08:58:00.184300 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f53a4dc-0378-4727-a729-b7d520e28874-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 08:58:00 crc kubenswrapper[4895]: I1202 08:58:00.184341 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f53a4dc-0378-4727-a729-b7d520e28874-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 08:58:00 crc kubenswrapper[4895]: I1202 08:58:00.184358 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n9p7h\" (UniqueName: \"kubernetes.io/projected/5f53a4dc-0378-4727-a729-b7d520e28874-kube-api-access-n9p7h\") on node \"crc\" DevicePath \"\"" Dec 02 08:58:00 crc kubenswrapper[4895]: I1202 08:58:00.306043 4895 generic.go:334] "Generic (PLEG): container finished" podID="5f53a4dc-0378-4727-a729-b7d520e28874" containerID="ec2a547d351296c4f7eba994d9d399694e172c25f99b50e500266f6a2706934b" exitCode=0 Dec 02 08:58:00 crc kubenswrapper[4895]: I1202 08:58:00.306096 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 02 08:58:00 crc kubenswrapper[4895]: I1202 08:58:00.306129 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"5f53a4dc-0378-4727-a729-b7d520e28874","Type":"ContainerDied","Data":"ec2a547d351296c4f7eba994d9d399694e172c25f99b50e500266f6a2706934b"} Dec 02 08:58:00 crc kubenswrapper[4895]: I1202 08:58:00.306192 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"5f53a4dc-0378-4727-a729-b7d520e28874","Type":"ContainerDied","Data":"05807c785822ee010894c5005d9a1dfce3045b15a7f219c89e08dee43a9a9a4b"} Dec 02 08:58:00 crc kubenswrapper[4895]: I1202 08:58:00.306215 4895 scope.go:117] "RemoveContainer" containerID="ec2a547d351296c4f7eba994d9d399694e172c25f99b50e500266f6a2706934b" Dec 02 08:58:00 crc kubenswrapper[4895]: I1202 08:58:00.337400 4895 scope.go:117] "RemoveContainer" containerID="ec2a547d351296c4f7eba994d9d399694e172c25f99b50e500266f6a2706934b" Dec 02 08:58:00 crc kubenswrapper[4895]: E1202 08:58:00.338296 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec2a547d351296c4f7eba994d9d399694e172c25f99b50e500266f6a2706934b\": container with ID starting with ec2a547d351296c4f7eba994d9d399694e172c25f99b50e500266f6a2706934b not found: ID does not exist" containerID="ec2a547d351296c4f7eba994d9d399694e172c25f99b50e500266f6a2706934b" Dec 02 08:58:00 crc kubenswrapper[4895]: I1202 08:58:00.338340 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec2a547d351296c4f7eba994d9d399694e172c25f99b50e500266f6a2706934b"} err="failed to get container status \"ec2a547d351296c4f7eba994d9d399694e172c25f99b50e500266f6a2706934b\": rpc error: code = NotFound desc = could not find container \"ec2a547d351296c4f7eba994d9d399694e172c25f99b50e500266f6a2706934b\": container with ID starting with 
ec2a547d351296c4f7eba994d9d399694e172c25f99b50e500266f6a2706934b not found: ID does not exist" Dec 02 08:58:00 crc kubenswrapper[4895]: I1202 08:58:00.361025 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 08:58:00 crc kubenswrapper[4895]: I1202 08:58:00.375665 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 08:58:00 crc kubenswrapper[4895]: I1202 08:58:00.415280 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 08:58:00 crc kubenswrapper[4895]: E1202 08:58:00.416399 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f53a4dc-0378-4727-a729-b7d520e28874" containerName="nova-scheduler-scheduler" Dec 02 08:58:00 crc kubenswrapper[4895]: I1202 08:58:00.416438 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f53a4dc-0378-4727-a729-b7d520e28874" containerName="nova-scheduler-scheduler" Dec 02 08:58:00 crc kubenswrapper[4895]: I1202 08:58:00.416846 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f53a4dc-0378-4727-a729-b7d520e28874" containerName="nova-scheduler-scheduler" Dec 02 08:58:00 crc kubenswrapper[4895]: I1202 08:58:00.417831 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 02 08:58:00 crc kubenswrapper[4895]: I1202 08:58:00.420120 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 02 08:58:00 crc kubenswrapper[4895]: I1202 08:58:00.427356 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 08:58:00 crc kubenswrapper[4895]: I1202 08:58:00.493052 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntrmx\" (UniqueName: \"kubernetes.io/projected/4581700a-677c-4be3-b004-b53b8b4d5f42-kube-api-access-ntrmx\") pod \"nova-scheduler-0\" (UID: \"4581700a-677c-4be3-b004-b53b8b4d5f42\") " pod="openstack/nova-scheduler-0" Dec 02 08:58:00 crc kubenswrapper[4895]: I1202 08:58:00.493131 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4581700a-677c-4be3-b004-b53b8b4d5f42-config-data\") pod \"nova-scheduler-0\" (UID: \"4581700a-677c-4be3-b004-b53b8b4d5f42\") " pod="openstack/nova-scheduler-0" Dec 02 08:58:00 crc kubenswrapper[4895]: I1202 08:58:00.493401 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4581700a-677c-4be3-b004-b53b8b4d5f42-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"4581700a-677c-4be3-b004-b53b8b4d5f42\") " pod="openstack/nova-scheduler-0" Dec 02 08:58:00 crc kubenswrapper[4895]: I1202 08:58:00.595441 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ntrmx\" (UniqueName: \"kubernetes.io/projected/4581700a-677c-4be3-b004-b53b8b4d5f42-kube-api-access-ntrmx\") pod \"nova-scheduler-0\" (UID: \"4581700a-677c-4be3-b004-b53b8b4d5f42\") " pod="openstack/nova-scheduler-0" Dec 02 08:58:00 crc kubenswrapper[4895]: I1202 08:58:00.595559 4895 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4581700a-677c-4be3-b004-b53b8b4d5f42-config-data\") pod \"nova-scheduler-0\" (UID: \"4581700a-677c-4be3-b004-b53b8b4d5f42\") " pod="openstack/nova-scheduler-0" Dec 02 08:58:00 crc kubenswrapper[4895]: I1202 08:58:00.595604 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4581700a-677c-4be3-b004-b53b8b4d5f42-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"4581700a-677c-4be3-b004-b53b8b4d5f42\") " pod="openstack/nova-scheduler-0" Dec 02 08:58:00 crc kubenswrapper[4895]: I1202 08:58:00.599903 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4581700a-677c-4be3-b004-b53b8b4d5f42-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"4581700a-677c-4be3-b004-b53b8b4d5f42\") " pod="openstack/nova-scheduler-0" Dec 02 08:58:00 crc kubenswrapper[4895]: I1202 08:58:00.601462 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4581700a-677c-4be3-b004-b53b8b4d5f42-config-data\") pod \"nova-scheduler-0\" (UID: \"4581700a-677c-4be3-b004-b53b8b4d5f42\") " pod="openstack/nova-scheduler-0" Dec 02 08:58:00 crc kubenswrapper[4895]: I1202 08:58:00.612179 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntrmx\" (UniqueName: \"kubernetes.io/projected/4581700a-677c-4be3-b004-b53b8b4d5f42-kube-api-access-ntrmx\") pod \"nova-scheduler-0\" (UID: \"4581700a-677c-4be3-b004-b53b8b4d5f42\") " pod="openstack/nova-scheduler-0" Dec 02 08:58:00 crc kubenswrapper[4895]: I1202 08:58:00.753466 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 02 08:58:00 crc kubenswrapper[4895]: I1202 08:58:00.876281 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 02 08:58:00 crc kubenswrapper[4895]: W1202 08:58:00.880356 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6ba0fc5d_edf7_4eac_b2a3_0a2fa14ea6db.slice/crio-3c6e1c10a9007d54d03b7f7a089173d86a6c667249ba495394a6209f673e5e0e WatchSource:0}: Error finding container 3c6e1c10a9007d54d03b7f7a089173d86a6c667249ba495394a6209f673e5e0e: Status 404 returned error can't find the container with id 3c6e1c10a9007d54d03b7f7a089173d86a6c667249ba495394a6209f673e5e0e Dec 02 08:58:00 crc kubenswrapper[4895]: I1202 08:58:00.887325 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 08:58:01 crc kubenswrapper[4895]: I1202 08:58:01.153821 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f53a4dc-0378-4727-a729-b7d520e28874" path="/var/lib/kubelet/pods/5f53a4dc-0378-4727-a729-b7d520e28874/volumes" Dec 02 08:58:01 crc kubenswrapper[4895]: I1202 08:58:01.154673 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af0f2707-2c11-4b3e-bd6b-e6ff1f04a06b" path="/var/lib/kubelet/pods/af0f2707-2c11-4b3e-bd6b-e6ff1f04a06b/volumes" Dec 02 08:58:01 crc kubenswrapper[4895]: I1202 08:58:01.155874 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5eee65f-af89-4505-9ab3-ce658f29d50a" path="/var/lib/kubelet/pods/e5eee65f-af89-4505-9ab3-ce658f29d50a/volumes" Dec 02 08:58:01 crc kubenswrapper[4895]: I1202 08:58:01.210879 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 08:58:01 crc kubenswrapper[4895]: W1202 08:58:01.222165 4895 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4581700a_677c_4be3_b004_b53b8b4d5f42.slice/crio-24ea3c73248f60453788a9aab0370f25558ad7f6ea7958fc301ee41f35020fd6 WatchSource:0}: Error finding container 24ea3c73248f60453788a9aab0370f25558ad7f6ea7958fc301ee41f35020fd6: Status 404 returned error can't find the container with id 24ea3c73248f60453788a9aab0370f25558ad7f6ea7958fc301ee41f35020fd6 Dec 02 08:58:01 crc kubenswrapper[4895]: I1202 08:58:01.324153 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6ba0fc5d-edf7-4eac-b2a3-0a2fa14ea6db","Type":"ContainerStarted","Data":"347642b4e949226d454594ff3f8477f25e57b0b2c4ed371a8a76046b238c8a5a"} Dec 02 08:58:01 crc kubenswrapper[4895]: I1202 08:58:01.324538 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6ba0fc5d-edf7-4eac-b2a3-0a2fa14ea6db","Type":"ContainerStarted","Data":"d8de976055e2285d9f987e2065884fe0f2a2279b0e9d2ed374f542fbcfb8e423"} Dec 02 08:58:01 crc kubenswrapper[4895]: I1202 08:58:01.324551 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6ba0fc5d-edf7-4eac-b2a3-0a2fa14ea6db","Type":"ContainerStarted","Data":"3c6e1c10a9007d54d03b7f7a089173d86a6c667249ba495394a6209f673e5e0e"} Dec 02 08:58:01 crc kubenswrapper[4895]: I1202 08:58:01.325848 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4581700a-677c-4be3-b004-b53b8b4d5f42","Type":"ContainerStarted","Data":"24ea3c73248f60453788a9aab0370f25558ad7f6ea7958fc301ee41f35020fd6"} Dec 02 08:58:01 crc kubenswrapper[4895]: I1202 08:58:01.328767 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2c12acf7-543e-4b73-b873-e7ac86ad3471","Type":"ContainerStarted","Data":"288d48d3ee50f6f76b000d7f0fdc3530e8cb8da4f539b53aca09276d2c1b02e0"} Dec 02 08:58:01 crc kubenswrapper[4895]: I1202 08:58:01.328807 4895 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2c12acf7-543e-4b73-b873-e7ac86ad3471","Type":"ContainerStarted","Data":"5e2275ac273a05e40f70400e8a866ffeb0ec30b5749cbb8ed8117b0bb37e9c01"} Dec 02 08:58:01 crc kubenswrapper[4895]: I1202 08:58:01.328820 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2c12acf7-543e-4b73-b873-e7ac86ad3471","Type":"ContainerStarted","Data":"3a9cfe2fab6aed8a2019f8a7f701da99e42f28262d7ad2b5aa92ba4edc61a3f1"} Dec 02 08:58:01 crc kubenswrapper[4895]: I1202 08:58:01.348984 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.348960031 podStartE2EDuration="2.348960031s" podCreationTimestamp="2025-12-02 08:57:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:58:01.341959393 +0000 UTC m=+5692.512819036" watchObservedRunningTime="2025-12-02 08:58:01.348960031 +0000 UTC m=+5692.519819654" Dec 02 08:58:01 crc kubenswrapper[4895]: I1202 08:58:01.369082 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.369060636 podStartE2EDuration="2.369060636s" podCreationTimestamp="2025-12-02 08:57:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:58:01.360035195 +0000 UTC m=+5692.530894808" watchObservedRunningTime="2025-12-02 08:58:01.369060636 +0000 UTC m=+5692.539920249" Dec 02 08:58:02 crc kubenswrapper[4895]: I1202 08:58:02.347425 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4581700a-677c-4be3-b004-b53b8b4d5f42","Type":"ContainerStarted","Data":"9b134a5d60406bb0566f1fcda9c18ad274584701d44af124bfce3f30ba0fd952"} Dec 02 08:58:02 crc kubenswrapper[4895]: I1202 
08:58:02.370804 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.37078684 podStartE2EDuration="2.37078684s" podCreationTimestamp="2025-12-02 08:58:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:58:02.367130576 +0000 UTC m=+5693.537990189" watchObservedRunningTime="2025-12-02 08:58:02.37078684 +0000 UTC m=+5693.541646463" Dec 02 08:58:04 crc kubenswrapper[4895]: I1202 08:58:04.760576 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 02 08:58:04 crc kubenswrapper[4895]: I1202 08:58:04.760931 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 02 08:58:05 crc kubenswrapper[4895]: I1202 08:58:05.473559 4895 patch_prober.go:28] interesting pod/machine-config-daemon-wfcg7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 08:58:05 crc kubenswrapper[4895]: I1202 08:58:05.474016 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 08:58:05 crc kubenswrapper[4895]: I1202 08:58:05.474080 4895 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" Dec 02 08:58:05 crc kubenswrapper[4895]: I1202 08:58:05.475113 4895 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"d95d9d738930381cc247f4992573f82ec7c89226f23be0d771d1a3b1b3cf1812"} pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 08:58:05 crc kubenswrapper[4895]: I1202 08:58:05.475198 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerName="machine-config-daemon" containerID="cri-o://d95d9d738930381cc247f4992573f82ec7c89226f23be0d771d1a3b1b3cf1812" gracePeriod=600 Dec 02 08:58:05 crc kubenswrapper[4895]: E1202 08:58:05.595969 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 08:58:05 crc kubenswrapper[4895]: I1202 08:58:05.754844 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 02 08:58:06 crc kubenswrapper[4895]: I1202 08:58:06.388331 4895 generic.go:334] "Generic (PLEG): container finished" podID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerID="d95d9d738930381cc247f4992573f82ec7c89226f23be0d771d1a3b1b3cf1812" exitCode=0 Dec 02 08:58:06 crc kubenswrapper[4895]: I1202 08:58:06.388380 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" event={"ID":"0468d2d1-a975-45a6-af9f-47adc432fab0","Type":"ContainerDied","Data":"d95d9d738930381cc247f4992573f82ec7c89226f23be0d771d1a3b1b3cf1812"} Dec 02 08:58:06 crc kubenswrapper[4895]: I1202 08:58:06.388416 4895 scope.go:117] 
"RemoveContainer" containerID="0ee4c8392d6e79739cbb4ca35ecfead7d1526fc2afd1bf1fe50512c39f515cec" Dec 02 08:58:06 crc kubenswrapper[4895]: I1202 08:58:06.389396 4895 scope.go:117] "RemoveContainer" containerID="d95d9d738930381cc247f4992573f82ec7c89226f23be0d771d1a3b1b3cf1812" Dec 02 08:58:06 crc kubenswrapper[4895]: E1202 08:58:06.389846 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 08:58:09 crc kubenswrapper[4895]: I1202 08:58:09.760385 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 02 08:58:09 crc kubenswrapper[4895]: I1202 08:58:09.761935 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 02 08:58:10 crc kubenswrapper[4895]: I1202 08:58:10.092173 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 02 08:58:10 crc kubenswrapper[4895]: I1202 08:58:10.092447 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 02 08:58:10 crc kubenswrapper[4895]: I1202 08:58:10.754103 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 02 08:58:10 crc kubenswrapper[4895]: I1202 08:58:10.788606 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 02 08:58:10 crc kubenswrapper[4895]: I1202 08:58:10.845109 4895 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="6ba0fc5d-edf7-4eac-b2a3-0a2fa14ea6db" 
containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.69:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 02 08:58:10 crc kubenswrapper[4895]: I1202 08:58:10.845495 4895 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="6ba0fc5d-edf7-4eac-b2a3-0a2fa14ea6db" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.69:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 02 08:58:11 crc kubenswrapper[4895]: I1202 08:58:11.175999 4895 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="2c12acf7-543e-4b73-b873-e7ac86ad3471" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.70:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 02 08:58:11 crc kubenswrapper[4895]: I1202 08:58:11.176026 4895 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="2c12acf7-543e-4b73-b873-e7ac86ad3471" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.70:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 02 08:58:11 crc kubenswrapper[4895]: I1202 08:58:11.478854 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 02 08:58:17 crc kubenswrapper[4895]: I1202 08:58:17.140846 4895 scope.go:117] "RemoveContainer" containerID="d95d9d738930381cc247f4992573f82ec7c89226f23be0d771d1a3b1b3cf1812" Dec 02 08:58:17 crc kubenswrapper[4895]: E1202 08:58:17.141557 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 08:58:19 crc kubenswrapper[4895]: I1202 08:58:19.764142 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 02 08:58:19 crc kubenswrapper[4895]: I1202 08:58:19.764312 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 02 08:58:19 crc kubenswrapper[4895]: I1202 08:58:19.765831 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 02 08:58:19 crc kubenswrapper[4895]: I1202 08:58:19.767411 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 02 08:58:20 crc kubenswrapper[4895]: I1202 08:58:20.097010 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 02 08:58:20 crc kubenswrapper[4895]: I1202 08:58:20.097366 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 02 08:58:20 crc kubenswrapper[4895]: I1202 08:58:20.097794 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 02 08:58:20 crc kubenswrapper[4895]: I1202 08:58:20.097892 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 02 08:58:20 crc kubenswrapper[4895]: I1202 08:58:20.100266 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 02 08:58:20 crc kubenswrapper[4895]: I1202 08:58:20.106403 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 02 08:58:20 crc kubenswrapper[4895]: I1202 08:58:20.313847 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7995d4655c-tqhwq"] Dec 02 08:58:20 crc kubenswrapper[4895]: I1202 08:58:20.315904 4895 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7995d4655c-tqhwq" Dec 02 08:58:20 crc kubenswrapper[4895]: I1202 08:58:20.333694 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7995d4655c-tqhwq"] Dec 02 08:58:20 crc kubenswrapper[4895]: I1202 08:58:20.494983 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dr9m\" (UniqueName: \"kubernetes.io/projected/bb158117-fd9d-4a7a-9bbb-68ae8d292a55-kube-api-access-8dr9m\") pod \"dnsmasq-dns-7995d4655c-tqhwq\" (UID: \"bb158117-fd9d-4a7a-9bbb-68ae8d292a55\") " pod="openstack/dnsmasq-dns-7995d4655c-tqhwq" Dec 02 08:58:20 crc kubenswrapper[4895]: I1202 08:58:20.495075 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb158117-fd9d-4a7a-9bbb-68ae8d292a55-config\") pod \"dnsmasq-dns-7995d4655c-tqhwq\" (UID: \"bb158117-fd9d-4a7a-9bbb-68ae8d292a55\") " pod="openstack/dnsmasq-dns-7995d4655c-tqhwq" Dec 02 08:58:20 crc kubenswrapper[4895]: I1202 08:58:20.495202 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bb158117-fd9d-4a7a-9bbb-68ae8d292a55-ovsdbserver-sb\") pod \"dnsmasq-dns-7995d4655c-tqhwq\" (UID: \"bb158117-fd9d-4a7a-9bbb-68ae8d292a55\") " pod="openstack/dnsmasq-dns-7995d4655c-tqhwq" Dec 02 08:58:20 crc kubenswrapper[4895]: I1202 08:58:20.495353 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bb158117-fd9d-4a7a-9bbb-68ae8d292a55-dns-svc\") pod \"dnsmasq-dns-7995d4655c-tqhwq\" (UID: \"bb158117-fd9d-4a7a-9bbb-68ae8d292a55\") " pod="openstack/dnsmasq-dns-7995d4655c-tqhwq" Dec 02 08:58:20 crc kubenswrapper[4895]: I1202 08:58:20.495407 4895 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bb158117-fd9d-4a7a-9bbb-68ae8d292a55-ovsdbserver-nb\") pod \"dnsmasq-dns-7995d4655c-tqhwq\" (UID: \"bb158117-fd9d-4a7a-9bbb-68ae8d292a55\") " pod="openstack/dnsmasq-dns-7995d4655c-tqhwq" Dec 02 08:58:20 crc kubenswrapper[4895]: I1202 08:58:20.597868 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb158117-fd9d-4a7a-9bbb-68ae8d292a55-config\") pod \"dnsmasq-dns-7995d4655c-tqhwq\" (UID: \"bb158117-fd9d-4a7a-9bbb-68ae8d292a55\") " pod="openstack/dnsmasq-dns-7995d4655c-tqhwq" Dec 02 08:58:20 crc kubenswrapper[4895]: I1202 08:58:20.597949 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bb158117-fd9d-4a7a-9bbb-68ae8d292a55-ovsdbserver-sb\") pod \"dnsmasq-dns-7995d4655c-tqhwq\" (UID: \"bb158117-fd9d-4a7a-9bbb-68ae8d292a55\") " pod="openstack/dnsmasq-dns-7995d4655c-tqhwq" Dec 02 08:58:20 crc kubenswrapper[4895]: I1202 08:58:20.598011 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bb158117-fd9d-4a7a-9bbb-68ae8d292a55-dns-svc\") pod \"dnsmasq-dns-7995d4655c-tqhwq\" (UID: \"bb158117-fd9d-4a7a-9bbb-68ae8d292a55\") " pod="openstack/dnsmasq-dns-7995d4655c-tqhwq" Dec 02 08:58:20 crc kubenswrapper[4895]: I1202 08:58:20.598067 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bb158117-fd9d-4a7a-9bbb-68ae8d292a55-ovsdbserver-nb\") pod \"dnsmasq-dns-7995d4655c-tqhwq\" (UID: \"bb158117-fd9d-4a7a-9bbb-68ae8d292a55\") " pod="openstack/dnsmasq-dns-7995d4655c-tqhwq" Dec 02 08:58:20 crc kubenswrapper[4895]: I1202 08:58:20.598154 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-8dr9m\" (UniqueName: \"kubernetes.io/projected/bb158117-fd9d-4a7a-9bbb-68ae8d292a55-kube-api-access-8dr9m\") pod \"dnsmasq-dns-7995d4655c-tqhwq\" (UID: \"bb158117-fd9d-4a7a-9bbb-68ae8d292a55\") " pod="openstack/dnsmasq-dns-7995d4655c-tqhwq" Dec 02 08:58:20 crc kubenswrapper[4895]: I1202 08:58:20.599444 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb158117-fd9d-4a7a-9bbb-68ae8d292a55-config\") pod \"dnsmasq-dns-7995d4655c-tqhwq\" (UID: \"bb158117-fd9d-4a7a-9bbb-68ae8d292a55\") " pod="openstack/dnsmasq-dns-7995d4655c-tqhwq" Dec 02 08:58:20 crc kubenswrapper[4895]: I1202 08:58:20.600020 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bb158117-fd9d-4a7a-9bbb-68ae8d292a55-ovsdbserver-sb\") pod \"dnsmasq-dns-7995d4655c-tqhwq\" (UID: \"bb158117-fd9d-4a7a-9bbb-68ae8d292a55\") " pod="openstack/dnsmasq-dns-7995d4655c-tqhwq" Dec 02 08:58:20 crc kubenswrapper[4895]: I1202 08:58:20.600642 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bb158117-fd9d-4a7a-9bbb-68ae8d292a55-dns-svc\") pod \"dnsmasq-dns-7995d4655c-tqhwq\" (UID: \"bb158117-fd9d-4a7a-9bbb-68ae8d292a55\") " pod="openstack/dnsmasq-dns-7995d4655c-tqhwq" Dec 02 08:58:20 crc kubenswrapper[4895]: I1202 08:58:20.601337 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bb158117-fd9d-4a7a-9bbb-68ae8d292a55-ovsdbserver-nb\") pod \"dnsmasq-dns-7995d4655c-tqhwq\" (UID: \"bb158117-fd9d-4a7a-9bbb-68ae8d292a55\") " pod="openstack/dnsmasq-dns-7995d4655c-tqhwq" Dec 02 08:58:20 crc kubenswrapper[4895]: I1202 08:58:20.624637 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dr9m\" (UniqueName: 
\"kubernetes.io/projected/bb158117-fd9d-4a7a-9bbb-68ae8d292a55-kube-api-access-8dr9m\") pod \"dnsmasq-dns-7995d4655c-tqhwq\" (UID: \"bb158117-fd9d-4a7a-9bbb-68ae8d292a55\") " pod="openstack/dnsmasq-dns-7995d4655c-tqhwq" Dec 02 08:58:20 crc kubenswrapper[4895]: I1202 08:58:20.648165 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7995d4655c-tqhwq" Dec 02 08:58:21 crc kubenswrapper[4895]: I1202 08:58:21.228089 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7995d4655c-tqhwq"] Dec 02 08:58:21 crc kubenswrapper[4895]: W1202 08:58:21.229857 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbb158117_fd9d_4a7a_9bbb_68ae8d292a55.slice/crio-42b987c6bac923d5e49bcf3f51fc3ed1fc7744b33d052547c8fd89b4262bfbb3 WatchSource:0}: Error finding container 42b987c6bac923d5e49bcf3f51fc3ed1fc7744b33d052547c8fd89b4262bfbb3: Status 404 returned error can't find the container with id 42b987c6bac923d5e49bcf3f51fc3ed1fc7744b33d052547c8fd89b4262bfbb3 Dec 02 08:58:21 crc kubenswrapper[4895]: I1202 08:58:21.551668 4895 generic.go:334] "Generic (PLEG): container finished" podID="bb158117-fd9d-4a7a-9bbb-68ae8d292a55" containerID="57a8b1693c738fe772145bf176d158cb33e31ee7aaa002a5ca02fda5c1799f26" exitCode=0 Dec 02 08:58:21 crc kubenswrapper[4895]: I1202 08:58:21.553402 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7995d4655c-tqhwq" event={"ID":"bb158117-fd9d-4a7a-9bbb-68ae8d292a55","Type":"ContainerDied","Data":"57a8b1693c738fe772145bf176d158cb33e31ee7aaa002a5ca02fda5c1799f26"} Dec 02 08:58:21 crc kubenswrapper[4895]: I1202 08:58:21.553442 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7995d4655c-tqhwq" event={"ID":"bb158117-fd9d-4a7a-9bbb-68ae8d292a55","Type":"ContainerStarted","Data":"42b987c6bac923d5e49bcf3f51fc3ed1fc7744b33d052547c8fd89b4262bfbb3"} Dec 02 
08:58:22 crc kubenswrapper[4895]: I1202 08:58:22.564226 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7995d4655c-tqhwq" event={"ID":"bb158117-fd9d-4a7a-9bbb-68ae8d292a55","Type":"ContainerStarted","Data":"38d3472bc99b3c76ae64de6eb3753c7a04f3ffdeac238a4f81f5b4a605d4f08b"} Dec 02 08:58:22 crc kubenswrapper[4895]: I1202 08:58:22.565088 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7995d4655c-tqhwq" Dec 02 08:58:22 crc kubenswrapper[4895]: I1202 08:58:22.586077 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7995d4655c-tqhwq" podStartSLOduration=2.586057832 podStartE2EDuration="2.586057832s" podCreationTimestamp="2025-12-02 08:58:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:58:22.582642825 +0000 UTC m=+5713.753502438" watchObservedRunningTime="2025-12-02 08:58:22.586057832 +0000 UTC m=+5713.756917435" Dec 02 08:58:24 crc kubenswrapper[4895]: I1202 08:58:24.244497 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-pw6cq"] Dec 02 08:58:24 crc kubenswrapper[4895]: I1202 08:58:24.247475 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-pw6cq" Dec 02 08:58:24 crc kubenswrapper[4895]: I1202 08:58:24.261896 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pw6cq"] Dec 02 08:58:24 crc kubenswrapper[4895]: I1202 08:58:24.385209 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rsb5p\" (UniqueName: \"kubernetes.io/projected/887e8023-35ab-4221-bc25-86a9c5a096e7-kube-api-access-rsb5p\") pod \"redhat-operators-pw6cq\" (UID: \"887e8023-35ab-4221-bc25-86a9c5a096e7\") " pod="openshift-marketplace/redhat-operators-pw6cq" Dec 02 08:58:24 crc kubenswrapper[4895]: I1202 08:58:24.385254 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/887e8023-35ab-4221-bc25-86a9c5a096e7-utilities\") pod \"redhat-operators-pw6cq\" (UID: \"887e8023-35ab-4221-bc25-86a9c5a096e7\") " pod="openshift-marketplace/redhat-operators-pw6cq" Dec 02 08:58:24 crc kubenswrapper[4895]: I1202 08:58:24.385576 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/887e8023-35ab-4221-bc25-86a9c5a096e7-catalog-content\") pod \"redhat-operators-pw6cq\" (UID: \"887e8023-35ab-4221-bc25-86a9c5a096e7\") " pod="openshift-marketplace/redhat-operators-pw6cq" Dec 02 08:58:24 crc kubenswrapper[4895]: I1202 08:58:24.487527 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rsb5p\" (UniqueName: \"kubernetes.io/projected/887e8023-35ab-4221-bc25-86a9c5a096e7-kube-api-access-rsb5p\") pod \"redhat-operators-pw6cq\" (UID: \"887e8023-35ab-4221-bc25-86a9c5a096e7\") " pod="openshift-marketplace/redhat-operators-pw6cq" Dec 02 08:58:24 crc kubenswrapper[4895]: I1202 08:58:24.487582 4895 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/887e8023-35ab-4221-bc25-86a9c5a096e7-utilities\") pod \"redhat-operators-pw6cq\" (UID: \"887e8023-35ab-4221-bc25-86a9c5a096e7\") " pod="openshift-marketplace/redhat-operators-pw6cq" Dec 02 08:58:24 crc kubenswrapper[4895]: I1202 08:58:24.487652 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/887e8023-35ab-4221-bc25-86a9c5a096e7-catalog-content\") pod \"redhat-operators-pw6cq\" (UID: \"887e8023-35ab-4221-bc25-86a9c5a096e7\") " pod="openshift-marketplace/redhat-operators-pw6cq" Dec 02 08:58:24 crc kubenswrapper[4895]: I1202 08:58:24.488351 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/887e8023-35ab-4221-bc25-86a9c5a096e7-catalog-content\") pod \"redhat-operators-pw6cq\" (UID: \"887e8023-35ab-4221-bc25-86a9c5a096e7\") " pod="openshift-marketplace/redhat-operators-pw6cq" Dec 02 08:58:24 crc kubenswrapper[4895]: I1202 08:58:24.489017 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/887e8023-35ab-4221-bc25-86a9c5a096e7-utilities\") pod \"redhat-operators-pw6cq\" (UID: \"887e8023-35ab-4221-bc25-86a9c5a096e7\") " pod="openshift-marketplace/redhat-operators-pw6cq" Dec 02 08:58:24 crc kubenswrapper[4895]: I1202 08:58:24.512231 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rsb5p\" (UniqueName: \"kubernetes.io/projected/887e8023-35ab-4221-bc25-86a9c5a096e7-kube-api-access-rsb5p\") pod \"redhat-operators-pw6cq\" (UID: \"887e8023-35ab-4221-bc25-86a9c5a096e7\") " pod="openshift-marketplace/redhat-operators-pw6cq" Dec 02 08:58:24 crc kubenswrapper[4895]: I1202 08:58:24.568464 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-pw6cq" Dec 02 08:58:25 crc kubenswrapper[4895]: W1202 08:58:25.108076 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod887e8023_35ab_4221_bc25_86a9c5a096e7.slice/crio-5a6c1ca83182bd672b7f2b5acacc52dbd545497343fdf7932eed15d0e11eaa8d WatchSource:0}: Error finding container 5a6c1ca83182bd672b7f2b5acacc52dbd545497343fdf7932eed15d0e11eaa8d: Status 404 returned error can't find the container with id 5a6c1ca83182bd672b7f2b5acacc52dbd545497343fdf7932eed15d0e11eaa8d Dec 02 08:58:25 crc kubenswrapper[4895]: I1202 08:58:25.108441 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pw6cq"] Dec 02 08:58:25 crc kubenswrapper[4895]: I1202 08:58:25.593470 4895 generic.go:334] "Generic (PLEG): container finished" podID="887e8023-35ab-4221-bc25-86a9c5a096e7" containerID="b7af8156261aa97c7894e29a75af0d22d73a8e69af3898f91efbe3557f435110" exitCode=0 Dec 02 08:58:25 crc kubenswrapper[4895]: I1202 08:58:25.593586 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pw6cq" event={"ID":"887e8023-35ab-4221-bc25-86a9c5a096e7","Type":"ContainerDied","Data":"b7af8156261aa97c7894e29a75af0d22d73a8e69af3898f91efbe3557f435110"} Dec 02 08:58:25 crc kubenswrapper[4895]: I1202 08:58:25.593835 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pw6cq" event={"ID":"887e8023-35ab-4221-bc25-86a9c5a096e7","Type":"ContainerStarted","Data":"5a6c1ca83182bd672b7f2b5acacc52dbd545497343fdf7932eed15d0e11eaa8d"} Dec 02 08:58:25 crc kubenswrapper[4895]: I1202 08:58:25.595849 4895 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 02 08:58:28 crc kubenswrapper[4895]: I1202 08:58:28.617084 4895 generic.go:334] "Generic (PLEG): container finished" 
podID="887e8023-35ab-4221-bc25-86a9c5a096e7" containerID="b51361206c7c1c424a288910bd3d6485b9285279e7240d213574bd29cd1fdad8" exitCode=0 Dec 02 08:58:28 crc kubenswrapper[4895]: I1202 08:58:28.617177 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pw6cq" event={"ID":"887e8023-35ab-4221-bc25-86a9c5a096e7","Type":"ContainerDied","Data":"b51361206c7c1c424a288910bd3d6485b9285279e7240d213574bd29cd1fdad8"} Dec 02 08:58:29 crc kubenswrapper[4895]: I1202 08:58:29.148323 4895 scope.go:117] "RemoveContainer" containerID="d95d9d738930381cc247f4992573f82ec7c89226f23be0d771d1a3b1b3cf1812" Dec 02 08:58:29 crc kubenswrapper[4895]: E1202 08:58:29.148589 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 08:58:29 crc kubenswrapper[4895]: I1202 08:58:29.664886 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pw6cq" event={"ID":"887e8023-35ab-4221-bc25-86a9c5a096e7","Type":"ContainerStarted","Data":"4c5bafb4cfb668a90be05235c4226b37d8a5a6afd8a65a78fbdf06efc15c4796"} Dec 02 08:58:29 crc kubenswrapper[4895]: I1202 08:58:29.683905 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-pw6cq" podStartSLOduration=2.047813888 podStartE2EDuration="5.683877297s" podCreationTimestamp="2025-12-02 08:58:24 +0000 UTC" firstStartedPulling="2025-12-02 08:58:25.595527197 +0000 UTC m=+5716.766386810" lastFinishedPulling="2025-12-02 08:58:29.231590606 +0000 UTC m=+5720.402450219" observedRunningTime="2025-12-02 08:58:29.681869195 +0000 UTC m=+5720.852728828" 
watchObservedRunningTime="2025-12-02 08:58:29.683877297 +0000 UTC m=+5720.854736910" Dec 02 08:58:30 crc kubenswrapper[4895]: I1202 08:58:30.649961 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7995d4655c-tqhwq" Dec 02 08:58:30 crc kubenswrapper[4895]: I1202 08:58:30.714263 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bc9f46b9f-5tt2h"] Dec 02 08:58:30 crc kubenswrapper[4895]: I1202 08:58:30.714632 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6bc9f46b9f-5tt2h" podUID="d76b17de-f566-4259-b60e-95a57cb3a975" containerName="dnsmasq-dns" containerID="cri-o://3dd932aa59ba74a12d52b102faaf3f766798dbd7a056d06698c0da56d97dd2b9" gracePeriod=10 Dec 02 08:58:31 crc kubenswrapper[4895]: I1202 08:58:31.283671 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bc9f46b9f-5tt2h" Dec 02 08:58:31 crc kubenswrapper[4895]: I1202 08:58:31.441001 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d76b17de-f566-4259-b60e-95a57cb3a975-config\") pod \"d76b17de-f566-4259-b60e-95a57cb3a975\" (UID: \"d76b17de-f566-4259-b60e-95a57cb3a975\") " Dec 02 08:58:31 crc kubenswrapper[4895]: I1202 08:58:31.441477 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d76b17de-f566-4259-b60e-95a57cb3a975-ovsdbserver-sb\") pod \"d76b17de-f566-4259-b60e-95a57cb3a975\" (UID: \"d76b17de-f566-4259-b60e-95a57cb3a975\") " Dec 02 08:58:31 crc kubenswrapper[4895]: I1202 08:58:31.441560 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d76b17de-f566-4259-b60e-95a57cb3a975-dns-svc\") pod \"d76b17de-f566-4259-b60e-95a57cb3a975\" (UID: 
\"d76b17de-f566-4259-b60e-95a57cb3a975\") " Dec 02 08:58:31 crc kubenswrapper[4895]: I1202 08:58:31.441607 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lmrpg\" (UniqueName: \"kubernetes.io/projected/d76b17de-f566-4259-b60e-95a57cb3a975-kube-api-access-lmrpg\") pod \"d76b17de-f566-4259-b60e-95a57cb3a975\" (UID: \"d76b17de-f566-4259-b60e-95a57cb3a975\") " Dec 02 08:58:31 crc kubenswrapper[4895]: I1202 08:58:31.441657 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d76b17de-f566-4259-b60e-95a57cb3a975-ovsdbserver-nb\") pod \"d76b17de-f566-4259-b60e-95a57cb3a975\" (UID: \"d76b17de-f566-4259-b60e-95a57cb3a975\") " Dec 02 08:58:31 crc kubenswrapper[4895]: I1202 08:58:31.462371 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d76b17de-f566-4259-b60e-95a57cb3a975-kube-api-access-lmrpg" (OuterVolumeSpecName: "kube-api-access-lmrpg") pod "d76b17de-f566-4259-b60e-95a57cb3a975" (UID: "d76b17de-f566-4259-b60e-95a57cb3a975"). InnerVolumeSpecName "kube-api-access-lmrpg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:58:31 crc kubenswrapper[4895]: I1202 08:58:31.493704 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d76b17de-f566-4259-b60e-95a57cb3a975-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d76b17de-f566-4259-b60e-95a57cb3a975" (UID: "d76b17de-f566-4259-b60e-95a57cb3a975"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:58:31 crc kubenswrapper[4895]: I1202 08:58:31.508377 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d76b17de-f566-4259-b60e-95a57cb3a975-config" (OuterVolumeSpecName: "config") pod "d76b17de-f566-4259-b60e-95a57cb3a975" (UID: "d76b17de-f566-4259-b60e-95a57cb3a975"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:58:31 crc kubenswrapper[4895]: I1202 08:58:31.518804 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d76b17de-f566-4259-b60e-95a57cb3a975-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d76b17de-f566-4259-b60e-95a57cb3a975" (UID: "d76b17de-f566-4259-b60e-95a57cb3a975"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:58:31 crc kubenswrapper[4895]: I1202 08:58:31.531612 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d76b17de-f566-4259-b60e-95a57cb3a975-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d76b17de-f566-4259-b60e-95a57cb3a975" (UID: "d76b17de-f566-4259-b60e-95a57cb3a975"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:58:31 crc kubenswrapper[4895]: I1202 08:58:31.544076 4895 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d76b17de-f566-4259-b60e-95a57cb3a975-config\") on node \"crc\" DevicePath \"\"" Dec 02 08:58:31 crc kubenswrapper[4895]: I1202 08:58:31.544125 4895 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d76b17de-f566-4259-b60e-95a57cb3a975-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 02 08:58:31 crc kubenswrapper[4895]: I1202 08:58:31.544140 4895 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d76b17de-f566-4259-b60e-95a57cb3a975-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 02 08:58:31 crc kubenswrapper[4895]: I1202 08:58:31.544153 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lmrpg\" (UniqueName: \"kubernetes.io/projected/d76b17de-f566-4259-b60e-95a57cb3a975-kube-api-access-lmrpg\") on 
node \"crc\" DevicePath \"\"" Dec 02 08:58:31 crc kubenswrapper[4895]: I1202 08:58:31.544166 4895 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d76b17de-f566-4259-b60e-95a57cb3a975-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 02 08:58:31 crc kubenswrapper[4895]: I1202 08:58:31.697952 4895 generic.go:334] "Generic (PLEG): container finished" podID="d76b17de-f566-4259-b60e-95a57cb3a975" containerID="3dd932aa59ba74a12d52b102faaf3f766798dbd7a056d06698c0da56d97dd2b9" exitCode=0 Dec 02 08:58:31 crc kubenswrapper[4895]: I1202 08:58:31.697999 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc9f46b9f-5tt2h" event={"ID":"d76b17de-f566-4259-b60e-95a57cb3a975","Type":"ContainerDied","Data":"3dd932aa59ba74a12d52b102faaf3f766798dbd7a056d06698c0da56d97dd2b9"} Dec 02 08:58:31 crc kubenswrapper[4895]: I1202 08:58:31.698028 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc9f46b9f-5tt2h" event={"ID":"d76b17de-f566-4259-b60e-95a57cb3a975","Type":"ContainerDied","Data":"7c555fa887686e788a5df3e67726685b5c66bb0c66c71632470e4497e32a4b24"} Dec 02 08:58:31 crc kubenswrapper[4895]: I1202 08:58:31.698029 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bc9f46b9f-5tt2h" Dec 02 08:58:31 crc kubenswrapper[4895]: I1202 08:58:31.698052 4895 scope.go:117] "RemoveContainer" containerID="3dd932aa59ba74a12d52b102faaf3f766798dbd7a056d06698c0da56d97dd2b9" Dec 02 08:58:31 crc kubenswrapper[4895]: I1202 08:58:31.733037 4895 scope.go:117] "RemoveContainer" containerID="2e03e816feaf1535197df409554dc6014bea53fc1cd377d98c18dd7b8a6e9c3d" Dec 02 08:58:31 crc kubenswrapper[4895]: I1202 08:58:31.736233 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bc9f46b9f-5tt2h"] Dec 02 08:58:31 crc kubenswrapper[4895]: I1202 08:58:31.754221 4895 scope.go:117] "RemoveContainer" containerID="3dd932aa59ba74a12d52b102faaf3f766798dbd7a056d06698c0da56d97dd2b9" Dec 02 08:58:31 crc kubenswrapper[4895]: E1202 08:58:31.754781 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3dd932aa59ba74a12d52b102faaf3f766798dbd7a056d06698c0da56d97dd2b9\": container with ID starting with 3dd932aa59ba74a12d52b102faaf3f766798dbd7a056d06698c0da56d97dd2b9 not found: ID does not exist" containerID="3dd932aa59ba74a12d52b102faaf3f766798dbd7a056d06698c0da56d97dd2b9" Dec 02 08:58:31 crc kubenswrapper[4895]: I1202 08:58:31.754840 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3dd932aa59ba74a12d52b102faaf3f766798dbd7a056d06698c0da56d97dd2b9"} err="failed to get container status \"3dd932aa59ba74a12d52b102faaf3f766798dbd7a056d06698c0da56d97dd2b9\": rpc error: code = NotFound desc = could not find container \"3dd932aa59ba74a12d52b102faaf3f766798dbd7a056d06698c0da56d97dd2b9\": container with ID starting with 3dd932aa59ba74a12d52b102faaf3f766798dbd7a056d06698c0da56d97dd2b9 not found: ID does not exist" Dec 02 08:58:31 crc kubenswrapper[4895]: I1202 08:58:31.754868 4895 scope.go:117] "RemoveContainer" 
containerID="2e03e816feaf1535197df409554dc6014bea53fc1cd377d98c18dd7b8a6e9c3d" Dec 02 08:58:31 crc kubenswrapper[4895]: I1202 08:58:31.754918 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6bc9f46b9f-5tt2h"] Dec 02 08:58:31 crc kubenswrapper[4895]: E1202 08:58:31.755216 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e03e816feaf1535197df409554dc6014bea53fc1cd377d98c18dd7b8a6e9c3d\": container with ID starting with 2e03e816feaf1535197df409554dc6014bea53fc1cd377d98c18dd7b8a6e9c3d not found: ID does not exist" containerID="2e03e816feaf1535197df409554dc6014bea53fc1cd377d98c18dd7b8a6e9c3d" Dec 02 08:58:31 crc kubenswrapper[4895]: I1202 08:58:31.755256 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e03e816feaf1535197df409554dc6014bea53fc1cd377d98c18dd7b8a6e9c3d"} err="failed to get container status \"2e03e816feaf1535197df409554dc6014bea53fc1cd377d98c18dd7b8a6e9c3d\": rpc error: code = NotFound desc = could not find container \"2e03e816feaf1535197df409554dc6014bea53fc1cd377d98c18dd7b8a6e9c3d\": container with ID starting with 2e03e816feaf1535197df409554dc6014bea53fc1cd377d98c18dd7b8a6e9c3d not found: ID does not exist" Dec 02 08:58:33 crc kubenswrapper[4895]: I1202 08:58:33.169142 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d76b17de-f566-4259-b60e-95a57cb3a975" path="/var/lib/kubelet/pods/d76b17de-f566-4259-b60e-95a57cb3a975/volumes" Dec 02 08:58:33 crc kubenswrapper[4895]: I1202 08:58:33.718870 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-p26bh"] Dec 02 08:58:33 crc kubenswrapper[4895]: E1202 08:58:33.719440 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d76b17de-f566-4259-b60e-95a57cb3a975" containerName="dnsmasq-dns" Dec 02 08:58:33 crc kubenswrapper[4895]: I1202 08:58:33.719459 4895 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="d76b17de-f566-4259-b60e-95a57cb3a975" containerName="dnsmasq-dns" Dec 02 08:58:33 crc kubenswrapper[4895]: E1202 08:58:33.719490 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d76b17de-f566-4259-b60e-95a57cb3a975" containerName="init" Dec 02 08:58:33 crc kubenswrapper[4895]: I1202 08:58:33.719499 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="d76b17de-f566-4259-b60e-95a57cb3a975" containerName="init" Dec 02 08:58:33 crc kubenswrapper[4895]: I1202 08:58:33.719802 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="d76b17de-f566-4259-b60e-95a57cb3a975" containerName="dnsmasq-dns" Dec 02 08:58:33 crc kubenswrapper[4895]: I1202 08:58:33.720732 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-p26bh" Dec 02 08:58:33 crc kubenswrapper[4895]: I1202 08:58:33.747414 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-p26bh"] Dec 02 08:58:33 crc kubenswrapper[4895]: I1202 08:58:33.782126 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-93ba-account-create-update-svtc4"] Dec 02 08:58:33 crc kubenswrapper[4895]: I1202 08:58:33.790515 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-93ba-account-create-update-svtc4" Dec 02 08:58:33 crc kubenswrapper[4895]: I1202 08:58:33.793713 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Dec 02 08:58:33 crc kubenswrapper[4895]: I1202 08:58:33.794529 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-93ba-account-create-update-svtc4"] Dec 02 08:58:33 crc kubenswrapper[4895]: I1202 08:58:33.887294 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmvfm\" (UniqueName: \"kubernetes.io/projected/62314e04-2f0b-4cea-a952-aec25fc0799b-kube-api-access-jmvfm\") pod \"cinder-db-create-p26bh\" (UID: \"62314e04-2f0b-4cea-a952-aec25fc0799b\") " pod="openstack/cinder-db-create-p26bh" Dec 02 08:58:33 crc kubenswrapper[4895]: I1202 08:58:33.887422 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/013e486c-9468-4687-b255-b9492896f50b-operator-scripts\") pod \"cinder-93ba-account-create-update-svtc4\" (UID: \"013e486c-9468-4687-b255-b9492896f50b\") " pod="openstack/cinder-93ba-account-create-update-svtc4" Dec 02 08:58:33 crc kubenswrapper[4895]: I1202 08:58:33.887450 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62gxq\" (UniqueName: \"kubernetes.io/projected/013e486c-9468-4687-b255-b9492896f50b-kube-api-access-62gxq\") pod \"cinder-93ba-account-create-update-svtc4\" (UID: \"013e486c-9468-4687-b255-b9492896f50b\") " pod="openstack/cinder-93ba-account-create-update-svtc4" Dec 02 08:58:33 crc kubenswrapper[4895]: I1202 08:58:33.887472 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/62314e04-2f0b-4cea-a952-aec25fc0799b-operator-scripts\") pod 
\"cinder-db-create-p26bh\" (UID: \"62314e04-2f0b-4cea-a952-aec25fc0799b\") " pod="openstack/cinder-db-create-p26bh" Dec 02 08:58:33 crc kubenswrapper[4895]: I1202 08:58:33.989896 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jmvfm\" (UniqueName: \"kubernetes.io/projected/62314e04-2f0b-4cea-a952-aec25fc0799b-kube-api-access-jmvfm\") pod \"cinder-db-create-p26bh\" (UID: \"62314e04-2f0b-4cea-a952-aec25fc0799b\") " pod="openstack/cinder-db-create-p26bh" Dec 02 08:58:33 crc kubenswrapper[4895]: I1202 08:58:33.989970 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/013e486c-9468-4687-b255-b9492896f50b-operator-scripts\") pod \"cinder-93ba-account-create-update-svtc4\" (UID: \"013e486c-9468-4687-b255-b9492896f50b\") " pod="openstack/cinder-93ba-account-create-update-svtc4" Dec 02 08:58:33 crc kubenswrapper[4895]: I1202 08:58:33.989991 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-62gxq\" (UniqueName: \"kubernetes.io/projected/013e486c-9468-4687-b255-b9492896f50b-kube-api-access-62gxq\") pod \"cinder-93ba-account-create-update-svtc4\" (UID: \"013e486c-9468-4687-b255-b9492896f50b\") " pod="openstack/cinder-93ba-account-create-update-svtc4" Dec 02 08:58:33 crc kubenswrapper[4895]: I1202 08:58:33.990013 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/62314e04-2f0b-4cea-a952-aec25fc0799b-operator-scripts\") pod \"cinder-db-create-p26bh\" (UID: \"62314e04-2f0b-4cea-a952-aec25fc0799b\") " pod="openstack/cinder-db-create-p26bh" Dec 02 08:58:33 crc kubenswrapper[4895]: I1202 08:58:33.990937 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/62314e04-2f0b-4cea-a952-aec25fc0799b-operator-scripts\") pod 
\"cinder-db-create-p26bh\" (UID: \"62314e04-2f0b-4cea-a952-aec25fc0799b\") " pod="openstack/cinder-db-create-p26bh" Dec 02 08:58:33 crc kubenswrapper[4895]: I1202 08:58:33.991151 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/013e486c-9468-4687-b255-b9492896f50b-operator-scripts\") pod \"cinder-93ba-account-create-update-svtc4\" (UID: \"013e486c-9468-4687-b255-b9492896f50b\") " pod="openstack/cinder-93ba-account-create-update-svtc4" Dec 02 08:58:34 crc kubenswrapper[4895]: I1202 08:58:34.014689 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmvfm\" (UniqueName: \"kubernetes.io/projected/62314e04-2f0b-4cea-a952-aec25fc0799b-kube-api-access-jmvfm\") pod \"cinder-db-create-p26bh\" (UID: \"62314e04-2f0b-4cea-a952-aec25fc0799b\") " pod="openstack/cinder-db-create-p26bh" Dec 02 08:58:34 crc kubenswrapper[4895]: I1202 08:58:34.015218 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-62gxq\" (UniqueName: \"kubernetes.io/projected/013e486c-9468-4687-b255-b9492896f50b-kube-api-access-62gxq\") pod \"cinder-93ba-account-create-update-svtc4\" (UID: \"013e486c-9468-4687-b255-b9492896f50b\") " pod="openstack/cinder-93ba-account-create-update-svtc4" Dec 02 08:58:34 crc kubenswrapper[4895]: I1202 08:58:34.046294 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-p26bh" Dec 02 08:58:34 crc kubenswrapper[4895]: I1202 08:58:34.125937 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-93ba-account-create-update-svtc4" Dec 02 08:58:34 crc kubenswrapper[4895]: I1202 08:58:34.563175 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-p26bh"] Dec 02 08:58:34 crc kubenswrapper[4895]: I1202 08:58:34.568629 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-pw6cq" Dec 02 08:58:34 crc kubenswrapper[4895]: I1202 08:58:34.568695 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-pw6cq" Dec 02 08:58:34 crc kubenswrapper[4895]: I1202 08:58:34.625016 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-pw6cq" Dec 02 08:58:34 crc kubenswrapper[4895]: I1202 08:58:34.709581 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-93ba-account-create-update-svtc4"] Dec 02 08:58:34 crc kubenswrapper[4895]: W1202 08:58:34.713888 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod013e486c_9468_4687_b255_b9492896f50b.slice/crio-7788cb1b1bc0369548a583bac396650a65e8a30099c724526900c873f7ce7d8b WatchSource:0}: Error finding container 7788cb1b1bc0369548a583bac396650a65e8a30099c724526900c873f7ce7d8b: Status 404 returned error can't find the container with id 7788cb1b1bc0369548a583bac396650a65e8a30099c724526900c873f7ce7d8b Dec 02 08:58:34 crc kubenswrapper[4895]: I1202 08:58:34.741437 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-93ba-account-create-update-svtc4" event={"ID":"013e486c-9468-4687-b255-b9492896f50b","Type":"ContainerStarted","Data":"7788cb1b1bc0369548a583bac396650a65e8a30099c724526900c873f7ce7d8b"} Dec 02 08:58:34 crc kubenswrapper[4895]: I1202 08:58:34.744393 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-p26bh" 
event={"ID":"62314e04-2f0b-4cea-a952-aec25fc0799b","Type":"ContainerStarted","Data":"b5c58d873aa8c146b1c45ea361eb58f595bd3d1db695c00651752fbd811d955f"} Dec 02 08:58:34 crc kubenswrapper[4895]: I1202 08:58:34.793456 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-pw6cq" Dec 02 08:58:34 crc kubenswrapper[4895]: I1202 08:58:34.864122 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pw6cq"] Dec 02 08:58:35 crc kubenswrapper[4895]: I1202 08:58:35.757479 4895 generic.go:334] "Generic (PLEG): container finished" podID="62314e04-2f0b-4cea-a952-aec25fc0799b" containerID="0ff688b4e620048852f79fafd7645496df6c308ceb4e6c2f160b19a9009ceb0f" exitCode=0 Dec 02 08:58:35 crc kubenswrapper[4895]: I1202 08:58:35.757543 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-p26bh" event={"ID":"62314e04-2f0b-4cea-a952-aec25fc0799b","Type":"ContainerDied","Data":"0ff688b4e620048852f79fafd7645496df6c308ceb4e6c2f160b19a9009ceb0f"} Dec 02 08:58:35 crc kubenswrapper[4895]: I1202 08:58:35.760621 4895 generic.go:334] "Generic (PLEG): container finished" podID="013e486c-9468-4687-b255-b9492896f50b" containerID="8604e1e270b85c7e0f9829e1dc01f238dce35039d0d9ddbc8eced80a7bb02ce7" exitCode=0 Dec 02 08:58:35 crc kubenswrapper[4895]: I1202 08:58:35.761413 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-93ba-account-create-update-svtc4" event={"ID":"013e486c-9468-4687-b255-b9492896f50b","Type":"ContainerDied","Data":"8604e1e270b85c7e0f9829e1dc01f238dce35039d0d9ddbc8eced80a7bb02ce7"} Dec 02 08:58:36 crc kubenswrapper[4895]: I1202 08:58:36.772651 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-pw6cq" podUID="887e8023-35ab-4221-bc25-86a9c5a096e7" containerName="registry-server" 
containerID="cri-o://4c5bafb4cfb668a90be05235c4226b37d8a5a6afd8a65a78fbdf06efc15c4796" gracePeriod=2 Dec 02 08:58:37 crc kubenswrapper[4895]: I1202 08:58:37.340897 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-p26bh" Dec 02 08:58:37 crc kubenswrapper[4895]: I1202 08:58:37.369341 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-93ba-account-create-update-svtc4" Dec 02 08:58:37 crc kubenswrapper[4895]: I1202 08:58:37.384812 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/013e486c-9468-4687-b255-b9492896f50b-operator-scripts\") pod \"013e486c-9468-4687-b255-b9492896f50b\" (UID: \"013e486c-9468-4687-b255-b9492896f50b\") " Dec 02 08:58:37 crc kubenswrapper[4895]: I1202 08:58:37.384945 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jmvfm\" (UniqueName: \"kubernetes.io/projected/62314e04-2f0b-4cea-a952-aec25fc0799b-kube-api-access-jmvfm\") pod \"62314e04-2f0b-4cea-a952-aec25fc0799b\" (UID: \"62314e04-2f0b-4cea-a952-aec25fc0799b\") " Dec 02 08:58:37 crc kubenswrapper[4895]: I1202 08:58:37.384979 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-62gxq\" (UniqueName: \"kubernetes.io/projected/013e486c-9468-4687-b255-b9492896f50b-kube-api-access-62gxq\") pod \"013e486c-9468-4687-b255-b9492896f50b\" (UID: \"013e486c-9468-4687-b255-b9492896f50b\") " Dec 02 08:58:37 crc kubenswrapper[4895]: I1202 08:58:37.385040 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/62314e04-2f0b-4cea-a952-aec25fc0799b-operator-scripts\") pod \"62314e04-2f0b-4cea-a952-aec25fc0799b\" (UID: \"62314e04-2f0b-4cea-a952-aec25fc0799b\") " Dec 02 08:58:37 crc kubenswrapper[4895]: I1202 
08:58:37.385518 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/013e486c-9468-4687-b255-b9492896f50b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "013e486c-9468-4687-b255-b9492896f50b" (UID: "013e486c-9468-4687-b255-b9492896f50b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:58:37 crc kubenswrapper[4895]: I1202 08:58:37.385708 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62314e04-2f0b-4cea-a952-aec25fc0799b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "62314e04-2f0b-4cea-a952-aec25fc0799b" (UID: "62314e04-2f0b-4cea-a952-aec25fc0799b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:58:37 crc kubenswrapper[4895]: I1202 08:58:37.393206 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62314e04-2f0b-4cea-a952-aec25fc0799b-kube-api-access-jmvfm" (OuterVolumeSpecName: "kube-api-access-jmvfm") pod "62314e04-2f0b-4cea-a952-aec25fc0799b" (UID: "62314e04-2f0b-4cea-a952-aec25fc0799b"). InnerVolumeSpecName "kube-api-access-jmvfm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:58:37 crc kubenswrapper[4895]: I1202 08:58:37.405822 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/013e486c-9468-4687-b255-b9492896f50b-kube-api-access-62gxq" (OuterVolumeSpecName: "kube-api-access-62gxq") pod "013e486c-9468-4687-b255-b9492896f50b" (UID: "013e486c-9468-4687-b255-b9492896f50b"). InnerVolumeSpecName "kube-api-access-62gxq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:58:37 crc kubenswrapper[4895]: I1202 08:58:37.441250 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-pw6cq" Dec 02 08:58:37 crc kubenswrapper[4895]: I1202 08:58:37.486301 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/887e8023-35ab-4221-bc25-86a9c5a096e7-catalog-content\") pod \"887e8023-35ab-4221-bc25-86a9c5a096e7\" (UID: \"887e8023-35ab-4221-bc25-86a9c5a096e7\") " Dec 02 08:58:37 crc kubenswrapper[4895]: I1202 08:58:37.486805 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/887e8023-35ab-4221-bc25-86a9c5a096e7-utilities\") pod \"887e8023-35ab-4221-bc25-86a9c5a096e7\" (UID: \"887e8023-35ab-4221-bc25-86a9c5a096e7\") " Dec 02 08:58:37 crc kubenswrapper[4895]: I1202 08:58:37.486860 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rsb5p\" (UniqueName: \"kubernetes.io/projected/887e8023-35ab-4221-bc25-86a9c5a096e7-kube-api-access-rsb5p\") pod \"887e8023-35ab-4221-bc25-86a9c5a096e7\" (UID: \"887e8023-35ab-4221-bc25-86a9c5a096e7\") " Dec 02 08:58:37 crc kubenswrapper[4895]: I1202 08:58:37.487167 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jmvfm\" (UniqueName: \"kubernetes.io/projected/62314e04-2f0b-4cea-a952-aec25fc0799b-kube-api-access-jmvfm\") on node \"crc\" DevicePath \"\"" Dec 02 08:58:37 crc kubenswrapper[4895]: I1202 08:58:37.487185 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-62gxq\" (UniqueName: \"kubernetes.io/projected/013e486c-9468-4687-b255-b9492896f50b-kube-api-access-62gxq\") on node \"crc\" DevicePath \"\"" Dec 02 08:58:37 crc kubenswrapper[4895]: I1202 08:58:37.487195 4895 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/62314e04-2f0b-4cea-a952-aec25fc0799b-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 08:58:37 crc 
kubenswrapper[4895]: I1202 08:58:37.487204 4895 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/013e486c-9468-4687-b255-b9492896f50b-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 08:58:37 crc kubenswrapper[4895]: I1202 08:58:37.487700 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/887e8023-35ab-4221-bc25-86a9c5a096e7-utilities" (OuterVolumeSpecName: "utilities") pod "887e8023-35ab-4221-bc25-86a9c5a096e7" (UID: "887e8023-35ab-4221-bc25-86a9c5a096e7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:58:37 crc kubenswrapper[4895]: I1202 08:58:37.491125 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/887e8023-35ab-4221-bc25-86a9c5a096e7-kube-api-access-rsb5p" (OuterVolumeSpecName: "kube-api-access-rsb5p") pod "887e8023-35ab-4221-bc25-86a9c5a096e7" (UID: "887e8023-35ab-4221-bc25-86a9c5a096e7"). InnerVolumeSpecName "kube-api-access-rsb5p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:58:37 crc kubenswrapper[4895]: I1202 08:58:37.588174 4895 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/887e8023-35ab-4221-bc25-86a9c5a096e7-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 08:58:37 crc kubenswrapper[4895]: I1202 08:58:37.588238 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rsb5p\" (UniqueName: \"kubernetes.io/projected/887e8023-35ab-4221-bc25-86a9c5a096e7-kube-api-access-rsb5p\") on node \"crc\" DevicePath \"\"" Dec 02 08:58:37 crc kubenswrapper[4895]: I1202 08:58:37.613441 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/887e8023-35ab-4221-bc25-86a9c5a096e7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "887e8023-35ab-4221-bc25-86a9c5a096e7" (UID: "887e8023-35ab-4221-bc25-86a9c5a096e7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:58:37 crc kubenswrapper[4895]: I1202 08:58:37.690699 4895 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/887e8023-35ab-4221-bc25-86a9c5a096e7-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 08:58:37 crc kubenswrapper[4895]: I1202 08:58:37.786244 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-p26bh" Dec 02 08:58:37 crc kubenswrapper[4895]: I1202 08:58:37.786254 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-p26bh" event={"ID":"62314e04-2f0b-4cea-a952-aec25fc0799b","Type":"ContainerDied","Data":"b5c58d873aa8c146b1c45ea361eb58f595bd3d1db695c00651752fbd811d955f"} Dec 02 08:58:37 crc kubenswrapper[4895]: I1202 08:58:37.786320 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b5c58d873aa8c146b1c45ea361eb58f595bd3d1db695c00651752fbd811d955f" Dec 02 08:58:37 crc kubenswrapper[4895]: I1202 08:58:37.788490 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-93ba-account-create-update-svtc4" event={"ID":"013e486c-9468-4687-b255-b9492896f50b","Type":"ContainerDied","Data":"7788cb1b1bc0369548a583bac396650a65e8a30099c724526900c873f7ce7d8b"} Dec 02 08:58:37 crc kubenswrapper[4895]: I1202 08:58:37.788528 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-93ba-account-create-update-svtc4" Dec 02 08:58:37 crc kubenswrapper[4895]: I1202 08:58:37.788530 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7788cb1b1bc0369548a583bac396650a65e8a30099c724526900c873f7ce7d8b" Dec 02 08:58:37 crc kubenswrapper[4895]: I1202 08:58:37.793694 4895 generic.go:334] "Generic (PLEG): container finished" podID="887e8023-35ab-4221-bc25-86a9c5a096e7" containerID="4c5bafb4cfb668a90be05235c4226b37d8a5a6afd8a65a78fbdf06efc15c4796" exitCode=0 Dec 02 08:58:37 crc kubenswrapper[4895]: I1202 08:58:37.793877 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pw6cq" event={"ID":"887e8023-35ab-4221-bc25-86a9c5a096e7","Type":"ContainerDied","Data":"4c5bafb4cfb668a90be05235c4226b37d8a5a6afd8a65a78fbdf06efc15c4796"} Dec 02 08:58:37 crc kubenswrapper[4895]: I1202 08:58:37.793959 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pw6cq" event={"ID":"887e8023-35ab-4221-bc25-86a9c5a096e7","Type":"ContainerDied","Data":"5a6c1ca83182bd672b7f2b5acacc52dbd545497343fdf7932eed15d0e11eaa8d"} Dec 02 08:58:37 crc kubenswrapper[4895]: I1202 08:58:37.793981 4895 scope.go:117] "RemoveContainer" containerID="4c5bafb4cfb668a90be05235c4226b37d8a5a6afd8a65a78fbdf06efc15c4796" Dec 02 08:58:37 crc kubenswrapper[4895]: I1202 08:58:37.794004 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-pw6cq" Dec 02 08:58:37 crc kubenswrapper[4895]: I1202 08:58:37.829817 4895 scope.go:117] "RemoveContainer" containerID="b51361206c7c1c424a288910bd3d6485b9285279e7240d213574bd29cd1fdad8" Dec 02 08:58:37 crc kubenswrapper[4895]: I1202 08:58:37.857666 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pw6cq"] Dec 02 08:58:37 crc kubenswrapper[4895]: I1202 08:58:37.867204 4895 scope.go:117] "RemoveContainer" containerID="b7af8156261aa97c7894e29a75af0d22d73a8e69af3898f91efbe3557f435110" Dec 02 08:58:37 crc kubenswrapper[4895]: I1202 08:58:37.869914 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-pw6cq"] Dec 02 08:58:37 crc kubenswrapper[4895]: I1202 08:58:37.902935 4895 scope.go:117] "RemoveContainer" containerID="4c5bafb4cfb668a90be05235c4226b37d8a5a6afd8a65a78fbdf06efc15c4796" Dec 02 08:58:37 crc kubenswrapper[4895]: E1202 08:58:37.904953 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c5bafb4cfb668a90be05235c4226b37d8a5a6afd8a65a78fbdf06efc15c4796\": container with ID starting with 4c5bafb4cfb668a90be05235c4226b37d8a5a6afd8a65a78fbdf06efc15c4796 not found: ID does not exist" containerID="4c5bafb4cfb668a90be05235c4226b37d8a5a6afd8a65a78fbdf06efc15c4796" Dec 02 08:58:37 crc kubenswrapper[4895]: I1202 08:58:37.905000 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c5bafb4cfb668a90be05235c4226b37d8a5a6afd8a65a78fbdf06efc15c4796"} err="failed to get container status \"4c5bafb4cfb668a90be05235c4226b37d8a5a6afd8a65a78fbdf06efc15c4796\": rpc error: code = NotFound desc = could not find container \"4c5bafb4cfb668a90be05235c4226b37d8a5a6afd8a65a78fbdf06efc15c4796\": container with ID starting with 4c5bafb4cfb668a90be05235c4226b37d8a5a6afd8a65a78fbdf06efc15c4796 not found: ID does 
not exist" Dec 02 08:58:37 crc kubenswrapper[4895]: I1202 08:58:37.905027 4895 scope.go:117] "RemoveContainer" containerID="b51361206c7c1c424a288910bd3d6485b9285279e7240d213574bd29cd1fdad8" Dec 02 08:58:37 crc kubenswrapper[4895]: E1202 08:58:37.906853 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b51361206c7c1c424a288910bd3d6485b9285279e7240d213574bd29cd1fdad8\": container with ID starting with b51361206c7c1c424a288910bd3d6485b9285279e7240d213574bd29cd1fdad8 not found: ID does not exist" containerID="b51361206c7c1c424a288910bd3d6485b9285279e7240d213574bd29cd1fdad8" Dec 02 08:58:37 crc kubenswrapper[4895]: I1202 08:58:37.906880 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b51361206c7c1c424a288910bd3d6485b9285279e7240d213574bd29cd1fdad8"} err="failed to get container status \"b51361206c7c1c424a288910bd3d6485b9285279e7240d213574bd29cd1fdad8\": rpc error: code = NotFound desc = could not find container \"b51361206c7c1c424a288910bd3d6485b9285279e7240d213574bd29cd1fdad8\": container with ID starting with b51361206c7c1c424a288910bd3d6485b9285279e7240d213574bd29cd1fdad8 not found: ID does not exist" Dec 02 08:58:37 crc kubenswrapper[4895]: I1202 08:58:37.906895 4895 scope.go:117] "RemoveContainer" containerID="b7af8156261aa97c7894e29a75af0d22d73a8e69af3898f91efbe3557f435110" Dec 02 08:58:37 crc kubenswrapper[4895]: E1202 08:58:37.907351 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7af8156261aa97c7894e29a75af0d22d73a8e69af3898f91efbe3557f435110\": container with ID starting with b7af8156261aa97c7894e29a75af0d22d73a8e69af3898f91efbe3557f435110 not found: ID does not exist" containerID="b7af8156261aa97c7894e29a75af0d22d73a8e69af3898f91efbe3557f435110" Dec 02 08:58:37 crc kubenswrapper[4895]: I1202 08:58:37.907378 4895 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7af8156261aa97c7894e29a75af0d22d73a8e69af3898f91efbe3557f435110"} err="failed to get container status \"b7af8156261aa97c7894e29a75af0d22d73a8e69af3898f91efbe3557f435110\": rpc error: code = NotFound desc = could not find container \"b7af8156261aa97c7894e29a75af0d22d73a8e69af3898f91efbe3557f435110\": container with ID starting with b7af8156261aa97c7894e29a75af0d22d73a8e69af3898f91efbe3557f435110 not found: ID does not exist" Dec 02 08:58:38 crc kubenswrapper[4895]: I1202 08:58:38.945006 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-tqxqk"] Dec 02 08:58:38 crc kubenswrapper[4895]: E1202 08:58:38.945669 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="887e8023-35ab-4221-bc25-86a9c5a096e7" containerName="extract-utilities" Dec 02 08:58:38 crc kubenswrapper[4895]: I1202 08:58:38.945685 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="887e8023-35ab-4221-bc25-86a9c5a096e7" containerName="extract-utilities" Dec 02 08:58:38 crc kubenswrapper[4895]: E1202 08:58:38.945699 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="013e486c-9468-4687-b255-b9492896f50b" containerName="mariadb-account-create-update" Dec 02 08:58:38 crc kubenswrapper[4895]: I1202 08:58:38.945705 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="013e486c-9468-4687-b255-b9492896f50b" containerName="mariadb-account-create-update" Dec 02 08:58:38 crc kubenswrapper[4895]: E1202 08:58:38.945724 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="887e8023-35ab-4221-bc25-86a9c5a096e7" containerName="registry-server" Dec 02 08:58:38 crc kubenswrapper[4895]: I1202 08:58:38.945731 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="887e8023-35ab-4221-bc25-86a9c5a096e7" containerName="registry-server" Dec 02 08:58:38 crc kubenswrapper[4895]: E1202 08:58:38.945744 4895 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="887e8023-35ab-4221-bc25-86a9c5a096e7" containerName="extract-content" Dec 02 08:58:38 crc kubenswrapper[4895]: I1202 08:58:38.945752 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="887e8023-35ab-4221-bc25-86a9c5a096e7" containerName="extract-content" Dec 02 08:58:38 crc kubenswrapper[4895]: E1202 08:58:38.946131 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62314e04-2f0b-4cea-a952-aec25fc0799b" containerName="mariadb-database-create" Dec 02 08:58:38 crc kubenswrapper[4895]: I1202 08:58:38.946139 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="62314e04-2f0b-4cea-a952-aec25fc0799b" containerName="mariadb-database-create" Dec 02 08:58:38 crc kubenswrapper[4895]: I1202 08:58:38.946325 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="887e8023-35ab-4221-bc25-86a9c5a096e7" containerName="registry-server" Dec 02 08:58:38 crc kubenswrapper[4895]: I1202 08:58:38.946337 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="013e486c-9468-4687-b255-b9492896f50b" containerName="mariadb-account-create-update" Dec 02 08:58:38 crc kubenswrapper[4895]: I1202 08:58:38.946351 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="62314e04-2f0b-4cea-a952-aec25fc0799b" containerName="mariadb-database-create" Dec 02 08:58:38 crc kubenswrapper[4895]: I1202 08:58:38.947030 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-tqxqk" Dec 02 08:58:38 crc kubenswrapper[4895]: I1202 08:58:38.952164 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-jzlmc" Dec 02 08:58:38 crc kubenswrapper[4895]: I1202 08:58:38.952438 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 02 08:58:38 crc kubenswrapper[4895]: I1202 08:58:38.957312 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 02 08:58:38 crc kubenswrapper[4895]: I1202 08:58:38.963405 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-tqxqk"] Dec 02 08:58:39 crc kubenswrapper[4895]: I1202 08:58:39.043898 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cce80648-ccb2-4ba8-802c-c8afafb13ab6-scripts\") pod \"cinder-db-sync-tqxqk\" (UID: \"cce80648-ccb2-4ba8-802c-c8afafb13ab6\") " pod="openstack/cinder-db-sync-tqxqk" Dec 02 08:58:39 crc kubenswrapper[4895]: I1202 08:58:39.043998 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gcq4s\" (UniqueName: \"kubernetes.io/projected/cce80648-ccb2-4ba8-802c-c8afafb13ab6-kube-api-access-gcq4s\") pod \"cinder-db-sync-tqxqk\" (UID: \"cce80648-ccb2-4ba8-802c-c8afafb13ab6\") " pod="openstack/cinder-db-sync-tqxqk" Dec 02 08:58:39 crc kubenswrapper[4895]: I1202 08:58:39.044042 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/cce80648-ccb2-4ba8-802c-c8afafb13ab6-db-sync-config-data\") pod \"cinder-db-sync-tqxqk\" (UID: \"cce80648-ccb2-4ba8-802c-c8afafb13ab6\") " pod="openstack/cinder-db-sync-tqxqk" Dec 02 08:58:39 crc kubenswrapper[4895]: I1202 08:58:39.044506 4895 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cce80648-ccb2-4ba8-802c-c8afafb13ab6-etc-machine-id\") pod \"cinder-db-sync-tqxqk\" (UID: \"cce80648-ccb2-4ba8-802c-c8afafb13ab6\") " pod="openstack/cinder-db-sync-tqxqk" Dec 02 08:58:39 crc kubenswrapper[4895]: I1202 08:58:39.044662 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cce80648-ccb2-4ba8-802c-c8afafb13ab6-combined-ca-bundle\") pod \"cinder-db-sync-tqxqk\" (UID: \"cce80648-ccb2-4ba8-802c-c8afafb13ab6\") " pod="openstack/cinder-db-sync-tqxqk" Dec 02 08:58:39 crc kubenswrapper[4895]: I1202 08:58:39.044726 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cce80648-ccb2-4ba8-802c-c8afafb13ab6-config-data\") pod \"cinder-db-sync-tqxqk\" (UID: \"cce80648-ccb2-4ba8-802c-c8afafb13ab6\") " pod="openstack/cinder-db-sync-tqxqk" Dec 02 08:58:39 crc kubenswrapper[4895]: I1202 08:58:39.146627 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cce80648-ccb2-4ba8-802c-c8afafb13ab6-scripts\") pod \"cinder-db-sync-tqxqk\" (UID: \"cce80648-ccb2-4ba8-802c-c8afafb13ab6\") " pod="openstack/cinder-db-sync-tqxqk" Dec 02 08:58:39 crc kubenswrapper[4895]: I1202 08:58:39.146714 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gcq4s\" (UniqueName: \"kubernetes.io/projected/cce80648-ccb2-4ba8-802c-c8afafb13ab6-kube-api-access-gcq4s\") pod \"cinder-db-sync-tqxqk\" (UID: \"cce80648-ccb2-4ba8-802c-c8afafb13ab6\") " pod="openstack/cinder-db-sync-tqxqk" Dec 02 08:58:39 crc kubenswrapper[4895]: I1202 08:58:39.146769 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" 
(UniqueName: \"kubernetes.io/secret/cce80648-ccb2-4ba8-802c-c8afafb13ab6-db-sync-config-data\") pod \"cinder-db-sync-tqxqk\" (UID: \"cce80648-ccb2-4ba8-802c-c8afafb13ab6\") " pod="openstack/cinder-db-sync-tqxqk" Dec 02 08:58:39 crc kubenswrapper[4895]: I1202 08:58:39.146851 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cce80648-ccb2-4ba8-802c-c8afafb13ab6-etc-machine-id\") pod \"cinder-db-sync-tqxqk\" (UID: \"cce80648-ccb2-4ba8-802c-c8afafb13ab6\") " pod="openstack/cinder-db-sync-tqxqk" Dec 02 08:58:39 crc kubenswrapper[4895]: I1202 08:58:39.146881 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cce80648-ccb2-4ba8-802c-c8afafb13ab6-combined-ca-bundle\") pod \"cinder-db-sync-tqxqk\" (UID: \"cce80648-ccb2-4ba8-802c-c8afafb13ab6\") " pod="openstack/cinder-db-sync-tqxqk" Dec 02 08:58:39 crc kubenswrapper[4895]: I1202 08:58:39.146910 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cce80648-ccb2-4ba8-802c-c8afafb13ab6-config-data\") pod \"cinder-db-sync-tqxqk\" (UID: \"cce80648-ccb2-4ba8-802c-c8afafb13ab6\") " pod="openstack/cinder-db-sync-tqxqk" Dec 02 08:58:39 crc kubenswrapper[4895]: I1202 08:58:39.147993 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cce80648-ccb2-4ba8-802c-c8afafb13ab6-etc-machine-id\") pod \"cinder-db-sync-tqxqk\" (UID: \"cce80648-ccb2-4ba8-802c-c8afafb13ab6\") " pod="openstack/cinder-db-sync-tqxqk" Dec 02 08:58:39 crc kubenswrapper[4895]: I1202 08:58:39.152744 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cce80648-ccb2-4ba8-802c-c8afafb13ab6-config-data\") pod \"cinder-db-sync-tqxqk\" (UID: 
\"cce80648-ccb2-4ba8-802c-c8afafb13ab6\") " pod="openstack/cinder-db-sync-tqxqk" Dec 02 08:58:39 crc kubenswrapper[4895]: I1202 08:58:39.154444 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cce80648-ccb2-4ba8-802c-c8afafb13ab6-scripts\") pod \"cinder-db-sync-tqxqk\" (UID: \"cce80648-ccb2-4ba8-802c-c8afafb13ab6\") " pod="openstack/cinder-db-sync-tqxqk" Dec 02 08:58:39 crc kubenswrapper[4895]: I1202 08:58:39.155217 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cce80648-ccb2-4ba8-802c-c8afafb13ab6-combined-ca-bundle\") pod \"cinder-db-sync-tqxqk\" (UID: \"cce80648-ccb2-4ba8-802c-c8afafb13ab6\") " pod="openstack/cinder-db-sync-tqxqk" Dec 02 08:58:39 crc kubenswrapper[4895]: I1202 08:58:39.158129 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/cce80648-ccb2-4ba8-802c-c8afafb13ab6-db-sync-config-data\") pod \"cinder-db-sync-tqxqk\" (UID: \"cce80648-ccb2-4ba8-802c-c8afafb13ab6\") " pod="openstack/cinder-db-sync-tqxqk" Dec 02 08:58:39 crc kubenswrapper[4895]: I1202 08:58:39.160388 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="887e8023-35ab-4221-bc25-86a9c5a096e7" path="/var/lib/kubelet/pods/887e8023-35ab-4221-bc25-86a9c5a096e7/volumes" Dec 02 08:58:39 crc kubenswrapper[4895]: I1202 08:58:39.167326 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gcq4s\" (UniqueName: \"kubernetes.io/projected/cce80648-ccb2-4ba8-802c-c8afafb13ab6-kube-api-access-gcq4s\") pod \"cinder-db-sync-tqxqk\" (UID: \"cce80648-ccb2-4ba8-802c-c8afafb13ab6\") " pod="openstack/cinder-db-sync-tqxqk" Dec 02 08:58:39 crc kubenswrapper[4895]: I1202 08:58:39.266397 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-tqxqk" Dec 02 08:58:39 crc kubenswrapper[4895]: I1202 08:58:39.729709 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-tqxqk"] Dec 02 08:58:39 crc kubenswrapper[4895]: W1202 08:58:39.736397 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcce80648_ccb2_4ba8_802c_c8afafb13ab6.slice/crio-f4027deaf01254b7b380097f1cd38bdfccf753da58a0d9a9a16e73a7396fed59 WatchSource:0}: Error finding container f4027deaf01254b7b380097f1cd38bdfccf753da58a0d9a9a16e73a7396fed59: Status 404 returned error can't find the container with id f4027deaf01254b7b380097f1cd38bdfccf753da58a0d9a9a16e73a7396fed59 Dec 02 08:58:39 crc kubenswrapper[4895]: I1202 08:58:39.817353 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-tqxqk" event={"ID":"cce80648-ccb2-4ba8-802c-c8afafb13ab6","Type":"ContainerStarted","Data":"f4027deaf01254b7b380097f1cd38bdfccf753da58a0d9a9a16e73a7396fed59"} Dec 02 08:58:40 crc kubenswrapper[4895]: I1202 08:58:40.827494 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-tqxqk" event={"ID":"cce80648-ccb2-4ba8-802c-c8afafb13ab6","Type":"ContainerStarted","Data":"9da84d1a18ac0bb209fcd43fd55a4f2bac1dec8f7173fe1570df1a76df09070e"} Dec 02 08:58:40 crc kubenswrapper[4895]: I1202 08:58:40.848770 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-tqxqk" podStartSLOduration=2.848739018 podStartE2EDuration="2.848739018s" podCreationTimestamp="2025-12-02 08:58:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:58:40.84300826 +0000 UTC m=+5732.013867883" watchObservedRunningTime="2025-12-02 08:58:40.848739018 +0000 UTC m=+5732.019598631" Dec 02 08:58:41 crc kubenswrapper[4895]: I1202 08:58:41.141909 
4895 scope.go:117] "RemoveContainer" containerID="d95d9d738930381cc247f4992573f82ec7c89226f23be0d771d1a3b1b3cf1812" Dec 02 08:58:41 crc kubenswrapper[4895]: E1202 08:58:41.142453 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 08:58:43 crc kubenswrapper[4895]: I1202 08:58:43.854779 4895 generic.go:334] "Generic (PLEG): container finished" podID="cce80648-ccb2-4ba8-802c-c8afafb13ab6" containerID="9da84d1a18ac0bb209fcd43fd55a4f2bac1dec8f7173fe1570df1a76df09070e" exitCode=0 Dec 02 08:58:43 crc kubenswrapper[4895]: I1202 08:58:43.856860 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-tqxqk" event={"ID":"cce80648-ccb2-4ba8-802c-c8afafb13ab6","Type":"ContainerDied","Data":"9da84d1a18ac0bb209fcd43fd55a4f2bac1dec8f7173fe1570df1a76df09070e"} Dec 02 08:58:45 crc kubenswrapper[4895]: I1202 08:58:45.264804 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-tqxqk" Dec 02 08:58:45 crc kubenswrapper[4895]: I1202 08:58:45.373182 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cce80648-ccb2-4ba8-802c-c8afafb13ab6-scripts\") pod \"cce80648-ccb2-4ba8-802c-c8afafb13ab6\" (UID: \"cce80648-ccb2-4ba8-802c-c8afafb13ab6\") " Dec 02 08:58:45 crc kubenswrapper[4895]: I1202 08:58:45.373264 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/cce80648-ccb2-4ba8-802c-c8afafb13ab6-db-sync-config-data\") pod \"cce80648-ccb2-4ba8-802c-c8afafb13ab6\" (UID: \"cce80648-ccb2-4ba8-802c-c8afafb13ab6\") " Dec 02 08:58:45 crc kubenswrapper[4895]: I1202 08:58:45.373338 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cce80648-ccb2-4ba8-802c-c8afafb13ab6-config-data\") pod \"cce80648-ccb2-4ba8-802c-c8afafb13ab6\" (UID: \"cce80648-ccb2-4ba8-802c-c8afafb13ab6\") " Dec 02 08:58:45 crc kubenswrapper[4895]: I1202 08:58:45.373419 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gcq4s\" (UniqueName: \"kubernetes.io/projected/cce80648-ccb2-4ba8-802c-c8afafb13ab6-kube-api-access-gcq4s\") pod \"cce80648-ccb2-4ba8-802c-c8afafb13ab6\" (UID: \"cce80648-ccb2-4ba8-802c-c8afafb13ab6\") " Dec 02 08:58:45 crc kubenswrapper[4895]: I1202 08:58:45.373462 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cce80648-ccb2-4ba8-802c-c8afafb13ab6-etc-machine-id\") pod \"cce80648-ccb2-4ba8-802c-c8afafb13ab6\" (UID: \"cce80648-ccb2-4ba8-802c-c8afafb13ab6\") " Dec 02 08:58:45 crc kubenswrapper[4895]: I1202 08:58:45.373490 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/cce80648-ccb2-4ba8-802c-c8afafb13ab6-combined-ca-bundle\") pod \"cce80648-ccb2-4ba8-802c-c8afafb13ab6\" (UID: \"cce80648-ccb2-4ba8-802c-c8afafb13ab6\") " Dec 02 08:58:45 crc kubenswrapper[4895]: I1202 08:58:45.373925 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cce80648-ccb2-4ba8-802c-c8afafb13ab6-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "cce80648-ccb2-4ba8-802c-c8afafb13ab6" (UID: "cce80648-ccb2-4ba8-802c-c8afafb13ab6"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 08:58:45 crc kubenswrapper[4895]: I1202 08:58:45.374583 4895 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cce80648-ccb2-4ba8-802c-c8afafb13ab6-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 02 08:58:45 crc kubenswrapper[4895]: I1202 08:58:45.379297 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cce80648-ccb2-4ba8-802c-c8afafb13ab6-scripts" (OuterVolumeSpecName: "scripts") pod "cce80648-ccb2-4ba8-802c-c8afafb13ab6" (UID: "cce80648-ccb2-4ba8-802c-c8afafb13ab6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:58:45 crc kubenswrapper[4895]: I1202 08:58:45.379331 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cce80648-ccb2-4ba8-802c-c8afafb13ab6-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "cce80648-ccb2-4ba8-802c-c8afafb13ab6" (UID: "cce80648-ccb2-4ba8-802c-c8afafb13ab6"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:58:45 crc kubenswrapper[4895]: I1202 08:58:45.379644 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cce80648-ccb2-4ba8-802c-c8afafb13ab6-kube-api-access-gcq4s" (OuterVolumeSpecName: "kube-api-access-gcq4s") pod "cce80648-ccb2-4ba8-802c-c8afafb13ab6" (UID: "cce80648-ccb2-4ba8-802c-c8afafb13ab6"). InnerVolumeSpecName "kube-api-access-gcq4s". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:58:45 crc kubenswrapper[4895]: I1202 08:58:45.406292 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cce80648-ccb2-4ba8-802c-c8afafb13ab6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cce80648-ccb2-4ba8-802c-c8afafb13ab6" (UID: "cce80648-ccb2-4ba8-802c-c8afafb13ab6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:58:45 crc kubenswrapper[4895]: I1202 08:58:45.421620 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cce80648-ccb2-4ba8-802c-c8afafb13ab6-config-data" (OuterVolumeSpecName: "config-data") pod "cce80648-ccb2-4ba8-802c-c8afafb13ab6" (UID: "cce80648-ccb2-4ba8-802c-c8afafb13ab6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:58:45 crc kubenswrapper[4895]: I1202 08:58:45.476771 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cce80648-ccb2-4ba8-802c-c8afafb13ab6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 08:58:45 crc kubenswrapper[4895]: I1202 08:58:45.476830 4895 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cce80648-ccb2-4ba8-802c-c8afafb13ab6-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 08:58:45 crc kubenswrapper[4895]: I1202 08:58:45.476846 4895 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/cce80648-ccb2-4ba8-802c-c8afafb13ab6-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 08:58:45 crc kubenswrapper[4895]: I1202 08:58:45.476858 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cce80648-ccb2-4ba8-802c-c8afafb13ab6-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 08:58:45 crc kubenswrapper[4895]: I1202 08:58:45.476871 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gcq4s\" (UniqueName: \"kubernetes.io/projected/cce80648-ccb2-4ba8-802c-c8afafb13ab6-kube-api-access-gcq4s\") on node \"crc\" DevicePath \"\"" Dec 02 08:58:45 crc kubenswrapper[4895]: I1202 08:58:45.875937 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-tqxqk" event={"ID":"cce80648-ccb2-4ba8-802c-c8afafb13ab6","Type":"ContainerDied","Data":"f4027deaf01254b7b380097f1cd38bdfccf753da58a0d9a9a16e73a7396fed59"} Dec 02 08:58:45 crc kubenswrapper[4895]: I1202 08:58:45.875984 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f4027deaf01254b7b380097f1cd38bdfccf753da58a0d9a9a16e73a7396fed59" Dec 02 08:58:45 crc kubenswrapper[4895]: I1202 08:58:45.876048 4895 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-tqxqk" Dec 02 08:58:46 crc kubenswrapper[4895]: I1202 08:58:46.263550 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8844f56df-tmgd5"] Dec 02 08:58:46 crc kubenswrapper[4895]: E1202 08:58:46.264000 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cce80648-ccb2-4ba8-802c-c8afafb13ab6" containerName="cinder-db-sync" Dec 02 08:58:46 crc kubenswrapper[4895]: I1202 08:58:46.264019 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="cce80648-ccb2-4ba8-802c-c8afafb13ab6" containerName="cinder-db-sync" Dec 02 08:58:46 crc kubenswrapper[4895]: I1202 08:58:46.264234 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="cce80648-ccb2-4ba8-802c-c8afafb13ab6" containerName="cinder-db-sync" Dec 02 08:58:46 crc kubenswrapper[4895]: I1202 08:58:46.265438 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8844f56df-tmgd5" Dec 02 08:58:46 crc kubenswrapper[4895]: I1202 08:58:46.314255 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8844f56df-tmgd5"] Dec 02 08:58:46 crc kubenswrapper[4895]: I1202 08:58:46.402235 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3627f865-5ee4-49f3-8b96-96cc1b94ec6e-config\") pod \"dnsmasq-dns-8844f56df-tmgd5\" (UID: \"3627f865-5ee4-49f3-8b96-96cc1b94ec6e\") " pod="openstack/dnsmasq-dns-8844f56df-tmgd5" Dec 02 08:58:46 crc kubenswrapper[4895]: I1202 08:58:46.402733 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwntn\" (UniqueName: \"kubernetes.io/projected/3627f865-5ee4-49f3-8b96-96cc1b94ec6e-kube-api-access-jwntn\") pod \"dnsmasq-dns-8844f56df-tmgd5\" (UID: \"3627f865-5ee4-49f3-8b96-96cc1b94ec6e\") " 
pod="openstack/dnsmasq-dns-8844f56df-tmgd5" Dec 02 08:58:46 crc kubenswrapper[4895]: I1202 08:58:46.402913 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3627f865-5ee4-49f3-8b96-96cc1b94ec6e-ovsdbserver-nb\") pod \"dnsmasq-dns-8844f56df-tmgd5\" (UID: \"3627f865-5ee4-49f3-8b96-96cc1b94ec6e\") " pod="openstack/dnsmasq-dns-8844f56df-tmgd5" Dec 02 08:58:46 crc kubenswrapper[4895]: I1202 08:58:46.403041 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3627f865-5ee4-49f3-8b96-96cc1b94ec6e-ovsdbserver-sb\") pod \"dnsmasq-dns-8844f56df-tmgd5\" (UID: \"3627f865-5ee4-49f3-8b96-96cc1b94ec6e\") " pod="openstack/dnsmasq-dns-8844f56df-tmgd5" Dec 02 08:58:46 crc kubenswrapper[4895]: I1202 08:58:46.403071 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3627f865-5ee4-49f3-8b96-96cc1b94ec6e-dns-svc\") pod \"dnsmasq-dns-8844f56df-tmgd5\" (UID: \"3627f865-5ee4-49f3-8b96-96cc1b94ec6e\") " pod="openstack/dnsmasq-dns-8844f56df-tmgd5" Dec 02 08:58:46 crc kubenswrapper[4895]: I1202 08:58:46.505110 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3627f865-5ee4-49f3-8b96-96cc1b94ec6e-config\") pod \"dnsmasq-dns-8844f56df-tmgd5\" (UID: \"3627f865-5ee4-49f3-8b96-96cc1b94ec6e\") " pod="openstack/dnsmasq-dns-8844f56df-tmgd5" Dec 02 08:58:46 crc kubenswrapper[4895]: I1202 08:58:46.505539 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwntn\" (UniqueName: \"kubernetes.io/projected/3627f865-5ee4-49f3-8b96-96cc1b94ec6e-kube-api-access-jwntn\") pod \"dnsmasq-dns-8844f56df-tmgd5\" (UID: \"3627f865-5ee4-49f3-8b96-96cc1b94ec6e\") " 
pod="openstack/dnsmasq-dns-8844f56df-tmgd5" Dec 02 08:58:46 crc kubenswrapper[4895]: I1202 08:58:46.505657 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3627f865-5ee4-49f3-8b96-96cc1b94ec6e-ovsdbserver-nb\") pod \"dnsmasq-dns-8844f56df-tmgd5\" (UID: \"3627f865-5ee4-49f3-8b96-96cc1b94ec6e\") " pod="openstack/dnsmasq-dns-8844f56df-tmgd5" Dec 02 08:58:46 crc kubenswrapper[4895]: I1202 08:58:46.505823 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3627f865-5ee4-49f3-8b96-96cc1b94ec6e-ovsdbserver-sb\") pod \"dnsmasq-dns-8844f56df-tmgd5\" (UID: \"3627f865-5ee4-49f3-8b96-96cc1b94ec6e\") " pod="openstack/dnsmasq-dns-8844f56df-tmgd5" Dec 02 08:58:46 crc kubenswrapper[4895]: I1202 08:58:46.505947 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3627f865-5ee4-49f3-8b96-96cc1b94ec6e-dns-svc\") pod \"dnsmasq-dns-8844f56df-tmgd5\" (UID: \"3627f865-5ee4-49f3-8b96-96cc1b94ec6e\") " pod="openstack/dnsmasq-dns-8844f56df-tmgd5" Dec 02 08:58:46 crc kubenswrapper[4895]: I1202 08:58:46.506594 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3627f865-5ee4-49f3-8b96-96cc1b94ec6e-config\") pod \"dnsmasq-dns-8844f56df-tmgd5\" (UID: \"3627f865-5ee4-49f3-8b96-96cc1b94ec6e\") " pod="openstack/dnsmasq-dns-8844f56df-tmgd5" Dec 02 08:58:46 crc kubenswrapper[4895]: I1202 08:58:46.506934 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3627f865-5ee4-49f3-8b96-96cc1b94ec6e-ovsdbserver-nb\") pod \"dnsmasq-dns-8844f56df-tmgd5\" (UID: \"3627f865-5ee4-49f3-8b96-96cc1b94ec6e\") " pod="openstack/dnsmasq-dns-8844f56df-tmgd5" Dec 02 08:58:46 crc kubenswrapper[4895]: I1202 08:58:46.507036 
4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3627f865-5ee4-49f3-8b96-96cc1b94ec6e-dns-svc\") pod \"dnsmasq-dns-8844f56df-tmgd5\" (UID: \"3627f865-5ee4-49f3-8b96-96cc1b94ec6e\") " pod="openstack/dnsmasq-dns-8844f56df-tmgd5" Dec 02 08:58:46 crc kubenswrapper[4895]: I1202 08:58:46.507222 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3627f865-5ee4-49f3-8b96-96cc1b94ec6e-ovsdbserver-sb\") pod \"dnsmasq-dns-8844f56df-tmgd5\" (UID: \"3627f865-5ee4-49f3-8b96-96cc1b94ec6e\") " pod="openstack/dnsmasq-dns-8844f56df-tmgd5" Dec 02 08:58:46 crc kubenswrapper[4895]: I1202 08:58:46.518165 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 02 08:58:46 crc kubenswrapper[4895]: I1202 08:58:46.519807 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 02 08:58:46 crc kubenswrapper[4895]: I1202 08:58:46.521724 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 02 08:58:46 crc kubenswrapper[4895]: I1202 08:58:46.522480 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-jzlmc" Dec 02 08:58:46 crc kubenswrapper[4895]: I1202 08:58:46.522546 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 02 08:58:46 crc kubenswrapper[4895]: I1202 08:58:46.522724 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 02 08:58:46 crc kubenswrapper[4895]: I1202 08:58:46.527640 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwntn\" (UniqueName: \"kubernetes.io/projected/3627f865-5ee4-49f3-8b96-96cc1b94ec6e-kube-api-access-jwntn\") pod \"dnsmasq-dns-8844f56df-tmgd5\" (UID: 
\"3627f865-5ee4-49f3-8b96-96cc1b94ec6e\") " pod="openstack/dnsmasq-dns-8844f56df-tmgd5" Dec 02 08:58:46 crc kubenswrapper[4895]: I1202 08:58:46.536160 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 02 08:58:46 crc kubenswrapper[4895]: I1202 08:58:46.607453 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2e6bdac4-cbc7-4a33-a2f8-95f346673ee1-config-data-custom\") pod \"cinder-api-0\" (UID: \"2e6bdac4-cbc7-4a33-a2f8-95f346673ee1\") " pod="openstack/cinder-api-0" Dec 02 08:58:46 crc kubenswrapper[4895]: I1202 08:58:46.607506 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2e6bdac4-cbc7-4a33-a2f8-95f346673ee1-logs\") pod \"cinder-api-0\" (UID: \"2e6bdac4-cbc7-4a33-a2f8-95f346673ee1\") " pod="openstack/cinder-api-0" Dec 02 08:58:46 crc kubenswrapper[4895]: I1202 08:58:46.607531 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e6bdac4-cbc7-4a33-a2f8-95f346673ee1-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"2e6bdac4-cbc7-4a33-a2f8-95f346673ee1\") " pod="openstack/cinder-api-0" Dec 02 08:58:46 crc kubenswrapper[4895]: I1202 08:58:46.607932 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e6bdac4-cbc7-4a33-a2f8-95f346673ee1-config-data\") pod \"cinder-api-0\" (UID: \"2e6bdac4-cbc7-4a33-a2f8-95f346673ee1\") " pod="openstack/cinder-api-0" Dec 02 08:58:46 crc kubenswrapper[4895]: I1202 08:58:46.607986 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wfkh\" (UniqueName: 
\"kubernetes.io/projected/2e6bdac4-cbc7-4a33-a2f8-95f346673ee1-kube-api-access-7wfkh\") pod \"cinder-api-0\" (UID: \"2e6bdac4-cbc7-4a33-a2f8-95f346673ee1\") " pod="openstack/cinder-api-0" Dec 02 08:58:46 crc kubenswrapper[4895]: I1202 08:58:46.608567 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8844f56df-tmgd5" Dec 02 08:58:46 crc kubenswrapper[4895]: I1202 08:58:46.608627 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e6bdac4-cbc7-4a33-a2f8-95f346673ee1-scripts\") pod \"cinder-api-0\" (UID: \"2e6bdac4-cbc7-4a33-a2f8-95f346673ee1\") " pod="openstack/cinder-api-0" Dec 02 08:58:46 crc kubenswrapper[4895]: I1202 08:58:46.608651 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2e6bdac4-cbc7-4a33-a2f8-95f346673ee1-etc-machine-id\") pod \"cinder-api-0\" (UID: \"2e6bdac4-cbc7-4a33-a2f8-95f346673ee1\") " pod="openstack/cinder-api-0" Dec 02 08:58:46 crc kubenswrapper[4895]: I1202 08:58:46.711350 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2e6bdac4-cbc7-4a33-a2f8-95f346673ee1-etc-machine-id\") pod \"cinder-api-0\" (UID: \"2e6bdac4-cbc7-4a33-a2f8-95f346673ee1\") " pod="openstack/cinder-api-0" Dec 02 08:58:46 crc kubenswrapper[4895]: I1202 08:58:46.711439 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2e6bdac4-cbc7-4a33-a2f8-95f346673ee1-config-data-custom\") pod \"cinder-api-0\" (UID: \"2e6bdac4-cbc7-4a33-a2f8-95f346673ee1\") " pod="openstack/cinder-api-0" Dec 02 08:58:46 crc kubenswrapper[4895]: I1202 08:58:46.711475 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/2e6bdac4-cbc7-4a33-a2f8-95f346673ee1-logs\") pod \"cinder-api-0\" (UID: \"2e6bdac4-cbc7-4a33-a2f8-95f346673ee1\") " pod="openstack/cinder-api-0" Dec 02 08:58:46 crc kubenswrapper[4895]: I1202 08:58:46.711502 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e6bdac4-cbc7-4a33-a2f8-95f346673ee1-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"2e6bdac4-cbc7-4a33-a2f8-95f346673ee1\") " pod="openstack/cinder-api-0" Dec 02 08:58:46 crc kubenswrapper[4895]: I1202 08:58:46.711564 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2e6bdac4-cbc7-4a33-a2f8-95f346673ee1-etc-machine-id\") pod \"cinder-api-0\" (UID: \"2e6bdac4-cbc7-4a33-a2f8-95f346673ee1\") " pod="openstack/cinder-api-0" Dec 02 08:58:46 crc kubenswrapper[4895]: I1202 08:58:46.711621 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e6bdac4-cbc7-4a33-a2f8-95f346673ee1-config-data\") pod \"cinder-api-0\" (UID: \"2e6bdac4-cbc7-4a33-a2f8-95f346673ee1\") " pod="openstack/cinder-api-0" Dec 02 08:58:46 crc kubenswrapper[4895]: I1202 08:58:46.711649 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wfkh\" (UniqueName: \"kubernetes.io/projected/2e6bdac4-cbc7-4a33-a2f8-95f346673ee1-kube-api-access-7wfkh\") pod \"cinder-api-0\" (UID: \"2e6bdac4-cbc7-4a33-a2f8-95f346673ee1\") " pod="openstack/cinder-api-0" Dec 02 08:58:46 crc kubenswrapper[4895]: I1202 08:58:46.711741 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e6bdac4-cbc7-4a33-a2f8-95f346673ee1-scripts\") pod \"cinder-api-0\" (UID: \"2e6bdac4-cbc7-4a33-a2f8-95f346673ee1\") " pod="openstack/cinder-api-0" Dec 02 08:58:46 crc kubenswrapper[4895]: I1202 
08:58:46.712329 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2e6bdac4-cbc7-4a33-a2f8-95f346673ee1-logs\") pod \"cinder-api-0\" (UID: \"2e6bdac4-cbc7-4a33-a2f8-95f346673ee1\") " pod="openstack/cinder-api-0" Dec 02 08:58:46 crc kubenswrapper[4895]: I1202 08:58:46.716867 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e6bdac4-cbc7-4a33-a2f8-95f346673ee1-scripts\") pod \"cinder-api-0\" (UID: \"2e6bdac4-cbc7-4a33-a2f8-95f346673ee1\") " pod="openstack/cinder-api-0" Dec 02 08:58:46 crc kubenswrapper[4895]: I1202 08:58:46.717899 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e6bdac4-cbc7-4a33-a2f8-95f346673ee1-config-data\") pod \"cinder-api-0\" (UID: \"2e6bdac4-cbc7-4a33-a2f8-95f346673ee1\") " pod="openstack/cinder-api-0" Dec 02 08:58:46 crc kubenswrapper[4895]: I1202 08:58:46.718971 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2e6bdac4-cbc7-4a33-a2f8-95f346673ee1-config-data-custom\") pod \"cinder-api-0\" (UID: \"2e6bdac4-cbc7-4a33-a2f8-95f346673ee1\") " pod="openstack/cinder-api-0" Dec 02 08:58:46 crc kubenswrapper[4895]: I1202 08:58:46.719645 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e6bdac4-cbc7-4a33-a2f8-95f346673ee1-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"2e6bdac4-cbc7-4a33-a2f8-95f346673ee1\") " pod="openstack/cinder-api-0" Dec 02 08:58:46 crc kubenswrapper[4895]: I1202 08:58:46.734918 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wfkh\" (UniqueName: \"kubernetes.io/projected/2e6bdac4-cbc7-4a33-a2f8-95f346673ee1-kube-api-access-7wfkh\") pod \"cinder-api-0\" (UID: \"2e6bdac4-cbc7-4a33-a2f8-95f346673ee1\") " 
pod="openstack/cinder-api-0" Dec 02 08:58:46 crc kubenswrapper[4895]: I1202 08:58:46.904690 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 02 08:58:47 crc kubenswrapper[4895]: I1202 08:58:47.180773 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8844f56df-tmgd5"] Dec 02 08:58:47 crc kubenswrapper[4895]: I1202 08:58:47.439694 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 02 08:58:47 crc kubenswrapper[4895]: W1202 08:58:47.448458 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2e6bdac4_cbc7_4a33_a2f8_95f346673ee1.slice/crio-818919f127e8ef487f4db6c6286e0624b087cc79f98ad6a6633aa28751a12dd2 WatchSource:0}: Error finding container 818919f127e8ef487f4db6c6286e0624b087cc79f98ad6a6633aa28751a12dd2: Status 404 returned error can't find the container with id 818919f127e8ef487f4db6c6286e0624b087cc79f98ad6a6633aa28751a12dd2 Dec 02 08:58:47 crc kubenswrapper[4895]: I1202 08:58:47.903068 4895 generic.go:334] "Generic (PLEG): container finished" podID="3627f865-5ee4-49f3-8b96-96cc1b94ec6e" containerID="291b2922a560b070518508ca07ae54035db019788cc315c3c32b0a88ba89aaa1" exitCode=0 Dec 02 08:58:47 crc kubenswrapper[4895]: I1202 08:58:47.903605 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8844f56df-tmgd5" event={"ID":"3627f865-5ee4-49f3-8b96-96cc1b94ec6e","Type":"ContainerDied","Data":"291b2922a560b070518508ca07ae54035db019788cc315c3c32b0a88ba89aaa1"} Dec 02 08:58:47 crc kubenswrapper[4895]: I1202 08:58:47.903646 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8844f56df-tmgd5" event={"ID":"3627f865-5ee4-49f3-8b96-96cc1b94ec6e","Type":"ContainerStarted","Data":"b0d5d346f62edb3673669a053519fdedadfbfcb21db8d8586e2b262de4c76987"} Dec 02 08:58:47 crc kubenswrapper[4895]: I1202 08:58:47.905823 4895 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2e6bdac4-cbc7-4a33-a2f8-95f346673ee1","Type":"ContainerStarted","Data":"818919f127e8ef487f4db6c6286e0624b087cc79f98ad6a6633aa28751a12dd2"} Dec 02 08:58:48 crc kubenswrapper[4895]: I1202 08:58:48.916809 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8844f56df-tmgd5" event={"ID":"3627f865-5ee4-49f3-8b96-96cc1b94ec6e","Type":"ContainerStarted","Data":"fef8dba04b04619b0c9ea61e2d8af31dcdc4a776814001ec9cc02f811b6ca453"} Dec 02 08:58:48 crc kubenswrapper[4895]: I1202 08:58:48.918271 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8844f56df-tmgd5" Dec 02 08:58:48 crc kubenswrapper[4895]: I1202 08:58:48.921872 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2e6bdac4-cbc7-4a33-a2f8-95f346673ee1","Type":"ContainerStarted","Data":"067db0a6ec45939efcf4939fef342452de0c1d2b83c3324bd79dc142bc5bc1f9"} Dec 02 08:58:48 crc kubenswrapper[4895]: I1202 08:58:48.922048 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2e6bdac4-cbc7-4a33-a2f8-95f346673ee1","Type":"ContainerStarted","Data":"47ddf22839b4266b041a95432010db939e597716018941d65c46f0b2b19daf67"} Dec 02 08:58:48 crc kubenswrapper[4895]: I1202 08:58:48.922119 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 02 08:58:48 crc kubenswrapper[4895]: I1202 08:58:48.943833 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8844f56df-tmgd5" podStartSLOduration=2.943814497 podStartE2EDuration="2.943814497s" podCreationTimestamp="2025-12-02 08:58:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:58:48.937159821 +0000 UTC m=+5740.108019434" 
watchObservedRunningTime="2025-12-02 08:58:48.943814497 +0000 UTC m=+5740.114674110" Dec 02 08:58:48 crc kubenswrapper[4895]: I1202 08:58:48.965720 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=2.965693718 podStartE2EDuration="2.965693718s" podCreationTimestamp="2025-12-02 08:58:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:58:48.956693319 +0000 UTC m=+5740.127552952" watchObservedRunningTime="2025-12-02 08:58:48.965693718 +0000 UTC m=+5740.136553341" Dec 02 08:58:56 crc kubenswrapper[4895]: I1202 08:58:56.141658 4895 scope.go:117] "RemoveContainer" containerID="d95d9d738930381cc247f4992573f82ec7c89226f23be0d771d1a3b1b3cf1812" Dec 02 08:58:56 crc kubenswrapper[4895]: E1202 08:58:56.142540 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 08:58:56 crc kubenswrapper[4895]: I1202 08:58:56.611092 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8844f56df-tmgd5" Dec 02 08:58:56 crc kubenswrapper[4895]: I1202 08:58:56.669076 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7995d4655c-tqhwq"] Dec 02 08:58:56 crc kubenswrapper[4895]: I1202 08:58:56.669576 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7995d4655c-tqhwq" podUID="bb158117-fd9d-4a7a-9bbb-68ae8d292a55" containerName="dnsmasq-dns" containerID="cri-o://38d3472bc99b3c76ae64de6eb3753c7a04f3ffdeac238a4f81f5b4a605d4f08b" 
gracePeriod=10 Dec 02 08:58:56 crc kubenswrapper[4895]: I1202 08:58:56.994280 4895 generic.go:334] "Generic (PLEG): container finished" podID="bb158117-fd9d-4a7a-9bbb-68ae8d292a55" containerID="38d3472bc99b3c76ae64de6eb3753c7a04f3ffdeac238a4f81f5b4a605d4f08b" exitCode=0 Dec 02 08:58:56 crc kubenswrapper[4895]: I1202 08:58:56.994645 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7995d4655c-tqhwq" event={"ID":"bb158117-fd9d-4a7a-9bbb-68ae8d292a55","Type":"ContainerDied","Data":"38d3472bc99b3c76ae64de6eb3753c7a04f3ffdeac238a4f81f5b4a605d4f08b"} Dec 02 08:58:57 crc kubenswrapper[4895]: I1202 08:58:57.203990 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7995d4655c-tqhwq" Dec 02 08:58:57 crc kubenswrapper[4895]: I1202 08:58:57.342386 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8dr9m\" (UniqueName: \"kubernetes.io/projected/bb158117-fd9d-4a7a-9bbb-68ae8d292a55-kube-api-access-8dr9m\") pod \"bb158117-fd9d-4a7a-9bbb-68ae8d292a55\" (UID: \"bb158117-fd9d-4a7a-9bbb-68ae8d292a55\") " Dec 02 08:58:57 crc kubenswrapper[4895]: I1202 08:58:57.342794 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bb158117-fd9d-4a7a-9bbb-68ae8d292a55-ovsdbserver-sb\") pod \"bb158117-fd9d-4a7a-9bbb-68ae8d292a55\" (UID: \"bb158117-fd9d-4a7a-9bbb-68ae8d292a55\") " Dec 02 08:58:57 crc kubenswrapper[4895]: I1202 08:58:57.342891 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bb158117-fd9d-4a7a-9bbb-68ae8d292a55-dns-svc\") pod \"bb158117-fd9d-4a7a-9bbb-68ae8d292a55\" (UID: \"bb158117-fd9d-4a7a-9bbb-68ae8d292a55\") " Dec 02 08:58:57 crc kubenswrapper[4895]: I1202 08:58:57.343168 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/bb158117-fd9d-4a7a-9bbb-68ae8d292a55-ovsdbserver-nb\") pod \"bb158117-fd9d-4a7a-9bbb-68ae8d292a55\" (UID: \"bb158117-fd9d-4a7a-9bbb-68ae8d292a55\") " Dec 02 08:58:57 crc kubenswrapper[4895]: I1202 08:58:57.343299 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb158117-fd9d-4a7a-9bbb-68ae8d292a55-config\") pod \"bb158117-fd9d-4a7a-9bbb-68ae8d292a55\" (UID: \"bb158117-fd9d-4a7a-9bbb-68ae8d292a55\") " Dec 02 08:58:57 crc kubenswrapper[4895]: I1202 08:58:57.351010 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb158117-fd9d-4a7a-9bbb-68ae8d292a55-kube-api-access-8dr9m" (OuterVolumeSpecName: "kube-api-access-8dr9m") pod "bb158117-fd9d-4a7a-9bbb-68ae8d292a55" (UID: "bb158117-fd9d-4a7a-9bbb-68ae8d292a55"). InnerVolumeSpecName "kube-api-access-8dr9m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:58:57 crc kubenswrapper[4895]: I1202 08:58:57.397579 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb158117-fd9d-4a7a-9bbb-68ae8d292a55-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "bb158117-fd9d-4a7a-9bbb-68ae8d292a55" (UID: "bb158117-fd9d-4a7a-9bbb-68ae8d292a55"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:58:57 crc kubenswrapper[4895]: I1202 08:58:57.402589 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb158117-fd9d-4a7a-9bbb-68ae8d292a55-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "bb158117-fd9d-4a7a-9bbb-68ae8d292a55" (UID: "bb158117-fd9d-4a7a-9bbb-68ae8d292a55"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:58:57 crc kubenswrapper[4895]: I1202 08:58:57.413038 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb158117-fd9d-4a7a-9bbb-68ae8d292a55-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "bb158117-fd9d-4a7a-9bbb-68ae8d292a55" (UID: "bb158117-fd9d-4a7a-9bbb-68ae8d292a55"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:58:57 crc kubenswrapper[4895]: I1202 08:58:57.413061 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb158117-fd9d-4a7a-9bbb-68ae8d292a55-config" (OuterVolumeSpecName: "config") pod "bb158117-fd9d-4a7a-9bbb-68ae8d292a55" (UID: "bb158117-fd9d-4a7a-9bbb-68ae8d292a55"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:58:57 crc kubenswrapper[4895]: I1202 08:58:57.445154 4895 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb158117-fd9d-4a7a-9bbb-68ae8d292a55-config\") on node \"crc\" DevicePath \"\"" Dec 02 08:58:57 crc kubenswrapper[4895]: I1202 08:58:57.445189 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8dr9m\" (UniqueName: \"kubernetes.io/projected/bb158117-fd9d-4a7a-9bbb-68ae8d292a55-kube-api-access-8dr9m\") on node \"crc\" DevicePath \"\"" Dec 02 08:58:57 crc kubenswrapper[4895]: I1202 08:58:57.445200 4895 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bb158117-fd9d-4a7a-9bbb-68ae8d292a55-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 02 08:58:57 crc kubenswrapper[4895]: I1202 08:58:57.445209 4895 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bb158117-fd9d-4a7a-9bbb-68ae8d292a55-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 02 08:58:57 crc 
kubenswrapper[4895]: I1202 08:58:57.445221 4895 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bb158117-fd9d-4a7a-9bbb-68ae8d292a55-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 02 08:58:58 crc kubenswrapper[4895]: I1202 08:58:58.004002 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7995d4655c-tqhwq" event={"ID":"bb158117-fd9d-4a7a-9bbb-68ae8d292a55","Type":"ContainerDied","Data":"42b987c6bac923d5e49bcf3f51fc3ed1fc7744b33d052547c8fd89b4262bfbb3"} Dec 02 08:58:58 crc kubenswrapper[4895]: I1202 08:58:58.004320 4895 scope.go:117] "RemoveContainer" containerID="38d3472bc99b3c76ae64de6eb3753c7a04f3ffdeac238a4f81f5b4a605d4f08b" Dec 02 08:58:58 crc kubenswrapper[4895]: I1202 08:58:58.004464 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7995d4655c-tqhwq" Dec 02 08:58:58 crc kubenswrapper[4895]: I1202 08:58:58.050694 4895 scope.go:117] "RemoveContainer" containerID="57a8b1693c738fe772145bf176d158cb33e31ee7aaa002a5ca02fda5c1799f26" Dec 02 08:58:58 crc kubenswrapper[4895]: I1202 08:58:58.051446 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7995d4655c-tqhwq"] Dec 02 08:58:58 crc kubenswrapper[4895]: I1202 08:58:58.061878 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7995d4655c-tqhwq"] Dec 02 08:58:58 crc kubenswrapper[4895]: I1202 08:58:58.334181 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 08:58:58 crc kubenswrapper[4895]: I1202 08:58:58.334486 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="4581700a-677c-4be3-b004-b53b8b4d5f42" containerName="nova-scheduler-scheduler" containerID="cri-o://9b134a5d60406bb0566f1fcda9c18ad274584701d44af124bfce3f30ba0fd952" gracePeriod=30 Dec 02 08:58:58 crc kubenswrapper[4895]: I1202 
08:58:58.363004 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 08:58:58 crc kubenswrapper[4895]: I1202 08:58:58.363309 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="6ba0fc5d-edf7-4eac-b2a3-0a2fa14ea6db" containerName="nova-metadata-log" containerID="cri-o://d8de976055e2285d9f987e2065884fe0f2a2279b0e9d2ed374f542fbcfb8e423" gracePeriod=30 Dec 02 08:58:58 crc kubenswrapper[4895]: I1202 08:58:58.363579 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="6ba0fc5d-edf7-4eac-b2a3-0a2fa14ea6db" containerName="nova-metadata-metadata" containerID="cri-o://347642b4e949226d454594ff3f8477f25e57b0b2c4ed371a8a76046b238c8a5a" gracePeriod=30 Dec 02 08:58:58 crc kubenswrapper[4895]: I1202 08:58:58.387329 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 02 08:58:58 crc kubenswrapper[4895]: I1202 08:58:58.387568 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="9b1a6c2e-ced5-4dc3-ae1a-ed6fdd7cd12b" containerName="nova-cell0-conductor-conductor" containerID="cri-o://a4b6862707d31377831d1999c1f773f9f1687d41cbe8a19027aa93249bfde329" gracePeriod=30 Dec 02 08:58:58 crc kubenswrapper[4895]: I1202 08:58:58.393868 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 02 08:58:58 crc kubenswrapper[4895]: I1202 08:58:58.394213 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="2c12acf7-543e-4b73-b873-e7ac86ad3471" containerName="nova-api-log" containerID="cri-o://5e2275ac273a05e40f70400e8a866ffeb0ec30b5749cbb8ed8117b0bb37e9c01" gracePeriod=30 Dec 02 08:58:58 crc kubenswrapper[4895]: I1202 08:58:58.394254 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" 
podUID="2c12acf7-543e-4b73-b873-e7ac86ad3471" containerName="nova-api-api" containerID="cri-o://288d48d3ee50f6f76b000d7f0fdc3530e8cb8da4f539b53aca09276d2c1b02e0" gracePeriod=30 Dec 02 08:58:58 crc kubenswrapper[4895]: I1202 08:58:58.408561 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 02 08:58:58 crc kubenswrapper[4895]: I1202 08:58:58.408952 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="76e3c0b2-3994-476b-aead-29d0c1a7d7ce" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://7e9d737b175822cdc45bd8e52baeb99570dfc11e6cd8cd418de28a8a4ee982c7" gracePeriod=30 Dec 02 08:58:59 crc kubenswrapper[4895]: I1202 08:58:59.018820 4895 generic.go:334] "Generic (PLEG): container finished" podID="6ba0fc5d-edf7-4eac-b2a3-0a2fa14ea6db" containerID="d8de976055e2285d9f987e2065884fe0f2a2279b0e9d2ed374f542fbcfb8e423" exitCode=143 Dec 02 08:58:59 crc kubenswrapper[4895]: I1202 08:58:59.018956 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6ba0fc5d-edf7-4eac-b2a3-0a2fa14ea6db","Type":"ContainerDied","Data":"d8de976055e2285d9f987e2065884fe0f2a2279b0e9d2ed374f542fbcfb8e423"} Dec 02 08:58:59 crc kubenswrapper[4895]: I1202 08:58:59.022726 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2c12acf7-543e-4b73-b873-e7ac86ad3471","Type":"ContainerDied","Data":"5e2275ac273a05e40f70400e8a866ffeb0ec30b5749cbb8ed8117b0bb37e9c01"} Dec 02 08:58:59 crc kubenswrapper[4895]: I1202 08:58:59.023154 4895 generic.go:334] "Generic (PLEG): container finished" podID="2c12acf7-543e-4b73-b873-e7ac86ad3471" containerID="5e2275ac273a05e40f70400e8a866ffeb0ec30b5749cbb8ed8117b0bb37e9c01" exitCode=143 Dec 02 08:58:59 crc kubenswrapper[4895]: I1202 08:58:59.027266 4895 generic.go:334] "Generic (PLEG): container finished" podID="76e3c0b2-3994-476b-aead-29d0c1a7d7ce" 
containerID="7e9d737b175822cdc45bd8e52baeb99570dfc11e6cd8cd418de28a8a4ee982c7" exitCode=0 Dec 02 08:58:59 crc kubenswrapper[4895]: I1202 08:58:59.027308 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"76e3c0b2-3994-476b-aead-29d0c1a7d7ce","Type":"ContainerDied","Data":"7e9d737b175822cdc45bd8e52baeb99570dfc11e6cd8cd418de28a8a4ee982c7"} Dec 02 08:58:59 crc kubenswrapper[4895]: I1202 08:58:59.156002 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb158117-fd9d-4a7a-9bbb-68ae8d292a55" path="/var/lib/kubelet/pods/bb158117-fd9d-4a7a-9bbb-68ae8d292a55/volumes" Dec 02 08:58:59 crc kubenswrapper[4895]: I1202 08:58:59.156676 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Dec 02 08:58:59 crc kubenswrapper[4895]: I1202 08:58:59.534484 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 02 08:58:59 crc kubenswrapper[4895]: I1202 08:58:59.590015 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-svs65\" (UniqueName: \"kubernetes.io/projected/76e3c0b2-3994-476b-aead-29d0c1a7d7ce-kube-api-access-svs65\") pod \"76e3c0b2-3994-476b-aead-29d0c1a7d7ce\" (UID: \"76e3c0b2-3994-476b-aead-29d0c1a7d7ce\") " Dec 02 08:58:59 crc kubenswrapper[4895]: I1202 08:58:59.590121 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76e3c0b2-3994-476b-aead-29d0c1a7d7ce-combined-ca-bundle\") pod \"76e3c0b2-3994-476b-aead-29d0c1a7d7ce\" (UID: \"76e3c0b2-3994-476b-aead-29d0c1a7d7ce\") " Dec 02 08:58:59 crc kubenswrapper[4895]: I1202 08:58:59.590149 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76e3c0b2-3994-476b-aead-29d0c1a7d7ce-config-data\") pod 
\"76e3c0b2-3994-476b-aead-29d0c1a7d7ce\" (UID: \"76e3c0b2-3994-476b-aead-29d0c1a7d7ce\") " Dec 02 08:58:59 crc kubenswrapper[4895]: I1202 08:58:59.639448 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76e3c0b2-3994-476b-aead-29d0c1a7d7ce-config-data" (OuterVolumeSpecName: "config-data") pod "76e3c0b2-3994-476b-aead-29d0c1a7d7ce" (UID: "76e3c0b2-3994-476b-aead-29d0c1a7d7ce"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:58:59 crc kubenswrapper[4895]: I1202 08:58:59.649058 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76e3c0b2-3994-476b-aead-29d0c1a7d7ce-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "76e3c0b2-3994-476b-aead-29d0c1a7d7ce" (UID: "76e3c0b2-3994-476b-aead-29d0c1a7d7ce"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:58:59 crc kubenswrapper[4895]: I1202 08:58:59.649173 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76e3c0b2-3994-476b-aead-29d0c1a7d7ce-kube-api-access-svs65" (OuterVolumeSpecName: "kube-api-access-svs65") pod "76e3c0b2-3994-476b-aead-29d0c1a7d7ce" (UID: "76e3c0b2-3994-476b-aead-29d0c1a7d7ce"). InnerVolumeSpecName "kube-api-access-svs65". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:58:59 crc kubenswrapper[4895]: I1202 08:58:59.693606 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-svs65\" (UniqueName: \"kubernetes.io/projected/76e3c0b2-3994-476b-aead-29d0c1a7d7ce-kube-api-access-svs65\") on node \"crc\" DevicePath \"\"" Dec 02 08:58:59 crc kubenswrapper[4895]: I1202 08:58:59.693659 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76e3c0b2-3994-476b-aead-29d0c1a7d7ce-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 08:58:59 crc kubenswrapper[4895]: I1202 08:58:59.693671 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76e3c0b2-3994-476b-aead-29d0c1a7d7ce-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 08:58:59 crc kubenswrapper[4895]: I1202 08:58:59.738929 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 02 08:58:59 crc kubenswrapper[4895]: I1202 08:58:59.797494 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4581700a-677c-4be3-b004-b53b8b4d5f42-config-data\") pod \"4581700a-677c-4be3-b004-b53b8b4d5f42\" (UID: \"4581700a-677c-4be3-b004-b53b8b4d5f42\") " Dec 02 08:58:59 crc kubenswrapper[4895]: I1202 08:58:59.797671 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4581700a-677c-4be3-b004-b53b8b4d5f42-combined-ca-bundle\") pod \"4581700a-677c-4be3-b004-b53b8b4d5f42\" (UID: \"4581700a-677c-4be3-b004-b53b8b4d5f42\") " Dec 02 08:58:59 crc kubenswrapper[4895]: I1202 08:58:59.798004 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ntrmx\" (UniqueName: 
\"kubernetes.io/projected/4581700a-677c-4be3-b004-b53b8b4d5f42-kube-api-access-ntrmx\") pod \"4581700a-677c-4be3-b004-b53b8b4d5f42\" (UID: \"4581700a-677c-4be3-b004-b53b8b4d5f42\") " Dec 02 08:58:59 crc kubenswrapper[4895]: I1202 08:58:59.812948 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4581700a-677c-4be3-b004-b53b8b4d5f42-kube-api-access-ntrmx" (OuterVolumeSpecName: "kube-api-access-ntrmx") pod "4581700a-677c-4be3-b004-b53b8b4d5f42" (UID: "4581700a-677c-4be3-b004-b53b8b4d5f42"). InnerVolumeSpecName "kube-api-access-ntrmx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:58:59 crc kubenswrapper[4895]: I1202 08:58:59.842037 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4581700a-677c-4be3-b004-b53b8b4d5f42-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4581700a-677c-4be3-b004-b53b8b4d5f42" (UID: "4581700a-677c-4be3-b004-b53b8b4d5f42"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:58:59 crc kubenswrapper[4895]: I1202 08:58:59.887917 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4581700a-677c-4be3-b004-b53b8b4d5f42-config-data" (OuterVolumeSpecName: "config-data") pod "4581700a-677c-4be3-b004-b53b8b4d5f42" (UID: "4581700a-677c-4be3-b004-b53b8b4d5f42"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:58:59 crc kubenswrapper[4895]: I1202 08:58:59.901242 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ntrmx\" (UniqueName: \"kubernetes.io/projected/4581700a-677c-4be3-b004-b53b8b4d5f42-kube-api-access-ntrmx\") on node \"crc\" DevicePath \"\"" Dec 02 08:58:59 crc kubenswrapper[4895]: I1202 08:58:59.901281 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4581700a-677c-4be3-b004-b53b8b4d5f42-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 08:58:59 crc kubenswrapper[4895]: I1202 08:58:59.901292 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4581700a-677c-4be3-b004-b53b8b4d5f42-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 08:59:00 crc kubenswrapper[4895]: I1202 08:59:00.042757 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"76e3c0b2-3994-476b-aead-29d0c1a7d7ce","Type":"ContainerDied","Data":"ed3eb02a7c2ac8de716f6454be97406e6ca225e61aeb7e863e298edb937e7f0b"} Dec 02 08:59:00 crc kubenswrapper[4895]: I1202 08:59:00.042838 4895 scope.go:117] "RemoveContainer" containerID="7e9d737b175822cdc45bd8e52baeb99570dfc11e6cd8cd418de28a8a4ee982c7" Dec 02 08:59:00 crc kubenswrapper[4895]: I1202 08:59:00.042971 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 02 08:59:00 crc kubenswrapper[4895]: I1202 08:59:00.059187 4895 generic.go:334] "Generic (PLEG): container finished" podID="4581700a-677c-4be3-b004-b53b8b4d5f42" containerID="9b134a5d60406bb0566f1fcda9c18ad274584701d44af124bfce3f30ba0fd952" exitCode=0 Dec 02 08:59:00 crc kubenswrapper[4895]: I1202 08:59:00.059245 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4581700a-677c-4be3-b004-b53b8b4d5f42","Type":"ContainerDied","Data":"9b134a5d60406bb0566f1fcda9c18ad274584701d44af124bfce3f30ba0fd952"} Dec 02 08:59:00 crc kubenswrapper[4895]: I1202 08:59:00.059282 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4581700a-677c-4be3-b004-b53b8b4d5f42","Type":"ContainerDied","Data":"24ea3c73248f60453788a9aab0370f25558ad7f6ea7958fc301ee41f35020fd6"} Dec 02 08:59:00 crc kubenswrapper[4895]: I1202 08:59:00.059345 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 02 08:59:00 crc kubenswrapper[4895]: I1202 08:59:00.101583 4895 scope.go:117] "RemoveContainer" containerID="9b134a5d60406bb0566f1fcda9c18ad274584701d44af124bfce3f30ba0fd952" Dec 02 08:59:00 crc kubenswrapper[4895]: I1202 08:59:00.112958 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 02 08:59:00 crc kubenswrapper[4895]: I1202 08:59:00.133840 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 02 08:59:00 crc kubenswrapper[4895]: I1202 08:59:00.152797 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 08:59:00 crc kubenswrapper[4895]: I1202 08:59:00.154868 4895 scope.go:117] "RemoveContainer" containerID="9b134a5d60406bb0566f1fcda9c18ad274584701d44af124bfce3f30ba0fd952" Dec 02 08:59:00 crc kubenswrapper[4895]: E1202 08:59:00.159003 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b134a5d60406bb0566f1fcda9c18ad274584701d44af124bfce3f30ba0fd952\": container with ID starting with 9b134a5d60406bb0566f1fcda9c18ad274584701d44af124bfce3f30ba0fd952 not found: ID does not exist" containerID="9b134a5d60406bb0566f1fcda9c18ad274584701d44af124bfce3f30ba0fd952" Dec 02 08:59:00 crc kubenswrapper[4895]: I1202 08:59:00.159079 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b134a5d60406bb0566f1fcda9c18ad274584701d44af124bfce3f30ba0fd952"} err="failed to get container status \"9b134a5d60406bb0566f1fcda9c18ad274584701d44af124bfce3f30ba0fd952\": rpc error: code = NotFound desc = could not find container \"9b134a5d60406bb0566f1fcda9c18ad274584701d44af124bfce3f30ba0fd952\": container with ID starting with 9b134a5d60406bb0566f1fcda9c18ad274584701d44af124bfce3f30ba0fd952 not found: ID does not exist" Dec 02 08:59:00 crc kubenswrapper[4895]: I1202 
08:59:00.163872 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 08:59:00 crc kubenswrapper[4895]: I1202 08:59:00.173679 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 02 08:59:00 crc kubenswrapper[4895]: E1202 08:59:00.174160 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4581700a-677c-4be3-b004-b53b8b4d5f42" containerName="nova-scheduler-scheduler" Dec 02 08:59:00 crc kubenswrapper[4895]: I1202 08:59:00.174188 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="4581700a-677c-4be3-b004-b53b8b4d5f42" containerName="nova-scheduler-scheduler" Dec 02 08:59:00 crc kubenswrapper[4895]: E1202 08:59:00.174198 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb158117-fd9d-4a7a-9bbb-68ae8d292a55" containerName="dnsmasq-dns" Dec 02 08:59:00 crc kubenswrapper[4895]: I1202 08:59:00.174209 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb158117-fd9d-4a7a-9bbb-68ae8d292a55" containerName="dnsmasq-dns" Dec 02 08:59:00 crc kubenswrapper[4895]: E1202 08:59:00.174234 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76e3c0b2-3994-476b-aead-29d0c1a7d7ce" containerName="nova-cell1-novncproxy-novncproxy" Dec 02 08:59:00 crc kubenswrapper[4895]: I1202 08:59:00.174242 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="76e3c0b2-3994-476b-aead-29d0c1a7d7ce" containerName="nova-cell1-novncproxy-novncproxy" Dec 02 08:59:00 crc kubenswrapper[4895]: E1202 08:59:00.174266 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb158117-fd9d-4a7a-9bbb-68ae8d292a55" containerName="init" Dec 02 08:59:00 crc kubenswrapper[4895]: I1202 08:59:00.174275 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb158117-fd9d-4a7a-9bbb-68ae8d292a55" containerName="init" Dec 02 08:59:00 crc kubenswrapper[4895]: I1202 08:59:00.174506 4895 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="bb158117-fd9d-4a7a-9bbb-68ae8d292a55" containerName="dnsmasq-dns" Dec 02 08:59:00 crc kubenswrapper[4895]: I1202 08:59:00.174539 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="76e3c0b2-3994-476b-aead-29d0c1a7d7ce" containerName="nova-cell1-novncproxy-novncproxy" Dec 02 08:59:00 crc kubenswrapper[4895]: I1202 08:59:00.174555 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="4581700a-677c-4be3-b004-b53b8b4d5f42" containerName="nova-scheduler-scheduler" Dec 02 08:59:00 crc kubenswrapper[4895]: I1202 08:59:00.175450 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 02 08:59:00 crc kubenswrapper[4895]: I1202 08:59:00.179524 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 02 08:59:00 crc kubenswrapper[4895]: I1202 08:59:00.185561 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 08:59:00 crc kubenswrapper[4895]: I1202 08:59:00.188028 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 02 08:59:00 crc kubenswrapper[4895]: I1202 08:59:00.192278 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 02 08:59:00 crc kubenswrapper[4895]: I1202 08:59:00.192863 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 02 08:59:00 crc kubenswrapper[4895]: I1202 08:59:00.209574 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 08:59:00 crc kubenswrapper[4895]: I1202 08:59:00.348057 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7f4sl\" (UniqueName: \"kubernetes.io/projected/916b9d2f-42d2-4468-98b5-de64da9af5fc-kube-api-access-7f4sl\") pod \"nova-cell1-novncproxy-0\" (UID: \"916b9d2f-42d2-4468-98b5-de64da9af5fc\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 08:59:00 crc kubenswrapper[4895]: I1202 08:59:00.348295 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/807ab313-d84c-4059-aa53-4d99c8c65192-config-data\") pod \"nova-scheduler-0\" (UID: \"807ab313-d84c-4059-aa53-4d99c8c65192\") " pod="openstack/nova-scheduler-0" Dec 02 08:59:00 crc kubenswrapper[4895]: I1202 08:59:00.348327 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzsk9\" (UniqueName: \"kubernetes.io/projected/807ab313-d84c-4059-aa53-4d99c8c65192-kube-api-access-rzsk9\") pod \"nova-scheduler-0\" (UID: \"807ab313-d84c-4059-aa53-4d99c8c65192\") " pod="openstack/nova-scheduler-0" Dec 02 08:59:00 crc kubenswrapper[4895]: I1202 08:59:00.348348 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/807ab313-d84c-4059-aa53-4d99c8c65192-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"807ab313-d84c-4059-aa53-4d99c8c65192\") " pod="openstack/nova-scheduler-0" Dec 02 08:59:00 crc kubenswrapper[4895]: I1202 08:59:00.348493 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/916b9d2f-42d2-4468-98b5-de64da9af5fc-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"916b9d2f-42d2-4468-98b5-de64da9af5fc\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 08:59:00 crc kubenswrapper[4895]: I1202 08:59:00.348516 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/916b9d2f-42d2-4468-98b5-de64da9af5fc-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"916b9d2f-42d2-4468-98b5-de64da9af5fc\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 08:59:00 crc kubenswrapper[4895]: I1202 08:59:00.450429 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/807ab313-d84c-4059-aa53-4d99c8c65192-config-data\") pod \"nova-scheduler-0\" (UID: \"807ab313-d84c-4059-aa53-4d99c8c65192\") " pod="openstack/nova-scheduler-0" Dec 02 08:59:00 crc kubenswrapper[4895]: I1202 08:59:00.450536 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rzsk9\" (UniqueName: \"kubernetes.io/projected/807ab313-d84c-4059-aa53-4d99c8c65192-kube-api-access-rzsk9\") pod \"nova-scheduler-0\" (UID: \"807ab313-d84c-4059-aa53-4d99c8c65192\") " pod="openstack/nova-scheduler-0" Dec 02 08:59:00 crc kubenswrapper[4895]: I1202 08:59:00.450567 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/807ab313-d84c-4059-aa53-4d99c8c65192-combined-ca-bundle\") pod 
\"nova-scheduler-0\" (UID: \"807ab313-d84c-4059-aa53-4d99c8c65192\") " pod="openstack/nova-scheduler-0" Dec 02 08:59:00 crc kubenswrapper[4895]: I1202 08:59:00.451260 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/916b9d2f-42d2-4468-98b5-de64da9af5fc-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"916b9d2f-42d2-4468-98b5-de64da9af5fc\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 08:59:00 crc kubenswrapper[4895]: I1202 08:59:00.451296 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/916b9d2f-42d2-4468-98b5-de64da9af5fc-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"916b9d2f-42d2-4468-98b5-de64da9af5fc\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 08:59:00 crc kubenswrapper[4895]: I1202 08:59:00.451907 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7f4sl\" (UniqueName: \"kubernetes.io/projected/916b9d2f-42d2-4468-98b5-de64da9af5fc-kube-api-access-7f4sl\") pod \"nova-cell1-novncproxy-0\" (UID: \"916b9d2f-42d2-4468-98b5-de64da9af5fc\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 08:59:00 crc kubenswrapper[4895]: I1202 08:59:00.454180 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/807ab313-d84c-4059-aa53-4d99c8c65192-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"807ab313-d84c-4059-aa53-4d99c8c65192\") " pod="openstack/nova-scheduler-0" Dec 02 08:59:00 crc kubenswrapper[4895]: I1202 08:59:00.454787 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/807ab313-d84c-4059-aa53-4d99c8c65192-config-data\") pod \"nova-scheduler-0\" (UID: \"807ab313-d84c-4059-aa53-4d99c8c65192\") " pod="openstack/nova-scheduler-0" Dec 02 08:59:00 crc 
kubenswrapper[4895]: I1202 08:59:00.454970 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/916b9d2f-42d2-4468-98b5-de64da9af5fc-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"916b9d2f-42d2-4468-98b5-de64da9af5fc\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 08:59:00 crc kubenswrapper[4895]: I1202 08:59:00.466599 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/916b9d2f-42d2-4468-98b5-de64da9af5fc-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"916b9d2f-42d2-4468-98b5-de64da9af5fc\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 08:59:00 crc kubenswrapper[4895]: I1202 08:59:00.469426 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7f4sl\" (UniqueName: \"kubernetes.io/projected/916b9d2f-42d2-4468-98b5-de64da9af5fc-kube-api-access-7f4sl\") pod \"nova-cell1-novncproxy-0\" (UID: \"916b9d2f-42d2-4468-98b5-de64da9af5fc\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 08:59:00 crc kubenswrapper[4895]: I1202 08:59:00.469938 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzsk9\" (UniqueName: \"kubernetes.io/projected/807ab313-d84c-4059-aa53-4d99c8c65192-kube-api-access-rzsk9\") pod \"nova-scheduler-0\" (UID: \"807ab313-d84c-4059-aa53-4d99c8c65192\") " pod="openstack/nova-scheduler-0" Dec 02 08:59:00 crc kubenswrapper[4895]: I1202 08:59:00.512300 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 02 08:59:00 crc kubenswrapper[4895]: I1202 08:59:00.530337 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 02 08:59:01 crc kubenswrapper[4895]: I1202 08:59:01.020174 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 02 08:59:01 crc kubenswrapper[4895]: W1202 08:59:01.036223 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod916b9d2f_42d2_4468_98b5_de64da9af5fc.slice/crio-75910e4a9cf2c217f12b5df418d16a865bda686277f2e0f9827a39e872c9d54a WatchSource:0}: Error finding container 75910e4a9cf2c217f12b5df418d16a865bda686277f2e0f9827a39e872c9d54a: Status 404 returned error can't find the container with id 75910e4a9cf2c217f12b5df418d16a865bda686277f2e0f9827a39e872c9d54a Dec 02 08:59:01 crc kubenswrapper[4895]: I1202 08:59:01.072405 4895 generic.go:334] "Generic (PLEG): container finished" podID="9b1a6c2e-ced5-4dc3-ae1a-ed6fdd7cd12b" containerID="a4b6862707d31377831d1999c1f773f9f1687d41cbe8a19027aa93249bfde329" exitCode=0 Dec 02 08:59:01 crc kubenswrapper[4895]: I1202 08:59:01.072487 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"9b1a6c2e-ced5-4dc3-ae1a-ed6fdd7cd12b","Type":"ContainerDied","Data":"a4b6862707d31377831d1999c1f773f9f1687d41cbe8a19027aa93249bfde329"} Dec 02 08:59:01 crc kubenswrapper[4895]: I1202 08:59:01.081127 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"916b9d2f-42d2-4468-98b5-de64da9af5fc","Type":"ContainerStarted","Data":"75910e4a9cf2c217f12b5df418d16a865bda686277f2e0f9827a39e872c9d54a"} Dec 02 08:59:01 crc kubenswrapper[4895]: I1202 08:59:01.108190 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 08:59:01 crc kubenswrapper[4895]: W1202 08:59:01.117332 4895 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod807ab313_d84c_4059_aa53_4d99c8c65192.slice/crio-be41583932ef51fc53c34a9b064fb3197e17e33c6119b7d48f9a787e58c0d4bf WatchSource:0}: Error finding container be41583932ef51fc53c34a9b064fb3197e17e33c6119b7d48f9a787e58c0d4bf: Status 404 returned error can't find the container with id be41583932ef51fc53c34a9b064fb3197e17e33c6119b7d48f9a787e58c0d4bf Dec 02 08:59:01 crc kubenswrapper[4895]: I1202 08:59:01.159364 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4581700a-677c-4be3-b004-b53b8b4d5f42" path="/var/lib/kubelet/pods/4581700a-677c-4be3-b004-b53b8b4d5f42/volumes" Dec 02 08:59:01 crc kubenswrapper[4895]: I1202 08:59:01.161036 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76e3c0b2-3994-476b-aead-29d0c1a7d7ce" path="/var/lib/kubelet/pods/76e3c0b2-3994-476b-aead-29d0c1a7d7ce/volumes" Dec 02 08:59:01 crc kubenswrapper[4895]: I1202 08:59:01.246978 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 02 08:59:01 crc kubenswrapper[4895]: I1202 08:59:01.372232 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b1a6c2e-ced5-4dc3-ae1a-ed6fdd7cd12b-combined-ca-bundle\") pod \"9b1a6c2e-ced5-4dc3-ae1a-ed6fdd7cd12b\" (UID: \"9b1a6c2e-ced5-4dc3-ae1a-ed6fdd7cd12b\") " Dec 02 08:59:01 crc kubenswrapper[4895]: I1202 08:59:01.372700 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7jt5p\" (UniqueName: \"kubernetes.io/projected/9b1a6c2e-ced5-4dc3-ae1a-ed6fdd7cd12b-kube-api-access-7jt5p\") pod \"9b1a6c2e-ced5-4dc3-ae1a-ed6fdd7cd12b\" (UID: \"9b1a6c2e-ced5-4dc3-ae1a-ed6fdd7cd12b\") " Dec 02 08:59:01 crc kubenswrapper[4895]: I1202 08:59:01.372770 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b1a6c2e-ced5-4dc3-ae1a-ed6fdd7cd12b-config-data\") pod \"9b1a6c2e-ced5-4dc3-ae1a-ed6fdd7cd12b\" (UID: \"9b1a6c2e-ced5-4dc3-ae1a-ed6fdd7cd12b\") " Dec 02 08:59:01 crc kubenswrapper[4895]: I1202 08:59:01.377730 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b1a6c2e-ced5-4dc3-ae1a-ed6fdd7cd12b-kube-api-access-7jt5p" (OuterVolumeSpecName: "kube-api-access-7jt5p") pod "9b1a6c2e-ced5-4dc3-ae1a-ed6fdd7cd12b" (UID: "9b1a6c2e-ced5-4dc3-ae1a-ed6fdd7cd12b"). InnerVolumeSpecName "kube-api-access-7jt5p". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:59:01 crc kubenswrapper[4895]: I1202 08:59:01.399427 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b1a6c2e-ced5-4dc3-ae1a-ed6fdd7cd12b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9b1a6c2e-ced5-4dc3-ae1a-ed6fdd7cd12b" (UID: "9b1a6c2e-ced5-4dc3-ae1a-ed6fdd7cd12b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:59:01 crc kubenswrapper[4895]: I1202 08:59:01.402214 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b1a6c2e-ced5-4dc3-ae1a-ed6fdd7cd12b-config-data" (OuterVolumeSpecName: "config-data") pod "9b1a6c2e-ced5-4dc3-ae1a-ed6fdd7cd12b" (UID: "9b1a6c2e-ced5-4dc3-ae1a-ed6fdd7cd12b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:59:01 crc kubenswrapper[4895]: I1202 08:59:01.475242 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b1a6c2e-ced5-4dc3-ae1a-ed6fdd7cd12b-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 08:59:01 crc kubenswrapper[4895]: I1202 08:59:01.475286 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b1a6c2e-ced5-4dc3-ae1a-ed6fdd7cd12b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 08:59:01 crc kubenswrapper[4895]: I1202 08:59:01.475304 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7jt5p\" (UniqueName: \"kubernetes.io/projected/9b1a6c2e-ced5-4dc3-ae1a-ed6fdd7cd12b-kube-api-access-7jt5p\") on node \"crc\" DevicePath \"\"" Dec 02 08:59:01 crc kubenswrapper[4895]: I1202 08:59:01.570535 4895 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-api-0" podUID="2c12acf7-543e-4b73-b873-e7ac86ad3471" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.70:8774/\": read tcp 10.217.0.2:34606->10.217.1.70:8774: read: connection reset by peer" Dec 02 08:59:01 crc kubenswrapper[4895]: I1202 08:59:01.570655 4895 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-api-0" podUID="2c12acf7-543e-4b73-b873-e7ac86ad3471" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.70:8774/\": read tcp 10.217.0.2:34614->10.217.1.70:8774: read: 
connection reset by peer" Dec 02 08:59:01 crc kubenswrapper[4895]: I1202 08:59:01.623623 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 02 08:59:01 crc kubenswrapper[4895]: I1202 08:59:01.623860 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="cc4ee3e5-734a-43c6-86b5-8779253d5857" containerName="nova-cell1-conductor-conductor" containerID="cri-o://2186033d0fbed000930850f33b94c4907eba9bc7a265da6f3fe910adc160235b" gracePeriod=30 Dec 02 08:59:01 crc kubenswrapper[4895]: I1202 08:59:01.795222 4895 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="6ba0fc5d-edf7-4eac-b2a3-0a2fa14ea6db" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.69:8775/\": read tcp 10.217.0.2:38562->10.217.1.69:8775: read: connection reset by peer" Dec 02 08:59:01 crc kubenswrapper[4895]: I1202 08:59:01.795543 4895 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="6ba0fc5d-edf7-4eac-b2a3-0a2fa14ea6db" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.69:8775/\": read tcp 10.217.0.2:38554->10.217.1.69:8775: read: connection reset by peer" Dec 02 08:59:01 crc kubenswrapper[4895]: I1202 08:59:01.922539 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 02 08:59:02 crc kubenswrapper[4895]: I1202 08:59:02.099842 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"9b1a6c2e-ced5-4dc3-ae1a-ed6fdd7cd12b","Type":"ContainerDied","Data":"390644f43272f26b31e3fe7c106c02342de67361ff24afb4bb8144ba9a3d471e"} Dec 02 08:59:02 crc kubenswrapper[4895]: I1202 08:59:02.100248 4895 scope.go:117] "RemoveContainer" containerID="a4b6862707d31377831d1999c1f773f9f1687d41cbe8a19027aa93249bfde329" Dec 02 08:59:02 crc kubenswrapper[4895]: I1202 08:59:02.099854 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 02 08:59:02 crc kubenswrapper[4895]: I1202 08:59:02.101712 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c12acf7-543e-4b73-b873-e7ac86ad3471-config-data\") pod \"2c12acf7-543e-4b73-b873-e7ac86ad3471\" (UID: \"2c12acf7-543e-4b73-b873-e7ac86ad3471\") " Dec 02 08:59:02 crc kubenswrapper[4895]: I1202 08:59:02.101777 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2c12acf7-543e-4b73-b873-e7ac86ad3471-logs\") pod \"2c12acf7-543e-4b73-b873-e7ac86ad3471\" (UID: \"2c12acf7-543e-4b73-b873-e7ac86ad3471\") " Dec 02 08:59:02 crc kubenswrapper[4895]: I1202 08:59:02.101919 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c12acf7-543e-4b73-b873-e7ac86ad3471-combined-ca-bundle\") pod \"2c12acf7-543e-4b73-b873-e7ac86ad3471\" (UID: \"2c12acf7-543e-4b73-b873-e7ac86ad3471\") " Dec 02 08:59:02 crc kubenswrapper[4895]: I1202 08:59:02.101972 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fmrh9\" (UniqueName: 
\"kubernetes.io/projected/2c12acf7-543e-4b73-b873-e7ac86ad3471-kube-api-access-fmrh9\") pod \"2c12acf7-543e-4b73-b873-e7ac86ad3471\" (UID: \"2c12acf7-543e-4b73-b873-e7ac86ad3471\") " Dec 02 08:59:02 crc kubenswrapper[4895]: I1202 08:59:02.103873 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c12acf7-543e-4b73-b873-e7ac86ad3471-logs" (OuterVolumeSpecName: "logs") pod "2c12acf7-543e-4b73-b873-e7ac86ad3471" (UID: "2c12acf7-543e-4b73-b873-e7ac86ad3471"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:59:02 crc kubenswrapper[4895]: I1202 08:59:02.116964 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c12acf7-543e-4b73-b873-e7ac86ad3471-kube-api-access-fmrh9" (OuterVolumeSpecName: "kube-api-access-fmrh9") pod "2c12acf7-543e-4b73-b873-e7ac86ad3471" (UID: "2c12acf7-543e-4b73-b873-e7ac86ad3471"). InnerVolumeSpecName "kube-api-access-fmrh9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:59:02 crc kubenswrapper[4895]: I1202 08:59:02.137055 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"916b9d2f-42d2-4468-98b5-de64da9af5fc","Type":"ContainerStarted","Data":"ec58e974d5433fc0402ef1c65f4088b34fbabeccb7d36962d94c25181ac80e6b"} Dec 02 08:59:02 crc kubenswrapper[4895]: I1202 08:59:02.147082 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c12acf7-543e-4b73-b873-e7ac86ad3471-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2c12acf7-543e-4b73-b873-e7ac86ad3471" (UID: "2c12acf7-543e-4b73-b873-e7ac86ad3471"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:59:02 crc kubenswrapper[4895]: I1202 08:59:02.153097 4895 generic.go:334] "Generic (PLEG): container finished" podID="2c12acf7-543e-4b73-b873-e7ac86ad3471" containerID="288d48d3ee50f6f76b000d7f0fdc3530e8cb8da4f539b53aca09276d2c1b02e0" exitCode=0 Dec 02 08:59:02 crc kubenswrapper[4895]: I1202 08:59:02.153180 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 02 08:59:02 crc kubenswrapper[4895]: I1202 08:59:02.153247 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2c12acf7-543e-4b73-b873-e7ac86ad3471","Type":"ContainerDied","Data":"288d48d3ee50f6f76b000d7f0fdc3530e8cb8da4f539b53aca09276d2c1b02e0"} Dec 02 08:59:02 crc kubenswrapper[4895]: I1202 08:59:02.153284 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2c12acf7-543e-4b73-b873-e7ac86ad3471","Type":"ContainerDied","Data":"3a9cfe2fab6aed8a2019f8a7f701da99e42f28262d7ad2b5aa92ba4edc61a3f1"} Dec 02 08:59:02 crc kubenswrapper[4895]: I1202 08:59:02.163513 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"807ab313-d84c-4059-aa53-4d99c8c65192","Type":"ContainerStarted","Data":"afffba8deef3d71484a2795782463d54514cd234af1063a3f87ea98981166f17"} Dec 02 08:59:02 crc kubenswrapper[4895]: I1202 08:59:02.163573 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"807ab313-d84c-4059-aa53-4d99c8c65192","Type":"ContainerStarted","Data":"be41583932ef51fc53c34a9b064fb3197e17e33c6119b7d48f9a787e58c0d4bf"} Dec 02 08:59:02 crc kubenswrapper[4895]: I1202 08:59:02.167312 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.167293553 podStartE2EDuration="2.167293553s" podCreationTimestamp="2025-12-02 08:59:00 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:59:02.158555431 +0000 UTC m=+5753.329415064" watchObservedRunningTime="2025-12-02 08:59:02.167293553 +0000 UTC m=+5753.338153176" Dec 02 08:59:02 crc kubenswrapper[4895]: I1202 08:59:02.167685 4895 generic.go:334] "Generic (PLEG): container finished" podID="6ba0fc5d-edf7-4eac-b2a3-0a2fa14ea6db" containerID="347642b4e949226d454594ff3f8477f25e57b0b2c4ed371a8a76046b238c8a5a" exitCode=0 Dec 02 08:59:02 crc kubenswrapper[4895]: I1202 08:59:02.167731 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6ba0fc5d-edf7-4eac-b2a3-0a2fa14ea6db","Type":"ContainerDied","Data":"347642b4e949226d454594ff3f8477f25e57b0b2c4ed371a8a76046b238c8a5a"} Dec 02 08:59:02 crc kubenswrapper[4895]: I1202 08:59:02.191004 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c12acf7-543e-4b73-b873-e7ac86ad3471-config-data" (OuterVolumeSpecName: "config-data") pod "2c12acf7-543e-4b73-b873-e7ac86ad3471" (UID: "2c12acf7-543e-4b73-b873-e7ac86ad3471"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:59:02 crc kubenswrapper[4895]: I1202 08:59:02.205308 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fmrh9\" (UniqueName: \"kubernetes.io/projected/2c12acf7-543e-4b73-b873-e7ac86ad3471-kube-api-access-fmrh9\") on node \"crc\" DevicePath \"\"" Dec 02 08:59:02 crc kubenswrapper[4895]: I1202 08:59:02.205343 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c12acf7-543e-4b73-b873-e7ac86ad3471-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 08:59:02 crc kubenswrapper[4895]: I1202 08:59:02.205353 4895 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2c12acf7-543e-4b73-b873-e7ac86ad3471-logs\") on node \"crc\" DevicePath \"\"" Dec 02 08:59:02 crc kubenswrapper[4895]: I1202 08:59:02.205362 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c12acf7-543e-4b73-b873-e7ac86ad3471-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 08:59:02 crc kubenswrapper[4895]: I1202 08:59:02.221989 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.221968133 podStartE2EDuration="2.221968133s" podCreationTimestamp="2025-12-02 08:59:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:59:02.214021006 +0000 UTC m=+5753.384880629" watchObservedRunningTime="2025-12-02 08:59:02.221968133 +0000 UTC m=+5753.392827756" Dec 02 08:59:02 crc kubenswrapper[4895]: I1202 08:59:02.259690 4895 scope.go:117] "RemoveContainer" containerID="288d48d3ee50f6f76b000d7f0fdc3530e8cb8da4f539b53aca09276d2c1b02e0" Dec 02 08:59:02 crc kubenswrapper[4895]: I1202 08:59:02.279934 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/nova-cell0-conductor-0"] Dec 02 08:59:02 crc kubenswrapper[4895]: I1202 08:59:02.296381 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 02 08:59:02 crc kubenswrapper[4895]: I1202 08:59:02.311888 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 02 08:59:02 crc kubenswrapper[4895]: E1202 08:59:02.312295 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c12acf7-543e-4b73-b873-e7ac86ad3471" containerName="nova-api-log" Dec 02 08:59:02 crc kubenswrapper[4895]: I1202 08:59:02.312309 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c12acf7-543e-4b73-b873-e7ac86ad3471" containerName="nova-api-log" Dec 02 08:59:02 crc kubenswrapper[4895]: E1202 08:59:02.312335 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c12acf7-543e-4b73-b873-e7ac86ad3471" containerName="nova-api-api" Dec 02 08:59:02 crc kubenswrapper[4895]: I1202 08:59:02.312344 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c12acf7-543e-4b73-b873-e7ac86ad3471" containerName="nova-api-api" Dec 02 08:59:02 crc kubenswrapper[4895]: E1202 08:59:02.312357 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b1a6c2e-ced5-4dc3-ae1a-ed6fdd7cd12b" containerName="nova-cell0-conductor-conductor" Dec 02 08:59:02 crc kubenswrapper[4895]: I1202 08:59:02.312364 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b1a6c2e-ced5-4dc3-ae1a-ed6fdd7cd12b" containerName="nova-cell0-conductor-conductor" Dec 02 08:59:02 crc kubenswrapper[4895]: I1202 08:59:02.312560 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c12acf7-543e-4b73-b873-e7ac86ad3471" containerName="nova-api-log" Dec 02 08:59:02 crc kubenswrapper[4895]: I1202 08:59:02.312580 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c12acf7-543e-4b73-b873-e7ac86ad3471" containerName="nova-api-api" Dec 02 08:59:02 crc 
kubenswrapper[4895]: I1202 08:59:02.312602 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b1a6c2e-ced5-4dc3-ae1a-ed6fdd7cd12b" containerName="nova-cell0-conductor-conductor" Dec 02 08:59:02 crc kubenswrapper[4895]: I1202 08:59:02.313405 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 02 08:59:02 crc kubenswrapper[4895]: I1202 08:59:02.314275 4895 scope.go:117] "RemoveContainer" containerID="5e2275ac273a05e40f70400e8a866ffeb0ec30b5749cbb8ed8117b0bb37e9c01" Dec 02 08:59:02 crc kubenswrapper[4895]: I1202 08:59:02.324133 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 02 08:59:02 crc kubenswrapper[4895]: I1202 08:59:02.334465 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 02 08:59:02 crc kubenswrapper[4895]: I1202 08:59:02.361976 4895 scope.go:117] "RemoveContainer" containerID="288d48d3ee50f6f76b000d7f0fdc3530e8cb8da4f539b53aca09276d2c1b02e0" Dec 02 08:59:02 crc kubenswrapper[4895]: E1202 08:59:02.374983 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"288d48d3ee50f6f76b000d7f0fdc3530e8cb8da4f539b53aca09276d2c1b02e0\": container with ID starting with 288d48d3ee50f6f76b000d7f0fdc3530e8cb8da4f539b53aca09276d2c1b02e0 not found: ID does not exist" containerID="288d48d3ee50f6f76b000d7f0fdc3530e8cb8da4f539b53aca09276d2c1b02e0" Dec 02 08:59:02 crc kubenswrapper[4895]: I1202 08:59:02.375043 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"288d48d3ee50f6f76b000d7f0fdc3530e8cb8da4f539b53aca09276d2c1b02e0"} err="failed to get container status \"288d48d3ee50f6f76b000d7f0fdc3530e8cb8da4f539b53aca09276d2c1b02e0\": rpc error: code = NotFound desc = could not find container \"288d48d3ee50f6f76b000d7f0fdc3530e8cb8da4f539b53aca09276d2c1b02e0\": 
container with ID starting with 288d48d3ee50f6f76b000d7f0fdc3530e8cb8da4f539b53aca09276d2c1b02e0 not found: ID does not exist" Dec 02 08:59:02 crc kubenswrapper[4895]: I1202 08:59:02.375091 4895 scope.go:117] "RemoveContainer" containerID="5e2275ac273a05e40f70400e8a866ffeb0ec30b5749cbb8ed8117b0bb37e9c01" Dec 02 08:59:02 crc kubenswrapper[4895]: E1202 08:59:02.375977 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e2275ac273a05e40f70400e8a866ffeb0ec30b5749cbb8ed8117b0bb37e9c01\": container with ID starting with 5e2275ac273a05e40f70400e8a866ffeb0ec30b5749cbb8ed8117b0bb37e9c01 not found: ID does not exist" containerID="5e2275ac273a05e40f70400e8a866ffeb0ec30b5749cbb8ed8117b0bb37e9c01" Dec 02 08:59:02 crc kubenswrapper[4895]: I1202 08:59:02.376013 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e2275ac273a05e40f70400e8a866ffeb0ec30b5749cbb8ed8117b0bb37e9c01"} err="failed to get container status \"5e2275ac273a05e40f70400e8a866ffeb0ec30b5749cbb8ed8117b0bb37e9c01\": rpc error: code = NotFound desc = could not find container \"5e2275ac273a05e40f70400e8a866ffeb0ec30b5749cbb8ed8117b0bb37e9c01\": container with ID starting with 5e2275ac273a05e40f70400e8a866ffeb0ec30b5749cbb8ed8117b0bb37e9c01 not found: ID does not exist" Dec 02 08:59:02 crc kubenswrapper[4895]: I1202 08:59:02.410260 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fdd34d7c-19d4-482a-aa19-6535eb26640e-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"fdd34d7c-19d4-482a-aa19-6535eb26640e\") " pod="openstack/nova-cell0-conductor-0" Dec 02 08:59:02 crc kubenswrapper[4895]: I1202 08:59:02.410433 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/fdd34d7c-19d4-482a-aa19-6535eb26640e-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"fdd34d7c-19d4-482a-aa19-6535eb26640e\") " pod="openstack/nova-cell0-conductor-0" Dec 02 08:59:02 crc kubenswrapper[4895]: I1202 08:59:02.410642 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjplv\" (UniqueName: \"kubernetes.io/projected/fdd34d7c-19d4-482a-aa19-6535eb26640e-kube-api-access-rjplv\") pod \"nova-cell0-conductor-0\" (UID: \"fdd34d7c-19d4-482a-aa19-6535eb26640e\") " pod="openstack/nova-cell0-conductor-0" Dec 02 08:59:02 crc kubenswrapper[4895]: I1202 08:59:02.463541 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 02 08:59:02 crc kubenswrapper[4895]: I1202 08:59:02.514918 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fdd34d7c-19d4-482a-aa19-6535eb26640e-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"fdd34d7c-19d4-482a-aa19-6535eb26640e\") " pod="openstack/nova-cell0-conductor-0" Dec 02 08:59:02 crc kubenswrapper[4895]: I1202 08:59:02.515001 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rjplv\" (UniqueName: \"kubernetes.io/projected/fdd34d7c-19d4-482a-aa19-6535eb26640e-kube-api-access-rjplv\") pod \"nova-cell0-conductor-0\" (UID: \"fdd34d7c-19d4-482a-aa19-6535eb26640e\") " pod="openstack/nova-cell0-conductor-0" Dec 02 08:59:02 crc kubenswrapper[4895]: I1202 08:59:02.515094 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fdd34d7c-19d4-482a-aa19-6535eb26640e-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"fdd34d7c-19d4-482a-aa19-6535eb26640e\") " pod="openstack/nova-cell0-conductor-0" Dec 02 08:59:02 crc kubenswrapper[4895]: I1202 08:59:02.532529 4895 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fdd34d7c-19d4-482a-aa19-6535eb26640e-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"fdd34d7c-19d4-482a-aa19-6535eb26640e\") " pod="openstack/nova-cell0-conductor-0" Dec 02 08:59:02 crc kubenswrapper[4895]: I1202 08:59:02.533104 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fdd34d7c-19d4-482a-aa19-6535eb26640e-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"fdd34d7c-19d4-482a-aa19-6535eb26640e\") " pod="openstack/nova-cell0-conductor-0" Dec 02 08:59:02 crc kubenswrapper[4895]: I1202 08:59:02.538372 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjplv\" (UniqueName: \"kubernetes.io/projected/fdd34d7c-19d4-482a-aa19-6535eb26640e-kube-api-access-rjplv\") pod \"nova-cell0-conductor-0\" (UID: \"fdd34d7c-19d4-482a-aa19-6535eb26640e\") " pod="openstack/nova-cell0-conductor-0" Dec 02 08:59:02 crc kubenswrapper[4895]: I1202 08:59:02.572629 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 02 08:59:02 crc kubenswrapper[4895]: I1202 08:59:02.589940 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 02 08:59:02 crc kubenswrapper[4895]: I1202 08:59:02.598935 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 02 08:59:02 crc kubenswrapper[4895]: E1202 08:59:02.599461 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ba0fc5d-edf7-4eac-b2a3-0a2fa14ea6db" containerName="nova-metadata-metadata" Dec 02 08:59:02 crc kubenswrapper[4895]: I1202 08:59:02.599480 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ba0fc5d-edf7-4eac-b2a3-0a2fa14ea6db" containerName="nova-metadata-metadata" Dec 02 08:59:02 crc kubenswrapper[4895]: E1202 08:59:02.599508 4895 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="6ba0fc5d-edf7-4eac-b2a3-0a2fa14ea6db" containerName="nova-metadata-log" Dec 02 08:59:02 crc kubenswrapper[4895]: I1202 08:59:02.599516 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ba0fc5d-edf7-4eac-b2a3-0a2fa14ea6db" containerName="nova-metadata-log" Dec 02 08:59:02 crc kubenswrapper[4895]: I1202 08:59:02.599811 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ba0fc5d-edf7-4eac-b2a3-0a2fa14ea6db" containerName="nova-metadata-metadata" Dec 02 08:59:02 crc kubenswrapper[4895]: I1202 08:59:02.599838 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ba0fc5d-edf7-4eac-b2a3-0a2fa14ea6db" containerName="nova-metadata-log" Dec 02 08:59:02 crc kubenswrapper[4895]: I1202 08:59:02.600953 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 02 08:59:02 crc kubenswrapper[4895]: I1202 08:59:02.604237 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 02 08:59:02 crc kubenswrapper[4895]: I1202 08:59:02.606201 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 02 08:59:02 crc kubenswrapper[4895]: I1202 08:59:02.619359 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ba0fc5d-edf7-4eac-b2a3-0a2fa14ea6db-combined-ca-bundle\") pod \"6ba0fc5d-edf7-4eac-b2a3-0a2fa14ea6db\" (UID: \"6ba0fc5d-edf7-4eac-b2a3-0a2fa14ea6db\") " Dec 02 08:59:02 crc kubenswrapper[4895]: I1202 08:59:02.619496 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ba0fc5d-edf7-4eac-b2a3-0a2fa14ea6db-logs\") pod \"6ba0fc5d-edf7-4eac-b2a3-0a2fa14ea6db\" (UID: \"6ba0fc5d-edf7-4eac-b2a3-0a2fa14ea6db\") " Dec 02 08:59:02 crc kubenswrapper[4895]: I1202 08:59:02.619598 4895 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-zz7jn\" (UniqueName: \"kubernetes.io/projected/6ba0fc5d-edf7-4eac-b2a3-0a2fa14ea6db-kube-api-access-zz7jn\") pod \"6ba0fc5d-edf7-4eac-b2a3-0a2fa14ea6db\" (UID: \"6ba0fc5d-edf7-4eac-b2a3-0a2fa14ea6db\") " Dec 02 08:59:02 crc kubenswrapper[4895]: I1202 08:59:02.619662 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ba0fc5d-edf7-4eac-b2a3-0a2fa14ea6db-config-data\") pod \"6ba0fc5d-edf7-4eac-b2a3-0a2fa14ea6db\" (UID: \"6ba0fc5d-edf7-4eac-b2a3-0a2fa14ea6db\") " Dec 02 08:59:02 crc kubenswrapper[4895]: I1202 08:59:02.622668 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ba0fc5d-edf7-4eac-b2a3-0a2fa14ea6db-logs" (OuterVolumeSpecName: "logs") pod "6ba0fc5d-edf7-4eac-b2a3-0a2fa14ea6db" (UID: "6ba0fc5d-edf7-4eac-b2a3-0a2fa14ea6db"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:59:02 crc kubenswrapper[4895]: I1202 08:59:02.628389 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ba0fc5d-edf7-4eac-b2a3-0a2fa14ea6db-kube-api-access-zz7jn" (OuterVolumeSpecName: "kube-api-access-zz7jn") pod "6ba0fc5d-edf7-4eac-b2a3-0a2fa14ea6db" (UID: "6ba0fc5d-edf7-4eac-b2a3-0a2fa14ea6db"). InnerVolumeSpecName "kube-api-access-zz7jn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:59:02 crc kubenswrapper[4895]: I1202 08:59:02.643404 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 02 08:59:02 crc kubenswrapper[4895]: I1202 08:59:02.653407 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ba0fc5d-edf7-4eac-b2a3-0a2fa14ea6db-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6ba0fc5d-edf7-4eac-b2a3-0a2fa14ea6db" (UID: "6ba0fc5d-edf7-4eac-b2a3-0a2fa14ea6db"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:59:02 crc kubenswrapper[4895]: I1202 08:59:02.667246 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ba0fc5d-edf7-4eac-b2a3-0a2fa14ea6db-config-data" (OuterVolumeSpecName: "config-data") pod "6ba0fc5d-edf7-4eac-b2a3-0a2fa14ea6db" (UID: "6ba0fc5d-edf7-4eac-b2a3-0a2fa14ea6db"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:59:02 crc kubenswrapper[4895]: I1202 08:59:02.725945 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa8c76cd-9852-45cc-82fc-c9ee472f94c2-config-data\") pod \"nova-api-0\" (UID: \"aa8c76cd-9852-45cc-82fc-c9ee472f94c2\") " pod="openstack/nova-api-0" Dec 02 08:59:02 crc kubenswrapper[4895]: I1202 08:59:02.725988 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa8c76cd-9852-45cc-82fc-c9ee472f94c2-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"aa8c76cd-9852-45cc-82fc-c9ee472f94c2\") " pod="openstack/nova-api-0" Dec 02 08:59:02 crc kubenswrapper[4895]: I1202 08:59:02.726011 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa8c76cd-9852-45cc-82fc-c9ee472f94c2-logs\") pod \"nova-api-0\" (UID: 
\"aa8c76cd-9852-45cc-82fc-c9ee472f94c2\") " pod="openstack/nova-api-0" Dec 02 08:59:02 crc kubenswrapper[4895]: I1202 08:59:02.726207 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2r58l\" (UniqueName: \"kubernetes.io/projected/aa8c76cd-9852-45cc-82fc-c9ee472f94c2-kube-api-access-2r58l\") pod \"nova-api-0\" (UID: \"aa8c76cd-9852-45cc-82fc-c9ee472f94c2\") " pod="openstack/nova-api-0" Dec 02 08:59:02 crc kubenswrapper[4895]: I1202 08:59:02.726279 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ba0fc5d-edf7-4eac-b2a3-0a2fa14ea6db-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 08:59:02 crc kubenswrapper[4895]: I1202 08:59:02.726292 4895 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ba0fc5d-edf7-4eac-b2a3-0a2fa14ea6db-logs\") on node \"crc\" DevicePath \"\"" Dec 02 08:59:02 crc kubenswrapper[4895]: I1202 08:59:02.726302 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zz7jn\" (UniqueName: \"kubernetes.io/projected/6ba0fc5d-edf7-4eac-b2a3-0a2fa14ea6db-kube-api-access-zz7jn\") on node \"crc\" DevicePath \"\"" Dec 02 08:59:02 crc kubenswrapper[4895]: I1202 08:59:02.726311 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ba0fc5d-edf7-4eac-b2a3-0a2fa14ea6db-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 08:59:02 crc kubenswrapper[4895]: I1202 08:59:02.829456 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2r58l\" (UniqueName: \"kubernetes.io/projected/aa8c76cd-9852-45cc-82fc-c9ee472f94c2-kube-api-access-2r58l\") pod \"nova-api-0\" (UID: \"aa8c76cd-9852-45cc-82fc-c9ee472f94c2\") " pod="openstack/nova-api-0" Dec 02 08:59:02 crc kubenswrapper[4895]: I1202 08:59:02.830274 4895 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa8c76cd-9852-45cc-82fc-c9ee472f94c2-config-data\") pod \"nova-api-0\" (UID: \"aa8c76cd-9852-45cc-82fc-c9ee472f94c2\") " pod="openstack/nova-api-0" Dec 02 08:59:02 crc kubenswrapper[4895]: I1202 08:59:02.830300 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa8c76cd-9852-45cc-82fc-c9ee472f94c2-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"aa8c76cd-9852-45cc-82fc-c9ee472f94c2\") " pod="openstack/nova-api-0" Dec 02 08:59:02 crc kubenswrapper[4895]: I1202 08:59:02.830343 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa8c76cd-9852-45cc-82fc-c9ee472f94c2-logs\") pod \"nova-api-0\" (UID: \"aa8c76cd-9852-45cc-82fc-c9ee472f94c2\") " pod="openstack/nova-api-0" Dec 02 08:59:02 crc kubenswrapper[4895]: I1202 08:59:02.832027 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa8c76cd-9852-45cc-82fc-c9ee472f94c2-logs\") pod \"nova-api-0\" (UID: \"aa8c76cd-9852-45cc-82fc-c9ee472f94c2\") " pod="openstack/nova-api-0" Dec 02 08:59:02 crc kubenswrapper[4895]: I1202 08:59:02.837332 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa8c76cd-9852-45cc-82fc-c9ee472f94c2-config-data\") pod \"nova-api-0\" (UID: \"aa8c76cd-9852-45cc-82fc-c9ee472f94c2\") " pod="openstack/nova-api-0" Dec 02 08:59:02 crc kubenswrapper[4895]: I1202 08:59:02.839353 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa8c76cd-9852-45cc-82fc-c9ee472f94c2-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"aa8c76cd-9852-45cc-82fc-c9ee472f94c2\") " pod="openstack/nova-api-0" Dec 02 08:59:02 crc kubenswrapper[4895]: I1202 
08:59:02.850683 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2r58l\" (UniqueName: \"kubernetes.io/projected/aa8c76cd-9852-45cc-82fc-c9ee472f94c2-kube-api-access-2r58l\") pod \"nova-api-0\" (UID: \"aa8c76cd-9852-45cc-82fc-c9ee472f94c2\") " pod="openstack/nova-api-0" Dec 02 08:59:02 crc kubenswrapper[4895]: I1202 08:59:02.979065 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 02 08:59:03 crc kubenswrapper[4895]: I1202 08:59:03.161397 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c12acf7-543e-4b73-b873-e7ac86ad3471" path="/var/lib/kubelet/pods/2c12acf7-543e-4b73-b873-e7ac86ad3471/volumes" Dec 02 08:59:03 crc kubenswrapper[4895]: I1202 08:59:03.163855 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b1a6c2e-ced5-4dc3-ae1a-ed6fdd7cd12b" path="/var/lib/kubelet/pods/9b1a6c2e-ced5-4dc3-ae1a-ed6fdd7cd12b/volumes" Dec 02 08:59:03 crc kubenswrapper[4895]: I1202 08:59:03.168328 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 02 08:59:03 crc kubenswrapper[4895]: W1202 08:59:03.170923 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfdd34d7c_19d4_482a_aa19_6535eb26640e.slice/crio-681df1dacdccaf86f1c7fca2c564c4f347cf8eff132fa5628617ca71fa0a8379 WatchSource:0}: Error finding container 681df1dacdccaf86f1c7fca2c564c4f347cf8eff132fa5628617ca71fa0a8379: Status 404 returned error can't find the container with id 681df1dacdccaf86f1c7fca2c564c4f347cf8eff132fa5628617ca71fa0a8379 Dec 02 08:59:03 crc kubenswrapper[4895]: I1202 08:59:03.199821 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6ba0fc5d-edf7-4eac-b2a3-0a2fa14ea6db","Type":"ContainerDied","Data":"3c6e1c10a9007d54d03b7f7a089173d86a6c667249ba495394a6209f673e5e0e"} Dec 02 08:59:03 crc 
kubenswrapper[4895]: I1202 08:59:03.199874 4895 scope.go:117] "RemoveContainer" containerID="347642b4e949226d454594ff3f8477f25e57b0b2c4ed371a8a76046b238c8a5a" Dec 02 08:59:03 crc kubenswrapper[4895]: I1202 08:59:03.200016 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 02 08:59:03 crc kubenswrapper[4895]: I1202 08:59:03.311439 4895 scope.go:117] "RemoveContainer" containerID="d8de976055e2285d9f987e2065884fe0f2a2279b0e9d2ed374f542fbcfb8e423" Dec 02 08:59:03 crc kubenswrapper[4895]: I1202 08:59:03.357246 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 08:59:03 crc kubenswrapper[4895]: I1202 08:59:03.383734 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 08:59:03 crc kubenswrapper[4895]: I1202 08:59:03.402537 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 02 08:59:03 crc kubenswrapper[4895]: I1202 08:59:03.404341 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 02 08:59:03 crc kubenswrapper[4895]: I1202 08:59:03.410911 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 02 08:59:03 crc kubenswrapper[4895]: I1202 08:59:03.416453 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 08:59:03 crc kubenswrapper[4895]: W1202 08:59:03.487963 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaa8c76cd_9852_45cc_82fc_c9ee472f94c2.slice/crio-da4085bc700e7413f53199ac4762b1ca913c75acf7a03e66d3a7767954baaaf4 WatchSource:0}: Error finding container da4085bc700e7413f53199ac4762b1ca913c75acf7a03e66d3a7767954baaaf4: Status 404 returned error can't find the container with id da4085bc700e7413f53199ac4762b1ca913c75acf7a03e66d3a7767954baaaf4 Dec 02 08:59:03 crc kubenswrapper[4895]: I1202 08:59:03.495939 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 02 08:59:03 crc kubenswrapper[4895]: I1202 08:59:03.545520 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f364e779-d2db-4f23-bc99-1d0b91dca426-config-data\") pod \"nova-metadata-0\" (UID: \"f364e779-d2db-4f23-bc99-1d0b91dca426\") " pod="openstack/nova-metadata-0" Dec 02 08:59:03 crc kubenswrapper[4895]: I1202 08:59:03.545607 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9kx86\" (UniqueName: \"kubernetes.io/projected/f364e779-d2db-4f23-bc99-1d0b91dca426-kube-api-access-9kx86\") pod \"nova-metadata-0\" (UID: \"f364e779-d2db-4f23-bc99-1d0b91dca426\") " pod="openstack/nova-metadata-0" Dec 02 08:59:03 crc kubenswrapper[4895]: I1202 08:59:03.545677 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f364e779-d2db-4f23-bc99-1d0b91dca426-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f364e779-d2db-4f23-bc99-1d0b91dca426\") " pod="openstack/nova-metadata-0" Dec 02 08:59:03 crc kubenswrapper[4895]: I1202 08:59:03.545732 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f364e779-d2db-4f23-bc99-1d0b91dca426-logs\") pod \"nova-metadata-0\" (UID: \"f364e779-d2db-4f23-bc99-1d0b91dca426\") " pod="openstack/nova-metadata-0" Dec 02 08:59:03 crc kubenswrapper[4895]: I1202 08:59:03.647804 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f364e779-d2db-4f23-bc99-1d0b91dca426-config-data\") pod \"nova-metadata-0\" (UID: \"f364e779-d2db-4f23-bc99-1d0b91dca426\") " pod="openstack/nova-metadata-0" Dec 02 08:59:03 crc kubenswrapper[4895]: I1202 08:59:03.647887 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9kx86\" (UniqueName: \"kubernetes.io/projected/f364e779-d2db-4f23-bc99-1d0b91dca426-kube-api-access-9kx86\") pod \"nova-metadata-0\" (UID: \"f364e779-d2db-4f23-bc99-1d0b91dca426\") " pod="openstack/nova-metadata-0" Dec 02 08:59:03 crc kubenswrapper[4895]: I1202 08:59:03.647922 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f364e779-d2db-4f23-bc99-1d0b91dca426-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f364e779-d2db-4f23-bc99-1d0b91dca426\") " pod="openstack/nova-metadata-0" Dec 02 08:59:03 crc kubenswrapper[4895]: I1202 08:59:03.647990 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f364e779-d2db-4f23-bc99-1d0b91dca426-logs\") pod \"nova-metadata-0\" (UID: \"f364e779-d2db-4f23-bc99-1d0b91dca426\") " 
pod="openstack/nova-metadata-0" Dec 02 08:59:03 crc kubenswrapper[4895]: I1202 08:59:03.648941 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f364e779-d2db-4f23-bc99-1d0b91dca426-logs\") pod \"nova-metadata-0\" (UID: \"f364e779-d2db-4f23-bc99-1d0b91dca426\") " pod="openstack/nova-metadata-0" Dec 02 08:59:03 crc kubenswrapper[4895]: I1202 08:59:03.655048 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f364e779-d2db-4f23-bc99-1d0b91dca426-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f364e779-d2db-4f23-bc99-1d0b91dca426\") " pod="openstack/nova-metadata-0" Dec 02 08:59:03 crc kubenswrapper[4895]: I1202 08:59:03.658946 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f364e779-d2db-4f23-bc99-1d0b91dca426-config-data\") pod \"nova-metadata-0\" (UID: \"f364e779-d2db-4f23-bc99-1d0b91dca426\") " pod="openstack/nova-metadata-0" Dec 02 08:59:03 crc kubenswrapper[4895]: I1202 08:59:03.669147 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9kx86\" (UniqueName: \"kubernetes.io/projected/f364e779-d2db-4f23-bc99-1d0b91dca426-kube-api-access-9kx86\") pod \"nova-metadata-0\" (UID: \"f364e779-d2db-4f23-bc99-1d0b91dca426\") " pod="openstack/nova-metadata-0" Dec 02 08:59:03 crc kubenswrapper[4895]: I1202 08:59:03.725842 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 02 08:59:04 crc kubenswrapper[4895]: I1202 08:59:04.220486 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"fdd34d7c-19d4-482a-aa19-6535eb26640e","Type":"ContainerStarted","Data":"f7efcba9302ac8e05dd7acec27d676a85be1fff02ac30f14281f93658c884165"} Dec 02 08:59:04 crc kubenswrapper[4895]: I1202 08:59:04.221146 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Dec 02 08:59:04 crc kubenswrapper[4895]: I1202 08:59:04.221163 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"fdd34d7c-19d4-482a-aa19-6535eb26640e","Type":"ContainerStarted","Data":"681df1dacdccaf86f1c7fca2c564c4f347cf8eff132fa5628617ca71fa0a8379"} Dec 02 08:59:04 crc kubenswrapper[4895]: I1202 08:59:04.224096 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"aa8c76cd-9852-45cc-82fc-c9ee472f94c2","Type":"ContainerStarted","Data":"c49402e43fd9cbd8200434847bc5b4025bc3e7dba9f2532b8298027a4fd520c0"} Dec 02 08:59:04 crc kubenswrapper[4895]: I1202 08:59:04.224139 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"aa8c76cd-9852-45cc-82fc-c9ee472f94c2","Type":"ContainerStarted","Data":"0e4f72afbb3acdc4ac9d10239e9306ac7cf741136eeba97e54ae544e864df0ee"} Dec 02 08:59:04 crc kubenswrapper[4895]: I1202 08:59:04.224166 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"aa8c76cd-9852-45cc-82fc-c9ee472f94c2","Type":"ContainerStarted","Data":"da4085bc700e7413f53199ac4762b1ca913c75acf7a03e66d3a7767954baaaf4"} Dec 02 08:59:04 crc kubenswrapper[4895]: I1202 08:59:04.243538 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.243512344 podStartE2EDuration="2.243512344s" 
podCreationTimestamp="2025-12-02 08:59:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:59:04.235614689 +0000 UTC m=+5755.406474312" watchObservedRunningTime="2025-12-02 08:59:04.243512344 +0000 UTC m=+5755.414371967" Dec 02 08:59:04 crc kubenswrapper[4895]: I1202 08:59:04.276638 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.276610324 podStartE2EDuration="2.276610324s" podCreationTimestamp="2025-12-02 08:59:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:59:04.263774985 +0000 UTC m=+5755.434634618" watchObservedRunningTime="2025-12-02 08:59:04.276610324 +0000 UTC m=+5755.447469937" Dec 02 08:59:04 crc kubenswrapper[4895]: W1202 08:59:04.287025 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf364e779_d2db_4f23_bc99_1d0b91dca426.slice/crio-e5f0e2d8cd570ae1c7cd4b702575e3dd18caf316712e675b1804031ff1efc305 WatchSource:0}: Error finding container e5f0e2d8cd570ae1c7cd4b702575e3dd18caf316712e675b1804031ff1efc305: Status 404 returned error can't find the container with id e5f0e2d8cd570ae1c7cd4b702575e3dd18caf316712e675b1804031ff1efc305 Dec 02 08:59:04 crc kubenswrapper[4895]: I1202 08:59:04.287714 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 08:59:05 crc kubenswrapper[4895]: I1202 08:59:05.153600 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ba0fc5d-edf7-4eac-b2a3-0a2fa14ea6db" path="/var/lib/kubelet/pods/6ba0fc5d-edf7-4eac-b2a3-0a2fa14ea6db/volumes" Dec 02 08:59:05 crc kubenswrapper[4895]: I1202 08:59:05.236457 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"f364e779-d2db-4f23-bc99-1d0b91dca426","Type":"ContainerStarted","Data":"a51dcaf54c4ee699ab41c3d2b5d38e088599d8df028d506caafbb1c565212378"} Dec 02 08:59:05 crc kubenswrapper[4895]: I1202 08:59:05.236507 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f364e779-d2db-4f23-bc99-1d0b91dca426","Type":"ContainerStarted","Data":"9072d4bc261c498a84069b6e9a1da6de906bf13dd131c0fab31a4dc8c0bf1a5a"} Dec 02 08:59:05 crc kubenswrapper[4895]: I1202 08:59:05.236525 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f364e779-d2db-4f23-bc99-1d0b91dca426","Type":"ContainerStarted","Data":"e5f0e2d8cd570ae1c7cd4b702575e3dd18caf316712e675b1804031ff1efc305"} Dec 02 08:59:05 crc kubenswrapper[4895]: I1202 08:59:05.271781 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.271749053 podStartE2EDuration="2.271749053s" podCreationTimestamp="2025-12-02 08:59:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:59:05.254368302 +0000 UTC m=+5756.425227945" watchObservedRunningTime="2025-12-02 08:59:05.271749053 +0000 UTC m=+5756.442608666" Dec 02 08:59:05 crc kubenswrapper[4895]: I1202 08:59:05.513268 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 02 08:59:05 crc kubenswrapper[4895]: I1202 08:59:05.530976 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 02 08:59:06 crc kubenswrapper[4895]: I1202 08:59:06.182781 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 02 08:59:06 crc kubenswrapper[4895]: I1202 08:59:06.245755 4895 generic.go:334] "Generic (PLEG): container finished" podID="cc4ee3e5-734a-43c6-86b5-8779253d5857" containerID="2186033d0fbed000930850f33b94c4907eba9bc7a265da6f3fe910adc160235b" exitCode=0 Dec 02 08:59:06 crc kubenswrapper[4895]: I1202 08:59:06.247232 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 02 08:59:06 crc kubenswrapper[4895]: I1202 08:59:06.247527 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"cc4ee3e5-734a-43c6-86b5-8779253d5857","Type":"ContainerDied","Data":"2186033d0fbed000930850f33b94c4907eba9bc7a265da6f3fe910adc160235b"} Dec 02 08:59:06 crc kubenswrapper[4895]: I1202 08:59:06.247684 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"cc4ee3e5-734a-43c6-86b5-8779253d5857","Type":"ContainerDied","Data":"9c072549133c766a309b1951b8178f18de784e8bfe0656a354f8676ae4a4f122"} Dec 02 08:59:06 crc kubenswrapper[4895]: I1202 08:59:06.247802 4895 scope.go:117] "RemoveContainer" containerID="2186033d0fbed000930850f33b94c4907eba9bc7a265da6f3fe910adc160235b" Dec 02 08:59:06 crc kubenswrapper[4895]: I1202 08:59:06.278555 4895 scope.go:117] "RemoveContainer" containerID="2186033d0fbed000930850f33b94c4907eba9bc7a265da6f3fe910adc160235b" Dec 02 08:59:06 crc kubenswrapper[4895]: E1202 08:59:06.279130 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2186033d0fbed000930850f33b94c4907eba9bc7a265da6f3fe910adc160235b\": container with ID starting with 2186033d0fbed000930850f33b94c4907eba9bc7a265da6f3fe910adc160235b not found: ID does not exist" containerID="2186033d0fbed000930850f33b94c4907eba9bc7a265da6f3fe910adc160235b" Dec 02 08:59:06 crc kubenswrapper[4895]: I1202 
08:59:06.279191 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2186033d0fbed000930850f33b94c4907eba9bc7a265da6f3fe910adc160235b"} err="failed to get container status \"2186033d0fbed000930850f33b94c4907eba9bc7a265da6f3fe910adc160235b\": rpc error: code = NotFound desc = could not find container \"2186033d0fbed000930850f33b94c4907eba9bc7a265da6f3fe910adc160235b\": container with ID starting with 2186033d0fbed000930850f33b94c4907eba9bc7a265da6f3fe910adc160235b not found: ID does not exist" Dec 02 08:59:06 crc kubenswrapper[4895]: I1202 08:59:06.299181 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc4ee3e5-734a-43c6-86b5-8779253d5857-combined-ca-bundle\") pod \"cc4ee3e5-734a-43c6-86b5-8779253d5857\" (UID: \"cc4ee3e5-734a-43c6-86b5-8779253d5857\") " Dec 02 08:59:06 crc kubenswrapper[4895]: I1202 08:59:06.299431 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc4ee3e5-734a-43c6-86b5-8779253d5857-config-data\") pod \"cc4ee3e5-734a-43c6-86b5-8779253d5857\" (UID: \"cc4ee3e5-734a-43c6-86b5-8779253d5857\") " Dec 02 08:59:06 crc kubenswrapper[4895]: I1202 08:59:06.299477 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-28psx\" (UniqueName: \"kubernetes.io/projected/cc4ee3e5-734a-43c6-86b5-8779253d5857-kube-api-access-28psx\") pod \"cc4ee3e5-734a-43c6-86b5-8779253d5857\" (UID: \"cc4ee3e5-734a-43c6-86b5-8779253d5857\") " Dec 02 08:59:06 crc kubenswrapper[4895]: I1202 08:59:06.305612 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc4ee3e5-734a-43c6-86b5-8779253d5857-kube-api-access-28psx" (OuterVolumeSpecName: "kube-api-access-28psx") pod "cc4ee3e5-734a-43c6-86b5-8779253d5857" (UID: "cc4ee3e5-734a-43c6-86b5-8779253d5857"). 
InnerVolumeSpecName "kube-api-access-28psx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:59:06 crc kubenswrapper[4895]: I1202 08:59:06.330445 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc4ee3e5-734a-43c6-86b5-8779253d5857-config-data" (OuterVolumeSpecName: "config-data") pod "cc4ee3e5-734a-43c6-86b5-8779253d5857" (UID: "cc4ee3e5-734a-43c6-86b5-8779253d5857"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:59:06 crc kubenswrapper[4895]: I1202 08:59:06.330492 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc4ee3e5-734a-43c6-86b5-8779253d5857-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cc4ee3e5-734a-43c6-86b5-8779253d5857" (UID: "cc4ee3e5-734a-43c6-86b5-8779253d5857"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:59:06 crc kubenswrapper[4895]: I1202 08:59:06.401961 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc4ee3e5-734a-43c6-86b5-8779253d5857-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 08:59:06 crc kubenswrapper[4895]: I1202 08:59:06.401998 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc4ee3e5-734a-43c6-86b5-8779253d5857-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 08:59:06 crc kubenswrapper[4895]: I1202 08:59:06.402007 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-28psx\" (UniqueName: \"kubernetes.io/projected/cc4ee3e5-734a-43c6-86b5-8779253d5857-kube-api-access-28psx\") on node \"crc\" DevicePath \"\"" Dec 02 08:59:06 crc kubenswrapper[4895]: I1202 08:59:06.584200 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 02 08:59:06 crc kubenswrapper[4895]: I1202 
08:59:06.595408 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 02 08:59:06 crc kubenswrapper[4895]: I1202 08:59:06.609253 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 02 08:59:06 crc kubenswrapper[4895]: E1202 08:59:06.609788 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc4ee3e5-734a-43c6-86b5-8779253d5857" containerName="nova-cell1-conductor-conductor" Dec 02 08:59:06 crc kubenswrapper[4895]: I1202 08:59:06.609807 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc4ee3e5-734a-43c6-86b5-8779253d5857" containerName="nova-cell1-conductor-conductor" Dec 02 08:59:06 crc kubenswrapper[4895]: I1202 08:59:06.610058 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc4ee3e5-734a-43c6-86b5-8779253d5857" containerName="nova-cell1-conductor-conductor" Dec 02 08:59:06 crc kubenswrapper[4895]: I1202 08:59:06.612246 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 02 08:59:06 crc kubenswrapper[4895]: I1202 08:59:06.617918 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 02 08:59:06 crc kubenswrapper[4895]: I1202 08:59:06.627527 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 02 08:59:06 crc kubenswrapper[4895]: I1202 08:59:06.707966 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23d0ddc1-b566-4537-ac82-544ff5e098f3-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"23d0ddc1-b566-4537-ac82-544ff5e098f3\") " pod="openstack/nova-cell1-conductor-0" Dec 02 08:59:06 crc kubenswrapper[4895]: I1202 08:59:06.708130 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23d0ddc1-b566-4537-ac82-544ff5e098f3-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"23d0ddc1-b566-4537-ac82-544ff5e098f3\") " pod="openstack/nova-cell1-conductor-0" Dec 02 08:59:06 crc kubenswrapper[4895]: I1202 08:59:06.708174 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wsft\" (UniqueName: \"kubernetes.io/projected/23d0ddc1-b566-4537-ac82-544ff5e098f3-kube-api-access-7wsft\") pod \"nova-cell1-conductor-0\" (UID: \"23d0ddc1-b566-4537-ac82-544ff5e098f3\") " pod="openstack/nova-cell1-conductor-0" Dec 02 08:59:06 crc kubenswrapper[4895]: I1202 08:59:06.810119 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23d0ddc1-b566-4537-ac82-544ff5e098f3-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"23d0ddc1-b566-4537-ac82-544ff5e098f3\") " pod="openstack/nova-cell1-conductor-0" Dec 02 08:59:06 crc 
kubenswrapper[4895]: I1202 08:59:06.810207 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wsft\" (UniqueName: \"kubernetes.io/projected/23d0ddc1-b566-4537-ac82-544ff5e098f3-kube-api-access-7wsft\") pod \"nova-cell1-conductor-0\" (UID: \"23d0ddc1-b566-4537-ac82-544ff5e098f3\") " pod="openstack/nova-cell1-conductor-0" Dec 02 08:59:06 crc kubenswrapper[4895]: I1202 08:59:06.810282 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23d0ddc1-b566-4537-ac82-544ff5e098f3-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"23d0ddc1-b566-4537-ac82-544ff5e098f3\") " pod="openstack/nova-cell1-conductor-0" Dec 02 08:59:06 crc kubenswrapper[4895]: I1202 08:59:06.820961 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23d0ddc1-b566-4537-ac82-544ff5e098f3-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"23d0ddc1-b566-4537-ac82-544ff5e098f3\") " pod="openstack/nova-cell1-conductor-0" Dec 02 08:59:06 crc kubenswrapper[4895]: I1202 08:59:06.823823 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23d0ddc1-b566-4537-ac82-544ff5e098f3-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"23d0ddc1-b566-4537-ac82-544ff5e098f3\") " pod="openstack/nova-cell1-conductor-0" Dec 02 08:59:06 crc kubenswrapper[4895]: I1202 08:59:06.844419 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wsft\" (UniqueName: \"kubernetes.io/projected/23d0ddc1-b566-4537-ac82-544ff5e098f3-kube-api-access-7wsft\") pod \"nova-cell1-conductor-0\" (UID: \"23d0ddc1-b566-4537-ac82-544ff5e098f3\") " pod="openstack/nova-cell1-conductor-0" Dec 02 08:59:06 crc kubenswrapper[4895]: I1202 08:59:06.932789 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 02 08:59:07 crc kubenswrapper[4895]: I1202 08:59:07.154591 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc4ee3e5-734a-43c6-86b5-8779253d5857" path="/var/lib/kubelet/pods/cc4ee3e5-734a-43c6-86b5-8779253d5857/volumes" Dec 02 08:59:07 crc kubenswrapper[4895]: I1202 08:59:07.411570 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 02 08:59:08 crc kubenswrapper[4895]: I1202 08:59:08.266609 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"23d0ddc1-b566-4537-ac82-544ff5e098f3","Type":"ContainerStarted","Data":"a6f2da1b37d403b45d1abfda3e00d6af0e663675c9da018a034ae3293bda12d5"} Dec 02 08:59:08 crc kubenswrapper[4895]: I1202 08:59:08.266649 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"23d0ddc1-b566-4537-ac82-544ff5e098f3","Type":"ContainerStarted","Data":"e7c782982b25b7900359708bc99c24aa774cee3a6981286caa4574be111b85e6"} Dec 02 08:59:08 crc kubenswrapper[4895]: I1202 08:59:08.267196 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Dec 02 08:59:08 crc kubenswrapper[4895]: I1202 08:59:08.291389 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.291360764 podStartE2EDuration="2.291360764s" podCreationTimestamp="2025-12-02 08:59:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:59:08.286310747 +0000 UTC m=+5759.457170370" watchObservedRunningTime="2025-12-02 08:59:08.291360764 +0000 UTC m=+5759.462220387" Dec 02 08:59:08 crc kubenswrapper[4895]: I1202 08:59:08.726640 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 02 
08:59:08 crc kubenswrapper[4895]: I1202 08:59:08.726691 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 02 08:59:10 crc kubenswrapper[4895]: I1202 08:59:10.512570 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Dec 02 08:59:10 crc kubenswrapper[4895]: I1202 08:59:10.523346 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Dec 02 08:59:10 crc kubenswrapper[4895]: I1202 08:59:10.531289 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 02 08:59:10 crc kubenswrapper[4895]: I1202 08:59:10.563905 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 02 08:59:11 crc kubenswrapper[4895]: I1202 08:59:11.141709 4895 scope.go:117] "RemoveContainer" containerID="d95d9d738930381cc247f4992573f82ec7c89226f23be0d771d1a3b1b3cf1812" Dec 02 08:59:11 crc kubenswrapper[4895]: E1202 08:59:11.142368 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 08:59:11 crc kubenswrapper[4895]: I1202 08:59:11.306024 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Dec 02 08:59:11 crc kubenswrapper[4895]: I1202 08:59:11.325166 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 02 08:59:12 crc kubenswrapper[4895]: I1202 08:59:12.683293 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack/nova-cell0-conductor-0" Dec 02 08:59:12 crc kubenswrapper[4895]: I1202 08:59:12.981250 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 02 08:59:12 crc kubenswrapper[4895]: I1202 08:59:12.981295 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 02 08:59:13 crc kubenswrapper[4895]: I1202 08:59:13.727210 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 02 08:59:13 crc kubenswrapper[4895]: I1202 08:59:13.727282 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 02 08:59:14 crc kubenswrapper[4895]: I1202 08:59:14.064924 4895 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="aa8c76cd-9852-45cc-82fc-c9ee472f94c2" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.82:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 02 08:59:14 crc kubenswrapper[4895]: I1202 08:59:14.064962 4895 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="aa8c76cd-9852-45cc-82fc-c9ee472f94c2" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.82:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 02 08:59:14 crc kubenswrapper[4895]: I1202 08:59:14.808910 4895 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="f364e779-d2db-4f23-bc99-1d0b91dca426" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.83:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 02 08:59:14 crc kubenswrapper[4895]: I1202 08:59:14.808921 4895 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" 
podUID="f364e779-d2db-4f23-bc99-1d0b91dca426" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.83:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 02 08:59:16 crc kubenswrapper[4895]: I1202 08:59:16.964384 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Dec 02 08:59:17 crc kubenswrapper[4895]: I1202 08:59:17.223204 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 02 08:59:17 crc kubenswrapper[4895]: I1202 08:59:17.224943 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 02 08:59:17 crc kubenswrapper[4895]: I1202 08:59:17.227149 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 02 08:59:17 crc kubenswrapper[4895]: I1202 08:59:17.248894 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 02 08:59:17 crc kubenswrapper[4895]: I1202 08:59:17.351583 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95577834-67b9-4194-9e7b-6377d2f6b603-config-data\") pod \"cinder-scheduler-0\" (UID: \"95577834-67b9-4194-9e7b-6377d2f6b603\") " pod="openstack/cinder-scheduler-0" Dec 02 08:59:17 crc kubenswrapper[4895]: I1202 08:59:17.351634 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/95577834-67b9-4194-9e7b-6377d2f6b603-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"95577834-67b9-4194-9e7b-6377d2f6b603\") " pod="openstack/cinder-scheduler-0" Dec 02 08:59:17 crc kubenswrapper[4895]: I1202 08:59:17.351848 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/95577834-67b9-4194-9e7b-6377d2f6b603-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"95577834-67b9-4194-9e7b-6377d2f6b603\") " pod="openstack/cinder-scheduler-0" Dec 02 08:59:17 crc kubenswrapper[4895]: I1202 08:59:17.351903 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95577834-67b9-4194-9e7b-6377d2f6b603-scripts\") pod \"cinder-scheduler-0\" (UID: \"95577834-67b9-4194-9e7b-6377d2f6b603\") " pod="openstack/cinder-scheduler-0" Dec 02 08:59:17 crc kubenswrapper[4895]: I1202 08:59:17.351946 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95577834-67b9-4194-9e7b-6377d2f6b603-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"95577834-67b9-4194-9e7b-6377d2f6b603\") " pod="openstack/cinder-scheduler-0" Dec 02 08:59:17 crc kubenswrapper[4895]: I1202 08:59:17.351974 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqjft\" (UniqueName: \"kubernetes.io/projected/95577834-67b9-4194-9e7b-6377d2f6b603-kube-api-access-qqjft\") pod \"cinder-scheduler-0\" (UID: \"95577834-67b9-4194-9e7b-6377d2f6b603\") " pod="openstack/cinder-scheduler-0" Dec 02 08:59:17 crc kubenswrapper[4895]: I1202 08:59:17.454232 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/95577834-67b9-4194-9e7b-6377d2f6b603-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"95577834-67b9-4194-9e7b-6377d2f6b603\") " pod="openstack/cinder-scheduler-0" Dec 02 08:59:17 crc kubenswrapper[4895]: I1202 08:59:17.454310 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95577834-67b9-4194-9e7b-6377d2f6b603-scripts\") 
pod \"cinder-scheduler-0\" (UID: \"95577834-67b9-4194-9e7b-6377d2f6b603\") " pod="openstack/cinder-scheduler-0" Dec 02 08:59:17 crc kubenswrapper[4895]: I1202 08:59:17.454354 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95577834-67b9-4194-9e7b-6377d2f6b603-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"95577834-67b9-4194-9e7b-6377d2f6b603\") " pod="openstack/cinder-scheduler-0" Dec 02 08:59:17 crc kubenswrapper[4895]: I1202 08:59:17.454374 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qqjft\" (UniqueName: \"kubernetes.io/projected/95577834-67b9-4194-9e7b-6377d2f6b603-kube-api-access-qqjft\") pod \"cinder-scheduler-0\" (UID: \"95577834-67b9-4194-9e7b-6377d2f6b603\") " pod="openstack/cinder-scheduler-0" Dec 02 08:59:17 crc kubenswrapper[4895]: I1202 08:59:17.454477 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95577834-67b9-4194-9e7b-6377d2f6b603-config-data\") pod \"cinder-scheduler-0\" (UID: \"95577834-67b9-4194-9e7b-6377d2f6b603\") " pod="openstack/cinder-scheduler-0" Dec 02 08:59:17 crc kubenswrapper[4895]: I1202 08:59:17.454506 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/95577834-67b9-4194-9e7b-6377d2f6b603-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"95577834-67b9-4194-9e7b-6377d2f6b603\") " pod="openstack/cinder-scheduler-0" Dec 02 08:59:17 crc kubenswrapper[4895]: I1202 08:59:17.454983 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/95577834-67b9-4194-9e7b-6377d2f6b603-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"95577834-67b9-4194-9e7b-6377d2f6b603\") " pod="openstack/cinder-scheduler-0" Dec 02 08:59:17 crc 
kubenswrapper[4895]: I1202 08:59:17.461761 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95577834-67b9-4194-9e7b-6377d2f6b603-config-data\") pod \"cinder-scheduler-0\" (UID: \"95577834-67b9-4194-9e7b-6377d2f6b603\") " pod="openstack/cinder-scheduler-0" Dec 02 08:59:17 crc kubenswrapper[4895]: I1202 08:59:17.461925 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95577834-67b9-4194-9e7b-6377d2f6b603-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"95577834-67b9-4194-9e7b-6377d2f6b603\") " pod="openstack/cinder-scheduler-0" Dec 02 08:59:17 crc kubenswrapper[4895]: I1202 08:59:17.463517 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/95577834-67b9-4194-9e7b-6377d2f6b603-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"95577834-67b9-4194-9e7b-6377d2f6b603\") " pod="openstack/cinder-scheduler-0" Dec 02 08:59:17 crc kubenswrapper[4895]: I1202 08:59:17.469462 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95577834-67b9-4194-9e7b-6377d2f6b603-scripts\") pod \"cinder-scheduler-0\" (UID: \"95577834-67b9-4194-9e7b-6377d2f6b603\") " pod="openstack/cinder-scheduler-0" Dec 02 08:59:17 crc kubenswrapper[4895]: I1202 08:59:17.474680 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqjft\" (UniqueName: \"kubernetes.io/projected/95577834-67b9-4194-9e7b-6377d2f6b603-kube-api-access-qqjft\") pod \"cinder-scheduler-0\" (UID: \"95577834-67b9-4194-9e7b-6377d2f6b603\") " pod="openstack/cinder-scheduler-0" Dec 02 08:59:17 crc kubenswrapper[4895]: I1202 08:59:17.547784 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 02 08:59:18 crc kubenswrapper[4895]: I1202 08:59:18.199337 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 02 08:59:18 crc kubenswrapper[4895]: I1202 08:59:18.385318 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"95577834-67b9-4194-9e7b-6377d2f6b603","Type":"ContainerStarted","Data":"ef6392082e4a3b14e74ad59288851a91115f0b9fcd6a10e379cf27bcddd093e3"} Dec 02 08:59:18 crc kubenswrapper[4895]: I1202 08:59:18.697615 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 02 08:59:18 crc kubenswrapper[4895]: I1202 08:59:18.697904 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="2e6bdac4-cbc7-4a33-a2f8-95f346673ee1" containerName="cinder-api-log" containerID="cri-o://47ddf22839b4266b041a95432010db939e597716018941d65c46f0b2b19daf67" gracePeriod=30 Dec 02 08:59:18 crc kubenswrapper[4895]: I1202 08:59:18.698035 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="2e6bdac4-cbc7-4a33-a2f8-95f346673ee1" containerName="cinder-api" containerID="cri-o://067db0a6ec45939efcf4939fef342452de0c1d2b83c3324bd79dc142bc5bc1f9" gracePeriod=30 Dec 02 08:59:19 crc kubenswrapper[4895]: I1202 08:59:19.258615 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-volume-volume1-0"] Dec 02 08:59:19 crc kubenswrapper[4895]: I1202 08:59:19.262131 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-volume-volume1-0" Dec 02 08:59:19 crc kubenswrapper[4895]: I1202 08:59:19.266133 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-volume-volume1-config-data" Dec 02 08:59:19 crc kubenswrapper[4895]: I1202 08:59:19.270899 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Dec 02 08:59:19 crc kubenswrapper[4895]: I1202 08:59:19.395292 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/11cd078b-e238-400e-bf0d-53e7ed0b848b-sys\") pod \"cinder-volume-volume1-0\" (UID: \"11cd078b-e238-400e-bf0d-53e7ed0b848b\") " pod="openstack/cinder-volume-volume1-0" Dec 02 08:59:19 crc kubenswrapper[4895]: I1202 08:59:19.395381 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/11cd078b-e238-400e-bf0d-53e7ed0b848b-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"11cd078b-e238-400e-bf0d-53e7ed0b848b\") " pod="openstack/cinder-volume-volume1-0" Dec 02 08:59:19 crc kubenswrapper[4895]: I1202 08:59:19.395516 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/11cd078b-e238-400e-bf0d-53e7ed0b848b-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"11cd078b-e238-400e-bf0d-53e7ed0b848b\") " pod="openstack/cinder-volume-volume1-0" Dec 02 08:59:19 crc kubenswrapper[4895]: I1202 08:59:19.395602 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/11cd078b-e238-400e-bf0d-53e7ed0b848b-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"11cd078b-e238-400e-bf0d-53e7ed0b848b\") " pod="openstack/cinder-volume-volume1-0" Dec 02 08:59:19 crc kubenswrapper[4895]: I1202 
08:59:19.395655 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11cd078b-e238-400e-bf0d-53e7ed0b848b-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"11cd078b-e238-400e-bf0d-53e7ed0b848b\") " pod="openstack/cinder-volume-volume1-0" Dec 02 08:59:19 crc kubenswrapper[4895]: I1202 08:59:19.395832 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdhhj\" (UniqueName: \"kubernetes.io/projected/11cd078b-e238-400e-bf0d-53e7ed0b848b-kube-api-access-tdhhj\") pod \"cinder-volume-volume1-0\" (UID: \"11cd078b-e238-400e-bf0d-53e7ed0b848b\") " pod="openstack/cinder-volume-volume1-0" Dec 02 08:59:19 crc kubenswrapper[4895]: I1202 08:59:19.396058 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/11cd078b-e238-400e-bf0d-53e7ed0b848b-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"11cd078b-e238-400e-bf0d-53e7ed0b848b\") " pod="openstack/cinder-volume-volume1-0" Dec 02 08:59:19 crc kubenswrapper[4895]: I1202 08:59:19.396125 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/11cd078b-e238-400e-bf0d-53e7ed0b848b-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"11cd078b-e238-400e-bf0d-53e7ed0b848b\") " pod="openstack/cinder-volume-volume1-0" Dec 02 08:59:19 crc kubenswrapper[4895]: I1202 08:59:19.396177 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/11cd078b-e238-400e-bf0d-53e7ed0b848b-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"11cd078b-e238-400e-bf0d-53e7ed0b848b\") " pod="openstack/cinder-volume-volume1-0" Dec 02 08:59:19 crc kubenswrapper[4895]: I1202 
08:59:19.396244 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11cd078b-e238-400e-bf0d-53e7ed0b848b-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"11cd078b-e238-400e-bf0d-53e7ed0b848b\") " pod="openstack/cinder-volume-volume1-0" Dec 02 08:59:19 crc kubenswrapper[4895]: I1202 08:59:19.396400 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/11cd078b-e238-400e-bf0d-53e7ed0b848b-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"11cd078b-e238-400e-bf0d-53e7ed0b848b\") " pod="openstack/cinder-volume-volume1-0" Dec 02 08:59:19 crc kubenswrapper[4895]: I1202 08:59:19.396548 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/11cd078b-e238-400e-bf0d-53e7ed0b848b-dev\") pod \"cinder-volume-volume1-0\" (UID: \"11cd078b-e238-400e-bf0d-53e7ed0b848b\") " pod="openstack/cinder-volume-volume1-0" Dec 02 08:59:19 crc kubenswrapper[4895]: I1202 08:59:19.396588 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/11cd078b-e238-400e-bf0d-53e7ed0b848b-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"11cd078b-e238-400e-bf0d-53e7ed0b848b\") " pod="openstack/cinder-volume-volume1-0" Dec 02 08:59:19 crc kubenswrapper[4895]: I1202 08:59:19.396612 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/11cd078b-e238-400e-bf0d-53e7ed0b848b-run\") pod \"cinder-volume-volume1-0\" (UID: \"11cd078b-e238-400e-bf0d-53e7ed0b848b\") " pod="openstack/cinder-volume-volume1-0" Dec 02 08:59:19 crc kubenswrapper[4895]: I1202 08:59:19.396681 4895 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/11cd078b-e238-400e-bf0d-53e7ed0b848b-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"11cd078b-e238-400e-bf0d-53e7ed0b848b\") " pod="openstack/cinder-volume-volume1-0" Dec 02 08:59:19 crc kubenswrapper[4895]: I1202 08:59:19.396873 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/11cd078b-e238-400e-bf0d-53e7ed0b848b-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"11cd078b-e238-400e-bf0d-53e7ed0b848b\") " pod="openstack/cinder-volume-volume1-0" Dec 02 08:59:19 crc kubenswrapper[4895]: I1202 08:59:19.401126 4895 generic.go:334] "Generic (PLEG): container finished" podID="2e6bdac4-cbc7-4a33-a2f8-95f346673ee1" containerID="47ddf22839b4266b041a95432010db939e597716018941d65c46f0b2b19daf67" exitCode=143 Dec 02 08:59:19 crc kubenswrapper[4895]: I1202 08:59:19.401238 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2e6bdac4-cbc7-4a33-a2f8-95f346673ee1","Type":"ContainerDied","Data":"47ddf22839b4266b041a95432010db939e597716018941d65c46f0b2b19daf67"} Dec 02 08:59:19 crc kubenswrapper[4895]: I1202 08:59:19.403605 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"95577834-67b9-4194-9e7b-6377d2f6b603","Type":"ContainerStarted","Data":"b35b003b563c5a94050979dce6ad29ded8b3ea3c3aa4e959f6fb346680e7099c"} Dec 02 08:59:19 crc kubenswrapper[4895]: I1202 08:59:19.499553 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/11cd078b-e238-400e-bf0d-53e7ed0b848b-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"11cd078b-e238-400e-bf0d-53e7ed0b848b\") " pod="openstack/cinder-volume-volume1-0" Dec 02 08:59:19 crc kubenswrapper[4895]: I1202 
08:59:19.499638 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/11cd078b-e238-400e-bf0d-53e7ed0b848b-dev\") pod \"cinder-volume-volume1-0\" (UID: \"11cd078b-e238-400e-bf0d-53e7ed0b848b\") " pod="openstack/cinder-volume-volume1-0" Dec 02 08:59:19 crc kubenswrapper[4895]: I1202 08:59:19.499666 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/11cd078b-e238-400e-bf0d-53e7ed0b848b-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"11cd078b-e238-400e-bf0d-53e7ed0b848b\") " pod="openstack/cinder-volume-volume1-0" Dec 02 08:59:19 crc kubenswrapper[4895]: I1202 08:59:19.499689 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/11cd078b-e238-400e-bf0d-53e7ed0b848b-run\") pod \"cinder-volume-volume1-0\" (UID: \"11cd078b-e238-400e-bf0d-53e7ed0b848b\") " pod="openstack/cinder-volume-volume1-0" Dec 02 08:59:19 crc kubenswrapper[4895]: I1202 08:59:19.499720 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/11cd078b-e238-400e-bf0d-53e7ed0b848b-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"11cd078b-e238-400e-bf0d-53e7ed0b848b\") " pod="openstack/cinder-volume-volume1-0" Dec 02 08:59:19 crc kubenswrapper[4895]: I1202 08:59:19.499771 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/11cd078b-e238-400e-bf0d-53e7ed0b848b-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"11cd078b-e238-400e-bf0d-53e7ed0b848b\") " pod="openstack/cinder-volume-volume1-0" Dec 02 08:59:19 crc kubenswrapper[4895]: I1202 08:59:19.499843 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/11cd078b-e238-400e-bf0d-53e7ed0b848b-sys\") pod \"cinder-volume-volume1-0\" (UID: \"11cd078b-e238-400e-bf0d-53e7ed0b848b\") " pod="openstack/cinder-volume-volume1-0" Dec 02 08:59:19 crc kubenswrapper[4895]: I1202 08:59:19.499874 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/11cd078b-e238-400e-bf0d-53e7ed0b848b-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"11cd078b-e238-400e-bf0d-53e7ed0b848b\") " pod="openstack/cinder-volume-volume1-0" Dec 02 08:59:19 crc kubenswrapper[4895]: I1202 08:59:19.499902 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/11cd078b-e238-400e-bf0d-53e7ed0b848b-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"11cd078b-e238-400e-bf0d-53e7ed0b848b\") " pod="openstack/cinder-volume-volume1-0" Dec 02 08:59:19 crc kubenswrapper[4895]: I1202 08:59:19.499932 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/11cd078b-e238-400e-bf0d-53e7ed0b848b-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"11cd078b-e238-400e-bf0d-53e7ed0b848b\") " pod="openstack/cinder-volume-volume1-0" Dec 02 08:59:19 crc kubenswrapper[4895]: I1202 08:59:19.499963 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11cd078b-e238-400e-bf0d-53e7ed0b848b-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"11cd078b-e238-400e-bf0d-53e7ed0b848b\") " pod="openstack/cinder-volume-volume1-0" Dec 02 08:59:19 crc kubenswrapper[4895]: I1202 08:59:19.499995 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tdhhj\" (UniqueName: \"kubernetes.io/projected/11cd078b-e238-400e-bf0d-53e7ed0b848b-kube-api-access-tdhhj\") pod \"cinder-volume-volume1-0\" (UID: 
\"11cd078b-e238-400e-bf0d-53e7ed0b848b\") " pod="openstack/cinder-volume-volume1-0" Dec 02 08:59:19 crc kubenswrapper[4895]: I1202 08:59:19.500059 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/11cd078b-e238-400e-bf0d-53e7ed0b848b-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"11cd078b-e238-400e-bf0d-53e7ed0b848b\") " pod="openstack/cinder-volume-volume1-0" Dec 02 08:59:19 crc kubenswrapper[4895]: I1202 08:59:19.500092 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/11cd078b-e238-400e-bf0d-53e7ed0b848b-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"11cd078b-e238-400e-bf0d-53e7ed0b848b\") " pod="openstack/cinder-volume-volume1-0" Dec 02 08:59:19 crc kubenswrapper[4895]: I1202 08:59:19.500124 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/11cd078b-e238-400e-bf0d-53e7ed0b848b-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"11cd078b-e238-400e-bf0d-53e7ed0b848b\") " pod="openstack/cinder-volume-volume1-0" Dec 02 08:59:19 crc kubenswrapper[4895]: I1202 08:59:19.500160 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11cd078b-e238-400e-bf0d-53e7ed0b848b-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"11cd078b-e238-400e-bf0d-53e7ed0b848b\") " pod="openstack/cinder-volume-volume1-0" Dec 02 08:59:19 crc kubenswrapper[4895]: I1202 08:59:19.500505 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/11cd078b-e238-400e-bf0d-53e7ed0b848b-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"11cd078b-e238-400e-bf0d-53e7ed0b848b\") " pod="openstack/cinder-volume-volume1-0" Dec 02 08:59:19 crc kubenswrapper[4895]: I1202 
08:59:19.500562 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/11cd078b-e238-400e-bf0d-53e7ed0b848b-dev\") pod \"cinder-volume-volume1-0\" (UID: \"11cd078b-e238-400e-bf0d-53e7ed0b848b\") " pod="openstack/cinder-volume-volume1-0" Dec 02 08:59:19 crc kubenswrapper[4895]: I1202 08:59:19.500585 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/11cd078b-e238-400e-bf0d-53e7ed0b848b-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"11cd078b-e238-400e-bf0d-53e7ed0b848b\") " pod="openstack/cinder-volume-volume1-0" Dec 02 08:59:19 crc kubenswrapper[4895]: I1202 08:59:19.500604 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/11cd078b-e238-400e-bf0d-53e7ed0b848b-run\") pod \"cinder-volume-volume1-0\" (UID: \"11cd078b-e238-400e-bf0d-53e7ed0b848b\") " pod="openstack/cinder-volume-volume1-0" Dec 02 08:59:19 crc kubenswrapper[4895]: I1202 08:59:19.500962 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/11cd078b-e238-400e-bf0d-53e7ed0b848b-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"11cd078b-e238-400e-bf0d-53e7ed0b848b\") " pod="openstack/cinder-volume-volume1-0" Dec 02 08:59:19 crc kubenswrapper[4895]: I1202 08:59:19.501129 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/11cd078b-e238-400e-bf0d-53e7ed0b848b-sys\") pod \"cinder-volume-volume1-0\" (UID: \"11cd078b-e238-400e-bf0d-53e7ed0b848b\") " pod="openstack/cinder-volume-volume1-0" Dec 02 08:59:19 crc kubenswrapper[4895]: I1202 08:59:19.501460 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/11cd078b-e238-400e-bf0d-53e7ed0b848b-var-locks-brick\") pod 
\"cinder-volume-volume1-0\" (UID: \"11cd078b-e238-400e-bf0d-53e7ed0b848b\") " pod="openstack/cinder-volume-volume1-0" Dec 02 08:59:19 crc kubenswrapper[4895]: I1202 08:59:19.501569 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/11cd078b-e238-400e-bf0d-53e7ed0b848b-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"11cd078b-e238-400e-bf0d-53e7ed0b848b\") " pod="openstack/cinder-volume-volume1-0" Dec 02 08:59:19 crc kubenswrapper[4895]: I1202 08:59:19.501625 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/11cd078b-e238-400e-bf0d-53e7ed0b848b-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"11cd078b-e238-400e-bf0d-53e7ed0b848b\") " pod="openstack/cinder-volume-volume1-0" Dec 02 08:59:19 crc kubenswrapper[4895]: I1202 08:59:19.501817 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/11cd078b-e238-400e-bf0d-53e7ed0b848b-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"11cd078b-e238-400e-bf0d-53e7ed0b848b\") " pod="openstack/cinder-volume-volume1-0" Dec 02 08:59:19 crc kubenswrapper[4895]: I1202 08:59:19.505064 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/11cd078b-e238-400e-bf0d-53e7ed0b848b-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"11cd078b-e238-400e-bf0d-53e7ed0b848b\") " pod="openstack/cinder-volume-volume1-0" Dec 02 08:59:19 crc kubenswrapper[4895]: I1202 08:59:19.506107 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/11cd078b-e238-400e-bf0d-53e7ed0b848b-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"11cd078b-e238-400e-bf0d-53e7ed0b848b\") " pod="openstack/cinder-volume-volume1-0" Dec 02 08:59:19 crc kubenswrapper[4895]: I1202 
08:59:19.506760 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11cd078b-e238-400e-bf0d-53e7ed0b848b-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"11cd078b-e238-400e-bf0d-53e7ed0b848b\") " pod="openstack/cinder-volume-volume1-0" Dec 02 08:59:19 crc kubenswrapper[4895]: I1202 08:59:19.507373 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/11cd078b-e238-400e-bf0d-53e7ed0b848b-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"11cd078b-e238-400e-bf0d-53e7ed0b848b\") " pod="openstack/cinder-volume-volume1-0" Dec 02 08:59:19 crc kubenswrapper[4895]: I1202 08:59:19.508077 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11cd078b-e238-400e-bf0d-53e7ed0b848b-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"11cd078b-e238-400e-bf0d-53e7ed0b848b\") " pod="openstack/cinder-volume-volume1-0" Dec 02 08:59:19 crc kubenswrapper[4895]: I1202 08:59:19.529049 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdhhj\" (UniqueName: \"kubernetes.io/projected/11cd078b-e238-400e-bf0d-53e7ed0b848b-kube-api-access-tdhhj\") pod \"cinder-volume-volume1-0\" (UID: \"11cd078b-e238-400e-bf0d-53e7ed0b848b\") " pod="openstack/cinder-volume-volume1-0" Dec 02 08:59:19 crc kubenswrapper[4895]: I1202 08:59:19.601641 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-volume1-0" Dec 02 08:59:19 crc kubenswrapper[4895]: I1202 08:59:19.865991 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-backup-0"] Dec 02 08:59:19 crc kubenswrapper[4895]: I1202 08:59:19.867789 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-backup-0" Dec 02 08:59:19 crc kubenswrapper[4895]: I1202 08:59:19.871382 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-backup-config-data" Dec 02 08:59:19 crc kubenswrapper[4895]: I1202 08:59:19.907755 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Dec 02 08:59:20 crc kubenswrapper[4895]: I1202 08:59:20.009366 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hfc6v\" (UniqueName: \"kubernetes.io/projected/0d4fda23-3c3c-435b-860c-0973feb1e664-kube-api-access-hfc6v\") pod \"cinder-backup-0\" (UID: \"0d4fda23-3c3c-435b-860c-0973feb1e664\") " pod="openstack/cinder-backup-0" Dec 02 08:59:20 crc kubenswrapper[4895]: I1202 08:59:20.009889 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/0d4fda23-3c3c-435b-860c-0973feb1e664-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"0d4fda23-3c3c-435b-860c-0973feb1e664\") " pod="openstack/cinder-backup-0" Dec 02 08:59:20 crc kubenswrapper[4895]: I1202 08:59:20.009948 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/0d4fda23-3c3c-435b-860c-0973feb1e664-dev\") pod \"cinder-backup-0\" (UID: \"0d4fda23-3c3c-435b-860c-0973feb1e664\") " pod="openstack/cinder-backup-0" Dec 02 08:59:20 crc kubenswrapper[4895]: I1202 08:59:20.010019 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/0d4fda23-3c3c-435b-860c-0973feb1e664-run\") pod \"cinder-backup-0\" (UID: \"0d4fda23-3c3c-435b-860c-0973feb1e664\") " pod="openstack/cinder-backup-0" Dec 02 08:59:20 crc kubenswrapper[4895]: I1202 08:59:20.010222 4895 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/0d4fda23-3c3c-435b-860c-0973feb1e664-ceph\") pod \"cinder-backup-0\" (UID: \"0d4fda23-3c3c-435b-860c-0973feb1e664\") " pod="openstack/cinder-backup-0" Dec 02 08:59:20 crc kubenswrapper[4895]: I1202 08:59:20.010291 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0d4fda23-3c3c-435b-860c-0973feb1e664-sys\") pod \"cinder-backup-0\" (UID: \"0d4fda23-3c3c-435b-860c-0973feb1e664\") " pod="openstack/cinder-backup-0" Dec 02 08:59:20 crc kubenswrapper[4895]: I1202 08:59:20.010350 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/0d4fda23-3c3c-435b-860c-0973feb1e664-etc-nvme\") pod \"cinder-backup-0\" (UID: \"0d4fda23-3c3c-435b-860c-0973feb1e664\") " pod="openstack/cinder-backup-0" Dec 02 08:59:20 crc kubenswrapper[4895]: I1202 08:59:20.010378 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d4fda23-3c3c-435b-860c-0973feb1e664-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"0d4fda23-3c3c-435b-860c-0973feb1e664\") " pod="openstack/cinder-backup-0" Dec 02 08:59:20 crc kubenswrapper[4895]: I1202 08:59:20.010412 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0d4fda23-3c3c-435b-860c-0973feb1e664-config-data-custom\") pod \"cinder-backup-0\" (UID: \"0d4fda23-3c3c-435b-860c-0973feb1e664\") " pod="openstack/cinder-backup-0" Dec 02 08:59:20 crc kubenswrapper[4895]: I1202 08:59:20.010670 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: 
\"kubernetes.io/host-path/0d4fda23-3c3c-435b-860c-0973feb1e664-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"0d4fda23-3c3c-435b-860c-0973feb1e664\") " pod="openstack/cinder-backup-0" Dec 02 08:59:20 crc kubenswrapper[4895]: I1202 08:59:20.010757 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d4fda23-3c3c-435b-860c-0973feb1e664-config-data\") pod \"cinder-backup-0\" (UID: \"0d4fda23-3c3c-435b-860c-0973feb1e664\") " pod="openstack/cinder-backup-0" Dec 02 08:59:20 crc kubenswrapper[4895]: I1202 08:59:20.010786 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/0d4fda23-3c3c-435b-860c-0973feb1e664-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"0d4fda23-3c3c-435b-860c-0973feb1e664\") " pod="openstack/cinder-backup-0" Dec 02 08:59:20 crc kubenswrapper[4895]: I1202 08:59:20.010838 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0d4fda23-3c3c-435b-860c-0973feb1e664-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"0d4fda23-3c3c-435b-860c-0973feb1e664\") " pod="openstack/cinder-backup-0" Dec 02 08:59:20 crc kubenswrapper[4895]: I1202 08:59:20.010879 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0d4fda23-3c3c-435b-860c-0973feb1e664-scripts\") pod \"cinder-backup-0\" (UID: \"0d4fda23-3c3c-435b-860c-0973feb1e664\") " pod="openstack/cinder-backup-0" Dec 02 08:59:20 crc kubenswrapper[4895]: I1202 08:59:20.010908 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/0d4fda23-3c3c-435b-860c-0973feb1e664-var-locks-cinder\") pod \"cinder-backup-0\" 
(UID: \"0d4fda23-3c3c-435b-860c-0973feb1e664\") " pod="openstack/cinder-backup-0" Dec 02 08:59:20 crc kubenswrapper[4895]: I1202 08:59:20.010965 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0d4fda23-3c3c-435b-860c-0973feb1e664-lib-modules\") pod \"cinder-backup-0\" (UID: \"0d4fda23-3c3c-435b-860c-0973feb1e664\") " pod="openstack/cinder-backup-0" Dec 02 08:59:20 crc kubenswrapper[4895]: I1202 08:59:20.112319 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d4fda23-3c3c-435b-860c-0973feb1e664-config-data\") pod \"cinder-backup-0\" (UID: \"0d4fda23-3c3c-435b-860c-0973feb1e664\") " pod="openstack/cinder-backup-0" Dec 02 08:59:20 crc kubenswrapper[4895]: I1202 08:59:20.112389 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/0d4fda23-3c3c-435b-860c-0973feb1e664-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"0d4fda23-3c3c-435b-860c-0973feb1e664\") " pod="openstack/cinder-backup-0" Dec 02 08:59:20 crc kubenswrapper[4895]: I1202 08:59:20.112413 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0d4fda23-3c3c-435b-860c-0973feb1e664-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"0d4fda23-3c3c-435b-860c-0973feb1e664\") " pod="openstack/cinder-backup-0" Dec 02 08:59:20 crc kubenswrapper[4895]: I1202 08:59:20.112443 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0d4fda23-3c3c-435b-860c-0973feb1e664-scripts\") pod \"cinder-backup-0\" (UID: \"0d4fda23-3c3c-435b-860c-0973feb1e664\") " pod="openstack/cinder-backup-0" Dec 02 08:59:20 crc kubenswrapper[4895]: I1202 08:59:20.112467 4895 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/0d4fda23-3c3c-435b-860c-0973feb1e664-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"0d4fda23-3c3c-435b-860c-0973feb1e664\") " pod="openstack/cinder-backup-0" Dec 02 08:59:20 crc kubenswrapper[4895]: I1202 08:59:20.112495 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0d4fda23-3c3c-435b-860c-0973feb1e664-lib-modules\") pod \"cinder-backup-0\" (UID: \"0d4fda23-3c3c-435b-860c-0973feb1e664\") " pod="openstack/cinder-backup-0" Dec 02 08:59:20 crc kubenswrapper[4895]: I1202 08:59:20.112505 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/0d4fda23-3c3c-435b-860c-0973feb1e664-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"0d4fda23-3c3c-435b-860c-0973feb1e664\") " pod="openstack/cinder-backup-0" Dec 02 08:59:20 crc kubenswrapper[4895]: I1202 08:59:20.112542 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hfc6v\" (UniqueName: \"kubernetes.io/projected/0d4fda23-3c3c-435b-860c-0973feb1e664-kube-api-access-hfc6v\") pod \"cinder-backup-0\" (UID: \"0d4fda23-3c3c-435b-860c-0973feb1e664\") " pod="openstack/cinder-backup-0" Dec 02 08:59:20 crc kubenswrapper[4895]: I1202 08:59:20.112577 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0d4fda23-3c3c-435b-860c-0973feb1e664-lib-modules\") pod \"cinder-backup-0\" (UID: \"0d4fda23-3c3c-435b-860c-0973feb1e664\") " pod="openstack/cinder-backup-0" Dec 02 08:59:20 crc kubenswrapper[4895]: I1202 08:59:20.112588 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/0d4fda23-3c3c-435b-860c-0973feb1e664-etc-iscsi\") pod \"cinder-backup-0\" (UID: 
\"0d4fda23-3c3c-435b-860c-0973feb1e664\") " pod="openstack/cinder-backup-0" Dec 02 08:59:20 crc kubenswrapper[4895]: I1202 08:59:20.112639 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/0d4fda23-3c3c-435b-860c-0973feb1e664-dev\") pod \"cinder-backup-0\" (UID: \"0d4fda23-3c3c-435b-860c-0973feb1e664\") " pod="openstack/cinder-backup-0" Dec 02 08:59:20 crc kubenswrapper[4895]: I1202 08:59:20.112665 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/0d4fda23-3c3c-435b-860c-0973feb1e664-run\") pod \"cinder-backup-0\" (UID: \"0d4fda23-3c3c-435b-860c-0973feb1e664\") " pod="openstack/cinder-backup-0" Dec 02 08:59:20 crc kubenswrapper[4895]: I1202 08:59:20.112917 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/0d4fda23-3c3c-435b-860c-0973feb1e664-ceph\") pod \"cinder-backup-0\" (UID: \"0d4fda23-3c3c-435b-860c-0973feb1e664\") " pod="openstack/cinder-backup-0" Dec 02 08:59:20 crc kubenswrapper[4895]: I1202 08:59:20.112723 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/0d4fda23-3c3c-435b-860c-0973feb1e664-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"0d4fda23-3c3c-435b-860c-0973feb1e664\") " pod="openstack/cinder-backup-0" Dec 02 08:59:20 crc kubenswrapper[4895]: I1202 08:59:20.112935 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/0d4fda23-3c3c-435b-860c-0973feb1e664-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"0d4fda23-3c3c-435b-860c-0973feb1e664\") " pod="openstack/cinder-backup-0" Dec 02 08:59:20 crc kubenswrapper[4895]: I1202 08:59:20.112860 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: 
\"kubernetes.io/host-path/0d4fda23-3c3c-435b-860c-0973feb1e664-run\") pod \"cinder-backup-0\" (UID: \"0d4fda23-3c3c-435b-860c-0973feb1e664\") " pod="openstack/cinder-backup-0" Dec 02 08:59:20 crc kubenswrapper[4895]: I1202 08:59:20.112944 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0d4fda23-3c3c-435b-860c-0973feb1e664-sys\") pod \"cinder-backup-0\" (UID: \"0d4fda23-3c3c-435b-860c-0973feb1e664\") " pod="openstack/cinder-backup-0" Dec 02 08:59:20 crc kubenswrapper[4895]: I1202 08:59:20.113131 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/0d4fda23-3c3c-435b-860c-0973feb1e664-etc-nvme\") pod \"cinder-backup-0\" (UID: \"0d4fda23-3c3c-435b-860c-0973feb1e664\") " pod="openstack/cinder-backup-0" Dec 02 08:59:20 crc kubenswrapper[4895]: I1202 08:59:20.113166 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d4fda23-3c3c-435b-860c-0973feb1e664-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"0d4fda23-3c3c-435b-860c-0973feb1e664\") " pod="openstack/cinder-backup-0" Dec 02 08:59:20 crc kubenswrapper[4895]: I1202 08:59:20.113227 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0d4fda23-3c3c-435b-860c-0973feb1e664-config-data-custom\") pod \"cinder-backup-0\" (UID: \"0d4fda23-3c3c-435b-860c-0973feb1e664\") " pod="openstack/cinder-backup-0" Dec 02 08:59:20 crc kubenswrapper[4895]: I1202 08:59:20.112783 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0d4fda23-3c3c-435b-860c-0973feb1e664-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"0d4fda23-3c3c-435b-860c-0973feb1e664\") " pod="openstack/cinder-backup-0" Dec 02 08:59:20 crc kubenswrapper[4895]: 
I1202 08:59:20.112812 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/0d4fda23-3c3c-435b-860c-0973feb1e664-dev\") pod \"cinder-backup-0\" (UID: \"0d4fda23-3c3c-435b-860c-0973feb1e664\") " pod="openstack/cinder-backup-0" Dec 02 08:59:20 crc kubenswrapper[4895]: I1202 08:59:20.113346 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/0d4fda23-3c3c-435b-860c-0973feb1e664-etc-nvme\") pod \"cinder-backup-0\" (UID: \"0d4fda23-3c3c-435b-860c-0973feb1e664\") " pod="openstack/cinder-backup-0" Dec 02 08:59:20 crc kubenswrapper[4895]: I1202 08:59:20.112974 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0d4fda23-3c3c-435b-860c-0973feb1e664-sys\") pod \"cinder-backup-0\" (UID: \"0d4fda23-3c3c-435b-860c-0973feb1e664\") " pod="openstack/cinder-backup-0" Dec 02 08:59:20 crc kubenswrapper[4895]: I1202 08:59:20.113790 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/0d4fda23-3c3c-435b-860c-0973feb1e664-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"0d4fda23-3c3c-435b-860c-0973feb1e664\") " pod="openstack/cinder-backup-0" Dec 02 08:59:20 crc kubenswrapper[4895]: I1202 08:59:20.113828 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/0d4fda23-3c3c-435b-860c-0973feb1e664-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"0d4fda23-3c3c-435b-860c-0973feb1e664\") " pod="openstack/cinder-backup-0" Dec 02 08:59:20 crc kubenswrapper[4895]: I1202 08:59:20.121008 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/0d4fda23-3c3c-435b-860c-0973feb1e664-ceph\") pod \"cinder-backup-0\" (UID: \"0d4fda23-3c3c-435b-860c-0973feb1e664\") " 
pod="openstack/cinder-backup-0" Dec 02 08:59:20 crc kubenswrapper[4895]: I1202 08:59:20.121041 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0d4fda23-3c3c-435b-860c-0973feb1e664-config-data-custom\") pod \"cinder-backup-0\" (UID: \"0d4fda23-3c3c-435b-860c-0973feb1e664\") " pod="openstack/cinder-backup-0" Dec 02 08:59:20 crc kubenswrapper[4895]: I1202 08:59:20.121783 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0d4fda23-3c3c-435b-860c-0973feb1e664-scripts\") pod \"cinder-backup-0\" (UID: \"0d4fda23-3c3c-435b-860c-0973feb1e664\") " pod="openstack/cinder-backup-0" Dec 02 08:59:20 crc kubenswrapper[4895]: I1202 08:59:20.122124 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d4fda23-3c3c-435b-860c-0973feb1e664-config-data\") pod \"cinder-backup-0\" (UID: \"0d4fda23-3c3c-435b-860c-0973feb1e664\") " pod="openstack/cinder-backup-0" Dec 02 08:59:20 crc kubenswrapper[4895]: I1202 08:59:20.122298 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d4fda23-3c3c-435b-860c-0973feb1e664-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"0d4fda23-3c3c-435b-860c-0973feb1e664\") " pod="openstack/cinder-backup-0" Dec 02 08:59:20 crc kubenswrapper[4895]: I1202 08:59:20.139698 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hfc6v\" (UniqueName: \"kubernetes.io/projected/0d4fda23-3c3c-435b-860c-0973feb1e664-kube-api-access-hfc6v\") pod \"cinder-backup-0\" (UID: \"0d4fda23-3c3c-435b-860c-0973feb1e664\") " pod="openstack/cinder-backup-0" Dec 02 08:59:20 crc kubenswrapper[4895]: I1202 08:59:20.198394 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Dec 02 08:59:20 crc 
kubenswrapper[4895]: W1202 08:59:20.199852 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod11cd078b_e238_400e_bf0d_53e7ed0b848b.slice/crio-b252a84d0b18e06b63445228a676070d3176533df67528c05a87ff062de16ec7 WatchSource:0}: Error finding container b252a84d0b18e06b63445228a676070d3176533df67528c05a87ff062de16ec7: Status 404 returned error can't find the container with id b252a84d0b18e06b63445228a676070d3176533df67528c05a87ff062de16ec7 Dec 02 08:59:20 crc kubenswrapper[4895]: I1202 08:59:20.206485 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0" Dec 02 08:59:20 crc kubenswrapper[4895]: I1202 08:59:20.418143 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"11cd078b-e238-400e-bf0d-53e7ed0b848b","Type":"ContainerStarted","Data":"b252a84d0b18e06b63445228a676070d3176533df67528c05a87ff062de16ec7"} Dec 02 08:59:20 crc kubenswrapper[4895]: I1202 08:59:20.423191 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"95577834-67b9-4194-9e7b-6377d2f6b603","Type":"ContainerStarted","Data":"b103158c3e9569ee13031ea0f1a10cdbcd12eb0e60453e7185d5a5c86af6e4ee"} Dec 02 08:59:20 crc kubenswrapper[4895]: I1202 08:59:20.456064 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.4560407189999998 podStartE2EDuration="3.456040719s" podCreationTimestamp="2025-12-02 08:59:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:59:20.444222362 +0000 UTC m=+5771.615081975" watchObservedRunningTime="2025-12-02 08:59:20.456040719 +0000 UTC m=+5771.626900342" Dec 02 08:59:20 crc kubenswrapper[4895]: I1202 08:59:20.797051 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/cinder-backup-0"] Dec 02 08:59:21 crc kubenswrapper[4895]: I1202 08:59:21.434074 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"0d4fda23-3c3c-435b-860c-0973feb1e664","Type":"ContainerStarted","Data":"44a74fbd691eac17922b8abe5ab4c2d83f9a52492f04cb26491ea083a99d18e7"} Dec 02 08:59:21 crc kubenswrapper[4895]: I1202 08:59:21.906009 4895 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="2e6bdac4-cbc7-4a33-a2f8-95f346673ee1" containerName="cinder-api" probeResult="failure" output="Get \"http://10.217.1.78:8776/healthcheck\": dial tcp 10.217.1.78:8776: connect: connection refused" Dec 02 08:59:22 crc kubenswrapper[4895]: I1202 08:59:22.313099 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 02 08:59:22 crc kubenswrapper[4895]: I1202 08:59:22.453510 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"11cd078b-e238-400e-bf0d-53e7ed0b848b","Type":"ContainerStarted","Data":"db01263b171b31852326ae8d6a19e5ce8deeeca306955cce65a5973c88a53415"} Dec 02 08:59:22 crc kubenswrapper[4895]: I1202 08:59:22.454101 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"11cd078b-e238-400e-bf0d-53e7ed0b848b","Type":"ContainerStarted","Data":"b99c9f469e2e8300cfb18d39583c5d8ec57c03adc56f44e7e6d9e3f3e19ab5ea"} Dec 02 08:59:22 crc kubenswrapper[4895]: I1202 08:59:22.457679 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"0d4fda23-3c3c-435b-860c-0973feb1e664","Type":"ContainerStarted","Data":"3383f65e809ef07f05ee55462a6ec736e0313570ccc527a62430e1607f7bb60f"} Dec 02 08:59:22 crc kubenswrapper[4895]: I1202 08:59:22.460345 4895 generic.go:334] "Generic (PLEG): container finished" podID="2e6bdac4-cbc7-4a33-a2f8-95f346673ee1" 
containerID="067db0a6ec45939efcf4939fef342452de0c1d2b83c3324bd79dc142bc5bc1f9" exitCode=0 Dec 02 08:59:22 crc kubenswrapper[4895]: I1202 08:59:22.460404 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2e6bdac4-cbc7-4a33-a2f8-95f346673ee1","Type":"ContainerDied","Data":"067db0a6ec45939efcf4939fef342452de0c1d2b83c3324bd79dc142bc5bc1f9"} Dec 02 08:59:22 crc kubenswrapper[4895]: I1202 08:59:22.460439 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2e6bdac4-cbc7-4a33-a2f8-95f346673ee1","Type":"ContainerDied","Data":"818919f127e8ef487f4db6c6286e0624b087cc79f98ad6a6633aa28751a12dd2"} Dec 02 08:59:22 crc kubenswrapper[4895]: I1202 08:59:22.460463 4895 scope.go:117] "RemoveContainer" containerID="067db0a6ec45939efcf4939fef342452de0c1d2b83c3324bd79dc142bc5bc1f9" Dec 02 08:59:22 crc kubenswrapper[4895]: I1202 08:59:22.460685 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 02 08:59:22 crc kubenswrapper[4895]: I1202 08:59:22.479311 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e6bdac4-cbc7-4a33-a2f8-95f346673ee1-config-data\") pod \"2e6bdac4-cbc7-4a33-a2f8-95f346673ee1\" (UID: \"2e6bdac4-cbc7-4a33-a2f8-95f346673ee1\") " Dec 02 08:59:22 crc kubenswrapper[4895]: I1202 08:59:22.479371 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e6bdac4-cbc7-4a33-a2f8-95f346673ee1-combined-ca-bundle\") pod \"2e6bdac4-cbc7-4a33-a2f8-95f346673ee1\" (UID: \"2e6bdac4-cbc7-4a33-a2f8-95f346673ee1\") " Dec 02 08:59:22 crc kubenswrapper[4895]: I1202 08:59:22.479475 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2e6bdac4-cbc7-4a33-a2f8-95f346673ee1-logs\") pod 
\"2e6bdac4-cbc7-4a33-a2f8-95f346673ee1\" (UID: \"2e6bdac4-cbc7-4a33-a2f8-95f346673ee1\") " Dec 02 08:59:22 crc kubenswrapper[4895]: I1202 08:59:22.479518 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2e6bdac4-cbc7-4a33-a2f8-95f346673ee1-etc-machine-id\") pod \"2e6bdac4-cbc7-4a33-a2f8-95f346673ee1\" (UID: \"2e6bdac4-cbc7-4a33-a2f8-95f346673ee1\") " Dec 02 08:59:22 crc kubenswrapper[4895]: I1202 08:59:22.479561 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2e6bdac4-cbc7-4a33-a2f8-95f346673ee1-config-data-custom\") pod \"2e6bdac4-cbc7-4a33-a2f8-95f346673ee1\" (UID: \"2e6bdac4-cbc7-4a33-a2f8-95f346673ee1\") " Dec 02 08:59:22 crc kubenswrapper[4895]: I1202 08:59:22.479615 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7wfkh\" (UniqueName: \"kubernetes.io/projected/2e6bdac4-cbc7-4a33-a2f8-95f346673ee1-kube-api-access-7wfkh\") pod \"2e6bdac4-cbc7-4a33-a2f8-95f346673ee1\" (UID: \"2e6bdac4-cbc7-4a33-a2f8-95f346673ee1\") " Dec 02 08:59:22 crc kubenswrapper[4895]: I1202 08:59:22.479666 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e6bdac4-cbc7-4a33-a2f8-95f346673ee1-scripts\") pod \"2e6bdac4-cbc7-4a33-a2f8-95f346673ee1\" (UID: \"2e6bdac4-cbc7-4a33-a2f8-95f346673ee1\") " Dec 02 08:59:22 crc kubenswrapper[4895]: I1202 08:59:22.486486 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e6bdac4-cbc7-4a33-a2f8-95f346673ee1-logs" (OuterVolumeSpecName: "logs") pod "2e6bdac4-cbc7-4a33-a2f8-95f346673ee1" (UID: "2e6bdac4-cbc7-4a33-a2f8-95f346673ee1"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:59:22 crc kubenswrapper[4895]: I1202 08:59:22.486492 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2e6bdac4-cbc7-4a33-a2f8-95f346673ee1-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "2e6bdac4-cbc7-4a33-a2f8-95f346673ee1" (UID: "2e6bdac4-cbc7-4a33-a2f8-95f346673ee1"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 08:59:22 crc kubenswrapper[4895]: I1202 08:59:22.490076 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e6bdac4-cbc7-4a33-a2f8-95f346673ee1-scripts" (OuterVolumeSpecName: "scripts") pod "2e6bdac4-cbc7-4a33-a2f8-95f346673ee1" (UID: "2e6bdac4-cbc7-4a33-a2f8-95f346673ee1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:59:22 crc kubenswrapper[4895]: I1202 08:59:22.493115 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e6bdac4-cbc7-4a33-a2f8-95f346673ee1-kube-api-access-7wfkh" (OuterVolumeSpecName: "kube-api-access-7wfkh") pod "2e6bdac4-cbc7-4a33-a2f8-95f346673ee1" (UID: "2e6bdac4-cbc7-4a33-a2f8-95f346673ee1"). InnerVolumeSpecName "kube-api-access-7wfkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:59:22 crc kubenswrapper[4895]: I1202 08:59:22.504897 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e6bdac4-cbc7-4a33-a2f8-95f346673ee1-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "2e6bdac4-cbc7-4a33-a2f8-95f346673ee1" (UID: "2e6bdac4-cbc7-4a33-a2f8-95f346673ee1"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:59:22 crc kubenswrapper[4895]: I1202 08:59:22.507162 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-volume-volume1-0" podStartSLOduration=1.9726223109999999 podStartE2EDuration="3.50714138s" podCreationTimestamp="2025-12-02 08:59:19 +0000 UTC" firstStartedPulling="2025-12-02 08:59:20.202021047 +0000 UTC m=+5771.372880660" lastFinishedPulling="2025-12-02 08:59:21.736540116 +0000 UTC m=+5772.907399729" observedRunningTime="2025-12-02 08:59:22.489077928 +0000 UTC m=+5773.659937541" watchObservedRunningTime="2025-12-02 08:59:22.50714138 +0000 UTC m=+5773.678000993" Dec 02 08:59:22 crc kubenswrapper[4895]: I1202 08:59:22.507591 4895 scope.go:117] "RemoveContainer" containerID="47ddf22839b4266b041a95432010db939e597716018941d65c46f0b2b19daf67" Dec 02 08:59:22 crc kubenswrapper[4895]: I1202 08:59:22.534374 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e6bdac4-cbc7-4a33-a2f8-95f346673ee1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2e6bdac4-cbc7-4a33-a2f8-95f346673ee1" (UID: "2e6bdac4-cbc7-4a33-a2f8-95f346673ee1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:59:22 crc kubenswrapper[4895]: I1202 08:59:22.539121 4895 scope.go:117] "RemoveContainer" containerID="067db0a6ec45939efcf4939fef342452de0c1d2b83c3324bd79dc142bc5bc1f9" Dec 02 08:59:22 crc kubenswrapper[4895]: E1202 08:59:22.539529 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"067db0a6ec45939efcf4939fef342452de0c1d2b83c3324bd79dc142bc5bc1f9\": container with ID starting with 067db0a6ec45939efcf4939fef342452de0c1d2b83c3324bd79dc142bc5bc1f9 not found: ID does not exist" containerID="067db0a6ec45939efcf4939fef342452de0c1d2b83c3324bd79dc142bc5bc1f9" Dec 02 08:59:22 crc kubenswrapper[4895]: I1202 08:59:22.539631 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"067db0a6ec45939efcf4939fef342452de0c1d2b83c3324bd79dc142bc5bc1f9"} err="failed to get container status \"067db0a6ec45939efcf4939fef342452de0c1d2b83c3324bd79dc142bc5bc1f9\": rpc error: code = NotFound desc = could not find container \"067db0a6ec45939efcf4939fef342452de0c1d2b83c3324bd79dc142bc5bc1f9\": container with ID starting with 067db0a6ec45939efcf4939fef342452de0c1d2b83c3324bd79dc142bc5bc1f9 not found: ID does not exist" Dec 02 08:59:22 crc kubenswrapper[4895]: I1202 08:59:22.539704 4895 scope.go:117] "RemoveContainer" containerID="47ddf22839b4266b041a95432010db939e597716018941d65c46f0b2b19daf67" Dec 02 08:59:22 crc kubenswrapper[4895]: E1202 08:59:22.540984 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47ddf22839b4266b041a95432010db939e597716018941d65c46f0b2b19daf67\": container with ID starting with 47ddf22839b4266b041a95432010db939e597716018941d65c46f0b2b19daf67 not found: ID does not exist" containerID="47ddf22839b4266b041a95432010db939e597716018941d65c46f0b2b19daf67" Dec 02 08:59:22 crc kubenswrapper[4895]: I1202 08:59:22.541024 
4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47ddf22839b4266b041a95432010db939e597716018941d65c46f0b2b19daf67"} err="failed to get container status \"47ddf22839b4266b041a95432010db939e597716018941d65c46f0b2b19daf67\": rpc error: code = NotFound desc = could not find container \"47ddf22839b4266b041a95432010db939e597716018941d65c46f0b2b19daf67\": container with ID starting with 47ddf22839b4266b041a95432010db939e597716018941d65c46f0b2b19daf67 not found: ID does not exist" Dec 02 08:59:22 crc kubenswrapper[4895]: I1202 08:59:22.548408 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e6bdac4-cbc7-4a33-a2f8-95f346673ee1-config-data" (OuterVolumeSpecName: "config-data") pod "2e6bdac4-cbc7-4a33-a2f8-95f346673ee1" (UID: "2e6bdac4-cbc7-4a33-a2f8-95f346673ee1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:59:22 crc kubenswrapper[4895]: I1202 08:59:22.548905 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 02 08:59:22 crc kubenswrapper[4895]: I1202 08:59:22.582106 4895 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2e6bdac4-cbc7-4a33-a2f8-95f346673ee1-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 02 08:59:22 crc kubenswrapper[4895]: I1202 08:59:22.582135 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7wfkh\" (UniqueName: \"kubernetes.io/projected/2e6bdac4-cbc7-4a33-a2f8-95f346673ee1-kube-api-access-7wfkh\") on node \"crc\" DevicePath \"\"" Dec 02 08:59:22 crc kubenswrapper[4895]: I1202 08:59:22.582149 4895 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e6bdac4-cbc7-4a33-a2f8-95f346673ee1-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 08:59:22 crc kubenswrapper[4895]: I1202 08:59:22.582159 
4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e6bdac4-cbc7-4a33-a2f8-95f346673ee1-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 08:59:22 crc kubenswrapper[4895]: I1202 08:59:22.582167 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e6bdac4-cbc7-4a33-a2f8-95f346673ee1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 08:59:22 crc kubenswrapper[4895]: I1202 08:59:22.582176 4895 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2e6bdac4-cbc7-4a33-a2f8-95f346673ee1-logs\") on node \"crc\" DevicePath \"\"" Dec 02 08:59:22 crc kubenswrapper[4895]: I1202 08:59:22.582184 4895 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2e6bdac4-cbc7-4a33-a2f8-95f346673ee1-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 02 08:59:22 crc kubenswrapper[4895]: I1202 08:59:22.857616 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 02 08:59:22 crc kubenswrapper[4895]: I1202 08:59:22.879565 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Dec 02 08:59:22 crc kubenswrapper[4895]: I1202 08:59:22.896230 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 02 08:59:22 crc kubenswrapper[4895]: E1202 08:59:22.897350 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e6bdac4-cbc7-4a33-a2f8-95f346673ee1" containerName="cinder-api" Dec 02 08:59:22 crc kubenswrapper[4895]: I1202 08:59:22.897375 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e6bdac4-cbc7-4a33-a2f8-95f346673ee1" containerName="cinder-api" Dec 02 08:59:22 crc kubenswrapper[4895]: E1202 08:59:22.897394 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e6bdac4-cbc7-4a33-a2f8-95f346673ee1" 
containerName="cinder-api-log" Dec 02 08:59:22 crc kubenswrapper[4895]: I1202 08:59:22.897402 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e6bdac4-cbc7-4a33-a2f8-95f346673ee1" containerName="cinder-api-log" Dec 02 08:59:22 crc kubenswrapper[4895]: I1202 08:59:22.897632 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e6bdac4-cbc7-4a33-a2f8-95f346673ee1" containerName="cinder-api-log" Dec 02 08:59:22 crc kubenswrapper[4895]: I1202 08:59:22.897648 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e6bdac4-cbc7-4a33-a2f8-95f346673ee1" containerName="cinder-api" Dec 02 08:59:22 crc kubenswrapper[4895]: I1202 08:59:22.900393 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 02 08:59:22 crc kubenswrapper[4895]: I1202 08:59:22.908057 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 02 08:59:22 crc kubenswrapper[4895]: I1202 08:59:22.920081 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 02 08:59:22 crc kubenswrapper[4895]: I1202 08:59:22.985282 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 02 08:59:22 crc kubenswrapper[4895]: I1202 08:59:22.986814 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 02 08:59:22 crc kubenswrapper[4895]: I1202 08:59:22.987229 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 02 08:59:22 crc kubenswrapper[4895]: I1202 08:59:22.989660 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 02 08:59:22 crc kubenswrapper[4895]: I1202 08:59:22.993009 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/0fd9be26-fa07-4fd7-8723-f7e4121680d1-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"0fd9be26-fa07-4fd7-8723-f7e4121680d1\") " pod="openstack/cinder-api-0" Dec 02 08:59:22 crc kubenswrapper[4895]: I1202 08:59:22.993090 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0fd9be26-fa07-4fd7-8723-f7e4121680d1-logs\") pod \"cinder-api-0\" (UID: \"0fd9be26-fa07-4fd7-8723-f7e4121680d1\") " pod="openstack/cinder-api-0" Dec 02 08:59:22 crc kubenswrapper[4895]: I1202 08:59:22.993118 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0fd9be26-fa07-4fd7-8723-f7e4121680d1-config-data-custom\") pod \"cinder-api-0\" (UID: \"0fd9be26-fa07-4fd7-8723-f7e4121680d1\") " pod="openstack/cinder-api-0" Dec 02 08:59:22 crc kubenswrapper[4895]: I1202 08:59:22.993182 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0fd9be26-fa07-4fd7-8723-f7e4121680d1-etc-machine-id\") pod \"cinder-api-0\" (UID: \"0fd9be26-fa07-4fd7-8723-f7e4121680d1\") " pod="openstack/cinder-api-0" Dec 02 08:59:22 crc kubenswrapper[4895]: I1202 08:59:22.993233 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0fd9be26-fa07-4fd7-8723-f7e4121680d1-scripts\") pod \"cinder-api-0\" (UID: \"0fd9be26-fa07-4fd7-8723-f7e4121680d1\") " pod="openstack/cinder-api-0" Dec 02 08:59:22 crc kubenswrapper[4895]: I1202 08:59:22.993262 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-667bh\" (UniqueName: \"kubernetes.io/projected/0fd9be26-fa07-4fd7-8723-f7e4121680d1-kube-api-access-667bh\") pod \"cinder-api-0\" (UID: 
\"0fd9be26-fa07-4fd7-8723-f7e4121680d1\") " pod="openstack/cinder-api-0" Dec 02 08:59:22 crc kubenswrapper[4895]: I1202 08:59:22.993297 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0fd9be26-fa07-4fd7-8723-f7e4121680d1-config-data\") pod \"cinder-api-0\" (UID: \"0fd9be26-fa07-4fd7-8723-f7e4121680d1\") " pod="openstack/cinder-api-0" Dec 02 08:59:23 crc kubenswrapper[4895]: I1202 08:59:23.094806 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fd9be26-fa07-4fd7-8723-f7e4121680d1-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"0fd9be26-fa07-4fd7-8723-f7e4121680d1\") " pod="openstack/cinder-api-0" Dec 02 08:59:23 crc kubenswrapper[4895]: I1202 08:59:23.094927 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0fd9be26-fa07-4fd7-8723-f7e4121680d1-logs\") pod \"cinder-api-0\" (UID: \"0fd9be26-fa07-4fd7-8723-f7e4121680d1\") " pod="openstack/cinder-api-0" Dec 02 08:59:23 crc kubenswrapper[4895]: I1202 08:59:23.094954 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0fd9be26-fa07-4fd7-8723-f7e4121680d1-config-data-custom\") pod \"cinder-api-0\" (UID: \"0fd9be26-fa07-4fd7-8723-f7e4121680d1\") " pod="openstack/cinder-api-0" Dec 02 08:59:23 crc kubenswrapper[4895]: I1202 08:59:23.095037 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0fd9be26-fa07-4fd7-8723-f7e4121680d1-etc-machine-id\") pod \"cinder-api-0\" (UID: \"0fd9be26-fa07-4fd7-8723-f7e4121680d1\") " pod="openstack/cinder-api-0" Dec 02 08:59:23 crc kubenswrapper[4895]: I1202 08:59:23.095170 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0fd9be26-fa07-4fd7-8723-f7e4121680d1-etc-machine-id\") pod \"cinder-api-0\" (UID: \"0fd9be26-fa07-4fd7-8723-f7e4121680d1\") " pod="openstack/cinder-api-0" Dec 02 08:59:23 crc kubenswrapper[4895]: I1202 08:59:23.095092 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0fd9be26-fa07-4fd7-8723-f7e4121680d1-scripts\") pod \"cinder-api-0\" (UID: \"0fd9be26-fa07-4fd7-8723-f7e4121680d1\") " pod="openstack/cinder-api-0" Dec 02 08:59:23 crc kubenswrapper[4895]: I1202 08:59:23.095235 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-667bh\" (UniqueName: \"kubernetes.io/projected/0fd9be26-fa07-4fd7-8723-f7e4121680d1-kube-api-access-667bh\") pod \"cinder-api-0\" (UID: \"0fd9be26-fa07-4fd7-8723-f7e4121680d1\") " pod="openstack/cinder-api-0" Dec 02 08:59:23 crc kubenswrapper[4895]: I1202 08:59:23.095559 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0fd9be26-fa07-4fd7-8723-f7e4121680d1-config-data\") pod \"cinder-api-0\" (UID: \"0fd9be26-fa07-4fd7-8723-f7e4121680d1\") " pod="openstack/cinder-api-0" Dec 02 08:59:23 crc kubenswrapper[4895]: I1202 08:59:23.095905 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0fd9be26-fa07-4fd7-8723-f7e4121680d1-logs\") pod \"cinder-api-0\" (UID: \"0fd9be26-fa07-4fd7-8723-f7e4121680d1\") " pod="openstack/cinder-api-0" Dec 02 08:59:23 crc kubenswrapper[4895]: I1202 08:59:23.101456 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0fd9be26-fa07-4fd7-8723-f7e4121680d1-config-data-custom\") pod \"cinder-api-0\" (UID: \"0fd9be26-fa07-4fd7-8723-f7e4121680d1\") " pod="openstack/cinder-api-0" Dec 02 08:59:23 crc kubenswrapper[4895]: 
I1202 08:59:23.103776 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fd9be26-fa07-4fd7-8723-f7e4121680d1-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"0fd9be26-fa07-4fd7-8723-f7e4121680d1\") " pod="openstack/cinder-api-0" Dec 02 08:59:23 crc kubenswrapper[4895]: I1202 08:59:23.104810 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0fd9be26-fa07-4fd7-8723-f7e4121680d1-config-data\") pod \"cinder-api-0\" (UID: \"0fd9be26-fa07-4fd7-8723-f7e4121680d1\") " pod="openstack/cinder-api-0" Dec 02 08:59:23 crc kubenswrapper[4895]: I1202 08:59:23.107165 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0fd9be26-fa07-4fd7-8723-f7e4121680d1-scripts\") pod \"cinder-api-0\" (UID: \"0fd9be26-fa07-4fd7-8723-f7e4121680d1\") " pod="openstack/cinder-api-0" Dec 02 08:59:23 crc kubenswrapper[4895]: I1202 08:59:23.112124 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-667bh\" (UniqueName: \"kubernetes.io/projected/0fd9be26-fa07-4fd7-8723-f7e4121680d1-kube-api-access-667bh\") pod \"cinder-api-0\" (UID: \"0fd9be26-fa07-4fd7-8723-f7e4121680d1\") " pod="openstack/cinder-api-0" Dec 02 08:59:23 crc kubenswrapper[4895]: I1202 08:59:23.157453 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e6bdac4-cbc7-4a33-a2f8-95f346673ee1" path="/var/lib/kubelet/pods/2e6bdac4-cbc7-4a33-a2f8-95f346673ee1/volumes" Dec 02 08:59:23 crc kubenswrapper[4895]: I1202 08:59:23.237361 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 02 08:59:23 crc kubenswrapper[4895]: I1202 08:59:23.481121 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"0d4fda23-3c3c-435b-860c-0973feb1e664","Type":"ContainerStarted","Data":"407c51c7279a803f1c2b723068b77876937fe361b56c1ea7d92a9132106154f0"} Dec 02 08:59:23 crc kubenswrapper[4895]: I1202 08:59:23.482227 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 02 08:59:23 crc kubenswrapper[4895]: I1202 08:59:23.490013 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 02 08:59:23 crc kubenswrapper[4895]: I1202 08:59:23.515972 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-backup-0" podStartSLOduration=3.156219312 podStartE2EDuration="4.515935163s" podCreationTimestamp="2025-12-02 08:59:19 +0000 UTC" firstStartedPulling="2025-12-02 08:59:20.798819364 +0000 UTC m=+5771.969678977" lastFinishedPulling="2025-12-02 08:59:22.158535215 +0000 UTC m=+5773.329394828" observedRunningTime="2025-12-02 08:59:23.508775811 +0000 UTC m=+5774.679635444" watchObservedRunningTime="2025-12-02 08:59:23.515935163 +0000 UTC m=+5774.686794796" Dec 02 08:59:23 crc kubenswrapper[4895]: I1202 08:59:23.735912 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 02 08:59:23 crc kubenswrapper[4895]: I1202 08:59:23.735990 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 02 08:59:23 crc kubenswrapper[4895]: I1202 08:59:23.736956 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 02 08:59:23 crc kubenswrapper[4895]: I1202 08:59:23.747056 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 02 08:59:23 crc kubenswrapper[4895]: I1202 
08:59:23.747158 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 02 08:59:24 crc kubenswrapper[4895]: I1202 08:59:24.495914 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"0fd9be26-fa07-4fd7-8723-f7e4121680d1","Type":"ContainerStarted","Data":"5066f737c437a510c8a1d1ff4dd9a39145f636ee8e37b0092e254026727d8839"} Dec 02 08:59:24 crc kubenswrapper[4895]: I1202 08:59:24.496594 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"0fd9be26-fa07-4fd7-8723-f7e4121680d1","Type":"ContainerStarted","Data":"ab549f4b9df2bc8d39cef0311fac6eff2a590e4f7b88fe8f13ff35dd77128a8f"} Dec 02 08:59:24 crc kubenswrapper[4895]: I1202 08:59:24.602030 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-volume-volume1-0" Dec 02 08:59:25 crc kubenswrapper[4895]: I1202 08:59:25.142005 4895 scope.go:117] "RemoveContainer" containerID="d95d9d738930381cc247f4992573f82ec7c89226f23be0d771d1a3b1b3cf1812" Dec 02 08:59:25 crc kubenswrapper[4895]: E1202 08:59:25.142696 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 08:59:25 crc kubenswrapper[4895]: I1202 08:59:25.206760 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-backup-0" Dec 02 08:59:25 crc kubenswrapper[4895]: I1202 08:59:25.505836 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"0fd9be26-fa07-4fd7-8723-f7e4121680d1","Type":"ContainerStarted","Data":"6c34abc7a867cfb86c57415a2c1aa860b8833f459dacd6cf6338dc7127910cc4"} Dec 02 08:59:25 crc kubenswrapper[4895]: I1202 08:59:25.539219 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.539201718 podStartE2EDuration="3.539201718s" podCreationTimestamp="2025-12-02 08:59:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:59:25.5299569 +0000 UTC m=+5776.700816523" watchObservedRunningTime="2025-12-02 08:59:25.539201718 +0000 UTC m=+5776.710061321" Dec 02 08:59:26 crc kubenswrapper[4895]: I1202 08:59:26.516212 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 02 08:59:27 crc kubenswrapper[4895]: I1202 08:59:27.747660 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 02 08:59:27 crc kubenswrapper[4895]: I1202 08:59:27.804714 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 02 08:59:28 crc kubenswrapper[4895]: I1202 08:59:28.545044 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="95577834-67b9-4194-9e7b-6377d2f6b603" containerName="cinder-scheduler" containerID="cri-o://b35b003b563c5a94050979dce6ad29ded8b3ea3c3aa4e959f6fb346680e7099c" gracePeriod=30 Dec 02 08:59:28 crc kubenswrapper[4895]: I1202 08:59:28.545249 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="95577834-67b9-4194-9e7b-6377d2f6b603" containerName="probe" containerID="cri-o://b103158c3e9569ee13031ea0f1a10cdbcd12eb0e60453e7185d5a5c86af6e4ee" gracePeriod=30 Dec 02 08:59:29 crc kubenswrapper[4895]: I1202 08:59:29.514659 4895 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 02 08:59:29 crc kubenswrapper[4895]: I1202 08:59:29.559456 4895 generic.go:334] "Generic (PLEG): container finished" podID="95577834-67b9-4194-9e7b-6377d2f6b603" containerID="b103158c3e9569ee13031ea0f1a10cdbcd12eb0e60453e7185d5a5c86af6e4ee" exitCode=0 Dec 02 08:59:29 crc kubenswrapper[4895]: I1202 08:59:29.559490 4895 generic.go:334] "Generic (PLEG): container finished" podID="95577834-67b9-4194-9e7b-6377d2f6b603" containerID="b35b003b563c5a94050979dce6ad29ded8b3ea3c3aa4e959f6fb346680e7099c" exitCode=0 Dec 02 08:59:29 crc kubenswrapper[4895]: I1202 08:59:29.559514 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"95577834-67b9-4194-9e7b-6377d2f6b603","Type":"ContainerDied","Data":"b103158c3e9569ee13031ea0f1a10cdbcd12eb0e60453e7185d5a5c86af6e4ee"} Dec 02 08:59:29 crc kubenswrapper[4895]: I1202 08:59:29.559543 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"95577834-67b9-4194-9e7b-6377d2f6b603","Type":"ContainerDied","Data":"b35b003b563c5a94050979dce6ad29ded8b3ea3c3aa4e959f6fb346680e7099c"} Dec 02 08:59:29 crc kubenswrapper[4895]: I1202 08:59:29.559554 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"95577834-67b9-4194-9e7b-6377d2f6b603","Type":"ContainerDied","Data":"ef6392082e4a3b14e74ad59288851a91115f0b9fcd6a10e379cf27bcddd093e3"} Dec 02 08:59:29 crc kubenswrapper[4895]: I1202 08:59:29.559571 4895 scope.go:117] "RemoveContainer" containerID="b103158c3e9569ee13031ea0f1a10cdbcd12eb0e60453e7185d5a5c86af6e4ee" Dec 02 08:59:29 crc kubenswrapper[4895]: I1202 08:59:29.559583 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 02 08:59:29 crc kubenswrapper[4895]: I1202 08:59:29.593896 4895 scope.go:117] "RemoveContainer" containerID="b35b003b563c5a94050979dce6ad29ded8b3ea3c3aa4e959f6fb346680e7099c" Dec 02 08:59:29 crc kubenswrapper[4895]: I1202 08:59:29.617041 4895 scope.go:117] "RemoveContainer" containerID="b103158c3e9569ee13031ea0f1a10cdbcd12eb0e60453e7185d5a5c86af6e4ee" Dec 02 08:59:29 crc kubenswrapper[4895]: E1202 08:59:29.617620 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b103158c3e9569ee13031ea0f1a10cdbcd12eb0e60453e7185d5a5c86af6e4ee\": container with ID starting with b103158c3e9569ee13031ea0f1a10cdbcd12eb0e60453e7185d5a5c86af6e4ee not found: ID does not exist" containerID="b103158c3e9569ee13031ea0f1a10cdbcd12eb0e60453e7185d5a5c86af6e4ee" Dec 02 08:59:29 crc kubenswrapper[4895]: I1202 08:59:29.617685 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b103158c3e9569ee13031ea0f1a10cdbcd12eb0e60453e7185d5a5c86af6e4ee"} err="failed to get container status \"b103158c3e9569ee13031ea0f1a10cdbcd12eb0e60453e7185d5a5c86af6e4ee\": rpc error: code = NotFound desc = could not find container \"b103158c3e9569ee13031ea0f1a10cdbcd12eb0e60453e7185d5a5c86af6e4ee\": container with ID starting with b103158c3e9569ee13031ea0f1a10cdbcd12eb0e60453e7185d5a5c86af6e4ee not found: ID does not exist" Dec 02 08:59:29 crc kubenswrapper[4895]: I1202 08:59:29.617730 4895 scope.go:117] "RemoveContainer" containerID="b35b003b563c5a94050979dce6ad29ded8b3ea3c3aa4e959f6fb346680e7099c" Dec 02 08:59:29 crc kubenswrapper[4895]: E1202 08:59:29.618475 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b35b003b563c5a94050979dce6ad29ded8b3ea3c3aa4e959f6fb346680e7099c\": container with ID starting with 
b35b003b563c5a94050979dce6ad29ded8b3ea3c3aa4e959f6fb346680e7099c not found: ID does not exist" containerID="b35b003b563c5a94050979dce6ad29ded8b3ea3c3aa4e959f6fb346680e7099c" Dec 02 08:59:29 crc kubenswrapper[4895]: I1202 08:59:29.618523 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b35b003b563c5a94050979dce6ad29ded8b3ea3c3aa4e959f6fb346680e7099c"} err="failed to get container status \"b35b003b563c5a94050979dce6ad29ded8b3ea3c3aa4e959f6fb346680e7099c\": rpc error: code = NotFound desc = could not find container \"b35b003b563c5a94050979dce6ad29ded8b3ea3c3aa4e959f6fb346680e7099c\": container with ID starting with b35b003b563c5a94050979dce6ad29ded8b3ea3c3aa4e959f6fb346680e7099c not found: ID does not exist" Dec 02 08:59:29 crc kubenswrapper[4895]: I1202 08:59:29.618570 4895 scope.go:117] "RemoveContainer" containerID="b103158c3e9569ee13031ea0f1a10cdbcd12eb0e60453e7185d5a5c86af6e4ee" Dec 02 08:59:29 crc kubenswrapper[4895]: I1202 08:59:29.619086 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b103158c3e9569ee13031ea0f1a10cdbcd12eb0e60453e7185d5a5c86af6e4ee"} err="failed to get container status \"b103158c3e9569ee13031ea0f1a10cdbcd12eb0e60453e7185d5a5c86af6e4ee\": rpc error: code = NotFound desc = could not find container \"b103158c3e9569ee13031ea0f1a10cdbcd12eb0e60453e7185d5a5c86af6e4ee\": container with ID starting with b103158c3e9569ee13031ea0f1a10cdbcd12eb0e60453e7185d5a5c86af6e4ee not found: ID does not exist" Dec 02 08:59:29 crc kubenswrapper[4895]: I1202 08:59:29.619153 4895 scope.go:117] "RemoveContainer" containerID="b35b003b563c5a94050979dce6ad29ded8b3ea3c3aa4e959f6fb346680e7099c" Dec 02 08:59:29 crc kubenswrapper[4895]: I1202 08:59:29.619661 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b35b003b563c5a94050979dce6ad29ded8b3ea3c3aa4e959f6fb346680e7099c"} err="failed to get container status 
\"b35b003b563c5a94050979dce6ad29ded8b3ea3c3aa4e959f6fb346680e7099c\": rpc error: code = NotFound desc = could not find container \"b35b003b563c5a94050979dce6ad29ded8b3ea3c3aa4e959f6fb346680e7099c\": container with ID starting with b35b003b563c5a94050979dce6ad29ded8b3ea3c3aa4e959f6fb346680e7099c not found: ID does not exist" Dec 02 08:59:29 crc kubenswrapper[4895]: I1202 08:59:29.642267 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95577834-67b9-4194-9e7b-6377d2f6b603-combined-ca-bundle\") pod \"95577834-67b9-4194-9e7b-6377d2f6b603\" (UID: \"95577834-67b9-4194-9e7b-6377d2f6b603\") " Dec 02 08:59:29 crc kubenswrapper[4895]: I1202 08:59:29.642357 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/95577834-67b9-4194-9e7b-6377d2f6b603-config-data-custom\") pod \"95577834-67b9-4194-9e7b-6377d2f6b603\" (UID: \"95577834-67b9-4194-9e7b-6377d2f6b603\") " Dec 02 08:59:29 crc kubenswrapper[4895]: I1202 08:59:29.642403 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qqjft\" (UniqueName: \"kubernetes.io/projected/95577834-67b9-4194-9e7b-6377d2f6b603-kube-api-access-qqjft\") pod \"95577834-67b9-4194-9e7b-6377d2f6b603\" (UID: \"95577834-67b9-4194-9e7b-6377d2f6b603\") " Dec 02 08:59:29 crc kubenswrapper[4895]: I1202 08:59:29.642505 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/95577834-67b9-4194-9e7b-6377d2f6b603-etc-machine-id\") pod \"95577834-67b9-4194-9e7b-6377d2f6b603\" (UID: \"95577834-67b9-4194-9e7b-6377d2f6b603\") " Dec 02 08:59:29 crc kubenswrapper[4895]: I1202 08:59:29.642606 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/95577834-67b9-4194-9e7b-6377d2f6b603-config-data\") pod \"95577834-67b9-4194-9e7b-6377d2f6b603\" (UID: \"95577834-67b9-4194-9e7b-6377d2f6b603\") " Dec 02 08:59:29 crc kubenswrapper[4895]: I1202 08:59:29.642651 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95577834-67b9-4194-9e7b-6377d2f6b603-scripts\") pod \"95577834-67b9-4194-9e7b-6377d2f6b603\" (UID: \"95577834-67b9-4194-9e7b-6377d2f6b603\") " Dec 02 08:59:29 crc kubenswrapper[4895]: I1202 08:59:29.642649 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/95577834-67b9-4194-9e7b-6377d2f6b603-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "95577834-67b9-4194-9e7b-6377d2f6b603" (UID: "95577834-67b9-4194-9e7b-6377d2f6b603"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 08:59:29 crc kubenswrapper[4895]: I1202 08:59:29.643085 4895 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/95577834-67b9-4194-9e7b-6377d2f6b603-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 02 08:59:29 crc kubenswrapper[4895]: I1202 08:59:29.659002 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95577834-67b9-4194-9e7b-6377d2f6b603-scripts" (OuterVolumeSpecName: "scripts") pod "95577834-67b9-4194-9e7b-6377d2f6b603" (UID: "95577834-67b9-4194-9e7b-6377d2f6b603"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:59:29 crc kubenswrapper[4895]: I1202 08:59:29.660837 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95577834-67b9-4194-9e7b-6377d2f6b603-kube-api-access-qqjft" (OuterVolumeSpecName: "kube-api-access-qqjft") pod "95577834-67b9-4194-9e7b-6377d2f6b603" (UID: "95577834-67b9-4194-9e7b-6377d2f6b603"). 
InnerVolumeSpecName "kube-api-access-qqjft". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:59:29 crc kubenswrapper[4895]: I1202 08:59:29.666018 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95577834-67b9-4194-9e7b-6377d2f6b603-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "95577834-67b9-4194-9e7b-6377d2f6b603" (UID: "95577834-67b9-4194-9e7b-6377d2f6b603"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:59:29 crc kubenswrapper[4895]: I1202 08:59:29.695100 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95577834-67b9-4194-9e7b-6377d2f6b603-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "95577834-67b9-4194-9e7b-6377d2f6b603" (UID: "95577834-67b9-4194-9e7b-6377d2f6b603"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:59:29 crc kubenswrapper[4895]: I1202 08:59:29.745627 4895 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/95577834-67b9-4194-9e7b-6377d2f6b603-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 02 08:59:29 crc kubenswrapper[4895]: I1202 08:59:29.745671 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qqjft\" (UniqueName: \"kubernetes.io/projected/95577834-67b9-4194-9e7b-6377d2f6b603-kube-api-access-qqjft\") on node \"crc\" DevicePath \"\"" Dec 02 08:59:29 crc kubenswrapper[4895]: I1202 08:59:29.745684 4895 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95577834-67b9-4194-9e7b-6377d2f6b603-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 08:59:29 crc kubenswrapper[4895]: I1202 08:59:29.745694 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/95577834-67b9-4194-9e7b-6377d2f6b603-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 08:59:29 crc kubenswrapper[4895]: I1202 08:59:29.756141 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95577834-67b9-4194-9e7b-6377d2f6b603-config-data" (OuterVolumeSpecName: "config-data") pod "95577834-67b9-4194-9e7b-6377d2f6b603" (UID: "95577834-67b9-4194-9e7b-6377d2f6b603"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:59:29 crc kubenswrapper[4895]: I1202 08:59:29.848557 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95577834-67b9-4194-9e7b-6377d2f6b603-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 08:59:29 crc kubenswrapper[4895]: I1202 08:59:29.876148 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-volume-volume1-0" Dec 02 08:59:29 crc kubenswrapper[4895]: I1202 08:59:29.901092 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 02 08:59:29 crc kubenswrapper[4895]: I1202 08:59:29.913512 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 02 08:59:29 crc kubenswrapper[4895]: I1202 08:59:29.930275 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 02 08:59:29 crc kubenswrapper[4895]: E1202 08:59:29.930831 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95577834-67b9-4194-9e7b-6377d2f6b603" containerName="cinder-scheduler" Dec 02 08:59:29 crc kubenswrapper[4895]: I1202 08:59:29.930854 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="95577834-67b9-4194-9e7b-6377d2f6b603" containerName="cinder-scheduler" Dec 02 08:59:29 crc kubenswrapper[4895]: E1202 08:59:29.930873 4895 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="95577834-67b9-4194-9e7b-6377d2f6b603" containerName="probe" Dec 02 08:59:29 crc kubenswrapper[4895]: I1202 08:59:29.930882 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="95577834-67b9-4194-9e7b-6377d2f6b603" containerName="probe" Dec 02 08:59:29 crc kubenswrapper[4895]: I1202 08:59:29.931157 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="95577834-67b9-4194-9e7b-6377d2f6b603" containerName="cinder-scheduler" Dec 02 08:59:29 crc kubenswrapper[4895]: I1202 08:59:29.931188 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="95577834-67b9-4194-9e7b-6377d2f6b603" containerName="probe" Dec 02 08:59:29 crc kubenswrapper[4895]: I1202 08:59:29.932571 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 02 08:59:29 crc kubenswrapper[4895]: I1202 08:59:29.936761 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 02 08:59:29 crc kubenswrapper[4895]: I1202 08:59:29.949615 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 02 08:59:30 crc kubenswrapper[4895]: I1202 08:59:30.053333 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b78042eb-5d7c-4630-83e1-0f722cfde766-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"b78042eb-5d7c-4630-83e1-0f722cfde766\") " pod="openstack/cinder-scheduler-0" Dec 02 08:59:30 crc kubenswrapper[4895]: I1202 08:59:30.053387 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vv2l\" (UniqueName: \"kubernetes.io/projected/b78042eb-5d7c-4630-83e1-0f722cfde766-kube-api-access-6vv2l\") pod \"cinder-scheduler-0\" (UID: \"b78042eb-5d7c-4630-83e1-0f722cfde766\") " pod="openstack/cinder-scheduler-0" Dec 02 08:59:30 crc kubenswrapper[4895]: I1202 
08:59:30.053422 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b78042eb-5d7c-4630-83e1-0f722cfde766-scripts\") pod \"cinder-scheduler-0\" (UID: \"b78042eb-5d7c-4630-83e1-0f722cfde766\") " pod="openstack/cinder-scheduler-0" Dec 02 08:59:30 crc kubenswrapper[4895]: I1202 08:59:30.053496 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b78042eb-5d7c-4630-83e1-0f722cfde766-config-data\") pod \"cinder-scheduler-0\" (UID: \"b78042eb-5d7c-4630-83e1-0f722cfde766\") " pod="openstack/cinder-scheduler-0" Dec 02 08:59:30 crc kubenswrapper[4895]: I1202 08:59:30.053529 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b78042eb-5d7c-4630-83e1-0f722cfde766-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"b78042eb-5d7c-4630-83e1-0f722cfde766\") " pod="openstack/cinder-scheduler-0" Dec 02 08:59:30 crc kubenswrapper[4895]: I1202 08:59:30.053569 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b78042eb-5d7c-4630-83e1-0f722cfde766-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"b78042eb-5d7c-4630-83e1-0f722cfde766\") " pod="openstack/cinder-scheduler-0" Dec 02 08:59:30 crc kubenswrapper[4895]: I1202 08:59:30.156335 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b78042eb-5d7c-4630-83e1-0f722cfde766-scripts\") pod \"cinder-scheduler-0\" (UID: \"b78042eb-5d7c-4630-83e1-0f722cfde766\") " pod="openstack/cinder-scheduler-0" Dec 02 08:59:30 crc kubenswrapper[4895]: I1202 08:59:30.156430 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/b78042eb-5d7c-4630-83e1-0f722cfde766-config-data\") pod \"cinder-scheduler-0\" (UID: \"b78042eb-5d7c-4630-83e1-0f722cfde766\") " pod="openstack/cinder-scheduler-0" Dec 02 08:59:30 crc kubenswrapper[4895]: I1202 08:59:30.156463 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b78042eb-5d7c-4630-83e1-0f722cfde766-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"b78042eb-5d7c-4630-83e1-0f722cfde766\") " pod="openstack/cinder-scheduler-0" Dec 02 08:59:30 crc kubenswrapper[4895]: I1202 08:59:30.156512 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b78042eb-5d7c-4630-83e1-0f722cfde766-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"b78042eb-5d7c-4630-83e1-0f722cfde766\") " pod="openstack/cinder-scheduler-0" Dec 02 08:59:30 crc kubenswrapper[4895]: I1202 08:59:30.156616 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b78042eb-5d7c-4630-83e1-0f722cfde766-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"b78042eb-5d7c-4630-83e1-0f722cfde766\") " pod="openstack/cinder-scheduler-0" Dec 02 08:59:30 crc kubenswrapper[4895]: I1202 08:59:30.156638 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vv2l\" (UniqueName: \"kubernetes.io/projected/b78042eb-5d7c-4630-83e1-0f722cfde766-kube-api-access-6vv2l\") pod \"cinder-scheduler-0\" (UID: \"b78042eb-5d7c-4630-83e1-0f722cfde766\") " pod="openstack/cinder-scheduler-0" Dec 02 08:59:30 crc kubenswrapper[4895]: I1202 08:59:30.157956 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b78042eb-5d7c-4630-83e1-0f722cfde766-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: 
\"b78042eb-5d7c-4630-83e1-0f722cfde766\") " pod="openstack/cinder-scheduler-0" Dec 02 08:59:30 crc kubenswrapper[4895]: I1202 08:59:30.160053 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b78042eb-5d7c-4630-83e1-0f722cfde766-scripts\") pod \"cinder-scheduler-0\" (UID: \"b78042eb-5d7c-4630-83e1-0f722cfde766\") " pod="openstack/cinder-scheduler-0" Dec 02 08:59:30 crc kubenswrapper[4895]: I1202 08:59:30.161119 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b78042eb-5d7c-4630-83e1-0f722cfde766-config-data\") pod \"cinder-scheduler-0\" (UID: \"b78042eb-5d7c-4630-83e1-0f722cfde766\") " pod="openstack/cinder-scheduler-0" Dec 02 08:59:30 crc kubenswrapper[4895]: I1202 08:59:30.165254 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b78042eb-5d7c-4630-83e1-0f722cfde766-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"b78042eb-5d7c-4630-83e1-0f722cfde766\") " pod="openstack/cinder-scheduler-0" Dec 02 08:59:30 crc kubenswrapper[4895]: I1202 08:59:30.172513 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b78042eb-5d7c-4630-83e1-0f722cfde766-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"b78042eb-5d7c-4630-83e1-0f722cfde766\") " pod="openstack/cinder-scheduler-0" Dec 02 08:59:30 crc kubenswrapper[4895]: I1202 08:59:30.175614 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vv2l\" (UniqueName: \"kubernetes.io/projected/b78042eb-5d7c-4630-83e1-0f722cfde766-kube-api-access-6vv2l\") pod \"cinder-scheduler-0\" (UID: \"b78042eb-5d7c-4630-83e1-0f722cfde766\") " pod="openstack/cinder-scheduler-0" Dec 02 08:59:30 crc kubenswrapper[4895]: I1202 08:59:30.251957 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 02 08:59:30 crc kubenswrapper[4895]: I1202 08:59:30.480993 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-backup-0" Dec 02 08:59:30 crc kubenswrapper[4895]: I1202 08:59:30.790107 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 02 08:59:31 crc kubenswrapper[4895]: I1202 08:59:31.157418 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95577834-67b9-4194-9e7b-6377d2f6b603" path="/var/lib/kubelet/pods/95577834-67b9-4194-9e7b-6377d2f6b603/volumes" Dec 02 08:59:31 crc kubenswrapper[4895]: I1202 08:59:31.585345 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"b78042eb-5d7c-4630-83e1-0f722cfde766","Type":"ContainerStarted","Data":"fe067776219b7fc7098de54d6e4b0d60c735e2c6cc1f2f9831012c5e31a4fe91"} Dec 02 08:59:31 crc kubenswrapper[4895]: I1202 08:59:31.585781 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"b78042eb-5d7c-4630-83e1-0f722cfde766","Type":"ContainerStarted","Data":"4e97fd7039a8c833ae034556396107edfbe709ffbafef33e4d24e43e0e4c2d91"} Dec 02 08:59:32 crc kubenswrapper[4895]: I1202 08:59:32.596532 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"b78042eb-5d7c-4630-83e1-0f722cfde766","Type":"ContainerStarted","Data":"ae23b2f3261e9cfa1f3aac3b384b1bae046f9a5a7da9d732b8dc0530595addba"} Dec 02 08:59:32 crc kubenswrapper[4895]: I1202 08:59:32.622777 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.6227252070000002 podStartE2EDuration="3.622725207s" podCreationTimestamp="2025-12-02 08:59:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 
08:59:32.613502011 +0000 UTC m=+5783.784361624" watchObservedRunningTime="2025-12-02 08:59:32.622725207 +0000 UTC m=+5783.793584820" Dec 02 08:59:33 crc kubenswrapper[4895]: I1202 08:59:33.213248 4895 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod6ba0fc5d-edf7-4eac-b2a3-0a2fa14ea6db"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod6ba0fc5d-edf7-4eac-b2a3-0a2fa14ea6db] : Timed out while waiting for systemd to remove kubepods-besteffort-pod6ba0fc5d_edf7_4eac_b2a3_0a2fa14ea6db.slice" Dec 02 08:59:35 crc kubenswrapper[4895]: I1202 08:59:35.122941 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Dec 02 08:59:35 crc kubenswrapper[4895]: I1202 08:59:35.252696 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 02 08:59:36 crc kubenswrapper[4895]: I1202 08:59:36.141294 4895 scope.go:117] "RemoveContainer" containerID="d95d9d738930381cc247f4992573f82ec7c89226f23be0d771d1a3b1b3cf1812" Dec 02 08:59:36 crc kubenswrapper[4895]: E1202 08:59:36.141896 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 08:59:40 crc kubenswrapper[4895]: I1202 08:59:40.446295 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 02 08:59:51 crc kubenswrapper[4895]: I1202 08:59:51.140962 4895 scope.go:117] "RemoveContainer" containerID="d95d9d738930381cc247f4992573f82ec7c89226f23be0d771d1a3b1b3cf1812" Dec 02 08:59:51 crc kubenswrapper[4895]: E1202 08:59:51.141878 
4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:00:00 crc kubenswrapper[4895]: I1202 09:00:00.149027 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411100-qvr2w"] Dec 02 09:00:00 crc kubenswrapper[4895]: I1202 09:00:00.151239 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411100-qvr2w" Dec 02 09:00:00 crc kubenswrapper[4895]: I1202 09:00:00.153875 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 02 09:00:00 crc kubenswrapper[4895]: I1202 09:00:00.154180 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 02 09:00:00 crc kubenswrapper[4895]: I1202 09:00:00.157403 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411100-qvr2w"] Dec 02 09:00:00 crc kubenswrapper[4895]: I1202 09:00:00.276732 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2h6mb\" (UniqueName: \"kubernetes.io/projected/ac99d129-acbe-49c2-8d4f-964620886771-kube-api-access-2h6mb\") pod \"collect-profiles-29411100-qvr2w\" (UID: \"ac99d129-acbe-49c2-8d4f-964620886771\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411100-qvr2w" Dec 02 09:00:00 crc kubenswrapper[4895]: I1202 09:00:00.277565 4895 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ac99d129-acbe-49c2-8d4f-964620886771-secret-volume\") pod \"collect-profiles-29411100-qvr2w\" (UID: \"ac99d129-acbe-49c2-8d4f-964620886771\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411100-qvr2w" Dec 02 09:00:00 crc kubenswrapper[4895]: I1202 09:00:00.277705 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ac99d129-acbe-49c2-8d4f-964620886771-config-volume\") pod \"collect-profiles-29411100-qvr2w\" (UID: \"ac99d129-acbe-49c2-8d4f-964620886771\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411100-qvr2w" Dec 02 09:00:00 crc kubenswrapper[4895]: I1202 09:00:00.379507 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ac99d129-acbe-49c2-8d4f-964620886771-secret-volume\") pod \"collect-profiles-29411100-qvr2w\" (UID: \"ac99d129-acbe-49c2-8d4f-964620886771\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411100-qvr2w" Dec 02 09:00:00 crc kubenswrapper[4895]: I1202 09:00:00.379586 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ac99d129-acbe-49c2-8d4f-964620886771-config-volume\") pod \"collect-profiles-29411100-qvr2w\" (UID: \"ac99d129-acbe-49c2-8d4f-964620886771\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411100-qvr2w" Dec 02 09:00:00 crc kubenswrapper[4895]: I1202 09:00:00.379624 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2h6mb\" (UniqueName: \"kubernetes.io/projected/ac99d129-acbe-49c2-8d4f-964620886771-kube-api-access-2h6mb\") pod \"collect-profiles-29411100-qvr2w\" (UID: \"ac99d129-acbe-49c2-8d4f-964620886771\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29411100-qvr2w" Dec 02 09:00:00 crc kubenswrapper[4895]: I1202 09:00:00.381253 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ac99d129-acbe-49c2-8d4f-964620886771-config-volume\") pod \"collect-profiles-29411100-qvr2w\" (UID: \"ac99d129-acbe-49c2-8d4f-964620886771\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411100-qvr2w" Dec 02 09:00:00 crc kubenswrapper[4895]: I1202 09:00:00.392878 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ac99d129-acbe-49c2-8d4f-964620886771-secret-volume\") pod \"collect-profiles-29411100-qvr2w\" (UID: \"ac99d129-acbe-49c2-8d4f-964620886771\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411100-qvr2w" Dec 02 09:00:00 crc kubenswrapper[4895]: I1202 09:00:00.411844 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2h6mb\" (UniqueName: \"kubernetes.io/projected/ac99d129-acbe-49c2-8d4f-964620886771-kube-api-access-2h6mb\") pod \"collect-profiles-29411100-qvr2w\" (UID: \"ac99d129-acbe-49c2-8d4f-964620886771\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411100-qvr2w" Dec 02 09:00:00 crc kubenswrapper[4895]: I1202 09:00:00.479960 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411100-qvr2w" Dec 02 09:00:00 crc kubenswrapper[4895]: I1202 09:00:00.954693 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411100-qvr2w"] Dec 02 09:00:01 crc kubenswrapper[4895]: I1202 09:00:01.900648 4895 generic.go:334] "Generic (PLEG): container finished" podID="ac99d129-acbe-49c2-8d4f-964620886771" containerID="f5f5b477d01498157cb8d4547cb7c47971159e0b38a12aa11d8ffec5ec66a92c" exitCode=0 Dec 02 09:00:01 crc kubenswrapper[4895]: I1202 09:00:01.900715 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411100-qvr2w" event={"ID":"ac99d129-acbe-49c2-8d4f-964620886771","Type":"ContainerDied","Data":"f5f5b477d01498157cb8d4547cb7c47971159e0b38a12aa11d8ffec5ec66a92c"} Dec 02 09:00:01 crc kubenswrapper[4895]: I1202 09:00:01.901025 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411100-qvr2w" event={"ID":"ac99d129-acbe-49c2-8d4f-964620886771","Type":"ContainerStarted","Data":"84eaa1f425621d5fa9656a394764fd05a4d39ca86119736eb6bb911b37669de0"} Dec 02 09:00:03 crc kubenswrapper[4895]: I1202 09:00:03.141490 4895 scope.go:117] "RemoveContainer" containerID="d95d9d738930381cc247f4992573f82ec7c89226f23be0d771d1a3b1b3cf1812" Dec 02 09:00:03 crc kubenswrapper[4895]: E1202 09:00:03.141825 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:00:03 crc kubenswrapper[4895]: I1202 09:00:03.245930 4895 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411100-qvr2w" Dec 02 09:00:03 crc kubenswrapper[4895]: I1202 09:00:03.332502 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2h6mb\" (UniqueName: \"kubernetes.io/projected/ac99d129-acbe-49c2-8d4f-964620886771-kube-api-access-2h6mb\") pod \"ac99d129-acbe-49c2-8d4f-964620886771\" (UID: \"ac99d129-acbe-49c2-8d4f-964620886771\") " Dec 02 09:00:03 crc kubenswrapper[4895]: I1202 09:00:03.332576 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ac99d129-acbe-49c2-8d4f-964620886771-secret-volume\") pod \"ac99d129-acbe-49c2-8d4f-964620886771\" (UID: \"ac99d129-acbe-49c2-8d4f-964620886771\") " Dec 02 09:00:03 crc kubenswrapper[4895]: I1202 09:00:03.332620 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ac99d129-acbe-49c2-8d4f-964620886771-config-volume\") pod \"ac99d129-acbe-49c2-8d4f-964620886771\" (UID: \"ac99d129-acbe-49c2-8d4f-964620886771\") " Dec 02 09:00:03 crc kubenswrapper[4895]: I1202 09:00:03.333604 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac99d129-acbe-49c2-8d4f-964620886771-config-volume" (OuterVolumeSpecName: "config-volume") pod "ac99d129-acbe-49c2-8d4f-964620886771" (UID: "ac99d129-acbe-49c2-8d4f-964620886771"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:00:03 crc kubenswrapper[4895]: I1202 09:00:03.338352 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac99d129-acbe-49c2-8d4f-964620886771-kube-api-access-2h6mb" (OuterVolumeSpecName: "kube-api-access-2h6mb") pod "ac99d129-acbe-49c2-8d4f-964620886771" (UID: "ac99d129-acbe-49c2-8d4f-964620886771"). 
InnerVolumeSpecName "kube-api-access-2h6mb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:00:03 crc kubenswrapper[4895]: I1202 09:00:03.338425 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac99d129-acbe-49c2-8d4f-964620886771-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "ac99d129-acbe-49c2-8d4f-964620886771" (UID: "ac99d129-acbe-49c2-8d4f-964620886771"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:00:03 crc kubenswrapper[4895]: I1202 09:00:03.435032 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2h6mb\" (UniqueName: \"kubernetes.io/projected/ac99d129-acbe-49c2-8d4f-964620886771-kube-api-access-2h6mb\") on node \"crc\" DevicePath \"\"" Dec 02 09:00:03 crc kubenswrapper[4895]: I1202 09:00:03.435075 4895 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ac99d129-acbe-49c2-8d4f-964620886771-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 02 09:00:03 crc kubenswrapper[4895]: I1202 09:00:03.435096 4895 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ac99d129-acbe-49c2-8d4f-964620886771-config-volume\") on node \"crc\" DevicePath \"\"" Dec 02 09:00:03 crc kubenswrapper[4895]: I1202 09:00:03.922052 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411100-qvr2w" event={"ID":"ac99d129-acbe-49c2-8d4f-964620886771","Type":"ContainerDied","Data":"84eaa1f425621d5fa9656a394764fd05a4d39ca86119736eb6bb911b37669de0"} Dec 02 09:00:03 crc kubenswrapper[4895]: I1202 09:00:03.922120 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411100-qvr2w" Dec 02 09:00:03 crc kubenswrapper[4895]: I1202 09:00:03.922129 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="84eaa1f425621d5fa9656a394764fd05a4d39ca86119736eb6bb911b37669de0" Dec 02 09:00:04 crc kubenswrapper[4895]: I1202 09:00:04.334007 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411055-sf524"] Dec 02 09:00:04 crc kubenswrapper[4895]: I1202 09:00:04.341549 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411055-sf524"] Dec 02 09:00:05 crc kubenswrapper[4895]: I1202 09:00:05.153104 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d428c26-837d-4f01-ba20-08bee0b90363" path="/var/lib/kubelet/pods/7d428c26-837d-4f01-ba20-08bee0b90363/volumes" Dec 02 09:00:18 crc kubenswrapper[4895]: I1202 09:00:18.141601 4895 scope.go:117] "RemoveContainer" containerID="d95d9d738930381cc247f4992573f82ec7c89226f23be0d771d1a3b1b3cf1812" Dec 02 09:00:18 crc kubenswrapper[4895]: E1202 09:00:18.142982 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:00:29 crc kubenswrapper[4895]: I1202 09:00:29.149463 4895 scope.go:117] "RemoveContainer" containerID="d95d9d738930381cc247f4992573f82ec7c89226f23be0d771d1a3b1b3cf1812" Dec 02 09:00:29 crc kubenswrapper[4895]: E1202 09:00:29.150534 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:00:30 crc kubenswrapper[4895]: I1202 09:00:30.408831 4895 scope.go:117] "RemoveContainer" containerID="d300717031a0462a0984129bb75e3958ec0bb4646df2606fb9028a0d706051b5" Dec 02 09:00:43 crc kubenswrapper[4895]: I1202 09:00:43.281950 4895 scope.go:117] "RemoveContainer" containerID="d95d9d738930381cc247f4992573f82ec7c89226f23be0d771d1a3b1b3cf1812" Dec 02 09:00:43 crc kubenswrapper[4895]: E1202 09:00:43.283977 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:00:58 crc kubenswrapper[4895]: I1202 09:00:58.143110 4895 scope.go:117] "RemoveContainer" containerID="d95d9d738930381cc247f4992573f82ec7c89226f23be0d771d1a3b1b3cf1812" Dec 02 09:00:58 crc kubenswrapper[4895]: E1202 09:00:58.144395 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:01:00 crc kubenswrapper[4895]: I1202 09:01:00.151056 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29411101-7q6kv"] Dec 02 
09:01:00 crc kubenswrapper[4895]: E1202 09:01:00.151871 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac99d129-acbe-49c2-8d4f-964620886771" containerName="collect-profiles" Dec 02 09:01:00 crc kubenswrapper[4895]: I1202 09:01:00.151890 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac99d129-acbe-49c2-8d4f-964620886771" containerName="collect-profiles" Dec 02 09:01:00 crc kubenswrapper[4895]: I1202 09:01:00.152119 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac99d129-acbe-49c2-8d4f-964620886771" containerName="collect-profiles" Dec 02 09:01:00 crc kubenswrapper[4895]: I1202 09:01:00.152977 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29411101-7q6kv" Dec 02 09:01:00 crc kubenswrapper[4895]: I1202 09:01:00.169092 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29411101-7q6kv"] Dec 02 09:01:00 crc kubenswrapper[4895]: I1202 09:01:00.330453 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7148462c-5f1b-4d1f-a161-5ffbf7963add-fernet-keys\") pod \"keystone-cron-29411101-7q6kv\" (UID: \"7148462c-5f1b-4d1f-a161-5ffbf7963add\") " pod="openstack/keystone-cron-29411101-7q6kv" Dec 02 09:01:00 crc kubenswrapper[4895]: I1202 09:01:00.330873 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfqms\" (UniqueName: \"kubernetes.io/projected/7148462c-5f1b-4d1f-a161-5ffbf7963add-kube-api-access-gfqms\") pod \"keystone-cron-29411101-7q6kv\" (UID: \"7148462c-5f1b-4d1f-a161-5ffbf7963add\") " pod="openstack/keystone-cron-29411101-7q6kv" Dec 02 09:01:00 crc kubenswrapper[4895]: I1202 09:01:00.331007 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/7148462c-5f1b-4d1f-a161-5ffbf7963add-combined-ca-bundle\") pod \"keystone-cron-29411101-7q6kv\" (UID: \"7148462c-5f1b-4d1f-a161-5ffbf7963add\") " pod="openstack/keystone-cron-29411101-7q6kv" Dec 02 09:01:00 crc kubenswrapper[4895]: I1202 09:01:00.331176 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7148462c-5f1b-4d1f-a161-5ffbf7963add-config-data\") pod \"keystone-cron-29411101-7q6kv\" (UID: \"7148462c-5f1b-4d1f-a161-5ffbf7963add\") " pod="openstack/keystone-cron-29411101-7q6kv" Dec 02 09:01:00 crc kubenswrapper[4895]: I1202 09:01:00.433195 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7148462c-5f1b-4d1f-a161-5ffbf7963add-combined-ca-bundle\") pod \"keystone-cron-29411101-7q6kv\" (UID: \"7148462c-5f1b-4d1f-a161-5ffbf7963add\") " pod="openstack/keystone-cron-29411101-7q6kv" Dec 02 09:01:00 crc kubenswrapper[4895]: I1202 09:01:00.433266 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7148462c-5f1b-4d1f-a161-5ffbf7963add-config-data\") pod \"keystone-cron-29411101-7q6kv\" (UID: \"7148462c-5f1b-4d1f-a161-5ffbf7963add\") " pod="openstack/keystone-cron-29411101-7q6kv" Dec 02 09:01:00 crc kubenswrapper[4895]: I1202 09:01:00.433476 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7148462c-5f1b-4d1f-a161-5ffbf7963add-fernet-keys\") pod \"keystone-cron-29411101-7q6kv\" (UID: \"7148462c-5f1b-4d1f-a161-5ffbf7963add\") " pod="openstack/keystone-cron-29411101-7q6kv" Dec 02 09:01:00 crc kubenswrapper[4895]: I1202 09:01:00.433518 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gfqms\" (UniqueName: 
\"kubernetes.io/projected/7148462c-5f1b-4d1f-a161-5ffbf7963add-kube-api-access-gfqms\") pod \"keystone-cron-29411101-7q6kv\" (UID: \"7148462c-5f1b-4d1f-a161-5ffbf7963add\") " pod="openstack/keystone-cron-29411101-7q6kv" Dec 02 09:01:00 crc kubenswrapper[4895]: I1202 09:01:00.439920 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7148462c-5f1b-4d1f-a161-5ffbf7963add-fernet-keys\") pod \"keystone-cron-29411101-7q6kv\" (UID: \"7148462c-5f1b-4d1f-a161-5ffbf7963add\") " pod="openstack/keystone-cron-29411101-7q6kv" Dec 02 09:01:00 crc kubenswrapper[4895]: I1202 09:01:00.440766 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7148462c-5f1b-4d1f-a161-5ffbf7963add-combined-ca-bundle\") pod \"keystone-cron-29411101-7q6kv\" (UID: \"7148462c-5f1b-4d1f-a161-5ffbf7963add\") " pod="openstack/keystone-cron-29411101-7q6kv" Dec 02 09:01:00 crc kubenswrapper[4895]: I1202 09:01:00.445130 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7148462c-5f1b-4d1f-a161-5ffbf7963add-config-data\") pod \"keystone-cron-29411101-7q6kv\" (UID: \"7148462c-5f1b-4d1f-a161-5ffbf7963add\") " pod="openstack/keystone-cron-29411101-7q6kv" Dec 02 09:01:00 crc kubenswrapper[4895]: I1202 09:01:00.451061 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfqms\" (UniqueName: \"kubernetes.io/projected/7148462c-5f1b-4d1f-a161-5ffbf7963add-kube-api-access-gfqms\") pod \"keystone-cron-29411101-7q6kv\" (UID: \"7148462c-5f1b-4d1f-a161-5ffbf7963add\") " pod="openstack/keystone-cron-29411101-7q6kv" Dec 02 09:01:00 crc kubenswrapper[4895]: I1202 09:01:00.481232 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29411101-7q6kv" Dec 02 09:01:00 crc kubenswrapper[4895]: I1202 09:01:00.917623 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29411101-7q6kv"] Dec 02 09:01:01 crc kubenswrapper[4895]: I1202 09:01:01.478462 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29411101-7q6kv" event={"ID":"7148462c-5f1b-4d1f-a161-5ffbf7963add","Type":"ContainerStarted","Data":"a93c72db35163f808520ebd23a1a2afcd942930ba2c36bd9ef4203c8491e828e"} Dec 02 09:01:01 crc kubenswrapper[4895]: I1202 09:01:01.478946 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29411101-7q6kv" event={"ID":"7148462c-5f1b-4d1f-a161-5ffbf7963add","Type":"ContainerStarted","Data":"e0a3d2ff78bad0c8752f9389519bb748a67388cfae1d6d25c00d46f55f23d834"} Dec 02 09:01:01 crc kubenswrapper[4895]: I1202 09:01:01.499618 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29411101-7q6kv" podStartSLOduration=1.4995933049999999 podStartE2EDuration="1.499593305s" podCreationTimestamp="2025-12-02 09:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:01:01.492173423 +0000 UTC m=+5872.663033036" watchObservedRunningTime="2025-12-02 09:01:01.499593305 +0000 UTC m=+5872.670452918" Dec 02 09:01:03 crc kubenswrapper[4895]: I1202 09:01:03.503327 4895 generic.go:334] "Generic (PLEG): container finished" podID="7148462c-5f1b-4d1f-a161-5ffbf7963add" containerID="a93c72db35163f808520ebd23a1a2afcd942930ba2c36bd9ef4203c8491e828e" exitCode=0 Dec 02 09:01:03 crc kubenswrapper[4895]: I1202 09:01:03.503447 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29411101-7q6kv" 
event={"ID":"7148462c-5f1b-4d1f-a161-5ffbf7963add","Type":"ContainerDied","Data":"a93c72db35163f808520ebd23a1a2afcd942930ba2c36bd9ef4203c8491e828e"} Dec 02 09:01:04 crc kubenswrapper[4895]: I1202 09:01:04.879464 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29411101-7q6kv" Dec 02 09:01:05 crc kubenswrapper[4895]: I1202 09:01:05.029878 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gfqms\" (UniqueName: \"kubernetes.io/projected/7148462c-5f1b-4d1f-a161-5ffbf7963add-kube-api-access-gfqms\") pod \"7148462c-5f1b-4d1f-a161-5ffbf7963add\" (UID: \"7148462c-5f1b-4d1f-a161-5ffbf7963add\") " Dec 02 09:01:05 crc kubenswrapper[4895]: I1202 09:01:05.030007 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7148462c-5f1b-4d1f-a161-5ffbf7963add-fernet-keys\") pod \"7148462c-5f1b-4d1f-a161-5ffbf7963add\" (UID: \"7148462c-5f1b-4d1f-a161-5ffbf7963add\") " Dec 02 09:01:05 crc kubenswrapper[4895]: I1202 09:01:05.030114 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7148462c-5f1b-4d1f-a161-5ffbf7963add-combined-ca-bundle\") pod \"7148462c-5f1b-4d1f-a161-5ffbf7963add\" (UID: \"7148462c-5f1b-4d1f-a161-5ffbf7963add\") " Dec 02 09:01:05 crc kubenswrapper[4895]: I1202 09:01:05.030135 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7148462c-5f1b-4d1f-a161-5ffbf7963add-config-data\") pod \"7148462c-5f1b-4d1f-a161-5ffbf7963add\" (UID: \"7148462c-5f1b-4d1f-a161-5ffbf7963add\") " Dec 02 09:01:05 crc kubenswrapper[4895]: I1202 09:01:05.036551 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7148462c-5f1b-4d1f-a161-5ffbf7963add-kube-api-access-gfqms" 
(OuterVolumeSpecName: "kube-api-access-gfqms") pod "7148462c-5f1b-4d1f-a161-5ffbf7963add" (UID: "7148462c-5f1b-4d1f-a161-5ffbf7963add"). InnerVolumeSpecName "kube-api-access-gfqms". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:01:05 crc kubenswrapper[4895]: I1202 09:01:05.037001 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7148462c-5f1b-4d1f-a161-5ffbf7963add-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "7148462c-5f1b-4d1f-a161-5ffbf7963add" (UID: "7148462c-5f1b-4d1f-a161-5ffbf7963add"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:01:05 crc kubenswrapper[4895]: I1202 09:01:05.062444 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7148462c-5f1b-4d1f-a161-5ffbf7963add-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7148462c-5f1b-4d1f-a161-5ffbf7963add" (UID: "7148462c-5f1b-4d1f-a161-5ffbf7963add"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:01:05 crc kubenswrapper[4895]: I1202 09:01:05.106678 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7148462c-5f1b-4d1f-a161-5ffbf7963add-config-data" (OuterVolumeSpecName: "config-data") pod "7148462c-5f1b-4d1f-a161-5ffbf7963add" (UID: "7148462c-5f1b-4d1f-a161-5ffbf7963add"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:01:05 crc kubenswrapper[4895]: I1202 09:01:05.131977 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gfqms\" (UniqueName: \"kubernetes.io/projected/7148462c-5f1b-4d1f-a161-5ffbf7963add-kube-api-access-gfqms\") on node \"crc\" DevicePath \"\"" Dec 02 09:01:05 crc kubenswrapper[4895]: I1202 09:01:05.132027 4895 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7148462c-5f1b-4d1f-a161-5ffbf7963add-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 02 09:01:05 crc kubenswrapper[4895]: I1202 09:01:05.132037 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7148462c-5f1b-4d1f-a161-5ffbf7963add-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 09:01:05 crc kubenswrapper[4895]: I1202 09:01:05.132045 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7148462c-5f1b-4d1f-a161-5ffbf7963add-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 09:01:05 crc kubenswrapper[4895]: I1202 09:01:05.522643 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29411101-7q6kv" event={"ID":"7148462c-5f1b-4d1f-a161-5ffbf7963add","Type":"ContainerDied","Data":"e0a3d2ff78bad0c8752f9389519bb748a67388cfae1d6d25c00d46f55f23d834"} Dec 02 09:01:05 crc kubenswrapper[4895]: I1202 09:01:05.522994 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e0a3d2ff78bad0c8752f9389519bb748a67388cfae1d6d25c00d46f55f23d834" Dec 02 09:01:05 crc kubenswrapper[4895]: I1202 09:01:05.522721 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29411101-7q6kv" Dec 02 09:01:08 crc kubenswrapper[4895]: I1202 09:01:08.735672 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-9wbpq"] Dec 02 09:01:08 crc kubenswrapper[4895]: E1202 09:01:08.736681 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7148462c-5f1b-4d1f-a161-5ffbf7963add" containerName="keystone-cron" Dec 02 09:01:08 crc kubenswrapper[4895]: I1202 09:01:08.736693 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="7148462c-5f1b-4d1f-a161-5ffbf7963add" containerName="keystone-cron" Dec 02 09:01:08 crc kubenswrapper[4895]: I1202 09:01:08.736934 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="7148462c-5f1b-4d1f-a161-5ffbf7963add" containerName="keystone-cron" Dec 02 09:01:08 crc kubenswrapper[4895]: I1202 09:01:08.738249 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9wbpq" Dec 02 09:01:08 crc kubenswrapper[4895]: I1202 09:01:08.756023 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9wbpq"] Dec 02 09:01:08 crc kubenswrapper[4895]: I1202 09:01:08.912578 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-746nv\" (UniqueName: \"kubernetes.io/projected/2bc79272-641a-44ab-b45c-d459fbdb4f81-kube-api-access-746nv\") pod \"community-operators-9wbpq\" (UID: \"2bc79272-641a-44ab-b45c-d459fbdb4f81\") " pod="openshift-marketplace/community-operators-9wbpq" Dec 02 09:01:08 crc kubenswrapper[4895]: I1202 09:01:08.912627 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2bc79272-641a-44ab-b45c-d459fbdb4f81-utilities\") pod \"community-operators-9wbpq\" (UID: \"2bc79272-641a-44ab-b45c-d459fbdb4f81\") " 
pod="openshift-marketplace/community-operators-9wbpq" Dec 02 09:01:08 crc kubenswrapper[4895]: I1202 09:01:08.912853 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2bc79272-641a-44ab-b45c-d459fbdb4f81-catalog-content\") pod \"community-operators-9wbpq\" (UID: \"2bc79272-641a-44ab-b45c-d459fbdb4f81\") " pod="openshift-marketplace/community-operators-9wbpq" Dec 02 09:01:09 crc kubenswrapper[4895]: I1202 09:01:09.014512 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-746nv\" (UniqueName: \"kubernetes.io/projected/2bc79272-641a-44ab-b45c-d459fbdb4f81-kube-api-access-746nv\") pod \"community-operators-9wbpq\" (UID: \"2bc79272-641a-44ab-b45c-d459fbdb4f81\") " pod="openshift-marketplace/community-operators-9wbpq" Dec 02 09:01:09 crc kubenswrapper[4895]: I1202 09:01:09.014574 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2bc79272-641a-44ab-b45c-d459fbdb4f81-utilities\") pod \"community-operators-9wbpq\" (UID: \"2bc79272-641a-44ab-b45c-d459fbdb4f81\") " pod="openshift-marketplace/community-operators-9wbpq" Dec 02 09:01:09 crc kubenswrapper[4895]: I1202 09:01:09.014665 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2bc79272-641a-44ab-b45c-d459fbdb4f81-catalog-content\") pod \"community-operators-9wbpq\" (UID: \"2bc79272-641a-44ab-b45c-d459fbdb4f81\") " pod="openshift-marketplace/community-operators-9wbpq" Dec 02 09:01:09 crc kubenswrapper[4895]: I1202 09:01:09.015299 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2bc79272-641a-44ab-b45c-d459fbdb4f81-utilities\") pod \"community-operators-9wbpq\" (UID: \"2bc79272-641a-44ab-b45c-d459fbdb4f81\") " 
pod="openshift-marketplace/community-operators-9wbpq" Dec 02 09:01:09 crc kubenswrapper[4895]: I1202 09:01:09.015402 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2bc79272-641a-44ab-b45c-d459fbdb4f81-catalog-content\") pod \"community-operators-9wbpq\" (UID: \"2bc79272-641a-44ab-b45c-d459fbdb4f81\") " pod="openshift-marketplace/community-operators-9wbpq" Dec 02 09:01:09 crc kubenswrapper[4895]: I1202 09:01:09.037125 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-746nv\" (UniqueName: \"kubernetes.io/projected/2bc79272-641a-44ab-b45c-d459fbdb4f81-kube-api-access-746nv\") pod \"community-operators-9wbpq\" (UID: \"2bc79272-641a-44ab-b45c-d459fbdb4f81\") " pod="openshift-marketplace/community-operators-9wbpq" Dec 02 09:01:09 crc kubenswrapper[4895]: I1202 09:01:09.060491 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9wbpq" Dec 02 09:01:09 crc kubenswrapper[4895]: I1202 09:01:09.195211 4895 scope.go:117] "RemoveContainer" containerID="d95d9d738930381cc247f4992573f82ec7c89226f23be0d771d1a3b1b3cf1812" Dec 02 09:01:09 crc kubenswrapper[4895]: E1202 09:01:09.196010 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:01:09 crc kubenswrapper[4895]: I1202 09:01:09.667370 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9wbpq"] Dec 02 09:01:10 crc kubenswrapper[4895]: I1202 09:01:10.566264 4895 generic.go:334] "Generic (PLEG): container 
finished" podID="2bc79272-641a-44ab-b45c-d459fbdb4f81" containerID="de3bbac5a74a5cc0810dd3da96379636acb4ad04b50cb2ee2965d0422034427e" exitCode=0 Dec 02 09:01:10 crc kubenswrapper[4895]: I1202 09:01:10.566383 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9wbpq" event={"ID":"2bc79272-641a-44ab-b45c-d459fbdb4f81","Type":"ContainerDied","Data":"de3bbac5a74a5cc0810dd3da96379636acb4ad04b50cb2ee2965d0422034427e"} Dec 02 09:01:10 crc kubenswrapper[4895]: I1202 09:01:10.566588 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9wbpq" event={"ID":"2bc79272-641a-44ab-b45c-d459fbdb4f81","Type":"ContainerStarted","Data":"3eae0340249f3ad15a5fc4ba9fdf300879a28280bdf2fbcf30c0f54352910b20"} Dec 02 09:01:15 crc kubenswrapper[4895]: I1202 09:01:15.621428 4895 generic.go:334] "Generic (PLEG): container finished" podID="2bc79272-641a-44ab-b45c-d459fbdb4f81" containerID="95c2ac2f7c36142d5657f3ec2b136e73b2a9689a9996befad56e0eb984388af0" exitCode=0 Dec 02 09:01:15 crc kubenswrapper[4895]: I1202 09:01:15.621531 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9wbpq" event={"ID":"2bc79272-641a-44ab-b45c-d459fbdb4f81","Type":"ContainerDied","Data":"95c2ac2f7c36142d5657f3ec2b136e73b2a9689a9996befad56e0eb984388af0"} Dec 02 09:01:16 crc kubenswrapper[4895]: I1202 09:01:16.634578 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9wbpq" event={"ID":"2bc79272-641a-44ab-b45c-d459fbdb4f81","Type":"ContainerStarted","Data":"00de8ed32c50f197fdac75aa57eedb83dae05e429968e44ebd0c0e2e19d986f7"} Dec 02 09:01:16 crc kubenswrapper[4895]: I1202 09:01:16.662366 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-9wbpq" podStartSLOduration=3.151066005 podStartE2EDuration="8.662332831s" podCreationTimestamp="2025-12-02 09:01:08 +0000 
UTC" firstStartedPulling="2025-12-02 09:01:10.567600403 +0000 UTC m=+5881.738460016" lastFinishedPulling="2025-12-02 09:01:16.078867229 +0000 UTC m=+5887.249726842" observedRunningTime="2025-12-02 09:01:16.651653358 +0000 UTC m=+5887.822512971" watchObservedRunningTime="2025-12-02 09:01:16.662332831 +0000 UTC m=+5887.833192454" Dec 02 09:01:19 crc kubenswrapper[4895]: I1202 09:01:19.061599 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-9wbpq" Dec 02 09:01:19 crc kubenswrapper[4895]: I1202 09:01:19.062294 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-9wbpq" Dec 02 09:01:19 crc kubenswrapper[4895]: I1202 09:01:19.123222 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-9wbpq" Dec 02 09:01:20 crc kubenswrapper[4895]: I1202 09:01:20.067512 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-r6xf4"] Dec 02 09:01:20 crc kubenswrapper[4895]: I1202 09:01:20.077635 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-5328-account-create-update-js8vj"] Dec 02 09:01:20 crc kubenswrapper[4895]: I1202 09:01:20.086766 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-5328-account-create-update-js8vj"] Dec 02 09:01:20 crc kubenswrapper[4895]: I1202 09:01:20.094693 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-r6xf4"] Dec 02 09:01:20 crc kubenswrapper[4895]: I1202 09:01:20.140996 4895 scope.go:117] "RemoveContainer" containerID="d95d9d738930381cc247f4992573f82ec7c89226f23be0d771d1a3b1b3cf1812" Dec 02 09:01:20 crc kubenswrapper[4895]: E1202 09:01:20.141237 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:01:21 crc kubenswrapper[4895]: I1202 09:01:21.153627 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00e12a0b-dbf2-4e96-ae0e-b6c71cdd9ca2" path="/var/lib/kubelet/pods/00e12a0b-dbf2-4e96-ae0e-b6c71cdd9ca2/volumes" Dec 02 09:01:21 crc kubenswrapper[4895]: I1202 09:01:21.156309 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc7ae511-f6f1-4369-92ef-2314d47f0bc9" path="/var/lib/kubelet/pods/dc7ae511-f6f1-4369-92ef-2314d47f0bc9/volumes" Dec 02 09:01:22 crc kubenswrapper[4895]: I1202 09:01:22.909217 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-fp6lz"] Dec 02 09:01:22 crc kubenswrapper[4895]: I1202 09:01:22.910973 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-fp6lz" Dec 02 09:01:22 crc kubenswrapper[4895]: I1202 09:01:22.915439 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-f9dhr" Dec 02 09:01:22 crc kubenswrapper[4895]: I1202 09:01:22.915636 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Dec 02 09:01:22 crc kubenswrapper[4895]: I1202 09:01:22.941804 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-fp6lz"] Dec 02 09:01:22 crc kubenswrapper[4895]: I1202 09:01:22.981185 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-kgxnn"] Dec 02 09:01:22 crc kubenswrapper[4895]: I1202 09:01:22.983253 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-kgxnn" Dec 02 09:01:22 crc kubenswrapper[4895]: I1202 09:01:22.994382 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6389707d-0e93-4457-ae41-4da59350383e-scripts\") pod \"ovn-controller-fp6lz\" (UID: \"6389707d-0e93-4457-ae41-4da59350383e\") " pod="openstack/ovn-controller-fp6lz" Dec 02 09:01:22 crc kubenswrapper[4895]: I1202 09:01:22.994496 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjhl8\" (UniqueName: \"kubernetes.io/projected/6389707d-0e93-4457-ae41-4da59350383e-kube-api-access-sjhl8\") pod \"ovn-controller-fp6lz\" (UID: \"6389707d-0e93-4457-ae41-4da59350383e\") " pod="openstack/ovn-controller-fp6lz" Dec 02 09:01:22 crc kubenswrapper[4895]: I1202 09:01:22.994630 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/6389707d-0e93-4457-ae41-4da59350383e-var-run-ovn\") pod \"ovn-controller-fp6lz\" (UID: \"6389707d-0e93-4457-ae41-4da59350383e\") " pod="openstack/ovn-controller-fp6lz" Dec 02 09:01:22 crc kubenswrapper[4895]: I1202 09:01:22.994666 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6389707d-0e93-4457-ae41-4da59350383e-var-run\") pod \"ovn-controller-fp6lz\" (UID: \"6389707d-0e93-4457-ae41-4da59350383e\") " pod="openstack/ovn-controller-fp6lz" Dec 02 09:01:22 crc kubenswrapper[4895]: I1202 09:01:22.994764 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/6389707d-0e93-4457-ae41-4da59350383e-var-log-ovn\") pod \"ovn-controller-fp6lz\" (UID: \"6389707d-0e93-4457-ae41-4da59350383e\") " pod="openstack/ovn-controller-fp6lz" 
Dec 02 09:01:22 crc kubenswrapper[4895]: I1202 09:01:22.996144 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-kgxnn"] Dec 02 09:01:23 crc kubenswrapper[4895]: I1202 09:01:23.096701 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/6389707d-0e93-4457-ae41-4da59350383e-var-log-ovn\") pod \"ovn-controller-fp6lz\" (UID: \"6389707d-0e93-4457-ae41-4da59350383e\") " pod="openstack/ovn-controller-fp6lz" Dec 02 09:01:23 crc kubenswrapper[4895]: I1202 09:01:23.097359 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/6389707d-0e93-4457-ae41-4da59350383e-var-log-ovn\") pod \"ovn-controller-fp6lz\" (UID: \"6389707d-0e93-4457-ae41-4da59350383e\") " pod="openstack/ovn-controller-fp6lz" Dec 02 09:01:23 crc kubenswrapper[4895]: I1202 09:01:23.097493 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/2b3fbe76-b9cc-402f-9f1b-46d64c057d31-var-log\") pod \"ovn-controller-ovs-kgxnn\" (UID: \"2b3fbe76-b9cc-402f-9f1b-46d64c057d31\") " pod="openstack/ovn-controller-ovs-kgxnn" Dec 02 09:01:23 crc kubenswrapper[4895]: I1202 09:01:23.097523 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6389707d-0e93-4457-ae41-4da59350383e-scripts\") pod \"ovn-controller-fp6lz\" (UID: \"6389707d-0e93-4457-ae41-4da59350383e\") " pod="openstack/ovn-controller-fp6lz" Dec 02 09:01:23 crc kubenswrapper[4895]: I1202 09:01:23.097548 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2b3fbe76-b9cc-402f-9f1b-46d64c057d31-var-run\") pod \"ovn-controller-ovs-kgxnn\" (UID: \"2b3fbe76-b9cc-402f-9f1b-46d64c057d31\") " 
pod="openstack/ovn-controller-ovs-kgxnn" Dec 02 09:01:23 crc kubenswrapper[4895]: I1202 09:01:23.097567 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5ct8\" (UniqueName: \"kubernetes.io/projected/2b3fbe76-b9cc-402f-9f1b-46d64c057d31-kube-api-access-c5ct8\") pod \"ovn-controller-ovs-kgxnn\" (UID: \"2b3fbe76-b9cc-402f-9f1b-46d64c057d31\") " pod="openstack/ovn-controller-ovs-kgxnn" Dec 02 09:01:23 crc kubenswrapper[4895]: I1202 09:01:23.097617 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/2b3fbe76-b9cc-402f-9f1b-46d64c057d31-etc-ovs\") pod \"ovn-controller-ovs-kgxnn\" (UID: \"2b3fbe76-b9cc-402f-9f1b-46d64c057d31\") " pod="openstack/ovn-controller-ovs-kgxnn" Dec 02 09:01:23 crc kubenswrapper[4895]: I1202 09:01:23.097635 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2b3fbe76-b9cc-402f-9f1b-46d64c057d31-scripts\") pod \"ovn-controller-ovs-kgxnn\" (UID: \"2b3fbe76-b9cc-402f-9f1b-46d64c057d31\") " pod="openstack/ovn-controller-ovs-kgxnn" Dec 02 09:01:23 crc kubenswrapper[4895]: I1202 09:01:23.097663 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sjhl8\" (UniqueName: \"kubernetes.io/projected/6389707d-0e93-4457-ae41-4da59350383e-kube-api-access-sjhl8\") pod \"ovn-controller-fp6lz\" (UID: \"6389707d-0e93-4457-ae41-4da59350383e\") " pod="openstack/ovn-controller-fp6lz" Dec 02 09:01:23 crc kubenswrapper[4895]: I1202 09:01:23.097706 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/2b3fbe76-b9cc-402f-9f1b-46d64c057d31-var-lib\") pod \"ovn-controller-ovs-kgxnn\" (UID: \"2b3fbe76-b9cc-402f-9f1b-46d64c057d31\") " pod="openstack/ovn-controller-ovs-kgxnn" 
Dec 02 09:01:23 crc kubenswrapper[4895]: I1202 09:01:23.097760 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/6389707d-0e93-4457-ae41-4da59350383e-var-run-ovn\") pod \"ovn-controller-fp6lz\" (UID: \"6389707d-0e93-4457-ae41-4da59350383e\") " pod="openstack/ovn-controller-fp6lz" Dec 02 09:01:23 crc kubenswrapper[4895]: I1202 09:01:23.097798 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6389707d-0e93-4457-ae41-4da59350383e-var-run\") pod \"ovn-controller-fp6lz\" (UID: \"6389707d-0e93-4457-ae41-4da59350383e\") " pod="openstack/ovn-controller-fp6lz" Dec 02 09:01:23 crc kubenswrapper[4895]: I1202 09:01:23.097911 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6389707d-0e93-4457-ae41-4da59350383e-var-run\") pod \"ovn-controller-fp6lz\" (UID: \"6389707d-0e93-4457-ae41-4da59350383e\") " pod="openstack/ovn-controller-fp6lz" Dec 02 09:01:23 crc kubenswrapper[4895]: I1202 09:01:23.099347 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/6389707d-0e93-4457-ae41-4da59350383e-var-run-ovn\") pod \"ovn-controller-fp6lz\" (UID: \"6389707d-0e93-4457-ae41-4da59350383e\") " pod="openstack/ovn-controller-fp6lz" Dec 02 09:01:23 crc kubenswrapper[4895]: I1202 09:01:23.100953 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6389707d-0e93-4457-ae41-4da59350383e-scripts\") pod \"ovn-controller-fp6lz\" (UID: \"6389707d-0e93-4457-ae41-4da59350383e\") " pod="openstack/ovn-controller-fp6lz" Dec 02 09:01:23 crc kubenswrapper[4895]: I1202 09:01:23.122510 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjhl8\" (UniqueName: 
\"kubernetes.io/projected/6389707d-0e93-4457-ae41-4da59350383e-kube-api-access-sjhl8\") pod \"ovn-controller-fp6lz\" (UID: \"6389707d-0e93-4457-ae41-4da59350383e\") " pod="openstack/ovn-controller-fp6lz" Dec 02 09:01:23 crc kubenswrapper[4895]: I1202 09:01:23.199525 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/2b3fbe76-b9cc-402f-9f1b-46d64c057d31-var-log\") pod \"ovn-controller-ovs-kgxnn\" (UID: \"2b3fbe76-b9cc-402f-9f1b-46d64c057d31\") " pod="openstack/ovn-controller-ovs-kgxnn" Dec 02 09:01:23 crc kubenswrapper[4895]: I1202 09:01:23.199606 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2b3fbe76-b9cc-402f-9f1b-46d64c057d31-var-run\") pod \"ovn-controller-ovs-kgxnn\" (UID: \"2b3fbe76-b9cc-402f-9f1b-46d64c057d31\") " pod="openstack/ovn-controller-ovs-kgxnn" Dec 02 09:01:23 crc kubenswrapper[4895]: I1202 09:01:23.199638 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c5ct8\" (UniqueName: \"kubernetes.io/projected/2b3fbe76-b9cc-402f-9f1b-46d64c057d31-kube-api-access-c5ct8\") pod \"ovn-controller-ovs-kgxnn\" (UID: \"2b3fbe76-b9cc-402f-9f1b-46d64c057d31\") " pod="openstack/ovn-controller-ovs-kgxnn" Dec 02 09:01:23 crc kubenswrapper[4895]: I1202 09:01:23.199728 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/2b3fbe76-b9cc-402f-9f1b-46d64c057d31-etc-ovs\") pod \"ovn-controller-ovs-kgxnn\" (UID: \"2b3fbe76-b9cc-402f-9f1b-46d64c057d31\") " pod="openstack/ovn-controller-ovs-kgxnn" Dec 02 09:01:23 crc kubenswrapper[4895]: I1202 09:01:23.199774 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2b3fbe76-b9cc-402f-9f1b-46d64c057d31-scripts\") pod \"ovn-controller-ovs-kgxnn\" (UID: 
\"2b3fbe76-b9cc-402f-9f1b-46d64c057d31\") " pod="openstack/ovn-controller-ovs-kgxnn" Dec 02 09:01:23 crc kubenswrapper[4895]: I1202 09:01:23.199900 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/2b3fbe76-b9cc-402f-9f1b-46d64c057d31-var-lib\") pod \"ovn-controller-ovs-kgxnn\" (UID: \"2b3fbe76-b9cc-402f-9f1b-46d64c057d31\") " pod="openstack/ovn-controller-ovs-kgxnn" Dec 02 09:01:23 crc kubenswrapper[4895]: I1202 09:01:23.200093 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/2b3fbe76-b9cc-402f-9f1b-46d64c057d31-var-lib\") pod \"ovn-controller-ovs-kgxnn\" (UID: \"2b3fbe76-b9cc-402f-9f1b-46d64c057d31\") " pod="openstack/ovn-controller-ovs-kgxnn" Dec 02 09:01:23 crc kubenswrapper[4895]: I1202 09:01:23.201305 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/2b3fbe76-b9cc-402f-9f1b-46d64c057d31-var-log\") pod \"ovn-controller-ovs-kgxnn\" (UID: \"2b3fbe76-b9cc-402f-9f1b-46d64c057d31\") " pod="openstack/ovn-controller-ovs-kgxnn" Dec 02 09:01:23 crc kubenswrapper[4895]: I1202 09:01:23.201376 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/2b3fbe76-b9cc-402f-9f1b-46d64c057d31-etc-ovs\") pod \"ovn-controller-ovs-kgxnn\" (UID: \"2b3fbe76-b9cc-402f-9f1b-46d64c057d31\") " pod="openstack/ovn-controller-ovs-kgxnn" Dec 02 09:01:23 crc kubenswrapper[4895]: I1202 09:01:23.201559 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2b3fbe76-b9cc-402f-9f1b-46d64c057d31-var-run\") pod \"ovn-controller-ovs-kgxnn\" (UID: \"2b3fbe76-b9cc-402f-9f1b-46d64c057d31\") " pod="openstack/ovn-controller-ovs-kgxnn" Dec 02 09:01:23 crc kubenswrapper[4895]: I1202 09:01:23.204021 4895 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2b3fbe76-b9cc-402f-9f1b-46d64c057d31-scripts\") pod \"ovn-controller-ovs-kgxnn\" (UID: \"2b3fbe76-b9cc-402f-9f1b-46d64c057d31\") " pod="openstack/ovn-controller-ovs-kgxnn" Dec 02 09:01:23 crc kubenswrapper[4895]: I1202 09:01:23.221875 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5ct8\" (UniqueName: \"kubernetes.io/projected/2b3fbe76-b9cc-402f-9f1b-46d64c057d31-kube-api-access-c5ct8\") pod \"ovn-controller-ovs-kgxnn\" (UID: \"2b3fbe76-b9cc-402f-9f1b-46d64c057d31\") " pod="openstack/ovn-controller-ovs-kgxnn" Dec 02 09:01:23 crc kubenswrapper[4895]: I1202 09:01:23.265284 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-fp6lz" Dec 02 09:01:23 crc kubenswrapper[4895]: I1202 09:01:23.359868 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-kgxnn" Dec 02 09:01:23 crc kubenswrapper[4895]: I1202 09:01:23.968481 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-fp6lz"] Dec 02 09:01:23 crc kubenswrapper[4895]: W1202 09:01:23.973805 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6389707d_0e93_4457_ae41_4da59350383e.slice/crio-af1ef3152a496908296a8f71f29c9c8e282c874418cd7a15ec0c4d3a230da965 WatchSource:0}: Error finding container af1ef3152a496908296a8f71f29c9c8e282c874418cd7a15ec0c4d3a230da965: Status 404 returned error can't find the container with id af1ef3152a496908296a8f71f29c9c8e282c874418cd7a15ec0c4d3a230da965 Dec 02 09:01:24 crc kubenswrapper[4895]: I1202 09:01:24.514209 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-kgxnn"] Dec 02 09:01:24 crc kubenswrapper[4895]: W1202 09:01:24.517955 4895 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2b3fbe76_b9cc_402f_9f1b_46d64c057d31.slice/crio-3a966e4b4e8390141bd1bebe35dcf9eafa5932bec776948427bea7ae0ff71b93 WatchSource:0}: Error finding container 3a966e4b4e8390141bd1bebe35dcf9eafa5932bec776948427bea7ae0ff71b93: Status 404 returned error can't find the container with id 3a966e4b4e8390141bd1bebe35dcf9eafa5932bec776948427bea7ae0ff71b93 Dec 02 09:01:24 crc kubenswrapper[4895]: I1202 09:01:24.722867 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-fp6lz" event={"ID":"6389707d-0e93-4457-ae41-4da59350383e","Type":"ContainerStarted","Data":"feee3b870e14cfb3edb2f36354ad6b035462ad666f1deb9c3f0fffab0902bf24"} Dec 02 09:01:24 crc kubenswrapper[4895]: I1202 09:01:24.723272 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-fp6lz" event={"ID":"6389707d-0e93-4457-ae41-4da59350383e","Type":"ContainerStarted","Data":"af1ef3152a496908296a8f71f29c9c8e282c874418cd7a15ec0c4d3a230da965"} Dec 02 09:01:24 crc kubenswrapper[4895]: I1202 09:01:24.728803 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-kgxnn" event={"ID":"2b3fbe76-b9cc-402f-9f1b-46d64c057d31","Type":"ContainerStarted","Data":"3a966e4b4e8390141bd1bebe35dcf9eafa5932bec776948427bea7ae0ff71b93"} Dec 02 09:01:25 crc kubenswrapper[4895]: I1202 09:01:25.690211 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-dcddm"] Dec 02 09:01:25 crc kubenswrapper[4895]: I1202 09:01:25.692156 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-dcddm" Dec 02 09:01:25 crc kubenswrapper[4895]: I1202 09:01:25.694365 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Dec 02 09:01:25 crc kubenswrapper[4895]: I1202 09:01:25.724752 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-dcddm"] Dec 02 09:01:25 crc kubenswrapper[4895]: I1202 09:01:25.749324 4895 generic.go:334] "Generic (PLEG): container finished" podID="2b3fbe76-b9cc-402f-9f1b-46d64c057d31" containerID="9c66630bf4e7a3dffe67fa895926febc481f4606f623a629cee067e9dea295cc" exitCode=0 Dec 02 09:01:25 crc kubenswrapper[4895]: I1202 09:01:25.749563 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-kgxnn" event={"ID":"2b3fbe76-b9cc-402f-9f1b-46d64c057d31","Type":"ContainerDied","Data":"9c66630bf4e7a3dffe67fa895926febc481f4606f623a629cee067e9dea295cc"} Dec 02 09:01:25 crc kubenswrapper[4895]: I1202 09:01:25.749617 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-fp6lz" Dec 02 09:01:25 crc kubenswrapper[4895]: I1202 09:01:25.770160 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb8d0991-333c-44a2-a646-14665184fb94-config\") pod \"ovn-controller-metrics-dcddm\" (UID: \"eb8d0991-333c-44a2-a646-14665184fb94\") " pod="openstack/ovn-controller-metrics-dcddm" Dec 02 09:01:25 crc kubenswrapper[4895]: I1202 09:01:25.770245 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/eb8d0991-333c-44a2-a646-14665184fb94-ovs-rundir\") pod \"ovn-controller-metrics-dcddm\" (UID: \"eb8d0991-333c-44a2-a646-14665184fb94\") " pod="openstack/ovn-controller-metrics-dcddm" Dec 02 09:01:25 crc kubenswrapper[4895]: I1202 09:01:25.770363 
4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/eb8d0991-333c-44a2-a646-14665184fb94-ovn-rundir\") pod \"ovn-controller-metrics-dcddm\" (UID: \"eb8d0991-333c-44a2-a646-14665184fb94\") " pod="openstack/ovn-controller-metrics-dcddm" Dec 02 09:01:25 crc kubenswrapper[4895]: I1202 09:01:25.770407 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqg9l\" (UniqueName: \"kubernetes.io/projected/eb8d0991-333c-44a2-a646-14665184fb94-kube-api-access-xqg9l\") pod \"ovn-controller-metrics-dcddm\" (UID: \"eb8d0991-333c-44a2-a646-14665184fb94\") " pod="openstack/ovn-controller-metrics-dcddm" Dec 02 09:01:25 crc kubenswrapper[4895]: I1202 09:01:25.796887 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-fp6lz" podStartSLOduration=3.796863998 podStartE2EDuration="3.796863998s" podCreationTimestamp="2025-12-02 09:01:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:01:25.787784625 +0000 UTC m=+5896.958644248" watchObservedRunningTime="2025-12-02 09:01:25.796863998 +0000 UTC m=+5896.967723611" Dec 02 09:01:25 crc kubenswrapper[4895]: I1202 09:01:25.873176 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/eb8d0991-333c-44a2-a646-14665184fb94-ovs-rundir\") pod \"ovn-controller-metrics-dcddm\" (UID: \"eb8d0991-333c-44a2-a646-14665184fb94\") " pod="openstack/ovn-controller-metrics-dcddm" Dec 02 09:01:25 crc kubenswrapper[4895]: I1202 09:01:25.873379 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/eb8d0991-333c-44a2-a646-14665184fb94-ovn-rundir\") pod \"ovn-controller-metrics-dcddm\" (UID: 
\"eb8d0991-333c-44a2-a646-14665184fb94\") " pod="openstack/ovn-controller-metrics-dcddm" Dec 02 09:01:25 crc kubenswrapper[4895]: I1202 09:01:25.873426 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xqg9l\" (UniqueName: \"kubernetes.io/projected/eb8d0991-333c-44a2-a646-14665184fb94-kube-api-access-xqg9l\") pod \"ovn-controller-metrics-dcddm\" (UID: \"eb8d0991-333c-44a2-a646-14665184fb94\") " pod="openstack/ovn-controller-metrics-dcddm" Dec 02 09:01:25 crc kubenswrapper[4895]: I1202 09:01:25.873509 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb8d0991-333c-44a2-a646-14665184fb94-config\") pod \"ovn-controller-metrics-dcddm\" (UID: \"eb8d0991-333c-44a2-a646-14665184fb94\") " pod="openstack/ovn-controller-metrics-dcddm" Dec 02 09:01:25 crc kubenswrapper[4895]: I1202 09:01:25.873641 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/eb8d0991-333c-44a2-a646-14665184fb94-ovs-rundir\") pod \"ovn-controller-metrics-dcddm\" (UID: \"eb8d0991-333c-44a2-a646-14665184fb94\") " pod="openstack/ovn-controller-metrics-dcddm" Dec 02 09:01:25 crc kubenswrapper[4895]: I1202 09:01:25.874457 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb8d0991-333c-44a2-a646-14665184fb94-config\") pod \"ovn-controller-metrics-dcddm\" (UID: \"eb8d0991-333c-44a2-a646-14665184fb94\") " pod="openstack/ovn-controller-metrics-dcddm" Dec 02 09:01:25 crc kubenswrapper[4895]: I1202 09:01:25.874463 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/eb8d0991-333c-44a2-a646-14665184fb94-ovn-rundir\") pod \"ovn-controller-metrics-dcddm\" (UID: \"eb8d0991-333c-44a2-a646-14665184fb94\") " pod="openstack/ovn-controller-metrics-dcddm" Dec 02 09:01:25 crc 
kubenswrapper[4895]: I1202 09:01:25.903664 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqg9l\" (UniqueName: \"kubernetes.io/projected/eb8d0991-333c-44a2-a646-14665184fb94-kube-api-access-xqg9l\") pod \"ovn-controller-metrics-dcddm\" (UID: \"eb8d0991-333c-44a2-a646-14665184fb94\") " pod="openstack/ovn-controller-metrics-dcddm"
Dec 02 09:01:26 crc kubenswrapper[4895]: I1202 09:01:26.020871 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-dcddm"
Dec 02 09:01:26 crc kubenswrapper[4895]: I1202 09:01:26.053376 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-scqpw"]
Dec 02 09:01:26 crc kubenswrapper[4895]: I1202 09:01:26.067997 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-scqpw"]
Dec 02 09:01:26 crc kubenswrapper[4895]: I1202 09:01:26.139700 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-db-create-8kqxg"]
Dec 02 09:01:26 crc kubenswrapper[4895]: I1202 09:01:26.141269 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-create-8kqxg"
Dec 02 09:01:26 crc kubenswrapper[4895]: I1202 09:01:26.171013 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-create-8kqxg"]
Dec 02 09:01:26 crc kubenswrapper[4895]: I1202 09:01:26.281327 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ssfw2\" (UniqueName: \"kubernetes.io/projected/3b86914c-f3af-4d62-b57a-9b94de461aea-kube-api-access-ssfw2\") pod \"octavia-db-create-8kqxg\" (UID: \"3b86914c-f3af-4d62-b57a-9b94de461aea\") " pod="openstack/octavia-db-create-8kqxg"
Dec 02 09:01:26 crc kubenswrapper[4895]: I1202 09:01:26.281403 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3b86914c-f3af-4d62-b57a-9b94de461aea-operator-scripts\") pod \"octavia-db-create-8kqxg\" (UID: \"3b86914c-f3af-4d62-b57a-9b94de461aea\") " pod="openstack/octavia-db-create-8kqxg"
Dec 02 09:01:26 crc kubenswrapper[4895]: I1202 09:01:26.383372 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ssfw2\" (UniqueName: \"kubernetes.io/projected/3b86914c-f3af-4d62-b57a-9b94de461aea-kube-api-access-ssfw2\") pod \"octavia-db-create-8kqxg\" (UID: \"3b86914c-f3af-4d62-b57a-9b94de461aea\") " pod="openstack/octavia-db-create-8kqxg"
Dec 02 09:01:26 crc kubenswrapper[4895]: I1202 09:01:26.383489 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3b86914c-f3af-4d62-b57a-9b94de461aea-operator-scripts\") pod \"octavia-db-create-8kqxg\" (UID: \"3b86914c-f3af-4d62-b57a-9b94de461aea\") " pod="openstack/octavia-db-create-8kqxg"
Dec 02 09:01:26 crc kubenswrapper[4895]: I1202 09:01:26.384712 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3b86914c-f3af-4d62-b57a-9b94de461aea-operator-scripts\") pod \"octavia-db-create-8kqxg\" (UID: \"3b86914c-f3af-4d62-b57a-9b94de461aea\") " pod="openstack/octavia-db-create-8kqxg"
Dec 02 09:01:26 crc kubenswrapper[4895]: I1202 09:01:26.409255 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ssfw2\" (UniqueName: \"kubernetes.io/projected/3b86914c-f3af-4d62-b57a-9b94de461aea-kube-api-access-ssfw2\") pod \"octavia-db-create-8kqxg\" (UID: \"3b86914c-f3af-4d62-b57a-9b94de461aea\") " pod="openstack/octavia-db-create-8kqxg"
Dec 02 09:01:26 crc kubenswrapper[4895]: I1202 09:01:26.485142 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-create-8kqxg"
Dec 02 09:01:26 crc kubenswrapper[4895]: I1202 09:01:26.596787 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-dcddm"]
Dec 02 09:01:26 crc kubenswrapper[4895]: W1202 09:01:26.634778 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeb8d0991_333c_44a2_a646_14665184fb94.slice/crio-b0ada1986133e3a2e053ec92318490fdfa0a1e7d6ccc2d0477e0c651ac72d583 WatchSource:0}: Error finding container b0ada1986133e3a2e053ec92318490fdfa0a1e7d6ccc2d0477e0c651ac72d583: Status 404 returned error can't find the container with id b0ada1986133e3a2e053ec92318490fdfa0a1e7d6ccc2d0477e0c651ac72d583
Dec 02 09:01:26 crc kubenswrapper[4895]: I1202 09:01:26.775250 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-kgxnn" event={"ID":"2b3fbe76-b9cc-402f-9f1b-46d64c057d31","Type":"ContainerStarted","Data":"09606346f93e4d752dc16c1da61d2afc04fac441cf53efa9ab5a138eae01f979"}
Dec 02 09:01:26 crc kubenswrapper[4895]: I1202 09:01:26.775606 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-kgxnn" event={"ID":"2b3fbe76-b9cc-402f-9f1b-46d64c057d31","Type":"ContainerStarted","Data":"f806b38464992bdfc7e3b8266f5cc1dbcc2edc3f7943e67d04c7585f25b6cd08"}
Dec 02 09:01:26 crc kubenswrapper[4895]: I1202 09:01:26.775653 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-kgxnn"
Dec 02 09:01:26 crc kubenswrapper[4895]: I1202 09:01:26.775709 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-kgxnn"
Dec 02 09:01:26 crc kubenswrapper[4895]: I1202 09:01:26.777225 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-dcddm" event={"ID":"eb8d0991-333c-44a2-a646-14665184fb94","Type":"ContainerStarted","Data":"b0ada1986133e3a2e053ec92318490fdfa0a1e7d6ccc2d0477e0c651ac72d583"}
Dec 02 09:01:26 crc kubenswrapper[4895]: I1202 09:01:26.803093 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-kgxnn" podStartSLOduration=4.8030731410000005 podStartE2EDuration="4.803073141s" podCreationTimestamp="2025-12-02 09:01:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:01:26.796990722 +0000 UTC m=+5897.967850335" watchObservedRunningTime="2025-12-02 09:01:26.803073141 +0000 UTC m=+5897.973932754"
Dec 02 09:01:26 crc kubenswrapper[4895]: I1202 09:01:26.975452 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-create-8kqxg"]
Dec 02 09:01:27 crc kubenswrapper[4895]: I1202 09:01:27.165528 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a6dd260-7a03-4151-a57a-ed65069db068" path="/var/lib/kubelet/pods/9a6dd260-7a03-4151-a57a-ed65069db068/volumes"
Dec 02 09:01:27 crc kubenswrapper[4895]: I1202 09:01:27.615838 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-9e34-account-create-update-5c4v7"]
Dec 02 09:01:27 crc kubenswrapper[4895]: I1202 09:01:27.618327 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-9e34-account-create-update-5c4v7"
Dec 02 09:01:27 crc kubenswrapper[4895]: I1202 09:01:27.621034 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-db-secret"
Dec 02 09:01:27 crc kubenswrapper[4895]: I1202 09:01:27.628366 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-9e34-account-create-update-5c4v7"]
Dec 02 09:01:27 crc kubenswrapper[4895]: I1202 09:01:27.718173 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a4774a86-a00e-4910-91eb-34fb42e710ef-operator-scripts\") pod \"octavia-9e34-account-create-update-5c4v7\" (UID: \"a4774a86-a00e-4910-91eb-34fb42e710ef\") " pod="openstack/octavia-9e34-account-create-update-5c4v7"
Dec 02 09:01:27 crc kubenswrapper[4895]: I1202 09:01:27.718262 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26dbz\" (UniqueName: \"kubernetes.io/projected/a4774a86-a00e-4910-91eb-34fb42e710ef-kube-api-access-26dbz\") pod \"octavia-9e34-account-create-update-5c4v7\" (UID: \"a4774a86-a00e-4910-91eb-34fb42e710ef\") " pod="openstack/octavia-9e34-account-create-update-5c4v7"
Dec 02 09:01:27 crc kubenswrapper[4895]: I1202 09:01:27.789700 4895 generic.go:334] "Generic (PLEG): container finished" podID="3b86914c-f3af-4d62-b57a-9b94de461aea" containerID="b0ed2dbede21c00b312cadb52185016964e796c0330238ad8137e24d9c5f7df2" exitCode=0
Dec 02 09:01:27 crc kubenswrapper[4895]: I1202 09:01:27.789865 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-create-8kqxg" event={"ID":"3b86914c-f3af-4d62-b57a-9b94de461aea","Type":"ContainerDied","Data":"b0ed2dbede21c00b312cadb52185016964e796c0330238ad8137e24d9c5f7df2"}
Dec 02 09:01:27 crc kubenswrapper[4895]: I1202 09:01:27.789908 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-create-8kqxg" event={"ID":"3b86914c-f3af-4d62-b57a-9b94de461aea","Type":"ContainerStarted","Data":"22795952a1ccafcf4fad59b3fb59998fa28085486e2084ae5151533610f3e425"}
Dec 02 09:01:27 crc kubenswrapper[4895]: I1202 09:01:27.793848 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-dcddm" event={"ID":"eb8d0991-333c-44a2-a646-14665184fb94","Type":"ContainerStarted","Data":"a4fe2a6c09dbeadcf8bd0c7f1392abd2989e5b66ab35f923da0a6c8a81beec34"}
Dec 02 09:01:27 crc kubenswrapper[4895]: I1202 09:01:27.820110 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a4774a86-a00e-4910-91eb-34fb42e710ef-operator-scripts\") pod \"octavia-9e34-account-create-update-5c4v7\" (UID: \"a4774a86-a00e-4910-91eb-34fb42e710ef\") " pod="openstack/octavia-9e34-account-create-update-5c4v7"
Dec 02 09:01:27 crc kubenswrapper[4895]: I1202 09:01:27.820203 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26dbz\" (UniqueName: \"kubernetes.io/projected/a4774a86-a00e-4910-91eb-34fb42e710ef-kube-api-access-26dbz\") pod \"octavia-9e34-account-create-update-5c4v7\" (UID: \"a4774a86-a00e-4910-91eb-34fb42e710ef\") " pod="openstack/octavia-9e34-account-create-update-5c4v7"
Dec 02 09:01:27 crc kubenswrapper[4895]: I1202 09:01:27.821173 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a4774a86-a00e-4910-91eb-34fb42e710ef-operator-scripts\") pod \"octavia-9e34-account-create-update-5c4v7\" (UID: \"a4774a86-a00e-4910-91eb-34fb42e710ef\") " pod="openstack/octavia-9e34-account-create-update-5c4v7"
Dec 02 09:01:27 crc kubenswrapper[4895]: I1202 09:01:27.840765 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-dcddm" podStartSLOduration=2.840717123 podStartE2EDuration="2.840717123s" podCreationTimestamp="2025-12-02 09:01:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:01:27.83516935 +0000 UTC m=+5899.006028963" watchObservedRunningTime="2025-12-02 09:01:27.840717123 +0000 UTC m=+5899.011576746"
Dec 02 09:01:27 crc kubenswrapper[4895]: I1202 09:01:27.854441 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26dbz\" (UniqueName: \"kubernetes.io/projected/a4774a86-a00e-4910-91eb-34fb42e710ef-kube-api-access-26dbz\") pod \"octavia-9e34-account-create-update-5c4v7\" (UID: \"a4774a86-a00e-4910-91eb-34fb42e710ef\") " pod="openstack/octavia-9e34-account-create-update-5c4v7"
Dec 02 09:01:27 crc kubenswrapper[4895]: I1202 09:01:27.941941 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-9e34-account-create-update-5c4v7"
Dec 02 09:01:28 crc kubenswrapper[4895]: I1202 09:01:28.443227 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-9e34-account-create-update-5c4v7"]
Dec 02 09:01:28 crc kubenswrapper[4895]: I1202 09:01:28.802652 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-9e34-account-create-update-5c4v7" event={"ID":"a4774a86-a00e-4910-91eb-34fb42e710ef","Type":"ContainerStarted","Data":"285b94e07574b2f07f23ebbcd8f4389c2d205eb23d9b7f75fb2e44410260b53a"}
Dec 02 09:01:28 crc kubenswrapper[4895]: I1202 09:01:28.802705 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-9e34-account-create-update-5c4v7" event={"ID":"a4774a86-a00e-4910-91eb-34fb42e710ef","Type":"ContainerStarted","Data":"1a03b5752db57ab9a72e9b6a1dc96a0eb1f097c5ad5241b341600c2a71e39796"}
Dec 02 09:01:28 crc kubenswrapper[4895]: I1202 09:01:28.831151 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-9e34-account-create-update-5c4v7" podStartSLOduration=1.8311216940000001 podStartE2EDuration="1.831121694s" podCreationTimestamp="2025-12-02 09:01:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:01:28.823662192 +0000 UTC m=+5899.994521805" watchObservedRunningTime="2025-12-02 09:01:28.831121694 +0000 UTC m=+5900.001981307"
Dec 02 09:01:29 crc kubenswrapper[4895]: I1202 09:01:29.129083 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-9wbpq"
Dec 02 09:01:29 crc kubenswrapper[4895]: I1202 09:01:29.146938 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-create-8kqxg"
Dec 02 09:01:29 crc kubenswrapper[4895]: I1202 09:01:29.244279 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9wbpq"]
Dec 02 09:01:29 crc kubenswrapper[4895]: I1202 09:01:29.260448 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3b86914c-f3af-4d62-b57a-9b94de461aea-operator-scripts\") pod \"3b86914c-f3af-4d62-b57a-9b94de461aea\" (UID: \"3b86914c-f3af-4d62-b57a-9b94de461aea\") "
Dec 02 09:01:29 crc kubenswrapper[4895]: I1202 09:01:29.260763 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ssfw2\" (UniqueName: \"kubernetes.io/projected/3b86914c-f3af-4d62-b57a-9b94de461aea-kube-api-access-ssfw2\") pod \"3b86914c-f3af-4d62-b57a-9b94de461aea\" (UID: \"3b86914c-f3af-4d62-b57a-9b94de461aea\") "
Dec 02 09:01:29 crc kubenswrapper[4895]: I1202 09:01:29.261390 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b86914c-f3af-4d62-b57a-9b94de461aea-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3b86914c-f3af-4d62-b57a-9b94de461aea" (UID: "3b86914c-f3af-4d62-b57a-9b94de461aea"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 09:01:29 crc kubenswrapper[4895]: I1202 09:01:29.261477 4895 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3b86914c-f3af-4d62-b57a-9b94de461aea-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 02 09:01:29 crc kubenswrapper[4895]: I1202 09:01:29.289854 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-dllp7"]
Dec 02 09:01:29 crc kubenswrapper[4895]: I1202 09:01:29.290161 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-dllp7" podUID="880f2543-f0c5-4665-b094-13baea7fbf31" containerName="registry-server" containerID="cri-o://c411a513681f3a6ae8540e0230e3d2c813dabd28c4e1ad583010175266e4f485" gracePeriod=2
Dec 02 09:01:29 crc kubenswrapper[4895]: I1202 09:01:29.297667 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b86914c-f3af-4d62-b57a-9b94de461aea-kube-api-access-ssfw2" (OuterVolumeSpecName: "kube-api-access-ssfw2") pod "3b86914c-f3af-4d62-b57a-9b94de461aea" (UID: "3b86914c-f3af-4d62-b57a-9b94de461aea"). InnerVolumeSpecName "kube-api-access-ssfw2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 09:01:29 crc kubenswrapper[4895]: I1202 09:01:29.365442 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ssfw2\" (UniqueName: \"kubernetes.io/projected/3b86914c-f3af-4d62-b57a-9b94de461aea-kube-api-access-ssfw2\") on node \"crc\" DevicePath \"\""
Dec 02 09:01:29 crc kubenswrapper[4895]: I1202 09:01:29.756303 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dllp7"
Dec 02 09:01:29 crc kubenswrapper[4895]: I1202 09:01:29.830463 4895 generic.go:334] "Generic (PLEG): container finished" podID="880f2543-f0c5-4665-b094-13baea7fbf31" containerID="c411a513681f3a6ae8540e0230e3d2c813dabd28c4e1ad583010175266e4f485" exitCode=0
Dec 02 09:01:29 crc kubenswrapper[4895]: I1202 09:01:29.830549 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dllp7" event={"ID":"880f2543-f0c5-4665-b094-13baea7fbf31","Type":"ContainerDied","Data":"c411a513681f3a6ae8540e0230e3d2c813dabd28c4e1ad583010175266e4f485"}
Dec 02 09:01:29 crc kubenswrapper[4895]: I1202 09:01:29.830576 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dllp7"
Dec 02 09:01:29 crc kubenswrapper[4895]: I1202 09:01:29.830627 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dllp7" event={"ID":"880f2543-f0c5-4665-b094-13baea7fbf31","Type":"ContainerDied","Data":"d8050c9161990da5fc241c1326b0e8c2101502c29f60891c952b7ceeb1bd37bf"}
Dec 02 09:01:29 crc kubenswrapper[4895]: I1202 09:01:29.830661 4895 scope.go:117] "RemoveContainer" containerID="c411a513681f3a6ae8540e0230e3d2c813dabd28c4e1ad583010175266e4f485"
Dec 02 09:01:29 crc kubenswrapper[4895]: I1202 09:01:29.834990 4895 generic.go:334] "Generic (PLEG): container finished" podID="a4774a86-a00e-4910-91eb-34fb42e710ef" containerID="285b94e07574b2f07f23ebbcd8f4389c2d205eb23d9b7f75fb2e44410260b53a" exitCode=0
Dec 02 09:01:29 crc kubenswrapper[4895]: I1202 09:01:29.835087 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-9e34-account-create-update-5c4v7" event={"ID":"a4774a86-a00e-4910-91eb-34fb42e710ef","Type":"ContainerDied","Data":"285b94e07574b2f07f23ebbcd8f4389c2d205eb23d9b7f75fb2e44410260b53a"}
Dec 02 09:01:29 crc kubenswrapper[4895]: I1202 09:01:29.854896 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-create-8kqxg"
Dec 02 09:01:29 crc kubenswrapper[4895]: I1202 09:01:29.854977 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-create-8kqxg" event={"ID":"3b86914c-f3af-4d62-b57a-9b94de461aea","Type":"ContainerDied","Data":"22795952a1ccafcf4fad59b3fb59998fa28085486e2084ae5151533610f3e425"}
Dec 02 09:01:29 crc kubenswrapper[4895]: I1202 09:01:29.855038 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="22795952a1ccafcf4fad59b3fb59998fa28085486e2084ae5151533610f3e425"
Dec 02 09:01:29 crc kubenswrapper[4895]: I1202 09:01:29.889998 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/880f2543-f0c5-4665-b094-13baea7fbf31-catalog-content\") pod \"880f2543-f0c5-4665-b094-13baea7fbf31\" (UID: \"880f2543-f0c5-4665-b094-13baea7fbf31\") "
Dec 02 09:01:29 crc kubenswrapper[4895]: I1202 09:01:29.890240 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/880f2543-f0c5-4665-b094-13baea7fbf31-utilities\") pod \"880f2543-f0c5-4665-b094-13baea7fbf31\" (UID: \"880f2543-f0c5-4665-b094-13baea7fbf31\") "
Dec 02 09:01:29 crc kubenswrapper[4895]: I1202 09:01:29.890390 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q6t6z\" (UniqueName: \"kubernetes.io/projected/880f2543-f0c5-4665-b094-13baea7fbf31-kube-api-access-q6t6z\") pod \"880f2543-f0c5-4665-b094-13baea7fbf31\" (UID: \"880f2543-f0c5-4665-b094-13baea7fbf31\") "
Dec 02 09:01:29 crc kubenswrapper[4895]: I1202 09:01:29.900278 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/880f2543-f0c5-4665-b094-13baea7fbf31-kube-api-access-q6t6z" (OuterVolumeSpecName: "kube-api-access-q6t6z") pod "880f2543-f0c5-4665-b094-13baea7fbf31" (UID: "880f2543-f0c5-4665-b094-13baea7fbf31"). InnerVolumeSpecName "kube-api-access-q6t6z". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 09:01:29 crc kubenswrapper[4895]: I1202 09:01:29.900570 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/880f2543-f0c5-4665-b094-13baea7fbf31-utilities" (OuterVolumeSpecName: "utilities") pod "880f2543-f0c5-4665-b094-13baea7fbf31" (UID: "880f2543-f0c5-4665-b094-13baea7fbf31"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 02 09:01:29 crc kubenswrapper[4895]: I1202 09:01:29.916072 4895 scope.go:117] "RemoveContainer" containerID="9780216855c7f3d26d315f9e9902505932d2d17691005c0a1b7e68ea287121f5"
Dec 02 09:01:29 crc kubenswrapper[4895]: I1202 09:01:29.960933 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/880f2543-f0c5-4665-b094-13baea7fbf31-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "880f2543-f0c5-4665-b094-13baea7fbf31" (UID: "880f2543-f0c5-4665-b094-13baea7fbf31"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 02 09:01:29 crc kubenswrapper[4895]: I1202 09:01:29.974394 4895 scope.go:117] "RemoveContainer" containerID="3c19419cbcfc0954e6661d9f29aa9e855dbe34a664bb95767679a8cbd8f65ebb"
Dec 02 09:01:29 crc kubenswrapper[4895]: I1202 09:01:29.993701 4895 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/880f2543-f0c5-4665-b094-13baea7fbf31-utilities\") on node \"crc\" DevicePath \"\""
Dec 02 09:01:29 crc kubenswrapper[4895]: I1202 09:01:29.993755 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q6t6z\" (UniqueName: \"kubernetes.io/projected/880f2543-f0c5-4665-b094-13baea7fbf31-kube-api-access-q6t6z\") on node \"crc\" DevicePath \"\""
Dec 02 09:01:29 crc kubenswrapper[4895]: I1202 09:01:29.993769 4895 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/880f2543-f0c5-4665-b094-13baea7fbf31-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 02 09:01:30 crc kubenswrapper[4895]: I1202 09:01:30.004289 4895 scope.go:117] "RemoveContainer" containerID="c411a513681f3a6ae8540e0230e3d2c813dabd28c4e1ad583010175266e4f485"
Dec 02 09:01:30 crc kubenswrapper[4895]: E1202 09:01:30.005248 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c411a513681f3a6ae8540e0230e3d2c813dabd28c4e1ad583010175266e4f485\": container with ID starting with c411a513681f3a6ae8540e0230e3d2c813dabd28c4e1ad583010175266e4f485 not found: ID does not exist" containerID="c411a513681f3a6ae8540e0230e3d2c813dabd28c4e1ad583010175266e4f485"
Dec 02 09:01:30 crc kubenswrapper[4895]: I1202 09:01:30.005317 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c411a513681f3a6ae8540e0230e3d2c813dabd28c4e1ad583010175266e4f485"} err="failed to get container status \"c411a513681f3a6ae8540e0230e3d2c813dabd28c4e1ad583010175266e4f485\": rpc error: code = NotFound desc = could not find container \"c411a513681f3a6ae8540e0230e3d2c813dabd28c4e1ad583010175266e4f485\": container with ID starting with c411a513681f3a6ae8540e0230e3d2c813dabd28c4e1ad583010175266e4f485 not found: ID does not exist"
Dec 02 09:01:30 crc kubenswrapper[4895]: I1202 09:01:30.005360 4895 scope.go:117] "RemoveContainer" containerID="9780216855c7f3d26d315f9e9902505932d2d17691005c0a1b7e68ea287121f5"
Dec 02 09:01:30 crc kubenswrapper[4895]: E1202 09:01:30.008926 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9780216855c7f3d26d315f9e9902505932d2d17691005c0a1b7e68ea287121f5\": container with ID starting with 9780216855c7f3d26d315f9e9902505932d2d17691005c0a1b7e68ea287121f5 not found: ID does not exist" containerID="9780216855c7f3d26d315f9e9902505932d2d17691005c0a1b7e68ea287121f5"
Dec 02 09:01:30 crc kubenswrapper[4895]: I1202 09:01:30.008991 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9780216855c7f3d26d315f9e9902505932d2d17691005c0a1b7e68ea287121f5"} err="failed to get container status \"9780216855c7f3d26d315f9e9902505932d2d17691005c0a1b7e68ea287121f5\": rpc error: code = NotFound desc = could not find container \"9780216855c7f3d26d315f9e9902505932d2d17691005c0a1b7e68ea287121f5\": container with ID starting with 9780216855c7f3d26d315f9e9902505932d2d17691005c0a1b7e68ea287121f5 not found: ID does not exist"
Dec 02 09:01:30 crc kubenswrapper[4895]: I1202 09:01:30.009023 4895 scope.go:117] "RemoveContainer" containerID="3c19419cbcfc0954e6661d9f29aa9e855dbe34a664bb95767679a8cbd8f65ebb"
Dec 02 09:01:30 crc kubenswrapper[4895]: E1202 09:01:30.009629 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c19419cbcfc0954e6661d9f29aa9e855dbe34a664bb95767679a8cbd8f65ebb\": container with ID starting with 3c19419cbcfc0954e6661d9f29aa9e855dbe34a664bb95767679a8cbd8f65ebb not found: ID does not exist" containerID="3c19419cbcfc0954e6661d9f29aa9e855dbe34a664bb95767679a8cbd8f65ebb"
Dec 02 09:01:30 crc kubenswrapper[4895]: I1202 09:01:30.009680 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c19419cbcfc0954e6661d9f29aa9e855dbe34a664bb95767679a8cbd8f65ebb"} err="failed to get container status \"3c19419cbcfc0954e6661d9f29aa9e855dbe34a664bb95767679a8cbd8f65ebb\": rpc error: code = NotFound desc = could not find container \"3c19419cbcfc0954e6661d9f29aa9e855dbe34a664bb95767679a8cbd8f65ebb\": container with ID starting with 3c19419cbcfc0954e6661d9f29aa9e855dbe34a664bb95767679a8cbd8f65ebb not found: ID does not exist"
Dec 02 09:01:30 crc kubenswrapper[4895]: I1202 09:01:30.171950 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-dllp7"]
Dec 02 09:01:30 crc kubenswrapper[4895]: I1202 09:01:30.185271 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-dllp7"]
Dec 02 09:01:30 crc kubenswrapper[4895]: I1202 09:01:30.468093 4895 scope.go:117] "RemoveContainer" containerID="2dbf99bfd146113519ce737354e8a75a92110def9bd511e5dca1d8d1ae7e3434"
Dec 02 09:01:30 crc kubenswrapper[4895]: I1202 09:01:30.498122 4895 scope.go:117] "RemoveContainer" containerID="f87ceeedd521a992123bb2260bb4ac98491bd716dbc831a0b8357008ccf19d6b"
Dec 02 09:01:30 crc kubenswrapper[4895]: I1202 09:01:30.542559 4895 scope.go:117] "RemoveContainer" containerID="8af73bd21f4459f704ff31f24c06903e8ad8671a2957bf5bcb32c908b4f977db"
Dec 02 09:01:31 crc kubenswrapper[4895]: I1202 09:01:31.166422 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="880f2543-f0c5-4665-b094-13baea7fbf31" path="/var/lib/kubelet/pods/880f2543-f0c5-4665-b094-13baea7fbf31/volumes"
Dec 02 09:01:31 crc kubenswrapper[4895]: I1202 09:01:31.346443 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-9e34-account-create-update-5c4v7"
Dec 02 09:01:31 crc kubenswrapper[4895]: I1202 09:01:31.424678 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a4774a86-a00e-4910-91eb-34fb42e710ef-operator-scripts\") pod \"a4774a86-a00e-4910-91eb-34fb42e710ef\" (UID: \"a4774a86-a00e-4910-91eb-34fb42e710ef\") "
Dec 02 09:01:31 crc kubenswrapper[4895]: I1202 09:01:31.424891 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-26dbz\" (UniqueName: \"kubernetes.io/projected/a4774a86-a00e-4910-91eb-34fb42e710ef-kube-api-access-26dbz\") pod \"a4774a86-a00e-4910-91eb-34fb42e710ef\" (UID: \"a4774a86-a00e-4910-91eb-34fb42e710ef\") "
Dec 02 09:01:31 crc kubenswrapper[4895]: I1202 09:01:31.426021 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4774a86-a00e-4910-91eb-34fb42e710ef-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a4774a86-a00e-4910-91eb-34fb42e710ef" (UID: "a4774a86-a00e-4910-91eb-34fb42e710ef"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 09:01:31 crc kubenswrapper[4895]: I1202 09:01:31.445043 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4774a86-a00e-4910-91eb-34fb42e710ef-kube-api-access-26dbz" (OuterVolumeSpecName: "kube-api-access-26dbz") pod "a4774a86-a00e-4910-91eb-34fb42e710ef" (UID: "a4774a86-a00e-4910-91eb-34fb42e710ef"). InnerVolumeSpecName "kube-api-access-26dbz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 09:01:31 crc kubenswrapper[4895]: I1202 09:01:31.528446 4895 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a4774a86-a00e-4910-91eb-34fb42e710ef-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 02 09:01:31 crc kubenswrapper[4895]: I1202 09:01:31.528527 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-26dbz\" (UniqueName: \"kubernetes.io/projected/a4774a86-a00e-4910-91eb-34fb42e710ef-kube-api-access-26dbz\") on node \"crc\" DevicePath \"\""
Dec 02 09:01:31 crc kubenswrapper[4895]: I1202 09:01:31.883411 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-9e34-account-create-update-5c4v7" event={"ID":"a4774a86-a00e-4910-91eb-34fb42e710ef","Type":"ContainerDied","Data":"1a03b5752db57ab9a72e9b6a1dc96a0eb1f097c5ad5241b341600c2a71e39796"}
Dec 02 09:01:31 crc kubenswrapper[4895]: I1202 09:01:31.883461 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1a03b5752db57ab9a72e9b6a1dc96a0eb1f097c5ad5241b341600c2a71e39796"
Dec 02 09:01:31 crc kubenswrapper[4895]: I1202 09:01:31.883532 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-9e34-account-create-update-5c4v7"
Dec 02 09:01:32 crc kubenswrapper[4895]: I1202 09:01:32.141415 4895 scope.go:117] "RemoveContainer" containerID="d95d9d738930381cc247f4992573f82ec7c89226f23be0d771d1a3b1b3cf1812"
Dec 02 09:01:32 crc kubenswrapper[4895]: E1202 09:01:32.142204 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0"
Dec 02 09:01:34 crc kubenswrapper[4895]: I1202 09:01:34.153557 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-persistence-db-create-jwddv"]
Dec 02 09:01:34 crc kubenswrapper[4895]: E1202 09:01:34.154480 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b86914c-f3af-4d62-b57a-9b94de461aea" containerName="mariadb-database-create"
Dec 02 09:01:34 crc kubenswrapper[4895]: I1202 09:01:34.154492 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b86914c-f3af-4d62-b57a-9b94de461aea" containerName="mariadb-database-create"
Dec 02 09:01:34 crc kubenswrapper[4895]: E1202 09:01:34.154508 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4774a86-a00e-4910-91eb-34fb42e710ef" containerName="mariadb-account-create-update"
Dec 02 09:01:34 crc kubenswrapper[4895]: I1202 09:01:34.154514 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4774a86-a00e-4910-91eb-34fb42e710ef" containerName="mariadb-account-create-update"
Dec 02 09:01:34 crc kubenswrapper[4895]: E1202 09:01:34.154544 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="880f2543-f0c5-4665-b094-13baea7fbf31" containerName="extract-content"
Dec 02 09:01:34 crc kubenswrapper[4895]: I1202 09:01:34.154551 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="880f2543-f0c5-4665-b094-13baea7fbf31" containerName="extract-content"
Dec 02 09:01:34 crc kubenswrapper[4895]: E1202 09:01:34.154561 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="880f2543-f0c5-4665-b094-13baea7fbf31" containerName="registry-server"
Dec 02 09:01:34 crc kubenswrapper[4895]: I1202 09:01:34.154569 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="880f2543-f0c5-4665-b094-13baea7fbf31" containerName="registry-server"
Dec 02 09:01:34 crc kubenswrapper[4895]: E1202 09:01:34.154583 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="880f2543-f0c5-4665-b094-13baea7fbf31" containerName="extract-utilities"
Dec 02 09:01:34 crc kubenswrapper[4895]: I1202 09:01:34.154589 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="880f2543-f0c5-4665-b094-13baea7fbf31" containerName="extract-utilities"
Dec 02 09:01:34 crc kubenswrapper[4895]: I1202 09:01:34.154819 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="880f2543-f0c5-4665-b094-13baea7fbf31" containerName="registry-server"
Dec 02 09:01:34 crc kubenswrapper[4895]: I1202 09:01:34.154831 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4774a86-a00e-4910-91eb-34fb42e710ef" containerName="mariadb-account-create-update"
Dec 02 09:01:34 crc kubenswrapper[4895]: I1202 09:01:34.154849 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b86914c-f3af-4d62-b57a-9b94de461aea" containerName="mariadb-database-create"
Dec 02 09:01:34 crc kubenswrapper[4895]: I1202 09:01:34.155691 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-persistence-db-create-jwddv"
Dec 02 09:01:34 crc kubenswrapper[4895]: I1202 09:01:34.181565 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-persistence-db-create-jwddv"]
Dec 02 09:01:34 crc kubenswrapper[4895]: I1202 09:01:34.337325 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2bd2ca4e-e47c-49aa-827f-4ecf5760939e-operator-scripts\") pod \"octavia-persistence-db-create-jwddv\" (UID: \"2bd2ca4e-e47c-49aa-827f-4ecf5760939e\") " pod="openstack/octavia-persistence-db-create-jwddv"
Dec 02 09:01:34 crc kubenswrapper[4895]: I1202 09:01:34.338039 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmv26\" (UniqueName: \"kubernetes.io/projected/2bd2ca4e-e47c-49aa-827f-4ecf5760939e-kube-api-access-wmv26\") pod \"octavia-persistence-db-create-jwddv\" (UID: \"2bd2ca4e-e47c-49aa-827f-4ecf5760939e\") " pod="openstack/octavia-persistence-db-create-jwddv"
Dec 02 09:01:34 crc kubenswrapper[4895]: I1202 09:01:34.440316 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2bd2ca4e-e47c-49aa-827f-4ecf5760939e-operator-scripts\") pod \"octavia-persistence-db-create-jwddv\" (UID: \"2bd2ca4e-e47c-49aa-827f-4ecf5760939e\") " pod="openstack/octavia-persistence-db-create-jwddv"
Dec 02 09:01:34 crc kubenswrapper[4895]: I1202 09:01:34.440412 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmv26\" (UniqueName: \"kubernetes.io/projected/2bd2ca4e-e47c-49aa-827f-4ecf5760939e-kube-api-access-wmv26\") pod \"octavia-persistence-db-create-jwddv\" (UID: \"2bd2ca4e-e47c-49aa-827f-4ecf5760939e\") " pod="openstack/octavia-persistence-db-create-jwddv"
Dec 02 09:01:34 crc kubenswrapper[4895]: I1202 09:01:34.441266 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2bd2ca4e-e47c-49aa-827f-4ecf5760939e-operator-scripts\") pod \"octavia-persistence-db-create-jwddv\" (UID: \"2bd2ca4e-e47c-49aa-827f-4ecf5760939e\") " pod="openstack/octavia-persistence-db-create-jwddv"
Dec 02 09:01:34 crc kubenswrapper[4895]: I1202 09:01:34.465902 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmv26\" (UniqueName: \"kubernetes.io/projected/2bd2ca4e-e47c-49aa-827f-4ecf5760939e-kube-api-access-wmv26\") pod \"octavia-persistence-db-create-jwddv\" (UID: \"2bd2ca4e-e47c-49aa-827f-4ecf5760939e\") " pod="openstack/octavia-persistence-db-create-jwddv"
Dec 02 09:01:34 crc kubenswrapper[4895]: I1202 09:01:34.473147 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-persistence-db-create-jwddv"
Dec 02 09:01:34 crc kubenswrapper[4895]: I1202 09:01:34.939535 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-persistence-db-create-jwddv"]
Dec 02 09:01:35 crc kubenswrapper[4895]: I1202 09:01:35.157521 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-c481-account-create-update-dgk2p"]
Dec 02 09:01:35 crc kubenswrapper[4895]: I1202 09:01:35.159215 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-c481-account-create-update-dgk2p"
Dec 02 09:01:35 crc kubenswrapper[4895]: I1202 09:01:35.163840 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-c481-account-create-update-dgk2p"]
Dec 02 09:01:35 crc kubenswrapper[4895]: I1202 09:01:35.165399 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-persistence-db-secret"
Dec 02 09:01:35 crc kubenswrapper[4895]: I1202 09:01:35.260241 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5h4z\" (UniqueName: \"kubernetes.io/projected/c0441a95-f89f-481e-834d-f508174764ae-kube-api-access-h5h4z\") pod \"octavia-c481-account-create-update-dgk2p\" (UID: \"c0441a95-f89f-481e-834d-f508174764ae\") " pod="openstack/octavia-c481-account-create-update-dgk2p"
Dec 02 09:01:35 crc kubenswrapper[4895]: I1202 09:01:35.260310 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c0441a95-f89f-481e-834d-f508174764ae-operator-scripts\") pod \"octavia-c481-account-create-update-dgk2p\" (UID: \"c0441a95-f89f-481e-834d-f508174764ae\") " pod="openstack/octavia-c481-account-create-update-dgk2p"
Dec 02 09:01:35 crc kubenswrapper[4895]: I1202 09:01:35.362254 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c0441a95-f89f-481e-834d-f508174764ae-operator-scripts\") pod \"octavia-c481-account-create-update-dgk2p\" (UID: \"c0441a95-f89f-481e-834d-f508174764ae\") " pod="openstack/octavia-c481-account-create-update-dgk2p"
Dec 02 09:01:35 crc kubenswrapper[4895]: I1202 09:01:35.362437 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h5h4z\" (UniqueName:
\"kubernetes.io/projected/c0441a95-f89f-481e-834d-f508174764ae-kube-api-access-h5h4z\") pod \"octavia-c481-account-create-update-dgk2p\" (UID: \"c0441a95-f89f-481e-834d-f508174764ae\") " pod="openstack/octavia-c481-account-create-update-dgk2p" Dec 02 09:01:35 crc kubenswrapper[4895]: I1202 09:01:35.363154 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c0441a95-f89f-481e-834d-f508174764ae-operator-scripts\") pod \"octavia-c481-account-create-update-dgk2p\" (UID: \"c0441a95-f89f-481e-834d-f508174764ae\") " pod="openstack/octavia-c481-account-create-update-dgk2p" Dec 02 09:01:35 crc kubenswrapper[4895]: I1202 09:01:35.388479 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5h4z\" (UniqueName: \"kubernetes.io/projected/c0441a95-f89f-481e-834d-f508174764ae-kube-api-access-h5h4z\") pod \"octavia-c481-account-create-update-dgk2p\" (UID: \"c0441a95-f89f-481e-834d-f508174764ae\") " pod="openstack/octavia-c481-account-create-update-dgk2p" Dec 02 09:01:35 crc kubenswrapper[4895]: I1202 09:01:35.503773 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-c481-account-create-update-dgk2p" Dec 02 09:01:35 crc kubenswrapper[4895]: I1202 09:01:35.937147 4895 generic.go:334] "Generic (PLEG): container finished" podID="2bd2ca4e-e47c-49aa-827f-4ecf5760939e" containerID="8550221e31485856ef3de33c5649eecefe408e448419218a943a0f463d043ec4" exitCode=0 Dec 02 09:01:35 crc kubenswrapper[4895]: I1202 09:01:35.937425 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-persistence-db-create-jwddv" event={"ID":"2bd2ca4e-e47c-49aa-827f-4ecf5760939e","Type":"ContainerDied","Data":"8550221e31485856ef3de33c5649eecefe408e448419218a943a0f463d043ec4"} Dec 02 09:01:35 crc kubenswrapper[4895]: I1202 09:01:35.937541 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-persistence-db-create-jwddv" event={"ID":"2bd2ca4e-e47c-49aa-827f-4ecf5760939e","Type":"ContainerStarted","Data":"fe9a8fdb9a8e9ee166f505c4a3eba9ba94d430207d8fd8e44f58963977395ffd"} Dec 02 09:01:35 crc kubenswrapper[4895]: I1202 09:01:35.991593 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-c481-account-create-update-dgk2p"] Dec 02 09:01:35 crc kubenswrapper[4895]: W1202 09:01:35.998693 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc0441a95_f89f_481e_834d_f508174764ae.slice/crio-1470fb18769c7d46bc91523bcbbaa2c5da6e525983a13c5d21f08bb67ed85ad7 WatchSource:0}: Error finding container 1470fb18769c7d46bc91523bcbbaa2c5da6e525983a13c5d21f08bb67ed85ad7: Status 404 returned error can't find the container with id 1470fb18769c7d46bc91523bcbbaa2c5da6e525983a13c5d21f08bb67ed85ad7 Dec 02 09:01:36 crc kubenswrapper[4895]: I1202 09:01:36.954436 4895 generic.go:334] "Generic (PLEG): container finished" podID="c0441a95-f89f-481e-834d-f508174764ae" containerID="f048d3a2d0cd2fc93ea28d7c1aeba014683e8b1518540c2cf6daa121b589fb64" exitCode=0 Dec 02 09:01:36 crc kubenswrapper[4895]: I1202 
09:01:36.954541 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-c481-account-create-update-dgk2p" event={"ID":"c0441a95-f89f-481e-834d-f508174764ae","Type":"ContainerDied","Data":"f048d3a2d0cd2fc93ea28d7c1aeba014683e8b1518540c2cf6daa121b589fb64"} Dec 02 09:01:36 crc kubenswrapper[4895]: I1202 09:01:36.955169 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-c481-account-create-update-dgk2p" event={"ID":"c0441a95-f89f-481e-834d-f508174764ae","Type":"ContainerStarted","Data":"1470fb18769c7d46bc91523bcbbaa2c5da6e525983a13c5d21f08bb67ed85ad7"} Dec 02 09:01:37 crc kubenswrapper[4895]: I1202 09:01:37.334407 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-persistence-db-create-jwddv" Dec 02 09:01:37 crc kubenswrapper[4895]: I1202 09:01:37.505189 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wmv26\" (UniqueName: \"kubernetes.io/projected/2bd2ca4e-e47c-49aa-827f-4ecf5760939e-kube-api-access-wmv26\") pod \"2bd2ca4e-e47c-49aa-827f-4ecf5760939e\" (UID: \"2bd2ca4e-e47c-49aa-827f-4ecf5760939e\") " Dec 02 09:01:37 crc kubenswrapper[4895]: I1202 09:01:37.505290 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2bd2ca4e-e47c-49aa-827f-4ecf5760939e-operator-scripts\") pod \"2bd2ca4e-e47c-49aa-827f-4ecf5760939e\" (UID: \"2bd2ca4e-e47c-49aa-827f-4ecf5760939e\") " Dec 02 09:01:37 crc kubenswrapper[4895]: I1202 09:01:37.506342 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2bd2ca4e-e47c-49aa-827f-4ecf5760939e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2bd2ca4e-e47c-49aa-827f-4ecf5760939e" (UID: "2bd2ca4e-e47c-49aa-827f-4ecf5760939e"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:01:37 crc kubenswrapper[4895]: I1202 09:01:37.511465 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2bd2ca4e-e47c-49aa-827f-4ecf5760939e-kube-api-access-wmv26" (OuterVolumeSpecName: "kube-api-access-wmv26") pod "2bd2ca4e-e47c-49aa-827f-4ecf5760939e" (UID: "2bd2ca4e-e47c-49aa-827f-4ecf5760939e"). InnerVolumeSpecName "kube-api-access-wmv26". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:01:37 crc kubenswrapper[4895]: I1202 09:01:37.609591 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wmv26\" (UniqueName: \"kubernetes.io/projected/2bd2ca4e-e47c-49aa-827f-4ecf5760939e-kube-api-access-wmv26\") on node \"crc\" DevicePath \"\"" Dec 02 09:01:37 crc kubenswrapper[4895]: I1202 09:01:37.609951 4895 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2bd2ca4e-e47c-49aa-827f-4ecf5760939e-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 09:01:37 crc kubenswrapper[4895]: I1202 09:01:37.967485 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-persistence-db-create-jwddv" event={"ID":"2bd2ca4e-e47c-49aa-827f-4ecf5760939e","Type":"ContainerDied","Data":"fe9a8fdb9a8e9ee166f505c4a3eba9ba94d430207d8fd8e44f58963977395ffd"} Dec 02 09:01:37 crc kubenswrapper[4895]: I1202 09:01:37.968869 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fe9a8fdb9a8e9ee166f505c4a3eba9ba94d430207d8fd8e44f58963977395ffd" Dec 02 09:01:37 crc kubenswrapper[4895]: I1202 09:01:37.967602 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-persistence-db-create-jwddv" Dec 02 09:01:38 crc kubenswrapper[4895]: I1202 09:01:38.347909 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-c481-account-create-update-dgk2p" Dec 02 09:01:38 crc kubenswrapper[4895]: I1202 09:01:38.533709 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c0441a95-f89f-481e-834d-f508174764ae-operator-scripts\") pod \"c0441a95-f89f-481e-834d-f508174764ae\" (UID: \"c0441a95-f89f-481e-834d-f508174764ae\") " Dec 02 09:01:38 crc kubenswrapper[4895]: I1202 09:01:38.533831 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h5h4z\" (UniqueName: \"kubernetes.io/projected/c0441a95-f89f-481e-834d-f508174764ae-kube-api-access-h5h4z\") pod \"c0441a95-f89f-481e-834d-f508174764ae\" (UID: \"c0441a95-f89f-481e-834d-f508174764ae\") " Dec 02 09:01:38 crc kubenswrapper[4895]: I1202 09:01:38.534518 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0441a95-f89f-481e-834d-f508174764ae-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c0441a95-f89f-481e-834d-f508174764ae" (UID: "c0441a95-f89f-481e-834d-f508174764ae"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:01:38 crc kubenswrapper[4895]: I1202 09:01:38.539701 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0441a95-f89f-481e-834d-f508174764ae-kube-api-access-h5h4z" (OuterVolumeSpecName: "kube-api-access-h5h4z") pod "c0441a95-f89f-481e-834d-f508174764ae" (UID: "c0441a95-f89f-481e-834d-f508174764ae"). InnerVolumeSpecName "kube-api-access-h5h4z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:01:38 crc kubenswrapper[4895]: I1202 09:01:38.636881 4895 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c0441a95-f89f-481e-834d-f508174764ae-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 09:01:38 crc kubenswrapper[4895]: I1202 09:01:38.636923 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h5h4z\" (UniqueName: \"kubernetes.io/projected/c0441a95-f89f-481e-834d-f508174764ae-kube-api-access-h5h4z\") on node \"crc\" DevicePath \"\"" Dec 02 09:01:38 crc kubenswrapper[4895]: I1202 09:01:38.979333 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-c481-account-create-update-dgk2p" event={"ID":"c0441a95-f89f-481e-834d-f508174764ae","Type":"ContainerDied","Data":"1470fb18769c7d46bc91523bcbbaa2c5da6e525983a13c5d21f08bb67ed85ad7"} Dec 02 09:01:38 crc kubenswrapper[4895]: I1202 09:01:38.979381 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1470fb18769c7d46bc91523bcbbaa2c5da6e525983a13c5d21f08bb67ed85ad7" Dec 02 09:01:38 crc kubenswrapper[4895]: I1202 09:01:38.979449 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-c481-account-create-update-dgk2p" Dec 02 09:01:39 crc kubenswrapper[4895]: I1202 09:01:39.045295 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-4fwjz"] Dec 02 09:01:39 crc kubenswrapper[4895]: I1202 09:01:39.055273 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-4fwjz"] Dec 02 09:01:39 crc kubenswrapper[4895]: I1202 09:01:39.156830 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f55786e-cf8a-4ce7-affc-1952b6a2e1ad" path="/var/lib/kubelet/pods/5f55786e-cf8a-4ce7-affc-1952b6a2e1ad/volumes" Dec 02 09:01:40 crc kubenswrapper[4895]: I1202 09:01:40.920002 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-api-7b55b799f8-g962x"] Dec 02 09:01:40 crc kubenswrapper[4895]: E1202 09:01:40.922267 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bd2ca4e-e47c-49aa-827f-4ecf5760939e" containerName="mariadb-database-create" Dec 02 09:01:40 crc kubenswrapper[4895]: I1202 09:01:40.922289 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bd2ca4e-e47c-49aa-827f-4ecf5760939e" containerName="mariadb-database-create" Dec 02 09:01:40 crc kubenswrapper[4895]: E1202 09:01:40.922331 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0441a95-f89f-481e-834d-f508174764ae" containerName="mariadb-account-create-update" Dec 02 09:01:40 crc kubenswrapper[4895]: I1202 09:01:40.922339 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0441a95-f89f-481e-834d-f508174764ae" containerName="mariadb-account-create-update" Dec 02 09:01:40 crc kubenswrapper[4895]: I1202 09:01:40.922543 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="2bd2ca4e-e47c-49aa-827f-4ecf5760939e" containerName="mariadb-database-create" Dec 02 09:01:40 crc kubenswrapper[4895]: I1202 09:01:40.922567 4895 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="c0441a95-f89f-481e-834d-f508174764ae" containerName="mariadb-account-create-update" Dec 02 09:01:40 crc kubenswrapper[4895]: I1202 09:01:40.924346 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-api-7b55b799f8-g962x" Dec 02 09:01:40 crc kubenswrapper[4895]: I1202 09:01:40.927493 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-api-scripts" Dec 02 09:01:40 crc kubenswrapper[4895]: I1202 09:01:40.927679 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-api-config-data" Dec 02 09:01:40 crc kubenswrapper[4895]: I1202 09:01:40.937753 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-api-7b55b799f8-g962x"] Dec 02 09:01:40 crc kubenswrapper[4895]: I1202 09:01:40.942120 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-octavia-dockercfg-dnnlc" Dec 02 09:01:40 crc kubenswrapper[4895]: I1202 09:01:40.989691 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/8cccbb55-a2bc-4b2a-af20-1987d11430f0-octavia-run\") pod \"octavia-api-7b55b799f8-g962x\" (UID: \"8cccbb55-a2bc-4b2a-af20-1987d11430f0\") " pod="openstack/octavia-api-7b55b799f8-g962x" Dec 02 09:01:40 crc kubenswrapper[4895]: I1202 09:01:40.989949 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8cccbb55-a2bc-4b2a-af20-1987d11430f0-config-data\") pod \"octavia-api-7b55b799f8-g962x\" (UID: \"8cccbb55-a2bc-4b2a-af20-1987d11430f0\") " pod="openstack/octavia-api-7b55b799f8-g962x" Dec 02 09:01:40 crc kubenswrapper[4895]: I1202 09:01:40.990181 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/8cccbb55-a2bc-4b2a-af20-1987d11430f0-scripts\") pod \"octavia-api-7b55b799f8-g962x\" (UID: \"8cccbb55-a2bc-4b2a-af20-1987d11430f0\") " pod="openstack/octavia-api-7b55b799f8-g962x" Dec 02 09:01:40 crc kubenswrapper[4895]: I1202 09:01:40.990414 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/8cccbb55-a2bc-4b2a-af20-1987d11430f0-config-data-merged\") pod \"octavia-api-7b55b799f8-g962x\" (UID: \"8cccbb55-a2bc-4b2a-af20-1987d11430f0\") " pod="openstack/octavia-api-7b55b799f8-g962x" Dec 02 09:01:40 crc kubenswrapper[4895]: I1202 09:01:40.990715 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cccbb55-a2bc-4b2a-af20-1987d11430f0-combined-ca-bundle\") pod \"octavia-api-7b55b799f8-g962x\" (UID: \"8cccbb55-a2bc-4b2a-af20-1987d11430f0\") " pod="openstack/octavia-api-7b55b799f8-g962x" Dec 02 09:01:41 crc kubenswrapper[4895]: I1202 09:01:41.093813 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8cccbb55-a2bc-4b2a-af20-1987d11430f0-config-data\") pod \"octavia-api-7b55b799f8-g962x\" (UID: \"8cccbb55-a2bc-4b2a-af20-1987d11430f0\") " pod="openstack/octavia-api-7b55b799f8-g962x" Dec 02 09:01:41 crc kubenswrapper[4895]: I1202 09:01:41.093910 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8cccbb55-a2bc-4b2a-af20-1987d11430f0-scripts\") pod \"octavia-api-7b55b799f8-g962x\" (UID: \"8cccbb55-a2bc-4b2a-af20-1987d11430f0\") " pod="openstack/octavia-api-7b55b799f8-g962x" Dec 02 09:01:41 crc kubenswrapper[4895]: I1202 09:01:41.093947 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: 
\"kubernetes.io/empty-dir/8cccbb55-a2bc-4b2a-af20-1987d11430f0-config-data-merged\") pod \"octavia-api-7b55b799f8-g962x\" (UID: \"8cccbb55-a2bc-4b2a-af20-1987d11430f0\") " pod="openstack/octavia-api-7b55b799f8-g962x" Dec 02 09:01:41 crc kubenswrapper[4895]: I1202 09:01:41.094008 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cccbb55-a2bc-4b2a-af20-1987d11430f0-combined-ca-bundle\") pod \"octavia-api-7b55b799f8-g962x\" (UID: \"8cccbb55-a2bc-4b2a-af20-1987d11430f0\") " pod="openstack/octavia-api-7b55b799f8-g962x" Dec 02 09:01:41 crc kubenswrapper[4895]: I1202 09:01:41.094093 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/8cccbb55-a2bc-4b2a-af20-1987d11430f0-octavia-run\") pod \"octavia-api-7b55b799f8-g962x\" (UID: \"8cccbb55-a2bc-4b2a-af20-1987d11430f0\") " pod="openstack/octavia-api-7b55b799f8-g962x" Dec 02 09:01:41 crc kubenswrapper[4895]: I1202 09:01:41.094685 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/8cccbb55-a2bc-4b2a-af20-1987d11430f0-config-data-merged\") pod \"octavia-api-7b55b799f8-g962x\" (UID: \"8cccbb55-a2bc-4b2a-af20-1987d11430f0\") " pod="openstack/octavia-api-7b55b799f8-g962x" Dec 02 09:01:41 crc kubenswrapper[4895]: I1202 09:01:41.094868 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/8cccbb55-a2bc-4b2a-af20-1987d11430f0-octavia-run\") pod \"octavia-api-7b55b799f8-g962x\" (UID: \"8cccbb55-a2bc-4b2a-af20-1987d11430f0\") " pod="openstack/octavia-api-7b55b799f8-g962x" Dec 02 09:01:41 crc kubenswrapper[4895]: I1202 09:01:41.101936 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8cccbb55-a2bc-4b2a-af20-1987d11430f0-scripts\") pod 
\"octavia-api-7b55b799f8-g962x\" (UID: \"8cccbb55-a2bc-4b2a-af20-1987d11430f0\") " pod="openstack/octavia-api-7b55b799f8-g962x" Dec 02 09:01:41 crc kubenswrapper[4895]: I1202 09:01:41.102132 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8cccbb55-a2bc-4b2a-af20-1987d11430f0-config-data\") pod \"octavia-api-7b55b799f8-g962x\" (UID: \"8cccbb55-a2bc-4b2a-af20-1987d11430f0\") " pod="openstack/octavia-api-7b55b799f8-g962x" Dec 02 09:01:41 crc kubenswrapper[4895]: I1202 09:01:41.102519 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cccbb55-a2bc-4b2a-af20-1987d11430f0-combined-ca-bundle\") pod \"octavia-api-7b55b799f8-g962x\" (UID: \"8cccbb55-a2bc-4b2a-af20-1987d11430f0\") " pod="openstack/octavia-api-7b55b799f8-g962x" Dec 02 09:01:41 crc kubenswrapper[4895]: I1202 09:01:41.252462 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-api-7b55b799f8-g962x" Dec 02 09:01:41 crc kubenswrapper[4895]: I1202 09:01:41.888865 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-api-7b55b799f8-g962x"] Dec 02 09:01:42 crc kubenswrapper[4895]: I1202 09:01:42.032398 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-7b55b799f8-g962x" event={"ID":"8cccbb55-a2bc-4b2a-af20-1987d11430f0","Type":"ContainerStarted","Data":"8680ecfce4541d735ebb2b3858226ba97826071b03682e2db02c42861f4e86ea"} Dec 02 09:01:47 crc kubenswrapper[4895]: I1202 09:01:47.142121 4895 scope.go:117] "RemoveContainer" containerID="d95d9d738930381cc247f4992573f82ec7c89226f23be0d771d1a3b1b3cf1812" Dec 02 09:01:47 crc kubenswrapper[4895]: E1202 09:01:47.143291 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:01:53 crc kubenswrapper[4895]: I1202 09:01:53.150944 4895 generic.go:334] "Generic (PLEG): container finished" podID="8cccbb55-a2bc-4b2a-af20-1987d11430f0" containerID="c12de083684b14464a5b0b853d6757f678577191cb12ce593e8b0e79aebeb922" exitCode=0 Dec 02 09:01:53 crc kubenswrapper[4895]: I1202 09:01:53.163650 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-7b55b799f8-g962x" event={"ID":"8cccbb55-a2bc-4b2a-af20-1987d11430f0","Type":"ContainerDied","Data":"c12de083684b14464a5b0b853d6757f678577191cb12ce593e8b0e79aebeb922"} Dec 02 09:01:54 crc kubenswrapper[4895]: I1202 09:01:54.176278 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-7b55b799f8-g962x" event={"ID":"8cccbb55-a2bc-4b2a-af20-1987d11430f0","Type":"ContainerStarted","Data":"925c40dc84a7dfbcd9c66d39f97ec029ba4652be1de39078ac77225c1d17f0d3"} Dec 02 09:01:54 crc kubenswrapper[4895]: I1202 09:01:54.176747 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-7b55b799f8-g962x" event={"ID":"8cccbb55-a2bc-4b2a-af20-1987d11430f0","Type":"ContainerStarted","Data":"193ee367311160567b5841f29c8765425f70b297b43edd29dfbb071e25028f28"} Dec 02 09:01:54 crc kubenswrapper[4895]: I1202 09:01:54.177952 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-api-7b55b799f8-g962x" Dec 02 09:01:54 crc kubenswrapper[4895]: I1202 09:01:54.177982 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-api-7b55b799f8-g962x" Dec 02 09:01:54 crc kubenswrapper[4895]: I1202 09:01:54.221265 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-api-7b55b799f8-g962x" podStartSLOduration=4.105935226 
podStartE2EDuration="14.221231316s" podCreationTimestamp="2025-12-02 09:01:40 +0000 UTC" firstStartedPulling="2025-12-02 09:01:41.902080644 +0000 UTC m=+5913.072940257" lastFinishedPulling="2025-12-02 09:01:52.017376734 +0000 UTC m=+5923.188236347" observedRunningTime="2025-12-02 09:01:54.217057116 +0000 UTC m=+5925.387916749" watchObservedRunningTime="2025-12-02 09:01:54.221231316 +0000 UTC m=+5925.392090949" Dec 02 09:01:58 crc kubenswrapper[4895]: I1202 09:01:58.141618 4895 scope.go:117] "RemoveContainer" containerID="d95d9d738930381cc247f4992573f82ec7c89226f23be0d771d1a3b1b3cf1812" Dec 02 09:01:58 crc kubenswrapper[4895]: E1202 09:01:58.144086 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:01:58 crc kubenswrapper[4895]: I1202 09:01:58.317113 4895 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-fp6lz" podUID="6389707d-0e93-4457-ae41-4da59350383e" containerName="ovn-controller" probeResult="failure" output=< Dec 02 09:01:58 crc kubenswrapper[4895]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Dec 02 09:01:58 crc kubenswrapper[4895]: > Dec 02 09:01:58 crc kubenswrapper[4895]: I1202 09:01:58.405134 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-kgxnn" Dec 02 09:01:58 crc kubenswrapper[4895]: I1202 09:01:58.424565 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-kgxnn" Dec 02 09:01:58 crc kubenswrapper[4895]: I1202 09:01:58.561701 4895 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/ovn-controller-fp6lz-config-zl6vw"] Dec 02 09:01:58 crc kubenswrapper[4895]: I1202 09:01:58.563494 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-fp6lz-config-zl6vw" Dec 02 09:01:58 crc kubenswrapper[4895]: I1202 09:01:58.570356 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Dec 02 09:01:58 crc kubenswrapper[4895]: I1202 09:01:58.584944 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-fp6lz-config-zl6vw"] Dec 02 09:01:58 crc kubenswrapper[4895]: I1202 09:01:58.629815 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/6d473545-1bab-49c2-b773-351b9c9267df-additional-scripts\") pod \"ovn-controller-fp6lz-config-zl6vw\" (UID: \"6d473545-1bab-49c2-b773-351b9c9267df\") " pod="openstack/ovn-controller-fp6lz-config-zl6vw" Dec 02 09:01:58 crc kubenswrapper[4895]: I1202 09:01:58.630372 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/6d473545-1bab-49c2-b773-351b9c9267df-var-run-ovn\") pod \"ovn-controller-fp6lz-config-zl6vw\" (UID: \"6d473545-1bab-49c2-b773-351b9c9267df\") " pod="openstack/ovn-controller-fp6lz-config-zl6vw" Dec 02 09:01:58 crc kubenswrapper[4895]: I1202 09:01:58.630442 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/6d473545-1bab-49c2-b773-351b9c9267df-var-log-ovn\") pod \"ovn-controller-fp6lz-config-zl6vw\" (UID: \"6d473545-1bab-49c2-b773-351b9c9267df\") " pod="openstack/ovn-controller-fp6lz-config-zl6vw" Dec 02 09:01:58 crc kubenswrapper[4895]: I1202 09:01:58.630467 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/configmap/6d473545-1bab-49c2-b773-351b9c9267df-scripts\") pod \"ovn-controller-fp6lz-config-zl6vw\" (UID: \"6d473545-1bab-49c2-b773-351b9c9267df\") " pod="openstack/ovn-controller-fp6lz-config-zl6vw" Dec 02 09:01:58 crc kubenswrapper[4895]: I1202 09:01:58.630660 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6d473545-1bab-49c2-b773-351b9c9267df-var-run\") pod \"ovn-controller-fp6lz-config-zl6vw\" (UID: \"6d473545-1bab-49c2-b773-351b9c9267df\") " pod="openstack/ovn-controller-fp6lz-config-zl6vw" Dec 02 09:01:58 crc kubenswrapper[4895]: I1202 09:01:58.631048 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frbhp\" (UniqueName: \"kubernetes.io/projected/6d473545-1bab-49c2-b773-351b9c9267df-kube-api-access-frbhp\") pod \"ovn-controller-fp6lz-config-zl6vw\" (UID: \"6d473545-1bab-49c2-b773-351b9c9267df\") " pod="openstack/ovn-controller-fp6lz-config-zl6vw" Dec 02 09:01:58 crc kubenswrapper[4895]: I1202 09:01:58.734234 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-frbhp\" (UniqueName: \"kubernetes.io/projected/6d473545-1bab-49c2-b773-351b9c9267df-kube-api-access-frbhp\") pod \"ovn-controller-fp6lz-config-zl6vw\" (UID: \"6d473545-1bab-49c2-b773-351b9c9267df\") " pod="openstack/ovn-controller-fp6lz-config-zl6vw" Dec 02 09:01:58 crc kubenswrapper[4895]: I1202 09:01:58.734320 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/6d473545-1bab-49c2-b773-351b9c9267df-additional-scripts\") pod \"ovn-controller-fp6lz-config-zl6vw\" (UID: \"6d473545-1bab-49c2-b773-351b9c9267df\") " pod="openstack/ovn-controller-fp6lz-config-zl6vw" Dec 02 09:01:58 crc kubenswrapper[4895]: I1202 09:01:58.734358 4895 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/6d473545-1bab-49c2-b773-351b9c9267df-var-run-ovn\") pod \"ovn-controller-fp6lz-config-zl6vw\" (UID: \"6d473545-1bab-49c2-b773-351b9c9267df\") " pod="openstack/ovn-controller-fp6lz-config-zl6vw" Dec 02 09:01:58 crc kubenswrapper[4895]: I1202 09:01:58.734417 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6d473545-1bab-49c2-b773-351b9c9267df-scripts\") pod \"ovn-controller-fp6lz-config-zl6vw\" (UID: \"6d473545-1bab-49c2-b773-351b9c9267df\") " pod="openstack/ovn-controller-fp6lz-config-zl6vw" Dec 02 09:01:58 crc kubenswrapper[4895]: I1202 09:01:58.734437 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/6d473545-1bab-49c2-b773-351b9c9267df-var-log-ovn\") pod \"ovn-controller-fp6lz-config-zl6vw\" (UID: \"6d473545-1bab-49c2-b773-351b9c9267df\") " pod="openstack/ovn-controller-fp6lz-config-zl6vw" Dec 02 09:01:58 crc kubenswrapper[4895]: I1202 09:01:58.734498 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6d473545-1bab-49c2-b773-351b9c9267df-var-run\") pod \"ovn-controller-fp6lz-config-zl6vw\" (UID: \"6d473545-1bab-49c2-b773-351b9c9267df\") " pod="openstack/ovn-controller-fp6lz-config-zl6vw" Dec 02 09:01:58 crc kubenswrapper[4895]: I1202 09:01:58.734842 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6d473545-1bab-49c2-b773-351b9c9267df-var-run\") pod \"ovn-controller-fp6lz-config-zl6vw\" (UID: \"6d473545-1bab-49c2-b773-351b9c9267df\") " pod="openstack/ovn-controller-fp6lz-config-zl6vw" Dec 02 09:01:58 crc kubenswrapper[4895]: I1202 09:01:58.734869 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" 
(UniqueName: \"kubernetes.io/host-path/6d473545-1bab-49c2-b773-351b9c9267df-var-log-ovn\") pod \"ovn-controller-fp6lz-config-zl6vw\" (UID: \"6d473545-1bab-49c2-b773-351b9c9267df\") " pod="openstack/ovn-controller-fp6lz-config-zl6vw" Dec 02 09:01:58 crc kubenswrapper[4895]: I1202 09:01:58.734882 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/6d473545-1bab-49c2-b773-351b9c9267df-var-run-ovn\") pod \"ovn-controller-fp6lz-config-zl6vw\" (UID: \"6d473545-1bab-49c2-b773-351b9c9267df\") " pod="openstack/ovn-controller-fp6lz-config-zl6vw" Dec 02 09:01:58 crc kubenswrapper[4895]: I1202 09:01:58.735586 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/6d473545-1bab-49c2-b773-351b9c9267df-additional-scripts\") pod \"ovn-controller-fp6lz-config-zl6vw\" (UID: \"6d473545-1bab-49c2-b773-351b9c9267df\") " pod="openstack/ovn-controller-fp6lz-config-zl6vw" Dec 02 09:01:58 crc kubenswrapper[4895]: I1202 09:01:58.736828 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6d473545-1bab-49c2-b773-351b9c9267df-scripts\") pod \"ovn-controller-fp6lz-config-zl6vw\" (UID: \"6d473545-1bab-49c2-b773-351b9c9267df\") " pod="openstack/ovn-controller-fp6lz-config-zl6vw" Dec 02 09:01:58 crc kubenswrapper[4895]: I1202 09:01:58.755539 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-frbhp\" (UniqueName: \"kubernetes.io/projected/6d473545-1bab-49c2-b773-351b9c9267df-kube-api-access-frbhp\") pod \"ovn-controller-fp6lz-config-zl6vw\" (UID: \"6d473545-1bab-49c2-b773-351b9c9267df\") " pod="openstack/ovn-controller-fp6lz-config-zl6vw" Dec 02 09:01:58 crc kubenswrapper[4895]: I1202 09:01:58.892518 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-fp6lz-config-zl6vw" Dec 02 09:01:59 crc kubenswrapper[4895]: I1202 09:01:59.419983 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-fp6lz-config-zl6vw"] Dec 02 09:02:00 crc kubenswrapper[4895]: I1202 09:02:00.547565 4895 generic.go:334] "Generic (PLEG): container finished" podID="6d473545-1bab-49c2-b773-351b9c9267df" containerID="353690f8148fdff66c772bee85e3488567ccf0cf29834fd7b68b7230298e0c84" exitCode=0 Dec 02 09:02:00 crc kubenswrapper[4895]: I1202 09:02:00.547628 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-fp6lz-config-zl6vw" event={"ID":"6d473545-1bab-49c2-b773-351b9c9267df","Type":"ContainerDied","Data":"353690f8148fdff66c772bee85e3488567ccf0cf29834fd7b68b7230298e0c84"} Dec 02 09:02:00 crc kubenswrapper[4895]: I1202 09:02:00.547661 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-fp6lz-config-zl6vw" event={"ID":"6d473545-1bab-49c2-b773-351b9c9267df","Type":"ContainerStarted","Data":"84ef7558518ec79d7e14ee1766009159ed3722a344f779e186650aa1acd510a7"} Dec 02 09:02:01 crc kubenswrapper[4895]: I1202 09:02:01.022879 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-rsyslog-44qhk"] Dec 02 09:02:01 crc kubenswrapper[4895]: I1202 09:02:01.025362 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-rsyslog-44qhk" Dec 02 09:02:01 crc kubenswrapper[4895]: I1202 09:02:01.027618 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"octavia-hmport-map" Dec 02 09:02:01 crc kubenswrapper[4895]: I1202 09:02:01.028253 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-rsyslog-scripts" Dec 02 09:02:01 crc kubenswrapper[4895]: I1202 09:02:01.031352 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-rsyslog-config-data" Dec 02 09:02:01 crc kubenswrapper[4895]: I1202 09:02:01.039878 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-rsyslog-44qhk"] Dec 02 09:02:01 crc kubenswrapper[4895]: I1202 09:02:01.141443 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/e92475e9-c98d-4450-a3ff-d60ee780d43b-config-data-merged\") pod \"octavia-rsyslog-44qhk\" (UID: \"e92475e9-c98d-4450-a3ff-d60ee780d43b\") " pod="openstack/octavia-rsyslog-44qhk" Dec 02 09:02:01 crc kubenswrapper[4895]: I1202 09:02:01.141521 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e92475e9-c98d-4450-a3ff-d60ee780d43b-scripts\") pod \"octavia-rsyslog-44qhk\" (UID: \"e92475e9-c98d-4450-a3ff-d60ee780d43b\") " pod="openstack/octavia-rsyslog-44qhk" Dec 02 09:02:01 crc kubenswrapper[4895]: I1202 09:02:01.141568 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e92475e9-c98d-4450-a3ff-d60ee780d43b-config-data\") pod \"octavia-rsyslog-44qhk\" (UID: \"e92475e9-c98d-4450-a3ff-d60ee780d43b\") " pod="openstack/octavia-rsyslog-44qhk" Dec 02 09:02:01 crc kubenswrapper[4895]: I1202 09:02:01.141595 4895 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/e92475e9-c98d-4450-a3ff-d60ee780d43b-hm-ports\") pod \"octavia-rsyslog-44qhk\" (UID: \"e92475e9-c98d-4450-a3ff-d60ee780d43b\") " pod="openstack/octavia-rsyslog-44qhk" Dec 02 09:02:01 crc kubenswrapper[4895]: I1202 09:02:01.244484 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e92475e9-c98d-4450-a3ff-d60ee780d43b-scripts\") pod \"octavia-rsyslog-44qhk\" (UID: \"e92475e9-c98d-4450-a3ff-d60ee780d43b\") " pod="openstack/octavia-rsyslog-44qhk" Dec 02 09:02:01 crc kubenswrapper[4895]: I1202 09:02:01.244574 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e92475e9-c98d-4450-a3ff-d60ee780d43b-config-data\") pod \"octavia-rsyslog-44qhk\" (UID: \"e92475e9-c98d-4450-a3ff-d60ee780d43b\") " pod="openstack/octavia-rsyslog-44qhk" Dec 02 09:02:01 crc kubenswrapper[4895]: I1202 09:02:01.244620 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/e92475e9-c98d-4450-a3ff-d60ee780d43b-hm-ports\") pod \"octavia-rsyslog-44qhk\" (UID: \"e92475e9-c98d-4450-a3ff-d60ee780d43b\") " pod="openstack/octavia-rsyslog-44qhk" Dec 02 09:02:01 crc kubenswrapper[4895]: I1202 09:02:01.244841 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/e92475e9-c98d-4450-a3ff-d60ee780d43b-config-data-merged\") pod \"octavia-rsyslog-44qhk\" (UID: \"e92475e9-c98d-4450-a3ff-d60ee780d43b\") " pod="openstack/octavia-rsyslog-44qhk" Dec 02 09:02:01 crc kubenswrapper[4895]: I1202 09:02:01.247190 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: 
\"kubernetes.io/empty-dir/e92475e9-c98d-4450-a3ff-d60ee780d43b-config-data-merged\") pod \"octavia-rsyslog-44qhk\" (UID: \"e92475e9-c98d-4450-a3ff-d60ee780d43b\") " pod="openstack/octavia-rsyslog-44qhk" Dec 02 09:02:01 crc kubenswrapper[4895]: I1202 09:02:01.247293 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/e92475e9-c98d-4450-a3ff-d60ee780d43b-hm-ports\") pod \"octavia-rsyslog-44qhk\" (UID: \"e92475e9-c98d-4450-a3ff-d60ee780d43b\") " pod="openstack/octavia-rsyslog-44qhk" Dec 02 09:02:01 crc kubenswrapper[4895]: I1202 09:02:01.251221 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e92475e9-c98d-4450-a3ff-d60ee780d43b-config-data\") pod \"octavia-rsyslog-44qhk\" (UID: \"e92475e9-c98d-4450-a3ff-d60ee780d43b\") " pod="openstack/octavia-rsyslog-44qhk" Dec 02 09:02:01 crc kubenswrapper[4895]: I1202 09:02:01.252999 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e92475e9-c98d-4450-a3ff-d60ee780d43b-scripts\") pod \"octavia-rsyslog-44qhk\" (UID: \"e92475e9-c98d-4450-a3ff-d60ee780d43b\") " pod="openstack/octavia-rsyslog-44qhk" Dec 02 09:02:01 crc kubenswrapper[4895]: I1202 09:02:01.363488 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-rsyslog-44qhk" Dec 02 09:02:01 crc kubenswrapper[4895]: I1202 09:02:01.553390 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-image-upload-59f8cff499-vk8rv"] Dec 02 09:02:01 crc kubenswrapper[4895]: I1202 09:02:01.564620 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-image-upload-59f8cff499-vk8rv" Dec 02 09:02:01 crc kubenswrapper[4895]: I1202 09:02:01.576780 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-config-data" Dec 02 09:02:01 crc kubenswrapper[4895]: I1202 09:02:01.728137 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/93ff3ab6-3da6-4d88-b9b8-e292c0c4f1a2-httpd-config\") pod \"octavia-image-upload-59f8cff499-vk8rv\" (UID: \"93ff3ab6-3da6-4d88-b9b8-e292c0c4f1a2\") " pod="openstack/octavia-image-upload-59f8cff499-vk8rv" Dec 02 09:02:01 crc kubenswrapper[4895]: I1202 09:02:01.728373 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/93ff3ab6-3da6-4d88-b9b8-e292c0c4f1a2-amphora-image\") pod \"octavia-image-upload-59f8cff499-vk8rv\" (UID: \"93ff3ab6-3da6-4d88-b9b8-e292c0c4f1a2\") " pod="openstack/octavia-image-upload-59f8cff499-vk8rv" Dec 02 09:02:01 crc kubenswrapper[4895]: I1202 09:02:01.746091 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-image-upload-59f8cff499-vk8rv"] Dec 02 09:02:01 crc kubenswrapper[4895]: I1202 09:02:01.832628 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/93ff3ab6-3da6-4d88-b9b8-e292c0c4f1a2-httpd-config\") pod \"octavia-image-upload-59f8cff499-vk8rv\" (UID: \"93ff3ab6-3da6-4d88-b9b8-e292c0c4f1a2\") " pod="openstack/octavia-image-upload-59f8cff499-vk8rv" Dec 02 09:02:01 crc kubenswrapper[4895]: I1202 09:02:01.832781 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/93ff3ab6-3da6-4d88-b9b8-e292c0c4f1a2-amphora-image\") pod \"octavia-image-upload-59f8cff499-vk8rv\" (UID: 
\"93ff3ab6-3da6-4d88-b9b8-e292c0c4f1a2\") " pod="openstack/octavia-image-upload-59f8cff499-vk8rv" Dec 02 09:02:01 crc kubenswrapper[4895]: I1202 09:02:01.833398 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/93ff3ab6-3da6-4d88-b9b8-e292c0c4f1a2-amphora-image\") pod \"octavia-image-upload-59f8cff499-vk8rv\" (UID: \"93ff3ab6-3da6-4d88-b9b8-e292c0c4f1a2\") " pod="openstack/octavia-image-upload-59f8cff499-vk8rv" Dec 02 09:02:01 crc kubenswrapper[4895]: I1202 09:02:01.850022 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/93ff3ab6-3da6-4d88-b9b8-e292c0c4f1a2-httpd-config\") pod \"octavia-image-upload-59f8cff499-vk8rv\" (UID: \"93ff3ab6-3da6-4d88-b9b8-e292c0c4f1a2\") " pod="openstack/octavia-image-upload-59f8cff499-vk8rv" Dec 02 09:02:02 crc kubenswrapper[4895]: I1202 09:02:02.046188 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-image-upload-59f8cff499-vk8rv" Dec 02 09:02:02 crc kubenswrapper[4895]: I1202 09:02:02.114146 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-rsyslog-44qhk"] Dec 02 09:02:02 crc kubenswrapper[4895]: I1202 09:02:02.142572 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-fp6lz-config-zl6vw" Dec 02 09:02:02 crc kubenswrapper[4895]: I1202 09:02:02.240808 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/6d473545-1bab-49c2-b773-351b9c9267df-var-run-ovn\") pod \"6d473545-1bab-49c2-b773-351b9c9267df\" (UID: \"6d473545-1bab-49c2-b773-351b9c9267df\") " Dec 02 09:02:02 crc kubenswrapper[4895]: I1202 09:02:02.241213 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-frbhp\" (UniqueName: \"kubernetes.io/projected/6d473545-1bab-49c2-b773-351b9c9267df-kube-api-access-frbhp\") pod \"6d473545-1bab-49c2-b773-351b9c9267df\" (UID: \"6d473545-1bab-49c2-b773-351b9c9267df\") " Dec 02 09:02:02 crc kubenswrapper[4895]: I1202 09:02:02.241263 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6d473545-1bab-49c2-b773-351b9c9267df-var-run\") pod \"6d473545-1bab-49c2-b773-351b9c9267df\" (UID: \"6d473545-1bab-49c2-b773-351b9c9267df\") " Dec 02 09:02:02 crc kubenswrapper[4895]: I1202 09:02:02.240911 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6d473545-1bab-49c2-b773-351b9c9267df-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "6d473545-1bab-49c2-b773-351b9c9267df" (UID: "6d473545-1bab-49c2-b773-351b9c9267df"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 09:02:02 crc kubenswrapper[4895]: I1202 09:02:02.241837 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6d473545-1bab-49c2-b773-351b9c9267df-var-run" (OuterVolumeSpecName: "var-run") pod "6d473545-1bab-49c2-b773-351b9c9267df" (UID: "6d473545-1bab-49c2-b773-351b9c9267df"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 09:02:02 crc kubenswrapper[4895]: I1202 09:02:02.247936 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d473545-1bab-49c2-b773-351b9c9267df-kube-api-access-frbhp" (OuterVolumeSpecName: "kube-api-access-frbhp") pod "6d473545-1bab-49c2-b773-351b9c9267df" (UID: "6d473545-1bab-49c2-b773-351b9c9267df"). InnerVolumeSpecName "kube-api-access-frbhp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:02:02 crc kubenswrapper[4895]: I1202 09:02:02.346682 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/6d473545-1bab-49c2-b773-351b9c9267df-var-log-ovn\") pod \"6d473545-1bab-49c2-b773-351b9c9267df\" (UID: \"6d473545-1bab-49c2-b773-351b9c9267df\") " Dec 02 09:02:02 crc kubenswrapper[4895]: I1202 09:02:02.346761 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/6d473545-1bab-49c2-b773-351b9c9267df-additional-scripts\") pod \"6d473545-1bab-49c2-b773-351b9c9267df\" (UID: \"6d473545-1bab-49c2-b773-351b9c9267df\") " Dec 02 09:02:02 crc kubenswrapper[4895]: I1202 09:02:02.346792 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6d473545-1bab-49c2-b773-351b9c9267df-scripts\") pod \"6d473545-1bab-49c2-b773-351b9c9267df\" (UID: \"6d473545-1bab-49c2-b773-351b9c9267df\") " Dec 02 09:02:02 crc kubenswrapper[4895]: I1202 09:02:02.346802 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6d473545-1bab-49c2-b773-351b9c9267df-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "6d473545-1bab-49c2-b773-351b9c9267df" (UID: "6d473545-1bab-49c2-b773-351b9c9267df"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 09:02:02 crc kubenswrapper[4895]: I1202 09:02:02.347402 4895 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/6d473545-1bab-49c2-b773-351b9c9267df-var-log-ovn\") on node \"crc\" DevicePath \"\"" Dec 02 09:02:02 crc kubenswrapper[4895]: I1202 09:02:02.347421 4895 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/6d473545-1bab-49c2-b773-351b9c9267df-var-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 02 09:02:02 crc kubenswrapper[4895]: I1202 09:02:02.347433 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-frbhp\" (UniqueName: \"kubernetes.io/projected/6d473545-1bab-49c2-b773-351b9c9267df-kube-api-access-frbhp\") on node \"crc\" DevicePath \"\"" Dec 02 09:02:02 crc kubenswrapper[4895]: I1202 09:02:02.347450 4895 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6d473545-1bab-49c2-b773-351b9c9267df-var-run\") on node \"crc\" DevicePath \"\"" Dec 02 09:02:02 crc kubenswrapper[4895]: I1202 09:02:02.347461 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d473545-1bab-49c2-b773-351b9c9267df-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "6d473545-1bab-49c2-b773-351b9c9267df" (UID: "6d473545-1bab-49c2-b773-351b9c9267df"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:02:02 crc kubenswrapper[4895]: I1202 09:02:02.348173 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d473545-1bab-49c2-b773-351b9c9267df-scripts" (OuterVolumeSpecName: "scripts") pod "6d473545-1bab-49c2-b773-351b9c9267df" (UID: "6d473545-1bab-49c2-b773-351b9c9267df"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:02:02 crc kubenswrapper[4895]: I1202 09:02:02.448402 4895 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/6d473545-1bab-49c2-b773-351b9c9267df-additional-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 09:02:02 crc kubenswrapper[4895]: I1202 09:02:02.448433 4895 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6d473545-1bab-49c2-b773-351b9c9267df-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 09:02:02 crc kubenswrapper[4895]: I1202 09:02:02.579590 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-image-upload-59f8cff499-vk8rv"] Dec 02 09:02:02 crc kubenswrapper[4895]: I1202 09:02:02.582143 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-44qhk" event={"ID":"e92475e9-c98d-4450-a3ff-d60ee780d43b","Type":"ContainerStarted","Data":"5f3b88faa44a19d71905a8e7f4e07580a0360396ea98ad388ddc2331eaccad88"} Dec 02 09:02:02 crc kubenswrapper[4895]: I1202 09:02:02.583984 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-fp6lz-config-zl6vw" event={"ID":"6d473545-1bab-49c2-b773-351b9c9267df","Type":"ContainerDied","Data":"84ef7558518ec79d7e14ee1766009159ed3722a344f779e186650aa1acd510a7"} Dec 02 09:02:02 crc kubenswrapper[4895]: I1202 09:02:02.584009 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="84ef7558518ec79d7e14ee1766009159ed3722a344f779e186650aa1acd510a7" Dec 02 09:02:02 crc kubenswrapper[4895]: I1202 09:02:02.584079 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-fp6lz-config-zl6vw" Dec 02 09:02:03 crc kubenswrapper[4895]: I1202 09:02:03.092498 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-db-sync-79stx"] Dec 02 09:02:03 crc kubenswrapper[4895]: E1202 09:02:03.093210 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d473545-1bab-49c2-b773-351b9c9267df" containerName="ovn-config" Dec 02 09:02:03 crc kubenswrapper[4895]: I1202 09:02:03.093235 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d473545-1bab-49c2-b773-351b9c9267df" containerName="ovn-config" Dec 02 09:02:03 crc kubenswrapper[4895]: I1202 09:02:03.093550 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d473545-1bab-49c2-b773-351b9c9267df" containerName="ovn-config" Dec 02 09:02:03 crc kubenswrapper[4895]: I1202 09:02:03.094972 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-sync-79stx" Dec 02 09:02:03 crc kubenswrapper[4895]: I1202 09:02:03.097845 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-scripts" Dec 02 09:02:03 crc kubenswrapper[4895]: I1202 09:02:03.102422 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-sync-79stx"] Dec 02 09:02:03 crc kubenswrapper[4895]: I1202 09:02:03.254820 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-fp6lz-config-zl6vw"] Dec 02 09:02:03 crc kubenswrapper[4895]: I1202 09:02:03.271094 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-fp6lz-config-zl6vw"] Dec 02 09:02:03 crc kubenswrapper[4895]: I1202 09:02:03.277140 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/304eb43e-b503-4343-945e-3d4def10dc47-config-data-merged\") pod \"octavia-db-sync-79stx\" (UID: 
\"304eb43e-b503-4343-945e-3d4def10dc47\") " pod="openstack/octavia-db-sync-79stx" Dec 02 09:02:03 crc kubenswrapper[4895]: I1202 09:02:03.277572 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/304eb43e-b503-4343-945e-3d4def10dc47-combined-ca-bundle\") pod \"octavia-db-sync-79stx\" (UID: \"304eb43e-b503-4343-945e-3d4def10dc47\") " pod="openstack/octavia-db-sync-79stx" Dec 02 09:02:03 crc kubenswrapper[4895]: I1202 09:02:03.277886 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/304eb43e-b503-4343-945e-3d4def10dc47-config-data\") pod \"octavia-db-sync-79stx\" (UID: \"304eb43e-b503-4343-945e-3d4def10dc47\") " pod="openstack/octavia-db-sync-79stx" Dec 02 09:02:03 crc kubenswrapper[4895]: I1202 09:02:03.278077 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/304eb43e-b503-4343-945e-3d4def10dc47-scripts\") pod \"octavia-db-sync-79stx\" (UID: \"304eb43e-b503-4343-945e-3d4def10dc47\") " pod="openstack/octavia-db-sync-79stx" Dec 02 09:02:03 crc kubenswrapper[4895]: I1202 09:02:03.309176 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-fp6lz-config-6h525"] Dec 02 09:02:03 crc kubenswrapper[4895]: I1202 09:02:03.324823 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-fp6lz-config-6h525" Dec 02 09:02:03 crc kubenswrapper[4895]: I1202 09:02:03.334305 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-fp6lz" Dec 02 09:02:03 crc kubenswrapper[4895]: I1202 09:02:03.334586 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Dec 02 09:02:03 crc kubenswrapper[4895]: I1202 09:02:03.341252 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-fp6lz-config-6h525"] Dec 02 09:02:03 crc kubenswrapper[4895]: I1202 09:02:03.381011 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3a0eef60-7fd4-4ad1-b235-6f8c691d6a77-scripts\") pod \"ovn-controller-fp6lz-config-6h525\" (UID: \"3a0eef60-7fd4-4ad1-b235-6f8c691d6a77\") " pod="openstack/ovn-controller-fp6lz-config-6h525" Dec 02 09:02:03 crc kubenswrapper[4895]: I1202 09:02:03.381412 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/3a0eef60-7fd4-4ad1-b235-6f8c691d6a77-var-log-ovn\") pod \"ovn-controller-fp6lz-config-6h525\" (UID: \"3a0eef60-7fd4-4ad1-b235-6f8c691d6a77\") " pod="openstack/ovn-controller-fp6lz-config-6h525" Dec 02 09:02:03 crc kubenswrapper[4895]: I1202 09:02:03.381527 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/304eb43e-b503-4343-945e-3d4def10dc47-combined-ca-bundle\") pod \"octavia-db-sync-79stx\" (UID: \"304eb43e-b503-4343-945e-3d4def10dc47\") " pod="openstack/octavia-db-sync-79stx" Dec 02 09:02:03 crc kubenswrapper[4895]: I1202 09:02:03.383133 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/304eb43e-b503-4343-945e-3d4def10dc47-config-data\") pod \"octavia-db-sync-79stx\" (UID: \"304eb43e-b503-4343-945e-3d4def10dc47\") " pod="openstack/octavia-db-sync-79stx" Dec 02 09:02:03 crc kubenswrapper[4895]: I1202 09:02:03.383600 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/3a0eef60-7fd4-4ad1-b235-6f8c691d6a77-var-run-ovn\") pod \"ovn-controller-fp6lz-config-6h525\" (UID: \"3a0eef60-7fd4-4ad1-b235-6f8c691d6a77\") " pod="openstack/ovn-controller-fp6lz-config-6h525" Dec 02 09:02:03 crc kubenswrapper[4895]: I1202 09:02:03.383645 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/304eb43e-b503-4343-945e-3d4def10dc47-scripts\") pod \"octavia-db-sync-79stx\" (UID: \"304eb43e-b503-4343-945e-3d4def10dc47\") " pod="openstack/octavia-db-sync-79stx" Dec 02 09:02:03 crc kubenswrapper[4895]: I1202 09:02:03.383686 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3a0eef60-7fd4-4ad1-b235-6f8c691d6a77-var-run\") pod \"ovn-controller-fp6lz-config-6h525\" (UID: \"3a0eef60-7fd4-4ad1-b235-6f8c691d6a77\") " pod="openstack/ovn-controller-fp6lz-config-6h525" Dec 02 09:02:03 crc kubenswrapper[4895]: I1202 09:02:03.383717 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6rgc\" (UniqueName: \"kubernetes.io/projected/3a0eef60-7fd4-4ad1-b235-6f8c691d6a77-kube-api-access-b6rgc\") pod \"ovn-controller-fp6lz-config-6h525\" (UID: \"3a0eef60-7fd4-4ad1-b235-6f8c691d6a77\") " pod="openstack/ovn-controller-fp6lz-config-6h525" Dec 02 09:02:03 crc kubenswrapper[4895]: I1202 09:02:03.383900 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: 
\"kubernetes.io/empty-dir/304eb43e-b503-4343-945e-3d4def10dc47-config-data-merged\") pod \"octavia-db-sync-79stx\" (UID: \"304eb43e-b503-4343-945e-3d4def10dc47\") " pod="openstack/octavia-db-sync-79stx" Dec 02 09:02:03 crc kubenswrapper[4895]: I1202 09:02:03.383951 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/3a0eef60-7fd4-4ad1-b235-6f8c691d6a77-additional-scripts\") pod \"ovn-controller-fp6lz-config-6h525\" (UID: \"3a0eef60-7fd4-4ad1-b235-6f8c691d6a77\") " pod="openstack/ovn-controller-fp6lz-config-6h525" Dec 02 09:02:03 crc kubenswrapper[4895]: I1202 09:02:03.385421 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/304eb43e-b503-4343-945e-3d4def10dc47-config-data-merged\") pod \"octavia-db-sync-79stx\" (UID: \"304eb43e-b503-4343-945e-3d4def10dc47\") " pod="openstack/octavia-db-sync-79stx" Dec 02 09:02:03 crc kubenswrapper[4895]: I1202 09:02:03.418936 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/304eb43e-b503-4343-945e-3d4def10dc47-combined-ca-bundle\") pod \"octavia-db-sync-79stx\" (UID: \"304eb43e-b503-4343-945e-3d4def10dc47\") " pod="openstack/octavia-db-sync-79stx" Dec 02 09:02:03 crc kubenswrapper[4895]: I1202 09:02:03.419515 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/304eb43e-b503-4343-945e-3d4def10dc47-scripts\") pod \"octavia-db-sync-79stx\" (UID: \"304eb43e-b503-4343-945e-3d4def10dc47\") " pod="openstack/octavia-db-sync-79stx" Dec 02 09:02:03 crc kubenswrapper[4895]: I1202 09:02:03.421105 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/304eb43e-b503-4343-945e-3d4def10dc47-config-data\") pod \"octavia-db-sync-79stx\" 
(UID: \"304eb43e-b503-4343-945e-3d4def10dc47\") " pod="openstack/octavia-db-sync-79stx" Dec 02 09:02:03 crc kubenswrapper[4895]: I1202 09:02:03.423554 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-sync-79stx" Dec 02 09:02:03 crc kubenswrapper[4895]: I1202 09:02:03.486457 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/3a0eef60-7fd4-4ad1-b235-6f8c691d6a77-var-log-ovn\") pod \"ovn-controller-fp6lz-config-6h525\" (UID: \"3a0eef60-7fd4-4ad1-b235-6f8c691d6a77\") " pod="openstack/ovn-controller-fp6lz-config-6h525" Dec 02 09:02:03 crc kubenswrapper[4895]: I1202 09:02:03.486595 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/3a0eef60-7fd4-4ad1-b235-6f8c691d6a77-var-run-ovn\") pod \"ovn-controller-fp6lz-config-6h525\" (UID: \"3a0eef60-7fd4-4ad1-b235-6f8c691d6a77\") " pod="openstack/ovn-controller-fp6lz-config-6h525" Dec 02 09:02:03 crc kubenswrapper[4895]: I1202 09:02:03.486624 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3a0eef60-7fd4-4ad1-b235-6f8c691d6a77-var-run\") pod \"ovn-controller-fp6lz-config-6h525\" (UID: \"3a0eef60-7fd4-4ad1-b235-6f8c691d6a77\") " pod="openstack/ovn-controller-fp6lz-config-6h525" Dec 02 09:02:03 crc kubenswrapper[4895]: I1202 09:02:03.486653 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b6rgc\" (UniqueName: \"kubernetes.io/projected/3a0eef60-7fd4-4ad1-b235-6f8c691d6a77-kube-api-access-b6rgc\") pod \"ovn-controller-fp6lz-config-6h525\" (UID: \"3a0eef60-7fd4-4ad1-b235-6f8c691d6a77\") " pod="openstack/ovn-controller-fp6lz-config-6h525" Dec 02 09:02:03 crc kubenswrapper[4895]: I1202 09:02:03.486720 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/3a0eef60-7fd4-4ad1-b235-6f8c691d6a77-additional-scripts\") pod \"ovn-controller-fp6lz-config-6h525\" (UID: \"3a0eef60-7fd4-4ad1-b235-6f8c691d6a77\") " pod="openstack/ovn-controller-fp6lz-config-6h525" Dec 02 09:02:03 crc kubenswrapper[4895]: I1202 09:02:03.486811 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3a0eef60-7fd4-4ad1-b235-6f8c691d6a77-scripts\") pod \"ovn-controller-fp6lz-config-6h525\" (UID: \"3a0eef60-7fd4-4ad1-b235-6f8c691d6a77\") " pod="openstack/ovn-controller-fp6lz-config-6h525" Dec 02 09:02:03 crc kubenswrapper[4895]: I1202 09:02:03.488116 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/3a0eef60-7fd4-4ad1-b235-6f8c691d6a77-var-run-ovn\") pod \"ovn-controller-fp6lz-config-6h525\" (UID: \"3a0eef60-7fd4-4ad1-b235-6f8c691d6a77\") " pod="openstack/ovn-controller-fp6lz-config-6h525" Dec 02 09:02:03 crc kubenswrapper[4895]: I1202 09:02:03.488387 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3a0eef60-7fd4-4ad1-b235-6f8c691d6a77-var-run\") pod \"ovn-controller-fp6lz-config-6h525\" (UID: \"3a0eef60-7fd4-4ad1-b235-6f8c691d6a77\") " pod="openstack/ovn-controller-fp6lz-config-6h525" Dec 02 09:02:03 crc kubenswrapper[4895]: I1202 09:02:03.488709 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/3a0eef60-7fd4-4ad1-b235-6f8c691d6a77-var-log-ovn\") pod \"ovn-controller-fp6lz-config-6h525\" (UID: \"3a0eef60-7fd4-4ad1-b235-6f8c691d6a77\") " pod="openstack/ovn-controller-fp6lz-config-6h525" Dec 02 09:02:03 crc kubenswrapper[4895]: I1202 09:02:03.490134 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: 
\"kubernetes.io/configmap/3a0eef60-7fd4-4ad1-b235-6f8c691d6a77-additional-scripts\") pod \"ovn-controller-fp6lz-config-6h525\" (UID: \"3a0eef60-7fd4-4ad1-b235-6f8c691d6a77\") " pod="openstack/ovn-controller-fp6lz-config-6h525" Dec 02 09:02:03 crc kubenswrapper[4895]: I1202 09:02:03.491366 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3a0eef60-7fd4-4ad1-b235-6f8c691d6a77-scripts\") pod \"ovn-controller-fp6lz-config-6h525\" (UID: \"3a0eef60-7fd4-4ad1-b235-6f8c691d6a77\") " pod="openstack/ovn-controller-fp6lz-config-6h525" Dec 02 09:02:03 crc kubenswrapper[4895]: I1202 09:02:03.510221 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6rgc\" (UniqueName: \"kubernetes.io/projected/3a0eef60-7fd4-4ad1-b235-6f8c691d6a77-kube-api-access-b6rgc\") pod \"ovn-controller-fp6lz-config-6h525\" (UID: \"3a0eef60-7fd4-4ad1-b235-6f8c691d6a77\") " pod="openstack/ovn-controller-fp6lz-config-6h525" Dec 02 09:02:03 crc kubenswrapper[4895]: I1202 09:02:03.607523 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-vk8rv" event={"ID":"93ff3ab6-3da6-4d88-b9b8-e292c0c4f1a2","Type":"ContainerStarted","Data":"ae1097e585307af9cd4ee9ccc806e60e32fc90f20c8f996e6e9877e1898fd592"} Dec 02 09:02:03 crc kubenswrapper[4895]: I1202 09:02:03.669944 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-fp6lz-config-6h525" Dec 02 09:02:03 crc kubenswrapper[4895]: I1202 09:02:03.933771 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-sync-79stx"] Dec 02 09:02:04 crc kubenswrapper[4895]: I1202 09:02:04.172160 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-fp6lz-config-6h525"] Dec 02 09:02:04 crc kubenswrapper[4895]: W1202 09:02:04.237399 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod304eb43e_b503_4343_945e_3d4def10dc47.slice/crio-3fde4f5826381a699754ef3158ac262606a143880534faf3ac3b8422942f05d3 WatchSource:0}: Error finding container 3fde4f5826381a699754ef3158ac262606a143880534faf3ac3b8422942f05d3: Status 404 returned error can't find the container with id 3fde4f5826381a699754ef3158ac262606a143880534faf3ac3b8422942f05d3 Dec 02 09:02:04 crc kubenswrapper[4895]: I1202 09:02:04.629251 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-fp6lz-config-6h525" event={"ID":"3a0eef60-7fd4-4ad1-b235-6f8c691d6a77","Type":"ContainerStarted","Data":"efcf81faca241052cdbc62b7c1ddcd516919915fb9d03c47be4f0a61018f774d"} Dec 02 09:02:04 crc kubenswrapper[4895]: I1202 09:02:04.636201 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-79stx" event={"ID":"304eb43e-b503-4343-945e-3d4def10dc47","Type":"ContainerStarted","Data":"3fde4f5826381a699754ef3158ac262606a143880534faf3ac3b8422942f05d3"} Dec 02 09:02:05 crc kubenswrapper[4895]: I1202 09:02:05.154365 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d473545-1bab-49c2-b773-351b9c9267df" path="/var/lib/kubelet/pods/6d473545-1bab-49c2-b773-351b9c9267df/volumes" Dec 02 09:02:05 crc kubenswrapper[4895]: I1202 09:02:05.659975 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-fp6lz-config-6h525" 
event={"ID":"3a0eef60-7fd4-4ad1-b235-6f8c691d6a77","Type":"ContainerStarted","Data":"7b331d71f1abc8f919e6533769397559c49bc492192bd67fb08b76f16ff29fcf"} Dec 02 09:02:05 crc kubenswrapper[4895]: I1202 09:02:05.665323 4895 generic.go:334] "Generic (PLEG): container finished" podID="304eb43e-b503-4343-945e-3d4def10dc47" containerID="ef5ddaef6e24d0fd2d197527dbce983ef87f695d5ff1a9dac740e6c0d48a6617" exitCode=0 Dec 02 09:02:05 crc kubenswrapper[4895]: I1202 09:02:05.665363 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-79stx" event={"ID":"304eb43e-b503-4343-945e-3d4def10dc47","Type":"ContainerDied","Data":"ef5ddaef6e24d0fd2d197527dbce983ef87f695d5ff1a9dac740e6c0d48a6617"} Dec 02 09:02:06 crc kubenswrapper[4895]: I1202 09:02:06.678556 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-44qhk" event={"ID":"e92475e9-c98d-4450-a3ff-d60ee780d43b","Type":"ContainerStarted","Data":"92e81dc94047d3c4c25c66a7adf72b64e43603e67cb19595dec5a0aa28b435c1"} Dec 02 09:02:06 crc kubenswrapper[4895]: I1202 09:02:06.684580 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-79stx" event={"ID":"304eb43e-b503-4343-945e-3d4def10dc47","Type":"ContainerStarted","Data":"2748d052530f0be06d2597be6369b313b5eef17691f7a0fe1f240c73614b34a1"} Dec 02 09:02:06 crc kubenswrapper[4895]: I1202 09:02:06.687402 4895 generic.go:334] "Generic (PLEG): container finished" podID="3a0eef60-7fd4-4ad1-b235-6f8c691d6a77" containerID="7b331d71f1abc8f919e6533769397559c49bc492192bd67fb08b76f16ff29fcf" exitCode=0 Dec 02 09:02:06 crc kubenswrapper[4895]: I1202 09:02:06.687473 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-fp6lz-config-6h525" event={"ID":"3a0eef60-7fd4-4ad1-b235-6f8c691d6a77","Type":"ContainerDied","Data":"7b331d71f1abc8f919e6533769397559c49bc492192bd67fb08b76f16ff29fcf"} Dec 02 09:02:06 crc kubenswrapper[4895]: I1202 09:02:06.727014 4895 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-db-sync-79stx" podStartSLOduration=3.726979373 podStartE2EDuration="3.726979373s" podCreationTimestamp="2025-12-02 09:02:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:02:06.722375649 +0000 UTC m=+5937.893235262" watchObservedRunningTime="2025-12-02 09:02:06.726979373 +0000 UTC m=+5937.897838986" Dec 02 09:02:07 crc kubenswrapper[4895]: I1202 09:02:07.114229 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-fp6lz-config-6h525" Dec 02 09:02:07 crc kubenswrapper[4895]: I1202 09:02:07.180560 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/3a0eef60-7fd4-4ad1-b235-6f8c691d6a77-var-log-ovn\") pod \"3a0eef60-7fd4-4ad1-b235-6f8c691d6a77\" (UID: \"3a0eef60-7fd4-4ad1-b235-6f8c691d6a77\") " Dec 02 09:02:07 crc kubenswrapper[4895]: I1202 09:02:07.180757 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3a0eef60-7fd4-4ad1-b235-6f8c691d6a77-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "3a0eef60-7fd4-4ad1-b235-6f8c691d6a77" (UID: "3a0eef60-7fd4-4ad1-b235-6f8c691d6a77"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 09:02:07 crc kubenswrapper[4895]: I1202 09:02:07.180768 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3a0eef60-7fd4-4ad1-b235-6f8c691d6a77-scripts\") pod \"3a0eef60-7fd4-4ad1-b235-6f8c691d6a77\" (UID: \"3a0eef60-7fd4-4ad1-b235-6f8c691d6a77\") " Dec 02 09:02:07 crc kubenswrapper[4895]: I1202 09:02:07.180986 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3a0eef60-7fd4-4ad1-b235-6f8c691d6a77-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "3a0eef60-7fd4-4ad1-b235-6f8c691d6a77" (UID: "3a0eef60-7fd4-4ad1-b235-6f8c691d6a77"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 09:02:07 crc kubenswrapper[4895]: I1202 09:02:07.181262 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/3a0eef60-7fd4-4ad1-b235-6f8c691d6a77-var-run-ovn\") pod \"3a0eef60-7fd4-4ad1-b235-6f8c691d6a77\" (UID: \"3a0eef60-7fd4-4ad1-b235-6f8c691d6a77\") " Dec 02 09:02:07 crc kubenswrapper[4895]: I1202 09:02:07.181370 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3a0eef60-7fd4-4ad1-b235-6f8c691d6a77-var-run\") pod \"3a0eef60-7fd4-4ad1-b235-6f8c691d6a77\" (UID: \"3a0eef60-7fd4-4ad1-b235-6f8c691d6a77\") " Dec 02 09:02:07 crc kubenswrapper[4895]: I1202 09:02:07.181441 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/3a0eef60-7fd4-4ad1-b235-6f8c691d6a77-additional-scripts\") pod \"3a0eef60-7fd4-4ad1-b235-6f8c691d6a77\" (UID: \"3a0eef60-7fd4-4ad1-b235-6f8c691d6a77\") " Dec 02 09:02:07 crc kubenswrapper[4895]: I1202 09:02:07.181540 4895 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-b6rgc\" (UniqueName: \"kubernetes.io/projected/3a0eef60-7fd4-4ad1-b235-6f8c691d6a77-kube-api-access-b6rgc\") pod \"3a0eef60-7fd4-4ad1-b235-6f8c691d6a77\" (UID: \"3a0eef60-7fd4-4ad1-b235-6f8c691d6a77\") " Dec 02 09:02:07 crc kubenswrapper[4895]: I1202 09:02:07.181602 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3a0eef60-7fd4-4ad1-b235-6f8c691d6a77-var-run" (OuterVolumeSpecName: "var-run") pod "3a0eef60-7fd4-4ad1-b235-6f8c691d6a77" (UID: "3a0eef60-7fd4-4ad1-b235-6f8c691d6a77"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 09:02:07 crc kubenswrapper[4895]: I1202 09:02:07.182379 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a0eef60-7fd4-4ad1-b235-6f8c691d6a77-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "3a0eef60-7fd4-4ad1-b235-6f8c691d6a77" (UID: "3a0eef60-7fd4-4ad1-b235-6f8c691d6a77"). InnerVolumeSpecName "additional-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:02:07 crc kubenswrapper[4895]: I1202 09:02:07.182612 4895 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/3a0eef60-7fd4-4ad1-b235-6f8c691d6a77-var-log-ovn\") on node \"crc\" DevicePath \"\"" Dec 02 09:02:07 crc kubenswrapper[4895]: I1202 09:02:07.182639 4895 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/3a0eef60-7fd4-4ad1-b235-6f8c691d6a77-var-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 02 09:02:07 crc kubenswrapper[4895]: I1202 09:02:07.182653 4895 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3a0eef60-7fd4-4ad1-b235-6f8c691d6a77-var-run\") on node \"crc\" DevicePath \"\"" Dec 02 09:02:07 crc kubenswrapper[4895]: I1202 09:02:07.182903 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a0eef60-7fd4-4ad1-b235-6f8c691d6a77-scripts" (OuterVolumeSpecName: "scripts") pod "3a0eef60-7fd4-4ad1-b235-6f8c691d6a77" (UID: "3a0eef60-7fd4-4ad1-b235-6f8c691d6a77"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:02:07 crc kubenswrapper[4895]: I1202 09:02:07.197563 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a0eef60-7fd4-4ad1-b235-6f8c691d6a77-kube-api-access-b6rgc" (OuterVolumeSpecName: "kube-api-access-b6rgc") pod "3a0eef60-7fd4-4ad1-b235-6f8c691d6a77" (UID: "3a0eef60-7fd4-4ad1-b235-6f8c691d6a77"). InnerVolumeSpecName "kube-api-access-b6rgc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:02:07 crc kubenswrapper[4895]: I1202 09:02:07.287888 4895 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/3a0eef60-7fd4-4ad1-b235-6f8c691d6a77-additional-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 09:02:07 crc kubenswrapper[4895]: I1202 09:02:07.302898 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b6rgc\" (UniqueName: \"kubernetes.io/projected/3a0eef60-7fd4-4ad1-b235-6f8c691d6a77-kube-api-access-b6rgc\") on node \"crc\" DevicePath \"\"" Dec 02 09:02:07 crc kubenswrapper[4895]: I1202 09:02:07.302930 4895 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3a0eef60-7fd4-4ad1-b235-6f8c691d6a77-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 09:02:07 crc kubenswrapper[4895]: I1202 09:02:07.710203 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-fp6lz-config-6h525" Dec 02 09:02:07 crc kubenswrapper[4895]: I1202 09:02:07.714409 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-fp6lz-config-6h525" event={"ID":"3a0eef60-7fd4-4ad1-b235-6f8c691d6a77","Type":"ContainerDied","Data":"efcf81faca241052cdbc62b7c1ddcd516919915fb9d03c47be4f0a61018f774d"} Dec 02 09:02:07 crc kubenswrapper[4895]: I1202 09:02:07.714523 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="efcf81faca241052cdbc62b7c1ddcd516919915fb9d03c47be4f0a61018f774d" Dec 02 09:02:08 crc kubenswrapper[4895]: I1202 09:02:08.220442 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-fp6lz-config-6h525"] Dec 02 09:02:08 crc kubenswrapper[4895]: I1202 09:02:08.234587 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-fp6lz-config-6h525"] Dec 02 09:02:08 crc kubenswrapper[4895]: I1202 09:02:08.720177 
4895 generic.go:334] "Generic (PLEG): container finished" podID="e92475e9-c98d-4450-a3ff-d60ee780d43b" containerID="92e81dc94047d3c4c25c66a7adf72b64e43603e67cb19595dec5a0aa28b435c1" exitCode=0 Dec 02 09:02:08 crc kubenswrapper[4895]: I1202 09:02:08.720233 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-44qhk" event={"ID":"e92475e9-c98d-4450-a3ff-d60ee780d43b","Type":"ContainerDied","Data":"92e81dc94047d3c4c25c66a7adf72b64e43603e67cb19595dec5a0aa28b435c1"} Dec 02 09:02:09 crc kubenswrapper[4895]: I1202 09:02:09.155592 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a0eef60-7fd4-4ad1-b235-6f8c691d6a77" path="/var/lib/kubelet/pods/3a0eef60-7fd4-4ad1-b235-6f8c691d6a77/volumes" Dec 02 09:02:09 crc kubenswrapper[4895]: I1202 09:02:09.732109 4895 generic.go:334] "Generic (PLEG): container finished" podID="304eb43e-b503-4343-945e-3d4def10dc47" containerID="2748d052530f0be06d2597be6369b313b5eef17691f7a0fe1f240c73614b34a1" exitCode=0 Dec 02 09:02:09 crc kubenswrapper[4895]: I1202 09:02:09.732143 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-79stx" event={"ID":"304eb43e-b503-4343-945e-3d4def10dc47","Type":"ContainerDied","Data":"2748d052530f0be06d2597be6369b313b5eef17691f7a0fe1f240c73614b34a1"} Dec 02 09:02:12 crc kubenswrapper[4895]: I1202 09:02:12.346879 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-sync-79stx" Dec 02 09:02:12 crc kubenswrapper[4895]: I1202 09:02:12.435340 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/304eb43e-b503-4343-945e-3d4def10dc47-config-data\") pod \"304eb43e-b503-4343-945e-3d4def10dc47\" (UID: \"304eb43e-b503-4343-945e-3d4def10dc47\") " Dec 02 09:02:12 crc kubenswrapper[4895]: I1202 09:02:12.435495 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/304eb43e-b503-4343-945e-3d4def10dc47-config-data-merged\") pod \"304eb43e-b503-4343-945e-3d4def10dc47\" (UID: \"304eb43e-b503-4343-945e-3d4def10dc47\") " Dec 02 09:02:12 crc kubenswrapper[4895]: I1202 09:02:12.435677 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/304eb43e-b503-4343-945e-3d4def10dc47-scripts\") pod \"304eb43e-b503-4343-945e-3d4def10dc47\" (UID: \"304eb43e-b503-4343-945e-3d4def10dc47\") " Dec 02 09:02:12 crc kubenswrapper[4895]: I1202 09:02:12.435720 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/304eb43e-b503-4343-945e-3d4def10dc47-combined-ca-bundle\") pod \"304eb43e-b503-4343-945e-3d4def10dc47\" (UID: \"304eb43e-b503-4343-945e-3d4def10dc47\") " Dec 02 09:02:12 crc kubenswrapper[4895]: I1202 09:02:12.445898 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/304eb43e-b503-4343-945e-3d4def10dc47-config-data" (OuterVolumeSpecName: "config-data") pod "304eb43e-b503-4343-945e-3d4def10dc47" (UID: "304eb43e-b503-4343-945e-3d4def10dc47"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:02:12 crc kubenswrapper[4895]: I1202 09:02:12.446396 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/304eb43e-b503-4343-945e-3d4def10dc47-scripts" (OuterVolumeSpecName: "scripts") pod "304eb43e-b503-4343-945e-3d4def10dc47" (UID: "304eb43e-b503-4343-945e-3d4def10dc47"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:02:12 crc kubenswrapper[4895]: I1202 09:02:12.462141 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/304eb43e-b503-4343-945e-3d4def10dc47-config-data-merged" (OuterVolumeSpecName: "config-data-merged") pod "304eb43e-b503-4343-945e-3d4def10dc47" (UID: "304eb43e-b503-4343-945e-3d4def10dc47"). InnerVolumeSpecName "config-data-merged". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:02:12 crc kubenswrapper[4895]: I1202 09:02:12.471340 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/304eb43e-b503-4343-945e-3d4def10dc47-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "304eb43e-b503-4343-945e-3d4def10dc47" (UID: "304eb43e-b503-4343-945e-3d4def10dc47"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:02:12 crc kubenswrapper[4895]: I1202 09:02:12.538055 4895 reconciler_common.go:293] "Volume detached for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/304eb43e-b503-4343-945e-3d4def10dc47-config-data-merged\") on node \"crc\" DevicePath \"\"" Dec 02 09:02:12 crc kubenswrapper[4895]: I1202 09:02:12.538100 4895 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/304eb43e-b503-4343-945e-3d4def10dc47-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 09:02:12 crc kubenswrapper[4895]: I1202 09:02:12.538113 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/304eb43e-b503-4343-945e-3d4def10dc47-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 09:02:12 crc kubenswrapper[4895]: I1202 09:02:12.538123 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/304eb43e-b503-4343-945e-3d4def10dc47-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 09:02:12 crc kubenswrapper[4895]: I1202 09:02:12.780612 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-79stx" event={"ID":"304eb43e-b503-4343-945e-3d4def10dc47","Type":"ContainerDied","Data":"3fde4f5826381a699754ef3158ac262606a143880534faf3ac3b8422942f05d3"} Dec 02 09:02:12 crc kubenswrapper[4895]: I1202 09:02:12.781170 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3fde4f5826381a699754ef3158ac262606a143880534faf3ac3b8422942f05d3" Dec 02 09:02:12 crc kubenswrapper[4895]: I1202 09:02:12.780699 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-sync-79stx" Dec 02 09:02:13 crc kubenswrapper[4895]: I1202 09:02:13.143901 4895 scope.go:117] "RemoveContainer" containerID="d95d9d738930381cc247f4992573f82ec7c89226f23be0d771d1a3b1b3cf1812" Dec 02 09:02:13 crc kubenswrapper[4895]: E1202 09:02:13.144240 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:02:13 crc kubenswrapper[4895]: I1202 09:02:13.798505 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-44qhk" event={"ID":"e92475e9-c98d-4450-a3ff-d60ee780d43b","Type":"ContainerStarted","Data":"0411627f764ddccce9b45b96b5c8fa70ed21bd960f79db15503adf55b871342b"} Dec 02 09:02:13 crc kubenswrapper[4895]: I1202 09:02:13.799079 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-rsyslog-44qhk" Dec 02 09:02:13 crc kubenswrapper[4895]: I1202 09:02:13.801250 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-vk8rv" event={"ID":"93ff3ab6-3da6-4d88-b9b8-e292c0c4f1a2","Type":"ContainerStarted","Data":"a6cd0cd7c663c87fa9c28077ae75455c33456c2bc5422cb78a22bdbbdeb55460"} Dec 02 09:02:13 crc kubenswrapper[4895]: I1202 09:02:13.853777 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-rsyslog-44qhk" podStartSLOduration=1.651575966 podStartE2EDuration="12.853719537s" podCreationTimestamp="2025-12-02 09:02:01 +0000 UTC" firstStartedPulling="2025-12-02 09:02:02.110387999 +0000 UTC m=+5933.281247612" lastFinishedPulling="2025-12-02 09:02:13.31253157 +0000 UTC m=+5944.483391183" 
observedRunningTime="2025-12-02 09:02:13.822845036 +0000 UTC m=+5944.993704669" watchObservedRunningTime="2025-12-02 09:02:13.853719537 +0000 UTC m=+5945.024579160" Dec 02 09:02:17 crc kubenswrapper[4895]: I1202 09:02:17.621778 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-api-7b55b799f8-g962x" Dec 02 09:02:17 crc kubenswrapper[4895]: I1202 09:02:17.627253 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-api-7b55b799f8-g962x" Dec 02 09:02:17 crc kubenswrapper[4895]: I1202 09:02:17.843364 4895 generic.go:334] "Generic (PLEG): container finished" podID="93ff3ab6-3da6-4d88-b9b8-e292c0c4f1a2" containerID="a6cd0cd7c663c87fa9c28077ae75455c33456c2bc5422cb78a22bdbbdeb55460" exitCode=0 Dec 02 09:02:17 crc kubenswrapper[4895]: I1202 09:02:17.843412 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-vk8rv" event={"ID":"93ff3ab6-3da6-4d88-b9b8-e292c0c4f1a2","Type":"ContainerDied","Data":"a6cd0cd7c663c87fa9c28077ae75455c33456c2bc5422cb78a22bdbbdeb55460"} Dec 02 09:02:19 crc kubenswrapper[4895]: I1202 09:02:19.871174 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-vk8rv" event={"ID":"93ff3ab6-3da6-4d88-b9b8-e292c0c4f1a2","Type":"ContainerStarted","Data":"581592b8db0f7e6fc0746b21713df9967778e52eeb94cf21b4d594384c8b89a5"} Dec 02 09:02:19 crc kubenswrapper[4895]: I1202 09:02:19.904579 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-image-upload-59f8cff499-vk8rv" podStartSLOduration=2.666914763 podStartE2EDuration="18.90455355s" podCreationTimestamp="2025-12-02 09:02:01 +0000 UTC" firstStartedPulling="2025-12-02 09:02:02.585313154 +0000 UTC m=+5933.756172767" lastFinishedPulling="2025-12-02 09:02:18.822951941 +0000 UTC m=+5949.993811554" observedRunningTime="2025-12-02 09:02:19.891887625 +0000 UTC m=+5951.062747258" 
watchObservedRunningTime="2025-12-02 09:02:19.90455355 +0000 UTC m=+5951.075413163" Dec 02 09:02:24 crc kubenswrapper[4895]: I1202 09:02:24.141789 4895 scope.go:117] "RemoveContainer" containerID="d95d9d738930381cc247f4992573f82ec7c89226f23be0d771d1a3b1b3cf1812" Dec 02 09:02:24 crc kubenswrapper[4895]: E1202 09:02:24.142870 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:02:30 crc kubenswrapper[4895]: I1202 09:02:30.672434 4895 scope.go:117] "RemoveContainer" containerID="a02563ddf1b97e68acdff8249884b6fec1ebcc0a9ad8d06bfafb51afebc52679" Dec 02 09:02:31 crc kubenswrapper[4895]: I1202 09:02:31.406511 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-rsyslog-44qhk" Dec 02 09:02:37 crc kubenswrapper[4895]: I1202 09:02:37.141159 4895 scope.go:117] "RemoveContainer" containerID="d95d9d738930381cc247f4992573f82ec7c89226f23be0d771d1a3b1b3cf1812" Dec 02 09:02:37 crc kubenswrapper[4895]: E1202 09:02:37.142144 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:02:41 crc kubenswrapper[4895]: I1202 09:02:41.626721 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-image-upload-59f8cff499-vk8rv"] Dec 02 09:02:41 crc kubenswrapper[4895]: 
I1202 09:02:41.627931 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/octavia-image-upload-59f8cff499-vk8rv" podUID="93ff3ab6-3da6-4d88-b9b8-e292c0c4f1a2" containerName="octavia-amphora-httpd" containerID="cri-o://581592b8db0f7e6fc0746b21713df9967778e52eeb94cf21b4d594384c8b89a5" gracePeriod=30 Dec 02 09:02:42 crc kubenswrapper[4895]: I1202 09:02:42.152640 4895 generic.go:334] "Generic (PLEG): container finished" podID="93ff3ab6-3da6-4d88-b9b8-e292c0c4f1a2" containerID="581592b8db0f7e6fc0746b21713df9967778e52eeb94cf21b4d594384c8b89a5" exitCode=0 Dec 02 09:02:42 crc kubenswrapper[4895]: I1202 09:02:42.153144 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-vk8rv" event={"ID":"93ff3ab6-3da6-4d88-b9b8-e292c0c4f1a2","Type":"ContainerDied","Data":"581592b8db0f7e6fc0746b21713df9967778e52eeb94cf21b4d594384c8b89a5"} Dec 02 09:02:42 crc kubenswrapper[4895]: I1202 09:02:42.257041 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-image-upload-59f8cff499-vk8rv" Dec 02 09:02:42 crc kubenswrapper[4895]: I1202 09:02:42.361457 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/93ff3ab6-3da6-4d88-b9b8-e292c0c4f1a2-amphora-image\") pod \"93ff3ab6-3da6-4d88-b9b8-e292c0c4f1a2\" (UID: \"93ff3ab6-3da6-4d88-b9b8-e292c0c4f1a2\") " Dec 02 09:02:42 crc kubenswrapper[4895]: I1202 09:02:42.361902 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/93ff3ab6-3da6-4d88-b9b8-e292c0c4f1a2-httpd-config\") pod \"93ff3ab6-3da6-4d88-b9b8-e292c0c4f1a2\" (UID: \"93ff3ab6-3da6-4d88-b9b8-e292c0c4f1a2\") " Dec 02 09:02:42 crc kubenswrapper[4895]: I1202 09:02:42.414068 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93ff3ab6-3da6-4d88-b9b8-e292c0c4f1a2-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "93ff3ab6-3da6-4d88-b9b8-e292c0c4f1a2" (UID: "93ff3ab6-3da6-4d88-b9b8-e292c0c4f1a2"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:02:42 crc kubenswrapper[4895]: I1202 09:02:42.447553 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/93ff3ab6-3da6-4d88-b9b8-e292c0c4f1a2-amphora-image" (OuterVolumeSpecName: "amphora-image") pod "93ff3ab6-3da6-4d88-b9b8-e292c0c4f1a2" (UID: "93ff3ab6-3da6-4d88-b9b8-e292c0c4f1a2"). InnerVolumeSpecName "amphora-image". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:02:42 crc kubenswrapper[4895]: I1202 09:02:42.466246 4895 reconciler_common.go:293] "Volume detached for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/93ff3ab6-3da6-4d88-b9b8-e292c0c4f1a2-amphora-image\") on node \"crc\" DevicePath \"\"" Dec 02 09:02:42 crc kubenswrapper[4895]: I1202 09:02:42.466286 4895 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/93ff3ab6-3da6-4d88-b9b8-e292c0c4f1a2-httpd-config\") on node \"crc\" DevicePath \"\"" Dec 02 09:02:43 crc kubenswrapper[4895]: I1202 09:02:43.167491 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-vk8rv" event={"ID":"93ff3ab6-3da6-4d88-b9b8-e292c0c4f1a2","Type":"ContainerDied","Data":"ae1097e585307af9cd4ee9ccc806e60e32fc90f20c8f996e6e9877e1898fd592"} Dec 02 09:02:43 crc kubenswrapper[4895]: I1202 09:02:43.167888 4895 scope.go:117] "RemoveContainer" containerID="581592b8db0f7e6fc0746b21713df9967778e52eeb94cf21b4d594384c8b89a5" Dec 02 09:02:43 crc kubenswrapper[4895]: I1202 09:02:43.167631 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-image-upload-59f8cff499-vk8rv" Dec 02 09:02:43 crc kubenswrapper[4895]: I1202 09:02:43.200332 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-image-upload-59f8cff499-vk8rv"] Dec 02 09:02:43 crc kubenswrapper[4895]: I1202 09:02:43.206941 4895 scope.go:117] "RemoveContainer" containerID="a6cd0cd7c663c87fa9c28077ae75455c33456c2bc5422cb78a22bdbbdeb55460" Dec 02 09:02:43 crc kubenswrapper[4895]: I1202 09:02:43.210487 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-image-upload-59f8cff499-vk8rv"] Dec 02 09:02:45 crc kubenswrapper[4895]: I1202 09:02:45.156622 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93ff3ab6-3da6-4d88-b9b8-e292c0c4f1a2" path="/var/lib/kubelet/pods/93ff3ab6-3da6-4d88-b9b8-e292c0c4f1a2/volumes" Dec 02 09:02:51 crc kubenswrapper[4895]: I1202 09:02:51.141969 4895 scope.go:117] "RemoveContainer" containerID="d95d9d738930381cc247f4992573f82ec7c89226f23be0d771d1a3b1b3cf1812" Dec 02 09:02:51 crc kubenswrapper[4895]: E1202 09:02:51.143692 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:03:05 crc kubenswrapper[4895]: I1202 09:03:05.142350 4895 scope.go:117] "RemoveContainer" containerID="d95d9d738930381cc247f4992573f82ec7c89226f23be0d771d1a3b1b3cf1812" Dec 02 09:03:05 crc kubenswrapper[4895]: E1202 09:03:05.144368 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:03:05 crc kubenswrapper[4895]: I1202 09:03:05.536020 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-healthmanager-88dkz"] Dec 02 09:03:05 crc kubenswrapper[4895]: E1202 09:03:05.536735 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a0eef60-7fd4-4ad1-b235-6f8c691d6a77" containerName="ovn-config" Dec 02 09:03:05 crc kubenswrapper[4895]: I1202 09:03:05.536785 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a0eef60-7fd4-4ad1-b235-6f8c691d6a77" containerName="ovn-config" Dec 02 09:03:05 crc kubenswrapper[4895]: E1202 09:03:05.536805 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93ff3ab6-3da6-4d88-b9b8-e292c0c4f1a2" containerName="init" Dec 02 09:03:05 crc kubenswrapper[4895]: I1202 09:03:05.536814 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="93ff3ab6-3da6-4d88-b9b8-e292c0c4f1a2" containerName="init" Dec 02 09:03:05 crc kubenswrapper[4895]: E1202 09:03:05.536837 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="304eb43e-b503-4343-945e-3d4def10dc47" containerName="octavia-db-sync" Dec 02 09:03:05 crc kubenswrapper[4895]: I1202 09:03:05.536848 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="304eb43e-b503-4343-945e-3d4def10dc47" containerName="octavia-db-sync" Dec 02 09:03:05 crc kubenswrapper[4895]: E1202 09:03:05.536867 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="304eb43e-b503-4343-945e-3d4def10dc47" containerName="init" Dec 02 09:03:05 crc kubenswrapper[4895]: I1202 09:03:05.536875 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="304eb43e-b503-4343-945e-3d4def10dc47" containerName="init" Dec 02 09:03:05 crc kubenswrapper[4895]: E1202 09:03:05.536904 4895 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93ff3ab6-3da6-4d88-b9b8-e292c0c4f1a2" containerName="octavia-amphora-httpd" Dec 02 09:03:05 crc kubenswrapper[4895]: I1202 09:03:05.536914 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="93ff3ab6-3da6-4d88-b9b8-e292c0c4f1a2" containerName="octavia-amphora-httpd" Dec 02 09:03:05 crc kubenswrapper[4895]: I1202 09:03:05.537214 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a0eef60-7fd4-4ad1-b235-6f8c691d6a77" containerName="ovn-config" Dec 02 09:03:05 crc kubenswrapper[4895]: I1202 09:03:05.537239 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="304eb43e-b503-4343-945e-3d4def10dc47" containerName="octavia-db-sync" Dec 02 09:03:05 crc kubenswrapper[4895]: I1202 09:03:05.537281 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="93ff3ab6-3da6-4d88-b9b8-e292c0c4f1a2" containerName="octavia-amphora-httpd" Dec 02 09:03:05 crc kubenswrapper[4895]: I1202 09:03:05.541317 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-healthmanager-88dkz" Dec 02 09:03:05 crc kubenswrapper[4895]: I1202 09:03:05.574185 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-healthmanager-config-data" Dec 02 09:03:05 crc kubenswrapper[4895]: I1202 09:03:05.574455 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-certs-secret" Dec 02 09:03:05 crc kubenswrapper[4895]: I1202 09:03:05.575057 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-healthmanager-scripts" Dec 02 09:03:05 crc kubenswrapper[4895]: I1202 09:03:05.578140 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-healthmanager-88dkz"] Dec 02 09:03:05 crc kubenswrapper[4895]: I1202 09:03:05.603320 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/7898eb31-07a5-4f75-8646-237041e8d08e-config-data-merged\") pod \"octavia-healthmanager-88dkz\" (UID: \"7898eb31-07a5-4f75-8646-237041e8d08e\") " pod="openstack/octavia-healthmanager-88dkz" Dec 02 09:03:05 crc kubenswrapper[4895]: I1202 09:03:05.603427 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7898eb31-07a5-4f75-8646-237041e8d08e-scripts\") pod \"octavia-healthmanager-88dkz\" (UID: \"7898eb31-07a5-4f75-8646-237041e8d08e\") " pod="openstack/octavia-healthmanager-88dkz" Dec 02 09:03:05 crc kubenswrapper[4895]: I1202 09:03:05.603467 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7898eb31-07a5-4f75-8646-237041e8d08e-config-data\") pod \"octavia-healthmanager-88dkz\" (UID: \"7898eb31-07a5-4f75-8646-237041e8d08e\") " pod="openstack/octavia-healthmanager-88dkz" Dec 02 09:03:05 crc kubenswrapper[4895]: I1202 
09:03:05.603506 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/7898eb31-07a5-4f75-8646-237041e8d08e-hm-ports\") pod \"octavia-healthmanager-88dkz\" (UID: \"7898eb31-07a5-4f75-8646-237041e8d08e\") " pod="openstack/octavia-healthmanager-88dkz" Dec 02 09:03:05 crc kubenswrapper[4895]: I1202 09:03:05.603538 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/7898eb31-07a5-4f75-8646-237041e8d08e-amphora-certs\") pod \"octavia-healthmanager-88dkz\" (UID: \"7898eb31-07a5-4f75-8646-237041e8d08e\") " pod="openstack/octavia-healthmanager-88dkz" Dec 02 09:03:05 crc kubenswrapper[4895]: I1202 09:03:05.603576 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7898eb31-07a5-4f75-8646-237041e8d08e-combined-ca-bundle\") pod \"octavia-healthmanager-88dkz\" (UID: \"7898eb31-07a5-4f75-8646-237041e8d08e\") " pod="openstack/octavia-healthmanager-88dkz" Dec 02 09:03:05 crc kubenswrapper[4895]: I1202 09:03:05.705904 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7898eb31-07a5-4f75-8646-237041e8d08e-scripts\") pod \"octavia-healthmanager-88dkz\" (UID: \"7898eb31-07a5-4f75-8646-237041e8d08e\") " pod="openstack/octavia-healthmanager-88dkz" Dec 02 09:03:05 crc kubenswrapper[4895]: I1202 09:03:05.705982 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7898eb31-07a5-4f75-8646-237041e8d08e-config-data\") pod \"octavia-healthmanager-88dkz\" (UID: \"7898eb31-07a5-4f75-8646-237041e8d08e\") " pod="openstack/octavia-healthmanager-88dkz" Dec 02 09:03:05 crc kubenswrapper[4895]: I1202 09:03:05.706032 4895 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/7898eb31-07a5-4f75-8646-237041e8d08e-hm-ports\") pod \"octavia-healthmanager-88dkz\" (UID: \"7898eb31-07a5-4f75-8646-237041e8d08e\") " pod="openstack/octavia-healthmanager-88dkz" Dec 02 09:03:05 crc kubenswrapper[4895]: I1202 09:03:05.706071 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/7898eb31-07a5-4f75-8646-237041e8d08e-amphora-certs\") pod \"octavia-healthmanager-88dkz\" (UID: \"7898eb31-07a5-4f75-8646-237041e8d08e\") " pod="openstack/octavia-healthmanager-88dkz" Dec 02 09:03:05 crc kubenswrapper[4895]: I1202 09:03:05.706123 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7898eb31-07a5-4f75-8646-237041e8d08e-combined-ca-bundle\") pod \"octavia-healthmanager-88dkz\" (UID: \"7898eb31-07a5-4f75-8646-237041e8d08e\") " pod="openstack/octavia-healthmanager-88dkz" Dec 02 09:03:05 crc kubenswrapper[4895]: I1202 09:03:05.706228 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/7898eb31-07a5-4f75-8646-237041e8d08e-config-data-merged\") pod \"octavia-healthmanager-88dkz\" (UID: \"7898eb31-07a5-4f75-8646-237041e8d08e\") " pod="openstack/octavia-healthmanager-88dkz" Dec 02 09:03:05 crc kubenswrapper[4895]: I1202 09:03:05.706896 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/7898eb31-07a5-4f75-8646-237041e8d08e-config-data-merged\") pod \"octavia-healthmanager-88dkz\" (UID: \"7898eb31-07a5-4f75-8646-237041e8d08e\") " pod="openstack/octavia-healthmanager-88dkz" Dec 02 09:03:05 crc kubenswrapper[4895]: I1202 09:03:05.708178 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"hm-ports\" (UniqueName: \"kubernetes.io/configmap/7898eb31-07a5-4f75-8646-237041e8d08e-hm-ports\") pod \"octavia-healthmanager-88dkz\" (UID: \"7898eb31-07a5-4f75-8646-237041e8d08e\") " pod="openstack/octavia-healthmanager-88dkz" Dec 02 09:03:05 crc kubenswrapper[4895]: I1202 09:03:05.714704 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/7898eb31-07a5-4f75-8646-237041e8d08e-amphora-certs\") pod \"octavia-healthmanager-88dkz\" (UID: \"7898eb31-07a5-4f75-8646-237041e8d08e\") " pod="openstack/octavia-healthmanager-88dkz" Dec 02 09:03:05 crc kubenswrapper[4895]: I1202 09:03:05.714840 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7898eb31-07a5-4f75-8646-237041e8d08e-combined-ca-bundle\") pod \"octavia-healthmanager-88dkz\" (UID: \"7898eb31-07a5-4f75-8646-237041e8d08e\") " pod="openstack/octavia-healthmanager-88dkz" Dec 02 09:03:05 crc kubenswrapper[4895]: I1202 09:03:05.715712 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7898eb31-07a5-4f75-8646-237041e8d08e-scripts\") pod \"octavia-healthmanager-88dkz\" (UID: \"7898eb31-07a5-4f75-8646-237041e8d08e\") " pod="openstack/octavia-healthmanager-88dkz" Dec 02 09:03:05 crc kubenswrapper[4895]: I1202 09:03:05.723236 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7898eb31-07a5-4f75-8646-237041e8d08e-config-data\") pod \"octavia-healthmanager-88dkz\" (UID: \"7898eb31-07a5-4f75-8646-237041e8d08e\") " pod="openstack/octavia-healthmanager-88dkz" Dec 02 09:03:05 crc kubenswrapper[4895]: I1202 09:03:05.916654 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-healthmanager-88dkz" Dec 02 09:03:06 crc kubenswrapper[4895]: I1202 09:03:06.517377 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-healthmanager-88dkz"] Dec 02 09:03:07 crc kubenswrapper[4895]: I1202 09:03:07.013045 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-housekeeping-b4mmm"] Dec 02 09:03:07 crc kubenswrapper[4895]: I1202 09:03:07.015647 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-housekeeping-b4mmm" Dec 02 09:03:07 crc kubenswrapper[4895]: I1202 09:03:07.019076 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-housekeeping-scripts" Dec 02 09:03:07 crc kubenswrapper[4895]: I1202 09:03:07.026162 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-housekeeping-config-data" Dec 02 09:03:07 crc kubenswrapper[4895]: I1202 09:03:07.031516 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-housekeeping-b4mmm"] Dec 02 09:03:07 crc kubenswrapper[4895]: I1202 09:03:07.155278 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/38e2512c-e02b-4088-a3ae-f979fb28e4b7-hm-ports\") pod \"octavia-housekeeping-b4mmm\" (UID: \"38e2512c-e02b-4088-a3ae-f979fb28e4b7\") " pod="openstack/octavia-housekeeping-b4mmm" Dec 02 09:03:07 crc kubenswrapper[4895]: I1202 09:03:07.155344 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/38e2512c-e02b-4088-a3ae-f979fb28e4b7-config-data-merged\") pod \"octavia-housekeeping-b4mmm\" (UID: \"38e2512c-e02b-4088-a3ae-f979fb28e4b7\") " pod="openstack/octavia-housekeeping-b4mmm" Dec 02 09:03:07 crc kubenswrapper[4895]: I1202 09:03:07.155409 4895 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/38e2512c-e02b-4088-a3ae-f979fb28e4b7-amphora-certs\") pod \"octavia-housekeeping-b4mmm\" (UID: \"38e2512c-e02b-4088-a3ae-f979fb28e4b7\") " pod="openstack/octavia-housekeeping-b4mmm" Dec 02 09:03:07 crc kubenswrapper[4895]: I1202 09:03:07.155433 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/38e2512c-e02b-4088-a3ae-f979fb28e4b7-scripts\") pod \"octavia-housekeeping-b4mmm\" (UID: \"38e2512c-e02b-4088-a3ae-f979fb28e4b7\") " pod="openstack/octavia-housekeeping-b4mmm" Dec 02 09:03:07 crc kubenswrapper[4895]: I1202 09:03:07.155451 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38e2512c-e02b-4088-a3ae-f979fb28e4b7-config-data\") pod \"octavia-housekeeping-b4mmm\" (UID: \"38e2512c-e02b-4088-a3ae-f979fb28e4b7\") " pod="openstack/octavia-housekeeping-b4mmm" Dec 02 09:03:07 crc kubenswrapper[4895]: I1202 09:03:07.155520 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38e2512c-e02b-4088-a3ae-f979fb28e4b7-combined-ca-bundle\") pod \"octavia-housekeeping-b4mmm\" (UID: \"38e2512c-e02b-4088-a3ae-f979fb28e4b7\") " pod="openstack/octavia-housekeeping-b4mmm" Dec 02 09:03:07 crc kubenswrapper[4895]: I1202 09:03:07.257256 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/38e2512c-e02b-4088-a3ae-f979fb28e4b7-scripts\") pod \"octavia-housekeeping-b4mmm\" (UID: \"38e2512c-e02b-4088-a3ae-f979fb28e4b7\") " pod="openstack/octavia-housekeeping-b4mmm" Dec 02 09:03:07 crc kubenswrapper[4895]: I1202 09:03:07.258439 4895 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38e2512c-e02b-4088-a3ae-f979fb28e4b7-config-data\") pod \"octavia-housekeeping-b4mmm\" (UID: \"38e2512c-e02b-4088-a3ae-f979fb28e4b7\") " pod="openstack/octavia-housekeeping-b4mmm" Dec 02 09:03:07 crc kubenswrapper[4895]: I1202 09:03:07.258608 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38e2512c-e02b-4088-a3ae-f979fb28e4b7-combined-ca-bundle\") pod \"octavia-housekeeping-b4mmm\" (UID: \"38e2512c-e02b-4088-a3ae-f979fb28e4b7\") " pod="openstack/octavia-housekeeping-b4mmm" Dec 02 09:03:07 crc kubenswrapper[4895]: I1202 09:03:07.258729 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/38e2512c-e02b-4088-a3ae-f979fb28e4b7-hm-ports\") pod \"octavia-housekeeping-b4mmm\" (UID: \"38e2512c-e02b-4088-a3ae-f979fb28e4b7\") " pod="openstack/octavia-housekeeping-b4mmm" Dec 02 09:03:07 crc kubenswrapper[4895]: I1202 09:03:07.258801 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/38e2512c-e02b-4088-a3ae-f979fb28e4b7-config-data-merged\") pod \"octavia-housekeeping-b4mmm\" (UID: \"38e2512c-e02b-4088-a3ae-f979fb28e4b7\") " pod="openstack/octavia-housekeeping-b4mmm" Dec 02 09:03:07 crc kubenswrapper[4895]: I1202 09:03:07.258900 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/38e2512c-e02b-4088-a3ae-f979fb28e4b7-amphora-certs\") pod \"octavia-housekeeping-b4mmm\" (UID: \"38e2512c-e02b-4088-a3ae-f979fb28e4b7\") " pod="openstack/octavia-housekeeping-b4mmm" Dec 02 09:03:07 crc kubenswrapper[4895]: I1202 09:03:07.259877 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: 
\"kubernetes.io/empty-dir/38e2512c-e02b-4088-a3ae-f979fb28e4b7-config-data-merged\") pod \"octavia-housekeeping-b4mmm\" (UID: \"38e2512c-e02b-4088-a3ae-f979fb28e4b7\") " pod="openstack/octavia-housekeeping-b4mmm" Dec 02 09:03:07 crc kubenswrapper[4895]: I1202 09:03:07.260603 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/38e2512c-e02b-4088-a3ae-f979fb28e4b7-hm-ports\") pod \"octavia-housekeeping-b4mmm\" (UID: \"38e2512c-e02b-4088-a3ae-f979fb28e4b7\") " pod="openstack/octavia-housekeeping-b4mmm" Dec 02 09:03:07 crc kubenswrapper[4895]: I1202 09:03:07.266248 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/38e2512c-e02b-4088-a3ae-f979fb28e4b7-amphora-certs\") pod \"octavia-housekeeping-b4mmm\" (UID: \"38e2512c-e02b-4088-a3ae-f979fb28e4b7\") " pod="openstack/octavia-housekeeping-b4mmm" Dec 02 09:03:07 crc kubenswrapper[4895]: I1202 09:03:07.266409 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38e2512c-e02b-4088-a3ae-f979fb28e4b7-config-data\") pod \"octavia-housekeeping-b4mmm\" (UID: \"38e2512c-e02b-4088-a3ae-f979fb28e4b7\") " pod="openstack/octavia-housekeeping-b4mmm" Dec 02 09:03:07 crc kubenswrapper[4895]: I1202 09:03:07.266475 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38e2512c-e02b-4088-a3ae-f979fb28e4b7-combined-ca-bundle\") pod \"octavia-housekeeping-b4mmm\" (UID: \"38e2512c-e02b-4088-a3ae-f979fb28e4b7\") " pod="openstack/octavia-housekeeping-b4mmm" Dec 02 09:03:07 crc kubenswrapper[4895]: I1202 09:03:07.267704 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/38e2512c-e02b-4088-a3ae-f979fb28e4b7-scripts\") pod \"octavia-housekeeping-b4mmm\" (UID: 
\"38e2512c-e02b-4088-a3ae-f979fb28e4b7\") " pod="openstack/octavia-housekeeping-b4mmm" Dec 02 09:03:07 crc kubenswrapper[4895]: I1202 09:03:07.338925 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-housekeeping-b4mmm" Dec 02 09:03:07 crc kubenswrapper[4895]: I1202 09:03:07.425588 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-88dkz" event={"ID":"7898eb31-07a5-4f75-8646-237041e8d08e","Type":"ContainerStarted","Data":"3cf03dca8114201c877b5ec64602d48591d864018e76899a4fc6de62d541ee09"} Dec 02 09:03:07 crc kubenswrapper[4895]: I1202 09:03:07.426540 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-88dkz" event={"ID":"7898eb31-07a5-4f75-8646-237041e8d08e","Type":"ContainerStarted","Data":"3ba8a8f18380608bf865fd2d1864a37810ddd19104fb4d19d4b1121fcb2a32f7"} Dec 02 09:03:07 crc kubenswrapper[4895]: I1202 09:03:07.940298 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-housekeeping-b4mmm"] Dec 02 09:03:07 crc kubenswrapper[4895]: W1202 09:03:07.943210 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod38e2512c_e02b_4088_a3ae_f979fb28e4b7.slice/crio-aba1c1f00eae3db3f8f13239105ee80eb8c2391b86d2e28e3fb9a03d8ee29281 WatchSource:0}: Error finding container aba1c1f00eae3db3f8f13239105ee80eb8c2391b86d2e28e3fb9a03d8ee29281: Status 404 returned error can't find the container with id aba1c1f00eae3db3f8f13239105ee80eb8c2391b86d2e28e3fb9a03d8ee29281 Dec 02 09:03:08 crc kubenswrapper[4895]: I1202 09:03:08.117383 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-worker-zkk7c"] Dec 02 09:03:08 crc kubenswrapper[4895]: I1202 09:03:08.119429 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-worker-zkk7c" Dec 02 09:03:08 crc kubenswrapper[4895]: I1202 09:03:08.121969 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-worker-config-data" Dec 02 09:03:08 crc kubenswrapper[4895]: I1202 09:03:08.122370 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-worker-scripts" Dec 02 09:03:08 crc kubenswrapper[4895]: I1202 09:03:08.133080 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-worker-zkk7c"] Dec 02 09:03:08 crc kubenswrapper[4895]: I1202 09:03:08.183642 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/64a44515-da02-415f-9be2-5fcc1e976ff7-config-data-merged\") pod \"octavia-worker-zkk7c\" (UID: \"64a44515-da02-415f-9be2-5fcc1e976ff7\") " pod="openstack/octavia-worker-zkk7c" Dec 02 09:03:08 crc kubenswrapper[4895]: I1202 09:03:08.183733 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/64a44515-da02-415f-9be2-5fcc1e976ff7-amphora-certs\") pod \"octavia-worker-zkk7c\" (UID: \"64a44515-da02-415f-9be2-5fcc1e976ff7\") " pod="openstack/octavia-worker-zkk7c" Dec 02 09:03:08 crc kubenswrapper[4895]: I1202 09:03:08.196129 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64a44515-da02-415f-9be2-5fcc1e976ff7-combined-ca-bundle\") pod \"octavia-worker-zkk7c\" (UID: \"64a44515-da02-415f-9be2-5fcc1e976ff7\") " pod="openstack/octavia-worker-zkk7c" Dec 02 09:03:08 crc kubenswrapper[4895]: I1202 09:03:08.196175 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/64a44515-da02-415f-9be2-5fcc1e976ff7-config-data\") pod \"octavia-worker-zkk7c\" (UID: \"64a44515-da02-415f-9be2-5fcc1e976ff7\") " pod="openstack/octavia-worker-zkk7c" Dec 02 09:03:08 crc kubenswrapper[4895]: I1202 09:03:08.196215 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/64a44515-da02-415f-9be2-5fcc1e976ff7-scripts\") pod \"octavia-worker-zkk7c\" (UID: \"64a44515-da02-415f-9be2-5fcc1e976ff7\") " pod="openstack/octavia-worker-zkk7c" Dec 02 09:03:08 crc kubenswrapper[4895]: I1202 09:03:08.196400 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/64a44515-da02-415f-9be2-5fcc1e976ff7-hm-ports\") pod \"octavia-worker-zkk7c\" (UID: \"64a44515-da02-415f-9be2-5fcc1e976ff7\") " pod="openstack/octavia-worker-zkk7c" Dec 02 09:03:08 crc kubenswrapper[4895]: I1202 09:03:08.298935 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/64a44515-da02-415f-9be2-5fcc1e976ff7-hm-ports\") pod \"octavia-worker-zkk7c\" (UID: \"64a44515-da02-415f-9be2-5fcc1e976ff7\") " pod="openstack/octavia-worker-zkk7c" Dec 02 09:03:08 crc kubenswrapper[4895]: I1202 09:03:08.299073 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/64a44515-da02-415f-9be2-5fcc1e976ff7-config-data-merged\") pod \"octavia-worker-zkk7c\" (UID: \"64a44515-da02-415f-9be2-5fcc1e976ff7\") " pod="openstack/octavia-worker-zkk7c" Dec 02 09:03:08 crc kubenswrapper[4895]: I1202 09:03:08.299136 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/64a44515-da02-415f-9be2-5fcc1e976ff7-amphora-certs\") pod \"octavia-worker-zkk7c\" (UID: 
\"64a44515-da02-415f-9be2-5fcc1e976ff7\") " pod="openstack/octavia-worker-zkk7c" Dec 02 09:03:08 crc kubenswrapper[4895]: I1202 09:03:08.299251 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64a44515-da02-415f-9be2-5fcc1e976ff7-combined-ca-bundle\") pod \"octavia-worker-zkk7c\" (UID: \"64a44515-da02-415f-9be2-5fcc1e976ff7\") " pod="openstack/octavia-worker-zkk7c" Dec 02 09:03:08 crc kubenswrapper[4895]: I1202 09:03:08.299281 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64a44515-da02-415f-9be2-5fcc1e976ff7-config-data\") pod \"octavia-worker-zkk7c\" (UID: \"64a44515-da02-415f-9be2-5fcc1e976ff7\") " pod="openstack/octavia-worker-zkk7c" Dec 02 09:03:08 crc kubenswrapper[4895]: I1202 09:03:08.299309 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/64a44515-da02-415f-9be2-5fcc1e976ff7-scripts\") pod \"octavia-worker-zkk7c\" (UID: \"64a44515-da02-415f-9be2-5fcc1e976ff7\") " pod="openstack/octavia-worker-zkk7c" Dec 02 09:03:08 crc kubenswrapper[4895]: I1202 09:03:08.299636 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/64a44515-da02-415f-9be2-5fcc1e976ff7-config-data-merged\") pod \"octavia-worker-zkk7c\" (UID: \"64a44515-da02-415f-9be2-5fcc1e976ff7\") " pod="openstack/octavia-worker-zkk7c" Dec 02 09:03:08 crc kubenswrapper[4895]: I1202 09:03:08.300243 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/64a44515-da02-415f-9be2-5fcc1e976ff7-hm-ports\") pod \"octavia-worker-zkk7c\" (UID: \"64a44515-da02-415f-9be2-5fcc1e976ff7\") " pod="openstack/octavia-worker-zkk7c" Dec 02 09:03:08 crc kubenswrapper[4895]: I1202 09:03:08.305327 4895 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64a44515-da02-415f-9be2-5fcc1e976ff7-combined-ca-bundle\") pod \"octavia-worker-zkk7c\" (UID: \"64a44515-da02-415f-9be2-5fcc1e976ff7\") " pod="openstack/octavia-worker-zkk7c" Dec 02 09:03:08 crc kubenswrapper[4895]: I1202 09:03:08.305973 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/64a44515-da02-415f-9be2-5fcc1e976ff7-amphora-certs\") pod \"octavia-worker-zkk7c\" (UID: \"64a44515-da02-415f-9be2-5fcc1e976ff7\") " pod="openstack/octavia-worker-zkk7c" Dec 02 09:03:08 crc kubenswrapper[4895]: I1202 09:03:08.306298 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64a44515-da02-415f-9be2-5fcc1e976ff7-config-data\") pod \"octavia-worker-zkk7c\" (UID: \"64a44515-da02-415f-9be2-5fcc1e976ff7\") " pod="openstack/octavia-worker-zkk7c" Dec 02 09:03:08 crc kubenswrapper[4895]: I1202 09:03:08.310710 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/64a44515-da02-415f-9be2-5fcc1e976ff7-scripts\") pod \"octavia-worker-zkk7c\" (UID: \"64a44515-da02-415f-9be2-5fcc1e976ff7\") " pod="openstack/octavia-worker-zkk7c" Dec 02 09:03:08 crc kubenswrapper[4895]: I1202 09:03:08.438424 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-b4mmm" event={"ID":"38e2512c-e02b-4088-a3ae-f979fb28e4b7","Type":"ContainerStarted","Data":"aba1c1f00eae3db3f8f13239105ee80eb8c2391b86d2e28e3fb9a03d8ee29281"} Dec 02 09:03:08 crc kubenswrapper[4895]: I1202 09:03:08.444308 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-worker-zkk7c" Dec 02 09:03:09 crc kubenswrapper[4895]: I1202 09:03:09.075159 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-worker-zkk7c"] Dec 02 09:03:09 crc kubenswrapper[4895]: W1202 09:03:09.080450 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod64a44515_da02_415f_9be2_5fcc1e976ff7.slice/crio-9ce4c4bcc54e00e0134ff868569f3dd91dd7d767725cab7ca5aade101a26ab9e WatchSource:0}: Error finding container 9ce4c4bcc54e00e0134ff868569f3dd91dd7d767725cab7ca5aade101a26ab9e: Status 404 returned error can't find the container with id 9ce4c4bcc54e00e0134ff868569f3dd91dd7d767725cab7ca5aade101a26ab9e Dec 02 09:03:09 crc kubenswrapper[4895]: I1202 09:03:09.452729 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-zkk7c" event={"ID":"64a44515-da02-415f-9be2-5fcc1e976ff7","Type":"ContainerStarted","Data":"9ce4c4bcc54e00e0134ff868569f3dd91dd7d767725cab7ca5aade101a26ab9e"} Dec 02 09:03:09 crc kubenswrapper[4895]: I1202 09:03:09.455455 4895 generic.go:334] "Generic (PLEG): container finished" podID="7898eb31-07a5-4f75-8646-237041e8d08e" containerID="3cf03dca8114201c877b5ec64602d48591d864018e76899a4fc6de62d541ee09" exitCode=0 Dec 02 09:03:09 crc kubenswrapper[4895]: I1202 09:03:09.455505 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-88dkz" event={"ID":"7898eb31-07a5-4f75-8646-237041e8d08e","Type":"ContainerDied","Data":"3cf03dca8114201c877b5ec64602d48591d864018e76899a4fc6de62d541ee09"} Dec 02 09:03:10 crc kubenswrapper[4895]: I1202 09:03:10.468522 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-b4mmm" event={"ID":"38e2512c-e02b-4088-a3ae-f979fb28e4b7","Type":"ContainerStarted","Data":"8d3017cf196a805135f162527630c7275c8d3e4e89a51ac8089f536efc9a11c6"} Dec 02 09:03:10 crc kubenswrapper[4895]: I1202 
09:03:10.474625 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-88dkz" event={"ID":"7898eb31-07a5-4f75-8646-237041e8d08e","Type":"ContainerStarted","Data":"a9611a5c5ed1b5786851b85873a7aea74d32c6ced9267bab1ca75177ca28caa7"} Dec 02 09:03:10 crc kubenswrapper[4895]: I1202 09:03:10.474985 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-healthmanager-88dkz" Dec 02 09:03:10 crc kubenswrapper[4895]: I1202 09:03:10.519197 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-healthmanager-88dkz" podStartSLOduration=5.51915948 podStartE2EDuration="5.51915948s" podCreationTimestamp="2025-12-02 09:03:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:03:10.506335601 +0000 UTC m=+6001.677195234" watchObservedRunningTime="2025-12-02 09:03:10.51915948 +0000 UTC m=+6001.690019093" Dec 02 09:03:11 crc kubenswrapper[4895]: I1202 09:03:11.494022 4895 generic.go:334] "Generic (PLEG): container finished" podID="38e2512c-e02b-4088-a3ae-f979fb28e4b7" containerID="8d3017cf196a805135f162527630c7275c8d3e4e89a51ac8089f536efc9a11c6" exitCode=0 Dec 02 09:03:11 crc kubenswrapper[4895]: I1202 09:03:11.494270 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-b4mmm" event={"ID":"38e2512c-e02b-4088-a3ae-f979fb28e4b7","Type":"ContainerDied","Data":"8d3017cf196a805135f162527630c7275c8d3e4e89a51ac8089f536efc9a11c6"} Dec 02 09:03:12 crc kubenswrapper[4895]: I1202 09:03:12.514175 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-b4mmm" event={"ID":"38e2512c-e02b-4088-a3ae-f979fb28e4b7","Type":"ContainerStarted","Data":"0f1b2abfacd9c90375f7d804f3cf1a22e5b41291c1c52f73965d672ab720ad97"} Dec 02 09:03:12 crc kubenswrapper[4895]: I1202 09:03:12.514905 4895 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/octavia-housekeeping-b4mmm" Dec 02 09:03:12 crc kubenswrapper[4895]: I1202 09:03:12.561089 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-housekeeping-b4mmm" podStartSLOduration=4.684343859 podStartE2EDuration="6.561063174s" podCreationTimestamp="2025-12-02 09:03:06 +0000 UTC" firstStartedPulling="2025-12-02 09:03:07.946843165 +0000 UTC m=+5999.117702778" lastFinishedPulling="2025-12-02 09:03:09.82356248 +0000 UTC m=+6000.994422093" observedRunningTime="2025-12-02 09:03:12.552080065 +0000 UTC m=+6003.722939698" watchObservedRunningTime="2025-12-02 09:03:12.561063174 +0000 UTC m=+6003.731922787" Dec 02 09:03:13 crc kubenswrapper[4895]: I1202 09:03:13.527442 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-zkk7c" event={"ID":"64a44515-da02-415f-9be2-5fcc1e976ff7","Type":"ContainerStarted","Data":"c217d859d0839753abb3389c702728305b3ea08b0df4bbd9ef9e188d76072f93"} Dec 02 09:03:14 crc kubenswrapper[4895]: I1202 09:03:14.543662 4895 generic.go:334] "Generic (PLEG): container finished" podID="64a44515-da02-415f-9be2-5fcc1e976ff7" containerID="c217d859d0839753abb3389c702728305b3ea08b0df4bbd9ef9e188d76072f93" exitCode=0 Dec 02 09:03:14 crc kubenswrapper[4895]: I1202 09:03:14.543785 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-zkk7c" event={"ID":"64a44515-da02-415f-9be2-5fcc1e976ff7","Type":"ContainerDied","Data":"c217d859d0839753abb3389c702728305b3ea08b0df4bbd9ef9e188d76072f93"} Dec 02 09:03:15 crc kubenswrapper[4895]: I1202 09:03:15.567714 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-zkk7c" event={"ID":"64a44515-da02-415f-9be2-5fcc1e976ff7","Type":"ContainerStarted","Data":"c8535ee2d031840dccdf48689713d5b9f0d630e916ddcd6a6d03d82da050a0e5"} Dec 02 09:03:15 crc kubenswrapper[4895]: I1202 09:03:15.570855 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/octavia-worker-zkk7c" Dec 02 09:03:15 crc kubenswrapper[4895]: I1202 09:03:15.600028 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-worker-zkk7c" podStartSLOduration=3.902605249 podStartE2EDuration="7.600000426s" podCreationTimestamp="2025-12-02 09:03:08 +0000 UTC" firstStartedPulling="2025-12-02 09:03:09.083461045 +0000 UTC m=+6000.254320658" lastFinishedPulling="2025-12-02 09:03:12.780856212 +0000 UTC m=+6003.951715835" observedRunningTime="2025-12-02 09:03:15.593272757 +0000 UTC m=+6006.764132410" watchObservedRunningTime="2025-12-02 09:03:15.600000426 +0000 UTC m=+6006.770860039" Dec 02 09:03:19 crc kubenswrapper[4895]: I1202 09:03:19.152141 4895 scope.go:117] "RemoveContainer" containerID="d95d9d738930381cc247f4992573f82ec7c89226f23be0d771d1a3b1b3cf1812" Dec 02 09:03:19 crc kubenswrapper[4895]: I1202 09:03:19.622456 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" event={"ID":"0468d2d1-a975-45a6-af9f-47adc432fab0","Type":"ContainerStarted","Data":"3e66018704e5440a759c7db87d699ad813d9bb81de4b6aa004c7a6747bba333a"} Dec 02 09:03:20 crc kubenswrapper[4895]: I1202 09:03:20.947020 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-healthmanager-88dkz" Dec 02 09:03:22 crc kubenswrapper[4895]: I1202 09:03:22.378767 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-housekeeping-b4mmm" Dec 02 09:03:23 crc kubenswrapper[4895]: I1202 09:03:23.485806 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-worker-zkk7c" Dec 02 09:03:34 crc kubenswrapper[4895]: E1202 09:03:34.293149 4895 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.13:34466->38.102.83.13:37351: write tcp 38.102.83.13:34466->38.102.83.13:37351: write: broken pipe Dec 02 09:04:00 crc 
kubenswrapper[4895]: I1202 09:04:00.048666 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-snb7q"] Dec 02 09:04:00 crc kubenswrapper[4895]: I1202 09:04:00.058591 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-345f-account-create-update-qpz2w"] Dec 02 09:04:00 crc kubenswrapper[4895]: I1202 09:04:00.075007 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-345f-account-create-update-qpz2w"] Dec 02 09:04:00 crc kubenswrapper[4895]: I1202 09:04:00.086087 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-snb7q"] Dec 02 09:04:01 crc kubenswrapper[4895]: I1202 09:04:01.160800 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ec9f4cd-d497-4bb1-aea0-bb28977e971d" path="/var/lib/kubelet/pods/5ec9f4cd-d497-4bb1-aea0-bb28977e971d/volumes" Dec 02 09:04:01 crc kubenswrapper[4895]: I1202 09:04:01.163119 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c20cf098-8a77-4677-959a-9264e799bb6a" path="/var/lib/kubelet/pods/c20cf098-8a77-4677-959a-9264e799bb6a/volumes" Dec 02 09:04:06 crc kubenswrapper[4895]: I1202 09:04:06.041373 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-6j44z"] Dec 02 09:04:06 crc kubenswrapper[4895]: I1202 09:04:06.056363 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-6j44z"] Dec 02 09:04:07 crc kubenswrapper[4895]: I1202 09:04:07.163111 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11de1ef0-4dea-4745-9d17-8fcc2a89f38c" path="/var/lib/kubelet/pods/11de1ef0-4dea-4745-9d17-8fcc2a89f38c/volumes" Dec 02 09:04:12 crc kubenswrapper[4895]: I1202 09:04:12.222374 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-68f4ccf997-tvwxr"] Dec 02 09:04:12 crc kubenswrapper[4895]: I1202 09:04:12.232138 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-68f4ccf997-tvwxr" Dec 02 09:04:12 crc kubenswrapper[4895]: I1202 09:04:12.235883 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-qgjpf" Dec 02 09:04:12 crc kubenswrapper[4895]: I1202 09:04:12.236191 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Dec 02 09:04:12 crc kubenswrapper[4895]: I1202 09:04:12.236432 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Dec 02 09:04:12 crc kubenswrapper[4895]: I1202 09:04:12.237725 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Dec 02 09:04:12 crc kubenswrapper[4895]: I1202 09:04:12.247213 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-68f4ccf997-tvwxr"] Dec 02 09:04:12 crc kubenswrapper[4895]: I1202 09:04:12.294626 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 02 09:04:12 crc kubenswrapper[4895]: I1202 09:04:12.295294 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="2d9c01c8-fbe2-4c99-9c4a-edc0560aebd6" containerName="glance-log" containerID="cri-o://a467d7fd4acf073b3008fe8d8d3c8690e72c4989f5c7e9a09a9d89db485e1d74" gracePeriod=30 Dec 02 09:04:12 crc kubenswrapper[4895]: I1202 09:04:12.295942 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="2d9c01c8-fbe2-4c99-9c4a-edc0560aebd6" containerName="glance-httpd" containerID="cri-o://da1c3a95269074d1f09180dc4ae3943a96d109e287418ba82bc8eb967de3da9b" gracePeriod=30 Dec 02 09:04:12 crc kubenswrapper[4895]: I1202 09:04:12.308126 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/7fe5d716-e549-4eb1-9bff-83a0afed16c2-config-data\") pod \"horizon-68f4ccf997-tvwxr\" (UID: \"7fe5d716-e549-4eb1-9bff-83a0afed16c2\") " pod="openstack/horizon-68f4ccf997-tvwxr" Dec 02 09:04:12 crc kubenswrapper[4895]: I1202 09:04:12.308418 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z48xd\" (UniqueName: \"kubernetes.io/projected/7fe5d716-e549-4eb1-9bff-83a0afed16c2-kube-api-access-z48xd\") pod \"horizon-68f4ccf997-tvwxr\" (UID: \"7fe5d716-e549-4eb1-9bff-83a0afed16c2\") " pod="openstack/horizon-68f4ccf997-tvwxr" Dec 02 09:04:12 crc kubenswrapper[4895]: I1202 09:04:12.308464 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7fe5d716-e549-4eb1-9bff-83a0afed16c2-logs\") pod \"horizon-68f4ccf997-tvwxr\" (UID: \"7fe5d716-e549-4eb1-9bff-83a0afed16c2\") " pod="openstack/horizon-68f4ccf997-tvwxr" Dec 02 09:04:12 crc kubenswrapper[4895]: I1202 09:04:12.308861 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/7fe5d716-e549-4eb1-9bff-83a0afed16c2-horizon-secret-key\") pod \"horizon-68f4ccf997-tvwxr\" (UID: \"7fe5d716-e549-4eb1-9bff-83a0afed16c2\") " pod="openstack/horizon-68f4ccf997-tvwxr" Dec 02 09:04:12 crc kubenswrapper[4895]: I1202 09:04:12.308925 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7fe5d716-e549-4eb1-9bff-83a0afed16c2-scripts\") pod \"horizon-68f4ccf997-tvwxr\" (UID: \"7fe5d716-e549-4eb1-9bff-83a0afed16c2\") " pod="openstack/horizon-68f4ccf997-tvwxr" Dec 02 09:04:12 crc kubenswrapper[4895]: I1202 09:04:12.359471 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7b776dc549-zrcws"] Dec 02 09:04:12 crc kubenswrapper[4895]: I1202 
09:04:12.370864 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7b776dc549-zrcws" Dec 02 09:04:12 crc kubenswrapper[4895]: I1202 09:04:12.409120 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 02 09:04:12 crc kubenswrapper[4895]: I1202 09:04:12.409487 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="773bc693-f07d-4938-8980-3099a7dbc5dd" containerName="glance-log" containerID="cri-o://aa831bd876d7a69f0f63864e7275c900c7bd4478df5baff23116f39ba661ed6a" gracePeriod=30 Dec 02 09:04:12 crc kubenswrapper[4895]: I1202 09:04:12.409704 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="773bc693-f07d-4938-8980-3099a7dbc5dd" containerName="glance-httpd" containerID="cri-o://35d0b895398752a7799047c2a0e5d3eb54f05528de7ecc20484d047f9950c08a" gracePeriod=30 Dec 02 09:04:12 crc kubenswrapper[4895]: I1202 09:04:12.414637 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/099f2897-7daf-4053-83a3-caacbf2ea78a-scripts\") pod \"horizon-7b776dc549-zrcws\" (UID: \"099f2897-7daf-4053-83a3-caacbf2ea78a\") " pod="openstack/horizon-7b776dc549-zrcws" Dec 02 09:04:12 crc kubenswrapper[4895]: I1202 09:04:12.414783 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/7fe5d716-e549-4eb1-9bff-83a0afed16c2-horizon-secret-key\") pod \"horizon-68f4ccf997-tvwxr\" (UID: \"7fe5d716-e549-4eb1-9bff-83a0afed16c2\") " pod="openstack/horizon-68f4ccf997-tvwxr" Dec 02 09:04:12 crc kubenswrapper[4895]: I1202 09:04:12.414820 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/7fe5d716-e549-4eb1-9bff-83a0afed16c2-scripts\") pod \"horizon-68f4ccf997-tvwxr\" (UID: \"7fe5d716-e549-4eb1-9bff-83a0afed16c2\") " pod="openstack/horizon-68f4ccf997-tvwxr" Dec 02 09:04:12 crc kubenswrapper[4895]: I1202 09:04:12.414899 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/099f2897-7daf-4053-83a3-caacbf2ea78a-horizon-secret-key\") pod \"horizon-7b776dc549-zrcws\" (UID: \"099f2897-7daf-4053-83a3-caacbf2ea78a\") " pod="openstack/horizon-7b776dc549-zrcws" Dec 02 09:04:12 crc kubenswrapper[4895]: I1202 09:04:12.415018 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/099f2897-7daf-4053-83a3-caacbf2ea78a-config-data\") pod \"horizon-7b776dc549-zrcws\" (UID: \"099f2897-7daf-4053-83a3-caacbf2ea78a\") " pod="openstack/horizon-7b776dc549-zrcws" Dec 02 09:04:12 crc kubenswrapper[4895]: I1202 09:04:12.415058 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7fe5d716-e549-4eb1-9bff-83a0afed16c2-config-data\") pod \"horizon-68f4ccf997-tvwxr\" (UID: \"7fe5d716-e549-4eb1-9bff-83a0afed16c2\") " pod="openstack/horizon-68f4ccf997-tvwxr" Dec 02 09:04:12 crc kubenswrapper[4895]: I1202 09:04:12.415111 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jz8vh\" (UniqueName: \"kubernetes.io/projected/099f2897-7daf-4053-83a3-caacbf2ea78a-kube-api-access-jz8vh\") pod \"horizon-7b776dc549-zrcws\" (UID: \"099f2897-7daf-4053-83a3-caacbf2ea78a\") " pod="openstack/horizon-7b776dc549-zrcws" Dec 02 09:04:12 crc kubenswrapper[4895]: I1202 09:04:12.415192 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/099f2897-7daf-4053-83a3-caacbf2ea78a-logs\") pod \"horizon-7b776dc549-zrcws\" (UID: \"099f2897-7daf-4053-83a3-caacbf2ea78a\") " pod="openstack/horizon-7b776dc549-zrcws" Dec 02 09:04:12 crc kubenswrapper[4895]: I1202 09:04:12.415292 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z48xd\" (UniqueName: \"kubernetes.io/projected/7fe5d716-e549-4eb1-9bff-83a0afed16c2-kube-api-access-z48xd\") pod \"horizon-68f4ccf997-tvwxr\" (UID: \"7fe5d716-e549-4eb1-9bff-83a0afed16c2\") " pod="openstack/horizon-68f4ccf997-tvwxr" Dec 02 09:04:12 crc kubenswrapper[4895]: I1202 09:04:12.415329 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7fe5d716-e549-4eb1-9bff-83a0afed16c2-logs\") pod \"horizon-68f4ccf997-tvwxr\" (UID: \"7fe5d716-e549-4eb1-9bff-83a0afed16c2\") " pod="openstack/horizon-68f4ccf997-tvwxr" Dec 02 09:04:12 crc kubenswrapper[4895]: I1202 09:04:12.416003 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7fe5d716-e549-4eb1-9bff-83a0afed16c2-logs\") pod \"horizon-68f4ccf997-tvwxr\" (UID: \"7fe5d716-e549-4eb1-9bff-83a0afed16c2\") " pod="openstack/horizon-68f4ccf997-tvwxr" Dec 02 09:04:12 crc kubenswrapper[4895]: I1202 09:04:12.417639 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7fe5d716-e549-4eb1-9bff-83a0afed16c2-scripts\") pod \"horizon-68f4ccf997-tvwxr\" (UID: \"7fe5d716-e549-4eb1-9bff-83a0afed16c2\") " pod="openstack/horizon-68f4ccf997-tvwxr" Dec 02 09:04:12 crc kubenswrapper[4895]: I1202 09:04:12.433506 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/7fe5d716-e549-4eb1-9bff-83a0afed16c2-horizon-secret-key\") pod \"horizon-68f4ccf997-tvwxr\" (UID: \"7fe5d716-e549-4eb1-9bff-83a0afed16c2\") " 
pod="openstack/horizon-68f4ccf997-tvwxr" Dec 02 09:04:12 crc kubenswrapper[4895]: I1202 09:04:12.437460 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7fe5d716-e549-4eb1-9bff-83a0afed16c2-config-data\") pod \"horizon-68f4ccf997-tvwxr\" (UID: \"7fe5d716-e549-4eb1-9bff-83a0afed16c2\") " pod="openstack/horizon-68f4ccf997-tvwxr" Dec 02 09:04:12 crc kubenswrapper[4895]: I1202 09:04:12.438214 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7b776dc549-zrcws"] Dec 02 09:04:12 crc kubenswrapper[4895]: I1202 09:04:12.438932 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z48xd\" (UniqueName: \"kubernetes.io/projected/7fe5d716-e549-4eb1-9bff-83a0afed16c2-kube-api-access-z48xd\") pod \"horizon-68f4ccf997-tvwxr\" (UID: \"7fe5d716-e549-4eb1-9bff-83a0afed16c2\") " pod="openstack/horizon-68f4ccf997-tvwxr" Dec 02 09:04:12 crc kubenswrapper[4895]: I1202 09:04:12.517663 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/099f2897-7daf-4053-83a3-caacbf2ea78a-logs\") pod \"horizon-7b776dc549-zrcws\" (UID: \"099f2897-7daf-4053-83a3-caacbf2ea78a\") " pod="openstack/horizon-7b776dc549-zrcws" Dec 02 09:04:12 crc kubenswrapper[4895]: I1202 09:04:12.517835 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/099f2897-7daf-4053-83a3-caacbf2ea78a-scripts\") pod \"horizon-7b776dc549-zrcws\" (UID: \"099f2897-7daf-4053-83a3-caacbf2ea78a\") " pod="openstack/horizon-7b776dc549-zrcws" Dec 02 09:04:12 crc kubenswrapper[4895]: I1202 09:04:12.517922 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/099f2897-7daf-4053-83a3-caacbf2ea78a-horizon-secret-key\") pod \"horizon-7b776dc549-zrcws\" (UID: 
\"099f2897-7daf-4053-83a3-caacbf2ea78a\") " pod="openstack/horizon-7b776dc549-zrcws" Dec 02 09:04:12 crc kubenswrapper[4895]: I1202 09:04:12.518031 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/099f2897-7daf-4053-83a3-caacbf2ea78a-config-data\") pod \"horizon-7b776dc549-zrcws\" (UID: \"099f2897-7daf-4053-83a3-caacbf2ea78a\") " pod="openstack/horizon-7b776dc549-zrcws" Dec 02 09:04:12 crc kubenswrapper[4895]: I1202 09:04:12.518080 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jz8vh\" (UniqueName: \"kubernetes.io/projected/099f2897-7daf-4053-83a3-caacbf2ea78a-kube-api-access-jz8vh\") pod \"horizon-7b776dc549-zrcws\" (UID: \"099f2897-7daf-4053-83a3-caacbf2ea78a\") " pod="openstack/horizon-7b776dc549-zrcws" Dec 02 09:04:12 crc kubenswrapper[4895]: I1202 09:04:12.519003 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/099f2897-7daf-4053-83a3-caacbf2ea78a-logs\") pod \"horizon-7b776dc549-zrcws\" (UID: \"099f2897-7daf-4053-83a3-caacbf2ea78a\") " pod="openstack/horizon-7b776dc549-zrcws" Dec 02 09:04:12 crc kubenswrapper[4895]: I1202 09:04:12.520407 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/099f2897-7daf-4053-83a3-caacbf2ea78a-config-data\") pod \"horizon-7b776dc549-zrcws\" (UID: \"099f2897-7daf-4053-83a3-caacbf2ea78a\") " pod="openstack/horizon-7b776dc549-zrcws" Dec 02 09:04:12 crc kubenswrapper[4895]: I1202 09:04:12.520986 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/099f2897-7daf-4053-83a3-caacbf2ea78a-scripts\") pod \"horizon-7b776dc549-zrcws\" (UID: \"099f2897-7daf-4053-83a3-caacbf2ea78a\") " pod="openstack/horizon-7b776dc549-zrcws" Dec 02 09:04:12 crc kubenswrapper[4895]: I1202 09:04:12.527939 
4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/099f2897-7daf-4053-83a3-caacbf2ea78a-horizon-secret-key\") pod \"horizon-7b776dc549-zrcws\" (UID: \"099f2897-7daf-4053-83a3-caacbf2ea78a\") " pod="openstack/horizon-7b776dc549-zrcws" Dec 02 09:04:12 crc kubenswrapper[4895]: I1202 09:04:12.540846 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jz8vh\" (UniqueName: \"kubernetes.io/projected/099f2897-7daf-4053-83a3-caacbf2ea78a-kube-api-access-jz8vh\") pod \"horizon-7b776dc549-zrcws\" (UID: \"099f2897-7daf-4053-83a3-caacbf2ea78a\") " pod="openstack/horizon-7b776dc549-zrcws" Dec 02 09:04:12 crc kubenswrapper[4895]: I1202 09:04:12.554720 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-68f4ccf997-tvwxr" Dec 02 09:04:12 crc kubenswrapper[4895]: I1202 09:04:12.698279 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7b776dc549-zrcws" Dec 02 09:04:12 crc kubenswrapper[4895]: I1202 09:04:12.947649 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-68f4ccf997-tvwxr"] Dec 02 09:04:12 crc kubenswrapper[4895]: I1202 09:04:12.994002 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-76bcdfd5df-n7r77"] Dec 02 09:04:13 crc kubenswrapper[4895]: I1202 09:04:13.007992 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-76bcdfd5df-n7r77" Dec 02 09:04:13 crc kubenswrapper[4895]: I1202 09:04:13.018453 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-76bcdfd5df-n7r77"] Dec 02 09:04:13 crc kubenswrapper[4895]: I1202 09:04:13.036038 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1e401d14-d2d7-4b54-a61d-c40b8125462b-config-data\") pod \"horizon-76bcdfd5df-n7r77\" (UID: \"1e401d14-d2d7-4b54-a61d-c40b8125462b\") " pod="openstack/horizon-76bcdfd5df-n7r77" Dec 02 09:04:13 crc kubenswrapper[4895]: I1202 09:04:13.036126 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9s7ww\" (UniqueName: \"kubernetes.io/projected/1e401d14-d2d7-4b54-a61d-c40b8125462b-kube-api-access-9s7ww\") pod \"horizon-76bcdfd5df-n7r77\" (UID: \"1e401d14-d2d7-4b54-a61d-c40b8125462b\") " pod="openstack/horizon-76bcdfd5df-n7r77" Dec 02 09:04:13 crc kubenswrapper[4895]: I1202 09:04:13.036241 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1e401d14-d2d7-4b54-a61d-c40b8125462b-logs\") pod \"horizon-76bcdfd5df-n7r77\" (UID: \"1e401d14-d2d7-4b54-a61d-c40b8125462b\") " pod="openstack/horizon-76bcdfd5df-n7r77" Dec 02 09:04:13 crc kubenswrapper[4895]: I1202 09:04:13.036354 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1e401d14-d2d7-4b54-a61d-c40b8125462b-scripts\") pod \"horizon-76bcdfd5df-n7r77\" (UID: \"1e401d14-d2d7-4b54-a61d-c40b8125462b\") " pod="openstack/horizon-76bcdfd5df-n7r77" Dec 02 09:04:13 crc kubenswrapper[4895]: I1202 09:04:13.036453 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/1e401d14-d2d7-4b54-a61d-c40b8125462b-horizon-secret-key\") pod \"horizon-76bcdfd5df-n7r77\" (UID: \"1e401d14-d2d7-4b54-a61d-c40b8125462b\") " pod="openstack/horizon-76bcdfd5df-n7r77" Dec 02 09:04:13 crc kubenswrapper[4895]: I1202 09:04:13.095295 4895 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 02 09:04:13 crc kubenswrapper[4895]: I1202 09:04:13.113224 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-68f4ccf997-tvwxr"] Dec 02 09:04:13 crc kubenswrapper[4895]: I1202 09:04:13.144498 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1e401d14-d2d7-4b54-a61d-c40b8125462b-scripts\") pod \"horizon-76bcdfd5df-n7r77\" (UID: \"1e401d14-d2d7-4b54-a61d-c40b8125462b\") " pod="openstack/horizon-76bcdfd5df-n7r77" Dec 02 09:04:13 crc kubenswrapper[4895]: I1202 09:04:13.144649 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1e401d14-d2d7-4b54-a61d-c40b8125462b-horizon-secret-key\") pod \"horizon-76bcdfd5df-n7r77\" (UID: \"1e401d14-d2d7-4b54-a61d-c40b8125462b\") " pod="openstack/horizon-76bcdfd5df-n7r77" Dec 02 09:04:13 crc kubenswrapper[4895]: I1202 09:04:13.145641 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1e401d14-d2d7-4b54-a61d-c40b8125462b-config-data\") pod \"horizon-76bcdfd5df-n7r77\" (UID: \"1e401d14-d2d7-4b54-a61d-c40b8125462b\") " pod="openstack/horizon-76bcdfd5df-n7r77" Dec 02 09:04:13 crc kubenswrapper[4895]: I1202 09:04:13.145686 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9s7ww\" (UniqueName: \"kubernetes.io/projected/1e401d14-d2d7-4b54-a61d-c40b8125462b-kube-api-access-9s7ww\") pod \"horizon-76bcdfd5df-n7r77\" (UID: 
\"1e401d14-d2d7-4b54-a61d-c40b8125462b\") " pod="openstack/horizon-76bcdfd5df-n7r77" Dec 02 09:04:13 crc kubenswrapper[4895]: I1202 09:04:13.145762 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1e401d14-d2d7-4b54-a61d-c40b8125462b-logs\") pod \"horizon-76bcdfd5df-n7r77\" (UID: \"1e401d14-d2d7-4b54-a61d-c40b8125462b\") " pod="openstack/horizon-76bcdfd5df-n7r77" Dec 02 09:04:13 crc kubenswrapper[4895]: I1202 09:04:13.147239 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1e401d14-d2d7-4b54-a61d-c40b8125462b-logs\") pod \"horizon-76bcdfd5df-n7r77\" (UID: \"1e401d14-d2d7-4b54-a61d-c40b8125462b\") " pod="openstack/horizon-76bcdfd5df-n7r77" Dec 02 09:04:13 crc kubenswrapper[4895]: I1202 09:04:13.148794 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1e401d14-d2d7-4b54-a61d-c40b8125462b-scripts\") pod \"horizon-76bcdfd5df-n7r77\" (UID: \"1e401d14-d2d7-4b54-a61d-c40b8125462b\") " pod="openstack/horizon-76bcdfd5df-n7r77" Dec 02 09:04:13 crc kubenswrapper[4895]: I1202 09:04:13.153773 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1e401d14-d2d7-4b54-a61d-c40b8125462b-config-data\") pod \"horizon-76bcdfd5df-n7r77\" (UID: \"1e401d14-d2d7-4b54-a61d-c40b8125462b\") " pod="openstack/horizon-76bcdfd5df-n7r77" Dec 02 09:04:13 crc kubenswrapper[4895]: I1202 09:04:13.161566 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1e401d14-d2d7-4b54-a61d-c40b8125462b-horizon-secret-key\") pod \"horizon-76bcdfd5df-n7r77\" (UID: \"1e401d14-d2d7-4b54-a61d-c40b8125462b\") " pod="openstack/horizon-76bcdfd5df-n7r77" Dec 02 09:04:13 crc kubenswrapper[4895]: I1202 09:04:13.168466 4895 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-9s7ww\" (UniqueName: \"kubernetes.io/projected/1e401d14-d2d7-4b54-a61d-c40b8125462b-kube-api-access-9s7ww\") pod \"horizon-76bcdfd5df-n7r77\" (UID: \"1e401d14-d2d7-4b54-a61d-c40b8125462b\") " pod="openstack/horizon-76bcdfd5df-n7r77" Dec 02 09:04:13 crc kubenswrapper[4895]: I1202 09:04:13.367294 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-76bcdfd5df-n7r77" Dec 02 09:04:13 crc kubenswrapper[4895]: I1202 09:04:13.376766 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-68f4ccf997-tvwxr" event={"ID":"7fe5d716-e549-4eb1-9bff-83a0afed16c2","Type":"ContainerStarted","Data":"23240f355d8ff9165b633599479172475c8f23ce7f8ccba0a94d8246f19a8ae6"} Dec 02 09:04:13 crc kubenswrapper[4895]: I1202 09:04:13.383339 4895 generic.go:334] "Generic (PLEG): container finished" podID="2d9c01c8-fbe2-4c99-9c4a-edc0560aebd6" containerID="a467d7fd4acf073b3008fe8d8d3c8690e72c4989f5c7e9a09a9d89db485e1d74" exitCode=143 Dec 02 09:04:13 crc kubenswrapper[4895]: I1202 09:04:13.383426 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2d9c01c8-fbe2-4c99-9c4a-edc0560aebd6","Type":"ContainerDied","Data":"a467d7fd4acf073b3008fe8d8d3c8690e72c4989f5c7e9a09a9d89db485e1d74"} Dec 02 09:04:13 crc kubenswrapper[4895]: I1202 09:04:13.389295 4895 generic.go:334] "Generic (PLEG): container finished" podID="773bc693-f07d-4938-8980-3099a7dbc5dd" containerID="aa831bd876d7a69f0f63864e7275c900c7bd4478df5baff23116f39ba661ed6a" exitCode=143 Dec 02 09:04:13 crc kubenswrapper[4895]: I1202 09:04:13.389324 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"773bc693-f07d-4938-8980-3099a7dbc5dd","Type":"ContainerDied","Data":"aa831bd876d7a69f0f63864e7275c900c7bd4478df5baff23116f39ba661ed6a"} Dec 02 09:04:13 crc kubenswrapper[4895]: I1202 09:04:13.441476 4895 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7b776dc549-zrcws"] Dec 02 09:04:13 crc kubenswrapper[4895]: W1202 09:04:13.463711 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod099f2897_7daf_4053_83a3_caacbf2ea78a.slice/crio-511acb7a55475c9bc82ad3767cb0dd3a4b6ebc895ee79a407926fbcfd9b62ec4 WatchSource:0}: Error finding container 511acb7a55475c9bc82ad3767cb0dd3a4b6ebc895ee79a407926fbcfd9b62ec4: Status 404 returned error can't find the container with id 511acb7a55475c9bc82ad3767cb0dd3a4b6ebc895ee79a407926fbcfd9b62ec4 Dec 02 09:04:13 crc kubenswrapper[4895]: I1202 09:04:13.898409 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-76bcdfd5df-n7r77"] Dec 02 09:04:13 crc kubenswrapper[4895]: W1202 09:04:13.910891 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1e401d14_d2d7_4b54_a61d_c40b8125462b.slice/crio-002b9f55b24c432f4eeb0d1842b7617a86fbcc59d1d43fd8c788e2bdc89e6a56 WatchSource:0}: Error finding container 002b9f55b24c432f4eeb0d1842b7617a86fbcc59d1d43fd8c788e2bdc89e6a56: Status 404 returned error can't find the container with id 002b9f55b24c432f4eeb0d1842b7617a86fbcc59d1d43fd8c788e2bdc89e6a56 Dec 02 09:04:14 crc kubenswrapper[4895]: I1202 09:04:14.403493 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7b776dc549-zrcws" event={"ID":"099f2897-7daf-4053-83a3-caacbf2ea78a","Type":"ContainerStarted","Data":"511acb7a55475c9bc82ad3767cb0dd3a4b6ebc895ee79a407926fbcfd9b62ec4"} Dec 02 09:04:14 crc kubenswrapper[4895]: I1202 09:04:14.406203 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-76bcdfd5df-n7r77" event={"ID":"1e401d14-d2d7-4b54-a61d-c40b8125462b","Type":"ContainerStarted","Data":"002b9f55b24c432f4eeb0d1842b7617a86fbcc59d1d43fd8c788e2bdc89e6a56"} Dec 02 09:04:16 crc kubenswrapper[4895]: 
I1202 09:04:16.435360 4895 generic.go:334] "Generic (PLEG): container finished" podID="2d9c01c8-fbe2-4c99-9c4a-edc0560aebd6" containerID="da1c3a95269074d1f09180dc4ae3943a96d109e287418ba82bc8eb967de3da9b" exitCode=0 Dec 02 09:04:16 crc kubenswrapper[4895]: I1202 09:04:16.435439 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2d9c01c8-fbe2-4c99-9c4a-edc0560aebd6","Type":"ContainerDied","Data":"da1c3a95269074d1f09180dc4ae3943a96d109e287418ba82bc8eb967de3da9b"} Dec 02 09:04:16 crc kubenswrapper[4895]: I1202 09:04:16.441701 4895 generic.go:334] "Generic (PLEG): container finished" podID="773bc693-f07d-4938-8980-3099a7dbc5dd" containerID="35d0b895398752a7799047c2a0e5d3eb54f05528de7ecc20484d047f9950c08a" exitCode=0 Dec 02 09:04:16 crc kubenswrapper[4895]: I1202 09:04:16.441753 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"773bc693-f07d-4938-8980-3099a7dbc5dd","Type":"ContainerDied","Data":"35d0b895398752a7799047c2a0e5d3eb54f05528de7ecc20484d047f9950c08a"} Dec 02 09:04:21 crc kubenswrapper[4895]: I1202 09:04:21.227296 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 02 09:04:21 crc kubenswrapper[4895]: I1202 09:04:21.233060 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 02 09:04:21 crc kubenswrapper[4895]: I1202 09:04:21.298006 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/2d9c01c8-fbe2-4c99-9c4a-edc0560aebd6-ceph\") pod \"2d9c01c8-fbe2-4c99-9c4a-edc0560aebd6\" (UID: \"2d9c01c8-fbe2-4c99-9c4a-edc0560aebd6\") " Dec 02 09:04:21 crc kubenswrapper[4895]: I1202 09:04:21.298148 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d9c01c8-fbe2-4c99-9c4a-edc0560aebd6-combined-ca-bundle\") pod \"2d9c01c8-fbe2-4c99-9c4a-edc0560aebd6\" (UID: \"2d9c01c8-fbe2-4c99-9c4a-edc0560aebd6\") " Dec 02 09:04:21 crc kubenswrapper[4895]: I1202 09:04:21.298214 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s8flz\" (UniqueName: \"kubernetes.io/projected/2d9c01c8-fbe2-4c99-9c4a-edc0560aebd6-kube-api-access-s8flz\") pod \"2d9c01c8-fbe2-4c99-9c4a-edc0560aebd6\" (UID: \"2d9c01c8-fbe2-4c99-9c4a-edc0560aebd6\") " Dec 02 09:04:21 crc kubenswrapper[4895]: I1202 09:04:21.298255 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2d9c01c8-fbe2-4c99-9c4a-edc0560aebd6-httpd-run\") pod \"2d9c01c8-fbe2-4c99-9c4a-edc0560aebd6\" (UID: \"2d9c01c8-fbe2-4c99-9c4a-edc0560aebd6\") " Dec 02 09:04:21 crc kubenswrapper[4895]: I1202 09:04:21.298282 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/773bc693-f07d-4938-8980-3099a7dbc5dd-combined-ca-bundle\") pod \"773bc693-f07d-4938-8980-3099a7dbc5dd\" (UID: \"773bc693-f07d-4938-8980-3099a7dbc5dd\") " Dec 02 09:04:21 crc kubenswrapper[4895]: I1202 09:04:21.298351 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" 
(UniqueName: \"kubernetes.io/projected/773bc693-f07d-4938-8980-3099a7dbc5dd-ceph\") pod \"773bc693-f07d-4938-8980-3099a7dbc5dd\" (UID: \"773bc693-f07d-4938-8980-3099a7dbc5dd\") " Dec 02 09:04:21 crc kubenswrapper[4895]: I1202 09:04:21.298370 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d9c01c8-fbe2-4c99-9c4a-edc0560aebd6-scripts\") pod \"2d9c01c8-fbe2-4c99-9c4a-edc0560aebd6\" (UID: \"2d9c01c8-fbe2-4c99-9c4a-edc0560aebd6\") " Dec 02 09:04:21 crc kubenswrapper[4895]: I1202 09:04:21.298409 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/773bc693-f07d-4938-8980-3099a7dbc5dd-scripts\") pod \"773bc693-f07d-4938-8980-3099a7dbc5dd\" (UID: \"773bc693-f07d-4938-8980-3099a7dbc5dd\") " Dec 02 09:04:21 crc kubenswrapper[4895]: I1202 09:04:21.298447 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2d9c01c8-fbe2-4c99-9c4a-edc0560aebd6-logs\") pod \"2d9c01c8-fbe2-4c99-9c4a-edc0560aebd6\" (UID: \"2d9c01c8-fbe2-4c99-9c4a-edc0560aebd6\") " Dec 02 09:04:21 crc kubenswrapper[4895]: I1202 09:04:21.298465 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/773bc693-f07d-4938-8980-3099a7dbc5dd-config-data\") pod \"773bc693-f07d-4938-8980-3099a7dbc5dd\" (UID: \"773bc693-f07d-4938-8980-3099a7dbc5dd\") " Dec 02 09:04:21 crc kubenswrapper[4895]: I1202 09:04:21.298509 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d9c01c8-fbe2-4c99-9c4a-edc0560aebd6-config-data\") pod \"2d9c01c8-fbe2-4c99-9c4a-edc0560aebd6\" (UID: \"2d9c01c8-fbe2-4c99-9c4a-edc0560aebd6\") " Dec 02 09:04:21 crc kubenswrapper[4895]: I1202 09:04:21.303180 4895 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/empty-dir/2d9c01c8-fbe2-4c99-9c4a-edc0560aebd6-logs" (OuterVolumeSpecName: "logs") pod "2d9c01c8-fbe2-4c99-9c4a-edc0560aebd6" (UID: "2d9c01c8-fbe2-4c99-9c4a-edc0560aebd6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:04:21 crc kubenswrapper[4895]: I1202 09:04:21.305967 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d9c01c8-fbe2-4c99-9c4a-edc0560aebd6-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "2d9c01c8-fbe2-4c99-9c4a-edc0560aebd6" (UID: "2d9c01c8-fbe2-4c99-9c4a-edc0560aebd6"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:04:21 crc kubenswrapper[4895]: I1202 09:04:21.308656 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/773bc693-f07d-4938-8980-3099a7dbc5dd-ceph" (OuterVolumeSpecName: "ceph") pod "773bc693-f07d-4938-8980-3099a7dbc5dd" (UID: "773bc693-f07d-4938-8980-3099a7dbc5dd"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:04:21 crc kubenswrapper[4895]: I1202 09:04:21.319836 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d9c01c8-fbe2-4c99-9c4a-edc0560aebd6-ceph" (OuterVolumeSpecName: "ceph") pod "2d9c01c8-fbe2-4c99-9c4a-edc0560aebd6" (UID: "2d9c01c8-fbe2-4c99-9c4a-edc0560aebd6"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:04:21 crc kubenswrapper[4895]: I1202 09:04:21.321889 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d9c01c8-fbe2-4c99-9c4a-edc0560aebd6-scripts" (OuterVolumeSpecName: "scripts") pod "2d9c01c8-fbe2-4c99-9c4a-edc0560aebd6" (UID: "2d9c01c8-fbe2-4c99-9c4a-edc0560aebd6"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:04:21 crc kubenswrapper[4895]: I1202 09:04:21.322027 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/773bc693-f07d-4938-8980-3099a7dbc5dd-scripts" (OuterVolumeSpecName: "scripts") pod "773bc693-f07d-4938-8980-3099a7dbc5dd" (UID: "773bc693-f07d-4938-8980-3099a7dbc5dd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:04:21 crc kubenswrapper[4895]: I1202 09:04:21.325030 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d9c01c8-fbe2-4c99-9c4a-edc0560aebd6-kube-api-access-s8flz" (OuterVolumeSpecName: "kube-api-access-s8flz") pod "2d9c01c8-fbe2-4c99-9c4a-edc0560aebd6" (UID: "2d9c01c8-fbe2-4c99-9c4a-edc0560aebd6"). InnerVolumeSpecName "kube-api-access-s8flz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:04:21 crc kubenswrapper[4895]: I1202 09:04:21.361942 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d9c01c8-fbe2-4c99-9c4a-edc0560aebd6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2d9c01c8-fbe2-4c99-9c4a-edc0560aebd6" (UID: "2d9c01c8-fbe2-4c99-9c4a-edc0560aebd6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:04:21 crc kubenswrapper[4895]: I1202 09:04:21.362049 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/773bc693-f07d-4938-8980-3099a7dbc5dd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "773bc693-f07d-4938-8980-3099a7dbc5dd" (UID: "773bc693-f07d-4938-8980-3099a7dbc5dd"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:04:21 crc kubenswrapper[4895]: I1202 09:04:21.379713 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/773bc693-f07d-4938-8980-3099a7dbc5dd-config-data" (OuterVolumeSpecName: "config-data") pod "773bc693-f07d-4938-8980-3099a7dbc5dd" (UID: "773bc693-f07d-4938-8980-3099a7dbc5dd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:04:21 crc kubenswrapper[4895]: I1202 09:04:21.381336 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d9c01c8-fbe2-4c99-9c4a-edc0560aebd6-config-data" (OuterVolumeSpecName: "config-data") pod "2d9c01c8-fbe2-4c99-9c4a-edc0560aebd6" (UID: "2d9c01c8-fbe2-4c99-9c4a-edc0560aebd6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:04:21 crc kubenswrapper[4895]: I1202 09:04:21.400670 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mzttl\" (UniqueName: \"kubernetes.io/projected/773bc693-f07d-4938-8980-3099a7dbc5dd-kube-api-access-mzttl\") pod \"773bc693-f07d-4938-8980-3099a7dbc5dd\" (UID: \"773bc693-f07d-4938-8980-3099a7dbc5dd\") " Dec 02 09:04:21 crc kubenswrapper[4895]: I1202 09:04:21.400818 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/773bc693-f07d-4938-8980-3099a7dbc5dd-logs\") pod \"773bc693-f07d-4938-8980-3099a7dbc5dd\" (UID: \"773bc693-f07d-4938-8980-3099a7dbc5dd\") " Dec 02 09:04:21 crc kubenswrapper[4895]: I1202 09:04:21.400849 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/773bc693-f07d-4938-8980-3099a7dbc5dd-httpd-run\") pod \"773bc693-f07d-4938-8980-3099a7dbc5dd\" (UID: \"773bc693-f07d-4938-8980-3099a7dbc5dd\") " Dec 02 09:04:21 crc kubenswrapper[4895]: I1202 
09:04:21.401379 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d9c01c8-fbe2-4c99-9c4a-edc0560aebd6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 09:04:21 crc kubenswrapper[4895]: I1202 09:04:21.401399 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s8flz\" (UniqueName: \"kubernetes.io/projected/2d9c01c8-fbe2-4c99-9c4a-edc0560aebd6-kube-api-access-s8flz\") on node \"crc\" DevicePath \"\"" Dec 02 09:04:21 crc kubenswrapper[4895]: I1202 09:04:21.401409 4895 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2d9c01c8-fbe2-4c99-9c4a-edc0560aebd6-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 02 09:04:21 crc kubenswrapper[4895]: I1202 09:04:21.401419 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/773bc693-f07d-4938-8980-3099a7dbc5dd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 09:04:21 crc kubenswrapper[4895]: I1202 09:04:21.401431 4895 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/773bc693-f07d-4938-8980-3099a7dbc5dd-ceph\") on node \"crc\" DevicePath \"\"" Dec 02 09:04:21 crc kubenswrapper[4895]: I1202 09:04:21.401439 4895 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d9c01c8-fbe2-4c99-9c4a-edc0560aebd6-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 09:04:21 crc kubenswrapper[4895]: I1202 09:04:21.401448 4895 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/773bc693-f07d-4938-8980-3099a7dbc5dd-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 09:04:21 crc kubenswrapper[4895]: I1202 09:04:21.401455 4895 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/2d9c01c8-fbe2-4c99-9c4a-edc0560aebd6-logs\") on node \"crc\" DevicePath \"\"" Dec 02 09:04:21 crc kubenswrapper[4895]: I1202 09:04:21.401464 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/773bc693-f07d-4938-8980-3099a7dbc5dd-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 09:04:21 crc kubenswrapper[4895]: I1202 09:04:21.401444 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/773bc693-f07d-4938-8980-3099a7dbc5dd-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "773bc693-f07d-4938-8980-3099a7dbc5dd" (UID: "773bc693-f07d-4938-8980-3099a7dbc5dd"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:04:21 crc kubenswrapper[4895]: I1202 09:04:21.401471 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d9c01c8-fbe2-4c99-9c4a-edc0560aebd6-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 09:04:21 crc kubenswrapper[4895]: I1202 09:04:21.401530 4895 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/2d9c01c8-fbe2-4c99-9c4a-edc0560aebd6-ceph\") on node \"crc\" DevicePath \"\"" Dec 02 09:04:21 crc kubenswrapper[4895]: I1202 09:04:21.401477 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/773bc693-f07d-4938-8980-3099a7dbc5dd-logs" (OuterVolumeSpecName: "logs") pod "773bc693-f07d-4938-8980-3099a7dbc5dd" (UID: "773bc693-f07d-4938-8980-3099a7dbc5dd"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:04:21 crc kubenswrapper[4895]: I1202 09:04:21.403961 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/773bc693-f07d-4938-8980-3099a7dbc5dd-kube-api-access-mzttl" (OuterVolumeSpecName: "kube-api-access-mzttl") pod "773bc693-f07d-4938-8980-3099a7dbc5dd" (UID: "773bc693-f07d-4938-8980-3099a7dbc5dd"). InnerVolumeSpecName "kube-api-access-mzttl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:04:21 crc kubenswrapper[4895]: I1202 09:04:21.496530 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"773bc693-f07d-4938-8980-3099a7dbc5dd","Type":"ContainerDied","Data":"cd8945dc3b543224fde2d78afdbbec6fb4ee5d8f98340ca7e2432d547ad0a045"} Dec 02 09:04:21 crc kubenswrapper[4895]: I1202 09:04:21.496593 4895 scope.go:117] "RemoveContainer" containerID="35d0b895398752a7799047c2a0e5d3eb54f05528de7ecc20484d047f9950c08a" Dec 02 09:04:21 crc kubenswrapper[4895]: I1202 09:04:21.496791 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 02 09:04:21 crc kubenswrapper[4895]: I1202 09:04:21.501149 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2d9c01c8-fbe2-4c99-9c4a-edc0560aebd6","Type":"ContainerDied","Data":"1a7bbe60536d3d5d4eb8933eb461adebc6a2b62b76d6bb90727b5b313ea906f1"} Dec 02 09:04:21 crc kubenswrapper[4895]: I1202 09:04:21.501259 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 02 09:04:21 crc kubenswrapper[4895]: I1202 09:04:21.503109 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mzttl\" (UniqueName: \"kubernetes.io/projected/773bc693-f07d-4938-8980-3099a7dbc5dd-kube-api-access-mzttl\") on node \"crc\" DevicePath \"\"" Dec 02 09:04:21 crc kubenswrapper[4895]: I1202 09:04:21.503168 4895 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/773bc693-f07d-4938-8980-3099a7dbc5dd-logs\") on node \"crc\" DevicePath \"\"" Dec 02 09:04:21 crc kubenswrapper[4895]: I1202 09:04:21.503185 4895 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/773bc693-f07d-4938-8980-3099a7dbc5dd-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 02 09:04:21 crc kubenswrapper[4895]: I1202 09:04:21.560812 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 02 09:04:21 crc kubenswrapper[4895]: I1202 09:04:21.563267 4895 scope.go:117] "RemoveContainer" containerID="aa831bd876d7a69f0f63864e7275c900c7bd4478df5baff23116f39ba661ed6a" Dec 02 09:04:21 crc kubenswrapper[4895]: I1202 09:04:21.573140 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 02 09:04:21 crc kubenswrapper[4895]: I1202 09:04:21.582198 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 02 09:04:21 crc kubenswrapper[4895]: I1202 09:04:21.590251 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 02 09:04:21 crc kubenswrapper[4895]: E1202 09:04:21.590834 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d9c01c8-fbe2-4c99-9c4a-edc0560aebd6" containerName="glance-httpd" Dec 02 09:04:21 crc kubenswrapper[4895]: I1202 09:04:21.590856 4895 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="2d9c01c8-fbe2-4c99-9c4a-edc0560aebd6" containerName="glance-httpd" Dec 02 09:04:21 crc kubenswrapper[4895]: E1202 09:04:21.590883 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="773bc693-f07d-4938-8980-3099a7dbc5dd" containerName="glance-log" Dec 02 09:04:21 crc kubenswrapper[4895]: I1202 09:04:21.590894 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="773bc693-f07d-4938-8980-3099a7dbc5dd" containerName="glance-log" Dec 02 09:04:21 crc kubenswrapper[4895]: E1202 09:04:21.590931 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d9c01c8-fbe2-4c99-9c4a-edc0560aebd6" containerName="glance-log" Dec 02 09:04:21 crc kubenswrapper[4895]: I1202 09:04:21.590940 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d9c01c8-fbe2-4c99-9c4a-edc0560aebd6" containerName="glance-log" Dec 02 09:04:21 crc kubenswrapper[4895]: E1202 09:04:21.590954 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="773bc693-f07d-4938-8980-3099a7dbc5dd" containerName="glance-httpd" Dec 02 09:04:21 crc kubenswrapper[4895]: I1202 09:04:21.590965 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="773bc693-f07d-4938-8980-3099a7dbc5dd" containerName="glance-httpd" Dec 02 09:04:21 crc kubenswrapper[4895]: I1202 09:04:21.591202 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d9c01c8-fbe2-4c99-9c4a-edc0560aebd6" containerName="glance-httpd" Dec 02 09:04:21 crc kubenswrapper[4895]: I1202 09:04:21.591238 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="773bc693-f07d-4938-8980-3099a7dbc5dd" containerName="glance-httpd" Dec 02 09:04:21 crc kubenswrapper[4895]: I1202 09:04:21.591264 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d9c01c8-fbe2-4c99-9c4a-edc0560aebd6" containerName="glance-log" Dec 02 09:04:21 crc kubenswrapper[4895]: I1202 09:04:21.591282 4895 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="773bc693-f07d-4938-8980-3099a7dbc5dd" containerName="glance-log" Dec 02 09:04:21 crc kubenswrapper[4895]: I1202 09:04:21.592657 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 02 09:04:21 crc kubenswrapper[4895]: I1202 09:04:21.595476 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 02 09:04:21 crc kubenswrapper[4895]: I1202 09:04:21.595692 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-thr6k" Dec 02 09:04:21 crc kubenswrapper[4895]: I1202 09:04:21.595903 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Dec 02 09:04:21 crc kubenswrapper[4895]: I1202 09:04:21.596382 4895 scope.go:117] "RemoveContainer" containerID="da1c3a95269074d1f09180dc4ae3943a96d109e287418ba82bc8eb967de3da9b" Dec 02 09:04:21 crc kubenswrapper[4895]: I1202 09:04:21.609960 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 02 09:04:21 crc kubenswrapper[4895]: I1202 09:04:21.716216 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f19bb559-c258-498f-9132-7ee9ea57db14-scripts\") pod \"glance-default-internal-api-0\" (UID: \"f19bb559-c258-498f-9132-7ee9ea57db14\") " pod="openstack/glance-default-internal-api-0" Dec 02 09:04:21 crc kubenswrapper[4895]: I1202 09:04:21.716482 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f19bb559-c258-498f-9132-7ee9ea57db14-logs\") pod \"glance-default-internal-api-0\" (UID: \"f19bb559-c258-498f-9132-7ee9ea57db14\") " pod="openstack/glance-default-internal-api-0" Dec 02 09:04:21 crc kubenswrapper[4895]: I1202 09:04:21.716504 4895 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jckb7\" (UniqueName: \"kubernetes.io/projected/f19bb559-c258-498f-9132-7ee9ea57db14-kube-api-access-jckb7\") pod \"glance-default-internal-api-0\" (UID: \"f19bb559-c258-498f-9132-7ee9ea57db14\") " pod="openstack/glance-default-internal-api-0" Dec 02 09:04:21 crc kubenswrapper[4895]: I1202 09:04:21.716567 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f19bb559-c258-498f-9132-7ee9ea57db14-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"f19bb559-c258-498f-9132-7ee9ea57db14\") " pod="openstack/glance-default-internal-api-0" Dec 02 09:04:21 crc kubenswrapper[4895]: I1202 09:04:21.716642 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f19bb559-c258-498f-9132-7ee9ea57db14-config-data\") pod \"glance-default-internal-api-0\" (UID: \"f19bb559-c258-498f-9132-7ee9ea57db14\") " pod="openstack/glance-default-internal-api-0" Dec 02 09:04:21 crc kubenswrapper[4895]: I1202 09:04:21.716662 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f19bb559-c258-498f-9132-7ee9ea57db14-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"f19bb559-c258-498f-9132-7ee9ea57db14\") " pod="openstack/glance-default-internal-api-0" Dec 02 09:04:21 crc kubenswrapper[4895]: I1202 09:04:21.716687 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/f19bb559-c258-498f-9132-7ee9ea57db14-ceph\") pod \"glance-default-internal-api-0\" (UID: \"f19bb559-c258-498f-9132-7ee9ea57db14\") " pod="openstack/glance-default-internal-api-0" Dec 02 09:04:21 crc kubenswrapper[4895]: I1202 09:04:21.721917 
4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 02 09:04:21 crc kubenswrapper[4895]: I1202 09:04:21.773992 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 02 09:04:21 crc kubenswrapper[4895]: I1202 09:04:21.778621 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 02 09:04:21 crc kubenswrapper[4895]: I1202 09:04:21.785727 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 02 09:04:21 crc kubenswrapper[4895]: I1202 09:04:21.797857 4895 scope.go:117] "RemoveContainer" containerID="a467d7fd4acf073b3008fe8d8d3c8690e72c4989f5c7e9a09a9d89db485e1d74" Dec 02 09:04:21 crc kubenswrapper[4895]: I1202 09:04:21.822389 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f19bb559-c258-498f-9132-7ee9ea57db14-scripts\") pod \"glance-default-internal-api-0\" (UID: \"f19bb559-c258-498f-9132-7ee9ea57db14\") " pod="openstack/glance-default-internal-api-0" Dec 02 09:04:21 crc kubenswrapper[4895]: I1202 09:04:21.822467 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f19bb559-c258-498f-9132-7ee9ea57db14-logs\") pod \"glance-default-internal-api-0\" (UID: \"f19bb559-c258-498f-9132-7ee9ea57db14\") " pod="openstack/glance-default-internal-api-0" Dec 02 09:04:21 crc kubenswrapper[4895]: I1202 09:04:21.822492 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jckb7\" (UniqueName: \"kubernetes.io/projected/f19bb559-c258-498f-9132-7ee9ea57db14-kube-api-access-jckb7\") pod \"glance-default-internal-api-0\" (UID: \"f19bb559-c258-498f-9132-7ee9ea57db14\") " pod="openstack/glance-default-internal-api-0" Dec 02 09:04:21 crc kubenswrapper[4895]: 
I1202 09:04:21.822618 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f19bb559-c258-498f-9132-7ee9ea57db14-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"f19bb559-c258-498f-9132-7ee9ea57db14\") " pod="openstack/glance-default-internal-api-0" Dec 02 09:04:21 crc kubenswrapper[4895]: I1202 09:04:21.822770 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f19bb559-c258-498f-9132-7ee9ea57db14-config-data\") pod \"glance-default-internal-api-0\" (UID: \"f19bb559-c258-498f-9132-7ee9ea57db14\") " pod="openstack/glance-default-internal-api-0" Dec 02 09:04:21 crc kubenswrapper[4895]: I1202 09:04:21.822801 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f19bb559-c258-498f-9132-7ee9ea57db14-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"f19bb559-c258-498f-9132-7ee9ea57db14\") " pod="openstack/glance-default-internal-api-0" Dec 02 09:04:21 crc kubenswrapper[4895]: I1202 09:04:21.822840 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/f19bb559-c258-498f-9132-7ee9ea57db14-ceph\") pod \"glance-default-internal-api-0\" (UID: \"f19bb559-c258-498f-9132-7ee9ea57db14\") " pod="openstack/glance-default-internal-api-0" Dec 02 09:04:21 crc kubenswrapper[4895]: I1202 09:04:21.823919 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f19bb559-c258-498f-9132-7ee9ea57db14-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"f19bb559-c258-498f-9132-7ee9ea57db14\") " pod="openstack/glance-default-internal-api-0" Dec 02 09:04:21 crc kubenswrapper[4895]: I1202 09:04:21.824067 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/f19bb559-c258-498f-9132-7ee9ea57db14-logs\") pod \"glance-default-internal-api-0\" (UID: \"f19bb559-c258-498f-9132-7ee9ea57db14\") " pod="openstack/glance-default-internal-api-0" Dec 02 09:04:21 crc kubenswrapper[4895]: I1202 09:04:21.826502 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 02 09:04:21 crc kubenswrapper[4895]: I1202 09:04:21.831379 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f19bb559-c258-498f-9132-7ee9ea57db14-scripts\") pod \"glance-default-internal-api-0\" (UID: \"f19bb559-c258-498f-9132-7ee9ea57db14\") " pod="openstack/glance-default-internal-api-0" Dec 02 09:04:21 crc kubenswrapper[4895]: I1202 09:04:21.831534 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/f19bb559-c258-498f-9132-7ee9ea57db14-ceph\") pod \"glance-default-internal-api-0\" (UID: \"f19bb559-c258-498f-9132-7ee9ea57db14\") " pod="openstack/glance-default-internal-api-0" Dec 02 09:04:21 crc kubenswrapper[4895]: I1202 09:04:21.831775 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f19bb559-c258-498f-9132-7ee9ea57db14-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"f19bb559-c258-498f-9132-7ee9ea57db14\") " pod="openstack/glance-default-internal-api-0" Dec 02 09:04:21 crc kubenswrapper[4895]: I1202 09:04:21.836706 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f19bb559-c258-498f-9132-7ee9ea57db14-config-data\") pod \"glance-default-internal-api-0\" (UID: \"f19bb559-c258-498f-9132-7ee9ea57db14\") " pod="openstack/glance-default-internal-api-0" Dec 02 09:04:21 crc kubenswrapper[4895]: I1202 09:04:21.847907 4895 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-jckb7\" (UniqueName: \"kubernetes.io/projected/f19bb559-c258-498f-9132-7ee9ea57db14-kube-api-access-jckb7\") pod \"glance-default-internal-api-0\" (UID: \"f19bb559-c258-498f-9132-7ee9ea57db14\") " pod="openstack/glance-default-internal-api-0" Dec 02 09:04:21 crc kubenswrapper[4895]: I1202 09:04:21.924426 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2f09559d-29ed-400b-8069-4684c4d060cd-logs\") pod \"glance-default-external-api-0\" (UID: \"2f09559d-29ed-400b-8069-4684c4d060cd\") " pod="openstack/glance-default-external-api-0" Dec 02 09:04:21 crc kubenswrapper[4895]: I1202 09:04:21.924482 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/2f09559d-29ed-400b-8069-4684c4d060cd-ceph\") pod \"glance-default-external-api-0\" (UID: \"2f09559d-29ed-400b-8069-4684c4d060cd\") " pod="openstack/glance-default-external-api-0" Dec 02 09:04:21 crc kubenswrapper[4895]: I1202 09:04:21.924820 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfh4t\" (UniqueName: \"kubernetes.io/projected/2f09559d-29ed-400b-8069-4684c4d060cd-kube-api-access-kfh4t\") pod \"glance-default-external-api-0\" (UID: \"2f09559d-29ed-400b-8069-4684c4d060cd\") " pod="openstack/glance-default-external-api-0" Dec 02 09:04:21 crc kubenswrapper[4895]: I1202 09:04:21.924883 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f09559d-29ed-400b-8069-4684c4d060cd-scripts\") pod \"glance-default-external-api-0\" (UID: \"2f09559d-29ed-400b-8069-4684c4d060cd\") " pod="openstack/glance-default-external-api-0" Dec 02 09:04:21 crc kubenswrapper[4895]: I1202 09:04:21.924976 4895 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f09559d-29ed-400b-8069-4684c4d060cd-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2f09559d-29ed-400b-8069-4684c4d060cd\") " pod="openstack/glance-default-external-api-0" Dec 02 09:04:21 crc kubenswrapper[4895]: I1202 09:04:21.925071 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2f09559d-29ed-400b-8069-4684c4d060cd-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2f09559d-29ed-400b-8069-4684c4d060cd\") " pod="openstack/glance-default-external-api-0" Dec 02 09:04:21 crc kubenswrapper[4895]: I1202 09:04:21.925232 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f09559d-29ed-400b-8069-4684c4d060cd-config-data\") pod \"glance-default-external-api-0\" (UID: \"2f09559d-29ed-400b-8069-4684c4d060cd\") " pod="openstack/glance-default-external-api-0" Dec 02 09:04:22 crc kubenswrapper[4895]: I1202 09:04:22.026428 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kfh4t\" (UniqueName: \"kubernetes.io/projected/2f09559d-29ed-400b-8069-4684c4d060cd-kube-api-access-kfh4t\") pod \"glance-default-external-api-0\" (UID: \"2f09559d-29ed-400b-8069-4684c4d060cd\") " pod="openstack/glance-default-external-api-0" Dec 02 09:04:22 crc kubenswrapper[4895]: I1202 09:04:22.026489 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f09559d-29ed-400b-8069-4684c4d060cd-scripts\") pod \"glance-default-external-api-0\" (UID: \"2f09559d-29ed-400b-8069-4684c4d060cd\") " pod="openstack/glance-default-external-api-0" Dec 02 09:04:22 crc kubenswrapper[4895]: I1202 09:04:22.026532 4895 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f09559d-29ed-400b-8069-4684c4d060cd-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2f09559d-29ed-400b-8069-4684c4d060cd\") " pod="openstack/glance-default-external-api-0" Dec 02 09:04:22 crc kubenswrapper[4895]: I1202 09:04:22.026577 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2f09559d-29ed-400b-8069-4684c4d060cd-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2f09559d-29ed-400b-8069-4684c4d060cd\") " pod="openstack/glance-default-external-api-0" Dec 02 09:04:22 crc kubenswrapper[4895]: I1202 09:04:22.026614 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f09559d-29ed-400b-8069-4684c4d060cd-config-data\") pod \"glance-default-external-api-0\" (UID: \"2f09559d-29ed-400b-8069-4684c4d060cd\") " pod="openstack/glance-default-external-api-0" Dec 02 09:04:22 crc kubenswrapper[4895]: I1202 09:04:22.026661 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2f09559d-29ed-400b-8069-4684c4d060cd-logs\") pod \"glance-default-external-api-0\" (UID: \"2f09559d-29ed-400b-8069-4684c4d060cd\") " pod="openstack/glance-default-external-api-0" Dec 02 09:04:22 crc kubenswrapper[4895]: I1202 09:04:22.026702 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/2f09559d-29ed-400b-8069-4684c4d060cd-ceph\") pod \"glance-default-external-api-0\" (UID: \"2f09559d-29ed-400b-8069-4684c4d060cd\") " pod="openstack/glance-default-external-api-0" Dec 02 09:04:22 crc kubenswrapper[4895]: I1202 09:04:22.027889 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/2f09559d-29ed-400b-8069-4684c4d060cd-logs\") pod \"glance-default-external-api-0\" (UID: \"2f09559d-29ed-400b-8069-4684c4d060cd\") " pod="openstack/glance-default-external-api-0" Dec 02 09:04:22 crc kubenswrapper[4895]: I1202 09:04:22.028087 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2f09559d-29ed-400b-8069-4684c4d060cd-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2f09559d-29ed-400b-8069-4684c4d060cd\") " pod="openstack/glance-default-external-api-0" Dec 02 09:04:22 crc kubenswrapper[4895]: I1202 09:04:22.031136 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/2f09559d-29ed-400b-8069-4684c4d060cd-ceph\") pod \"glance-default-external-api-0\" (UID: \"2f09559d-29ed-400b-8069-4684c4d060cd\") " pod="openstack/glance-default-external-api-0" Dec 02 09:04:22 crc kubenswrapper[4895]: I1202 09:04:22.031247 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f09559d-29ed-400b-8069-4684c4d060cd-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2f09559d-29ed-400b-8069-4684c4d060cd\") " pod="openstack/glance-default-external-api-0" Dec 02 09:04:22 crc kubenswrapper[4895]: I1202 09:04:22.033496 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f09559d-29ed-400b-8069-4684c4d060cd-scripts\") pod \"glance-default-external-api-0\" (UID: \"2f09559d-29ed-400b-8069-4684c4d060cd\") " pod="openstack/glance-default-external-api-0" Dec 02 09:04:22 crc kubenswrapper[4895]: I1202 09:04:22.037068 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f09559d-29ed-400b-8069-4684c4d060cd-config-data\") pod \"glance-default-external-api-0\" (UID: 
\"2f09559d-29ed-400b-8069-4684c4d060cd\") " pod="openstack/glance-default-external-api-0" Dec 02 09:04:22 crc kubenswrapper[4895]: I1202 09:04:22.043355 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 02 09:04:22 crc kubenswrapper[4895]: I1202 09:04:22.050006 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfh4t\" (UniqueName: \"kubernetes.io/projected/2f09559d-29ed-400b-8069-4684c4d060cd-kube-api-access-kfh4t\") pod \"glance-default-external-api-0\" (UID: \"2f09559d-29ed-400b-8069-4684c4d060cd\") " pod="openstack/glance-default-external-api-0" Dec 02 09:04:22 crc kubenswrapper[4895]: I1202 09:04:22.105549 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 02 09:04:22 crc kubenswrapper[4895]: I1202 09:04:22.515283 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-68f4ccf997-tvwxr" event={"ID":"7fe5d716-e549-4eb1-9bff-83a0afed16c2","Type":"ContainerStarted","Data":"fa8729f512b9c8272b69f5522c401d70e95abfc6e33cd40ed9723c82f3339531"} Dec 02 09:04:22 crc kubenswrapper[4895]: I1202 09:04:22.516059 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-68f4ccf997-tvwxr" event={"ID":"7fe5d716-e549-4eb1-9bff-83a0afed16c2","Type":"ContainerStarted","Data":"e93438b53e54936015836510cdc6cb9b32305274e5cc526bd00283525667c6c5"} Dec 02 09:04:22 crc kubenswrapper[4895]: I1202 09:04:22.515603 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-68f4ccf997-tvwxr" podUID="7fe5d716-e549-4eb1-9bff-83a0afed16c2" containerName="horizon" containerID="cri-o://fa8729f512b9c8272b69f5522c401d70e95abfc6e33cd40ed9723c82f3339531" gracePeriod=30 Dec 02 09:04:22 crc kubenswrapper[4895]: I1202 09:04:22.515392 4895 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/horizon-68f4ccf997-tvwxr" podUID="7fe5d716-e549-4eb1-9bff-83a0afed16c2" containerName="horizon-log" containerID="cri-o://e93438b53e54936015836510cdc6cb9b32305274e5cc526bd00283525667c6c5" gracePeriod=30 Dec 02 09:04:22 crc kubenswrapper[4895]: I1202 09:04:22.526437 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7b776dc549-zrcws" event={"ID":"099f2897-7daf-4053-83a3-caacbf2ea78a","Type":"ContainerStarted","Data":"2b684b253b769194b8612cbf5954882e0ee204e9fd5fd536a695dc7c9daece5f"} Dec 02 09:04:22 crc kubenswrapper[4895]: I1202 09:04:22.526484 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7b776dc549-zrcws" event={"ID":"099f2897-7daf-4053-83a3-caacbf2ea78a","Type":"ContainerStarted","Data":"5fa13d0b187c0a91d12a149222998075c3a44e64c312ee6f4519212d4ffe6216"} Dec 02 09:04:22 crc kubenswrapper[4895]: I1202 09:04:22.532548 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-76bcdfd5df-n7r77" event={"ID":"1e401d14-d2d7-4b54-a61d-c40b8125462b","Type":"ContainerStarted","Data":"49bf1446cd01615990a4aeffdfdaf8178a2999c7aaf9150e89df5682af2088ac"} Dec 02 09:04:22 crc kubenswrapper[4895]: I1202 09:04:22.532586 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-76bcdfd5df-n7r77" event={"ID":"1e401d14-d2d7-4b54-a61d-c40b8125462b","Type":"ContainerStarted","Data":"adab4eb8c95fa8a014dcc333415cbe8ee7b85e0e6016ea8d665bea1999729697"} Dec 02 09:04:22 crc kubenswrapper[4895]: I1202 09:04:22.543401 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-68f4ccf997-tvwxr" podStartSLOduration=2.398451758 podStartE2EDuration="10.543382458s" podCreationTimestamp="2025-12-02 09:04:12 +0000 UTC" firstStartedPulling="2025-12-02 09:04:13.094984487 +0000 UTC m=+6064.265844110" lastFinishedPulling="2025-12-02 09:04:21.239915197 +0000 UTC m=+6072.410774810" observedRunningTime="2025-12-02 09:04:22.537236858 +0000 UTC m=+6073.708096501" 
watchObservedRunningTime="2025-12-02 09:04:22.543382458 +0000 UTC m=+6073.714242071" Dec 02 09:04:22 crc kubenswrapper[4895]: I1202 09:04:22.556001 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-68f4ccf997-tvwxr" Dec 02 09:04:22 crc kubenswrapper[4895]: I1202 09:04:22.565569 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-7b776dc549-zrcws" podStartSLOduration=2.800576208 podStartE2EDuration="10.565546238s" podCreationTimestamp="2025-12-02 09:04:12 +0000 UTC" firstStartedPulling="2025-12-02 09:04:13.474031129 +0000 UTC m=+6064.644890742" lastFinishedPulling="2025-12-02 09:04:21.239001159 +0000 UTC m=+6072.409860772" observedRunningTime="2025-12-02 09:04:22.56077642 +0000 UTC m=+6073.731636053" watchObservedRunningTime="2025-12-02 09:04:22.565546238 +0000 UTC m=+6073.736405841" Dec 02 09:04:22 crc kubenswrapper[4895]: I1202 09:04:22.585351 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-76bcdfd5df-n7r77" podStartSLOduration=3.308987386 podStartE2EDuration="10.585329523s" podCreationTimestamp="2025-12-02 09:04:12 +0000 UTC" firstStartedPulling="2025-12-02 09:04:13.914374829 +0000 UTC m=+6065.085234442" lastFinishedPulling="2025-12-02 09:04:21.190716966 +0000 UTC m=+6072.361576579" observedRunningTime="2025-12-02 09:04:22.583498817 +0000 UTC m=+6073.754358450" watchObservedRunningTime="2025-12-02 09:04:22.585329523 +0000 UTC m=+6073.756189136" Dec 02 09:04:22 crc kubenswrapper[4895]: I1202 09:04:22.698398 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-7b776dc549-zrcws" Dec 02 09:04:22 crc kubenswrapper[4895]: I1202 09:04:22.698435 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7b776dc549-zrcws" Dec 02 09:04:22 crc kubenswrapper[4895]: I1202 09:04:22.736530 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/glance-default-internal-api-0"] Dec 02 09:04:22 crc kubenswrapper[4895]: I1202 09:04:22.861782 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 02 09:04:23 crc kubenswrapper[4895]: I1202 09:04:23.321361 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d9c01c8-fbe2-4c99-9c4a-edc0560aebd6" path="/var/lib/kubelet/pods/2d9c01c8-fbe2-4c99-9c4a-edc0560aebd6/volumes" Dec 02 09:04:23 crc kubenswrapper[4895]: I1202 09:04:23.322597 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="773bc693-f07d-4938-8980-3099a7dbc5dd" path="/var/lib/kubelet/pods/773bc693-f07d-4938-8980-3099a7dbc5dd/volumes" Dec 02 09:04:23 crc kubenswrapper[4895]: I1202 09:04:23.367831 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-76bcdfd5df-n7r77" Dec 02 09:04:23 crc kubenswrapper[4895]: I1202 09:04:23.367871 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-76bcdfd5df-n7r77" Dec 02 09:04:23 crc kubenswrapper[4895]: I1202 09:04:23.562591 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f19bb559-c258-498f-9132-7ee9ea57db14","Type":"ContainerStarted","Data":"224f8b6dc7bddd7a289b0ea2a1faa3f29cb0a9a72d6d7191588da13b148cfc90"} Dec 02 09:04:23 crc kubenswrapper[4895]: I1202 09:04:23.564868 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2f09559d-29ed-400b-8069-4684c4d060cd","Type":"ContainerStarted","Data":"deb6b1f5c4d6c3b31243dd1f92a920804d522db7d55cfc2111654b01fc70e882"} Dec 02 09:04:24 crc kubenswrapper[4895]: I1202 09:04:24.646262 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"f19bb559-c258-498f-9132-7ee9ea57db14","Type":"ContainerStarted","Data":"28edac28904859d71d26042dc4633ecdb236095f4a402c30f269fb2371910a16"} Dec 02 09:04:24 crc kubenswrapper[4895]: I1202 09:04:24.693924 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f19bb559-c258-498f-9132-7ee9ea57db14","Type":"ContainerStarted","Data":"f3e8e63706b80c7b65a303d538e99f1b62ba1c3d44ebd85a28166a8f65dad86f"} Dec 02 09:04:24 crc kubenswrapper[4895]: I1202 09:04:24.700274 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2f09559d-29ed-400b-8069-4684c4d060cd","Type":"ContainerStarted","Data":"09dab330f7e5fdb393fd1c2889791063fd3defef7bae53f9a9da9e3127615e3e"} Dec 02 09:04:24 crc kubenswrapper[4895]: I1202 09:04:24.700592 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2f09559d-29ed-400b-8069-4684c4d060cd","Type":"ContainerStarted","Data":"6c6a0a8fbb600a34440e3a0a147900d2d5a77ba4e706dc2b6de81f30125731fe"} Dec 02 09:04:24 crc kubenswrapper[4895]: I1202 09:04:24.738002 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.737874289 podStartE2EDuration="3.737874289s" podCreationTimestamp="2025-12-02 09:04:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:04:24.719106956 +0000 UTC m=+6075.889966589" watchObservedRunningTime="2025-12-02 09:04:24.737874289 +0000 UTC m=+6075.908733902" Dec 02 09:04:24 crc kubenswrapper[4895]: I1202 09:04:24.760860 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.760818263 podStartE2EDuration="3.760818263s" podCreationTimestamp="2025-12-02 09:04:21 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:04:24.754091424 +0000 UTC m=+6075.924951057" watchObservedRunningTime="2025-12-02 09:04:24.760818263 +0000 UTC m=+6075.931677876" Dec 02 09:04:28 crc kubenswrapper[4895]: I1202 09:04:28.379821 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-bgmlh"] Dec 02 09:04:28 crc kubenswrapper[4895]: I1202 09:04:28.382720 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bgmlh" Dec 02 09:04:28 crc kubenswrapper[4895]: I1202 09:04:28.387761 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bgmlh"] Dec 02 09:04:28 crc kubenswrapper[4895]: I1202 09:04:28.480665 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d58f5900-18bf-409f-bba9-69b7d491582d-utilities\") pod \"redhat-marketplace-bgmlh\" (UID: \"d58f5900-18bf-409f-bba9-69b7d491582d\") " pod="openshift-marketplace/redhat-marketplace-bgmlh" Dec 02 09:04:28 crc kubenswrapper[4895]: I1202 09:04:28.480876 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d58f5900-18bf-409f-bba9-69b7d491582d-catalog-content\") pod \"redhat-marketplace-bgmlh\" (UID: \"d58f5900-18bf-409f-bba9-69b7d491582d\") " pod="openshift-marketplace/redhat-marketplace-bgmlh" Dec 02 09:04:28 crc kubenswrapper[4895]: I1202 09:04:28.480941 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkjpg\" (UniqueName: \"kubernetes.io/projected/d58f5900-18bf-409f-bba9-69b7d491582d-kube-api-access-wkjpg\") pod \"redhat-marketplace-bgmlh\" (UID: \"d58f5900-18bf-409f-bba9-69b7d491582d\") " 
pod="openshift-marketplace/redhat-marketplace-bgmlh" Dec 02 09:04:28 crc kubenswrapper[4895]: I1202 09:04:28.596686 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d58f5900-18bf-409f-bba9-69b7d491582d-catalog-content\") pod \"redhat-marketplace-bgmlh\" (UID: \"d58f5900-18bf-409f-bba9-69b7d491582d\") " pod="openshift-marketplace/redhat-marketplace-bgmlh" Dec 02 09:04:28 crc kubenswrapper[4895]: I1202 09:04:28.601839 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wkjpg\" (UniqueName: \"kubernetes.io/projected/d58f5900-18bf-409f-bba9-69b7d491582d-kube-api-access-wkjpg\") pod \"redhat-marketplace-bgmlh\" (UID: \"d58f5900-18bf-409f-bba9-69b7d491582d\") " pod="openshift-marketplace/redhat-marketplace-bgmlh" Dec 02 09:04:28 crc kubenswrapper[4895]: I1202 09:04:28.602170 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d58f5900-18bf-409f-bba9-69b7d491582d-utilities\") pod \"redhat-marketplace-bgmlh\" (UID: \"d58f5900-18bf-409f-bba9-69b7d491582d\") " pod="openshift-marketplace/redhat-marketplace-bgmlh" Dec 02 09:04:28 crc kubenswrapper[4895]: I1202 09:04:28.602235 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d58f5900-18bf-409f-bba9-69b7d491582d-catalog-content\") pod \"redhat-marketplace-bgmlh\" (UID: \"d58f5900-18bf-409f-bba9-69b7d491582d\") " pod="openshift-marketplace/redhat-marketplace-bgmlh" Dec 02 09:04:28 crc kubenswrapper[4895]: I1202 09:04:28.603780 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d58f5900-18bf-409f-bba9-69b7d491582d-utilities\") pod \"redhat-marketplace-bgmlh\" (UID: \"d58f5900-18bf-409f-bba9-69b7d491582d\") " pod="openshift-marketplace/redhat-marketplace-bgmlh" 
Dec 02 09:04:28 crc kubenswrapper[4895]: I1202 09:04:28.624423 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkjpg\" (UniqueName: \"kubernetes.io/projected/d58f5900-18bf-409f-bba9-69b7d491582d-kube-api-access-wkjpg\") pod \"redhat-marketplace-bgmlh\" (UID: \"d58f5900-18bf-409f-bba9-69b7d491582d\") " pod="openshift-marketplace/redhat-marketplace-bgmlh" Dec 02 09:04:28 crc kubenswrapper[4895]: I1202 09:04:28.719049 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bgmlh" Dec 02 09:04:29 crc kubenswrapper[4895]: I1202 09:04:29.398241 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bgmlh"] Dec 02 09:04:29 crc kubenswrapper[4895]: I1202 09:04:29.770699 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bgmlh" event={"ID":"d58f5900-18bf-409f-bba9-69b7d491582d","Type":"ContainerStarted","Data":"a8e756eb8c0edfd9d42c9efb2f2213d6106c8c2db7ef61c4d7f911523cc94a1d"} Dec 02 09:04:30 crc kubenswrapper[4895]: I1202 09:04:30.783295 4895 generic.go:334] "Generic (PLEG): container finished" podID="d58f5900-18bf-409f-bba9-69b7d491582d" containerID="ffb8c3f327c58c3b79b5abd28ff69b1d937522a00f8314b2424934de17ccbb01" exitCode=0 Dec 02 09:04:30 crc kubenswrapper[4895]: I1202 09:04:30.783413 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bgmlh" event={"ID":"d58f5900-18bf-409f-bba9-69b7d491582d","Type":"ContainerDied","Data":"ffb8c3f327c58c3b79b5abd28ff69b1d937522a00f8314b2424934de17ccbb01"} Dec 02 09:04:30 crc kubenswrapper[4895]: I1202 09:04:30.813407 4895 scope.go:117] "RemoveContainer" containerID="f95b461efe7c0e15bbf698e725cf1b5d926c2c7e79046120926734bf8421fe6b" Dec 02 09:04:30 crc kubenswrapper[4895]: I1202 09:04:30.843600 4895 scope.go:117] "RemoveContainer" 
containerID="811d80e26c656246a54363e16a23b46646e3fff64dd86dcd033c54b8f9248b29" Dec 02 09:04:30 crc kubenswrapper[4895]: I1202 09:04:30.894927 4895 scope.go:117] "RemoveContainer" containerID="3ee877cee51e74863abd814c7c20c27274c71190aa6fc06006ba34827c0c6ec2" Dec 02 09:04:32 crc kubenswrapper[4895]: I1202 09:04:32.045285 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 02 09:04:32 crc kubenswrapper[4895]: I1202 09:04:32.045884 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 02 09:04:32 crc kubenswrapper[4895]: I1202 09:04:32.089069 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 02 09:04:32 crc kubenswrapper[4895]: I1202 09:04:32.106520 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 02 09:04:32 crc kubenswrapper[4895]: I1202 09:04:32.106600 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 02 09:04:32 crc kubenswrapper[4895]: I1202 09:04:32.113452 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 02 09:04:32 crc kubenswrapper[4895]: I1202 09:04:32.169736 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 02 09:04:32 crc kubenswrapper[4895]: I1202 09:04:32.177213 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 02 09:04:32 crc kubenswrapper[4895]: E1202 09:04:32.584399 4895 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd58f5900_18bf_409f_bba9_69b7d491582d.slice/crio-conmon-ba7ea777dfb9e5d2c4e6ef80b6831ed7d3249cee716e2e5bcb66769bd5a226c1.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd58f5900_18bf_409f_bba9_69b7d491582d.slice/crio-ba7ea777dfb9e5d2c4e6ef80b6831ed7d3249cee716e2e5bcb66769bd5a226c1.scope\": RecentStats: unable to find data in memory cache]" Dec 02 09:04:32 crc kubenswrapper[4895]: I1202 09:04:32.700184 4895 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-7b776dc549-zrcws" podUID="099f2897-7daf-4053-83a3-caacbf2ea78a" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.110:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.110:8080: connect: connection refused" Dec 02 09:04:32 crc kubenswrapper[4895]: I1202 09:04:32.833968 4895 generic.go:334] "Generic (PLEG): container finished" podID="d58f5900-18bf-409f-bba9-69b7d491582d" containerID="ba7ea777dfb9e5d2c4e6ef80b6831ed7d3249cee716e2e5bcb66769bd5a226c1" exitCode=0 Dec 02 09:04:32 crc kubenswrapper[4895]: I1202 09:04:32.834343 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bgmlh" event={"ID":"d58f5900-18bf-409f-bba9-69b7d491582d","Type":"ContainerDied","Data":"ba7ea777dfb9e5d2c4e6ef80b6831ed7d3249cee716e2e5bcb66769bd5a226c1"} Dec 02 09:04:32 crc kubenswrapper[4895]: I1202 09:04:32.834935 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 02 09:04:32 crc kubenswrapper[4895]: I1202 09:04:32.835316 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 02 09:04:32 crc kubenswrapper[4895]: I1202 09:04:32.835427 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 02 09:04:32 
crc kubenswrapper[4895]: I1202 09:04:32.835492 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 02 09:04:33 crc kubenswrapper[4895]: I1202 09:04:33.372072 4895 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-76bcdfd5df-n7r77" podUID="1e401d14-d2d7-4b54-a61d-c40b8125462b" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.111:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.111:8080: connect: connection refused" Dec 02 09:04:34 crc kubenswrapper[4895]: I1202 09:04:34.046237 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-qj5cd"] Dec 02 09:04:34 crc kubenswrapper[4895]: I1202 09:04:34.057138 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-qj5cd"] Dec 02 09:04:34 crc kubenswrapper[4895]: I1202 09:04:34.859106 4895 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 02 09:04:34 crc kubenswrapper[4895]: I1202 09:04:34.859504 4895 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 02 09:04:35 crc kubenswrapper[4895]: I1202 09:04:35.041705 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-323e-account-create-update-w4q2t"] Dec 02 09:04:35 crc kubenswrapper[4895]: I1202 09:04:35.055433 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-323e-account-create-update-w4q2t"] Dec 02 09:04:35 crc kubenswrapper[4895]: I1202 09:04:35.184280 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="503a1114-4530-4147-950a-efa451c46545" path="/var/lib/kubelet/pods/503a1114-4530-4147-950a-efa451c46545/volumes" Dec 02 09:04:35 crc kubenswrapper[4895]: I1202 09:04:35.185685 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1c10dc0-d131-46da-b015-2f1dc1843723" 
path="/var/lib/kubelet/pods/f1c10dc0-d131-46da-b015-2f1dc1843723/volumes" Dec 02 09:04:35 crc kubenswrapper[4895]: I1202 09:04:35.911914 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 02 09:04:35 crc kubenswrapper[4895]: I1202 09:04:35.912655 4895 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 02 09:04:36 crc kubenswrapper[4895]: I1202 09:04:36.100325 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 02 09:04:36 crc kubenswrapper[4895]: I1202 09:04:36.100552 4895 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 02 09:04:36 crc kubenswrapper[4895]: I1202 09:04:36.256446 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 02 09:04:36 crc kubenswrapper[4895]: I1202 09:04:36.344610 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 02 09:04:36 crc kubenswrapper[4895]: I1202 09:04:36.905950 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bgmlh" event={"ID":"d58f5900-18bf-409f-bba9-69b7d491582d","Type":"ContainerStarted","Data":"9e23f492228ea273be624d68bb7ce903d74bf9d123b5b0cd797c3779ac128820"} Dec 02 09:04:36 crc kubenswrapper[4895]: I1202 09:04:36.932541 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-bgmlh" podStartSLOduration=3.931587837 podStartE2EDuration="8.932511857s" podCreationTimestamp="2025-12-02 09:04:28 +0000 UTC" firstStartedPulling="2025-12-02 09:04:30.786344979 +0000 UTC m=+6081.957204582" lastFinishedPulling="2025-12-02 09:04:35.787268989 +0000 UTC m=+6086.958128602" observedRunningTime="2025-12-02 09:04:36.926912733 +0000 UTC m=+6088.097772346" watchObservedRunningTime="2025-12-02 
09:04:36.932511857 +0000 UTC m=+6088.103371470" Dec 02 09:04:38 crc kubenswrapper[4895]: I1202 09:04:38.720026 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-bgmlh" Dec 02 09:04:38 crc kubenswrapper[4895]: I1202 09:04:38.720702 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-bgmlh" Dec 02 09:04:38 crc kubenswrapper[4895]: I1202 09:04:38.776152 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-bgmlh" Dec 02 09:04:42 crc kubenswrapper[4895]: I1202 09:04:42.698938 4895 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-7b776dc549-zrcws" podUID="099f2897-7daf-4053-83a3-caacbf2ea78a" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.110:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.110:8080: connect: connection refused" Dec 02 09:04:43 crc kubenswrapper[4895]: I1202 09:04:43.066088 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-z2sln"] Dec 02 09:04:43 crc kubenswrapper[4895]: I1202 09:04:43.081168 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-z2sln"] Dec 02 09:04:43 crc kubenswrapper[4895]: I1202 09:04:43.184583 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a00cc999-01c0-4d79-870c-4e84aff41706" path="/var/lib/kubelet/pods/a00cc999-01c0-4d79-870c-4e84aff41706/volumes" Dec 02 09:04:43 crc kubenswrapper[4895]: I1202 09:04:43.368398 4895 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-76bcdfd5df-n7r77" podUID="1e401d14-d2d7-4b54-a61d-c40b8125462b" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.111:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.111:8080: connect: connection refused" Dec 02 09:04:48 crc kubenswrapper[4895]: 
I1202 09:04:48.772640 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-bgmlh" Dec 02 09:04:49 crc kubenswrapper[4895]: I1202 09:04:49.727292 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-686qz"] Dec 02 09:04:49 crc kubenswrapper[4895]: I1202 09:04:49.730054 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-686qz" Dec 02 09:04:49 crc kubenswrapper[4895]: I1202 09:04:49.749960 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-686qz"] Dec 02 09:04:49 crc kubenswrapper[4895]: I1202 09:04:49.775897 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gw9gs\" (UniqueName: \"kubernetes.io/projected/b1ab8a10-4fcb-425d-b1b6-6770df4e13ec-kube-api-access-gw9gs\") pod \"certified-operators-686qz\" (UID: \"b1ab8a10-4fcb-425d-b1b6-6770df4e13ec\") " pod="openshift-marketplace/certified-operators-686qz" Dec 02 09:04:49 crc kubenswrapper[4895]: I1202 09:04:49.775992 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1ab8a10-4fcb-425d-b1b6-6770df4e13ec-catalog-content\") pod \"certified-operators-686qz\" (UID: \"b1ab8a10-4fcb-425d-b1b6-6770df4e13ec\") " pod="openshift-marketplace/certified-operators-686qz" Dec 02 09:04:49 crc kubenswrapper[4895]: I1202 09:04:49.776768 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1ab8a10-4fcb-425d-b1b6-6770df4e13ec-utilities\") pod \"certified-operators-686qz\" (UID: \"b1ab8a10-4fcb-425d-b1b6-6770df4e13ec\") " pod="openshift-marketplace/certified-operators-686qz" Dec 02 09:04:49 crc kubenswrapper[4895]: I1202 09:04:49.878622 
4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gw9gs\" (UniqueName: \"kubernetes.io/projected/b1ab8a10-4fcb-425d-b1b6-6770df4e13ec-kube-api-access-gw9gs\") pod \"certified-operators-686qz\" (UID: \"b1ab8a10-4fcb-425d-b1b6-6770df4e13ec\") " pod="openshift-marketplace/certified-operators-686qz" Dec 02 09:04:49 crc kubenswrapper[4895]: I1202 09:04:49.878694 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1ab8a10-4fcb-425d-b1b6-6770df4e13ec-catalog-content\") pod \"certified-operators-686qz\" (UID: \"b1ab8a10-4fcb-425d-b1b6-6770df4e13ec\") " pod="openshift-marketplace/certified-operators-686qz" Dec 02 09:04:49 crc kubenswrapper[4895]: I1202 09:04:49.878801 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1ab8a10-4fcb-425d-b1b6-6770df4e13ec-utilities\") pod \"certified-operators-686qz\" (UID: \"b1ab8a10-4fcb-425d-b1b6-6770df4e13ec\") " pod="openshift-marketplace/certified-operators-686qz" Dec 02 09:04:49 crc kubenswrapper[4895]: I1202 09:04:49.879338 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1ab8a10-4fcb-425d-b1b6-6770df4e13ec-catalog-content\") pod \"certified-operators-686qz\" (UID: \"b1ab8a10-4fcb-425d-b1b6-6770df4e13ec\") " pod="openshift-marketplace/certified-operators-686qz" Dec 02 09:04:49 crc kubenswrapper[4895]: I1202 09:04:49.879393 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1ab8a10-4fcb-425d-b1b6-6770df4e13ec-utilities\") pod \"certified-operators-686qz\" (UID: \"b1ab8a10-4fcb-425d-b1b6-6770df4e13ec\") " pod="openshift-marketplace/certified-operators-686qz" Dec 02 09:04:49 crc kubenswrapper[4895]: I1202 09:04:49.899147 4895 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-gw9gs\" (UniqueName: \"kubernetes.io/projected/b1ab8a10-4fcb-425d-b1b6-6770df4e13ec-kube-api-access-gw9gs\") pod \"certified-operators-686qz\" (UID: \"b1ab8a10-4fcb-425d-b1b6-6770df4e13ec\") " pod="openshift-marketplace/certified-operators-686qz" Dec 02 09:04:50 crc kubenswrapper[4895]: I1202 09:04:50.060719 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-686qz" Dec 02 09:04:50 crc kubenswrapper[4895]: I1202 09:04:50.580226 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-686qz"] Dec 02 09:04:50 crc kubenswrapper[4895]: W1202 09:04:50.597586 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb1ab8a10_4fcb_425d_b1b6_6770df4e13ec.slice/crio-5363c23b03379d2410aa027f88e3eccf24cb4f4e979d2096edcb6b23c651e750 WatchSource:0}: Error finding container 5363c23b03379d2410aa027f88e3eccf24cb4f4e979d2096edcb6b23c651e750: Status 404 returned error can't find the container with id 5363c23b03379d2410aa027f88e3eccf24cb4f4e979d2096edcb6b23c651e750 Dec 02 09:04:51 crc kubenswrapper[4895]: I1202 09:04:51.066288 4895 generic.go:334] "Generic (PLEG): container finished" podID="b1ab8a10-4fcb-425d-b1b6-6770df4e13ec" containerID="1882208e4ec582e7230252730b95fc481ee16a4a1aa9f0a0c7efdb3f34ceb0b1" exitCode=0 Dec 02 09:04:51 crc kubenswrapper[4895]: I1202 09:04:51.066333 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-686qz" event={"ID":"b1ab8a10-4fcb-425d-b1b6-6770df4e13ec","Type":"ContainerDied","Data":"1882208e4ec582e7230252730b95fc481ee16a4a1aa9f0a0c7efdb3f34ceb0b1"} Dec 02 09:04:51 crc kubenswrapper[4895]: I1202 09:04:51.066364 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-686qz" 
event={"ID":"b1ab8a10-4fcb-425d-b1b6-6770df4e13ec","Type":"ContainerStarted","Data":"5363c23b03379d2410aa027f88e3eccf24cb4f4e979d2096edcb6b23c651e750"} Dec 02 09:04:52 crc kubenswrapper[4895]: I1202 09:04:52.088109 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-686qz" event={"ID":"b1ab8a10-4fcb-425d-b1b6-6770df4e13ec","Type":"ContainerStarted","Data":"dbf1db72d64270f79aa30af01740890f1b1adbc6a071eaf306de754269956a2d"} Dec 02 09:04:53 crc kubenswrapper[4895]: I1202 09:04:53.104140 4895 generic.go:334] "Generic (PLEG): container finished" podID="b1ab8a10-4fcb-425d-b1b6-6770df4e13ec" containerID="dbf1db72d64270f79aa30af01740890f1b1adbc6a071eaf306de754269956a2d" exitCode=0 Dec 02 09:04:53 crc kubenswrapper[4895]: I1202 09:04:53.104654 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-686qz" event={"ID":"b1ab8a10-4fcb-425d-b1b6-6770df4e13ec","Type":"ContainerDied","Data":"dbf1db72d64270f79aa30af01740890f1b1adbc6a071eaf306de754269956a2d"} Dec 02 09:04:53 crc kubenswrapper[4895]: I1202 09:04:53.109391 4895 generic.go:334] "Generic (PLEG): container finished" podID="7fe5d716-e549-4eb1-9bff-83a0afed16c2" containerID="fa8729f512b9c8272b69f5522c401d70e95abfc6e33cd40ed9723c82f3339531" exitCode=137 Dec 02 09:04:53 crc kubenswrapper[4895]: I1202 09:04:53.109425 4895 generic.go:334] "Generic (PLEG): container finished" podID="7fe5d716-e549-4eb1-9bff-83a0afed16c2" containerID="e93438b53e54936015836510cdc6cb9b32305274e5cc526bd00283525667c6c5" exitCode=137 Dec 02 09:04:53 crc kubenswrapper[4895]: I1202 09:04:53.109444 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-68f4ccf997-tvwxr" event={"ID":"7fe5d716-e549-4eb1-9bff-83a0afed16c2","Type":"ContainerDied","Data":"fa8729f512b9c8272b69f5522c401d70e95abfc6e33cd40ed9723c82f3339531"} Dec 02 09:04:53 crc kubenswrapper[4895]: I1202 09:04:53.109467 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/horizon-68f4ccf997-tvwxr" event={"ID":"7fe5d716-e549-4eb1-9bff-83a0afed16c2","Type":"ContainerDied","Data":"e93438b53e54936015836510cdc6cb9b32305274e5cc526bd00283525667c6c5"} Dec 02 09:04:53 crc kubenswrapper[4895]: I1202 09:04:53.948629 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-68f4ccf997-tvwxr" Dec 02 09:04:53 crc kubenswrapper[4895]: I1202 09:04:53.963430 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z48xd\" (UniqueName: \"kubernetes.io/projected/7fe5d716-e549-4eb1-9bff-83a0afed16c2-kube-api-access-z48xd\") pod \"7fe5d716-e549-4eb1-9bff-83a0afed16c2\" (UID: \"7fe5d716-e549-4eb1-9bff-83a0afed16c2\") " Dec 02 09:04:53 crc kubenswrapper[4895]: I1202 09:04:53.963492 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7fe5d716-e549-4eb1-9bff-83a0afed16c2-logs\") pod \"7fe5d716-e549-4eb1-9bff-83a0afed16c2\" (UID: \"7fe5d716-e549-4eb1-9bff-83a0afed16c2\") " Dec 02 09:04:53 crc kubenswrapper[4895]: I1202 09:04:53.963532 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7fe5d716-e549-4eb1-9bff-83a0afed16c2-config-data\") pod \"7fe5d716-e549-4eb1-9bff-83a0afed16c2\" (UID: \"7fe5d716-e549-4eb1-9bff-83a0afed16c2\") " Dec 02 09:04:53 crc kubenswrapper[4895]: I1202 09:04:53.963739 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7fe5d716-e549-4eb1-9bff-83a0afed16c2-scripts\") pod \"7fe5d716-e549-4eb1-9bff-83a0afed16c2\" (UID: \"7fe5d716-e549-4eb1-9bff-83a0afed16c2\") " Dec 02 09:04:53 crc kubenswrapper[4895]: I1202 09:04:53.963852 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/7fe5d716-e549-4eb1-9bff-83a0afed16c2-horizon-secret-key\") pod \"7fe5d716-e549-4eb1-9bff-83a0afed16c2\" (UID: \"7fe5d716-e549-4eb1-9bff-83a0afed16c2\") " Dec 02 09:04:53 crc kubenswrapper[4895]: I1202 09:04:53.966246 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7fe5d716-e549-4eb1-9bff-83a0afed16c2-logs" (OuterVolumeSpecName: "logs") pod "7fe5d716-e549-4eb1-9bff-83a0afed16c2" (UID: "7fe5d716-e549-4eb1-9bff-83a0afed16c2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:04:53 crc kubenswrapper[4895]: I1202 09:04:53.980881 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fe5d716-e549-4eb1-9bff-83a0afed16c2-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "7fe5d716-e549-4eb1-9bff-83a0afed16c2" (UID: "7fe5d716-e549-4eb1-9bff-83a0afed16c2"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:04:53 crc kubenswrapper[4895]: I1202 09:04:53.981421 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7fe5d716-e549-4eb1-9bff-83a0afed16c2-kube-api-access-z48xd" (OuterVolumeSpecName: "kube-api-access-z48xd") pod "7fe5d716-e549-4eb1-9bff-83a0afed16c2" (UID: "7fe5d716-e549-4eb1-9bff-83a0afed16c2"). InnerVolumeSpecName "kube-api-access-z48xd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:04:54 crc kubenswrapper[4895]: I1202 09:04:54.042216 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7fe5d716-e549-4eb1-9bff-83a0afed16c2-scripts" (OuterVolumeSpecName: "scripts") pod "7fe5d716-e549-4eb1-9bff-83a0afed16c2" (UID: "7fe5d716-e549-4eb1-9bff-83a0afed16c2"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:04:54 crc kubenswrapper[4895]: I1202 09:04:54.065405 4895 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7fe5d716-e549-4eb1-9bff-83a0afed16c2-logs\") on node \"crc\" DevicePath \"\"" Dec 02 09:04:54 crc kubenswrapper[4895]: I1202 09:04:54.065445 4895 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7fe5d716-e549-4eb1-9bff-83a0afed16c2-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 09:04:54 crc kubenswrapper[4895]: I1202 09:04:54.065457 4895 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/7fe5d716-e549-4eb1-9bff-83a0afed16c2-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 02 09:04:54 crc kubenswrapper[4895]: I1202 09:04:54.065471 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z48xd\" (UniqueName: \"kubernetes.io/projected/7fe5d716-e549-4eb1-9bff-83a0afed16c2-kube-api-access-z48xd\") on node \"crc\" DevicePath \"\"" Dec 02 09:04:54 crc kubenswrapper[4895]: I1202 09:04:54.073610 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7fe5d716-e549-4eb1-9bff-83a0afed16c2-config-data" (OuterVolumeSpecName: "config-data") pod "7fe5d716-e549-4eb1-9bff-83a0afed16c2" (UID: "7fe5d716-e549-4eb1-9bff-83a0afed16c2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:04:54 crc kubenswrapper[4895]: I1202 09:04:54.138864 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-686qz" event={"ID":"b1ab8a10-4fcb-425d-b1b6-6770df4e13ec","Type":"ContainerStarted","Data":"9bdfdf62a356bdb12b6a72d3f68c102cc890b876460bb281c0d629c329ba4e7c"} Dec 02 09:04:54 crc kubenswrapper[4895]: I1202 09:04:54.141281 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-68f4ccf997-tvwxr" event={"ID":"7fe5d716-e549-4eb1-9bff-83a0afed16c2","Type":"ContainerDied","Data":"23240f355d8ff9165b633599479172475c8f23ce7f8ccba0a94d8246f19a8ae6"} Dec 02 09:04:54 crc kubenswrapper[4895]: I1202 09:04:54.141362 4895 scope.go:117] "RemoveContainer" containerID="fa8729f512b9c8272b69f5522c401d70e95abfc6e33cd40ed9723c82f3339531" Dec 02 09:04:54 crc kubenswrapper[4895]: I1202 09:04:54.141614 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-68f4ccf997-tvwxr" Dec 02 09:04:54 crc kubenswrapper[4895]: I1202 09:04:54.167000 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7fe5d716-e549-4eb1-9bff-83a0afed16c2-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 09:04:54 crc kubenswrapper[4895]: I1202 09:04:54.178462 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-686qz" podStartSLOduration=2.541928528 podStartE2EDuration="5.178356569s" podCreationTimestamp="2025-12-02 09:04:49 +0000 UTC" firstStartedPulling="2025-12-02 09:04:51.068268793 +0000 UTC m=+6102.239128406" lastFinishedPulling="2025-12-02 09:04:53.704696834 +0000 UTC m=+6104.875556447" observedRunningTime="2025-12-02 09:04:54.160478083 +0000 UTC m=+6105.331337706" watchObservedRunningTime="2025-12-02 09:04:54.178356569 +0000 UTC m=+6105.349216182" Dec 02 09:04:54 crc kubenswrapper[4895]: I1202 
09:04:54.202350 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-68f4ccf997-tvwxr"] Dec 02 09:04:54 crc kubenswrapper[4895]: I1202 09:04:54.214776 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-68f4ccf997-tvwxr"] Dec 02 09:04:54 crc kubenswrapper[4895]: I1202 09:04:54.366199 4895 scope.go:117] "RemoveContainer" containerID="e93438b53e54936015836510cdc6cb9b32305274e5cc526bd00283525667c6c5" Dec 02 09:04:54 crc kubenswrapper[4895]: I1202 09:04:54.926067 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bgmlh"] Dec 02 09:04:54 crc kubenswrapper[4895]: I1202 09:04:54.926378 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-bgmlh" podUID="d58f5900-18bf-409f-bba9-69b7d491582d" containerName="registry-server" containerID="cri-o://9e23f492228ea273be624d68bb7ce903d74bf9d123b5b0cd797c3779ac128820" gracePeriod=2 Dec 02 09:04:55 crc kubenswrapper[4895]: I1202 09:04:55.162410 4895 generic.go:334] "Generic (PLEG): container finished" podID="d58f5900-18bf-409f-bba9-69b7d491582d" containerID="9e23f492228ea273be624d68bb7ce903d74bf9d123b5b0cd797c3779ac128820" exitCode=0 Dec 02 09:04:55 crc kubenswrapper[4895]: I1202 09:04:55.162563 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7fe5d716-e549-4eb1-9bff-83a0afed16c2" path="/var/lib/kubelet/pods/7fe5d716-e549-4eb1-9bff-83a0afed16c2/volumes" Dec 02 09:04:55 crc kubenswrapper[4895]: I1202 09:04:55.164077 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bgmlh" event={"ID":"d58f5900-18bf-409f-bba9-69b7d491582d","Type":"ContainerDied","Data":"9e23f492228ea273be624d68bb7ce903d74bf9d123b5b0cd797c3779ac128820"} Dec 02 09:04:55 crc kubenswrapper[4895]: I1202 09:04:55.412937 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bgmlh" Dec 02 09:04:55 crc kubenswrapper[4895]: I1202 09:04:55.580500 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d58f5900-18bf-409f-bba9-69b7d491582d-utilities\") pod \"d58f5900-18bf-409f-bba9-69b7d491582d\" (UID: \"d58f5900-18bf-409f-bba9-69b7d491582d\") " Dec 02 09:04:55 crc kubenswrapper[4895]: I1202 09:04:55.580574 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wkjpg\" (UniqueName: \"kubernetes.io/projected/d58f5900-18bf-409f-bba9-69b7d491582d-kube-api-access-wkjpg\") pod \"d58f5900-18bf-409f-bba9-69b7d491582d\" (UID: \"d58f5900-18bf-409f-bba9-69b7d491582d\") " Dec 02 09:04:55 crc kubenswrapper[4895]: I1202 09:04:55.580654 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d58f5900-18bf-409f-bba9-69b7d491582d-catalog-content\") pod \"d58f5900-18bf-409f-bba9-69b7d491582d\" (UID: \"d58f5900-18bf-409f-bba9-69b7d491582d\") " Dec 02 09:04:55 crc kubenswrapper[4895]: I1202 09:04:55.581270 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d58f5900-18bf-409f-bba9-69b7d491582d-utilities" (OuterVolumeSpecName: "utilities") pod "d58f5900-18bf-409f-bba9-69b7d491582d" (UID: "d58f5900-18bf-409f-bba9-69b7d491582d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:04:55 crc kubenswrapper[4895]: I1202 09:04:55.588047 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d58f5900-18bf-409f-bba9-69b7d491582d-kube-api-access-wkjpg" (OuterVolumeSpecName: "kube-api-access-wkjpg") pod "d58f5900-18bf-409f-bba9-69b7d491582d" (UID: "d58f5900-18bf-409f-bba9-69b7d491582d"). InnerVolumeSpecName "kube-api-access-wkjpg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:04:55 crc kubenswrapper[4895]: I1202 09:04:55.598757 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d58f5900-18bf-409f-bba9-69b7d491582d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d58f5900-18bf-409f-bba9-69b7d491582d" (UID: "d58f5900-18bf-409f-bba9-69b7d491582d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:04:55 crc kubenswrapper[4895]: I1202 09:04:55.683379 4895 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d58f5900-18bf-409f-bba9-69b7d491582d-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 09:04:55 crc kubenswrapper[4895]: I1202 09:04:55.683434 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wkjpg\" (UniqueName: \"kubernetes.io/projected/d58f5900-18bf-409f-bba9-69b7d491582d-kube-api-access-wkjpg\") on node \"crc\" DevicePath \"\"" Dec 02 09:04:55 crc kubenswrapper[4895]: I1202 09:04:55.683449 4895 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d58f5900-18bf-409f-bba9-69b7d491582d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 09:04:55 crc kubenswrapper[4895]: I1202 09:04:55.878480 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-7b776dc549-zrcws" Dec 02 09:04:56 crc kubenswrapper[4895]: I1202 09:04:56.180622 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bgmlh" event={"ID":"d58f5900-18bf-409f-bba9-69b7d491582d","Type":"ContainerDied","Data":"a8e756eb8c0edfd9d42c9efb2f2213d6106c8c2db7ef61c4d7f911523cc94a1d"} Dec 02 09:04:56 crc kubenswrapper[4895]: I1202 09:04:56.181658 4895 scope.go:117] "RemoveContainer" containerID="9e23f492228ea273be624d68bb7ce903d74bf9d123b5b0cd797c3779ac128820" Dec 02 
09:04:56 crc kubenswrapper[4895]: I1202 09:04:56.180708 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bgmlh" Dec 02 09:04:56 crc kubenswrapper[4895]: I1202 09:04:56.217908 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-76bcdfd5df-n7r77" Dec 02 09:04:56 crc kubenswrapper[4895]: I1202 09:04:56.230981 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bgmlh"] Dec 02 09:04:56 crc kubenswrapper[4895]: I1202 09:04:56.241985 4895 scope.go:117] "RemoveContainer" containerID="ba7ea777dfb9e5d2c4e6ef80b6831ed7d3249cee716e2e5bcb66769bd5a226c1" Dec 02 09:04:56 crc kubenswrapper[4895]: I1202 09:04:56.243830 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-bgmlh"] Dec 02 09:04:56 crc kubenswrapper[4895]: I1202 09:04:56.284794 4895 scope.go:117] "RemoveContainer" containerID="ffb8c3f327c58c3b79b5abd28ff69b1d937522a00f8314b2424934de17ccbb01" Dec 02 09:04:57 crc kubenswrapper[4895]: I1202 09:04:57.157181 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d58f5900-18bf-409f-bba9-69b7d491582d" path="/var/lib/kubelet/pods/d58f5900-18bf-409f-bba9-69b7d491582d/volumes" Dec 02 09:04:57 crc kubenswrapper[4895]: I1202 09:04:57.888397 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-7b776dc549-zrcws" Dec 02 09:04:58 crc kubenswrapper[4895]: I1202 09:04:58.465203 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-76bcdfd5df-n7r77" Dec 02 09:04:58 crc kubenswrapper[4895]: I1202 09:04:58.545101 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7b776dc549-zrcws"] Dec 02 09:04:58 crc kubenswrapper[4895]: I1202 09:04:58.545379 4895 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/horizon-7b776dc549-zrcws" podUID="099f2897-7daf-4053-83a3-caacbf2ea78a" containerName="horizon-log" containerID="cri-o://5fa13d0b187c0a91d12a149222998075c3a44e64c312ee6f4519212d4ffe6216" gracePeriod=30 Dec 02 09:04:58 crc kubenswrapper[4895]: I1202 09:04:58.548948 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7b776dc549-zrcws" podUID="099f2897-7daf-4053-83a3-caacbf2ea78a" containerName="horizon" containerID="cri-o://2b684b253b769194b8612cbf5954882e0ee204e9fd5fd536a695dc7c9daece5f" gracePeriod=30 Dec 02 09:05:00 crc kubenswrapper[4895]: I1202 09:05:00.061848 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-686qz" Dec 02 09:05:00 crc kubenswrapper[4895]: I1202 09:05:00.062808 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-686qz" Dec 02 09:05:00 crc kubenswrapper[4895]: I1202 09:05:00.112411 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-686qz" Dec 02 09:05:00 crc kubenswrapper[4895]: I1202 09:05:00.512299 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-686qz" Dec 02 09:05:02 crc kubenswrapper[4895]: I1202 09:05:02.453899 4895 generic.go:334] "Generic (PLEG): container finished" podID="099f2897-7daf-4053-83a3-caacbf2ea78a" containerID="2b684b253b769194b8612cbf5954882e0ee204e9fd5fd536a695dc7c9daece5f" exitCode=0 Dec 02 09:05:02 crc kubenswrapper[4895]: I1202 09:05:02.453933 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7b776dc549-zrcws" event={"ID":"099f2897-7daf-4053-83a3-caacbf2ea78a","Type":"ContainerDied","Data":"2b684b253b769194b8612cbf5954882e0ee204e9fd5fd536a695dc7c9daece5f"} Dec 02 09:05:02 crc kubenswrapper[4895]: I1202 09:05:02.699052 4895 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openstack/horizon-7b776dc549-zrcws" podUID="099f2897-7daf-4053-83a3-caacbf2ea78a" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.110:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.110:8080: connect: connection refused" Dec 02 09:05:03 crc kubenswrapper[4895]: I1202 09:05:03.325011 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-686qz"] Dec 02 09:05:03 crc kubenswrapper[4895]: I1202 09:05:03.325584 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-686qz" podUID="b1ab8a10-4fcb-425d-b1b6-6770df4e13ec" containerName="registry-server" containerID="cri-o://9bdfdf62a356bdb12b6a72d3f68c102cc890b876460bb281c0d629c329ba4e7c" gracePeriod=2 Dec 02 09:05:03 crc kubenswrapper[4895]: I1202 09:05:03.464772 4895 generic.go:334] "Generic (PLEG): container finished" podID="b1ab8a10-4fcb-425d-b1b6-6770df4e13ec" containerID="9bdfdf62a356bdb12b6a72d3f68c102cc890b876460bb281c0d629c329ba4e7c" exitCode=0 Dec 02 09:05:03 crc kubenswrapper[4895]: I1202 09:05:03.464830 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-686qz" event={"ID":"b1ab8a10-4fcb-425d-b1b6-6770df4e13ec","Type":"ContainerDied","Data":"9bdfdf62a356bdb12b6a72d3f68c102cc890b876460bb281c0d629c329ba4e7c"} Dec 02 09:05:03 crc kubenswrapper[4895]: E1202 09:05:03.565480 4895 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb1ab8a10_4fcb_425d_b1b6_6770df4e13ec.slice/crio-conmon-9bdfdf62a356bdb12b6a72d3f68c102cc890b876460bb281c0d629c329ba4e7c.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb1ab8a10_4fcb_425d_b1b6_6770df4e13ec.slice/crio-9bdfdf62a356bdb12b6a72d3f68c102cc890b876460bb281c0d629c329ba4e7c.scope\": RecentStats: unable to find data in memory cache]" Dec 02 09:05:03 crc kubenswrapper[4895]: I1202 09:05:03.801921 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-686qz" Dec 02 09:05:03 crc kubenswrapper[4895]: I1202 09:05:03.928845 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1ab8a10-4fcb-425d-b1b6-6770df4e13ec-catalog-content\") pod \"b1ab8a10-4fcb-425d-b1b6-6770df4e13ec\" (UID: \"b1ab8a10-4fcb-425d-b1b6-6770df4e13ec\") " Dec 02 09:05:03 crc kubenswrapper[4895]: I1202 09:05:03.928954 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1ab8a10-4fcb-425d-b1b6-6770df4e13ec-utilities\") pod \"b1ab8a10-4fcb-425d-b1b6-6770df4e13ec\" (UID: \"b1ab8a10-4fcb-425d-b1b6-6770df4e13ec\") " Dec 02 09:05:03 crc kubenswrapper[4895]: I1202 09:05:03.928996 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gw9gs\" (UniqueName: \"kubernetes.io/projected/b1ab8a10-4fcb-425d-b1b6-6770df4e13ec-kube-api-access-gw9gs\") pod \"b1ab8a10-4fcb-425d-b1b6-6770df4e13ec\" (UID: \"b1ab8a10-4fcb-425d-b1b6-6770df4e13ec\") " Dec 02 09:05:03 crc kubenswrapper[4895]: I1202 09:05:03.930132 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1ab8a10-4fcb-425d-b1b6-6770df4e13ec-utilities" (OuterVolumeSpecName: "utilities") pod "b1ab8a10-4fcb-425d-b1b6-6770df4e13ec" (UID: "b1ab8a10-4fcb-425d-b1b6-6770df4e13ec"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:05:03 crc kubenswrapper[4895]: I1202 09:05:03.942997 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1ab8a10-4fcb-425d-b1b6-6770df4e13ec-kube-api-access-gw9gs" (OuterVolumeSpecName: "kube-api-access-gw9gs") pod "b1ab8a10-4fcb-425d-b1b6-6770df4e13ec" (UID: "b1ab8a10-4fcb-425d-b1b6-6770df4e13ec"). InnerVolumeSpecName "kube-api-access-gw9gs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:05:03 crc kubenswrapper[4895]: I1202 09:05:03.981657 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1ab8a10-4fcb-425d-b1b6-6770df4e13ec-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b1ab8a10-4fcb-425d-b1b6-6770df4e13ec" (UID: "b1ab8a10-4fcb-425d-b1b6-6770df4e13ec"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:05:04 crc kubenswrapper[4895]: I1202 09:05:04.031532 4895 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1ab8a10-4fcb-425d-b1b6-6770df4e13ec-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 09:05:04 crc kubenswrapper[4895]: I1202 09:05:04.031654 4895 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1ab8a10-4fcb-425d-b1b6-6770df4e13ec-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 09:05:04 crc kubenswrapper[4895]: I1202 09:05:04.031666 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gw9gs\" (UniqueName: \"kubernetes.io/projected/b1ab8a10-4fcb-425d-b1b6-6770df4e13ec-kube-api-access-gw9gs\") on node \"crc\" DevicePath \"\"" Dec 02 09:05:04 crc kubenswrapper[4895]: I1202 09:05:04.507758 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-686qz" 
event={"ID":"b1ab8a10-4fcb-425d-b1b6-6770df4e13ec","Type":"ContainerDied","Data":"5363c23b03379d2410aa027f88e3eccf24cb4f4e979d2096edcb6b23c651e750"} Dec 02 09:05:04 crc kubenswrapper[4895]: I1202 09:05:04.507868 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-686qz" Dec 02 09:05:04 crc kubenswrapper[4895]: I1202 09:05:04.508113 4895 scope.go:117] "RemoveContainer" containerID="9bdfdf62a356bdb12b6a72d3f68c102cc890b876460bb281c0d629c329ba4e7c" Dec 02 09:05:04 crc kubenswrapper[4895]: I1202 09:05:04.538826 4895 scope.go:117] "RemoveContainer" containerID="dbf1db72d64270f79aa30af01740890f1b1adbc6a071eaf306de754269956a2d" Dec 02 09:05:04 crc kubenswrapper[4895]: I1202 09:05:04.555572 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-686qz"] Dec 02 09:05:04 crc kubenswrapper[4895]: I1202 09:05:04.564912 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-686qz"] Dec 02 09:05:04 crc kubenswrapper[4895]: I1202 09:05:04.567082 4895 scope.go:117] "RemoveContainer" containerID="1882208e4ec582e7230252730b95fc481ee16a4a1aa9f0a0c7efdb3f34ceb0b1" Dec 02 09:05:05 crc kubenswrapper[4895]: I1202 09:05:05.154261 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1ab8a10-4fcb-425d-b1b6-6770df4e13ec" path="/var/lib/kubelet/pods/b1ab8a10-4fcb-425d-b1b6-6770df4e13ec/volumes" Dec 02 09:05:12 crc kubenswrapper[4895]: I1202 09:05:12.699084 4895 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7b776dc549-zrcws" podUID="099f2897-7daf-4053-83a3-caacbf2ea78a" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.110:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.110:8080: connect: connection refused" Dec 02 09:05:22 crc kubenswrapper[4895]: I1202 09:05:22.698875 4895 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openstack/horizon-7b776dc549-zrcws" podUID="099f2897-7daf-4053-83a3-caacbf2ea78a" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.110:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.110:8080: connect: connection refused" Dec 02 09:05:22 crc kubenswrapper[4895]: I1202 09:05:22.699632 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7b776dc549-zrcws" Dec 02 09:05:25 crc kubenswrapper[4895]: I1202 09:05:25.046625 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-m8rvv"] Dec 02 09:05:25 crc kubenswrapper[4895]: I1202 09:05:25.059711 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-4006-account-create-update-xvvqf"] Dec 02 09:05:25 crc kubenswrapper[4895]: I1202 09:05:25.076555 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-4006-account-create-update-xvvqf"] Dec 02 09:05:25 crc kubenswrapper[4895]: I1202 09:05:25.085194 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-m8rvv"] Dec 02 09:05:25 crc kubenswrapper[4895]: I1202 09:05:25.156733 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03f8b53e-94ed-4cd6-a863-c78912879a7f" path="/var/lib/kubelet/pods/03f8b53e-94ed-4cd6-a863-c78912879a7f/volumes" Dec 02 09:05:25 crc kubenswrapper[4895]: I1202 09:05:25.157610 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38805e40-5f7b-4d9d-91fc-e70e17f03233" path="/var/lib/kubelet/pods/38805e40-5f7b-4d9d-91fc-e70e17f03233/volumes" Dec 02 09:05:28 crc kubenswrapper[4895]: I1202 09:05:28.756726 4895 generic.go:334] "Generic (PLEG): container finished" podID="099f2897-7daf-4053-83a3-caacbf2ea78a" containerID="5fa13d0b187c0a91d12a149222998075c3a44e64c312ee6f4519212d4ffe6216" exitCode=137 Dec 02 09:05:28 crc kubenswrapper[4895]: I1202 09:05:28.756786 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/horizon-7b776dc549-zrcws" event={"ID":"099f2897-7daf-4053-83a3-caacbf2ea78a","Type":"ContainerDied","Data":"5fa13d0b187c0a91d12a149222998075c3a44e64c312ee6f4519212d4ffe6216"}
Dec 02 09:05:28 crc kubenswrapper[4895]: E1202 09:05:28.857654 4895 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod099f2897_7daf_4053_83a3_caacbf2ea78a.slice/crio-conmon-5fa13d0b187c0a91d12a149222998075c3a44e64c312ee6f4519212d4ffe6216.scope\": RecentStats: unable to find data in memory cache]"
Dec 02 09:05:28 crc kubenswrapper[4895]: I1202 09:05:28.941168 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7b776dc549-zrcws"
Dec 02 09:05:29 crc kubenswrapper[4895]: I1202 09:05:29.120788 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/099f2897-7daf-4053-83a3-caacbf2ea78a-logs\") pod \"099f2897-7daf-4053-83a3-caacbf2ea78a\" (UID: \"099f2897-7daf-4053-83a3-caacbf2ea78a\") "
Dec 02 09:05:29 crc kubenswrapper[4895]: I1202 09:05:29.120860 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jz8vh\" (UniqueName: \"kubernetes.io/projected/099f2897-7daf-4053-83a3-caacbf2ea78a-kube-api-access-jz8vh\") pod \"099f2897-7daf-4053-83a3-caacbf2ea78a\" (UID: \"099f2897-7daf-4053-83a3-caacbf2ea78a\") "
Dec 02 09:05:29 crc kubenswrapper[4895]: I1202 09:05:29.121012 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/099f2897-7daf-4053-83a3-caacbf2ea78a-config-data\") pod \"099f2897-7daf-4053-83a3-caacbf2ea78a\" (UID: \"099f2897-7daf-4053-83a3-caacbf2ea78a\") "
Dec 02 09:05:29 crc kubenswrapper[4895]: I1202 09:05:29.121052 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/099f2897-7daf-4053-83a3-caacbf2ea78a-horizon-secret-key\") pod \"099f2897-7daf-4053-83a3-caacbf2ea78a\" (UID: \"099f2897-7daf-4053-83a3-caacbf2ea78a\") "
Dec 02 09:05:29 crc kubenswrapper[4895]: I1202 09:05:29.121114 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/099f2897-7daf-4053-83a3-caacbf2ea78a-scripts\") pod \"099f2897-7daf-4053-83a3-caacbf2ea78a\" (UID: \"099f2897-7daf-4053-83a3-caacbf2ea78a\") "
Dec 02 09:05:29 crc kubenswrapper[4895]: I1202 09:05:29.121318 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/099f2897-7daf-4053-83a3-caacbf2ea78a-logs" (OuterVolumeSpecName: "logs") pod "099f2897-7daf-4053-83a3-caacbf2ea78a" (UID: "099f2897-7daf-4053-83a3-caacbf2ea78a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 02 09:05:29 crc kubenswrapper[4895]: I1202 09:05:29.121949 4895 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/099f2897-7daf-4053-83a3-caacbf2ea78a-logs\") on node \"crc\" DevicePath \"\""
Dec 02 09:05:29 crc kubenswrapper[4895]: I1202 09:05:29.127881 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/099f2897-7daf-4053-83a3-caacbf2ea78a-kube-api-access-jz8vh" (OuterVolumeSpecName: "kube-api-access-jz8vh") pod "099f2897-7daf-4053-83a3-caacbf2ea78a" (UID: "099f2897-7daf-4053-83a3-caacbf2ea78a"). InnerVolumeSpecName "kube-api-access-jz8vh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 09:05:29 crc kubenswrapper[4895]: I1202 09:05:29.128138 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/099f2897-7daf-4053-83a3-caacbf2ea78a-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "099f2897-7daf-4053-83a3-caacbf2ea78a" (UID: "099f2897-7daf-4053-83a3-caacbf2ea78a"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 09:05:29 crc kubenswrapper[4895]: I1202 09:05:29.151827 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/099f2897-7daf-4053-83a3-caacbf2ea78a-scripts" (OuterVolumeSpecName: "scripts") pod "099f2897-7daf-4053-83a3-caacbf2ea78a" (UID: "099f2897-7daf-4053-83a3-caacbf2ea78a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 09:05:29 crc kubenswrapper[4895]: I1202 09:05:29.158842 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/099f2897-7daf-4053-83a3-caacbf2ea78a-config-data" (OuterVolumeSpecName: "config-data") pod "099f2897-7daf-4053-83a3-caacbf2ea78a" (UID: "099f2897-7daf-4053-83a3-caacbf2ea78a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 09:05:29 crc kubenswrapper[4895]: I1202 09:05:29.223887 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jz8vh\" (UniqueName: \"kubernetes.io/projected/099f2897-7daf-4053-83a3-caacbf2ea78a-kube-api-access-jz8vh\") on node \"crc\" DevicePath \"\""
Dec 02 09:05:29 crc kubenswrapper[4895]: I1202 09:05:29.223916 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/099f2897-7daf-4053-83a3-caacbf2ea78a-config-data\") on node \"crc\" DevicePath \"\""
Dec 02 09:05:29 crc kubenswrapper[4895]: I1202 09:05:29.223927 4895 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/099f2897-7daf-4053-83a3-caacbf2ea78a-horizon-secret-key\") on node \"crc\" DevicePath \"\""
Dec 02 09:05:29 crc kubenswrapper[4895]: I1202 09:05:29.223937 4895 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/099f2897-7daf-4053-83a3-caacbf2ea78a-scripts\") on node \"crc\" DevicePath \"\""
Dec 02 09:05:29 crc kubenswrapper[4895]: I1202 09:05:29.772132 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7b776dc549-zrcws" event={"ID":"099f2897-7daf-4053-83a3-caacbf2ea78a","Type":"ContainerDied","Data":"511acb7a55475c9bc82ad3767cb0dd3a4b6ebc895ee79a407926fbcfd9b62ec4"}
Dec 02 09:05:29 crc kubenswrapper[4895]: I1202 09:05:29.772205 4895 scope.go:117] "RemoveContainer" containerID="2b684b253b769194b8612cbf5954882e0ee204e9fd5fd536a695dc7c9daece5f"
Dec 02 09:05:29 crc kubenswrapper[4895]: I1202 09:05:29.772227 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7b776dc549-zrcws"
Dec 02 09:05:29 crc kubenswrapper[4895]: I1202 09:05:29.814722 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7b776dc549-zrcws"]
Dec 02 09:05:29 crc kubenswrapper[4895]: I1202 09:05:29.825392 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-7b776dc549-zrcws"]
Dec 02 09:05:29 crc kubenswrapper[4895]: I1202 09:05:29.976994 4895 scope.go:117] "RemoveContainer" containerID="5fa13d0b187c0a91d12a149222998075c3a44e64c312ee6f4519212d4ffe6216"
Dec 02 09:05:31 crc kubenswrapper[4895]: I1202 09:05:31.075560 4895 scope.go:117] "RemoveContainer" containerID="021e0aeb1985d12cd5381159443f360933eadaed7bc26a2ccaa8c0e3e9cdc374"
Dec 02 09:05:31 crc kubenswrapper[4895]: I1202 09:05:31.105642 4895 scope.go:117] "RemoveContainer" containerID="33fa1fce993d47df6ec702c7fc2ed2eaf36ab13d08ddef01a3555be0b0638d61"
Dec 02 09:05:31 crc kubenswrapper[4895]: I1202 09:05:31.160102 4895 scope.go:117] "RemoveContainer" containerID="cab76992dee572ef2fccea4bd55f2f55d7beee4f58370f7d39c4e77688a8710d"
Dec 02 09:05:31 crc kubenswrapper[4895]: I1202 09:05:31.163727 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="099f2897-7daf-4053-83a3-caacbf2ea78a" path="/var/lib/kubelet/pods/099f2897-7daf-4053-83a3-caacbf2ea78a/volumes"
Dec 02 09:05:31 crc kubenswrapper[4895]: I1202 09:05:31.224723 4895 scope.go:117] "RemoveContainer" containerID="479aa7736c632ffde4307b61d4f05e5599c0104a4fe2368c24a895db3c4bf37d"
Dec 02 09:05:31 crc kubenswrapper[4895]: I1202 09:05:31.263246 4895 scope.go:117] "RemoveContainer" containerID="710da3364236afd6e2492aebc51c183d4b941b4e773510efb788e163681e1435"
Dec 02 09:05:34 crc kubenswrapper[4895]: I1202 09:05:34.044863 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-zmtns"]
Dec 02 09:05:34 crc kubenswrapper[4895]: I1202 09:05:34.057784 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-zmtns"]
Dec 02 09:05:35 crc kubenswrapper[4895]: I1202 09:05:35.159454 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49929bec-a95b-40ab-b9bd-b162ecbee391" path="/var/lib/kubelet/pods/49929bec-a95b-40ab-b9bd-b162ecbee391/volumes"
Dec 02 09:05:35 crc kubenswrapper[4895]: I1202 09:05:35.473752 4895 patch_prober.go:28] interesting pod/machine-config-daemon-wfcg7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 02 09:05:35 crc kubenswrapper[4895]: I1202 09:05:35.473808 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 02 09:06:03 crc kubenswrapper[4895]: I1202 09:06:03.055719 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-nmzg7"]
Dec 02 09:06:03 crc kubenswrapper[4895]: I1202 09:06:03.065454 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-nmzg7"]
Dec 02 09:06:03 crc kubenswrapper[4895]: I1202 09:06:03.155498 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c59d4c29-3b2b-4966-afcf-beb3f0ea1502" path="/var/lib/kubelet/pods/c59d4c29-3b2b-4966-afcf-beb3f0ea1502/volumes"
Dec 02 09:06:04 crc kubenswrapper[4895]: I1202 09:06:04.031810 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-3f33-account-create-update-rlk7v"]
Dec 02 09:06:04 crc kubenswrapper[4895]: I1202 09:06:04.042948 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-3f33-account-create-update-rlk7v"]
Dec 02 09:06:05 crc kubenswrapper[4895]: I1202 09:06:05.088138 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-66c85bfb6f-gnhhn"]
Dec 02 09:06:05 crc kubenswrapper[4895]: E1202 09:06:05.088561 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1ab8a10-4fcb-425d-b1b6-6770df4e13ec" containerName="extract-utilities"
Dec 02 09:06:05 crc kubenswrapper[4895]: I1202 09:06:05.088589 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1ab8a10-4fcb-425d-b1b6-6770df4e13ec" containerName="extract-utilities"
Dec 02 09:06:05 crc kubenswrapper[4895]: E1202 09:06:05.088603 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="099f2897-7daf-4053-83a3-caacbf2ea78a" containerName="horizon-log"
Dec 02 09:06:05 crc kubenswrapper[4895]: I1202 09:06:05.088611 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="099f2897-7daf-4053-83a3-caacbf2ea78a" containerName="horizon-log"
Dec 02 09:06:05 crc kubenswrapper[4895]: E1202 09:06:05.088628 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d58f5900-18bf-409f-bba9-69b7d491582d" containerName="extract-content"
Dec 02 09:06:05 crc kubenswrapper[4895]: I1202 09:06:05.088635 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="d58f5900-18bf-409f-bba9-69b7d491582d" containerName="extract-content"
Dec 02 09:06:05 crc kubenswrapper[4895]: E1202 09:06:05.088650 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1ab8a10-4fcb-425d-b1b6-6770df4e13ec" containerName="extract-content"
Dec 02 09:06:05 crc kubenswrapper[4895]: I1202 09:06:05.088655 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1ab8a10-4fcb-425d-b1b6-6770df4e13ec" containerName="extract-content"
Dec 02 09:06:05 crc kubenswrapper[4895]: E1202 09:06:05.088669 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d58f5900-18bf-409f-bba9-69b7d491582d" containerName="extract-utilities"
Dec 02 09:06:05 crc kubenswrapper[4895]: I1202 09:06:05.088676 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="d58f5900-18bf-409f-bba9-69b7d491582d" containerName="extract-utilities"
Dec 02 09:06:05 crc kubenswrapper[4895]: E1202 09:06:05.088688 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1ab8a10-4fcb-425d-b1b6-6770df4e13ec" containerName="registry-server"
Dec 02 09:06:05 crc kubenswrapper[4895]: I1202 09:06:05.088696 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1ab8a10-4fcb-425d-b1b6-6770df4e13ec" containerName="registry-server"
Dec 02 09:06:05 crc kubenswrapper[4895]: E1202 09:06:05.088706 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="099f2897-7daf-4053-83a3-caacbf2ea78a" containerName="horizon"
Dec 02 09:06:05 crc kubenswrapper[4895]: I1202 09:06:05.088713 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="099f2897-7daf-4053-83a3-caacbf2ea78a" containerName="horizon"
Dec 02 09:06:05 crc kubenswrapper[4895]: E1202 09:06:05.088724 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fe5d716-e549-4eb1-9bff-83a0afed16c2" containerName="horizon-log"
Dec 02 09:06:05 crc kubenswrapper[4895]: I1202 09:06:05.088732 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fe5d716-e549-4eb1-9bff-83a0afed16c2" containerName="horizon-log"
Dec 02 09:06:05 crc kubenswrapper[4895]: E1202 09:06:05.088767 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fe5d716-e549-4eb1-9bff-83a0afed16c2" containerName="horizon"
Dec 02 09:06:05 crc kubenswrapper[4895]: I1202 09:06:05.088776 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fe5d716-e549-4eb1-9bff-83a0afed16c2" containerName="horizon"
Dec 02 09:06:05 crc kubenswrapper[4895]: E1202 09:06:05.088791 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d58f5900-18bf-409f-bba9-69b7d491582d" containerName="registry-server"
Dec 02 09:06:05 crc kubenswrapper[4895]: I1202 09:06:05.088797 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="d58f5900-18bf-409f-bba9-69b7d491582d" containerName="registry-server"
Dec 02 09:06:05 crc kubenswrapper[4895]: I1202 09:06:05.088984 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="7fe5d716-e549-4eb1-9bff-83a0afed16c2" containerName="horizon"
Dec 02 09:06:05 crc kubenswrapper[4895]: I1202 09:06:05.089009 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="d58f5900-18bf-409f-bba9-69b7d491582d" containerName="registry-server"
Dec 02 09:06:05 crc kubenswrapper[4895]: I1202 09:06:05.089018 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1ab8a10-4fcb-425d-b1b6-6770df4e13ec" containerName="registry-server"
Dec 02 09:06:05 crc kubenswrapper[4895]: I1202 09:06:05.089029 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="099f2897-7daf-4053-83a3-caacbf2ea78a" containerName="horizon"
Dec 02 09:06:05 crc kubenswrapper[4895]: I1202 09:06:05.089039 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="099f2897-7daf-4053-83a3-caacbf2ea78a" containerName="horizon-log"
Dec 02 09:06:05 crc kubenswrapper[4895]: I1202 09:06:05.089055 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="7fe5d716-e549-4eb1-9bff-83a0afed16c2" containerName="horizon-log"
Dec 02 09:06:05 crc kubenswrapper[4895]: I1202 09:06:05.090124 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-66c85bfb6f-gnhhn"
Dec 02 09:06:05 crc kubenswrapper[4895]: I1202 09:06:05.113862 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-66c85bfb6f-gnhhn"]
Dec 02 09:06:05 crc kubenswrapper[4895]: I1202 09:06:05.162061 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1dca9eb0-0064-474c-864d-2a3ee5a37609" path="/var/lib/kubelet/pods/1dca9eb0-0064-474c-864d-2a3ee5a37609/volumes"
Dec 02 09:06:05 crc kubenswrapper[4895]: I1202 09:06:05.267563 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/77721628-6ade-42aa-bce2-e4d481d4d76f-logs\") pod \"horizon-66c85bfb6f-gnhhn\" (UID: \"77721628-6ade-42aa-bce2-e4d481d4d76f\") " pod="openstack/horizon-66c85bfb6f-gnhhn"
Dec 02 09:06:05 crc kubenswrapper[4895]: I1202 09:06:05.268096 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/77721628-6ade-42aa-bce2-e4d481d4d76f-scripts\") pod \"horizon-66c85bfb6f-gnhhn\" (UID: \"77721628-6ade-42aa-bce2-e4d481d4d76f\") " pod="openstack/horizon-66c85bfb6f-gnhhn"
Dec 02 09:06:05 crc kubenswrapper[4895]: I1202 09:06:05.268140 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/77721628-6ade-42aa-bce2-e4d481d4d76f-config-data\") pod \"horizon-66c85bfb6f-gnhhn\" (UID: \"77721628-6ade-42aa-bce2-e4d481d4d76f\") " pod="openstack/horizon-66c85bfb6f-gnhhn"
Dec 02 09:06:05 crc kubenswrapper[4895]: I1202 09:06:05.268184 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/77721628-6ade-42aa-bce2-e4d481d4d76f-horizon-secret-key\") pod \"horizon-66c85bfb6f-gnhhn\" (UID: \"77721628-6ade-42aa-bce2-e4d481d4d76f\") " pod="openstack/horizon-66c85bfb6f-gnhhn"
Dec 02 09:06:05 crc kubenswrapper[4895]: I1202 09:06:05.268407 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5jl4\" (UniqueName: \"kubernetes.io/projected/77721628-6ade-42aa-bce2-e4d481d4d76f-kube-api-access-j5jl4\") pod \"horizon-66c85bfb6f-gnhhn\" (UID: \"77721628-6ade-42aa-bce2-e4d481d4d76f\") " pod="openstack/horizon-66c85bfb6f-gnhhn"
Dec 02 09:06:05 crc kubenswrapper[4895]: I1202 09:06:05.370874 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/77721628-6ade-42aa-bce2-e4d481d4d76f-config-data\") pod \"horizon-66c85bfb6f-gnhhn\" (UID: \"77721628-6ade-42aa-bce2-e4d481d4d76f\") " pod="openstack/horizon-66c85bfb6f-gnhhn"
Dec 02 09:06:05 crc kubenswrapper[4895]: I1202 09:06:05.370982 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/77721628-6ade-42aa-bce2-e4d481d4d76f-horizon-secret-key\") pod \"horizon-66c85bfb6f-gnhhn\" (UID: \"77721628-6ade-42aa-bce2-e4d481d4d76f\") " pod="openstack/horizon-66c85bfb6f-gnhhn"
Dec 02 09:06:05 crc kubenswrapper[4895]: I1202 09:06:05.371043 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j5jl4\" (UniqueName: \"kubernetes.io/projected/77721628-6ade-42aa-bce2-e4d481d4d76f-kube-api-access-j5jl4\") pod \"horizon-66c85bfb6f-gnhhn\" (UID: \"77721628-6ade-42aa-bce2-e4d481d4d76f\") " pod="openstack/horizon-66c85bfb6f-gnhhn"
Dec 02 09:06:05 crc kubenswrapper[4895]: I1202 09:06:05.371112 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/77721628-6ade-42aa-bce2-e4d481d4d76f-logs\") pod \"horizon-66c85bfb6f-gnhhn\" (UID: \"77721628-6ade-42aa-bce2-e4d481d4d76f\") " pod="openstack/horizon-66c85bfb6f-gnhhn"
Dec 02 09:06:05 crc kubenswrapper[4895]: I1202 09:06:05.371211 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/77721628-6ade-42aa-bce2-e4d481d4d76f-scripts\") pod \"horizon-66c85bfb6f-gnhhn\" (UID: \"77721628-6ade-42aa-bce2-e4d481d4d76f\") " pod="openstack/horizon-66c85bfb6f-gnhhn"
Dec 02 09:06:05 crc kubenswrapper[4895]: I1202 09:06:05.372051 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/77721628-6ade-42aa-bce2-e4d481d4d76f-scripts\") pod \"horizon-66c85bfb6f-gnhhn\" (UID: \"77721628-6ade-42aa-bce2-e4d481d4d76f\") " pod="openstack/horizon-66c85bfb6f-gnhhn"
Dec 02 09:06:05 crc kubenswrapper[4895]: I1202 09:06:05.372553 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/77721628-6ade-42aa-bce2-e4d481d4d76f-config-data\") pod \"horizon-66c85bfb6f-gnhhn\" (UID: \"77721628-6ade-42aa-bce2-e4d481d4d76f\") " pod="openstack/horizon-66c85bfb6f-gnhhn"
Dec 02 09:06:05 crc kubenswrapper[4895]: I1202 09:06:05.373328 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/77721628-6ade-42aa-bce2-e4d481d4d76f-logs\") pod \"horizon-66c85bfb6f-gnhhn\" (UID: \"77721628-6ade-42aa-bce2-e4d481d4d76f\") " pod="openstack/horizon-66c85bfb6f-gnhhn"
Dec 02 09:06:05 crc kubenswrapper[4895]: I1202 09:06:05.386304 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/77721628-6ade-42aa-bce2-e4d481d4d76f-horizon-secret-key\") pod \"horizon-66c85bfb6f-gnhhn\" (UID: \"77721628-6ade-42aa-bce2-e4d481d4d76f\") " pod="openstack/horizon-66c85bfb6f-gnhhn"
Dec 02 09:06:05 crc kubenswrapper[4895]: I1202 09:06:05.389342 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5jl4\" (UniqueName: \"kubernetes.io/projected/77721628-6ade-42aa-bce2-e4d481d4d76f-kube-api-access-j5jl4\") pod \"horizon-66c85bfb6f-gnhhn\" (UID: \"77721628-6ade-42aa-bce2-e4d481d4d76f\") " pod="openstack/horizon-66c85bfb6f-gnhhn"
Dec 02 09:06:05 crc kubenswrapper[4895]: I1202 09:06:05.416955 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-66c85bfb6f-gnhhn"
Dec 02 09:06:05 crc kubenswrapper[4895]: I1202 09:06:05.473164 4895 patch_prober.go:28] interesting pod/machine-config-daemon-wfcg7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 02 09:06:05 crc kubenswrapper[4895]: I1202 09:06:05.473244 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 02 09:06:05 crc kubenswrapper[4895]: I1202 09:06:05.887428 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-66c85bfb6f-gnhhn"]
Dec 02 09:06:05 crc kubenswrapper[4895]: W1202 09:06:05.889861 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod77721628_6ade_42aa_bce2_e4d481d4d76f.slice/crio-4c82a0047e46b522a18a275e00680b072721d8aa39f007c53d6a399a4b5004e1 WatchSource:0}: Error finding container 4c82a0047e46b522a18a275e00680b072721d8aa39f007c53d6a399a4b5004e1: Status 404 returned error can't find the container with id 4c82a0047e46b522a18a275e00680b072721d8aa39f007c53d6a399a4b5004e1
Dec 02 09:06:06 crc kubenswrapper[4895]: I1202 09:06:06.127824 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-66c85bfb6f-gnhhn" event={"ID":"77721628-6ade-42aa-bce2-e4d481d4d76f","Type":"ContainerStarted","Data":"0c8349343621a37944478e3155133ae744017277b37612cdbf6c7d77909f93a0"}
Dec 02 09:06:06 crc kubenswrapper[4895]: I1202 09:06:06.128238 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-66c85bfb6f-gnhhn" event={"ID":"77721628-6ade-42aa-bce2-e4d481d4d76f","Type":"ContainerStarted","Data":"4c82a0047e46b522a18a275e00680b072721d8aa39f007c53d6a399a4b5004e1"}
Dec 02 09:06:06 crc kubenswrapper[4895]: I1202 09:06:06.368179 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-create-r2hxs"]
Dec 02 09:06:06 crc kubenswrapper[4895]: I1202 09:06:06.369715 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-r2hxs"
Dec 02 09:06:06 crc kubenswrapper[4895]: I1202 09:06:06.380092 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-r2hxs"]
Dec 02 09:06:06 crc kubenswrapper[4895]: I1202 09:06:06.481410 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-643c-account-create-update-tcbmr"]
Dec 02 09:06:06 crc kubenswrapper[4895]: I1202 09:06:06.485603 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-643c-account-create-update-tcbmr"
Dec 02 09:06:06 crc kubenswrapper[4895]: I1202 09:06:06.490056 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-db-secret"
Dec 02 09:06:06 crc kubenswrapper[4895]: I1202 09:06:06.512291 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hv2sr\" (UniqueName: \"kubernetes.io/projected/e655f0bd-0705-4b88-b0c8-d9c184c96f99-kube-api-access-hv2sr\") pod \"heat-db-create-r2hxs\" (UID: \"e655f0bd-0705-4b88-b0c8-d9c184c96f99\") " pod="openstack/heat-db-create-r2hxs"
Dec 02 09:06:06 crc kubenswrapper[4895]: I1202 09:06:06.512429 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e655f0bd-0705-4b88-b0c8-d9c184c96f99-operator-scripts\") pod \"heat-db-create-r2hxs\" (UID: \"e655f0bd-0705-4b88-b0c8-d9c184c96f99\") " pod="openstack/heat-db-create-r2hxs"
Dec 02 09:06:06 crc kubenswrapper[4895]: I1202 09:06:06.512730 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-643c-account-create-update-tcbmr"]
Dec 02 09:06:06 crc kubenswrapper[4895]: I1202 09:06:06.614655 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hv2sr\" (UniqueName: \"kubernetes.io/projected/e655f0bd-0705-4b88-b0c8-d9c184c96f99-kube-api-access-hv2sr\") pod \"heat-db-create-r2hxs\" (UID: \"e655f0bd-0705-4b88-b0c8-d9c184c96f99\") " pod="openstack/heat-db-create-r2hxs"
Dec 02 09:06:06 crc kubenswrapper[4895]: I1202 09:06:06.614719 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/03d8163d-5dd1-428e-9b6c-ab2eeaa5a010-operator-scripts\") pod \"heat-643c-account-create-update-tcbmr\" (UID: \"03d8163d-5dd1-428e-9b6c-ab2eeaa5a010\") " pod="openstack/heat-643c-account-create-update-tcbmr"
Dec 02 09:06:06 crc kubenswrapper[4895]: I1202 09:06:06.614809 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e655f0bd-0705-4b88-b0c8-d9c184c96f99-operator-scripts\") pod \"heat-db-create-r2hxs\" (UID: \"e655f0bd-0705-4b88-b0c8-d9c184c96f99\") " pod="openstack/heat-db-create-r2hxs"
Dec 02 09:06:06 crc kubenswrapper[4895]: I1202 09:06:06.614885 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxz95\" (UniqueName: \"kubernetes.io/projected/03d8163d-5dd1-428e-9b6c-ab2eeaa5a010-kube-api-access-dxz95\") pod \"heat-643c-account-create-update-tcbmr\" (UID: \"03d8163d-5dd1-428e-9b6c-ab2eeaa5a010\") " pod="openstack/heat-643c-account-create-update-tcbmr"
Dec 02 09:06:06 crc kubenswrapper[4895]: I1202 09:06:06.615915 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e655f0bd-0705-4b88-b0c8-d9c184c96f99-operator-scripts\") pod \"heat-db-create-r2hxs\" (UID: \"e655f0bd-0705-4b88-b0c8-d9c184c96f99\") " pod="openstack/heat-db-create-r2hxs"
Dec 02 09:06:06 crc kubenswrapper[4895]: I1202 09:06:06.634691 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hv2sr\" (UniqueName: \"kubernetes.io/projected/e655f0bd-0705-4b88-b0c8-d9c184c96f99-kube-api-access-hv2sr\") pod \"heat-db-create-r2hxs\" (UID: \"e655f0bd-0705-4b88-b0c8-d9c184c96f99\") " pod="openstack/heat-db-create-r2hxs"
Dec 02 09:06:06 crc kubenswrapper[4895]: I1202 09:06:06.716846 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dxz95\" (UniqueName: \"kubernetes.io/projected/03d8163d-5dd1-428e-9b6c-ab2eeaa5a010-kube-api-access-dxz95\") pod \"heat-643c-account-create-update-tcbmr\" (UID: \"03d8163d-5dd1-428e-9b6c-ab2eeaa5a010\") " pod="openstack/heat-643c-account-create-update-tcbmr"
Dec 02 09:06:06 crc kubenswrapper[4895]: I1202 09:06:06.717007 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/03d8163d-5dd1-428e-9b6c-ab2eeaa5a010-operator-scripts\") pod \"heat-643c-account-create-update-tcbmr\" (UID: \"03d8163d-5dd1-428e-9b6c-ab2eeaa5a010\") " pod="openstack/heat-643c-account-create-update-tcbmr"
Dec 02 09:06:06 crc kubenswrapper[4895]: I1202 09:06:06.717847 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/03d8163d-5dd1-428e-9b6c-ab2eeaa5a010-operator-scripts\") pod \"heat-643c-account-create-update-tcbmr\" (UID: \"03d8163d-5dd1-428e-9b6c-ab2eeaa5a010\") " pod="openstack/heat-643c-account-create-update-tcbmr"
Dec 02 09:06:06 crc kubenswrapper[4895]: I1202 09:06:06.735573 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxz95\" (UniqueName: \"kubernetes.io/projected/03d8163d-5dd1-428e-9b6c-ab2eeaa5a010-kube-api-access-dxz95\") pod \"heat-643c-account-create-update-tcbmr\" (UID: \"03d8163d-5dd1-428e-9b6c-ab2eeaa5a010\") " pod="openstack/heat-643c-account-create-update-tcbmr"
Dec 02 09:06:06 crc kubenswrapper[4895]: I1202 09:06:06.747370 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-r2hxs"
Dec 02 09:06:06 crc kubenswrapper[4895]: I1202 09:06:06.812430 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-643c-account-create-update-tcbmr"
Dec 02 09:06:07 crc kubenswrapper[4895]: I1202 09:06:07.136688 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-66c85bfb6f-gnhhn" event={"ID":"77721628-6ade-42aa-bce2-e4d481d4d76f","Type":"ContainerStarted","Data":"358288e180286480f17558095ed4fda6a3d29fba8facfdef75a534f8df35349b"}
Dec 02 09:06:07 crc kubenswrapper[4895]: I1202 09:06:07.163122 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-66c85bfb6f-gnhhn" podStartSLOduration=2.163099773 podStartE2EDuration="2.163099773s" podCreationTimestamp="2025-12-02 09:06:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:06:07.159215492 +0000 UTC m=+6178.330075105" watchObservedRunningTime="2025-12-02 09:06:07.163099773 +0000 UTC m=+6178.333959386"
Dec 02 09:06:07 crc kubenswrapper[4895]: W1202 09:06:07.258212 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode655f0bd_0705_4b88_b0c8_d9c184c96f99.slice/crio-12966ad92984a238a80dea4ea207591817cc8fdedd5be061097e95e902897c28 WatchSource:0}: Error finding container 12966ad92984a238a80dea4ea207591817cc8fdedd5be061097e95e902897c28: Status 404 returned error can't find the container with id 12966ad92984a238a80dea4ea207591817cc8fdedd5be061097e95e902897c28
Dec 02 09:06:07 crc kubenswrapper[4895]: I1202 09:06:07.259182 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-r2hxs"]
Dec 02 09:06:07 crc kubenswrapper[4895]: W1202 09:06:07.326496 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod03d8163d_5dd1_428e_9b6c_ab2eeaa5a010.slice/crio-35b7ffa31c4c9be97e5c52f94a6b2214979c26974ef6c2823f2e312b5574f4a6 WatchSource:0}: Error finding container 35b7ffa31c4c9be97e5c52f94a6b2214979c26974ef6c2823f2e312b5574f4a6: Status 404 returned error can't find the container with id 35b7ffa31c4c9be97e5c52f94a6b2214979c26974ef6c2823f2e312b5574f4a6
Dec 02 09:06:07 crc kubenswrapper[4895]: I1202 09:06:07.329068 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-643c-account-create-update-tcbmr"]
Dec 02 09:06:08 crc kubenswrapper[4895]: I1202 09:06:08.154658 4895 generic.go:334] "Generic (PLEG): container finished" podID="e655f0bd-0705-4b88-b0c8-d9c184c96f99" containerID="47e7bd5c4c676f0ff821cbaa0d0e797e4abe7b4e57380d493827b73c46f73b70" exitCode=0
Dec 02 09:06:08 crc kubenswrapper[4895]: I1202 09:06:08.154853 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-r2hxs" event={"ID":"e655f0bd-0705-4b88-b0c8-d9c184c96f99","Type":"ContainerDied","Data":"47e7bd5c4c676f0ff821cbaa0d0e797e4abe7b4e57380d493827b73c46f73b70"}
Dec 02 09:06:08 crc kubenswrapper[4895]: I1202 09:06:08.155162 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-r2hxs" event={"ID":"e655f0bd-0705-4b88-b0c8-d9c184c96f99","Type":"ContainerStarted","Data":"12966ad92984a238a80dea4ea207591817cc8fdedd5be061097e95e902897c28"}
Dec 02 09:06:08 crc kubenswrapper[4895]: I1202 09:06:08.160583 4895 generic.go:334] "Generic (PLEG): container finished" podID="03d8163d-5dd1-428e-9b6c-ab2eeaa5a010" containerID="9b13260e87d8a484148a6bf45a00d1cb6889f8c80e081413d6c1b7f1bcfb22e5" exitCode=0
Dec 02 09:06:08 crc kubenswrapper[4895]: I1202 09:06:08.160825 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-643c-account-create-update-tcbmr" event={"ID":"03d8163d-5dd1-428e-9b6c-ab2eeaa5a010","Type":"ContainerDied","Data":"9b13260e87d8a484148a6bf45a00d1cb6889f8c80e081413d6c1b7f1bcfb22e5"}
Dec 02 09:06:08 crc kubenswrapper[4895]: I1202 09:06:08.160884 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-643c-account-create-update-tcbmr" event={"ID":"03d8163d-5dd1-428e-9b6c-ab2eeaa5a010","Type":"ContainerStarted","Data":"35b7ffa31c4c9be97e5c52f94a6b2214979c26974ef6c2823f2e312b5574f4a6"}
Dec 02 09:06:09 crc kubenswrapper[4895]: I1202 09:06:09.046943 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-h7zhh"]
Dec 02 09:06:09 crc kubenswrapper[4895]: I1202 09:06:09.069815 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-h7zhh"]
Dec 02 09:06:09 crc kubenswrapper[4895]: I1202 09:06:09.160650 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f1ce383-de6c-4845-8aa1-d97f8057fd90" path="/var/lib/kubelet/pods/1f1ce383-de6c-4845-8aa1-d97f8057fd90/volumes"
Dec 02 09:06:09 crc kubenswrapper[4895]: I1202 09:06:09.656020 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-r2hxs"
Dec 02 09:06:09 crc kubenswrapper[4895]: I1202 09:06:09.663621 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-643c-account-create-update-tcbmr"
Dec 02 09:06:09 crc kubenswrapper[4895]: I1202 09:06:09.808382 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dxz95\" (UniqueName: \"kubernetes.io/projected/03d8163d-5dd1-428e-9b6c-ab2eeaa5a010-kube-api-access-dxz95\") pod \"03d8163d-5dd1-428e-9b6c-ab2eeaa5a010\" (UID: \"03d8163d-5dd1-428e-9b6c-ab2eeaa5a010\") "
Dec 02 09:06:09 crc kubenswrapper[4895]: I1202 09:06:09.808453 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/03d8163d-5dd1-428e-9b6c-ab2eeaa5a010-operator-scripts\") pod \"03d8163d-5dd1-428e-9b6c-ab2eeaa5a010\" (UID: \"03d8163d-5dd1-428e-9b6c-ab2eeaa5a010\") "
Dec 02 09:06:09 crc kubenswrapper[4895]: I1202 09:06:09.808522 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hv2sr\" (UniqueName: \"kubernetes.io/projected/e655f0bd-0705-4b88-b0c8-d9c184c96f99-kube-api-access-hv2sr\") pod \"e655f0bd-0705-4b88-b0c8-d9c184c96f99\" (UID: \"e655f0bd-0705-4b88-b0c8-d9c184c96f99\") "
Dec 02 09:06:09 crc kubenswrapper[4895]: I1202 09:06:09.808552 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e655f0bd-0705-4b88-b0c8-d9c184c96f99-operator-scripts\") pod \"e655f0bd-0705-4b88-b0c8-d9c184c96f99\" (UID: \"e655f0bd-0705-4b88-b0c8-d9c184c96f99\") "
Dec 02 09:06:09 crc kubenswrapper[4895]: I1202 09:06:09.809154 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/03d8163d-5dd1-428e-9b6c-ab2eeaa5a010-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "03d8163d-5dd1-428e-9b6c-ab2eeaa5a010" (UID: "03d8163d-5dd1-428e-9b6c-ab2eeaa5a010"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 09:06:09 crc kubenswrapper[4895]: I1202 09:06:09.809397 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e655f0bd-0705-4b88-b0c8-d9c184c96f99-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e655f0bd-0705-4b88-b0c8-d9c184c96f99" (UID: "e655f0bd-0705-4b88-b0c8-d9c184c96f99"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 09:06:09 crc kubenswrapper[4895]: I1202 09:06:09.809436 4895 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/03d8163d-5dd1-428e-9b6c-ab2eeaa5a010-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 02 09:06:09 crc kubenswrapper[4895]: I1202 09:06:09.816566 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03d8163d-5dd1-428e-9b6c-ab2eeaa5a010-kube-api-access-dxz95" (OuterVolumeSpecName: "kube-api-access-dxz95") pod "03d8163d-5dd1-428e-9b6c-ab2eeaa5a010" (UID: "03d8163d-5dd1-428e-9b6c-ab2eeaa5a010"). InnerVolumeSpecName "kube-api-access-dxz95". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 09:06:09 crc kubenswrapper[4895]: I1202 09:06:09.837301 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e655f0bd-0705-4b88-b0c8-d9c184c96f99-kube-api-access-hv2sr" (OuterVolumeSpecName: "kube-api-access-hv2sr") pod "e655f0bd-0705-4b88-b0c8-d9c184c96f99" (UID: "e655f0bd-0705-4b88-b0c8-d9c184c96f99"). InnerVolumeSpecName "kube-api-access-hv2sr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:06:09 crc kubenswrapper[4895]: I1202 09:06:09.913672 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dxz95\" (UniqueName: \"kubernetes.io/projected/03d8163d-5dd1-428e-9b6c-ab2eeaa5a010-kube-api-access-dxz95\") on node \"crc\" DevicePath \"\"" Dec 02 09:06:09 crc kubenswrapper[4895]: I1202 09:06:09.913753 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hv2sr\" (UniqueName: \"kubernetes.io/projected/e655f0bd-0705-4b88-b0c8-d9c184c96f99-kube-api-access-hv2sr\") on node \"crc\" DevicePath \"\"" Dec 02 09:06:09 crc kubenswrapper[4895]: I1202 09:06:09.913804 4895 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e655f0bd-0705-4b88-b0c8-d9c184c96f99-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 09:06:10 crc kubenswrapper[4895]: I1202 09:06:10.185903 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-643c-account-create-update-tcbmr" event={"ID":"03d8163d-5dd1-428e-9b6c-ab2eeaa5a010","Type":"ContainerDied","Data":"35b7ffa31c4c9be97e5c52f94a6b2214979c26974ef6c2823f2e312b5574f4a6"} Dec 02 09:06:10 crc kubenswrapper[4895]: I1202 09:06:10.185954 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-643c-account-create-update-tcbmr" Dec 02 09:06:10 crc kubenswrapper[4895]: I1202 09:06:10.185961 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="35b7ffa31c4c9be97e5c52f94a6b2214979c26974ef6c2823f2e312b5574f4a6" Dec 02 09:06:10 crc kubenswrapper[4895]: I1202 09:06:10.190162 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-r2hxs" event={"ID":"e655f0bd-0705-4b88-b0c8-d9c184c96f99","Type":"ContainerDied","Data":"12966ad92984a238a80dea4ea207591817cc8fdedd5be061097e95e902897c28"} Dec 02 09:06:10 crc kubenswrapper[4895]: I1202 09:06:10.190198 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="12966ad92984a238a80dea4ea207591817cc8fdedd5be061097e95e902897c28" Dec 02 09:06:10 crc kubenswrapper[4895]: I1202 09:06:10.190291 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-r2hxs" Dec 02 09:06:11 crc kubenswrapper[4895]: I1202 09:06:11.635233 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-sync-g4ljc"] Dec 02 09:06:11 crc kubenswrapper[4895]: E1202 09:06:11.637694 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03d8163d-5dd1-428e-9b6c-ab2eeaa5a010" containerName="mariadb-account-create-update" Dec 02 09:06:11 crc kubenswrapper[4895]: I1202 09:06:11.637722 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="03d8163d-5dd1-428e-9b6c-ab2eeaa5a010" containerName="mariadb-account-create-update" Dec 02 09:06:11 crc kubenswrapper[4895]: E1202 09:06:11.637792 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e655f0bd-0705-4b88-b0c8-d9c184c96f99" containerName="mariadb-database-create" Dec 02 09:06:11 crc kubenswrapper[4895]: I1202 09:06:11.637799 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="e655f0bd-0705-4b88-b0c8-d9c184c96f99" containerName="mariadb-database-create" Dec 02 
09:06:11 crc kubenswrapper[4895]: I1202 09:06:11.637979 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="e655f0bd-0705-4b88-b0c8-d9c184c96f99" containerName="mariadb-database-create" Dec 02 09:06:11 crc kubenswrapper[4895]: I1202 09:06:11.638003 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="03d8163d-5dd1-428e-9b6c-ab2eeaa5a010" containerName="mariadb-account-create-update" Dec 02 09:06:11 crc kubenswrapper[4895]: I1202 09:06:11.638804 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-g4ljc" Dec 02 09:06:11 crc kubenswrapper[4895]: I1202 09:06:11.644339 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Dec 02 09:06:11 crc kubenswrapper[4895]: I1202 09:06:11.644606 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-k4flr" Dec 02 09:06:11 crc kubenswrapper[4895]: I1202 09:06:11.655691 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-g4ljc"] Dec 02 09:06:11 crc kubenswrapper[4895]: I1202 09:06:11.752815 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c787d13-9ab0-4db2-842f-51cf06ef21dd-combined-ca-bundle\") pod \"heat-db-sync-g4ljc\" (UID: \"2c787d13-9ab0-4db2-842f-51cf06ef21dd\") " pod="openstack/heat-db-sync-g4ljc" Dec 02 09:06:11 crc kubenswrapper[4895]: I1202 09:06:11.752958 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpgnl\" (UniqueName: \"kubernetes.io/projected/2c787d13-9ab0-4db2-842f-51cf06ef21dd-kube-api-access-qpgnl\") pod \"heat-db-sync-g4ljc\" (UID: \"2c787d13-9ab0-4db2-842f-51cf06ef21dd\") " pod="openstack/heat-db-sync-g4ljc" Dec 02 09:06:11 crc kubenswrapper[4895]: I1202 09:06:11.752985 4895 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c787d13-9ab0-4db2-842f-51cf06ef21dd-config-data\") pod \"heat-db-sync-g4ljc\" (UID: \"2c787d13-9ab0-4db2-842f-51cf06ef21dd\") " pod="openstack/heat-db-sync-g4ljc" Dec 02 09:06:11 crc kubenswrapper[4895]: I1202 09:06:11.855095 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qpgnl\" (UniqueName: \"kubernetes.io/projected/2c787d13-9ab0-4db2-842f-51cf06ef21dd-kube-api-access-qpgnl\") pod \"heat-db-sync-g4ljc\" (UID: \"2c787d13-9ab0-4db2-842f-51cf06ef21dd\") " pod="openstack/heat-db-sync-g4ljc" Dec 02 09:06:11 crc kubenswrapper[4895]: I1202 09:06:11.855141 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c787d13-9ab0-4db2-842f-51cf06ef21dd-config-data\") pod \"heat-db-sync-g4ljc\" (UID: \"2c787d13-9ab0-4db2-842f-51cf06ef21dd\") " pod="openstack/heat-db-sync-g4ljc" Dec 02 09:06:11 crc kubenswrapper[4895]: I1202 09:06:11.855238 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c787d13-9ab0-4db2-842f-51cf06ef21dd-combined-ca-bundle\") pod \"heat-db-sync-g4ljc\" (UID: \"2c787d13-9ab0-4db2-842f-51cf06ef21dd\") " pod="openstack/heat-db-sync-g4ljc" Dec 02 09:06:11 crc kubenswrapper[4895]: I1202 09:06:11.862715 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c787d13-9ab0-4db2-842f-51cf06ef21dd-combined-ca-bundle\") pod \"heat-db-sync-g4ljc\" (UID: \"2c787d13-9ab0-4db2-842f-51cf06ef21dd\") " pod="openstack/heat-db-sync-g4ljc" Dec 02 09:06:11 crc kubenswrapper[4895]: I1202 09:06:11.863316 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/2c787d13-9ab0-4db2-842f-51cf06ef21dd-config-data\") pod \"heat-db-sync-g4ljc\" (UID: \"2c787d13-9ab0-4db2-842f-51cf06ef21dd\") " pod="openstack/heat-db-sync-g4ljc" Dec 02 09:06:11 crc kubenswrapper[4895]: I1202 09:06:11.872695 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpgnl\" (UniqueName: \"kubernetes.io/projected/2c787d13-9ab0-4db2-842f-51cf06ef21dd-kube-api-access-qpgnl\") pod \"heat-db-sync-g4ljc\" (UID: \"2c787d13-9ab0-4db2-842f-51cf06ef21dd\") " pod="openstack/heat-db-sync-g4ljc" Dec 02 09:06:11 crc kubenswrapper[4895]: I1202 09:06:11.975338 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-g4ljc" Dec 02 09:06:12 crc kubenswrapper[4895]: I1202 09:06:12.281911 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-g4ljc"] Dec 02 09:06:12 crc kubenswrapper[4895]: W1202 09:06:12.299992 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2c787d13_9ab0_4db2_842f_51cf06ef21dd.slice/crio-a978804b5962db1c2ae1a94d2c135cf0ac900549497832001a1aef6fdab374c2 WatchSource:0}: Error finding container a978804b5962db1c2ae1a94d2c135cf0ac900549497832001a1aef6fdab374c2: Status 404 returned error can't find the container with id a978804b5962db1c2ae1a94d2c135cf0ac900549497832001a1aef6fdab374c2 Dec 02 09:06:13 crc kubenswrapper[4895]: I1202 09:06:13.252866 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-g4ljc" event={"ID":"2c787d13-9ab0-4db2-842f-51cf06ef21dd","Type":"ContainerStarted","Data":"a978804b5962db1c2ae1a94d2c135cf0ac900549497832001a1aef6fdab374c2"} Dec 02 09:06:15 crc kubenswrapper[4895]: I1202 09:06:15.417381 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-66c85bfb6f-gnhhn" Dec 02 09:06:15 crc kubenswrapper[4895]: I1202 09:06:15.417995 4895 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-66c85bfb6f-gnhhn" Dec 02 09:06:21 crc kubenswrapper[4895]: I1202 09:06:21.361111 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-g4ljc" event={"ID":"2c787d13-9ab0-4db2-842f-51cf06ef21dd","Type":"ContainerStarted","Data":"10c4c16cd26a64bae6114cb3265f70e04fa3f40bde709b3a1dc0b6e517392aa8"} Dec 02 09:06:21 crc kubenswrapper[4895]: I1202 09:06:21.384243 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-db-sync-g4ljc" podStartSLOduration=1.7002703289999999 podStartE2EDuration="10.384214841s" podCreationTimestamp="2025-12-02 09:06:11 +0000 UTC" firstStartedPulling="2025-12-02 09:06:12.302417833 +0000 UTC m=+6183.473277446" lastFinishedPulling="2025-12-02 09:06:20.986362345 +0000 UTC m=+6192.157221958" observedRunningTime="2025-12-02 09:06:21.375737187 +0000 UTC m=+6192.546596810" watchObservedRunningTime="2025-12-02 09:06:21.384214841 +0000 UTC m=+6192.555074464" Dec 02 09:06:24 crc kubenswrapper[4895]: I1202 09:06:24.410822 4895 generic.go:334] "Generic (PLEG): container finished" podID="2c787d13-9ab0-4db2-842f-51cf06ef21dd" containerID="10c4c16cd26a64bae6114cb3265f70e04fa3f40bde709b3a1dc0b6e517392aa8" exitCode=0 Dec 02 09:06:24 crc kubenswrapper[4895]: I1202 09:06:24.411638 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-g4ljc" event={"ID":"2c787d13-9ab0-4db2-842f-51cf06ef21dd","Type":"ContainerDied","Data":"10c4c16cd26a64bae6114cb3265f70e04fa3f40bde709b3a1dc0b6e517392aa8"} Dec 02 09:06:25 crc kubenswrapper[4895]: I1202 09:06:25.421126 4895 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-66c85bfb6f-gnhhn" podUID="77721628-6ade-42aa-bce2-e4d481d4d76f" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.116:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.116:8080: connect: connection refused" Dec 02 09:06:25 
crc kubenswrapper[4895]: I1202 09:06:25.882754 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-g4ljc" Dec 02 09:06:25 crc kubenswrapper[4895]: I1202 09:06:25.985261 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c787d13-9ab0-4db2-842f-51cf06ef21dd-combined-ca-bundle\") pod \"2c787d13-9ab0-4db2-842f-51cf06ef21dd\" (UID: \"2c787d13-9ab0-4db2-842f-51cf06ef21dd\") " Dec 02 09:06:25 crc kubenswrapper[4895]: I1202 09:06:25.985325 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c787d13-9ab0-4db2-842f-51cf06ef21dd-config-data\") pod \"2c787d13-9ab0-4db2-842f-51cf06ef21dd\" (UID: \"2c787d13-9ab0-4db2-842f-51cf06ef21dd\") " Dec 02 09:06:25 crc kubenswrapper[4895]: I1202 09:06:25.985575 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qpgnl\" (UniqueName: \"kubernetes.io/projected/2c787d13-9ab0-4db2-842f-51cf06ef21dd-kube-api-access-qpgnl\") pod \"2c787d13-9ab0-4db2-842f-51cf06ef21dd\" (UID: \"2c787d13-9ab0-4db2-842f-51cf06ef21dd\") " Dec 02 09:06:25 crc kubenswrapper[4895]: I1202 09:06:25.995725 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c787d13-9ab0-4db2-842f-51cf06ef21dd-kube-api-access-qpgnl" (OuterVolumeSpecName: "kube-api-access-qpgnl") pod "2c787d13-9ab0-4db2-842f-51cf06ef21dd" (UID: "2c787d13-9ab0-4db2-842f-51cf06ef21dd"). InnerVolumeSpecName "kube-api-access-qpgnl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:06:26 crc kubenswrapper[4895]: I1202 09:06:26.016433 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c787d13-9ab0-4db2-842f-51cf06ef21dd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2c787d13-9ab0-4db2-842f-51cf06ef21dd" (UID: "2c787d13-9ab0-4db2-842f-51cf06ef21dd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:06:26 crc kubenswrapper[4895]: I1202 09:06:26.091100 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c787d13-9ab0-4db2-842f-51cf06ef21dd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 09:06:26 crc kubenswrapper[4895]: I1202 09:06:26.091169 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qpgnl\" (UniqueName: \"kubernetes.io/projected/2c787d13-9ab0-4db2-842f-51cf06ef21dd-kube-api-access-qpgnl\") on node \"crc\" DevicePath \"\"" Dec 02 09:06:26 crc kubenswrapper[4895]: I1202 09:06:26.093653 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c787d13-9ab0-4db2-842f-51cf06ef21dd-config-data" (OuterVolumeSpecName: "config-data") pod "2c787d13-9ab0-4db2-842f-51cf06ef21dd" (UID: "2c787d13-9ab0-4db2-842f-51cf06ef21dd"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:06:26 crc kubenswrapper[4895]: I1202 09:06:26.192453 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c787d13-9ab0-4db2-842f-51cf06ef21dd-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 09:06:26 crc kubenswrapper[4895]: I1202 09:06:26.443821 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-g4ljc" event={"ID":"2c787d13-9ab0-4db2-842f-51cf06ef21dd","Type":"ContainerDied","Data":"a978804b5962db1c2ae1a94d2c135cf0ac900549497832001a1aef6fdab374c2"} Dec 02 09:06:26 crc kubenswrapper[4895]: I1202 09:06:26.443894 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-g4ljc" Dec 02 09:06:26 crc kubenswrapper[4895]: I1202 09:06:26.443907 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a978804b5962db1c2ae1a94d2c135cf0ac900549497832001a1aef6fdab374c2" Dec 02 09:06:27 crc kubenswrapper[4895]: I1202 09:06:27.609784 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-5d65795cf7-2g5fx"] Dec 02 09:06:27 crc kubenswrapper[4895]: E1202 09:06:27.610387 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c787d13-9ab0-4db2-842f-51cf06ef21dd" containerName="heat-db-sync" Dec 02 09:06:27 crc kubenswrapper[4895]: I1202 09:06:27.610408 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c787d13-9ab0-4db2-842f-51cf06ef21dd" containerName="heat-db-sync" Dec 02 09:06:27 crc kubenswrapper[4895]: I1202 09:06:27.610665 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c787d13-9ab0-4db2-842f-51cf06ef21dd" containerName="heat-db-sync" Dec 02 09:06:27 crc kubenswrapper[4895]: I1202 09:06:27.611581 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-5d65795cf7-2g5fx" Dec 02 09:06:27 crc kubenswrapper[4895]: I1202 09:06:27.616504 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-engine-config-data" Dec 02 09:06:27 crc kubenswrapper[4895]: I1202 09:06:27.616673 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-k4flr" Dec 02 09:06:27 crc kubenswrapper[4895]: I1202 09:06:27.616857 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Dec 02 09:06:27 crc kubenswrapper[4895]: I1202 09:06:27.631618 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/98206baf-726c-4fa4-ae92-8048974e2e1d-config-data-custom\") pod \"heat-engine-5d65795cf7-2g5fx\" (UID: \"98206baf-726c-4fa4-ae92-8048974e2e1d\") " pod="openstack/heat-engine-5d65795cf7-2g5fx" Dec 02 09:06:27 crc kubenswrapper[4895]: I1202 09:06:27.631691 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98206baf-726c-4fa4-ae92-8048974e2e1d-config-data\") pod \"heat-engine-5d65795cf7-2g5fx\" (UID: \"98206baf-726c-4fa4-ae92-8048974e2e1d\") " pod="openstack/heat-engine-5d65795cf7-2g5fx" Dec 02 09:06:27 crc kubenswrapper[4895]: I1202 09:06:27.631758 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gl54l\" (UniqueName: \"kubernetes.io/projected/98206baf-726c-4fa4-ae92-8048974e2e1d-kube-api-access-gl54l\") pod \"heat-engine-5d65795cf7-2g5fx\" (UID: \"98206baf-726c-4fa4-ae92-8048974e2e1d\") " pod="openstack/heat-engine-5d65795cf7-2g5fx" Dec 02 09:06:27 crc kubenswrapper[4895]: I1202 09:06:27.631814 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/98206baf-726c-4fa4-ae92-8048974e2e1d-combined-ca-bundle\") pod \"heat-engine-5d65795cf7-2g5fx\" (UID: \"98206baf-726c-4fa4-ae92-8048974e2e1d\") " pod="openstack/heat-engine-5d65795cf7-2g5fx" Dec 02 09:06:27 crc kubenswrapper[4895]: I1202 09:06:27.637563 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-5d65795cf7-2g5fx"] Dec 02 09:06:27 crc kubenswrapper[4895]: I1202 09:06:27.729782 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-6cf4c6755f-682sl"] Dec 02 09:06:27 crc kubenswrapper[4895]: I1202 09:06:27.732018 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-6cf4c6755f-682sl" Dec 02 09:06:27 crc kubenswrapper[4895]: I1202 09:06:27.735101 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/98206baf-726c-4fa4-ae92-8048974e2e1d-config-data-custom\") pod \"heat-engine-5d65795cf7-2g5fx\" (UID: \"98206baf-726c-4fa4-ae92-8048974e2e1d\") " pod="openstack/heat-engine-5d65795cf7-2g5fx" Dec 02 09:06:27 crc kubenswrapper[4895]: I1202 09:06:27.735233 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98206baf-726c-4fa4-ae92-8048974e2e1d-config-data\") pod \"heat-engine-5d65795cf7-2g5fx\" (UID: \"98206baf-726c-4fa4-ae92-8048974e2e1d\") " pod="openstack/heat-engine-5d65795cf7-2g5fx" Dec 02 09:06:27 crc kubenswrapper[4895]: I1202 09:06:27.735348 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gl54l\" (UniqueName: \"kubernetes.io/projected/98206baf-726c-4fa4-ae92-8048974e2e1d-kube-api-access-gl54l\") pod \"heat-engine-5d65795cf7-2g5fx\" (UID: \"98206baf-726c-4fa4-ae92-8048974e2e1d\") " pod="openstack/heat-engine-5d65795cf7-2g5fx" Dec 02 09:06:27 crc kubenswrapper[4895]: I1202 09:06:27.735481 4895 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98206baf-726c-4fa4-ae92-8048974e2e1d-combined-ca-bundle\") pod \"heat-engine-5d65795cf7-2g5fx\" (UID: \"98206baf-726c-4fa4-ae92-8048974e2e1d\") " pod="openstack/heat-engine-5d65795cf7-2g5fx" Dec 02 09:06:27 crc kubenswrapper[4895]: I1202 09:06:27.737008 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-cfnapi-config-data" Dec 02 09:06:27 crc kubenswrapper[4895]: I1202 09:06:27.741182 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98206baf-726c-4fa4-ae92-8048974e2e1d-combined-ca-bundle\") pod \"heat-engine-5d65795cf7-2g5fx\" (UID: \"98206baf-726c-4fa4-ae92-8048974e2e1d\") " pod="openstack/heat-engine-5d65795cf7-2g5fx" Dec 02 09:06:27 crc kubenswrapper[4895]: I1202 09:06:27.750709 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/98206baf-726c-4fa4-ae92-8048974e2e1d-config-data-custom\") pod \"heat-engine-5d65795cf7-2g5fx\" (UID: \"98206baf-726c-4fa4-ae92-8048974e2e1d\") " pod="openstack/heat-engine-5d65795cf7-2g5fx" Dec 02 09:06:27 crc kubenswrapper[4895]: I1202 09:06:27.752945 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98206baf-726c-4fa4-ae92-8048974e2e1d-config-data\") pod \"heat-engine-5d65795cf7-2g5fx\" (UID: \"98206baf-726c-4fa4-ae92-8048974e2e1d\") " pod="openstack/heat-engine-5d65795cf7-2g5fx" Dec 02 09:06:27 crc kubenswrapper[4895]: I1202 09:06:27.754384 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-6cf4c6755f-682sl"] Dec 02 09:06:27 crc kubenswrapper[4895]: I1202 09:06:27.758324 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gl54l\" (UniqueName: 
\"kubernetes.io/projected/98206baf-726c-4fa4-ae92-8048974e2e1d-kube-api-access-gl54l\") pod \"heat-engine-5d65795cf7-2g5fx\" (UID: \"98206baf-726c-4fa4-ae92-8048974e2e1d\") " pod="openstack/heat-engine-5d65795cf7-2g5fx" Dec 02 09:06:27 crc kubenswrapper[4895]: I1202 09:06:27.827393 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-7bf7577d5f-qmt6j"] Dec 02 09:06:27 crc kubenswrapper[4895]: I1202 09:06:27.830173 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-7bf7577d5f-qmt6j" Dec 02 09:06:27 crc kubenswrapper[4895]: I1202 09:06:27.837162 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-api-config-data" Dec 02 09:06:27 crc kubenswrapper[4895]: I1202 09:06:27.840815 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8702398-a4fe-4a0b-94db-b747393df1a4-combined-ca-bundle\") pod \"heat-cfnapi-6cf4c6755f-682sl\" (UID: \"e8702398-a4fe-4a0b-94db-b747393df1a4\") " pod="openstack/heat-cfnapi-6cf4c6755f-682sl" Dec 02 09:06:27 crc kubenswrapper[4895]: I1202 09:06:27.841004 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e8702398-a4fe-4a0b-94db-b747393df1a4-config-data-custom\") pod \"heat-cfnapi-6cf4c6755f-682sl\" (UID: \"e8702398-a4fe-4a0b-94db-b747393df1a4\") " pod="openstack/heat-cfnapi-6cf4c6755f-682sl" Dec 02 09:06:27 crc kubenswrapper[4895]: I1202 09:06:27.841050 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4l8n\" (UniqueName: \"kubernetes.io/projected/e8702398-a4fe-4a0b-94db-b747393df1a4-kube-api-access-f4l8n\") pod \"heat-cfnapi-6cf4c6755f-682sl\" (UID: \"e8702398-a4fe-4a0b-94db-b747393df1a4\") " pod="openstack/heat-cfnapi-6cf4c6755f-682sl" Dec 02 09:06:27 crc 
kubenswrapper[4895]: I1202 09:06:27.841083 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8702398-a4fe-4a0b-94db-b747393df1a4-config-data\") pod \"heat-cfnapi-6cf4c6755f-682sl\" (UID: \"e8702398-a4fe-4a0b-94db-b747393df1a4\") " pod="openstack/heat-cfnapi-6cf4c6755f-682sl" Dec 02 09:06:27 crc kubenswrapper[4895]: I1202 09:06:27.864847 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-7bf7577d5f-qmt6j"] Dec 02 09:06:27 crc kubenswrapper[4895]: I1202 09:06:27.942587 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a8f67f5-e0f7-4d77-a9e3-27edd04f368d-combined-ca-bundle\") pod \"heat-api-7bf7577d5f-qmt6j\" (UID: \"3a8f67f5-e0f7-4d77-a9e3-27edd04f368d\") " pod="openstack/heat-api-7bf7577d5f-qmt6j" Dec 02 09:06:27 crc kubenswrapper[4895]: I1202 09:06:27.942658 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3a8f67f5-e0f7-4d77-a9e3-27edd04f368d-config-data-custom\") pod \"heat-api-7bf7577d5f-qmt6j\" (UID: \"3a8f67f5-e0f7-4d77-a9e3-27edd04f368d\") " pod="openstack/heat-api-7bf7577d5f-qmt6j" Dec 02 09:06:27 crc kubenswrapper[4895]: I1202 09:06:27.942768 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8702398-a4fe-4a0b-94db-b747393df1a4-combined-ca-bundle\") pod \"heat-cfnapi-6cf4c6755f-682sl\" (UID: \"e8702398-a4fe-4a0b-94db-b747393df1a4\") " pod="openstack/heat-cfnapi-6cf4c6755f-682sl" Dec 02 09:06:27 crc kubenswrapper[4895]: I1202 09:06:27.942809 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svdfw\" (UniqueName: 
\"kubernetes.io/projected/3a8f67f5-e0f7-4d77-a9e3-27edd04f368d-kube-api-access-svdfw\") pod \"heat-api-7bf7577d5f-qmt6j\" (UID: \"3a8f67f5-e0f7-4d77-a9e3-27edd04f368d\") " pod="openstack/heat-api-7bf7577d5f-qmt6j" Dec 02 09:06:27 crc kubenswrapper[4895]: I1202 09:06:27.942832 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a8f67f5-e0f7-4d77-a9e3-27edd04f368d-config-data\") pod \"heat-api-7bf7577d5f-qmt6j\" (UID: \"3a8f67f5-e0f7-4d77-a9e3-27edd04f368d\") " pod="openstack/heat-api-7bf7577d5f-qmt6j" Dec 02 09:06:27 crc kubenswrapper[4895]: I1202 09:06:27.942920 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e8702398-a4fe-4a0b-94db-b747393df1a4-config-data-custom\") pod \"heat-cfnapi-6cf4c6755f-682sl\" (UID: \"e8702398-a4fe-4a0b-94db-b747393df1a4\") " pod="openstack/heat-cfnapi-6cf4c6755f-682sl" Dec 02 09:06:27 crc kubenswrapper[4895]: I1202 09:06:27.942958 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f4l8n\" (UniqueName: \"kubernetes.io/projected/e8702398-a4fe-4a0b-94db-b747393df1a4-kube-api-access-f4l8n\") pod \"heat-cfnapi-6cf4c6755f-682sl\" (UID: \"e8702398-a4fe-4a0b-94db-b747393df1a4\") " pod="openstack/heat-cfnapi-6cf4c6755f-682sl" Dec 02 09:06:27 crc kubenswrapper[4895]: I1202 09:06:27.942983 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8702398-a4fe-4a0b-94db-b747393df1a4-config-data\") pod \"heat-cfnapi-6cf4c6755f-682sl\" (UID: \"e8702398-a4fe-4a0b-94db-b747393df1a4\") " pod="openstack/heat-cfnapi-6cf4c6755f-682sl" Dec 02 09:06:27 crc kubenswrapper[4895]: I1202 09:06:27.948676 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-5d65795cf7-2g5fx" Dec 02 09:06:27 crc kubenswrapper[4895]: I1202 09:06:27.956456 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e8702398-a4fe-4a0b-94db-b747393df1a4-config-data-custom\") pod \"heat-cfnapi-6cf4c6755f-682sl\" (UID: \"e8702398-a4fe-4a0b-94db-b747393df1a4\") " pod="openstack/heat-cfnapi-6cf4c6755f-682sl" Dec 02 09:06:27 crc kubenswrapper[4895]: I1202 09:06:27.962333 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8702398-a4fe-4a0b-94db-b747393df1a4-config-data\") pod \"heat-cfnapi-6cf4c6755f-682sl\" (UID: \"e8702398-a4fe-4a0b-94db-b747393df1a4\") " pod="openstack/heat-cfnapi-6cf4c6755f-682sl" Dec 02 09:06:27 crc kubenswrapper[4895]: I1202 09:06:27.964143 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8702398-a4fe-4a0b-94db-b747393df1a4-combined-ca-bundle\") pod \"heat-cfnapi-6cf4c6755f-682sl\" (UID: \"e8702398-a4fe-4a0b-94db-b747393df1a4\") " pod="openstack/heat-cfnapi-6cf4c6755f-682sl" Dec 02 09:06:27 crc kubenswrapper[4895]: I1202 09:06:27.970904 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4l8n\" (UniqueName: \"kubernetes.io/projected/e8702398-a4fe-4a0b-94db-b747393df1a4-kube-api-access-f4l8n\") pod \"heat-cfnapi-6cf4c6755f-682sl\" (UID: \"e8702398-a4fe-4a0b-94db-b747393df1a4\") " pod="openstack/heat-cfnapi-6cf4c6755f-682sl" Dec 02 09:06:28 crc kubenswrapper[4895]: I1202 09:06:28.045488 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a8f67f5-e0f7-4d77-a9e3-27edd04f368d-combined-ca-bundle\") pod \"heat-api-7bf7577d5f-qmt6j\" (UID: \"3a8f67f5-e0f7-4d77-a9e3-27edd04f368d\") " pod="openstack/heat-api-7bf7577d5f-qmt6j" 
Dec 02 09:06:28 crc kubenswrapper[4895]: I1202 09:06:28.045567 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3a8f67f5-e0f7-4d77-a9e3-27edd04f368d-config-data-custom\") pod \"heat-api-7bf7577d5f-qmt6j\" (UID: \"3a8f67f5-e0f7-4d77-a9e3-27edd04f368d\") " pod="openstack/heat-api-7bf7577d5f-qmt6j" Dec 02 09:06:28 crc kubenswrapper[4895]: I1202 09:06:28.045646 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-svdfw\" (UniqueName: \"kubernetes.io/projected/3a8f67f5-e0f7-4d77-a9e3-27edd04f368d-kube-api-access-svdfw\") pod \"heat-api-7bf7577d5f-qmt6j\" (UID: \"3a8f67f5-e0f7-4d77-a9e3-27edd04f368d\") " pod="openstack/heat-api-7bf7577d5f-qmt6j" Dec 02 09:06:28 crc kubenswrapper[4895]: I1202 09:06:28.045669 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a8f67f5-e0f7-4d77-a9e3-27edd04f368d-config-data\") pod \"heat-api-7bf7577d5f-qmt6j\" (UID: \"3a8f67f5-e0f7-4d77-a9e3-27edd04f368d\") " pod="openstack/heat-api-7bf7577d5f-qmt6j" Dec 02 09:06:28 crc kubenswrapper[4895]: I1202 09:06:28.049811 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a8f67f5-e0f7-4d77-a9e3-27edd04f368d-combined-ca-bundle\") pod \"heat-api-7bf7577d5f-qmt6j\" (UID: \"3a8f67f5-e0f7-4d77-a9e3-27edd04f368d\") " pod="openstack/heat-api-7bf7577d5f-qmt6j" Dec 02 09:06:28 crc kubenswrapper[4895]: I1202 09:06:28.050506 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3a8f67f5-e0f7-4d77-a9e3-27edd04f368d-config-data-custom\") pod \"heat-api-7bf7577d5f-qmt6j\" (UID: \"3a8f67f5-e0f7-4d77-a9e3-27edd04f368d\") " pod="openstack/heat-api-7bf7577d5f-qmt6j" Dec 02 09:06:28 crc kubenswrapper[4895]: I1202 09:06:28.051501 4895 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a8f67f5-e0f7-4d77-a9e3-27edd04f368d-config-data\") pod \"heat-api-7bf7577d5f-qmt6j\" (UID: \"3a8f67f5-e0f7-4d77-a9e3-27edd04f368d\") " pod="openstack/heat-api-7bf7577d5f-qmt6j" Dec 02 09:06:28 crc kubenswrapper[4895]: I1202 09:06:28.077341 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-svdfw\" (UniqueName: \"kubernetes.io/projected/3a8f67f5-e0f7-4d77-a9e3-27edd04f368d-kube-api-access-svdfw\") pod \"heat-api-7bf7577d5f-qmt6j\" (UID: \"3a8f67f5-e0f7-4d77-a9e3-27edd04f368d\") " pod="openstack/heat-api-7bf7577d5f-qmt6j" Dec 02 09:06:28 crc kubenswrapper[4895]: I1202 09:06:28.184989 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-6cf4c6755f-682sl" Dec 02 09:06:28 crc kubenswrapper[4895]: I1202 09:06:28.204357 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-7bf7577d5f-qmt6j" Dec 02 09:06:28 crc kubenswrapper[4895]: I1202 09:06:28.566503 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-5d65795cf7-2g5fx"] Dec 02 09:06:28 crc kubenswrapper[4895]: W1202 09:06:28.573144 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod98206baf_726c_4fa4_ae92_8048974e2e1d.slice/crio-397e0939fc3b98cf7f03d69c966d6930078642890712723ab112298e04bf938a WatchSource:0}: Error finding container 397e0939fc3b98cf7f03d69c966d6930078642890712723ab112298e04bf938a: Status 404 returned error can't find the container with id 397e0939fc3b98cf7f03d69c966d6930078642890712723ab112298e04bf938a Dec 02 09:06:28 crc kubenswrapper[4895]: I1202 09:06:28.953090 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-6cf4c6755f-682sl"] Dec 02 09:06:28 crc kubenswrapper[4895]: W1202 09:06:28.958883 4895 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3a8f67f5_e0f7_4d77_a9e3_27edd04f368d.slice/crio-d160726a8c49ef17ea02085738448ffc716da6ca315fd886cd8f8cfa92caa6c9 WatchSource:0}: Error finding container d160726a8c49ef17ea02085738448ffc716da6ca315fd886cd8f8cfa92caa6c9: Status 404 returned error can't find the container with id d160726a8c49ef17ea02085738448ffc716da6ca315fd886cd8f8cfa92caa6c9 Dec 02 09:06:28 crc kubenswrapper[4895]: I1202 09:06:28.966921 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-7bf7577d5f-qmt6j"] Dec 02 09:06:29 crc kubenswrapper[4895]: I1202 09:06:29.548784 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-7bf7577d5f-qmt6j" event={"ID":"3a8f67f5-e0f7-4d77-a9e3-27edd04f368d","Type":"ContainerStarted","Data":"d160726a8c49ef17ea02085738448ffc716da6ca315fd886cd8f8cfa92caa6c9"} Dec 02 09:06:29 crc kubenswrapper[4895]: I1202 09:06:29.550441 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-5d65795cf7-2g5fx" event={"ID":"98206baf-726c-4fa4-ae92-8048974e2e1d","Type":"ContainerStarted","Data":"93f0f1d0c02e3a54b1a49be0bfb89e1aed278175c597a7c3881b72ade1bc6daf"} Dec 02 09:06:29 crc kubenswrapper[4895]: I1202 09:06:29.550497 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-5d65795cf7-2g5fx" event={"ID":"98206baf-726c-4fa4-ae92-8048974e2e1d","Type":"ContainerStarted","Data":"397e0939fc3b98cf7f03d69c966d6930078642890712723ab112298e04bf938a"} Dec 02 09:06:29 crc kubenswrapper[4895]: I1202 09:06:29.550534 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-5d65795cf7-2g5fx" Dec 02 09:06:29 crc kubenswrapper[4895]: I1202 09:06:29.551496 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-6cf4c6755f-682sl" 
event={"ID":"e8702398-a4fe-4a0b-94db-b747393df1a4","Type":"ContainerStarted","Data":"49619f2ea5e01e1700964afab515ba3994d551ecd7b41262b143bbf09bbc042a"} Dec 02 09:06:29 crc kubenswrapper[4895]: I1202 09:06:29.573344 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-5d65795cf7-2g5fx" podStartSLOduration=2.573320618 podStartE2EDuration="2.573320618s" podCreationTimestamp="2025-12-02 09:06:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:06:29.568714244 +0000 UTC m=+6200.739573877" watchObservedRunningTime="2025-12-02 09:06:29.573320618 +0000 UTC m=+6200.744180231" Dec 02 09:06:31 crc kubenswrapper[4895]: I1202 09:06:31.595503 4895 scope.go:117] "RemoveContainer" containerID="ac39b4ea1f3525107544fb98140ed2c16e32d35c021325ebe0a6fe774e827d2e" Dec 02 09:06:31 crc kubenswrapper[4895]: I1202 09:06:31.719621 4895 scope.go:117] "RemoveContainer" containerID="89cf5bc2f3e3a6102e5f96662b60004fe0576e6ba2cc20d8bb44190d4ad4b432" Dec 02 09:06:31 crc kubenswrapper[4895]: I1202 09:06:31.763356 4895 scope.go:117] "RemoveContainer" containerID="8072ee769b8e8452e2eccecfc680da0cab8d9f6d03cba69fb765cff36cb4e93c" Dec 02 09:06:31 crc kubenswrapper[4895]: I1202 09:06:31.820421 4895 scope.go:117] "RemoveContainer" containerID="7f396d433321705302823b9107bbacfefd5ae4d6825c0b6f6e0a00cfe6dc7c8b" Dec 02 09:06:33 crc kubenswrapper[4895]: I1202 09:06:33.854074 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-7bf7577d5f-qmt6j" event={"ID":"3a8f67f5-e0f7-4d77-a9e3-27edd04f368d","Type":"ContainerStarted","Data":"ffc88cc9b2141e2d51d71c1ed6ab9cf7df3ee6f70bd6d05e193dbdfc5c9a1ade"} Dec 02 09:06:33 crc kubenswrapper[4895]: I1202 09:06:33.856801 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-7bf7577d5f-qmt6j" Dec 02 09:06:33 crc kubenswrapper[4895]: I1202 09:06:33.910054 4895 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-7bf7577d5f-qmt6j" podStartSLOduration=3.005076999 podStartE2EDuration="6.910011079s" podCreationTimestamp="2025-12-02 09:06:27 +0000 UTC" firstStartedPulling="2025-12-02 09:06:28.962653007 +0000 UTC m=+6200.133512620" lastFinishedPulling="2025-12-02 09:06:32.867587087 +0000 UTC m=+6204.038446700" observedRunningTime="2025-12-02 09:06:33.885225558 +0000 UTC m=+6205.056085171" watchObservedRunningTime="2025-12-02 09:06:33.910011079 +0000 UTC m=+6205.080870702" Dec 02 09:06:34 crc kubenswrapper[4895]: I1202 09:06:34.868391 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-6cf4c6755f-682sl" event={"ID":"e8702398-a4fe-4a0b-94db-b747393df1a4","Type":"ContainerStarted","Data":"fc6b9db6cc389582ed4f6d67aba0b724f9801fa5b3311c51cfa0beeb8917fb44"} Dec 02 09:06:34 crc kubenswrapper[4895]: I1202 09:06:34.868771 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-6cf4c6755f-682sl" Dec 02 09:06:34 crc kubenswrapper[4895]: I1202 09:06:34.887109 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-6cf4c6755f-682sl" podStartSLOduration=2.678808391 podStartE2EDuration="7.887088538s" podCreationTimestamp="2025-12-02 09:06:27 +0000 UTC" firstStartedPulling="2025-12-02 09:06:28.964673329 +0000 UTC m=+6200.135532942" lastFinishedPulling="2025-12-02 09:06:34.172953476 +0000 UTC m=+6205.343813089" observedRunningTime="2025-12-02 09:06:34.885909242 +0000 UTC m=+6206.056768865" watchObservedRunningTime="2025-12-02 09:06:34.887088538 +0000 UTC m=+6206.057948151" Dec 02 09:06:35 crc kubenswrapper[4895]: I1202 09:06:35.474652 4895 patch_prober.go:28] interesting pod/machine-config-daemon-wfcg7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Dec 02 09:06:35 crc kubenswrapper[4895]: I1202 09:06:35.475094 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 09:06:35 crc kubenswrapper[4895]: I1202 09:06:35.475155 4895 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" Dec 02 09:06:35 crc kubenswrapper[4895]: I1202 09:06:35.475805 4895 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3e66018704e5440a759c7db87d699ad813d9bb81de4b6aa004c7a6747bba333a"} pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 09:06:35 crc kubenswrapper[4895]: I1202 09:06:35.475865 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerName="machine-config-daemon" containerID="cri-o://3e66018704e5440a759c7db87d699ad813d9bb81de4b6aa004c7a6747bba333a" gracePeriod=600 Dec 02 09:06:35 crc kubenswrapper[4895]: I1202 09:06:35.881415 4895 generic.go:334] "Generic (PLEG): container finished" podID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerID="3e66018704e5440a759c7db87d699ad813d9bb81de4b6aa004c7a6747bba333a" exitCode=0 Dec 02 09:06:35 crc kubenswrapper[4895]: I1202 09:06:35.881482 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" 
event={"ID":"0468d2d1-a975-45a6-af9f-47adc432fab0","Type":"ContainerDied","Data":"3e66018704e5440a759c7db87d699ad813d9bb81de4b6aa004c7a6747bba333a"} Dec 02 09:06:35 crc kubenswrapper[4895]: I1202 09:06:35.881569 4895 scope.go:117] "RemoveContainer" containerID="d95d9d738930381cc247f4992573f82ec7c89226f23be0d771d1a3b1b3cf1812" Dec 02 09:06:36 crc kubenswrapper[4895]: I1202 09:06:36.892500 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" event={"ID":"0468d2d1-a975-45a6-af9f-47adc432fab0","Type":"ContainerStarted","Data":"d41b8f042da7f202a665565fffb111027944b6b1623344c1475552b27b519444"} Dec 02 09:06:37 crc kubenswrapper[4895]: I1202 09:06:37.361010 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-66c85bfb6f-gnhhn" Dec 02 09:06:39 crc kubenswrapper[4895]: I1202 09:06:39.250298 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-66c85bfb6f-gnhhn" Dec 02 09:06:39 crc kubenswrapper[4895]: I1202 09:06:39.347125 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-76bcdfd5df-n7r77"] Dec 02 09:06:39 crc kubenswrapper[4895]: I1202 09:06:39.347382 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-76bcdfd5df-n7r77" podUID="1e401d14-d2d7-4b54-a61d-c40b8125462b" containerName="horizon-log" containerID="cri-o://adab4eb8c95fa8a014dcc333415cbe8ee7b85e0e6016ea8d665bea1999729697" gracePeriod=30 Dec 02 09:06:39 crc kubenswrapper[4895]: I1202 09:06:39.347907 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-76bcdfd5df-n7r77" podUID="1e401d14-d2d7-4b54-a61d-c40b8125462b" containerName="horizon" containerID="cri-o://49bf1446cd01615990a4aeffdfdaf8178a2999c7aaf9150e89df5682af2088ac" gracePeriod=30 Dec 02 09:06:39 crc kubenswrapper[4895]: I1202 09:06:39.929216 4895 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/heat-api-7bf7577d5f-qmt6j" Dec 02 09:06:43 crc kubenswrapper[4895]: I1202 09:06:43.002363 4895 generic.go:334] "Generic (PLEG): container finished" podID="1e401d14-d2d7-4b54-a61d-c40b8125462b" containerID="49bf1446cd01615990a4aeffdfdaf8178a2999c7aaf9150e89df5682af2088ac" exitCode=0 Dec 02 09:06:43 crc kubenswrapper[4895]: I1202 09:06:43.003130 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-76bcdfd5df-n7r77" event={"ID":"1e401d14-d2d7-4b54-a61d-c40b8125462b","Type":"ContainerDied","Data":"49bf1446cd01615990a4aeffdfdaf8178a2999c7aaf9150e89df5682af2088ac"} Dec 02 09:06:43 crc kubenswrapper[4895]: I1202 09:06:43.368518 4895 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-76bcdfd5df-n7r77" podUID="1e401d14-d2d7-4b54-a61d-c40b8125462b" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.111:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.111:8080: connect: connection refused" Dec 02 09:06:44 crc kubenswrapper[4895]: I1202 09:06:44.582333 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-cfnapi-6cf4c6755f-682sl" Dec 02 09:06:47 crc kubenswrapper[4895]: I1202 09:06:47.982670 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-5d65795cf7-2g5fx" Dec 02 09:06:53 crc kubenswrapper[4895]: I1202 09:06:53.368401 4895 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-76bcdfd5df-n7r77" podUID="1e401d14-d2d7-4b54-a61d-c40b8125462b" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.111:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.111:8080: connect: connection refused" Dec 02 09:06:56 crc kubenswrapper[4895]: I1202 09:06:56.881522 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92107ctgq"] Dec 02 
09:06:56 crc kubenswrapper[4895]: I1202 09:06:56.885096 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92107ctgq" Dec 02 09:06:56 crc kubenswrapper[4895]: I1202 09:06:56.888405 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 02 09:06:56 crc kubenswrapper[4895]: I1202 09:06:56.894809 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92107ctgq"] Dec 02 09:06:57 crc kubenswrapper[4895]: I1202 09:06:57.014826 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6tsd\" (UniqueName: \"kubernetes.io/projected/223b0e53-4f79-4ffb-bf12-38b19193e535-kube-api-access-z6tsd\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92107ctgq\" (UID: \"223b0e53-4f79-4ffb-bf12-38b19193e535\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92107ctgq" Dec 02 09:06:57 crc kubenswrapper[4895]: I1202 09:06:57.015278 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/223b0e53-4f79-4ffb-bf12-38b19193e535-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92107ctgq\" (UID: \"223b0e53-4f79-4ffb-bf12-38b19193e535\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92107ctgq" Dec 02 09:06:57 crc kubenswrapper[4895]: I1202 09:06:57.015411 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/223b0e53-4f79-4ffb-bf12-38b19193e535-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92107ctgq\" (UID: \"223b0e53-4f79-4ffb-bf12-38b19193e535\") " 
pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92107ctgq" Dec 02 09:06:57 crc kubenswrapper[4895]: I1202 09:06:57.117580 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6tsd\" (UniqueName: \"kubernetes.io/projected/223b0e53-4f79-4ffb-bf12-38b19193e535-kube-api-access-z6tsd\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92107ctgq\" (UID: \"223b0e53-4f79-4ffb-bf12-38b19193e535\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92107ctgq" Dec 02 09:06:57 crc kubenswrapper[4895]: I1202 09:06:57.118373 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/223b0e53-4f79-4ffb-bf12-38b19193e535-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92107ctgq\" (UID: \"223b0e53-4f79-4ffb-bf12-38b19193e535\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92107ctgq" Dec 02 09:06:57 crc kubenswrapper[4895]: I1202 09:06:57.118427 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/223b0e53-4f79-4ffb-bf12-38b19193e535-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92107ctgq\" (UID: \"223b0e53-4f79-4ffb-bf12-38b19193e535\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92107ctgq" Dec 02 09:06:57 crc kubenswrapper[4895]: I1202 09:06:57.118926 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/223b0e53-4f79-4ffb-bf12-38b19193e535-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92107ctgq\" (UID: \"223b0e53-4f79-4ffb-bf12-38b19193e535\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92107ctgq" Dec 02 09:06:57 crc kubenswrapper[4895]: I1202 09:06:57.119039 4895 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/223b0e53-4f79-4ffb-bf12-38b19193e535-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92107ctgq\" (UID: \"223b0e53-4f79-4ffb-bf12-38b19193e535\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92107ctgq" Dec 02 09:06:57 crc kubenswrapper[4895]: I1202 09:06:57.145914 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6tsd\" (UniqueName: \"kubernetes.io/projected/223b0e53-4f79-4ffb-bf12-38b19193e535-kube-api-access-z6tsd\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92107ctgq\" (UID: \"223b0e53-4f79-4ffb-bf12-38b19193e535\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92107ctgq" Dec 02 09:06:57 crc kubenswrapper[4895]: I1202 09:06:57.226941 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92107ctgq" Dec 02 09:06:57 crc kubenswrapper[4895]: I1202 09:06:57.686545 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92107ctgq"] Dec 02 09:06:58 crc kubenswrapper[4895]: I1202 09:06:58.195560 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92107ctgq" event={"ID":"223b0e53-4f79-4ffb-bf12-38b19193e535","Type":"ContainerStarted","Data":"7e9c206762e6d3ca8c69a797f49550072f676d327888d899aa2c5238185cd6a9"} Dec 02 09:06:58 crc kubenswrapper[4895]: I1202 09:06:58.195937 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92107ctgq" 
event={"ID":"223b0e53-4f79-4ffb-bf12-38b19193e535","Type":"ContainerStarted","Data":"fb76c40ec2ec5ca8f0cad45318ba42dab414bfd07f9b4c60715ad8af98dc12e9"} Dec 02 09:06:59 crc kubenswrapper[4895]: I1202 09:06:59.207128 4895 generic.go:334] "Generic (PLEG): container finished" podID="223b0e53-4f79-4ffb-bf12-38b19193e535" containerID="7e9c206762e6d3ca8c69a797f49550072f676d327888d899aa2c5238185cd6a9" exitCode=0 Dec 02 09:06:59 crc kubenswrapper[4895]: I1202 09:06:59.207285 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92107ctgq" event={"ID":"223b0e53-4f79-4ffb-bf12-38b19193e535","Type":"ContainerDied","Data":"7e9c206762e6d3ca8c69a797f49550072f676d327888d899aa2c5238185cd6a9"} Dec 02 09:07:03 crc kubenswrapper[4895]: I1202 09:07:03.368131 4895 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-76bcdfd5df-n7r77" podUID="1e401d14-d2d7-4b54-a61d-c40b8125462b" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.111:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.111:8080: connect: connection refused" Dec 02 09:07:03 crc kubenswrapper[4895]: I1202 09:07:03.368787 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-76bcdfd5df-n7r77" Dec 02 09:07:04 crc kubenswrapper[4895]: I1202 09:07:04.261723 4895 generic.go:334] "Generic (PLEG): container finished" podID="223b0e53-4f79-4ffb-bf12-38b19193e535" containerID="1129e01637acd94d3fce510cdbe1ee924c8fb8ba7d5779820a29c0c7bb5b8b50" exitCode=0 Dec 02 09:07:04 crc kubenswrapper[4895]: I1202 09:07:04.261840 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92107ctgq" event={"ID":"223b0e53-4f79-4ffb-bf12-38b19193e535","Type":"ContainerDied","Data":"1129e01637acd94d3fce510cdbe1ee924c8fb8ba7d5779820a29c0c7bb5b8b50"} Dec 02 09:07:05 crc kubenswrapper[4895]: I1202 
09:07:05.274299 4895 generic.go:334] "Generic (PLEG): container finished" podID="223b0e53-4f79-4ffb-bf12-38b19193e535" containerID="c4003e42277609ffcfe58144e24911a0a846b2b62da110518dca167bf92c759b" exitCode=0 Dec 02 09:07:05 crc kubenswrapper[4895]: I1202 09:07:05.274344 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92107ctgq" event={"ID":"223b0e53-4f79-4ffb-bf12-38b19193e535","Type":"ContainerDied","Data":"c4003e42277609ffcfe58144e24911a0a846b2b62da110518dca167bf92c759b"} Dec 02 09:07:06 crc kubenswrapper[4895]: I1202 09:07:06.049479 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-rwhl6"] Dec 02 09:07:06 crc kubenswrapper[4895]: I1202 09:07:06.060916 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-p5cfg"] Dec 02 09:07:06 crc kubenswrapper[4895]: I1202 09:07:06.069837 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-rwhl6"] Dec 02 09:07:06 crc kubenswrapper[4895]: I1202 09:07:06.078380 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-p5cfg"] Dec 02 09:07:06 crc kubenswrapper[4895]: I1202 09:07:06.727979 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92107ctgq" Dec 02 09:07:06 crc kubenswrapper[4895]: I1202 09:07:06.829051 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/223b0e53-4f79-4ffb-bf12-38b19193e535-util\") pod \"223b0e53-4f79-4ffb-bf12-38b19193e535\" (UID: \"223b0e53-4f79-4ffb-bf12-38b19193e535\") " Dec 02 09:07:06 crc kubenswrapper[4895]: I1202 09:07:06.829225 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/223b0e53-4f79-4ffb-bf12-38b19193e535-bundle\") pod \"223b0e53-4f79-4ffb-bf12-38b19193e535\" (UID: \"223b0e53-4f79-4ffb-bf12-38b19193e535\") " Dec 02 09:07:06 crc kubenswrapper[4895]: I1202 09:07:06.829299 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z6tsd\" (UniqueName: \"kubernetes.io/projected/223b0e53-4f79-4ffb-bf12-38b19193e535-kube-api-access-z6tsd\") pod \"223b0e53-4f79-4ffb-bf12-38b19193e535\" (UID: \"223b0e53-4f79-4ffb-bf12-38b19193e535\") " Dec 02 09:07:06 crc kubenswrapper[4895]: I1202 09:07:06.831544 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/223b0e53-4f79-4ffb-bf12-38b19193e535-bundle" (OuterVolumeSpecName: "bundle") pod "223b0e53-4f79-4ffb-bf12-38b19193e535" (UID: "223b0e53-4f79-4ffb-bf12-38b19193e535"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:07:06 crc kubenswrapper[4895]: I1202 09:07:06.835534 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/223b0e53-4f79-4ffb-bf12-38b19193e535-kube-api-access-z6tsd" (OuterVolumeSpecName: "kube-api-access-z6tsd") pod "223b0e53-4f79-4ffb-bf12-38b19193e535" (UID: "223b0e53-4f79-4ffb-bf12-38b19193e535"). InnerVolumeSpecName "kube-api-access-z6tsd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:07:06 crc kubenswrapper[4895]: I1202 09:07:06.840989 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/223b0e53-4f79-4ffb-bf12-38b19193e535-util" (OuterVolumeSpecName: "util") pod "223b0e53-4f79-4ffb-bf12-38b19193e535" (UID: "223b0e53-4f79-4ffb-bf12-38b19193e535"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:07:06 crc kubenswrapper[4895]: I1202 09:07:06.932449 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z6tsd\" (UniqueName: \"kubernetes.io/projected/223b0e53-4f79-4ffb-bf12-38b19193e535-kube-api-access-z6tsd\") on node \"crc\" DevicePath \"\"" Dec 02 09:07:06 crc kubenswrapper[4895]: I1202 09:07:06.932487 4895 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/223b0e53-4f79-4ffb-bf12-38b19193e535-util\") on node \"crc\" DevicePath \"\"" Dec 02 09:07:06 crc kubenswrapper[4895]: I1202 09:07:06.932543 4895 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/223b0e53-4f79-4ffb-bf12-38b19193e535-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 09:07:07 crc kubenswrapper[4895]: I1202 09:07:07.037295 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-b117-account-create-update-t7r7w"] Dec 02 09:07:07 crc kubenswrapper[4895]: I1202 09:07:07.047855 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-b117-account-create-update-t7r7w"] Dec 02 09:07:07 crc kubenswrapper[4895]: I1202 09:07:07.060486 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-55cd-account-create-update-vbfqv"] Dec 02 09:07:07 crc kubenswrapper[4895]: I1202 09:07:07.072117 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-ac5f-account-create-update-8qtm2"] Dec 02 09:07:07 crc 
kubenswrapper[4895]: I1202 09:07:07.084687 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-55cd-account-create-update-vbfqv"] Dec 02 09:07:07 crc kubenswrapper[4895]: I1202 09:07:07.096096 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-ac5f-account-create-update-8qtm2"] Dec 02 09:07:07 crc kubenswrapper[4895]: I1202 09:07:07.105606 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-86jd8"] Dec 02 09:07:07 crc kubenswrapper[4895]: I1202 09:07:07.115476 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-86jd8"] Dec 02 09:07:07 crc kubenswrapper[4895]: I1202 09:07:07.155535 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c2079ee-4b91-4755-8f76-9a57e60b27ba" path="/var/lib/kubelet/pods/2c2079ee-4b91-4755-8f76-9a57e60b27ba/volumes" Dec 02 09:07:07 crc kubenswrapper[4895]: I1202 09:07:07.158013 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f04db4c-ba44-4d58-8471-7ad3abfc0eaf" path="/var/lib/kubelet/pods/3f04db4c-ba44-4d58-8471-7ad3abfc0eaf/volumes" Dec 02 09:07:07 crc kubenswrapper[4895]: I1202 09:07:07.160680 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53745490-f6e4-4f78-964b-5a52444211b8" path="/var/lib/kubelet/pods/53745490-f6e4-4f78-964b-5a52444211b8/volumes" Dec 02 09:07:07 crc kubenswrapper[4895]: I1202 09:07:07.162055 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53b6ca0c-81ea-4711-bc3c-d9a7a205543b" path="/var/lib/kubelet/pods/53b6ca0c-81ea-4711-bc3c-d9a7a205543b/volumes" Dec 02 09:07:07 crc kubenswrapper[4895]: I1202 09:07:07.163608 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8c143fa-5ab3-4e36-9da4-69095eedf045" path="/var/lib/kubelet/pods/b8c143fa-5ab3-4e36-9da4-69095eedf045/volumes" Dec 02 09:07:07 crc kubenswrapper[4895]: I1202 09:07:07.165296 4895 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="febef79d-8c1e-4f62-b362-268f7d459291" path="/var/lib/kubelet/pods/febef79d-8c1e-4f62-b362-268f7d459291/volumes" Dec 02 09:07:07 crc kubenswrapper[4895]: I1202 09:07:07.295845 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92107ctgq" event={"ID":"223b0e53-4f79-4ffb-bf12-38b19193e535","Type":"ContainerDied","Data":"fb76c40ec2ec5ca8f0cad45318ba42dab414bfd07f9b4c60715ad8af98dc12e9"} Dec 02 09:07:07 crc kubenswrapper[4895]: I1202 09:07:07.295881 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92107ctgq" Dec 02 09:07:07 crc kubenswrapper[4895]: I1202 09:07:07.295892 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fb76c40ec2ec5ca8f0cad45318ba42dab414bfd07f9b4c60715ad8af98dc12e9" Dec 02 09:07:09 crc kubenswrapper[4895]: I1202 09:07:09.772950 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-76bcdfd5df-n7r77" Dec 02 09:07:09 crc kubenswrapper[4895]: I1202 09:07:09.921404 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1e401d14-d2d7-4b54-a61d-c40b8125462b-logs\") pod \"1e401d14-d2d7-4b54-a61d-c40b8125462b\" (UID: \"1e401d14-d2d7-4b54-a61d-c40b8125462b\") " Dec 02 09:07:09 crc kubenswrapper[4895]: I1202 09:07:09.921565 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1e401d14-d2d7-4b54-a61d-c40b8125462b-horizon-secret-key\") pod \"1e401d14-d2d7-4b54-a61d-c40b8125462b\" (UID: \"1e401d14-d2d7-4b54-a61d-c40b8125462b\") " Dec 02 09:07:09 crc kubenswrapper[4895]: I1202 09:07:09.921653 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1e401d14-d2d7-4b54-a61d-c40b8125462b-config-data\") pod \"1e401d14-d2d7-4b54-a61d-c40b8125462b\" (UID: \"1e401d14-d2d7-4b54-a61d-c40b8125462b\") " Dec 02 09:07:09 crc kubenswrapper[4895]: I1202 09:07:09.922362 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e401d14-d2d7-4b54-a61d-c40b8125462b-logs" (OuterVolumeSpecName: "logs") pod "1e401d14-d2d7-4b54-a61d-c40b8125462b" (UID: "1e401d14-d2d7-4b54-a61d-c40b8125462b"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:07:09 crc kubenswrapper[4895]: I1202 09:07:09.921733 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1e401d14-d2d7-4b54-a61d-c40b8125462b-scripts\") pod \"1e401d14-d2d7-4b54-a61d-c40b8125462b\" (UID: \"1e401d14-d2d7-4b54-a61d-c40b8125462b\") " Dec 02 09:07:09 crc kubenswrapper[4895]: I1202 09:07:09.922571 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9s7ww\" (UniqueName: \"kubernetes.io/projected/1e401d14-d2d7-4b54-a61d-c40b8125462b-kube-api-access-9s7ww\") pod \"1e401d14-d2d7-4b54-a61d-c40b8125462b\" (UID: \"1e401d14-d2d7-4b54-a61d-c40b8125462b\") " Dec 02 09:07:09 crc kubenswrapper[4895]: I1202 09:07:09.923076 4895 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1e401d14-d2d7-4b54-a61d-c40b8125462b-logs\") on node \"crc\" DevicePath \"\"" Dec 02 09:07:09 crc kubenswrapper[4895]: I1202 09:07:09.929040 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e401d14-d2d7-4b54-a61d-c40b8125462b-kube-api-access-9s7ww" (OuterVolumeSpecName: "kube-api-access-9s7ww") pod "1e401d14-d2d7-4b54-a61d-c40b8125462b" (UID: "1e401d14-d2d7-4b54-a61d-c40b8125462b"). InnerVolumeSpecName "kube-api-access-9s7ww". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:07:10 crc kubenswrapper[4895]: I1202 09:07:10.011887 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e401d14-d2d7-4b54-a61d-c40b8125462b-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "1e401d14-d2d7-4b54-a61d-c40b8125462b" (UID: "1e401d14-d2d7-4b54-a61d-c40b8125462b"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:07:10 crc kubenswrapper[4895]: I1202 09:07:10.019241 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e401d14-d2d7-4b54-a61d-c40b8125462b-scripts" (OuterVolumeSpecName: "scripts") pod "1e401d14-d2d7-4b54-a61d-c40b8125462b" (UID: "1e401d14-d2d7-4b54-a61d-c40b8125462b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:07:10 crc kubenswrapper[4895]: I1202 09:07:10.023866 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e401d14-d2d7-4b54-a61d-c40b8125462b-config-data" (OuterVolumeSpecName: "config-data") pod "1e401d14-d2d7-4b54-a61d-c40b8125462b" (UID: "1e401d14-d2d7-4b54-a61d-c40b8125462b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:07:10 crc kubenswrapper[4895]: I1202 09:07:10.024535 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1e401d14-d2d7-4b54-a61d-c40b8125462b-config-data\") pod \"1e401d14-d2d7-4b54-a61d-c40b8125462b\" (UID: \"1e401d14-d2d7-4b54-a61d-c40b8125462b\") " Dec 02 09:07:10 crc kubenswrapper[4895]: I1202 09:07:10.025043 4895 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1e401d14-d2d7-4b54-a61d-c40b8125462b-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 02 09:07:10 crc kubenswrapper[4895]: I1202 09:07:10.025066 4895 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1e401d14-d2d7-4b54-a61d-c40b8125462b-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 09:07:10 crc kubenswrapper[4895]: I1202 09:07:10.025078 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9s7ww\" (UniqueName: 
\"kubernetes.io/projected/1e401d14-d2d7-4b54-a61d-c40b8125462b-kube-api-access-9s7ww\") on node \"crc\" DevicePath \"\"" Dec 02 09:07:10 crc kubenswrapper[4895]: W1202 09:07:10.025112 4895 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/1e401d14-d2d7-4b54-a61d-c40b8125462b/volumes/kubernetes.io~configmap/config-data Dec 02 09:07:10 crc kubenswrapper[4895]: I1202 09:07:10.025156 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e401d14-d2d7-4b54-a61d-c40b8125462b-config-data" (OuterVolumeSpecName: "config-data") pod "1e401d14-d2d7-4b54-a61d-c40b8125462b" (UID: "1e401d14-d2d7-4b54-a61d-c40b8125462b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:07:10 crc kubenswrapper[4895]: I1202 09:07:10.127435 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1e401d14-d2d7-4b54-a61d-c40b8125462b-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 09:07:10 crc kubenswrapper[4895]: I1202 09:07:10.328115 4895 generic.go:334] "Generic (PLEG): container finished" podID="1e401d14-d2d7-4b54-a61d-c40b8125462b" containerID="adab4eb8c95fa8a014dcc333415cbe8ee7b85e0e6016ea8d665bea1999729697" exitCode=137 Dec 02 09:07:10 crc kubenswrapper[4895]: I1202 09:07:10.328175 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-76bcdfd5df-n7r77" event={"ID":"1e401d14-d2d7-4b54-a61d-c40b8125462b","Type":"ContainerDied","Data":"adab4eb8c95fa8a014dcc333415cbe8ee7b85e0e6016ea8d665bea1999729697"} Dec 02 09:07:10 crc kubenswrapper[4895]: I1202 09:07:10.328191 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-76bcdfd5df-n7r77" Dec 02 09:07:10 crc kubenswrapper[4895]: I1202 09:07:10.328227 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-76bcdfd5df-n7r77" event={"ID":"1e401d14-d2d7-4b54-a61d-c40b8125462b","Type":"ContainerDied","Data":"002b9f55b24c432f4eeb0d1842b7617a86fbcc59d1d43fd8c788e2bdc89e6a56"} Dec 02 09:07:10 crc kubenswrapper[4895]: I1202 09:07:10.328251 4895 scope.go:117] "RemoveContainer" containerID="49bf1446cd01615990a4aeffdfdaf8178a2999c7aaf9150e89df5682af2088ac" Dec 02 09:07:10 crc kubenswrapper[4895]: I1202 09:07:10.375005 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-76bcdfd5df-n7r77"] Dec 02 09:07:10 crc kubenswrapper[4895]: I1202 09:07:10.387181 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-76bcdfd5df-n7r77"] Dec 02 09:07:10 crc kubenswrapper[4895]: I1202 09:07:10.521561 4895 scope.go:117] "RemoveContainer" containerID="adab4eb8c95fa8a014dcc333415cbe8ee7b85e0e6016ea8d665bea1999729697" Dec 02 09:07:10 crc kubenswrapper[4895]: I1202 09:07:10.560552 4895 scope.go:117] "RemoveContainer" containerID="49bf1446cd01615990a4aeffdfdaf8178a2999c7aaf9150e89df5682af2088ac" Dec 02 09:07:10 crc kubenswrapper[4895]: E1202 09:07:10.561126 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49bf1446cd01615990a4aeffdfdaf8178a2999c7aaf9150e89df5682af2088ac\": container with ID starting with 49bf1446cd01615990a4aeffdfdaf8178a2999c7aaf9150e89df5682af2088ac not found: ID does not exist" containerID="49bf1446cd01615990a4aeffdfdaf8178a2999c7aaf9150e89df5682af2088ac" Dec 02 09:07:10 crc kubenswrapper[4895]: I1202 09:07:10.561188 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49bf1446cd01615990a4aeffdfdaf8178a2999c7aaf9150e89df5682af2088ac"} err="failed to get container status 
\"49bf1446cd01615990a4aeffdfdaf8178a2999c7aaf9150e89df5682af2088ac\": rpc error: code = NotFound desc = could not find container \"49bf1446cd01615990a4aeffdfdaf8178a2999c7aaf9150e89df5682af2088ac\": container with ID starting with 49bf1446cd01615990a4aeffdfdaf8178a2999c7aaf9150e89df5682af2088ac not found: ID does not exist" Dec 02 09:07:10 crc kubenswrapper[4895]: I1202 09:07:10.561224 4895 scope.go:117] "RemoveContainer" containerID="adab4eb8c95fa8a014dcc333415cbe8ee7b85e0e6016ea8d665bea1999729697" Dec 02 09:07:10 crc kubenswrapper[4895]: E1202 09:07:10.561664 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"adab4eb8c95fa8a014dcc333415cbe8ee7b85e0e6016ea8d665bea1999729697\": container with ID starting with adab4eb8c95fa8a014dcc333415cbe8ee7b85e0e6016ea8d665bea1999729697 not found: ID does not exist" containerID="adab4eb8c95fa8a014dcc333415cbe8ee7b85e0e6016ea8d665bea1999729697" Dec 02 09:07:10 crc kubenswrapper[4895]: I1202 09:07:10.561705 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"adab4eb8c95fa8a014dcc333415cbe8ee7b85e0e6016ea8d665bea1999729697"} err="failed to get container status \"adab4eb8c95fa8a014dcc333415cbe8ee7b85e0e6016ea8d665bea1999729697\": rpc error: code = NotFound desc = could not find container \"adab4eb8c95fa8a014dcc333415cbe8ee7b85e0e6016ea8d665bea1999729697\": container with ID starting with adab4eb8c95fa8a014dcc333415cbe8ee7b85e0e6016ea8d665bea1999729697 not found: ID does not exist" Dec 02 09:07:11 crc kubenswrapper[4895]: I1202 09:07:11.153806 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e401d14-d2d7-4b54-a61d-c40b8125462b" path="/var/lib/kubelet/pods/1e401d14-d2d7-4b54-a61d-c40b8125462b/volumes" Dec 02 09:07:16 crc kubenswrapper[4895]: I1202 09:07:16.133151 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-qgzzz"] Dec 02 09:07:16 
crc kubenswrapper[4895]: I1202 09:07:16.146265 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-qgzzz"] Dec 02 09:07:17 crc kubenswrapper[4895]: I1202 09:07:17.175169 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c9341c1-1d76-442c-b16e-6afcb266c131" path="/var/lib/kubelet/pods/0c9341c1-1d76-442c-b16e-6afcb266c131/volumes" Dec 02 09:07:20 crc kubenswrapper[4895]: I1202 09:07:20.870181 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-sll8w"] Dec 02 09:07:20 crc kubenswrapper[4895]: E1202 09:07:20.870945 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="223b0e53-4f79-4ffb-bf12-38b19193e535" containerName="util" Dec 02 09:07:20 crc kubenswrapper[4895]: I1202 09:07:20.870959 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="223b0e53-4f79-4ffb-bf12-38b19193e535" containerName="util" Dec 02 09:07:20 crc kubenswrapper[4895]: E1202 09:07:20.870979 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="223b0e53-4f79-4ffb-bf12-38b19193e535" containerName="pull" Dec 02 09:07:20 crc kubenswrapper[4895]: I1202 09:07:20.870985 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="223b0e53-4f79-4ffb-bf12-38b19193e535" containerName="pull" Dec 02 09:07:20 crc kubenswrapper[4895]: E1202 09:07:20.870996 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e401d14-d2d7-4b54-a61d-c40b8125462b" containerName="horizon-log" Dec 02 09:07:20 crc kubenswrapper[4895]: I1202 09:07:20.871003 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e401d14-d2d7-4b54-a61d-c40b8125462b" containerName="horizon-log" Dec 02 09:07:20 crc kubenswrapper[4895]: E1202 09:07:20.871017 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="223b0e53-4f79-4ffb-bf12-38b19193e535" containerName="extract" Dec 02 09:07:20 crc kubenswrapper[4895]: I1202 09:07:20.871023 4895 
state_mem.go:107] "Deleted CPUSet assignment" podUID="223b0e53-4f79-4ffb-bf12-38b19193e535" containerName="extract" Dec 02 09:07:20 crc kubenswrapper[4895]: E1202 09:07:20.871030 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e401d14-d2d7-4b54-a61d-c40b8125462b" containerName="horizon" Dec 02 09:07:20 crc kubenswrapper[4895]: I1202 09:07:20.871037 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e401d14-d2d7-4b54-a61d-c40b8125462b" containerName="horizon" Dec 02 09:07:20 crc kubenswrapper[4895]: I1202 09:07:20.871239 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e401d14-d2d7-4b54-a61d-c40b8125462b" containerName="horizon" Dec 02 09:07:20 crc kubenswrapper[4895]: I1202 09:07:20.871251 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e401d14-d2d7-4b54-a61d-c40b8125462b" containerName="horizon-log" Dec 02 09:07:20 crc kubenswrapper[4895]: I1202 09:07:20.871266 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="223b0e53-4f79-4ffb-bf12-38b19193e535" containerName="extract" Dec 02 09:07:20 crc kubenswrapper[4895]: I1202 09:07:20.872028 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-sll8w" Dec 02 09:07:20 crc kubenswrapper[4895]: I1202 09:07:20.874992 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-7qwbd" Dec 02 09:07:20 crc kubenswrapper[4895]: I1202 09:07:20.877084 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Dec 02 09:07:20 crc kubenswrapper[4895]: I1202 09:07:20.877095 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Dec 02 09:07:20 crc kubenswrapper[4895]: I1202 09:07:20.921558 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ww4ch\" (UniqueName: \"kubernetes.io/projected/28dc3134-e709-42e9-b347-b429f8404b0b-kube-api-access-ww4ch\") pod \"obo-prometheus-operator-668cf9dfbb-sll8w\" (UID: \"28dc3134-e709-42e9-b347-b429f8404b0b\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-sll8w" Dec 02 09:07:20 crc kubenswrapper[4895]: I1202 09:07:20.952550 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-sll8w"] Dec 02 09:07:21 crc kubenswrapper[4895]: I1202 09:07:21.023181 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ww4ch\" (UniqueName: \"kubernetes.io/projected/28dc3134-e709-42e9-b347-b429f8404b0b-kube-api-access-ww4ch\") pod \"obo-prometheus-operator-668cf9dfbb-sll8w\" (UID: \"28dc3134-e709-42e9-b347-b429f8404b0b\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-sll8w" Dec 02 09:07:21 crc kubenswrapper[4895]: I1202 09:07:21.053592 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ww4ch\" (UniqueName: \"kubernetes.io/projected/28dc3134-e709-42e9-b347-b429f8404b0b-kube-api-access-ww4ch\") pod 
\"obo-prometheus-operator-668cf9dfbb-sll8w\" (UID: \"28dc3134-e709-42e9-b347-b429f8404b0b\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-sll8w" Dec 02 09:07:21 crc kubenswrapper[4895]: I1202 09:07:21.065757 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-8f5f9fff9-thzt2"] Dec 02 09:07:21 crc kubenswrapper[4895]: I1202 09:07:21.067109 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-8f5f9fff9-thzt2" Dec 02 09:07:21 crc kubenswrapper[4895]: I1202 09:07:21.083108 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-v6pwd" Dec 02 09:07:21 crc kubenswrapper[4895]: I1202 09:07:21.083391 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Dec 02 09:07:21 crc kubenswrapper[4895]: I1202 09:07:21.099764 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-8f5f9fff9-sngvl"] Dec 02 09:07:21 crc kubenswrapper[4895]: I1202 09:07:21.101515 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-8f5f9fff9-sngvl" Dec 02 09:07:21 crc kubenswrapper[4895]: I1202 09:07:21.127936 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/be63c50f-1ea7-4abe-92af-065880aa82bc-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-8f5f9fff9-sngvl\" (UID: \"be63c50f-1ea7-4abe-92af-065880aa82bc\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-8f5f9fff9-sngvl" Dec 02 09:07:21 crc kubenswrapper[4895]: I1202 09:07:21.128041 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4b967677-6953-486c-96f5-8e8d2b7b4735-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-8f5f9fff9-thzt2\" (UID: \"4b967677-6953-486c-96f5-8e8d2b7b4735\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-8f5f9fff9-thzt2" Dec 02 09:07:21 crc kubenswrapper[4895]: I1202 09:07:21.128140 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4b967677-6953-486c-96f5-8e8d2b7b4735-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-8f5f9fff9-thzt2\" (UID: \"4b967677-6953-486c-96f5-8e8d2b7b4735\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-8f5f9fff9-thzt2" Dec 02 09:07:21 crc kubenswrapper[4895]: I1202 09:07:21.128281 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/be63c50f-1ea7-4abe-92af-065880aa82bc-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-8f5f9fff9-sngvl\" (UID: \"be63c50f-1ea7-4abe-92af-065880aa82bc\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-8f5f9fff9-sngvl" 
Dec 02 09:07:21 crc kubenswrapper[4895]: I1202 09:07:21.140230 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-8f5f9fff9-thzt2"] Dec 02 09:07:21 crc kubenswrapper[4895]: I1202 09:07:21.209533 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-sll8w" Dec 02 09:07:21 crc kubenswrapper[4895]: I1202 09:07:21.239890 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4b967677-6953-486c-96f5-8e8d2b7b4735-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-8f5f9fff9-thzt2\" (UID: \"4b967677-6953-486c-96f5-8e8d2b7b4735\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-8f5f9fff9-thzt2" Dec 02 09:07:21 crc kubenswrapper[4895]: I1202 09:07:21.239963 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/be63c50f-1ea7-4abe-92af-065880aa82bc-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-8f5f9fff9-sngvl\" (UID: \"be63c50f-1ea7-4abe-92af-065880aa82bc\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-8f5f9fff9-sngvl" Dec 02 09:07:21 crc kubenswrapper[4895]: I1202 09:07:21.240079 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/be63c50f-1ea7-4abe-92af-065880aa82bc-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-8f5f9fff9-sngvl\" (UID: \"be63c50f-1ea7-4abe-92af-065880aa82bc\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-8f5f9fff9-sngvl" Dec 02 09:07:21 crc kubenswrapper[4895]: I1202 09:07:21.240143 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/4b967677-6953-486c-96f5-8e8d2b7b4735-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-8f5f9fff9-thzt2\" (UID: \"4b967677-6953-486c-96f5-8e8d2b7b4735\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-8f5f9fff9-thzt2" Dec 02 09:07:21 crc kubenswrapper[4895]: I1202 09:07:21.259456 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/be63c50f-1ea7-4abe-92af-065880aa82bc-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-8f5f9fff9-sngvl\" (UID: \"be63c50f-1ea7-4abe-92af-065880aa82bc\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-8f5f9fff9-sngvl" Dec 02 09:07:21 crc kubenswrapper[4895]: I1202 09:07:21.298894 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4b967677-6953-486c-96f5-8e8d2b7b4735-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-8f5f9fff9-thzt2\" (UID: \"4b967677-6953-486c-96f5-8e8d2b7b4735\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-8f5f9fff9-thzt2" Dec 02 09:07:21 crc kubenswrapper[4895]: I1202 09:07:21.300283 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4b967677-6953-486c-96f5-8e8d2b7b4735-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-8f5f9fff9-thzt2\" (UID: \"4b967677-6953-486c-96f5-8e8d2b7b4735\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-8f5f9fff9-thzt2" Dec 02 09:07:21 crc kubenswrapper[4895]: I1202 09:07:21.301258 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/be63c50f-1ea7-4abe-92af-065880aa82bc-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-8f5f9fff9-sngvl\" (UID: \"be63c50f-1ea7-4abe-92af-065880aa82bc\") " 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-8f5f9fff9-sngvl" Dec 02 09:07:21 crc kubenswrapper[4895]: I1202 09:07:21.325519 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-8f5f9fff9-sngvl"] Dec 02 09:07:21 crc kubenswrapper[4895]: I1202 09:07:21.404858 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-2v4cs"] Dec 02 09:07:21 crc kubenswrapper[4895]: I1202 09:07:21.406938 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-2v4cs" Dec 02 09:07:21 crc kubenswrapper[4895]: I1202 09:07:21.410853 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-4zvm9" Dec 02 09:07:21 crc kubenswrapper[4895]: I1202 09:07:21.411146 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Dec 02 09:07:21 crc kubenswrapper[4895]: I1202 09:07:21.425344 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-2v4cs"] Dec 02 09:07:21 crc kubenswrapper[4895]: I1202 09:07:21.440807 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-5446b9c989-zgdhw"] Dec 02 09:07:21 crc kubenswrapper[4895]: I1202 09:07:21.442431 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-zgdhw" Dec 02 09:07:21 crc kubenswrapper[4895]: I1202 09:07:21.450919 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5446b9c989-zgdhw"] Dec 02 09:07:21 crc kubenswrapper[4895]: I1202 09:07:21.457514 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-9j82n" Dec 02 09:07:21 crc kubenswrapper[4895]: I1202 09:07:21.466481 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-8f5f9fff9-thzt2" Dec 02 09:07:21 crc kubenswrapper[4895]: I1202 09:07:21.518010 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-8f5f9fff9-sngvl" Dec 02 09:07:21 crc kubenswrapper[4895]: I1202 09:07:21.572345 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwvpd\" (UniqueName: \"kubernetes.io/projected/42557ee6-0d59-41ae-a224-6f2d6aaac16e-kube-api-access-xwvpd\") pod \"observability-operator-d8bb48f5d-2v4cs\" (UID: \"42557ee6-0d59-41ae-a224-6f2d6aaac16e\") " pod="openshift-operators/observability-operator-d8bb48f5d-2v4cs" Dec 02 09:07:21 crc kubenswrapper[4895]: I1202 09:07:21.572413 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9t6qk\" (UniqueName: \"kubernetes.io/projected/20621657-8f32-4666-b066-24cd34782010-kube-api-access-9t6qk\") pod \"perses-operator-5446b9c989-zgdhw\" (UID: \"20621657-8f32-4666-b066-24cd34782010\") " pod="openshift-operators/perses-operator-5446b9c989-zgdhw" Dec 02 09:07:21 crc kubenswrapper[4895]: I1202 09:07:21.572489 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/20621657-8f32-4666-b066-24cd34782010-openshift-service-ca\") pod \"perses-operator-5446b9c989-zgdhw\" (UID: \"20621657-8f32-4666-b066-24cd34782010\") " pod="openshift-operators/perses-operator-5446b9c989-zgdhw" Dec 02 09:07:21 crc kubenswrapper[4895]: I1202 09:07:21.572602 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/42557ee6-0d59-41ae-a224-6f2d6aaac16e-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-2v4cs\" (UID: \"42557ee6-0d59-41ae-a224-6f2d6aaac16e\") " pod="openshift-operators/observability-operator-d8bb48f5d-2v4cs" Dec 02 09:07:21 crc kubenswrapper[4895]: I1202 09:07:21.676913 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/20621657-8f32-4666-b066-24cd34782010-openshift-service-ca\") pod \"perses-operator-5446b9c989-zgdhw\" (UID: \"20621657-8f32-4666-b066-24cd34782010\") " pod="openshift-operators/perses-operator-5446b9c989-zgdhw" Dec 02 09:07:21 crc kubenswrapper[4895]: I1202 09:07:21.677431 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/42557ee6-0d59-41ae-a224-6f2d6aaac16e-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-2v4cs\" (UID: \"42557ee6-0d59-41ae-a224-6f2d6aaac16e\") " pod="openshift-operators/observability-operator-d8bb48f5d-2v4cs" Dec 02 09:07:21 crc kubenswrapper[4895]: I1202 09:07:21.677502 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwvpd\" (UniqueName: \"kubernetes.io/projected/42557ee6-0d59-41ae-a224-6f2d6aaac16e-kube-api-access-xwvpd\") pod \"observability-operator-d8bb48f5d-2v4cs\" (UID: \"42557ee6-0d59-41ae-a224-6f2d6aaac16e\") " pod="openshift-operators/observability-operator-d8bb48f5d-2v4cs" Dec 02 
09:07:21 crc kubenswrapper[4895]: I1202 09:07:21.677543 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9t6qk\" (UniqueName: \"kubernetes.io/projected/20621657-8f32-4666-b066-24cd34782010-kube-api-access-9t6qk\") pod \"perses-operator-5446b9c989-zgdhw\" (UID: \"20621657-8f32-4666-b066-24cd34782010\") " pod="openshift-operators/perses-operator-5446b9c989-zgdhw" Dec 02 09:07:21 crc kubenswrapper[4895]: I1202 09:07:21.681924 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/20621657-8f32-4666-b066-24cd34782010-openshift-service-ca\") pod \"perses-operator-5446b9c989-zgdhw\" (UID: \"20621657-8f32-4666-b066-24cd34782010\") " pod="openshift-operators/perses-operator-5446b9c989-zgdhw" Dec 02 09:07:21 crc kubenswrapper[4895]: I1202 09:07:21.688919 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/42557ee6-0d59-41ae-a224-6f2d6aaac16e-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-2v4cs\" (UID: \"42557ee6-0d59-41ae-a224-6f2d6aaac16e\") " pod="openshift-operators/observability-operator-d8bb48f5d-2v4cs" Dec 02 09:07:21 crc kubenswrapper[4895]: I1202 09:07:21.719992 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwvpd\" (UniqueName: \"kubernetes.io/projected/42557ee6-0d59-41ae-a224-6f2d6aaac16e-kube-api-access-xwvpd\") pod \"observability-operator-d8bb48f5d-2v4cs\" (UID: \"42557ee6-0d59-41ae-a224-6f2d6aaac16e\") " pod="openshift-operators/observability-operator-d8bb48f5d-2v4cs" Dec 02 09:07:21 crc kubenswrapper[4895]: I1202 09:07:21.722087 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9t6qk\" (UniqueName: \"kubernetes.io/projected/20621657-8f32-4666-b066-24cd34782010-kube-api-access-9t6qk\") pod \"perses-operator-5446b9c989-zgdhw\" (UID: 
\"20621657-8f32-4666-b066-24cd34782010\") " pod="openshift-operators/perses-operator-5446b9c989-zgdhw" Dec 02 09:07:21 crc kubenswrapper[4895]: I1202 09:07:21.741298 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-2v4cs" Dec 02 09:07:21 crc kubenswrapper[4895]: I1202 09:07:21.770939 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-zgdhw" Dec 02 09:07:21 crc kubenswrapper[4895]: I1202 09:07:21.965056 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-sll8w"] Dec 02 09:07:22 crc kubenswrapper[4895]: I1202 09:07:22.198419 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-8f5f9fff9-thzt2"] Dec 02 09:07:22 crc kubenswrapper[4895]: I1202 09:07:22.327771 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-8f5f9fff9-sngvl"] Dec 02 09:07:22 crc kubenswrapper[4895]: I1202 09:07:22.419646 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5446b9c989-zgdhw"] Dec 02 09:07:22 crc kubenswrapper[4895]: W1202 09:07:22.420830 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod20621657_8f32_4666_b066_24cd34782010.slice/crio-a43963d4e50f2a9f010582e8e598abd380aa27df67f6ba37d6d59bfede50ad01 WatchSource:0}: Error finding container a43963d4e50f2a9f010582e8e598abd380aa27df67f6ba37d6d59bfede50ad01: Status 404 returned error can't find the container with id a43963d4e50f2a9f010582e8e598abd380aa27df67f6ba37d6d59bfede50ad01 Dec 02 09:07:22 crc kubenswrapper[4895]: I1202 09:07:22.446759 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-2v4cs"] 
Dec 02 09:07:22 crc kubenswrapper[4895]: W1202 09:07:22.461226 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42557ee6_0d59_41ae_a224_6f2d6aaac16e.slice/crio-7dfd3752be3773c414b3f33150f261d1d60dac2f31c12cc925399d96a7f6d331 WatchSource:0}: Error finding container 7dfd3752be3773c414b3f33150f261d1d60dac2f31c12cc925399d96a7f6d331: Status 404 returned error can't find the container with id 7dfd3752be3773c414b3f33150f261d1d60dac2f31c12cc925399d96a7f6d331
Dec 02 09:07:22 crc kubenswrapper[4895]: I1202 09:07:22.594639 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-8f5f9fff9-sngvl" event={"ID":"be63c50f-1ea7-4abe-92af-065880aa82bc","Type":"ContainerStarted","Data":"24f3b028ec1586661cc1f8f84118e95059b3acdfe6871e2bf2d7021962c06a99"}
Dec 02 09:07:22 crc kubenswrapper[4895]: I1202 09:07:22.597123 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-8f5f9fff9-thzt2" event={"ID":"4b967677-6953-486c-96f5-8e8d2b7b4735","Type":"ContainerStarted","Data":"e428234cc9437ebd76d3a46a64be97f8f4b0adb887d3ec1ef8ee246edd7b1687"}
Dec 02 09:07:22 crc kubenswrapper[4895]: I1202 09:07:22.598716 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-d8bb48f5d-2v4cs" event={"ID":"42557ee6-0d59-41ae-a224-6f2d6aaac16e","Type":"ContainerStarted","Data":"7dfd3752be3773c414b3f33150f261d1d60dac2f31c12cc925399d96a7f6d331"}
Dec 02 09:07:22 crc kubenswrapper[4895]: I1202 09:07:22.600062 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-sll8w" event={"ID":"28dc3134-e709-42e9-b347-b429f8404b0b","Type":"ContainerStarted","Data":"98b5bfd86d01d49837e86c56226b1744d23b667cc2c7764a2a501e02567eb23d"}
Dec 02 09:07:22 crc kubenswrapper[4895]: I1202 09:07:22.601169 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5446b9c989-zgdhw" event={"ID":"20621657-8f32-4666-b066-24cd34782010","Type":"ContainerStarted","Data":"a43963d4e50f2a9f010582e8e598abd380aa27df67f6ba37d6d59bfede50ad01"}
Dec 02 09:07:31 crc kubenswrapper[4895]: I1202 09:07:31.999085 4895 scope.go:117] "RemoveContainer" containerID="37415d328d83cda2a36bd73d9ccf0307bda7503b24038509c10dc0bbd1d99c07"
Dec 02 09:07:35 crc kubenswrapper[4895]: I1202 09:07:35.043309 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-m4pzc"]
Dec 02 09:07:35 crc kubenswrapper[4895]: I1202 09:07:35.060073 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-m4pzc"]
Dec 02 09:07:35 crc kubenswrapper[4895]: I1202 09:07:35.159361 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ddf2c8d7-9918-4162-86a3-68074211ecdb" path="/var/lib/kubelet/pods/ddf2c8d7-9918-4162-86a3-68074211ecdb/volumes"
Dec 02 09:07:35 crc kubenswrapper[4895]: I1202 09:07:35.433613 4895 scope.go:117] "RemoveContainer" containerID="474ca1e736651704b9b7179fb0341b9d7cea3973f5d792d97a2014d345e0e6ce"
Dec 02 09:07:35 crc kubenswrapper[4895]: I1202 09:07:35.551918 4895 scope.go:117] "RemoveContainer" containerID="08764e8907291fc0cd589e94ecb5de2e6a06891b5ff9971afcf38599b9f62c61"
Dec 02 09:07:35 crc kubenswrapper[4895]: I1202 09:07:35.633307 4895 scope.go:117] "RemoveContainer" containerID="7c7141925d7ca869b099b95d2dc6329e23ecbb6e4665edad03d84b68ed82c03a"
Dec 02 09:07:35 crc kubenswrapper[4895]: I1202 09:07:35.784066 4895 scope.go:117] "RemoveContainer" containerID="33933149fc4e814612d9e8a55824e49dbc990abd5ad9a7dddba28905e0caf926"
Dec 02 09:07:35 crc kubenswrapper[4895]: I1202 09:07:35.862051 4895 scope.go:117] "RemoveContainer" containerID="01778b44a5637b1b452f9e8d62db32b8f175d3bcdab0f73044cb4e91880d7d5a"
Dec 02 09:07:35 crc kubenswrapper[4895]: I1202 09:07:35.948537 4895 scope.go:117] "RemoveContainer" containerID="bff4156dcdd4edb34ab340cff93d5c0c7effff7e9e861e491111650ec66c6516"
Dec 02 09:07:36 crc kubenswrapper[4895]: I1202 09:07:36.066975 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-mkdft"]
Dec 02 09:07:36 crc kubenswrapper[4895]: I1202 09:07:36.086585 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-mkdft"]
Dec 02 09:07:36 crc kubenswrapper[4895]: I1202 09:07:36.236717 4895 scope.go:117] "RemoveContainer" containerID="5846aea8535c4c81f42aa817588d7c069916674c6ec58d53f95ba7e50e1afd69"
Dec 02 09:07:36 crc kubenswrapper[4895]: I1202 09:07:36.890058 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-d8bb48f5d-2v4cs" event={"ID":"42557ee6-0d59-41ae-a224-6f2d6aaac16e","Type":"ContainerStarted","Data":"c864f994e7cf050b245b5196070b28178777aa224581c9cb3cdf95207405d5f9"}
Dec 02 09:07:36 crc kubenswrapper[4895]: I1202 09:07:36.891784 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-d8bb48f5d-2v4cs"
Dec 02 09:07:36 crc kubenswrapper[4895]: I1202 09:07:36.893705 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-8f5f9fff9-sngvl" event={"ID":"be63c50f-1ea7-4abe-92af-065880aa82bc","Type":"ContainerStarted","Data":"8722ed121783a8bcc4ada161ba6dcb7202619625c38905f9f82961f53b39df14"}
Dec 02 09:07:36 crc kubenswrapper[4895]: I1202 09:07:36.895510 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-sll8w" event={"ID":"28dc3134-e709-42e9-b347-b429f8404b0b","Type":"ContainerStarted","Data":"3a50b2f0fabc4f1963386966bd9100d5426d42e977e7c254dfae89e8af034af6"}
Dec 02 09:07:36 crc kubenswrapper[4895]: I1202 09:07:36.896972 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-8f5f9fff9-thzt2" event={"ID":"4b967677-6953-486c-96f5-8e8d2b7b4735","Type":"ContainerStarted","Data":"03eb30b89806b65c2c2cb778d8f42e3cdc7294592f10e47c1667a316f9ecc665"}
Dec 02 09:07:36 crc kubenswrapper[4895]: I1202 09:07:36.899097 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-d8bb48f5d-2v4cs"
Dec 02 09:07:36 crc kubenswrapper[4895]: I1202 09:07:36.901657 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5446b9c989-zgdhw" event={"ID":"20621657-8f32-4666-b066-24cd34782010","Type":"ContainerStarted","Data":"22aecc2c2f1f65a418b19dab648a43588406e26df0ee9364cd203073ed59b13c"}
Dec 02 09:07:36 crc kubenswrapper[4895]: I1202 09:07:36.902513 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5446b9c989-zgdhw"
Dec 02 09:07:36 crc kubenswrapper[4895]: I1202 09:07:36.918393 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-d8bb48f5d-2v4cs" podStartSLOduration=2.584815808 podStartE2EDuration="15.918373994s" podCreationTimestamp="2025-12-02 09:07:21 +0000 UTC" firstStartedPulling="2025-12-02 09:07:22.463918761 +0000 UTC m=+6253.634778374" lastFinishedPulling="2025-12-02 09:07:35.797476947 +0000 UTC m=+6266.968336560" observedRunningTime="2025-12-02 09:07:36.914375459 +0000 UTC m=+6268.085235072" watchObservedRunningTime="2025-12-02 09:07:36.918373994 +0000 UTC m=+6268.089233607"
Dec 02 09:07:36 crc kubenswrapper[4895]: I1202 09:07:36.946819 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-8f5f9fff9-sngvl" podStartSLOduration=2.850445377 podStartE2EDuration="15.946778898s" podCreationTimestamp="2025-12-02 09:07:21 +0000 UTC" firstStartedPulling="2025-12-02 09:07:22.337877617 +0000 UTC m=+6253.508737230" lastFinishedPulling="2025-12-02 09:07:35.434211138 +0000 UTC m=+6266.605070751" observedRunningTime="2025-12-02 09:07:36.936069965 +0000 UTC m=+6268.106929588" watchObservedRunningTime="2025-12-02 09:07:36.946778898 +0000 UTC m=+6268.117638531"
Dec 02 09:07:37 crc kubenswrapper[4895]: I1202 09:07:37.009119 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5446b9c989-zgdhw" podStartSLOduration=2.9875611859999998 podStartE2EDuration="16.009080288s" podCreationTimestamp="2025-12-02 09:07:21 +0000 UTC" firstStartedPulling="2025-12-02 09:07:22.424792703 +0000 UTC m=+6253.595652316" lastFinishedPulling="2025-12-02 09:07:35.446311805 +0000 UTC m=+6266.617171418" observedRunningTime="2025-12-02 09:07:36.999487439 +0000 UTC m=+6268.170347052" watchObservedRunningTime="2025-12-02 09:07:37.009080288 +0000 UTC m=+6268.179939901"
Dec 02 09:07:37 crc kubenswrapper[4895]: I1202 09:07:37.033938 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-sll8w" podStartSLOduration=3.585428197 podStartE2EDuration="17.033916151s" podCreationTimestamp="2025-12-02 09:07:20 +0000 UTC" firstStartedPulling="2025-12-02 09:07:21.986803508 +0000 UTC m=+6253.157663121" lastFinishedPulling="2025-12-02 09:07:35.435291472 +0000 UTC m=+6266.606151075" observedRunningTime="2025-12-02 09:07:37.027360767 +0000 UTC m=+6268.198220390" watchObservedRunningTime="2025-12-02 09:07:37.033916151 +0000 UTC m=+6268.204775764"
Dec 02 09:07:37 crc kubenswrapper[4895]: I1202 09:07:37.060470 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-8f5f9fff9-thzt2" podStartSLOduration=2.838530207 podStartE2EDuration="16.060447557s" podCreationTimestamp="2025-12-02 09:07:21 +0000 UTC" firstStartedPulling="2025-12-02 09:07:22.211813313 +0000 UTC m=+6253.382672926" lastFinishedPulling="2025-12-02 09:07:35.433730663 +0000 UTC m=+6266.604590276" observedRunningTime="2025-12-02 09:07:37.055833184 +0000 UTC m=+6268.226692817" watchObservedRunningTime="2025-12-02 09:07:37.060447557 +0000 UTC m=+6268.231307180"
Dec 02 09:07:37 crc kubenswrapper[4895]: I1202 09:07:37.166126 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9bf5d46f-feea-4549-ad6c-3bf285b528ff" path="/var/lib/kubelet/pods/9bf5d46f-feea-4549-ad6c-3bf285b528ff/volumes"
Dec 02 09:07:41 crc kubenswrapper[4895]: I1202 09:07:41.774578 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5446b9c989-zgdhw"
Dec 02 09:07:44 crc kubenswrapper[4895]: I1202 09:07:44.593389 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"]
Dec 02 09:07:44 crc kubenswrapper[4895]: I1202 09:07:44.595184 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="cfe1444d-9391-4b5b-a770-14e55da2a63d" containerName="openstackclient" containerID="cri-o://f20b52de7232adf766d00e3e7d74feff75df26b00c31c85e3d851ff09c011c1f" gracePeriod=2
Dec 02 09:07:44 crc kubenswrapper[4895]: I1202 09:07:44.605419 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"]
Dec 02 09:07:44 crc kubenswrapper[4895]: I1202 09:07:44.661880 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"]
Dec 02 09:07:44 crc kubenswrapper[4895]: E1202 09:07:44.662661 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfe1444d-9391-4b5b-a770-14e55da2a63d" containerName="openstackclient"
Dec 02 09:07:44 crc kubenswrapper[4895]: I1202 09:07:44.662681 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfe1444d-9391-4b5b-a770-14e55da2a63d" containerName="openstackclient"
Dec 02 09:07:44 crc kubenswrapper[4895]: I1202 09:07:44.671593 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfe1444d-9391-4b5b-a770-14e55da2a63d" containerName="openstackclient"
Dec 02 09:07:44 crc kubenswrapper[4895]: I1202 09:07:44.672500 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Dec 02 09:07:44 crc kubenswrapper[4895]: I1202 09:07:44.744397 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Dec 02 09:07:44 crc kubenswrapper[4895]: I1202 09:07:44.774129 4895 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="cfe1444d-9391-4b5b-a770-14e55da2a63d" podUID="23eecc3a-5577-4505-8ecc-768aaf5228e6"
Dec 02 09:07:44 crc kubenswrapper[4895]: I1202 09:07:44.824311 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/23eecc3a-5577-4505-8ecc-768aaf5228e6-openstack-config-secret\") pod \"openstackclient\" (UID: \"23eecc3a-5577-4505-8ecc-768aaf5228e6\") " pod="openstack/openstackclient"
Dec 02 09:07:44 crc kubenswrapper[4895]: I1202 09:07:44.858783 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/23eecc3a-5577-4505-8ecc-768aaf5228e6-openstack-config\") pod \"openstackclient\" (UID: \"23eecc3a-5577-4505-8ecc-768aaf5228e6\") " pod="openstack/openstackclient"
Dec 02 09:07:44 crc kubenswrapper[4895]: I1202 09:07:44.859067 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8g5qn\" (UniqueName: \"kubernetes.io/projected/23eecc3a-5577-4505-8ecc-768aaf5228e6-kube-api-access-8g5qn\") pod \"openstackclient\" (UID: \"23eecc3a-5577-4505-8ecc-768aaf5228e6\") " pod="openstack/openstackclient"
Dec 02 09:07:44 crc kubenswrapper[4895]: I1202 09:07:44.961675 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/23eecc3a-5577-4505-8ecc-768aaf5228e6-openstack-config\") pod \"openstackclient\" (UID: \"23eecc3a-5577-4505-8ecc-768aaf5228e6\") " pod="openstack/openstackclient"
Dec 02 09:07:44 crc kubenswrapper[4895]: I1202 09:07:44.961806 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8g5qn\" (UniqueName: \"kubernetes.io/projected/23eecc3a-5577-4505-8ecc-768aaf5228e6-kube-api-access-8g5qn\") pod \"openstackclient\" (UID: \"23eecc3a-5577-4505-8ecc-768aaf5228e6\") " pod="openstack/openstackclient"
Dec 02 09:07:44 crc kubenswrapper[4895]: I1202 09:07:44.961885 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/23eecc3a-5577-4505-8ecc-768aaf5228e6-openstack-config-secret\") pod \"openstackclient\" (UID: \"23eecc3a-5577-4505-8ecc-768aaf5228e6\") " pod="openstack/openstackclient"
Dec 02 09:07:44 crc kubenswrapper[4895]: I1202 09:07:44.963001 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/23eecc3a-5577-4505-8ecc-768aaf5228e6-openstack-config\") pod \"openstackclient\" (UID: \"23eecc3a-5577-4505-8ecc-768aaf5228e6\") " pod="openstack/openstackclient"
Dec 02 09:07:44 crc kubenswrapper[4895]: I1202 09:07:44.972675 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/23eecc3a-5577-4505-8ecc-768aaf5228e6-openstack-config-secret\") pod \"openstackclient\" (UID: \"23eecc3a-5577-4505-8ecc-768aaf5228e6\") " pod="openstack/openstackclient"
Dec 02 09:07:45 crc kubenswrapper[4895]: I1202 09:07:45.037841 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8g5qn\" (UniqueName: \"kubernetes.io/projected/23eecc3a-5577-4505-8ecc-768aaf5228e6-kube-api-access-8g5qn\") pod \"openstackclient\" (UID: \"23eecc3a-5577-4505-8ecc-768aaf5228e6\") " pod="openstack/openstackclient"
Dec 02 09:07:45 crc kubenswrapper[4895]: I1202 09:07:45.078836 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"]
Dec 02 09:07:45 crc kubenswrapper[4895]: I1202 09:07:45.080856 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Dec 02 09:07:45 crc kubenswrapper[4895]: I1202 09:07:45.090167 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Dec 02 09:07:45 crc kubenswrapper[4895]: I1202 09:07:45.111098 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Dec 02 09:07:45 crc kubenswrapper[4895]: I1202 09:07:45.137014 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-5vbxq"
Dec 02 09:07:45 crc kubenswrapper[4895]: I1202 09:07:45.173099 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5gdwd\" (UniqueName: \"kubernetes.io/projected/3b4b8f04-dca7-4b45-b66d-2a75b8c506cf-kube-api-access-5gdwd\") pod \"kube-state-metrics-0\" (UID: \"3b4b8f04-dca7-4b45-b66d-2a75b8c506cf\") " pod="openstack/kube-state-metrics-0"
Dec 02 09:07:45 crc kubenswrapper[4895]: I1202 09:07:45.276418 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5gdwd\" (UniqueName: \"kubernetes.io/projected/3b4b8f04-dca7-4b45-b66d-2a75b8c506cf-kube-api-access-5gdwd\") pod \"kube-state-metrics-0\" (UID: \"3b4b8f04-dca7-4b45-b66d-2a75b8c506cf\") " pod="openstack/kube-state-metrics-0"
Dec 02 09:07:45 crc kubenswrapper[4895]: I1202 09:07:45.355284 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5gdwd\" (UniqueName: \"kubernetes.io/projected/3b4b8f04-dca7-4b45-b66d-2a75b8c506cf-kube-api-access-5gdwd\") pod \"kube-state-metrics-0\" (UID: \"3b4b8f04-dca7-4b45-b66d-2a75b8c506cf\") " pod="openstack/kube-state-metrics-0"
Dec 02 09:07:45 crc kubenswrapper[4895]: I1202 09:07:45.554286 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Dec 02 09:07:46 crc kubenswrapper[4895]: I1202 09:07:46.334457 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/alertmanager-metric-storage-0"]
Dec 02 09:07:46 crc kubenswrapper[4895]: I1202 09:07:46.348300 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/alertmanager-metric-storage-0"
Dec 02 09:07:46 crc kubenswrapper[4895]: I1202 09:07:46.355883 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-alertmanager-dockercfg-k88hx"
Dec 02 09:07:46 crc kubenswrapper[4895]: I1202 09:07:46.364593 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-generated"
Dec 02 09:07:46 crc kubenswrapper[4895]: I1202 09:07:46.364676 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-tls-assets-0"
Dec 02 09:07:46 crc kubenswrapper[4895]: I1202 09:07:46.364622 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-web-config"
Dec 02 09:07:46 crc kubenswrapper[4895]: I1202 09:07:46.364944 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-cluster-tls-config"
Dec 02 09:07:46 crc kubenswrapper[4895]: I1202 09:07:46.395550 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"]
Dec 02 09:07:46 crc kubenswrapper[4895]: I1202 09:07:46.435403 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/5365a5b3-61a8-47cf-a99e-6425e6af3784-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"5365a5b3-61a8-47cf-a99e-6425e6af3784\") " pod="openstack/alertmanager-metric-storage-0"
Dec 02 09:07:46 crc kubenswrapper[4895]: I1202 09:07:46.436379 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/5365a5b3-61a8-47cf-a99e-6425e6af3784-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"5365a5b3-61a8-47cf-a99e-6425e6af3784\") " pod="openstack/alertmanager-metric-storage-0"
Dec 02 09:07:46 crc kubenswrapper[4895]: I1202 09:07:46.436793 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/5365a5b3-61a8-47cf-a99e-6425e6af3784-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"5365a5b3-61a8-47cf-a99e-6425e6af3784\") " pod="openstack/alertmanager-metric-storage-0"
Dec 02 09:07:46 crc kubenswrapper[4895]: I1202 09:07:46.437080 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/5365a5b3-61a8-47cf-a99e-6425e6af3784-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"5365a5b3-61a8-47cf-a99e-6425e6af3784\") " pod="openstack/alertmanager-metric-storage-0"
Dec 02 09:07:46 crc kubenswrapper[4895]: I1202 09:07:46.438187 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/5365a5b3-61a8-47cf-a99e-6425e6af3784-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"5365a5b3-61a8-47cf-a99e-6425e6af3784\") " pod="openstack/alertmanager-metric-storage-0"
Dec 02 09:07:46 crc kubenswrapper[4895]: I1202 09:07:46.438355 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbnv5\" (UniqueName: \"kubernetes.io/projected/5365a5b3-61a8-47cf-a99e-6425e6af3784-kube-api-access-xbnv5\") pod \"alertmanager-metric-storage-0\" (UID: \"5365a5b3-61a8-47cf-a99e-6425e6af3784\") " pod="openstack/alertmanager-metric-storage-0"
Dec 02 09:07:46 crc kubenswrapper[4895]: I1202 09:07:46.438527 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/5365a5b3-61a8-47cf-a99e-6425e6af3784-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"5365a5b3-61a8-47cf-a99e-6425e6af3784\") " pod="openstack/alertmanager-metric-storage-0"
Dec 02 09:07:46 crc kubenswrapper[4895]: I1202 09:07:46.540248 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/5365a5b3-61a8-47cf-a99e-6425e6af3784-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"5365a5b3-61a8-47cf-a99e-6425e6af3784\") " pod="openstack/alertmanager-metric-storage-0"
Dec 02 09:07:46 crc kubenswrapper[4895]: I1202 09:07:46.540697 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/5365a5b3-61a8-47cf-a99e-6425e6af3784-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"5365a5b3-61a8-47cf-a99e-6425e6af3784\") " pod="openstack/alertmanager-metric-storage-0"
Dec 02 09:07:46 crc kubenswrapper[4895]: I1202 09:07:46.540901 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/5365a5b3-61a8-47cf-a99e-6425e6af3784-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"5365a5b3-61a8-47cf-a99e-6425e6af3784\") " pod="openstack/alertmanager-metric-storage-0"
Dec 02 09:07:46 crc kubenswrapper[4895]: I1202 09:07:46.541042 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbnv5\" (UniqueName: \"kubernetes.io/projected/5365a5b3-61a8-47cf-a99e-6425e6af3784-kube-api-access-xbnv5\") pod \"alertmanager-metric-storage-0\" (UID: \"5365a5b3-61a8-47cf-a99e-6425e6af3784\") " pod="openstack/alertmanager-metric-storage-0"
Dec 02 09:07:46 crc kubenswrapper[4895]: I1202 09:07:46.541200 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/5365a5b3-61a8-47cf-a99e-6425e6af3784-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"5365a5b3-61a8-47cf-a99e-6425e6af3784\") " pod="openstack/alertmanager-metric-storage-0"
Dec 02 09:07:46 crc kubenswrapper[4895]: I1202 09:07:46.541338 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/5365a5b3-61a8-47cf-a99e-6425e6af3784-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"5365a5b3-61a8-47cf-a99e-6425e6af3784\") " pod="openstack/alertmanager-metric-storage-0"
Dec 02 09:07:46 crc kubenswrapper[4895]: I1202 09:07:46.541478 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/5365a5b3-61a8-47cf-a99e-6425e6af3784-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"5365a5b3-61a8-47cf-a99e-6425e6af3784\") " pod="openstack/alertmanager-metric-storage-0"
Dec 02 09:07:46 crc kubenswrapper[4895]: I1202 09:07:46.545353 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/5365a5b3-61a8-47cf-a99e-6425e6af3784-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"5365a5b3-61a8-47cf-a99e-6425e6af3784\") " pod="openstack/alertmanager-metric-storage-0"
Dec 02 09:07:46 crc kubenswrapper[4895]: I1202 09:07:46.546689 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/5365a5b3-61a8-47cf-a99e-6425e6af3784-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"5365a5b3-61a8-47cf-a99e-6425e6af3784\") " pod="openstack/alertmanager-metric-storage-0"
Dec 02 09:07:46 crc kubenswrapper[4895]: I1202 09:07:46.547412 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/5365a5b3-61a8-47cf-a99e-6425e6af3784-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"5365a5b3-61a8-47cf-a99e-6425e6af3784\") " pod="openstack/alertmanager-metric-storage-0"
Dec 02 09:07:46 crc kubenswrapper[4895]: I1202 09:07:46.548386 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/5365a5b3-61a8-47cf-a99e-6425e6af3784-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"5365a5b3-61a8-47cf-a99e-6425e6af3784\") " pod="openstack/alertmanager-metric-storage-0"
Dec 02 09:07:46 crc kubenswrapper[4895]: I1202 09:07:46.551525 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/5365a5b3-61a8-47cf-a99e-6425e6af3784-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"5365a5b3-61a8-47cf-a99e-6425e6af3784\") " pod="openstack/alertmanager-metric-storage-0"
Dec 02 09:07:46 crc kubenswrapper[4895]: I1202 09:07:46.554451 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Dec 02 09:07:46 crc kubenswrapper[4895]: I1202 09:07:46.561024 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/5365a5b3-61a8-47cf-a99e-6425e6af3784-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"5365a5b3-61a8-47cf-a99e-6425e6af3784\") " pod="openstack/alertmanager-metric-storage-0"
Dec 02 09:07:46 crc kubenswrapper[4895]: I1202 09:07:46.576565 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbnv5\" (UniqueName: \"kubernetes.io/projected/5365a5b3-61a8-47cf-a99e-6425e6af3784-kube-api-access-xbnv5\") pod \"alertmanager-metric-storage-0\" (UID: \"5365a5b3-61a8-47cf-a99e-6425e6af3784\") " pod="openstack/alertmanager-metric-storage-0"
Dec 02 09:07:46 crc kubenswrapper[4895]: I1202 09:07:46.700477 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/alertmanager-metric-storage-0"
Dec 02 09:07:46 crc kubenswrapper[4895]: I1202 09:07:46.759997 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Dec 02 09:07:46 crc kubenswrapper[4895]: W1202 09:07:46.780781 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3b4b8f04_dca7_4b45_b66d_2a75b8c506cf.slice/crio-f6f0d49eee84f236983579bea4e0c8852803697befe5927cc920ce74035bfef5 WatchSource:0}: Error finding container f6f0d49eee84f236983579bea4e0c8852803697befe5927cc920ce74035bfef5: Status 404 returned error can't find the container with id f6f0d49eee84f236983579bea4e0c8852803697befe5927cc920ce74035bfef5
Dec 02 09:07:46 crc kubenswrapper[4895]: I1202 09:07:46.786633 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"]
Dec 02 09:07:46 crc kubenswrapper[4895]: I1202 09:07:46.790421 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0"
Dec 02 09:07:46 crc kubenswrapper[4895]: I1202 09:07:46.798149 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file"
Dec 02 09:07:46 crc kubenswrapper[4895]: I1202 09:07:46.798151 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config"
Dec 02 09:07:46 crc kubenswrapper[4895]: I1202 09:07:46.798151 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0"
Dec 02 09:07:46 crc kubenswrapper[4895]: I1202 09:07:46.798442 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-9jj7n"
Dec 02 09:07:46 crc kubenswrapper[4895]: I1202 09:07:46.801709 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage"
Dec 02 09:07:46 crc kubenswrapper[4895]: I1202 09:07:46.802997 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0"
Dec 02 09:07:46 crc kubenswrapper[4895]: I1202 09:07:46.825909 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Dec 02 09:07:46 crc kubenswrapper[4895]: I1202 09:07:46.851000 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/38a14d48-8eb7-44be-b29b-5a8574b72d91-config\") pod \"prometheus-metric-storage-0\" (UID: \"38a14d48-8eb7-44be-b29b-5a8574b72d91\") " pod="openstack/prometheus-metric-storage-0"
Dec 02 09:07:46 crc kubenswrapper[4895]: I1202 09:07:46.851054 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/38a14d48-8eb7-44be-b29b-5a8574b72d91-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"38a14d48-8eb7-44be-b29b-5a8574b72d91\") " pod="openstack/prometheus-metric-storage-0"
Dec 02 09:07:46 crc kubenswrapper[4895]: I1202 09:07:46.851095 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-53b54dcf-8758-4f43-88e1-5bc45cac088b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-53b54dcf-8758-4f43-88e1-5bc45cac088b\") pod \"prometheus-metric-storage-0\" (UID: \"38a14d48-8eb7-44be-b29b-5a8574b72d91\") " pod="openstack/prometheus-metric-storage-0"
Dec 02 09:07:46 crc kubenswrapper[4895]: I1202 09:07:46.851166 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/38a14d48-8eb7-44be-b29b-5a8574b72d91-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"38a14d48-8eb7-44be-b29b-5a8574b72d91\") " pod="openstack/prometheus-metric-storage-0"
Dec 02 09:07:46 crc kubenswrapper[4895]: I1202 09:07:46.851192 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/38a14d48-8eb7-44be-b29b-5a8574b72d91-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"38a14d48-8eb7-44be-b29b-5a8574b72d91\") " pod="openstack/prometheus-metric-storage-0"
Dec 02 09:07:46 crc kubenswrapper[4895]: I1202 09:07:46.851212 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/38a14d48-8eb7-44be-b29b-5a8574b72d91-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"38a14d48-8eb7-44be-b29b-5a8574b72d91\") " pod="openstack/prometheus-metric-storage-0"
Dec 02 09:07:46 crc kubenswrapper[4895]: I1202 09:07:46.851227 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/38a14d48-8eb7-44be-b29b-5a8574b72d91-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"38a14d48-8eb7-44be-b29b-5a8574b72d91\") " pod="openstack/prometheus-metric-storage-0"
Dec 02 09:07:46 crc kubenswrapper[4895]: I1202 09:07:46.851280 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mngwc\" (UniqueName: \"kubernetes.io/projected/38a14d48-8eb7-44be-b29b-5a8574b72d91-kube-api-access-mngwc\") pod \"prometheus-metric-storage-0\" (UID: \"38a14d48-8eb7-44be-b29b-5a8574b72d91\") " pod="openstack/prometheus-metric-storage-0"
Dec 02 09:07:46 crc kubenswrapper[4895]: I1202 09:07:46.952710 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mngwc\" (UniqueName: \"kubernetes.io/projected/38a14d48-8eb7-44be-b29b-5a8574b72d91-kube-api-access-mngwc\") pod \"prometheus-metric-storage-0\" (UID: \"38a14d48-8eb7-44be-b29b-5a8574b72d91\") " pod="openstack/prometheus-metric-storage-0"
Dec 02 09:07:46 crc kubenswrapper[4895]: I1202 09:07:46.952848 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/38a14d48-8eb7-44be-b29b-5a8574b72d91-config\") pod \"prometheus-metric-storage-0\" (UID: \"38a14d48-8eb7-44be-b29b-5a8574b72d91\") " pod="openstack/prometheus-metric-storage-0"
Dec 02 09:07:46 crc kubenswrapper[4895]: I1202 09:07:46.952880 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/38a14d48-8eb7-44be-b29b-5a8574b72d91-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"38a14d48-8eb7-44be-b29b-5a8574b72d91\") " pod="openstack/prometheus-metric-storage-0"
Dec 02 09:07:46 crc kubenswrapper[4895]: I1202 09:07:46.952930 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-53b54dcf-8758-4f43-88e1-5bc45cac088b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-53b54dcf-8758-4f43-88e1-5bc45cac088b\") pod \"prometheus-metric-storage-0\" (UID: \"38a14d48-8eb7-44be-b29b-5a8574b72d91\") " pod="openstack/prometheus-metric-storage-0"
Dec 02 09:07:46 crc kubenswrapper[4895]: I1202 09:07:46.953025 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/38a14d48-8eb7-44be-b29b-5a8574b72d91-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"38a14d48-8eb7-44be-b29b-5a8574b72d91\") " pod="openstack/prometheus-metric-storage-0"
Dec 02 09:07:46 crc kubenswrapper[4895]: I1202 09:07:46.953058 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/38a14d48-8eb7-44be-b29b-5a8574b72d91-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"38a14d48-8eb7-44be-b29b-5a8574b72d91\") " pod="openstack/prometheus-metric-storage-0"
Dec 02 09:07:46 crc kubenswrapper[4895]: I1202 09:07:46.953077 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/38a14d48-8eb7-44be-b29b-5a8574b72d91-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"38a14d48-8eb7-44be-b29b-5a8574b72d91\") " pod="openstack/prometheus-metric-storage-0"
Dec 02 09:07:46 crc kubenswrapper[4895]: I1202 09:07:46.953094 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/38a14d48-8eb7-44be-b29b-5a8574b72d91-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"38a14d48-8eb7-44be-b29b-5a8574b72d91\") " pod="openstack/prometheus-metric-storage-0"
Dec 02 09:07:46 crc 
kubenswrapper[4895]: I1202 09:07:46.962965 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/38a14d48-8eb7-44be-b29b-5a8574b72d91-config\") pod \"prometheus-metric-storage-0\" (UID: \"38a14d48-8eb7-44be-b29b-5a8574b72d91\") " pod="openstack/prometheus-metric-storage-0" Dec 02 09:07:46 crc kubenswrapper[4895]: I1202 09:07:46.963777 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/38a14d48-8eb7-44be-b29b-5a8574b72d91-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"38a14d48-8eb7-44be-b29b-5a8574b72d91\") " pod="openstack/prometheus-metric-storage-0" Dec 02 09:07:46 crc kubenswrapper[4895]: I1202 09:07:46.969106 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/38a14d48-8eb7-44be-b29b-5a8574b72d91-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"38a14d48-8eb7-44be-b29b-5a8574b72d91\") " pod="openstack/prometheus-metric-storage-0" Dec 02 09:07:46 crc kubenswrapper[4895]: I1202 09:07:46.970338 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/38a14d48-8eb7-44be-b29b-5a8574b72d91-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"38a14d48-8eb7-44be-b29b-5a8574b72d91\") " pod="openstack/prometheus-metric-storage-0" Dec 02 09:07:46 crc kubenswrapper[4895]: I1202 09:07:46.970985 4895 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 02 09:07:46 crc kubenswrapper[4895]: I1202 09:07:46.971009 4895 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-53b54dcf-8758-4f43-88e1-5bc45cac088b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-53b54dcf-8758-4f43-88e1-5bc45cac088b\") pod \"prometheus-metric-storage-0\" (UID: \"38a14d48-8eb7-44be-b29b-5a8574b72d91\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/7364876744e0be7be4c86a37bebd37676157c371041029f55539fed90d724f4e/globalmount\"" pod="openstack/prometheus-metric-storage-0" Dec 02 09:07:46 crc kubenswrapper[4895]: I1202 09:07:46.976862 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/38a14d48-8eb7-44be-b29b-5a8574b72d91-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"38a14d48-8eb7-44be-b29b-5a8574b72d91\") " pod="openstack/prometheus-metric-storage-0" Dec 02 09:07:46 crc kubenswrapper[4895]: I1202 09:07:46.982275 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/38a14d48-8eb7-44be-b29b-5a8574b72d91-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"38a14d48-8eb7-44be-b29b-5a8574b72d91\") " pod="openstack/prometheus-metric-storage-0" Dec 02 09:07:46 crc kubenswrapper[4895]: I1202 09:07:46.999573 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mngwc\" (UniqueName: \"kubernetes.io/projected/38a14d48-8eb7-44be-b29b-5a8574b72d91-kube-api-access-mngwc\") pod \"prometheus-metric-storage-0\" (UID: \"38a14d48-8eb7-44be-b29b-5a8574b72d91\") " pod="openstack/prometheus-metric-storage-0" Dec 02 09:07:47 crc kubenswrapper[4895]: I1202 09:07:47.067599 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" 
event={"ID":"23eecc3a-5577-4505-8ecc-768aaf5228e6","Type":"ContainerStarted","Data":"040150bd9b5aa4ed65b7220ac03480cf1a24394906d11565ae6113b8ecedfc7b"} Dec 02 09:07:47 crc kubenswrapper[4895]: I1202 09:07:47.074789 4895 generic.go:334] "Generic (PLEG): container finished" podID="cfe1444d-9391-4b5b-a770-14e55da2a63d" containerID="f20b52de7232adf766d00e3e7d74feff75df26b00c31c85e3d851ff09c011c1f" exitCode=137 Dec 02 09:07:47 crc kubenswrapper[4895]: I1202 09:07:47.100419 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"3b4b8f04-dca7-4b45-b66d-2a75b8c506cf","Type":"ContainerStarted","Data":"f6f0d49eee84f236983579bea4e0c8852803697befe5927cc920ce74035bfef5"} Dec 02 09:07:47 crc kubenswrapper[4895]: I1202 09:07:47.122312 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-53b54dcf-8758-4f43-88e1-5bc45cac088b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-53b54dcf-8758-4f43-88e1-5bc45cac088b\") pod \"prometheus-metric-storage-0\" (UID: \"38a14d48-8eb7-44be-b29b-5a8574b72d91\") " pod="openstack/prometheus-metric-storage-0" Dec 02 09:07:47 crc kubenswrapper[4895]: I1202 09:07:47.210178 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 02 09:07:48 crc kubenswrapper[4895]: I1202 09:07:47.227216 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 02 09:07:48 crc kubenswrapper[4895]: I1202 09:07:47.264138 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/cfe1444d-9391-4b5b-a770-14e55da2a63d-openstack-config\") pod \"cfe1444d-9391-4b5b-a770-14e55da2a63d\" (UID: \"cfe1444d-9391-4b5b-a770-14e55da2a63d\") " Dec 02 09:07:48 crc kubenswrapper[4895]: I1202 09:07:47.264199 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/cfe1444d-9391-4b5b-a770-14e55da2a63d-openstack-config-secret\") pod \"cfe1444d-9391-4b5b-a770-14e55da2a63d\" (UID: \"cfe1444d-9391-4b5b-a770-14e55da2a63d\") " Dec 02 09:07:48 crc kubenswrapper[4895]: I1202 09:07:47.264240 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b5gtm\" (UniqueName: \"kubernetes.io/projected/cfe1444d-9391-4b5b-a770-14e55da2a63d-kube-api-access-b5gtm\") pod \"cfe1444d-9391-4b5b-a770-14e55da2a63d\" (UID: \"cfe1444d-9391-4b5b-a770-14e55da2a63d\") " Dec 02 09:07:48 crc kubenswrapper[4895]: I1202 09:07:47.275183 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfe1444d-9391-4b5b-a770-14e55da2a63d-kube-api-access-b5gtm" (OuterVolumeSpecName: "kube-api-access-b5gtm") pod "cfe1444d-9391-4b5b-a770-14e55da2a63d" (UID: "cfe1444d-9391-4b5b-a770-14e55da2a63d"). InnerVolumeSpecName "kube-api-access-b5gtm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:07:48 crc kubenswrapper[4895]: I1202 09:07:47.321611 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cfe1444d-9391-4b5b-a770-14e55da2a63d-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "cfe1444d-9391-4b5b-a770-14e55da2a63d" (UID: "cfe1444d-9391-4b5b-a770-14e55da2a63d"). 
InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:07:48 crc kubenswrapper[4895]: I1202 09:07:47.379628 4895 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/cfe1444d-9391-4b5b-a770-14e55da2a63d-openstack-config\") on node \"crc\" DevicePath \"\"" Dec 02 09:07:48 crc kubenswrapper[4895]: I1202 09:07:47.379990 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b5gtm\" (UniqueName: \"kubernetes.io/projected/cfe1444d-9391-4b5b-a770-14e55da2a63d-kube-api-access-b5gtm\") on node \"crc\" DevicePath \"\"" Dec 02 09:07:48 crc kubenswrapper[4895]: I1202 09:07:47.477724 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfe1444d-9391-4b5b-a770-14e55da2a63d-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "cfe1444d-9391-4b5b-a770-14e55da2a63d" (UID: "cfe1444d-9391-4b5b-a770-14e55da2a63d"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:07:48 crc kubenswrapper[4895]: I1202 09:07:47.487317 4895 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/cfe1444d-9391-4b5b-a770-14e55da2a63d-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Dec 02 09:07:48 crc kubenswrapper[4895]: W1202 09:07:47.503359 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5365a5b3_61a8_47cf_a99e_6425e6af3784.slice/crio-b10adb7cdd6410694143522915c2169577de40260c6840e711498ff3534dbfc1 WatchSource:0}: Error finding container b10adb7cdd6410694143522915c2169577de40260c6840e711498ff3534dbfc1: Status 404 returned error can't find the container with id b10adb7cdd6410694143522915c2169577de40260c6840e711498ff3534dbfc1 Dec 02 09:07:48 crc kubenswrapper[4895]: I1202 09:07:47.532759 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Dec 02 09:07:48 crc kubenswrapper[4895]: I1202 09:07:48.120066 4895 scope.go:117] "RemoveContainer" containerID="f20b52de7232adf766d00e3e7d74feff75df26b00c31c85e3d851ff09c011c1f" Dec 02 09:07:48 crc kubenswrapper[4895]: I1202 09:07:48.120293 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 02 09:07:48 crc kubenswrapper[4895]: I1202 09:07:48.124018 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"3b4b8f04-dca7-4b45-b66d-2a75b8c506cf","Type":"ContainerStarted","Data":"c4b0c7314a93c6c577d856d9bb2b0684350bfc57c0e33ac2fdb0d74200df1112"} Dec 02 09:07:48 crc kubenswrapper[4895]: I1202 09:07:48.124099 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Dec 02 09:07:48 crc kubenswrapper[4895]: I1202 09:07:48.129026 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"5365a5b3-61a8-47cf-a99e-6425e6af3784","Type":"ContainerStarted","Data":"b10adb7cdd6410694143522915c2169577de40260c6840e711498ff3534dbfc1"} Dec 02 09:07:48 crc kubenswrapper[4895]: I1202 09:07:48.131575 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"23eecc3a-5577-4505-8ecc-768aaf5228e6","Type":"ContainerStarted","Data":"32bbc8f5efcc7365ad466d19eca1f43c258c95d8afcda511fab11f336d0cc862"} Dec 02 09:07:48 crc kubenswrapper[4895]: I1202 09:07:48.164140 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=3.700173797 podStartE2EDuration="4.164115471s" podCreationTimestamp="2025-12-02 09:07:44 +0000 UTC" firstStartedPulling="2025-12-02 09:07:46.792135958 +0000 UTC m=+6277.962995571" lastFinishedPulling="2025-12-02 09:07:47.256077632 +0000 UTC m=+6278.426937245" observedRunningTime="2025-12-02 09:07:48.144239713 +0000 UTC m=+6279.315099326" watchObservedRunningTime="2025-12-02 09:07:48.164115471 +0000 UTC m=+6279.334975094" Dec 02 09:07:48 crc kubenswrapper[4895]: I1202 09:07:48.178107 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=4.178082976 
podStartE2EDuration="4.178082976s" podCreationTimestamp="2025-12-02 09:07:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:07:48.172698069 +0000 UTC m=+6279.343557682" watchObservedRunningTime="2025-12-02 09:07:48.178082976 +0000 UTC m=+6279.348942599" Dec 02 09:07:48 crc kubenswrapper[4895]: I1202 09:07:48.179315 4895 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="cfe1444d-9391-4b5b-a770-14e55da2a63d" podUID="23eecc3a-5577-4505-8ecc-768aaf5228e6" Dec 02 09:07:48 crc kubenswrapper[4895]: I1202 09:07:48.416360 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 02 09:07:49 crc kubenswrapper[4895]: I1202 09:07:49.157515 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cfe1444d-9391-4b5b-a770-14e55da2a63d" path="/var/lib/kubelet/pods/cfe1444d-9391-4b5b-a770-14e55da2a63d/volumes" Dec 02 09:07:49 crc kubenswrapper[4895]: I1202 09:07:49.158723 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"38a14d48-8eb7-44be-b29b-5a8574b72d91","Type":"ContainerStarted","Data":"0c66dfe8488232d247595c932edb07be771619942870d3fa67891e7b68b10170"} Dec 02 09:07:55 crc kubenswrapper[4895]: I1202 09:07:55.045148 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-z52x8"] Dec 02 09:07:55 crc kubenswrapper[4895]: I1202 09:07:55.059530 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-z52x8"] Dec 02 09:07:55 crc kubenswrapper[4895]: I1202 09:07:55.153138 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dbd9d193-9ec3-49b4-8cbb-050637dc04fc" path="/var/lib/kubelet/pods/dbd9d193-9ec3-49b4-8cbb-050637dc04fc/volumes" Dec 02 09:07:55 crc kubenswrapper[4895]: I1202 
09:07:55.217108 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"5365a5b3-61a8-47cf-a99e-6425e6af3784","Type":"ContainerStarted","Data":"5b1d4223a1c4f4fd20a1e836b0fdaeba1022f6e830d4c2979d160c7ce043406e"} Dec 02 09:07:55 crc kubenswrapper[4895]: I1202 09:07:55.219237 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"38a14d48-8eb7-44be-b29b-5a8574b72d91","Type":"ContainerStarted","Data":"329d151430e55d70db9b51a9fd1eeefd2a88caff4898133888ccfaa3109dd7ce"} Dec 02 09:07:55 crc kubenswrapper[4895]: I1202 09:07:55.559834 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 02 09:08:01 crc kubenswrapper[4895]: I1202 09:08:01.278818 4895 generic.go:334] "Generic (PLEG): container finished" podID="5365a5b3-61a8-47cf-a99e-6425e6af3784" containerID="5b1d4223a1c4f4fd20a1e836b0fdaeba1022f6e830d4c2979d160c7ce043406e" exitCode=0 Dec 02 09:08:01 crc kubenswrapper[4895]: I1202 09:08:01.278895 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"5365a5b3-61a8-47cf-a99e-6425e6af3784","Type":"ContainerDied","Data":"5b1d4223a1c4f4fd20a1e836b0fdaeba1022f6e830d4c2979d160c7ce043406e"} Dec 02 09:08:02 crc kubenswrapper[4895]: I1202 09:08:02.294095 4895 generic.go:334] "Generic (PLEG): container finished" podID="38a14d48-8eb7-44be-b29b-5a8574b72d91" containerID="329d151430e55d70db9b51a9fd1eeefd2a88caff4898133888ccfaa3109dd7ce" exitCode=0 Dec 02 09:08:02 crc kubenswrapper[4895]: I1202 09:08:02.294150 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"38a14d48-8eb7-44be-b29b-5a8574b72d91","Type":"ContainerDied","Data":"329d151430e55d70db9b51a9fd1eeefd2a88caff4898133888ccfaa3109dd7ce"} Dec 02 09:08:05 crc kubenswrapper[4895]: I1202 09:08:05.337607 4895 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"5365a5b3-61a8-47cf-a99e-6425e6af3784","Type":"ContainerStarted","Data":"650eea23ee5017c6d53e9765b46f5d8fcf8416d58c1bd715b42532b7b76bdb83"} Dec 02 09:08:10 crc kubenswrapper[4895]: I1202 09:08:10.393775 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"5365a5b3-61a8-47cf-a99e-6425e6af3784","Type":"ContainerStarted","Data":"74c6a43bd7b9d375911ab14724e63c39f29a25f2bf1b7904b7d07b7c836108e4"} Dec 02 09:08:10 crc kubenswrapper[4895]: I1202 09:08:10.394829 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/alertmanager-metric-storage-0" Dec 02 09:08:10 crc kubenswrapper[4895]: I1202 09:08:10.397038 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/alertmanager-metric-storage-0" Dec 02 09:08:10 crc kubenswrapper[4895]: I1202 09:08:10.399061 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"38a14d48-8eb7-44be-b29b-5a8574b72d91","Type":"ContainerStarted","Data":"0d839fb84a1dce1f4acf31b16924a3ff9c5d77161c4d1159e4f6758bb89dac15"} Dec 02 09:08:10 crc kubenswrapper[4895]: I1202 09:08:10.449762 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/alertmanager-metric-storage-0" podStartSLOduration=7.65061402 podStartE2EDuration="24.449705607s" podCreationTimestamp="2025-12-02 09:07:46 +0000 UTC" firstStartedPulling="2025-12-02 09:07:47.513493326 +0000 UTC m=+6278.684352949" lastFinishedPulling="2025-12-02 09:08:04.312584923 +0000 UTC m=+6295.483444536" observedRunningTime="2025-12-02 09:08:10.430755066 +0000 UTC m=+6301.601614679" watchObservedRunningTime="2025-12-02 09:08:10.449705607 +0000 UTC m=+6301.620565220" Dec 02 09:08:14 crc kubenswrapper[4895]: I1202 09:08:14.449733 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" 
event={"ID":"38a14d48-8eb7-44be-b29b-5a8574b72d91","Type":"ContainerStarted","Data":"5db11eb3aaa6d885288cb7383798464791b7a275bfdb1033763a9c1f2fdbde4a"} Dec 02 09:08:18 crc kubenswrapper[4895]: I1202 09:08:18.510533 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"38a14d48-8eb7-44be-b29b-5a8574b72d91","Type":"ContainerStarted","Data":"9b735e905a7247edaa7036235f261a37768f333f84f4b2572fc85f12272b9753"} Dec 02 09:08:18 crc kubenswrapper[4895]: I1202 09:08:18.543420 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=4.630993751 podStartE2EDuration="33.543400965s" podCreationTimestamp="2025-12-02 09:07:45 +0000 UTC" firstStartedPulling="2025-12-02 09:07:48.529028932 +0000 UTC m=+6279.699888545" lastFinishedPulling="2025-12-02 09:08:17.441436146 +0000 UTC m=+6308.612295759" observedRunningTime="2025-12-02 09:08:18.538583894 +0000 UTC m=+6309.709443507" watchObservedRunningTime="2025-12-02 09:08:18.543400965 +0000 UTC m=+6309.714260578" Dec 02 09:08:22 crc kubenswrapper[4895]: I1202 09:08:22.210916 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Dec 02 09:08:23 crc kubenswrapper[4895]: I1202 09:08:23.949129 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 02 09:08:23 crc kubenswrapper[4895]: I1202 09:08:23.956413 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 02 09:08:23 crc kubenswrapper[4895]: I1202 09:08:23.958840 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 02 09:08:23 crc kubenswrapper[4895]: I1202 09:08:23.959089 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 02 09:08:23 crc kubenswrapper[4895]: I1202 09:08:23.964119 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 02 09:08:24 crc kubenswrapper[4895]: I1202 09:08:24.027541 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pklvx\" (UniqueName: \"kubernetes.io/projected/e91bcda3-165e-4653-8466-a55929cd079a-kube-api-access-pklvx\") pod \"ceilometer-0\" (UID: \"e91bcda3-165e-4653-8466-a55929cd079a\") " pod="openstack/ceilometer-0" Dec 02 09:08:24 crc kubenswrapper[4895]: I1202 09:08:24.027629 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e91bcda3-165e-4653-8466-a55929cd079a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e91bcda3-165e-4653-8466-a55929cd079a\") " pod="openstack/ceilometer-0" Dec 02 09:08:24 crc kubenswrapper[4895]: I1202 09:08:24.027783 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e91bcda3-165e-4653-8466-a55929cd079a-run-httpd\") pod \"ceilometer-0\" (UID: \"e91bcda3-165e-4653-8466-a55929cd079a\") " pod="openstack/ceilometer-0" Dec 02 09:08:24 crc kubenswrapper[4895]: I1202 09:08:24.027840 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e91bcda3-165e-4653-8466-a55929cd079a-scripts\") pod \"ceilometer-0\" (UID: \"e91bcda3-165e-4653-8466-a55929cd079a\") " 
pod="openstack/ceilometer-0" Dec 02 09:08:24 crc kubenswrapper[4895]: I1202 09:08:24.027929 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e91bcda3-165e-4653-8466-a55929cd079a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e91bcda3-165e-4653-8466-a55929cd079a\") " pod="openstack/ceilometer-0" Dec 02 09:08:24 crc kubenswrapper[4895]: I1202 09:08:24.027949 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e91bcda3-165e-4653-8466-a55929cd079a-log-httpd\") pod \"ceilometer-0\" (UID: \"e91bcda3-165e-4653-8466-a55929cd079a\") " pod="openstack/ceilometer-0" Dec 02 09:08:24 crc kubenswrapper[4895]: I1202 09:08:24.027972 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e91bcda3-165e-4653-8466-a55929cd079a-config-data\") pod \"ceilometer-0\" (UID: \"e91bcda3-165e-4653-8466-a55929cd079a\") " pod="openstack/ceilometer-0" Dec 02 09:08:24 crc kubenswrapper[4895]: I1202 09:08:24.129823 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e91bcda3-165e-4653-8466-a55929cd079a-run-httpd\") pod \"ceilometer-0\" (UID: \"e91bcda3-165e-4653-8466-a55929cd079a\") " pod="openstack/ceilometer-0" Dec 02 09:08:24 crc kubenswrapper[4895]: I1202 09:08:24.129890 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e91bcda3-165e-4653-8466-a55929cd079a-scripts\") pod \"ceilometer-0\" (UID: \"e91bcda3-165e-4653-8466-a55929cd079a\") " pod="openstack/ceilometer-0" Dec 02 09:08:24 crc kubenswrapper[4895]: I1202 09:08:24.130005 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" 
(UniqueName: \"kubernetes.io/secret/e91bcda3-165e-4653-8466-a55929cd079a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e91bcda3-165e-4653-8466-a55929cd079a\") " pod="openstack/ceilometer-0" Dec 02 09:08:24 crc kubenswrapper[4895]: I1202 09:08:24.130029 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e91bcda3-165e-4653-8466-a55929cd079a-log-httpd\") pod \"ceilometer-0\" (UID: \"e91bcda3-165e-4653-8466-a55929cd079a\") " pod="openstack/ceilometer-0" Dec 02 09:08:24 crc kubenswrapper[4895]: I1202 09:08:24.130056 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e91bcda3-165e-4653-8466-a55929cd079a-config-data\") pod \"ceilometer-0\" (UID: \"e91bcda3-165e-4653-8466-a55929cd079a\") " pod="openstack/ceilometer-0" Dec 02 09:08:24 crc kubenswrapper[4895]: I1202 09:08:24.130119 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pklvx\" (UniqueName: \"kubernetes.io/projected/e91bcda3-165e-4653-8466-a55929cd079a-kube-api-access-pklvx\") pod \"ceilometer-0\" (UID: \"e91bcda3-165e-4653-8466-a55929cd079a\") " pod="openstack/ceilometer-0" Dec 02 09:08:24 crc kubenswrapper[4895]: I1202 09:08:24.130154 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e91bcda3-165e-4653-8466-a55929cd079a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e91bcda3-165e-4653-8466-a55929cd079a\") " pod="openstack/ceilometer-0" Dec 02 09:08:24 crc kubenswrapper[4895]: I1202 09:08:24.130549 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e91bcda3-165e-4653-8466-a55929cd079a-run-httpd\") pod \"ceilometer-0\" (UID: \"e91bcda3-165e-4653-8466-a55929cd079a\") " pod="openstack/ceilometer-0" Dec 02 09:08:24 crc 
kubenswrapper[4895]: I1202 09:08:24.130647 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e91bcda3-165e-4653-8466-a55929cd079a-log-httpd\") pod \"ceilometer-0\" (UID: \"e91bcda3-165e-4653-8466-a55929cd079a\") " pod="openstack/ceilometer-0" Dec 02 09:08:24 crc kubenswrapper[4895]: I1202 09:08:24.137706 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e91bcda3-165e-4653-8466-a55929cd079a-config-data\") pod \"ceilometer-0\" (UID: \"e91bcda3-165e-4653-8466-a55929cd079a\") " pod="openstack/ceilometer-0" Dec 02 09:08:24 crc kubenswrapper[4895]: I1202 09:08:24.141129 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e91bcda3-165e-4653-8466-a55929cd079a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e91bcda3-165e-4653-8466-a55929cd079a\") " pod="openstack/ceilometer-0" Dec 02 09:08:24 crc kubenswrapper[4895]: I1202 09:08:24.141517 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e91bcda3-165e-4653-8466-a55929cd079a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e91bcda3-165e-4653-8466-a55929cd079a\") " pod="openstack/ceilometer-0" Dec 02 09:08:24 crc kubenswrapper[4895]: I1202 09:08:24.142026 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e91bcda3-165e-4653-8466-a55929cd079a-scripts\") pod \"ceilometer-0\" (UID: \"e91bcda3-165e-4653-8466-a55929cd079a\") " pod="openstack/ceilometer-0" Dec 02 09:08:24 crc kubenswrapper[4895]: I1202 09:08:24.149704 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pklvx\" (UniqueName: \"kubernetes.io/projected/e91bcda3-165e-4653-8466-a55929cd079a-kube-api-access-pklvx\") pod \"ceilometer-0\" (UID: 
\"e91bcda3-165e-4653-8466-a55929cd079a\") " pod="openstack/ceilometer-0" Dec 02 09:08:24 crc kubenswrapper[4895]: I1202 09:08:24.310838 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 02 09:08:24 crc kubenswrapper[4895]: I1202 09:08:24.928920 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 02 09:08:25 crc kubenswrapper[4895]: I1202 09:08:25.607055 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e91bcda3-165e-4653-8466-a55929cd079a","Type":"ContainerStarted","Data":"7aa781eb24e4fb5fbd05cb97a24560bddf34c98f910641405b1eba1f6a82a735"} Dec 02 09:08:27 crc kubenswrapper[4895]: I1202 09:08:27.731439 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e91bcda3-165e-4653-8466-a55929cd079a","Type":"ContainerStarted","Data":"451ce9f03a5aad85acf4917cca261997e73467037de862bb240e32a24695c849"} Dec 02 09:08:28 crc kubenswrapper[4895]: I1202 09:08:28.744977 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e91bcda3-165e-4653-8466-a55929cd079a","Type":"ContainerStarted","Data":"cc4746411154f091a19dfb1df18109972fc932ef7b719f505a7f49259d18d425"} Dec 02 09:08:29 crc kubenswrapper[4895]: I1202 09:08:29.761141 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e91bcda3-165e-4653-8466-a55929cd079a","Type":"ContainerStarted","Data":"54a6e0f44224d3b9916a4d30b4630346dbb6d98d3ffb00cc224e4b670fdb75ea"} Dec 02 09:08:31 crc kubenswrapper[4895]: I1202 09:08:31.793305 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e91bcda3-165e-4653-8466-a55929cd079a","Type":"ContainerStarted","Data":"06dc04f681f086bb8f9528809d20c5110e3451af6c9f6efc6ff0ecefee6bb7ae"} Dec 02 09:08:31 crc kubenswrapper[4895]: I1202 09:08:31.794314 4895 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/ceilometer-0" Dec 02 09:08:31 crc kubenswrapper[4895]: I1202 09:08:31.818402 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.49026328 podStartE2EDuration="8.818381219s" podCreationTimestamp="2025-12-02 09:08:23 +0000 UTC" firstStartedPulling="2025-12-02 09:08:24.942846317 +0000 UTC m=+6316.113705930" lastFinishedPulling="2025-12-02 09:08:31.270964246 +0000 UTC m=+6322.441823869" observedRunningTime="2025-12-02 09:08:31.815362655 +0000 UTC m=+6322.986222288" watchObservedRunningTime="2025-12-02 09:08:31.818381219 +0000 UTC m=+6322.989240842" Dec 02 09:08:32 crc kubenswrapper[4895]: I1202 09:08:32.211430 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Dec 02 09:08:32 crc kubenswrapper[4895]: I1202 09:08:32.216521 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Dec 02 09:08:32 crc kubenswrapper[4895]: I1202 09:08:32.806549 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Dec 02 09:08:35 crc kubenswrapper[4895]: I1202 09:08:35.473577 4895 patch_prober.go:28] interesting pod/machine-config-daemon-wfcg7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 09:08:35 crc kubenswrapper[4895]: I1202 09:08:35.474264 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 09:08:36 crc kubenswrapper[4895]: 
I1202 09:08:36.540138 4895 scope.go:117] "RemoveContainer" containerID="cd8e6ec08927f3cbf7386adde680b90d0ef08c316bfaa14fc6145f460b789bba" Dec 02 09:08:36 crc kubenswrapper[4895]: I1202 09:08:36.589486 4895 scope.go:117] "RemoveContainer" containerID="77f51277e69bf3adcf07535ca57b14667516f908f22b0db5454b6f7e6d17c7b8" Dec 02 09:08:36 crc kubenswrapper[4895]: I1202 09:08:36.656965 4895 scope.go:117] "RemoveContainer" containerID="353690f8148fdff66c772bee85e3488567ccf0cf29834fd7b68b7230298e0c84" Dec 02 09:08:36 crc kubenswrapper[4895]: I1202 09:08:36.701509 4895 scope.go:117] "RemoveContainer" containerID="7b331d71f1abc8f919e6533769397559c49bc492192bd67fb08b76f16ff29fcf" Dec 02 09:08:36 crc kubenswrapper[4895]: I1202 09:08:36.733490 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-create-qjtt9"] Dec 02 09:08:36 crc kubenswrapper[4895]: I1202 09:08:36.735250 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-qjtt9" Dec 02 09:08:36 crc kubenswrapper[4895]: I1202 09:08:36.764723 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-qjtt9"] Dec 02 09:08:36 crc kubenswrapper[4895]: I1202 09:08:36.818597 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-1c7c-account-create-update-dmjp8"] Dec 02 09:08:36 crc kubenswrapper[4895]: I1202 09:08:36.821021 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-1c7c-account-create-update-dmjp8" Dec 02 09:08:36 crc kubenswrapper[4895]: I1202 09:08:36.823992 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-db-secret" Dec 02 09:08:36 crc kubenswrapper[4895]: I1202 09:08:36.825883 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4f267667-a26a-45c5-ab52-4d3b50b3ad17-operator-scripts\") pod \"aodh-db-create-qjtt9\" (UID: \"4f267667-a26a-45c5-ab52-4d3b50b3ad17\") " pod="openstack/aodh-db-create-qjtt9" Dec 02 09:08:36 crc kubenswrapper[4895]: I1202 09:08:36.826118 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w75rw\" (UniqueName: \"kubernetes.io/projected/4f267667-a26a-45c5-ab52-4d3b50b3ad17-kube-api-access-w75rw\") pod \"aodh-db-create-qjtt9\" (UID: \"4f267667-a26a-45c5-ab52-4d3b50b3ad17\") " pod="openstack/aodh-db-create-qjtt9" Dec 02 09:08:36 crc kubenswrapper[4895]: I1202 09:08:36.828653 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-1c7c-account-create-update-dmjp8"] Dec 02 09:08:36 crc kubenswrapper[4895]: I1202 09:08:36.927780 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w75rw\" (UniqueName: \"kubernetes.io/projected/4f267667-a26a-45c5-ab52-4d3b50b3ad17-kube-api-access-w75rw\") pod \"aodh-db-create-qjtt9\" (UID: \"4f267667-a26a-45c5-ab52-4d3b50b3ad17\") " pod="openstack/aodh-db-create-qjtt9" Dec 02 09:08:36 crc kubenswrapper[4895]: I1202 09:08:36.927964 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4f267667-a26a-45c5-ab52-4d3b50b3ad17-operator-scripts\") pod \"aodh-db-create-qjtt9\" (UID: \"4f267667-a26a-45c5-ab52-4d3b50b3ad17\") " pod="openstack/aodh-db-create-qjtt9" Dec 02 09:08:36 crc 
kubenswrapper[4895]: I1202 09:08:36.928037 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mfv4w\" (UniqueName: \"kubernetes.io/projected/e11aaedb-75a3-4da5-b86e-b358bd041d4e-kube-api-access-mfv4w\") pod \"aodh-1c7c-account-create-update-dmjp8\" (UID: \"e11aaedb-75a3-4da5-b86e-b358bd041d4e\") " pod="openstack/aodh-1c7c-account-create-update-dmjp8" Dec 02 09:08:36 crc kubenswrapper[4895]: I1202 09:08:36.928079 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e11aaedb-75a3-4da5-b86e-b358bd041d4e-operator-scripts\") pod \"aodh-1c7c-account-create-update-dmjp8\" (UID: \"e11aaedb-75a3-4da5-b86e-b358bd041d4e\") " pod="openstack/aodh-1c7c-account-create-update-dmjp8" Dec 02 09:08:36 crc kubenswrapper[4895]: I1202 09:08:36.929309 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4f267667-a26a-45c5-ab52-4d3b50b3ad17-operator-scripts\") pod \"aodh-db-create-qjtt9\" (UID: \"4f267667-a26a-45c5-ab52-4d3b50b3ad17\") " pod="openstack/aodh-db-create-qjtt9" Dec 02 09:08:36 crc kubenswrapper[4895]: I1202 09:08:36.955806 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w75rw\" (UniqueName: \"kubernetes.io/projected/4f267667-a26a-45c5-ab52-4d3b50b3ad17-kube-api-access-w75rw\") pod \"aodh-db-create-qjtt9\" (UID: \"4f267667-a26a-45c5-ab52-4d3b50b3ad17\") " pod="openstack/aodh-db-create-qjtt9" Dec 02 09:08:37 crc kubenswrapper[4895]: I1202 09:08:37.030173 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mfv4w\" (UniqueName: \"kubernetes.io/projected/e11aaedb-75a3-4da5-b86e-b358bd041d4e-kube-api-access-mfv4w\") pod \"aodh-1c7c-account-create-update-dmjp8\" (UID: \"e11aaedb-75a3-4da5-b86e-b358bd041d4e\") " 
pod="openstack/aodh-1c7c-account-create-update-dmjp8" Dec 02 09:08:37 crc kubenswrapper[4895]: I1202 09:08:37.030240 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e11aaedb-75a3-4da5-b86e-b358bd041d4e-operator-scripts\") pod \"aodh-1c7c-account-create-update-dmjp8\" (UID: \"e11aaedb-75a3-4da5-b86e-b358bd041d4e\") " pod="openstack/aodh-1c7c-account-create-update-dmjp8" Dec 02 09:08:37 crc kubenswrapper[4895]: I1202 09:08:37.030976 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e11aaedb-75a3-4da5-b86e-b358bd041d4e-operator-scripts\") pod \"aodh-1c7c-account-create-update-dmjp8\" (UID: \"e11aaedb-75a3-4da5-b86e-b358bd041d4e\") " pod="openstack/aodh-1c7c-account-create-update-dmjp8" Dec 02 09:08:37 crc kubenswrapper[4895]: I1202 09:08:37.046202 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mfv4w\" (UniqueName: \"kubernetes.io/projected/e11aaedb-75a3-4da5-b86e-b358bd041d4e-kube-api-access-mfv4w\") pod \"aodh-1c7c-account-create-update-dmjp8\" (UID: \"e11aaedb-75a3-4da5-b86e-b358bd041d4e\") " pod="openstack/aodh-1c7c-account-create-update-dmjp8" Dec 02 09:08:37 crc kubenswrapper[4895]: I1202 09:08:37.110831 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-qjtt9" Dec 02 09:08:37 crc kubenswrapper[4895]: I1202 09:08:37.144314 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-1c7c-account-create-update-dmjp8" Dec 02 09:08:37 crc kubenswrapper[4895]: I1202 09:08:37.719496 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-qjtt9"] Dec 02 09:08:37 crc kubenswrapper[4895]: I1202 09:08:37.817573 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-1c7c-account-create-update-dmjp8"] Dec 02 09:08:38 crc kubenswrapper[4895]: I1202 09:08:38.049253 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-p26bh"] Dec 02 09:08:38 crc kubenswrapper[4895]: I1202 09:08:38.063821 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-93ba-account-create-update-svtc4"] Dec 02 09:08:38 crc kubenswrapper[4895]: I1202 09:08:38.078366 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-p26bh"] Dec 02 09:08:38 crc kubenswrapper[4895]: I1202 09:08:38.092455 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-93ba-account-create-update-svtc4"] Dec 02 09:08:38 crc kubenswrapper[4895]: I1202 09:08:38.642199 4895 generic.go:334] "Generic (PLEG): container finished" podID="e11aaedb-75a3-4da5-b86e-b358bd041d4e" containerID="fecf3f6e6d9a0db6f9fe64d641b47f8c3c68938466b6dd0098465b48c3b548ae" exitCode=0 Dec 02 09:08:38 crc kubenswrapper[4895]: I1202 09:08:38.642543 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-1c7c-account-create-update-dmjp8" event={"ID":"e11aaedb-75a3-4da5-b86e-b358bd041d4e","Type":"ContainerDied","Data":"fecf3f6e6d9a0db6f9fe64d641b47f8c3c68938466b6dd0098465b48c3b548ae"} Dec 02 09:08:38 crc kubenswrapper[4895]: I1202 09:08:38.642572 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-1c7c-account-create-update-dmjp8" event={"ID":"e11aaedb-75a3-4da5-b86e-b358bd041d4e","Type":"ContainerStarted","Data":"ba99011d0d6ae67a0ced458323a994bafacf8c81ac1189e073e73b675c0dd028"} Dec 02 09:08:38 crc 
kubenswrapper[4895]: I1202 09:08:38.644884 4895 generic.go:334] "Generic (PLEG): container finished" podID="4f267667-a26a-45c5-ab52-4d3b50b3ad17" containerID="3725a56d05f97712d364b9fb0fe73ef743ce2be2d199c44a30baf4677c6018ab" exitCode=0 Dec 02 09:08:38 crc kubenswrapper[4895]: I1202 09:08:38.644911 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-qjtt9" event={"ID":"4f267667-a26a-45c5-ab52-4d3b50b3ad17","Type":"ContainerDied","Data":"3725a56d05f97712d364b9fb0fe73ef743ce2be2d199c44a30baf4677c6018ab"} Dec 02 09:08:38 crc kubenswrapper[4895]: I1202 09:08:38.644926 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-qjtt9" event={"ID":"4f267667-a26a-45c5-ab52-4d3b50b3ad17","Type":"ContainerStarted","Data":"25919bd8e73cff37b18ed5d90bdbd64b5a2705f795a9304d76545c81d9615299"} Dec 02 09:08:39 crc kubenswrapper[4895]: I1202 09:08:39.155681 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="013e486c-9468-4687-b255-b9492896f50b" path="/var/lib/kubelet/pods/013e486c-9468-4687-b255-b9492896f50b/volumes" Dec 02 09:08:39 crc kubenswrapper[4895]: I1202 09:08:39.156392 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62314e04-2f0b-4cea-a952-aec25fc0799b" path="/var/lib/kubelet/pods/62314e04-2f0b-4cea-a952-aec25fc0799b/volumes" Dec 02 09:08:40 crc kubenswrapper[4895]: I1202 09:08:40.250153 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-qjtt9" Dec 02 09:08:40 crc kubenswrapper[4895]: I1202 09:08:40.258892 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-1c7c-account-create-update-dmjp8" Dec 02 09:08:40 crc kubenswrapper[4895]: I1202 09:08:40.440945 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e11aaedb-75a3-4da5-b86e-b358bd041d4e-operator-scripts\") pod \"e11aaedb-75a3-4da5-b86e-b358bd041d4e\" (UID: \"e11aaedb-75a3-4da5-b86e-b358bd041d4e\") " Dec 02 09:08:40 crc kubenswrapper[4895]: I1202 09:08:40.441288 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mfv4w\" (UniqueName: \"kubernetes.io/projected/e11aaedb-75a3-4da5-b86e-b358bd041d4e-kube-api-access-mfv4w\") pod \"e11aaedb-75a3-4da5-b86e-b358bd041d4e\" (UID: \"e11aaedb-75a3-4da5-b86e-b358bd041d4e\") " Dec 02 09:08:40 crc kubenswrapper[4895]: I1202 09:08:40.441555 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4f267667-a26a-45c5-ab52-4d3b50b3ad17-operator-scripts\") pod \"4f267667-a26a-45c5-ab52-4d3b50b3ad17\" (UID: \"4f267667-a26a-45c5-ab52-4d3b50b3ad17\") " Dec 02 09:08:40 crc kubenswrapper[4895]: I1202 09:08:40.441575 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e11aaedb-75a3-4da5-b86e-b358bd041d4e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e11aaedb-75a3-4da5-b86e-b358bd041d4e" (UID: "e11aaedb-75a3-4da5-b86e-b358bd041d4e"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:08:40 crc kubenswrapper[4895]: I1202 09:08:40.441708 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w75rw\" (UniqueName: \"kubernetes.io/projected/4f267667-a26a-45c5-ab52-4d3b50b3ad17-kube-api-access-w75rw\") pod \"4f267667-a26a-45c5-ab52-4d3b50b3ad17\" (UID: \"4f267667-a26a-45c5-ab52-4d3b50b3ad17\") " Dec 02 09:08:40 crc kubenswrapper[4895]: I1202 09:08:40.442216 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f267667-a26a-45c5-ab52-4d3b50b3ad17-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4f267667-a26a-45c5-ab52-4d3b50b3ad17" (UID: "4f267667-a26a-45c5-ab52-4d3b50b3ad17"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:08:40 crc kubenswrapper[4895]: I1202 09:08:40.442817 4895 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4f267667-a26a-45c5-ab52-4d3b50b3ad17-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 09:08:40 crc kubenswrapper[4895]: I1202 09:08:40.442892 4895 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e11aaedb-75a3-4da5-b86e-b358bd041d4e-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 09:08:40 crc kubenswrapper[4895]: I1202 09:08:40.448997 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e11aaedb-75a3-4da5-b86e-b358bd041d4e-kube-api-access-mfv4w" (OuterVolumeSpecName: "kube-api-access-mfv4w") pod "e11aaedb-75a3-4da5-b86e-b358bd041d4e" (UID: "e11aaedb-75a3-4da5-b86e-b358bd041d4e"). InnerVolumeSpecName "kube-api-access-mfv4w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:08:40 crc kubenswrapper[4895]: I1202 09:08:40.455754 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f267667-a26a-45c5-ab52-4d3b50b3ad17-kube-api-access-w75rw" (OuterVolumeSpecName: "kube-api-access-w75rw") pod "4f267667-a26a-45c5-ab52-4d3b50b3ad17" (UID: "4f267667-a26a-45c5-ab52-4d3b50b3ad17"). InnerVolumeSpecName "kube-api-access-w75rw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:08:40 crc kubenswrapper[4895]: I1202 09:08:40.546612 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mfv4w\" (UniqueName: \"kubernetes.io/projected/e11aaedb-75a3-4da5-b86e-b358bd041d4e-kube-api-access-mfv4w\") on node \"crc\" DevicePath \"\"" Dec 02 09:08:40 crc kubenswrapper[4895]: I1202 09:08:40.546667 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w75rw\" (UniqueName: \"kubernetes.io/projected/4f267667-a26a-45c5-ab52-4d3b50b3ad17-kube-api-access-w75rw\") on node \"crc\" DevicePath \"\"" Dec 02 09:08:40 crc kubenswrapper[4895]: I1202 09:08:40.664611 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-1c7c-account-create-update-dmjp8" event={"ID":"e11aaedb-75a3-4da5-b86e-b358bd041d4e","Type":"ContainerDied","Data":"ba99011d0d6ae67a0ced458323a994bafacf8c81ac1189e073e73b675c0dd028"} Dec 02 09:08:40 crc kubenswrapper[4895]: I1202 09:08:40.664659 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ba99011d0d6ae67a0ced458323a994bafacf8c81ac1189e073e73b675c0dd028" Dec 02 09:08:40 crc kubenswrapper[4895]: I1202 09:08:40.664687 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-1c7c-account-create-update-dmjp8" Dec 02 09:08:40 crc kubenswrapper[4895]: I1202 09:08:40.666992 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-qjtt9" event={"ID":"4f267667-a26a-45c5-ab52-4d3b50b3ad17","Type":"ContainerDied","Data":"25919bd8e73cff37b18ed5d90bdbd64b5a2705f795a9304d76545c81d9615299"} Dec 02 09:08:40 crc kubenswrapper[4895]: I1202 09:08:40.667224 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="25919bd8e73cff37b18ed5d90bdbd64b5a2705f795a9304d76545c81d9615299" Dec 02 09:08:40 crc kubenswrapper[4895]: I1202 09:08:40.667031 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-qjtt9" Dec 02 09:08:42 crc kubenswrapper[4895]: I1202 09:08:42.253085 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-sync-dwb4g"] Dec 02 09:08:42 crc kubenswrapper[4895]: E1202 09:08:42.253981 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f267667-a26a-45c5-ab52-4d3b50b3ad17" containerName="mariadb-database-create" Dec 02 09:08:42 crc kubenswrapper[4895]: I1202 09:08:42.253999 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f267667-a26a-45c5-ab52-4d3b50b3ad17" containerName="mariadb-database-create" Dec 02 09:08:42 crc kubenswrapper[4895]: E1202 09:08:42.254046 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e11aaedb-75a3-4da5-b86e-b358bd041d4e" containerName="mariadb-account-create-update" Dec 02 09:08:42 crc kubenswrapper[4895]: I1202 09:08:42.254054 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="e11aaedb-75a3-4da5-b86e-b358bd041d4e" containerName="mariadb-account-create-update" Dec 02 09:08:42 crc kubenswrapper[4895]: I1202 09:08:42.254333 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f267667-a26a-45c5-ab52-4d3b50b3ad17" containerName="mariadb-database-create" Dec 02 09:08:42 
crc kubenswrapper[4895]: I1202 09:08:42.254353 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="e11aaedb-75a3-4da5-b86e-b358bd041d4e" containerName="mariadb-account-create-update" Dec 02 09:08:42 crc kubenswrapper[4895]: I1202 09:08:42.255389 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-dwb4g" Dec 02 09:08:42 crc kubenswrapper[4895]: I1202 09:08:42.258484 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 02 09:08:42 crc kubenswrapper[4895]: I1202 09:08:42.258531 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Dec 02 09:08:42 crc kubenswrapper[4895]: I1202 09:08:42.259973 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Dec 02 09:08:42 crc kubenswrapper[4895]: I1202 09:08:42.270527 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-dwb4g"] Dec 02 09:08:42 crc kubenswrapper[4895]: I1202 09:08:42.270995 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-hdr56" Dec 02 09:08:42 crc kubenswrapper[4895]: I1202 09:08:42.386083 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2734fba4-4ac9-425b-8c0e-68702868de3e-config-data\") pod \"aodh-db-sync-dwb4g\" (UID: \"2734fba4-4ac9-425b-8c0e-68702868de3e\") " pod="openstack/aodh-db-sync-dwb4g" Dec 02 09:08:42 crc kubenswrapper[4895]: I1202 09:08:42.386515 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2734fba4-4ac9-425b-8c0e-68702868de3e-scripts\") pod \"aodh-db-sync-dwb4g\" (UID: \"2734fba4-4ac9-425b-8c0e-68702868de3e\") " pod="openstack/aodh-db-sync-dwb4g" Dec 02 09:08:42 crc kubenswrapper[4895]: I1202 09:08:42.386610 
4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2734fba4-4ac9-425b-8c0e-68702868de3e-combined-ca-bundle\") pod \"aodh-db-sync-dwb4g\" (UID: \"2734fba4-4ac9-425b-8c0e-68702868de3e\") " pod="openstack/aodh-db-sync-dwb4g" Dec 02 09:08:42 crc kubenswrapper[4895]: I1202 09:08:42.387426 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8bdd\" (UniqueName: \"kubernetes.io/projected/2734fba4-4ac9-425b-8c0e-68702868de3e-kube-api-access-t8bdd\") pod \"aodh-db-sync-dwb4g\" (UID: \"2734fba4-4ac9-425b-8c0e-68702868de3e\") " pod="openstack/aodh-db-sync-dwb4g" Dec 02 09:08:42 crc kubenswrapper[4895]: I1202 09:08:42.489058 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8bdd\" (UniqueName: \"kubernetes.io/projected/2734fba4-4ac9-425b-8c0e-68702868de3e-kube-api-access-t8bdd\") pod \"aodh-db-sync-dwb4g\" (UID: \"2734fba4-4ac9-425b-8c0e-68702868de3e\") " pod="openstack/aodh-db-sync-dwb4g" Dec 02 09:08:42 crc kubenswrapper[4895]: I1202 09:08:42.489185 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2734fba4-4ac9-425b-8c0e-68702868de3e-config-data\") pod \"aodh-db-sync-dwb4g\" (UID: \"2734fba4-4ac9-425b-8c0e-68702868de3e\") " pod="openstack/aodh-db-sync-dwb4g" Dec 02 09:08:42 crc kubenswrapper[4895]: I1202 09:08:42.489313 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2734fba4-4ac9-425b-8c0e-68702868de3e-scripts\") pod \"aodh-db-sync-dwb4g\" (UID: \"2734fba4-4ac9-425b-8c0e-68702868de3e\") " pod="openstack/aodh-db-sync-dwb4g" Dec 02 09:08:42 crc kubenswrapper[4895]: I1202 09:08:42.489345 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/2734fba4-4ac9-425b-8c0e-68702868de3e-combined-ca-bundle\") pod \"aodh-db-sync-dwb4g\" (UID: \"2734fba4-4ac9-425b-8c0e-68702868de3e\") " pod="openstack/aodh-db-sync-dwb4g" Dec 02 09:08:42 crc kubenswrapper[4895]: I1202 09:08:42.494244 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2734fba4-4ac9-425b-8c0e-68702868de3e-combined-ca-bundle\") pod \"aodh-db-sync-dwb4g\" (UID: \"2734fba4-4ac9-425b-8c0e-68702868de3e\") " pod="openstack/aodh-db-sync-dwb4g" Dec 02 09:08:42 crc kubenswrapper[4895]: I1202 09:08:42.502977 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2734fba4-4ac9-425b-8c0e-68702868de3e-scripts\") pod \"aodh-db-sync-dwb4g\" (UID: \"2734fba4-4ac9-425b-8c0e-68702868de3e\") " pod="openstack/aodh-db-sync-dwb4g" Dec 02 09:08:42 crc kubenswrapper[4895]: I1202 09:08:42.503640 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2734fba4-4ac9-425b-8c0e-68702868de3e-config-data\") pod \"aodh-db-sync-dwb4g\" (UID: \"2734fba4-4ac9-425b-8c0e-68702868de3e\") " pod="openstack/aodh-db-sync-dwb4g" Dec 02 09:08:42 crc kubenswrapper[4895]: I1202 09:08:42.511394 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8bdd\" (UniqueName: \"kubernetes.io/projected/2734fba4-4ac9-425b-8c0e-68702868de3e-kube-api-access-t8bdd\") pod \"aodh-db-sync-dwb4g\" (UID: \"2734fba4-4ac9-425b-8c0e-68702868de3e\") " pod="openstack/aodh-db-sync-dwb4g" Dec 02 09:08:42 crc kubenswrapper[4895]: I1202 09:08:42.576031 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-dwb4g" Dec 02 09:08:43 crc kubenswrapper[4895]: I1202 09:08:43.082439 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-dwb4g"] Dec 02 09:08:43 crc kubenswrapper[4895]: I1202 09:08:43.724257 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-dwb4g" event={"ID":"2734fba4-4ac9-425b-8c0e-68702868de3e","Type":"ContainerStarted","Data":"afcac71477d6e0e0fce1430014e03196a13d4b22dad67d4a250880137db872de"} Dec 02 09:08:45 crc kubenswrapper[4895]: I1202 09:08:45.029358 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-tqxqk"] Dec 02 09:08:45 crc kubenswrapper[4895]: I1202 09:08:45.042011 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-tqxqk"] Dec 02 09:08:45 crc kubenswrapper[4895]: I1202 09:08:45.159111 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cce80648-ccb2-4ba8-802c-c8afafb13ab6" path="/var/lib/kubelet/pods/cce80648-ccb2-4ba8-802c-c8afafb13ab6/volumes" Dec 02 09:08:48 crc kubenswrapper[4895]: I1202 09:08:48.778230 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-dwb4g" event={"ID":"2734fba4-4ac9-425b-8c0e-68702868de3e","Type":"ContainerStarted","Data":"947435df6cfbae4ded1a38aee306494fed9102fb6231d255e5fe692ddac36119"} Dec 02 09:08:48 crc kubenswrapper[4895]: I1202 09:08:48.809352 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-db-sync-dwb4g" podStartSLOduration=1.867114366 podStartE2EDuration="6.809330998s" podCreationTimestamp="2025-12-02 09:08:42 +0000 UTC" firstStartedPulling="2025-12-02 09:08:43.098436485 +0000 UTC m=+6334.269296108" lastFinishedPulling="2025-12-02 09:08:48.040653127 +0000 UTC m=+6339.211512740" observedRunningTime="2025-12-02 09:08:48.799231403 +0000 UTC m=+6339.970091016" watchObservedRunningTime="2025-12-02 09:08:48.809330998 +0000 UTC 
m=+6339.980190601" Dec 02 09:08:51 crc kubenswrapper[4895]: I1202 09:08:51.833660 4895 generic.go:334] "Generic (PLEG): container finished" podID="2734fba4-4ac9-425b-8c0e-68702868de3e" containerID="947435df6cfbae4ded1a38aee306494fed9102fb6231d255e5fe692ddac36119" exitCode=0 Dec 02 09:08:51 crc kubenswrapper[4895]: I1202 09:08:51.834645 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-dwb4g" event={"ID":"2734fba4-4ac9-425b-8c0e-68702868de3e","Type":"ContainerDied","Data":"947435df6cfbae4ded1a38aee306494fed9102fb6231d255e5fe692ddac36119"} Dec 02 09:08:53 crc kubenswrapper[4895]: I1202 09:08:53.308610 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-dwb4g" Dec 02 09:08:53 crc kubenswrapper[4895]: I1202 09:08:53.433704 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2734fba4-4ac9-425b-8c0e-68702868de3e-combined-ca-bundle\") pod \"2734fba4-4ac9-425b-8c0e-68702868de3e\" (UID: \"2734fba4-4ac9-425b-8c0e-68702868de3e\") " Dec 02 09:08:53 crc kubenswrapper[4895]: I1202 09:08:53.433834 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2734fba4-4ac9-425b-8c0e-68702868de3e-scripts\") pod \"2734fba4-4ac9-425b-8c0e-68702868de3e\" (UID: \"2734fba4-4ac9-425b-8c0e-68702868de3e\") " Dec 02 09:08:53 crc kubenswrapper[4895]: I1202 09:08:53.434063 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t8bdd\" (UniqueName: \"kubernetes.io/projected/2734fba4-4ac9-425b-8c0e-68702868de3e-kube-api-access-t8bdd\") pod \"2734fba4-4ac9-425b-8c0e-68702868de3e\" (UID: \"2734fba4-4ac9-425b-8c0e-68702868de3e\") " Dec 02 09:08:53 crc kubenswrapper[4895]: I1202 09:08:53.434197 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/2734fba4-4ac9-425b-8c0e-68702868de3e-config-data\") pod \"2734fba4-4ac9-425b-8c0e-68702868de3e\" (UID: \"2734fba4-4ac9-425b-8c0e-68702868de3e\") " Dec 02 09:08:53 crc kubenswrapper[4895]: I1202 09:08:53.442307 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2734fba4-4ac9-425b-8c0e-68702868de3e-scripts" (OuterVolumeSpecName: "scripts") pod "2734fba4-4ac9-425b-8c0e-68702868de3e" (UID: "2734fba4-4ac9-425b-8c0e-68702868de3e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:08:53 crc kubenswrapper[4895]: I1202 09:08:53.442441 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2734fba4-4ac9-425b-8c0e-68702868de3e-kube-api-access-t8bdd" (OuterVolumeSpecName: "kube-api-access-t8bdd") pod "2734fba4-4ac9-425b-8c0e-68702868de3e" (UID: "2734fba4-4ac9-425b-8c0e-68702868de3e"). InnerVolumeSpecName "kube-api-access-t8bdd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:08:53 crc kubenswrapper[4895]: I1202 09:08:53.468522 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2734fba4-4ac9-425b-8c0e-68702868de3e-config-data" (OuterVolumeSpecName: "config-data") pod "2734fba4-4ac9-425b-8c0e-68702868de3e" (UID: "2734fba4-4ac9-425b-8c0e-68702868de3e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:08:53 crc kubenswrapper[4895]: I1202 09:08:53.482130 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2734fba4-4ac9-425b-8c0e-68702868de3e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2734fba4-4ac9-425b-8c0e-68702868de3e" (UID: "2734fba4-4ac9-425b-8c0e-68702868de3e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:08:53 crc kubenswrapper[4895]: I1202 09:08:53.539048 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2734fba4-4ac9-425b-8c0e-68702868de3e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 09:08:53 crc kubenswrapper[4895]: I1202 09:08:53.539221 4895 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2734fba4-4ac9-425b-8c0e-68702868de3e-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 09:08:53 crc kubenswrapper[4895]: I1202 09:08:53.539327 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t8bdd\" (UniqueName: \"kubernetes.io/projected/2734fba4-4ac9-425b-8c0e-68702868de3e-kube-api-access-t8bdd\") on node \"crc\" DevicePath \"\"" Dec 02 09:08:53 crc kubenswrapper[4895]: I1202 09:08:53.539407 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2734fba4-4ac9-425b-8c0e-68702868de3e-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 09:08:53 crc kubenswrapper[4895]: I1202 09:08:53.864064 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-dwb4g" event={"ID":"2734fba4-4ac9-425b-8c0e-68702868de3e","Type":"ContainerDied","Data":"afcac71477d6e0e0fce1430014e03196a13d4b22dad67d4a250880137db872de"} Dec 02 09:08:53 crc kubenswrapper[4895]: I1202 09:08:53.864129 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="afcac71477d6e0e0fce1430014e03196a13d4b22dad67d4a250880137db872de" Dec 02 09:08:53 crc kubenswrapper[4895]: I1202 09:08:53.864529 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-dwb4g" Dec 02 09:08:54 crc kubenswrapper[4895]: I1202 09:08:54.320259 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 02 09:08:56 crc kubenswrapper[4895]: I1202 09:08:56.859227 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Dec 02 09:08:56 crc kubenswrapper[4895]: E1202 09:08:56.860581 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2734fba4-4ac9-425b-8c0e-68702868de3e" containerName="aodh-db-sync" Dec 02 09:08:56 crc kubenswrapper[4895]: I1202 09:08:56.860598 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="2734fba4-4ac9-425b-8c0e-68702868de3e" containerName="aodh-db-sync" Dec 02 09:08:56 crc kubenswrapper[4895]: I1202 09:08:56.860880 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="2734fba4-4ac9-425b-8c0e-68702868de3e" containerName="aodh-db-sync" Dec 02 09:08:56 crc kubenswrapper[4895]: I1202 09:08:56.864653 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Dec 02 09:08:56 crc kubenswrapper[4895]: I1202 09:08:56.869186 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Dec 02 09:08:56 crc kubenswrapper[4895]: I1202 09:08:56.869426 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Dec 02 09:08:56 crc kubenswrapper[4895]: I1202 09:08:56.870052 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-hdr56" Dec 02 09:08:56 crc kubenswrapper[4895]: I1202 09:08:56.889212 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Dec 02 09:08:57 crc kubenswrapper[4895]: I1202 09:08:57.010477 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3bd61e8-c5af-493e-b789-d517f04a8f70-combined-ca-bundle\") pod \"aodh-0\" (UID: \"b3bd61e8-c5af-493e-b789-d517f04a8f70\") " pod="openstack/aodh-0" Dec 02 09:08:57 crc kubenswrapper[4895]: I1202 09:08:57.011252 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b3bd61e8-c5af-493e-b789-d517f04a8f70-scripts\") pod \"aodh-0\" (UID: \"b3bd61e8-c5af-493e-b789-d517f04a8f70\") " pod="openstack/aodh-0" Dec 02 09:08:57 crc kubenswrapper[4895]: I1202 09:08:57.011310 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3bd61e8-c5af-493e-b789-d517f04a8f70-config-data\") pod \"aodh-0\" (UID: \"b3bd61e8-c5af-493e-b789-d517f04a8f70\") " pod="openstack/aodh-0" Dec 02 09:08:57 crc kubenswrapper[4895]: I1202 09:08:57.011373 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqznk\" (UniqueName: 
\"kubernetes.io/projected/b3bd61e8-c5af-493e-b789-d517f04a8f70-kube-api-access-gqznk\") pod \"aodh-0\" (UID: \"b3bd61e8-c5af-493e-b789-d517f04a8f70\") " pod="openstack/aodh-0" Dec 02 09:08:57 crc kubenswrapper[4895]: I1202 09:08:57.113913 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3bd61e8-c5af-493e-b789-d517f04a8f70-combined-ca-bundle\") pod \"aodh-0\" (UID: \"b3bd61e8-c5af-493e-b789-d517f04a8f70\") " pod="openstack/aodh-0" Dec 02 09:08:57 crc kubenswrapper[4895]: I1202 09:08:57.114015 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b3bd61e8-c5af-493e-b789-d517f04a8f70-scripts\") pod \"aodh-0\" (UID: \"b3bd61e8-c5af-493e-b789-d517f04a8f70\") " pod="openstack/aodh-0" Dec 02 09:08:57 crc kubenswrapper[4895]: I1202 09:08:57.114047 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3bd61e8-c5af-493e-b789-d517f04a8f70-config-data\") pod \"aodh-0\" (UID: \"b3bd61e8-c5af-493e-b789-d517f04a8f70\") " pod="openstack/aodh-0" Dec 02 09:08:57 crc kubenswrapper[4895]: I1202 09:08:57.114100 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gqznk\" (UniqueName: \"kubernetes.io/projected/b3bd61e8-c5af-493e-b789-d517f04a8f70-kube-api-access-gqznk\") pod \"aodh-0\" (UID: \"b3bd61e8-c5af-493e-b789-d517f04a8f70\") " pod="openstack/aodh-0" Dec 02 09:08:57 crc kubenswrapper[4895]: I1202 09:08:57.120503 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b3bd61e8-c5af-493e-b789-d517f04a8f70-scripts\") pod \"aodh-0\" (UID: \"b3bd61e8-c5af-493e-b789-d517f04a8f70\") " pod="openstack/aodh-0" Dec 02 09:08:57 crc kubenswrapper[4895]: I1202 09:08:57.122046 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3bd61e8-c5af-493e-b789-d517f04a8f70-combined-ca-bundle\") pod \"aodh-0\" (UID: \"b3bd61e8-c5af-493e-b789-d517f04a8f70\") " pod="openstack/aodh-0" Dec 02 09:08:57 crc kubenswrapper[4895]: I1202 09:08:57.125423 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3bd61e8-c5af-493e-b789-d517f04a8f70-config-data\") pod \"aodh-0\" (UID: \"b3bd61e8-c5af-493e-b789-d517f04a8f70\") " pod="openstack/aodh-0" Dec 02 09:08:57 crc kubenswrapper[4895]: I1202 09:08:57.134247 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqznk\" (UniqueName: \"kubernetes.io/projected/b3bd61e8-c5af-493e-b789-d517f04a8f70-kube-api-access-gqznk\") pod \"aodh-0\" (UID: \"b3bd61e8-c5af-493e-b789-d517f04a8f70\") " pod="openstack/aodh-0" Dec 02 09:08:57 crc kubenswrapper[4895]: I1202 09:08:57.204318 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Dec 02 09:08:57 crc kubenswrapper[4895]: I1202 09:08:57.810220 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Dec 02 09:08:57 crc kubenswrapper[4895]: I1202 09:08:57.915438 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"b3bd61e8-c5af-493e-b789-d517f04a8f70","Type":"ContainerStarted","Data":"6d64ff5dfdae1a0a56f222db4afdfa5e7c48cdbfedeef61006f4605278e46ac2"} Dec 02 09:08:58 crc kubenswrapper[4895]: I1202 09:08:58.928095 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"b3bd61e8-c5af-493e-b789-d517f04a8f70","Type":"ContainerStarted","Data":"fcd5a521115803587cd8c6c251e51567fdfa920aac30597f086fdeb0f85f55eb"} Dec 02 09:08:59 crc kubenswrapper[4895]: I1202 09:08:59.166961 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 02 09:08:59 crc kubenswrapper[4895]: I1202 09:08:59.167229 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e91bcda3-165e-4653-8466-a55929cd079a" containerName="ceilometer-central-agent" containerID="cri-o://451ce9f03a5aad85acf4917cca261997e73467037de862bb240e32a24695c849" gracePeriod=30 Dec 02 09:08:59 crc kubenswrapper[4895]: I1202 09:08:59.167383 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e91bcda3-165e-4653-8466-a55929cd079a" containerName="ceilometer-notification-agent" containerID="cri-o://cc4746411154f091a19dfb1df18109972fc932ef7b719f505a7f49259d18d425" gracePeriod=30 Dec 02 09:08:59 crc kubenswrapper[4895]: I1202 09:08:59.167409 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e91bcda3-165e-4653-8466-a55929cd079a" containerName="proxy-httpd" containerID="cri-o://06dc04f681f086bb8f9528809d20c5110e3451af6c9f6efc6ff0ecefee6bb7ae" gracePeriod=30 Dec 02 09:08:59 
crc kubenswrapper[4895]: I1202 09:08:59.167379 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e91bcda3-165e-4653-8466-a55929cd079a" containerName="sg-core" containerID="cri-o://54a6e0f44224d3b9916a4d30b4630346dbb6d98d3ffb00cc224e4b670fdb75ea" gracePeriod=30 Dec 02 09:08:59 crc kubenswrapper[4895]: I1202 09:08:59.941820 4895 generic.go:334] "Generic (PLEG): container finished" podID="e91bcda3-165e-4653-8466-a55929cd079a" containerID="06dc04f681f086bb8f9528809d20c5110e3451af6c9f6efc6ff0ecefee6bb7ae" exitCode=0 Dec 02 09:08:59 crc kubenswrapper[4895]: I1202 09:08:59.942188 4895 generic.go:334] "Generic (PLEG): container finished" podID="e91bcda3-165e-4653-8466-a55929cd079a" containerID="54a6e0f44224d3b9916a4d30b4630346dbb6d98d3ffb00cc224e4b670fdb75ea" exitCode=2 Dec 02 09:08:59 crc kubenswrapper[4895]: I1202 09:08:59.942206 4895 generic.go:334] "Generic (PLEG): container finished" podID="e91bcda3-165e-4653-8466-a55929cd079a" containerID="451ce9f03a5aad85acf4917cca261997e73467037de862bb240e32a24695c849" exitCode=0 Dec 02 09:08:59 crc kubenswrapper[4895]: I1202 09:08:59.942237 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e91bcda3-165e-4653-8466-a55929cd079a","Type":"ContainerDied","Data":"06dc04f681f086bb8f9528809d20c5110e3451af6c9f6efc6ff0ecefee6bb7ae"} Dec 02 09:08:59 crc kubenswrapper[4895]: I1202 09:08:59.942263 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e91bcda3-165e-4653-8466-a55929cd079a","Type":"ContainerDied","Data":"54a6e0f44224d3b9916a4d30b4630346dbb6d98d3ffb00cc224e4b670fdb75ea"} Dec 02 09:08:59 crc kubenswrapper[4895]: I1202 09:08:59.942274 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e91bcda3-165e-4653-8466-a55929cd079a","Type":"ContainerDied","Data":"451ce9f03a5aad85acf4917cca261997e73467037de862bb240e32a24695c849"} Dec 02 09:09:00 crc 
kubenswrapper[4895]: I1202 09:09:00.958126 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"b3bd61e8-c5af-493e-b789-d517f04a8f70","Type":"ContainerStarted","Data":"febe1efb806fcfe7092ea11ffd6620d8201c8c866f3962428631ae1087608ccb"} Dec 02 09:09:01 crc kubenswrapper[4895]: I1202 09:09:01.914134 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 02 09:09:01 crc kubenswrapper[4895]: I1202 09:09:01.943148 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pklvx\" (UniqueName: \"kubernetes.io/projected/e91bcda3-165e-4653-8466-a55929cd079a-kube-api-access-pklvx\") pod \"e91bcda3-165e-4653-8466-a55929cd079a\" (UID: \"e91bcda3-165e-4653-8466-a55929cd079a\") " Dec 02 09:09:01 crc kubenswrapper[4895]: I1202 09:09:01.943202 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e91bcda3-165e-4653-8466-a55929cd079a-config-data\") pod \"e91bcda3-165e-4653-8466-a55929cd079a\" (UID: \"e91bcda3-165e-4653-8466-a55929cd079a\") " Dec 02 09:09:01 crc kubenswrapper[4895]: I1202 09:09:01.943252 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e91bcda3-165e-4653-8466-a55929cd079a-combined-ca-bundle\") pod \"e91bcda3-165e-4653-8466-a55929cd079a\" (UID: \"e91bcda3-165e-4653-8466-a55929cd079a\") " Dec 02 09:09:01 crc kubenswrapper[4895]: I1202 09:09:01.943433 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e91bcda3-165e-4653-8466-a55929cd079a-log-httpd\") pod \"e91bcda3-165e-4653-8466-a55929cd079a\" (UID: \"e91bcda3-165e-4653-8466-a55929cd079a\") " Dec 02 09:09:01 crc kubenswrapper[4895]: I1202 09:09:01.943459 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e91bcda3-165e-4653-8466-a55929cd079a-sg-core-conf-yaml\") pod \"e91bcda3-165e-4653-8466-a55929cd079a\" (UID: \"e91bcda3-165e-4653-8466-a55929cd079a\") " Dec 02 09:09:01 crc kubenswrapper[4895]: I1202 09:09:01.943813 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e91bcda3-165e-4653-8466-a55929cd079a-run-httpd\") pod \"e91bcda3-165e-4653-8466-a55929cd079a\" (UID: \"e91bcda3-165e-4653-8466-a55929cd079a\") " Dec 02 09:09:01 crc kubenswrapper[4895]: I1202 09:09:01.943861 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e91bcda3-165e-4653-8466-a55929cd079a-scripts\") pod \"e91bcda3-165e-4653-8466-a55929cd079a\" (UID: \"e91bcda3-165e-4653-8466-a55929cd079a\") " Dec 02 09:09:01 crc kubenswrapper[4895]: I1202 09:09:01.946986 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e91bcda3-165e-4653-8466-a55929cd079a-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "e91bcda3-165e-4653-8466-a55929cd079a" (UID: "e91bcda3-165e-4653-8466-a55929cd079a"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:09:01 crc kubenswrapper[4895]: I1202 09:09:01.947386 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e91bcda3-165e-4653-8466-a55929cd079a-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "e91bcda3-165e-4653-8466-a55929cd079a" (UID: "e91bcda3-165e-4653-8466-a55929cd079a"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:09:01 crc kubenswrapper[4895]: I1202 09:09:01.957708 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e91bcda3-165e-4653-8466-a55929cd079a-scripts" (OuterVolumeSpecName: "scripts") pod "e91bcda3-165e-4653-8466-a55929cd079a" (UID: "e91bcda3-165e-4653-8466-a55929cd079a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:09:01 crc kubenswrapper[4895]: I1202 09:09:01.957844 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e91bcda3-165e-4653-8466-a55929cd079a-kube-api-access-pklvx" (OuterVolumeSpecName: "kube-api-access-pklvx") pod "e91bcda3-165e-4653-8466-a55929cd079a" (UID: "e91bcda3-165e-4653-8466-a55929cd079a"). InnerVolumeSpecName "kube-api-access-pklvx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:09:01 crc kubenswrapper[4895]: I1202 09:09:01.981934 4895 generic.go:334] "Generic (PLEG): container finished" podID="e91bcda3-165e-4653-8466-a55929cd079a" containerID="cc4746411154f091a19dfb1df18109972fc932ef7b719f505a7f49259d18d425" exitCode=0 Dec 02 09:09:01 crc kubenswrapper[4895]: I1202 09:09:01.981980 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e91bcda3-165e-4653-8466-a55929cd079a","Type":"ContainerDied","Data":"cc4746411154f091a19dfb1df18109972fc932ef7b719f505a7f49259d18d425"} Dec 02 09:09:01 crc kubenswrapper[4895]: I1202 09:09:01.981998 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 02 09:09:01 crc kubenswrapper[4895]: I1202 09:09:01.982021 4895 scope.go:117] "RemoveContainer" containerID="06dc04f681f086bb8f9528809d20c5110e3451af6c9f6efc6ff0ecefee6bb7ae" Dec 02 09:09:01 crc kubenswrapper[4895]: I1202 09:09:01.982008 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e91bcda3-165e-4653-8466-a55929cd079a","Type":"ContainerDied","Data":"7aa781eb24e4fb5fbd05cb97a24560bddf34c98f910641405b1eba1f6a82a735"} Dec 02 09:09:02 crc kubenswrapper[4895]: I1202 09:09:02.008956 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e91bcda3-165e-4653-8466-a55929cd079a-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "e91bcda3-165e-4653-8466-a55929cd079a" (UID: "e91bcda3-165e-4653-8466-a55929cd079a"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:09:02 crc kubenswrapper[4895]: I1202 09:09:02.037044 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e91bcda3-165e-4653-8466-a55929cd079a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e91bcda3-165e-4653-8466-a55929cd079a" (UID: "e91bcda3-165e-4653-8466-a55929cd079a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:09:02 crc kubenswrapper[4895]: I1202 09:09:02.046353 4895 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e91bcda3-165e-4653-8466-a55929cd079a-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 02 09:09:02 crc kubenswrapper[4895]: I1202 09:09:02.046447 4895 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e91bcda3-165e-4653-8466-a55929cd079a-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 02 09:09:02 crc kubenswrapper[4895]: I1202 09:09:02.046461 4895 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e91bcda3-165e-4653-8466-a55929cd079a-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 02 09:09:02 crc kubenswrapper[4895]: I1202 09:09:02.046476 4895 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e91bcda3-165e-4653-8466-a55929cd079a-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 09:09:02 crc kubenswrapper[4895]: I1202 09:09:02.046489 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pklvx\" (UniqueName: \"kubernetes.io/projected/e91bcda3-165e-4653-8466-a55929cd079a-kube-api-access-pklvx\") on node \"crc\" DevicePath \"\"" Dec 02 09:09:02 crc kubenswrapper[4895]: I1202 09:09:02.046500 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e91bcda3-165e-4653-8466-a55929cd079a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 09:09:02 crc kubenswrapper[4895]: I1202 09:09:02.078298 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e91bcda3-165e-4653-8466-a55929cd079a-config-data" (OuterVolumeSpecName: "config-data") pod "e91bcda3-165e-4653-8466-a55929cd079a" (UID: "e91bcda3-165e-4653-8466-a55929cd079a"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:09:02 crc kubenswrapper[4895]: I1202 09:09:02.123871 4895 scope.go:117] "RemoveContainer" containerID="54a6e0f44224d3b9916a4d30b4630346dbb6d98d3ffb00cc224e4b670fdb75ea" Dec 02 09:09:02 crc kubenswrapper[4895]: I1202 09:09:02.148379 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e91bcda3-165e-4653-8466-a55929cd079a-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 09:09:02 crc kubenswrapper[4895]: I1202 09:09:02.148453 4895 scope.go:117] "RemoveContainer" containerID="cc4746411154f091a19dfb1df18109972fc932ef7b719f505a7f49259d18d425" Dec 02 09:09:02 crc kubenswrapper[4895]: I1202 09:09:02.171570 4895 scope.go:117] "RemoveContainer" containerID="451ce9f03a5aad85acf4917cca261997e73467037de862bb240e32a24695c849" Dec 02 09:09:02 crc kubenswrapper[4895]: I1202 09:09:02.198417 4895 scope.go:117] "RemoveContainer" containerID="06dc04f681f086bb8f9528809d20c5110e3451af6c9f6efc6ff0ecefee6bb7ae" Dec 02 09:09:02 crc kubenswrapper[4895]: E1202 09:09:02.198982 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06dc04f681f086bb8f9528809d20c5110e3451af6c9f6efc6ff0ecefee6bb7ae\": container with ID starting with 06dc04f681f086bb8f9528809d20c5110e3451af6c9f6efc6ff0ecefee6bb7ae not found: ID does not exist" containerID="06dc04f681f086bb8f9528809d20c5110e3451af6c9f6efc6ff0ecefee6bb7ae" Dec 02 09:09:02 crc kubenswrapper[4895]: I1202 09:09:02.199024 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06dc04f681f086bb8f9528809d20c5110e3451af6c9f6efc6ff0ecefee6bb7ae"} err="failed to get container status \"06dc04f681f086bb8f9528809d20c5110e3451af6c9f6efc6ff0ecefee6bb7ae\": rpc error: code = NotFound desc = could not find container \"06dc04f681f086bb8f9528809d20c5110e3451af6c9f6efc6ff0ecefee6bb7ae\": 
container with ID starting with 06dc04f681f086bb8f9528809d20c5110e3451af6c9f6efc6ff0ecefee6bb7ae not found: ID does not exist" Dec 02 09:09:02 crc kubenswrapper[4895]: I1202 09:09:02.199047 4895 scope.go:117] "RemoveContainer" containerID="54a6e0f44224d3b9916a4d30b4630346dbb6d98d3ffb00cc224e4b670fdb75ea" Dec 02 09:09:02 crc kubenswrapper[4895]: E1202 09:09:02.199641 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54a6e0f44224d3b9916a4d30b4630346dbb6d98d3ffb00cc224e4b670fdb75ea\": container with ID starting with 54a6e0f44224d3b9916a4d30b4630346dbb6d98d3ffb00cc224e4b670fdb75ea not found: ID does not exist" containerID="54a6e0f44224d3b9916a4d30b4630346dbb6d98d3ffb00cc224e4b670fdb75ea" Dec 02 09:09:02 crc kubenswrapper[4895]: I1202 09:09:02.199832 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54a6e0f44224d3b9916a4d30b4630346dbb6d98d3ffb00cc224e4b670fdb75ea"} err="failed to get container status \"54a6e0f44224d3b9916a4d30b4630346dbb6d98d3ffb00cc224e4b670fdb75ea\": rpc error: code = NotFound desc = could not find container \"54a6e0f44224d3b9916a4d30b4630346dbb6d98d3ffb00cc224e4b670fdb75ea\": container with ID starting with 54a6e0f44224d3b9916a4d30b4630346dbb6d98d3ffb00cc224e4b670fdb75ea not found: ID does not exist" Dec 02 09:09:02 crc kubenswrapper[4895]: I1202 09:09:02.200009 4895 scope.go:117] "RemoveContainer" containerID="cc4746411154f091a19dfb1df18109972fc932ef7b719f505a7f49259d18d425" Dec 02 09:09:02 crc kubenswrapper[4895]: E1202 09:09:02.200475 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc4746411154f091a19dfb1df18109972fc932ef7b719f505a7f49259d18d425\": container with ID starting with cc4746411154f091a19dfb1df18109972fc932ef7b719f505a7f49259d18d425 not found: ID does not exist" 
containerID="cc4746411154f091a19dfb1df18109972fc932ef7b719f505a7f49259d18d425" Dec 02 09:09:02 crc kubenswrapper[4895]: I1202 09:09:02.200593 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc4746411154f091a19dfb1df18109972fc932ef7b719f505a7f49259d18d425"} err="failed to get container status \"cc4746411154f091a19dfb1df18109972fc932ef7b719f505a7f49259d18d425\": rpc error: code = NotFound desc = could not find container \"cc4746411154f091a19dfb1df18109972fc932ef7b719f505a7f49259d18d425\": container with ID starting with cc4746411154f091a19dfb1df18109972fc932ef7b719f505a7f49259d18d425 not found: ID does not exist" Dec 02 09:09:02 crc kubenswrapper[4895]: I1202 09:09:02.200681 4895 scope.go:117] "RemoveContainer" containerID="451ce9f03a5aad85acf4917cca261997e73467037de862bb240e32a24695c849" Dec 02 09:09:02 crc kubenswrapper[4895]: E1202 09:09:02.201149 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"451ce9f03a5aad85acf4917cca261997e73467037de862bb240e32a24695c849\": container with ID starting with 451ce9f03a5aad85acf4917cca261997e73467037de862bb240e32a24695c849 not found: ID does not exist" containerID="451ce9f03a5aad85acf4917cca261997e73467037de862bb240e32a24695c849" Dec 02 09:09:02 crc kubenswrapper[4895]: I1202 09:09:02.201264 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"451ce9f03a5aad85acf4917cca261997e73467037de862bb240e32a24695c849"} err="failed to get container status \"451ce9f03a5aad85acf4917cca261997e73467037de862bb240e32a24695c849\": rpc error: code = NotFound desc = could not find container \"451ce9f03a5aad85acf4917cca261997e73467037de862bb240e32a24695c849\": container with ID starting with 451ce9f03a5aad85acf4917cca261997e73467037de862bb240e32a24695c849 not found: ID does not exist" Dec 02 09:09:02 crc kubenswrapper[4895]: I1202 09:09:02.473332 4895 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 02 09:09:02 crc kubenswrapper[4895]: I1202 09:09:02.494774 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 02 09:09:02 crc kubenswrapper[4895]: I1202 09:09:02.502724 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 02 09:09:02 crc kubenswrapper[4895]: E1202 09:09:02.503243 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e91bcda3-165e-4653-8466-a55929cd079a" containerName="proxy-httpd" Dec 02 09:09:02 crc kubenswrapper[4895]: I1202 09:09:02.503262 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="e91bcda3-165e-4653-8466-a55929cd079a" containerName="proxy-httpd" Dec 02 09:09:02 crc kubenswrapper[4895]: E1202 09:09:02.503284 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e91bcda3-165e-4653-8466-a55929cd079a" containerName="ceilometer-notification-agent" Dec 02 09:09:02 crc kubenswrapper[4895]: I1202 09:09:02.503292 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="e91bcda3-165e-4653-8466-a55929cd079a" containerName="ceilometer-notification-agent" Dec 02 09:09:02 crc kubenswrapper[4895]: E1202 09:09:02.503302 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e91bcda3-165e-4653-8466-a55929cd079a" containerName="sg-core" Dec 02 09:09:02 crc kubenswrapper[4895]: I1202 09:09:02.503309 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="e91bcda3-165e-4653-8466-a55929cd079a" containerName="sg-core" Dec 02 09:09:02 crc kubenswrapper[4895]: E1202 09:09:02.503335 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e91bcda3-165e-4653-8466-a55929cd079a" containerName="ceilometer-central-agent" Dec 02 09:09:02 crc kubenswrapper[4895]: I1202 09:09:02.503341 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="e91bcda3-165e-4653-8466-a55929cd079a" containerName="ceilometer-central-agent" Dec 02 09:09:02 crc 
kubenswrapper[4895]: I1202 09:09:02.503530 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="e91bcda3-165e-4653-8466-a55929cd079a" containerName="ceilometer-notification-agent" Dec 02 09:09:02 crc kubenswrapper[4895]: I1202 09:09:02.503553 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="e91bcda3-165e-4653-8466-a55929cd079a" containerName="ceilometer-central-agent" Dec 02 09:09:02 crc kubenswrapper[4895]: I1202 09:09:02.503569 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="e91bcda3-165e-4653-8466-a55929cd079a" containerName="proxy-httpd" Dec 02 09:09:02 crc kubenswrapper[4895]: I1202 09:09:02.503581 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="e91bcda3-165e-4653-8466-a55929cd079a" containerName="sg-core" Dec 02 09:09:02 crc kubenswrapper[4895]: I1202 09:09:02.505857 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 02 09:09:02 crc kubenswrapper[4895]: I1202 09:09:02.510318 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 02 09:09:02 crc kubenswrapper[4895]: I1202 09:09:02.511449 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 02 09:09:02 crc kubenswrapper[4895]: I1202 09:09:02.512655 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 02 09:09:02 crc kubenswrapper[4895]: I1202 09:09:02.665908 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef63dd40-d042-4334-9fab-cdc72afccbb4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ef63dd40-d042-4334-9fab-cdc72afccbb4\") " pod="openstack/ceilometer-0" Dec 02 09:09:02 crc kubenswrapper[4895]: I1202 09:09:02.666058 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ef63dd40-d042-4334-9fab-cdc72afccbb4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ef63dd40-d042-4334-9fab-cdc72afccbb4\") " pod="openstack/ceilometer-0" Dec 02 09:09:02 crc kubenswrapper[4895]: I1202 09:09:02.666099 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ef63dd40-d042-4334-9fab-cdc72afccbb4-log-httpd\") pod \"ceilometer-0\" (UID: \"ef63dd40-d042-4334-9fab-cdc72afccbb4\") " pod="openstack/ceilometer-0" Dec 02 09:09:02 crc kubenswrapper[4895]: I1202 09:09:02.666130 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef63dd40-d042-4334-9fab-cdc72afccbb4-config-data\") pod \"ceilometer-0\" (UID: \"ef63dd40-d042-4334-9fab-cdc72afccbb4\") " pod="openstack/ceilometer-0" Dec 02 09:09:02 crc kubenswrapper[4895]: I1202 09:09:02.666260 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wchz\" (UniqueName: \"kubernetes.io/projected/ef63dd40-d042-4334-9fab-cdc72afccbb4-kube-api-access-4wchz\") pod \"ceilometer-0\" (UID: \"ef63dd40-d042-4334-9fab-cdc72afccbb4\") " pod="openstack/ceilometer-0" Dec 02 09:09:02 crc kubenswrapper[4895]: I1202 09:09:02.666280 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ef63dd40-d042-4334-9fab-cdc72afccbb4-run-httpd\") pod \"ceilometer-0\" (UID: \"ef63dd40-d042-4334-9fab-cdc72afccbb4\") " pod="openstack/ceilometer-0" Dec 02 09:09:02 crc kubenswrapper[4895]: I1202 09:09:02.666313 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef63dd40-d042-4334-9fab-cdc72afccbb4-scripts\") pod \"ceilometer-0\" (UID: 
\"ef63dd40-d042-4334-9fab-cdc72afccbb4\") " pod="openstack/ceilometer-0" Dec 02 09:09:02 crc kubenswrapper[4895]: I1202 09:09:02.769071 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef63dd40-d042-4334-9fab-cdc72afccbb4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ef63dd40-d042-4334-9fab-cdc72afccbb4\") " pod="openstack/ceilometer-0" Dec 02 09:09:02 crc kubenswrapper[4895]: I1202 09:09:02.769514 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ef63dd40-d042-4334-9fab-cdc72afccbb4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ef63dd40-d042-4334-9fab-cdc72afccbb4\") " pod="openstack/ceilometer-0" Dec 02 09:09:02 crc kubenswrapper[4895]: I1202 09:09:02.769563 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ef63dd40-d042-4334-9fab-cdc72afccbb4-log-httpd\") pod \"ceilometer-0\" (UID: \"ef63dd40-d042-4334-9fab-cdc72afccbb4\") " pod="openstack/ceilometer-0" Dec 02 09:09:02 crc kubenswrapper[4895]: I1202 09:09:02.769606 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef63dd40-d042-4334-9fab-cdc72afccbb4-config-data\") pod \"ceilometer-0\" (UID: \"ef63dd40-d042-4334-9fab-cdc72afccbb4\") " pod="openstack/ceilometer-0" Dec 02 09:09:02 crc kubenswrapper[4895]: I1202 09:09:02.769718 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wchz\" (UniqueName: \"kubernetes.io/projected/ef63dd40-d042-4334-9fab-cdc72afccbb4-kube-api-access-4wchz\") pod \"ceilometer-0\" (UID: \"ef63dd40-d042-4334-9fab-cdc72afccbb4\") " pod="openstack/ceilometer-0" Dec 02 09:09:02 crc kubenswrapper[4895]: I1202 09:09:02.769768 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ef63dd40-d042-4334-9fab-cdc72afccbb4-run-httpd\") pod \"ceilometer-0\" (UID: \"ef63dd40-d042-4334-9fab-cdc72afccbb4\") " pod="openstack/ceilometer-0" Dec 02 09:09:02 crc kubenswrapper[4895]: I1202 09:09:02.769808 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef63dd40-d042-4334-9fab-cdc72afccbb4-scripts\") pod \"ceilometer-0\" (UID: \"ef63dd40-d042-4334-9fab-cdc72afccbb4\") " pod="openstack/ceilometer-0" Dec 02 09:09:02 crc kubenswrapper[4895]: I1202 09:09:02.771812 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ef63dd40-d042-4334-9fab-cdc72afccbb4-log-httpd\") pod \"ceilometer-0\" (UID: \"ef63dd40-d042-4334-9fab-cdc72afccbb4\") " pod="openstack/ceilometer-0" Dec 02 09:09:02 crc kubenswrapper[4895]: I1202 09:09:02.773336 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ef63dd40-d042-4334-9fab-cdc72afccbb4-run-httpd\") pod \"ceilometer-0\" (UID: \"ef63dd40-d042-4334-9fab-cdc72afccbb4\") " pod="openstack/ceilometer-0" Dec 02 09:09:02 crc kubenswrapper[4895]: I1202 09:09:02.775451 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef63dd40-d042-4334-9fab-cdc72afccbb4-scripts\") pod \"ceilometer-0\" (UID: \"ef63dd40-d042-4334-9fab-cdc72afccbb4\") " pod="openstack/ceilometer-0" Dec 02 09:09:02 crc kubenswrapper[4895]: I1202 09:09:02.775904 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ef63dd40-d042-4334-9fab-cdc72afccbb4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ef63dd40-d042-4334-9fab-cdc72afccbb4\") " pod="openstack/ceilometer-0" Dec 02 09:09:02 crc kubenswrapper[4895]: I1202 09:09:02.777512 4895 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef63dd40-d042-4334-9fab-cdc72afccbb4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ef63dd40-d042-4334-9fab-cdc72afccbb4\") " pod="openstack/ceilometer-0" Dec 02 09:09:02 crc kubenswrapper[4895]: I1202 09:09:02.788929 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef63dd40-d042-4334-9fab-cdc72afccbb4-config-data\") pod \"ceilometer-0\" (UID: \"ef63dd40-d042-4334-9fab-cdc72afccbb4\") " pod="openstack/ceilometer-0" Dec 02 09:09:02 crc kubenswrapper[4895]: I1202 09:09:02.791106 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wchz\" (UniqueName: \"kubernetes.io/projected/ef63dd40-d042-4334-9fab-cdc72afccbb4-kube-api-access-4wchz\") pod \"ceilometer-0\" (UID: \"ef63dd40-d042-4334-9fab-cdc72afccbb4\") " pod="openstack/ceilometer-0" Dec 02 09:09:02 crc kubenswrapper[4895]: I1202 09:09:02.868585 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 02 09:09:03 crc kubenswrapper[4895]: I1202 09:09:03.156116 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e91bcda3-165e-4653-8466-a55929cd079a" path="/var/lib/kubelet/pods/e91bcda3-165e-4653-8466-a55929cd079a/volumes" Dec 02 09:09:03 crc kubenswrapper[4895]: I1202 09:09:03.387886 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 02 09:09:03 crc kubenswrapper[4895]: W1202 09:09:03.399725 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podef63dd40_d042_4334_9fab_cdc72afccbb4.slice/crio-0d3235afeaf6c6134d87160920adf7820f1fe965df01de049c4b03345e9de38b WatchSource:0}: Error finding container 0d3235afeaf6c6134d87160920adf7820f1fe965df01de049c4b03345e9de38b: Status 404 returned error can't find the container with id 0d3235afeaf6c6134d87160920adf7820f1fe965df01de049c4b03345e9de38b Dec 02 09:09:04 crc kubenswrapper[4895]: I1202 09:09:04.132039 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ef63dd40-d042-4334-9fab-cdc72afccbb4","Type":"ContainerStarted","Data":"0d3235afeaf6c6134d87160920adf7820f1fe965df01de049c4b03345e9de38b"} Dec 02 09:09:05 crc kubenswrapper[4895]: I1202 09:09:05.154156 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ef63dd40-d042-4334-9fab-cdc72afccbb4","Type":"ContainerStarted","Data":"a13a240e587902c6d288be4491d5210ce87c3b9478fcd34b69bde2ae41655350"} Dec 02 09:09:05 crc kubenswrapper[4895]: I1202 09:09:05.473717 4895 patch_prober.go:28] interesting pod/machine-config-daemon-wfcg7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 09:09:05 crc kubenswrapper[4895]: I1202 
09:09:05.474167 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 09:09:06 crc kubenswrapper[4895]: I1202 09:09:06.170005 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"b3bd61e8-c5af-493e-b789-d517f04a8f70","Type":"ContainerStarted","Data":"53e06067965cb69b15acbd6705603c4599cdae5a4eab8d9d69b46c9cdccbccc0"} Dec 02 09:09:06 crc kubenswrapper[4895]: I1202 09:09:06.174005 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ef63dd40-d042-4334-9fab-cdc72afccbb4","Type":"ContainerStarted","Data":"c8026d8de11da84d53b750e9f8ab511b35a0f07e9ce773f7ebfe18f487507271"} Dec 02 09:09:07 crc kubenswrapper[4895]: I1202 09:09:07.186898 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ef63dd40-d042-4334-9fab-cdc72afccbb4","Type":"ContainerStarted","Data":"5368a2dc678174c38465f71f6d220df741d4442c0fb4e41d6cb34f80af141160"} Dec 02 09:09:09 crc kubenswrapper[4895]: I1202 09:09:09.211666 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ef63dd40-d042-4334-9fab-cdc72afccbb4","Type":"ContainerStarted","Data":"aebe3172a052fcffa26da96fce0a7aaa27e0ebc7ceec9bf8a1e40f344dcf6298"} Dec 02 09:09:09 crc kubenswrapper[4895]: I1202 09:09:09.214407 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 02 09:09:09 crc kubenswrapper[4895]: I1202 09:09:09.222068 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"b3bd61e8-c5af-493e-b789-d517f04a8f70","Type":"ContainerStarted","Data":"2e6af45a3dfdfe5acb51697b6e554b40c230feb806ff5d47ea075e07ff2c5d4a"} Dec 02 
09:09:09 crc kubenswrapper[4895]: I1202 09:09:09.247391 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.200701501 podStartE2EDuration="7.247369496s" podCreationTimestamp="2025-12-02 09:09:02 +0000 UTC" firstStartedPulling="2025-12-02 09:09:03.403251344 +0000 UTC m=+6354.574110957" lastFinishedPulling="2025-12-02 09:09:08.449919329 +0000 UTC m=+6359.620778952" observedRunningTime="2025-12-02 09:09:09.243536527 +0000 UTC m=+6360.414396140" watchObservedRunningTime="2025-12-02 09:09:09.247369496 +0000 UTC m=+6360.418229109" Dec 02 09:09:09 crc kubenswrapper[4895]: I1202 09:09:09.276598 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=2.652494712 podStartE2EDuration="13.276567925s" podCreationTimestamp="2025-12-02 09:08:56 +0000 UTC" firstStartedPulling="2025-12-02 09:08:57.822881014 +0000 UTC m=+6348.993740627" lastFinishedPulling="2025-12-02 09:09:08.446954227 +0000 UTC m=+6359.617813840" observedRunningTime="2025-12-02 09:09:09.271121626 +0000 UTC m=+6360.441981249" watchObservedRunningTime="2025-12-02 09:09:09.276567925 +0000 UTC m=+6360.447427538" Dec 02 09:09:15 crc kubenswrapper[4895]: I1202 09:09:15.000699 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-create-5htr7"] Dec 02 09:09:15 crc kubenswrapper[4895]: I1202 09:09:15.002997 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-create-5htr7" Dec 02 09:09:15 crc kubenswrapper[4895]: I1202 09:09:15.012259 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-5htr7"] Dec 02 09:09:15 crc kubenswrapper[4895]: I1202 09:09:15.079289 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5c628159-a2bd-4d35-b4a5-9d9e1e588259-operator-scripts\") pod \"manila-db-create-5htr7\" (UID: \"5c628159-a2bd-4d35-b4a5-9d9e1e588259\") " pod="openstack/manila-db-create-5htr7" Dec 02 09:09:15 crc kubenswrapper[4895]: I1202 09:09:15.079497 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdsjp\" (UniqueName: \"kubernetes.io/projected/5c628159-a2bd-4d35-b4a5-9d9e1e588259-kube-api-access-sdsjp\") pod \"manila-db-create-5htr7\" (UID: \"5c628159-a2bd-4d35-b4a5-9d9e1e588259\") " pod="openstack/manila-db-create-5htr7" Dec 02 09:09:15 crc kubenswrapper[4895]: I1202 09:09:15.182611 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5c628159-a2bd-4d35-b4a5-9d9e1e588259-operator-scripts\") pod \"manila-db-create-5htr7\" (UID: \"5c628159-a2bd-4d35-b4a5-9d9e1e588259\") " pod="openstack/manila-db-create-5htr7" Dec 02 09:09:15 crc kubenswrapper[4895]: I1202 09:09:15.182875 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdsjp\" (UniqueName: \"kubernetes.io/projected/5c628159-a2bd-4d35-b4a5-9d9e1e588259-kube-api-access-sdsjp\") pod \"manila-db-create-5htr7\" (UID: \"5c628159-a2bd-4d35-b4a5-9d9e1e588259\") " pod="openstack/manila-db-create-5htr7" Dec 02 09:09:15 crc kubenswrapper[4895]: I1202 09:09:15.183734 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/5c628159-a2bd-4d35-b4a5-9d9e1e588259-operator-scripts\") pod \"manila-db-create-5htr7\" (UID: \"5c628159-a2bd-4d35-b4a5-9d9e1e588259\") " pod="openstack/manila-db-create-5htr7" Dec 02 09:09:15 crc kubenswrapper[4895]: I1202 09:09:15.209021 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdsjp\" (UniqueName: \"kubernetes.io/projected/5c628159-a2bd-4d35-b4a5-9d9e1e588259-kube-api-access-sdsjp\") pod \"manila-db-create-5htr7\" (UID: \"5c628159-a2bd-4d35-b4a5-9d9e1e588259\") " pod="openstack/manila-db-create-5htr7" Dec 02 09:09:15 crc kubenswrapper[4895]: I1202 09:09:15.221738 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-90d7-account-create-update-8786c"] Dec 02 09:09:15 crc kubenswrapper[4895]: I1202 09:09:15.223951 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-90d7-account-create-update-8786c" Dec 02 09:09:15 crc kubenswrapper[4895]: I1202 09:09:15.227085 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-db-secret" Dec 02 09:09:15 crc kubenswrapper[4895]: I1202 09:09:15.241373 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-90d7-account-create-update-8786c"] Dec 02 09:09:15 crc kubenswrapper[4895]: I1202 09:09:15.285972 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fc12c784-6db5-424b-b4e9-a7c0ce42ebb1-operator-scripts\") pod \"manila-90d7-account-create-update-8786c\" (UID: \"fc12c784-6db5-424b-b4e9-a7c0ce42ebb1\") " pod="openstack/manila-90d7-account-create-update-8786c" Dec 02 09:09:15 crc kubenswrapper[4895]: I1202 09:09:15.286968 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hmvh\" (UniqueName: 
\"kubernetes.io/projected/fc12c784-6db5-424b-b4e9-a7c0ce42ebb1-kube-api-access-6hmvh\") pod \"manila-90d7-account-create-update-8786c\" (UID: \"fc12c784-6db5-424b-b4e9-a7c0ce42ebb1\") " pod="openstack/manila-90d7-account-create-update-8786c" Dec 02 09:09:15 crc kubenswrapper[4895]: I1202 09:09:15.328250 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-5htr7" Dec 02 09:09:15 crc kubenswrapper[4895]: I1202 09:09:15.388955 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6hmvh\" (UniqueName: \"kubernetes.io/projected/fc12c784-6db5-424b-b4e9-a7c0ce42ebb1-kube-api-access-6hmvh\") pod \"manila-90d7-account-create-update-8786c\" (UID: \"fc12c784-6db5-424b-b4e9-a7c0ce42ebb1\") " pod="openstack/manila-90d7-account-create-update-8786c" Dec 02 09:09:15 crc kubenswrapper[4895]: I1202 09:09:15.389063 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fc12c784-6db5-424b-b4e9-a7c0ce42ebb1-operator-scripts\") pod \"manila-90d7-account-create-update-8786c\" (UID: \"fc12c784-6db5-424b-b4e9-a7c0ce42ebb1\") " pod="openstack/manila-90d7-account-create-update-8786c" Dec 02 09:09:15 crc kubenswrapper[4895]: I1202 09:09:15.390273 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fc12c784-6db5-424b-b4e9-a7c0ce42ebb1-operator-scripts\") pod \"manila-90d7-account-create-update-8786c\" (UID: \"fc12c784-6db5-424b-b4e9-a7c0ce42ebb1\") " pod="openstack/manila-90d7-account-create-update-8786c" Dec 02 09:09:15 crc kubenswrapper[4895]: I1202 09:09:15.411831 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hmvh\" (UniqueName: \"kubernetes.io/projected/fc12c784-6db5-424b-b4e9-a7c0ce42ebb1-kube-api-access-6hmvh\") pod \"manila-90d7-account-create-update-8786c\" (UID: 
\"fc12c784-6db5-424b-b4e9-a7c0ce42ebb1\") " pod="openstack/manila-90d7-account-create-update-8786c" Dec 02 09:09:15 crc kubenswrapper[4895]: I1202 09:09:15.593689 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-90d7-account-create-update-8786c" Dec 02 09:09:15 crc kubenswrapper[4895]: I1202 09:09:15.966533 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-5htr7"] Dec 02 09:09:16 crc kubenswrapper[4895]: W1202 09:09:16.032099 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5c628159_a2bd_4d35_b4a5_9d9e1e588259.slice/crio-652b0763c0bf15eea2e5fb99ccd593b981a19db6e85351ed15c3e9cf804ae613 WatchSource:0}: Error finding container 652b0763c0bf15eea2e5fb99ccd593b981a19db6e85351ed15c3e9cf804ae613: Status 404 returned error can't find the container with id 652b0763c0bf15eea2e5fb99ccd593b981a19db6e85351ed15c3e9cf804ae613 Dec 02 09:09:16 crc kubenswrapper[4895]: I1202 09:09:16.396347 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-90d7-account-create-update-8786c" event={"ID":"fc12c784-6db5-424b-b4e9-a7c0ce42ebb1","Type":"ContainerStarted","Data":"272a5db5461036ff8a10d7de7071e584f48f4de8ee8ce06dcd5ef47135023ed6"} Dec 02 09:09:16 crc kubenswrapper[4895]: I1202 09:09:16.398255 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-5htr7" event={"ID":"5c628159-a2bd-4d35-b4a5-9d9e1e588259","Type":"ContainerStarted","Data":"903e854846e2e475f2fdfc57d3ee2aa0cab38c1cf3d4137e7bed182ae0077a3c"} Dec 02 09:09:16 crc kubenswrapper[4895]: I1202 09:09:16.398290 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-5htr7" event={"ID":"5c628159-a2bd-4d35-b4a5-9d9e1e588259","Type":"ContainerStarted","Data":"652b0763c0bf15eea2e5fb99ccd593b981a19db6e85351ed15c3e9cf804ae613"} Dec 02 09:09:16 crc kubenswrapper[4895]: I1202 
09:09:16.403013 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-90d7-account-create-update-8786c"] Dec 02 09:09:16 crc kubenswrapper[4895]: I1202 09:09:16.419327 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-db-create-5htr7" podStartSLOduration=2.419299936 podStartE2EDuration="2.419299936s" podCreationTimestamp="2025-12-02 09:09:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:09:16.418327495 +0000 UTC m=+6367.589187118" watchObservedRunningTime="2025-12-02 09:09:16.419299936 +0000 UTC m=+6367.590159549" Dec 02 09:09:17 crc kubenswrapper[4895]: I1202 09:09:17.414765 4895 generic.go:334] "Generic (PLEG): container finished" podID="fc12c784-6db5-424b-b4e9-a7c0ce42ebb1" containerID="dd7f74a0de2f532d393eb84b88aca17b7252df402340db56ac66ea3ef532630a" exitCode=0 Dec 02 09:09:17 crc kubenswrapper[4895]: I1202 09:09:17.414870 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-90d7-account-create-update-8786c" event={"ID":"fc12c784-6db5-424b-b4e9-a7c0ce42ebb1","Type":"ContainerDied","Data":"dd7f74a0de2f532d393eb84b88aca17b7252df402340db56ac66ea3ef532630a"} Dec 02 09:09:17 crc kubenswrapper[4895]: I1202 09:09:17.419466 4895 generic.go:334] "Generic (PLEG): container finished" podID="5c628159-a2bd-4d35-b4a5-9d9e1e588259" containerID="903e854846e2e475f2fdfc57d3ee2aa0cab38c1cf3d4137e7bed182ae0077a3c" exitCode=0 Dec 02 09:09:17 crc kubenswrapper[4895]: I1202 09:09:17.419548 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-5htr7" event={"ID":"5c628159-a2bd-4d35-b4a5-9d9e1e588259","Type":"ContainerDied","Data":"903e854846e2e475f2fdfc57d3ee2aa0cab38c1cf3d4137e7bed182ae0077a3c"} Dec 02 09:09:18 crc kubenswrapper[4895]: I1202 09:09:18.918152 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-90d7-account-create-update-8786c" Dec 02 09:09:18 crc kubenswrapper[4895]: I1202 09:09:18.927049 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-5htr7" Dec 02 09:09:19 crc kubenswrapper[4895]: I1202 09:09:19.198418 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sdsjp\" (UniqueName: \"kubernetes.io/projected/5c628159-a2bd-4d35-b4a5-9d9e1e588259-kube-api-access-sdsjp\") pod \"5c628159-a2bd-4d35-b4a5-9d9e1e588259\" (UID: \"5c628159-a2bd-4d35-b4a5-9d9e1e588259\") " Dec 02 09:09:19 crc kubenswrapper[4895]: I1202 09:09:19.198561 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5c628159-a2bd-4d35-b4a5-9d9e1e588259-operator-scripts\") pod \"5c628159-a2bd-4d35-b4a5-9d9e1e588259\" (UID: \"5c628159-a2bd-4d35-b4a5-9d9e1e588259\") " Dec 02 09:09:19 crc kubenswrapper[4895]: I1202 09:09:19.198654 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6hmvh\" (UniqueName: \"kubernetes.io/projected/fc12c784-6db5-424b-b4e9-a7c0ce42ebb1-kube-api-access-6hmvh\") pod \"fc12c784-6db5-424b-b4e9-a7c0ce42ebb1\" (UID: \"fc12c784-6db5-424b-b4e9-a7c0ce42ebb1\") " Dec 02 09:09:19 crc kubenswrapper[4895]: I1202 09:09:19.198706 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fc12c784-6db5-424b-b4e9-a7c0ce42ebb1-operator-scripts\") pod \"fc12c784-6db5-424b-b4e9-a7c0ce42ebb1\" (UID: \"fc12c784-6db5-424b-b4e9-a7c0ce42ebb1\") " Dec 02 09:09:19 crc kubenswrapper[4895]: I1202 09:09:19.200324 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc12c784-6db5-424b-b4e9-a7c0ce42ebb1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod 
"fc12c784-6db5-424b-b4e9-a7c0ce42ebb1" (UID: "fc12c784-6db5-424b-b4e9-a7c0ce42ebb1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:09:19 crc kubenswrapper[4895]: I1202 09:09:19.200846 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c628159-a2bd-4d35-b4a5-9d9e1e588259-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5c628159-a2bd-4d35-b4a5-9d9e1e588259" (UID: "5c628159-a2bd-4d35-b4a5-9d9e1e588259"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:09:19 crc kubenswrapper[4895]: I1202 09:09:19.206279 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c628159-a2bd-4d35-b4a5-9d9e1e588259-kube-api-access-sdsjp" (OuterVolumeSpecName: "kube-api-access-sdsjp") pod "5c628159-a2bd-4d35-b4a5-9d9e1e588259" (UID: "5c628159-a2bd-4d35-b4a5-9d9e1e588259"). InnerVolumeSpecName "kube-api-access-sdsjp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:09:19 crc kubenswrapper[4895]: I1202 09:09:19.208973 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc12c784-6db5-424b-b4e9-a7c0ce42ebb1-kube-api-access-6hmvh" (OuterVolumeSpecName: "kube-api-access-6hmvh") pod "fc12c784-6db5-424b-b4e9-a7c0ce42ebb1" (UID: "fc12c784-6db5-424b-b4e9-a7c0ce42ebb1"). InnerVolumeSpecName "kube-api-access-6hmvh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:09:19 crc kubenswrapper[4895]: I1202 09:09:19.301085 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6hmvh\" (UniqueName: \"kubernetes.io/projected/fc12c784-6db5-424b-b4e9-a7c0ce42ebb1-kube-api-access-6hmvh\") on node \"crc\" DevicePath \"\"" Dec 02 09:09:19 crc kubenswrapper[4895]: I1202 09:09:19.301124 4895 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fc12c784-6db5-424b-b4e9-a7c0ce42ebb1-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 09:09:19 crc kubenswrapper[4895]: I1202 09:09:19.301140 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sdsjp\" (UniqueName: \"kubernetes.io/projected/5c628159-a2bd-4d35-b4a5-9d9e1e588259-kube-api-access-sdsjp\") on node \"crc\" DevicePath \"\"" Dec 02 09:09:19 crc kubenswrapper[4895]: I1202 09:09:19.301153 4895 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5c628159-a2bd-4d35-b4a5-9d9e1e588259-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 09:09:19 crc kubenswrapper[4895]: I1202 09:09:19.441188 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-5htr7" event={"ID":"5c628159-a2bd-4d35-b4a5-9d9e1e588259","Type":"ContainerDied","Data":"652b0763c0bf15eea2e5fb99ccd593b981a19db6e85351ed15c3e9cf804ae613"} Dec 02 09:09:19 crc kubenswrapper[4895]: I1202 09:09:19.441234 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="652b0763c0bf15eea2e5fb99ccd593b981a19db6e85351ed15c3e9cf804ae613" Dec 02 09:09:19 crc kubenswrapper[4895]: I1202 09:09:19.441251 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-create-5htr7" Dec 02 09:09:19 crc kubenswrapper[4895]: I1202 09:09:19.443542 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-90d7-account-create-update-8786c" event={"ID":"fc12c784-6db5-424b-b4e9-a7c0ce42ebb1","Type":"ContainerDied","Data":"272a5db5461036ff8a10d7de7071e584f48f4de8ee8ce06dcd5ef47135023ed6"} Dec 02 09:09:19 crc kubenswrapper[4895]: I1202 09:09:19.443568 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="272a5db5461036ff8a10d7de7071e584f48f4de8ee8ce06dcd5ef47135023ed6" Dec 02 09:09:19 crc kubenswrapper[4895]: I1202 09:09:19.443591 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-90d7-account-create-update-8786c" Dec 02 09:09:20 crc kubenswrapper[4895]: I1202 09:09:20.509249 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-sync-f55lv"] Dec 02 09:09:20 crc kubenswrapper[4895]: E1202 09:09:20.511627 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c628159-a2bd-4d35-b4a5-9d9e1e588259" containerName="mariadb-database-create" Dec 02 09:09:20 crc kubenswrapper[4895]: I1202 09:09:20.511686 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c628159-a2bd-4d35-b4a5-9d9e1e588259" containerName="mariadb-database-create" Dec 02 09:09:20 crc kubenswrapper[4895]: E1202 09:09:20.511726 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc12c784-6db5-424b-b4e9-a7c0ce42ebb1" containerName="mariadb-account-create-update" Dec 02 09:09:20 crc kubenswrapper[4895]: I1202 09:09:20.511735 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc12c784-6db5-424b-b4e9-a7c0ce42ebb1" containerName="mariadb-account-create-update" Dec 02 09:09:20 crc kubenswrapper[4895]: I1202 09:09:20.512018 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc12c784-6db5-424b-b4e9-a7c0ce42ebb1" 
containerName="mariadb-account-create-update" Dec 02 09:09:20 crc kubenswrapper[4895]: I1202 09:09:20.512056 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c628159-a2bd-4d35-b4a5-9d9e1e588259" containerName="mariadb-database-create" Dec 02 09:09:20 crc kubenswrapper[4895]: I1202 09:09:20.513255 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-f55lv" Dec 02 09:09:20 crc kubenswrapper[4895]: I1202 09:09:20.516104 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-lzltg" Dec 02 09:09:20 crc kubenswrapper[4895]: I1202 09:09:20.516549 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data" Dec 02 09:09:20 crc kubenswrapper[4895]: I1202 09:09:20.629120 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/780ef2e7-bdb3-4c45-98bc-64659b5a19a6-job-config-data\") pod \"manila-db-sync-f55lv\" (UID: \"780ef2e7-bdb3-4c45-98bc-64659b5a19a6\") " pod="openstack/manila-db-sync-f55lv" Dec 02 09:09:20 crc kubenswrapper[4895]: I1202 09:09:20.629252 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-skcsq\" (UniqueName: \"kubernetes.io/projected/780ef2e7-bdb3-4c45-98bc-64659b5a19a6-kube-api-access-skcsq\") pod \"manila-db-sync-f55lv\" (UID: \"780ef2e7-bdb3-4c45-98bc-64659b5a19a6\") " pod="openstack/manila-db-sync-f55lv" Dec 02 09:09:20 crc kubenswrapper[4895]: I1202 09:09:20.629292 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/780ef2e7-bdb3-4c45-98bc-64659b5a19a6-combined-ca-bundle\") pod \"manila-db-sync-f55lv\" (UID: \"780ef2e7-bdb3-4c45-98bc-64659b5a19a6\") " pod="openstack/manila-db-sync-f55lv" Dec 02 09:09:20 crc 
kubenswrapper[4895]: I1202 09:09:20.629400 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/780ef2e7-bdb3-4c45-98bc-64659b5a19a6-config-data\") pod \"manila-db-sync-f55lv\" (UID: \"780ef2e7-bdb3-4c45-98bc-64659b5a19a6\") " pod="openstack/manila-db-sync-f55lv" Dec 02 09:09:20 crc kubenswrapper[4895]: I1202 09:09:20.629554 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-f55lv"] Dec 02 09:09:20 crc kubenswrapper[4895]: I1202 09:09:20.844024 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/780ef2e7-bdb3-4c45-98bc-64659b5a19a6-job-config-data\") pod \"manila-db-sync-f55lv\" (UID: \"780ef2e7-bdb3-4c45-98bc-64659b5a19a6\") " pod="openstack/manila-db-sync-f55lv" Dec 02 09:09:20 crc kubenswrapper[4895]: I1202 09:09:20.844399 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-skcsq\" (UniqueName: \"kubernetes.io/projected/780ef2e7-bdb3-4c45-98bc-64659b5a19a6-kube-api-access-skcsq\") pod \"manila-db-sync-f55lv\" (UID: \"780ef2e7-bdb3-4c45-98bc-64659b5a19a6\") " pod="openstack/manila-db-sync-f55lv" Dec 02 09:09:20 crc kubenswrapper[4895]: I1202 09:09:20.844470 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/780ef2e7-bdb3-4c45-98bc-64659b5a19a6-combined-ca-bundle\") pod \"manila-db-sync-f55lv\" (UID: \"780ef2e7-bdb3-4c45-98bc-64659b5a19a6\") " pod="openstack/manila-db-sync-f55lv" Dec 02 09:09:20 crc kubenswrapper[4895]: I1202 09:09:20.844628 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/780ef2e7-bdb3-4c45-98bc-64659b5a19a6-config-data\") pod \"manila-db-sync-f55lv\" (UID: \"780ef2e7-bdb3-4c45-98bc-64659b5a19a6\") " 
pod="openstack/manila-db-sync-f55lv"
Dec 02 09:09:20 crc kubenswrapper[4895]: I1202 09:09:20.857629 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/780ef2e7-bdb3-4c45-98bc-64659b5a19a6-combined-ca-bundle\") pod \"manila-db-sync-f55lv\" (UID: \"780ef2e7-bdb3-4c45-98bc-64659b5a19a6\") " pod="openstack/manila-db-sync-f55lv"
Dec 02 09:09:20 crc kubenswrapper[4895]: I1202 09:09:20.858785 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/780ef2e7-bdb3-4c45-98bc-64659b5a19a6-job-config-data\") pod \"manila-db-sync-f55lv\" (UID: \"780ef2e7-bdb3-4c45-98bc-64659b5a19a6\") " pod="openstack/manila-db-sync-f55lv"
Dec 02 09:09:20 crc kubenswrapper[4895]: I1202 09:09:20.859602 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/780ef2e7-bdb3-4c45-98bc-64659b5a19a6-config-data\") pod \"manila-db-sync-f55lv\" (UID: \"780ef2e7-bdb3-4c45-98bc-64659b5a19a6\") " pod="openstack/manila-db-sync-f55lv"
Dec 02 09:09:20 crc kubenswrapper[4895]: I1202 09:09:20.883570 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-skcsq\" (UniqueName: \"kubernetes.io/projected/780ef2e7-bdb3-4c45-98bc-64659b5a19a6-kube-api-access-skcsq\") pod \"manila-db-sync-f55lv\" (UID: \"780ef2e7-bdb3-4c45-98bc-64659b5a19a6\") " pod="openstack/manila-db-sync-f55lv"
Dec 02 09:09:21 crc kubenswrapper[4895]: I1202 09:09:21.147645 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-f55lv"
Dec 02 09:09:22 crc kubenswrapper[4895]: I1202 09:09:22.081385 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-f55lv"]
Dec 02 09:09:22 crc kubenswrapper[4895]: W1202 09:09:22.089701 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod780ef2e7_bdb3_4c45_98bc_64659b5a19a6.slice/crio-01bddb76713f1bf018dc3cddf28970edadfe0a9a310cee0f7fab1a4258d0da10 WatchSource:0}: Error finding container 01bddb76713f1bf018dc3cddf28970edadfe0a9a310cee0f7fab1a4258d0da10: Status 404 returned error can't find the container with id 01bddb76713f1bf018dc3cddf28970edadfe0a9a310cee0f7fab1a4258d0da10
Dec 02 09:09:22 crc kubenswrapper[4895]: I1202 09:09:22.094168 4895 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Dec 02 09:09:22 crc kubenswrapper[4895]: I1202 09:09:22.476719 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-f55lv" event={"ID":"780ef2e7-bdb3-4c45-98bc-64659b5a19a6","Type":"ContainerStarted","Data":"01bddb76713f1bf018dc3cddf28970edadfe0a9a310cee0f7fab1a4258d0da10"}
Dec 02 09:09:28 crc kubenswrapper[4895]: I1202 09:09:28.565134 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-f55lv" event={"ID":"780ef2e7-bdb3-4c45-98bc-64659b5a19a6","Type":"ContainerStarted","Data":"db39de8422dbc6f6c72e457cdc99b283ab01473e32740b50d8ef0608dee5bbd2"}
Dec 02 09:09:28 crc kubenswrapper[4895]: I1202 09:09:28.590351 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-db-sync-f55lv" podStartSLOduration=3.652849324 podStartE2EDuration="8.590325379s" podCreationTimestamp="2025-12-02 09:09:20 +0000 UTC" firstStartedPulling="2025-12-02 09:09:22.093853749 +0000 UTC m=+6373.264713362" lastFinishedPulling="2025-12-02 09:09:27.031329804 +0000 UTC m=+6378.202189417" observedRunningTime="2025-12-02 09:09:28.582457674 +0000 UTC m=+6379.753317287" watchObservedRunningTime="2025-12-02 09:09:28.590325379 +0000 UTC m=+6379.761185012"
Dec 02 09:09:29 crc kubenswrapper[4895]: I1202 09:09:29.575334 4895 generic.go:334] "Generic (PLEG): container finished" podID="780ef2e7-bdb3-4c45-98bc-64659b5a19a6" containerID="db39de8422dbc6f6c72e457cdc99b283ab01473e32740b50d8ef0608dee5bbd2" exitCode=0
Dec 02 09:09:29 crc kubenswrapper[4895]: I1202 09:09:29.575407 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-f55lv" event={"ID":"780ef2e7-bdb3-4c45-98bc-64659b5a19a6","Type":"ContainerDied","Data":"db39de8422dbc6f6c72e457cdc99b283ab01473e32740b50d8ef0608dee5bbd2"}
Dec 02 09:09:31 crc kubenswrapper[4895]: I1202 09:09:31.047160 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-f55lv"
Dec 02 09:09:31 crc kubenswrapper[4895]: I1202 09:09:31.166653 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/780ef2e7-bdb3-4c45-98bc-64659b5a19a6-job-config-data\") pod \"780ef2e7-bdb3-4c45-98bc-64659b5a19a6\" (UID: \"780ef2e7-bdb3-4c45-98bc-64659b5a19a6\") "
Dec 02 09:09:31 crc kubenswrapper[4895]: I1202 09:09:31.166714 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/780ef2e7-bdb3-4c45-98bc-64659b5a19a6-config-data\") pod \"780ef2e7-bdb3-4c45-98bc-64659b5a19a6\" (UID: \"780ef2e7-bdb3-4c45-98bc-64659b5a19a6\") "
Dec 02 09:09:31 crc kubenswrapper[4895]: I1202 09:09:31.166983 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-skcsq\" (UniqueName: \"kubernetes.io/projected/780ef2e7-bdb3-4c45-98bc-64659b5a19a6-kube-api-access-skcsq\") pod \"780ef2e7-bdb3-4c45-98bc-64659b5a19a6\" (UID: \"780ef2e7-bdb3-4c45-98bc-64659b5a19a6\") "
Dec 02 09:09:31 crc kubenswrapper[4895]: I1202 09:09:31.167066 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/780ef2e7-bdb3-4c45-98bc-64659b5a19a6-combined-ca-bundle\") pod \"780ef2e7-bdb3-4c45-98bc-64659b5a19a6\" (UID: \"780ef2e7-bdb3-4c45-98bc-64659b5a19a6\") "
Dec 02 09:09:31 crc kubenswrapper[4895]: I1202 09:09:31.174303 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/780ef2e7-bdb3-4c45-98bc-64659b5a19a6-job-config-data" (OuterVolumeSpecName: "job-config-data") pod "780ef2e7-bdb3-4c45-98bc-64659b5a19a6" (UID: "780ef2e7-bdb3-4c45-98bc-64659b5a19a6"). InnerVolumeSpecName "job-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 09:09:31 crc kubenswrapper[4895]: I1202 09:09:31.174862 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/780ef2e7-bdb3-4c45-98bc-64659b5a19a6-kube-api-access-skcsq" (OuterVolumeSpecName: "kube-api-access-skcsq") pod "780ef2e7-bdb3-4c45-98bc-64659b5a19a6" (UID: "780ef2e7-bdb3-4c45-98bc-64659b5a19a6"). InnerVolumeSpecName "kube-api-access-skcsq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 09:09:31 crc kubenswrapper[4895]: I1202 09:09:31.183893 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/780ef2e7-bdb3-4c45-98bc-64659b5a19a6-config-data" (OuterVolumeSpecName: "config-data") pod "780ef2e7-bdb3-4c45-98bc-64659b5a19a6" (UID: "780ef2e7-bdb3-4c45-98bc-64659b5a19a6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 09:09:31 crc kubenswrapper[4895]: I1202 09:09:31.204041 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/780ef2e7-bdb3-4c45-98bc-64659b5a19a6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "780ef2e7-bdb3-4c45-98bc-64659b5a19a6" (UID: "780ef2e7-bdb3-4c45-98bc-64659b5a19a6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 09:09:31 crc kubenswrapper[4895]: I1202 09:09:31.270555 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/780ef2e7-bdb3-4c45-98bc-64659b5a19a6-config-data\") on node \"crc\" DevicePath \"\""
Dec 02 09:09:31 crc kubenswrapper[4895]: I1202 09:09:31.270586 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-skcsq\" (UniqueName: \"kubernetes.io/projected/780ef2e7-bdb3-4c45-98bc-64659b5a19a6-kube-api-access-skcsq\") on node \"crc\" DevicePath \"\""
Dec 02 09:09:31 crc kubenswrapper[4895]: I1202 09:09:31.270613 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/780ef2e7-bdb3-4c45-98bc-64659b5a19a6-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 02 09:09:31 crc kubenswrapper[4895]: I1202 09:09:31.270630 4895 reconciler_common.go:293] "Volume detached for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/780ef2e7-bdb3-4c45-98bc-64659b5a19a6-job-config-data\") on node \"crc\" DevicePath \"\""
Dec 02 09:09:31 crc kubenswrapper[4895]: I1202 09:09:31.595974 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-f55lv" event={"ID":"780ef2e7-bdb3-4c45-98bc-64659b5a19a6","Type":"ContainerDied","Data":"01bddb76713f1bf018dc3cddf28970edadfe0a9a310cee0f7fab1a4258d0da10"}
Dec 02 09:09:31 crc kubenswrapper[4895]: I1202 09:09:31.596017 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="01bddb76713f1bf018dc3cddf28970edadfe0a9a310cee0f7fab1a4258d0da10"
Dec 02 09:09:31 crc kubenswrapper[4895]: I1202 09:09:31.596083 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-f55lv"
Dec 02 09:09:31 crc kubenswrapper[4895]: I1202 09:09:31.846345 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-scheduler-0"]
Dec 02 09:09:31 crc kubenswrapper[4895]: E1202 09:09:31.847256 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="780ef2e7-bdb3-4c45-98bc-64659b5a19a6" containerName="manila-db-sync"
Dec 02 09:09:31 crc kubenswrapper[4895]: I1202 09:09:31.847280 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="780ef2e7-bdb3-4c45-98bc-64659b5a19a6" containerName="manila-db-sync"
Dec 02 09:09:31 crc kubenswrapper[4895]: I1202 09:09:31.847594 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="780ef2e7-bdb3-4c45-98bc-64659b5a19a6" containerName="manila-db-sync"
Dec 02 09:09:31 crc kubenswrapper[4895]: I1202 09:09:31.849116 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0"
Dec 02 09:09:31 crc kubenswrapper[4895]: I1202 09:09:31.853486 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-lzltg"
Dec 02 09:09:31 crc kubenswrapper[4895]: I1202 09:09:31.853686 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scheduler-config-data"
Dec 02 09:09:31 crc kubenswrapper[4895]: I1202 09:09:31.853895 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scripts"
Dec 02 09:09:31 crc kubenswrapper[4895]: I1202 09:09:31.855027 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data"
Dec 02 09:09:31 crc kubenswrapper[4895]: I1202 09:09:31.927498 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"]
Dec 02 09:09:31 crc kubenswrapper[4895]: I1202 09:09:31.938055 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-share-share1-0"]
Dec 02 09:09:31 crc kubenswrapper[4895]: I1202 09:09:31.940427 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0"
Dec 02 09:09:31 crc kubenswrapper[4895]: I1202 09:09:31.944008 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-share-share1-config-data"
Dec 02 09:09:31 crc kubenswrapper[4895]: I1202 09:09:31.976533 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"]
Dec 02 09:09:32 crc kubenswrapper[4895]: I1202 09:09:32.007146 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34e95488-e475-4fd5-94c7-43633883cc2b-scripts\") pod \"manila-scheduler-0\" (UID: \"34e95488-e475-4fd5-94c7-43633883cc2b\") " pod="openstack/manila-scheduler-0"
Dec 02 09:09:32 crc kubenswrapper[4895]: I1202 09:09:32.007229 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/34e95488-e475-4fd5-94c7-43633883cc2b-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"34e95488-e475-4fd5-94c7-43633883cc2b\") " pod="openstack/manila-scheduler-0"
Dec 02 09:09:32 crc kubenswrapper[4895]: I1202 09:09:32.007280 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34e95488-e475-4fd5-94c7-43633883cc2b-config-data\") pod \"manila-scheduler-0\" (UID: \"34e95488-e475-4fd5-94c7-43633883cc2b\") " pod="openstack/manila-scheduler-0"
Dec 02 09:09:32 crc kubenswrapper[4895]: I1202 09:09:32.007336 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34e95488-e475-4fd5-94c7-43633883cc2b-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"34e95488-e475-4fd5-94c7-43633883cc2b\") " pod="openstack/manila-scheduler-0"
Dec 02 09:09:32 crc kubenswrapper[4895]: I1202 09:09:32.007381 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/34e95488-e475-4fd5-94c7-43633883cc2b-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"34e95488-e475-4fd5-94c7-43633883cc2b\") " pod="openstack/manila-scheduler-0"
Dec 02 09:09:32 crc kubenswrapper[4895]: I1202 09:09:32.007418 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rll7p\" (UniqueName: \"kubernetes.io/projected/34e95488-e475-4fd5-94c7-43633883cc2b-kube-api-access-rll7p\") pod \"manila-scheduler-0\" (UID: \"34e95488-e475-4fd5-94c7-43633883cc2b\") " pod="openstack/manila-scheduler-0"
Dec 02 09:09:32 crc kubenswrapper[4895]: I1202 09:09:32.094185 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5db76f6d45-bbthz"]
Dec 02 09:09:32 crc kubenswrapper[4895]: I1202 09:09:32.101775 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5db76f6d45-bbthz"
Dec 02 09:09:32 crc kubenswrapper[4895]: I1202 09:09:32.109440 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/34e95488-e475-4fd5-94c7-43633883cc2b-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"34e95488-e475-4fd5-94c7-43633883cc2b\") " pod="openstack/manila-scheduler-0"
Dec 02 09:09:32 crc kubenswrapper[4895]: I1202 09:09:32.109531 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/294cd801-c423-4a14-95c0-1ece400a3760-ceph\") pod \"manila-share-share1-0\" (UID: \"294cd801-c423-4a14-95c0-1ece400a3760\") " pod="openstack/manila-share-share1-0"
Dec 02 09:09:32 crc kubenswrapper[4895]: I1202 09:09:32.109839 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/34e95488-e475-4fd5-94c7-43633883cc2b-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"34e95488-e475-4fd5-94c7-43633883cc2b\") " pod="openstack/manila-scheduler-0"
Dec 02 09:09:32 crc kubenswrapper[4895]: I1202 09:09:32.110689 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/294cd801-c423-4a14-95c0-1ece400a3760-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"294cd801-c423-4a14-95c0-1ece400a3760\") " pod="openstack/manila-share-share1-0"
Dec 02 09:09:32 crc kubenswrapper[4895]: I1202 09:09:32.110790 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rll7p\" (UniqueName: \"kubernetes.io/projected/34e95488-e475-4fd5-94c7-43633883cc2b-kube-api-access-rll7p\") pod \"manila-scheduler-0\" (UID: \"34e95488-e475-4fd5-94c7-43633883cc2b\") " pod="openstack/manila-scheduler-0"
Dec 02 09:09:32 crc kubenswrapper[4895]: I1202 09:09:32.110919 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/294cd801-c423-4a14-95c0-1ece400a3760-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"294cd801-c423-4a14-95c0-1ece400a3760\") " pod="openstack/manila-share-share1-0"
Dec 02 09:09:32 crc kubenswrapper[4895]: I1202 09:09:32.111080 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34e95488-e475-4fd5-94c7-43633883cc2b-scripts\") pod \"manila-scheduler-0\" (UID: \"34e95488-e475-4fd5-94c7-43633883cc2b\") " pod="openstack/manila-scheduler-0"
Dec 02 09:09:32 crc kubenswrapper[4895]: I1202 09:09:32.111219 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/294cd801-c423-4a14-95c0-1ece400a3760-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"294cd801-c423-4a14-95c0-1ece400a3760\") " pod="openstack/manila-share-share1-0"
Dec 02 09:09:32 crc kubenswrapper[4895]: I1202 09:09:32.111263 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/34e95488-e475-4fd5-94c7-43633883cc2b-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"34e95488-e475-4fd5-94c7-43633883cc2b\") " pod="openstack/manila-scheduler-0"
Dec 02 09:09:32 crc kubenswrapper[4895]: I1202 09:09:32.111299 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nsgcf\" (UniqueName: \"kubernetes.io/projected/294cd801-c423-4a14-95c0-1ece400a3760-kube-api-access-nsgcf\") pod \"manila-share-share1-0\" (UID: \"294cd801-c423-4a14-95c0-1ece400a3760\") " pod="openstack/manila-share-share1-0"
Dec 02 09:09:32 crc kubenswrapper[4895]: I1202 09:09:32.111339 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/294cd801-c423-4a14-95c0-1ece400a3760-scripts\") pod \"manila-share-share1-0\" (UID: \"294cd801-c423-4a14-95c0-1ece400a3760\") " pod="openstack/manila-share-share1-0"
Dec 02 09:09:32 crc kubenswrapper[4895]: I1202 09:09:32.111421 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/294cd801-c423-4a14-95c0-1ece400a3760-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"294cd801-c423-4a14-95c0-1ece400a3760\") " pod="openstack/manila-share-share1-0"
Dec 02 09:09:32 crc kubenswrapper[4895]: I1202 09:09:32.111453 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34e95488-e475-4fd5-94c7-43633883cc2b-config-data\") pod \"manila-scheduler-0\" (UID: \"34e95488-e475-4fd5-94c7-43633883cc2b\") " pod="openstack/manila-scheduler-0"
Dec 02 09:09:32 crc kubenswrapper[4895]: I1202 09:09:32.111594 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/294cd801-c423-4a14-95c0-1ece400a3760-config-data\") pod \"manila-share-share1-0\" (UID: \"294cd801-c423-4a14-95c0-1ece400a3760\") " pod="openstack/manila-share-share1-0"
Dec 02 09:09:32 crc kubenswrapper[4895]: I1202 09:09:32.111643 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34e95488-e475-4fd5-94c7-43633883cc2b-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"34e95488-e475-4fd5-94c7-43633883cc2b\") " pod="openstack/manila-scheduler-0"
Dec 02 09:09:32 crc kubenswrapper[4895]: I1202 09:09:32.121591 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34e95488-e475-4fd5-94c7-43633883cc2b-scripts\") pod \"manila-scheduler-0\" (UID: \"34e95488-e475-4fd5-94c7-43633883cc2b\") " pod="openstack/manila-scheduler-0"
Dec 02 09:09:32 crc kubenswrapper[4895]: I1202 09:09:32.121626 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34e95488-e475-4fd5-94c7-43633883cc2b-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"34e95488-e475-4fd5-94c7-43633883cc2b\") " pod="openstack/manila-scheduler-0"
Dec 02 09:09:32 crc kubenswrapper[4895]: I1202 09:09:32.130855 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34e95488-e475-4fd5-94c7-43633883cc2b-config-data\") pod \"manila-scheduler-0\" (UID: \"34e95488-e475-4fd5-94c7-43633883cc2b\") " pod="openstack/manila-scheduler-0"
Dec 02 09:09:32 crc kubenswrapper[4895]: I1202 09:09:32.131190 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/34e95488-e475-4fd5-94c7-43633883cc2b-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"34e95488-e475-4fd5-94c7-43633883cc2b\") " pod="openstack/manila-scheduler-0"
Dec 02 09:09:32 crc kubenswrapper[4895]: I1202 09:09:32.135390 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5db76f6d45-bbthz"]
Dec 02 09:09:32 crc kubenswrapper[4895]: I1202 09:09:32.139982 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rll7p\" (UniqueName: \"kubernetes.io/projected/34e95488-e475-4fd5-94c7-43633883cc2b-kube-api-access-rll7p\") pod \"manila-scheduler-0\" (UID: \"34e95488-e475-4fd5-94c7-43633883cc2b\") " pod="openstack/manila-scheduler-0"
Dec 02 09:09:32 crc kubenswrapper[4895]: I1202 09:09:32.215868 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/31be70b7-4a75-4a4b-b0bc-51fdaedaf8c2-ovsdbserver-sb\") pod \"dnsmasq-dns-5db76f6d45-bbthz\" (UID: \"31be70b7-4a75-4a4b-b0bc-51fdaedaf8c2\") " pod="openstack/dnsmasq-dns-5db76f6d45-bbthz"
Dec 02 09:09:32 crc kubenswrapper[4895]: I1202 09:09:32.215917 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/31be70b7-4a75-4a4b-b0bc-51fdaedaf8c2-ovsdbserver-nb\") pod \"dnsmasq-dns-5db76f6d45-bbthz\" (UID: \"31be70b7-4a75-4a4b-b0bc-51fdaedaf8c2\") " pod="openstack/dnsmasq-dns-5db76f6d45-bbthz"
Dec 02 09:09:32 crc kubenswrapper[4895]: I1202 09:09:32.215954 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/31be70b7-4a75-4a4b-b0bc-51fdaedaf8c2-dns-svc\") pod \"dnsmasq-dns-5db76f6d45-bbthz\" (UID: \"31be70b7-4a75-4a4b-b0bc-51fdaedaf8c2\") " pod="openstack/dnsmasq-dns-5db76f6d45-bbthz"
Dec 02 09:09:32 crc kubenswrapper[4895]: I1202 09:09:32.216014 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/294cd801-c423-4a14-95c0-1ece400a3760-ceph\") pod \"manila-share-share1-0\" (UID: \"294cd801-c423-4a14-95c0-1ece400a3760\") " pod="openstack/manila-share-share1-0"
Dec 02 09:09:32 crc kubenswrapper[4895]: I1202 09:09:32.216042 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/294cd801-c423-4a14-95c0-1ece400a3760-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"294cd801-c423-4a14-95c0-1ece400a3760\") " pod="openstack/manila-share-share1-0"
Dec 02 09:09:32 crc kubenswrapper[4895]: I1202 09:09:32.216401 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/294cd801-c423-4a14-95c0-1ece400a3760-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"294cd801-c423-4a14-95c0-1ece400a3760\") " pod="openstack/manila-share-share1-0"
Dec 02 09:09:32 crc kubenswrapper[4895]: I1202 09:09:32.216593 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8ztq\" (UniqueName: \"kubernetes.io/projected/31be70b7-4a75-4a4b-b0bc-51fdaedaf8c2-kube-api-access-v8ztq\") pod \"dnsmasq-dns-5db76f6d45-bbthz\" (UID: \"31be70b7-4a75-4a4b-b0bc-51fdaedaf8c2\") " pod="openstack/dnsmasq-dns-5db76f6d45-bbthz"
Dec 02 09:09:32 crc kubenswrapper[4895]: I1202 09:09:32.216658 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/294cd801-c423-4a14-95c0-1ece400a3760-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"294cd801-c423-4a14-95c0-1ece400a3760\") " pod="openstack/manila-share-share1-0"
Dec 02 09:09:32 crc kubenswrapper[4895]: I1202 09:09:32.216696 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nsgcf\" (UniqueName: \"kubernetes.io/projected/294cd801-c423-4a14-95c0-1ece400a3760-kube-api-access-nsgcf\") pod \"manila-share-share1-0\" (UID: \"294cd801-c423-4a14-95c0-1ece400a3760\") " pod="openstack/manila-share-share1-0"
Dec 02 09:09:32 crc kubenswrapper[4895]: I1202 09:09:32.216721 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/294cd801-c423-4a14-95c0-1ece400a3760-scripts\") pod \"manila-share-share1-0\" (UID: \"294cd801-c423-4a14-95c0-1ece400a3760\") " pod="openstack/manila-share-share1-0"
Dec 02 09:09:32 crc kubenswrapper[4895]: I1202 09:09:32.216813 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/294cd801-c423-4a14-95c0-1ece400a3760-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"294cd801-c423-4a14-95c0-1ece400a3760\") " pod="openstack/manila-share-share1-0"
Dec 02 09:09:32 crc kubenswrapper[4895]: I1202 09:09:32.216836 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/31be70b7-4a75-4a4b-b0bc-51fdaedaf8c2-config\") pod \"dnsmasq-dns-5db76f6d45-bbthz\" (UID: \"31be70b7-4a75-4a4b-b0bc-51fdaedaf8c2\") " pod="openstack/dnsmasq-dns-5db76f6d45-bbthz"
Dec 02 09:09:32 crc kubenswrapper[4895]: I1202 09:09:32.216951 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/294cd801-c423-4a14-95c0-1ece400a3760-config-data\") pod \"manila-share-share1-0\" (UID: \"294cd801-c423-4a14-95c0-1ece400a3760\") " pod="openstack/manila-share-share1-0"
Dec 02 09:09:32 crc kubenswrapper[4895]: I1202 09:09:32.217372 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/294cd801-c423-4a14-95c0-1ece400a3760-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"294cd801-c423-4a14-95c0-1ece400a3760\") " pod="openstack/manila-share-share1-0"
Dec 02 09:09:32 crc kubenswrapper[4895]: I1202 09:09:32.219999 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/294cd801-c423-4a14-95c0-1ece400a3760-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"294cd801-c423-4a14-95c0-1ece400a3760\") " pod="openstack/manila-share-share1-0"
Dec 02 09:09:32 crc kubenswrapper[4895]: I1202 09:09:32.228523 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0"
Dec 02 09:09:32 crc kubenswrapper[4895]: I1202 09:09:32.232820 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/294cd801-c423-4a14-95c0-1ece400a3760-ceph\") pod \"manila-share-share1-0\" (UID: \"294cd801-c423-4a14-95c0-1ece400a3760\") " pod="openstack/manila-share-share1-0"
Dec 02 09:09:32 crc kubenswrapper[4895]: I1202 09:09:32.235279 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/294cd801-c423-4a14-95c0-1ece400a3760-config-data\") pod \"manila-share-share1-0\" (UID: \"294cd801-c423-4a14-95c0-1ece400a3760\") " pod="openstack/manila-share-share1-0"
Dec 02 09:09:32 crc kubenswrapper[4895]: I1202 09:09:32.235812 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/294cd801-c423-4a14-95c0-1ece400a3760-scripts\") pod \"manila-share-share1-0\" (UID: \"294cd801-c423-4a14-95c0-1ece400a3760\") " pod="openstack/manila-share-share1-0"
Dec 02 09:09:32 crc kubenswrapper[4895]: I1202 09:09:32.236467 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/294cd801-c423-4a14-95c0-1ece400a3760-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"294cd801-c423-4a14-95c0-1ece400a3760\") " pod="openstack/manila-share-share1-0"
Dec 02 09:09:32 crc kubenswrapper[4895]: I1202 09:09:32.247238 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/294cd801-c423-4a14-95c0-1ece400a3760-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"294cd801-c423-4a14-95c0-1ece400a3760\") " pod="openstack/manila-share-share1-0"
Dec 02 09:09:32 crc kubenswrapper[4895]: I1202 09:09:32.256616 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nsgcf\" (UniqueName: \"kubernetes.io/projected/294cd801-c423-4a14-95c0-1ece400a3760-kube-api-access-nsgcf\") pod \"manila-share-share1-0\" (UID: \"294cd801-c423-4a14-95c0-1ece400a3760\") " pod="openstack/manila-share-share1-0"
Dec 02 09:09:32 crc kubenswrapper[4895]: I1202 09:09:32.299115 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0"
Dec 02 09:09:32 crc kubenswrapper[4895]: I1202 09:09:32.308916 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-api-0"]
Dec 02 09:09:32 crc kubenswrapper[4895]: I1202 09:09:32.318679 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/31be70b7-4a75-4a4b-b0bc-51fdaedaf8c2-ovsdbserver-sb\") pod \"dnsmasq-dns-5db76f6d45-bbthz\" (UID: \"31be70b7-4a75-4a4b-b0bc-51fdaedaf8c2\") " pod="openstack/dnsmasq-dns-5db76f6d45-bbthz"
Dec 02 09:09:32 crc kubenswrapper[4895]: I1202 09:09:32.318726 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/31be70b7-4a75-4a4b-b0bc-51fdaedaf8c2-ovsdbserver-nb\") pod \"dnsmasq-dns-5db76f6d45-bbthz\" (UID: \"31be70b7-4a75-4a4b-b0bc-51fdaedaf8c2\") " pod="openstack/dnsmasq-dns-5db76f6d45-bbthz"
Dec 02 09:09:32 crc kubenswrapper[4895]: I1202 09:09:32.318773 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/31be70b7-4a75-4a4b-b0bc-51fdaedaf8c2-dns-svc\") pod \"dnsmasq-dns-5db76f6d45-bbthz\" (UID: \"31be70b7-4a75-4a4b-b0bc-51fdaedaf8c2\") " pod="openstack/dnsmasq-dns-5db76f6d45-bbthz"
Dec 02 09:09:32 crc kubenswrapper[4895]: I1202 09:09:32.318861 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8ztq\" (UniqueName: \"kubernetes.io/projected/31be70b7-4a75-4a4b-b0bc-51fdaedaf8c2-kube-api-access-v8ztq\") pod \"dnsmasq-dns-5db76f6d45-bbthz\" (UID: \"31be70b7-4a75-4a4b-b0bc-51fdaedaf8c2\") " pod="openstack/dnsmasq-dns-5db76f6d45-bbthz"
Dec 02 09:09:32 crc kubenswrapper[4895]: I1202 09:09:32.318925 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/31be70b7-4a75-4a4b-b0bc-51fdaedaf8c2-config\") pod \"dnsmasq-dns-5db76f6d45-bbthz\" (UID: \"31be70b7-4a75-4a4b-b0bc-51fdaedaf8c2\") " pod="openstack/dnsmasq-dns-5db76f6d45-bbthz"
Dec 02 09:09:32 crc kubenswrapper[4895]: I1202 09:09:32.319872 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0"
Dec 02 09:09:32 crc kubenswrapper[4895]: I1202 09:09:32.320131 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/31be70b7-4a75-4a4b-b0bc-51fdaedaf8c2-config\") pod \"dnsmasq-dns-5db76f6d45-bbthz\" (UID: \"31be70b7-4a75-4a4b-b0bc-51fdaedaf8c2\") " pod="openstack/dnsmasq-dns-5db76f6d45-bbthz"
Dec 02 09:09:32 crc kubenswrapper[4895]: I1202 09:09:32.322847 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/31be70b7-4a75-4a4b-b0bc-51fdaedaf8c2-dns-svc\") pod \"dnsmasq-dns-5db76f6d45-bbthz\" (UID: \"31be70b7-4a75-4a4b-b0bc-51fdaedaf8c2\") " pod="openstack/dnsmasq-dns-5db76f6d45-bbthz"
Dec 02 09:09:32 crc kubenswrapper[4895]: I1202 09:09:32.323807 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/31be70b7-4a75-4a4b-b0bc-51fdaedaf8c2-ovsdbserver-nb\") pod \"dnsmasq-dns-5db76f6d45-bbthz\" (UID: \"31be70b7-4a75-4a4b-b0bc-51fdaedaf8c2\") " pod="openstack/dnsmasq-dns-5db76f6d45-bbthz"
Dec 02 09:09:32 crc kubenswrapper[4895]: I1202 09:09:32.324619 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-api-config-data"
Dec 02 09:09:32 crc kubenswrapper[4895]: I1202 09:09:32.328210 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/31be70b7-4a75-4a4b-b0bc-51fdaedaf8c2-ovsdbserver-sb\") pod \"dnsmasq-dns-5db76f6d45-bbthz\" (UID: \"31be70b7-4a75-4a4b-b0bc-51fdaedaf8c2\") " pod="openstack/dnsmasq-dns-5db76f6d45-bbthz"
Dec 02 09:09:32 crc kubenswrapper[4895]: I1202 09:09:32.335875 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"]
Dec 02 09:09:32 crc kubenswrapper[4895]: I1202 09:09:32.358347 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8ztq\" (UniqueName: \"kubernetes.io/projected/31be70b7-4a75-4a4b-b0bc-51fdaedaf8c2-kube-api-access-v8ztq\") pod \"dnsmasq-dns-5db76f6d45-bbthz\" (UID: \"31be70b7-4a75-4a4b-b0bc-51fdaedaf8c2\") " pod="openstack/dnsmasq-dns-5db76f6d45-bbthz"
Dec 02 09:09:32 crc kubenswrapper[4895]: I1202 09:09:32.423453 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/858d88d6-a1a5-49dd-90c9-c87d83cc992f-config-data-custom\") pod \"manila-api-0\" (UID: \"858d88d6-a1a5-49dd-90c9-c87d83cc992f\") " pod="openstack/manila-api-0"
Dec 02 09:09:32 crc kubenswrapper[4895]: I1202 09:09:32.441644 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5db76f6d45-bbthz"
Dec 02 09:09:32 crc kubenswrapper[4895]: I1202 09:09:32.443074 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/858d88d6-a1a5-49dd-90c9-c87d83cc992f-etc-machine-id\") pod \"manila-api-0\" (UID: \"858d88d6-a1a5-49dd-90c9-c87d83cc992f\") " pod="openstack/manila-api-0"
Dec 02 09:09:32 crc kubenswrapper[4895]: I1202 09:09:32.445346 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/858d88d6-a1a5-49dd-90c9-c87d83cc992f-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"858d88d6-a1a5-49dd-90c9-c87d83cc992f\") " pod="openstack/manila-api-0"
Dec 02 09:09:32 crc kubenswrapper[4895]: I1202 09:09:32.445421 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/858d88d6-a1a5-49dd-90c9-c87d83cc992f-config-data\") pod \"manila-api-0\" (UID: \"858d88d6-a1a5-49dd-90c9-c87d83cc992f\") " pod="openstack/manila-api-0"
Dec 02 09:09:32 crc kubenswrapper[4895]: I1202 09:09:32.445523 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4xc9\" (UniqueName: \"kubernetes.io/projected/858d88d6-a1a5-49dd-90c9-c87d83cc992f-kube-api-access-z4xc9\") pod \"manila-api-0\" (UID: \"858d88d6-a1a5-49dd-90c9-c87d83cc992f\") " pod="openstack/manila-api-0"
Dec 02 09:09:32 crc kubenswrapper[4895]: I1202 09:09:32.445567 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/858d88d6-a1a5-49dd-90c9-c87d83cc992f-scripts\") pod \"manila-api-0\" (UID: \"858d88d6-a1a5-49dd-90c9-c87d83cc992f\") " pod="openstack/manila-api-0"
Dec 02 09:09:32 crc kubenswrapper[4895]: I1202 09:09:32.445804 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/858d88d6-a1a5-49dd-90c9-c87d83cc992f-logs\") pod \"manila-api-0\" (UID: \"858d88d6-a1a5-49dd-90c9-c87d83cc992f\") " pod="openstack/manila-api-0"
Dec 02 09:09:32 crc kubenswrapper[4895]: I1202 09:09:32.560713 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/858d88d6-a1a5-49dd-90c9-c87d83cc992f-config-data-custom\") pod \"manila-api-0\" (UID: \"858d88d6-a1a5-49dd-90c9-c87d83cc992f\") " pod="openstack/manila-api-0"
Dec 02 09:09:32 crc kubenswrapper[4895]: I1202 09:09:32.561194 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/858d88d6-a1a5-49dd-90c9-c87d83cc992f-etc-machine-id\") pod \"manila-api-0\" (UID: \"858d88d6-a1a5-49dd-90c9-c87d83cc992f\") " pod="openstack/manila-api-0"
Dec 02 09:09:32 crc kubenswrapper[4895]: I1202 09:09:32.561209 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/858d88d6-a1a5-49dd-90c9-c87d83cc992f-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"858d88d6-a1a5-49dd-90c9-c87d83cc992f\") " pod="openstack/manila-api-0"
Dec 02 09:09:32 crc kubenswrapper[4895]: I1202 09:09:32.561242 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/858d88d6-a1a5-49dd-90c9-c87d83cc992f-config-data\") pod \"manila-api-0\" (UID: \"858d88d6-a1a5-49dd-90c9-c87d83cc992f\") " pod="openstack/manila-api-0"
Dec 02 09:09:32 crc kubenswrapper[4895]: I1202 09:09:32.561292 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4xc9\" (UniqueName: \"kubernetes.io/projected/858d88d6-a1a5-49dd-90c9-c87d83cc992f-kube-api-access-z4xc9\") pod \"manila-api-0\" (UID: \"858d88d6-a1a5-49dd-90c9-c87d83cc992f\") " pod="openstack/manila-api-0"
Dec 02 09:09:32 crc kubenswrapper[4895]: I1202 09:09:32.561317 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/858d88d6-a1a5-49dd-90c9-c87d83cc992f-scripts\") pod \"manila-api-0\" (UID: \"858d88d6-a1a5-49dd-90c9-c87d83cc992f\") " pod="openstack/manila-api-0"
Dec 02 09:09:32 crc kubenswrapper[4895]: I1202 09:09:32.561464 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/858d88d6-a1a5-49dd-90c9-c87d83cc992f-logs\") pod \"manila-api-0\" (UID: \"858d88d6-a1a5-49dd-90c9-c87d83cc992f\") " pod="openstack/manila-api-0"
Dec 02 09:09:32 crc kubenswrapper[4895]: I1202 09:09:32.569090 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/858d88d6-a1a5-49dd-90c9-c87d83cc992f-etc-machine-id\") pod \"manila-api-0\" (UID: \"858d88d6-a1a5-49dd-90c9-c87d83cc992f\") " pod="openstack/manila-api-0"
Dec 02 09:09:32 crc kubenswrapper[4895]: I1202 09:09:32.569211 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/858d88d6-a1a5-49dd-90c9-c87d83cc992f-logs\") pod \"manila-api-0\" (UID: \"858d88d6-a1a5-49dd-90c9-c87d83cc992f\") " pod="openstack/manila-api-0"
Dec 02 09:09:32 crc kubenswrapper[4895]: I1202 09:09:32.571408 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/858d88d6-a1a5-49dd-90c9-c87d83cc992f-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"858d88d6-a1a5-49dd-90c9-c87d83cc992f\") " pod="openstack/manila-api-0"
Dec 02 09:09:32 crc kubenswrapper[4895]: I1202 09:09:32.576151 4895 operation_generator.go:637]
"MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/858d88d6-a1a5-49dd-90c9-c87d83cc992f-config-data-custom\") pod \"manila-api-0\" (UID: \"858d88d6-a1a5-49dd-90c9-c87d83cc992f\") " pod="openstack/manila-api-0" Dec 02 09:09:32 crc kubenswrapper[4895]: I1202 09:09:32.576191 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/858d88d6-a1a5-49dd-90c9-c87d83cc992f-scripts\") pod \"manila-api-0\" (UID: \"858d88d6-a1a5-49dd-90c9-c87d83cc992f\") " pod="openstack/manila-api-0" Dec 02 09:09:32 crc kubenswrapper[4895]: I1202 09:09:32.583482 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/858d88d6-a1a5-49dd-90c9-c87d83cc992f-config-data\") pod \"manila-api-0\" (UID: \"858d88d6-a1a5-49dd-90c9-c87d83cc992f\") " pod="openstack/manila-api-0" Dec 02 09:09:32 crc kubenswrapper[4895]: I1202 09:09:32.590070 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4xc9\" (UniqueName: \"kubernetes.io/projected/858d88d6-a1a5-49dd-90c9-c87d83cc992f-kube-api-access-z4xc9\") pod \"manila-api-0\" (UID: \"858d88d6-a1a5-49dd-90c9-c87d83cc992f\") " pod="openstack/manila-api-0" Dec 02 09:09:32 crc kubenswrapper[4895]: I1202 09:09:32.747811 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-api-0" Dec 02 09:09:32 crc kubenswrapper[4895]: I1202 09:09:32.880318 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Dec 02 09:09:32 crc kubenswrapper[4895]: I1202 09:09:32.886359 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 02 09:09:32 crc kubenswrapper[4895]: W1202 09:09:32.960648 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod34e95488_e475_4fd5_94c7_43633883cc2b.slice/crio-fad563566f644f439908995299d9dfc3524e0dcfdf9c2012c0ed79cd2bb35632 WatchSource:0}: Error finding container fad563566f644f439908995299d9dfc3524e0dcfdf9c2012c0ed79cd2bb35632: Status 404 returned error can't find the container with id fad563566f644f439908995299d9dfc3524e0dcfdf9c2012c0ed79cd2bb35632 Dec 02 09:09:33 crc kubenswrapper[4895]: I1202 09:09:33.162705 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Dec 02 09:09:33 crc kubenswrapper[4895]: W1202 09:09:33.178453 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod31be70b7_4a75_4a4b_b0bc_51fdaedaf8c2.slice/crio-014c15f3093c9a2a91f163631513edb303519454c6551b514195808d0135d69c WatchSource:0}: Error finding container 014c15f3093c9a2a91f163631513edb303519454c6551b514195808d0135d69c: Status 404 returned error can't find the container with id 014c15f3093c9a2a91f163631513edb303519454c6551b514195808d0135d69c Dec 02 09:09:33 crc kubenswrapper[4895]: I1202 09:09:33.179351 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5db76f6d45-bbthz"] Dec 02 09:09:33 crc kubenswrapper[4895]: I1202 09:09:33.531772 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Dec 02 09:09:33 crc kubenswrapper[4895]: W1202 09:09:33.543425 4895 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod858d88d6_a1a5_49dd_90c9_c87d83cc992f.slice/crio-65c63285c3a5802150befb0003a816a77f9ae71a7a50b4383b73e1aa8f017d5f WatchSource:0}: Error finding container 65c63285c3a5802150befb0003a816a77f9ae71a7a50b4383b73e1aa8f017d5f: Status 404 returned error can't find the container with id 65c63285c3a5802150befb0003a816a77f9ae71a7a50b4383b73e1aa8f017d5f Dec 02 09:09:33 crc kubenswrapper[4895]: I1202 09:09:33.716150 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5db76f6d45-bbthz" event={"ID":"31be70b7-4a75-4a4b-b0bc-51fdaedaf8c2","Type":"ContainerStarted","Data":"014c15f3093c9a2a91f163631513edb303519454c6551b514195808d0135d69c"} Dec 02 09:09:33 crc kubenswrapper[4895]: I1202 09:09:33.719619 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"858d88d6-a1a5-49dd-90c9-c87d83cc992f","Type":"ContainerStarted","Data":"65c63285c3a5802150befb0003a816a77f9ae71a7a50b4383b73e1aa8f017d5f"} Dec 02 09:09:33 crc kubenswrapper[4895]: I1202 09:09:33.721263 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"294cd801-c423-4a14-95c0-1ece400a3760","Type":"ContainerStarted","Data":"7dec693b47a56ca39af0e66d5e6d22312facb54c5bc3004b16c20b257cfcf22c"} Dec 02 09:09:33 crc kubenswrapper[4895]: I1202 09:09:33.743914 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"34e95488-e475-4fd5-94c7-43633883cc2b","Type":"ContainerStarted","Data":"fad563566f644f439908995299d9dfc3524e0dcfdf9c2012c0ed79cd2bb35632"} Dec 02 09:09:34 crc kubenswrapper[4895]: I1202 09:09:34.756497 4895 generic.go:334] "Generic (PLEG): container finished" podID="31be70b7-4a75-4a4b-b0bc-51fdaedaf8c2" containerID="d1eab92b762f488d1bafc4d2446c99f81fdd829002b641ac4826b9b9ff477ef4" exitCode=0 Dec 02 09:09:34 crc kubenswrapper[4895]: 
I1202 09:09:34.756599 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5db76f6d45-bbthz" event={"ID":"31be70b7-4a75-4a4b-b0bc-51fdaedaf8c2","Type":"ContainerDied","Data":"d1eab92b762f488d1bafc4d2446c99f81fdd829002b641ac4826b9b9ff477ef4"} Dec 02 09:09:34 crc kubenswrapper[4895]: I1202 09:09:34.778058 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"858d88d6-a1a5-49dd-90c9-c87d83cc992f","Type":"ContainerStarted","Data":"201380a2f6af00585105d432d9b323d4e41239f579e4e7d258c9208f2226b2f0"} Dec 02 09:09:34 crc kubenswrapper[4895]: I1202 09:09:34.778114 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"858d88d6-a1a5-49dd-90c9-c87d83cc992f","Type":"ContainerStarted","Data":"d041e877574536063eb4e9ab6ff1636460479bfe2ad8ace5e2f32efbed6ec93f"} Dec 02 09:09:34 crc kubenswrapper[4895]: I1202 09:09:34.778343 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/manila-api-0" Dec 02 09:09:34 crc kubenswrapper[4895]: I1202 09:09:34.824029 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-api-0" podStartSLOduration=2.824004148 podStartE2EDuration="2.824004148s" podCreationTimestamp="2025-12-02 09:09:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:09:34.80958259 +0000 UTC m=+6385.980442203" watchObservedRunningTime="2025-12-02 09:09:34.824004148 +0000 UTC m=+6385.994863761" Dec 02 09:09:35 crc kubenswrapper[4895]: I1202 09:09:35.473204 4895 patch_prober.go:28] interesting pod/machine-config-daemon-wfcg7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 09:09:35 crc kubenswrapper[4895]: I1202 09:09:35.473816 
4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 09:09:35 crc kubenswrapper[4895]: I1202 09:09:35.473927 4895 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" Dec 02 09:09:35 crc kubenswrapper[4895]: I1202 09:09:35.475318 4895 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d41b8f042da7f202a665565fffb111027944b6b1623344c1475552b27b519444"} pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 09:09:35 crc kubenswrapper[4895]: I1202 09:09:35.475400 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerName="machine-config-daemon" containerID="cri-o://d41b8f042da7f202a665565fffb111027944b6b1623344c1475552b27b519444" gracePeriod=600 Dec 02 09:09:35 crc kubenswrapper[4895]: E1202 09:09:35.631481 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:09:35 crc kubenswrapper[4895]: I1202 09:09:35.809401 4895 generic.go:334] "Generic (PLEG): container finished" 
podID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerID="d41b8f042da7f202a665565fffb111027944b6b1623344c1475552b27b519444" exitCode=0 Dec 02 09:09:35 crc kubenswrapper[4895]: I1202 09:09:35.809502 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" event={"ID":"0468d2d1-a975-45a6-af9f-47adc432fab0","Type":"ContainerDied","Data":"d41b8f042da7f202a665565fffb111027944b6b1623344c1475552b27b519444"} Dec 02 09:09:35 crc kubenswrapper[4895]: I1202 09:09:35.809832 4895 scope.go:117] "RemoveContainer" containerID="3e66018704e5440a759c7db87d699ad813d9bb81de4b6aa004c7a6747bba333a" Dec 02 09:09:35 crc kubenswrapper[4895]: I1202 09:09:35.810638 4895 scope.go:117] "RemoveContainer" containerID="d41b8f042da7f202a665565fffb111027944b6b1623344c1475552b27b519444" Dec 02 09:09:35 crc kubenswrapper[4895]: E1202 09:09:35.811010 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:09:35 crc kubenswrapper[4895]: I1202 09:09:35.816594 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"34e95488-e475-4fd5-94c7-43633883cc2b","Type":"ContainerStarted","Data":"02ecae21d7e358bb78c82096783168932c8c0b25cf2ea08dc3416a3c50bd109c"} Dec 02 09:09:35 crc kubenswrapper[4895]: I1202 09:09:35.827441 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5db76f6d45-bbthz" event={"ID":"31be70b7-4a75-4a4b-b0bc-51fdaedaf8c2","Type":"ContainerStarted","Data":"75bb04fab936f486061cd47c2777e6398ae7bb4fca90001c61377c210cf29867"} Dec 02 09:09:35 crc kubenswrapper[4895]: I1202 
09:09:35.915932 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5db76f6d45-bbthz" podStartSLOduration=3.9158428499999998 podStartE2EDuration="3.91584285s" podCreationTimestamp="2025-12-02 09:09:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:09:35.877579599 +0000 UTC m=+6387.048439212" watchObservedRunningTime="2025-12-02 09:09:35.91584285 +0000 UTC m=+6387.086702463" Dec 02 09:09:36 crc kubenswrapper[4895]: I1202 09:09:36.846586 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"34e95488-e475-4fd5-94c7-43633883cc2b","Type":"ContainerStarted","Data":"f1926c0a5695f45058ec91d9f718539edd589b7d5d0b73a2fd3e8a868d4a4e52"} Dec 02 09:09:36 crc kubenswrapper[4895]: I1202 09:09:36.854308 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5db76f6d45-bbthz" Dec 02 09:09:36 crc kubenswrapper[4895]: I1202 09:09:36.882553 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-scheduler-0" podStartSLOduration=3.696182519 podStartE2EDuration="5.882521135s" podCreationTimestamp="2025-12-02 09:09:31 +0000 UTC" firstStartedPulling="2025-12-02 09:09:32.964785047 +0000 UTC m=+6384.135644660" lastFinishedPulling="2025-12-02 09:09:35.151123673 +0000 UTC m=+6386.321983276" observedRunningTime="2025-12-02 09:09:36.873076471 +0000 UTC m=+6388.043936104" watchObservedRunningTime="2025-12-02 09:09:36.882521135 +0000 UTC m=+6388.053380758" Dec 02 09:09:36 crc kubenswrapper[4895]: I1202 09:09:36.954925 4895 scope.go:117] "RemoveContainer" containerID="0ff688b4e620048852f79fafd7645496df6c308ceb4e6c2f160b19a9009ceb0f" Dec 02 09:09:37 crc kubenswrapper[4895]: I1202 09:09:37.343902 4895 scope.go:117] "RemoveContainer" containerID="8604e1e270b85c7e0f9829e1dc01f238dce35039d0d9ddbc8eced80a7bb02ce7" Dec 02 09:09:37 
crc kubenswrapper[4895]: I1202 09:09:37.437149 4895 scope.go:117] "RemoveContainer" containerID="9da84d1a18ac0bb209fcd43fd55a4f2bac1dec8f7173fe1570df1a76df09070e" Dec 02 09:09:42 crc kubenswrapper[4895]: I1202 09:09:42.230088 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-scheduler-0" Dec 02 09:09:42 crc kubenswrapper[4895]: I1202 09:09:42.446231 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5db76f6d45-bbthz" Dec 02 09:09:42 crc kubenswrapper[4895]: I1202 09:09:42.531199 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8844f56df-tmgd5"] Dec 02 09:09:42 crc kubenswrapper[4895]: I1202 09:09:42.531500 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8844f56df-tmgd5" podUID="3627f865-5ee4-49f3-8b96-96cc1b94ec6e" containerName="dnsmasq-dns" containerID="cri-o://fef8dba04b04619b0c9ea61e2d8af31dcdc4a776814001ec9cc02f811b6ca453" gracePeriod=10 Dec 02 09:09:42 crc kubenswrapper[4895]: I1202 09:09:42.957513 4895 generic.go:334] "Generic (PLEG): container finished" podID="3627f865-5ee4-49f3-8b96-96cc1b94ec6e" containerID="fef8dba04b04619b0c9ea61e2d8af31dcdc4a776814001ec9cc02f811b6ca453" exitCode=0 Dec 02 09:09:42 crc kubenswrapper[4895]: I1202 09:09:42.957803 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8844f56df-tmgd5" event={"ID":"3627f865-5ee4-49f3-8b96-96cc1b94ec6e","Type":"ContainerDied","Data":"fef8dba04b04619b0c9ea61e2d8af31dcdc4a776814001ec9cc02f811b6ca453"} Dec 02 09:09:43 crc kubenswrapper[4895]: I1202 09:09:43.415821 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8844f56df-tmgd5" Dec 02 09:09:43 crc kubenswrapper[4895]: I1202 09:09:43.565838 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3627f865-5ee4-49f3-8b96-96cc1b94ec6e-ovsdbserver-sb\") pod \"3627f865-5ee4-49f3-8b96-96cc1b94ec6e\" (UID: \"3627f865-5ee4-49f3-8b96-96cc1b94ec6e\") " Dec 02 09:09:43 crc kubenswrapper[4895]: I1202 09:09:43.565886 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3627f865-5ee4-49f3-8b96-96cc1b94ec6e-dns-svc\") pod \"3627f865-5ee4-49f3-8b96-96cc1b94ec6e\" (UID: \"3627f865-5ee4-49f3-8b96-96cc1b94ec6e\") " Dec 02 09:09:43 crc kubenswrapper[4895]: I1202 09:09:43.566128 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3627f865-5ee4-49f3-8b96-96cc1b94ec6e-config\") pod \"3627f865-5ee4-49f3-8b96-96cc1b94ec6e\" (UID: \"3627f865-5ee4-49f3-8b96-96cc1b94ec6e\") " Dec 02 09:09:43 crc kubenswrapper[4895]: I1202 09:09:43.566161 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jwntn\" (UniqueName: \"kubernetes.io/projected/3627f865-5ee4-49f3-8b96-96cc1b94ec6e-kube-api-access-jwntn\") pod \"3627f865-5ee4-49f3-8b96-96cc1b94ec6e\" (UID: \"3627f865-5ee4-49f3-8b96-96cc1b94ec6e\") " Dec 02 09:09:43 crc kubenswrapper[4895]: I1202 09:09:43.566230 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3627f865-5ee4-49f3-8b96-96cc1b94ec6e-ovsdbserver-nb\") pod \"3627f865-5ee4-49f3-8b96-96cc1b94ec6e\" (UID: \"3627f865-5ee4-49f3-8b96-96cc1b94ec6e\") " Dec 02 09:09:43 crc kubenswrapper[4895]: I1202 09:09:43.582491 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/3627f865-5ee4-49f3-8b96-96cc1b94ec6e-kube-api-access-jwntn" (OuterVolumeSpecName: "kube-api-access-jwntn") pod "3627f865-5ee4-49f3-8b96-96cc1b94ec6e" (UID: "3627f865-5ee4-49f3-8b96-96cc1b94ec6e"). InnerVolumeSpecName "kube-api-access-jwntn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:09:43 crc kubenswrapper[4895]: I1202 09:09:43.655904 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3627f865-5ee4-49f3-8b96-96cc1b94ec6e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3627f865-5ee4-49f3-8b96-96cc1b94ec6e" (UID: "3627f865-5ee4-49f3-8b96-96cc1b94ec6e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:09:43 crc kubenswrapper[4895]: I1202 09:09:43.667734 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3627f865-5ee4-49f3-8b96-96cc1b94ec6e-config" (OuterVolumeSpecName: "config") pod "3627f865-5ee4-49f3-8b96-96cc1b94ec6e" (UID: "3627f865-5ee4-49f3-8b96-96cc1b94ec6e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:09:43 crc kubenswrapper[4895]: I1202 09:09:43.669058 4895 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3627f865-5ee4-49f3-8b96-96cc1b94ec6e-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 02 09:09:43 crc kubenswrapper[4895]: I1202 09:09:43.669096 4895 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3627f865-5ee4-49f3-8b96-96cc1b94ec6e-config\") on node \"crc\" DevicePath \"\"" Dec 02 09:09:43 crc kubenswrapper[4895]: I1202 09:09:43.669110 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jwntn\" (UniqueName: \"kubernetes.io/projected/3627f865-5ee4-49f3-8b96-96cc1b94ec6e-kube-api-access-jwntn\") on node \"crc\" DevicePath \"\"" Dec 02 09:09:43 crc kubenswrapper[4895]: I1202 09:09:43.677851 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3627f865-5ee4-49f3-8b96-96cc1b94ec6e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3627f865-5ee4-49f3-8b96-96cc1b94ec6e" (UID: "3627f865-5ee4-49f3-8b96-96cc1b94ec6e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:09:43 crc kubenswrapper[4895]: I1202 09:09:43.687808 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3627f865-5ee4-49f3-8b96-96cc1b94ec6e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3627f865-5ee4-49f3-8b96-96cc1b94ec6e" (UID: "3627f865-5ee4-49f3-8b96-96cc1b94ec6e"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:09:43 crc kubenswrapper[4895]: I1202 09:09:43.771055 4895 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3627f865-5ee4-49f3-8b96-96cc1b94ec6e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 02 09:09:43 crc kubenswrapper[4895]: I1202 09:09:43.771096 4895 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3627f865-5ee4-49f3-8b96-96cc1b94ec6e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 02 09:09:43 crc kubenswrapper[4895]: I1202 09:09:43.972774 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8844f56df-tmgd5" event={"ID":"3627f865-5ee4-49f3-8b96-96cc1b94ec6e","Type":"ContainerDied","Data":"b0d5d346f62edb3673669a053519fdedadfbfcb21db8d8586e2b262de4c76987"} Dec 02 09:09:43 crc kubenswrapper[4895]: I1202 09:09:43.973125 4895 scope.go:117] "RemoveContainer" containerID="fef8dba04b04619b0c9ea61e2d8af31dcdc4a776814001ec9cc02f811b6ca453" Dec 02 09:09:43 crc kubenswrapper[4895]: I1202 09:09:43.973301 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8844f56df-tmgd5" Dec 02 09:09:43 crc kubenswrapper[4895]: I1202 09:09:43.978277 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"294cd801-c423-4a14-95c0-1ece400a3760","Type":"ContainerStarted","Data":"ea2dbe426d2a57f88bdc3877d4960d56f9639489de506e251368398984e5c0b7"} Dec 02 09:09:44 crc kubenswrapper[4895]: I1202 09:09:44.061271 4895 scope.go:117] "RemoveContainer" containerID="291b2922a560b070518508ca07ae54035db019788cc315c3c32b0a88ba89aaa1" Dec 02 09:09:44 crc kubenswrapper[4895]: I1202 09:09:44.063224 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8844f56df-tmgd5"] Dec 02 09:09:44 crc kubenswrapper[4895]: I1202 09:09:44.086603 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8844f56df-tmgd5"] Dec 02 09:09:44 crc kubenswrapper[4895]: I1202 09:09:44.993098 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"294cd801-c423-4a14-95c0-1ece400a3760","Type":"ContainerStarted","Data":"d2cefa58c0692cc548d1641a3d74a18f1c093baa6a56c18a76c2b97388d97e40"} Dec 02 09:09:45 crc kubenswrapper[4895]: I1202 09:09:45.018906 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-share-share1-0" podStartSLOduration=4.333986686 podStartE2EDuration="14.018882731s" podCreationTimestamp="2025-12-02 09:09:31 +0000 UTC" firstStartedPulling="2025-12-02 09:09:33.169309814 +0000 UTC m=+6384.340169427" lastFinishedPulling="2025-12-02 09:09:42.854205859 +0000 UTC m=+6394.025065472" observedRunningTime="2025-12-02 09:09:45.012602885 +0000 UTC m=+6396.183462518" watchObservedRunningTime="2025-12-02 09:09:45.018882731 +0000 UTC m=+6396.189742354" Dec 02 09:09:45 crc kubenswrapper[4895]: I1202 09:09:45.157525 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3627f865-5ee4-49f3-8b96-96cc1b94ec6e" 
path="/var/lib/kubelet/pods/3627f865-5ee4-49f3-8b96-96cc1b94ec6e/volumes" Dec 02 09:09:45 crc kubenswrapper[4895]: I1202 09:09:45.490602 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 02 09:09:45 crc kubenswrapper[4895]: I1202 09:09:45.490932 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ef63dd40-d042-4334-9fab-cdc72afccbb4" containerName="ceilometer-central-agent" containerID="cri-o://a13a240e587902c6d288be4491d5210ce87c3b9478fcd34b69bde2ae41655350" gracePeriod=30 Dec 02 09:09:45 crc kubenswrapper[4895]: I1202 09:09:45.491067 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ef63dd40-d042-4334-9fab-cdc72afccbb4" containerName="sg-core" containerID="cri-o://5368a2dc678174c38465f71f6d220df741d4442c0fb4e41d6cb34f80af141160" gracePeriod=30 Dec 02 09:09:45 crc kubenswrapper[4895]: I1202 09:09:45.491062 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ef63dd40-d042-4334-9fab-cdc72afccbb4" containerName="proxy-httpd" containerID="cri-o://aebe3172a052fcffa26da96fce0a7aaa27e0ebc7ceec9bf8a1e40f344dcf6298" gracePeriod=30 Dec 02 09:09:45 crc kubenswrapper[4895]: I1202 09:09:45.491110 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ef63dd40-d042-4334-9fab-cdc72afccbb4" containerName="ceilometer-notification-agent" containerID="cri-o://c8026d8de11da84d53b750e9f8ab511b35a0f07e9ce773f7ebfe18f487507271" gracePeriod=30 Dec 02 09:09:46 crc kubenswrapper[4895]: I1202 09:09:46.008693 4895 generic.go:334] "Generic (PLEG): container finished" podID="ef63dd40-d042-4334-9fab-cdc72afccbb4" containerID="aebe3172a052fcffa26da96fce0a7aaa27e0ebc7ceec9bf8a1e40f344dcf6298" exitCode=0 Dec 02 09:09:46 crc kubenswrapper[4895]: I1202 09:09:46.009219 4895 generic.go:334] "Generic (PLEG): container finished" 
podID="ef63dd40-d042-4334-9fab-cdc72afccbb4" containerID="5368a2dc678174c38465f71f6d220df741d4442c0fb4e41d6cb34f80af141160" exitCode=2 Dec 02 09:09:46 crc kubenswrapper[4895]: I1202 09:09:46.009231 4895 generic.go:334] "Generic (PLEG): container finished" podID="ef63dd40-d042-4334-9fab-cdc72afccbb4" containerID="a13a240e587902c6d288be4491d5210ce87c3b9478fcd34b69bde2ae41655350" exitCode=0 Dec 02 09:09:46 crc kubenswrapper[4895]: I1202 09:09:46.008794 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ef63dd40-d042-4334-9fab-cdc72afccbb4","Type":"ContainerDied","Data":"aebe3172a052fcffa26da96fce0a7aaa27e0ebc7ceec9bf8a1e40f344dcf6298"} Dec 02 09:09:46 crc kubenswrapper[4895]: I1202 09:09:46.009369 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ef63dd40-d042-4334-9fab-cdc72afccbb4","Type":"ContainerDied","Data":"5368a2dc678174c38465f71f6d220df741d4442c0fb4e41d6cb34f80af141160"} Dec 02 09:09:46 crc kubenswrapper[4895]: I1202 09:09:46.009384 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ef63dd40-d042-4334-9fab-cdc72afccbb4","Type":"ContainerDied","Data":"a13a240e587902c6d288be4491d5210ce87c3b9478fcd34b69bde2ae41655350"} Dec 02 09:09:49 crc kubenswrapper[4895]: I1202 09:09:49.084250 4895 generic.go:334] "Generic (PLEG): container finished" podID="ef63dd40-d042-4334-9fab-cdc72afccbb4" containerID="c8026d8de11da84d53b750e9f8ab511b35a0f07e9ce773f7ebfe18f487507271" exitCode=0 Dec 02 09:09:49 crc kubenswrapper[4895]: I1202 09:09:49.084339 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ef63dd40-d042-4334-9fab-cdc72afccbb4","Type":"ContainerDied","Data":"c8026d8de11da84d53b750e9f8ab511b35a0f07e9ce773f7ebfe18f487507271"} Dec 02 09:09:49 crc kubenswrapper[4895]: I1202 09:09:49.140813 4895 scope.go:117] "RemoveContainer" 
containerID="d41b8f042da7f202a665565fffb111027944b6b1623344c1475552b27b519444" Dec 02 09:09:49 crc kubenswrapper[4895]: E1202 09:09:49.141588 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:09:49 crc kubenswrapper[4895]: I1202 09:09:49.425780 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 02 09:09:49 crc kubenswrapper[4895]: I1202 09:09:49.543676 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef63dd40-d042-4334-9fab-cdc72afccbb4-scripts\") pod \"ef63dd40-d042-4334-9fab-cdc72afccbb4\" (UID: \"ef63dd40-d042-4334-9fab-cdc72afccbb4\") " Dec 02 09:09:49 crc kubenswrapper[4895]: I1202 09:09:49.543778 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ef63dd40-d042-4334-9fab-cdc72afccbb4-run-httpd\") pod \"ef63dd40-d042-4334-9fab-cdc72afccbb4\" (UID: \"ef63dd40-d042-4334-9fab-cdc72afccbb4\") " Dec 02 09:09:49 crc kubenswrapper[4895]: I1202 09:09:49.543820 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef63dd40-d042-4334-9fab-cdc72afccbb4-config-data\") pod \"ef63dd40-d042-4334-9fab-cdc72afccbb4\" (UID: \"ef63dd40-d042-4334-9fab-cdc72afccbb4\") " Dec 02 09:09:49 crc kubenswrapper[4895]: I1202 09:09:49.543893 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ef63dd40-d042-4334-9fab-cdc72afccbb4-combined-ca-bundle\") pod \"ef63dd40-d042-4334-9fab-cdc72afccbb4\" (UID: \"ef63dd40-d042-4334-9fab-cdc72afccbb4\") " Dec 02 09:09:49 crc kubenswrapper[4895]: I1202 09:09:49.543967 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4wchz\" (UniqueName: \"kubernetes.io/projected/ef63dd40-d042-4334-9fab-cdc72afccbb4-kube-api-access-4wchz\") pod \"ef63dd40-d042-4334-9fab-cdc72afccbb4\" (UID: \"ef63dd40-d042-4334-9fab-cdc72afccbb4\") " Dec 02 09:09:49 crc kubenswrapper[4895]: I1202 09:09:49.544024 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ef63dd40-d042-4334-9fab-cdc72afccbb4-sg-core-conf-yaml\") pod \"ef63dd40-d042-4334-9fab-cdc72afccbb4\" (UID: \"ef63dd40-d042-4334-9fab-cdc72afccbb4\") " Dec 02 09:09:49 crc kubenswrapper[4895]: I1202 09:09:49.544106 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ef63dd40-d042-4334-9fab-cdc72afccbb4-log-httpd\") pod \"ef63dd40-d042-4334-9fab-cdc72afccbb4\" (UID: \"ef63dd40-d042-4334-9fab-cdc72afccbb4\") " Dec 02 09:09:49 crc kubenswrapper[4895]: I1202 09:09:49.544727 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef63dd40-d042-4334-9fab-cdc72afccbb4-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "ef63dd40-d042-4334-9fab-cdc72afccbb4" (UID: "ef63dd40-d042-4334-9fab-cdc72afccbb4"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:09:49 crc kubenswrapper[4895]: I1202 09:09:49.545175 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef63dd40-d042-4334-9fab-cdc72afccbb4-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "ef63dd40-d042-4334-9fab-cdc72afccbb4" (UID: "ef63dd40-d042-4334-9fab-cdc72afccbb4"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:09:49 crc kubenswrapper[4895]: I1202 09:09:49.551189 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef63dd40-d042-4334-9fab-cdc72afccbb4-scripts" (OuterVolumeSpecName: "scripts") pod "ef63dd40-d042-4334-9fab-cdc72afccbb4" (UID: "ef63dd40-d042-4334-9fab-cdc72afccbb4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:09:49 crc kubenswrapper[4895]: I1202 09:09:49.552141 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef63dd40-d042-4334-9fab-cdc72afccbb4-kube-api-access-4wchz" (OuterVolumeSpecName: "kube-api-access-4wchz") pod "ef63dd40-d042-4334-9fab-cdc72afccbb4" (UID: "ef63dd40-d042-4334-9fab-cdc72afccbb4"). InnerVolumeSpecName "kube-api-access-4wchz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:09:49 crc kubenswrapper[4895]: I1202 09:09:49.611974 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef63dd40-d042-4334-9fab-cdc72afccbb4-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "ef63dd40-d042-4334-9fab-cdc72afccbb4" (UID: "ef63dd40-d042-4334-9fab-cdc72afccbb4"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:09:49 crc kubenswrapper[4895]: I1202 09:09:49.647457 4895 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ef63dd40-d042-4334-9fab-cdc72afccbb4-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 02 09:09:49 crc kubenswrapper[4895]: I1202 09:09:49.647495 4895 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef63dd40-d042-4334-9fab-cdc72afccbb4-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 09:09:49 crc kubenswrapper[4895]: I1202 09:09:49.647505 4895 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ef63dd40-d042-4334-9fab-cdc72afccbb4-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 02 09:09:49 crc kubenswrapper[4895]: I1202 09:09:49.647516 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4wchz\" (UniqueName: \"kubernetes.io/projected/ef63dd40-d042-4334-9fab-cdc72afccbb4-kube-api-access-4wchz\") on node \"crc\" DevicePath \"\"" Dec 02 09:09:49 crc kubenswrapper[4895]: I1202 09:09:49.647533 4895 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ef63dd40-d042-4334-9fab-cdc72afccbb4-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 02 09:09:49 crc kubenswrapper[4895]: I1202 09:09:49.667582 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef63dd40-d042-4334-9fab-cdc72afccbb4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ef63dd40-d042-4334-9fab-cdc72afccbb4" (UID: "ef63dd40-d042-4334-9fab-cdc72afccbb4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:09:49 crc kubenswrapper[4895]: I1202 09:09:49.716976 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef63dd40-d042-4334-9fab-cdc72afccbb4-config-data" (OuterVolumeSpecName: "config-data") pod "ef63dd40-d042-4334-9fab-cdc72afccbb4" (UID: "ef63dd40-d042-4334-9fab-cdc72afccbb4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:09:49 crc kubenswrapper[4895]: I1202 09:09:49.751439 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef63dd40-d042-4334-9fab-cdc72afccbb4-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 09:09:49 crc kubenswrapper[4895]: I1202 09:09:49.751513 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef63dd40-d042-4334-9fab-cdc72afccbb4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 09:09:50 crc kubenswrapper[4895]: I1202 09:09:50.096869 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ef63dd40-d042-4334-9fab-cdc72afccbb4","Type":"ContainerDied","Data":"0d3235afeaf6c6134d87160920adf7820f1fe965df01de049c4b03345e9de38b"} Dec 02 09:09:50 crc kubenswrapper[4895]: I1202 09:09:50.096932 4895 scope.go:117] "RemoveContainer" containerID="aebe3172a052fcffa26da96fce0a7aaa27e0ebc7ceec9bf8a1e40f344dcf6298" Dec 02 09:09:50 crc kubenswrapper[4895]: I1202 09:09:50.097007 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 02 09:09:50 crc kubenswrapper[4895]: I1202 09:09:50.155174 4895 scope.go:117] "RemoveContainer" containerID="5368a2dc678174c38465f71f6d220df741d4442c0fb4e41d6cb34f80af141160" Dec 02 09:09:50 crc kubenswrapper[4895]: I1202 09:09:50.175848 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 02 09:09:50 crc kubenswrapper[4895]: I1202 09:09:50.197923 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 02 09:09:50 crc kubenswrapper[4895]: I1202 09:09:50.209591 4895 scope.go:117] "RemoveContainer" containerID="c8026d8de11da84d53b750e9f8ab511b35a0f07e9ce773f7ebfe18f487507271" Dec 02 09:09:50 crc kubenswrapper[4895]: I1202 09:09:50.217413 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 02 09:09:50 crc kubenswrapper[4895]: E1202 09:09:50.218402 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3627f865-5ee4-49f3-8b96-96cc1b94ec6e" containerName="init" Dec 02 09:09:50 crc kubenswrapper[4895]: I1202 09:09:50.218529 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="3627f865-5ee4-49f3-8b96-96cc1b94ec6e" containerName="init" Dec 02 09:09:50 crc kubenswrapper[4895]: E1202 09:09:50.218604 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef63dd40-d042-4334-9fab-cdc72afccbb4" containerName="ceilometer-central-agent" Dec 02 09:09:50 crc kubenswrapper[4895]: I1202 09:09:50.218666 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef63dd40-d042-4334-9fab-cdc72afccbb4" containerName="ceilometer-central-agent" Dec 02 09:09:50 crc kubenswrapper[4895]: E1202 09:09:50.218802 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef63dd40-d042-4334-9fab-cdc72afccbb4" containerName="ceilometer-notification-agent" Dec 02 09:09:50 crc kubenswrapper[4895]: I1202 09:09:50.218941 4895 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="ef63dd40-d042-4334-9fab-cdc72afccbb4" containerName="ceilometer-notification-agent" Dec 02 09:09:50 crc kubenswrapper[4895]: E1202 09:09:50.219028 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3627f865-5ee4-49f3-8b96-96cc1b94ec6e" containerName="dnsmasq-dns" Dec 02 09:09:50 crc kubenswrapper[4895]: I1202 09:09:50.219079 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="3627f865-5ee4-49f3-8b96-96cc1b94ec6e" containerName="dnsmasq-dns" Dec 02 09:09:50 crc kubenswrapper[4895]: E1202 09:09:50.219137 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef63dd40-d042-4334-9fab-cdc72afccbb4" containerName="sg-core" Dec 02 09:09:50 crc kubenswrapper[4895]: I1202 09:09:50.219188 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef63dd40-d042-4334-9fab-cdc72afccbb4" containerName="sg-core" Dec 02 09:09:50 crc kubenswrapper[4895]: E1202 09:09:50.219254 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef63dd40-d042-4334-9fab-cdc72afccbb4" containerName="proxy-httpd" Dec 02 09:09:50 crc kubenswrapper[4895]: I1202 09:09:50.219302 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef63dd40-d042-4334-9fab-cdc72afccbb4" containerName="proxy-httpd" Dec 02 09:09:50 crc kubenswrapper[4895]: I1202 09:09:50.219669 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef63dd40-d042-4334-9fab-cdc72afccbb4" containerName="proxy-httpd" Dec 02 09:09:50 crc kubenswrapper[4895]: I1202 09:09:50.219765 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef63dd40-d042-4334-9fab-cdc72afccbb4" containerName="ceilometer-notification-agent" Dec 02 09:09:50 crc kubenswrapper[4895]: I1202 09:09:50.219862 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="3627f865-5ee4-49f3-8b96-96cc1b94ec6e" containerName="dnsmasq-dns" Dec 02 09:09:50 crc kubenswrapper[4895]: I1202 09:09:50.219937 4895 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="ef63dd40-d042-4334-9fab-cdc72afccbb4" containerName="ceilometer-central-agent" Dec 02 09:09:50 crc kubenswrapper[4895]: I1202 09:09:50.219993 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef63dd40-d042-4334-9fab-cdc72afccbb4" containerName="sg-core" Dec 02 09:09:50 crc kubenswrapper[4895]: I1202 09:09:50.223453 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 02 09:09:50 crc kubenswrapper[4895]: I1202 09:09:50.230873 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 02 09:09:50 crc kubenswrapper[4895]: I1202 09:09:50.231704 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 02 09:09:50 crc kubenswrapper[4895]: I1202 09:09:50.232048 4895 scope.go:117] "RemoveContainer" containerID="a13a240e587902c6d288be4491d5210ce87c3b9478fcd34b69bde2ae41655350" Dec 02 09:09:50 crc kubenswrapper[4895]: I1202 09:09:50.241758 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 02 09:09:50 crc kubenswrapper[4895]: I1202 09:09:50.367200 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41c2235d-9bee-4e2d-878b-e6f1471a4078-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"41c2235d-9bee-4e2d-878b-e6f1471a4078\") " pod="openstack/ceilometer-0" Dec 02 09:09:50 crc kubenswrapper[4895]: I1202 09:09:50.367503 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41c2235d-9bee-4e2d-878b-e6f1471a4078-scripts\") pod \"ceilometer-0\" (UID: \"41c2235d-9bee-4e2d-878b-e6f1471a4078\") " pod="openstack/ceilometer-0" Dec 02 09:09:50 crc kubenswrapper[4895]: I1202 09:09:50.367618 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41c2235d-9bee-4e2d-878b-e6f1471a4078-config-data\") pod \"ceilometer-0\" (UID: \"41c2235d-9bee-4e2d-878b-e6f1471a4078\") " pod="openstack/ceilometer-0" Dec 02 09:09:50 crc kubenswrapper[4895]: I1202 09:09:50.367998 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tz6vg\" (UniqueName: \"kubernetes.io/projected/41c2235d-9bee-4e2d-878b-e6f1471a4078-kube-api-access-tz6vg\") pod \"ceilometer-0\" (UID: \"41c2235d-9bee-4e2d-878b-e6f1471a4078\") " pod="openstack/ceilometer-0" Dec 02 09:09:50 crc kubenswrapper[4895]: I1202 09:09:50.368057 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/41c2235d-9bee-4e2d-878b-e6f1471a4078-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"41c2235d-9bee-4e2d-878b-e6f1471a4078\") " pod="openstack/ceilometer-0" Dec 02 09:09:50 crc kubenswrapper[4895]: I1202 09:09:50.368081 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/41c2235d-9bee-4e2d-878b-e6f1471a4078-run-httpd\") pod \"ceilometer-0\" (UID: \"41c2235d-9bee-4e2d-878b-e6f1471a4078\") " pod="openstack/ceilometer-0" Dec 02 09:09:50 crc kubenswrapper[4895]: I1202 09:09:50.368244 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/41c2235d-9bee-4e2d-878b-e6f1471a4078-log-httpd\") pod \"ceilometer-0\" (UID: \"41c2235d-9bee-4e2d-878b-e6f1471a4078\") " pod="openstack/ceilometer-0" Dec 02 09:09:50 crc kubenswrapper[4895]: I1202 09:09:50.471431 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tz6vg\" (UniqueName: \"kubernetes.io/projected/41c2235d-9bee-4e2d-878b-e6f1471a4078-kube-api-access-tz6vg\") pod 
\"ceilometer-0\" (UID: \"41c2235d-9bee-4e2d-878b-e6f1471a4078\") " pod="openstack/ceilometer-0" Dec 02 09:09:50 crc kubenswrapper[4895]: I1202 09:09:50.471508 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/41c2235d-9bee-4e2d-878b-e6f1471a4078-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"41c2235d-9bee-4e2d-878b-e6f1471a4078\") " pod="openstack/ceilometer-0" Dec 02 09:09:50 crc kubenswrapper[4895]: I1202 09:09:50.471550 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/41c2235d-9bee-4e2d-878b-e6f1471a4078-run-httpd\") pod \"ceilometer-0\" (UID: \"41c2235d-9bee-4e2d-878b-e6f1471a4078\") " pod="openstack/ceilometer-0" Dec 02 09:09:50 crc kubenswrapper[4895]: I1202 09:09:50.471636 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/41c2235d-9bee-4e2d-878b-e6f1471a4078-log-httpd\") pod \"ceilometer-0\" (UID: \"41c2235d-9bee-4e2d-878b-e6f1471a4078\") " pod="openstack/ceilometer-0" Dec 02 09:09:50 crc kubenswrapper[4895]: I1202 09:09:50.471674 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41c2235d-9bee-4e2d-878b-e6f1471a4078-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"41c2235d-9bee-4e2d-878b-e6f1471a4078\") " pod="openstack/ceilometer-0" Dec 02 09:09:50 crc kubenswrapper[4895]: I1202 09:09:50.471805 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41c2235d-9bee-4e2d-878b-e6f1471a4078-scripts\") pod \"ceilometer-0\" (UID: \"41c2235d-9bee-4e2d-878b-e6f1471a4078\") " pod="openstack/ceilometer-0" Dec 02 09:09:50 crc kubenswrapper[4895]: I1202 09:09:50.471845 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/41c2235d-9bee-4e2d-878b-e6f1471a4078-config-data\") pod \"ceilometer-0\" (UID: \"41c2235d-9bee-4e2d-878b-e6f1471a4078\") " pod="openstack/ceilometer-0" Dec 02 09:09:50 crc kubenswrapper[4895]: I1202 09:09:50.472709 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/41c2235d-9bee-4e2d-878b-e6f1471a4078-run-httpd\") pod \"ceilometer-0\" (UID: \"41c2235d-9bee-4e2d-878b-e6f1471a4078\") " pod="openstack/ceilometer-0" Dec 02 09:09:50 crc kubenswrapper[4895]: I1202 09:09:50.472832 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/41c2235d-9bee-4e2d-878b-e6f1471a4078-log-httpd\") pod \"ceilometer-0\" (UID: \"41c2235d-9bee-4e2d-878b-e6f1471a4078\") " pod="openstack/ceilometer-0" Dec 02 09:09:50 crc kubenswrapper[4895]: I1202 09:09:50.479083 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41c2235d-9bee-4e2d-878b-e6f1471a4078-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"41c2235d-9bee-4e2d-878b-e6f1471a4078\") " pod="openstack/ceilometer-0" Dec 02 09:09:50 crc kubenswrapper[4895]: I1202 09:09:50.481885 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41c2235d-9bee-4e2d-878b-e6f1471a4078-scripts\") pod \"ceilometer-0\" (UID: \"41c2235d-9bee-4e2d-878b-e6f1471a4078\") " pod="openstack/ceilometer-0" Dec 02 09:09:50 crc kubenswrapper[4895]: I1202 09:09:50.482338 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/41c2235d-9bee-4e2d-878b-e6f1471a4078-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"41c2235d-9bee-4e2d-878b-e6f1471a4078\") " pod="openstack/ceilometer-0" Dec 02 09:09:50 crc kubenswrapper[4895]: I1202 09:09:50.483684 4895 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41c2235d-9bee-4e2d-878b-e6f1471a4078-config-data\") pod \"ceilometer-0\" (UID: \"41c2235d-9bee-4e2d-878b-e6f1471a4078\") " pod="openstack/ceilometer-0" Dec 02 09:09:50 crc kubenswrapper[4895]: I1202 09:09:50.495880 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tz6vg\" (UniqueName: \"kubernetes.io/projected/41c2235d-9bee-4e2d-878b-e6f1471a4078-kube-api-access-tz6vg\") pod \"ceilometer-0\" (UID: \"41c2235d-9bee-4e2d-878b-e6f1471a4078\") " pod="openstack/ceilometer-0" Dec 02 09:09:50 crc kubenswrapper[4895]: I1202 09:09:50.547936 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 02 09:09:51 crc kubenswrapper[4895]: I1202 09:09:51.043214 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 02 09:09:51 crc kubenswrapper[4895]: I1202 09:09:51.111924 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"41c2235d-9bee-4e2d-878b-e6f1471a4078","Type":"ContainerStarted","Data":"008242b17e32c4527be5b27faa1effd1beb0b53c610546f04e8be2b50ac2fe81"} Dec 02 09:09:51 crc kubenswrapper[4895]: I1202 09:09:51.155435 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef63dd40-d042-4334-9fab-cdc72afccbb4" path="/var/lib/kubelet/pods/ef63dd40-d042-4334-9fab-cdc72afccbb4/volumes" Dec 02 09:09:52 crc kubenswrapper[4895]: I1202 09:09:52.161368 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"41c2235d-9bee-4e2d-878b-e6f1471a4078","Type":"ContainerStarted","Data":"30090b2ade4752b3b6d8e14d8111985d03a139ed19757c346e0c7379bede83b3"} Dec 02 09:09:52 crc kubenswrapper[4895]: I1202 09:09:52.303241 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-share-share1-0" Dec 02 09:09:53 crc 
kubenswrapper[4895]: I1202 09:09:53.195956 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"41c2235d-9bee-4e2d-878b-e6f1471a4078","Type":"ContainerStarted","Data":"b9bf00c311ff897377b38535c3048325b78a4459a7ef0f7157e587b57165166a"} Dec 02 09:09:54 crc kubenswrapper[4895]: I1202 09:09:54.330287 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/manila-api-0" Dec 02 09:09:54 crc kubenswrapper[4895]: I1202 09:09:54.413066 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-scheduler-0" Dec 02 09:09:55 crc kubenswrapper[4895]: I1202 09:09:55.250059 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"41c2235d-9bee-4e2d-878b-e6f1471a4078","Type":"ContainerStarted","Data":"9fb9b42789842af55037d49c5d248252e86648ea8d43bcc5bf7d6ffc1293f60d"} Dec 02 09:09:58 crc kubenswrapper[4895]: I1202 09:09:58.299656 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"41c2235d-9bee-4e2d-878b-e6f1471a4078","Type":"ContainerStarted","Data":"8344f4464435a9bdddc4684904e7b5e01886f0039fa886453384c17b5ea7254f"} Dec 02 09:09:58 crc kubenswrapper[4895]: I1202 09:09:58.300284 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 02 09:09:58 crc kubenswrapper[4895]: I1202 09:09:58.332558 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.301088313 podStartE2EDuration="8.332531778s" podCreationTimestamp="2025-12-02 09:09:50 +0000 UTC" firstStartedPulling="2025-12-02 09:09:51.048184577 +0000 UTC m=+6402.219044190" lastFinishedPulling="2025-12-02 09:09:57.079628052 +0000 UTC m=+6408.250487655" observedRunningTime="2025-12-02 09:09:58.326339755 +0000 UTC m=+6409.497199368" watchObservedRunningTime="2025-12-02 09:09:58.332531778 +0000 UTC m=+6409.503391411" Dec 02 
09:10:03 crc kubenswrapper[4895]: I1202 09:10:03.993502 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-share-share1-0" Dec 02 09:10:04 crc kubenswrapper[4895]: I1202 09:10:04.140902 4895 scope.go:117] "RemoveContainer" containerID="d41b8f042da7f202a665565fffb111027944b6b1623344c1475552b27b519444" Dec 02 09:10:04 crc kubenswrapper[4895]: E1202 09:10:04.141136 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:10:16 crc kubenswrapper[4895]: I1202 09:10:16.141806 4895 scope.go:117] "RemoveContainer" containerID="d41b8f042da7f202a665565fffb111027944b6b1623344c1475552b27b519444" Dec 02 09:10:16 crc kubenswrapper[4895]: E1202 09:10:16.142624 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:10:20 crc kubenswrapper[4895]: I1202 09:10:20.554275 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 02 09:10:27 crc kubenswrapper[4895]: I1202 09:10:27.141115 4895 scope.go:117] "RemoveContainer" containerID="d41b8f042da7f202a665565fffb111027944b6b1623344c1475552b27b519444" Dec 02 09:10:27 crc kubenswrapper[4895]: E1202 09:10:27.142029 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:10:41 crc kubenswrapper[4895]: I1202 09:10:41.141586 4895 scope.go:117] "RemoveContainer" containerID="d41b8f042da7f202a665565fffb111027944b6b1623344c1475552b27b519444" Dec 02 09:10:41 crc kubenswrapper[4895]: E1202 09:10:41.142592 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:10:47 crc kubenswrapper[4895]: I1202 09:10:47.492291 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6cbfb99f9-67c58"] Dec 02 09:10:47 crc kubenswrapper[4895]: I1202 09:10:47.497504 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6cbfb99f9-67c58" Dec 02 09:10:47 crc kubenswrapper[4895]: I1202 09:10:47.507882 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6cbfb99f9-67c58"] Dec 02 09:10:47 crc kubenswrapper[4895]: I1202 09:10:47.508083 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1" Dec 02 09:10:47 crc kubenswrapper[4895]: I1202 09:10:47.663276 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/927f1ae3-c12f-42d2-9750-8d102628c18b-config\") pod \"dnsmasq-dns-6cbfb99f9-67c58\" (UID: \"927f1ae3-c12f-42d2-9750-8d102628c18b\") " pod="openstack/dnsmasq-dns-6cbfb99f9-67c58" Dec 02 09:10:47 crc kubenswrapper[4895]: I1202 09:10:47.663665 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/927f1ae3-c12f-42d2-9750-8d102628c18b-ovsdbserver-nb\") pod \"dnsmasq-dns-6cbfb99f9-67c58\" (UID: \"927f1ae3-c12f-42d2-9750-8d102628c18b\") " pod="openstack/dnsmasq-dns-6cbfb99f9-67c58" Dec 02 09:10:47 crc kubenswrapper[4895]: I1202 09:10:47.663726 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/927f1ae3-c12f-42d2-9750-8d102628c18b-dns-svc\") pod \"dnsmasq-dns-6cbfb99f9-67c58\" (UID: \"927f1ae3-c12f-42d2-9750-8d102628c18b\") " pod="openstack/dnsmasq-dns-6cbfb99f9-67c58" Dec 02 09:10:47 crc kubenswrapper[4895]: I1202 09:10:47.663781 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/927f1ae3-c12f-42d2-9750-8d102628c18b-ovsdbserver-sb\") pod \"dnsmasq-dns-6cbfb99f9-67c58\" (UID: \"927f1ae3-c12f-42d2-9750-8d102628c18b\") " pod="openstack/dnsmasq-dns-6cbfb99f9-67c58" Dec 02 
09:10:47 crc kubenswrapper[4895]: I1202 09:10:47.663816 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/927f1ae3-c12f-42d2-9750-8d102628c18b-openstack-cell1\") pod \"dnsmasq-dns-6cbfb99f9-67c58\" (UID: \"927f1ae3-c12f-42d2-9750-8d102628c18b\") " pod="openstack/dnsmasq-dns-6cbfb99f9-67c58" Dec 02 09:10:47 crc kubenswrapper[4895]: I1202 09:10:47.664183 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djxz9\" (UniqueName: \"kubernetes.io/projected/927f1ae3-c12f-42d2-9750-8d102628c18b-kube-api-access-djxz9\") pod \"dnsmasq-dns-6cbfb99f9-67c58\" (UID: \"927f1ae3-c12f-42d2-9750-8d102628c18b\") " pod="openstack/dnsmasq-dns-6cbfb99f9-67c58" Dec 02 09:10:47 crc kubenswrapper[4895]: I1202 09:10:47.766663 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/927f1ae3-c12f-42d2-9750-8d102628c18b-config\") pod \"dnsmasq-dns-6cbfb99f9-67c58\" (UID: \"927f1ae3-c12f-42d2-9750-8d102628c18b\") " pod="openstack/dnsmasq-dns-6cbfb99f9-67c58" Dec 02 09:10:47 crc kubenswrapper[4895]: I1202 09:10:47.766733 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/927f1ae3-c12f-42d2-9750-8d102628c18b-ovsdbserver-nb\") pod \"dnsmasq-dns-6cbfb99f9-67c58\" (UID: \"927f1ae3-c12f-42d2-9750-8d102628c18b\") " pod="openstack/dnsmasq-dns-6cbfb99f9-67c58" Dec 02 09:10:47 crc kubenswrapper[4895]: I1202 09:10:47.766804 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/927f1ae3-c12f-42d2-9750-8d102628c18b-dns-svc\") pod \"dnsmasq-dns-6cbfb99f9-67c58\" (UID: \"927f1ae3-c12f-42d2-9750-8d102628c18b\") " pod="openstack/dnsmasq-dns-6cbfb99f9-67c58" Dec 02 09:10:47 crc kubenswrapper[4895]: I1202 
09:10:47.766824 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/927f1ae3-c12f-42d2-9750-8d102628c18b-ovsdbserver-sb\") pod \"dnsmasq-dns-6cbfb99f9-67c58\" (UID: \"927f1ae3-c12f-42d2-9750-8d102628c18b\") " pod="openstack/dnsmasq-dns-6cbfb99f9-67c58" Dec 02 09:10:47 crc kubenswrapper[4895]: I1202 09:10:47.766853 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/927f1ae3-c12f-42d2-9750-8d102628c18b-openstack-cell1\") pod \"dnsmasq-dns-6cbfb99f9-67c58\" (UID: \"927f1ae3-c12f-42d2-9750-8d102628c18b\") " pod="openstack/dnsmasq-dns-6cbfb99f9-67c58" Dec 02 09:10:47 crc kubenswrapper[4895]: I1202 09:10:47.766899 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djxz9\" (UniqueName: \"kubernetes.io/projected/927f1ae3-c12f-42d2-9750-8d102628c18b-kube-api-access-djxz9\") pod \"dnsmasq-dns-6cbfb99f9-67c58\" (UID: \"927f1ae3-c12f-42d2-9750-8d102628c18b\") " pod="openstack/dnsmasq-dns-6cbfb99f9-67c58" Dec 02 09:10:47 crc kubenswrapper[4895]: I1202 09:10:47.768025 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/927f1ae3-c12f-42d2-9750-8d102628c18b-config\") pod \"dnsmasq-dns-6cbfb99f9-67c58\" (UID: \"927f1ae3-c12f-42d2-9750-8d102628c18b\") " pod="openstack/dnsmasq-dns-6cbfb99f9-67c58" Dec 02 09:10:47 crc kubenswrapper[4895]: I1202 09:10:47.768032 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/927f1ae3-c12f-42d2-9750-8d102628c18b-dns-svc\") pod \"dnsmasq-dns-6cbfb99f9-67c58\" (UID: \"927f1ae3-c12f-42d2-9750-8d102628c18b\") " pod="openstack/dnsmasq-dns-6cbfb99f9-67c58" Dec 02 09:10:47 crc kubenswrapper[4895]: I1202 09:10:47.768828 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/927f1ae3-c12f-42d2-9750-8d102628c18b-ovsdbserver-sb\") pod \"dnsmasq-dns-6cbfb99f9-67c58\" (UID: \"927f1ae3-c12f-42d2-9750-8d102628c18b\") " pod="openstack/dnsmasq-dns-6cbfb99f9-67c58" Dec 02 09:10:47 crc kubenswrapper[4895]: I1202 09:10:47.769005 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/927f1ae3-c12f-42d2-9750-8d102628c18b-openstack-cell1\") pod \"dnsmasq-dns-6cbfb99f9-67c58\" (UID: \"927f1ae3-c12f-42d2-9750-8d102628c18b\") " pod="openstack/dnsmasq-dns-6cbfb99f9-67c58" Dec 02 09:10:47 crc kubenswrapper[4895]: I1202 09:10:47.769081 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/927f1ae3-c12f-42d2-9750-8d102628c18b-ovsdbserver-nb\") pod \"dnsmasq-dns-6cbfb99f9-67c58\" (UID: \"927f1ae3-c12f-42d2-9750-8d102628c18b\") " pod="openstack/dnsmasq-dns-6cbfb99f9-67c58" Dec 02 09:10:47 crc kubenswrapper[4895]: I1202 09:10:47.795207 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djxz9\" (UniqueName: \"kubernetes.io/projected/927f1ae3-c12f-42d2-9750-8d102628c18b-kube-api-access-djxz9\") pod \"dnsmasq-dns-6cbfb99f9-67c58\" (UID: \"927f1ae3-c12f-42d2-9750-8d102628c18b\") " pod="openstack/dnsmasq-dns-6cbfb99f9-67c58" Dec 02 09:10:47 crc kubenswrapper[4895]: I1202 09:10:47.837508 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6cbfb99f9-67c58" Dec 02 09:10:48 crc kubenswrapper[4895]: I1202 09:10:48.335723 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6cbfb99f9-67c58"] Dec 02 09:10:48 crc kubenswrapper[4895]: I1202 09:10:48.819090 4895 generic.go:334] "Generic (PLEG): container finished" podID="927f1ae3-c12f-42d2-9750-8d102628c18b" containerID="390fa6dc4f7436e0069f370daa2847374a270889b2b98c95719e0ebf4cd881fe" exitCode=0 Dec 02 09:10:48 crc kubenswrapper[4895]: I1202 09:10:48.819209 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cbfb99f9-67c58" event={"ID":"927f1ae3-c12f-42d2-9750-8d102628c18b","Type":"ContainerDied","Data":"390fa6dc4f7436e0069f370daa2847374a270889b2b98c95719e0ebf4cd881fe"} Dec 02 09:10:48 crc kubenswrapper[4895]: I1202 09:10:48.819775 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cbfb99f9-67c58" event={"ID":"927f1ae3-c12f-42d2-9750-8d102628c18b","Type":"ContainerStarted","Data":"3a967ef8424ca5fcd15bf0147320afd4fe5cce0dc07879ffcd517940aa4da5b7"} Dec 02 09:10:49 crc kubenswrapper[4895]: I1202 09:10:49.919387 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cbfb99f9-67c58" event={"ID":"927f1ae3-c12f-42d2-9750-8d102628c18b","Type":"ContainerStarted","Data":"2006369ee322824dbdd417083af119d371e2c143dda561a35a11dee7823dce05"} Dec 02 09:10:49 crc kubenswrapper[4895]: I1202 09:10:49.956343 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6cbfb99f9-67c58" podStartSLOduration=2.9563207929999997 podStartE2EDuration="2.956320793s" podCreationTimestamp="2025-12-02 09:10:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:10:49.939677255 +0000 UTC m=+6461.110536878" watchObservedRunningTime="2025-12-02 09:10:49.956320793 +0000 UTC 
m=+6461.127180406" Dec 02 09:10:50 crc kubenswrapper[4895]: I1202 09:10:50.928807 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6cbfb99f9-67c58" Dec 02 09:10:56 crc kubenswrapper[4895]: I1202 09:10:56.141838 4895 scope.go:117] "RemoveContainer" containerID="d41b8f042da7f202a665565fffb111027944b6b1623344c1475552b27b519444" Dec 02 09:10:56 crc kubenswrapper[4895]: E1202 09:10:56.142862 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:10:57 crc kubenswrapper[4895]: I1202 09:10:57.839573 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6cbfb99f9-67c58" Dec 02 09:10:57 crc kubenswrapper[4895]: I1202 09:10:57.947886 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5db76f6d45-bbthz"] Dec 02 09:10:57 crc kubenswrapper[4895]: I1202 09:10:57.948207 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5db76f6d45-bbthz" podUID="31be70b7-4a75-4a4b-b0bc-51fdaedaf8c2" containerName="dnsmasq-dns" containerID="cri-o://75bb04fab936f486061cd47c2777e6398ae7bb4fca90001c61377c210cf29867" gracePeriod=10 Dec 02 09:10:58 crc kubenswrapper[4895]: I1202 09:10:58.111793 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7864bfbdf-hszwn"] Dec 02 09:10:58 crc kubenswrapper[4895]: I1202 09:10:58.114380 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7864bfbdf-hszwn" Dec 02 09:10:58 crc kubenswrapper[4895]: I1202 09:10:58.119991 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7864bfbdf-hszwn"] Dec 02 09:10:58 crc kubenswrapper[4895]: I1202 09:10:58.277007 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c70683d8-a861-4c90-b092-41aad531f04e-ovsdbserver-nb\") pod \"dnsmasq-dns-7864bfbdf-hszwn\" (UID: \"c70683d8-a861-4c90-b092-41aad531f04e\") " pod="openstack/dnsmasq-dns-7864bfbdf-hszwn" Dec 02 09:10:58 crc kubenswrapper[4895]: I1202 09:10:58.277080 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/c70683d8-a861-4c90-b092-41aad531f04e-openstack-cell1\") pod \"dnsmasq-dns-7864bfbdf-hszwn\" (UID: \"c70683d8-a861-4c90-b092-41aad531f04e\") " pod="openstack/dnsmasq-dns-7864bfbdf-hszwn" Dec 02 09:10:58 crc kubenswrapper[4895]: I1202 09:10:58.277148 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c70683d8-a861-4c90-b092-41aad531f04e-dns-svc\") pod \"dnsmasq-dns-7864bfbdf-hszwn\" (UID: \"c70683d8-a861-4c90-b092-41aad531f04e\") " pod="openstack/dnsmasq-dns-7864bfbdf-hszwn" Dec 02 09:10:58 crc kubenswrapper[4895]: I1202 09:10:58.277202 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c70683d8-a861-4c90-b092-41aad531f04e-ovsdbserver-sb\") pod \"dnsmasq-dns-7864bfbdf-hszwn\" (UID: \"c70683d8-a861-4c90-b092-41aad531f04e\") " pod="openstack/dnsmasq-dns-7864bfbdf-hszwn" Dec 02 09:10:58 crc kubenswrapper[4895]: I1202 09:10:58.277322 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-d5zx6\" (UniqueName: \"kubernetes.io/projected/c70683d8-a861-4c90-b092-41aad531f04e-kube-api-access-d5zx6\") pod \"dnsmasq-dns-7864bfbdf-hszwn\" (UID: \"c70683d8-a861-4c90-b092-41aad531f04e\") " pod="openstack/dnsmasq-dns-7864bfbdf-hszwn" Dec 02 09:10:58 crc kubenswrapper[4895]: I1202 09:10:58.277390 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c70683d8-a861-4c90-b092-41aad531f04e-config\") pod \"dnsmasq-dns-7864bfbdf-hszwn\" (UID: \"c70683d8-a861-4c90-b092-41aad531f04e\") " pod="openstack/dnsmasq-dns-7864bfbdf-hszwn" Dec 02 09:10:58 crc kubenswrapper[4895]: I1202 09:10:58.379891 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5zx6\" (UniqueName: \"kubernetes.io/projected/c70683d8-a861-4c90-b092-41aad531f04e-kube-api-access-d5zx6\") pod \"dnsmasq-dns-7864bfbdf-hszwn\" (UID: \"c70683d8-a861-4c90-b092-41aad531f04e\") " pod="openstack/dnsmasq-dns-7864bfbdf-hszwn" Dec 02 09:10:58 crc kubenswrapper[4895]: I1202 09:10:58.379989 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c70683d8-a861-4c90-b092-41aad531f04e-config\") pod \"dnsmasq-dns-7864bfbdf-hszwn\" (UID: \"c70683d8-a861-4c90-b092-41aad531f04e\") " pod="openstack/dnsmasq-dns-7864bfbdf-hszwn" Dec 02 09:10:58 crc kubenswrapper[4895]: I1202 09:10:58.380062 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c70683d8-a861-4c90-b092-41aad531f04e-ovsdbserver-nb\") pod \"dnsmasq-dns-7864bfbdf-hszwn\" (UID: \"c70683d8-a861-4c90-b092-41aad531f04e\") " pod="openstack/dnsmasq-dns-7864bfbdf-hszwn" Dec 02 09:10:58 crc kubenswrapper[4895]: I1202 09:10:58.380088 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1\" (UniqueName: 
\"kubernetes.io/configmap/c70683d8-a861-4c90-b092-41aad531f04e-openstack-cell1\") pod \"dnsmasq-dns-7864bfbdf-hszwn\" (UID: \"c70683d8-a861-4c90-b092-41aad531f04e\") " pod="openstack/dnsmasq-dns-7864bfbdf-hszwn" Dec 02 09:10:58 crc kubenswrapper[4895]: I1202 09:10:58.380133 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c70683d8-a861-4c90-b092-41aad531f04e-dns-svc\") pod \"dnsmasq-dns-7864bfbdf-hszwn\" (UID: \"c70683d8-a861-4c90-b092-41aad531f04e\") " pod="openstack/dnsmasq-dns-7864bfbdf-hszwn" Dec 02 09:10:58 crc kubenswrapper[4895]: I1202 09:10:58.380161 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c70683d8-a861-4c90-b092-41aad531f04e-ovsdbserver-sb\") pod \"dnsmasq-dns-7864bfbdf-hszwn\" (UID: \"c70683d8-a861-4c90-b092-41aad531f04e\") " pod="openstack/dnsmasq-dns-7864bfbdf-hszwn" Dec 02 09:10:58 crc kubenswrapper[4895]: I1202 09:10:58.381457 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c70683d8-a861-4c90-b092-41aad531f04e-config\") pod \"dnsmasq-dns-7864bfbdf-hszwn\" (UID: \"c70683d8-a861-4c90-b092-41aad531f04e\") " pod="openstack/dnsmasq-dns-7864bfbdf-hszwn" Dec 02 09:10:58 crc kubenswrapper[4895]: I1202 09:10:58.381562 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c70683d8-a861-4c90-b092-41aad531f04e-dns-svc\") pod \"dnsmasq-dns-7864bfbdf-hszwn\" (UID: \"c70683d8-a861-4c90-b092-41aad531f04e\") " pod="openstack/dnsmasq-dns-7864bfbdf-hszwn" Dec 02 09:10:58 crc kubenswrapper[4895]: I1202 09:10:58.381651 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c70683d8-a861-4c90-b092-41aad531f04e-ovsdbserver-nb\") pod \"dnsmasq-dns-7864bfbdf-hszwn\" (UID: 
\"c70683d8-a861-4c90-b092-41aad531f04e\") " pod="openstack/dnsmasq-dns-7864bfbdf-hszwn" Dec 02 09:10:58 crc kubenswrapper[4895]: I1202 09:10:58.382046 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c70683d8-a861-4c90-b092-41aad531f04e-ovsdbserver-sb\") pod \"dnsmasq-dns-7864bfbdf-hszwn\" (UID: \"c70683d8-a861-4c90-b092-41aad531f04e\") " pod="openstack/dnsmasq-dns-7864bfbdf-hszwn" Dec 02 09:10:58 crc kubenswrapper[4895]: I1202 09:10:58.382310 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/c70683d8-a861-4c90-b092-41aad531f04e-openstack-cell1\") pod \"dnsmasq-dns-7864bfbdf-hszwn\" (UID: \"c70683d8-a861-4c90-b092-41aad531f04e\") " pod="openstack/dnsmasq-dns-7864bfbdf-hszwn" Dec 02 09:10:58 crc kubenswrapper[4895]: I1202 09:10:58.422844 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5zx6\" (UniqueName: \"kubernetes.io/projected/c70683d8-a861-4c90-b092-41aad531f04e-kube-api-access-d5zx6\") pod \"dnsmasq-dns-7864bfbdf-hszwn\" (UID: \"c70683d8-a861-4c90-b092-41aad531f04e\") " pod="openstack/dnsmasq-dns-7864bfbdf-hszwn" Dec 02 09:10:58 crc kubenswrapper[4895]: I1202 09:10:58.451355 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7864bfbdf-hszwn" Dec 02 09:10:58 crc kubenswrapper[4895]: I1202 09:10:58.616714 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5db76f6d45-bbthz" Dec 02 09:10:58 crc kubenswrapper[4895]: I1202 09:10:58.787816 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/31be70b7-4a75-4a4b-b0bc-51fdaedaf8c2-ovsdbserver-sb\") pod \"31be70b7-4a75-4a4b-b0bc-51fdaedaf8c2\" (UID: \"31be70b7-4a75-4a4b-b0bc-51fdaedaf8c2\") " Dec 02 09:10:58 crc kubenswrapper[4895]: I1202 09:10:58.787946 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v8ztq\" (UniqueName: \"kubernetes.io/projected/31be70b7-4a75-4a4b-b0bc-51fdaedaf8c2-kube-api-access-v8ztq\") pod \"31be70b7-4a75-4a4b-b0bc-51fdaedaf8c2\" (UID: \"31be70b7-4a75-4a4b-b0bc-51fdaedaf8c2\") " Dec 02 09:10:58 crc kubenswrapper[4895]: I1202 09:10:58.788005 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/31be70b7-4a75-4a4b-b0bc-51fdaedaf8c2-dns-svc\") pod \"31be70b7-4a75-4a4b-b0bc-51fdaedaf8c2\" (UID: \"31be70b7-4a75-4a4b-b0bc-51fdaedaf8c2\") " Dec 02 09:10:58 crc kubenswrapper[4895]: I1202 09:10:58.788069 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/31be70b7-4a75-4a4b-b0bc-51fdaedaf8c2-ovsdbserver-nb\") pod \"31be70b7-4a75-4a4b-b0bc-51fdaedaf8c2\" (UID: \"31be70b7-4a75-4a4b-b0bc-51fdaedaf8c2\") " Dec 02 09:10:58 crc kubenswrapper[4895]: I1202 09:10:58.788139 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/31be70b7-4a75-4a4b-b0bc-51fdaedaf8c2-config\") pod \"31be70b7-4a75-4a4b-b0bc-51fdaedaf8c2\" (UID: \"31be70b7-4a75-4a4b-b0bc-51fdaedaf8c2\") " Dec 02 09:10:58 crc kubenswrapper[4895]: I1202 09:10:58.793568 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/31be70b7-4a75-4a4b-b0bc-51fdaedaf8c2-kube-api-access-v8ztq" (OuterVolumeSpecName: "kube-api-access-v8ztq") pod "31be70b7-4a75-4a4b-b0bc-51fdaedaf8c2" (UID: "31be70b7-4a75-4a4b-b0bc-51fdaedaf8c2"). InnerVolumeSpecName "kube-api-access-v8ztq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:10:58 crc kubenswrapper[4895]: I1202 09:10:58.842622 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31be70b7-4a75-4a4b-b0bc-51fdaedaf8c2-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "31be70b7-4a75-4a4b-b0bc-51fdaedaf8c2" (UID: "31be70b7-4a75-4a4b-b0bc-51fdaedaf8c2"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:10:58 crc kubenswrapper[4895]: I1202 09:10:58.846507 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31be70b7-4a75-4a4b-b0bc-51fdaedaf8c2-config" (OuterVolumeSpecName: "config") pod "31be70b7-4a75-4a4b-b0bc-51fdaedaf8c2" (UID: "31be70b7-4a75-4a4b-b0bc-51fdaedaf8c2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:10:58 crc kubenswrapper[4895]: I1202 09:10:58.852164 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31be70b7-4a75-4a4b-b0bc-51fdaedaf8c2-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "31be70b7-4a75-4a4b-b0bc-51fdaedaf8c2" (UID: "31be70b7-4a75-4a4b-b0bc-51fdaedaf8c2"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:10:58 crc kubenswrapper[4895]: I1202 09:10:58.855347 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31be70b7-4a75-4a4b-b0bc-51fdaedaf8c2-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "31be70b7-4a75-4a4b-b0bc-51fdaedaf8c2" (UID: "31be70b7-4a75-4a4b-b0bc-51fdaedaf8c2"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:10:58 crc kubenswrapper[4895]: I1202 09:10:58.891111 4895 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/31be70b7-4a75-4a4b-b0bc-51fdaedaf8c2-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 02 09:10:58 crc kubenswrapper[4895]: I1202 09:10:58.891361 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v8ztq\" (UniqueName: \"kubernetes.io/projected/31be70b7-4a75-4a4b-b0bc-51fdaedaf8c2-kube-api-access-v8ztq\") on node \"crc\" DevicePath \"\"" Dec 02 09:10:58 crc kubenswrapper[4895]: I1202 09:10:58.891444 4895 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/31be70b7-4a75-4a4b-b0bc-51fdaedaf8c2-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 02 09:10:58 crc kubenswrapper[4895]: I1202 09:10:58.891516 4895 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/31be70b7-4a75-4a4b-b0bc-51fdaedaf8c2-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 02 09:10:58 crc kubenswrapper[4895]: I1202 09:10:58.891584 4895 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/31be70b7-4a75-4a4b-b0bc-51fdaedaf8c2-config\") on node \"crc\" DevicePath \"\"" Dec 02 09:10:58 crc kubenswrapper[4895]: I1202 09:10:58.969196 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7864bfbdf-hszwn"] Dec 02 09:10:59 crc kubenswrapper[4895]: I1202 09:10:59.021700 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7864bfbdf-hszwn" event={"ID":"c70683d8-a861-4c90-b092-41aad531f04e","Type":"ContainerStarted","Data":"ccafa2ce53ddfa25021a3a8647adfb503e240b42b939dbd65dee28683f7da080"} Dec 02 09:10:59 crc kubenswrapper[4895]: I1202 09:10:59.024392 4895 generic.go:334] "Generic (PLEG): container finished" 
podID="31be70b7-4a75-4a4b-b0bc-51fdaedaf8c2" containerID="75bb04fab936f486061cd47c2777e6398ae7bb4fca90001c61377c210cf29867" exitCode=0 Dec 02 09:10:59 crc kubenswrapper[4895]: I1202 09:10:59.024442 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5db76f6d45-bbthz" event={"ID":"31be70b7-4a75-4a4b-b0bc-51fdaedaf8c2","Type":"ContainerDied","Data":"75bb04fab936f486061cd47c2777e6398ae7bb4fca90001c61377c210cf29867"} Dec 02 09:10:59 crc kubenswrapper[4895]: I1202 09:10:59.024475 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5db76f6d45-bbthz" event={"ID":"31be70b7-4a75-4a4b-b0bc-51fdaedaf8c2","Type":"ContainerDied","Data":"014c15f3093c9a2a91f163631513edb303519454c6551b514195808d0135d69c"} Dec 02 09:10:59 crc kubenswrapper[4895]: I1202 09:10:59.024470 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5db76f6d45-bbthz" Dec 02 09:10:59 crc kubenswrapper[4895]: I1202 09:10:59.024493 4895 scope.go:117] "RemoveContainer" containerID="75bb04fab936f486061cd47c2777e6398ae7bb4fca90001c61377c210cf29867" Dec 02 09:10:59 crc kubenswrapper[4895]: I1202 09:10:59.050103 4895 scope.go:117] "RemoveContainer" containerID="d1eab92b762f488d1bafc4d2446c99f81fdd829002b641ac4826b9b9ff477ef4" Dec 02 09:10:59 crc kubenswrapper[4895]: I1202 09:10:59.069482 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5db76f6d45-bbthz"] Dec 02 09:10:59 crc kubenswrapper[4895]: I1202 09:10:59.080231 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5db76f6d45-bbthz"] Dec 02 09:10:59 crc kubenswrapper[4895]: I1202 09:10:59.167276 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31be70b7-4a75-4a4b-b0bc-51fdaedaf8c2" path="/var/lib/kubelet/pods/31be70b7-4a75-4a4b-b0bc-51fdaedaf8c2/volumes" Dec 02 09:10:59 crc kubenswrapper[4895]: I1202 09:10:59.199442 4895 scope.go:117] "RemoveContainer" 
containerID="75bb04fab936f486061cd47c2777e6398ae7bb4fca90001c61377c210cf29867" Dec 02 09:10:59 crc kubenswrapper[4895]: E1202 09:10:59.202470 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75bb04fab936f486061cd47c2777e6398ae7bb4fca90001c61377c210cf29867\": container with ID starting with 75bb04fab936f486061cd47c2777e6398ae7bb4fca90001c61377c210cf29867 not found: ID does not exist" containerID="75bb04fab936f486061cd47c2777e6398ae7bb4fca90001c61377c210cf29867" Dec 02 09:10:59 crc kubenswrapper[4895]: I1202 09:10:59.202529 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75bb04fab936f486061cd47c2777e6398ae7bb4fca90001c61377c210cf29867"} err="failed to get container status \"75bb04fab936f486061cd47c2777e6398ae7bb4fca90001c61377c210cf29867\": rpc error: code = NotFound desc = could not find container \"75bb04fab936f486061cd47c2777e6398ae7bb4fca90001c61377c210cf29867\": container with ID starting with 75bb04fab936f486061cd47c2777e6398ae7bb4fca90001c61377c210cf29867 not found: ID does not exist" Dec 02 09:10:59 crc kubenswrapper[4895]: I1202 09:10:59.202556 4895 scope.go:117] "RemoveContainer" containerID="d1eab92b762f488d1bafc4d2446c99f81fdd829002b641ac4826b9b9ff477ef4" Dec 02 09:10:59 crc kubenswrapper[4895]: E1202 09:10:59.203017 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1eab92b762f488d1bafc4d2446c99f81fdd829002b641ac4826b9b9ff477ef4\": container with ID starting with d1eab92b762f488d1bafc4d2446c99f81fdd829002b641ac4826b9b9ff477ef4 not found: ID does not exist" containerID="d1eab92b762f488d1bafc4d2446c99f81fdd829002b641ac4826b9b9ff477ef4" Dec 02 09:10:59 crc kubenswrapper[4895]: I1202 09:10:59.203088 4895 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"d1eab92b762f488d1bafc4d2446c99f81fdd829002b641ac4826b9b9ff477ef4"} err="failed to get container status \"d1eab92b762f488d1bafc4d2446c99f81fdd829002b641ac4826b9b9ff477ef4\": rpc error: code = NotFound desc = could not find container \"d1eab92b762f488d1bafc4d2446c99f81fdd829002b641ac4826b9b9ff477ef4\": container with ID starting with d1eab92b762f488d1bafc4d2446c99f81fdd829002b641ac4826b9b9ff477ef4 not found: ID does not exist" Dec 02 09:10:59 crc kubenswrapper[4895]: E1202 09:10:59.295476 4895 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod31be70b7_4a75_4a4b_b0bc_51fdaedaf8c2.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod31be70b7_4a75_4a4b_b0bc_51fdaedaf8c2.slice/crio-014c15f3093c9a2a91f163631513edb303519454c6551b514195808d0135d69c\": RecentStats: unable to find data in memory cache]" Dec 02 09:11:00 crc kubenswrapper[4895]: I1202 09:11:00.035910 4895 generic.go:334] "Generic (PLEG): container finished" podID="c70683d8-a861-4c90-b092-41aad531f04e" containerID="38da4fd18e1b0479d9b5b4b6a95a42661e845c4599e36023f903af45e4ae801b" exitCode=0 Dec 02 09:11:00 crc kubenswrapper[4895]: I1202 09:11:00.035996 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7864bfbdf-hszwn" event={"ID":"c70683d8-a861-4c90-b092-41aad531f04e","Type":"ContainerDied","Data":"38da4fd18e1b0479d9b5b4b6a95a42661e845c4599e36023f903af45e4ae801b"} Dec 02 09:11:01 crc kubenswrapper[4895]: I1202 09:11:01.049128 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7864bfbdf-hszwn" event={"ID":"c70683d8-a861-4c90-b092-41aad531f04e","Type":"ContainerStarted","Data":"d1a13549f35407aa51e4248f3f7d7f84a22f392349cf7ae19255578eb4def196"} Dec 02 09:11:01 crc kubenswrapper[4895]: I1202 09:11:01.049985 4895 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7864bfbdf-hszwn" Dec 02 09:11:01 crc kubenswrapper[4895]: I1202 09:11:01.079522 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7864bfbdf-hszwn" podStartSLOduration=3.079458055 podStartE2EDuration="3.079458055s" podCreationTimestamp="2025-12-02 09:10:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:11:01.072702575 +0000 UTC m=+6472.243562198" watchObservedRunningTime="2025-12-02 09:11:01.079458055 +0000 UTC m=+6472.250317678" Dec 02 09:11:08 crc kubenswrapper[4895]: I1202 09:11:08.453796 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7864bfbdf-hszwn" Dec 02 09:11:08 crc kubenswrapper[4895]: I1202 09:11:08.513679 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6cbfb99f9-67c58"] Dec 02 09:11:08 crc kubenswrapper[4895]: I1202 09:11:08.514030 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6cbfb99f9-67c58" podUID="927f1ae3-c12f-42d2-9750-8d102628c18b" containerName="dnsmasq-dns" containerID="cri-o://2006369ee322824dbdd417083af119d371e2c143dda561a35a11dee7823dce05" gracePeriod=10 Dec 02 09:11:09 crc kubenswrapper[4895]: I1202 09:11:09.029053 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6cbfb99f9-67c58" Dec 02 09:11:09 crc kubenswrapper[4895]: I1202 09:11:09.127725 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/927f1ae3-c12f-42d2-9750-8d102628c18b-config\") pod \"927f1ae3-c12f-42d2-9750-8d102628c18b\" (UID: \"927f1ae3-c12f-42d2-9750-8d102628c18b\") " Dec 02 09:11:09 crc kubenswrapper[4895]: I1202 09:11:09.128751 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-djxz9\" (UniqueName: \"kubernetes.io/projected/927f1ae3-c12f-42d2-9750-8d102628c18b-kube-api-access-djxz9\") pod \"927f1ae3-c12f-42d2-9750-8d102628c18b\" (UID: \"927f1ae3-c12f-42d2-9750-8d102628c18b\") " Dec 02 09:11:09 crc kubenswrapper[4895]: I1202 09:11:09.128975 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/927f1ae3-c12f-42d2-9750-8d102628c18b-ovsdbserver-sb\") pod \"927f1ae3-c12f-42d2-9750-8d102628c18b\" (UID: \"927f1ae3-c12f-42d2-9750-8d102628c18b\") " Dec 02 09:11:09 crc kubenswrapper[4895]: I1202 09:11:09.129016 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/927f1ae3-c12f-42d2-9750-8d102628c18b-dns-svc\") pod \"927f1ae3-c12f-42d2-9750-8d102628c18b\" (UID: \"927f1ae3-c12f-42d2-9750-8d102628c18b\") " Dec 02 09:11:09 crc kubenswrapper[4895]: I1202 09:11:09.129070 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/927f1ae3-c12f-42d2-9750-8d102628c18b-openstack-cell1\") pod \"927f1ae3-c12f-42d2-9750-8d102628c18b\" (UID: \"927f1ae3-c12f-42d2-9750-8d102628c18b\") " Dec 02 09:11:09 crc kubenswrapper[4895]: I1202 09:11:09.129111 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/927f1ae3-c12f-42d2-9750-8d102628c18b-ovsdbserver-nb\") pod \"927f1ae3-c12f-42d2-9750-8d102628c18b\" (UID: \"927f1ae3-c12f-42d2-9750-8d102628c18b\") " Dec 02 09:11:09 crc kubenswrapper[4895]: I1202 09:11:09.129917 4895 generic.go:334] "Generic (PLEG): container finished" podID="927f1ae3-c12f-42d2-9750-8d102628c18b" containerID="2006369ee322824dbdd417083af119d371e2c143dda561a35a11dee7823dce05" exitCode=0 Dec 02 09:11:09 crc kubenswrapper[4895]: I1202 09:11:09.129966 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cbfb99f9-67c58" event={"ID":"927f1ae3-c12f-42d2-9750-8d102628c18b","Type":"ContainerDied","Data":"2006369ee322824dbdd417083af119d371e2c143dda561a35a11dee7823dce05"} Dec 02 09:11:09 crc kubenswrapper[4895]: I1202 09:11:09.130000 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cbfb99f9-67c58" event={"ID":"927f1ae3-c12f-42d2-9750-8d102628c18b","Type":"ContainerDied","Data":"3a967ef8424ca5fcd15bf0147320afd4fe5cce0dc07879ffcd517940aa4da5b7"} Dec 02 09:11:09 crc kubenswrapper[4895]: I1202 09:11:09.130022 4895 scope.go:117] "RemoveContainer" containerID="2006369ee322824dbdd417083af119d371e2c143dda561a35a11dee7823dce05" Dec 02 09:11:09 crc kubenswrapper[4895]: I1202 09:11:09.130161 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6cbfb99f9-67c58" Dec 02 09:11:09 crc kubenswrapper[4895]: I1202 09:11:09.158900 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/927f1ae3-c12f-42d2-9750-8d102628c18b-kube-api-access-djxz9" (OuterVolumeSpecName: "kube-api-access-djxz9") pod "927f1ae3-c12f-42d2-9750-8d102628c18b" (UID: "927f1ae3-c12f-42d2-9750-8d102628c18b"). InnerVolumeSpecName "kube-api-access-djxz9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:11:09 crc kubenswrapper[4895]: I1202 09:11:09.224115 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/927f1ae3-c12f-42d2-9750-8d102628c18b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "927f1ae3-c12f-42d2-9750-8d102628c18b" (UID: "927f1ae3-c12f-42d2-9750-8d102628c18b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:11:09 crc kubenswrapper[4895]: I1202 09:11:09.224184 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/927f1ae3-c12f-42d2-9750-8d102628c18b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "927f1ae3-c12f-42d2-9750-8d102628c18b" (UID: "927f1ae3-c12f-42d2-9750-8d102628c18b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:11:09 crc kubenswrapper[4895]: I1202 09:11:09.227658 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/927f1ae3-c12f-42d2-9750-8d102628c18b-config" (OuterVolumeSpecName: "config") pod "927f1ae3-c12f-42d2-9750-8d102628c18b" (UID: "927f1ae3-c12f-42d2-9750-8d102628c18b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:11:09 crc kubenswrapper[4895]: I1202 09:11:09.230343 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/927f1ae3-c12f-42d2-9750-8d102628c18b-openstack-cell1" (OuterVolumeSpecName: "openstack-cell1") pod "927f1ae3-c12f-42d2-9750-8d102628c18b" (UID: "927f1ae3-c12f-42d2-9750-8d102628c18b"). InnerVolumeSpecName "openstack-cell1". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:11:09 crc kubenswrapper[4895]: I1202 09:11:09.230557 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/927f1ae3-c12f-42d2-9750-8d102628c18b-openstack-cell1\") pod \"927f1ae3-c12f-42d2-9750-8d102628c18b\" (UID: \"927f1ae3-c12f-42d2-9750-8d102628c18b\") " Dec 02 09:11:09 crc kubenswrapper[4895]: W1202 09:11:09.230643 4895 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/927f1ae3-c12f-42d2-9750-8d102628c18b/volumes/kubernetes.io~configmap/openstack-cell1 Dec 02 09:11:09 crc kubenswrapper[4895]: I1202 09:11:09.230664 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/927f1ae3-c12f-42d2-9750-8d102628c18b-openstack-cell1" (OuterVolumeSpecName: "openstack-cell1") pod "927f1ae3-c12f-42d2-9750-8d102628c18b" (UID: "927f1ae3-c12f-42d2-9750-8d102628c18b"). InnerVolumeSpecName "openstack-cell1". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:11:09 crc kubenswrapper[4895]: I1202 09:11:09.231133 4895 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/927f1ae3-c12f-42d2-9750-8d102628c18b-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 02 09:11:09 crc kubenswrapper[4895]: I1202 09:11:09.231154 4895 reconciler_common.go:293] "Volume detached for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/927f1ae3-c12f-42d2-9750-8d102628c18b-openstack-cell1\") on node \"crc\" DevicePath \"\"" Dec 02 09:11:09 crc kubenswrapper[4895]: I1202 09:11:09.231163 4895 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/927f1ae3-c12f-42d2-9750-8d102628c18b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 02 09:11:09 crc kubenswrapper[4895]: I1202 09:11:09.231172 4895 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/927f1ae3-c12f-42d2-9750-8d102628c18b-config\") on node \"crc\" DevicePath \"\"" Dec 02 09:11:09 crc kubenswrapper[4895]: I1202 09:11:09.231181 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-djxz9\" (UniqueName: \"kubernetes.io/projected/927f1ae3-c12f-42d2-9750-8d102628c18b-kube-api-access-djxz9\") on node \"crc\" DevicePath \"\"" Dec 02 09:11:09 crc kubenswrapper[4895]: I1202 09:11:09.231357 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/927f1ae3-c12f-42d2-9750-8d102628c18b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "927f1ae3-c12f-42d2-9750-8d102628c18b" (UID: "927f1ae3-c12f-42d2-9750-8d102628c18b"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:11:09 crc kubenswrapper[4895]: I1202 09:11:09.320093 4895 scope.go:117] "RemoveContainer" containerID="390fa6dc4f7436e0069f370daa2847374a270889b2b98c95719e0ebf4cd881fe" Dec 02 09:11:09 crc kubenswrapper[4895]: I1202 09:11:09.334166 4895 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/927f1ae3-c12f-42d2-9750-8d102628c18b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 02 09:11:09 crc kubenswrapper[4895]: I1202 09:11:09.362390 4895 scope.go:117] "RemoveContainer" containerID="2006369ee322824dbdd417083af119d371e2c143dda561a35a11dee7823dce05" Dec 02 09:11:09 crc kubenswrapper[4895]: E1202 09:11:09.363580 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2006369ee322824dbdd417083af119d371e2c143dda561a35a11dee7823dce05\": container with ID starting with 2006369ee322824dbdd417083af119d371e2c143dda561a35a11dee7823dce05 not found: ID does not exist" containerID="2006369ee322824dbdd417083af119d371e2c143dda561a35a11dee7823dce05" Dec 02 09:11:09 crc kubenswrapper[4895]: I1202 09:11:09.363616 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2006369ee322824dbdd417083af119d371e2c143dda561a35a11dee7823dce05"} err="failed to get container status \"2006369ee322824dbdd417083af119d371e2c143dda561a35a11dee7823dce05\": rpc error: code = NotFound desc = could not find container \"2006369ee322824dbdd417083af119d371e2c143dda561a35a11dee7823dce05\": container with ID starting with 2006369ee322824dbdd417083af119d371e2c143dda561a35a11dee7823dce05 not found: ID does not exist" Dec 02 09:11:09 crc kubenswrapper[4895]: I1202 09:11:09.363639 4895 scope.go:117] "RemoveContainer" containerID="390fa6dc4f7436e0069f370daa2847374a270889b2b98c95719e0ebf4cd881fe" Dec 02 09:11:09 crc kubenswrapper[4895]: E1202 09:11:09.363855 4895 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"390fa6dc4f7436e0069f370daa2847374a270889b2b98c95719e0ebf4cd881fe\": container with ID starting with 390fa6dc4f7436e0069f370daa2847374a270889b2b98c95719e0ebf4cd881fe not found: ID does not exist" containerID="390fa6dc4f7436e0069f370daa2847374a270889b2b98c95719e0ebf4cd881fe" Dec 02 09:11:09 crc kubenswrapper[4895]: I1202 09:11:09.363877 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"390fa6dc4f7436e0069f370daa2847374a270889b2b98c95719e0ebf4cd881fe"} err="failed to get container status \"390fa6dc4f7436e0069f370daa2847374a270889b2b98c95719e0ebf4cd881fe\": rpc error: code = NotFound desc = could not find container \"390fa6dc4f7436e0069f370daa2847374a270889b2b98c95719e0ebf4cd881fe\": container with ID starting with 390fa6dc4f7436e0069f370daa2847374a270889b2b98c95719e0ebf4cd881fe not found: ID does not exist" Dec 02 09:11:09 crc kubenswrapper[4895]: I1202 09:11:09.483039 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6cbfb99f9-67c58"] Dec 02 09:11:09 crc kubenswrapper[4895]: I1202 09:11:09.493250 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6cbfb99f9-67c58"] Dec 02 09:11:11 crc kubenswrapper[4895]: I1202 09:11:11.141361 4895 scope.go:117] "RemoveContainer" containerID="d41b8f042da7f202a665565fffb111027944b6b1623344c1475552b27b519444" Dec 02 09:11:11 crc kubenswrapper[4895]: E1202 09:11:11.142149 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:11:11 crc 
kubenswrapper[4895]: I1202 09:11:11.153478 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="927f1ae3-c12f-42d2-9750-8d102628c18b" path="/var/lib/kubelet/pods/927f1ae3-c12f-42d2-9750-8d102628c18b/volumes" Dec 02 09:11:19 crc kubenswrapper[4895]: I1202 09:11:19.449073 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c9rwhk"] Dec 02 09:11:19 crc kubenswrapper[4895]: E1202 09:11:19.451062 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31be70b7-4a75-4a4b-b0bc-51fdaedaf8c2" containerName="init" Dec 02 09:11:19 crc kubenswrapper[4895]: I1202 09:11:19.451084 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="31be70b7-4a75-4a4b-b0bc-51fdaedaf8c2" containerName="init" Dec 02 09:11:19 crc kubenswrapper[4895]: E1202 09:11:19.451103 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="927f1ae3-c12f-42d2-9750-8d102628c18b" containerName="init" Dec 02 09:11:19 crc kubenswrapper[4895]: I1202 09:11:19.451110 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="927f1ae3-c12f-42d2-9750-8d102628c18b" containerName="init" Dec 02 09:11:19 crc kubenswrapper[4895]: E1202 09:11:19.451126 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="927f1ae3-c12f-42d2-9750-8d102628c18b" containerName="dnsmasq-dns" Dec 02 09:11:19 crc kubenswrapper[4895]: I1202 09:11:19.451135 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="927f1ae3-c12f-42d2-9750-8d102628c18b" containerName="dnsmasq-dns" Dec 02 09:11:19 crc kubenswrapper[4895]: E1202 09:11:19.451155 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31be70b7-4a75-4a4b-b0bc-51fdaedaf8c2" containerName="dnsmasq-dns" Dec 02 09:11:19 crc kubenswrapper[4895]: I1202 09:11:19.451164 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="31be70b7-4a75-4a4b-b0bc-51fdaedaf8c2" containerName="dnsmasq-dns" Dec 02 09:11:19 crc kubenswrapper[4895]: 
I1202 09:11:19.451467 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="31be70b7-4a75-4a4b-b0bc-51fdaedaf8c2" containerName="dnsmasq-dns" Dec 02 09:11:19 crc kubenswrapper[4895]: I1202 09:11:19.451496 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="927f1ae3-c12f-42d2-9750-8d102628c18b" containerName="dnsmasq-dns" Dec 02 09:11:19 crc kubenswrapper[4895]: I1202 09:11:19.452724 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c9rwhk" Dec 02 09:11:19 crc kubenswrapper[4895]: I1202 09:11:19.455638 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-brvc6" Dec 02 09:11:19 crc kubenswrapper[4895]: I1202 09:11:19.455816 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 02 09:11:19 crc kubenswrapper[4895]: I1202 09:11:19.456146 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 02 09:11:19 crc kubenswrapper[4895]: I1202 09:11:19.457390 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 02 09:11:19 crc kubenswrapper[4895]: I1202 09:11:19.464807 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c9rwhk"] Dec 02 09:11:19 crc kubenswrapper[4895]: I1202 09:11:19.566040 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pl2cx\" (UniqueName: \"kubernetes.io/projected/e67687a6-5862-4747-ae07-1bd20e752c11-kube-api-access-pl2cx\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c9rwhk\" (UID: \"e67687a6-5862-4747-ae07-1bd20e752c11\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c9rwhk" Dec 02 09:11:19 crc 
kubenswrapper[4895]: I1202 09:11:19.566136 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e67687a6-5862-4747-ae07-1bd20e752c11-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c9rwhk\" (UID: \"e67687a6-5862-4747-ae07-1bd20e752c11\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c9rwhk" Dec 02 09:11:19 crc kubenswrapper[4895]: I1202 09:11:19.566695 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e67687a6-5862-4747-ae07-1bd20e752c11-ssh-key\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c9rwhk\" (UID: \"e67687a6-5862-4747-ae07-1bd20e752c11\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c9rwhk" Dec 02 09:11:19 crc kubenswrapper[4895]: I1202 09:11:19.567036 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e67687a6-5862-4747-ae07-1bd20e752c11-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c9rwhk\" (UID: \"e67687a6-5862-4747-ae07-1bd20e752c11\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c9rwhk" Dec 02 09:11:19 crc kubenswrapper[4895]: I1202 09:11:19.567143 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e67687a6-5862-4747-ae07-1bd20e752c11-ceph\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c9rwhk\" (UID: \"e67687a6-5862-4747-ae07-1bd20e752c11\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c9rwhk" Dec 02 09:11:19 crc kubenswrapper[4895]: I1202 09:11:19.668704 4895 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e67687a6-5862-4747-ae07-1bd20e752c11-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c9rwhk\" (UID: \"e67687a6-5862-4747-ae07-1bd20e752c11\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c9rwhk" Dec 02 09:11:19 crc kubenswrapper[4895]: I1202 09:11:19.669195 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e67687a6-5862-4747-ae07-1bd20e752c11-ceph\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c9rwhk\" (UID: \"e67687a6-5862-4747-ae07-1bd20e752c11\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c9rwhk" Dec 02 09:11:19 crc kubenswrapper[4895]: I1202 09:11:19.669260 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pl2cx\" (UniqueName: \"kubernetes.io/projected/e67687a6-5862-4747-ae07-1bd20e752c11-kube-api-access-pl2cx\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c9rwhk\" (UID: \"e67687a6-5862-4747-ae07-1bd20e752c11\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c9rwhk" Dec 02 09:11:19 crc kubenswrapper[4895]: I1202 09:11:19.669297 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e67687a6-5862-4747-ae07-1bd20e752c11-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c9rwhk\" (UID: \"e67687a6-5862-4747-ae07-1bd20e752c11\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c9rwhk" Dec 02 09:11:19 crc kubenswrapper[4895]: I1202 09:11:19.669453 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/e67687a6-5862-4747-ae07-1bd20e752c11-ssh-key\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c9rwhk\" (UID: \"e67687a6-5862-4747-ae07-1bd20e752c11\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c9rwhk" Dec 02 09:11:19 crc kubenswrapper[4895]: I1202 09:11:19.675354 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e67687a6-5862-4747-ae07-1bd20e752c11-ssh-key\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c9rwhk\" (UID: \"e67687a6-5862-4747-ae07-1bd20e752c11\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c9rwhk" Dec 02 09:11:19 crc kubenswrapper[4895]: I1202 09:11:19.675355 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e67687a6-5862-4747-ae07-1bd20e752c11-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c9rwhk\" (UID: \"e67687a6-5862-4747-ae07-1bd20e752c11\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c9rwhk" Dec 02 09:11:19 crc kubenswrapper[4895]: I1202 09:11:19.675839 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e67687a6-5862-4747-ae07-1bd20e752c11-ceph\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c9rwhk\" (UID: \"e67687a6-5862-4747-ae07-1bd20e752c11\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c9rwhk" Dec 02 09:11:19 crc kubenswrapper[4895]: I1202 09:11:19.678495 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e67687a6-5862-4747-ae07-1bd20e752c11-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c9rwhk\" (UID: \"e67687a6-5862-4747-ae07-1bd20e752c11\") " 
pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c9rwhk" Dec 02 09:11:19 crc kubenswrapper[4895]: I1202 09:11:19.688146 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pl2cx\" (UniqueName: \"kubernetes.io/projected/e67687a6-5862-4747-ae07-1bd20e752c11-kube-api-access-pl2cx\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c9rwhk\" (UID: \"e67687a6-5862-4747-ae07-1bd20e752c11\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c9rwhk" Dec 02 09:11:19 crc kubenswrapper[4895]: I1202 09:11:19.781201 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c9rwhk" Dec 02 09:11:20 crc kubenswrapper[4895]: I1202 09:11:20.328194 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c9rwhk"] Dec 02 09:11:20 crc kubenswrapper[4895]: W1202 09:11:20.329478 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode67687a6_5862_4747_ae07_1bd20e752c11.slice/crio-619abffb5b8dcf78fed81b3298a2bba4af27e84ffd273c0e0d3a02b090d4ec00 WatchSource:0}: Error finding container 619abffb5b8dcf78fed81b3298a2bba4af27e84ffd273c0e0d3a02b090d4ec00: Status 404 returned error can't find the container with id 619abffb5b8dcf78fed81b3298a2bba4af27e84ffd273c0e0d3a02b090d4ec00 Dec 02 09:11:21 crc kubenswrapper[4895]: I1202 09:11:21.277395 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c9rwhk" event={"ID":"e67687a6-5862-4747-ae07-1bd20e752c11","Type":"ContainerStarted","Data":"619abffb5b8dcf78fed81b3298a2bba4af27e84ffd273c0e0d3a02b090d4ec00"} Dec 02 09:11:25 crc kubenswrapper[4895]: I1202 09:11:25.141845 4895 scope.go:117] "RemoveContainer" 
containerID="d41b8f042da7f202a665565fffb111027944b6b1623344c1475552b27b519444" Dec 02 09:11:25 crc kubenswrapper[4895]: E1202 09:11:25.142879 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:11:30 crc kubenswrapper[4895]: I1202 09:11:30.053185 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-db-create-8kqxg"] Dec 02 09:11:30 crc kubenswrapper[4895]: I1202 09:11:30.063881 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-db-create-8kqxg"] Dec 02 09:11:31 crc kubenswrapper[4895]: I1202 09:11:31.035866 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-9e34-account-create-update-5c4v7"] Dec 02 09:11:31 crc kubenswrapper[4895]: I1202 09:11:31.048550 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-9e34-account-create-update-5c4v7"] Dec 02 09:11:31 crc kubenswrapper[4895]: I1202 09:11:31.156572 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b86914c-f3af-4d62-b57a-9b94de461aea" path="/var/lib/kubelet/pods/3b86914c-f3af-4d62-b57a-9b94de461aea/volumes" Dec 02 09:11:31 crc kubenswrapper[4895]: I1202 09:11:31.158273 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4774a86-a00e-4910-91eb-34fb42e710ef" path="/var/lib/kubelet/pods/a4774a86-a00e-4910-91eb-34fb42e710ef/volumes" Dec 02 09:11:33 crc kubenswrapper[4895]: I1202 09:11:33.431531 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c9rwhk" 
event={"ID":"e67687a6-5862-4747-ae07-1bd20e752c11","Type":"ContainerStarted","Data":"11552d26281d9ee79eb14cfc5f207e937d2933c8309679b70632180742bf0be2"} Dec 02 09:11:33 crc kubenswrapper[4895]: I1202 09:11:33.457963 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c9rwhk" podStartSLOduration=2.529779603 podStartE2EDuration="14.457944617s" podCreationTimestamp="2025-12-02 09:11:19 +0000 UTC" firstStartedPulling="2025-12-02 09:11:20.331787957 +0000 UTC m=+6491.502647570" lastFinishedPulling="2025-12-02 09:11:32.259952971 +0000 UTC m=+6503.430812584" observedRunningTime="2025-12-02 09:11:33.449430762 +0000 UTC m=+6504.620290405" watchObservedRunningTime="2025-12-02 09:11:33.457944617 +0000 UTC m=+6504.628804230" Dec 02 09:11:37 crc kubenswrapper[4895]: I1202 09:11:37.739363 4895 scope.go:117] "RemoveContainer" containerID="b0ed2dbede21c00b312cadb52185016964e796c0330238ad8137e24d9c5f7df2" Dec 02 09:11:37 crc kubenswrapper[4895]: I1202 09:11:37.767389 4895 scope.go:117] "RemoveContainer" containerID="285b94e07574b2f07f23ebbcd8f4389c2d205eb23d9b7f75fb2e44410260b53a" Dec 02 09:11:38 crc kubenswrapper[4895]: I1202 09:11:38.047958 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-persistence-db-create-jwddv"] Dec 02 09:11:38 crc kubenswrapper[4895]: I1202 09:11:38.058842 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-persistence-db-create-jwddv"] Dec 02 09:11:39 crc kubenswrapper[4895]: I1202 09:11:39.030920 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-c481-account-create-update-dgk2p"] Dec 02 09:11:39 crc kubenswrapper[4895]: I1202 09:11:39.040786 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-c481-account-create-update-dgk2p"] Dec 02 09:11:39 crc kubenswrapper[4895]: I1202 09:11:39.152903 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="2bd2ca4e-e47c-49aa-827f-4ecf5760939e" path="/var/lib/kubelet/pods/2bd2ca4e-e47c-49aa-827f-4ecf5760939e/volumes" Dec 02 09:11:39 crc kubenswrapper[4895]: I1202 09:11:39.154227 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0441a95-f89f-481e-834d-f508174764ae" path="/var/lib/kubelet/pods/c0441a95-f89f-481e-834d-f508174764ae/volumes" Dec 02 09:11:40 crc kubenswrapper[4895]: I1202 09:11:40.141129 4895 scope.go:117] "RemoveContainer" containerID="d41b8f042da7f202a665565fffb111027944b6b1623344c1475552b27b519444" Dec 02 09:11:40 crc kubenswrapper[4895]: E1202 09:11:40.141416 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:11:42 crc kubenswrapper[4895]: I1202 09:11:42.199383 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-td5jd"] Dec 02 09:11:42 crc kubenswrapper[4895]: I1202 09:11:42.203842 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-td5jd" Dec 02 09:11:42 crc kubenswrapper[4895]: I1202 09:11:42.219638 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-td5jd"] Dec 02 09:11:42 crc kubenswrapper[4895]: I1202 09:11:42.283043 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/14ce8a9c-33c5-4915-9c12-1025afec5b8e-utilities\") pod \"redhat-operators-td5jd\" (UID: \"14ce8a9c-33c5-4915-9c12-1025afec5b8e\") " pod="openshift-marketplace/redhat-operators-td5jd" Dec 02 09:11:42 crc kubenswrapper[4895]: I1202 09:11:42.283111 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rcdnm\" (UniqueName: \"kubernetes.io/projected/14ce8a9c-33c5-4915-9c12-1025afec5b8e-kube-api-access-rcdnm\") pod \"redhat-operators-td5jd\" (UID: \"14ce8a9c-33c5-4915-9c12-1025afec5b8e\") " pod="openshift-marketplace/redhat-operators-td5jd" Dec 02 09:11:42 crc kubenswrapper[4895]: I1202 09:11:42.283155 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/14ce8a9c-33c5-4915-9c12-1025afec5b8e-catalog-content\") pod \"redhat-operators-td5jd\" (UID: \"14ce8a9c-33c5-4915-9c12-1025afec5b8e\") " pod="openshift-marketplace/redhat-operators-td5jd" Dec 02 09:11:42 crc kubenswrapper[4895]: I1202 09:11:42.385042 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/14ce8a9c-33c5-4915-9c12-1025afec5b8e-utilities\") pod \"redhat-operators-td5jd\" (UID: \"14ce8a9c-33c5-4915-9c12-1025afec5b8e\") " pod="openshift-marketplace/redhat-operators-td5jd" Dec 02 09:11:42 crc kubenswrapper[4895]: I1202 09:11:42.385091 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-rcdnm\" (UniqueName: \"kubernetes.io/projected/14ce8a9c-33c5-4915-9c12-1025afec5b8e-kube-api-access-rcdnm\") pod \"redhat-operators-td5jd\" (UID: \"14ce8a9c-33c5-4915-9c12-1025afec5b8e\") " pod="openshift-marketplace/redhat-operators-td5jd" Dec 02 09:11:42 crc kubenswrapper[4895]: I1202 09:11:42.385124 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/14ce8a9c-33c5-4915-9c12-1025afec5b8e-catalog-content\") pod \"redhat-operators-td5jd\" (UID: \"14ce8a9c-33c5-4915-9c12-1025afec5b8e\") " pod="openshift-marketplace/redhat-operators-td5jd" Dec 02 09:11:42 crc kubenswrapper[4895]: I1202 09:11:42.385721 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/14ce8a9c-33c5-4915-9c12-1025afec5b8e-catalog-content\") pod \"redhat-operators-td5jd\" (UID: \"14ce8a9c-33c5-4915-9c12-1025afec5b8e\") " pod="openshift-marketplace/redhat-operators-td5jd" Dec 02 09:11:42 crc kubenswrapper[4895]: I1202 09:11:42.386027 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/14ce8a9c-33c5-4915-9c12-1025afec5b8e-utilities\") pod \"redhat-operators-td5jd\" (UID: \"14ce8a9c-33c5-4915-9c12-1025afec5b8e\") " pod="openshift-marketplace/redhat-operators-td5jd" Dec 02 09:11:42 crc kubenswrapper[4895]: I1202 09:11:42.405489 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rcdnm\" (UniqueName: \"kubernetes.io/projected/14ce8a9c-33c5-4915-9c12-1025afec5b8e-kube-api-access-rcdnm\") pod \"redhat-operators-td5jd\" (UID: \"14ce8a9c-33c5-4915-9c12-1025afec5b8e\") " pod="openshift-marketplace/redhat-operators-td5jd" Dec 02 09:11:42 crc kubenswrapper[4895]: I1202 09:11:42.540851 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-td5jd" Dec 02 09:11:43 crc kubenswrapper[4895]: I1202 09:11:43.023515 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-td5jd"] Dec 02 09:11:43 crc kubenswrapper[4895]: I1202 09:11:43.553114 4895 generic.go:334] "Generic (PLEG): container finished" podID="14ce8a9c-33c5-4915-9c12-1025afec5b8e" containerID="f6ff794ba497ccc577ae9e417d7e72823a612639f8b1431af762f456398437b9" exitCode=0 Dec 02 09:11:43 crc kubenswrapper[4895]: I1202 09:11:43.553244 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-td5jd" event={"ID":"14ce8a9c-33c5-4915-9c12-1025afec5b8e","Type":"ContainerDied","Data":"f6ff794ba497ccc577ae9e417d7e72823a612639f8b1431af762f456398437b9"} Dec 02 09:11:43 crc kubenswrapper[4895]: I1202 09:11:43.553516 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-td5jd" event={"ID":"14ce8a9c-33c5-4915-9c12-1025afec5b8e","Type":"ContainerStarted","Data":"71b91767c654fcc137478af6e38f21d6a0c90b8b27b4bfd1dfbdef2587c6d755"} Dec 02 09:11:43 crc kubenswrapper[4895]: I1202 09:11:43.992472 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-qf4z2"] Dec 02 09:11:43 crc kubenswrapper[4895]: I1202 09:11:43.995508 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qf4z2" Dec 02 09:11:44 crc kubenswrapper[4895]: I1202 09:11:44.006174 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qf4z2"] Dec 02 09:11:44 crc kubenswrapper[4895]: I1202 09:11:44.028435 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1a8ea5f4-253e-4009-a9b5-b92eea450bb7-catalog-content\") pod \"community-operators-qf4z2\" (UID: \"1a8ea5f4-253e-4009-a9b5-b92eea450bb7\") " pod="openshift-marketplace/community-operators-qf4z2" Dec 02 09:11:44 crc kubenswrapper[4895]: I1202 09:11:44.028536 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1a8ea5f4-253e-4009-a9b5-b92eea450bb7-utilities\") pod \"community-operators-qf4z2\" (UID: \"1a8ea5f4-253e-4009-a9b5-b92eea450bb7\") " pod="openshift-marketplace/community-operators-qf4z2" Dec 02 09:11:44 crc kubenswrapper[4895]: I1202 09:11:44.028569 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tn86g\" (UniqueName: \"kubernetes.io/projected/1a8ea5f4-253e-4009-a9b5-b92eea450bb7-kube-api-access-tn86g\") pod \"community-operators-qf4z2\" (UID: \"1a8ea5f4-253e-4009-a9b5-b92eea450bb7\") " pod="openshift-marketplace/community-operators-qf4z2" Dec 02 09:11:44 crc kubenswrapper[4895]: I1202 09:11:44.130331 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1a8ea5f4-253e-4009-a9b5-b92eea450bb7-utilities\") pod \"community-operators-qf4z2\" (UID: \"1a8ea5f4-253e-4009-a9b5-b92eea450bb7\") " pod="openshift-marketplace/community-operators-qf4z2" Dec 02 09:11:44 crc kubenswrapper[4895]: I1202 09:11:44.130383 4895 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-tn86g\" (UniqueName: \"kubernetes.io/projected/1a8ea5f4-253e-4009-a9b5-b92eea450bb7-kube-api-access-tn86g\") pod \"community-operators-qf4z2\" (UID: \"1a8ea5f4-253e-4009-a9b5-b92eea450bb7\") " pod="openshift-marketplace/community-operators-qf4z2" Dec 02 09:11:44 crc kubenswrapper[4895]: I1202 09:11:44.130593 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1a8ea5f4-253e-4009-a9b5-b92eea450bb7-catalog-content\") pod \"community-operators-qf4z2\" (UID: \"1a8ea5f4-253e-4009-a9b5-b92eea450bb7\") " pod="openshift-marketplace/community-operators-qf4z2" Dec 02 09:11:44 crc kubenswrapper[4895]: I1202 09:11:44.130951 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1a8ea5f4-253e-4009-a9b5-b92eea450bb7-utilities\") pod \"community-operators-qf4z2\" (UID: \"1a8ea5f4-253e-4009-a9b5-b92eea450bb7\") " pod="openshift-marketplace/community-operators-qf4z2" Dec 02 09:11:44 crc kubenswrapper[4895]: I1202 09:11:44.131138 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1a8ea5f4-253e-4009-a9b5-b92eea450bb7-catalog-content\") pod \"community-operators-qf4z2\" (UID: \"1a8ea5f4-253e-4009-a9b5-b92eea450bb7\") " pod="openshift-marketplace/community-operators-qf4z2" Dec 02 09:11:44 crc kubenswrapper[4895]: I1202 09:11:44.152099 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tn86g\" (UniqueName: \"kubernetes.io/projected/1a8ea5f4-253e-4009-a9b5-b92eea450bb7-kube-api-access-tn86g\") pod \"community-operators-qf4z2\" (UID: \"1a8ea5f4-253e-4009-a9b5-b92eea450bb7\") " pod="openshift-marketplace/community-operators-qf4z2" Dec 02 09:11:44 crc kubenswrapper[4895]: I1202 09:11:44.333203 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qf4z2" Dec 02 09:11:44 crc kubenswrapper[4895]: I1202 09:11:44.887097 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qf4z2"] Dec 02 09:11:45 crc kubenswrapper[4895]: I1202 09:11:45.578269 4895 generic.go:334] "Generic (PLEG): container finished" podID="e67687a6-5862-4747-ae07-1bd20e752c11" containerID="11552d26281d9ee79eb14cfc5f207e937d2933c8309679b70632180742bf0be2" exitCode=0 Dec 02 09:11:45 crc kubenswrapper[4895]: I1202 09:11:45.578704 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c9rwhk" event={"ID":"e67687a6-5862-4747-ae07-1bd20e752c11","Type":"ContainerDied","Data":"11552d26281d9ee79eb14cfc5f207e937d2933c8309679b70632180742bf0be2"} Dec 02 09:11:45 crc kubenswrapper[4895]: I1202 09:11:45.583041 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-td5jd" event={"ID":"14ce8a9c-33c5-4915-9c12-1025afec5b8e","Type":"ContainerStarted","Data":"33cf30f0a578966587f4403bd0d9b623099a85e542a5f83fa00be046e3502bf9"} Dec 02 09:11:45 crc kubenswrapper[4895]: I1202 09:11:45.585258 4895 generic.go:334] "Generic (PLEG): container finished" podID="1a8ea5f4-253e-4009-a9b5-b92eea450bb7" containerID="6b6622d92e5072b62f12c15f66aa8718c120bff55bfb61b0ba604a697d06700c" exitCode=0 Dec 02 09:11:45 crc kubenswrapper[4895]: I1202 09:11:45.585299 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qf4z2" event={"ID":"1a8ea5f4-253e-4009-a9b5-b92eea450bb7","Type":"ContainerDied","Data":"6b6622d92e5072b62f12c15f66aa8718c120bff55bfb61b0ba604a697d06700c"} Dec 02 09:11:45 crc kubenswrapper[4895]: I1202 09:11:45.585323 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qf4z2" 
event={"ID":"1a8ea5f4-253e-4009-a9b5-b92eea450bb7","Type":"ContainerStarted","Data":"936f6a86b32822b26913b3924dc355d78940eaa67b9c98bb2152e1500fd11d58"} Dec 02 09:11:47 crc kubenswrapper[4895]: I1202 09:11:47.053428 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c9rwhk" Dec 02 09:11:47 crc kubenswrapper[4895]: I1202 09:11:47.198998 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e67687a6-5862-4747-ae07-1bd20e752c11-ssh-key\") pod \"e67687a6-5862-4747-ae07-1bd20e752c11\" (UID: \"e67687a6-5862-4747-ae07-1bd20e752c11\") " Dec 02 09:11:47 crc kubenswrapper[4895]: I1202 09:11:47.199132 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pl2cx\" (UniqueName: \"kubernetes.io/projected/e67687a6-5862-4747-ae07-1bd20e752c11-kube-api-access-pl2cx\") pod \"e67687a6-5862-4747-ae07-1bd20e752c11\" (UID: \"e67687a6-5862-4747-ae07-1bd20e752c11\") " Dec 02 09:11:47 crc kubenswrapper[4895]: I1202 09:11:47.199231 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e67687a6-5862-4747-ae07-1bd20e752c11-inventory\") pod \"e67687a6-5862-4747-ae07-1bd20e752c11\" (UID: \"e67687a6-5862-4747-ae07-1bd20e752c11\") " Dec 02 09:11:47 crc kubenswrapper[4895]: I1202 09:11:47.199274 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e67687a6-5862-4747-ae07-1bd20e752c11-pre-adoption-validation-combined-ca-bundle\") pod \"e67687a6-5862-4747-ae07-1bd20e752c11\" (UID: \"e67687a6-5862-4747-ae07-1bd20e752c11\") " Dec 02 09:11:47 crc kubenswrapper[4895]: I1202 09:11:47.199351 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/secret/e67687a6-5862-4747-ae07-1bd20e752c11-ceph\") pod \"e67687a6-5862-4747-ae07-1bd20e752c11\" (UID: \"e67687a6-5862-4747-ae07-1bd20e752c11\") " Dec 02 09:11:47 crc kubenswrapper[4895]: I1202 09:11:47.278585 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e67687a6-5862-4747-ae07-1bd20e752c11-kube-api-access-pl2cx" (OuterVolumeSpecName: "kube-api-access-pl2cx") pod "e67687a6-5862-4747-ae07-1bd20e752c11" (UID: "e67687a6-5862-4747-ae07-1bd20e752c11"). InnerVolumeSpecName "kube-api-access-pl2cx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:11:47 crc kubenswrapper[4895]: I1202 09:11:47.281447 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e67687a6-5862-4747-ae07-1bd20e752c11-ceph" (OuterVolumeSpecName: "ceph") pod "e67687a6-5862-4747-ae07-1bd20e752c11" (UID: "e67687a6-5862-4747-ae07-1bd20e752c11"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:11:47 crc kubenswrapper[4895]: I1202 09:11:47.282123 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e67687a6-5862-4747-ae07-1bd20e752c11-pre-adoption-validation-combined-ca-bundle" (OuterVolumeSpecName: "pre-adoption-validation-combined-ca-bundle") pod "e67687a6-5862-4747-ae07-1bd20e752c11" (UID: "e67687a6-5862-4747-ae07-1bd20e752c11"). InnerVolumeSpecName "pre-adoption-validation-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:11:47 crc kubenswrapper[4895]: I1202 09:11:47.313641 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pl2cx\" (UniqueName: \"kubernetes.io/projected/e67687a6-5862-4747-ae07-1bd20e752c11-kube-api-access-pl2cx\") on node \"crc\" DevicePath \"\"" Dec 02 09:11:47 crc kubenswrapper[4895]: I1202 09:11:47.313683 4895 reconciler_common.go:293] "Volume detached for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e67687a6-5862-4747-ae07-1bd20e752c11-pre-adoption-validation-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 09:11:47 crc kubenswrapper[4895]: I1202 09:11:47.313695 4895 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e67687a6-5862-4747-ae07-1bd20e752c11-ceph\") on node \"crc\" DevicePath \"\"" Dec 02 09:11:47 crc kubenswrapper[4895]: I1202 09:11:47.313643 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e67687a6-5862-4747-ae07-1bd20e752c11-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "e67687a6-5862-4747-ae07-1bd20e752c11" (UID: "e67687a6-5862-4747-ae07-1bd20e752c11"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:11:47 crc kubenswrapper[4895]: I1202 09:11:47.358547 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e67687a6-5862-4747-ae07-1bd20e752c11-inventory" (OuterVolumeSpecName: "inventory") pod "e67687a6-5862-4747-ae07-1bd20e752c11" (UID: "e67687a6-5862-4747-ae07-1bd20e752c11"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:11:47 crc kubenswrapper[4895]: I1202 09:11:47.415895 4895 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e67687a6-5862-4747-ae07-1bd20e752c11-inventory\") on node \"crc\" DevicePath \"\"" Dec 02 09:11:47 crc kubenswrapper[4895]: I1202 09:11:47.415938 4895 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e67687a6-5862-4747-ae07-1bd20e752c11-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 02 09:11:47 crc kubenswrapper[4895]: I1202 09:11:47.605199 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c9rwhk" event={"ID":"e67687a6-5862-4747-ae07-1bd20e752c11","Type":"ContainerDied","Data":"619abffb5b8dcf78fed81b3298a2bba4af27e84ffd273c0e0d3a02b090d4ec00"} Dec 02 09:11:47 crc kubenswrapper[4895]: I1202 09:11:47.605273 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="619abffb5b8dcf78fed81b3298a2bba4af27e84ffd273c0e0d3a02b090d4ec00" Dec 02 09:11:47 crc kubenswrapper[4895]: I1202 09:11:47.605278 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c9rwhk" Dec 02 09:11:48 crc kubenswrapper[4895]: I1202 09:11:48.615383 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qf4z2" event={"ID":"1a8ea5f4-253e-4009-a9b5-b92eea450bb7","Type":"ContainerStarted","Data":"982ce4c3aeeb1494825f4b8a85929ed310f1c8636d289a3071b7b77b312ddc1e"} Dec 02 09:11:50 crc kubenswrapper[4895]: I1202 09:11:50.640997 4895 generic.go:334] "Generic (PLEG): container finished" podID="1a8ea5f4-253e-4009-a9b5-b92eea450bb7" containerID="982ce4c3aeeb1494825f4b8a85929ed310f1c8636d289a3071b7b77b312ddc1e" exitCode=0 Dec 02 09:11:50 crc kubenswrapper[4895]: I1202 09:11:50.641062 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qf4z2" event={"ID":"1a8ea5f4-253e-4009-a9b5-b92eea450bb7","Type":"ContainerDied","Data":"982ce4c3aeeb1494825f4b8a85929ed310f1c8636d289a3071b7b77b312ddc1e"} Dec 02 09:11:50 crc kubenswrapper[4895]: I1202 09:11:50.644909 4895 generic.go:334] "Generic (PLEG): container finished" podID="14ce8a9c-33c5-4915-9c12-1025afec5b8e" containerID="33cf30f0a578966587f4403bd0d9b623099a85e542a5f83fa00be046e3502bf9" exitCode=0 Dec 02 09:11:50 crc kubenswrapper[4895]: I1202 09:11:50.644992 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-td5jd" event={"ID":"14ce8a9c-33c5-4915-9c12-1025afec5b8e","Type":"ContainerDied","Data":"33cf30f0a578966587f4403bd0d9b623099a85e542a5f83fa00be046e3502bf9"} Dec 02 09:11:51 crc kubenswrapper[4895]: I1202 09:11:51.657463 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qf4z2" event={"ID":"1a8ea5f4-253e-4009-a9b5-b92eea450bb7","Type":"ContainerStarted","Data":"106fd03b34b86e5613aad63b0f37dae8d0e72e8e4957747be78f46a3d4473ed7"} Dec 02 09:11:51 crc kubenswrapper[4895]: I1202 09:11:51.684431 4895 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-qf4z2" podStartSLOduration=2.841738645 podStartE2EDuration="8.684410201s" podCreationTimestamp="2025-12-02 09:11:43 +0000 UTC" firstStartedPulling="2025-12-02 09:11:45.587408657 +0000 UTC m=+6516.758268270" lastFinishedPulling="2025-12-02 09:11:51.430080223 +0000 UTC m=+6522.600939826" observedRunningTime="2025-12-02 09:11:51.675414451 +0000 UTC m=+6522.846274074" watchObservedRunningTime="2025-12-02 09:11:51.684410201 +0000 UTC m=+6522.855269824" Dec 02 09:11:52 crc kubenswrapper[4895]: I1202 09:11:52.141145 4895 scope.go:117] "RemoveContainer" containerID="d41b8f042da7f202a665565fffb111027944b6b1623344c1475552b27b519444" Dec 02 09:11:52 crc kubenswrapper[4895]: E1202 09:11:52.141431 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:11:52 crc kubenswrapper[4895]: I1202 09:11:52.189802 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-j4z2k"] Dec 02 09:11:52 crc kubenswrapper[4895]: E1202 09:11:52.190348 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e67687a6-5862-4747-ae07-1bd20e752c11" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-cell1" Dec 02 09:11:52 crc kubenswrapper[4895]: I1202 09:11:52.190373 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="e67687a6-5862-4747-ae07-1bd20e752c11" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-cell1" Dec 02 09:11:52 crc kubenswrapper[4895]: I1202 09:11:52.190664 4895 
memory_manager.go:354] "RemoveStaleState removing state" podUID="e67687a6-5862-4747-ae07-1bd20e752c11" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-cell1" Dec 02 09:11:52 crc kubenswrapper[4895]: I1202 09:11:52.191615 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-j4z2k" Dec 02 09:11:52 crc kubenswrapper[4895]: I1202 09:11:52.194649 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-brvc6" Dec 02 09:11:52 crc kubenswrapper[4895]: I1202 09:11:52.194875 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 02 09:11:52 crc kubenswrapper[4895]: I1202 09:11:52.195041 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 02 09:11:52 crc kubenswrapper[4895]: I1202 09:11:52.195254 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 02 09:11:52 crc kubenswrapper[4895]: I1202 09:11:52.200613 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-j4z2k"] Dec 02 09:11:52 crc kubenswrapper[4895]: I1202 09:11:52.328281 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/fdefa681-0c42-4f77-81e9-19fc3ae7a940-ceph\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-j4z2k\" (UID: \"fdefa681-0c42-4f77-81e9-19fc3ae7a940\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-j4z2k" Dec 02 09:11:52 crc kubenswrapper[4895]: I1202 09:11:52.328433 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/fdefa681-0c42-4f77-81e9-19fc3ae7a940-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-j4z2k\" (UID: \"fdefa681-0c42-4f77-81e9-19fc3ae7a940\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-j4z2k" Dec 02 09:11:52 crc kubenswrapper[4895]: I1202 09:11:52.328508 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fdefa681-0c42-4f77-81e9-19fc3ae7a940-ssh-key\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-j4z2k\" (UID: \"fdefa681-0c42-4f77-81e9-19fc3ae7a940\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-j4z2k" Dec 02 09:11:52 crc kubenswrapper[4895]: I1202 09:11:52.328571 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdn4g\" (UniqueName: \"kubernetes.io/projected/fdefa681-0c42-4f77-81e9-19fc3ae7a940-kube-api-access-cdn4g\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-j4z2k\" (UID: \"fdefa681-0c42-4f77-81e9-19fc3ae7a940\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-j4z2k" Dec 02 09:11:52 crc kubenswrapper[4895]: I1202 09:11:52.328691 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fdefa681-0c42-4f77-81e9-19fc3ae7a940-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-j4z2k\" (UID: \"fdefa681-0c42-4f77-81e9-19fc3ae7a940\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-j4z2k" Dec 02 09:11:52 crc kubenswrapper[4895]: I1202 09:11:52.430652 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fdefa681-0c42-4f77-81e9-19fc3ae7a940-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-j4z2k\" (UID: 
\"fdefa681-0c42-4f77-81e9-19fc3ae7a940\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-j4z2k" Dec 02 09:11:52 crc kubenswrapper[4895]: I1202 09:11:52.431157 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fdefa681-0c42-4f77-81e9-19fc3ae7a940-ssh-key\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-j4z2k\" (UID: \"fdefa681-0c42-4f77-81e9-19fc3ae7a940\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-j4z2k" Dec 02 09:11:52 crc kubenswrapper[4895]: I1202 09:11:52.431213 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cdn4g\" (UniqueName: \"kubernetes.io/projected/fdefa681-0c42-4f77-81e9-19fc3ae7a940-kube-api-access-cdn4g\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-j4z2k\" (UID: \"fdefa681-0c42-4f77-81e9-19fc3ae7a940\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-j4z2k" Dec 02 09:11:52 crc kubenswrapper[4895]: I1202 09:11:52.431289 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fdefa681-0c42-4f77-81e9-19fc3ae7a940-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-j4z2k\" (UID: \"fdefa681-0c42-4f77-81e9-19fc3ae7a940\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-j4z2k" Dec 02 09:11:52 crc kubenswrapper[4895]: I1202 09:11:52.431389 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/fdefa681-0c42-4f77-81e9-19fc3ae7a940-ceph\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-j4z2k\" (UID: \"fdefa681-0c42-4f77-81e9-19fc3ae7a940\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-j4z2k" Dec 02 09:11:52 crc kubenswrapper[4895]: I1202 09:11:52.441467 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/fdefa681-0c42-4f77-81e9-19fc3ae7a940-ssh-key\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-j4z2k\" (UID: \"fdefa681-0c42-4f77-81e9-19fc3ae7a940\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-j4z2k" Dec 02 09:11:52 crc kubenswrapper[4895]: I1202 09:11:52.441675 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/fdefa681-0c42-4f77-81e9-19fc3ae7a940-ceph\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-j4z2k\" (UID: \"fdefa681-0c42-4f77-81e9-19fc3ae7a940\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-j4z2k" Dec 02 09:11:52 crc kubenswrapper[4895]: I1202 09:11:52.441807 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fdefa681-0c42-4f77-81e9-19fc3ae7a940-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-j4z2k\" (UID: \"fdefa681-0c42-4f77-81e9-19fc3ae7a940\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-j4z2k" Dec 02 09:11:52 crc kubenswrapper[4895]: I1202 09:11:52.447761 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fdefa681-0c42-4f77-81e9-19fc3ae7a940-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-j4z2k\" (UID: \"fdefa681-0c42-4f77-81e9-19fc3ae7a940\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-j4z2k" Dec 02 09:11:52 crc kubenswrapper[4895]: I1202 09:11:52.451109 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdn4g\" (UniqueName: \"kubernetes.io/projected/fdefa681-0c42-4f77-81e9-19fc3ae7a940-kube-api-access-cdn4g\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-j4z2k\" (UID: \"fdefa681-0c42-4f77-81e9-19fc3ae7a940\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-j4z2k" Dec 02 
09:11:52 crc kubenswrapper[4895]: I1202 09:11:52.612716 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-j4z2k" Dec 02 09:11:52 crc kubenswrapper[4895]: I1202 09:11:52.674606 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-td5jd" event={"ID":"14ce8a9c-33c5-4915-9c12-1025afec5b8e","Type":"ContainerStarted","Data":"dc01504f0b5a8bb11a72bded3a0b41f1914c46d0699db3ced36a66ea9ae35fe8"} Dec 02 09:11:52 crc kubenswrapper[4895]: I1202 09:11:52.701813 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-td5jd" podStartSLOduration=2.318566246 podStartE2EDuration="10.701784245s" podCreationTimestamp="2025-12-02 09:11:42 +0000 UTC" firstStartedPulling="2025-12-02 09:11:43.555379115 +0000 UTC m=+6514.726238728" lastFinishedPulling="2025-12-02 09:11:51.938597114 +0000 UTC m=+6523.109456727" observedRunningTime="2025-12-02 09:11:52.698933196 +0000 UTC m=+6523.869792839" watchObservedRunningTime="2025-12-02 09:11:52.701784245 +0000 UTC m=+6523.872643868" Dec 02 09:11:53 crc kubenswrapper[4895]: I1202 09:11:53.272011 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-j4z2k"] Dec 02 09:11:53 crc kubenswrapper[4895]: W1202 09:11:53.277309 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfdefa681_0c42_4f77_81e9_19fc3ae7a940.slice/crio-70956572a642ed3133800206992cfb5ece42628fc8380022df75a9f6ecb1905d WatchSource:0}: Error finding container 70956572a642ed3133800206992cfb5ece42628fc8380022df75a9f6ecb1905d: Status 404 returned error can't find the container with id 70956572a642ed3133800206992cfb5ece42628fc8380022df75a9f6ecb1905d Dec 02 09:11:53 crc kubenswrapper[4895]: I1202 09:11:53.687254 4895 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-j4z2k" event={"ID":"fdefa681-0c42-4f77-81e9-19fc3ae7a940","Type":"ContainerStarted","Data":"70956572a642ed3133800206992cfb5ece42628fc8380022df75a9f6ecb1905d"} Dec 02 09:11:54 crc kubenswrapper[4895]: I1202 09:11:54.333639 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-qf4z2" Dec 02 09:11:54 crc kubenswrapper[4895]: I1202 09:11:54.333707 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-qf4z2" Dec 02 09:11:54 crc kubenswrapper[4895]: I1202 09:11:54.699336 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-j4z2k" event={"ID":"fdefa681-0c42-4f77-81e9-19fc3ae7a940","Type":"ContainerStarted","Data":"98b6665c802e1a61fb8552f44a2b8476061bd6f8a199478f50fc7abc6a3d74ba"} Dec 02 09:11:54 crc kubenswrapper[4895]: I1202 09:11:54.720927 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-j4z2k" podStartSLOduration=2.516934145 podStartE2EDuration="2.720905135s" podCreationTimestamp="2025-12-02 09:11:52 +0000 UTC" firstStartedPulling="2025-12-02 09:11:53.28461919 +0000 UTC m=+6524.455478803" lastFinishedPulling="2025-12-02 09:11:53.48859018 +0000 UTC m=+6524.659449793" observedRunningTime="2025-12-02 09:11:54.714894308 +0000 UTC m=+6525.885753931" watchObservedRunningTime="2025-12-02 09:11:54.720905135 +0000 UTC m=+6525.891764748" Dec 02 09:11:55 crc kubenswrapper[4895]: I1202 09:11:55.381187 4895 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-qf4z2" podUID="1a8ea5f4-253e-4009-a9b5-b92eea450bb7" containerName="registry-server" probeResult="failure" output=< Dec 02 09:11:55 crc kubenswrapper[4895]: timeout: failed to connect service ":50051" within 1s Dec 02 09:11:55 crc 
kubenswrapper[4895]: > Dec 02 09:12:02 crc kubenswrapper[4895]: I1202 09:12:02.541599 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-td5jd" Dec 02 09:12:02 crc kubenswrapper[4895]: I1202 09:12:02.542229 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-td5jd" Dec 02 09:12:03 crc kubenswrapper[4895]: I1202 09:12:03.593113 4895 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-td5jd" podUID="14ce8a9c-33c5-4915-9c12-1025afec5b8e" containerName="registry-server" probeResult="failure" output=< Dec 02 09:12:03 crc kubenswrapper[4895]: timeout: failed to connect service ":50051" within 1s Dec 02 09:12:03 crc kubenswrapper[4895]: > Dec 02 09:12:04 crc kubenswrapper[4895]: I1202 09:12:04.141237 4895 scope.go:117] "RemoveContainer" containerID="d41b8f042da7f202a665565fffb111027944b6b1623344c1475552b27b519444" Dec 02 09:12:04 crc kubenswrapper[4895]: E1202 09:12:04.141733 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:12:04 crc kubenswrapper[4895]: I1202 09:12:04.383857 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-qf4z2" Dec 02 09:12:04 crc kubenswrapper[4895]: I1202 09:12:04.439563 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-qf4z2" Dec 02 09:12:04 crc kubenswrapper[4895]: I1202 09:12:04.634792 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-qf4z2"] Dec 02 09:12:05 crc kubenswrapper[4895]: I1202 09:12:05.819531 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-qf4z2" podUID="1a8ea5f4-253e-4009-a9b5-b92eea450bb7" containerName="registry-server" containerID="cri-o://106fd03b34b86e5613aad63b0f37dae8d0e72e8e4957747be78f46a3d4473ed7" gracePeriod=2 Dec 02 09:12:06 crc kubenswrapper[4895]: I1202 09:12:06.312570 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qf4z2" Dec 02 09:12:06 crc kubenswrapper[4895]: I1202 09:12:06.418495 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1a8ea5f4-253e-4009-a9b5-b92eea450bb7-utilities\") pod \"1a8ea5f4-253e-4009-a9b5-b92eea450bb7\" (UID: \"1a8ea5f4-253e-4009-a9b5-b92eea450bb7\") " Dec 02 09:12:06 crc kubenswrapper[4895]: I1202 09:12:06.418850 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tn86g\" (UniqueName: \"kubernetes.io/projected/1a8ea5f4-253e-4009-a9b5-b92eea450bb7-kube-api-access-tn86g\") pod \"1a8ea5f4-253e-4009-a9b5-b92eea450bb7\" (UID: \"1a8ea5f4-253e-4009-a9b5-b92eea450bb7\") " Dec 02 09:12:06 crc kubenswrapper[4895]: I1202 09:12:06.419026 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1a8ea5f4-253e-4009-a9b5-b92eea450bb7-catalog-content\") pod \"1a8ea5f4-253e-4009-a9b5-b92eea450bb7\" (UID: \"1a8ea5f4-253e-4009-a9b5-b92eea450bb7\") " Dec 02 09:12:06 crc kubenswrapper[4895]: I1202 09:12:06.419516 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1a8ea5f4-253e-4009-a9b5-b92eea450bb7-utilities" (OuterVolumeSpecName: "utilities") pod "1a8ea5f4-253e-4009-a9b5-b92eea450bb7" (UID: 
"1a8ea5f4-253e-4009-a9b5-b92eea450bb7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:12:06 crc kubenswrapper[4895]: I1202 09:12:06.420989 4895 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1a8ea5f4-253e-4009-a9b5-b92eea450bb7-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 09:12:06 crc kubenswrapper[4895]: I1202 09:12:06.425916 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a8ea5f4-253e-4009-a9b5-b92eea450bb7-kube-api-access-tn86g" (OuterVolumeSpecName: "kube-api-access-tn86g") pod "1a8ea5f4-253e-4009-a9b5-b92eea450bb7" (UID: "1a8ea5f4-253e-4009-a9b5-b92eea450bb7"). InnerVolumeSpecName "kube-api-access-tn86g". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:12:06 crc kubenswrapper[4895]: I1202 09:12:06.470489 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1a8ea5f4-253e-4009-a9b5-b92eea450bb7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1a8ea5f4-253e-4009-a9b5-b92eea450bb7" (UID: "1a8ea5f4-253e-4009-a9b5-b92eea450bb7"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:12:06 crc kubenswrapper[4895]: I1202 09:12:06.523366 4895 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1a8ea5f4-253e-4009-a9b5-b92eea450bb7-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 09:12:06 crc kubenswrapper[4895]: I1202 09:12:06.523419 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tn86g\" (UniqueName: \"kubernetes.io/projected/1a8ea5f4-253e-4009-a9b5-b92eea450bb7-kube-api-access-tn86g\") on node \"crc\" DevicePath \"\"" Dec 02 09:12:06 crc kubenswrapper[4895]: I1202 09:12:06.831849 4895 generic.go:334] "Generic (PLEG): container finished" podID="1a8ea5f4-253e-4009-a9b5-b92eea450bb7" containerID="106fd03b34b86e5613aad63b0f37dae8d0e72e8e4957747be78f46a3d4473ed7" exitCode=0 Dec 02 09:12:06 crc kubenswrapper[4895]: I1202 09:12:06.831898 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qf4z2" event={"ID":"1a8ea5f4-253e-4009-a9b5-b92eea450bb7","Type":"ContainerDied","Data":"106fd03b34b86e5613aad63b0f37dae8d0e72e8e4957747be78f46a3d4473ed7"} Dec 02 09:12:06 crc kubenswrapper[4895]: I1202 09:12:06.831931 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qf4z2" event={"ID":"1a8ea5f4-253e-4009-a9b5-b92eea450bb7","Type":"ContainerDied","Data":"936f6a86b32822b26913b3924dc355d78940eaa67b9c98bb2152e1500fd11d58"} Dec 02 09:12:06 crc kubenswrapper[4895]: I1202 09:12:06.831956 4895 scope.go:117] "RemoveContainer" containerID="106fd03b34b86e5613aad63b0f37dae8d0e72e8e4957747be78f46a3d4473ed7" Dec 02 09:12:06 crc kubenswrapper[4895]: I1202 09:12:06.832121 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qf4z2" Dec 02 09:12:06 crc kubenswrapper[4895]: I1202 09:12:06.893873 4895 scope.go:117] "RemoveContainer" containerID="982ce4c3aeeb1494825f4b8a85929ed310f1c8636d289a3071b7b77b312ddc1e" Dec 02 09:12:06 crc kubenswrapper[4895]: I1202 09:12:06.894012 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qf4z2"] Dec 02 09:12:06 crc kubenswrapper[4895]: I1202 09:12:06.908223 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-qf4z2"] Dec 02 09:12:06 crc kubenswrapper[4895]: I1202 09:12:06.926940 4895 scope.go:117] "RemoveContainer" containerID="6b6622d92e5072b62f12c15f66aa8718c120bff55bfb61b0ba604a697d06700c" Dec 02 09:12:06 crc kubenswrapper[4895]: I1202 09:12:06.980218 4895 scope.go:117] "RemoveContainer" containerID="106fd03b34b86e5613aad63b0f37dae8d0e72e8e4957747be78f46a3d4473ed7" Dec 02 09:12:06 crc kubenswrapper[4895]: E1202 09:12:06.981359 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"106fd03b34b86e5613aad63b0f37dae8d0e72e8e4957747be78f46a3d4473ed7\": container with ID starting with 106fd03b34b86e5613aad63b0f37dae8d0e72e8e4957747be78f46a3d4473ed7 not found: ID does not exist" containerID="106fd03b34b86e5613aad63b0f37dae8d0e72e8e4957747be78f46a3d4473ed7" Dec 02 09:12:06 crc kubenswrapper[4895]: I1202 09:12:06.981392 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"106fd03b34b86e5613aad63b0f37dae8d0e72e8e4957747be78f46a3d4473ed7"} err="failed to get container status \"106fd03b34b86e5613aad63b0f37dae8d0e72e8e4957747be78f46a3d4473ed7\": rpc error: code = NotFound desc = could not find container \"106fd03b34b86e5613aad63b0f37dae8d0e72e8e4957747be78f46a3d4473ed7\": container with ID starting with 106fd03b34b86e5613aad63b0f37dae8d0e72e8e4957747be78f46a3d4473ed7 not 
found: ID does not exist" Dec 02 09:12:06 crc kubenswrapper[4895]: I1202 09:12:06.981413 4895 scope.go:117] "RemoveContainer" containerID="982ce4c3aeeb1494825f4b8a85929ed310f1c8636d289a3071b7b77b312ddc1e" Dec 02 09:12:06 crc kubenswrapper[4895]: E1202 09:12:06.981788 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"982ce4c3aeeb1494825f4b8a85929ed310f1c8636d289a3071b7b77b312ddc1e\": container with ID starting with 982ce4c3aeeb1494825f4b8a85929ed310f1c8636d289a3071b7b77b312ddc1e not found: ID does not exist" containerID="982ce4c3aeeb1494825f4b8a85929ed310f1c8636d289a3071b7b77b312ddc1e" Dec 02 09:12:06 crc kubenswrapper[4895]: I1202 09:12:06.981840 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"982ce4c3aeeb1494825f4b8a85929ed310f1c8636d289a3071b7b77b312ddc1e"} err="failed to get container status \"982ce4c3aeeb1494825f4b8a85929ed310f1c8636d289a3071b7b77b312ddc1e\": rpc error: code = NotFound desc = could not find container \"982ce4c3aeeb1494825f4b8a85929ed310f1c8636d289a3071b7b77b312ddc1e\": container with ID starting with 982ce4c3aeeb1494825f4b8a85929ed310f1c8636d289a3071b7b77b312ddc1e not found: ID does not exist" Dec 02 09:12:06 crc kubenswrapper[4895]: I1202 09:12:06.981870 4895 scope.go:117] "RemoveContainer" containerID="6b6622d92e5072b62f12c15f66aa8718c120bff55bfb61b0ba604a697d06700c" Dec 02 09:12:06 crc kubenswrapper[4895]: E1202 09:12:06.982354 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b6622d92e5072b62f12c15f66aa8718c120bff55bfb61b0ba604a697d06700c\": container with ID starting with 6b6622d92e5072b62f12c15f66aa8718c120bff55bfb61b0ba604a697d06700c not found: ID does not exist" containerID="6b6622d92e5072b62f12c15f66aa8718c120bff55bfb61b0ba604a697d06700c" Dec 02 09:12:06 crc kubenswrapper[4895]: I1202 09:12:06.982389 4895 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b6622d92e5072b62f12c15f66aa8718c120bff55bfb61b0ba604a697d06700c"} err="failed to get container status \"6b6622d92e5072b62f12c15f66aa8718c120bff55bfb61b0ba604a697d06700c\": rpc error: code = NotFound desc = could not find container \"6b6622d92e5072b62f12c15f66aa8718c120bff55bfb61b0ba604a697d06700c\": container with ID starting with 6b6622d92e5072b62f12c15f66aa8718c120bff55bfb61b0ba604a697d06700c not found: ID does not exist" Dec 02 09:12:07 crc kubenswrapper[4895]: I1202 09:12:07.153555 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a8ea5f4-253e-4009-a9b5-b92eea450bb7" path="/var/lib/kubelet/pods/1a8ea5f4-253e-4009-a9b5-b92eea450bb7/volumes" Dec 02 09:12:12 crc kubenswrapper[4895]: I1202 09:12:12.592338 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-td5jd" Dec 02 09:12:12 crc kubenswrapper[4895]: I1202 09:12:12.648968 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-td5jd" Dec 02 09:12:13 crc kubenswrapper[4895]: I1202 09:12:13.050315 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-db-sync-79stx"] Dec 02 09:12:13 crc kubenswrapper[4895]: I1202 09:12:13.068998 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-db-sync-79stx"] Dec 02 09:12:13 crc kubenswrapper[4895]: I1202 09:12:13.155028 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="304eb43e-b503-4343-945e-3d4def10dc47" path="/var/lib/kubelet/pods/304eb43e-b503-4343-945e-3d4def10dc47/volumes" Dec 02 09:12:13 crc kubenswrapper[4895]: I1202 09:12:13.404375 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-td5jd"] Dec 02 09:12:13 crc kubenswrapper[4895]: I1202 09:12:13.913494 4895 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openshift-marketplace/redhat-operators-td5jd" podUID="14ce8a9c-33c5-4915-9c12-1025afec5b8e" containerName="registry-server" containerID="cri-o://dc01504f0b5a8bb11a72bded3a0b41f1914c46d0699db3ced36a66ea9ae35fe8" gracePeriod=2 Dec 02 09:12:14 crc kubenswrapper[4895]: I1202 09:12:14.412387 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-td5jd" Dec 02 09:12:14 crc kubenswrapper[4895]: I1202 09:12:14.595299 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/14ce8a9c-33c5-4915-9c12-1025afec5b8e-utilities\") pod \"14ce8a9c-33c5-4915-9c12-1025afec5b8e\" (UID: \"14ce8a9c-33c5-4915-9c12-1025afec5b8e\") " Dec 02 09:12:14 crc kubenswrapper[4895]: I1202 09:12:14.595437 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/14ce8a9c-33c5-4915-9c12-1025afec5b8e-catalog-content\") pod \"14ce8a9c-33c5-4915-9c12-1025afec5b8e\" (UID: \"14ce8a9c-33c5-4915-9c12-1025afec5b8e\") " Dec 02 09:12:14 crc kubenswrapper[4895]: I1202 09:12:14.595499 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rcdnm\" (UniqueName: \"kubernetes.io/projected/14ce8a9c-33c5-4915-9c12-1025afec5b8e-kube-api-access-rcdnm\") pod \"14ce8a9c-33c5-4915-9c12-1025afec5b8e\" (UID: \"14ce8a9c-33c5-4915-9c12-1025afec5b8e\") " Dec 02 09:12:14 crc kubenswrapper[4895]: I1202 09:12:14.596446 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/14ce8a9c-33c5-4915-9c12-1025afec5b8e-utilities" (OuterVolumeSpecName: "utilities") pod "14ce8a9c-33c5-4915-9c12-1025afec5b8e" (UID: "14ce8a9c-33c5-4915-9c12-1025afec5b8e"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:12:14 crc kubenswrapper[4895]: I1202 09:12:14.601575 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14ce8a9c-33c5-4915-9c12-1025afec5b8e-kube-api-access-rcdnm" (OuterVolumeSpecName: "kube-api-access-rcdnm") pod "14ce8a9c-33c5-4915-9c12-1025afec5b8e" (UID: "14ce8a9c-33c5-4915-9c12-1025afec5b8e"). InnerVolumeSpecName "kube-api-access-rcdnm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:12:14 crc kubenswrapper[4895]: I1202 09:12:14.698427 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rcdnm\" (UniqueName: \"kubernetes.io/projected/14ce8a9c-33c5-4915-9c12-1025afec5b8e-kube-api-access-rcdnm\") on node \"crc\" DevicePath \"\"" Dec 02 09:12:14 crc kubenswrapper[4895]: I1202 09:12:14.698453 4895 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/14ce8a9c-33c5-4915-9c12-1025afec5b8e-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 09:12:14 crc kubenswrapper[4895]: I1202 09:12:14.704407 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/14ce8a9c-33c5-4915-9c12-1025afec5b8e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "14ce8a9c-33c5-4915-9c12-1025afec5b8e" (UID: "14ce8a9c-33c5-4915-9c12-1025afec5b8e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:12:14 crc kubenswrapper[4895]: I1202 09:12:14.800295 4895 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/14ce8a9c-33c5-4915-9c12-1025afec5b8e-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 09:12:14 crc kubenswrapper[4895]: I1202 09:12:14.925258 4895 generic.go:334] "Generic (PLEG): container finished" podID="14ce8a9c-33c5-4915-9c12-1025afec5b8e" containerID="dc01504f0b5a8bb11a72bded3a0b41f1914c46d0699db3ced36a66ea9ae35fe8" exitCode=0 Dec 02 09:12:14 crc kubenswrapper[4895]: I1202 09:12:14.925316 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-td5jd" event={"ID":"14ce8a9c-33c5-4915-9c12-1025afec5b8e","Type":"ContainerDied","Data":"dc01504f0b5a8bb11a72bded3a0b41f1914c46d0699db3ced36a66ea9ae35fe8"} Dec 02 09:12:14 crc kubenswrapper[4895]: I1202 09:12:14.925328 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-td5jd" Dec 02 09:12:14 crc kubenswrapper[4895]: I1202 09:12:14.925359 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-td5jd" event={"ID":"14ce8a9c-33c5-4915-9c12-1025afec5b8e","Type":"ContainerDied","Data":"71b91767c654fcc137478af6e38f21d6a0c90b8b27b4bfd1dfbdef2587c6d755"} Dec 02 09:12:14 crc kubenswrapper[4895]: I1202 09:12:14.925391 4895 scope.go:117] "RemoveContainer" containerID="dc01504f0b5a8bb11a72bded3a0b41f1914c46d0699db3ced36a66ea9ae35fe8" Dec 02 09:12:14 crc kubenswrapper[4895]: I1202 09:12:14.947602 4895 scope.go:117] "RemoveContainer" containerID="33cf30f0a578966587f4403bd0d9b623099a85e542a5f83fa00be046e3502bf9" Dec 02 09:12:14 crc kubenswrapper[4895]: I1202 09:12:14.962596 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-td5jd"] Dec 02 09:12:14 crc kubenswrapper[4895]: I1202 09:12:14.971184 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-td5jd"] Dec 02 09:12:14 crc kubenswrapper[4895]: I1202 09:12:14.991358 4895 scope.go:117] "RemoveContainer" containerID="f6ff794ba497ccc577ae9e417d7e72823a612639f8b1431af762f456398437b9" Dec 02 09:12:15 crc kubenswrapper[4895]: I1202 09:12:15.027942 4895 scope.go:117] "RemoveContainer" containerID="dc01504f0b5a8bb11a72bded3a0b41f1914c46d0699db3ced36a66ea9ae35fe8" Dec 02 09:12:15 crc kubenswrapper[4895]: E1202 09:12:15.028859 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc01504f0b5a8bb11a72bded3a0b41f1914c46d0699db3ced36a66ea9ae35fe8\": container with ID starting with dc01504f0b5a8bb11a72bded3a0b41f1914c46d0699db3ced36a66ea9ae35fe8 not found: ID does not exist" containerID="dc01504f0b5a8bb11a72bded3a0b41f1914c46d0699db3ced36a66ea9ae35fe8" Dec 02 09:12:15 crc kubenswrapper[4895]: I1202 09:12:15.028896 4895 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc01504f0b5a8bb11a72bded3a0b41f1914c46d0699db3ced36a66ea9ae35fe8"} err="failed to get container status \"dc01504f0b5a8bb11a72bded3a0b41f1914c46d0699db3ced36a66ea9ae35fe8\": rpc error: code = NotFound desc = could not find container \"dc01504f0b5a8bb11a72bded3a0b41f1914c46d0699db3ced36a66ea9ae35fe8\": container with ID starting with dc01504f0b5a8bb11a72bded3a0b41f1914c46d0699db3ced36a66ea9ae35fe8 not found: ID does not exist" Dec 02 09:12:15 crc kubenswrapper[4895]: I1202 09:12:15.028948 4895 scope.go:117] "RemoveContainer" containerID="33cf30f0a578966587f4403bd0d9b623099a85e542a5f83fa00be046e3502bf9" Dec 02 09:12:15 crc kubenswrapper[4895]: E1202 09:12:15.029673 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33cf30f0a578966587f4403bd0d9b623099a85e542a5f83fa00be046e3502bf9\": container with ID starting with 33cf30f0a578966587f4403bd0d9b623099a85e542a5f83fa00be046e3502bf9 not found: ID does not exist" containerID="33cf30f0a578966587f4403bd0d9b623099a85e542a5f83fa00be046e3502bf9" Dec 02 09:12:15 crc kubenswrapper[4895]: I1202 09:12:15.029731 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33cf30f0a578966587f4403bd0d9b623099a85e542a5f83fa00be046e3502bf9"} err="failed to get container status \"33cf30f0a578966587f4403bd0d9b623099a85e542a5f83fa00be046e3502bf9\": rpc error: code = NotFound desc = could not find container \"33cf30f0a578966587f4403bd0d9b623099a85e542a5f83fa00be046e3502bf9\": container with ID starting with 33cf30f0a578966587f4403bd0d9b623099a85e542a5f83fa00be046e3502bf9 not found: ID does not exist" Dec 02 09:12:15 crc kubenswrapper[4895]: I1202 09:12:15.029789 4895 scope.go:117] "RemoveContainer" containerID="f6ff794ba497ccc577ae9e417d7e72823a612639f8b1431af762f456398437b9" Dec 02 09:12:15 crc kubenswrapper[4895]: E1202 
09:12:15.030411 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6ff794ba497ccc577ae9e417d7e72823a612639f8b1431af762f456398437b9\": container with ID starting with f6ff794ba497ccc577ae9e417d7e72823a612639f8b1431af762f456398437b9 not found: ID does not exist" containerID="f6ff794ba497ccc577ae9e417d7e72823a612639f8b1431af762f456398437b9" Dec 02 09:12:15 crc kubenswrapper[4895]: I1202 09:12:15.030488 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6ff794ba497ccc577ae9e417d7e72823a612639f8b1431af762f456398437b9"} err="failed to get container status \"f6ff794ba497ccc577ae9e417d7e72823a612639f8b1431af762f456398437b9\": rpc error: code = NotFound desc = could not find container \"f6ff794ba497ccc577ae9e417d7e72823a612639f8b1431af762f456398437b9\": container with ID starting with f6ff794ba497ccc577ae9e417d7e72823a612639f8b1431af762f456398437b9 not found: ID does not exist" Dec 02 09:12:15 crc kubenswrapper[4895]: I1202 09:12:15.154992 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14ce8a9c-33c5-4915-9c12-1025afec5b8e" path="/var/lib/kubelet/pods/14ce8a9c-33c5-4915-9c12-1025afec5b8e/volumes" Dec 02 09:12:16 crc kubenswrapper[4895]: I1202 09:12:16.141183 4895 scope.go:117] "RemoveContainer" containerID="d41b8f042da7f202a665565fffb111027944b6b1623344c1475552b27b519444" Dec 02 09:12:16 crc kubenswrapper[4895]: E1202 09:12:16.141796 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:12:31 crc kubenswrapper[4895]: I1202 09:12:31.141381 
4895 scope.go:117] "RemoveContainer" containerID="d41b8f042da7f202a665565fffb111027944b6b1623344c1475552b27b519444" Dec 02 09:12:31 crc kubenswrapper[4895]: E1202 09:12:31.142360 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:12:37 crc kubenswrapper[4895]: I1202 09:12:37.909401 4895 scope.go:117] "RemoveContainer" containerID="ef5ddaef6e24d0fd2d197527dbce983ef87f695d5ff1a9dac740e6c0d48a6617" Dec 02 09:12:37 crc kubenswrapper[4895]: I1202 09:12:37.938722 4895 scope.go:117] "RemoveContainer" containerID="2748d052530f0be06d2597be6369b313b5eef17691f7a0fe1f240c73614b34a1" Dec 02 09:12:38 crc kubenswrapper[4895]: I1202 09:12:38.004920 4895 scope.go:117] "RemoveContainer" containerID="8550221e31485856ef3de33c5649eecefe408e448419218a943a0f463d043ec4" Dec 02 09:12:38 crc kubenswrapper[4895]: I1202 09:12:38.044492 4895 scope.go:117] "RemoveContainer" containerID="f048d3a2d0cd2fc93ea28d7c1aeba014683e8b1518540c2cf6daa121b589fb64" Dec 02 09:12:44 crc kubenswrapper[4895]: I1202 09:12:44.141647 4895 scope.go:117] "RemoveContainer" containerID="d41b8f042da7f202a665565fffb111027944b6b1623344c1475552b27b519444" Dec 02 09:12:44 crc kubenswrapper[4895]: E1202 09:12:44.142619 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" 
Dec 02 09:12:57 crc kubenswrapper[4895]: I1202 09:12:57.141705 4895 scope.go:117] "RemoveContainer" containerID="d41b8f042da7f202a665565fffb111027944b6b1623344c1475552b27b519444" Dec 02 09:12:57 crc kubenswrapper[4895]: E1202 09:12:57.142692 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:13:11 crc kubenswrapper[4895]: I1202 09:13:11.145982 4895 scope.go:117] "RemoveContainer" containerID="d41b8f042da7f202a665565fffb111027944b6b1623344c1475552b27b519444" Dec 02 09:13:11 crc kubenswrapper[4895]: E1202 09:13:11.146877 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:13:25 crc kubenswrapper[4895]: I1202 09:13:25.141270 4895 scope.go:117] "RemoveContainer" containerID="d41b8f042da7f202a665565fffb111027944b6b1623344c1475552b27b519444" Dec 02 09:13:25 crc kubenswrapper[4895]: E1202 09:13:25.142355 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" 
podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:13:39 crc kubenswrapper[4895]: I1202 09:13:39.148572 4895 scope.go:117] "RemoveContainer" containerID="d41b8f042da7f202a665565fffb111027944b6b1623344c1475552b27b519444" Dec 02 09:13:39 crc kubenswrapper[4895]: E1202 09:13:39.150089 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:13:54 crc kubenswrapper[4895]: I1202 09:13:54.141134 4895 scope.go:117] "RemoveContainer" containerID="d41b8f042da7f202a665565fffb111027944b6b1623344c1475552b27b519444" Dec 02 09:13:54 crc kubenswrapper[4895]: E1202 09:13:54.142044 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:14:06 crc kubenswrapper[4895]: I1202 09:14:06.141914 4895 scope.go:117] "RemoveContainer" containerID="d41b8f042da7f202a665565fffb111027944b6b1623344c1475552b27b519444" Dec 02 09:14:06 crc kubenswrapper[4895]: E1202 09:14:06.142778 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:14:17 crc kubenswrapper[4895]: I1202 09:14:17.141992 4895 scope.go:117] "RemoveContainer" containerID="d41b8f042da7f202a665565fffb111027944b6b1623344c1475552b27b519444" Dec 02 09:14:17 crc kubenswrapper[4895]: E1202 09:14:17.142840 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:14:28 crc kubenswrapper[4895]: I1202 09:14:28.140872 4895 scope.go:117] "RemoveContainer" containerID="d41b8f042da7f202a665565fffb111027944b6b1623344c1475552b27b519444" Dec 02 09:14:28 crc kubenswrapper[4895]: E1202 09:14:28.141616 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:14:40 crc kubenswrapper[4895]: I1202 09:14:40.142704 4895 scope.go:117] "RemoveContainer" containerID="d41b8f042da7f202a665565fffb111027944b6b1623344c1475552b27b519444" Dec 02 09:14:40 crc kubenswrapper[4895]: I1202 09:14:40.406666 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" event={"ID":"0468d2d1-a975-45a6-af9f-47adc432fab0","Type":"ContainerStarted","Data":"30d1f1705497456a9817e446749ca57dc7425e94bd89bbebe71606626901b02e"} Dec 02 09:14:43 crc 
kubenswrapper[4895]: I1202 09:14:43.567819 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-tlft9"] Dec 02 09:14:43 crc kubenswrapper[4895]: E1202 09:14:43.569026 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14ce8a9c-33c5-4915-9c12-1025afec5b8e" containerName="registry-server" Dec 02 09:14:43 crc kubenswrapper[4895]: I1202 09:14:43.569047 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="14ce8a9c-33c5-4915-9c12-1025afec5b8e" containerName="registry-server" Dec 02 09:14:43 crc kubenswrapper[4895]: E1202 09:14:43.569061 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a8ea5f4-253e-4009-a9b5-b92eea450bb7" containerName="registry-server" Dec 02 09:14:43 crc kubenswrapper[4895]: I1202 09:14:43.569068 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a8ea5f4-253e-4009-a9b5-b92eea450bb7" containerName="registry-server" Dec 02 09:14:43 crc kubenswrapper[4895]: E1202 09:14:43.569078 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14ce8a9c-33c5-4915-9c12-1025afec5b8e" containerName="extract-content" Dec 02 09:14:43 crc kubenswrapper[4895]: I1202 09:14:43.569084 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="14ce8a9c-33c5-4915-9c12-1025afec5b8e" containerName="extract-content" Dec 02 09:14:43 crc kubenswrapper[4895]: E1202 09:14:43.569110 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14ce8a9c-33c5-4915-9c12-1025afec5b8e" containerName="extract-utilities" Dec 02 09:14:43 crc kubenswrapper[4895]: I1202 09:14:43.569117 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="14ce8a9c-33c5-4915-9c12-1025afec5b8e" containerName="extract-utilities" Dec 02 09:14:43 crc kubenswrapper[4895]: E1202 09:14:43.569148 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a8ea5f4-253e-4009-a9b5-b92eea450bb7" containerName="extract-content" Dec 02 09:14:43 crc kubenswrapper[4895]: I1202 
09:14:43.569153 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a8ea5f4-253e-4009-a9b5-b92eea450bb7" containerName="extract-content" Dec 02 09:14:43 crc kubenswrapper[4895]: E1202 09:14:43.569165 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a8ea5f4-253e-4009-a9b5-b92eea450bb7" containerName="extract-utilities" Dec 02 09:14:43 crc kubenswrapper[4895]: I1202 09:14:43.569170 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a8ea5f4-253e-4009-a9b5-b92eea450bb7" containerName="extract-utilities" Dec 02 09:14:43 crc kubenswrapper[4895]: I1202 09:14:43.569409 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a8ea5f4-253e-4009-a9b5-b92eea450bb7" containerName="registry-server" Dec 02 09:14:43 crc kubenswrapper[4895]: I1202 09:14:43.569434 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="14ce8a9c-33c5-4915-9c12-1025afec5b8e" containerName="registry-server" Dec 02 09:14:43 crc kubenswrapper[4895]: I1202 09:14:43.571670 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tlft9" Dec 02 09:14:43 crc kubenswrapper[4895]: I1202 09:14:43.581475 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tlft9"] Dec 02 09:14:43 crc kubenswrapper[4895]: I1202 09:14:43.661157 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/edf00756-08af-48df-8028-f91a84df1746-catalog-content\") pod \"redhat-marketplace-tlft9\" (UID: \"edf00756-08af-48df-8028-f91a84df1746\") " pod="openshift-marketplace/redhat-marketplace-tlft9" Dec 02 09:14:43 crc kubenswrapper[4895]: I1202 09:14:43.661649 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54xsm\" (UniqueName: \"kubernetes.io/projected/edf00756-08af-48df-8028-f91a84df1746-kube-api-access-54xsm\") pod \"redhat-marketplace-tlft9\" (UID: \"edf00756-08af-48df-8028-f91a84df1746\") " pod="openshift-marketplace/redhat-marketplace-tlft9" Dec 02 09:14:43 crc kubenswrapper[4895]: I1202 09:14:43.661824 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/edf00756-08af-48df-8028-f91a84df1746-utilities\") pod \"redhat-marketplace-tlft9\" (UID: \"edf00756-08af-48df-8028-f91a84df1746\") " pod="openshift-marketplace/redhat-marketplace-tlft9" Dec 02 09:14:43 crc kubenswrapper[4895]: I1202 09:14:43.764437 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54xsm\" (UniqueName: \"kubernetes.io/projected/edf00756-08af-48df-8028-f91a84df1746-kube-api-access-54xsm\") pod \"redhat-marketplace-tlft9\" (UID: \"edf00756-08af-48df-8028-f91a84df1746\") " pod="openshift-marketplace/redhat-marketplace-tlft9" Dec 02 09:14:43 crc kubenswrapper[4895]: I1202 09:14:43.764541 4895 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/edf00756-08af-48df-8028-f91a84df1746-utilities\") pod \"redhat-marketplace-tlft9\" (UID: \"edf00756-08af-48df-8028-f91a84df1746\") " pod="openshift-marketplace/redhat-marketplace-tlft9" Dec 02 09:14:43 crc kubenswrapper[4895]: I1202 09:14:43.764661 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/edf00756-08af-48df-8028-f91a84df1746-catalog-content\") pod \"redhat-marketplace-tlft9\" (UID: \"edf00756-08af-48df-8028-f91a84df1746\") " pod="openshift-marketplace/redhat-marketplace-tlft9" Dec 02 09:14:43 crc kubenswrapper[4895]: I1202 09:14:43.765181 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/edf00756-08af-48df-8028-f91a84df1746-catalog-content\") pod \"redhat-marketplace-tlft9\" (UID: \"edf00756-08af-48df-8028-f91a84df1746\") " pod="openshift-marketplace/redhat-marketplace-tlft9" Dec 02 09:14:43 crc kubenswrapper[4895]: I1202 09:14:43.765178 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/edf00756-08af-48df-8028-f91a84df1746-utilities\") pod \"redhat-marketplace-tlft9\" (UID: \"edf00756-08af-48df-8028-f91a84df1746\") " pod="openshift-marketplace/redhat-marketplace-tlft9" Dec 02 09:14:43 crc kubenswrapper[4895]: I1202 09:14:43.795925 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54xsm\" (UniqueName: \"kubernetes.io/projected/edf00756-08af-48df-8028-f91a84df1746-kube-api-access-54xsm\") pod \"redhat-marketplace-tlft9\" (UID: \"edf00756-08af-48df-8028-f91a84df1746\") " pod="openshift-marketplace/redhat-marketplace-tlft9" Dec 02 09:14:43 crc kubenswrapper[4895]: I1202 09:14:43.899057 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tlft9" Dec 02 09:14:44 crc kubenswrapper[4895]: I1202 09:14:44.443671 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tlft9"] Dec 02 09:14:45 crc kubenswrapper[4895]: I1202 09:14:45.457088 4895 generic.go:334] "Generic (PLEG): container finished" podID="edf00756-08af-48df-8028-f91a84df1746" containerID="0c508dba08729f32114e5bfb0a3cb885427bc7b5127583d8666bf0a98bf83b3a" exitCode=0 Dec 02 09:14:45 crc kubenswrapper[4895]: I1202 09:14:45.457184 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tlft9" event={"ID":"edf00756-08af-48df-8028-f91a84df1746","Type":"ContainerDied","Data":"0c508dba08729f32114e5bfb0a3cb885427bc7b5127583d8666bf0a98bf83b3a"} Dec 02 09:14:45 crc kubenswrapper[4895]: I1202 09:14:45.457815 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tlft9" event={"ID":"edf00756-08af-48df-8028-f91a84df1746","Type":"ContainerStarted","Data":"d47acaa2f87d2e91909d0fa130bd6f1eff320e77c04c25c82a5de48f199c6360"} Dec 02 09:14:45 crc kubenswrapper[4895]: I1202 09:14:45.461439 4895 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 02 09:14:47 crc kubenswrapper[4895]: I1202 09:14:47.486027 4895 generic.go:334] "Generic (PLEG): container finished" podID="edf00756-08af-48df-8028-f91a84df1746" containerID="3d2973a4a710179110f34d1176a20a141fc146765d6757f9b5a8236c1113dc20" exitCode=0 Dec 02 09:14:47 crc kubenswrapper[4895]: I1202 09:14:47.486186 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tlft9" event={"ID":"edf00756-08af-48df-8028-f91a84df1746","Type":"ContainerDied","Data":"3d2973a4a710179110f34d1176a20a141fc146765d6757f9b5a8236c1113dc20"} Dec 02 09:14:48 crc kubenswrapper[4895]: I1202 09:14:48.503923 4895 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/redhat-marketplace-tlft9" event={"ID":"edf00756-08af-48df-8028-f91a84df1746","Type":"ContainerStarted","Data":"6c8c37d4934738f76ca21e9cbf0e01f389e9e55bc16e46bc9a54dfa0fbb1accb"} Dec 02 09:14:48 crc kubenswrapper[4895]: I1202 09:14:48.533425 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-tlft9" podStartSLOduration=2.723090688 podStartE2EDuration="5.533401379s" podCreationTimestamp="2025-12-02 09:14:43 +0000 UTC" firstStartedPulling="2025-12-02 09:14:45.461208895 +0000 UTC m=+6696.632068508" lastFinishedPulling="2025-12-02 09:14:48.271519586 +0000 UTC m=+6699.442379199" observedRunningTime="2025-12-02 09:14:48.523645736 +0000 UTC m=+6699.694505379" watchObservedRunningTime="2025-12-02 09:14:48.533401379 +0000 UTC m=+6699.704260992" Dec 02 09:14:53 crc kubenswrapper[4895]: I1202 09:14:53.899965 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-tlft9" Dec 02 09:14:53 crc kubenswrapper[4895]: I1202 09:14:53.900452 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-tlft9" Dec 02 09:14:53 crc kubenswrapper[4895]: I1202 09:14:53.948941 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-tlft9" Dec 02 09:14:54 crc kubenswrapper[4895]: I1202 09:14:54.627905 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-tlft9" Dec 02 09:14:54 crc kubenswrapper[4895]: I1202 09:14:54.676780 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tlft9"] Dec 02 09:14:56 crc kubenswrapper[4895]: I1202 09:14:56.587848 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-tlft9" 
podUID="edf00756-08af-48df-8028-f91a84df1746" containerName="registry-server" containerID="cri-o://6c8c37d4934738f76ca21e9cbf0e01f389e9e55bc16e46bc9a54dfa0fbb1accb" gracePeriod=2 Dec 02 09:14:57 crc kubenswrapper[4895]: I1202 09:14:57.094859 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tlft9" Dec 02 09:14:57 crc kubenswrapper[4895]: I1202 09:14:57.224014 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/edf00756-08af-48df-8028-f91a84df1746-catalog-content\") pod \"edf00756-08af-48df-8028-f91a84df1746\" (UID: \"edf00756-08af-48df-8028-f91a84df1746\") " Dec 02 09:14:57 crc kubenswrapper[4895]: I1202 09:14:57.224079 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-54xsm\" (UniqueName: \"kubernetes.io/projected/edf00756-08af-48df-8028-f91a84df1746-kube-api-access-54xsm\") pod \"edf00756-08af-48df-8028-f91a84df1746\" (UID: \"edf00756-08af-48df-8028-f91a84df1746\") " Dec 02 09:14:57 crc kubenswrapper[4895]: I1202 09:14:57.224120 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/edf00756-08af-48df-8028-f91a84df1746-utilities\") pod \"edf00756-08af-48df-8028-f91a84df1746\" (UID: \"edf00756-08af-48df-8028-f91a84df1746\") " Dec 02 09:14:57 crc kubenswrapper[4895]: I1202 09:14:57.225290 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/edf00756-08af-48df-8028-f91a84df1746-utilities" (OuterVolumeSpecName: "utilities") pod "edf00756-08af-48df-8028-f91a84df1746" (UID: "edf00756-08af-48df-8028-f91a84df1746"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:14:57 crc kubenswrapper[4895]: I1202 09:14:57.230600 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/edf00756-08af-48df-8028-f91a84df1746-kube-api-access-54xsm" (OuterVolumeSpecName: "kube-api-access-54xsm") pod "edf00756-08af-48df-8028-f91a84df1746" (UID: "edf00756-08af-48df-8028-f91a84df1746"). InnerVolumeSpecName "kube-api-access-54xsm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:14:57 crc kubenswrapper[4895]: I1202 09:14:57.242995 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/edf00756-08af-48df-8028-f91a84df1746-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "edf00756-08af-48df-8028-f91a84df1746" (UID: "edf00756-08af-48df-8028-f91a84df1746"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:14:57 crc kubenswrapper[4895]: I1202 09:14:57.327580 4895 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/edf00756-08af-48df-8028-f91a84df1746-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 09:14:57 crc kubenswrapper[4895]: I1202 09:14:57.327909 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-54xsm\" (UniqueName: \"kubernetes.io/projected/edf00756-08af-48df-8028-f91a84df1746-kube-api-access-54xsm\") on node \"crc\" DevicePath \"\"" Dec 02 09:14:57 crc kubenswrapper[4895]: I1202 09:14:57.327924 4895 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/edf00756-08af-48df-8028-f91a84df1746-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 09:14:57 crc kubenswrapper[4895]: I1202 09:14:57.599869 4895 generic.go:334] "Generic (PLEG): container finished" podID="edf00756-08af-48df-8028-f91a84df1746" 
containerID="6c8c37d4934738f76ca21e9cbf0e01f389e9e55bc16e46bc9a54dfa0fbb1accb" exitCode=0 Dec 02 09:14:57 crc kubenswrapper[4895]: I1202 09:14:57.599915 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tlft9" event={"ID":"edf00756-08af-48df-8028-f91a84df1746","Type":"ContainerDied","Data":"6c8c37d4934738f76ca21e9cbf0e01f389e9e55bc16e46bc9a54dfa0fbb1accb"} Dec 02 09:14:57 crc kubenswrapper[4895]: I1202 09:14:57.599948 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tlft9" event={"ID":"edf00756-08af-48df-8028-f91a84df1746","Type":"ContainerDied","Data":"d47acaa2f87d2e91909d0fa130bd6f1eff320e77c04c25c82a5de48f199c6360"} Dec 02 09:14:57 crc kubenswrapper[4895]: I1202 09:14:57.599944 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tlft9" Dec 02 09:14:57 crc kubenswrapper[4895]: I1202 09:14:57.599963 4895 scope.go:117] "RemoveContainer" containerID="6c8c37d4934738f76ca21e9cbf0e01f389e9e55bc16e46bc9a54dfa0fbb1accb" Dec 02 09:14:57 crc kubenswrapper[4895]: I1202 09:14:57.628135 4895 scope.go:117] "RemoveContainer" containerID="3d2973a4a710179110f34d1176a20a141fc146765d6757f9b5a8236c1113dc20" Dec 02 09:14:57 crc kubenswrapper[4895]: I1202 09:14:57.639073 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tlft9"] Dec 02 09:14:57 crc kubenswrapper[4895]: I1202 09:14:57.652735 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-tlft9"] Dec 02 09:14:57 crc kubenswrapper[4895]: I1202 09:14:57.661854 4895 scope.go:117] "RemoveContainer" containerID="0c508dba08729f32114e5bfb0a3cb885427bc7b5127583d8666bf0a98bf83b3a" Dec 02 09:14:57 crc kubenswrapper[4895]: I1202 09:14:57.714571 4895 scope.go:117] "RemoveContainer" containerID="6c8c37d4934738f76ca21e9cbf0e01f389e9e55bc16e46bc9a54dfa0fbb1accb" Dec 02 
09:14:57 crc kubenswrapper[4895]: E1202 09:14:57.715273 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c8c37d4934738f76ca21e9cbf0e01f389e9e55bc16e46bc9a54dfa0fbb1accb\": container with ID starting with 6c8c37d4934738f76ca21e9cbf0e01f389e9e55bc16e46bc9a54dfa0fbb1accb not found: ID does not exist" containerID="6c8c37d4934738f76ca21e9cbf0e01f389e9e55bc16e46bc9a54dfa0fbb1accb" Dec 02 09:14:57 crc kubenswrapper[4895]: I1202 09:14:57.715412 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c8c37d4934738f76ca21e9cbf0e01f389e9e55bc16e46bc9a54dfa0fbb1accb"} err="failed to get container status \"6c8c37d4934738f76ca21e9cbf0e01f389e9e55bc16e46bc9a54dfa0fbb1accb\": rpc error: code = NotFound desc = could not find container \"6c8c37d4934738f76ca21e9cbf0e01f389e9e55bc16e46bc9a54dfa0fbb1accb\": container with ID starting with 6c8c37d4934738f76ca21e9cbf0e01f389e9e55bc16e46bc9a54dfa0fbb1accb not found: ID does not exist" Dec 02 09:14:57 crc kubenswrapper[4895]: I1202 09:14:57.715517 4895 scope.go:117] "RemoveContainer" containerID="3d2973a4a710179110f34d1176a20a141fc146765d6757f9b5a8236c1113dc20" Dec 02 09:14:57 crc kubenswrapper[4895]: E1202 09:14:57.716018 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d2973a4a710179110f34d1176a20a141fc146765d6757f9b5a8236c1113dc20\": container with ID starting with 3d2973a4a710179110f34d1176a20a141fc146765d6757f9b5a8236c1113dc20 not found: ID does not exist" containerID="3d2973a4a710179110f34d1176a20a141fc146765d6757f9b5a8236c1113dc20" Dec 02 09:14:57 crc kubenswrapper[4895]: I1202 09:14:57.716059 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d2973a4a710179110f34d1176a20a141fc146765d6757f9b5a8236c1113dc20"} err="failed to get container status 
\"3d2973a4a710179110f34d1176a20a141fc146765d6757f9b5a8236c1113dc20\": rpc error: code = NotFound desc = could not find container \"3d2973a4a710179110f34d1176a20a141fc146765d6757f9b5a8236c1113dc20\": container with ID starting with 3d2973a4a710179110f34d1176a20a141fc146765d6757f9b5a8236c1113dc20 not found: ID does not exist" Dec 02 09:14:57 crc kubenswrapper[4895]: I1202 09:14:57.716103 4895 scope.go:117] "RemoveContainer" containerID="0c508dba08729f32114e5bfb0a3cb885427bc7b5127583d8666bf0a98bf83b3a" Dec 02 09:14:57 crc kubenswrapper[4895]: E1202 09:14:57.716589 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c508dba08729f32114e5bfb0a3cb885427bc7b5127583d8666bf0a98bf83b3a\": container with ID starting with 0c508dba08729f32114e5bfb0a3cb885427bc7b5127583d8666bf0a98bf83b3a not found: ID does not exist" containerID="0c508dba08729f32114e5bfb0a3cb885427bc7b5127583d8666bf0a98bf83b3a" Dec 02 09:14:57 crc kubenswrapper[4895]: I1202 09:14:57.716700 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c508dba08729f32114e5bfb0a3cb885427bc7b5127583d8666bf0a98bf83b3a"} err="failed to get container status \"0c508dba08729f32114e5bfb0a3cb885427bc7b5127583d8666bf0a98bf83b3a\": rpc error: code = NotFound desc = could not find container \"0c508dba08729f32114e5bfb0a3cb885427bc7b5127583d8666bf0a98bf83b3a\": container with ID starting with 0c508dba08729f32114e5bfb0a3cb885427bc7b5127583d8666bf0a98bf83b3a not found: ID does not exist" Dec 02 09:14:59 crc kubenswrapper[4895]: I1202 09:14:59.155802 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="edf00756-08af-48df-8028-f91a84df1746" path="/var/lib/kubelet/pods/edf00756-08af-48df-8028-f91a84df1746/volumes" Dec 02 09:15:00 crc kubenswrapper[4895]: I1202 09:15:00.152654 4895 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-operator-lifecycle-manager/collect-profiles-29411115-v7l6n"] Dec 02 09:15:00 crc kubenswrapper[4895]: E1202 09:15:00.154029 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edf00756-08af-48df-8028-f91a84df1746" containerName="extract-content" Dec 02 09:15:00 crc kubenswrapper[4895]: I1202 09:15:00.154050 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="edf00756-08af-48df-8028-f91a84df1746" containerName="extract-content" Dec 02 09:15:00 crc kubenswrapper[4895]: E1202 09:15:00.154067 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edf00756-08af-48df-8028-f91a84df1746" containerName="extract-utilities" Dec 02 09:15:00 crc kubenswrapper[4895]: I1202 09:15:00.154074 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="edf00756-08af-48df-8028-f91a84df1746" containerName="extract-utilities" Dec 02 09:15:00 crc kubenswrapper[4895]: E1202 09:15:00.154085 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edf00756-08af-48df-8028-f91a84df1746" containerName="registry-server" Dec 02 09:15:00 crc kubenswrapper[4895]: I1202 09:15:00.154091 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="edf00756-08af-48df-8028-f91a84df1746" containerName="registry-server" Dec 02 09:15:00 crc kubenswrapper[4895]: I1202 09:15:00.154336 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="edf00756-08af-48df-8028-f91a84df1746" containerName="registry-server" Dec 02 09:15:00 crc kubenswrapper[4895]: I1202 09:15:00.155378 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411115-v7l6n" Dec 02 09:15:00 crc kubenswrapper[4895]: I1202 09:15:00.157445 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 02 09:15:00 crc kubenswrapper[4895]: I1202 09:15:00.158372 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 02 09:15:00 crc kubenswrapper[4895]: I1202 09:15:00.181251 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411115-v7l6n"] Dec 02 09:15:00 crc kubenswrapper[4895]: I1202 09:15:00.300229 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/877abeb1-302a-42d0-8ba3-05353302b9fd-secret-volume\") pod \"collect-profiles-29411115-v7l6n\" (UID: \"877abeb1-302a-42d0-8ba3-05353302b9fd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411115-v7l6n" Dec 02 09:15:00 crc kubenswrapper[4895]: I1202 09:15:00.300316 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/877abeb1-302a-42d0-8ba3-05353302b9fd-config-volume\") pod \"collect-profiles-29411115-v7l6n\" (UID: \"877abeb1-302a-42d0-8ba3-05353302b9fd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411115-v7l6n" Dec 02 09:15:00 crc kubenswrapper[4895]: I1202 09:15:00.300354 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fw5tm\" (UniqueName: \"kubernetes.io/projected/877abeb1-302a-42d0-8ba3-05353302b9fd-kube-api-access-fw5tm\") pod \"collect-profiles-29411115-v7l6n\" (UID: \"877abeb1-302a-42d0-8ba3-05353302b9fd\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29411115-v7l6n" Dec 02 09:15:00 crc kubenswrapper[4895]: I1202 09:15:00.402267 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/877abeb1-302a-42d0-8ba3-05353302b9fd-secret-volume\") pod \"collect-profiles-29411115-v7l6n\" (UID: \"877abeb1-302a-42d0-8ba3-05353302b9fd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411115-v7l6n" Dec 02 09:15:00 crc kubenswrapper[4895]: I1202 09:15:00.402312 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/877abeb1-302a-42d0-8ba3-05353302b9fd-config-volume\") pod \"collect-profiles-29411115-v7l6n\" (UID: \"877abeb1-302a-42d0-8ba3-05353302b9fd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411115-v7l6n" Dec 02 09:15:00 crc kubenswrapper[4895]: I1202 09:15:00.402347 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fw5tm\" (UniqueName: \"kubernetes.io/projected/877abeb1-302a-42d0-8ba3-05353302b9fd-kube-api-access-fw5tm\") pod \"collect-profiles-29411115-v7l6n\" (UID: \"877abeb1-302a-42d0-8ba3-05353302b9fd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411115-v7l6n" Dec 02 09:15:00 crc kubenswrapper[4895]: I1202 09:15:00.403459 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/877abeb1-302a-42d0-8ba3-05353302b9fd-config-volume\") pod \"collect-profiles-29411115-v7l6n\" (UID: \"877abeb1-302a-42d0-8ba3-05353302b9fd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411115-v7l6n" Dec 02 09:15:00 crc kubenswrapper[4895]: I1202 09:15:00.410668 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/877abeb1-302a-42d0-8ba3-05353302b9fd-secret-volume\") pod \"collect-profiles-29411115-v7l6n\" (UID: \"877abeb1-302a-42d0-8ba3-05353302b9fd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411115-v7l6n" Dec 02 09:15:00 crc kubenswrapper[4895]: I1202 09:15:00.419303 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fw5tm\" (UniqueName: \"kubernetes.io/projected/877abeb1-302a-42d0-8ba3-05353302b9fd-kube-api-access-fw5tm\") pod \"collect-profiles-29411115-v7l6n\" (UID: \"877abeb1-302a-42d0-8ba3-05353302b9fd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411115-v7l6n" Dec 02 09:15:00 crc kubenswrapper[4895]: I1202 09:15:00.476165 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411115-v7l6n" Dec 02 09:15:01 crc kubenswrapper[4895]: I1202 09:15:01.004170 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411115-v7l6n"] Dec 02 09:15:01 crc kubenswrapper[4895]: I1202 09:15:01.651613 4895 generic.go:334] "Generic (PLEG): container finished" podID="877abeb1-302a-42d0-8ba3-05353302b9fd" containerID="d67f5c6d47528a97f5915d40ace3b5638df9ff731d460837082fbe43a6dc6d6b" exitCode=0 Dec 02 09:15:01 crc kubenswrapper[4895]: I1202 09:15:01.651676 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411115-v7l6n" event={"ID":"877abeb1-302a-42d0-8ba3-05353302b9fd","Type":"ContainerDied","Data":"d67f5c6d47528a97f5915d40ace3b5638df9ff731d460837082fbe43a6dc6d6b"} Dec 02 09:15:01 crc kubenswrapper[4895]: I1202 09:15:01.652029 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411115-v7l6n" 
event={"ID":"877abeb1-302a-42d0-8ba3-05353302b9fd","Type":"ContainerStarted","Data":"6a9e15a980781b20a8d00a265687b9697689954606c7002c54c0336b9f13eac3"} Dec 02 09:15:03 crc kubenswrapper[4895]: I1202 09:15:03.039634 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411115-v7l6n" Dec 02 09:15:03 crc kubenswrapper[4895]: I1202 09:15:03.166872 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/877abeb1-302a-42d0-8ba3-05353302b9fd-secret-volume\") pod \"877abeb1-302a-42d0-8ba3-05353302b9fd\" (UID: \"877abeb1-302a-42d0-8ba3-05353302b9fd\") " Dec 02 09:15:03 crc kubenswrapper[4895]: I1202 09:15:03.166977 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fw5tm\" (UniqueName: \"kubernetes.io/projected/877abeb1-302a-42d0-8ba3-05353302b9fd-kube-api-access-fw5tm\") pod \"877abeb1-302a-42d0-8ba3-05353302b9fd\" (UID: \"877abeb1-302a-42d0-8ba3-05353302b9fd\") " Dec 02 09:15:03 crc kubenswrapper[4895]: I1202 09:15:03.167028 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/877abeb1-302a-42d0-8ba3-05353302b9fd-config-volume\") pod \"877abeb1-302a-42d0-8ba3-05353302b9fd\" (UID: \"877abeb1-302a-42d0-8ba3-05353302b9fd\") " Dec 02 09:15:03 crc kubenswrapper[4895]: I1202 09:15:03.169260 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/877abeb1-302a-42d0-8ba3-05353302b9fd-config-volume" (OuterVolumeSpecName: "config-volume") pod "877abeb1-302a-42d0-8ba3-05353302b9fd" (UID: "877abeb1-302a-42d0-8ba3-05353302b9fd"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:15:03 crc kubenswrapper[4895]: I1202 09:15:03.172692 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/877abeb1-302a-42d0-8ba3-05353302b9fd-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "877abeb1-302a-42d0-8ba3-05353302b9fd" (UID: "877abeb1-302a-42d0-8ba3-05353302b9fd"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:15:03 crc kubenswrapper[4895]: I1202 09:15:03.178958 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/877abeb1-302a-42d0-8ba3-05353302b9fd-kube-api-access-fw5tm" (OuterVolumeSpecName: "kube-api-access-fw5tm") pod "877abeb1-302a-42d0-8ba3-05353302b9fd" (UID: "877abeb1-302a-42d0-8ba3-05353302b9fd"). InnerVolumeSpecName "kube-api-access-fw5tm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:15:03 crc kubenswrapper[4895]: I1202 09:15:03.270502 4895 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/877abeb1-302a-42d0-8ba3-05353302b9fd-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 02 09:15:03 crc kubenswrapper[4895]: I1202 09:15:03.270537 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fw5tm\" (UniqueName: \"kubernetes.io/projected/877abeb1-302a-42d0-8ba3-05353302b9fd-kube-api-access-fw5tm\") on node \"crc\" DevicePath \"\"" Dec 02 09:15:03 crc kubenswrapper[4895]: I1202 09:15:03.270547 4895 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/877abeb1-302a-42d0-8ba3-05353302b9fd-config-volume\") on node \"crc\" DevicePath \"\"" Dec 02 09:15:03 crc kubenswrapper[4895]: I1202 09:15:03.674871 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411115-v7l6n" 
event={"ID":"877abeb1-302a-42d0-8ba3-05353302b9fd","Type":"ContainerDied","Data":"6a9e15a980781b20a8d00a265687b9697689954606c7002c54c0336b9f13eac3"} Dec 02 09:15:03 crc kubenswrapper[4895]: I1202 09:15:03.675261 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6a9e15a980781b20a8d00a265687b9697689954606c7002c54c0336b9f13eac3" Dec 02 09:15:03 crc kubenswrapper[4895]: I1202 09:15:03.674930 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411115-v7l6n" Dec 02 09:15:04 crc kubenswrapper[4895]: I1202 09:15:04.133495 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411070-4t7z9"] Dec 02 09:15:04 crc kubenswrapper[4895]: I1202 09:15:04.144341 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411070-4t7z9"] Dec 02 09:15:05 crc kubenswrapper[4895]: I1202 09:15:05.154367 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52f3bf5b-6437-4d51-ab30-adef4a9f0753" path="/var/lib/kubelet/pods/52f3bf5b-6437-4d51-ab30-adef4a9f0753/volumes" Dec 02 09:15:38 crc kubenswrapper[4895]: I1202 09:15:38.278865 4895 scope.go:117] "RemoveContainer" containerID="857ea8d5b4f835cbbf213d62c5fa0279820ef248c2010fe8d06e9c9ff762ed94" Dec 02 09:16:10 crc kubenswrapper[4895]: I1202 09:16:10.044558 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-643c-account-create-update-tcbmr"] Dec 02 09:16:10 crc kubenswrapper[4895]: I1202 09:16:10.057253 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-create-r2hxs"] Dec 02 09:16:10 crc kubenswrapper[4895]: I1202 09:16:10.069083 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-643c-account-create-update-tcbmr"] Dec 02 09:16:10 crc kubenswrapper[4895]: I1202 09:16:10.079646 4895 kubelet.go:2431] 
"SyncLoop REMOVE" source="api" pods=["openstack/heat-db-create-r2hxs"] Dec 02 09:16:11 crc kubenswrapper[4895]: I1202 09:16:11.157436 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03d8163d-5dd1-428e-9b6c-ab2eeaa5a010" path="/var/lib/kubelet/pods/03d8163d-5dd1-428e-9b6c-ab2eeaa5a010/volumes" Dec 02 09:16:11 crc kubenswrapper[4895]: I1202 09:16:11.158597 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e655f0bd-0705-4b88-b0c8-d9c184c96f99" path="/var/lib/kubelet/pods/e655f0bd-0705-4b88-b0c8-d9c184c96f99/volumes" Dec 02 09:16:26 crc kubenswrapper[4895]: I1202 09:16:26.046469 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-sync-g4ljc"] Dec 02 09:16:26 crc kubenswrapper[4895]: I1202 09:16:26.055401 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-sync-g4ljc"] Dec 02 09:16:27 crc kubenswrapper[4895]: I1202 09:16:27.154030 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c787d13-9ab0-4db2-842f-51cf06ef21dd" path="/var/lib/kubelet/pods/2c787d13-9ab0-4db2-842f-51cf06ef21dd/volumes" Dec 02 09:16:38 crc kubenswrapper[4895]: I1202 09:16:38.366774 4895 scope.go:117] "RemoveContainer" containerID="47e7bd5c4c676f0ff821cbaa0d0e797e4abe7b4e57380d493827b73c46f73b70" Dec 02 09:16:38 crc kubenswrapper[4895]: I1202 09:16:38.394647 4895 scope.go:117] "RemoveContainer" containerID="9b13260e87d8a484148a6bf45a00d1cb6889f8c80e081413d6c1b7f1bcfb22e5" Dec 02 09:16:38 crc kubenswrapper[4895]: I1202 09:16:38.463268 4895 scope.go:117] "RemoveContainer" containerID="10c4c16cd26a64bae6114cb3265f70e04fa3f40bde709b3a1dc0b6e517392aa8" Dec 02 09:17:05 crc kubenswrapper[4895]: I1202 09:17:05.474175 4895 patch_prober.go:28] interesting pod/machine-config-daemon-wfcg7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" start-of-body= Dec 02 09:17:05 crc kubenswrapper[4895]: I1202 09:17:05.475016 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 09:17:25 crc kubenswrapper[4895]: I1202 09:17:25.201501 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-6tlgx"] Dec 02 09:17:25 crc kubenswrapper[4895]: E1202 09:17:25.202945 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="877abeb1-302a-42d0-8ba3-05353302b9fd" containerName="collect-profiles" Dec 02 09:17:25 crc kubenswrapper[4895]: I1202 09:17:25.202962 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="877abeb1-302a-42d0-8ba3-05353302b9fd" containerName="collect-profiles" Dec 02 09:17:25 crc kubenswrapper[4895]: I1202 09:17:25.203210 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="877abeb1-302a-42d0-8ba3-05353302b9fd" containerName="collect-profiles" Dec 02 09:17:25 crc kubenswrapper[4895]: I1202 09:17:25.205160 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6tlgx" Dec 02 09:17:25 crc kubenswrapper[4895]: I1202 09:17:25.219612 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6tlgx"] Dec 02 09:17:25 crc kubenswrapper[4895]: I1202 09:17:25.330387 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a53185b-d2a6-48cc-9dfe-e6d3f26580ff-catalog-content\") pod \"certified-operators-6tlgx\" (UID: \"3a53185b-d2a6-48cc-9dfe-e6d3f26580ff\") " pod="openshift-marketplace/certified-operators-6tlgx" Dec 02 09:17:25 crc kubenswrapper[4895]: I1202 09:17:25.330476 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzdbh\" (UniqueName: \"kubernetes.io/projected/3a53185b-d2a6-48cc-9dfe-e6d3f26580ff-kube-api-access-pzdbh\") pod \"certified-operators-6tlgx\" (UID: \"3a53185b-d2a6-48cc-9dfe-e6d3f26580ff\") " pod="openshift-marketplace/certified-operators-6tlgx" Dec 02 09:17:25 crc kubenswrapper[4895]: I1202 09:17:25.330655 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a53185b-d2a6-48cc-9dfe-e6d3f26580ff-utilities\") pod \"certified-operators-6tlgx\" (UID: \"3a53185b-d2a6-48cc-9dfe-e6d3f26580ff\") " pod="openshift-marketplace/certified-operators-6tlgx" Dec 02 09:17:25 crc kubenswrapper[4895]: I1202 09:17:25.433334 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a53185b-d2a6-48cc-9dfe-e6d3f26580ff-utilities\") pod \"certified-operators-6tlgx\" (UID: \"3a53185b-d2a6-48cc-9dfe-e6d3f26580ff\") " pod="openshift-marketplace/certified-operators-6tlgx" Dec 02 09:17:25 crc kubenswrapper[4895]: I1202 09:17:25.433513 4895 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a53185b-d2a6-48cc-9dfe-e6d3f26580ff-catalog-content\") pod \"certified-operators-6tlgx\" (UID: \"3a53185b-d2a6-48cc-9dfe-e6d3f26580ff\") " pod="openshift-marketplace/certified-operators-6tlgx" Dec 02 09:17:25 crc kubenswrapper[4895]: I1202 09:17:25.433564 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pzdbh\" (UniqueName: \"kubernetes.io/projected/3a53185b-d2a6-48cc-9dfe-e6d3f26580ff-kube-api-access-pzdbh\") pod \"certified-operators-6tlgx\" (UID: \"3a53185b-d2a6-48cc-9dfe-e6d3f26580ff\") " pod="openshift-marketplace/certified-operators-6tlgx" Dec 02 09:17:25 crc kubenswrapper[4895]: I1202 09:17:25.433861 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a53185b-d2a6-48cc-9dfe-e6d3f26580ff-utilities\") pod \"certified-operators-6tlgx\" (UID: \"3a53185b-d2a6-48cc-9dfe-e6d3f26580ff\") " pod="openshift-marketplace/certified-operators-6tlgx" Dec 02 09:17:25 crc kubenswrapper[4895]: I1202 09:17:25.433917 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a53185b-d2a6-48cc-9dfe-e6d3f26580ff-catalog-content\") pod \"certified-operators-6tlgx\" (UID: \"3a53185b-d2a6-48cc-9dfe-e6d3f26580ff\") " pod="openshift-marketplace/certified-operators-6tlgx" Dec 02 09:17:25 crc kubenswrapper[4895]: I1202 09:17:25.459423 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzdbh\" (UniqueName: \"kubernetes.io/projected/3a53185b-d2a6-48cc-9dfe-e6d3f26580ff-kube-api-access-pzdbh\") pod \"certified-operators-6tlgx\" (UID: \"3a53185b-d2a6-48cc-9dfe-e6d3f26580ff\") " pod="openshift-marketplace/certified-operators-6tlgx" Dec 02 09:17:25 crc kubenswrapper[4895]: I1202 09:17:25.541006 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6tlgx" Dec 02 09:17:26 crc kubenswrapper[4895]: I1202 09:17:26.268593 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6tlgx"] Dec 02 09:17:27 crc kubenswrapper[4895]: I1202 09:17:27.168586 4895 generic.go:334] "Generic (PLEG): container finished" podID="3a53185b-d2a6-48cc-9dfe-e6d3f26580ff" containerID="098eb40fa846984af6ead81704f9e54f0a347f15cc37abb92c2b597b4131f0fb" exitCode=0 Dec 02 09:17:27 crc kubenswrapper[4895]: I1202 09:17:27.168790 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6tlgx" event={"ID":"3a53185b-d2a6-48cc-9dfe-e6d3f26580ff","Type":"ContainerDied","Data":"098eb40fa846984af6ead81704f9e54f0a347f15cc37abb92c2b597b4131f0fb"} Dec 02 09:17:27 crc kubenswrapper[4895]: I1202 09:17:27.168985 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6tlgx" event={"ID":"3a53185b-d2a6-48cc-9dfe-e6d3f26580ff","Type":"ContainerStarted","Data":"349fb33575fb02088f991e27185eec25b0e7ac51d94bc617b30791d0ff135c73"} Dec 02 09:17:29 crc kubenswrapper[4895]: I1202 09:17:29.195217 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6tlgx" event={"ID":"3a53185b-d2a6-48cc-9dfe-e6d3f26580ff","Type":"ContainerStarted","Data":"537c0a1ff872a138a2a1c25b3aa51c26ba557e3b03258cd67eb2a0b1dff39dba"} Dec 02 09:17:30 crc kubenswrapper[4895]: I1202 09:17:30.206677 4895 generic.go:334] "Generic (PLEG): container finished" podID="3a53185b-d2a6-48cc-9dfe-e6d3f26580ff" containerID="537c0a1ff872a138a2a1c25b3aa51c26ba557e3b03258cd67eb2a0b1dff39dba" exitCode=0 Dec 02 09:17:30 crc kubenswrapper[4895]: I1202 09:17:30.206778 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6tlgx" 
event={"ID":"3a53185b-d2a6-48cc-9dfe-e6d3f26580ff","Type":"ContainerDied","Data":"537c0a1ff872a138a2a1c25b3aa51c26ba557e3b03258cd67eb2a0b1dff39dba"} Dec 02 09:17:32 crc kubenswrapper[4895]: I1202 09:17:32.232726 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6tlgx" event={"ID":"3a53185b-d2a6-48cc-9dfe-e6d3f26580ff","Type":"ContainerStarted","Data":"31ebe132f9b135bd03c082e8256e831cb14e5d309837248f1a5fcc33731130f6"} Dec 02 09:17:35 crc kubenswrapper[4895]: I1202 09:17:35.473723 4895 patch_prober.go:28] interesting pod/machine-config-daemon-wfcg7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 09:17:35 crc kubenswrapper[4895]: I1202 09:17:35.474409 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 09:17:35 crc kubenswrapper[4895]: I1202 09:17:35.541522 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-6tlgx" Dec 02 09:17:35 crc kubenswrapper[4895]: I1202 09:17:35.541579 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-6tlgx" Dec 02 09:17:35 crc kubenswrapper[4895]: I1202 09:17:35.589053 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-6tlgx" Dec 02 09:17:35 crc kubenswrapper[4895]: I1202 09:17:35.613342 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-6tlgx" 
podStartSLOduration=6.775714107 podStartE2EDuration="10.613323451s" podCreationTimestamp="2025-12-02 09:17:25 +0000 UTC" firstStartedPulling="2025-12-02 09:17:27.172333732 +0000 UTC m=+6858.343193355" lastFinishedPulling="2025-12-02 09:17:31.009943086 +0000 UTC m=+6862.180802699" observedRunningTime="2025-12-02 09:17:32.272899085 +0000 UTC m=+6863.443758698" watchObservedRunningTime="2025-12-02 09:17:35.613323451 +0000 UTC m=+6866.784183064" Dec 02 09:17:36 crc kubenswrapper[4895]: I1202 09:17:36.333817 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-6tlgx" Dec 02 09:17:36 crc kubenswrapper[4895]: I1202 09:17:36.381580 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6tlgx"] Dec 02 09:17:38 crc kubenswrapper[4895]: I1202 09:17:38.301400 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-6tlgx" podUID="3a53185b-d2a6-48cc-9dfe-e6d3f26580ff" containerName="registry-server" containerID="cri-o://31ebe132f9b135bd03c082e8256e831cb14e5d309837248f1a5fcc33731130f6" gracePeriod=2 Dec 02 09:17:38 crc kubenswrapper[4895]: I1202 09:17:38.806420 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6tlgx" Dec 02 09:17:38 crc kubenswrapper[4895]: I1202 09:17:38.978406 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a53185b-d2a6-48cc-9dfe-e6d3f26580ff-catalog-content\") pod \"3a53185b-d2a6-48cc-9dfe-e6d3f26580ff\" (UID: \"3a53185b-d2a6-48cc-9dfe-e6d3f26580ff\") " Dec 02 09:17:38 crc kubenswrapper[4895]: I1202 09:17:38.978502 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pzdbh\" (UniqueName: \"kubernetes.io/projected/3a53185b-d2a6-48cc-9dfe-e6d3f26580ff-kube-api-access-pzdbh\") pod \"3a53185b-d2a6-48cc-9dfe-e6d3f26580ff\" (UID: \"3a53185b-d2a6-48cc-9dfe-e6d3f26580ff\") " Dec 02 09:17:38 crc kubenswrapper[4895]: I1202 09:17:38.978650 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a53185b-d2a6-48cc-9dfe-e6d3f26580ff-utilities\") pod \"3a53185b-d2a6-48cc-9dfe-e6d3f26580ff\" (UID: \"3a53185b-d2a6-48cc-9dfe-e6d3f26580ff\") " Dec 02 09:17:38 crc kubenswrapper[4895]: I1202 09:17:38.979366 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a53185b-d2a6-48cc-9dfe-e6d3f26580ff-utilities" (OuterVolumeSpecName: "utilities") pod "3a53185b-d2a6-48cc-9dfe-e6d3f26580ff" (UID: "3a53185b-d2a6-48cc-9dfe-e6d3f26580ff"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:17:38 crc kubenswrapper[4895]: I1202 09:17:38.986127 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a53185b-d2a6-48cc-9dfe-e6d3f26580ff-kube-api-access-pzdbh" (OuterVolumeSpecName: "kube-api-access-pzdbh") pod "3a53185b-d2a6-48cc-9dfe-e6d3f26580ff" (UID: "3a53185b-d2a6-48cc-9dfe-e6d3f26580ff"). InnerVolumeSpecName "kube-api-access-pzdbh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:17:39 crc kubenswrapper[4895]: I1202 09:17:39.029846 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a53185b-d2a6-48cc-9dfe-e6d3f26580ff-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3a53185b-d2a6-48cc-9dfe-e6d3f26580ff" (UID: "3a53185b-d2a6-48cc-9dfe-e6d3f26580ff"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:17:39 crc kubenswrapper[4895]: I1202 09:17:39.080956 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pzdbh\" (UniqueName: \"kubernetes.io/projected/3a53185b-d2a6-48cc-9dfe-e6d3f26580ff-kube-api-access-pzdbh\") on node \"crc\" DevicePath \"\"" Dec 02 09:17:39 crc kubenswrapper[4895]: I1202 09:17:39.080986 4895 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a53185b-d2a6-48cc-9dfe-e6d3f26580ff-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 09:17:39 crc kubenswrapper[4895]: I1202 09:17:39.081025 4895 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a53185b-d2a6-48cc-9dfe-e6d3f26580ff-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 09:17:39 crc kubenswrapper[4895]: I1202 09:17:39.316031 4895 generic.go:334] "Generic (PLEG): container finished" podID="3a53185b-d2a6-48cc-9dfe-e6d3f26580ff" containerID="31ebe132f9b135bd03c082e8256e831cb14e5d309837248f1a5fcc33731130f6" exitCode=0 Dec 02 09:17:39 crc kubenswrapper[4895]: I1202 09:17:39.316076 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6tlgx" event={"ID":"3a53185b-d2a6-48cc-9dfe-e6d3f26580ff","Type":"ContainerDied","Data":"31ebe132f9b135bd03c082e8256e831cb14e5d309837248f1a5fcc33731130f6"} Dec 02 09:17:39 crc kubenswrapper[4895]: I1202 09:17:39.316108 4895 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6tlgx" Dec 02 09:17:39 crc kubenswrapper[4895]: I1202 09:17:39.316132 4895 scope.go:117] "RemoveContainer" containerID="31ebe132f9b135bd03c082e8256e831cb14e5d309837248f1a5fcc33731130f6" Dec 02 09:17:39 crc kubenswrapper[4895]: I1202 09:17:39.316104 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6tlgx" event={"ID":"3a53185b-d2a6-48cc-9dfe-e6d3f26580ff","Type":"ContainerDied","Data":"349fb33575fb02088f991e27185eec25b0e7ac51d94bc617b30791d0ff135c73"} Dec 02 09:17:39 crc kubenswrapper[4895]: I1202 09:17:39.345115 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6tlgx"] Dec 02 09:17:39 crc kubenswrapper[4895]: I1202 09:17:39.352763 4895 scope.go:117] "RemoveContainer" containerID="537c0a1ff872a138a2a1c25b3aa51c26ba557e3b03258cd67eb2a0b1dff39dba" Dec 02 09:17:39 crc kubenswrapper[4895]: I1202 09:17:39.361011 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-6tlgx"] Dec 02 09:17:39 crc kubenswrapper[4895]: I1202 09:17:39.379803 4895 scope.go:117] "RemoveContainer" containerID="098eb40fa846984af6ead81704f9e54f0a347f15cc37abb92c2b597b4131f0fb" Dec 02 09:17:39 crc kubenswrapper[4895]: I1202 09:17:39.448022 4895 scope.go:117] "RemoveContainer" containerID="31ebe132f9b135bd03c082e8256e831cb14e5d309837248f1a5fcc33731130f6" Dec 02 09:17:39 crc kubenswrapper[4895]: E1202 09:17:39.448680 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31ebe132f9b135bd03c082e8256e831cb14e5d309837248f1a5fcc33731130f6\": container with ID starting with 31ebe132f9b135bd03c082e8256e831cb14e5d309837248f1a5fcc33731130f6 not found: ID does not exist" containerID="31ebe132f9b135bd03c082e8256e831cb14e5d309837248f1a5fcc33731130f6" Dec 02 09:17:39 crc kubenswrapper[4895]: I1202 09:17:39.448871 
4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31ebe132f9b135bd03c082e8256e831cb14e5d309837248f1a5fcc33731130f6"} err="failed to get container status \"31ebe132f9b135bd03c082e8256e831cb14e5d309837248f1a5fcc33731130f6\": rpc error: code = NotFound desc = could not find container \"31ebe132f9b135bd03c082e8256e831cb14e5d309837248f1a5fcc33731130f6\": container with ID starting with 31ebe132f9b135bd03c082e8256e831cb14e5d309837248f1a5fcc33731130f6 not found: ID does not exist" Dec 02 09:17:39 crc kubenswrapper[4895]: I1202 09:17:39.448959 4895 scope.go:117] "RemoveContainer" containerID="537c0a1ff872a138a2a1c25b3aa51c26ba557e3b03258cd67eb2a0b1dff39dba" Dec 02 09:17:39 crc kubenswrapper[4895]: E1202 09:17:39.449922 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"537c0a1ff872a138a2a1c25b3aa51c26ba557e3b03258cd67eb2a0b1dff39dba\": container with ID starting with 537c0a1ff872a138a2a1c25b3aa51c26ba557e3b03258cd67eb2a0b1dff39dba not found: ID does not exist" containerID="537c0a1ff872a138a2a1c25b3aa51c26ba557e3b03258cd67eb2a0b1dff39dba" Dec 02 09:17:39 crc kubenswrapper[4895]: I1202 09:17:39.450170 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"537c0a1ff872a138a2a1c25b3aa51c26ba557e3b03258cd67eb2a0b1dff39dba"} err="failed to get container status \"537c0a1ff872a138a2a1c25b3aa51c26ba557e3b03258cd67eb2a0b1dff39dba\": rpc error: code = NotFound desc = could not find container \"537c0a1ff872a138a2a1c25b3aa51c26ba557e3b03258cd67eb2a0b1dff39dba\": container with ID starting with 537c0a1ff872a138a2a1c25b3aa51c26ba557e3b03258cd67eb2a0b1dff39dba not found: ID does not exist" Dec 02 09:17:39 crc kubenswrapper[4895]: I1202 09:17:39.450204 4895 scope.go:117] "RemoveContainer" containerID="098eb40fa846984af6ead81704f9e54f0a347f15cc37abb92c2b597b4131f0fb" Dec 02 09:17:39 crc kubenswrapper[4895]: E1202 
09:17:39.450527 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"098eb40fa846984af6ead81704f9e54f0a347f15cc37abb92c2b597b4131f0fb\": container with ID starting with 098eb40fa846984af6ead81704f9e54f0a347f15cc37abb92c2b597b4131f0fb not found: ID does not exist" containerID="098eb40fa846984af6ead81704f9e54f0a347f15cc37abb92c2b597b4131f0fb" Dec 02 09:17:39 crc kubenswrapper[4895]: I1202 09:17:39.450565 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"098eb40fa846984af6ead81704f9e54f0a347f15cc37abb92c2b597b4131f0fb"} err="failed to get container status \"098eb40fa846984af6ead81704f9e54f0a347f15cc37abb92c2b597b4131f0fb\": rpc error: code = NotFound desc = could not find container \"098eb40fa846984af6ead81704f9e54f0a347f15cc37abb92c2b597b4131f0fb\": container with ID starting with 098eb40fa846984af6ead81704f9e54f0a347f15cc37abb92c2b597b4131f0fb not found: ID does not exist" Dec 02 09:17:41 crc kubenswrapper[4895]: I1202 09:17:41.152288 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a53185b-d2a6-48cc-9dfe-e6d3f26580ff" path="/var/lib/kubelet/pods/3a53185b-d2a6-48cc-9dfe-e6d3f26580ff/volumes" Dec 02 09:18:05 crc kubenswrapper[4895]: I1202 09:18:05.473303 4895 patch_prober.go:28] interesting pod/machine-config-daemon-wfcg7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 09:18:05 crc kubenswrapper[4895]: I1202 09:18:05.473948 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Dec 02 09:18:05 crc kubenswrapper[4895]: I1202 09:18:05.473997 4895 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" Dec 02 09:18:05 crc kubenswrapper[4895]: I1202 09:18:05.474873 4895 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"30d1f1705497456a9817e446749ca57dc7425e94bd89bbebe71606626901b02e"} pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 09:18:05 crc kubenswrapper[4895]: I1202 09:18:05.474928 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerName="machine-config-daemon" containerID="cri-o://30d1f1705497456a9817e446749ca57dc7425e94bd89bbebe71606626901b02e" gracePeriod=600 Dec 02 09:18:05 crc kubenswrapper[4895]: I1202 09:18:05.622282 4895 generic.go:334] "Generic (PLEG): container finished" podID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerID="30d1f1705497456a9817e446749ca57dc7425e94bd89bbebe71606626901b02e" exitCode=0 Dec 02 09:18:05 crc kubenswrapper[4895]: I1202 09:18:05.622339 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" event={"ID":"0468d2d1-a975-45a6-af9f-47adc432fab0","Type":"ContainerDied","Data":"30d1f1705497456a9817e446749ca57dc7425e94bd89bbebe71606626901b02e"} Dec 02 09:18:05 crc kubenswrapper[4895]: I1202 09:18:05.622381 4895 scope.go:117] "RemoveContainer" containerID="d41b8f042da7f202a665565fffb111027944b6b1623344c1475552b27b519444" Dec 02 09:18:06 crc kubenswrapper[4895]: I1202 09:18:06.641581 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" 
event={"ID":"0468d2d1-a975-45a6-af9f-47adc432fab0","Type":"ContainerStarted","Data":"b0891d6e2df87d9528c2e39e6da0ad0c4eeb270463ef0e1dee0e8c959775531a"} Dec 02 09:18:41 crc kubenswrapper[4895]: I1202 09:18:41.051795 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-1c7c-account-create-update-dmjp8"] Dec 02 09:18:41 crc kubenswrapper[4895]: I1202 09:18:41.065231 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-create-qjtt9"] Dec 02 09:18:41 crc kubenswrapper[4895]: I1202 09:18:41.075242 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-1c7c-account-create-update-dmjp8"] Dec 02 09:18:41 crc kubenswrapper[4895]: I1202 09:18:41.083395 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-create-qjtt9"] Dec 02 09:18:41 crc kubenswrapper[4895]: I1202 09:18:41.155997 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f267667-a26a-45c5-ab52-4d3b50b3ad17" path="/var/lib/kubelet/pods/4f267667-a26a-45c5-ab52-4d3b50b3ad17/volumes" Dec 02 09:18:41 crc kubenswrapper[4895]: I1202 09:18:41.156617 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e11aaedb-75a3-4da5-b86e-b358bd041d4e" path="/var/lib/kubelet/pods/e11aaedb-75a3-4da5-b86e-b358bd041d4e/volumes" Dec 02 09:18:53 crc kubenswrapper[4895]: I1202 09:18:53.048115 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-sync-dwb4g"] Dec 02 09:18:53 crc kubenswrapper[4895]: I1202 09:18:53.059340 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-sync-dwb4g"] Dec 02 09:18:53 crc kubenswrapper[4895]: I1202 09:18:53.160930 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2734fba4-4ac9-425b-8c0e-68702868de3e" path="/var/lib/kubelet/pods/2734fba4-4ac9-425b-8c0e-68702868de3e/volumes" Dec 02 09:19:19 crc kubenswrapper[4895]: I1202 09:19:19.039910 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/manila-db-create-5htr7"] Dec 02 09:19:19 crc kubenswrapper[4895]: I1202 09:19:19.049716 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-90d7-account-create-update-8786c"] Dec 02 09:19:19 crc kubenswrapper[4895]: I1202 09:19:19.060513 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-90d7-account-create-update-8786c"] Dec 02 09:19:19 crc kubenswrapper[4895]: I1202 09:19:19.069557 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-create-5htr7"] Dec 02 09:19:19 crc kubenswrapper[4895]: I1202 09:19:19.153011 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c628159-a2bd-4d35-b4a5-9d9e1e588259" path="/var/lib/kubelet/pods/5c628159-a2bd-4d35-b4a5-9d9e1e588259/volumes" Dec 02 09:19:19 crc kubenswrapper[4895]: I1202 09:19:19.153607 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc12c784-6db5-424b-b4e9-a7c0ce42ebb1" path="/var/lib/kubelet/pods/fc12c784-6db5-424b-b4e9-a7c0ce42ebb1/volumes" Dec 02 09:19:31 crc kubenswrapper[4895]: I1202 09:19:31.035643 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-sync-f55lv"] Dec 02 09:19:31 crc kubenswrapper[4895]: I1202 09:19:31.045792 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-sync-f55lv"] Dec 02 09:19:31 crc kubenswrapper[4895]: I1202 09:19:31.156377 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="780ef2e7-bdb3-4c45-98bc-64659b5a19a6" path="/var/lib/kubelet/pods/780ef2e7-bdb3-4c45-98bc-64659b5a19a6/volumes" Dec 02 09:19:38 crc kubenswrapper[4895]: I1202 09:19:38.617040 4895 scope.go:117] "RemoveContainer" containerID="fecf3f6e6d9a0db6f9fe64d641b47f8c3c68938466b6dd0098465b48c3b548ae" Dec 02 09:19:38 crc kubenswrapper[4895]: I1202 09:19:38.643843 4895 scope.go:117] "RemoveContainer" containerID="947435df6cfbae4ded1a38aee306494fed9102fb6231d255e5fe692ddac36119" Dec 02 09:19:38 crc 
kubenswrapper[4895]: I1202 09:19:38.689402 4895 scope.go:117] "RemoveContainer" containerID="3725a56d05f97712d364b9fb0fe73ef743ce2be2d199c44a30baf4677c6018ab" Dec 02 09:19:38 crc kubenswrapper[4895]: I1202 09:19:38.744976 4895 scope.go:117] "RemoveContainer" containerID="dd7f74a0de2f532d393eb84b88aca17b7252df402340db56ac66ea3ef532630a" Dec 02 09:19:38 crc kubenswrapper[4895]: I1202 09:19:38.801130 4895 scope.go:117] "RemoveContainer" containerID="903e854846e2e475f2fdfc57d3ee2aa0cab38c1cf3d4137e7bed182ae0077a3c" Dec 02 09:19:38 crc kubenswrapper[4895]: I1202 09:19:38.835321 4895 scope.go:117] "RemoveContainer" containerID="db39de8422dbc6f6c72e457cdc99b283ab01473e32740b50d8ef0608dee5bbd2" Dec 02 09:20:05 crc kubenswrapper[4895]: I1202 09:20:05.474207 4895 patch_prober.go:28] interesting pod/machine-config-daemon-wfcg7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 09:20:05 crc kubenswrapper[4895]: I1202 09:20:05.474938 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 09:20:35 crc kubenswrapper[4895]: I1202 09:20:35.473291 4895 patch_prober.go:28] interesting pod/machine-config-daemon-wfcg7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 09:20:35 crc kubenswrapper[4895]: I1202 09:20:35.473979 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" 
podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 09:21:05 crc kubenswrapper[4895]: I1202 09:21:05.473608 4895 patch_prober.go:28] interesting pod/machine-config-daemon-wfcg7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 09:21:05 crc kubenswrapper[4895]: I1202 09:21:05.474363 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 09:21:05 crc kubenswrapper[4895]: I1202 09:21:05.474525 4895 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" Dec 02 09:21:05 crc kubenswrapper[4895]: I1202 09:21:05.476048 4895 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b0891d6e2df87d9528c2e39e6da0ad0c4eeb270463ef0e1dee0e8c959775531a"} pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 09:21:05 crc kubenswrapper[4895]: I1202 09:21:05.476156 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerName="machine-config-daemon" containerID="cri-o://b0891d6e2df87d9528c2e39e6da0ad0c4eeb270463ef0e1dee0e8c959775531a" gracePeriod=600 Dec 02 
09:21:05 crc kubenswrapper[4895]: E1202 09:21:05.597657 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:21:06 crc kubenswrapper[4895]: I1202 09:21:06.571484 4895 generic.go:334] "Generic (PLEG): container finished" podID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerID="b0891d6e2df87d9528c2e39e6da0ad0c4eeb270463ef0e1dee0e8c959775531a" exitCode=0 Dec 02 09:21:06 crc kubenswrapper[4895]: I1202 09:21:06.571553 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" event={"ID":"0468d2d1-a975-45a6-af9f-47adc432fab0","Type":"ContainerDied","Data":"b0891d6e2df87d9528c2e39e6da0ad0c4eeb270463ef0e1dee0e8c959775531a"} Dec 02 09:21:06 crc kubenswrapper[4895]: I1202 09:21:06.571963 4895 scope.go:117] "RemoveContainer" containerID="30d1f1705497456a9817e446749ca57dc7425e94bd89bbebe71606626901b02e" Dec 02 09:21:06 crc kubenswrapper[4895]: I1202 09:21:06.572879 4895 scope.go:117] "RemoveContainer" containerID="b0891d6e2df87d9528c2e39e6da0ad0c4eeb270463ef0e1dee0e8c959775531a" Dec 02 09:21:06 crc kubenswrapper[4895]: E1202 09:21:06.573401 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:21:18 crc kubenswrapper[4895]: I1202 09:21:18.141662 4895 
scope.go:117] "RemoveContainer" containerID="b0891d6e2df87d9528c2e39e6da0ad0c4eeb270463ef0e1dee0e8c959775531a" Dec 02 09:21:18 crc kubenswrapper[4895]: E1202 09:21:18.142635 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:21:29 crc kubenswrapper[4895]: I1202 09:21:29.148407 4895 scope.go:117] "RemoveContainer" containerID="b0891d6e2df87d9528c2e39e6da0ad0c4eeb270463ef0e1dee0e8c959775531a" Dec 02 09:21:29 crc kubenswrapper[4895]: E1202 09:21:29.149311 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:21:44 crc kubenswrapper[4895]: I1202 09:21:44.140516 4895 scope.go:117] "RemoveContainer" containerID="b0891d6e2df87d9528c2e39e6da0ad0c4eeb270463ef0e1dee0e8c959775531a" Dec 02 09:21:44 crc kubenswrapper[4895]: E1202 09:21:44.141361 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:21:56 crc kubenswrapper[4895]: I1202 
09:21:56.142332 4895 scope.go:117] "RemoveContainer" containerID="b0891d6e2df87d9528c2e39e6da0ad0c4eeb270463ef0e1dee0e8c959775531a" Dec 02 09:21:56 crc kubenswrapper[4895]: E1202 09:21:56.143623 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:22:09 crc kubenswrapper[4895]: I1202 09:22:09.150005 4895 scope.go:117] "RemoveContainer" containerID="b0891d6e2df87d9528c2e39e6da0ad0c4eeb270463ef0e1dee0e8c959775531a" Dec 02 09:22:09 crc kubenswrapper[4895]: E1202 09:22:09.152039 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:22:20 crc kubenswrapper[4895]: I1202 09:22:20.141507 4895 scope.go:117] "RemoveContainer" containerID="b0891d6e2df87d9528c2e39e6da0ad0c4eeb270463ef0e1dee0e8c959775531a" Dec 02 09:22:20 crc kubenswrapper[4895]: E1202 09:22:20.142332 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:22:35 crc 
kubenswrapper[4895]: I1202 09:22:34.141491 4895 scope.go:117] "RemoveContainer" containerID="b0891d6e2df87d9528c2e39e6da0ad0c4eeb270463ef0e1dee0e8c959775531a" Dec 02 09:22:35 crc kubenswrapper[4895]: E1202 09:22:34.142426 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:22:37 crc kubenswrapper[4895]: I1202 09:22:37.745841 4895 generic.go:334] "Generic (PLEG): container finished" podID="fdefa681-0c42-4f77-81e9-19fc3ae7a940" containerID="98b6665c802e1a61fb8552f44a2b8476061bd6f8a199478f50fc7abc6a3d74ba" exitCode=0 Dec 02 09:22:37 crc kubenswrapper[4895]: I1202 09:22:37.745953 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-j4z2k" event={"ID":"fdefa681-0c42-4f77-81e9-19fc3ae7a940","Type":"ContainerDied","Data":"98b6665c802e1a61fb8552f44a2b8476061bd6f8a199478f50fc7abc6a3d74ba"} Dec 02 09:22:39 crc kubenswrapper[4895]: I1202 09:22:39.275187 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-j4z2k" Dec 02 09:22:39 crc kubenswrapper[4895]: I1202 09:22:39.416639 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fdefa681-0c42-4f77-81e9-19fc3ae7a940-inventory\") pod \"fdefa681-0c42-4f77-81e9-19fc3ae7a940\" (UID: \"fdefa681-0c42-4f77-81e9-19fc3ae7a940\") " Dec 02 09:22:39 crc kubenswrapper[4895]: I1202 09:22:39.417092 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/fdefa681-0c42-4f77-81e9-19fc3ae7a940-ceph\") pod \"fdefa681-0c42-4f77-81e9-19fc3ae7a940\" (UID: \"fdefa681-0c42-4f77-81e9-19fc3ae7a940\") " Dec 02 09:22:39 crc kubenswrapper[4895]: I1202 09:22:39.417232 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cdn4g\" (UniqueName: \"kubernetes.io/projected/fdefa681-0c42-4f77-81e9-19fc3ae7a940-kube-api-access-cdn4g\") pod \"fdefa681-0c42-4f77-81e9-19fc3ae7a940\" (UID: \"fdefa681-0c42-4f77-81e9-19fc3ae7a940\") " Dec 02 09:22:39 crc kubenswrapper[4895]: I1202 09:22:39.417401 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fdefa681-0c42-4f77-81e9-19fc3ae7a940-tripleo-cleanup-combined-ca-bundle\") pod \"fdefa681-0c42-4f77-81e9-19fc3ae7a940\" (UID: \"fdefa681-0c42-4f77-81e9-19fc3ae7a940\") " Dec 02 09:22:39 crc kubenswrapper[4895]: I1202 09:22:39.417514 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fdefa681-0c42-4f77-81e9-19fc3ae7a940-ssh-key\") pod \"fdefa681-0c42-4f77-81e9-19fc3ae7a940\" (UID: \"fdefa681-0c42-4f77-81e9-19fc3ae7a940\") " Dec 02 09:22:39 crc kubenswrapper[4895]: I1202 09:22:39.422982 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/projected/fdefa681-0c42-4f77-81e9-19fc3ae7a940-kube-api-access-cdn4g" (OuterVolumeSpecName: "kube-api-access-cdn4g") pod "fdefa681-0c42-4f77-81e9-19fc3ae7a940" (UID: "fdefa681-0c42-4f77-81e9-19fc3ae7a940"). InnerVolumeSpecName "kube-api-access-cdn4g". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:22:39 crc kubenswrapper[4895]: I1202 09:22:39.423234 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fdefa681-0c42-4f77-81e9-19fc3ae7a940-ceph" (OuterVolumeSpecName: "ceph") pod "fdefa681-0c42-4f77-81e9-19fc3ae7a940" (UID: "fdefa681-0c42-4f77-81e9-19fc3ae7a940"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:22:39 crc kubenswrapper[4895]: I1202 09:22:39.423900 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fdefa681-0c42-4f77-81e9-19fc3ae7a940-tripleo-cleanup-combined-ca-bundle" (OuterVolumeSpecName: "tripleo-cleanup-combined-ca-bundle") pod "fdefa681-0c42-4f77-81e9-19fc3ae7a940" (UID: "fdefa681-0c42-4f77-81e9-19fc3ae7a940"). InnerVolumeSpecName "tripleo-cleanup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:22:39 crc kubenswrapper[4895]: I1202 09:22:39.448776 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fdefa681-0c42-4f77-81e9-19fc3ae7a940-inventory" (OuterVolumeSpecName: "inventory") pod "fdefa681-0c42-4f77-81e9-19fc3ae7a940" (UID: "fdefa681-0c42-4f77-81e9-19fc3ae7a940"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:22:39 crc kubenswrapper[4895]: I1202 09:22:39.451803 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fdefa681-0c42-4f77-81e9-19fc3ae7a940-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "fdefa681-0c42-4f77-81e9-19fc3ae7a940" (UID: "fdefa681-0c42-4f77-81e9-19fc3ae7a940"). 
InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:22:39 crc kubenswrapper[4895]: I1202 09:22:39.520978 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cdn4g\" (UniqueName: \"kubernetes.io/projected/fdefa681-0c42-4f77-81e9-19fc3ae7a940-kube-api-access-cdn4g\") on node \"crc\" DevicePath \"\"" Dec 02 09:22:39 crc kubenswrapper[4895]: I1202 09:22:39.521004 4895 reconciler_common.go:293] "Volume detached for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fdefa681-0c42-4f77-81e9-19fc3ae7a940-tripleo-cleanup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 09:22:39 crc kubenswrapper[4895]: I1202 09:22:39.521015 4895 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fdefa681-0c42-4f77-81e9-19fc3ae7a940-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 02 09:22:39 crc kubenswrapper[4895]: I1202 09:22:39.521029 4895 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fdefa681-0c42-4f77-81e9-19fc3ae7a940-inventory\") on node \"crc\" DevicePath \"\"" Dec 02 09:22:39 crc kubenswrapper[4895]: I1202 09:22:39.521072 4895 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/fdefa681-0c42-4f77-81e9-19fc3ae7a940-ceph\") on node \"crc\" DevicePath \"\"" Dec 02 09:22:39 crc kubenswrapper[4895]: I1202 09:22:39.767618 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-j4z2k" event={"ID":"fdefa681-0c42-4f77-81e9-19fc3ae7a940","Type":"ContainerDied","Data":"70956572a642ed3133800206992cfb5ece42628fc8380022df75a9f6ecb1905d"} Dec 02 09:22:39 crc kubenswrapper[4895]: I1202 09:22:39.767935 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="70956572a642ed3133800206992cfb5ece42628fc8380022df75a9f6ecb1905d" Dec 02 09:22:39 crc 
kubenswrapper[4895]: I1202 09:22:39.767692 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-j4z2k" Dec 02 09:22:48 crc kubenswrapper[4895]: I1202 09:22:48.141395 4895 scope.go:117] "RemoveContainer" containerID="b0891d6e2df87d9528c2e39e6da0ad0c4eeb270463ef0e1dee0e8c959775531a" Dec 02 09:22:48 crc kubenswrapper[4895]: E1202 09:22:48.142209 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:22:50 crc kubenswrapper[4895]: I1202 09:22:50.425065 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-wf5kx"] Dec 02 09:22:50 crc kubenswrapper[4895]: E1202 09:22:50.432152 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a53185b-d2a6-48cc-9dfe-e6d3f26580ff" containerName="registry-server" Dec 02 09:22:50 crc kubenswrapper[4895]: I1202 09:22:50.432216 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a53185b-d2a6-48cc-9dfe-e6d3f26580ff" containerName="registry-server" Dec 02 09:22:50 crc kubenswrapper[4895]: E1202 09:22:50.432312 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a53185b-d2a6-48cc-9dfe-e6d3f26580ff" containerName="extract-content" Dec 02 09:22:50 crc kubenswrapper[4895]: I1202 09:22:50.432325 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a53185b-d2a6-48cc-9dfe-e6d3f26580ff" containerName="extract-content" Dec 02 09:22:50 crc kubenswrapper[4895]: E1202 09:22:50.432382 4895 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="fdefa681-0c42-4f77-81e9-19fc3ae7a940" containerName="tripleo-cleanup-tripleo-cleanup-openstack-cell1" Dec 02 09:22:50 crc kubenswrapper[4895]: I1202 09:22:50.432395 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdefa681-0c42-4f77-81e9-19fc3ae7a940" containerName="tripleo-cleanup-tripleo-cleanup-openstack-cell1" Dec 02 09:22:50 crc kubenswrapper[4895]: E1202 09:22:50.432423 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a53185b-d2a6-48cc-9dfe-e6d3f26580ff" containerName="extract-utilities" Dec 02 09:22:50 crc kubenswrapper[4895]: I1202 09:22:50.432432 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a53185b-d2a6-48cc-9dfe-e6d3f26580ff" containerName="extract-utilities" Dec 02 09:22:50 crc kubenswrapper[4895]: I1202 09:22:50.433348 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a53185b-d2a6-48cc-9dfe-e6d3f26580ff" containerName="registry-server" Dec 02 09:22:50 crc kubenswrapper[4895]: I1202 09:22:50.433384 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="fdefa681-0c42-4f77-81e9-19fc3ae7a940" containerName="tripleo-cleanup-tripleo-cleanup-openstack-cell1" Dec 02 09:22:50 crc kubenswrapper[4895]: I1202 09:22:50.451701 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-wf5kx" Dec 02 09:22:50 crc kubenswrapper[4895]: I1202 09:22:50.458010 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-brvc6" Dec 02 09:22:50 crc kubenswrapper[4895]: I1202 09:22:50.458325 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 02 09:22:50 crc kubenswrapper[4895]: I1202 09:22:50.458479 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 02 09:22:50 crc kubenswrapper[4895]: I1202 09:22:50.458619 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 02 09:22:50 crc kubenswrapper[4895]: I1202 09:22:50.469138 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-wf5kx"] Dec 02 09:22:50 crc kubenswrapper[4895]: I1202 09:22:50.503253 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f199c54a-28da-4ea4-a95b-4ab810484ce2-ceph\") pod \"bootstrap-openstack-openstack-cell1-wf5kx\" (UID: \"f199c54a-28da-4ea4-a95b-4ab810484ce2\") " pod="openstack/bootstrap-openstack-openstack-cell1-wf5kx" Dec 02 09:22:50 crc kubenswrapper[4895]: I1202 09:22:50.503359 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84r6w\" (UniqueName: \"kubernetes.io/projected/f199c54a-28da-4ea4-a95b-4ab810484ce2-kube-api-access-84r6w\") pod \"bootstrap-openstack-openstack-cell1-wf5kx\" (UID: \"f199c54a-28da-4ea4-a95b-4ab810484ce2\") " pod="openstack/bootstrap-openstack-openstack-cell1-wf5kx" Dec 02 09:22:50 crc kubenswrapper[4895]: I1202 09:22:50.503423 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" 
(UniqueName: \"kubernetes.io/secret/f199c54a-28da-4ea4-a95b-4ab810484ce2-ssh-key\") pod \"bootstrap-openstack-openstack-cell1-wf5kx\" (UID: \"f199c54a-28da-4ea4-a95b-4ab810484ce2\") " pod="openstack/bootstrap-openstack-openstack-cell1-wf5kx" Dec 02 09:22:50 crc kubenswrapper[4895]: I1202 09:22:50.503534 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f199c54a-28da-4ea4-a95b-4ab810484ce2-inventory\") pod \"bootstrap-openstack-openstack-cell1-wf5kx\" (UID: \"f199c54a-28da-4ea4-a95b-4ab810484ce2\") " pod="openstack/bootstrap-openstack-openstack-cell1-wf5kx" Dec 02 09:22:50 crc kubenswrapper[4895]: I1202 09:22:50.503573 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f199c54a-28da-4ea4-a95b-4ab810484ce2-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-wf5kx\" (UID: \"f199c54a-28da-4ea4-a95b-4ab810484ce2\") " pod="openstack/bootstrap-openstack-openstack-cell1-wf5kx" Dec 02 09:22:50 crc kubenswrapper[4895]: I1202 09:22:50.605944 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84r6w\" (UniqueName: \"kubernetes.io/projected/f199c54a-28da-4ea4-a95b-4ab810484ce2-kube-api-access-84r6w\") pod \"bootstrap-openstack-openstack-cell1-wf5kx\" (UID: \"f199c54a-28da-4ea4-a95b-4ab810484ce2\") " pod="openstack/bootstrap-openstack-openstack-cell1-wf5kx" Dec 02 09:22:50 crc kubenswrapper[4895]: I1202 09:22:50.606374 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f199c54a-28da-4ea4-a95b-4ab810484ce2-ssh-key\") pod \"bootstrap-openstack-openstack-cell1-wf5kx\" (UID: \"f199c54a-28da-4ea4-a95b-4ab810484ce2\") " pod="openstack/bootstrap-openstack-openstack-cell1-wf5kx" Dec 02 09:22:50 crc kubenswrapper[4895]: I1202 
09:22:50.606542 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f199c54a-28da-4ea4-a95b-4ab810484ce2-inventory\") pod \"bootstrap-openstack-openstack-cell1-wf5kx\" (UID: \"f199c54a-28da-4ea4-a95b-4ab810484ce2\") " pod="openstack/bootstrap-openstack-openstack-cell1-wf5kx" Dec 02 09:22:50 crc kubenswrapper[4895]: I1202 09:22:50.606593 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f199c54a-28da-4ea4-a95b-4ab810484ce2-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-wf5kx\" (UID: \"f199c54a-28da-4ea4-a95b-4ab810484ce2\") " pod="openstack/bootstrap-openstack-openstack-cell1-wf5kx" Dec 02 09:22:50 crc kubenswrapper[4895]: I1202 09:22:50.606781 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f199c54a-28da-4ea4-a95b-4ab810484ce2-ceph\") pod \"bootstrap-openstack-openstack-cell1-wf5kx\" (UID: \"f199c54a-28da-4ea4-a95b-4ab810484ce2\") " pod="openstack/bootstrap-openstack-openstack-cell1-wf5kx" Dec 02 09:22:50 crc kubenswrapper[4895]: I1202 09:22:50.629821 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f199c54a-28da-4ea4-a95b-4ab810484ce2-inventory\") pod \"bootstrap-openstack-openstack-cell1-wf5kx\" (UID: \"f199c54a-28da-4ea4-a95b-4ab810484ce2\") " pod="openstack/bootstrap-openstack-openstack-cell1-wf5kx" Dec 02 09:22:50 crc kubenswrapper[4895]: I1202 09:22:50.629847 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f199c54a-28da-4ea4-a95b-4ab810484ce2-ssh-key\") pod \"bootstrap-openstack-openstack-cell1-wf5kx\" (UID: \"f199c54a-28da-4ea4-a95b-4ab810484ce2\") " pod="openstack/bootstrap-openstack-openstack-cell1-wf5kx" Dec 02 09:22:50 crc 
kubenswrapper[4895]: I1202 09:22:50.629982 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f199c54a-28da-4ea4-a95b-4ab810484ce2-ceph\") pod \"bootstrap-openstack-openstack-cell1-wf5kx\" (UID: \"f199c54a-28da-4ea4-a95b-4ab810484ce2\") " pod="openstack/bootstrap-openstack-openstack-cell1-wf5kx" Dec 02 09:22:50 crc kubenswrapper[4895]: I1202 09:22:50.634020 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f199c54a-28da-4ea4-a95b-4ab810484ce2-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-wf5kx\" (UID: \"f199c54a-28da-4ea4-a95b-4ab810484ce2\") " pod="openstack/bootstrap-openstack-openstack-cell1-wf5kx" Dec 02 09:22:50 crc kubenswrapper[4895]: I1202 09:22:50.639878 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84r6w\" (UniqueName: \"kubernetes.io/projected/f199c54a-28da-4ea4-a95b-4ab810484ce2-kube-api-access-84r6w\") pod \"bootstrap-openstack-openstack-cell1-wf5kx\" (UID: \"f199c54a-28da-4ea4-a95b-4ab810484ce2\") " pod="openstack/bootstrap-openstack-openstack-cell1-wf5kx" Dec 02 09:22:50 crc kubenswrapper[4895]: I1202 09:22:50.796711 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-wf5kx" Dec 02 09:22:51 crc kubenswrapper[4895]: I1202 09:22:51.342172 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-wf5kx"] Dec 02 09:22:51 crc kubenswrapper[4895]: I1202 09:22:51.346165 4895 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 02 09:22:51 crc kubenswrapper[4895]: I1202 09:22:51.878365 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-wf5kx" event={"ID":"f199c54a-28da-4ea4-a95b-4ab810484ce2","Type":"ContainerStarted","Data":"75dd58c990f5f010b392a5e6bb948c333233189d40fcb78c9cd9f3477ec1a09a"} Dec 02 09:22:51 crc kubenswrapper[4895]: I1202 09:22:51.878983 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-wf5kx" event={"ID":"f199c54a-28da-4ea4-a95b-4ab810484ce2","Type":"ContainerStarted","Data":"5492da27179340a549a6ace09cd8a79ba00971f9887ccac1dee7d7670ccf896e"} Dec 02 09:22:51 crc kubenswrapper[4895]: I1202 09:22:51.903074 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-openstack-openstack-cell1-wf5kx" podStartSLOduration=1.677689253 podStartE2EDuration="1.903049979s" podCreationTimestamp="2025-12-02 09:22:50 +0000 UTC" firstStartedPulling="2025-12-02 09:22:51.345942123 +0000 UTC m=+7182.516801736" lastFinishedPulling="2025-12-02 09:22:51.571302849 +0000 UTC m=+7182.742162462" observedRunningTime="2025-12-02 09:22:51.893725188 +0000 UTC m=+7183.064584801" watchObservedRunningTime="2025-12-02 09:22:51.903049979 +0000 UTC m=+7183.073909592" Dec 02 09:22:52 crc kubenswrapper[4895]: I1202 09:22:52.388072 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-vbppf"] Dec 02 09:22:52 crc kubenswrapper[4895]: I1202 09:22:52.393157 4895 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/community-operators-vbppf" Dec 02 09:22:52 crc kubenswrapper[4895]: I1202 09:22:52.445937 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f34657a-6e56-4339-b216-62324cc3a035-catalog-content\") pod \"community-operators-vbppf\" (UID: \"9f34657a-6e56-4339-b216-62324cc3a035\") " pod="openshift-marketplace/community-operators-vbppf" Dec 02 09:22:52 crc kubenswrapper[4895]: I1202 09:22:52.446288 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vlxtd\" (UniqueName: \"kubernetes.io/projected/9f34657a-6e56-4339-b216-62324cc3a035-kube-api-access-vlxtd\") pod \"community-operators-vbppf\" (UID: \"9f34657a-6e56-4339-b216-62324cc3a035\") " pod="openshift-marketplace/community-operators-vbppf" Dec 02 09:22:52 crc kubenswrapper[4895]: I1202 09:22:52.446519 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f34657a-6e56-4339-b216-62324cc3a035-utilities\") pod \"community-operators-vbppf\" (UID: \"9f34657a-6e56-4339-b216-62324cc3a035\") " pod="openshift-marketplace/community-operators-vbppf" Dec 02 09:22:52 crc kubenswrapper[4895]: I1202 09:22:52.449279 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vbppf"] Dec 02 09:22:52 crc kubenswrapper[4895]: I1202 09:22:52.548538 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f34657a-6e56-4339-b216-62324cc3a035-utilities\") pod \"community-operators-vbppf\" (UID: \"9f34657a-6e56-4339-b216-62324cc3a035\") " pod="openshift-marketplace/community-operators-vbppf" Dec 02 09:22:52 crc kubenswrapper[4895]: I1202 09:22:52.548690 4895 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f34657a-6e56-4339-b216-62324cc3a035-catalog-content\") pod \"community-operators-vbppf\" (UID: \"9f34657a-6e56-4339-b216-62324cc3a035\") " pod="openshift-marketplace/community-operators-vbppf" Dec 02 09:22:52 crc kubenswrapper[4895]: I1202 09:22:52.548839 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vlxtd\" (UniqueName: \"kubernetes.io/projected/9f34657a-6e56-4339-b216-62324cc3a035-kube-api-access-vlxtd\") pod \"community-operators-vbppf\" (UID: \"9f34657a-6e56-4339-b216-62324cc3a035\") " pod="openshift-marketplace/community-operators-vbppf" Dec 02 09:22:52 crc kubenswrapper[4895]: I1202 09:22:52.550019 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f34657a-6e56-4339-b216-62324cc3a035-utilities\") pod \"community-operators-vbppf\" (UID: \"9f34657a-6e56-4339-b216-62324cc3a035\") " pod="openshift-marketplace/community-operators-vbppf" Dec 02 09:22:52 crc kubenswrapper[4895]: I1202 09:22:52.550244 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f34657a-6e56-4339-b216-62324cc3a035-catalog-content\") pod \"community-operators-vbppf\" (UID: \"9f34657a-6e56-4339-b216-62324cc3a035\") " pod="openshift-marketplace/community-operators-vbppf" Dec 02 09:22:52 crc kubenswrapper[4895]: I1202 09:22:52.572152 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vlxtd\" (UniqueName: \"kubernetes.io/projected/9f34657a-6e56-4339-b216-62324cc3a035-kube-api-access-vlxtd\") pod \"community-operators-vbppf\" (UID: \"9f34657a-6e56-4339-b216-62324cc3a035\") " pod="openshift-marketplace/community-operators-vbppf" Dec 02 09:22:52 crc kubenswrapper[4895]: I1202 09:22:52.715268 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vbppf" Dec 02 09:22:53 crc kubenswrapper[4895]: I1202 09:22:53.250889 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vbppf"] Dec 02 09:22:53 crc kubenswrapper[4895]: W1202 09:22:53.254029 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9f34657a_6e56_4339_b216_62324cc3a035.slice/crio-f033499757cffc14f13c9b780c48f251e60d5818b0cfc933052463b1b3d95a94 WatchSource:0}: Error finding container f033499757cffc14f13c9b780c48f251e60d5818b0cfc933052463b1b3d95a94: Status 404 returned error can't find the container with id f033499757cffc14f13c9b780c48f251e60d5818b0cfc933052463b1b3d95a94 Dec 02 09:22:53 crc kubenswrapper[4895]: I1202 09:22:53.903673 4895 generic.go:334] "Generic (PLEG): container finished" podID="9f34657a-6e56-4339-b216-62324cc3a035" containerID="d9c1b86c9060cc1e93ed0e13afad7ac1c3967d6a79d0150fd3511df1132ad1e9" exitCode=0 Dec 02 09:22:53 crc kubenswrapper[4895]: I1202 09:22:53.904280 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vbppf" event={"ID":"9f34657a-6e56-4339-b216-62324cc3a035","Type":"ContainerDied","Data":"d9c1b86c9060cc1e93ed0e13afad7ac1c3967d6a79d0150fd3511df1132ad1e9"} Dec 02 09:22:53 crc kubenswrapper[4895]: I1202 09:22:53.904323 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vbppf" event={"ID":"9f34657a-6e56-4339-b216-62324cc3a035","Type":"ContainerStarted","Data":"f033499757cffc14f13c9b780c48f251e60d5818b0cfc933052463b1b3d95a94"} Dec 02 09:22:59 crc kubenswrapper[4895]: I1202 09:22:59.969434 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vbppf" 
event={"ID":"9f34657a-6e56-4339-b216-62324cc3a035","Type":"ContainerStarted","Data":"524efd430056389ab4fa1fe2a69ea9d87ad4fa55822eb5e771debf6455520e4b"} Dec 02 09:23:01 crc kubenswrapper[4895]: I1202 09:23:01.129425 4895 generic.go:334] "Generic (PLEG): container finished" podID="9f34657a-6e56-4339-b216-62324cc3a035" containerID="524efd430056389ab4fa1fe2a69ea9d87ad4fa55822eb5e771debf6455520e4b" exitCode=0 Dec 02 09:23:01 crc kubenswrapper[4895]: I1202 09:23:01.129484 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vbppf" event={"ID":"9f34657a-6e56-4339-b216-62324cc3a035","Type":"ContainerDied","Data":"524efd430056389ab4fa1fe2a69ea9d87ad4fa55822eb5e771debf6455520e4b"} Dec 02 09:23:03 crc kubenswrapper[4895]: I1202 09:23:03.141656 4895 scope.go:117] "RemoveContainer" containerID="b0891d6e2df87d9528c2e39e6da0ad0c4eeb270463ef0e1dee0e8c959775531a" Dec 02 09:23:03 crc kubenswrapper[4895]: E1202 09:23:03.142300 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:23:04 crc kubenswrapper[4895]: I1202 09:23:04.162384 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vbppf" event={"ID":"9f34657a-6e56-4339-b216-62324cc3a035","Type":"ContainerStarted","Data":"5da0c1b32beb49bc65e67e49dca893447665329e4fb9778aa60c6b086e221cc4"} Dec 02 09:23:04 crc kubenswrapper[4895]: I1202 09:23:04.186037 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-vbppf" podStartSLOduration=2.2933087580000002 podStartE2EDuration="12.186015433s" 
podCreationTimestamp="2025-12-02 09:22:52 +0000 UTC" firstStartedPulling="2025-12-02 09:22:53.907109604 +0000 UTC m=+7185.077969227" lastFinishedPulling="2025-12-02 09:23:03.799816289 +0000 UTC m=+7194.970675902" observedRunningTime="2025-12-02 09:23:04.181242385 +0000 UTC m=+7195.352102018" watchObservedRunningTime="2025-12-02 09:23:04.186015433 +0000 UTC m=+7195.356875046" Dec 02 09:23:10 crc kubenswrapper[4895]: I1202 09:23:10.971090 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-vxzkl"] Dec 02 09:23:10 crc kubenswrapper[4895]: I1202 09:23:10.974364 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vxzkl" Dec 02 09:23:10 crc kubenswrapper[4895]: I1202 09:23:10.990369 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vxzkl"] Dec 02 09:23:11 crc kubenswrapper[4895]: I1202 09:23:11.167144 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpd22\" (UniqueName: \"kubernetes.io/projected/6aced989-7854-40f8-b59c-b0978c21d3f2-kube-api-access-tpd22\") pod \"redhat-operators-vxzkl\" (UID: \"6aced989-7854-40f8-b59c-b0978c21d3f2\") " pod="openshift-marketplace/redhat-operators-vxzkl" Dec 02 09:23:11 crc kubenswrapper[4895]: I1202 09:23:11.167647 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6aced989-7854-40f8-b59c-b0978c21d3f2-utilities\") pod \"redhat-operators-vxzkl\" (UID: \"6aced989-7854-40f8-b59c-b0978c21d3f2\") " pod="openshift-marketplace/redhat-operators-vxzkl" Dec 02 09:23:11 crc kubenswrapper[4895]: I1202 09:23:11.167717 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/6aced989-7854-40f8-b59c-b0978c21d3f2-catalog-content\") pod \"redhat-operators-vxzkl\" (UID: \"6aced989-7854-40f8-b59c-b0978c21d3f2\") " pod="openshift-marketplace/redhat-operators-vxzkl" Dec 02 09:23:11 crc kubenswrapper[4895]: I1202 09:23:11.271573 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpd22\" (UniqueName: \"kubernetes.io/projected/6aced989-7854-40f8-b59c-b0978c21d3f2-kube-api-access-tpd22\") pod \"redhat-operators-vxzkl\" (UID: \"6aced989-7854-40f8-b59c-b0978c21d3f2\") " pod="openshift-marketplace/redhat-operators-vxzkl" Dec 02 09:23:11 crc kubenswrapper[4895]: I1202 09:23:11.271658 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6aced989-7854-40f8-b59c-b0978c21d3f2-utilities\") pod \"redhat-operators-vxzkl\" (UID: \"6aced989-7854-40f8-b59c-b0978c21d3f2\") " pod="openshift-marketplace/redhat-operators-vxzkl" Dec 02 09:23:11 crc kubenswrapper[4895]: I1202 09:23:11.271694 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6aced989-7854-40f8-b59c-b0978c21d3f2-catalog-content\") pod \"redhat-operators-vxzkl\" (UID: \"6aced989-7854-40f8-b59c-b0978c21d3f2\") " pod="openshift-marketplace/redhat-operators-vxzkl" Dec 02 09:23:11 crc kubenswrapper[4895]: I1202 09:23:11.272419 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6aced989-7854-40f8-b59c-b0978c21d3f2-utilities\") pod \"redhat-operators-vxzkl\" (UID: \"6aced989-7854-40f8-b59c-b0978c21d3f2\") " pod="openshift-marketplace/redhat-operators-vxzkl" Dec 02 09:23:11 crc kubenswrapper[4895]: I1202 09:23:11.272423 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/6aced989-7854-40f8-b59c-b0978c21d3f2-catalog-content\") pod \"redhat-operators-vxzkl\" (UID: \"6aced989-7854-40f8-b59c-b0978c21d3f2\") " pod="openshift-marketplace/redhat-operators-vxzkl" Dec 02 09:23:11 crc kubenswrapper[4895]: I1202 09:23:11.299586 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpd22\" (UniqueName: \"kubernetes.io/projected/6aced989-7854-40f8-b59c-b0978c21d3f2-kube-api-access-tpd22\") pod \"redhat-operators-vxzkl\" (UID: \"6aced989-7854-40f8-b59c-b0978c21d3f2\") " pod="openshift-marketplace/redhat-operators-vxzkl" Dec 02 09:23:11 crc kubenswrapper[4895]: I1202 09:23:11.374312 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vxzkl" Dec 02 09:23:11 crc kubenswrapper[4895]: I1202 09:23:11.920537 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vxzkl"] Dec 02 09:23:11 crc kubenswrapper[4895]: W1202 09:23:11.929224 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6aced989_7854_40f8_b59c_b0978c21d3f2.slice/crio-1a066d68eaa57c4998f4b1d7b7936c011f21d3f0872b008bace844b2f87d5c75 WatchSource:0}: Error finding container 1a066d68eaa57c4998f4b1d7b7936c011f21d3f0872b008bace844b2f87d5c75: Status 404 returned error can't find the container with id 1a066d68eaa57c4998f4b1d7b7936c011f21d3f0872b008bace844b2f87d5c75 Dec 02 09:23:12 crc kubenswrapper[4895]: I1202 09:23:12.245459 4895 generic.go:334] "Generic (PLEG): container finished" podID="6aced989-7854-40f8-b59c-b0978c21d3f2" containerID="e0e82f1ba9362becb8925d76c031dd8d00a25aa988405ed61d4aaaa485717bca" exitCode=0 Dec 02 09:23:12 crc kubenswrapper[4895]: I1202 09:23:12.245591 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vxzkl" 
event={"ID":"6aced989-7854-40f8-b59c-b0978c21d3f2","Type":"ContainerDied","Data":"e0e82f1ba9362becb8925d76c031dd8d00a25aa988405ed61d4aaaa485717bca"} Dec 02 09:23:12 crc kubenswrapper[4895]: I1202 09:23:12.245823 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vxzkl" event={"ID":"6aced989-7854-40f8-b59c-b0978c21d3f2","Type":"ContainerStarted","Data":"1a066d68eaa57c4998f4b1d7b7936c011f21d3f0872b008bace844b2f87d5c75"} Dec 02 09:23:12 crc kubenswrapper[4895]: I1202 09:23:12.715825 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-vbppf" Dec 02 09:23:12 crc kubenswrapper[4895]: I1202 09:23:12.715894 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-vbppf" Dec 02 09:23:12 crc kubenswrapper[4895]: I1202 09:23:12.771453 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-vbppf" Dec 02 09:23:13 crc kubenswrapper[4895]: I1202 09:23:13.307872 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-vbppf" Dec 02 09:23:14 crc kubenswrapper[4895]: I1202 09:23:14.141250 4895 scope.go:117] "RemoveContainer" containerID="b0891d6e2df87d9528c2e39e6da0ad0c4eeb270463ef0e1dee0e8c959775531a" Dec 02 09:23:14 crc kubenswrapper[4895]: E1202 09:23:14.141612 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:23:14 crc kubenswrapper[4895]: I1202 09:23:14.266365 4895 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vxzkl" event={"ID":"6aced989-7854-40f8-b59c-b0978c21d3f2","Type":"ContainerStarted","Data":"1495919ec0f44cb1b3c413e09c4402eb97426bac5970bd4b781144b969da5f82"} Dec 02 09:23:15 crc kubenswrapper[4895]: I1202 09:23:15.155503 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vbppf"] Dec 02 09:23:15 crc kubenswrapper[4895]: I1202 09:23:15.280265 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-vbppf" podUID="9f34657a-6e56-4339-b216-62324cc3a035" containerName="registry-server" containerID="cri-o://5da0c1b32beb49bc65e67e49dca893447665329e4fb9778aa60c6b086e221cc4" gracePeriod=2 Dec 02 09:23:16 crc kubenswrapper[4895]: I1202 09:23:16.302693 4895 generic.go:334] "Generic (PLEG): container finished" podID="9f34657a-6e56-4339-b216-62324cc3a035" containerID="5da0c1b32beb49bc65e67e49dca893447665329e4fb9778aa60c6b086e221cc4" exitCode=0 Dec 02 09:23:16 crc kubenswrapper[4895]: I1202 09:23:16.302778 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vbppf" event={"ID":"9f34657a-6e56-4339-b216-62324cc3a035","Type":"ContainerDied","Data":"5da0c1b32beb49bc65e67e49dca893447665329e4fb9778aa60c6b086e221cc4"} Dec 02 09:23:16 crc kubenswrapper[4895]: I1202 09:23:16.824448 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vbppf" Dec 02 09:23:16 crc kubenswrapper[4895]: I1202 09:23:16.898097 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f34657a-6e56-4339-b216-62324cc3a035-utilities\") pod \"9f34657a-6e56-4339-b216-62324cc3a035\" (UID: \"9f34657a-6e56-4339-b216-62324cc3a035\") " Dec 02 09:23:16 crc kubenswrapper[4895]: I1202 09:23:16.898336 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vlxtd\" (UniqueName: \"kubernetes.io/projected/9f34657a-6e56-4339-b216-62324cc3a035-kube-api-access-vlxtd\") pod \"9f34657a-6e56-4339-b216-62324cc3a035\" (UID: \"9f34657a-6e56-4339-b216-62324cc3a035\") " Dec 02 09:23:16 crc kubenswrapper[4895]: I1202 09:23:16.898402 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f34657a-6e56-4339-b216-62324cc3a035-catalog-content\") pod \"9f34657a-6e56-4339-b216-62324cc3a035\" (UID: \"9f34657a-6e56-4339-b216-62324cc3a035\") " Dec 02 09:23:16 crc kubenswrapper[4895]: I1202 09:23:16.900410 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f34657a-6e56-4339-b216-62324cc3a035-utilities" (OuterVolumeSpecName: "utilities") pod "9f34657a-6e56-4339-b216-62324cc3a035" (UID: "9f34657a-6e56-4339-b216-62324cc3a035"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:23:16 crc kubenswrapper[4895]: I1202 09:23:16.904243 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f34657a-6e56-4339-b216-62324cc3a035-kube-api-access-vlxtd" (OuterVolumeSpecName: "kube-api-access-vlxtd") pod "9f34657a-6e56-4339-b216-62324cc3a035" (UID: "9f34657a-6e56-4339-b216-62324cc3a035"). InnerVolumeSpecName "kube-api-access-vlxtd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:23:16 crc kubenswrapper[4895]: I1202 09:23:16.942430 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f34657a-6e56-4339-b216-62324cc3a035-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9f34657a-6e56-4339-b216-62324cc3a035" (UID: "9f34657a-6e56-4339-b216-62324cc3a035"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:23:17 crc kubenswrapper[4895]: I1202 09:23:17.000394 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vlxtd\" (UniqueName: \"kubernetes.io/projected/9f34657a-6e56-4339-b216-62324cc3a035-kube-api-access-vlxtd\") on node \"crc\" DevicePath \"\"" Dec 02 09:23:17 crc kubenswrapper[4895]: I1202 09:23:17.000438 4895 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f34657a-6e56-4339-b216-62324cc3a035-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 09:23:17 crc kubenswrapper[4895]: I1202 09:23:17.000448 4895 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f34657a-6e56-4339-b216-62324cc3a035-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 09:23:17 crc kubenswrapper[4895]: I1202 09:23:17.327445 4895 generic.go:334] "Generic (PLEG): container finished" podID="6aced989-7854-40f8-b59c-b0978c21d3f2" containerID="1495919ec0f44cb1b3c413e09c4402eb97426bac5970bd4b781144b969da5f82" exitCode=0 Dec 02 09:23:17 crc kubenswrapper[4895]: I1202 09:23:17.327529 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vxzkl" event={"ID":"6aced989-7854-40f8-b59c-b0978c21d3f2","Type":"ContainerDied","Data":"1495919ec0f44cb1b3c413e09c4402eb97426bac5970bd4b781144b969da5f82"} Dec 02 09:23:17 crc kubenswrapper[4895]: I1202 09:23:17.334443 4895 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/community-operators-vbppf" event={"ID":"9f34657a-6e56-4339-b216-62324cc3a035","Type":"ContainerDied","Data":"f033499757cffc14f13c9b780c48f251e60d5818b0cfc933052463b1b3d95a94"} Dec 02 09:23:17 crc kubenswrapper[4895]: I1202 09:23:17.334495 4895 scope.go:117] "RemoveContainer" containerID="5da0c1b32beb49bc65e67e49dca893447665329e4fb9778aa60c6b086e221cc4" Dec 02 09:23:17 crc kubenswrapper[4895]: I1202 09:23:17.334681 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vbppf" Dec 02 09:23:17 crc kubenswrapper[4895]: I1202 09:23:17.373316 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vbppf"] Dec 02 09:23:17 crc kubenswrapper[4895]: I1202 09:23:17.378512 4895 scope.go:117] "RemoveContainer" containerID="524efd430056389ab4fa1fe2a69ea9d87ad4fa55822eb5e771debf6455520e4b" Dec 02 09:23:17 crc kubenswrapper[4895]: I1202 09:23:17.381315 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-vbppf"] Dec 02 09:23:17 crc kubenswrapper[4895]: I1202 09:23:17.401859 4895 scope.go:117] "RemoveContainer" containerID="d9c1b86c9060cc1e93ed0e13afad7ac1c3967d6a79d0150fd3511df1132ad1e9" Dec 02 09:23:18 crc kubenswrapper[4895]: I1202 09:23:18.345445 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vxzkl" event={"ID":"6aced989-7854-40f8-b59c-b0978c21d3f2","Type":"ContainerStarted","Data":"8b93de79013837892c1c2885eada98b876605c28492da3972fb431b1420ac8e0"} Dec 02 09:23:18 crc kubenswrapper[4895]: I1202 09:23:18.369462 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-vxzkl" podStartSLOduration=2.663680102 podStartE2EDuration="8.369434318s" podCreationTimestamp="2025-12-02 09:23:10 +0000 UTC" firstStartedPulling="2025-12-02 09:23:12.248456764 +0000 UTC m=+7203.419316377" 
lastFinishedPulling="2025-12-02 09:23:17.95421098 +0000 UTC m=+7209.125070593" observedRunningTime="2025-12-02 09:23:18.364477404 +0000 UTC m=+7209.535337017" watchObservedRunningTime="2025-12-02 09:23:18.369434318 +0000 UTC m=+7209.540293931" Dec 02 09:23:19 crc kubenswrapper[4895]: I1202 09:23:19.152718 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f34657a-6e56-4339-b216-62324cc3a035" path="/var/lib/kubelet/pods/9f34657a-6e56-4339-b216-62324cc3a035/volumes" Dec 02 09:23:21 crc kubenswrapper[4895]: I1202 09:23:21.375062 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-vxzkl" Dec 02 09:23:21 crc kubenswrapper[4895]: I1202 09:23:21.375433 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-vxzkl" Dec 02 09:23:22 crc kubenswrapper[4895]: I1202 09:23:22.428492 4895 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-vxzkl" podUID="6aced989-7854-40f8-b59c-b0978c21d3f2" containerName="registry-server" probeResult="failure" output=< Dec 02 09:23:22 crc kubenswrapper[4895]: timeout: failed to connect service ":50051" within 1s Dec 02 09:23:22 crc kubenswrapper[4895]: > Dec 02 09:23:26 crc kubenswrapper[4895]: I1202 09:23:26.140878 4895 scope.go:117] "RemoveContainer" containerID="b0891d6e2df87d9528c2e39e6da0ad0c4eeb270463ef0e1dee0e8c959775531a" Dec 02 09:23:26 crc kubenswrapper[4895]: E1202 09:23:26.141748 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:23:31 crc kubenswrapper[4895]: 
I1202 09:23:31.431987 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-vxzkl" Dec 02 09:23:31 crc kubenswrapper[4895]: I1202 09:23:31.495060 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-vxzkl" Dec 02 09:23:31 crc kubenswrapper[4895]: I1202 09:23:31.678696 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vxzkl"] Dec 02 09:23:32 crc kubenswrapper[4895]: I1202 09:23:32.753952 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-vxzkl" podUID="6aced989-7854-40f8-b59c-b0978c21d3f2" containerName="registry-server" containerID="cri-o://8b93de79013837892c1c2885eada98b876605c28492da3972fb431b1420ac8e0" gracePeriod=2 Dec 02 09:23:33 crc kubenswrapper[4895]: I1202 09:23:33.318213 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vxzkl" Dec 02 09:23:33 crc kubenswrapper[4895]: I1202 09:23:33.448039 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6aced989-7854-40f8-b59c-b0978c21d3f2-utilities\") pod \"6aced989-7854-40f8-b59c-b0978c21d3f2\" (UID: \"6aced989-7854-40f8-b59c-b0978c21d3f2\") " Dec 02 09:23:33 crc kubenswrapper[4895]: I1202 09:23:33.448461 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6aced989-7854-40f8-b59c-b0978c21d3f2-catalog-content\") pod \"6aced989-7854-40f8-b59c-b0978c21d3f2\" (UID: \"6aced989-7854-40f8-b59c-b0978c21d3f2\") " Dec 02 09:23:33 crc kubenswrapper[4895]: I1202 09:23:33.448565 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tpd22\" (UniqueName: 
\"kubernetes.io/projected/6aced989-7854-40f8-b59c-b0978c21d3f2-kube-api-access-tpd22\") pod \"6aced989-7854-40f8-b59c-b0978c21d3f2\" (UID: \"6aced989-7854-40f8-b59c-b0978c21d3f2\") " Dec 02 09:23:33 crc kubenswrapper[4895]: I1202 09:23:33.450081 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6aced989-7854-40f8-b59c-b0978c21d3f2-utilities" (OuterVolumeSpecName: "utilities") pod "6aced989-7854-40f8-b59c-b0978c21d3f2" (UID: "6aced989-7854-40f8-b59c-b0978c21d3f2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:23:33 crc kubenswrapper[4895]: I1202 09:23:33.457658 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6aced989-7854-40f8-b59c-b0978c21d3f2-kube-api-access-tpd22" (OuterVolumeSpecName: "kube-api-access-tpd22") pod "6aced989-7854-40f8-b59c-b0978c21d3f2" (UID: "6aced989-7854-40f8-b59c-b0978c21d3f2"). InnerVolumeSpecName "kube-api-access-tpd22". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:23:33 crc kubenswrapper[4895]: I1202 09:23:33.551259 4895 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6aced989-7854-40f8-b59c-b0978c21d3f2-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 09:23:33 crc kubenswrapper[4895]: I1202 09:23:33.551307 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tpd22\" (UniqueName: \"kubernetes.io/projected/6aced989-7854-40f8-b59c-b0978c21d3f2-kube-api-access-tpd22\") on node \"crc\" DevicePath \"\"" Dec 02 09:23:33 crc kubenswrapper[4895]: I1202 09:23:33.609584 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6aced989-7854-40f8-b59c-b0978c21d3f2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6aced989-7854-40f8-b59c-b0978c21d3f2" (UID: "6aced989-7854-40f8-b59c-b0978c21d3f2"). 
InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:23:33 crc kubenswrapper[4895]: I1202 09:23:33.652719 4895 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6aced989-7854-40f8-b59c-b0978c21d3f2-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 09:23:33 crc kubenswrapper[4895]: I1202 09:23:33.765971 4895 generic.go:334] "Generic (PLEG): container finished" podID="6aced989-7854-40f8-b59c-b0978c21d3f2" containerID="8b93de79013837892c1c2885eada98b876605c28492da3972fb431b1420ac8e0" exitCode=0 Dec 02 09:23:33 crc kubenswrapper[4895]: I1202 09:23:33.766014 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vxzkl" event={"ID":"6aced989-7854-40f8-b59c-b0978c21d3f2","Type":"ContainerDied","Data":"8b93de79013837892c1c2885eada98b876605c28492da3972fb431b1420ac8e0"} Dec 02 09:23:33 crc kubenswrapper[4895]: I1202 09:23:33.766042 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vxzkl" event={"ID":"6aced989-7854-40f8-b59c-b0978c21d3f2","Type":"ContainerDied","Data":"1a066d68eaa57c4998f4b1d7b7936c011f21d3f0872b008bace844b2f87d5c75"} Dec 02 09:23:33 crc kubenswrapper[4895]: I1202 09:23:33.766060 4895 scope.go:117] "RemoveContainer" containerID="8b93de79013837892c1c2885eada98b876605c28492da3972fb431b1420ac8e0" Dec 02 09:23:33 crc kubenswrapper[4895]: I1202 09:23:33.766148 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vxzkl" Dec 02 09:23:33 crc kubenswrapper[4895]: I1202 09:23:33.791936 4895 scope.go:117] "RemoveContainer" containerID="1495919ec0f44cb1b3c413e09c4402eb97426bac5970bd4b781144b969da5f82" Dec 02 09:23:33 crc kubenswrapper[4895]: I1202 09:23:33.817913 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vxzkl"] Dec 02 09:23:33 crc kubenswrapper[4895]: I1202 09:23:33.827969 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-vxzkl"] Dec 02 09:23:33 crc kubenswrapper[4895]: I1202 09:23:33.839054 4895 scope.go:117] "RemoveContainer" containerID="e0e82f1ba9362becb8925d76c031dd8d00a25aa988405ed61d4aaaa485717bca" Dec 02 09:23:33 crc kubenswrapper[4895]: I1202 09:23:33.885159 4895 scope.go:117] "RemoveContainer" containerID="8b93de79013837892c1c2885eada98b876605c28492da3972fb431b1420ac8e0" Dec 02 09:23:33 crc kubenswrapper[4895]: E1202 09:23:33.885761 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b93de79013837892c1c2885eada98b876605c28492da3972fb431b1420ac8e0\": container with ID starting with 8b93de79013837892c1c2885eada98b876605c28492da3972fb431b1420ac8e0 not found: ID does not exist" containerID="8b93de79013837892c1c2885eada98b876605c28492da3972fb431b1420ac8e0" Dec 02 09:23:33 crc kubenswrapper[4895]: I1202 09:23:33.885887 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b93de79013837892c1c2885eada98b876605c28492da3972fb431b1420ac8e0"} err="failed to get container status \"8b93de79013837892c1c2885eada98b876605c28492da3972fb431b1420ac8e0\": rpc error: code = NotFound desc = could not find container \"8b93de79013837892c1c2885eada98b876605c28492da3972fb431b1420ac8e0\": container with ID starting with 8b93de79013837892c1c2885eada98b876605c28492da3972fb431b1420ac8e0 not found: ID does 
not exist" Dec 02 09:23:33 crc kubenswrapper[4895]: I1202 09:23:33.886000 4895 scope.go:117] "RemoveContainer" containerID="1495919ec0f44cb1b3c413e09c4402eb97426bac5970bd4b781144b969da5f82" Dec 02 09:23:33 crc kubenswrapper[4895]: E1202 09:23:33.886395 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1495919ec0f44cb1b3c413e09c4402eb97426bac5970bd4b781144b969da5f82\": container with ID starting with 1495919ec0f44cb1b3c413e09c4402eb97426bac5970bd4b781144b969da5f82 not found: ID does not exist" containerID="1495919ec0f44cb1b3c413e09c4402eb97426bac5970bd4b781144b969da5f82" Dec 02 09:23:33 crc kubenswrapper[4895]: I1202 09:23:33.886501 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1495919ec0f44cb1b3c413e09c4402eb97426bac5970bd4b781144b969da5f82"} err="failed to get container status \"1495919ec0f44cb1b3c413e09c4402eb97426bac5970bd4b781144b969da5f82\": rpc error: code = NotFound desc = could not find container \"1495919ec0f44cb1b3c413e09c4402eb97426bac5970bd4b781144b969da5f82\": container with ID starting with 1495919ec0f44cb1b3c413e09c4402eb97426bac5970bd4b781144b969da5f82 not found: ID does not exist" Dec 02 09:23:33 crc kubenswrapper[4895]: I1202 09:23:33.886595 4895 scope.go:117] "RemoveContainer" containerID="e0e82f1ba9362becb8925d76c031dd8d00a25aa988405ed61d4aaaa485717bca" Dec 02 09:23:33 crc kubenswrapper[4895]: E1202 09:23:33.887043 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0e82f1ba9362becb8925d76c031dd8d00a25aa988405ed61d4aaaa485717bca\": container with ID starting with e0e82f1ba9362becb8925d76c031dd8d00a25aa988405ed61d4aaaa485717bca not found: ID does not exist" containerID="e0e82f1ba9362becb8925d76c031dd8d00a25aa988405ed61d4aaaa485717bca" Dec 02 09:23:33 crc kubenswrapper[4895]: I1202 09:23:33.887140 4895 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0e82f1ba9362becb8925d76c031dd8d00a25aa988405ed61d4aaaa485717bca"} err="failed to get container status \"e0e82f1ba9362becb8925d76c031dd8d00a25aa988405ed61d4aaaa485717bca\": rpc error: code = NotFound desc = could not find container \"e0e82f1ba9362becb8925d76c031dd8d00a25aa988405ed61d4aaaa485717bca\": container with ID starting with e0e82f1ba9362becb8925d76c031dd8d00a25aa988405ed61d4aaaa485717bca not found: ID does not exist" Dec 02 09:23:35 crc kubenswrapper[4895]: I1202 09:23:35.153913 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6aced989-7854-40f8-b59c-b0978c21d3f2" path="/var/lib/kubelet/pods/6aced989-7854-40f8-b59c-b0978c21d3f2/volumes" Dec 02 09:23:40 crc kubenswrapper[4895]: I1202 09:23:40.141846 4895 scope.go:117] "RemoveContainer" containerID="b0891d6e2df87d9528c2e39e6da0ad0c4eeb270463ef0e1dee0e8c959775531a" Dec 02 09:23:40 crc kubenswrapper[4895]: E1202 09:23:40.142840 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:23:51 crc kubenswrapper[4895]: I1202 09:23:51.141556 4895 scope.go:117] "RemoveContainer" containerID="b0891d6e2df87d9528c2e39e6da0ad0c4eeb270463ef0e1dee0e8c959775531a" Dec 02 09:23:51 crc kubenswrapper[4895]: E1202 09:23:51.143622 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:24:02 crc kubenswrapper[4895]: I1202 09:24:02.141145 4895 scope.go:117] "RemoveContainer" containerID="b0891d6e2df87d9528c2e39e6da0ad0c4eeb270463ef0e1dee0e8c959775531a" Dec 02 09:24:02 crc kubenswrapper[4895]: E1202 09:24:02.143138 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:24:13 crc kubenswrapper[4895]: I1202 09:24:13.143673 4895 scope.go:117] "RemoveContainer" containerID="b0891d6e2df87d9528c2e39e6da0ad0c4eeb270463ef0e1dee0e8c959775531a" Dec 02 09:24:13 crc kubenswrapper[4895]: E1202 09:24:13.144629 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:24:25 crc kubenswrapper[4895]: I1202 09:24:25.142055 4895 scope.go:117] "RemoveContainer" containerID="b0891d6e2df87d9528c2e39e6da0ad0c4eeb270463ef0e1dee0e8c959775531a" Dec 02 09:24:25 crc kubenswrapper[4895]: E1202 09:24:25.143112 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:24:38 crc kubenswrapper[4895]: I1202 09:24:38.141679 4895 scope.go:117] "RemoveContainer" containerID="b0891d6e2df87d9528c2e39e6da0ad0c4eeb270463ef0e1dee0e8c959775531a" Dec 02 09:24:38 crc kubenswrapper[4895]: E1202 09:24:38.142577 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:24:51 crc kubenswrapper[4895]: I1202 09:24:51.141590 4895 scope.go:117] "RemoveContainer" containerID="b0891d6e2df87d9528c2e39e6da0ad0c4eeb270463ef0e1dee0e8c959775531a" Dec 02 09:24:51 crc kubenswrapper[4895]: E1202 09:24:51.142474 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:25:06 crc kubenswrapper[4895]: I1202 09:25:06.141256 4895 scope.go:117] "RemoveContainer" containerID="b0891d6e2df87d9528c2e39e6da0ad0c4eeb270463ef0e1dee0e8c959775531a" Dec 02 09:25:06 crc kubenswrapper[4895]: E1202 09:25:06.142449 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:25:19 crc kubenswrapper[4895]: I1202 09:25:19.155262 4895 scope.go:117] "RemoveContainer" containerID="b0891d6e2df87d9528c2e39e6da0ad0c4eeb270463ef0e1dee0e8c959775531a" Dec 02 09:25:19 crc kubenswrapper[4895]: E1202 09:25:19.156387 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:25:30 crc kubenswrapper[4895]: I1202 09:25:30.142053 4895 scope.go:117] "RemoveContainer" containerID="b0891d6e2df87d9528c2e39e6da0ad0c4eeb270463ef0e1dee0e8c959775531a" Dec 02 09:25:30 crc kubenswrapper[4895]: E1202 09:25:30.143341 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:25:45 crc kubenswrapper[4895]: I1202 09:25:45.142443 4895 scope.go:117] "RemoveContainer" containerID="b0891d6e2df87d9528c2e39e6da0ad0c4eeb270463ef0e1dee0e8c959775531a" Dec 02 09:25:45 crc kubenswrapper[4895]: E1202 09:25:45.143699 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:25:57 crc kubenswrapper[4895]: I1202 09:25:57.141465 4895 scope.go:117] "RemoveContainer" containerID="b0891d6e2df87d9528c2e39e6da0ad0c4eeb270463ef0e1dee0e8c959775531a" Dec 02 09:25:57 crc kubenswrapper[4895]: E1202 09:25:57.142354 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:25:57 crc kubenswrapper[4895]: I1202 09:25:57.241845 4895 generic.go:334] "Generic (PLEG): container finished" podID="f199c54a-28da-4ea4-a95b-4ab810484ce2" containerID="75dd58c990f5f010b392a5e6bb948c333233189d40fcb78c9cd9f3477ec1a09a" exitCode=0 Dec 02 09:25:57 crc kubenswrapper[4895]: I1202 09:25:57.241892 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-wf5kx" event={"ID":"f199c54a-28da-4ea4-a95b-4ab810484ce2","Type":"ContainerDied","Data":"75dd58c990f5f010b392a5e6bb948c333233189d40fcb78c9cd9f3477ec1a09a"} Dec 02 09:25:58 crc kubenswrapper[4895]: I1202 09:25:58.712674 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-wf5kx" Dec 02 09:25:58 crc kubenswrapper[4895]: I1202 09:25:58.819888 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f199c54a-28da-4ea4-a95b-4ab810484ce2-inventory\") pod \"f199c54a-28da-4ea4-a95b-4ab810484ce2\" (UID: \"f199c54a-28da-4ea4-a95b-4ab810484ce2\") " Dec 02 09:25:58 crc kubenswrapper[4895]: I1202 09:25:58.820020 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-84r6w\" (UniqueName: \"kubernetes.io/projected/f199c54a-28da-4ea4-a95b-4ab810484ce2-kube-api-access-84r6w\") pod \"f199c54a-28da-4ea4-a95b-4ab810484ce2\" (UID: \"f199c54a-28da-4ea4-a95b-4ab810484ce2\") " Dec 02 09:25:58 crc kubenswrapper[4895]: I1202 09:25:58.820080 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f199c54a-28da-4ea4-a95b-4ab810484ce2-ssh-key\") pod \"f199c54a-28da-4ea4-a95b-4ab810484ce2\" (UID: \"f199c54a-28da-4ea4-a95b-4ab810484ce2\") " Dec 02 09:25:58 crc kubenswrapper[4895]: I1202 09:25:58.820223 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f199c54a-28da-4ea4-a95b-4ab810484ce2-ceph\") pod \"f199c54a-28da-4ea4-a95b-4ab810484ce2\" (UID: \"f199c54a-28da-4ea4-a95b-4ab810484ce2\") " Dec 02 09:25:58 crc kubenswrapper[4895]: I1202 09:25:58.820306 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f199c54a-28da-4ea4-a95b-4ab810484ce2-bootstrap-combined-ca-bundle\") pod \"f199c54a-28da-4ea4-a95b-4ab810484ce2\" (UID: \"f199c54a-28da-4ea4-a95b-4ab810484ce2\") " Dec 02 09:25:58 crc kubenswrapper[4895]: I1202 09:25:58.825843 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/f199c54a-28da-4ea4-a95b-4ab810484ce2-ceph" (OuterVolumeSpecName: "ceph") pod "f199c54a-28da-4ea4-a95b-4ab810484ce2" (UID: "f199c54a-28da-4ea4-a95b-4ab810484ce2"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:25:58 crc kubenswrapper[4895]: I1202 09:25:58.825932 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f199c54a-28da-4ea4-a95b-4ab810484ce2-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "f199c54a-28da-4ea4-a95b-4ab810484ce2" (UID: "f199c54a-28da-4ea4-a95b-4ab810484ce2"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:25:58 crc kubenswrapper[4895]: I1202 09:25:58.826035 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f199c54a-28da-4ea4-a95b-4ab810484ce2-kube-api-access-84r6w" (OuterVolumeSpecName: "kube-api-access-84r6w") pod "f199c54a-28da-4ea4-a95b-4ab810484ce2" (UID: "f199c54a-28da-4ea4-a95b-4ab810484ce2"). InnerVolumeSpecName "kube-api-access-84r6w". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:25:58 crc kubenswrapper[4895]: I1202 09:25:58.853187 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f199c54a-28da-4ea4-a95b-4ab810484ce2-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "f199c54a-28da-4ea4-a95b-4ab810484ce2" (UID: "f199c54a-28da-4ea4-a95b-4ab810484ce2"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:25:58 crc kubenswrapper[4895]: I1202 09:25:58.854053 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f199c54a-28da-4ea4-a95b-4ab810484ce2-inventory" (OuterVolumeSpecName: "inventory") pod "f199c54a-28da-4ea4-a95b-4ab810484ce2" (UID: "f199c54a-28da-4ea4-a95b-4ab810484ce2"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:25:58 crc kubenswrapper[4895]: I1202 09:25:58.922255 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-84r6w\" (UniqueName: \"kubernetes.io/projected/f199c54a-28da-4ea4-a95b-4ab810484ce2-kube-api-access-84r6w\") on node \"crc\" DevicePath \"\"" Dec 02 09:25:58 crc kubenswrapper[4895]: I1202 09:25:58.922288 4895 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f199c54a-28da-4ea4-a95b-4ab810484ce2-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 02 09:25:58 crc kubenswrapper[4895]: I1202 09:25:58.922301 4895 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f199c54a-28da-4ea4-a95b-4ab810484ce2-ceph\") on node \"crc\" DevicePath \"\"" Dec 02 09:25:58 crc kubenswrapper[4895]: I1202 09:25:58.922310 4895 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f199c54a-28da-4ea4-a95b-4ab810484ce2-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 09:25:58 crc kubenswrapper[4895]: I1202 09:25:58.922318 4895 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f199c54a-28da-4ea4-a95b-4ab810484ce2-inventory\") on node \"crc\" DevicePath \"\"" Dec 02 09:25:59 crc kubenswrapper[4895]: I1202 09:25:59.263917 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-wf5kx" event={"ID":"f199c54a-28da-4ea4-a95b-4ab810484ce2","Type":"ContainerDied","Data":"5492da27179340a549a6ace09cd8a79ba00971f9887ccac1dee7d7670ccf896e"} Dec 02 09:25:59 crc kubenswrapper[4895]: I1202 09:25:59.263964 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5492da27179340a549a6ace09cd8a79ba00971f9887ccac1dee7d7670ccf896e" Dec 02 09:25:59 crc kubenswrapper[4895]: I1202 09:25:59.264017 4895 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-wf5kx" Dec 02 09:25:59 crc kubenswrapper[4895]: I1202 09:25:59.348787 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-openstack-openstack-cell1-pklz4"] Dec 02 09:25:59 crc kubenswrapper[4895]: E1202 09:25:59.349690 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6aced989-7854-40f8-b59c-b0978c21d3f2" containerName="registry-server" Dec 02 09:25:59 crc kubenswrapper[4895]: I1202 09:25:59.349716 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="6aced989-7854-40f8-b59c-b0978c21d3f2" containerName="registry-server" Dec 02 09:25:59 crc kubenswrapper[4895]: E1202 09:25:59.349732 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6aced989-7854-40f8-b59c-b0978c21d3f2" containerName="extract-utilities" Dec 02 09:25:59 crc kubenswrapper[4895]: I1202 09:25:59.349852 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="6aced989-7854-40f8-b59c-b0978c21d3f2" containerName="extract-utilities" Dec 02 09:25:59 crc kubenswrapper[4895]: E1202 09:25:59.349881 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f34657a-6e56-4339-b216-62324cc3a035" containerName="extract-utilities" Dec 02 09:25:59 crc kubenswrapper[4895]: I1202 09:25:59.349892 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f34657a-6e56-4339-b216-62324cc3a035" containerName="extract-utilities" Dec 02 09:25:59 crc kubenswrapper[4895]: E1202 09:25:59.349918 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6aced989-7854-40f8-b59c-b0978c21d3f2" containerName="extract-content" Dec 02 09:25:59 crc kubenswrapper[4895]: I1202 09:25:59.349927 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="6aced989-7854-40f8-b59c-b0978c21d3f2" containerName="extract-content" Dec 02 09:25:59 crc kubenswrapper[4895]: E1202 09:25:59.349938 4895 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="9f34657a-6e56-4339-b216-62324cc3a035" containerName="extract-content" Dec 02 09:25:59 crc kubenswrapper[4895]: I1202 09:25:59.349947 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f34657a-6e56-4339-b216-62324cc3a035" containerName="extract-content" Dec 02 09:25:59 crc kubenswrapper[4895]: E1202 09:25:59.349979 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f34657a-6e56-4339-b216-62324cc3a035" containerName="registry-server" Dec 02 09:25:59 crc kubenswrapper[4895]: I1202 09:25:59.349988 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f34657a-6e56-4339-b216-62324cc3a035" containerName="registry-server" Dec 02 09:25:59 crc kubenswrapper[4895]: E1202 09:25:59.350010 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f199c54a-28da-4ea4-a95b-4ab810484ce2" containerName="bootstrap-openstack-openstack-cell1" Dec 02 09:25:59 crc kubenswrapper[4895]: I1202 09:25:59.350018 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="f199c54a-28da-4ea4-a95b-4ab810484ce2" containerName="bootstrap-openstack-openstack-cell1" Dec 02 09:25:59 crc kubenswrapper[4895]: I1202 09:25:59.350288 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="6aced989-7854-40f8-b59c-b0978c21d3f2" containerName="registry-server" Dec 02 09:25:59 crc kubenswrapper[4895]: I1202 09:25:59.350319 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f34657a-6e56-4339-b216-62324cc3a035" containerName="registry-server" Dec 02 09:25:59 crc kubenswrapper[4895]: I1202 09:25:59.350346 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="f199c54a-28da-4ea4-a95b-4ab810484ce2" containerName="bootstrap-openstack-openstack-cell1" Dec 02 09:25:59 crc kubenswrapper[4895]: I1202 09:25:59.351302 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-pklz4" Dec 02 09:25:59 crc kubenswrapper[4895]: I1202 09:25:59.354844 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 02 09:25:59 crc kubenswrapper[4895]: I1202 09:25:59.354910 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-brvc6" Dec 02 09:25:59 crc kubenswrapper[4895]: I1202 09:25:59.354865 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 02 09:25:59 crc kubenswrapper[4895]: I1202 09:25:59.355041 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 02 09:25:59 crc kubenswrapper[4895]: I1202 09:25:59.360950 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-openstack-openstack-cell1-pklz4"] Dec 02 09:25:59 crc kubenswrapper[4895]: I1202 09:25:59.433437 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a4af478f-ce76-4ed3-9adc-93d1ae521565-ceph\") pod \"download-cache-openstack-openstack-cell1-pklz4\" (UID: \"a4af478f-ce76-4ed3-9adc-93d1ae521565\") " pod="openstack/download-cache-openstack-openstack-cell1-pklz4" Dec 02 09:25:59 crc kubenswrapper[4895]: I1202 09:25:59.433900 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8k6l\" (UniqueName: \"kubernetes.io/projected/a4af478f-ce76-4ed3-9adc-93d1ae521565-kube-api-access-l8k6l\") pod \"download-cache-openstack-openstack-cell1-pklz4\" (UID: \"a4af478f-ce76-4ed3-9adc-93d1ae521565\") " pod="openstack/download-cache-openstack-openstack-cell1-pklz4" Dec 02 09:25:59 crc kubenswrapper[4895]: I1202 09:25:59.433949 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a4af478f-ce76-4ed3-9adc-93d1ae521565-inventory\") pod \"download-cache-openstack-openstack-cell1-pklz4\" (UID: \"a4af478f-ce76-4ed3-9adc-93d1ae521565\") " pod="openstack/download-cache-openstack-openstack-cell1-pklz4" Dec 02 09:25:59 crc kubenswrapper[4895]: I1202 09:25:59.434116 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a4af478f-ce76-4ed3-9adc-93d1ae521565-ssh-key\") pod \"download-cache-openstack-openstack-cell1-pklz4\" (UID: \"a4af478f-ce76-4ed3-9adc-93d1ae521565\") " pod="openstack/download-cache-openstack-openstack-cell1-pklz4" Dec 02 09:25:59 crc kubenswrapper[4895]: I1202 09:25:59.536316 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8k6l\" (UniqueName: \"kubernetes.io/projected/a4af478f-ce76-4ed3-9adc-93d1ae521565-kube-api-access-l8k6l\") pod \"download-cache-openstack-openstack-cell1-pklz4\" (UID: \"a4af478f-ce76-4ed3-9adc-93d1ae521565\") " pod="openstack/download-cache-openstack-openstack-cell1-pklz4" Dec 02 09:25:59 crc kubenswrapper[4895]: I1202 09:25:59.536404 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a4af478f-ce76-4ed3-9adc-93d1ae521565-inventory\") pod \"download-cache-openstack-openstack-cell1-pklz4\" (UID: \"a4af478f-ce76-4ed3-9adc-93d1ae521565\") " pod="openstack/download-cache-openstack-openstack-cell1-pklz4" Dec 02 09:25:59 crc kubenswrapper[4895]: I1202 09:25:59.536471 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a4af478f-ce76-4ed3-9adc-93d1ae521565-ssh-key\") pod \"download-cache-openstack-openstack-cell1-pklz4\" (UID: \"a4af478f-ce76-4ed3-9adc-93d1ae521565\") " pod="openstack/download-cache-openstack-openstack-cell1-pklz4" Dec 02 09:25:59 crc 
kubenswrapper[4895]: I1202 09:25:59.536561 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a4af478f-ce76-4ed3-9adc-93d1ae521565-ceph\") pod \"download-cache-openstack-openstack-cell1-pklz4\" (UID: \"a4af478f-ce76-4ed3-9adc-93d1ae521565\") " pod="openstack/download-cache-openstack-openstack-cell1-pklz4" Dec 02 09:25:59 crc kubenswrapper[4895]: I1202 09:25:59.541716 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a4af478f-ce76-4ed3-9adc-93d1ae521565-ssh-key\") pod \"download-cache-openstack-openstack-cell1-pklz4\" (UID: \"a4af478f-ce76-4ed3-9adc-93d1ae521565\") " pod="openstack/download-cache-openstack-openstack-cell1-pklz4" Dec 02 09:25:59 crc kubenswrapper[4895]: I1202 09:25:59.542032 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a4af478f-ce76-4ed3-9adc-93d1ae521565-ceph\") pod \"download-cache-openstack-openstack-cell1-pklz4\" (UID: \"a4af478f-ce76-4ed3-9adc-93d1ae521565\") " pod="openstack/download-cache-openstack-openstack-cell1-pklz4" Dec 02 09:25:59 crc kubenswrapper[4895]: I1202 09:25:59.542282 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a4af478f-ce76-4ed3-9adc-93d1ae521565-inventory\") pod \"download-cache-openstack-openstack-cell1-pklz4\" (UID: \"a4af478f-ce76-4ed3-9adc-93d1ae521565\") " pod="openstack/download-cache-openstack-openstack-cell1-pklz4" Dec 02 09:25:59 crc kubenswrapper[4895]: I1202 09:25:59.552659 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8k6l\" (UniqueName: \"kubernetes.io/projected/a4af478f-ce76-4ed3-9adc-93d1ae521565-kube-api-access-l8k6l\") pod \"download-cache-openstack-openstack-cell1-pklz4\" (UID: \"a4af478f-ce76-4ed3-9adc-93d1ae521565\") " 
pod="openstack/download-cache-openstack-openstack-cell1-pklz4" Dec 02 09:25:59 crc kubenswrapper[4895]: I1202 09:25:59.674235 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-pklz4" Dec 02 09:26:00 crc kubenswrapper[4895]: I1202 09:26:00.206381 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-openstack-openstack-cell1-pklz4"] Dec 02 09:26:00 crc kubenswrapper[4895]: I1202 09:26:00.277646 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-pklz4" event={"ID":"a4af478f-ce76-4ed3-9adc-93d1ae521565","Type":"ContainerStarted","Data":"61b50871c30c0f385a229d1e082930954bce6a7696f07f4923fe43f121c4cbfe"} Dec 02 09:26:01 crc kubenswrapper[4895]: I1202 09:26:01.298735 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-pklz4" event={"ID":"a4af478f-ce76-4ed3-9adc-93d1ae521565","Type":"ContainerStarted","Data":"fe15b81385bc63be85f8a9a3eebdd711e0c16494aaad85446f6381ea9fac8ce2"} Dec 02 09:26:10 crc kubenswrapper[4895]: I1202 09:26:10.140984 4895 scope.go:117] "RemoveContainer" containerID="b0891d6e2df87d9528c2e39e6da0ad0c4eeb270463ef0e1dee0e8c959775531a" Dec 02 09:26:10 crc kubenswrapper[4895]: I1202 09:26:10.392465 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" event={"ID":"0468d2d1-a975-45a6-af9f-47adc432fab0","Type":"ContainerStarted","Data":"3cd7773ccf4b21f0075e975c1552444f3a74a56b8e22a60f1d2dd8aa7481d21b"} Dec 02 09:26:10 crc kubenswrapper[4895]: I1202 09:26:10.415022 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-openstack-openstack-cell1-pklz4" podStartSLOduration=11.273794443 podStartE2EDuration="11.415003349s" podCreationTimestamp="2025-12-02 09:25:59 +0000 UTC" firstStartedPulling="2025-12-02 
09:26:00.209845075 +0000 UTC m=+7371.380704688" lastFinishedPulling="2025-12-02 09:26:00.351053981 +0000 UTC m=+7371.521913594" observedRunningTime="2025-12-02 09:26:01.315359195 +0000 UTC m=+7372.486218798" watchObservedRunningTime="2025-12-02 09:26:10.415003349 +0000 UTC m=+7381.585862962" Dec 02 09:27:32 crc kubenswrapper[4895]: I1202 09:27:32.881790 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-dsdxm"] Dec 02 09:27:32 crc kubenswrapper[4895]: I1202 09:27:32.885614 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dsdxm" Dec 02 09:27:32 crc kubenswrapper[4895]: I1202 09:27:32.899043 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dsdxm"] Dec 02 09:27:32 crc kubenswrapper[4895]: I1202 09:27:32.986059 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0542d319-ca0e-4530-9161-32009bf1d8bf-catalog-content\") pod \"certified-operators-dsdxm\" (UID: \"0542d319-ca0e-4530-9161-32009bf1d8bf\") " pod="openshift-marketplace/certified-operators-dsdxm" Dec 02 09:27:32 crc kubenswrapper[4895]: I1202 09:27:32.986203 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7bb7\" (UniqueName: \"kubernetes.io/projected/0542d319-ca0e-4530-9161-32009bf1d8bf-kube-api-access-s7bb7\") pod \"certified-operators-dsdxm\" (UID: \"0542d319-ca0e-4530-9161-32009bf1d8bf\") " pod="openshift-marketplace/certified-operators-dsdxm" Dec 02 09:27:32 crc kubenswrapper[4895]: I1202 09:27:32.986244 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0542d319-ca0e-4530-9161-32009bf1d8bf-utilities\") pod \"certified-operators-dsdxm\" (UID: 
\"0542d319-ca0e-4530-9161-32009bf1d8bf\") " pod="openshift-marketplace/certified-operators-dsdxm" Dec 02 09:27:33 crc kubenswrapper[4895]: I1202 09:27:33.088299 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7bb7\" (UniqueName: \"kubernetes.io/projected/0542d319-ca0e-4530-9161-32009bf1d8bf-kube-api-access-s7bb7\") pod \"certified-operators-dsdxm\" (UID: \"0542d319-ca0e-4530-9161-32009bf1d8bf\") " pod="openshift-marketplace/certified-operators-dsdxm" Dec 02 09:27:33 crc kubenswrapper[4895]: I1202 09:27:33.088386 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0542d319-ca0e-4530-9161-32009bf1d8bf-utilities\") pod \"certified-operators-dsdxm\" (UID: \"0542d319-ca0e-4530-9161-32009bf1d8bf\") " pod="openshift-marketplace/certified-operators-dsdxm" Dec 02 09:27:33 crc kubenswrapper[4895]: I1202 09:27:33.088564 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0542d319-ca0e-4530-9161-32009bf1d8bf-catalog-content\") pod \"certified-operators-dsdxm\" (UID: \"0542d319-ca0e-4530-9161-32009bf1d8bf\") " pod="openshift-marketplace/certified-operators-dsdxm" Dec 02 09:27:33 crc kubenswrapper[4895]: I1202 09:27:33.088945 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0542d319-ca0e-4530-9161-32009bf1d8bf-utilities\") pod \"certified-operators-dsdxm\" (UID: \"0542d319-ca0e-4530-9161-32009bf1d8bf\") " pod="openshift-marketplace/certified-operators-dsdxm" Dec 02 09:27:33 crc kubenswrapper[4895]: I1202 09:27:33.089037 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0542d319-ca0e-4530-9161-32009bf1d8bf-catalog-content\") pod \"certified-operators-dsdxm\" (UID: \"0542d319-ca0e-4530-9161-32009bf1d8bf\") 
" pod="openshift-marketplace/certified-operators-dsdxm" Dec 02 09:27:33 crc kubenswrapper[4895]: I1202 09:27:33.116073 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7bb7\" (UniqueName: \"kubernetes.io/projected/0542d319-ca0e-4530-9161-32009bf1d8bf-kube-api-access-s7bb7\") pod \"certified-operators-dsdxm\" (UID: \"0542d319-ca0e-4530-9161-32009bf1d8bf\") " pod="openshift-marketplace/certified-operators-dsdxm" Dec 02 09:27:33 crc kubenswrapper[4895]: I1202 09:27:33.219754 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dsdxm" Dec 02 09:27:34 crc kubenswrapper[4895]: I1202 09:27:34.301030 4895 generic.go:334] "Generic (PLEG): container finished" podID="a4af478f-ce76-4ed3-9adc-93d1ae521565" containerID="fe15b81385bc63be85f8a9a3eebdd711e0c16494aaad85446f6381ea9fac8ce2" exitCode=0 Dec 02 09:27:34 crc kubenswrapper[4895]: I1202 09:27:34.301092 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-pklz4" event={"ID":"a4af478f-ce76-4ed3-9adc-93d1ae521565","Type":"ContainerDied","Data":"fe15b81385bc63be85f8a9a3eebdd711e0c16494aaad85446f6381ea9fac8ce2"} Dec 02 09:27:34 crc kubenswrapper[4895]: I1202 09:27:34.415467 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dsdxm"] Dec 02 09:27:35 crc kubenswrapper[4895]: I1202 09:27:35.311636 4895 generic.go:334] "Generic (PLEG): container finished" podID="0542d319-ca0e-4530-9161-32009bf1d8bf" containerID="243a54802e42876836630ec78cc296d5e21d5e254be3fb7fc03f4cce56c4d0fa" exitCode=0 Dec 02 09:27:35 crc kubenswrapper[4895]: I1202 09:27:35.311708 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dsdxm" event={"ID":"0542d319-ca0e-4530-9161-32009bf1d8bf","Type":"ContainerDied","Data":"243a54802e42876836630ec78cc296d5e21d5e254be3fb7fc03f4cce56c4d0fa"} Dec 02 
09:27:35 crc kubenswrapper[4895]: I1202 09:27:35.312182 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dsdxm" event={"ID":"0542d319-ca0e-4530-9161-32009bf1d8bf","Type":"ContainerStarted","Data":"7dfafe1ebdfce746f9bdf3fbe26c230271409aee51710d6bfcc8270a9471437d"} Dec 02 09:27:35 crc kubenswrapper[4895]: I1202 09:27:35.793359 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-pklz4" Dec 02 09:27:35 crc kubenswrapper[4895]: I1202 09:27:35.953289 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l8k6l\" (UniqueName: \"kubernetes.io/projected/a4af478f-ce76-4ed3-9adc-93d1ae521565-kube-api-access-l8k6l\") pod \"a4af478f-ce76-4ed3-9adc-93d1ae521565\" (UID: \"a4af478f-ce76-4ed3-9adc-93d1ae521565\") " Dec 02 09:27:35 crc kubenswrapper[4895]: I1202 09:27:35.953488 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a4af478f-ce76-4ed3-9adc-93d1ae521565-inventory\") pod \"a4af478f-ce76-4ed3-9adc-93d1ae521565\" (UID: \"a4af478f-ce76-4ed3-9adc-93d1ae521565\") " Dec 02 09:27:35 crc kubenswrapper[4895]: I1202 09:27:35.953544 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a4af478f-ce76-4ed3-9adc-93d1ae521565-ssh-key\") pod \"a4af478f-ce76-4ed3-9adc-93d1ae521565\" (UID: \"a4af478f-ce76-4ed3-9adc-93d1ae521565\") " Dec 02 09:27:35 crc kubenswrapper[4895]: I1202 09:27:35.953584 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a4af478f-ce76-4ed3-9adc-93d1ae521565-ceph\") pod \"a4af478f-ce76-4ed3-9adc-93d1ae521565\" (UID: \"a4af478f-ce76-4ed3-9adc-93d1ae521565\") " Dec 02 09:27:35 crc kubenswrapper[4895]: I1202 09:27:35.960680 4895 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4af478f-ce76-4ed3-9adc-93d1ae521565-ceph" (OuterVolumeSpecName: "ceph") pod "a4af478f-ce76-4ed3-9adc-93d1ae521565" (UID: "a4af478f-ce76-4ed3-9adc-93d1ae521565"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:27:35 crc kubenswrapper[4895]: I1202 09:27:35.961081 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4af478f-ce76-4ed3-9adc-93d1ae521565-kube-api-access-l8k6l" (OuterVolumeSpecName: "kube-api-access-l8k6l") pod "a4af478f-ce76-4ed3-9adc-93d1ae521565" (UID: "a4af478f-ce76-4ed3-9adc-93d1ae521565"). InnerVolumeSpecName "kube-api-access-l8k6l". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:27:35 crc kubenswrapper[4895]: I1202 09:27:35.985933 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4af478f-ce76-4ed3-9adc-93d1ae521565-inventory" (OuterVolumeSpecName: "inventory") pod "a4af478f-ce76-4ed3-9adc-93d1ae521565" (UID: "a4af478f-ce76-4ed3-9adc-93d1ae521565"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:27:35 crc kubenswrapper[4895]: I1202 09:27:35.987621 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4af478f-ce76-4ed3-9adc-93d1ae521565-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "a4af478f-ce76-4ed3-9adc-93d1ae521565" (UID: "a4af478f-ce76-4ed3-9adc-93d1ae521565"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:27:36 crc kubenswrapper[4895]: I1202 09:27:36.056264 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l8k6l\" (UniqueName: \"kubernetes.io/projected/a4af478f-ce76-4ed3-9adc-93d1ae521565-kube-api-access-l8k6l\") on node \"crc\" DevicePath \"\"" Dec 02 09:27:36 crc kubenswrapper[4895]: I1202 09:27:36.056313 4895 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a4af478f-ce76-4ed3-9adc-93d1ae521565-inventory\") on node \"crc\" DevicePath \"\"" Dec 02 09:27:36 crc kubenswrapper[4895]: I1202 09:27:36.056323 4895 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a4af478f-ce76-4ed3-9adc-93d1ae521565-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 02 09:27:36 crc kubenswrapper[4895]: I1202 09:27:36.056332 4895 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a4af478f-ce76-4ed3-9adc-93d1ae521565-ceph\") on node \"crc\" DevicePath \"\"" Dec 02 09:27:36 crc kubenswrapper[4895]: I1202 09:27:36.324684 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-pklz4" event={"ID":"a4af478f-ce76-4ed3-9adc-93d1ae521565","Type":"ContainerDied","Data":"61b50871c30c0f385a229d1e082930954bce6a7696f07f4923fe43f121c4cbfe"} Dec 02 09:27:36 crc kubenswrapper[4895]: I1202 09:27:36.325017 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="61b50871c30c0f385a229d1e082930954bce6a7696f07f4923fe43f121c4cbfe" Dec 02 09:27:36 crc kubenswrapper[4895]: I1202 09:27:36.324749 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-pklz4" Dec 02 09:27:36 crc kubenswrapper[4895]: I1202 09:27:36.406691 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-openstack-openstack-cell1-vsmxx"] Dec 02 09:27:36 crc kubenswrapper[4895]: E1202 09:27:36.407209 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4af478f-ce76-4ed3-9adc-93d1ae521565" containerName="download-cache-openstack-openstack-cell1" Dec 02 09:27:36 crc kubenswrapper[4895]: I1202 09:27:36.407224 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4af478f-ce76-4ed3-9adc-93d1ae521565" containerName="download-cache-openstack-openstack-cell1" Dec 02 09:27:36 crc kubenswrapper[4895]: I1202 09:27:36.407443 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4af478f-ce76-4ed3-9adc-93d1ae521565" containerName="download-cache-openstack-openstack-cell1" Dec 02 09:27:36 crc kubenswrapper[4895]: I1202 09:27:36.408238 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-vsmxx" Dec 02 09:27:36 crc kubenswrapper[4895]: I1202 09:27:36.411243 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-brvc6" Dec 02 09:27:36 crc kubenswrapper[4895]: I1202 09:27:36.411511 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 02 09:27:36 crc kubenswrapper[4895]: I1202 09:27:36.412475 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 02 09:27:36 crc kubenswrapper[4895]: I1202 09:27:36.412600 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 02 09:27:36 crc kubenswrapper[4895]: I1202 09:27:36.422408 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-openstack-openstack-cell1-vsmxx"] Dec 02 09:27:36 crc kubenswrapper[4895]: I1202 09:27:36.580638 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2a1105f6-57ce-4e6d-a62b-1f1dbb777da8-ssh-key\") pod \"configure-network-openstack-openstack-cell1-vsmxx\" (UID: \"2a1105f6-57ce-4e6d-a62b-1f1dbb777da8\") " pod="openstack/configure-network-openstack-openstack-cell1-vsmxx" Dec 02 09:27:36 crc kubenswrapper[4895]: I1202 09:27:36.581597 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2a1105f6-57ce-4e6d-a62b-1f1dbb777da8-inventory\") pod \"configure-network-openstack-openstack-cell1-vsmxx\" (UID: \"2a1105f6-57ce-4e6d-a62b-1f1dbb777da8\") " pod="openstack/configure-network-openstack-openstack-cell1-vsmxx" Dec 02 09:27:36 crc kubenswrapper[4895]: I1202 09:27:36.581764 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2a1105f6-57ce-4e6d-a62b-1f1dbb777da8-ceph\") pod \"configure-network-openstack-openstack-cell1-vsmxx\" (UID: \"2a1105f6-57ce-4e6d-a62b-1f1dbb777da8\") " pod="openstack/configure-network-openstack-openstack-cell1-vsmxx" Dec 02 09:27:36 crc kubenswrapper[4895]: I1202 09:27:36.581976 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lkmh7\" (UniqueName: \"kubernetes.io/projected/2a1105f6-57ce-4e6d-a62b-1f1dbb777da8-kube-api-access-lkmh7\") pod \"configure-network-openstack-openstack-cell1-vsmxx\" (UID: \"2a1105f6-57ce-4e6d-a62b-1f1dbb777da8\") " pod="openstack/configure-network-openstack-openstack-cell1-vsmxx" Dec 02 09:27:36 crc kubenswrapper[4895]: I1202 09:27:36.684213 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lkmh7\" (UniqueName: \"kubernetes.io/projected/2a1105f6-57ce-4e6d-a62b-1f1dbb777da8-kube-api-access-lkmh7\") pod \"configure-network-openstack-openstack-cell1-vsmxx\" (UID: \"2a1105f6-57ce-4e6d-a62b-1f1dbb777da8\") " pod="openstack/configure-network-openstack-openstack-cell1-vsmxx" Dec 02 09:27:36 crc kubenswrapper[4895]: I1202 09:27:36.684349 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2a1105f6-57ce-4e6d-a62b-1f1dbb777da8-ssh-key\") pod \"configure-network-openstack-openstack-cell1-vsmxx\" (UID: \"2a1105f6-57ce-4e6d-a62b-1f1dbb777da8\") " pod="openstack/configure-network-openstack-openstack-cell1-vsmxx" Dec 02 09:27:36 crc kubenswrapper[4895]: I1202 09:27:36.684551 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2a1105f6-57ce-4e6d-a62b-1f1dbb777da8-inventory\") pod \"configure-network-openstack-openstack-cell1-vsmxx\" (UID: \"2a1105f6-57ce-4e6d-a62b-1f1dbb777da8\") " 
pod="openstack/configure-network-openstack-openstack-cell1-vsmxx" Dec 02 09:27:36 crc kubenswrapper[4895]: I1202 09:27:36.684601 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2a1105f6-57ce-4e6d-a62b-1f1dbb777da8-ceph\") pod \"configure-network-openstack-openstack-cell1-vsmxx\" (UID: \"2a1105f6-57ce-4e6d-a62b-1f1dbb777da8\") " pod="openstack/configure-network-openstack-openstack-cell1-vsmxx" Dec 02 09:27:36 crc kubenswrapper[4895]: I1202 09:27:36.690332 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2a1105f6-57ce-4e6d-a62b-1f1dbb777da8-inventory\") pod \"configure-network-openstack-openstack-cell1-vsmxx\" (UID: \"2a1105f6-57ce-4e6d-a62b-1f1dbb777da8\") " pod="openstack/configure-network-openstack-openstack-cell1-vsmxx" Dec 02 09:27:36 crc kubenswrapper[4895]: I1202 09:27:36.690852 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2a1105f6-57ce-4e6d-a62b-1f1dbb777da8-ceph\") pod \"configure-network-openstack-openstack-cell1-vsmxx\" (UID: \"2a1105f6-57ce-4e6d-a62b-1f1dbb777da8\") " pod="openstack/configure-network-openstack-openstack-cell1-vsmxx" Dec 02 09:27:36 crc kubenswrapper[4895]: I1202 09:27:36.691131 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2a1105f6-57ce-4e6d-a62b-1f1dbb777da8-ssh-key\") pod \"configure-network-openstack-openstack-cell1-vsmxx\" (UID: \"2a1105f6-57ce-4e6d-a62b-1f1dbb777da8\") " pod="openstack/configure-network-openstack-openstack-cell1-vsmxx" Dec 02 09:27:36 crc kubenswrapper[4895]: I1202 09:27:36.702387 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lkmh7\" (UniqueName: \"kubernetes.io/projected/2a1105f6-57ce-4e6d-a62b-1f1dbb777da8-kube-api-access-lkmh7\") pod 
\"configure-network-openstack-openstack-cell1-vsmxx\" (UID: \"2a1105f6-57ce-4e6d-a62b-1f1dbb777da8\") " pod="openstack/configure-network-openstack-openstack-cell1-vsmxx" Dec 02 09:27:36 crc kubenswrapper[4895]: I1202 09:27:36.749024 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-vsmxx" Dec 02 09:27:37 crc kubenswrapper[4895]: I1202 09:27:37.325258 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-openstack-openstack-cell1-vsmxx"] Dec 02 09:27:37 crc kubenswrapper[4895]: I1202 09:27:37.341003 4895 generic.go:334] "Generic (PLEG): container finished" podID="0542d319-ca0e-4530-9161-32009bf1d8bf" containerID="ae489fe74357f507116b3badf13151c292351fe25ef90bf6ce74e917eb9381c4" exitCode=0 Dec 02 09:27:37 crc kubenswrapper[4895]: I1202 09:27:37.341127 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dsdxm" event={"ID":"0542d319-ca0e-4530-9161-32009bf1d8bf","Type":"ContainerDied","Data":"ae489fe74357f507116b3badf13151c292351fe25ef90bf6ce74e917eb9381c4"} Dec 02 09:27:37 crc kubenswrapper[4895]: I1202 09:27:37.345538 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-vsmxx" event={"ID":"2a1105f6-57ce-4e6d-a62b-1f1dbb777da8","Type":"ContainerStarted","Data":"92884aecde53b5c7e2f61029f468e560ce426078c2a812ea917d81a14ac8eba2"} Dec 02 09:27:38 crc kubenswrapper[4895]: I1202 09:27:38.361378 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dsdxm" event={"ID":"0542d319-ca0e-4530-9161-32009bf1d8bf","Type":"ContainerStarted","Data":"3d594b8c1a2f95887739221ecc6d0b9bf8ef30123022538b71a8d1e79be8bdf5"} Dec 02 09:27:38 crc kubenswrapper[4895]: I1202 09:27:38.365212 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-vsmxx" 
event={"ID":"2a1105f6-57ce-4e6d-a62b-1f1dbb777da8","Type":"ContainerStarted","Data":"04e553174233d93e642d1548923c6317b008716c0bef0ae968e7a8be9f5fb2cd"} Dec 02 09:27:38 crc kubenswrapper[4895]: I1202 09:27:38.388168 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-dsdxm" podStartSLOduration=3.729774791 podStartE2EDuration="6.388151219s" podCreationTimestamp="2025-12-02 09:27:32 +0000 UTC" firstStartedPulling="2025-12-02 09:27:35.31349213 +0000 UTC m=+7466.484351733" lastFinishedPulling="2025-12-02 09:27:37.971868548 +0000 UTC m=+7469.142728161" observedRunningTime="2025-12-02 09:27:38.388011894 +0000 UTC m=+7469.558871507" watchObservedRunningTime="2025-12-02 09:27:38.388151219 +0000 UTC m=+7469.559010832" Dec 02 09:27:38 crc kubenswrapper[4895]: I1202 09:27:38.415564 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-openstack-openstack-cell1-vsmxx" podStartSLOduration=2.236568419 podStartE2EDuration="2.415541701s" podCreationTimestamp="2025-12-02 09:27:36 +0000 UTC" firstStartedPulling="2025-12-02 09:27:37.325971668 +0000 UTC m=+7468.496831281" lastFinishedPulling="2025-12-02 09:27:37.50494495 +0000 UTC m=+7468.675804563" observedRunningTime="2025-12-02 09:27:38.404684864 +0000 UTC m=+7469.575544487" watchObservedRunningTime="2025-12-02 09:27:38.415541701 +0000 UTC m=+7469.586401314" Dec 02 09:27:43 crc kubenswrapper[4895]: I1202 09:27:43.220125 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-dsdxm" Dec 02 09:27:43 crc kubenswrapper[4895]: I1202 09:27:43.221979 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-dsdxm" Dec 02 09:27:43 crc kubenswrapper[4895]: I1202 09:27:43.274583 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/certified-operators-dsdxm" Dec 02 09:27:43 crc kubenswrapper[4895]: I1202 09:27:43.462872 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-dsdxm" Dec 02 09:27:43 crc kubenswrapper[4895]: I1202 09:27:43.512730 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dsdxm"] Dec 02 09:27:45 crc kubenswrapper[4895]: I1202 09:27:45.434704 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-dsdxm" podUID="0542d319-ca0e-4530-9161-32009bf1d8bf" containerName="registry-server" containerID="cri-o://3d594b8c1a2f95887739221ecc6d0b9bf8ef30123022538b71a8d1e79be8bdf5" gracePeriod=2 Dec 02 09:27:45 crc kubenswrapper[4895]: I1202 09:27:45.963032 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dsdxm" Dec 02 09:27:46 crc kubenswrapper[4895]: I1202 09:27:46.097269 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0542d319-ca0e-4530-9161-32009bf1d8bf-utilities\") pod \"0542d319-ca0e-4530-9161-32009bf1d8bf\" (UID: \"0542d319-ca0e-4530-9161-32009bf1d8bf\") " Dec 02 09:27:46 crc kubenswrapper[4895]: I1202 09:27:46.097470 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0542d319-ca0e-4530-9161-32009bf1d8bf-catalog-content\") pod \"0542d319-ca0e-4530-9161-32009bf1d8bf\" (UID: \"0542d319-ca0e-4530-9161-32009bf1d8bf\") " Dec 02 09:27:46 crc kubenswrapper[4895]: I1202 09:27:46.097584 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s7bb7\" (UniqueName: \"kubernetes.io/projected/0542d319-ca0e-4530-9161-32009bf1d8bf-kube-api-access-s7bb7\") pod 
\"0542d319-ca0e-4530-9161-32009bf1d8bf\" (UID: \"0542d319-ca0e-4530-9161-32009bf1d8bf\") " Dec 02 09:27:46 crc kubenswrapper[4895]: I1202 09:27:46.098363 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0542d319-ca0e-4530-9161-32009bf1d8bf-utilities" (OuterVolumeSpecName: "utilities") pod "0542d319-ca0e-4530-9161-32009bf1d8bf" (UID: "0542d319-ca0e-4530-9161-32009bf1d8bf"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:27:46 crc kubenswrapper[4895]: I1202 09:27:46.103544 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0542d319-ca0e-4530-9161-32009bf1d8bf-kube-api-access-s7bb7" (OuterVolumeSpecName: "kube-api-access-s7bb7") pod "0542d319-ca0e-4530-9161-32009bf1d8bf" (UID: "0542d319-ca0e-4530-9161-32009bf1d8bf"). InnerVolumeSpecName "kube-api-access-s7bb7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:27:46 crc kubenswrapper[4895]: I1202 09:27:46.144129 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0542d319-ca0e-4530-9161-32009bf1d8bf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0542d319-ca0e-4530-9161-32009bf1d8bf" (UID: "0542d319-ca0e-4530-9161-32009bf1d8bf"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:27:46 crc kubenswrapper[4895]: I1202 09:27:46.200936 4895 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0542d319-ca0e-4530-9161-32009bf1d8bf-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 09:27:46 crc kubenswrapper[4895]: I1202 09:27:46.200975 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s7bb7\" (UniqueName: \"kubernetes.io/projected/0542d319-ca0e-4530-9161-32009bf1d8bf-kube-api-access-s7bb7\") on node \"crc\" DevicePath \"\"" Dec 02 09:27:46 crc kubenswrapper[4895]: I1202 09:27:46.200989 4895 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0542d319-ca0e-4530-9161-32009bf1d8bf-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 09:27:46 crc kubenswrapper[4895]: I1202 09:27:46.456940 4895 generic.go:334] "Generic (PLEG): container finished" podID="0542d319-ca0e-4530-9161-32009bf1d8bf" containerID="3d594b8c1a2f95887739221ecc6d0b9bf8ef30123022538b71a8d1e79be8bdf5" exitCode=0 Dec 02 09:27:46 crc kubenswrapper[4895]: I1202 09:27:46.457006 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dsdxm" event={"ID":"0542d319-ca0e-4530-9161-32009bf1d8bf","Type":"ContainerDied","Data":"3d594b8c1a2f95887739221ecc6d0b9bf8ef30123022538b71a8d1e79be8bdf5"} Dec 02 09:27:46 crc kubenswrapper[4895]: I1202 09:27:46.457329 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dsdxm" event={"ID":"0542d319-ca0e-4530-9161-32009bf1d8bf","Type":"ContainerDied","Data":"7dfafe1ebdfce746f9bdf3fbe26c230271409aee51710d6bfcc8270a9471437d"} Dec 02 09:27:46 crc kubenswrapper[4895]: I1202 09:27:46.457353 4895 scope.go:117] "RemoveContainer" containerID="3d594b8c1a2f95887739221ecc6d0b9bf8ef30123022538b71a8d1e79be8bdf5" Dec 02 09:27:46 crc kubenswrapper[4895]: I1202 
09:27:46.457033 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dsdxm" Dec 02 09:27:46 crc kubenswrapper[4895]: I1202 09:27:46.496660 4895 scope.go:117] "RemoveContainer" containerID="ae489fe74357f507116b3badf13151c292351fe25ef90bf6ce74e917eb9381c4" Dec 02 09:27:46 crc kubenswrapper[4895]: I1202 09:27:46.498029 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dsdxm"] Dec 02 09:27:46 crc kubenswrapper[4895]: I1202 09:27:46.511685 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-dsdxm"] Dec 02 09:27:46 crc kubenswrapper[4895]: I1202 09:27:46.535948 4895 scope.go:117] "RemoveContainer" containerID="243a54802e42876836630ec78cc296d5e21d5e254be3fb7fc03f4cce56c4d0fa" Dec 02 09:27:46 crc kubenswrapper[4895]: I1202 09:27:46.568024 4895 scope.go:117] "RemoveContainer" containerID="3d594b8c1a2f95887739221ecc6d0b9bf8ef30123022538b71a8d1e79be8bdf5" Dec 02 09:27:46 crc kubenswrapper[4895]: E1202 09:27:46.568487 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d594b8c1a2f95887739221ecc6d0b9bf8ef30123022538b71a8d1e79be8bdf5\": container with ID starting with 3d594b8c1a2f95887739221ecc6d0b9bf8ef30123022538b71a8d1e79be8bdf5 not found: ID does not exist" containerID="3d594b8c1a2f95887739221ecc6d0b9bf8ef30123022538b71a8d1e79be8bdf5" Dec 02 09:27:46 crc kubenswrapper[4895]: I1202 09:27:46.568518 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d594b8c1a2f95887739221ecc6d0b9bf8ef30123022538b71a8d1e79be8bdf5"} err="failed to get container status \"3d594b8c1a2f95887739221ecc6d0b9bf8ef30123022538b71a8d1e79be8bdf5\": rpc error: code = NotFound desc = could not find container \"3d594b8c1a2f95887739221ecc6d0b9bf8ef30123022538b71a8d1e79be8bdf5\": container with ID starting with 
3d594b8c1a2f95887739221ecc6d0b9bf8ef30123022538b71a8d1e79be8bdf5 not found: ID does not exist" Dec 02 09:27:46 crc kubenswrapper[4895]: I1202 09:27:46.568540 4895 scope.go:117] "RemoveContainer" containerID="ae489fe74357f507116b3badf13151c292351fe25ef90bf6ce74e917eb9381c4" Dec 02 09:27:46 crc kubenswrapper[4895]: E1202 09:27:46.568878 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae489fe74357f507116b3badf13151c292351fe25ef90bf6ce74e917eb9381c4\": container with ID starting with ae489fe74357f507116b3badf13151c292351fe25ef90bf6ce74e917eb9381c4 not found: ID does not exist" containerID="ae489fe74357f507116b3badf13151c292351fe25ef90bf6ce74e917eb9381c4" Dec 02 09:27:46 crc kubenswrapper[4895]: I1202 09:27:46.568897 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae489fe74357f507116b3badf13151c292351fe25ef90bf6ce74e917eb9381c4"} err="failed to get container status \"ae489fe74357f507116b3badf13151c292351fe25ef90bf6ce74e917eb9381c4\": rpc error: code = NotFound desc = could not find container \"ae489fe74357f507116b3badf13151c292351fe25ef90bf6ce74e917eb9381c4\": container with ID starting with ae489fe74357f507116b3badf13151c292351fe25ef90bf6ce74e917eb9381c4 not found: ID does not exist" Dec 02 09:27:46 crc kubenswrapper[4895]: I1202 09:27:46.568910 4895 scope.go:117] "RemoveContainer" containerID="243a54802e42876836630ec78cc296d5e21d5e254be3fb7fc03f4cce56c4d0fa" Dec 02 09:27:46 crc kubenswrapper[4895]: E1202 09:27:46.569149 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"243a54802e42876836630ec78cc296d5e21d5e254be3fb7fc03f4cce56c4d0fa\": container with ID starting with 243a54802e42876836630ec78cc296d5e21d5e254be3fb7fc03f4cce56c4d0fa not found: ID does not exist" containerID="243a54802e42876836630ec78cc296d5e21d5e254be3fb7fc03f4cce56c4d0fa" Dec 02 09:27:46 crc 
kubenswrapper[4895]: I1202 09:27:46.569166 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"243a54802e42876836630ec78cc296d5e21d5e254be3fb7fc03f4cce56c4d0fa"} err="failed to get container status \"243a54802e42876836630ec78cc296d5e21d5e254be3fb7fc03f4cce56c4d0fa\": rpc error: code = NotFound desc = could not find container \"243a54802e42876836630ec78cc296d5e21d5e254be3fb7fc03f4cce56c4d0fa\": container with ID starting with 243a54802e42876836630ec78cc296d5e21d5e254be3fb7fc03f4cce56c4d0fa not found: ID does not exist" Dec 02 09:27:47 crc kubenswrapper[4895]: I1202 09:27:47.172267 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0542d319-ca0e-4530-9161-32009bf1d8bf" path="/var/lib/kubelet/pods/0542d319-ca0e-4530-9161-32009bf1d8bf/volumes" Dec 02 09:28:35 crc kubenswrapper[4895]: I1202 09:28:35.473960 4895 patch_prober.go:28] interesting pod/machine-config-daemon-wfcg7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 09:28:35 crc kubenswrapper[4895]: I1202 09:28:35.474447 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 09:28:55 crc kubenswrapper[4895]: I1202 09:28:55.216353 4895 generic.go:334] "Generic (PLEG): container finished" podID="2a1105f6-57ce-4e6d-a62b-1f1dbb777da8" containerID="04e553174233d93e642d1548923c6317b008716c0bef0ae968e7a8be9f5fb2cd" exitCode=0 Dec 02 09:28:55 crc kubenswrapper[4895]: I1202 09:28:55.216463 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/configure-network-openstack-openstack-cell1-vsmxx" event={"ID":"2a1105f6-57ce-4e6d-a62b-1f1dbb777da8","Type":"ContainerDied","Data":"04e553174233d93e642d1548923c6317b008716c0bef0ae968e7a8be9f5fb2cd"} Dec 02 09:28:56 crc kubenswrapper[4895]: I1202 09:28:56.836319 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-vsmxx" Dec 02 09:28:56 crc kubenswrapper[4895]: I1202 09:28:56.944615 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2a1105f6-57ce-4e6d-a62b-1f1dbb777da8-ssh-key\") pod \"2a1105f6-57ce-4e6d-a62b-1f1dbb777da8\" (UID: \"2a1105f6-57ce-4e6d-a62b-1f1dbb777da8\") " Dec 02 09:28:56 crc kubenswrapper[4895]: I1202 09:28:56.944658 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2a1105f6-57ce-4e6d-a62b-1f1dbb777da8-ceph\") pod \"2a1105f6-57ce-4e6d-a62b-1f1dbb777da8\" (UID: \"2a1105f6-57ce-4e6d-a62b-1f1dbb777da8\") " Dec 02 09:28:56 crc kubenswrapper[4895]: I1202 09:28:56.944711 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2a1105f6-57ce-4e6d-a62b-1f1dbb777da8-inventory\") pod \"2a1105f6-57ce-4e6d-a62b-1f1dbb777da8\" (UID: \"2a1105f6-57ce-4e6d-a62b-1f1dbb777da8\") " Dec 02 09:28:56 crc kubenswrapper[4895]: I1202 09:28:56.944855 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lkmh7\" (UniqueName: \"kubernetes.io/projected/2a1105f6-57ce-4e6d-a62b-1f1dbb777da8-kube-api-access-lkmh7\") pod \"2a1105f6-57ce-4e6d-a62b-1f1dbb777da8\" (UID: \"2a1105f6-57ce-4e6d-a62b-1f1dbb777da8\") " Dec 02 09:28:56 crc kubenswrapper[4895]: I1202 09:28:56.951023 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/2a1105f6-57ce-4e6d-a62b-1f1dbb777da8-ceph" (OuterVolumeSpecName: "ceph") pod "2a1105f6-57ce-4e6d-a62b-1f1dbb777da8" (UID: "2a1105f6-57ce-4e6d-a62b-1f1dbb777da8"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:28:56 crc kubenswrapper[4895]: I1202 09:28:56.951134 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a1105f6-57ce-4e6d-a62b-1f1dbb777da8-kube-api-access-lkmh7" (OuterVolumeSpecName: "kube-api-access-lkmh7") pod "2a1105f6-57ce-4e6d-a62b-1f1dbb777da8" (UID: "2a1105f6-57ce-4e6d-a62b-1f1dbb777da8"). InnerVolumeSpecName "kube-api-access-lkmh7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:28:56 crc kubenswrapper[4895]: I1202 09:28:56.982771 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a1105f6-57ce-4e6d-a62b-1f1dbb777da8-inventory" (OuterVolumeSpecName: "inventory") pod "2a1105f6-57ce-4e6d-a62b-1f1dbb777da8" (UID: "2a1105f6-57ce-4e6d-a62b-1f1dbb777da8"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:28:56 crc kubenswrapper[4895]: I1202 09:28:56.997306 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a1105f6-57ce-4e6d-a62b-1f1dbb777da8-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "2a1105f6-57ce-4e6d-a62b-1f1dbb777da8" (UID: "2a1105f6-57ce-4e6d-a62b-1f1dbb777da8"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:28:57 crc kubenswrapper[4895]: I1202 09:28:57.047782 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lkmh7\" (UniqueName: \"kubernetes.io/projected/2a1105f6-57ce-4e6d-a62b-1f1dbb777da8-kube-api-access-lkmh7\") on node \"crc\" DevicePath \"\"" Dec 02 09:28:57 crc kubenswrapper[4895]: I1202 09:28:57.048302 4895 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2a1105f6-57ce-4e6d-a62b-1f1dbb777da8-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 02 09:28:57 crc kubenswrapper[4895]: I1202 09:28:57.048383 4895 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2a1105f6-57ce-4e6d-a62b-1f1dbb777da8-ceph\") on node \"crc\" DevicePath \"\"" Dec 02 09:28:57 crc kubenswrapper[4895]: I1202 09:28:57.048512 4895 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2a1105f6-57ce-4e6d-a62b-1f1dbb777da8-inventory\") on node \"crc\" DevicePath \"\"" Dec 02 09:28:57 crc kubenswrapper[4895]: I1202 09:28:57.237689 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-vsmxx" event={"ID":"2a1105f6-57ce-4e6d-a62b-1f1dbb777da8","Type":"ContainerDied","Data":"92884aecde53b5c7e2f61029f468e560ce426078c2a812ea917d81a14ac8eba2"} Dec 02 09:28:57 crc kubenswrapper[4895]: I1202 09:28:57.237778 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="92884aecde53b5c7e2f61029f468e560ce426078c2a812ea917d81a14ac8eba2" Dec 02 09:28:57 crc kubenswrapper[4895]: I1202 09:28:57.237809 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-vsmxx" Dec 02 09:28:57 crc kubenswrapper[4895]: I1202 09:28:57.318532 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-openstack-openstack-cell1-4h564"] Dec 02 09:28:57 crc kubenswrapper[4895]: E1202 09:28:57.318999 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0542d319-ca0e-4530-9161-32009bf1d8bf" containerName="extract-utilities" Dec 02 09:28:57 crc kubenswrapper[4895]: I1202 09:28:57.319017 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="0542d319-ca0e-4530-9161-32009bf1d8bf" containerName="extract-utilities" Dec 02 09:28:57 crc kubenswrapper[4895]: E1202 09:28:57.319045 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0542d319-ca0e-4530-9161-32009bf1d8bf" containerName="registry-server" Dec 02 09:28:57 crc kubenswrapper[4895]: I1202 09:28:57.319053 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="0542d319-ca0e-4530-9161-32009bf1d8bf" containerName="registry-server" Dec 02 09:28:57 crc kubenswrapper[4895]: E1202 09:28:57.319090 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a1105f6-57ce-4e6d-a62b-1f1dbb777da8" containerName="configure-network-openstack-openstack-cell1" Dec 02 09:28:57 crc kubenswrapper[4895]: I1202 09:28:57.319101 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a1105f6-57ce-4e6d-a62b-1f1dbb777da8" containerName="configure-network-openstack-openstack-cell1" Dec 02 09:28:57 crc kubenswrapper[4895]: E1202 09:28:57.319114 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0542d319-ca0e-4530-9161-32009bf1d8bf" containerName="extract-content" Dec 02 09:28:57 crc kubenswrapper[4895]: I1202 09:28:57.319119 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="0542d319-ca0e-4530-9161-32009bf1d8bf" containerName="extract-content" Dec 02 09:28:57 crc kubenswrapper[4895]: I1202 09:28:57.319307 4895 
memory_manager.go:354] "RemoveStaleState removing state" podUID="2a1105f6-57ce-4e6d-a62b-1f1dbb777da8" containerName="configure-network-openstack-openstack-cell1" Dec 02 09:28:57 crc kubenswrapper[4895]: I1202 09:28:57.319336 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="0542d319-ca0e-4530-9161-32009bf1d8bf" containerName="registry-server" Dec 02 09:28:57 crc kubenswrapper[4895]: I1202 09:28:57.320131 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-4h564" Dec 02 09:28:57 crc kubenswrapper[4895]: I1202 09:28:57.330604 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 02 09:28:57 crc kubenswrapper[4895]: I1202 09:28:57.331261 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 02 09:28:57 crc kubenswrapper[4895]: I1202 09:28:57.331520 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-brvc6" Dec 02 09:28:57 crc kubenswrapper[4895]: I1202 09:28:57.336098 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 02 09:28:57 crc kubenswrapper[4895]: I1202 09:28:57.336551 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-openstack-openstack-cell1-4h564"] Dec 02 09:28:57 crc kubenswrapper[4895]: I1202 09:28:57.359653 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7fb97644-658c-4072-8e03-a89589d95cf5-ssh-key\") pod \"validate-network-openstack-openstack-cell1-4h564\" (UID: \"7fb97644-658c-4072-8e03-a89589d95cf5\") " pod="openstack/validate-network-openstack-openstack-cell1-4h564" Dec 02 09:28:57 crc kubenswrapper[4895]: I1202 09:28:57.359828 4895 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7fb97644-658c-4072-8e03-a89589d95cf5-inventory\") pod \"validate-network-openstack-openstack-cell1-4h564\" (UID: \"7fb97644-658c-4072-8e03-a89589d95cf5\") " pod="openstack/validate-network-openstack-openstack-cell1-4h564" Dec 02 09:28:57 crc kubenswrapper[4895]: I1202 09:28:57.359942 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7fb97644-658c-4072-8e03-a89589d95cf5-ceph\") pod \"validate-network-openstack-openstack-cell1-4h564\" (UID: \"7fb97644-658c-4072-8e03-a89589d95cf5\") " pod="openstack/validate-network-openstack-openstack-cell1-4h564" Dec 02 09:28:57 crc kubenswrapper[4895]: I1202 09:28:57.360259 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4tnn\" (UniqueName: \"kubernetes.io/projected/7fb97644-658c-4072-8e03-a89589d95cf5-kube-api-access-n4tnn\") pod \"validate-network-openstack-openstack-cell1-4h564\" (UID: \"7fb97644-658c-4072-8e03-a89589d95cf5\") " pod="openstack/validate-network-openstack-openstack-cell1-4h564" Dec 02 09:28:57 crc kubenswrapper[4895]: I1202 09:28:57.461874 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n4tnn\" (UniqueName: \"kubernetes.io/projected/7fb97644-658c-4072-8e03-a89589d95cf5-kube-api-access-n4tnn\") pod \"validate-network-openstack-openstack-cell1-4h564\" (UID: \"7fb97644-658c-4072-8e03-a89589d95cf5\") " pod="openstack/validate-network-openstack-openstack-cell1-4h564" Dec 02 09:28:57 crc kubenswrapper[4895]: I1202 09:28:57.461970 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7fb97644-658c-4072-8e03-a89589d95cf5-ssh-key\") pod \"validate-network-openstack-openstack-cell1-4h564\" (UID: 
\"7fb97644-658c-4072-8e03-a89589d95cf5\") " pod="openstack/validate-network-openstack-openstack-cell1-4h564" Dec 02 09:28:57 crc kubenswrapper[4895]: I1202 09:28:57.462024 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7fb97644-658c-4072-8e03-a89589d95cf5-inventory\") pod \"validate-network-openstack-openstack-cell1-4h564\" (UID: \"7fb97644-658c-4072-8e03-a89589d95cf5\") " pod="openstack/validate-network-openstack-openstack-cell1-4h564" Dec 02 09:28:57 crc kubenswrapper[4895]: I1202 09:28:57.462068 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7fb97644-658c-4072-8e03-a89589d95cf5-ceph\") pod \"validate-network-openstack-openstack-cell1-4h564\" (UID: \"7fb97644-658c-4072-8e03-a89589d95cf5\") " pod="openstack/validate-network-openstack-openstack-cell1-4h564" Dec 02 09:28:57 crc kubenswrapper[4895]: I1202 09:28:57.466993 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7fb97644-658c-4072-8e03-a89589d95cf5-ceph\") pod \"validate-network-openstack-openstack-cell1-4h564\" (UID: \"7fb97644-658c-4072-8e03-a89589d95cf5\") " pod="openstack/validate-network-openstack-openstack-cell1-4h564" Dec 02 09:28:57 crc kubenswrapper[4895]: I1202 09:28:57.467404 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7fb97644-658c-4072-8e03-a89589d95cf5-ssh-key\") pod \"validate-network-openstack-openstack-cell1-4h564\" (UID: \"7fb97644-658c-4072-8e03-a89589d95cf5\") " pod="openstack/validate-network-openstack-openstack-cell1-4h564" Dec 02 09:28:57 crc kubenswrapper[4895]: I1202 09:28:57.467625 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7fb97644-658c-4072-8e03-a89589d95cf5-inventory\") pod 
\"validate-network-openstack-openstack-cell1-4h564\" (UID: \"7fb97644-658c-4072-8e03-a89589d95cf5\") " pod="openstack/validate-network-openstack-openstack-cell1-4h564" Dec 02 09:28:57 crc kubenswrapper[4895]: I1202 09:28:57.489008 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4tnn\" (UniqueName: \"kubernetes.io/projected/7fb97644-658c-4072-8e03-a89589d95cf5-kube-api-access-n4tnn\") pod \"validate-network-openstack-openstack-cell1-4h564\" (UID: \"7fb97644-658c-4072-8e03-a89589d95cf5\") " pod="openstack/validate-network-openstack-openstack-cell1-4h564" Dec 02 09:28:57 crc kubenswrapper[4895]: I1202 09:28:57.643675 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-4h564" Dec 02 09:28:58 crc kubenswrapper[4895]: I1202 09:28:58.145471 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-openstack-openstack-cell1-4h564"] Dec 02 09:28:58 crc kubenswrapper[4895]: I1202 09:28:58.153776 4895 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 02 09:28:58 crc kubenswrapper[4895]: I1202 09:28:58.249200 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-4h564" event={"ID":"7fb97644-658c-4072-8e03-a89589d95cf5","Type":"ContainerStarted","Data":"ca2dd7c53d427392bc9d0bbe1256c0162bddb731bd33c4e8d056f55c2202c4d1"} Dec 02 09:29:00 crc kubenswrapper[4895]: I1202 09:29:00.292150 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-4h564" event={"ID":"7fb97644-658c-4072-8e03-a89589d95cf5","Type":"ContainerStarted","Data":"22d82863d23516bbea55f2371829ec88d8ebf61ce7a733d1fbf9f9ba86513d18"} Dec 02 09:29:00 crc kubenswrapper[4895]: I1202 09:29:00.331099 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/validate-network-openstack-openstack-cell1-4h564" podStartSLOduration=2.470754706 podStartE2EDuration="3.331073182s" podCreationTimestamp="2025-12-02 09:28:57 +0000 UTC" firstStartedPulling="2025-12-02 09:28:58.153543555 +0000 UTC m=+7549.324403168" lastFinishedPulling="2025-12-02 09:28:59.013862031 +0000 UTC m=+7550.184721644" observedRunningTime="2025-12-02 09:29:00.317657894 +0000 UTC m=+7551.488517537" watchObservedRunningTime="2025-12-02 09:29:00.331073182 +0000 UTC m=+7551.501932795" Dec 02 09:29:04 crc kubenswrapper[4895]: I1202 09:29:04.333107 4895 generic.go:334] "Generic (PLEG): container finished" podID="7fb97644-658c-4072-8e03-a89589d95cf5" containerID="22d82863d23516bbea55f2371829ec88d8ebf61ce7a733d1fbf9f9ba86513d18" exitCode=0 Dec 02 09:29:04 crc kubenswrapper[4895]: I1202 09:29:04.333218 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-4h564" event={"ID":"7fb97644-658c-4072-8e03-a89589d95cf5","Type":"ContainerDied","Data":"22d82863d23516bbea55f2371829ec88d8ebf61ce7a733d1fbf9f9ba86513d18"} Dec 02 09:29:05 crc kubenswrapper[4895]: I1202 09:29:05.473629 4895 patch_prober.go:28] interesting pod/machine-config-daemon-wfcg7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 09:29:05 crc kubenswrapper[4895]: I1202 09:29:05.474115 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 09:29:05 crc kubenswrapper[4895]: I1202 09:29:05.870167 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-4h564" Dec 02 09:29:06 crc kubenswrapper[4895]: I1202 09:29:06.007915 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7fb97644-658c-4072-8e03-a89589d95cf5-inventory\") pod \"7fb97644-658c-4072-8e03-a89589d95cf5\" (UID: \"7fb97644-658c-4072-8e03-a89589d95cf5\") " Dec 02 09:29:06 crc kubenswrapper[4895]: I1202 09:29:06.008468 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7fb97644-658c-4072-8e03-a89589d95cf5-ssh-key\") pod \"7fb97644-658c-4072-8e03-a89589d95cf5\" (UID: \"7fb97644-658c-4072-8e03-a89589d95cf5\") " Dec 02 09:29:06 crc kubenswrapper[4895]: I1202 09:29:06.008578 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n4tnn\" (UniqueName: \"kubernetes.io/projected/7fb97644-658c-4072-8e03-a89589d95cf5-kube-api-access-n4tnn\") pod \"7fb97644-658c-4072-8e03-a89589d95cf5\" (UID: \"7fb97644-658c-4072-8e03-a89589d95cf5\") " Dec 02 09:29:06 crc kubenswrapper[4895]: I1202 09:29:06.008857 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7fb97644-658c-4072-8e03-a89589d95cf5-ceph\") pod \"7fb97644-658c-4072-8e03-a89589d95cf5\" (UID: \"7fb97644-658c-4072-8e03-a89589d95cf5\") " Dec 02 09:29:06 crc kubenswrapper[4895]: I1202 09:29:06.016160 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7fb97644-658c-4072-8e03-a89589d95cf5-kube-api-access-n4tnn" (OuterVolumeSpecName: "kube-api-access-n4tnn") pod "7fb97644-658c-4072-8e03-a89589d95cf5" (UID: "7fb97644-658c-4072-8e03-a89589d95cf5"). InnerVolumeSpecName "kube-api-access-n4tnn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:29:06 crc kubenswrapper[4895]: I1202 09:29:06.016171 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fb97644-658c-4072-8e03-a89589d95cf5-ceph" (OuterVolumeSpecName: "ceph") pod "7fb97644-658c-4072-8e03-a89589d95cf5" (UID: "7fb97644-658c-4072-8e03-a89589d95cf5"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:29:06 crc kubenswrapper[4895]: I1202 09:29:06.045306 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fb97644-658c-4072-8e03-a89589d95cf5-inventory" (OuterVolumeSpecName: "inventory") pod "7fb97644-658c-4072-8e03-a89589d95cf5" (UID: "7fb97644-658c-4072-8e03-a89589d95cf5"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:29:06 crc kubenswrapper[4895]: I1202 09:29:06.045488 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fb97644-658c-4072-8e03-a89589d95cf5-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "7fb97644-658c-4072-8e03-a89589d95cf5" (UID: "7fb97644-658c-4072-8e03-a89589d95cf5"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:29:06 crc kubenswrapper[4895]: I1202 09:29:06.112147 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n4tnn\" (UniqueName: \"kubernetes.io/projected/7fb97644-658c-4072-8e03-a89589d95cf5-kube-api-access-n4tnn\") on node \"crc\" DevicePath \"\"" Dec 02 09:29:06 crc kubenswrapper[4895]: I1202 09:29:06.112199 4895 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7fb97644-658c-4072-8e03-a89589d95cf5-ceph\") on node \"crc\" DevicePath \"\"" Dec 02 09:29:06 crc kubenswrapper[4895]: I1202 09:29:06.112215 4895 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7fb97644-658c-4072-8e03-a89589d95cf5-inventory\") on node \"crc\" DevicePath \"\"" Dec 02 09:29:06 crc kubenswrapper[4895]: I1202 09:29:06.112228 4895 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7fb97644-658c-4072-8e03-a89589d95cf5-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 02 09:29:06 crc kubenswrapper[4895]: I1202 09:29:06.355125 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-4h564" event={"ID":"7fb97644-658c-4072-8e03-a89589d95cf5","Type":"ContainerDied","Data":"ca2dd7c53d427392bc9d0bbe1256c0162bddb731bd33c4e8d056f55c2202c4d1"} Dec 02 09:29:06 crc kubenswrapper[4895]: I1202 09:29:06.355170 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ca2dd7c53d427392bc9d0bbe1256c0162bddb731bd33c4e8d056f55c2202c4d1" Dec 02 09:29:06 crc kubenswrapper[4895]: I1202 09:29:06.355194 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-4h564" Dec 02 09:29:06 crc kubenswrapper[4895]: I1202 09:29:06.440274 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-openstack-openstack-cell1-kpnd9"] Dec 02 09:29:06 crc kubenswrapper[4895]: E1202 09:29:06.441018 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fb97644-658c-4072-8e03-a89589d95cf5" containerName="validate-network-openstack-openstack-cell1" Dec 02 09:29:06 crc kubenswrapper[4895]: I1202 09:29:06.441046 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fb97644-658c-4072-8e03-a89589d95cf5" containerName="validate-network-openstack-openstack-cell1" Dec 02 09:29:06 crc kubenswrapper[4895]: I1202 09:29:06.441355 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="7fb97644-658c-4072-8e03-a89589d95cf5" containerName="validate-network-openstack-openstack-cell1" Dec 02 09:29:06 crc kubenswrapper[4895]: I1202 09:29:06.442443 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-kpnd9" Dec 02 09:29:06 crc kubenswrapper[4895]: I1202 09:29:06.449524 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-brvc6" Dec 02 09:29:06 crc kubenswrapper[4895]: I1202 09:29:06.449769 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 02 09:29:06 crc kubenswrapper[4895]: I1202 09:29:06.449988 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 02 09:29:06 crc kubenswrapper[4895]: I1202 09:29:06.450160 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 02 09:29:06 crc kubenswrapper[4895]: I1202 09:29:06.454655 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-openstack-openstack-cell1-kpnd9"] Dec 02 09:29:06 crc kubenswrapper[4895]: I1202 09:29:06.625213 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/56d195ca-712e-42b8-b755-ed605d04f09d-ceph\") pod \"install-os-openstack-openstack-cell1-kpnd9\" (UID: \"56d195ca-712e-42b8-b755-ed605d04f09d\") " pod="openstack/install-os-openstack-openstack-cell1-kpnd9" Dec 02 09:29:06 crc kubenswrapper[4895]: I1202 09:29:06.625326 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/56d195ca-712e-42b8-b755-ed605d04f09d-ssh-key\") pod \"install-os-openstack-openstack-cell1-kpnd9\" (UID: \"56d195ca-712e-42b8-b755-ed605d04f09d\") " pod="openstack/install-os-openstack-openstack-cell1-kpnd9" Dec 02 09:29:06 crc kubenswrapper[4895]: I1202 09:29:06.625373 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/56d195ca-712e-42b8-b755-ed605d04f09d-inventory\") pod \"install-os-openstack-openstack-cell1-kpnd9\" (UID: \"56d195ca-712e-42b8-b755-ed605d04f09d\") " pod="openstack/install-os-openstack-openstack-cell1-kpnd9" Dec 02 09:29:06 crc kubenswrapper[4895]: I1202 09:29:06.625451 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9kmxm\" (UniqueName: \"kubernetes.io/projected/56d195ca-712e-42b8-b755-ed605d04f09d-kube-api-access-9kmxm\") pod \"install-os-openstack-openstack-cell1-kpnd9\" (UID: \"56d195ca-712e-42b8-b755-ed605d04f09d\") " pod="openstack/install-os-openstack-openstack-cell1-kpnd9" Dec 02 09:29:06 crc kubenswrapper[4895]: I1202 09:29:06.728879 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/56d195ca-712e-42b8-b755-ed605d04f09d-ceph\") pod \"install-os-openstack-openstack-cell1-kpnd9\" (UID: \"56d195ca-712e-42b8-b755-ed605d04f09d\") " pod="openstack/install-os-openstack-openstack-cell1-kpnd9" Dec 02 09:29:06 crc kubenswrapper[4895]: I1202 09:29:06.728968 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/56d195ca-712e-42b8-b755-ed605d04f09d-ssh-key\") pod \"install-os-openstack-openstack-cell1-kpnd9\" (UID: \"56d195ca-712e-42b8-b755-ed605d04f09d\") " pod="openstack/install-os-openstack-openstack-cell1-kpnd9" Dec 02 09:29:06 crc kubenswrapper[4895]: I1202 09:29:06.729005 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/56d195ca-712e-42b8-b755-ed605d04f09d-inventory\") pod \"install-os-openstack-openstack-cell1-kpnd9\" (UID: \"56d195ca-712e-42b8-b755-ed605d04f09d\") " pod="openstack/install-os-openstack-openstack-cell1-kpnd9" Dec 02 09:29:06 crc kubenswrapper[4895]: I1202 09:29:06.729051 4895 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-9kmxm\" (UniqueName: \"kubernetes.io/projected/56d195ca-712e-42b8-b755-ed605d04f09d-kube-api-access-9kmxm\") pod \"install-os-openstack-openstack-cell1-kpnd9\" (UID: \"56d195ca-712e-42b8-b755-ed605d04f09d\") " pod="openstack/install-os-openstack-openstack-cell1-kpnd9" Dec 02 09:29:06 crc kubenswrapper[4895]: I1202 09:29:06.734639 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/56d195ca-712e-42b8-b755-ed605d04f09d-ssh-key\") pod \"install-os-openstack-openstack-cell1-kpnd9\" (UID: \"56d195ca-712e-42b8-b755-ed605d04f09d\") " pod="openstack/install-os-openstack-openstack-cell1-kpnd9" Dec 02 09:29:06 crc kubenswrapper[4895]: I1202 09:29:06.734639 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/56d195ca-712e-42b8-b755-ed605d04f09d-ceph\") pod \"install-os-openstack-openstack-cell1-kpnd9\" (UID: \"56d195ca-712e-42b8-b755-ed605d04f09d\") " pod="openstack/install-os-openstack-openstack-cell1-kpnd9" Dec 02 09:29:06 crc kubenswrapper[4895]: I1202 09:29:06.735931 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/56d195ca-712e-42b8-b755-ed605d04f09d-inventory\") pod \"install-os-openstack-openstack-cell1-kpnd9\" (UID: \"56d195ca-712e-42b8-b755-ed605d04f09d\") " pod="openstack/install-os-openstack-openstack-cell1-kpnd9" Dec 02 09:29:06 crc kubenswrapper[4895]: I1202 09:29:06.747703 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9kmxm\" (UniqueName: \"kubernetes.io/projected/56d195ca-712e-42b8-b755-ed605d04f09d-kube-api-access-9kmxm\") pod \"install-os-openstack-openstack-cell1-kpnd9\" (UID: \"56d195ca-712e-42b8-b755-ed605d04f09d\") " pod="openstack/install-os-openstack-openstack-cell1-kpnd9" Dec 02 09:29:06 crc kubenswrapper[4895]: I1202 09:29:06.763806 
4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-kpnd9" Dec 02 09:29:07 crc kubenswrapper[4895]: I1202 09:29:07.342537 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-openstack-openstack-cell1-kpnd9"] Dec 02 09:29:07 crc kubenswrapper[4895]: I1202 09:29:07.367314 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-kpnd9" event={"ID":"56d195ca-712e-42b8-b755-ed605d04f09d","Type":"ContainerStarted","Data":"aa9c6796563bb0a8b36e2e45273bc5cd17553e8c21f27573ac1a199eb0703df3"} Dec 02 09:29:08 crc kubenswrapper[4895]: I1202 09:29:08.380696 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-kpnd9" event={"ID":"56d195ca-712e-42b8-b755-ed605d04f09d","Type":"ContainerStarted","Data":"134b52af3180a19fb6b11cd42e34dbcd8c089ca08dc246e923933412418c38b5"} Dec 02 09:29:08 crc kubenswrapper[4895]: I1202 09:29:08.400917 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-openstack-openstack-cell1-kpnd9" podStartSLOduration=2.204056754 podStartE2EDuration="2.400894112s" podCreationTimestamp="2025-12-02 09:29:06 +0000 UTC" firstStartedPulling="2025-12-02 09:29:07.336425681 +0000 UTC m=+7558.507285294" lastFinishedPulling="2025-12-02 09:29:07.533263039 +0000 UTC m=+7558.704122652" observedRunningTime="2025-12-02 09:29:08.398936331 +0000 UTC m=+7559.569795955" watchObservedRunningTime="2025-12-02 09:29:08.400894112 +0000 UTC m=+7559.571753725" Dec 02 09:29:35 crc kubenswrapper[4895]: I1202 09:29:35.473423 4895 patch_prober.go:28] interesting pod/machine-config-daemon-wfcg7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 09:29:35 crc 
kubenswrapper[4895]: I1202 09:29:35.473989 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 09:29:35 crc kubenswrapper[4895]: I1202 09:29:35.474042 4895 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" Dec 02 09:29:35 crc kubenswrapper[4895]: I1202 09:29:35.474649 4895 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3cd7773ccf4b21f0075e975c1552444f3a74a56b8e22a60f1d2dd8aa7481d21b"} pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 09:29:35 crc kubenswrapper[4895]: I1202 09:29:35.474701 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerName="machine-config-daemon" containerID="cri-o://3cd7773ccf4b21f0075e975c1552444f3a74a56b8e22a60f1d2dd8aa7481d21b" gracePeriod=600 Dec 02 09:29:35 crc kubenswrapper[4895]: I1202 09:29:35.638220 4895 generic.go:334] "Generic (PLEG): container finished" podID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerID="3cd7773ccf4b21f0075e975c1552444f3a74a56b8e22a60f1d2dd8aa7481d21b" exitCode=0 Dec 02 09:29:35 crc kubenswrapper[4895]: I1202 09:29:35.638366 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" event={"ID":"0468d2d1-a975-45a6-af9f-47adc432fab0","Type":"ContainerDied","Data":"3cd7773ccf4b21f0075e975c1552444f3a74a56b8e22a60f1d2dd8aa7481d21b"} 
Dec 02 09:29:35 crc kubenswrapper[4895]: I1202 09:29:35.638475 4895 scope.go:117] "RemoveContainer" containerID="b0891d6e2df87d9528c2e39e6da0ad0c4eeb270463ef0e1dee0e8c959775531a" Dec 02 09:29:36 crc kubenswrapper[4895]: I1202 09:29:36.172054 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-m895k"] Dec 02 09:29:36 crc kubenswrapper[4895]: I1202 09:29:36.175275 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m895k" Dec 02 09:29:36 crc kubenswrapper[4895]: I1202 09:29:36.182982 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-m895k"] Dec 02 09:29:36 crc kubenswrapper[4895]: I1202 09:29:36.279540 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8m26\" (UniqueName: \"kubernetes.io/projected/83999926-e77c-4967-8a13-274c292e9b22-kube-api-access-l8m26\") pod \"redhat-marketplace-m895k\" (UID: \"83999926-e77c-4967-8a13-274c292e9b22\") " pod="openshift-marketplace/redhat-marketplace-m895k" Dec 02 09:29:36 crc kubenswrapper[4895]: I1202 09:29:36.279664 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83999926-e77c-4967-8a13-274c292e9b22-utilities\") pod \"redhat-marketplace-m895k\" (UID: \"83999926-e77c-4967-8a13-274c292e9b22\") " pod="openshift-marketplace/redhat-marketplace-m895k" Dec 02 09:29:36 crc kubenswrapper[4895]: I1202 09:29:36.279726 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83999926-e77c-4967-8a13-274c292e9b22-catalog-content\") pod \"redhat-marketplace-m895k\" (UID: \"83999926-e77c-4967-8a13-274c292e9b22\") " pod="openshift-marketplace/redhat-marketplace-m895k" Dec 02 09:29:36 crc kubenswrapper[4895]: 
I1202 09:29:36.382227 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8m26\" (UniqueName: \"kubernetes.io/projected/83999926-e77c-4967-8a13-274c292e9b22-kube-api-access-l8m26\") pod \"redhat-marketplace-m895k\" (UID: \"83999926-e77c-4967-8a13-274c292e9b22\") " pod="openshift-marketplace/redhat-marketplace-m895k" Dec 02 09:29:36 crc kubenswrapper[4895]: I1202 09:29:36.382379 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83999926-e77c-4967-8a13-274c292e9b22-utilities\") pod \"redhat-marketplace-m895k\" (UID: \"83999926-e77c-4967-8a13-274c292e9b22\") " pod="openshift-marketplace/redhat-marketplace-m895k" Dec 02 09:29:36 crc kubenswrapper[4895]: I1202 09:29:36.382403 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83999926-e77c-4967-8a13-274c292e9b22-catalog-content\") pod \"redhat-marketplace-m895k\" (UID: \"83999926-e77c-4967-8a13-274c292e9b22\") " pod="openshift-marketplace/redhat-marketplace-m895k" Dec 02 09:29:36 crc kubenswrapper[4895]: I1202 09:29:36.383154 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83999926-e77c-4967-8a13-274c292e9b22-utilities\") pod \"redhat-marketplace-m895k\" (UID: \"83999926-e77c-4967-8a13-274c292e9b22\") " pod="openshift-marketplace/redhat-marketplace-m895k" Dec 02 09:29:36 crc kubenswrapper[4895]: I1202 09:29:36.383182 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83999926-e77c-4967-8a13-274c292e9b22-catalog-content\") pod \"redhat-marketplace-m895k\" (UID: \"83999926-e77c-4967-8a13-274c292e9b22\") " pod="openshift-marketplace/redhat-marketplace-m895k" Dec 02 09:29:36 crc kubenswrapper[4895]: I1202 09:29:36.402547 4895 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8m26\" (UniqueName: \"kubernetes.io/projected/83999926-e77c-4967-8a13-274c292e9b22-kube-api-access-l8m26\") pod \"redhat-marketplace-m895k\" (UID: \"83999926-e77c-4967-8a13-274c292e9b22\") " pod="openshift-marketplace/redhat-marketplace-m895k" Dec 02 09:29:36 crc kubenswrapper[4895]: I1202 09:29:36.524712 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m895k" Dec 02 09:29:36 crc kubenswrapper[4895]: I1202 09:29:36.659699 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" event={"ID":"0468d2d1-a975-45a6-af9f-47adc432fab0","Type":"ContainerStarted","Data":"77223bf7853202dbb8024d476b05d3dd47cc2c7476c34d4d3d8c2d6a90d37b0b"} Dec 02 09:29:37 crc kubenswrapper[4895]: I1202 09:29:37.114637 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-m895k"] Dec 02 09:29:37 crc kubenswrapper[4895]: W1202 09:29:37.131954 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod83999926_e77c_4967_8a13_274c292e9b22.slice/crio-bd1b1095d745bcdd02fd254da67fed58c1a6f602435bb753c10be5f902d75ce5 WatchSource:0}: Error finding container bd1b1095d745bcdd02fd254da67fed58c1a6f602435bb753c10be5f902d75ce5: Status 404 returned error can't find the container with id bd1b1095d745bcdd02fd254da67fed58c1a6f602435bb753c10be5f902d75ce5 Dec 02 09:29:37 crc kubenswrapper[4895]: I1202 09:29:37.684794 4895 generic.go:334] "Generic (PLEG): container finished" podID="83999926-e77c-4967-8a13-274c292e9b22" containerID="34c6cd34b7a91a81a8b7f09a2703f607966a91d5a3669472ba05fa11b97c24b1" exitCode=0 Dec 02 09:29:37 crc kubenswrapper[4895]: I1202 09:29:37.684880 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m895k" 
event={"ID":"83999926-e77c-4967-8a13-274c292e9b22","Type":"ContainerDied","Data":"34c6cd34b7a91a81a8b7f09a2703f607966a91d5a3669472ba05fa11b97c24b1"} Dec 02 09:29:37 crc kubenswrapper[4895]: I1202 09:29:37.685996 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m895k" event={"ID":"83999926-e77c-4967-8a13-274c292e9b22","Type":"ContainerStarted","Data":"bd1b1095d745bcdd02fd254da67fed58c1a6f602435bb753c10be5f902d75ce5"} Dec 02 09:29:39 crc kubenswrapper[4895]: I1202 09:29:39.706291 4895 generic.go:334] "Generic (PLEG): container finished" podID="83999926-e77c-4967-8a13-274c292e9b22" containerID="005e0384e6b4fda3776036aedb1cb0d4a910186fe3448391ed5e969a2b7666a9" exitCode=0 Dec 02 09:29:39 crc kubenswrapper[4895]: I1202 09:29:39.706330 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m895k" event={"ID":"83999926-e77c-4967-8a13-274c292e9b22","Type":"ContainerDied","Data":"005e0384e6b4fda3776036aedb1cb0d4a910186fe3448391ed5e969a2b7666a9"} Dec 02 09:29:40 crc kubenswrapper[4895]: I1202 09:29:40.718488 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m895k" event={"ID":"83999926-e77c-4967-8a13-274c292e9b22","Type":"ContainerStarted","Data":"fd09975a4cd11e0683d754f6f0519e96005babd45e4657db0495633aaca18168"} Dec 02 09:29:40 crc kubenswrapper[4895]: I1202 09:29:40.739256 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-m895k" podStartSLOduration=2.218028758 podStartE2EDuration="4.739232334s" podCreationTimestamp="2025-12-02 09:29:36 +0000 UTC" firstStartedPulling="2025-12-02 09:29:37.686447717 +0000 UTC m=+7588.857307320" lastFinishedPulling="2025-12-02 09:29:40.207651283 +0000 UTC m=+7591.378510896" observedRunningTime="2025-12-02 09:29:40.734674972 +0000 UTC m=+7591.905534605" watchObservedRunningTime="2025-12-02 09:29:40.739232334 +0000 UTC 
m=+7591.910091967" Dec 02 09:29:46 crc kubenswrapper[4895]: I1202 09:29:46.525530 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-m895k" Dec 02 09:29:46 crc kubenswrapper[4895]: I1202 09:29:46.526302 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-m895k" Dec 02 09:29:46 crc kubenswrapper[4895]: I1202 09:29:46.587614 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-m895k" Dec 02 09:29:46 crc kubenswrapper[4895]: I1202 09:29:46.846494 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-m895k" Dec 02 09:29:46 crc kubenswrapper[4895]: I1202 09:29:46.901454 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-m895k"] Dec 02 09:29:48 crc kubenswrapper[4895]: I1202 09:29:48.808195 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-m895k" podUID="83999926-e77c-4967-8a13-274c292e9b22" containerName="registry-server" containerID="cri-o://fd09975a4cd11e0683d754f6f0519e96005babd45e4657db0495633aaca18168" gracePeriod=2 Dec 02 09:29:49 crc kubenswrapper[4895]: I1202 09:29:49.343351 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m895k" Dec 02 09:29:49 crc kubenswrapper[4895]: I1202 09:29:49.474934 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l8m26\" (UniqueName: \"kubernetes.io/projected/83999926-e77c-4967-8a13-274c292e9b22-kube-api-access-l8m26\") pod \"83999926-e77c-4967-8a13-274c292e9b22\" (UID: \"83999926-e77c-4967-8a13-274c292e9b22\") " Dec 02 09:29:49 crc kubenswrapper[4895]: I1202 09:29:49.475131 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83999926-e77c-4967-8a13-274c292e9b22-catalog-content\") pod \"83999926-e77c-4967-8a13-274c292e9b22\" (UID: \"83999926-e77c-4967-8a13-274c292e9b22\") " Dec 02 09:29:49 crc kubenswrapper[4895]: I1202 09:29:49.475232 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83999926-e77c-4967-8a13-274c292e9b22-utilities\") pod \"83999926-e77c-4967-8a13-274c292e9b22\" (UID: \"83999926-e77c-4967-8a13-274c292e9b22\") " Dec 02 09:29:49 crc kubenswrapper[4895]: I1202 09:29:49.476600 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/83999926-e77c-4967-8a13-274c292e9b22-utilities" (OuterVolumeSpecName: "utilities") pod "83999926-e77c-4967-8a13-274c292e9b22" (UID: "83999926-e77c-4967-8a13-274c292e9b22"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:29:49 crc kubenswrapper[4895]: I1202 09:29:49.481991 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83999926-e77c-4967-8a13-274c292e9b22-kube-api-access-l8m26" (OuterVolumeSpecName: "kube-api-access-l8m26") pod "83999926-e77c-4967-8a13-274c292e9b22" (UID: "83999926-e77c-4967-8a13-274c292e9b22"). InnerVolumeSpecName "kube-api-access-l8m26". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:29:49 crc kubenswrapper[4895]: I1202 09:29:49.496649 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/83999926-e77c-4967-8a13-274c292e9b22-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "83999926-e77c-4967-8a13-274c292e9b22" (UID: "83999926-e77c-4967-8a13-274c292e9b22"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:29:49 crc kubenswrapper[4895]: I1202 09:29:49.577474 4895 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83999926-e77c-4967-8a13-274c292e9b22-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 09:29:49 crc kubenswrapper[4895]: I1202 09:29:49.577854 4895 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83999926-e77c-4967-8a13-274c292e9b22-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 09:29:49 crc kubenswrapper[4895]: I1202 09:29:49.577866 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l8m26\" (UniqueName: \"kubernetes.io/projected/83999926-e77c-4967-8a13-274c292e9b22-kube-api-access-l8m26\") on node \"crc\" DevicePath \"\"" Dec 02 09:29:49 crc kubenswrapper[4895]: I1202 09:29:49.819868 4895 generic.go:334] "Generic (PLEG): container finished" podID="83999926-e77c-4967-8a13-274c292e9b22" containerID="fd09975a4cd11e0683d754f6f0519e96005babd45e4657db0495633aaca18168" exitCode=0 Dec 02 09:29:49 crc kubenswrapper[4895]: I1202 09:29:49.819943 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m895k" event={"ID":"83999926-e77c-4967-8a13-274c292e9b22","Type":"ContainerDied","Data":"fd09975a4cd11e0683d754f6f0519e96005babd45e4657db0495633aaca18168"} Dec 02 09:29:49 crc kubenswrapper[4895]: I1202 09:29:49.819999 4895 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-m895k" event={"ID":"83999926-e77c-4967-8a13-274c292e9b22","Type":"ContainerDied","Data":"bd1b1095d745bcdd02fd254da67fed58c1a6f602435bb753c10be5f902d75ce5"} Dec 02 09:29:49 crc kubenswrapper[4895]: I1202 09:29:49.820029 4895 scope.go:117] "RemoveContainer" containerID="fd09975a4cd11e0683d754f6f0519e96005babd45e4657db0495633aaca18168" Dec 02 09:29:49 crc kubenswrapper[4895]: I1202 09:29:49.819948 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m895k" Dec 02 09:29:49 crc kubenswrapper[4895]: I1202 09:29:49.863297 4895 scope.go:117] "RemoveContainer" containerID="005e0384e6b4fda3776036aedb1cb0d4a910186fe3448391ed5e969a2b7666a9" Dec 02 09:29:49 crc kubenswrapper[4895]: I1202 09:29:49.867642 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-m895k"] Dec 02 09:29:49 crc kubenswrapper[4895]: I1202 09:29:49.876011 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-m895k"] Dec 02 09:29:49 crc kubenswrapper[4895]: I1202 09:29:49.886542 4895 scope.go:117] "RemoveContainer" containerID="34c6cd34b7a91a81a8b7f09a2703f607966a91d5a3669472ba05fa11b97c24b1" Dec 02 09:29:49 crc kubenswrapper[4895]: I1202 09:29:49.945958 4895 scope.go:117] "RemoveContainer" containerID="fd09975a4cd11e0683d754f6f0519e96005babd45e4657db0495633aaca18168" Dec 02 09:29:49 crc kubenswrapper[4895]: E1202 09:29:49.946486 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd09975a4cd11e0683d754f6f0519e96005babd45e4657db0495633aaca18168\": container with ID starting with fd09975a4cd11e0683d754f6f0519e96005babd45e4657db0495633aaca18168 not found: ID does not exist" containerID="fd09975a4cd11e0683d754f6f0519e96005babd45e4657db0495633aaca18168" Dec 02 09:29:49 crc kubenswrapper[4895]: I1202 09:29:49.946545 4895 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd09975a4cd11e0683d754f6f0519e96005babd45e4657db0495633aaca18168"} err="failed to get container status \"fd09975a4cd11e0683d754f6f0519e96005babd45e4657db0495633aaca18168\": rpc error: code = NotFound desc = could not find container \"fd09975a4cd11e0683d754f6f0519e96005babd45e4657db0495633aaca18168\": container with ID starting with fd09975a4cd11e0683d754f6f0519e96005babd45e4657db0495633aaca18168 not found: ID does not exist" Dec 02 09:29:49 crc kubenswrapper[4895]: I1202 09:29:49.946579 4895 scope.go:117] "RemoveContainer" containerID="005e0384e6b4fda3776036aedb1cb0d4a910186fe3448391ed5e969a2b7666a9" Dec 02 09:29:49 crc kubenswrapper[4895]: E1202 09:29:49.947030 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"005e0384e6b4fda3776036aedb1cb0d4a910186fe3448391ed5e969a2b7666a9\": container with ID starting with 005e0384e6b4fda3776036aedb1cb0d4a910186fe3448391ed5e969a2b7666a9 not found: ID does not exist" containerID="005e0384e6b4fda3776036aedb1cb0d4a910186fe3448391ed5e969a2b7666a9" Dec 02 09:29:49 crc kubenswrapper[4895]: I1202 09:29:49.947060 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"005e0384e6b4fda3776036aedb1cb0d4a910186fe3448391ed5e969a2b7666a9"} err="failed to get container status \"005e0384e6b4fda3776036aedb1cb0d4a910186fe3448391ed5e969a2b7666a9\": rpc error: code = NotFound desc = could not find container \"005e0384e6b4fda3776036aedb1cb0d4a910186fe3448391ed5e969a2b7666a9\": container with ID starting with 005e0384e6b4fda3776036aedb1cb0d4a910186fe3448391ed5e969a2b7666a9 not found: ID does not exist" Dec 02 09:29:49 crc kubenswrapper[4895]: I1202 09:29:49.947083 4895 scope.go:117] "RemoveContainer" containerID="34c6cd34b7a91a81a8b7f09a2703f607966a91d5a3669472ba05fa11b97c24b1" Dec 02 09:29:49 crc kubenswrapper[4895]: E1202 
09:29:49.947356 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34c6cd34b7a91a81a8b7f09a2703f607966a91d5a3669472ba05fa11b97c24b1\": container with ID starting with 34c6cd34b7a91a81a8b7f09a2703f607966a91d5a3669472ba05fa11b97c24b1 not found: ID does not exist" containerID="34c6cd34b7a91a81a8b7f09a2703f607966a91d5a3669472ba05fa11b97c24b1" Dec 02 09:29:49 crc kubenswrapper[4895]: I1202 09:29:49.947391 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34c6cd34b7a91a81a8b7f09a2703f607966a91d5a3669472ba05fa11b97c24b1"} err="failed to get container status \"34c6cd34b7a91a81a8b7f09a2703f607966a91d5a3669472ba05fa11b97c24b1\": rpc error: code = NotFound desc = could not find container \"34c6cd34b7a91a81a8b7f09a2703f607966a91d5a3669472ba05fa11b97c24b1\": container with ID starting with 34c6cd34b7a91a81a8b7f09a2703f607966a91d5a3669472ba05fa11b97c24b1 not found: ID does not exist" Dec 02 09:29:51 crc kubenswrapper[4895]: I1202 09:29:51.153392 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83999926-e77c-4967-8a13-274c292e9b22" path="/var/lib/kubelet/pods/83999926-e77c-4967-8a13-274c292e9b22/volumes" Dec 02 09:29:55 crc kubenswrapper[4895]: I1202 09:29:55.894222 4895 generic.go:334] "Generic (PLEG): container finished" podID="56d195ca-712e-42b8-b755-ed605d04f09d" containerID="134b52af3180a19fb6b11cd42e34dbcd8c089ca08dc246e923933412418c38b5" exitCode=0 Dec 02 09:29:55 crc kubenswrapper[4895]: I1202 09:29:55.894324 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-kpnd9" event={"ID":"56d195ca-712e-42b8-b755-ed605d04f09d","Type":"ContainerDied","Data":"134b52af3180a19fb6b11cd42e34dbcd8c089ca08dc246e923933412418c38b5"} Dec 02 09:29:57 crc kubenswrapper[4895]: I1202 09:29:57.428719 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-kpnd9" Dec 02 09:29:57 crc kubenswrapper[4895]: I1202 09:29:57.446210 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/56d195ca-712e-42b8-b755-ed605d04f09d-ssh-key\") pod \"56d195ca-712e-42b8-b755-ed605d04f09d\" (UID: \"56d195ca-712e-42b8-b755-ed605d04f09d\") " Dec 02 09:29:57 crc kubenswrapper[4895]: I1202 09:29:57.446702 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9kmxm\" (UniqueName: \"kubernetes.io/projected/56d195ca-712e-42b8-b755-ed605d04f09d-kube-api-access-9kmxm\") pod \"56d195ca-712e-42b8-b755-ed605d04f09d\" (UID: \"56d195ca-712e-42b8-b755-ed605d04f09d\") " Dec 02 09:29:57 crc kubenswrapper[4895]: I1202 09:29:57.446797 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/56d195ca-712e-42b8-b755-ed605d04f09d-inventory\") pod \"56d195ca-712e-42b8-b755-ed605d04f09d\" (UID: \"56d195ca-712e-42b8-b755-ed605d04f09d\") " Dec 02 09:29:57 crc kubenswrapper[4895]: I1202 09:29:57.446893 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/56d195ca-712e-42b8-b755-ed605d04f09d-ceph\") pod \"56d195ca-712e-42b8-b755-ed605d04f09d\" (UID: \"56d195ca-712e-42b8-b755-ed605d04f09d\") " Dec 02 09:29:57 crc kubenswrapper[4895]: I1202 09:29:57.452570 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56d195ca-712e-42b8-b755-ed605d04f09d-ceph" (OuterVolumeSpecName: "ceph") pod "56d195ca-712e-42b8-b755-ed605d04f09d" (UID: "56d195ca-712e-42b8-b755-ed605d04f09d"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:29:57 crc kubenswrapper[4895]: I1202 09:29:57.453013 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56d195ca-712e-42b8-b755-ed605d04f09d-kube-api-access-9kmxm" (OuterVolumeSpecName: "kube-api-access-9kmxm") pod "56d195ca-712e-42b8-b755-ed605d04f09d" (UID: "56d195ca-712e-42b8-b755-ed605d04f09d"). InnerVolumeSpecName "kube-api-access-9kmxm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:29:57 crc kubenswrapper[4895]: I1202 09:29:57.480696 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56d195ca-712e-42b8-b755-ed605d04f09d-inventory" (OuterVolumeSpecName: "inventory") pod "56d195ca-712e-42b8-b755-ed605d04f09d" (UID: "56d195ca-712e-42b8-b755-ed605d04f09d"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:29:57 crc kubenswrapper[4895]: I1202 09:29:57.484268 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56d195ca-712e-42b8-b755-ed605d04f09d-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "56d195ca-712e-42b8-b755-ed605d04f09d" (UID: "56d195ca-712e-42b8-b755-ed605d04f09d"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:29:57 crc kubenswrapper[4895]: I1202 09:29:57.550083 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9kmxm\" (UniqueName: \"kubernetes.io/projected/56d195ca-712e-42b8-b755-ed605d04f09d-kube-api-access-9kmxm\") on node \"crc\" DevicePath \"\"" Dec 02 09:29:57 crc kubenswrapper[4895]: I1202 09:29:57.550125 4895 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/56d195ca-712e-42b8-b755-ed605d04f09d-inventory\") on node \"crc\" DevicePath \"\"" Dec 02 09:29:57 crc kubenswrapper[4895]: I1202 09:29:57.550135 4895 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/56d195ca-712e-42b8-b755-ed605d04f09d-ceph\") on node \"crc\" DevicePath \"\"" Dec 02 09:29:57 crc kubenswrapper[4895]: I1202 09:29:57.550143 4895 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/56d195ca-712e-42b8-b755-ed605d04f09d-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 02 09:29:57 crc kubenswrapper[4895]: I1202 09:29:57.915704 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-kpnd9" event={"ID":"56d195ca-712e-42b8-b755-ed605d04f09d","Type":"ContainerDied","Data":"aa9c6796563bb0a8b36e2e45273bc5cd17553e8c21f27573ac1a199eb0703df3"} Dec 02 09:29:57 crc kubenswrapper[4895]: I1202 09:29:57.915973 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aa9c6796563bb0a8b36e2e45273bc5cd17553e8c21f27573ac1a199eb0703df3" Dec 02 09:29:57 crc kubenswrapper[4895]: I1202 09:29:57.916961 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-kpnd9" Dec 02 09:29:58 crc kubenswrapper[4895]: I1202 09:29:58.010131 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-openstack-openstack-cell1-z7wst"] Dec 02 09:29:58 crc kubenswrapper[4895]: E1202 09:29:58.010681 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83999926-e77c-4967-8a13-274c292e9b22" containerName="extract-utilities" Dec 02 09:29:58 crc kubenswrapper[4895]: I1202 09:29:58.010696 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="83999926-e77c-4967-8a13-274c292e9b22" containerName="extract-utilities" Dec 02 09:29:58 crc kubenswrapper[4895]: E1202 09:29:58.010715 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83999926-e77c-4967-8a13-274c292e9b22" containerName="registry-server" Dec 02 09:29:58 crc kubenswrapper[4895]: I1202 09:29:58.010722 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="83999926-e77c-4967-8a13-274c292e9b22" containerName="registry-server" Dec 02 09:29:58 crc kubenswrapper[4895]: E1202 09:29:58.010734 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56d195ca-712e-42b8-b755-ed605d04f09d" containerName="install-os-openstack-openstack-cell1" Dec 02 09:29:58 crc kubenswrapper[4895]: I1202 09:29:58.010764 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="56d195ca-712e-42b8-b755-ed605d04f09d" containerName="install-os-openstack-openstack-cell1" Dec 02 09:29:58 crc kubenswrapper[4895]: E1202 09:29:58.010812 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83999926-e77c-4967-8a13-274c292e9b22" containerName="extract-content" Dec 02 09:29:58 crc kubenswrapper[4895]: I1202 09:29:58.010819 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="83999926-e77c-4967-8a13-274c292e9b22" containerName="extract-content" Dec 02 09:29:58 crc kubenswrapper[4895]: I1202 09:29:58.011044 4895 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="56d195ca-712e-42b8-b755-ed605d04f09d" containerName="install-os-openstack-openstack-cell1" Dec 02 09:29:58 crc kubenswrapper[4895]: I1202 09:29:58.011064 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="83999926-e77c-4967-8a13-274c292e9b22" containerName="registry-server" Dec 02 09:29:58 crc kubenswrapper[4895]: I1202 09:29:58.012035 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-z7wst" Dec 02 09:29:58 crc kubenswrapper[4895]: I1202 09:29:58.014863 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 02 09:29:58 crc kubenswrapper[4895]: I1202 09:29:58.015120 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-brvc6" Dec 02 09:29:58 crc kubenswrapper[4895]: I1202 09:29:58.015401 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 02 09:29:58 crc kubenswrapper[4895]: I1202 09:29:58.015593 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 02 09:29:58 crc kubenswrapper[4895]: I1202 09:29:58.045693 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-openstack-openstack-cell1-z7wst"] Dec 02 09:29:58 crc kubenswrapper[4895]: I1202 09:29:58.061609 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/366b5800-e486-4b22-9e3a-4d0c86356cd0-ceph\") pod \"configure-os-openstack-openstack-cell1-z7wst\" (UID: \"366b5800-e486-4b22-9e3a-4d0c86356cd0\") " pod="openstack/configure-os-openstack-openstack-cell1-z7wst" Dec 02 09:29:58 crc kubenswrapper[4895]: I1202 09:29:58.061659 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"inventory\" (UniqueName: \"kubernetes.io/secret/366b5800-e486-4b22-9e3a-4d0c86356cd0-inventory\") pod \"configure-os-openstack-openstack-cell1-z7wst\" (UID: \"366b5800-e486-4b22-9e3a-4d0c86356cd0\") " pod="openstack/configure-os-openstack-openstack-cell1-z7wst" Dec 02 09:29:58 crc kubenswrapper[4895]: I1202 09:29:58.061699 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/366b5800-e486-4b22-9e3a-4d0c86356cd0-ssh-key\") pod \"configure-os-openstack-openstack-cell1-z7wst\" (UID: \"366b5800-e486-4b22-9e3a-4d0c86356cd0\") " pod="openstack/configure-os-openstack-openstack-cell1-z7wst" Dec 02 09:29:58 crc kubenswrapper[4895]: I1202 09:29:58.061723 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jmkm\" (UniqueName: \"kubernetes.io/projected/366b5800-e486-4b22-9e3a-4d0c86356cd0-kube-api-access-7jmkm\") pod \"configure-os-openstack-openstack-cell1-z7wst\" (UID: \"366b5800-e486-4b22-9e3a-4d0c86356cd0\") " pod="openstack/configure-os-openstack-openstack-cell1-z7wst" Dec 02 09:29:58 crc kubenswrapper[4895]: I1202 09:29:58.163926 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/366b5800-e486-4b22-9e3a-4d0c86356cd0-ceph\") pod \"configure-os-openstack-openstack-cell1-z7wst\" (UID: \"366b5800-e486-4b22-9e3a-4d0c86356cd0\") " pod="openstack/configure-os-openstack-openstack-cell1-z7wst" Dec 02 09:29:58 crc kubenswrapper[4895]: I1202 09:29:58.163968 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/366b5800-e486-4b22-9e3a-4d0c86356cd0-inventory\") pod \"configure-os-openstack-openstack-cell1-z7wst\" (UID: \"366b5800-e486-4b22-9e3a-4d0c86356cd0\") " pod="openstack/configure-os-openstack-openstack-cell1-z7wst" Dec 02 09:29:58 crc kubenswrapper[4895]: I1202 
09:29:58.164001 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/366b5800-e486-4b22-9e3a-4d0c86356cd0-ssh-key\") pod \"configure-os-openstack-openstack-cell1-z7wst\" (UID: \"366b5800-e486-4b22-9e3a-4d0c86356cd0\") " pod="openstack/configure-os-openstack-openstack-cell1-z7wst" Dec 02 09:29:58 crc kubenswrapper[4895]: I1202 09:29:58.164020 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jmkm\" (UniqueName: \"kubernetes.io/projected/366b5800-e486-4b22-9e3a-4d0c86356cd0-kube-api-access-7jmkm\") pod \"configure-os-openstack-openstack-cell1-z7wst\" (UID: \"366b5800-e486-4b22-9e3a-4d0c86356cd0\") " pod="openstack/configure-os-openstack-openstack-cell1-z7wst" Dec 02 09:29:58 crc kubenswrapper[4895]: I1202 09:29:58.168353 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/366b5800-e486-4b22-9e3a-4d0c86356cd0-ceph\") pod \"configure-os-openstack-openstack-cell1-z7wst\" (UID: \"366b5800-e486-4b22-9e3a-4d0c86356cd0\") " pod="openstack/configure-os-openstack-openstack-cell1-z7wst" Dec 02 09:29:58 crc kubenswrapper[4895]: I1202 09:29:58.168820 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/366b5800-e486-4b22-9e3a-4d0c86356cd0-ssh-key\") pod \"configure-os-openstack-openstack-cell1-z7wst\" (UID: \"366b5800-e486-4b22-9e3a-4d0c86356cd0\") " pod="openstack/configure-os-openstack-openstack-cell1-z7wst" Dec 02 09:29:58 crc kubenswrapper[4895]: I1202 09:29:58.168908 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/366b5800-e486-4b22-9e3a-4d0c86356cd0-inventory\") pod \"configure-os-openstack-openstack-cell1-z7wst\" (UID: \"366b5800-e486-4b22-9e3a-4d0c86356cd0\") " pod="openstack/configure-os-openstack-openstack-cell1-z7wst" Dec 02 09:29:58 crc 
kubenswrapper[4895]: I1202 09:29:58.180880 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jmkm\" (UniqueName: \"kubernetes.io/projected/366b5800-e486-4b22-9e3a-4d0c86356cd0-kube-api-access-7jmkm\") pod \"configure-os-openstack-openstack-cell1-z7wst\" (UID: \"366b5800-e486-4b22-9e3a-4d0c86356cd0\") " pod="openstack/configure-os-openstack-openstack-cell1-z7wst" Dec 02 09:29:58 crc kubenswrapper[4895]: I1202 09:29:58.338586 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-z7wst" Dec 02 09:29:58 crc kubenswrapper[4895]: I1202 09:29:58.880628 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-openstack-openstack-cell1-z7wst"] Dec 02 09:29:58 crc kubenswrapper[4895]: I1202 09:29:58.927818 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-z7wst" event={"ID":"366b5800-e486-4b22-9e3a-4d0c86356cd0","Type":"ContainerStarted","Data":"9041a8431238dfa2a2c23f1a7f118a157760bf1b38d0a4ea6667f51741ad10dc"} Dec 02 09:29:59 crc kubenswrapper[4895]: I1202 09:29:59.938303 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-z7wst" event={"ID":"366b5800-e486-4b22-9e3a-4d0c86356cd0","Type":"ContainerStarted","Data":"f1ec2781b08f551a88f72ca01716a2100712fb7a527937e49424e214adb17ae6"} Dec 02 09:29:59 crc kubenswrapper[4895]: I1202 09:29:59.977064 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-openstack-openstack-cell1-z7wst" podStartSLOduration=2.80112015 podStartE2EDuration="2.977030326s" podCreationTimestamp="2025-12-02 09:29:57 +0000 UTC" firstStartedPulling="2025-12-02 09:29:58.87967759 +0000 UTC m=+7610.050537203" lastFinishedPulling="2025-12-02 09:29:59.055587766 +0000 UTC m=+7610.226447379" observedRunningTime="2025-12-02 09:29:59.963777993 +0000 UTC 
m=+7611.134637646" watchObservedRunningTime="2025-12-02 09:29:59.977030326 +0000 UTC m=+7611.147889959" Dec 02 09:30:00 crc kubenswrapper[4895]: I1202 09:30:00.161615 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411130-z2skm"] Dec 02 09:30:00 crc kubenswrapper[4895]: I1202 09:30:00.163237 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411130-z2skm" Dec 02 09:30:00 crc kubenswrapper[4895]: I1202 09:30:00.165961 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 02 09:30:00 crc kubenswrapper[4895]: I1202 09:30:00.166327 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 02 09:30:00 crc kubenswrapper[4895]: I1202 09:30:00.177145 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411130-z2skm"] Dec 02 09:30:00 crc kubenswrapper[4895]: I1202 09:30:00.208873 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p85sh\" (UniqueName: \"kubernetes.io/projected/ecbb554b-7c1f-4475-83ff-8184cc72986b-kube-api-access-p85sh\") pod \"collect-profiles-29411130-z2skm\" (UID: \"ecbb554b-7c1f-4475-83ff-8184cc72986b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411130-z2skm" Dec 02 09:30:00 crc kubenswrapper[4895]: I1202 09:30:00.209072 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ecbb554b-7c1f-4475-83ff-8184cc72986b-secret-volume\") pod \"collect-profiles-29411130-z2skm\" (UID: \"ecbb554b-7c1f-4475-83ff-8184cc72986b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411130-z2skm" Dec 
02 09:30:00 crc kubenswrapper[4895]: I1202 09:30:00.209626 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ecbb554b-7c1f-4475-83ff-8184cc72986b-config-volume\") pod \"collect-profiles-29411130-z2skm\" (UID: \"ecbb554b-7c1f-4475-83ff-8184cc72986b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411130-z2skm" Dec 02 09:30:00 crc kubenswrapper[4895]: I1202 09:30:00.310436 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ecbb554b-7c1f-4475-83ff-8184cc72986b-config-volume\") pod \"collect-profiles-29411130-z2skm\" (UID: \"ecbb554b-7c1f-4475-83ff-8184cc72986b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411130-z2skm" Dec 02 09:30:00 crc kubenswrapper[4895]: I1202 09:30:00.310808 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p85sh\" (UniqueName: \"kubernetes.io/projected/ecbb554b-7c1f-4475-83ff-8184cc72986b-kube-api-access-p85sh\") pod \"collect-profiles-29411130-z2skm\" (UID: \"ecbb554b-7c1f-4475-83ff-8184cc72986b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411130-z2skm" Dec 02 09:30:00 crc kubenswrapper[4895]: I1202 09:30:00.310860 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ecbb554b-7c1f-4475-83ff-8184cc72986b-secret-volume\") pod \"collect-profiles-29411130-z2skm\" (UID: \"ecbb554b-7c1f-4475-83ff-8184cc72986b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411130-z2skm" Dec 02 09:30:00 crc kubenswrapper[4895]: I1202 09:30:00.312758 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ecbb554b-7c1f-4475-83ff-8184cc72986b-config-volume\") pod \"collect-profiles-29411130-z2skm\" 
(UID: \"ecbb554b-7c1f-4475-83ff-8184cc72986b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411130-z2skm" Dec 02 09:30:00 crc kubenswrapper[4895]: I1202 09:30:00.322601 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ecbb554b-7c1f-4475-83ff-8184cc72986b-secret-volume\") pod \"collect-profiles-29411130-z2skm\" (UID: \"ecbb554b-7c1f-4475-83ff-8184cc72986b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411130-z2skm" Dec 02 09:30:00 crc kubenswrapper[4895]: I1202 09:30:00.336333 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p85sh\" (UniqueName: \"kubernetes.io/projected/ecbb554b-7c1f-4475-83ff-8184cc72986b-kube-api-access-p85sh\") pod \"collect-profiles-29411130-z2skm\" (UID: \"ecbb554b-7c1f-4475-83ff-8184cc72986b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411130-z2skm" Dec 02 09:30:00 crc kubenswrapper[4895]: I1202 09:30:00.497787 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411130-z2skm" Dec 02 09:30:01 crc kubenswrapper[4895]: I1202 09:30:01.185427 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411130-z2skm"] Dec 02 09:30:01 crc kubenswrapper[4895]: I1202 09:30:01.961153 4895 generic.go:334] "Generic (PLEG): container finished" podID="ecbb554b-7c1f-4475-83ff-8184cc72986b" containerID="b8b3946e05073aa58ebcb6cd2f96cb0fc31f4a49336abf33b930a0bb61f82989" exitCode=0 Dec 02 09:30:01 crc kubenswrapper[4895]: I1202 09:30:01.961279 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411130-z2skm" event={"ID":"ecbb554b-7c1f-4475-83ff-8184cc72986b","Type":"ContainerDied","Data":"b8b3946e05073aa58ebcb6cd2f96cb0fc31f4a49336abf33b930a0bb61f82989"} Dec 02 09:30:01 crc kubenswrapper[4895]: I1202 09:30:01.961687 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411130-z2skm" event={"ID":"ecbb554b-7c1f-4475-83ff-8184cc72986b","Type":"ContainerStarted","Data":"6368a65c678e4cd42d0a4523e4b376e85b69f11faf4db252ec562b10500da2a1"} Dec 02 09:30:03 crc kubenswrapper[4895]: I1202 09:30:03.317807 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411130-z2skm" Dec 02 09:30:03 crc kubenswrapper[4895]: I1202 09:30:03.420664 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p85sh\" (UniqueName: \"kubernetes.io/projected/ecbb554b-7c1f-4475-83ff-8184cc72986b-kube-api-access-p85sh\") pod \"ecbb554b-7c1f-4475-83ff-8184cc72986b\" (UID: \"ecbb554b-7c1f-4475-83ff-8184cc72986b\") " Dec 02 09:30:03 crc kubenswrapper[4895]: I1202 09:30:03.420759 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ecbb554b-7c1f-4475-83ff-8184cc72986b-secret-volume\") pod \"ecbb554b-7c1f-4475-83ff-8184cc72986b\" (UID: \"ecbb554b-7c1f-4475-83ff-8184cc72986b\") " Dec 02 09:30:03 crc kubenswrapper[4895]: I1202 09:30:03.420878 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ecbb554b-7c1f-4475-83ff-8184cc72986b-config-volume\") pod \"ecbb554b-7c1f-4475-83ff-8184cc72986b\" (UID: \"ecbb554b-7c1f-4475-83ff-8184cc72986b\") " Dec 02 09:30:03 crc kubenswrapper[4895]: I1202 09:30:03.421676 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ecbb554b-7c1f-4475-83ff-8184cc72986b-config-volume" (OuterVolumeSpecName: "config-volume") pod "ecbb554b-7c1f-4475-83ff-8184cc72986b" (UID: "ecbb554b-7c1f-4475-83ff-8184cc72986b"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:30:03 crc kubenswrapper[4895]: I1202 09:30:03.426619 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ecbb554b-7c1f-4475-83ff-8184cc72986b-kube-api-access-p85sh" (OuterVolumeSpecName: "kube-api-access-p85sh") pod "ecbb554b-7c1f-4475-83ff-8184cc72986b" (UID: "ecbb554b-7c1f-4475-83ff-8184cc72986b"). 
InnerVolumeSpecName "kube-api-access-p85sh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:30:03 crc kubenswrapper[4895]: I1202 09:30:03.426840 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecbb554b-7c1f-4475-83ff-8184cc72986b-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "ecbb554b-7c1f-4475-83ff-8184cc72986b" (UID: "ecbb554b-7c1f-4475-83ff-8184cc72986b"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:30:03 crc kubenswrapper[4895]: I1202 09:30:03.523404 4895 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ecbb554b-7c1f-4475-83ff-8184cc72986b-config-volume\") on node \"crc\" DevicePath \"\"" Dec 02 09:30:03 crc kubenswrapper[4895]: I1202 09:30:03.523714 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p85sh\" (UniqueName: \"kubernetes.io/projected/ecbb554b-7c1f-4475-83ff-8184cc72986b-kube-api-access-p85sh\") on node \"crc\" DevicePath \"\"" Dec 02 09:30:03 crc kubenswrapper[4895]: I1202 09:30:03.523725 4895 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ecbb554b-7c1f-4475-83ff-8184cc72986b-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 02 09:30:03 crc kubenswrapper[4895]: I1202 09:30:03.985806 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411130-z2skm" event={"ID":"ecbb554b-7c1f-4475-83ff-8184cc72986b","Type":"ContainerDied","Data":"6368a65c678e4cd42d0a4523e4b376e85b69f11faf4db252ec562b10500da2a1"} Dec 02 09:30:03 crc kubenswrapper[4895]: I1202 09:30:03.986155 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6368a65c678e4cd42d0a4523e4b376e85b69f11faf4db252ec562b10500da2a1" Dec 02 09:30:03 crc kubenswrapper[4895]: I1202 09:30:03.985848 4895 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411130-z2skm" Dec 02 09:30:04 crc kubenswrapper[4895]: I1202 09:30:04.394242 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411085-v5dfx"] Dec 02 09:30:04 crc kubenswrapper[4895]: I1202 09:30:04.403134 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411085-v5dfx"] Dec 02 09:30:05 crc kubenswrapper[4895]: I1202 09:30:05.155932 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6858333d-2201-4f94-a119-c92c9dbf7cce" path="/var/lib/kubelet/pods/6858333d-2201-4f94-a119-c92c9dbf7cce/volumes" Dec 02 09:30:39 crc kubenswrapper[4895]: I1202 09:30:39.239420 4895 scope.go:117] "RemoveContainer" containerID="cd9c72188aebc52ab1dfe2e9f41694f764e98994862c2ae178b4f1b0280854fb" Dec 02 09:30:44 crc kubenswrapper[4895]: I1202 09:30:44.416056 4895 generic.go:334] "Generic (PLEG): container finished" podID="366b5800-e486-4b22-9e3a-4d0c86356cd0" containerID="f1ec2781b08f551a88f72ca01716a2100712fb7a527937e49424e214adb17ae6" exitCode=0 Dec 02 09:30:44 crc kubenswrapper[4895]: I1202 09:30:44.416938 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-z7wst" event={"ID":"366b5800-e486-4b22-9e3a-4d0c86356cd0","Type":"ContainerDied","Data":"f1ec2781b08f551a88f72ca01716a2100712fb7a527937e49424e214adb17ae6"} Dec 02 09:30:45 crc kubenswrapper[4895]: I1202 09:30:45.957681 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-z7wst" Dec 02 09:30:46 crc kubenswrapper[4895]: I1202 09:30:46.020525 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7jmkm\" (UniqueName: \"kubernetes.io/projected/366b5800-e486-4b22-9e3a-4d0c86356cd0-kube-api-access-7jmkm\") pod \"366b5800-e486-4b22-9e3a-4d0c86356cd0\" (UID: \"366b5800-e486-4b22-9e3a-4d0c86356cd0\") " Dec 02 09:30:46 crc kubenswrapper[4895]: I1202 09:30:46.020705 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/366b5800-e486-4b22-9e3a-4d0c86356cd0-ceph\") pod \"366b5800-e486-4b22-9e3a-4d0c86356cd0\" (UID: \"366b5800-e486-4b22-9e3a-4d0c86356cd0\") " Dec 02 09:30:46 crc kubenswrapper[4895]: I1202 09:30:46.020804 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/366b5800-e486-4b22-9e3a-4d0c86356cd0-ssh-key\") pod \"366b5800-e486-4b22-9e3a-4d0c86356cd0\" (UID: \"366b5800-e486-4b22-9e3a-4d0c86356cd0\") " Dec 02 09:30:46 crc kubenswrapper[4895]: I1202 09:30:46.020932 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/366b5800-e486-4b22-9e3a-4d0c86356cd0-inventory\") pod \"366b5800-e486-4b22-9e3a-4d0c86356cd0\" (UID: \"366b5800-e486-4b22-9e3a-4d0c86356cd0\") " Dec 02 09:30:46 crc kubenswrapper[4895]: I1202 09:30:46.027161 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/366b5800-e486-4b22-9e3a-4d0c86356cd0-kube-api-access-7jmkm" (OuterVolumeSpecName: "kube-api-access-7jmkm") pod "366b5800-e486-4b22-9e3a-4d0c86356cd0" (UID: "366b5800-e486-4b22-9e3a-4d0c86356cd0"). InnerVolumeSpecName "kube-api-access-7jmkm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:30:46 crc kubenswrapper[4895]: I1202 09:30:46.027869 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/366b5800-e486-4b22-9e3a-4d0c86356cd0-ceph" (OuterVolumeSpecName: "ceph") pod "366b5800-e486-4b22-9e3a-4d0c86356cd0" (UID: "366b5800-e486-4b22-9e3a-4d0c86356cd0"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:30:46 crc kubenswrapper[4895]: I1202 09:30:46.052319 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/366b5800-e486-4b22-9e3a-4d0c86356cd0-inventory" (OuterVolumeSpecName: "inventory") pod "366b5800-e486-4b22-9e3a-4d0c86356cd0" (UID: "366b5800-e486-4b22-9e3a-4d0c86356cd0"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:30:46 crc kubenswrapper[4895]: I1202 09:30:46.078645 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/366b5800-e486-4b22-9e3a-4d0c86356cd0-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "366b5800-e486-4b22-9e3a-4d0c86356cd0" (UID: "366b5800-e486-4b22-9e3a-4d0c86356cd0"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:30:46 crc kubenswrapper[4895]: I1202 09:30:46.123283 4895 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/366b5800-e486-4b22-9e3a-4d0c86356cd0-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 02 09:30:46 crc kubenswrapper[4895]: I1202 09:30:46.123328 4895 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/366b5800-e486-4b22-9e3a-4d0c86356cd0-inventory\") on node \"crc\" DevicePath \"\"" Dec 02 09:30:46 crc kubenswrapper[4895]: I1202 09:30:46.123340 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7jmkm\" (UniqueName: \"kubernetes.io/projected/366b5800-e486-4b22-9e3a-4d0c86356cd0-kube-api-access-7jmkm\") on node \"crc\" DevicePath \"\"" Dec 02 09:30:46 crc kubenswrapper[4895]: I1202 09:30:46.123351 4895 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/366b5800-e486-4b22-9e3a-4d0c86356cd0-ceph\") on node \"crc\" DevicePath \"\"" Dec 02 09:30:46 crc kubenswrapper[4895]: I1202 09:30:46.442499 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-z7wst" event={"ID":"366b5800-e486-4b22-9e3a-4d0c86356cd0","Type":"ContainerDied","Data":"9041a8431238dfa2a2c23f1a7f118a157760bf1b38d0a4ea6667f51741ad10dc"} Dec 02 09:30:46 crc kubenswrapper[4895]: I1202 09:30:46.442549 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9041a8431238dfa2a2c23f1a7f118a157760bf1b38d0a4ea6667f51741ad10dc" Dec 02 09:30:46 crc kubenswrapper[4895]: I1202 09:30:46.442560 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-z7wst" Dec 02 09:30:46 crc kubenswrapper[4895]: I1202 09:30:46.530331 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-openstack-7vsz6"] Dec 02 09:30:46 crc kubenswrapper[4895]: E1202 09:30:46.530906 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="366b5800-e486-4b22-9e3a-4d0c86356cd0" containerName="configure-os-openstack-openstack-cell1" Dec 02 09:30:46 crc kubenswrapper[4895]: I1202 09:30:46.530932 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="366b5800-e486-4b22-9e3a-4d0c86356cd0" containerName="configure-os-openstack-openstack-cell1" Dec 02 09:30:46 crc kubenswrapper[4895]: E1202 09:30:46.530982 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecbb554b-7c1f-4475-83ff-8184cc72986b" containerName="collect-profiles" Dec 02 09:30:46 crc kubenswrapper[4895]: I1202 09:30:46.530992 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecbb554b-7c1f-4475-83ff-8184cc72986b" containerName="collect-profiles" Dec 02 09:30:46 crc kubenswrapper[4895]: I1202 09:30:46.531249 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="ecbb554b-7c1f-4475-83ff-8184cc72986b" containerName="collect-profiles" Dec 02 09:30:46 crc kubenswrapper[4895]: I1202 09:30:46.531294 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="366b5800-e486-4b22-9e3a-4d0c86356cd0" containerName="configure-os-openstack-openstack-cell1" Dec 02 09:30:46 crc kubenswrapper[4895]: I1202 09:30:46.532436 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-openstack-7vsz6" Dec 02 09:30:46 crc kubenswrapper[4895]: I1202 09:30:46.536835 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 02 09:30:46 crc kubenswrapper[4895]: I1202 09:30:46.536833 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-brvc6" Dec 02 09:30:46 crc kubenswrapper[4895]: I1202 09:30:46.537418 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 02 09:30:46 crc kubenswrapper[4895]: I1202 09:30:46.537705 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 02 09:30:46 crc kubenswrapper[4895]: I1202 09:30:46.540554 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-openstack-7vsz6"] Dec 02 09:30:46 crc kubenswrapper[4895]: I1202 09:30:46.633365 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/3834bc1f-18a0-4d57-8f0d-e5150bd51186-ssh-key-openstack-cell1\") pod \"ssh-known-hosts-openstack-7vsz6\" (UID: \"3834bc1f-18a0-4d57-8f0d-e5150bd51186\") " pod="openstack/ssh-known-hosts-openstack-7vsz6" Dec 02 09:30:46 crc kubenswrapper[4895]: I1202 09:30:46.633476 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8v9d\" (UniqueName: \"kubernetes.io/projected/3834bc1f-18a0-4d57-8f0d-e5150bd51186-kube-api-access-b8v9d\") pod \"ssh-known-hosts-openstack-7vsz6\" (UID: \"3834bc1f-18a0-4d57-8f0d-e5150bd51186\") " pod="openstack/ssh-known-hosts-openstack-7vsz6" Dec 02 09:30:46 crc kubenswrapper[4895]: I1202 09:30:46.633551 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: 
\"kubernetes.io/secret/3834bc1f-18a0-4d57-8f0d-e5150bd51186-inventory-0\") pod \"ssh-known-hosts-openstack-7vsz6\" (UID: \"3834bc1f-18a0-4d57-8f0d-e5150bd51186\") " pod="openstack/ssh-known-hosts-openstack-7vsz6" Dec 02 09:30:46 crc kubenswrapper[4895]: I1202 09:30:46.634042 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3834bc1f-18a0-4d57-8f0d-e5150bd51186-ceph\") pod \"ssh-known-hosts-openstack-7vsz6\" (UID: \"3834bc1f-18a0-4d57-8f0d-e5150bd51186\") " pod="openstack/ssh-known-hosts-openstack-7vsz6" Dec 02 09:30:46 crc kubenswrapper[4895]: I1202 09:30:46.736766 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b8v9d\" (UniqueName: \"kubernetes.io/projected/3834bc1f-18a0-4d57-8f0d-e5150bd51186-kube-api-access-b8v9d\") pod \"ssh-known-hosts-openstack-7vsz6\" (UID: \"3834bc1f-18a0-4d57-8f0d-e5150bd51186\") " pod="openstack/ssh-known-hosts-openstack-7vsz6" Dec 02 09:30:46 crc kubenswrapper[4895]: I1202 09:30:46.736921 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/3834bc1f-18a0-4d57-8f0d-e5150bd51186-inventory-0\") pod \"ssh-known-hosts-openstack-7vsz6\" (UID: \"3834bc1f-18a0-4d57-8f0d-e5150bd51186\") " pod="openstack/ssh-known-hosts-openstack-7vsz6" Dec 02 09:30:46 crc kubenswrapper[4895]: I1202 09:30:46.737198 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3834bc1f-18a0-4d57-8f0d-e5150bd51186-ceph\") pod \"ssh-known-hosts-openstack-7vsz6\" (UID: \"3834bc1f-18a0-4d57-8f0d-e5150bd51186\") " pod="openstack/ssh-known-hosts-openstack-7vsz6" Dec 02 09:30:46 crc kubenswrapper[4895]: I1202 09:30:46.737271 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: 
\"kubernetes.io/secret/3834bc1f-18a0-4d57-8f0d-e5150bd51186-ssh-key-openstack-cell1\") pod \"ssh-known-hosts-openstack-7vsz6\" (UID: \"3834bc1f-18a0-4d57-8f0d-e5150bd51186\") " pod="openstack/ssh-known-hosts-openstack-7vsz6" Dec 02 09:30:46 crc kubenswrapper[4895]: I1202 09:30:46.741446 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/3834bc1f-18a0-4d57-8f0d-e5150bd51186-inventory-0\") pod \"ssh-known-hosts-openstack-7vsz6\" (UID: \"3834bc1f-18a0-4d57-8f0d-e5150bd51186\") " pod="openstack/ssh-known-hosts-openstack-7vsz6" Dec 02 09:30:46 crc kubenswrapper[4895]: I1202 09:30:46.741538 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/3834bc1f-18a0-4d57-8f0d-e5150bd51186-ssh-key-openstack-cell1\") pod \"ssh-known-hosts-openstack-7vsz6\" (UID: \"3834bc1f-18a0-4d57-8f0d-e5150bd51186\") " pod="openstack/ssh-known-hosts-openstack-7vsz6" Dec 02 09:30:46 crc kubenswrapper[4895]: I1202 09:30:46.743221 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3834bc1f-18a0-4d57-8f0d-e5150bd51186-ceph\") pod \"ssh-known-hosts-openstack-7vsz6\" (UID: \"3834bc1f-18a0-4d57-8f0d-e5150bd51186\") " pod="openstack/ssh-known-hosts-openstack-7vsz6" Dec 02 09:30:46 crc kubenswrapper[4895]: I1202 09:30:46.756504 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8v9d\" (UniqueName: \"kubernetes.io/projected/3834bc1f-18a0-4d57-8f0d-e5150bd51186-kube-api-access-b8v9d\") pod \"ssh-known-hosts-openstack-7vsz6\" (UID: \"3834bc1f-18a0-4d57-8f0d-e5150bd51186\") " pod="openstack/ssh-known-hosts-openstack-7vsz6" Dec 02 09:30:46 crc kubenswrapper[4895]: I1202 09:30:46.857229 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-openstack-7vsz6" Dec 02 09:30:47 crc kubenswrapper[4895]: I1202 09:30:47.444591 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-openstack-7vsz6"] Dec 02 09:30:48 crc kubenswrapper[4895]: I1202 09:30:48.473059 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-7vsz6" event={"ID":"3834bc1f-18a0-4d57-8f0d-e5150bd51186","Type":"ContainerStarted","Data":"d763b8941c3f5cede6798a3a552fbd8b7aa1fbf23d7a3b020502c89deb338a08"} Dec 02 09:30:48 crc kubenswrapper[4895]: I1202 09:30:48.473511 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-7vsz6" event={"ID":"3834bc1f-18a0-4d57-8f0d-e5150bd51186","Type":"ContainerStarted","Data":"a8f36993d389958ed4a0bdcbacf19cb538c7f914598ad21a8f21f6814b27cbe1"} Dec 02 09:30:48 crc kubenswrapper[4895]: I1202 09:30:48.494166 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-openstack-7vsz6" podStartSLOduration=2.323794452 podStartE2EDuration="2.494094754s" podCreationTimestamp="2025-12-02 09:30:46 +0000 UTC" firstStartedPulling="2025-12-02 09:30:47.459147161 +0000 UTC m=+7658.630006784" lastFinishedPulling="2025-12-02 09:30:47.629447473 +0000 UTC m=+7658.800307086" observedRunningTime="2025-12-02 09:30:48.493031791 +0000 UTC m=+7659.663891414" watchObservedRunningTime="2025-12-02 09:30:48.494094754 +0000 UTC m=+7659.664954377" Dec 02 09:30:56 crc kubenswrapper[4895]: I1202 09:30:56.550182 4895 generic.go:334] "Generic (PLEG): container finished" podID="3834bc1f-18a0-4d57-8f0d-e5150bd51186" containerID="d763b8941c3f5cede6798a3a552fbd8b7aa1fbf23d7a3b020502c89deb338a08" exitCode=0 Dec 02 09:30:56 crc kubenswrapper[4895]: I1202 09:30:56.550278 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-7vsz6" 
event={"ID":"3834bc1f-18a0-4d57-8f0d-e5150bd51186","Type":"ContainerDied","Data":"d763b8941c3f5cede6798a3a552fbd8b7aa1fbf23d7a3b020502c89deb338a08"} Dec 02 09:30:58 crc kubenswrapper[4895]: I1202 09:30:58.042488 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-openstack-7vsz6" Dec 02 09:30:58 crc kubenswrapper[4895]: I1202 09:30:58.211987 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3834bc1f-18a0-4d57-8f0d-e5150bd51186-ceph\") pod \"3834bc1f-18a0-4d57-8f0d-e5150bd51186\" (UID: \"3834bc1f-18a0-4d57-8f0d-e5150bd51186\") " Dec 02 09:30:58 crc kubenswrapper[4895]: I1202 09:30:58.212195 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/3834bc1f-18a0-4d57-8f0d-e5150bd51186-inventory-0\") pod \"3834bc1f-18a0-4d57-8f0d-e5150bd51186\" (UID: \"3834bc1f-18a0-4d57-8f0d-e5150bd51186\") " Dec 02 09:30:58 crc kubenswrapper[4895]: I1202 09:30:58.212392 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/3834bc1f-18a0-4d57-8f0d-e5150bd51186-ssh-key-openstack-cell1\") pod \"3834bc1f-18a0-4d57-8f0d-e5150bd51186\" (UID: \"3834bc1f-18a0-4d57-8f0d-e5150bd51186\") " Dec 02 09:30:58 crc kubenswrapper[4895]: I1202 09:30:58.212438 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b8v9d\" (UniqueName: \"kubernetes.io/projected/3834bc1f-18a0-4d57-8f0d-e5150bd51186-kube-api-access-b8v9d\") pod \"3834bc1f-18a0-4d57-8f0d-e5150bd51186\" (UID: \"3834bc1f-18a0-4d57-8f0d-e5150bd51186\") " Dec 02 09:30:58 crc kubenswrapper[4895]: I1202 09:30:58.218600 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3834bc1f-18a0-4d57-8f0d-e5150bd51186-ceph" (OuterVolumeSpecName: "ceph") pod 
"3834bc1f-18a0-4d57-8f0d-e5150bd51186" (UID: "3834bc1f-18a0-4d57-8f0d-e5150bd51186"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:30:58 crc kubenswrapper[4895]: I1202 09:30:58.218808 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3834bc1f-18a0-4d57-8f0d-e5150bd51186-kube-api-access-b8v9d" (OuterVolumeSpecName: "kube-api-access-b8v9d") pod "3834bc1f-18a0-4d57-8f0d-e5150bd51186" (UID: "3834bc1f-18a0-4d57-8f0d-e5150bd51186"). InnerVolumeSpecName "kube-api-access-b8v9d". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:30:58 crc kubenswrapper[4895]: I1202 09:30:58.252911 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3834bc1f-18a0-4d57-8f0d-e5150bd51186-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "3834bc1f-18a0-4d57-8f0d-e5150bd51186" (UID: "3834bc1f-18a0-4d57-8f0d-e5150bd51186"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:30:58 crc kubenswrapper[4895]: I1202 09:30:58.255845 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3834bc1f-18a0-4d57-8f0d-e5150bd51186-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "3834bc1f-18a0-4d57-8f0d-e5150bd51186" (UID: "3834bc1f-18a0-4d57-8f0d-e5150bd51186"). InnerVolumeSpecName "ssh-key-openstack-cell1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:30:58 crc kubenswrapper[4895]: I1202 09:30:58.314634 4895 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/3834bc1f-18a0-4d57-8f0d-e5150bd51186-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Dec 02 09:30:58 crc kubenswrapper[4895]: I1202 09:30:58.314668 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b8v9d\" (UniqueName: \"kubernetes.io/projected/3834bc1f-18a0-4d57-8f0d-e5150bd51186-kube-api-access-b8v9d\") on node \"crc\" DevicePath \"\"" Dec 02 09:30:58 crc kubenswrapper[4895]: I1202 09:30:58.314679 4895 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3834bc1f-18a0-4d57-8f0d-e5150bd51186-ceph\") on node \"crc\" DevicePath \"\"" Dec 02 09:30:58 crc kubenswrapper[4895]: I1202 09:30:58.314690 4895 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/3834bc1f-18a0-4d57-8f0d-e5150bd51186-inventory-0\") on node \"crc\" DevicePath \"\"" Dec 02 09:30:58 crc kubenswrapper[4895]: I1202 09:30:58.571414 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-7vsz6" event={"ID":"3834bc1f-18a0-4d57-8f0d-e5150bd51186","Type":"ContainerDied","Data":"a8f36993d389958ed4a0bdcbacf19cb538c7f914598ad21a8f21f6814b27cbe1"} Dec 02 09:30:58 crc kubenswrapper[4895]: I1202 09:30:58.571702 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a8f36993d389958ed4a0bdcbacf19cb538c7f914598ad21a8f21f6814b27cbe1" Dec 02 09:30:58 crc kubenswrapper[4895]: I1202 09:30:58.571510 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-openstack-7vsz6" Dec 02 09:30:58 crc kubenswrapper[4895]: I1202 09:30:58.645979 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-openstack-openstack-cell1-j9gdl"] Dec 02 09:30:58 crc kubenswrapper[4895]: E1202 09:30:58.646557 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3834bc1f-18a0-4d57-8f0d-e5150bd51186" containerName="ssh-known-hosts-openstack" Dec 02 09:30:58 crc kubenswrapper[4895]: I1202 09:30:58.646582 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="3834bc1f-18a0-4d57-8f0d-e5150bd51186" containerName="ssh-known-hosts-openstack" Dec 02 09:30:58 crc kubenswrapper[4895]: I1202 09:30:58.646982 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="3834bc1f-18a0-4d57-8f0d-e5150bd51186" containerName="ssh-known-hosts-openstack" Dec 02 09:30:58 crc kubenswrapper[4895]: I1202 09:30:58.648085 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-j9gdl" Dec 02 09:30:58 crc kubenswrapper[4895]: I1202 09:30:58.650476 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 02 09:30:58 crc kubenswrapper[4895]: I1202 09:30:58.650477 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 02 09:30:58 crc kubenswrapper[4895]: I1202 09:30:58.650555 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-brvc6" Dec 02 09:30:58 crc kubenswrapper[4895]: I1202 09:30:58.650656 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 02 09:30:58 crc kubenswrapper[4895]: I1202 09:30:58.658078 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-openstack-openstack-cell1-j9gdl"] Dec 02 09:30:58 crc kubenswrapper[4895]: I1202 
09:30:58.825991 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/afc474d8-721b-479d-a10b-adfa2455b1fb-ceph\") pod \"run-os-openstack-openstack-cell1-j9gdl\" (UID: \"afc474d8-721b-479d-a10b-adfa2455b1fb\") " pod="openstack/run-os-openstack-openstack-cell1-j9gdl" Dec 02 09:30:58 crc kubenswrapper[4895]: I1202 09:30:58.826405 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/afc474d8-721b-479d-a10b-adfa2455b1fb-ssh-key\") pod \"run-os-openstack-openstack-cell1-j9gdl\" (UID: \"afc474d8-721b-479d-a10b-adfa2455b1fb\") " pod="openstack/run-os-openstack-openstack-cell1-j9gdl" Dec 02 09:30:58 crc kubenswrapper[4895]: I1202 09:30:58.826479 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whhm2\" (UniqueName: \"kubernetes.io/projected/afc474d8-721b-479d-a10b-adfa2455b1fb-kube-api-access-whhm2\") pod \"run-os-openstack-openstack-cell1-j9gdl\" (UID: \"afc474d8-721b-479d-a10b-adfa2455b1fb\") " pod="openstack/run-os-openstack-openstack-cell1-j9gdl" Dec 02 09:30:58 crc kubenswrapper[4895]: I1202 09:30:58.826616 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/afc474d8-721b-479d-a10b-adfa2455b1fb-inventory\") pod \"run-os-openstack-openstack-cell1-j9gdl\" (UID: \"afc474d8-721b-479d-a10b-adfa2455b1fb\") " pod="openstack/run-os-openstack-openstack-cell1-j9gdl" Dec 02 09:30:58 crc kubenswrapper[4895]: I1202 09:30:58.928535 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/afc474d8-721b-479d-a10b-adfa2455b1fb-ceph\") pod \"run-os-openstack-openstack-cell1-j9gdl\" (UID: \"afc474d8-721b-479d-a10b-adfa2455b1fb\") " 
pod="openstack/run-os-openstack-openstack-cell1-j9gdl" Dec 02 09:30:58 crc kubenswrapper[4895]: I1202 09:30:58.928603 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/afc474d8-721b-479d-a10b-adfa2455b1fb-ssh-key\") pod \"run-os-openstack-openstack-cell1-j9gdl\" (UID: \"afc474d8-721b-479d-a10b-adfa2455b1fb\") " pod="openstack/run-os-openstack-openstack-cell1-j9gdl" Dec 02 09:30:58 crc kubenswrapper[4895]: I1202 09:30:58.928652 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whhm2\" (UniqueName: \"kubernetes.io/projected/afc474d8-721b-479d-a10b-adfa2455b1fb-kube-api-access-whhm2\") pod \"run-os-openstack-openstack-cell1-j9gdl\" (UID: \"afc474d8-721b-479d-a10b-adfa2455b1fb\") " pod="openstack/run-os-openstack-openstack-cell1-j9gdl" Dec 02 09:30:58 crc kubenswrapper[4895]: I1202 09:30:58.928724 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/afc474d8-721b-479d-a10b-adfa2455b1fb-inventory\") pod \"run-os-openstack-openstack-cell1-j9gdl\" (UID: \"afc474d8-721b-479d-a10b-adfa2455b1fb\") " pod="openstack/run-os-openstack-openstack-cell1-j9gdl" Dec 02 09:30:58 crc kubenswrapper[4895]: I1202 09:30:58.934543 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/afc474d8-721b-479d-a10b-adfa2455b1fb-inventory\") pod \"run-os-openstack-openstack-cell1-j9gdl\" (UID: \"afc474d8-721b-479d-a10b-adfa2455b1fb\") " pod="openstack/run-os-openstack-openstack-cell1-j9gdl" Dec 02 09:30:58 crc kubenswrapper[4895]: I1202 09:30:58.935428 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/afc474d8-721b-479d-a10b-adfa2455b1fb-ceph\") pod \"run-os-openstack-openstack-cell1-j9gdl\" (UID: \"afc474d8-721b-479d-a10b-adfa2455b1fb\") " 
pod="openstack/run-os-openstack-openstack-cell1-j9gdl" Dec 02 09:30:58 crc kubenswrapper[4895]: I1202 09:30:58.946117 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/afc474d8-721b-479d-a10b-adfa2455b1fb-ssh-key\") pod \"run-os-openstack-openstack-cell1-j9gdl\" (UID: \"afc474d8-721b-479d-a10b-adfa2455b1fb\") " pod="openstack/run-os-openstack-openstack-cell1-j9gdl" Dec 02 09:30:58 crc kubenswrapper[4895]: I1202 09:30:58.950192 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-whhm2\" (UniqueName: \"kubernetes.io/projected/afc474d8-721b-479d-a10b-adfa2455b1fb-kube-api-access-whhm2\") pod \"run-os-openstack-openstack-cell1-j9gdl\" (UID: \"afc474d8-721b-479d-a10b-adfa2455b1fb\") " pod="openstack/run-os-openstack-openstack-cell1-j9gdl" Dec 02 09:30:58 crc kubenswrapper[4895]: I1202 09:30:58.965200 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-j9gdl" Dec 02 09:30:59 crc kubenswrapper[4895]: I1202 09:30:59.546782 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-openstack-openstack-cell1-j9gdl"] Dec 02 09:30:59 crc kubenswrapper[4895]: I1202 09:30:59.585102 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-j9gdl" event={"ID":"afc474d8-721b-479d-a10b-adfa2455b1fb","Type":"ContainerStarted","Data":"b0e205eab87f053e2a5145ec2ef761efa7144a0e04e0c0ee6d6033db7b987c70"} Dec 02 09:31:00 crc kubenswrapper[4895]: I1202 09:31:00.595630 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-j9gdl" event={"ID":"afc474d8-721b-479d-a10b-adfa2455b1fb","Type":"ContainerStarted","Data":"2e57dd4e3ccc65741169f35d975dafe3b515b20b0cb9f2af07e049ad6fc79b63"} Dec 02 09:31:00 crc kubenswrapper[4895]: I1202 09:31:00.621583 4895 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack/run-os-openstack-openstack-cell1-j9gdl" podStartSLOduration=2.427539387 podStartE2EDuration="2.621556688s" podCreationTimestamp="2025-12-02 09:30:58 +0000 UTC" firstStartedPulling="2025-12-02 09:30:59.550389377 +0000 UTC m=+7670.721249000" lastFinishedPulling="2025-12-02 09:30:59.744406688 +0000 UTC m=+7670.915266301" observedRunningTime="2025-12-02 09:31:00.612116314 +0000 UTC m=+7671.782975937" watchObservedRunningTime="2025-12-02 09:31:00.621556688 +0000 UTC m=+7671.792416301" Dec 02 09:31:08 crc kubenswrapper[4895]: I1202 09:31:08.698398 4895 generic.go:334] "Generic (PLEG): container finished" podID="afc474d8-721b-479d-a10b-adfa2455b1fb" containerID="2e57dd4e3ccc65741169f35d975dafe3b515b20b0cb9f2af07e049ad6fc79b63" exitCode=0 Dec 02 09:31:08 crc kubenswrapper[4895]: I1202 09:31:08.698495 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-j9gdl" event={"ID":"afc474d8-721b-479d-a10b-adfa2455b1fb","Type":"ContainerDied","Data":"2e57dd4e3ccc65741169f35d975dafe3b515b20b0cb9f2af07e049ad6fc79b63"} Dec 02 09:31:10 crc kubenswrapper[4895]: I1202 09:31:10.227948 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-j9gdl" Dec 02 09:31:10 crc kubenswrapper[4895]: I1202 09:31:10.400111 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/afc474d8-721b-479d-a10b-adfa2455b1fb-inventory\") pod \"afc474d8-721b-479d-a10b-adfa2455b1fb\" (UID: \"afc474d8-721b-479d-a10b-adfa2455b1fb\") " Dec 02 09:31:10 crc kubenswrapper[4895]: I1202 09:31:10.400344 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-whhm2\" (UniqueName: \"kubernetes.io/projected/afc474d8-721b-479d-a10b-adfa2455b1fb-kube-api-access-whhm2\") pod \"afc474d8-721b-479d-a10b-adfa2455b1fb\" (UID: \"afc474d8-721b-479d-a10b-adfa2455b1fb\") " Dec 02 09:31:10 crc kubenswrapper[4895]: I1202 09:31:10.400382 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/afc474d8-721b-479d-a10b-adfa2455b1fb-ssh-key\") pod \"afc474d8-721b-479d-a10b-adfa2455b1fb\" (UID: \"afc474d8-721b-479d-a10b-adfa2455b1fb\") " Dec 02 09:31:10 crc kubenswrapper[4895]: I1202 09:31:10.400428 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/afc474d8-721b-479d-a10b-adfa2455b1fb-ceph\") pod \"afc474d8-721b-479d-a10b-adfa2455b1fb\" (UID: \"afc474d8-721b-479d-a10b-adfa2455b1fb\") " Dec 02 09:31:10 crc kubenswrapper[4895]: I1202 09:31:10.427814 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/afc474d8-721b-479d-a10b-adfa2455b1fb-kube-api-access-whhm2" (OuterVolumeSpecName: "kube-api-access-whhm2") pod "afc474d8-721b-479d-a10b-adfa2455b1fb" (UID: "afc474d8-721b-479d-a10b-adfa2455b1fb"). InnerVolumeSpecName "kube-api-access-whhm2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:31:10 crc kubenswrapper[4895]: I1202 09:31:10.429889 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/afc474d8-721b-479d-a10b-adfa2455b1fb-ceph" (OuterVolumeSpecName: "ceph") pod "afc474d8-721b-479d-a10b-adfa2455b1fb" (UID: "afc474d8-721b-479d-a10b-adfa2455b1fb"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:31:10 crc kubenswrapper[4895]: I1202 09:31:10.434244 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/afc474d8-721b-479d-a10b-adfa2455b1fb-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "afc474d8-721b-479d-a10b-adfa2455b1fb" (UID: "afc474d8-721b-479d-a10b-adfa2455b1fb"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:31:10 crc kubenswrapper[4895]: I1202 09:31:10.436061 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/afc474d8-721b-479d-a10b-adfa2455b1fb-inventory" (OuterVolumeSpecName: "inventory") pod "afc474d8-721b-479d-a10b-adfa2455b1fb" (UID: "afc474d8-721b-479d-a10b-adfa2455b1fb"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:31:10 crc kubenswrapper[4895]: I1202 09:31:10.505550 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-whhm2\" (UniqueName: \"kubernetes.io/projected/afc474d8-721b-479d-a10b-adfa2455b1fb-kube-api-access-whhm2\") on node \"crc\" DevicePath \"\"" Dec 02 09:31:10 crc kubenswrapper[4895]: I1202 09:31:10.505588 4895 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/afc474d8-721b-479d-a10b-adfa2455b1fb-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 02 09:31:10 crc kubenswrapper[4895]: I1202 09:31:10.505598 4895 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/afc474d8-721b-479d-a10b-adfa2455b1fb-ceph\") on node \"crc\" DevicePath \"\"" Dec 02 09:31:10 crc kubenswrapper[4895]: I1202 09:31:10.505606 4895 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/afc474d8-721b-479d-a10b-adfa2455b1fb-inventory\") on node \"crc\" DevicePath \"\"" Dec 02 09:31:10 crc kubenswrapper[4895]: I1202 09:31:10.717330 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-j9gdl" event={"ID":"afc474d8-721b-479d-a10b-adfa2455b1fb","Type":"ContainerDied","Data":"b0e205eab87f053e2a5145ec2ef761efa7144a0e04e0c0ee6d6033db7b987c70"} Dec 02 09:31:10 crc kubenswrapper[4895]: I1202 09:31:10.717372 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b0e205eab87f053e2a5145ec2ef761efa7144a0e04e0c0ee6d6033db7b987c70" Dec 02 09:31:10 crc kubenswrapper[4895]: I1202 09:31:10.717858 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-j9gdl" Dec 02 09:31:10 crc kubenswrapper[4895]: I1202 09:31:10.791979 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-openstack-openstack-cell1-xj8ll"] Dec 02 09:31:10 crc kubenswrapper[4895]: E1202 09:31:10.792703 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afc474d8-721b-479d-a10b-adfa2455b1fb" containerName="run-os-openstack-openstack-cell1" Dec 02 09:31:10 crc kubenswrapper[4895]: I1202 09:31:10.792809 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="afc474d8-721b-479d-a10b-adfa2455b1fb" containerName="run-os-openstack-openstack-cell1" Dec 02 09:31:10 crc kubenswrapper[4895]: I1202 09:31:10.793067 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="afc474d8-721b-479d-a10b-adfa2455b1fb" containerName="run-os-openstack-openstack-cell1" Dec 02 09:31:10 crc kubenswrapper[4895]: I1202 09:31:10.793925 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-xj8ll" Dec 02 09:31:10 crc kubenswrapper[4895]: I1202 09:31:10.795988 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 02 09:31:10 crc kubenswrapper[4895]: I1202 09:31:10.796090 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-brvc6" Dec 02 09:31:10 crc kubenswrapper[4895]: I1202 09:31:10.796855 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 02 09:31:10 crc kubenswrapper[4895]: I1202 09:31:10.797156 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 02 09:31:10 crc kubenswrapper[4895]: I1202 09:31:10.806798 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-openstack-openstack-cell1-xj8ll"] Dec 02 09:31:10 crc kubenswrapper[4895]: I1202 09:31:10.937179 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ttg9\" (UniqueName: \"kubernetes.io/projected/fa246d81-1464-4069-9a3a-40b53b72e55f-kube-api-access-6ttg9\") pod \"reboot-os-openstack-openstack-cell1-xj8ll\" (UID: \"fa246d81-1464-4069-9a3a-40b53b72e55f\") " pod="openstack/reboot-os-openstack-openstack-cell1-xj8ll" Dec 02 09:31:10 crc kubenswrapper[4895]: I1202 09:31:10.937284 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fa246d81-1464-4069-9a3a-40b53b72e55f-inventory\") pod \"reboot-os-openstack-openstack-cell1-xj8ll\" (UID: \"fa246d81-1464-4069-9a3a-40b53b72e55f\") " pod="openstack/reboot-os-openstack-openstack-cell1-xj8ll" Dec 02 09:31:10 crc kubenswrapper[4895]: I1202 09:31:10.937383 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" 
(UniqueName: \"kubernetes.io/secret/fa246d81-1464-4069-9a3a-40b53b72e55f-ceph\") pod \"reboot-os-openstack-openstack-cell1-xj8ll\" (UID: \"fa246d81-1464-4069-9a3a-40b53b72e55f\") " pod="openstack/reboot-os-openstack-openstack-cell1-xj8ll" Dec 02 09:31:10 crc kubenswrapper[4895]: I1202 09:31:10.937525 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fa246d81-1464-4069-9a3a-40b53b72e55f-ssh-key\") pod \"reboot-os-openstack-openstack-cell1-xj8ll\" (UID: \"fa246d81-1464-4069-9a3a-40b53b72e55f\") " pod="openstack/reboot-os-openstack-openstack-cell1-xj8ll" Dec 02 09:31:11 crc kubenswrapper[4895]: I1202 09:31:11.041643 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fa246d81-1464-4069-9a3a-40b53b72e55f-ssh-key\") pod \"reboot-os-openstack-openstack-cell1-xj8ll\" (UID: \"fa246d81-1464-4069-9a3a-40b53b72e55f\") " pod="openstack/reboot-os-openstack-openstack-cell1-xj8ll" Dec 02 09:31:11 crc kubenswrapper[4895]: I1202 09:31:11.042156 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ttg9\" (UniqueName: \"kubernetes.io/projected/fa246d81-1464-4069-9a3a-40b53b72e55f-kube-api-access-6ttg9\") pod \"reboot-os-openstack-openstack-cell1-xj8ll\" (UID: \"fa246d81-1464-4069-9a3a-40b53b72e55f\") " pod="openstack/reboot-os-openstack-openstack-cell1-xj8ll" Dec 02 09:31:11 crc kubenswrapper[4895]: I1202 09:31:11.042340 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fa246d81-1464-4069-9a3a-40b53b72e55f-inventory\") pod \"reboot-os-openstack-openstack-cell1-xj8ll\" (UID: \"fa246d81-1464-4069-9a3a-40b53b72e55f\") " pod="openstack/reboot-os-openstack-openstack-cell1-xj8ll" Dec 02 09:31:11 crc kubenswrapper[4895]: I1202 09:31:11.042457 4895 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/fa246d81-1464-4069-9a3a-40b53b72e55f-ceph\") pod \"reboot-os-openstack-openstack-cell1-xj8ll\" (UID: \"fa246d81-1464-4069-9a3a-40b53b72e55f\") " pod="openstack/reboot-os-openstack-openstack-cell1-xj8ll" Dec 02 09:31:11 crc kubenswrapper[4895]: I1202 09:31:11.046730 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/fa246d81-1464-4069-9a3a-40b53b72e55f-ceph\") pod \"reboot-os-openstack-openstack-cell1-xj8ll\" (UID: \"fa246d81-1464-4069-9a3a-40b53b72e55f\") " pod="openstack/reboot-os-openstack-openstack-cell1-xj8ll" Dec 02 09:31:11 crc kubenswrapper[4895]: I1202 09:31:11.049530 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fa246d81-1464-4069-9a3a-40b53b72e55f-inventory\") pod \"reboot-os-openstack-openstack-cell1-xj8ll\" (UID: \"fa246d81-1464-4069-9a3a-40b53b72e55f\") " pod="openstack/reboot-os-openstack-openstack-cell1-xj8ll" Dec 02 09:31:11 crc kubenswrapper[4895]: I1202 09:31:11.049872 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fa246d81-1464-4069-9a3a-40b53b72e55f-ssh-key\") pod \"reboot-os-openstack-openstack-cell1-xj8ll\" (UID: \"fa246d81-1464-4069-9a3a-40b53b72e55f\") " pod="openstack/reboot-os-openstack-openstack-cell1-xj8ll" Dec 02 09:31:11 crc kubenswrapper[4895]: I1202 09:31:11.065686 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ttg9\" (UniqueName: \"kubernetes.io/projected/fa246d81-1464-4069-9a3a-40b53b72e55f-kube-api-access-6ttg9\") pod \"reboot-os-openstack-openstack-cell1-xj8ll\" (UID: \"fa246d81-1464-4069-9a3a-40b53b72e55f\") " pod="openstack/reboot-os-openstack-openstack-cell1-xj8ll" Dec 02 09:31:11 crc kubenswrapper[4895]: I1202 09:31:11.159601 4895 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-xj8ll" Dec 02 09:31:11 crc kubenswrapper[4895]: I1202 09:31:11.690055 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-openstack-openstack-cell1-xj8ll"] Dec 02 09:31:11 crc kubenswrapper[4895]: I1202 09:31:11.728023 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-xj8ll" event={"ID":"fa246d81-1464-4069-9a3a-40b53b72e55f","Type":"ContainerStarted","Data":"caf0e86efcd1ce5eaabd1cd49210de30e7f44d99a0c2b381fccb48c926a5f21b"} Dec 02 09:31:12 crc kubenswrapper[4895]: I1202 09:31:12.739321 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-xj8ll" event={"ID":"fa246d81-1464-4069-9a3a-40b53b72e55f","Type":"ContainerStarted","Data":"da60187862046771e8118ed386ee1a4bd1395f006662a8ffecaecc725195f707"} Dec 02 09:31:12 crc kubenswrapper[4895]: I1202 09:31:12.764232 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-openstack-openstack-cell1-xj8ll" podStartSLOduration=2.570835954 podStartE2EDuration="2.764205274s" podCreationTimestamp="2025-12-02 09:31:10 +0000 UTC" firstStartedPulling="2025-12-02 09:31:11.69257187 +0000 UTC m=+7682.863431483" lastFinishedPulling="2025-12-02 09:31:11.8859412 +0000 UTC m=+7683.056800803" observedRunningTime="2025-12-02 09:31:12.756757423 +0000 UTC m=+7683.927617046" watchObservedRunningTime="2025-12-02 09:31:12.764205274 +0000 UTC m=+7683.935064887" Dec 02 09:31:27 crc kubenswrapper[4895]: I1202 09:31:27.886504 4895 generic.go:334] "Generic (PLEG): container finished" podID="fa246d81-1464-4069-9a3a-40b53b72e55f" containerID="da60187862046771e8118ed386ee1a4bd1395f006662a8ffecaecc725195f707" exitCode=0 Dec 02 09:31:27 crc kubenswrapper[4895]: I1202 09:31:27.886663 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-xj8ll" 
event={"ID":"fa246d81-1464-4069-9a3a-40b53b72e55f","Type":"ContainerDied","Data":"da60187862046771e8118ed386ee1a4bd1395f006662a8ffecaecc725195f707"} Dec 02 09:31:29 crc kubenswrapper[4895]: I1202 09:31:29.492334 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-xj8ll" Dec 02 09:31:29 crc kubenswrapper[4895]: I1202 09:31:29.552104 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ttg9\" (UniqueName: \"kubernetes.io/projected/fa246d81-1464-4069-9a3a-40b53b72e55f-kube-api-access-6ttg9\") pod \"fa246d81-1464-4069-9a3a-40b53b72e55f\" (UID: \"fa246d81-1464-4069-9a3a-40b53b72e55f\") " Dec 02 09:31:29 crc kubenswrapper[4895]: I1202 09:31:29.552291 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/fa246d81-1464-4069-9a3a-40b53b72e55f-ceph\") pod \"fa246d81-1464-4069-9a3a-40b53b72e55f\" (UID: \"fa246d81-1464-4069-9a3a-40b53b72e55f\") " Dec 02 09:31:29 crc kubenswrapper[4895]: I1202 09:31:29.552677 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fa246d81-1464-4069-9a3a-40b53b72e55f-ssh-key\") pod \"fa246d81-1464-4069-9a3a-40b53b72e55f\" (UID: \"fa246d81-1464-4069-9a3a-40b53b72e55f\") " Dec 02 09:31:29 crc kubenswrapper[4895]: I1202 09:31:29.552825 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fa246d81-1464-4069-9a3a-40b53b72e55f-inventory\") pod \"fa246d81-1464-4069-9a3a-40b53b72e55f\" (UID: \"fa246d81-1464-4069-9a3a-40b53b72e55f\") " Dec 02 09:31:29 crc kubenswrapper[4895]: I1202 09:31:29.559254 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa246d81-1464-4069-9a3a-40b53b72e55f-ceph" (OuterVolumeSpecName: "ceph") pod 
"fa246d81-1464-4069-9a3a-40b53b72e55f" (UID: "fa246d81-1464-4069-9a3a-40b53b72e55f"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:31:29 crc kubenswrapper[4895]: I1202 09:31:29.561114 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa246d81-1464-4069-9a3a-40b53b72e55f-kube-api-access-6ttg9" (OuterVolumeSpecName: "kube-api-access-6ttg9") pod "fa246d81-1464-4069-9a3a-40b53b72e55f" (UID: "fa246d81-1464-4069-9a3a-40b53b72e55f"). InnerVolumeSpecName "kube-api-access-6ttg9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:31:29 crc kubenswrapper[4895]: I1202 09:31:29.589017 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa246d81-1464-4069-9a3a-40b53b72e55f-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "fa246d81-1464-4069-9a3a-40b53b72e55f" (UID: "fa246d81-1464-4069-9a3a-40b53b72e55f"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:31:29 crc kubenswrapper[4895]: I1202 09:31:29.607101 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa246d81-1464-4069-9a3a-40b53b72e55f-inventory" (OuterVolumeSpecName: "inventory") pod "fa246d81-1464-4069-9a3a-40b53b72e55f" (UID: "fa246d81-1464-4069-9a3a-40b53b72e55f"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:31:29 crc kubenswrapper[4895]: I1202 09:31:29.656332 4895 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/fa246d81-1464-4069-9a3a-40b53b72e55f-ceph\") on node \"crc\" DevicePath \"\"" Dec 02 09:31:29 crc kubenswrapper[4895]: I1202 09:31:29.656374 4895 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fa246d81-1464-4069-9a3a-40b53b72e55f-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 02 09:31:29 crc kubenswrapper[4895]: I1202 09:31:29.656386 4895 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fa246d81-1464-4069-9a3a-40b53b72e55f-inventory\") on node \"crc\" DevicePath \"\"" Dec 02 09:31:29 crc kubenswrapper[4895]: I1202 09:31:29.656399 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ttg9\" (UniqueName: \"kubernetes.io/projected/fa246d81-1464-4069-9a3a-40b53b72e55f-kube-api-access-6ttg9\") on node \"crc\" DevicePath \"\"" Dec 02 09:31:29 crc kubenswrapper[4895]: I1202 09:31:29.910092 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-xj8ll" event={"ID":"fa246d81-1464-4069-9a3a-40b53b72e55f","Type":"ContainerDied","Data":"caf0e86efcd1ce5eaabd1cd49210de30e7f44d99a0c2b381fccb48c926a5f21b"} Dec 02 09:31:29 crc kubenswrapper[4895]: I1202 09:31:29.910140 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="caf0e86efcd1ce5eaabd1cd49210de30e7f44d99a0c2b381fccb48c926a5f21b" Dec 02 09:31:29 crc kubenswrapper[4895]: I1202 09:31:29.910182 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-xj8ll" Dec 02 09:31:30 crc kubenswrapper[4895]: I1202 09:31:30.010858 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-openstack-openstack-cell1-vvp2m"] Dec 02 09:31:30 crc kubenswrapper[4895]: E1202 09:31:30.011351 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa246d81-1464-4069-9a3a-40b53b72e55f" containerName="reboot-os-openstack-openstack-cell1" Dec 02 09:31:30 crc kubenswrapper[4895]: I1202 09:31:30.011369 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa246d81-1464-4069-9a3a-40b53b72e55f" containerName="reboot-os-openstack-openstack-cell1" Dec 02 09:31:30 crc kubenswrapper[4895]: I1202 09:31:30.011589 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa246d81-1464-4069-9a3a-40b53b72e55f" containerName="reboot-os-openstack-openstack-cell1" Dec 02 09:31:30 crc kubenswrapper[4895]: I1202 09:31:30.012601 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-vvp2m" Dec 02 09:31:30 crc kubenswrapper[4895]: I1202 09:31:30.016828 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-brvc6" Dec 02 09:31:30 crc kubenswrapper[4895]: I1202 09:31:30.017844 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 02 09:31:30 crc kubenswrapper[4895]: I1202 09:31:30.018197 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 02 09:31:30 crc kubenswrapper[4895]: I1202 09:31:30.018706 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 02 09:31:30 crc kubenswrapper[4895]: I1202 09:31:30.023399 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-openstack-openstack-cell1-vvp2m"] Dec 02 09:31:30 crc kubenswrapper[4895]: I1202 09:31:30.074838 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-vvp2m\" (UID: \"7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5\") " pod="openstack/install-certs-openstack-openstack-cell1-vvp2m" Dec 02 09:31:30 crc kubenswrapper[4895]: I1202 09:31:30.075135 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5-libvirt-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-vvp2m\" (UID: \"7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5\") " pod="openstack/install-certs-openstack-openstack-cell1-vvp2m" Dec 02 09:31:30 crc kubenswrapper[4895]: I1202 09:31:30.075221 4895 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5-nova-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-vvp2m\" (UID: \"7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5\") " pod="openstack/install-certs-openstack-openstack-cell1-vvp2m" Dec 02 09:31:30 crc kubenswrapper[4895]: I1202 09:31:30.075364 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8gg9\" (UniqueName: \"kubernetes.io/projected/7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5-kube-api-access-l8gg9\") pod \"install-certs-openstack-openstack-cell1-vvp2m\" (UID: \"7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5\") " pod="openstack/install-certs-openstack-openstack-cell1-vvp2m" Dec 02 09:31:30 crc kubenswrapper[4895]: I1202 09:31:30.075477 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5-ceph\") pod \"install-certs-openstack-openstack-cell1-vvp2m\" (UID: \"7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5\") " pod="openstack/install-certs-openstack-openstack-cell1-vvp2m" Dec 02 09:31:30 crc kubenswrapper[4895]: I1202 09:31:30.075631 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5-inventory\") pod \"install-certs-openstack-openstack-cell1-vvp2m\" (UID: \"7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5\") " pod="openstack/install-certs-openstack-openstack-cell1-vvp2m" Dec 02 09:31:30 crc kubenswrapper[4895]: I1202 09:31:30.075794 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5-telemetry-combined-ca-bundle\") pod 
\"install-certs-openstack-openstack-cell1-vvp2m\" (UID: \"7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5\") " pod="openstack/install-certs-openstack-openstack-cell1-vvp2m" Dec 02 09:31:30 crc kubenswrapper[4895]: I1202 09:31:30.075908 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5-neutron-sriov-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-vvp2m\" (UID: \"7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5\") " pod="openstack/install-certs-openstack-openstack-cell1-vvp2m" Dec 02 09:31:30 crc kubenswrapper[4895]: I1202 09:31:30.076072 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-vvp2m\" (UID: \"7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5\") " pod="openstack/install-certs-openstack-openstack-cell1-vvp2m" Dec 02 09:31:30 crc kubenswrapper[4895]: I1202 09:31:30.076177 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5-ssh-key\") pod \"install-certs-openstack-openstack-cell1-vvp2m\" (UID: \"7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5\") " pod="openstack/install-certs-openstack-openstack-cell1-vvp2m" Dec 02 09:31:30 crc kubenswrapper[4895]: I1202 09:31:30.076234 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-vvp2m\" (UID: \"7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5\") " 
pod="openstack/install-certs-openstack-openstack-cell1-vvp2m" Dec 02 09:31:30 crc kubenswrapper[4895]: I1202 09:31:30.076260 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-vvp2m\" (UID: \"7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5\") " pod="openstack/install-certs-openstack-openstack-cell1-vvp2m" Dec 02 09:31:30 crc kubenswrapper[4895]: I1202 09:31:30.178700 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-vvp2m\" (UID: \"7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5\") " pod="openstack/install-certs-openstack-openstack-cell1-vvp2m" Dec 02 09:31:30 crc kubenswrapper[4895]: I1202 09:31:30.179244 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5-ssh-key\") pod \"install-certs-openstack-openstack-cell1-vvp2m\" (UID: \"7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5\") " pod="openstack/install-certs-openstack-openstack-cell1-vvp2m" Dec 02 09:31:30 crc kubenswrapper[4895]: I1202 09:31:30.179386 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-vvp2m\" (UID: \"7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5\") " pod="openstack/install-certs-openstack-openstack-cell1-vvp2m" Dec 02 09:31:30 crc kubenswrapper[4895]: I1202 09:31:30.179536 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-vvp2m\" (UID: \"7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5\") " pod="openstack/install-certs-openstack-openstack-cell1-vvp2m" Dec 02 09:31:30 crc kubenswrapper[4895]: I1202 09:31:30.180367 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-vvp2m\" (UID: \"7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5\") " pod="openstack/install-certs-openstack-openstack-cell1-vvp2m" Dec 02 09:31:30 crc kubenswrapper[4895]: I1202 09:31:30.180554 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5-libvirt-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-vvp2m\" (UID: \"7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5\") " pod="openstack/install-certs-openstack-openstack-cell1-vvp2m" Dec 02 09:31:30 crc kubenswrapper[4895]: I1202 09:31:30.180683 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5-nova-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-vvp2m\" (UID: \"7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5\") " pod="openstack/install-certs-openstack-openstack-cell1-vvp2m" Dec 02 09:31:30 crc kubenswrapper[4895]: I1202 09:31:30.180858 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8gg9\" (UniqueName: \"kubernetes.io/projected/7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5-kube-api-access-l8gg9\") pod \"install-certs-openstack-openstack-cell1-vvp2m\" (UID: 
\"7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5\") " pod="openstack/install-certs-openstack-openstack-cell1-vvp2m" Dec 02 09:31:30 crc kubenswrapper[4895]: I1202 09:31:30.180994 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5-ceph\") pod \"install-certs-openstack-openstack-cell1-vvp2m\" (UID: \"7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5\") " pod="openstack/install-certs-openstack-openstack-cell1-vvp2m" Dec 02 09:31:30 crc kubenswrapper[4895]: I1202 09:31:30.181135 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5-inventory\") pod \"install-certs-openstack-openstack-cell1-vvp2m\" (UID: \"7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5\") " pod="openstack/install-certs-openstack-openstack-cell1-vvp2m" Dec 02 09:31:30 crc kubenswrapper[4895]: I1202 09:31:30.181265 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5-telemetry-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-vvp2m\" (UID: \"7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5\") " pod="openstack/install-certs-openstack-openstack-cell1-vvp2m" Dec 02 09:31:30 crc kubenswrapper[4895]: I1202 09:31:30.181395 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5-neutron-sriov-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-vvp2m\" (UID: \"7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5\") " pod="openstack/install-certs-openstack-openstack-cell1-vvp2m" Dec 02 09:31:30 crc kubenswrapper[4895]: I1202 09:31:30.183942 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5-ssh-key\") pod \"install-certs-openstack-openstack-cell1-vvp2m\" (UID: \"7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5\") " pod="openstack/install-certs-openstack-openstack-cell1-vvp2m" Dec 02 09:31:30 crc kubenswrapper[4895]: I1202 09:31:30.184335 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-vvp2m\" (UID: \"7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5\") " pod="openstack/install-certs-openstack-openstack-cell1-vvp2m" Dec 02 09:31:30 crc kubenswrapper[4895]: I1202 09:31:30.184521 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-vvp2m\" (UID: \"7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5\") " pod="openstack/install-certs-openstack-openstack-cell1-vvp2m" Dec 02 09:31:30 crc kubenswrapper[4895]: I1202 09:31:30.185392 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-vvp2m\" (UID: \"7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5\") " pod="openstack/install-certs-openstack-openstack-cell1-vvp2m" Dec 02 09:31:30 crc kubenswrapper[4895]: I1202 09:31:30.186589 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5-nova-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-vvp2m\" (UID: \"7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5\") " 
pod="openstack/install-certs-openstack-openstack-cell1-vvp2m" Dec 02 09:31:30 crc kubenswrapper[4895]: I1202 09:31:30.186898 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5-ceph\") pod \"install-certs-openstack-openstack-cell1-vvp2m\" (UID: \"7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5\") " pod="openstack/install-certs-openstack-openstack-cell1-vvp2m" Dec 02 09:31:30 crc kubenswrapper[4895]: I1202 09:31:30.187510 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5-telemetry-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-vvp2m\" (UID: \"7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5\") " pod="openstack/install-certs-openstack-openstack-cell1-vvp2m" Dec 02 09:31:30 crc kubenswrapper[4895]: I1202 09:31:30.187656 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5-neutron-sriov-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-vvp2m\" (UID: \"7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5\") " pod="openstack/install-certs-openstack-openstack-cell1-vvp2m" Dec 02 09:31:30 crc kubenswrapper[4895]: I1202 09:31:30.188048 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-vvp2m\" (UID: \"7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5\") " pod="openstack/install-certs-openstack-openstack-cell1-vvp2m" Dec 02 09:31:30 crc kubenswrapper[4895]: I1202 09:31:30.188303 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5-inventory\") pod \"install-certs-openstack-openstack-cell1-vvp2m\" (UID: \"7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5\") " pod="openstack/install-certs-openstack-openstack-cell1-vvp2m" Dec 02 09:31:30 crc kubenswrapper[4895]: I1202 09:31:30.189564 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5-libvirt-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-vvp2m\" (UID: \"7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5\") " pod="openstack/install-certs-openstack-openstack-cell1-vvp2m" Dec 02 09:31:30 crc kubenswrapper[4895]: I1202 09:31:30.197508 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8gg9\" (UniqueName: \"kubernetes.io/projected/7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5-kube-api-access-l8gg9\") pod \"install-certs-openstack-openstack-cell1-vvp2m\" (UID: \"7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5\") " pod="openstack/install-certs-openstack-openstack-cell1-vvp2m" Dec 02 09:31:30 crc kubenswrapper[4895]: I1202 09:31:30.339215 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-vvp2m" Dec 02 09:31:30 crc kubenswrapper[4895]: W1202 09:31:30.855187 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7c5c7124_dd91_4ff2_ada6_43bfd65dc9f5.slice/crio-dcaa14baa2cd0d0dcc6355ea842ccd7894987f950b118e1ee45843ec739beff8 WatchSource:0}: Error finding container dcaa14baa2cd0d0dcc6355ea842ccd7894987f950b118e1ee45843ec739beff8: Status 404 returned error can't find the container with id dcaa14baa2cd0d0dcc6355ea842ccd7894987f950b118e1ee45843ec739beff8 Dec 02 09:31:30 crc kubenswrapper[4895]: I1202 09:31:30.856764 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-openstack-openstack-cell1-vvp2m"] Dec 02 09:31:30 crc kubenswrapper[4895]: I1202 09:31:30.920420 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-vvp2m" event={"ID":"7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5","Type":"ContainerStarted","Data":"dcaa14baa2cd0d0dcc6355ea842ccd7894987f950b118e1ee45843ec739beff8"} Dec 02 09:31:31 crc kubenswrapper[4895]: I1202 09:31:31.930384 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-vvp2m" event={"ID":"7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5","Type":"ContainerStarted","Data":"812855b97de98cd0bfcd6d65d13551154a1702ad24e3d31e35fe55c2101f0834"} Dec 02 09:31:31 crc kubenswrapper[4895]: I1202 09:31:31.952055 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-openstack-openstack-cell1-vvp2m" podStartSLOduration=2.783725289 podStartE2EDuration="2.952030299s" podCreationTimestamp="2025-12-02 09:31:29 +0000 UTC" firstStartedPulling="2025-12-02 09:31:30.858255115 +0000 UTC m=+7702.029114728" lastFinishedPulling="2025-12-02 09:31:31.026560125 +0000 UTC m=+7702.197419738" observedRunningTime="2025-12-02 
09:31:31.945169136 +0000 UTC m=+7703.116028749" watchObservedRunningTime="2025-12-02 09:31:31.952030299 +0000 UTC m=+7703.122889932" Dec 02 09:31:35 crc kubenswrapper[4895]: I1202 09:31:35.473136 4895 patch_prober.go:28] interesting pod/machine-config-daemon-wfcg7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 09:31:35 crc kubenswrapper[4895]: I1202 09:31:35.473732 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 09:31:50 crc kubenswrapper[4895]: I1202 09:31:50.191669 4895 generic.go:334] "Generic (PLEG): container finished" podID="7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5" containerID="812855b97de98cd0bfcd6d65d13551154a1702ad24e3d31e35fe55c2101f0834" exitCode=0 Dec 02 09:31:50 crc kubenswrapper[4895]: I1202 09:31:50.191726 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-vvp2m" event={"ID":"7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5","Type":"ContainerDied","Data":"812855b97de98cd0bfcd6d65d13551154a1702ad24e3d31e35fe55c2101f0834"} Dec 02 09:31:51 crc kubenswrapper[4895]: I1202 09:31:51.643632 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-vvp2m" Dec 02 09:31:51 crc kubenswrapper[4895]: I1202 09:31:51.669122 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5-telemetry-combined-ca-bundle\") pod \"7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5\" (UID: \"7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5\") " Dec 02 09:31:51 crc kubenswrapper[4895]: I1202 09:31:51.669244 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5-inventory\") pod \"7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5\" (UID: \"7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5\") " Dec 02 09:31:51 crc kubenswrapper[4895]: I1202 09:31:51.669302 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5-ceph\") pod \"7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5\" (UID: \"7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5\") " Dec 02 09:31:51 crc kubenswrapper[4895]: I1202 09:31:51.669579 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5-neutron-metadata-combined-ca-bundle\") pod \"7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5\" (UID: \"7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5\") " Dec 02 09:31:51 crc kubenswrapper[4895]: I1202 09:31:51.669714 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5-ssh-key\") pod \"7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5\" (UID: \"7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5\") " Dec 02 09:31:51 crc kubenswrapper[4895]: I1202 09:31:51.669809 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5-ovn-combined-ca-bundle\") pod \"7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5\" (UID: \"7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5\") " Dec 02 09:31:51 crc kubenswrapper[4895]: I1202 09:31:51.669884 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l8gg9\" (UniqueName: \"kubernetes.io/projected/7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5-kube-api-access-l8gg9\") pod \"7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5\" (UID: \"7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5\") " Dec 02 09:31:51 crc kubenswrapper[4895]: I1202 09:31:51.669989 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5-neutron-dhcp-combined-ca-bundle\") pod \"7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5\" (UID: \"7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5\") " Dec 02 09:31:51 crc kubenswrapper[4895]: I1202 09:31:51.670100 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5-bootstrap-combined-ca-bundle\") pod \"7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5\" (UID: \"7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5\") " Dec 02 09:31:51 crc kubenswrapper[4895]: I1202 09:31:51.670195 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5-libvirt-combined-ca-bundle\") pod \"7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5\" (UID: \"7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5\") " Dec 02 09:31:51 crc kubenswrapper[4895]: I1202 09:31:51.670255 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5-nova-combined-ca-bundle\") pod \"7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5\" (UID: \"7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5\") " Dec 02 09:31:51 crc kubenswrapper[4895]: I1202 09:31:51.670926 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5-neutron-sriov-combined-ca-bundle\") pod \"7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5\" (UID: \"7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5\") " Dec 02 09:31:51 crc kubenswrapper[4895]: I1202 09:31:51.677053 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5" (UID: "7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:31:51 crc kubenswrapper[4895]: I1202 09:31:51.683574 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5" (UID: "7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:31:51 crc kubenswrapper[4895]: I1202 09:31:51.684247 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5-neutron-sriov-combined-ca-bundle" (OuterVolumeSpecName: "neutron-sriov-combined-ca-bundle") pod "7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5" (UID: "7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5"). InnerVolumeSpecName "neutron-sriov-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:31:51 crc kubenswrapper[4895]: I1202 09:31:51.684573 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5" (UID: "7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:31:51 crc kubenswrapper[4895]: I1202 09:31:51.684727 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5" (UID: "7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:31:51 crc kubenswrapper[4895]: I1202 09:31:51.685253 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5" (UID: "7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:31:51 crc kubenswrapper[4895]: I1202 09:31:51.686243 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5-neutron-dhcp-combined-ca-bundle" (OuterVolumeSpecName: "neutron-dhcp-combined-ca-bundle") pod "7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5" (UID: "7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5"). InnerVolumeSpecName "neutron-dhcp-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:31:51 crc kubenswrapper[4895]: I1202 09:31:51.688457 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5-kube-api-access-l8gg9" (OuterVolumeSpecName: "kube-api-access-l8gg9") pod "7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5" (UID: "7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5"). InnerVolumeSpecName "kube-api-access-l8gg9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:31:51 crc kubenswrapper[4895]: I1202 09:31:51.689824 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5-ceph" (OuterVolumeSpecName: "ceph") pod "7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5" (UID: "7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:31:51 crc kubenswrapper[4895]: I1202 09:31:51.694971 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5" (UID: "7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:31:51 crc kubenswrapper[4895]: I1202 09:31:51.711976 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5-inventory" (OuterVolumeSpecName: "inventory") pod "7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5" (UID: "7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:31:51 crc kubenswrapper[4895]: I1202 09:31:51.713488 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5" (UID: "7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:31:51 crc kubenswrapper[4895]: I1202 09:31:51.773101 4895 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 09:31:51 crc kubenswrapper[4895]: I1202 09:31:51.773152 4895 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 02 09:31:51 crc kubenswrapper[4895]: I1202 09:31:51.773172 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l8gg9\" (UniqueName: \"kubernetes.io/projected/7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5-kube-api-access-l8gg9\") on node \"crc\" DevicePath \"\"" Dec 02 09:31:51 crc kubenswrapper[4895]: I1202 09:31:51.773188 4895 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 09:31:51 crc kubenswrapper[4895]: I1202 09:31:51.773202 4895 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5-neutron-dhcp-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 09:31:51 crc kubenswrapper[4895]: I1202 09:31:51.773429 4895 reconciler_common.go:293] "Volume detached 
for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 09:31:51 crc kubenswrapper[4895]: I1202 09:31:51.773442 4895 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 09:31:51 crc kubenswrapper[4895]: I1202 09:31:51.773450 4895 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 09:31:51 crc kubenswrapper[4895]: I1202 09:31:51.773485 4895 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5-neutron-sriov-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 09:31:51 crc kubenswrapper[4895]: I1202 09:31:51.773495 4895 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 09:31:51 crc kubenswrapper[4895]: I1202 09:31:51.773506 4895 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5-inventory\") on node \"crc\" DevicePath \"\"" Dec 02 09:31:51 crc kubenswrapper[4895]: I1202 09:31:51.773518 4895 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5-ceph\") on node \"crc\" DevicePath \"\"" Dec 02 09:31:52 crc kubenswrapper[4895]: I1202 09:31:52.214527 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/install-certs-openstack-openstack-cell1-vvp2m" event={"ID":"7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5","Type":"ContainerDied","Data":"dcaa14baa2cd0d0dcc6355ea842ccd7894987f950b118e1ee45843ec739beff8"} Dec 02 09:31:52 crc kubenswrapper[4895]: I1202 09:31:52.214584 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dcaa14baa2cd0d0dcc6355ea842ccd7894987f950b118e1ee45843ec739beff8" Dec 02 09:31:52 crc kubenswrapper[4895]: I1202 09:31:52.214594 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-vvp2m" Dec 02 09:31:52 crc kubenswrapper[4895]: I1202 09:31:52.305697 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-client-openstack-openstack-cell1-2qtd2"] Dec 02 09:31:52 crc kubenswrapper[4895]: E1202 09:31:52.306904 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5" containerName="install-certs-openstack-openstack-cell1" Dec 02 09:31:52 crc kubenswrapper[4895]: I1202 09:31:52.307004 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5" containerName="install-certs-openstack-openstack-cell1" Dec 02 09:31:52 crc kubenswrapper[4895]: I1202 09:31:52.307503 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5" containerName="install-certs-openstack-openstack-cell1" Dec 02 09:31:52 crc kubenswrapper[4895]: I1202 09:31:52.317395 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-openstack-openstack-cell1-2qtd2" Dec 02 09:31:52 crc kubenswrapper[4895]: I1202 09:31:52.326171 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-openstack-openstack-cell1-2qtd2"] Dec 02 09:31:52 crc kubenswrapper[4895]: I1202 09:31:52.348627 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 02 09:31:52 crc kubenswrapper[4895]: I1202 09:31:52.348886 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 02 09:31:52 crc kubenswrapper[4895]: I1202 09:31:52.348938 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-brvc6" Dec 02 09:31:52 crc kubenswrapper[4895]: I1202 09:31:52.349054 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 02 09:31:52 crc kubenswrapper[4895]: I1202 09:31:52.390158 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/64c58f3c-deb5-4931-9285-f02a3f576dd0-inventory\") pod \"ceph-client-openstack-openstack-cell1-2qtd2\" (UID: \"64c58f3c-deb5-4931-9285-f02a3f576dd0\") " pod="openstack/ceph-client-openstack-openstack-cell1-2qtd2" Dec 02 09:31:52 crc kubenswrapper[4895]: I1202 09:31:52.390292 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5f5cl\" (UniqueName: \"kubernetes.io/projected/64c58f3c-deb5-4931-9285-f02a3f576dd0-kube-api-access-5f5cl\") pod \"ceph-client-openstack-openstack-cell1-2qtd2\" (UID: \"64c58f3c-deb5-4931-9285-f02a3f576dd0\") " pod="openstack/ceph-client-openstack-openstack-cell1-2qtd2" Dec 02 09:31:52 crc kubenswrapper[4895]: I1202 09:31:52.390359 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ssh-key\" (UniqueName: \"kubernetes.io/secret/64c58f3c-deb5-4931-9285-f02a3f576dd0-ssh-key\") pod \"ceph-client-openstack-openstack-cell1-2qtd2\" (UID: \"64c58f3c-deb5-4931-9285-f02a3f576dd0\") " pod="openstack/ceph-client-openstack-openstack-cell1-2qtd2" Dec 02 09:31:52 crc kubenswrapper[4895]: I1202 09:31:52.390412 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/64c58f3c-deb5-4931-9285-f02a3f576dd0-ceph\") pod \"ceph-client-openstack-openstack-cell1-2qtd2\" (UID: \"64c58f3c-deb5-4931-9285-f02a3f576dd0\") " pod="openstack/ceph-client-openstack-openstack-cell1-2qtd2" Dec 02 09:31:52 crc kubenswrapper[4895]: I1202 09:31:52.492642 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/64c58f3c-deb5-4931-9285-f02a3f576dd0-ceph\") pod \"ceph-client-openstack-openstack-cell1-2qtd2\" (UID: \"64c58f3c-deb5-4931-9285-f02a3f576dd0\") " pod="openstack/ceph-client-openstack-openstack-cell1-2qtd2" Dec 02 09:31:52 crc kubenswrapper[4895]: I1202 09:31:52.493110 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/64c58f3c-deb5-4931-9285-f02a3f576dd0-inventory\") pod \"ceph-client-openstack-openstack-cell1-2qtd2\" (UID: \"64c58f3c-deb5-4931-9285-f02a3f576dd0\") " pod="openstack/ceph-client-openstack-openstack-cell1-2qtd2" Dec 02 09:31:52 crc kubenswrapper[4895]: I1202 09:31:52.493238 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5f5cl\" (UniqueName: \"kubernetes.io/projected/64c58f3c-deb5-4931-9285-f02a3f576dd0-kube-api-access-5f5cl\") pod \"ceph-client-openstack-openstack-cell1-2qtd2\" (UID: \"64c58f3c-deb5-4931-9285-f02a3f576dd0\") " pod="openstack/ceph-client-openstack-openstack-cell1-2qtd2" Dec 02 09:31:52 crc kubenswrapper[4895]: I1202 09:31:52.493317 4895 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/64c58f3c-deb5-4931-9285-f02a3f576dd0-ssh-key\") pod \"ceph-client-openstack-openstack-cell1-2qtd2\" (UID: \"64c58f3c-deb5-4931-9285-f02a3f576dd0\") " pod="openstack/ceph-client-openstack-openstack-cell1-2qtd2" Dec 02 09:31:52 crc kubenswrapper[4895]: I1202 09:31:52.500047 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/64c58f3c-deb5-4931-9285-f02a3f576dd0-inventory\") pod \"ceph-client-openstack-openstack-cell1-2qtd2\" (UID: \"64c58f3c-deb5-4931-9285-f02a3f576dd0\") " pod="openstack/ceph-client-openstack-openstack-cell1-2qtd2" Dec 02 09:31:52 crc kubenswrapper[4895]: I1202 09:31:52.509188 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/64c58f3c-deb5-4931-9285-f02a3f576dd0-ssh-key\") pod \"ceph-client-openstack-openstack-cell1-2qtd2\" (UID: \"64c58f3c-deb5-4931-9285-f02a3f576dd0\") " pod="openstack/ceph-client-openstack-openstack-cell1-2qtd2" Dec 02 09:31:52 crc kubenswrapper[4895]: I1202 09:31:52.512060 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5f5cl\" (UniqueName: \"kubernetes.io/projected/64c58f3c-deb5-4931-9285-f02a3f576dd0-kube-api-access-5f5cl\") pod \"ceph-client-openstack-openstack-cell1-2qtd2\" (UID: \"64c58f3c-deb5-4931-9285-f02a3f576dd0\") " pod="openstack/ceph-client-openstack-openstack-cell1-2qtd2" Dec 02 09:31:52 crc kubenswrapper[4895]: I1202 09:31:52.514650 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/64c58f3c-deb5-4931-9285-f02a3f576dd0-ceph\") pod \"ceph-client-openstack-openstack-cell1-2qtd2\" (UID: \"64c58f3c-deb5-4931-9285-f02a3f576dd0\") " pod="openstack/ceph-client-openstack-openstack-cell1-2qtd2" Dec 02 09:31:52 crc kubenswrapper[4895]: I1202 
09:31:52.689044 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-client-openstack-openstack-cell1-2qtd2" Dec 02 09:31:53 crc kubenswrapper[4895]: I1202 09:31:53.268883 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-openstack-openstack-cell1-2qtd2"] Dec 02 09:31:54 crc kubenswrapper[4895]: I1202 09:31:54.237876 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-openstack-openstack-cell1-2qtd2" event={"ID":"64c58f3c-deb5-4931-9285-f02a3f576dd0","Type":"ContainerStarted","Data":"6795e4f8e9adfbdb6166b83eb904d252fe096fc2a2de1f853b35135a18cc19ba"} Dec 02 09:31:54 crc kubenswrapper[4895]: I1202 09:31:54.238596 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-openstack-openstack-cell1-2qtd2" event={"ID":"64c58f3c-deb5-4931-9285-f02a3f576dd0","Type":"ContainerStarted","Data":"e0b42bcca90e0122f46986c9aaa6704ae0db490c0263d3ec19e247ccdec05b13"} Dec 02 09:31:54 crc kubenswrapper[4895]: I1202 09:31:54.272589 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph-client-openstack-openstack-cell1-2qtd2" podStartSLOduration=2.04770138 podStartE2EDuration="2.272569301s" podCreationTimestamp="2025-12-02 09:31:52 +0000 UTC" firstStartedPulling="2025-12-02 09:31:53.291432063 +0000 UTC m=+7724.462291676" lastFinishedPulling="2025-12-02 09:31:53.516299984 +0000 UTC m=+7724.687159597" observedRunningTime="2025-12-02 09:31:54.267785521 +0000 UTC m=+7725.438645194" watchObservedRunningTime="2025-12-02 09:31:54.272569301 +0000 UTC m=+7725.443428914" Dec 02 09:31:59 crc kubenswrapper[4895]: I1202 09:31:59.332791 4895 generic.go:334] "Generic (PLEG): container finished" podID="64c58f3c-deb5-4931-9285-f02a3f576dd0" containerID="6795e4f8e9adfbdb6166b83eb904d252fe096fc2a2de1f853b35135a18cc19ba" exitCode=0 Dec 02 09:31:59 crc kubenswrapper[4895]: I1202 09:31:59.333425 4895 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/ceph-client-openstack-openstack-cell1-2qtd2" event={"ID":"64c58f3c-deb5-4931-9285-f02a3f576dd0","Type":"ContainerDied","Data":"6795e4f8e9adfbdb6166b83eb904d252fe096fc2a2de1f853b35135a18cc19ba"} Dec 02 09:32:00 crc kubenswrapper[4895]: I1202 09:32:00.977277 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-client-openstack-openstack-cell1-2qtd2" Dec 02 09:32:01 crc kubenswrapper[4895]: I1202 09:32:01.135908 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/64c58f3c-deb5-4931-9285-f02a3f576dd0-ssh-key\") pod \"64c58f3c-deb5-4931-9285-f02a3f576dd0\" (UID: \"64c58f3c-deb5-4931-9285-f02a3f576dd0\") " Dec 02 09:32:01 crc kubenswrapper[4895]: I1202 09:32:01.136011 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/64c58f3c-deb5-4931-9285-f02a3f576dd0-ceph\") pod \"64c58f3c-deb5-4931-9285-f02a3f576dd0\" (UID: \"64c58f3c-deb5-4931-9285-f02a3f576dd0\") " Dec 02 09:32:01 crc kubenswrapper[4895]: I1202 09:32:01.136077 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/64c58f3c-deb5-4931-9285-f02a3f576dd0-inventory\") pod \"64c58f3c-deb5-4931-9285-f02a3f576dd0\" (UID: \"64c58f3c-deb5-4931-9285-f02a3f576dd0\") " Dec 02 09:32:01 crc kubenswrapper[4895]: I1202 09:32:01.136334 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5f5cl\" (UniqueName: \"kubernetes.io/projected/64c58f3c-deb5-4931-9285-f02a3f576dd0-kube-api-access-5f5cl\") pod \"64c58f3c-deb5-4931-9285-f02a3f576dd0\" (UID: \"64c58f3c-deb5-4931-9285-f02a3f576dd0\") " Dec 02 09:32:01 crc kubenswrapper[4895]: I1202 09:32:01.143241 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/64c58f3c-deb5-4931-9285-f02a3f576dd0-ceph" (OuterVolumeSpecName: "ceph") pod "64c58f3c-deb5-4931-9285-f02a3f576dd0" (UID: "64c58f3c-deb5-4931-9285-f02a3f576dd0"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:32:01 crc kubenswrapper[4895]: I1202 09:32:01.143585 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64c58f3c-deb5-4931-9285-f02a3f576dd0-kube-api-access-5f5cl" (OuterVolumeSpecName: "kube-api-access-5f5cl") pod "64c58f3c-deb5-4931-9285-f02a3f576dd0" (UID: "64c58f3c-deb5-4931-9285-f02a3f576dd0"). InnerVolumeSpecName "kube-api-access-5f5cl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:32:01 crc kubenswrapper[4895]: I1202 09:32:01.183920 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64c58f3c-deb5-4931-9285-f02a3f576dd0-inventory" (OuterVolumeSpecName: "inventory") pod "64c58f3c-deb5-4931-9285-f02a3f576dd0" (UID: "64c58f3c-deb5-4931-9285-f02a3f576dd0"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:32:01 crc kubenswrapper[4895]: I1202 09:32:01.195115 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64c58f3c-deb5-4931-9285-f02a3f576dd0-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "64c58f3c-deb5-4931-9285-f02a3f576dd0" (UID: "64c58f3c-deb5-4931-9285-f02a3f576dd0"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:32:01 crc kubenswrapper[4895]: I1202 09:32:01.240445 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5f5cl\" (UniqueName: \"kubernetes.io/projected/64c58f3c-deb5-4931-9285-f02a3f576dd0-kube-api-access-5f5cl\") on node \"crc\" DevicePath \"\"" Dec 02 09:32:01 crc kubenswrapper[4895]: I1202 09:32:01.240483 4895 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/64c58f3c-deb5-4931-9285-f02a3f576dd0-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 02 09:32:01 crc kubenswrapper[4895]: I1202 09:32:01.240493 4895 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/64c58f3c-deb5-4931-9285-f02a3f576dd0-ceph\") on node \"crc\" DevicePath \"\"" Dec 02 09:32:01 crc kubenswrapper[4895]: I1202 09:32:01.240501 4895 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/64c58f3c-deb5-4931-9285-f02a3f576dd0-inventory\") on node \"crc\" DevicePath \"\"" Dec 02 09:32:01 crc kubenswrapper[4895]: I1202 09:32:01.358397 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-openstack-openstack-cell1-2qtd2" event={"ID":"64c58f3c-deb5-4931-9285-f02a3f576dd0","Type":"ContainerDied","Data":"e0b42bcca90e0122f46986c9aaa6704ae0db490c0263d3ec19e247ccdec05b13"} Dec 02 09:32:01 crc kubenswrapper[4895]: I1202 09:32:01.358719 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e0b42bcca90e0122f46986c9aaa6704ae0db490c0263d3ec19e247ccdec05b13" Dec 02 09:32:01 crc kubenswrapper[4895]: I1202 09:32:01.358545 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-openstack-openstack-cell1-2qtd2" Dec 02 09:32:01 crc kubenswrapper[4895]: I1202 09:32:01.439929 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-openstack-openstack-cell1-7tr28"] Dec 02 09:32:01 crc kubenswrapper[4895]: E1202 09:32:01.440426 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64c58f3c-deb5-4931-9285-f02a3f576dd0" containerName="ceph-client-openstack-openstack-cell1" Dec 02 09:32:01 crc kubenswrapper[4895]: I1202 09:32:01.440444 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="64c58f3c-deb5-4931-9285-f02a3f576dd0" containerName="ceph-client-openstack-openstack-cell1" Dec 02 09:32:01 crc kubenswrapper[4895]: I1202 09:32:01.440657 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="64c58f3c-deb5-4931-9285-f02a3f576dd0" containerName="ceph-client-openstack-openstack-cell1" Dec 02 09:32:01 crc kubenswrapper[4895]: I1202 09:32:01.441399 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-7tr28" Dec 02 09:32:01 crc kubenswrapper[4895]: I1202 09:32:01.451575 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-brvc6" Dec 02 09:32:01 crc kubenswrapper[4895]: I1202 09:32:01.451680 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Dec 02 09:32:01 crc kubenswrapper[4895]: I1202 09:32:01.451870 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 02 09:32:01 crc kubenswrapper[4895]: I1202 09:32:01.452648 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 02 09:32:01 crc kubenswrapper[4895]: I1202 09:32:01.453315 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 02 09:32:01 crc kubenswrapper[4895]: I1202 09:32:01.468452 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-openstack-openstack-cell1-7tr28"] Dec 02 09:32:01 crc kubenswrapper[4895]: I1202 09:32:01.573925 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ea628598-f396-4f21-b672-9779a9b04dd1-ssh-key\") pod \"ovn-openstack-openstack-cell1-7tr28\" (UID: \"ea628598-f396-4f21-b672-9779a9b04dd1\") " pod="openstack/ovn-openstack-openstack-cell1-7tr28" Dec 02 09:32:01 crc kubenswrapper[4895]: I1202 09:32:01.574292 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2dbz\" (UniqueName: \"kubernetes.io/projected/ea628598-f396-4f21-b672-9779a9b04dd1-kube-api-access-s2dbz\") pod \"ovn-openstack-openstack-cell1-7tr28\" (UID: \"ea628598-f396-4f21-b672-9779a9b04dd1\") " pod="openstack/ovn-openstack-openstack-cell1-7tr28" Dec 02 09:32:01 crc kubenswrapper[4895]: I1202 
09:32:01.574424 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/ea628598-f396-4f21-b672-9779a9b04dd1-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-7tr28\" (UID: \"ea628598-f396-4f21-b672-9779a9b04dd1\") " pod="openstack/ovn-openstack-openstack-cell1-7tr28" Dec 02 09:32:01 crc kubenswrapper[4895]: I1202 09:32:01.574559 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea628598-f396-4f21-b672-9779a9b04dd1-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell1-7tr28\" (UID: \"ea628598-f396-4f21-b672-9779a9b04dd1\") " pod="openstack/ovn-openstack-openstack-cell1-7tr28" Dec 02 09:32:01 crc kubenswrapper[4895]: I1202 09:32:01.574649 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ea628598-f396-4f21-b672-9779a9b04dd1-inventory\") pod \"ovn-openstack-openstack-cell1-7tr28\" (UID: \"ea628598-f396-4f21-b672-9779a9b04dd1\") " pod="openstack/ovn-openstack-openstack-cell1-7tr28" Dec 02 09:32:01 crc kubenswrapper[4895]: I1202 09:32:01.574718 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ea628598-f396-4f21-b672-9779a9b04dd1-ceph\") pod \"ovn-openstack-openstack-cell1-7tr28\" (UID: \"ea628598-f396-4f21-b672-9779a9b04dd1\") " pod="openstack/ovn-openstack-openstack-cell1-7tr28" Dec 02 09:32:01 crc kubenswrapper[4895]: I1202 09:32:01.675615 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dbz\" (UniqueName: \"kubernetes.io/projected/ea628598-f396-4f21-b672-9779a9b04dd1-kube-api-access-s2dbz\") pod \"ovn-openstack-openstack-cell1-7tr28\" (UID: \"ea628598-f396-4f21-b672-9779a9b04dd1\") " 
pod="openstack/ovn-openstack-openstack-cell1-7tr28" Dec 02 09:32:01 crc kubenswrapper[4895]: I1202 09:32:01.675688 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/ea628598-f396-4f21-b672-9779a9b04dd1-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-7tr28\" (UID: \"ea628598-f396-4f21-b672-9779a9b04dd1\") " pod="openstack/ovn-openstack-openstack-cell1-7tr28" Dec 02 09:32:01 crc kubenswrapper[4895]: I1202 09:32:01.675794 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea628598-f396-4f21-b672-9779a9b04dd1-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell1-7tr28\" (UID: \"ea628598-f396-4f21-b672-9779a9b04dd1\") " pod="openstack/ovn-openstack-openstack-cell1-7tr28" Dec 02 09:32:01 crc kubenswrapper[4895]: I1202 09:32:01.675844 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ea628598-f396-4f21-b672-9779a9b04dd1-inventory\") pod \"ovn-openstack-openstack-cell1-7tr28\" (UID: \"ea628598-f396-4f21-b672-9779a9b04dd1\") " pod="openstack/ovn-openstack-openstack-cell1-7tr28" Dec 02 09:32:01 crc kubenswrapper[4895]: I1202 09:32:01.675874 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ea628598-f396-4f21-b672-9779a9b04dd1-ceph\") pod \"ovn-openstack-openstack-cell1-7tr28\" (UID: \"ea628598-f396-4f21-b672-9779a9b04dd1\") " pod="openstack/ovn-openstack-openstack-cell1-7tr28" Dec 02 09:32:01 crc kubenswrapper[4895]: I1202 09:32:01.675911 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ea628598-f396-4f21-b672-9779a9b04dd1-ssh-key\") pod \"ovn-openstack-openstack-cell1-7tr28\" (UID: \"ea628598-f396-4f21-b672-9779a9b04dd1\") " 
pod="openstack/ovn-openstack-openstack-cell1-7tr28" Dec 02 09:32:01 crc kubenswrapper[4895]: I1202 09:32:01.676906 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/ea628598-f396-4f21-b672-9779a9b04dd1-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-7tr28\" (UID: \"ea628598-f396-4f21-b672-9779a9b04dd1\") " pod="openstack/ovn-openstack-openstack-cell1-7tr28" Dec 02 09:32:01 crc kubenswrapper[4895]: I1202 09:32:01.682540 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea628598-f396-4f21-b672-9779a9b04dd1-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell1-7tr28\" (UID: \"ea628598-f396-4f21-b672-9779a9b04dd1\") " pod="openstack/ovn-openstack-openstack-cell1-7tr28" Dec 02 09:32:01 crc kubenswrapper[4895]: I1202 09:32:01.684432 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ea628598-f396-4f21-b672-9779a9b04dd1-inventory\") pod \"ovn-openstack-openstack-cell1-7tr28\" (UID: \"ea628598-f396-4f21-b672-9779a9b04dd1\") " pod="openstack/ovn-openstack-openstack-cell1-7tr28" Dec 02 09:32:01 crc kubenswrapper[4895]: I1202 09:32:01.687157 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ea628598-f396-4f21-b672-9779a9b04dd1-ssh-key\") pod \"ovn-openstack-openstack-cell1-7tr28\" (UID: \"ea628598-f396-4f21-b672-9779a9b04dd1\") " pod="openstack/ovn-openstack-openstack-cell1-7tr28" Dec 02 09:32:01 crc kubenswrapper[4895]: I1202 09:32:01.688303 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ea628598-f396-4f21-b672-9779a9b04dd1-ceph\") pod \"ovn-openstack-openstack-cell1-7tr28\" (UID: \"ea628598-f396-4f21-b672-9779a9b04dd1\") " 
pod="openstack/ovn-openstack-openstack-cell1-7tr28" Dec 02 09:32:01 crc kubenswrapper[4895]: I1202 09:32:01.696201 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dbz\" (UniqueName: \"kubernetes.io/projected/ea628598-f396-4f21-b672-9779a9b04dd1-kube-api-access-s2dbz\") pod \"ovn-openstack-openstack-cell1-7tr28\" (UID: \"ea628598-f396-4f21-b672-9779a9b04dd1\") " pod="openstack/ovn-openstack-openstack-cell1-7tr28" Dec 02 09:32:01 crc kubenswrapper[4895]: I1202 09:32:01.763474 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-7tr28" Dec 02 09:32:02 crc kubenswrapper[4895]: I1202 09:32:02.327307 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-openstack-openstack-cell1-7tr28"] Dec 02 09:32:02 crc kubenswrapper[4895]: I1202 09:32:02.368843 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-7tr28" event={"ID":"ea628598-f396-4f21-b672-9779a9b04dd1","Type":"ContainerStarted","Data":"5b9894e2d5887ced93f6fd4692723aa8fc35e8dfad0cc27d49e6709a855fb3d0"} Dec 02 09:32:03 crc kubenswrapper[4895]: I1202 09:32:03.386802 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-7tr28" event={"ID":"ea628598-f396-4f21-b672-9779a9b04dd1","Type":"ContainerStarted","Data":"69314594f814bead00b2cdcda8d975c304441b5c4f3983cbd30570f971047cb2"} Dec 02 09:32:03 crc kubenswrapper[4895]: I1202 09:32:03.412309 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-openstack-openstack-cell1-7tr28" podStartSLOduration=2.010214123 podStartE2EDuration="2.412275311s" podCreationTimestamp="2025-12-02 09:32:01 +0000 UTC" firstStartedPulling="2025-12-02 09:32:02.336783056 +0000 UTC m=+7733.507642669" lastFinishedPulling="2025-12-02 09:32:02.738844244 +0000 UTC m=+7733.909703857" observedRunningTime="2025-12-02 09:32:03.410564718 +0000 UTC 
m=+7734.581424331" watchObservedRunningTime="2025-12-02 09:32:03.412275311 +0000 UTC m=+7734.583134924" Dec 02 09:32:05 crc kubenswrapper[4895]: I1202 09:32:05.473053 4895 patch_prober.go:28] interesting pod/machine-config-daemon-wfcg7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 09:32:05 crc kubenswrapper[4895]: I1202 09:32:05.473574 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 09:32:35 crc kubenswrapper[4895]: I1202 09:32:35.473620 4895 patch_prober.go:28] interesting pod/machine-config-daemon-wfcg7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 09:32:35 crc kubenswrapper[4895]: I1202 09:32:35.474264 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 09:32:35 crc kubenswrapper[4895]: I1202 09:32:35.474310 4895 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" Dec 02 09:32:35 crc kubenswrapper[4895]: I1202 09:32:35.475242 4895 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"77223bf7853202dbb8024d476b05d3dd47cc2c7476c34d4d3d8c2d6a90d37b0b"} pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 09:32:35 crc kubenswrapper[4895]: I1202 09:32:35.475390 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerName="machine-config-daemon" containerID="cri-o://77223bf7853202dbb8024d476b05d3dd47cc2c7476c34d4d3d8c2d6a90d37b0b" gracePeriod=600 Dec 02 09:32:35 crc kubenswrapper[4895]: E1202 09:32:35.598331 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:32:35 crc kubenswrapper[4895]: I1202 09:32:35.905853 4895 generic.go:334] "Generic (PLEG): container finished" podID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerID="77223bf7853202dbb8024d476b05d3dd47cc2c7476c34d4d3d8c2d6a90d37b0b" exitCode=0 Dec 02 09:32:35 crc kubenswrapper[4895]: I1202 09:32:35.905906 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" event={"ID":"0468d2d1-a975-45a6-af9f-47adc432fab0","Type":"ContainerDied","Data":"77223bf7853202dbb8024d476b05d3dd47cc2c7476c34d4d3d8c2d6a90d37b0b"} Dec 02 09:32:35 crc kubenswrapper[4895]: I1202 09:32:35.905957 4895 scope.go:117] "RemoveContainer" containerID="3cd7773ccf4b21f0075e975c1552444f3a74a56b8e22a60f1d2dd8aa7481d21b" Dec 02 09:32:35 crc kubenswrapper[4895]: I1202 09:32:35.906933 4895 
scope.go:117] "RemoveContainer" containerID="77223bf7853202dbb8024d476b05d3dd47cc2c7476c34d4d3d8c2d6a90d37b0b" Dec 02 09:32:35 crc kubenswrapper[4895]: E1202 09:32:35.907315 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:32:47 crc kubenswrapper[4895]: I1202 09:32:47.141849 4895 scope.go:117] "RemoveContainer" containerID="77223bf7853202dbb8024d476b05d3dd47cc2c7476c34d4d3d8c2d6a90d37b0b" Dec 02 09:32:47 crc kubenswrapper[4895]: E1202 09:32:47.144074 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:33:00 crc kubenswrapper[4895]: I1202 09:33:00.141009 4895 scope.go:117] "RemoveContainer" containerID="77223bf7853202dbb8024d476b05d3dd47cc2c7476c34d4d3d8c2d6a90d37b0b" Dec 02 09:33:00 crc kubenswrapper[4895]: E1202 09:33:00.143150 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:33:12 crc kubenswrapper[4895]: I1202 
09:33:12.365072 4895 generic.go:334] "Generic (PLEG): container finished" podID="ea628598-f396-4f21-b672-9779a9b04dd1" containerID="69314594f814bead00b2cdcda8d975c304441b5c4f3983cbd30570f971047cb2" exitCode=0 Dec 02 09:33:12 crc kubenswrapper[4895]: I1202 09:33:12.365140 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-7tr28" event={"ID":"ea628598-f396-4f21-b672-9779a9b04dd1","Type":"ContainerDied","Data":"69314594f814bead00b2cdcda8d975c304441b5c4f3983cbd30570f971047cb2"} Dec 02 09:33:13 crc kubenswrapper[4895]: I1202 09:33:13.141219 4895 scope.go:117] "RemoveContainer" containerID="77223bf7853202dbb8024d476b05d3dd47cc2c7476c34d4d3d8c2d6a90d37b0b" Dec 02 09:33:13 crc kubenswrapper[4895]: E1202 09:33:13.141949 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:33:13 crc kubenswrapper[4895]: I1202 09:33:13.925633 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-7tr28" Dec 02 09:33:14 crc kubenswrapper[4895]: I1202 09:33:14.108183 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s2dbz\" (UniqueName: \"kubernetes.io/projected/ea628598-f396-4f21-b672-9779a9b04dd1-kube-api-access-s2dbz\") pod \"ea628598-f396-4f21-b672-9779a9b04dd1\" (UID: \"ea628598-f396-4f21-b672-9779a9b04dd1\") " Dec 02 09:33:14 crc kubenswrapper[4895]: I1202 09:33:14.108237 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ea628598-f396-4f21-b672-9779a9b04dd1-ssh-key\") pod \"ea628598-f396-4f21-b672-9779a9b04dd1\" (UID: \"ea628598-f396-4f21-b672-9779a9b04dd1\") " Dec 02 09:33:14 crc kubenswrapper[4895]: I1202 09:33:14.108276 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea628598-f396-4f21-b672-9779a9b04dd1-ovn-combined-ca-bundle\") pod \"ea628598-f396-4f21-b672-9779a9b04dd1\" (UID: \"ea628598-f396-4f21-b672-9779a9b04dd1\") " Dec 02 09:33:14 crc kubenswrapper[4895]: I1202 09:33:14.108388 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ea628598-f396-4f21-b672-9779a9b04dd1-inventory\") pod \"ea628598-f396-4f21-b672-9779a9b04dd1\" (UID: \"ea628598-f396-4f21-b672-9779a9b04dd1\") " Dec 02 09:33:14 crc kubenswrapper[4895]: I1202 09:33:14.108494 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ea628598-f396-4f21-b672-9779a9b04dd1-ceph\") pod \"ea628598-f396-4f21-b672-9779a9b04dd1\" (UID: \"ea628598-f396-4f21-b672-9779a9b04dd1\") " Dec 02 09:33:14 crc kubenswrapper[4895]: I1202 09:33:14.108639 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" 
(UniqueName: \"kubernetes.io/configmap/ea628598-f396-4f21-b672-9779a9b04dd1-ovncontroller-config-0\") pod \"ea628598-f396-4f21-b672-9779a9b04dd1\" (UID: \"ea628598-f396-4f21-b672-9779a9b04dd1\") " Dec 02 09:33:14 crc kubenswrapper[4895]: I1202 09:33:14.116699 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea628598-f396-4f21-b672-9779a9b04dd1-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "ea628598-f396-4f21-b672-9779a9b04dd1" (UID: "ea628598-f396-4f21-b672-9779a9b04dd1"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:33:14 crc kubenswrapper[4895]: I1202 09:33:14.116710 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea628598-f396-4f21-b672-9779a9b04dd1-kube-api-access-s2dbz" (OuterVolumeSpecName: "kube-api-access-s2dbz") pod "ea628598-f396-4f21-b672-9779a9b04dd1" (UID: "ea628598-f396-4f21-b672-9779a9b04dd1"). InnerVolumeSpecName "kube-api-access-s2dbz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:33:14 crc kubenswrapper[4895]: I1202 09:33:14.116779 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea628598-f396-4f21-b672-9779a9b04dd1-ceph" (OuterVolumeSpecName: "ceph") pod "ea628598-f396-4f21-b672-9779a9b04dd1" (UID: "ea628598-f396-4f21-b672-9779a9b04dd1"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:33:14 crc kubenswrapper[4895]: I1202 09:33:14.143570 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea628598-f396-4f21-b672-9779a9b04dd1-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "ea628598-f396-4f21-b672-9779a9b04dd1" (UID: "ea628598-f396-4f21-b672-9779a9b04dd1"). InnerVolumeSpecName "ovncontroller-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:33:14 crc kubenswrapper[4895]: I1202 09:33:14.144784 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea628598-f396-4f21-b672-9779a9b04dd1-inventory" (OuterVolumeSpecName: "inventory") pod "ea628598-f396-4f21-b672-9779a9b04dd1" (UID: "ea628598-f396-4f21-b672-9779a9b04dd1"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:33:14 crc kubenswrapper[4895]: I1202 09:33:14.144828 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea628598-f396-4f21-b672-9779a9b04dd1-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "ea628598-f396-4f21-b672-9779a9b04dd1" (UID: "ea628598-f396-4f21-b672-9779a9b04dd1"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:33:14 crc kubenswrapper[4895]: I1202 09:33:14.212288 4895 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ea628598-f396-4f21-b672-9779a9b04dd1-inventory\") on node \"crc\" DevicePath \"\"" Dec 02 09:33:14 crc kubenswrapper[4895]: I1202 09:33:14.212328 4895 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ea628598-f396-4f21-b672-9779a9b04dd1-ceph\") on node \"crc\" DevicePath \"\"" Dec 02 09:33:14 crc kubenswrapper[4895]: I1202 09:33:14.212339 4895 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/ea628598-f396-4f21-b672-9779a9b04dd1-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Dec 02 09:33:14 crc kubenswrapper[4895]: I1202 09:33:14.212351 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s2dbz\" (UniqueName: \"kubernetes.io/projected/ea628598-f396-4f21-b672-9779a9b04dd1-kube-api-access-s2dbz\") on node \"crc\" DevicePath \"\"" Dec 02 09:33:14 crc kubenswrapper[4895]: 
I1202 09:33:14.212359 4895 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ea628598-f396-4f21-b672-9779a9b04dd1-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 02 09:33:14 crc kubenswrapper[4895]: I1202 09:33:14.212367 4895 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea628598-f396-4f21-b672-9779a9b04dd1-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 09:33:14 crc kubenswrapper[4895]: I1202 09:33:14.396876 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-7tr28" event={"ID":"ea628598-f396-4f21-b672-9779a9b04dd1","Type":"ContainerDied","Data":"5b9894e2d5887ced93f6fd4692723aa8fc35e8dfad0cc27d49e6709a855fb3d0"} Dec 02 09:33:14 crc kubenswrapper[4895]: I1202 09:33:14.396926 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-7tr28" Dec 02 09:33:14 crc kubenswrapper[4895]: I1202 09:33:14.396943 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5b9894e2d5887ced93f6fd4692723aa8fc35e8dfad0cc27d49e6709a855fb3d0" Dec 02 09:33:14 crc kubenswrapper[4895]: I1202 09:33:14.490225 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-openstack-openstack-cell1-dtfpx"] Dec 02 09:33:14 crc kubenswrapper[4895]: E1202 09:33:14.491205 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea628598-f396-4f21-b672-9779a9b04dd1" containerName="ovn-openstack-openstack-cell1" Dec 02 09:33:14 crc kubenswrapper[4895]: I1202 09:33:14.491236 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea628598-f396-4f21-b672-9779a9b04dd1" containerName="ovn-openstack-openstack-cell1" Dec 02 09:33:14 crc kubenswrapper[4895]: I1202 09:33:14.491678 4895 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="ea628598-f396-4f21-b672-9779a9b04dd1" containerName="ovn-openstack-openstack-cell1" Dec 02 09:33:14 crc kubenswrapper[4895]: I1202 09:33:14.494575 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-dtfpx" Dec 02 09:33:14 crc kubenswrapper[4895]: I1202 09:33:14.498921 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 02 09:33:14 crc kubenswrapper[4895]: I1202 09:33:14.499170 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 02 09:33:14 crc kubenswrapper[4895]: I1202 09:33:14.499677 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-brvc6" Dec 02 09:33:14 crc kubenswrapper[4895]: I1202 09:33:14.500540 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Dec 02 09:33:14 crc kubenswrapper[4895]: I1202 09:33:14.500713 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Dec 02 09:33:14 crc kubenswrapper[4895]: I1202 09:33:14.503021 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 02 09:33:14 crc kubenswrapper[4895]: I1202 09:33:14.506315 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-openstack-openstack-cell1-dtfpx"] Dec 02 09:33:14 crc kubenswrapper[4895]: I1202 09:33:14.627115 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5l2s\" (UniqueName: \"kubernetes.io/projected/5d8e5aa7-e827-4d5a-8d8f-d0ca211a00e4-kube-api-access-z5l2s\") pod \"neutron-metadata-openstack-openstack-cell1-dtfpx\" (UID: \"5d8e5aa7-e827-4d5a-8d8f-d0ca211a00e4\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-dtfpx" 
Dec 02 09:33:14 crc kubenswrapper[4895]: I1202 09:33:14.627165 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5d8e5aa7-e827-4d5a-8d8f-d0ca211a00e4-inventory\") pod \"neutron-metadata-openstack-openstack-cell1-dtfpx\" (UID: \"5d8e5aa7-e827-4d5a-8d8f-d0ca211a00e4\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-dtfpx" Dec 02 09:33:14 crc kubenswrapper[4895]: I1202 09:33:14.627207 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d8e5aa7-e827-4d5a-8d8f-d0ca211a00e4-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-cell1-dtfpx\" (UID: \"5d8e5aa7-e827-4d5a-8d8f-d0ca211a00e4\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-dtfpx" Dec 02 09:33:14 crc kubenswrapper[4895]: I1202 09:33:14.627374 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/5d8e5aa7-e827-4d5a-8d8f-d0ca211a00e4-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-dtfpx\" (UID: \"5d8e5aa7-e827-4d5a-8d8f-d0ca211a00e4\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-dtfpx" Dec 02 09:33:14 crc kubenswrapper[4895]: I1202 09:33:14.627595 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5d8e5aa7-e827-4d5a-8d8f-d0ca211a00e4-ssh-key\") pod \"neutron-metadata-openstack-openstack-cell1-dtfpx\" (UID: \"5d8e5aa7-e827-4d5a-8d8f-d0ca211a00e4\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-dtfpx" Dec 02 09:33:14 crc kubenswrapper[4895]: I1202 09:33:14.627790 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/5d8e5aa7-e827-4d5a-8d8f-d0ca211a00e4-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-dtfpx\" (UID: \"5d8e5aa7-e827-4d5a-8d8f-d0ca211a00e4\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-dtfpx" Dec 02 09:33:14 crc kubenswrapper[4895]: I1202 09:33:14.627865 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5d8e5aa7-e827-4d5a-8d8f-d0ca211a00e4-ceph\") pod \"neutron-metadata-openstack-openstack-cell1-dtfpx\" (UID: \"5d8e5aa7-e827-4d5a-8d8f-d0ca211a00e4\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-dtfpx" Dec 02 09:33:14 crc kubenswrapper[4895]: I1202 09:33:14.731033 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z5l2s\" (UniqueName: \"kubernetes.io/projected/5d8e5aa7-e827-4d5a-8d8f-d0ca211a00e4-kube-api-access-z5l2s\") pod \"neutron-metadata-openstack-openstack-cell1-dtfpx\" (UID: \"5d8e5aa7-e827-4d5a-8d8f-d0ca211a00e4\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-dtfpx" Dec 02 09:33:14 crc kubenswrapper[4895]: I1202 09:33:14.731937 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5d8e5aa7-e827-4d5a-8d8f-d0ca211a00e4-inventory\") pod \"neutron-metadata-openstack-openstack-cell1-dtfpx\" (UID: \"5d8e5aa7-e827-4d5a-8d8f-d0ca211a00e4\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-dtfpx" Dec 02 09:33:14 crc kubenswrapper[4895]: I1202 09:33:14.732831 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d8e5aa7-e827-4d5a-8d8f-d0ca211a00e4-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-cell1-dtfpx\" (UID: 
\"5d8e5aa7-e827-4d5a-8d8f-d0ca211a00e4\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-dtfpx" Dec 02 09:33:14 crc kubenswrapper[4895]: I1202 09:33:14.732921 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/5d8e5aa7-e827-4d5a-8d8f-d0ca211a00e4-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-dtfpx\" (UID: \"5d8e5aa7-e827-4d5a-8d8f-d0ca211a00e4\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-dtfpx" Dec 02 09:33:14 crc kubenswrapper[4895]: I1202 09:33:14.732957 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5d8e5aa7-e827-4d5a-8d8f-d0ca211a00e4-ssh-key\") pod \"neutron-metadata-openstack-openstack-cell1-dtfpx\" (UID: \"5d8e5aa7-e827-4d5a-8d8f-d0ca211a00e4\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-dtfpx" Dec 02 09:33:14 crc kubenswrapper[4895]: I1202 09:33:14.732985 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/5d8e5aa7-e827-4d5a-8d8f-d0ca211a00e4-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-dtfpx\" (UID: \"5d8e5aa7-e827-4d5a-8d8f-d0ca211a00e4\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-dtfpx" Dec 02 09:33:14 crc kubenswrapper[4895]: I1202 09:33:14.733011 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5d8e5aa7-e827-4d5a-8d8f-d0ca211a00e4-ceph\") pod \"neutron-metadata-openstack-openstack-cell1-dtfpx\" (UID: \"5d8e5aa7-e827-4d5a-8d8f-d0ca211a00e4\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-dtfpx" Dec 02 09:33:14 crc kubenswrapper[4895]: I1202 09:33:14.739100 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5d8e5aa7-e827-4d5a-8d8f-d0ca211a00e4-inventory\") pod \"neutron-metadata-openstack-openstack-cell1-dtfpx\" (UID: \"5d8e5aa7-e827-4d5a-8d8f-d0ca211a00e4\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-dtfpx" Dec 02 09:33:14 crc kubenswrapper[4895]: I1202 09:33:14.739290 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/5d8e5aa7-e827-4d5a-8d8f-d0ca211a00e4-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-dtfpx\" (UID: \"5d8e5aa7-e827-4d5a-8d8f-d0ca211a00e4\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-dtfpx" Dec 02 09:33:14 crc kubenswrapper[4895]: I1202 09:33:14.739582 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5d8e5aa7-e827-4d5a-8d8f-d0ca211a00e4-ceph\") pod \"neutron-metadata-openstack-openstack-cell1-dtfpx\" (UID: \"5d8e5aa7-e827-4d5a-8d8f-d0ca211a00e4\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-dtfpx" Dec 02 09:33:14 crc kubenswrapper[4895]: I1202 09:33:14.740839 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/5d8e5aa7-e827-4d5a-8d8f-d0ca211a00e4-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-dtfpx\" (UID: \"5d8e5aa7-e827-4d5a-8d8f-d0ca211a00e4\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-dtfpx" Dec 02 09:33:14 crc kubenswrapper[4895]: I1202 09:33:14.741109 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d8e5aa7-e827-4d5a-8d8f-d0ca211a00e4-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-cell1-dtfpx\" (UID: \"5d8e5aa7-e827-4d5a-8d8f-d0ca211a00e4\") " 
pod="openstack/neutron-metadata-openstack-openstack-cell1-dtfpx" Dec 02 09:33:14 crc kubenswrapper[4895]: I1202 09:33:14.741537 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5d8e5aa7-e827-4d5a-8d8f-d0ca211a00e4-ssh-key\") pod \"neutron-metadata-openstack-openstack-cell1-dtfpx\" (UID: \"5d8e5aa7-e827-4d5a-8d8f-d0ca211a00e4\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-dtfpx" Dec 02 09:33:14 crc kubenswrapper[4895]: I1202 09:33:14.750198 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5l2s\" (UniqueName: \"kubernetes.io/projected/5d8e5aa7-e827-4d5a-8d8f-d0ca211a00e4-kube-api-access-z5l2s\") pod \"neutron-metadata-openstack-openstack-cell1-dtfpx\" (UID: \"5d8e5aa7-e827-4d5a-8d8f-d0ca211a00e4\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-dtfpx" Dec 02 09:33:14 crc kubenswrapper[4895]: I1202 09:33:14.849183 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-dtfpx" Dec 02 09:33:15 crc kubenswrapper[4895]: I1202 09:33:15.599020 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-openstack-openstack-cell1-dtfpx"] Dec 02 09:33:16 crc kubenswrapper[4895]: I1202 09:33:16.419321 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-dtfpx" event={"ID":"5d8e5aa7-e827-4d5a-8d8f-d0ca211a00e4","Type":"ContainerStarted","Data":"2b3e5e08e1be277ac7d9b94d86b287561179749da9dbde5190bc9b0ce20770ee"} Dec 02 09:33:16 crc kubenswrapper[4895]: I1202 09:33:16.419828 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-dtfpx" event={"ID":"5d8e5aa7-e827-4d5a-8d8f-d0ca211a00e4","Type":"ContainerStarted","Data":"3f7f1a5ff44144a7f613ff32c28f11e6b6a97e67e30896d2270787c2679163af"} Dec 02 09:33:16 crc kubenswrapper[4895]: I1202 09:33:16.548945 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-openstack-openstack-cell1-dtfpx" podStartSLOduration=2.3918963939999998 podStartE2EDuration="2.548906682s" podCreationTimestamp="2025-12-02 09:33:14 +0000 UTC" firstStartedPulling="2025-12-02 09:33:15.604004502 +0000 UTC m=+7806.774864115" lastFinishedPulling="2025-12-02 09:33:15.76101479 +0000 UTC m=+7806.931874403" observedRunningTime="2025-12-02 09:33:16.439949219 +0000 UTC m=+7807.610808872" watchObservedRunningTime="2025-12-02 09:33:16.548906682 +0000 UTC m=+7807.719766295" Dec 02 09:33:25 crc kubenswrapper[4895]: I1202 09:33:25.140782 4895 scope.go:117] "RemoveContainer" containerID="77223bf7853202dbb8024d476b05d3dd47cc2c7476c34d4d3d8c2d6a90d37b0b" Dec 02 09:33:25 crc kubenswrapper[4895]: E1202 09:33:25.142041 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:33:33 crc kubenswrapper[4895]: I1202 09:33:33.011049 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-f2swm"] Dec 02 09:33:33 crc kubenswrapper[4895]: I1202 09:33:33.015141 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-f2swm" Dec 02 09:33:33 crc kubenswrapper[4895]: I1202 09:33:33.030424 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-f2swm"] Dec 02 09:33:33 crc kubenswrapper[4895]: I1202 09:33:33.146056 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/903e8499-531c-4f6d-bfe8-db635ab2aa55-catalog-content\") pod \"community-operators-f2swm\" (UID: \"903e8499-531c-4f6d-bfe8-db635ab2aa55\") " pod="openshift-marketplace/community-operators-f2swm" Dec 02 09:33:33 crc kubenswrapper[4895]: I1202 09:33:33.146189 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/903e8499-531c-4f6d-bfe8-db635ab2aa55-utilities\") pod \"community-operators-f2swm\" (UID: \"903e8499-531c-4f6d-bfe8-db635ab2aa55\") " pod="openshift-marketplace/community-operators-f2swm" Dec 02 09:33:33 crc kubenswrapper[4895]: I1202 09:33:33.146383 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grkmk\" (UniqueName: \"kubernetes.io/projected/903e8499-531c-4f6d-bfe8-db635ab2aa55-kube-api-access-grkmk\") pod \"community-operators-f2swm\" (UID: \"903e8499-531c-4f6d-bfe8-db635ab2aa55\") 
" pod="openshift-marketplace/community-operators-f2swm" Dec 02 09:33:33 crc kubenswrapper[4895]: I1202 09:33:33.248633 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/903e8499-531c-4f6d-bfe8-db635ab2aa55-utilities\") pod \"community-operators-f2swm\" (UID: \"903e8499-531c-4f6d-bfe8-db635ab2aa55\") " pod="openshift-marketplace/community-operators-f2swm" Dec 02 09:33:33 crc kubenswrapper[4895]: I1202 09:33:33.248921 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-grkmk\" (UniqueName: \"kubernetes.io/projected/903e8499-531c-4f6d-bfe8-db635ab2aa55-kube-api-access-grkmk\") pod \"community-operators-f2swm\" (UID: \"903e8499-531c-4f6d-bfe8-db635ab2aa55\") " pod="openshift-marketplace/community-operators-f2swm" Dec 02 09:33:33 crc kubenswrapper[4895]: I1202 09:33:33.248981 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/903e8499-531c-4f6d-bfe8-db635ab2aa55-catalog-content\") pod \"community-operators-f2swm\" (UID: \"903e8499-531c-4f6d-bfe8-db635ab2aa55\") " pod="openshift-marketplace/community-operators-f2swm" Dec 02 09:33:33 crc kubenswrapper[4895]: I1202 09:33:33.249784 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/903e8499-531c-4f6d-bfe8-db635ab2aa55-utilities\") pod \"community-operators-f2swm\" (UID: \"903e8499-531c-4f6d-bfe8-db635ab2aa55\") " pod="openshift-marketplace/community-operators-f2swm" Dec 02 09:33:33 crc kubenswrapper[4895]: I1202 09:33:33.249972 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/903e8499-531c-4f6d-bfe8-db635ab2aa55-catalog-content\") pod \"community-operators-f2swm\" (UID: \"903e8499-531c-4f6d-bfe8-db635ab2aa55\") " 
pod="openshift-marketplace/community-operators-f2swm" Dec 02 09:33:33 crc kubenswrapper[4895]: I1202 09:33:33.272296 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-grkmk\" (UniqueName: \"kubernetes.io/projected/903e8499-531c-4f6d-bfe8-db635ab2aa55-kube-api-access-grkmk\") pod \"community-operators-f2swm\" (UID: \"903e8499-531c-4f6d-bfe8-db635ab2aa55\") " pod="openshift-marketplace/community-operators-f2swm" Dec 02 09:33:33 crc kubenswrapper[4895]: I1202 09:33:33.361990 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-f2swm" Dec 02 09:33:34 crc kubenswrapper[4895]: I1202 09:33:34.104838 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-f2swm"] Dec 02 09:33:34 crc kubenswrapper[4895]: I1202 09:33:34.661316 4895 generic.go:334] "Generic (PLEG): container finished" podID="903e8499-531c-4f6d-bfe8-db635ab2aa55" containerID="9349a03cf80ec8941a034fa21e4fc923354b6b81c2f56d10de51f2b033909e4a" exitCode=0 Dec 02 09:33:34 crc kubenswrapper[4895]: I1202 09:33:34.661434 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f2swm" event={"ID":"903e8499-531c-4f6d-bfe8-db635ab2aa55","Type":"ContainerDied","Data":"9349a03cf80ec8941a034fa21e4fc923354b6b81c2f56d10de51f2b033909e4a"} Dec 02 09:33:34 crc kubenswrapper[4895]: I1202 09:33:34.661639 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f2swm" event={"ID":"903e8499-531c-4f6d-bfe8-db635ab2aa55","Type":"ContainerStarted","Data":"14ae416ccf8e47442b9fe3487bbb81b307089440927f9f0406e096ed15132fc9"} Dec 02 09:33:36 crc kubenswrapper[4895]: I1202 09:33:36.140761 4895 scope.go:117] "RemoveContainer" containerID="77223bf7853202dbb8024d476b05d3dd47cc2c7476c34d4d3d8c2d6a90d37b0b" Dec 02 09:33:36 crc kubenswrapper[4895]: E1202 09:33:36.141532 4895 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:33:36 crc kubenswrapper[4895]: I1202 09:33:36.686830 4895 generic.go:334] "Generic (PLEG): container finished" podID="903e8499-531c-4f6d-bfe8-db635ab2aa55" containerID="5c7146d9006733a105fc51dcd56a5fba8ae6f5864cfccb94424ae85e465b78eb" exitCode=0 Dec 02 09:33:36 crc kubenswrapper[4895]: I1202 09:33:36.686888 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f2swm" event={"ID":"903e8499-531c-4f6d-bfe8-db635ab2aa55","Type":"ContainerDied","Data":"5c7146d9006733a105fc51dcd56a5fba8ae6f5864cfccb94424ae85e465b78eb"} Dec 02 09:33:37 crc kubenswrapper[4895]: I1202 09:33:37.699128 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f2swm" event={"ID":"903e8499-531c-4f6d-bfe8-db635ab2aa55","Type":"ContainerStarted","Data":"21eb1570dff4adc09c4543676b23cb873ff118e2bc51a81dab815b79de7d53eb"} Dec 02 09:33:37 crc kubenswrapper[4895]: I1202 09:33:37.727011 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-f2swm" podStartSLOduration=3.1906323739999998 podStartE2EDuration="5.726987362s" podCreationTimestamp="2025-12-02 09:33:32 +0000 UTC" firstStartedPulling="2025-12-02 09:33:34.66345063 +0000 UTC m=+7825.834310243" lastFinishedPulling="2025-12-02 09:33:37.199805618 +0000 UTC m=+7828.370665231" observedRunningTime="2025-12-02 09:33:37.723660418 +0000 UTC m=+7828.894520051" watchObservedRunningTime="2025-12-02 09:33:37.726987362 +0000 UTC m=+7828.897846975" Dec 02 09:33:43 crc kubenswrapper[4895]: I1202 
09:33:43.362447 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-f2swm" Dec 02 09:33:43 crc kubenswrapper[4895]: I1202 09:33:43.366831 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-f2swm" Dec 02 09:33:43 crc kubenswrapper[4895]: I1202 09:33:43.415932 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-f2swm" Dec 02 09:33:43 crc kubenswrapper[4895]: I1202 09:33:43.822067 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-f2swm" Dec 02 09:33:43 crc kubenswrapper[4895]: I1202 09:33:43.882433 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-f2swm"] Dec 02 09:33:45 crc kubenswrapper[4895]: I1202 09:33:45.780247 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-f2swm" podUID="903e8499-531c-4f6d-bfe8-db635ab2aa55" containerName="registry-server" containerID="cri-o://21eb1570dff4adc09c4543676b23cb873ff118e2bc51a81dab815b79de7d53eb" gracePeriod=2 Dec 02 09:33:46 crc kubenswrapper[4895]: I1202 09:33:46.256101 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-f2swm" Dec 02 09:33:46 crc kubenswrapper[4895]: I1202 09:33:46.360252 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-grkmk\" (UniqueName: \"kubernetes.io/projected/903e8499-531c-4f6d-bfe8-db635ab2aa55-kube-api-access-grkmk\") pod \"903e8499-531c-4f6d-bfe8-db635ab2aa55\" (UID: \"903e8499-531c-4f6d-bfe8-db635ab2aa55\") " Dec 02 09:33:46 crc kubenswrapper[4895]: I1202 09:33:46.360317 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/903e8499-531c-4f6d-bfe8-db635ab2aa55-utilities\") pod \"903e8499-531c-4f6d-bfe8-db635ab2aa55\" (UID: \"903e8499-531c-4f6d-bfe8-db635ab2aa55\") " Dec 02 09:33:46 crc kubenswrapper[4895]: I1202 09:33:46.360671 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/903e8499-531c-4f6d-bfe8-db635ab2aa55-catalog-content\") pod \"903e8499-531c-4f6d-bfe8-db635ab2aa55\" (UID: \"903e8499-531c-4f6d-bfe8-db635ab2aa55\") " Dec 02 09:33:46 crc kubenswrapper[4895]: I1202 09:33:46.361639 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/903e8499-531c-4f6d-bfe8-db635ab2aa55-utilities" (OuterVolumeSpecName: "utilities") pod "903e8499-531c-4f6d-bfe8-db635ab2aa55" (UID: "903e8499-531c-4f6d-bfe8-db635ab2aa55"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:33:46 crc kubenswrapper[4895]: I1202 09:33:46.366865 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/903e8499-531c-4f6d-bfe8-db635ab2aa55-kube-api-access-grkmk" (OuterVolumeSpecName: "kube-api-access-grkmk") pod "903e8499-531c-4f6d-bfe8-db635ab2aa55" (UID: "903e8499-531c-4f6d-bfe8-db635ab2aa55"). InnerVolumeSpecName "kube-api-access-grkmk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:33:46 crc kubenswrapper[4895]: I1202 09:33:46.431613 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/903e8499-531c-4f6d-bfe8-db635ab2aa55-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "903e8499-531c-4f6d-bfe8-db635ab2aa55" (UID: "903e8499-531c-4f6d-bfe8-db635ab2aa55"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:33:46 crc kubenswrapper[4895]: I1202 09:33:46.463136 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-grkmk\" (UniqueName: \"kubernetes.io/projected/903e8499-531c-4f6d-bfe8-db635ab2aa55-kube-api-access-grkmk\") on node \"crc\" DevicePath \"\"" Dec 02 09:33:46 crc kubenswrapper[4895]: I1202 09:33:46.463175 4895 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/903e8499-531c-4f6d-bfe8-db635ab2aa55-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 09:33:46 crc kubenswrapper[4895]: I1202 09:33:46.463185 4895 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/903e8499-531c-4f6d-bfe8-db635ab2aa55-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 09:33:46 crc kubenswrapper[4895]: I1202 09:33:46.792794 4895 generic.go:334] "Generic (PLEG): container finished" podID="903e8499-531c-4f6d-bfe8-db635ab2aa55" containerID="21eb1570dff4adc09c4543676b23cb873ff118e2bc51a81dab815b79de7d53eb" exitCode=0 Dec 02 09:33:46 crc kubenswrapper[4895]: I1202 09:33:46.792835 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f2swm" event={"ID":"903e8499-531c-4f6d-bfe8-db635ab2aa55","Type":"ContainerDied","Data":"21eb1570dff4adc09c4543676b23cb873ff118e2bc51a81dab815b79de7d53eb"} Dec 02 09:33:46 crc kubenswrapper[4895]: I1202 09:33:46.792855 4895 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/community-operators-f2swm" Dec 02 09:33:46 crc kubenswrapper[4895]: I1202 09:33:46.792876 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f2swm" event={"ID":"903e8499-531c-4f6d-bfe8-db635ab2aa55","Type":"ContainerDied","Data":"14ae416ccf8e47442b9fe3487bbb81b307089440927f9f0406e096ed15132fc9"} Dec 02 09:33:46 crc kubenswrapper[4895]: I1202 09:33:46.792897 4895 scope.go:117] "RemoveContainer" containerID="21eb1570dff4adc09c4543676b23cb873ff118e2bc51a81dab815b79de7d53eb" Dec 02 09:33:46 crc kubenswrapper[4895]: I1202 09:33:46.822412 4895 scope.go:117] "RemoveContainer" containerID="5c7146d9006733a105fc51dcd56a5fba8ae6f5864cfccb94424ae85e465b78eb" Dec 02 09:33:46 crc kubenswrapper[4895]: I1202 09:33:46.833604 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-f2swm"] Dec 02 09:33:46 crc kubenswrapper[4895]: I1202 09:33:46.843684 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-f2swm"] Dec 02 09:33:46 crc kubenswrapper[4895]: I1202 09:33:46.856271 4895 scope.go:117] "RemoveContainer" containerID="9349a03cf80ec8941a034fa21e4fc923354b6b81c2f56d10de51f2b033909e4a" Dec 02 09:33:46 crc kubenswrapper[4895]: I1202 09:33:46.894570 4895 scope.go:117] "RemoveContainer" containerID="21eb1570dff4adc09c4543676b23cb873ff118e2bc51a81dab815b79de7d53eb" Dec 02 09:33:46 crc kubenswrapper[4895]: E1202 09:33:46.895258 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21eb1570dff4adc09c4543676b23cb873ff118e2bc51a81dab815b79de7d53eb\": container with ID starting with 21eb1570dff4adc09c4543676b23cb873ff118e2bc51a81dab815b79de7d53eb not found: ID does not exist" containerID="21eb1570dff4adc09c4543676b23cb873ff118e2bc51a81dab815b79de7d53eb" Dec 02 09:33:46 crc kubenswrapper[4895]: I1202 09:33:46.895292 
4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21eb1570dff4adc09c4543676b23cb873ff118e2bc51a81dab815b79de7d53eb"} err="failed to get container status \"21eb1570dff4adc09c4543676b23cb873ff118e2bc51a81dab815b79de7d53eb\": rpc error: code = NotFound desc = could not find container \"21eb1570dff4adc09c4543676b23cb873ff118e2bc51a81dab815b79de7d53eb\": container with ID starting with 21eb1570dff4adc09c4543676b23cb873ff118e2bc51a81dab815b79de7d53eb not found: ID does not exist" Dec 02 09:33:46 crc kubenswrapper[4895]: I1202 09:33:46.895317 4895 scope.go:117] "RemoveContainer" containerID="5c7146d9006733a105fc51dcd56a5fba8ae6f5864cfccb94424ae85e465b78eb" Dec 02 09:33:46 crc kubenswrapper[4895]: E1202 09:33:46.895612 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c7146d9006733a105fc51dcd56a5fba8ae6f5864cfccb94424ae85e465b78eb\": container with ID starting with 5c7146d9006733a105fc51dcd56a5fba8ae6f5864cfccb94424ae85e465b78eb not found: ID does not exist" containerID="5c7146d9006733a105fc51dcd56a5fba8ae6f5864cfccb94424ae85e465b78eb" Dec 02 09:33:46 crc kubenswrapper[4895]: I1202 09:33:46.895647 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c7146d9006733a105fc51dcd56a5fba8ae6f5864cfccb94424ae85e465b78eb"} err="failed to get container status \"5c7146d9006733a105fc51dcd56a5fba8ae6f5864cfccb94424ae85e465b78eb\": rpc error: code = NotFound desc = could not find container \"5c7146d9006733a105fc51dcd56a5fba8ae6f5864cfccb94424ae85e465b78eb\": container with ID starting with 5c7146d9006733a105fc51dcd56a5fba8ae6f5864cfccb94424ae85e465b78eb not found: ID does not exist" Dec 02 09:33:46 crc kubenswrapper[4895]: I1202 09:33:46.895671 4895 scope.go:117] "RemoveContainer" containerID="9349a03cf80ec8941a034fa21e4fc923354b6b81c2f56d10de51f2b033909e4a" Dec 02 09:33:46 crc kubenswrapper[4895]: E1202 
09:33:46.896057 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9349a03cf80ec8941a034fa21e4fc923354b6b81c2f56d10de51f2b033909e4a\": container with ID starting with 9349a03cf80ec8941a034fa21e4fc923354b6b81c2f56d10de51f2b033909e4a not found: ID does not exist" containerID="9349a03cf80ec8941a034fa21e4fc923354b6b81c2f56d10de51f2b033909e4a" Dec 02 09:33:46 crc kubenswrapper[4895]: I1202 09:33:46.896078 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9349a03cf80ec8941a034fa21e4fc923354b6b81c2f56d10de51f2b033909e4a"} err="failed to get container status \"9349a03cf80ec8941a034fa21e4fc923354b6b81c2f56d10de51f2b033909e4a\": rpc error: code = NotFound desc = could not find container \"9349a03cf80ec8941a034fa21e4fc923354b6b81c2f56d10de51f2b033909e4a\": container with ID starting with 9349a03cf80ec8941a034fa21e4fc923354b6b81c2f56d10de51f2b033909e4a not found: ID does not exist" Dec 02 09:33:47 crc kubenswrapper[4895]: I1202 09:33:47.152132 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="903e8499-531c-4f6d-bfe8-db635ab2aa55" path="/var/lib/kubelet/pods/903e8499-531c-4f6d-bfe8-db635ab2aa55/volumes" Dec 02 09:33:49 crc kubenswrapper[4895]: I1202 09:33:49.152294 4895 scope.go:117] "RemoveContainer" containerID="77223bf7853202dbb8024d476b05d3dd47cc2c7476c34d4d3d8c2d6a90d37b0b" Dec 02 09:33:49 crc kubenswrapper[4895]: E1202 09:33:49.153189 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:33:57 crc kubenswrapper[4895]: I1202 09:33:57.406840 
4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-w9hm8"] Dec 02 09:33:57 crc kubenswrapper[4895]: E1202 09:33:57.408012 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="903e8499-531c-4f6d-bfe8-db635ab2aa55" containerName="extract-content" Dec 02 09:33:57 crc kubenswrapper[4895]: I1202 09:33:57.408031 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="903e8499-531c-4f6d-bfe8-db635ab2aa55" containerName="extract-content" Dec 02 09:33:57 crc kubenswrapper[4895]: E1202 09:33:57.408060 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="903e8499-531c-4f6d-bfe8-db635ab2aa55" containerName="registry-server" Dec 02 09:33:57 crc kubenswrapper[4895]: I1202 09:33:57.408068 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="903e8499-531c-4f6d-bfe8-db635ab2aa55" containerName="registry-server" Dec 02 09:33:57 crc kubenswrapper[4895]: E1202 09:33:57.408084 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="903e8499-531c-4f6d-bfe8-db635ab2aa55" containerName="extract-utilities" Dec 02 09:33:57 crc kubenswrapper[4895]: I1202 09:33:57.408093 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="903e8499-531c-4f6d-bfe8-db635ab2aa55" containerName="extract-utilities" Dec 02 09:33:57 crc kubenswrapper[4895]: I1202 09:33:57.408360 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="903e8499-531c-4f6d-bfe8-db635ab2aa55" containerName="registry-server" Dec 02 09:33:57 crc kubenswrapper[4895]: I1202 09:33:57.410066 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-w9hm8" Dec 02 09:33:57 crc kubenswrapper[4895]: I1202 09:33:57.428870 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-w9hm8"] Dec 02 09:33:57 crc kubenswrapper[4895]: I1202 09:33:57.532587 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18211ca2-5d6e-4afc-9b4e-888c21ee355b-catalog-content\") pod \"redhat-operators-w9hm8\" (UID: \"18211ca2-5d6e-4afc-9b4e-888c21ee355b\") " pod="openshift-marketplace/redhat-operators-w9hm8" Dec 02 09:33:57 crc kubenswrapper[4895]: I1202 09:33:57.532852 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnrst\" (UniqueName: \"kubernetes.io/projected/18211ca2-5d6e-4afc-9b4e-888c21ee355b-kube-api-access-wnrst\") pod \"redhat-operators-w9hm8\" (UID: \"18211ca2-5d6e-4afc-9b4e-888c21ee355b\") " pod="openshift-marketplace/redhat-operators-w9hm8" Dec 02 09:33:57 crc kubenswrapper[4895]: I1202 09:33:57.532934 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18211ca2-5d6e-4afc-9b4e-888c21ee355b-utilities\") pod \"redhat-operators-w9hm8\" (UID: \"18211ca2-5d6e-4afc-9b4e-888c21ee355b\") " pod="openshift-marketplace/redhat-operators-w9hm8" Dec 02 09:33:57 crc kubenswrapper[4895]: I1202 09:33:57.635517 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18211ca2-5d6e-4afc-9b4e-888c21ee355b-catalog-content\") pod \"redhat-operators-w9hm8\" (UID: \"18211ca2-5d6e-4afc-9b4e-888c21ee355b\") " pod="openshift-marketplace/redhat-operators-w9hm8" Dec 02 09:33:57 crc kubenswrapper[4895]: I1202 09:33:57.635716 4895 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-wnrst\" (UniqueName: \"kubernetes.io/projected/18211ca2-5d6e-4afc-9b4e-888c21ee355b-kube-api-access-wnrst\") pod \"redhat-operators-w9hm8\" (UID: \"18211ca2-5d6e-4afc-9b4e-888c21ee355b\") " pod="openshift-marketplace/redhat-operators-w9hm8" Dec 02 09:33:57 crc kubenswrapper[4895]: I1202 09:33:57.635759 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18211ca2-5d6e-4afc-9b4e-888c21ee355b-utilities\") pod \"redhat-operators-w9hm8\" (UID: \"18211ca2-5d6e-4afc-9b4e-888c21ee355b\") " pod="openshift-marketplace/redhat-operators-w9hm8" Dec 02 09:33:57 crc kubenswrapper[4895]: I1202 09:33:57.636421 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18211ca2-5d6e-4afc-9b4e-888c21ee355b-utilities\") pod \"redhat-operators-w9hm8\" (UID: \"18211ca2-5d6e-4afc-9b4e-888c21ee355b\") " pod="openshift-marketplace/redhat-operators-w9hm8" Dec 02 09:33:57 crc kubenswrapper[4895]: I1202 09:33:57.636669 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18211ca2-5d6e-4afc-9b4e-888c21ee355b-catalog-content\") pod \"redhat-operators-w9hm8\" (UID: \"18211ca2-5d6e-4afc-9b4e-888c21ee355b\") " pod="openshift-marketplace/redhat-operators-w9hm8" Dec 02 09:33:57 crc kubenswrapper[4895]: I1202 09:33:57.669725 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wnrst\" (UniqueName: \"kubernetes.io/projected/18211ca2-5d6e-4afc-9b4e-888c21ee355b-kube-api-access-wnrst\") pod \"redhat-operators-w9hm8\" (UID: \"18211ca2-5d6e-4afc-9b4e-888c21ee355b\") " pod="openshift-marketplace/redhat-operators-w9hm8" Dec 02 09:33:57 crc kubenswrapper[4895]: I1202 09:33:57.778126 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-w9hm8" Dec 02 09:33:58 crc kubenswrapper[4895]: I1202 09:33:58.317281 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-w9hm8"] Dec 02 09:33:58 crc kubenswrapper[4895]: E1202 09:33:58.786185 4895 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod18211ca2_5d6e_4afc_9b4e_888c21ee355b.slice/crio-conmon-badb0ee2c0de8449c616aea0aeefbf83611ca970e23d684766d94f658ea6ed7b.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod18211ca2_5d6e_4afc_9b4e_888c21ee355b.slice/crio-badb0ee2c0de8449c616aea0aeefbf83611ca970e23d684766d94f658ea6ed7b.scope\": RecentStats: unable to find data in memory cache]" Dec 02 09:33:58 crc kubenswrapper[4895]: I1202 09:33:58.944932 4895 generic.go:334] "Generic (PLEG): container finished" podID="18211ca2-5d6e-4afc-9b4e-888c21ee355b" containerID="badb0ee2c0de8449c616aea0aeefbf83611ca970e23d684766d94f658ea6ed7b" exitCode=0 Dec 02 09:33:58 crc kubenswrapper[4895]: I1202 09:33:58.944980 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w9hm8" event={"ID":"18211ca2-5d6e-4afc-9b4e-888c21ee355b","Type":"ContainerDied","Data":"badb0ee2c0de8449c616aea0aeefbf83611ca970e23d684766d94f658ea6ed7b"} Dec 02 09:33:58 crc kubenswrapper[4895]: I1202 09:33:58.945005 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w9hm8" event={"ID":"18211ca2-5d6e-4afc-9b4e-888c21ee355b","Type":"ContainerStarted","Data":"30177a9b689de6d7185ce082451600b591769da3f41649bbf9ce69d72309771d"} Dec 02 09:33:58 crc kubenswrapper[4895]: I1202 09:33:58.947548 4895 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 02 09:33:59 crc 
kubenswrapper[4895]: I1202 09:33:59.958008 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w9hm8" event={"ID":"18211ca2-5d6e-4afc-9b4e-888c21ee355b","Type":"ContainerStarted","Data":"5fc18d832d32925624af7656fc823450c6f19ebcd7f393b8e7127641c420eaf8"} Dec 02 09:34:00 crc kubenswrapper[4895]: I1202 09:34:00.142182 4895 scope.go:117] "RemoveContainer" containerID="77223bf7853202dbb8024d476b05d3dd47cc2c7476c34d4d3d8c2d6a90d37b0b" Dec 02 09:34:00 crc kubenswrapper[4895]: E1202 09:34:00.142725 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:34:02 crc kubenswrapper[4895]: I1202 09:34:02.988671 4895 generic.go:334] "Generic (PLEG): container finished" podID="18211ca2-5d6e-4afc-9b4e-888c21ee355b" containerID="5fc18d832d32925624af7656fc823450c6f19ebcd7f393b8e7127641c420eaf8" exitCode=0 Dec 02 09:34:02 crc kubenswrapper[4895]: I1202 09:34:02.988730 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w9hm8" event={"ID":"18211ca2-5d6e-4afc-9b4e-888c21ee355b","Type":"ContainerDied","Data":"5fc18d832d32925624af7656fc823450c6f19ebcd7f393b8e7127641c420eaf8"} Dec 02 09:34:04 crc kubenswrapper[4895]: I1202 09:34:04.002196 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w9hm8" event={"ID":"18211ca2-5d6e-4afc-9b4e-888c21ee355b","Type":"ContainerStarted","Data":"5e58acf1a909f055def784ee71be8aa0cbd6af0a73a1cef43dc4e055b5cbf0b1"} Dec 02 09:34:04 crc kubenswrapper[4895]: I1202 09:34:04.028225 4895 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-marketplace/redhat-operators-w9hm8" podStartSLOduration=2.48182425 podStartE2EDuration="7.028195729s" podCreationTimestamp="2025-12-02 09:33:57 +0000 UTC" firstStartedPulling="2025-12-02 09:33:58.947308158 +0000 UTC m=+7850.118167761" lastFinishedPulling="2025-12-02 09:34:03.493679617 +0000 UTC m=+7854.664539240" observedRunningTime="2025-12-02 09:34:04.022896434 +0000 UTC m=+7855.193756087" watchObservedRunningTime="2025-12-02 09:34:04.028195729 +0000 UTC m=+7855.199055342" Dec 02 09:34:07 crc kubenswrapper[4895]: I1202 09:34:07.778593 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-w9hm8" Dec 02 09:34:07 crc kubenswrapper[4895]: I1202 09:34:07.779418 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-w9hm8" Dec 02 09:34:08 crc kubenswrapper[4895]: I1202 09:34:08.836057 4895 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-w9hm8" podUID="18211ca2-5d6e-4afc-9b4e-888c21ee355b" containerName="registry-server" probeResult="failure" output=< Dec 02 09:34:08 crc kubenswrapper[4895]: timeout: failed to connect service ":50051" within 1s Dec 02 09:34:08 crc kubenswrapper[4895]: > Dec 02 09:34:13 crc kubenswrapper[4895]: I1202 09:34:13.140885 4895 scope.go:117] "RemoveContainer" containerID="77223bf7853202dbb8024d476b05d3dd47cc2c7476c34d4d3d8c2d6a90d37b0b" Dec 02 09:34:13 crc kubenswrapper[4895]: E1202 09:34:13.141570 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:34:14 crc 
kubenswrapper[4895]: I1202 09:34:14.123491 4895 generic.go:334] "Generic (PLEG): container finished" podID="5d8e5aa7-e827-4d5a-8d8f-d0ca211a00e4" containerID="2b3e5e08e1be277ac7d9b94d86b287561179749da9dbde5190bc9b0ce20770ee" exitCode=0 Dec 02 09:34:14 crc kubenswrapper[4895]: I1202 09:34:14.123549 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-dtfpx" event={"ID":"5d8e5aa7-e827-4d5a-8d8f-d0ca211a00e4","Type":"ContainerDied","Data":"2b3e5e08e1be277ac7d9b94d86b287561179749da9dbde5190bc9b0ce20770ee"} Dec 02 09:34:15 crc kubenswrapper[4895]: I1202 09:34:15.665598 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-dtfpx" Dec 02 09:34:15 crc kubenswrapper[4895]: I1202 09:34:15.814528 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5d8e5aa7-e827-4d5a-8d8f-d0ca211a00e4-ceph\") pod \"5d8e5aa7-e827-4d5a-8d8f-d0ca211a00e4\" (UID: \"5d8e5aa7-e827-4d5a-8d8f-d0ca211a00e4\") " Dec 02 09:34:15 crc kubenswrapper[4895]: I1202 09:34:15.814588 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/5d8e5aa7-e827-4d5a-8d8f-d0ca211a00e4-neutron-ovn-metadata-agent-neutron-config-0\") pod \"5d8e5aa7-e827-4d5a-8d8f-d0ca211a00e4\" (UID: \"5d8e5aa7-e827-4d5a-8d8f-d0ca211a00e4\") " Dec 02 09:34:15 crc kubenswrapper[4895]: I1202 09:34:15.814618 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d8e5aa7-e827-4d5a-8d8f-d0ca211a00e4-neutron-metadata-combined-ca-bundle\") pod \"5d8e5aa7-e827-4d5a-8d8f-d0ca211a00e4\" (UID: \"5d8e5aa7-e827-4d5a-8d8f-d0ca211a00e4\") " Dec 02 09:34:15 crc kubenswrapper[4895]: I1202 09:34:15.814704 4895 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z5l2s\" (UniqueName: \"kubernetes.io/projected/5d8e5aa7-e827-4d5a-8d8f-d0ca211a00e4-kube-api-access-z5l2s\") pod \"5d8e5aa7-e827-4d5a-8d8f-d0ca211a00e4\" (UID: \"5d8e5aa7-e827-4d5a-8d8f-d0ca211a00e4\") " Dec 02 09:34:15 crc kubenswrapper[4895]: I1202 09:34:15.814786 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5d8e5aa7-e827-4d5a-8d8f-d0ca211a00e4-ssh-key\") pod \"5d8e5aa7-e827-4d5a-8d8f-d0ca211a00e4\" (UID: \"5d8e5aa7-e827-4d5a-8d8f-d0ca211a00e4\") " Dec 02 09:34:15 crc kubenswrapper[4895]: I1202 09:34:15.814838 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5d8e5aa7-e827-4d5a-8d8f-d0ca211a00e4-inventory\") pod \"5d8e5aa7-e827-4d5a-8d8f-d0ca211a00e4\" (UID: \"5d8e5aa7-e827-4d5a-8d8f-d0ca211a00e4\") " Dec 02 09:34:15 crc kubenswrapper[4895]: I1202 09:34:15.814911 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/5d8e5aa7-e827-4d5a-8d8f-d0ca211a00e4-nova-metadata-neutron-config-0\") pod \"5d8e5aa7-e827-4d5a-8d8f-d0ca211a00e4\" (UID: \"5d8e5aa7-e827-4d5a-8d8f-d0ca211a00e4\") " Dec 02 09:34:15 crc kubenswrapper[4895]: I1202 09:34:15.820679 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d8e5aa7-e827-4d5a-8d8f-d0ca211a00e4-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "5d8e5aa7-e827-4d5a-8d8f-d0ca211a00e4" (UID: "5d8e5aa7-e827-4d5a-8d8f-d0ca211a00e4"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:34:15 crc kubenswrapper[4895]: I1202 09:34:15.821049 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d8e5aa7-e827-4d5a-8d8f-d0ca211a00e4-kube-api-access-z5l2s" (OuterVolumeSpecName: "kube-api-access-z5l2s") pod "5d8e5aa7-e827-4d5a-8d8f-d0ca211a00e4" (UID: "5d8e5aa7-e827-4d5a-8d8f-d0ca211a00e4"). InnerVolumeSpecName "kube-api-access-z5l2s". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:34:15 crc kubenswrapper[4895]: I1202 09:34:15.827900 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d8e5aa7-e827-4d5a-8d8f-d0ca211a00e4-ceph" (OuterVolumeSpecName: "ceph") pod "5d8e5aa7-e827-4d5a-8d8f-d0ca211a00e4" (UID: "5d8e5aa7-e827-4d5a-8d8f-d0ca211a00e4"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:34:15 crc kubenswrapper[4895]: I1202 09:34:15.848791 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d8e5aa7-e827-4d5a-8d8f-d0ca211a00e4-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "5d8e5aa7-e827-4d5a-8d8f-d0ca211a00e4" (UID: "5d8e5aa7-e827-4d5a-8d8f-d0ca211a00e4"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:34:15 crc kubenswrapper[4895]: I1202 09:34:15.848968 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d8e5aa7-e827-4d5a-8d8f-d0ca211a00e4-inventory" (OuterVolumeSpecName: "inventory") pod "5d8e5aa7-e827-4d5a-8d8f-d0ca211a00e4" (UID: "5d8e5aa7-e827-4d5a-8d8f-d0ca211a00e4"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:34:15 crc kubenswrapper[4895]: I1202 09:34:15.849033 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d8e5aa7-e827-4d5a-8d8f-d0ca211a00e4-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "5d8e5aa7-e827-4d5a-8d8f-d0ca211a00e4" (UID: "5d8e5aa7-e827-4d5a-8d8f-d0ca211a00e4"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:34:15 crc kubenswrapper[4895]: I1202 09:34:15.859759 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d8e5aa7-e827-4d5a-8d8f-d0ca211a00e4-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "5d8e5aa7-e827-4d5a-8d8f-d0ca211a00e4" (UID: "5d8e5aa7-e827-4d5a-8d8f-d0ca211a00e4"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:34:15 crc kubenswrapper[4895]: I1202 09:34:15.919419 4895 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/5d8e5aa7-e827-4d5a-8d8f-d0ca211a00e4-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 02 09:34:15 crc kubenswrapper[4895]: I1202 09:34:15.919460 4895 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5d8e5aa7-e827-4d5a-8d8f-d0ca211a00e4-ceph\") on node \"crc\" DevicePath \"\"" Dec 02 09:34:15 crc kubenswrapper[4895]: I1202 09:34:15.919474 4895 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/5d8e5aa7-e827-4d5a-8d8f-d0ca211a00e4-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 02 09:34:15 crc kubenswrapper[4895]: I1202 09:34:15.919493 4895 reconciler_common.go:293] "Volume detached for volume 
\"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d8e5aa7-e827-4d5a-8d8f-d0ca211a00e4-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 09:34:15 crc kubenswrapper[4895]: I1202 09:34:15.919506 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z5l2s\" (UniqueName: \"kubernetes.io/projected/5d8e5aa7-e827-4d5a-8d8f-d0ca211a00e4-kube-api-access-z5l2s\") on node \"crc\" DevicePath \"\"" Dec 02 09:34:15 crc kubenswrapper[4895]: I1202 09:34:15.919524 4895 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5d8e5aa7-e827-4d5a-8d8f-d0ca211a00e4-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 02 09:34:15 crc kubenswrapper[4895]: I1202 09:34:15.919533 4895 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5d8e5aa7-e827-4d5a-8d8f-d0ca211a00e4-inventory\") on node \"crc\" DevicePath \"\"" Dec 02 09:34:16 crc kubenswrapper[4895]: I1202 09:34:16.146605 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-dtfpx" event={"ID":"5d8e5aa7-e827-4d5a-8d8f-d0ca211a00e4","Type":"ContainerDied","Data":"3f7f1a5ff44144a7f613ff32c28f11e6b6a97e67e30896d2270787c2679163af"} Dec 02 09:34:16 crc kubenswrapper[4895]: I1202 09:34:16.146648 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-dtfpx" Dec 02 09:34:16 crc kubenswrapper[4895]: I1202 09:34:16.146657 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3f7f1a5ff44144a7f613ff32c28f11e6b6a97e67e30896d2270787c2679163af" Dec 02 09:34:16 crc kubenswrapper[4895]: I1202 09:34:16.245488 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-openstack-openstack-cell1-kd4nz"] Dec 02 09:34:16 crc kubenswrapper[4895]: E1202 09:34:16.245933 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d8e5aa7-e827-4d5a-8d8f-d0ca211a00e4" containerName="neutron-metadata-openstack-openstack-cell1" Dec 02 09:34:16 crc kubenswrapper[4895]: I1202 09:34:16.245952 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d8e5aa7-e827-4d5a-8d8f-d0ca211a00e4" containerName="neutron-metadata-openstack-openstack-cell1" Dec 02 09:34:16 crc kubenswrapper[4895]: I1202 09:34:16.246216 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d8e5aa7-e827-4d5a-8d8f-d0ca211a00e4" containerName="neutron-metadata-openstack-openstack-cell1" Dec 02 09:34:16 crc kubenswrapper[4895]: I1202 09:34:16.247058 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-kd4nz" Dec 02 09:34:16 crc kubenswrapper[4895]: I1202 09:34:16.249946 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 02 09:34:16 crc kubenswrapper[4895]: I1202 09:34:16.250094 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-brvc6" Dec 02 09:34:16 crc kubenswrapper[4895]: I1202 09:34:16.250281 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Dec 02 09:34:16 crc kubenswrapper[4895]: I1202 09:34:16.249971 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 02 09:34:16 crc kubenswrapper[4895]: I1202 09:34:16.250339 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 02 09:34:16 crc kubenswrapper[4895]: I1202 09:34:16.268719 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-openstack-openstack-cell1-kd4nz"] Dec 02 09:34:16 crc kubenswrapper[4895]: I1202 09:34:16.437285 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6b84d8a6-8098-46cd-83cf-860f21f040a0-ceph\") pod \"libvirt-openstack-openstack-cell1-kd4nz\" (UID: \"6b84d8a6-8098-46cd-83cf-860f21f040a0\") " pod="openstack/libvirt-openstack-openstack-cell1-kd4nz" Dec 02 09:34:16 crc kubenswrapper[4895]: I1202 09:34:16.437669 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhz9b\" (UniqueName: \"kubernetes.io/projected/6b84d8a6-8098-46cd-83cf-860f21f040a0-kube-api-access-dhz9b\") pod \"libvirt-openstack-openstack-cell1-kd4nz\" (UID: \"6b84d8a6-8098-46cd-83cf-860f21f040a0\") " pod="openstack/libvirt-openstack-openstack-cell1-kd4nz" Dec 02 09:34:16 crc 
kubenswrapper[4895]: I1202 09:34:16.437786 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/6b84d8a6-8098-46cd-83cf-860f21f040a0-libvirt-secret-0\") pod \"libvirt-openstack-openstack-cell1-kd4nz\" (UID: \"6b84d8a6-8098-46cd-83cf-860f21f040a0\") " pod="openstack/libvirt-openstack-openstack-cell1-kd4nz" Dec 02 09:34:16 crc kubenswrapper[4895]: I1202 09:34:16.438031 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6b84d8a6-8098-46cd-83cf-860f21f040a0-inventory\") pod \"libvirt-openstack-openstack-cell1-kd4nz\" (UID: \"6b84d8a6-8098-46cd-83cf-860f21f040a0\") " pod="openstack/libvirt-openstack-openstack-cell1-kd4nz" Dec 02 09:34:16 crc kubenswrapper[4895]: I1202 09:34:16.438298 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b84d8a6-8098-46cd-83cf-860f21f040a0-libvirt-combined-ca-bundle\") pod \"libvirt-openstack-openstack-cell1-kd4nz\" (UID: \"6b84d8a6-8098-46cd-83cf-860f21f040a0\") " pod="openstack/libvirt-openstack-openstack-cell1-kd4nz" Dec 02 09:34:16 crc kubenswrapper[4895]: I1202 09:34:16.438393 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6b84d8a6-8098-46cd-83cf-860f21f040a0-ssh-key\") pod \"libvirt-openstack-openstack-cell1-kd4nz\" (UID: \"6b84d8a6-8098-46cd-83cf-860f21f040a0\") " pod="openstack/libvirt-openstack-openstack-cell1-kd4nz" Dec 02 09:34:16 crc kubenswrapper[4895]: I1202 09:34:16.541657 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6b84d8a6-8098-46cd-83cf-860f21f040a0-inventory\") pod \"libvirt-openstack-openstack-cell1-kd4nz\" (UID: 
\"6b84d8a6-8098-46cd-83cf-860f21f040a0\") " pod="openstack/libvirt-openstack-openstack-cell1-kd4nz" Dec 02 09:34:16 crc kubenswrapper[4895]: I1202 09:34:16.541792 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b84d8a6-8098-46cd-83cf-860f21f040a0-libvirt-combined-ca-bundle\") pod \"libvirt-openstack-openstack-cell1-kd4nz\" (UID: \"6b84d8a6-8098-46cd-83cf-860f21f040a0\") " pod="openstack/libvirt-openstack-openstack-cell1-kd4nz" Dec 02 09:34:16 crc kubenswrapper[4895]: I1202 09:34:16.541831 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6b84d8a6-8098-46cd-83cf-860f21f040a0-ssh-key\") pod \"libvirt-openstack-openstack-cell1-kd4nz\" (UID: \"6b84d8a6-8098-46cd-83cf-860f21f040a0\") " pod="openstack/libvirt-openstack-openstack-cell1-kd4nz" Dec 02 09:34:16 crc kubenswrapper[4895]: I1202 09:34:16.541885 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6b84d8a6-8098-46cd-83cf-860f21f040a0-ceph\") pod \"libvirt-openstack-openstack-cell1-kd4nz\" (UID: \"6b84d8a6-8098-46cd-83cf-860f21f040a0\") " pod="openstack/libvirt-openstack-openstack-cell1-kd4nz" Dec 02 09:34:16 crc kubenswrapper[4895]: I1202 09:34:16.541927 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dhz9b\" (UniqueName: \"kubernetes.io/projected/6b84d8a6-8098-46cd-83cf-860f21f040a0-kube-api-access-dhz9b\") pod \"libvirt-openstack-openstack-cell1-kd4nz\" (UID: \"6b84d8a6-8098-46cd-83cf-860f21f040a0\") " pod="openstack/libvirt-openstack-openstack-cell1-kd4nz" Dec 02 09:34:16 crc kubenswrapper[4895]: I1202 09:34:16.542010 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/6b84d8a6-8098-46cd-83cf-860f21f040a0-libvirt-secret-0\") 
pod \"libvirt-openstack-openstack-cell1-kd4nz\" (UID: \"6b84d8a6-8098-46cd-83cf-860f21f040a0\") " pod="openstack/libvirt-openstack-openstack-cell1-kd4nz" Dec 02 09:34:16 crc kubenswrapper[4895]: I1202 09:34:16.547126 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6b84d8a6-8098-46cd-83cf-860f21f040a0-inventory\") pod \"libvirt-openstack-openstack-cell1-kd4nz\" (UID: \"6b84d8a6-8098-46cd-83cf-860f21f040a0\") " pod="openstack/libvirt-openstack-openstack-cell1-kd4nz" Dec 02 09:34:16 crc kubenswrapper[4895]: I1202 09:34:16.547639 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6b84d8a6-8098-46cd-83cf-860f21f040a0-ceph\") pod \"libvirt-openstack-openstack-cell1-kd4nz\" (UID: \"6b84d8a6-8098-46cd-83cf-860f21f040a0\") " pod="openstack/libvirt-openstack-openstack-cell1-kd4nz" Dec 02 09:34:16 crc kubenswrapper[4895]: I1202 09:34:16.549631 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b84d8a6-8098-46cd-83cf-860f21f040a0-libvirt-combined-ca-bundle\") pod \"libvirt-openstack-openstack-cell1-kd4nz\" (UID: \"6b84d8a6-8098-46cd-83cf-860f21f040a0\") " pod="openstack/libvirt-openstack-openstack-cell1-kd4nz" Dec 02 09:34:16 crc kubenswrapper[4895]: I1202 09:34:16.550088 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/6b84d8a6-8098-46cd-83cf-860f21f040a0-libvirt-secret-0\") pod \"libvirt-openstack-openstack-cell1-kd4nz\" (UID: \"6b84d8a6-8098-46cd-83cf-860f21f040a0\") " pod="openstack/libvirt-openstack-openstack-cell1-kd4nz" Dec 02 09:34:16 crc kubenswrapper[4895]: I1202 09:34:16.553432 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6b84d8a6-8098-46cd-83cf-860f21f040a0-ssh-key\") pod 
\"libvirt-openstack-openstack-cell1-kd4nz\" (UID: \"6b84d8a6-8098-46cd-83cf-860f21f040a0\") " pod="openstack/libvirt-openstack-openstack-cell1-kd4nz" Dec 02 09:34:16 crc kubenswrapper[4895]: I1202 09:34:16.561445 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhz9b\" (UniqueName: \"kubernetes.io/projected/6b84d8a6-8098-46cd-83cf-860f21f040a0-kube-api-access-dhz9b\") pod \"libvirt-openstack-openstack-cell1-kd4nz\" (UID: \"6b84d8a6-8098-46cd-83cf-860f21f040a0\") " pod="openstack/libvirt-openstack-openstack-cell1-kd4nz" Dec 02 09:34:16 crc kubenswrapper[4895]: I1202 09:34:16.568076 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-kd4nz" Dec 02 09:34:17 crc kubenswrapper[4895]: I1202 09:34:17.175516 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-openstack-openstack-cell1-kd4nz"] Dec 02 09:34:17 crc kubenswrapper[4895]: W1202 09:34:17.176781 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6b84d8a6_8098_46cd_83cf_860f21f040a0.slice/crio-bab7ee965ef9c7610109d8a5c4888ae8c717f32ee48bc7f042f4f1e252fab12a WatchSource:0}: Error finding container bab7ee965ef9c7610109d8a5c4888ae8c717f32ee48bc7f042f4f1e252fab12a: Status 404 returned error can't find the container with id bab7ee965ef9c7610109d8a5c4888ae8c717f32ee48bc7f042f4f1e252fab12a Dec 02 09:34:17 crc kubenswrapper[4895]: I1202 09:34:17.846682 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-w9hm8" Dec 02 09:34:17 crc kubenswrapper[4895]: I1202 09:34:17.907440 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-w9hm8" Dec 02 09:34:18 crc kubenswrapper[4895]: I1202 09:34:18.091192 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-operators-w9hm8"] Dec 02 09:34:18 crc kubenswrapper[4895]: I1202 09:34:18.194626 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-kd4nz" event={"ID":"6b84d8a6-8098-46cd-83cf-860f21f040a0","Type":"ContainerStarted","Data":"88284e79b559622727a9bc22776c65387e1940f50ed4505397bd8848aad92165"} Dec 02 09:34:18 crc kubenswrapper[4895]: I1202 09:34:18.196778 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-kd4nz" event={"ID":"6b84d8a6-8098-46cd-83cf-860f21f040a0","Type":"ContainerStarted","Data":"bab7ee965ef9c7610109d8a5c4888ae8c717f32ee48bc7f042f4f1e252fab12a"} Dec 02 09:34:18 crc kubenswrapper[4895]: I1202 09:34:18.222016 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-openstack-openstack-cell1-kd4nz" podStartSLOduration=2.0181077099999998 podStartE2EDuration="2.221982328s" podCreationTimestamp="2025-12-02 09:34:16 +0000 UTC" firstStartedPulling="2025-12-02 09:34:17.179526191 +0000 UTC m=+7868.350385804" lastFinishedPulling="2025-12-02 09:34:17.383400809 +0000 UTC m=+7868.554260422" observedRunningTime="2025-12-02 09:34:18.211061848 +0000 UTC m=+7869.381921471" watchObservedRunningTime="2025-12-02 09:34:18.221982328 +0000 UTC m=+7869.392841941" Dec 02 09:34:19 crc kubenswrapper[4895]: I1202 09:34:19.203332 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-w9hm8" podUID="18211ca2-5d6e-4afc-9b4e-888c21ee355b" containerName="registry-server" containerID="cri-o://5e58acf1a909f055def784ee71be8aa0cbd6af0a73a1cef43dc4e055b5cbf0b1" gracePeriod=2 Dec 02 09:34:19 crc kubenswrapper[4895]: I1202 09:34:19.741734 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-w9hm8" Dec 02 09:34:19 crc kubenswrapper[4895]: I1202 09:34:19.924417 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wnrst\" (UniqueName: \"kubernetes.io/projected/18211ca2-5d6e-4afc-9b4e-888c21ee355b-kube-api-access-wnrst\") pod \"18211ca2-5d6e-4afc-9b4e-888c21ee355b\" (UID: \"18211ca2-5d6e-4afc-9b4e-888c21ee355b\") " Dec 02 09:34:19 crc kubenswrapper[4895]: I1202 09:34:19.924506 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18211ca2-5d6e-4afc-9b4e-888c21ee355b-utilities\") pod \"18211ca2-5d6e-4afc-9b4e-888c21ee355b\" (UID: \"18211ca2-5d6e-4afc-9b4e-888c21ee355b\") " Dec 02 09:34:19 crc kubenswrapper[4895]: I1202 09:34:19.924870 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18211ca2-5d6e-4afc-9b4e-888c21ee355b-catalog-content\") pod \"18211ca2-5d6e-4afc-9b4e-888c21ee355b\" (UID: \"18211ca2-5d6e-4afc-9b4e-888c21ee355b\") " Dec 02 09:34:19 crc kubenswrapper[4895]: I1202 09:34:19.925353 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18211ca2-5d6e-4afc-9b4e-888c21ee355b-utilities" (OuterVolumeSpecName: "utilities") pod "18211ca2-5d6e-4afc-9b4e-888c21ee355b" (UID: "18211ca2-5d6e-4afc-9b4e-888c21ee355b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:34:19 crc kubenswrapper[4895]: I1202 09:34:19.931094 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18211ca2-5d6e-4afc-9b4e-888c21ee355b-kube-api-access-wnrst" (OuterVolumeSpecName: "kube-api-access-wnrst") pod "18211ca2-5d6e-4afc-9b4e-888c21ee355b" (UID: "18211ca2-5d6e-4afc-9b4e-888c21ee355b"). InnerVolumeSpecName "kube-api-access-wnrst". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:34:20 crc kubenswrapper[4895]: I1202 09:34:20.026385 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18211ca2-5d6e-4afc-9b4e-888c21ee355b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "18211ca2-5d6e-4afc-9b4e-888c21ee355b" (UID: "18211ca2-5d6e-4afc-9b4e-888c21ee355b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:34:20 crc kubenswrapper[4895]: I1202 09:34:20.028192 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wnrst\" (UniqueName: \"kubernetes.io/projected/18211ca2-5d6e-4afc-9b4e-888c21ee355b-kube-api-access-wnrst\") on node \"crc\" DevicePath \"\"" Dec 02 09:34:20 crc kubenswrapper[4895]: I1202 09:34:20.028220 4895 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18211ca2-5d6e-4afc-9b4e-888c21ee355b-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 09:34:20 crc kubenswrapper[4895]: I1202 09:34:20.028229 4895 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18211ca2-5d6e-4afc-9b4e-888c21ee355b-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 09:34:20 crc kubenswrapper[4895]: I1202 09:34:20.220427 4895 generic.go:334] "Generic (PLEG): container finished" podID="18211ca2-5d6e-4afc-9b4e-888c21ee355b" containerID="5e58acf1a909f055def784ee71be8aa0cbd6af0a73a1cef43dc4e055b5cbf0b1" exitCode=0 Dec 02 09:34:20 crc kubenswrapper[4895]: I1202 09:34:20.220483 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-w9hm8" Dec 02 09:34:20 crc kubenswrapper[4895]: I1202 09:34:20.220514 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w9hm8" event={"ID":"18211ca2-5d6e-4afc-9b4e-888c21ee355b","Type":"ContainerDied","Data":"5e58acf1a909f055def784ee71be8aa0cbd6af0a73a1cef43dc4e055b5cbf0b1"} Dec 02 09:34:20 crc kubenswrapper[4895]: I1202 09:34:20.221865 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w9hm8" event={"ID":"18211ca2-5d6e-4afc-9b4e-888c21ee355b","Type":"ContainerDied","Data":"30177a9b689de6d7185ce082451600b591769da3f41649bbf9ce69d72309771d"} Dec 02 09:34:20 crc kubenswrapper[4895]: I1202 09:34:20.221897 4895 scope.go:117] "RemoveContainer" containerID="5e58acf1a909f055def784ee71be8aa0cbd6af0a73a1cef43dc4e055b5cbf0b1" Dec 02 09:34:20 crc kubenswrapper[4895]: I1202 09:34:20.249687 4895 scope.go:117] "RemoveContainer" containerID="5fc18d832d32925624af7656fc823450c6f19ebcd7f393b8e7127641c420eaf8" Dec 02 09:34:20 crc kubenswrapper[4895]: I1202 09:34:20.258496 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-w9hm8"] Dec 02 09:34:20 crc kubenswrapper[4895]: I1202 09:34:20.277849 4895 scope.go:117] "RemoveContainer" containerID="badb0ee2c0de8449c616aea0aeefbf83611ca970e23d684766d94f658ea6ed7b" Dec 02 09:34:20 crc kubenswrapper[4895]: I1202 09:34:20.286165 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-w9hm8"] Dec 02 09:34:20 crc kubenswrapper[4895]: I1202 09:34:20.323872 4895 scope.go:117] "RemoveContainer" containerID="5e58acf1a909f055def784ee71be8aa0cbd6af0a73a1cef43dc4e055b5cbf0b1" Dec 02 09:34:20 crc kubenswrapper[4895]: E1202 09:34:20.324283 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"5e58acf1a909f055def784ee71be8aa0cbd6af0a73a1cef43dc4e055b5cbf0b1\": container with ID starting with 5e58acf1a909f055def784ee71be8aa0cbd6af0a73a1cef43dc4e055b5cbf0b1 not found: ID does not exist" containerID="5e58acf1a909f055def784ee71be8aa0cbd6af0a73a1cef43dc4e055b5cbf0b1" Dec 02 09:34:20 crc kubenswrapper[4895]: I1202 09:34:20.324331 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e58acf1a909f055def784ee71be8aa0cbd6af0a73a1cef43dc4e055b5cbf0b1"} err="failed to get container status \"5e58acf1a909f055def784ee71be8aa0cbd6af0a73a1cef43dc4e055b5cbf0b1\": rpc error: code = NotFound desc = could not find container \"5e58acf1a909f055def784ee71be8aa0cbd6af0a73a1cef43dc4e055b5cbf0b1\": container with ID starting with 5e58acf1a909f055def784ee71be8aa0cbd6af0a73a1cef43dc4e055b5cbf0b1 not found: ID does not exist" Dec 02 09:34:20 crc kubenswrapper[4895]: I1202 09:34:20.324361 4895 scope.go:117] "RemoveContainer" containerID="5fc18d832d32925624af7656fc823450c6f19ebcd7f393b8e7127641c420eaf8" Dec 02 09:34:20 crc kubenswrapper[4895]: E1202 09:34:20.325082 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5fc18d832d32925624af7656fc823450c6f19ebcd7f393b8e7127641c420eaf8\": container with ID starting with 5fc18d832d32925624af7656fc823450c6f19ebcd7f393b8e7127641c420eaf8 not found: ID does not exist" containerID="5fc18d832d32925624af7656fc823450c6f19ebcd7f393b8e7127641c420eaf8" Dec 02 09:34:20 crc kubenswrapper[4895]: I1202 09:34:20.325239 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5fc18d832d32925624af7656fc823450c6f19ebcd7f393b8e7127641c420eaf8"} err="failed to get container status \"5fc18d832d32925624af7656fc823450c6f19ebcd7f393b8e7127641c420eaf8\": rpc error: code = NotFound desc = could not find container \"5fc18d832d32925624af7656fc823450c6f19ebcd7f393b8e7127641c420eaf8\": container with ID 
starting with 5fc18d832d32925624af7656fc823450c6f19ebcd7f393b8e7127641c420eaf8 not found: ID does not exist" Dec 02 09:34:20 crc kubenswrapper[4895]: I1202 09:34:20.325343 4895 scope.go:117] "RemoveContainer" containerID="badb0ee2c0de8449c616aea0aeefbf83611ca970e23d684766d94f658ea6ed7b" Dec 02 09:34:20 crc kubenswrapper[4895]: E1202 09:34:20.325728 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"badb0ee2c0de8449c616aea0aeefbf83611ca970e23d684766d94f658ea6ed7b\": container with ID starting with badb0ee2c0de8449c616aea0aeefbf83611ca970e23d684766d94f658ea6ed7b not found: ID does not exist" containerID="badb0ee2c0de8449c616aea0aeefbf83611ca970e23d684766d94f658ea6ed7b" Dec 02 09:34:20 crc kubenswrapper[4895]: I1202 09:34:20.326121 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"badb0ee2c0de8449c616aea0aeefbf83611ca970e23d684766d94f658ea6ed7b"} err="failed to get container status \"badb0ee2c0de8449c616aea0aeefbf83611ca970e23d684766d94f658ea6ed7b\": rpc error: code = NotFound desc = could not find container \"badb0ee2c0de8449c616aea0aeefbf83611ca970e23d684766d94f658ea6ed7b\": container with ID starting with badb0ee2c0de8449c616aea0aeefbf83611ca970e23d684766d94f658ea6ed7b not found: ID does not exist" Dec 02 09:34:21 crc kubenswrapper[4895]: I1202 09:34:21.154191 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18211ca2-5d6e-4afc-9b4e-888c21ee355b" path="/var/lib/kubelet/pods/18211ca2-5d6e-4afc-9b4e-888c21ee355b/volumes" Dec 02 09:34:24 crc kubenswrapper[4895]: I1202 09:34:24.140972 4895 scope.go:117] "RemoveContainer" containerID="77223bf7853202dbb8024d476b05d3dd47cc2c7476c34d4d3d8c2d6a90d37b0b" Dec 02 09:34:24 crc kubenswrapper[4895]: E1202 09:34:24.141643 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:34:37 crc kubenswrapper[4895]: I1202 09:34:37.145648 4895 scope.go:117] "RemoveContainer" containerID="77223bf7853202dbb8024d476b05d3dd47cc2c7476c34d4d3d8c2d6a90d37b0b" Dec 02 09:34:37 crc kubenswrapper[4895]: E1202 09:34:37.146880 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:34:49 crc kubenswrapper[4895]: I1202 09:34:49.149187 4895 scope.go:117] "RemoveContainer" containerID="77223bf7853202dbb8024d476b05d3dd47cc2c7476c34d4d3d8c2d6a90d37b0b" Dec 02 09:34:49 crc kubenswrapper[4895]: E1202 09:34:49.150129 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:35:01 crc kubenswrapper[4895]: I1202 09:35:01.141509 4895 scope.go:117] "RemoveContainer" containerID="77223bf7853202dbb8024d476b05d3dd47cc2c7476c34d4d3d8c2d6a90d37b0b" Dec 02 09:35:01 crc kubenswrapper[4895]: E1202 09:35:01.142690 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:35:15 crc kubenswrapper[4895]: I1202 09:35:15.141800 4895 scope.go:117] "RemoveContainer" containerID="77223bf7853202dbb8024d476b05d3dd47cc2c7476c34d4d3d8c2d6a90d37b0b" Dec 02 09:35:15 crc kubenswrapper[4895]: E1202 09:35:15.142692 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:35:26 crc kubenswrapper[4895]: I1202 09:35:26.141810 4895 scope.go:117] "RemoveContainer" containerID="77223bf7853202dbb8024d476b05d3dd47cc2c7476c34d4d3d8c2d6a90d37b0b" Dec 02 09:35:26 crc kubenswrapper[4895]: E1202 09:35:26.142677 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:35:40 crc kubenswrapper[4895]: I1202 09:35:40.142188 4895 scope.go:117] "RemoveContainer" containerID="77223bf7853202dbb8024d476b05d3dd47cc2c7476c34d4d3d8c2d6a90d37b0b" Dec 02 09:35:40 crc kubenswrapper[4895]: E1202 09:35:40.143111 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:35:51 crc kubenswrapper[4895]: I1202 09:35:51.141380 4895 scope.go:117] "RemoveContainer" containerID="77223bf7853202dbb8024d476b05d3dd47cc2c7476c34d4d3d8c2d6a90d37b0b" Dec 02 09:35:51 crc kubenswrapper[4895]: E1202 09:35:51.142252 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:36:04 crc kubenswrapper[4895]: I1202 09:36:04.140895 4895 scope.go:117] "RemoveContainer" containerID="77223bf7853202dbb8024d476b05d3dd47cc2c7476c34d4d3d8c2d6a90d37b0b" Dec 02 09:36:04 crc kubenswrapper[4895]: E1202 09:36:04.141663 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:36:17 crc kubenswrapper[4895]: I1202 09:36:17.141542 4895 scope.go:117] "RemoveContainer" containerID="77223bf7853202dbb8024d476b05d3dd47cc2c7476c34d4d3d8c2d6a90d37b0b" Dec 02 09:36:17 crc kubenswrapper[4895]: E1202 09:36:17.142856 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:36:29 crc kubenswrapper[4895]: I1202 09:36:29.148278 4895 scope.go:117] "RemoveContainer" containerID="77223bf7853202dbb8024d476b05d3dd47cc2c7476c34d4d3d8c2d6a90d37b0b" Dec 02 09:36:29 crc kubenswrapper[4895]: E1202 09:36:29.149146 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:36:44 crc kubenswrapper[4895]: I1202 09:36:44.141006 4895 scope.go:117] "RemoveContainer" containerID="77223bf7853202dbb8024d476b05d3dd47cc2c7476c34d4d3d8c2d6a90d37b0b" Dec 02 09:36:44 crc kubenswrapper[4895]: E1202 09:36:44.141964 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:36:55 crc kubenswrapper[4895]: I1202 09:36:55.141272 4895 scope.go:117] "RemoveContainer" containerID="77223bf7853202dbb8024d476b05d3dd47cc2c7476c34d4d3d8c2d6a90d37b0b" Dec 02 09:36:55 crc kubenswrapper[4895]: E1202 09:36:55.142107 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:37:06 crc kubenswrapper[4895]: I1202 09:37:06.140946 4895 scope.go:117] "RemoveContainer" containerID="77223bf7853202dbb8024d476b05d3dd47cc2c7476c34d4d3d8c2d6a90d37b0b" Dec 02 09:37:06 crc kubenswrapper[4895]: E1202 09:37:06.143218 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:37:17 crc kubenswrapper[4895]: I1202 09:37:17.141784 4895 scope.go:117] "RemoveContainer" containerID="77223bf7853202dbb8024d476b05d3dd47cc2c7476c34d4d3d8c2d6a90d37b0b" Dec 02 09:37:17 crc kubenswrapper[4895]: E1202 09:37:17.142646 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:37:28 crc kubenswrapper[4895]: I1202 09:37:28.142762 4895 scope.go:117] "RemoveContainer" containerID="77223bf7853202dbb8024d476b05d3dd47cc2c7476c34d4d3d8c2d6a90d37b0b" Dec 02 09:37:28 crc kubenswrapper[4895]: E1202 09:37:28.143458 4895 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:37:41 crc kubenswrapper[4895]: I1202 09:37:41.141327 4895 scope.go:117] "RemoveContainer" containerID="77223bf7853202dbb8024d476b05d3dd47cc2c7476c34d4d3d8c2d6a90d37b0b" Dec 02 09:37:41 crc kubenswrapper[4895]: I1202 09:37:41.543341 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" event={"ID":"0468d2d1-a975-45a6-af9f-47adc432fab0","Type":"ContainerStarted","Data":"17b4ce278cfbacb2db258cc73695bc893172418f5a9a87ae12f5dacd0ff48422"} Dec 02 09:38:23 crc kubenswrapper[4895]: I1202 09:38:23.528958 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-9v7xl"] Dec 02 09:38:23 crc kubenswrapper[4895]: E1202 09:38:23.530071 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18211ca2-5d6e-4afc-9b4e-888c21ee355b" containerName="registry-server" Dec 02 09:38:23 crc kubenswrapper[4895]: I1202 09:38:23.530089 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="18211ca2-5d6e-4afc-9b4e-888c21ee355b" containerName="registry-server" Dec 02 09:38:23 crc kubenswrapper[4895]: E1202 09:38:23.530108 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18211ca2-5d6e-4afc-9b4e-888c21ee355b" containerName="extract-utilities" Dec 02 09:38:23 crc kubenswrapper[4895]: I1202 09:38:23.530115 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="18211ca2-5d6e-4afc-9b4e-888c21ee355b" containerName="extract-utilities" Dec 02 09:38:23 crc kubenswrapper[4895]: E1202 09:38:23.530124 4895 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="18211ca2-5d6e-4afc-9b4e-888c21ee355b" containerName="extract-content" Dec 02 09:38:23 crc kubenswrapper[4895]: I1202 09:38:23.530131 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="18211ca2-5d6e-4afc-9b4e-888c21ee355b" containerName="extract-content" Dec 02 09:38:23 crc kubenswrapper[4895]: I1202 09:38:23.530349 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="18211ca2-5d6e-4afc-9b4e-888c21ee355b" containerName="registry-server" Dec 02 09:38:23 crc kubenswrapper[4895]: I1202 09:38:23.534827 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9v7xl" Dec 02 09:38:23 crc kubenswrapper[4895]: I1202 09:38:23.567025 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9v7xl"] Dec 02 09:38:23 crc kubenswrapper[4895]: I1202 09:38:23.688065 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gsc67\" (UniqueName: \"kubernetes.io/projected/82b05dd2-53cb-402b-be15-a4e86ac6123c-kube-api-access-gsc67\") pod \"certified-operators-9v7xl\" (UID: \"82b05dd2-53cb-402b-be15-a4e86ac6123c\") " pod="openshift-marketplace/certified-operators-9v7xl" Dec 02 09:38:23 crc kubenswrapper[4895]: I1202 09:38:23.688183 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/82b05dd2-53cb-402b-be15-a4e86ac6123c-utilities\") pod \"certified-operators-9v7xl\" (UID: \"82b05dd2-53cb-402b-be15-a4e86ac6123c\") " pod="openshift-marketplace/certified-operators-9v7xl" Dec 02 09:38:23 crc kubenswrapper[4895]: I1202 09:38:23.688230 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/82b05dd2-53cb-402b-be15-a4e86ac6123c-catalog-content\") pod \"certified-operators-9v7xl\" (UID: 
\"82b05dd2-53cb-402b-be15-a4e86ac6123c\") " pod="openshift-marketplace/certified-operators-9v7xl" Dec 02 09:38:23 crc kubenswrapper[4895]: I1202 09:38:23.789928 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gsc67\" (UniqueName: \"kubernetes.io/projected/82b05dd2-53cb-402b-be15-a4e86ac6123c-kube-api-access-gsc67\") pod \"certified-operators-9v7xl\" (UID: \"82b05dd2-53cb-402b-be15-a4e86ac6123c\") " pod="openshift-marketplace/certified-operators-9v7xl" Dec 02 09:38:23 crc kubenswrapper[4895]: I1202 09:38:23.790375 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/82b05dd2-53cb-402b-be15-a4e86ac6123c-utilities\") pod \"certified-operators-9v7xl\" (UID: \"82b05dd2-53cb-402b-be15-a4e86ac6123c\") " pod="openshift-marketplace/certified-operators-9v7xl" Dec 02 09:38:23 crc kubenswrapper[4895]: I1202 09:38:23.790411 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/82b05dd2-53cb-402b-be15-a4e86ac6123c-catalog-content\") pod \"certified-operators-9v7xl\" (UID: \"82b05dd2-53cb-402b-be15-a4e86ac6123c\") " pod="openshift-marketplace/certified-operators-9v7xl" Dec 02 09:38:23 crc kubenswrapper[4895]: I1202 09:38:23.791173 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/82b05dd2-53cb-402b-be15-a4e86ac6123c-utilities\") pod \"certified-operators-9v7xl\" (UID: \"82b05dd2-53cb-402b-be15-a4e86ac6123c\") " pod="openshift-marketplace/certified-operators-9v7xl" Dec 02 09:38:23 crc kubenswrapper[4895]: I1202 09:38:23.791671 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/82b05dd2-53cb-402b-be15-a4e86ac6123c-catalog-content\") pod \"certified-operators-9v7xl\" (UID: \"82b05dd2-53cb-402b-be15-a4e86ac6123c\") 
" pod="openshift-marketplace/certified-operators-9v7xl" Dec 02 09:38:23 crc kubenswrapper[4895]: I1202 09:38:23.815413 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gsc67\" (UniqueName: \"kubernetes.io/projected/82b05dd2-53cb-402b-be15-a4e86ac6123c-kube-api-access-gsc67\") pod \"certified-operators-9v7xl\" (UID: \"82b05dd2-53cb-402b-be15-a4e86ac6123c\") " pod="openshift-marketplace/certified-operators-9v7xl" Dec 02 09:38:23 crc kubenswrapper[4895]: I1202 09:38:23.877732 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9v7xl" Dec 02 09:38:24 crc kubenswrapper[4895]: I1202 09:38:24.505342 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9v7xl"] Dec 02 09:38:24 crc kubenswrapper[4895]: I1202 09:38:24.984647 4895 generic.go:334] "Generic (PLEG): container finished" podID="82b05dd2-53cb-402b-be15-a4e86ac6123c" containerID="f8565cb604962e2c2e3fbfe1890dac1021323ea44c810c117d1b2ec6cb0c0d19" exitCode=0 Dec 02 09:38:24 crc kubenswrapper[4895]: I1202 09:38:24.984716 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9v7xl" event={"ID":"82b05dd2-53cb-402b-be15-a4e86ac6123c","Type":"ContainerDied","Data":"f8565cb604962e2c2e3fbfe1890dac1021323ea44c810c117d1b2ec6cb0c0d19"} Dec 02 09:38:24 crc kubenswrapper[4895]: I1202 09:38:24.985026 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9v7xl" event={"ID":"82b05dd2-53cb-402b-be15-a4e86ac6123c","Type":"ContainerStarted","Data":"ec20100e56f5c8c66b14a26da5e9c467f2634c2371def6a19922fe9b37553097"} Dec 02 09:38:26 crc kubenswrapper[4895]: I1202 09:38:26.004659 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9v7xl" 
event={"ID":"82b05dd2-53cb-402b-be15-a4e86ac6123c","Type":"ContainerStarted","Data":"fd4edddd14555232a7c0bf5da73dc23e62649579c3d911b9ded8dbed8e9fa145"} Dec 02 09:38:27 crc kubenswrapper[4895]: I1202 09:38:27.015333 4895 generic.go:334] "Generic (PLEG): container finished" podID="82b05dd2-53cb-402b-be15-a4e86ac6123c" containerID="fd4edddd14555232a7c0bf5da73dc23e62649579c3d911b9ded8dbed8e9fa145" exitCode=0 Dec 02 09:38:27 crc kubenswrapper[4895]: I1202 09:38:27.015881 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9v7xl" event={"ID":"82b05dd2-53cb-402b-be15-a4e86ac6123c","Type":"ContainerDied","Data":"fd4edddd14555232a7c0bf5da73dc23e62649579c3d911b9ded8dbed8e9fa145"} Dec 02 09:38:28 crc kubenswrapper[4895]: I1202 09:38:28.029160 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9v7xl" event={"ID":"82b05dd2-53cb-402b-be15-a4e86ac6123c","Type":"ContainerStarted","Data":"1b740ddee912329a27e34e9055531a57abf0e189b5cfa0b6d5c9d2462d937558"} Dec 02 09:38:28 crc kubenswrapper[4895]: I1202 09:38:28.051479 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-9v7xl" podStartSLOduration=2.3225934710000002 podStartE2EDuration="5.051460744s" podCreationTimestamp="2025-12-02 09:38:23 +0000 UTC" firstStartedPulling="2025-12-02 09:38:24.986945651 +0000 UTC m=+8116.157805264" lastFinishedPulling="2025-12-02 09:38:27.715812924 +0000 UTC m=+8118.886672537" observedRunningTime="2025-12-02 09:38:28.048274625 +0000 UTC m=+8119.219134268" watchObservedRunningTime="2025-12-02 09:38:28.051460744 +0000 UTC m=+8119.222320347" Dec 02 09:38:33 crc kubenswrapper[4895]: I1202 09:38:33.878589 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-9v7xl" Dec 02 09:38:33 crc kubenswrapper[4895]: I1202 09:38:33.879221 4895 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/certified-operators-9v7xl" Dec 02 09:38:33 crc kubenswrapper[4895]: I1202 09:38:33.928671 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-9v7xl" Dec 02 09:38:34 crc kubenswrapper[4895]: I1202 09:38:34.137589 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-9v7xl" Dec 02 09:38:34 crc kubenswrapper[4895]: I1202 09:38:34.186979 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9v7xl"] Dec 02 09:38:36 crc kubenswrapper[4895]: I1202 09:38:36.108022 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-9v7xl" podUID="82b05dd2-53cb-402b-be15-a4e86ac6123c" containerName="registry-server" containerID="cri-o://1b740ddee912329a27e34e9055531a57abf0e189b5cfa0b6d5c9d2462d937558" gracePeriod=2 Dec 02 09:38:36 crc kubenswrapper[4895]: I1202 09:38:36.693854 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9v7xl" Dec 02 09:38:36 crc kubenswrapper[4895]: I1202 09:38:36.801853 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gsc67\" (UniqueName: \"kubernetes.io/projected/82b05dd2-53cb-402b-be15-a4e86ac6123c-kube-api-access-gsc67\") pod \"82b05dd2-53cb-402b-be15-a4e86ac6123c\" (UID: \"82b05dd2-53cb-402b-be15-a4e86ac6123c\") " Dec 02 09:38:36 crc kubenswrapper[4895]: I1202 09:38:36.802302 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/82b05dd2-53cb-402b-be15-a4e86ac6123c-catalog-content\") pod \"82b05dd2-53cb-402b-be15-a4e86ac6123c\" (UID: \"82b05dd2-53cb-402b-be15-a4e86ac6123c\") " Dec 02 09:38:36 crc kubenswrapper[4895]: I1202 09:38:36.802530 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/82b05dd2-53cb-402b-be15-a4e86ac6123c-utilities\") pod \"82b05dd2-53cb-402b-be15-a4e86ac6123c\" (UID: \"82b05dd2-53cb-402b-be15-a4e86ac6123c\") " Dec 02 09:38:36 crc kubenswrapper[4895]: I1202 09:38:36.804784 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/82b05dd2-53cb-402b-be15-a4e86ac6123c-utilities" (OuterVolumeSpecName: "utilities") pod "82b05dd2-53cb-402b-be15-a4e86ac6123c" (UID: "82b05dd2-53cb-402b-be15-a4e86ac6123c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:38:36 crc kubenswrapper[4895]: I1202 09:38:36.810062 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82b05dd2-53cb-402b-be15-a4e86ac6123c-kube-api-access-gsc67" (OuterVolumeSpecName: "kube-api-access-gsc67") pod "82b05dd2-53cb-402b-be15-a4e86ac6123c" (UID: "82b05dd2-53cb-402b-be15-a4e86ac6123c"). InnerVolumeSpecName "kube-api-access-gsc67". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:38:36 crc kubenswrapper[4895]: I1202 09:38:36.854844 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/82b05dd2-53cb-402b-be15-a4e86ac6123c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "82b05dd2-53cb-402b-be15-a4e86ac6123c" (UID: "82b05dd2-53cb-402b-be15-a4e86ac6123c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:38:36 crc kubenswrapper[4895]: I1202 09:38:36.905421 4895 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/82b05dd2-53cb-402b-be15-a4e86ac6123c-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 09:38:36 crc kubenswrapper[4895]: I1202 09:38:36.905452 4895 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/82b05dd2-53cb-402b-be15-a4e86ac6123c-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 09:38:36 crc kubenswrapper[4895]: I1202 09:38:36.905462 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gsc67\" (UniqueName: \"kubernetes.io/projected/82b05dd2-53cb-402b-be15-a4e86ac6123c-kube-api-access-gsc67\") on node \"crc\" DevicePath \"\"" Dec 02 09:38:37 crc kubenswrapper[4895]: I1202 09:38:37.119318 4895 generic.go:334] "Generic (PLEG): container finished" podID="82b05dd2-53cb-402b-be15-a4e86ac6123c" containerID="1b740ddee912329a27e34e9055531a57abf0e189b5cfa0b6d5c9d2462d937558" exitCode=0 Dec 02 09:38:37 crc kubenswrapper[4895]: I1202 09:38:37.119360 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9v7xl" event={"ID":"82b05dd2-53cb-402b-be15-a4e86ac6123c","Type":"ContainerDied","Data":"1b740ddee912329a27e34e9055531a57abf0e189b5cfa0b6d5c9d2462d937558"} Dec 02 09:38:37 crc kubenswrapper[4895]: I1202 09:38:37.119394 4895 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9v7xl" Dec 02 09:38:37 crc kubenswrapper[4895]: I1202 09:38:37.119419 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9v7xl" event={"ID":"82b05dd2-53cb-402b-be15-a4e86ac6123c","Type":"ContainerDied","Data":"ec20100e56f5c8c66b14a26da5e9c467f2634c2371def6a19922fe9b37553097"} Dec 02 09:38:37 crc kubenswrapper[4895]: I1202 09:38:37.119439 4895 scope.go:117] "RemoveContainer" containerID="1b740ddee912329a27e34e9055531a57abf0e189b5cfa0b6d5c9d2462d937558" Dec 02 09:38:37 crc kubenswrapper[4895]: I1202 09:38:37.149997 4895 scope.go:117] "RemoveContainer" containerID="fd4edddd14555232a7c0bf5da73dc23e62649579c3d911b9ded8dbed8e9fa145" Dec 02 09:38:37 crc kubenswrapper[4895]: I1202 09:38:37.158250 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9v7xl"] Dec 02 09:38:37 crc kubenswrapper[4895]: I1202 09:38:37.167996 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-9v7xl"] Dec 02 09:38:37 crc kubenswrapper[4895]: I1202 09:38:37.192058 4895 scope.go:117] "RemoveContainer" containerID="f8565cb604962e2c2e3fbfe1890dac1021323ea44c810c117d1b2ec6cb0c0d19" Dec 02 09:38:37 crc kubenswrapper[4895]: I1202 09:38:37.222718 4895 scope.go:117] "RemoveContainer" containerID="1b740ddee912329a27e34e9055531a57abf0e189b5cfa0b6d5c9d2462d937558" Dec 02 09:38:37 crc kubenswrapper[4895]: E1202 09:38:37.223167 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b740ddee912329a27e34e9055531a57abf0e189b5cfa0b6d5c9d2462d937558\": container with ID starting with 1b740ddee912329a27e34e9055531a57abf0e189b5cfa0b6d5c9d2462d937558 not found: ID does not exist" containerID="1b740ddee912329a27e34e9055531a57abf0e189b5cfa0b6d5c9d2462d937558" Dec 02 09:38:37 crc kubenswrapper[4895]: I1202 09:38:37.223205 
4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b740ddee912329a27e34e9055531a57abf0e189b5cfa0b6d5c9d2462d937558"} err="failed to get container status \"1b740ddee912329a27e34e9055531a57abf0e189b5cfa0b6d5c9d2462d937558\": rpc error: code = NotFound desc = could not find container \"1b740ddee912329a27e34e9055531a57abf0e189b5cfa0b6d5c9d2462d937558\": container with ID starting with 1b740ddee912329a27e34e9055531a57abf0e189b5cfa0b6d5c9d2462d937558 not found: ID does not exist" Dec 02 09:38:37 crc kubenswrapper[4895]: I1202 09:38:37.223234 4895 scope.go:117] "RemoveContainer" containerID="fd4edddd14555232a7c0bf5da73dc23e62649579c3d911b9ded8dbed8e9fa145" Dec 02 09:38:37 crc kubenswrapper[4895]: E1202 09:38:37.223674 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd4edddd14555232a7c0bf5da73dc23e62649579c3d911b9ded8dbed8e9fa145\": container with ID starting with fd4edddd14555232a7c0bf5da73dc23e62649579c3d911b9ded8dbed8e9fa145 not found: ID does not exist" containerID="fd4edddd14555232a7c0bf5da73dc23e62649579c3d911b9ded8dbed8e9fa145" Dec 02 09:38:37 crc kubenswrapper[4895]: I1202 09:38:37.223697 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd4edddd14555232a7c0bf5da73dc23e62649579c3d911b9ded8dbed8e9fa145"} err="failed to get container status \"fd4edddd14555232a7c0bf5da73dc23e62649579c3d911b9ded8dbed8e9fa145\": rpc error: code = NotFound desc = could not find container \"fd4edddd14555232a7c0bf5da73dc23e62649579c3d911b9ded8dbed8e9fa145\": container with ID starting with fd4edddd14555232a7c0bf5da73dc23e62649579c3d911b9ded8dbed8e9fa145 not found: ID does not exist" Dec 02 09:38:37 crc kubenswrapper[4895]: I1202 09:38:37.223712 4895 scope.go:117] "RemoveContainer" containerID="f8565cb604962e2c2e3fbfe1890dac1021323ea44c810c117d1b2ec6cb0c0d19" Dec 02 09:38:37 crc kubenswrapper[4895]: E1202 
09:38:37.223986 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8565cb604962e2c2e3fbfe1890dac1021323ea44c810c117d1b2ec6cb0c0d19\": container with ID starting with f8565cb604962e2c2e3fbfe1890dac1021323ea44c810c117d1b2ec6cb0c0d19 not found: ID does not exist" containerID="f8565cb604962e2c2e3fbfe1890dac1021323ea44c810c117d1b2ec6cb0c0d19" Dec 02 09:38:37 crc kubenswrapper[4895]: I1202 09:38:37.224024 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8565cb604962e2c2e3fbfe1890dac1021323ea44c810c117d1b2ec6cb0c0d19"} err="failed to get container status \"f8565cb604962e2c2e3fbfe1890dac1021323ea44c810c117d1b2ec6cb0c0d19\": rpc error: code = NotFound desc = could not find container \"f8565cb604962e2c2e3fbfe1890dac1021323ea44c810c117d1b2ec6cb0c0d19\": container with ID starting with f8565cb604962e2c2e3fbfe1890dac1021323ea44c810c117d1b2ec6cb0c0d19 not found: ID does not exist" Dec 02 09:38:39 crc kubenswrapper[4895]: I1202 09:38:39.154169 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82b05dd2-53cb-402b-be15-a4e86ac6123c" path="/var/lib/kubelet/pods/82b05dd2-53cb-402b-be15-a4e86ac6123c/volumes" Dec 02 09:38:54 crc kubenswrapper[4895]: I1202 09:38:54.331558 4895 generic.go:334] "Generic (PLEG): container finished" podID="6b84d8a6-8098-46cd-83cf-860f21f040a0" containerID="88284e79b559622727a9bc22776c65387e1940f50ed4505397bd8848aad92165" exitCode=0 Dec 02 09:38:54 crc kubenswrapper[4895]: I1202 09:38:54.331648 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-kd4nz" event={"ID":"6b84d8a6-8098-46cd-83cf-860f21f040a0","Type":"ContainerDied","Data":"88284e79b559622727a9bc22776c65387e1940f50ed4505397bd8848aad92165"} Dec 02 09:38:55 crc kubenswrapper[4895]: I1202 09:38:55.853451 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-kd4nz" Dec 02 09:38:56 crc kubenswrapper[4895]: I1202 09:38:56.007582 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6b84d8a6-8098-46cd-83cf-860f21f040a0-inventory\") pod \"6b84d8a6-8098-46cd-83cf-860f21f040a0\" (UID: \"6b84d8a6-8098-46cd-83cf-860f21f040a0\") " Dec 02 09:38:56 crc kubenswrapper[4895]: I1202 09:38:56.007715 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dhz9b\" (UniqueName: \"kubernetes.io/projected/6b84d8a6-8098-46cd-83cf-860f21f040a0-kube-api-access-dhz9b\") pod \"6b84d8a6-8098-46cd-83cf-860f21f040a0\" (UID: \"6b84d8a6-8098-46cd-83cf-860f21f040a0\") " Dec 02 09:38:56 crc kubenswrapper[4895]: I1202 09:38:56.007741 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6b84d8a6-8098-46cd-83cf-860f21f040a0-ssh-key\") pod \"6b84d8a6-8098-46cd-83cf-860f21f040a0\" (UID: \"6b84d8a6-8098-46cd-83cf-860f21f040a0\") " Dec 02 09:38:56 crc kubenswrapper[4895]: I1202 09:38:56.007783 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/6b84d8a6-8098-46cd-83cf-860f21f040a0-libvirt-secret-0\") pod \"6b84d8a6-8098-46cd-83cf-860f21f040a0\" (UID: \"6b84d8a6-8098-46cd-83cf-860f21f040a0\") " Dec 02 09:38:56 crc kubenswrapper[4895]: I1202 09:38:56.007854 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6b84d8a6-8098-46cd-83cf-860f21f040a0-ceph\") pod \"6b84d8a6-8098-46cd-83cf-860f21f040a0\" (UID: \"6b84d8a6-8098-46cd-83cf-860f21f040a0\") " Dec 02 09:38:56 crc kubenswrapper[4895]: I1202 09:38:56.007876 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/6b84d8a6-8098-46cd-83cf-860f21f040a0-libvirt-combined-ca-bundle\") pod \"6b84d8a6-8098-46cd-83cf-860f21f040a0\" (UID: \"6b84d8a6-8098-46cd-83cf-860f21f040a0\") " Dec 02 09:38:56 crc kubenswrapper[4895]: I1202 09:38:56.013484 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b84d8a6-8098-46cd-83cf-860f21f040a0-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "6b84d8a6-8098-46cd-83cf-860f21f040a0" (UID: "6b84d8a6-8098-46cd-83cf-860f21f040a0"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:38:56 crc kubenswrapper[4895]: I1202 09:38:56.013632 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b84d8a6-8098-46cd-83cf-860f21f040a0-kube-api-access-dhz9b" (OuterVolumeSpecName: "kube-api-access-dhz9b") pod "6b84d8a6-8098-46cd-83cf-860f21f040a0" (UID: "6b84d8a6-8098-46cd-83cf-860f21f040a0"). InnerVolumeSpecName "kube-api-access-dhz9b". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:38:56 crc kubenswrapper[4895]: I1202 09:38:56.014693 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b84d8a6-8098-46cd-83cf-860f21f040a0-ceph" (OuterVolumeSpecName: "ceph") pod "6b84d8a6-8098-46cd-83cf-860f21f040a0" (UID: "6b84d8a6-8098-46cd-83cf-860f21f040a0"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:38:56 crc kubenswrapper[4895]: I1202 09:38:56.041757 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b84d8a6-8098-46cd-83cf-860f21f040a0-inventory" (OuterVolumeSpecName: "inventory") pod "6b84d8a6-8098-46cd-83cf-860f21f040a0" (UID: "6b84d8a6-8098-46cd-83cf-860f21f040a0"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:38:56 crc kubenswrapper[4895]: I1202 09:38:56.041780 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b84d8a6-8098-46cd-83cf-860f21f040a0-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "6b84d8a6-8098-46cd-83cf-860f21f040a0" (UID: "6b84d8a6-8098-46cd-83cf-860f21f040a0"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:38:56 crc kubenswrapper[4895]: I1202 09:38:56.045506 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b84d8a6-8098-46cd-83cf-860f21f040a0-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "6b84d8a6-8098-46cd-83cf-860f21f040a0" (UID: "6b84d8a6-8098-46cd-83cf-860f21f040a0"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:38:56 crc kubenswrapper[4895]: I1202 09:38:56.110896 4895 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6b84d8a6-8098-46cd-83cf-860f21f040a0-ceph\") on node \"crc\" DevicePath \"\"" Dec 02 09:38:56 crc kubenswrapper[4895]: I1202 09:38:56.110970 4895 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b84d8a6-8098-46cd-83cf-860f21f040a0-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 09:38:56 crc kubenswrapper[4895]: I1202 09:38:56.110987 4895 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6b84d8a6-8098-46cd-83cf-860f21f040a0-inventory\") on node \"crc\" DevicePath \"\"" Dec 02 09:38:56 crc kubenswrapper[4895]: I1202 09:38:56.110998 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dhz9b\" (UniqueName: \"kubernetes.io/projected/6b84d8a6-8098-46cd-83cf-860f21f040a0-kube-api-access-dhz9b\") on node \"crc\" DevicePath \"\"" Dec 02 09:38:56 
crc kubenswrapper[4895]: I1202 09:38:56.111011 4895 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6b84d8a6-8098-46cd-83cf-860f21f040a0-ssh-key\") on node \"crc\" DevicePath \"\""
Dec 02 09:38:56 crc kubenswrapper[4895]: I1202 09:38:56.111022 4895 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/6b84d8a6-8098-46cd-83cf-860f21f040a0-libvirt-secret-0\") on node \"crc\" DevicePath \"\""
Dec 02 09:38:56 crc kubenswrapper[4895]: I1202 09:38:56.352557 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-kd4nz" event={"ID":"6b84d8a6-8098-46cd-83cf-860f21f040a0","Type":"ContainerDied","Data":"bab7ee965ef9c7610109d8a5c4888ae8c717f32ee48bc7f042f4f1e252fab12a"}
Dec 02 09:38:56 crc kubenswrapper[4895]: I1202 09:38:56.352612 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bab7ee965ef9c7610109d8a5c4888ae8c717f32ee48bc7f042f4f1e252fab12a"
Dec 02 09:38:56 crc kubenswrapper[4895]: I1202 09:38:56.352648 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-kd4nz"
Dec 02 09:38:56 crc kubenswrapper[4895]: I1202 09:38:56.465922 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-openstack-openstack-cell1-bmnqb"]
Dec 02 09:38:56 crc kubenswrapper[4895]: E1202 09:38:56.467468 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82b05dd2-53cb-402b-be15-a4e86ac6123c" containerName="extract-content"
Dec 02 09:38:56 crc kubenswrapper[4895]: I1202 09:38:56.467584 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="82b05dd2-53cb-402b-be15-a4e86ac6123c" containerName="extract-content"
Dec 02 09:38:56 crc kubenswrapper[4895]: E1202 09:38:56.467666 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82b05dd2-53cb-402b-be15-a4e86ac6123c" containerName="registry-server"
Dec 02 09:38:56 crc kubenswrapper[4895]: I1202 09:38:56.467730 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="82b05dd2-53cb-402b-be15-a4e86ac6123c" containerName="registry-server"
Dec 02 09:38:56 crc kubenswrapper[4895]: E1202 09:38:56.467841 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b84d8a6-8098-46cd-83cf-860f21f040a0" containerName="libvirt-openstack-openstack-cell1"
Dec 02 09:38:56 crc kubenswrapper[4895]: I1202 09:38:56.467906 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b84d8a6-8098-46cd-83cf-860f21f040a0" containerName="libvirt-openstack-openstack-cell1"
Dec 02 09:38:56 crc kubenswrapper[4895]: E1202 09:38:56.467972 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82b05dd2-53cb-402b-be15-a4e86ac6123c" containerName="extract-utilities"
Dec 02 09:38:56 crc kubenswrapper[4895]: I1202 09:38:56.468036 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="82b05dd2-53cb-402b-be15-a4e86ac6123c" containerName="extract-utilities"
Dec 02 09:38:56 crc kubenswrapper[4895]: I1202 09:38:56.468477 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b84d8a6-8098-46cd-83cf-860f21f040a0" containerName="libvirt-openstack-openstack-cell1"
Dec 02 09:38:56 crc kubenswrapper[4895]: I1202 09:38:56.468574 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="82b05dd2-53cb-402b-be15-a4e86ac6123c" containerName="registry-server"
Dec 02 09:38:56 crc kubenswrapper[4895]: I1202 09:38:56.471446 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-bmnqb"
Dec 02 09:38:56 crc kubenswrapper[4895]: I1202 09:38:56.474493 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1"
Dec 02 09:38:56 crc kubenswrapper[4895]: I1202 09:38:56.474792 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config"
Dec 02 09:38:56 crc kubenswrapper[4895]: I1202 09:38:56.474967 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-cells-global-config"
Dec 02 09:38:56 crc kubenswrapper[4895]: I1202 09:38:56.475533 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key"
Dec 02 09:38:56 crc kubenswrapper[4895]: I1202 09:38:56.476285 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Dec 02 09:38:56 crc kubenswrapper[4895]: I1202 09:38:56.476319 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret"
Dec 02 09:38:56 crc kubenswrapper[4895]: I1202 09:38:56.476315 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-brvc6"
Dec 02 09:38:56 crc kubenswrapper[4895]: I1202 09:38:56.494572 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-openstack-cell1-bmnqb"]
Dec 02 09:38:56 crc kubenswrapper[4895]: I1202 09:38:56.620261 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/0f16ba88-ef94-4543-aef6-85263b26ff4c-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-openstack-cell1-bmnqb\" (UID: \"0f16ba88-ef94-4543-aef6-85263b26ff4c\") " pod="openstack/nova-cell1-openstack-openstack-cell1-bmnqb"
Dec 02 09:38:56 crc kubenswrapper[4895]: I1202 09:38:56.620332 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/0f16ba88-ef94-4543-aef6-85263b26ff4c-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-openstack-cell1-bmnqb\" (UID: \"0f16ba88-ef94-4543-aef6-85263b26ff4c\") " pod="openstack/nova-cell1-openstack-openstack-cell1-bmnqb"
Dec 02 09:38:56 crc kubenswrapper[4895]: I1202 09:38:56.620589 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0f16ba88-ef94-4543-aef6-85263b26ff4c-ceph\") pod \"nova-cell1-openstack-openstack-cell1-bmnqb\" (UID: \"0f16ba88-ef94-4543-aef6-85263b26ff4c\") " pod="openstack/nova-cell1-openstack-openstack-cell1-bmnqb"
Dec 02 09:38:56 crc kubenswrapper[4895]: I1202 09:38:56.620877 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/0f16ba88-ef94-4543-aef6-85263b26ff4c-nova-cells-global-config-1\") pod \"nova-cell1-openstack-openstack-cell1-bmnqb\" (UID: \"0f16ba88-ef94-4543-aef6-85263b26ff4c\") " pod="openstack/nova-cell1-openstack-openstack-cell1-bmnqb"
Dec 02 09:38:56 crc kubenswrapper[4895]: I1202 09:38:56.621009 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0f16ba88-ef94-4543-aef6-85263b26ff4c-ssh-key\") pod \"nova-cell1-openstack-openstack-cell1-bmnqb\" (UID: \"0f16ba88-ef94-4543-aef6-85263b26ff4c\") " pod="openstack/nova-cell1-openstack-openstack-cell1-bmnqb"
Dec 02 09:38:56 crc kubenswrapper[4895]: I1202 09:38:56.621105 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0f16ba88-ef94-4543-aef6-85263b26ff4c-inventory\") pod \"nova-cell1-openstack-openstack-cell1-bmnqb\" (UID: \"0f16ba88-ef94-4543-aef6-85263b26ff4c\") " pod="openstack/nova-cell1-openstack-openstack-cell1-bmnqb"
Dec 02 09:38:56 crc kubenswrapper[4895]: I1202 09:38:56.621153 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/0f16ba88-ef94-4543-aef6-85263b26ff4c-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-openstack-cell1-bmnqb\" (UID: \"0f16ba88-ef94-4543-aef6-85263b26ff4c\") " pod="openstack/nova-cell1-openstack-openstack-cell1-bmnqb"
Dec 02 09:38:56 crc kubenswrapper[4895]: I1202 09:38:56.621190 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vrs9\" (UniqueName: \"kubernetes.io/projected/0f16ba88-ef94-4543-aef6-85263b26ff4c-kube-api-access-2vrs9\") pod \"nova-cell1-openstack-openstack-cell1-bmnqb\" (UID: \"0f16ba88-ef94-4543-aef6-85263b26ff4c\") " pod="openstack/nova-cell1-openstack-openstack-cell1-bmnqb"
Dec 02 09:38:56 crc kubenswrapper[4895]: I1202 09:38:56.621213 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/0f16ba88-ef94-4543-aef6-85263b26ff4c-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-openstack-cell1-bmnqb\" (UID: \"0f16ba88-ef94-4543-aef6-85263b26ff4c\") " pod="openstack/nova-cell1-openstack-openstack-cell1-bmnqb"
Dec 02 09:38:56 crc kubenswrapper[4895]: I1202 09:38:56.621276 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/0f16ba88-ef94-4543-aef6-85263b26ff4c-nova-cells-global-config-0\") pod \"nova-cell1-openstack-openstack-cell1-bmnqb\" (UID: \"0f16ba88-ef94-4543-aef6-85263b26ff4c\") " pod="openstack/nova-cell1-openstack-openstack-cell1-bmnqb"
Dec 02 09:38:56 crc kubenswrapper[4895]: I1202 09:38:56.621301 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f16ba88-ef94-4543-aef6-85263b26ff4c-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-openstack-cell1-bmnqb\" (UID: \"0f16ba88-ef94-4543-aef6-85263b26ff4c\") " pod="openstack/nova-cell1-openstack-openstack-cell1-bmnqb"
Dec 02 09:38:56 crc kubenswrapper[4895]: I1202 09:38:56.723296 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0f16ba88-ef94-4543-aef6-85263b26ff4c-ceph\") pod \"nova-cell1-openstack-openstack-cell1-bmnqb\" (UID: \"0f16ba88-ef94-4543-aef6-85263b26ff4c\") " pod="openstack/nova-cell1-openstack-openstack-cell1-bmnqb"
Dec 02 09:38:56 crc kubenswrapper[4895]: I1202 09:38:56.723377 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/0f16ba88-ef94-4543-aef6-85263b26ff4c-nova-cells-global-config-1\") pod \"nova-cell1-openstack-openstack-cell1-bmnqb\" (UID: \"0f16ba88-ef94-4543-aef6-85263b26ff4c\") " pod="openstack/nova-cell1-openstack-openstack-cell1-bmnqb"
Dec 02 09:38:56 crc kubenswrapper[4895]: I1202 09:38:56.723416 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0f16ba88-ef94-4543-aef6-85263b26ff4c-ssh-key\") pod \"nova-cell1-openstack-openstack-cell1-bmnqb\" (UID: \"0f16ba88-ef94-4543-aef6-85263b26ff4c\") " pod="openstack/nova-cell1-openstack-openstack-cell1-bmnqb"
Dec 02 09:38:56 crc kubenswrapper[4895]: I1202 09:38:56.723455 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0f16ba88-ef94-4543-aef6-85263b26ff4c-inventory\") pod \"nova-cell1-openstack-openstack-cell1-bmnqb\" (UID: \"0f16ba88-ef94-4543-aef6-85263b26ff4c\") " pod="openstack/nova-cell1-openstack-openstack-cell1-bmnqb"
Dec 02 09:38:56 crc kubenswrapper[4895]: I1202 09:38:56.723479 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/0f16ba88-ef94-4543-aef6-85263b26ff4c-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-openstack-cell1-bmnqb\" (UID: \"0f16ba88-ef94-4543-aef6-85263b26ff4c\") " pod="openstack/nova-cell1-openstack-openstack-cell1-bmnqb"
Dec 02 09:38:56 crc kubenswrapper[4895]: I1202 09:38:56.723506 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2vrs9\" (UniqueName: \"kubernetes.io/projected/0f16ba88-ef94-4543-aef6-85263b26ff4c-kube-api-access-2vrs9\") pod \"nova-cell1-openstack-openstack-cell1-bmnqb\" (UID: \"0f16ba88-ef94-4543-aef6-85263b26ff4c\") " pod="openstack/nova-cell1-openstack-openstack-cell1-bmnqb"
Dec 02 09:38:56 crc kubenswrapper[4895]: I1202 09:38:56.723520 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/0f16ba88-ef94-4543-aef6-85263b26ff4c-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-openstack-cell1-bmnqb\" (UID: \"0f16ba88-ef94-4543-aef6-85263b26ff4c\") " pod="openstack/nova-cell1-openstack-openstack-cell1-bmnqb"
Dec 02 09:38:56 crc kubenswrapper[4895]: I1202 09:38:56.723562 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/0f16ba88-ef94-4543-aef6-85263b26ff4c-nova-cells-global-config-0\") pod \"nova-cell1-openstack-openstack-cell1-bmnqb\" (UID: \"0f16ba88-ef94-4543-aef6-85263b26ff4c\") " pod="openstack/nova-cell1-openstack-openstack-cell1-bmnqb"
Dec 02 09:38:56 crc kubenswrapper[4895]: I1202 09:38:56.723580 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f16ba88-ef94-4543-aef6-85263b26ff4c-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-openstack-cell1-bmnqb\" (UID: \"0f16ba88-ef94-4543-aef6-85263b26ff4c\") " pod="openstack/nova-cell1-openstack-openstack-cell1-bmnqb"
Dec 02 09:38:56 crc kubenswrapper[4895]: I1202 09:38:56.723621 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/0f16ba88-ef94-4543-aef6-85263b26ff4c-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-openstack-cell1-bmnqb\" (UID: \"0f16ba88-ef94-4543-aef6-85263b26ff4c\") " pod="openstack/nova-cell1-openstack-openstack-cell1-bmnqb"
Dec 02 09:38:56 crc kubenswrapper[4895]: I1202 09:38:56.723645 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/0f16ba88-ef94-4543-aef6-85263b26ff4c-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-openstack-cell1-bmnqb\" (UID: \"0f16ba88-ef94-4543-aef6-85263b26ff4c\") " pod="openstack/nova-cell1-openstack-openstack-cell1-bmnqb"
Dec 02 09:38:56 crc kubenswrapper[4895]: I1202 09:38:56.724909 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/0f16ba88-ef94-4543-aef6-85263b26ff4c-nova-cells-global-config-0\") pod \"nova-cell1-openstack-openstack-cell1-bmnqb\" (UID: \"0f16ba88-ef94-4543-aef6-85263b26ff4c\") " pod="openstack/nova-cell1-openstack-openstack-cell1-bmnqb"
Dec 02 09:38:56 crc kubenswrapper[4895]: I1202 09:38:56.724934 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/0f16ba88-ef94-4543-aef6-85263b26ff4c-nova-cells-global-config-1\") pod \"nova-cell1-openstack-openstack-cell1-bmnqb\" (UID: \"0f16ba88-ef94-4543-aef6-85263b26ff4c\") " pod="openstack/nova-cell1-openstack-openstack-cell1-bmnqb"
Dec 02 09:38:56 crc kubenswrapper[4895]: I1202 09:38:56.728285 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0f16ba88-ef94-4543-aef6-85263b26ff4c-ssh-key\") pod \"nova-cell1-openstack-openstack-cell1-bmnqb\" (UID: \"0f16ba88-ef94-4543-aef6-85263b26ff4c\") " pod="openstack/nova-cell1-openstack-openstack-cell1-bmnqb"
Dec 02 09:38:56 crc kubenswrapper[4895]: I1202 09:38:56.728701 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/0f16ba88-ef94-4543-aef6-85263b26ff4c-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-openstack-cell1-bmnqb\" (UID: \"0f16ba88-ef94-4543-aef6-85263b26ff4c\") " pod="openstack/nova-cell1-openstack-openstack-cell1-bmnqb"
Dec 02 09:38:56 crc kubenswrapper[4895]: I1202 09:38:56.728882 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0f16ba88-ef94-4543-aef6-85263b26ff4c-ceph\") pod \"nova-cell1-openstack-openstack-cell1-bmnqb\" (UID: \"0f16ba88-ef94-4543-aef6-85263b26ff4c\") " pod="openstack/nova-cell1-openstack-openstack-cell1-bmnqb"
Dec 02 09:38:56 crc kubenswrapper[4895]: I1202 09:38:56.730109 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/0f16ba88-ef94-4543-aef6-85263b26ff4c-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-openstack-cell1-bmnqb\" (UID: \"0f16ba88-ef94-4543-aef6-85263b26ff4c\") " pod="openstack/nova-cell1-openstack-openstack-cell1-bmnqb"
Dec 02 09:38:56 crc kubenswrapper[4895]: I1202 09:38:56.731202 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/0f16ba88-ef94-4543-aef6-85263b26ff4c-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-openstack-cell1-bmnqb\" (UID: \"0f16ba88-ef94-4543-aef6-85263b26ff4c\") " pod="openstack/nova-cell1-openstack-openstack-cell1-bmnqb"
Dec 02 09:38:56 crc kubenswrapper[4895]: I1202 09:38:56.731542 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/0f16ba88-ef94-4543-aef6-85263b26ff4c-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-openstack-cell1-bmnqb\" (UID: \"0f16ba88-ef94-4543-aef6-85263b26ff4c\") " pod="openstack/nova-cell1-openstack-openstack-cell1-bmnqb"
Dec 02 09:38:56 crc kubenswrapper[4895]: I1202 09:38:56.731575 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0f16ba88-ef94-4543-aef6-85263b26ff4c-inventory\") pod \"nova-cell1-openstack-openstack-cell1-bmnqb\" (UID: \"0f16ba88-ef94-4543-aef6-85263b26ff4c\") " pod="openstack/nova-cell1-openstack-openstack-cell1-bmnqb"
Dec 02 09:38:56 crc kubenswrapper[4895]: I1202 09:38:56.737058 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f16ba88-ef94-4543-aef6-85263b26ff4c-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-openstack-cell1-bmnqb\" (UID: \"0f16ba88-ef94-4543-aef6-85263b26ff4c\") " pod="openstack/nova-cell1-openstack-openstack-cell1-bmnqb"
Dec 02 09:38:56 crc kubenswrapper[4895]: I1202 09:38:56.747450 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vrs9\" (UniqueName: \"kubernetes.io/projected/0f16ba88-ef94-4543-aef6-85263b26ff4c-kube-api-access-2vrs9\") pod \"nova-cell1-openstack-openstack-cell1-bmnqb\" (UID: \"0f16ba88-ef94-4543-aef6-85263b26ff4c\") " pod="openstack/nova-cell1-openstack-openstack-cell1-bmnqb"
Dec 02 09:38:56 crc kubenswrapper[4895]: I1202 09:38:56.787400 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-bmnqb"
Dec 02 09:38:57 crc kubenswrapper[4895]: I1202 09:38:57.324984 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-openstack-cell1-bmnqb"]
Dec 02 09:38:57 crc kubenswrapper[4895]: I1202 09:38:57.363099 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-bmnqb" event={"ID":"0f16ba88-ef94-4543-aef6-85263b26ff4c","Type":"ContainerStarted","Data":"b41b6466b98a8e9641fb53c73ed57968030af61b81bebfd6c04d78c9506c6cd0"}
Dec 02 09:38:58 crc kubenswrapper[4895]: I1202 09:38:58.377469 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-bmnqb" event={"ID":"0f16ba88-ef94-4543-aef6-85263b26ff4c","Type":"ContainerStarted","Data":"ed63aaa42fb82f7e7b958a58b121f343f9f29bac88511c64f54b29a8af047636"}
Dec 02 09:38:58 crc kubenswrapper[4895]: I1202 09:38:58.407438 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-openstack-openstack-cell1-bmnqb" podStartSLOduration=2.223440148 podStartE2EDuration="2.407418966s" podCreationTimestamp="2025-12-02 09:38:56 +0000 UTC" firstStartedPulling="2025-12-02 09:38:57.329291879 +0000 UTC m=+8148.500151492" lastFinishedPulling="2025-12-02 09:38:57.513270697 +0000 UTC m=+8148.684130310" observedRunningTime="2025-12-02 09:38:58.397709163 +0000 UTC m=+8149.568568806" watchObservedRunningTime="2025-12-02 09:38:58.407418966 +0000 UTC m=+8149.578278579"
Dec 02 09:40:05 crc kubenswrapper[4895]: I1202 09:40:05.473234 4895 patch_prober.go:28] interesting pod/machine-config-daemon-wfcg7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 02 09:40:05 crc kubenswrapper[4895]: I1202 09:40:05.473754 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 02 09:40:35 crc kubenswrapper[4895]: I1202 09:40:35.473483 4895 patch_prober.go:28] interesting pod/machine-config-daemon-wfcg7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 02 09:40:35 crc kubenswrapper[4895]: I1202 09:40:35.474459 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 02 09:41:00 crc kubenswrapper[4895]: I1202 09:41:00.168682 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-wpfbs"]
Dec 02 09:41:00 crc kubenswrapper[4895]: I1202 09:41:00.171912 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wpfbs"
Dec 02 09:41:00 crc kubenswrapper[4895]: I1202 09:41:00.191612 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wpfbs"]
Dec 02 09:41:00 crc kubenswrapper[4895]: I1202 09:41:00.301779 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9cm45\" (UniqueName: \"kubernetes.io/projected/158c5737-e1ee-4ccb-a3bd-b1b017df222e-kube-api-access-9cm45\") pod \"redhat-marketplace-wpfbs\" (UID: \"158c5737-e1ee-4ccb-a3bd-b1b017df222e\") " pod="openshift-marketplace/redhat-marketplace-wpfbs"
Dec 02 09:41:00 crc kubenswrapper[4895]: I1202 09:41:00.302216 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/158c5737-e1ee-4ccb-a3bd-b1b017df222e-catalog-content\") pod \"redhat-marketplace-wpfbs\" (UID: \"158c5737-e1ee-4ccb-a3bd-b1b017df222e\") " pod="openshift-marketplace/redhat-marketplace-wpfbs"
Dec 02 09:41:00 crc kubenswrapper[4895]: I1202 09:41:00.302329 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/158c5737-e1ee-4ccb-a3bd-b1b017df222e-utilities\") pod \"redhat-marketplace-wpfbs\" (UID: \"158c5737-e1ee-4ccb-a3bd-b1b017df222e\") " pod="openshift-marketplace/redhat-marketplace-wpfbs"
Dec 02 09:41:00 crc kubenswrapper[4895]: I1202 09:41:00.405725 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/158c5737-e1ee-4ccb-a3bd-b1b017df222e-catalog-content\") pod \"redhat-marketplace-wpfbs\" (UID: \"158c5737-e1ee-4ccb-a3bd-b1b017df222e\") " pod="openshift-marketplace/redhat-marketplace-wpfbs"
Dec 02 09:41:00 crc kubenswrapper[4895]: I1202 09:41:00.405817 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/158c5737-e1ee-4ccb-a3bd-b1b017df222e-utilities\") pod \"redhat-marketplace-wpfbs\" (UID: \"158c5737-e1ee-4ccb-a3bd-b1b017df222e\") " pod="openshift-marketplace/redhat-marketplace-wpfbs"
Dec 02 09:41:00 crc kubenswrapper[4895]: I1202 09:41:00.406028 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9cm45\" (UniqueName: \"kubernetes.io/projected/158c5737-e1ee-4ccb-a3bd-b1b017df222e-kube-api-access-9cm45\") pod \"redhat-marketplace-wpfbs\" (UID: \"158c5737-e1ee-4ccb-a3bd-b1b017df222e\") " pod="openshift-marketplace/redhat-marketplace-wpfbs"
Dec 02 09:41:00 crc kubenswrapper[4895]: I1202 09:41:00.406308 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/158c5737-e1ee-4ccb-a3bd-b1b017df222e-catalog-content\") pod \"redhat-marketplace-wpfbs\" (UID: \"158c5737-e1ee-4ccb-a3bd-b1b017df222e\") " pod="openshift-marketplace/redhat-marketplace-wpfbs"
Dec 02 09:41:00 crc kubenswrapper[4895]: I1202 09:41:00.406373 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/158c5737-e1ee-4ccb-a3bd-b1b017df222e-utilities\") pod \"redhat-marketplace-wpfbs\" (UID: \"158c5737-e1ee-4ccb-a3bd-b1b017df222e\") " pod="openshift-marketplace/redhat-marketplace-wpfbs"
Dec 02 09:41:00 crc kubenswrapper[4895]: I1202 09:41:00.428688 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9cm45\" (UniqueName: \"kubernetes.io/projected/158c5737-e1ee-4ccb-a3bd-b1b017df222e-kube-api-access-9cm45\") pod \"redhat-marketplace-wpfbs\" (UID: \"158c5737-e1ee-4ccb-a3bd-b1b017df222e\") " pod="openshift-marketplace/redhat-marketplace-wpfbs"
Dec 02 09:41:00 crc kubenswrapper[4895]: I1202 09:41:00.501550 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wpfbs"
Dec 02 09:41:01 crc kubenswrapper[4895]: I1202 09:41:01.059177 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wpfbs"]
Dec 02 09:41:01 crc kubenswrapper[4895]: W1202 09:41:01.062846 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod158c5737_e1ee_4ccb_a3bd_b1b017df222e.slice/crio-8f5160dc8ee32d16129f16bc7b718bb837e17559decc65e38ffb31cd8ed62416 WatchSource:0}: Error finding container 8f5160dc8ee32d16129f16bc7b718bb837e17559decc65e38ffb31cd8ed62416: Status 404 returned error can't find the container with id 8f5160dc8ee32d16129f16bc7b718bb837e17559decc65e38ffb31cd8ed62416
Dec 02 09:41:01 crc kubenswrapper[4895]: I1202 09:41:01.779839 4895 generic.go:334] "Generic (PLEG): container finished" podID="158c5737-e1ee-4ccb-a3bd-b1b017df222e" containerID="c1a92adaba6fec1627fc3cd72ed778f20a6829932648b5d3becdf9b4ca9fea77" exitCode=0
Dec 02 09:41:01 crc kubenswrapper[4895]: I1202 09:41:01.779876 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wpfbs" event={"ID":"158c5737-e1ee-4ccb-a3bd-b1b017df222e","Type":"ContainerDied","Data":"c1a92adaba6fec1627fc3cd72ed778f20a6829932648b5d3becdf9b4ca9fea77"}
Dec 02 09:41:01 crc kubenswrapper[4895]: I1202 09:41:01.780179 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wpfbs" event={"ID":"158c5737-e1ee-4ccb-a3bd-b1b017df222e","Type":"ContainerStarted","Data":"8f5160dc8ee32d16129f16bc7b718bb837e17559decc65e38ffb31cd8ed62416"}
Dec 02 09:41:01 crc kubenswrapper[4895]: I1202 09:41:01.782891 4895 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Dec 02 09:41:03 crc kubenswrapper[4895]: I1202 09:41:03.806720 4895 generic.go:334] "Generic (PLEG): container finished" podID="158c5737-e1ee-4ccb-a3bd-b1b017df222e" containerID="02ac7323377d3d6353eb31ac020b1986a034fc80e61818fab782b83caa7fe708" exitCode=0
Dec 02 09:41:03 crc kubenswrapper[4895]: I1202 09:41:03.806970 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wpfbs" event={"ID":"158c5737-e1ee-4ccb-a3bd-b1b017df222e","Type":"ContainerDied","Data":"02ac7323377d3d6353eb31ac020b1986a034fc80e61818fab782b83caa7fe708"}
Dec 02 09:41:04 crc kubenswrapper[4895]: I1202 09:41:04.820663 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wpfbs" event={"ID":"158c5737-e1ee-4ccb-a3bd-b1b017df222e","Type":"ContainerStarted","Data":"65d2ac10825cdf24659f41c7af605e9fc7a21eb405f793fbd5f4b686f513da42"}
Dec 02 09:41:04 crc kubenswrapper[4895]: I1202 09:41:04.843780 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-wpfbs" podStartSLOduration=2.369421254 podStartE2EDuration="4.843754977s" podCreationTimestamp="2025-12-02 09:41:00 +0000 UTC" firstStartedPulling="2025-12-02 09:41:01.782677524 +0000 UTC m=+8272.953537137" lastFinishedPulling="2025-12-02 09:41:04.257011247 +0000 UTC m=+8275.427870860" observedRunningTime="2025-12-02 09:41:04.837404939 +0000 UTC m=+8276.008264552" watchObservedRunningTime="2025-12-02 09:41:04.843754977 +0000 UTC m=+8276.014614590"
Dec 02 09:41:05 crc kubenswrapper[4895]: I1202 09:41:05.473402 4895 patch_prober.go:28] interesting pod/machine-config-daemon-wfcg7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 02 09:41:05 crc kubenswrapper[4895]: I1202 09:41:05.473488 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 02 09:41:05 crc kubenswrapper[4895]: I1202 09:41:05.473548 4895 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7"
Dec 02 09:41:05 crc kubenswrapper[4895]: I1202 09:41:05.474594 4895 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"17b4ce278cfbacb2db258cc73695bc893172418f5a9a87ae12f5dacd0ff48422"} pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 02 09:41:05 crc kubenswrapper[4895]: I1202 09:41:05.474679 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerName="machine-config-daemon" containerID="cri-o://17b4ce278cfbacb2db258cc73695bc893172418f5a9a87ae12f5dacd0ff48422" gracePeriod=600
Dec 02 09:41:05 crc kubenswrapper[4895]: I1202 09:41:05.856145 4895 generic.go:334] "Generic (PLEG): container finished" podID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerID="17b4ce278cfbacb2db258cc73695bc893172418f5a9a87ae12f5dacd0ff48422" exitCode=0
Dec 02 09:41:05 crc kubenswrapper[4895]: I1202 09:41:05.858952 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" event={"ID":"0468d2d1-a975-45a6-af9f-47adc432fab0","Type":"ContainerDied","Data":"17b4ce278cfbacb2db258cc73695bc893172418f5a9a87ae12f5dacd0ff48422"}
Dec 02 09:41:05 crc kubenswrapper[4895]: I1202 09:41:05.858992 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" event={"ID":"0468d2d1-a975-45a6-af9f-47adc432fab0","Type":"ContainerStarted","Data":"76c4faf03a96e2caf1e3feba1cf53c4dfd844f1889ed70d385af4e8c6d45622d"}
Dec 02 09:41:05 crc kubenswrapper[4895]: I1202 09:41:05.859030 4895 scope.go:117] "RemoveContainer" containerID="77223bf7853202dbb8024d476b05d3dd47cc2c7476c34d4d3d8c2d6a90d37b0b"
Dec 02 09:41:10 crc kubenswrapper[4895]: I1202 09:41:10.502776 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-wpfbs"
Dec 02 09:41:10 crc kubenswrapper[4895]: I1202 09:41:10.503393 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-wpfbs"
Dec 02 09:41:10 crc kubenswrapper[4895]: I1202 09:41:10.565826 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-wpfbs"
Dec 02 09:41:10 crc kubenswrapper[4895]: I1202 09:41:10.977555 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-wpfbs"
Dec 02 09:41:11 crc kubenswrapper[4895]: I1202 09:41:11.043936 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wpfbs"]
Dec 02 09:41:12 crc kubenswrapper[4895]: I1202 09:41:12.941651 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-wpfbs" podUID="158c5737-e1ee-4ccb-a3bd-b1b017df222e" containerName="registry-server" containerID="cri-o://65d2ac10825cdf24659f41c7af605e9fc7a21eb405f793fbd5f4b686f513da42" gracePeriod=2
Dec 02 09:41:13 crc kubenswrapper[4895]: I1202 09:41:13.488002 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wpfbs"
Dec 02 09:41:13 crc kubenswrapper[4895]: I1202 09:41:13.515835 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9cm45\" (UniqueName: \"kubernetes.io/projected/158c5737-e1ee-4ccb-a3bd-b1b017df222e-kube-api-access-9cm45\") pod \"158c5737-e1ee-4ccb-a3bd-b1b017df222e\" (UID: \"158c5737-e1ee-4ccb-a3bd-b1b017df222e\") "
Dec 02 09:41:13 crc kubenswrapper[4895]: I1202 09:41:13.516234 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/158c5737-e1ee-4ccb-a3bd-b1b017df222e-catalog-content\") pod \"158c5737-e1ee-4ccb-a3bd-b1b017df222e\" (UID: \"158c5737-e1ee-4ccb-a3bd-b1b017df222e\") "
Dec 02 09:41:13 crc kubenswrapper[4895]: I1202 09:41:13.516374 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/158c5737-e1ee-4ccb-a3bd-b1b017df222e-utilities\") pod \"158c5737-e1ee-4ccb-a3bd-b1b017df222e\" (UID: \"158c5737-e1ee-4ccb-a3bd-b1b017df222e\") "
Dec 02 09:41:13 crc kubenswrapper[4895]: I1202 09:41:13.518101 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/158c5737-e1ee-4ccb-a3bd-b1b017df222e-utilities" (OuterVolumeSpecName: "utilities") pod "158c5737-e1ee-4ccb-a3bd-b1b017df222e" (UID: "158c5737-e1ee-4ccb-a3bd-b1b017df222e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 02 09:41:13 crc kubenswrapper[4895]: I1202 09:41:13.532097 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/158c5737-e1ee-4ccb-a3bd-b1b017df222e-kube-api-access-9cm45" (OuterVolumeSpecName: "kube-api-access-9cm45") pod "158c5737-e1ee-4ccb-a3bd-b1b017df222e" (UID: "158c5737-e1ee-4ccb-a3bd-b1b017df222e"). InnerVolumeSpecName "kube-api-access-9cm45". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 09:41:13 crc kubenswrapper[4895]: I1202 09:41:13.538687 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/158c5737-e1ee-4ccb-a3bd-b1b017df222e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "158c5737-e1ee-4ccb-a3bd-b1b017df222e" (UID: "158c5737-e1ee-4ccb-a3bd-b1b017df222e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 02 09:41:13 crc kubenswrapper[4895]: I1202 09:41:13.618899 4895 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/158c5737-e1ee-4ccb-a3bd-b1b017df222e-utilities\") on node \"crc\" DevicePath \"\""
Dec 02 09:41:13 crc kubenswrapper[4895]: I1202 09:41:13.618931 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9cm45\" (UniqueName: \"kubernetes.io/projected/158c5737-e1ee-4ccb-a3bd-b1b017df222e-kube-api-access-9cm45\") on node \"crc\" DevicePath \"\""
Dec 02 09:41:13 crc kubenswrapper[4895]: I1202 09:41:13.618941 4895 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/158c5737-e1ee-4ccb-a3bd-b1b017df222e-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 02 09:41:13 crc kubenswrapper[4895]: I1202 09:41:13.951447 4895 generic.go:334] "Generic (PLEG): container finished" podID="158c5737-e1ee-4ccb-a3bd-b1b017df222e" containerID="65d2ac10825cdf24659f41c7af605e9fc7a21eb405f793fbd5f4b686f513da42" exitCode=0
Dec 02 09:41:13 crc kubenswrapper[4895]: I1202 09:41:13.951487 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wpfbs" event={"ID":"158c5737-e1ee-4ccb-a3bd-b1b017df222e","Type":"ContainerDied","Data":"65d2ac10825cdf24659f41c7af605e9fc7a21eb405f793fbd5f4b686f513da42"}
Dec 02 09:41:13 crc kubenswrapper[4895]: I1202 09:41:13.951509 4895 util.go:48] "No ready sandbox for pod can be
found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wpfbs" Dec 02 09:41:13 crc kubenswrapper[4895]: I1202 09:41:13.951530 4895 scope.go:117] "RemoveContainer" containerID="65d2ac10825cdf24659f41c7af605e9fc7a21eb405f793fbd5f4b686f513da42" Dec 02 09:41:13 crc kubenswrapper[4895]: I1202 09:41:13.951519 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wpfbs" event={"ID":"158c5737-e1ee-4ccb-a3bd-b1b017df222e","Type":"ContainerDied","Data":"8f5160dc8ee32d16129f16bc7b718bb837e17559decc65e38ffb31cd8ed62416"} Dec 02 09:41:13 crc kubenswrapper[4895]: I1202 09:41:13.978367 4895 scope.go:117] "RemoveContainer" containerID="02ac7323377d3d6353eb31ac020b1986a034fc80e61818fab782b83caa7fe708" Dec 02 09:41:13 crc kubenswrapper[4895]: I1202 09:41:13.992432 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wpfbs"] Dec 02 09:41:14 crc kubenswrapper[4895]: I1202 09:41:14.003813 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-wpfbs"] Dec 02 09:41:14 crc kubenswrapper[4895]: I1202 09:41:14.015546 4895 scope.go:117] "RemoveContainer" containerID="c1a92adaba6fec1627fc3cd72ed778f20a6829932648b5d3becdf9b4ca9fea77" Dec 02 09:41:14 crc kubenswrapper[4895]: I1202 09:41:14.050068 4895 scope.go:117] "RemoveContainer" containerID="65d2ac10825cdf24659f41c7af605e9fc7a21eb405f793fbd5f4b686f513da42" Dec 02 09:41:14 crc kubenswrapper[4895]: E1202 09:41:14.050500 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65d2ac10825cdf24659f41c7af605e9fc7a21eb405f793fbd5f4b686f513da42\": container with ID starting with 65d2ac10825cdf24659f41c7af605e9fc7a21eb405f793fbd5f4b686f513da42 not found: ID does not exist" containerID="65d2ac10825cdf24659f41c7af605e9fc7a21eb405f793fbd5f4b686f513da42" Dec 02 09:41:14 crc kubenswrapper[4895]: I1202 09:41:14.050558 4895 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65d2ac10825cdf24659f41c7af605e9fc7a21eb405f793fbd5f4b686f513da42"} err="failed to get container status \"65d2ac10825cdf24659f41c7af605e9fc7a21eb405f793fbd5f4b686f513da42\": rpc error: code = NotFound desc = could not find container \"65d2ac10825cdf24659f41c7af605e9fc7a21eb405f793fbd5f4b686f513da42\": container with ID starting with 65d2ac10825cdf24659f41c7af605e9fc7a21eb405f793fbd5f4b686f513da42 not found: ID does not exist" Dec 02 09:41:14 crc kubenswrapper[4895]: I1202 09:41:14.050595 4895 scope.go:117] "RemoveContainer" containerID="02ac7323377d3d6353eb31ac020b1986a034fc80e61818fab782b83caa7fe708" Dec 02 09:41:14 crc kubenswrapper[4895]: E1202 09:41:14.050959 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02ac7323377d3d6353eb31ac020b1986a034fc80e61818fab782b83caa7fe708\": container with ID starting with 02ac7323377d3d6353eb31ac020b1986a034fc80e61818fab782b83caa7fe708 not found: ID does not exist" containerID="02ac7323377d3d6353eb31ac020b1986a034fc80e61818fab782b83caa7fe708" Dec 02 09:41:14 crc kubenswrapper[4895]: I1202 09:41:14.051045 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02ac7323377d3d6353eb31ac020b1986a034fc80e61818fab782b83caa7fe708"} err="failed to get container status \"02ac7323377d3d6353eb31ac020b1986a034fc80e61818fab782b83caa7fe708\": rpc error: code = NotFound desc = could not find container \"02ac7323377d3d6353eb31ac020b1986a034fc80e61818fab782b83caa7fe708\": container with ID starting with 02ac7323377d3d6353eb31ac020b1986a034fc80e61818fab782b83caa7fe708 not found: ID does not exist" Dec 02 09:41:14 crc kubenswrapper[4895]: I1202 09:41:14.051121 4895 scope.go:117] "RemoveContainer" containerID="c1a92adaba6fec1627fc3cd72ed778f20a6829932648b5d3becdf9b4ca9fea77" Dec 02 09:41:14 crc kubenswrapper[4895]: E1202 
09:41:14.051378 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1a92adaba6fec1627fc3cd72ed778f20a6829932648b5d3becdf9b4ca9fea77\": container with ID starting with c1a92adaba6fec1627fc3cd72ed778f20a6829932648b5d3becdf9b4ca9fea77 not found: ID does not exist" containerID="c1a92adaba6fec1627fc3cd72ed778f20a6829932648b5d3becdf9b4ca9fea77" Dec 02 09:41:14 crc kubenswrapper[4895]: I1202 09:41:14.051414 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1a92adaba6fec1627fc3cd72ed778f20a6829932648b5d3becdf9b4ca9fea77"} err="failed to get container status \"c1a92adaba6fec1627fc3cd72ed778f20a6829932648b5d3becdf9b4ca9fea77\": rpc error: code = NotFound desc = could not find container \"c1a92adaba6fec1627fc3cd72ed778f20a6829932648b5d3becdf9b4ca9fea77\": container with ID starting with c1a92adaba6fec1627fc3cd72ed778f20a6829932648b5d3becdf9b4ca9fea77 not found: ID does not exist" Dec 02 09:41:15 crc kubenswrapper[4895]: I1202 09:41:15.152724 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="158c5737-e1ee-4ccb-a3bd-b1b017df222e" path="/var/lib/kubelet/pods/158c5737-e1ee-4ccb-a3bd-b1b017df222e/volumes" Dec 02 09:42:13 crc kubenswrapper[4895]: I1202 09:42:13.533275 4895 generic.go:334] "Generic (PLEG): container finished" podID="0f16ba88-ef94-4543-aef6-85263b26ff4c" containerID="ed63aaa42fb82f7e7b958a58b121f343f9f29bac88511c64f54b29a8af047636" exitCode=0 Dec 02 09:42:13 crc kubenswrapper[4895]: I1202 09:42:13.533493 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-bmnqb" event={"ID":"0f16ba88-ef94-4543-aef6-85263b26ff4c","Type":"ContainerDied","Data":"ed63aaa42fb82f7e7b958a58b121f343f9f29bac88511c64f54b29a8af047636"} Dec 02 09:42:15 crc kubenswrapper[4895]: I1202 09:42:14.988412 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-bmnqb" Dec 02 09:42:15 crc kubenswrapper[4895]: I1202 09:42:15.099624 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/0f16ba88-ef94-4543-aef6-85263b26ff4c-nova-cell1-compute-config-0\") pod \"0f16ba88-ef94-4543-aef6-85263b26ff4c\" (UID: \"0f16ba88-ef94-4543-aef6-85263b26ff4c\") " Dec 02 09:42:15 crc kubenswrapper[4895]: I1202 09:42:15.099724 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/0f16ba88-ef94-4543-aef6-85263b26ff4c-nova-cell1-compute-config-1\") pod \"0f16ba88-ef94-4543-aef6-85263b26ff4c\" (UID: \"0f16ba88-ef94-4543-aef6-85263b26ff4c\") " Dec 02 09:42:15 crc kubenswrapper[4895]: I1202 09:42:15.099860 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/0f16ba88-ef94-4543-aef6-85263b26ff4c-nova-migration-ssh-key-1\") pod \"0f16ba88-ef94-4543-aef6-85263b26ff4c\" (UID: \"0f16ba88-ef94-4543-aef6-85263b26ff4c\") " Dec 02 09:42:15 crc kubenswrapper[4895]: I1202 09:42:15.099927 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f16ba88-ef94-4543-aef6-85263b26ff4c-nova-cell1-combined-ca-bundle\") pod \"0f16ba88-ef94-4543-aef6-85263b26ff4c\" (UID: \"0f16ba88-ef94-4543-aef6-85263b26ff4c\") " Dec 02 09:42:15 crc kubenswrapper[4895]: I1202 09:42:15.099980 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/0f16ba88-ef94-4543-aef6-85263b26ff4c-nova-cells-global-config-1\") pod \"0f16ba88-ef94-4543-aef6-85263b26ff4c\" (UID: \"0f16ba88-ef94-4543-aef6-85263b26ff4c\") " Dec 02 09:42:15 crc 
kubenswrapper[4895]: I1202 09:42:15.100032 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0f16ba88-ef94-4543-aef6-85263b26ff4c-inventory\") pod \"0f16ba88-ef94-4543-aef6-85263b26ff4c\" (UID: \"0f16ba88-ef94-4543-aef6-85263b26ff4c\") " Dec 02 09:42:15 crc kubenswrapper[4895]: I1202 09:42:15.100068 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0f16ba88-ef94-4543-aef6-85263b26ff4c-ssh-key\") pod \"0f16ba88-ef94-4543-aef6-85263b26ff4c\" (UID: \"0f16ba88-ef94-4543-aef6-85263b26ff4c\") " Dec 02 09:42:15 crc kubenswrapper[4895]: I1202 09:42:15.100114 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2vrs9\" (UniqueName: \"kubernetes.io/projected/0f16ba88-ef94-4543-aef6-85263b26ff4c-kube-api-access-2vrs9\") pod \"0f16ba88-ef94-4543-aef6-85263b26ff4c\" (UID: \"0f16ba88-ef94-4543-aef6-85263b26ff4c\") " Dec 02 09:42:15 crc kubenswrapper[4895]: I1202 09:42:15.100646 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/0f16ba88-ef94-4543-aef6-85263b26ff4c-nova-cells-global-config-0\") pod \"0f16ba88-ef94-4543-aef6-85263b26ff4c\" (UID: \"0f16ba88-ef94-4543-aef6-85263b26ff4c\") " Dec 02 09:42:15 crc kubenswrapper[4895]: I1202 09:42:15.100678 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/0f16ba88-ef94-4543-aef6-85263b26ff4c-nova-migration-ssh-key-0\") pod \"0f16ba88-ef94-4543-aef6-85263b26ff4c\" (UID: \"0f16ba88-ef94-4543-aef6-85263b26ff4c\") " Dec 02 09:42:15 crc kubenswrapper[4895]: I1202 09:42:15.100712 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/secret/0f16ba88-ef94-4543-aef6-85263b26ff4c-ceph\") pod \"0f16ba88-ef94-4543-aef6-85263b26ff4c\" (UID: \"0f16ba88-ef94-4543-aef6-85263b26ff4c\") " Dec 02 09:42:15 crc kubenswrapper[4895]: I1202 09:42:15.105894 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f16ba88-ef94-4543-aef6-85263b26ff4c-ceph" (OuterVolumeSpecName: "ceph") pod "0f16ba88-ef94-4543-aef6-85263b26ff4c" (UID: "0f16ba88-ef94-4543-aef6-85263b26ff4c"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:42:15 crc kubenswrapper[4895]: I1202 09:42:15.110965 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f16ba88-ef94-4543-aef6-85263b26ff4c-kube-api-access-2vrs9" (OuterVolumeSpecName: "kube-api-access-2vrs9") pod "0f16ba88-ef94-4543-aef6-85263b26ff4c" (UID: "0f16ba88-ef94-4543-aef6-85263b26ff4c"). InnerVolumeSpecName "kube-api-access-2vrs9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:42:15 crc kubenswrapper[4895]: I1202 09:42:15.116983 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f16ba88-ef94-4543-aef6-85263b26ff4c-nova-cell1-combined-ca-bundle" (OuterVolumeSpecName: "nova-cell1-combined-ca-bundle") pod "0f16ba88-ef94-4543-aef6-85263b26ff4c" (UID: "0f16ba88-ef94-4543-aef6-85263b26ff4c"). InnerVolumeSpecName "nova-cell1-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:42:15 crc kubenswrapper[4895]: I1202 09:42:15.137809 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f16ba88-ef94-4543-aef6-85263b26ff4c-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "0f16ba88-ef94-4543-aef6-85263b26ff4c" (UID: "0f16ba88-ef94-4543-aef6-85263b26ff4c"). InnerVolumeSpecName "nova-cell1-compute-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:42:15 crc kubenswrapper[4895]: I1202 09:42:15.138882 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f16ba88-ef94-4543-aef6-85263b26ff4c-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "0f16ba88-ef94-4543-aef6-85263b26ff4c" (UID: "0f16ba88-ef94-4543-aef6-85263b26ff4c"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:42:15 crc kubenswrapper[4895]: I1202 09:42:15.141192 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f16ba88-ef94-4543-aef6-85263b26ff4c-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "0f16ba88-ef94-4543-aef6-85263b26ff4c" (UID: "0f16ba88-ef94-4543-aef6-85263b26ff4c"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:42:15 crc kubenswrapper[4895]: I1202 09:42:15.141862 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f16ba88-ef94-4543-aef6-85263b26ff4c-nova-cells-global-config-0" (OuterVolumeSpecName: "nova-cells-global-config-0") pod "0f16ba88-ef94-4543-aef6-85263b26ff4c" (UID: "0f16ba88-ef94-4543-aef6-85263b26ff4c"). InnerVolumeSpecName "nova-cells-global-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:42:15 crc kubenswrapper[4895]: I1202 09:42:15.144431 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f16ba88-ef94-4543-aef6-85263b26ff4c-nova-cells-global-config-1" (OuterVolumeSpecName: "nova-cells-global-config-1") pod "0f16ba88-ef94-4543-aef6-85263b26ff4c" (UID: "0f16ba88-ef94-4543-aef6-85263b26ff4c"). InnerVolumeSpecName "nova-cells-global-config-1". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:42:15 crc kubenswrapper[4895]: I1202 09:42:15.146076 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f16ba88-ef94-4543-aef6-85263b26ff4c-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "0f16ba88-ef94-4543-aef6-85263b26ff4c" (UID: "0f16ba88-ef94-4543-aef6-85263b26ff4c"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:42:15 crc kubenswrapper[4895]: I1202 09:42:15.156087 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f16ba88-ef94-4543-aef6-85263b26ff4c-inventory" (OuterVolumeSpecName: "inventory") pod "0f16ba88-ef94-4543-aef6-85263b26ff4c" (UID: "0f16ba88-ef94-4543-aef6-85263b26ff4c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:42:15 crc kubenswrapper[4895]: I1202 09:42:15.156465 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f16ba88-ef94-4543-aef6-85263b26ff4c-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "0f16ba88-ef94-4543-aef6-85263b26ff4c" (UID: "0f16ba88-ef94-4543-aef6-85263b26ff4c"). InnerVolumeSpecName "nova-migration-ssh-key-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:42:15 crc kubenswrapper[4895]: I1202 09:42:15.205515 4895 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/0f16ba88-ef94-4543-aef6-85263b26ff4c-nova-cells-global-config-0\") on node \"crc\" DevicePath \"\"" Dec 02 09:42:15 crc kubenswrapper[4895]: I1202 09:42:15.205689 4895 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/0f16ba88-ef94-4543-aef6-85263b26ff4c-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Dec 02 09:42:15 crc kubenswrapper[4895]: I1202 09:42:15.205702 4895 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0f16ba88-ef94-4543-aef6-85263b26ff4c-ceph\") on node \"crc\" DevicePath \"\"" Dec 02 09:42:15 crc kubenswrapper[4895]: I1202 09:42:15.205711 4895 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/0f16ba88-ef94-4543-aef6-85263b26ff4c-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Dec 02 09:42:15 crc kubenswrapper[4895]: I1202 09:42:15.205723 4895 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/0f16ba88-ef94-4543-aef6-85263b26ff4c-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Dec 02 09:42:15 crc kubenswrapper[4895]: I1202 09:42:15.205733 4895 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/0f16ba88-ef94-4543-aef6-85263b26ff4c-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Dec 02 09:42:15 crc kubenswrapper[4895]: I1202 09:42:15.205765 4895 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/0f16ba88-ef94-4543-aef6-85263b26ff4c-nova-cell1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 09:42:15 crc kubenswrapper[4895]: I1202 09:42:15.205777 4895 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/0f16ba88-ef94-4543-aef6-85263b26ff4c-nova-cells-global-config-1\") on node \"crc\" DevicePath \"\"" Dec 02 09:42:15 crc kubenswrapper[4895]: I1202 09:42:15.205787 4895 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0f16ba88-ef94-4543-aef6-85263b26ff4c-inventory\") on node \"crc\" DevicePath \"\"" Dec 02 09:42:15 crc kubenswrapper[4895]: I1202 09:42:15.205796 4895 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0f16ba88-ef94-4543-aef6-85263b26ff4c-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 02 09:42:15 crc kubenswrapper[4895]: I1202 09:42:15.205806 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2vrs9\" (UniqueName: \"kubernetes.io/projected/0f16ba88-ef94-4543-aef6-85263b26ff4c-kube-api-access-2vrs9\") on node \"crc\" DevicePath \"\"" Dec 02 09:42:15 crc kubenswrapper[4895]: I1202 09:42:15.556107 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-bmnqb" event={"ID":"0f16ba88-ef94-4543-aef6-85263b26ff4c","Type":"ContainerDied","Data":"b41b6466b98a8e9641fb53c73ed57968030af61b81bebfd6c04d78c9506c6cd0"} Dec 02 09:42:15 crc kubenswrapper[4895]: I1202 09:42:15.556152 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b41b6466b98a8e9641fb53c73ed57968030af61b81bebfd6c04d78c9506c6cd0" Dec 02 09:42:15 crc kubenswrapper[4895]: I1202 09:42:15.556216 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-bmnqb" Dec 02 09:42:15 crc kubenswrapper[4895]: I1202 09:42:15.650026 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-openstack-openstack-cell1-8tpkb"] Dec 02 09:42:15 crc kubenswrapper[4895]: E1202 09:42:15.651302 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f16ba88-ef94-4543-aef6-85263b26ff4c" containerName="nova-cell1-openstack-openstack-cell1" Dec 02 09:42:15 crc kubenswrapper[4895]: I1202 09:42:15.651322 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f16ba88-ef94-4543-aef6-85263b26ff4c" containerName="nova-cell1-openstack-openstack-cell1" Dec 02 09:42:15 crc kubenswrapper[4895]: E1202 09:42:15.651350 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="158c5737-e1ee-4ccb-a3bd-b1b017df222e" containerName="extract-content" Dec 02 09:42:15 crc kubenswrapper[4895]: I1202 09:42:15.651358 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="158c5737-e1ee-4ccb-a3bd-b1b017df222e" containerName="extract-content" Dec 02 09:42:15 crc kubenswrapper[4895]: E1202 09:42:15.651376 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="158c5737-e1ee-4ccb-a3bd-b1b017df222e" containerName="registry-server" Dec 02 09:42:15 crc kubenswrapper[4895]: I1202 09:42:15.651382 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="158c5737-e1ee-4ccb-a3bd-b1b017df222e" containerName="registry-server" Dec 02 09:42:15 crc kubenswrapper[4895]: E1202 09:42:15.651403 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="158c5737-e1ee-4ccb-a3bd-b1b017df222e" containerName="extract-utilities" Dec 02 09:42:15 crc kubenswrapper[4895]: I1202 09:42:15.651419 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="158c5737-e1ee-4ccb-a3bd-b1b017df222e" containerName="extract-utilities" Dec 02 09:42:15 crc kubenswrapper[4895]: I1202 09:42:15.651648 4895 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="158c5737-e1ee-4ccb-a3bd-b1b017df222e" containerName="registry-server" Dec 02 09:42:15 crc kubenswrapper[4895]: I1202 09:42:15.651671 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f16ba88-ef94-4543-aef6-85263b26ff4c" containerName="nova-cell1-openstack-openstack-cell1" Dec 02 09:42:15 crc kubenswrapper[4895]: I1202 09:42:15.652514 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-8tpkb" Dec 02 09:42:15 crc kubenswrapper[4895]: I1202 09:42:15.658319 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Dec 02 09:42:15 crc kubenswrapper[4895]: I1202 09:42:15.658515 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-brvc6" Dec 02 09:42:15 crc kubenswrapper[4895]: I1202 09:42:15.658756 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 02 09:42:15 crc kubenswrapper[4895]: I1202 09:42:15.659453 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 02 09:42:15 crc kubenswrapper[4895]: I1202 09:42:15.661704 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 02 09:42:15 crc kubenswrapper[4895]: I1202 09:42:15.664894 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-openstack-openstack-cell1-8tpkb"] Dec 02 09:42:15 crc kubenswrapper[4895]: I1202 09:42:15.736322 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/2cf8bdc2-2981-4069-aa5c-de35a6d4a246-ceilometer-compute-config-data-2\") pod \"telemetry-openstack-openstack-cell1-8tpkb\" (UID: \"2cf8bdc2-2981-4069-aa5c-de35a6d4a246\") " 
pod="openstack/telemetry-openstack-openstack-cell1-8tpkb" Dec 02 09:42:15 crc kubenswrapper[4895]: I1202 09:42:15.736416 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cf8bdc2-2981-4069-aa5c-de35a6d4a246-telemetry-combined-ca-bundle\") pod \"telemetry-openstack-openstack-cell1-8tpkb\" (UID: \"2cf8bdc2-2981-4069-aa5c-de35a6d4a246\") " pod="openstack/telemetry-openstack-openstack-cell1-8tpkb" Dec 02 09:42:15 crc kubenswrapper[4895]: I1202 09:42:15.736452 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2cf8bdc2-2981-4069-aa5c-de35a6d4a246-ceph\") pod \"telemetry-openstack-openstack-cell1-8tpkb\" (UID: \"2cf8bdc2-2981-4069-aa5c-de35a6d4a246\") " pod="openstack/telemetry-openstack-openstack-cell1-8tpkb" Dec 02 09:42:15 crc kubenswrapper[4895]: I1202 09:42:15.736482 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/2cf8bdc2-2981-4069-aa5c-de35a6d4a246-ceilometer-compute-config-data-0\") pod \"telemetry-openstack-openstack-cell1-8tpkb\" (UID: \"2cf8bdc2-2981-4069-aa5c-de35a6d4a246\") " pod="openstack/telemetry-openstack-openstack-cell1-8tpkb" Dec 02 09:42:15 crc kubenswrapper[4895]: I1202 09:42:15.736539 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2cf8bdc2-2981-4069-aa5c-de35a6d4a246-ssh-key\") pod \"telemetry-openstack-openstack-cell1-8tpkb\" (UID: \"2cf8bdc2-2981-4069-aa5c-de35a6d4a246\") " pod="openstack/telemetry-openstack-openstack-cell1-8tpkb" Dec 02 09:42:15 crc kubenswrapper[4895]: I1202 09:42:15.736627 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-gpz6w\" (UniqueName: \"kubernetes.io/projected/2cf8bdc2-2981-4069-aa5c-de35a6d4a246-kube-api-access-gpz6w\") pod \"telemetry-openstack-openstack-cell1-8tpkb\" (UID: \"2cf8bdc2-2981-4069-aa5c-de35a6d4a246\") " pod="openstack/telemetry-openstack-openstack-cell1-8tpkb" Dec 02 09:42:15 crc kubenswrapper[4895]: I1202 09:42:15.736664 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2cf8bdc2-2981-4069-aa5c-de35a6d4a246-inventory\") pod \"telemetry-openstack-openstack-cell1-8tpkb\" (UID: \"2cf8bdc2-2981-4069-aa5c-de35a6d4a246\") " pod="openstack/telemetry-openstack-openstack-cell1-8tpkb" Dec 02 09:42:15 crc kubenswrapper[4895]: I1202 09:42:15.736711 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/2cf8bdc2-2981-4069-aa5c-de35a6d4a246-ceilometer-compute-config-data-1\") pod \"telemetry-openstack-openstack-cell1-8tpkb\" (UID: \"2cf8bdc2-2981-4069-aa5c-de35a6d4a246\") " pod="openstack/telemetry-openstack-openstack-cell1-8tpkb" Dec 02 09:42:15 crc kubenswrapper[4895]: I1202 09:42:15.838495 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/2cf8bdc2-2981-4069-aa5c-de35a6d4a246-ceilometer-compute-config-data-2\") pod \"telemetry-openstack-openstack-cell1-8tpkb\" (UID: \"2cf8bdc2-2981-4069-aa5c-de35a6d4a246\") " pod="openstack/telemetry-openstack-openstack-cell1-8tpkb" Dec 02 09:42:15 crc kubenswrapper[4895]: I1202 09:42:15.838588 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cf8bdc2-2981-4069-aa5c-de35a6d4a246-telemetry-combined-ca-bundle\") pod \"telemetry-openstack-openstack-cell1-8tpkb\" (UID: 
\"2cf8bdc2-2981-4069-aa5c-de35a6d4a246\") " pod="openstack/telemetry-openstack-openstack-cell1-8tpkb" Dec 02 09:42:15 crc kubenswrapper[4895]: I1202 09:42:15.838624 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2cf8bdc2-2981-4069-aa5c-de35a6d4a246-ceph\") pod \"telemetry-openstack-openstack-cell1-8tpkb\" (UID: \"2cf8bdc2-2981-4069-aa5c-de35a6d4a246\") " pod="openstack/telemetry-openstack-openstack-cell1-8tpkb" Dec 02 09:42:15 crc kubenswrapper[4895]: I1202 09:42:15.838660 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/2cf8bdc2-2981-4069-aa5c-de35a6d4a246-ceilometer-compute-config-data-0\") pod \"telemetry-openstack-openstack-cell1-8tpkb\" (UID: \"2cf8bdc2-2981-4069-aa5c-de35a6d4a246\") " pod="openstack/telemetry-openstack-openstack-cell1-8tpkb" Dec 02 09:42:15 crc kubenswrapper[4895]: I1202 09:42:15.838714 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2cf8bdc2-2981-4069-aa5c-de35a6d4a246-ssh-key\") pod \"telemetry-openstack-openstack-cell1-8tpkb\" (UID: \"2cf8bdc2-2981-4069-aa5c-de35a6d4a246\") " pod="openstack/telemetry-openstack-openstack-cell1-8tpkb" Dec 02 09:42:15 crc kubenswrapper[4895]: I1202 09:42:15.838813 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gpz6w\" (UniqueName: \"kubernetes.io/projected/2cf8bdc2-2981-4069-aa5c-de35a6d4a246-kube-api-access-gpz6w\") pod \"telemetry-openstack-openstack-cell1-8tpkb\" (UID: \"2cf8bdc2-2981-4069-aa5c-de35a6d4a246\") " pod="openstack/telemetry-openstack-openstack-cell1-8tpkb" Dec 02 09:42:15 crc kubenswrapper[4895]: I1202 09:42:15.838844 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/2cf8bdc2-2981-4069-aa5c-de35a6d4a246-inventory\") pod \"telemetry-openstack-openstack-cell1-8tpkb\" (UID: \"2cf8bdc2-2981-4069-aa5c-de35a6d4a246\") " pod="openstack/telemetry-openstack-openstack-cell1-8tpkb" Dec 02 09:42:15 crc kubenswrapper[4895]: I1202 09:42:15.838888 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/2cf8bdc2-2981-4069-aa5c-de35a6d4a246-ceilometer-compute-config-data-1\") pod \"telemetry-openstack-openstack-cell1-8tpkb\" (UID: \"2cf8bdc2-2981-4069-aa5c-de35a6d4a246\") " pod="openstack/telemetry-openstack-openstack-cell1-8tpkb" Dec 02 09:42:15 crc kubenswrapper[4895]: I1202 09:42:15.846147 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cf8bdc2-2981-4069-aa5c-de35a6d4a246-telemetry-combined-ca-bundle\") pod \"telemetry-openstack-openstack-cell1-8tpkb\" (UID: \"2cf8bdc2-2981-4069-aa5c-de35a6d4a246\") " pod="openstack/telemetry-openstack-openstack-cell1-8tpkb" Dec 02 09:42:15 crc kubenswrapper[4895]: I1202 09:42:15.846269 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/2cf8bdc2-2981-4069-aa5c-de35a6d4a246-ceilometer-compute-config-data-0\") pod \"telemetry-openstack-openstack-cell1-8tpkb\" (UID: \"2cf8bdc2-2981-4069-aa5c-de35a6d4a246\") " pod="openstack/telemetry-openstack-openstack-cell1-8tpkb" Dec 02 09:42:15 crc kubenswrapper[4895]: I1202 09:42:15.847074 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/2cf8bdc2-2981-4069-aa5c-de35a6d4a246-ceilometer-compute-config-data-2\") pod \"telemetry-openstack-openstack-cell1-8tpkb\" (UID: \"2cf8bdc2-2981-4069-aa5c-de35a6d4a246\") " pod="openstack/telemetry-openstack-openstack-cell1-8tpkb" Dec 02 
09:42:15 crc kubenswrapper[4895]: I1202 09:42:15.848822 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/2cf8bdc2-2981-4069-aa5c-de35a6d4a246-ceilometer-compute-config-data-1\") pod \"telemetry-openstack-openstack-cell1-8tpkb\" (UID: \"2cf8bdc2-2981-4069-aa5c-de35a6d4a246\") " pod="openstack/telemetry-openstack-openstack-cell1-8tpkb" Dec 02 09:42:15 crc kubenswrapper[4895]: I1202 09:42:15.855658 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2cf8bdc2-2981-4069-aa5c-de35a6d4a246-ssh-key\") pod \"telemetry-openstack-openstack-cell1-8tpkb\" (UID: \"2cf8bdc2-2981-4069-aa5c-de35a6d4a246\") " pod="openstack/telemetry-openstack-openstack-cell1-8tpkb" Dec 02 09:42:15 crc kubenswrapper[4895]: I1202 09:42:15.855986 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2cf8bdc2-2981-4069-aa5c-de35a6d4a246-ceph\") pod \"telemetry-openstack-openstack-cell1-8tpkb\" (UID: \"2cf8bdc2-2981-4069-aa5c-de35a6d4a246\") " pod="openstack/telemetry-openstack-openstack-cell1-8tpkb" Dec 02 09:42:15 crc kubenswrapper[4895]: I1202 09:42:15.856404 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2cf8bdc2-2981-4069-aa5c-de35a6d4a246-inventory\") pod \"telemetry-openstack-openstack-cell1-8tpkb\" (UID: \"2cf8bdc2-2981-4069-aa5c-de35a6d4a246\") " pod="openstack/telemetry-openstack-openstack-cell1-8tpkb" Dec 02 09:42:15 crc kubenswrapper[4895]: I1202 09:42:15.863698 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gpz6w\" (UniqueName: \"kubernetes.io/projected/2cf8bdc2-2981-4069-aa5c-de35a6d4a246-kube-api-access-gpz6w\") pod \"telemetry-openstack-openstack-cell1-8tpkb\" (UID: \"2cf8bdc2-2981-4069-aa5c-de35a6d4a246\") " 
pod="openstack/telemetry-openstack-openstack-cell1-8tpkb" Dec 02 09:42:15 crc kubenswrapper[4895]: I1202 09:42:15.972829 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-8tpkb" Dec 02 09:42:16 crc kubenswrapper[4895]: I1202 09:42:16.588151 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-openstack-openstack-cell1-8tpkb"] Dec 02 09:42:17 crc kubenswrapper[4895]: I1202 09:42:17.579247 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-8tpkb" event={"ID":"2cf8bdc2-2981-4069-aa5c-de35a6d4a246","Type":"ContainerStarted","Data":"b4c2f617186927453f9032727454eb3305c86332bccefea515ef1fac17f9f697"} Dec 02 09:42:17 crc kubenswrapper[4895]: I1202 09:42:17.580047 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-8tpkb" event={"ID":"2cf8bdc2-2981-4069-aa5c-de35a6d4a246","Type":"ContainerStarted","Data":"f276847a1c4b7d43f9de698b4545967dd1a47772d36e78601b8f5a84a2893365"} Dec 02 09:42:17 crc kubenswrapper[4895]: I1202 09:42:17.608895 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-openstack-openstack-cell1-8tpkb" podStartSLOduration=2.425913142 podStartE2EDuration="2.608875838s" podCreationTimestamp="2025-12-02 09:42:15 +0000 UTC" firstStartedPulling="2025-12-02 09:42:16.594203785 +0000 UTC m=+8347.765063398" lastFinishedPulling="2025-12-02 09:42:16.777166481 +0000 UTC m=+8347.948026094" observedRunningTime="2025-12-02 09:42:17.602615293 +0000 UTC m=+8348.773474916" watchObservedRunningTime="2025-12-02 09:42:17.608875838 +0000 UTC m=+8348.779735441" Dec 02 09:43:05 crc kubenswrapper[4895]: I1202 09:43:05.473559 4895 patch_prober.go:28] interesting pod/machine-config-daemon-wfcg7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 09:43:05 crc kubenswrapper[4895]: I1202 09:43:05.474327 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 09:43:35 crc kubenswrapper[4895]: I1202 09:43:35.473080 4895 patch_prober.go:28] interesting pod/machine-config-daemon-wfcg7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 09:43:35 crc kubenswrapper[4895]: I1202 09:43:35.473875 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 09:44:05 crc kubenswrapper[4895]: I1202 09:44:05.473946 4895 patch_prober.go:28] interesting pod/machine-config-daemon-wfcg7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 09:44:05 crc kubenswrapper[4895]: I1202 09:44:05.474494 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 
09:44:05 crc kubenswrapper[4895]: I1202 09:44:05.474542 4895 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" Dec 02 09:44:05 crc kubenswrapper[4895]: I1202 09:44:05.475304 4895 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"76c4faf03a96e2caf1e3feba1cf53c4dfd844f1889ed70d385af4e8c6d45622d"} pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 09:44:05 crc kubenswrapper[4895]: I1202 09:44:05.475359 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerName="machine-config-daemon" containerID="cri-o://76c4faf03a96e2caf1e3feba1cf53c4dfd844f1889ed70d385af4e8c6d45622d" gracePeriod=600 Dec 02 09:44:05 crc kubenswrapper[4895]: E1202 09:44:05.601410 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:44:05 crc kubenswrapper[4895]: I1202 09:44:05.677713 4895 generic.go:334] "Generic (PLEG): container finished" podID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerID="76c4faf03a96e2caf1e3feba1cf53c4dfd844f1889ed70d385af4e8c6d45622d" exitCode=0 Dec 02 09:44:05 crc kubenswrapper[4895]: I1202 09:44:05.677764 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" 
event={"ID":"0468d2d1-a975-45a6-af9f-47adc432fab0","Type":"ContainerDied","Data":"76c4faf03a96e2caf1e3feba1cf53c4dfd844f1889ed70d385af4e8c6d45622d"} Dec 02 09:44:05 crc kubenswrapper[4895]: I1202 09:44:05.677808 4895 scope.go:117] "RemoveContainer" containerID="17b4ce278cfbacb2db258cc73695bc893172418f5a9a87ae12f5dacd0ff48422" Dec 02 09:44:05 crc kubenswrapper[4895]: I1202 09:44:05.678614 4895 scope.go:117] "RemoveContainer" containerID="76c4faf03a96e2caf1e3feba1cf53c4dfd844f1889ed70d385af4e8c6d45622d" Dec 02 09:44:05 crc kubenswrapper[4895]: E1202 09:44:05.678906 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:44:17 crc kubenswrapper[4895]: I1202 09:44:17.143473 4895 scope.go:117] "RemoveContainer" containerID="76c4faf03a96e2caf1e3feba1cf53c4dfd844f1889ed70d385af4e8c6d45622d" Dec 02 09:44:17 crc kubenswrapper[4895]: E1202 09:44:17.145253 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:44:28 crc kubenswrapper[4895]: I1202 09:44:28.141404 4895 scope.go:117] "RemoveContainer" containerID="76c4faf03a96e2caf1e3feba1cf53c4dfd844f1889ed70d385af4e8c6d45622d" Dec 02 09:44:28 crc kubenswrapper[4895]: E1202 09:44:28.142275 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:44:40 crc kubenswrapper[4895]: I1202 09:44:40.141681 4895 scope.go:117] "RemoveContainer" containerID="76c4faf03a96e2caf1e3feba1cf53c4dfd844f1889ed70d385af4e8c6d45622d" Dec 02 09:44:40 crc kubenswrapper[4895]: E1202 09:44:40.142492 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:44:51 crc kubenswrapper[4895]: I1202 09:44:51.817361 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-t9vzt"] Dec 02 09:44:51 crc kubenswrapper[4895]: I1202 09:44:51.831419 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-t9vzt" Dec 02 09:44:51 crc kubenswrapper[4895]: I1202 09:44:51.860392 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-t9vzt"] Dec 02 09:44:51 crc kubenswrapper[4895]: I1202 09:44:51.919449 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/733aa7b9-463c-48e5-9776-e4a4fa1ff41d-catalog-content\") pod \"community-operators-t9vzt\" (UID: \"733aa7b9-463c-48e5-9776-e4a4fa1ff41d\") " pod="openshift-marketplace/community-operators-t9vzt" Dec 02 09:44:51 crc kubenswrapper[4895]: I1202 09:44:51.919737 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/733aa7b9-463c-48e5-9776-e4a4fa1ff41d-utilities\") pod \"community-operators-t9vzt\" (UID: \"733aa7b9-463c-48e5-9776-e4a4fa1ff41d\") " pod="openshift-marketplace/community-operators-t9vzt" Dec 02 09:44:51 crc kubenswrapper[4895]: I1202 09:44:51.919818 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxsm8\" (UniqueName: \"kubernetes.io/projected/733aa7b9-463c-48e5-9776-e4a4fa1ff41d-kube-api-access-lxsm8\") pod \"community-operators-t9vzt\" (UID: \"733aa7b9-463c-48e5-9776-e4a4fa1ff41d\") " pod="openshift-marketplace/community-operators-t9vzt" Dec 02 09:44:52 crc kubenswrapper[4895]: I1202 09:44:52.022756 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/733aa7b9-463c-48e5-9776-e4a4fa1ff41d-utilities\") pod \"community-operators-t9vzt\" (UID: \"733aa7b9-463c-48e5-9776-e4a4fa1ff41d\") " pod="openshift-marketplace/community-operators-t9vzt" Dec 02 09:44:52 crc kubenswrapper[4895]: I1202 09:44:52.022148 4895 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/733aa7b9-463c-48e5-9776-e4a4fa1ff41d-utilities\") pod \"community-operators-t9vzt\" (UID: \"733aa7b9-463c-48e5-9776-e4a4fa1ff41d\") " pod="openshift-marketplace/community-operators-t9vzt" Dec 02 09:44:52 crc kubenswrapper[4895]: I1202 09:44:52.022895 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxsm8\" (UniqueName: \"kubernetes.io/projected/733aa7b9-463c-48e5-9776-e4a4fa1ff41d-kube-api-access-lxsm8\") pod \"community-operators-t9vzt\" (UID: \"733aa7b9-463c-48e5-9776-e4a4fa1ff41d\") " pod="openshift-marketplace/community-operators-t9vzt" Dec 02 09:44:52 crc kubenswrapper[4895]: I1202 09:44:52.023430 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/733aa7b9-463c-48e5-9776-e4a4fa1ff41d-catalog-content\") pod \"community-operators-t9vzt\" (UID: \"733aa7b9-463c-48e5-9776-e4a4fa1ff41d\") " pod="openshift-marketplace/community-operators-t9vzt" Dec 02 09:44:52 crc kubenswrapper[4895]: I1202 09:44:52.023714 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/733aa7b9-463c-48e5-9776-e4a4fa1ff41d-catalog-content\") pod \"community-operators-t9vzt\" (UID: \"733aa7b9-463c-48e5-9776-e4a4fa1ff41d\") " pod="openshift-marketplace/community-operators-t9vzt" Dec 02 09:44:52 crc kubenswrapper[4895]: I1202 09:44:52.042229 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxsm8\" (UniqueName: \"kubernetes.io/projected/733aa7b9-463c-48e5-9776-e4a4fa1ff41d-kube-api-access-lxsm8\") pod \"community-operators-t9vzt\" (UID: \"733aa7b9-463c-48e5-9776-e4a4fa1ff41d\") " pod="openshift-marketplace/community-operators-t9vzt" Dec 02 09:44:52 crc kubenswrapper[4895]: I1202 09:44:52.161150 4895 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/community-operators-t9vzt" Dec 02 09:44:52 crc kubenswrapper[4895]: I1202 09:44:52.763033 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-t9vzt"] Dec 02 09:44:53 crc kubenswrapper[4895]: I1202 09:44:53.211428 4895 generic.go:334] "Generic (PLEG): container finished" podID="733aa7b9-463c-48e5-9776-e4a4fa1ff41d" containerID="10fcee9619c406913331fef9138290c70a50291f959319667320a8c92091b7b2" exitCode=0 Dec 02 09:44:53 crc kubenswrapper[4895]: I1202 09:44:53.211753 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t9vzt" event={"ID":"733aa7b9-463c-48e5-9776-e4a4fa1ff41d","Type":"ContainerDied","Data":"10fcee9619c406913331fef9138290c70a50291f959319667320a8c92091b7b2"} Dec 02 09:44:53 crc kubenswrapper[4895]: I1202 09:44:53.211788 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t9vzt" event={"ID":"733aa7b9-463c-48e5-9776-e4a4fa1ff41d","Type":"ContainerStarted","Data":"76e143d5bde1cbec183af70bd63d8e1bb0ad8881a5d960bcd1d069f2ad6e7ab9"} Dec 02 09:44:54 crc kubenswrapper[4895]: I1202 09:44:54.222796 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t9vzt" event={"ID":"733aa7b9-463c-48e5-9776-e4a4fa1ff41d","Type":"ContainerStarted","Data":"cc2cdd3864e0ca17c0b5d3b3f85a9a518769e2558e27f0916d9d51fb1ed35341"} Dec 02 09:44:55 crc kubenswrapper[4895]: I1202 09:44:55.141435 4895 scope.go:117] "RemoveContainer" containerID="76c4faf03a96e2caf1e3feba1cf53c4dfd844f1889ed70d385af4e8c6d45622d" Dec 02 09:44:55 crc kubenswrapper[4895]: E1202 09:44:55.142391 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:44:55 crc kubenswrapper[4895]: I1202 09:44:55.234539 4895 generic.go:334] "Generic (PLEG): container finished" podID="733aa7b9-463c-48e5-9776-e4a4fa1ff41d" containerID="cc2cdd3864e0ca17c0b5d3b3f85a9a518769e2558e27f0916d9d51fb1ed35341" exitCode=0 Dec 02 09:44:55 crc kubenswrapper[4895]: I1202 09:44:55.234586 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t9vzt" event={"ID":"733aa7b9-463c-48e5-9776-e4a4fa1ff41d","Type":"ContainerDied","Data":"cc2cdd3864e0ca17c0b5d3b3f85a9a518769e2558e27f0916d9d51fb1ed35341"} Dec 02 09:44:56 crc kubenswrapper[4895]: I1202 09:44:56.247654 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t9vzt" event={"ID":"733aa7b9-463c-48e5-9776-e4a4fa1ff41d","Type":"ContainerStarted","Data":"39cce3721e416746ce3e9c7128a2f483c3a52201ce0f93ed49df36b75fdbef31"} Dec 02 09:44:56 crc kubenswrapper[4895]: I1202 09:44:56.284042 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-t9vzt" podStartSLOduration=2.741984645 podStartE2EDuration="5.284015627s" podCreationTimestamp="2025-12-02 09:44:51 +0000 UTC" firstStartedPulling="2025-12-02 09:44:53.213662745 +0000 UTC m=+8504.384522358" lastFinishedPulling="2025-12-02 09:44:55.755693727 +0000 UTC m=+8506.926553340" observedRunningTime="2025-12-02 09:44:56.274647645 +0000 UTC m=+8507.445507278" watchObservedRunningTime="2025-12-02 09:44:56.284015627 +0000 UTC m=+8507.454875240" Dec 02 09:45:00 crc kubenswrapper[4895]: I1202 09:45:00.149116 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411145-z66xq"] Dec 02 09:45:00 crc kubenswrapper[4895]: I1202 
09:45:00.151679 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411145-z66xq" Dec 02 09:45:00 crc kubenswrapper[4895]: I1202 09:45:00.154259 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 02 09:45:00 crc kubenswrapper[4895]: I1202 09:45:00.154505 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 02 09:45:00 crc kubenswrapper[4895]: I1202 09:45:00.210248 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411145-z66xq"] Dec 02 09:45:00 crc kubenswrapper[4895]: I1202 09:45:00.303157 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d5f1c442-ad57-4345-87d9-bb98a6140a45-config-volume\") pod \"collect-profiles-29411145-z66xq\" (UID: \"d5f1c442-ad57-4345-87d9-bb98a6140a45\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411145-z66xq" Dec 02 09:45:00 crc kubenswrapper[4895]: I1202 09:45:00.303310 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d5f1c442-ad57-4345-87d9-bb98a6140a45-secret-volume\") pod \"collect-profiles-29411145-z66xq\" (UID: \"d5f1c442-ad57-4345-87d9-bb98a6140a45\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411145-z66xq" Dec 02 09:45:00 crc kubenswrapper[4895]: I1202 09:45:00.303542 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zv9bx\" (UniqueName: \"kubernetes.io/projected/d5f1c442-ad57-4345-87d9-bb98a6140a45-kube-api-access-zv9bx\") pod \"collect-profiles-29411145-z66xq\" (UID: 
\"d5f1c442-ad57-4345-87d9-bb98a6140a45\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411145-z66xq" Dec 02 09:45:00 crc kubenswrapper[4895]: I1202 09:45:00.405903 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zv9bx\" (UniqueName: \"kubernetes.io/projected/d5f1c442-ad57-4345-87d9-bb98a6140a45-kube-api-access-zv9bx\") pod \"collect-profiles-29411145-z66xq\" (UID: \"d5f1c442-ad57-4345-87d9-bb98a6140a45\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411145-z66xq" Dec 02 09:45:00 crc kubenswrapper[4895]: I1202 09:45:00.406082 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d5f1c442-ad57-4345-87d9-bb98a6140a45-config-volume\") pod \"collect-profiles-29411145-z66xq\" (UID: \"d5f1c442-ad57-4345-87d9-bb98a6140a45\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411145-z66xq" Dec 02 09:45:00 crc kubenswrapper[4895]: I1202 09:45:00.406122 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d5f1c442-ad57-4345-87d9-bb98a6140a45-secret-volume\") pod \"collect-profiles-29411145-z66xq\" (UID: \"d5f1c442-ad57-4345-87d9-bb98a6140a45\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411145-z66xq" Dec 02 09:45:00 crc kubenswrapper[4895]: I1202 09:45:00.407989 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d5f1c442-ad57-4345-87d9-bb98a6140a45-config-volume\") pod \"collect-profiles-29411145-z66xq\" (UID: \"d5f1c442-ad57-4345-87d9-bb98a6140a45\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411145-z66xq" Dec 02 09:45:00 crc kubenswrapper[4895]: I1202 09:45:00.413013 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/d5f1c442-ad57-4345-87d9-bb98a6140a45-secret-volume\") pod \"collect-profiles-29411145-z66xq\" (UID: \"d5f1c442-ad57-4345-87d9-bb98a6140a45\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411145-z66xq" Dec 02 09:45:00 crc kubenswrapper[4895]: I1202 09:45:00.423841 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zv9bx\" (UniqueName: \"kubernetes.io/projected/d5f1c442-ad57-4345-87d9-bb98a6140a45-kube-api-access-zv9bx\") pod \"collect-profiles-29411145-z66xq\" (UID: \"d5f1c442-ad57-4345-87d9-bb98a6140a45\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411145-z66xq" Dec 02 09:45:00 crc kubenswrapper[4895]: I1202 09:45:00.526220 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411145-z66xq" Dec 02 09:45:00 crc kubenswrapper[4895]: I1202 09:45:00.991554 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411145-z66xq"] Dec 02 09:45:01 crc kubenswrapper[4895]: I1202 09:45:01.304325 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411145-z66xq" event={"ID":"d5f1c442-ad57-4345-87d9-bb98a6140a45","Type":"ContainerStarted","Data":"35b5a54bc863b27ec4f9701bbabc5bf53eadf563a8ba5d94f4521c87d73a7e0f"} Dec 02 09:45:01 crc kubenswrapper[4895]: I1202 09:45:01.305613 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411145-z66xq" event={"ID":"d5f1c442-ad57-4345-87d9-bb98a6140a45","Type":"ContainerStarted","Data":"831ed9986a20439ecf0fa30b934b364d7a2414ff528650ae675405c12705ac64"} Dec 02 09:45:01 crc kubenswrapper[4895]: I1202 09:45:01.330493 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29411145-z66xq" 
podStartSLOduration=1.330471639 podStartE2EDuration="1.330471639s" podCreationTimestamp="2025-12-02 09:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:45:01.321947484 +0000 UTC m=+8512.492807097" watchObservedRunningTime="2025-12-02 09:45:01.330471639 +0000 UTC m=+8512.501331252" Dec 02 09:45:02 crc kubenswrapper[4895]: I1202 09:45:02.161363 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-t9vzt" Dec 02 09:45:02 crc kubenswrapper[4895]: I1202 09:45:02.161670 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-t9vzt" Dec 02 09:45:02 crc kubenswrapper[4895]: I1202 09:45:02.213051 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-t9vzt" Dec 02 09:45:02 crc kubenswrapper[4895]: I1202 09:45:02.317518 4895 generic.go:334] "Generic (PLEG): container finished" podID="d5f1c442-ad57-4345-87d9-bb98a6140a45" containerID="35b5a54bc863b27ec4f9701bbabc5bf53eadf563a8ba5d94f4521c87d73a7e0f" exitCode=0 Dec 02 09:45:02 crc kubenswrapper[4895]: I1202 09:45:02.317653 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411145-z66xq" event={"ID":"d5f1c442-ad57-4345-87d9-bb98a6140a45","Type":"ContainerDied","Data":"35b5a54bc863b27ec4f9701bbabc5bf53eadf563a8ba5d94f4521c87d73a7e0f"} Dec 02 09:45:02 crc kubenswrapper[4895]: I1202 09:45:02.369551 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-t9vzt" Dec 02 09:45:02 crc kubenswrapper[4895]: I1202 09:45:02.450567 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-t9vzt"] Dec 02 09:45:03 crc kubenswrapper[4895]: I1202 09:45:03.798136 4895 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411145-z66xq" Dec 02 09:45:03 crc kubenswrapper[4895]: I1202 09:45:03.881055 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d5f1c442-ad57-4345-87d9-bb98a6140a45-config-volume\") pod \"d5f1c442-ad57-4345-87d9-bb98a6140a45\" (UID: \"d5f1c442-ad57-4345-87d9-bb98a6140a45\") " Dec 02 09:45:03 crc kubenswrapper[4895]: I1202 09:45:03.881163 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d5f1c442-ad57-4345-87d9-bb98a6140a45-secret-volume\") pod \"d5f1c442-ad57-4345-87d9-bb98a6140a45\" (UID: \"d5f1c442-ad57-4345-87d9-bb98a6140a45\") " Dec 02 09:45:03 crc kubenswrapper[4895]: I1202 09:45:03.881246 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zv9bx\" (UniqueName: \"kubernetes.io/projected/d5f1c442-ad57-4345-87d9-bb98a6140a45-kube-api-access-zv9bx\") pod \"d5f1c442-ad57-4345-87d9-bb98a6140a45\" (UID: \"d5f1c442-ad57-4345-87d9-bb98a6140a45\") " Dec 02 09:45:03 crc kubenswrapper[4895]: I1202 09:45:03.881930 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5f1c442-ad57-4345-87d9-bb98a6140a45-config-volume" (OuterVolumeSpecName: "config-volume") pod "d5f1c442-ad57-4345-87d9-bb98a6140a45" (UID: "d5f1c442-ad57-4345-87d9-bb98a6140a45"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:45:03 crc kubenswrapper[4895]: I1202 09:45:03.887168 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5f1c442-ad57-4345-87d9-bb98a6140a45-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "d5f1c442-ad57-4345-87d9-bb98a6140a45" (UID: "d5f1c442-ad57-4345-87d9-bb98a6140a45"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:45:03 crc kubenswrapper[4895]: I1202 09:45:03.888709 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5f1c442-ad57-4345-87d9-bb98a6140a45-kube-api-access-zv9bx" (OuterVolumeSpecName: "kube-api-access-zv9bx") pod "d5f1c442-ad57-4345-87d9-bb98a6140a45" (UID: "d5f1c442-ad57-4345-87d9-bb98a6140a45"). InnerVolumeSpecName "kube-api-access-zv9bx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:45:03 crc kubenswrapper[4895]: I1202 09:45:03.983425 4895 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d5f1c442-ad57-4345-87d9-bb98a6140a45-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 02 09:45:03 crc kubenswrapper[4895]: I1202 09:45:03.983466 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zv9bx\" (UniqueName: \"kubernetes.io/projected/d5f1c442-ad57-4345-87d9-bb98a6140a45-kube-api-access-zv9bx\") on node \"crc\" DevicePath \"\"" Dec 02 09:45:03 crc kubenswrapper[4895]: I1202 09:45:03.983476 4895 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d5f1c442-ad57-4345-87d9-bb98a6140a45-config-volume\") on node \"crc\" DevicePath \"\"" Dec 02 09:45:04 crc kubenswrapper[4895]: I1202 09:45:04.341127 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-t9vzt" 
podUID="733aa7b9-463c-48e5-9776-e4a4fa1ff41d" containerName="registry-server" containerID="cri-o://39cce3721e416746ce3e9c7128a2f483c3a52201ce0f93ed49df36b75fdbef31" gracePeriod=2 Dec 02 09:45:04 crc kubenswrapper[4895]: I1202 09:45:04.341241 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411145-z66xq" Dec 02 09:45:04 crc kubenswrapper[4895]: I1202 09:45:04.346906 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411145-z66xq" event={"ID":"d5f1c442-ad57-4345-87d9-bb98a6140a45","Type":"ContainerDied","Data":"831ed9986a20439ecf0fa30b934b364d7a2414ff528650ae675405c12705ac64"} Dec 02 09:45:04 crc kubenswrapper[4895]: I1202 09:45:04.346986 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="831ed9986a20439ecf0fa30b934b364d7a2414ff528650ae675405c12705ac64" Dec 02 09:45:04 crc kubenswrapper[4895]: I1202 09:45:04.439632 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411100-qvr2w"] Dec 02 09:45:04 crc kubenswrapper[4895]: I1202 09:45:04.453593 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411100-qvr2w"] Dec 02 09:45:05 crc kubenswrapper[4895]: I1202 09:45:05.158498 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac99d129-acbe-49c2-8d4f-964620886771" path="/var/lib/kubelet/pods/ac99d129-acbe-49c2-8d4f-964620886771/volumes" Dec 02 09:45:05 crc kubenswrapper[4895]: I1202 09:45:05.353946 4895 generic.go:334] "Generic (PLEG): container finished" podID="733aa7b9-463c-48e5-9776-e4a4fa1ff41d" containerID="39cce3721e416746ce3e9c7128a2f483c3a52201ce0f93ed49df36b75fdbef31" exitCode=0 Dec 02 09:45:05 crc kubenswrapper[4895]: I1202 09:45:05.354044 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-t9vzt" event={"ID":"733aa7b9-463c-48e5-9776-e4a4fa1ff41d","Type":"ContainerDied","Data":"39cce3721e416746ce3e9c7128a2f483c3a52201ce0f93ed49df36b75fdbef31"} Dec 02 09:45:05 crc kubenswrapper[4895]: I1202 09:45:05.354247 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t9vzt" event={"ID":"733aa7b9-463c-48e5-9776-e4a4fa1ff41d","Type":"ContainerDied","Data":"76e143d5bde1cbec183af70bd63d8e1bb0ad8881a5d960bcd1d069f2ad6e7ab9"} Dec 02 09:45:05 crc kubenswrapper[4895]: I1202 09:45:05.354264 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="76e143d5bde1cbec183af70bd63d8e1bb0ad8881a5d960bcd1d069f2ad6e7ab9" Dec 02 09:45:05 crc kubenswrapper[4895]: I1202 09:45:05.368418 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-t9vzt" Dec 02 09:45:05 crc kubenswrapper[4895]: I1202 09:45:05.519726 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/733aa7b9-463c-48e5-9776-e4a4fa1ff41d-catalog-content\") pod \"733aa7b9-463c-48e5-9776-e4a4fa1ff41d\" (UID: \"733aa7b9-463c-48e5-9776-e4a4fa1ff41d\") " Dec 02 09:45:05 crc kubenswrapper[4895]: I1202 09:45:05.520064 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lxsm8\" (UniqueName: \"kubernetes.io/projected/733aa7b9-463c-48e5-9776-e4a4fa1ff41d-kube-api-access-lxsm8\") pod \"733aa7b9-463c-48e5-9776-e4a4fa1ff41d\" (UID: \"733aa7b9-463c-48e5-9776-e4a4fa1ff41d\") " Dec 02 09:45:05 crc kubenswrapper[4895]: I1202 09:45:05.520226 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/733aa7b9-463c-48e5-9776-e4a4fa1ff41d-utilities\") pod \"733aa7b9-463c-48e5-9776-e4a4fa1ff41d\" (UID: 
\"733aa7b9-463c-48e5-9776-e4a4fa1ff41d\") " Dec 02 09:45:05 crc kubenswrapper[4895]: I1202 09:45:05.520910 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/733aa7b9-463c-48e5-9776-e4a4fa1ff41d-utilities" (OuterVolumeSpecName: "utilities") pod "733aa7b9-463c-48e5-9776-e4a4fa1ff41d" (UID: "733aa7b9-463c-48e5-9776-e4a4fa1ff41d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:45:05 crc kubenswrapper[4895]: I1202 09:45:05.529776 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/733aa7b9-463c-48e5-9776-e4a4fa1ff41d-kube-api-access-lxsm8" (OuterVolumeSpecName: "kube-api-access-lxsm8") pod "733aa7b9-463c-48e5-9776-e4a4fa1ff41d" (UID: "733aa7b9-463c-48e5-9776-e4a4fa1ff41d"). InnerVolumeSpecName "kube-api-access-lxsm8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:45:05 crc kubenswrapper[4895]: I1202 09:45:05.574628 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/733aa7b9-463c-48e5-9776-e4a4fa1ff41d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "733aa7b9-463c-48e5-9776-e4a4fa1ff41d" (UID: "733aa7b9-463c-48e5-9776-e4a4fa1ff41d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:45:05 crc kubenswrapper[4895]: I1202 09:45:05.623974 4895 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/733aa7b9-463c-48e5-9776-e4a4fa1ff41d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 09:45:05 crc kubenswrapper[4895]: I1202 09:45:05.624065 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lxsm8\" (UniqueName: \"kubernetes.io/projected/733aa7b9-463c-48e5-9776-e4a4fa1ff41d-kube-api-access-lxsm8\") on node \"crc\" DevicePath \"\"" Dec 02 09:45:05 crc kubenswrapper[4895]: I1202 09:45:05.624080 4895 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/733aa7b9-463c-48e5-9776-e4a4fa1ff41d-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 09:45:06 crc kubenswrapper[4895]: I1202 09:45:06.374850 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-t9vzt" Dec 02 09:45:06 crc kubenswrapper[4895]: I1202 09:45:06.416763 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-t9vzt"] Dec 02 09:45:06 crc kubenswrapper[4895]: I1202 09:45:06.430731 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-t9vzt"] Dec 02 09:45:07 crc kubenswrapper[4895]: I1202 09:45:07.141308 4895 scope.go:117] "RemoveContainer" containerID="76c4faf03a96e2caf1e3feba1cf53c4dfd844f1889ed70d385af4e8c6d45622d" Dec 02 09:45:07 crc kubenswrapper[4895]: E1202 09:45:07.141626 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:45:07 crc kubenswrapper[4895]: I1202 09:45:07.155055 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="733aa7b9-463c-48e5-9776-e4a4fa1ff41d" path="/var/lib/kubelet/pods/733aa7b9-463c-48e5-9776-e4a4fa1ff41d/volumes" Dec 02 09:45:19 crc kubenswrapper[4895]: I1202 09:45:19.152504 4895 scope.go:117] "RemoveContainer" containerID="76c4faf03a96e2caf1e3feba1cf53c4dfd844f1889ed70d385af4e8c6d45622d" Dec 02 09:45:19 crc kubenswrapper[4895]: E1202 09:45:19.153606 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:45:32 crc kubenswrapper[4895]: I1202 09:45:32.149230 4895 scope.go:117] "RemoveContainer" containerID="76c4faf03a96e2caf1e3feba1cf53c4dfd844f1889ed70d385af4e8c6d45622d" Dec 02 09:45:32 crc kubenswrapper[4895]: E1202 09:45:32.151054 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:45:39 crc kubenswrapper[4895]: I1202 09:45:39.677862 4895 scope.go:117] "RemoveContainer" containerID="f5f5b477d01498157cb8d4547cb7c47971159e0b38a12aa11d8ffec5ec66a92c" Dec 02 09:45:46 crc kubenswrapper[4895]: I1202 09:45:46.141880 4895 scope.go:117] "RemoveContainer" 
containerID="76c4faf03a96e2caf1e3feba1cf53c4dfd844f1889ed70d385af4e8c6d45622d" Dec 02 09:45:46 crc kubenswrapper[4895]: E1202 09:45:46.142649 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:45:57 crc kubenswrapper[4895]: I1202 09:45:57.141486 4895 scope.go:117] "RemoveContainer" containerID="76c4faf03a96e2caf1e3feba1cf53c4dfd844f1889ed70d385af4e8c6d45622d" Dec 02 09:45:57 crc kubenswrapper[4895]: E1202 09:45:57.142343 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:46:08 crc kubenswrapper[4895]: I1202 09:46:08.141773 4895 scope.go:117] "RemoveContainer" containerID="76c4faf03a96e2caf1e3feba1cf53c4dfd844f1889ed70d385af4e8c6d45622d" Dec 02 09:46:08 crc kubenswrapper[4895]: E1202 09:46:08.142514 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:46:19 crc kubenswrapper[4895]: I1202 09:46:19.150636 4895 scope.go:117] 
"RemoveContainer" containerID="76c4faf03a96e2caf1e3feba1cf53c4dfd844f1889ed70d385af4e8c6d45622d" Dec 02 09:46:19 crc kubenswrapper[4895]: E1202 09:46:19.151621 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:46:30 crc kubenswrapper[4895]: I1202 09:46:30.142391 4895 scope.go:117] "RemoveContainer" containerID="76c4faf03a96e2caf1e3feba1cf53c4dfd844f1889ed70d385af4e8c6d45622d" Dec 02 09:46:30 crc kubenswrapper[4895]: E1202 09:46:30.143923 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:46:42 crc kubenswrapper[4895]: I1202 09:46:42.141595 4895 scope.go:117] "RemoveContainer" containerID="76c4faf03a96e2caf1e3feba1cf53c4dfd844f1889ed70d385af4e8c6d45622d" Dec 02 09:46:42 crc kubenswrapper[4895]: E1202 09:46:42.142323 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:46:53 crc kubenswrapper[4895]: I1202 09:46:53.148370 
4895 scope.go:117] "RemoveContainer" containerID="76c4faf03a96e2caf1e3feba1cf53c4dfd844f1889ed70d385af4e8c6d45622d" Dec 02 09:46:53 crc kubenswrapper[4895]: E1202 09:46:53.149041 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:47:08 crc kubenswrapper[4895]: I1202 09:47:08.141705 4895 scope.go:117] "RemoveContainer" containerID="76c4faf03a96e2caf1e3feba1cf53c4dfd844f1889ed70d385af4e8c6d45622d" Dec 02 09:47:08 crc kubenswrapper[4895]: E1202 09:47:08.142568 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:47:11 crc kubenswrapper[4895]: I1202 09:47:11.722221 4895 generic.go:334] "Generic (PLEG): container finished" podID="2cf8bdc2-2981-4069-aa5c-de35a6d4a246" containerID="b4c2f617186927453f9032727454eb3305c86332bccefea515ef1fac17f9f697" exitCode=0 Dec 02 09:47:11 crc kubenswrapper[4895]: I1202 09:47:11.722247 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-8tpkb" event={"ID":"2cf8bdc2-2981-4069-aa5c-de35a6d4a246","Type":"ContainerDied","Data":"b4c2f617186927453f9032727454eb3305c86332bccefea515ef1fac17f9f697"} Dec 02 09:47:13 crc kubenswrapper[4895]: I1202 09:47:13.238258 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-8tpkb" Dec 02 09:47:13 crc kubenswrapper[4895]: I1202 09:47:13.372307 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/2cf8bdc2-2981-4069-aa5c-de35a6d4a246-ceilometer-compute-config-data-1\") pod \"2cf8bdc2-2981-4069-aa5c-de35a6d4a246\" (UID: \"2cf8bdc2-2981-4069-aa5c-de35a6d4a246\") " Dec 02 09:47:13 crc kubenswrapper[4895]: I1202 09:47:13.372862 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cf8bdc2-2981-4069-aa5c-de35a6d4a246-telemetry-combined-ca-bundle\") pod \"2cf8bdc2-2981-4069-aa5c-de35a6d4a246\" (UID: \"2cf8bdc2-2981-4069-aa5c-de35a6d4a246\") " Dec 02 09:47:13 crc kubenswrapper[4895]: I1202 09:47:13.372995 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/2cf8bdc2-2981-4069-aa5c-de35a6d4a246-ceilometer-compute-config-data-2\") pod \"2cf8bdc2-2981-4069-aa5c-de35a6d4a246\" (UID: \"2cf8bdc2-2981-4069-aa5c-de35a6d4a246\") " Dec 02 09:47:13 crc kubenswrapper[4895]: I1202 09:47:13.373026 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2cf8bdc2-2981-4069-aa5c-de35a6d4a246-ceph\") pod \"2cf8bdc2-2981-4069-aa5c-de35a6d4a246\" (UID: \"2cf8bdc2-2981-4069-aa5c-de35a6d4a246\") " Dec 02 09:47:13 crc kubenswrapper[4895]: I1202 09:47:13.373066 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/2cf8bdc2-2981-4069-aa5c-de35a6d4a246-ceilometer-compute-config-data-0\") pod \"2cf8bdc2-2981-4069-aa5c-de35a6d4a246\" (UID: \"2cf8bdc2-2981-4069-aa5c-de35a6d4a246\") " Dec 02 09:47:13 crc kubenswrapper[4895]: 
I1202 09:47:13.373106 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2cf8bdc2-2981-4069-aa5c-de35a6d4a246-inventory\") pod \"2cf8bdc2-2981-4069-aa5c-de35a6d4a246\" (UID: \"2cf8bdc2-2981-4069-aa5c-de35a6d4a246\") " Dec 02 09:47:13 crc kubenswrapper[4895]: I1202 09:47:13.373167 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gpz6w\" (UniqueName: \"kubernetes.io/projected/2cf8bdc2-2981-4069-aa5c-de35a6d4a246-kube-api-access-gpz6w\") pod \"2cf8bdc2-2981-4069-aa5c-de35a6d4a246\" (UID: \"2cf8bdc2-2981-4069-aa5c-de35a6d4a246\") " Dec 02 09:47:13 crc kubenswrapper[4895]: I1202 09:47:13.373271 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2cf8bdc2-2981-4069-aa5c-de35a6d4a246-ssh-key\") pod \"2cf8bdc2-2981-4069-aa5c-de35a6d4a246\" (UID: \"2cf8bdc2-2981-4069-aa5c-de35a6d4a246\") " Dec 02 09:47:13 crc kubenswrapper[4895]: I1202 09:47:13.380435 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cf8bdc2-2981-4069-aa5c-de35a6d4a246-ceph" (OuterVolumeSpecName: "ceph") pod "2cf8bdc2-2981-4069-aa5c-de35a6d4a246" (UID: "2cf8bdc2-2981-4069-aa5c-de35a6d4a246"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:47:13 crc kubenswrapper[4895]: I1202 09:47:13.390973 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cf8bdc2-2981-4069-aa5c-de35a6d4a246-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "2cf8bdc2-2981-4069-aa5c-de35a6d4a246" (UID: "2cf8bdc2-2981-4069-aa5c-de35a6d4a246"). InnerVolumeSpecName "telemetry-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:47:13 crc kubenswrapper[4895]: I1202 09:47:13.402707 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2cf8bdc2-2981-4069-aa5c-de35a6d4a246-kube-api-access-gpz6w" (OuterVolumeSpecName: "kube-api-access-gpz6w") pod "2cf8bdc2-2981-4069-aa5c-de35a6d4a246" (UID: "2cf8bdc2-2981-4069-aa5c-de35a6d4a246"). InnerVolumeSpecName "kube-api-access-gpz6w". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:47:13 crc kubenswrapper[4895]: I1202 09:47:13.422893 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cf8bdc2-2981-4069-aa5c-de35a6d4a246-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "2cf8bdc2-2981-4069-aa5c-de35a6d4a246" (UID: "2cf8bdc2-2981-4069-aa5c-de35a6d4a246"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:47:13 crc kubenswrapper[4895]: I1202 09:47:13.428872 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cf8bdc2-2981-4069-aa5c-de35a6d4a246-inventory" (OuterVolumeSpecName: "inventory") pod "2cf8bdc2-2981-4069-aa5c-de35a6d4a246" (UID: "2cf8bdc2-2981-4069-aa5c-de35a6d4a246"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:47:13 crc kubenswrapper[4895]: I1202 09:47:13.442928 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cf8bdc2-2981-4069-aa5c-de35a6d4a246-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "2cf8bdc2-2981-4069-aa5c-de35a6d4a246" (UID: "2cf8bdc2-2981-4069-aa5c-de35a6d4a246"). InnerVolumeSpecName "ceilometer-compute-config-data-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:47:13 crc kubenswrapper[4895]: I1202 09:47:13.448956 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cf8bdc2-2981-4069-aa5c-de35a6d4a246-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "2cf8bdc2-2981-4069-aa5c-de35a6d4a246" (UID: "2cf8bdc2-2981-4069-aa5c-de35a6d4a246"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:47:13 crc kubenswrapper[4895]: I1202 09:47:13.456080 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cf8bdc2-2981-4069-aa5c-de35a6d4a246-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "2cf8bdc2-2981-4069-aa5c-de35a6d4a246" (UID: "2cf8bdc2-2981-4069-aa5c-de35a6d4a246"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:47:13 crc kubenswrapper[4895]: I1202 09:47:13.475931 4895 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/2cf8bdc2-2981-4069-aa5c-de35a6d4a246-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Dec 02 09:47:13 crc kubenswrapper[4895]: I1202 09:47:13.475972 4895 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cf8bdc2-2981-4069-aa5c-de35a6d4a246-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 09:47:13 crc kubenswrapper[4895]: I1202 09:47:13.475988 4895 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/2cf8bdc2-2981-4069-aa5c-de35a6d4a246-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Dec 02 09:47:13 crc kubenswrapper[4895]: I1202 09:47:13.476002 4895 
reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2cf8bdc2-2981-4069-aa5c-de35a6d4a246-ceph\") on node \"crc\" DevicePath \"\"" Dec 02 09:47:13 crc kubenswrapper[4895]: I1202 09:47:13.476018 4895 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/2cf8bdc2-2981-4069-aa5c-de35a6d4a246-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Dec 02 09:47:13 crc kubenswrapper[4895]: I1202 09:47:13.476034 4895 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2cf8bdc2-2981-4069-aa5c-de35a6d4a246-inventory\") on node \"crc\" DevicePath \"\"" Dec 02 09:47:13 crc kubenswrapper[4895]: I1202 09:47:13.476047 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gpz6w\" (UniqueName: \"kubernetes.io/projected/2cf8bdc2-2981-4069-aa5c-de35a6d4a246-kube-api-access-gpz6w\") on node \"crc\" DevicePath \"\"" Dec 02 09:47:13 crc kubenswrapper[4895]: I1202 09:47:13.476058 4895 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2cf8bdc2-2981-4069-aa5c-de35a6d4a246-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 02 09:47:13 crc kubenswrapper[4895]: I1202 09:47:13.747634 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-8tpkb" event={"ID":"2cf8bdc2-2981-4069-aa5c-de35a6d4a246","Type":"ContainerDied","Data":"f276847a1c4b7d43f9de698b4545967dd1a47772d36e78601b8f5a84a2893365"} Dec 02 09:47:13 crc kubenswrapper[4895]: I1202 09:47:13.747689 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f276847a1c4b7d43f9de698b4545967dd1a47772d36e78601b8f5a84a2893365" Dec 02 09:47:13 crc kubenswrapper[4895]: I1202 09:47:13.748000 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-8tpkb" Dec 02 09:47:13 crc kubenswrapper[4895]: I1202 09:47:13.858047 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-sriov-openstack-openstack-cell1-pl55w"] Dec 02 09:47:13 crc kubenswrapper[4895]: E1202 09:47:13.858700 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="733aa7b9-463c-48e5-9776-e4a4fa1ff41d" containerName="registry-server" Dec 02 09:47:13 crc kubenswrapper[4895]: I1202 09:47:13.858719 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="733aa7b9-463c-48e5-9776-e4a4fa1ff41d" containerName="registry-server" Dec 02 09:47:13 crc kubenswrapper[4895]: E1202 09:47:13.858754 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cf8bdc2-2981-4069-aa5c-de35a6d4a246" containerName="telemetry-openstack-openstack-cell1" Dec 02 09:47:13 crc kubenswrapper[4895]: I1202 09:47:13.858763 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cf8bdc2-2981-4069-aa5c-de35a6d4a246" containerName="telemetry-openstack-openstack-cell1" Dec 02 09:47:13 crc kubenswrapper[4895]: E1202 09:47:13.858789 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5f1c442-ad57-4345-87d9-bb98a6140a45" containerName="collect-profiles" Dec 02 09:47:13 crc kubenswrapper[4895]: I1202 09:47:13.858797 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5f1c442-ad57-4345-87d9-bb98a6140a45" containerName="collect-profiles" Dec 02 09:47:13 crc kubenswrapper[4895]: E1202 09:47:13.858816 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="733aa7b9-463c-48e5-9776-e4a4fa1ff41d" containerName="extract-utilities" Dec 02 09:47:13 crc kubenswrapper[4895]: I1202 09:47:13.858824 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="733aa7b9-463c-48e5-9776-e4a4fa1ff41d" containerName="extract-utilities" Dec 02 09:47:13 crc kubenswrapper[4895]: E1202 09:47:13.858841 4895 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="733aa7b9-463c-48e5-9776-e4a4fa1ff41d" containerName="extract-content" Dec 02 09:47:13 crc kubenswrapper[4895]: I1202 09:47:13.858848 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="733aa7b9-463c-48e5-9776-e4a4fa1ff41d" containerName="extract-content" Dec 02 09:47:13 crc kubenswrapper[4895]: I1202 09:47:13.859121 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="733aa7b9-463c-48e5-9776-e4a4fa1ff41d" containerName="registry-server" Dec 02 09:47:13 crc kubenswrapper[4895]: I1202 09:47:13.859139 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5f1c442-ad57-4345-87d9-bb98a6140a45" containerName="collect-profiles" Dec 02 09:47:13 crc kubenswrapper[4895]: I1202 09:47:13.859167 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="2cf8bdc2-2981-4069-aa5c-de35a6d4a246" containerName="telemetry-openstack-openstack-cell1" Dec 02 09:47:13 crc kubenswrapper[4895]: I1202 09:47:13.860168 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-pl55w" Dec 02 09:47:13 crc kubenswrapper[4895]: I1202 09:47:13.863512 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 02 09:47:13 crc kubenswrapper[4895]: I1202 09:47:13.868074 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-sriov-agent-neutron-config" Dec 02 09:47:13 crc kubenswrapper[4895]: I1202 09:47:13.868642 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 02 09:47:13 crc kubenswrapper[4895]: I1202 09:47:13.868964 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 02 09:47:13 crc kubenswrapper[4895]: I1202 09:47:13.869216 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-brvc6" Dec 02 09:47:13 crc kubenswrapper[4895]: I1202 09:47:13.887422 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-sriov-openstack-openstack-cell1-pl55w"] Dec 02 09:47:13 crc kubenswrapper[4895]: I1202 09:47:13.989144 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5496d\" (UniqueName: \"kubernetes.io/projected/ea563647-46a1-45f7-9592-c2f1a842df06-kube-api-access-5496d\") pod \"neutron-sriov-openstack-openstack-cell1-pl55w\" (UID: \"ea563647-46a1-45f7-9592-c2f1a842df06\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-pl55w" Dec 02 09:47:13 crc kubenswrapper[4895]: I1202 09:47:13.989433 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea563647-46a1-45f7-9592-c2f1a842df06-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-openstack-openstack-cell1-pl55w\" (UID: 
\"ea563647-46a1-45f7-9592-c2f1a842df06\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-pl55w" Dec 02 09:47:13 crc kubenswrapper[4895]: I1202 09:47:13.989571 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ea563647-46a1-45f7-9592-c2f1a842df06-ssh-key\") pod \"neutron-sriov-openstack-openstack-cell1-pl55w\" (UID: \"ea563647-46a1-45f7-9592-c2f1a842df06\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-pl55w" Dec 02 09:47:13 crc kubenswrapper[4895]: I1202 09:47:13.989686 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ea563647-46a1-45f7-9592-c2f1a842df06-ceph\") pod \"neutron-sriov-openstack-openstack-cell1-pl55w\" (UID: \"ea563647-46a1-45f7-9592-c2f1a842df06\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-pl55w" Dec 02 09:47:13 crc kubenswrapper[4895]: I1202 09:47:13.989722 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ea563647-46a1-45f7-9592-c2f1a842df06-inventory\") pod \"neutron-sriov-openstack-openstack-cell1-pl55w\" (UID: \"ea563647-46a1-45f7-9592-c2f1a842df06\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-pl55w" Dec 02 09:47:13 crc kubenswrapper[4895]: I1202 09:47:13.989867 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ea563647-46a1-45f7-9592-c2f1a842df06-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-openstack-openstack-cell1-pl55w\" (UID: \"ea563647-46a1-45f7-9592-c2f1a842df06\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-pl55w" Dec 02 09:47:14 crc kubenswrapper[4895]: I1202 09:47:14.091560 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ea563647-46a1-45f7-9592-c2f1a842df06-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-openstack-openstack-cell1-pl55w\" (UID: \"ea563647-46a1-45f7-9592-c2f1a842df06\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-pl55w" Dec 02 09:47:14 crc kubenswrapper[4895]: I1202 09:47:14.091939 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5496d\" (UniqueName: \"kubernetes.io/projected/ea563647-46a1-45f7-9592-c2f1a842df06-kube-api-access-5496d\") pod \"neutron-sriov-openstack-openstack-cell1-pl55w\" (UID: \"ea563647-46a1-45f7-9592-c2f1a842df06\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-pl55w" Dec 02 09:47:14 crc kubenswrapper[4895]: I1202 09:47:14.092020 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea563647-46a1-45f7-9592-c2f1a842df06-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-openstack-openstack-cell1-pl55w\" (UID: \"ea563647-46a1-45f7-9592-c2f1a842df06\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-pl55w" Dec 02 09:47:14 crc kubenswrapper[4895]: I1202 09:47:14.092083 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ea563647-46a1-45f7-9592-c2f1a842df06-ssh-key\") pod \"neutron-sriov-openstack-openstack-cell1-pl55w\" (UID: \"ea563647-46a1-45f7-9592-c2f1a842df06\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-pl55w" Dec 02 09:47:14 crc kubenswrapper[4895]: I1202 09:47:14.092119 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ea563647-46a1-45f7-9592-c2f1a842df06-ceph\") pod \"neutron-sriov-openstack-openstack-cell1-pl55w\" (UID: \"ea563647-46a1-45f7-9592-c2f1a842df06\") " 
pod="openstack/neutron-sriov-openstack-openstack-cell1-pl55w" Dec 02 09:47:14 crc kubenswrapper[4895]: I1202 09:47:14.092138 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ea563647-46a1-45f7-9592-c2f1a842df06-inventory\") pod \"neutron-sriov-openstack-openstack-cell1-pl55w\" (UID: \"ea563647-46a1-45f7-9592-c2f1a842df06\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-pl55w" Dec 02 09:47:14 crc kubenswrapper[4895]: I1202 09:47:14.097209 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ea563647-46a1-45f7-9592-c2f1a842df06-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-openstack-openstack-cell1-pl55w\" (UID: \"ea563647-46a1-45f7-9592-c2f1a842df06\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-pl55w" Dec 02 09:47:14 crc kubenswrapper[4895]: I1202 09:47:14.097388 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ea563647-46a1-45f7-9592-c2f1a842df06-ceph\") pod \"neutron-sriov-openstack-openstack-cell1-pl55w\" (UID: \"ea563647-46a1-45f7-9592-c2f1a842df06\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-pl55w" Dec 02 09:47:14 crc kubenswrapper[4895]: I1202 09:47:14.097830 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ea563647-46a1-45f7-9592-c2f1a842df06-inventory\") pod \"neutron-sriov-openstack-openstack-cell1-pl55w\" (UID: \"ea563647-46a1-45f7-9592-c2f1a842df06\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-pl55w" Dec 02 09:47:14 crc kubenswrapper[4895]: I1202 09:47:14.099105 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ea563647-46a1-45f7-9592-c2f1a842df06-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-openstack-openstack-cell1-pl55w\" (UID: \"ea563647-46a1-45f7-9592-c2f1a842df06\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-pl55w" Dec 02 09:47:14 crc kubenswrapper[4895]: I1202 09:47:14.099756 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ea563647-46a1-45f7-9592-c2f1a842df06-ssh-key\") pod \"neutron-sriov-openstack-openstack-cell1-pl55w\" (UID: \"ea563647-46a1-45f7-9592-c2f1a842df06\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-pl55w" Dec 02 09:47:14 crc kubenswrapper[4895]: I1202 09:47:14.115817 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5496d\" (UniqueName: \"kubernetes.io/projected/ea563647-46a1-45f7-9592-c2f1a842df06-kube-api-access-5496d\") pod \"neutron-sriov-openstack-openstack-cell1-pl55w\" (UID: \"ea563647-46a1-45f7-9592-c2f1a842df06\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-pl55w" Dec 02 09:47:14 crc kubenswrapper[4895]: I1202 09:47:14.209048 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-pl55w" Dec 02 09:47:14 crc kubenswrapper[4895]: I1202 09:47:14.758276 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-sriov-openstack-openstack-cell1-pl55w"] Dec 02 09:47:14 crc kubenswrapper[4895]: I1202 09:47:14.760062 4895 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 02 09:47:15 crc kubenswrapper[4895]: I1202 09:47:15.769983 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-pl55w" event={"ID":"ea563647-46a1-45f7-9592-c2f1a842df06","Type":"ContainerStarted","Data":"3bc281df25a11b9c779cbc480563b6c4d3a75adb5911cf873850c7b6e15f88b6"} Dec 02 09:47:15 crc kubenswrapper[4895]: I1202 09:47:15.770369 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-pl55w" event={"ID":"ea563647-46a1-45f7-9592-c2f1a842df06","Type":"ContainerStarted","Data":"cade6e1ca8fa3dbb49abcddf1edc497a64b0d6cbf670cbb208f3e4dbcd47b873"} Dec 02 09:47:15 crc kubenswrapper[4895]: I1202 09:47:15.793070 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-sriov-openstack-openstack-cell1-pl55w" podStartSLOduration=2.596954667 podStartE2EDuration="2.793048943s" podCreationTimestamp="2025-12-02 09:47:13 +0000 UTC" firstStartedPulling="2025-12-02 09:47:14.759830391 +0000 UTC m=+8645.930690004" lastFinishedPulling="2025-12-02 09:47:14.955924667 +0000 UTC m=+8646.126784280" observedRunningTime="2025-12-02 09:47:15.787533231 +0000 UTC m=+8646.958392844" watchObservedRunningTime="2025-12-02 09:47:15.793048943 +0000 UTC m=+8646.963908566" Dec 02 09:47:19 crc kubenswrapper[4895]: I1202 09:47:19.148639 4895 scope.go:117] "RemoveContainer" containerID="76c4faf03a96e2caf1e3feba1cf53c4dfd844f1889ed70d385af4e8c6d45622d" Dec 02 09:47:19 crc kubenswrapper[4895]: E1202 09:47:19.149554 4895 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:47:31 crc kubenswrapper[4895]: I1202 09:47:31.140951 4895 scope.go:117] "RemoveContainer" containerID="76c4faf03a96e2caf1e3feba1cf53c4dfd844f1889ed70d385af4e8c6d45622d" Dec 02 09:47:31 crc kubenswrapper[4895]: E1202 09:47:31.141565 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:47:44 crc kubenswrapper[4895]: I1202 09:47:44.141437 4895 scope.go:117] "RemoveContainer" containerID="76c4faf03a96e2caf1e3feba1cf53c4dfd844f1889ed70d385af4e8c6d45622d" Dec 02 09:47:44 crc kubenswrapper[4895]: E1202 09:47:44.142200 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:47:56 crc kubenswrapper[4895]: I1202 09:47:56.141511 4895 scope.go:117] "RemoveContainer" containerID="76c4faf03a96e2caf1e3feba1cf53c4dfd844f1889ed70d385af4e8c6d45622d" Dec 02 09:47:56 crc kubenswrapper[4895]: E1202 
09:47:56.143814 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:48:10 crc kubenswrapper[4895]: I1202 09:48:10.141711 4895 scope.go:117] "RemoveContainer" containerID="76c4faf03a96e2caf1e3feba1cf53c4dfd844f1889ed70d385af4e8c6d45622d" Dec 02 09:48:10 crc kubenswrapper[4895]: E1202 09:48:10.142638 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:48:24 crc kubenswrapper[4895]: I1202 09:48:24.142121 4895 scope.go:117] "RemoveContainer" containerID="76c4faf03a96e2caf1e3feba1cf53c4dfd844f1889ed70d385af4e8c6d45622d" Dec 02 09:48:24 crc kubenswrapper[4895]: E1202 09:48:24.143180 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:48:36 crc kubenswrapper[4895]: I1202 09:48:36.141662 4895 scope.go:117] "RemoveContainer" containerID="76c4faf03a96e2caf1e3feba1cf53c4dfd844f1889ed70d385af4e8c6d45622d" Dec 02 09:48:36 crc 
kubenswrapper[4895]: E1202 09:48:36.142987 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:48:50 crc kubenswrapper[4895]: I1202 09:48:50.141554 4895 scope.go:117] "RemoveContainer" containerID="76c4faf03a96e2caf1e3feba1cf53c4dfd844f1889ed70d385af4e8c6d45622d" Dec 02 09:48:50 crc kubenswrapper[4895]: E1202 09:48:50.142388 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:48:52 crc kubenswrapper[4895]: I1202 09:48:52.684081 4895 patch_prober.go:28] interesting pod/oauth-openshift-7f8f9bcd8d-xp9f9 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.58:6443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 02 09:48:52 crc kubenswrapper[4895]: I1202 09:48:52.684912 4895 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-7f8f9bcd8d-xp9f9" podUID="1f730f49-ea4f-48a2-9849-660bf2583047" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.58:6443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 02 09:49:01 crc kubenswrapper[4895]: I1202 09:49:01.141204 4895 
scope.go:117] "RemoveContainer" containerID="76c4faf03a96e2caf1e3feba1cf53c4dfd844f1889ed70d385af4e8c6d45622d" Dec 02 09:49:01 crc kubenswrapper[4895]: E1202 09:49:01.142399 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:49:10 crc kubenswrapper[4895]: I1202 09:49:10.268243 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-zmr6q"] Dec 02 09:49:10 crc kubenswrapper[4895]: I1202 09:49:10.272965 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zmr6q" Dec 02 09:49:10 crc kubenswrapper[4895]: I1202 09:49:10.281717 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zmr6q"] Dec 02 09:49:10 crc kubenswrapper[4895]: I1202 09:49:10.384249 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1b78978-bccc-45c7-b6c6-918d8f64d126-catalog-content\") pod \"certified-operators-zmr6q\" (UID: \"e1b78978-bccc-45c7-b6c6-918d8f64d126\") " pod="openshift-marketplace/certified-operators-zmr6q" Dec 02 09:49:10 crc kubenswrapper[4895]: I1202 09:49:10.384593 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1b78978-bccc-45c7-b6c6-918d8f64d126-utilities\") pod \"certified-operators-zmr6q\" (UID: \"e1b78978-bccc-45c7-b6c6-918d8f64d126\") " pod="openshift-marketplace/certified-operators-zmr6q" Dec 02 09:49:10 crc 
kubenswrapper[4895]: I1202 09:49:10.384954 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sg5cl\" (UniqueName: \"kubernetes.io/projected/e1b78978-bccc-45c7-b6c6-918d8f64d126-kube-api-access-sg5cl\") pod \"certified-operators-zmr6q\" (UID: \"e1b78978-bccc-45c7-b6c6-918d8f64d126\") " pod="openshift-marketplace/certified-operators-zmr6q" Dec 02 09:49:10 crc kubenswrapper[4895]: I1202 09:49:10.487082 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1b78978-bccc-45c7-b6c6-918d8f64d126-catalog-content\") pod \"certified-operators-zmr6q\" (UID: \"e1b78978-bccc-45c7-b6c6-918d8f64d126\") " pod="openshift-marketplace/certified-operators-zmr6q" Dec 02 09:49:10 crc kubenswrapper[4895]: I1202 09:49:10.487244 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1b78978-bccc-45c7-b6c6-918d8f64d126-utilities\") pod \"certified-operators-zmr6q\" (UID: \"e1b78978-bccc-45c7-b6c6-918d8f64d126\") " pod="openshift-marketplace/certified-operators-zmr6q" Dec 02 09:49:10 crc kubenswrapper[4895]: I1202 09:49:10.487333 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sg5cl\" (UniqueName: \"kubernetes.io/projected/e1b78978-bccc-45c7-b6c6-918d8f64d126-kube-api-access-sg5cl\") pod \"certified-operators-zmr6q\" (UID: \"e1b78978-bccc-45c7-b6c6-918d8f64d126\") " pod="openshift-marketplace/certified-operators-zmr6q" Dec 02 09:49:10 crc kubenswrapper[4895]: I1202 09:49:10.487614 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1b78978-bccc-45c7-b6c6-918d8f64d126-catalog-content\") pod \"certified-operators-zmr6q\" (UID: \"e1b78978-bccc-45c7-b6c6-918d8f64d126\") " pod="openshift-marketplace/certified-operators-zmr6q" Dec 02 
09:49:10 crc kubenswrapper[4895]: I1202 09:49:10.488025 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1b78978-bccc-45c7-b6c6-918d8f64d126-utilities\") pod \"certified-operators-zmr6q\" (UID: \"e1b78978-bccc-45c7-b6c6-918d8f64d126\") " pod="openshift-marketplace/certified-operators-zmr6q" Dec 02 09:49:10 crc kubenswrapper[4895]: I1202 09:49:10.857021 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sg5cl\" (UniqueName: \"kubernetes.io/projected/e1b78978-bccc-45c7-b6c6-918d8f64d126-kube-api-access-sg5cl\") pod \"certified-operators-zmr6q\" (UID: \"e1b78978-bccc-45c7-b6c6-918d8f64d126\") " pod="openshift-marketplace/certified-operators-zmr6q" Dec 02 09:49:10 crc kubenswrapper[4895]: I1202 09:49:10.918670 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zmr6q" Dec 02 09:49:11 crc kubenswrapper[4895]: I1202 09:49:11.453775 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zmr6q"] Dec 02 09:49:11 crc kubenswrapper[4895]: W1202 09:49:11.465475 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode1b78978_bccc_45c7_b6c6_918d8f64d126.slice/crio-905c36f23346ce9a35a5ca5b38613ae70fb5f6dbe18854e95f4937b92c5b67a4 WatchSource:0}: Error finding container 905c36f23346ce9a35a5ca5b38613ae70fb5f6dbe18854e95f4937b92c5b67a4: Status 404 returned error can't find the container with id 905c36f23346ce9a35a5ca5b38613ae70fb5f6dbe18854e95f4937b92c5b67a4 Dec 02 09:49:12 crc kubenswrapper[4895]: I1202 09:49:12.018428 4895 generic.go:334] "Generic (PLEG): container finished" podID="e1b78978-bccc-45c7-b6c6-918d8f64d126" containerID="1b5d714af0fca60bed3b4cf65f6d4ca09b6bff5017ab34e848e308e0c0e79efb" exitCode=0 Dec 02 09:49:12 crc kubenswrapper[4895]: I1202 09:49:12.018504 4895 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zmr6q" event={"ID":"e1b78978-bccc-45c7-b6c6-918d8f64d126","Type":"ContainerDied","Data":"1b5d714af0fca60bed3b4cf65f6d4ca09b6bff5017ab34e848e308e0c0e79efb"} Dec 02 09:49:12 crc kubenswrapper[4895]: I1202 09:49:12.019176 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zmr6q" event={"ID":"e1b78978-bccc-45c7-b6c6-918d8f64d126","Type":"ContainerStarted","Data":"905c36f23346ce9a35a5ca5b38613ae70fb5f6dbe18854e95f4937b92c5b67a4"} Dec 02 09:49:14 crc kubenswrapper[4895]: I1202 09:49:14.040423 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zmr6q" event={"ID":"e1b78978-bccc-45c7-b6c6-918d8f64d126","Type":"ContainerStarted","Data":"3fca72dfa16de515cef35e98e2e0e2bc639e8926320d5bcbe0eac6edd49f4acf"} Dec 02 09:49:15 crc kubenswrapper[4895]: I1202 09:49:15.061459 4895 generic.go:334] "Generic (PLEG): container finished" podID="e1b78978-bccc-45c7-b6c6-918d8f64d126" containerID="3fca72dfa16de515cef35e98e2e0e2bc639e8926320d5bcbe0eac6edd49f4acf" exitCode=0 Dec 02 09:49:15 crc kubenswrapper[4895]: I1202 09:49:15.061882 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zmr6q" event={"ID":"e1b78978-bccc-45c7-b6c6-918d8f64d126","Type":"ContainerDied","Data":"3fca72dfa16de515cef35e98e2e0e2bc639e8926320d5bcbe0eac6edd49f4acf"} Dec 02 09:49:15 crc kubenswrapper[4895]: I1202 09:49:15.142454 4895 scope.go:117] "RemoveContainer" containerID="76c4faf03a96e2caf1e3feba1cf53c4dfd844f1889ed70d385af4e8c6d45622d" Dec 02 09:49:16 crc kubenswrapper[4895]: I1202 09:49:16.087561 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" event={"ID":"0468d2d1-a975-45a6-af9f-47adc432fab0","Type":"ContainerStarted","Data":"3875b8aae4bd4661e8a8ff646bd271b0c2ad8bb55fb213065a84de84dd95c15a"} Dec 
02 09:49:16 crc kubenswrapper[4895]: I1202 09:49:16.094940 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zmr6q" event={"ID":"e1b78978-bccc-45c7-b6c6-918d8f64d126","Type":"ContainerStarted","Data":"c8bde5474e81f558512021255804fb4d5f87d39bfe740d4cb7ace68691bbf8e0"} Dec 02 09:49:16 crc kubenswrapper[4895]: I1202 09:49:16.153534 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-zmr6q" podStartSLOduration=2.663903603 podStartE2EDuration="6.15351355s" podCreationTimestamp="2025-12-02 09:49:10 +0000 UTC" firstStartedPulling="2025-12-02 09:49:12.02083063 +0000 UTC m=+8763.191690243" lastFinishedPulling="2025-12-02 09:49:15.510440577 +0000 UTC m=+8766.681300190" observedRunningTime="2025-12-02 09:49:16.137524732 +0000 UTC m=+8767.308384355" watchObservedRunningTime="2025-12-02 09:49:16.15351355 +0000 UTC m=+8767.324373163" Dec 02 09:49:20 crc kubenswrapper[4895]: I1202 09:49:20.919832 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-zmr6q" Dec 02 09:49:20 crc kubenswrapper[4895]: I1202 09:49:20.920550 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-zmr6q" Dec 02 09:49:20 crc kubenswrapper[4895]: I1202 09:49:20.976373 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-zmr6q" Dec 02 09:49:21 crc kubenswrapper[4895]: I1202 09:49:21.191617 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-zmr6q" Dec 02 09:49:21 crc kubenswrapper[4895]: I1202 09:49:21.246723 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zmr6q"] Dec 02 09:49:23 crc kubenswrapper[4895]: I1202 09:49:23.171233 4895 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openshift-marketplace/certified-operators-zmr6q" podUID="e1b78978-bccc-45c7-b6c6-918d8f64d126" containerName="registry-server" containerID="cri-o://c8bde5474e81f558512021255804fb4d5f87d39bfe740d4cb7ace68691bbf8e0" gracePeriod=2 Dec 02 09:49:24 crc kubenswrapper[4895]: I1202 09:49:24.187669 4895 generic.go:334] "Generic (PLEG): container finished" podID="e1b78978-bccc-45c7-b6c6-918d8f64d126" containerID="c8bde5474e81f558512021255804fb4d5f87d39bfe740d4cb7ace68691bbf8e0" exitCode=0 Dec 02 09:49:24 crc kubenswrapper[4895]: I1202 09:49:24.187754 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zmr6q" event={"ID":"e1b78978-bccc-45c7-b6c6-918d8f64d126","Type":"ContainerDied","Data":"c8bde5474e81f558512021255804fb4d5f87d39bfe740d4cb7ace68691bbf8e0"} Dec 02 09:49:24 crc kubenswrapper[4895]: I1202 09:49:24.870373 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zmr6q" Dec 02 09:49:24 crc kubenswrapper[4895]: I1202 09:49:24.925402 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1b78978-bccc-45c7-b6c6-918d8f64d126-catalog-content\") pod \"e1b78978-bccc-45c7-b6c6-918d8f64d126\" (UID: \"e1b78978-bccc-45c7-b6c6-918d8f64d126\") " Dec 02 09:49:24 crc kubenswrapper[4895]: I1202 09:49:24.925512 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sg5cl\" (UniqueName: \"kubernetes.io/projected/e1b78978-bccc-45c7-b6c6-918d8f64d126-kube-api-access-sg5cl\") pod \"e1b78978-bccc-45c7-b6c6-918d8f64d126\" (UID: \"e1b78978-bccc-45c7-b6c6-918d8f64d126\") " Dec 02 09:49:24 crc kubenswrapper[4895]: I1202 09:49:24.925557 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/e1b78978-bccc-45c7-b6c6-918d8f64d126-utilities\") pod \"e1b78978-bccc-45c7-b6c6-918d8f64d126\" (UID: \"e1b78978-bccc-45c7-b6c6-918d8f64d126\") " Dec 02 09:49:24 crc kubenswrapper[4895]: I1202 09:49:24.926613 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e1b78978-bccc-45c7-b6c6-918d8f64d126-utilities" (OuterVolumeSpecName: "utilities") pod "e1b78978-bccc-45c7-b6c6-918d8f64d126" (UID: "e1b78978-bccc-45c7-b6c6-918d8f64d126"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:49:24 crc kubenswrapper[4895]: I1202 09:49:24.936364 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1b78978-bccc-45c7-b6c6-918d8f64d126-kube-api-access-sg5cl" (OuterVolumeSpecName: "kube-api-access-sg5cl") pod "e1b78978-bccc-45c7-b6c6-918d8f64d126" (UID: "e1b78978-bccc-45c7-b6c6-918d8f64d126"). InnerVolumeSpecName "kube-api-access-sg5cl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:49:24 crc kubenswrapper[4895]: I1202 09:49:24.996878 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e1b78978-bccc-45c7-b6c6-918d8f64d126-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e1b78978-bccc-45c7-b6c6-918d8f64d126" (UID: "e1b78978-bccc-45c7-b6c6-918d8f64d126"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:49:25 crc kubenswrapper[4895]: I1202 09:49:25.027831 4895 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1b78978-bccc-45c7-b6c6-918d8f64d126-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 09:49:25 crc kubenswrapper[4895]: I1202 09:49:25.027871 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sg5cl\" (UniqueName: \"kubernetes.io/projected/e1b78978-bccc-45c7-b6c6-918d8f64d126-kube-api-access-sg5cl\") on node \"crc\" DevicePath \"\"" Dec 02 09:49:25 crc kubenswrapper[4895]: I1202 09:49:25.027886 4895 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1b78978-bccc-45c7-b6c6-918d8f64d126-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 09:49:25 crc kubenswrapper[4895]: I1202 09:49:25.210607 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zmr6q" event={"ID":"e1b78978-bccc-45c7-b6c6-918d8f64d126","Type":"ContainerDied","Data":"905c36f23346ce9a35a5ca5b38613ae70fb5f6dbe18854e95f4937b92c5b67a4"} Dec 02 09:49:25 crc kubenswrapper[4895]: I1202 09:49:25.210671 4895 scope.go:117] "RemoveContainer" containerID="c8bde5474e81f558512021255804fb4d5f87d39bfe740d4cb7ace68691bbf8e0" Dec 02 09:49:25 crc kubenswrapper[4895]: I1202 09:49:25.210774 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zmr6q" Dec 02 09:49:25 crc kubenswrapper[4895]: I1202 09:49:25.249536 4895 scope.go:117] "RemoveContainer" containerID="3fca72dfa16de515cef35e98e2e0e2bc639e8926320d5bcbe0eac6edd49f4acf" Dec 02 09:49:25 crc kubenswrapper[4895]: I1202 09:49:25.252272 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zmr6q"] Dec 02 09:49:25 crc kubenswrapper[4895]: I1202 09:49:25.269155 4895 scope.go:117] "RemoveContainer" containerID="1b5d714af0fca60bed3b4cf65f6d4ca09b6bff5017ab34e848e308e0c0e79efb" Dec 02 09:49:25 crc kubenswrapper[4895]: I1202 09:49:25.272697 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-zmr6q"] Dec 02 09:49:27 crc kubenswrapper[4895]: I1202 09:49:27.154420 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1b78978-bccc-45c7-b6c6-918d8f64d126" path="/var/lib/kubelet/pods/e1b78978-bccc-45c7-b6c6-918d8f64d126/volumes" Dec 02 09:49:27 crc kubenswrapper[4895]: I1202 09:49:27.232135 4895 generic.go:334] "Generic (PLEG): container finished" podID="ea563647-46a1-45f7-9592-c2f1a842df06" containerID="3bc281df25a11b9c779cbc480563b6c4d3a75adb5911cf873850c7b6e15f88b6" exitCode=0 Dec 02 09:49:27 crc kubenswrapper[4895]: I1202 09:49:27.232184 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-pl55w" event={"ID":"ea563647-46a1-45f7-9592-c2f1a842df06","Type":"ContainerDied","Data":"3bc281df25a11b9c779cbc480563b6c4d3a75adb5911cf873850c7b6e15f88b6"} Dec 02 09:49:28 crc kubenswrapper[4895]: I1202 09:49:28.753613 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-pl55w" Dec 02 09:49:28 crc kubenswrapper[4895]: I1202 09:49:28.812846 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5496d\" (UniqueName: \"kubernetes.io/projected/ea563647-46a1-45f7-9592-c2f1a842df06-kube-api-access-5496d\") pod \"ea563647-46a1-45f7-9592-c2f1a842df06\" (UID: \"ea563647-46a1-45f7-9592-c2f1a842df06\") " Dec 02 09:49:28 crc kubenswrapper[4895]: I1202 09:49:28.813207 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ea563647-46a1-45f7-9592-c2f1a842df06-neutron-sriov-agent-neutron-config-0\") pod \"ea563647-46a1-45f7-9592-c2f1a842df06\" (UID: \"ea563647-46a1-45f7-9592-c2f1a842df06\") " Dec 02 09:49:28 crc kubenswrapper[4895]: I1202 09:49:28.813268 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ea563647-46a1-45f7-9592-c2f1a842df06-inventory\") pod \"ea563647-46a1-45f7-9592-c2f1a842df06\" (UID: \"ea563647-46a1-45f7-9592-c2f1a842df06\") " Dec 02 09:49:28 crc kubenswrapper[4895]: I1202 09:49:28.813375 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ea563647-46a1-45f7-9592-c2f1a842df06-ssh-key\") pod \"ea563647-46a1-45f7-9592-c2f1a842df06\" (UID: \"ea563647-46a1-45f7-9592-c2f1a842df06\") " Dec 02 09:49:28 crc kubenswrapper[4895]: I1202 09:49:28.813478 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea563647-46a1-45f7-9592-c2f1a842df06-neutron-sriov-combined-ca-bundle\") pod \"ea563647-46a1-45f7-9592-c2f1a842df06\" (UID: \"ea563647-46a1-45f7-9592-c2f1a842df06\") " Dec 02 09:49:28 crc kubenswrapper[4895]: I1202 09:49:28.813512 4895 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ea563647-46a1-45f7-9592-c2f1a842df06-ceph\") pod \"ea563647-46a1-45f7-9592-c2f1a842df06\" (UID: \"ea563647-46a1-45f7-9592-c2f1a842df06\") " Dec 02 09:49:28 crc kubenswrapper[4895]: I1202 09:49:28.823803 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea563647-46a1-45f7-9592-c2f1a842df06-kube-api-access-5496d" (OuterVolumeSpecName: "kube-api-access-5496d") pod "ea563647-46a1-45f7-9592-c2f1a842df06" (UID: "ea563647-46a1-45f7-9592-c2f1a842df06"). InnerVolumeSpecName "kube-api-access-5496d". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:49:28 crc kubenswrapper[4895]: I1202 09:49:28.834173 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea563647-46a1-45f7-9592-c2f1a842df06-neutron-sriov-combined-ca-bundle" (OuterVolumeSpecName: "neutron-sriov-combined-ca-bundle") pod "ea563647-46a1-45f7-9592-c2f1a842df06" (UID: "ea563647-46a1-45f7-9592-c2f1a842df06"). InnerVolumeSpecName "neutron-sriov-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:49:28 crc kubenswrapper[4895]: I1202 09:49:28.837861 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea563647-46a1-45f7-9592-c2f1a842df06-ceph" (OuterVolumeSpecName: "ceph") pod "ea563647-46a1-45f7-9592-c2f1a842df06" (UID: "ea563647-46a1-45f7-9592-c2f1a842df06"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:49:28 crc kubenswrapper[4895]: I1202 09:49:28.856803 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea563647-46a1-45f7-9592-c2f1a842df06-neutron-sriov-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-sriov-agent-neutron-config-0") pod "ea563647-46a1-45f7-9592-c2f1a842df06" (UID: "ea563647-46a1-45f7-9592-c2f1a842df06"). InnerVolumeSpecName "neutron-sriov-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:49:28 crc kubenswrapper[4895]: I1202 09:49:28.860033 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea563647-46a1-45f7-9592-c2f1a842df06-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "ea563647-46a1-45f7-9592-c2f1a842df06" (UID: "ea563647-46a1-45f7-9592-c2f1a842df06"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:49:28 crc kubenswrapper[4895]: I1202 09:49:28.870906 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea563647-46a1-45f7-9592-c2f1a842df06-inventory" (OuterVolumeSpecName: "inventory") pod "ea563647-46a1-45f7-9592-c2f1a842df06" (UID: "ea563647-46a1-45f7-9592-c2f1a842df06"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:49:28 crc kubenswrapper[4895]: I1202 09:49:28.924537 4895 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ea563647-46a1-45f7-9592-c2f1a842df06-inventory\") on node \"crc\" DevicePath \"\"" Dec 02 09:49:28 crc kubenswrapper[4895]: I1202 09:49:28.924581 4895 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ea563647-46a1-45f7-9592-c2f1a842df06-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 02 09:49:28 crc kubenswrapper[4895]: I1202 09:49:28.924596 4895 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea563647-46a1-45f7-9592-c2f1a842df06-neutron-sriov-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 09:49:28 crc kubenswrapper[4895]: I1202 09:49:28.924612 4895 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ea563647-46a1-45f7-9592-c2f1a842df06-ceph\") on node \"crc\" DevicePath \"\"" Dec 02 09:49:28 crc kubenswrapper[4895]: I1202 09:49:28.924627 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5496d\" (UniqueName: \"kubernetes.io/projected/ea563647-46a1-45f7-9592-c2f1a842df06-kube-api-access-5496d\") on node \"crc\" DevicePath \"\"" Dec 02 09:49:28 crc kubenswrapper[4895]: I1202 09:49:28.924642 4895 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ea563647-46a1-45f7-9592-c2f1a842df06-neutron-sriov-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 02 09:49:29 crc kubenswrapper[4895]: I1202 09:49:29.252869 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-pl55w" 
event={"ID":"ea563647-46a1-45f7-9592-c2f1a842df06","Type":"ContainerDied","Data":"cade6e1ca8fa3dbb49abcddf1edc497a64b0d6cbf670cbb208f3e4dbcd47b873"} Dec 02 09:49:29 crc kubenswrapper[4895]: I1202 09:49:29.252922 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cade6e1ca8fa3dbb49abcddf1edc497a64b0d6cbf670cbb208f3e4dbcd47b873" Dec 02 09:49:29 crc kubenswrapper[4895]: I1202 09:49:29.253003 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-pl55w" Dec 02 09:49:29 crc kubenswrapper[4895]: I1202 09:49:29.367802 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-dhcp-openstack-openstack-cell1-qpf4d"] Dec 02 09:49:29 crc kubenswrapper[4895]: E1202 09:49:29.368994 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea563647-46a1-45f7-9592-c2f1a842df06" containerName="neutron-sriov-openstack-openstack-cell1" Dec 02 09:49:29 crc kubenswrapper[4895]: I1202 09:49:29.369090 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea563647-46a1-45f7-9592-c2f1a842df06" containerName="neutron-sriov-openstack-openstack-cell1" Dec 02 09:49:29 crc kubenswrapper[4895]: E1202 09:49:29.369172 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1b78978-bccc-45c7-b6c6-918d8f64d126" containerName="extract-content" Dec 02 09:49:29 crc kubenswrapper[4895]: I1202 09:49:29.369244 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1b78978-bccc-45c7-b6c6-918d8f64d126" containerName="extract-content" Dec 02 09:49:29 crc kubenswrapper[4895]: E1202 09:49:29.369316 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1b78978-bccc-45c7-b6c6-918d8f64d126" containerName="extract-utilities" Dec 02 09:49:29 crc kubenswrapper[4895]: I1202 09:49:29.369369 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1b78978-bccc-45c7-b6c6-918d8f64d126" containerName="extract-utilities" Dec 
02 09:49:29 crc kubenswrapper[4895]: E1202 09:49:29.369479 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1b78978-bccc-45c7-b6c6-918d8f64d126" containerName="registry-server" Dec 02 09:49:29 crc kubenswrapper[4895]: I1202 09:49:29.369532 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1b78978-bccc-45c7-b6c6-918d8f64d126" containerName="registry-server" Dec 02 09:49:29 crc kubenswrapper[4895]: I1202 09:49:29.369939 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1b78978-bccc-45c7-b6c6-918d8f64d126" containerName="registry-server" Dec 02 09:49:29 crc kubenswrapper[4895]: I1202 09:49:29.370038 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea563647-46a1-45f7-9592-c2f1a842df06" containerName="neutron-sriov-openstack-openstack-cell1" Dec 02 09:49:29 crc kubenswrapper[4895]: I1202 09:49:29.371409 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-qpf4d" Dec 02 09:49:29 crc kubenswrapper[4895]: I1202 09:49:29.375477 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 02 09:49:29 crc kubenswrapper[4895]: I1202 09:49:29.375895 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 02 09:49:29 crc kubenswrapper[4895]: I1202 09:49:29.376117 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 02 09:49:29 crc kubenswrapper[4895]: I1202 09:49:29.376935 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-brvc6" Dec 02 09:49:29 crc kubenswrapper[4895]: I1202 09:49:29.380808 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-dhcp-agent-neutron-config" Dec 02 09:49:29 crc kubenswrapper[4895]: I1202 09:49:29.385133 4895 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/neutron-dhcp-openstack-openstack-cell1-qpf4d"] Dec 02 09:49:29 crc kubenswrapper[4895]: I1202 09:49:29.439218 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/40de1ee8-5c68-4155-86f4-55152e72d07e-ssh-key\") pod \"neutron-dhcp-openstack-openstack-cell1-qpf4d\" (UID: \"40de1ee8-5c68-4155-86f4-55152e72d07e\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-qpf4d" Dec 02 09:49:29 crc kubenswrapper[4895]: I1202 09:49:29.439308 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/40de1ee8-5c68-4155-86f4-55152e72d07e-inventory\") pod \"neutron-dhcp-openstack-openstack-cell1-qpf4d\" (UID: \"40de1ee8-5c68-4155-86f4-55152e72d07e\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-qpf4d" Dec 02 09:49:29 crc kubenswrapper[4895]: I1202 09:49:29.439370 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/40de1ee8-5c68-4155-86f4-55152e72d07e-ceph\") pod \"neutron-dhcp-openstack-openstack-cell1-qpf4d\" (UID: \"40de1ee8-5c68-4155-86f4-55152e72d07e\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-qpf4d" Dec 02 09:49:29 crc kubenswrapper[4895]: I1202 09:49:29.439563 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tf5q9\" (UniqueName: \"kubernetes.io/projected/40de1ee8-5c68-4155-86f4-55152e72d07e-kube-api-access-tf5q9\") pod \"neutron-dhcp-openstack-openstack-cell1-qpf4d\" (UID: \"40de1ee8-5c68-4155-86f4-55152e72d07e\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-qpf4d" Dec 02 09:49:29 crc kubenswrapper[4895]: I1202 09:49:29.439639 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/40de1ee8-5c68-4155-86f4-55152e72d07e-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-openstack-openstack-cell1-qpf4d\" (UID: \"40de1ee8-5c68-4155-86f4-55152e72d07e\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-qpf4d" Dec 02 09:49:29 crc kubenswrapper[4895]: I1202 09:49:29.439686 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40de1ee8-5c68-4155-86f4-55152e72d07e-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-openstack-openstack-cell1-qpf4d\" (UID: \"40de1ee8-5c68-4155-86f4-55152e72d07e\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-qpf4d" Dec 02 09:49:29 crc kubenswrapper[4895]: I1202 09:49:29.542183 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/40de1ee8-5c68-4155-86f4-55152e72d07e-ssh-key\") pod \"neutron-dhcp-openstack-openstack-cell1-qpf4d\" (UID: \"40de1ee8-5c68-4155-86f4-55152e72d07e\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-qpf4d" Dec 02 09:49:29 crc kubenswrapper[4895]: I1202 09:49:29.542280 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/40de1ee8-5c68-4155-86f4-55152e72d07e-inventory\") pod \"neutron-dhcp-openstack-openstack-cell1-qpf4d\" (UID: \"40de1ee8-5c68-4155-86f4-55152e72d07e\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-qpf4d" Dec 02 09:49:29 crc kubenswrapper[4895]: I1202 09:49:29.542315 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/40de1ee8-5c68-4155-86f4-55152e72d07e-ceph\") pod \"neutron-dhcp-openstack-openstack-cell1-qpf4d\" (UID: \"40de1ee8-5c68-4155-86f4-55152e72d07e\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-qpf4d" 
Dec 02 09:49:29 crc kubenswrapper[4895]: I1202 09:49:29.542408 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tf5q9\" (UniqueName: \"kubernetes.io/projected/40de1ee8-5c68-4155-86f4-55152e72d07e-kube-api-access-tf5q9\") pod \"neutron-dhcp-openstack-openstack-cell1-qpf4d\" (UID: \"40de1ee8-5c68-4155-86f4-55152e72d07e\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-qpf4d" Dec 02 09:49:29 crc kubenswrapper[4895]: I1202 09:49:29.543095 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/40de1ee8-5c68-4155-86f4-55152e72d07e-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-openstack-openstack-cell1-qpf4d\" (UID: \"40de1ee8-5c68-4155-86f4-55152e72d07e\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-qpf4d" Dec 02 09:49:29 crc kubenswrapper[4895]: I1202 09:49:29.543161 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40de1ee8-5c68-4155-86f4-55152e72d07e-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-openstack-openstack-cell1-qpf4d\" (UID: \"40de1ee8-5c68-4155-86f4-55152e72d07e\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-qpf4d" Dec 02 09:49:29 crc kubenswrapper[4895]: I1202 09:49:29.547389 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/40de1ee8-5c68-4155-86f4-55152e72d07e-ssh-key\") pod \"neutron-dhcp-openstack-openstack-cell1-qpf4d\" (UID: \"40de1ee8-5c68-4155-86f4-55152e72d07e\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-qpf4d" Dec 02 09:49:29 crc kubenswrapper[4895]: I1202 09:49:29.547527 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/40de1ee8-5c68-4155-86f4-55152e72d07e-inventory\") pod 
\"neutron-dhcp-openstack-openstack-cell1-qpf4d\" (UID: \"40de1ee8-5c68-4155-86f4-55152e72d07e\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-qpf4d" Dec 02 09:49:29 crc kubenswrapper[4895]: I1202 09:49:29.548924 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/40de1ee8-5c68-4155-86f4-55152e72d07e-ceph\") pod \"neutron-dhcp-openstack-openstack-cell1-qpf4d\" (UID: \"40de1ee8-5c68-4155-86f4-55152e72d07e\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-qpf4d" Dec 02 09:49:29 crc kubenswrapper[4895]: I1202 09:49:29.549278 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40de1ee8-5c68-4155-86f4-55152e72d07e-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-openstack-openstack-cell1-qpf4d\" (UID: \"40de1ee8-5c68-4155-86f4-55152e72d07e\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-qpf4d" Dec 02 09:49:29 crc kubenswrapper[4895]: I1202 09:49:29.549385 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/40de1ee8-5c68-4155-86f4-55152e72d07e-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-openstack-openstack-cell1-qpf4d\" (UID: \"40de1ee8-5c68-4155-86f4-55152e72d07e\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-qpf4d" Dec 02 09:49:29 crc kubenswrapper[4895]: I1202 09:49:29.567087 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tf5q9\" (UniqueName: \"kubernetes.io/projected/40de1ee8-5c68-4155-86f4-55152e72d07e-kube-api-access-tf5q9\") pod \"neutron-dhcp-openstack-openstack-cell1-qpf4d\" (UID: \"40de1ee8-5c68-4155-86f4-55152e72d07e\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-qpf4d" Dec 02 09:49:29 crc kubenswrapper[4895]: I1202 09:49:29.691042 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-qpf4d" Dec 02 09:49:30 crc kubenswrapper[4895]: I1202 09:49:30.368248 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-dhcp-openstack-openstack-cell1-qpf4d"] Dec 02 09:49:30 crc kubenswrapper[4895]: W1202 09:49:30.368502 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod40de1ee8_5c68_4155_86f4_55152e72d07e.slice/crio-f7d749d0b6520b336c9086d374fb467e89b4a32b172126c6f828e276437d5bcf WatchSource:0}: Error finding container f7d749d0b6520b336c9086d374fb467e89b4a32b172126c6f828e276437d5bcf: Status 404 returned error can't find the container with id f7d749d0b6520b336c9086d374fb467e89b4a32b172126c6f828e276437d5bcf Dec 02 09:49:31 crc kubenswrapper[4895]: I1202 09:49:31.275789 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-qpf4d" event={"ID":"40de1ee8-5c68-4155-86f4-55152e72d07e","Type":"ContainerStarted","Data":"7f88d78e238c208b0894f96ab33dbce8cc282d7709a317409c9c610f81a764e1"} Dec 02 09:49:31 crc kubenswrapper[4895]: I1202 09:49:31.276175 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-qpf4d" event={"ID":"40de1ee8-5c68-4155-86f4-55152e72d07e","Type":"ContainerStarted","Data":"f7d749d0b6520b336c9086d374fb467e89b4a32b172126c6f828e276437d5bcf"} Dec 02 09:49:31 crc kubenswrapper[4895]: I1202 09:49:31.299563 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-dhcp-openstack-openstack-cell1-qpf4d" podStartSLOduration=2.016739468 podStartE2EDuration="2.299524853s" podCreationTimestamp="2025-12-02 09:49:29 +0000 UTC" firstStartedPulling="2025-12-02 09:49:30.371315452 +0000 UTC m=+8781.542175065" lastFinishedPulling="2025-12-02 09:49:30.654100837 +0000 UTC m=+8781.824960450" observedRunningTime="2025-12-02 09:49:31.296411036 
+0000 UTC m=+8782.467270659" watchObservedRunningTime="2025-12-02 09:49:31.299524853 +0000 UTC m=+8782.470384476" Dec 02 09:50:25 crc kubenswrapper[4895]: I1202 09:50:25.007469 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-wdnkf"] Dec 02 09:50:25 crc kubenswrapper[4895]: I1202 09:50:25.010611 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wdnkf" Dec 02 09:50:25 crc kubenswrapper[4895]: I1202 09:50:25.020714 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wdnkf"] Dec 02 09:50:25 crc kubenswrapper[4895]: I1202 09:50:25.178685 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5bcf23f4-9c15-4555-a978-481160374d4b-catalog-content\") pod \"redhat-operators-wdnkf\" (UID: \"5bcf23f4-9c15-4555-a978-481160374d4b\") " pod="openshift-marketplace/redhat-operators-wdnkf" Dec 02 09:50:25 crc kubenswrapper[4895]: I1202 09:50:25.179117 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6x9pj\" (UniqueName: \"kubernetes.io/projected/5bcf23f4-9c15-4555-a978-481160374d4b-kube-api-access-6x9pj\") pod \"redhat-operators-wdnkf\" (UID: \"5bcf23f4-9c15-4555-a978-481160374d4b\") " pod="openshift-marketplace/redhat-operators-wdnkf" Dec 02 09:50:25 crc kubenswrapper[4895]: I1202 09:50:25.179239 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5bcf23f4-9c15-4555-a978-481160374d4b-utilities\") pod \"redhat-operators-wdnkf\" (UID: \"5bcf23f4-9c15-4555-a978-481160374d4b\") " pod="openshift-marketplace/redhat-operators-wdnkf" Dec 02 09:50:25 crc kubenswrapper[4895]: I1202 09:50:25.281099 4895 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5bcf23f4-9c15-4555-a978-481160374d4b-catalog-content\") pod \"redhat-operators-wdnkf\" (UID: \"5bcf23f4-9c15-4555-a978-481160374d4b\") " pod="openshift-marketplace/redhat-operators-wdnkf" Dec 02 09:50:25 crc kubenswrapper[4895]: I1202 09:50:25.281237 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6x9pj\" (UniqueName: \"kubernetes.io/projected/5bcf23f4-9c15-4555-a978-481160374d4b-kube-api-access-6x9pj\") pod \"redhat-operators-wdnkf\" (UID: \"5bcf23f4-9c15-4555-a978-481160374d4b\") " pod="openshift-marketplace/redhat-operators-wdnkf" Dec 02 09:50:25 crc kubenswrapper[4895]: I1202 09:50:25.281276 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5bcf23f4-9c15-4555-a978-481160374d4b-utilities\") pod \"redhat-operators-wdnkf\" (UID: \"5bcf23f4-9c15-4555-a978-481160374d4b\") " pod="openshift-marketplace/redhat-operators-wdnkf" Dec 02 09:50:25 crc kubenswrapper[4895]: I1202 09:50:25.281647 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5bcf23f4-9c15-4555-a978-481160374d4b-catalog-content\") pod \"redhat-operators-wdnkf\" (UID: \"5bcf23f4-9c15-4555-a978-481160374d4b\") " pod="openshift-marketplace/redhat-operators-wdnkf" Dec 02 09:50:25 crc kubenswrapper[4895]: I1202 09:50:25.281876 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5bcf23f4-9c15-4555-a978-481160374d4b-utilities\") pod \"redhat-operators-wdnkf\" (UID: \"5bcf23f4-9c15-4555-a978-481160374d4b\") " pod="openshift-marketplace/redhat-operators-wdnkf" Dec 02 09:50:25 crc kubenswrapper[4895]: I1202 09:50:25.301957 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6x9pj\" (UniqueName: 
\"kubernetes.io/projected/5bcf23f4-9c15-4555-a978-481160374d4b-kube-api-access-6x9pj\") pod \"redhat-operators-wdnkf\" (UID: \"5bcf23f4-9c15-4555-a978-481160374d4b\") " pod="openshift-marketplace/redhat-operators-wdnkf" Dec 02 09:50:25 crc kubenswrapper[4895]: I1202 09:50:25.341491 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wdnkf" Dec 02 09:50:25 crc kubenswrapper[4895]: I1202 09:50:25.881639 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wdnkf"] Dec 02 09:50:26 crc kubenswrapper[4895]: I1202 09:50:26.869027 4895 generic.go:334] "Generic (PLEG): container finished" podID="5bcf23f4-9c15-4555-a978-481160374d4b" containerID="c7b3c40bf91c7e828075dd78d6c11e83c46356928884f7cf52c5b3c7f9d084d9" exitCode=0 Dec 02 09:50:26 crc kubenswrapper[4895]: I1202 09:50:26.869134 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wdnkf" event={"ID":"5bcf23f4-9c15-4555-a978-481160374d4b","Type":"ContainerDied","Data":"c7b3c40bf91c7e828075dd78d6c11e83c46356928884f7cf52c5b3c7f9d084d9"} Dec 02 09:50:26 crc kubenswrapper[4895]: I1202 09:50:26.870172 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wdnkf" event={"ID":"5bcf23f4-9c15-4555-a978-481160374d4b","Type":"ContainerStarted","Data":"9bb949adac0976db3a184f64f29b251a04274288b648c52f17ddc714a1b05d29"} Dec 02 09:50:29 crc kubenswrapper[4895]: I1202 09:50:29.898553 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wdnkf" event={"ID":"5bcf23f4-9c15-4555-a978-481160374d4b","Type":"ContainerStarted","Data":"60955e76b9ae44a2b2af220ca17f0704d258b18bc11cd1bf652a395449af1d35"} Dec 02 09:50:33 crc kubenswrapper[4895]: I1202 09:50:33.949347 4895 generic.go:334] "Generic (PLEG): container finished" podID="5bcf23f4-9c15-4555-a978-481160374d4b" 
containerID="60955e76b9ae44a2b2af220ca17f0704d258b18bc11cd1bf652a395449af1d35" exitCode=0 Dec 02 09:50:33 crc kubenswrapper[4895]: I1202 09:50:33.949478 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wdnkf" event={"ID":"5bcf23f4-9c15-4555-a978-481160374d4b","Type":"ContainerDied","Data":"60955e76b9ae44a2b2af220ca17f0704d258b18bc11cd1bf652a395449af1d35"} Dec 02 09:50:36 crc kubenswrapper[4895]: I1202 09:50:36.984817 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wdnkf" event={"ID":"5bcf23f4-9c15-4555-a978-481160374d4b","Type":"ContainerStarted","Data":"ee34f17d4b351f769693e377522b1329f6ba165438405acd911b575a7859bb2b"} Dec 02 09:50:37 crc kubenswrapper[4895]: I1202 09:50:37.018067 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-wdnkf" podStartSLOduration=3.220981501 podStartE2EDuration="13.018047383s" podCreationTimestamp="2025-12-02 09:50:24 +0000 UTC" firstStartedPulling="2025-12-02 09:50:26.871793768 +0000 UTC m=+8838.042653381" lastFinishedPulling="2025-12-02 09:50:36.66885965 +0000 UTC m=+8847.839719263" observedRunningTime="2025-12-02 09:50:37.012960894 +0000 UTC m=+8848.183820517" watchObservedRunningTime="2025-12-02 09:50:37.018047383 +0000 UTC m=+8848.188906996" Dec 02 09:50:45 crc kubenswrapper[4895]: I1202 09:50:45.342452 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-wdnkf" Dec 02 09:50:45 crc kubenswrapper[4895]: I1202 09:50:45.343355 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-wdnkf" Dec 02 09:50:46 crc kubenswrapper[4895]: I1202 09:50:46.397489 4895 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-wdnkf" podUID="5bcf23f4-9c15-4555-a978-481160374d4b" containerName="registry-server" 
probeResult="failure" output=< Dec 02 09:50:46 crc kubenswrapper[4895]: timeout: failed to connect service ":50051" within 1s Dec 02 09:50:46 crc kubenswrapper[4895]: > Dec 02 09:50:55 crc kubenswrapper[4895]: I1202 09:50:55.992305 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-wdnkf" Dec 02 09:50:56 crc kubenswrapper[4895]: I1202 09:50:56.048360 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-wdnkf" Dec 02 09:50:56 crc kubenswrapper[4895]: I1202 09:50:56.231333 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wdnkf"] Dec 02 09:50:57 crc kubenswrapper[4895]: I1202 09:50:57.193811 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-wdnkf" podUID="5bcf23f4-9c15-4555-a978-481160374d4b" containerName="registry-server" containerID="cri-o://ee34f17d4b351f769693e377522b1329f6ba165438405acd911b575a7859bb2b" gracePeriod=2 Dec 02 09:50:58 crc kubenswrapper[4895]: I1202 09:50:58.206420 4895 generic.go:334] "Generic (PLEG): container finished" podID="5bcf23f4-9c15-4555-a978-481160374d4b" containerID="ee34f17d4b351f769693e377522b1329f6ba165438405acd911b575a7859bb2b" exitCode=0 Dec 02 09:50:58 crc kubenswrapper[4895]: I1202 09:50:58.206512 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wdnkf" event={"ID":"5bcf23f4-9c15-4555-a978-481160374d4b","Type":"ContainerDied","Data":"ee34f17d4b351f769693e377522b1329f6ba165438405acd911b575a7859bb2b"} Dec 02 09:50:58 crc kubenswrapper[4895]: I1202 09:50:58.206843 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wdnkf" event={"ID":"5bcf23f4-9c15-4555-a978-481160374d4b","Type":"ContainerDied","Data":"9bb949adac0976db3a184f64f29b251a04274288b648c52f17ddc714a1b05d29"} Dec 02 09:50:58 crc 
kubenswrapper[4895]: I1202 09:50:58.206863 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9bb949adac0976db3a184f64f29b251a04274288b648c52f17ddc714a1b05d29" Dec 02 09:50:58 crc kubenswrapper[4895]: I1202 09:50:58.217219 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wdnkf" Dec 02 09:50:58 crc kubenswrapper[4895]: I1202 09:50:58.357286 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5bcf23f4-9c15-4555-a978-481160374d4b-utilities\") pod \"5bcf23f4-9c15-4555-a978-481160374d4b\" (UID: \"5bcf23f4-9c15-4555-a978-481160374d4b\") " Dec 02 09:50:58 crc kubenswrapper[4895]: I1202 09:50:58.357690 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6x9pj\" (UniqueName: \"kubernetes.io/projected/5bcf23f4-9c15-4555-a978-481160374d4b-kube-api-access-6x9pj\") pod \"5bcf23f4-9c15-4555-a978-481160374d4b\" (UID: \"5bcf23f4-9c15-4555-a978-481160374d4b\") " Dec 02 09:50:58 crc kubenswrapper[4895]: I1202 09:50:58.357793 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5bcf23f4-9c15-4555-a978-481160374d4b-catalog-content\") pod \"5bcf23f4-9c15-4555-a978-481160374d4b\" (UID: \"5bcf23f4-9c15-4555-a978-481160374d4b\") " Dec 02 09:50:58 crc kubenswrapper[4895]: I1202 09:50:58.358902 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5bcf23f4-9c15-4555-a978-481160374d4b-utilities" (OuterVolumeSpecName: "utilities") pod "5bcf23f4-9c15-4555-a978-481160374d4b" (UID: "5bcf23f4-9c15-4555-a978-481160374d4b"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:50:58 crc kubenswrapper[4895]: I1202 09:50:58.364028 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5bcf23f4-9c15-4555-a978-481160374d4b-kube-api-access-6x9pj" (OuterVolumeSpecName: "kube-api-access-6x9pj") pod "5bcf23f4-9c15-4555-a978-481160374d4b" (UID: "5bcf23f4-9c15-4555-a978-481160374d4b"). InnerVolumeSpecName "kube-api-access-6x9pj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:50:58 crc kubenswrapper[4895]: I1202 09:50:58.460761 4895 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5bcf23f4-9c15-4555-a978-481160374d4b-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 09:50:58 crc kubenswrapper[4895]: I1202 09:50:58.460810 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6x9pj\" (UniqueName: \"kubernetes.io/projected/5bcf23f4-9c15-4555-a978-481160374d4b-kube-api-access-6x9pj\") on node \"crc\" DevicePath \"\"" Dec 02 09:50:58 crc kubenswrapper[4895]: I1202 09:50:58.483546 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5bcf23f4-9c15-4555-a978-481160374d4b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5bcf23f4-9c15-4555-a978-481160374d4b" (UID: "5bcf23f4-9c15-4555-a978-481160374d4b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:50:58 crc kubenswrapper[4895]: I1202 09:50:58.563276 4895 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5bcf23f4-9c15-4555-a978-481160374d4b-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 09:50:59 crc kubenswrapper[4895]: I1202 09:50:59.225681 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wdnkf" Dec 02 09:50:59 crc kubenswrapper[4895]: I1202 09:50:59.259074 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wdnkf"] Dec 02 09:50:59 crc kubenswrapper[4895]: I1202 09:50:59.270300 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-wdnkf"] Dec 02 09:51:01 crc kubenswrapper[4895]: I1202 09:51:01.153425 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5bcf23f4-9c15-4555-a978-481160374d4b" path="/var/lib/kubelet/pods/5bcf23f4-9c15-4555-a978-481160374d4b/volumes" Dec 02 09:51:18 crc kubenswrapper[4895]: I1202 09:51:18.427241 4895 generic.go:334] "Generic (PLEG): container finished" podID="40de1ee8-5c68-4155-86f4-55152e72d07e" containerID="7f88d78e238c208b0894f96ab33dbce8cc282d7709a317409c9c610f81a764e1" exitCode=0 Dec 02 09:51:18 crc kubenswrapper[4895]: I1202 09:51:18.427359 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-qpf4d" event={"ID":"40de1ee8-5c68-4155-86f4-55152e72d07e","Type":"ContainerDied","Data":"7f88d78e238c208b0894f96ab33dbce8cc282d7709a317409c9c610f81a764e1"} Dec 02 09:51:19 crc kubenswrapper[4895]: I1202 09:51:19.861026 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-qpf4d" Dec 02 09:51:19 crc kubenswrapper[4895]: I1202 09:51:19.930412 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/40de1ee8-5c68-4155-86f4-55152e72d07e-ssh-key\") pod \"40de1ee8-5c68-4155-86f4-55152e72d07e\" (UID: \"40de1ee8-5c68-4155-86f4-55152e72d07e\") " Dec 02 09:51:19 crc kubenswrapper[4895]: I1202 09:51:19.930547 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/40de1ee8-5c68-4155-86f4-55152e72d07e-ceph\") pod \"40de1ee8-5c68-4155-86f4-55152e72d07e\" (UID: \"40de1ee8-5c68-4155-86f4-55152e72d07e\") " Dec 02 09:51:19 crc kubenswrapper[4895]: I1202 09:51:19.930623 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/40de1ee8-5c68-4155-86f4-55152e72d07e-neutron-dhcp-agent-neutron-config-0\") pod \"40de1ee8-5c68-4155-86f4-55152e72d07e\" (UID: \"40de1ee8-5c68-4155-86f4-55152e72d07e\") " Dec 02 09:51:19 crc kubenswrapper[4895]: I1202 09:51:19.930655 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tf5q9\" (UniqueName: \"kubernetes.io/projected/40de1ee8-5c68-4155-86f4-55152e72d07e-kube-api-access-tf5q9\") pod \"40de1ee8-5c68-4155-86f4-55152e72d07e\" (UID: \"40de1ee8-5c68-4155-86f4-55152e72d07e\") " Dec 02 09:51:19 crc kubenswrapper[4895]: I1202 09:51:19.930755 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40de1ee8-5c68-4155-86f4-55152e72d07e-neutron-dhcp-combined-ca-bundle\") pod \"40de1ee8-5c68-4155-86f4-55152e72d07e\" (UID: \"40de1ee8-5c68-4155-86f4-55152e72d07e\") " Dec 02 09:51:19 crc kubenswrapper[4895]: I1202 09:51:19.930867 4895 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/40de1ee8-5c68-4155-86f4-55152e72d07e-inventory\") pod \"40de1ee8-5c68-4155-86f4-55152e72d07e\" (UID: \"40de1ee8-5c68-4155-86f4-55152e72d07e\") " Dec 02 09:51:19 crc kubenswrapper[4895]: I1202 09:51:19.942604 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40de1ee8-5c68-4155-86f4-55152e72d07e-ceph" (OuterVolumeSpecName: "ceph") pod "40de1ee8-5c68-4155-86f4-55152e72d07e" (UID: "40de1ee8-5c68-4155-86f4-55152e72d07e"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:51:19 crc kubenswrapper[4895]: I1202 09:51:19.947862 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40de1ee8-5c68-4155-86f4-55152e72d07e-neutron-dhcp-combined-ca-bundle" (OuterVolumeSpecName: "neutron-dhcp-combined-ca-bundle") pod "40de1ee8-5c68-4155-86f4-55152e72d07e" (UID: "40de1ee8-5c68-4155-86f4-55152e72d07e"). InnerVolumeSpecName "neutron-dhcp-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:51:19 crc kubenswrapper[4895]: I1202 09:51:19.951083 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40de1ee8-5c68-4155-86f4-55152e72d07e-kube-api-access-tf5q9" (OuterVolumeSpecName: "kube-api-access-tf5q9") pod "40de1ee8-5c68-4155-86f4-55152e72d07e" (UID: "40de1ee8-5c68-4155-86f4-55152e72d07e"). InnerVolumeSpecName "kube-api-access-tf5q9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:51:19 crc kubenswrapper[4895]: E1202 09:51:19.959248 4895 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/40de1ee8-5c68-4155-86f4-55152e72d07e-neutron-dhcp-agent-neutron-config-0 podName:40de1ee8-5c68-4155-86f4-55152e72d07e nodeName:}" failed. 
No retries permitted until 2025-12-02 09:51:20.4592072 +0000 UTC m=+8891.630066813 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "neutron-dhcp-agent-neutron-config-0" (UniqueName: "kubernetes.io/secret/40de1ee8-5c68-4155-86f4-55152e72d07e-neutron-dhcp-agent-neutron-config-0") pod "40de1ee8-5c68-4155-86f4-55152e72d07e" (UID: "40de1ee8-5c68-4155-86f4-55152e72d07e") : error deleting /var/lib/kubelet/pods/40de1ee8-5c68-4155-86f4-55152e72d07e/volume-subpaths: remove /var/lib/kubelet/pods/40de1ee8-5c68-4155-86f4-55152e72d07e/volume-subpaths: no such file or directory Dec 02 09:51:19 crc kubenswrapper[4895]: I1202 09:51:19.960887 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40de1ee8-5c68-4155-86f4-55152e72d07e-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "40de1ee8-5c68-4155-86f4-55152e72d07e" (UID: "40de1ee8-5c68-4155-86f4-55152e72d07e"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:51:19 crc kubenswrapper[4895]: I1202 09:51:19.961484 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40de1ee8-5c68-4155-86f4-55152e72d07e-inventory" (OuterVolumeSpecName: "inventory") pod "40de1ee8-5c68-4155-86f4-55152e72d07e" (UID: "40de1ee8-5c68-4155-86f4-55152e72d07e"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:51:20 crc kubenswrapper[4895]: I1202 09:51:20.034686 4895 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40de1ee8-5c68-4155-86f4-55152e72d07e-neutron-dhcp-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 09:51:20 crc kubenswrapper[4895]: I1202 09:51:20.034731 4895 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/40de1ee8-5c68-4155-86f4-55152e72d07e-inventory\") on node \"crc\" DevicePath \"\"" Dec 02 09:51:20 crc kubenswrapper[4895]: I1202 09:51:20.034775 4895 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/40de1ee8-5c68-4155-86f4-55152e72d07e-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 02 09:51:20 crc kubenswrapper[4895]: I1202 09:51:20.034787 4895 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/40de1ee8-5c68-4155-86f4-55152e72d07e-ceph\") on node \"crc\" DevicePath \"\"" Dec 02 09:51:20 crc kubenswrapper[4895]: I1202 09:51:20.034800 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tf5q9\" (UniqueName: \"kubernetes.io/projected/40de1ee8-5c68-4155-86f4-55152e72d07e-kube-api-access-tf5q9\") on node \"crc\" DevicePath \"\"" Dec 02 09:51:20 crc kubenswrapper[4895]: I1202 09:51:20.447148 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-qpf4d" event={"ID":"40de1ee8-5c68-4155-86f4-55152e72d07e","Type":"ContainerDied","Data":"f7d749d0b6520b336c9086d374fb467e89b4a32b172126c6f828e276437d5bcf"} Dec 02 09:51:20 crc kubenswrapper[4895]: I1202 09:51:20.447796 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f7d749d0b6520b336c9086d374fb467e89b4a32b172126c6f828e276437d5bcf" Dec 02 09:51:20 crc kubenswrapper[4895]: I1202 09:51:20.447219 4895 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-qpf4d" Dec 02 09:51:20 crc kubenswrapper[4895]: I1202 09:51:20.544983 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/40de1ee8-5c68-4155-86f4-55152e72d07e-neutron-dhcp-agent-neutron-config-0\") pod \"40de1ee8-5c68-4155-86f4-55152e72d07e\" (UID: \"40de1ee8-5c68-4155-86f4-55152e72d07e\") " Dec 02 09:51:20 crc kubenswrapper[4895]: I1202 09:51:20.550204 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40de1ee8-5c68-4155-86f4-55152e72d07e-neutron-dhcp-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-dhcp-agent-neutron-config-0") pod "40de1ee8-5c68-4155-86f4-55152e72d07e" (UID: "40de1ee8-5c68-4155-86f4-55152e72d07e"). InnerVolumeSpecName "neutron-dhcp-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:51:20 crc kubenswrapper[4895]: I1202 09:51:20.648231 4895 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/40de1ee8-5c68-4155-86f4-55152e72d07e-neutron-dhcp-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 02 09:51:35 crc kubenswrapper[4895]: I1202 09:51:35.473392 4895 patch_prober.go:28] interesting pod/machine-config-daemon-wfcg7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 09:51:35 crc kubenswrapper[4895]: I1202 09:51:35.474063 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 09:51:39 crc kubenswrapper[4895]: I1202 09:51:39.851978 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 02 09:51:39 crc kubenswrapper[4895]: I1202 09:51:39.852859 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="fdd34d7c-19d4-482a-aa19-6535eb26640e" containerName="nova-cell0-conductor-conductor" containerID="cri-o://f7efcba9302ac8e05dd7acec27d676a85be1fff02ac30f14281f93658c884165" gracePeriod=30 Dec 02 09:51:39 crc kubenswrapper[4895]: I1202 09:51:39.869706 4895 scope.go:117] "RemoveContainer" containerID="39cce3721e416746ce3e9c7128a2f483c3a52201ce0f93ed49df36b75fdbef31" Dec 02 09:51:39 crc kubenswrapper[4895]: I1202 09:51:39.877440 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 02 09:51:39 crc kubenswrapper[4895]: I1202 09:51:39.877935 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="23d0ddc1-b566-4537-ac82-544ff5e098f3" containerName="nova-cell1-conductor-conductor" containerID="cri-o://a6f2da1b37d403b45d1abfda3e00d6af0e663675c9da018a034ae3293bda12d5" gracePeriod=30 Dec 02 09:51:39 crc kubenswrapper[4895]: I1202 09:51:39.895947 4895 scope.go:117] "RemoveContainer" containerID="cc2cdd3864e0ca17c0b5d3b3f85a9a518769e2558e27f0916d9d51fb1ed35341" Dec 02 09:51:39 crc kubenswrapper[4895]: I1202 09:51:39.920862 4895 scope.go:117] "RemoveContainer" containerID="10fcee9619c406913331fef9138290c70a50291f959319667320a8c92091b7b2" Dec 02 09:51:40 crc kubenswrapper[4895]: I1202 09:51:40.574053 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 09:51:40 crc kubenswrapper[4895]: I1202 09:51:40.574641 4895 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/nova-scheduler-0" podUID="807ab313-d84c-4059-aa53-4d99c8c65192" containerName="nova-scheduler-scheduler" containerID="cri-o://afffba8deef3d71484a2795782463d54514cd234af1063a3f87ea98981166f17" gracePeriod=30 Dec 02 09:51:40 crc kubenswrapper[4895]: I1202 09:51:40.589058 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 02 09:51:40 crc kubenswrapper[4895]: I1202 09:51:40.589292 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="aa8c76cd-9852-45cc-82fc-c9ee472f94c2" containerName="nova-api-log" containerID="cri-o://0e4f72afbb3acdc4ac9d10239e9306ac7cf741136eeba97e54ae544e864df0ee" gracePeriod=30 Dec 02 09:51:40 crc kubenswrapper[4895]: I1202 09:51:40.589433 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="aa8c76cd-9852-45cc-82fc-c9ee472f94c2" containerName="nova-api-api" containerID="cri-o://c49402e43fd9cbd8200434847bc5b4025bc3e7dba9f2532b8298027a4fd520c0" gracePeriod=30 Dec 02 09:51:40 crc kubenswrapper[4895]: I1202 09:51:40.636938 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 09:51:40 crc kubenswrapper[4895]: I1202 09:51:40.637326 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="f364e779-d2db-4f23-bc99-1d0b91dca426" containerName="nova-metadata-log" containerID="cri-o://9072d4bc261c498a84069b6e9a1da6de906bf13dd131c0fab31a4dc8c0bf1a5a" gracePeriod=30 Dec 02 09:51:40 crc kubenswrapper[4895]: I1202 09:51:40.637326 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="f364e779-d2db-4f23-bc99-1d0b91dca426" containerName="nova-metadata-metadata" containerID="cri-o://a51dcaf54c4ee699ab41c3d2b5d38e088599d8df028d506caafbb1c565212378" gracePeriod=30 Dec 02 09:51:40 crc kubenswrapper[4895]: I1202 09:51:40.701124 4895 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgkqcz"] Dec 02 09:51:40 crc kubenswrapper[4895]: E1202 09:51:40.703976 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bcf23f4-9c15-4555-a978-481160374d4b" containerName="extract-utilities" Dec 02 09:51:40 crc kubenswrapper[4895]: I1202 09:51:40.704009 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bcf23f4-9c15-4555-a978-481160374d4b" containerName="extract-utilities" Dec 02 09:51:40 crc kubenswrapper[4895]: E1202 09:51:40.704042 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40de1ee8-5c68-4155-86f4-55152e72d07e" containerName="neutron-dhcp-openstack-openstack-cell1" Dec 02 09:51:40 crc kubenswrapper[4895]: I1202 09:51:40.704052 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="40de1ee8-5c68-4155-86f4-55152e72d07e" containerName="neutron-dhcp-openstack-openstack-cell1" Dec 02 09:51:40 crc kubenswrapper[4895]: E1202 09:51:40.704077 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bcf23f4-9c15-4555-a978-481160374d4b" containerName="registry-server" Dec 02 09:51:40 crc kubenswrapper[4895]: I1202 09:51:40.704085 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bcf23f4-9c15-4555-a978-481160374d4b" containerName="registry-server" Dec 02 09:51:40 crc kubenswrapper[4895]: E1202 09:51:40.704103 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bcf23f4-9c15-4555-a978-481160374d4b" containerName="extract-content" Dec 02 09:51:40 crc kubenswrapper[4895]: I1202 09:51:40.704111 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bcf23f4-9c15-4555-a978-481160374d4b" containerName="extract-content" Dec 02 09:51:40 crc kubenswrapper[4895]: I1202 09:51:40.704395 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="40de1ee8-5c68-4155-86f4-55152e72d07e" containerName="neutron-dhcp-openstack-openstack-cell1" Dec 02 09:51:40 crc 
kubenswrapper[4895]: I1202 09:51:40.704439 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="5bcf23f4-9c15-4555-a978-481160374d4b" containerName="registry-server" Dec 02 09:51:40 crc kubenswrapper[4895]: I1202 09:51:40.705256 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgkqcz" Dec 02 09:51:40 crc kubenswrapper[4895]: I1202 09:51:40.711027 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 02 09:51:40 crc kubenswrapper[4895]: I1202 09:51:40.711064 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Dec 02 09:51:40 crc kubenswrapper[4895]: I1202 09:51:40.711227 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-brvc6" Dec 02 09:51:40 crc kubenswrapper[4895]: I1202 09:51:40.721857 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgkqcz"] Dec 02 09:51:40 crc kubenswrapper[4895]: I1202 09:51:40.737216 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 02 09:51:40 crc kubenswrapper[4895]: I1202 09:51:40.737505 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Dec 02 09:51:40 crc kubenswrapper[4895]: I1202 09:51:40.737768 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 02 09:51:40 crc kubenswrapper[4895]: I1202 09:51:40.737889 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-cells-global-config" Dec 02 09:51:40 crc kubenswrapper[4895]: I1202 09:51:40.766155 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-0\" 
(UniqueName: \"kubernetes.io/configmap/427eea9a-0bfb-4a1a-a225-c4264018fd13-nova-cells-global-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgkqcz\" (UID: \"427eea9a-0bfb-4a1a-a225-c4264018fd13\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgkqcz" Dec 02 09:51:40 crc kubenswrapper[4895]: I1202 09:51:40.766225 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/427eea9a-0bfb-4a1a-a225-c4264018fd13-ssh-key\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgkqcz\" (UID: \"427eea9a-0bfb-4a1a-a225-c4264018fd13\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgkqcz" Dec 02 09:51:40 crc kubenswrapper[4895]: I1202 09:51:40.766289 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/427eea9a-0bfb-4a1a-a225-c4264018fd13-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgkqcz\" (UID: \"427eea9a-0bfb-4a1a-a225-c4264018fd13\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgkqcz" Dec 02 09:51:40 crc kubenswrapper[4895]: I1202 09:51:40.766379 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/427eea9a-0bfb-4a1a-a225-c4264018fd13-inventory\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgkqcz\" (UID: \"427eea9a-0bfb-4a1a-a225-c4264018fd13\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgkqcz" Dec 02 09:51:40 crc kubenswrapper[4895]: I1202 09:51:40.766418 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: 
\"kubernetes.io/secret/427eea9a-0bfb-4a1a-a225-c4264018fd13-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgkqcz\" (UID: \"427eea9a-0bfb-4a1a-a225-c4264018fd13\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgkqcz" Dec 02 09:51:40 crc kubenswrapper[4895]: I1202 09:51:40.766460 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/427eea9a-0bfb-4a1a-a225-c4264018fd13-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgkqcz\" (UID: \"427eea9a-0bfb-4a1a-a225-c4264018fd13\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgkqcz" Dec 02 09:51:40 crc kubenswrapper[4895]: I1202 09:51:40.766522 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/427eea9a-0bfb-4a1a-a225-c4264018fd13-ceph\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgkqcz\" (UID: \"427eea9a-0bfb-4a1a-a225-c4264018fd13\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgkqcz" Dec 02 09:51:40 crc kubenswrapper[4895]: I1202 09:51:40.766568 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/427eea9a-0bfb-4a1a-a225-c4264018fd13-nova-cells-global-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgkqcz\" (UID: \"427eea9a-0bfb-4a1a-a225-c4264018fd13\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgkqcz" Dec 02 09:51:40 crc kubenswrapper[4895]: I1202 09:51:40.766600 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: 
\"kubernetes.io/secret/427eea9a-0bfb-4a1a-a225-c4264018fd13-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgkqcz\" (UID: \"427eea9a-0bfb-4a1a-a225-c4264018fd13\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgkqcz" Dec 02 09:51:40 crc kubenswrapper[4895]: I1202 09:51:40.766679 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c86pw\" (UniqueName: \"kubernetes.io/projected/427eea9a-0bfb-4a1a-a225-c4264018fd13-kube-api-access-c86pw\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgkqcz\" (UID: \"427eea9a-0bfb-4a1a-a225-c4264018fd13\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgkqcz" Dec 02 09:51:40 crc kubenswrapper[4895]: I1202 09:51:40.766715 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/427eea9a-0bfb-4a1a-a225-c4264018fd13-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgkqcz\" (UID: \"427eea9a-0bfb-4a1a-a225-c4264018fd13\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgkqcz" Dec 02 09:51:40 crc kubenswrapper[4895]: I1202 09:51:40.857892 4895 generic.go:334] "Generic (PLEG): container finished" podID="aa8c76cd-9852-45cc-82fc-c9ee472f94c2" containerID="0e4f72afbb3acdc4ac9d10239e9306ac7cf741136eeba97e54ae544e864df0ee" exitCode=143 Dec 02 09:51:40 crc kubenswrapper[4895]: I1202 09:51:40.857945 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"aa8c76cd-9852-45cc-82fc-c9ee472f94c2","Type":"ContainerDied","Data":"0e4f72afbb3acdc4ac9d10239e9306ac7cf741136eeba97e54ae544e864df0ee"} Dec 02 09:51:40 crc kubenswrapper[4895]: I1202 09:51:40.870054 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-c86pw\" (UniqueName: \"kubernetes.io/projected/427eea9a-0bfb-4a1a-a225-c4264018fd13-kube-api-access-c86pw\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgkqcz\" (UID: \"427eea9a-0bfb-4a1a-a225-c4264018fd13\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgkqcz" Dec 02 09:51:40 crc kubenswrapper[4895]: I1202 09:51:40.870124 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/427eea9a-0bfb-4a1a-a225-c4264018fd13-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgkqcz\" (UID: \"427eea9a-0bfb-4a1a-a225-c4264018fd13\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgkqcz" Dec 02 09:51:40 crc kubenswrapper[4895]: I1202 09:51:40.870181 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/427eea9a-0bfb-4a1a-a225-c4264018fd13-nova-cells-global-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgkqcz\" (UID: \"427eea9a-0bfb-4a1a-a225-c4264018fd13\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgkqcz" Dec 02 09:51:40 crc kubenswrapper[4895]: I1202 09:51:40.870203 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/427eea9a-0bfb-4a1a-a225-c4264018fd13-ssh-key\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgkqcz\" (UID: \"427eea9a-0bfb-4a1a-a225-c4264018fd13\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgkqcz" Dec 02 09:51:40 crc kubenswrapper[4895]: I1202 09:51:40.870249 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/427eea9a-0bfb-4a1a-a225-c4264018fd13-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgkqcz\" (UID: \"427eea9a-0bfb-4a1a-a225-c4264018fd13\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgkqcz" Dec 02 09:51:40 crc kubenswrapper[4895]: I1202 09:51:40.870304 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/427eea9a-0bfb-4a1a-a225-c4264018fd13-inventory\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgkqcz\" (UID: \"427eea9a-0bfb-4a1a-a225-c4264018fd13\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgkqcz" Dec 02 09:51:40 crc kubenswrapper[4895]: I1202 09:51:40.870324 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/427eea9a-0bfb-4a1a-a225-c4264018fd13-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgkqcz\" (UID: \"427eea9a-0bfb-4a1a-a225-c4264018fd13\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgkqcz" Dec 02 09:51:40 crc kubenswrapper[4895]: I1202 09:51:40.870348 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/427eea9a-0bfb-4a1a-a225-c4264018fd13-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgkqcz\" (UID: \"427eea9a-0bfb-4a1a-a225-c4264018fd13\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgkqcz" Dec 02 09:51:40 crc kubenswrapper[4895]: I1202 09:51:40.870385 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/427eea9a-0bfb-4a1a-a225-c4264018fd13-ceph\") pod 
\"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgkqcz\" (UID: \"427eea9a-0bfb-4a1a-a225-c4264018fd13\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgkqcz" Dec 02 09:51:40 crc kubenswrapper[4895]: I1202 09:51:40.870411 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/427eea9a-0bfb-4a1a-a225-c4264018fd13-nova-cells-global-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgkqcz\" (UID: \"427eea9a-0bfb-4a1a-a225-c4264018fd13\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgkqcz" Dec 02 09:51:40 crc kubenswrapper[4895]: I1202 09:51:40.870433 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/427eea9a-0bfb-4a1a-a225-c4264018fd13-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgkqcz\" (UID: \"427eea9a-0bfb-4a1a-a225-c4264018fd13\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgkqcz" Dec 02 09:51:40 crc kubenswrapper[4895]: I1202 09:51:40.879678 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/427eea9a-0bfb-4a1a-a225-c4264018fd13-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgkqcz\" (UID: \"427eea9a-0bfb-4a1a-a225-c4264018fd13\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgkqcz" Dec 02 09:51:40 crc kubenswrapper[4895]: I1202 09:51:40.886787 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/427eea9a-0bfb-4a1a-a225-c4264018fd13-inventory\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgkqcz\" (UID: \"427eea9a-0bfb-4a1a-a225-c4264018fd13\") " 
pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgkqcz" Dec 02 09:51:40 crc kubenswrapper[4895]: I1202 09:51:40.887392 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/427eea9a-0bfb-4a1a-a225-c4264018fd13-nova-cells-global-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgkqcz\" (UID: \"427eea9a-0bfb-4a1a-a225-c4264018fd13\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgkqcz" Dec 02 09:51:40 crc kubenswrapper[4895]: I1202 09:51:40.887792 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/427eea9a-0bfb-4a1a-a225-c4264018fd13-ceph\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgkqcz\" (UID: \"427eea9a-0bfb-4a1a-a225-c4264018fd13\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgkqcz" Dec 02 09:51:40 crc kubenswrapper[4895]: I1202 09:51:40.887869 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/427eea9a-0bfb-4a1a-a225-c4264018fd13-nova-cells-global-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgkqcz\" (UID: \"427eea9a-0bfb-4a1a-a225-c4264018fd13\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgkqcz" Dec 02 09:51:40 crc kubenswrapper[4895]: I1202 09:51:40.889591 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/427eea9a-0bfb-4a1a-a225-c4264018fd13-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgkqcz\" (UID: \"427eea9a-0bfb-4a1a-a225-c4264018fd13\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgkqcz" Dec 02 09:51:40 crc kubenswrapper[4895]: I1202 09:51:40.891318 4895 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/427eea9a-0bfb-4a1a-a225-c4264018fd13-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgkqcz\" (UID: \"427eea9a-0bfb-4a1a-a225-c4264018fd13\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgkqcz" Dec 02 09:51:40 crc kubenswrapper[4895]: I1202 09:51:40.897374 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/427eea9a-0bfb-4a1a-a225-c4264018fd13-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgkqcz\" (UID: \"427eea9a-0bfb-4a1a-a225-c4264018fd13\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgkqcz" Dec 02 09:51:40 crc kubenswrapper[4895]: I1202 09:51:40.908183 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/427eea9a-0bfb-4a1a-a225-c4264018fd13-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgkqcz\" (UID: \"427eea9a-0bfb-4a1a-a225-c4264018fd13\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgkqcz" Dec 02 09:51:40 crc kubenswrapper[4895]: I1202 09:51:40.910264 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/427eea9a-0bfb-4a1a-a225-c4264018fd13-ssh-key\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgkqcz\" (UID: \"427eea9a-0bfb-4a1a-a225-c4264018fd13\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgkqcz" Dec 02 09:51:40 crc kubenswrapper[4895]: I1202 09:51:40.918098 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c86pw\" (UniqueName: 
\"kubernetes.io/projected/427eea9a-0bfb-4a1a-a225-c4264018fd13-kube-api-access-c86pw\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgkqcz\" (UID: \"427eea9a-0bfb-4a1a-a225-c4264018fd13\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgkqcz" Dec 02 09:51:41 crc kubenswrapper[4895]: I1202 09:51:41.145459 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgkqcz" Dec 02 09:51:41 crc kubenswrapper[4895]: I1202 09:51:41.833503 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgkqcz"] Dec 02 09:51:41 crc kubenswrapper[4895]: I1202 09:51:41.870882 4895 generic.go:334] "Generic (PLEG): container finished" podID="f364e779-d2db-4f23-bc99-1d0b91dca426" containerID="9072d4bc261c498a84069b6e9a1da6de906bf13dd131c0fab31a4dc8c0bf1a5a" exitCode=143 Dec 02 09:51:41 crc kubenswrapper[4895]: I1202 09:51:41.871011 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f364e779-d2db-4f23-bc99-1d0b91dca426","Type":"ContainerDied","Data":"9072d4bc261c498a84069b6e9a1da6de906bf13dd131c0fab31a4dc8c0bf1a5a"} Dec 02 09:51:41 crc kubenswrapper[4895]: I1202 09:51:41.872908 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgkqcz" event={"ID":"427eea9a-0bfb-4a1a-a225-c4264018fd13","Type":"ContainerStarted","Data":"943008a52d482ca4466ba793a4b33e3ecda0fe9003e58305af26af81fb4c1996"} Dec 02 09:51:41 crc kubenswrapper[4895]: I1202 09:51:41.874395 4895 generic.go:334] "Generic (PLEG): container finished" podID="23d0ddc1-b566-4537-ac82-544ff5e098f3" containerID="a6f2da1b37d403b45d1abfda3e00d6af0e663675c9da018a034ae3293bda12d5" exitCode=0 Dec 02 09:51:41 crc kubenswrapper[4895]: I1202 09:51:41.874419 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell1-conductor-0" event={"ID":"23d0ddc1-b566-4537-ac82-544ff5e098f3","Type":"ContainerDied","Data":"a6f2da1b37d403b45d1abfda3e00d6af0e663675c9da018a034ae3293bda12d5"} Dec 02 09:51:41 crc kubenswrapper[4895]: E1202 09:51:41.933633 4895 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a6f2da1b37d403b45d1abfda3e00d6af0e663675c9da018a034ae3293bda12d5 is running failed: container process not found" containerID="a6f2da1b37d403b45d1abfda3e00d6af0e663675c9da018a034ae3293bda12d5" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 02 09:51:41 crc kubenswrapper[4895]: E1202 09:51:41.933966 4895 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a6f2da1b37d403b45d1abfda3e00d6af0e663675c9da018a034ae3293bda12d5 is running failed: container process not found" containerID="a6f2da1b37d403b45d1abfda3e00d6af0e663675c9da018a034ae3293bda12d5" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 02 09:51:41 crc kubenswrapper[4895]: E1202 09:51:41.934518 4895 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a6f2da1b37d403b45d1abfda3e00d6af0e663675c9da018a034ae3293bda12d5 is running failed: container process not found" containerID="a6f2da1b37d403b45d1abfda3e00d6af0e663675c9da018a034ae3293bda12d5" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 02 09:51:41 crc kubenswrapper[4895]: E1202 09:51:41.934609 4895 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a6f2da1b37d403b45d1abfda3e00d6af0e663675c9da018a034ae3293bda12d5 is running failed: container process not found" probeType="Readiness" pod="openstack/nova-cell1-conductor-0" podUID="23d0ddc1-b566-4537-ac82-544ff5e098f3" 
containerName="nova-cell1-conductor-conductor" Dec 02 09:51:42 crc kubenswrapper[4895]: I1202 09:51:42.252777 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 02 09:51:42 crc kubenswrapper[4895]: I1202 09:51:42.349003 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 02 09:51:42 crc kubenswrapper[4895]: I1202 09:51:42.389915 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7wsft\" (UniqueName: \"kubernetes.io/projected/23d0ddc1-b566-4537-ac82-544ff5e098f3-kube-api-access-7wsft\") pod \"23d0ddc1-b566-4537-ac82-544ff5e098f3\" (UID: \"23d0ddc1-b566-4537-ac82-544ff5e098f3\") " Dec 02 09:51:42 crc kubenswrapper[4895]: I1202 09:51:42.389965 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23d0ddc1-b566-4537-ac82-544ff5e098f3-combined-ca-bundle\") pod \"23d0ddc1-b566-4537-ac82-544ff5e098f3\" (UID: \"23d0ddc1-b566-4537-ac82-544ff5e098f3\") " Dec 02 09:51:42 crc kubenswrapper[4895]: I1202 09:51:42.389993 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23d0ddc1-b566-4537-ac82-544ff5e098f3-config-data\") pod \"23d0ddc1-b566-4537-ac82-544ff5e098f3\" (UID: \"23d0ddc1-b566-4537-ac82-544ff5e098f3\") " Dec 02 09:51:42 crc kubenswrapper[4895]: I1202 09:51:42.399687 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23d0ddc1-b566-4537-ac82-544ff5e098f3-kube-api-access-7wsft" (OuterVolumeSpecName: "kube-api-access-7wsft") pod "23d0ddc1-b566-4537-ac82-544ff5e098f3" (UID: "23d0ddc1-b566-4537-ac82-544ff5e098f3"). InnerVolumeSpecName "kube-api-access-7wsft". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:51:42 crc kubenswrapper[4895]: I1202 09:51:42.425184 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23d0ddc1-b566-4537-ac82-544ff5e098f3-config-data" (OuterVolumeSpecName: "config-data") pod "23d0ddc1-b566-4537-ac82-544ff5e098f3" (UID: "23d0ddc1-b566-4537-ac82-544ff5e098f3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:51:42 crc kubenswrapper[4895]: I1202 09:51:42.425227 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23d0ddc1-b566-4537-ac82-544ff5e098f3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "23d0ddc1-b566-4537-ac82-544ff5e098f3" (UID: "23d0ddc1-b566-4537-ac82-544ff5e098f3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:51:42 crc kubenswrapper[4895]: I1202 09:51:42.493225 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fdd34d7c-19d4-482a-aa19-6535eb26640e-config-data\") pod \"fdd34d7c-19d4-482a-aa19-6535eb26640e\" (UID: \"fdd34d7c-19d4-482a-aa19-6535eb26640e\") " Dec 02 09:51:42 crc kubenswrapper[4895]: I1202 09:51:42.493483 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rjplv\" (UniqueName: \"kubernetes.io/projected/fdd34d7c-19d4-482a-aa19-6535eb26640e-kube-api-access-rjplv\") pod \"fdd34d7c-19d4-482a-aa19-6535eb26640e\" (UID: \"fdd34d7c-19d4-482a-aa19-6535eb26640e\") " Dec 02 09:51:42 crc kubenswrapper[4895]: I1202 09:51:42.493626 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fdd34d7c-19d4-482a-aa19-6535eb26640e-combined-ca-bundle\") pod \"fdd34d7c-19d4-482a-aa19-6535eb26640e\" (UID: 
\"fdd34d7c-19d4-482a-aa19-6535eb26640e\") " Dec 02 09:51:42 crc kubenswrapper[4895]: I1202 09:51:42.494216 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7wsft\" (UniqueName: \"kubernetes.io/projected/23d0ddc1-b566-4537-ac82-544ff5e098f3-kube-api-access-7wsft\") on node \"crc\" DevicePath \"\"" Dec 02 09:51:42 crc kubenswrapper[4895]: I1202 09:51:42.494240 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23d0ddc1-b566-4537-ac82-544ff5e098f3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 09:51:42 crc kubenswrapper[4895]: I1202 09:51:42.494250 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23d0ddc1-b566-4537-ac82-544ff5e098f3-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 09:51:42 crc kubenswrapper[4895]: I1202 09:51:42.496500 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fdd34d7c-19d4-482a-aa19-6535eb26640e-kube-api-access-rjplv" (OuterVolumeSpecName: "kube-api-access-rjplv") pod "fdd34d7c-19d4-482a-aa19-6535eb26640e" (UID: "fdd34d7c-19d4-482a-aa19-6535eb26640e"). InnerVolumeSpecName "kube-api-access-rjplv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:51:42 crc kubenswrapper[4895]: I1202 09:51:42.523946 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fdd34d7c-19d4-482a-aa19-6535eb26640e-config-data" (OuterVolumeSpecName: "config-data") pod "fdd34d7c-19d4-482a-aa19-6535eb26640e" (UID: "fdd34d7c-19d4-482a-aa19-6535eb26640e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:51:42 crc kubenswrapper[4895]: I1202 09:51:42.524434 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fdd34d7c-19d4-482a-aa19-6535eb26640e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fdd34d7c-19d4-482a-aa19-6535eb26640e" (UID: "fdd34d7c-19d4-482a-aa19-6535eb26640e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:51:42 crc kubenswrapper[4895]: I1202 09:51:42.596066 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fdd34d7c-19d4-482a-aa19-6535eb26640e-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 09:51:42 crc kubenswrapper[4895]: I1202 09:51:42.596099 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rjplv\" (UniqueName: \"kubernetes.io/projected/fdd34d7c-19d4-482a-aa19-6535eb26640e-kube-api-access-rjplv\") on node \"crc\" DevicePath \"\"" Dec 02 09:51:42 crc kubenswrapper[4895]: I1202 09:51:42.596114 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fdd34d7c-19d4-482a-aa19-6535eb26640e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 09:51:42 crc kubenswrapper[4895]: I1202 09:51:42.886648 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 02 09:51:42 crc kubenswrapper[4895]: I1202 09:51:42.886638 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"23d0ddc1-b566-4537-ac82-544ff5e098f3","Type":"ContainerDied","Data":"e7c782982b25b7900359708bc99c24aa774cee3a6981286caa4574be111b85e6"} Dec 02 09:51:42 crc kubenswrapper[4895]: I1202 09:51:42.887161 4895 scope.go:117] "RemoveContainer" containerID="a6f2da1b37d403b45d1abfda3e00d6af0e663675c9da018a034ae3293bda12d5" Dec 02 09:51:42 crc kubenswrapper[4895]: I1202 09:51:42.889528 4895 generic.go:334] "Generic (PLEG): container finished" podID="fdd34d7c-19d4-482a-aa19-6535eb26640e" containerID="f7efcba9302ac8e05dd7acec27d676a85be1fff02ac30f14281f93658c884165" exitCode=0 Dec 02 09:51:42 crc kubenswrapper[4895]: I1202 09:51:42.889582 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 02 09:51:42 crc kubenswrapper[4895]: I1202 09:51:42.889596 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"fdd34d7c-19d4-482a-aa19-6535eb26640e","Type":"ContainerDied","Data":"f7efcba9302ac8e05dd7acec27d676a85be1fff02ac30f14281f93658c884165"} Dec 02 09:51:42 crc kubenswrapper[4895]: I1202 09:51:42.889622 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"fdd34d7c-19d4-482a-aa19-6535eb26640e","Type":"ContainerDied","Data":"681df1dacdccaf86f1c7fca2c564c4f347cf8eff132fa5628617ca71fa0a8379"} Dec 02 09:51:42 crc kubenswrapper[4895]: I1202 09:51:42.896640 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgkqcz" event={"ID":"427eea9a-0bfb-4a1a-a225-c4264018fd13","Type":"ContainerStarted","Data":"d08d1f1298efd9dca94b4f3c64c6ba23a878075628e619075fca941893bbc1ff"} Dec 02 09:51:42 crc kubenswrapper[4895]: I1202 
09:51:42.927956 4895 scope.go:117] "RemoveContainer" containerID="f7efcba9302ac8e05dd7acec27d676a85be1fff02ac30f14281f93658c884165" Dec 02 09:51:42 crc kubenswrapper[4895]: I1202 09:51:42.927926 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgkqcz" podStartSLOduration=2.716812959 podStartE2EDuration="2.927906121s" podCreationTimestamp="2025-12-02 09:51:40 +0000 UTC" firstStartedPulling="2025-12-02 09:51:41.835686372 +0000 UTC m=+8913.006545985" lastFinishedPulling="2025-12-02 09:51:42.046779534 +0000 UTC m=+8913.217639147" observedRunningTime="2025-12-02 09:51:42.917428024 +0000 UTC m=+8914.088287637" watchObservedRunningTime="2025-12-02 09:51:42.927906121 +0000 UTC m=+8914.098765734" Dec 02 09:51:42 crc kubenswrapper[4895]: I1202 09:51:42.970341 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 02 09:51:42 crc kubenswrapper[4895]: I1202 09:51:42.984875 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 02 09:51:43 crc kubenswrapper[4895]: I1202 09:51:43.012848 4895 scope.go:117] "RemoveContainer" containerID="f7efcba9302ac8e05dd7acec27d676a85be1fff02ac30f14281f93658c884165" Dec 02 09:51:43 crc kubenswrapper[4895]: I1202 09:51:43.017586 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 02 09:51:43 crc kubenswrapper[4895]: E1202 09:51:43.018235 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f7efcba9302ac8e05dd7acec27d676a85be1fff02ac30f14281f93658c884165\": container with ID starting with f7efcba9302ac8e05dd7acec27d676a85be1fff02ac30f14281f93658c884165 not found: ID does not exist" containerID="f7efcba9302ac8e05dd7acec27d676a85be1fff02ac30f14281f93658c884165" Dec 02 09:51:43 crc kubenswrapper[4895]: I1202 09:51:43.018289 4895 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7efcba9302ac8e05dd7acec27d676a85be1fff02ac30f14281f93658c884165"} err="failed to get container status \"f7efcba9302ac8e05dd7acec27d676a85be1fff02ac30f14281f93658c884165\": rpc error: code = NotFound desc = could not find container \"f7efcba9302ac8e05dd7acec27d676a85be1fff02ac30f14281f93658c884165\": container with ID starting with f7efcba9302ac8e05dd7acec27d676a85be1fff02ac30f14281f93658c884165 not found: ID does not exist" Dec 02 09:51:43 crc kubenswrapper[4895]: I1202 09:51:43.026155 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 02 09:51:43 crc kubenswrapper[4895]: I1202 09:51:43.035103 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 02 09:51:43 crc kubenswrapper[4895]: E1202 09:51:43.035660 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23d0ddc1-b566-4537-ac82-544ff5e098f3" containerName="nova-cell1-conductor-conductor" Dec 02 09:51:43 crc kubenswrapper[4895]: I1202 09:51:43.035687 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="23d0ddc1-b566-4537-ac82-544ff5e098f3" containerName="nova-cell1-conductor-conductor" Dec 02 09:51:43 crc kubenswrapper[4895]: E1202 09:51:43.035696 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdd34d7c-19d4-482a-aa19-6535eb26640e" containerName="nova-cell0-conductor-conductor" Dec 02 09:51:43 crc kubenswrapper[4895]: I1202 09:51:43.035703 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdd34d7c-19d4-482a-aa19-6535eb26640e" containerName="nova-cell0-conductor-conductor" Dec 02 09:51:43 crc kubenswrapper[4895]: I1202 09:51:43.035972 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="23d0ddc1-b566-4537-ac82-544ff5e098f3" containerName="nova-cell1-conductor-conductor" Dec 02 09:51:43 crc kubenswrapper[4895]: I1202 09:51:43.036006 4895 
memory_manager.go:354] "RemoveStaleState removing state" podUID="fdd34d7c-19d4-482a-aa19-6535eb26640e" containerName="nova-cell0-conductor-conductor" Dec 02 09:51:43 crc kubenswrapper[4895]: I1202 09:51:43.038054 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 02 09:51:43 crc kubenswrapper[4895]: I1202 09:51:43.041778 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 02 09:51:43 crc kubenswrapper[4895]: I1202 09:51:43.046973 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 02 09:51:43 crc kubenswrapper[4895]: I1202 09:51:43.062292 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 02 09:51:43 crc kubenswrapper[4895]: I1202 09:51:43.064138 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 02 09:51:43 crc kubenswrapper[4895]: I1202 09:51:43.068000 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 02 09:51:43 crc kubenswrapper[4895]: I1202 09:51:43.074894 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 02 09:51:43 crc kubenswrapper[4895]: I1202 09:51:43.159767 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23d0ddc1-b566-4537-ac82-544ff5e098f3" path="/var/lib/kubelet/pods/23d0ddc1-b566-4537-ac82-544ff5e098f3/volumes" Dec 02 09:51:43 crc kubenswrapper[4895]: I1202 09:51:43.160338 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fdd34d7c-19d4-482a-aa19-6535eb26640e" path="/var/lib/kubelet/pods/fdd34d7c-19d4-482a-aa19-6535eb26640e/volumes" Dec 02 09:51:43 crc kubenswrapper[4895]: I1202 09:51:43.209190 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-s7gqq\" (UniqueName: \"kubernetes.io/projected/e926a7ab-fc54-4c41-9f38-65187a742aac-kube-api-access-s7gqq\") pod \"nova-cell1-conductor-0\" (UID: \"e926a7ab-fc54-4c41-9f38-65187a742aac\") " pod="openstack/nova-cell1-conductor-0" Dec 02 09:51:43 crc kubenswrapper[4895]: I1202 09:51:43.209265 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1ca8c61-df4d-45db-850a-7bd7dcd1eb70-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"f1ca8c61-df4d-45db-850a-7bd7dcd1eb70\") " pod="openstack/nova-cell0-conductor-0" Dec 02 09:51:43 crc kubenswrapper[4895]: I1202 09:51:43.209522 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e926a7ab-fc54-4c41-9f38-65187a742aac-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"e926a7ab-fc54-4c41-9f38-65187a742aac\") " pod="openstack/nova-cell1-conductor-0" Dec 02 09:51:43 crc kubenswrapper[4895]: I1202 09:51:43.209723 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1ca8c61-df4d-45db-850a-7bd7dcd1eb70-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"f1ca8c61-df4d-45db-850a-7bd7dcd1eb70\") " pod="openstack/nova-cell0-conductor-0" Dec 02 09:51:43 crc kubenswrapper[4895]: I1202 09:51:43.209908 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e926a7ab-fc54-4c41-9f38-65187a742aac-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"e926a7ab-fc54-4c41-9f38-65187a742aac\") " pod="openstack/nova-cell1-conductor-0" Dec 02 09:51:43 crc kubenswrapper[4895]: I1202 09:51:43.210132 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-hp4lw\" (UniqueName: \"kubernetes.io/projected/f1ca8c61-df4d-45db-850a-7bd7dcd1eb70-kube-api-access-hp4lw\") pod \"nova-cell0-conductor-0\" (UID: \"f1ca8c61-df4d-45db-850a-7bd7dcd1eb70\") " pod="openstack/nova-cell0-conductor-0" Dec 02 09:51:43 crc kubenswrapper[4895]: I1202 09:51:43.312011 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hp4lw\" (UniqueName: \"kubernetes.io/projected/f1ca8c61-df4d-45db-850a-7bd7dcd1eb70-kube-api-access-hp4lw\") pod \"nova-cell0-conductor-0\" (UID: \"f1ca8c61-df4d-45db-850a-7bd7dcd1eb70\") " pod="openstack/nova-cell0-conductor-0" Dec 02 09:51:43 crc kubenswrapper[4895]: I1202 09:51:43.312121 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7gqq\" (UniqueName: \"kubernetes.io/projected/e926a7ab-fc54-4c41-9f38-65187a742aac-kube-api-access-s7gqq\") pod \"nova-cell1-conductor-0\" (UID: \"e926a7ab-fc54-4c41-9f38-65187a742aac\") " pod="openstack/nova-cell1-conductor-0" Dec 02 09:51:43 crc kubenswrapper[4895]: I1202 09:51:43.312152 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1ca8c61-df4d-45db-850a-7bd7dcd1eb70-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"f1ca8c61-df4d-45db-850a-7bd7dcd1eb70\") " pod="openstack/nova-cell0-conductor-0" Dec 02 09:51:43 crc kubenswrapper[4895]: I1202 09:51:43.312223 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e926a7ab-fc54-4c41-9f38-65187a742aac-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"e926a7ab-fc54-4c41-9f38-65187a742aac\") " pod="openstack/nova-cell1-conductor-0" Dec 02 09:51:43 crc kubenswrapper[4895]: I1202 09:51:43.312276 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/f1ca8c61-df4d-45db-850a-7bd7dcd1eb70-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"f1ca8c61-df4d-45db-850a-7bd7dcd1eb70\") " pod="openstack/nova-cell0-conductor-0" Dec 02 09:51:43 crc kubenswrapper[4895]: I1202 09:51:43.312320 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e926a7ab-fc54-4c41-9f38-65187a742aac-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"e926a7ab-fc54-4c41-9f38-65187a742aac\") " pod="openstack/nova-cell1-conductor-0" Dec 02 09:51:43 crc kubenswrapper[4895]: I1202 09:51:43.317346 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e926a7ab-fc54-4c41-9f38-65187a742aac-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"e926a7ab-fc54-4c41-9f38-65187a742aac\") " pod="openstack/nova-cell1-conductor-0" Dec 02 09:51:43 crc kubenswrapper[4895]: I1202 09:51:43.318629 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1ca8c61-df4d-45db-850a-7bd7dcd1eb70-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"f1ca8c61-df4d-45db-850a-7bd7dcd1eb70\") " pod="openstack/nova-cell0-conductor-0" Dec 02 09:51:43 crc kubenswrapper[4895]: I1202 09:51:43.321638 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1ca8c61-df4d-45db-850a-7bd7dcd1eb70-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"f1ca8c61-df4d-45db-850a-7bd7dcd1eb70\") " pod="openstack/nova-cell0-conductor-0" Dec 02 09:51:43 crc kubenswrapper[4895]: I1202 09:51:43.325371 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e926a7ab-fc54-4c41-9f38-65187a742aac-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: 
\"e926a7ab-fc54-4c41-9f38-65187a742aac\") " pod="openstack/nova-cell1-conductor-0" Dec 02 09:51:43 crc kubenswrapper[4895]: I1202 09:51:43.334978 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7gqq\" (UniqueName: \"kubernetes.io/projected/e926a7ab-fc54-4c41-9f38-65187a742aac-kube-api-access-s7gqq\") pod \"nova-cell1-conductor-0\" (UID: \"e926a7ab-fc54-4c41-9f38-65187a742aac\") " pod="openstack/nova-cell1-conductor-0" Dec 02 09:51:43 crc kubenswrapper[4895]: I1202 09:51:43.337561 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hp4lw\" (UniqueName: \"kubernetes.io/projected/f1ca8c61-df4d-45db-850a-7bd7dcd1eb70-kube-api-access-hp4lw\") pod \"nova-cell0-conductor-0\" (UID: \"f1ca8c61-df4d-45db-850a-7bd7dcd1eb70\") " pod="openstack/nova-cell0-conductor-0" Dec 02 09:51:43 crc kubenswrapper[4895]: I1202 09:51:43.370076 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 02 09:51:43 crc kubenswrapper[4895]: I1202 09:51:43.385254 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 02 09:51:43 crc kubenswrapper[4895]: I1202 09:51:43.879031 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 02 09:51:43 crc kubenswrapper[4895]: W1202 09:51:43.883344 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf1ca8c61_df4d_45db_850a_7bd7dcd1eb70.slice/crio-d026efa44e4765126cfd868f37a0c0d6783661c1a86ba0c079ac0ae537c8bc46 WatchSource:0}: Error finding container d026efa44e4765126cfd868f37a0c0d6783661c1a86ba0c079ac0ae537c8bc46: Status 404 returned error can't find the container with id d026efa44e4765126cfd868f37a0c0d6783661c1a86ba0c079ac0ae537c8bc46 Dec 02 09:51:43 crc kubenswrapper[4895]: I1202 09:51:43.912701 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"f1ca8c61-df4d-45db-850a-7bd7dcd1eb70","Type":"ContainerStarted","Data":"d026efa44e4765126cfd868f37a0c0d6783661c1a86ba0c079ac0ae537c8bc46"} Dec 02 09:51:43 crc kubenswrapper[4895]: I1202 09:51:43.975887 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 02 09:51:43 crc kubenswrapper[4895]: W1202 09:51:43.980309 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode926a7ab_fc54_4c41_9f38_65187a742aac.slice/crio-cd7a5bbb011be82152cf9369801a9d655e57737a8364f8746276bc547c0fe7a4 WatchSource:0}: Error finding container cd7a5bbb011be82152cf9369801a9d655e57737a8364f8746276bc547c0fe7a4: Status 404 returned error can't find the container with id cd7a5bbb011be82152cf9369801a9d655e57737a8364f8746276bc547c0fe7a4 Dec 02 09:51:44 crc kubenswrapper[4895]: I1202 09:51:44.924030 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" 
event={"ID":"f1ca8c61-df4d-45db-850a-7bd7dcd1eb70","Type":"ContainerStarted","Data":"04e175f4917718760ca66ef94da5df08629b66d72b77ff7d5ea8d9cf173b22d1"} Dec 02 09:51:44 crc kubenswrapper[4895]: I1202 09:51:44.924366 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Dec 02 09:51:44 crc kubenswrapper[4895]: I1202 09:51:44.925899 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"e926a7ab-fc54-4c41-9f38-65187a742aac","Type":"ContainerStarted","Data":"56b033d2c780282e45435417b9886272f0c7541e7c5cb9354d83291dad8e1eb0"} Dec 02 09:51:44 crc kubenswrapper[4895]: I1202 09:51:44.925940 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"e926a7ab-fc54-4c41-9f38-65187a742aac","Type":"ContainerStarted","Data":"cd7a5bbb011be82152cf9369801a9d655e57737a8364f8746276bc547c0fe7a4"} Dec 02 09:51:44 crc kubenswrapper[4895]: I1202 09:51:44.926052 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Dec 02 09:51:44 crc kubenswrapper[4895]: I1202 09:51:44.950106 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.950084976 podStartE2EDuration="2.950084976s" podCreationTimestamp="2025-12-02 09:51:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:51:44.938797205 +0000 UTC m=+8916.109656818" watchObservedRunningTime="2025-12-02 09:51:44.950084976 +0000 UTC m=+8916.120944609" Dec 02 09:51:44 crc kubenswrapper[4895]: I1202 09:51:44.956797 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.956778805 podStartE2EDuration="2.956778805s" podCreationTimestamp="2025-12-02 09:51:42 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:51:44.955929638 +0000 UTC m=+8916.126789261" watchObservedRunningTime="2025-12-02 09:51:44.956778805 +0000 UTC m=+8916.127638428" Dec 02 09:51:45 crc kubenswrapper[4895]: I1202 09:51:45.085834 4895 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="f364e779-d2db-4f23-bc99-1d0b91dca426" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.83:8775/\": read tcp 10.217.0.2:41898->10.217.1.83:8775: read: connection reset by peer" Dec 02 09:51:45 crc kubenswrapper[4895]: I1202 09:51:45.085917 4895 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="f364e779-d2db-4f23-bc99-1d0b91dca426" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.83:8775/\": read tcp 10.217.0.2:41900->10.217.1.83:8775: read: connection reset by peer" Dec 02 09:51:45 crc kubenswrapper[4895]: E1202 09:51:45.533298 4895 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of afffba8deef3d71484a2795782463d54514cd234af1063a3f87ea98981166f17 is running failed: container process not found" containerID="afffba8deef3d71484a2795782463d54514cd234af1063a3f87ea98981166f17" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 02 09:51:45 crc kubenswrapper[4895]: E1202 09:51:45.534201 4895 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of afffba8deef3d71484a2795782463d54514cd234af1063a3f87ea98981166f17 is running failed: container process not found" containerID="afffba8deef3d71484a2795782463d54514cd234af1063a3f87ea98981166f17" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 02 09:51:45 crc kubenswrapper[4895]: E1202 
09:51:45.534703 4895 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of afffba8deef3d71484a2795782463d54514cd234af1063a3f87ea98981166f17 is running failed: container process not found" containerID="afffba8deef3d71484a2795782463d54514cd234af1063a3f87ea98981166f17" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 02 09:51:45 crc kubenswrapper[4895]: E1202 09:51:45.534767 4895 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of afffba8deef3d71484a2795782463d54514cd234af1063a3f87ea98981166f17 is running failed: container process not found" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="807ab313-d84c-4059-aa53-4d99c8c65192" containerName="nova-scheduler-scheduler" Dec 02 09:51:45 crc kubenswrapper[4895]: I1202 09:51:45.872276 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 02 09:51:45 crc kubenswrapper[4895]: I1202 09:51:45.877428 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 02 09:51:45 crc kubenswrapper[4895]: I1202 09:51:45.947656 4895 generic.go:334] "Generic (PLEG): container finished" podID="aa8c76cd-9852-45cc-82fc-c9ee472f94c2" containerID="c49402e43fd9cbd8200434847bc5b4025bc3e7dba9f2532b8298027a4fd520c0" exitCode=0 Dec 02 09:51:45 crc kubenswrapper[4895]: I1202 09:51:45.948150 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 02 09:51:45 crc kubenswrapper[4895]: I1202 09:51:45.948209 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"aa8c76cd-9852-45cc-82fc-c9ee472f94c2","Type":"ContainerDied","Data":"c49402e43fd9cbd8200434847bc5b4025bc3e7dba9f2532b8298027a4fd520c0"} Dec 02 09:51:45 crc kubenswrapper[4895]: I1202 09:51:45.950535 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"aa8c76cd-9852-45cc-82fc-c9ee472f94c2","Type":"ContainerDied","Data":"da4085bc700e7413f53199ac4762b1ca913c75acf7a03e66d3a7767954baaaf4"} Dec 02 09:51:45 crc kubenswrapper[4895]: I1202 09:51:45.950619 4895 scope.go:117] "RemoveContainer" containerID="c49402e43fd9cbd8200434847bc5b4025bc3e7dba9f2532b8298027a4fd520c0" Dec 02 09:51:45 crc kubenswrapper[4895]: I1202 09:51:45.966043 4895 generic.go:334] "Generic (PLEG): container finished" podID="f364e779-d2db-4f23-bc99-1d0b91dca426" containerID="a51dcaf54c4ee699ab41c3d2b5d38e088599d8df028d506caafbb1c565212378" exitCode=0 Dec 02 09:51:45 crc kubenswrapper[4895]: I1202 09:51:45.966110 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f364e779-d2db-4f23-bc99-1d0b91dca426","Type":"ContainerDied","Data":"a51dcaf54c4ee699ab41c3d2b5d38e088599d8df028d506caafbb1c565212378"} Dec 02 09:51:45 crc kubenswrapper[4895]: I1202 09:51:45.966138 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f364e779-d2db-4f23-bc99-1d0b91dca426","Type":"ContainerDied","Data":"e5f0e2d8cd570ae1c7cd4b702575e3dd18caf316712e675b1804031ff1efc305"} Dec 02 09:51:45 crc kubenswrapper[4895]: I1202 09:51:45.966196 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 02 09:51:45 crc kubenswrapper[4895]: I1202 09:51:45.984360 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f364e779-d2db-4f23-bc99-1d0b91dca426-logs\") pod \"f364e779-d2db-4f23-bc99-1d0b91dca426\" (UID: \"f364e779-d2db-4f23-bc99-1d0b91dca426\") " Dec 02 09:51:45 crc kubenswrapper[4895]: I1202 09:51:45.984470 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9kx86\" (UniqueName: \"kubernetes.io/projected/f364e779-d2db-4f23-bc99-1d0b91dca426-kube-api-access-9kx86\") pod \"f364e779-d2db-4f23-bc99-1d0b91dca426\" (UID: \"f364e779-d2db-4f23-bc99-1d0b91dca426\") " Dec 02 09:51:45 crc kubenswrapper[4895]: I1202 09:51:45.984552 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f364e779-d2db-4f23-bc99-1d0b91dca426-combined-ca-bundle\") pod \"f364e779-d2db-4f23-bc99-1d0b91dca426\" (UID: \"f364e779-d2db-4f23-bc99-1d0b91dca426\") " Dec 02 09:51:45 crc kubenswrapper[4895]: I1202 09:51:45.984681 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa8c76cd-9852-45cc-82fc-c9ee472f94c2-config-data\") pod \"aa8c76cd-9852-45cc-82fc-c9ee472f94c2\" (UID: \"aa8c76cd-9852-45cc-82fc-c9ee472f94c2\") " Dec 02 09:51:45 crc kubenswrapper[4895]: I1202 09:51:45.984760 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa8c76cd-9852-45cc-82fc-c9ee472f94c2-combined-ca-bundle\") pod \"aa8c76cd-9852-45cc-82fc-c9ee472f94c2\" (UID: \"aa8c76cd-9852-45cc-82fc-c9ee472f94c2\") " Dec 02 09:51:45 crc kubenswrapper[4895]: I1202 09:51:45.984789 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/aa8c76cd-9852-45cc-82fc-c9ee472f94c2-logs\") pod \"aa8c76cd-9852-45cc-82fc-c9ee472f94c2\" (UID: \"aa8c76cd-9852-45cc-82fc-c9ee472f94c2\") " Dec 02 09:51:45 crc kubenswrapper[4895]: I1202 09:51:45.984805 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2r58l\" (UniqueName: \"kubernetes.io/projected/aa8c76cd-9852-45cc-82fc-c9ee472f94c2-kube-api-access-2r58l\") pod \"aa8c76cd-9852-45cc-82fc-c9ee472f94c2\" (UID: \"aa8c76cd-9852-45cc-82fc-c9ee472f94c2\") " Dec 02 09:51:45 crc kubenswrapper[4895]: I1202 09:51:45.984862 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f364e779-d2db-4f23-bc99-1d0b91dca426-config-data\") pod \"f364e779-d2db-4f23-bc99-1d0b91dca426\" (UID: \"f364e779-d2db-4f23-bc99-1d0b91dca426\") " Dec 02 09:51:45 crc kubenswrapper[4895]: I1202 09:51:45.987766 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f364e779-d2db-4f23-bc99-1d0b91dca426-logs" (OuterVolumeSpecName: "logs") pod "f364e779-d2db-4f23-bc99-1d0b91dca426" (UID: "f364e779-d2db-4f23-bc99-1d0b91dca426"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:51:45 crc kubenswrapper[4895]: I1202 09:51:45.988739 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa8c76cd-9852-45cc-82fc-c9ee472f94c2-logs" (OuterVolumeSpecName: "logs") pod "aa8c76cd-9852-45cc-82fc-c9ee472f94c2" (UID: "aa8c76cd-9852-45cc-82fc-c9ee472f94c2"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:51:46 crc kubenswrapper[4895]: I1202 09:51:46.000276 4895 generic.go:334] "Generic (PLEG): container finished" podID="807ab313-d84c-4059-aa53-4d99c8c65192" containerID="afffba8deef3d71484a2795782463d54514cd234af1063a3f87ea98981166f17" exitCode=0 Dec 02 09:51:46 crc kubenswrapper[4895]: I1202 09:51:46.000601 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"807ab313-d84c-4059-aa53-4d99c8c65192","Type":"ContainerDied","Data":"afffba8deef3d71484a2795782463d54514cd234af1063a3f87ea98981166f17"} Dec 02 09:51:46 crc kubenswrapper[4895]: I1202 09:51:46.001706 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa8c76cd-9852-45cc-82fc-c9ee472f94c2-kube-api-access-2r58l" (OuterVolumeSpecName: "kube-api-access-2r58l") pod "aa8c76cd-9852-45cc-82fc-c9ee472f94c2" (UID: "aa8c76cd-9852-45cc-82fc-c9ee472f94c2"). InnerVolumeSpecName "kube-api-access-2r58l". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:51:46 crc kubenswrapper[4895]: I1202 09:51:46.005490 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f364e779-d2db-4f23-bc99-1d0b91dca426-kube-api-access-9kx86" (OuterVolumeSpecName: "kube-api-access-9kx86") pod "f364e779-d2db-4f23-bc99-1d0b91dca426" (UID: "f364e779-d2db-4f23-bc99-1d0b91dca426"). InnerVolumeSpecName "kube-api-access-9kx86". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:51:46 crc kubenswrapper[4895]: I1202 09:51:46.021600 4895 scope.go:117] "RemoveContainer" containerID="0e4f72afbb3acdc4ac9d10239e9306ac7cf741136eeba97e54ae544e864df0ee" Dec 02 09:51:46 crc kubenswrapper[4895]: I1202 09:51:46.022346 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa8c76cd-9852-45cc-82fc-c9ee472f94c2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aa8c76cd-9852-45cc-82fc-c9ee472f94c2" (UID: "aa8c76cd-9852-45cc-82fc-c9ee472f94c2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:51:46 crc kubenswrapper[4895]: I1202 09:51:46.027705 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa8c76cd-9852-45cc-82fc-c9ee472f94c2-config-data" (OuterVolumeSpecName: "config-data") pod "aa8c76cd-9852-45cc-82fc-c9ee472f94c2" (UID: "aa8c76cd-9852-45cc-82fc-c9ee472f94c2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:51:46 crc kubenswrapper[4895]: I1202 09:51:46.033038 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f364e779-d2db-4f23-bc99-1d0b91dca426-config-data" (OuterVolumeSpecName: "config-data") pod "f364e779-d2db-4f23-bc99-1d0b91dca426" (UID: "f364e779-d2db-4f23-bc99-1d0b91dca426"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:51:46 crc kubenswrapper[4895]: I1202 09:51:46.060397 4895 scope.go:117] "RemoveContainer" containerID="c49402e43fd9cbd8200434847bc5b4025bc3e7dba9f2532b8298027a4fd520c0" Dec 02 09:51:46 crc kubenswrapper[4895]: E1202 09:51:46.062005 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c49402e43fd9cbd8200434847bc5b4025bc3e7dba9f2532b8298027a4fd520c0\": container with ID starting with c49402e43fd9cbd8200434847bc5b4025bc3e7dba9f2532b8298027a4fd520c0 not found: ID does not exist" containerID="c49402e43fd9cbd8200434847bc5b4025bc3e7dba9f2532b8298027a4fd520c0" Dec 02 09:51:46 crc kubenswrapper[4895]: I1202 09:51:46.062039 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c49402e43fd9cbd8200434847bc5b4025bc3e7dba9f2532b8298027a4fd520c0"} err="failed to get container status \"c49402e43fd9cbd8200434847bc5b4025bc3e7dba9f2532b8298027a4fd520c0\": rpc error: code = NotFound desc = could not find container \"c49402e43fd9cbd8200434847bc5b4025bc3e7dba9f2532b8298027a4fd520c0\": container with ID starting with c49402e43fd9cbd8200434847bc5b4025bc3e7dba9f2532b8298027a4fd520c0 not found: ID does not exist" Dec 02 09:51:46 crc kubenswrapper[4895]: I1202 09:51:46.062066 4895 scope.go:117] "RemoveContainer" containerID="0e4f72afbb3acdc4ac9d10239e9306ac7cf741136eeba97e54ae544e864df0ee" Dec 02 09:51:46 crc kubenswrapper[4895]: E1202 09:51:46.062627 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e4f72afbb3acdc4ac9d10239e9306ac7cf741136eeba97e54ae544e864df0ee\": container with ID starting with 0e4f72afbb3acdc4ac9d10239e9306ac7cf741136eeba97e54ae544e864df0ee not found: ID does not exist" containerID="0e4f72afbb3acdc4ac9d10239e9306ac7cf741136eeba97e54ae544e864df0ee" Dec 02 09:51:46 crc kubenswrapper[4895]: I1202 09:51:46.062727 
4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e4f72afbb3acdc4ac9d10239e9306ac7cf741136eeba97e54ae544e864df0ee"} err="failed to get container status \"0e4f72afbb3acdc4ac9d10239e9306ac7cf741136eeba97e54ae544e864df0ee\": rpc error: code = NotFound desc = could not find container \"0e4f72afbb3acdc4ac9d10239e9306ac7cf741136eeba97e54ae544e864df0ee\": container with ID starting with 0e4f72afbb3acdc4ac9d10239e9306ac7cf741136eeba97e54ae544e864df0ee not found: ID does not exist" Dec 02 09:51:46 crc kubenswrapper[4895]: I1202 09:51:46.062802 4895 scope.go:117] "RemoveContainer" containerID="a51dcaf54c4ee699ab41c3d2b5d38e088599d8df028d506caafbb1c565212378" Dec 02 09:51:46 crc kubenswrapper[4895]: I1202 09:51:46.069523 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f364e779-d2db-4f23-bc99-1d0b91dca426-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f364e779-d2db-4f23-bc99-1d0b91dca426" (UID: "f364e779-d2db-4f23-bc99-1d0b91dca426"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:51:46 crc kubenswrapper[4895]: I1202 09:51:46.087648 4895 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f364e779-d2db-4f23-bc99-1d0b91dca426-logs\") on node \"crc\" DevicePath \"\"" Dec 02 09:51:46 crc kubenswrapper[4895]: I1202 09:51:46.087674 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9kx86\" (UniqueName: \"kubernetes.io/projected/f364e779-d2db-4f23-bc99-1d0b91dca426-kube-api-access-9kx86\") on node \"crc\" DevicePath \"\"" Dec 02 09:51:46 crc kubenswrapper[4895]: I1202 09:51:46.087683 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f364e779-d2db-4f23-bc99-1d0b91dca426-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 09:51:46 crc kubenswrapper[4895]: I1202 09:51:46.087692 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa8c76cd-9852-45cc-82fc-c9ee472f94c2-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 09:51:46 crc kubenswrapper[4895]: I1202 09:51:46.087701 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa8c76cd-9852-45cc-82fc-c9ee472f94c2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 09:51:46 crc kubenswrapper[4895]: I1202 09:51:46.087708 4895 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa8c76cd-9852-45cc-82fc-c9ee472f94c2-logs\") on node \"crc\" DevicePath \"\"" Dec 02 09:51:46 crc kubenswrapper[4895]: I1202 09:51:46.087718 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2r58l\" (UniqueName: \"kubernetes.io/projected/aa8c76cd-9852-45cc-82fc-c9ee472f94c2-kube-api-access-2r58l\") on node \"crc\" DevicePath \"\"" Dec 02 09:51:46 crc kubenswrapper[4895]: I1202 09:51:46.087727 4895 
reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f364e779-d2db-4f23-bc99-1d0b91dca426-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 09:51:46 crc kubenswrapper[4895]: I1202 09:51:46.095980 4895 scope.go:117] "RemoveContainer" containerID="9072d4bc261c498a84069b6e9a1da6de906bf13dd131c0fab31a4dc8c0bf1a5a" Dec 02 09:51:46 crc kubenswrapper[4895]: I1202 09:51:46.117748 4895 scope.go:117] "RemoveContainer" containerID="a51dcaf54c4ee699ab41c3d2b5d38e088599d8df028d506caafbb1c565212378" Dec 02 09:51:46 crc kubenswrapper[4895]: E1202 09:51:46.118119 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a51dcaf54c4ee699ab41c3d2b5d38e088599d8df028d506caafbb1c565212378\": container with ID starting with a51dcaf54c4ee699ab41c3d2b5d38e088599d8df028d506caafbb1c565212378 not found: ID does not exist" containerID="a51dcaf54c4ee699ab41c3d2b5d38e088599d8df028d506caafbb1c565212378" Dec 02 09:51:46 crc kubenswrapper[4895]: I1202 09:51:46.118159 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a51dcaf54c4ee699ab41c3d2b5d38e088599d8df028d506caafbb1c565212378"} err="failed to get container status \"a51dcaf54c4ee699ab41c3d2b5d38e088599d8df028d506caafbb1c565212378\": rpc error: code = NotFound desc = could not find container \"a51dcaf54c4ee699ab41c3d2b5d38e088599d8df028d506caafbb1c565212378\": container with ID starting with a51dcaf54c4ee699ab41c3d2b5d38e088599d8df028d506caafbb1c565212378 not found: ID does not exist" Dec 02 09:51:46 crc kubenswrapper[4895]: I1202 09:51:46.118186 4895 scope.go:117] "RemoveContainer" containerID="9072d4bc261c498a84069b6e9a1da6de906bf13dd131c0fab31a4dc8c0bf1a5a" Dec 02 09:51:46 crc kubenswrapper[4895]: E1202 09:51:46.118493 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"9072d4bc261c498a84069b6e9a1da6de906bf13dd131c0fab31a4dc8c0bf1a5a\": container with ID starting with 9072d4bc261c498a84069b6e9a1da6de906bf13dd131c0fab31a4dc8c0bf1a5a not found: ID does not exist" containerID="9072d4bc261c498a84069b6e9a1da6de906bf13dd131c0fab31a4dc8c0bf1a5a" Dec 02 09:51:46 crc kubenswrapper[4895]: I1202 09:51:46.118533 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9072d4bc261c498a84069b6e9a1da6de906bf13dd131c0fab31a4dc8c0bf1a5a"} err="failed to get container status \"9072d4bc261c498a84069b6e9a1da6de906bf13dd131c0fab31a4dc8c0bf1a5a\": rpc error: code = NotFound desc = could not find container \"9072d4bc261c498a84069b6e9a1da6de906bf13dd131c0fab31a4dc8c0bf1a5a\": container with ID starting with 9072d4bc261c498a84069b6e9a1da6de906bf13dd131c0fab31a4dc8c0bf1a5a not found: ID does not exist" Dec 02 09:51:46 crc kubenswrapper[4895]: I1202 09:51:46.357404 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 02 09:51:46 crc kubenswrapper[4895]: I1202 09:51:46.395500 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 09:51:46 crc kubenswrapper[4895]: I1202 09:51:46.420917 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 09:51:46 crc kubenswrapper[4895]: I1202 09:51:46.437498 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 02 09:51:46 crc kubenswrapper[4895]: I1202 09:51:46.451570 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 02 09:51:46 crc kubenswrapper[4895]: E1202 09:51:46.452071 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="807ab313-d84c-4059-aa53-4d99c8c65192" containerName="nova-scheduler-scheduler" Dec 02 09:51:46 crc kubenswrapper[4895]: I1202 09:51:46.452091 4895 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="807ab313-d84c-4059-aa53-4d99c8c65192" containerName="nova-scheduler-scheduler" Dec 02 09:51:46 crc kubenswrapper[4895]: E1202 09:51:46.452133 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa8c76cd-9852-45cc-82fc-c9ee472f94c2" containerName="nova-api-log" Dec 02 09:51:46 crc kubenswrapper[4895]: I1202 09:51:46.452140 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa8c76cd-9852-45cc-82fc-c9ee472f94c2" containerName="nova-api-log" Dec 02 09:51:46 crc kubenswrapper[4895]: E1202 09:51:46.452151 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f364e779-d2db-4f23-bc99-1d0b91dca426" containerName="nova-metadata-log" Dec 02 09:51:46 crc kubenswrapper[4895]: I1202 09:51:46.452157 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="f364e779-d2db-4f23-bc99-1d0b91dca426" containerName="nova-metadata-log" Dec 02 09:51:46 crc kubenswrapper[4895]: E1202 09:51:46.452181 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f364e779-d2db-4f23-bc99-1d0b91dca426" containerName="nova-metadata-metadata" Dec 02 09:51:46 crc kubenswrapper[4895]: I1202 09:51:46.452187 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="f364e779-d2db-4f23-bc99-1d0b91dca426" containerName="nova-metadata-metadata" Dec 02 09:51:46 crc kubenswrapper[4895]: E1202 09:51:46.452200 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa8c76cd-9852-45cc-82fc-c9ee472f94c2" containerName="nova-api-api" Dec 02 09:51:46 crc kubenswrapper[4895]: I1202 09:51:46.452207 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa8c76cd-9852-45cc-82fc-c9ee472f94c2" containerName="nova-api-api" Dec 02 09:51:46 crc kubenswrapper[4895]: I1202 09:51:46.452414 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="f364e779-d2db-4f23-bc99-1d0b91dca426" containerName="nova-metadata-log" Dec 02 09:51:46 crc kubenswrapper[4895]: I1202 09:51:46.452440 4895 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="f364e779-d2db-4f23-bc99-1d0b91dca426" containerName="nova-metadata-metadata" Dec 02 09:51:46 crc kubenswrapper[4895]: I1202 09:51:46.452452 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa8c76cd-9852-45cc-82fc-c9ee472f94c2" containerName="nova-api-log" Dec 02 09:51:46 crc kubenswrapper[4895]: I1202 09:51:46.452466 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa8c76cd-9852-45cc-82fc-c9ee472f94c2" containerName="nova-api-api" Dec 02 09:51:46 crc kubenswrapper[4895]: I1202 09:51:46.452474 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="807ab313-d84c-4059-aa53-4d99c8c65192" containerName="nova-scheduler-scheduler" Dec 02 09:51:46 crc kubenswrapper[4895]: I1202 09:51:46.453645 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 02 09:51:46 crc kubenswrapper[4895]: I1202 09:51:46.461813 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 02 09:51:46 crc kubenswrapper[4895]: I1202 09:51:46.463215 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 02 09:51:46 crc kubenswrapper[4895]: I1202 09:51:46.473141 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 09:51:46 crc kubenswrapper[4895]: I1202 09:51:46.483365 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 02 09:51:46 crc kubenswrapper[4895]: I1202 09:51:46.485461 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 02 09:51:46 crc kubenswrapper[4895]: I1202 09:51:46.488175 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 02 09:51:46 crc kubenswrapper[4895]: I1202 09:51:46.504551 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 02 09:51:46 crc kubenswrapper[4895]: I1202 09:51:46.522263 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rzsk9\" (UniqueName: \"kubernetes.io/projected/807ab313-d84c-4059-aa53-4d99c8c65192-kube-api-access-rzsk9\") pod \"807ab313-d84c-4059-aa53-4d99c8c65192\" (UID: \"807ab313-d84c-4059-aa53-4d99c8c65192\") " Dec 02 09:51:46 crc kubenswrapper[4895]: I1202 09:51:46.522344 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/807ab313-d84c-4059-aa53-4d99c8c65192-combined-ca-bundle\") pod \"807ab313-d84c-4059-aa53-4d99c8c65192\" (UID: \"807ab313-d84c-4059-aa53-4d99c8c65192\") " Dec 02 09:51:46 crc kubenswrapper[4895]: I1202 09:51:46.522418 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/807ab313-d84c-4059-aa53-4d99c8c65192-config-data\") pod \"807ab313-d84c-4059-aa53-4d99c8c65192\" (UID: \"807ab313-d84c-4059-aa53-4d99c8c65192\") " Dec 02 09:51:46 crc kubenswrapper[4895]: I1202 09:51:46.526892 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/807ab313-d84c-4059-aa53-4d99c8c65192-kube-api-access-rzsk9" (OuterVolumeSpecName: "kube-api-access-rzsk9") pod "807ab313-d84c-4059-aa53-4d99c8c65192" (UID: "807ab313-d84c-4059-aa53-4d99c8c65192"). InnerVolumeSpecName "kube-api-access-rzsk9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:51:46 crc kubenswrapper[4895]: I1202 09:51:46.552317 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/807ab313-d84c-4059-aa53-4d99c8c65192-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "807ab313-d84c-4059-aa53-4d99c8c65192" (UID: "807ab313-d84c-4059-aa53-4d99c8c65192"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:51:46 crc kubenswrapper[4895]: I1202 09:51:46.563606 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/807ab313-d84c-4059-aa53-4d99c8c65192-config-data" (OuterVolumeSpecName: "config-data") pod "807ab313-d84c-4059-aa53-4d99c8c65192" (UID: "807ab313-d84c-4059-aa53-4d99c8c65192"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:51:46 crc kubenswrapper[4895]: I1202 09:51:46.626724 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4b9qk\" (UniqueName: \"kubernetes.io/projected/8f095a39-138a-49b5-b50c-a37ad8adff98-kube-api-access-4b9qk\") pod \"nova-metadata-0\" (UID: \"8f095a39-138a-49b5-b50c-a37ad8adff98\") " pod="openstack/nova-metadata-0" Dec 02 09:51:46 crc kubenswrapper[4895]: I1202 09:51:46.626965 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rk9x2\" (UniqueName: \"kubernetes.io/projected/cbc28dc3-fbee-4d3d-9c6c-88de443104cf-kube-api-access-rk9x2\") pod \"nova-api-0\" (UID: \"cbc28dc3-fbee-4d3d-9c6c-88de443104cf\") " pod="openstack/nova-api-0" Dec 02 09:51:46 crc kubenswrapper[4895]: I1202 09:51:46.627028 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f095a39-138a-49b5-b50c-a37ad8adff98-config-data\") pod 
\"nova-metadata-0\" (UID: \"8f095a39-138a-49b5-b50c-a37ad8adff98\") " pod="openstack/nova-metadata-0" Dec 02 09:51:46 crc kubenswrapper[4895]: I1202 09:51:46.627074 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f095a39-138a-49b5-b50c-a37ad8adff98-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8f095a39-138a-49b5-b50c-a37ad8adff98\") " pod="openstack/nova-metadata-0" Dec 02 09:51:46 crc kubenswrapper[4895]: I1202 09:51:46.627314 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbc28dc3-fbee-4d3d-9c6c-88de443104cf-config-data\") pod \"nova-api-0\" (UID: \"cbc28dc3-fbee-4d3d-9c6c-88de443104cf\") " pod="openstack/nova-api-0" Dec 02 09:51:46 crc kubenswrapper[4895]: I1202 09:51:46.627527 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8f095a39-138a-49b5-b50c-a37ad8adff98-logs\") pod \"nova-metadata-0\" (UID: \"8f095a39-138a-49b5-b50c-a37ad8adff98\") " pod="openstack/nova-metadata-0" Dec 02 09:51:46 crc kubenswrapper[4895]: I1202 09:51:46.627588 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbc28dc3-fbee-4d3d-9c6c-88de443104cf-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"cbc28dc3-fbee-4d3d-9c6c-88de443104cf\") " pod="openstack/nova-api-0" Dec 02 09:51:46 crc kubenswrapper[4895]: I1202 09:51:46.627619 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cbc28dc3-fbee-4d3d-9c6c-88de443104cf-logs\") pod \"nova-api-0\" (UID: \"cbc28dc3-fbee-4d3d-9c6c-88de443104cf\") " pod="openstack/nova-api-0" Dec 02 09:51:46 crc kubenswrapper[4895]: I1202 
09:51:46.627719 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rzsk9\" (UniqueName: \"kubernetes.io/projected/807ab313-d84c-4059-aa53-4d99c8c65192-kube-api-access-rzsk9\") on node \"crc\" DevicePath \"\"" Dec 02 09:51:46 crc kubenswrapper[4895]: I1202 09:51:46.627761 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/807ab313-d84c-4059-aa53-4d99c8c65192-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 09:51:46 crc kubenswrapper[4895]: I1202 09:51:46.627776 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/807ab313-d84c-4059-aa53-4d99c8c65192-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 09:51:46 crc kubenswrapper[4895]: I1202 09:51:46.729450 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbc28dc3-fbee-4d3d-9c6c-88de443104cf-config-data\") pod \"nova-api-0\" (UID: \"cbc28dc3-fbee-4d3d-9c6c-88de443104cf\") " pod="openstack/nova-api-0" Dec 02 09:51:46 crc kubenswrapper[4895]: I1202 09:51:46.729595 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8f095a39-138a-49b5-b50c-a37ad8adff98-logs\") pod \"nova-metadata-0\" (UID: \"8f095a39-138a-49b5-b50c-a37ad8adff98\") " pod="openstack/nova-metadata-0" Dec 02 09:51:46 crc kubenswrapper[4895]: I1202 09:51:46.729628 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbc28dc3-fbee-4d3d-9c6c-88de443104cf-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"cbc28dc3-fbee-4d3d-9c6c-88de443104cf\") " pod="openstack/nova-api-0" Dec 02 09:51:46 crc kubenswrapper[4895]: I1202 09:51:46.729645 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/cbc28dc3-fbee-4d3d-9c6c-88de443104cf-logs\") pod \"nova-api-0\" (UID: \"cbc28dc3-fbee-4d3d-9c6c-88de443104cf\") " pod="openstack/nova-api-0" Dec 02 09:51:46 crc kubenswrapper[4895]: I1202 09:51:46.729680 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4b9qk\" (UniqueName: \"kubernetes.io/projected/8f095a39-138a-49b5-b50c-a37ad8adff98-kube-api-access-4b9qk\") pod \"nova-metadata-0\" (UID: \"8f095a39-138a-49b5-b50c-a37ad8adff98\") " pod="openstack/nova-metadata-0" Dec 02 09:51:46 crc kubenswrapper[4895]: I1202 09:51:46.729720 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rk9x2\" (UniqueName: \"kubernetes.io/projected/cbc28dc3-fbee-4d3d-9c6c-88de443104cf-kube-api-access-rk9x2\") pod \"nova-api-0\" (UID: \"cbc28dc3-fbee-4d3d-9c6c-88de443104cf\") " pod="openstack/nova-api-0" Dec 02 09:51:46 crc kubenswrapper[4895]: I1202 09:51:46.729754 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f095a39-138a-49b5-b50c-a37ad8adff98-config-data\") pod \"nova-metadata-0\" (UID: \"8f095a39-138a-49b5-b50c-a37ad8adff98\") " pod="openstack/nova-metadata-0" Dec 02 09:51:46 crc kubenswrapper[4895]: I1202 09:51:46.729774 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f095a39-138a-49b5-b50c-a37ad8adff98-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8f095a39-138a-49b5-b50c-a37ad8adff98\") " pod="openstack/nova-metadata-0" Dec 02 09:51:46 crc kubenswrapper[4895]: I1202 09:51:46.730139 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8f095a39-138a-49b5-b50c-a37ad8adff98-logs\") pod \"nova-metadata-0\" (UID: \"8f095a39-138a-49b5-b50c-a37ad8adff98\") " pod="openstack/nova-metadata-0" Dec 02 09:51:46 crc 
kubenswrapper[4895]: I1202 09:51:46.730553 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cbc28dc3-fbee-4d3d-9c6c-88de443104cf-logs\") pod \"nova-api-0\" (UID: \"cbc28dc3-fbee-4d3d-9c6c-88de443104cf\") " pod="openstack/nova-api-0" Dec 02 09:51:46 crc kubenswrapper[4895]: I1202 09:51:46.734567 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f095a39-138a-49b5-b50c-a37ad8adff98-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8f095a39-138a-49b5-b50c-a37ad8adff98\") " pod="openstack/nova-metadata-0" Dec 02 09:51:46 crc kubenswrapper[4895]: I1202 09:51:46.734578 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbc28dc3-fbee-4d3d-9c6c-88de443104cf-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"cbc28dc3-fbee-4d3d-9c6c-88de443104cf\") " pod="openstack/nova-api-0" Dec 02 09:51:46 crc kubenswrapper[4895]: I1202 09:51:46.735237 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f095a39-138a-49b5-b50c-a37ad8adff98-config-data\") pod \"nova-metadata-0\" (UID: \"8f095a39-138a-49b5-b50c-a37ad8adff98\") " pod="openstack/nova-metadata-0" Dec 02 09:51:46 crc kubenswrapper[4895]: I1202 09:51:46.738592 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbc28dc3-fbee-4d3d-9c6c-88de443104cf-config-data\") pod \"nova-api-0\" (UID: \"cbc28dc3-fbee-4d3d-9c6c-88de443104cf\") " pod="openstack/nova-api-0" Dec 02 09:51:46 crc kubenswrapper[4895]: I1202 09:51:46.749391 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rk9x2\" (UniqueName: \"kubernetes.io/projected/cbc28dc3-fbee-4d3d-9c6c-88de443104cf-kube-api-access-rk9x2\") pod \"nova-api-0\" (UID: 
\"cbc28dc3-fbee-4d3d-9c6c-88de443104cf\") " pod="openstack/nova-api-0" Dec 02 09:51:46 crc kubenswrapper[4895]: I1202 09:51:46.750760 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4b9qk\" (UniqueName: \"kubernetes.io/projected/8f095a39-138a-49b5-b50c-a37ad8adff98-kube-api-access-4b9qk\") pod \"nova-metadata-0\" (UID: \"8f095a39-138a-49b5-b50c-a37ad8adff98\") " pod="openstack/nova-metadata-0" Dec 02 09:51:46 crc kubenswrapper[4895]: I1202 09:51:46.774889 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 02 09:51:46 crc kubenswrapper[4895]: I1202 09:51:46.804331 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 02 09:51:47 crc kubenswrapper[4895]: I1202 09:51:47.026382 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"807ab313-d84c-4059-aa53-4d99c8c65192","Type":"ContainerDied","Data":"be41583932ef51fc53c34a9b064fb3197e17e33c6119b7d48f9a787e58c0d4bf"} Dec 02 09:51:47 crc kubenswrapper[4895]: I1202 09:51:47.026840 4895 scope.go:117] "RemoveContainer" containerID="afffba8deef3d71484a2795782463d54514cd234af1063a3f87ea98981166f17" Dec 02 09:51:47 crc kubenswrapper[4895]: I1202 09:51:47.026681 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 02 09:51:47 crc kubenswrapper[4895]: I1202 09:51:47.071984 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 09:51:47 crc kubenswrapper[4895]: I1202 09:51:47.098278 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 09:51:47 crc kubenswrapper[4895]: I1202 09:51:47.122451 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 09:51:47 crc kubenswrapper[4895]: I1202 09:51:47.124214 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 02 09:51:47 crc kubenswrapper[4895]: I1202 09:51:47.126671 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 02 09:51:47 crc kubenswrapper[4895]: I1202 09:51:47.130508 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 09:51:47 crc kubenswrapper[4895]: I1202 09:51:47.167335 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="807ab313-d84c-4059-aa53-4d99c8c65192" path="/var/lib/kubelet/pods/807ab313-d84c-4059-aa53-4d99c8c65192/volumes" Dec 02 09:51:47 crc kubenswrapper[4895]: I1202 09:51:47.168981 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa8c76cd-9852-45cc-82fc-c9ee472f94c2" path="/var/lib/kubelet/pods/aa8c76cd-9852-45cc-82fc-c9ee472f94c2/volumes" Dec 02 09:51:47 crc kubenswrapper[4895]: I1202 09:51:47.170442 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f364e779-d2db-4f23-bc99-1d0b91dca426" path="/var/lib/kubelet/pods/f364e779-d2db-4f23-bc99-1d0b91dca426/volumes" Dec 02 09:51:47 crc kubenswrapper[4895]: I1202 09:51:47.246368 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/29a50169-d52e-4f5e-afcc-bae4041237b5-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"29a50169-d52e-4f5e-afcc-bae4041237b5\") " pod="openstack/nova-scheduler-0" Dec 02 09:51:47 crc kubenswrapper[4895]: I1202 09:51:47.246448 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwjlx\" (UniqueName: \"kubernetes.io/projected/29a50169-d52e-4f5e-afcc-bae4041237b5-kube-api-access-mwjlx\") pod \"nova-scheduler-0\" (UID: \"29a50169-d52e-4f5e-afcc-bae4041237b5\") " pod="openstack/nova-scheduler-0" Dec 02 09:51:47 crc kubenswrapper[4895]: I1202 09:51:47.246557 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29a50169-d52e-4f5e-afcc-bae4041237b5-config-data\") pod \"nova-scheduler-0\" (UID: \"29a50169-d52e-4f5e-afcc-bae4041237b5\") " pod="openstack/nova-scheduler-0" Dec 02 09:51:47 crc kubenswrapper[4895]: I1202 09:51:47.285386 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 09:51:47 crc kubenswrapper[4895]: I1202 09:51:47.349092 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29a50169-d52e-4f5e-afcc-bae4041237b5-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"29a50169-d52e-4f5e-afcc-bae4041237b5\") " pod="openstack/nova-scheduler-0" Dec 02 09:51:47 crc kubenswrapper[4895]: I1202 09:51:47.349171 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mwjlx\" (UniqueName: \"kubernetes.io/projected/29a50169-d52e-4f5e-afcc-bae4041237b5-kube-api-access-mwjlx\") pod \"nova-scheduler-0\" (UID: \"29a50169-d52e-4f5e-afcc-bae4041237b5\") " pod="openstack/nova-scheduler-0" Dec 02 09:51:47 crc kubenswrapper[4895]: I1202 09:51:47.349260 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29a50169-d52e-4f5e-afcc-bae4041237b5-config-data\") pod \"nova-scheduler-0\" (UID: \"29a50169-d52e-4f5e-afcc-bae4041237b5\") " pod="openstack/nova-scheduler-0" Dec 02 09:51:47 crc kubenswrapper[4895]: I1202 09:51:47.399472 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 02 09:51:47 crc kubenswrapper[4895]: I1202 09:51:47.751453 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwjlx\" (UniqueName: \"kubernetes.io/projected/29a50169-d52e-4f5e-afcc-bae4041237b5-kube-api-access-mwjlx\") pod \"nova-scheduler-0\" (UID: \"29a50169-d52e-4f5e-afcc-bae4041237b5\") " pod="openstack/nova-scheduler-0" Dec 02 09:51:47 crc kubenswrapper[4895]: I1202 09:51:47.751677 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29a50169-d52e-4f5e-afcc-bae4041237b5-config-data\") pod \"nova-scheduler-0\" (UID: \"29a50169-d52e-4f5e-afcc-bae4041237b5\") " pod="openstack/nova-scheduler-0" Dec 02 09:51:47 crc kubenswrapper[4895]: W1202 09:51:47.754478 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8f095a39_138a_49b5_b50c_a37ad8adff98.slice/crio-a6d66d1e2e04ae77c6a7d7732c15f3a48896027d84b84e3ac09bcbf6ad163b44 WatchSource:0}: Error finding container a6d66d1e2e04ae77c6a7d7732c15f3a48896027d84b84e3ac09bcbf6ad163b44: Status 404 returned error can't find the container with id a6d66d1e2e04ae77c6a7d7732c15f3a48896027d84b84e3ac09bcbf6ad163b44 Dec 02 09:51:47 crc kubenswrapper[4895]: I1202 09:51:47.755353 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29a50169-d52e-4f5e-afcc-bae4041237b5-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"29a50169-d52e-4f5e-afcc-bae4041237b5\") " pod="openstack/nova-scheduler-0" Dec 02 
09:51:47 crc kubenswrapper[4895]: W1202 09:51:47.756633 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcbc28dc3_fbee_4d3d_9c6c_88de443104cf.slice/crio-d3b16031d8e4f1e4385abf3f6d007f8a686ac45ada608d8fb73e0c4ba0bf4ea2 WatchSource:0}: Error finding container d3b16031d8e4f1e4385abf3f6d007f8a686ac45ada608d8fb73e0c4ba0bf4ea2: Status 404 returned error can't find the container with id d3b16031d8e4f1e4385abf3f6d007f8a686ac45ada608d8fb73e0c4ba0bf4ea2 Dec 02 09:51:48 crc kubenswrapper[4895]: I1202 09:51:48.040260 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"cbc28dc3-fbee-4d3d-9c6c-88de443104cf","Type":"ContainerStarted","Data":"d3b16031d8e4f1e4385abf3f6d007f8a686ac45ada608d8fb73e0c4ba0bf4ea2"} Dec 02 09:51:48 crc kubenswrapper[4895]: I1202 09:51:48.042214 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8f095a39-138a-49b5-b50c-a37ad8adff98","Type":"ContainerStarted","Data":"a6d66d1e2e04ae77c6a7d7732c15f3a48896027d84b84e3ac09bcbf6ad163b44"} Dec 02 09:51:48 crc kubenswrapper[4895]: I1202 09:51:48.051332 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 02 09:51:48 crc kubenswrapper[4895]: I1202 09:51:48.559997 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 09:51:48 crc kubenswrapper[4895]: W1202 09:51:48.562838 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod29a50169_d52e_4f5e_afcc_bae4041237b5.slice/crio-976783689a011b256fccfbb3069c79eb0524be3273aa6329c23eadfef0b9f623 WatchSource:0}: Error finding container 976783689a011b256fccfbb3069c79eb0524be3273aa6329c23eadfef0b9f623: Status 404 returned error can't find the container with id 976783689a011b256fccfbb3069c79eb0524be3273aa6329c23eadfef0b9f623 Dec 02 09:51:49 crc kubenswrapper[4895]: I1202 09:51:49.054979 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"29a50169-d52e-4f5e-afcc-bae4041237b5","Type":"ContainerStarted","Data":"976783689a011b256fccfbb3069c79eb0524be3273aa6329c23eadfef0b9f623"} Dec 02 09:51:50 crc kubenswrapper[4895]: I1202 09:51:50.065815 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8f095a39-138a-49b5-b50c-a37ad8adff98","Type":"ContainerStarted","Data":"9dce941e9e277973725014e49a57710886bf3c8d33fd804426a80a36ffcc78c8"} Dec 02 09:51:50 crc kubenswrapper[4895]: I1202 09:51:50.067399 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"29a50169-d52e-4f5e-afcc-bae4041237b5","Type":"ContainerStarted","Data":"c86d0f38838ad96133c2b4930f94f4f90cdfd0c073ed62e0aae9cf0e62592638"} Dec 02 09:51:50 crc kubenswrapper[4895]: I1202 09:51:50.069008 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"cbc28dc3-fbee-4d3d-9c6c-88de443104cf","Type":"ContainerStarted","Data":"888b8a785180619a064fe5355afcfacb067a521271ba0660a2406aa11827f0c1"} Dec 02 09:51:50 crc kubenswrapper[4895]: 
I1202 09:51:50.088395 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.088377438 podStartE2EDuration="3.088377438s" podCreationTimestamp="2025-12-02 09:51:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:51:50.082616879 +0000 UTC m=+8921.253476502" watchObservedRunningTime="2025-12-02 09:51:50.088377438 +0000 UTC m=+8921.259237051" Dec 02 09:51:51 crc kubenswrapper[4895]: I1202 09:51:51.080039 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"cbc28dc3-fbee-4d3d-9c6c-88de443104cf","Type":"ContainerStarted","Data":"33355bb4fc73156d2510baee487072af31d1ed544644c1c163afb2e9796b2ad0"} Dec 02 09:51:51 crc kubenswrapper[4895]: I1202 09:51:51.085273 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8f095a39-138a-49b5-b50c-a37ad8adff98","Type":"ContainerStarted","Data":"fa954b72121576bf10a507cb57912e375844dc42d08cab06a2b956c649591778"} Dec 02 09:51:51 crc kubenswrapper[4895]: I1202 09:51:51.104510 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=5.104473886 podStartE2EDuration="5.104473886s" podCreationTimestamp="2025-12-02 09:51:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:51:51.103038661 +0000 UTC m=+8922.273898284" watchObservedRunningTime="2025-12-02 09:51:51.104473886 +0000 UTC m=+8922.275333499" Dec 02 09:51:51 crc kubenswrapper[4895]: I1202 09:51:51.126982 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=5.126965857 podStartE2EDuration="5.126965857s" podCreationTimestamp="2025-12-02 09:51:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:51:51.119221176 +0000 UTC m=+8922.290080789" watchObservedRunningTime="2025-12-02 09:51:51.126965857 +0000 UTC m=+8922.297825470" Dec 02 09:51:51 crc kubenswrapper[4895]: I1202 09:51:51.775753 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 02 09:51:51 crc kubenswrapper[4895]: I1202 09:51:51.776130 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 02 09:51:53 crc kubenswrapper[4895]: I1202 09:51:53.052213 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 02 09:51:53 crc kubenswrapper[4895]: I1202 09:51:53.415450 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Dec 02 09:51:53 crc kubenswrapper[4895]: I1202 09:51:53.417072 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Dec 02 09:51:56 crc kubenswrapper[4895]: I1202 09:51:56.775787 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 02 09:51:56 crc kubenswrapper[4895]: I1202 09:51:56.776161 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 02 09:51:56 crc kubenswrapper[4895]: I1202 09:51:56.806013 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 02 09:51:56 crc kubenswrapper[4895]: I1202 09:51:56.806993 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 02 09:51:57 crc kubenswrapper[4895]: I1202 09:51:57.816120 4895 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="8f095a39-138a-49b5-b50c-a37ad8adff98" containerName="nova-metadata-log" probeResult="failure" 
output="Get \"http://10.217.1.190:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 02 09:51:57 crc kubenswrapper[4895]: I1202 09:51:57.938941 4895 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="cbc28dc3-fbee-4d3d-9c6c-88de443104cf" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.191:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 02 09:51:57 crc kubenswrapper[4895]: I1202 09:51:57.939027 4895 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="8f095a39-138a-49b5-b50c-a37ad8adff98" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.190:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 02 09:51:57 crc kubenswrapper[4895]: I1202 09:51:57.939068 4895 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="cbc28dc3-fbee-4d3d-9c6c-88de443104cf" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.191:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 02 09:51:58 crc kubenswrapper[4895]: I1202 09:51:58.052237 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 02 09:51:58 crc kubenswrapper[4895]: I1202 09:51:58.082061 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 02 09:51:58 crc kubenswrapper[4895]: I1202 09:51:58.187792 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 02 09:52:05 crc kubenswrapper[4895]: I1202 09:52:05.473119 4895 patch_prober.go:28] interesting pod/machine-config-daemon-wfcg7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 09:52:05 crc kubenswrapper[4895]: I1202 09:52:05.473689 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 09:52:06 crc kubenswrapper[4895]: I1202 09:52:06.779519 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 02 09:52:06 crc kubenswrapper[4895]: I1202 09:52:06.779672 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 02 09:52:06 crc kubenswrapper[4895]: I1202 09:52:06.782235 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 02 09:52:06 crc kubenswrapper[4895]: I1202 09:52:06.784261 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 02 09:52:06 crc kubenswrapper[4895]: I1202 09:52:06.811769 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 02 09:52:06 crc kubenswrapper[4895]: I1202 09:52:06.812159 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 02 09:52:06 crc kubenswrapper[4895]: I1202 09:52:06.812606 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 02 09:52:06 crc kubenswrapper[4895]: I1202 09:52:06.814675 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 02 09:52:07 crc kubenswrapper[4895]: I1202 09:52:07.237255 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 02 09:52:07 
crc kubenswrapper[4895]: I1202 09:52:07.242198 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 02 09:52:35 crc kubenswrapper[4895]: I1202 09:52:35.473701 4895 patch_prober.go:28] interesting pod/machine-config-daemon-wfcg7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 09:52:35 crc kubenswrapper[4895]: I1202 09:52:35.474365 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 09:52:35 crc kubenswrapper[4895]: I1202 09:52:35.474414 4895 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" Dec 02 09:52:35 crc kubenswrapper[4895]: I1202 09:52:35.475215 4895 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3875b8aae4bd4661e8a8ff646bd271b0c2ad8bb55fb213065a84de84dd95c15a"} pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 09:52:35 crc kubenswrapper[4895]: I1202 09:52:35.475281 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerName="machine-config-daemon" containerID="cri-o://3875b8aae4bd4661e8a8ff646bd271b0c2ad8bb55fb213065a84de84dd95c15a" gracePeriod=600 Dec 02 09:52:36 crc kubenswrapper[4895]: I1202 09:52:36.569066 4895 
generic.go:334] "Generic (PLEG): container finished" podID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerID="3875b8aae4bd4661e8a8ff646bd271b0c2ad8bb55fb213065a84de84dd95c15a" exitCode=0 Dec 02 09:52:36 crc kubenswrapper[4895]: I1202 09:52:36.569142 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" event={"ID":"0468d2d1-a975-45a6-af9f-47adc432fab0","Type":"ContainerDied","Data":"3875b8aae4bd4661e8a8ff646bd271b0c2ad8bb55fb213065a84de84dd95c15a"} Dec 02 09:52:36 crc kubenswrapper[4895]: I1202 09:52:36.569855 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" event={"ID":"0468d2d1-a975-45a6-af9f-47adc432fab0","Type":"ContainerStarted","Data":"fb24c8085688b9b476b4097e1f5ccaf4a5d4d236157b3b1913f7e381bd3cd823"} Dec 02 09:52:36 crc kubenswrapper[4895]: I1202 09:52:36.569902 4895 scope.go:117] "RemoveContainer" containerID="76c4faf03a96e2caf1e3feba1cf53c4dfd844f1889ed70d385af4e8c6d45622d" Dec 02 09:53:00 crc kubenswrapper[4895]: I1202 09:53:00.091698 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-4j9zn"] Dec 02 09:53:00 crc kubenswrapper[4895]: I1202 09:53:00.094777 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4j9zn" Dec 02 09:53:00 crc kubenswrapper[4895]: I1202 09:53:00.098314 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hd6d\" (UniqueName: \"kubernetes.io/projected/3f00628b-173f-490c-85d9-840012b8b0e9-kube-api-access-2hd6d\") pod \"redhat-marketplace-4j9zn\" (UID: \"3f00628b-173f-490c-85d9-840012b8b0e9\") " pod="openshift-marketplace/redhat-marketplace-4j9zn" Dec 02 09:53:00 crc kubenswrapper[4895]: I1202 09:53:00.098488 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f00628b-173f-490c-85d9-840012b8b0e9-utilities\") pod \"redhat-marketplace-4j9zn\" (UID: \"3f00628b-173f-490c-85d9-840012b8b0e9\") " pod="openshift-marketplace/redhat-marketplace-4j9zn" Dec 02 09:53:00 crc kubenswrapper[4895]: I1202 09:53:00.098727 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f00628b-173f-490c-85d9-840012b8b0e9-catalog-content\") pod \"redhat-marketplace-4j9zn\" (UID: \"3f00628b-173f-490c-85d9-840012b8b0e9\") " pod="openshift-marketplace/redhat-marketplace-4j9zn" Dec 02 09:53:00 crc kubenswrapper[4895]: I1202 09:53:00.104663 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4j9zn"] Dec 02 09:53:00 crc kubenswrapper[4895]: I1202 09:53:00.201422 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2hd6d\" (UniqueName: \"kubernetes.io/projected/3f00628b-173f-490c-85d9-840012b8b0e9-kube-api-access-2hd6d\") pod \"redhat-marketplace-4j9zn\" (UID: \"3f00628b-173f-490c-85d9-840012b8b0e9\") " pod="openshift-marketplace/redhat-marketplace-4j9zn" Dec 02 09:53:00 crc kubenswrapper[4895]: I1202 09:53:00.201722 4895 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f00628b-173f-490c-85d9-840012b8b0e9-utilities\") pod \"redhat-marketplace-4j9zn\" (UID: \"3f00628b-173f-490c-85d9-840012b8b0e9\") " pod="openshift-marketplace/redhat-marketplace-4j9zn" Dec 02 09:53:00 crc kubenswrapper[4895]: I1202 09:53:00.202002 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f00628b-173f-490c-85d9-840012b8b0e9-catalog-content\") pod \"redhat-marketplace-4j9zn\" (UID: \"3f00628b-173f-490c-85d9-840012b8b0e9\") " pod="openshift-marketplace/redhat-marketplace-4j9zn" Dec 02 09:53:00 crc kubenswrapper[4895]: I1202 09:53:00.202525 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f00628b-173f-490c-85d9-840012b8b0e9-utilities\") pod \"redhat-marketplace-4j9zn\" (UID: \"3f00628b-173f-490c-85d9-840012b8b0e9\") " pod="openshift-marketplace/redhat-marketplace-4j9zn" Dec 02 09:53:00 crc kubenswrapper[4895]: I1202 09:53:00.202641 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f00628b-173f-490c-85d9-840012b8b0e9-catalog-content\") pod \"redhat-marketplace-4j9zn\" (UID: \"3f00628b-173f-490c-85d9-840012b8b0e9\") " pod="openshift-marketplace/redhat-marketplace-4j9zn" Dec 02 09:53:00 crc kubenswrapper[4895]: I1202 09:53:00.230927 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hd6d\" (UniqueName: \"kubernetes.io/projected/3f00628b-173f-490c-85d9-840012b8b0e9-kube-api-access-2hd6d\") pod \"redhat-marketplace-4j9zn\" (UID: \"3f00628b-173f-490c-85d9-840012b8b0e9\") " pod="openshift-marketplace/redhat-marketplace-4j9zn" Dec 02 09:53:00 crc kubenswrapper[4895]: I1202 09:53:00.446613 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4j9zn" Dec 02 09:53:00 crc kubenswrapper[4895]: I1202 09:53:00.938345 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4j9zn"] Dec 02 09:53:01 crc kubenswrapper[4895]: I1202 09:53:01.836264 4895 generic.go:334] "Generic (PLEG): container finished" podID="3f00628b-173f-490c-85d9-840012b8b0e9" containerID="e70c5a9c4b3a99e097f6bddb6747171f249613fb10f2417065fb8194228d55d0" exitCode=0 Dec 02 09:53:01 crc kubenswrapper[4895]: I1202 09:53:01.836530 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4j9zn" event={"ID":"3f00628b-173f-490c-85d9-840012b8b0e9","Type":"ContainerDied","Data":"e70c5a9c4b3a99e097f6bddb6747171f249613fb10f2417065fb8194228d55d0"} Dec 02 09:53:01 crc kubenswrapper[4895]: I1202 09:53:01.837174 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4j9zn" event={"ID":"3f00628b-173f-490c-85d9-840012b8b0e9","Type":"ContainerStarted","Data":"09585259157ad2ce65b49a96ca99e4de6388d6d35199c377e9c77e6ab0cf0134"} Dec 02 09:53:01 crc kubenswrapper[4895]: I1202 09:53:01.839867 4895 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 02 09:53:03 crc kubenswrapper[4895]: I1202 09:53:03.858639 4895 generic.go:334] "Generic (PLEG): container finished" podID="3f00628b-173f-490c-85d9-840012b8b0e9" containerID="0e52ff5a0257563880bf4890ba0c4c8713d1bb9082918b4ce635380e3231b6b2" exitCode=0 Dec 02 09:53:03 crc kubenswrapper[4895]: I1202 09:53:03.858997 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4j9zn" event={"ID":"3f00628b-173f-490c-85d9-840012b8b0e9","Type":"ContainerDied","Data":"0e52ff5a0257563880bf4890ba0c4c8713d1bb9082918b4ce635380e3231b6b2"} Dec 02 09:53:04 crc kubenswrapper[4895]: I1202 09:53:04.885241 4895 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/redhat-marketplace-4j9zn" event={"ID":"3f00628b-173f-490c-85d9-840012b8b0e9","Type":"ContainerStarted","Data":"1eb774d9a6dd1e92ab120de579659198e3585ce0b5b8405732c2919754f12221"} Dec 02 09:53:04 crc kubenswrapper[4895]: I1202 09:53:04.911816 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-4j9zn" podStartSLOduration=2.470953357 podStartE2EDuration="4.911794448s" podCreationTimestamp="2025-12-02 09:53:00 +0000 UTC" firstStartedPulling="2025-12-02 09:53:01.839527497 +0000 UTC m=+8993.010387110" lastFinishedPulling="2025-12-02 09:53:04.280368598 +0000 UTC m=+8995.451228201" observedRunningTime="2025-12-02 09:53:04.911181519 +0000 UTC m=+8996.082041132" watchObservedRunningTime="2025-12-02 09:53:04.911794448 +0000 UTC m=+8996.082654051" Dec 02 09:53:10 crc kubenswrapper[4895]: I1202 09:53:10.447386 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-4j9zn" Dec 02 09:53:10 crc kubenswrapper[4895]: I1202 09:53:10.447978 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-4j9zn" Dec 02 09:53:10 crc kubenswrapper[4895]: I1202 09:53:10.497638 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-4j9zn" Dec 02 09:53:10 crc kubenswrapper[4895]: I1202 09:53:10.998974 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-4j9zn" Dec 02 09:53:12 crc kubenswrapper[4895]: I1202 09:53:12.684935 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4j9zn"] Dec 02 09:53:12 crc kubenswrapper[4895]: I1202 09:53:12.965303 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-4j9zn" 
podUID="3f00628b-173f-490c-85d9-840012b8b0e9" containerName="registry-server" containerID="cri-o://1eb774d9a6dd1e92ab120de579659198e3585ce0b5b8405732c2919754f12221" gracePeriod=2 Dec 02 09:53:13 crc kubenswrapper[4895]: I1202 09:53:13.547926 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4j9zn" Dec 02 09:53:13 crc kubenswrapper[4895]: I1202 09:53:13.712357 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f00628b-173f-490c-85d9-840012b8b0e9-utilities\") pod \"3f00628b-173f-490c-85d9-840012b8b0e9\" (UID: \"3f00628b-173f-490c-85d9-840012b8b0e9\") " Dec 02 09:53:13 crc kubenswrapper[4895]: I1202 09:53:13.712840 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f00628b-173f-490c-85d9-840012b8b0e9-catalog-content\") pod \"3f00628b-173f-490c-85d9-840012b8b0e9\" (UID: \"3f00628b-173f-490c-85d9-840012b8b0e9\") " Dec 02 09:53:13 crc kubenswrapper[4895]: I1202 09:53:13.712952 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2hd6d\" (UniqueName: \"kubernetes.io/projected/3f00628b-173f-490c-85d9-840012b8b0e9-kube-api-access-2hd6d\") pod \"3f00628b-173f-490c-85d9-840012b8b0e9\" (UID: \"3f00628b-173f-490c-85d9-840012b8b0e9\") " Dec 02 09:53:13 crc kubenswrapper[4895]: I1202 09:53:13.714939 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f00628b-173f-490c-85d9-840012b8b0e9-utilities" (OuterVolumeSpecName: "utilities") pod "3f00628b-173f-490c-85d9-840012b8b0e9" (UID: "3f00628b-173f-490c-85d9-840012b8b0e9"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:53:13 crc kubenswrapper[4895]: I1202 09:53:13.729538 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f00628b-173f-490c-85d9-840012b8b0e9-kube-api-access-2hd6d" (OuterVolumeSpecName: "kube-api-access-2hd6d") pod "3f00628b-173f-490c-85d9-840012b8b0e9" (UID: "3f00628b-173f-490c-85d9-840012b8b0e9"). InnerVolumeSpecName "kube-api-access-2hd6d". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:53:13 crc kubenswrapper[4895]: I1202 09:53:13.740188 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f00628b-173f-490c-85d9-840012b8b0e9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3f00628b-173f-490c-85d9-840012b8b0e9" (UID: "3f00628b-173f-490c-85d9-840012b8b0e9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:53:13 crc kubenswrapper[4895]: I1202 09:53:13.816426 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2hd6d\" (UniqueName: \"kubernetes.io/projected/3f00628b-173f-490c-85d9-840012b8b0e9-kube-api-access-2hd6d\") on node \"crc\" DevicePath \"\"" Dec 02 09:53:13 crc kubenswrapper[4895]: I1202 09:53:13.816473 4895 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f00628b-173f-490c-85d9-840012b8b0e9-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 09:53:13 crc kubenswrapper[4895]: I1202 09:53:13.816486 4895 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f00628b-173f-490c-85d9-840012b8b0e9-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 09:53:13 crc kubenswrapper[4895]: I1202 09:53:13.978890 4895 generic.go:334] "Generic (PLEG): container finished" podID="3f00628b-173f-490c-85d9-840012b8b0e9" 
containerID="1eb774d9a6dd1e92ab120de579659198e3585ce0b5b8405732c2919754f12221" exitCode=0 Dec 02 09:53:13 crc kubenswrapper[4895]: I1202 09:53:13.978943 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4j9zn" event={"ID":"3f00628b-173f-490c-85d9-840012b8b0e9","Type":"ContainerDied","Data":"1eb774d9a6dd1e92ab120de579659198e3585ce0b5b8405732c2919754f12221"} Dec 02 09:53:13 crc kubenswrapper[4895]: I1202 09:53:13.978977 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4j9zn" event={"ID":"3f00628b-173f-490c-85d9-840012b8b0e9","Type":"ContainerDied","Data":"09585259157ad2ce65b49a96ca99e4de6388d6d35199c377e9c77e6ab0cf0134"} Dec 02 09:53:13 crc kubenswrapper[4895]: I1202 09:53:13.978970 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4j9zn" Dec 02 09:53:13 crc kubenswrapper[4895]: I1202 09:53:13.978991 4895 scope.go:117] "RemoveContainer" containerID="1eb774d9a6dd1e92ab120de579659198e3585ce0b5b8405732c2919754f12221" Dec 02 09:53:14 crc kubenswrapper[4895]: I1202 09:53:14.004633 4895 scope.go:117] "RemoveContainer" containerID="0e52ff5a0257563880bf4890ba0c4c8713d1bb9082918b4ce635380e3231b6b2" Dec 02 09:53:14 crc kubenswrapper[4895]: I1202 09:53:14.030472 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4j9zn"] Dec 02 09:53:14 crc kubenswrapper[4895]: I1202 09:53:14.039989 4895 scope.go:117] "RemoveContainer" containerID="e70c5a9c4b3a99e097f6bddb6747171f249613fb10f2417065fb8194228d55d0" Dec 02 09:53:14 crc kubenswrapper[4895]: I1202 09:53:14.043336 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-4j9zn"] Dec 02 09:53:14 crc kubenswrapper[4895]: I1202 09:53:14.095225 4895 scope.go:117] "RemoveContainer" containerID="1eb774d9a6dd1e92ab120de579659198e3585ce0b5b8405732c2919754f12221" Dec 02 
09:53:14 crc kubenswrapper[4895]: E1202 09:53:14.095890 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1eb774d9a6dd1e92ab120de579659198e3585ce0b5b8405732c2919754f12221\": container with ID starting with 1eb774d9a6dd1e92ab120de579659198e3585ce0b5b8405732c2919754f12221 not found: ID does not exist" containerID="1eb774d9a6dd1e92ab120de579659198e3585ce0b5b8405732c2919754f12221" Dec 02 09:53:14 crc kubenswrapper[4895]: I1202 09:53:14.095930 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1eb774d9a6dd1e92ab120de579659198e3585ce0b5b8405732c2919754f12221"} err="failed to get container status \"1eb774d9a6dd1e92ab120de579659198e3585ce0b5b8405732c2919754f12221\": rpc error: code = NotFound desc = could not find container \"1eb774d9a6dd1e92ab120de579659198e3585ce0b5b8405732c2919754f12221\": container with ID starting with 1eb774d9a6dd1e92ab120de579659198e3585ce0b5b8405732c2919754f12221 not found: ID does not exist" Dec 02 09:53:14 crc kubenswrapper[4895]: I1202 09:53:14.095962 4895 scope.go:117] "RemoveContainer" containerID="0e52ff5a0257563880bf4890ba0c4c8713d1bb9082918b4ce635380e3231b6b2" Dec 02 09:53:14 crc kubenswrapper[4895]: E1202 09:53:14.096296 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e52ff5a0257563880bf4890ba0c4c8713d1bb9082918b4ce635380e3231b6b2\": container with ID starting with 0e52ff5a0257563880bf4890ba0c4c8713d1bb9082918b4ce635380e3231b6b2 not found: ID does not exist" containerID="0e52ff5a0257563880bf4890ba0c4c8713d1bb9082918b4ce635380e3231b6b2" Dec 02 09:53:14 crc kubenswrapper[4895]: I1202 09:53:14.096321 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e52ff5a0257563880bf4890ba0c4c8713d1bb9082918b4ce635380e3231b6b2"} err="failed to get container status 
\"0e52ff5a0257563880bf4890ba0c4c8713d1bb9082918b4ce635380e3231b6b2\": rpc error: code = NotFound desc = could not find container \"0e52ff5a0257563880bf4890ba0c4c8713d1bb9082918b4ce635380e3231b6b2\": container with ID starting with 0e52ff5a0257563880bf4890ba0c4c8713d1bb9082918b4ce635380e3231b6b2 not found: ID does not exist" Dec 02 09:53:14 crc kubenswrapper[4895]: I1202 09:53:14.096338 4895 scope.go:117] "RemoveContainer" containerID="e70c5a9c4b3a99e097f6bddb6747171f249613fb10f2417065fb8194228d55d0" Dec 02 09:53:14 crc kubenswrapper[4895]: E1202 09:53:14.096699 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e70c5a9c4b3a99e097f6bddb6747171f249613fb10f2417065fb8194228d55d0\": container with ID starting with e70c5a9c4b3a99e097f6bddb6747171f249613fb10f2417065fb8194228d55d0 not found: ID does not exist" containerID="e70c5a9c4b3a99e097f6bddb6747171f249613fb10f2417065fb8194228d55d0" Dec 02 09:53:14 crc kubenswrapper[4895]: I1202 09:53:14.096728 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e70c5a9c4b3a99e097f6bddb6747171f249613fb10f2417065fb8194228d55d0"} err="failed to get container status \"e70c5a9c4b3a99e097f6bddb6747171f249613fb10f2417065fb8194228d55d0\": rpc error: code = NotFound desc = could not find container \"e70c5a9c4b3a99e097f6bddb6747171f249613fb10f2417065fb8194228d55d0\": container with ID starting with e70c5a9c4b3a99e097f6bddb6747171f249613fb10f2417065fb8194228d55d0 not found: ID does not exist" Dec 02 09:53:15 crc kubenswrapper[4895]: I1202 09:53:15.154899 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f00628b-173f-490c-85d9-840012b8b0e9" path="/var/lib/kubelet/pods/3f00628b-173f-490c-85d9-840012b8b0e9/volumes" Dec 02 09:54:35 crc kubenswrapper[4895]: I1202 09:54:35.473700 4895 patch_prober.go:28] interesting pod/machine-config-daemon-wfcg7 container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 09:54:35 crc kubenswrapper[4895]: I1202 09:54:35.474418 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 09:55:05 crc kubenswrapper[4895]: I1202 09:55:05.473275 4895 patch_prober.go:28] interesting pod/machine-config-daemon-wfcg7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 09:55:05 crc kubenswrapper[4895]: I1202 09:55:05.474083 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 09:55:28 crc kubenswrapper[4895]: I1202 09:55:28.871777 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-pgd7z"] Dec 02 09:55:28 crc kubenswrapper[4895]: E1202 09:55:28.873185 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f00628b-173f-490c-85d9-840012b8b0e9" containerName="extract-utilities" Dec 02 09:55:28 crc kubenswrapper[4895]: I1202 09:55:28.873208 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f00628b-173f-490c-85d9-840012b8b0e9" containerName="extract-utilities" Dec 02 09:55:28 crc kubenswrapper[4895]: E1202 09:55:28.873231 4895 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="3f00628b-173f-490c-85d9-840012b8b0e9" containerName="registry-server" Dec 02 09:55:28 crc kubenswrapper[4895]: I1202 09:55:28.873239 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f00628b-173f-490c-85d9-840012b8b0e9" containerName="registry-server" Dec 02 09:55:28 crc kubenswrapper[4895]: E1202 09:55:28.873262 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f00628b-173f-490c-85d9-840012b8b0e9" containerName="extract-content" Dec 02 09:55:28 crc kubenswrapper[4895]: I1202 09:55:28.873269 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f00628b-173f-490c-85d9-840012b8b0e9" containerName="extract-content" Dec 02 09:55:28 crc kubenswrapper[4895]: I1202 09:55:28.873549 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f00628b-173f-490c-85d9-840012b8b0e9" containerName="registry-server" Dec 02 09:55:28 crc kubenswrapper[4895]: I1202 09:55:28.877017 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pgd7z" Dec 02 09:55:28 crc kubenswrapper[4895]: I1202 09:55:28.889663 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pgd7z"] Dec 02 09:55:29 crc kubenswrapper[4895]: I1202 09:55:29.026299 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/755e130f-ee77-449d-92fe-2ca53ab52fbd-catalog-content\") pod \"community-operators-pgd7z\" (UID: \"755e130f-ee77-449d-92fe-2ca53ab52fbd\") " pod="openshift-marketplace/community-operators-pgd7z" Dec 02 09:55:29 crc kubenswrapper[4895]: I1202 09:55:29.026880 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/755e130f-ee77-449d-92fe-2ca53ab52fbd-utilities\") pod \"community-operators-pgd7z\" (UID: \"755e130f-ee77-449d-92fe-2ca53ab52fbd\") " pod="openshift-marketplace/community-operators-pgd7z" Dec 02 09:55:29 crc kubenswrapper[4895]: I1202 09:55:29.027248 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szq77\" (UniqueName: \"kubernetes.io/projected/755e130f-ee77-449d-92fe-2ca53ab52fbd-kube-api-access-szq77\") pod \"community-operators-pgd7z\" (UID: \"755e130f-ee77-449d-92fe-2ca53ab52fbd\") " pod="openshift-marketplace/community-operators-pgd7z" Dec 02 09:55:29 crc kubenswrapper[4895]: I1202 09:55:29.129351 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-szq77\" (UniqueName: \"kubernetes.io/projected/755e130f-ee77-449d-92fe-2ca53ab52fbd-kube-api-access-szq77\") pod \"community-operators-pgd7z\" (UID: \"755e130f-ee77-449d-92fe-2ca53ab52fbd\") " pod="openshift-marketplace/community-operators-pgd7z" Dec 02 09:55:29 crc kubenswrapper[4895]: I1202 09:55:29.129442 4895 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/755e130f-ee77-449d-92fe-2ca53ab52fbd-catalog-content\") pod \"community-operators-pgd7z\" (UID: \"755e130f-ee77-449d-92fe-2ca53ab52fbd\") " pod="openshift-marketplace/community-operators-pgd7z" Dec 02 09:55:29 crc kubenswrapper[4895]: I1202 09:55:29.129593 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/755e130f-ee77-449d-92fe-2ca53ab52fbd-utilities\") pod \"community-operators-pgd7z\" (UID: \"755e130f-ee77-449d-92fe-2ca53ab52fbd\") " pod="openshift-marketplace/community-operators-pgd7z" Dec 02 09:55:29 crc kubenswrapper[4895]: I1202 09:55:29.130160 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/755e130f-ee77-449d-92fe-2ca53ab52fbd-catalog-content\") pod \"community-operators-pgd7z\" (UID: \"755e130f-ee77-449d-92fe-2ca53ab52fbd\") " pod="openshift-marketplace/community-operators-pgd7z" Dec 02 09:55:29 crc kubenswrapper[4895]: I1202 09:55:29.130236 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/755e130f-ee77-449d-92fe-2ca53ab52fbd-utilities\") pod \"community-operators-pgd7z\" (UID: \"755e130f-ee77-449d-92fe-2ca53ab52fbd\") " pod="openshift-marketplace/community-operators-pgd7z" Dec 02 09:55:29 crc kubenswrapper[4895]: I1202 09:55:29.150792 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-szq77\" (UniqueName: \"kubernetes.io/projected/755e130f-ee77-449d-92fe-2ca53ab52fbd-kube-api-access-szq77\") pod \"community-operators-pgd7z\" (UID: \"755e130f-ee77-449d-92fe-2ca53ab52fbd\") " pod="openshift-marketplace/community-operators-pgd7z" Dec 02 09:55:29 crc kubenswrapper[4895]: I1202 09:55:29.203549 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pgd7z" Dec 02 09:55:29 crc kubenswrapper[4895]: I1202 09:55:29.810226 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pgd7z"] Dec 02 09:55:30 crc kubenswrapper[4895]: I1202 09:55:30.539223 4895 generic.go:334] "Generic (PLEG): container finished" podID="755e130f-ee77-449d-92fe-2ca53ab52fbd" containerID="bac2cfd87a67b0e17909558bdc5f80441269184c8a89248f8eb0faaeb90caf74" exitCode=0 Dec 02 09:55:30 crc kubenswrapper[4895]: I1202 09:55:30.539669 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pgd7z" event={"ID":"755e130f-ee77-449d-92fe-2ca53ab52fbd","Type":"ContainerDied","Data":"bac2cfd87a67b0e17909558bdc5f80441269184c8a89248f8eb0faaeb90caf74"} Dec 02 09:55:30 crc kubenswrapper[4895]: I1202 09:55:30.539722 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pgd7z" event={"ID":"755e130f-ee77-449d-92fe-2ca53ab52fbd","Type":"ContainerStarted","Data":"d770de189093e16776c1274943fcd30e4555cbd2c746dae639370e8a1de8f3ff"} Dec 02 09:55:35 crc kubenswrapper[4895]: I1202 09:55:35.473709 4895 patch_prober.go:28] interesting pod/machine-config-daemon-wfcg7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 09:55:35 crc kubenswrapper[4895]: I1202 09:55:35.474330 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 09:55:35 crc kubenswrapper[4895]: I1202 09:55:35.474378 4895 kubelet.go:2542] "SyncLoop 
(probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" Dec 02 09:55:35 crc kubenswrapper[4895]: I1202 09:55:35.474933 4895 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"fb24c8085688b9b476b4097e1f5ccaf4a5d4d236157b3b1913f7e381bd3cd823"} pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 09:55:35 crc kubenswrapper[4895]: I1202 09:55:35.474989 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerName="machine-config-daemon" containerID="cri-o://fb24c8085688b9b476b4097e1f5ccaf4a5d4d236157b3b1913f7e381bd3cd823" gracePeriod=600 Dec 02 09:55:35 crc kubenswrapper[4895]: I1202 09:55:35.595818 4895 generic.go:334] "Generic (PLEG): container finished" podID="755e130f-ee77-449d-92fe-2ca53ab52fbd" containerID="db390a99672ed8e48336f48b84a61c5e92d569822a5c8a37ed290b817b9c214c" exitCode=0 Dec 02 09:55:35 crc kubenswrapper[4895]: I1202 09:55:35.595863 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pgd7z" event={"ID":"755e130f-ee77-449d-92fe-2ca53ab52fbd","Type":"ContainerDied","Data":"db390a99672ed8e48336f48b84a61c5e92d569822a5c8a37ed290b817b9c214c"} Dec 02 09:55:35 crc kubenswrapper[4895]: E1202 09:55:35.599780 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" 
podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:55:36 crc kubenswrapper[4895]: I1202 09:55:36.608540 4895 generic.go:334] "Generic (PLEG): container finished" podID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerID="fb24c8085688b9b476b4097e1f5ccaf4a5d4d236157b3b1913f7e381bd3cd823" exitCode=0 Dec 02 09:55:36 crc kubenswrapper[4895]: I1202 09:55:36.608606 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" event={"ID":"0468d2d1-a975-45a6-af9f-47adc432fab0","Type":"ContainerDied","Data":"fb24c8085688b9b476b4097e1f5ccaf4a5d4d236157b3b1913f7e381bd3cd823"} Dec 02 09:55:36 crc kubenswrapper[4895]: I1202 09:55:36.608942 4895 scope.go:117] "RemoveContainer" containerID="3875b8aae4bd4661e8a8ff646bd271b0c2ad8bb55fb213065a84de84dd95c15a" Dec 02 09:55:36 crc kubenswrapper[4895]: I1202 09:55:36.609823 4895 scope.go:117] "RemoveContainer" containerID="fb24c8085688b9b476b4097e1f5ccaf4a5d4d236157b3b1913f7e381bd3cd823" Dec 02 09:55:36 crc kubenswrapper[4895]: E1202 09:55:36.610089 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:55:38 crc kubenswrapper[4895]: I1202 09:55:38.647223 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pgd7z" event={"ID":"755e130f-ee77-449d-92fe-2ca53ab52fbd","Type":"ContainerStarted","Data":"e63a85a35c410beba48ff49fe5e46ce35fb34f1d1d6eb4b941b99acaaa7d5522"} Dec 02 09:55:38 crc kubenswrapper[4895]: I1202 09:55:38.667989 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/community-operators-pgd7z" podStartSLOduration=3.850243769 podStartE2EDuration="10.667971154s" podCreationTimestamp="2025-12-02 09:55:28 +0000 UTC" firstStartedPulling="2025-12-02 09:55:30.542411617 +0000 UTC m=+9141.713271260" lastFinishedPulling="2025-12-02 09:55:37.360139032 +0000 UTC m=+9148.530998645" observedRunningTime="2025-12-02 09:55:38.66721936 +0000 UTC m=+9149.838079003" watchObservedRunningTime="2025-12-02 09:55:38.667971154 +0000 UTC m=+9149.838830757" Dec 02 09:55:39 crc kubenswrapper[4895]: I1202 09:55:39.204770 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-pgd7z" Dec 02 09:55:39 crc kubenswrapper[4895]: I1202 09:55:39.204816 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-pgd7z" Dec 02 09:55:40 crc kubenswrapper[4895]: I1202 09:55:40.252148 4895 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-pgd7z" podUID="755e130f-ee77-449d-92fe-2ca53ab52fbd" containerName="registry-server" probeResult="failure" output=< Dec 02 09:55:40 crc kubenswrapper[4895]: timeout: failed to connect service ":50051" within 1s Dec 02 09:55:40 crc kubenswrapper[4895]: > Dec 02 09:55:49 crc kubenswrapper[4895]: I1202 09:55:49.252633 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-pgd7z" Dec 02 09:55:49 crc kubenswrapper[4895]: I1202 09:55:49.305807 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-pgd7z" Dec 02 09:55:49 crc kubenswrapper[4895]: I1202 09:55:49.373980 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pgd7z"] Dec 02 09:55:49 crc kubenswrapper[4895]: I1202 09:55:49.504876 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-9wbpq"] Dec 02 09:55:49 crc kubenswrapper[4895]: I1202 09:55:49.505243 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-9wbpq" podUID="2bc79272-641a-44ab-b45c-d459fbdb4f81" containerName="registry-server" containerID="cri-o://00de8ed32c50f197fdac75aa57eedb83dae05e429968e44ebd0c0e2e19d986f7" gracePeriod=2 Dec 02 09:55:49 crc kubenswrapper[4895]: I1202 09:55:49.752840 4895 generic.go:334] "Generic (PLEG): container finished" podID="2bc79272-641a-44ab-b45c-d459fbdb4f81" containerID="00de8ed32c50f197fdac75aa57eedb83dae05e429968e44ebd0c0e2e19d986f7" exitCode=0 Dec 02 09:55:49 crc kubenswrapper[4895]: I1202 09:55:49.752915 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9wbpq" event={"ID":"2bc79272-641a-44ab-b45c-d459fbdb4f81","Type":"ContainerDied","Data":"00de8ed32c50f197fdac75aa57eedb83dae05e429968e44ebd0c0e2e19d986f7"} Dec 02 09:55:50 crc kubenswrapper[4895]: I1202 09:55:50.025547 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9wbpq" Dec 02 09:55:50 crc kubenswrapper[4895]: I1202 09:55:50.091083 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-746nv\" (UniqueName: \"kubernetes.io/projected/2bc79272-641a-44ab-b45c-d459fbdb4f81-kube-api-access-746nv\") pod \"2bc79272-641a-44ab-b45c-d459fbdb4f81\" (UID: \"2bc79272-641a-44ab-b45c-d459fbdb4f81\") " Dec 02 09:55:50 crc kubenswrapper[4895]: I1202 09:55:50.091166 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2bc79272-641a-44ab-b45c-d459fbdb4f81-catalog-content\") pod \"2bc79272-641a-44ab-b45c-d459fbdb4f81\" (UID: \"2bc79272-641a-44ab-b45c-d459fbdb4f81\") " Dec 02 09:55:50 crc kubenswrapper[4895]: I1202 09:55:50.091210 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2bc79272-641a-44ab-b45c-d459fbdb4f81-utilities\") pod \"2bc79272-641a-44ab-b45c-d459fbdb4f81\" (UID: \"2bc79272-641a-44ab-b45c-d459fbdb4f81\") " Dec 02 09:55:50 crc kubenswrapper[4895]: I1202 09:55:50.092348 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2bc79272-641a-44ab-b45c-d459fbdb4f81-utilities" (OuterVolumeSpecName: "utilities") pod "2bc79272-641a-44ab-b45c-d459fbdb4f81" (UID: "2bc79272-641a-44ab-b45c-d459fbdb4f81"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:55:50 crc kubenswrapper[4895]: I1202 09:55:50.102438 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2bc79272-641a-44ab-b45c-d459fbdb4f81-kube-api-access-746nv" (OuterVolumeSpecName: "kube-api-access-746nv") pod "2bc79272-641a-44ab-b45c-d459fbdb4f81" (UID: "2bc79272-641a-44ab-b45c-d459fbdb4f81"). InnerVolumeSpecName "kube-api-access-746nv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:55:50 crc kubenswrapper[4895]: I1202 09:55:50.171041 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2bc79272-641a-44ab-b45c-d459fbdb4f81-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2bc79272-641a-44ab-b45c-d459fbdb4f81" (UID: "2bc79272-641a-44ab-b45c-d459fbdb4f81"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 09:55:50 crc kubenswrapper[4895]: I1202 09:55:50.194568 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-746nv\" (UniqueName: \"kubernetes.io/projected/2bc79272-641a-44ab-b45c-d459fbdb4f81-kube-api-access-746nv\") on node \"crc\" DevicePath \"\"" Dec 02 09:55:50 crc kubenswrapper[4895]: I1202 09:55:50.194620 4895 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2bc79272-641a-44ab-b45c-d459fbdb4f81-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 09:55:50 crc kubenswrapper[4895]: I1202 09:55:50.194631 4895 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2bc79272-641a-44ab-b45c-d459fbdb4f81-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 09:55:50 crc kubenswrapper[4895]: I1202 09:55:50.765769 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9wbpq" Dec 02 09:55:50 crc kubenswrapper[4895]: I1202 09:55:50.765788 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9wbpq" event={"ID":"2bc79272-641a-44ab-b45c-d459fbdb4f81","Type":"ContainerDied","Data":"3eae0340249f3ad15a5fc4ba9fdf300879a28280bdf2fbcf30c0f54352910b20"} Dec 02 09:55:50 crc kubenswrapper[4895]: I1202 09:55:50.765854 4895 scope.go:117] "RemoveContainer" containerID="00de8ed32c50f197fdac75aa57eedb83dae05e429968e44ebd0c0e2e19d986f7" Dec 02 09:55:50 crc kubenswrapper[4895]: I1202 09:55:50.812914 4895 scope.go:117] "RemoveContainer" containerID="95c2ac2f7c36142d5657f3ec2b136e73b2a9689a9996befad56e0eb984388af0" Dec 02 09:55:50 crc kubenswrapper[4895]: I1202 09:55:50.814466 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9wbpq"] Dec 02 09:55:50 crc kubenswrapper[4895]: I1202 09:55:50.843114 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-9wbpq"] Dec 02 09:55:50 crc kubenswrapper[4895]: I1202 09:55:50.857914 4895 scope.go:117] "RemoveContainer" containerID="de3bbac5a74a5cc0810dd3da96379636acb4ad04b50cb2ee2965d0422034427e" Dec 02 09:55:51 crc kubenswrapper[4895]: I1202 09:55:51.142204 4895 scope.go:117] "RemoveContainer" containerID="fb24c8085688b9b476b4097e1f5ccaf4a5d4d236157b3b1913f7e381bd3cd823" Dec 02 09:55:51 crc kubenswrapper[4895]: E1202 09:55:51.142896 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:55:51 crc kubenswrapper[4895]: 
I1202 09:55:51.156561 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2bc79272-641a-44ab-b45c-d459fbdb4f81" path="/var/lib/kubelet/pods/2bc79272-641a-44ab-b45c-d459fbdb4f81/volumes" Dec 02 09:55:54 crc kubenswrapper[4895]: I1202 09:55:54.822577 4895 generic.go:334] "Generic (PLEG): container finished" podID="427eea9a-0bfb-4a1a-a225-c4264018fd13" containerID="d08d1f1298efd9dca94b4f3c64c6ba23a878075628e619075fca941893bbc1ff" exitCode=0 Dec 02 09:55:54 crc kubenswrapper[4895]: I1202 09:55:54.822664 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgkqcz" event={"ID":"427eea9a-0bfb-4a1a-a225-c4264018fd13","Type":"ContainerDied","Data":"d08d1f1298efd9dca94b4f3c64c6ba23a878075628e619075fca941893bbc1ff"} Dec 02 09:55:56 crc kubenswrapper[4895]: I1202 09:55:56.323316 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgkqcz" Dec 02 09:55:56 crc kubenswrapper[4895]: I1202 09:55:56.427228 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/427eea9a-0bfb-4a1a-a225-c4264018fd13-nova-cell1-combined-ca-bundle\") pod \"427eea9a-0bfb-4a1a-a225-c4264018fd13\" (UID: \"427eea9a-0bfb-4a1a-a225-c4264018fd13\") " Dec 02 09:55:56 crc kubenswrapper[4895]: I1202 09:55:56.427286 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/427eea9a-0bfb-4a1a-a225-c4264018fd13-nova-migration-ssh-key-0\") pod \"427eea9a-0bfb-4a1a-a225-c4264018fd13\" (UID: \"427eea9a-0bfb-4a1a-a225-c4264018fd13\") " Dec 02 09:55:56 crc kubenswrapper[4895]: I1202 09:55:56.427374 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/427eea9a-0bfb-4a1a-a225-c4264018fd13-ssh-key\") pod \"427eea9a-0bfb-4a1a-a225-c4264018fd13\" (UID: \"427eea9a-0bfb-4a1a-a225-c4264018fd13\") " Dec 02 09:55:56 crc kubenswrapper[4895]: I1202 09:55:56.427560 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/427eea9a-0bfb-4a1a-a225-c4264018fd13-nova-cells-global-config-1\") pod \"427eea9a-0bfb-4a1a-a225-c4264018fd13\" (UID: \"427eea9a-0bfb-4a1a-a225-c4264018fd13\") " Dec 02 09:55:56 crc kubenswrapper[4895]: I1202 09:55:56.427646 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c86pw\" (UniqueName: \"kubernetes.io/projected/427eea9a-0bfb-4a1a-a225-c4264018fd13-kube-api-access-c86pw\") pod \"427eea9a-0bfb-4a1a-a225-c4264018fd13\" (UID: \"427eea9a-0bfb-4a1a-a225-c4264018fd13\") " Dec 02 09:55:56 crc kubenswrapper[4895]: I1202 09:55:56.427710 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/427eea9a-0bfb-4a1a-a225-c4264018fd13-nova-migration-ssh-key-1\") pod \"427eea9a-0bfb-4a1a-a225-c4264018fd13\" (UID: \"427eea9a-0bfb-4a1a-a225-c4264018fd13\") " Dec 02 09:55:56 crc kubenswrapper[4895]: I1202 09:55:56.427774 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/427eea9a-0bfb-4a1a-a225-c4264018fd13-inventory\") pod \"427eea9a-0bfb-4a1a-a225-c4264018fd13\" (UID: \"427eea9a-0bfb-4a1a-a225-c4264018fd13\") " Dec 02 09:55:56 crc kubenswrapper[4895]: I1202 09:55:56.427830 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/427eea9a-0bfb-4a1a-a225-c4264018fd13-nova-cell1-compute-config-0\") pod \"427eea9a-0bfb-4a1a-a225-c4264018fd13\" (UID: \"427eea9a-0bfb-4a1a-a225-c4264018fd13\") 
" Dec 02 09:55:56 crc kubenswrapper[4895]: I1202 09:55:56.427882 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/427eea9a-0bfb-4a1a-a225-c4264018fd13-nova-cells-global-config-0\") pod \"427eea9a-0bfb-4a1a-a225-c4264018fd13\" (UID: \"427eea9a-0bfb-4a1a-a225-c4264018fd13\") " Dec 02 09:55:56 crc kubenswrapper[4895]: I1202 09:55:56.427965 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/427eea9a-0bfb-4a1a-a225-c4264018fd13-ceph\") pod \"427eea9a-0bfb-4a1a-a225-c4264018fd13\" (UID: \"427eea9a-0bfb-4a1a-a225-c4264018fd13\") " Dec 02 09:55:56 crc kubenswrapper[4895]: I1202 09:55:56.428554 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/427eea9a-0bfb-4a1a-a225-c4264018fd13-nova-cell1-compute-config-1\") pod \"427eea9a-0bfb-4a1a-a225-c4264018fd13\" (UID: \"427eea9a-0bfb-4a1a-a225-c4264018fd13\") " Dec 02 09:55:56 crc kubenswrapper[4895]: I1202 09:55:56.433599 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/427eea9a-0bfb-4a1a-a225-c4264018fd13-ceph" (OuterVolumeSpecName: "ceph") pod "427eea9a-0bfb-4a1a-a225-c4264018fd13" (UID: "427eea9a-0bfb-4a1a-a225-c4264018fd13"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:55:56 crc kubenswrapper[4895]: I1202 09:55:56.433671 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/427eea9a-0bfb-4a1a-a225-c4264018fd13-nova-cell1-combined-ca-bundle" (OuterVolumeSpecName: "nova-cell1-combined-ca-bundle") pod "427eea9a-0bfb-4a1a-a225-c4264018fd13" (UID: "427eea9a-0bfb-4a1a-a225-c4264018fd13"). InnerVolumeSpecName "nova-cell1-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:55:56 crc kubenswrapper[4895]: I1202 09:55:56.438041 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/427eea9a-0bfb-4a1a-a225-c4264018fd13-kube-api-access-c86pw" (OuterVolumeSpecName: "kube-api-access-c86pw") pod "427eea9a-0bfb-4a1a-a225-c4264018fd13" (UID: "427eea9a-0bfb-4a1a-a225-c4264018fd13"). InnerVolumeSpecName "kube-api-access-c86pw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:55:56 crc kubenswrapper[4895]: I1202 09:55:56.461886 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/427eea9a-0bfb-4a1a-a225-c4264018fd13-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "427eea9a-0bfb-4a1a-a225-c4264018fd13" (UID: "427eea9a-0bfb-4a1a-a225-c4264018fd13"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:55:56 crc kubenswrapper[4895]: I1202 09:55:56.464619 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/427eea9a-0bfb-4a1a-a225-c4264018fd13-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "427eea9a-0bfb-4a1a-a225-c4264018fd13" (UID: "427eea9a-0bfb-4a1a-a225-c4264018fd13"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:55:56 crc kubenswrapper[4895]: I1202 09:55:56.466634 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/427eea9a-0bfb-4a1a-a225-c4264018fd13-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "427eea9a-0bfb-4a1a-a225-c4264018fd13" (UID: "427eea9a-0bfb-4a1a-a225-c4264018fd13"). InnerVolumeSpecName "nova-cell1-compute-config-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:55:56 crc kubenswrapper[4895]: I1202 09:55:56.466679 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/427eea9a-0bfb-4a1a-a225-c4264018fd13-nova-cells-global-config-0" (OuterVolumeSpecName: "nova-cells-global-config-0") pod "427eea9a-0bfb-4a1a-a225-c4264018fd13" (UID: "427eea9a-0bfb-4a1a-a225-c4264018fd13"). InnerVolumeSpecName "nova-cells-global-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:55:56 crc kubenswrapper[4895]: I1202 09:55:56.467612 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/427eea9a-0bfb-4a1a-a225-c4264018fd13-nova-cells-global-config-1" (OuterVolumeSpecName: "nova-cells-global-config-1") pod "427eea9a-0bfb-4a1a-a225-c4264018fd13" (UID: "427eea9a-0bfb-4a1a-a225-c4264018fd13"). InnerVolumeSpecName "nova-cells-global-config-1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:55:56 crc kubenswrapper[4895]: I1202 09:55:56.471851 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/427eea9a-0bfb-4a1a-a225-c4264018fd13-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "427eea9a-0bfb-4a1a-a225-c4264018fd13" (UID: "427eea9a-0bfb-4a1a-a225-c4264018fd13"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:55:56 crc kubenswrapper[4895]: I1202 09:55:56.471877 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/427eea9a-0bfb-4a1a-a225-c4264018fd13-inventory" (OuterVolumeSpecName: "inventory") pod "427eea9a-0bfb-4a1a-a225-c4264018fd13" (UID: "427eea9a-0bfb-4a1a-a225-c4264018fd13"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:55:56 crc kubenswrapper[4895]: I1202 09:55:56.472080 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/427eea9a-0bfb-4a1a-a225-c4264018fd13-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "427eea9a-0bfb-4a1a-a225-c4264018fd13" (UID: "427eea9a-0bfb-4a1a-a225-c4264018fd13"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:55:56 crc kubenswrapper[4895]: I1202 09:55:56.531297 4895 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/427eea9a-0bfb-4a1a-a225-c4264018fd13-nova-cells-global-config-1\") on node \"crc\" DevicePath \"\"" Dec 02 09:55:56 crc kubenswrapper[4895]: I1202 09:55:56.531339 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c86pw\" (UniqueName: \"kubernetes.io/projected/427eea9a-0bfb-4a1a-a225-c4264018fd13-kube-api-access-c86pw\") on node \"crc\" DevicePath \"\"" Dec 02 09:55:56 crc kubenswrapper[4895]: I1202 09:55:56.531352 4895 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/427eea9a-0bfb-4a1a-a225-c4264018fd13-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Dec 02 09:55:56 crc kubenswrapper[4895]: I1202 09:55:56.531367 4895 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/427eea9a-0bfb-4a1a-a225-c4264018fd13-inventory\") on node \"crc\" DevicePath \"\"" Dec 02 09:55:56 crc kubenswrapper[4895]: I1202 09:55:56.531381 4895 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/427eea9a-0bfb-4a1a-a225-c4264018fd13-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Dec 02 09:55:56 crc kubenswrapper[4895]: I1202 09:55:56.531391 4895 reconciler_common.go:293] "Volume detached for volume 
\"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/427eea9a-0bfb-4a1a-a225-c4264018fd13-nova-cells-global-config-0\") on node \"crc\" DevicePath \"\"" Dec 02 09:55:56 crc kubenswrapper[4895]: I1202 09:55:56.531402 4895 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/427eea9a-0bfb-4a1a-a225-c4264018fd13-ceph\") on node \"crc\" DevicePath \"\"" Dec 02 09:55:56 crc kubenswrapper[4895]: I1202 09:55:56.531415 4895 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/427eea9a-0bfb-4a1a-a225-c4264018fd13-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Dec 02 09:55:56 crc kubenswrapper[4895]: I1202 09:55:56.531425 4895 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/427eea9a-0bfb-4a1a-a225-c4264018fd13-nova-cell1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 09:55:56 crc kubenswrapper[4895]: I1202 09:55:56.531436 4895 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/427eea9a-0bfb-4a1a-a225-c4264018fd13-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Dec 02 09:55:56 crc kubenswrapper[4895]: I1202 09:55:56.531447 4895 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/427eea9a-0bfb-4a1a-a225-c4264018fd13-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 02 09:55:56 crc kubenswrapper[4895]: I1202 09:55:56.842427 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgkqcz" event={"ID":"427eea9a-0bfb-4a1a-a225-c4264018fd13","Type":"ContainerDied","Data":"943008a52d482ca4466ba793a4b33e3ecda0fe9003e58305af26af81fb4c1996"} Dec 02 09:55:56 crc kubenswrapper[4895]: I1202 09:55:56.842797 4895 pod_container_deletor.go:80] "Container not 
found in pod's containers" containerID="943008a52d482ca4466ba793a4b33e3ecda0fe9003e58305af26af81fb4c1996" Dec 02 09:55:56 crc kubenswrapper[4895]: I1202 09:55:56.842471 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgkqcz" Dec 02 09:56:03 crc kubenswrapper[4895]: I1202 09:56:03.143840 4895 scope.go:117] "RemoveContainer" containerID="fb24c8085688b9b476b4097e1f5ccaf4a5d4d236157b3b1913f7e381bd3cd823" Dec 02 09:56:03 crc kubenswrapper[4895]: E1202 09:56:03.144538 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:56:18 crc kubenswrapper[4895]: I1202 09:56:18.141728 4895 scope.go:117] "RemoveContainer" containerID="fb24c8085688b9b476b4097e1f5ccaf4a5d4d236157b3b1913f7e381bd3cd823" Dec 02 09:56:18 crc kubenswrapper[4895]: E1202 09:56:18.142596 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:56:32 crc kubenswrapper[4895]: I1202 09:56:32.141949 4895 scope.go:117] "RemoveContainer" containerID="fb24c8085688b9b476b4097e1f5ccaf4a5d4d236157b3b1913f7e381bd3cd823" Dec 02 09:56:32 crc kubenswrapper[4895]: E1202 09:56:32.143826 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:56:40 crc kubenswrapper[4895]: I1202 09:56:40.225391 4895 scope.go:117] "RemoveContainer" containerID="c7b3c40bf91c7e828075dd78d6c11e83c46356928884f7cf52c5b3c7f9d084d9" Dec 02 09:56:40 crc kubenswrapper[4895]: I1202 09:56:40.269388 4895 scope.go:117] "RemoveContainer" containerID="ee34f17d4b351f769693e377522b1329f6ba165438405acd911b575a7859bb2b" Dec 02 09:56:40 crc kubenswrapper[4895]: I1202 09:56:40.311407 4895 scope.go:117] "RemoveContainer" containerID="60955e76b9ae44a2b2af220ca17f0704d258b18bc11cd1bf652a395449af1d35" Dec 02 09:56:46 crc kubenswrapper[4895]: I1202 09:56:46.141309 4895 scope.go:117] "RemoveContainer" containerID="fb24c8085688b9b476b4097e1f5ccaf4a5d4d236157b3b1913f7e381bd3cd823" Dec 02 09:56:46 crc kubenswrapper[4895]: E1202 09:56:46.142139 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:57:00 crc kubenswrapper[4895]: I1202 09:57:00.141601 4895 scope.go:117] "RemoveContainer" containerID="fb24c8085688b9b476b4097e1f5ccaf4a5d4d236157b3b1913f7e381bd3cd823" Dec 02 09:57:00 crc kubenswrapper[4895]: E1202 09:57:00.142376 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:57:13 crc kubenswrapper[4895]: I1202 09:57:13.142359 4895 scope.go:117] "RemoveContainer" containerID="fb24c8085688b9b476b4097e1f5ccaf4a5d4d236157b3b1913f7e381bd3cd823" Dec 02 09:57:13 crc kubenswrapper[4895]: E1202 09:57:13.143158 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:57:25 crc kubenswrapper[4895]: I1202 09:57:25.140713 4895 scope.go:117] "RemoveContainer" containerID="fb24c8085688b9b476b4097e1f5ccaf4a5d4d236157b3b1913f7e381bd3cd823" Dec 02 09:57:25 crc kubenswrapper[4895]: E1202 09:57:25.142671 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:57:40 crc kubenswrapper[4895]: I1202 09:57:40.141603 4895 scope.go:117] "RemoveContainer" containerID="fb24c8085688b9b476b4097e1f5ccaf4a5d4d236157b3b1913f7e381bd3cd823" Dec 02 09:57:40 crc kubenswrapper[4895]: E1202 09:57:40.142458 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:57:54 crc kubenswrapper[4895]: I1202 09:57:54.141136 4895 scope.go:117] "RemoveContainer" containerID="fb24c8085688b9b476b4097e1f5ccaf4a5d4d236157b3b1913f7e381bd3cd823" Dec 02 09:57:54 crc kubenswrapper[4895]: E1202 09:57:54.142125 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:58:08 crc kubenswrapper[4895]: I1202 09:58:08.141370 4895 scope.go:117] "RemoveContainer" containerID="fb24c8085688b9b476b4097e1f5ccaf4a5d4d236157b3b1913f7e381bd3cd823" Dec 02 09:58:08 crc kubenswrapper[4895]: E1202 09:58:08.142132 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:58:21 crc kubenswrapper[4895]: I1202 09:58:21.141860 4895 scope.go:117] "RemoveContainer" containerID="fb24c8085688b9b476b4097e1f5ccaf4a5d4d236157b3b1913f7e381bd3cd823" Dec 02 09:58:21 crc kubenswrapper[4895]: E1202 09:58:21.143158 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:58:23 crc kubenswrapper[4895]: I1202 09:58:23.366224 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-copy-data"] Dec 02 09:58:23 crc kubenswrapper[4895]: I1202 09:58:23.367215 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/mariadb-copy-data" podUID="59279a72-fa91-4e40-ac44-f52fa931e496" containerName="adoption" containerID="cri-o://582d8667705abd9bc581afdc5c507c8174f696d78d92df0367b69d8228c50e9a" gracePeriod=30 Dec 02 09:58:32 crc kubenswrapper[4895]: I1202 09:58:32.140807 4895 scope.go:117] "RemoveContainer" containerID="fb24c8085688b9b476b4097e1f5ccaf4a5d4d236157b3b1913f7e381bd3cd823" Dec 02 09:58:32 crc kubenswrapper[4895]: E1202 09:58:32.141679 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:58:45 crc kubenswrapper[4895]: I1202 09:58:45.141448 4895 scope.go:117] "RemoveContainer" containerID="fb24c8085688b9b476b4097e1f5ccaf4a5d4d236157b3b1913f7e381bd3cd823" Dec 02 09:58:45 crc kubenswrapper[4895]: E1202 09:58:45.142728 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:58:53 crc kubenswrapper[4895]: I1202 09:58:53.881365 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-copy-data" Dec 02 09:58:53 crc kubenswrapper[4895]: I1202 09:58:53.923135 4895 generic.go:334] "Generic (PLEG): container finished" podID="59279a72-fa91-4e40-ac44-f52fa931e496" containerID="582d8667705abd9bc581afdc5c507c8174f696d78d92df0367b69d8228c50e9a" exitCode=137 Dec 02 09:58:53 crc kubenswrapper[4895]: I1202 09:58:53.923185 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-copy-data" Dec 02 09:58:53 crc kubenswrapper[4895]: I1202 09:58:53.923186 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"59279a72-fa91-4e40-ac44-f52fa931e496","Type":"ContainerDied","Data":"582d8667705abd9bc581afdc5c507c8174f696d78d92df0367b69d8228c50e9a"} Dec 02 09:58:53 crc kubenswrapper[4895]: I1202 09:58:53.923301 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"59279a72-fa91-4e40-ac44-f52fa931e496","Type":"ContainerDied","Data":"aa26abe2c5c22a9e93dd5a9fdf7dd3e27580fdab941ee210dd81f22d731378f0"} Dec 02 09:58:53 crc kubenswrapper[4895]: I1202 09:58:53.923331 4895 scope.go:117] "RemoveContainer" containerID="582d8667705abd9bc581afdc5c507c8174f696d78d92df0367b69d8228c50e9a" Dec 02 09:58:53 crc kubenswrapper[4895]: I1202 09:58:53.947439 4895 scope.go:117] "RemoveContainer" containerID="582d8667705abd9bc581afdc5c507c8174f696d78d92df0367b69d8228c50e9a" Dec 02 09:58:53 crc kubenswrapper[4895]: E1202 09:58:53.948413 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"582d8667705abd9bc581afdc5c507c8174f696d78d92df0367b69d8228c50e9a\": container with ID starting with 582d8667705abd9bc581afdc5c507c8174f696d78d92df0367b69d8228c50e9a not found: ID does not exist" containerID="582d8667705abd9bc581afdc5c507c8174f696d78d92df0367b69d8228c50e9a" Dec 02 09:58:53 crc kubenswrapper[4895]: I1202 09:58:53.948462 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"582d8667705abd9bc581afdc5c507c8174f696d78d92df0367b69d8228c50e9a"} err="failed to get container status \"582d8667705abd9bc581afdc5c507c8174f696d78d92df0367b69d8228c50e9a\": rpc error: code = NotFound desc = could not find container \"582d8667705abd9bc581afdc5c507c8174f696d78d92df0367b69d8228c50e9a\": container with ID starting with 582d8667705abd9bc581afdc5c507c8174f696d78d92df0367b69d8228c50e9a not found: ID does not exist" Dec 02 09:58:54 crc kubenswrapper[4895]: I1202 09:58:54.058838 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mariadb-data\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-50e93f03-cf63-4f19-9375-5c838d5fbb9a\") pod \"59279a72-fa91-4e40-ac44-f52fa931e496\" (UID: \"59279a72-fa91-4e40-ac44-f52fa931e496\") " Dec 02 09:58:54 crc kubenswrapper[4895]: I1202 09:58:54.059026 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qjg7c\" (UniqueName: \"kubernetes.io/projected/59279a72-fa91-4e40-ac44-f52fa931e496-kube-api-access-qjg7c\") pod \"59279a72-fa91-4e40-ac44-f52fa931e496\" (UID: \"59279a72-fa91-4e40-ac44-f52fa931e496\") " Dec 02 09:58:54 crc kubenswrapper[4895]: I1202 09:58:54.067358 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59279a72-fa91-4e40-ac44-f52fa931e496-kube-api-access-qjg7c" (OuterVolumeSpecName: "kube-api-access-qjg7c") pod "59279a72-fa91-4e40-ac44-f52fa931e496" (UID: "59279a72-fa91-4e40-ac44-f52fa931e496"). 
InnerVolumeSpecName "kube-api-access-qjg7c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:58:54 crc kubenswrapper[4895]: I1202 09:58:54.083338 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-50e93f03-cf63-4f19-9375-5c838d5fbb9a" (OuterVolumeSpecName: "mariadb-data") pod "59279a72-fa91-4e40-ac44-f52fa931e496" (UID: "59279a72-fa91-4e40-ac44-f52fa931e496"). InnerVolumeSpecName "pvc-50e93f03-cf63-4f19-9375-5c838d5fbb9a". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 02 09:58:54 crc kubenswrapper[4895]: I1202 09:58:54.161599 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qjg7c\" (UniqueName: \"kubernetes.io/projected/59279a72-fa91-4e40-ac44-f52fa931e496-kube-api-access-qjg7c\") on node \"crc\" DevicePath \"\"" Dec 02 09:58:54 crc kubenswrapper[4895]: I1202 09:58:54.161862 4895 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-50e93f03-cf63-4f19-9375-5c838d5fbb9a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-50e93f03-cf63-4f19-9375-5c838d5fbb9a\") on node \"crc\" " Dec 02 09:58:54 crc kubenswrapper[4895]: I1202 09:58:54.192839 4895 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Dec 02 09:58:54 crc kubenswrapper[4895]: I1202 09:58:54.193007 4895 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-50e93f03-cf63-4f19-9375-5c838d5fbb9a" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-50e93f03-cf63-4f19-9375-5c838d5fbb9a") on node "crc" Dec 02 09:58:54 crc kubenswrapper[4895]: I1202 09:58:54.264639 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-copy-data"] Dec 02 09:58:54 crc kubenswrapper[4895]: I1202 09:58:54.265379 4895 reconciler_common.go:293] "Volume detached for volume \"pvc-50e93f03-cf63-4f19-9375-5c838d5fbb9a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-50e93f03-cf63-4f19-9375-5c838d5fbb9a\") on node \"crc\" DevicePath \"\"" Dec 02 09:58:54 crc kubenswrapper[4895]: I1202 09:58:54.278289 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-copy-data"] Dec 02 09:58:54 crc kubenswrapper[4895]: I1202 09:58:54.922889 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-copy-data"] Dec 02 09:58:54 crc kubenswrapper[4895]: I1202 09:58:54.923406 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-copy-data" podUID="1b36ab26-6b37-4af5-bafe-35ef3c888ea9" containerName="adoption" containerID="cri-o://922ca77dd580b2b692105a955f4ea6aec16dfc8119cb2bf53760f1b64a2e2119" gracePeriod=30 Dec 02 09:58:55 crc kubenswrapper[4895]: I1202 09:58:55.152383 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59279a72-fa91-4e40-ac44-f52fa931e496" path="/var/lib/kubelet/pods/59279a72-fa91-4e40-ac44-f52fa931e496/volumes" Dec 02 09:58:57 crc kubenswrapper[4895]: I1202 09:58:57.144801 4895 scope.go:117] "RemoveContainer" containerID="fb24c8085688b9b476b4097e1f5ccaf4a5d4d236157b3b1913f7e381bd3cd823" Dec 02 09:58:57 crc kubenswrapper[4895]: E1202 09:58:57.145671 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:59:11 crc kubenswrapper[4895]: I1202 09:59:11.141442 4895 scope.go:117] "RemoveContainer" containerID="fb24c8085688b9b476b4097e1f5ccaf4a5d4d236157b3b1913f7e381bd3cd823" Dec 02 09:59:11 crc kubenswrapper[4895]: E1202 09:59:11.142285 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:59:23 crc kubenswrapper[4895]: I1202 09:59:23.142001 4895 scope.go:117] "RemoveContainer" containerID="fb24c8085688b9b476b4097e1f5ccaf4a5d4d236157b3b1913f7e381bd3cd823" Dec 02 09:59:23 crc kubenswrapper[4895]: E1202 09:59:23.142600 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:59:25 crc kubenswrapper[4895]: I1202 09:59:25.259779 4895 generic.go:334] "Generic (PLEG): container finished" podID="1b36ab26-6b37-4af5-bafe-35ef3c888ea9" containerID="922ca77dd580b2b692105a955f4ea6aec16dfc8119cb2bf53760f1b64a2e2119" exitCode=137 Dec 02 09:59:25 crc kubenswrapper[4895]: I1202 
09:59:25.259906 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"1b36ab26-6b37-4af5-bafe-35ef3c888ea9","Type":"ContainerDied","Data":"922ca77dd580b2b692105a955f4ea6aec16dfc8119cb2bf53760f1b64a2e2119"} Dec 02 09:59:25 crc kubenswrapper[4895]: I1202 09:59:25.971633 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-copy-data" Dec 02 09:59:26 crc kubenswrapper[4895]: I1202 09:59:26.090377 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/1b36ab26-6b37-4af5-bafe-35ef3c888ea9-ovn-data-cert\") pod \"1b36ab26-6b37-4af5-bafe-35ef3c888ea9\" (UID: \"1b36ab26-6b37-4af5-bafe-35ef3c888ea9\") " Dec 02 09:59:26 crc kubenswrapper[4895]: I1202 09:59:26.090591 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xkplx\" (UniqueName: \"kubernetes.io/projected/1b36ab26-6b37-4af5-bafe-35ef3c888ea9-kube-api-access-xkplx\") pod \"1b36ab26-6b37-4af5-bafe-35ef3c888ea9\" (UID: \"1b36ab26-6b37-4af5-bafe-35ef3c888ea9\") " Dec 02 09:59:26 crc kubenswrapper[4895]: I1202 09:59:26.091487 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-data\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-95fa9521-57ce-4e7f-b49d-775cd966ae37\") pod \"1b36ab26-6b37-4af5-bafe-35ef3c888ea9\" (UID: \"1b36ab26-6b37-4af5-bafe-35ef3c888ea9\") " Dec 02 09:59:26 crc kubenswrapper[4895]: I1202 09:59:26.147454 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b36ab26-6b37-4af5-bafe-35ef3c888ea9-ovn-data-cert" (OuterVolumeSpecName: "ovn-data-cert") pod "1b36ab26-6b37-4af5-bafe-35ef3c888ea9" (UID: "1b36ab26-6b37-4af5-bafe-35ef3c888ea9"). InnerVolumeSpecName "ovn-data-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:59:26 crc kubenswrapper[4895]: I1202 09:59:26.147620 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b36ab26-6b37-4af5-bafe-35ef3c888ea9-kube-api-access-xkplx" (OuterVolumeSpecName: "kube-api-access-xkplx") pod "1b36ab26-6b37-4af5-bafe-35ef3c888ea9" (UID: "1b36ab26-6b37-4af5-bafe-35ef3c888ea9"). InnerVolumeSpecName "kube-api-access-xkplx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:59:26 crc kubenswrapper[4895]: I1202 09:59:26.148114 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-95fa9521-57ce-4e7f-b49d-775cd966ae37" (OuterVolumeSpecName: "ovn-data") pod "1b36ab26-6b37-4af5-bafe-35ef3c888ea9" (UID: "1b36ab26-6b37-4af5-bafe-35ef3c888ea9"). InnerVolumeSpecName "pvc-95fa9521-57ce-4e7f-b49d-775cd966ae37". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 02 09:59:26 crc kubenswrapper[4895]: I1202 09:59:26.194704 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xkplx\" (UniqueName: \"kubernetes.io/projected/1b36ab26-6b37-4af5-bafe-35ef3c888ea9-kube-api-access-xkplx\") on node \"crc\" DevicePath \"\"" Dec 02 09:59:26 crc kubenswrapper[4895]: I1202 09:59:26.194822 4895 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-95fa9521-57ce-4e7f-b49d-775cd966ae37\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-95fa9521-57ce-4e7f-b49d-775cd966ae37\") on node \"crc\" " Dec 02 09:59:26 crc kubenswrapper[4895]: I1202 09:59:26.195022 4895 reconciler_common.go:293] "Volume detached for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/1b36ab26-6b37-4af5-bafe-35ef3c888ea9-ovn-data-cert\") on node \"crc\" DevicePath \"\"" Dec 02 09:59:26 crc kubenswrapper[4895]: I1202 09:59:26.225124 4895 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice 
STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Dec 02 09:59:26 crc kubenswrapper[4895]: I1202 09:59:26.225305 4895 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-95fa9521-57ce-4e7f-b49d-775cd966ae37" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-95fa9521-57ce-4e7f-b49d-775cd966ae37") on node "crc" Dec 02 09:59:26 crc kubenswrapper[4895]: I1202 09:59:26.281814 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"1b36ab26-6b37-4af5-bafe-35ef3c888ea9","Type":"ContainerDied","Data":"554d77df12aa559d3405ab66ffef7756667388cf42d2939f9d381efac894f53c"} Dec 02 09:59:26 crc kubenswrapper[4895]: I1202 09:59:26.281881 4895 scope.go:117] "RemoveContainer" containerID="922ca77dd580b2b692105a955f4ea6aec16dfc8119cb2bf53760f1b64a2e2119" Dec 02 09:59:26 crc kubenswrapper[4895]: I1202 09:59:26.281891 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-copy-data" Dec 02 09:59:26 crc kubenswrapper[4895]: I1202 09:59:26.296705 4895 reconciler_common.go:293] "Volume detached for volume \"pvc-95fa9521-57ce-4e7f-b49d-775cd966ae37\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-95fa9521-57ce-4e7f-b49d-775cd966ae37\") on node \"crc\" DevicePath \"\"" Dec 02 09:59:26 crc kubenswrapper[4895]: I1202 09:59:26.355050 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-copy-data"] Dec 02 09:59:26 crc kubenswrapper[4895]: I1202 09:59:26.365099 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-copy-data"] Dec 02 09:59:27 crc kubenswrapper[4895]: I1202 09:59:27.155824 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b36ab26-6b37-4af5-bafe-35ef3c888ea9" path="/var/lib/kubelet/pods/1b36ab26-6b37-4af5-bafe-35ef3c888ea9/volumes" Dec 02 09:59:38 crc kubenswrapper[4895]: I1202 09:59:38.141215 4895 scope.go:117] "RemoveContainer" 
containerID="fb24c8085688b9b476b4097e1f5ccaf4a5d4d236157b3b1913f7e381bd3cd823" Dec 02 09:59:38 crc kubenswrapper[4895]: E1202 09:59:38.142032 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 09:59:49 crc kubenswrapper[4895]: I1202 09:59:49.141815 4895 scope.go:117] "RemoveContainer" containerID="fb24c8085688b9b476b4097e1f5ccaf4a5d4d236157b3b1913f7e381bd3cd823" Dec 02 09:59:49 crc kubenswrapper[4895]: E1202 09:59:49.142822 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 10:00:00 crc kubenswrapper[4895]: I1202 10:00:00.148444 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411160-gcflg"] Dec 02 10:00:00 crc kubenswrapper[4895]: E1202 10:00:00.149594 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bc79272-641a-44ab-b45c-d459fbdb4f81" containerName="extract-content" Dec 02 10:00:00 crc kubenswrapper[4895]: I1202 10:00:00.149611 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bc79272-641a-44ab-b45c-d459fbdb4f81" containerName="extract-content" Dec 02 10:00:00 crc kubenswrapper[4895]: E1202 10:00:00.149632 4895 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="2bc79272-641a-44ab-b45c-d459fbdb4f81" containerName="registry-server" Dec 02 10:00:00 crc kubenswrapper[4895]: I1202 10:00:00.149638 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bc79272-641a-44ab-b45c-d459fbdb4f81" containerName="registry-server" Dec 02 10:00:00 crc kubenswrapper[4895]: E1202 10:00:00.149653 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bc79272-641a-44ab-b45c-d459fbdb4f81" containerName="extract-utilities" Dec 02 10:00:00 crc kubenswrapper[4895]: I1202 10:00:00.149659 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bc79272-641a-44ab-b45c-d459fbdb4f81" containerName="extract-utilities" Dec 02 10:00:00 crc kubenswrapper[4895]: E1202 10:00:00.149684 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b36ab26-6b37-4af5-bafe-35ef3c888ea9" containerName="adoption" Dec 02 10:00:00 crc kubenswrapper[4895]: I1202 10:00:00.149690 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b36ab26-6b37-4af5-bafe-35ef3c888ea9" containerName="adoption" Dec 02 10:00:00 crc kubenswrapper[4895]: E1202 10:00:00.149701 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="427eea9a-0bfb-4a1a-a225-c4264018fd13" containerName="nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1" Dec 02 10:00:00 crc kubenswrapper[4895]: I1202 10:00:00.149709 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="427eea9a-0bfb-4a1a-a225-c4264018fd13" containerName="nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1" Dec 02 10:00:00 crc kubenswrapper[4895]: E1202 10:00:00.149723 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59279a72-fa91-4e40-ac44-f52fa931e496" containerName="adoption" Dec 02 10:00:00 crc kubenswrapper[4895]: I1202 10:00:00.149729 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="59279a72-fa91-4e40-ac44-f52fa931e496" containerName="adoption" Dec 02 10:00:00 crc kubenswrapper[4895]: I1202 10:00:00.149957 4895 
memory_manager.go:354] "RemoveStaleState removing state" podUID="1b36ab26-6b37-4af5-bafe-35ef3c888ea9" containerName="adoption" Dec 02 10:00:00 crc kubenswrapper[4895]: I1202 10:00:00.149974 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="2bc79272-641a-44ab-b45c-d459fbdb4f81" containerName="registry-server" Dec 02 10:00:00 crc kubenswrapper[4895]: I1202 10:00:00.150006 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="59279a72-fa91-4e40-ac44-f52fa931e496" containerName="adoption" Dec 02 10:00:00 crc kubenswrapper[4895]: I1202 10:00:00.150033 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="427eea9a-0bfb-4a1a-a225-c4264018fd13" containerName="nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1" Dec 02 10:00:00 crc kubenswrapper[4895]: I1202 10:00:00.151109 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411160-gcflg" Dec 02 10:00:00 crc kubenswrapper[4895]: I1202 10:00:00.154168 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 02 10:00:00 crc kubenswrapper[4895]: I1202 10:00:00.155014 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 02 10:00:00 crc kubenswrapper[4895]: I1202 10:00:00.164111 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411160-gcflg"] Dec 02 10:00:00 crc kubenswrapper[4895]: I1202 10:00:00.271859 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6d4d9cba-649f-40cb-a92e-9426921f005a-config-volume\") pod \"collect-profiles-29411160-gcflg\" (UID: \"6d4d9cba-649f-40cb-a92e-9426921f005a\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29411160-gcflg" Dec 02 10:00:00 crc kubenswrapper[4895]: I1202 10:00:00.272083 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6d4d9cba-649f-40cb-a92e-9426921f005a-secret-volume\") pod \"collect-profiles-29411160-gcflg\" (UID: \"6d4d9cba-649f-40cb-a92e-9426921f005a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411160-gcflg" Dec 02 10:00:00 crc kubenswrapper[4895]: I1202 10:00:00.272243 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcmzl\" (UniqueName: \"kubernetes.io/projected/6d4d9cba-649f-40cb-a92e-9426921f005a-kube-api-access-xcmzl\") pod \"collect-profiles-29411160-gcflg\" (UID: \"6d4d9cba-649f-40cb-a92e-9426921f005a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411160-gcflg" Dec 02 10:00:00 crc kubenswrapper[4895]: I1202 10:00:00.374629 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6d4d9cba-649f-40cb-a92e-9426921f005a-config-volume\") pod \"collect-profiles-29411160-gcflg\" (UID: \"6d4d9cba-649f-40cb-a92e-9426921f005a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411160-gcflg" Dec 02 10:00:00 crc kubenswrapper[4895]: I1202 10:00:00.374701 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6d4d9cba-649f-40cb-a92e-9426921f005a-secret-volume\") pod \"collect-profiles-29411160-gcflg\" (UID: \"6d4d9cba-649f-40cb-a92e-9426921f005a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411160-gcflg" Dec 02 10:00:00 crc kubenswrapper[4895]: I1202 10:00:00.374732 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xcmzl\" (UniqueName: 
\"kubernetes.io/projected/6d4d9cba-649f-40cb-a92e-9426921f005a-kube-api-access-xcmzl\") pod \"collect-profiles-29411160-gcflg\" (UID: \"6d4d9cba-649f-40cb-a92e-9426921f005a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411160-gcflg" Dec 02 10:00:00 crc kubenswrapper[4895]: I1202 10:00:00.375722 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6d4d9cba-649f-40cb-a92e-9426921f005a-config-volume\") pod \"collect-profiles-29411160-gcflg\" (UID: \"6d4d9cba-649f-40cb-a92e-9426921f005a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411160-gcflg" Dec 02 10:00:00 crc kubenswrapper[4895]: I1202 10:00:00.381463 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6d4d9cba-649f-40cb-a92e-9426921f005a-secret-volume\") pod \"collect-profiles-29411160-gcflg\" (UID: \"6d4d9cba-649f-40cb-a92e-9426921f005a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411160-gcflg" Dec 02 10:00:00 crc kubenswrapper[4895]: I1202 10:00:00.395795 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcmzl\" (UniqueName: \"kubernetes.io/projected/6d4d9cba-649f-40cb-a92e-9426921f005a-kube-api-access-xcmzl\") pod \"collect-profiles-29411160-gcflg\" (UID: \"6d4d9cba-649f-40cb-a92e-9426921f005a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411160-gcflg" Dec 02 10:00:00 crc kubenswrapper[4895]: I1202 10:00:00.481566 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411160-gcflg" Dec 02 10:00:01 crc kubenswrapper[4895]: I1202 10:00:01.009814 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411160-gcflg"] Dec 02 10:00:01 crc kubenswrapper[4895]: I1202 10:00:01.658340 4895 generic.go:334] "Generic (PLEG): container finished" podID="6d4d9cba-649f-40cb-a92e-9426921f005a" containerID="6118cbf0533dce6df1d94112ad565320fe99aa8b75b8e0358789a062550599f4" exitCode=0 Dec 02 10:00:01 crc kubenswrapper[4895]: I1202 10:00:01.658389 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411160-gcflg" event={"ID":"6d4d9cba-649f-40cb-a92e-9426921f005a","Type":"ContainerDied","Data":"6118cbf0533dce6df1d94112ad565320fe99aa8b75b8e0358789a062550599f4"} Dec 02 10:00:01 crc kubenswrapper[4895]: I1202 10:00:01.658654 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411160-gcflg" event={"ID":"6d4d9cba-649f-40cb-a92e-9426921f005a","Type":"ContainerStarted","Data":"c3a085ed464fa2efa5ffdc1df5f65df5230246961ea4f3416ca9b470cbab8580"} Dec 02 10:00:03 crc kubenswrapper[4895]: I1202 10:00:03.063122 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411160-gcflg" Dec 02 10:00:03 crc kubenswrapper[4895]: I1202 10:00:03.141591 4895 scope.go:117] "RemoveContainer" containerID="fb24c8085688b9b476b4097e1f5ccaf4a5d4d236157b3b1913f7e381bd3cd823" Dec 02 10:00:03 crc kubenswrapper[4895]: E1202 10:00:03.142129 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 10:00:03 crc kubenswrapper[4895]: I1202 10:00:03.238260 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcmzl\" (UniqueName: \"kubernetes.io/projected/6d4d9cba-649f-40cb-a92e-9426921f005a-kube-api-access-xcmzl\") pod \"6d4d9cba-649f-40cb-a92e-9426921f005a\" (UID: \"6d4d9cba-649f-40cb-a92e-9426921f005a\") " Dec 02 10:00:03 crc kubenswrapper[4895]: I1202 10:00:03.238407 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6d4d9cba-649f-40cb-a92e-9426921f005a-secret-volume\") pod \"6d4d9cba-649f-40cb-a92e-9426921f005a\" (UID: \"6d4d9cba-649f-40cb-a92e-9426921f005a\") " Dec 02 10:00:03 crc kubenswrapper[4895]: I1202 10:00:03.239369 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6d4d9cba-649f-40cb-a92e-9426921f005a-config-volume\") pod \"6d4d9cba-649f-40cb-a92e-9426921f005a\" (UID: \"6d4d9cba-649f-40cb-a92e-9426921f005a\") " Dec 02 10:00:03 crc kubenswrapper[4895]: I1202 10:00:03.239843 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/6d4d9cba-649f-40cb-a92e-9426921f005a-config-volume" (OuterVolumeSpecName: "config-volume") pod "6d4d9cba-649f-40cb-a92e-9426921f005a" (UID: "6d4d9cba-649f-40cb-a92e-9426921f005a"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 10:00:03 crc kubenswrapper[4895]: I1202 10:00:03.240268 4895 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6d4d9cba-649f-40cb-a92e-9426921f005a-config-volume\") on node \"crc\" DevicePath \"\"" Dec 02 10:00:03 crc kubenswrapper[4895]: I1202 10:00:03.244900 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d4d9cba-649f-40cb-a92e-9426921f005a-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "6d4d9cba-649f-40cb-a92e-9426921f005a" (UID: "6d4d9cba-649f-40cb-a92e-9426921f005a"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 10:00:03 crc kubenswrapper[4895]: I1202 10:00:03.244992 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d4d9cba-649f-40cb-a92e-9426921f005a-kube-api-access-xcmzl" (OuterVolumeSpecName: "kube-api-access-xcmzl") pod "6d4d9cba-649f-40cb-a92e-9426921f005a" (UID: "6d4d9cba-649f-40cb-a92e-9426921f005a"). InnerVolumeSpecName "kube-api-access-xcmzl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 10:00:03 crc kubenswrapper[4895]: I1202 10:00:03.342003 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcmzl\" (UniqueName: \"kubernetes.io/projected/6d4d9cba-649f-40cb-a92e-9426921f005a-kube-api-access-xcmzl\") on node \"crc\" DevicePath \"\"" Dec 02 10:00:03 crc kubenswrapper[4895]: I1202 10:00:03.342042 4895 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6d4d9cba-649f-40cb-a92e-9426921f005a-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 02 10:00:03 crc kubenswrapper[4895]: I1202 10:00:03.700315 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411160-gcflg" event={"ID":"6d4d9cba-649f-40cb-a92e-9426921f005a","Type":"ContainerDied","Data":"c3a085ed464fa2efa5ffdc1df5f65df5230246961ea4f3416ca9b470cbab8580"} Dec 02 10:00:03 crc kubenswrapper[4895]: I1202 10:00:03.700800 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c3a085ed464fa2efa5ffdc1df5f65df5230246961ea4f3416ca9b470cbab8580" Dec 02 10:00:03 crc kubenswrapper[4895]: I1202 10:00:03.700401 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411160-gcflg" Dec 02 10:00:04 crc kubenswrapper[4895]: I1202 10:00:04.141611 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411115-v7l6n"] Dec 02 10:00:04 crc kubenswrapper[4895]: I1202 10:00:04.153731 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411115-v7l6n"] Dec 02 10:00:05 crc kubenswrapper[4895]: I1202 10:00:05.154775 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="877abeb1-302a-42d0-8ba3-05353302b9fd" path="/var/lib/kubelet/pods/877abeb1-302a-42d0-8ba3-05353302b9fd/volumes" Dec 02 10:00:16 crc kubenswrapper[4895]: I1202 10:00:16.298630 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-tsrj4"] Dec 02 10:00:16 crc kubenswrapper[4895]: E1202 10:00:16.299650 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d4d9cba-649f-40cb-a92e-9426921f005a" containerName="collect-profiles" Dec 02 10:00:16 crc kubenswrapper[4895]: I1202 10:00:16.299663 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d4d9cba-649f-40cb-a92e-9426921f005a" containerName="collect-profiles" Dec 02 10:00:16 crc kubenswrapper[4895]: I1202 10:00:16.299919 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d4d9cba-649f-40cb-a92e-9426921f005a" containerName="collect-profiles" Dec 02 10:00:16 crc kubenswrapper[4895]: I1202 10:00:16.301479 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-tsrj4" Dec 02 10:00:16 crc kubenswrapper[4895]: I1202 10:00:16.334046 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tsrj4"] Dec 02 10:00:16 crc kubenswrapper[4895]: I1202 10:00:16.487897 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c46502aa-0af2-4e3f-9cf9-b4f79c94e7a0-utilities\") pod \"certified-operators-tsrj4\" (UID: \"c46502aa-0af2-4e3f-9cf9-b4f79c94e7a0\") " pod="openshift-marketplace/certified-operators-tsrj4" Dec 02 10:00:16 crc kubenswrapper[4895]: I1202 10:00:16.488094 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c46502aa-0af2-4e3f-9cf9-b4f79c94e7a0-catalog-content\") pod \"certified-operators-tsrj4\" (UID: \"c46502aa-0af2-4e3f-9cf9-b4f79c94e7a0\") " pod="openshift-marketplace/certified-operators-tsrj4" Dec 02 10:00:16 crc kubenswrapper[4895]: I1202 10:00:16.488330 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5q9wk\" (UniqueName: \"kubernetes.io/projected/c46502aa-0af2-4e3f-9cf9-b4f79c94e7a0-kube-api-access-5q9wk\") pod \"certified-operators-tsrj4\" (UID: \"c46502aa-0af2-4e3f-9cf9-b4f79c94e7a0\") " pod="openshift-marketplace/certified-operators-tsrj4" Dec 02 10:00:16 crc kubenswrapper[4895]: I1202 10:00:16.590544 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5q9wk\" (UniqueName: \"kubernetes.io/projected/c46502aa-0af2-4e3f-9cf9-b4f79c94e7a0-kube-api-access-5q9wk\") pod \"certified-operators-tsrj4\" (UID: \"c46502aa-0af2-4e3f-9cf9-b4f79c94e7a0\") " pod="openshift-marketplace/certified-operators-tsrj4" Dec 02 10:00:16 crc kubenswrapper[4895]: I1202 10:00:16.590658 4895 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c46502aa-0af2-4e3f-9cf9-b4f79c94e7a0-utilities\") pod \"certified-operators-tsrj4\" (UID: \"c46502aa-0af2-4e3f-9cf9-b4f79c94e7a0\") " pod="openshift-marketplace/certified-operators-tsrj4" Dec 02 10:00:16 crc kubenswrapper[4895]: I1202 10:00:16.590780 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c46502aa-0af2-4e3f-9cf9-b4f79c94e7a0-catalog-content\") pod \"certified-operators-tsrj4\" (UID: \"c46502aa-0af2-4e3f-9cf9-b4f79c94e7a0\") " pod="openshift-marketplace/certified-operators-tsrj4" Dec 02 10:00:16 crc kubenswrapper[4895]: I1202 10:00:16.591430 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c46502aa-0af2-4e3f-9cf9-b4f79c94e7a0-catalog-content\") pod \"certified-operators-tsrj4\" (UID: \"c46502aa-0af2-4e3f-9cf9-b4f79c94e7a0\") " pod="openshift-marketplace/certified-operators-tsrj4" Dec 02 10:00:16 crc kubenswrapper[4895]: I1202 10:00:16.591434 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c46502aa-0af2-4e3f-9cf9-b4f79c94e7a0-utilities\") pod \"certified-operators-tsrj4\" (UID: \"c46502aa-0af2-4e3f-9cf9-b4f79c94e7a0\") " pod="openshift-marketplace/certified-operators-tsrj4" Dec 02 10:00:16 crc kubenswrapper[4895]: I1202 10:00:16.610758 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5q9wk\" (UniqueName: \"kubernetes.io/projected/c46502aa-0af2-4e3f-9cf9-b4f79c94e7a0-kube-api-access-5q9wk\") pod \"certified-operators-tsrj4\" (UID: \"c46502aa-0af2-4e3f-9cf9-b4f79c94e7a0\") " pod="openshift-marketplace/certified-operators-tsrj4" Dec 02 10:00:16 crc kubenswrapper[4895]: I1202 10:00:16.626008 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-tsrj4" Dec 02 10:00:17 crc kubenswrapper[4895]: I1202 10:00:17.254363 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tsrj4"] Dec 02 10:00:17 crc kubenswrapper[4895]: I1202 10:00:17.844140 4895 generic.go:334] "Generic (PLEG): container finished" podID="c46502aa-0af2-4e3f-9cf9-b4f79c94e7a0" containerID="346094e45127ce8dec88c2d13a4c014bb2b12f7f816584c2c92cf0645021c0c8" exitCode=0 Dec 02 10:00:17 crc kubenswrapper[4895]: I1202 10:00:17.844251 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tsrj4" event={"ID":"c46502aa-0af2-4e3f-9cf9-b4f79c94e7a0","Type":"ContainerDied","Data":"346094e45127ce8dec88c2d13a4c014bb2b12f7f816584c2c92cf0645021c0c8"} Dec 02 10:00:17 crc kubenswrapper[4895]: I1202 10:00:17.844478 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tsrj4" event={"ID":"c46502aa-0af2-4e3f-9cf9-b4f79c94e7a0","Type":"ContainerStarted","Data":"571569857ddf9cea2832751c47c14e28d9b129999a41606284337476aa39d9c1"} Dec 02 10:00:17 crc kubenswrapper[4895]: I1202 10:00:17.848213 4895 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 02 10:00:18 crc kubenswrapper[4895]: I1202 10:00:18.141863 4895 scope.go:117] "RemoveContainer" containerID="fb24c8085688b9b476b4097e1f5ccaf4a5d4d236157b3b1913f7e381bd3cd823" Dec 02 10:00:18 crc kubenswrapper[4895]: E1202 10:00:18.142149 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 
10:00:19 crc kubenswrapper[4895]: I1202 10:00:19.867255 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tsrj4" event={"ID":"c46502aa-0af2-4e3f-9cf9-b4f79c94e7a0","Type":"ContainerStarted","Data":"b9753336a05c3416eb210fc6a54988c6ab1972f278539b8a22a8663ba6085a33"} Dec 02 10:00:20 crc kubenswrapper[4895]: I1202 10:00:20.879929 4895 generic.go:334] "Generic (PLEG): container finished" podID="c46502aa-0af2-4e3f-9cf9-b4f79c94e7a0" containerID="b9753336a05c3416eb210fc6a54988c6ab1972f278539b8a22a8663ba6085a33" exitCode=0 Dec 02 10:00:20 crc kubenswrapper[4895]: I1202 10:00:20.880089 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tsrj4" event={"ID":"c46502aa-0af2-4e3f-9cf9-b4f79c94e7a0","Type":"ContainerDied","Data":"b9753336a05c3416eb210fc6a54988c6ab1972f278539b8a22a8663ba6085a33"} Dec 02 10:00:21 crc kubenswrapper[4895]: I1202 10:00:21.896035 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tsrj4" event={"ID":"c46502aa-0af2-4e3f-9cf9-b4f79c94e7a0","Type":"ContainerStarted","Data":"4890500419ada747800d334659adc53185225214307643fd9df915803c51b365"} Dec 02 10:00:21 crc kubenswrapper[4895]: I1202 10:00:21.928616 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-tsrj4" podStartSLOduration=2.119015379 podStartE2EDuration="5.928593386s" podCreationTimestamp="2025-12-02 10:00:16 +0000 UTC" firstStartedPulling="2025-12-02 10:00:17.847845983 +0000 UTC m=+9429.018705596" lastFinishedPulling="2025-12-02 10:00:21.65742399 +0000 UTC m=+9432.828283603" observedRunningTime="2025-12-02 10:00:21.916318703 +0000 UTC m=+9433.087178326" watchObservedRunningTime="2025-12-02 10:00:21.928593386 +0000 UTC m=+9433.099452999" Dec 02 10:00:26 crc kubenswrapper[4895]: I1202 10:00:26.626557 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/certified-operators-tsrj4" Dec 02 10:00:26 crc kubenswrapper[4895]: I1202 10:00:26.627144 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-tsrj4" Dec 02 10:00:26 crc kubenswrapper[4895]: I1202 10:00:26.681310 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-tsrj4" Dec 02 10:00:27 crc kubenswrapper[4895]: I1202 10:00:27.005235 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-tsrj4" Dec 02 10:00:27 crc kubenswrapper[4895]: I1202 10:00:27.062060 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tsrj4"] Dec 02 10:00:28 crc kubenswrapper[4895]: I1202 10:00:28.970705 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-tsrj4" podUID="c46502aa-0af2-4e3f-9cf9-b4f79c94e7a0" containerName="registry-server" containerID="cri-o://4890500419ada747800d334659adc53185225214307643fd9df915803c51b365" gracePeriod=2 Dec 02 10:00:29 crc kubenswrapper[4895]: I1202 10:00:29.883354 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-tsrj4" Dec 02 10:00:29 crc kubenswrapper[4895]: I1202 10:00:29.979710 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c46502aa-0af2-4e3f-9cf9-b4f79c94e7a0-utilities\") pod \"c46502aa-0af2-4e3f-9cf9-b4f79c94e7a0\" (UID: \"c46502aa-0af2-4e3f-9cf9-b4f79c94e7a0\") " Dec 02 10:00:29 crc kubenswrapper[4895]: I1202 10:00:29.979847 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c46502aa-0af2-4e3f-9cf9-b4f79c94e7a0-catalog-content\") pod \"c46502aa-0af2-4e3f-9cf9-b4f79c94e7a0\" (UID: \"c46502aa-0af2-4e3f-9cf9-b4f79c94e7a0\") " Dec 02 10:00:29 crc kubenswrapper[4895]: I1202 10:00:29.980107 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5q9wk\" (UniqueName: \"kubernetes.io/projected/c46502aa-0af2-4e3f-9cf9-b4f79c94e7a0-kube-api-access-5q9wk\") pod \"c46502aa-0af2-4e3f-9cf9-b4f79c94e7a0\" (UID: \"c46502aa-0af2-4e3f-9cf9-b4f79c94e7a0\") " Dec 02 10:00:29 crc kubenswrapper[4895]: I1202 10:00:29.982349 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c46502aa-0af2-4e3f-9cf9-b4f79c94e7a0-utilities" (OuterVolumeSpecName: "utilities") pod "c46502aa-0af2-4e3f-9cf9-b4f79c94e7a0" (UID: "c46502aa-0af2-4e3f-9cf9-b4f79c94e7a0"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 10:00:29 crc kubenswrapper[4895]: I1202 10:00:29.985687 4895 generic.go:334] "Generic (PLEG): container finished" podID="c46502aa-0af2-4e3f-9cf9-b4f79c94e7a0" containerID="4890500419ada747800d334659adc53185225214307643fd9df915803c51b365" exitCode=0 Dec 02 10:00:29 crc kubenswrapper[4895]: I1202 10:00:29.985727 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tsrj4" event={"ID":"c46502aa-0af2-4e3f-9cf9-b4f79c94e7a0","Type":"ContainerDied","Data":"4890500419ada747800d334659adc53185225214307643fd9df915803c51b365"} Dec 02 10:00:29 crc kubenswrapper[4895]: I1202 10:00:29.985773 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tsrj4" event={"ID":"c46502aa-0af2-4e3f-9cf9-b4f79c94e7a0","Type":"ContainerDied","Data":"571569857ddf9cea2832751c47c14e28d9b129999a41606284337476aa39d9c1"} Dec 02 10:00:29 crc kubenswrapper[4895]: I1202 10:00:29.985787 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tsrj4" Dec 02 10:00:29 crc kubenswrapper[4895]: I1202 10:00:29.985791 4895 scope.go:117] "RemoveContainer" containerID="4890500419ada747800d334659adc53185225214307643fd9df915803c51b365" Dec 02 10:00:29 crc kubenswrapper[4895]: I1202 10:00:29.990857 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c46502aa-0af2-4e3f-9cf9-b4f79c94e7a0-kube-api-access-5q9wk" (OuterVolumeSpecName: "kube-api-access-5q9wk") pod "c46502aa-0af2-4e3f-9cf9-b4f79c94e7a0" (UID: "c46502aa-0af2-4e3f-9cf9-b4f79c94e7a0"). InnerVolumeSpecName "kube-api-access-5q9wk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 10:00:30 crc kubenswrapper[4895]: I1202 10:00:30.047398 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c46502aa-0af2-4e3f-9cf9-b4f79c94e7a0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c46502aa-0af2-4e3f-9cf9-b4f79c94e7a0" (UID: "c46502aa-0af2-4e3f-9cf9-b4f79c94e7a0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 10:00:30 crc kubenswrapper[4895]: I1202 10:00:30.083348 4895 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c46502aa-0af2-4e3f-9cf9-b4f79c94e7a0-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 10:00:30 crc kubenswrapper[4895]: I1202 10:00:30.083382 4895 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c46502aa-0af2-4e3f-9cf9-b4f79c94e7a0-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 10:00:30 crc kubenswrapper[4895]: I1202 10:00:30.083395 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5q9wk\" (UniqueName: \"kubernetes.io/projected/c46502aa-0af2-4e3f-9cf9-b4f79c94e7a0-kube-api-access-5q9wk\") on node \"crc\" DevicePath \"\"" Dec 02 10:00:30 crc kubenswrapper[4895]: I1202 10:00:30.087197 4895 scope.go:117] "RemoveContainer" containerID="b9753336a05c3416eb210fc6a54988c6ab1972f278539b8a22a8663ba6085a33" Dec 02 10:00:30 crc kubenswrapper[4895]: I1202 10:00:30.125029 4895 scope.go:117] "RemoveContainer" containerID="346094e45127ce8dec88c2d13a4c014bb2b12f7f816584c2c92cf0645021c0c8" Dec 02 10:00:30 crc kubenswrapper[4895]: I1202 10:00:30.175669 4895 scope.go:117] "RemoveContainer" containerID="4890500419ada747800d334659adc53185225214307643fd9df915803c51b365" Dec 02 10:00:30 crc kubenswrapper[4895]: E1202 10:00:30.176235 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"4890500419ada747800d334659adc53185225214307643fd9df915803c51b365\": container with ID starting with 4890500419ada747800d334659adc53185225214307643fd9df915803c51b365 not found: ID does not exist" containerID="4890500419ada747800d334659adc53185225214307643fd9df915803c51b365" Dec 02 10:00:30 crc kubenswrapper[4895]: I1202 10:00:30.176282 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4890500419ada747800d334659adc53185225214307643fd9df915803c51b365"} err="failed to get container status \"4890500419ada747800d334659adc53185225214307643fd9df915803c51b365\": rpc error: code = NotFound desc = could not find container \"4890500419ada747800d334659adc53185225214307643fd9df915803c51b365\": container with ID starting with 4890500419ada747800d334659adc53185225214307643fd9df915803c51b365 not found: ID does not exist" Dec 02 10:00:30 crc kubenswrapper[4895]: I1202 10:00:30.176316 4895 scope.go:117] "RemoveContainer" containerID="b9753336a05c3416eb210fc6a54988c6ab1972f278539b8a22a8663ba6085a33" Dec 02 10:00:30 crc kubenswrapper[4895]: E1202 10:00:30.176679 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9753336a05c3416eb210fc6a54988c6ab1972f278539b8a22a8663ba6085a33\": container with ID starting with b9753336a05c3416eb210fc6a54988c6ab1972f278539b8a22a8663ba6085a33 not found: ID does not exist" containerID="b9753336a05c3416eb210fc6a54988c6ab1972f278539b8a22a8663ba6085a33" Dec 02 10:00:30 crc kubenswrapper[4895]: I1202 10:00:30.176707 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9753336a05c3416eb210fc6a54988c6ab1972f278539b8a22a8663ba6085a33"} err="failed to get container status \"b9753336a05c3416eb210fc6a54988c6ab1972f278539b8a22a8663ba6085a33\": rpc error: code = NotFound desc = could not find container 
\"b9753336a05c3416eb210fc6a54988c6ab1972f278539b8a22a8663ba6085a33\": container with ID starting with b9753336a05c3416eb210fc6a54988c6ab1972f278539b8a22a8663ba6085a33 not found: ID does not exist" Dec 02 10:00:30 crc kubenswrapper[4895]: I1202 10:00:30.176725 4895 scope.go:117] "RemoveContainer" containerID="346094e45127ce8dec88c2d13a4c014bb2b12f7f816584c2c92cf0645021c0c8" Dec 02 10:00:30 crc kubenswrapper[4895]: E1202 10:00:30.177325 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"346094e45127ce8dec88c2d13a4c014bb2b12f7f816584c2c92cf0645021c0c8\": container with ID starting with 346094e45127ce8dec88c2d13a4c014bb2b12f7f816584c2c92cf0645021c0c8 not found: ID does not exist" containerID="346094e45127ce8dec88c2d13a4c014bb2b12f7f816584c2c92cf0645021c0c8" Dec 02 10:00:30 crc kubenswrapper[4895]: I1202 10:00:30.177354 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"346094e45127ce8dec88c2d13a4c014bb2b12f7f816584c2c92cf0645021c0c8"} err="failed to get container status \"346094e45127ce8dec88c2d13a4c014bb2b12f7f816584c2c92cf0645021c0c8\": rpc error: code = NotFound desc = could not find container \"346094e45127ce8dec88c2d13a4c014bb2b12f7f816584c2c92cf0645021c0c8\": container with ID starting with 346094e45127ce8dec88c2d13a4c014bb2b12f7f816584c2c92cf0645021c0c8 not found: ID does not exist" Dec 02 10:00:30 crc kubenswrapper[4895]: I1202 10:00:30.320679 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tsrj4"] Dec 02 10:00:30 crc kubenswrapper[4895]: I1202 10:00:30.332767 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-tsrj4"] Dec 02 10:00:31 crc kubenswrapper[4895]: I1202 10:00:31.157945 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c46502aa-0af2-4e3f-9cf9-b4f79c94e7a0" 
path="/var/lib/kubelet/pods/c46502aa-0af2-4e3f-9cf9-b4f79c94e7a0/volumes" Dec 02 10:00:32 crc kubenswrapper[4895]: I1202 10:00:32.141259 4895 scope.go:117] "RemoveContainer" containerID="fb24c8085688b9b476b4097e1f5ccaf4a5d4d236157b3b1913f7e381bd3cd823" Dec 02 10:00:32 crc kubenswrapper[4895]: E1202 10:00:32.141833 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 10:00:35 crc kubenswrapper[4895]: I1202 10:00:35.371753 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-ldm68/must-gather-86887"] Dec 02 10:00:35 crc kubenswrapper[4895]: E1202 10:00:35.373614 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c46502aa-0af2-4e3f-9cf9-b4f79c94e7a0" containerName="extract-utilities" Dec 02 10:00:35 crc kubenswrapper[4895]: I1202 10:00:35.373648 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="c46502aa-0af2-4e3f-9cf9-b4f79c94e7a0" containerName="extract-utilities" Dec 02 10:00:35 crc kubenswrapper[4895]: E1202 10:00:35.373693 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c46502aa-0af2-4e3f-9cf9-b4f79c94e7a0" containerName="registry-server" Dec 02 10:00:35 crc kubenswrapper[4895]: I1202 10:00:35.373704 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="c46502aa-0af2-4e3f-9cf9-b4f79c94e7a0" containerName="registry-server" Dec 02 10:00:35 crc kubenswrapper[4895]: E1202 10:00:35.373731 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c46502aa-0af2-4e3f-9cf9-b4f79c94e7a0" containerName="extract-content" Dec 02 10:00:35 crc kubenswrapper[4895]: I1202 10:00:35.373755 4895 
state_mem.go:107] "Deleted CPUSet assignment" podUID="c46502aa-0af2-4e3f-9cf9-b4f79c94e7a0" containerName="extract-content" Dec 02 10:00:35 crc kubenswrapper[4895]: I1202 10:00:35.374107 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="c46502aa-0af2-4e3f-9cf9-b4f79c94e7a0" containerName="registry-server" Dec 02 10:00:35 crc kubenswrapper[4895]: I1202 10:00:35.376915 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-ldm68/must-gather-86887" Dec 02 10:00:35 crc kubenswrapper[4895]: I1202 10:00:35.381859 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-ldm68"/"openshift-service-ca.crt" Dec 02 10:00:35 crc kubenswrapper[4895]: I1202 10:00:35.381857 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-ldm68"/"kube-root-ca.crt" Dec 02 10:00:35 crc kubenswrapper[4895]: I1202 10:00:35.409193 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-ldm68/must-gather-86887"] Dec 02 10:00:35 crc kubenswrapper[4895]: I1202 10:00:35.445921 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/93be2c72-5575-4d6c-adcd-0714307ce225-must-gather-output\") pod \"must-gather-86887\" (UID: \"93be2c72-5575-4d6c-adcd-0714307ce225\") " pod="openshift-must-gather-ldm68/must-gather-86887" Dec 02 10:00:35 crc kubenswrapper[4895]: I1202 10:00:35.446169 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ppgp\" (UniqueName: \"kubernetes.io/projected/93be2c72-5575-4d6c-adcd-0714307ce225-kube-api-access-2ppgp\") pod \"must-gather-86887\" (UID: \"93be2c72-5575-4d6c-adcd-0714307ce225\") " pod="openshift-must-gather-ldm68/must-gather-86887" Dec 02 10:00:35 crc kubenswrapper[4895]: I1202 10:00:35.550816 4895 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/93be2c72-5575-4d6c-adcd-0714307ce225-must-gather-output\") pod \"must-gather-86887\" (UID: \"93be2c72-5575-4d6c-adcd-0714307ce225\") " pod="openshift-must-gather-ldm68/must-gather-86887" Dec 02 10:00:35 crc kubenswrapper[4895]: I1202 10:00:35.551030 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2ppgp\" (UniqueName: \"kubernetes.io/projected/93be2c72-5575-4d6c-adcd-0714307ce225-kube-api-access-2ppgp\") pod \"must-gather-86887\" (UID: \"93be2c72-5575-4d6c-adcd-0714307ce225\") " pod="openshift-must-gather-ldm68/must-gather-86887" Dec 02 10:00:35 crc kubenswrapper[4895]: I1202 10:00:35.551234 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/93be2c72-5575-4d6c-adcd-0714307ce225-must-gather-output\") pod \"must-gather-86887\" (UID: \"93be2c72-5575-4d6c-adcd-0714307ce225\") " pod="openshift-must-gather-ldm68/must-gather-86887" Dec 02 10:00:35 crc kubenswrapper[4895]: I1202 10:00:35.580600 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ppgp\" (UniqueName: \"kubernetes.io/projected/93be2c72-5575-4d6c-adcd-0714307ce225-kube-api-access-2ppgp\") pod \"must-gather-86887\" (UID: \"93be2c72-5575-4d6c-adcd-0714307ce225\") " pod="openshift-must-gather-ldm68/must-gather-86887" Dec 02 10:00:35 crc kubenswrapper[4895]: I1202 10:00:35.712954 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-ldm68/must-gather-86887" Dec 02 10:00:36 crc kubenswrapper[4895]: I1202 10:00:36.260031 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-ldm68/must-gather-86887"] Dec 02 10:00:37 crc kubenswrapper[4895]: I1202 10:00:37.068450 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ldm68/must-gather-86887" event={"ID":"93be2c72-5575-4d6c-adcd-0714307ce225","Type":"ContainerStarted","Data":"8a7b08ee25989bd23c2bbd73c2badfdc1981c92de90f87cb9efb70938315d4e5"} Dec 02 10:00:40 crc kubenswrapper[4895]: I1202 10:00:40.545417 4895 scope.go:117] "RemoveContainer" containerID="d67f5c6d47528a97f5915d40ace3b5638df9ff731d460837082fbe43a6dc6d6b" Dec 02 10:00:44 crc kubenswrapper[4895]: I1202 10:00:44.154780 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ldm68/must-gather-86887" event={"ID":"93be2c72-5575-4d6c-adcd-0714307ce225","Type":"ContainerStarted","Data":"5653142cb06c62207dc7036f57ae60397c136dc2f5827ad85d8267faa96f3beb"} Dec 02 10:00:44 crc kubenswrapper[4895]: I1202 10:00:44.155365 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ldm68/must-gather-86887" event={"ID":"93be2c72-5575-4d6c-adcd-0714307ce225","Type":"ContainerStarted","Data":"266b7ba0e11efa3164afc7a0af329cc069b5de1b34e3ce72a87a889be48bea66"} Dec 02 10:00:44 crc kubenswrapper[4895]: I1202 10:00:44.177990 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-ldm68/must-gather-86887" podStartSLOduration=2.914210087 podStartE2EDuration="9.177969374s" podCreationTimestamp="2025-12-02 10:00:35 +0000 UTC" firstStartedPulling="2025-12-02 10:00:36.257993791 +0000 UTC m=+9447.428853404" lastFinishedPulling="2025-12-02 10:00:42.521753078 +0000 UTC m=+9453.692612691" observedRunningTime="2025-12-02 10:00:44.17366174 +0000 UTC m=+9455.344521373" watchObservedRunningTime="2025-12-02 10:00:44.177969374 
+0000 UTC m=+9455.348828987" Dec 02 10:00:46 crc kubenswrapper[4895]: E1202 10:00:46.121256 4895 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.102.83.13:49376->38.102.83.13:37351: read tcp 38.102.83.13:49376->38.102.83.13:37351: read: connection reset by peer Dec 02 10:00:46 crc kubenswrapper[4895]: E1202 10:00:46.779968 4895 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.13:49456->38.102.83.13:37351: write tcp 38.102.83.13:49456->38.102.83.13:37351: write: broken pipe Dec 02 10:00:46 crc kubenswrapper[4895]: E1202 10:00:46.792524 4895 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.102.83.13:49486->38.102.83.13:37351: read tcp 38.102.83.13:49486->38.102.83.13:37351: read: connection reset by peer Dec 02 10:00:47 crc kubenswrapper[4895]: I1202 10:00:47.141641 4895 scope.go:117] "RemoveContainer" containerID="fb24c8085688b9b476b4097e1f5ccaf4a5d4d236157b3b1913f7e381bd3cd823" Dec 02 10:00:48 crc kubenswrapper[4895]: I1202 10:00:48.139532 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-ldm68/crc-debug-4vr6v"] Dec 02 10:00:48 crc kubenswrapper[4895]: I1202 10:00:48.141451 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-ldm68/crc-debug-4vr6v" Dec 02 10:00:48 crc kubenswrapper[4895]: I1202 10:00:48.144503 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-ldm68"/"default-dockercfg-hkw7b" Dec 02 10:00:48 crc kubenswrapper[4895]: I1202 10:00:48.217050 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" event={"ID":"0468d2d1-a975-45a6-af9f-47adc432fab0","Type":"ContainerStarted","Data":"4a84913ab078ae3f7e63d3921159754fdc076f025af9fff01a588a2bf87f8d82"} Dec 02 10:00:48 crc kubenswrapper[4895]: I1202 10:00:48.244670 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4rmq\" (UniqueName: \"kubernetes.io/projected/ad5be1e7-b668-4922-bcf5-3b60622a5328-kube-api-access-x4rmq\") pod \"crc-debug-4vr6v\" (UID: \"ad5be1e7-b668-4922-bcf5-3b60622a5328\") " pod="openshift-must-gather-ldm68/crc-debug-4vr6v" Dec 02 10:00:48 crc kubenswrapper[4895]: I1202 10:00:48.245243 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ad5be1e7-b668-4922-bcf5-3b60622a5328-host\") pod \"crc-debug-4vr6v\" (UID: \"ad5be1e7-b668-4922-bcf5-3b60622a5328\") " pod="openshift-must-gather-ldm68/crc-debug-4vr6v" Dec 02 10:00:48 crc kubenswrapper[4895]: I1202 10:00:48.348155 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4rmq\" (UniqueName: \"kubernetes.io/projected/ad5be1e7-b668-4922-bcf5-3b60622a5328-kube-api-access-x4rmq\") pod \"crc-debug-4vr6v\" (UID: \"ad5be1e7-b668-4922-bcf5-3b60622a5328\") " pod="openshift-must-gather-ldm68/crc-debug-4vr6v" Dec 02 10:00:48 crc kubenswrapper[4895]: I1202 10:00:48.348247 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/ad5be1e7-b668-4922-bcf5-3b60622a5328-host\") pod \"crc-debug-4vr6v\" (UID: \"ad5be1e7-b668-4922-bcf5-3b60622a5328\") " pod="openshift-must-gather-ldm68/crc-debug-4vr6v" Dec 02 10:00:48 crc kubenswrapper[4895]: I1202 10:00:48.348712 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ad5be1e7-b668-4922-bcf5-3b60622a5328-host\") pod \"crc-debug-4vr6v\" (UID: \"ad5be1e7-b668-4922-bcf5-3b60622a5328\") " pod="openshift-must-gather-ldm68/crc-debug-4vr6v" Dec 02 10:00:48 crc kubenswrapper[4895]: I1202 10:00:48.379821 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4rmq\" (UniqueName: \"kubernetes.io/projected/ad5be1e7-b668-4922-bcf5-3b60622a5328-kube-api-access-x4rmq\") pod \"crc-debug-4vr6v\" (UID: \"ad5be1e7-b668-4922-bcf5-3b60622a5328\") " pod="openshift-must-gather-ldm68/crc-debug-4vr6v" Dec 02 10:00:48 crc kubenswrapper[4895]: I1202 10:00:48.465177 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-ldm68/crc-debug-4vr6v" Dec 02 10:00:49 crc kubenswrapper[4895]: I1202 10:00:49.229043 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ldm68/crc-debug-4vr6v" event={"ID":"ad5be1e7-b668-4922-bcf5-3b60622a5328","Type":"ContainerStarted","Data":"fc88dae53f02774c0b1ee2fe2effa693778bc0d38cd2b6b7cd3efb3536f3e133"} Dec 02 10:01:00 crc kubenswrapper[4895]: I1202 10:01:00.154487 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29411161-xwnxx"] Dec 02 10:01:00 crc kubenswrapper[4895]: I1202 10:01:00.156568 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29411161-xwnxx" Dec 02 10:01:00 crc kubenswrapper[4895]: I1202 10:01:00.170322 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29411161-xwnxx"] Dec 02 10:01:00 crc kubenswrapper[4895]: I1202 10:01:00.229426 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5e89a9f9-bc16-4d4b-ae72-78209734db6f-fernet-keys\") pod \"keystone-cron-29411161-xwnxx\" (UID: \"5e89a9f9-bc16-4d4b-ae72-78209734db6f\") " pod="openstack/keystone-cron-29411161-xwnxx" Dec 02 10:01:00 crc kubenswrapper[4895]: I1202 10:01:00.229575 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4b7b\" (UniqueName: \"kubernetes.io/projected/5e89a9f9-bc16-4d4b-ae72-78209734db6f-kube-api-access-l4b7b\") pod \"keystone-cron-29411161-xwnxx\" (UID: \"5e89a9f9-bc16-4d4b-ae72-78209734db6f\") " pod="openstack/keystone-cron-29411161-xwnxx" Dec 02 10:01:00 crc kubenswrapper[4895]: I1202 10:01:00.229626 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e89a9f9-bc16-4d4b-ae72-78209734db6f-config-data\") pod \"keystone-cron-29411161-xwnxx\" (UID: \"5e89a9f9-bc16-4d4b-ae72-78209734db6f\") " pod="openstack/keystone-cron-29411161-xwnxx" Dec 02 10:01:00 crc kubenswrapper[4895]: I1202 10:01:00.229878 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e89a9f9-bc16-4d4b-ae72-78209734db6f-combined-ca-bundle\") pod \"keystone-cron-29411161-xwnxx\" (UID: \"5e89a9f9-bc16-4d4b-ae72-78209734db6f\") " pod="openstack/keystone-cron-29411161-xwnxx" Dec 02 10:01:00 crc kubenswrapper[4895]: I1202 10:01:00.332326 4895 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5e89a9f9-bc16-4d4b-ae72-78209734db6f-fernet-keys\") pod \"keystone-cron-29411161-xwnxx\" (UID: \"5e89a9f9-bc16-4d4b-ae72-78209734db6f\") " pod="openstack/keystone-cron-29411161-xwnxx" Dec 02 10:01:00 crc kubenswrapper[4895]: I1202 10:01:00.332426 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4b7b\" (UniqueName: \"kubernetes.io/projected/5e89a9f9-bc16-4d4b-ae72-78209734db6f-kube-api-access-l4b7b\") pod \"keystone-cron-29411161-xwnxx\" (UID: \"5e89a9f9-bc16-4d4b-ae72-78209734db6f\") " pod="openstack/keystone-cron-29411161-xwnxx" Dec 02 10:01:00 crc kubenswrapper[4895]: I1202 10:01:00.332486 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e89a9f9-bc16-4d4b-ae72-78209734db6f-config-data\") pod \"keystone-cron-29411161-xwnxx\" (UID: \"5e89a9f9-bc16-4d4b-ae72-78209734db6f\") " pod="openstack/keystone-cron-29411161-xwnxx" Dec 02 10:01:00 crc kubenswrapper[4895]: I1202 10:01:00.332579 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e89a9f9-bc16-4d4b-ae72-78209734db6f-combined-ca-bundle\") pod \"keystone-cron-29411161-xwnxx\" (UID: \"5e89a9f9-bc16-4d4b-ae72-78209734db6f\") " pod="openstack/keystone-cron-29411161-xwnxx" Dec 02 10:01:00 crc kubenswrapper[4895]: I1202 10:01:00.363124 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5e89a9f9-bc16-4d4b-ae72-78209734db6f-fernet-keys\") pod \"keystone-cron-29411161-xwnxx\" (UID: \"5e89a9f9-bc16-4d4b-ae72-78209734db6f\") " pod="openstack/keystone-cron-29411161-xwnxx" Dec 02 10:01:00 crc kubenswrapper[4895]: I1202 10:01:00.363967 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/5e89a9f9-bc16-4d4b-ae72-78209734db6f-combined-ca-bundle\") pod \"keystone-cron-29411161-xwnxx\" (UID: \"5e89a9f9-bc16-4d4b-ae72-78209734db6f\") " pod="openstack/keystone-cron-29411161-xwnxx" Dec 02 10:01:00 crc kubenswrapper[4895]: I1202 10:01:00.371213 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e89a9f9-bc16-4d4b-ae72-78209734db6f-config-data\") pod \"keystone-cron-29411161-xwnxx\" (UID: \"5e89a9f9-bc16-4d4b-ae72-78209734db6f\") " pod="openstack/keystone-cron-29411161-xwnxx" Dec 02 10:01:00 crc kubenswrapper[4895]: I1202 10:01:00.371633 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4b7b\" (UniqueName: \"kubernetes.io/projected/5e89a9f9-bc16-4d4b-ae72-78209734db6f-kube-api-access-l4b7b\") pod \"keystone-cron-29411161-xwnxx\" (UID: \"5e89a9f9-bc16-4d4b-ae72-78209734db6f\") " pod="openstack/keystone-cron-29411161-xwnxx" Dec 02 10:01:00 crc kubenswrapper[4895]: I1202 10:01:00.481801 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29411161-xwnxx" Dec 02 10:01:01 crc kubenswrapper[4895]: I1202 10:01:01.858324 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29411161-xwnxx"] Dec 02 10:01:01 crc kubenswrapper[4895]: W1202 10:01:01.863956 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5e89a9f9_bc16_4d4b_ae72_78209734db6f.slice/crio-a38650ee789c7c0c486b202209c49a67ec4c90281eb21bb39ba65be02f442254 WatchSource:0}: Error finding container a38650ee789c7c0c486b202209c49a67ec4c90281eb21bb39ba65be02f442254: Status 404 returned error can't find the container with id a38650ee789c7c0c486b202209c49a67ec4c90281eb21bb39ba65be02f442254 Dec 02 10:01:02 crc kubenswrapper[4895]: I1202 10:01:02.438714 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29411161-xwnxx" event={"ID":"5e89a9f9-bc16-4d4b-ae72-78209734db6f","Type":"ContainerStarted","Data":"f9e16f37d7d30e01358a81dde1b91138c6cb8238daf670badd48ec8c005999c7"} Dec 02 10:01:02 crc kubenswrapper[4895]: I1202 10:01:02.439435 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29411161-xwnxx" event={"ID":"5e89a9f9-bc16-4d4b-ae72-78209734db6f","Type":"ContainerStarted","Data":"a38650ee789c7c0c486b202209c49a67ec4c90281eb21bb39ba65be02f442254"} Dec 02 10:01:02 crc kubenswrapper[4895]: I1202 10:01:02.440650 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ldm68/crc-debug-4vr6v" event={"ID":"ad5be1e7-b668-4922-bcf5-3b60622a5328","Type":"ContainerStarted","Data":"dfc356961b28bd4b40a9612f1a182dcc74f63e5782ca71a6736fe938c831e7ec"} Dec 02 10:01:02 crc kubenswrapper[4895]: I1202 10:01:02.462076 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29411161-xwnxx" podStartSLOduration=2.462055916 podStartE2EDuration="2.462055916s" 
podCreationTimestamp="2025-12-02 10:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 10:01:02.457161184 +0000 UTC m=+9473.628020817" watchObservedRunningTime="2025-12-02 10:01:02.462055916 +0000 UTC m=+9473.632915529" Dec 02 10:01:02 crc kubenswrapper[4895]: I1202 10:01:02.477627 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-ldm68/crc-debug-4vr6v" podStartSLOduration=1.556915402 podStartE2EDuration="14.477605291s" podCreationTimestamp="2025-12-02 10:00:48 +0000 UTC" firstStartedPulling="2025-12-02 10:00:48.518768527 +0000 UTC m=+9459.689628140" lastFinishedPulling="2025-12-02 10:01:01.439458416 +0000 UTC m=+9472.610318029" observedRunningTime="2025-12-02 10:01:02.47146313 +0000 UTC m=+9473.642322743" watchObservedRunningTime="2025-12-02 10:01:02.477605291 +0000 UTC m=+9473.648464914" Dec 02 10:01:06 crc kubenswrapper[4895]: I1202 10:01:06.504174 4895 generic.go:334] "Generic (PLEG): container finished" podID="5e89a9f9-bc16-4d4b-ae72-78209734db6f" containerID="f9e16f37d7d30e01358a81dde1b91138c6cb8238daf670badd48ec8c005999c7" exitCode=0 Dec 02 10:01:06 crc kubenswrapper[4895]: I1202 10:01:06.504282 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29411161-xwnxx" event={"ID":"5e89a9f9-bc16-4d4b-ae72-78209734db6f","Type":"ContainerDied","Data":"f9e16f37d7d30e01358a81dde1b91138c6cb8238daf670badd48ec8c005999c7"} Dec 02 10:01:08 crc kubenswrapper[4895]: I1202 10:01:08.982823 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29411161-xwnxx" Dec 02 10:01:08 crc kubenswrapper[4895]: I1202 10:01:08.987001 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5e89a9f9-bc16-4d4b-ae72-78209734db6f-fernet-keys\") pod \"5e89a9f9-bc16-4d4b-ae72-78209734db6f\" (UID: \"5e89a9f9-bc16-4d4b-ae72-78209734db6f\") " Dec 02 10:01:08 crc kubenswrapper[4895]: I1202 10:01:08.987271 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e89a9f9-bc16-4d4b-ae72-78209734db6f-combined-ca-bundle\") pod \"5e89a9f9-bc16-4d4b-ae72-78209734db6f\" (UID: \"5e89a9f9-bc16-4d4b-ae72-78209734db6f\") " Dec 02 10:01:08 crc kubenswrapper[4895]: I1202 10:01:08.987396 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e89a9f9-bc16-4d4b-ae72-78209734db6f-config-data\") pod \"5e89a9f9-bc16-4d4b-ae72-78209734db6f\" (UID: \"5e89a9f9-bc16-4d4b-ae72-78209734db6f\") " Dec 02 10:01:08 crc kubenswrapper[4895]: I1202 10:01:08.988520 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l4b7b\" (UniqueName: \"kubernetes.io/projected/5e89a9f9-bc16-4d4b-ae72-78209734db6f-kube-api-access-l4b7b\") pod \"5e89a9f9-bc16-4d4b-ae72-78209734db6f\" (UID: \"5e89a9f9-bc16-4d4b-ae72-78209734db6f\") " Dec 02 10:01:09 crc kubenswrapper[4895]: I1202 10:01:09.010000 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e89a9f9-bc16-4d4b-ae72-78209734db6f-kube-api-access-l4b7b" (OuterVolumeSpecName: "kube-api-access-l4b7b") pod "5e89a9f9-bc16-4d4b-ae72-78209734db6f" (UID: "5e89a9f9-bc16-4d4b-ae72-78209734db6f"). InnerVolumeSpecName "kube-api-access-l4b7b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 10:01:09 crc kubenswrapper[4895]: I1202 10:01:09.029225 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e89a9f9-bc16-4d4b-ae72-78209734db6f-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "5e89a9f9-bc16-4d4b-ae72-78209734db6f" (UID: "5e89a9f9-bc16-4d4b-ae72-78209734db6f"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 10:01:09 crc kubenswrapper[4895]: I1202 10:01:09.051617 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e89a9f9-bc16-4d4b-ae72-78209734db6f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5e89a9f9-bc16-4d4b-ae72-78209734db6f" (UID: "5e89a9f9-bc16-4d4b-ae72-78209734db6f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 10:01:09 crc kubenswrapper[4895]: I1202 10:01:09.074667 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e89a9f9-bc16-4d4b-ae72-78209734db6f-config-data" (OuterVolumeSpecName: "config-data") pod "5e89a9f9-bc16-4d4b-ae72-78209734db6f" (UID: "5e89a9f9-bc16-4d4b-ae72-78209734db6f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 10:01:09 crc kubenswrapper[4895]: I1202 10:01:09.092032 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l4b7b\" (UniqueName: \"kubernetes.io/projected/5e89a9f9-bc16-4d4b-ae72-78209734db6f-kube-api-access-l4b7b\") on node \"crc\" DevicePath \"\"" Dec 02 10:01:09 crc kubenswrapper[4895]: I1202 10:01:09.092069 4895 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5e89a9f9-bc16-4d4b-ae72-78209734db6f-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 02 10:01:09 crc kubenswrapper[4895]: I1202 10:01:09.092086 4895 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e89a9f9-bc16-4d4b-ae72-78209734db6f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 10:01:09 crc kubenswrapper[4895]: I1202 10:01:09.092106 4895 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e89a9f9-bc16-4d4b-ae72-78209734db6f-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 10:01:09 crc kubenswrapper[4895]: I1202 10:01:09.554382 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29411161-xwnxx" event={"ID":"5e89a9f9-bc16-4d4b-ae72-78209734db6f","Type":"ContainerDied","Data":"a38650ee789c7c0c486b202209c49a67ec4c90281eb21bb39ba65be02f442254"} Dec 02 10:01:09 crc kubenswrapper[4895]: I1202 10:01:09.554815 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a38650ee789c7c0c486b202209c49a67ec4c90281eb21bb39ba65be02f442254" Dec 02 10:01:09 crc kubenswrapper[4895]: I1202 10:01:09.554907 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29411161-xwnxx" Dec 02 10:01:15 crc kubenswrapper[4895]: E1202 10:01:15.660741 4895 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5e89a9f9_bc16_4d4b_ae72_78209734db6f.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5e89a9f9_bc16_4d4b_ae72_78209734db6f.slice/crio-a38650ee789c7c0c486b202209c49a67ec4c90281eb21bb39ba65be02f442254\": RecentStats: unable to find data in memory cache]" Dec 02 10:01:21 crc kubenswrapper[4895]: I1202 10:01:21.690863 4895 generic.go:334] "Generic (PLEG): container finished" podID="ad5be1e7-b668-4922-bcf5-3b60622a5328" containerID="dfc356961b28bd4b40a9612f1a182dcc74f63e5782ca71a6736fe938c831e7ec" exitCode=0 Dec 02 10:01:21 crc kubenswrapper[4895]: I1202 10:01:21.690899 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ldm68/crc-debug-4vr6v" event={"ID":"ad5be1e7-b668-4922-bcf5-3b60622a5328","Type":"ContainerDied","Data":"dfc356961b28bd4b40a9612f1a182dcc74f63e5782ca71a6736fe938c831e7ec"} Dec 02 10:01:22 crc kubenswrapper[4895]: I1202 10:01:22.841556 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-ldm68/crc-debug-4vr6v" Dec 02 10:01:22 crc kubenswrapper[4895]: I1202 10:01:22.884790 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-ldm68/crc-debug-4vr6v"] Dec 02 10:01:22 crc kubenswrapper[4895]: I1202 10:01:22.895796 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-ldm68/crc-debug-4vr6v"] Dec 02 10:01:22 crc kubenswrapper[4895]: I1202 10:01:22.999120 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4rmq\" (UniqueName: \"kubernetes.io/projected/ad5be1e7-b668-4922-bcf5-3b60622a5328-kube-api-access-x4rmq\") pod \"ad5be1e7-b668-4922-bcf5-3b60622a5328\" (UID: \"ad5be1e7-b668-4922-bcf5-3b60622a5328\") " Dec 02 10:01:22 crc kubenswrapper[4895]: I1202 10:01:22.999416 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ad5be1e7-b668-4922-bcf5-3b60622a5328-host\") pod \"ad5be1e7-b668-4922-bcf5-3b60622a5328\" (UID: \"ad5be1e7-b668-4922-bcf5-3b60622a5328\") " Dec 02 10:01:23 crc kubenswrapper[4895]: I1202 10:01:23.000069 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ad5be1e7-b668-4922-bcf5-3b60622a5328-host" (OuterVolumeSpecName: "host") pod "ad5be1e7-b668-4922-bcf5-3b60622a5328" (UID: "ad5be1e7-b668-4922-bcf5-3b60622a5328"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 10:01:23 crc kubenswrapper[4895]: I1202 10:01:23.020476 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad5be1e7-b668-4922-bcf5-3b60622a5328-kube-api-access-x4rmq" (OuterVolumeSpecName: "kube-api-access-x4rmq") pod "ad5be1e7-b668-4922-bcf5-3b60622a5328" (UID: "ad5be1e7-b668-4922-bcf5-3b60622a5328"). InnerVolumeSpecName "kube-api-access-x4rmq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 10:01:23 crc kubenswrapper[4895]: I1202 10:01:23.102621 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4rmq\" (UniqueName: \"kubernetes.io/projected/ad5be1e7-b668-4922-bcf5-3b60622a5328-kube-api-access-x4rmq\") on node \"crc\" DevicePath \"\"" Dec 02 10:01:23 crc kubenswrapper[4895]: I1202 10:01:23.102681 4895 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ad5be1e7-b668-4922-bcf5-3b60622a5328-host\") on node \"crc\" DevicePath \"\"" Dec 02 10:01:23 crc kubenswrapper[4895]: I1202 10:01:23.153976 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad5be1e7-b668-4922-bcf5-3b60622a5328" path="/var/lib/kubelet/pods/ad5be1e7-b668-4922-bcf5-3b60622a5328/volumes" Dec 02 10:01:23 crc kubenswrapper[4895]: I1202 10:01:23.710713 4895 scope.go:117] "RemoveContainer" containerID="dfc356961b28bd4b40a9612f1a182dcc74f63e5782ca71a6736fe938c831e7ec" Dec 02 10:01:23 crc kubenswrapper[4895]: I1202 10:01:23.710779 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-ldm68/crc-debug-4vr6v" Dec 02 10:01:24 crc kubenswrapper[4895]: I1202 10:01:24.089630 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-ldm68/crc-debug-nrmxj"] Dec 02 10:01:24 crc kubenswrapper[4895]: E1202 10:01:24.090226 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e89a9f9-bc16-4d4b-ae72-78209734db6f" containerName="keystone-cron" Dec 02 10:01:24 crc kubenswrapper[4895]: I1202 10:01:24.090242 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e89a9f9-bc16-4d4b-ae72-78209734db6f" containerName="keystone-cron" Dec 02 10:01:24 crc kubenswrapper[4895]: E1202 10:01:24.090274 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad5be1e7-b668-4922-bcf5-3b60622a5328" containerName="container-00" Dec 02 10:01:24 crc kubenswrapper[4895]: I1202 10:01:24.090281 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad5be1e7-b668-4922-bcf5-3b60622a5328" containerName="container-00" Dec 02 10:01:24 crc kubenswrapper[4895]: I1202 10:01:24.090563 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e89a9f9-bc16-4d4b-ae72-78209734db6f" containerName="keystone-cron" Dec 02 10:01:24 crc kubenswrapper[4895]: I1202 10:01:24.090651 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad5be1e7-b668-4922-bcf5-3b60622a5328" containerName="container-00" Dec 02 10:01:24 crc kubenswrapper[4895]: I1202 10:01:24.091573 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-ldm68/crc-debug-nrmxj" Dec 02 10:01:24 crc kubenswrapper[4895]: I1202 10:01:24.094153 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-ldm68"/"default-dockercfg-hkw7b" Dec 02 10:01:24 crc kubenswrapper[4895]: I1202 10:01:24.125652 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjf2s\" (UniqueName: \"kubernetes.io/projected/10a6fdb8-01bc-42b7-9a53-a7e53b0901ca-kube-api-access-pjf2s\") pod \"crc-debug-nrmxj\" (UID: \"10a6fdb8-01bc-42b7-9a53-a7e53b0901ca\") " pod="openshift-must-gather-ldm68/crc-debug-nrmxj" Dec 02 10:01:24 crc kubenswrapper[4895]: I1202 10:01:24.125901 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/10a6fdb8-01bc-42b7-9a53-a7e53b0901ca-host\") pod \"crc-debug-nrmxj\" (UID: \"10a6fdb8-01bc-42b7-9a53-a7e53b0901ca\") " pod="openshift-must-gather-ldm68/crc-debug-nrmxj" Dec 02 10:01:24 crc kubenswrapper[4895]: I1202 10:01:24.228418 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjf2s\" (UniqueName: \"kubernetes.io/projected/10a6fdb8-01bc-42b7-9a53-a7e53b0901ca-kube-api-access-pjf2s\") pod \"crc-debug-nrmxj\" (UID: \"10a6fdb8-01bc-42b7-9a53-a7e53b0901ca\") " pod="openshift-must-gather-ldm68/crc-debug-nrmxj" Dec 02 10:01:24 crc kubenswrapper[4895]: I1202 10:01:24.228880 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/10a6fdb8-01bc-42b7-9a53-a7e53b0901ca-host\") pod \"crc-debug-nrmxj\" (UID: \"10a6fdb8-01bc-42b7-9a53-a7e53b0901ca\") " pod="openshift-must-gather-ldm68/crc-debug-nrmxj" Dec 02 10:01:24 crc kubenswrapper[4895]: I1202 10:01:24.229043 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/10a6fdb8-01bc-42b7-9a53-a7e53b0901ca-host\") pod \"crc-debug-nrmxj\" (UID: \"10a6fdb8-01bc-42b7-9a53-a7e53b0901ca\") " pod="openshift-must-gather-ldm68/crc-debug-nrmxj" Dec 02 10:01:24 crc kubenswrapper[4895]: I1202 10:01:24.250671 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjf2s\" (UniqueName: \"kubernetes.io/projected/10a6fdb8-01bc-42b7-9a53-a7e53b0901ca-kube-api-access-pjf2s\") pod \"crc-debug-nrmxj\" (UID: \"10a6fdb8-01bc-42b7-9a53-a7e53b0901ca\") " pod="openshift-must-gather-ldm68/crc-debug-nrmxj" Dec 02 10:01:24 crc kubenswrapper[4895]: I1202 10:01:24.430078 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-ldm68/crc-debug-nrmxj" Dec 02 10:01:24 crc kubenswrapper[4895]: I1202 10:01:24.721845 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ldm68/crc-debug-nrmxj" event={"ID":"10a6fdb8-01bc-42b7-9a53-a7e53b0901ca","Type":"ContainerStarted","Data":"4ba227aaecf0c14cc302151fce4bd6fe6e44ecd2ad0208d9997db5a2791cab8f"} Dec 02 10:01:25 crc kubenswrapper[4895]: I1202 10:01:25.732974 4895 generic.go:334] "Generic (PLEG): container finished" podID="10a6fdb8-01bc-42b7-9a53-a7e53b0901ca" containerID="aa0ecaa19b5e942bc5aa3d129c783f300c779b25920881cc68be8f96d3ff16fc" exitCode=1 Dec 02 10:01:25 crc kubenswrapper[4895]: I1202 10:01:25.733044 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ldm68/crc-debug-nrmxj" event={"ID":"10a6fdb8-01bc-42b7-9a53-a7e53b0901ca","Type":"ContainerDied","Data":"aa0ecaa19b5e942bc5aa3d129c783f300c779b25920881cc68be8f96d3ff16fc"} Dec 02 10:01:25 crc kubenswrapper[4895]: I1202 10:01:25.781816 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-ldm68/crc-debug-nrmxj"] Dec 02 10:01:25 crc kubenswrapper[4895]: I1202 10:01:25.795294 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-must-gather-ldm68/crc-debug-nrmxj"] Dec 02 10:01:25 crc kubenswrapper[4895]: E1202 10:01:25.915791 4895 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5e89a9f9_bc16_4d4b_ae72_78209734db6f.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5e89a9f9_bc16_4d4b_ae72_78209734db6f.slice/crio-a38650ee789c7c0c486b202209c49a67ec4c90281eb21bb39ba65be02f442254\": RecentStats: unable to find data in memory cache]" Dec 02 10:01:26 crc kubenswrapper[4895]: I1202 10:01:26.866842 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-ldm68/crc-debug-nrmxj" Dec 02 10:01:26 crc kubenswrapper[4895]: I1202 10:01:26.989546 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjf2s\" (UniqueName: \"kubernetes.io/projected/10a6fdb8-01bc-42b7-9a53-a7e53b0901ca-kube-api-access-pjf2s\") pod \"10a6fdb8-01bc-42b7-9a53-a7e53b0901ca\" (UID: \"10a6fdb8-01bc-42b7-9a53-a7e53b0901ca\") " Dec 02 10:01:26 crc kubenswrapper[4895]: I1202 10:01:26.989689 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/10a6fdb8-01bc-42b7-9a53-a7e53b0901ca-host\") pod \"10a6fdb8-01bc-42b7-9a53-a7e53b0901ca\" (UID: \"10a6fdb8-01bc-42b7-9a53-a7e53b0901ca\") " Dec 02 10:01:26 crc kubenswrapper[4895]: I1202 10:01:26.989841 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/10a6fdb8-01bc-42b7-9a53-a7e53b0901ca-host" (OuterVolumeSpecName: "host") pod "10a6fdb8-01bc-42b7-9a53-a7e53b0901ca" (UID: "10a6fdb8-01bc-42b7-9a53-a7e53b0901ca"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 10:01:26 crc kubenswrapper[4895]: I1202 10:01:26.990313 4895 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/10a6fdb8-01bc-42b7-9a53-a7e53b0901ca-host\") on node \"crc\" DevicePath \"\"" Dec 02 10:01:26 crc kubenswrapper[4895]: I1202 10:01:26.998307 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10a6fdb8-01bc-42b7-9a53-a7e53b0901ca-kube-api-access-pjf2s" (OuterVolumeSpecName: "kube-api-access-pjf2s") pod "10a6fdb8-01bc-42b7-9a53-a7e53b0901ca" (UID: "10a6fdb8-01bc-42b7-9a53-a7e53b0901ca"). InnerVolumeSpecName "kube-api-access-pjf2s". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 10:01:27 crc kubenswrapper[4895]: I1202 10:01:27.093062 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjf2s\" (UniqueName: \"kubernetes.io/projected/10a6fdb8-01bc-42b7-9a53-a7e53b0901ca-kube-api-access-pjf2s\") on node \"crc\" DevicePath \"\"" Dec 02 10:01:27 crc kubenswrapper[4895]: I1202 10:01:27.154323 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10a6fdb8-01bc-42b7-9a53-a7e53b0901ca" path="/var/lib/kubelet/pods/10a6fdb8-01bc-42b7-9a53-a7e53b0901ca/volumes" Dec 02 10:01:27 crc kubenswrapper[4895]: I1202 10:01:27.754521 4895 scope.go:117] "RemoveContainer" containerID="aa0ecaa19b5e942bc5aa3d129c783f300c779b25920881cc68be8f96d3ff16fc" Dec 02 10:01:27 crc kubenswrapper[4895]: I1202 10:01:27.754950 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-ldm68/crc-debug-nrmxj" Dec 02 10:01:36 crc kubenswrapper[4895]: E1202 10:01:36.234238 4895 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5e89a9f9_bc16_4d4b_ae72_78209734db6f.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5e89a9f9_bc16_4d4b_ae72_78209734db6f.slice/crio-a38650ee789c7c0c486b202209c49a67ec4c90281eb21bb39ba65be02f442254\": RecentStats: unable to find data in memory cache]" Dec 02 10:01:36 crc kubenswrapper[4895]: I1202 10:01:36.437057 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-6vrhm"] Dec 02 10:01:36 crc kubenswrapper[4895]: E1202 10:01:36.437685 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10a6fdb8-01bc-42b7-9a53-a7e53b0901ca" containerName="container-00" Dec 02 10:01:36 crc kubenswrapper[4895]: I1202 10:01:36.437709 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="10a6fdb8-01bc-42b7-9a53-a7e53b0901ca" containerName="container-00" Dec 02 10:01:36 crc kubenswrapper[4895]: I1202 10:01:36.438000 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="10a6fdb8-01bc-42b7-9a53-a7e53b0901ca" containerName="container-00" Dec 02 10:01:36 crc kubenswrapper[4895]: I1202 10:01:36.440170 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6vrhm" Dec 02 10:01:36 crc kubenswrapper[4895]: I1202 10:01:36.458094 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6vrhm"] Dec 02 10:01:36 crc kubenswrapper[4895]: I1202 10:01:36.595169 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxhrl\" (UniqueName: \"kubernetes.io/projected/1c892011-8aa6-4454-ad2e-8bc37d67c957-kube-api-access-qxhrl\") pod \"redhat-operators-6vrhm\" (UID: \"1c892011-8aa6-4454-ad2e-8bc37d67c957\") " pod="openshift-marketplace/redhat-operators-6vrhm" Dec 02 10:01:36 crc kubenswrapper[4895]: I1202 10:01:36.595240 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c892011-8aa6-4454-ad2e-8bc37d67c957-utilities\") pod \"redhat-operators-6vrhm\" (UID: \"1c892011-8aa6-4454-ad2e-8bc37d67c957\") " pod="openshift-marketplace/redhat-operators-6vrhm" Dec 02 10:01:36 crc kubenswrapper[4895]: I1202 10:01:36.595295 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c892011-8aa6-4454-ad2e-8bc37d67c957-catalog-content\") pod \"redhat-operators-6vrhm\" (UID: \"1c892011-8aa6-4454-ad2e-8bc37d67c957\") " pod="openshift-marketplace/redhat-operators-6vrhm" Dec 02 10:01:36 crc kubenswrapper[4895]: I1202 10:01:36.697299 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qxhrl\" (UniqueName: \"kubernetes.io/projected/1c892011-8aa6-4454-ad2e-8bc37d67c957-kube-api-access-qxhrl\") pod \"redhat-operators-6vrhm\" (UID: \"1c892011-8aa6-4454-ad2e-8bc37d67c957\") " pod="openshift-marketplace/redhat-operators-6vrhm" Dec 02 10:01:36 crc kubenswrapper[4895]: I1202 10:01:36.697699 4895 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c892011-8aa6-4454-ad2e-8bc37d67c957-utilities\") pod \"redhat-operators-6vrhm\" (UID: \"1c892011-8aa6-4454-ad2e-8bc37d67c957\") " pod="openshift-marketplace/redhat-operators-6vrhm" Dec 02 10:01:36 crc kubenswrapper[4895]: I1202 10:01:36.697902 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c892011-8aa6-4454-ad2e-8bc37d67c957-catalog-content\") pod \"redhat-operators-6vrhm\" (UID: \"1c892011-8aa6-4454-ad2e-8bc37d67c957\") " pod="openshift-marketplace/redhat-operators-6vrhm" Dec 02 10:01:36 crc kubenswrapper[4895]: I1202 10:01:36.698942 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c892011-8aa6-4454-ad2e-8bc37d67c957-catalog-content\") pod \"redhat-operators-6vrhm\" (UID: \"1c892011-8aa6-4454-ad2e-8bc37d67c957\") " pod="openshift-marketplace/redhat-operators-6vrhm" Dec 02 10:01:36 crc kubenswrapper[4895]: I1202 10:01:36.699041 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c892011-8aa6-4454-ad2e-8bc37d67c957-utilities\") pod \"redhat-operators-6vrhm\" (UID: \"1c892011-8aa6-4454-ad2e-8bc37d67c957\") " pod="openshift-marketplace/redhat-operators-6vrhm" Dec 02 10:01:37 crc kubenswrapper[4895]: I1202 10:01:37.053020 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxhrl\" (UniqueName: \"kubernetes.io/projected/1c892011-8aa6-4454-ad2e-8bc37d67c957-kube-api-access-qxhrl\") pod \"redhat-operators-6vrhm\" (UID: \"1c892011-8aa6-4454-ad2e-8bc37d67c957\") " pod="openshift-marketplace/redhat-operators-6vrhm" Dec 02 10:01:37 crc kubenswrapper[4895]: I1202 10:01:37.061340 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6vrhm" Dec 02 10:01:37 crc kubenswrapper[4895]: I1202 10:01:37.683756 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6vrhm"] Dec 02 10:01:37 crc kubenswrapper[4895]: I1202 10:01:37.916911 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6vrhm" event={"ID":"1c892011-8aa6-4454-ad2e-8bc37d67c957","Type":"ContainerStarted","Data":"579ac63938c4b2a4c47fa6186a550db284a72463a87a58212a05daa6efab691a"} Dec 02 10:01:38 crc kubenswrapper[4895]: I1202 10:01:38.930033 4895 generic.go:334] "Generic (PLEG): container finished" podID="1c892011-8aa6-4454-ad2e-8bc37d67c957" containerID="bddd20a96855b0cf3605a80092f899d6108144416cbf215cbd15a1013bbce265" exitCode=0 Dec 02 10:01:38 crc kubenswrapper[4895]: I1202 10:01:38.930166 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6vrhm" event={"ID":"1c892011-8aa6-4454-ad2e-8bc37d67c957","Type":"ContainerDied","Data":"bddd20a96855b0cf3605a80092f899d6108144416cbf215cbd15a1013bbce265"} Dec 02 10:01:39 crc kubenswrapper[4895]: I1202 10:01:39.941220 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6vrhm" event={"ID":"1c892011-8aa6-4454-ad2e-8bc37d67c957","Type":"ContainerStarted","Data":"eeb281812731a05660c03d099d4c1eaa973e7200fe54a39295c56adaab2c2aa0"} Dec 02 10:01:43 crc kubenswrapper[4895]: I1202 10:01:43.981217 4895 generic.go:334] "Generic (PLEG): container finished" podID="1c892011-8aa6-4454-ad2e-8bc37d67c957" containerID="eeb281812731a05660c03d099d4c1eaa973e7200fe54a39295c56adaab2c2aa0" exitCode=0 Dec 02 10:01:43 crc kubenswrapper[4895]: I1202 10:01:43.981436 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6vrhm" 
event={"ID":"1c892011-8aa6-4454-ad2e-8bc37d67c957","Type":"ContainerDied","Data":"eeb281812731a05660c03d099d4c1eaa973e7200fe54a39295c56adaab2c2aa0"} Dec 02 10:01:44 crc kubenswrapper[4895]: I1202 10:01:44.992672 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6vrhm" event={"ID":"1c892011-8aa6-4454-ad2e-8bc37d67c957","Type":"ContainerStarted","Data":"10b3499df03977653d337f2c2b9ca01ac6ac89207582a4d95f36da714361e10e"} Dec 02 10:01:45 crc kubenswrapper[4895]: I1202 10:01:45.016883 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-6vrhm" podStartSLOduration=3.49799375 podStartE2EDuration="9.016863716s" podCreationTimestamp="2025-12-02 10:01:36 +0000 UTC" firstStartedPulling="2025-12-02 10:01:38.932418994 +0000 UTC m=+9510.103278597" lastFinishedPulling="2025-12-02 10:01:44.45128895 +0000 UTC m=+9515.622148563" observedRunningTime="2025-12-02 10:01:45.010055234 +0000 UTC m=+9516.180914857" watchObservedRunningTime="2025-12-02 10:01:45.016863716 +0000 UTC m=+9516.187723329" Dec 02 10:01:46 crc kubenswrapper[4895]: E1202 10:01:46.537277 4895 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5e89a9f9_bc16_4d4b_ae72_78209734db6f.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5e89a9f9_bc16_4d4b_ae72_78209734db6f.slice/crio-a38650ee789c7c0c486b202209c49a67ec4c90281eb21bb39ba65be02f442254\": RecentStats: unable to find data in memory cache]" Dec 02 10:01:47 crc kubenswrapper[4895]: I1202 10:01:47.062456 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-6vrhm" Dec 02 10:01:47 crc kubenswrapper[4895]: I1202 10:01:47.062779 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-operators-6vrhm" Dec 02 10:01:48 crc kubenswrapper[4895]: I1202 10:01:48.123842 4895 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-6vrhm" podUID="1c892011-8aa6-4454-ad2e-8bc37d67c957" containerName="registry-server" probeResult="failure" output=< Dec 02 10:01:48 crc kubenswrapper[4895]: timeout: failed to connect service ":50051" within 1s Dec 02 10:01:48 crc kubenswrapper[4895]: > Dec 02 10:01:56 crc kubenswrapper[4895]: E1202 10:01:56.813911 4895 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5e89a9f9_bc16_4d4b_ae72_78209734db6f.slice/crio-a38650ee789c7c0c486b202209c49a67ec4c90281eb21bb39ba65be02f442254\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5e89a9f9_bc16_4d4b_ae72_78209734db6f.slice\": RecentStats: unable to find data in memory cache]" Dec 02 10:01:57 crc kubenswrapper[4895]: I1202 10:01:57.293431 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-6vrhm" Dec 02 10:01:57 crc kubenswrapper[4895]: I1202 10:01:57.349063 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-6vrhm" Dec 02 10:01:57 crc kubenswrapper[4895]: I1202 10:01:57.534573 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6vrhm"] Dec 02 10:01:59 crc kubenswrapper[4895]: I1202 10:01:59.123751 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-6vrhm" podUID="1c892011-8aa6-4454-ad2e-8bc37d67c957" containerName="registry-server" containerID="cri-o://10b3499df03977653d337f2c2b9ca01ac6ac89207582a4d95f36da714361e10e" gracePeriod=2 Dec 02 10:02:00 crc kubenswrapper[4895]: 
I1202 10:02:00.115701 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6vrhm" Dec 02 10:02:00 crc kubenswrapper[4895]: I1202 10:02:00.175318 4895 generic.go:334] "Generic (PLEG): container finished" podID="1c892011-8aa6-4454-ad2e-8bc37d67c957" containerID="10b3499df03977653d337f2c2b9ca01ac6ac89207582a4d95f36da714361e10e" exitCode=0 Dec 02 10:02:00 crc kubenswrapper[4895]: I1202 10:02:00.175370 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6vrhm" Dec 02 10:02:00 crc kubenswrapper[4895]: I1202 10:02:00.175383 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6vrhm" event={"ID":"1c892011-8aa6-4454-ad2e-8bc37d67c957","Type":"ContainerDied","Data":"10b3499df03977653d337f2c2b9ca01ac6ac89207582a4d95f36da714361e10e"} Dec 02 10:02:00 crc kubenswrapper[4895]: I1202 10:02:00.175424 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6vrhm" event={"ID":"1c892011-8aa6-4454-ad2e-8bc37d67c957","Type":"ContainerDied","Data":"579ac63938c4b2a4c47fa6186a550db284a72463a87a58212a05daa6efab691a"} Dec 02 10:02:00 crc kubenswrapper[4895]: I1202 10:02:00.175444 4895 scope.go:117] "RemoveContainer" containerID="10b3499df03977653d337f2c2b9ca01ac6ac89207582a4d95f36da714361e10e" Dec 02 10:02:00 crc kubenswrapper[4895]: I1202 10:02:00.197280 4895 scope.go:117] "RemoveContainer" containerID="eeb281812731a05660c03d099d4c1eaa973e7200fe54a39295c56adaab2c2aa0" Dec 02 10:02:00 crc kubenswrapper[4895]: I1202 10:02:00.222070 4895 scope.go:117] "RemoveContainer" containerID="bddd20a96855b0cf3605a80092f899d6108144416cbf215cbd15a1013bbce265" Dec 02 10:02:00 crc kubenswrapper[4895]: I1202 10:02:00.224214 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/1c892011-8aa6-4454-ad2e-8bc37d67c957-catalog-content\") pod \"1c892011-8aa6-4454-ad2e-8bc37d67c957\" (UID: \"1c892011-8aa6-4454-ad2e-8bc37d67c957\") " Dec 02 10:02:00 crc kubenswrapper[4895]: I1202 10:02:00.224387 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qxhrl\" (UniqueName: \"kubernetes.io/projected/1c892011-8aa6-4454-ad2e-8bc37d67c957-kube-api-access-qxhrl\") pod \"1c892011-8aa6-4454-ad2e-8bc37d67c957\" (UID: \"1c892011-8aa6-4454-ad2e-8bc37d67c957\") " Dec 02 10:02:00 crc kubenswrapper[4895]: I1202 10:02:00.224432 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c892011-8aa6-4454-ad2e-8bc37d67c957-utilities\") pod \"1c892011-8aa6-4454-ad2e-8bc37d67c957\" (UID: \"1c892011-8aa6-4454-ad2e-8bc37d67c957\") " Dec 02 10:02:00 crc kubenswrapper[4895]: I1202 10:02:00.225867 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c892011-8aa6-4454-ad2e-8bc37d67c957-utilities" (OuterVolumeSpecName: "utilities") pod "1c892011-8aa6-4454-ad2e-8bc37d67c957" (UID: "1c892011-8aa6-4454-ad2e-8bc37d67c957"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 10:02:00 crc kubenswrapper[4895]: I1202 10:02:00.232582 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c892011-8aa6-4454-ad2e-8bc37d67c957-kube-api-access-qxhrl" (OuterVolumeSpecName: "kube-api-access-qxhrl") pod "1c892011-8aa6-4454-ad2e-8bc37d67c957" (UID: "1c892011-8aa6-4454-ad2e-8bc37d67c957"). InnerVolumeSpecName "kube-api-access-qxhrl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 10:02:00 crc kubenswrapper[4895]: I1202 10:02:00.327039 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qxhrl\" (UniqueName: \"kubernetes.io/projected/1c892011-8aa6-4454-ad2e-8bc37d67c957-kube-api-access-qxhrl\") on node \"crc\" DevicePath \"\"" Dec 02 10:02:00 crc kubenswrapper[4895]: I1202 10:02:00.327072 4895 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c892011-8aa6-4454-ad2e-8bc37d67c957-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 10:02:00 crc kubenswrapper[4895]: I1202 10:02:00.327823 4895 scope.go:117] "RemoveContainer" containerID="10b3499df03977653d337f2c2b9ca01ac6ac89207582a4d95f36da714361e10e" Dec 02 10:02:00 crc kubenswrapper[4895]: E1202 10:02:00.328697 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10b3499df03977653d337f2c2b9ca01ac6ac89207582a4d95f36da714361e10e\": container with ID starting with 10b3499df03977653d337f2c2b9ca01ac6ac89207582a4d95f36da714361e10e not found: ID does not exist" containerID="10b3499df03977653d337f2c2b9ca01ac6ac89207582a4d95f36da714361e10e" Dec 02 10:02:00 crc kubenswrapper[4895]: I1202 10:02:00.328728 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10b3499df03977653d337f2c2b9ca01ac6ac89207582a4d95f36da714361e10e"} err="failed to get container status \"10b3499df03977653d337f2c2b9ca01ac6ac89207582a4d95f36da714361e10e\": rpc error: code = NotFound desc = could not find container \"10b3499df03977653d337f2c2b9ca01ac6ac89207582a4d95f36da714361e10e\": container with ID starting with 10b3499df03977653d337f2c2b9ca01ac6ac89207582a4d95f36da714361e10e not found: ID does not exist" Dec 02 10:02:00 crc kubenswrapper[4895]: I1202 10:02:00.328824 4895 scope.go:117] "RemoveContainer" 
containerID="eeb281812731a05660c03d099d4c1eaa973e7200fe54a39295c56adaab2c2aa0" Dec 02 10:02:00 crc kubenswrapper[4895]: E1202 10:02:00.329835 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eeb281812731a05660c03d099d4c1eaa973e7200fe54a39295c56adaab2c2aa0\": container with ID starting with eeb281812731a05660c03d099d4c1eaa973e7200fe54a39295c56adaab2c2aa0 not found: ID does not exist" containerID="eeb281812731a05660c03d099d4c1eaa973e7200fe54a39295c56adaab2c2aa0" Dec 02 10:02:00 crc kubenswrapper[4895]: I1202 10:02:00.329866 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eeb281812731a05660c03d099d4c1eaa973e7200fe54a39295c56adaab2c2aa0"} err="failed to get container status \"eeb281812731a05660c03d099d4c1eaa973e7200fe54a39295c56adaab2c2aa0\": rpc error: code = NotFound desc = could not find container \"eeb281812731a05660c03d099d4c1eaa973e7200fe54a39295c56adaab2c2aa0\": container with ID starting with eeb281812731a05660c03d099d4c1eaa973e7200fe54a39295c56adaab2c2aa0 not found: ID does not exist" Dec 02 10:02:00 crc kubenswrapper[4895]: I1202 10:02:00.329884 4895 scope.go:117] "RemoveContainer" containerID="bddd20a96855b0cf3605a80092f899d6108144416cbf215cbd15a1013bbce265" Dec 02 10:02:00 crc kubenswrapper[4895]: E1202 10:02:00.330158 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bddd20a96855b0cf3605a80092f899d6108144416cbf215cbd15a1013bbce265\": container with ID starting with bddd20a96855b0cf3605a80092f899d6108144416cbf215cbd15a1013bbce265 not found: ID does not exist" containerID="bddd20a96855b0cf3605a80092f899d6108144416cbf215cbd15a1013bbce265" Dec 02 10:02:00 crc kubenswrapper[4895]: I1202 10:02:00.330192 4895 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"bddd20a96855b0cf3605a80092f899d6108144416cbf215cbd15a1013bbce265"} err="failed to get container status \"bddd20a96855b0cf3605a80092f899d6108144416cbf215cbd15a1013bbce265\": rpc error: code = NotFound desc = could not find container \"bddd20a96855b0cf3605a80092f899d6108144416cbf215cbd15a1013bbce265\": container with ID starting with bddd20a96855b0cf3605a80092f899d6108144416cbf215cbd15a1013bbce265 not found: ID does not exist" Dec 02 10:02:00 crc kubenswrapper[4895]: I1202 10:02:00.335617 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c892011-8aa6-4454-ad2e-8bc37d67c957-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1c892011-8aa6-4454-ad2e-8bc37d67c957" (UID: "1c892011-8aa6-4454-ad2e-8bc37d67c957"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 10:02:00 crc kubenswrapper[4895]: I1202 10:02:00.429864 4895 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c892011-8aa6-4454-ad2e-8bc37d67c957-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 10:02:00 crc kubenswrapper[4895]: I1202 10:02:00.515130 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6vrhm"] Dec 02 10:02:00 crc kubenswrapper[4895]: I1202 10:02:00.527551 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-6vrhm"] Dec 02 10:02:01 crc kubenswrapper[4895]: I1202 10:02:01.154118 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c892011-8aa6-4454-ad2e-8bc37d67c957" path="/var/lib/kubelet/pods/1c892011-8aa6-4454-ad2e-8bc37d67c957/volumes" Dec 02 10:02:07 crc kubenswrapper[4895]: E1202 10:02:07.103391 4895 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5e89a9f9_bc16_4d4b_ae72_78209734db6f.slice/crio-a38650ee789c7c0c486b202209c49a67ec4c90281eb21bb39ba65be02f442254\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5e89a9f9_bc16_4d4b_ae72_78209734db6f.slice\": RecentStats: unable to find data in memory cache]" Dec 02 10:03:03 crc kubenswrapper[4895]: I1202 10:03:03.598799 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-tk8vt"] Dec 02 10:03:03 crc kubenswrapper[4895]: E1202 10:03:03.599873 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c892011-8aa6-4454-ad2e-8bc37d67c957" containerName="registry-server" Dec 02 10:03:03 crc kubenswrapper[4895]: I1202 10:03:03.599891 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c892011-8aa6-4454-ad2e-8bc37d67c957" containerName="registry-server" Dec 02 10:03:03 crc kubenswrapper[4895]: E1202 10:03:03.599934 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c892011-8aa6-4454-ad2e-8bc37d67c957" containerName="extract-content" Dec 02 10:03:03 crc kubenswrapper[4895]: I1202 10:03:03.599947 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c892011-8aa6-4454-ad2e-8bc37d67c957" containerName="extract-content" Dec 02 10:03:03 crc kubenswrapper[4895]: E1202 10:03:03.600022 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c892011-8aa6-4454-ad2e-8bc37d67c957" containerName="extract-utilities" Dec 02 10:03:03 crc kubenswrapper[4895]: I1202 10:03:03.600034 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c892011-8aa6-4454-ad2e-8bc37d67c957" containerName="extract-utilities" Dec 02 10:03:03 crc kubenswrapper[4895]: I1202 10:03:03.601906 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c892011-8aa6-4454-ad2e-8bc37d67c957" containerName="registry-server" Dec 02 10:03:03 crc 
kubenswrapper[4895]: I1202 10:03:03.603679 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tk8vt" Dec 02 10:03:03 crc kubenswrapper[4895]: I1202 10:03:03.621021 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tk8vt"] Dec 02 10:03:03 crc kubenswrapper[4895]: I1202 10:03:03.765330 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zcvr4\" (UniqueName: \"kubernetes.io/projected/645f41e4-8317-434a-ba08-a760345778db-kube-api-access-zcvr4\") pod \"redhat-marketplace-tk8vt\" (UID: \"645f41e4-8317-434a-ba08-a760345778db\") " pod="openshift-marketplace/redhat-marketplace-tk8vt" Dec 02 10:03:03 crc kubenswrapper[4895]: I1202 10:03:03.765449 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/645f41e4-8317-434a-ba08-a760345778db-catalog-content\") pod \"redhat-marketplace-tk8vt\" (UID: \"645f41e4-8317-434a-ba08-a760345778db\") " pod="openshift-marketplace/redhat-marketplace-tk8vt" Dec 02 10:03:03 crc kubenswrapper[4895]: I1202 10:03:03.765669 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/645f41e4-8317-434a-ba08-a760345778db-utilities\") pod \"redhat-marketplace-tk8vt\" (UID: \"645f41e4-8317-434a-ba08-a760345778db\") " pod="openshift-marketplace/redhat-marketplace-tk8vt" Dec 02 10:03:03 crc kubenswrapper[4895]: I1202 10:03:03.867873 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zcvr4\" (UniqueName: \"kubernetes.io/projected/645f41e4-8317-434a-ba08-a760345778db-kube-api-access-zcvr4\") pod \"redhat-marketplace-tk8vt\" (UID: \"645f41e4-8317-434a-ba08-a760345778db\") " pod="openshift-marketplace/redhat-marketplace-tk8vt" Dec 02 
10:03:03 crc kubenswrapper[4895]: I1202 10:03:03.868027 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/645f41e4-8317-434a-ba08-a760345778db-catalog-content\") pod \"redhat-marketplace-tk8vt\" (UID: \"645f41e4-8317-434a-ba08-a760345778db\") " pod="openshift-marketplace/redhat-marketplace-tk8vt" Dec 02 10:03:03 crc kubenswrapper[4895]: I1202 10:03:03.868133 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/645f41e4-8317-434a-ba08-a760345778db-utilities\") pod \"redhat-marketplace-tk8vt\" (UID: \"645f41e4-8317-434a-ba08-a760345778db\") " pod="openshift-marketplace/redhat-marketplace-tk8vt" Dec 02 10:03:03 crc kubenswrapper[4895]: I1202 10:03:03.868613 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/645f41e4-8317-434a-ba08-a760345778db-catalog-content\") pod \"redhat-marketplace-tk8vt\" (UID: \"645f41e4-8317-434a-ba08-a760345778db\") " pod="openshift-marketplace/redhat-marketplace-tk8vt" Dec 02 10:03:03 crc kubenswrapper[4895]: I1202 10:03:03.868717 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/645f41e4-8317-434a-ba08-a760345778db-utilities\") pod \"redhat-marketplace-tk8vt\" (UID: \"645f41e4-8317-434a-ba08-a760345778db\") " pod="openshift-marketplace/redhat-marketplace-tk8vt" Dec 02 10:03:04 crc kubenswrapper[4895]: I1202 10:03:04.056670 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zcvr4\" (UniqueName: \"kubernetes.io/projected/645f41e4-8317-434a-ba08-a760345778db-kube-api-access-zcvr4\") pod \"redhat-marketplace-tk8vt\" (UID: \"645f41e4-8317-434a-ba08-a760345778db\") " pod="openshift-marketplace/redhat-marketplace-tk8vt" Dec 02 10:03:04 crc kubenswrapper[4895]: I1202 10:03:04.241098 4895 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tk8vt" Dec 02 10:03:04 crc kubenswrapper[4895]: I1202 10:03:04.685305 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tk8vt"] Dec 02 10:03:05 crc kubenswrapper[4895]: I1202 10:03:05.161953 4895 generic.go:334] "Generic (PLEG): container finished" podID="645f41e4-8317-434a-ba08-a760345778db" containerID="3a7f147877350f4baf044849a99d8e947badbe7f1da0ac4f02b074686b5e3b95" exitCode=0 Dec 02 10:03:05 crc kubenswrapper[4895]: I1202 10:03:05.162008 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tk8vt" event={"ID":"645f41e4-8317-434a-ba08-a760345778db","Type":"ContainerDied","Data":"3a7f147877350f4baf044849a99d8e947badbe7f1da0ac4f02b074686b5e3b95"} Dec 02 10:03:05 crc kubenswrapper[4895]: I1202 10:03:05.162286 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tk8vt" event={"ID":"645f41e4-8317-434a-ba08-a760345778db","Type":"ContainerStarted","Data":"9d09862bba5c338fe42b2dbd0f0e3aa12c3e56114c0969783da160501fd8f59b"} Dec 02 10:03:05 crc kubenswrapper[4895]: I1202 10:03:05.473655 4895 patch_prober.go:28] interesting pod/machine-config-daemon-wfcg7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 10:03:05 crc kubenswrapper[4895]: I1202 10:03:05.473993 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 10:03:08 crc kubenswrapper[4895]: I1202 
10:03:08.196275 4895 generic.go:334] "Generic (PLEG): container finished" podID="645f41e4-8317-434a-ba08-a760345778db" containerID="35f313b206df088fd0e36dedfab3877341965749932cbb6cf1e902815ed5ae78" exitCode=0 Dec 02 10:03:08 crc kubenswrapper[4895]: I1202 10:03:08.196385 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tk8vt" event={"ID":"645f41e4-8317-434a-ba08-a760345778db","Type":"ContainerDied","Data":"35f313b206df088fd0e36dedfab3877341965749932cbb6cf1e902815ed5ae78"} Dec 02 10:03:10 crc kubenswrapper[4895]: I1202 10:03:10.220938 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tk8vt" event={"ID":"645f41e4-8317-434a-ba08-a760345778db","Type":"ContainerStarted","Data":"aaf9f9809a4d3486a9ee4a757fbac9124c91e3db84e8a4c342deb0053f22f765"} Dec 02 10:03:10 crc kubenswrapper[4895]: I1202 10:03:10.251500 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-tk8vt" podStartSLOduration=3.337123715 podStartE2EDuration="7.251476926s" podCreationTimestamp="2025-12-02 10:03:03 +0000 UTC" firstStartedPulling="2025-12-02 10:03:05.164816642 +0000 UTC m=+9596.335676255" lastFinishedPulling="2025-12-02 10:03:09.079169853 +0000 UTC m=+9600.250029466" observedRunningTime="2025-12-02 10:03:10.240985719 +0000 UTC m=+9601.411845332" watchObservedRunningTime="2025-12-02 10:03:10.251476926 +0000 UTC m=+9601.422336559" Dec 02 10:03:14 crc kubenswrapper[4895]: I1202 10:03:14.241583 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-tk8vt" Dec 02 10:03:14 crc kubenswrapper[4895]: I1202 10:03:14.241959 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-tk8vt" Dec 02 10:03:14 crc kubenswrapper[4895]: I1202 10:03:14.289685 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-marketplace-tk8vt" Dec 02 10:03:14 crc kubenswrapper[4895]: I1202 10:03:14.335611 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-tk8vt" Dec 02 10:03:14 crc kubenswrapper[4895]: I1202 10:03:14.542093 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tk8vt"] Dec 02 10:03:16 crc kubenswrapper[4895]: I1202 10:03:16.280117 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-tk8vt" podUID="645f41e4-8317-434a-ba08-a760345778db" containerName="registry-server" containerID="cri-o://aaf9f9809a4d3486a9ee4a757fbac9124c91e3db84e8a4c342deb0053f22f765" gracePeriod=2 Dec 02 10:03:16 crc kubenswrapper[4895]: I1202 10:03:16.831458 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tk8vt" Dec 02 10:03:16 crc kubenswrapper[4895]: I1202 10:03:16.881493 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/645f41e4-8317-434a-ba08-a760345778db-utilities\") pod \"645f41e4-8317-434a-ba08-a760345778db\" (UID: \"645f41e4-8317-434a-ba08-a760345778db\") " Dec 02 10:03:16 crc kubenswrapper[4895]: I1202 10:03:16.882029 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/645f41e4-8317-434a-ba08-a760345778db-catalog-content\") pod \"645f41e4-8317-434a-ba08-a760345778db\" (UID: \"645f41e4-8317-434a-ba08-a760345778db\") " Dec 02 10:03:16 crc kubenswrapper[4895]: I1202 10:03:16.882058 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zcvr4\" (UniqueName: \"kubernetes.io/projected/645f41e4-8317-434a-ba08-a760345778db-kube-api-access-zcvr4\") pod 
\"645f41e4-8317-434a-ba08-a760345778db\" (UID: \"645f41e4-8317-434a-ba08-a760345778db\") " Dec 02 10:03:16 crc kubenswrapper[4895]: I1202 10:03:16.883652 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/645f41e4-8317-434a-ba08-a760345778db-utilities" (OuterVolumeSpecName: "utilities") pod "645f41e4-8317-434a-ba08-a760345778db" (UID: "645f41e4-8317-434a-ba08-a760345778db"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 10:03:16 crc kubenswrapper[4895]: I1202 10:03:16.900366 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/645f41e4-8317-434a-ba08-a760345778db-kube-api-access-zcvr4" (OuterVolumeSpecName: "kube-api-access-zcvr4") pod "645f41e4-8317-434a-ba08-a760345778db" (UID: "645f41e4-8317-434a-ba08-a760345778db"). InnerVolumeSpecName "kube-api-access-zcvr4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 10:03:16 crc kubenswrapper[4895]: I1202 10:03:16.903588 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/645f41e4-8317-434a-ba08-a760345778db-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "645f41e4-8317-434a-ba08-a760345778db" (UID: "645f41e4-8317-434a-ba08-a760345778db"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 10:03:16 crc kubenswrapper[4895]: I1202 10:03:16.985228 4895 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/645f41e4-8317-434a-ba08-a760345778db-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 10:03:16 crc kubenswrapper[4895]: I1202 10:03:16.985260 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zcvr4\" (UniqueName: \"kubernetes.io/projected/645f41e4-8317-434a-ba08-a760345778db-kube-api-access-zcvr4\") on node \"crc\" DevicePath \"\"" Dec 02 10:03:16 crc kubenswrapper[4895]: I1202 10:03:16.985271 4895 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/645f41e4-8317-434a-ba08-a760345778db-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 10:03:17 crc kubenswrapper[4895]: I1202 10:03:17.300072 4895 generic.go:334] "Generic (PLEG): container finished" podID="645f41e4-8317-434a-ba08-a760345778db" containerID="aaf9f9809a4d3486a9ee4a757fbac9124c91e3db84e8a4c342deb0053f22f765" exitCode=0 Dec 02 10:03:17 crc kubenswrapper[4895]: I1202 10:03:17.300130 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tk8vt" event={"ID":"645f41e4-8317-434a-ba08-a760345778db","Type":"ContainerDied","Data":"aaf9f9809a4d3486a9ee4a757fbac9124c91e3db84e8a4c342deb0053f22f765"} Dec 02 10:03:17 crc kubenswrapper[4895]: I1202 10:03:17.300174 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tk8vt" event={"ID":"645f41e4-8317-434a-ba08-a760345778db","Type":"ContainerDied","Data":"9d09862bba5c338fe42b2dbd0f0e3aa12c3e56114c0969783da160501fd8f59b"} Dec 02 10:03:17 crc kubenswrapper[4895]: I1202 10:03:17.300191 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tk8vt" Dec 02 10:03:17 crc kubenswrapper[4895]: I1202 10:03:17.300201 4895 scope.go:117] "RemoveContainer" containerID="aaf9f9809a4d3486a9ee4a757fbac9124c91e3db84e8a4c342deb0053f22f765" Dec 02 10:03:17 crc kubenswrapper[4895]: I1202 10:03:17.333874 4895 scope.go:117] "RemoveContainer" containerID="35f313b206df088fd0e36dedfab3877341965749932cbb6cf1e902815ed5ae78" Dec 02 10:03:17 crc kubenswrapper[4895]: I1202 10:03:17.335534 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tk8vt"] Dec 02 10:03:17 crc kubenswrapper[4895]: I1202 10:03:17.344183 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-tk8vt"] Dec 02 10:03:17 crc kubenswrapper[4895]: I1202 10:03:17.357664 4895 scope.go:117] "RemoveContainer" containerID="3a7f147877350f4baf044849a99d8e947badbe7f1da0ac4f02b074686b5e3b95" Dec 02 10:03:17 crc kubenswrapper[4895]: I1202 10:03:17.437424 4895 scope.go:117] "RemoveContainer" containerID="aaf9f9809a4d3486a9ee4a757fbac9124c91e3db84e8a4c342deb0053f22f765" Dec 02 10:03:17 crc kubenswrapper[4895]: E1202 10:03:17.438102 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aaf9f9809a4d3486a9ee4a757fbac9124c91e3db84e8a4c342deb0053f22f765\": container with ID starting with aaf9f9809a4d3486a9ee4a757fbac9124c91e3db84e8a4c342deb0053f22f765 not found: ID does not exist" containerID="aaf9f9809a4d3486a9ee4a757fbac9124c91e3db84e8a4c342deb0053f22f765" Dec 02 10:03:17 crc kubenswrapper[4895]: I1202 10:03:17.438153 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aaf9f9809a4d3486a9ee4a757fbac9124c91e3db84e8a4c342deb0053f22f765"} err="failed to get container status \"aaf9f9809a4d3486a9ee4a757fbac9124c91e3db84e8a4c342deb0053f22f765\": rpc error: code = NotFound desc = could not find container 
\"aaf9f9809a4d3486a9ee4a757fbac9124c91e3db84e8a4c342deb0053f22f765\": container with ID starting with aaf9f9809a4d3486a9ee4a757fbac9124c91e3db84e8a4c342deb0053f22f765 not found: ID does not exist" Dec 02 10:03:17 crc kubenswrapper[4895]: I1202 10:03:17.438188 4895 scope.go:117] "RemoveContainer" containerID="35f313b206df088fd0e36dedfab3877341965749932cbb6cf1e902815ed5ae78" Dec 02 10:03:17 crc kubenswrapper[4895]: E1202 10:03:17.438641 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"35f313b206df088fd0e36dedfab3877341965749932cbb6cf1e902815ed5ae78\": container with ID starting with 35f313b206df088fd0e36dedfab3877341965749932cbb6cf1e902815ed5ae78 not found: ID does not exist" containerID="35f313b206df088fd0e36dedfab3877341965749932cbb6cf1e902815ed5ae78" Dec 02 10:03:17 crc kubenswrapper[4895]: I1202 10:03:17.438702 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35f313b206df088fd0e36dedfab3877341965749932cbb6cf1e902815ed5ae78"} err="failed to get container status \"35f313b206df088fd0e36dedfab3877341965749932cbb6cf1e902815ed5ae78\": rpc error: code = NotFound desc = could not find container \"35f313b206df088fd0e36dedfab3877341965749932cbb6cf1e902815ed5ae78\": container with ID starting with 35f313b206df088fd0e36dedfab3877341965749932cbb6cf1e902815ed5ae78 not found: ID does not exist" Dec 02 10:03:17 crc kubenswrapper[4895]: I1202 10:03:17.438728 4895 scope.go:117] "RemoveContainer" containerID="3a7f147877350f4baf044849a99d8e947badbe7f1da0ac4f02b074686b5e3b95" Dec 02 10:03:17 crc kubenswrapper[4895]: E1202 10:03:17.439646 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a7f147877350f4baf044849a99d8e947badbe7f1da0ac4f02b074686b5e3b95\": container with ID starting with 3a7f147877350f4baf044849a99d8e947badbe7f1da0ac4f02b074686b5e3b95 not found: ID does not exist" 
containerID="3a7f147877350f4baf044849a99d8e947badbe7f1da0ac4f02b074686b5e3b95" Dec 02 10:03:17 crc kubenswrapper[4895]: I1202 10:03:17.439686 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a7f147877350f4baf044849a99d8e947badbe7f1da0ac4f02b074686b5e3b95"} err="failed to get container status \"3a7f147877350f4baf044849a99d8e947badbe7f1da0ac4f02b074686b5e3b95\": rpc error: code = NotFound desc = could not find container \"3a7f147877350f4baf044849a99d8e947badbe7f1da0ac4f02b074686b5e3b95\": container with ID starting with 3a7f147877350f4baf044849a99d8e947badbe7f1da0ac4f02b074686b5e3b95 not found: ID does not exist" Dec 02 10:03:19 crc kubenswrapper[4895]: I1202 10:03:19.157465 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="645f41e4-8317-434a-ba08-a760345778db" path="/var/lib/kubelet/pods/645f41e4-8317-434a-ba08-a760345778db/volumes" Dec 02 10:03:35 crc kubenswrapper[4895]: I1202 10:03:35.473248 4895 patch_prober.go:28] interesting pod/machine-config-daemon-wfcg7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 10:03:35 crc kubenswrapper[4895]: I1202 10:03:35.473787 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 10:04:05 crc kubenswrapper[4895]: I1202 10:04:05.473022 4895 patch_prober.go:28] interesting pod/machine-config-daemon-wfcg7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" start-of-body= Dec 02 10:04:05 crc kubenswrapper[4895]: I1202 10:04:05.473573 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 10:04:05 crc kubenswrapper[4895]: I1202 10:04:05.473624 4895 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" Dec 02 10:04:05 crc kubenswrapper[4895]: I1202 10:04:05.474480 4895 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4a84913ab078ae3f7e63d3921159754fdc076f025af9fff01a588a2bf87f8d82"} pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 10:04:05 crc kubenswrapper[4895]: I1202 10:04:05.474551 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerName="machine-config-daemon" containerID="cri-o://4a84913ab078ae3f7e63d3921159754fdc076f025af9fff01a588a2bf87f8d82" gracePeriod=600 Dec 02 10:04:05 crc kubenswrapper[4895]: I1202 10:04:05.784649 4895 generic.go:334] "Generic (PLEG): container finished" podID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerID="4a84913ab078ae3f7e63d3921159754fdc076f025af9fff01a588a2bf87f8d82" exitCode=0 Dec 02 10:04:05 crc kubenswrapper[4895]: I1202 10:04:05.785038 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" 
event={"ID":"0468d2d1-a975-45a6-af9f-47adc432fab0","Type":"ContainerDied","Data":"4a84913ab078ae3f7e63d3921159754fdc076f025af9fff01a588a2bf87f8d82"} Dec 02 10:04:05 crc kubenswrapper[4895]: I1202 10:04:05.785081 4895 scope.go:117] "RemoveContainer" containerID="fb24c8085688b9b476b4097e1f5ccaf4a5d4d236157b3b1913f7e381bd3cd823" Dec 02 10:04:06 crc kubenswrapper[4895]: I1202 10:04:06.796780 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" event={"ID":"0468d2d1-a975-45a6-af9f-47adc432fab0","Type":"ContainerStarted","Data":"cfbf6225642a8065a30b8648d17f460979c2674548915f8e4ef4e6f94f9dc5d8"} Dec 02 10:06:00 crc kubenswrapper[4895]: I1202 10:06:00.250608 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-zvpw9"] Dec 02 10:06:00 crc kubenswrapper[4895]: E1202 10:06:00.251779 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="645f41e4-8317-434a-ba08-a760345778db" containerName="registry-server" Dec 02 10:06:00 crc kubenswrapper[4895]: I1202 10:06:00.251796 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="645f41e4-8317-434a-ba08-a760345778db" containerName="registry-server" Dec 02 10:06:00 crc kubenswrapper[4895]: E1202 10:06:00.251829 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="645f41e4-8317-434a-ba08-a760345778db" containerName="extract-content" Dec 02 10:06:00 crc kubenswrapper[4895]: I1202 10:06:00.251836 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="645f41e4-8317-434a-ba08-a760345778db" containerName="extract-content" Dec 02 10:06:00 crc kubenswrapper[4895]: E1202 10:06:00.251859 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="645f41e4-8317-434a-ba08-a760345778db" containerName="extract-utilities" Dec 02 10:06:00 crc kubenswrapper[4895]: I1202 10:06:00.251869 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="645f41e4-8317-434a-ba08-a760345778db" 
containerName="extract-utilities" Dec 02 10:06:00 crc kubenswrapper[4895]: I1202 10:06:00.252176 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="645f41e4-8317-434a-ba08-a760345778db" containerName="registry-server" Dec 02 10:06:00 crc kubenswrapper[4895]: I1202 10:06:00.254273 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zvpw9" Dec 02 10:06:00 crc kubenswrapper[4895]: I1202 10:06:00.271345 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zvpw9"] Dec 02 10:06:00 crc kubenswrapper[4895]: I1202 10:06:00.400759 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/691cd083-a7a2-452b-a060-2c005542605c-catalog-content\") pod \"community-operators-zvpw9\" (UID: \"691cd083-a7a2-452b-a060-2c005542605c\") " pod="openshift-marketplace/community-operators-zvpw9" Dec 02 10:06:00 crc kubenswrapper[4895]: I1202 10:06:00.401363 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/691cd083-a7a2-452b-a060-2c005542605c-utilities\") pod \"community-operators-zvpw9\" (UID: \"691cd083-a7a2-452b-a060-2c005542605c\") " pod="openshift-marketplace/community-operators-zvpw9" Dec 02 10:06:00 crc kubenswrapper[4895]: I1202 10:06:00.401520 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9x6sz\" (UniqueName: \"kubernetes.io/projected/691cd083-a7a2-452b-a060-2c005542605c-kube-api-access-9x6sz\") pod \"community-operators-zvpw9\" (UID: \"691cd083-a7a2-452b-a060-2c005542605c\") " pod="openshift-marketplace/community-operators-zvpw9" Dec 02 10:06:00 crc kubenswrapper[4895]: I1202 10:06:00.504321 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/691cd083-a7a2-452b-a060-2c005542605c-catalog-content\") pod \"community-operators-zvpw9\" (UID: \"691cd083-a7a2-452b-a060-2c005542605c\") " pod="openshift-marketplace/community-operators-zvpw9" Dec 02 10:06:00 crc kubenswrapper[4895]: I1202 10:06:00.504549 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/691cd083-a7a2-452b-a060-2c005542605c-utilities\") pod \"community-operators-zvpw9\" (UID: \"691cd083-a7a2-452b-a060-2c005542605c\") " pod="openshift-marketplace/community-operators-zvpw9" Dec 02 10:06:00 crc kubenswrapper[4895]: I1202 10:06:00.504588 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9x6sz\" (UniqueName: \"kubernetes.io/projected/691cd083-a7a2-452b-a060-2c005542605c-kube-api-access-9x6sz\") pod \"community-operators-zvpw9\" (UID: \"691cd083-a7a2-452b-a060-2c005542605c\") " pod="openshift-marketplace/community-operators-zvpw9" Dec 02 10:06:00 crc kubenswrapper[4895]: I1202 10:06:00.504960 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/691cd083-a7a2-452b-a060-2c005542605c-catalog-content\") pod \"community-operators-zvpw9\" (UID: \"691cd083-a7a2-452b-a060-2c005542605c\") " pod="openshift-marketplace/community-operators-zvpw9" Dec 02 10:06:00 crc kubenswrapper[4895]: I1202 10:06:00.505284 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/691cd083-a7a2-452b-a060-2c005542605c-utilities\") pod \"community-operators-zvpw9\" (UID: \"691cd083-a7a2-452b-a060-2c005542605c\") " pod="openshift-marketplace/community-operators-zvpw9" Dec 02 10:06:00 crc kubenswrapper[4895]: I1202 10:06:00.528490 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9x6sz\" (UniqueName: 
\"kubernetes.io/projected/691cd083-a7a2-452b-a060-2c005542605c-kube-api-access-9x6sz\") pod \"community-operators-zvpw9\" (UID: \"691cd083-a7a2-452b-a060-2c005542605c\") " pod="openshift-marketplace/community-operators-zvpw9" Dec 02 10:06:00 crc kubenswrapper[4895]: I1202 10:06:00.574511 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zvpw9" Dec 02 10:06:01 crc kubenswrapper[4895]: W1202 10:06:01.161869 4895 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod691cd083_a7a2_452b_a060_2c005542605c.slice/crio-7bb239537952f526367f81ddce423906b89ccd6aed8365c581376616d58aef5f WatchSource:0}: Error finding container 7bb239537952f526367f81ddce423906b89ccd6aed8365c581376616d58aef5f: Status 404 returned error can't find the container with id 7bb239537952f526367f81ddce423906b89ccd6aed8365c581376616d58aef5f Dec 02 10:06:01 crc kubenswrapper[4895]: I1202 10:06:01.174135 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zvpw9"] Dec 02 10:06:01 crc kubenswrapper[4895]: I1202 10:06:01.700195 4895 generic.go:334] "Generic (PLEG): container finished" podID="691cd083-a7a2-452b-a060-2c005542605c" containerID="9bdbbdd83ba79af05b8db886c9b430a8030367913c938c7b3ff71ce730965f99" exitCode=0 Dec 02 10:06:01 crc kubenswrapper[4895]: I1202 10:06:01.700256 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zvpw9" event={"ID":"691cd083-a7a2-452b-a060-2c005542605c","Type":"ContainerDied","Data":"9bdbbdd83ba79af05b8db886c9b430a8030367913c938c7b3ff71ce730965f99"} Dec 02 10:06:01 crc kubenswrapper[4895]: I1202 10:06:01.700568 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zvpw9" 
event={"ID":"691cd083-a7a2-452b-a060-2c005542605c","Type":"ContainerStarted","Data":"7bb239537952f526367f81ddce423906b89ccd6aed8365c581376616d58aef5f"} Dec 02 10:06:01 crc kubenswrapper[4895]: I1202 10:06:01.703868 4895 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 02 10:06:03 crc kubenswrapper[4895]: I1202 10:06:03.722313 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zvpw9" event={"ID":"691cd083-a7a2-452b-a060-2c005542605c","Type":"ContainerStarted","Data":"762ec7bb0ab03eb3b0b0ee2d24246132c70ece6cfd32f05ee41386f34cd23c46"} Dec 02 10:06:04 crc kubenswrapper[4895]: I1202 10:06:04.735790 4895 generic.go:334] "Generic (PLEG): container finished" podID="691cd083-a7a2-452b-a060-2c005542605c" containerID="762ec7bb0ab03eb3b0b0ee2d24246132c70ece6cfd32f05ee41386f34cd23c46" exitCode=0 Dec 02 10:06:04 crc kubenswrapper[4895]: I1202 10:06:04.735886 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zvpw9" event={"ID":"691cd083-a7a2-452b-a060-2c005542605c","Type":"ContainerDied","Data":"762ec7bb0ab03eb3b0b0ee2d24246132c70ece6cfd32f05ee41386f34cd23c46"} Dec 02 10:06:05 crc kubenswrapper[4895]: I1202 10:06:05.474003 4895 patch_prober.go:28] interesting pod/machine-config-daemon-wfcg7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 10:06:05 crc kubenswrapper[4895]: I1202 10:06:05.474066 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 10:06:05 crc 
kubenswrapper[4895]: I1202 10:06:05.751521 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zvpw9" event={"ID":"691cd083-a7a2-452b-a060-2c005542605c","Type":"ContainerStarted","Data":"6548813f520d19611135fac568bb98207043fff45afd7002ea0cd829ac7ac456"} Dec 02 10:06:05 crc kubenswrapper[4895]: I1202 10:06:05.779076 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-zvpw9" podStartSLOduration=1.948619522 podStartE2EDuration="5.779055949s" podCreationTimestamp="2025-12-02 10:06:00 +0000 UTC" firstStartedPulling="2025-12-02 10:06:01.703645642 +0000 UTC m=+9772.874505255" lastFinishedPulling="2025-12-02 10:06:05.534082069 +0000 UTC m=+9776.704941682" observedRunningTime="2025-12-02 10:06:05.76976938 +0000 UTC m=+9776.940629003" watchObservedRunningTime="2025-12-02 10:06:05.779055949 +0000 UTC m=+9776.949915562" Dec 02 10:06:10 crc kubenswrapper[4895]: I1202 10:06:10.576189 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-zvpw9" Dec 02 10:06:10 crc kubenswrapper[4895]: I1202 10:06:10.577625 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-zvpw9" Dec 02 10:06:10 crc kubenswrapper[4895]: I1202 10:06:10.635774 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-zvpw9" Dec 02 10:06:10 crc kubenswrapper[4895]: I1202 10:06:10.843616 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-zvpw9" Dec 02 10:06:10 crc kubenswrapper[4895]: I1202 10:06:10.897288 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zvpw9"] Dec 02 10:06:12 crc kubenswrapper[4895]: I1202 10:06:12.819465 4895 kuberuntime_container.go:808] "Killing container with a 
grace period" pod="openshift-marketplace/community-operators-zvpw9" podUID="691cd083-a7a2-452b-a060-2c005542605c" containerName="registry-server" containerID="cri-o://6548813f520d19611135fac568bb98207043fff45afd7002ea0cd829ac7ac456" gracePeriod=2 Dec 02 10:06:13 crc kubenswrapper[4895]: I1202 10:06:13.802269 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zvpw9" Dec 02 10:06:13 crc kubenswrapper[4895]: I1202 10:06:13.853688 4895 generic.go:334] "Generic (PLEG): container finished" podID="691cd083-a7a2-452b-a060-2c005542605c" containerID="6548813f520d19611135fac568bb98207043fff45afd7002ea0cd829ac7ac456" exitCode=0 Dec 02 10:06:13 crc kubenswrapper[4895]: I1202 10:06:13.853790 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zvpw9" event={"ID":"691cd083-a7a2-452b-a060-2c005542605c","Type":"ContainerDied","Data":"6548813f520d19611135fac568bb98207043fff45afd7002ea0cd829ac7ac456"} Dec 02 10:06:13 crc kubenswrapper[4895]: I1202 10:06:13.853835 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zvpw9" event={"ID":"691cd083-a7a2-452b-a060-2c005542605c","Type":"ContainerDied","Data":"7bb239537952f526367f81ddce423906b89ccd6aed8365c581376616d58aef5f"} Dec 02 10:06:13 crc kubenswrapper[4895]: I1202 10:06:13.853860 4895 scope.go:117] "RemoveContainer" containerID="6548813f520d19611135fac568bb98207043fff45afd7002ea0cd829ac7ac456" Dec 02 10:06:13 crc kubenswrapper[4895]: I1202 10:06:13.854037 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zvpw9" Dec 02 10:06:13 crc kubenswrapper[4895]: I1202 10:06:13.879180 4895 scope.go:117] "RemoveContainer" containerID="762ec7bb0ab03eb3b0b0ee2d24246132c70ece6cfd32f05ee41386f34cd23c46" Dec 02 10:06:13 crc kubenswrapper[4895]: I1202 10:06:13.905446 4895 scope.go:117] "RemoveContainer" containerID="9bdbbdd83ba79af05b8db886c9b430a8030367913c938c7b3ff71ce730965f99" Dec 02 10:06:13 crc kubenswrapper[4895]: I1202 10:06:13.942138 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/691cd083-a7a2-452b-a060-2c005542605c-catalog-content\") pod \"691cd083-a7a2-452b-a060-2c005542605c\" (UID: \"691cd083-a7a2-452b-a060-2c005542605c\") " Dec 02 10:06:13 crc kubenswrapper[4895]: I1202 10:06:13.942320 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9x6sz\" (UniqueName: \"kubernetes.io/projected/691cd083-a7a2-452b-a060-2c005542605c-kube-api-access-9x6sz\") pod \"691cd083-a7a2-452b-a060-2c005542605c\" (UID: \"691cd083-a7a2-452b-a060-2c005542605c\") " Dec 02 10:06:13 crc kubenswrapper[4895]: I1202 10:06:13.942361 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/691cd083-a7a2-452b-a060-2c005542605c-utilities\") pod \"691cd083-a7a2-452b-a060-2c005542605c\" (UID: \"691cd083-a7a2-452b-a060-2c005542605c\") " Dec 02 10:06:13 crc kubenswrapper[4895]: I1202 10:06:13.944867 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/691cd083-a7a2-452b-a060-2c005542605c-utilities" (OuterVolumeSpecName: "utilities") pod "691cd083-a7a2-452b-a060-2c005542605c" (UID: "691cd083-a7a2-452b-a060-2c005542605c"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 10:06:13 crc kubenswrapper[4895]: I1202 10:06:13.949047 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/691cd083-a7a2-452b-a060-2c005542605c-kube-api-access-9x6sz" (OuterVolumeSpecName: "kube-api-access-9x6sz") pod "691cd083-a7a2-452b-a060-2c005542605c" (UID: "691cd083-a7a2-452b-a060-2c005542605c"). InnerVolumeSpecName "kube-api-access-9x6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 10:06:13 crc kubenswrapper[4895]: I1202 10:06:13.950678 4895 scope.go:117] "RemoveContainer" containerID="6548813f520d19611135fac568bb98207043fff45afd7002ea0cd829ac7ac456" Dec 02 10:06:13 crc kubenswrapper[4895]: E1202 10:06:13.951145 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6548813f520d19611135fac568bb98207043fff45afd7002ea0cd829ac7ac456\": container with ID starting with 6548813f520d19611135fac568bb98207043fff45afd7002ea0cd829ac7ac456 not found: ID does not exist" containerID="6548813f520d19611135fac568bb98207043fff45afd7002ea0cd829ac7ac456" Dec 02 10:06:13 crc kubenswrapper[4895]: I1202 10:06:13.951195 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6548813f520d19611135fac568bb98207043fff45afd7002ea0cd829ac7ac456"} err="failed to get container status \"6548813f520d19611135fac568bb98207043fff45afd7002ea0cd829ac7ac456\": rpc error: code = NotFound desc = could not find container \"6548813f520d19611135fac568bb98207043fff45afd7002ea0cd829ac7ac456\": container with ID starting with 6548813f520d19611135fac568bb98207043fff45afd7002ea0cd829ac7ac456 not found: ID does not exist" Dec 02 10:06:13 crc kubenswrapper[4895]: I1202 10:06:13.951222 4895 scope.go:117] "RemoveContainer" containerID="762ec7bb0ab03eb3b0b0ee2d24246132c70ece6cfd32f05ee41386f34cd23c46" Dec 02 10:06:13 crc kubenswrapper[4895]: E1202 10:06:13.951530 
4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"762ec7bb0ab03eb3b0b0ee2d24246132c70ece6cfd32f05ee41386f34cd23c46\": container with ID starting with 762ec7bb0ab03eb3b0b0ee2d24246132c70ece6cfd32f05ee41386f34cd23c46 not found: ID does not exist" containerID="762ec7bb0ab03eb3b0b0ee2d24246132c70ece6cfd32f05ee41386f34cd23c46" Dec 02 10:06:13 crc kubenswrapper[4895]: I1202 10:06:13.951790 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"762ec7bb0ab03eb3b0b0ee2d24246132c70ece6cfd32f05ee41386f34cd23c46"} err="failed to get container status \"762ec7bb0ab03eb3b0b0ee2d24246132c70ece6cfd32f05ee41386f34cd23c46\": rpc error: code = NotFound desc = could not find container \"762ec7bb0ab03eb3b0b0ee2d24246132c70ece6cfd32f05ee41386f34cd23c46\": container with ID starting with 762ec7bb0ab03eb3b0b0ee2d24246132c70ece6cfd32f05ee41386f34cd23c46 not found: ID does not exist" Dec 02 10:06:13 crc kubenswrapper[4895]: I1202 10:06:13.951817 4895 scope.go:117] "RemoveContainer" containerID="9bdbbdd83ba79af05b8db886c9b430a8030367913c938c7b3ff71ce730965f99" Dec 02 10:06:13 crc kubenswrapper[4895]: E1202 10:06:13.952173 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9bdbbdd83ba79af05b8db886c9b430a8030367913c938c7b3ff71ce730965f99\": container with ID starting with 9bdbbdd83ba79af05b8db886c9b430a8030367913c938c7b3ff71ce730965f99 not found: ID does not exist" containerID="9bdbbdd83ba79af05b8db886c9b430a8030367913c938c7b3ff71ce730965f99" Dec 02 10:06:13 crc kubenswrapper[4895]: I1202 10:06:13.952210 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9bdbbdd83ba79af05b8db886c9b430a8030367913c938c7b3ff71ce730965f99"} err="failed to get container status \"9bdbbdd83ba79af05b8db886c9b430a8030367913c938c7b3ff71ce730965f99\": rpc error: code = 
NotFound desc = could not find container \"9bdbbdd83ba79af05b8db886c9b430a8030367913c938c7b3ff71ce730965f99\": container with ID starting with 9bdbbdd83ba79af05b8db886c9b430a8030367913c938c7b3ff71ce730965f99 not found: ID does not exist" Dec 02 10:06:14 crc kubenswrapper[4895]: I1202 10:06:14.034017 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/691cd083-a7a2-452b-a060-2c005542605c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "691cd083-a7a2-452b-a060-2c005542605c" (UID: "691cd083-a7a2-452b-a060-2c005542605c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 10:06:14 crc kubenswrapper[4895]: I1202 10:06:14.045110 4895 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/691cd083-a7a2-452b-a060-2c005542605c-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 10:06:14 crc kubenswrapper[4895]: I1202 10:06:14.045170 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9x6sz\" (UniqueName: \"kubernetes.io/projected/691cd083-a7a2-452b-a060-2c005542605c-kube-api-access-9x6sz\") on node \"crc\" DevicePath \"\"" Dec 02 10:06:14 crc kubenswrapper[4895]: I1202 10:06:14.045184 4895 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/691cd083-a7a2-452b-a060-2c005542605c-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 10:06:14 crc kubenswrapper[4895]: I1202 10:06:14.202405 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zvpw9"] Dec 02 10:06:14 crc kubenswrapper[4895]: I1202 10:06:14.212156 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-zvpw9"] Dec 02 10:06:15 crc kubenswrapper[4895]: I1202 10:06:15.153191 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="691cd083-a7a2-452b-a060-2c005542605c" path="/var/lib/kubelet/pods/691cd083-a7a2-452b-a060-2c005542605c/volumes" Dec 02 10:06:35 crc kubenswrapper[4895]: I1202 10:06:35.474011 4895 patch_prober.go:28] interesting pod/machine-config-daemon-wfcg7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 10:06:35 crc kubenswrapper[4895]: I1202 10:06:35.474889 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 10:07:05 crc kubenswrapper[4895]: I1202 10:07:05.473085 4895 patch_prober.go:28] interesting pod/machine-config-daemon-wfcg7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 10:07:05 crc kubenswrapper[4895]: I1202 10:07:05.473769 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 10:07:05 crc kubenswrapper[4895]: I1202 10:07:05.473822 4895 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" Dec 02 10:07:05 crc kubenswrapper[4895]: I1202 10:07:05.474958 4895 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"cfbf6225642a8065a30b8648d17f460979c2674548915f8e4ef4e6f94f9dc5d8"} pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 10:07:05 crc kubenswrapper[4895]: I1202 10:07:05.475015 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerName="machine-config-daemon" containerID="cri-o://cfbf6225642a8065a30b8648d17f460979c2674548915f8e4ef4e6f94f9dc5d8" gracePeriod=600 Dec 02 10:07:05 crc kubenswrapper[4895]: E1202 10:07:05.569675 4895 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0468d2d1_a975_45a6_af9f_47adc432fab0.slice/crio-cfbf6225642a8065a30b8648d17f460979c2674548915f8e4ef4e6f94f9dc5d8.scope\": RecentStats: unable to find data in memory cache]" Dec 02 10:07:05 crc kubenswrapper[4895]: E1202 10:07:05.601238 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 10:07:06 crc kubenswrapper[4895]: I1202 10:07:06.519658 4895 generic.go:334] "Generic (PLEG): container finished" podID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerID="cfbf6225642a8065a30b8648d17f460979c2674548915f8e4ef4e6f94f9dc5d8" exitCode=0 Dec 02 10:07:06 crc kubenswrapper[4895]: I1202 10:07:06.519708 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" event={"ID":"0468d2d1-a975-45a6-af9f-47adc432fab0","Type":"ContainerDied","Data":"cfbf6225642a8065a30b8648d17f460979c2674548915f8e4ef4e6f94f9dc5d8"} Dec 02 10:07:06 crc kubenswrapper[4895]: I1202 10:07:06.519762 4895 scope.go:117] "RemoveContainer" containerID="4a84913ab078ae3f7e63d3921159754fdc076f025af9fff01a588a2bf87f8d82" Dec 02 10:07:06 crc kubenswrapper[4895]: I1202 10:07:06.520710 4895 scope.go:117] "RemoveContainer" containerID="cfbf6225642a8065a30b8648d17f460979c2674548915f8e4ef4e6f94f9dc5d8" Dec 02 10:07:06 crc kubenswrapper[4895]: E1202 10:07:06.521103 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 10:07:09 crc kubenswrapper[4895]: I1202 10:07:09.893818 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_5365a5b3-61a8-47cf-a99e-6425e6af3784/init-config-reloader/0.log" Dec 02 10:07:10 crc kubenswrapper[4895]: I1202 10:07:10.138548 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_5365a5b3-61a8-47cf-a99e-6425e6af3784/init-config-reloader/0.log" Dec 02 10:07:10 crc kubenswrapper[4895]: I1202 10:07:10.161859 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_5365a5b3-61a8-47cf-a99e-6425e6af3784/alertmanager/0.log" Dec 02 10:07:10 crc kubenswrapper[4895]: I1202 10:07:10.221712 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_5365a5b3-61a8-47cf-a99e-6425e6af3784/config-reloader/0.log" Dec 02 10:07:10 
crc kubenswrapper[4895]: I1202 10:07:10.365661 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_b3bd61e8-c5af-493e-b789-d517f04a8f70/aodh-api/0.log" Dec 02 10:07:10 crc kubenswrapper[4895]: I1202 10:07:10.393285 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_b3bd61e8-c5af-493e-b789-d517f04a8f70/aodh-listener/0.log" Dec 02 10:07:10 crc kubenswrapper[4895]: I1202 10:07:10.465073 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_b3bd61e8-c5af-493e-b789-d517f04a8f70/aodh-evaluator/0.log" Dec 02 10:07:10 crc kubenswrapper[4895]: I1202 10:07:10.656719 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_b3bd61e8-c5af-493e-b789-d517f04a8f70/aodh-notifier/0.log" Dec 02 10:07:10 crc kubenswrapper[4895]: I1202 10:07:10.677732 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-74489445b6-mqcrt_1b097c65-8c8e-435d-8f97-f717edd603a5/barbican-api/0.log" Dec 02 10:07:10 crc kubenswrapper[4895]: I1202 10:07:10.685046 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-74489445b6-mqcrt_1b097c65-8c8e-435d-8f97-f717edd603a5/barbican-api-log/0.log" Dec 02 10:07:10 crc kubenswrapper[4895]: I1202 10:07:10.922169 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7d7f6798b6-7qknd_04b41b51-5490-42b8-9f37-837c9c8a3c2d/barbican-keystone-listener/0.log" Dec 02 10:07:10 crc kubenswrapper[4895]: I1202 10:07:10.974114 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7d7f6798b6-7qknd_04b41b51-5490-42b8-9f37-837c9c8a3c2d/barbican-keystone-listener-log/0.log" Dec 02 10:07:11 crc kubenswrapper[4895]: I1202 10:07:11.172783 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-7cfc66c54c-2fd8f_3a651bae-3fe4-4805-8db1-ef32665084e8/barbican-worker/0.log" Dec 02 10:07:11 crc 
kubenswrapper[4895]: I1202 10:07:11.206254 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-7cfc66c54c-2fd8f_3a651bae-3fe4-4805-8db1-ef32665084e8/barbican-worker-log/0.log" Dec 02 10:07:11 crc kubenswrapper[4895]: I1202 10:07:11.301341 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-openstack-openstack-cell1-wf5kx_f199c54a-28da-4ea4-a95b-4ab810484ce2/bootstrap-openstack-openstack-cell1/0.log" Dec 02 10:07:11 crc kubenswrapper[4895]: I1202 10:07:11.504537 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_41c2235d-9bee-4e2d-878b-e6f1471a4078/ceilometer-central-agent/0.log" Dec 02 10:07:11 crc kubenswrapper[4895]: I1202 10:07:11.545776 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_41c2235d-9bee-4e2d-878b-e6f1471a4078/proxy-httpd/0.log" Dec 02 10:07:11 crc kubenswrapper[4895]: I1202 10:07:11.557524 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_41c2235d-9bee-4e2d-878b-e6f1471a4078/ceilometer-notification-agent/0.log" Dec 02 10:07:11 crc kubenswrapper[4895]: I1202 10:07:11.695551 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_41c2235d-9bee-4e2d-878b-e6f1471a4078/sg-core/0.log" Dec 02 10:07:11 crc kubenswrapper[4895]: I1202 10:07:11.758988 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-client-openstack-openstack-cell1-2qtd2_64c58f3c-deb5-4931-9285-f02a3f576dd0/ceph-client-openstack-openstack-cell1/0.log" Dec 02 10:07:11 crc kubenswrapper[4895]: I1202 10:07:11.986480 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_0fd9be26-fa07-4fd7-8723-f7e4121680d1/cinder-api/0.log" Dec 02 10:07:11 crc kubenswrapper[4895]: I1202 10:07:11.991254 4895 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_cinder-api-0_0fd9be26-fa07-4fd7-8723-f7e4121680d1/cinder-api-log/0.log" Dec 02 10:07:12 crc kubenswrapper[4895]: I1202 10:07:12.252189 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_0d4fda23-3c3c-435b-860c-0973feb1e664/cinder-backup/0.log" Dec 02 10:07:12 crc kubenswrapper[4895]: I1202 10:07:12.315890 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_0d4fda23-3c3c-435b-860c-0973feb1e664/probe/0.log" Dec 02 10:07:12 crc kubenswrapper[4895]: I1202 10:07:12.352211 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_b78042eb-5d7c-4630-83e1-0f722cfde766/cinder-scheduler/0.log" Dec 02 10:07:12 crc kubenswrapper[4895]: I1202 10:07:12.555416 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_b78042eb-5d7c-4630-83e1-0f722cfde766/probe/0.log" Dec 02 10:07:12 crc kubenswrapper[4895]: I1202 10:07:12.638978 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_11cd078b-e238-400e-bf0d-53e7ed0b848b/cinder-volume/0.log" Dec 02 10:07:12 crc kubenswrapper[4895]: I1202 10:07:12.723581 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_11cd078b-e238-400e-bf0d-53e7ed0b848b/probe/0.log" Dec 02 10:07:12 crc kubenswrapper[4895]: I1202 10:07:12.914036 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-openstack-openstack-cell1-vsmxx_2a1105f6-57ce-4e6d-a62b-1f1dbb777da8/configure-network-openstack-openstack-cell1/0.log" Dec 02 10:07:12 crc kubenswrapper[4895]: I1202 10:07:12.981538 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-openstack-openstack-cell1-z7wst_366b5800-e486-4b22-9e3a-4d0c86356cd0/configure-os-openstack-openstack-cell1/0.log" Dec 02 10:07:13 crc kubenswrapper[4895]: I1202 10:07:13.175017 4895 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-7864bfbdf-hszwn_c70683d8-a861-4c90-b092-41aad531f04e/init/0.log" Dec 02 10:07:13 crc kubenswrapper[4895]: I1202 10:07:13.512573 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-7864bfbdf-hszwn_c70683d8-a861-4c90-b092-41aad531f04e/init/0.log" Dec 02 10:07:13 crc kubenswrapper[4895]: I1202 10:07:13.513792 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-openstack-openstack-cell1-pklz4_a4af478f-ce76-4ed3-9adc-93d1ae521565/download-cache-openstack-openstack-cell1/0.log" Dec 02 10:07:13 crc kubenswrapper[4895]: I1202 10:07:13.521034 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-7864bfbdf-hszwn_c70683d8-a861-4c90-b092-41aad531f04e/dnsmasq-dns/0.log" Dec 02 10:07:13 crc kubenswrapper[4895]: I1202 10:07:13.731219 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_2f09559d-29ed-400b-8069-4684c4d060cd/glance-httpd/0.log" Dec 02 10:07:13 crc kubenswrapper[4895]: I1202 10:07:13.795458 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_2f09559d-29ed-400b-8069-4684c4d060cd/glance-log/0.log" Dec 02 10:07:13 crc kubenswrapper[4895]: I1202 10:07:13.810113 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_f19bb559-c258-498f-9132-7ee9ea57db14/glance-httpd/0.log" Dec 02 10:07:13 crc kubenswrapper[4895]: I1202 10:07:13.972416 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_f19bb559-c258-498f-9132-7ee9ea57db14/glance-log/0.log" Dec 02 10:07:14 crc kubenswrapper[4895]: I1202 10:07:14.178272 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-api-7bf7577d5f-qmt6j_3a8f67f5-e0f7-4d77-a9e3-27edd04f368d/heat-api/0.log" Dec 02 10:07:14 crc kubenswrapper[4895]: I1202 10:07:14.288176 
4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-cfnapi-6cf4c6755f-682sl_e8702398-a4fe-4a0b-94db-b747393df1a4/heat-cfnapi/0.log" Dec 02 10:07:14 crc kubenswrapper[4895]: I1202 10:07:14.451471 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-engine-5d65795cf7-2g5fx_98206baf-726c-4fa4-ae92-8048974e2e1d/heat-engine/0.log" Dec 02 10:07:14 crc kubenswrapper[4895]: I1202 10:07:14.635732 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-66c85bfb6f-gnhhn_77721628-6ade-42aa-bce2-e4d481d4d76f/horizon-log/0.log" Dec 02 10:07:14 crc kubenswrapper[4895]: I1202 10:07:14.650842 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-66c85bfb6f-gnhhn_77721628-6ade-42aa-bce2-e4d481d4d76f/horizon/0.log" Dec 02 10:07:14 crc kubenswrapper[4895]: I1202 10:07:14.722370 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-openstack-openstack-cell1-vvp2m_7c5c7124-dd91-4ff2-ada6-43bfd65dc9f5/install-certs-openstack-openstack-cell1/0.log" Dec 02 10:07:14 crc kubenswrapper[4895]: I1202 10:07:14.869524 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-openstack-openstack-cell1-kpnd9_56d195ca-712e-42b8-b755-ed605d04f09d/install-os-openstack-openstack-cell1/0.log" Dec 02 10:07:15 crc kubenswrapper[4895]: I1202 10:07:15.014300 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29411101-7q6kv_7148462c-5f1b-4d1f-a161-5ffbf7963add/keystone-cron/0.log" Dec 02 10:07:15 crc kubenswrapper[4895]: I1202 10:07:15.123114 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-7db5ccf6f4-vpr49_9df75d68-0a6b-4fcb-8241-3ec3b95a0a1c/keystone-api/0.log" Dec 02 10:07:15 crc kubenswrapper[4895]: I1202 10:07:15.200598 4895 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_keystone-cron-29411161-xwnxx_5e89a9f9-bc16-4d4b-ae72-78209734db6f/keystone-cron/0.log" Dec 02 10:07:15 crc kubenswrapper[4895]: I1202 10:07:15.281686 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_3b4b8f04-dca7-4b45-b66d-2a75b8c506cf/kube-state-metrics/0.log" Dec 02 10:07:15 crc kubenswrapper[4895]: I1202 10:07:15.449550 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-openstack-openstack-cell1-kd4nz_6b84d8a6-8098-46cd-83cf-860f21f040a0/libvirt-openstack-openstack-cell1/0.log" Dec 02 10:07:15 crc kubenswrapper[4895]: I1202 10:07:15.667481 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_858d88d6-a1a5-49dd-90c9-c87d83cc992f/manila-api-log/0.log" Dec 02 10:07:15 crc kubenswrapper[4895]: I1202 10:07:15.880395 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_858d88d6-a1a5-49dd-90c9-c87d83cc992f/manila-api/0.log" Dec 02 10:07:15 crc kubenswrapper[4895]: I1202 10:07:15.953944 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_34e95488-e475-4fd5-94c7-43633883cc2b/probe/0.log" Dec 02 10:07:16 crc kubenswrapper[4895]: I1202 10:07:16.020140 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_34e95488-e475-4fd5-94c7-43633883cc2b/manila-scheduler/0.log" Dec 02 10:07:16 crc kubenswrapper[4895]: I1202 10:07:16.136252 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_294cd801-c423-4a14-95c0-1ece400a3760/manila-share/0.log" Dec 02 10:07:16 crc kubenswrapper[4895]: I1202 10:07:16.215350 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_294cd801-c423-4a14-95c0-1ece400a3760/probe/0.log" Dec 02 10:07:16 crc kubenswrapper[4895]: I1202 10:07:16.548772 4895 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_neutron-6849c5c4f-gh46j_f410d809-a0e1-465b-a495-868c22d9b9c7/neutron-httpd/0.log" Dec 02 10:07:16 crc kubenswrapper[4895]: I1202 10:07:16.552973 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6849c5c4f-gh46j_f410d809-a0e1-465b-a495-868c22d9b9c7/neutron-api/0.log" Dec 02 10:07:16 crc kubenswrapper[4895]: I1202 10:07:16.624703 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-dhcp-openstack-openstack-cell1-qpf4d_40de1ee8-5c68-4155-86f4-55152e72d07e/neutron-dhcp-openstack-openstack-cell1/0.log" Dec 02 10:07:17 crc kubenswrapper[4895]: I1202 10:07:17.028982 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-sriov-openstack-openstack-cell1-pl55w_ea563647-46a1-45f7-9592-c2f1a842df06/neutron-sriov-openstack-openstack-cell1/0.log" Dec 02 10:07:17 crc kubenswrapper[4895]: I1202 10:07:17.033327 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-openstack-openstack-cell1-dtfpx_5d8e5aa7-e827-4d5a-8d8f-d0ca211a00e4/neutron-metadata-openstack-openstack-cell1/0.log" Dec 02 10:07:17 crc kubenswrapper[4895]: I1202 10:07:17.243661 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_cbc28dc3-fbee-4d3d-9c6c-88de443104cf/nova-api-api/0.log" Dec 02 10:07:17 crc kubenswrapper[4895]: I1202 10:07:17.420535 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_cbc28dc3-fbee-4d3d-9c6c-88de443104cf/nova-api-log/0.log" Dec 02 10:07:17 crc kubenswrapper[4895]: I1202 10:07:17.478533 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_f1ca8c61-df4d-45db-850a-7bd7dcd1eb70/nova-cell0-conductor-conductor/0.log" Dec 02 10:07:17 crc kubenswrapper[4895]: I1202 10:07:17.660853 4895 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-cell1-conductor-0_e926a7ab-fc54-4c41-9f38-65187a742aac/nova-cell1-conductor-conductor/0.log" Dec 02 10:07:17 crc kubenswrapper[4895]: I1202 10:07:17.787916 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_916b9d2f-42d2-4468-98b5-de64da9af5fc/nova-cell1-novncproxy-novncproxy/0.log" Dec 02 10:07:18 crc kubenswrapper[4895]: I1202 10:07:18.070343 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellgkqcz_427eea9a-0bfb-4a1a-a225-c4264018fd13/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1/0.log" Dec 02 10:07:18 crc kubenswrapper[4895]: I1202 10:07:18.186278 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-openstack-openstack-cell1-bmnqb_0f16ba88-ef94-4543-aef6-85263b26ff4c/nova-cell1-openstack-openstack-cell1/0.log" Dec 02 10:07:18 crc kubenswrapper[4895]: I1202 10:07:18.412115 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_8f095a39-138a-49b5-b50c-a37ad8adff98/nova-metadata-metadata/0.log" Dec 02 10:07:18 crc kubenswrapper[4895]: I1202 10:07:18.472568 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_8f095a39-138a-49b5-b50c-a37ad8adff98/nova-metadata-log/0.log" Dec 02 10:07:18 crc kubenswrapper[4895]: I1202 10:07:18.572110 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_29a50169-d52e-4f5e-afcc-bae4041237b5/nova-scheduler-scheduler/0.log" Dec 02 10:07:18 crc kubenswrapper[4895]: I1202 10:07:18.775252 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-7b55b799f8-g962x_8cccbb55-a2bc-4b2a-af20-1987d11430f0/init/0.log" Dec 02 10:07:19 crc kubenswrapper[4895]: I1202 10:07:19.000530 4895 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_octavia-api-7b55b799f8-g962x_8cccbb55-a2bc-4b2a-af20-1987d11430f0/octavia-api-provider-agent/0.log" Dec 02 10:07:19 crc kubenswrapper[4895]: I1202 10:07:19.035247 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-7b55b799f8-g962x_8cccbb55-a2bc-4b2a-af20-1987d11430f0/init/0.log" Dec 02 10:07:19 crc kubenswrapper[4895]: I1202 10:07:19.221970 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-healthmanager-88dkz_7898eb31-07a5-4f75-8646-237041e8d08e/init/0.log" Dec 02 10:07:19 crc kubenswrapper[4895]: I1202 10:07:19.258274 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-7b55b799f8-g962x_8cccbb55-a2bc-4b2a-af20-1987d11430f0/octavia-api/0.log" Dec 02 10:07:19 crc kubenswrapper[4895]: I1202 10:07:19.498449 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-healthmanager-88dkz_7898eb31-07a5-4f75-8646-237041e8d08e/init/0.log" Dec 02 10:07:19 crc kubenswrapper[4895]: I1202 10:07:19.608854 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-healthmanager-88dkz_7898eb31-07a5-4f75-8646-237041e8d08e/octavia-healthmanager/0.log" Dec 02 10:07:19 crc kubenswrapper[4895]: I1202 10:07:19.810216 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-housekeeping-b4mmm_38e2512c-e02b-4088-a3ae-f979fb28e4b7/init/0.log" Dec 02 10:07:20 crc kubenswrapper[4895]: I1202 10:07:20.035964 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-housekeeping-b4mmm_38e2512c-e02b-4088-a3ae-f979fb28e4b7/init/0.log" Dec 02 10:07:20 crc kubenswrapper[4895]: I1202 10:07:20.110941 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-housekeeping-b4mmm_38e2512c-e02b-4088-a3ae-f979fb28e4b7/octavia-housekeeping/0.log" Dec 02 10:07:20 crc kubenswrapper[4895]: I1202 10:07:20.122617 4895 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_octavia-rsyslog-44qhk_e92475e9-c98d-4450-a3ff-d60ee780d43b/init/0.log" Dec 02 10:07:20 crc kubenswrapper[4895]: I1202 10:07:20.141488 4895 scope.go:117] "RemoveContainer" containerID="cfbf6225642a8065a30b8648d17f460979c2674548915f8e4ef4e6f94f9dc5d8" Dec 02 10:07:20 crc kubenswrapper[4895]: E1202 10:07:20.143980 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 10:07:20 crc kubenswrapper[4895]: I1202 10:07:20.299163 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-rsyslog-44qhk_e92475e9-c98d-4450-a3ff-d60ee780d43b/init/0.log" Dec 02 10:07:20 crc kubenswrapper[4895]: I1202 10:07:20.318139 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-rsyslog-44qhk_e92475e9-c98d-4450-a3ff-d60ee780d43b/octavia-rsyslog/0.log" Dec 02 10:07:20 crc kubenswrapper[4895]: I1202 10:07:20.495384 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-worker-zkk7c_64a44515-da02-415f-9be2-5fcc1e976ff7/init/0.log" Dec 02 10:07:20 crc kubenswrapper[4895]: I1202 10:07:20.697944 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-worker-zkk7c_64a44515-da02-415f-9be2-5fcc1e976ff7/init/0.log" Dec 02 10:07:20 crc kubenswrapper[4895]: I1202 10:07:20.823679 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_2857fca5-4863-4518-b69e-4ceeb0625fb5/mysql-bootstrap/0.log" Dec 02 10:07:20 crc kubenswrapper[4895]: I1202 10:07:20.977577 4895 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_octavia-worker-zkk7c_64a44515-da02-415f-9be2-5fcc1e976ff7/octavia-worker/0.log" Dec 02 10:07:21 crc kubenswrapper[4895]: I1202 10:07:21.033142 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_2857fca5-4863-4518-b69e-4ceeb0625fb5/mysql-bootstrap/0.log" Dec 02 10:07:21 crc kubenswrapper[4895]: I1202 10:07:21.036879 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_2857fca5-4863-4518-b69e-4ceeb0625fb5/galera/0.log" Dec 02 10:07:21 crc kubenswrapper[4895]: I1202 10:07:21.223306 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_340f7b33-817a-47bb-90f7-69a41144137d/mysql-bootstrap/0.log" Dec 02 10:07:21 crc kubenswrapper[4895]: I1202 10:07:21.411097 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_340f7b33-817a-47bb-90f7-69a41144137d/mysql-bootstrap/0.log" Dec 02 10:07:21 crc kubenswrapper[4895]: I1202 10:07:21.444968 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_23eecc3a-5577-4505-8ecc-768aaf5228e6/openstackclient/0.log" Dec 02 10:07:21 crc kubenswrapper[4895]: I1202 10:07:21.470124 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_340f7b33-817a-47bb-90f7-69a41144137d/galera/0.log" Dec 02 10:07:21 crc kubenswrapper[4895]: I1202 10:07:21.754042 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-fp6lz_6389707d-0e93-4457-ae41-4da59350383e/ovn-controller/0.log" Dec 02 10:07:21 crc kubenswrapper[4895]: I1202 10:07:21.870354 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-dcddm_eb8d0991-333c-44a2-a646-14665184fb94/openstack-network-exporter/0.log" Dec 02 10:07:22 crc kubenswrapper[4895]: I1202 10:07:22.048570 4895 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-controller-ovs-kgxnn_2b3fbe76-b9cc-402f-9f1b-46d64c057d31/ovsdb-server-init/0.log" Dec 02 10:07:22 crc kubenswrapper[4895]: I1202 10:07:22.319279 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-kgxnn_2b3fbe76-b9cc-402f-9f1b-46d64c057d31/ovsdb-server/0.log" Dec 02 10:07:22 crc kubenswrapper[4895]: I1202 10:07:22.320491 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-kgxnn_2b3fbe76-b9cc-402f-9f1b-46d64c057d31/ovsdb-server-init/0.log" Dec 02 10:07:22 crc kubenswrapper[4895]: I1202 10:07:22.391139 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-kgxnn_2b3fbe76-b9cc-402f-9f1b-46d64c057d31/ovs-vswitchd/0.log" Dec 02 10:07:22 crc kubenswrapper[4895]: I1202 10:07:22.556057 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_7ee53cb1-8853-4c8d-a58a-beba1d7cbea4/openstack-network-exporter/0.log" Dec 02 10:07:22 crc kubenswrapper[4895]: I1202 10:07:22.571994 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_7ee53cb1-8853-4c8d-a58a-beba1d7cbea4/ovn-northd/0.log" Dec 02 10:07:22 crc kubenswrapper[4895]: I1202 10:07:22.821557 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_a799ab29-f667-4d1f-af0f-9d0123379f79/openstack-network-exporter/0.log" Dec 02 10:07:22 crc kubenswrapper[4895]: I1202 10:07:22.829251 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-openstack-openstack-cell1-7tr28_ea628598-f396-4f21-b672-9779a9b04dd1/ovn-openstack-openstack-cell1/0.log" Dec 02 10:07:22 crc kubenswrapper[4895]: I1202 10:07:22.936203 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_a799ab29-f667-4d1f-af0f-9d0123379f79/ovsdbserver-nb/0.log" Dec 02 10:07:23 crc kubenswrapper[4895]: I1202 10:07:23.905155 4895 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovsdbserver-nb-1_2e1d668e-3d4a-43a1-9fa4-8a1f478aa316/openstack-network-exporter/0.log" Dec 02 10:07:23 crc kubenswrapper[4895]: I1202 10:07:23.938586 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-1_2e1d668e-3d4a-43a1-9fa4-8a1f478aa316/ovsdbserver-nb/0.log" Dec 02 10:07:24 crc kubenswrapper[4895]: I1202 10:07:24.175670 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-2_f6a3fdc9-294e-403e-b1b3-178a47b3c692/openstack-network-exporter/0.log" Dec 02 10:07:24 crc kubenswrapper[4895]: I1202 10:07:24.238305 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-2_f6a3fdc9-294e-403e-b1b3-178a47b3c692/ovsdbserver-nb/0.log" Dec 02 10:07:24 crc kubenswrapper[4895]: I1202 10:07:24.339393 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_e09ee80b-8154-4dfb-8dd9-df40a3aded0a/openstack-network-exporter/0.log" Dec 02 10:07:24 crc kubenswrapper[4895]: I1202 10:07:24.427351 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_e09ee80b-8154-4dfb-8dd9-df40a3aded0a/ovsdbserver-sb/0.log" Dec 02 10:07:24 crc kubenswrapper[4895]: I1202 10:07:24.568970 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-1_36385a11-0e9f-41c4-a386-ff3710a53b75/openstack-network-exporter/0.log" Dec 02 10:07:24 crc kubenswrapper[4895]: I1202 10:07:24.725723 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-1_36385a11-0e9f-41c4-a386-ff3710a53b75/ovsdbserver-sb/0.log" Dec 02 10:07:24 crc kubenswrapper[4895]: I1202 10:07:24.789945 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-2_78b393f8-7861-4b27-af3d-4a70cd2afa7e/openstack-network-exporter/0.log" Dec 02 10:07:24 crc kubenswrapper[4895]: I1202 10:07:24.827131 4895 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovsdbserver-sb-2_78b393f8-7861-4b27-af3d-4a70cd2afa7e/ovsdbserver-sb/0.log" Dec 02 10:07:25 crc kubenswrapper[4895]: I1202 10:07:25.058910 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-857448b6bd-q99p5_d07803fb-bcf8-4411-9f7e-b2ca58361b51/placement-api/0.log" Dec 02 10:07:25 crc kubenswrapper[4895]: I1202 10:07:25.162891 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-857448b6bd-q99p5_d07803fb-bcf8-4411-9f7e-b2ca58361b51/placement-log/0.log" Dec 02 10:07:25 crc kubenswrapper[4895]: I1202 10:07:25.739796 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_pre-adoption-validation-openstack-pre-adoption-openstack-c9rwhk_e67687a6-5862-4747-ae07-1bd20e752c11/pre-adoption-validation-openstack-pre-adoption-openstack-cell1/0.log" Dec 02 10:07:25 crc kubenswrapper[4895]: I1202 10:07:25.760623 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_38a14d48-8eb7-44be-b29b-5a8574b72d91/init-config-reloader/0.log" Dec 02 10:07:26 crc kubenswrapper[4895]: I1202 10:07:26.051420 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_38a14d48-8eb7-44be-b29b-5a8574b72d91/config-reloader/0.log" Dec 02 10:07:26 crc kubenswrapper[4895]: I1202 10:07:26.095981 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_38a14d48-8eb7-44be-b29b-5a8574b72d91/prometheus/0.log" Dec 02 10:07:26 crc kubenswrapper[4895]: I1202 10:07:26.191539 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_38a14d48-8eb7-44be-b29b-5a8574b72d91/thanos-sidecar/0.log" Dec 02 10:07:26 crc kubenswrapper[4895]: I1202 10:07:26.208461 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_38a14d48-8eb7-44be-b29b-5a8574b72d91/init-config-reloader/0.log" Dec 02 
10:07:26 crc kubenswrapper[4895]: I1202 10:07:26.417211 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_4fc945ba-c10f-4460-a3ed-e075da154b6a/setup-container/0.log" Dec 02 10:07:26 crc kubenswrapper[4895]: I1202 10:07:26.581519 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_4fc945ba-c10f-4460-a3ed-e075da154b6a/setup-container/0.log" Dec 02 10:07:26 crc kubenswrapper[4895]: I1202 10:07:26.645661 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_1a551304-11d9-432c-bd8d-074239ed81c9/setup-container/0.log" Dec 02 10:07:26 crc kubenswrapper[4895]: I1202 10:07:26.694309 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_4fc945ba-c10f-4460-a3ed-e075da154b6a/rabbitmq/0.log" Dec 02 10:07:26 crc kubenswrapper[4895]: I1202 10:07:26.902148 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_1a551304-11d9-432c-bd8d-074239ed81c9/setup-container/0.log" Dec 02 10:07:26 crc kubenswrapper[4895]: I1202 10:07:26.954121 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-openstack-openstack-cell1-xj8ll_fa246d81-1464-4069-9a3a-40b53b72e55f/reboot-os-openstack-openstack-cell1/0.log" Dec 02 10:07:27 crc kubenswrapper[4895]: I1202 10:07:27.137579 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_1a551304-11d9-432c-bd8d-074239ed81c9/rabbitmq/0.log" Dec 02 10:07:27 crc kubenswrapper[4895]: I1202 10:07:27.201625 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-openstack-openstack-cell1-j9gdl_afc474d8-721b-479d-a10b-adfa2455b1fb/run-os-openstack-openstack-cell1/0.log" Dec 02 10:07:27 crc kubenswrapper[4895]: I1202 10:07:27.384794 4895 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ssh-known-hosts-openstack-7vsz6_3834bc1f-18a0-4d57-8f0d-e5150bd51186/ssh-known-hosts-openstack/0.log" Dec 02 10:07:27 crc kubenswrapper[4895]: I1202 10:07:27.689217 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tripleo-cleanup-tripleo-cleanup-openstack-cell1-j4z2k_fdefa681-0c42-4f77-81e9-19fc3ae7a940/tripleo-cleanup-tripleo-cleanup-openstack-cell1/0.log" Dec 02 10:07:27 crc kubenswrapper[4895]: I1202 10:07:27.825571 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-openstack-openstack-cell1-4h564_7fb97644-658c-4072-8e03-a89589d95cf5/validate-network-openstack-openstack-cell1/0.log" Dec 02 10:07:27 crc kubenswrapper[4895]: I1202 10:07:27.867039 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-openstack-openstack-cell1-8tpkb_2cf8bdc2-2981-4069-aa5c-de35a6d4a246/telemetry-openstack-openstack-cell1/0.log" Dec 02 10:07:28 crc kubenswrapper[4895]: I1202 10:07:28.041352 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_a4b2f1ac-db64-4f2e-8d51-8470c6d1e4f9/memcached/0.log" Dec 02 10:07:33 crc kubenswrapper[4895]: I1202 10:07:33.140943 4895 scope.go:117] "RemoveContainer" containerID="cfbf6225642a8065a30b8648d17f460979c2674548915f8e4ef4e6f94f9dc5d8" Dec 02 10:07:33 crc kubenswrapper[4895]: E1202 10:07:33.141930 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 10:07:44 crc kubenswrapper[4895]: I1202 10:07:44.142074 4895 scope.go:117] "RemoveContainer" 
containerID="cfbf6225642a8065a30b8648d17f460979c2674548915f8e4ef4e6f94f9dc5d8" Dec 02 10:07:44 crc kubenswrapper[4895]: E1202 10:07:44.142749 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 10:07:46 crc kubenswrapper[4895]: E1202 10:07:46.954031 4895 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/system.slice/rpm-ostreed.service\": RecentStats: unable to find data in memory cache]" Dec 02 10:07:49 crc kubenswrapper[4895]: I1202 10:07:49.913878 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-w52df_73f50459-103c-461c-a71a-95e93d66c4c2/kube-rbac-proxy/0.log" Dec 02 10:07:50 crc kubenswrapper[4895]: I1202 10:07:50.104264 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-w52df_73f50459-103c-461c-a71a-95e93d66c4c2/manager/0.log" Dec 02 10:07:50 crc kubenswrapper[4895]: I1202 10:07:50.127358 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-74cpb_582c057b-7217-47bf-b2d7-f691861668c3/kube-rbac-proxy/0.log" Dec 02 10:07:50 crc kubenswrapper[4895]: I1202 10:07:50.306139 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-74cpb_582c057b-7217-47bf-b2d7-f691861668c3/manager/0.log" Dec 02 10:07:50 crc kubenswrapper[4895]: I1202 10:07:50.311845 4895 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-m8cp2_f6776d5f-3c3e-48b5-a6fd-30ff153345c2/kube-rbac-proxy/0.log" Dec 02 10:07:50 crc kubenswrapper[4895]: I1202 10:07:50.393114 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-m8cp2_f6776d5f-3c3e-48b5-a6fd-30ff153345c2/manager/0.log" Dec 02 10:07:50 crc kubenswrapper[4895]: I1202 10:07:50.504055 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_eea05993eef6bd02b339f405fe470a4999c2b40f7a894619b42f64e0e1t7x28_cf843b0e-9464-4fc2-9121-d1b1128c439f/util/0.log" Dec 02 10:07:50 crc kubenswrapper[4895]: I1202 10:07:50.737361 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_eea05993eef6bd02b339f405fe470a4999c2b40f7a894619b42f64e0e1t7x28_cf843b0e-9464-4fc2-9121-d1b1128c439f/util/0.log" Dec 02 10:07:50 crc kubenswrapper[4895]: I1202 10:07:50.766505 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_eea05993eef6bd02b339f405fe470a4999c2b40f7a894619b42f64e0e1t7x28_cf843b0e-9464-4fc2-9121-d1b1128c439f/pull/0.log" Dec 02 10:07:50 crc kubenswrapper[4895]: I1202 10:07:50.797804 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_eea05993eef6bd02b339f405fe470a4999c2b40f7a894619b42f64e0e1t7x28_cf843b0e-9464-4fc2-9121-d1b1128c439f/pull/0.log" Dec 02 10:07:50 crc kubenswrapper[4895]: I1202 10:07:50.925222 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_eea05993eef6bd02b339f405fe470a4999c2b40f7a894619b42f64e0e1t7x28_cf843b0e-9464-4fc2-9121-d1b1128c439f/util/0.log" Dec 02 10:07:50 crc kubenswrapper[4895]: I1202 10:07:50.974819 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_eea05993eef6bd02b339f405fe470a4999c2b40f7a894619b42f64e0e1t7x28_cf843b0e-9464-4fc2-9121-d1b1128c439f/pull/0.log" Dec 02 10:07:50 crc 
kubenswrapper[4895]: I1202 10:07:50.975026 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_eea05993eef6bd02b339f405fe470a4999c2b40f7a894619b42f64e0e1t7x28_cf843b0e-9464-4fc2-9121-d1b1128c439f/extract/0.log" Dec 02 10:07:51 crc kubenswrapper[4895]: I1202 10:07:51.163185 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-668d9c48b9-g8gjq_362234fb-b096-48e0-9be1-bed6b3e1dcf6/kube-rbac-proxy/0.log" Dec 02 10:07:51 crc kubenswrapper[4895]: I1202 10:07:51.287227 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-mgtj6_442a4a4d-98fb-4869-9418-7f8f3ff4644b/kube-rbac-proxy/0.log" Dec 02 10:07:51 crc kubenswrapper[4895]: I1202 10:07:51.356893 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-668d9c48b9-g8gjq_362234fb-b096-48e0-9be1-bed6b3e1dcf6/manager/0.log" Dec 02 10:07:51 crc kubenswrapper[4895]: I1202 10:07:51.405964 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-mgtj6_442a4a4d-98fb-4869-9418-7f8f3ff4644b/manager/0.log" Dec 02 10:07:51 crc kubenswrapper[4895]: I1202 10:07:51.500827 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-sbrp2_806c9a5b-16ad-499d-8625-ec9124baca56/kube-rbac-proxy/0.log" Dec 02 10:07:51 crc kubenswrapper[4895]: I1202 10:07:51.572669 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-sbrp2_806c9a5b-16ad-499d-8625-ec9124baca56/manager/0.log" Dec 02 10:07:51 crc kubenswrapper[4895]: I1202 10:07:51.647229 4895 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-wmsb6_86fe6ea0-2ba9-46f8-9a71-1b990d841e31/kube-rbac-proxy/0.log" Dec 02 10:07:51 crc kubenswrapper[4895]: I1202 10:07:51.921542 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-cp9zj_4c5e704b-8d64-4341-abfe-da2df788ba5c/manager/0.log" Dec 02 10:07:51 crc kubenswrapper[4895]: I1202 10:07:51.930458 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-cp9zj_4c5e704b-8d64-4341-abfe-da2df788ba5c/kube-rbac-proxy/0.log" Dec 02 10:07:52 crc kubenswrapper[4895]: I1202 10:07:52.042669 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-wmsb6_86fe6ea0-2ba9-46f8-9a71-1b990d841e31/manager/0.log" Dec 02 10:07:52 crc kubenswrapper[4895]: I1202 10:07:52.122403 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-546d4bdf48-z6fb4_cb79c25e-42b0-4c89-b756-89d97afeea8a/kube-rbac-proxy/0.log" Dec 02 10:07:52 crc kubenswrapper[4895]: I1202 10:07:52.284349 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-546d4bdf48-z6fb4_cb79c25e-42b0-4c89-b756-89d97afeea8a/manager/0.log" Dec 02 10:07:52 crc kubenswrapper[4895]: I1202 10:07:52.350217 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-6546668bfd-5xsjx_170932c6-4350-4209-ba99-ff53eecd81ee/kube-rbac-proxy/0.log" Dec 02 10:07:52 crc kubenswrapper[4895]: I1202 10:07:52.430435 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-6546668bfd-5xsjx_170932c6-4350-4209-ba99-ff53eecd81ee/manager/0.log" Dec 02 10:07:52 crc kubenswrapper[4895]: I1202 10:07:52.542969 
4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-qxbgv_392adf20-0169-4258-9f4f-bb293bd5f8e8/kube-rbac-proxy/0.log" Dec 02 10:07:52 crc kubenswrapper[4895]: I1202 10:07:52.585275 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-qxbgv_392adf20-0169-4258-9f4f-bb293bd5f8e8/manager/0.log" Dec 02 10:07:52 crc kubenswrapper[4895]: I1202 10:07:52.802117 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-9zczx_ca1b5423-1f2c-4b12-9ae9-f65bb5301c51/kube-rbac-proxy/0.log" Dec 02 10:07:52 crc kubenswrapper[4895]: I1202 10:07:52.859060 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-9zczx_ca1b5423-1f2c-4b12-9ae9-f65bb5301c51/manager/0.log" Dec 02 10:07:52 crc kubenswrapper[4895]: I1202 10:07:52.962681 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-4hpbt_2a05da0d-5cc8-4656-8cf4-96b96077d708/kube-rbac-proxy/0.log" Dec 02 10:07:53 crc kubenswrapper[4895]: I1202 10:07:53.136398 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-2mrjr_98c4ec72-ccff-439b-af96-53775411d965/kube-rbac-proxy/0.log" Dec 02 10:07:53 crc kubenswrapper[4895]: I1202 10:07:53.258225 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-2mrjr_98c4ec72-ccff-439b-af96-53775411d965/manager/0.log" Dec 02 10:07:53 crc kubenswrapper[4895]: I1202 10:07:53.353401 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-4hpbt_2a05da0d-5cc8-4656-8cf4-96b96077d708/manager/0.log" Dec 02 10:07:53 crc 
kubenswrapper[4895]: I1202 10:07:53.534654 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-64bc77cfd4c299z_b8dc2edd-3bab-4a5d-a994-ba2212e85045/kube-rbac-proxy/0.log" Dec 02 10:07:53 crc kubenswrapper[4895]: I1202 10:07:53.603551 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-64bc77cfd4c299z_b8dc2edd-3bab-4a5d-a994-ba2212e85045/manager/0.log" Dec 02 10:07:54 crc kubenswrapper[4895]: I1202 10:07:54.046182 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-wg797_87c87c33-2ac6-4223-870c-aa91961f9952/registry-server/0.log" Dec 02 10:07:54 crc kubenswrapper[4895]: I1202 10:07:54.061761 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-6d58ccd9c-c8j68_63cf2176-3acd-461d-9fda-3f2337a37452/operator/0.log" Dec 02 10:07:54 crc kubenswrapper[4895]: I1202 10:07:54.144569 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-4642q_f4639bb3-56b6-498a-bb7d-ab26b46fe806/kube-rbac-proxy/0.log" Dec 02 10:07:54 crc kubenswrapper[4895]: I1202 10:07:54.397707 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-q6n97_991002a6-abdd-43e4-b22d-1d95383d3b96/kube-rbac-proxy/0.log" Dec 02 10:07:54 crc kubenswrapper[4895]: I1202 10:07:54.488912 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-4642q_f4639bb3-56b6-498a-bb7d-ab26b46fe806/manager/0.log" Dec 02 10:07:54 crc kubenswrapper[4895]: I1202 10:07:54.625972 4895 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-q6n97_991002a6-abdd-43e4-b22d-1d95383d3b96/manager/0.log" Dec 02 10:07:54 crc kubenswrapper[4895]: I1202 10:07:54.698203 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-bhzzl_82a2bf22-3682-4982-b4fc-87ac78873cce/operator/0.log" Dec 02 10:07:54 crc kubenswrapper[4895]: I1202 10:07:54.877929 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-rqm56_1df4be5d-9e12-4799-aaa8-1ec5bfa11a2c/manager/0.log" Dec 02 10:07:54 crc kubenswrapper[4895]: I1202 10:07:54.924489 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-rqm56_1df4be5d-9e12-4799-aaa8-1ec5bfa11a2c/kube-rbac-proxy/0.log" Dec 02 10:07:55 crc kubenswrapper[4895]: I1202 10:07:55.014666 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-76cc84c6bb-hcnp5_ad99804b-869f-4b42-89ab-d29341434b61/kube-rbac-proxy/0.log" Dec 02 10:07:55 crc kubenswrapper[4895]: I1202 10:07:55.173071 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-6x68d_a5a01f83-cddf-479d-b6d0-7944d70c0bdd/kube-rbac-proxy/0.log" Dec 02 10:07:55 crc kubenswrapper[4895]: I1202 10:07:55.250681 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-6x68d_a5a01f83-cddf-479d-b6d0-7944d70c0bdd/manager/0.log" Dec 02 10:07:55 crc kubenswrapper[4895]: I1202 10:07:55.365483 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-76cc84c6bb-hcnp5_ad99804b-869f-4b42-89ab-d29341434b61/manager/0.log" Dec 02 10:07:55 crc kubenswrapper[4895]: I1202 10:07:55.449965 4895 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-769dc69bc-d5lnh_a2cb057c-0f4a-4220-8666-3ccab3458be2/kube-rbac-proxy/0.log" Dec 02 10:07:55 crc kubenswrapper[4895]: I1202 10:07:55.529693 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-769dc69bc-d5lnh_a2cb057c-0f4a-4220-8666-3ccab3458be2/manager/0.log" Dec 02 10:07:56 crc kubenswrapper[4895]: I1202 10:07:56.231054 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-6979866f9f-j56w9_932b16cb-babd-4cd7-902c-03cd223e98bc/manager/0.log" Dec 02 10:07:57 crc kubenswrapper[4895]: I1202 10:07:57.142462 4895 scope.go:117] "RemoveContainer" containerID="cfbf6225642a8065a30b8648d17f460979c2674548915f8e4ef4e6f94f9dc5d8" Dec 02 10:07:57 crc kubenswrapper[4895]: E1202 10:07:57.142972 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 10:08:09 crc kubenswrapper[4895]: I1202 10:08:09.151257 4895 scope.go:117] "RemoveContainer" containerID="cfbf6225642a8065a30b8648d17f460979c2674548915f8e4ef4e6f94f9dc5d8" Dec 02 10:08:09 crc kubenswrapper[4895]: E1202 10:08:09.152176 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" 
podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 10:08:16 crc kubenswrapper[4895]: I1202 10:08:16.407958 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-bvgr6_c27769e4-b2f8-4947-96c9-b90bfce6ff0d/control-plane-machine-set-operator/0.log" Dec 02 10:08:16 crc kubenswrapper[4895]: I1202 10:08:16.634955 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-49t7q_1569ea0e-ca30-4212-95e4-11dde6bca970/kube-rbac-proxy/0.log" Dec 02 10:08:16 crc kubenswrapper[4895]: I1202 10:08:16.636341 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-49t7q_1569ea0e-ca30-4212-95e4-11dde6bca970/machine-api-operator/0.log" Dec 02 10:08:20 crc kubenswrapper[4895]: I1202 10:08:20.140949 4895 scope.go:117] "RemoveContainer" containerID="cfbf6225642a8065a30b8648d17f460979c2674548915f8e4ef4e6f94f9dc5d8" Dec 02 10:08:20 crc kubenswrapper[4895]: E1202 10:08:20.141804 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 10:08:29 crc kubenswrapper[4895]: I1202 10:08:29.887691 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-86cb77c54b-t679q_0dc59886-5d5a-4d16-a083-8a14503368fc/cert-manager-controller/0.log" Dec 02 10:08:30 crc kubenswrapper[4895]: I1202 10:08:30.069388 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-855d9ccff4-mqb8r_8e031490-b855-4e4c-8159-14cf9a710e98/cert-manager-cainjector/0.log" Dec 02 10:08:30 
crc kubenswrapper[4895]: I1202 10:08:30.097000 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-f4fb5df64-p2gqj_6f60e46b-cecc-41fc-992a-0b4ae09082fd/cert-manager-webhook/0.log" Dec 02 10:08:33 crc kubenswrapper[4895]: I1202 10:08:33.141564 4895 scope.go:117] "RemoveContainer" containerID="cfbf6225642a8065a30b8648d17f460979c2674548915f8e4ef4e6f94f9dc5d8" Dec 02 10:08:33 crc kubenswrapper[4895]: E1202 10:08:33.142378 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 10:08:43 crc kubenswrapper[4895]: I1202 10:08:43.769143 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7fbb5f6569-t5dk9_5f9c69d9-5599-4fc7-bebf-44167ee2cccf/nmstate-console-plugin/0.log" Dec 02 10:08:43 crc kubenswrapper[4895]: I1202 10:08:43.775210 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-lmvxq_b66d1e69-d965-457a-8a57-b5b721bc3cd9/nmstate-handler/0.log" Dec 02 10:08:43 crc kubenswrapper[4895]: I1202 10:08:43.952142 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-55nh2_b59c4d17-fb33-40a9-b00f-fc89b30d9c6a/kube-rbac-proxy/0.log" Dec 02 10:08:43 crc kubenswrapper[4895]: I1202 10:08:43.955321 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-55nh2_b59c4d17-fb33-40a9-b00f-fc89b30d9c6a/nmstate-metrics/0.log" Dec 02 10:08:44 crc kubenswrapper[4895]: I1202 10:08:44.129176 4895 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-operator-5b5b58f5c8-67fqr_2586c411-cce4-4ade-af50-5d2b0c5ee2b6/nmstate-operator/0.log" Dec 02 10:08:44 crc kubenswrapper[4895]: I1202 10:08:44.150795 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f6d4c5ccb-rfdrr_19c56e99-f344-469b-9940-3e8ebe40c721/nmstate-webhook/0.log" Dec 02 10:08:45 crc kubenswrapper[4895]: I1202 10:08:45.141460 4895 scope.go:117] "RemoveContainer" containerID="cfbf6225642a8065a30b8648d17f460979c2674548915f8e4ef4e6f94f9dc5d8" Dec 02 10:08:45 crc kubenswrapper[4895]: E1202 10:08:45.142051 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 10:08:58 crc kubenswrapper[4895]: I1202 10:08:58.141937 4895 scope.go:117] "RemoveContainer" containerID="cfbf6225642a8065a30b8648d17f460979c2674548915f8e4ef4e6f94f9dc5d8" Dec 02 10:08:58 crc kubenswrapper[4895]: E1202 10:08:58.142883 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 10:08:59 crc kubenswrapper[4895]: I1202 10:08:59.183102 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-mpkw4_b2bd91a3-f8a0-4abe-9598-4977cc56daa1/kube-rbac-proxy/0.log" Dec 02 10:08:59 crc kubenswrapper[4895]: I1202 
10:08:59.528428 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hjl4p_bef27c67-d150-4004-bfbf-285c544f72f7/cp-frr-files/0.log" Dec 02 10:08:59 crc kubenswrapper[4895]: I1202 10:08:59.631105 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-mpkw4_b2bd91a3-f8a0-4abe-9598-4977cc56daa1/controller/0.log" Dec 02 10:08:59 crc kubenswrapper[4895]: I1202 10:08:59.674178 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hjl4p_bef27c67-d150-4004-bfbf-285c544f72f7/cp-reloader/0.log" Dec 02 10:08:59 crc kubenswrapper[4895]: I1202 10:08:59.736995 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hjl4p_bef27c67-d150-4004-bfbf-285c544f72f7/cp-frr-files/0.log" Dec 02 10:08:59 crc kubenswrapper[4895]: I1202 10:08:59.744279 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hjl4p_bef27c67-d150-4004-bfbf-285c544f72f7/cp-metrics/0.log" Dec 02 10:08:59 crc kubenswrapper[4895]: I1202 10:08:59.870309 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hjl4p_bef27c67-d150-4004-bfbf-285c544f72f7/cp-reloader/0.log" Dec 02 10:09:00 crc kubenswrapper[4895]: I1202 10:09:00.059395 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hjl4p_bef27c67-d150-4004-bfbf-285c544f72f7/cp-reloader/0.log" Dec 02 10:09:00 crc kubenswrapper[4895]: I1202 10:09:00.065012 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hjl4p_bef27c67-d150-4004-bfbf-285c544f72f7/cp-frr-files/0.log" Dec 02 10:09:00 crc kubenswrapper[4895]: I1202 10:09:00.125461 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hjl4p_bef27c67-d150-4004-bfbf-285c544f72f7/cp-metrics/0.log" Dec 02 10:09:00 crc kubenswrapper[4895]: I1202 10:09:00.133214 4895 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-hjl4p_bef27c67-d150-4004-bfbf-285c544f72f7/cp-metrics/0.log" Dec 02 10:09:00 crc kubenswrapper[4895]: I1202 10:09:00.921965 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hjl4p_bef27c67-d150-4004-bfbf-285c544f72f7/cp-frr-files/0.log" Dec 02 10:09:00 crc kubenswrapper[4895]: I1202 10:09:00.980971 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hjl4p_bef27c67-d150-4004-bfbf-285c544f72f7/cp-reloader/0.log" Dec 02 10:09:01 crc kubenswrapper[4895]: I1202 10:09:01.007545 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hjl4p_bef27c67-d150-4004-bfbf-285c544f72f7/cp-metrics/0.log" Dec 02 10:09:01 crc kubenswrapper[4895]: I1202 10:09:01.040242 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hjl4p_bef27c67-d150-4004-bfbf-285c544f72f7/controller/0.log" Dec 02 10:09:01 crc kubenswrapper[4895]: I1202 10:09:01.220084 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hjl4p_bef27c67-d150-4004-bfbf-285c544f72f7/frr-metrics/0.log" Dec 02 10:09:01 crc kubenswrapper[4895]: I1202 10:09:01.235982 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hjl4p_bef27c67-d150-4004-bfbf-285c544f72f7/kube-rbac-proxy/0.log" Dec 02 10:09:01 crc kubenswrapper[4895]: I1202 10:09:01.344824 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hjl4p_bef27c67-d150-4004-bfbf-285c544f72f7/kube-rbac-proxy-frr/0.log" Dec 02 10:09:01 crc kubenswrapper[4895]: I1202 10:09:01.497791 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hjl4p_bef27c67-d150-4004-bfbf-285c544f72f7/reloader/0.log" Dec 02 10:09:01 crc kubenswrapper[4895]: I1202 10:09:01.620970 4895 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7fcb986d4-h27ch_d6b6bc7e-1d2f-4575-b0fc-a605dcfff0af/frr-k8s-webhook-server/0.log" Dec 02 10:09:01 crc kubenswrapper[4895]: I1202 10:09:01.788083 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-5649cff7-xm5vt_50763e18-0c0b-4aff-97bc-0fb2fdce0b0b/manager/0.log" Dec 02 10:09:02 crc kubenswrapper[4895]: I1202 10:09:02.518786 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-tgv9h_1a93aadf-71d0-4a54-8eb1-fd710b164b07/kube-rbac-proxy/0.log" Dec 02 10:09:02 crc kubenswrapper[4895]: I1202 10:09:02.589091 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-659846876d-j9nsg_bbabc224-1401-4730-a4d7-92caa322c81b/webhook-server/0.log" Dec 02 10:09:04 crc kubenswrapper[4895]: I1202 10:09:04.146246 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-tgv9h_1a93aadf-71d0-4a54-8eb1-fd710b164b07/speaker/0.log" Dec 02 10:09:04 crc kubenswrapper[4895]: I1202 10:09:04.996004 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hjl4p_bef27c67-d150-4004-bfbf-285c544f72f7/frr/0.log" Dec 02 10:09:09 crc kubenswrapper[4895]: I1202 10:09:09.157263 4895 scope.go:117] "RemoveContainer" containerID="cfbf6225642a8065a30b8648d17f460979c2674548915f8e4ef4e6f94f9dc5d8" Dec 02 10:09:09 crc kubenswrapper[4895]: E1202 10:09:09.158241 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 10:09:18 crc kubenswrapper[4895]: I1202 10:09:18.000885 
4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931acblbt_28ac482a-ea9e-4c36-93b7-89580756a458/util/0.log" Dec 02 10:09:18 crc kubenswrapper[4895]: I1202 10:09:18.358706 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931acblbt_28ac482a-ea9e-4c36-93b7-89580756a458/util/0.log" Dec 02 10:09:18 crc kubenswrapper[4895]: I1202 10:09:18.469161 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931acblbt_28ac482a-ea9e-4c36-93b7-89580756a458/util/0.log" Dec 02 10:09:18 crc kubenswrapper[4895]: I1202 10:09:18.511633 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931acblbt_28ac482a-ea9e-4c36-93b7-89580756a458/pull/0.log" Dec 02 10:09:18 crc kubenswrapper[4895]: I1202 10:09:18.513983 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931acblbt_28ac482a-ea9e-4c36-93b7-89580756a458/pull/0.log" Dec 02 10:09:18 crc kubenswrapper[4895]: I1202 10:09:18.605021 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931acblbt_28ac482a-ea9e-4c36-93b7-89580756a458/pull/0.log" Dec 02 10:09:18 crc kubenswrapper[4895]: I1202 10:09:18.684926 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931acblbt_28ac482a-ea9e-4c36-93b7-89580756a458/extract/0.log" Dec 02 10:09:18 crc kubenswrapper[4895]: I1202 10:09:18.714156 4895 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fr8ch7_3a467b75-3cba-434a-aa67-c823cb289396/util/0.log" Dec 02 10:09:18 crc kubenswrapper[4895]: I1202 10:09:18.995600 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fr8ch7_3a467b75-3cba-434a-aa67-c823cb289396/util/0.log" Dec 02 10:09:19 crc kubenswrapper[4895]: I1202 10:09:19.013645 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fr8ch7_3a467b75-3cba-434a-aa67-c823cb289396/pull/0.log" Dec 02 10:09:19 crc kubenswrapper[4895]: I1202 10:09:19.015586 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fr8ch7_3a467b75-3cba-434a-aa67-c823cb289396/pull/0.log" Dec 02 10:09:19 crc kubenswrapper[4895]: I1202 10:09:19.255261 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fr8ch7_3a467b75-3cba-434a-aa67-c823cb289396/pull/0.log" Dec 02 10:09:19 crc kubenswrapper[4895]: I1202 10:09:19.306366 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fr8ch7_3a467b75-3cba-434a-aa67-c823cb289396/util/0.log" Dec 02 10:09:19 crc kubenswrapper[4895]: I1202 10:09:19.333468 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fr8ch7_3a467b75-3cba-434a-aa67-c823cb289396/extract/0.log" Dec 02 10:09:19 crc kubenswrapper[4895]: I1202 10:09:19.533893 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92107ctgq_223b0e53-4f79-4ffb-bf12-38b19193e535/util/0.log" Dec 02 
10:09:19 crc kubenswrapper[4895]: I1202 10:09:19.761389 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92107ctgq_223b0e53-4f79-4ffb-bf12-38b19193e535/util/0.log" Dec 02 10:09:19 crc kubenswrapper[4895]: I1202 10:09:19.773534 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92107ctgq_223b0e53-4f79-4ffb-bf12-38b19193e535/pull/0.log" Dec 02 10:09:19 crc kubenswrapper[4895]: I1202 10:09:19.794102 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92107ctgq_223b0e53-4f79-4ffb-bf12-38b19193e535/pull/0.log" Dec 02 10:09:20 crc kubenswrapper[4895]: I1202 10:09:20.001351 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92107ctgq_223b0e53-4f79-4ffb-bf12-38b19193e535/extract/0.log" Dec 02 10:09:20 crc kubenswrapper[4895]: I1202 10:09:20.021893 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92107ctgq_223b0e53-4f79-4ffb-bf12-38b19193e535/pull/0.log" Dec 02 10:09:20 crc kubenswrapper[4895]: I1202 10:09:20.023292 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92107ctgq_223b0e53-4f79-4ffb-bf12-38b19193e535/util/0.log" Dec 02 10:09:20 crc kubenswrapper[4895]: I1202 10:09:20.214444 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ncvhw_4ffd4159-d58c-4f5b-aa31-bb4e81790c51/util/0.log" Dec 02 10:09:20 crc kubenswrapper[4895]: I1202 10:09:20.457544 4895 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ncvhw_4ffd4159-d58c-4f5b-aa31-bb4e81790c51/util/0.log" Dec 02 10:09:20 crc kubenswrapper[4895]: I1202 10:09:20.466183 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ncvhw_4ffd4159-d58c-4f5b-aa31-bb4e81790c51/pull/0.log" Dec 02 10:09:20 crc kubenswrapper[4895]: I1202 10:09:20.487620 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ncvhw_4ffd4159-d58c-4f5b-aa31-bb4e81790c51/pull/0.log" Dec 02 10:09:20 crc kubenswrapper[4895]: I1202 10:09:20.724539 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ncvhw_4ffd4159-d58c-4f5b-aa31-bb4e81790c51/pull/0.log" Dec 02 10:09:20 crc kubenswrapper[4895]: I1202 10:09:20.725623 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ncvhw_4ffd4159-d58c-4f5b-aa31-bb4e81790c51/extract/0.log" Dec 02 10:09:20 crc kubenswrapper[4895]: I1202 10:09:20.767185 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83ncvhw_4ffd4159-d58c-4f5b-aa31-bb4e81790c51/util/0.log" Dec 02 10:09:20 crc kubenswrapper[4895]: I1202 10:09:20.905543 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-jcdj8_e3f4e1c5-a3e3-4391-be72-d2f2b908da65/extract-utilities/0.log" Dec 02 10:09:21 crc kubenswrapper[4895]: I1202 10:09:21.772337 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-jcdj8_e3f4e1c5-a3e3-4391-be72-d2f2b908da65/extract-utilities/0.log" Dec 02 10:09:21 crc kubenswrapper[4895]: I1202 
10:09:21.779625 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-jcdj8_e3f4e1c5-a3e3-4391-be72-d2f2b908da65/extract-content/0.log" Dec 02 10:09:21 crc kubenswrapper[4895]: I1202 10:09:21.786899 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-jcdj8_e3f4e1c5-a3e3-4391-be72-d2f2b908da65/extract-content/0.log" Dec 02 10:09:21 crc kubenswrapper[4895]: I1202 10:09:21.968255 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-jcdj8_e3f4e1c5-a3e3-4391-be72-d2f2b908da65/extract-utilities/0.log" Dec 02 10:09:22 crc kubenswrapper[4895]: I1202 10:09:22.041230 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-jcdj8_e3f4e1c5-a3e3-4391-be72-d2f2b908da65/extract-content/0.log" Dec 02 10:09:22 crc kubenswrapper[4895]: I1202 10:09:22.243909 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-pgd7z_755e130f-ee77-449d-92fe-2ca53ab52fbd/extract-utilities/0.log" Dec 02 10:09:22 crc kubenswrapper[4895]: I1202 10:09:22.498818 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-pgd7z_755e130f-ee77-449d-92fe-2ca53ab52fbd/extract-content/0.log" Dec 02 10:09:22 crc kubenswrapper[4895]: I1202 10:09:22.535133 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-pgd7z_755e130f-ee77-449d-92fe-2ca53ab52fbd/extract-content/0.log" Dec 02 10:09:22 crc kubenswrapper[4895]: I1202 10:09:22.565636 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-pgd7z_755e130f-ee77-449d-92fe-2ca53ab52fbd/extract-utilities/0.log" Dec 02 10:09:22 crc kubenswrapper[4895]: I1202 10:09:22.796016 4895 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-pgd7z_755e130f-ee77-449d-92fe-2ca53ab52fbd/extract-utilities/0.log" Dec 02 10:09:22 crc kubenswrapper[4895]: I1202 10:09:22.811064 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-pgd7z_755e130f-ee77-449d-92fe-2ca53ab52fbd/extract-content/0.log" Dec 02 10:09:23 crc kubenswrapper[4895]: I1202 10:09:23.049543 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-pgd7z_755e130f-ee77-449d-92fe-2ca53ab52fbd/registry-server/0.log" Dec 02 10:09:23 crc kubenswrapper[4895]: I1202 10:09:23.049948 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-qq7dd_a5fe9776-f107-4203-9298-a7a94665cdb4/marketplace-operator/0.log" Dec 02 10:09:23 crc kubenswrapper[4895]: I1202 10:09:23.141426 4895 scope.go:117] "RemoveContainer" containerID="cfbf6225642a8065a30b8648d17f460979c2674548915f8e4ef4e6f94f9dc5d8" Dec 02 10:09:23 crc kubenswrapper[4895]: E1202 10:09:23.141715 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 10:09:23 crc kubenswrapper[4895]: I1202 10:09:23.576672 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xzs6v_b60cfc9e-fdbe-4373-9c07-9db6a265b945/extract-utilities/0.log" Dec 02 10:09:23 crc kubenswrapper[4895]: I1202 10:09:23.578697 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-jcdj8_e3f4e1c5-a3e3-4391-be72-d2f2b908da65/registry-server/0.log" Dec 
02 10:09:23 crc kubenswrapper[4895]: I1202 10:09:23.770471 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xzs6v_b60cfc9e-fdbe-4373-9c07-9db6a265b945/extract-content/0.log" Dec 02 10:09:23 crc kubenswrapper[4895]: I1202 10:09:23.777170 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xzs6v_b60cfc9e-fdbe-4373-9c07-9db6a265b945/extract-content/0.log" Dec 02 10:09:23 crc kubenswrapper[4895]: I1202 10:09:23.800219 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xzs6v_b60cfc9e-fdbe-4373-9c07-9db6a265b945/extract-utilities/0.log" Dec 02 10:09:24 crc kubenswrapper[4895]: I1202 10:09:24.056492 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-c6tgt_a20409dc-2f9e-4c3b-b83e-11d31404503a/extract-utilities/0.log" Dec 02 10:09:24 crc kubenswrapper[4895]: I1202 10:09:24.067189 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xzs6v_b60cfc9e-fdbe-4373-9c07-9db6a265b945/extract-content/0.log" Dec 02 10:09:24 crc kubenswrapper[4895]: I1202 10:09:24.091656 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xzs6v_b60cfc9e-fdbe-4373-9c07-9db6a265b945/extract-utilities/0.log" Dec 02 10:09:24 crc kubenswrapper[4895]: I1202 10:09:24.252780 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-c6tgt_a20409dc-2f9e-4c3b-b83e-11d31404503a/extract-content/0.log" Dec 02 10:09:24 crc kubenswrapper[4895]: I1202 10:09:24.309453 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-c6tgt_a20409dc-2f9e-4c3b-b83e-11d31404503a/extract-content/0.log" Dec 02 10:09:24 crc kubenswrapper[4895]: I1202 10:09:24.348719 4895 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-c6tgt_a20409dc-2f9e-4c3b-b83e-11d31404503a/extract-utilities/0.log" Dec 02 10:09:24 crc kubenswrapper[4895]: I1202 10:09:24.365102 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xzs6v_b60cfc9e-fdbe-4373-9c07-9db6a265b945/registry-server/0.log" Dec 02 10:09:24 crc kubenswrapper[4895]: I1202 10:09:24.576680 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-c6tgt_a20409dc-2f9e-4c3b-b83e-11d31404503a/extract-content/0.log" Dec 02 10:09:24 crc kubenswrapper[4895]: I1202 10:09:24.616637 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-c6tgt_a20409dc-2f9e-4c3b-b83e-11d31404503a/extract-utilities/0.log" Dec 02 10:09:25 crc kubenswrapper[4895]: I1202 10:09:25.780718 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-c6tgt_a20409dc-2f9e-4c3b-b83e-11d31404503a/registry-server/0.log" Dec 02 10:09:35 crc kubenswrapper[4895]: I1202 10:09:35.141877 4895 scope.go:117] "RemoveContainer" containerID="cfbf6225642a8065a30b8648d17f460979c2674548915f8e4ef4e6f94f9dc5d8" Dec 02 10:09:35 crc kubenswrapper[4895]: E1202 10:09:35.142697 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 10:09:38 crc kubenswrapper[4895]: I1202 10:09:38.560270 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-668cf9dfbb-sll8w_28dc3134-e709-42e9-b347-b429f8404b0b/prometheus-operator/0.log" Dec 02 10:09:38 
crc kubenswrapper[4895]: I1202 10:09:38.790077 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-8f5f9fff9-thzt2_4b967677-6953-486c-96f5-8e8d2b7b4735/prometheus-operator-admission-webhook/0.log" Dec 02 10:09:38 crc kubenswrapper[4895]: I1202 10:09:38.806555 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-8f5f9fff9-sngvl_be63c50f-1ea7-4abe-92af-065880aa82bc/prometheus-operator-admission-webhook/0.log" Dec 02 10:09:38 crc kubenswrapper[4895]: I1202 10:09:38.989807 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-d8bb48f5d-2v4cs_42557ee6-0d59-41ae-a224-6f2d6aaac16e/operator/0.log" Dec 02 10:09:39 crc kubenswrapper[4895]: I1202 10:09:39.039662 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5446b9c989-zgdhw_20621657-8f32-4666-b066-24cd34782010/perses-operator/0.log" Dec 02 10:09:46 crc kubenswrapper[4895]: I1202 10:09:46.141043 4895 scope.go:117] "RemoveContainer" containerID="cfbf6225642a8065a30b8648d17f460979c2674548915f8e4ef4e6f94f9dc5d8" Dec 02 10:09:46 crc kubenswrapper[4895]: E1202 10:09:46.141851 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 10:10:00 crc kubenswrapper[4895]: I1202 10:10:00.140865 4895 scope.go:117] "RemoveContainer" containerID="cfbf6225642a8065a30b8648d17f460979c2674548915f8e4ef4e6f94f9dc5d8" Dec 02 10:10:00 crc kubenswrapper[4895]: E1202 10:10:00.141722 4895 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 10:10:14 crc kubenswrapper[4895]: I1202 10:10:14.141409 4895 scope.go:117] "RemoveContainer" containerID="cfbf6225642a8065a30b8648d17f460979c2674548915f8e4ef4e6f94f9dc5d8" Dec 02 10:10:14 crc kubenswrapper[4895]: E1202 10:10:14.142286 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 10:10:28 crc kubenswrapper[4895]: I1202 10:10:28.141864 4895 scope.go:117] "RemoveContainer" containerID="cfbf6225642a8065a30b8648d17f460979c2674548915f8e4ef4e6f94f9dc5d8" Dec 02 10:10:28 crc kubenswrapper[4895]: E1202 10:10:28.142697 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 10:10:43 crc kubenswrapper[4895]: I1202 10:10:43.141254 4895 scope.go:117] "RemoveContainer" containerID="cfbf6225642a8065a30b8648d17f460979c2674548915f8e4ef4e6f94f9dc5d8" Dec 02 10:10:43 crc kubenswrapper[4895]: E1202 10:10:43.143980 4895 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 10:10:58 crc kubenswrapper[4895]: I1202 10:10:58.141568 4895 scope.go:117] "RemoveContainer" containerID="cfbf6225642a8065a30b8648d17f460979c2674548915f8e4ef4e6f94f9dc5d8" Dec 02 10:10:58 crc kubenswrapper[4895]: E1202 10:10:58.142484 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 10:11:12 crc kubenswrapper[4895]: I1202 10:11:12.141294 4895 scope.go:117] "RemoveContainer" containerID="cfbf6225642a8065a30b8648d17f460979c2674548915f8e4ef4e6f94f9dc5d8" Dec 02 10:11:12 crc kubenswrapper[4895]: E1202 10:11:12.142112 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 10:11:19 crc kubenswrapper[4895]: I1202 10:11:19.759228 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-4cvfk"] Dec 02 10:11:19 crc kubenswrapper[4895]: E1202 10:11:19.760179 4895 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="691cd083-a7a2-452b-a060-2c005542605c" containerName="extract-content" Dec 02 10:11:19 crc kubenswrapper[4895]: I1202 10:11:19.760193 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="691cd083-a7a2-452b-a060-2c005542605c" containerName="extract-content" Dec 02 10:11:19 crc kubenswrapper[4895]: E1202 10:11:19.760233 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="691cd083-a7a2-452b-a060-2c005542605c" containerName="extract-utilities" Dec 02 10:11:19 crc kubenswrapper[4895]: I1202 10:11:19.760241 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="691cd083-a7a2-452b-a060-2c005542605c" containerName="extract-utilities" Dec 02 10:11:19 crc kubenswrapper[4895]: E1202 10:11:19.760258 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="691cd083-a7a2-452b-a060-2c005542605c" containerName="registry-server" Dec 02 10:11:19 crc kubenswrapper[4895]: I1202 10:11:19.760263 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="691cd083-a7a2-452b-a060-2c005542605c" containerName="registry-server" Dec 02 10:11:19 crc kubenswrapper[4895]: I1202 10:11:19.760492 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="691cd083-a7a2-452b-a060-2c005542605c" containerName="registry-server" Dec 02 10:11:19 crc kubenswrapper[4895]: I1202 10:11:19.762059 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4cvfk" Dec 02 10:11:19 crc kubenswrapper[4895]: I1202 10:11:19.772686 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4cvfk"] Dec 02 10:11:19 crc kubenswrapper[4895]: I1202 10:11:19.850398 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86484183-15b9-472c-9a69-07bda7d475e6-utilities\") pod \"certified-operators-4cvfk\" (UID: \"86484183-15b9-472c-9a69-07bda7d475e6\") " pod="openshift-marketplace/certified-operators-4cvfk" Dec 02 10:11:19 crc kubenswrapper[4895]: I1202 10:11:19.850492 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5hdn\" (UniqueName: \"kubernetes.io/projected/86484183-15b9-472c-9a69-07bda7d475e6-kube-api-access-v5hdn\") pod \"certified-operators-4cvfk\" (UID: \"86484183-15b9-472c-9a69-07bda7d475e6\") " pod="openshift-marketplace/certified-operators-4cvfk" Dec 02 10:11:19 crc kubenswrapper[4895]: I1202 10:11:19.850579 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86484183-15b9-472c-9a69-07bda7d475e6-catalog-content\") pod \"certified-operators-4cvfk\" (UID: \"86484183-15b9-472c-9a69-07bda7d475e6\") " pod="openshift-marketplace/certified-operators-4cvfk" Dec 02 10:11:19 crc kubenswrapper[4895]: I1202 10:11:19.952782 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86484183-15b9-472c-9a69-07bda7d475e6-utilities\") pod \"certified-operators-4cvfk\" (UID: \"86484183-15b9-472c-9a69-07bda7d475e6\") " pod="openshift-marketplace/certified-operators-4cvfk" Dec 02 10:11:19 crc kubenswrapper[4895]: I1202 10:11:19.952860 4895 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-v5hdn\" (UniqueName: \"kubernetes.io/projected/86484183-15b9-472c-9a69-07bda7d475e6-kube-api-access-v5hdn\") pod \"certified-operators-4cvfk\" (UID: \"86484183-15b9-472c-9a69-07bda7d475e6\") " pod="openshift-marketplace/certified-operators-4cvfk" Dec 02 10:11:19 crc kubenswrapper[4895]: I1202 10:11:19.952913 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86484183-15b9-472c-9a69-07bda7d475e6-catalog-content\") pod \"certified-operators-4cvfk\" (UID: \"86484183-15b9-472c-9a69-07bda7d475e6\") " pod="openshift-marketplace/certified-operators-4cvfk" Dec 02 10:11:19 crc kubenswrapper[4895]: I1202 10:11:19.953368 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86484183-15b9-472c-9a69-07bda7d475e6-utilities\") pod \"certified-operators-4cvfk\" (UID: \"86484183-15b9-472c-9a69-07bda7d475e6\") " pod="openshift-marketplace/certified-operators-4cvfk" Dec 02 10:11:19 crc kubenswrapper[4895]: I1202 10:11:19.953481 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86484183-15b9-472c-9a69-07bda7d475e6-catalog-content\") pod \"certified-operators-4cvfk\" (UID: \"86484183-15b9-472c-9a69-07bda7d475e6\") " pod="openshift-marketplace/certified-operators-4cvfk" Dec 02 10:11:19 crc kubenswrapper[4895]: I1202 10:11:19.974619 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5hdn\" (UniqueName: \"kubernetes.io/projected/86484183-15b9-472c-9a69-07bda7d475e6-kube-api-access-v5hdn\") pod \"certified-operators-4cvfk\" (UID: \"86484183-15b9-472c-9a69-07bda7d475e6\") " pod="openshift-marketplace/certified-operators-4cvfk" Dec 02 10:11:20 crc kubenswrapper[4895]: I1202 10:11:20.099807 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4cvfk" Dec 02 10:11:20 crc kubenswrapper[4895]: I1202 10:11:20.724124 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4cvfk"] Dec 02 10:11:21 crc kubenswrapper[4895]: I1202 10:11:21.587998 4895 generic.go:334] "Generic (PLEG): container finished" podID="86484183-15b9-472c-9a69-07bda7d475e6" containerID="4753366a4711fda212c18fc2403b9c79bc5531115ad401f86e281778607a1ab4" exitCode=0 Dec 02 10:11:21 crc kubenswrapper[4895]: I1202 10:11:21.588278 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4cvfk" event={"ID":"86484183-15b9-472c-9a69-07bda7d475e6","Type":"ContainerDied","Data":"4753366a4711fda212c18fc2403b9c79bc5531115ad401f86e281778607a1ab4"} Dec 02 10:11:21 crc kubenswrapper[4895]: I1202 10:11:21.588307 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4cvfk" event={"ID":"86484183-15b9-472c-9a69-07bda7d475e6","Type":"ContainerStarted","Data":"82c47989380b4867fcd1f48b80ce5d1c9745a10733bc3dc66d38dada9e9effd5"} Dec 02 10:11:21 crc kubenswrapper[4895]: I1202 10:11:21.590482 4895 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 02 10:11:27 crc kubenswrapper[4895]: I1202 10:11:27.149144 4895 scope.go:117] "RemoveContainer" containerID="cfbf6225642a8065a30b8648d17f460979c2674548915f8e4ef4e6f94f9dc5d8" Dec 02 10:11:27 crc kubenswrapper[4895]: E1202 10:11:27.150194 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 
10:11:27 crc kubenswrapper[4895]: I1202 10:11:27.651615 4895 generic.go:334] "Generic (PLEG): container finished" podID="86484183-15b9-472c-9a69-07bda7d475e6" containerID="8cae5c9c4a169f34c656c84a32d68dddc687b2c1c058c7c38cfe4fe3bb718830" exitCode=0 Dec 02 10:11:27 crc kubenswrapper[4895]: I1202 10:11:27.651703 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4cvfk" event={"ID":"86484183-15b9-472c-9a69-07bda7d475e6","Type":"ContainerDied","Data":"8cae5c9c4a169f34c656c84a32d68dddc687b2c1c058c7c38cfe4fe3bb718830"} Dec 02 10:11:28 crc kubenswrapper[4895]: I1202 10:11:28.662906 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4cvfk" event={"ID":"86484183-15b9-472c-9a69-07bda7d475e6","Type":"ContainerStarted","Data":"fe149668251ae2fb8240d14e59d07da973de3b6fc6eacc043ecaf76bdf55db4a"} Dec 02 10:11:28 crc kubenswrapper[4895]: I1202 10:11:28.698764 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-4cvfk" podStartSLOduration=3.019053479 podStartE2EDuration="9.69872991s" podCreationTimestamp="2025-12-02 10:11:19 +0000 UTC" firstStartedPulling="2025-12-02 10:11:21.590181481 +0000 UTC m=+10092.761041094" lastFinishedPulling="2025-12-02 10:11:28.269857912 +0000 UTC m=+10099.440717525" observedRunningTime="2025-12-02 10:11:28.686665684 +0000 UTC m=+10099.857525297" watchObservedRunningTime="2025-12-02 10:11:28.69872991 +0000 UTC m=+10099.869589533" Dec 02 10:11:30 crc kubenswrapper[4895]: I1202 10:11:30.099958 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-4cvfk" Dec 02 10:11:30 crc kubenswrapper[4895]: I1202 10:11:30.100025 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-4cvfk" Dec 02 10:11:30 crc kubenswrapper[4895]: I1202 10:11:30.149457 4895 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-4cvfk" Dec 02 10:11:40 crc kubenswrapper[4895]: I1202 10:11:40.142071 4895 scope.go:117] "RemoveContainer" containerID="cfbf6225642a8065a30b8648d17f460979c2674548915f8e4ef4e6f94f9dc5d8" Dec 02 10:11:40 crc kubenswrapper[4895]: E1202 10:11:40.142902 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 10:11:40 crc kubenswrapper[4895]: I1202 10:11:40.160391 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-4cvfk" Dec 02 10:11:40 crc kubenswrapper[4895]: I1202 10:11:40.243633 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4cvfk"] Dec 02 10:11:40 crc kubenswrapper[4895]: I1202 10:11:40.298790 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jcdj8"] Dec 02 10:11:40 crc kubenswrapper[4895]: I1202 10:11:40.299106 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-jcdj8" podUID="e3f4e1c5-a3e3-4391-be72-d2f2b908da65" containerName="registry-server" containerID="cri-o://11aa2ac12747d944fd956e9be2daf5fcddbb8ec86d165a31d87106882d869cf8" gracePeriod=2 Dec 02 10:11:40 crc kubenswrapper[4895]: I1202 10:11:40.789514 4895 generic.go:334] "Generic (PLEG): container finished" podID="e3f4e1c5-a3e3-4391-be72-d2f2b908da65" containerID="11aa2ac12747d944fd956e9be2daf5fcddbb8ec86d165a31d87106882d869cf8" exitCode=0 Dec 02 10:11:40 crc kubenswrapper[4895]: I1202 
10:11:40.789619 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jcdj8" event={"ID":"e3f4e1c5-a3e3-4391-be72-d2f2b908da65","Type":"ContainerDied","Data":"11aa2ac12747d944fd956e9be2daf5fcddbb8ec86d165a31d87106882d869cf8"} Dec 02 10:11:40 crc kubenswrapper[4895]: I1202 10:11:40.918512 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jcdj8" Dec 02 10:11:41 crc kubenswrapper[4895]: I1202 10:11:41.071696 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-85h5p\" (UniqueName: \"kubernetes.io/projected/e3f4e1c5-a3e3-4391-be72-d2f2b908da65-kube-api-access-85h5p\") pod \"e3f4e1c5-a3e3-4391-be72-d2f2b908da65\" (UID: \"e3f4e1c5-a3e3-4391-be72-d2f2b908da65\") " Dec 02 10:11:41 crc kubenswrapper[4895]: I1202 10:11:41.071762 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3f4e1c5-a3e3-4391-be72-d2f2b908da65-utilities\") pod \"e3f4e1c5-a3e3-4391-be72-d2f2b908da65\" (UID: \"e3f4e1c5-a3e3-4391-be72-d2f2b908da65\") " Dec 02 10:11:41 crc kubenswrapper[4895]: I1202 10:11:41.072124 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3f4e1c5-a3e3-4391-be72-d2f2b908da65-catalog-content\") pod \"e3f4e1c5-a3e3-4391-be72-d2f2b908da65\" (UID: \"e3f4e1c5-a3e3-4391-be72-d2f2b908da65\") " Dec 02 10:11:41 crc kubenswrapper[4895]: I1202 10:11:41.076088 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3f4e1c5-a3e3-4391-be72-d2f2b908da65-utilities" (OuterVolumeSpecName: "utilities") pod "e3f4e1c5-a3e3-4391-be72-d2f2b908da65" (UID: "e3f4e1c5-a3e3-4391-be72-d2f2b908da65"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 10:11:41 crc kubenswrapper[4895]: I1202 10:11:41.089940 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3f4e1c5-a3e3-4391-be72-d2f2b908da65-kube-api-access-85h5p" (OuterVolumeSpecName: "kube-api-access-85h5p") pod "e3f4e1c5-a3e3-4391-be72-d2f2b908da65" (UID: "e3f4e1c5-a3e3-4391-be72-d2f2b908da65"). InnerVolumeSpecName "kube-api-access-85h5p". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 10:11:41 crc kubenswrapper[4895]: I1202 10:11:41.153936 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3f4e1c5-a3e3-4391-be72-d2f2b908da65-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e3f4e1c5-a3e3-4391-be72-d2f2b908da65" (UID: "e3f4e1c5-a3e3-4391-be72-d2f2b908da65"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 10:11:41 crc kubenswrapper[4895]: I1202 10:11:41.174235 4895 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3f4e1c5-a3e3-4391-be72-d2f2b908da65-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 10:11:41 crc kubenswrapper[4895]: I1202 10:11:41.174259 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-85h5p\" (UniqueName: \"kubernetes.io/projected/e3f4e1c5-a3e3-4391-be72-d2f2b908da65-kube-api-access-85h5p\") on node \"crc\" DevicePath \"\"" Dec 02 10:11:41 crc kubenswrapper[4895]: I1202 10:11:41.174272 4895 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3f4e1c5-a3e3-4391-be72-d2f2b908da65-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 10:11:41 crc kubenswrapper[4895]: I1202 10:11:41.805436 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jcdj8" 
event={"ID":"e3f4e1c5-a3e3-4391-be72-d2f2b908da65","Type":"ContainerDied","Data":"c1c5a83db58e2b70841386dc7c60f39463b82ae8808d54e46d5d36f29093f12a"} Dec 02 10:11:41 crc kubenswrapper[4895]: I1202 10:11:41.805489 4895 scope.go:117] "RemoveContainer" containerID="11aa2ac12747d944fd956e9be2daf5fcddbb8ec86d165a31d87106882d869cf8" Dec 02 10:11:41 crc kubenswrapper[4895]: I1202 10:11:41.805502 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jcdj8" Dec 02 10:11:41 crc kubenswrapper[4895]: I1202 10:11:41.829580 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jcdj8"] Dec 02 10:11:41 crc kubenswrapper[4895]: I1202 10:11:41.836915 4895 scope.go:117] "RemoveContainer" containerID="1b060752968384cf6255b356cbdd446b2ab2551d42b0ba68c2077ef36d86e936" Dec 02 10:11:41 crc kubenswrapper[4895]: I1202 10:11:41.838253 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-jcdj8"] Dec 02 10:11:41 crc kubenswrapper[4895]: I1202 10:11:41.856647 4895 scope.go:117] "RemoveContainer" containerID="9f45aa1d67b4452b880db8b0f42a0c39c5451c04383dc5c4c6c746602c1175d4" Dec 02 10:11:43 crc kubenswrapper[4895]: I1202 10:11:43.159377 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3f4e1c5-a3e3-4391-be72-d2f2b908da65" path="/var/lib/kubelet/pods/e3f4e1c5-a3e3-4391-be72-d2f2b908da65/volumes" Dec 02 10:11:54 crc kubenswrapper[4895]: I1202 10:11:54.011229 4895 generic.go:334] "Generic (PLEG): container finished" podID="93be2c72-5575-4d6c-adcd-0714307ce225" containerID="266b7ba0e11efa3164afc7a0af329cc069b5de1b34e3ce72a87a889be48bea66" exitCode=0 Dec 02 10:11:54 crc kubenswrapper[4895]: I1202 10:11:54.011282 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ldm68/must-gather-86887" 
event={"ID":"93be2c72-5575-4d6c-adcd-0714307ce225","Type":"ContainerDied","Data":"266b7ba0e11efa3164afc7a0af329cc069b5de1b34e3ce72a87a889be48bea66"} Dec 02 10:11:54 crc kubenswrapper[4895]: I1202 10:11:54.012499 4895 scope.go:117] "RemoveContainer" containerID="266b7ba0e11efa3164afc7a0af329cc069b5de1b34e3ce72a87a889be48bea66" Dec 02 10:11:54 crc kubenswrapper[4895]: I1202 10:11:54.141096 4895 scope.go:117] "RemoveContainer" containerID="cfbf6225642a8065a30b8648d17f460979c2674548915f8e4ef4e6f94f9dc5d8" Dec 02 10:11:54 crc kubenswrapper[4895]: E1202 10:11:54.141646 4895 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wfcg7_openshift-machine-config-operator(0468d2d1-a975-45a6-af9f-47adc432fab0)\"" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" Dec 02 10:11:54 crc kubenswrapper[4895]: I1202 10:11:54.487200 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-ldm68_must-gather-86887_93be2c72-5575-4d6c-adcd-0714307ce225/gather/0.log" Dec 02 10:12:02 crc kubenswrapper[4895]: I1202 10:12:02.729330 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-ldm68/must-gather-86887"] Dec 02 10:12:02 crc kubenswrapper[4895]: I1202 10:12:02.729968 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-ldm68/must-gather-86887" podUID="93be2c72-5575-4d6c-adcd-0714307ce225" containerName="copy" containerID="cri-o://5653142cb06c62207dc7036f57ae60397c136dc2f5827ad85d8267faa96f3beb" gracePeriod=2 Dec 02 10:12:02 crc kubenswrapper[4895]: I1202 10:12:02.743340 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-ldm68/must-gather-86887"] Dec 02 10:12:03 crc kubenswrapper[4895]: I1202 10:12:03.113130 4895 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-ldm68_must-gather-86887_93be2c72-5575-4d6c-adcd-0714307ce225/copy/0.log" Dec 02 10:12:03 crc kubenswrapper[4895]: I1202 10:12:03.114087 4895 generic.go:334] "Generic (PLEG): container finished" podID="93be2c72-5575-4d6c-adcd-0714307ce225" containerID="5653142cb06c62207dc7036f57ae60397c136dc2f5827ad85d8267faa96f3beb" exitCode=143 Dec 02 10:12:03 crc kubenswrapper[4895]: I1202 10:12:03.268464 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-ldm68_must-gather-86887_93be2c72-5575-4d6c-adcd-0714307ce225/copy/0.log" Dec 02 10:12:03 crc kubenswrapper[4895]: I1202 10:12:03.269191 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-ldm68/must-gather-86887" Dec 02 10:12:03 crc kubenswrapper[4895]: I1202 10:12:03.291511 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/93be2c72-5575-4d6c-adcd-0714307ce225-must-gather-output\") pod \"93be2c72-5575-4d6c-adcd-0714307ce225\" (UID: \"93be2c72-5575-4d6c-adcd-0714307ce225\") " Dec 02 10:12:03 crc kubenswrapper[4895]: I1202 10:12:03.291634 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2ppgp\" (UniqueName: \"kubernetes.io/projected/93be2c72-5575-4d6c-adcd-0714307ce225-kube-api-access-2ppgp\") pod \"93be2c72-5575-4d6c-adcd-0714307ce225\" (UID: \"93be2c72-5575-4d6c-adcd-0714307ce225\") " Dec 02 10:12:03 crc kubenswrapper[4895]: I1202 10:12:03.298395 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93be2c72-5575-4d6c-adcd-0714307ce225-kube-api-access-2ppgp" (OuterVolumeSpecName: "kube-api-access-2ppgp") pod "93be2c72-5575-4d6c-adcd-0714307ce225" (UID: "93be2c72-5575-4d6c-adcd-0714307ce225"). InnerVolumeSpecName "kube-api-access-2ppgp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 10:12:03 crc kubenswrapper[4895]: I1202 10:12:03.402711 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2ppgp\" (UniqueName: \"kubernetes.io/projected/93be2c72-5575-4d6c-adcd-0714307ce225-kube-api-access-2ppgp\") on node \"crc\" DevicePath \"\"" Dec 02 10:12:03 crc kubenswrapper[4895]: I1202 10:12:03.545029 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/93be2c72-5575-4d6c-adcd-0714307ce225-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "93be2c72-5575-4d6c-adcd-0714307ce225" (UID: "93be2c72-5575-4d6c-adcd-0714307ce225"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 10:12:03 crc kubenswrapper[4895]: I1202 10:12:03.610601 4895 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/93be2c72-5575-4d6c-adcd-0714307ce225-must-gather-output\") on node \"crc\" DevicePath \"\"" Dec 02 10:12:04 crc kubenswrapper[4895]: I1202 10:12:04.125546 4895 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-ldm68_must-gather-86887_93be2c72-5575-4d6c-adcd-0714307ce225/copy/0.log" Dec 02 10:12:04 crc kubenswrapper[4895]: I1202 10:12:04.126130 4895 scope.go:117] "RemoveContainer" containerID="5653142cb06c62207dc7036f57ae60397c136dc2f5827ad85d8267faa96f3beb" Dec 02 10:12:04 crc kubenswrapper[4895]: I1202 10:12:04.126254 4895 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-ldm68/must-gather-86887" Dec 02 10:12:04 crc kubenswrapper[4895]: I1202 10:12:04.162855 4895 scope.go:117] "RemoveContainer" containerID="266b7ba0e11efa3164afc7a0af329cc069b5de1b34e3ce72a87a889be48bea66" Dec 02 10:12:05 crc kubenswrapper[4895]: I1202 10:12:05.157413 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93be2c72-5575-4d6c-adcd-0714307ce225" path="/var/lib/kubelet/pods/93be2c72-5575-4d6c-adcd-0714307ce225/volumes" Dec 02 10:12:06 crc kubenswrapper[4895]: I1202 10:12:06.141226 4895 scope.go:117] "RemoveContainer" containerID="cfbf6225642a8065a30b8648d17f460979c2674548915f8e4ef4e6f94f9dc5d8" Dec 02 10:12:07 crc kubenswrapper[4895]: I1202 10:12:07.219522 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" event={"ID":"0468d2d1-a975-45a6-af9f-47adc432fab0","Type":"ContainerStarted","Data":"f6db49592fbc5b186009bcef21ea68381392a89fe95753fb1616afee661f80ad"} Dec 02 10:12:07 crc kubenswrapper[4895]: I1202 10:12:07.632050 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-8s7st"] Dec 02 10:12:07 crc kubenswrapper[4895]: E1202 10:12:07.633010 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3f4e1c5-a3e3-4391-be72-d2f2b908da65" containerName="extract-content" Dec 02 10:12:07 crc kubenswrapper[4895]: I1202 10:12:07.633033 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3f4e1c5-a3e3-4391-be72-d2f2b908da65" containerName="extract-content" Dec 02 10:12:07 crc kubenswrapper[4895]: E1202 10:12:07.633062 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3f4e1c5-a3e3-4391-be72-d2f2b908da65" containerName="registry-server" Dec 02 10:12:07 crc kubenswrapper[4895]: I1202 10:12:07.633070 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3f4e1c5-a3e3-4391-be72-d2f2b908da65" containerName="registry-server" Dec 02 10:12:07 
crc kubenswrapper[4895]: E1202 10:12:07.633097 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3f4e1c5-a3e3-4391-be72-d2f2b908da65" containerName="extract-utilities" Dec 02 10:12:07 crc kubenswrapper[4895]: I1202 10:12:07.633105 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3f4e1c5-a3e3-4391-be72-d2f2b908da65" containerName="extract-utilities" Dec 02 10:12:07 crc kubenswrapper[4895]: E1202 10:12:07.633121 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93be2c72-5575-4d6c-adcd-0714307ce225" containerName="copy" Dec 02 10:12:07 crc kubenswrapper[4895]: I1202 10:12:07.633128 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="93be2c72-5575-4d6c-adcd-0714307ce225" containerName="copy" Dec 02 10:12:07 crc kubenswrapper[4895]: E1202 10:12:07.633141 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93be2c72-5575-4d6c-adcd-0714307ce225" containerName="gather" Dec 02 10:12:07 crc kubenswrapper[4895]: I1202 10:12:07.633148 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="93be2c72-5575-4d6c-adcd-0714307ce225" containerName="gather" Dec 02 10:12:07 crc kubenswrapper[4895]: I1202 10:12:07.633468 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="93be2c72-5575-4d6c-adcd-0714307ce225" containerName="copy" Dec 02 10:12:07 crc kubenswrapper[4895]: I1202 10:12:07.633513 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3f4e1c5-a3e3-4391-be72-d2f2b908da65" containerName="registry-server" Dec 02 10:12:07 crc kubenswrapper[4895]: I1202 10:12:07.633526 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="93be2c72-5575-4d6c-adcd-0714307ce225" containerName="gather" Dec 02 10:12:07 crc kubenswrapper[4895]: I1202 10:12:07.635654 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8s7st" Dec 02 10:12:07 crc kubenswrapper[4895]: I1202 10:12:07.644100 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8s7st"] Dec 02 10:12:07 crc kubenswrapper[4895]: I1202 10:12:07.691399 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvsxl\" (UniqueName: \"kubernetes.io/projected/63885764-dfaa-4409-9274-21ba9cefbe72-kube-api-access-hvsxl\") pod \"redhat-operators-8s7st\" (UID: \"63885764-dfaa-4409-9274-21ba9cefbe72\") " pod="openshift-marketplace/redhat-operators-8s7st" Dec 02 10:12:07 crc kubenswrapper[4895]: I1202 10:12:07.691469 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63885764-dfaa-4409-9274-21ba9cefbe72-catalog-content\") pod \"redhat-operators-8s7st\" (UID: \"63885764-dfaa-4409-9274-21ba9cefbe72\") " pod="openshift-marketplace/redhat-operators-8s7st" Dec 02 10:12:07 crc kubenswrapper[4895]: I1202 10:12:07.691660 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63885764-dfaa-4409-9274-21ba9cefbe72-utilities\") pod \"redhat-operators-8s7st\" (UID: \"63885764-dfaa-4409-9274-21ba9cefbe72\") " pod="openshift-marketplace/redhat-operators-8s7st" Dec 02 10:12:07 crc kubenswrapper[4895]: I1202 10:12:07.793368 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hvsxl\" (UniqueName: \"kubernetes.io/projected/63885764-dfaa-4409-9274-21ba9cefbe72-kube-api-access-hvsxl\") pod \"redhat-operators-8s7st\" (UID: \"63885764-dfaa-4409-9274-21ba9cefbe72\") " pod="openshift-marketplace/redhat-operators-8s7st" Dec 02 10:12:07 crc kubenswrapper[4895]: I1202 10:12:07.793820 4895 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63885764-dfaa-4409-9274-21ba9cefbe72-catalog-content\") pod \"redhat-operators-8s7st\" (UID: \"63885764-dfaa-4409-9274-21ba9cefbe72\") " pod="openshift-marketplace/redhat-operators-8s7st" Dec 02 10:12:07 crc kubenswrapper[4895]: I1202 10:12:07.794093 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63885764-dfaa-4409-9274-21ba9cefbe72-utilities\") pod \"redhat-operators-8s7st\" (UID: \"63885764-dfaa-4409-9274-21ba9cefbe72\") " pod="openshift-marketplace/redhat-operators-8s7st" Dec 02 10:12:07 crc kubenswrapper[4895]: I1202 10:12:07.794314 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63885764-dfaa-4409-9274-21ba9cefbe72-catalog-content\") pod \"redhat-operators-8s7st\" (UID: \"63885764-dfaa-4409-9274-21ba9cefbe72\") " pod="openshift-marketplace/redhat-operators-8s7st" Dec 02 10:12:07 crc kubenswrapper[4895]: I1202 10:12:07.794618 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63885764-dfaa-4409-9274-21ba9cefbe72-utilities\") pod \"redhat-operators-8s7st\" (UID: \"63885764-dfaa-4409-9274-21ba9cefbe72\") " pod="openshift-marketplace/redhat-operators-8s7st" Dec 02 10:12:07 crc kubenswrapper[4895]: I1202 10:12:07.812239 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvsxl\" (UniqueName: \"kubernetes.io/projected/63885764-dfaa-4409-9274-21ba9cefbe72-kube-api-access-hvsxl\") pod \"redhat-operators-8s7st\" (UID: \"63885764-dfaa-4409-9274-21ba9cefbe72\") " pod="openshift-marketplace/redhat-operators-8s7st" Dec 02 10:12:07 crc kubenswrapper[4895]: I1202 10:12:07.961006 4895 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8s7st" Dec 02 10:12:08 crc kubenswrapper[4895]: I1202 10:12:08.484429 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8s7st"] Dec 02 10:12:09 crc kubenswrapper[4895]: I1202 10:12:09.245848 4895 generic.go:334] "Generic (PLEG): container finished" podID="63885764-dfaa-4409-9274-21ba9cefbe72" containerID="06e032f3dc6fb58d92d8c8072ef55f4aeed5eef8463b4b62fe474e20f9a15858" exitCode=0 Dec 02 10:12:09 crc kubenswrapper[4895]: I1202 10:12:09.246029 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8s7st" event={"ID":"63885764-dfaa-4409-9274-21ba9cefbe72","Type":"ContainerDied","Data":"06e032f3dc6fb58d92d8c8072ef55f4aeed5eef8463b4b62fe474e20f9a15858"} Dec 02 10:12:09 crc kubenswrapper[4895]: I1202 10:12:09.246128 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8s7st" event={"ID":"63885764-dfaa-4409-9274-21ba9cefbe72","Type":"ContainerStarted","Data":"78d8f383c83d727f5b90c89f3859172b9279adb31829df92b48a50437a612f93"} Dec 02 10:12:10 crc kubenswrapper[4895]: I1202 10:12:10.257036 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8s7st" event={"ID":"63885764-dfaa-4409-9274-21ba9cefbe72","Type":"ContainerStarted","Data":"b9d56041372bb4cd84ad3e964c5dc84f40d6455b3ce2f444a5debbb2516eced8"} Dec 02 10:12:12 crc kubenswrapper[4895]: I1202 10:12:12.278137 4895 generic.go:334] "Generic (PLEG): container finished" podID="63885764-dfaa-4409-9274-21ba9cefbe72" containerID="b9d56041372bb4cd84ad3e964c5dc84f40d6455b3ce2f444a5debbb2516eced8" exitCode=0 Dec 02 10:12:12 crc kubenswrapper[4895]: I1202 10:12:12.278199 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8s7st" 
event={"ID":"63885764-dfaa-4409-9274-21ba9cefbe72","Type":"ContainerDied","Data":"b9d56041372bb4cd84ad3e964c5dc84f40d6455b3ce2f444a5debbb2516eced8"} Dec 02 10:12:14 crc kubenswrapper[4895]: I1202 10:12:14.306765 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8s7st" event={"ID":"63885764-dfaa-4409-9274-21ba9cefbe72","Type":"ContainerStarted","Data":"3922cea6334d862168118e3b0e6dfc0fca34183c094b089a6f247a9c61ba1baa"} Dec 02 10:12:14 crc kubenswrapper[4895]: I1202 10:12:14.366461 4895 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-8s7st" podStartSLOduration=3.767141711 podStartE2EDuration="7.366439397s" podCreationTimestamp="2025-12-02 10:12:07 +0000 UTC" firstStartedPulling="2025-12-02 10:12:09.248250782 +0000 UTC m=+10140.419110395" lastFinishedPulling="2025-12-02 10:12:12.847548468 +0000 UTC m=+10144.018408081" observedRunningTime="2025-12-02 10:12:14.339098535 +0000 UTC m=+10145.509958148" watchObservedRunningTime="2025-12-02 10:12:14.366439397 +0000 UTC m=+10145.537299010" Dec 02 10:12:17 crc kubenswrapper[4895]: I1202 10:12:17.961942 4895 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-8s7st" Dec 02 10:12:17 crc kubenswrapper[4895]: I1202 10:12:17.965637 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-8s7st" Dec 02 10:12:19 crc kubenswrapper[4895]: I1202 10:12:19.022261 4895 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-8s7st" podUID="63885764-dfaa-4409-9274-21ba9cefbe72" containerName="registry-server" probeResult="failure" output=< Dec 02 10:12:19 crc kubenswrapper[4895]: timeout: failed to connect service ":50051" within 1s Dec 02 10:12:19 crc kubenswrapper[4895]: > Dec 02 10:12:28 crc kubenswrapper[4895]: I1202 10:12:28.009594 4895 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-marketplace/redhat-operators-8s7st" Dec 02 10:12:28 crc kubenswrapper[4895]: I1202 10:12:28.068210 4895 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-8s7st" Dec 02 10:12:28 crc kubenswrapper[4895]: I1202 10:12:28.250446 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8s7st"] Dec 02 10:12:29 crc kubenswrapper[4895]: I1202 10:12:29.464876 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-8s7st" podUID="63885764-dfaa-4409-9274-21ba9cefbe72" containerName="registry-server" containerID="cri-o://3922cea6334d862168118e3b0e6dfc0fca34183c094b089a6f247a9c61ba1baa" gracePeriod=2 Dec 02 10:12:29 crc kubenswrapper[4895]: I1202 10:12:29.914879 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8s7st" Dec 02 10:12:30 crc kubenswrapper[4895]: I1202 10:12:30.028314 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hvsxl\" (UniqueName: \"kubernetes.io/projected/63885764-dfaa-4409-9274-21ba9cefbe72-kube-api-access-hvsxl\") pod \"63885764-dfaa-4409-9274-21ba9cefbe72\" (UID: \"63885764-dfaa-4409-9274-21ba9cefbe72\") " Dec 02 10:12:30 crc kubenswrapper[4895]: I1202 10:12:30.028629 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63885764-dfaa-4409-9274-21ba9cefbe72-utilities\") pod \"63885764-dfaa-4409-9274-21ba9cefbe72\" (UID: \"63885764-dfaa-4409-9274-21ba9cefbe72\") " Dec 02 10:12:30 crc kubenswrapper[4895]: I1202 10:12:30.028661 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63885764-dfaa-4409-9274-21ba9cefbe72-catalog-content\") pod 
\"63885764-dfaa-4409-9274-21ba9cefbe72\" (UID: \"63885764-dfaa-4409-9274-21ba9cefbe72\") " Dec 02 10:12:30 crc kubenswrapper[4895]: I1202 10:12:30.029982 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63885764-dfaa-4409-9274-21ba9cefbe72-utilities" (OuterVolumeSpecName: "utilities") pod "63885764-dfaa-4409-9274-21ba9cefbe72" (UID: "63885764-dfaa-4409-9274-21ba9cefbe72"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 10:12:30 crc kubenswrapper[4895]: I1202 10:12:30.033876 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63885764-dfaa-4409-9274-21ba9cefbe72-kube-api-access-hvsxl" (OuterVolumeSpecName: "kube-api-access-hvsxl") pod "63885764-dfaa-4409-9274-21ba9cefbe72" (UID: "63885764-dfaa-4409-9274-21ba9cefbe72"). InnerVolumeSpecName "kube-api-access-hvsxl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 10:12:30 crc kubenswrapper[4895]: I1202 10:12:30.131406 4895 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63885764-dfaa-4409-9274-21ba9cefbe72-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 10:12:30 crc kubenswrapper[4895]: I1202 10:12:30.131439 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hvsxl\" (UniqueName: \"kubernetes.io/projected/63885764-dfaa-4409-9274-21ba9cefbe72-kube-api-access-hvsxl\") on node \"crc\" DevicePath \"\"" Dec 02 10:12:30 crc kubenswrapper[4895]: I1202 10:12:30.141829 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63885764-dfaa-4409-9274-21ba9cefbe72-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "63885764-dfaa-4409-9274-21ba9cefbe72" (UID: "63885764-dfaa-4409-9274-21ba9cefbe72"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 02 10:12:30 crc kubenswrapper[4895]: I1202 10:12:30.233270 4895 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63885764-dfaa-4409-9274-21ba9cefbe72-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 02 10:12:30 crc kubenswrapper[4895]: I1202 10:12:30.477381 4895 generic.go:334] "Generic (PLEG): container finished" podID="63885764-dfaa-4409-9274-21ba9cefbe72" containerID="3922cea6334d862168118e3b0e6dfc0fca34183c094b089a6f247a9c61ba1baa" exitCode=0
Dec 02 10:12:30 crc kubenswrapper[4895]: I1202 10:12:30.477422 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8s7st" event={"ID":"63885764-dfaa-4409-9274-21ba9cefbe72","Type":"ContainerDied","Data":"3922cea6334d862168118e3b0e6dfc0fca34183c094b089a6f247a9c61ba1baa"}
Dec 02 10:12:30 crc kubenswrapper[4895]: I1202 10:12:30.477449 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8s7st" event={"ID":"63885764-dfaa-4409-9274-21ba9cefbe72","Type":"ContainerDied","Data":"78d8f383c83d727f5b90c89f3859172b9279adb31829df92b48a50437a612f93"}
Dec 02 10:12:30 crc kubenswrapper[4895]: I1202 10:12:30.477468 4895 scope.go:117] "RemoveContainer" containerID="3922cea6334d862168118e3b0e6dfc0fca34183c094b089a6f247a9c61ba1baa"
Dec 02 10:12:30 crc kubenswrapper[4895]: I1202 10:12:30.477518 4895 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/redhat-operators-8s7st"
Dec 02 10:12:30 crc kubenswrapper[4895]: I1202 10:12:30.499208 4895 scope.go:117] "RemoveContainer" containerID="b9d56041372bb4cd84ad3e964c5dc84f40d6455b3ce2f444a5debbb2516eced8"
Dec 02 10:12:30 crc kubenswrapper[4895]: I1202 10:12:30.525979 4895 scope.go:117] "RemoveContainer" containerID="06e032f3dc6fb58d92d8c8072ef55f4aeed5eef8463b4b62fe474e20f9a15858"
Dec 02 10:12:30 crc kubenswrapper[4895]: I1202 10:12:30.531994 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8s7st"]
Dec 02 10:12:30 crc kubenswrapper[4895]: I1202 10:12:30.544382 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-8s7st"]
Dec 02 10:12:30 crc kubenswrapper[4895]: I1202 10:12:30.574329 4895 scope.go:117] "RemoveContainer" containerID="3922cea6334d862168118e3b0e6dfc0fca34183c094b089a6f247a9c61ba1baa"
Dec 02 10:12:30 crc kubenswrapper[4895]: E1202 10:12:30.575248 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3922cea6334d862168118e3b0e6dfc0fca34183c094b089a6f247a9c61ba1baa\": container with ID starting with 3922cea6334d862168118e3b0e6dfc0fca34183c094b089a6f247a9c61ba1baa not found: ID does not exist" containerID="3922cea6334d862168118e3b0e6dfc0fca34183c094b089a6f247a9c61ba1baa"
Dec 02 10:12:30 crc kubenswrapper[4895]: I1202 10:12:30.575290 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3922cea6334d862168118e3b0e6dfc0fca34183c094b089a6f247a9c61ba1baa"} err="failed to get container status \"3922cea6334d862168118e3b0e6dfc0fca34183c094b089a6f247a9c61ba1baa\": rpc error: code = NotFound desc = could not find container \"3922cea6334d862168118e3b0e6dfc0fca34183c094b089a6f247a9c61ba1baa\": container with ID starting with 3922cea6334d862168118e3b0e6dfc0fca34183c094b089a6f247a9c61ba1baa not found: ID does
not exist"
Dec 02 10:12:30 crc kubenswrapper[4895]: I1202 10:12:30.575317 4895 scope.go:117] "RemoveContainer" containerID="b9d56041372bb4cd84ad3e964c5dc84f40d6455b3ce2f444a5debbb2516eced8"
Dec 02 10:12:30 crc kubenswrapper[4895]: E1202 10:12:30.575730 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9d56041372bb4cd84ad3e964c5dc84f40d6455b3ce2f444a5debbb2516eced8\": container with ID starting with b9d56041372bb4cd84ad3e964c5dc84f40d6455b3ce2f444a5debbb2516eced8 not found: ID does not exist" containerID="b9d56041372bb4cd84ad3e964c5dc84f40d6455b3ce2f444a5debbb2516eced8"
Dec 02 10:12:30 crc kubenswrapper[4895]: I1202 10:12:30.575780 4895 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9d56041372bb4cd84ad3e964c5dc84f40d6455b3ce2f444a5debbb2516eced8"} err="failed to get container status \"b9d56041372bb4cd84ad3e964c5dc84f40d6455b3ce2f444a5debbb2516eced8\": rpc error: code = NotFound desc = could not find container \"b9d56041372bb4cd84ad3e964c5dc84f40d6455b3ce2f444a5debbb2516eced8\": container with ID starting with b9d56041372bb4cd84ad3e964c5dc84f40d6455b3ce2f444a5debbb2516eced8 not found: ID does not exist"
Dec 02 10:12:30 crc kubenswrapper[4895]: I1202 10:12:30.575804 4895 scope.go:117] "RemoveContainer" containerID="06e032f3dc6fb58d92d8c8072ef55f4aeed5eef8463b4b62fe474e20f9a15858"
Dec 02 10:12:30 crc kubenswrapper[4895]: E1202 10:12:30.576352 4895 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06e032f3dc6fb58d92d8c8072ef55f4aeed5eef8463b4b62fe474e20f9a15858\": container with ID starting with 06e032f3dc6fb58d92d8c8072ef55f4aeed5eef8463b4b62fe474e20f9a15858 not found: ID does not exist" containerID="06e032f3dc6fb58d92d8c8072ef55f4aeed5eef8463b4b62fe474e20f9a15858"
Dec 02 10:12:30 crc kubenswrapper[4895]: I1202 10:12:30.576413 4895 pod_container_deletor.go:53]
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06e032f3dc6fb58d92d8c8072ef55f4aeed5eef8463b4b62fe474e20f9a15858"} err="failed to get container status \"06e032f3dc6fb58d92d8c8072ef55f4aeed5eef8463b4b62fe474e20f9a15858\": rpc error: code = NotFound desc = could not find container \"06e032f3dc6fb58d92d8c8072ef55f4aeed5eef8463b4b62fe474e20f9a15858\": container with ID starting with 06e032f3dc6fb58d92d8c8072ef55f4aeed5eef8463b4b62fe474e20f9a15858 not found: ID does not exist"
Dec 02 10:12:31 crc kubenswrapper[4895]: I1202 10:12:31.155664 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63885764-dfaa-4409-9274-21ba9cefbe72" path="/var/lib/kubelet/pods/63885764-dfaa-4409-9274-21ba9cefbe72/volumes"
Dec 02 10:14:35 crc kubenswrapper[4895]: I1202 10:14:35.473949 4895 patch_prober.go:28] interesting pod/machine-config-daemon-wfcg7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 02 10:14:35 crc kubenswrapper[4895]: I1202 10:14:35.474554 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 02 10:15:00 crc kubenswrapper[4895]: I1202 10:15:00.177216 4895 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411175-68g96"]
Dec 02 10:15:00 crc kubenswrapper[4895]: E1202 10:15:00.178529 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63885764-dfaa-4409-9274-21ba9cefbe72" containerName="registry-server"
Dec 02 10:15:00 crc kubenswrapper[4895]: I1202 10:15:00.178551 4895 state_mem.go:107] "Deleted CPUSet
assignment" podUID="63885764-dfaa-4409-9274-21ba9cefbe72" containerName="registry-server"
Dec 02 10:15:00 crc kubenswrapper[4895]: E1202 10:15:00.178608 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63885764-dfaa-4409-9274-21ba9cefbe72" containerName="extract-utilities"
Dec 02 10:15:00 crc kubenswrapper[4895]: I1202 10:15:00.178617 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="63885764-dfaa-4409-9274-21ba9cefbe72" containerName="extract-utilities"
Dec 02 10:15:00 crc kubenswrapper[4895]: E1202 10:15:00.178631 4895 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63885764-dfaa-4409-9274-21ba9cefbe72" containerName="extract-content"
Dec 02 10:15:00 crc kubenswrapper[4895]: I1202 10:15:00.178638 4895 state_mem.go:107] "Deleted CPUSet assignment" podUID="63885764-dfaa-4409-9274-21ba9cefbe72" containerName="extract-content"
Dec 02 10:15:00 crc kubenswrapper[4895]: I1202 10:15:00.178879 4895 memory_manager.go:354] "RemoveStaleState removing state" podUID="63885764-dfaa-4409-9274-21ba9cefbe72" containerName="registry-server"
Dec 02 10:15:00 crc kubenswrapper[4895]: I1202 10:15:00.179829 4895 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411175-68g96"
Dec 02 10:15:00 crc kubenswrapper[4895]: I1202 10:15:00.182216 4895 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Dec 02 10:15:00 crc kubenswrapper[4895]: I1202 10:15:00.182258 4895 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Dec 02 10:15:00 crc kubenswrapper[4895]: I1202 10:15:00.189519 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411175-68g96"]
Dec 02 10:15:00 crc kubenswrapper[4895]: I1202 10:15:00.339197 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f575ed5f-2291-4bb2-bd40-1d71d3b25b12-secret-volume\") pod \"collect-profiles-29411175-68g96\" (UID: \"f575ed5f-2291-4bb2-bd40-1d71d3b25b12\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411175-68g96"
Dec 02 10:15:00 crc kubenswrapper[4895]: I1202 10:15:00.339375 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbkln\" (UniqueName: \"kubernetes.io/projected/f575ed5f-2291-4bb2-bd40-1d71d3b25b12-kube-api-access-mbkln\") pod \"collect-profiles-29411175-68g96\" (UID: \"f575ed5f-2291-4bb2-bd40-1d71d3b25b12\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411175-68g96"
Dec 02 10:15:00 crc kubenswrapper[4895]: I1202 10:15:00.339439 4895 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f575ed5f-2291-4bb2-bd40-1d71d3b25b12-config-volume\") pod \"collect-profiles-29411175-68g96\" (UID: \"f575ed5f-2291-4bb2-bd40-1d71d3b25b12\") "
pod="openshift-operator-lifecycle-manager/collect-profiles-29411175-68g96"
Dec 02 10:15:00 crc kubenswrapper[4895]: I1202 10:15:00.441788 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f575ed5f-2291-4bb2-bd40-1d71d3b25b12-secret-volume\") pod \"collect-profiles-29411175-68g96\" (UID: \"f575ed5f-2291-4bb2-bd40-1d71d3b25b12\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411175-68g96"
Dec 02 10:15:00 crc kubenswrapper[4895]: I1202 10:15:00.441940 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mbkln\" (UniqueName: \"kubernetes.io/projected/f575ed5f-2291-4bb2-bd40-1d71d3b25b12-kube-api-access-mbkln\") pod \"collect-profiles-29411175-68g96\" (UID: \"f575ed5f-2291-4bb2-bd40-1d71d3b25b12\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411175-68g96"
Dec 02 10:15:00 crc kubenswrapper[4895]: I1202 10:15:00.441992 4895 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f575ed5f-2291-4bb2-bd40-1d71d3b25b12-config-volume\") pod \"collect-profiles-29411175-68g96\" (UID: \"f575ed5f-2291-4bb2-bd40-1d71d3b25b12\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411175-68g96"
Dec 02 10:15:00 crc kubenswrapper[4895]: I1202 10:15:00.443055 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f575ed5f-2291-4bb2-bd40-1d71d3b25b12-config-volume\") pod \"collect-profiles-29411175-68g96\" (UID: \"f575ed5f-2291-4bb2-bd40-1d71d3b25b12\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411175-68g96"
Dec 02 10:15:00 crc kubenswrapper[4895]: I1202 10:15:00.449338 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName:
\"kubernetes.io/secret/f575ed5f-2291-4bb2-bd40-1d71d3b25b12-secret-volume\") pod \"collect-profiles-29411175-68g96\" (UID: \"f575ed5f-2291-4bb2-bd40-1d71d3b25b12\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411175-68g96"
Dec 02 10:15:00 crc kubenswrapper[4895]: I1202 10:15:00.461236 4895 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbkln\" (UniqueName: \"kubernetes.io/projected/f575ed5f-2291-4bb2-bd40-1d71d3b25b12-kube-api-access-mbkln\") pod \"collect-profiles-29411175-68g96\" (UID: \"f575ed5f-2291-4bb2-bd40-1d71d3b25b12\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411175-68g96"
Dec 02 10:15:00 crc kubenswrapper[4895]: I1202 10:15:00.506355 4895 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411175-68g96"
Dec 02 10:15:00 crc kubenswrapper[4895]: I1202 10:15:00.954456 4895 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411175-68g96"]
Dec 02 10:15:01 crc kubenswrapper[4895]: I1202 10:15:01.090570 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411175-68g96" event={"ID":"f575ed5f-2291-4bb2-bd40-1d71d3b25b12","Type":"ContainerStarted","Data":"550f7a7138a05f058f5c1d0b028815df8b26cca59b1e7f814374442beab1c213"}
Dec 02 10:15:02 crc kubenswrapper[4895]: I1202 10:15:02.102271 4895 generic.go:334] "Generic (PLEG): container finished" podID="f575ed5f-2291-4bb2-bd40-1d71d3b25b12" containerID="21ccbf919ba7d0bd4e5a9b22a2ff89ddff48b80f0ff1755e2aa4609389856fb6" exitCode=0
Dec 02 10:15:02 crc kubenswrapper[4895]: I1202 10:15:02.102332 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411175-68g96"
event={"ID":"f575ed5f-2291-4bb2-bd40-1d71d3b25b12","Type":"ContainerDied","Data":"21ccbf919ba7d0bd4e5a9b22a2ff89ddff48b80f0ff1755e2aa4609389856fb6"}
Dec 02 10:15:03 crc kubenswrapper[4895]: I1202 10:15:03.980692 4895 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411175-68g96"
Dec 02 10:15:04 crc kubenswrapper[4895]: I1202 10:15:04.047289 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f575ed5f-2291-4bb2-bd40-1d71d3b25b12-secret-volume\") pod \"f575ed5f-2291-4bb2-bd40-1d71d3b25b12\" (UID: \"f575ed5f-2291-4bb2-bd40-1d71d3b25b12\") "
Dec 02 10:15:04 crc kubenswrapper[4895]: I1202 10:15:04.047407 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mbkln\" (UniqueName: \"kubernetes.io/projected/f575ed5f-2291-4bb2-bd40-1d71d3b25b12-kube-api-access-mbkln\") pod \"f575ed5f-2291-4bb2-bd40-1d71d3b25b12\" (UID: \"f575ed5f-2291-4bb2-bd40-1d71d3b25b12\") "
Dec 02 10:15:04 crc kubenswrapper[4895]: I1202 10:15:04.047458 4895 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f575ed5f-2291-4bb2-bd40-1d71d3b25b12-config-volume\") pod \"f575ed5f-2291-4bb2-bd40-1d71d3b25b12\" (UID: \"f575ed5f-2291-4bb2-bd40-1d71d3b25b12\") "
Dec 02 10:15:04 crc kubenswrapper[4895]: I1202 10:15:04.048884 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f575ed5f-2291-4bb2-bd40-1d71d3b25b12-config-volume" (OuterVolumeSpecName: "config-volume") pod "f575ed5f-2291-4bb2-bd40-1d71d3b25b12" (UID: "f575ed5f-2291-4bb2-bd40-1d71d3b25b12"). InnerVolumeSpecName "config-volume".
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 10:15:04 crc kubenswrapper[4895]: I1202 10:15:04.055445 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f575ed5f-2291-4bb2-bd40-1d71d3b25b12-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "f575ed5f-2291-4bb2-bd40-1d71d3b25b12" (UID: "f575ed5f-2291-4bb2-bd40-1d71d3b25b12"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 10:15:04 crc kubenswrapper[4895]: I1202 10:15:04.056575 4895 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f575ed5f-2291-4bb2-bd40-1d71d3b25b12-kube-api-access-mbkln" (OuterVolumeSpecName: "kube-api-access-mbkln") pod "f575ed5f-2291-4bb2-bd40-1d71d3b25b12" (UID: "f575ed5f-2291-4bb2-bd40-1d71d3b25b12"). InnerVolumeSpecName "kube-api-access-mbkln". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 10:15:04 crc kubenswrapper[4895]: I1202 10:15:04.125543 4895 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411175-68g96"
Dec 02 10:15:04 crc kubenswrapper[4895]: I1202 10:15:04.125593 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411175-68g96" event={"ID":"f575ed5f-2291-4bb2-bd40-1d71d3b25b12","Type":"ContainerDied","Data":"550f7a7138a05f058f5c1d0b028815df8b26cca59b1e7f814374442beab1c213"}
Dec 02 10:15:04 crc kubenswrapper[4895]: I1202 10:15:04.125656 4895 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="550f7a7138a05f058f5c1d0b028815df8b26cca59b1e7f814374442beab1c213"
Dec 02 10:15:04 crc kubenswrapper[4895]: I1202 10:15:04.150423 4895 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f575ed5f-2291-4bb2-bd40-1d71d3b25b12-config-volume\") on node \"crc\" DevicePath \"\""
Dec 02 10:15:04 crc kubenswrapper[4895]: I1202 10:15:04.150467 4895 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f575ed5f-2291-4bb2-bd40-1d71d3b25b12-secret-volume\") on node \"crc\" DevicePath \"\""
Dec 02 10:15:04 crc kubenswrapper[4895]: I1202 10:15:04.150480 4895 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mbkln\" (UniqueName: \"kubernetes.io/projected/f575ed5f-2291-4bb2-bd40-1d71d3b25b12-kube-api-access-mbkln\") on node \"crc\" DevicePath \"\""
Dec 02 10:15:05 crc kubenswrapper[4895]: I1202 10:15:05.070803 4895 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411130-z2skm"]
Dec 02 10:15:05 crc kubenswrapper[4895]: I1202 10:15:05.080221 4895 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411130-z2skm"]
Dec 02 10:15:05 crc kubenswrapper[4895]: I1202 10:15:05.154779 4895 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir"
podUID="ecbb554b-7c1f-4475-83ff-8184cc72986b" path="/var/lib/kubelet/pods/ecbb554b-7c1f-4475-83ff-8184cc72986b/volumes"
Dec 02 10:15:05 crc kubenswrapper[4895]: I1202 10:15:05.474576 4895 patch_prober.go:28] interesting pod/machine-config-daemon-wfcg7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 02 10:15:05 crc kubenswrapper[4895]: I1202 10:15:05.474654 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 02 10:15:35 crc kubenswrapper[4895]: I1202 10:15:35.473188 4895 patch_prober.go:28] interesting pod/machine-config-daemon-wfcg7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 02 10:15:35 crc kubenswrapper[4895]: I1202 10:15:35.473849 4895 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 02 10:15:35 crc kubenswrapper[4895]: I1202 10:15:35.473907 4895 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7"
Dec 02 10:15:35 crc kubenswrapper[4895]: I1202 10:15:35.474802 4895 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon"
containerStatusID={"Type":"cri-o","ID":"f6db49592fbc5b186009bcef21ea68381392a89fe95753fb1616afee661f80ad"} pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 02 10:15:35 crc kubenswrapper[4895]: I1202 10:15:35.474860 4895 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" podUID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerName="machine-config-daemon" containerID="cri-o://f6db49592fbc5b186009bcef21ea68381392a89fe95753fb1616afee661f80ad" gracePeriod=600
Dec 02 10:15:36 crc kubenswrapper[4895]: I1202 10:15:36.531767 4895 generic.go:334] "Generic (PLEG): container finished" podID="0468d2d1-a975-45a6-af9f-47adc432fab0" containerID="f6db49592fbc5b186009bcef21ea68381392a89fe95753fb1616afee661f80ad" exitCode=0
Dec 02 10:15:36 crc kubenswrapper[4895]: I1202 10:15:36.531838 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" event={"ID":"0468d2d1-a975-45a6-af9f-47adc432fab0","Type":"ContainerDied","Data":"f6db49592fbc5b186009bcef21ea68381392a89fe95753fb1616afee661f80ad"}
Dec 02 10:15:36 crc kubenswrapper[4895]: I1202 10:15:36.532342 4895 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wfcg7" event={"ID":"0468d2d1-a975-45a6-af9f-47adc432fab0","Type":"ContainerStarted","Data":"be60b78e7799b98fa548af1dfe906847ab8b00fb2bcf3f9bccb9361139171705"}
Dec 02 10:15:36 crc kubenswrapper[4895]: I1202 10:15:36.532365 4895 scope.go:117] "RemoveContainer" containerID="cfbf6225642a8065a30b8648d17f460979c2674548915f8e4ef4e6f94f9dc5d8"
Dec 02 10:15:42 crc kubenswrapper[4895]: I1202 10:15:42.711040 4895 scope.go:117] "RemoveContainer" containerID="b8b3946e05073aa58ebcb6cd2f96cb0fc31f4a49336abf33b930a0bb61f82989"
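A recurring pattern in the journal above is the repeated `"Probe failed" probeType="Liveness" pod=...` entry for machine-config-daemon-wfcg7, which eventually triggers the container restart at 10:15:35. When triaging a dump like this, it can help to tally probe failures per pod before reading entry by entry. Below is a minimal sketch of such a tally; the `count_probe_failures` helper and its regex are illustrative (written against the klog field layout visible in these lines), not part of any kubelet or journald API.

```python
import re

# Matches the klog fields as they appear in the entries above:
#   ... prober.go:107] "Probe failed" probeType="Liveness" pod="ns/name" ...
# Illustrative only; field order is assumed from this particular log.
PROBE_FAILED = re.compile(
    r'"Probe failed" probeType="(?P<type>[^"]+)" pod="(?P<pod>[^"]+)"'
)

def count_probe_failures(lines):
    """Return {(probe_type, "namespace/pod"): count} over an iterable of log lines."""
    counts = {}
    for line in lines:
        m = PROBE_FAILED.search(line)
        if m:
            key = (m.group("type"), m.group("pod"))
            counts[key] = counts.get(key, 0) + 1
    return counts

if __name__ == "__main__":
    # Typical use: feed it the journal, e.g. lines from
    # `journalctl -u kubelet --no-pager`, read from a file or stdin.
    import sys
    for (ptype, pod), n in sorted(count_probe_failures(sys.stdin).items()):
        print(f"{n:4d}  {ptype:10s}  {pod}")
```

Against this section's entries, the helper would surface the four Liveness failures for `openshift-machine-config-operator/machine-config-daemon-wfcg7` in one line instead of four scattered records.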